93 results for wavelength tuning
Abstract:
The main objective of this project is the analysis of teaching-learning experiences of the communicative competence during the adaptation of the curricula of Computer Engineering degrees. It also includes a reflection on the generic competences acquired in the curriculum followed at the UOC. All of this takes place within the current context of adapting university curricula to the EHEA and of the importance given to the development of the generic and discipline-specific competences of each field, following the guidelines set by the Tuning Project. The first step was to identify the generic competences that a Computer Engineer should develop, drawing on existing references such as the white paper for the Degree in Computer Engineering. The second step involved a specific analysis of the instrumental competence of written communication, as a representative example of generic competence development in the academic world. To this end, the main references on the subject were sought in articles by experts dealing with teaching-learning and assessment methodologies for transversal competences in Computer Engineering curricula. Once the most prominent methods proposed by these experts had been classified, a sample of teaching plans from Computer Engineering degree programmes at Spanish universities for the 2010-2011 academic year was analysed. The aim was to determine to what extent they incorporate the methods identified by the experts for developing the written communication competence. Finally, based on the set of generic competences identified as those a Computer Engineer should acquire, we reflected on their development in the curriculum followed at the UOC and on whether the educational model itself favours this development to some extent.
Abstract:
At CoDaWork'03 we presented work on the analysis of archaeological glass compositional data. Such data typically consist of geochemical compositions involving 10-12 variables and approximate completely compositional data if the main component, silica, is included. We suggested that what has been termed 'crude' principal component analysis (PCA) of standardized data often identified interpretable pattern in the data more readily than analyses based on log-ratio transformed data (LRA). The fundamental problem is that, in LRA, minor oxides with high relative variation, that may not be structure carrying, can dominate an analysis and obscure pattern associated with variables present at higher absolute levels. We investigate this further using subcompositional data relating to archaeological glasses found on Israeli sites. A simple model for glass-making is that it is based on a 'recipe' consisting of two 'ingredients', sand and a source of soda. Our analysis focuses on the sub-composition of components associated with the sand source. A 'crude' PCA of standardized data shows two clear compositional groups that can be interpreted in terms of different recipes being used at different periods, reflected in absolute differences in the composition. LRA analysis can be undertaken either by normalizing the data or defining a 'residual'. In either case, after some 'tuning', these groups are recovered. The results from the normalized LRA are differently interpreted as showing that the source of sand used to make the glass differed. These results are complementary. One relates to the recipe used. The other relates to the composition (and presumed sources) of one of the ingredients. It seems to be axiomatic in some expositions of LRA that statistical analysis of compositional data should focus on relative variation via the use of ratios. Our analysis suggests that absolute differences can also be informative.
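For readers unfamiliar with the two approaches contrasted in this abstract, the following is a minimal sketch of 'crude' PCA of standardized compositions versus a centred log-ratio analysis; the random Dirichlet data and variable setup are illustrative assumptions, not the authors' archaeological glass dataset.

```python
# Sketch: 'crude' PCA of standardized compositions vs. centred log-ratio (clr)
# analysis, the contrast discussed in the abstract. The data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.dirichlet(alpha=[50, 20, 10, 5, 1], size=100)  # rows sum to 1 (compositions)

# 'Crude' PCA: standardize each component, then ordinary PCA.
crude_scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

# Log-ratio alternative: centred log-ratio transform, then PCA.
clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)
lra_scores = PCA(n_components=2).fit_transform(clr)

print(crude_scores[:3])
print(lra_scores[:3])
```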
Abstract:
A recent study defines a new network plane: the knowledge plane. The incorporation of the knowledge plane over the network makes more accurate information about the current and future network states available. In this paper, the introduction and management of network reliability information in the knowledge plane is proposed in order to improve the quality of service obtained with protection routing algorithms in GMPLS over WDM networks. Different experiments demonstrate the efficiency and scalability of the proposed scheme in terms of the percentage of resources used to protect the network.
Abstract:
The effects of the nongray absorption (i.e., atmospheric opacity varying with wavelength) on the possible upper bound of the outgoing longwave radiation (OLR) emitted by a planetary atmosphere have been examined. This analysis is based on the semigray approach, which appears to be a reasonable compromise between the complexity of nongray models and the simplicity of the gray assumption (i.e., atmospheric absorption independent of wavelength). Atmospheric gases in semigray atmospheres make use of constant absorption coefficients in finite-width spectral bands. Here, such a semigray absorption is introduced in a one-dimensional (1D) radiative–convective model with a stratosphere in radiative equilibrium and a troposphere fully saturated with water vapor, which is the semigray gas. A single atmospheric window in the infrared spectrum has been assumed. In contrast to the single absolute limit of OLR found in gray atmospheres, semigray ones may also show a relative limit. This means that both finite and infinite runaway effects may arise in some semigray cases. Of particular importance is the finding of an entirely new branch of stable steady states that does not appear in gray atmospheres. This new multiple equilibrium is a consequence of the nongray absorption only. It is suspected that this new set of stable solutions has not been previously revealed in analyses of radiative–convective models since it does not appear for an atmosphere with nongray parameters similar to those for the Earth's current state.
Abstract:
The longwave emission of planetary atmospheres that contain a condensable absorbing gas in the infrared (i.e., longwave), which is in equilibrium with its liquid phase at the surface, may exhibit an upper bound. Here we analyze the effect of the atmospheric absorption of sunlight on this radiation limit. We assume that the atmospheric absorption of infrared radiation is independent of wavelength except within the spectral width of the atmospheric window, where it is zero. The temperature profile in radiative equilibrium is obtained analytically as a function of the longwave optical thickness. For illustrative purposes, numerical values for the infrared atmospheric absorption (i.e., greenhouse effect) and the liquid-vapor equilibrium curve of the condensable absorbing gas refer to water. Values for the atmospheric absorption of sunlight (i.e., antigreenhouse effect) take a wide range since our aim is to provide a qualitative view of their effects. We find that atmospheres with a transparent region in the infrared spectrum do not present an absolute upper bound on the infrared emission. This result may also be found in atmospheres opaque at all infrared wavelengths if the fraction of absorbed sunlight in the atmosphere increases with the longwave opacity.
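As a point of reference for the radiative-equilibrium temperature profile and the radiation limit discussed in the two abstracts above, the textbook gray two-stream relations can be written as follows; this is a sketch of the standard expressions, not the papers' semigray solutions.

```latex
% Gray two-stream radiative equilibrium (textbook form): temperature as a
% function of longwave optical depth \tau measured from the top of the
% atmosphere, with F_OLR the outgoing longwave radiation.
\[
  \sigma T^{4}(\tau) \;=\; \frac{F_{\mathrm{OLR}}}{2}\,\bigl(1 + \tau\bigr).
\]
% With the absorbing gas in equilibrium with its liquid phase at the surface,
% Clausius--Clapeyron ties the surface vapour pressure (and hence the total
% longwave optical thickness) to the surface temperature T_s,
\[
  e_{s}(T_{s}) \;=\; e_{0}\,
  \exp\!\Bigl[-\frac{L}{R_{v}}\Bigl(\frac{1}{T_{s}} - \frac{1}{T_{0}}\Bigr)\Bigr],
\]
% and this coupling is what can bound F_OLR from above (the runaway limit
% whose gray vs. semigray behaviour is analysed in the papers).
```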
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic by simply incorporating into it our biased randomization process together with a high-quality pseudo-random number generator.
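To make the ILS structure referred to above concrete, here is a minimal illustrative Python sketch of an iterated local search for the PFSP; the operators, acceptance rule, identity starting permutation, and tiny random instance are simplifications assumed for illustration, not the ILS-ESP implementation itself (which, per the abstract, builds its starting solutions by biased randomization of a classical PFSP heuristic).

```python
# Minimal Iterated Local Search skeleton for the permutation flowshop problem.
# Illustrative sketch only; operators and rules are deliberately simple.
import random

def makespan(perm, proc):
    """Completion time of the last job on the last machine."""
    m = len(proc[0])
    comp = [0.0] * m
    for job in perm:
        comp[0] += proc[job][0]
        for k in range(1, m):
            comp[k] = max(comp[k], comp[k - 1]) + proc[job][k]
    return comp[-1]

def local_search(perm, proc):
    """Insertion-neighbourhood local search, repeated until no improvement."""
    best, best_val = perm[:], makespan(perm, proc)
    improved = True
    while improved:
        improved = False
        for i in range(len(best)):
            for j in range(len(best)):
                if i == j:
                    continue
                cand = best[:]
                cand.insert(j, cand.pop(i))
                val = makespan(cand, proc)
                if val < best_val:
                    best, best_val, improved = cand, val, True
    return best, best_val

def perturb(perm):
    """Simple perturbation: swap two randomly chosen jobs."""
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def ils(proc, iterations=200):
    # Identity permutation as a simple starting point (a simplification here).
    current, cur_val = local_search(list(range(len(proc))), proc)
    best, best_val = current[:], cur_val
    for _ in range(iterations):
        cand, cand_val = local_search(perturb(current), proc)
        if cand_val <= cur_val:          # accept non-worsening solutions
            current, cur_val = cand, cand_val
        if cand_val < best_val:
            best, best_val = cand[:], cand_val
    return best, best_val

# Tiny usage example: 5 jobs x 3 machines with random processing times.
proc_times = [[random.randint(1, 9) for _ in range(3)] for _ in range(5)]
print(ils(proc_times, iterations=50))
```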
Abstract:
Business organisations are excellent representations of what in physics and mathematics are designated "chaotic" systems. Because a culture of innovation will be vital for organisational survival in the 21st century, the present paper proposes that viewing organisations in terms of "complexity theory" may assist leaders in fine-tuning managerial philosophies that provide orderly management emphasizing stability within a culture of organised chaos, for it is on the "boundary of chaos" that the greatest creativity occurs. It is argued that 21st century companies, as chaotic social systems, will no longer be effectively managed by rigid objectives (MBO) nor by instructions (MBI). Their capacity for self-organisation will be derived essentially from how their members accept a shared set of values or principles for action (MBV). Complexity theory deals with systems that show complex structures in time or space, often hiding simple deterministic rules. This theory holds that once these rules are found, it is possible to make effective predictions and even to control the apparent complexity. The state of chaos that self-organises, thanks to the appearance of the "strange attractor", is the ideal basis for creativity and innovation in the company. In this self-organised state of chaos, members are not confined to narrow roles, and gradually develop their capacity for differentiation and relationships, growing continuously toward their maximum potential contribution to the efficiency of the organisation. In this way, values act as organisers or "attractors" of disorder, which in the theory of chaos are equations represented by unusually regular geometric configurations that predict the long-term behaviour of complex systems. In business organisations (as in all kinds of social systems) the starting principles end up as the final principles in the long term. An attractor is a model representation of the behavioral results of a system. The attractor is not a force of attraction or a goal-oriented presence in the system; it simply depicts where the system is headed based on its rules of motion. Thus, in a culture that cultivates or shares values of autonomy, responsibility, independence, innovation, creativity, and proaction, the risk of short-term chaos is mitigated by an overall long-term sense of direction. A more suitable approach to manage the internal and external complexities that organisations are currently confronting is to alter their dominant culture under the principles of MBV.
Abstract:
The main information sources to study a particular piece of music are symbolic scores and audio recordings. These are complementary representations of the piece, and it is very useful to have a proper linking of the musically meaningful events between the two. For the case of makam music of Turkey, linking the available scores with the corresponding audio recordings requires taking the specificities of this music into account, such as the particular tunings, the extensive usage of non-notated expressive elements, and the way in which the performer repeats fragments of the score. Moreover, for most of the pieces of the classical repertoire, there is no score written by the original composer. In this paper, we propose a methodology to pair sections of a score with the corresponding fragments of audio recording performances. The pitch information obtained from both sources is used as the common representation to be paired. From an audio recording, fundamental frequency estimation and tuning analysis are performed to compute a pitch contour. From the corresponding score, symbolic note names and durations are converted into a synthetic pitch contour. Then, a linking operation is performed between these pitch contours in order to find the best correspondences. The method is tested on a dataset of 11 compositions spanning 44 audio recordings, which are mostly monophonic. F3-scores of 82% and 89% are obtained with automatic and semi-automatic karar detection respectively, showing that the methodology may give us a needed tool for further computational tasks such as form analysis, audio-score alignment, and makam recognition.
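As an illustration of the common pitch representation described above, the sketch below converts symbolic notes and durations into a synthetic pitch contour and maps an audio f0 track to cents above an estimated karar frequency; the 12-tone note mapping, hop size, and toy score are simplifying assumptions, not the paper's makam-specific procedure.

```python
# Sketch: score-side synthetic pitch contour and audio-side f0 track, both
# expressed in cents above the tonic / karar. Illustrative only.
import numpy as np

NOTE_TO_SEMITONES = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def synthetic_contour(score, tonic_midi, hop=0.01):
    """score: list of (note_name, octave, duration_s); returns cents above the tonic."""
    contour = []
    for name, octave, dur in score:
        midi = 12 * (octave + 1) + NOTE_TO_SEMITONES[name]
        contour.extend([(midi - tonic_midi) * 100.0] * int(round(dur / hop)))
    return np.array(contour)

def to_cents(f0_hz, karar_hz):
    """Map an audio f0 track (Hz, 0 = unvoiced) to cents above the karar frequency."""
    f0 = np.asarray(f0_hz, dtype=float)
    out = np.full_like(f0, np.nan)
    voiced = f0 > 0
    out[voiced] = 1200.0 * np.log2(f0[voiced] / karar_hz)
    return out

# Toy usage: A4 taken as tonic; the two contours could then be linked, e.g.
# by subsequence matching between score sections and the recording (not shown).
score_fragment = [("A", 4, 0.5), ("B", 4, 0.25), ("C", 5, 0.25)]
print(synthetic_contour(score_fragment, tonic_midi=69)[:5])
print(to_cents([440.0, 0.0, 523.25], karar_hz=440.0))
```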
Abstract:
The spread of mineral particles over southwestern, western, and central Europe resulting from a strong Saharan dust outbreak in October 2001 was observed at 10 stations of the European Aerosol Research Lidar Network (EARLINET). For the first time, an optically dense desert dust plume over Europe was characterized coherently with high vertical resolution on a continental scale. The main layer was located above the boundary layer (above 1-km height above sea level (asl)) up to 3–5-km height, and traces of dust particles reached heights of 7–8 km. The particle optical depth typically ranged from 0.1 to 0.5 above 1-km height asl at the wavelength of 532 nm, and maximum values close to 0.8 were found over northern Germany. The lidar observations are in qualitative agreement with values of optical depth derived from Total Ozone Mapping Spectrometer (TOMS) data. Ten-day backward trajectories clearly indicated the Sahara as the source region of the particles and revealed that the dust layer observed, e.g., over Belsk, Poland, crossed the EARLINET site Aberystwyth, UK, and southern Scandinavia 24–48 hours before. Lidar-derived particle depolarization ratios, backscatter- and extinction-related Ångström exponents, and extinction-to-backscatter ratios mainly ranged from 15 to 25%, 0.5 to 0.5, and 40–80 sr, respectively, within the lofted dust plumes. A few atmospheric model calculations are presented showing the dust concentration over Europe. The simulations were found to be consistent with the network observations.
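For reference, the backscatter- and extinction-related Ångström exponents and the extinction-to-backscatter (lidar) ratio quoted above are conventionally defined as follows; the two-wavelength form is the standard textbook definition and is not taken from this abstract.

```latex
% Two-wavelength Angstrom exponents for the particle backscatter coefficient
% \beta and extinction coefficient \alpha, evaluated at wavelengths
% \lambda_1 < \lambda_2:
\[
  \mathring{A}_{\beta} \;=\;
  -\,\frac{\ln\!\bigl[\beta(\lambda_1)/\beta(\lambda_2)\bigr]}
          {\ln\!\bigl(\lambda_1/\lambda_2\bigr)},
  \qquad
  \mathring{A}_{\alpha} \;=\;
  -\,\frac{\ln\!\bigl[\alpha(\lambda_1)/\alpha(\lambda_2)\bigr]}
          {\ln\!\bigl(\lambda_1/\lambda_2\bigr)}.
\]
% The extinction-to-backscatter (lidar) ratio, quoted above in sr, is
% S(\lambda) = \alpha(\lambda) / \beta(\lambda).
```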
Abstract:
Developed within Myfhe, S.A., a manufacturer of cutting tools, this multilingual web application catalogues the broad product range of this worldwide exporter. It also provides an assisted interface (for novice users) and a manual interface (for expert users) that generate the product breakdown, the dimensioned drawing of the assembled products, and other technical information once the user completes a simple questionnaire. In this way, a task that must be performed every time this company's products are purchased, before they are put into operation in the workshop, is automated, with gains in both time and quality.
Abstract:
A considerable fraction of the gamma-ray sources discovered with the Energetic Gamma-Ray Experiment Telescope (EGRET) remain unidentified. The EGRET sources that have been properly identified are either pulsars or variable sources at both radio and gamma-ray wavelengths. Most of the variable sources are strong radio blazars. However, some low galactic-latitude EGRET sources, with highly variable gamma-ray emission, lack any evident counterpart according to the radio data available until now. Aims. The primary goal of this paper is to identify and characterise the potential radio counterparts of four highly variable gamma-ray sources in the galactic plane through mapping the radio surroundings of the EGRET confidence contours and determining the variable radio sources in the field whenever possible. Methods. We have carried out a radio exploration of the fields of the selected EGRET sources using the Giant Metrewave Radio Telescope (GMRT) interferometer at 21 cm wavelength, with pointings being separated by months. Results. We detected a total of 151 radio sources. Among them, we identified a few radio sources whose flux density has apparently changed on timescales of months. Despite the limitations of our search, their possible variability makes these objects a top-priority target for multiwavelength studies of the potential counterparts of highly variable, unidentified gamma-ray sources.
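A minimal sketch of the kind of two-epoch variability check implied by the search for changing flux densities between pointings; the significance threshold and example numbers are assumptions for illustration, not the criteria actually used in the survey.

```python
# Flag a radio source as variable if its flux density changed between two
# epochs by more than n_sigma times the combined uncertainty (assumed rule).
import math

def is_variable(s1_mjy, e1_mjy, s2_mjy, e2_mjy, n_sigma=4.0):
    """True if |S1 - S2| exceeds n_sigma * sqrt(e1^2 + e2^2)."""
    return abs(s1_mjy - s2_mjy) > n_sigma * math.hypot(e1_mjy, e2_mjy)

# Example: 12.0 +/- 0.8 mJy in epoch 1 vs. 6.5 +/- 0.7 mJy in epoch 2.
print(is_variable(12.0, 0.8, 6.5, 0.7))   # True: 5.5 > 4 * 1.06
```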
Abstract:
We present an extensive study of the structural and optical emission properties in aluminum silicates and soda-lime silicates codoped with Si nanoclusters (Si-nc) and Er. Si excesses of 5 and 15 at.% and Er concentrations ranging from 2×10¹⁹ up to 6×10²⁰ cm⁻³ were introduced by ion implantation. Thermal treatments at different temperatures were carried out before and after Er implantation. Structural characterization of the resulting structures was performed to obtain the layer composition and the size distribution of Si clusters. A comprehensive study has been carried out of the light emission as a function of the matrix characteristics, Si and Er contents, excitation wavelength, and power. Er emission at 1540 nm has been detected in all coimplanted glasses, with similar intensities. We estimated lifetimes ranging from 2.5 to 12 ms (depending on the Er dose and Si excess) and an effective excitation cross section of about 1×10⁻¹⁷ cm² at low fluxes that decreases at high pump power. By quantifying the amount of Er ions excited through Si-nc we find a fraction of 10% of the total Er concentration. Upconversion coefficients of about 3×10⁻¹⁸ cm³ s⁻¹ have been found for soda-lime glasses and one order of magnitude lower in aluminum silicates.
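The effective excitation cross section, the lifetimes, and the saturation at high pump power mentioned above can be related through the standard two-level steady-state rate equation sketched below; this is a textbook estimate, not an expression taken from the paper.

```latex
% Two-level steady-state estimate: effective excitation cross section
% \sigma_eff, pump photon flux \phi, and Er lifetime \tau give the excited
% fraction of the Er population coupled to the pump channel,
\[
  \frac{N^{*}}{N_{\mathrm{Er}}}
  \;=\;
  \frac{\sigma_{\mathrm{eff}}\,\phi\,\tau}
       {1 + \sigma_{\mathrm{eff}}\,\phi\,\tau},
\]
% which saturates at high flux; saturation of this form is consistent with an
% effective cross section that appears to decrease at high pump power, as
% reported above.
```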
Abstract:
The sensitizing action of amorphous silicon nanoclusters on erbium ions in thin silica films has been studied under low-energy (long-wavelength) optical excitation. Profound differences in fast visible and infrared emission dynamics have been found with respect to the high-energy (short-wavelength) case. These findings point to a strong dependence of the energy transfer process on the optical excitation energy. Total inhibition of energy transfer to erbium states higher than the first excited state (⁴I₁₃/₂) has been demonstrated for excitation energies below 1.82 eV (excitation wavelengths longer than 680 nm). Direct excitation of erbium ions to the first excited state (⁴I₁₃/₂) has been confirmed to be the dominant energy transfer mechanism over the whole spectral range of optical excitation used (540 nm–680 nm).
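The correspondence between the 1.82 eV threshold and the 680 nm wavelength quoted above follows from the usual photon-energy conversion:

```latex
\[
  E\,[\mathrm{eV}] \;=\; \frac{hc}{\lambda}
  \;\approx\; \frac{1239.8\ \mathrm{eV\,nm}}{\lambda\,[\mathrm{nm}]},
  \qquad
  \frac{1239.8\ \mathrm{eV\,nm}}{680\ \mathrm{nm}} \;\approx\; 1.82\ \mathrm{eV}.
\]
```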
Abstract:
Power leakage properties and guiding conditions of rib antiresonant reflecting optical waveguides (rib-ARROWs) have been theoretically and experimentally studied as a function of the wavelength and polarization of the light for different geometrical and optical parameters that characterize the rib-ARROW structure. The results obtained show that rib-ARROWs can only be fabricated with low losses within a given wavelength range when certain rib configurations are adopted. Furthermore, these waveguides exhibit a polarization sensitivity that largely depends on the core-substrate refractive index difference. Together with the experimental results, theoretical calculations from different modeling methods are also presented and discussed.