870 results for METHODOLOGY
Abstract:
This paper proposes an interleaved multiphase buck converter with a minimum-time control strategy for envelope amplifiers in high-efficiency RF power amplifiers. The envelope amplifier combines the proposed converter with a linear regulator in series. High system efficiency can be obtained by modulating the supply voltage of the envelope amplifier with the fast output-voltage transitions of the converter, which operates at several particular duty cycles that achieve total ripple cancellation. The transient model for minimum-time control is explained, and the calculation of the transient times, which are pre-computed and stored in a look-up table, is presented. The filter design trade-off that limits the envelope-modulation capability is also discussed. Experimental results verify the fast voltage transients obtained with a 4-phase buck prototype.
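The look-up-table scheme lends itself to a compact sketch. In the following minimal Python example (the level count, voltage values, and the placeholder cost model are illustrative assumptions, not the authors' transient equations), the transition times between the discrete output levels are computed off-line and only read at run time:

```python
# Minimal sketch of a minimum-time control look-up table (illustrative).
N_LEVELS = 5  # assumed: one output level per ripple-cancelling duty cycle

def transient_time(v_from: float, v_to: float) -> float:
    """Placeholder for the off-line transient-model calculation (assumption):
    returns the minimum transition time, in seconds, between two levels."""
    return abs(v_to - v_from) * 1e-6  # dummy model, not the paper's equations

levels = [3.0 + 2.0 * k for k in range(N_LEVELS)]  # assumed voltage levels

# Off-line: fill the table once, indexed by (from_level, to_level).
lut = {(i, j): transient_time(levels[i], levels[j])
       for i in range(N_LEVELS) for j in range(N_LEVELS)}

def transition(i: int, j: int) -> float:
    """Run-time step: a single table read instead of an on-line computation."""
    return lut[(i, j)]

print(transition(0, 3))  # pre-computed minimum time from level 0 to level 3
```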
Abstract:
Several types of parallelism can be exploited in logic programs while preserving correctness and efficiency, i.e., ensuring that the parallel execution obtains the same results as the sequential one and that the amount of work performed is not greater. However, such results do not take into account a number of overheads which appear in practice, such as process creation and scheduling, which can induce a slow-down or at least limit speedup if they are not controlled in some way. This paper describes a methodology whereby the granularity of parallel tasks, i.e., the work available under them, is efficiently estimated and used to limit parallelism so that the effect of such overheads is controlled. The run-time overhead associated with the approach is usually quite small, since as much work as possible is done at compile time. Also, a number of run-time optimizations are proposed. Moreover, a static analysis of the overhead associated with the granularity control process itself is performed in order to decide whether it is worthwhile. The performance improvements resulting from the incorporation of grain-size control are shown to be quite good, especially for systems with medium to large parallel execution overheads.
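The run-time side of the scheme can be illustrated with a short sketch. In the following Python fragment (the cost function and the overhead threshold are illustrative stand-ins for the compile-time-derived estimates, not the paper's analysis), a task is spawned in parallel only when its estimated grain size exceeds the spawning overhead:

```python
# Illustrative granularity control: spawn a parallel task only if its
# estimated work exceeds the cost of creating and scheduling the task.
from concurrent.futures import ThreadPoolExecutor

SPAWN_OVERHEAD = 1000  # assumed cost units for process creation/scheduling

def estimated_cost(n: int) -> int:
    """Stand-in for a compile-time-derived cost function (assumption):
    here the estimated work grows linearly with the input size."""
    return 100 * n

def work(n: int) -> int:
    return sum(range(n))

def maybe_parallel(pool: ThreadPoolExecutor, n: int) -> int:
    if estimated_cost(n) > SPAWN_OVERHEAD:
        return pool.submit(work, n).result()  # large grain: run in parallel
    return work(n)                            # small grain: run inline

with ThreadPoolExecutor() as pool:
    print(maybe_parallel(pool, 5), maybe_parallel(pool, 5000))
```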
Abstract:
Changes in the geomorphology of rivers have serious repercussions, causing losses in the dynamics and naturalness of their forms; in many cases a meandering channel, with constant erosion and sedimentation processes, becomes a channelized narrow river with rigid, stable margins, where the only possibility of movement is vertical, so that the only changes in channel geometry occur in the river bed. These changes also seriously affect the naturalness of the banks, preventing the development of riparian vegetation and reducing the cross connectivity of the riparian corridor. Common channelizations and disconnections of meanders increase the slope, and therefore the flow velocity, resulting in processes of regressive erosion, an effect increased by the narrowing of the channel and the concentration of flows. This process of incision may leave the flood plain "hung", completely disconnected from the water table, with important consequences for vegetation. As an example of the effects of these changes, the case of the Arga River was chosen. The Arga River has been channelized and rectified as it passes along the Ramal Hondo and Soto Gil meanders (Funes, Navarra). The effects on fish habitat and riparian vegetation of remeandering the Arga River are presented. Two very contrasting restoration hypotheses, in terms of geomorphology, have been established to assess the effects these changes have on the habitat of one of the major fish species in the area (Luciobarbus graellsii) and on the riparian vegetation. To accomplish this goal, a digital elevation model provided by a LIDAR flight, bathymetric data, and flow data were used as inputs to a 2D hydraulic simulation model (InfoWorks RS). The results obtained helped not only to evaluate the effects of past alterations of the geomorphologic characteristics, but also to predict fish and vegetation habitat responses to this type of change.
Abstract:
Current trends in the European Higher Education Area (EHEA) are moving towards continuous evaluation of students, replacing the traditional evaluation based on a single test or exam. This fact, together with the increase in the number of students in Engineering Schools in recent years, requires evaluation procedures to be modified so that they remain compatible with educational and research activities. This work presents a methodology for the automatic generation of questions. These questions can be used as self-assessment questions by the student and/or as queries by the teacher. The proposed approach is based on parametric questions, formulated as multiple-choice questions and generated and supported by common spreadsheet and word-processor programs. With this approach, every teacher can apply the proposed methodology without using programs or tools other than those normally employed in his/her daily activity.
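The parametric-question idea can be made concrete with a small sketch. In the following Python example (the question template, parameter ranges, and distractor rule are assumptions for illustration; the paper's workflow uses spreadsheets and word processors instead), each instance of a question is produced by drawing parameter values and computing the correct answer plus plausible distractors:

```python
# Illustrative parametric multiple-choice question generator.
import random

def ohms_law_question(rng: random.Random) -> dict:
    """One parametric template: V = I * R with randomly drawn parameters."""
    current = rng.randint(1, 10)       # amperes
    resistance = rng.randint(5, 50)    # ohms
    correct = current * resistance     # volts
    # Distractors built from typical mistakes (assumed rule).
    options = {correct, current + resistance, resistance - current, 10 * correct}
    choices = sorted(options)
    return {
        "stem": f"A {resistance} ohm resistor carries {current} A. "
                f"What is the voltage across it (in V)?",
        "choices": choices,
        "answer": choices.index(correct),
    }

rng = random.Random(42)  # a fixed seed makes each variant reproducible
for _ in range(3):
    q = ohms_law_question(rng)
    print(q["stem"], q["choices"], "correct index:", q["answer"])
```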
Abstract:
During the Peninsular War, Napoleon's and Wellington's armies were aware of the lack of precision in the maps of Spain and its provinces that appeared in Tomás López's Geographical Atlas of Spain. The errors were due to the non-topographical surveying method he used, which he had learned from his teacher Jean-Baptiste Bourguignon d'Anville. To map all of the Spanish provinces, Tomás López divided them into circles of three leagues in diameter (16,718 m), taking a particular town as the centre. He asked the town's priest to draw a map of the territory and to complete a questionnaire that Tomás López sent to him. The priest was to return the two documents after he had completed them. Subsequently, at his desk, Tomás López used the maps and reports, as well as other graphic and written sources from various locations, to make an outline of each map. Next, he made a mosaic that served as a pattern for drawing the final provincial map. We will see how this method was applied in two concrete cases, the villages of Chavaler and Monteagudo in the Spanish province of Soria, and verify their degree of accuracy. We will use the maps drawn by the priests in 1767, the final map of the province published in 1804 by Tomás López, and a current map of the province showing the angular and linear errors in López's map.
Abstract:
The aim of this research was to implement a methodology, based on a supervised classifier using the Mahalanobis distance, to characterize the grapevine canopy and assess leaf area and yield from RGB images. The method automatically processes sets of images and calculates the areas (numbers of pixels) corresponding to seven different classes (Grapes, Wood, Background, and four classes of Leaf of increasing leaf age). Each class is initialized by the user, who selects a set of representative pixels in order to induce the clustering around them. The proposed methodology was evaluated with 70 grapevine (V. vinifera L. cv. Tempranillo) images, acquired in a commercial vineyard in La Rioja (Spain) after several defoliation and de-fruiting events on 10 vines, with a conventional RGB camera and no artificial illumination. The segmentation results showed a performance of 92% for leaves and 98% for clusters, and allowed the grapevine's leaf area and yield to be assessed with R2 values of 0.81 (p < 0.001) and 0.73 (p = 0.002), respectively. This methodology, which operates with a simple image-acquisition setup and guarantees the right number and kind of pixel classes, has proven suitable and robust enough to provide valuable information for vineyard management.
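A minimal sketch of such a pixel classifier, assuming per-class statistics estimated from the user-selected training pixels (the class names, array shapes, and dummy data below are illustrative, not the paper's code):

```python
# Illustrative Mahalanobis-distance pixel classifier for RGB images.
import numpy as np

def fit_class(pixels: np.ndarray):
    """Estimate mean and inverse covariance from user-selected RGB pixels
    (pixels: array of shape (n, 3))."""
    mu = pixels.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
    return mu, cov_inv

def classify(image: np.ndarray, classes: dict) -> np.ndarray:
    """Assign each pixel to the class with the smallest Mahalanobis distance.
    image: (h, w, 3); returns an (h, w) array of class indices."""
    flat = image.reshape(-1, 3).astype(float)
    dists = []
    for mu, cov_inv in classes.values():
        d = flat - mu
        dists.append(np.einsum("ij,jk,ik->i", d, cov_inv, d))  # squared dist.
    labels = np.argmin(np.stack(dists, axis=1), axis=1)
    return labels.reshape(image.shape[:2])

# Usage with dummy data standing in for user-selected training pixels:
rng = np.random.default_rng(0)
classes = {
    "Grapes": fit_class(rng.normal([60, 30, 80], 10, (200, 3))),
    "Leaf":   fit_class(rng.normal([40, 120, 40], 10, (200, 3))),
}
label_map = classify(rng.integers(0, 255, (4, 4, 3)), classes)
print(label_map)
```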
Abstract:
We propose a computational methodology, "B-LOG", which offers the potential for an effective implementation of Logic Programming on a parallel computer. We also propose a weighting scheme to guide the search process through the graph, and we apply the concepts of parallel "branch and bound" algorithms in order to perform a "best-first" search using an information-theoretic bound. The concept of a "session" is used to speed up the search process in a succession of similar queries. Within a session, we strongly modify the bounds in a local database, while bounds kept in a global database are weakly modified to provide a better initial condition for other sessions. We also propose an implementation scheme based on a database machine using "semantic paging", and a "B-LOG processor" based on a scoreboard-driven controller.
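The best-first branch-and-bound search can be sketched compactly. In the following Python fragment (the bound, goal test, and cost functions are placeholders; B-LOG's information-theoretic bound and session-based bound updating are not reproduced here), nodes are expanded in order of their optimistic bound, and branches that cannot improve on the incumbent are pruned:

```python
# Illustrative best-first branch and bound over a search graph.
import heapq

def branch_and_bound(root, expand, bound, is_goal, cost):
    """expand(n) -> children; bound(n) -> optimistic lower bound on any
    solution below n; is_goal(n) -> bool; cost(n) -> cost of a solution."""
    best, incumbent = float("inf"), None
    frontier = [(bound(root), 0, root)]
    tie = 1  # tie-breaker so heapq never compares nodes directly
    while frontier:
        b, _, node = heapq.heappop(frontier)  # best-first: smallest bound
        if b >= best:
            continue  # prune: this branch cannot improve the incumbent
        if is_goal(node):
            if cost(node) < best:
                best, incumbent = cost(node), node
            continue
        for child in expand(node):
            if bound(child) < best:
                heapq.heappush(frontier, (bound(child), tie, child))
                tie += 1
    return incumbent, best
```

In B-LOG's setting, the bounds stored in the local database would be strengthened between successive similar queries within a session, improving the initial `bound` values this loop consumes.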
Abstract:
This introduction gives a general perspective of the debugging methodology and the tools developed in the ESPRIT IV project DiSCiPl (Debugging Systems for Constraint Programming). It has been prepared by the editors of this volume by substantially rewriting the DiSCiPl deliverable CP Debugging Tools [1]. This introduction is organised as follows. Section 1 outlines the DiSCiPl view of debugging and its associated debugging methodology, and motivates the kinds of tools proposed: the assertion-based tools, the declarative diagnoser and the visualisation tools. Sections 2 through 4 provide a short presentation of the tools of each kind. Finally, Section 5 presents a summary of the tools developed in the project. This introduction gives only a general view of the DiSCiPl debugging methodology and tools; for details and for specific bibliographic references the reader is referred to the subsequent chapters.
Abstract:
Mesh adaptation based on error estimation has become a key technique to improve the accuracy of computational-fluid-dynamics computations. The adjoint-based approach for error estimation is one of the most promising techniques for computational-fluid-dynamics applications. Nevertheless, the level of implementation of this technique in the aeronautical industrial environment is still low because it is a computationally expensive method. In the present investigation, a new mesh refinement method based on estimation of the truncation error is presented in the context of finite-volume discretization. The estimation method uses auxiliary coarser meshes to estimate the local truncation error, which can be used for driving an adaptation algorithm. The method is demonstrated in the context of two-dimensional NACA0012 and three-dimensional ONERA M6 wing inviscid flows, and the results are compared against the adjoint-based approach and physical sensors based on features of the flow field.
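The coarse-mesh estimation idea can be illustrated in one dimension. In the sketch below (a 1D Poisson problem with a second-order finite-difference operator standing in for the finite-volume discretization; all names and the model problem are illustrative), a converged fine-grid solution is injected onto an auxiliary coarser grid, and the coarse operator's residual approximates the local truncation error:

```python
# Illustrative truncation-error estimation on an auxiliary coarse grid (1D).
import numpy as np

def apply_operator(u, h):
    """Second-order discrete Laplacian on interior points."""
    return (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2

n = 65                                   # fine-grid points (with boundaries)
x_fine = np.linspace(0.0, 1.0, n)
u_fine = np.sin(np.pi * x_fine)          # stand-in for a converged solution
f = -np.pi**2 * np.sin(np.pi * x_fine)   # right-hand side of u'' = f

x_coarse = x_fine[::2]                   # auxiliary coarser mesh (injection)
u_coarse = u_fine[::2]
h_coarse = x_coarse[1] - x_coarse[0]

# Coarse-grid residual of the restricted fine solution estimates the local
# truncation error, which could then drive a refinement sensor.
tau = apply_operator(u_coarse, h_coarse) - f[::2][1:-1]
print(np.max(np.abs(tau)))  # large |tau| flags cells for refinement
```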
Abstract:
The Direct Boundary Element Method (DBEM) is presented to solve the elastodynamic field equations in 2D, and a comprehensive implementation is given. The DBEM is a useful approach for obtaining reliable numerical estimates of site effects on seismic ground motion due to irregular geological configurations, of both layering and topography. The method is based on the discretization of the classical Somigliana elastodynamic representation equation, which stems from the reciprocity theorem. This equation is given in terms of the Green's function, which is the full-space harmonic steady-state fundamental solution. The formulation permits the treatment of viscoelastic media, so site models with intrinsic attenuation can be examined. By means of this approach, the 2D scattering of seismic waves due to the incidence of P and SV waves on irregular topographical profiles is computed. Sites such as canyons, mountains and valleys in irregular multilayered media are computed to test the technique. The obtained transfer functions show excellent agreement with previously published results.
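For reference, the boundary integral equation that results from Somigliana's identity typically takes the following standard form (the notation here is a sketch and may differ from the paper's symbols):

```latex
% Somigliana-based boundary integral equation (standard form; illustrative
% notation). c_{ij} is the free-term coefficient, u_j and t_j are the boundary
% displacements and tractions, and U^*_{ij}, T^*_{ij} are the displacement and
% traction components of the full-space harmonic fundamental solution.
c_{ij}(\boldsymbol{\xi})\,u_j(\boldsymbol{\xi})
  + \int_{\Gamma} T^{*}_{ij}(\mathbf{x},\boldsymbol{\xi};\omega)\,u_j(\mathbf{x})\,d\Gamma(\mathbf{x})
  = \int_{\Gamma} U^{*}_{ij}(\mathbf{x},\boldsymbol{\xi};\omega)\,t_j(\mathbf{x})\,d\Gamma(\mathbf{x})
```

The first integral is understood in the Cauchy principal value sense; discretizing the boundary Γ into elements turns this identity into the linear system solved by the DBEM.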
Abstract:
Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects on human and natural activities. Maintaining an updated spatial database of the changes that have occurred allows better monitoring of the Earth's resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery and aerial photographs, have proven to be suitable and reliable data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then, different change/no-change thresholding algorithms are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a multisource fusion process, which generates a single CD result from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. The results obtained are evaluated by means of a quality-control analysis, as well as with complementary graphical representations. The suggested methodology has also proven efficient for identifying the change detection index with the highest contribution.
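A compact sketch of the index-then-threshold-then-fuse flow (the difference index, Otsu-style thresholding, and majority-vote fusion below are illustrative stand-ins for the paper's specific indices, thresholding algorithms, and fusion process):

```python
# Illustrative multisource change detection: index, threshold, fuse.
import numpy as np

def change_index(img_t1: np.ndarray, img_t2: np.ndarray) -> np.ndarray:
    """Simple difference index between two co-registered single-band images."""
    return np.abs(img_t2.astype(float) - img_t1.astype(float))

def otsu_threshold(index: np.ndarray) -> float:
    """Basic Otsu threshold separating change from no-change statistics."""
    hist, edges = np.histogram(index, bins=256)
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)
    w1 = w0[-1] - w0
    m0 = np.cumsum(hist * centers)
    mu0 = m0 / np.maximum(w0, 1)
    mu1 = (m0[-1] - m0) / np.maximum(w1, 1)
    between = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
    return centers[np.argmax(between)]

def fuse(masks: list) -> np.ndarray:
    """Majority vote across binary change masks from several indices."""
    return np.mean(np.stack(masks), axis=0) >= 0.5

rng = np.random.default_rng(1)
t1, t2 = rng.random((64, 64)), rng.random((64, 64))
indices = [change_index(t1, t2), change_index(t1 ** 2, t2 ** 2)]
masks = [idx > otsu_threshold(idx) for idx in indices]
print(fuse(masks).sum(), "changed pixels (toy data)")
```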
Abstract:
Greenhouse gas emission reduction is the pillar of the Kyoto Protocol and one of the main goals of European Union (EU) energy policy. The Kyoto Protocol set national reduction targets for EU member states and an overall target of 8% for the EU-15. This reduction target is based on emissions in the reference year (1990) and must be reached by 2012. EU energy policy does not set any national targets, only an overall reduction target of 20% by 2020. This paper transfers the global greenhouse gas emission reduction targets in both these documents to the transport sector, and specifically to CO2 emissions. It proposes a nonlinear distribution method with objective, dynamic targets for reducing CO2 emissions in the transport sector, according to the context and characteristics of each geographical area. First, we analyse CO2 emissions from transport in the reference year (1990) and their evolution from 1990 to 2007. We then propose a nonlinear methodology for distributing dynamic CO2 emission reduction targets. We have applied the proposed distribution function for 2012 and 2020 at two territorial levels (EU member states and Spanish autonomous regions). The weighted distribution is based on per capita CO2 emissions and CO2 emissions per unit of gross domestic product. Finally, we show the weighted targets found for each EU member state and each Spanish autonomous region, compare them with the real achievements to date, and forecast the situation for the years the Kyoto and EU goals are to be met. The results underline the need for "weighted" decentralised decisions to be made at different territorial levels with a view to achieving a common goal, so that relative convergence of all the geographical areas is reached over time. Copyright © 2011 John Wiley & Sons, Ltd.
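To make the weighted-distribution idea concrete, the following toy Python sketch (the weighting function, exponent, and all data are assumptions for illustration; the paper's actual nonlinear function is not reproduced here) allocates a common reduction target across regions in proportion to per-capita emissions and emissions per GDP:

```python
# Illustrative weighted distribution of a common CO2 reduction target.
# Regions with higher per-capita emissions and higher emissions per GDP
# receive proportionally larger reduction targets (assumed weighting rule).

regions = {
    # name:     (emissions Mt, population M, GDP bn) -- toy numbers
    "Region A": (100.0, 10.0, 300.0),
    "Region B": (50.0, 20.0, 500.0),
    "Region C": (80.0, 5.0, 100.0),
}

OVERALL_CUT = 0.20  # e.g. a 20% overall reduction target
ALPHA = 1.5         # assumed nonlinearity exponent (>1 penalises intensity)

def weight(em: float, pop: float, gdp: float) -> float:
    per_capita = em / pop
    per_gdp = em / gdp
    return (per_capita * per_gdp) ** (ALPHA / 2)  # assumed combination rule

total_em = sum(em for em, _, _ in regions.values())
total_cut = OVERALL_CUT * total_em
weights = {name: weight(*vals) * vals[0] for name, vals in regions.items()}
wsum = sum(weights.values())

for name, (em, _, _) in regions.items():
    cut = total_cut * weights[name] / wsum
    print(f"{name}: cut {cut:.1f} Mt ({100 * cut / em:.1f}% of its emissions)")
```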
Abstract:
By using the spray pyrolysis methodology in its classical configuration we have grown self-assembled MgxZn1−xO quantum dots (size ∼4–6 nm) in the overall range of compositions 0 ≤ x ≤ 1 on c-sapphire, Si (100) and quartz substrates. The composition of the quantum dots was determined by means of transmission electron microscopy-energy dispersive X-ray analysis (TEM-EDAX) and X-ray photoelectron spectroscopy. Selected area electron diffraction reveals the growth of single-phase hexagonal MgxZn1−xO quantum dots with composition 0 ≤ x ≤ 0.32 by using a nominal concentration of Mg in the range 0 to 45%. The onset of a nominal Mg concentration of about 50% forces the hexagonal lattice to undergo a phase transition to a cubic structure, which results in the growth of mixed hexagonal and cubic phases of MgxZn1−xO in the intermediate range of Mg concentrations 50 to 85% (0.39 ≤ x ≤ 0.77), whereas higher nominal concentrations of Mg ≥ 90% (0.81 ≤ x ≤ 1) lead to the growth of single-phase cubic MgxZn1−xO quantum dots. High resolution transmission electron microscopy and fast Fourier transform confirm the results and show clearly distinguishable hexagonal and cubic crystal structures of the respective quantum dots. A difference of 0.24 eV was detected by X-ray photoemission between the core levels (Zn 2p and Mg 1s) measured in quantum dots with hexagonal and cubic structures. The shift of these core levels can be explained by the different coordination of the cations in the hexagonal and cubic configurations. Finally, optical absorption measurements performed on single-phase hexagonal MgxZn1−xO QDs exhibited a clear shift in the optical energy gap on increasing the Mg concentration from 0 to 40%, which is explained as an effect of the substitution of Zn2+ by Mg2+ in the ZnO lattice.
Abstract:
Investigating cell dynamics during early zebrafish embryogenesis requires specific image acquisition and analysis strategies. Multiharmonic microscopy, i.e., second- and third-harmonic generation, allows imaging of cell divisions and cell membranes in unstained zebrafish embryos from the 1- to the 1000-cell stage. This paper presents the design and implementation of a dedicated image-processing pipeline (tracking and segmentation) for the reconstruction of cell dynamics during these developmental stages. This methodology allows the reconstruction of the cell lineage tree, including division timings, spatial coordinates, and cell shape, until the 1000-cell stage with minute temporal accuracy and micrometer spatial resolution. Data analysis of the digital embryos provides an extensive quantitative description of early zebrafish embryogenesis.
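The reconstructed lineage admits a very simple representation. The sketch below (field names and the example query are illustrative; this is not the paper's pipeline code) stores, for each cell, its division timing, spatial coordinates, and children, which is enough to support typical lineage-tree queries:

```python
# Illustrative cell-lineage tree with division timings and positions.
from dataclasses import dataclass, field

@dataclass
class Cell:
    name: str
    division_time_min: float          # division timing, in minutes
    position_um: tuple                # (x, y, z) coordinates, in micrometers
    children: list = field(default_factory=list)

def divisions_before(cell: Cell, t: float) -> int:
    """Count divisions in this subtree that occurred before time t."""
    n = int(cell.division_time_min < t)
    return n + sum(divisions_before(c, t) for c in cell.children)

root = Cell("c0", 15.0, (0.0, 0.0, 0.0), [
    Cell("c0.a", 32.0, (10.0, 0.0, 5.0)),
    Cell("c0.b", 33.5, (-10.0, 0.0, 5.0)),
])
print(divisions_before(root, 34.0))  # -> 3
```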
Abstract:
A methodology is presented to measure the fiber/matrix interface shear strength in composites. The strategy is based on performing a fiber push-in test on the central fiber of highly packed fiber clusters with hexagonal symmetry, which are often found in unidirectional composites with a high volume fraction of fibers. The mechanics of this test were analyzed in detail by means of three-dimensional finite element simulations. In particular, the influence of different parameters (interface shear strength, toughness and friction, as well as fiber longitudinal elastic modulus and curing stresses) on the critical load at the onset of debonding was established. From the results of the numerical simulations, a simple relationship between the critical load and the interface shear strength is proposed. The methodology was validated on a unidirectional C/epoxy composite, and the advantages and limitations of the proposed methodology are indicated.