960 results for "equivalent web thickness method"
Abstract:
This article proposes a systematic approach to determine the most suitable analogue redesign method to be used for forward-type converters under digital voltage mode control. The focus of the method is to achieve the highest phase margin at the particular switching and crossover frequencies chosen by the designer. It is shown that at high crossover frequencies with respect to switching frequency, controllers designed using backward integration have the largest phase margin; whereas at low crossover frequencies with respect to switching frequency, controllers designed using bilinear integration with pre-warping have the largest phase margins. An algorithm has been developed to determine the frequency of the crossing point where the recommended discretisation method changes. An accurate model of the power stage is used for simulation and experimental results from a Buck converter are collected. The performance of the digital controllers is compared to that of the equivalent analogue controller both in simulation and experiment. Excellent closeness between the simulation and experimental results is presented. This work provides a concrete example to allow academics and engineers to systematically choose a discretisation method.
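The tradeoff described above can be checked numerically. For the ideal integrator 1/s, the backward-difference substitution introduces a phase lead of ωT/2 relative to the ideal -90 degrees (which grows with the crossover-to-switching frequency ratio), while the bilinear (Tustin) map preserves -90 degrees exactly. A minimal sketch, with an assumed 100 kHz sampling frequency and 20 kHz evaluation frequency (illustrative values, not the article's):

```python
import cmath
import math

fs = 100e3          # assumed switching/sampling frequency, Hz
T = 1.0 / fs
fc = 20e3           # assumed crossover frequency at which phase is checked
w = 2 * math.pi * fc
z = cmath.exp(1j * w * T)

# Discrete equivalents of the ideal integrator 1/s
H_backward = T * z / (z - 1)                # backward (Euler) integration
H_tustin = (T / 2) * (z + 1) / (z - 1)      # bilinear (Tustin) integration

ph_back = math.degrees(cmath.phase(H_backward))
ph_tust = math.degrees(cmath.phase(H_tustin))

print("ideal integrator phase: -90.0 deg")
print(f"backward integration:   {ph_back:6.1f} deg")  # less lag than -90
print(f"bilinear integration:   {ph_tust:6.1f} deg")  # exactly -90
```

The backward-integration phase of -54 degrees at this frequency ratio (a 36 degree lead over the ideal -90) is why backward designs retain more phase margin at high crossover frequencies.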
Abstract:
This paper investigates the use of a particle filter for data assimilation with a full scale coupled ocean–atmosphere general circulation model. Synthetic twin experiments are performed to assess the performance of the equivalent weights filter in such a high-dimensional system. Artificial 2-dimensional sea surface temperature fields are used as observational data every day. Results are presented for different values of the free parameters in the method. Measures of the performance of the filter are root mean square errors, trajectories of individual variables in the model and rank histograms. Filter degeneracy is not observed and the performance of the filter is shown to depend on the ability to keep maximum spread in the ensemble.
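The equivalent-weights proposal itself is beyond a short sketch, but the machinery it builds on (likelihood weighting of an ensemble, the effective sample size as a degeneracy diagnostic, and resampling) can be illustrated on a toy scalar model; all model parameters below are illustrative assumptions, not the paper's configuration:

```python
import math
import random

random.seed(0)

def likelihood(y, x, sigma_obs=0.5):
    # Gaussian observation density p(y | x), up to a constant factor
    return math.exp(-0.5 * ((y - x) / sigma_obs) ** 2)

def step(particles, y):
    # Propagate each particle through a toy AR(1) model with noise,
    # weight by the observation likelihood, then resample.
    moved = [0.9 * x + random.gauss(0, 0.3) for x in particles]
    w = [likelihood(y, x) for x in moved]
    s = sum(w)
    w = [wi / s for wi in w]
    # Effective sample size: the standard filter-degeneracy diagnostic
    ess = 1.0 / sum(wi * wi for wi in w)
    # Multinomial resampling keeps the ensemble from collapsing
    return random.choices(moved, weights=w, k=len(moved)), ess

particles = [random.gauss(0, 1) for _ in range(500)]
truth = 0.0
for t in range(20):
    truth = 0.9 * truth + random.gauss(0, 0.3)
    obs = truth + random.gauss(0, 0.5)
    particles, ess = step(particles, obs)

err = abs(sum(particles) / len(particles) - truth)
print(f"final ESS = {ess:.0f} of {len(particles)}, mean error = {err:.2f}")
```

An ESS close to the ensemble size indicates the spread is being maintained; an ESS near 1 would signal the degeneracy the paper checks for.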
Abstract:
Projections of Arctic sea ice thickness (SIT) have the potential to inform stakeholders about accessibility to the region, but are currently rather uncertain. The latest suite of CMIP5 Global Climate Models (GCMs) produce a wide range of simulated SIT in the historical period (1979–2014) and exhibit various biases when compared with the Pan-Arctic Ice Ocean Modelling and Assimilation System (PIOMAS) sea ice reanalysis. We present a new method to constrain such GCM simulations of SIT via a statistical bias correction technique. The bias correction successfully constrains the spatial SIT distribution and temporal variability in the CMIP5 projections whilst retaining the climatic fluctuations from individual ensemble members. The bias correction acts to reduce the spread in projections of SIT and reveals the significant contributions of climate internal variability in the first half of the century and of scenario uncertainty from mid-century onwards. The projected date of ice-free conditions in the Arctic under the RCP8.5 high emission scenario occurs in the 2050s, which is a decade earlier than without the bias correction, with potentially significant implications for stakeholders in the Arctic such as the shipping industry. The bias correction methodology developed could be similarly applied to other variables to reduce spread in climate projections more generally.
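The abstract does not spell out the bias-correction technique; a common baseline for this kind of constraint is mean-and-variance scaling against the reference (reanalysis) climatology, which matches the reference statistics while preserving each member's own anomalies. A minimal sketch with made-up numbers (not PIOMAS or CMIP5 data):

```python
import math

def bias_correct(model, reference):
    # Rescale a model series so its mean and standard deviation match a
    # reference series over a common period, while keeping the model's
    # own variability (its anomalies) intact.
    def stats(xs):
        m = sum(xs) / len(xs)
        sd = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
        return m, sd
    m_mod, sd_mod = stats(model)
    m_ref, sd_ref = stats(reference)
    return [m_ref + (x - m_mod) * sd_ref / sd_mod for x in model]

# Toy example: modelled SIT (m) biased thick and too variable vs. reference
model = [2.8, 3.1, 2.5, 3.4, 2.9]
ref = [1.9, 2.1, 1.8, 2.2, 2.0]
corrected = bias_correct(model, ref)
print(corrected)
```

After correction the series has the reference mean and spread, but each year's anomaly sign still comes from the model member, which is the property the abstract emphasises.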
Abstract:
Accurate knowledge of ice-production rates within the marginal ice zones of the Arctic Ocean requires monitoring of the thin-ice distribution within polynyas. The thickness of the ice layer controls the heat loss and hence the new-ice formation. An established thin-ice algorithm using high-resolution MODIS data allows the ice-thickness distribution within polynyas to be derived. The average uncertainty is ±4.7 cm for ice thicknesses below 0.2 m. In this study, the ice-thickness distributions within the Laptev Sea polynya are calculated for the two winter seasons 2007/08 and 2008/09. A new method is then applied to derive a daily MODIS thin-ice product.
Abstract:
Estimates of effective elastic thickness (Te) for the western portion of the South American Plate obtained independently from forward flexural modelling and from coherence analysis suggest different thermomechanical properties for the same continental lithosphere. We present a review of these Te estimates and carry out a critical reappraisal using a common methodology: a 3-D finite element method that solves the differential equation for the bending of a thin elastic plate. The finite element flexural model incorporates lateral variations of Te and uses the Andes topography as the load. Three Te maps for the entire Andes were analysed: Stewart & Watts (1997), Tassara et al. (2007) and Perez-Gussinye et al. (2007). The flexural deformation predicted for each Te map was compared with the depth to the base of the foreland basin sequence. Likewise, the gravity effect of the flexurally induced crust-mantle deformation was compared with the observed Bouguer gravity. The Te estimates from forward flexural modelling by Stewart & Watts (1997) better predict the geological and gravity data for most of the Andean system, particularly in the Central Andes, where Te ranges from greater than 70 km in the sub-Andes to less than 15 km under the Andes Cordillera. The misfit between the calculated and observed foreland basin subsidence and gravity anomaly for the Maranon basin in Peru and the Bermejo basin in Argentina, regardless of the assumed Te map, may be due to a dynamic topography component associated with the shallow subduction of the Nazca Plate beneath the Andes at these latitudes.
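For reference, the governing relation that such finite element models discretise is, in the simplest case of uniform rigidity, the standard thin elastic plate flexure equation (the laterally varying case used in the study adds derivative terms in D and must be solved numerically):

```latex
D\,\nabla^{4} w + \left(\rho_m - \rho_{\mathrm{fill}}\right) g\, w = q(x,y),
\qquad
D = \frac{E\,T_e^{3}}{12\left(1-\nu^{2}\right)}
```

where w is the plate deflection, q the topographic load, \rho_m and \rho_{\mathrm{fill}} the mantle and basin-fill densities, E Young's modulus and \nu Poisson's ratio. Te enters only through the flexural rigidity D, whose cubic dependence on Te explains the strong sensitivity of the predicted basin geometry to the assumed Te map.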
Abstract:
A method for linearly constrained optimization that modifies and generalizes recent box-constrained optimization algorithms is introduced. The new algorithm is based on a relaxed form of Spectral Projected Gradient iterations. Intercalated with these projected steps, internal iterations restricted to faces of the polytope are performed, which enhance the efficiency of the algorithm. Convergence proofs are given, and numerical experiments are included and discussed. Software supporting this paper is available through the Tango Project web page: http://www.ime.usp.br/~egbirgin/tango/.
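The Spectral Projected Gradient step at the core of such methods can be sketched on a simple box-constrained quadratic. This strips out the nonmonotone line search and the face-restricted internal iterations the paper adds, so it illustrates the projected spectral (Barzilai-Borwein) step only:

```python
def project(x, lo, hi):
    # Projection onto the box [lo, hi]^n
    return [min(max(xi, lo), hi) for xi in x]

def spg(grad, x, lo, hi, iters=200):
    # Bare-bones Spectral Projected Gradient: project the gradient step
    # onto the feasible box, then update the spectral step length from
    # the last displacement s and gradient change y.
    alpha = 1.0
    g = grad(x)
    for _ in range(iters):
        x_new = project([xi - alpha * gi for xi, gi in zip(x, g)], lo, hi)
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]
        y = [a - b for a, b in zip(g_new, g)]
        sy = sum(si * yi for si, yi in zip(s, y))
        ss = sum(si * si for si in s)
        alpha = ss / sy if sy > 1e-12 else 1.0   # BB spectral step length
        x, g = x_new, g_new
    return x

# Minimise f(x) = (x0 - 2)^2 + (x1 + 1)^2 subject to 0 <= x <= 1;
# the unconstrained minimiser (2, -1) projects to the corner (1, 0).
grad = lambda x: [2 * (x[0] - 2), 2 * (x[1] + 1)]
sol = spg(grad, [0.5, 0.5], 0.0, 1.0)
print(sol)   # near [1.0, 0.0]
```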
Abstract:
The synthesis of Y0.9Er0.1Al3(BO3)4 crystalline powders and vitreous thin films was studied. Precursor solutions were obtained using a modified polymeric precursor method with D-sorbitol as the complexing agent. The chemical reactions involved are described. The Y0.9Er0.1Al3(BO3)4 composition presents good thermal stability with regard to crystallization. The crystallized Y0.9Er0.1Al3(BO3)4 phase can be obtained at 1,150 degrees C, in agreement with other authors. Crack- and porosity-free films were obtained with very small grain size and low RMS roughness. The film thickness proved to be linearly dependent on the precursor solution viscosity, with a value of 25 mPa s suitable for preparing high-quality amorphous multilayers (up to ~800 nm) at 740 degrees C for 2 h on silica substrates by spin coating with gyrset technology.
Abstract:
The fabrication of controlled molecular architectures is essential for organic devices, as in the emission of polarized light for the information industry. In this study, we show that optimized conditions can be established to obtain layer-by-layer (LbL) films of poly(p-phenylene vinylene) (PPV) + dodecylbenzenesulfonate (DBS) with anisotropic properties. Films with five layers, converted at 110 degrees C, had a dichroic ratio delta = 2.3 and order parameter r = 34%, as indicated by optical spectroscopy and emission ellipsometry data. This anisotropy decreased with the number of layers deposited, with delta = 1.0 for a 75-layer LbL PPV + DBS film. Atomic force microscopy analysis showed the formation of polymer clusters in a random growth process, with the normalized height distribution represented by a Gaussian function. In spite of this randomness in film growth, the self-covariance function pointed to a correlation between clusters, especially for thick films. In summary, the LbL method may be exploited to obtain both anisotropic films with polarized emission and regular, nanostructured surfaces. (c) 2010 Wiley Periodicals, Inc. J Polym Sci Part B: Polym Phys 49: 206-213, 2011
Abstract:
Bismuth germanate films were prepared by dip coating and spin coating techniques and the dependence of the luminescent properties of the samples on the resin viscosity and deposition technique was investigated. The resin used for the preparation of the films was obtained via the Pechini method, employing the precursors Bi2O3 and GeO2. Citric acid and ethylene glycol were used as chelating and cross-linking agents, respectively. Results from X-ray diffraction and Raman spectroscopy indicated that the films sintered at 700 degrees C for 10 h presented the single crystalline phase Bi4Ge3O12. SEM images of the films showed that homogeneous flat films can be produced by the two techniques investigated. All the samples presented the typical Bi4Ge3O12 emission band centred at 505 nm. Films with 3.1 µm average thickness presented 80% of the luminescence intensity registered for the single crystal at the maximum wavelength. Published by Elsevier B.V.
Abstract:
Optimization methods that employ the classical Powell-Hestenes-Rockafellar augmented Lagrangian are useful tools for solving nonlinear programming problems. Their reputation decreased in the last 10 years due to the comparative success of interior-point Newtonian algorithms, which are asymptotically faster. In this research, a combination of both approaches is evaluated. The idea is to produce a competitive method that is more robust and efficient than its 'pure' counterparts on critical problems. Moreover, an additional hybrid algorithm is defined, in which the interior-point method is replaced by the Newtonian resolution of a Karush-Kuhn-Tucker (KKT) system identified by the augmented Lagrangian algorithm. The software used in this work is freely available through the Tango Project web page: http://www.ime.usp.br/~egbirgin/tango/.
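The classical PHR augmented Lagrangian loop referred to above can be sketched on a tiny equality-constrained quadratic; plain gradient descent stands in for the inner subproblem solver, and all parameter choices are illustrative:

```python
def augmented_lagrangian():
    # Powell-Hestenes-Rockafellar augmented Lagrangian sketch for
    #   min x0^2 + x1^2   subject to   h(x) = x0 + x1 - 1 = 0
    # (exact solution x = (0.5, 0.5), multiplier lam = -1).
    x = [0.0, 0.0]
    lam, rho = 0.0, 10.0
    for _ in range(30):                     # outer multiplier iterations
        lr = 0.5 / (1.0 + rho)             # step safe for this quadratic
        for _ in range(50):                # inner minimisation of L(x; lam, rho)
            h = x[0] + x[1] - 1.0
            g = [2 * xi + lam + rho * h for xi in x]
            x = [xi - lr * gi for xi, gi in zip(x, g)]
        lam += rho * (x[0] + x[1] - 1.0)   # first-order multiplier update
        rho *= 1.5                         # gradually tighten the penalty
    return x, lam

x, lam = augmented_lagrangian()
print(x, lam)   # near [0.5, 0.5] and -1.0
```

The hybrid algorithms in the paper replace this crude inner loop with interior-point or Newtonian KKT solvers; the outer multiplier update shown here is the part they share.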
Abstract:
The aim of this study was to develop a fast capillary electrophoresis method for the determination of propranolol in pharmaceutical preparations. During method development, the pH and constituents of the background electrolyte were selected using effective mobility versus pH curves. Benzylamine was used as the internal standard. The background electrolyte was composed of 60 mmol L⁻¹ tris(hydroxymethyl)aminomethane and 30 mmol L⁻¹ 2-hydroxyisobutyric acid, at pH 8.1. Separation was conducted in a fused-silica capillary (32 cm total length, 8.5 cm effective length, 50 µm I.D.) with a short-end injection configuration and direct UV detection at 214 nm. The run time was only 14 s. Three strategies were studied in order to develop a fast CE method with low total analysis time for propranolol analysis: low flush time (Lflush), 35 runs/h; without flush (Wflush), 52 runs/h; and inverted (switched) polarity, 45 runs/h. Since the three strategies developed are statistically equivalent, Wflush was selected due to its higher analytical frequency compared with the other methods. Figures of merit of the proposed method include good linearity (R² > 0.9999), a limit of detection of 0.5 mg L⁻¹, inter-day precision better than 1.03% (n = 9) and recovery in the range 95.1-104.5%. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The purpose of this presentation is to introduce the progress of the research project "The mapping of pedagogical methods in web-based language teaching" at Högskolan Dalarna (Dalarna University). This project will identify the differences in pedagogical methods used for online language classes. The pedagogical method, as defined in this project, is what teachers do to ensure students attain the learning outcomes, for example, planning, designing courses, leading students, knowing students' abilities, implementing activities, etc. So far the members of this project have analyzed the course plans of the language department at Dalarna University and categorized the learning outcomes. A questionnaire was constructed based on the learning outcomes and then either sent out remotely to teachers or completed face to face through interviews. The answers to the questionnaires enabled the project to identify many differences in how language teachers interact with their students, as well as in the way of giving feedback, motivating and helping students, and in the types of class activities and materials used. This presentation introduces the progress of the project and identifies the challenges at the language department at Dalarna University. Finally, the advantages and problems of online language proficiency courses will be discussed and suggestions made for future improvement.
Abstract:
As enterprises constantly grow and the need to share information across departments and business areas becomes more critical, companies are turning to integration as a method for interconnecting heterogeneous, distributed and autonomous systems. Whether the sales application needs to interface with the inventory application, or the procurement application must connect to an auction site, it seems that any application can be made better by integrating it with other applications. Integration between applications can be troublesome because applications may not have been designed and implemented with integration in mind. With regard to integration issues, two-tier software systems, composed of a database tier and a "front-end" tier (interface), have shown some limitations. As a solution to overcome the two-tier limitations, three-tier systems were proposed in the literature. By adding a middle tier (referred to as middleware) between the database tier and the "front-end" tier (or simply the application), three main benefits emerge. The first benefit is that the division of software systems into three tiers enables increased integration capabilities with other systems. The second is that modifications to individual tiers may be carried out without necessarily affecting the other tiers and integrated systems. The third, a consequence of the others, is reduced maintenance both in the software system itself and in all integrated systems. Concerning software development in three tiers, this dissertation focuses on two emerging technologies, Semantic Web and Service Oriented Architecture, combined with middleware.
These two technologies, blended with middleware, resulted in the development of the Swoat framework (Service and Semantic Web Oriented ArchiTecture) and lead to four synergic advantages: (1) the creation of loosely coupled systems, decoupling the database from the "front-end" tiers and therefore reducing maintenance; (2) the database schema is transparent to the "front-end" tiers, which are only aware of the information model (or domain model) that describes what data is accessible; (3) integration with other heterogeneous systems is enabled through services provided by the middleware; (4) service requests by the "front-end" tier focus on 'what' data is needed rather than on 'where' and 'how' it is obtained, reducing application development time.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This master's dissertation investigated the performance and quality of web sites. The goal of the research was to propose an integrated model for evaluating digital information services on educational web sites. The universe of the research comprised eighteen Brazilian universities offering graduate programmes at master's and doctoral level in the area of Production Engineering. The methodology adopted was descriptive and exploratory research, using systematic observation and focus groups for data collection, with dependent and independent variables, through the application of two research instruments. An analysis protocol was the instrument adopted for the evaluation and for obtaining the qualitative results, and an analysis grid was applied for the evaluation and for obtaining the quantitative results. The qualitative results identified a lack of standardisation across the web sites with respect to content, information hierarchy, and the design of colours and fonts. The absence of accessibility for users with auditory and visual special needs was observed, as well as a lack of convergence of media and assistive technologies. The language of the sites was also evaluated, and all of them are presented in Portuguese only. The overall result is shown in graphs and tables with a ranking of the universities, with the rating "Good" predominating. For the quantitative results, statistical analysis was used to obtain descriptive and inferential results relating the dependent and independent variables. For each category of analysis of the evaluated sites' services, a score and a weighted general index were obtained. These results served as the basis for ranking the universities according to the presence or absence of service information on their web sites.
In the inferential analysis, the independent variables (level, CAPES concept and period of existence of the programme) were tested for correlation or association with the characteristics, here called service categories. The statistical methods used for this analysis were Spearman's coefficient and Fisher's exact test. Only the category "disciplines of the master's programme" showed a significant association with the independent variable CAPES concept. The main conclusion of this study was the absence of standardisation with respect to the subjective aspects (design, information hierarchy, navigability and content accuracy) and the lack of accessibility and convergence. As for the quantitative aspects, the information services offered by the web sites of the evaluated universities do not yet present a satisfactory, comprehensive quality. An absence of strategies, web tools, institutional marketing techniques and services that would make the sites more interactive, navigable and value-adding was also noted.
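The two inferential tools named in the abstract, Spearman's rank correlation coefficient and Fisher's exact test, can be sketched in pure Python; the ranking helper assumes tie-free data, and the example values are illustrative, not the dissertation's:

```python
from math import comb

def spearman(xs, ys):
    # Spearman rank correlation for tie-free data, via the classic
    # shortcut rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)).
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

def fisher_exact_two_sided(a, b, c, d):
    # Two-sided Fisher exact test on the 2x2 table [[a, b], [c, d]]:
    # sum the hypergeometric probabilities of every table with the same
    # margins that is no more probable than the observed one.
    n = a + b + c + d
    r1, c1 = a + b, a + c
    def prob(k):
        return comb(r1, k) * comb(n - r1, c1 - k) / comb(n, c1)
    p_obs = prob(a)
    lo, hi = max(0, c1 - (n - r1)), min(r1, c1)
    return sum(prob(k) for k in range(lo, hi + 1)
               if prob(k) <= p_obs * (1 + 1e-9))

# Example: a perfect inverse monotone association
print(spearman([5, 4, 3, 2, 1], [1, 2, 3, 4, 5]))   # -1.0
```

Fisher's exact test suits the small counts involved here (eighteen universities split by presence or absence of a service), where a chi-squared approximation would be unreliable.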