970 results for Modeling Techniques


Relevance:

30.00%

Publisher:

Abstract:

Spatio-temporal modelling is an area of increasing importance in which models and methods have often been developed to deal with specific applications. In this study, a spatio-temporal model was used to estimate daily rainfall data. Rainfall records from several weather stations, obtained from the Agritempo system for two climatically homogeneous zones, were used. Rainfall values obtained for two fixed dates (January 1 and May 1, 2012) using the spatio-temporal model were compared with the geostatistical techniques of ordinary kriging and ordinary cokriging with altitude as an auxiliary variable. The spatio-temporal model produced estimates of daily precipitation that were more than 17% better than kriging and cokriging in the first zone and more than 18% better in the second zone. The spatio-temporal model proved to be a versatile technique, adapting to different seasons and dates.
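
As a rough illustration of the ordinary kriging baseline used in the comparison (not the authors' implementation), the minimal sketch below interpolates a daily rainfall value at an unsampled location from surrounding station records, assuming an exponential variogram with made-up sill and range parameters.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=50.0, nugget=0.0):
    """Exponential variogram model; sill/range/nugget values are illustrative."""
    return nugget + sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(coords, values, target):
    """Estimate a value at `target` from station `coords`/`values` by ordinary kriging."""
    n = len(values)
    # Pairwise distances between stations and distances to the target point
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    d0 = np.linalg.norm(coords - target, axis=-1)
    # Ordinary kriging system: [Gamma 1; 1' 0] [w; mu] = [gamma0; 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d)
    A[-1, -1] = 0.0
    b = np.append(exp_variogram(d0), 1.0)
    w = np.linalg.solve(A, b)[:n]
    return w @ values

# Toy example: five stations (x, y in km) with rainfall (mm) on a given day
stations = np.array([[0, 0], [10, 5], [20, 0], [5, 15], [15, 12]], dtype=float)
rain = np.array([12.0, 9.5, 7.0, 14.2, 10.1])
print(ordinary_kriging(stations, rain, np.array([12.0, 8.0])))
```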

Relevance:

30.00%

Publisher:

Abstract:

Early water resources modeling efforts were aimed mostly at representing hydrologic processes, but the need for interdisciplinary studies has led to increasing complexity and integration of environmental, social, and economic functions. The gradual shift from merely employing engineering-based simulation models to applying more holistic frameworks is an indicator of promising changes in the traditional paradigm for the application of water resources models, supporting more sustainable management decisions. This dissertation contributes to the application of a quantitative-qualitative framework for sustainable water resources management using system dynamics simulation, as well as environmental systems analysis techniques, to provide insights for water quality management in the Great Lakes basin. The traditional linear thinking paradigm lacks the mental and organizational framework for sustainable development trajectories, and may lead to quick-fix solutions that fail to address key drivers of water resources problems. To facilitate holistic analysis of water resources systems, systems thinking seeks to understand interactions among the subsystems. System dynamics provides a suitable framework for operationalizing systems thinking and its application to water resources problems by offering useful qualitative tools such as causal loop diagrams (CLD), stock-and-flow diagrams (SFD), and system archetypes. The approach provides a high-level quantitative-qualitative modeling framework for "big-picture" understanding of water resources systems, stakeholder participation, policy analysis, and strategic decision making. While quantitative modeling using extensive computer simulations and optimization is still very important and needed for policy screening, qualitative system dynamics models can improve understanding of general trends and the root causes of problems, and thus promote sustainable water resources decision making. Within the system dynamics framework, a growth and underinvestment (G&U) system archetype governing Lake Allegan's eutrophication problem was hypothesized to explain the system's problematic behavior and identify policy leverage points for mitigation. A system dynamics simulation model was developed to characterize the lake's recovery from its hypereutrophic state and assess a number of proposed total maximum daily load (TMDL) reduction policies, including phosphorus load reductions from point sources (PS) and non-point sources (NPS). It was shown that, for a TMDL plan to be effective, it should be considered a component of a continuous sustainability process, one that accounts for the dynamic feedback relationships between socio-economic growth, land use change, and environmental conditions. Furthermore, a high-level simulation-optimization framework was developed to guide watershed-scale best management practice (BMP) implementation in the Kalamazoo watershed. Agricultural BMPs should be given priority in the watershed in order to facilitate cost-efficient attainment of Lake Allegan's total phosphorus (TP) concentration target. However, without adequate support policies, agricultural BMP implementation may adversely affect agricultural producers. Results from a case study of the Maumee River basin show that coordinated BMP implementation across upstream and downstream watersheds can significantly improve the cost efficiency of TP load abatement.
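
As a schematic illustration of the stock-and-flow view used in system dynamics (not the dissertation's Lake Allegan model), the sketch below integrates a single in-lake phosphorus stock with an external load inflow and a first-order loss outflow; every parameter value is invented for the example.

```python
import numpy as np

# Minimal stock-and-flow simulation (Euler integration); illustrative parameters only
dt = 0.1                  # years per step
t = np.arange(0, 30, dt)  # 30-year horizon
load = 50.0               # external phosphorus load (t/yr), e.g. PS + NPS combined
k_out = 0.4               # first-order loss rate (1/yr): outflow plus settling

P = np.zeros_like(t)      # phosphorus stock in the lake (t)
P[0] = 200.0
for i in range(1, len(t)):
    inflow = load                  # flow into the stock
    outflow = k_out * P[i - 1]     # flow out of the stock
    P[i] = P[i - 1] + dt * (inflow - outflow)

print(f"steady-state estimate: {load / k_out:.1f} t, final stock: {P[-1]:.1f} t")
```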

Relevance:

30.00%

Publisher:

Abstract:

For the past three decades, the automotive industry has faced two main conflicting challenges: improving fuel economy and meeting emissions standards. This has driven engineers and researchers around the world to develop engines and powertrains that can meet these two daunting challenges. Focusing on internal combustion engines, there are very few options to enhance their performance beyond the current standards without increasing the price considerably. Homogeneous Charge Compression Ignition (HCCI) is one of the combustion techniques with the potential to partially meet the current critical challenges, including CAFE standards and stringent EPA emissions standards. HCCI engines run on much leaner mixtures than current SI engines, resulting in very low combustion temperatures and ultra-low NOx emissions. When controlled accurately, these engines also produce ultra-low soot. On the other hand, HCCI engines suffer from high unburnt hydrocarbon and carbon monoxide emissions. The technology also faces an acute combustion control problem which, if not dealt with properly, yields highly unfavorable operating conditions and exhaust emissions. This thesis contains two main parts. One part deals with developing an HCCI experimental setup, and the other focuses on developing a grey-box modelling technique to control HCCI exhaust gas emissions. The experimental part gives complete details of the modifications made to the stock engine to run it in HCCI mode. It also gives the details and specifications of all the sensors, actuators and other auxiliary parts attached to the conventional SI engine in order to run and monitor the engine in SI mode and to support future SI-HCCI mode-switching studies. In the second part, around 600 data points from two different HCCI setups on two different engines are studied, and a grey-box model for emission prediction is developed. The grey-box model is trained on 75% of the data, and the remaining data are used for validation. On average, the grey-box model is 70% more accurate at predicting engine performance than an empirical (black-box) model. The grey-box model thus addresses the difficulty of real-time control of an HCCI engine; this is the first control-oriented model in the literature for predicting HCCI engine emissions for control purposes.
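
The grey-box idea (a simplified physical model corrected by a data-driven term) can be sketched as follows; the physical correlation, feature set, and synthetic data below are illustrative assumptions, not the model developed in the thesis, though the 75%/25% train/validation split mirrors the one described.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for measured HCCI operating points (illustrative only)
n = 600
X = np.column_stack([
    rng.uniform(800, 2000, n),   # engine speed (rpm)
    rng.uniform(0.8, 1.2, n),    # intake pressure (bar)
    rng.uniform(2.0, 4.0, n),    # air-fuel equivalence ratio (lambda)
])

def physical_submodel(X):
    """White-box part: crude physics-inspired estimate of HC emissions (assumed form)."""
    speed, p_in, lam = X.T
    return 1200.0 / (p_in * speed**0.3) + 150.0 * (lam - 1.0)

# "Measured" emissions = physical trend + unmodeled effects + noise
y = physical_submodel(X) * rng.normal(1.0, 0.1, n) + 30.0 * np.sin(X[:, 2])

# Grey box = physical estimate + learned correction of its residual
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.75, random_state=0)
resid_model = LinearRegression().fit(X_tr, y_tr - physical_submodel(X_tr))
y_pred = physical_submodel(X_te) + resid_model.predict(X_te)
print("grey-box RMSE on held-out data:", np.sqrt(np.mean((y_pred - y_te) ** 2)))
```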

Relevance:

30.00%

Publisher:

Abstract:

The thermoset epoxy resin EPON 862, coupled with the DETDA hardening agent, is utilized as the polymer matrix component in many graphite (carbon fiber) composites. Because it is difficult to experimentally characterize the interfacial region, computational molecular modeling is a necessary tool for understanding the influence of the interfacial molecular structure on bulk-level material properties. The purpose of this research is to investigate the many possible variables that may influence the interfacial structure and the effect they will have on the mechanical behavior of the bulk-level composite. Molecular models are established for EPON 862-DETDA polymer in the presence of a graphite surface. Material characteristics such as polymer mass density, residual stresses, and molecular potential energy are investigated near the polymer/fiber interface. Because the exact degree of crosslinking in these thermoset systems is not known, many different crosslink densities (degrees of curing) are investigated. It is determined that a region exists near the carbon fiber surface in which the polymer mass density is different from the bulk mass density. These surface effects extend ~10 Å into the polymer from the center of the outermost graphite layer. Early simulations predict polymer residual stress levels to be higher near the graphite surface. It is also seen that the molecular potential energy in polymer atoms decreases with increasing crosslink density. New models are then established in order to investigate the interface between EPON 862-DETDA polymer and graphene nanoplatelets (GNPs) of various atomic thicknesses. Mechanical properties are extracted from the models using molecular dynamics techniques. These properties are then implemented into micromechanics software that utilizes the generalized method of cells to create representations of macro-scale composites. Micromechanics models are created representing GNP-doped epoxy with varying numbers of graphene layers and interfacial polymer crosslink densities. The initial micromechanics results for the GNP-doped epoxy are then taken to represent the matrix component and are re-run through the micromechanics software with the addition of a carbon fiber to simulate a GNP-doped epoxy/carbon fiber composite. Micromechanics results agree well with experimental data, and indicate GNPs of 1 to 2 atomic layers to be highly favorable. Lastly, the effect of oxygen bonded to the surface of the GNPs is investigated. Molecular models are created for systems with varying graphene atomic thickness, along with different amounts of oxygen species attached to them. Models are created for graphene containing hydroxyl groups only, epoxide groups only, and a combination of epoxide and hydroxyl groups. Results show that oxidized graphene models decrease in both tensile and shear modulus. Attaching only epoxide groups gives the best results for mechanical properties, though pristine graphene is still favored.
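
A minimal post-processing sketch of the kind of interfacial analysis described above, computing a polymer mass-density profile as a function of distance from the outermost graphite layer, is shown below; the coordinates, bin width, and box dimensions are placeholders, not data from the actual MD models.

```python
import numpy as np

def density_profile(z_coords, masses, box_area, z_surface, bin_width=1.0, z_max=30.0):
    """Mass-density profile (g/cm^3) vs. distance from a graphite surface at z_surface.

    z_coords and z_surface in angstrom, masses in amu, box_area in angstrom^2.
    """
    dist = z_coords - z_surface
    edges = np.arange(0.0, z_max + bin_width, bin_width)
    hist, _ = np.histogram(dist, bins=edges, weights=masses)
    bin_volume = box_area * bin_width        # angstrom^3 per slab
    amu_per_A3_to_g_per_cm3 = 1.66054        # 1 amu/A^3 = 1.66054 g/cm^3
    return edges[:-1] + bin_width / 2, hist / bin_volume * amu_per_A3_to_g_per_cm3

# Toy example: randomly placed polymer atoms above a surface at z = 0
rng = np.random.default_rng(1)
z = rng.uniform(0.0, 30.0, 5000)
m = np.full(5000, 12.0)                      # treat every atom as carbon for simplicity
centers, rho = density_profile(z, m, box_area=40.0 * 40.0, z_surface=0.0)
print(rho[:5])
```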

Relevance:

30.00%

Publisher:

Abstract:

The study of volcano deformation data can provide information on magma processes and help assess the potential for future eruptions. In employing inverse deformation modeling on these data, we attempt to characterize the geometry, location and volume/pressure change of a deformation source. Techniques currently used to model sheet intrusions (e.g., dikes and sills) often require significant a priori assumptions about source geometry and can require testing a large number of parameters. Moreover, surface deformations are a non-linear function of the source geometry and location. This requires the use of Monte Carlo inversion techniques, which lead to long computation times. Recently, 'displacement tomography' models have been used to characterize magma reservoirs by inverting source deformation data for volume changes using a grid of point sources in the subsurface. The computations involved in these models are less intensive, as no assumptions are made on the source geometry and location, and the relationship between the point sources and the surface deformation is linear. In this project, seeking a less computationally intensive technique for fracture sources, we tested whether this displacement tomography method for reservoirs could be used for sheet intrusions. We began by simulating the opening of three synthetic dikes of known geometry and location using an established deformation model for fracture sources. We then sought to reproduce the displacements and volume changes undergone by the fractures using the sources employed in the tomography methodology. Results of this validation indicate that the volumetric point sources are not appropriate for locating fracture sources; however, they may provide useful qualitative information on volume changes occurring in the surrounding rock, and therefore indirectly indicate the source location.
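
A minimal sketch of the linear "displacement tomography" idea, inverting surface vertical displacements for volume changes of a grid of point (Mogi-type) sources, is given below; the geometry and values are synthetic, and the actual study uses a full subsurface grid and all displacement components rather than this one-line example.

```python
import numpy as np

NU = 0.25  # Poisson's ratio (assumed)

def mogi_uz(x_obs, x_src, depth, dV=1.0):
    """Vertical surface displacement of a point volume source (Mogi model)."""
    r2 = np.sum((x_obs - x_src) ** 2, axis=-1)
    return (1.0 - NU) / np.pi * dV * depth / (r2 + depth**2) ** 1.5

# Surface observation points and a coarse line of candidate point sources at fixed depth
obs = np.array([[x, y] for x in np.linspace(-4000, 4000, 9)
                        for y in np.linspace(-4000, 4000, 9)], dtype=float)
srcs = np.array([[x, 0.0] for x in np.linspace(-3000, 3000, 7)], dtype=float)
depth = 2000.0

# Synthetic "observed" data from one true source, then linear least-squares inversion
true_dV = np.zeros(len(srcs))
true_dV[3] = 1e6                                               # m^3
G = np.stack([mogi_uz(obs, s, depth) for s in srcs], axis=1)   # linear design matrix
d = G @ true_dV
dV_est, *_ = np.linalg.lstsq(G, d, rcond=None)
print(np.round(dV_est, 1))
```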

Relevance:

30.00%

Publisher:

Abstract:

Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect in the design of practical systems to ensure the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. As the first phase of the study, a proper model, including all the details of the components, was required. Therefore, advances in EMC modeling were reviewed, classifying analytical and numerical models. The selected approach was finite element (FE) modeling, coupled with the distributed network method, to generate the model of the converter's components and obtain the frequency-domain behavioral model of the converter. The method has the ability to reveal the behavior of parasitic elements and higher resonances, which have critical impacts in studying EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was studied. Considering the details of the multi-machine environment, including actual models, some innovations in equivalent source modeling were introduced to decrease the simulation time dramatically. Several models were designed in this study; the voltage-current cube model and the wire model gave the best results. A GA-based PSO method was used as the optimization process. Superposition and suppression of the fields in coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases. The identification was implemented using an artificial neural network (ANN) for seventy different faulty cases. The simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of components, as well as the faulty components, by comparing the amplitudes of their stray field harmonics. Identification using stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
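
The optimization step mentioned above (a GA-based PSO) can be sketched in its plain particle swarm form as follows; the objective function here is a stand-in for the actual equivalent-source fitting problem, and the GA hybridization described in the work is omitted.

```python
import numpy as np

def fit_error(params):
    """Placeholder objective: mismatch between equivalent-source and reference fields."""
    return np.sum((params - np.array([1.5, -2.0, 0.5])) ** 2)

def pso(obj, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bound=5.0):
    """Basic particle swarm optimization over a box [-bound, bound]^dim."""
    rng = np.random.default_rng(0)
    x = rng.uniform(-bound, bound, (n_particles, dim))   # positions
    v = np.zeros_like(x)                                  # velocities
    pbest, pbest_f = x.copy(), np.array([obj(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, -bound, bound)
        f = np.array([obj(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest, pbest_f.min()

print(pso(fit_error, dim=3))
```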

Relevance:

30.00%

Publisher:

Abstract:

Every space launch increases the overall amount of space debris, yet satellites have limited awareness of nearby objects that might pose a collision hazard. Astrometric, radiometric, and thermal models for the study of space debris in low-Earth orbit have been developed. The modeling approach proposes analysis methods that provide increased Local Area Awareness for satellites in low-Earth and geostationary orbit. Local Area Awareness is defined as the ability to detect, characterize, and extract useful information regarding resident space objects as they move through the space environment surrounding a spacecraft. The study of space debris is of critical importance to all space-faring nations. Characterization efforts are proposed using long-wave infrared sensors for space-based observations of debris objects in low-Earth orbit. Long-wave infrared sensors are commercially available and do not require solar illumination of the target, since the received signal is temperature dependent. Characterizing debris objects by passive imaging techniques allows further studies into the origin, specifications, and future trajectory of debris objects. Conclusions are made regarding the aforementioned thermal analysis as a function of debris orbit, geometry, orientation with respect to time, and material properties. Development of a thermal model permits the characterization of debris objects based upon their received long-wave infrared signals; information regarding the material type, size, and tumble rate of the observed debris objects is extracted. This investigation proposes the use of long-wave infrared radiometric models of typical debris to develop techniques for the detection and characterization of debris objects via signal analysis of unresolved imagery. Knowledge regarding the orbital type and semi-major axis of the observed debris object is extracted via astrometric analysis. This knowledge may aid in constraining the admissible region for the initial orbit determination process. The resultant orbital information is then fused with the radiometric characterization analysis, enabling further characterization of the observed debris object. This fused analysis, yielding orbital, material, and thermal properties, significantly increases a satellite's Local Area Awareness via an intimate understanding of the debris environment surrounding the spacecraft.
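
The temperature dependence of the received long-wave infrared signal can be illustrated with a graybody radiance integral over a typical 8-14 µm band, as in the sketch below; the emissivity, band edges, and temperatures are assumptions for the example and do not reproduce the thesis' radiometric model.

```python
import numpy as np

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def band_radiance(T, emissivity=0.9, lam_min=8e-6, lam_max=14e-6, n=2000):
    """Graybody radiance (W m^-2 sr^-1) integrated over an LWIR band."""
    lam = np.linspace(lam_min, lam_max, n)
    # Planck spectral radiance B_lambda = 2hc^2 / lambda^5 / (exp(hc/(lambda kT)) - 1)
    spectral = 2 * H * C**2 / lam**5 / (np.exp(H * C / (lam * KB * T)) - 1.0)
    return emissivity * np.sum(spectral) * (lam[1] - lam[0])  # rectangle-rule integral

# Band radiance of a debris object at a few plausible on-orbit temperatures
for T in (200.0, 250.0, 300.0):
    print(f"T = {T:5.1f} K -> L = {band_radiance(T):8.2f} W m^-2 sr^-1")
```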

Relevance:

30.00%

Publisher:

Abstract:

The field of bioelectronics involves the use of electrodes to exchange electrical signals with biological systems for diagnostic and therapeutic purposes in biomedical devices and healthcare applications. However, the mechanical compatibility of implantable devices with the human body has been a challenge, particularly with long-term implantation into target organs. Current rigid bioelectronics can trigger inflammatory responses and cause unstable device function due to the mechanical mismatch with the surrounding soft tissue. Recent advances in flexible and stretchable electronics have shown promise in making bioelectronic interfaces more biocompatible. To fully achieve this goal, materials science and engineering of soft electronic devices must be combined with quantitative characterization and modeling tools to understand the mechanical issues at the interface between electronic technology and biological tissue. Local mechanical characterization is crucial for understanding the activation of failure mechanisms and optimizing the devices. Experimental techniques for testing mechanical properties at the nanoscale are emerging, and the Atomic Force Microscope (AFM) is a good candidate for in situ local mechanical characterization of soft bioelectronic interfaces. In this work, in situ experimental techniques based solely on AFM, supported by interpretive models, are reported for the characterization of planar and three-dimensional devices suitable for in vivo and in vitro biomedical experimentation. The combination of the proposed models and experimental techniques provides access to the local mechanical properties of soft bioelectronic interfaces. The study investigates the nanomechanics of hard thin gold films on soft polymeric substrates (poly(dimethylsiloxane), PDMS) and 3D inkjet-printed micropillars under different deformation states. The proposed characterization methods provide a rapid and precise determination of mechanical properties, making it possible to parametrize the microfabrication steps and investigate their impact on the final device.
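
One common way to turn an AFM force-indentation curve into a local elastic modulus is a Hertz-type contact fit, sketched below; the spherical-tip model, tip radius, and synthetic data are assumptions for illustration rather than the interpretive models developed in this work.

```python
import numpy as np
from scipy.optimize import curve_fit

R_TIP = 20e-9   # assumed spherical tip radius (m)
NU = 0.5        # Poisson's ratio of a soft, nearly incompressible sample (assumed)

def hertz_force(delta, E):
    """Hertz model for a spherical tip: F = 4/3 * E/(1-nu^2) * sqrt(R) * delta^(3/2)."""
    return 4.0 / 3.0 * E / (1.0 - NU**2) * np.sqrt(R_TIP) * delta**1.5

# Synthetic force-indentation data for a ~1 MPa sample with measurement noise
rng = np.random.default_rng(2)
delta = np.linspace(0, 200e-9, 100)                                   # indentation (m)
force = hertz_force(delta, 1e6) + rng.normal(0, 2e-10, delta.size)    # force (N)

E_fit, _ = curve_fit(hertz_force, delta, force, p0=[1e5])
print(f"fitted Young's modulus: {E_fit[0] / 1e6:.2f} MPa")
```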

Relevance:

30.00%

Publisher:

Abstract:

Brain functioning relies on the interaction of several neural populations connected through complex connectivity networks, enabling the transmission and integration of information. Recent advances in neuroimaging techniques, such as electroencephalography (EEG), have deepened our understanding of the reciprocal roles played by brain regions during cognitive processes. The underlying idea of this PhD research is that EEG-related functional connectivity (FC) changes in the brain may incorporate important neuromarkers of behavior and cognition, as well as brain disorders, even at subclinical levels. However, a complete understanding of the reliability of the wide range of existing connectivity estimation techniques is still lacking. The first part of this work addresses this limitation by employing Neural Mass Models (NMMs), which simulate EEG activity and offer a unique tool to study interconnected networks of brain regions in controlled conditions. NMMs were employed to test FC estimators like Transfer Entropy and Granger Causality in linear and nonlinear conditions. Results revealed that connectivity estimates reflect information transmission between brain regions, a quantity that can be significantly different from the connectivity strength, and that Granger causality outperforms the other estimators. A second objective of this thesis was to assess brain connectivity and network changes on EEG data reconstructed at the cortical level. Functional brain connectivity has been estimated through Granger Causality, in both temporal and spectral domains, with the following goals: a) detect task-dependent functional connectivity network changes, focusing on internal-external attention competition and fear conditioning and reversal; b) identify resting-state network alterations in a subclinical population with high autistic traits. Connectivity-based neuromarkers, compared to the canonical EEG analysis, can provide deeper insights into brain mechanisms and may drive future diagnostic methods and therapeutic interventions. However, further methodological studies are required to fully understand the accuracy and information captured by FC estimates, especially concerning nonlinear phenomena.
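
A minimal time-domain Granger causality estimate between two signals, comparing the residual variances of restricted and full autoregressive models, is sketched below; this toy implementation is illustrative only and is not the NMM-based pipeline or the spectral estimators used in the thesis.

```python
import numpy as np

def granger_causality(x, y, order=5):
    """Granger causality y -> x: log ratio of restricted vs. full AR residual variances."""
    n = len(x)
    # Lagged design matrices: x lags only (restricted) and x + y lags (full)
    X_r = np.column_stack([x[order - k - 1:n - k - 1] for k in range(order)])
    X_f = np.column_stack([X_r] + [y[order - k - 1:n - k - 1] for k in range(order)])
    target = x[order:]
    res_r = target - X_r @ np.linalg.lstsq(X_r, target, rcond=None)[0]
    res_f = target - X_f @ np.linalg.lstsq(X_f, target, rcond=None)[0]
    return np.log(np.var(res_r) / np.var(res_f))

# Toy coupled signals: y drives x with a one-sample delay
rng = np.random.default_rng(3)
y = rng.normal(size=5000)
x = np.zeros_like(y)
for t in range(1, len(y)):
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + 0.1 * rng.normal()

print("GC y->x:", granger_causality(x, y))   # large: y helps predict x
print("GC x->y:", granger_causality(y, x))   # near zero: x does not help predict y
```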

Relevance:

30.00%

Publisher:

Abstract:

Ultrafast pump-probe spectroscopy is a conceptually simple and versatile tool for resolving photoinduced dynamics in molecular systems. Due to the fast development of new experimental setups, such as synchrotron light sources and X-ray free electron lasers (XFELs), new spectral windows are becoming accessible. On the one hand, these sources have enabled scientists to access ever faster time scales and to reach unprecedented insights into the dynamical properties of matter. On the other hand, the complementarity of well-developed and novel techniques makes it possible to study the same physical process from different points of view, integrating the advantages and overcoming the limitations of each approach. In this context, it is highly desirable to reach a clear understanding of which type of spectroscopy is more suited to capture a certain facet of a given photo-induced process, that is, to establish a correlation between the process to be unraveled and the technique to be used. In this thesis, I will show how computational spectroscopy can be a tool to establish such a correlation. I will study a specific process, the ultrafast energy transfer in the nicotinamide adenine dinucleotide dimer (NADH). This process will be observed in different spectral windows (from the UV-VIS to X-rays), assessing the ability of different spectroscopic techniques to unravel the system's evolution by means of state-of-the-art theoretical models and methodologies. The comparison of different spectroscopic simulations will demonstrate their complementarity, eventually allowing the identification of the type of spectroscopy that is best suited to resolve the ultrafast energy transfer.

Relevance:

30.00%

Publisher:

Abstract:

This thesis illustrates the construction of a mathematical model of a hydraulic system, oriented to the design of a model predictive control (MPC) algorithm. The modeling procedure starts with the basic formulation of a piston-servovalve system. The latter is a complex nonlinear system with some unknown and unmeasurable effects that make the modeling procedure challenging. A first approximation of the system parameters is obtained from datasheet information, workbench tests provided by the company, and other company data. Then, to validate and refine the model, open-loop simulations are performed to match the model output against characteristics obtained from real acquisitions. The final set of ODEs captures all the main peculiarities of the system, despite leaving out some characteristics due to highly varying and unknown hydraulic effects, such as the unmodeled resistive elements of the pipes. After careful analysis, given the model's many internal complexities, a simplified version is then presented; the latter is used to correctly linearize and discretize the nonlinear model. Based on that, an MPC algorithm for reference tracking with linear constraints is implemented. The results show the potential of MPC in this kind of industrial application: high-quality tracking performance while satisfying state and input constraints. The increased robustness and flexibility with respect to the standard control techniques adopted for these systems, such as PID controllers, are evident. The simulations for model validation and for the controlled system were carried out in a Python environment.
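
A compact sketch of a linear, constrained reference-tracking MPC of the kind described, written against an arbitrary discretized double-integrator stand-in rather than the actual piston-servovalve model, is shown below (it assumes the cvxpy package; all weights, limits, and dynamics are illustrative).

```python
import numpy as np
import cvxpy as cp

# Discretized linear model (stand-in for the linearized piston-servovalve dynamics)
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])    # states: position, velocity
B = np.array([[0.0], [dt]])
Q = np.diag([100.0, 1.0])                # state tracking weight
R = np.array([[0.1]])                    # input weight
N = 20                                   # prediction horizon
x_ref = np.array([0.05, 0.0])            # position reference (m)
u_max = 10.0                             # input constraint

def mpc_step(x0):
    """Solve the finite-horizon QP and return the first optimal input."""
    x = cp.Variable((2, N + 1))
    u = cp.Variable((1, N))
    cost, constr = 0, [x[:, 0] == x0]
    for k in range(N):
        cost += cp.quad_form(x[:, k + 1] - x_ref, Q) + cp.quad_form(u[:, k], R)
        constr += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                   cp.abs(u[:, k]) <= u_max]
    cp.Problem(cp.Minimize(cost), constr).solve()
    return u.value[:, 0]

# Closed-loop simulation from rest, applying only the first input at each step
x = np.zeros(2)
for _ in range(50):
    x = A @ x + B @ mpc_step(x)
print("final state:", np.round(x, 4))
```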

Relevance:

20.00%

Publisher:

Abstract:

The aim of this investigation was to compare the skeletal stability of three different rigid fixation methods after mandibular advancement. Fifty-five class II malocclusion patients treated with bilateral sagittal split ramus osteotomy and mandibular advancement were selected for this retrospective study. Group 1 (n = 17) had miniplates with monocortical screws, Group 2 (n = 16) had bicortical screws, and Group 3 (n = 22) had the osteotomy fixed by means of the hybrid technique. Cephalograms were taken preoperatively, within 1 week postoperatively, and 6 months after the orthognathic surgery. Linear and angular changes of the cephalometric landmarks of the chin region were measured at each period, and the changes at each landmark were determined for the intervals between time points. Postoperative changes in mandibular shape were analyzed to determine the stability of the fixation methods. There was minimal difference in relapse of the mandibular advancement among the three groups, and statistical analysis showed no significant difference in postoperative stability. However, a positive correlation between the amount of advancement and the amount of postoperative relapse was demonstrated by multiple linear regression (p < 0.05). It can be concluded that all three techniques can be used to obtain stable postoperative results in mandibular advancement after 6 months.

Relevance:

20.00%

Publisher:

Abstract:

Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques, such as patches or whole-body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles underlying sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and to the need to establish a single methodology for quantifying dermal exposure in rural workers. Such harmonization of the different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.

Relevance:

20.00%

Publisher:

Abstract:

The Centers for High Cost Medication (Centros de Medicação de Alto Custo, CEDMAC) of the São Paulo Health Department were instituted through a project in partnership with the Clinical Hospital of the Faculty of Medicine, USP, sponsored by the Foundation for Research Support of the State of São Paulo (Fundação de Amparo à Pesquisa do Estado de São Paulo, FAPESP), aimed at forming a statewide network for comprehensive care of patients referred for the use of immunobiological agents in rheumatological diseases. The CEDMAC of Hospital de Clínicas, Universidade Estadual de Campinas (HC-Unicamp), implemented by the Division of Rheumatology, Faculty of Medical Sciences, identified the need to standardize the conduct of the multidisciplinary team, given the specificity of the care provided, and recognized the importance of describing its operational and technical processes in manual format. The aim of this study is to present the methodology applied to the elaboration of the CEDMAC/HC-Unicamp Manual as an institutional tool for offering the best care and administrative quality. In the methodology for preparing manuals at HC-Unicamp since 2008, the premise was to obtain a document that is participatory and multidisciplinary, focused on work processes integrated with institutional rules, with objective and didactic descriptions, in a standardized format and with electronic dissemination. The CEDMAC/HC-Unicamp Manual was elaborated over 10 months, with the involvement of the entire multidisciplinary team, and comprises 19 chapters on work processes and techniques, in addition to those concerning the organizational structure and its annexes. Published in the electronic portal of HC Manuals in July 2012 as an e-Book (ISBN 978-85-63274-17-5), the manual has been a valuable instrument in guiding professionals in healthcare, teaching and research activities.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to evaluate three transfer techniques used to obtain working casts of implant-supported prostheses, by means of the marginal misfit and the strain induced in a metallic framework. Thirty working casts were obtained from a metallic master cast, each one containing two implant analogues simulating a clinical situation of a three-unit implant-supported fixed prosthesis, according to the following transfer impression techniques: Group A, squared transfers splinted with dental floss and acrylic resin, sectioned and re-splinted; Group B, squared transfers splinted with dental floss and bis-acrylic resin; and Group N, squared transfers not splinted. A metallic framework was fabricated from the metallic master cast for the marginal misfit and strain measurements. The misfit between the metallic framework and the working casts was evaluated with an optical microscope following the single-screw test protocol. Under the same conditions, the strain was evaluated using strain gauges placed on the metallic framework. The data were submitted to one-way ANOVA followed by Tukey's test (α=5%). For both marginal misfit and strain, there were statistically significant differences between Groups A and N (p<0.01) and Groups B and N (p<0.01), with greater values for Group N. According to Pearson's test, there was a positive correlation between misfit and strain (r=0.5642). The results of this study show that impression techniques with splinted transfers promote better accuracy than the non-splinted technique, regardless of the splinting material used.
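
The statistical pipeline reported (one-way ANOVA, Tukey's post hoc comparisons, Pearson correlation) can be reproduced on data of the same shape with standard Python libraries, as in the hedged sketch below; the misfit and strain values are fabricated placeholders, not the study's measurements.

```python
import numpy as np
from scipy.stats import f_oneway, pearsonr
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(4)

# Placeholder marginal misfit values (micrometers) for the three impression groups
misfit_a = rng.normal(40, 8, 10)   # Group A: acrylic-resin splinted
misfit_b = rng.normal(45, 8, 10)   # Group B: bis-acrylic splinted
misfit_n = rng.normal(70, 10, 10)  # Group N: not splinted

# One-way ANOVA across the three groups
print(f_oneway(misfit_a, misfit_b, misfit_n))

# Tukey's post hoc pairwise comparisons (alpha = 0.05)
values = np.concatenate([misfit_a, misfit_b, misfit_n])
groups = ["A"] * 10 + ["B"] * 10 + ["N"] * 10
print(pairwise_tukeyhsd(values, groups, alpha=0.05))

# Pearson correlation between misfit and strain (strain here is a synthetic proxy)
strain = 5.0 * values + rng.normal(0, 40, values.size)
print(pearsonr(values, strain))
```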