984 results for Context modeling


Relevance: 30.00%

Abstract:

This paper presents a methodology for modeling high intensity discharge lamps based on artificial neural networks. The methodology provides a model which is able to represent the device operating at the frequency of distribution systems, facing events related to power quality. With the aid of a data acquisition system to monitor the laboratory experiment, and using MATLAB® software, data were obtained for the training of two neural networks. These neural networks, working together, were able to represent with high fidelity the behavior of a discharge lamp. The excellent performance obtained by these models allowed the simulation of a group of lamps in a distribution system with shorter simulation time when compared to mathematical models, justifying the application of this family of loads in electric power system studies. The representation of the device facing power quality disturbances also proved to be a useful tool for more complex studies in distribution systems. © 2013 Brazilian Society for Automatics - SBA.
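As a purely illustrative sketch of the general idea (not the authors' architecture or data), a small feed-forward regressor can be trained to map a window of recent supply-voltage samples, plus the previous lamp current, to the next current sample; the file names, window length and network size below are hypothetical.

```python
# Minimal sketch: learn next lamp current from a sliding window of measurements.
# `voltage` and `current` are assumed 1-D arrays sampled in the lab (hypothetical
# files); the window length and network size are illustrative choices.
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_dataset(voltage, current, window=16):
    X, y = [], []
    for k in range(window, len(current) - 1):
        # features: recent voltage samples + previous current sample
        X.append(np.r_[voltage[k - window:k], current[k - 1]])
        y.append(current[k])
    return np.array(X), np.array(y)

voltage = np.load("lamp_voltage.npy")    # hypothetical measurement files
current = np.load("lamp_current.npy")

X, y = make_dataset(voltage, current)
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X, y)                          # train the surrogate lamp model
i_next = model.predict(X[:1])            # one-step-ahead current prediction
```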

Relevance: 30.00%

Abstract:

The Finite Element Method (FEM) is a well-known technique that is extensively applied in different areas, and FEM studies are being targeted at improving cardiac ablation procedures. For such simulations, the finite element meshes should consider the size and histological features of the target structures. However, some methods and tools used to generate meshes of human body structures are still limited, due to insufficiently detailed models, nontrivial preprocessing, or, above all, restrictive conditions of use. In this paper, alternatives are demonstrated for solid modeling and automatic generation of highly refined tetrahedral meshes, with quality compatible with other studies focused on mesh generation. The innovations presented here are strategies to integrate Open Source Software (OSS). The chosen techniques and strategies are presented and discussed, considering cardiac structures as a first application context. © 2013 E. Pavarino et al.
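The paper's contribution is the integration strategy itself; purely as an illustration of one open-source building block of such a pipeline, the sketch below generates a refined tetrahedral mesh of a simple solid with the Gmsh Python API. The geometry and mesh size are placeholders, not the cardiac models used by the authors.

```python
# Illustrative only: tetrahedral meshing of a simple solid with the open-source
# Gmsh Python API (pip install gmsh). The sphere stands in for an anatomical
# structure; the maximum element size controls the level of refinement.
import gmsh

gmsh.initialize()
gmsh.model.add("demo_solid")
gmsh.model.occ.addSphere(0, 0, 0, 1.0)           # placeholder geometry
gmsh.model.occ.synchronize()
gmsh.option.setNumber("Mesh.MeshSizeMax", 0.05)  # refine the mesh
gmsh.model.mesh.generate(3)                      # 3 = volume (tetrahedral) mesh
gmsh.write("demo_solid.msh")
gmsh.finalize()
```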

Relevance: 30.00%

Abstract:

This paper presents a new technique to model interfaces by means of degenerated solid finite elements, i.e., elements with a very high aspect ratio, whose smallest dimension corresponds to the thickness of the interface. It is shown that, as the aspect ratio increases, the element strains also increase, approaching the kinematics of a strong discontinuity. A tensile damage constitutive relation between strains and stresses is proposed to describe the nonlinear behavior of the interfaces associated with crack opening. To represent crack propagation, pairs of triangular interface elements are introduced between all regular (bulk) elements of the original mesh. With this technique the analyses can be performed entirely within the framework of continuum mechanics, and complex crack patterns involving multiple cracks can be simulated without the need for tracking algorithms. Numerical tests are performed to show the applicability of the proposed technique, also studying aspects related to mesh objectivity.
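For illustration, a generic tensile damage law of the kind described (linear elastic up to a limit strain, then exponential softening) can be written as sigma = (1 - d(eps)) * E * eps; the parameter values and softening shape below are assumptions, not the relation calibrated in the paper.

```python
# Generic isotropic tensile damage law with exponential softening (illustrative
# parameters): linear elastic up to eps0, then the stress decays toward zero as
# the damage variable d grows from 0 to 1, representing crack opening.
import numpy as np

E    = 30e9    # Young's modulus [Pa]        (assumed)
eps0 = 1e-4    # damage-onset strain          (assumed)
epsf = 5e-4    # softening "ductility" strain (assumed)

def damage(eps):
    eps = np.maximum(np.asarray(eps, dtype=float), 1e-16)
    d = np.where(eps <= eps0,
                 0.0,
                 1.0 - (eps0 / eps) * np.exp(-(eps - eps0) / epsf))
    return np.clip(d, 0.0, 1.0)

def stress(eps):
    return (1.0 - damage(eps)) * E * np.asarray(eps)

eps = np.linspace(0.0, 3e-3, 200)
sig = stress(eps)    # rising branch up to eps0, softening branch beyond it
```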

Relevance: 30.00%

Abstract:

A method is presented for estimating age-specific mortality based on minimal information: a model life table and an estimate of longevity. This approach uses expected patterns of mammalian survivorship to define a general model of age-specific mortality rates. One such model life table is based on data for northern fur seals (Callorhinus ursinus) using Siler’s (1979) 5-parameter competing risk model. Alternative model life tables are based on historical data for human females and on a published model for Old World monkeys. Survival rates for a marine mammal species are then calculated by scaling these models by the longevity of that species. By using a realistic model (instead of assuming constant mortality), one can see more easily the real biological limits to population growth. The mortality estimation procedure is illustrated with examples of spotted dolphins (Stenella attenuata) and harbor porpoise (Phocoena phocoena).
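For reference, the Siler (1979) hazard has the form mu(x) = a1*exp(-b1*x) + a2 + a3*exp(b3*x), and survivorship follows by integrating the hazard in closed form. The sketch below evaluates a Siler life table on a unit age scale and stretches it by an assumed longevity; the parameter values are placeholders, not the northern fur seal fit used in the paper.

```python
# Siler (1979) competing-risk mortality model: juvenile, constant and senescent
# hazard components. Parameter values below are placeholders for illustration.
import numpy as np

def siler_hazard(x, a1, b1, a2, a3, b3):
    return a1 * np.exp(-b1 * x) + a2 + a3 * np.exp(b3 * x)

def siler_survivorship(x, a1, b1, a2, a3, b3):
    # l(x) = exp(-integral of the hazard from 0 to x), available in closed form
    H = (a1 / b1) * (1.0 - np.exp(-b1 * x)) + a2 * x + (a3 / b3) * (np.exp(b3 * x) - 1.0)
    return np.exp(-H)

# Scale a "standard" life table to a species by stretching age with its longevity.
longevity = 20.0                       # assumed maximum age of the target species [yr]
ages = np.linspace(0.0, longevity, 101)
x_std = ages / longevity               # age on the standard (unit-longevity) scale
lx = siler_survivorship(x_std, a1=0.2, b1=5.0, a2=0.05, a3=0.01, b3=4.0)
qx = 1.0 - lx[1:] / lx[:-1]            # age-specific mortality between tabulated ages
```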

Relevance: 30.00%

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
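As a minimal, hypothetical illustration of the three-stage hierarchy discussed (parameter model, process model, data model), the sketch below simulates abundance at several sites: expected abundances are drawn from shared hyperparameters, true abundances follow a Poisson process model, and observed counts are generated with imperfect detection.

```python
# Minimal simulation of a hierarchical ecological model (hypothetical values):
#   parameter level: regional mean log-abundance mu and spread sigma
#   process level:   true site abundance N_i ~ Poisson(exp(mu + sigma * z_i))
#   data level:      observed counts y_ij ~ Binomial(N_i, p) with detection p
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_visits, p_detect = 30, 3, 0.6      # assumed design and detection
mu, sigma = 2.0, 0.5                          # assumed hyperparameters

lam = np.exp(mu + sigma * rng.standard_normal(n_sites))           # process level
N = rng.poisson(lam)                                              # true abundances
y = rng.binomial(N[:, None], p_detect, size=(n_sites, n_visits))  # data level

# A naive analysis that ignores detection underestimates abundance on average:
print(y.mean(), N.mean())
```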

Relevance: 30.00%

Abstract:

Blast traumatic brain injury (BTBI) has become an important topic of study because of the increase of such incidents, especially due to the recent growth of improvised explosive devices (IEDs). This thesis discusses a project in which laboratory testing of BTBI was made possible by performing blast loading on experimental models simulating the human head. Three versions of experimental models were prepared – one having a simple geometry and the other two having geometry similar to a human head. For developing the head models, three important parts of the head were considered for material modeling and analysis – the skin, skull and brain. The materials simulating skin, skull and brain went through many testing procedures including dynamic mechanical analysis (DMA). For finding a suitable brain simulant, several materials were tested under low and high frequencies. Step response analysis, rheometry and DMA tests were performed on materials such as water based gels, oil based mixtures and silicone gels cured at different temperatures. The gelatins and silicone gels showed promising results toward their use as brain surrogate materials. Temperature degradation tests were performed on gelatins, indicating the fast degradation of gelatins at room temperature. Silicone gels were much more stable compared to the water based gels. Silicone gels were further processed using a thinner-type additive gel to bring the dynamic modulus values closer to those of human brain matter. The obtained values from DMA were compared to the values for human brain as found in literature. Then a silicone rubber brain mold was prepared to give the brain model accurate geometry. All the components were put together to make the entire head model. A steel mount was prepared to attach the head for testing at the end of the shock tube. Instrumentation was implemented in the head model to obtain effective results for understanding more about the possible mechanisms of BTBI. The final head model was named the Realistic Explosive Dummy Head or the “RED Head.” The RED Head offered potential for realistic experimental testing in blast loading conditions by virtue of its material properties and geometrical accuracy.
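For context (not code from the thesis), the dynamic quantities reported by DMA follow directly from the stress amplitude, strain amplitude and phase lag: storage modulus E' = (sigma0/eps0)*cos(delta), loss modulus E'' = (sigma0/eps0)*sin(delta), and tan(delta) = E''/E'. The sketch below evaluates them for assumed values.

```python
# Storage and loss moduli from a DMA measurement (illustrative numbers only).
import math

sigma0 = 1.2e3               # stress amplitude [Pa]   (assumed)
eps0   = 0.05                # strain amplitude [-]    (assumed)
delta  = math.radians(25.0)  # phase lag between stress and strain (assumed)

E_star    = sigma0 / eps0              # magnitude of the complex modulus
E_storage = E_star * math.cos(delta)   # elastic (in-phase) part
E_loss    = E_star * math.sin(delta)   # viscous (out-of-phase) part
tan_delta = E_loss / E_storage         # damping ratio used to compare candidate gels
print(E_storage, E_loss, tan_delta)
```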

Relevance: 30.00%

Abstract:

Drawing on longitudinal data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998–1999, this study used IRT modeling to operationalize a measure of parental educational investments based on Lareau’s notion of concerted cultivation. It used multilevel piecewise growth models regressing children’s math and reading achievement from entry into kindergarten through the third grade on concerted cultivation and family context variables. The results indicate that educational investments are an important mediator of socioeconomic and racial/ethnic disparities, completely explaining the black-white reading gap at kindergarten entry and consistently explaining 20 percent to 60 percent and 30 percent to 50 percent of the black-white and Hispanic-white disparities in the growth parameters, respectively, and approximately 20 percent of the socioeconomic gradients. Notably, concerted cultivation played a more significant role in explaining racial/ethnic gaps in achievement than expected from Lareau’s discussion, which suggests that after socioeconomic background is controlled, concerted cultivation should not be implicated in racial/ethnic disparities in learning.
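A hedged sketch of a multilevel piecewise growth specification of this general kind (not the study's exact model, knots or variable names) can be written with statsmodels: achievement is regressed on two time slopes split at a knot, plus concerted cultivation and family-context covariates, with random child-level intercepts and slopes.

```python
# Illustrative multilevel piecewise growth model (hypothetical column names):
# two growth pieces with random child-level intercepts and slopes; `cc` is a
# stand-in for the concerted-cultivation score.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ecls_k_long.csv")   # hypothetical long-format file with columns:
                                      # child_id, math, time_piece1, time_piece2, cc, ses, race

model = smf.mixedlm(
    "math ~ time_piece1 + time_piece2 + cc + ses + C(race)",
    data=df,
    groups=df["child_id"],
    re_formula="~time_piece1 + time_piece2",
)
result = model.fit()
print(result.summary())
```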

Relevance: 30.00%

Abstract:

Land development in the vicinity of airports often leads to land-use that can attract birds that are hazardous to aviation operations. For this reason, certain forms of land-use have traditionally been discouraged within prescribed distances of Canadian airports. However, this often leads to an unrealistic prohibition of land-use in the vicinity of airports located in urban settings. Furthermore, it is often unclear that the desired safety goals have been achieved. This paper describes a model that was created to assist in the development of zoning regulations for a future airport site in Canada. The framework links land-use to bird-related safety-risks and aircraft operations by categorizing the predictable relationships between: (i) different land uses found in urbanized and urbanizing settings near airports; (ii) bird species; and (iii) the different safety-risks to aircraft during various phases of flight. The latter is assessed relative to the runway approach and departure paths. Bird species are ranked to reflect the potential severity of an impact with an aircraft (using bird weight, flocking characteristics, and flight behaviours). These criteria are then employed to chart bird-related safety-risks relative to runway reference points. Each form of land-use is categorized to reflect the degree to which it attracts hazardous bird species. From this information, hazard and risk matrices have been developed and applied to the future airport setting, thereby providing risk-based guidance on appropriate land-uses that range from prohibited to acceptable. The framework has subsequently been applied to an existing Canadian airport, and is currently being adapted for national application. The framework provides a risk-based and science-based approach that offers municipalities and property owners flexibility in managing the risks to aviation related to their land use.
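As a simplified, hypothetical illustration of the framework's logic, the sketch below scores bird species for strike severity, scores land uses for how strongly they attract those species, and combines the two (with extra weight near approach/departure paths) into a rating from acceptable to prohibited; all categories, scores and thresholds are invented for the example.

```python
# Toy version of a bird-hazard risk matrix (all scores and thresholds invented).
# severity: 1 (low) .. 3 (high), based on weight, flocking and flight behaviour
SPECIES_SEVERITY = {"gull": 3, "starling": 2, "sparrow": 1}

# attractiveness of a land use to each species: 0 (none) .. 3 (strong)
LAND_USE_ATTRACTION = {
    "waste_facility": {"gull": 3, "starling": 2, "sparrow": 1},
    "golf_course":    {"gull": 2, "starling": 1, "sparrow": 1},
    "warehouse":      {"gull": 0, "starling": 1, "sparrow": 1},
}

def risk_rating(land_use: str, near_approach_path: bool) -> str:
    """Combine severity x attraction, weighted up near approach/departure paths."""
    score = max(SPECIES_SEVERITY[s] * a
                for s, a in LAND_USE_ATTRACTION[land_use].items())
    if near_approach_path:
        score += 2
    if score >= 8:
        return "prohibited"
    if score >= 5:
        return "conditional"
    return "acceptable"

print(risk_rating("waste_facility", near_approach_path=True))   # prohibited
print(risk_rating("warehouse", near_approach_path=False))       # acceptable
```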

Relevance: 30.00%

Abstract:

Forward modeling is commonly applied to gravity field data of impact structures to determine the main gravity anomaly sources. In this context, we have developed 2.5-D gravity models of the Serra da Cangalha impact structure for the purpose of investigating geological bodies/structures underneath the crater. Interpretation of the models was supported by ground magnetic data acquired along profiles, as well as by high resolution aeromagnetic data. Ground magnetic data reveal the presence of short-wavelength anomalies probably related to shallow magnetic sources that could have been emplaced during the cratering process. Aeromagnetic data show that the basement underneath the crater occurs at an average depth of about 1.9 km, whereas in the region beneath the central uplift it is raised to 0.51 km below the current surface. These depths are also supported by 2.5-D gravity models showing a gentle relief for the basement beneath the central uplift area. Geophysical data were used to provide further constraints for numerical modeling of crater formation, which provided important information on the structural modification that affected the rocks underneath the crater, as well as on shock-induced modifications of the target rocks. The results showed that the morphology is consistent with current observations of the crater and that Serra da Cangalha was formed by a meteorite of approximately 1.4 km diameter striking at 12 km s⁻¹.
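For orientation only (the study uses 2.5-D models of polygonal bodies, not this closed form), the vertical gravity anomaly of a buried sphere of radius R, depth z and density contrast drho along a surface profile x is g_z(x) = G*(4/3)*pi*R^3*drho*z/(x^2 + z^2)^(3/2); the sketch below evaluates it with assumed values.

```python
# Forward gravity anomaly of a buried sphere along a surface profile
# (textbook closed form; parameter values are assumptions for illustration).
import numpy as np

G = 6.674e-11          # gravitational constant [m^3 kg^-1 s^-2]
R = 500.0              # sphere radius [m]              (assumed)
z = 1900.0             # depth to centre [m]            (assumed, ~basement depth)
drho = 300.0           # density contrast [kg m^-3]     (assumed)

x = np.linspace(-10e3, 10e3, 401)                 # profile coordinate [m]
mass = (4.0 / 3.0) * np.pi * R**3 * drho          # anomalous mass
gz = G * mass * z / (x**2 + z**2) ** 1.5          # vertical attraction [m s^-2]
gz_mgal = gz * 1e5                                # convert to mGal
```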

Relevance: 30.00%

Abstract:

Building facilities have become important infrastructures for modern productive plants dedicated to services. In this context, the control systems of intelligent buildings have evolved and their reliability has evidently improved. However, the occurrence of faults is inevitable in systems conceived, constructed and operated by humans, so a practical alternative approach is very useful to reduce the consequences of faults. Yet only a few publications address intelligent building modeling processes that take into consideration the occurrence of faults and how to manage their consequences. In the light of the foregoing, a procedure is proposed for the modeling of intelligent building control systems, considering their functional specifications in normal operation and in the event of faults. The proposed procedure adopts the concepts of discrete event systems and holons, and explores Petri nets and their extensions so as to represent the structure and operation of control systems for intelligent buildings under normal and abnormal situations. (C) 2012 Elsevier B.V. All rights reserved.
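As a minimal illustration of the modeling ingredients mentioned (not the authors' holonic models), the sketch below encodes a tiny Petri net in which a fault transition diverts a running subsystem into a degraded mode before service is restored; the places, transitions and marking are invented for the example.

```python
# Tiny Petri net interpreter (places, transitions, token marking) used to sketch
# normal vs. fault-handling behaviour of a building subsystem. All names invented.
marking = {"idle": 1, "running": 0, "fault_detected": 0, "degraded_mode": 0}

TRANSITIONS = {
    "start":    ({"idle": 1}, {"running": 1}),
    "fault":    ({"running": 1}, {"fault_detected": 1}),
    "mitigate": ({"fault_detected": 1}, {"degraded_mode": 1}),
    "restore":  ({"degraded_mode": 1}, {"idle": 1}),
}

def enabled(name):
    pre, _ = TRANSITIONS[name]
    return all(marking[p] >= w for p, w in pre.items())

def fire(name):
    assert enabled(name), f"transition {name} is not enabled"
    pre, post = TRANSITIONS[name]
    for p, w in pre.items():
        marking[p] -= w
    for p, w in post.items():
        marking[p] += w

for t in ["start", "fault", "mitigate", "restore"]:   # one normal/fault cycle
    fire(t)
print(marking)   # back to the initial marking: {'idle': 1, ...}
```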

Relevance: 30.00%

Abstract:

Primary voice production occurs in the larynx through vibrational movements carried out by vocal folds. However, many problems can affect this complex system resulting in voice disorders. In this context, time-frequency-shape analysis based on embedding phase space plots and nonlinear dynamics methods have been used to evaluate the vocal fold dynamics during phonation. For this purpose, the present work used high-speed video to record the vocal fold movements of three subjects and extract the glottal area time series using an image segmentation algorithm. This signal is used for an optimization method which combines genetic algorithms and a quasi-Newton method to optimize the parameters of a biomechanical model of vocal folds based on lumped elements (masses, springs and dampers). After optimization, this model is capable of simulating the dynamics of recorded vocal folds and their glottal pulse. Bifurcation diagrams and phase space analysis were used to evaluate the behavior of this deterministic system in different circumstances. The results showed that this methodology can be used to extract some physiological parameters of vocal folds and reproduce some complex behaviors of these structures contributing to the scientific and clinical evaluation of voice production. (C) 2010 Elsevier Inc. All rights reserved.
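The hybrid optimization strategy (a global evolutionary search refined by a quasi-Newton step) can be illustrated on a much simpler lumped element than the paper's vocal-fold model: fitting a single mass-spring-damper response to a target signal with SciPy's differential evolution followed by L-BFGS-B. The synthetic signal, bounds and parameter values below are placeholders.

```python
# Illustration of the hybrid optimization idea (global evolutionary search, then a
# quasi-Newton refinement) on a single mass-spring-damper element. The "measured"
# signal, parameter bounds and true values are synthetic placeholders.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import differential_evolution, minimize

t = np.linspace(0.0, 0.05, 400)                      # 50 ms window

def oscillator(params):
    m, k, c = params
    def rhs(_t, y):                                  # y = [displacement, velocity]
        x, v = y
        return [v, (-k * x - c * v) / m]
    sol = solve_ivp(rhs, (t[0], t[-1]), [1e-3, 0.0], t_eval=t, rtol=1e-8)
    return sol.y[0]

true_params = (0.01, 2.0e4, 0.05)                    # synthetic lumped element
target = oscillator(true_params)                     # stand-in for the glottal signal

def cost(params):
    return np.mean((oscillator(params) - target) ** 2)

bounds = [(1e-3, 0.1), (1e3, 1e5), (1e-3, 1.0)]
coarse = differential_evolution(cost, bounds, seed=0, maxiter=30, tol=1e-8)
refined = minimize(cost, coarse.x, method="L-BFGS-B", bounds=bounds)
print(refined.x)                                     # recovered (m, k, c)
```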

Relevance: 30.00%

Abstract:

Background: Sleeping sickness is a major cause of death in Africa. Since no secure treatment is available, the development of novel therapeutic agents is urgent. In this context, the enzyme trypanothione reductase (TR) is a prominent molecular target that has been investigated in drug design for sleeping sickness. Results: In this study, comparative molecular field analysis models were generated for a series of Trypanosoma brucei TR inhibitors. Statistically significant results were obtained and the models were applied to predict the activity of external test sets, with good correlation between predicted and experimental results. We have also investigated the structural requirements for the selective inhibition of the parasite's enzyme over the human glutathione reductase. Conclusion: The quantitative structure-activity relationship models provided valuable information regarding the essential molecular requirements for the inhibitory activity upon the target protein, providing important insights into the design of more potent and selective TR inhibitors.
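CoMFA itself requires dedicated 3D-QSAR software to compute steric and electrostatic field values on a lattice around aligned molecules; as a hedged sketch of only the statistical backbone (partial least squares regression of activity on precomputed field descriptors), the code below uses scikit-learn with hypothetical data files.

```python
# Statistical backbone of a CoMFA-style 3D-QSAR model: PLS regression of pIC50
# values on a matrix of precomputed field descriptors. Data files are hypothetical;
# generating the steric/electrostatic fields requires dedicated 3D-QSAR software.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

X = np.load("field_descriptors.npy")   # (n_inhibitors, n_lattice_probes), hypothetical
y = np.load("pIC50.npy")               # measured TR inhibition values, hypothetical

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=5)    # number of latent variables is a tuning choice
pls.fit(X_tr, y_tr)

y_pred = pls.predict(X_te).ravel()
print("external-set r2:", r2_score(y_te, y_pred))
```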

Relevance: 30.00%

Abstract:

The discovery and development of a new drug are time-consuming, difficult and expensive. This complex process has evolved from classical methods into an integration of modern technologies and innovative strategies addressed to the design of new chemical entities to treat a variety of diseases. The development of new drug candidates is often limited by initial compounds lacking reasonable chemical and biological properties for further lead optimization. Huge libraries of compounds are frequently selected for biological screening using a variety of techniques and standard models to assess potency, affinity and selectivity. In this context, it is very important to study the pharmacokinetic profile of the compounds under investigation. Recent advances have been made in the collection of data and the development of models to assess and predict pharmacokinetic properties (ADME - absorption, distribution, metabolism and excretion) of bioactive compounds in the early stages of drug discovery projects. This paper provides a brief perspective on the evolution of in silico ADME tools, addressing challenges, limitations, and opportunities in medicinal chemistry.
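As a small illustration of the kind of early in silico filter the article surveys (far short of a full ADME prediction), the sketch below computes rule-of-five descriptors for a molecule with the open-source RDKit toolkit; the SMILES string is an arbitrary example.

```python
# Rule-of-five style descriptors with RDKit, often used as a first ADME-oriented
# filter in early drug discovery. The molecule below is an arbitrary example.
from rdkit import Chem
from rdkit.Chem import Descriptors, Crippen, Lipinski

mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")   # aspirin, as a stand-in

profile = {
    "MW":   Descriptors.MolWt(mol),        # molecular weight
    "LogP": Crippen.MolLogP(mol),          # lipophilicity estimate
    "HBD":  Lipinski.NumHDonors(mol),      # hydrogen-bond donors
    "HBA":  Lipinski.NumHAcceptors(mol),   # hydrogen-bond acceptors
}
violations = sum([profile["MW"] > 500, profile["LogP"] > 5,
                  profile["HBD"] > 5, profile["HBA"] > 10])
print(profile, "rule-of-five violations:", violations)
```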

Relevance: 30.00%

Abstract:

Slope failure occurs in many areas throughout the world, and it becomes an important problem when it interferes with human activity, where disasters cause loss of life and property damage. In this research we investigate slope failure through centrifuge modeling, in which a reduced-scale model, N times smaller than the full-scale prototype, is tested while the acceleration is increased to N times the gravitational acceleration, so as to preserve the stress and strain behavior. In brief, the aims of this research, "Centrifuge modeling of sandy slopes", are: (1) to test the reliability of centrifuge modeling as a tool to investigate sandy slope failure; and (2) to understand how the failure mechanism is affected by changing the slope angle and to obtain useful information for design. To achieve this goal, the work is organized as follows. Chapter one (centrifuge modeling of slope failure) provides a general view of the context of the work: what a slope failure is, how it happens, and which tools are available to investigate this phenomenon, followed by an introduction to the technology used, the geotechnical centrifuge. Chapter two (testing apparatus) first describes the procedures and facilities used to perform a centrifuge test, and then gives the characteristics of the soil (Nevada sand), such as dry unit weight, water content, relative density, and its strength parameters (c, φ), which were determined in the laboratory through triaxial tests. Chapter three (centrifuge tests) presents the results of the centrifuge tests, namely the acceleration at failure for each model and its failure surface; the models tested share the same soil and geometric characteristics but have different slope angles (60°, 75° and 90°). Chapter four (slope stability analysis) introduces the features and concepts of the software ReSSA (2.0), which is used to calculate the theoretical failure surfaces of the prototypes, and compares the experimental failure surfaces traced in the laboratory with those calculated by the software. Chapter five (conclusion) presents the results obtained in relation to the two main aims mentioned above.
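The scaling rule stated above (geometry reduced N times, acceleration raised to N times gravity so that stresses in model and prototype coincide) can be summarized numerically; the sketch below converts an assumed model height and failure acceleration to prototype scale under standard centrifuge scaling laws.

```python
# Standard centrifuge scaling: a model spun at N*g behaves like a prototype N times
# larger, with identical stresses at homologous points. Values are placeholders,
# not the thesis's test data.
g = 9.81                           # gravity [m/s^2]
model_height = 0.20                # model slope height [m]             (assumed)
accel_at_failure = 35 * g          # centrifuge acceleration at failure (assumed)

N = accel_at_failure / g           # scale factor implied by the g-level
prototype_height = N * model_height        # lengths scale by N
# stress ~ unit_weight * (N*g) * (length/N), i.e. identical to the prototype stress
print(f"at {N:.0f} g the {model_height:.2f} m model represents "
      f"a {prototype_height:.1f} m high slope")
```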

Relevance: 30.00%

Abstract:

The term "Brain Imaging" identi�es a set of techniques to analyze the structure and/or functional behavior of the brain in normal and/or pathological situations. These techniques are largely used in the study of brain activity. In addition to clinical usage, analysis of brain activity is gaining popularity in others recent �fields, i.e. Brain Computer Interfaces (BCI) and the study of cognitive processes. In this context, usage of classical solutions (e.g. f MRI, PET-CT) could be unfeasible, due to their low temporal resolution, high cost and limited portability. For these reasons alternative low cost techniques are object of research, typically based on simple recording hardware and on intensive data elaboration process. Typical examples are ElectroEncephaloGraphy (EEG) and Electrical Impedance Tomography (EIT), where electric potential at the patient's scalp is recorded by high impedance electrodes. In EEG potentials are directly generated from neuronal activity, while in EIT by the injection of small currents at the scalp. To retrieve meaningful insights on brain activity from measurements, EIT and EEG relies on detailed knowledge of the underlying electrical properties of the body. This is obtained from numerical models of the electric �field distribution therein. The inhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a tradeo�ff between physical accuracy and technical feasibility, which currently severely limits the capabilities of these techniques. Moreover elaboration of data recorded requires usage of regularization techniques computationally intensive, which influences the application with heavy temporal constraints (such as BCI). This work focuses on the parallel implementation of a work-flow for EEG and EIT data processing. The resulting software is accelerated using multi-core GPUs, in order to provide solution in reasonable times and address requirements of real-time BCI systems, without over-simplifying the complexity and accuracy of the head models.