966 results for REALISTIC MODELS


Relevance: 100.00%

Abstract:

In previous works, many authors have widely used mass-consistent models for wind field simulation by the finite element method. On one hand, we have developed a 3-D mass-consistent model using tetrahedral meshes which are simultaneously adapted to complex orography and to terrain roughness length. In addition, we have included a local refinement strategy around several measurement or control points, significant contours (such as shorelines), or numerical solution singularities. On the other hand, we have developed a 2.5-D model for simulating the wind velocity in a 3-D domain in terms of the terrain elevation, the surface temperature and the meteorological wind, which is considered as an averaged wind on vertical boundaries...
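
A minimal sketch of the mass-consistent adjustment idea on a flat 2-D grid (the models above are 3-D adaptive finite-element codes; the grid, winds and Jacobi solver here are illustrative assumptions): the initial wind field is projected onto the nearest divergence-free field by solving a Poisson equation for a Lagrange multiplier.

```python
import numpy as np

# Minimal 2-D mass-consistent adjustment (illustrative, flat regular grid).
# Given an initial wind field (u0, v0), find the nearest divergence-free
# field by solving a Poisson equation for a Lagrange multiplier phi:
#   laplacian(phi) = div(u0, v0),  then  u = u0 - dphi/dx, v = v0 - dphi/dy.

n, h = 64, 1.0                       # grid size and spacing (assumed)
u0 = np.ones((n, n))                 # uniform background wind, x-component
v0 = np.zeros((n, n))                # y-component
v0[n // 2 - 4:n // 2 + 4, :] = 0.5   # a crude local deflection

# Divergence of the initial field (central differences, interior points).
div = np.zeros((n, n))
div[1:-1, 1:-1] = ((u0[1:-1, 2:] - u0[1:-1, :-2]) +
                   (v0[2:, 1:-1] - v0[:-2, 1:-1])) / (2 * h)

# Solve laplacian(phi) = div by Jacobi iteration, with phi = 0 on the boundary.
phi = np.zeros((n, n))
for _ in range(5000):
    phi[1:-1, 1:-1] = 0.25 * (phi[1:-1, 2:] + phi[1:-1, :-2] +
                              phi[2:, 1:-1] + phi[:-2, 1:-1] -
                              h * h * div[1:-1, 1:-1])

# Correct the field; the interior is now approximately divergence-free.
u, v = u0.copy(), v0.copy()
u[1:-1, 1:-1] -= (phi[1:-1, 2:] - phi[1:-1, :-2]) / (2 * h)
v[1:-1, 1:-1] -= (phi[2:, 1:-1] - phi[:-2, 1:-1]) / (2 * h)

res = ((u[1:-1, 2:] - u[1:-1, :-2]) + (v[2:, 1:-1] - v[:-2, 1:-1])) / (2 * h)
print("max |divergence| after adjustment:", np.abs(res).max())
```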

Relevance: 70.00%

Abstract:

Terahertz (THz) propagation in real tissues causes heating, as with any other electromagnetic radiation. A finite-element (FE) model that provides numerical solutions to the heat conduction equation, coupled with realistic models of tissues, is employed in this study to compute the temperature rise due to THz propagation. The results indicate that the temperature rise depends on the tissue type and is highly localized. The developed FE model was validated by obtaining solutions for the steady-state case and showing that they were in good agreement with the analytical solutions. These types of models can also enable computation of specific absorption rates, which are critical in planning and setting up experiments involving biological tissues.
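
As a rough illustration of the kind of computation involved (a 1-D explicit finite-difference stand-in, not the study's 3-D FE model; all tissue and beam parameters below are assumed, generic values), the heat conduction equation with a Beer-Lambert absorbed-power source can be stepped forward as follows:

```python
import numpy as np

# 1-D heat conduction with a localized absorbed-power source (illustrative):
#   rho*c * dT/dt = k * d2T/dx2 + Q(x),  with Q decaying with depth.
# Explicit finite differences; parameter values are generic assumptions,
# not the tissue properties used in the study.

nx, dx = 200, 5e-5                  # 1 cm of tissue, 50 um cells
k, rho, c = 0.5, 1000.0, 3600.0     # W/m/K, kg/m^3, J/kg/K (generic tissue)
alpha = k / (rho * c)
dt = 0.4 * dx * dx / alpha          # stable explicit time step

x = np.arange(nx) * dx
mu = 2000.0                         # absorption coefficient, 1/m (assumed)
I0 = 1000.0                         # incident intensity, W/m^2 (assumed)
Q = mu * I0 * np.exp(-mu * x)       # absorbed power density, W/m^3

T = np.zeros(nx)                    # temperature rise above baseline
for _ in range(20000):
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / (dx * dx)
    T += dt * (alpha * lap + Q / (rho * c))
    T[0] = T[-1] = 0.0              # boundaries held at baseline (Dirichlet)

print("peak temperature rise: %.3f K at depth %.2f mm"
      % (T.max(), 1e3 * x[T.argmax()]))
```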

Relevance: 70.00%

Abstract:

Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and with the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies for conservation and for the exploitation of fisheries, and to assess the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs.

IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life-history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, and then reproduce as fast as possible, while minimising the per capita death rate. Physiological and behavioural ecology are largely built on these principles together with the laws of conservation of matter and energy. To complete construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; on the maximum rates of ingestion, growth and reproduction; and on life-history parameters.

Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure. A common solution is to manually compare model outputs with observations from real landscapes in order to obtain parameters which produce acceptable fits of model to data. However, this procedure can be convoluted and lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
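
A minimal ABC rejection sketch, with a toy one-parameter stochastic growth model standing in for a full IBM (the prior, summary statistic and tolerance are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(growth_rate, n_steps=50):
    """Toy stochastic population model standing in for a full IBM."""
    pop = 10.0
    for _ in range(n_steps):
        pop *= growth_rate * np.exp(rng.normal(0.0, 0.1))
        pop = min(pop, 1e6)                     # crude carrying capacity
    return pop

# "Observed" data generated with a known parameter, so the ABC result
# can be checked against the truth.
true_rate = 1.05
observed = np.median([simulate(true_rate) for _ in range(20)])

# ABC rejection: draw from the prior, simulate, and keep only draws whose
# summary statistic lands within a tolerance of the observed summary.
accepted = []
for _ in range(20000):
    rate = rng.uniform(0.9, 1.2)                # prior (assumed)
    summary = simulate(rate)
    if abs(np.log(summary) - np.log(observed)) < 0.5:   # tolerance (assumed)
        accepted.append(rate)

print("posterior mean %.3f (true %.3f), %d accepted"
      % (np.mean(accepted), true_rate, len(accepted)))
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is what makes the approach attractive for simulation models whose likelihoods are intractable.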

Relevance: 70.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 70.00%

Abstract:

The Assimilation in the Unstable Subspace (AUS) was introduced by Trevisan and Uboldi in 2004, and developed by Trevisan, Uboldi and Carrassi, to minimize the analysis and forecast errors by exploiting the flow-dependent instabilities of the forecast-analysis cycle system, which may be thought of as a system forced by observations. In the AUS scheme the assimilation is obtained by confining the analysis increment to the unstable subspace of the forecast-analysis cycle system, so that it has the same structure as the dominant instabilities of the system. The unstable subspace is estimated by Breeding on the Data Assimilation System (BDAS). AUS-BDAS has already been tested in realistic models and observational configurations, including a quasi-geostrophic model and a high-dimensional, primitive-equation ocean model; the experiments include both fixed and "adaptive" observations. In these contexts, the AUS-BDAS approach greatly reduces the analysis error, with reasonable computational costs compared, for example, to a prohibitively expensive full Extended Kalman Filter. This is a follow-up study in which we revisit the AUS-BDAS approach in the more basic, highly nonlinear Lorenz 1963 convective model. We run observing system simulation experiments in a perfect-model setting, and also with two types of model error: random and systematic. In the different configurations examined in a perfect-model setting, AUS once again shows better efficiency than other advanced data assimilation schemes. In the present study, we develop an iterative scheme that leads to a significant improvement of the overall assimilation performance with respect to standard AUS as well. In particular, it boosts the efficiency of tracking regime changes, at a low computational cost. Other data assimilation schemes need estimates of ad hoc parameters, which have to be tuned for the specific model at hand. In Numerical Weather Prediction models, tuning of parameters, and in particular estimating the model error covariance matrix, may turn out to be quite difficult. Our proposed approach, instead, may be easier to implement in operational models.
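
A schematic of the two central ingredients, breeding and an analysis increment confined to the bred direction, in the Lorenz 1963 model (forward-Euler integration, a single bred vector, and an observation of the x-component only; the breeding amplitude, observation interval and gain are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(1)

def lorenz_step(x, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz 1963 system (coarse but simple)."""
    dxdt = np.array([s * (x[1] - x[0]),
                     x[0] * (r - x[2]) - x[1],
                     x[0] * x[1] - b * x[2]])
    return x + dt * dxdt

truth = np.array([1.0, 1.0, 20.0])
forecast = truth + rng.normal(0, 1.0, 3)   # wrong initial condition
bred = rng.normal(0, 1.0, 3)               # perturbation to be bred
amp, obs_err = 0.1, 0.2                    # breeding amplitude, obs noise (assumed)

errors = []
for step in range(5000):
    truth = lorenz_step(truth)
    perturbed = lorenz_step(forecast + bred)
    forecast = lorenz_step(forecast)
    bred = perturbed - forecast            # breeding: let the difference grow...
    bred *= amp / np.linalg.norm(bred)     # ...then rescale to fixed amplitude

    if step % 25 == 0:                     # observation times (assumed)
        y = truth[0] + rng.normal(0, obs_err)   # observe the x-component only
        e = bred / np.linalg.norm(bred)
        d = y - forecast[0]                     # innovation
        gain = e[0] / (e[0] ** 2 + obs_err ** 2)
        forecast += e * gain * d           # increment confined to the bred direction
        errors.append(np.linalg.norm(forecast - truth))

print("mean analysis error over the last half: %.3f"
      % np.mean(errors[len(errors) // 2:]))
```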

Relevance: 70.00%

Abstract:

The PC12 and SH-SY5Y cell models have been proposed as potentially realistic models for investigating neuronal cell toxicity. The effects of oxidative stress (OS) caused by both H2O2 and Aβ on both cell models were assessed by several methods. Cell toxicity was quantified by measuring cell viability using the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) viability assay, an indicator of the integrity of the electron transfer chain (ETC), and cell morphology by fluorescence and video microscopy, both of which showed OS to cause decreased viability and changes in morphology. Levels of intracellular peroxide production and changes in glutathione and carbonyl levels were also assessed; OS was found to increase intracellular peroxide production and glutathione and carbonyl levels. Differentiated SH-SY5Y cells were also employed and were observed to exhibit the greatest sensitivity to toxicity. The neurotrophic factor nerve growth factor (NGF) was shown to protect against OS: cells pre-treated with NGF showed higher viability after OS, generally less apoptotic morphology, fewer apoptotic nucleoids, generally lower levels of intracellular peroxides, and changes in gene expression. Brain-derived neurotrophic factor (BDNF) and ascorbic acid (AA) were also investigated. BDNF showed no specific neuroprotection; however, the preliminary data do warrant further investigation. AA showed a 'Janus face', exhibiting either antioxidant action and neuroprotection or pro-oxidant action, depending on the situation. The results showed that the toxic effects of compounds such as Aβ and H2O2 are cell-type dependent, and that OS alters glutathione metabolism in neuronal cells. Following toxic insult, glutathione levels are depleted to low levels. It is suggested herein that this depletion triggers an adaptive response, causing alterations in glutathione metabolism as assessed by the expression of glutathione biosynthetic enzyme mRNA and the subsequent increase in glutathione peroxidase (GPX) levels.

Relevance: 70.00%

Abstract:

Reliability and dependability modeling can be employed during many stages of analysis of a computing system to gain insights into its critical behaviors. To provide useful results, realistic models of systems are often necessarily large and complex. Numerical analysis of these models presents a formidable challenge because the sizes of their state-space descriptions grow exponentially with the sizes of the models. On the other hand, simulation of the models requires analysis of many trajectories in order to compute statistically correct solutions. This dissertation presents a novel framework for performing both numerical analysis and simulation. The new numerical approach computes bounds on the solutions of transient measures in large continuous-time Markov chains (CTMCs). It extends existing path-based and uniformization-based methods by identifying sets of paths that are equivalent with respect to a reward measure and related to one another via a simple structural relationship. This relationship makes it possible for the approach to explore multiple paths at the same time, thus significantly increasing the number of paths that can be explored in a given amount of time. Furthermore, the use of a structured representation for the state space and the direct computation of the desired reward measure (without ever storing the solution vector) allow it to analyze very large models using a very small amount of storage. Often, path-based techniques must compute many paths to obtain tight bounds. In addition to presenting the basic path-based approach, we also present algorithms for computing more paths and tighter bounds quickly. One resulting approach is based on the concept of path composition, whereby precomputed subpaths are composed to compute whole paths efficiently. Another approach is based on selecting important paths (among a set of many paths) for evaluation. Many path-based techniques suffer from having to evaluate many (unimportant) paths; evaluating the important ones helps to compute tight bounds efficiently and quickly.
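
For reference, the uniformization recursion that such path-based methods build on computes transient CTMC state probabilities as a Poisson-weighted sum over a subordinated discrete-time chain. A textbook sketch on a toy 3-state chain (the rates and truncation depth are illustrative; the dissertation's bounding machinery is not reproduced):

```python
import numpy as np

# Standard uniformization for transient CTMC analysis:
#   pi(t) = sum_k e^{-qt} (qt)^k / k! * pi0 P^k,  with P = I + Q/q.

Q = np.array([[-3.0,  2.0,  1.0],      # generator of a small 3-state CTMC
              [ 1.0, -1.5,  0.5],      # (rates are illustrative)
              [ 0.0,  4.0, -4.0]])
pi0 = np.array([1.0, 0.0, 0.0])
t = 2.0

q = max(-Q.diagonal())                 # uniformization rate
P = np.eye(3) + Q / q                  # DTMC subordinated at rate q

pi = np.zeros(3)
v = pi0.copy()                         # pi0 P^k, built up iteratively
w = np.exp(-q * t)                     # Poisson weight e^{-qt} (qt)^k / k!
for k in range(200):                   # fixed truncation (crude; real codes
    pi += w * v                        # bound the truncation error explicitly)
    v = v @ P
    w *= q * t / (k + 1)

print("transient distribution at t=%.1f:" % t, np.round(pi, 6))
print("row sum (should be ~1):", pi.sum())
```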

Relevance: 60.00%

Abstract:

The world is facing problems due to the effects of increased atmospheric pollution, climate change and global warming. Innovative technologies are required to identify, quantify and assess the exchange of pollutant gas fluxes between the Earth's surface and the atmosphere. This paper proposes the development of a gas sensor system for a small UAV to monitor pollutant gases, collect data and geo-locate where each sample was taken. The prototype has two principal systems: a light, portable gas sensor and an optional electric-solar-powered UAV. The prototype will be able to operate in the lower troposphere (100-500 m), collect samples, and time-stamp and geo-locate each sample. One of the limitations of a small UAV is the limited power available, so a small, low-power payload is designed and built for this research. The specific gases targeted in this research are NO2, mostly produced by traffic, and NH3, from farming, at concentrations above 0.05 ppm and 35 ppm respectively, which are harmful to human health. The developed prototype will be a useful tool for scientists to analyse the behaviour and tendencies of pollutant gases, producing more realistic models of them.
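
A minimal sketch of the time-stamped, geo-located sample record such a payload needs to produce (the field names and example coordinates are illustrative assumptions; the alert thresholds are the 0.05 ppm and 35 ppm figures quoted above):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative record for a time-stamped, geo-located gas sample
# (field names assumed; not the actual payload firmware).

ALERT_PPM = {"NO2": 0.05, "NH3": 35.0}   # thresholds quoted in the abstract

@dataclass
class GasSample:
    gas: str            # "NO2" or "NH3"
    ppm: float          # measured concentration
    lat: float          # GPS latitude, degrees
    lon: float          # GPS longitude, degrees
    alt_m: float        # altitude above ground, metres
    utc: datetime       # sample time stamp

    def exceeds_threshold(self) -> bool:
        return self.ppm > ALERT_PPM[self.gas]

# Example: a sample taken at 320 m, inside the 100-500 m operating band.
sample = GasSample("NO2", 0.07, -27.4698, 153.0251, 320.0,
                   datetime.now(timezone.utc))
print(sample.gas, sample.ppm, "ppm, alert:", sample.exceeds_threshold())
```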

Relevance: 60.00%

Abstract:

Crowds of noncombatants play a large and increasingly recognized role in modern military operations and often create substantial difficulties for the combatant forces involved. However, realistic models of crowds are essentially absent from current military simulations. To address this problem, the authors are developing a crowd simulation capable of generating crowds of noncombatant civilians that exhibit a variety of realistic individual and group behaviors at differing levels of fidelity. The crowd simulation is interoperable with existing military simulations using a standard, distributed simulation architecture. Commercial game technology is used in the crowd simulation to model both urban terrain and the physical behaviors of the human characters that make up the crowd. The objective of this article is to present the design and development process of a simulation that integrates commercially available game technology with current military simulations to generate realistic and believable crowd behavior.
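
As a toy illustration of the kind of individual and group movement such a simulation must generate (a crude social-forces-style update, not the article's game-engine crowd model; all constants are assumed):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy social-forces-style crowd step (illustrative): agents are attracted
# to a shared goal and repelled when their personal space is violated.
n = 50
pos = rng.uniform(0.0, 20.0, (n, 2))        # initial positions, metres
goal = np.array([50.0, 10.0])               # shared goal point
dt, v_max, r_personal = 0.1, 1.5, 1.0

for _ in range(100):
    # Attraction: unit vector toward the goal, at the preferred speed.
    to_goal = goal - pos
    vel = v_max * to_goal / np.linalg.norm(to_goal, axis=1, keepdims=True)
    # Repulsion: inverse-square push from neighbours inside personal space.
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=2) + np.eye(n)   # avoid self-division
    push = diff / dist[:, :, None] ** 3
    push[dist > r_personal] = 0.0
    vel += 2.0 * push.sum(axis=1)
    vel = np.clip(vel, -3.0, 3.0)           # crude speed limit
    pos += dt * vel

print("mean distance to goal after 10 s: %.1f m"
      % np.mean(np.linalg.norm(goal - pos, axis=1)))
```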

Relevance: 60.00%

Abstract:

Crowds of non-combatants play a large and increasingly recognized role in modern military operations, and often create substantial difficulties for the combatant forces involved. However, realistic models of crowds are essentially absent from current military simulations. To address this problem, we are developing a crowd simulation capable of generating crowds of non-combatant civilians that exhibit a variety of realistic individual and group behaviours at differing levels of fidelity. The crowd simulation is interoperable with existing military simulations using a standard distributed simulation architecture. Commercial game technology is utilized in the crowd simulation to model both urban terrain and the physical behaviours of the human characters that make up the crowd. The objective of this paper is to present the process involved in the design and development of a simulation that integrates commercially available game technology with current military simulations to generate realistic and believable crowd behaviour.

Relevance: 60.00%

Abstract:

Introduction: A novel realistic 3D virtual reality (VR) application has been developed to allow medical imaging students at Queensland University of Technology to practise radiographic techniques independently, outside the usual radiography laboratory.

Methods: A flexible agile development methodology was used to create the software rapidly and effectively. A 3D gaming environment and realistic models were used to engender presence in the software, while tutor-determined gold standards enabled students to compare their performance and learn in a problem-based learning pedagogy.

Results: Students reported high levels of satisfaction and perceived value, and the software enabled up to 40 concurrent users to prepare for clinical practice. Student feedback also indicated that they found 3D to be of limited value in the desktop version compared with the usual 2D approach. A randomised comparison between groups receiving software-based and traditional practice measured performance in a formative role play with real equipment. The results indicated superior performance with the equipment for the VR-trained students (P = 0.0366) and confirmed the value of VR for enhancing 3D equipment-based problem-solving skills.

Conclusions: Students practising projection techniques virtually performed better in role-play assessments than students practising in a traditional radiography laboratory only. The application particularly helped with 3D equipment configuration, suggesting that teaching 3D problem solving is an ideal use of such medical equipment simulators. Ongoing development work aims to establish the role of VR software in preparing students for clinical practice with a range of medical imaging equipment.

Relevance: 60.00%

Abstract:

Many species inhabit fragmented landscapes, resulting either from anthropogenic or from natural processes. The ecological and evolutionary dynamics of spatially structured populations are affected by a complex interplay between endogenous and exogenous factors. The metapopulation approach, simplifying the landscape to a discrete set of patches of breeding habitat surrounded by unsuitable matrix, has become a widely applied paradigm for the study of species inhabiting highly fragmented landscapes. In this thesis, I focus on the construction of biologically realistic models and their parameterization with empirical data, with the general objective of understanding how the interactions between individuals and their spatially structured environment affect ecological and evolutionary processes in fragmented landscapes. I study two hierarchically structured model systems: the Glanville fritillary butterfly in the Åland Islands, and a system of two interacting aphid species in the Tvärminne archipelago, both located in south-western Finland. The interesting and challenging feature of both study systems is that the population dynamics occur over multiple spatial scales that are linked by various processes. My main emphasis is on the development of mathematical and statistical methodologies.

For the Glanville fritillary case study, I first build a Bayesian framework for the estimation of death rates and capture probabilities from mark-recapture data, with the novelty of accounting for variation among individuals in capture probabilities and survival. I then characterize the dispersal phase of the butterflies by deriving a mathematical approximation of a diffusion-based movement model applied to a network of patches. I use the movement model as a building block to construct an individual-based evolutionary model for the Glanville fritillary butterfly metapopulation. I parameterize the evolutionary model using a pattern-oriented approach, and use it to study how the landscape structure affects the evolution of dispersal. For the aphid case study, I develop a Bayesian model of hierarchical multi-scale metapopulation dynamics, where the observed extinction and colonization rates are decomposed into intrinsic rates operating specifically at each spatial scale.

In summary, I show how analytical approaches, hierarchical Bayesian methods and individual-based simulations can be used individually or in combination to tackle complex problems from many different viewpoints. In particular, hierarchical Bayesian methods provide a useful tool for decomposing ecological complexity into more tractable components.
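
A minimal stochastic patch-occupancy sketch in the metapopulation spirit described above (per-patch extinction plus occupancy-dependent colonization; the rates are illustrative assumptions, not values estimated for either study system):

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal stochastic patch-occupancy model (SPOM): a discrete set of habitat
# patches that go extinct independently and are recolonized at a rate that
# increases with the current fraction of occupied patches. Rates are
# illustrative, not the values estimated for the Glanville fritillary system.

n_patches, years = 100, 200
ext, col = 0.2, 0.4               # per-patch extinction / colonization scale
occupied = rng.random(n_patches) < 0.5

trajectory = []
for _ in range(years):
    p_col = col * occupied.mean()              # mass-action colonization
    new = occupied.copy()
    new[occupied] = rng.random(occupied.sum()) > ext     # survive extinction?
    empty = ~occupied
    new[empty] = rng.random(empty.sum()) < p_col         # get colonized?
    occupied = new
    trajectory.append(occupied.mean())

print("mean occupancy over the last 50 years: %.2f"
      % np.mean(trajectory[-50:]))
```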

Relevance: 60.00%

Abstract:

This thesis deals with theoretical modeling of the electrodynamics of auroral ionospheres. In the five research articles forming the main part of the thesis, we have concentrated on two main themes: the development of new data-analysis techniques and the study of inductive phenomena in ionospheric electrodynamics. The introductory part of the thesis provides a background for these new results and places them in the wider context of ionospheric research.

In this thesis we have developed a new tool (called 1D SECS) for analysing ground-based magnetic measurements from a 1-dimensional magnetometer chain (usually aligned in the north-south direction) and a new method for obtaining the ionospheric electric field from combined ground-based magnetic measurements and estimated ionospheric electric conductance. Both methods are based on earlier work, but contain important new features: 1D SECS respects the spherical geometry of large-scale ionospheric electrojet systems, and thanks to an innovative way of implementing boundary conditions, the new method for obtaining electric fields can also be applied in local-scale studies. These new calculation methods have been tested using both simulated and real data. The tests indicate that the new methods are more reliable than the previous techniques.

Inductive phenomena are intimately related to temporal changes in electric currents. As the large-scale ionospheric current systems change relatively slowly, on time scales of several minutes or hours, inductive effects are usually assumed to be negligible. However, during the past ten years it has been realised that induction can play an important part in some ionospheric phenomena. In this thesis we have studied the role of inductive electric fields and currents in ionospheric electrodynamics. We have formulated the induction problem so that only ionospheric electric parameters are used in the calculations. This is in contrast to previous studies, which require knowledge of magnetosphere-ionosphere coupling. We have applied our technique to several realistic models of typical auroral phenomena. The results indicate that inductive electric fields and currents are locally important during the most dynamic phenomena (such as the westward travelling surge, WTS). In these situations induction may locally contribute up to 20-30% of the total ionospheric electric field and currents. Inductive phenomena also change the field-aligned currents flowing between the ionosphere and the magnetosphere, thus modifying the coupling between the two regions.
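
As a cartoon of the equivalent-current idea behind such magnetometer-chain analysis (flat-Earth line currents fitted by least squares; this is not the spherical 1D SECS basis, and all geometry and amplitudes are assumed):

```python
import numpy as np

MU0 = 4e-7 * np.pi

# Schematic equivalent-current fit: represent the ionospheric current as a
# set of east-west line currents at 100 km altitude and fit their amplitudes
# to the horizontal magnetic disturbance seen by a north-south magnetometer
# chain. Flat-Earth, no induced ground currents; a cartoon of the idea only.

h = 100e3                                     # current altitude, m
mag_x = np.linspace(-500e3, 500e3, 15)        # magnetometer positions, m
cur_x = np.linspace(-600e3, 600e3, 9)         # line-current positions, m

def bx_per_amp(xm, xc):
    """Horizontal ground field of a unit line current at (xc, h)."""
    return MU0 * h / (2 * np.pi * ((xm - xc) ** 2 + h ** 2))

# Geometry matrix: field at each station per unit current in each element.
G = bx_per_amp(mag_x[:, None], cur_x[None, :])

# Synthetic "observations" from a known electrojet, plus ~1 nT noise.
true_I = 1e5 * np.exp(-(cur_x / 200e3) ** 2)        # amps, Gaussian jet
rng = np.random.default_rng(4)
obs = G @ true_I + rng.normal(0, 1e-9, mag_x.size)

# Least-squares estimate of the current amplitudes from the chain data.
I_fit, *_ = np.linalg.lstsq(G, obs, rcond=None)

print("peak current, true vs fitted: %.0f vs %.0f kA"
      % (true_I.max() / 1e3, I_fit.max() / 1e3))
```

In practice such inversions are regularized; the bare least-squares version above is only meant to show the geometry-matrix structure of the fit.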

Relevance: 60.00%

Abstract:

The CCEM method (Contact Criteria and Energy Minimisation) has been developed and applied to study protein-carbohydrate interactions. The method uses available X-ray data, even on the native protein at low resolution (above 2.4 Å), to generate realistic models of a variety of proteins with various ligands. The two examples discussed in this paper are arabinose-binding protein (ABP) and pea lectin. The X-ray crystal structure data reported for the ABP-β-L-arabinose complex at 2.8, 2.4 and 1.7 Å resolution differ drastically in the nature of the interactions they predict between the protein and the ligand. It is shown that, using the data at 2.4 Å resolution, the CCEM method generates complexes which are as good as those from the higher-resolution (1.7 Å) data. The CCEM method predicts some of the important hydrogen bonds between the ligand and the protein which are missing in the interpretation of the X-ray data at 2.4 Å resolution. The theoretically predicted hydrogen bonds are in good agreement with those reported at 1.7 Å resolution. Pea lectin has been solved only in the native form, at 3 Å resolution. Application of the CCEM method also enables us to generate complexes of pea lectin with methyl-α-D-glucopyranoside and methyl-2,3-dimethyl-α-D-glucopyranoside which explain well the available experimental data in solution.
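
A minimal distance-based contact criterion of the kind such methods start from (a donor-acceptor cutoff only; real hydrogen-bond criteria also use angles, and CCEM combines contact criteria with energy minimisation; the coordinates and the 3.5 Å cutoff are illustrative):

```python
import numpy as np

# Minimal distance-based contact check for candidate hydrogen bonds between
# ligand and protein polar atoms. Illustrative only: real criteria also use
# donor-H-acceptor angles, and CCEM follows contact checks with energy
# minimisation.

HBOND_MAX = 3.5   # donor-acceptor distance cutoff in angstroms (typical)

# (atom label, x, y, z); made-up coordinates for illustration only.
ligand_polar = [("O2", 1.20, 0.40, 2.10), ("O3", 2.95, 1.10, 0.60)]
protein_polar = [("ASN232:ND2", 1.90, 0.85, 4.95),
                 ("GLU11:OE1", 3.40, 1.25, 3.30),
                 ("LYS10:NZ", 0.55, 0.10, -1.30)]

for lig_name, *lig_xyz in ligand_polar:
    for prot_name, *prot_xyz in protein_polar:
        d = np.linalg.norm(np.array(lig_xyz) - np.array(prot_xyz))
        if d <= HBOND_MAX:
            print("candidate H-bond: %s ... %s  (%.2f A)"
                  % (lig_name, prot_name, d))
```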

Relevance: 60.00%

Abstract:

We present the details of a formalism for calculating spatially varying zero-frequency response functions and equal-time correlation functions in models of magnetic and mixed-valence impurities in metals. The method is based on a combination of perturbative, thermodynamic scaling theory [H. R. Krishnamurthy and C. Jayaprakash, Phys. Rev. B 30, 2806 (1984)] and a nonperturbative technique such as the Wilson renormalization group. We illustrate the formalism for the spin-1/2 Kondo problem and present results for the conduction-spin-density-impurity-spin correlation function and the conduction-electron charge density near the impurity. We also discuss qualitative features that emerge from our calculations and how they can be carried over to the case of realistic models for transition-metal impurities.
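
For orientation, the perturbative scaling ingredient can be illustrated with the standard one-loop "poor man's scaling" flow of the dimensionless Kondo coupling (a textbook equation, not the paper's combined scaling-plus-Wilson-RG formalism; the initial coupling and bandwidth are assumed values):

```python
import numpy as np

# One-loop "poor man's scaling" flow of the dimensionless Kondo coupling
# g = rho * J:  dg / d ln D = -2 g^2. The coupling grows as the bandwidth D
# is reduced, and reaches O(1) near the Kondo scale D0 * exp(-1/(2 g0)).

g, D = 0.05, 1.0                 # initial coupling and bandwidth (assumed)
d_lnD = -1e-4                    # reduce the cutoff logarithmically

while g < 1.0 and D > 1e-30:
    g -= 2 * g * g * d_lnD       # dg = -2 g^2 d(ln D); d(ln D) < 0, so g grows
    D *= np.exp(d_lnD)

print("coupling reaches O(1) at D = %.3e" % D)
print("analytic Kondo scale D0*exp(-1/(2*g0)) = %.3e" % np.exp(-1.0 / (2 * 0.05)))
```

The numerical cutoff at which g becomes O(1) agrees with the analytic Kondo scale to within the crudeness of the one-loop truncation, which is the point where nonperturbative methods such as the Wilson renormalization group must take over.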