34 results for semi-physical simulation


Relevance: 30.00%

Abstract:

In this contribution we aim to anchor Agent-Based Modeling (ABM) simulations in actual models of human psychology. More specifically, we apply unidirectional ABM to social psychological models using low-level (i.e., intra-individual) agents to examine whether they generate better predictions than standard statistical approaches concerning the intention to perform a behavior and the behavior itself. Moreover, this contribution tests to what extent the predictive validity of models of attitude, such as the Theory of Planned Behavior (TPB) or the Model of Goal-directed Behavior (MGB), depends on the assumption that people's decisions and actions are purely rational. Simulations were therefore run with agents deviating from rationality to different degrees, using a trembling-hand method. Two data sets, concerning the consumption of soft drinks and physical activity respectively, were used. Three key findings emerged from the simulations. First, compared to the standard statistical approach, the agent-based simulation generally improves the prediction of behavior from intention. Second, the improvement in prediction is inversely proportional to the complexity of the underlying theoretical model. Finally, introducing varying degrees of deviation from rationality in agents' behavior can improve the goodness of fit of the simulations. By demonstrating the potential of ABM as a complementary perspective for evaluating social psychological models, this contribution underlines the necessity of better defining agents in terms of psychological processes before examining higher levels such as the interactions between individuals.
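
A minimal sketch of the trembling-hand mechanism described above, assuming a simplified TPB-style agent whose intention is a weighted sum of attitude, subjective norm and perceived behavioural control (the weights, threshold and trembling probability `epsilon` are illustrative, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(42)

def tpb_intention(attitude, norm, control, w=(0.5, 0.3, 0.2)):
    """Illustrative TPB-style intention: weighted sum of its three antecedents (all in [0, 1])."""
    return w[0] * attitude + w[1] * norm + w[2] * control

def act(intention, epsilon=0.1, threshold=0.5):
    """Trembling-hand decision: with probability epsilon the agent deviates
    from the 'rational' choice implied by its intention."""
    rational_choice = intention >= threshold
    if rng.random() < epsilon:
        return not rational_choice  # the hand trembles
    return bool(rational_choice)

# One simulated decision step for a population of 1000 intra-individual agents
antecedents = rng.uniform(size=(1000, 3))  # attitude, subjective norm, perceived control
intentions = tpb_intention(antecedents[:, 0], antecedents[:, 1], antecedents[:, 2])
behaviours = np.array([act(i) for i in intentions])
print(f"share performing the behaviour: {behaviours.mean():.2f}")
```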

Relevance: 30.00%

Abstract:

MD simulation studies showing the influence of porosity and carbon surface oxidation on phenol adsorption from aqueous solutions on carbons are reported. Based on a realistic model of activated carbon, three carbon structures with gradually changing microporosity were created, and different numbers of surface oxygen groups were then introduced. Pores with diameters around 0.6 nm are optimal for phenol adsorption, and after the introduction of surface oxygen functionalities phenol adsorption decreases (in accordance with experimental data) for all studied models. This decrease is caused by a pore-blocking effect due to the saturation of surface oxygen groups by strongly hydrogen-bonded water molecules.

Relevance: 30.00%

Abstract:

There is currently increased interest from Government and Industry in the UK, as well as at the European Community level and among international agencies (e.g. the US Department of Energy and the International Energy Agency), in improving the performance and uptake of Ground Coupled Heat Pumps (GCHPs) in order to meet the 2020 renewable energy target. A sound knowledge base is required to inform Government agencies and advisory bodies; detailed site studies providing reliable data for model verification have an important role to play in this. In this study we summarise the effect of heat extraction by a horizontal ground heat exchanger (installed at 1 m depth) on the soil physical environment (between 0 and 1 m depth) for a site in the south of the UK. Our results show that the slinky heat exchanger significantly decreases the temperature of the surrounding soil. Furthermore, soil moisture contents were lower for the GCHP soil profile, most likely due to temperature-gradient-related soil moisture migration and a decreased hydraulic conductivity, the latter resulting from the increased viscosity caused by the lower temperatures of the GCHP soil profile. These effects also caused considerable differences in soil thermal properties. This is the first detailed mechanistic study conducted in the UK with the aim of understanding the interactions between the soil, horizontal heat exchangers and the aboveground environment. An improved understanding of these interactions will help to achieve optimal and sustainable use of soil heat resources in the future. The results of this study will help to calibrate and verify a simulation model that will provide UK-wide recommendations to improve future GCHP uptake and performance, while safeguarding soil physical resources.

Relevance: 30.00%

Abstract:

Integrated simulation models can be useful tools in farming systems research. This chapter reviews three commonly used approaches, namely linear programming, system dynamics and agent-based models. Applications of each approach are presented and their strengths and drawbacks discussed. We argue that, despite some challenges, mainly related to the integration of different approaches, model validation and the representation of human agents, integrated simulation models contribute important insights to the analysis of farming systems. They help to unravel the complex and dynamic interactions and feedbacks among bio-physical, socio-economic and institutional components across scales and levels in farming systems. In addition, they can provide a platform for integrative research, and can support transdisciplinary research by functioning as learning platforms in participatory processes.
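
As a flavour of the linear-programming strand of that family, below is a minimal, hypothetical farm-planning LP (crop names, gross margins and resource limits are invented for illustration) solved with `scipy.optimize.linprog`:

```python
from scipy.optimize import linprog

# Maximise gross margin = 300*wheat_ha + 450*potato_ha (linprog minimises, so negate)
c = [-300.0, -450.0]

# Resource constraints: land (<= 100 ha), labour (<= 3000 h), potatoes at most 40 ha
A_ub = [[1.0, 1.0],    # land used per hectare of each crop
        [20.0, 60.0],  # labour hours per hectare
        [0.0, 1.0]]    # rotation limit on potatoes
b_ub = [100.0, 3000.0, 40.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
wheat_ha, potato_ha = res.x
print(f"wheat: {wheat_ha:.1f} ha, potatoes: {potato_ha:.1f} ha, gross margin: {-res.fun:.0f}")
```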

Relevance: 30.00%

Abstract:

Global flood hazard maps can be used in the assessment of flood risk in a number of different applications, including (re)insurance and large-scale flood preparedness. Such global hazard maps can be generated using large-scale, physically based models of rainfall-runoff and river routing, in conjunction with a number of post-processing methods. In this study, the European Centre for Medium-Range Weather Forecasts (ECMWF) land surface model is driven with ERA-Interim reanalysis meteorological forcing data, and the resulting runoff is passed to a river routing algorithm which simulates floodplains and flood flow across the global land area. The global hazard map is based on a 30-yr (1979–2010) simulation period. A Gumbel distribution is fitted to the annual maximum flows to derive a number of flood return periods. The return periods are calculated initially on a 25×25 km grid, which is then reprojected onto a 1×1 km grid to derive higher-resolution maps and to estimate the flooded fraction of each 25×25 km cell. Several global and regional maps of flood return periods ranging from 2 to 500 yr are presented. The results compare reasonably well with a benchmark data set of global flood hazard. The developed methodology can be applied to other datasets on a global or regional scale.
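
A minimal sketch of the return-period step described above, fitting a Gumbel distribution to annual maximum flows and reading off return levels for one grid cell (the discharge values are made up; `scipy.stats.gumbel_r` is one common choice for the fit):

```python
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical annual maximum discharge for one grid cell (m^3/s), one value per year
annual_maxima = np.array([850., 920., 1100., 780., 1320., 990., 1210.,
                          870., 1050., 1400., 930., 1010., 1180., 890.,
                          1260., 960., 1120., 1340., 820., 1080.])

loc, scale = gumbel_r.fit(annual_maxima)

# Return level Q_T: the flow exceeded on average once every T years,
# i.e. the (1 - 1/T) quantile of the fitted annual-maximum distribution
for T in (2, 5, 10, 50, 100, 500):
    q_t = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T:4d}-yr return level: {q_t:7.0f} m^3/s")
```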

Relevance: 30.00%

Abstract:

As a major mode of intraseasonal variability, which interacts with weather and climate systems on a near-global scale, the Madden–Julian Oscillation (MJO) is a crucial source of predictability for numerical weather prediction (NWP) models. Despite its global significance and comprehensive investigation, improvements in the representation of the MJO in an NWP context remain elusive. However, recent modifications to the model physics in the ECMWF model led to advances in the representation of atmospheric variability and the unprecedented propagation of the MJO signal through the entire integration period. In light of these recent advances, a set of hindcast experiments has been designed to assess the sensitivity of the MJO simulation to the formulation of convection. Through the application of established MJO diagnostics, it is shown that the improvements in the representation of the MJO can be directly attributed to the modified convective parametrization. Furthermore, the improvements are attributed to the move from a moisture-convergence-dependent to a relative-humidity-dependent formulation for organized deep entrainment. It is concluded that, in order to understand the physical mechanisms through which a relative-humidity-dependent formulation for entrainment led to an improved simulation of the MJO, a more process-based approach should be taken. The application of process-based diagnostics to the hindcast experiments presented here will be the focus of Part II of this study.
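
Purely as an illustration of what a relative-humidity-dependent organized entrainment rate can look like, here is a schematic form in which dilution increases as the environment dries; the functional form and constants are illustrative and are not taken from the ECMWF model:

```python
def organized_entrainment(rh, eps0=1.8e-3, sensitivity=1.0):
    """Schematic RH-dependent organized entrainment rate (m^-1).

    The rate grows as the environment dries (rh is fractional relative humidity,
    0-1), so plumes rising through dry air are diluted more strongly and
    terminate lower. eps0 and sensitivity are illustrative constants.
    """
    rh = min(max(rh, 0.0), 1.0)
    return eps0 * (1.0 + sensitivity * (1.0 - rh))

for rh in (0.9, 0.7, 0.5, 0.3):
    print(f"RH = {rh:.1f}  ->  entrainment = {organized_entrainment(rh):.2e} m^-1")
```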

Relevance: 30.00%

Abstract:

In Part I of this study it was shown that moving from a moisture-convergence-dependent to a relative-humidity-dependent organized entrainment rate in the formulation for deep convection was responsible for significant advances in the simulation of the Madden–Julian Oscillation (MJO) in the ECMWF model. However, the application of traditional MJO diagnostics was not adequate to understand why changing the control on convection had such a pronounced impact on the representation of the MJO. In this study a set of process-based diagnostics is applied to the hindcast experiments described in Part I to identify the physical mechanisms responsible for the advances in MJO simulation. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid-troposphere. Owing to the modified precipitation–moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this study suggest that a tropospheric moisture control on convection is key to simulating the interaction between convective heating and the large-scale wave forcing associated with the MJO.

Relevance: 30.00%

Abstract:

During long-range transport, many distinct processes, including photochemistry, deposition, emissions and mixing, contribute to the transformation of air mass composition. Partitioning the effects of different processes can be useful when considering the sensitivity of chemical transformation to, for example, a changing environment or anthropogenic influence. However, transformation is not observed directly, since mixing ratios are measured, and models must be used to relate changes to processes. Here, four cases from the ITCT-Lagrangian 2004 experiment are studied. In each case, aircraft intercepted a distinct air mass several times during transport over the North Atlantic, providing a unique dataset quantifying the net changes in composition from all processes. A new framework is presented to deconstruct the change in O3 mixing ratio (ΔO3) into its component processes, which were not measured directly, taking into account the uncertainty in measurements, initial air mass variability and its time evolution. The results show that the net chemical processing (ΔO3chem) over the whole simulation is greater than the net physical processing (ΔO3phys) in all cases. This is in part explained by cancellation effects associated with mixing. In contrast, each case is in a regime of either net photochemical destruction (lower-tropospheric transport) or production (an upper-tropospheric biomass burning case). However, physical processes influence O3 indirectly through addition or removal of precursor gases, so that changes to physical parameters in a model can have a larger effect on ΔO3chem than on ΔO3phys. Despite its smaller magnitude, the physical processing distinguishes the lower-tropospheric export cases, since the net photochemical O3 change is −5 ppbv per day in all three cases. Processing is quantified using a Lagrangian photochemical model with a novel method for simulating mixing through an ensemble of trajectories and a background profile that evolves with them. The model is able to simulate the magnitude and variability of the observations (of O3, CO, NOy and some hydrocarbons) and is consistent with the time-averaged OH following air masses inferred from hydrocarbon measurements alone (Arnold et al., 2007). It is therefore a useful new method to simulate air mass evolution and variability, and their sensitivity to process parameters.
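
The deconstruction can be viewed as a Lagrangian O3 budget following the air mass, split into chemical and physical contributions; the grouping of terms below is schematic and does not reproduce the study's exact notation:

```latex
\Delta\mathrm{O_3}
  = \underbrace{\int_{t_0}^{t_1} \bigl(P_{\mathrm{O_3}} - L_{\mathrm{O_3}}\bigr)\,\mathrm{d}t}_{\Delta\mathrm{O_3^{chem}}}
  \; + \;
  \underbrace{\int_{t_0}^{t_1} \bigl(M_{\mathrm{O_3}} + D_{\mathrm{O_3}}\bigr)\,\mathrm{d}t}_{\Delta\mathrm{O_3^{phys}}}
% P: photochemical production, L: photochemical loss,
% M: change due to mixing with the evolving background, D: deposition,
% all evaluated along the trajectory between interceptions at t_0 and t_1.
```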

Relevance: 30.00%

Abstract:

Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
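
A minimal sketch of the semi-automatic construction for a toy model (i.i.d. normal data with unknown mean), using a pilot stage of simulations and a linear regression of the parameter on the data to estimate the posterior mean, which is then used as the summary statistic in rejection ABC; the sample sizes, prior and acceptance quantile are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs = 50

def simulate(theta, n=n_obs):
    """Toy model: n i.i.d. N(theta, 1) observations."""
    return rng.normal(theta, 1.0, size=n)

observed = simulate(2.0)

# Stage 1 (pilot): simulate (theta, data) pairs and regress theta on the data.
# The fitted values approximate E[theta | data], the theoretically optimal summary.
theta_pilot = rng.uniform(-5, 5, size=2000)
X_pilot = np.array([simulate(t) for t in theta_pilot])
X_design = np.column_stack([np.ones(len(theta_pilot)), X_pilot])
beta, *_ = np.linalg.lstsq(X_design, theta_pilot, rcond=None)

def summary(data):
    """Learned summary statistic: the fitted linear predictor of theta."""
    return np.concatenate([[1.0], data]) @ beta

# Stage 2: standard rejection ABC using the learned summary statistic.
theta_prior = rng.uniform(-5, 5, size=20000)
distances = np.array([abs(summary(simulate(t)) - summary(observed)) for t in theta_prior])
accepted = theta_prior[distances < np.quantile(distances, 0.01)]
print(f"ABC posterior mean ~ {accepted.mean():.2f} (true theta = 2.0)")
```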

Relevance: 30.00%

Abstract:

We have incorporated a semi-mechanistic isoprene emission module into the JULES land-surface scheme, as a first step towards a modelling tool that can be applied in studies of vegetation-atmospheric chemistry interactions, including chemistry-climate feedbacks. Here, we evaluate the coupled model against local above-canopy isoprene emission flux measurements from six flux tower sites, as well as satellite-derived estimates of isoprene emission over tropical South America and east and south Asia. The model simulates diurnal variability well: correlation coefficients are significant (at the 95% level) for all flux tower sites. The model reproduces day-to-day variability with significant correlations (at the 95% confidence level) at four of the six flux tower sites. At the UMBS site, a complete set of seasonal observations is available for two years (2000 and 2002); the model reproduces the seasonal pattern of emission during 2002, but less well in 2000. The model overestimates observed emissions at all sites, partly because it does not include isoprene loss through the canopy. Comparison with the satellite-derived isoprene emission estimates suggests that the model captures the main spatial patterns and the seasonal and inter-annual variability over tropical regions. The model yields a global annual isoprene emission of 535 ± 9 TgC yr−1 during the 1990s, 78% of which originates from forested areas.

Relevance: 30.00%

Abstract:

The fully compressible semi-geostrophic system is widely used in the modelling of large-scale atmospheric flows. In this paper, we prove rigorously the existence of weak Lagrangian solutions of this system, formulated in the original physical coordinates. In addition, we provide an alternative proof of the earlier result on the existence of weak solutions of this system expressed in the so-called geostrophic, or dual, coordinates. The proofs are based on the optimal transport formulation of the problem and on recent general results concerning transport problems posed in the Wasserstein space of probability measures.
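
For orientation, a standard statement of the incompressible Boussinesq semi-geostrophic system and of the geostrophic (dual) change of coordinates that underlies the optimal-transport formulation is sketched below; the paper itself treats the fully compressible system in the original physical coordinates, which is not reproduced here.

```latex
% Incompressible Boussinesq semi-geostrophic equations on an f-plane
% (the material derivative D/Dt follows the full velocity u):
\frac{D u_g}{D t} - f\,(v - v_g) = 0, \qquad
\frac{D v_g}{D t} + f\,(u - u_g) = 0, \qquad
\frac{D \theta}{D t} = 0, \qquad \nabla\cdot\mathbf{u} = 0,
% with geostrophic and hydrostatic balance
f u_g = -\frac{\partial \phi}{\partial y}, \qquad
f v_g = \frac{\partial \phi}{\partial x}, \qquad
\frac{\partial \phi}{\partial z} = \frac{g\,\theta}{\theta_0}.
% Geostrophic ("dual") coordinates, in which the evolution becomes an
% optimal-transport problem for the map between physical and dual space:
\mathbf{X} = \left( x + \frac{v_g}{f},\; y - \frac{u_g}{f},\; \frac{g\,\theta}{f^{2}\,\theta_0} \right).
```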

Relevance: 30.00%

Abstract:

We investigate the initialisation of Northern Hemisphere sea ice in the global climate model ECHAM5/MPI-OM by assimilating sea-ice concentration data. The analysis updates for concentration are given by Newtonian relaxation, and we discuss different ways of specifying the analysis updates for mean thickness. Because the conservation of mean ice thickness or actual ice thickness in the analysis updates leads to poor assimilation performance, we introduce a proportional dependence between concentration and mean thickness analysis updates. Assimilation with these proportional mean-thickness analysis updates leads to good assimilation performance for sea-ice concentration and thickness, both in identical-twin experiments and when assimilating sea-ice observations. The simulation of other Arctic surface fields in the coupled model is, however, not significantly improved by the assimilation. To understand the physical aspects of assimilation errors, we construct a simple prognostic model of the sea-ice thermodynamics, and analyse its response to the assimilation. We find that an adjustment of mean ice thickness in the analysis update is essential to arrive at plausible state estimates. To understand the statistical aspects of assimilation errors, we study the model background error covariance between ice concentration and ice thickness. We find that the spatial structure of covariances is best represented by the proportional mean-thickness analysis updates. Both physical and statistical evidence supports the experimental finding that assimilation with proportional mean-thickness updates outperforms the other two methods considered. The method described here is very simple to implement, and gives results that are sufficiently good to be used for initialising sea ice in a global climate model for seasonal to decadal predictions.
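
A minimal sketch of the analysis update described above for a single grid cell, assuming Newtonian relaxation (nudging) of concentration towards the observation and a mean-thickness increment proportional to the concentration increment; the relaxation weight and the proportionality constant are illustrative and are not the values used in the study:

```python
def assimilate_cell(c_model, h_mean_model, c_obs, gamma=0.1, k_prop=2.0):
    """One analysis update for a single grid cell.

    c_model:      model sea-ice concentration (0-1)
    h_mean_model: model mean ice thickness (grid-cell average, m)
    c_obs:        observed sea-ice concentration
    gamma:        Newtonian-relaxation weight (illustrative)
    k_prop:       proportionality constant linking the thickness increment to the
                  concentration increment (m per unit concentration; illustrative)
    """
    # Newtonian relaxation of concentration towards the observation
    dc = gamma * (c_obs - c_model)
    c_analysis = c_model + dc

    # Proportional mean-thickness update: the thickness increment scales with dc
    h_mean_analysis = max(h_mean_model + k_prop * dc, 0.0)
    return c_analysis, h_mean_analysis

print(assimilate_cell(c_model=0.8, h_mean_model=1.6, c_obs=0.6))
```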

Relevance: 30.00%

Abstract:

A computer simulation method has been used to study the three-dimensional structural formation and transition of electromagnetorheological (EMR) suspensions under compatible electric and magnetic fields. When the fields are applied simultaneously and perpendicularly to each other, the particles rapidly arrange into single-layer structures parallel to both fields; within each layer the particles form a two-dimensional hexagonal lattice. The single layers then combine to form thicker sheet-like structures. With the help of thermal fluctuations, the thicker structures relax into three-dimensional close-packed structures, which may be face-centered cubic (fcc) or hexagonal close-packed (hcp) lattices or, more probably, a mixture of the two, depending on the initial configurations and the thermal fluctuations. On the other hand, if the electric field is applied first to induce body-centered tetragonal (bct) columns in the system, and the magnetic field is then applied in the perpendicular direction, a bct-to-fcc structure transition is observed in a very short time. Following that, the structure keeps evolving due to the demagnetization effect and finally forms close-packed structures with fcc and hcp lattice character. The simulation results are in agreement with theoretical and experimental results.

Relevance: 30.00%

Abstract:

A lattice Boltzmann model able to simulate viscous fluid systems with elastic and movable boundaries is proposed. By introducing a virtual distribution function at the boundary, Galilean invariance is recovered for the full system. As an example of application, the flow in elastic vessels is simulated with a pressure-radius relationship similar to that of the pulmonary blood vessels. The numerical results for steady flow are in good agreement with analytical predictions, while the simulation results for pulsatile flow agree qualitatively with experimental observations of aortic flows. The approach has potential application in the study of complex fluid systems such as suspensions and arterial blood flow.
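
For context, a minimal sketch of the standard D2Q9 BGK lattice Boltzmann core that such a model builds on (collision plus streaming on a periodic domain); the virtual-distribution-function treatment of elastic, movable boundaries proposed in the paper is not shown:

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """BGK equilibrium distribution for each of the 9 lattice directions."""
    cu = np.einsum('qd,xyd->xyq', C, u)                 # c_q . u
    usq = np.einsum('xyd,xyd->xy', u, u)[..., None]
    return rho[..., None] * W * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def lbm_step(f, tau=0.6):
    """One collision + streaming step of the D2Q9 BGK scheme (periodic domain)."""
    rho = f.sum(axis=-1)
    u = np.einsum('xyq,qd->xyd', f, C) / rho[..., None]
    f_post = f - (f - equilibrium(rho, u)) / tau        # BGK collision
    for q, (cx, cy) in enumerate(C):                    # streaming along c_q
        f_post[..., q] = np.roll(f_post[..., q], shift=(cx, cy), axis=(0, 1))
    return f_post

# Start from rest (uniform density) and take a few steps
nx, ny = 64, 32
f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny, 2)))
for _ in range(10):
    f = lbm_step(f)
print("mass conserved:", np.isclose(f.sum(), nx * ny))
```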