883 results for Simulation Based Method


Relevance: 40.00%

Abstract:

Many well-established statistical methods in genetics were developed in a climate of severe constraints on computational power. Recent advances in simulation methodology now bring modern, flexible statistical methods within the reach of scientists having access to a desktop workstation. We illustrate the potential advantages now available by considering the problem of assessing departures from Hardy-Weinberg (HW) equilibrium. Several hypothesis tests of HW have been established, as well as a variety of point estimation methods for the parameter which measures departures from HW under the inbreeding model. We propose a computational, Bayesian method for assessing departures from HW, which has a number of important advantages over existing approaches. The method incorporates the effects of uncertainty about the nuisance parameters (the allele frequencies) as well as the boundary constraints on f (which are functions of the nuisance parameters). Results are naturally presented visually, exploiting the graphics capabilities of modern computer environments to allow straightforward interpretation. Perhaps most importantly, the method is founded on a flexible, likelihood-based modelling framework, which can incorporate the inbreeding model if appropriate, but also allows the assumptions of the model to be investigated and, if necessary, relaxed. Under appropriate conditions, information can be shared across loci and, possibly, across populations, leading to more precise estimation. The advantages of the method are illustrated by application both to simulated data and to data analysed by alternative methods in the recent literature.
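A minimal sketch of the kind of computation this abstract describes: a grid-approximated posterior for the inbreeding coefficient f under the inbreeding model, marginalizing over the nuisance allele frequency and enforcing the boundary constraint on f. The function name, uniform prior, and grid resolution are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def hw_posterior_f(n_AA, n_Aa, n_aa, grid=200):
    """Grid-approximate posterior of the inbreeding coefficient f,
    with a uniform prior over the admissible (p, f) region."""
    p = np.linspace(0.001, 0.999, grid)   # allele frequency (nuisance)
    f = np.linspace(-0.999, 0.999, grid)  # departure parameter
    P, F = np.meshgrid(p, f)
    Q = 1.0 - P
    # boundary constraint: genotype probabilities must stay non-negative
    valid = F >= np.maximum(-P / Q, -Q / P)
    pAA = P**2 + F * P * Q
    pAa = 2 * P * Q * (1 - F)
    paa = Q**2 + F * P * Q
    with np.errstate(divide="ignore", invalid="ignore"):
        loglik = n_AA * np.log(pAA) + n_Aa * np.log(pAa) + n_aa * np.log(paa)
    loglik = np.nan_to_num(loglik, nan=-np.inf)
    loglik[~valid] = -np.inf
    post = np.exp(loglik - loglik.max())
    post /= post.sum()
    # marginalize over the nuisance allele frequency p
    return f, post.sum(axis=1)
```

For genotype counts consistent with HW equilibrium, the marginal posterior of f concentrates near zero; counts with a heterozygote deficit shift it toward positive f.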

Relevance: 40.00%

Abstract:

In this contribution we aim at anchoring Agent-Based Modeling (ABM) simulations in actual models of human psychology. More specifically, we apply unidirectional ABM to social psychological models using low-level agents (i.e., intra-individual) to examine whether they generate better predictions, in comparison to standard statistical approaches, concerning the intention to perform a behavior and the behavior itself. Moreover, this contribution tests to what extent the predictive validity of models of attitude such as the Theory of Planned Behavior (TPB) or Model of Goal-directed Behavior (MGB) depends on the assumption that people's decisions and actions are purely rational. Simulations were therefore run by considering different deviations from rationality of the agents with a trembling hand method. Two data sets concerning, respectively, the consumption of soft drinks and physical activity were used. Three key findings emerged from the simulations. First, compared to a standard statistical approach, the agent-based simulation generally improves the prediction of behavior from intention. Second, the improvement in prediction is inversely proportional to the complexity of the underlying theoretical model. Finally, the introduction of varying degrees of deviation from rationality in agents' behavior can lead to an improvement in the goodness of fit of the simulations. By demonstrating the potential of ABM as a complementary perspective to evaluating social psychological models, this contribution underlines the necessity of better defining agents in terms of psychological processes before examining higher levels such as the interactions between individuals.
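The trembling hand idea mentioned above can be sketched in a few lines: with probability epsilon the agent acts at random instead of following its intention. This is a toy illustration under stated assumptions (a single intention score in [0, 1], a coin-flip deviation), not the paper's calibrated model.

```python
import random

def trembling_hand_decision(intention_strength, epsilon, rng):
    """One agent decision: act on intention unless the hand 'trembles'.
    intention_strength in [0, 1] is a hypothetical TPB intention score;
    epsilon is the probability of a non-rational, random choice."""
    if rng.random() < epsilon:
        return rng.random() < 0.5              # deviation: coin-flip behavior
    return rng.random() < intention_strength   # rational: follow intention

def simulate(n_agents, intention_strength, epsilon, seed=42):
    """Fraction of agents performing the behavior in one simulated sweep."""
    rng = random.Random(seed)
    acts = sum(trembling_hand_decision(intention_strength, epsilon, rng)
               for _ in range(n_agents))
    return acts / n_agents
```

The expected behavior rate is (1 - epsilon) * intention + epsilon * 0.5, so varying epsilon interpolates between fully rational and fully random agents.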

Relevance: 40.00%

Abstract:

Recent activity in the development of future weather data for building performance simulation follows recognition of the limitations of traditional methods, which have been based on a stationary (observed) climate. In the UK, such developments have followed on from the availability of regional climate models as delivered in UKCIP02 and recently the probabilistic projections released under UKCP09. One major area of concern is the future performance and adaptability of buildings which employ exclusively passive or low-energy cooling systems. One such method, which can be employed in an integral or retrofit situation, is direct or indirect evaporative cooling. The effectiveness of evaporative cooling is most strongly influenced by the wet-bulb depression of the ambient air, hence it is generally regarded as most suited to hot, dry climates. However, this technology has been shown to be effective in the UK, primarily in mixed-mode buildings or as a retrofit to industrial/commercial applications. Climate projections for the UK generally indicate an increase in the summer wet-bulb depression, suggesting an enhanced potential for the application of evaporative cooling. The paper illustrates this potential by an analysis of the probabilistic scenarios released under UKCP09, together with a detailed building/plant simulation of a case study building located in the South-East of England. The results indicate a high probability that evaporative cooling will still be a viable low-energy technique in the 2050s.
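The dependence on wet-bulb depression can be made concrete with the standard supply-temperature relation for a direct evaporative cooler; the 0.8 saturation effectiveness used as a default here is an illustrative assumption, not a value from the paper.

```python
def direct_evap_supply_temp(t_db, t_wb, effectiveness=0.8):
    """Supply air temperature (degC) from a direct evaporative cooler.
    Cooling potential scales with the wet-bulb depression (t_db - t_wb);
    'effectiveness' is the cooler's saturation effectiveness."""
    return t_db - effectiveness * (t_db - t_wb)
```

A wider wet-bulb depression, as projected for UK summers, directly increases the achievable temperature drop: 30 degC air with a 20 degC wet-bulb gives a 22 degC supply at 0.8 effectiveness, while saturated air (zero depression) gives no cooling at all.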

Relevance: 40.00%

Abstract:

In Part I of this study it was shown that moving from a moisture-convergent- to a relative-humidity-dependent organized entrainment rate in the formulation for deep convection was responsible for significant advances in the simulation of the Madden–Julian Oscillation (MJO) in the ECMWF model. However, the application of traditional MJO diagnostics was not adequate to understand why changing the control on convection had such a pronounced impact on the representation of the MJO. In this study a set of process-based diagnostics is applied to the hindcast experiments described in Part I to identify the physical mechanisms responsible for the advances in MJO simulation. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid troposphere. Due to the modified precipitation–moisture relationship, more moisture is able to build up, which effectively preconditions the tropical atmosphere for the transition to deep convection. Results from this study suggest that a tropospheric moisture control on convection is key to simulating the interaction between the convective heating and the large-scale wave forcing associated with the MJO.

Relevance: 40.00%

Abstract:

Nowadays, a changing environment is the main challenge for most organizations, since they must evaluate appropriate policies to adapt to it. In this paper, we propose a multi-agent simulation method to evaluate policies based on complex adaptive system theory. Furthermore, we propose a semiotic EDA (Epistemic, Deontic, Axiological) agent model to simulate agents' behavior in the system by incorporating the social norms reflecting the policy. A case study is also provided to validate our approach. Our approach presents better adaptability and validity than qualitative analysis and experimental approaches, and the semiotic agent model provides high credibility in simulating agents' behavior.

Relevance: 40.00%

Abstract:

Capacity dimensioning is one of the key problems in wireless network planning. Analytical and simulation methods are usually used to pursue accurate capacity dimensioning of wireless networks. In this paper, an analytical capacity dimensioning method for WCDMA with high-speed wireless links is proposed, based on an analysis of the relations between system performance and high-speed wireless transmission technologies such as H-ARQ, AMC and fast scheduling. It evaluates system capacity in closed-form expressions at both the link level and the system level. Numerical results show that the proposed method can calculate link-level and system-level capacity for a WCDMA system with HSDPA and HSUPA.

Relevance: 40.00%

Abstract:

We have developed a model of the local field potential (LFP) based on the conservation of charge, the independence principle of ionic flows and the classical Hodgkin–Huxley (HH) type intracellular model of synaptic activity. Insights were gained through the simulation of the HH intracellular model into the nonlinear relationship between the balance of synaptic conductances and that of post-synaptic currents. The latter is dependent not only on the former, but also on the temporal lag between the excitatory and inhibitory conductances, as well as the strength of the afferent signal. The proposed LFP model provides a method for decomposing the LFP recordings near the soma of layer IV pyramidal neurons in the barrel cortex of anaesthetised rats into two highly correlated components with opposite polarity. The temporal dynamics and the proportional balance of the two components are comparable to the excitatory and inhibitory post-synaptic currents computed from the HH model. This suggests that the two components of the LFP reflect the underlying excitatory and inhibitory post-synaptic currents of the local neural population. We further used the model to decompose a sequence of evoked LFP responses under repetitive electrical stimulation (5 Hz) of the whisker pad. We found that as neural responses adapted, the excitatory and inhibitory components also adapted proportionately, while the temporal lag between the onsets of the two components increased during frequency adaptation. Our results demonstrate that the balance between neural excitation and inhibition can be investigated using extracellular recordings. Extension of the model to incorporate multiple compartments should allow more quantitative decomposition of surface electroencephalography (EEG) recordings into components reflecting the excitatory, inhibitory and passive ionic current flows generated by local neural populations.
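The dependence of post-synaptic currents on conductance balance and temporal lag can be sketched with textbook alpha-function conductances and driving forces. Reversal potentials and time constants below are standard illustrative values, not the paper's fitted parameters.

```python
import math

def alpha_conductance(t, t0, g_max, tau):
    """Alpha-function synaptic conductance starting at t0 (ms)."""
    if t < t0:
        return 0.0
    s = (t - t0) / tau
    return g_max * s * math.exp(1.0 - s)

def postsynaptic_currents(t, v, lag_ms=2.0, g_e=1.0, g_i=1.5):
    """Excitatory and inhibitory post-synaptic currents at membrane
    potential v (mV); inhibition lags excitation by lag_ms.
    Reversal potentials (0 mV excitatory, -75 mV inhibitory) are
    standard textbook values."""
    E_e, E_i = 0.0, -75.0
    i_e = alpha_conductance(t, 0.0, g_e, 2.0) * (v - E_e)
    i_i = alpha_conductance(t, lag_ms, g_i, 5.0) * (v - E_i)
    return i_e, i_i
```

At a resting potential between the two reversal potentials, the excitatory current is inward (negative) and the inhibitory current outward (positive), so their sum, and hence the net extracellular signature, depends jointly on the conductance balance and the lag.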

Relevance: 40.00%

Abstract:

We have incorporated a semi-mechanistic isoprene emission module into the JULES land-surface scheme, as a first step towards a modelling tool that can be applied for studies of vegetation–atmospheric chemistry interactions, including chemistry-climate feedbacks. Here, we evaluate the coupled model against local above-canopy isoprene emission flux measurements from six flux tower sites as well as satellite-derived estimates of isoprene emission over tropical South America and east and south Asia. The model simulates diurnal variability well: correlation coefficients are significant (at the 95 % level) for all flux tower sites. The model reproduces day-to-day variability with significant correlations (at the 95 % confidence level) at four of the six flux tower sites. At the UMBS site, a complete set of seasonal observations is available for two years (2000 and 2002). The model reproduces the seasonal pattern of emission during 2002, but does less well in the year 2000. The model overestimates observed emissions at all sites, partially because it does not include isoprene loss through the canopy. Comparison with the satellite-derived isoprene-emission estimates suggests that the model simulates the main spatial patterns, seasonal and inter-annual variability over tropical regions. The model yields a global annual isoprene emission of 535 ± 9 TgC yr−1 during the 1990s, 78 % of which originates from forested areas.

Relevance: 40.00%

Abstract:

In this paper a modified algorithm is suggested for developing polynomial neural network (PNN) models. Optimal partial description (PD) modeling is introduced at each layer of the PNN expansion, a task accomplished using the orthogonal least squares (OLS) method. Based on the initial PD models determined by the polynomial order and the number of PD inputs, OLS selects the most significant regressor terms, reducing the output error variance. The method produces PNN models exhibiting a high level of accuracy and superior generalization capabilities. Additionally, parsimonious models are obtained comprising a considerably smaller number of parameters compared to the ones generated by means of the conventional PNN algorithm. Three benchmark examples are elaborated, including modeling of the gas furnace process as well as the iris and wine classification problems. Extensive simulation results and comparison with other methods in the literature demonstrate the effectiveness of the suggested modeling approach.
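The regressor-selection step can be illustrated with a greedy forward search that, at each step, keeps the column giving the largest reduction in residual sum of squares. This is a simplified stand-in for the orthogonal least squares criterion, not the paper's exact algorithm (true OLS orthogonalizes candidates before scoring them).

```python
import numpy as np

def forward_select(X, y, n_terms):
    """Greedy forward selection of regressor columns of X, at each step
    picking the column that most reduces the residual sum of squares
    (a simplified sketch of the OLS significance criterion)."""
    selected = []
    remaining = list(range(X.shape[1]))
    for _ in range(n_terms):
        best, best_rss = None, np.inf
        for j in remaining:
            cols = X[:, selected + [j]]
            coef, *_ = np.linalg.lstsq(cols, y, rcond=None)
            rss = np.sum((y - cols @ coef) ** 2)
            if rss < best_rss:
                best, best_rss = j, rss
        selected.append(best)
        remaining.remove(best)
    return selected
```

Stopping after a small number of terms is what yields the parsimonious models described above: only regressors that measurably reduce the output error variance survive.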

Relevance: 40.00%

Abstract:

This paper presents a software-based study of a hardware-based non-sorting median calculation method on a set of integer numbers. The method divides the binary representation of each integer element in the set into bit slices in order to find the element located in the middle position. The method exhibits a linear complexity order, and our analysis shows that the best performance in execution time is obtained when slices of 4 bits in size are used for 8-bit and 16-bit integers, for almost any data set size. Results suggest that a software implementation of the bit-slice method for median calculation outperforms sorting-based methods, with increasing improvement for larger data set sizes. For data set sizes of N > 5, our simulations show an improvement of at least 40%.
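One way to realize a non-sorting, bit-slice median is a most-significant-slice-first narrowing of the candidate value, counting how many elements fall below each slice bucket. The sketch below is an interpretation of the technique described above (assuming unsigned integers and a bit width divisible by the slice width), not the paper's hardware design.

```python
def bitslice_median(data, bits=8, slice_width=4):
    """Locate the lower median of unsigned integers without sorting, by
    narrowing a candidate bit prefix slice-by-slice from the most
    significant bits. Requires bits to be divisible by slice_width."""
    k = (len(data) - 1) // 2              # rank of the lower median
    mask = (1 << slice_width) - 1
    prefix, shift = 0, bits
    while shift > 0:
        shift -= slice_width
        # count elements per slice bucket among values matching the prefix
        counts = [0] * (1 << slice_width)
        for v in data:
            if v >> (shift + slice_width) == prefix:
                counts[(v >> shift) & mask] += 1
        # elements sorting strictly below every prefix-matching value
        below = sum(1 for v in data if v >> (shift + slice_width) < prefix)
        for bucket, c in enumerate(counts):
            if below + c > k:
                prefix = (prefix << slice_width) | bucket
                break
            below += c
    return prefix
```

Each pass scans the whole set once, so the total work is linear in the number of elements times the number of slices, matching the linear complexity order claimed above.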

Relevance: 40.00%

Abstract:

We discuss a floating mechanism based on a quasi-magnetic levitation method that can be attached at the endpoint of a robot arm in order to construct a novel redundant robot arm for producing compliant motions. The floating mechanism can be composed of magnets and a constraint mechanism such that the repelling force of the magnets keeps the endpoint part of the mechanism floating stably during guided motions. The analytical and experimental results show that the proposed floating mechanism can produce stable floating motions with small inertia and viscosity. The results also show that the proposed mechanism can detect small forces applied to the endpoint part because the friction force of the mechanism is very small.

Relevance: 40.00%

Abstract:

Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65–89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all the three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland–Altman analysis showed a reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small, but significant, differences for energy and protein intake, reflecting the tendency for the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary, and could be used as an alternative to the food diary for the short-term assessment of an individual’s dietary intake.
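The Bland–Altman analysis used above compares two measurement methods through the mean difference (bias) and its 95 % limits of agreement. A minimal sketch of that computation, with illustrative data rather than the study's:

```python
import statistics

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for paired measurements from
    two methods: mean bias and 95 % limits of agreement (bias +/- 1.96 SD
    of the differences)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

A small but consistent bias, as reported here for energy and protein intake, shows up as a non-zero mean difference even when the limits of agreement are narrow.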

Relevance: 40.00%

Abstract:

Active remote sensing of marine boundary-layer clouds is challenging as drizzle drops often dominate the observed radar reflectivity. We present a new method to simultaneously retrieve cloud and drizzle vertical profiles in drizzling boundary-layer clouds using surface-based observations of radar reflectivity, lidar attenuated backscatter, and zenith radiances under conditions when precipitation does not reach the surface. Specifically, the vertical structure of droplet size and water content of both cloud and drizzle is characterised throughout the cloud. An ensemble optimal estimation approach provides full error statistics given the uncertainty in the observations. To evaluate the new method, we first perform retrievals using synthetic measurements from large-eddy simulation snapshots of cumulus under stratocumulus, where cloud water path is retrieved with an error of 31 g m−2. The method also performs well in non-drizzling clouds where no assumption of the cloud profile is required. We then apply the method to observations of marine stratocumulus obtained during the Atmospheric Radiation Measurement MAGIC deployment in the Northeast Pacific. Here, retrieved cloud water path agrees well with independent three-channel microwave radiometer retrievals, with a root mean square difference of 10–20 g m−2.

Relevance: 40.00%

Abstract:

Imagery registration is a fundamental step, which greatly affects later processes in image mosaic, multi-spectral image fusion, digital surface modelling, etc., where the final solution needs blending of pixel information from more than one image. It is highly desirable to find a way to identify registration regions among input stereo image pairs with high accuracy, particularly in remote sensing applications in which ground control points (GCPs) are not always available, such as in selecting a landing zone on an outer space planet. In this paper, a framework for localization in image registration is developed. It strengthens local registration accuracy in two respects: less reprojection error and better feature point distribution. Affine scale-invariant feature transform (ASIFT) was used for acquiring feature points and correspondences on the input images. Then, a homography matrix was estimated as the transformation model by an improved random sample consensus (IM-RANSAC) algorithm. In order to identify a registration region with a better spatial distribution of feature points, the Euclidean distance between the feature points is applied (named the S criterion). Finally, the parameters of the homography matrix were optimized by the Levenberg–Marquardt (LM) algorithm with selective feature points from the chosen registration region. In the experimental section, Chang'E-2 satellite remote sensing imagery was used for evaluating the performance of the proposed method. The experiment result demonstrates that the proposed method can automatically locate a specific region with high registration accuracy between input images by achieving lower root mean square error (RMSE) and better distribution of feature points.
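The RMSE figure of merit used above is the root mean square reprojection error of point correspondences under the estimated homography. A minimal sketch of that computation in homogeneous coordinates (the function name is illustrative; the paper's IM-RANSAC estimation step is not reproduced here):

```python
import numpy as np

def reprojection_rmse(H, src_pts, dst_pts):
    """Root mean square reprojection error of point correspondences
    under a 3x3 homography H, using homogeneous coordinates."""
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    ones = np.ones((len(src), 1))
    proj = np.hstack([src, ones]) @ H.T
    proj = proj[:, :2] / proj[:, 2:3]   # perspective divide
    return float(np.sqrt(np.mean(np.sum((proj - dst) ** 2, axis=1))))
```

A lower value indicates that the homography maps the selected feature points more accurately onto their correspondences, which is exactly what the region-selection criterion is designed to improve.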

Relevance: 40.00%

Abstract:

Despite the importance of dust aerosol in the Earth system, state-of-the-art models show large variability in North African dust emission. This study presents a systematic evaluation of dust-emitting winds in 30 years of the historical model simulation with the UK Met Office Earth-system model HadGEM2-ES for the Coupled Model Intercomparison Project Phase 5. Isolating the effect of winds on dust emission and using an automated detection for nocturnal low-level jets (NLLJs) allow an in-depth evaluation of the model performance for dust emission from a meteorological perspective. The findings highlight that NLLJs are a key driver for dust emission in HadGEM2-ES in terms of occurrence frequency and strength. The annually and spatially averaged occurrence frequency of NLLJs is similar in HadGEM2-ES and ERA-Interim from the European Centre for Medium-Range Weather Forecasts. Compared to ERA-Interim, a stronger pressure ridge over northern Africa in winter and the southward displaced heat low in summer result in differences in location and strength of NLLJs. Particularly the larger geostrophic winds associated with the stronger ridge have a strengthening effect on NLLJs over parts of West Africa in winter. Stronger NLLJs in summer may instead result from the artificially increased mixing coefficient under stable stratification, which is weaker in HadGEM2-ES. NLLJs in the Bodélé Depression are affected by stronger synoptic-scale pressure gradients in HadGEM2-ES. Wintertime geostrophic winds can even be so strong that the associated vertical wind shear prevents the formation of NLLJs. These results call for further model improvements in the synoptic-scale dynamics and the physical parametrization of the nocturnal stable boundary layer to better represent dust-emitting processes in the atmospheric model. The new approach could be used for identifying systematic behavior in other models with respect to meteorological processes for dust emission. This would help to improve dust emission simulations and contribute to decreasing the currently large uncertainty in climate change projections with respect to dust aerosol.