999 results for stochastic development


Relevance: 30.00%

Publisher:

Abstract:

Many animals that live in groups maintain competitive relationships, yet avoid continual fighting, by forming dominance hierarchies. We compare predictions of stochastic, individual-based models with empirical experimental evidence using shore crabs to test competing hypotheses regarding hierarchy development. The models test (1) what information individuals use when deciding to fight or retreat, (2) how past experience affects current resource-holding potential, and (3) how individuals deal with changes to the social environment. First, we conclude that crabs assess only their own state and not their opponent's when deciding to fight or retreat. Second, willingness to enter, and performance in, aggressive contests are influenced by previous contest outcomes. Winning increases the likelihood of both fighting and winning future interactions, while losing has the opposite effect. Third, when groups with established dominance hierarchies dissolve and new groups form, individuals reassess their ranks, showing no memory of previous rank or group affiliation. With every change in group composition, individuals fight for their new ranks. This iterative process carries over as groups dissolve and form, which has important implications for the relationship between ability and hierarchy rank. We conclude that dominance hierarchies emerge through an interaction of individual and social factors, and discuss these findings in terms of an underlying mechanism. Overall, our results are consistent with crabs using a cumulative assessment strategy iterated across changes in group composition, in which aggression is constrained by an absolute threshold in energy spent and damage received while fighting.
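As a rough illustration of the kind of stochastic, individual-based model compared against the crab experiments (all parameter values below are hypothetical, chosen only to show the mechanism), self-assessment with winner and loser effects and a cumulative cost threshold can be sketched as:

```python
import random

# Illustrative sketch (hypothetical parameters): self-assessment with
# winner/loser effects and a cumulative-cost cap on aggression.
class Crab:
    def __init__(self):
        self.rhp = 1.0          # resource-holding potential (own state only)
        self.cost = 0.0         # cumulative energy spent / damage received

    def willing_to_fight(self, threshold=3.0):
        # Decision uses only the crab's own state, not the opponent's.
        return self.cost < threshold and random.random() < self.rhp / (1 + self.rhp)

def contest(a, b):
    """One pairwise interaction; outcome probability depends on relative RHP."""
    if not (a.willing_to_fight() and b.willing_to_fight()):
        return
    winner, loser = (a, b) if random.random() < a.rhp / (a.rhp + b.rhp) else (b, a)
    winner.rhp *= 1.1           # winner effect: more likely to fight and win again
    loser.rhp *= 0.9            # loser effect
    for c in (a, b):
        c.cost += 0.2           # cumulative assessment: costs accrue with every fight

def form_group(crabs, rounds=50):
    """When a new group forms, ranks are re-established by repeated contests."""
    for _ in range(rounds):
        a, b = random.sample(crabs, 2)
        contest(a, b)
    return sorted(crabs, key=lambda c: c.rhp, reverse=True)

group = form_group([Crab() for _ in range(6)])
print([round(c.rhp, 2) for c in group])
```

Re-running form_group on reshuffled individuals mimics the reassessment of ranks after every change in group composition.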

Relevance: 30.00%

Publisher:

Abstract:

A complete life cycle model for northern corn rootworm, Diabrotica barberi Smith and Lawrence, is developed using a published single-season model of adult population dynamics and data from field experiments. Temperature-dependent development and age-dependent advancement determine adult population dynamics and oviposition, while a simple stochastic hatch and density-dependent larval survival model determine adult emergence. Dispersal is not modeled. To evaluate the long-run performance of the model, stochastically generated daily air and soil temperatures are used for 100-year simulations for a variety of corn planting and flowering dates in Ithaca, NY, and Brookings, SD. Once the model is corrected for a bias in oviposition, model predictions for both locations are consistent with anecdotal field data. Extinctions still occur, but these may be consistent with northern corn rootworm metapopulation dynamics.
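For illustration only, the temperature-dependent development component of such a model is typically built on degree-day accumulation with a stochastic hatch; the sketch below uses hypothetical thresholds, not the published model's parameters:

```python
import random

# Hypothetical illustration of degree-day accumulation with a stochastic hatch.
BASE_TEMP = 10.0        # developmental threshold (deg C), hypothetical
HATCH_DD = 250.0        # degree-days required for egg hatch, hypothetical

def daily_degree_days(mean_temp):
    return max(0.0, mean_temp - BASE_TEMP)

def simulate_hatch(daily_temps, hatch_spread=0.1):
    """Return the day of hatch; the threshold is jittered to mimic a stochastic hatch."""
    target = HATCH_DD * (1 + random.gauss(0, hatch_spread))
    dd = 0.0
    for day, temp in enumerate(daily_temps):
        dd += daily_degree_days(temp)
        if dd >= target:
            return day
    return None  # did not hatch within the season

temps = [12 + 10 * random.random() for _ in range(150)]  # stand-in daily mean temps
print(simulate_hatch(temps))
```

In the full model this bookkeeping would be driven by stochastically generated daily soil temperatures and combined with density-dependent larval survival.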

Relevance: 30.00%

Publisher:

Abstract:

Cultural variation in a population is affected by the rate of occurrence of cultural innovations, whether such innovations are preferred or eschewed, how they are transmitted between individuals in the population, and the size of the population. An innovation, such as a modification in an attribute of a handaxe, may be lost or may become a property of all handaxes, which we call "fixation of the innovation." Alternatively, several innovations may attain appreciable frequencies, in which case properties of the frequency distribution, for example of handaxe measurements, are important. Here we apply the Moran model from the stochastic theory of population genetics to study the evolution of cultural innovations. We obtain the probability that an initially rare innovation becomes fixed, and the expected time this takes. When variation in cultural traits is due to recurrent innovation, copy error, and sampling from generation to generation, we describe properties of this variation, such as the level of heterogeneity expected in the population. For all of these, we determine the effect of the mode of social transmission: conformist, where there is a tendency for each naïve newborn to copy the most popular variant; pro-novelty bias, where the newborn prefers a specific variant if it exists among those it samples; one-to-many transmission, where the variant one individual carries is copied by all newborns while that individual remains alive. We compare our findings with those predicted by prevailing theories for rates of cultural change and the distribution of cultural variation.
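For reference, in the neutral Moran model a single new innovation in a population of N individuals fixes with probability 1/N; a minimal simulation (a generic sketch, not the authors' code) makes the fixation-or-loss dynamics concrete:

```python
import random

def moran_step(pop):
    """One Moran event: a random individual dies and is replaced by a copy
    of a randomly chosen (possibly the same) individual's variant."""
    newborn_variant = random.choice(pop)          # unbiased copying
    pop[random.randrange(len(pop))] = newborn_variant

def run_until_fixed(n=50, initial_copies=1):
    pop = [1] * initial_copies + [0] * (n - initial_copies)
    steps = 0
    while 0 < sum(pop) < n:
        moran_step(pop)
        steps += 1
    return sum(pop) == n, steps

# Empirical fixation frequency of a single new innovation; theory predicts 1/N.
runs = [run_until_fixed()[0] for _ in range(2000)]
print(sum(runs) / len(runs))   # roughly 0.02 for N = 50
```

Conformist transmission or a pro-novelty bias would enter by biasing the choice of the copied variant toward the most popular variant or toward a preferred variant, respectively.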

Relevance: 30.00%

Publisher:

Abstract:

This paper estimates a translog stochastic frontier production function for the 48 contiguous U.S. states over the period 1970-1983, in an attempt to measure and explain changes in technical efficiency. The model allows technical inefficiency to vary over time and inefficiency effects to be a function of a set of explanatory variables in which the level and composition of public capital play an important role. Results indicate that U.S. state inefficiency levels were significantly and positively correlated with the ratio of public capital to private capital, while the proportion of public capital devoted to highways was negatively correlated with technical inefficiency, suggesting that not only the level but also the composition of public capital influenced state efficiency.
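A generic form of this kind of specification (a Battese-Coelli-style translog frontier with time-varying inefficiency effects; the paper's exact variable set is not reproduced here) is:

\[
\ln y_{it} = \beta_0 + \sum_{j}\beta_j \ln x_{jit}
+ \tfrac{1}{2}\sum_{j}\sum_{k}\beta_{jk}\,\ln x_{jit}\,\ln x_{kit} + v_{it} - u_{it},
\qquad u_{it} = \mathbf{z}_{it}'\boldsymbol{\delta} + w_{it},
\]

where \(v_{it}\) is symmetric noise, \(u_{it}\ge 0\) is technical inefficiency, and \(\mathbf{z}_{it}\) would contain explanatory variables such as the ratio of public to private capital and the highway share of public capital.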

Relevance: 30.00%

Publisher:

Abstract:

The principal aim of this paper is to estimate a stochastic frontier cost function and an inefficiency effects model in the analysis of the primary health care services purchased by the public authority and supplied by 180 providers in 1996 in Catalonia. The evidence from our sample does not support the premise that contracting out has helped improve purchasing cost efficiency in primary care. Inefficient purchasing cost was observed in the component of this purchasing cost explicitly included in the contract between purchaser and provider. There are no observable incentives for the contracted-out primary health care teams to minimise prescription costs, which are not explicitly included in the present contracting system.

Relevance: 30.00%

Publisher:

Abstract:

Traditionally, the common reserving methods used by non-life actuaries are based on the assumption that future claims will behave in the same way as they did in the past. There are two main sources of variability in the claims development process: variability in the speed with which claims are settled, and variability between the severity of claims from different accident years. Large changes in these processes generate distortions in the estimation of the claims reserves. The main objective of this thesis is to provide an indicator which, firstly, identifies and quantifies these two influences and, secondly, determines which model is adequate for a specific situation. Two stochastic models were analysed and the predictive distributions of the future claims were obtained. The main advantage of stochastic models is that they provide measures of variability of the reserve estimates. The first model (PDM) combines a Dirichlet-Multinomial conjugate family with the Poisson distribution. The second model (NBDM) improves on the first by combining two conjugate families: Poisson-Gamma (for the distribution of the ultimate amounts) and Dirichlet-Multinomial (for the distribution of the incremental claims payments). The second model makes it possible to express the variability in the speed of the reporting process and in the development of claims severity as a function of two parameters of the above-mentioned distributions: the shape parameter of the Gamma distribution and the Dirichlet parameter. Depending on the relation between them, we can decide on the adequacy of the claims reserve estimation method. The parameters were estimated by the Method of Moments and by Maximum Likelihood. The results were tested using simulated data and then real data from three lines of business: Property/Casualty, General Liability, and Accident Insurance. These data include different development patterns and specificities. The thesis shows that when the Dirichlet parameter is greater than the shape parameter of the Gamma, the model exhibits positive correlation between past and future claims payments, which suggests that the Chain-Ladder method is appropriate for claims reserve estimation. In terms of claims reserves, if the cumulated payments are high, the positive correlation implies high expectations for the future payments, resulting in high claims reserve estimates. Negative correlation appears when the Dirichlet parameter is lower than the shape parameter of the Gamma, meaning low expected future payments for the same high observed cumulated payments. This corresponds to the situation in which claims are reported rapidly and few claims remain expected subsequently. The extreme case arises when all claims are reported at the same time, leading to expected future payments of either zero or the aggregated amount of the ultimate paid claims. For this latter case, the Chain-Ladder method is not recommended.
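Schematically, and using generic notation rather than the thesis's own, the second model's structure as described above combines a Poisson-Gamma pair for the ultimate claims with a Dirichlet-Multinomial pair for their allocation across development periods:

\[
N \mid \lambda \sim \mathrm{Poisson}(\lambda), \qquad \lambda \sim \mathrm{Gamma}(\alpha,\beta),
\]
\[
(X_1,\dots,X_k) \mid N,\mathbf{p} \sim \mathrm{Multinomial}(N,\mathbf{p}), \qquad \mathbf{p} \sim \mathrm{Dirichlet}(\gamma_1,\dots,\gamma_k),
\]

so that, as noted above, the sign of the dependence between past and future payments is governed by how the Dirichlet parameter compares with the Gamma shape parameter \(\alpha\).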

Relevance: 30.00%

Publisher:

Abstract:

How immature CD4+CD8+ thymocytes become committed to either the CD4 (helper) or CD8 (cytotoxic) lineage is controversial. Genetic ablation of a silencer element in the gene encoding CD4 provides new evidence that CD8 lineage commitment occurs via a stochastic, rather than instructive, mechanism.

Relevance: 30.00%

Publisher:

Abstract:

Stochastic convergence amongst Mexican federal entities is analyzed in a panel data framework. The joint consideration of cross-section dependence and multiple structural breaks is required to ensure that statistical inference is based on statistics with good properties. Once these features are accounted for, evidence in favour of stochastic convergence is found. Since stochastic convergence is a necessary, yet insufficient, condition for convergence as predicted by economic growth models, the paper also investigates whether a β-convergence process has taken place. We find that the Mexican states have followed either heterogeneous convergence patterns or a divergence process throughout the analyzed period.
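For context, stochastic convergence in this literature is usually framed as stationarity of relative (log) income; a generic sketch of the testing equation (the paper's exact specification, with cross-section dependence and breaks, is richer) is:

\[
y_{it} = \ln\!\left(\frac{\mathrm{income}_{it}}{\overline{\mathrm{income}}_{t}}\right), \qquad
\Delta y_{it} = \mu_i + \rho_i\, y_{i,t-1} + \varepsilon_{it},
\]

with stochastic convergence corresponding to rejection of the unit-root null \(\rho_i = 0\) in favour of \(\rho_i < 0\).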

Relevance: 30.00%

Publisher:

Abstract:

Innovative gas cooled reactors, such as the pebble bed reactor (PBR) and the gas cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three-dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for a physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. The results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in reactor core thermal-hydraulic analyses. Two open source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. Russian ASTRA criticality experiments were calculated. Pebble beds corresponding to the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided. A novel method was developed and implemented as a MATLAB code to calculate porosities in the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between discrete-based reactor physics and continuum-based thermal-hydraulics models to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three-dimensional air cooled smooth and rib-roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single fuel rod of a GFR. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared to the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to the specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat under-predicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib-roughened geometry, heat transfer was severely under-predicted by the realisable k-epsilon turbulence model used. An additional calculation with a v2-f turbulence model showed significant improvement in the heat transfer results, which is most likely due to the better performance of the model in separated flow problems. Further investigations are suggested before using CFD to draw conclusions about the heat transfer performance of rib-roughened GFR fuel rod geometries. It is suggested that the viewpoints of numerical modelling be included in the planning of experiments, to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, multi-physical aspects of experiments should also be considered and documented in reasonable detail.
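The porosity-mapping step described above can be illustrated by Monte Carlo point sampling over each CFD cell; the sketch below is a hypothetical Python illustration and does not reproduce the thesis's MATLAB implementation:

```python
import random

def cell_porosity(cell_min, cell_max, spheres, n_samples=5000):
    """Estimate the fluid fraction (porosity) of a rectangular cell that is
    partially occupied by pebbles, by random point sampling.

    cell_min, cell_max : (x, y, z) corners of the cell
    spheres            : list of ((cx, cy, cz), radius) pebbles from a DEM packing
    """
    solid_hits = 0
    for _ in range(n_samples):
        p = [random.uniform(lo, hi) for lo, hi in zip(cell_min, cell_max)]
        for (c, r) in spheres:
            if sum((pi - ci) ** 2 for pi, ci in zip(p, c)) <= r * r:
                solid_hits += 1
                break
    return 1.0 - solid_hits / n_samples

# Example: one cell intersected by a single pebble of radius 0.03 m.
print(cell_porosity((0, 0, 0), (0.05, 0.05, 0.05), [((0.05, 0.05, 0.05), 0.03)]))
```

In practice only the spheres near a given cell would be tested, and analytic sphere-box overlap formulas could replace the sampling where higher accuracy is needed.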

Relevance: 30.00%

Publisher:

Abstract:

A part-of-speech (POS) tagger for Malayalam that uses a stochastic approach is proposed. The tagger makes use of word frequencies and bigram statistics derived from a corpus. A morphological analyzer is used to generate a tagged corpus, owing to the unavailability of an annotated corpus for Malayalam. Although the experiments have been performed on a very small corpus, the results show that the statistical approach works well with a highly agglutinative language like Malayalam.
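A minimal sketch of the kind of bigram statistics such a stochastic tagger rests on (toy English-like counts stand in for the Malayalam corpus; all identifiers are hypothetical): tag-bigram and word-given-tag frequencies are estimated from a tagged corpus and the most likely tag sequence is recovered with Viterbi decoding.

```python
from collections import defaultdict
import math

# Toy tagged corpus standing in for counts derived from a real corpus.
corpus = [[("the", "DET"), ("crab", "N"), ("fights", "V")],
          [("the", "DET"), ("fights", "N"), ("end", "V")]]

trans = defaultdict(lambda: defaultdict(int))   # tag-bigram counts
emit = defaultdict(lambda: defaultdict(int))    # word-given-tag counts
for sent in corpus:
    prev = "<s>"
    for word, tag in sent:
        trans[prev][tag] += 1
        emit[tag][word] += 1
        prev = tag

def logp(table, key, item):
    counts = table[key]
    total = sum(counts.values())
    return math.log((counts.get(item, 0) + 1) / (total + len(counts) + 1))  # add-one smoothing

def viterbi(words, tags=("DET", "N", "V")):
    """Most likely tag sequence under the bigram model."""
    best = {t: (logp(trans, "<s>", t) + logp(emit, t, words[0]), [t]) for t in tags}
    for w in words[1:]:
        best = {t: max(((score + logp(trans, pt, t) + logp(emit, t, w), path + [t])
                        for pt, (score, path) in best.items()), key=lambda x: x[0])
                for t in tags}
    return max(best.values(), key=lambda x: x[0])[1]

print(viterbi(["the", "crab", "fights"]))   # ['DET', 'N', 'V'] on this toy corpus
```

In the setting described above, the tagged corpus produced by the morphological analyzer would supply the word-frequency and bigram counts.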

Relevance: 30.00%

Publisher:

Abstract:

Our group considered the desirability of including representations of uncertainty in the development of parameterizations. (By ‘uncertainty’ here we mean the deviation of sub-grid scale fluxes or tendencies in any given model grid box from truth.) We unanimously agreed that the ECMWF should attempt to provide a more physical basis for uncertainty estimates than the very effective but ad hoc methods being used at present. Our discussions identified several issues that will arise.

Relevance: 30.00%

Publisher:

Abstract:

A direct method is presented for determining the uncertainty in reservoir pressure, flow, and net present value (NPV) using the time-dependent, one phase, two- or three-dimensional equations of flow through a porous medium. The uncertainty in the solution is modelled as a probability distribution function and is computed from given statistical data for input parameters such as permeability. The method generates an expansion for the mean of the pressure about a deterministic solution to the system equations using a perturbation to the mean of the input parameters. Hierarchical equations that define approximations to the mean solution at each point and to the field covariance of the pressure are developed and solved numerically. The procedure is then used to find the statistics of the flow and the risked value of the field, defined by the NPV, for a given development scenario. This method involves only one (albeit complicated) solution of the equations and contrasts with the more usual Monte-Carlo approach where many such solutions are required. The procedure is applied easily to other physical systems modelled by linear or nonlinear partial differential equations with uncertain data.
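Schematically, and in generic notation rather than the paper's, the direct method amounts to a perturbation expansion of the pressure about the deterministic solution in the deviation of the input data from its mean:

\[
p = p_0 + p_1(\delta k) + p_2(\delta k, \delta k) + \cdots, \qquad
\mathbb{E}[p] \approx p_0 + \mathbb{E}[p_2], \qquad
\operatorname{Cov}\bigl(p(x),p(y)\bigr) \approx \mathbb{E}\bigl[p_1(x)\,p_1(y)\bigr],
\]

where \(\delta k\) is the zero-mean perturbation of an input field such as permeability; each term is obtained from a deterministic hierarchical equation, so a single (augmented) solve replaces many Monte Carlo realisations.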

Relevance: 30.00%

Publisher:

Abstract:

Westerly wind bursts (WWBs) that occur in the western tropical Pacific are believed to play an important role in the development of El Niño events. Here, following the study of Lengaigne et al. (Clim Dyn 23(6):601–620, 2004), we conduct numerical simulations in which we reexamine the response of the climate system to an observed wind burst added to a coupled general circulation model. Two sets of twin ensemble experiments are conducted (each set has control and perturbed experiments). In the first set, the initial ocean heat content of the system is higher than the model climatology (recharged), while in the second set it is nearly normal (neutral). For the recharged state, in the absence of WWBs, a moderate El Niño with a maximum warming in the central Pacific (CP) develops in about a year. In contrast, for the neutral state, there develops a weak La Niña. However, when the WWB is imposed, the situation dramatically changes: the recharged state slides into an El Niño with a maximum warming in the eastern Pacific, while the neutral set produces a weak CP El Niño instead of previous La Niña conditions. The different response of the system to the exact same perturbations is controlled by the initial state of the ocean and the subsequent ocean–atmosphere interactions involving the interplay between the eastward shift of the warm pool and the warming of the eastern equatorial Pacific. Consequently, the observed diversity of El Niño, including the occurrence of extreme events, may depend on stochastic atmospheric processes, modulating El Niño properties within a broad continuum.

Relevance: 30.00%

Publisher:

Abstract:

The Plant–Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme, whose only stochastic element is a simple random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant–Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant–Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.