942 results for Stochastic simulation algorithm
Abstract:
A version of the Agricultural Production Systems Simulator (APSIM) capable of simulating the key agronomic aspects of intercropping maize between legume shrub hedgerows was described and parameterised in the first paper of this series (Nelson et al., this issue). In this paper, APSIM is used to simulate maize yields and soil erosion from traditional open-field farming and hedgerow intercropping in the Philippine uplands. Two variants of open-field farming were simulated using APSIM, continuous and fallow, for comparison with intercropping maize between leguminous shrub hedgerows. Continuous open-field maize farming was predicted to be unsustainable in the long term, while fallow open-field farming was predicted to slow productivity decline by spreading the effect of erosion over a larger cropping area. Hedgerow intercropping was predicted to reduce erosion by maintaining soil surface cover during periods of intense rainfall, contributing to sustainable production of maize in the long term. In the third paper in this series, Nelson et al. (this issue) use cost-benefit analysis to compare the economic viability of hedgerow intercropping relative to traditional open-field farming of maize in relatively inaccessible upland areas. (C) 1998 Elsevier Science Ltd. All rights reserved.
Abstract:
We introduce the study of dynamical quantum noise in Bose-Einstein condensates through numerical simulation of stochastic partial differential equations obtained using phase-space representations. We derive evolution equations for a single trapped condensate in both the positive-P and Wigner representations and perform simulations to compare the predictions of the two methods. The positive-P approach is found to be highly susceptible to the stability problems that have been observed in other strongly nonlinear, weakly damped systems. Using the Wigner representation, we examine the evolution of several quantities of interest for a variety of choices of initial state for the condensate and compare the results to those for single-mode models. [S1050-2947(98)06612-8].
Abstract:
A space-marching code for the simulation and optimization of inviscid supersonic flow in three dimensions is described. The flow in a scramjet module with a relatively complex three-dimensional geometry is examined and wall-pressure estimates are compared with experimental data. Given that viscous effects are not presently included, the comparison is reasonable. The thermodynamic compromise of adding heat in a diverging combustor is also examined. The code is then used to optimize the shape of a thrust surface for a simpler (box-section) scramjet module in the presence of uniform and nonuniform heat distributions. The optimum two-dimensional profiles for the thrust surface are obtained via a perturbation procedure that requires about 30-50 flow solutions. It is found that the final shapes are fairly insensitive to the details of the heat distribution.
Abstract:
To translate and transfer solution data between two totally different meshes (i.e. mesh 1 and mesh 2), a consistent point-searching algorithm for solution interpolation in unstructured meshes consisting of 4-node bilinear quadrilateral elements is presented in this paper. The proposed algorithm has the following significant advantages: (1) The use of a point-searching strategy allows a point in one mesh to be accurately related to an element (containing this point) in another mesh. Thus, to translate/transfer the solution of any particular point from mesh 2 to mesh 1, only one element in mesh 2 needs to be inversely mapped. This certainly minimizes the number of elements to which the inverse mapping is applied. In this regard, the present algorithm is very effective and efficient. (2) Analytical solutions to the local coordinates of any point in a four-node quadrilateral element, which are derived in a rigorous mathematical manner in the context of this paper, make it possible to carry out an inverse mapping process very effectively and efficiently. (3) The use of consistent interpolation enables the interpolated solution to be compatible with an original solution and, therefore, guarantees an interpolated solution of extremely high accuracy. After the mathematical formulations of the algorithm are presented, the algorithm is tested and validated through a challenging problem. The related results from the test problem have demonstrated the generality, accuracy, effectiveness, efficiency and robustness of the proposed consistent point-searching algorithm. Copyright (C) 1999 John Wiley & Sons, Ltd.
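The inverse mapping at the heart of this algorithm can be illustrated with a short sketch. The paper derives closed-form expressions for the local coordinates; the version below uses Newton iteration instead, a standard substitute (function names and tolerances are illustrative, not from the paper):

```python
import numpy as np

def inverse_map_quad(xy, nodes, tol=1e-12, max_iter=50):
    """Local coordinates (xi, eta) of physical point xy inside a 4-node
    bilinear quadrilateral; nodes is a (4, 2) array, counter-clockwise."""
    xi = np.zeros(2)  # start the iteration at the element centre
    for _ in range(max_iter):
        s, t = xi
        N = 0.25 * np.array([(1 - s) * (1 - t), (1 + s) * (1 - t),
                             (1 + s) * (1 + t), (1 - s) * (1 + t)])
        dN_ds = 0.25 * np.array([-(1 - t), (1 - t), (1 + t), -(1 + t)])
        dN_dt = 0.25 * np.array([-(1 - s), -(1 + s), (1 + s), (1 - s)])
        r = N @ nodes - xy                      # residual in physical space
        if np.linalg.norm(r) < tol:
            break
        J = np.column_stack((dN_ds @ nodes, dN_dt @ nodes))  # 2x2 Jacobian
        xi -= np.linalg.solve(J, r)
    return xi

def interpolate(xi, nodal_values):
    """Consistent interpolation: evaluate the same bilinear shape
    functions at the recovered local coordinates."""
    s, t = xi
    N = 0.25 * np.array([(1 - s) * (1 - t), (1 + s) * (1 - t),
                         (1 + s) * (1 + t), (1 - s) * (1 + t)])
    return N @ nodal_values
```

For a unit-square element the mapping is affine, so for example the physical point (0.25, 0.75) maps back to local coordinates (-0.5, 0.5) in one Newton step.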
Abstract:
A mixture model for long-term survivors has been adopted in various fields such as biostatistics and criminology where some individuals may never experience the type of failure under study. It is directly applicable in situations where the only information available from follow-up on individuals who will never experience this type of failure is in the form of censored observations. In this paper, we consider a modification to the model so that it still applies in the case where during the follow-up period it becomes known that an individual will never experience failure from the cause of interest. Unless a model allows for this additional information, a consistent survival analysis will not be obtained. A partial maximum likelihood (ML) approach is proposed that preserves the simplicity of the long-term survival mixture model and provides consistent estimators of the quantities of interest. Some simulation experiments are performed to assess the efficiency of the partial ML approach relative to the full ML approach for survival in the presence of competing risks.
Abstract:
Two experimental studies were conducted to examine whether the stress-buffering effects of behavioral control on work task responses varied as a function of procedural information. Study 1 manipulated low and high levels of task demands, behavioral control, and procedural information for 128 introductory psychology students completing an in-basket activity. ANOVA procedures revealed a significant three-way interaction among these variables in the prediction of subjective task performance and task satisfaction. It was found that procedural information buffered the negative effects of task demands on ratings of performance and satisfaction only under conditions of low behavioral control. This pattern of results suggests that procedural information may have a compensatory effect when the work environment is characterized by a combination of high task demands and low behavioral control. Study 2 (N = 256) utilized simple and complex versions of the in-basket activity to examine the extent to which the interactive relationship among task demands, behavioral control, and procedural information varied as a function of task complexity. There was further support for the stress-buffering role of procedural information on work task responses under conditions of low behavioral control. This effect was, however, only present when the in-basket activity was characterized by high task complexity, suggesting that the interactive relationship among these variables may depend on the type of tasks performed at work. Copyright (C) 1999 John Wiley & Sons, Ltd.
Abstract:
RWMODEL II simulates the Rescorla-Wagner model of Pavlovian conditioning. It is written in Delphi and runs under Windows 3.1 and Windows 95. The program was designed for novice and expert users and can be employed in teaching, as well as in research. It is user friendly and requires a minimal level of computer literacy but is sufficiently flexible to permit a wide range of simulations. It allows the display of empirical data, against which predictions from the model can be validated.
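The model that RWMODEL II simulates has a compact update rule: on each trial, every conditioned stimulus (CS) present changes its associative strength by delta-V = alpha * beta * (lambda - sum V), where the sum runs over all CSs present. A minimal sketch, with illustrative parameter values (the program itself is a Delphi GUI, not this code):

```python
def rescorla_wagner(trials, alpha, beta=0.5, lam=1.0):
    """Rescorla-Wagner learning. trials: list of (cs_indices, us_present)
    pairs; alpha: salience of each CS. Returns final strengths V."""
    V = [0.0] * len(alpha)
    for cs, us in trials:
        total = sum(V[i] for i in cs)          # summed strength of present CSs
        error = (lam if us else 0.0) - total   # prediction error
        for i in cs:
            V[i] += alpha[i] * beta * error    # delta-V update
    return V

# Blocking demonstration: pretrain CS A alone, then reinforce compound A+B.
trials = [([0], True)] * 20 + [([0, 1], True)] * 10
V = rescorla_wagner(trials, alpha=[0.5, 0.5])
# B gains almost no strength because A already predicts the US.
```

This is the kind of empirical phenomenon (blocking) against which the program's predictions can be validated.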
Abstract:
1. Establishing biological control agents in the field is a major step in any classical biocontrol programme, yet there are few general guidelines to help the practitioner decide what factors might enhance the establishment of such agents. 2. A stochastic dynamic programming (SDP) approach, linked to a metapopulation model, was used to find optimal release strategies (number and size of releases), given constraints on time and the number of biocontrol agents available. By modelling within a decision-making framework we derived rules of thumb that will enable biocontrol workers to choose between management options, depending on the current state of the system. 3. When there are few well-established sites, making a few large releases is the optimal strategy. For other states of the system, the optimal strategy ranges from a few large releases, through a mixed strategy (a variety of release sizes), to many small releases, as the probability of establishment of smaller inocula increases. 4. Given that the probability of establishment is rarely a known entity, we also strongly recommend a mixed strategy in the early stages of a release programme, to accelerate learning and improve the chances of finding the optimal approach.
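The decision framework can be sketched as a small dynamic program. The state here is reduced to (release opportunities left, agents left), the per-size establishment probabilities are made-up numbers, and the objective is the expected number of established sites; this is a much-simplified stand-in for the paper's metapopulation-linked SDP:

```python
from functools import lru_cache

def plan_releases(T, budget, p_est):
    """Backward-induction DP: maximise the expected number of established
    sites given T release opportunities and a fixed stock of agents.
    p_est maps release size -> establishment probability (assumed values)."""
    @lru_cache(maxsize=None)
    def V(t, b):
        if t == 0:
            return 0.0
        best = V(t - 1, b)                 # option: release nothing now
        for size, p in p_est.items():
            if size <= b:                  # can afford this release size
                best = max(best, p + V(t - 1, b - size))
        return best
    return V(T, budget)

# With a strong size threshold (small releases rarely establish),
# the DP spends the budget on a few large releases.
value = plan_releases(T=10, budget=100, p_est={10: 0.1, 50: 0.8})
```

Changing `p_est` so that small releases do comparatively well flips the optimal policy toward many small releases, mirroring the kind of state-dependent rule of thumb the paper derives.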
Abstract:
OBJECTIVE: To evaluate a diagnostic algorithm for pulmonary tuberculosis based on smear microscopy and objective response to trial of antibiotics. SETTING: Adult medical wards, Hlabisa Hospital, South Africa, 1996-1997. METHODS: Adults with chronic chest symptoms and abnormal chest X-ray had sputum examined for Ziehl-Neelsen stained acid-fast bacilli by light microscopy. Those with negative smears were treated with amoxycillin for 5 days and assessed. Those who had not improved were treated with erythromycin for 5 days and reassessed. Response was compared with mycobacterial culture. RESULTS: Of 280 suspects who completed the diagnostic pathway, 160 (57%) had a positive smear, 46 (17%) responded to amoxycillin, 34 (12%) responded to erythromycin and 40 (14%) were treated as smear-negative tuberculosis. The sensitivity (89%) and specificity (84%) of the full algorithm for culture-positive tuberculosis were high. However, 11 patients (positive predictive value [PPV] 95%) were incorrectly diagnosed with tuberculosis, and 24 cases of tuberculosis (negative predictive value [NPV] 70%) were not identified. NPV improved to 75% when anaemia was included as a predictor. Algorithm performance was independent of human immunodeficiency virus status. CONCLUSION: Sputum smear microscopy plus trial of antibiotic algorithm among a selected group of tuberculosis suspects may increase diagnostic accuracy in district hospitals in developing countries.
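The reported percentages can be cross-checked from the counts in the abstract: 200 suspects were diagnosed with tuberculosis (160 smear-positive plus 40 treated as smear-negative TB), 11 of them incorrectly, and 80 were judged non-TB (the 46 + 34 antibiotic responders), 24 of them missed cases. A quick consistency check:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard 2x2 diagnostic-accuracy measures."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Counts implied by the abstract (200 diagnosed, 11 false positives;
# 80 not diagnosed, 24 false negatives):
m = diagnostic_metrics(tp=189, fp=11, tn=56, fn=24)
# sensitivity ~0.89, specificity ~0.84, PPV ~0.95, NPV 0.70, as reported
```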
Abstract:
Intracavity and external third order correlations in the damped nondegenerate parametric oscillator are calculated for quantum mechanics and stochastic electrodynamics (SED), a semiclassical theory. The two theories yield greatly different results, with the correlations of quantum mechanics being cubic in the system's nonlinear coupling constant and those of SED being linear in the same constant. In particular, differences between the two theories are present in at least a mesoscopic regime. They also exist when realistic damping is included. Such differences illustrate distinctions between quantum mechanics and a hidden variable theory for continuous variables.
Abstract:
In quantum measurement theory it is necessary to show how a quantum source conditions a classical stochastic record of measured results. We discuss mesoscopic conductance using quantum stochastic calculus to elucidate the quantum nature of the measurement taking place in these systems. To illustrate the method we derive the current fluctuations in a two-terminal mesoscopic circuit with two tunnel barriers containing a single quasi-bound state in the well. The method enables us to focus on either the incoming/outgoing Fermi fields in the leads, or on the irreversible dynamics of the well state itself. We show an equivalence between the approach of Buttiker and the Fermi quantum stochastic calculus for mesoscopic systems.
Abstract:
In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipeline implementations in signal processing. Here, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound for the highest order of stable filters is given. Then the minimum-order stable linear predictor is obtained by solving an optimization problem. In this paper, the popular genetic algorithm approach is adopted since it is a heuristic probabilistic optimization technique and has been widely used in engineering designs. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
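The stability constraint here is the classical one for recursive (IIR) filters: all poles strictly inside the unit circle. Below is a sketch of that check, plus a fitness function of the sort a genetic algorithm could use to score candidate predictor coefficients (the penalty scheme and function names are illustrative, not the paper's formulation):

```python
import numpy as np

def is_stable(a):
    """Denominator coefficients a = [a1, ..., aN] of
    A(z) = 1 + a1 z^-1 + ... + aN z^-N; the filter is stable iff
    every root of A lies strictly inside the unit circle."""
    return bool(np.all(np.abs(np.roots([1.0, *a])) < 1.0))

def fitness(a, x):
    """GA fitness candidate: negative squared residual of the linear
    predictor with coefficients a; unstable candidates are rejected
    outright via a -inf penalty."""
    if not is_stable(a):
        return -np.inf
    pred = np.convolve(x, a)[: len(x) - 1]   # sum_k a_k * x[n-k]
    err = x[1:] + pred                        # residual A(z) x[n]
    return -float(err @ err)
```

For example, `is_stable([-0.5])` is true (single pole at 0.5) while `is_stable([-1.5])` is false (pole at 1.5), and a noiseless AR(1) signal is predicted exactly by its own coefficient.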
Abstract:
We analyze folding phenomena in finely layered viscoelastic rock. Fine is meant in the sense that the thickness of each layer is considerably smaller than characteristic structural dimensions. For this purpose we derive constitutive relations and apply a computational simulation scheme (a finite-element based particle advection scheme; see MORESI et al., 2001) suitable for problems involving very large deformations of layered viscous and viscoelastic rocks. An algorithm for the time integration of the governing equations as well as details of the finite-element implementation are also given. We then consider buckling instabilities in a finite, rectangular domain. Embedded within this domain, parallel to the longer dimension, we consider a stiff, layered plate. The domain is compressed along the layer axis by prescribing velocities along the sides. First, for the viscous limit we consider the response to a series of harmonic perturbations of the director orientation. The Fourier spectra of the initial folding velocity are compared for different viscosity ratios. Turning to the nonlinear regime we analyze viscoelastic folding histories up to 40% shortening. The effect of layering manifests itself in that appreciable buckling instabilities are obtained at much lower viscosity ratios (1:10) than are required for the buckling of isotropic plates (1:500). The wavelength induced by the initial harmonic perturbation of the director orientation seems to be persistent. In the section of the parameter space considered here, elasticity seems to delay or inhibit the occurrence of a second, larger wavelength. Finally, in a linear instability analysis we undertake a brief excursion into the potential role of couple stresses in the folding process. The linear instability analysis also provides insight into the expected modes of deformation at the onset of instability, and the different regimes of behavior one might expect to observe.
Abstract:
This article deals with the efficiency of fractional integration parameter estimators. This study was based on Monte Carlo experiments involving simulated stochastic processes with integration orders in the range ]-1,1[. The evaluated estimation methods were classified into two groups: heuristics and semiparametric/maximum likelihood (ML). The study revealed that the comparative efficiency of the estimators, measured by the lesser mean squared error, depends on the stationary/non-stationary and persistent/anti-persistent conditions of the series. The ML estimator was shown to be superior for stationary persistent processes; the wavelet spectrum-based estimators were better for non-stationary mean-reverting and invertible anti-persistent processes; the weighted periodogram-based estimator was shown to be superior for non-invertible anti-persistent processes.
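One standard semiparametric estimator of the kind compared in such studies is the Geweke and Porter-Hudak (GPH) log-periodogram regression: near frequency zero the log periodogram of an I(d) process is approximately linear in -2 log(2 sin(lambda/2)) with slope d. A sketch (not the authors' code; the bandwidth rule is a common default, here an assumption):

```python
import numpy as np

def gph_estimate(x, m=None):
    """GPH log-periodogram regression estimate of the fractional
    integration order d, using the first m Fourier frequencies."""
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                     # common default bandwidth
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    fx = np.fft.fft(x - np.mean(x))
    I = (np.abs(fx[1:m + 1]) ** 2) / (2 * np.pi * n)  # periodogram ordinates
    y = np.log(I)
    X = -2 * np.log(2 * np.sin(freqs / 2))    # regressor; OLS slope = d_hat
    Xc = X - X.mean()
    return float((Xc @ (y - y.mean())) / (Xc @ Xc))
```

Applied to white noise (d = 0) the estimate should be close to zero, with sampling noise that shrinks as the bandwidth m grows.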
Abstract:
A graph clustering algorithm constructs groups of closely related parts and machines separately. After the groups are matched for the fewest intercell moves, a refining process runs on the initial cell formation to further decrease the number of intercell moves. A simple modification of this main approach can handle some practical constraints, such as the common constraint bounding the maximum number of machines in a cell. Our approach greatly improves computational time; more importantly, it also improves the number of intercell moves when the computational results are compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.