932 results for Stochastic simulation methods


Relevance:

30.00%

Publisher:

Abstract:

Grand canonical Monte Carlo (GCMC) simulation was used for a systematic investigation of supercritical methane adsorption at 273 K on an open graphite surface and in slitlike micropores of different sizes. For both adsorption systems considered, the calculated excess adsorption isotherms exhibit a maximum. The effect of the pore size on the maximum surface excess and the isosteric enthalpy of adsorption for methane storage at 273 K is discussed. A detailed microscopic picture of methane densification near the homogeneous graphite wall and in slitlike pores at 273 K is presented through selected local density profiles and snapshots. Finally, reliable pore size distributions in the micropore range are calculated for two pitch-based microporous activated carbon fibers from the local excess adsorption isotherms obtained via GCMC simulation. This systematic GCMC study of supercritical methane adsorption, both on an open graphite surface and in slitlike micropores, consolidates recent investigations performed at slightly different temperatures, and usually over a lower pressure range, with advanced methods based on statistical thermodynamics.
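The GCMC moves behind such simulations follow standard insertion/deletion acceptance rules. A minimal illustrative sketch (not the authors' code) for the ideal-gas limit, where the interaction-energy change is zero and the particle number should equilibrate to a Poisson distribution with mean zV:

```python
import random

def gcmc_ideal_gas(z_times_V, n_steps, seed=1):
    """Toy GCMC with only insertion/deletion moves and no interactions
    (Delta U = 0).  Acceptance probabilities reduce to zV/(N+1) for
    insertion and N/(zV) for deletion, so the stationary distribution
    of the particle number N is Poisson with mean zV."""
    random.seed(seed)
    N, total = 0, 0
    for _ in range(n_steps):
        if random.random() < 0.5:                      # attempt insertion
            if random.random() < min(1.0, z_times_V / (N + 1)):
                N += 1
        elif N > 0:                                    # attempt deletion
            if random.random() < min(1.0, N / z_times_V):
                N -= 1
        total += N
    return total / n_steps

mean_N = gcmc_ideal_gas(z_times_V=5.0, n_steps=200_000)
```

With interactions, the same acceptance ratios simply gain a Boltzmann factor exp(-beta * Delta U), which is where the adsorbent potential of the graphite wall or slit pore enters.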


The buffer allocation problem (BAP) is a well-known difficult problem in the design of production lines. We present a stochastic algorithm for solving the BAP, based on the cross-entropy method, a new paradigm for stochastic optimization. The algorithm involves the following iterative steps: (a) the generation of buffer allocations according to a certain random mechanism, followed by (b) the modification of this mechanism on the basis of cross-entropy minimization. Through various numerical experiments we demonstrate the efficiency of the proposed algorithm and show that the method can quickly generate (near-)optimal buffer allocations for fairly large production lines.
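Steps (a) and (b) can be sketched generically. The following toy places buffers into slots from a probability vector p and re-fits p to the elite samples each round; the objective here is a hypothetical balance-rewarding surrogate, not the paper's production-line throughput model:

```python
import random

def cross_entropy_buffer_alloc(n_buffers=12, n_slots=4, n_samples=200,
                               elite_frac=0.1, n_iters=30, seed=0):
    """Cross-entropy sketch: (a) sample allocations by dropping buffers
    into slots with probabilities p, (b) re-estimate p from the elite
    (best-scoring) allocations.  The score is a toy stand-in that
    rewards balanced allocations."""
    rng = random.Random(seed)
    p = [1.0 / n_slots] * n_slots
    target = n_buffers / n_slots
    for _ in range(n_iters):
        samples = []
        for _ in range(n_samples):
            alloc = [0] * n_slots
            for _ in range(n_buffers):             # one multinomial draw
                r, acc = rng.random(), 0.0
                for i, pi in enumerate(p):
                    acc += pi
                    if r < acc:
                        alloc[i] += 1
                        break
                else:
                    alloc[-1] += 1                 # float round-off guard
            score = -sum((a - target) ** 2 for a in alloc)
            samples.append((score, alloc))
        samples.sort(reverse=True)                 # best allocations first
        elite = [a for _, a in samples[: max(1, int(elite_frac * n_samples))]]
        # cross-entropy update: p becomes the elite mean allocation
        p = [sum(a[i] for a in elite) / (len(elite) * n_buffers)
             for i in range(n_slots)]
    return samples[0][1]

best = cross_entropy_buffer_alloc()
```

In the real BAP the score would come from simulating or analytically evaluating line throughput for each sampled allocation, which is where the bulk of the computation lies.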


In this paper we utilise a stochastic address model of broadcast oligopoly markets to analyse the Australian broadcast television market. In particular, we examine the effect of the presence of a single government market participant in this market. An examination of the dynamics of the simulations demonstrates that the presence of a government market participant can simultaneously generate positive outcomes for viewers as well as for other market suppliers. Further examination of simulation dynamics indicates that privatisation of the government market participant results in reduced viewer choice and diversity. We also demonstrate that additional private market participants would not result in significant benefits to viewers.


We present Ehrenfest relations for the high-temperature stochastic Gross-Pitaevskii equation description of a trapped Bose gas, including the effects of growth noise and the energy cutoff. A condition for neglecting the cutoff terms in the Ehrenfest relations is found which is more stringent than the usual validity condition of the truncated Wigner or classical field method, namely that all modes are highly occupied. The condition requires a small overlap of the nonlinear interaction term with the lowest-energy single-particle state of the noncondensate band, and gives a means to constrain dynamical artefacts arising from the energy cutoff in numerical simulations. We apply the formalism to two simple test problems: (i) simulation of the Kohn mode oscillation for a trapped Bose gas at zero temperature, and (ii) computation of the equilibrium properties of a finite-temperature Bose gas within the classical field method. The examples indicate ways to control the effects of the cutoff, and show that there is an optimal choice of plane-wave basis for a given cutoff energy. This basis gives the best reproduction of the single-particle spectrum, the condensate fraction, and the position and momentum densities.


Two stochastic production frontier models are formulated within the generalized production function framework popularized by Zellner and Revankar (Rev. Econ. Stud. 36 (1969) 241) and Zellner and Ryu (J. Appl. Econometrics 13 (1998) 101). This framework is convenient for parsimonious modeling of a production function with returns to scale specified as a function of output. Two alternatives for introducing the stochastic inefficiency term and the stochastic error are considered. In the first, the errors are added to an equation of the form h(log y, theta) = log f(x, beta), where y denotes output, x is a vector of inputs and (theta, beta) are parameters. In the second, the equation h(log y, theta) = log f(x, beta) is solved for log y to yield a solution of the form log y = g[theta, log f(x, beta)], and the errors are added to this equation. The latter alternative is novel, but it is needed to preserve the usual definition of firm efficiency. The two alternative stochastic assumptions are considered in conjunction with two returns-to-scale functions, giving a total of four models. A Bayesian framework for estimating all four models is described. The techniques are applied to USDA state-level data on agricultural output and four inputs. Posterior distributions for all parameters, for firm efficiencies and for the efficiency rankings of firms are obtained. The sensitivity of the results to the returns-to-scale specification and to the stochastic specification is examined. (c) 2004 Elsevier B.V. All rights reserved.
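A plausible rendering of the two error placements, taking the Zellner-Revankar transformation h(log y, theta) = log y + theta*y as the concrete case; the composed-error form v - u is the usual stochastic frontier convention, assumed here rather than quoted from the paper:

```latex
% Two ways of attaching the composed error (v_i symmetric noise,
% u_i >= 0 inefficiency) to the generalized production function.
\begin{align*}
  \text{Model 1:}\quad & h(\log y_i,\theta) = \log f(x_i,\beta) + v_i - u_i,\\
  \text{Model 2:}\quad & \log y_i = g\!\left[\theta,\ \log f(x_i,\beta) + v_i - u_i\right],
\end{align*}
% In the Zellner--Revankar case, h(\log y,\theta) = \log y + \theta y,
% so returns to scale vary with the level of output y.
```

Model 2 places the error on log y directly, which is what preserves the usual exp(-u) interpretation of firm efficiency.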


In a recent study, severe distortions in the proton images of an excised, fixed, human brain in an 11.1 Tesla/40 cm MR instrument were observed, and the effect was modeled on phantom images using a finite difference time domain (FDTD) model. In the present study, we extend these simulations to a complete human head, employing a hybrid FDTD and method of moments (MoM) approach, which provides a validated method for simulating biological samples in coil structures. The effect of fixative on the image distortions is explored. Importantly, temperature distributions within the head are also simulated using a bioheat method based on parameters derived from the electromagnetic simulations. The MoM/FDTD simulations confirm that the transverse magnetic field (B1) from a ReCav resonator exhibits good homogeneity in air but strong inhomogeneity when loaded with the head, with or without fixative. The fixative serves to increase the distortions, but they remain significant in the in vivo simulations. The simulated signal intensity (SI) distribution within the sample confirms that the distortions in the experimental images are caused by the complex interactions of the incident electromagnetic fields with tissue, which is heterogeneous in terms of conductivity and permittivity. The temperature distribution is likewise heterogeneous, raising concerns about hot-spot generation in the sample that may exceed acceptable levels in future in vivo studies. As human imaging at 11.1 T is some time away, simulations are important for predicting potential safety issues as well as for evaluating practical concerns about image quality. Simulation of a whole human head at 11.1 T shows that the wave behavior presents significant engineering challenges for ultra-high-field (UHF) MRI.
Novel strategies will have to be employed in imaging techniques and resonator design for UHF MRI to achieve the theoretical signal-to-noise ratio (SNR) improvements it offers over lower-field systems. (C) 2005 Wiley Periodicals, Inc.


Fuzzy data has grown to be an important factor in data mining. Whenever uncertainty exists, simulation can be used as a model. Simulation is very flexible, although it can involve significant levels of computation. This article discusses fuzzy decision-making using the grey related analysis method. Fuzzy models are expected to better reflect decision-making uncertainty, at some cost in accuracy relative to crisp models. Monte Carlo simulation is used to incorporate experimental levels of uncertainty into the data and to measure the impact of fuzzy decision tree models using categorical data. Results are compared with decision tree models based on crisp continuous data.


Aim: To develop an appropriate dosing strategy for continuous intravenous infusions (CII) of enoxaparin by minimizing the percentage of steady-state anti-Xa concentrations (Css) outside the therapeutic range of 0.5-1.2 IU/ml. Methods: A nonlinear mixed-effects model was developed with NONMEM for 48 adult patients who received CII of enoxaparin, with infusion durations that ranged from 8 to 894 h at rates between 100 and 1600 IU/h. Three hundred and sixty-three anti-Xa concentration measurements were available from patients who received CII. These were combined with 309 anti-Xa concentrations from 35 patients who received subcutaneous enoxaparin. The effects of age, body size, height, sex, creatinine clearance (CrCL) and patient location [intensive care unit (ICU) or general medical unit] on pharmacokinetic (PK) parameters were evaluated. Monte Carlo simulations were used to (i) evaluate covariate effects on Css and (ii) compare the impact of different infusion rates on predicted Css. The best dose was selected based on the highest probability that the achieved Css would lie within the therapeutic range. Results: A two-compartment linear model, with additive and proportional residual error for general medical unit patients and only a proportional error for patients in the ICU, provided the best description of the data. CrCL and weight were found to significantly affect clearance and the volume of distribution of the central compartment, respectively. Simulations suggested that the best doses for patients in the ICU setting were 50 IU/kg per 12 h (4.2 IU/kg/h) if CrCL < 30 ml/min; 60 IU/kg per 12 h (5.0 IU/kg/h) if CrCL was 30-50 ml/min; and 70 IU/kg per 12 h (5.8 IU/kg/h) if CrCL > 50 ml/min. The best doses for patients in the general medical unit were 60 IU/kg per 12 h (5.0 IU/kg/h) if CrCL < 30 ml/min; 70 IU/kg per 12 h (5.8 IU/kg/h) if CrCL was 30-50 ml/min; and 100 IU/kg per 12 h (8.3 IU/kg/h) if CrCL > 50 ml/min. These doses were selected to give equal probabilities of being above or below the therapeutic range and the highest probability that the achieved Css would lie within it. Conclusion: The dose of enoxaparin should be individualized to the patient's renal function and weight. There is some evidence to support slightly lower doses of CII enoxaparin in patients in the ICU setting.
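The dose-selection logic, the probability that Css lands in 0.5-1.2 IU/ml under between-subject variability, can be sketched as follows. The typical clearance value and the 30% CV are illustrative assumptions, not the fitted NONMEM estimates from the study:

```python
import math
import random

def prob_in_range(rate_iu_per_kg_h, cl_typical=6.25, bsv_cv=0.30,
                  lo=0.5, hi=1.2, n=20_000, seed=42):
    """Monte Carlo over between-subject variability (BSV) in clearance.
    For a continuous infusion, steady state is Css = rate / CL.
    cl_typical and the 30% CV are illustrative, with units chosen so
    that Css comes out in IU/ml for a rate in IU/kg/h."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + bsv_cv ** 2))   # lognormal BSV
    hits = 0
    for _ in range(n):
        cl = cl_typical * math.exp(rng.gauss(0.0, sigma))
        css = rate_iu_per_kg_h / cl
        if lo <= css <= hi:
            hits += 1
    return hits / n

# scan the candidate rates from the study and keep the most reliable one
rates = [4.2, 5.0, 5.8, 8.3]
best_rate = max(rates, key=prob_in_range)
```

The full analysis simulates the two-compartment model with its covariate effects; the one-compartment steady-state shortcut above only illustrates how the in-range probability drives the dose choice.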


Objectives: In this paper, we present a unified electrodynamic heart model that permits simulations of the body surface potentials generated by the heart in motion. The inclusion of motion in the heart model significantly improves the accuracy of the simulated body surface potentials and therefore also of the 12-lead ECG. Methods: The key step is to construct an electromechanical heart model. The cardiac excitation propagation is simulated by an electrical heart model, and the resulting cardiac active forces are used to calculate the ventricular wall motion based on a mechanical model. The relative positions of source and field points change during systole and diastole; these changes can be obtained and then used to calculate the body-surface ECG based on the electrical heart-torso model. Results: An electromechanical biventricular heart model is constructed and a standard 12-lead ECG is simulated. Compared with a simulated ECG based on the static electrical heart model, the simulated ECG based on the dynamic heart model agrees more closely with a clinically recorded ECG, especially for the ST segment and T wave of the V1-V6 leads. For simulation of mild myocardial ischemia, ST segment and T wave changes can be observed in the ECG simulated with the dynamic heart model, while the ST segment and T wave simulated with the static heart model are almost unchanged compared with a normal ECG. Conclusions: This study confirms the importance of the mechanical factor in ECG simulation. The dynamic heart model could provide more accurate ECG simulation, especially for myocardial ischemia or infarction, since the main ECG changes occur in the ST segment and T wave, which correspond to the cardiac systole and diastole phases.


This study was undertaken to develop a simple laboratory-based method for simulating the freezing profiles of beef trim so that their effect on E. coli O157 survival could be better assessed. A commercially available apparatus of the type used for freezing embryos, together with an associated temperature logger and software, was used for this purpose, with a -80 °C freezer as a heat sink. Four typical beef trim freezing profiles, of different starting temperatures or lengths, were selected and modelled as straight lines for ease of manipulation. A further theoretical profile with an extended freezing plateau was also developed. The laboratory-based setup worked well and the modelled freezing profiles fitted closely to the original data. No change in numbers of any of the strains was apparent for the three simulated profiles of different lengths starting at 25 °C. Slight but significant (P < 0.05) decreases in numbers (~0.2 log CFU/g) of all strains were apparent for a profile starting at 12 °C. A theoretical version of this profile, with the freezing plateau phase extended from 11 h to 17 h, resulted in significant (P < 0.05) decreases in numbers (~1.2 log CFU/g) of all strains. The results indicate possible avenues for future research in controlling this pathogen. The method developed in this study proved a useful and cost-effective way of simulating the freezing profiles of beef trim. (c) 2005 Elsevier B.V. All rights reserved.
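Modelling a freezing profile as straight lines amounts to piecewise-linear interpolation between breakpoints, which is easy to reproduce; the breakpoints below are illustrative, not the study's measured profiles:

```python
def piecewise_linear_profile(breakpoints):
    """Return a function T(t) that linearly interpolates between
    (time_h, temp_C) breakpoints, mirroring how the freezing profiles
    were modelled as straight-line segments.  Before the first
    breakpoint the start temperature is held; after the last, the
    final temperature is held."""
    def temp_at(t):
        t0, T0 = breakpoints[0]
        if t <= t0:
            return T0
        for (ta, Ta), (tb, Tb) in zip(breakpoints, breakpoints[1:]):
            if t <= tb:
                return Ta + (Tb - Ta) * (t - ta) / (tb - ta)
        return breakpoints[-1][1]
    return temp_at

# e.g. start at 12 °C, cool to a freezing plateau at -1 °C, then to -20 °C
profile = piecewise_linear_profile([(0, 12), (4, -1), (15, -1), (24, -20)])
```

A programmable-rate freezer (or the embryo-freezing apparatus described) can then be driven to follow temp_at(t) sampled at its control interval.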


Knowledge of the adsorption behavior of coal-bed gases, mainly under supercritical high-pressure conditions, is important for the optimum design of production processes to recover coal-bed methane and to sequester CO2 in coal-beds. Here, as a preliminary study for the CO2 sequestration problem, we compare the two most rigorous adsorption methods based on statistical mechanics, Density Functional Theory (DFT) and Grand Canonical Monte Carlo (GCMC) simulation, for single components and binary mixtures of methane and carbon dioxide in slit-shaped pores ranging from around 0.75 to 7.5 nm in width, at pressures up to 300 bar and temperatures of 308-348 K. For single-component adsorption, the isotherms generated by DFT, especially for CO2, do not match the GCMC calculations well, so GCMC simulation is subsequently used to investigate binary mixture adsorption. For binary adsorption, as pressure increases the selectivity of carbon dioxide relative to methane initially rises to a maximum, then drops before attaining a constant value at pressures above 300 bar. While the selectivity increases with temperature in the initial pressure-sensitive region, the constant high-pressure value is temperature independent. Optimum selectivity at any temperature is attained at a pressure of 90-100 bar at low bulk mole fractions of CO2, decreasing to approximately 35 bar at high bulk mole fractions. (c) 2005 American Institute of Chemical Engineers.
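The selectivity tracked against pressure here is conventionally defined from adsorbed-phase and bulk mole fractions; a one-line implementation, with illustrative numbers rather than the paper's data:

```python
def selectivity(x_ads_co2, y_bulk_co2):
    """Equilibrium selectivity of CO2 over CH4 in a binary mixture:
    S = (x_CO2 / x_CH4)_adsorbed / (y_CO2 / y_CH4)_bulk.
    S > 1 means the pore preferentially adsorbs CO2."""
    x_ch4 = 1.0 - x_ads_co2
    y_ch4 = 1.0 - y_bulk_co2
    return (x_ads_co2 / x_ch4) / (y_bulk_co2 / y_ch4)

# e.g. a pore phase that is 60% CO2 in equilibrium with a 20% CO2 bulk gas
s = selectivity(0.60, 0.20)
```

In a GCMC study the adsorbed-phase fractions come from the average particle numbers of each species at fixed bulk composition, so S can be evaluated at each pressure point of the isotherm.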


Systems biology is based on computational modelling and simulation of large networks of interacting components. Models may be intended to capture processes, mechanisms, components and interactions at different levels of fidelity. Input data are often large and geographically disperse, and may require the computation to be moved to the data, not vice versa. In addition, complex system-level problems require collaboration across institutions and disciplines. Grid computing can offer robust, scalable solutions for distributed data, compute and expertise. We illustrate some of the range of computational and data requirements in systems biology with three case studies: one requiring large computation but small data (orthologue mapping in comparative genomics), a second involving complex terabyte data (the Visible Cell project) and a third that is both computationally and data-intensive (simulations at multiple temporal and spatial scales). Authentication, authorisation and audit systems do not currently scale well and may present bottlenecks for distributed collaboration, particularly where outcomes may be commercialised. Challenges remain in providing lightweight standards to facilitate the penetration of robust, scalable grid-type computing into diverse user communities to meet the evolving demands of systems biology.


Many populations have a negative impact on their habitat, or upon other species in the environment, if their numbers become too large. For this reason they are often managed using some form of control. The objective is to keep numbers at a sustainable level while ensuring survival of the population. Here we present models that allow population management programs to be assessed. Two common control regimes are considered: reduction and suppression. Under the suppression regime the population is maintained close to a particular threshold through near-continuous control, while under the reduction regime control begins once the population reaches a certain threshold and continues until it falls below a lower pre-defined level. We discuss how best to choose the control parameters, and we provide tools that allow population managers to select reduction levels and control rates. Additional tools are provided to assess the effect of different control regimes in terms of population persistence and cost. In particular, we consider the effects of each regime on the probability of extinction and the expected time to extinction, and compare the control methods in terms of the expected total cost of each regime over the life of the population. The usefulness of our results is illustrated with reference to the control of a koala population inhabiting Kangaroo Island, Australia.
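The two regimes can be contrasted in a small discrete-time birth-death simulation. All rates, thresholds and costs below are illustrative placeholders, not the koala-population parameters:

```python
import random

def simulate(regime, n_runs=50, horizon=200, seed=0,
             birth=0.20, death=0.05, upper=60, lower=40,
             control_rate=0.25, cost_per_cull=1.0):
    """Sketch of the two regimes described: 'suppression' culls
    whenever the population N exceeds `upper`; 'reduction' starts
    culling once N reaches `upper` and keeps culling until N falls
    below `lower`.  Returns the estimated extinction probability and
    the mean total control cost over the horizon."""
    rng = random.Random(seed)
    extinct, total_cost = 0, 0.0
    for _ in range(n_runs):
        N, culling, cost = 15, False, 0.0
        for _ in range(horizon):
            if N == 0:
                extinct += 1
                break
            births = sum(rng.random() < birth for _ in range(N))
            deaths = sum(rng.random() < death for _ in range(N))
            N += births - deaths
            if regime == "suppression":
                culling = N > upper
            else:  # reduction regime: hysteresis between the thresholds
                if N > upper:
                    culling = True
                elif N < lower:
                    culling = False
            if culling and N > 0:
                culled = min(N, max(1, int(control_rate * N)))
                N -= culled
                cost += cost_per_cull * culled
        total_cost += cost
    return extinct / n_runs, total_cost / n_runs

p_ext_sup, cost_sup = simulate("suppression")
p_ext_red, cost_red = simulate("reduction")
```

Comparing the extinction probability and mean cost of the two regimes across parameter settings is exactly the kind of assessment the models described here support, albeit with analytic Markov-chain tools rather than crude Monte Carlo.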


The analysis, optimization and design of flotation circuits using models and simulators have improved significantly over the last 15 years. Mineral flotation is now generally better understood through major advances in measuring and modeling the sub-processes within the flotation system. In addition, new and better methods have been derived to represent the floatability of particles as they move around a flotation circuit. A simulator has been developed that combines the effects of all of these sub-processes to predict the metallurgical performance of a flotation circuit. This paper presents an overview of the simulator, JKSimFloat V6.1PLUS, and its use in improving industrial flotation plant performance. The application of the simulator at various operations is discussed, with particular emphasis on the use of JKSimFloat V6.1PLUS in improving flotation circuit performance.
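JKSimFloat's internal models are not reproduced here, but floatability-based circuit prediction of the kind described is commonly built from a first-order model of perfectly mixed cells; a minimal sketch under that assumption:

```python
def bank_recovery(k, tau_per_cell, n_cells):
    """Generic first-order flotation model (not JKSimFloat's actual
    internals): a perfectly mixed cell recovers k*tau/(1 + k*tau) of
    the floatable material reaching it, so a bank of n identical cells
    recovers R = 1 - (1 + k*tau)^(-n), where k is the floatability
    rate constant and tau the residence time per cell."""
    return 1.0 - (1.0 + k * tau_per_cell) ** (-n_cells)

# a fast-floating component (k = 1.2 /min) vs a slow one (k = 0.1 /min)
# through a bank of 5 cells at 2 min residence time each
fast = bank_recovery(1.2, 2.0, 5)
slow = bank_recovery(0.1, 2.0, 5)
```

Representing the feed as a distribution of floatability classes, each with its own k, and propagating each class through the circuit streams is the essence of the simulation approach described.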


The deregulation of the power industry worldwide has delivered efficiency gains to society; meanwhile, the intensity of competition has increased uncertainty and risk for market participants. Consequently, market participants are keen to hedge market risks and maintain a competitive edge, which largely explains the flourishing of electricity derivative markets. In this paper, the authors give a comprehensive review of derivative contract pricing methods and propose a new framework for energy derivative pricing to suit the needs of a deregulated electricity market.
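As a concrete example of the simulation-based pricing methods such reviews cover, a mean-reverting spot-price process is a common building block for electricity derivatives; the parameters here are illustrative and this is not the paper's proposed framework:

```python
import math
import random

def mc_forward_price(s0=40.0, mean_level=50.0, speed=2.0, vol=8.0,
                     T=0.5, n_steps=100, n_paths=5000, seed=7):
    """Monte Carlo sketch: the electricity spot price follows a
    mean-reverting (Ornstein-Uhlenbeck) process,
        dS = speed * (mean_level - S) dt + vol dW,
    simulated with an Euler scheme.  The forward price for delivery
    at T is estimated as the average simulated S_T."""
    rng = random.Random(seed)
    dt = T / n_steps
    sq = math.sqrt(dt)
    total = 0.0
    for _ in range(n_paths):
        s = s0
        for _ in range(n_steps):
            s += speed * (mean_level - s) * dt + vol * sq * rng.gauss(0.0, 1.0)
        total += s
    return total / n_paths

fwd = mc_forward_price()
```

The same machinery extends to options by averaging a discounted payoff of S_T instead of S_T itself; real electricity models add jumps and seasonality on top of the mean reversion shown here.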