36 results for Monte Carlo methods


Relevance:

100.00%

Publisher:

Abstract:

A new addition to the family of single-molecule magnets is reported: an Fete cage stabilized with benzoate and pyridonate ligands. Monte Carlo methods have been used to derive exchange parameters within the cage, and hence model susceptibility behavior.
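
No code accompanies the abstract; the sketch below shows the generic classical Metropolis Monte Carlo scheme typically used to model the susceptibility of an exchange-coupled spin cage. The ring topology, spin count and exchange constant are illustrative assumptions, not the paper's cage.

```python
import numpy as np

def random_unit_vectors(n, rng):
    """Uniform random points on the unit sphere (classical spin directions)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def susceptibility(J=-1.0, n_spins=8, temps=np.linspace(0.5, 10, 20),
                   n_sweeps=2000, seed=0):
    """Metropolis Monte Carlo for a classical Heisenberg ring,
    H = -J * sum_i S_i . S_{i+1} (illustrative topology).
    Returns chi(T) = (<M^2> - <M>^2) / T at each temperature."""
    rng = np.random.default_rng(seed)
    chi = []
    for T in temps:
        spins = random_unit_vectors(n_spins, rng)
        m_sum, m2_sum, count = 0.0, 0.0, 0
        for sweep in range(n_sweeps):
            for i in range(n_spins):
                new = random_unit_vectors(1, rng)[0]
                nbr = spins[(i - 1) % n_spins] + spins[(i + 1) % n_spins]
                dE = -J * np.dot(new - spins[i], nbr)  # energy change of the move
                if dE <= 0 or rng.random() < np.exp(-dE / T):
                    spins[i] = new
            if sweep >= n_sweeps // 2:                 # discard burn-in half
                m = np.linalg.norm(spins.sum(axis=0))
                m_sum += m; m2_sum += m * m; count += 1
        chi.append((m2_sum / count - (m_sum / count) ** 2) / T)
    return np.array(chi)
```

In practice the exchange parameters J would be varied until the simulated chi(T) curve matches the measured susceptibility.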

Relevance:

100.00%

Publisher:

Abstract:

An important aspect of manufacturing design is the distribution of geometrical tolerances so that an assembly functions with a given probability, while minimising the manufacturing cost. This requires a complex search over a multidimensional domain, much of which corresponds to infeasible solutions and which can contain many local minima. In addition, Monte Carlo methods are often required to determine the probability that the assembly functions as designed. This paper describes a genetic algorithm for carrying out this search and successfully applies it to two specific mechanical designs, enabling comparison of a new statistical tolerancing design method with existing methods.
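
A minimal sketch of the Monte Carlo feasibility check embedded in a GA fitness function, in the spirit of the abstract. The linear stack-up, the sigma = tolerance/3 convention and the reciprocal cost model are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def assembly_yield(tolerances, limit=0.5, n_trials=20_000, seed=1):
    """Monte Carlo estimate of the probability that the assembly functions:
    each dimension varies normally with sigma = tolerance/3 (a common
    convention), and the assembly works if the total stack-up error is
    within the functional limit."""
    rng = np.random.default_rng(seed)
    sigmas = np.asarray(tolerances) / 3.0
    errors = rng.normal(0.0, sigmas, size=(n_trials, len(sigmas))).sum(axis=1)
    return np.mean(np.abs(errors) < limit)

def fitness(tolerances, target_yield=0.99):
    """GA fitness: wide (cheap) tolerances are rewarded, but designs whose
    Monte Carlo yield falls below the target are penalised as infeasible."""
    cost = np.sum(1.0 / np.asarray(tolerances))   # tighter tolerance = dearer
    p = assembly_yield(tolerances)
    penalty = 0.0 if p >= target_yield else 1e6 * (target_yield - p)
    return -(cost + penalty)                      # the GA maximises fitness
```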

Relevance:

100.00%

Publisher:

Abstract:

This paper describes two algorithms for adaptive power and bit allocation in a multiple-input multiple-output multiple-carrier code division multiple access (MIMO MC-CDMA) system. The first is the greedy algorithm, which has already been presented in the literature. The second, proposed by the authors, is based on the Lagrange multiplier method. The performances of the two algorithms are compared via Monte Carlo simulations. At the present stage, the simulations are restricted to a single-user MIMO MC-CDMA system, which is equivalent to a MIMO OFDM system. The system is assumed to operate in a frequency-selective fading environment. The transmitter has partial knowledge of the channel, whose properties are measured at the receiver. The two algorithms yield similar system performance; the advantage of the Lagrange algorithm is that it is much faster than the greedy algorithm.
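
A sketch of the greedy bit-loading idea the abstract refers to (Hughes-Hartogs style). The incremental-power model for QAM and all parameter values are illustrative assumptions, not the paper's system model.

```python
import numpy as np

def greedy_bit_allocation(channel_gains, total_power, max_bits=8):
    """Greedy bit loading: repeatedly give one more bit to the subcarrier
    whose incremental power cost is smallest, until the power budget is
    exhausted.  The incremental power to carry one more bit is modelled
    as proportional to (2^(b+1) - 2^b) / gain, an illustrative QAM model."""
    n = len(channel_gains)
    bits = np.zeros(n, dtype=int)
    power = np.zeros(n)
    used = 0.0
    while True:
        # power needed on each subcarrier to carry one additional bit
        inc = np.where(bits < max_bits,
                       (2.0 ** (bits + 1) - 2.0 ** bits) / channel_gains,
                       np.inf)
        k = int(np.argmin(inc))
        if np.isinf(inc[k]) or used + inc[k] > total_power:
            break
        bits[k] += 1
        power[k] += inc[k]
        used += inc[k]
    return bits, power

# Example: 8 subcarriers of a frequency-selective channel
gains = np.array([1.0, 0.7, 2.1, 0.3, 1.5, 0.9, 0.1, 1.2])
bits, power = greedy_bit_allocation(gains, total_power=20.0)
```

The Lagrange-multiplier alternative replaces this per-bit loop with a closed-form water-filling-type solution, which is why it is faster.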

Relevance:

100.00%

Publisher:

Abstract:

A new methodology is proposed for the analysis of generation capacity investment in a deregulated market environment. The investment appraisal is carried out within a probabilistic framework: the probabilistic production costing (PPC) algorithm is used to compute the expected energy generated, taking into account system load variations and plant forced outage rates, while a Monte Carlo approach models the electricity price variability seen in a realistic network. The model is thereby able to capture the price, and hence profitability, uncertainties facing generator companies. Seasonal variations in electricity prices and system demand are modeled independently. The method is validated on the IEEE RTS, augmented with realistic market and plant data, by using it to compare the financial viability of several generator investments employing either conventional or directly connected generator (powerformer) technologies. The significance of the results is assessed using several financial risk measures.
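
A sketch of the Monte Carlo stage of such an appraisal: the expected energy is treated as given (the output of the PPC computation) and random price paths drive a distribution of net present value, from which risk measures are read. All figures and the normal price model are invented placeholders, not the paper's data.

```python
import numpy as np

def investment_npv_distribution(capex=300e6, expected_energy_mwh=2.5e6,
                                opex_per_mwh=15.0, years=20, rate=0.08,
                                mean_price=40.0, price_vol=8.0,
                                n_scenarios=10_000, seed=2):
    """Monte Carlo appraisal sketch: draw a random electricity price for
    each year of each scenario, convert to yearly operating margin using
    the (PPC-derived) expected energy, discount, and subtract capex."""
    rng = np.random.default_rng(seed)
    prices = rng.normal(mean_price, price_vol, size=(n_scenarios, years))
    margins = (prices - opex_per_mwh) * expected_energy_mwh   # yearly profit
    discount = 1.0 / (1.0 + rate) ** np.arange(1, years + 1)
    return margins @ discount - capex                         # NPV per scenario

npv = investment_npv_distribution()
var_5 = np.percentile(npv, 5)      # a simple financial risk measure (5% VaR)
prob_loss = np.mean(npv < 0.0)     # probability the investment loses money
```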

Relevance:

100.00%

Publisher:

Abstract:

Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data, including systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
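
A minimal sketch of a Monte Carlo uncertainty interval for a cost-per-DALY ratio of the kind reported. The distributions and parameter values are illustrative assumptions, not the study's data.

```python
import numpy as np

def cost_per_daly_interval(n_draws=100_000, seed=3):
    """Monte Carlo uncertainty interval for a cost-effectiveness ratio:
    give cost and DALYs saved each an uncertainty distribution, form the
    ratio draw by draw, and read off the 2.5th/97.5th percentiles."""
    rng = np.random.default_rng(seed)
    cost = rng.normal(5e6, 0.5e6, n_draws)                 # A$, programme cost
    dalys = rng.gamma(shape=100, scale=5.0, size=n_draws)  # DALYs saved
    ratio = cost / dalys                                   # A$ per DALY saved
    lo, mid, hi = np.percentile(ratio, [2.5, 50, 97.5])
    return lo, mid, hi
```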

Relevance:

100.00%

Publisher:

Abstract:

Genetic assignment methods use genotype likelihoods to draw inference about where individuals were or were not born, potentially allowing direct, real-time estimates of dispersal. We used simulated data sets to test the power and accuracy of Monte Carlo resampling methods in generating statistical thresholds for identifying F0 immigrants in populations with ongoing gene flow, and hence for providing direct, real-time estimates of migration rates. The identification of accurate critical values required that resampling methods preserve the linkage disequilibrium deriving from recent generations of immigrants and reflect the sampling variance present in the data set being analysed. A novel Monte Carlo resampling method taking these aspects into account was proposed and its efficiency evaluated. Power and error were relatively insensitive to the frequency assumed for missing alleles. Power to identify F0 immigrants was improved by using large sample sizes (up to about 50 individuals) and by sampling all populations from which migrants may have originated. A combination of plotting genotype likelihoods and calculating mean genotype likelihood ratios (DLR) appeared to be an effective way to predict whether F0 immigrants could be identified for a particular pair of populations using a given set of markers.
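
A toy sketch of the resampling logic: simulate likelihood ratios for known residents, then use their lower alpha-quantile as the critical value for flagging putative F0 immigrants. Independent biallelic loci in Hardy-Weinberg proportions are assumed here; the paper's novel method additionally preserves immigrant-derived linkage disequilibrium, which this toy version does not.

```python
import numpy as np
from scipy.stats import binom

def simulate_resident_ratios(p_home, p_other, n_sim=10_000, seed=4):
    """Simulate multilocus genotype log-likelihood ratios log(L_home/L_other)
    for individuals born in the home population.  p_home / p_other give the
    frequency of allele 'A' at each locus in the two candidate populations."""
    rng = np.random.default_rng(seed)
    p_home, p_other = np.asarray(p_home), np.asarray(p_other)
    geno = rng.binomial(2, p_home, size=(n_sim, len(p_home)))  # 'A' counts
    ll_home = binom.logpmf(geno, 2, p_home).sum(axis=1)
    ll_other = binom.logpmf(geno, 2, p_other).sum(axis=1)
    return ll_home - ll_other

def immigrant_threshold(ratios, alpha=0.01):
    """Critical value: a true resident falls below it with probability alpha,
    so a sampled individual below the threshold is flagged as a putative
    F0 immigrant."""
    return np.quantile(ratios, alpha)
```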

Relevance:

90.00%

Publisher:

Abstract:

The generalized Gibbs sampler (GGS) is a recently developed Markov chain Monte Carlo (MCMC) technique that enables Gibbs-like sampling of state spaces that lack a convenient representation in terms of a fixed coordinate system. This paper describes a new sampler, called the tree sampler, which uses the GGS to sample from a state space consisting of phylogenetic trees. The tree sampler is useful for a wide range of phylogenetic applications, including Bayesian, maximum likelihood, and maximum parsimony methods. A fast new algorithm to search for a maximum parsimony phylogeny is presented, using the tree sampler in the context of simulated annealing. The mathematics underlying the algorithm is explained and its time complexity is analyzed. The method is tested on two large data sets consisting of 123 sequences and 500 sequences, respectively. The new algorithm is shown to compare very favorably in terms of speed and accuracy to the program DNAPARS from the PHYLIP package.
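
A generic simulated-annealing skeleton of the kind the abstract describes; the tree proposal step (in the paper, the GGS tree sampler) is left abstract here as a user-supplied `random_neighbor` function, and the schedule parameters are illustrative.

```python
import math
import random

def simulated_annealing(initial_tree, parsimony_score, random_neighbor,
                        t_start=10.0, t_end=0.01, cooling=0.999, seed=5):
    """Simulated annealing for parsimony search: propose a neighbouring
    tree, always accept improvements, and accept worse trees with a
    Boltzmann probability that shrinks as the temperature cools."""
    rng = random.Random(seed)
    tree, score = initial_tree, parsimony_score(initial_tree)
    best, best_score = tree, score
    t = t_start
    while t > t_end:
        cand = random_neighbor(tree, rng)          # e.g. a GGS tree move
        cand_score = parsimony_score(cand)
        if (cand_score <= score
                or rng.random() < math.exp((score - cand_score) / t)):
            tree, score = cand, cand_score
            if score < best_score:
                best, best_score = tree, score
        t *= cooling                               # geometric cooling schedule
    return best, best_score
```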

Relevance:

90.00%

Publisher:

Abstract:

I shall discuss the quantum and classical dynamics of a class of nonlinear Hamiltonian systems. The discussion will be restricted to systems with one degree of freedom. Such systems cannot exhibit chaos unless the Hamiltonian is time dependent. Thus we shall consider systems with a potential function that has a higher than quadratic dependence on position and, furthermore, we shall allow the potential function to be a periodic function of time. This is the simplest class of Hamiltonian system that can exhibit chaotic dynamics. I shall show how such systems can be realized in atom optics, where very cold atoms interact with the optical dipole potentials of a far-off-resonance laser. Such systems are ideal for quantum chaos studies as (i) the energy of the atom is small and action scales are of the order of Planck's constant, (ii) the systems are almost perfectly isolated from the decohering effects of the environment and (iii) optical methods enable exquisite time-dependent control of the mechanical potentials seen by the atoms.
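
A minimal classical sketch of this system class: a pendulum whose well depth is modulated periodically in time (as for atoms in a modulated far-off-resonance standing wave), integrated with a leapfrog scheme. All parameter values are illustrative.

```python
import numpy as np

def driven_pendulum_trajectory(q0, p0, kappa=1.2, eps=0.3, omega=2.0,
                               dt=0.001, n_steps=200_000):
    """Classical trajectory of H = p^2/2 - kappa*(1 + eps*sin(omega*t))*cos(q),
    a one-degree-of-freedom Hamiltonian with a time-periodic potential.
    Leapfrog (symplectic) integration of dq/dt = p, dp/dt = F(q, t)."""
    q, p = q0, p0
    traj = np.empty((n_steps, 2))
    for n in range(n_steps):
        t = n * dt
        force = -kappa * (1 + eps * np.sin(omega * t)) * np.sin(q)
        p += 0.5 * dt * force                     # half kick
        q += dt * p                               # drift
        force = -kappa * (1 + eps * np.sin(omega * (t + dt))) * np.sin(q)
        p += 0.5 * dt * force                     # half kick
        traj[n] = q, p
    return traj

# Strobing the trajectory once per drive period (a Poincare section) lets
# regular and chaotic regions of phase space be distinguished.
```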

Relevance:

90.00%

Publisher:

Abstract:

We present a method of estimating HIV incidence rates in epidemic situations from data on age-specific prevalence and changes in the overall prevalence over time. The method is applied to women attending antenatal clinics in Hlabisa, a rural district of KwaZulu/Natal, South Africa, where transmission of HIV is overwhelmingly through heterosexual contact. A model which gives age-specific prevalence rates in the presence of a progressing epidemic is fitted to prevalence data for 1998 using maximum likelihood methods and used to derive the age-specific incidence; error estimates are obtained using a Monte Carlo procedure. Although the method is quite general, some simplifying assumptions are made concerning the form of the risk function, and sensitivity analyses are performed to explore the importance of these assumptions. The analysis shows that in 1998 the annual incidence of infection per susceptible woman increased from 5.4 per cent (3.3-8.5 per cent; here and elsewhere ranges give 95 per cent confidence limits) at age 15 years to 24.5 per cent (20.6-29.1 per cent) at age 22 years and declined to 1.3 per cent (0.5-2.9 per cent) at age 50 years. Standardized to a uniform age distribution, the overall incidence per susceptible woman aged 15 to 59 was 11.4 per cent (10.0-13.1 per cent); per woman in the population it was 8.4 per cent (7.3-9.5 per cent). Standardized to the age distribution of the female population, the average incidence per woman was 9.6 per cent (8.4-11.0 per cent); standardized to the age distribution of women attending antenatal clinics, it was 11.3 per cent (9.8-13.3 per cent). The estimated incidence depends on the values used for the epidemic growth rate and the AIDS-related mortality. To ensure that, for this population, errors in these two parameters change the age-specific estimates of the annual incidence by less than the standard deviation of those estimates, the AIDS-related mortality should be known to within +/-50 per cent and the epidemic growth rate to within +/-25 per cent; both conditions are met here. In the absence of cohort studies to measure the incidence of HIV infection directly, useful estimates of the age-specific incidence can be obtained from cross-sectional, age-specific prevalence data together with repeat cross-sectional data on the overall prevalence of HIV infection. Several assumptions were made because of the lack of data, but sensitivity analyses show that they are unlikely to affect the overall estimates significantly. These estimates are important in assessing the magnitude of the public health problem, for designing vaccine trials and for evaluating the impact of interventions.
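
A sketch of the Monte Carlo error procedure in outline: perturb the prevalence data, refit the model, and take percentiles of the derived incidence curves. The model fit itself is abstracted as a `fit_model` callable, and the per-age sample size is an illustrative assumption, not the study's.

```python
import numpy as np

def incidence_with_mc_errors(fit_model, prevalence, ages,
                             n_tested=200, n_boot=1000, seed=6):
    """Monte Carlo error estimates for a derived incidence curve: resample
    the observed age-specific prevalences binomially (as if re-testing
    n_tested women per age group), refit the model each time, and report
    2.5th/97.5th percentiles of the derived incidence.
    `fit_model(prev, ages) -> incidence array` stands in for the paper's
    maximum-likelihood fit."""
    rng = np.random.default_rng(seed)
    draws = []
    for _ in range(n_boot):
        resampled = rng.binomial(n_tested, prevalence) / n_tested
        draws.append(fit_model(resampled, ages))
    lo, hi = np.percentile(np.array(draws), [2.5, 97.5], axis=0)
    return lo, hi
```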

Relevance:

90.00%

Publisher:

Abstract:

In this paper, we propose a new nonlocal density functional theory characterization procedure for nanoporous carbons, the finite wall thickness model, whereby heterogeneity of pore size and pore walls in the carbon is probed simultaneously. We determine the pore size distributions and pore wall thickness distributions of several commercial activated carbons and coal chars, finding good correspondence with X-ray diffraction. It is shown that the conventional infinite wall thickness approach slightly overestimates the pore size. Pore-pore correlation is shown to have a negligible effect on the prediction of pore size and pore wall thickness distributions for small molecules such as the argon used in characterization. By utilizing the structural parameters (pore size and pore wall thickness distributions) in the generalized adsorption isotherm (GAI), we are able to predict the adsorption uptake of supercritical gases in BPL and Norit R1 Extra carbons, in excellent agreement with experimental adsorption uptake data up to 60 MPa. The method offers a useful technique for probing features of the solid skeleton hitherto studied by crystallographic methods.
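
A minimal sketch of inverting a generalized adsorption isotherm for a pore size distribution by non-negative least squares. The paper's finite wall thickness model additionally fits a wall thickness distribution, which this toy version omits.

```python
import numpy as np
from scipy.optimize import nnls

def pore_size_distribution(uptake, kernel):
    """Solve the generalized adsorption isotherm
        N(P_i) = sum_j f(w_j) * rho(P_i, w_j)
    for the pore-size distribution f >= 0 by non-negative least squares.
    `kernel[i, j]` is the model (e.g. DFT) local isotherm: amount adsorbed
    at pressure point i in a pore of width w_j; `uptake[i]` is the measured
    isotherm at the same pressures."""
    f, residual = nnls(kernel, uptake)
    return f, residual

# Usage: the kernel is precomputed from a density functional theory model on
# a grid of pore widths, then f is plotted against that grid as the PSD.
```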

Relevance:

90.00%

Publisher:

Abstract:

This paper presents a method for estimating the posterior probability density of the cointegrating rank of a multivariate error correction model. A second contribution is the careful elicitation of the prior for the cointegrating vectors, derived from a prior on the cointegrating space. This prior obtains naturally from treating the cointegrating space as the parameter of interest in inference, and it overcomes problems previously encountered in Bayesian cointegration analysis. Using this new prior and a Laplace approximation, an estimator for the posterior probability of the rank is given. The approach performs well compared with information criteria in Monte Carlo experiments.
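
A sketch of the two generic ingredients named in the abstract: a Laplace approximation to each candidate rank's marginal likelihood, and normalization into posterior rank probabilities. The interfaces (`log_posterior`, the Hessian at the mode) are placeholders, not the paper's estimator.

```python
import numpy as np

def laplace_log_marginal(log_posterior, theta_hat, hessian_at_mode):
    """Laplace approximation to a log marginal likelihood: expand the log
    posterior kernel around its mode theta_hat,
        log m ~= h(theta_hat) + (d/2) log(2*pi) - (1/2) log|-H|."""
    d = len(theta_hat)
    sign, logdet = np.linalg.slogdet(-hessian_at_mode)  # H negative definite
    return log_posterior(theta_hat) + 0.5 * d * np.log(2 * np.pi) - 0.5 * logdet

def rank_posterior(log_marginals):
    """Posterior probabilities over cointegrating ranks from per-rank log
    marginal likelihoods (a uniform prior over ranks is assumed here)."""
    m = np.asarray(log_marginals)
    w = np.exp(m - m.max())            # subtract max for numerical stability
    return w / w.sum()
```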

Relevance:

90.00%

Publisher:

Abstract:

In this paper we consider the problem of providing standard errors for the component means in normal mixture models fitted to univariate or multivariate data by maximum likelihood via the EM algorithm. Two methods of estimating the standard errors are considered: the standard information-based method and the computationally intensive bootstrap method. They are compared empirically through application to three real data sets and a small-scale Monte Carlo experiment.
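
A minimal sketch of the bootstrap method for these standard errors, using scikit-learn's EM-based GaussianMixture. The resampling scheme and the sort-by-mean fix for label switching are common conventions, not necessarily the paper's exact procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def bootstrap_mean_se(data, n_components=2, n_boot=200, seed=7):
    """Bootstrap standard errors of the component means of a univariate
    normal mixture fitted by EM: refit the mixture to resampled data sets
    and take the standard deviation of the fitted means.  Components are
    sorted by mean so they stay matched across bootstrap replicates."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data).reshape(-1, 1)
    boot_means = []
    for _ in range(n_boot):
        sample = data[rng.integers(0, len(data), size=len(data))]
        gm = GaussianMixture(n_components=n_components,
                             random_state=0).fit(sample)
        boot_means.append(np.sort(gm.means_.ravel()))
    return np.std(boot_means, axis=0, ddof=1)
```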