988 results for Monte Carlo algorithms


Relevance:

80.00%

Publisher:

Abstract:

PURPOSE: Few studies compare the variabilities that characterize environmental (EM) and biological monitoring (BM) data. Indeed, comparing their respective variabilities can help to identify the best strategy for evaluating occupational exposure. The objective of this study is to quantify the biological variability associated with 18 bio-indicators currently used in work environments. METHOD: Intra-individual (BV(intra)), inter-individual (BV(inter)), and total biological variability (BV(total)) were quantified using validated physiologically based toxicokinetic (PBTK) models coupled with Monte Carlo simulations. Two environmental exposure profiles with different levels of variability were considered (GSD of 1.5 and 2.0). RESULTS: PBTK models coupled with Monte Carlo simulations were successfully used to predict the biological variability of biological exposure indicators. The predicted values follow a lognormal distribution, characterized by GSD ranging from 1.1 to 2.3. Our results show that there is a link between biological variability and the half-life of bio-indicators, since BV(intra) and BV(total) both decrease as the biological indicator half-lives increase. BV(intra) is always lower than the variability in the air concentrations. On an individual basis, this means that the variability associated with the measurement of biological indicators is always lower than the variability characterizing airborne levels of contaminants. For a group of workers, BM is less variable than EM for bio-indicators with half-lives longer than 10-15 h. CONCLUSION: The variability data obtained in the present study can be useful in the development of BM strategies for exposure assessment and can be used to calculate the number of samples required for guiding industrial hygienists or medical doctors in decision-making.

Biological monitoring of occupational exposure is characterized by important variability, due both to variability in the environment and to biological differences between workers. A quantitative description and understanding of this variability is important for a dependable application of biological monitoring. This work describes this variability, using a toxicokinetic model, for a large range of chemicals for which biological reference values exist. A toxicokinetic compartmental model describing both the parent compound and its metabolites was used. For each chemical, compartments were given physiological meaning. Models were elaborated from physiological, physicochemical, and biochemical data when available, and from half-lives and central-compartment concentrations otherwise. Fourteen chemicals were studied (arsenic, cadmium, carbon monoxide, chromium, cobalt, ethylbenzene, ethylene glycol monomethyl ether, fluorides, lead, mercury, methyl isobutyl ketone, pentachlorophenol, phenol, and toluene), representing 20 biological indicators. Occupational exposures were simulated using Monte Carlo techniques with realistic distributions of both individual physiological parameters and exposure conditions. The resulting biological indicator levels were then analyzed to identify the contributions of environmental and biological variability to total variability. Predicted biological indicator levels agreed well with biological exposure limits for 19 of the 20 indicators. Variability associated with changes in exposure levels (GSD of 1.5 and 2.0) is shown to be mainly influenced by the kinetics of the biological indicator: for indicators with short half-lives (less than 7 hr), biological variability is very similar to environmental variability, whereas for longer half-lives the estimated variability decreases. Thus, with regard to variability, we can conclude that, for the 14 chemicals modeled, biological monitoring would be preferable to air monitoring. [Supplementary materials are available for this article. Go to the publisher's online edition of the Journal of Occupational and Environmental Hygiene for the following free supplemental resource: tables detailing the CBTK models for all 14 chemicals and the symbol nomenclature used.] [Authors]

Recent experiments showed that the linear double-stranded DNA in bacteriophage capsids is both highly knotted and neatly structured. What is the physical basis of this organization? Here we show evidence from stochastic simulation techniques that suggests that a key element is the tendency of contacting DNA strands to order, as in cholesteric liquid crystals. This interaction favors their preferential juxtaposition at a small twist angle, thus promoting an approximately nematic (and apolar) local order. The ordering effect dramatically impacts the geometry and topology of DNA inside phages. Accounting for this local potential allows us to reproduce the main experimental data on DNA organization in phages, including the cryo-EM observations and detailed features of the spectrum of DNA knots formed inside viral capsids. The DNA knots we observe are strongly delocalized and, intriguingly, this is shown not to interfere with genome ejection out of the phage.

Using Monte Carlo simulations and reanalyzing the data of a validation study of the AEIM emotional intelligence test, we demonstrated that an atheoretical approach and the use of weak statistical procedures can result in biased validity estimates. These procedures included stepwise regression (and, more generally, the failure to include important theoretical controls), extreme-scores analysis, and ignoring heteroscedasticity as well as measurement error. The authors of the AEIM test responded by offering more complete information about their analyses, allowing us to further examine the perils of ignoring theory and correct statistical procedures. In this paper we show with extended analyses that the AEIM test is invalid.
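The "failure to include important theoretical controls" problem is easy to reproduce. The sketch below is hypothetical (it does not use the AEIM data): the criterion depends only on an omitted ability variable, yet a test score correlated with that ability appears "valid" when the control is left out of the regression.

```python
import numpy as np

rng = np.random.default_rng(1)

def ols(X, y):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

n, reps = 1000, 200
b_omit, b_ctrl = [], []
for _ in range(reps):
    ability = rng.normal(size=n)                # the omitted theoretical control
    test = 0.6 * ability + rng.normal(size=n)   # hypothetical predictor, loaded on ability
    y = 0.5 * ability + rng.normal(size=n)      # criterion driven by ability alone
    ones = np.ones(n)
    b_omit.append(ols(np.column_stack([ones, test]), y)[1])
    b_ctrl.append(ols(np.column_stack([ones, test, ability]), y)[1])

print(f"mean slope on the test, control omitted:  {np.mean(b_omit):+.3f}")
print(f"mean slope on the test, control included: {np.mean(b_ctrl):+.3f}")
```

The omitted-control regression reports a solidly positive "validity" for a test with zero incremental contribution; including the control drives the slope to zero.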

The activity of radiopharmaceuticals in nuclear medicine is measured before patient injection with radionuclide calibrators. In Switzerland, the general requirements for quality control are defined in a federal ordinance and a directive of the Federal Office of Metrology (METAS), which require each instrument to be verified. A set of three gamma sources (Co-57, Cs-137 and Co-60) is used to verify the response of radionuclide calibrators over the gamma energy range of their use. A beta source, a mixture of Sr-90 and Y-90 in secular equilibrium, is used as well. Manufacturers are responsible for the calibration factors. The main goal of the study was to monitor the validity of the calibration factors using two sources: a Sr-90/Y-90 source and an F-18 source. The three types of commercial radionuclide calibrators tested do not have a calibration factor for the mixture but only for Y-90. Activity measurements of a Sr-90/Y-90 source with the Y-90 calibration factor are therefore corrected for the extra contribution of Sr-90. The correction factor was found to be 1.113, whereas Monte Carlo simulations of the radionuclide calibrators estimate it at 1.117. Measurements with F-18 sources in a specific geometry are also performed. Since this radionuclide is widely used in Swiss hospitals equipped with PET and PET-CT, the metrology of F-18 is very important. The F-18 response normalized to the Cs-137 response shows that the difference from a reference value does not exceed 3% for the three types of radionuclide calibrators.
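The size of such a correction factor follows from simple bookkeeping: in secular equilibrium the source contains equal activities of Sr-90 and Y-90, but an ionization chamber is much less sensitive to the softer Sr-90 betas. The sensitivities below are assumed for illustration only; the study's value of 1.113 was measured, not derived this way.

```python
# Hypothetical chamber sensitivities (signal per unit activity, Y-90 = 1);
# real values depend on the calibrator model and source geometry.
s_y90, s_sr90 = 1.00, 0.11

# Equal 90Sr and 90Y activities in equilibrium: a reading interpreted with
# the Y-90 factor alone over-reads by the ratio below.
correction = (s_y90 + s_sr90) / s_y90
print(f"correction factor = {correction:.3f}")
```

An assumed 11% relative Sr-90 sensitivity already reproduces the order of magnitude of the measured factor.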

In this work we analyze how patchy distributions of CO2 and brine within sand reservoirs may lead to significant attenuation and velocity dispersion effects, which in turn may have a profound impact on surface seismic data. The ultimate goal of this paper is to contribute to the understanding of these processes within the framework of the seismic monitoring of CO2 sequestration, a key strategy for mitigating global warming. We first carry out a Monte Carlo analysis to study the statistical behavior of the attenuation and velocity dispersion of compressional waves traveling through rocks with properties similar to those of the Utsira Sand, Sleipner field, containing quasi-fractal patchy distributions of CO2 and brine. These results show that the mean patch size and the CO2 saturation play key roles in the observed wave-induced fluid flow effects, which can be remarkably important when CO2 concentrations are low and mean patch sizes are relatively large. To analyze these effects on the corresponding surface seismic data, we perform numerical simulations of wave propagation considering reservoir models and CO2 accumulation patterns similar to those of the CO2 injection site in the Sleipner field. These numerical experiments suggest that wave-induced fluid flow effects may produce changes in the reservoir's seismic response, significantly modifying the main seismic attributes usually employed in the characterization of these environments. Consequently, determining the nature of the fluid distributions, as well as properly modeling the seismic data, are important aspects that should not be ignored in the seismic monitoring of CO2 sequestration.

The paper addresses the concept of multicointegration in a panel data framework. The proposal builds upon the panel data cointegration procedures developed in Pedroni (2004), for which we compute the moments of the parametric statistics. When individuals are either cross-section independent, or cross-section dependence can be removed by cross-section demeaning, our approach can be applied to the wider framework of mixed I(2) and I(1) stochastic processes. The paper also deals with the issue of cross-section dependence using approximate common factor models. Finite-sample performance is investigated through Monte Carlo simulations. Finally, we illustrate the use of the procedure by investigating the inventories, sales and production relationship for a panel of US industries.

Empirical studies have shown little evidence to support the presence of all the unit roots implied by the $\Delta_4$ filter in quarterly seasonal time series. This paper analyses the performance of the Hylleberg, Engle, Granger and Yoo (1990) (HEGY) procedure when the roots under the null are not all present. We exploit the Vector of Quarters representation and the cointegration relationships between the quarters when the factors $(1-L)$, $(1+L)$, $(1+L^2)$, $(1-L^2)$ and $(1+L+L^2+L^3)$ are a source of nonstationarity, in order to obtain the distribution of the HEGY tests when the underlying processes have a root at the zero frequency, at the Nyquist frequency, a pair of complex conjugate roots at frequency $\pi/2$, and two combinations of the previous cases. We show, both theoretically and through a Monte Carlo analysis, that the t-ratios $t_{\hat\pi_1}$ and $t_{\hat\pi_2}$ and the F-type tests used in the HEGY procedure have the same distribution as under the null of a seasonal random walk when the corresponding root(s) is/are present, although this is not the case for the t-ratio tests associated with the unit roots at frequency $\pi/2$.
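A bare-bones version of the HEGY auxiliary regression can be sketched to see its behavior under the seasonal random walk null. The code below is a minimal illustration (no deterministic terms, no augmentation lags), not the paper's full testing framework.

```python
import numpy as np

rng = np.random.default_rng(2)

def hegy_t_ratios(y):
    """t-ratios on pi1 (zero frequency) and pi2 (Nyquist frequency) in a
    bare-bones HEGY regression of Delta_4 y_t on the filtered regressors."""
    t = np.arange(4, len(y))
    d4 = y[t] - y[t - 4]
    y1 = y[t - 1] + y[t - 2] + y[t - 3] + y[t - 4]     # (1+L+L^2+L^3) y_{t-1}
    y2 = -(y[t - 1] - y[t - 2] + y[t - 3] - y[t - 4])  # -(1-L+L^2-L^3) y_{t-1}
    y3a = -(y[t - 1] - y[t - 3])                       # -(1-L^2) y_{t-1}
    y3b = -(y[t - 2] - y[t - 4])                       # -(1-L^2) y_{t-2}
    X = np.column_stack([y1, y2, y3a, y3b])
    beta = np.linalg.lstsq(X, d4, rcond=None)[0]
    resid = d4 - X @ beta
    s2 = (resid @ resid) / (len(d4) - 4)
    se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
    return (beta / se)[:2]

# Under the null, y_t = y_{t-4} + eps_t: a seasonal random walk.
t1 = []
for _ in range(200):
    eps = rng.normal(size=200)
    y = np.zeros(200)
    for t in range(4, 200):
        y[t] = y[t - 4] + eps[t]
    t1.append(hegy_t_ratios(y)[0])
print(f"median t-ratio on pi1 under the null: {np.median(t1):+.2f}")
```

The t-ratios follow a nonstandard (Dickey-Fuller-type) distribution rather than a standard normal, which is why the paper's tabulated distributions are needed for inference.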

We work out a semiclassical theory of shot noise in ballistic n+-i-n+ semiconductor structures, aiming to study two fundamental physical correlations arising from the Pauli exclusion principle and the long-range Coulomb interaction. The theory provides a unifying scheme which, in addition to the current-voltage characteristics, describes the suppression of shot noise due to Pauli and Coulomb correlations over the whole range of system parameters and applied bias. The whole scenario is summarized by a phase diagram in the plane of two dimensionless variables related to the sample length and the contact chemical potential. Different regions of physical interest can be identified where only Coulomb or only Pauli correlations are active, or where both are present with different relevance. The predictions of the theory are fully corroborated by Monte Carlo simulations.

We present a theoretical investigation of shot-noise properties in nondegenerate elastic diffusive conductors. Both Monte Carlo simulations and analytical approaches are used. Two interesting phenomena are found: (i) enhanced shot noise for certain energy dependences of the scattering time, and (ii) the recovery of full shot noise at asymptotically high applied bias. The first phenomenon is associated with the onset of negative differential conductivity in energy space, which drives the system towards a dynamical electrical instability, in excellent agreement with analytical predictions. The enhancement is found to be strongly amplified when the dimensionality in momentum space is lowered from three to two dimensions. The second phenomenon is due to the suppression of the effects of long-range Coulomb correlations that takes place when the transit time becomes the shortest time scale in the system, and is common to both elastic and inelastic nondegenerate diffusive conductors. These phenomena shed new light on the anomalous behavior of shot noise in mesoscopic conductors, which is a signature of correlations among different current pulses.

Coulomb suppression of shot noise in a ballistic diode connected to degenerate ideal contacts is analyzed in terms of the correlations taking place between current fluctuations due to carriers injected with different energies. By using Monte Carlo simulations we show that at low frequencies the origin of Coulomb suppression can be traced back to the negative correlations existing between electrons injected with an energy close to that of the potential barrier present in the diode active region and all other carriers injected with higher energies. Correlations between electrons with energy above the potential barrier with the rest of electrons are found to influence significantly the spectra at high frequency in the cutoff region.

We propose an iterative procedure to minimize the sum-of-squares function which avoids the nonlinear nature of estimating the first-order moving average parameter and provides a closed form of the estimator. The asymptotic properties of the method are discussed, and the consistency of the linear least squares estimator is proved for the invertible case. We perform various Monte Carlo experiments in order to compare the sample properties of the linear least squares estimator with its nonlinear counterpart for the conditional and unconditional cases. Some examples are also discussed.
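One way to make MA(1) estimation linear is to alternate between recursive residuals and an OLS slope, as sketched below. This is a hypothetical variant in the spirit of the abstract, not the paper's exact closed-form estimator.

```python
import numpy as np

rng = np.random.default_rng(3)

def ma1_ils(y, iters=25):
    """Iterative linear least-squares estimate of theta in y_t = e_t + theta*e_{t-1}.

    Each pass is linear: (i) rebuild the residuals recursively for the current
    theta, (ii) update theta by an OLS regression of y_t on e_{t-1}.
    A hypothetical sketch, not the paper's exact closed-form estimator."""
    theta = 0.0
    e = np.empty_like(y)
    for _ in range(iters):
        e[0] = y[0]
        for t in range(1, len(y)):
            e[t] = y[t] - theta * e[t - 1]      # invertible recursion for residuals
        theta = (y[1:] @ e[:-1]) / (e[:-1] @ e[:-1])  # OLS slope, closed form
    return theta

# Simulate an invertible MA(1) with theta = 0.5.
eps = rng.normal(size=20_000)
y = eps[1:] + 0.5 * eps[:-1]
print(f"estimated theta = {ma1_ils(y):.3f}")
```

At the fixed point the residuals coincide with the innovations, so the OLS slope converges to theta; the recursion is stable only in the invertible case, matching the consistency condition stated above.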

In groundwater applications, Monte Carlo methods are employed to model the uncertainty in geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and large numbers of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered when estimating the uncertainty. We propose instead to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised that both employ the difference between approximate and exact medoid solutions, but differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which a given realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the MsFV results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques for selecting a subset of realizations.
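The Global Error Model idea can be sketched in a few lines. Everything below is synthetic and illustrative: scalar responses and quantile bins stand in for DKM's kernel-space clustering, and the "exact" model is a made-up linear distortion of the approximate one.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-ins: 'approx' is the cheap proxy response of each realization,
# 'exact' the reference response (a hidden bias the method must correct).
n, k = 500, 10
approx = rng.lognormal(0.0, 0.5, size=n)
exact = 1.15 * approx + 0.3 + rng.normal(0.0, 0.02, size=n)

# Cluster realizations by approximate response (quantile bins stand in for
# DKM's kernel k-medoids) and pick the medoid of each cluster.
edges = np.quantile(approx, np.linspace(0, 1, k + 1))
labels = np.clip(np.digitize(approx, edges[1:-1]), 0, k - 1)
medoids = []
for c in range(k):
    idx = np.where(labels == c)[0]
    medoids.append(idx[np.argmin(np.abs(approx[idx] - np.median(approx[idx])))])
medoids = np.array(medoids)

# Global Error Model: fit one linear map from approximate response to medoid
# error, then correct every realization with it.
err = exact[medoids] - approx[medoids]          # exact model run only k times
A = np.column_stack([np.ones(k), approx[medoids]])
a, b = np.linalg.lstsq(A, err, rcond=None)[0]
corrected = approx + a + b * approx

print(f"bias before correction: {np.mean(exact - approx):+.3f}")
print(f"bias after correction:  {np.mean(exact - corrected):+.3f}")
```

Only k exact runs are needed, yet the linear error map removes essentially all of the proxy model's bias in this toy setting; the Local Error Model would instead apply a per-cluster correction.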