990 results for MONTE CARLO METHOD
Abstract:
Magnetic properties of Fe nanodots are simulated using a scaling technique and the Monte Carlo method, in good agreement with experimental results. For 20-nm-thick dots with diameters larger than 60 nm, magnetization reversal via a vortex state is observed. The role of the magnetic interaction between dots in arrays in the reversal process is studied as a function of the nanometric center-to-center distance. When this distance is more than twice the dot diameter, the interaction can be neglected and the magnetic properties of the entire array are determined by the magnetic configuration of the individual dots. The effect of crystalline anisotropy on the vortex state is also investigated. For arrays of noninteracting dots, the anisotropy strongly affects the vortex nucleation field and coercivity, and only slightly affects the vortex annihilation field.
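As context for how such spin simulations typically proceed, here is a minimal Metropolis sketch for a lattice of classical spins with a uniaxial anisotropy term. It is not the paper's scaled Fe dot model (no dipolar interactions, no dot geometry, no scaling step), and every parameter value is a placeholder:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (placeholders, not the paper's scaled Fe values).
L = 16          # lattice side
J = 1.0         # exchange constant
K = 0.1         # hypothetical uniaxial anisotropy along z
T = 0.5         # temperature in units of J/k_B

# Random unit spins on an L x L lattice, shape (L, L, 3).
spins = rng.normal(size=(L, L, 3))
spins /= np.linalg.norm(spins, axis=2, keepdims=True)

def local_energy(s, i, j):
    """Energy of site (i, j): exchange with 4 neighbors plus anisotropy."""
    nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j] +
          s[i, (j + 1) % L] + s[i, (j - 1) % L])
    return -J * s[i, j] @ nb - K * s[i, j, 2] ** 2

def sweep(s, T):
    """One Metropolis sweep: propose a new random direction per site."""
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        old = s[i, j].copy()
        e_old = local_energy(s, i, j)
        new = rng.normal(size=3)
        s[i, j] = new / np.linalg.norm(new)
        dE = local_energy(s, i, j) - e_old
        if dE > 0 and rng.random() >= np.exp(-dE / T):
            s[i, j] = old  # reject the move

for step in range(200):
    sweep(spins, T)
print("mean magnetization:", np.abs(spins.mean(axis=(0, 1))))
```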
Abstract:
We have included an effective description of squark interactions with charginos/neutralinos in the MadGraph MSSM model. This effective description includes the effective Yukawa couplings and an additional logarithmic term that encodes the supersymmetry breaking. We have tested our implementation extensively by comparing the partial decay widths of squarks into charginos and neutralinos obtained with the FeynArts/FormCalc programs against those from the new model file in MadGraph. We present results for the cross section of top-squark production with subsequent decay into charginos and neutralinos.
Abstract:
We propose an iterative procedure to minimize the sum-of-squares function which avoids the nonlinear nature of estimating the first-order moving average parameter and provides a closed form for the estimator. The asymptotic properties of the method are discussed, and the consistency of the linear least squares estimator is proved for the invertible case. We perform various Monte Carlo experiments in order to compare the sample properties of the linear least squares estimator with its nonlinear counterpart for the conditional and unconditional cases. Some examples are also discussed.
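The abstract does not spell out the linearization, so the sketch below uses a classical two-stage linear least squares scheme of the same flavor (Durbin's method: fit a long AR by OLS, then regress on the lagged residuals) and compares it against a nonlinear conditional sum-of-squares fit in a small Monte Carlo experiment. The choice of Durbin's method and all settings are assumptions, not the paper's procedure:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

def simulate_ma1(theta, n):
    """MA(1) series y_t = e_t + theta * e_{t-1} with unit-variance noise."""
    e = rng.normal(size=n + 1)
    return e[1:] + theta * e[:-1]

def linear_ls(y, p=10):
    """Durbin-style two-stage linear LS: estimate innovations from a long
    AR(p) OLS fit, then regress y_t on the lagged residuals."""
    X = np.column_stack([y[p - 1 - k:-1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    resid = y[p:] - X @ a                     # innovation estimates, t = p..n-1
    return (y[p + 1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])

def css(theta, y):
    """Conditional sum of squares for MA(1), with e_0 = 0."""
    e = np.empty_like(y)
    e[0] = y[0]
    for t in range(1, len(y)):
        e[t] = y[t] - theta * e[t - 1]
    return e @ e

def nonlinear_ls(y):
    return minimize_scalar(css, args=(y,), bounds=(-0.99, 0.99),
                           method="bounded").x

theta, n, reps = 0.5, 200, 500
est = np.array([[linear_ls(y), nonlinear_ls(y)]
                for y in (simulate_ma1(theta, n) for _ in range(reps))])
print("bias (linear, nonlinear):", est.mean(axis=0) - theta)
print("RMSE (linear, nonlinear):", np.sqrt(((est - theta) ** 2).mean(axis=0)))
```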
Abstract:
The diffusion of passive scalars convected by turbulent flows is addressed here. A practical procedure for obtaining stochastic velocity fields with well-defined energy spectrum functions is also presented. Analytical results are derived based on the use of stochastic differential equations, where the basic hypothesis involved is that of rapidly decaying turbulence. These predictions compare favorably with direct computer simulations of stochastic differential equations containing multiplicative space-time correlated noise.
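A common way to realize a stochastic field with a prescribed energy spectrum is Fourier synthesis: draw random phases and set the modal amplitudes to the square root of the target spectrum. The sketch below does this in one dimension with a placeholder spectrum shape; the paper's space-time correlated, multiplicative noise construction is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)

N = 1024
k = np.fft.rfftfreq(N, d=1.0 / N)   # integer wavenumbers 0..N/2
# Model spectrum shape (an assumption, for illustration only).
E = np.where(k > 0, k**4 * np.exp(-(k / 8.0) ** 2), 0.0)

# Random phases, amplitudes set by sqrt of the target spectrum.
phases = rng.uniform(0, 2 * np.pi, size=k.size)
uhat = np.sqrt(E) * np.exp(1j * phases)
u = np.fft.irfft(uhat, n=N)         # real-valued synthetic velocity field

# Check: the recovered spectrum |rfft(u)|^2 matches the target.
rec = np.abs(np.fft.rfft(u)) ** 2
print("target vs recovered at k=8:", E[8], rec[8])
```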
Abstract:
The magnetic structure of the edge-sharing cuprate compound Li2CuO2 has been investigated with highly correlated ab initio electronic structure calculations. The first- and second-neighbor in-chain magnetic interactions are calculated to be 142 and -22 K, respectively. The ratio between the two parameters is smaller than previously suggested in the literature. The interchain interactions are antiferromagnetic in nature and of the order of only a few K. Monte Carlo simulations using the ab initio parameters to define the spin model Hamiltonian yield a Néel temperature in good agreement with experiment. Spin population analysis situates the magnetic moment on the copper and oxygen ions between the completely localized picture derived from experiment and the more delocalized picture based on local-density calculations.
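In the usual convention for edge-sharing cuprates, the quoted couplings define a frustrated J1-J2 Heisenberg chain with weak interchain exchange. Assuming the sign convention H = -Σ J_ij S_i · S_j (so that J1 > 0 is ferromagnetic), the spin model Hamiltonian would read:

```latex
H = -J_1 \sum_{i} \mathbf{S}_i \cdot \mathbf{S}_{i+1}
    - J_2 \sum_{i} \mathbf{S}_i \cdot \mathbf{S}_{i+2}
    - \sum_{\langle i,j \rangle_{\perp}} J_{\perp}\, \mathbf{S}_i \cdot \mathbf{S}_j,
\qquad J_1 = 142\,\mathrm{K},\quad J_2 = -22\,\mathrm{K},\quad
|J_{\perp}| \sim \text{a few K}.
```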
Abstract:
PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice which relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably.
METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of number of decays to number of cells, N(r): 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and an absorbed dose attributed to each cell equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution with a width equal to the statistical uncertainty consistent with the ratio of decays to cells, i.e., equal to N(r)^(-1/2). From dose volume histograms the surviving fraction of cells, equivalent uniform dose (EUD), and TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15 μm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same previous scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers.
RESULTS: The TCP values were comparable to within 2% between the adjusted simple sphere and full cellular models. Additionally, models were generated for a nonuniform distribution of activity, and the results of the adjusted spherical and cellular models showed similar agreement. The TCP values predicted for macroscopic tumors were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice.
CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
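The per-cell dose adjustment described in METHODS is straightforward to reproduce. The sketch below applies a Gaussian spread of relative width N(r)^(-1/2) to placeholder bin doses and then computes a TCP; the linear-quadratic survival parameters and the Poisson TCP formula are assumptions for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical inputs: average absorbed dose per radial bin (Gy) and the
# number of cells per bin; N_r is the decays-per-cell ratio from the text.
bin_dose = np.array([12.0, 11.5, 10.8, 9.6, 7.9])      # placeholder values
cells_per_bin = np.array([500, 1500, 2500, 3500, 4500])
N_r = 50

alpha, beta = 0.3, 0.03   # assumed linear-quadratic parameters (Gy^-1, Gy^-2)

# Per-cell dose: bin average plus a Gaussian adjustment of relative width
# N_r^(-1/2), as described in the abstract.
doses = np.concatenate([
    rng.normal(D, D * N_r ** -0.5, size=n)
    for D, n in zip(bin_dose, cells_per_bin)
])
doses = np.clip(doses, 0, None)

sf = np.exp(-alpha * doses - beta * doses ** 2)  # per-cell survival probability
tcp = np.exp(-sf.sum())                          # Poisson TCP model (assumption)
print(f"expected surviving cells: {sf.sum():.2f}  TCP: {tcp:.3f}")
```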
Abstract:
We formulate a new mixing model to explore the hydrological and chemical conditions under which the stream-catchment interface (SCI) influences the release of reactive solutes into stream water during storms. Physically, the SCI corresponds to the hyporheic/riparian sediments. In the new model this interface is coupled through a bidirectional water exchange to the conventional two-component mixing model. Simulations show that the influence of the SCI on stream solute dynamics during storms is detectable when the runoff event is dominated by the infiltrated groundwater component that flows through the SCI before entering the stream, and when the flux of solutes released from SCI sediments is similar to, or higher than, the solute flux carried by the groundwater. Dissolved organic carbon (DOC) and nitrate data from two small Mediterranean streams obtained during storms are compared to results from simulations using the new model to discern the circumstances under which the SCI is likely to control the dynamics of reactive solutes in streams. The simulations and the comparisons with empirical data suggest that the new mixing model may be especially appropriate for streams in which periodic, or persistent, abrupt changes in the level of riparian groundwater exert hydrologic control on the flux of biologically reactive solutes between the riparian/hyporheic compartment and the stream water.
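The abstract does not give the model equations, so the following is only a guess at the overall structure: a conventional two-component mixing calculation in which the groundwater component partially equilibrates with SCI porewater before entering the stream. The exchange rule and all numbers are hypothetical:

```python
# Minimal sketch of a two-component mixing model extended with a
# stream-catchment interface (SCI) compartment; everything is hypothetical.
def stream_concentration(Q_event, C_event, Q_gw, C_gw, k_sci, C_sci):
    """Groundwater passes through the SCI before entering the stream and
    partially equilibrates (factor k_sci in [0, 1]) with SCI porewater."""
    C_gw_out = C_gw + k_sci * (C_sci - C_gw)   # first-order exchange (assumption)
    return (Q_event * C_event + Q_gw * C_gw_out) / (Q_event + Q_gw)

# Storm snapshot: event water dilute in DOC, SCI porewater DOC-rich.
print(stream_concentration(Q_event=0.2, C_event=1.0,
                           Q_gw=0.8, C_gw=2.0, k_sci=0.6, C_sci=8.0))
```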
Abstract:
This paper examines the statistical analysis of social reciprocity at the group, dyadic, and individual levels. Given that testing statistical hypotheses regarding social reciprocity can also be of interest, a statistical procedure based on Monte Carlo sampling has been developed and implemented in R in order to allow social researchers to describe groups and make statistical decisions.
Abstract:
In the first part of the study, nine estimators of the first-order autoregressive parameter are reviewed and a new estimator is proposed. The relationships and discrepancies between the estimators are discussed in order to achieve a clear differentiation. In the second part of the study, the precision of autocorrelation estimation is examined. The performance of the ten lag-one autocorrelation estimators is compared in terms of mean square error (combining bias and variance) using data series generated by Monte Carlo simulation. The results show that no single estimator is optimal for all conditions, suggesting that the estimator ought to be chosen according to sample size and to the information available on the possible direction of the serial dependence. Additionally, the probability of labelling an actually existing autocorrelation as statistically significant is explored using Monte Carlo sampling. The power estimates obtained are quite similar among the tests associated with the different estimators. These estimates evidence the small probability of detecting autocorrelation in series with fewer than 20 measurement times.
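As an illustration of the Monte Carlo comparison described here, the sketch below estimates the mean square error of two lag-one estimators on simulated AR(1) series: the conventional estimator and a Marriott-Pope-type bias-adjusted variant. These two are stand-ins; the paper's ten estimators are not enumerated in the abstract:

```python
import numpy as np

rng = np.random.default_rng(4)

def r1_conventional(x):
    """Standard lag-one autocorrelation estimator (deviations from the mean)."""
    d = x - x.mean()
    return (d[:-1] * d[1:]).sum() / (d * d).sum()

def r1_bias_adjusted(x):
    """Conventional estimator plus (1 + 3*r1)/n, a Marriott-Pope-type
    small-sample bias adjustment (an assumption, for illustration)."""
    n, r1 = len(x), r1_conventional(x)
    return r1 + (1 + 3 * r1) / n

def mse(estimator, phi, n, reps=5000):
    errs = np.empty(reps)
    for i in range(reps):
        x = np.empty(n)
        x[0] = rng.normal() / np.sqrt(1 - phi ** 2)  # stationary start
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.normal()
        errs[i] = estimator(x) - phi
    return (errs ** 2).mean()

for n in (10, 20, 50):
    print(n, mse(r1_conventional, 0.3, n), mse(r1_bias_adjusted, 0.3, n))
```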
Abstract:
If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB-design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, that inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data-correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide the selection of techniques according to the data characteristics identified by visual inspection is provided.
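One of the four procedures, NAP, has a simple closed definition, so a toy version of the Monte Carlo setup can be sketched: generate AB-design data with AR(1) serial dependence and a level change, then compute NAP as the proportion of (A, B) pairs in which the B-phase value exceeds the A-phase value (ties counted as 0.5). Trend and heteroscedasticity conditions are omitted, and all settings are placeholders:

```python
import numpy as np

rng = np.random.default_rng(5)

def generate_ab(n_a=10, n_b=10, phi=0.3, level_change=2.0):
    """AB-design series: AR(1) errors (serial dependence) plus a level
    change at the phase shift."""
    n = n_a + n_b
    e = np.empty(n)
    e[0] = rng.normal()
    for t in range(1, n):
        e[t] = phi * e[t - 1] + rng.normal()
    y = e + level_change * (np.arange(n) >= n_a)
    return y[:n_a], y[n_a:]

def nap(a, b):
    """Nonoverlap of All Pairs: share of (A, B) pairs with B > A, ties 0.5."""
    diff = b[None, :] - a[:, None]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

vals = [nap(*generate_ab()) for _ in range(1000)]
print("mean NAP with a 2-SD level change:", np.mean(vals))
```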
Abstract:
Compartmental and physiologically based toxicokinetic modeling, coupled with Monte Carlo simulation, was used to quantify the impact of biological variability (physiological, biochemical, and anatomical parameters) on the values of a series of bio-indicators of exposure to metals and organic industrial chemicals. A variability extent index and the main parameters affecting each biological indicator were identified. The results show a large diversity in interindividual variability across the different categories of biological indicators examined. Measurement of the unchanged substance in blood, alveolar air, or urine is much less variable than measurement of metabolites, whether in blood or urine. In most cases, alveolar flow and cardiac output were identified as the prime parameters determining biological variability, suggesting the importance of workload intensity for the absorbed dose of inhaled chemicals.
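A minimal version of this kind of analysis is a one-compartment model with Monte Carlo sampling of the biological parameters. The model form, parameter distributions, and values below are all hypothetical, chosen only to show how an interindividual coefficient of variation for a blood bio-indicator would be obtained:

```python
import numpy as np

rng = np.random.default_rng(6)

# Sketch: variability of a blood biomarker under a one-compartment model
# C(t) = (D / V) * exp(-k t); all parameter distributions are hypothetical.
n = 10_000
V = rng.lognormal(mean=np.log(42.0), sigma=0.2, size=n)  # distribution volume (L)
k = rng.lognormal(mean=np.log(0.1), sigma=0.3, size=n)   # elimination rate (1/h)
D = 100.0                                                # absorbed dose (mg)
t = 16.0                                                 # hours post-exposure

C = (D / V) * np.exp(-k * t)
cv = C.std() / C.mean()   # interindividual variability of the bio-indicator
print(f"geometric mean {np.exp(np.log(C).mean()):.3f} mg/L, CV {cv:.2f}")
```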
Abstract:
This paper examines the statistical analysis of social reciprocity, that is, the balance between addressing and receiving behaviour in social interactions. Specifically, it focuses on the measurement of social reciprocity by means of directionality and skew-symmetry statistics at different levels. Two statistics have been used as overall measures of social reciprocity at the group level: the directional consistency and skew-symmetry statistics. Furthermore, the skew-symmetry statistic allows social researchers to obtain complementary information at the dyadic and individual levels. However, having computed these measures, social researchers may be interested in testing statistical hypotheses regarding social reciprocity. For this reason, a statistical procedure based on Monte Carlo sampling has been developed in order to allow social researchers to describe groups and make statistical decisions.
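The two group-level statistics have standard definitions: the directional consistency index is the summed absolute dyadic asymmetry over the summed dyadic totals, and the skew-symmetry statistic is the squared norm of the skew-symmetric part of the sociomatrix over the squared norm of the matrix itself (bounded by 0.5 for nonnegative matrices). A sketch follows, using a dyad-wise binomial randomization as the Monte Carlo null; the resampling scheme is an assumption, not necessarily the paper's:

```python
import numpy as np

rng = np.random.default_rng(7)

def dc_index(x):
    """Directional consistency: sum|x_ij - x_ji| / sum(x_ij + x_ji), i < j."""
    i, j = np.triu_indices_from(x, k=1)
    return np.abs(x[i, j] - x[j, i]).sum() / (x[i, j] + x[j, i]).sum()

def skew_symmetry(x):
    """Phi = ||K||^2 / ||X||^2, K the skew-symmetric part (0 <= Phi <= 0.5)."""
    k = (x - x.T) / 2.0
    return (k ** 2).sum() / (x.astype(float) ** 2).sum()

def mc_test(x, reps=10_000):
    """Null: each dyad's total is split at random (p = 0.5) between directions."""
    i, j = np.triu_indices_from(x, k=1)
    totals = x[i, j] + x[j, i]
    obs = dc_index(x)
    count = 0
    for _ in range(reps):
        up = rng.binomial(totals, 0.5)
        sim = np.zeros_like(x)
        sim[i, j], sim[j, i] = up, totals - up
        count += dc_index(sim) >= obs
    return obs, (count + 1) / (reps + 1)   # observed DC and Monte Carlo p-value

x = np.array([[0, 5, 1], [2, 0, 4], [6, 3, 0]])  # toy sociomatrix
print("DC, p-value:", mc_test(x), " Phi:", skew_symmetry(x))
```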