992 results for Monte Carlo simulation
Abstract:
A problem in the archaeometric classification of Catalan Renaissance pottery is that the clay supply of the pottery workshops was centrally organized by guilds, so all potters of a single production centre usually produced chemically similar ceramics. Analysis of the glazes, however, typically reveals a large number of inclusions that expose technological differences between individual workshops. The potters used these inclusions to opacify the transparent glaze and to obtain a white background for further decoration. To distinguish the preparation procedures of the individual workshops, the chemical composition of the inclusions and their size in the two-dimensional cut are recorded with a scanning electron microscope. From the size data, a frequency distribution of the apparent diameters is estimated for each sample and type of inclusion. Following an approach by S.D. Wicksell (1925), it is in principle possible to transform the distributions of the apparent 2D diameters back to those of the true three-dimensional bodies. The applicability of this approach and its practical problems are examined using different kinds of kernel density estimation and Monte Carlo tests of the methodology. Finally, it is tested to what extent the resulting frequency distributions can be used to classify the pottery.
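As a rough illustration of the kind of Monte Carlo test described above, the sketch below (not the authors' code; the lognormal "true" diameter distribution and all numbers are assumptions) generates apparent 2D section diameters from known 3D sphere diameters, weighting spheres by their probability of being cut, and smooths the result with a kernel density estimate.

```python
# A minimal sketch of the Wicksell set-up: spheres with known 3D diameters are
# cut by random planes and the apparent 2D diameters are smoothed with a KDE.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Assumed true 3D diameter distribution of the inclusions (illustrative only).
true_d = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=100_000)

# A random plane hits a sphere with probability proportional to its diameter.
weights = true_d / true_d.sum()
hit = rng.choice(true_d, size=50_000, p=weights)

# Distance of the cutting plane from the sphere centre, uniform on [0, D/2].
z = rng.uniform(0.0, hit / 2.0)
apparent_d = np.sqrt(hit**2 - 4.0 * z**2)   # apparent 2D section diameter

# Kernel density estimate of the apparent-diameter distribution,
# as would be obtained from the SEM measurements.
kde = gaussian_kde(apparent_d)
grid = np.linspace(0.0, apparent_d.max(), 200)
print("mode of apparent-diameter distribution:", grid[np.argmax(kde(grid))])
```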
Abstract:
Teicoplanin is frequently administered to treat Gram-positive infections in pediatric patients. However, not enough is known about the pharmacokinetics (PK) of teicoplanin in children to justify the optimal dosing regimen. The aim of this study was to determine the population PK of teicoplanin in children and evaluate the current dosage regimens. A PK hospital-based study was conducted. Current dosage recommendations were used for children up to 16 years of age. Thirty-nine children were recruited. Serum samples were collected at the first dose interval (1, 3, 6, and 24 h) and at steady state. A standard 2-compartment PK model was developed, followed by structural models that incorporated weight. Weight was allowed to affect clearance (CL) using linear and allometric scaling terms. The linear model best accounted for the observed data and was subsequently chosen for Monte Carlo simulations. The PK parameter medians/means (standard deviation [SD]) were as follows: CL, [0.019/0.023 (0.01)] × weight liters/h/kg of body weight; volume, 2.282/4.138 liters (4.14 liters); first-order rate constant from the central to peripheral compartment (Kcp), 0.474/3.876 h(-1) (8.16 h(-1)); and first-order rate constant from peripheral to central compartment (Kpc), 0.292/3.994 h(-1) (8.93 h(-1)). The percentage of patients with a minimum concentration of drug in serum (Cmin) of <10 mg/liter was 53.85%. The median/mean (SD) total population area under the concentration-time curve (AUC) was 619/527.05 mg · h/liter (166.03 mg · h/liter). Based on Monte Carlo simulations, only 30.04% (median AUC, 507.04 mg · h/liter), 44.88% (494.1 mg · h/liter), and 60.54% (452.03 mg · h/liter) of patients weighing 50, 25, and 10 kg, respectively, attained trough concentrations of >10 mg/liter by day 4 of treatment. The teicoplanin population PK is highly variable in children, with a wider AUC distribution spread than for adults. Therapeutic drug monitoring should be a routine requirement to minimize suboptimal concentrations. (This trial has been registered in the European Clinical Trials Database Registry [EudraCT] under registration number 2012-005738-12.).
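A minimal sketch of the Monte Carlo dosing step, assuming clearance scales linearly with weight as in the abstract; the dose, the between-subject variability, and the AUC target are illustrative placeholders rather than the study's values, and the steady-state AUC is computed simply as daily dose divided by clearance.

```python
# Illustrative Monte Carlo dosing simulation (not the study's code).
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
weight = 25.0                       # kg, one simulated weight band
daily_dose = 10.0 * weight          # mg/day, assumed maintenance dose (10 mg/kg)

# Population clearance: CL = theta * weight, with log-normal between-subject variability.
theta_cl = 0.023                    # L/h/kg (population mean reported in the abstract)
cl = theta_cl * weight * rng.lognormal(mean=0.0, sigma=0.4, size=n)  # L/h

# At steady state, AUC over one day equals daily dose / clearance.
auc24 = daily_dose / cl             # mg·h/L

target = 400.0                      # assumed AUC target, mg·h/L
print(f"P(AUC24 >= target) = {(auc24 >= target).mean():.2%}")
```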
Abstract:
Fertilizers are important for global agricultural production because of the productivity gains they provide. In this article, real options theory is used to value the option to switch the final product, ammonia or urea, in a nitrogen fertilizer plant. The Monte Carlo simulation method was used to determine the value of the switching option in the fertilizer plant, considering the uncertainties in the prices of natural gas (the main feedstock), ammonia, and urea, which were assumed to follow a mean-reverting process. The results indicate that this option is relevant in the analysis of fertilizer plant projects, and in many cases its consideration can be decisive for the viability of the project.
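To make the valuation idea concrete, the sketch below simulates mean-reverting (Ornstein-Uhlenbeck) log-prices for natural gas, ammonia, and urea and estimates the value of the product-switching flexibility by Monte Carlo; all parameter values, margins, and consumption factors are placeholders, not the paper's calibration.

```python
# Illustrative valuation of the switch option under mean-reverting prices.
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_years, dt = 20_000, 10, 1.0
r = 0.08                                   # assumed discount rate

def mean_reverting_paths(x0, x_bar, kappa, sigma):
    """Simulate price paths whose log follows dx = kappa*(log x_bar - x)dt + sigma dW."""
    x = np.full(n_paths, np.log(x0))
    paths = []
    for _ in range(n_years):
        x = x + kappa * (np.log(x_bar) - x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        paths.append(np.exp(x))
    return np.array(paths)                 # shape (n_years, n_paths)

gas     = mean_reverting_paths(x0=4.0,   x_bar=4.0,   kappa=0.5, sigma=0.25)
ammonia = mean_reverting_paths(x0=450.0, x_bar=450.0, kappa=0.4, sigma=0.30)
urea    = mean_reverting_paths(x0=350.0, x_bar=350.0, kappa=0.4, sigma=0.30)

# Assumed per-tonne margins; the gas consumption factors are placeholders.
margin_ammonia = ammonia - 30.0 * gas
margin_urea    = urea    - 22.0 * gas

discount = np.exp(-r * dt * np.arange(1, n_years + 1))[:, None]
flexible = (discount * np.maximum(margin_ammonia, margin_urea)).sum(axis=0)
fixed    = (discount * margin_urea).sum(axis=0)

print("switch-option value per tonne of capacity:", (flexible - fixed).mean())
```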
Abstract:
The dynamics of homogeneously heated granular gases which fragment due to particle collisions is analyzed. We introduce a kinetic model which accounts for correlations induced at the grain collisions and analyze both the kinetics and the relevant distribution functions these systems develop. The work combines analytical and numerical studies based on direct simulation Monte Carlo calculations. A broad family of fragmentation probabilities is considered, and its implications for the system kinetics are discussed. We show that generically these driven materials evolve asymptotically into a dynamical scaling regime. If the fragmentation probability tends to a constant, the grain number diverges at a finite time, leading to a shattering singularity. If the fragmentation probability vanishes, then the number of grains grows monotonically as a power law. We consider different homogeneous thermostats and show that the kinetics of these systems depends weakly on both the grain inelasticity and the driving. We observe that fragmentation plays a relevant role in the shape of the velocity distribution of the particles. When the fragmentation is driven by local stochastic events, the large-velocity tail is essentially exponential, independently of the heating frequency and the breaking rule. However, for a Lowe-Andersen thermostat, numerical evidence strongly supports the conjecture that the scaled velocity distribution follows a generalized exponential behavior f(c) ~ exp(−c^n), with n ≈ 1.2, regardless of the fragmentation mechanism.
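A heavily simplified sketch of a DSMC-style loop in the spirit of the abstract, assuming a constant fragmentation probability, a stochastic thermostat, and equal grain masses (mass bookkeeping is omitted); it is only an illustration of the ingredients, not the paper's kinetic model.

```python
# Simplified stochastic collision loop for a heated, fragmenting granular gas.
import numpy as np

rng = np.random.default_rng(7)
v = rng.standard_normal((500, 3))          # grain velocities
restitution, p_frag, heat = 0.9, 0.01, 0.05

for step in range(2_000):
    # stochastic thermostat: small random kicks keep the gas heated
    v += heat * rng.standard_normal(v.shape)

    # pick a random pair and an impact direction, apply the inelastic collision rule
    i, j = rng.choice(len(v), size=2, replace=False)
    n = rng.standard_normal(3)
    n /= np.linalg.norm(n)
    dv = np.dot(v[i] - v[j], n)
    if dv > 0:                              # grains approaching along n
        impulse = 0.5 * (1.0 + restitution) * dv * n
        v[i] -= impulse
        v[j] += impulse

        # fragmentation: with an assumed constant probability, grain i splits in two,
        # the fragments sharing its velocity
        if rng.random() < p_frag:
            v = np.vstack([v, v[i][None, :]])

print("final number of grains:", len(v))
```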
Abstract:
Report for the scientific sojourn carried out at Massachusetts General Hospital Cancer Center-Harvard Medical School, United States, from 2010 to 2011. The project aims to study the aggregation behavior of amphiphilic molecules in the continuous phase of highly concentrated emulsions, which can be used as templates for the synthesis of meso/macroporous materials. At this stage of the project, we have investigated the self-assembly of diblock and triblock surfactants in a confined geometry, surrounded by the droplets of the dispersed phase. These droplets limit the growth of the aggregates, deeply modify their orientation, and hence alter their spatial arrangement compared to the self-assembly taking place far enough from any boundary surface, that is, in the bulk. By performing Monte Carlo simulations, we have shown that the interface between the dispersed and continuous phases, as well as its shape, has a significant impact on the structural order of the resulting aggregates and hence on the potential applications of highly concentrated emulsions as reaction media, drug delivery systems, or templates for meso/macroporous materials. Due to the combined effect of symmetry breaking and morphological frustration, very intriguing structures, such as square columnar liquid crystals, twisted X-shaped aggregates, and helical phases of cylindrical aggregates, never observed in the bulk for the same model surfactant, have been found. The presence of other more conventional structures, such as micelles and cubic and hexagonal liquid crystals, formed at low and high amphiphilic concentrations, respectively, further enhances the interest in this already rich aggregation behavior.
Abstract:
PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice which relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses, since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of number of decays to number of cells, N(r): 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and an absorbed dose attributed to each cell equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution with a width equal to the statistical uncertainty consistent with the ratio of decays to cells, i.e., equal to N(r)^(-1/2). From dose volume histograms, the surviving fraction of cells, equivalent uniform dose (EUD), and TCP for the different scenarios were calculated. Comparably sized spherical models containing individual spherical cells (15 µm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same previous scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values were comparable to within 2% between the adjusted simple sphere and full cellular models. Additionally, models were generated for a nonuniform distribution of activity, and the adjusted spherical and cellular models showed similar agreement. The TCP values predicted for macroscopic tumors were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
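A minimal sketch of the dose-adjustment and TCP step described in METHODS, with placeholder bin doses, cell counts, N(r), and radiosensitivity, and a simple exponential cell-kill plus Poisson TCP model standing in for the authors' dose-volume-histogram analysis.

```python
# Per-cell dose = bin-average dose plus a Gaussian fluctuation of relative width N(r)^(-1/2).
import numpy as np

rng = np.random.default_rng(3)

bin_dose = np.array([48.0, 46.0, 44.0, 42.0, 40.0])    # Gy, assumed average dose per radial bin
cells_per_bin = np.array([4000, 12000, 20000, 28000, 36000])
Nr = 100                                                # decays per cell
alpha = 0.35                                            # Gy^-1, assumed radiosensitivity

cell_dose = np.concatenate([
    d * (1.0 + rng.normal(0.0, Nr**-0.5, size=n)) for d, n in zip(bin_dose, cells_per_bin)
])
cell_dose = np.clip(cell_dose, 0.0, None)

surviving_fraction = np.exp(-alpha * cell_dose)         # simple exponential cell-kill model
expected_survivors = surviving_fraction.sum()
tcp = np.exp(-expected_survivors)                       # Poisson TCP model

print(f"mean surviving fraction = {surviving_fraction.mean():.3e}, TCP = {tcp:.3f}")
```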
Abstract:
The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved here by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of the a posteriori densities of the unknown parameters and data. Unlike in the above-cited paper, wherein one could evaluate the exact multiuser set posterior density, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers based on Sequential Monte Carlo (SMC) methods ("particle filtering"). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
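For readers unfamiliar with Sequential Monte Carlo, the sketch below shows a generic bootstrap particle filter on an assumed scalar state-space model; the paper's receivers run the same propagate-weight-resample cycle, but over random finite sets on hybrid spaces, which is not reproduced here.

```python
# Generic bootstrap particle filter on an assumed toy model:
#   x_t = 0.9 x_{t-1} + w_t,   y_t = x_t + v_t
import numpy as np

rng = np.random.default_rng(5)
n_particles, n_steps = 2_000, 50
sigma_x, sigma_y = 0.5, 1.0

x_true = np.zeros(n_steps)
y = np.zeros(n_steps)
for t in range(1, n_steps):
    x_true[t] = 0.9 * x_true[t - 1] + sigma_x * rng.standard_normal()
    y[t] = x_true[t] + sigma_y * rng.standard_normal()

particles = rng.standard_normal(n_particles)
estimates = []
for t in range(1, n_steps):
    # propagate particles through the state equation
    particles = 0.9 * particles + sigma_x * rng.standard_normal(n_particles)
    # weight by the likelihood of the new observation
    w = np.exp(-0.5 * ((y[t] - particles) / sigma_y) ** 2)
    w /= w.sum()
    # multinomial resampling to avoid weight degeneracy
    particles = particles[rng.choice(n_particles, size=n_particles, p=w)]
    estimates.append(particles.mean())

print("posterior-mean RMSE:", np.sqrt(np.mean((np.array(estimates) - x_true[1:]) ** 2)))
```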
Abstract:
Wireless "MIMO" systems, employing multiple transmit and receive antennas, promise a significant increase in channel capacity, while orthogonal frequency-division multiplexing (OFDM) is attracting a good deal of attention due to its robustness to multipath fading. Thus, the combination of both techniques is an attractive proposition for radio transmission. The goal of this paper is the description and analysis of a novel pilot-aided estimator of multipath block-fading channels. Typical models leading to estimation algorithms assume the number of multipath components and delays to be constant (and often known), while their amplitudes are allowed to vary with time. Our estimator is focused instead on the more realistic assumption that the number of channel taps is also unknown and varies with time following a known probabilistic model. The estimation problem arising from these assumptions is solved using Random-Set Theory (RST), whereby one regards the multipath-channel response as a single set-valued random entity. Within this framework, Bayesian recursive equations determine the evolution with time of the channel estimator. Due to the lack of a closed form for the solution of the Bayesian equations, a (Rao-Blackwellized) particle filter (RBPF) implementation of the channel estimator is advocated. Since the resulting estimator exhibits a complexity which grows exponentially with the number of multipath components, a simplified version is also introduced. Simulation results describing the performance of our channel estimator demonstrate its effectiveness.
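A small sketch of the modeling assumption that the set of channel taps itself is random and time-varying: taps die, survive with slowly fluctuating amplitudes, or are born according to simple Markov rates. The rates and amplitudes are placeholders, and the random-set estimator itself is not shown.

```python
# Toy birth/death model for a multipath channel with an unknown, time-varying number of taps.
import numpy as np

rng = np.random.default_rng(9)
p_death, birth_rate, n_blocks, max_delay = 0.05, 0.1, 20, 16

taps = {2: 1.0 + 0.2j, 7: -0.5 + 0.4j}          # delay -> complex amplitude (assumed start)
for block in range(n_blocks):
    # existing taps may disappear, and surviving amplitudes fluctuate slowly
    taps = {d: a * (1 + 0.05 * (rng.standard_normal() + 1j * rng.standard_normal()))
            for d, a in taps.items() if rng.random() > p_death}
    # new taps are born at random unused delays
    if rng.random() < birth_rate:
        free = [d for d in range(max_delay) if d not in taps]
        if free:
            d = int(rng.choice(free))
            taps[d] = 0.3 * (rng.standard_normal() + 1j * rng.standard_normal())
    print(f"block {block}: {len(taps)} active taps at delays {sorted(taps)}")
```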
Abstract:
Monte Carlo simulations were carried out to study the response of a thyroid monitor for measuring intake activities of (125)I and (131)I. The aim of the study was threefold: to cross-validate the Monte Carlo simulation programs, to study the response of the detector using different phantoms, and to study the effects of anatomical variations. Simulations were performed using the Swiss reference phantom and several voxelised phantoms. Determining the position of the thyroid is crucial for an accurate assessment of radiological risks. The detector response using the Swiss reference phantom was in fairly good agreement with the response obtained using adult voxelised phantoms for (131)I, but should be revised for a better calibration for (125)I and for any measurements taken on paediatric patients.
Abstract:
The utility of sequencing a second highly variable locus in addition to the spa gene (e.g., double-locus sequence typing [DLST]) was investigated to overcome limitations of a Staphylococcus aureus single-locus typing method. Although adding a second locus seemed to increase discriminatory power, it was not sufficient to definitively infer evolutionary relationships within a single multilocus sequence type (ST-5).
Abstract:
Despite the fact that in living cells DNA molecules are long and highly crowded, they are rarely knotted. DNA knotting interferes with the normal functioning of the DNA and, therefore, molecular mechanisms evolved that maintain the knotting and catenation level below that which would be achieved if the DNA segments could pass randomly through each other. Biochemical experiments with torsionally relaxed DNA demonstrated earlier that type II DNA topoisomerases that permit inter- and intramolecular passages between segments of DNA molecules use the energy of ATP hydrolysis to select passages that lead to unknotting rather than to the formation of knots. Using numerical simulations, we identify here another mechanism by which topoisomerases can keep the knotting level low. We observe that DNA supercoiling, such as found in bacterial cells, creates a situation where intramolecular passages leading to knotting are opposed by the free-energy change connected to transitions from unknotted to knotted circular DNA molecules.
Abstract:
This paper proposes a method to conduct inference in panel VAR models with cross-unit interdependencies and time variations in the coefficients. The approach can be used to obtain multi-unit forecasts and leading indicators and to conduct policy analysis in multi-unit setups. The framework of analysis is Bayesian, and MCMC methods are used to estimate the posterior distribution of the features of interest. The model is reparametrized to resemble an observable index model, and specification searches are discussed. As an example, we construct leading indicators for inflation and GDP growth in the Euro area using G-7 information.
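As a toy illustration of the MCMC estimation principle (not the paper's Bayesian panel VAR with time-varying coefficients), the sketch below runs a random-walk Metropolis sampler for the posterior of a single AR(1) coefficient under a Gaussian prior.

```python
# Random-walk Metropolis for the posterior of an AR(1) coefficient (illustration only).
import numpy as np

rng = np.random.default_rng(11)

# Simulated data from y_t = rho*y_{t-1} + e_t (assumed example).
rho_true, sigma = 0.6, 1.0
y = np.zeros(300)
for t in range(1, len(y)):
    y[t] = rho_true * y[t - 1] + sigma * rng.standard_normal()

def log_posterior(rho):
    resid = y[1:] - rho * y[:-1]
    log_lik = -0.5 * np.sum(resid**2) / sigma**2
    log_prior = -0.5 * rho**2          # N(0, 1) prior on rho
    return log_lik + log_prior

draws, rho = [], 0.0
for _ in range(20_000):
    proposal = rho + 0.05 * rng.standard_normal()
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(rho):
        rho = proposal
    draws.append(rho)

posterior = np.array(draws[5_000:])    # discard burn-in
print(f"posterior mean {posterior.mean():.3f}, 90% interval "
      f"[{np.quantile(posterior, 0.05):.3f}, {np.quantile(posterior, 0.95):.3f}]")
```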
Abstract:
Using microdata from the 2000 census and administrative data on education, and drawing on (1) scenarios for demographic, educational, and economic-activity trends and (2) a microsimulation model, selected demographic and socio-economic characteristics and behaviours of the population of Cape Verde were projected for the period 2000 to 2025, in particular those related to changes in activity status. According to the most plausible scenario, by 2025 the country will be at an advanced stage of the second phase of its demographic transition. Its population will continue to grow because of its relatively young age structure. Although net migration tends toward zero and mortality tends to stabilize (around 5 to 7 deaths per 1,000 inhabitants per year), this growth will proceed at a slower pace (about 1.8% per year) than during the 1990-2000 decade, despite the decline in fertility. From 2000 to 2025, the country could also see an increase of 26% to 29%, depending on the scenario, in the number of people aged 15 to 24, that is, those who will enter the labour market over the period. Depending on the scenario, the number of these young people without a secondary-school diploma could be 30% to 44% higher in 2025 than in 2000. The number of people in this age group with a secondary-school diploma or higher could increase eleven- to thirteen-fold by 2025.