962 results for Monte-Carlo-Simulation
Abstract:
The present work focuses on the skew-symmetry index as a measure of social reciprocity. This index is based on the correspondence between the amount of behaviour that each individual addresses to its partners and what it receives from them in return. Although the skew-symmetry index enables researchers to describe social groups, statistical inferential tests are required. The main aim of the present study is to propose an overall statistical technique for testing symmetry in experimental conditions, calculating the skew-symmetry statistic (Φ) at group level. Sampling distributions for the skew-symmetry statistic have been estimated by means of a Monte Carlo simulation in order to allow researchers to make statistical decisions. Furthermore, this study will allow researchers to choose the optimal experimental conditions for carrying out their research, as the power of the statistical test has been estimated. This statistical test could be used in experimental social psychology studies in which researchers can control the group size and the number of interactions within dyads.
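The skew-symmetry decomposition behind Φ can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes Φ is the share of the sociomatrix's squared norm carried by its skew-symmetric part, and a null model in which each dyad splits a fixed number of interactions evenly at random; all function names and parameter values are invented for the example.

```python
import numpy as np

def skew_symmetry_phi(x):
    """Assumed form of the statistic: share of the sociomatrix's squared
    norm carried by its skew-symmetric part (0 = full reciprocity,
    0.5 = completely one-directional exchange)."""
    k = (x - x.T) / 2.0          # skew-symmetric part of the matrix
    return (k ** 2).sum() / (x ** 2).sum()

def mc_pvalue(observed_phi, group_size=5, dyad_interactions=20,
              n_samples=10_000, seed=None):
    """Monte Carlo sampling distribution of phi under a null of
    reciprocity: each dyad splits its interactions 50/50 at random."""
    rng = np.random.default_rng(seed)
    phis = np.empty(n_samples)
    for s in range(n_samples):
        x = np.zeros((group_size, group_size))
        for i in range(group_size):
            for j in range(i + 1, group_size):
                ij = rng.binomial(dyad_interactions, 0.5)
                x[i, j], x[j, i] = ij, dyad_interactions - ij
        phis[s] = skew_symmetry_phi(x)
    return float((phis >= observed_phi).mean())
```

A p-value is then obtained by ranking the observed Φ against the simulated distribution for the chosen group size and number of interactions per dyad.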
Abstract:
In the first part of the study, nine estimators of the first-order autoregressive parameter are reviewed and a new estimator is proposed. The relationships and discrepancies between the estimators are discussed in order to achieve a clear differentiation. In the second part of the study, the precision of the estimation of autocorrelation is studied. The performance of the ten lag-one autocorrelation estimators is compared in terms of Mean Square Error (combining bias and variance) using data series generated by Monte Carlo simulation. The results show that there is no single optimal estimator for all conditions, suggesting that the estimator ought to be chosen according to sample size and to the information available on the possible direction of the serial dependence. Additionally, the probability of labelling an actually existing autocorrelation as statistically significant is explored using Monte Carlo sampling. The power estimates obtained are quite similar among the tests associated with the different estimators. These estimates highlight the small probability of detecting autocorrelation in series with fewer than 20 measurement times.
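As a rough illustration of this kind of comparison (not the ten estimators of the study), the sketch below pits the conventional lag-one estimator against a simple bias-adjusted variant and scores each by Mean Square Error over Monte Carlo generated AR(1) series; the adjustment term and all settings are illustrative assumptions.

```python
import numpy as np

def r1_conventional(y):
    """Conventional lag-one autocorrelation estimator."""
    d = y - y.mean()
    return (d[:-1] * d[1:]).sum() / (d ** 2).sum()

def r1_adjusted(y):
    """A simple bias-adjusted variant (r1 + 1/n); it stands in for the
    estimators compared in the study, whose exact forms differ."""
    return r1_conventional(y) + 1.0 / len(y)

def mc_mse(estimator, phi=0.3, n=20, n_series=2_000, seed=0):
    """Mean Square Error of a lag-one estimator over AR(1) series
    generated by Monte Carlo simulation (50 burn-in points)."""
    rng = np.random.default_rng(seed)
    err = np.empty(n_series)
    for s in range(n_series):
        e = rng.standard_normal(n + 50)
        y = np.empty(n + 50)
        y[0] = e[0]
        for t in range(1, n + 50):
            y[t] = phi * y[t - 1] + e[t]
        err[s] = estimator(y[-n:]) - phi      # keep only the last n points
    return float((err ** 2).mean())
```

Repeating `mc_mse` over a grid of `phi` and `n` values reproduces the kind of table from which an estimator would be chosen for a given sample size and expected direction of dependence.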
Abstract:
Compartmental and physiologically based toxicokinetic modeling coupled with Monte Carlo simulation were used to quantify the impact of biological variability (physiological, biochemical, and anatomical parameters) on the values of a series of bio-indicators of exposure to metals and organic industrial chemicals. A variability extent index was computed and the main parameters affecting each biological indicator were identified. The results show a large diversity in interindividual variability across the categories of biological indicators examined. Measurement of the unchanged substance in blood, alveolar air, or urine is much less variable than measurement of metabolites, whether in blood or urine. In most cases, alveolar flow and cardiac output were identified as the prime parameters determining biological variability, suggesting the importance of workload intensity for the absorbed dose of inhaled chemicals.
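The flavour of such a variability analysis can be conveyed with a deliberately simplified model. The sketch below is not the compartmental/PBTK model of the study: it assumes a one-compartment steady state in which a blood biomarker equals respiratory uptake over clearance, with lognormal interindividual spread on alveolar ventilation and clearance, and takes the variability extent index to be the 97.5th/2.5th percentile ratio; every numeric value is invented.

```python
import numpy as np

def variability_extent_index(n=50_000, seed=0):
    """Toy steady-state model of a blood biomarker: concentration equals
    respiratory uptake over metabolic clearance. Lognormal spread on
    ventilation and clearance stands in for interindividual variability;
    the index returned is the 97.5th/2.5th percentile ratio."""
    rng = np.random.default_rng(seed)
    air_conc = 10.0                                   # mg/m3 (fixed exposure)
    ventilation = rng.lognormal(np.log(0.9), 0.2, n)  # alveolar flow, m3/h
    clearance = rng.lognormal(np.log(6.0), 0.3, n)    # clearance, L/h
    biomarker = air_conc * ventilation / clearance    # blood level, mg/L
    lo, hi = np.percentile(biomarker, [2.5, 97.5])
    return float(hi / lo)
```

Because ventilation appears in the numerator, widening its distribution widens the biomarker distribution directly, which mirrors the finding that alveolar flow is a prime determinant of variability.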
Abstract:
Despite the considerable evidence showing that dispersal between habitat patches is often asymmetric, most metapopulation models assume symmetric dispersal. In this paper, we develop a Monte Carlo simulation model to quantify the effect of asymmetric dispersal on metapopulation persistence. Our results suggest that metapopulation extinctions are more likely when dispersal is asymmetric. Metapopulation viability in systems with symmetric dispersal mirrors results from a mean field approximation, where the system persists if the expected per-patch colonization probability exceeds the expected per-patch local extinction rate. For asymmetric cases, the mean field approximation underestimates the number of patches necessary for maintaining population persistence. If a model assuming symmetric dispersal is used when dispersal is actually asymmetric, the estimate of metapopulation persistence is wrong in more than 50% of the cases. Metapopulation viability depends on patch connectivity in symmetric systems, whereas in the asymmetric case the number of patches is more important. These results have important implications for managing spatially structured populations when asymmetric dispersal may occur. Future metapopulation models should account for asymmetric dispersal, while empirical work is needed to quantify the patterns and consequences of asymmetric dispersal in natural metapopulations.
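A minimal patch-occupancy version of such a simulation might look as follows. This is an illustrative sketch rather than the authors' model: dispersal is given as a matrix of per-step colonization probabilities (asymmetry is simply an asymmetric matrix), and the extinction rate, matrix values, and run lengths are all invented.

```python
import numpy as np

def persists(disp, ext=0.2, steps=100, rng=None):
    """One stochastic patch-occupancy run. disp[i, j] is the per-step
    probability that occupied patch i sends a successful colonist to
    patch j (diagonal should be zero); ext is the local extinction rate.
    Returns True if any patch is still occupied after `steps` steps."""
    rng = np.random.default_rng(rng)
    n = disp.shape[0]
    occ = np.ones(n, dtype=bool)
    for _ in range(steps):
        colonized = ((rng.random(disp.shape) < disp) & occ[:, None]).any(axis=0)
        occ = (occ & (rng.random(n) >= ext)) | colonized
        if not occ.any():
            return False
    return True

def persistence_prob(disp, n_runs=200, seed=0, **kw):
    """Monte Carlo estimate of the metapopulation persistence probability."""
    rng = np.random.default_rng(seed)
    return float(np.mean([persists(disp, rng=rng, **kw) for _ in range(n_runs)]))
```

Comparing a symmetric matrix against a strictly one-way (upper-triangular) matrix with the same per-link strength shows the effect directly: in the one-way case the most upstream patch can never be recolonized, and its loss cascades through the chain.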
Abstract:
This study deals with the statistical properties of a randomization test applied to an ABAB design in cases where the desirable random assignment of the points of change in phase is not possible. In order to obtain information about each possible data division we carried out a conditional Monte Carlo simulation with 100,000 samples for each systematically chosen triplet. Robustness and power are studied under several experimental conditions: different autocorrelation levels and different effect sizes, as well as different phase lengths determined by the points of change. Type I error rates were distorted by the presence of autocorrelation for the majority of data divisions. Satisfactory Type II error rates were obtained only for large treatment effects. The relationship between the lengths of the four phases appeared to be an important factor for the robustness and the power of the randomization test.
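An exact, exhaustive analogue of such a randomization test is easy to sketch for short series (the study instead draws 100,000 Monte Carlo samples per data division). The statistic (a B-minus-A mean difference), the minimum phase length, and the data in the usage example below are illustrative assumptions, not the study's design.

```python
import numpy as np
from itertools import combinations

def abab_stat(y, c1, c2, c3):
    """B-minus-A mean difference for change points c1 < c2 < c3
    (phases A1 = y[:c1], B1 = y[c1:c2], A2 = y[c2:c3], B2 = y[c3:])."""
    a = np.concatenate([y[:c1], y[c2:c3]])
    b = np.concatenate([y[c1:c2], y[c3:]])
    return b.mean() - a.mean()

def randomization_pvalue(y, c1, c2, c3, min_len=2):
    """Exhaustive randomization test: the observed |statistic| is ranked
    against the statistic for every admissible change-point triplet."""
    n = len(y)
    obs = abs(abab_stat(y, c1, c2, c3))
    stats = np.array([abs(abab_stat(y, i, j, k))
                      for i, j, k in combinations(range(min_len, n - min_len + 1), 3)
                      if j - i >= min_len and k - j >= min_len])
    return float((stats >= obs - 1e-12).mean())
```

For a 16-point series with a strong treatment effect and change points at 4, 8, and 12, the observed division is the only one of the 165 admissible triplets that attains the maximum statistic, so the p-value is 1/165.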
Abstract:
Intravascular brachytherapy with beta sources has become a useful technique to prevent restenosis after cardiovascular intervention. In particular, the Beta-Cath high-dose-rate system, manufactured by Novoste Corporation, is a commercially available 90Sr/90Y source for intravascular brachytherapy that is achieving widespread use. Its dosimetric characterization has attracted considerable attention in recent years. Unfortunately, the short ranges of the emitted beta particles and the associated large dose gradients make experimental measurements particularly difficult. This circumstance has motivated the appearance of a number of papers addressing the characterization of this source by means of Monte Carlo simulation techniques.
Abstract:
In the present work we focus on two indices that quantify directionality and skew-symmetrical patterns in social interactions as measures of social reciprocity: the directional consistency (DC) and skew-symmetry indices. Although both indices enable researchers to describe social groups, statistical inferential tests are also required in most studies. The main aims of the present study are: firstly, to propose an overall statistical technique for testing null hypotheses regarding social reciprocity in behavioral studies, using the DC and skew-symmetry statistics (Φ) at group level; and secondly, to compare both statistics in order to allow researchers to choose the optimal measure depending on the conditions. In order to allow researchers to make statistical decisions, statistical significance for both statistics has been estimated by means of a Monte Carlo simulation. Furthermore, this study will enable researchers to choose the optimal observational conditions for carrying out their research, as the power of the statistical tests has been estimated.
Abstract:
This study examined the independent effect of skewness and kurtosis on the robustness of the linear mixed model (LMM), with the Kenward-Roger (KR) procedure, when group distributions are different, sample sizes are small, and sphericity cannot be assumed. Methods: A Monte Carlo simulation study considering a split-plot design involving three groups and four repeated measures was performed. Results: The results showed that when group distributions are different, the effect of skewness on KR robustness is greater than that of kurtosis for the corresponding values. Furthermore, the pairings of skewness and kurtosis with group size were found to be relevant variables when applying this procedure. Conclusions: With sample sizes of 45 and 60, KR is a suitable option for analyzing data when the distributions are: (a) mesokurtic and not highly or extremely skewed, and (b) symmetric with different degrees of kurtosis. With total sample sizes of 30, it is adequate when group sizes are equal and the distributions are: (a) mesokurtic and slightly or moderately skewed, and sphericity is assumed; and (b) symmetric with a moderate or high/extreme violation of kurtosis. Alternative analyses should be considered when the distributions are highly or extremely skewed and sample sizes are small.
Abstract:
Pressurized re-entrant (or 4π) ionization chambers (ICs) connected to current-measuring electronics are used for activity measurements of photon-emitting radionuclides and some beta emitters in the fields of metrology and nuclear medicine. As a secondary method, these instruments need to be calibrated with appropriate activity standards from primary or direct standardization. The use of these instruments over 50 years has been well described in numerous publications, such as the Monographie BIPM-4 and the special issue of Metrologia on radionuclide metrology (Ratel 2007 Metrologia 44 S7-16; Schrader 1997 Activity Measurements With Ionization Chambers (Monographie BIPM-4); Schrader 2007 Metrologia 44 S53-66; Cox et al 2007 Measurement Modelling of the International Reference System (SIR) for Gamma-Emitting Radionuclides (Monographie BIPM-7)). The present work describes, in the first part, the principles of activity measurements, calibrations, and impurity corrections using pressurized ionization chambers, and, in the second part, the uncertainty analysis, illustrated with example uncertainty budgets from routine source calibration as well as from an international reference system (SIR) measurement.
Abstract:
Occupational hygiene practitioners typically assess the risk posed by occupational exposure by comparing exposure measurements to regulatory occupational exposure limits (OELs). In most jurisdictions, OELs are only available for exposure by the inhalation pathway. Skin notations are used to indicate substances for which dermal exposure may lead to health effects. However, these notations are either present or absent and provide no indication of acceptable levels of exposure. Furthermore, the methodology and framework for assigning skin notations differ widely across jurisdictions, resulting in inconsistencies in the substances that carry notations. The UPERCUT tool was developed in response to these limitations. It helps occupational health stakeholders to assess the hazard associated with dermal exposure to chemicals. UPERCUT integrates dermal quantitative structure-activity relationships (QSARs) and toxicological data to provide users with a skin hazard index called the dermal hazard ratio (DHR) for the substance and scenario of interest. The DHR is the ratio between the estimated 'received' dose and the 'acceptable' dose. The 'received' dose is estimated using physico-chemical data and information on the exposure scenario provided by the user (body parts exposed and exposure duration), and the 'acceptable' dose is estimated using inhalation OELs and toxicological data. The uncertainty surrounding the DHR is estimated with Monte Carlo simulation. Additional information on the selected substances includes the intrinsic skin permeation potential of the substance and the existence of skin notations. UPERCUT is the only available tool that estimates the absorbed dose and compares this to an acceptable dose. In the absence of dermal OELs, it provides a systematic and simple approach for screening dermal exposure scenarios for 1686 substances.
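The core received-over-acceptable ratio can be illustrated with a toy Monte Carlo. The sketch below is not UPERCUT's model: every parameter (permeation coefficient, exposed area, liquid concentration, duration range, acceptable dose) is an invented placeholder chosen only to show how uncertainty on the inputs propagates into a distribution for the DHR.

```python
import numpy as np

def dhr_summary(n=100_000, seed=0):
    """Toy dermal hazard ratio: received dose (permeation x area x
    concentration x duration) over an acceptable dose, with Monte Carlo
    spread on permeation and duration. Returns the median DHR and the
    probability that DHR exceeds 1 (dose above the acceptable level)."""
    rng = np.random.default_rng(seed)
    kp = rng.lognormal(np.log(1e-3), 0.5, n)   # permeation coefficient, cm/h
    area = 840.0                               # exposed skin (two hands), cm2
    conc = 100.0                               # substance in liquid, mg/cm3
    duration = rng.uniform(0.25, 2.0, n)       # contact time, h
    received = kp * area * conc * duration     # absorbed dose, mg
    acceptable = 50.0                          # 'acceptable' dose, mg
    dhr = received / acceptable
    return float(np.median(dhr)), float((dhr > 1.0).mean())
```

Reporting the exceedance probability alongside a point estimate is what distinguishes this Monte Carlo treatment from a simple deterministic ratio.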
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. Alternatively, the requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis. We examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
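The Internal Model idea, Monte Carlo aggregation of line-of-business results under a copula, can be sketched for a Gaussian copula with two lines. The lognormal marginals and the definition of SCR as the 99.5% quantile of the aggregate loss in excess of its mean are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def scr_internal_model(rho, n_sims=100_000, seed=0):
    """SCR sketch for two lines of business: Monte Carlo of the one-year
    aggregate loss under a Gaussian copula with correlation rho, using
    illustrative lognormal marginal losses. SCR is taken as the 99.5%
    quantile of the aggregate loss in excess of its mean."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n_sims)
    loss = np.exp(0.5 * z[:, 0]) + np.exp(0.8 * z[:, 1])   # aggregate loss
    return float(np.quantile(loss, 0.995) - loss.mean())
```

Varying `rho` (or swapping the Gaussian copula for a tail-dependent one such as a t or Clayton copula) is precisely the sensitivity analysis the abstract describes: stronger dependence fattens the aggregate tail and raises the capital requirement.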
Abstract:
In tendering for long-duration construction contracts, price changes must be anticipated several years ahead, since bids have to be made at fixed prices. Cost forecasting and price risk management are critical factors for the competitiveness of a construction company. The aim of this thesis is to develop, for the Infrastructure Services unit (Infrapalvelut) of YIT Rakennus Oy, an operating model and a tool with which price risks can be managed in tender calculation and in procurement. The solution developed is a cost forecasting model in which the price development of input groups is forecast regularly by expert groups. Introducing the cost forecasting model requires defining the input groups to be forecast. In addition, an expert group must be appointed and the time span of the forecast chosen. The uncertainty contained in the forecasts is brought out with Monte Carlo simulation, and the price risk of a contract can thus be assessed by means of probability distributions and sensitivity analysis. The completed forecasts are used in tender calculation and in choosing tactics and strategies in procurement.
Abstract:
The purpose of this study was to examine the performance of the target organization's procurement process. The aim of the research was to provide the company with information and evaluation criteria that will allow it to develop its capability for more effective assessment of its own performance in the future. The thesis was carried out for the purchasing department of Skanska Oy in Helsinki. The object of the study was the procurement process for term agreements in indirect procurement on the domestic market. The purpose of the centralized term-agreement procurement process is to produce competitive contracts for the company and to achieve better control and transparency of the process. Information on the process under study was collected through interviews and discussion sessions as well as from company documents. The data collection aimed to form a deeper picture of how the process operates, of its problem areas, and of their causes and consequences. A second perspective for evaluating the process was provided by measuring its lead time. The collected material was classified with a model based on failure mode and effects analysis and with a program based on the Monte Carlo simulation method. As a result, suitable development measures and recommended measurement areas are proposed for the process under study.
Abstract:
The ellipticines constitute a broad class of molecules with antitumor activity. In the present work we analyzed the structure and properties of a series of ellipticine derivatives in the gas phase and in solution using quantum mechanical and Monte Carlo methods. The results showed a good correlation between the solvation energies in water obtained with the continuum model and the Monte Carlo simulation. Molecular descriptors were considered in the development of QSAR models using the DNA association constant (log Kapp) as biological data. The results showed that the DNA binding is dominated by electronic parameters, with small contributions from the molecular volume and area.
Abstract:
The most widespread literature on the evaluation of uncertainty (the GUM and the Eurachem guide) does not describe explicitly how to deal with the uncertainty of a concentration obtained from non-linear calibration curves. The objective of this work was to describe and validate a methodology, following the approach of the recent GUM Supplement, for evaluating uncertainty through second-order polynomial models. In the determination of the uncertainty of the concentration of benzatone (C) by chromatography, the measurement uncertainties obtained with the proposed methodology and with Monte Carlo simulation diverge by no more than 0.0005 units, validating the proposed model to one significant digit.
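A GUM Supplement 1-style Monte Carlo propagation for a second-order calibration curve can be sketched as follows; the independent Gaussian inputs and all numeric values in the usage example are illustrative assumptions rather than the validated methodology of the paper.

```python
import numpy as np

def mc_concentration(y_obs, u_y, coef, u_coef, n=200_000, seed=0):
    """GUM Supplement 1-style Monte Carlo: draw the measured response and
    the second-order calibration coefficients (y = a + b*C + c*C**2) as
    independent Gaussians, invert the curve for each draw, and report the
    mean concentration with its standard uncertainty."""
    rng = np.random.default_rng(seed)
    y = rng.normal(y_obs, u_y, n)
    a = rng.normal(coef[0], u_coef[0], n)
    b = rng.normal(coef[1], u_coef[1], n)
    c = rng.normal(coef[2], u_coef[2], n)
    # positive root of c*C**2 + b*C + (a - y) = 0
    conc = (-b + np.sqrt(b ** 2 - 4.0 * c * (a - y))) / (2.0 * c)
    return float(conc.mean()), float(conc.std(ddof=1))
```

For example, with y = 5.0 ± 0.05 and an assumed curve y = C + 0.01 C², the inversion gives a concentration near 4.77 with a standard uncertainty that can be checked directly against the law-of-propagation result, which is the comparison the abstract reports.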