944 results for Genotyping uncertainty
Abstract:
Buccal swabs have recently been used as a minimally invasive sampling method in genetic studies of wild populations, including amphibian species. Yet the reliability of microsatellite genotypes obtained from such samples has not been established to date. Allelic dropout and false alleles may affect genotyping derived from buccal samples. Here we quantified the success of microsatellite amplification and the rates of genotyping errors using buccal swabs in two amphibian species, the Alpine newt Triturus alpestris and the Green tree frog Hyla arborea, and we estimated two parameters important for downstream analyses, namely the number of repetitions required to achieve typing reliability and the probability of identity among genotypes. Amplification success was high, and only one of the loci tested required two to three repetitions to achieve reliable genotypes, showing that buccal swabbing is a very efficient approach that allows retrieval of good-quality DNA. This sampling method, which avoids the controversial practice of toe-clipping, will likely prove very useful in the context of amphibian conservation.
Abstract:
Approximate models (proxies) can be employed to reduce the computational cost of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to biased estimates. To avoid this problem and ensure reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal component analysis (FPCA) is used to investigate the variability in the two sets of curves and to reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the proxy response alone. This methodology is purpose-oriented in that the error model is constructed directly for the quantity of interest, rather than for the state of the system. Moreover, the dimensionality reduction performed by FPCA provides a diagnostic of the quality of the error model, assessing the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be used effectively beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
Abstract:
Managers can craft effective integrated strategy by properly assessing regulatory uncertainty. Leveraging the existing political markets literature, we predict regulatory uncertainty from the novel interaction of demand- and supply-side rivalries across a range of political markets. We argue for two primary drivers of regulatory uncertainty: ideology-motivated interests opposed to the firm and a lack of competition for power among political actors supplying public policy. We align three previously disparate dimensions of nonmarket strategy - profile level, coalition breadth, and pivotal target - with levels of regulatory uncertainty. Through this framework, we demonstrate how and when firms employ different nonmarket strategies. To illustrate variation in nonmarket strategy across levels of regulatory uncertainty, we analyze several market entry decisions of foreign firms operating in the global telecommunications sector.
Abstract:
Citriculture involves several environmental risks, such as weather changes and pests, as well as considerable financial risk, mainly due to the long period of return on the initial investment. This study was motivated by the need to assess the risks of a business activity such as citriculture. Our objective was to build a stochastic simulation model for the economic and financial analysis of an orange producer in the Midwest region of the state of Sao Paulo, under conditions of uncertainty. The parameters used were the Net Present Value (NPV), the Modified Internal Rate of Return (MIRR), and the Discounted Payback. To evaluate the risk conditions we built a probabilistic model from pseudorandom numbers generated with the Monte Carlo method. The results showed that the activity analyzed carries a 42.8% risk of a negative NPV; the yield assessed by the MIRR, however, was 7.7%, higher than the yield from the reapplication of the positive cash flows. The financial investment pays for itself after the fourteenth year of activity.
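The kind of Monte Carlo risk analysis the abstract describes can be sketched as follows. All figures (investment, discount rate, cash-flow distribution) are hypothetical placeholders, not the paper's data; the sketch shows how the probability of a negative NPV and the discounted payback are obtained from simulated cash flows.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: an initial investment followed by uncertain
# yearly net cash flows over the project horizon.
n_sims, years = 100_000, 20
investment = 500_000.0
rate = 0.08                                   # assumed discount rate

cash = rng.normal(60_000, 25_000, size=(n_sims, years))
discount = (1 + rate) ** -np.arange(1, years + 1)

# Net Present Value per simulation, and risk of a negative NPV.
npv = cash @ discount - investment
risk = (npv < 0).mean()

# Discounted payback: first year in which the cumulative discounted
# cash flow covers the investment (NaN if never reached).
cum = np.cumsum(cash * discount, axis=1)
paid = cum >= investment
payback = np.where(paid.any(axis=1), paid.argmax(axis=1) + 1, np.nan)
```

The paper's 42.8% figure would correspond to `risk` computed from its own calibrated cash-flow distributions rather than these placeholder values.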
Abstract:
In radionuclide metrology, Monte Carlo (MC) simulation is widely used to compute parameters associated with primary measurements or calibration factors. Although MC methods are used to estimate uncertainties, the uncertainty associated with radiation transport in MC calculations is usually difficult to estimate. Counting statistics is the most obvious component of MC uncertainty and has to be checked carefully, particularly when variance reduction is used. However, in most cases fluctuations associated with counting statistics can be reduced using sufficient computing power. Cross-section data have intrinsic uncertainties that induce correlations when apparently independent codes are compared. Their effect on the uncertainty of the estimated parameter is difficult to determine and varies widely from case to case. Finally, the most significant uncertainty component for radionuclide applications is usually that associated with the detector geometry. Recent 2D and 3D x-ray imaging tools may be utilized, but comparison with experimental data as well as adjustments of parameters are usually inevitable.
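The counting-statistics component singled out above can be estimated by splitting the simulated histories into independent batches, a standard technique; the detection-efficiency value and batch sizes below are illustrative assumptions, not from the cited work.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy check of MC counting statistics: estimate a detection efficiency
# (true value 0.35, hypothetical) from simulated histories split into
# independent batches.
true_eff, n_batches, n_per_batch = 0.35, 50, 10_000
hits = rng.binomial(n_per_batch, true_eff, size=n_batches)
batch_eff = hits / n_per_batch

eff = batch_eff.mean()
# Standard uncertainty of the mean from the batch-to-batch scatter;
# this is the component that shrinks with more computing power.
u_eff = batch_eff.std(ddof=1) / np.sqrt(n_batches)
```

Batch estimates of this kind only capture the statistical component; as the abstract notes, cross-section and geometry uncertainties must be assessed separately.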
Abstract:
This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM Supplement 1, but here we present a more restrictive approach, in which the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and by the calculation of a linear regression on experimental data points. An electronic supplement presents algorithms that may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
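Monte Carlo propagation of input uncertainties through a measurement model, in the spirit of GUM Supplement 1, can be sketched as below. The measurement model A = C / (eps * m) and all numerical values are hypothetical, chosen only to illustrate the estimators of expectation and standard deviation that the chapter describes.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical measurement model: activity per unit mass A = C / (eps * m),
# with each input assigned a distribution representing its uncertainty.
n = 200_000
C   = rng.normal(1.20e4, 0.02e4, n)   # count rate (s^-1)
eps = rng.normal(0.85, 0.01, n)       # detection efficiency (dimensionless)
m   = rng.normal(1.000, 0.002, n)     # sample mass (g)

# Propagate the input distributions through the model.
A = C / (eps * m)                     # activity per unit mass (Bq/g)

# Monte Carlo estimators of the measurand and its standard uncertainty.
A_hat, u_A = A.mean(), A.std(ddof=1)
lo, hi = np.percentile(A, [2.5, 97.5])  # 95 % coverage interval
```

Unlike first-order propagation, the sampled output distribution also yields coverage intervals directly, which is the main practical benefit of the Monte Carlo route.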
Abstract:
This paper discusses basic theoretical strategies for dealing with measurement uncertainties arising from different experimental situations. It attempts to indicate the most appropriate method for obtaining a reliable estimate of the quantity to be evaluated, depending on the characteristics of the data available. The theoretical strategies discussed are supported by experimental detail, and the conditions and results have been taken from examples in the field of radionuclide metrology. Special care regarding the correct treatment of covariances is emphasized because of the unreliability of the results obtained if these are neglected.
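The point about covariances can be made concrete with a minimal sketch: combining two correlated measurements of the same quantity by a generalized least squares (covariance-weighted) mean, versus naively ignoring the covariance. The numbers are hypothetical, not from the paper.

```python
import numpy as np

# Hypothetical example: two measurements of the same quantity that share
# a calibration, so their uncertainties are correlated.
x = np.array([100.0, 104.0])
V = np.array([[4.0, 3.0],    # variances 4 and 9, covariance 3
              [3.0, 9.0]])

ones = np.ones(2)
Vinv = np.linalg.inv(V)

# Generalized least squares (best linear unbiased) mean and its variance.
w = Vinv @ ones / (ones @ Vinv @ ones)
mean = w @ x
var = 1.0 / (ones @ Vinv @ ones)

# Neglecting the covariance uses only the diagonal of V and gives a
# different (and misleadingly evaluated) result.
w0 = (1 / np.diag(V)) / (1 / np.diag(V)).sum()
naive_mean = w0 @ x
```

Here the correlated combination down-weights the second measurement more strongly than the naive inverse-variance weights do, illustrating why neglecting covariances yields unreliable results.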
Abstract:
Pressurized re-entrant (or 4π) ionization chambers (ICs) connected to current-measuring electronics are used for activity measurements of photon-emitting radionuclides and some beta emitters in the fields of metrology and nuclear medicine. As a secondary method, these instruments need to be calibrated with appropriate activity standards from primary or direct standardization. The use of these instruments over 50 years has been well described in numerous publications, such as the Monographie BIPM-4 and the special issue of Metrologia on radionuclide metrology (Ratel 2007 Metrologia 44 S7-16; Schrader 1997 Activity Measurements With Ionization Chambers (Monographie BIPM-4); Schrader 2007 Metrologia 44 S53-66; Cox et al 2007 Measurement Modelling of the International Reference System (SIR) for Gamma-Emitting Radionuclides (Monographie BIPM-7)). The present work describes the principles of activity measurements, calibrations, and impurity corrections using pressurized ionization chambers in the first part, and the uncertainty analysis, illustrated with example uncertainty budgets from routine source calibration as well as from an international reference system (SIR) measurement, in the second part.
Abstract:
The use of the Bayes factor (BF) or likelihood ratio as a metric to assess the probative value of forensic traces is largely supported by operational standards and recommendations in different forensic disciplines. However, progress towards more widespread consensus about foundational principles is still fragile, as it raises new problems about which views differ. It is not uncommon, for example, to encounter scientists who feel the need to compute the probability distribution of a given expression of evidential value (i.e. a BF), or to place intervals or significance probabilities on such a quantity. This article presents arguments to show that such views involve a misconception of principles and an abuse of language. The conclusion of the discussion is that, in a given case at hand, forensic scientists ought to offer a court of justice a single value for the BF, rather than an expression based on a distribution over a range of values.
Abstract:
Traditionally, in the cigarette industry, the determination of ammonium ion in mainstream smoke is performed by ion chromatography. This work studies this determination and compares the results obtained with external and internal standard calibration. A reference cigarette sample presented measurement uncertainties of 2.0 μg/cigarette and 1.5 μg/cigarette with external and internal standards, respectively. The greatest source of uncertainty is the bias correction factor, and it is even more significant when the external standard is used, thus confirming the importance of internal standardization for this correction.
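The way a dominant component such as the bias correction factor drives the combined uncertainty can be sketched with a minimal quadrature budget. The component names and values below are hypothetical illustrations, not the paper's budget.

```python
import math

# Hypothetical uncertainty budget (values illustrative): standard
# uncertainty components combined in quadrature, with the bias
# correction factor as the dominant term, as in the external-standard case.
components = {
    "repeatability":     0.5,   # μg/cigarette
    "calibration curve": 0.6,
    "bias correction":   1.8,   # dominant component
}
u_c = math.sqrt(sum(u**2 for u in components.values()))
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2
```

Because the components add in quadrature, halving the bias-correction term (as internal standardization does here) reduces the combined uncertainty far more than improving any smaller component.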