985 results for Error estimate.
Abstract:
Microsatellite genotyping is a common DNA characterization technique in population, ecological and evolutionary genetics research. Because alleles are sized relative to internal size standards, different laboratories must calibrate and standardize allelic designations when exchanging data. This interchange of microsatellite data can often prove problematic. Here, 16 microsatellite loci were calibrated and standardized for the Atlantic salmon, Salmo salar, across 12 laboratories. Although inconsistencies were observed, particularly due to differences between the migration of DNA fragments and actual allelic size ('size shifts'), inter-laboratory calibration was successful. Standardization also allowed an assessment of the degree and partitioning of genotyping error. Notably, the global allelic error rate was reduced from 0.05 ± 0.01 prior to calibration to 0.01 ± 0.002 post-calibration. Most errors were found to occur during analysis (i.e. when size-calling alleles; the mean proportion of all errors that were analytical errors across loci was 0.58 after calibration). No evidence was found of an association between the degree of error and the allelic size range of a locus, the number of alleles, or the repeat type, nor was there evidence that genotyping errors were more prevalent when a laboratory analyzed samples from outside its usual geographic area. The microsatellite calibration between laboratories presented here will be especially important for genetic assignment of marine-caught Atlantic salmon, enabling analysis of marine mortality, a major factor in the observed declines of this highly valued species.
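The reported error rates can be illustrated with a minimal sketch of how an allelic error rate is typically computed from duplicate genotype calls: alleles called for the same individual and locus by two laboratories are compared, and the error rate is the fraction of allele calls that disagree. This is a generic illustration, not the study's pipeline; the genotype tables and locus name below are hypothetical.

```python
def allelic_error_rate(calls_a, calls_b):
    """Fraction of allele calls that disagree between two laboratories.

    calls_a, calls_b: dicts mapping (individual, locus) -> tuple of the two
    allele sizes in bp. Only keys present in both dicts are compared.
    """
    compared = errors = 0
    for key in calls_a.keys() & calls_b.keys():
        for a, b in zip(sorted(calls_a[key]), sorted(calls_b[key])):
            compared += 1
            if a != b:
                errors += 1
    return errors / compared if compared else float("nan")

# Hypothetical duplicate genotype calls for one example locus.
lab_1 = {("fish01", "locusA"): (106, 118), ("fish02", "locusA"): (110, 110)}
lab_2 = {("fish01", "locusA"): (106, 118), ("fish02", "locusA"): (110, 112)}
print(allelic_error_rate(lab_1, lab_2))  # 0.25: 1 of 4 allele calls disagrees
```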
Abstract:
We analyze the effect of a quantum error correcting code on the entanglement of encoded logical qubits in the presence of a dephasing interaction with a correlated environment. Such a correlated reservoir introduces entanglement between the physical qubits. We show that for short times the quantum error correction interprets this entanglement as errors and suppresses it. However, at longer times, although quantum error correction is no longer able to correct errors, it enhances the rate of entanglement production due to the interaction with the environment.
Abstract:
In this paper we present an empirical analysis of the residential demand for electricity using annual aggregate data at the state level for 48 US states from 1995 to 2007. Earlier literature has examined residential energy consumption at the state level using annual or monthly data, focusing on the variation in price elasticities of demand across states or regions, but has failed to recognize or address two major issues. The first is that, when fitting dynamic panel models, the lagged consumption term on the right-hand side of the demand equation is endogenous. This has resulted in potentially inconsistent estimates of the long-run price elasticity of demand. The second is that energy price is likely mismeasured.
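One standard remedy for the lagged-consumption endogeneity described above is to first-difference the dynamic panel equation and instrument the lagged difference with a deeper lag of the level (an Anderson-Hsiao style estimator). The sketch below illustrates that idea on simulated data with numpy; it is a generic illustration under assumed parameter values, not the authors' estimator or data.

```python
import numpy as np

# Hypothetical dynamic panel: y_it = alpha*y_{i,t-1} + beta*p_it + eta_i + e_it.
rng = np.random.default_rng(0)
N, T, burn = 48, 13, 50                    # 48 "states", 13 "years" kept
alpha, beta = 0.5, -0.3                    # true persistence and price effect

eta = 0.5 * rng.normal(size=N)             # state fixed effects
price = rng.normal(size=(N, T + burn))
y = np.zeros((N, T + burn))
for t in range(1, T + burn):
    y[:, t] = alpha * y[:, t - 1] + beta * price[:, t] + eta + rng.normal(size=N)
y, price = y[:, burn:], price[:, burn:]    # drop the burn-in

# Differenced equation: dy_t = alpha*dy_{t-1} + beta*dp_t + de_t.
dy   = (y[:, 2:] - y[:, 1:-1]).ravel()          # dependent: dy_t
dy_l = (y[:, 1:-1] - y[:, :-2]).ravel()         # endogenous: dy_{t-1} shares e_{t-1} with de_t
dp   = (price[:, 2:] - price[:, 1:-1]).ravel()  # exogenous: dp_t
z    = y[:, :-2].ravel()                        # instrument: y_{t-2}

X = np.column_stack([dy_l, dp])
Z = np.column_stack([z, dp])
Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)          # projection onto the instruments
coef_ols = np.linalg.solve(X.T @ X, X.T @ dy)           # biased for alpha
coef_iv  = np.linalg.solve(X.T @ Pz @ X, X.T @ Pz @ dy)
print("OLS on differences:", coef_ols.round(2))
print("Anderson-Hsiao IV :", coef_iv.round(2))          # closer to (0.5, -0.3)
```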
Abstract:
Melt viscosity is a key indicator of product quality in polymer extrusion processes. However, real-time monitoring and control of viscosity is difficult to achieve. In this article, a novel “soft sensor” approach based on dynamic gray-box modeling is proposed. The soft sensor involves a nonlinear finite impulse response model with adaptable linear parameters for real-time prediction of the melt viscosity from the process inputs; the model output is then used as an input to a simple fixed-structure model that predicts the barrel pressure, which can be measured online. Finally, the predicted pressure is compared to the measured value, and the corresponding error is used as a feedback signal to correct the viscosity estimate. This feedback structure allows the viscosity model to adapt online to modeling errors and disturbances, hence producing a reliable viscosity estimate. Experimental results on different material/die/extruder combinations confirm the effectiveness of the proposed “soft sensor” method based on dynamic gray-box modeling for real-time monitoring and control of polymer extrusion processes.
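A toy sketch of the feedback structure described above: an adaptive FIR model estimates viscosity from the process input, a fixed model maps that estimate to a predicted barrel pressure, and the pressure prediction error drives a normalized-LMS correction of the FIR weights. The process model, gains and noise levels below are hypothetical, and the linear FIR/static-gain structure is a simplification of the paper's nonlinear gray-box models.

```python
import numpy as np

# Adaptive soft sensor: viscosity FIR model corrected online from the error
# between measured and predicted barrel pressure (all models hypothetical).
rng = np.random.default_rng(1)
n, taps, mu = 500, 5, 0.5

u = rng.normal(size=n + taps)                    # process input (e.g. screw speed)
w_true = np.array([0.5, 0.3, 0.1, 0.05, 0.02])   # unknown "true" viscosity dynamics
k_p = 2.0                                        # fixed viscosity -> pressure gain

w = np.zeros(taps)                               # adaptive FIR weights
for t in range(n):
    x = u[t:t + taps][::-1]                      # most recent inputs, newest first
    visc_true = w_true @ x
    p_meas = k_p * visc_true + rng.normal(scale=0.05)  # measurable pressure
    visc_hat = w @ x                             # soft-sensor viscosity estimate
    p_hat = k_p * visc_hat                       # predicted pressure
    e = p_meas - p_hat                           # feedback signal
    w += mu * e * (k_p * x) / (1e-6 + k_p**2 * (x @ x))  # normalized LMS update

print("estimated FIR weights:", w.round(3))
print("true FIR weights     :", w_true)
```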
Abstract:
To separately investigate the impact of simulated age-related lens yellowing, transparency loss and refractive error on measurements of macular pigment (MP) using resonance Raman spectroscopy.
Abstract:
In many environmental valuation applications, standard sample sizes for choice modelling surveys are impractical to achieve. One can improve data quality by using more in-depth surveys administered to fewer respondents. We report on a study using high-quality rank-ordered data elicited with the best-worst approach. The resulting "exploded logit" choice model, estimated on 64 responses per person, was used to study visitors' willingness to pay for the external benefits of policies that maintain the cultural heritage of alpine grazing commons. We find evidence supporting this approach and obtain reasonable estimates of mean WTP, which appear theoretically valid and policy informative.
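The "exploded logit" treats a full ranking as a sequence of best choices from the shrinking set of remaining alternatives, so its log-likelihood is a sum of standard conditional-logit terms. A minimal sketch on simulated rank-ordered data is shown below; the attribute dimensions, sample size and true coefficients are hypothetical, not the study's.

```python
import numpy as np
from scipy.optimize import minimize

# Rank-ordered ("exploded") logit on simulated rankings of J alternatives.
rng = np.random.default_rng(2)
n_resp, J, K = 200, 4, 2
X = rng.normal(size=(n_resp, J, K))                # alternative attributes
beta_true = np.array([1.0, -0.5])

# Simulate full rankings (best to worst) from utilities with Gumbel noise.
u = X @ beta_true + rng.gumbel(size=(n_resp, J))
rankings = np.argsort(-u, axis=1)

def neg_loglik(beta):
    v = X @ beta
    ll = 0.0
    for i in range(n_resp):
        remaining = list(rankings[i])
        while len(remaining) > 1:                  # "explode" the ranking
            chosen = remaining[0]
            ll += v[i, chosen] - np.log(np.exp(v[i, remaining]).sum())
            remaining = remaining[1:]
    return -ll

fit = minimize(neg_loglik, np.zeros(K), method="BFGS")
print("estimated beta:", fit.x.round(2))           # close to (1.0, -0.5)
```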
Abstract:
Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate these using simulation studies. Our comparative analysis uses methods including generalized least squares, spatial filters, wavelet-revised models, conditional autoregressive models and generalized additive mixed models to estimate regression coefficients from synthetic but realistic data sets, including some which violate standard regression assumptions. We assess the performance of each method using two measures, and using statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but its poorly controlled Type I error rates meant it did not show the improvements in performance under model selection seen with the methods above. Removing large-scale spatial trends in the response led to poor performance. These are empirical results; hence extrapolation of these findings to other situations should be performed cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on single or small numbers of data sets, and should be considered a necessary foundation for statements of this type in future.
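The kind of Type I error inflation described for ordinary least squares can be reproduced with a small simulation: when both the covariate and the residuals are spatially autocorrelated, OLS underestimates standard errors, while generalized least squares with an appropriate covariance does not. The sketch below uses a hypothetical exponential covariance on a grid and, for simplicity, gives GLS the true covariance; it illustrates the issue and is not the paper's simulation design.

```python
import numpy as np
from scipy import stats
from scipy.spatial.distance import cdist

# Type I error of OLS vs GLS when covariate and errors are spatially correlated.
rng = np.random.default_rng(3)
coords = np.array([(i, j) for i in range(15) for j in range(15)], float)
n = len(coords)
Sigma = np.exp(-cdist(coords, coords) / 3.0)       # exponential covariance, range 3
L = np.linalg.cholesky(Sigma + 1e-9 * np.eye(n))
Sigma_inv = np.linalg.inv(Sigma + 1e-9 * np.eye(n))

def reject(x, y, use_gls):
    """Reject H0: slope = 0 at the 5% level for one simulated data set."""
    Xd = np.column_stack([np.ones(n), x])
    W = Sigma_inv if use_gls else np.eye(n)
    XtWX_inv = np.linalg.inv(Xd.T @ W @ Xd)
    b = XtWX_inv @ Xd.T @ W @ y
    if use_gls:
        se = np.sqrt(XtWX_inv[1, 1])               # true covariance, unit scale
    else:
        resid = y - Xd @ b
        s2 = resid @ resid / (n - 2)
        se = np.sqrt(s2 * XtWX_inv[1, 1])
    return abs(b[1] / se) > stats.norm.ppf(0.975)

hits, n_sim = {"OLS": 0, "GLS": 0}, 200
for _ in range(n_sim):
    x = L @ rng.normal(size=n)                     # spatially structured covariate
    y = L @ rng.normal(size=n)                     # independent, spatially correlated noise
    hits["OLS"] += reject(x, y, use_gls=False)
    hits["GLS"] += reject(x, y, use_gls=True)

print({k: v / n_sim for k, v in hits.items()})     # OLS inflated; GLS near 0.05
```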
Abstract:
GC-MS data on veterinary drug residues in bovine urine are used for controlling the illegal practice of fattening cattle. According to current detection criteria, peak patterns of preferably four ions should agree within 10 or 20% with a corresponding standard pattern. These criteria are rigid, rather arbitrary and do not match daily practice. A new model, based on multivariate modeling of log peak abundance ratios, provides a theoretical basis for the identification of analytes and optimizes the balance between the avoidance of false positives and false negatives. The performance of the model is demonstrated on data provided by five laboratories, each supplying GC-MS measurements on the detection of clenbuterol, dienestrol and 19β-nortestosterone in urine. The proposed model shows better performance than confirmation using the current criteria and provides a statistical basis for inspection criteria in terms of error probabilities.
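A minimal sketch of the general idea of multivariate modelling of log peak-abundance ratios: qualifier-ion abundances are expressed as log ratios to a base ion, a reference mean and covariance are estimated from standard injections, and a sample is confirmed when its Mahalanobis distance falls below a chi-square quantile whose significance level sets the balance between false positives and false negatives. All numbers and the decision-rule details below are hypothetical, not the paper's fitted model.

```python
import numpy as np
from scipy import stats

# Identification from log ion-ratio patterns (illustrative numbers only).
rng = np.random.default_rng(4)

# Reference injections of a standard: peak areas for a base ion + 3 qualifiers.
ref_areas = rng.lognormal(mean=[10.0, 9.3, 8.9, 8.1], sigma=0.05, size=(25, 4))
ref_logratios = np.log(ref_areas[:, 1:]) - np.log(ref_areas[:, [0]])
mu = ref_logratios.mean(axis=0)                    # reference log ratios
cov = np.cov(ref_logratios, rowvar=False)          # their covariance

def confirmed(sample_areas, alpha=0.01):
    """True if the sample's log ion ratios are consistent with the standard."""
    r = np.log(sample_areas[1:]) - np.log(sample_areas[0])
    d2 = (r - mu) @ np.linalg.solve(cov, r - mu)        # squared Mahalanobis distance
    return d2 <= stats.chi2.ppf(1 - alpha, df=len(r))   # alpha sets the FP/FN balance

print(confirmed(np.array([21000.0, 10400.0, 7000.0, 3100.0])))  # expected: True
```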