927 results for Error correction coding
Abstract:
The objective of a financial statement audit process is the communication, by the auditor, of a conclusion regarding the degree of reasonableness with which those statements reflect the entity's asset, economic and financial position according to the criteria set out in the applicable accounting standards. An auditor who issues an erroneous conclusion as a result of his work may incur professional, civil and criminal liability arising from claims by users of the financial statements who may have been harmed by that erroneous conclusion. National and international accounting standards admit the existence of errors or omissions in the information contained in financial statements, provided such deviations do not lead interested users to a decision different from the one they would make if those errors or omissions did not exist. From the foregoing follows the crucial importance, in audit processes, of determining the overall materiality level (the level of deviations that users of the financial statements tolerate in the information they contain), as well as of allocating that level among the different components of the financial statements (allocation of the tolerable error), so that auditors avoid incurring professional, civil and/or criminal liability. To date, no mathematical models are known that support, in an objective and verifiable way, the calculation of the overall materiality level and the allocation of the tolerable error among the different elements making up the financial statements.
We believe that the development and integration of a model for quantifying the overall materiality level and allocating the tolerable error would have the following repercussions: 1 – It would give the auditor support for the way the materiality level is quantified and the tolerable error is allocated among the components of the financial statements. 2 – It would allow auditors to reduce the likelihood of incurring professional, civil and/or criminal liability as a result of their work. 3 – It would be a first step toward national and international audit standard-setting bodies adopting guidelines on the calculation of the materiality level and the allocation of the tolerable error. 4 – It would eliminate the calculation of the materiality level as a barrier to the comparability of financial statements.
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While first versions of the central limit theorem are already due to de Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings. These include, e.g., Banach-space-valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by the work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
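For reference, the classical statement and the Berry–Esseen bound mentioned above can be written as follows (with $\Phi$ the standard normal distribution function and $C$ an absolute constant):

```latex
% Classical CLT: for i.i.d. X_1, X_2, ... with E X_1 = \mu,
% Var X_1 = \sigma^2 \in (0,\infty), and S_n = X_1 + \dots + X_n:
\sup_{x \in \mathbb{R}}
  \left| P\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right|
  \;\xrightarrow[n \to \infty]{}\; 0,
% and the Berry--Esseen theorem quantifies the rate of this convergence:
\sup_{x \in \mathbb{R}}
  \left| P\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right|
  \;\le\; \frac{C \, E|X_1 - \mu|^3}{\sigma^3 \sqrt{n}}.
```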
Abstract:
Magdeburg, Univ., Faculty of Electrical Engineering and Information Technology, dissertation, 2010
Abstract:
Magdeburg, Univ., Faculty of Natural Sciences, dissertation, 2012
Abstract:
This paper discusses the fitting of a Cobb-Douglas response curve Y_i = αX_i^β with additive error, Y_i = αX_i^β + e_i, instead of the usual multiplicative error, Y_i = αX_i^β(1 + e_i). The estimation of the parameters α and β is discussed. An example is given using both types of error.
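A minimal sketch of the two estimation strategies (the simulated data and starting values below are assumptions for illustration, not the paper's example): under additive error the curve must be fitted by nonlinear least squares, whereas under multiplicative error taking logarithms makes the model linear.

```python
import numpy as np
from scipy.optimize import curve_fit

# Simulated data (assumed values, for illustration only).
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 50)
alpha_true, beta_true = 2.0, 0.7
y = alpha_true * x**beta_true + rng.normal(0.0, 0.1, x.size)  # additive error

# Additive error Y_i = alpha * X_i^beta + e_i:
# fit the curve directly by nonlinear least squares.
(alpha_nls, beta_nls), _ = curve_fit(
    lambda x, a, b: a * x**b, x, y, p0=(1.0, 1.0)
)

# Multiplicative error Y_i = alpha * X_i^beta * (1 + e_i): taking logs gives
# log Y = log alpha + beta * log X, which ordinary least squares can fit.
beta_ols, log_alpha = np.polyfit(np.log(x), np.log(y), 1)
alpha_ols = np.exp(log_alpha)
```

With additive error the log-linearized fit is biased, which is why the nonlinear fit is the appropriate one in that case.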
Abstract:
Otto-von-Guericke-Universität Magdeburg, Faculty of Mathematics, dissertation, 2015
Abstract:
v.72:no.1(1977)
Abstract:
Restriction site-associated DNA sequencing (RADseq) provides researchers with the ability to record genetic polymorphism across thousands of loci for nonmodel organisms, potentially revolutionizing the field of molecular ecology. However, as with other genotyping methods, RADseq is prone to a number of sources of error that may have consequential effects for population genetic inferences, and these have received only limited attention in terms of the estimation and reporting of genotyping error rates. Here we use individual sample replicates, under the expectation of identical genotypes, to quantify genotyping error in the absence of a reference genome. We then use sample replicates to (i) optimize de novo assembly parameters within the program Stacks, by minimizing error and maximizing the retrieval of informative loci; and (ii) quantify error rates for loci, alleles and single-nucleotide polymorphisms. As an empirical example, we use a double-digest RAD data set of a nonmodel plant species, Berberis alpina, collected from high-altitude mountains in Mexico.
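The replicate-based error estimate described above can be sketched as follows; the data layout (per-sample dicts mapping locus IDs to genotype calls) is an assumption for illustration, not the format used by Stacks or any RADseq pipeline.

```python
# Minimal sketch: replicates of the same sample should yield identical
# genotypes, so any mismatch at a locus typed in both counts as an error.
def genotyping_error_rate(rep_a, rep_b):
    """Per-locus genotyping error rate between two replicates of one sample."""
    shared = [locus for locus in rep_a if locus in rep_b]
    if not shared:
        return float("nan")  # no loci typed in both replicates
    mismatches = sum(rep_a[locus] != rep_b[locus] for locus in shared)
    return mismatches / len(shared)

rate = genotyping_error_rate(
    {"loc1": "A/A", "loc2": "A/G", "loc3": "G/G"},
    {"loc1": "A/A", "loc2": "A/A", "loc3": "G/G"},
)  # one mismatch over three shared loci
```

Running this rate over a grid of assembly parameter values is the kind of optimization step the authors describe for Stacks.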
Abstract:
BACKGROUND: Cone-beam computed tomography (CBCT) image-guided radiotherapy (IGRT) systems are widely used tools to verify and correct the target position before each fraction, allowing treatment accuracy and precision to be maximized. In this study, we evaluate automatic three-dimensional intensity-based rigid registration (RR) methods for prostate setup correction using CBCT scans and study the impact of rectal distension on registration quality. METHODS: We retrospectively analyzed 115 CBCT scans of 10 prostate patients. CT-to-CBCT registration was performed using (a) global RR, (b) bony RR, or (c) bony RR refined by a local prostate RR using the CT clinical target volume (CTV) expanded with margins varying from 1 to 20 mm. After propagation of the manual CT contours, automatic CBCT contours were generated. For evaluation, a radiation oncologist manually delineated the CTV on the CBCT scans. The propagated and manual CBCT contours were compared using the Dice similarity and a measure based on the bidirectional local distance (BLD). We also conducted a blind visual assessment of the quality of the propagated segmentations. Moreover, we automatically quantified rectal distension between the CT and CBCT scans without using the manual CBCT contours, and we investigated its correlation with the registration failures. To improve registration quality, the air in the rectum was replaced with soft tissue using a filter. The results with and without filtering were compared. RESULTS: The statistical analysis of the Dice coefficients and the BLD values showed highly significant differences (p < 10^-6) for the 5-mm and 8-mm local RRs vs. the global, bony and 1-mm local RRs. The 8-mm local RR provided the best compromise between accuracy and robustness (median Dice of 0.814 and a 97% success rate with filtering of the air in the rectum). We observed that all failures were due to high rectal distension.
Moreover, the visual assessment confirmed the superiority of the 8-mm local RR over the bony RR. CONCLUSION: The most successful CT-to-CBCT RR method proved to be the 8-mm local RR. We have shown the correlation between its registration failures and rectal distension. Furthermore, we have provided a simple (easily applicable in clinical routine) and automatic method to quantify rectal distension and to predict registration failure using only the manual CT contours.
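The Dice similarity used for evaluation above is straightforward to compute on binary segmentation masks; a minimal NumPy sketch (the toy 2-D masks are made up for illustration, whereas real CTV contours are 3-D voxel masks):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect overlap
    return 2.0 * np.logical_and(a, b).sum() / denom

a = np.zeros((4, 4), dtype=int); a[:2, :2] = 1   # 4 pixels
b = np.zeros((4, 4), dtype=int); b[:2, 1:3] = 1  # 4 pixels, 2 overlapping
print(dice(a, b))  # → 0.5
```

A Dice of 1 means the propagated and manually delineated contours coincide exactly; 0 means no overlap at all.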
Abstract:
A stringent branch-site codon model was used to detect positive selection in vertebrate evolution. We show that the test is robust to the large evolutionary distances involved. Positive selection was detected in 77% of 884 genes studied. Most positive selection concerns a few sites on a single branch of the phylogenetic tree: Between 0.9% and 4.7% of sites are affected by positive selection depending on the branches. No functional category was overrepresented among genes under positive selection. Surprisingly, whole genome duplication had no effect on the prevalence of positive selection, whether the fish-specific genome duplication or the two rounds at the origin of vertebrates. Thus positive selection has not been limited to a few gene classes, or to specific evolutionary events such as duplication, but has been pervasive during vertebrate evolution.
Abstract:
BACKGROUND: Surgical correction of complete atrio-ventricular septal defect (AVSD) achieves satisfactory results with low morbidity and mortality, but may require reoperation. Our recent operative results were followed up at mid-term. METHODS: From June 2000 to December 2007, 81 patients (Down syndrome; n=60), median age 4.0 months (range 0.7-118.6) and weight 4.7kg (range 2.2-33), underwent complete AVSD correction. Patch closure for the ventricular septal defect (VSD; n=69) and atrial septal defect (ASD; n=42) was performed with left atrio-ventricular valve (LAVV) cleft closure (n=76) and right atrio-ventricular valve (RAVV) repair (n=57). Mortality, morbidity, and indications for reoperation were retrospectively studied; the end point 'time to reoperation' was analyzed using Kaplan-Meier curves. Follow-up was complete except in two patients and spanned a median of 28 months (range 0.4-6.1 years). RESULTS: In-hospital mortality was 3.7% (n=3) and one late death occurred. Reoperation was required in 7/79 patients (8.9%) for LAVV insufficiency (n=4), for a residual ASD (n=1), for right atrio-ventricular valve insufficiency (n=1), and for subaortic stenosis (n=1). At last follow-up, no or only mild LAVV and RAVV insufficiency was present in 81.3% and 92.1% of patients, respectively, and 2/3 of patients were medication-free. Risk factors for reoperation were younger age (<3 months; p=0.001) and lower weight (<4kg; p=0.003), and there was a trend towards fewer and later reoperations in Down syndrome (p<0.2). CONCLUSIONS: Surgical correction of AVSD can be achieved with low mortality and low need for reoperation, regardless of Down syndrome status. Immediate postoperative moderate or greater residual atrio-ventricular valve insufficiency will eventually require a reoperation, and could be anticipated in patients younger than 3 months and weighing <4kg.
Abstract:
This note develops a flexible methodology for splicing economic time series that avoids the extreme assumptions implicit in the procedures most commonly used in the literature. It allows the user to split the required correction to the older of the two series being linked between its levels and growth rates, on the basis of what he or she knows or conjectures about the persistence of the factors that account for the discrepancy between the two series at their linking point. The time profile of the correction is derived from the assumption that the error in the older series reflects inadequate coverage of emerging sectors or activities that grow faster than the aggregate.
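The idea of phasing the linking correction back in time can be illustrated with a toy sketch; the geometric weighting and the persistence parameter `rho` below are our assumptions for illustration, not the note's actual formula.

```python
import numpy as np

def splice(old, new_at_link, rho=0.9):
    """Rescale an older series so it meets a newer series at the link point.

    The full correction factor d = new_at_link / old[-1] is applied at the
    link point and phased out geometrically going back in time; rho governs
    how persistent the discrepancy is assumed to be. rho=1 gives the pure
    level splice (the whole series rescaled by d); rho=0 gives the pure
    growth-rate splice (old levels kept, correction applied only at the link).
    """
    old = np.asarray(old, dtype=float)
    d = new_at_link / old[-1]
    periods_back = np.arange(len(old) - 1, -1, -1)  # 0 at the link point
    return old * d ** (rho ** periods_back)

series = [100.0, 105.0, 110.0]          # older series; newer series = 121 at link
level_spliced = splice(series, 121.0, rho=1.0)   # [110.0, 115.5, 121.0]
growth_spliced = splice(series, 121.0, rho=0.0)  # [100.0, 105.0, 121.0]
```

Intermediate values of `rho` interpolate between the two extreme splices the note criticizes, which is the kind of flexibility it argues for.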
Abstract:
Pancreatic β-cells play a central role in glucose homeostasis by tightly regulating insulin release according to the organism's demand. Impairment of β-cell function due to a hostile environment, such as hyperglycaemia and hyperlipidaemia, or due to autoimmune destruction of β-cells, results in diabetes onset. Both environmental factors and genetic predisposition are known to be involved in the development of the disease, but the exact mechanisms leading to β-cell dysfunction and death remain to be characterized. Non-coding RNA molecules, such as microRNAs (miRNAs), have been suggested to be necessary for proper β-cell development and function. The present review aims at summarizing the most recent findings about the role of non-coding RNAs in the control of β-cell functions and their involvement in diabetes. We will also provide a perspective view of the future research directions in the field of non-coding RNAs. In particular, we will discuss the implications for diabetes research of the discovery of a new communication mechanism based on cell-to-cell miRNA transfer. Moreover, we will highlight the emerging interconnections between miRNAs and epigenetics and the possible role of long non-coding RNAs in the control of β-cell activities.