953 results for Error Analysis


Abstract:

The Fourth Amendment prohibits unreasonable searches and seizures in criminal investigations. The Supreme Court has interpreted this to require that police obtain a warrant prior to search and that illegally seized evidence be excluded from trial. A consensus has developed in the law and economics literature that tort liability for police officers is a superior means of deterring unreasonable searches. We argue that this conclusion depends on the assumption of truth-seeking police, and we develop a game-theoretic model to compare the two remedies when some police officers (the bad type) are willing to plant evidence in order to obtain convictions, while other police (the good type) are not, and this type is private information. We characterize the perfect Bayesian equilibria of the asymmetric-information game between the police and a court that seeks to minimize error costs in deciding whether to convict or acquit suspects. In this framework, we show that the exclusionary rule with a warrant requirement leads to superior outcomes (relative to tort liability) in terms of the truth-finding function of courts, because the warrant requirement can reduce the scope for bad types of police to plant evidence.
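The abstract states the result but not the court's updating mechanics. The sketch below is a purely illustrative Bayesian-updating calculation, not the authors' model: it shows how a court's posterior belief that a suspect is guilty, given that evidence is presented, depends on the prior share of evidence-planting ("bad type") officers and on how strongly a warrant requirement limits their opportunity to plant. The function name and all numbers are assumptions.

```python
# Hypothetical illustration of the court's updating problem described in the
# abstract; the priors, probabilities, and structure are assumptions, not the
# authors' actual model.

def posterior_guilty(p_guilty, p_bad, p_plant_given_bad):
    """Posterior probability the suspect is guilty given evidence is presented.

    p_guilty          -- prior probability the suspect is guilty
    p_bad             -- prior probability the officer is the 'bad' type
    p_plant_given_bad -- chance a bad-type officer can plant evidence against an
                         innocent suspect (assumed lower under a warrant requirement)
    """
    # Evidence appears either because the suspect is guilty (any officer type)
    # or because a bad-type officer planted it against an innocent suspect.
    p_evidence_given_innocent = p_bad * p_plant_given_bad
    num = p_guilty * 1.0
    den = num + (1 - p_guilty) * p_evidence_given_innocent
    return num / den

# Without a warrant requirement (planting easy) vs. with one (planting harder):
print(posterior_guilty(0.5, 0.2, 0.9))   # ~0.85
print(posterior_guilty(0.5, 0.2, 0.2))   # ~0.96
```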

Abstract:

Statement of the problem and public health significance. Hospitals were designed to be a safe haven and respite from disease and illness. However, a large body of evidence points to preventable errors in hospitals as the eighth leading cause of death among Americans. Twelve percent of Americans, or over 33.8 million people, are hospitalized each year. This population represents a significant portion of at-risk citizens exposed to hospital medical errors. Since the number of annual deaths due to hospital medical errors is estimated to exceed 44,000, the magnitude of this tragedy makes it a significant public health problem.

Specific aims. The specific aims of this study were threefold. First, this study aimed to analyze the state of the states' mandatory hospital medical error reporting six years after the release of the influential IOM report, "To Err Is Human." The second aim was to identify barriers to the reporting of medical errors by hospital personnel. The third aim was to identify hospital safety measures implemented to reduce medical errors and enhance patient safety.

Methods. A descriptive, longitudinal, retrospective design was used to address the first objective. The study data came from the twenty-one states with mandatory hospital reporting programs that report aggregate hospital error data accessible to the public through the states' websites. The data analysis included calculations of the expected number of medical errors for each state according to IOM rates. Where possible, a comparison was made between state-reported data and the calculated IOM expected number of errors. A literature review was performed to achieve the second study aim, identifying barriers to reporting medical errors. The final aim was accomplished through telephone interviews with the principal patient safety/quality officers of five Texas hospitals with more than 700 beds.

Results. The state medical error data suggest vast underreporting of hospital medical errors to the states. The telephone interviews suggest that hospitals are working to reduce medical errors and create safer environments for patients. The literature review suggests that the underreporting of medical errors at the state level stems from underreporting of errors at the delivery level.
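As a rough illustration of the "expected number of errors according to IOM rates" calculation mentioned in the Methods, the sketch below scales the IOM lower-bound death estimate to hypothetical state hospitalization counts. The state figures and variable names are assumptions, not data from the study.

```python
# Hypothetical sketch of the expected-error calculation: scale the IOM-estimated
# national death rate from hospital errors to each state's annual hospitalizations.
# Only the two national figures come from the abstract; the state counts are invented.

IOM_MIN_DEATHS = 44_000            # lower-bound annual deaths (IOM, "To Err Is Human")
US_HOSPITALIZATIONS = 33_800_000   # annual hospitalizations cited in the abstract

deaths_per_admission = IOM_MIN_DEATHS / US_HOSPITALIZATIONS

# Illustrative state admission counts (not real reporting data).
state_admissions = {"State A": 2_500_000, "State B": 900_000}

for state, admissions in state_admissions.items():
    expected = admissions * deaths_per_admission
    print(f"{state}: ~{expected:,.0f} expected deaths from hospital errors per year")
```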

Abstract:

ATP-dependent chromatin remodeling has been shown to be critical for transcription and DNA repair. However, the involvement of ATP-dependent chromatin remodeling in DNA replication remains poorly defined. Interestingly, we found that the INO80 chromatin-remodeling complex is directly involved in the DNA damage tolerance pathways activated during DNA replication. DNA damage tolerance is important for genomic stability and is controlled by the formation of either mono-ubiquitinated or multi-ubiquitinated PCNA, which respectively induce error-prone or error-free replication bypass of the lesions. In addition, homologous recombination (HR) mediated by the Rad51 pathway is also involved in the DNA damage tolerance pathways.

We found that INO80 is specifically recruited to replication origins during S phase in a genome-wide fashion. In addition, DNA combing analysis shows that INO80 is required for the resumption of replication at stalled forks induced by methyl methanesulfonate (MMS). Mechanistically, we find that INO80 is required for PCNA ubiquitination as well as for Rad51-mediated processing of replication forks after MMS treatment. Furthermore, chromatin immunoprecipitation at specific ARSs indicates that INO80 is necessary for Rad18 and Rad51 recruitment to replication forks after MMS treatment. Moreover, 2D gel analysis shows that INO80 is necessary to process Rad51-mediated intermediates at impeded replication forks.

In conclusion, our findings establish a novel role for a chromatin-remodeling complex in DNA damage tolerance pathways and suggest that chromatin remodeling is fundamentally important to ensure faithful replication of DNA and genome stability in eukaryotes.

Abstract:

Next-generation DNA sequencing platforms can effectively detect the entire spectrum of genomic variation and are emerging as a major tool for systematic exploration of the universe of variants and interactions in the entire genome. However, the data produced by next-generation sequencing technologies suffer from three basic problems: sequence errors, assembly errors, and missing data. Current statistical methods for genetic analysis are well suited for detecting the association of common variants but are less suitable for rare variants. This poses a great challenge for sequence-based genetic studies of complex diseases.

This dissertation used the genome continuum model as a general principle, and stochastic calculus and functional data analysis as tools, to develop novel and powerful statistical methods for the next generation of association studies of both qualitative and quantitative traits in the context of sequencing data, ultimately shifting the paradigm of association analysis from the current locus-by-locus analysis to collectively analyzing genome regions.

In this project, functional principal component (FPC) methods coupled with high-dimensional data reduction techniques were used to develop novel and powerful methods for testing the associations of the entire spectrum of genetic variation within a segment of genome or a gene, regardless of whether the variants are common or rare.

Classical quantitative genetics methods suffer from high type I error rates and low power for rare variants. To overcome these limitations for resequencing data, this project used functional linear models with scalar response to develop statistics for identifying quantitative trait loci (QTLs) for both common and rare variants. To illustrate their applications, the functional linear models were applied to five quantitative traits in the Framingham Heart Study.

This project also proposed a novel concept of gene-gene co-association, in which a gene or a genomic region is taken as the unit of association analysis, and used stochastic calculus to develop a unified framework for testing the association of multiple genes or genomic regions for both common and rare alleles. The proposed methods were applied to gene-gene co-association analysis of psoriasis in two independent GWAS datasets, which led to the discovery of networks significantly associated with psoriasis.
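The abstract does not give the FPC test in detail. The following is a minimal, hedged sketch of the general idea only: treat a subject's genotypes across a region as a discretely observed function, extract functional principal component scores via a basis expansion, and test the leading scores against a trait. The simulated data, Fourier basis, and per-component F-test are illustrative assumptions, not the dissertation's implementation.

```python
# Rough sketch of a functional principal component (FPC) association test on
# simulated data; not the dissertation's method or software.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, m = 500, 40                       # subjects, variant sites in the region
pos = np.sort(rng.uniform(0, 1, m))  # variant positions rescaled to [0, 1]
geno = rng.binomial(2, 0.02, size=(n, m)).astype(float)  # mostly rare variants
trait = rng.binomial(1, 1 / (1 + np.exp(-(geno[:, :5].sum(axis=1) - 0.2))))

# Smooth each genotype "function" onto a small Fourier basis over the region.
k = 5
basis = np.column_stack([np.ones(m)] +
                        [np.sin(2 * np.pi * (j + 1) * pos) for j in range(k)] +
                        [np.cos(2 * np.pi * (j + 1) * pos) for j in range(k)])
coef = geno @ np.linalg.pinv(basis).T          # least-squares basis coefficients

# Functional PCA here reduces to ordinary PCA on the centered coefficients.
coef_c = coef - coef.mean(axis=0)
_, _, vt = np.linalg.svd(coef_c, full_matrices=False)
scores = coef_c @ vt[:3].T                     # first 3 FPC scores per subject

# Test each leading FPC score against the qualitative trait (one-way F-test).
for j in range(3):
    res = stats.f_oneway(scores[trait == 1, j], scores[trait == 0, j])
    print(f"FPC {j + 1}: F = {res.statistic:.2f}, p = {res.pvalue:.3g}")
```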

Abstract:

In regression analysis, covariate measurement error occurs in many applications. The error-prone covariates are often referred to as latent variables. In this study, we extended the work of Chan et al. (2008) on recovering the latent slope in a simple regression model to the multiple regression setting. We presented an approach that applies the Monte Carlo method in the Bayesian framework to a parametric regression model with measurement error in an explanatory variable. The proposed estimator uses the conditional expectation of the latent slope given the observed outcome and surrogate variables in the multiple regression model. A simulation study showed that the method produces an estimator that is efficient in the multiple regression model, especially when the measurement error variance of the surrogate variable is large.
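As context for the measurement-error problem, the sketch below simulates a multiple regression in which one covariate is observed only through a noisy surrogate, showing the attenuation of the naive slope together with a textbook reliability-ratio correction. It is not the Chan et al. (2008) conditional-expectation estimator nor the Bayesian Monte Carlo procedure described above, and all values are illustrative.

```python
# Illustrative Monte Carlo sketch of attenuation bias from covariate measurement
# error and a classical correction; not the estimator proposed in the study.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
x = rng.normal(0, 1, n)            # latent covariate (true slope 2.0)
z = rng.normal(0, 1, n)            # error-free second covariate
w = x + rng.normal(0, 0.8, n)      # surrogate with measurement error
y = 2.0 * x + 1.0 * z + rng.normal(0, 1, n)

# Naive multiple regression of y on (1, w, z): the slope on w is biased toward zero.
X_naive = np.column_stack([np.ones(n), w, z])
beta_naive = np.linalg.lstsq(X_naive, y, rcond=None)[0]

# Classical correction using the (here known) reliability ratio var(x) / var(w);
# valid in this sketch because x and z are independent.
reliability = 1.0 / (1.0 + 0.8 ** 2)
beta_corrected = beta_naive[1] / reliability

print("naive slope on surrogate:", round(beta_naive[1], 3))
print("corrected latent slope:  ", round(beta_corrected, 3))
```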

Abstract:

Life expectancy has consistently increased over the last 150 years due to improvements in nutrition, medicine, and public health. Several studies found that in many developed countries, life expectancy continued to rise following a nearly linear trend, contrary to a common belief that the rate of improvement would decelerate and follow an S-shaped curve. Using samples of countries spanning a wide range of economic development levels, we explored the change in life expectancy over time by employing both nonlinear and linear models. We then examined whether there were any significant differences in estimates between linear models when an auto-correlated error structure was assumed. When the data did not have a sigmoidal shape, nonlinear growth models sometimes failed to provide meaningful parameter estimates. The existence of an inflection point and asymptotes in the growth models made them inflexible for fitting life expectancy data. In the linear models, there was no significant difference in the life expectancy growth rate and future estimates between ordinary least squares (OLS) and generalized least squares (GLS). However, the generalized least squares model was more robust because the data involved time-series variables and the residuals were positively correlated.
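The following is a minimal sketch of the OLS-versus-GLS comparison described above, run on simulated life-expectancy data with AR(1) errors using the statsmodels GLSAR class. The trend, noise parameters, and series are assumptions, not the study's data.

```python
# Illustrative comparison of OLS and GLS (AR(1) errors) for a linear
# life-expectancy trend; the data are simulated, not from the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
years = np.arange(1950, 2011)
# Simulated linear trend (~0.2 years gained per calendar year) with AR(1) noise.
eps = np.zeros(years.size)
for t in range(1, years.size):
    eps[t] = 0.7 * eps[t - 1] + rng.normal(0, 0.3)
life_exp = 68.0 + 0.2 * (years - years[0]) + eps

X = sm.add_constant(years - years[0])
ols = sm.OLS(life_exp, X).fit()
gls = sm.GLSAR(life_exp, X, rho=1).iterative_fit(maxiter=10)

print("OLS slope:", round(ols.params[1], 4), "SE:", round(ols.bse[1], 4))
print("GLS slope:", round(gls.params[1], 4), "SE:", round(gls.bse[1], 4))
```

The point estimates of the slope are typically close, while the GLS standard errors account for the positive serial correlation in the residuals, matching the abstract's conclusion that GLS is the more robust choice for this kind of time-series data.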

Abstract:

Next-generation sequencing (NGS) technology has become a prominent tool in biological and biomedical research. However, NGS data analysis, such as de novo assembly, mapping, and variant detection, is far from mature, and the high sequencing error rate is one of the major problems. To minimize the impact of sequencing errors, we developed a highly robust and efficient method, MTM, to correct the errors in NGS reads. We demonstrated the effectiveness of MTM on both single-cell data with highly non-uniform coverage and normal data with uniformly high coverage, showing that MTM's performance does not rely on the coverage of the sequencing reads. MTM was also compared with Hammer and Quake, the best methods for correcting non-uniform and uniform data, respectively. For non-uniform data, MTM outperformed both Hammer and Quake. For uniform data, MTM showed better performance than Quake and comparable results to Hammer. By making better error corrections with MTM, the quality of downstream analyses, such as mapping and SNP detection, was improved. SNP calling is a major application of NGS technologies. However, the existence of sequencing errors complicates this process, especially for the low coverage (

Abstract:

Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantitation method that has been widely used in the biological and biomedical fields. The currently used methods for qPCR data analysis, including the threshold cycle (CT) method and linear and non-linear model fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence is usually inaccurate and can therefore distort results. Here, we propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each two consecutive PCR cycles, we subtracted the fluorescence of the earlier cycle from that of the later cycle, transforming the n-cycle raw data into (n-1)-cycle data. Then linear regression was applied to the natural logarithm of the transformed data. Finally, amplification efficiencies and initial DNA molecule numbers were calculated for each PCR run. To evaluate this new method, we compared it, in terms of accuracy and precision, with the original linear regression method under three background corrections: the mean of cycles 1-3, the mean of cycles 3-7, and the minimum. Three criteria, namely threshold identification, max R2, and max slope, were employed to search for target data points. Considering that PCR data are time series data, we also applied linear mixed models. Collectively, when the threshold identification criterion was applied and when the linear mixed model was adopted, the taking-difference linear regression method was superior, as it gave an accurate estimation of the initial DNA amount and a reasonable estimation of PCR amplification efficiencies. When the criteria of max R2 and max slope were used, the original linear regression method gave an accurate estimation of the initial DNA amount. Overall, the taking-difference linear regression method avoids the error introduced by subtracting an unknown background and is thus theoretically more accurate and reliable. This method is easy to perform, and the taking-difference strategy can be extended to all current methods for qPCR data analysis.
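The taking-difference procedure is described explicitly above; the sketch below implements that description on simulated exponential-phase fluorescence data (the simulation parameters are illustrative). Differencing consecutive cycles cancels the constant background, and linear regression on the log-differences recovers the amplification efficiency and the initial amount.

```python
# Minimal sketch of the taking-difference linear regression described above,
# run on simulated raw fluorescence data (background, N0, E are illustrative).
import numpy as np

# Simulated exponential-phase qPCR signal: F_n = background + N0 * E**n.
background, N0, E = 50.0, 5.0, 1.9
cycles = np.arange(1, 16)
fluor = background + N0 * E ** cycles

# Take differences of consecutive cycles; the unknown background cancels:
# D_n = F_{n+1} - F_n = N0 * (E - 1) * E**n.
diffs = np.diff(fluor)                      # n cycles -> n-1 differences
log_d = np.log(diffs)

# Linear regression of ln(D_n) on cycle number n.
slope, intercept = np.polyfit(cycles[:-1], log_d, 1)
E_hat = np.exp(slope)                       # estimated amplification efficiency
N0_hat = np.exp(intercept) / (E_hat - 1)    # estimated initial DNA amount

print(f"estimated efficiency: {E_hat:.3f}, estimated N0: {N0_hat:.3f}")
```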

Abstract:

The widespread occurrence of microbialites in last deglacial reef frameworks (16-6 ka BP) implies that the accurate study of their development patterns is of prime importance for unravelling the evolution of reef architecture through time and for reconstructing the reef response to sea-level variations and environmental changes. The present study is based on the sedimentological and chronological analysis (AMS 14C dating) of drill cores obtained during IODP Expedition #310 "Tahiti Sea Level" on the successive terraces that typify the modern reef slopes of Tahiti. It provides a comprehensive database for investigating microbialite growth patterns (i.e. growth rates and habitats), analyzing their roles in reef frameworks, and reconstructing the evolution of reef framework architecture during sea-level rise. The last deglacial reefs of Tahiti are composed of two distinctive biological communities: (1) the coralgal communities, comprising seven assemblages characterized by various growth forms (branching, robust branching, massive, tabular and encrusting) that form the initial frameworks, and (2) the microbial communities, which developed in the primary cavities of those frameworks, a few meters (1.5 to 6 m) below the living coral reef surface, where they heavily encrusted the coralgal assemblages to form microbialite crusts. The dating results demonstrate the occurrence of two distinct generations of microbialites: the "reefal microbialites", which developed a few hundred years after the coralgal communities in shallow-water environments, and the "slope microbialites", which grew a few thousand years later in significantly deeper water conditions after the demise of the coralgal communities. The development of microbialites was controlled by the volume and shape of the primary cavities of the initial reef frameworks, determined by the morphology and packing of coral colonies. The most widespread microbialite development occurred in frameworks dominated by branching, thin encrusting, tabular and robust branching coral colonies, which built loose and open frameworks typified by high porosity (> 50%). In contrast, microbialite growth was minimal in compact coral frameworks formed by massive and thick encrusting corals, where primary cavities yielded a low porosity (~ 30%) and could not host significant microbialite expansion.

Abstract:

The Sarcya 1 dive explored a previously unknown 12 My old submerged volcano, labelled Cornacya. Well-developed fracturation is characterised by the following directions: N 170 to N-S, N 20 to N 40, N 90 to N 120, and N 50 to N 70, which corresponds to the fracturation pattern of the Sardinian margin. The sampled lavas exhibit features of shoshonitic suites of intermediate composition and include amphibole- and mica-bearing lamprophyric xenoliths that are geochemically similar to Ti-poor lamproites. Mica compositions reflect chemical exchanges between the lamprophyre and its shoshonitic host rock, suggesting their simultaneous emplacement. Nd compositions of the Cornacya K-rich suite indicate that continental crust was largely involved in the genesis of these rocks. The spatial association of the lamprophyre with the shoshonitic rocks is geochemically similar to K-rich and TiO2-poor igneous suites emplaced in post-collisional settings. Among the shoshonitic rocks, sample SAR 1-01 has been dated at 12.6 ± 0.3 My using the 40Ar/39Ar method with a laser microprobe on single grains. The age of the Cornacya shoshonitic suite is similar to that of the Sisco lamprophyre from Corsica, which is similarly located on the western margin of the Tyrrhenian Sea. Thus, the Cornacya shoshonitic rocks with their lamprophyric xenoliths and the Sisco lamprophyre could represent post-collisional suites emplaced during the lithospheric extension of the Corsica-Sardinia block, just after its rotation and before the opening of the Tyrrhenian Sea. Drilling on the Sardinia margin (ODP Leg 107) shows that the upper levels of the present-day margin (Hole 654) underwent tectonic subsidence before the lower part (Hole 652). The structure of this lower part is interpreted as the result of an eastward migration of the extension during Late Miocene and Early Pliocene times. The Cornacya volcano data are in good agreement with this model and provide good chronological constraints on the onset of this phenomenon.

Abstract:

This research investigates the spatial market integration of the Chilean wheat market with its most representative international markets using a vector error correction model (VECM), and how a price support policy, such as a price band, affects it. The international market was characterized by two relevant wheat prices: PAN from Argentina and Hard Red Winter from the United States. The degree of spatial market integration, expressed in the error correction term (ECT), supports the conclusion that there is a high degree of integration among these markets, with a variable influence of the price band mechanism that is mainly related to its estimation methodology. Moreover, this paper shows that Chile can be seen as a price taker given the speed of its adjustment to international shocks, these reactions being faster than in the United States and Argentina. Finally, the results validated the "Law of One Price", which assumes price equalization across all local markets in the long run.
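As a hedged sketch of the VECM approach described above (on simulated log prices, not the paper's series), the code below fits a vector error correction model with statsmodels and prints the error correction (speed-of-adjustment) coefficients for each market. The lag order, cointegration rank, and series names are assumptions.

```python
# Illustrative VECM fit on simulated cointegrated wheat-price series;
# column names, lag order, and rank are assumptions, not the paper's setup.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(3)
T = 300
# One common stochastic trend (a "world" wheat price) shared by three log-price series.
world = np.cumsum(rng.normal(0, 0.02, T))
prices = pd.DataFrame({
    "chile": world + rng.normal(0, 0.01, T),
    "us_hrw": world + rng.normal(0, 0.01, T),
    "arg_pan": world + rng.normal(0, 0.01, T),
})

# Cointegration rank and lag order would normally come from Johansen and lag-length
# tests; they are fixed here for brevity.
res = VECM(prices, k_ar_diff=1, coint_rank=1, deterministic="co").fit()

# alpha holds the error correction (speed-of-adjustment) coefficients per market.
print(pd.Series(res.alpha.ravel(), index=prices.columns))
```

A larger (in absolute value) adjustment coefficient for the Chilean series than for the US and Argentine series would correspond to the paper's finding that Chile adjusts faster to international shocks, consistent with its role as a price taker.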