55 results for Error probability
Abstract:
The infinite slope method is widely used as the geotechnical component of geomorphic and landscape evolution models. Its assumption that shallow landslides are infinitely long (in a downslope direction) is usually considered valid for natural landslides on the basis that they are generally long relative to their depth. However, this is rarely justified, because the critical length/depth (L/H) ratio below which edge effects become important is unknown. We establish this critical L/H ratio by benchmarking infinite slope stability predictions against finite element predictions for a set of synthetic two-dimensional slopes, assuming that the difference between the predictions is due to error in the infinite slope method. We test the infinite slope method for six different L/H ratios to find the critical ratio at which its predictions fall within 5% of those from the finite element method. We repeat these tests for 5000 synthetic slopes with a range of failure plane depths, pore water pressures, friction angles, soil cohesions, soil unit weights and slope angles characteristic of natural slopes. We find that: (1) infinite slope stability predictions are consistently too conservative for small L/H ratios; (2) the predictions always converge to within 5% of the finite element benchmarks by an L/H ratio of 25 (i.e. the infinite slope assumption is reasonable for landslides 25 times longer than they are deep); but (3) they can converge at much lower ratios depending on slope properties, particularly for low cohesion soils. The implication for catchment scale stability models is that the infinite length assumption is reasonable if their grid resolution is coarse (e.g. >25 m). However, it may also be valid even at much finer grid resolutions (e.g. 1 m), because spatial organization in the predicted pore water pressure field reduces the probability of short landslides and minimizes the risk that predicted landslides will have L/H ratios less than 25. Copyright (c) 2012 John Wiley & Sons, Ltd.
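As background to the benchmark described above, the infinite slope method reduces stability to a one-dimensional factor of safety. The sketch below (Python, with illustrative parameter values that are not taken from the paper's 5000 synthetic slopes) shows the classical infinite slope equation against which the finite element results would be compared; the finite element side is not reproduced here.

```python
import math

def infinite_slope_fs(c, gamma, H, beta_deg, u, phi_deg):
    """Classical infinite slope factor of safety:
    FS = [c' + (gamma*H*cos^2(beta) - u) * tan(phi')] / [gamma*H*sin(beta)*cos(beta)]
    c: effective cohesion (kPa), gamma: soil unit weight (kN/m^3), H: failure plane
    depth (m), beta_deg: slope angle (degrees), u: pore water pressure on the
    failure plane (kPa), phi_deg: effective friction angle (degrees)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    driving = gamma * H * math.sin(beta) * math.cos(beta)
    resisting = c + (gamma * H * math.cos(beta) ** 2 - u) * math.tan(phi)
    return resisting / driving

# One illustrative slope (values assumed, not from the paper's synthetic set):
print(infinite_slope_fs(c=5.0, gamma=19.0, H=2.0, beta_deg=30.0, u=9.81, phi_deg=33.0))
```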
Abstract:
In the assessment of medical malpractice, imaging methods can be used to document crucial morphological findings that argue for or against an iatrogenically caused injury. The clarification of deaths in this context can be usefully supported by postmortem imaging (primarily native computed tomography, angiography and magnetic resonance imaging). Compared with autopsy, postmortem imaging offers significant additional information for detecting iatrogenic air embolisms and for documenting misplaced medical devices before dissection, which carries an inherent risk of dislodging them. Postmortem imaging also supplies additional information in the search for sources of bleeding and in the documentation of perfusion after cardiovascular surgery. Key criteria for the decision to perform postmortem imaging can be obtained from the necessary preliminary review of the clinical documentation.
Abstract:
Many complex systems may be described by not one but a number of complex networks mapped on each other in a multi-layer structure. Because of the interactions and dependencies between these layers, the state of a single layer does not necessarily reflect well the state of the entire system. In this paper we study the robustness of five examples of two-layer complex systems: three real-life data sets in the fields of communication (the Internet), transportation (the European railway system), and biology (the human brain), and two models based on random graphs. In order to cover the whole range of features specific to these systems, we focus on two extreme policies of the system's response to failures, no rerouting and full rerouting. Our main finding is that multi-layer systems are much more vulnerable to errors and intentional attacks than they appear from a single layer perspective.
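A toy illustration of the single-layer versus multi-layer perspective discussed above, using random graphs as a stand-in for the paper's data sets (networkx, with invented sizes and failure fractions; the paper's no-rerouting and full-rerouting policies are not reproduced). A node of the coupled system is counted as functional only if it remains in the giant component of both layers, a crude proxy for interlayer dependency.

```python
import random
import networkx as nx

n = 1000  # nodes shared by both layers (hypothetical size)
layer_a = nx.erdos_renyi_graph(n, 0.004, seed=1)  # e.g. a physical layer
layer_b = nx.erdos_renyi_graph(n, 0.004, seed=2)  # e.g. a logical layer on the same nodes

# Remove 20% of the nodes at random to mimic errors or attacks.
failed = set(random.Random(0).sample(range(n), int(0.2 * n)))
survivors = [v for v in range(n) if v not in failed]

def giant(G):
    """Node set of the largest connected component (empty set if no nodes remain)."""
    return max(nx.connected_components(G), key=len) if G.number_of_nodes() else set()

giant_a = giant(layer_a.subgraph(survivors))
giant_b = giant(layer_b.subgraph(survivors))

# Single-layer view: judge robustness from layer A alone.
single_view = len(giant_a) / n
# Coupled view (crude proxy): a node works only if it is in the giant component of BOTH layers.
coupled_view = len(giant_a & giant_b) / n

print(f"functional fraction, single-layer view: {single_view:.2f}")
print(f"functional fraction, two-layer view:    {coupled_view:.2f}")
```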
Abstract:
The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data. Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, which are both useful features for the Chernobyl fallout study.
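A schematic, not the BME formalism itself, of how a "hard"-data prior and a "soft" interval datum can be combined into a non-Gaussian posterior at one unsampled location (Python; all numbers are invented for illustration).

```python
import numpy as np

grid = np.linspace(0.0, 40.0, 2001)   # candidate 137Cs activity values (arbitrary units)

# Prior at an unsampled location, summarizing the surrounding hard measurements
# (assumed Gaussian here, e.g. an interpolated mean of 10 with standard deviation 4).
prior = np.exp(-0.5 * ((grid - 10.0) / 4.0) ** 2)

# Soft datum: local knowledge only says the true value lies between 12 and 20.
soft = ((grid >= 12.0) & (grid <= 20.0)).astype(float)

# Posterior: renormalized product of the two, generally non-Gaussian.
posterior = prior * soft
posterior /= posterior.sum()

print(f"posterior mean with soft data: {(grid * posterior).sum():.1f}")  # pulled above 10
print(f"posterior P(value > 15):       {posterior[grid > 15.0].sum():.2f}")
```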
Abstract:
Medical errors compromise patient safety in ambulatory practice. These errors must be addressed within a framework that minimizes their consequences for patients. This approach relies on establishing a new culture, free of stigmatization, in which errors are disclosed to patients; this culture implies building an error-reporting system coupled with an in-depth analysis of the system, looking for root causes and insufficient barriers with the aim of fixing them. A useful educational tool is the "critical situations" meeting, during which physicians are encouraged to openly present adverse events and "near misses". Their analysis, conducted with a supportive attitude towards the staff members involved, makes it possible to reveal system failures within the institution or the private practice.
Abstract:
Nestling birds produced later in the season are hypothesized to be of poor quality, with a low probability of survival and recruitment. In a Spanish population of house martins (Delichon urbica), we first compared reproductive success, immune responses and morphological traits between the first and the second broods. Second, we investigated the effects of an ectoparasite treatment and breeding date on the recapture rate the following year. Probably owing to a reversal of the usual weather pattern during the experiment, with more rain during rearing of the first brood, nestlings reared during the second brood were in better condition and had stronger immune responses than nestlings from the first brood. Contrary to other findings on house martins, we found a similar recapture rate for chicks reared during the first and the second brood. Furthermore, ectoparasitic house martin bugs had no significant effect on the recapture rate. As nestlings, recaptured birds had similar morphology to, but higher immunoglobulin levels than, non-recaptured birds. This result implies that a measure of immune function is a better predictor of survival than body condition per se.
Abstract:
Summary points:
- The bias introduced by random measurement error will be different depending on whether the error is in an exposure variable (risk factor) or outcome variable (disease)
- Random measurement error in an exposure variable will bias the estimates of regression slope coefficients towards the null
- Random measurement error in an outcome variable will instead increase the standard error of the estimates and widen the corresponding confidence intervals, making results less likely to be statistically significant
- Increasing sample size will help minimise the impact of measurement error in an outcome variable but will only make estimates more precisely wrong when the error is in an exposure variable
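A small simulation of the summary points above (Python, illustrative values only): random error added to the exposure attenuates the regression slope towards the null, while random error added to the outcome leaves the slope unbiased but inflates its standard error.

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_slope = 5000, 0.5

x = rng.normal(0.0, 1.0, n)                     # true exposure
y = true_slope * x + rng.normal(0.0, 1.0, n)    # true outcome

def ols_slope_se(x, y):
    """Slope and standard error of a simple least-squares regression of y on x."""
    xc, yc = x - x.mean(), y - y.mean()
    slope = (xc @ yc) / (xc @ xc)
    resid = yc - slope * xc
    se = np.sqrt((resid @ resid) / (len(x) - 2) / (xc @ xc))
    return slope, se

x_err = x + rng.normal(0.0, 1.0, n)             # random error in the exposure
y_err = y + rng.normal(0.0, 1.0, n)             # random error in the outcome

print("no measurement error: slope %.3f, SE %.3f" % ols_slope_se(x, y))
print("error in exposure:    slope %.3f, SE %.3f" % ols_slope_se(x_err, y))  # attenuated
print("error in outcome:     slope %.3f, SE %.3f" % ols_slope_se(x, y_err))  # unbiased, larger SE
```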
Abstract:
This paper discusses the analysis of cases in which the inclusion or exclusion of a particular suspect, as a possible contributor to a DNA mixture, depends on the value of a variable (the number of contributors) that cannot be determined with certainty. It offers alternative ways to deal with such cases, including sensitivity analysis and object-oriented Bayesian networks, which separate uncertainty about the inclusion of the suspect from uncertainty about other variables. The paper presents a case study in which the value of DNA evidence varies radically depending on the number of contributors to a DNA mixture: if there are two contributors, the suspect is excluded; if there are three or more, the suspect is included; but the number of contributors cannot be determined with certainty. It shows how an object-oriented Bayesian network can accommodate and integrate varying perspectives on the unknown variable and how it can reduce the potential for bias by directing attention to relevant considerations and distinguishing different sources of uncertainty. It also discusses the challenge of presenting such evidence to lay audiences.
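A schematic sketch of the sensitivity-analysis idea described above (Python; every number is hypothetical, not taken from the case study): the weight of the evidence is evaluated conditional on each plausible number of contributors and then combined by marginalizing over that unknown, which is what the object-oriented Bayesian network does implicitly.

```python
# Hypothetical probabilities of the observed mixture profile under each hypothesis,
# conditional on the assumed number of contributors n. All values are invented:
# with 2 contributors the suspect is excluded, with 3 or 4 he is included.
p_e_if_suspect = {2: 0.0, 3: 1.2e-3, 4: 7.0e-4}     # P(E | suspect contributed, n)
p_e_if_unknown = {2: 2.0e-6, 3: 1.0e-6, 4: 2.0e-6}  # P(E | unknown contributor, n)
prior_n = {2: 0.3, 3: 0.5, 4: 0.2}                  # assumed prior over n

# Sensitivity analysis: likelihood ratio under each fixed assumption about n.
for n in prior_n:
    print(f"assuming {n} contributors: LR = {p_e_if_suspect[n] / p_e_if_unknown[n]:.0f}")

# Marginalize the probabilities over n before forming the ratio.
num = sum(prior_n[n] * p_e_if_suspect[n] for n in prior_n)
den = sum(prior_n[n] * p_e_if_unknown[n] for n in prior_n)
print(f"LR with the number of contributors marginalized out: {num / den:.0f}")
```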
Abstract:
Optimal behavior relies on flexible adaptation to environmental requirements, notably based on the detection of errors. The impact of error detection on subsequent behavior typically manifests as a slowing of reaction times (RTs) following errors. Precisely how errors impact the processing of subsequent stimuli and in turn shape behavior remains unresolved. To address these questions, we used an auditory spatial go/no-go task in which continual feedback informed participants of whether they were too slow. We contrasted auditory-evoked potentials to left-lateralized go and right no-go stimuli as a function of performance on the preceding go stimuli, generating a 2 × 2 design with preceding performance (fast hit [FH], slow hit [SH]) and stimulus type (go, no-go) as within-subject factors. SH trials were followed by further SH trials more often than FH trials were, supporting our assumption that SHs engaged effects similar to errors. Electrophysiologically, auditory-evoked potentials were modulated topographically as a function of preceding performance at 80-110 msec post-stimulus onset and then as a function of stimulus type at 110-140 msec, indicative of changes in the underlying brain networks. Source estimations revealed stronger activity of prefrontal regions in response to stimuli following successful than following error trials, followed by a stronger response of parietal areas to no-go than to go stimuli. We interpret these results in terms of a shift from a fast automatic to a slow controlled form of inhibitory control induced by the detection of errors, manifesting during low-level integration of task-relevant features of subsequent stimuli, which in turn influences response speed.
Abstract:
BACKGROUND: Microvascular decompression (MVD) is the reference technique for pharmacoresistant trigeminal neuralgia (TN). OBJECTIVE: To establish whether the safety and efficacy of Gamma Knife surgery for recurrent TN are influenced by prior MVD. METHODS: Between July 1992 and November 2010, 54 of 737 patients (45 of 497 with >1 year of follow-up) had a history of MVD (approximately half also with a previous ablative procedure) and were operated on with Gamma Knife surgery for TN in the Timone University Hospital. A single 4-mm isocenter was positioned in the cisternal portion of the trigeminal nerve at a median distance of 7.6 mm (range, 3.9-11.9 mm) anterior to the emergence of the nerve. A median maximum dose of 85 Gy (range, 70-90 Gy) was delivered. RESULTS: The median follow-up time was 39.5 months (range, 14.1-144.6 months). Thirty-five patients (77.8%) were initially pain free, in a median time of 14 days (range, 0-180 days); this proportion was much lower than in our global population of classic TN (P = .01). Their actuarial probabilities of remaining pain-free without medication at 3, 5, 7, and 10 years were 66.5%, 59.1%, 59.1%, and 44.3%. The hypoesthesia actuarial rate at 1 year was 9.1% and remained stable until 12 years (median, 8 months). CONCLUSION: Patients with previous MVD showed a significantly lower probability of initial pain cessation compared with our global population with classic TN (P = .01). The toxicity was low (only 9.1% hypoesthesia); furthermore, no patient reported bothersome hypoesthesia. However, the probability of maintaining pain relief without medication was 44.3% at 10 years, similar to our global series of classic TN (P = .85). ABBREVIATIONS: BNI, Barrow Neurological Institute; CI, confidence interval; CTN, classic trigeminal neuralgia; GKS, Gamma Knife surgery; HR, hazard ratio; MVD, microvascular decompression; TN, trigeminal neuralgia.
Abstract:
Gene expression often cycles between active and inactive states in eukaryotes, yielding variable or noisy gene expression in the short term, while slow epigenetic changes may lead to silencing or variegated expression. Understanding how cells control these effects will be of paramount importance to construct biological systems with predictable behaviours. Here we find that a human matrix attachment region (MAR) genetic element controls the stability and heritability of gene expression in cell populations. Mathematical modeling indicated that the MAR controls the probability of long-term transitions between active and inactive expression, thus reducing silencing effects and increasing the reactivation of silent genes. Single-cell short-term assays revealed persistent expression and reduced expression noise in MAR-driven genes, while stochastic bursts of expression occurred without this genetic element. The MAR thus confers a more deterministic behaviour on an otherwise stochastic process, providing a means towards more reliable expression of engineered genetic systems.
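A toy two-state Markov model of the active/inactive switching described above (Python; transition probabilities are invented, not the paper's fitted values), showing how lowering the silencing probability and raising the reactivation probability increases the long-run fraction of expressing cells.

```python
def steady_state_active(p_silence, p_reactivate):
    """Long-run fraction of cells in the active state for a two-state Markov chain
    with per-step P(active -> silent) = p_silence and P(silent -> active) = p_reactivate."""
    return p_reactivate / (p_silence + p_reactivate)

# Without a MAR-like element: frequent silencing, rare reactivation (assumed values).
print("no element:  ", steady_state_active(p_silence=0.05, p_reactivate=0.01))  # ~0.17 active
# With a MAR-like element: silencing suppressed, reactivation favoured (assumed values).
print("with element:", steady_state_active(p_silence=0.01, p_reactivate=0.05))  # ~0.83 active
```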