66 results for Error correction


Relevance: 20.00%

Publisher:

Abstract:

Medical errors compromise patient safety in ambulatory practice. These errors must be handled within a framework that minimizes their consequences for patients. This approach relies on establishing a new culture, free of stigmatization, in which errors are disclosed to patients; such a culture implies building a system for reporting errors, coupled with an in-depth analysis of the system that looks for root causes and insufficient barriers with the aim of fixing them. A useful educational tool is the "critical situations" meeting, during which physicians are encouraged to openly present adverse events and "near misses". Analyzing these events, with a supportive attitude towards the staff members involved, reveals system failures within the institution or the private practice.

Relevance: 20.00%

Publisher:

Abstract:

Summary points:
- The bias introduced by random measurement error will be different depending on whether the error is in an exposure variable (risk factor) or an outcome variable (disease).
- Random measurement error in an exposure variable will bias the estimates of regression slope coefficients towards the null.
- Random measurement error in an outcome variable will instead increase the standard error of the estimates and widen the corresponding confidence intervals, making results less likely to be statistically significant.
- Increasing the sample size will help minimise the impact of measurement error in an outcome variable, but will only make estimates more precisely wrong when the error is in an exposure variable.
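
A minimal simulation sketch of these summary points (illustrative only, not taken from the cited work; the sample size, error standard deviation, and slope are arbitrary choices): adding classical measurement error to the exposure attenuates the estimated slope towards the null, whereas adding the same error to the outcome leaves the slope essentially unbiased but inflates its standard error.

import numpy as np

rng = np.random.default_rng(0)
n, true_slope, error_sd = 10_000, 1.0, 2.0
x = rng.normal(size=n)                      # true exposure
y = true_slope * x + rng.normal(size=n)     # true outcome

def ols_slope_and_se(x_obs, y_obs):
    # Ordinary least-squares slope and its standard error for a simple regression.
    x_c = x_obs - x_obs.mean()
    slope = (x_c @ (y_obs - y_obs.mean())) / (x_c @ x_c)
    resid = y_obs - y_obs.mean() - slope * x_c
    se = np.sqrt(resid @ resid / ((len(x_obs) - 2) * (x_c @ x_c)))
    return slope, se

print(ols_slope_and_se(x, y))                                        # no error: slope close to 1.0
print(ols_slope_and_se(x + rng.normal(scale=error_sd, size=n), y))   # error in exposure: slope biased towards 0
print(ols_slope_and_se(x, y + rng.normal(scale=error_sd, size=n)))   # error in outcome: slope ~1.0 but larger SE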

Relevance: 20.00%

Publisher:

Abstract:

Detecting local differences between groups of connectomes is a great challenge in neuroimaging because of the large number of tests that have to be performed and their impact on multiplicity correction. Any available information should be exploited to increase the power of detecting true between-group effects. We present an adaptive strategy that exploits the data structure and the prior information concerning positive dependence between nodes and connections, without relying on strong assumptions. As a first step, we decompose the brain network, i.e., the connectome, into subnetworks and apply a screening at the subnetwork level. The subnetworks are defined either according to prior knowledge or by applying a data-driven algorithm. Given the results of the screening step, a filtering is performed to seek real differences at the node/connection level. The proposed strategy can be used to strongly control either the family-wise error rate or the false discovery rate. We show by means of different simulations the benefit of the proposed strategy, and we present a real application comparing connectomes of preschool children and adolescents.
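
A schematic sketch of such a two-step screening/filtering strategy (an illustration under simplifying assumptions, not the authors' procedure: subnetworks are screened here with a Simes combined p-value and a Bonferroni correction, and connection-level tests within the surviving subnetworks are then filtered with Benjamini-Hochberg; this particular combination is not claimed to reproduce the error-rate guarantees described above):

import numpy as np

def simes(pvals):
    # Simes combined p-value for one subnetwork.
    p = np.sort(np.asarray(pvals))
    return float(np.min(p * len(p) / np.arange(1, len(p) + 1)))

def benjamini_hochberg(pvals, alpha):
    # Boolean mask of rejected hypotheses at FDR level alpha (step-up procedure).
    p = np.asarray(pvals)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, len(p) + 1) / len(p)
    passed = p[order] <= thresh
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    mask = np.zeros(len(p), dtype=bool)
    mask[order[:k]] = True
    return mask

def screen_then_filter(subnetworks, alpha=0.05):
    # subnetworks: dict mapping subnetwork name -> array of connection-level p-values.
    # Screening: Bonferroni over subnetwork-level Simes p-values.
    screened = {name: p for name, p in subnetworks.items()
                if simes(p) <= alpha / len(subnetworks)}
    # Filtering: BH on the connections of the surviving subnetworks only.
    return {name: benjamini_hochberg(p, alpha) for name, p in screened.items()}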

Relevance: 20.00%

Publisher:

Abstract:

CONTEXT: A passive knee-extension test has been shown to be a reliable method of assessing hamstring tightness, but this method does not take into account the potential effect of gravity on the tested leg. OBJECTIVE: To compare the original passive knee-extension test with 2 adapted methods that account for the effect of gravity on the lower leg. DESIGN: Repeated measures. SETTING: Laboratory. PARTICIPANTS: 20 young track and field athletes (16.6 ± 1.6 y, 177.6 ± 9.2 cm, 75.9 ± 24.8 kg). INTERVENTION: Each subject was tested in a randomized order with 3 different methods: in the original one (M1), the passive knee angle was measured with a standard force of 68.7 N (7 kg) applied proximal to the lateral malleolus. The second (M2) and third (M3) methods took into account the relative lower-leg weight (measured with a handheld dynamometer and an anthropometric table, respectively) to individualize the force applied when assessing the passive knee angle. MAIN OUTCOME MEASURES: Passive knee angles measured with video-analysis software. RESULTS: No difference in mean individualized applied force was found between M2 and M3, so the authors assessed the passive knee angle only with M2. The mean knee angle differed between M1 and M2 (68.8 ± 12.4° vs 73.1 ± 10.6°, P < .001). Knee angles in M1 and M2 were correlated (r = .93, P < .001). CONCLUSIONS: Differences in knee angle were found between the original passive knee-extension test and a method with gravity correction. M2 is an improved version of the original method (M1) since it minimizes the effect of gravity; we therefore recommend using it rather than M1.
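
As a purely hypothetical illustration of such a gravity correction (the abstract does not give the formula used to individualize the applied force; the lower-leg mass and the subtraction rule below are assumptions):

# Hypothetical sketch: reduce the standard applied force by the weight of the
# lower leg, so that applied force plus gravity is comparable across subjects.
STANDARD_FORCE_N = 68.7      # force used in the original method M1
lower_leg_mass_kg = 4.4      # example value, e.g. measured with a handheld dynamometer
applied_force_n = STANDARD_FORCE_N - lower_leg_mass_kg * 9.81
print(f"Individualized applied force: {applied_force_n:.1f} N")   # ~25.5 N for this example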

Relevance: 20.00%

Publisher:

Abstract:

Images acquired using optical microscopes are inherently subject to vignetting effects due to imperfect illumination and image acquisition. Such vignetting effects hamper accurate extraction of quantitative information from biological images, leading to less effective image segmentation and increased noise in the measurements. Here, we describe a rapid and effective method for vignetting correction, which generates an estimate of the correction function from the background fluorescence without the need to acquire additional calibration images. We validate the usefulness of this algorithm using artificially distorted images as a gold standard for assessing the accuracy of the applied correction, and then demonstrate that this correction method enables the reliable detection of biologically relevant variation in cell populations. A simple user interface called FlattifY was developed and integrated into the image analysis platform YeastQuant to facilitate easy application of vignetting correction to a wide range of images.
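
A minimal sketch of this kind of correction (not the FlattifY implementation; estimating the correction field by Gaussian smoothing of the background, and the function and parameter names, are assumptions):

import numpy as np
from scipy.ndimage import gaussian_filter

def correct_vignetting(image, foreground_mask, sigma=50):
    # image: 2D float array; foreground_mask: True where cells are, so those pixels
    # are excluded from the background estimate; sigma sets the smoothness of the field.
    background = image.astype(float).copy()
    background[foreground_mask] = np.median(image[~foreground_mask])
    field = gaussian_filter(background, sigma=sigma)   # smooth estimate of the vignetting field
    field /= field.mean()                              # normalize to preserve overall intensity
    return image / field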

Relevance: 20.00%

Publisher:

Abstract:

Optimal behavior relies on flexible adaptation to environmental requirements, notably based on the detection of errors. The impact of error detection on subsequent behavior typically manifests as a slowing down of RTs following errors. Precisely how errors impact the processing of subsequent stimuli and in turn shape behavior remains unresolved. To address these questions, we used an auditory spatial go/no-go task in which continual feedback informed participants of whether they were too slow. We contrasted auditory-evoked potentials to left-lateralized go and right no-go stimuli as a function of performance on the preceding go stimuli, generating a 2 × 2 design with "preceding performance" (fast hit [FH], slow hit [SH]) and stimulus type (go, no-go) as within-subject factors. SH trials were more often followed by further SH trials than FH trials were, supporting our assumption that SHs engaged effects similar to errors. Electrophysiologically, auditory-evoked potentials were modulated topographically as a function of preceding performance 80-110 msec post-stimulus onset and then as a function of stimulus type at 110-140 msec, indicative of changes in the underlying brain networks. Source estimations revealed stronger activity of prefrontal regions in response to stimuli after successful than after error trials, followed by a stronger response of parietal areas to no-go than to go stimuli. We interpret these results in terms of a shift from a fast automatic to a slow controlled form of inhibitory control induced by the detection of errors, manifesting during low-level integration of task-relevant features of subsequent stimuli, which in turn influences response speed.

Relevance: 20.00%

Publisher:

Abstract:

PURPOSE: To evaluate the effect of a real-time adaptive trigger delay on image quality when correcting for heart rate variability in 3D whole-heart coronary MR angiography (MRA). MATERIALS AND METHODS: Twelve healthy adults underwent 3D whole-heart coronary MRA with and without the use of an adaptive trigger delay. The moment of minimal coronary artery motion was visually determined on a high-temporal-resolution MRI scan. Throughout the scan performed without adaptive trigger delay, the trigger delay was kept constant, whereas during the scan performed with adaptive trigger delay, the trigger delay was continuously updated after each RR-interval using physiological modeling. Signal-to-noise ratio, contrast-to-noise ratio, vessel length, vessel sharpness, and subjective image quality were compared in a blinded manner. RESULTS: Vessel sharpness improved significantly for the middle segment of the right coronary artery (RCA) with the use of the adaptive trigger delay (52.3 ± 7.1% versus 48.9 ± 7.9%, P = 0.026). Subjective image quality was significantly better in the middle segments of the RCA and the left anterior descending artery (LAD) when the scan was performed with the adaptive trigger delay rather than a constant trigger delay. CONCLUSION: Our results demonstrate that using an adaptive trigger delay to correct for heart rate variability improves image quality, mainly in the middle segments of the RCA and LAD.
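
A hedged sketch of what a real-time adaptive trigger delay could look like (the abstract does not specify the physiological model; the linear dependence of the rest-period onset on the RR-interval and the parameter values below are assumptions):

def adaptive_trigger_delay(rr_ms, rr_ref_ms, delay_ref_ms, slope=0.5):
    # Rescale the visually determined reference trigger delay when the heart rate changes.
    # rr_ms: most recent RR-interval; rr_ref_ms / delay_ref_ms: RR-interval and trigger
    # delay of the planning scan; slope: assumed fraction of an RR-interval change that
    # shifts the period of minimal coronary motion.
    return delay_ref_ms + slope * (rr_ms - rr_ref_ms)

# Example: planning scan at RR = 1000 ms with a 600 ms trigger delay; the next
# heartbeat has RR = 900 ms, so the delay is shortened to 550 ms.
print(adaptive_trigger_delay(900, 1000, 600))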

Relevance: 20.00%

Publisher:

Abstract:

Three-dimensional free-breathing coronary magnetic resonance angiography was performed in eight healthy volunteers with use of real-time navigator technology. Images acquired with the navigator localized at the right hemidiaphragm and at the left ventricle were objectively compared. The diaphragmatic navigator was found to be superior for vessel delineation of middle to distal portions of the coronary arteries.

Relevance: 20.00%

Publisher:

Abstract:

In groundwater applications, Monte Carlo methods are employed to model the uncertainty on geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and large numbers of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered when estimating the uncertainty. We propose to use the information from the approximate responses for uncertainty quantification. The subset of exact solutions provided by DKM is employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised; both employ the difference between the approximate and exact medoid solutions, but they differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which each realization belongs. These error models are evaluated on an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the MsFV results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques for selecting a subset of realizations.
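
A schematic sketch of the two error models (an illustration under simplifying assumptions, not the authors' code: responses are treated as scalars, and the Global Error Model is approximated here by a least-squares linear fit of the medoid errors in the kernel-space coordinates):

import numpy as np

def local_error_model(approx, labels, medoid_idx, exact_medoid):
    # Each realization is corrected by the error of its own cluster's medoid.
    # approx: (N,) approximate responses; labels: (N,) cluster index per realization,
    # numbered consistently with medoid_idx; medoid_idx: (K,) realization index of each
    # cluster's medoid; exact_medoid: (K,) exact responses computed at the medoids.
    medoid_error = exact_medoid - approx[medoid_idx]
    return approx + medoid_error[labels]

def global_error_model(approx, coords, medoid_idx, exact_medoid):
    # All realizations are corrected by a linear fit of the medoid errors as a function
    # of the kernel-space coordinates, regardless of cluster membership.
    # coords: (N, d) coordinates of the realizations in the multidimensional space.
    medoid_error = exact_medoid - approx[medoid_idx]
    A = np.column_stack([coords[medoid_idx], np.ones(len(medoid_idx))])
    beta, *_ = np.linalg.lstsq(A, medoid_error, rcond=None)
    return approx + np.column_stack([coords, np.ones(len(coords))]) @ beta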

Relevance: 20.00%

Publisher:

Abstract:

Attempts to use a stimulated echo acquisition mode (STEAM) in cardiac imaging are impeded by imaging artifacts that result in signal attenuation and nulling of the cardiac tissue. In this work, we present a method to reduce this artifact by acquiring two sets of stimulated echo images with two different demodulations. The resulting two images are combined to recover the signal loss and weighted to compensate for possible deformation-dependent intensity variation. Numerical simulations were used to validate the theory. Also, the proposed correction method was applied to in vivo imaging of normal volunteers (n = 6) and animal models with induced infarction (n = 3). The results show the ability of the method to recover the lost myocardial signal and generate artifact-free black-blood cardiac images.

Relevance: 20.00%

Publisher:

Abstract:

RATIONALE AND OBJECTIVES: The purpose of this study was to investigate the impact of real-time adaptive motion correction on image quality in navigator-gated, free-breathing, double-oblique, three-dimensional (3D) submillimeter right coronary magnetic resonance angiography (MRA). MATERIALS AND METHODS: Free-breathing 3D right coronary MRA with real-time navigator technology was performed in 10 healthy adult subjects with an in-plane spatial resolution of 700 × 700 µm. Identical double-oblique coronary MR angiograms were acquired with navigator gating alone and with combined navigator gating and real-time adaptive motion correction. Quantitative objective parameters of contrast-to-noise ratio (CNR) and vessel sharpness, as well as subjective image quality scores, were compared. RESULTS: Superior vessel sharpness, increased CNR, and superior image quality scores were found with combined navigator gating and real-time adaptive motion correction (vs. navigator gating alone; P < 0.01 for all comparisons). CONCLUSION: Real-time adaptive motion correction objectively and subjectively improves image quality in 3D navigator-gated, free-breathing, double-oblique, submillimeter right coronary MRA.
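
A hedged sketch of navigator gating combined with prospective motion correction (the 5 mm gating window and the 0.6 correction factor relating heart to diaphragm displacement are commonly used literature values assumed here, not values stated in the abstract):

def navigator_decision(displacement_mm, window_mm=5.0, correction_factor=0.6):
    # One gating decision per navigator measurement: data are accepted only when the
    # diaphragm lies within the gating window around the reference end-expiratory level,
    # and the accepted acquisition is prospectively shifted in the slice direction by a
    # fixed fraction of the measured displacement.
    accept = abs(displacement_mm) <= window_mm / 2
    slice_shift_mm = (correction_factor * displacement_mm) if accept else 0.0
    return accept, slice_shift_mm

print(navigator_decision(1.8))   # accepted, slice prospectively shifted by ~1.1 mm
print(navigator_decision(7.0))   # rejected, data reacquired later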

Relevance: 20.00%

Publisher:

Abstract:

The multiscale finite-volume (MSFV) method is designed to reduce the computational cost of elliptic and parabolic problems with highly heterogeneous anisotropic coefficients. The reduction is achieved by splitting the original global problem into a set of local problems (with approximate local boundary conditions) coupled by a coarse global problem. It has been shown recently that the numerical errors in MSFV results can be reduced systematically with an iterative procedure that provides a conservative velocity field after any iteration step. The iterative MSFV (i-MSFV) method can be obtained with an improved (smoothed) multiscale solution to enhance the localization conditions, with a Krylov subspace method [e.g., the generalized-minimal-residual (GMRES) algorithm] preconditioned by the MSFV system, or with a combination of both. In a multiphase-flow system, a balance between accuracy and computational efficiency should be achieved by finding a minimum number of i-MSFV iterations (on pressure), which is necessary to achieve the desired accuracy in the saturation solution. In this work, we extend the i-MSFV method to sequential implicit simulation of time-dependent problems. To control the error of the coupled saturation/pressure system, we analyze the transport error caused by an approximate velocity field. We then propose an error-control strategy on the basis of the residual of the pressure equation. At the beginning of simulation, the pressure solution is iterated until a specified accuracy is achieved. To minimize the number of iterations in a multiphase-flow problem, the solution at the previous timestep is used to improve the localization assumption at the current timestep. Additional iterations are used only when the residual becomes larger than a specified threshold value. Numerical results show that only a few iterations on average are necessary to improve the MSFV results significantly, even for very challenging problems. Therefore, the proposed adaptive strategy yields efficient and accurate simulation of multiphase flow in heterogeneous porous media.
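
A schematic sketch of the proposed residual-based error-control strategy in a sequential-implicit loop (illustrative only; the simulator object and its method names are hypothetical, not the authors' implementation):

def sequential_implicit(sim, n_steps, tol_initial, tol_adaptive):
    # sim is a hypothetical simulator exposing MSFV/i-MSFV building blocks.
    pressure = None
    for step in range(n_steps):
        # Reuse the previous time step's solution to improve the localization assumption.
        pressure = sim.msfv_pressure(initial_guess=pressure)
        # Tight tolerance at the beginning of the simulation, looser threshold afterwards:
        # additional i-MSFV iterations are triggered only when the residual grows too large.
        tol = tol_initial if step == 0 else tol_adaptive
        while sim.pressure_residual(pressure) > tol:
            pressure = sim.imsfv_iteration(pressure)   # smoothing or MSFV-preconditioned GMRES step
        velocity = sim.conservative_velocity(pressure)  # conservative after any iteration step
        sim.advance_saturation(velocity)                # sequential-implicit transport step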