92 results for Numerical Algorithms


Relevance: 20.00%

Abstract:

OBJECTIVE: Accuracy studies of Patient Safety Indicators (PSIs) are critical but limited by the large samples required due to the low occurrence of most events. We tested a sampling design based on test results (verification-biased sampling [VBS]) that minimizes the number of subjects to be verified. METHODS: We considered 3 real PSIs, whose rates were calculated using 3 years of discharge data from a university hospital, as well as a hypothetical screen for very rare events. Sample size estimates, based on the expected sensitivity and precision, were compared across 4 study designs: random and VBS, with and without constraints on the size of the population to be screened. RESULTS: Over sensitivities ranging from 0.3 to 0.7 and PSI prevalence levels ranging from 0.02 to 0.2, the optimal VBS strategy makes it possible to reduce the sample size by up to 60% in comparison with simple random sampling. For PSI prevalence levels below 1%, the minimal sample size required was still over 5000. CONCLUSIONS: Verification-biased sampling permits substantial savings in the required sample size for PSI validation studies. However, sample sizes still need to be very large for many of the rarer PSIs.
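The trade-off described above rests on standard binomial sample-size arithmetic. The sketch below only illustrates that arithmetic under simple random sampling, assuming one wants a 95% confidence interval of a given half-width around the estimated sensitivity; the specific verification-biased optimisation used in the study is not reproduced, and the function names and numbers are hypothetical.

```python
# Hedged illustration of precision-based sample-size arithmetic, not the
# study's VBS optimisation: how many verified events, and hence records,
# are needed to pin down sensitivity with a given CI half-width.
import math

def events_needed(sensitivity: float, half_width: float, z: float = 1.96) -> float:
    """Verified true events needed so the 95% CI on sensitivity has the requested half-width."""
    return z**2 * sensitivity * (1.0 - sensitivity) / half_width**2

def records_random_sampling(sensitivity: float, prevalence: float, half_width: float) -> float:
    """Under simple random sampling, events occur at rate `prevalence`,
    so roughly events_needed / prevalence records must be verified."""
    return events_needed(sensitivity, half_width) / prevalence

if __name__ == "__main__":
    for prev in (0.02, 0.1, 0.2):
        n = records_random_sampling(sensitivity=0.5, prevalence=prev, half_width=0.1)
        print(f"prevalence={prev:.2f}: ~{math.ceil(n)} records under simple random sampling")
```

Concentrating verification on screen-positive records, as VBS does, reduces the number of records that must be reviewed for the same precision.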

Relevance: 20.00%

Abstract:

Images of myocardial strain can be used to diagnose heart disease, to plan and monitor treatment, and to learn about cardiac structure and function. Three-dimensional (3D) strain is typically quantified using many magnetic resonance (MR) images obtained in two or three orthogonal planes. Problems with this approach include long scan times, image misregistration, and through-plane motion. This article presents a novel method for calculating cardiac 3D strain using a stack of two or more images acquired in only one orientation. The zHARP pulse sequence encodes in-plane motion using MR tagging and out-of-plane motion using phase encoding, and has previously been shown to be capable of computing 3D displacement within a single image plane. Here, data from two adjacent image planes are combined to yield a 3D strain tensor at each pixel; stacks of zHARP images can be used to derive stacked arrays of 3D strain tensors without imaging multiple orientations and without numerical interpolation. The performance and accuracy of the method are demonstrated in vitro on a phantom and in vivo in four healthy adult human subjects.
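As a minimal sketch of the per-voxel strain arithmetic only (not the zHARP phase processing that produces the displacements), the snippet below applies the Green-Lagrange definition E = ½(FᵀF − I), with F = I + ∂u/∂X, to a synthetic 3D displacement field; the grid, field and spacing are illustrative assumptions.

```python
# Hedged sketch: Green-Lagrange strain tensors from a 3D displacement field.
# The synthetic affine field below stands in for displacements that zHARP
# would derive from tag phase and through-plane phase encoding.
import numpy as np

def green_lagrange_strain(u, spacing=(1.0, 1.0, 1.0)):
    """u: array (3, nz, ny, nx) of displacement components ordered (z, y, x).
    Returns E: array (nz, ny, nx, 3, 3) of strain tensors per voxel."""
    grads = np.stack([np.stack(np.gradient(u[i], *spacing), axis=-1)
                      for i in range(3)], axis=-2)           # [..., i, j] = du_i/dX_j
    F = np.eye(3) + grads                                     # deformation gradient
    return 0.5 * (np.swapaxes(F, -1, -2) @ F - np.eye(3))     # Green-Lagrange strain

if __name__ == "__main__":
    nz, ny, nx = 4, 32, 32
    z, y, x = np.meshgrid(np.arange(nz), np.arange(ny), np.arange(nx), indexing="ij")
    # Simple affine stretch/compression, components ordered (z, y, x) to match the axes.
    u = np.stack([0.01 * z, -0.02 * y, 0.05 * x]).astype(float)
    E = green_lagrange_strain(u)
    print("strain tensor at one voxel:\n", E[2, 16, 16].round(4))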

Relevance: 20.00%

Abstract:

Due to advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather.

In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography.

With the advent of high-resolution digital elevation models, the field of spatial prediction has met new horizons. By exploiting image processing tools along with physical heuristics, a large number of terrain features accounting for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold-air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes.

Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial autocorrelation in the original space, making the use of classical geostatistics difficult.

The challenges explored in the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Second, a multiple-kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions.

The resulting maps of average wind speed find applications within renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
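As one concrete, hedged illustration of the kernel-based regression step (not the thesis pipeline itself), the sketch below fits a support vector regression from coordinates, elevation and placeholder terrain features to a synthetic wind-speed target; all feature names, data and hyperparameters are invented for the example.

```python
# Minimal sketch: SVR mapping geographic coordinates + DEM-derived features
# to wind speed. Synthetic data stand in for station measurements and for
# terrain descriptors such as slope, curvature and exposure.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0, 100, n),       # easting (km)
    rng.uniform(0, 100, n),       # northing (km)
    rng.uniform(200, 3500, n),    # elevation (m)
    rng.normal(0, 1, n),          # curvature-like feature (placeholder)
    rng.normal(0, 1, n),          # exposure-like feature (placeholder)
])
# Synthetic target: wind speed increasing with elevation and exposure, plus noise.
y = 2.0 + 0.002 * X[:, 2] + 1.5 * X[:, 4] + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)
print("R^2 on held-out stations:", round(model.score(X_te, y_te), 3))
```

In the same spirit, a trained model of this kind can be evaluated on a dense grid of DEM-derived features to produce a map of the target variable.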

Relevance: 20.00%

Abstract:

The objectives of this study were to develop a computerized method to screen for potentially avoidable hospital readmissions using routinely collected data, together with a prediction model to adjust rates for case mix. We studied hospital information system data for a random sample of 3,474 inpatients discharged alive in 1997 from a university hospital and the medical records of those (1,115) readmitted within 1 year. The gold standard was set on the basis of the hospital data and medical records: all readmissions were classified as foreseen readmissions, unforeseen readmissions for a new condition, or unforeseen readmissions for a previously known condition. The latter category was submitted to a systematic medical record review to identify the main cause of readmission. Potentially avoidable readmissions were defined as a subgroup of unforeseen readmissions for a previously known condition occurring within an appropriate interval, set to maximize the chance of detecting avoidable readmissions. The computerized screening algorithm was based strictly on routine statistics: diagnosis and procedure coding and admission mode. The prediction was based on a Poisson regression model. There were 454 (13.1%) unforeseen readmissions for a previously known condition within 1 year. Fifty-nine readmissions (1.7%) were judged avoidable, most of them occurring within 1 month, which was the interval used to define potentially avoidable readmissions (n = 174, 5.0%). The intra-sample sensitivity and specificity of the screening algorithm both reached approximately 96%. A higher risk of potentially avoidable readmission was associated with previous hospitalizations, a high comorbidity index, and a long length of stay; a lower risk was associated with surgery and delivery. The model offers satisfactory predictive performance and good medical plausibility. The proposed measure could be used as an indicator of inpatient care outcome. However, the instrument should be validated using other sets of data from various hospitals.
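A minimal sketch of the modelling step, assuming simulated data: a Poisson regression of readmission counts on routine covariates of the kind listed in the abstract (previous hospitalizations, comorbidity index, length of stay, surgery, delivery). Variable names and coefficients are illustrative, not estimates from the study.

```python
# Hedged sketch of the prediction step only: Poisson regression on routine
# covariates, fitted to simulated data of the same size as the study sample.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 3474
X = np.column_stack([
    rng.poisson(0.5, n),             # previous hospitalizations
    rng.integers(0, 6, n),           # comorbidity index
    rng.exponential(6.0, n),         # length of stay (days)
    rng.integers(0, 2, n),           # surgery (0/1)
    rng.integers(0, 2, n),           # delivery (0/1)
])
# Simulated counts from an assumed log-linear model (coefficients are invented).
lin = -3.5 + 0.4 * X[:, 0] + 0.25 * X[:, 1] + 0.03 * X[:, 2] - 0.5 * X[:, 3] - 0.8 * X[:, 4]
y = rng.poisson(np.exp(lin))

model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
rate_ratios = np.exp(model.params)
print("rate ratios (const, prior adm., comorbidity, LOS, surgery, delivery):")
print(rate_ratios.round(2))
```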

Relevance: 20.00%

Abstract:

BACKGROUND: Surveillance of multiple congenital anomalies is considered to be more sensitive for the detection of new teratogens than surveillance of all or isolated congenital anomalies. The current literature proposes the manual review of all cases for classification into isolated or multiple congenital anomalies. METHODS: Multiple anomalies were defined as two or more major congenital anomalies, excluding sequences and syndromes. A computer algorithm for classification of major congenital anomaly cases in the EUROCAT database according to International Classification of Diseases, 10th revision (ICD-10) codes was programmed, further developed, and implemented for 1 year's data (2004) from 25 registries. The group of cases classified with potential multiple congenital anomalies was manually reviewed by three geneticists to reach a final agreement on classification as "multiple congenital anomaly" cases. RESULTS: A total of 17,733 cases with major congenital anomalies were reported, giving an overall prevalence of major congenital anomalies of 2.17%. The computer algorithm classified 10.5% of all cases as "potentially multiple congenital anomalies". After manual review of these cases, 7% were agreed to have true multiple congenital anomalies. Furthermore, the algorithm classified 15% of all cases as having chromosomal anomalies, 2% as monogenic syndromes, and 76% as isolated congenital anomalies. The proportion of multiple anomalies varies by congenital anomaly subgroup, reaching up to 35% of cases with bilateral renal agenesis. CONCLUSIONS: The implementation of the EUROCAT computer algorithm is a feasible, efficient, and transparent way to improve the classification of congenital anomalies for surveillance and research.
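A hedged sketch of what such a rule-based classification step can look like; the real EUROCAT flowchart relies on curated ICD-10 code lists for chromosomal anomalies, monogenic syndromes, sequences and minor anomalies that are not reproduced here, so the code prefixes below are placeholders only.

```python
# Illustrative rule-based classifier for congenital anomaly cases given their
# ICD-10 codes. The prefix lists are placeholders, NOT the EUROCAT code lists.
CHROMOSOMAL_PREFIXES = ("Q90", "Q91", "Q92", "Q93", "Q96", "Q97", "Q98", "Q99")
MONOGENIC_PREFIXES = ("Q87",)          # placeholder for the syndrome code list
EXCLUDED_PREFIXES = ("Q35",)           # placeholder for sequences / minor anomalies

def classify_case(icd10_codes):
    """Classify one case as chromosomal, monogenic syndrome, isolated,
    or potentially multiple (the last group would go to manual review)."""
    codes = [c.upper() for c in icd10_codes]
    if any(c.startswith(CHROMOSOMAL_PREFIXES) for c in codes):
        return "chromosomal"
    if any(c.startswith(MONOGENIC_PREFIXES) for c in codes):
        return "monogenic syndrome"
    majors = [c for c in codes if not c.startswith(EXCLUDED_PREFIXES)]
    return "potentially multiple" if len(majors) >= 2 else "isolated"

print(classify_case(["Q21.0"]))              # -> isolated
print(classify_case(["Q21.0", "Q79.2"]))     # -> potentially multiple (manual review)
print(classify_case(["Q90.0", "Q21.0"]))     # -> chromosomal
```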

Relevance: 20.00%

Abstract:

Coordination games are important to explain efficient and desirable social behavior. Here we study these games by extensive numerical simulation on networked social structures using an evolutionary approach. We show that local network effects may promote the selection of efficient equilibria in both pure and general coordination games and may explain social polarization. These results are put into perspective with respect to known theoretical results. The main insight we obtain is that clustering, and especially community structure, in social networks has a positive role in promoting socially efficient outcomes.
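A minimal sketch of this kind of simulation, assuming a pure 2×2 coordination game with a payoff-dominant strategy A, an imitate-the-best update rule and a clustered Watts-Strogatz network; the payoffs, topology and update rule are illustrative choices, not the exact settings of the paper.

```python
# Hedged sketch: evolutionary dynamics of a pure coordination game on a
# clustered network, with synchronous imitate-the-best updating.
import random
import networkx as nx

# A is the efficient (payoff-dominant) equilibrium; miscoordination pays 0.
PAYOFF = {("A", "A"): 1.0, ("B", "B"): 0.6, ("A", "B"): 0.0, ("B", "A"): 0.0}

def payoff(G, strat, node):
    return sum(PAYOFF[(strat[node], strat[nb])] for nb in G[node])

def step(G, strat):
    new = dict(strat)
    for node in G:
        # Imitate the highest-earning player in the closed neighbourhood.
        best = max(list(G[node]) + [node], key=lambda v: payoff(G, strat, v))
        new[node] = strat[best]
    return new

random.seed(0)
G = nx.connected_watts_strogatz_graph(200, k=6, p=0.1, seed=0)  # clustered network
strat = {v: random.choice("AB") for v in G}
for _ in range(30):
    strat = step(G, strat)
print("fraction playing the efficient equilibrium A:",
      sum(s == "A" for s in strat.values()) / G.number_of_nodes())
```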

Relevance: 20.00%

Abstract:

BACKGROUND: Active screening by mobile teams is considered the best method for detecting human African trypanosomiasis (HAT) caused by Trypanosoma brucei gambiense, but the current funding context in many post-conflict countries limits this approach. As an alternative, non-specialist health care workers (HCWs) in peripheral health facilities could be trained to identify potential cases who need testing based on their symptoms. We explored the predictive value of syndromic referral algorithms to identify symptomatic cases of HAT among a treatment-seeking population in Nimule, South Sudan. METHODOLOGY/PRINCIPAL FINDINGS: Symptom data from 462 patients (27 cases) presenting for a HAT test via passive screening over a 7-month period were collected to construct and evaluate over 14,000 four-item syndromic algorithms considered simple enough to be used by peripheral HCWs. For comparison, algorithms developed in other settings were also tested on our data, and a panel of expert HAT clinicians was asked to make referral decisions based on the symptom dataset. The best-performing algorithms consisted of three core symptoms (sleep problems, neurological problems and weight loss), with or without a history of oedema, cervical adenopathy or proximity to livestock. They had a sensitivity of 88.9-92.6%, a negative predictive value of up to 98.8% and a positive predictive value in this context of 8.4-8.7%. In terms of sensitivity, these outperformed more complex algorithms identified in other studies, as well as the expert panel. The best-performing algorithm is predicted to identify about 9/10 treatment-seeking HAT cases, though only 1/10 patients referred would test positive. CONCLUSIONS/SIGNIFICANCE: In the absence of regular active screening, improving referrals of HAT patients through other means is essential. Systematic use of syndromic algorithms by peripheral HCWs has the potential to increase case detection and would increase their participation in HAT programmes. The algorithms proposed here, though promising, should be validated elsewhere.
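As a hedged illustration of how a candidate four-item rule can be scored against a labelled symptom table, the sketch below computes sensitivity, positive predictive value and negative predictive value on simulated data with the same case/control counts; the rule, symptom prevalences and data are invented, not the Nimule dataset or the selected algorithms.

```python
# Hedged sketch: scoring one candidate syndromic referral rule on a
# simulated case/control symptom table (462 patients, 27 cases).
import numpy as np

rng = np.random.default_rng(2)
n, n_cases = 462, 27
is_case = np.zeros(n, dtype=bool); is_case[:n_cases] = True
# Simulated presence/absence of three core symptoms + one optional item.
p_case, p_ctrl = (0.7, 0.6, 0.5, 0.3), (0.2, 0.15, 0.2, 0.05)
symptoms = np.column_stack([
    np.where(is_case, rng.random(n) < pc, rng.random(n) < pn)
    for pc, pn in zip(p_case, p_ctrl)
])

# Example rule: refer if at least two of the four items are present.
referred = symptoms.sum(axis=1) >= 2

tp = np.sum(referred & is_case)
fn = np.sum(~referred & is_case)
fp = np.sum(referred & ~is_case)
tn = np.sum(~referred & ~is_case)
print(f"sensitivity = {tp / (tp + fn):.3f}")
print(f"PPV         = {tp / (tp + fp):.3f}")
print(f"NPV         = {tn / (tn + fn):.3f}")
```

Enumerating every four-item combination and scoring each one in this way is what a search over "over 14,000" candidate algorithms amounts to.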

Relevance: 20.00%

Abstract:

The flow of two immiscible fluids through a porous medium depends on the complex interplay between gravity, capillarity, and viscous forces. The interaction between these forces and the geometry of the medium gives rise to a variety of complex flow regimes that are difficult to describe using continuum models. Although a number of pore-scale models have been employed, a careful investigation of the macroscopic effects of pore-scale processes requires methods based on conservation principles in order to reduce the number of modeling assumptions. In this work we perform direct numerical simulations of drainage by solving the Navier-Stokes equations in the pore space and employing the Volume of Fluid (VOF) method to track the evolution of the fluid-fluid interface. After demonstrating that the method is able to deal with large viscosity contrasts and model the transition from stable flow to viscous fingering, we focus on the macroscopic capillary pressure and compare different definitions of this quantity under quasi-static and dynamic conditions. We show that the difference between the intrinsic phase-averaged pressures, which is commonly used as the definition of Darcy-scale capillary pressure, is subject to several limitations and is not accurate in the presence of viscous effects or trapping. In contrast, a definition based on the variation of the total surface energy provides an accurate estimate of the macroscopic capillary pressure. This definition, which links the capillary pressure to its physical origin, allows a better separation of viscous effects and does not depend on the presence of trapped fluid clusters.
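A minimal numerical sketch of the energy-based definition mentioned above, assuming the macroscopic capillary pressure is taken as the change in total surface energy per unit volume of non-wetting fluid injected; the interfacial tension, contact angle, sign convention for the wall term, and per-snapshot areas and volumes are invented values standing in for quantities a VOF solver would report.

```python
# Hedged sketch: energy-based estimate of macroscopic capillary pressure,
# p_c ≈ dE_surf / dV_nw, from a sequence of (invented) VOF snapshot statistics.
import numpy as np

sigma = 0.03            # interfacial tension (N/m), illustrative
theta = np.deg2rad(40)  # contact angle, illustrative

# Per-snapshot quantities a VOF solver could report:
V_nw = np.array([1.0e-9, 1.2e-9, 1.4e-9, 1.6e-9])   # non-wetting phase volume (m^3)
A_ff = np.array([2.0e-6, 2.6e-6, 3.3e-6, 4.1e-6])   # fluid-fluid interface area (m^2)
A_ws = np.array([1.0e-6, 1.3e-6, 1.7e-6, 2.2e-6])   # wall area in contact with non-wetting phase (m^2)

# Total surface energy; the wall term uses Young's equation (sign convention assumed here).
E_surf = sigma * (A_ff + np.cos(theta) * A_ws)
p_c = np.diff(E_surf) / np.diff(V_nw)                # energy-based capillary pressure (Pa)
print("energy-based capillary pressure between snapshots (Pa):", p_c.round(1))
```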

Relevance: 20.00%

Abstract:

The noise power spectrum (NPS) is the reference metric for understanding the noise content in computed tomography (CT) images. To evaluate the noise properties of clinical multidetector CT (MDCT) scanners, local 2D and 3D NPSs were computed for different acquisition and reconstruction parameters.

A 64- and a 128-MDCT scanner were employed. Measurements were performed on a water phantom in axial and helical acquisition modes. The CT dose index was identical for both installations. The influence of parameters such as the pitch, the reconstruction filter (soft, standard and bone) and the reconstruction algorithm (filtered back-projection (FBP), adaptive statistical iterative reconstruction (ASIR)) was investigated. Images were also reconstructed in the coronal plane using a reformat process. 2D and 3D NPSs were then computed.

In axial acquisition mode, the 2D axial NPS showed an important magnitude variation as a function of the z-direction when measured at the phantom center. In helical mode, a directional dependency with a lobular shape was observed while the magnitude of the NPS remained constant. Important effects of the reconstruction filter, pitch and reconstruction algorithm were observed in the 3D NPS results for both MDCTs. With ASIR, a reduction of the NPS magnitude and a shift of the NPS peak to the low-frequency range were visible. The 2D coronal NPS obtained from the reformatted images was affected by the interpolation when compared to the 2D coronal NPS obtained from 3D measurements.

The noise properties of volumes measured on last-generation MDCTs were studied using the local 3D NPS metric. However, the impact of noise non-stationarity may need further investigation.
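For reference, a hedged sketch of the standard 2D NPS estimator (mean-subtracted ROIs, squared DFT magnitude scaled by pixel area over ROI size), applied to synthetic white noise in place of the water-phantom data; the local and 3D extensions used in the study are not reproduced.

```python
# Hedged sketch of a standard 2D NPS estimate from an ensemble of ROIs.
import numpy as np

def nps_2d(rois, pixel_size):
    """rois: array (n_roi, N, N) in HU; pixel_size: (dx, dy) in mm.
    Returns the 2D noise power spectrum in HU^2 mm^2."""
    n_roi, ny, nx = rois.shape
    dx, dy = pixel_size
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)   # remove each ROI's mean
    spectra = np.abs(np.fft.fft2(detrended)) ** 2
    return (dx * dy) / (nx * ny) * spectra.mean(axis=0)

rng = np.random.default_rng(3)
rois = rng.normal(0.0, 10.0, size=(64, 64, 64))   # 64 ROIs of 64x64 "water" noise, sigma = 10 HU
nps = nps_2d(rois, pixel_size=(0.5, 0.5))
# For white noise the NPS is flat at roughly variance x pixel area = 100 * 0.25 = 25.
print("mean NPS value:", round(float(nps.mean()), 3), "HU^2 mm^2")
```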

Relevance: 20.00%

Abstract:

The state of the art for describing image quality in medical imaging is to assess the performance of an observer conducting a task of clinical interest. This can be done by using a model observer leading to a figure of merit such as the signal-to-noise ratio (SNR). Using the non-prewhitening (NPW) model observer, we objectively characterised the evolution of its figure of merit under various acquisition conditions. The NPW model observer usually requires the modulation transfer function (MTF) as well as noise power spectra. However, although the computation of the MTF poses no problem when dealing with the traditional filtered back-projection (FBP) algorithm, this is not the case when using iterative reconstruction (IR) algorithms, such as adaptive statistical iterative reconstruction (ASIR) or model-based iterative reconstruction (MBIR). Given that the target transfer function (TTF) had already been shown to accurately express the system resolution even with non-linear algorithms, we decided to tune the NPW model observer, replacing the standard MTF by the TTF. The TTF was estimated using a custom-made phantom containing cylindrical inserts surrounded by water. The contrast differences between the inserts and water were plotted for each acquisition condition, and mathematical transformations were then performed, leading to the TTF. As expected, the first results showed a dependency of the TTF on the image contrast and noise levels for both ASIR and MBIR. Moreover, FBP also proved to be dependent on the contrast and noise when using the lung kernel. These results were then introduced into the NPW model observer. We observed an enhancement of the SNR every time we switched from FBP to ASIR to MBIR. IR algorithms greatly improve image quality, especially in low-dose conditions. Based on our results, the use of MBIR could lead to further dose reduction in several clinical applications.
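A minimal sketch of one common frequency-domain form of the NPW figure of merit, in which the system resolution term is the TTF as described above: with the expected (TTF-blurred) signal as template, SNR² = [Σ|W|²·TTF²]² / Σ|W|²·TTF²·NPS over spatial frequencies. The disk task, Gaussian TTF and NPS model below are illustrative assumptions, not measured data.

```python
# Hedged sketch: discrete NPW SNR from a task function W, a resolution model
# (TTF) and a noise power spectrum, all defined on the same frequency grid.
import numpy as np

N, pix = 128, 0.5                                   # grid size, pixel pitch (mm)
f = np.fft.fftfreq(N, d=pix)
fx, fy = np.meshgrid(f, f)
fr = np.hypot(fx, fy)                               # radial spatial frequency (mm^-1)

# Task: 4 mm diameter, 20 HU contrast disk (its transform computed numerically).
x = (np.arange(N) - N / 2) * pix
xx, yy = np.meshgrid(x, x)
disk = 20.0 * (np.hypot(xx, yy) <= 2.0)
W = np.fft.fft2(disk) * pix**2                      # task function (HU mm^2)

TTF = np.exp(-(fr / 0.35) ** 2)                     # illustrative resolution model
NPS = 20.0 * np.exp(-(fr / 0.6) ** 2) + 2.0         # illustrative NPS (HU^2 mm^2)

df = 1.0 / (N * pix)                                # frequency bin width (mm^-1)
num = (np.sum(np.abs(W) ** 2 * TTF**2) * df**2) ** 2
den = np.sum(np.abs(W) ** 2 * TTF**2 * NPS) * df**2
print("NPW SNR:", round(float(np.sqrt(num / den)), 2))
```

Swapping in TTFs and NPSs measured under different doses or reconstruction algorithms is what allows the figure of merit to be compared across FBP, ASIR and MBIR.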