67 results for rule-based algorithms
Abstract:
New evidence shows that older adults need more dietary protein than do younger adults to support good health, promote recovery from illness, and maintain functionality. Older people need to make up for age-related changes in protein metabolism, such as high splanchnic extraction and declining anabolic responses to ingested protein. They also need more protein to offset inflammatory and catabolic conditions associated with chronic and acute diseases that occur commonly with aging. With the goal of developing updated, evidence-based recommendations for optimal protein intake by older people, the European Union Geriatric Medicine Society (EUGMS), in cooperation with other scientific organizations, appointed an international study group to review dietary protein needs with aging (PROT-AGE Study Group). To help older people (>65 years) maintain and regain lean body mass and function, the PROT-AGE study group recommends an average daily intake at least in the range of 1.0 to 1.2 g protein per kilogram of body weight per day. Both endurance- and resistance-type exercises are recommended at individualized levels that are safe and tolerated, and higher protein intake (i.e., ≥1.2 g/kg body weight/d) is advised for those who are exercising and otherwise active. Most older adults who have acute or chronic diseases need even more dietary protein (i.e., 1.2-1.5 g/kg body weight/d). Older people with severe kidney disease (i.e., estimated GFR <30 mL/min/1.73 m²), but who are not on dialysis, are an exception to this rule; these individuals may need to limit protein intake. Protein quality, timing of ingestion, and intake of other nutritional supplements may be relevant, but evidence is not yet sufficient to support specific recommendations. Older people are vulnerable to losses in physical function capacity, and such losses predict loss of independence, falls, and even mortality. Thus, future studies aimed at pinpointing optimal protein intake in specific populations of older people need to include measures of physical function.
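The dosing recommendations above are concrete enough to express as a small rule-based lookup. Below is a minimal illustrative sketch; the function name and encoding are ours, only the g/kg ranges come from the abstract, and the upper bound for active adults is an assumption (the abstract gives only a lower bound of 1.2 g/kg/d):

```python
def protein_target_g(weight_kg, *, disease=False, active=False, egfr=None):
    """Illustrative rule-based encoding of the PROT-AGE ranges (g protein/day).

    Returns a (low, high) daily target in grams, or None where the guideline
    calls for individualized restriction instead of a range.
    """
    if egfr is not None and egfr < 30:
        return None                # severe CKD, not on dialysis: may need to limit protein
    if disease:
        lo, hi = 1.2, 1.5          # acute or chronic disease: 1.2-1.5 g/kg/d
    elif active:
        lo, hi = 1.2, 1.5          # exercising/active: >= 1.2 g/kg/d (upper bound assumed here)
    else:
        lo, hi = 1.0, 1.2          # healthy older adults: 1.0-1.2 g/kg/d
    return lo * weight_kg, hi * weight_kg

print(protein_target_g(70.0))      # (70.0, 84.0) for a healthy 70 kg adult
```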
Abstract:
Background: The Pulmonary Embolism Rule-out Criteria (PERC) rule is a clinical diagnostic rule designed to exclude pulmonary embolism (PE) without further testing. We sought to externally validate the diagnostic performance of the PERC rule alone and combined with clinical probability assessment based on the revised Geneva score. Methods: The PERC rule was applied retrospectively to consecutive patients who presented with a clinical suspicion of PE to six emergency departments, and who were enrolled in a randomized trial of PE diagnosis. Patients who met all eight PERC criteria [PERC(-)] were considered to be at a very low risk for PE. We calculated the prevalence of PE among PERC(-) patients according to their clinical pretest probability of PE. We estimated the negative likelihood ratio of the PERC rule to predict PE. Results: Among 1675 patients, the prevalence of PE was 21.3%. Overall, 13.2% of patients were PERC(-). The prevalence of PE was 5.4% [95% confidence interval (CI): 3.1-9.3%] among PERC(-) patients overall and 6.4% (95% CI: 3.7-10.8%) among those PERC(-) patients with a low clinical pretest probability of PE. The PERC rule had a negative likelihood ratio of 0.70 (95% CI: 0.67-0.73) for predicting PE overall, and 0.63 (95% CI: 0.38-1.06) in low-risk patients. Conclusions: Our results suggest that the PERC rule alone or even when combined with the revised Geneva score cannot safely identify very low risk patients in whom PE can be ruled out without additional testing, at least in populations with a relatively high prevalence of PE.
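The abstract does not enumerate the eight criteria; for reference, the published PERC rule (Kline et al.) can be sketched as a simple conjunction of checks. Variable names are ours, and the thresholds should be verified against the original rule before any use:

```python
def perc_negative(age, pulse, sao2, hemoptysis, estrogen_use,
                  prior_vte, unilateral_leg_swelling, recent_surgery_or_trauma):
    """Return True if all eight PERC criteria are met (patient is 'PERC(-)').

    Thresholds follow the published rule of Kline et al.; names are ours.
    """
    return (age < 50                            # age under 50 years
            and pulse < 100                     # heart rate under 100 bpm
            and sao2 >= 95                      # room-air oxygen saturation, %
            and not hemoptysis
            and not estrogen_use
            and not prior_vte                   # no history of DVT or PE
            and not unilateral_leg_swelling
            and not recent_surgery_or_trauma)   # within the previous 4 weeks
```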
Abstract:
This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted based on the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on state-of-the-art targeting methods and at the same time to reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting experts' variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.
Abstract:
In this paper, we present an efficient numerical scheme for the recently introduced geodesic active fields (GAF) framework for geometric image registration. This framework considers the registration task as a weighted minimal surface problem. Hence, the data term and the regularization term are combined through multiplication in a single, parametrization-invariant and geometric cost functional. The multiplicative coupling provides an intrinsic, spatially varying and data-dependent tuning of the regularization strength, and the parametrization invariance allows working with images of nonflat geometry, generally defined on any smoothly parametrizable manifold. The resulting energy-minimizing flow, however, has poor numerical properties. Here, we provide an efficient numerical scheme that uses a splitting approach: data and regularity terms are optimized over two distinct deformation fields that are constrained to be equal via an augmented Lagrangian approach. Our approach is more flexible than standard Gaussian regularization, since one can interpolate freely between isotropic Gaussian and anisotropic TV-like smoothing. In this paper, we compare the geodesic active fields method with the popular Demons method and three more recent state-of-the-art algorithms: NL-optical flow, MRF image registration, and landmark-enhanced large displacement optical flow. We can thus show the advantages of the proposed FastGAF method. It compares favorably against Demons, both in terms of registration speed and quality. Over the range of example applications, it also consistently produces results not far from more dedicated state-of-the-art methods, illustrating the flexibility of the proposed framework.
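In rough terms (our notation, not the paper's), the multiplicative coupling and the splitting can be sketched as follows: the registration energy weights the area element of the deformation field's embedding by a data-dependent factor, and the splitting decouples data and regularity over two fields tied together by an augmented Lagrangian:

```latex
% Sketch of a GAF-style energy and its operator splitting (our notation).
% f: data-dependent weight; g_u: metric induced by the deformation field u.
E[u] = \int_\Omega f\big(I_1(x),\, I_2(u(x))\big)\,\sqrt{\det g_u(x)}\;\mathrm{d}x

% Splitting: optimize a data term D over u and a regularity term R over v,
% with u = v enforced via an augmented Lagrangian:
\min_{u,v}\;\max_{\lambda}\; D(u) + R(v)
  + \langle \lambda,\, u - v \rangle + \tfrac{\mu}{2}\,\| u - v \|^2
```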
Abstract:
OBJECTIVE: Accuracy studies of Patient Safety Indicators (PSIs) are critical but limited by the large samples required due to low occurrence of most events. We tested a sampling design based on test results (verification-biased sampling [VBS]) that minimizes the number of subjects to be verified. METHODS: We considered 3 real PSIs, whose rates were calculated using 3 years of discharge data from a university hospital and a hypothetical screen of very rare events. Sample size estimates, based on the expected sensitivity and precision, were compared across 4 study designs: random and VBS, with and without constraints on the size of the population to be screened. RESULTS: Over sensitivities ranging from 0.3 to 0.7 and PSI prevalence levels ranging from 0.02 to 0.2, the optimal VBS strategy makes it possible to reduce sample size by up to 60% in comparison with simple random sampling. For PSI prevalence levels below 1%, the minimal sample size required was still over 5000. CONCLUSIONS: Verification-biased sampling permits substantial savings in the required sample size for PSI validation studies. However, sample sizes still need to be very large for many of the rarer PSIs.
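For intuition about why simple random sampling is so expensive here, a back-of-envelope normal-approximation count (not the paper's optimal-design formulas) shows how low prevalence inflates the number of records to screen:

```python
from math import ceil

def screened_needed_random(se=0.5, d=0.05, prev=0.02, z=1.96):
    """Back-of-envelope: records to screen under simple random sampling so that
    sensitivity se is estimated within +/-d (normal approximation).

    Illustrative only -- not the optimal verification-biased design of the paper.
    """
    n_pos = z**2 * se * (1 - se) / d**2   # condition-positive subjects to verify
    return ceil(n_pos / prev)             # random sampling: inflate by 1/prevalence

print(screened_needed_random(se=0.5, d=0.05, prev=0.02))  # ~19,000 records
```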
Abstract:
Multi-centre data repositories like the Alzheimer's Disease Neuroimaging Initiative (ADNI) offer a unique research platform, but pose questions concerning comparability of results when using a range of imaging protocols and data processing algorithms. The variability is mainly due to the non-quantitative character of the widely used structural T1-weighted magnetic resonance (MR) images. Although the stability of the main effect of Alzheimer's disease (AD) on brain structure across platforms and field strengths has been addressed in previous studies using multi-site MR images, there are only sparse empirically based recommendations for processing and analysis of pooled multi-centre structural MR data acquired at different magnetic field strengths (MFS). Aiming to minimise potential systematic bias when using ADNI data, we investigate the specific contributions of spatial registration strategies and the impact of MFS on voxel-based morphometry in AD. We perform a whole-brain analysis within the framework of Statistical Parametric Mapping, testing for main effects of various diffeomorphic spatial registration strategies and of MFS, and for their interaction with disease status. Beyond the confirmation of medial temporal lobe volume loss in AD, we detect a significant impact of spatial registration strategy on the estimation of AD-related atrophy. Additionally, we report a significant effect of MFS on the assessment of brain anatomy in (i) the cerebellum, (ii) the precentral gyrus, and (iii) the thalamus bilaterally, showing no interaction with disease status. We provide empirical evidence in support of pooling data in multi-centre VBM studies irrespective of disease status or MFS.
Abstract:
Studies based on skin and challenge tests have shown that 12-60% of children with suspected betalactam hypersensitivity were allergic to betalactams. Responses in skin and challenge tests were studied in 1865 children with suspected betalactam allergy (i) to confirm or rule out the suspected diagnosis; (ii) to evaluate diagnostic value of immediate and non-immediate responses in skin and challenge tests; (iii) to determine frequency of betalactam allergy in those children, and (iv) to determine potential risk factors for betalactam allergy. The work-up was completed in 1431 children, of whom 227 (15.9%) were diagnosed allergic to betalactams. Betalactam hypersensitivity was diagnosed in 50 of the 162 (30.9%) children reporting immediate reactions and in 177 of the 1087 (16.7%) children reporting non-immediate reactions (p < 0.001). The likelihood of betalactam hypersensitivity was also significantly higher in children reporting anaphylaxis, serum sickness-like reactions, and (potentially) severe skin reactions such as acute generalized exanthematic pustulosis, Stevens-Johnson syndrome, and drug reaction with systemic symptoms than in other children (p < 0.001). Skin tests diagnosed 86% of immediate and 31.6% of non-immediate sensitizations. Cross-reactivity and/or cosensitization among betalactams was diagnosed in 76% and 14.7% of the children with immediate and non-immediate hypersensitivity, respectively. The number of children diagnosed allergic to betalactams decreased with time between the reaction and the work-up, probably because the majority of children with severe and worrying reactions were referred for allergological work-up more promptly than the other children. Sex, age, and atopy were not risk factors for betalactam hypersensitivity. In conclusion, we confirm in numerous children that (i) only a few children with suspected betalactam hypersensitivity are allergic to betalactams; (ii) the likelihood of betalactam allergy increases with earliness and/or severity of the reactions; (iii) although non-immediate-reading skin tests (intradermal and patch tests) may diagnose non-immediate sensitizations in children with non-immediate reactions to betalactams (maculopapular rashes and potentially severe skin reactions especially), the diagnostic value of non-immediate-reading skin tests is far lower than the diagnostic value of immediate-reading skin tests, most non-immediate sensitizations to betalactams being diagnosed by means of challenge tests; (iv) cross-reactivity and/or cosensitizations among betalactams are much more frequent in children reporting immediate and/or anaphylactic reactions than in the other children; (v) age, sex and personal atopy are not significant risk factors for betalactam hypersensitivity; and (vi) the number of children with diagnosed allergy to betalactams (of the immediate-type hypersensitivity especially) decreases with time between the reaction and allergological work-up. Finally, based on our experience, we also propose a practical diagnostic approach in children with suspected betalactam hypersensitivity.
Abstract:
Due to the advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather. In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for mapping of meteorological variables in complex orography. With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a very large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes. Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial autocorrelation in the original space, making the use of classical geostatistics cumbersome. The challenges explored in the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Second, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performance figures and confidence intervals characterizing the uncertainty of predictions. The resulting maps of average wind speeds find applications in renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
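A minimal sketch of the core idea (kernel-based regression from geographic and terrain features to station measurements, with model complexity tuned by cross-validation) might look like the following. The feature list and data are placeholders, not the thesis pipeline:

```python
# Illustrative sketch: nonlinear regression from geographic + terrain features
# to station wind speed with a kernel method, in the spirit described above.
import numpy as np
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.random((500, 6))   # e.g. [x, y, elevation, slope, curvature, exposure]
y = rng.random(500)        # mean wind speed at 500 hypothetical stations

# Complexity (C, epsilon) and kernel bandwidth (gamma) are selected by
# cross-validation to impose smoothness and damp noisy measurements.
grid = GridSearchCV(
    SVR(kernel="rbf"),
    {"C": [1, 10, 100], "gamma": [0.1, 1, 10], "epsilon": [0.01, 0.1]},
    cv=KFold(n_splits=5, shuffle=True, random_state=0),
)
grid.fit(X, y)
wind_map = grid.predict(X)  # in practice: predict on a DEM-derived feature grid
```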
Abstract:
Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects, as well as strategies for atlas construction. In this paper we present a review of the automated approaches for atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and suggest new research directions. We use two different criteria to present the methods. First, we refer to the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.
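As a concrete instance of the multi-atlas strategy, the simplest label-fusion rule is a per-voxel majority vote over labels propagated from several registered atlases. A minimal sketch (real pipelines often weight votes by local image similarity):

```python
import numpy as np

def majority_vote_fusion(propagated_labels):
    """Fuse labels propagated from several registered atlases by majority vote.

    propagated_labels: integer array of shape (n_atlases, *image_shape),
    each atlas segmentation already warped into the target space.
    """
    labels = np.asarray(propagated_labels)
    n_classes = labels.max() + 1
    # Count votes per class at every voxel, then take the most-voted class.
    votes = np.stack([(labels == c).sum(axis=0) for c in range(n_classes)])
    return votes.argmax(axis=0)

# Example: three 2x2 'atlas segmentations' of the same target image
fused = majority_vote_fusion([[[0, 1], [1, 1]],
                              [[0, 1], [0, 1]],
                              [[0, 0], [1, 1]]])
print(fused)  # [[0 1]
              #  [1 1]]
```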
Abstract:
Background: Johanson-Blizzard syndrome (JBS; OMIM 243800) is an autosomal recessive disorder that includes congenital exocrine pancreatic insufficiency, facial dysmorphism with characteristic nasal wing hypoplasia, multiple malformations, and frequent mental retardation. Our previous work has shown that JBS is caused by mutations in human UBR1, which encodes one of the E3 ubiquitin ligases of the N-end rule pathway. The N-end rule relates the regulation of the in vivo half-life of a protein to the identity of its N-terminal residue. One class of degradation signals (degrons) recognized by UBR1 comprises destabilizing N-terminal residues of protein substrates. Methodology/Principal Findings: Most JBS-causing alterations of UBR1 are nonsense, frameshift or splice-site mutations that abolish UBR1 activity. We report here missense mutations of human UBR1 in patients with milder variants of JBS. These single-residue changes, including a previously reported missense mutation, involve positions in the RING-H2 and UBR domains of UBR1 that are conserved among eukaryotes. Taking advantage of this conservation, we constructed alleles of the yeast Saccharomyces cerevisiae UBR1 that were counterparts of missense JBS-UBR1 alleles. Among these yeast Ubr1 mutants, one (H160R) was inactive in yeast-based activity assays, another (Q1224E) had detectable but weak activity, and the third (V146L) exhibited decreased but significant activity, in agreement with the manifestations of JBS in the corresponding patients. Conclusions/Significance: These results, made possible by modeling defects of a human ubiquitin ligase in its yeast counterpart, confirmed the relevance of specific missense UBR1 alleles to JBS, and suggested that residual activity of a missense allele is causally associated with milder variants of JBS.
Abstract:
Because data on rare species usually are sparse, it is important to have efficient ways to sample additional data. Traditional sampling approaches are of limited value for rare species because a very large proportion of randomly chosen sampling sites are unlikely to shelter the species. For these species, spatial predictions from niche-based distribution models can be used to stratify the sampling and increase sampling efficiency. New data sampled are then used to improve the initial model. Applying this approach repeatedly is an adaptive process that may increase the number of new occurrences found. We illustrate the approach with a case study of a rare and endangered plant species in Switzerland and a simulation experiment. Our field survey confirmed that the method helps in the discovery of new populations of the target species in remote areas where the predicted habitat suitability is high. In our simulations the model-based approach provided a significant improvement (by a factor of 1.8 to 4, depending on the measure) over simple random sampling. In terms of cost, this approach may save up to 70% of the time spent in the field.
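The adaptive loop described above (fit the niche-based model, visit the most suitable unvisited sites, add the outcomes, refit) can be sketched schematically; `fit_sdm` and `survey` are placeholders for the distribution model and the field visit:

```python
import numpy as np

def adaptive_sampling(sites_X, X_obs, y_obs, fit_sdm, survey,
                      n_rounds=3, batch=20):
    """sites_X: (n_sites, n_env_vars) predictors for all candidate sites.
    X_obs, y_obs: initial observations. fit_sdm(X, y) returns a fitted model
    with predict_proba; survey(i) returns 1 if the species is found at site i.
    """
    X_obs, y_obs = list(X_obs), list(y_obs)
    visited = set()
    for _ in range(n_rounds):
        model = fit_sdm(np.array(X_obs), np.array(y_obs))  # refit on all data
        suitability = model.predict_proba(sites_X)[:, 1]   # predicted habitat suitability
        ranked = np.argsort(-suitability)                  # most suitable sites first
        targets = [i for i in ranked if i not in visited][:batch]
        for i in targets:                                  # stratified field visit
            visited.add(i)
            X_obs.append(sites_X[i])
            y_obs.append(survey(i))                        # new occurrence data
    return visited
```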
Abstract:
BACKGROUND: Active screening by mobile teams is considered the best method for detecting human African trypanosomiasis (HAT) caused by Trypanosoma brucei gambiense, but the current funding context in many post-conflict countries limits this approach. As an alternative, non-specialist health care workers (HCWs) in peripheral health facilities could be trained to identify potential cases who need testing based on their symptoms. We explored the predictive value of syndromic referral algorithms to identify symptomatic cases of HAT among a treatment-seeking population in Nimule, South Sudan. METHODOLOGY/PRINCIPAL FINDINGS: Symptom data from 462 patients (27 cases) presenting for a HAT test via passive screening over a 7-month period were collected to construct and evaluate over 14,000 four-item syndromic algorithms considered simple enough to be used by peripheral HCWs. For comparison, algorithms developed in other settings were also tested on our data, and a panel of expert HAT clinicians was asked to make referral decisions based on the symptom dataset. The best-performing algorithms consisted of three core symptoms (sleep problems, neurological problems and weight loss), with or without a history of oedema, cervical adenopathy or proximity to livestock. They had a sensitivity of 88.9-92.6%, a negative predictive value of up to 98.8% and a positive predictive value in this context of 8.4-8.7%. In terms of sensitivity, these outperformed more complex algorithms identified in other studies, as well as the expert panel. The best-performing algorithm is predicted to identify about 9/10 treatment-seeking HAT cases, though only 1/10 patients referred would test positive. CONCLUSIONS/SIGNIFICANCE: In the absence of regular active screening, improving referrals of HAT patients through other means is essential. Systematic use of syndromic algorithms by peripheral HCWs has the potential to increase case detection and would increase their participation in HAT programmes. The algorithms proposed here, though promising, should be validated elsewhere.
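One way to picture the construction step is an exhaustive enumeration of small symptom subsets, each scored for sensitivity, PPV and NPV against the labelled referral data. The sketch below assumes a "refer if any item is present" rule, which may differ from the paper's exact rule structure; column names and data layout are ours:

```python
from itertools import combinations
import numpy as np

def score_algorithms(symptoms, is_case, n_items=4):
    """symptoms: (n_patients, n_symptoms) boolean array; is_case: boolean array.
    Enumerates every n_items-symptom subset and scores the referral rule
    'refer if any of these symptoms is present'.
    """
    symptoms = np.asarray(symptoms, dtype=bool)
    is_case = np.asarray(is_case, dtype=bool)
    results = []
    for items in combinations(range(symptoms.shape[1]), n_items):
        referred = symptoms[:, items].any(axis=1)   # refer if any item present
        tp = (referred & is_case).sum()
        fn = (~referred & is_case).sum()
        tn = (~referred & ~is_case).sum()
        sens = tp / (tp + fn)
        ppv = tp / max(referred.sum(), 1)
        npv = tn / max((~referred).sum(), 1)
        results.append((items, sens, ppv, npv))
    return sorted(results, key=lambda r: -r[1])     # rank by sensitivity
```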
Abstract:
quantiNemo is an individual-based, genetically explicit stochastic simulation program. It was developed to investigate the effects of selection, mutation, recombination and drift on quantitative traits with varying architectures in structured populations connected by migration and located in a heterogeneous habitat. quantiNemo is highly flexible at various levels: population, selection, trait(s) architecture, genetic map for QTL and/or markers, environment, demography, mating system, etc. quantiNemo is coded in C++ using an object-oriented approach and runs on any computer platform. Availability: Executables for several platforms, user's manual, and source code are freely available under the GNU General Public License at http://www2.unil.ch/popgen/softwares/quantinemo.
Abstract:
Impressive developments in X-ray imaging are associated with X-ray phase-contrast computed tomography based on grating interferometry, a technique that provides increased contrast compared with conventional absorption-based imaging. A new "single-step" method capable of separating phase information from other contributions has recently been proposed. This approach not only simplifies data-acquisition procedures but, compared with the existing phase-stepping approach, significantly reduces the dose delivered to a sample. However, the image reconstruction procedure is more demanding than for traditional methods and new algorithms have to be developed to take advantage of the "single-step" method. In the work discussed in this paper, a fast iterative image reconstruction method named OSEM (ordered subsets expectation maximization) was applied to experimental data to evaluate its performance and range of applicability. The OSEM algorithm with different numbers of subsets was also characterized by comparing reconstruction image quality and convergence speed. Computer simulations and experimental results confirm the reliability of this new algorithm for phase-contrast computed tomography applications. Compared with the traditional filtered back projection algorithm, in particular in the presence of a noisy acquisition, it furnishes better images at a higher spatial resolution and with lower noise. We emphasize that the method is highly compatible with future clinical applications of X-ray phase-contrast imaging.
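For readers unfamiliar with OSEM: the algorithm family applies the multiplicative EM update once per ordered subset of the projections rather than once over all projections, which is where the convergence speed-up comes from. A generic matrix-based sketch of that family (schematic, not the paper's implementation):

```python
import numpy as np

def osem(A, y, n_subsets=4, n_iters=10, x0=None):
    """Generic OSEM sketch: A is the (n_meas, n_vox) system matrix, y the
    measured projections. Rows are split into ordered subsets and the
    multiplicative EM update is applied once per subset per iteration.
    """
    x = np.ones(A.shape[1]) if x0 is None else x0.copy()
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iters):
        for s in subsets:
            As = A[s]
            ratio = y[s] / np.maximum(As @ x, 1e-12)   # measured / predicted
            # Multiplicative update, normalized by the subset sensitivity A_s^T 1
            x *= (As.T @ ratio) / np.maximum(As.T @ np.ones(len(s)), 1e-12)
    return x
```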
Abstract:
PURPOSE: To improve the traditional Nyquist ghost correction approach in echo planar imaging (EPI) at high fields, via schemes based on the reversal of the EPI readout gradient polarity for every other volume throughout a functional magnetic resonance imaging (fMRI) acquisition train. MATERIALS AND METHODS: An EPI sequence in which the readout gradient was inverted every other volume was implemented on two ultrahigh-field systems. Phantom images and fMRI data were acquired to evaluate ghost intensities and the presence of false-positive blood oxygenation level-dependent (BOLD) signal with and without ghost correction. Three different algorithms for ghost correction of alternating readout EPI were compared. RESULTS: Irrespective of the chosen processing approach, ghosting was significantly reduced (up to 70% lower intensity) in both rat brain images acquired on a 9.4T animal scanner and human brain images acquired at 7T, resulting in a reduction of sources of false-positive activation in fMRI data. CONCLUSION: It is concluded that at high B₀ fields, substantial gains in Nyquist ghost correction of echo planar time series are possible by alternating the readout gradient every other volume.
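A conceptual illustration of why alternating the readout polarity helps: in the complex image, the N/2 ghost term inverts sign when the polarity reverses while the parent image does not, so pairwise complex averaging of adjacent volumes attenuates the ghost. This idealized sketch ignores motion and BOLD signal change between volumes and is not necessarily one of the three algorithms compared in the paper:

```python
import numpy as np

def pairwise_complex_average(volumes):
    """volumes: complex array of shape (n_volumes, *image_shape), acquired with
    readout polarity alternating every other volume. Averaging each +/- polarity
    pair adds the parent image coherently while the sign-flipped ghost cancels
    (idealized: assumes a static object across each volume pair)."""
    v = np.asarray(volumes)
    even, odd = v[0::2], v[1::2]
    n_pairs = min(len(even), len(odd))
    return 0.5 * (even[:n_pairs] + odd[:n_pairs])
```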