916 results for Topology-based methods
Abstract:
Proceedings of the Information Technology Applications in Biomedicine, Ioannina - Epirus, Greece, October 26-28, 2006
Abstract:
13th International Conference on Autonomous Robot Systems (Robotica), 2013, Lisboa
Abstract:
DNA amplification techniques are being used increasingly in clinical laboratories to confirm the identity of medically important bacteria. A PCR-based identification method has been in use in our centre for 10 years for Burkholderia pseudomallei and was used to confirm the identity of bacteria isolated from cases of melioidosis in Ceará since 2003. This particular method has been used as a reference standard for less discriminatory methods. In this study we evaluated three PCR-based methods of B. pseudomallei identification and used DNA sequencing to resolve discrepancies between PCR-based results and phenotypic identification methods. The established semi-nested PCR protocol for the B. pseudomallei 16S-23S spacer region produced a consistent negative result for one of our 100 test isolates (BCC #99), but correctly identified all 71 other B. pseudomallei isolates tested. Anomalous sequence variation was detected at the inner, reverse primer binding site for this method. PCR methods were developed for detection of two other B. pseudomallei bacterial metabolic genes. The conventional lpxO PCR protocol had a sensitivity of 0.89 and a specificity of 1.00, while a real-time lpxO protocol performed even better, with a sensitivity and specificity of 1.00 each. This method identified all B. pseudomallei isolates, including the PCR-negative discrepant isolate. The phaC PCR protocol detected the gene in all B. pseudomallei and all but three B. cepacia isolates, making this method unsuitable for PCR-based identification of B. pseudomallei. This experience with PCR-based B. pseudomallei identification methods indicates that single PCR targets should be used with caution for identification of these bacteria, and results need to be interpreted alongside phenotypic and alternative molecular methods such as gene sequencing.
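The false negative above traces to sequence variation at the inner reverse primer binding site. A toy illustration of why a single base change can defeat an exact-match primer check (the sequences and primer below are invented for demonstration, not the study's actual primers):

```python
# Toy illustration: one base substitution at a primer binding site
# defeats an idealized exact-match "binding" test on either strand.
# The primer and template sequences are invented, not real primers.

def reverse_complement(seq):
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def primer_binds(template, primer):
    """True if the primer matches the template exactly on either
    strand -- the idealized requirement for primer extension."""
    return primer in template or reverse_complement(primer) in template

primer = "GATCGTACGT"
typical_isolate = "AATTGATCGTACGTCCGGA"   # contains the primer site
variant_isolate = "AATTGATCGAACGTCCGGA"   # single substitution (T->A)

print(primer_binds(typical_isolate, primer))  # True
print(primer_binds(variant_isolate, primer))  # False
```

Real primer annealing tolerates some mismatches away from the 3' end, which is why the abstract's discrepant isolate was best resolved by sequencing rather than by the PCR result alone.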
Abstract:
This project proposes an approach for supporting indoor navigation systems using Pedestrian Dead Reckoning-based methods and by analyzing motion sensor data available in most modern smartphones. The processes suggested in this investigation are able to calculate the distance traveled by a user while he or she is walking. WLAN fingerprint-based navigation systems can benefit from the processes followed in this research and the results achieved, reducing their workload and improving their positioning estimates.
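A minimal sketch of the pedestrian-dead-reckoning idea described above: detect steps as upward threshold crossings of the accelerometer magnitude, then multiply the step count by a stride length. The threshold, stride length, and synthetic trace are illustrative assumptions, not values from the project:

```python
# Minimal PDR sketch: count steps from 3-axis accelerometer samples
# by threshold crossing of the magnitude, then estimate distance
# with a fixed stride length. All numbers here are invented.
import math

def count_steps(samples, threshold=11.0):
    """Count upward crossings of the acceleration-magnitude
    threshold; each crossing is treated as one step."""
    steps, above = 0, False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1
            above = True
        elif mag <= threshold:
            above = False
    return steps

def distance_walked(samples, stride_m=0.7):
    return count_steps(samples) * stride_m

# Synthetic trace: gravity (~9.81 m/s^2) with 4 spikes from heel strikes.
quiet = [(0.0, 0.0, 9.81)] * 5
spike = [(0.0, 0.0, 13.0)]
trace = (quiet + spike) * 4 + quiet
n_steps = count_steps(trace)
dist = distance_walked(trace)
print(n_steps, dist)   # 4 2.8
```

Production systems refine this with per-user stride models and sensor fusion, but the step-detection plus stride-length core is the same.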
Abstract:
Introduction Toxoplasmosis may be life-threatening in fetuses and in immune-deficient patients. Conventional laboratory diagnosis of toxoplasmosis is based on the presence of IgM and IgG anti-Toxoplasma gondii antibodies; however, molecular techniques have emerged as alternative tools due to their increased sensitivity. The aim of this study was to compare the performance of 4 PCR-based methods for the laboratory diagnosis of toxoplasmosis. One hundred pregnant women who seroconverted during pregnancy were included in the study. The definition of cases was based on a 12-month follow-up of the infants. Methods Amniotic fluid samples were submitted to DNA extraction and amplification by the following 4 Toxoplasma techniques performed with parasite B1 gene primers: conventional PCR, nested-PCR, multiplex-nested-PCR, and real-time PCR. Seven parameters were analyzed: sensitivity (Se), specificity (Sp), positive predictive value (PPV), negative predictive value (NPV), positive likelihood ratio (PLR), negative likelihood ratio (NLR) and efficiency (Ef). Results Fifty-nine of the 100 infants had toxoplasmosis; 42 (71.2%) had IgM antibodies at birth but were asymptomatic, and the remaining 17 cases had non-detectable IgM antibodies but high IgG antibody titers that were associated with retinochoroiditis in 8 (13.5%) cases, abnormal cranial ultrasound in 5 (8.5%) cases, and signs/symptoms suggestive of infection in 4 (6.8%) cases. The conventional PCR assay detected 50 cases (9 false-negatives), nested-PCR detected 58 cases (1 false-negative and 4 false-positives), multiplex-nested-PCR detected 57 cases (2 false-negatives), and real-time PCR detected 58 cases (1 false-negative). Conclusions The real-time PCR assay was the best-performing technique based on the parameters of Se (98.3%), Sp (100%), PPV (100%), NPV (97.6%), PLR (∞), NLR (0.017), and Ef (99%).
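The seven performance parameters above follow directly from a 2x2 confusion matrix. The counts below reproduce the real-time PCR arm of the study: 59 infected infants, 41 uninfected, 58 true positives, 1 false negative, and no false positives:

```python
# Diagnostic-performance parameters from a 2x2 confusion matrix,
# using the real-time PCR counts reported in the abstract.

def diagnostic_metrics(tp, fp, fn, tn):
    se = tp / (tp + fn)                     # sensitivity
    sp = tn / (tn + fp)                     # specificity
    ppv = tp / (tp + fp)                    # positive predictive value
    npv = tn / (tn + fn)                    # negative predictive value
    plr = se / (1 - sp) if sp < 1 else float("inf")  # positive LR
    nlr = (1 - se) / sp                     # negative LR
    ef = (tp + tn) / (tp + fp + fn + tn)    # efficiency (accuracy)
    return se, sp, ppv, npv, plr, nlr, ef

se, sp, ppv, npv, plr, nlr, ef = diagnostic_metrics(tp=58, fp=0, fn=1, tn=41)
print(round(se, 3), sp, ppv, round(npv, 3), plr, round(nlr, 3), ef)
# 0.983 1.0 1.0 0.976 inf 0.017 0.99
```

The output matches the abstract's reported values, including the infinite positive likelihood ratio that results from a specificity of 100%.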
Abstract:
Anti-basal ganglia antibodies (ABGAs) have been suggested to be a hallmark of autoimmunity in Gilles de la Tourette's syndrome (GTS), possibly related to prior exposure to streptococcal infection. In order to detect whether the presence of ABGAs was associated with subtle structural changes in GTS, whole-brain analyses using independent sets of T1-weighted and diffusion tensor imaging MRI-based methods were performed on 22 adults with GTS with (n = 9) and without (n = 13) detectable ABGAs in the serum. Voxel-based morphometry analysis failed to detect any significant difference in grey matter density between ABGA-positive and ABGA-negative groups in the caudate nuclei, putamina, thalami and frontal lobes. These results suggest that ABGA synthesis is not related to structural changes in grey and white matter (detectable with these methods) within frontostriatal circuits.
Abstract:
To assess the preferred methods to quit smoking among current smokers. Cross-sectional, population-based study conducted in Lausanne between 2003 and 2006 including 988 current smokers. Preference was assessed by questionnaire. Evidence-based (EB) methods were nicotine replacement, bupropion, and physician or group consultations; non-EB methods were acupuncture, hypnosis and autogenic training. EB methods were frequently (physician consultation: 48%, 95% confidence interval (45-51); nicotine replacement therapy: 35% (32-38)) or rarely (bupropion and group consultations: 13% (11-15)) preferred by the participants. Non-EB methods were preferred by a third (acupuncture: 33% (30-36)), a quarter (hypnosis: 26% (23-29)) or a seventh (autogenic training: 13% (11-15)) of responders. On multivariate analysis, women preferred both EB and non-EB methods more frequently than men (odds ratio and 95% confidence interval: 1.46 (1.10-1.93) and 2.26 (1.72-2.96) for any EB and non-EB method, respectively). Preference for non-EB methods was higher among highly educated participants, while no such relationship was found for EB methods. Many smokers are unaware of the full variety of methods to quit smoking. Better information regarding these methods is necessary.
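The abstract reports odds ratios with 95% confidence intervals. A standard way to compute one from a 2x2 table is the Woolf (log) method sketched below; the counts are invented for illustration, since the paper's ORs come from a multivariate model rather than a raw table:

```python
# Woolf (log) method for an odds ratio with a 95% confidence
# interval. The 2x2 counts below are invented for illustration.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: exposed cases/non-cases; c/d: unexposed cases/non-cases."""
    or_ = (a / b) / (c / d)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 60, 25, 75)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A confidence interval excluding 1.0, as in the abstract's 1.46 (1.10-1.93), indicates a statistically significant association at the 5% level.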
Abstract:
BACKGROUND Most textbooks contain messages relating to health. This profuse information requires analysis with regard to its quality. The objective was to identify the scientific evidence on which the health messages in textbooks are based. METHODS The degree of evidence on which such messages are based was identified and the messages were subsequently classified into three categories: messages with high, medium or low levels of evidence; messages with an unknown level of evidence; and messages with no known evidence. RESULTS 844 messages were studied. Of this total, 61% were classified as messages with an unknown level of evidence. Less than 15% fell into the category where the level of evidence was known, and less than 6% were classified as possessing high levels of evidence. More than 70% of the messages relating to "Balanced Diets and Malnutrition", "Food Hygiene", "Tobacco", "Sexual behaviour and AIDS" and "Rest and ergonomics" are based on an unknown level of evidence. "Oral health" registered the highest percentage of messages based on a high level of evidence (37.5%), followed by "Pregnancy and newly born infants" (35%). Of the total, 24.6% are not based on any known evidence. Two of the messages appeared to contravene known evidence. CONCLUSION Many of the messages included in school textbooks are not based on scientific evidence. Standards must be established to facilitate the production of texts that include messages that are based on the best available evidence and which can improve children's health more effectively.
Abstract:
High-resolution tomographic imaging of the shallow subsurface is becoming increasingly important for a wide range of environmental, hydrological and engineering applications. Because of their superior resolution power, their sensitivity to pertinent petrophysical parameters, and their far reaching complementarities, both seismic and georadar crosshole imaging are of particular importance. To date, corresponding approaches have largely relied on asymptotic, ray-based approaches, which only account for a very small part of the observed wavefields, inherently suffer from a limited resolution, and in complex environments may prove to be inadequate. These problems can potentially be alleviated through waveform inversion. We have developed an acoustic waveform inversion approach for crosshole seismic data whose kernel is based on a finite-difference time-domain (FDTD) solution of the 2-D acoustic wave equations. This algorithm is tested on and applied to synthetic data from seismic velocity models of increasing complexity and realism and the results are compared to those obtained using state-of-the-art ray-based traveltime tomography. Regardless of the heterogeneity of the underlying models, the waveform inversion approach has the potential of reliably resolving both the geometry and the acoustic properties of features of the size of less than half a dominant wavelength. Our results do, however, also indicate that, within their inherent resolution limits, ray-based approaches provide an effective and efficient means to obtain satisfactory tomographic reconstructions of the seismic velocity structure in the presence of mild to moderate heterogeneity and in absence of strong scattering. Conversely, the excess effort of waveform inversion provides the greatest benefits for the most heterogeneous, and arguably most realistic, environments where multiple scattering effects tend to be prevalent and ray-based methods lose most of their effectiveness.
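The waveform-inversion kernel above rests on an FDTD solution of the acoustic wave equations. A 1-D analogue of such a kernel (the study's is 2-D) can be sketched as follows; the grid, velocity, and source values are toy parameters, not the paper's:

```python
# Sketch of an FDTD kernel for the 1-D constant-density acoustic
# wave equation u_tt = c^2 u_xx, discretized with second-order
# central differences. Grid, velocity and source are toy values;
# the paper's kernel solves the 2-D acoustic wave equations.

def fdtd_1d(c, nx, nt, dx, dt, src_pos, src):
    """March the scalar wavefield forward in time.
    Stability requires the CFL condition c*dt/dx <= 1."""
    assert c * dt / dx <= 1.0, "CFL condition violated"
    u_prev = [0.0] * nx        # wavefield at time step n-1
    u_curr = [0.0] * nx        # wavefield at time step n
    r2 = (c * dt / dx) ** 2
    for n in range(nt):
        u_next = [0.0] * nx
        for i in range(1, nx - 1):
            # symmetric second-difference stencil in space
            u_next[i] = (2 * u_curr[i] - u_prev[i]
                         + r2 * (u_curr[i - 1] + u_curr[i + 1]
                                 - 2 * u_curr[i]))
        u_next[src_pos] += dt * dt * src(n * dt)   # injected source term
        u_prev, u_curr = u_curr, u_next
    return u_curr

# One-sample impulse source at the model centre; the disturbance
# should spread symmetrically outward.
wave = fdtd_1d(c=1500.0, nx=101, nt=40, dx=10.0, dt=0.005,
               src_pos=50, src=lambda t: 1.0 if t == 0.0 else 0.0)
print(wave[45] == wave[55])   # True: symmetric propagation
```

The CFL ratio here is 0.75; a full inversion wraps such a forward solver in an optimization loop that updates the velocity model to reduce the data misfit.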
Abstract:
This paper shows how recently developed regression-based methods for the decomposition of health inequality can be extended to incorporate individual heterogeneity in the responses of health to the explanatory variables. We illustrate our method with an application to the Canadian NPHS of 1994. Our strategy for the estimation of heterogeneous responses is based on the quantile regression model. The results suggest that there is an important degree of heterogeneity in the association of health to explanatory variables which, in turn, accounts for a substantial percentage of inequality in observed health. A particularly interesting finding is that the marginal response of health to income is zero for healthy individuals but positive and significant for unhealthy individuals. The heterogeneity in the income response reduces both overall health inequality and income-related health inequality.
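The quantile regression model underlying this estimation strategy minimizes the "check" (pinball) loss; at quantile tau, the constant predictor minimizing the empirical loss is the tau-th sample quantile, which is how different parts of the health distribution get their own response estimates. A minimal sketch with invented data:

```python
# The quantile-regression building block: the "check" (pinball)
# loss rho_tau. Minimizing its empirical average over a constant
# predictor recovers the tau-th sample quantile. Data are invented.

def pinball(u, tau):
    return tau * u if u >= 0 else (tau - 1) * u

def empirical_loss(data, q, tau):
    return sum(pinball(y - q, tau) for y in data) / len(data)

data = [2.0, 4.0, 5.0, 7.0, 9.0, 12.0, 20.0]
# Scan candidate constant predictors drawn from the data itself.
best_median = min(data, key=lambda q: empirical_loss(data, q, 0.50))
best_upper = min(data, key=lambda q: empirical_loss(data, q, 0.75))
print(best_median, best_upper)   # 7.0 12.0
```

Replacing the constant predictor with a linear function of covariates gives quantile regression proper, with a different coefficient vector at each tau, which is what allows income effects to differ between healthy and unhealthy individuals.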
Abstract:
Due to the advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and real-time monitoring and short-term forecasting of weather.

In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography.

With the advent of high-resolution digital elevation models, the field of spatial prediction met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes.

Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, making the use of classical geostatistics tangled.

The challenges explored during the thesis are manifold. First, the complexity of models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions.

The resulting maps of average wind speeds find applications within renewable resources assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
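A toy kernel-based spatial predictor in the spirit of this framework: Gaussian-kernel (Nadaraya-Watson) regression over station coordinates plus a terrain feature. This is a simpler relative of the support vector regression used in the thesis, and the stations, wind speeds, and bandwidth below are invented:

```python
# Toy kernel-based spatial prediction: Gaussian-kernel regression
# over (easting, northing, elevation). A simpler relative of SVR;
# all stations, wind speeds and the bandwidth are invented.
import math

def rbf(x, y, bandwidth):
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2 * bandwidth ** 2))

def predict(query, stations, values, bandwidth=1.0):
    """Nadaraya-Watson estimate: kernel-weighted mean of station values."""
    w = [rbf(query, s, bandwidth) for s in stations]
    return sum(wi * v for wi, v in zip(w, values)) / sum(w)

# (easting_km, northing_km, elevation_km) -> mean wind speed (m/s)
stations = [(0.0, 0.0, 0.4), (1.0, 0.0, 0.5), (0.0, 1.0, 1.2), (1.0, 1.0, 2.0)]
wind = [3.0, 3.5, 6.0, 9.0]
pred = predict((0.5, 0.5, 1.0), stations, wind)
print(round(pred, 2))
```

Adding elevation to the feature space is what lets the prediction vary with terrain rather than with horizontal position alone, which is the thesis's central point about multidimensional regionalization.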
Abstract:
A nonlocal variational formulation for interpolating a sparsely sampled image is introduced in this paper. The proposed variational formulation, originally motivated by image inpainting problems, encourages the transfer of information between similar image patches, following the paradigm of exemplar-based methods. Contrary to the classical inpainting problem, no complete patches are available from the sparse image samples, and the patch similarity criterion has to be redefined as proposed here. Initial experimental results with the proposed framework, at very low sampling densities, are very encouraging. We also explore some departures from the variational setting, showing a remarkable ability to recover textures at low sampling densities.
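The redefined patch-similarity criterion can be shown in miniature: when patches are only partially observed, compare them on their jointly known pixels and normalize by the overlap count. The patches and the None-means-missing convention below are illustrative assumptions, not the paper's exact formulation:

```python
# Patch similarity for partially observed patches: mean squared
# difference over pixels known in BOTH patches. The patches and
# the None-as-missing convention are illustrative.

def masked_patch_distance(p, q):
    """MSE over the jointly known pixels; None when the patches
    share no known pixels (no meaningful comparison possible)."""
    diffs = [(a - b) ** 2 for a, b in zip(p, q)
             if a is not None and b is not None]
    return sum(diffs) / len(diffs) if diffs else None

patch_a = [0.2, None, 0.4, 0.5, None, 0.7]
patch_b = [0.2, 0.3, None, 0.6, 0.5, 0.8]
d = masked_patch_distance(patch_a, patch_b)
print(d)
```

At very low sampling densities the overlap can be tiny or empty, which is precisely why the similarity criterion has to be redefined rather than reused from complete-patch inpainting.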
Abstract:
Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects, as well as strategies for atlas construction. In this paper we present a review of the automated approaches for atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and suggest new research directions. We use two different criteria to present the methods. First, we refer to the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.
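The label-propagation strategy can be sketched in one dimension: once registration is expressed as a coordinate mapping, each target voxel pulls the atlas label at its mapped position with nearest-neighbour interpolation (the usual choice for categorical labels). The 1-D "images", labels, and affine mapping below are toy stand-ins for a full 3-D registration:

```python
# 1-D sketch of label propagation in atlas-based segmentation:
# after registration gives a target->atlas coordinate mapping,
# labels are pulled with nearest-neighbour interpolation.
# The atlas, labels and mapping are toy stand-ins for 3-D data.

def propagate_labels(atlas_labels, mapping, n_target):
    """mapping: target coordinate -> continuous atlas coordinate."""
    labels = []
    for i in range(n_target):
        j = round(mapping(i))                        # nearest neighbour
        j = min(max(j, 0), len(atlas_labels) - 1)    # clamp to atlas
        labels.append(atlas_labels[j])
    return labels

# Atlas labels: background (0), tissue (1), ventricle (2), tissue, background.
atlas = [0, 0, 1, 1, 2, 2, 1, 1, 0, 0]
# Toy affine registration: the target is a 2x downsampled atlas.
result = propagate_labels(atlas, lambda i: 2.0 * i, n_target=5)
print(result)   # [0, 1, 2, 1, 0]
```

Multi-atlas methods run this propagation from several atlases and fuse the candidate labels per voxel, e.g. by majority vote, which is one of the strategies the review surveys.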
Abstract:
The estimation of muscle forces in musculoskeletal shoulder models is still controversial. Two different methods are widely used to solve the indeterminacy of the system: electromyography (EMG)-based methods and stress-based methods. The goal of this work was to evaluate the influence of these two methods on the prediction of muscle forces, glenohumeral load and joint stability after total shoulder arthroplasty. An EMG-based and a stress-based method were implemented into the same musculoskeletal shoulder model. The model replicated the glenohumeral joint after total shoulder arthroplasty. It contained the scapula, the humerus, the joint prosthesis, the rotator cuff muscles supraspinatus, subscapularis and infraspinatus, and the middle, anterior and posterior deltoid muscles. A movement of abduction was simulated in the plane of the scapula. The EMG-based method replicated muscular activity of experimentally measured EMG. The stress-based method minimised a cost function based on muscle stresses. We compared muscle forces, joint reaction force, articular contact pressure and translation of the humeral head. The stress-based method predicted a lower force of the rotator cuff muscles. This was partly counter-balanced by a higher force of the middle part of the deltoid muscle. As a consequence, the stress-based method predicted a lower joint load (16% reduced) and a higher superior-inferior translation of the humeral head (increased by 1.2 mm). The EMG-based method has the advantage of replicating the observed co-contraction of stabilising muscles of the rotator cuff. This method is, however, limited to available EMG measurements. The stress-based method thus has the advantage of flexibility, but may overestimate glenohumeral subluxation.
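A minimal version of the stress-based approach: minimize the sum of squared muscle stresses subject to a single joint-moment equilibrium constraint. With only the equality constraint, Lagrange multipliers give a closed form; the moment arms, PCSAs, and target moment below are invented, and the non-negativity constraints used in real models are omitted:

```python
# Minimal stress-based load sharing: minimize sum((F_i/PCSA_i)^2)
# subject to sum(r_i * F_i) = M. With only the equality constraint,
# Lagrange multipliers give F_i = M*r_i*PCSA_i^2 / sum(r_j^2*PCSA_j^2).
# Moment arms, PCSAs and the target moment are invented values.

def stress_based_forces(moment_arms, pcsa, target_moment):
    denom = sum(r * r * s * s for r, s in zip(moment_arms, pcsa))
    return [target_moment * r * s * s / denom
            for r, s in zip(moment_arms, pcsa)]

# Three hypothetical abductors: moment arm (m), PCSA (cm^2).
arms = [0.02, 0.03, 0.01]
pcsa = [6.0, 14.0, 4.0]
forces = stress_based_forces(arms, pcsa, target_moment=20.0)  # N*m target
print([round(f, 1) for f in forces])
# Equilibrium check: the forces reproduce the target moment.
print(round(sum(r * f for r, f in zip(arms, forces)), 6))   # 20.0
```

Note how the muscle with the largest PCSA and moment arm carries most of the load while the small muscles are nearly unloaded, consistent with the abstract's observation that the stress-based method shifts force toward the deltoid and away from the rotator cuff.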
Abstract:
Blood culture remains the best approach to identify the incriminating microorganisms when a bloodstream infection is suspected, and to guarantee that the antimicrobial treatment is adequate. Major improvements have been made in recent years to increase the sensitivity and specificity and to reduce the time to identification of microorganisms recovered from blood cultures. Among other factors, the introduction of matrix-assisted laser desorption ionization time-of-flight mass spectrometry into clinical microbiology laboratories revolutionized the identification of microorganisms, whereas the introduction of nucleic-acid-based methods, such as DNA hybridization or rapid PCR-based tests, significantly reduced the time to results. Together with traditional antimicrobial susceptibility testing, new rapid methods for the detection of resistance mechanisms respond to major epidemiological concerns such as methicillin-resistant Staphylococcus aureus, extended-spectrum β-lactamases or carbapenemases. This review presents and discusses recent developments in the microbial diagnosis of bloodstream infections based on blood cultures.