998 results for "Noninvasive methods"


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: In May 2010, Switzerland introduced a heterogeneous smoking ban in the hospitality sector. While the law leaves room for exceptions in some cantons, it is comprehensive in others. This longitudinal study uses different measurement methods to examine airborne nicotine levels in hospitality venues and the personal exposure of non-smoking hospitality workers before and after implementation of the law. METHODS: Personal exposure to second-hand smoke (SHS) was measured by three different methods: a passive sampler, the MoNIC (Monitor of NICotine) badge, was compared with salivary cotinine and nicotine concentrations as well as questionnaire data. Badges allowed the number of passively smoked cigarettes to be estimated. They were placed at the venues as well as distributed to the participants for personal measurements. To assess personal exposure at work, a time-weighted average of the workplace badge measurements was calculated. RESULTS: Prior to the ban, smoke-exposed hospitality venues yielded a mean badge value of 4.48 (95% CI: 3.7 to 5.25; n = 214) cigarette equivalents/day. At follow-up, measurements in venues that had implemented a smoking ban declined significantly, to an average of 0.31 (0.17 to 0.45; n = 37) (p = 0.001). Personal badge measurements also decreased significantly, from an average of 2.18 (1.31 to 3.05; n = 53) to 0.25 (0.13 to 0.36; n = 41) (p = 0.001). Spearman rank correlations between badge exposure measures and salivary measures were small to moderate (0.3 at maximum). CONCLUSIONS: Nicotine levels decreased significantly in all types of hospitality venues after implementation of the smoking ban. In-depth analyses demonstrated that a time-weighted average of the workplace badge measurements represented typical personal SHS exposure at work more reliably than personal exposure measures such as salivary cotinine and nicotine.
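The time-weighted workplace average described above can be sketched as follows; the badge readings and shift durations below are hypothetical, purely to illustrate the computation:

```python
def time_weighted_average(measurements):
    """Time-weighted average exposure: each (value, hours) pair is a badge
    reading weighted by the hours the worker spent at that venue."""
    total_hours = sum(h for _, h in measurements)
    return sum(v * h for v, h in measurements) / total_hours

# Hypothetical shift: 6 h in a smoking-permitted bar, 2 h in a smoke-free kitchen
shifts = [(4.5, 6.0), (0.3, 2.0)]   # cigarette equivalents/day per venue
print(round(time_weighted_average(shifts), 2))  # 3.45 cigarette equivalents/day
```

The weighting matters: a plain mean of the two readings (2.4) would understate the exposure of a worker who spends most of the shift in the smoky venue.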


BACKGROUND: Chronic mountain sickness (CMS) is an important public health problem and is characterized by exaggerated hypoxemia, erythrocytosis, and pulmonary hypertension. While pulmonary hypertension is a leading cause of morbidity and mortality in patients with CMS, it is relatively mild and its underlying mechanisms are not known. We speculated that during mild exercise associated with daily activities, pulmonary hypertension in CMS is much more pronounced. METHODS: We estimated pulmonary artery pressure by using echocardiography at rest and during mild bicycle exercise at 50 W in 30 male patients with CMS and 32 age-matched, healthy control subjects who were born and living at an altitude of 3,600 m. RESULTS: The modest, albeit significant difference of the systolic right-ventricular-to-right-atrial pressure gradient between patients with CMS and controls at rest (30.3 +/- 8.0 vs 25.4 +/- 4.5 mm Hg, P = .002) became more than three times larger during mild bicycle exercise (56.4 +/- 19.0 vs 39.8 +/- 8.0 mm Hg, P < .001). CONCLUSIONS: Measurements of pulmonary artery pressure at rest greatly underestimate pulmonary artery pressure during daily activity in patients with CMS. The marked pulmonary hypertension during mild exercise associated with daily activity may explain why this problem is a leading cause of morbidity and mortality in patients with CMS.
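The abstract does not state how the gradients were derived, but in standard Doppler echocardiography the right-ventricular-to-right-atrial pressure gradient is obtained from the peak tricuspid regurgitant jet velocity via the simplified Bernoulli equation, ΔP = 4v². A minimal sketch assuming that convention (the velocities are back-calculated, not reported values):

```python
import math

def rv_ra_gradient(trv_m_per_s):
    """Simplified Bernoulli equation: gradient (mm Hg) = 4 * v^2,
    with v the peak tricuspid regurgitant jet velocity in m/s."""
    return 4.0 * trv_m_per_s ** 2

def velocity_for_gradient(delta_p_mmhg):
    """Inverse: jet velocity implied by a given gradient."""
    return math.sqrt(delta_p_mmhg / 4.0)

# The ~30 mm Hg resting gradient implies a jet of ~2.75 m/s,
# while the ~56 mm Hg exercise gradient implies ~3.75 m/s.
print(round(velocity_for_gradient(30.3), 2))  # 2.75
print(round(velocity_for_gradient(56.4), 2))  # 3.75
```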


Advanced kernel methods for remote sensing image classification. Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009. Abstract: The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, as enormous archives of satellite images are available to users. However, even if these advances open more and more possibilities in the use of digital imagery, they also raise several problems of storage and processing. The latter is considered in this Thesis: the processing of very high spatial and spectral resolution images is treated with approaches based on data-driven algorithms relying on kernel methods. In particular, the problem of image classification, i.e. the categorization of the image's pixels into a reduced number of classes reflecting spectral and contextual properties, is studied through the different models presented. The accent is put on algorithmic efficiency and on the simplicity of the proposed approaches, so as to avoid overly complex models that practitioners would not use. The major challenge of the Thesis is to remain close to concrete remote sensing problems without losing the methodological interest from the machine learning viewpoint: in this sense, this work aims at building a bridge between the machine learning and remote sensing communities, and all the models proposed have been developed keeping in mind the need for such a synergy. Four models are proposed. First, an adaptive model learning the relevant image features is proposed to solve the problem of high dimensionality and collinearity of the image features; it automatically provides an accurate classifier together with a ranking of the relevance of the single features (the spectral bands). The scarcity and unreliability of labeled information are the common root of the second and third models: when confronted with such problems, the user can either construct the labeled set iteratively by direct interaction with the machine, or use the unlabeled data to increase the robustness and quality of the data description. Both solutions have been explored, resulting in two methodological contributions based respectively on active learning and semi-supervised learning. Finally, the more theoretical issue of structured outputs, so far never considered in remote sensing, is addressed in the last model, which, by integrating output similarity into the model, opens new challenges and opportunities for remote sensing image processing.
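A minimal sketch of the idea behind the first model, joint training of a classifier and a feature ranking: here a toy linear classifier is fitted by stochastic gradient descent and the magnitudes of its learned weights serve as the relevance ranking of the features (bands). The data and trainer are illustrative stand-ins, not the thesis's actual algorithm:

```python
import math
import random

def train_linear_sgd(data, labels, n_features, epochs=50, lr=0.1, seed=0):
    """Toy logistic-regression SGD trainer; the learned |weights|
    double as a relevance ranking of the input features."""
    rng = random.Random(seed)
    w = [0.0] * n_features
    idx = list(range(len(data)))
    for _ in range(epochs):
        rng.shuffle(idx)
        for i in idx:
            z = sum(wj * xj for wj, xj in zip(w, data[i]))
            p = 1.0 / (1.0 + math.exp(-z))     # sigmoid prediction
            g = p - labels[i]                   # gradient of the log-loss
            w = [wj - lr * g * xj for wj, xj in zip(w, data[i])]
    return w

random.seed(0)
# Feature 0 separates the two classes; feature 1 is pure noise.
X = [(1.0 + random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(20)] + \
    [(-1.0 + random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(20)]
y = [1] * 20 + [0] * 20
weights = train_linear_sgd(X, y, n_features=2)
ranking = sorted(range(2), key=lambda j: -abs(weights[j]))
print(ranking)  # the informative band ranks first
```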


In comparison with other micronutrients, the levels of nickel (Ni) available in soils and plant tissues are very low, making quantification very difficult. The objective of this paper is to present optimized methods for determining available Ni in soils by extractants and total Ni content in plant tissues, suitable for routine commercial laboratory analyses. Samples of natural and agricultural soils were processed and analyzed by Mehlich-1 extraction and by DTPA. To quantify Ni in the plant tissues, samples were digested with nitric acid in a closed system in a microwave oven. The measurement was performed by inductively coupled plasma/optical emission spectrometry (ICP-OES). There was a positive and significant correlation between the levels of available Ni in the soils subjected to Mehlich-1 and DTPA extraction, while for plant tissue samples the Ni levels recovered were high and similar to the reference materials. The availability of Ni in some of the natural soil and plant tissue samples was lower than the limits of quantification. Concentrations of this micronutrient were higher in the soil samples in which Ni had been applied. Nickel concentration differed among the plant parts analyzed, with the highest levels in soybean grains. Grain concentrations, in comparison with shoot and leaf concentrations, were better correlated with the soil available levels for both extractants. The methods described in this article were efficient in quantifying Ni and can be used for routine laboratory analysis of soils and plant tissues.
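The correlation between the two extractants can be sketched with a rank correlation, since available-Ni levels are typically skewed; the Spearman implementation below is a generic one (no tie handling) and the soil values are hypothetical:

```python
def spearman_rho(x, y):
    """Spearman rank correlation for untied data:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical available-Ni values (mg/dm3) for five soils
mehlich1 = [0.12, 0.45, 0.88, 1.90, 3.10]
dtpa     = [0.10, 0.40, 0.95, 1.70, 2.80]
print(spearman_rho(mehlich1, dtpa))  # 1.0: both extractants rank soils identically
```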


Debris accumulation on bridge piers is an ongoing national problem that can obstruct the waterway openings at bridges and result in significant erosion of stream banks and scour at abutments and piers. In some cases, the accumulation of debris can adversely affect the operation of the waterway opening or cause failure of the structure. In addition, removal of debris accumulation is difficult, time consuming, and expensive for maintenance programs. This research involves a literature search of publications, products, and pier design recommendations that provide cost-effective methods to mitigate debris accumulation at bridges. In addition, a nationwide survey was conducted to determine the state of the practice, and the results are presented within.



Traffic safety engineers are among the early adopters of Bayesian statistical tools for analyzing crash data. As in many other areas of application, empirical Bayes methods were their first choice, perhaps because they represent an intuitively appealing, yet relatively easy to implement alternative to purely classical approaches. With the enormous progress in numerical methods made in recent years and with the availability of free, easy to use software that permits implementing a fully Bayesian approach, however, there is now ample justification to progress towards fully Bayesian analyses of crash data. The fully Bayesian approach, in particular as implemented via multi-level hierarchical models, has many advantages over the empirical Bayes approach. In a full Bayesian analysis, prior information and all available data are seamlessly integrated into posterior distributions on which practitioners can base their inferences. All uncertainties are thus accounted for in the analyses and there is no need to pre-process data to obtain Safety Performance Functions and other such prior estimates of the effect of covariates on the outcome of interest. In this light, fully Bayesian methods may well be less costly to implement and may result in safety estimates with more realistic standard errors. In this manuscript, we present the full Bayesian approach to analyzing traffic safety data and focus on highlighting the differences between the empirical Bayes and the full Bayes approaches. We use an illustrative example to discuss a step-by-step Bayesian analysis of the data and to show some of the types of inferences that are possible within the full Bayesian framework.
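A minimal fully Bayesian analysis in the spirit described above, using a conjugate Poisson-Gamma model for crash counts at a single site so that the entire posterior distribution, not just a point estimate, is available; the counts and prior are hypothetical:

```python
import random
import statistics

def posterior(a, b, counts):
    """Gamma(a, b) prior (shape, rate) is conjugate to Poisson counts:
    the posterior for the crash rate is Gamma(a + sum(y), b + n)."""
    return a + sum(counts), b + len(counts)

crashes = [3, 1, 4, 2, 2]        # hypothetical yearly crash counts at one site
a_post, b_post = posterior(a=1.0, b=0.5, counts=crashes)
post_mean = a_post / b_post      # analytic posterior mean of the crash rate

# Because the full posterior is at hand, any inference (intervals, tail
# probabilities) is a Monte Carlo summary over posterior draws:
random.seed(0)
draws = [random.gammavariate(a_post, 1.0 / b_post) for _ in range(20000)]
print(round(post_mean, 2))                # 2.36 crashes/year
print(round(statistics.mean(draws), 2))   # Monte Carlo mean, close to analytic
```

An empirical Bayes analysis would instead plug in point estimates of a and b fitted from reference sites (the Safety Performance Function step the abstract mentions) and would not propagate their uncertainty.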


DnaSP is a software package for the analysis of DNA polymorphism data. The present version introduces several new modules and features which, among other options, allow: (1) handling big data sets (~5 Mb per sequence); (2) conducting a large number of coalescent-based tests by Monte Carlo computer simulation; (3) extensive analyses of genetic differentiation and gene flow among populations; (4) analysing the evolutionary pattern of preferred and unpreferred codons; (5) generating graphical outputs for easy visualization of results. Availability: The software package, including complete documentation and examples, is freely available to academic users from http://www.ub.es/dnasp
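Nucleotide diversity (pi) is one of the basic polymorphism statistics a package like DnaSP reports; a dependency-free sketch on toy aligned sequences (this is a textbook definition, not DnaSP's own code):

```python
from itertools import combinations

def nucleotide_diversity(seqs):
    """Nucleotide diversity (pi): mean proportion of differing sites
    over all pairs of aligned sequences."""
    pairs = list(combinations(seqs, 2))
    per_pair = [sum(a != b for a, b in zip(s, t)) / len(s) for s, t in pairs]
    return sum(per_pair) / len(per_pair)

# Three toy aligned haplotypes of length 8
sample = ["ACGTACGT", "ACGTACGA", "ACGAACGT"]
print(round(nucleotide_diversity(sample), 4))  # 0.1667
```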


PURPOSE: Continuous positive airway pressure (CPAP) is the gold standard treatment for obstructive sleep apnea. However, the physiologic impact of CPAP on cerebral blood flow (CBF) is not well established. Ultrasound can be used to estimate CBF, but there is no widely accepted protocol. We studied the physiologic influence of CPAP on CBF using a method integrating arterial diameter and flow velocity (FV) measurements obtained for each vessel supplying blood to the brain. METHODS: FV and lumen diameter of the left and right internal carotid, vertebral, and middle cerebral arteries were measured using duplex Doppler ultrasound with and without CPAP at 15 cm H(2)O, applied in a random order. Transcutaneous carbon dioxide (PtcCO(2)), heart rate (HR), blood pressure (BP), and oxygen saturation were monitored. Results were compared with a theoretical prediction of CBF change based on the effect of partial pressure of carbon dioxide on CBF. RESULTS: Data were obtained from 23 healthy volunteers (mean ± SD; 12 male, age 25.1 ± 2.6 years, body mass index 21.8 ± 2.0 kg/m(2)). The mean experimental and theoretical CBF decrease under CPAP was 12.5 % (p < 0.001) and 11.9 % (p < 0.001), respectively. The difference between experimental and theoretical CBF reduction was not statistically significant (3.84 ± 79 ml/min, p = 0.40). There was a significant reduction in PtcCO(2) with CPAP (p < 0.001) and a significant increase in mean BP (p = 0.0017). No significant change was observed in SaO(2) (p = 0.21) or HR (p = 0.62). CONCLUSION: Duplex Doppler ultrasound measurements of arterial diameter and FV allow for a noninvasive bedside estimation of CBF. CPAP at 15 cm H(2)O significantly decreased CBF in healthy awake volunteers. This effect appeared to be mediated predominantly through the hypocapnic vasoconstriction coinciding with the PCO(2) reduction. The results suggest that CPAP should be used cautiously in patients with unstable cerebral hemodynamics.
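The integration of diameter and velocity into a flow estimate reduces to multiplying the time-averaged velocity by the lumen cross-sectional area for each supplying vessel and summing. The readings below are hypothetical, and the circular-lumen assumption is an idealization of what the duplex measurements provide:

```python
import math

def vessel_flow(diameter_mm, velocity_cm_s):
    """Volumetric flow (ml/min) = mean velocity * lumen cross-sectional area,
    assuming a circular lumen."""
    radius_cm = diameter_mm / 10.0 / 2.0
    area_cm2 = math.pi * radius_cm ** 2
    return velocity_cm_s * area_cm2 * 60.0

# Hypothetical duplex readings: (lumen diameter mm, time-averaged velocity cm/s)
vessels = {
    "left ICA": (4.8, 30.0), "right ICA": (4.9, 29.0),
    "left VA": (3.4, 18.0), "right VA": (3.6, 17.0),
}
total_cbf = sum(vessel_flow(d, v) for d, v in vessels.values())
print(round(total_cbf))  # ~856 ml/min summed over the supplying arteries
```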


Background: Molecular tools may help to uncover closely related and still diverging species from a wide variety of taxa and provide insight into the mechanisms, pace and geography of marine speciation. There is a certain controversy on the phylogeography and speciation modes of species-groups with an Eastern Atlantic-Western Indian Ocean distribution, with previous studies suggesting that older events (Miocene) and/or more recent (Pleistocene) oceanographic processes could have influenced the phylogeny of marine taxa. The spiny lobster genus Palinurus allows for testing among speciation hypotheses, since it has a particular distribution with two groups of three species each in the Northeastern Atlantic (P. elephas, P. mauritanicus and P. charlestoni) and Southeastern Atlantic and Southwestern Indian Oceans (P. gilchristi, P. delagoae and P. barbarae). In the present study, we obtain a more complete understanding of the phylogenetic relationships among these species through a combined dataset with both nuclear and mitochondrial markers, by testing alternative hypotheses on both the mutation rate and tree topology under the recently developed approximate Bayesian computation (ABC) methods. Results: Our analyses support a North-to-South speciation pattern in Palinurus, with all the South-African species forming a monophyletic clade nested within the Northern Hemisphere species. Coalescent-based ABC methods allowed us to reject the previously proposed hypothesis of a Middle Miocene speciation event related with the closure of the Tethyan Seaway. Instead, divergence times obtained for Palinurus species using the combined mtDNA-microsatellite dataset and standard mutation rates for mtDNA agree with known glaciation-related processes occurring during the last 2 million years. Conclusion: The Palinurus speciation pattern is a typical example of a series of rapid speciation events occurring within a group, with very short branches separating different species.
Our results support the hypothesis that recent climate change-related oceanographic processes have influenced the phylogeny of marine taxa, with most Palinurus species originating during the last two million years. The present study highlights the value of new coalescent-based statistical methods such as ABC for testing different speciation hypotheses using molecular data.
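A toy illustration of the ABC rejection idea used above: draw candidate divergence times from a prior, simulate a summary statistic under a simple molecular-clock model, and keep only the draws that reproduce the observed value. The clock rate and observed substitution count are invented for the sketch and do not correspond to the Palinurus data:

```python
import math
import random

def poisson_draw(lam):
    """Knuth's Poisson sampler (keeps the sketch dependency-free)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def abc_rejection(observed, simulate, prior_draw, tolerance, n_iter=5000):
    """Minimal ABC rejection sampler: keep parameter draws whose simulated
    summary statistic falls within `tolerance` of the observed statistic."""
    return [theta for theta in (prior_draw() for _ in range(n_iter))
            if abs(simulate(theta) - observed) <= tolerance]

random.seed(1)
MU = 1.5            # substitutions per Myr (hypothetical clock rate)
observed_subs = 6   # hypothetical observed substitution count between lineages

def simulate(t):
    return poisson_draw(2 * MU * t)   # divergence time t -> Poisson(2 * mu * t)

posterior_t = abc_rejection(observed_subs, simulate,
                            prior_draw=lambda: random.uniform(0.0, 5.0),
                            tolerance=0)
print(round(sum(posterior_t) / len(posterior_t), 2))  # posterior mean time (Myr)
```

Real ABC analyses use richer summary statistics and nonzero tolerances, but the accept/reject skeleton is the same.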


Abstract: This work is concerned with the development and application of novel unsupervised learning methods, with two target applications in mind: the analysis of forensic case data and the classification of remote sensing images. First, a method based on a symbolic optimization of the inter-sample distance measure is proposed to improve the flexibility of spectral clustering algorithms, and is applied to forensic case data. This distance is optimized using a loss function related to the preservation of neighborhood structure between the input space and the space of principal components, and solutions are found using genetic programming. Results are compared to a variety of state-of-the-art clustering algorithms. Subsequently, a new large-scale clustering method based on a joint optimization of feature extraction and classification is proposed and applied to various databases, including two hyperspectral remote sensing images. The algorithm makes use of a functional model (e.g., a neural network) for clustering which is trained by stochastic gradient descent. Results indicate that such a technique can easily scale to huge databases, can avoid the so-called out-of-sample problem, and can compete with or even outperform existing clustering algorithms on both artificial data and real remote sensing images. This is verified on small databases as well as on very large problems.
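A minimal stand-in for the stochastic-gradient clustering idea: online k-means, where each sample nudges only its nearest centroid, so in principle a single streaming pass over a huge image suffices. This is a deliberate simplification of the functional (neural) model the thesis describes:

```python
import random

def online_kmeans(points, k, epochs=20, seed=0):
    """Stochastic-gradient ("online") k-means: each sample moves its nearest
    centroid by a per-cluster decaying step, so the method scales to streams
    and very large datasets."""
    rng = random.Random(seed)
    centroids = [list(p) for p in rng.sample(points, k)]
    counts = [1] * k
    for _ in range(epochs):
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            counts[j] += 1
            step = 1.0 / counts[j]          # decaying learning rate
            centroids[j] = [c + step * (a - c)
                            for c, a in zip(centroids[j], p)]
    return centroids

# Two well-separated toy clusters standing in for pixel feature vectors
data = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0),
        (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
cents = sorted(online_kmeans(data, k=2))
print([tuple(round(x, 1) for x in c) for c in cents])  # one centroid per cluster
```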


There is a renewed interest among psychotherapy researchers and psychotherapists in psychotherapy case studies. This article presents two paradigms that have greatly influenced this growing interest: the pragmatic case study paradigm and the theory-building case study paradigm. The origins, developments and key concepts of both paradigms are presented, as well as their methodological and ethical specificities. Examples of case studies, along with the models developed, are cited. The differential influence of the post-modern schools on both paradigms is presented, and their contribution to the field of psychotherapy case study methods is discussed and assessed in terms of relevance for the researcher and the psychotherapist.


BACKGROUND: Direct noninvasive visualization of the coronary vessel wall may enhance risk stratification by quantifying subclinical coronary atherosclerotic plaque burden. We sought to evaluate high-resolution black-blood 3D cardiovascular magnetic resonance (CMR) imaging for in vivo visualization of the proximal coronary artery vessel wall. METHODS AND RESULTS: Twelve adult subjects, including 6 clinically healthy subjects and 6 patients with nonsignificant coronary artery disease (10% to 50% x-ray angiographic diameter reduction) were studied with the use of a commercial 1.5 Tesla CMR scanner. Free-breathing 3D coronary vessel wall imaging was performed along the major axis of the right coronary artery with isotropic spatial resolution (1.0x1.0x1.0 mm(3)) with the use of a black-blood spiral image acquisition. The proximal vessel wall thickness and luminal diameter were objectively determined with an automated edge detection tool. The 3D CMR vessel wall scans allowed for visualization of the contiguous proximal right coronary artery in all subjects. Both mean vessel wall thickness (1.7+/-0.3 versus 1.0+/-0.2 mm) and wall area (25.4+/-6.9 versus 11.5+/-5.2 mm(2)) were significantly increased in the patients compared with the healthy subjects (both P<0.01). The lumen diameter (3.6+/-0.7 versus 3.4+/-0.5 mm, P=0.47) and lumen area (8.9+/-3.4 versus 7.9+/-3.5 mm(2), P=0.47) were similar in both groups. CONCLUSIONS: Free-breathing 3D black-blood coronary CMR with isotropic resolution identified an increased coronary vessel wall thickness with preservation of lumen size in patients with nonsignificant coronary artery disease, consistent with a "Glagov-type" outward arterial remodeling. This novel approach has the potential to quantify subclinical disease.
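The wall and lumen areas reported above are roughly consistent with modeling the vessel cross-section as a circular annulus. This is an idealization: the automated edge-detection tool presumably integrates the true (non-circular) contour, which would explain why the closed-form values below slightly overestimate the reported group means:

```python
import math

def lumen_area(lumen_diameter_mm):
    """Lumen cross-sectional area for a circular lumen (mm^2)."""
    return math.pi * (lumen_diameter_mm / 2.0) ** 2

def wall_area(lumen_diameter_mm, wall_thickness_mm):
    """Wall cross-section as the annulus between the outer and luminal
    circles (mm^2)."""
    r_in = lumen_diameter_mm / 2.0
    r_out = r_in + wall_thickness_mm
    return math.pi * (r_out ** 2 - r_in ** 2)

# Reported group means: patients d = 3.6 mm, t = 1.7 mm; healthy d = 3.4, t = 1.0
print(round(wall_area(3.6, 1.7), 1), round(wall_area(3.4, 1.0), 1))  # 28.3 13.8
print(round(lumen_area(3.6), 1), round(lumen_area(3.4), 1))          # 10.2 9.1
```

Compare with the reported 25.4 vs 11.5 mm(2) wall areas and 8.9 vs 7.9 mm(2) lumen areas: the circular model reproduces the pattern (thicker wall, preserved lumen) even though the absolute values differ.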