25 results for Rapid Assessment

at Université de Lausanne, Switzerland


Relevance: 100.00%

Abstract:

Background: We assessed the impact of a smoking ban in hospitality venues in the Seychelles 9 months after legislation was implemented. Methods: Survey officers observed compliance with the smoking ban in the 38 most popular hospitality venues and administered a structured questionnaire to two customers, two workers and one manager in each venue. Results: Virtually no customers or workers were seen smoking in the indoor premises. Patrons, workers and managers largely supported the ban. The personnel of the hospitality venues reported that most smokers had no difficulty refraining from smoking. However, a third of workers did not systematically request customers to stop smoking and half of them did not report adequate training. Workers reported improved health. No substantial change in the number of customers was noted. Conclusion: A ban on public smoking was generally well implemented in hospitality venues, but some less-than-optimal findings suggest the need for adequate training of workers and strengthened enforcement measures. The simple and inexpensive methodology used in this rapid survey may be a useful approach to evaluate the implementation and impact of clean-air policies in low- and middle-income countries.

Relevance: 100.00%

Abstract:

This report identifies the subgroup of transgender people who engage in, or have engaged in, sex work as a population clearly exposed to a markedly elevated risk of HIV and STI infection. The reported scale of the phenomenon fully justifies including this population in the behavioural surveillance system for HIV and other STIs, as well as the urgent implementation of community-based prevention measures. However, the report does not allow us to reach a strong consensus regarding the situation of transgender people who are not sex workers with respect to HIV and other STIs in Switzerland. International data are nonetheless sufficiently concerning to justify research on sexual health within this population. Such research should take into account the often hostile living contexts faced by transgender people. In addition, transgender people should be recognised as such and systematically recorded in HIV and STI notification systems, in the statistical monitoring tools of HIV/STI testing and counselling centres, and in the Swiss Health Survey.

Relevance: 60.00%

Abstract:

OBJECTIVES: The Swiss AIDS prevention strategy has been subject to a continuous process of evaluation for the past 12 years. This paper describes the conceptual approach, methodology, results obtained and contribution to policy-making of that evaluation. DESIGN: The evaluation is ongoing, global with respect to all components of the strategy, and utilization-focused. Each successive phase of the evaluation has included 10-20 studies centred either on aspects of process, of outcome or of environmental context. Findings are synthesized at the end of each phase. METHODS: Both quantitative and qualitative methods are used. Studies generally have one of three functions within the overall evaluation: assessment of trends through surveys or other types of repeated studies; evaluation of specific areas through a series of studies from different viewpoints; in-depth investigation or rapid assessment through one-off studies. Various methods of triangulation are used to validate findings. RESULTS: The evaluation has allowed for: the observation of behavioural change in different populations; the availability of scientific data in controversial fields such as drug-use policy; an understanding of the diversity of public appropriation of prevention messages. Recommendations are regularly formulated and have been used by policy-makers and field workers for strategy development. CONCLUSIONS: The global approach adopted corresponds well to the evaluation requirements of an integrated long-term prevention strategy. Cost is low relative to the extent of information provided. Such an evaluation cannot, however, address the question of a causal relationship between the strategy and the observed changes. The evaluation has contributed to the development of a culture of evaluation in Swiss AIDS prevention more generally.

Relevance: 60.00%

Abstract:

Summary Due to their conic shape and the reduction of area with increasing elevation, mountain ecosystems were identified early on as potentially very sensitive to global warming. Moreover, mountain systems may experience unprecedented rates of warming during the next century, two or three times higher than those recorded during the 20th century. In this context, species distribution models (SDM) have become important tools for rapid assessment of the impact of accelerated land use and climate change on the distribution of plant species. In my study, I developed and tested new predictor variables for species distribution models (SDM), specific to current and future geographic projections of plant species in a mountain system, using the Western Swiss Alps as model region. Since meso- and micro-topography are relevant to explain geographic patterns of plant species in mountain environments, I assessed the effect of scale on predictor variables and on geographic projections of SDM. I also developed a methodological framework of space-for-time evaluation to test the robustness of SDM when projected into a future, changing climate. Finally, I used a cellular automaton to run dynamic simulations of plant migration under climate change in a mountain landscape, including realistic seed dispersal distances. Results of future projections for the 21st century were also discussed in the perspective of vegetation changes monitored during the 20th century. Overall, I showed in this study that, based on the most severe A1 climate change scenario and realistic simulations of plant dispersal, species extinctions in the Western Swiss Alps could affect nearly one third (28.5%) of the 284 species modeled by 2100. With the less severe B1 scenario, only 4.6% of species are predicted to become extinct. However, even with B1, 54% (153 species) may still lose more than 80% of their initial surface. Results of the monitoring of past vegetation changes suggest that plant species can react quickly to warmer conditions as long as competition is low. However, in subalpine grasslands, competition from species already present is probably important and limits the establishment of newly arrived species. Results from the future simulations also showed that heavy extinctions of alpine plants may start as early as 2040, and at the latest by 2080. My study also highlighted the importance of fine-scale, regional assessments of climate change impact on mountain vegetation, using more direct predictor variables. Indeed, predictions at the continental scale may fail to capture local refugia or local extinctions, as well as losses of connectivity between local populations. On the other hand, migration of low-elevation species to higher altitudes may be difficult to predict at the local scale.
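
The extinction and range-loss figures above are summaries over per-species projections. The sketch below is an illustrative reconstruction of that kind of summary, not the thesis pipeline: it assumes current and projected occupancy are available as boolean grids per species, counts a species as extinct when no suitable cell remains, and as a heavy loser when more than 80% of its initial surface is lost.

    import numpy as np

    def range_change_summary(current, future, loss_threshold=0.8):
        """current, future: dicts mapping species name -> boolean occupancy grid (2D array)."""
        extinct, heavy_loss = [], []
        for species, grid_now in current.items():
            initial = int(np.asarray(grid_now).sum())
            remaining = int(np.asarray(future[species]).sum())
            if initial == 0:
                continue  # species absent from the study area to begin with
            if remaining == 0:
                extinct.append(species)
            elif 1.0 - remaining / initial > loss_threshold:
                heavy_loss.append(species)
        return extinct, heavy_loss

    # Tiny synthetic example: two "species" on a 10 x 10 grid with strongly reduced future ranges.
    rng = np.random.default_rng(1)
    current = {f"sp{i}": rng.random((10, 10)) < 0.4 for i in range(2)}
    future = {sp: np.logical_and(g, rng.random((10, 10)) < 0.1) for sp, g in current.items()}
    print(range_change_summary(current, future))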

Relevance: 40.00%

Abstract:

Adequate in-vitro training in valved stent deployment, as well as testing of such devices, requires compliant real-size models of the human aortic root. The casting methods used up to now are multi-step, time-consuming and complicated. We pursued the goal of building a flexible 3D model in a single-step procedure. We created a precise 3D CAD model of a human aortic root using previously published anatomical and geometrical data and printed it using a novel rapid prototyping system developed by the Fab@Home project. As material for 3D fabrication we used common household silicone, and afterwards dip-coated several models with dispersion silicone one or two times. To assess production precision we compared the size of the final product with the CAD model. Compliance of the models was measured and compared with that of a native porcine aortic root. Total fabrication time was 3 h and 20 min. Dip-coating once or twice with dispersion silicone, if applied, took one or two extra days, respectively. The error in dimensions of the non-coated aortic root model compared with the CAD design was <3.0% along the X- and Y-axes and 4.1% along the Z-axis. The compliance of a non-coated model, judged by the change in radius in the radial direction (16.39%), was significantly different (P<0.001) from that of native aortic tissue (23.54%) at a pressure of 80-100 mmHg. Rapid prototyping of compliant, life-size anatomical models with the Fab@Home 3D printer is feasible, and it is very quick compared with previous casting methods.

Relevance: 30.00%

Abstract:

PURPOSE: To evaluate a diagnostic strategy for pulmonary embolism that combined clinical assessment, plasma D-dimer measurement, lower limb venous ultrasonography, and helical computed tomography (CT). METHODS: A cohort of 965 consecutive patients presenting to the emergency departments of three general and teaching hospitals with clinically suspected pulmonary embolism underwent sequential noninvasive testing. Clinical probability was assessed by a prediction rule combined with implicit judgment. All patients were followed for 3 months. RESULTS: A normal D-dimer level (<500 microg/L by a rapid enzyme-linked immunosorbent assay) ruled out venous thromboembolism in 280 patients (29%), and finding a deep vein thrombosis by ultrasonography established the diagnosis in 92 patients (9.5%). Helical CT was required in only 593 patients (61%) and showed pulmonary embolism in 124 patients (12.8%). Pulmonary embolism was considered ruled out in the 450 patients (46.6%) with a negative ultrasound and CT scan and a low-to-intermediate clinical probability. The 8 patients with a negative ultrasound and CT scan despite a high clinical probability proceeded to pulmonary angiography (positive: 2; negative: 6). Helical CT was inconclusive in 11 patients (pulmonary embolism: 4; no pulmonary embolism: 7). The overall prevalence of pulmonary embolism was 23%. Patients classified as not having pulmonary embolism were not anticoagulated during follow-up and had a 3-month thromboembolic risk of 1.0% (95% confidence interval: 0.5% to 2.1%). CONCLUSION: A noninvasive diagnostic strategy combining clinical assessment, D-dimer measurement, ultrasonography, and helical CT yielded a diagnosis in 99% of outpatients suspected of pulmonary embolism, and appeared to be safe, provided that CT was combined with ultrasonography to rule out the disease.
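
The sequential logic of the strategy can be written down compactly. The sketch below is an illustrative reconstruction of the decision rule described in the abstract, not the study's protocol software; the function name, argument encoding and returned strings are assumptions.

    def pe_workup(d_dimer_ug_per_l, ultrasound_dvt, ct_result, clinical_probability):
        """Return a management decision for suspected pulmonary embolism.

        d_dimer_ug_per_l: plasma D-dimer in microg/L (rapid ELISA)
        ultrasound_dvt: True if lower-limb ultrasonography shows a deep vein thrombosis
        ct_result: 'positive', 'negative' or 'inconclusive' helical CT (None if not done)
        clinical_probability: 'low', 'intermediate' or 'high'
        """
        if d_dimer_ug_per_l < 500:
            return "PE ruled out (normal D-dimer)"
        if ultrasound_dvt:
            return "PE diagnosed (DVT on ultrasonography)"
        if ct_result == "positive":
            return "PE diagnosed (helical CT)"
        if ct_result == "negative" and clinical_probability in ("low", "intermediate"):
            return "PE ruled out (negative ultrasound and CT, low/intermediate probability)"
        # negative CT despite high clinical probability, or inconclusive CT
        return "pulmonary angiography required"

    # Example: a normal D-dimer stops the work-up at the first step.
    print(pe_workup(320, False, None, "low"))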

Relevance: 30.00%

Abstract:

Currently, the most widely used criteria for assessing response to therapy in high-grade gliomas are based on two-dimensional tumor measurements on computed tomography (CT) or magnetic resonance imaging (MRI), in conjunction with clinical assessment and corticosteroid dose (the Macdonald Criteria). It is increasingly apparent that there are significant limitations to these criteria, which only address the contrast-enhancing component of the tumor. For example, chemoradiotherapy for newly diagnosed glioblastomas results in a transient increase in tumor enhancement (pseudoprogression) in 20% to 30% of patients, which is difficult to differentiate from true tumor progression. Antiangiogenic agents produce high radiographic response rates, as defined by a rapid decrease in contrast enhancement on CT/MRI that occurs within days of initiation of treatment and that is partly a result of reduced vascular permeability to contrast agents rather than a true antitumor effect. In addition, a subset of patients treated with antiangiogenic agents develop tumor recurrence characterized by an increase in the nonenhancing component depicted on T2-weighted/fluid-attenuated inversion recovery sequences. The recognition that contrast enhancement is nonspecific and may not always be a true surrogate of tumor response, and the need to account for the nonenhancing component of the tumor, mandate that new criteria be developed and validated to permit accurate assessment of the efficacy of novel therapies. The Response Assessment in Neuro-Oncology Working Group is an international effort to develop new standardized response criteria for clinical trials in brain tumors. In this proposal, we present the recommendations for updated response criteria for high-grade gliomas.

Relevance: 30.00%

Abstract:

Severe rainfall events are thought to be occurring more frequently in semi-arid areas. In Morocco, flood hazard has become an important topic, notably as rapid economic development and high urbanization rates have increased the exposure of people and assets in hazard-prone areas. The Swiss Agency for Development and Cooperation (SADC) is active in natural hazard mitigation in Morocco. As hazard mapping for urban planning is considered a sound tool for vulnerability reduction, the SADC financed a project aimed at adapting the Swiss approach for hazard assessment and mapping to a Moroccan case study (the town of Beni Mellal, Tadla-Azilal region). In a knowledge-transfer context, the Swiss method was adapted to the semi-arid environment, the specific piedmont morphology and the socio-economic constraints particular to the study site. Following the Swiss guidelines, a hydro-geomorphological map was established, containing all geomorphic elements related to known past floods. Next, rainfall/runoff modeling for reference events and hydraulic routing of the obtained hydrographs were carried out in order to assess hazard quantitatively. Field-collected discharge estimations and flood extents for known floods were used to verify the model results. Flood hazard intensity and probability maps were obtained. Finally, an indicative danger map, as defined within the Swiss hazard assessment terminology, was calculated using the Swiss hazard matrix, which crosses flood intensity with its recurrence probability in order to assign flood danger degrees to the territory concerned. Danger maps become effective, as risk mitigation tools, when implemented in urban planning. We therefore focus on how local authorities are involved in the risk management process and how knowledge about risk shapes that process. An institutional vulnerability "map" was established based on individual interviews held with the main institutional actors in flood management. Results show that flood hazard management is defined by uneven actions and relationships, is based on top-down decision-making patterns, and maintains its focus on active mitigation measures. The institutional actors embody sectorial, often disconnected, pools of risk knowledge, whose relationships are dictated by the institutional hierarchy. Innovation in the risk management process emerges when actors collaborate despite the established hierarchy or when they open up to outside knowledge pools (e.g. academia). Several methodological and institutional recommendations were addressed to risk management stakeholders in view of the potential implementation of the maps in planning. Hazard assessment and mapping are essential to an integrated risk management approach: more than mitigation tools, danger maps allow communication about hazards and help establish a risk culture.
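
The hazard matrix mentioned above can be thought of as a simple lookup from intensity and probability classes to a danger degree. The sketch below illustrates that mechanism under assumed class labels and colour assignments; the actual thresholds and degrees used in the Beni Mellal study are not reproduced here.

    INTENSITY_CLASSES = ("low", "medium", "high")        # e.g. derived from flow depth and velocity
    PROBABILITY_CLASSES = ("high", "medium", "low")      # e.g. derived from the return period

    # Danger degree per (intensity, probability) pair, Swiss-style colour coding.
    # Entries are illustrative assumptions, not the study's calibrated matrix.
    HAZARD_MATRIX = {
        ("high", "high"): "red",    ("high", "medium"): "red",    ("high", "low"): "red",
        ("medium", "high"): "red",  ("medium", "medium"): "blue", ("medium", "low"): "blue",
        ("low", "high"): "blue",    ("low", "medium"): "yellow",  ("low", "low"): "yellow",
    }

    def danger_degree(intensity, probability):
        if intensity not in INTENSITY_CLASSES or probability not in PROBABILITY_CLASSES:
            raise ValueError("unknown intensity or probability class")
        return HAZARD_MATRIX[(intensity, probability)]

    print(danger_degree("medium", "high"))  # -> "red"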

Relevance: 30.00%

Abstract:

Biomarker analysis is playing an essential role in cancer diagnosis, prognosis, and prediction. Quantitative assessment of immunohistochemical biomarker expression on tumor tissues is of clinical relevance when deciding targeted treatments for cancer patients. Here, we report a microfluidic tissue processor that permits accurate quantification of the expression of biomarkers on tissue sections, enabled by the ultra-rapid and uniform fluidic exchange of the device. An important clinical biomarker for invasive breast cancer is human epidermal growth factor receptor 2 (HER2, also known as neu), a transmembrane tyrosine kinase that connotes adverse prognostic information for the patients concerned and serves as a target for personalized treatment using the humanized antibody trastuzumab. Unfortunately, when using state-of-the-art methods, the intensity of an immunohistochemical signal is not proportional to the extent of biomarker expression, causing ambiguous outcomes. Using our device, we performed tests on 76 invasive breast carcinoma cases expressing various levels of HER2. We eliminated more than 90% of the ambiguous results (n = 27), correctly assigning cases to the amplification status as assessed by in situ hybridization controls, whereas the concordance for HER2-negative (n = 31) and -positive (n = 18) cases was 100%. Our results demonstrate the clinical potential of microfluidics for accurate biomarker expression analysis. We anticipate our technique will be a diagnostic tool that will provide better and more reliable data, on which future treatment regimens can be based.

Relevance: 30.00%

Abstract:

Tamoxifen (tam) is a widely used endocrine therapy in the treatment of early and advanced stage breast cancer in women and men. It is a pro-drug with weak affinity for the estrogen receptor and needs to be converted to its main metabolite, endoxifen (endox), to exert full anticancer activity. Cytochrome P450 2D6 (CYP2D6) plays a major role in the metabolism of tamoxifen to endoxifen. It is genetically highly polymorphic, and its activity profoundly influences the synthesis of endoxifen and potentially the efficacy of tamoxifen treatment. Genotyping is currently the most widely used approach, in studies and also in clinical practice, to categorize patients as poor (PM), intermediate (IM), extensive (EM) and ultra-rapid metabolizers (UM). Some clinicians already use genotyping in order to tailor the endocrine therapy of their patients. However, the large inter-individual variation in concentrations of the active moiety, due to genetic and non-genetic influences, renders the predictive value of the test uncertain for an individual patient. A significant number of patients classified as EM or IM by genotyping indeed have relatively low endoxifen levels, similar to those of PMs [1]. This suggests that genotyping is probably not the optimal method for predicting endoxifen levels.
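
Genotype-based categorization is commonly operationalised as an activity score summed over the two CYP2D6 alleles and their copy numbers. The sketch below is a hedged illustration of that general approach; the allele activity values and category cut-offs are assumptions for illustration, not authoritative clinical values.

    # Illustrative activity-score mapping from CYP2D6 diplotype to metabolizer
    # category (PM/IM/EM/UM). Allele scores and cut-offs are assumed values.
    ALLELE_ACTIVITY = {"*1": 1.0, "*2": 1.0, "*4": 0.0, "*5": 0.0, "*10": 0.25, "*41": 0.5}

    def metabolizer_category(allele_a, allele_b, copies_a=1, copies_b=1):
        score = ALLELE_ACTIVITY[allele_a] * copies_a + ALLELE_ACTIVITY[allele_b] * copies_b
        if score == 0:
            return "PM"    # poor metabolizer
        if score < 1.25:
            return "IM"    # intermediate metabolizer
        if score <= 2.25:
            return "EM"    # extensive (normal) metabolizer
        return "UM"        # ultra-rapid metabolizer

    print(metabolizer_category("*1", "*4"))               # one null allele
    print(metabolizer_category("*1", "*1", copies_a=2))   # gene duplication -> UM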

Relevance: 30.00%

Abstract:

Computed tomography (CT) is an imaging technique in which interest has been growing since it first came into use in the early 1970s. In the clinical environment, this imaging system has emerged as the gold standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure. If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over the patient's lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have an extended life span in comparison to adults. For this population, the risk of developing cancer, whose latency period exceeds 20 years, is significantly higher than for adults. Assuming that each patient examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. Over the past few years, CT technology has been advancing at a rapid pace. Since 2009, new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality. The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce dose as much as possible without compromising image quality and diagnosis in examinations of children and young adults, in order to propose optimized protocols. The optimization step requires the evaluation of the delivered dose and of the image quality needed to perform a diagnosis. While the dose is estimated using CT dose indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first approach, called the "physical approach", computed physical metrics (SD, MTF, NPS, etc.) measured on phantoms in well-known conditions. Although this technique has some limitations, because it does not take the radiologist's perspective into account, it enables the physical characterization of image properties in a simple and timely way. The second approach, called the "clinical approach", was based on the evaluation of anatomical structures (diagnostic criteria) present on patient images. Radiologists, involved in the assessment step, were asked to score the image quality of structures for diagnostic purposes using a simple rating scale. This approach is relatively complicated to implement and time-consuming. Nevertheless, it has the advantage of being very close to the practice of radiologists and is considered a reference method. Primarily, this work revealed that the statistical iterative reconstructions studied in the clinic (ASIR and VEO) have a strong potential to reduce CT dose (up to -90%). However, by their mechanisms, they modify the appearance of the image, producing a change in image texture that may then affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise frequency spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic quality. This project also demonstrated that integrating these new statistical iterative reconstruction techniques in the clinic is not straightforward and cannot be done simply on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the image quality tools developed will also be able to guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
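
Among the "physical" metrics mentioned above, the noise power spectrum (NPS) is the one used to characterise the texture change introduced by iterative reconstruction. The snippet below is a generic sketch of a 2D NPS estimate from uniform-phantom ROIs, not the thesis code; the ROI shapes and pixel size in the example are illustrative.

    import numpy as np

    def nps_2d(rois, pixel_size_mm):
        """rois: array of shape (n_rois, ny, nx) of HU values from a uniform phantom."""
        n_rois, ny, nx = rois.shape
        spectra = []
        for roi in rois:
            detrended = roi - roi.mean()                  # remove the mean (DC) level
            ft = np.fft.fftshift(np.fft.fft2(detrended))
            spectra.append(np.abs(ft) ** 2)
        # usual normalisation: pixel area divided by the number of pixels, averaged over ROIs
        return (pixel_size_mm ** 2) / (nx * ny) * np.mean(spectra, axis=0)

    # Example with synthetic white noise: the resulting NPS should be roughly flat.
    rng = np.random.default_rng(0)
    rois = rng.normal(0.0, 10.0, size=(16, 64, 64))
    print(nps_2d(rois, pixel_size_mm=0.5).shape)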

Relevance: 30.00%

Abstract:

BACKGROUND: Despite major advances in the care of premature infants, around 40% of survivors exhibit mild cognitive deficits. Besides severe intraventricular haemorrhages (IVH) and cystic periventricular leucomalacia (PVL), more subtle patterns such as grade I and II IVH, punctate WM lesions and diffuse PVL might be linked to the cognitive deficits. Grey matter disease is also recognized to contribute to long-term cognitive impairment. OBJECTIVE: We intend to use novel MR techniques to study the different injury patterns more precisely. In particular, MP2RAGE (magnetization prepared dual rapid echo gradient) produces high-resolution quantitative T1 relaxation maps. This contrast is known to reflect tissue anomalies such as white matter injury in general and dysmyelination in particular. We also used diffusion tensor imaging, a quantitative technique known to reflect white matter maturation and disease. DESIGN/METHODS: All preterm infants born under 30 weeks of GA were included. Serial 3T MR imaging with a neonatal head coil, using DTI and MP2RAGE sequences, was performed at DOL 3, DOL 10 and at term-equivalent age (TEA). MP2RAGE generates a T1 map and allows calculation of the relaxation time T1. Multiple measurements were performed for each exam in 12 defined white and grey matter ROIs. RESULTS: 16 patients were recruited: mean GA 27 2/7 weeks (191.2 d, SD ±10.8), mean BW 999 g (SD ±265). 39 MRIs were performed (12 early: mean 4.83 d ±1.75; 13 late: mean 18.77 d ±8.05; 14 at TEA: 88.91 d ±8.96). Measures of relaxation time T1 show a gradual and significant decrease over time (for the PLIC ROI, mean ±SD in ms: 2100.53 ±102.75, 2116.5 ±41.55 and 1726.42 ±51.31; for the central WM ROI: 2302.25 ±79.02, 2315.02 ±115.02 and 1992.7 ±96.37 for early, late and TEA MR, respectively). These trends are also observed in grey matter areas, especially in the thalamus. Measurements of ADC values show a similar monotonic decrease over time. CONCLUSIONS: From these preliminary results, we conclude that quantitative MR imaging in very preterm infants is feasible. On the successive MP2RAGE and DTI sequences, we observe a gradual decrease over time in the described ROIs, representing the progressive maturation of the WM microstructure; interestingly, the same evolution is observed in the grey matter. We speculate that our study will provide normative values for the T1 map and ADC and might yield a predictive factor for favourable or less favourable outcome.

Relevance: 30.00%

Abstract:

OBJECTIVE: To validate a revision of the Mini Nutritional Assessment short-form (MNA®-SF) against the full MNA, a standard tool for nutritional evaluation. METHODS: A literature search identified studies that used the MNA for nutritional screening in geriatric patients. The contacted authors submitted original datasets that were merged into a single database. Various combinations of the questions on the current MNA-SF were tested using this database through combination analysis and ROC-based derivation of classification thresholds. RESULTS: Twenty-seven datasets (n=6257 participants) were initially processed, of which twelve were used in the current analysis on a sample of 2032 study participants (mean age 82.3 y) with complete information on all MNA items. The original MNA-SF was a combination of six questions from the full MNA. A revised MNA-SF, in which calf circumference (CC) is substituted for BMI, performed equally well. A revised three-category scoring classification for this revised MNA-SF, using BMI and/or CC, had good sensitivity compared with the full MNA. CONCLUSION: The newly revised MNA-SF is a valid nutritional screening tool applicable to geriatric health care professionals, with the option of using CC when BMI cannot be calculated. This revised MNA-SF increases the applicability of this rapid screening tool in clinical practice through the inclusion of a "malnourished" category.
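
The CC-for-BMI substitution and the three-category classification can be expressed as a short scoring routine. The sketch below reproduces the commonly published MNA-SF point values and cut-offs from memory and should be treated as an assumption rather than a definitive implementation; it covers only the final anthropometric item and the category thresholds.

    def anthropometric_points(bmi=None, calf_circumference_cm=None):
        """Last MNA-SF item: BMI scoring (0-3 points) or CC substitution (0 or 3 points)."""
        if bmi is not None:
            if bmi < 19: return 0
            if bmi < 21: return 1
            if bmi < 23: return 2
            return 3
        if calf_circumference_cm is not None:
            return 0 if calf_circumference_cm < 31 else 3
        raise ValueError("either BMI or calf circumference is required")

    def mna_sf_category(first_five_items_score, bmi=None, calf_circumference_cm=None):
        """first_five_items_score: sum of the five non-anthropometric items (0-11)."""
        total = first_five_items_score + anthropometric_points(bmi, calf_circumference_cm)
        if total >= 12: return "normal nutritional status"
        if total >= 8:  return "at risk of malnutrition"
        return "malnourished"

    # Example: BMI unavailable, CC of 32 cm substituted for it.
    print(mna_sf_category(7, calf_circumference_cm=32))   # -> "at risk of malnutrition"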

Relevance: 30.00%

Abstract:

BACKGROUND: Protein-energy malnutrition is highly prevalent in aged populations. Associated clinical, economic, and social burden is important. A valid screening method that would be robust and precise, but also easy, simple, and rapid to apply, is essential for adequate therapeutic management. OBJECTIVES: To compare the interobserver variability of 2 methods measuring food intake: semiquantitative visual estimations made by nurses versus calorie measurements performed by dieticians on the basis of standardized color digital photographs of servings before and after consumption. DESIGN: Observational monocentric pilot study. SETTING/PARTICIPANTS: A geriatric ward. The meals were randomly chosen from the meal tray. The choice was anonymous with respect to the patients who consumed them. MEASUREMENTS: The test method consisted of the estimation of calorie consumption by dieticians on the basis of standardized color digital photographs of servings before and after consumption. The reference method was based on direct visual estimations of the meals by nurses. Food intake was expressed in the form of a percentage of the serving consumed and calorie intake was then calculated by a dietician based on these percentages. The methods were applied with no previous training of the observers. Analysis of variance was performed to compare their interobserver variability. RESULTS: Of 15 meals consumed and initially examined, 6 were assessed with each method. Servings not consumed at all (0% consumption) or entirely consumed by the patient (100% consumption) were not included in the analysis so as to avoid systematic error. The digital photography method showed higher interobserver variability in calorie intake estimations. The difference between the compared methods was statistically significant (P < .03). CONCLUSIONS: Calorie intake measures for geriatric patients are more concordant when estimated in a semiquantitative way. Digital photography for food intake estimation without previous specific training of dieticians should not be considered as a reference method in geriatric settings, as it shows no advantages in terms of interobserver variability.

Relevance: 30.00%

Abstract:

OBJECTIVE: To evaluate the relationship between changes in body bioelectrical impedance (BI) at 0.5, 50 and 100 kHz and changes in body weight, as an index of total body water changes, in acutely ill surgical patients during the rapid infusion of isotonic saline solution. DESIGN: Prospective clinical study. SETTING: Multidisciplinary surgical ICU in a university hospital. PATIENTS: Twelve male patients treated for acute surgical illness (multiple trauma n = 5, major surgery n = 7). Selection criteria: stable cardiovascular parameters, normal cardiac function, signs of hypovolemia (CVP ≤ 5 mmHg, urine output < 1 ml/kg per h). INTERVENTIONS: After baseline measurements, a 60 min fluid challenge test was performed with normal saline solution, 0.25 ml/kg/min [corrected]. MEASUREMENTS AND RESULTS: Body weight (platform digital scale), total body impedance (four-surface electrode technique; measurements at 0.5, 50 and 100 kHz) and urine output were recorded. Fluid retention induced a progressive decrease in BI at 0.5, 50 and 100 kHz, but the changes were significant for BI 0.5 and BI 100 only, from 40 min after the beginning of the fluid therapy onwards. There was a significant negative correlation between changes in water retention and BI 0.5, with individual correlation coefficients ranging from -0.72 to -0.95 (p < 0.01-0.0001). The slopes of the regression lines indicated that for each kg of water change there was a mean decrease in BI of 18 ohm, but substantial inter-individual variability was noted. CONCLUSION: BI measured at low frequency can represent a valuable index of acute changes in body water in a group of surgical patients, but not in a given individual.
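
As a small worked example of the reported group-level relationship (a mean fall of about 18 ohm in low-frequency BI per kg of retained water), the sketch below converts an observed impedance change into an estimated water change; as the abstract stresses, inter-individual variability makes this unreliable for a single patient.

    # Group-level slope from the abstract: roughly -18 ohm of BI per kg of water gained.
    MEAN_SLOPE_OHM_PER_KG = -18.0

    def estimated_water_change_kg(delta_bi_ohm):
        """Estimate water change (kg) from a change in low-frequency BI (ohm)."""
        return delta_bi_ohm / MEAN_SLOPE_OHM_PER_KG

    # A 9 ohm drop in BI at 0.5 kHz would suggest roughly 0.5 kg of fluid retention.
    print(estimated_water_change_kg(-9.0))   # -> 0.5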