148 results for Help sources
Abstract:
CORONARY artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on coronary artery disease. The major challenge of cardiac MRI, though, is dealing with all the sources of motion that can corrupt the images and degrade the diagnostic information they provide. This thesis therefore focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and tissue information.
The proposed approaches help advance coronary magnetic resonance angiography (MRA) toward an easy-to-use, multipurpose tool that can be translated to the clinical environment. The first part of the thesis focused on the study of coronary artery motion in patients using the gold-standard imaging technique (x-ray angiography), in order to measure the precision with which the coronary arteries return to the same position beat after beat (coronary artery repositioning). We learned that intervals with minimal coronary artery repositioning occur in peak systole and in mid-diastole, and we responded with a new pulse sequence (T2-post) that enables peak-systolic imaging. This sequence was tested in healthy volunteers and, from the image quality comparison, we learned that the proposed approach provides coronary artery visualization and contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR). The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI. The proposed technique acquires the data continuously (free-running) instead of being triggered, thus increasing the efficiency of the acquisition and providing four-dimensional (4D) images of the whole heart, while respiratory self-navigation allows the scan to be performed during free breathing. This enabling technology allows anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle-based 3D radial trajectory, which allows continuous sampling of k-space and retrospective selection of the timing parameters of the reconstructed dataset.
The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset while also improving overall image quality by removing undersampling artifacts. The resulting 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow assessment of cardiac function with a single free-breathing scan. The quality of the coronary arteries in the frames of the free-running 4D acquisition is in line with that obtained with the standard ECG-triggered acquisition, and the cardiac function evaluation matched that measured with gold-standard stacks of 2D cine acquisitions. Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging allows visualization of coronary artery plaque calcification ex vivo, since it can detect the short-T2 components of the calcification. Cardiac motion, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated 3D radial triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are acquired to extract the short-T2 components of the calcification, while a water-fat separation technique allows proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence showed great potential for the in vivo visualization of coronary artery calcification. In conclusion, the thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease.
These approaches provide new anatomical and functional information in four dimensions, and support tissue characterization for coronary artery plaques.
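The golden-angle sampling with retrospective phase selection described above can be sketched in code. This is a minimal 2D analogue in Python, assuming hypothetical spoke timestamps and ECG R-peak times as inputs (the thesis itself uses a 3D spiral phyllotaxis variant of the golden-angle trajectory, not reproduced here):

```python
import math

# 2D golden angle: successive radial spokes are rotated by ~111.246 degrees,
# giving near-uniform angular coverage for any contiguous subset of spokes.
GOLDEN_ANGLE_DEG = 180.0 * (math.sqrt(5) - 1) / 2

def spoke_angle(i):
    """Azimuthal angle (degrees) of the i-th radial spoke."""
    return (i * GOLDEN_ANGLE_DEG) % 360.0

def bin_spokes(timestamps, r_peaks, n_phases):
    """Retrospectively assign each continuously acquired spoke to one of
    n_phases cardiac phases, based on its position within the R-R interval.
    timestamps and r_peaks are in seconds (hypothetical inputs)."""
    bins = [[] for _ in range(n_phases)]
    for idx, t in enumerate(timestamps):
        prev = max((r for r in r_peaks if r <= t), default=None)
        nxt = min((r for r in r_peaks if r > t), default=None)
        if prev is None or nxt is None:
            continue  # spoke falls outside a complete R-R interval
        phase = int((t - prev) / (nxt - prev) * n_phases)
        bins[min(phase, n_phases - 1)].append(idx)
    return bins
```

Because the angular increment is irrational with respect to 360°, any retrospectively chosen window of consecutive spokes yields a roughly uniform k-space coverage, which is what makes the a posteriori choice of timing parameters possible.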
Abstract:
Ingvaldsen et al. comment on our study assessing global fish interchanges between the North Atlantic and Pacific oceans for more than 500 species over the entire 21st century. They propose that discrepancies between our model projections and observed data for cod in the Barents Sea result from the choice of Atmosphere-Ocean General Circulation Models (AOGCMs). We address this assertion here by re-running the cod model with additional observation data from the Barents Sea [1, 3], and show that the lack of open-access, archived data for the Barents Sea was the primary cause of the local prediction mismatch. This finding underscores the importance of systematically depositing biodiversity data in global databases.
Abstract:
Gamma Knife surgery (GKS) is widely used as an alternative to open microsurgical procedures for the noninvasive treatment of many intracranial conditions. It consists of delivering a single high-energy dose under stereotactic conditions, with the help of multimodal imaging (e.g., magnetic resonance imaging [MRI], computed tomography, and, when needed, angiography). The Gamma Knife (GK) was invented by the Swedish neurosurgeon Lars Leksell, who was the first to treat a trigeminal neuralgia sufferer in 1951 using an orthogonal X-ray tube.
Since then, progress in both computing and robotics has improved the radiosurgical technique, which is currently performed either with a linear particle accelerator mounted on a robotic arm (Novalis®, Cyberknife®) or by collimation of 192 fixed Co-60 sources (GK). The main indication of GKS in the treatment of pain is trigeminal neuralgia. Other, less frequent indications are glossopharyngeal neuralgia, cluster headache, and hypophysiolysis for cancer pain.
Abstract:
OBJECTIVE: To identify and quantify sources of variability in scores on the Speech, Spatial, and Qualities of Hearing Scale (SSQ) and its short forms among normal-hearing and hearing-impaired subjects, using a French-language version of the SSQ. DESIGN: Multi-regression analyses of SSQ scores were performed using age, gender, years of education, hearing loss, and hearing-loss asymmetry as predictors. Similar analyses were performed for each subscale (Speech, Spatial, and Qualities), for several SSQ short forms, and for differences in subscale scores. STUDY SAMPLE: One hundred normal-hearing subjects (NHS) and 230 hearing-impaired subjects (HIS). RESULTS: Hearing loss in the better ear and hearing-loss asymmetry were the two main predictors of scores on the overall SSQ, the three main subscales, and the SSQ short forms. The greatest difference between the NHS and HIS was observed for the Speech subscale, on which even the NHS scored well below the maximum of 10. An age effect was observed mostly on the Speech subscale items, and the number of years of education had a significant influence on several Spatial and Qualities subscale items. CONCLUSION: Strong similarities between SSQ scores obtained across different populations and languages, and between the SSQ and its short forms, underline their potential for international use.
Abstract:
Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or coming from the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process, and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from the images. Acquisition conditions were fine-tuned to optimise the reproducibility and comparability of the images. Different filters and comparison metrics were evaluated, and the performance of the method was assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, some of which were known to come from common sources. Results indicate that using Hue and Edge filters, or their combination, to extract profiles from the images, and then comparing the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient, and inexpensive. It can easily be operated from remote locations and shared among different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that helps target more resource-intensive profiling methods (based, for instance, on a visual, physical, or chemical examination of the documents). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
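The Canberra distance used above for profile comparison is a standard metric; a minimal sketch in Python follows (the extraction of profiles via the Hue and Edge filters is specific to the article's prototype and is not reproduced here):

```python
def canberra(p, q):
    """Canberra distance between two equal-length numeric profiles:
    sum of |p_i - q_i| / (|p_i| + |q_i|), where coordinate pairs that
    are both zero contribute 0 by convention."""
    if len(p) != len(q):
        raise ValueError("profiles must have equal length")
    total = 0.0
    for a, b in zip(p, q):
        denom = abs(a) + abs(b)
        if denom:
            total += abs(a - b) / denom
    return total
```

Because each coordinate's difference is normalised by the coordinates' combined magnitude, the metric is sensitive to proportional differences even in low-intensity regions of a profile, which plausibly suits image-derived intensity profiles.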
Abstract:
In 1780, the physician Jean-André Venel founded in Orbe (canton of Vaud) the first known orthopedic institute in the world, offering a clinical version of an ancient medical savoir-faire. Using sources that allow us to update the scholarship on Venel, this article traces the origins of his institute and of his medical thought, in the context of the production and diffusion of a specialized knowledge of the body and of children's medicine. With this new perspective on the legendary, if not mythical, figure whom the history of medicine has canonized as the "father of orthopedics", the article also examines the conditions under which a medical specialty emerged at the end of the Ancien Régime, and its impact in the first decades of the nineteenth century.
Abstract:
Given their central role in mercury (Hg) excretion and their suitability as reservoirs, bird feathers are useful Hg biomonitors. Nevertheless, the interpretation of Hg concentrations is still questioned as a result of poor knowledge of feather physiology and of the mechanisms affecting Hg deposition. Given the constraints on feather availability for ecotoxicological studies, we tested the effect of intra-individual differences in Hg concentrations according to feather type (body vs. flight feathers), position in the wing, and size (mass and length), in order to understand how these factors could affect Hg estimates. We measured the Hg concentration of 154 feathers from 28 un-moulted barn owls (Tyto alba) collected dead on roadsides. Median Hg concentration was 0.45 (0.076-4.5) mg kg(-1) in body feathers, 0.44 (0.040-4.9) mg kg(-1) in primary and 0.60 (0.042-4.7) mg kg(-1) in secondary feathers, and we found little effect of feather type on intra-individual Hg levels. We also found a negative effect of wing feather mass on Hg concentration, but no effect of feather length or of position in the wing. We hypothesize that differences in feather growth rate may be the main driver of between-feather differences in Hg concentrations, which has implications for the interpretation of Hg concentrations in feathers. Finally, we recommend that, whenever possible, several feathers from the same individual be analysed. The five innermost primaries have the lowest mean deviations from both the between-feather and the intra-individual mean Hg concentration, and thus should be selected under restrictive sampling scenarios.
Abstract:
Determining the appropriate level of integration is crucial to realizing value from acquisitions. Most prior research assumes that higher integration implies the removal of autonomy from target managers, which in turn undermines the functioning of the target firm if it entails unfamiliar elements for the acquirer. Using a survey of 86 acquisitions to obtain the richness of detail necessary to distinguish integration from autonomy, the authors argue and find that integration and autonomy are not the opposite ends of a single continuum. Certain conditions (e.g., when complementarity rather than similarity is the primary source of synergy) lead to high levels of both integration and autonomy. In addition, similarity negatively moderates the relationship between complementarity and autonomy when the target offers both synergy sources. In contrast, similarity does not moderate the link between complementarity and integration. The authors' findings advance scholarly understanding about the drivers of implementation strategy and in particular the different implementation strategies acquiring managers deploy when they attempt to leverage complementarities, similarities, or both.
Abstract:
Illicit drug analyses usually focus on the identification and quantitation of questioned material to support the judicial process. In parallel, more and more laboratories develop physical and chemical profiling methods from a forensic intelligence perspective. The analysis of the large databases resulting from this approach not only yields tactical and operational intelligence, but may also contribute to a strategic overview of drug markets. In Western Switzerland, the chemical analysis of illicit drug seizures is centralised in a laboratory hosted by the University of Lausanne. Over more than 8 years, this laboratory has analysed 5875 cocaine and 2728 heroin specimens, coming from 1138 and 614 seizures, respectively, made by police and border guards or customs. Chemical (major and minor alkaloids, purity, cutting agents, chemical class), physical (packaging and appearance), and circumstantial (criminal case number, mass of drug seized, date and place of seizure) information is collated in a dedicated database for each specimen. The study capitalises on this extended database and defines several indicators to characterise the structure of drug markets, to follow their evolution, and to compare the cocaine and heroin markets. Relational, spatial, temporal, and quantitative analyses of the data reveal the emergence and importance of distribution networks. They make it possible to evaluate the cross-jurisdictional character of drug trafficking, the observation time of drug batches, and the quantity of drugs entering the market every year. Results highlight the stable nature of drug markets over the years despite the very dynamic flows of distribution and consumption. This research illustrates how the systematic analysis of forensic data can elicit knowledge of criminal activities at a strategic level. In combination with information from other sources, such knowledge can help devise intelligence-based preventive and repressive measures and inform discussion of the impact of countermeasures.
Abstract:
In collecting more than two million tweets related to the centenary of the Great War, numerous methodological questions arose, concerning for example the notion of corpus, the relationship between historians and archivists, and the treatment of the past in an era of big data. This contribution addresses one of these questions: how can research be grounded in primary sources that arrive as a stream? How can we resolve the inherent contradiction between the archive, reputedly fixed, and born-digital data that are emitted as a continuous stream?