106 results for Distance-based techniques


Relevance: 30.00%

Abstract:

Screening people without symptoms of disease is an attractive idea. Screening allows early detection of disease or elevated risk of disease, and has the potential for improved treatment and reduction of mortality. The list of future screening opportunities is set to grow because of the refinement of screening techniques, the increasing frequency of degenerative and chronic diseases, and the steadily growing body of evidence on genetic predispositions for various diseases. But how should we decide on the diseases for which screening should be done and on recommendations for how it should be implemented? We use the examples of prostate cancer and genetic screening to show the importance of considering screening as an ongoing population-based intervention with beneficial and harmful effects, and not simply the use of a test. Assessing whether screening should be recommended and implemented for any named disease is therefore a multi-dimensional task in health technology assessment. There are several countries that already use established processes and criteria to assess the appropriateness of screening. We argue that the Swiss healthcare system needs a nationwide screening commission mandated to conduct appropriate evidence-based evaluation of the impact of proposed screening interventions, to issue evidence-based recommendations, and to monitor the performance of screening programmes introduced. Without explicit processes there is a danger that beneficial screening programmes could be neglected and that ineffective, and potentially harmful, screening procedures could be introduced.

Relevance: 30.00%

Abstract:

Rapid amplification of cDNA ends (RACE) is a widely used approach for transcript identification. Random clone selection from the RACE mixture, however, is an ineffective sampling strategy if the dynamic range of transcript abundances is large. To improve sampling efficiency of human transcripts, we hybridized the products of the RACE reaction onto tiling arrays and used the detected exons to delineate a series of reverse-transcriptase (RT)-PCRs, through which the original RACE transcript population was segregated into simpler transcript populations. We independently cloned the products and sequenced randomly selected clones. This approach, RACEarray, is superior to direct cloning and sequencing of RACE products because it specifically targets new transcripts and often results in overall normalization of transcript abundance. We show theoretically and experimentally that this strategy indeed leads to efficient sampling of new transcripts, and we investigated multiplexing the strategy by pooling RACE reactions from multiple interrogated loci before hybridization.
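
As an illustration of the sampling argument above, the following sketch uses an entirely hypothetical, log-normally skewed transcript pool (none of the numbers come from the paper) to compare how many distinct transcripts a fixed number of randomly picked clones recovers before and after abundances are flattened, which is the effect the segregation step aims at.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical RACE pool: 200 distinct transcripts whose abundances span
# several orders of magnitude (log-normal), as assumed for illustration.
n_transcripts = 200
abundance = rng.lognormal(mean=0.0, sigma=2.5, size=n_transcripts)

def distinct_transcripts_found(probabilities, n_clones):
    """Count distinct transcripts recovered when clones are picked at random."""
    picks = rng.choice(len(probabilities), size=n_clones, p=probabilities)
    return len(set(picks))

n_clones = 96  # e.g. one sequencing plate

# Direct cloning: clones are picked in proportion to raw transcript abundance.
p_raw = abundance / abundance.sum()
direct = distinct_transcripts_found(p_raw, n_clones)

# Idealized segregation/normalization: abundances are flattened before cloning,
# so rare transcripts are sampled much more evenly.
p_flat = np.ones(n_transcripts) / n_transcripts
segregated = distinct_transcripts_found(p_flat, n_clones)

print(f"distinct transcripts from direct cloning:        {direct}")
print(f"distinct transcripts after abundance flattening:  {segregated}")
```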

Relevance: 30.00%

Abstract:

Data characteristics and species traits are expected to influence the accuracy with which species' distributions can be modeled and predicted. We compare 10 modeling techniques in terms of predictive power and sensitivity to location error, change in map resolution, and sample size, and assess whether some species traits can explain variation in model performance. We focused on 30 native tree species in Switzerland and used presence-only data to model current distribution, which we evaluated against independent presence-absence data. While there are important differences between the predictive performance of modeling methods, the variance in model performance is greater among species than among techniques. Within the range of data perturbations in this study, some extrinsic parameters of data affect model performance more than others: location error and sample size reduced performance of many techniques, whereas grain had little effect on most techniques. No technique can rescue species that are difficult to predict. The predictive power of species-distribution models can partly be predicted from a series of species characteristics and traits based on growth rate, elevational distribution range, and maximum elevation. Slow-growing species or species with narrow and specialized niches tend to be better modeled. The Swiss presence-only tree data produce models that are reliable enough to be useful in planning and management applications.
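
For readers unfamiliar with this evaluation setup, the sketch below (hypothetical data, with scikit-learn logistic regression as a stand-in for the study's ten modeling techniques) shows the general pattern: a model is fitted on presence-only records plus background points and its predictive power is then scored against independent presence-absence data, for example with AUC.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Hypothetical environmental predictors (e.g. temperature, precipitation)
# at presence locations and at random background locations.
X_presence = rng.normal(loc=1.0, scale=1.0, size=(300, 2))
X_background = rng.normal(loc=0.0, scale=1.0, size=(1000, 2))

# Presence-only training: presences vs. background pseudo-absences.
X_train = np.vstack([X_presence, X_background])
y_train = np.concatenate([np.ones(len(X_presence)), np.zeros(len(X_background))])
model = LogisticRegression().fit(X_train, y_train)

# Independent presence-absence evaluation data (also hypothetical).
X_eval = rng.normal(loc=0.5, scale=1.2, size=(500, 2))
y_eval = (X_eval.sum(axis=1) + rng.normal(scale=1.0, size=500) > 1.0).astype(int)

# Predictive power scored as AUC against the presence-absence data.
auc = roc_auc_score(y_eval, model.predict_proba(X_eval)[:, 1])
print(f"AUC on independent presence-absence data: {auc:.2f}")
```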

Relevance: 30.00%

Abstract:

Oculo-auriculo-vertebral spectrum is a complex developmental disorder characterised mainly by anomalies of the ear, hemifacial microsomia, epibulbar dermoids and vertebral anomalies. The aetiology is largely unknown, and the epidemiological data are limited and inconsistent. We present the largest population-based epidemiological study to date, using data provided by the large network of congenital anomalies registries in Europe. The study population included infants diagnosed with oculo-auriculo-vertebral spectrum during the 1990-2009 period from 34 registries active in 16 European countries. Of the 355 infants diagnosed with oculo-auriculo-vertebral spectrum, 95.8% (340/355) were live born, 0.8% (3/355) were fetal deaths, 3.4% (12/355) were terminations of pregnancy for fetal anomaly and 1.5% (5/340) were neonatal deaths. In 18.9% of cases, anomalies associated with oculo-auriculo-vertebral spectrum were detected prenatally; 69.7% were diagnosed at birth, 3.9% in the first week of life and 6.1% within the first year of life. Microtia (88.8%), hemifacial microsomia (49.0%) and ear tags (44.4%) were the most frequent anomalies, followed by atresia/stenosis of the external auditory canal (25.1%) and diverse vertebral (24.3%) and eye (24.3%) anomalies. There was a high rate (69.5%) of associated anomalies of other organs/systems, the most common being congenital heart defects, present in 27.8% of patients. The prevalence of oculo-auriculo-vertebral spectrum, defined as microtia/ear anomalies and at least one major characteristic anomaly, was 3.8 per 100,000 births. Twinning, assisted reproductive techniques and maternal pre-pregnancy diabetes were confirmed as risk factors. The high rate of diverse associated anomalies points to the need for early ultrasound screening in all infants born with this disorder.

Relevance: 30.00%

Abstract:

Diffusion MRI has evolved into an important clinical diagnostic and research tool. Though clinical routine mainly uses diffusion-weighted and tensor imaging approaches, Q-ball imaging and diffusion spectrum imaging techniques have become more widely available. They are frequently used in research-oriented investigations, in particular those aiming at measuring brain network connectivity. In this work, we aim at assessing the dependency of connectivity measurements on various diffusion encoding schemes in combination with appropriate data modeling. We process and compare the structural connection matrices computed from several diffusion encoding schemes, including diffusion tensor imaging, q-ball imaging and high angular resolution schemes such as diffusion spectrum imaging, with a publicly available processing pipeline for data reconstruction, tracking and visualization of diffusion MR imaging. The results indicate that the high angular resolution schemes maximize the number of obtained connections when identical processing strategies are applied to the different diffusion schemes. Compared to conventional diffusion tensor imaging, the added connectivity is mainly found for pathways in the 50-100 mm range, corresponding to neighboring association fibers and long-range associative, striatal and commissural fiber pathways. The analysis of the major associative fiber tracts of the brain reveals striking differences between the applied diffusion schemes. More complex data modeling techniques (beyond the tensor model) are recommended 1) if the tracts of interest run through large fiber crossings such as the centrum semi-ovale, or 2) if non-dominant fiber populations, e.g. neighboring association fibers, are the subject of investigation. An important finding of the study is that, since the ground-truth sensitivity and specificity are not known, results arising from different data reconstruction and/or tracking strategies remain difficult to compare.
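
The comparison described above can be pictured with a small sketch; the matrices, parcellation size and pathway lengths below are made up and only illustrate how connections added by one scheme can be attributed to a given length range when two schemes are processed identically.

```python
import numpy as np

rng = np.random.default_rng(2)
n_regions = 83  # size of a cortical/subcortical parcellation (assumption)

# Hypothetical binary structural connection matrices from two diffusion schemes;
# the high-angular-resolution scheme recovers a few extra connections.
conn_dti = rng.random((n_regions, n_regions)) < 0.15
conn_dsi = conn_dti | (rng.random((n_regions, n_regions)) < 0.05)
conn_dti = np.triu(conn_dti, 1)  # keep each region pair once
conn_dsi = np.triu(conn_dsi, 1)

# Hypothetical mean pathway length (mm) for every region pair.
length_mm = rng.uniform(10, 160, size=(n_regions, n_regions))

added = conn_dsi & ~conn_dti                    # connections found only by DSI
in_range = (length_mm >= 50) & (length_mm <= 100)

print("connections (tensor-based):", int(conn_dti.sum()))
print("connections (high angular resolution):", int(conn_dsi.sum()))
print("added connections in the 50-100 mm range:", int((added & in_range).sum()))
```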

Relevance: 30.00%

Abstract:

BACKGROUND: Previous studies on childhood cancer and nuclear power plants (NPPs) produced conflicting results. We used a cohort approach to examine whether residence near NPPs was associated with leukaemia or any childhood cancer in Switzerland. METHODS: We computed person-years at risk for children aged 0-15 years born in Switzerland from 1985 to 2009, based on the Swiss censuses 1990 and 2000, and identified cancer cases from the Swiss Childhood Cancer Registry. We geo-coded place of residence at birth and calculated incidence rate ratios (IRRs) with 95% confidence intervals (CIs) comparing the risk of cancer in children born <5 km, 5-10 km and 10-15 km from the nearest NPP with children born >15 km away, using Poisson regression models. RESULTS: We included 2925 children diagnosed with cancer during 21 117 524 person-years of follow-up; 953 (32.6%) had leukaemia. Eight and 12 children diagnosed with leukaemia at ages 0-4 and 0-15 years, and 18 and 31 children diagnosed with any cancer, respectively, were born <5 km from an NPP. Compared with children born >15 km away, the IRRs (95% CI) for leukaemia in 0-4 and 0-15 year olds were 1.20 (0.60-2.41) and 1.05 (0.60-1.86), respectively. For any cancer, corresponding IRRs were 0.97 (0.61-1.54) and 0.89 (0.63-1.27). There was no evidence of a dose-response relationship with distance (P > 0.30). Results were similar for residence at diagnosis and at birth, and when adjusted for potential confounders. Results from sensitivity analyses were consistent with the main results. CONCLUSIONS: This nationwide cohort study found little evidence of an association between residence near NPPs and the risk of leukaemia or any childhood cancer.
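
The analysis described is a Poisson regression of case counts on distance bands with person-years as an offset; the following sketch shows that computation with statsmodels on entirely hypothetical counts and person-years (the numbers are not the study's data).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical aggregated data: leukaemia cases and person-years at risk
# by distance of residence at birth from the nearest NPP; >15 km is the
# reference category, as in the study design.
df = pd.DataFrame({
    "band":         ["<5 km", "5-10 km", "10-15 km", ">15 km"],
    "cases":        [12, 34, 70, 837],
    "person_years": [260_000, 760_000, 1_600_000, 18_500_000],
})

# Indicator variables for the exposed bands; the omitted >15 km band
# is absorbed by the intercept.
X = pd.get_dummies(df["band"])[["<5 km", "5-10 km", "10-15 km"]].astype(float)
X = sm.add_constant(X)

model = sm.GLM(df["cases"], X,
               family=sm.families.Poisson(),
               offset=np.log(df["person_years"])).fit()

irr = np.exp(model.params)       # incidence rate ratios vs. >15 km
ci = np.exp(model.conf_int())    # 95% confidence intervals
print(pd.concat([irr.rename("IRR"), ci], axis=1))
```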

Relevance: 30.00%

Abstract:

PURPOSE OF REVIEW: Recent findings in the physiology and neurobiology of ejaculation have expanded our understanding of male sexual function and have allowed the development of new instruments to investigate ejaculatory and orgasmic disorders. RECENT FINDINGS: The evidence-based definition of lifelong premature ejaculation has set a model in the evaluation and treatment outcome of sexual dysfunction. New instruments to objectively assess arousal, orgasm and the expulsion phase of ejaculation, such as functional MRI, dynamic pelvic ultrasound, PET scans and validated questionnaires, have led to a better understanding of sexual dysfunction in men. Animal models, developments in neurobiology and clinical experience have transformed a purely psychoanalytical approach to ejaculatory and orgasmic function into a novel multidisciplinary, scientifically sound and evidence-based discipline of medicine. SUMMARY: Ejaculation is an integral part of normal sexual function. Ejaculatory dysfunction is common and may cause substantial disruption to the quality of a patient's life. A better understanding of the epidemiology, pathophysiology, neuroscience and genetics of ejaculatory and orgasmic function will eventually lead to the development of new, effective methods of treatment of disorders of ejaculation and orgasm in men.

Relevance: 30.00%

Abstract:

PURPOSE: To describe the anatomical characteristics and patterns of neurovascular compression in patients suffering from classic trigeminal neuralgia (CTN), using high-resolution magnetic resonance imaging (MRI). MATERIALS AND METHODS: The anatomy of the trigeminal nerve, the brain stem and the vascular structures related to this nerve was analysed in 100 consecutive patients treated with Gamma Knife radiosurgery for CTN between December 1999 and September 2004. MRI studies (T1, T1 enhanced and T2-SPIR) with simultaneous axial, coronal and sagittal visualization were dynamically assessed using the GammaPlan software. Three-dimensional reconstructions were also developed in some representative cases. RESULTS: In 93 patients (93%), one or several vascular structures were either in contact with the trigeminal nerve or close to its origin in the pons. The superior cerebellar artery was involved in 71 cases (76%). Other vessels identified were the antero-inferior cerebellar artery, the basilar artery, the vertebral artery, and some venous structures. Vascular compression was found anywhere along the trigeminal nerve. The mean distance between the nerve compression and the origin of the nerve in the brainstem was 3.76 ± 2.9 mm (range 0-9.8 mm). In 39 patients (42%), the vascular compression was located proximally and in 42 (45%) the compression was located distally. Nerve dislocation or distortion by the vessel was observed in 30 cases (32%). CONCLUSIONS: The findings of this study are similar to those reported in surgical and autopsy series. This non-invasive MRI-based approach could be useful for diagnostic and therapeutic decisions in CTN, and it could help to understand its pathogenesis.

Relevance: 30.00%

Abstract:

Background: In order to provide a cost-effective tool to analyse pharmacogenetic markers in malaria treatment, DNA microarray technology was compared with sequencing of polymerase chain reaction (PCR) fragments to detect single nucleotide polymorphisms (SNPs) in a larger number of samples. Methods: The microarray was developed to affordably generate SNP data for genes encoding the human cytochrome P450 enzyme family (CYP) and N-acetyltransferase-2 (NAT2), which are involved in antimalarial drug metabolism and have known polymorphisms, i.e. CYP2A6, CYP2B6, CYP2C8, CYP2C9, CYP2C19, CYP2D6, CYP3A4, CYP3A5, and NAT2. Results: For some SNPs, i.e. CYP2A6*2, CYP2B6*5, CYP2C8*3, CYP2C9*3/*5, CYP2C19*3, CYP2D6*4 and NAT2*6/*7/*14, agreement between the two techniques ranged from substantial to almost perfect (kappa index between 0.61 and 1.00), whilst for other SNPs, e.g. CYP2D6*17 (2850C>T), CYP3A4*1B and CYP3A5*3, a large variability from slight to substantial agreement (kappa index between 0.39 and 1.00) was found. Conclusion: The major limitation of the microarray technology for this purpose was a lack of robustness, with a large number of missing data points or incorrect specificity.
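
The kappa index quoted above is Cohen's kappa, i.e. chance-corrected agreement between the two genotyping methods; a minimal sketch, using invented genotype calls for a single SNP, is shown below.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical genotype calls for one SNP in 40 samples, coded as
# wild-type (0), heterozygous (1) or homozygous variant (2).
calls_sequencing = [0] * 25 + [1] * 10 + [2] * 5
calls_microarray = [0] * 24 + [1] * 1 + [1] * 9 + [0] * 1 + [2] * 5  # 2 discordant calls

kappa = cohen_kappa_score(calls_sequencing, calls_microarray)
print(f"kappa = {kappa:.2f}")   # ~0.91, 'almost perfect' on the usual Landis-Koch scale
```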

Relevance: 30.00%

Abstract:

Automated genome sequencing and annotation, as well as large-scale gene expression measurement methods, generate a massive amount of data for model organisms such as human and mouse. Searching for gene-specific or organism-specific information throughout all the different databases has become a very difficult task, and often results in fragmented and unrelated answers. A database that federates and integrates genomic and transcriptomic data can greatly improve search speed as well as the quality of the results by allowing a direct comparison of expression results obtained with different techniques. The main goal of this project, called the CleanEx database, is thus to provide access to public gene expression data via unique gene names and to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-dataset comparisons.
A consistent and up-to-date gene nomenclature is achieved by associating each single gene expression experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of genes from model organisms, such as human and mouse. The completely automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resources, such as cDNA clones or Affymetrix probe sets. The Affymetrix mapping files are accessible as text files, for further use in external applications, and as individual entries via the web-based interfaces. The CleanEx web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria, as well as cross-dataset analysis tools and cross-chip gene comparison. These tools have proven to be very efficient in expression data comparison and even, to a certain extent, in the detection of differentially expressed splice variants. The CleanEx flat files and tools are available online at http://www.cleanex.isb-sib.ch/.
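
The target-to-gene mapping at the heart of CleanEx can be pictured with a small sketch; the identifiers, dataset names and catalogue below are invented and only illustrate the idea of permanent target identifiers being re-mapped to a current gene catalogue each time the gene index is rebuilt.

```python
from collections import defaultdict

# Hypothetical expression targets: each experiment is tied to a permanent
# identifier describing the interrogated RNA population or reagent.
targets = {
    "AFFY:1007_s_at":   {"dataset": "GEO:GSE0001", "refseq": "NM_000001"},
    "CLONE:IMAGE:1234": {"dataset": "SAGE:lib42",  "refseq": "NM_000001"},
    "AFFY:1053_at":     {"dataset": "GEO:GSE0002", "refseq": "NM_000002"},
}

# Snapshot of the current gene catalogue (would be refreshed from external
# resources such as RefSeq/UniGene at each rebuild); identifiers are made up.
refseq_to_gene = {"NM_000001": "GENE_A", "NM_000002": "GENE_B"}

# Weekly build of the gene index: official gene name -> expression datasets.
gene_index = defaultdict(set)
for target_id, info in targets.items():
    gene = refseq_to_gene.get(info["refseq"])
    if gene is not None:              # targets that no longer map are set aside
        gene_index[gene].add(info["dataset"])

print(dict(gene_index))
# e.g. {'GENE_A': {'GEO:GSE0001', 'SAGE:lib42'}, 'GENE_B': {'GEO:GSE0002'}}
```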

Relevance: 30.00%

Abstract:

Genetics is the study of heredity, which means the study of genes and of the factors related to all aspects of genes. The scientific history of genetics began with the work of Gregor Mendel in the mid-19th century. Prior to Mendel, genetics was primarily theoretical; after Mendel, the science of genetics broadened to include experimental genetics. Developments in all fields of genetics and genetic technology in the first half of the 20th century provided a basis for the later developments. In the second half of the 20th century, the molecular background of genetics became better understood. Rapid technological advancements, followed by the completion of the Human Genome Project, have contributed a great deal to the knowledge of genetic factors and their impact on human life and diseases. Currently, more than 1800 disease genes have been identified, more than 2000 genetic tests have become available, and in conjunction with this at least 350 biotechnology-based products have been released onto the market. Novel technologies, particularly next-generation sequencing, have dramatically accelerated the pace of biological research, while at the same time increasing expectations. In this paper, a brief summary of genetic history with short explanations of the most popular genetic techniques is given.

Relevance: 30.00%

Abstract:

Coronary artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on coronary artery disease in patients. The major challenge of cardiac MRI, though, is dealing with all the sources of motion that can corrupt the images and affect the diagnostic information they provide. This thesis therefore focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and tissue information. The proposed approaches help advance coronary magnetic resonance angiography (MRA) towards an easy-to-use, multipurpose tool that can be translated to the clinical environment. The first part of the thesis focused on the study of coronary artery motion through gold-standard imaging techniques (x-ray angiography) in patients, in order to measure the precision with which the coronary arteries assume the same position beat after beat (coronary artery repositioning). We learned that intervals with minimal coronary artery repositioning occur at peak systole and in mid-diastole, and we responded with a new pulse sequence (T2-post) that is able to provide peak-systolic imaging. This sequence was tested in healthy volunteers and, from the image quality comparison, we learned that the proposed approach provides coronary artery visualization and contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR).
The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI. The proposed technique acquires the data continuously (free-running) instead of being triggered, thus increasing the efficiency of the acquisition and providing four-dimensional images of the whole heart, while respiratory self-navigation allows the scan to be performed in free breathing. This enabling technology allows for anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle based 3D radial trajectory, which allows for continuous sampling of k-space and retrospective selection of the timing parameters of the reconstructed dataset. The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset, while at the same time improving overall image quality by removing undersampling artifacts. The resulting 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow for the assessment of cardiac function within a single free-breathing scan. The quality of the coronary arteries in the frames of the free-running 4D acquisition is in line with that obtained with the standard ECG-triggered acquisition, and the cardiac function evaluation matched that measured with gold-standard stacks of 2D cine images.
Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging allows detection of coronary artery plaque calcification ex vivo, since it is able to detect the short-T2 components of the calcification. The motion of the heart, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated 3D radial triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are acquired simultaneously to extract the short-T2 components of the calcification, while a water-fat separation technique allows for proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence shows great potential for the in vivo visualization of coronary artery calcification. In conclusion, this thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease. These approaches provide new anatomical and functional information in four dimensions, and support tissue characterization of coronary artery plaques.
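
The golden-angle based 3D radial trajectory mentioned above can be sketched as follows; this is a generic spiral-phyllotaxis-style construction assuming hemispheric k-space coverage, not necessarily the exact scheme implemented in the thesis.

```python
import numpy as np

def golden_angle_radial_directions(n_readouts):
    """3D radial readout directions laid out along a golden-angle spiral.

    Successive azimuths are separated by the golden angle (~137.51 deg) and the
    polar angle grows so that the spoke tips cover one hemisphere of k-space
    roughly uniformly.
    """
    golden_angle = np.pi * (3.0 - np.sqrt(5.0))        # ~2.39996 rad
    i = np.arange(n_readouts)
    polar = 0.5 * np.pi * np.sqrt(i / n_readouts)      # 0 (pole) -> pi/2 (equator)
    azimuth = i * golden_angle
    return np.stack([np.sin(polar) * np.cos(azimuth),
                     np.sin(polar) * np.sin(azimuth),
                     np.cos(polar)], axis=1)

dirs = golden_angle_radial_directions(5000)
# Any contiguous chunk of readouts covers k-space fairly evenly, which is what
# allows the timing of the reconstructed 4D frames to be chosen retrospectively.
print(dirs.shape, np.allclose(np.linalg.norm(dirs, axis=1), 1.0))
```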

Relevance: 30.00%

Abstract:

Nearly half of patients with acute lower-limb deep vein thrombosis (DVT) develop a post-thrombotic syndrome (PTS). This risk is particularly high in case of proximal DVT involving the common femoral and iliac veins, the major venous outflow vessels of the lower limbs. Several studies have demonstrated that the incidence of PTS can be reduced with early vein recanalisation. Currently, catheter-based recanalisation therapies can be offered to selected patients with acute ilio-femoral deep vein thrombosis. The aim of the present article is to summarize current knowledge on these catheter-based recanalisation therapies.

Relevance: 30.00%

Abstract:

Characterizing geological features and structures in three dimensions over inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Detailed 3D data such as LiDAR point clouds allow accurate study of hazard processes and of the structure of geological features, in particular in vertical and overhanging rock slopes. 3D geological models therefore have great potential to be applied to a wide range of geological investigations, both in research and in applied geology projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral/hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies as well as of failure mechanisms and stability conditions by integrating detailed remotely sensed data. During the past ten years several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain residential settlements and roads exposed to high risk. It is necessary to understand the main factors that destabilize rocky outcrops even when inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to increase the possibilities of forecasting potential future landslides, it is crucial to understand the evolution of rock slope stability. Defining the areas theoretically most prone to rockfalls can be particularly useful to simulate trajectory profiles and to generate hazard maps, which are the basis for land use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. I then developed a model to compute failure mechanisms on terrestrial point clouds in order to assess the susceptibility to rockfalls at the cliff scale. Similar procedures were already available to evaluate rockfall susceptibility based on aerial digital elevation models; the new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The most probable rockfall source areas computed for the granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared with inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because it has particularly strong rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its significant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs, which are difficult to study with classical methods.
Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling. In particular, I conducted a back-analysis of the large rockfall event of 2005 (265,000 m3) by integrating field observations of joint conditions, the characteristics of the fracturing pattern and the results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique to obtain vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for preferential rockwall retreat. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of unstable rock compartments. Finally, the following points summarize the main outputs of my research: the new model to compute failure mechanisms and rockfall susceptibility from 3D point clouds allows the most probable rockfall source areas to be defined accurately at the cliff scale; the analysis of the rock bridges at the Dru shows the potential of integrating detailed fracture measurements into geomechanical models of rock mass stability; the correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then use this information to model complex geological structures; and the integration of these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
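
As a toy illustration of a susceptibility test applied to orientations extracted from a point cloud, the sketch below implements a simplified Markland-type planar-sliding check; the joint orientations, friction angle and tolerance are invented, and planar sliding is only one of several failure mechanisms such a model would need to handle.

```python
import numpy as np

def planar_sliding_possible(slope_dip, slope_dip_dir, joint_dip, joint_dip_dir,
                            friction_angle=30.0, dir_tolerance=20.0):
    """Simplified Markland test for planar sliding (all angles in degrees).

    A joint set is kinematically free to slide when it dips less steeply than
    the slope face (daylights), more steeply than the friction angle, and its
    dip direction lies within a tolerance of the slope dip direction.
    """
    daylights = joint_dip < slope_dip
    steeper_than_friction = joint_dip > friction_angle
    dir_diff = np.abs(((joint_dip_dir - slope_dip_dir) + 180.0) % 360.0 - 180.0)
    aligned = dir_diff <= dir_tolerance
    return daylights & steeper_than_friction & aligned

# Hypothetical joint sets fitted to a LiDAR point cloud of a steep cliff face.
joint_dip     = np.array([75.0, 55.0, 40.0, 20.0])
joint_dip_dir = np.array([150.0, 145.0, 300.0, 140.0])

print(planar_sliding_possible(slope_dip=80.0, slope_dip_dir=145.0,
                              joint_dip=joint_dip, joint_dip_dir=joint_dip_dir))
# [ True  True False False]
```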

Relevance: 30.00%

Abstract:

Connectivity analysis on whole-brain diffusion MRI data suffers from distortions caused by the standard echo-planar imaging acquisition strategies. These images show characteristic geometrical deformations and signal destruction that are an important drawback limiting the success of tractography algorithms. Several retrospective correction techniques are readily available. In this work, we use a digital phantom designed for the evaluation of connectivity pipelines. We subject the phantom to a "theoretically correct" and plausible deformation that resembles the artifact under investigation. We then correct the data back with three standard methodologies (namely fieldmap-based, reversed encoding-based, and registration-based). Finally, we rank the methods based on their geometrical accuracy, the dropout compensation, and their impact on the resulting connectivity matrices.
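
One way to quantify the impact on the resulting connectivity matrices is to compare each corrected matrix with the undistorted ground truth; the sketch below does this with invented matrices, using edge-weight correlation and a norm of the difference (the actual ranking criteria of the study may differ).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40  # number of parcels in the digital phantom (assumption)

def upper(mat):
    """Vectorize the upper triangle (connectivity matrices are symmetric)."""
    return mat[np.triu_indices_from(mat, k=1)]

# Hypothetical ground-truth connectivity and three corrected estimates
# (field-map, reversed-encoding and registration-based corrections).
truth = rng.random((n, n)) * (rng.random((n, n)) < 0.2)
truth = np.triu(truth, 1) + np.triu(truth, 1).T
estimates = {name: truth + rng.normal(scale=s, size=(n, n))
             for name, s in [("fieldmap", 0.02),
                             ("reversed", 0.03),
                             ("registration", 0.08)]}

for name, est in estimates.items():
    est = (est + est.T) / 2.0                            # re-symmetrize the noisy estimate
    r = np.corrcoef(upper(truth), upper(est))[0, 1]      # edge-weight agreement
    frob = np.linalg.norm(upper(truth) - upper(est))     # overall deviation
    print(f"{name:12s}  r = {r:.3f}   ||diff|| = {frob:.2f}")
```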