Abstract:
Arbuscular mycorrhizal fungi (AMF) are important symbionts of plants that improve plant nutrient acquisition and promote plant diversity. Although within-species genetic differences among AMF have been shown to differentially affect plant growth, very little is actually known about the degree of genetic diversity in AMF populations. This is largely because of difficulties in isolation and cultivation of the fungi in a clean system allowing reliable genotyping to be performed. A population of the arbuscular mycorrhizal fungus Glomus intraradices growing in an in vitro cultivation system was studied using newly developed simple sequence repeat (SSR), nuclear gene intron and mitochondrial ribosomal gene intron markers. The markers revealed a strong differentiation at the nuclear and mitochondrial level among isolates. Genotypes were nonrandomly distributed among four plots showing genetic subdivisions in the field. Meanwhile, identical genotypes were found in geographically distant locations. AMF genotypes showed significant preferences to different host plant species (Glycine max, Helianthus annuus and Allium porrum) used before the fungal in vitro culture establishment. Host plants in a field could provide a heterogeneous environment favouring certain genotypes. Such preferences may partly explain within-population patterns of genetic diversity.
Abstract:
Despite the EU's efforts to promote democracy and a shared commitment to democracy and human rights in the EMP, there are no signs of convergence towards the liberal democratic model advocated by the EU. Nevertheless, the scope and intensity of multilateral, transnational and bilateral cooperation have steadily increased across the region since the mid-1990s. Cooperation in the field of democracy promotion is characterized by strong dynamics of sectoral regulation and geographic differentiation, but is clearly situated within a regional and highly standardized framework. While political or policy convergence seems unlikely in the short or medium term, democracy and human rights are firmly established on a common regional agenda.
Abstract:
Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies as well as the impact of a variety of key and common assumptions remain unclear. Using a Bayesian Markov-chain-Monte-Carlo stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because of the higher stressing of the hydrological system. Considering greater amounts of time-lapse data in the inversion procedure is also found to help refine VGM parameter estimates. 
Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that posterior uncertainties are being influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
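The Bayesian Markov-chain-Monte-Carlo inversion mentioned above can be illustrated with a minimal Metropolis sampler. The sketch below is not the authors' implementation: the linear forward model, the single parameter `theta`, and all numerical values are invented stand-ins for the GPR traveltime simulation and the van Genuchten-Mualem parameters.

```python
import math
import random

random.seed(0)

# Toy forward model: predicted traveltime grows linearly with the
# hydraulic parameter theta (a stand-in for a VGM parameter).
def forward(theta, depth):
    return depth * (1.0 + theta)

# Synthetic "observed" traveltimes generated with theta_true = 0.3
theta_true, noise_sd = 0.3, 0.05
depths = [1.0, 2.0, 3.0, 4.0, 5.0]
observed = [forward(theta_true, z) + random.gauss(0, noise_sd) for z in depths]

def log_post(theta):
    # Flat prior on [0, 1]; Gaussian likelihood on the traveltime misfit.
    if not 0.0 <= theta <= 1.0:
        return float("-inf")
    return -0.5 * sum(((o - forward(theta, z)) / noise_sd) ** 2
                      for o, z in zip(observed, depths))

# Metropolis random walk over the posterior
theta, chain = 0.5, []
lp = log_post(theta)
for step in range(20000):
    prop = theta + random.gauss(0, 0.02)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if step >= 5000:          # discard burn-in samples
        chain.append(theta)

mean = sum(chain) / len(chain)
sd = (sum((t - mean) ** 2 for t in chain) / len(chain)) ** 0.5
print(f"posterior mean = {mean:.3f} +/- {sd:.3f}")
```

In a real study, multiple chains and convergence diagnostics would be essential; here a single chain suffices to show how the posterior concentrates around the generating value as time-lapse data accumulate.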
Abstract:
Non-alcoholic fatty liver disease (NAFLD) is an emerging health concern in both the developed and developing world, ranging from simple steatosis to non-alcoholic steatohepatitis (NASH), cirrhosis and liver cancer. The incidence and prevalence of this disease are increasing due to socioeconomic transition and the shift to harmful diets. Currently, the gold standard method for NAFLD diagnosis is liver biopsy, despite its complications and lack of accuracy due to sampling error. Furthermore, the pathogenesis of NAFLD is not fully understood, but it is well known that obesity, diabetes and metabolic derangements play a major role in disease development and progression. In addition, the gut microbiome and the host's genetic and epigenetic background could explain considerable interindividual variability. The recognition that epigenetics, i.e. heritable events not caused by changes in DNA sequence, contributes to the development of disease has been a revolution in recent years. Evidence is now accumulating that epigenetics plays an important role in NAFLD pathogenesis and in the genesis of NASH. Histone modifications, changes in DNA methylation and aberrant microRNA profiles could drive the development of NAFLD and its transition into clinically relevant disease. The PNPLA3 GG genotype has been associated with more progressive disease, and epigenetics could modulate this effect. The impact of epigenetics on NAFLD progression may inform future therapeutic targets as well as non-invasive methods for the diagnosis and staging of NAFLD.
Abstract:
Economic evaluation of health care interventions has experienced strong growth over the past decade and is increasingly used as a support tool in decision making on the public funding and pricing of health services in European countries. A necessary condition for its use is that the agents performing economic evaluations share minimum agreed rules on methodological aspects. Although there are methodological issues on which there is a high degree of consensus, there are others on which no such agreement exists, either because they lie closer to the normative field or because they have undergone significant methodological advances in recent years. In this first article of a series of three, we discuss the perspective of analysis and the assessment of costs in the economic evaluation of health interventions, using the Metaplan technique. Finally, lines of research are proposed to overcome the discrepancies identified.
Abstract:
It is increasingly evident that cancer results from altered organ homeostasis rather than from deregulated control of single cells or groups of cells. This applies especially to epithelial cancer, the most common form of human solid tumors and a major cause of cancer lethality. In the vast majority of cases, in situ epithelial cancer lesions do not progress into malignancy, even if they harbor many of the genetic changes found in invasive and metastatic tumors. While changes in tumor stroma are frequently viewed as secondary to changes in the epithelium, recent evidence indicates that they can play a primary role in both cancer progression and initiation. These processes may explain the phenomenon of field cancerization, i.e., the occurrence of multifocal and recurrent epithelial tumors that are preceded by and associated with widespread changes of surrounding tissue or organ "fields."
Abstract:
In this paper, we propose a new paradigm to carry out the registration task with a dense deformation field derived from the optical flow model and the active contour method. The proposed framework merges different tasks such as segmentation, regularization, incorporation of prior knowledge and registration into a single framework. The active contour model is at the core of our framework, even if it is used in a different way than in the standard approaches. Indeed, active contours are a well-known technique for image segmentation. This technique consists in finding the curve that minimizes an energy functional designed to be minimal when the curve has reached the object contours. That way, we get accurate and smooth segmentation results. So far, the active contour model has been used to segment objects lying in images from boundary-based, region-based or shape-based information. Our registration technique profits from all these families of active contours to determine a dense deformation field defined on the whole image. A well-suited application of our model is atlas registration in medical imaging, which consists in automatically delineating anatomical structures. We present results on 2D synthetic images to show the performance of our nonrigid deformation field based on a natural registration term. We also present registration results on real 3D medical data with a large space-occupying tumor substantially deforming surrounding structures, which constitutes a highly challenging problem.
Abstract:
Background: Glutathione (GSH), a major cellular redox regulator and antioxidant, is decreased in the cerebrospinal fluid and prefrontal cortex of schizophrenia patients. The gene of the key GSH-synthesizing enzyme, the glutamate-cysteine ligase modifier (GCLM) subunit, is associated with schizophrenia, suggesting that the deficit in the GSH system is of genetic origin. Using the GCLM knock-out (KO) mouse as a model system, with 60% decreased brain GSH levels and thus strong vulnerability to oxidative stress, we have shown that GSH dysregulation results in abnormal mouse brain morphology (e.g., reduced parvalbumin, PV, immunoreactivity in frontal areas) and function. Additional oxidative stress, induced by GBR12909 (a dopamine re-uptake inhibitor), enhances morphological changes even further. Aim: In the present study we use the GCLM KO mouse model system to ask whether GSH dysregulation also compromises mouse behaviour and cognition. Methods: Male and female wildtype (WT) and GCLM-KO mice were treated with GBR12909 or phosphate-buffered saline (PBS) from postnatal day (P) 5 to 10, and were behaviourally tested at P60 and older. Results: In comparison to WT, KO animals of both sexes are hyperactive in the open field, display more frequent open-arm entries on the elevated plus maze, longer float latencies in the Porsolt swim test, and more frequent contacts with novel and familiar objects. Contrary to other reports of animal models with reduced PV immunoreactivity, GCLM-KO mice display normal rule-learning capacity and perform normally on a spatial recognition task. GCLM-KO mice do, however, show a strong deficit in object recognition after a 15-minute retention delay. GBR12909 treatment exerts no additional effect. Conclusions: The results suggest that animals with impaired regulation of brain oxidative stress are impulsive and have reduced behavioural control in novel, unpredictable contexts.
Moreover, GSH dysregulation seems to induce a selective attentional or stimulus-encoding deficit: despite intensive object exploration, GCLM-KO mice cannot discriminate between novel and familiar objects. In conclusion, the present data indicate that GSH dysregulation may contribute to the manifestation of behavioural and cognitive anomalies that are associated with schizophrenia.
Abstract:
The present thesis is about the cognitions of left-wing activists and the role they play in understanding contentious participation. It compares activists of three post-industrial social movement organizations in Switzerland, i.e. Solidarity across Borders, defending migrants' rights, the Society for Threatened Peoples, promoting collective human rights, and Greenpeace, protecting the environment. It makes use of an innovative mixed methods design combining survey and interview data. The main theoretical contribution is to conceptualize an analytical tool for grasping the cognitive map of these activists by putting forward the concept of the strong citizen, which sums up their relation to society and politics. The relation to society consists of an extensive relation to others and an interconnected vision of society. Consequently, their primary concerns include the handling of common goods and the equal treatment of individuals with regard to common goods. The relation to politics incorporates a critical and vigilant citizen: they are critical towards political authorities and they value political action by organized groups of civil society. The thesis argues that only by holding such worldviews are activists able to construct an injustice, agency and identity frame for the claims of their organizations. Thus, the present work delivers a parsimonious answer to the question of where an injustice, agency and identity frame comes from. It does so through a systematic analysis of four specific arguments. First, it empirically demonstrates that these activists have, at the aggregate level, specific cognitive resources compared to the general population. Second, it describes the content of this specific cognitive outlook by evaluating the appropriateness of the strong citizen concept. Third, it looks at variations between activist communities and shows that activists of more challenging protest issues are stronger citizens than activists of more mainstream protests.
Finally, cognitions are not the only part of the story when it comes to contentious participation. Other factors, i.e. social networks and biographical availability, matter too. Therefore, I test whether cognitions help explain differences between activists' communities once these other factors are controlled for. In sum, this thesis is a first step towards demonstrating why one should be concerned with activists' cognitions. - This thesis examines the cognitions of left-wing activists and the role they play in the phenomenon of contentious participation. Activists from three post-industrial organizations in Switzerland are compared, namely Solidarité sans Frontières, which defends migrants' rights, the Société des Peuples menacés, which promotes the rights of minority collectivities, and Greenpeace, which works for the protection of the environment. This research uses a mixed methods design, innovatively combining survey and interview data. My main theoretical contribution lies in the conceptualization of an analytical tool that makes it possible to grasp the "cognitive map" of activists, through the concept of the "strong citizen", which refers to the specific relation that certain individuals maintain with society and politics. These individuals are characterized by an inclusive and interconnected vision of society, as well as by a political conception of the citizen as critical and vigilant. My main argument is that only individuals possessing this particular type of cognitions are capable of constructing an injustice, "agency" and identity frame. This thesis therefore provides some answers to the question of the origin of these cognitive frames, which are crucial for participation. To this end, four specific aspects are analyzed systematically.
First, I demonstrate empirically, at the aggregate level, that these activists indeed possess specific cognitive resources in comparison with the general population. Second, I analyze the content of these cognitions, which allows me in particular to evaluate the relevance and adequacy of the "strong citizen" concept. Third, turning to variations between activist communities, I show that those mobilized around highly contentious protest issues are, from a cognitive point of view, closer to the figure of the "strong citizen" than those mobilized around more consensual issues. Finally, other factors, namely social networks and biographical availability, are integrated into the analysis in order to measure the real explanatory power of cognitions in accounting for the differences observed between activist communities. Through these analyses, this thesis highlights the importance of the role of cognitions in the study of contentious participation.
Abstract:
Computed tomography (CT) is an imaging technique whose importance has grown steadily since its appearance in the early 1970s. In the medical field, its use has become so indispensable that this imaging modality could become a victim of its own success if its impact on population exposure is not given particular attention. Of course, the increase in the number of CT examinations has improved patient care and made certain procedures less invasive. However, to ensure that the risk-benefit trade-off always works in the patient's favor, it is necessary to avoid delivering doses that are of no diagnostic use.

While this is important for adults, it must be a priority when examining children, in particular when following pathologies that require several CT examinations over the patient's lifetime. Indeed, children and young adults are more radiosensitive. Moreover, since their life expectancy is longer than that of adults, they have an increased risk of developing a radiation-induced cancer, whose latency period can exceed twenty years. Assuming that every radiological examination is justified, it then becomes necessary to optimize acquisition protocols to ensure that the patient is not irradiated unnecessarily.
Technological progress in CT is very rapid; since 2009, new image reconstruction techniques, known as iterative reconstructions, have been introduced in order to reduce dose and improve image quality.

The aim of the present work is to determine the potential of statistical iterative reconstructions to minimize the doses delivered during CT examinations of children and young adults while preserving diagnostic image quality, in order to propose optimized protocols.

Optimizing a CT examination protocol requires the ability to evaluate the delivered dose and the image quality useful for diagnosis. While the dose is estimated by means of CT indices (CTDIvol and DLP), a particularity of this work is the use of two radically different approaches to evaluate image quality. The first, "physical" approach is based on the computation of physical metrics (SD, MTF, NPS, etc.) measured under well-defined conditions, most often on phantoms. Although this approach is limited in that it does not incorporate the radiologists' perception, it allows certain properties of an image to be characterized quickly and simply. The second, "clinical" approach is based on the evaluation of anatomical structures (diagnostic criteria) present in patient images. The radiologists involved in the evaluation step must rate the quality of the structures from a diagnostic point of view using a simple scoring scale. This approach, which is cumbersome to set up, has the advantage of being close to the radiologist's work and can be considered the reference method.

Among the main results of this work, it was shown that the statistical iterative algorithms studied clinically (ASIR, VEO) have a strong potential to reduce CT dose (up to -90%).
However, owing to the way they operate, these algorithms modify the appearance of the image by changing its texture, which could affect diagnostic quality. By comparing the results provided by the "clinical" and "physical" approaches, it was shown that this change in texture corresponds to a modification of the noise frequency spectrum, whose analysis makes it possible to anticipate or avoid a loss of diagnostic information. This work also shows that integrating these new reconstruction techniques into clinical practice cannot be done straightforwardly on the basis of protocols designed for conventional reconstructions. The conclusions of this work and the tools developed here can also guide future studies in the field of image quality, for example on texture analysis or model observers for CT.

Computed tomography (CT) is an imaging technique in which interest has been growing steadily since it first came into use in the early 1970s. In the clinical environment, this imaging system has emerged as a gold standard modality because of its high sensitivity in producing accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionizing radiation on the population. To ensure a benefit-risk balance that works in favor of the patient, it is important to balance image quality and dose in order to avoid unnecessary patient exposure.

If this balance is important for adults, it should be an absolute priority for children undergoing CT examinations, especially for patients suffering from diseases requiring several follow-up examinations over the patient's lifetime. Indeed, children and young adults are more sensitive to ionizing radiation and have an extended life span in comparison to adults.
For this population, the risk of developing a cancer whose latency period exceeds 20 years is significantly higher than for adults. Assuming that each patient examination is justified, it then becomes a priority to optimize CT acquisition protocols in order to minimize the dose delivered to the patient. Over the past few years, CT technology has been advancing at a rapid pace. Since 2009, new iterative image reconstruction techniques, called statistical iterative reconstructions, have been introduced in order to decrease patient exposure and improve image quality.

The goal of the present work was to determine the potential of statistical iterative reconstructions to reduce dose as much as possible without compromising the image quality needed for diagnosis in examinations of children and young adults.

The optimization step requires the evaluation of both the delivered dose and the image quality useful for diagnosis. While the dose is estimated using CT indices (CTDIvol and DLP), the particularity of this research was to use two radically different approaches to evaluate image quality. The first approach, called the "physical approach", computed physical metrics (SD, MTF, NPS, etc.) measured on phantoms under well-known conditions. Although this technique has some limitations because it does not take the radiologist's perspective into account, it enables the physical characterization of image properties in a simple and timely way. The second approach, called the "clinical approach", was based on the evaluation of anatomical structures (diagnostic criteria) present in patient images. Radiologists involved in the assessment step were asked to score the image quality of structures for diagnostic purposes using a simple rating scale. This approach is relatively complicated to implement and also time-consuming.
Nevertheless, it has the advantage of being very close to the practice of radiologists and is considered a reference method.

Primarily, this work revealed that the statistical iterative reconstructions studied clinically (ASIR and VEO) have a strong potential to reduce CT dose (up to -90%). However, by their mechanisms, they lead to a modification of the image appearance, with a change in image texture which may then affect the quality of the diagnosis. By comparing the results of the "clinical" and "physical" approaches, it was shown that a change in texture is related to a modification of the noise spectrum bandwidth, and that NPS analysis makes it possible to anticipate or avoid a decrease in image quality. This project demonstrated that integrating these new statistical iterative reconstruction techniques can be complex and cannot be done on the basis of protocols using conventional reconstructions. The conclusions of this work and the image quality tools developed will be able to guide future studies in the field of image quality, such as texture analysis or model observers dedicated to CT.
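The noise power spectrum (NPS) analysis referred to above can be sketched numerically. The fragment below is only an illustration of the principle under invented conditions: white Gaussian noise stands in for a conventional sharp reconstruction, a simple box average stands in for an iterative algorithm that shifts noise power toward low spatial frequencies, and the ROI size, pixel pitch and noise level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, px = 64, 0.5                       # ROI size (pixels), pixel pitch (mm)

def nps_2d(rois):
    # NPS(fx, fy) = (px^2 / (Nx * Ny)) * <|DFT(ROI - mean(ROI))|^2>,
    # ensemble-averaged over noise-only ROIs.
    spectra = [np.abs(np.fft.fft2(r - r.mean())) ** 2 for r in rois]
    return (px ** 2 / (n * n)) * np.mean(spectra, axis=0)

# White noise vs. the same noise after a 2x2 box average (circular,
# via np.roll, which is adequate for a spectral demonstration).
white = [rng.normal(0.0, 10.0, (n, n)) for _ in range(50)]
smooth = [0.25 * (w + np.roll(w, 1, 0) + np.roll(w, 1, 1)
                  + np.roll(np.roll(w, 1, 0), 1, 1)) for w in white]

def low_freq_fraction(nps):
    # Share of total noise power in the central (low-frequency) quarter.
    c = np.fft.fftshift(nps)
    q = n // 4
    return c[q:3 * q, q:3 * q].sum() / c.sum()

f_white = low_freq_fraction(nps_2d(white))
f_smooth = low_freq_fraction(nps_2d(smooth))
print(f"low-frequency power fraction: white {f_white:.2f}, smoothed {f_smooth:.2f}")
```

In practice the 2D NPS is radially averaged and compared across dose levels and reconstruction settings; a shift of the spectral peak toward low frequencies is the quantitative signature of the texture change that statistical iterative reconstructions introduce.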
Abstract:
PURPOSE: To compare the apparent diffusion coefficient (ADC) values of malignant liver lesions on diffusion-weighted MRI (DWI) before and after successful radiofrequency ablation (RF ablation). MATERIALS AND METHODS: Thirty-two patients with 43 malignant liver lesions (23 metastases, 20 hepatocellular carcinomas (HCC)) underwent liver MRI (3.0 T) before (<1 month) and after RF ablation (at 1, 3 and 6 months) using T2-weighted, gadolinium-enhanced T1-weighted and DWI MR sequences. Jointly, two radiologists prospectively measured ADCs for each lesion by means of two different regions of interest (ROIs), the first including the whole lesion and the second covering the area with the most visibly restricted diffusion (MRDA) on the ADC map. Changes in ADCs were evaluated with ANOVA and Dunnett tests. RESULTS: Thirty-one patients were successfully treated, while one patient was excluded due to focal recurrence. In metastases (n=22), the ADC in the whole lesion and in the MRDA showed an up-and-down evolution. In HCC (n=20), the evolution of ADC was more complex, but with significantly higher values (p=0.013) at 1 and 6 months after RF ablation. CONCLUSION: The ADC values of malignant liver lesions successfully treated by RF ablation show a predictable evolution and may help radiologists monitor tumor response after treatment.
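The two-ROI measurement described in the methods can be mimicked on a toy ADC map. This is a schematic sketch only: the 9x9 array, the ADC values and the "lowest quartile" rule used to locate the most restricted area are invented for illustration and do not reproduce the radiologists' manual ROI placement.

```python
import numpy as np

# Toy ADC map (units: 1e-3 mm^2/s) with a lesion whose core shows more
# restricted diffusion (lower ADC) than its rim.
adc_map = np.full((9, 9), 1.8)             # background liver parenchyma
lesion = np.zeros((9, 9), dtype=bool)
lesion[2:7, 2:7] = True                    # whole-lesion ROI mask
adc_map[lesion] = 1.2                      # lesion rim
adc_map[3:6, 3:6] = 0.8                    # restricted-diffusion core

# ROI 1: mean ADC over the whole lesion
whole_lesion_adc = adc_map[lesion].mean()

# ROI 2: area of most restricted diffusion, approximated here as the
# lowest-ADC quartile of lesion voxels (an illustrative criterion)
vals = np.sort(adc_map[lesion])
mrda_adc = vals[: max(1, len(vals) // 4)].mean()

print(f"whole-lesion ADC: {whole_lesion_adc:.2f}")
print(f"MRDA ADC:         {mrda_adc:.2f}")
```

Because the whole-lesion ROI averages rim and core together, it is less sensitive to focal post-ablation changes than the MRDA ROI, which is one rationale for reporting both measurements.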
Abstract:
Sackung is a widespread post-glacial morphological feature affecting Alpine mountains and creating a characteristic geomorphological expression that can be detected from topography. Over long-term evolution, internal deformation can lead to the formation of rapidly moving phenomena such as rockslides or rock avalanches. In this study, a detailed description of the Sierre rock avalanche (SW Switzerland) is presented. This convex-shaped postglacial instability is one of the largest rock avalanches in the Alps, involving more than 1.5 billion m³ with a run-out distance of about 14 km and an extremely low Fahrböschung angle. This study presents comprehensive analyses of the structural and geological characteristics leading to the development of the Sierre rock avalanche. In particular, by combining field observations, digital elevation model analyses and numerical modelling, the strong influence of both ductile and brittle tectonic structures on the failure mechanism and on the failure surface geometry is highlighted. The detection of pre-failure deformation indicates that the development of the rock avalanche corresponds to the last evolutionary stage of a pre-existing deep-seated gravitational slope instability. These analyses, accompanied by the dating and characterization of the rock avalanche deposits, allow the proposal of a destabilization model that clarifies the different phases leading to the development of the Sierre rock avalanche.
Abstract:
OBJECTIVE: Esophageal temperature is the gold standard for in-the-field temperature monitoring in hypothermic victims with cardiac arrest. For practical reasons, some mountain rescue teams use homemade esophageal thermometers to measure esophageal temperature; these consist of nonmedical indoor/outdoor temperature monitoring instruments that have been modified to allow esophageal insertion. We planned a study to determine the accuracy of such thermometers. METHODS: Two digital cabled indoor/outdoor thermometers of the same model were modified and tested against a reference thermometer. The thermometers were tested in a water bath at different temperatures between 10°C and 35.2°C. Three hundred measurements were taken with each thermometer. RESULTS: Our experimental study showed that both homemade thermometers provided a good correlation and clinically acceptable agreement with the reference thermometer. Measurements were within 0.5°C of the reference thermometer 97.5% of the time. CONCLUSIONS: The homemade thermometers performed well in vitro in comparison with a reference thermometer. However, because these devices in their original form are not designed for clinical use, their use should be restricted to situations in which the use of a conventional esophageal thermometer is impossible.
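The agreement figures reported here (bias, correlation, and the share of readings within 0.5°C) can be computed in a few lines. The sketch below uses six invented paired readings as stand-ins for the 300 water-bath measurements; only the summary statistics are the point.

```python
# Paired readings in degrees C: homemade thermometer vs. reference.
# These six pairs are invented examples, not the study's data.
homemade  = [10.1, 15.0, 19.8, 24.9, 30.2, 35.0]
reference = [10.0, 15.1, 20.0, 25.0, 30.0, 35.2]

n = len(homemade)
diffs = [h - r for h, r in zip(homemade, reference)]

# Mean difference (bias) between the two instruments
bias = sum(diffs) / n

# Fraction of readings within the 0.5 degree C clinical tolerance
within = sum(abs(d) <= 0.5 for d in diffs) / n

# Pearson correlation coefficient, computed by hand
mh, mr = sum(homemade) / n, sum(reference) / n
cov = sum((h - mh) * (r - mr) for h, r in zip(homemade, reference))
var_h = sum((h - mh) ** 2 for h in homemade)
var_r = sum((r - mr) ** 2 for r in reference)
r_pearson = cov / (var_h * var_r) ** 0.5

print(f"bias = {bias:+.2f} C, within 0.5 C: {within:.0%}, r = {r_pearson:.3f}")
```

A full analysis would add Bland-Altman limits of agreement (bias ± 1.96 standard deviations of the differences), the standard way to report clinical agreement between two measurement instruments.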