103 results for Gradient-based approaches
Abstract:
PURPOSE: All methods presented to date to map both conductivity and permittivity rely on multiple acquisitions to quantitatively compute the magnitude of the radiofrequency transmit field, B1+. In this work, we propose a method to compute both conductivity and permittivity based solely on relative receive coil sensitivities (B1-), which can be obtained in a single measurement without the need either to explicitly perform transmit/receive phase separation or to make assumptions regarding those phases. THEORY AND METHODS: To demonstrate the validity and the noise sensitivity of our method, we used electromagnetic finite-difference simulations of a 16-channel transceiver array. To experimentally validate our methodology at 7 Tesla, multi-compartment phantom data were acquired using a standard 32-channel receive coil system and two-dimensional (2D) and 3D gradient-echo acquisitions. The reconstructed electric properties were correlated with those measured using dielectric probes. RESULTS: The method was demonstrated both in simulations and in phantom data, with correlations to both the modeled and bench measurements being close to identity. The noise properties were modeled and understood. CONCLUSION: The proposed methodology allows the electrical properties of a sample to be determined quantitatively using any MR contrast, the only constraints being the need for 4 or more receive coils and high SNR. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
Abstract:
This thesis is composed of three main parts. The first consists of a state of the art of the different notions that are significant for understanding the elements surrounding art authentication in general, and signatures in particular, and that the author deemed necessary to fully grasp the microcosm that makes up this particular market. Readers with a solid knowledge of the art and expertise area who are particularly interested in the present study are advised to advance directly to the fourth chapter. The expertise of the signature, its reliability, and the factors impacting the expert's conclusions are brought forward. The final aim of the state of the art is to offer a general list of recommendations based on an exhaustive review of the current literature, given in light of all of the exposed issues. These guidelines are specifically formulated for the expertise of signatures on paintings, but can also be applied to wider themes in the area of signature examination. The second part of this thesis covers the experimental stages of the research. It consists of the method developed to authenticate painted signatures on works of art. This method is articulated around several main objectives: defining measurable features on painted signatures and establishing their relevance in order to determine the separation capacities between groups of authentic and simulated signatures. For the first time, numerical analyses of painted signatures have been obtained and are used to attribute their authorship to given artists. An in-depth discussion of the developed method constitutes the third and final part of this study. It evaluates the opportunities and constraints of its application by signature and handwriting experts in forensic science. The outlines presented below offer a rapid overview of the study and summarize the aims and main themes addressed in each chapter.
Part I - Theory. Chapter 1 exposes the legal aspects surrounding the authentication of works of art by art experts. The definition of what is legally authentic, the quality and types of experts who can express an opinion concerning the authorship of a specific painting, and standard deontological rules are addressed. The practices applied in Switzerland are specifically dealt with. Chapter 2 presents an overview of the different scientific analyses that can be carried out on paintings (from the canvas to the top coat). Scientific examinations of works of art have become more common as more and more museums equip themselves with laboratories; an understanding of their role in the art authentication process is therefore vital. The added value that a signature expertise can have in comparison to other scientific techniques is also addressed. Chapter 3 provides a historical overview of the signature on paintings throughout the ages, in order to offer the reader an understanding of the origin of the signature on works of art and its evolution through time. An explanation is given of the transitions that the signature went through from the 15th century on, and how it progressively took on its widely known modern form. Both this chapter and Chapter 2 are presented to show the reader the rich sources of information that can be used to describe a painting, and how the signature is one of these sources. Chapter 4 focuses on the different hypotheses the forensic handwriting examiner (FHE) must keep in mind when examining a painted signature, since a number of scenarios can be encountered when dealing with signatures on works of art. The different forms of signatures, as well as the variables that may have an influence on painted signatures, are also presented. Finally, the current state of knowledge of the examination procedure of signatures in forensic science in general, and for painted signatures in particular, is exposed.
The state of the art of the assessment of the authorship of signatures on paintings is established and discussed in light of the theoretical facets mentioned previously. Chapter 5 considers key elements that can have an impact on the FHE during his or her examinations. This includes a discussion of elements such as the skill, confidence and competence of an expert, as well as the potential bias effects he or she might encounter. A better understanding of the elements surrounding handwriting examinations, in order to better communicate results and conclusions to an audience, is also undertaken. Chapter 6 reviews the judicial acceptance of signature analysis in courts and closes the state-of-the-art section of this thesis. This chapter brings forward the current issues pertaining to the appreciation of this expertise by the non-forensic community, and discusses the increasing number of claims of the unscientific nature of signature authentication. The necessity of aiming for more scientific, comprehensive and transparent authentication methods is discussed. The theoretical part of this thesis concludes with a series of general recommendations for forensic handwriting examiners, specifically for the expertise of signatures on paintings. These recommendations stem from the exhaustive review of the literature and the issues it exposed, and can also be applied to the traditional examination of signatures (on paper). Part II - Experimental part. Chapter 7 describes and defines the sampling, extraction and analysis phases of the research. The sampling stage of artists' signatures and their respective simulations is presented, followed by the steps that were undertaken to extract and determine sets of characteristics, specific to each artist, that describe their signatures. The method is based on a study of five artists and a group of individuals acting as forgers for the sake of this study.
Finally, the procedure for analysing these characteristics to assess the strength of evidence, based on a Bayesian reasoning process, is presented. Chapter 8 outlines the results concerning both the artist and simulation corpuses after their optical observation, followed by the results of the analysis phase of the research. The feature selection process and the likelihood-ratio evaluation are the main themes addressed. The discrimination power between the two corpuses is illustrated through multivariate analysis. Part III - Discussion. Chapter 9 discusses the materials, the methods, and the obtained results of the research. The opportunities, but also the constraints and limits, of the developed method are exposed. Future work that can be carried out subsequent to the results of the study is also presented. Chapter 10, the last chapter of this thesis, proposes a strategy to incorporate the model developed in the preceding chapters into traditional signature expertise procedures. Thus, the strength of this expertise is discussed in conjunction with the traditional conclusions reached by forensic handwriting examiners. Finally, this chapter summarizes and advocates a list of formal recommendations for good practice for handwriting examiners. In conclusion, the research highlights the interdisciplinary aspect of the examination of signatures on paintings. The current state of knowledge of the judicial quality of art experts, along with the scientific and historical analysis of paintings and signatures, is overviewed to give the reader a feel for the different factors that have an impact on this particular subject. The hesitant acceptance of forensic signature analysis in court, also presented in the state of the art, explicitly demonstrates the necessity of a better recognition of signature expertise by courts of law.
This general acceptance, however, can only be achieved by producing high-quality results through a well-defined examination process. This research offers an original approach to attributing a painted signature to a certain artist: for the first time, a probabilistic model used to measure the discriminative potential between authentic and simulated painted signatures is studied. The opportunities and limits of this method of scientifically establishing the authorship of signatures on works of art are thus presented. In addition, as its second key contribution, this work proposes a procedure to combine the developed method with that traditionally used by signature experts in forensic science. Such an implementation into holistic traditional signature examination casework is a large step towards providing the forensic, judicial and art communities with a solid reasoning framework for the examination of signatures on paintings. The framework and preliminary results associated with this research have been published (Montani, 2009a) and presented at international forensic science conferences (Montani, 2009b; Montani, 2012).
Abstract:
BACKGROUND: Recent neuroimaging studies suggest that value-based decision-making may rely on mechanisms of evidence accumulation. However, no studies have explicitly investigated the time at which single decisions are taken based on such an accumulation process. NEW METHOD: Here, we outline a novel electroencephalography (EEG) decoding technique which is based on accumulating the probability of appearance of prototypical voltage topographies and can be used for predicting subjects' decisions. We use this approach to study the time-course of single decisions during a task where subjects were asked to compare reward vs. loss points for accepting or rejecting offers. RESULTS: We show that, based on this new method, we can accurately decode decisions for the majority of the subjects. The typical time-period for accurate decoding was modulated by task difficulty on a trial-by-trial basis. Typical latencies of when decisions are made were detected at ∼500 ms for 'easy' vs. ∼700 ms for 'hard' decisions, well before subjects' responses (∼340 ms). Importantly, this decision time correlated with the drift rates of a diffusion model, evaluated independently at the behavioral level. COMPARISON WITH EXISTING METHOD(S): We compare the performance of our algorithm with logistic regression and support vector machines and show that we obtain significant results for a higher number of subjects than with these two approaches. We also carry out analyses at the average event-related-potential level, for comparison with previous studies on decision-making. CONCLUSIONS: We present a novel approach for studying the timing of value-based decision-making, by accumulating patterns of topographic EEG activity at the single-trial level.
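A hypothetical sketch of this style of single-trial decoder (an illustration only, not the authors' implementation: the synthetic data, the k-means prototyping with cosine assignment, and the log-likelihood-ratio accumulation rule are all assumptions) might look like:

```python
# Sketch: learn prototypical voltage topographies, estimate class-conditional
# occurrence probabilities, then accumulate their log-likelihood ratio over
# each single trial. All data and parameters here are synthetic/assumed.
import numpy as np

rng = np.random.default_rng(1)
n_ch, n_t = 8, 200                            # channels, time points per trial
MAPS = np.stack([rng.standard_normal(n_ch),   # two class-specific topographies
                 rng.standard_normal(n_ch)])

def make_trials(n, cls):
    """Noise trials whose topographies ramp toward the class map over time."""
    X = rng.standard_normal((n, n_t, n_ch))
    return X + MAPS[cls] * np.linspace(0, 2, n_t)[None, :, None]

def normalize(V):
    return V / np.linalg.norm(V, axis=-1, keepdims=True)

def fit_prototypes(X, k=4, iters=25):
    """Plain k-means (cosine assignment) on normalized topographies."""
    V = normalize(X.reshape(-1, n_ch))
    C = V[rng.choice(len(V), k, replace=False)]
    for _ in range(iters):
        lab = np.argmax(V @ C.T, axis=1)
        C = np.stack([V[lab == j].mean(0) if np.any(lab == j) else C[j]
                      for j in range(k)])
        C = normalize(C)
    return C

def label_maps(X, C):
    """Prototype index of each time point's topography."""
    return np.argmax(normalize(X) @ C.T, axis=-1)

# Training: occurrence probability of each prototype under each class.
Xa, Xb = make_trials(40, 0), make_trials(40, 1)
C = fit_prototypes(np.concatenate([Xa, Xb]))
pa = np.bincount(label_maps(Xa, C).ravel(), minlength=len(C)) + 1.0
pb = np.bincount(label_maps(Xb, C).ravel(), minlength=len(C)) + 1.0
llr_map = np.log(pa / pa.sum()) - np.log(pb / pb.sum())

def decode(trial):
    """Accumulated log-likelihood ratio over time; its final sign decides."""
    return 0 if np.cumsum(llr_map[label_maps(trial, C)])[-1] > 0 else 1

acc = np.mean([decode(t) == 0 for t in make_trials(20, 0)] +
              [decode(t) == 1 for t in make_trials(20, 1)])
print(f"decoding accuracy: {acc:.2f}")
```

The cumulative sum also yields a per-time-point evidence trace, which is what makes it possible to ask *when* in the trial the decision becomes decodable.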
Abstract:
The design of therapeutic cancer vaccines aims at inducing high numbers of potent T cells that are able to target and eradicate malignant cells. This calls for close collaboration between cells of the innate immune system, in particular dendritic cells (DCs), and cells of the adaptive immune system, notably CD4+ helper T cells and CD8+ cytotoxic T cells. Therapeutic vaccines are aided by adjuvants, which can be, for example, Toll-like receptor agonists or agents promoting the cytosolic delivery of antigens, among others. Vaccination with long synthetic peptides (LSPs) is a promising strategy, as the requirement for their intracellular processing will mainly target LSPs to professional antigen-presenting cells (APCs), hence avoiding the immune tolerance elicited by the presentation of antigens by non-professional APCs. The unique antigen cross-processing and cross-presentation activity of DCs plays an important role in eliciting antitumour immunity, given that antigens from engulfed dead tumour cells require this distinct biological process to be processed and presented to CD8+ T cells in the context of MHC class I molecules. DCs expressing the XCR1 chemokine receptor are characterised by their superior capability of antigen cross-presentation and priming of highly cytotoxic T lymphocyte (CTL) responses. Recently, XCR1 was found to be also expressed in tissue-resident DCs in humans, with a similar transcriptional profile to that of cross-presenting murine DCs. This sheds light on the value of harnessing this subtype of XCR1+ cross-presenting DCs for therapeutic vaccination against cancer.
In this study, we explored ways of adjuvanting and optimising LSP therapeutic vaccinations by the use, in Part I, of the XCL1 chemokine that selectively binds to the XCR1 receptor, as a means to target antigen to the cross-presenting XCR1+ DCs; and, in Part II, by the inclusion of QS21 in the LSP vaccine formulation, a saponin with adjuvant activity as well as the ability to promote cytosolic delivery of LSP antigens owing to its intrinsic cell-membrane insertion activity. In Part I, we designed and produced XCL1-(OVA LSP)-Fc fusion proteins, and showed that their binding to XCR1+ DCs mediates their chemoattraction. In addition, therapeutic vaccinations adjuvanted with XCL1-(OVA LSP)-Fc fusion proteins significantly enhanced the OVA-specific CD8+ T cell response, and led to complete tumour regression in the EL4-OVA model and significant control of tumour growth in the B16-OVA tumour model. With the aim of optimising the co-delivery of LSP antigen and XCL1 to skin-draining lymph nodes, we also tested immunisations using nanoparticle (NP)-conjugated OVA LSP in the presence or absence of the XCL1 chemokine. The NP-mediated delivery of LSP potentiated the CTL response seen in the blood of vaccinated mice, and the NP-OVA LSP vaccine in the presence of XCL1 led to higher blood frequencies of OVA-specific memory-precursor effector cells. Nevertheless, in these settings, the addition of XCL1 to the NP-OVA LSP vaccine formulation did not increase its antitumour therapeutic effect. In Part II, we assessed in HLA-A2/DR1 mice the immunogenicity of the Melan-A A27L LSP or the Melan-A26-35 A27L short synthetic peptide (SSP) used in conjunction with the saponin adjuvant QS21, aiming to identify a potent adjuvant formulation that elicits a quantitatively and qualitatively strong immune response to tumour antigens.
We showed a high CTL immune response elicited by the use of Melan-A LSP or SSP with QS21, both of which exerted similar killing capacity upon in vivo transfer of target cells expressing the Melan-A peptide in the context of HLA-A2 molecules. However, the response generated by the LSP immunisation comprised higher percentages of CD8+ T cells of the central memory phenotype (CD44hi CD62L+ and CCR7+ CD62L+) than that of the SSP immunisation, and most importantly, the strong LSP+QS21 response was strictly CD4+ T cell-dependent, as shown upon CD4 T cell depletion. Altogether, these results suggest that both XCL1 and QS21 may enhance the ability of LSPs to prime CD8-specific T cell responses and promote a long-term memory response. These observations may therefore have important implications for the design of protein- or LSP-based vaccines for specific immunotherapy of cancer. -- Therapeutic cancer vaccines aim to induce a strong and durable immune response against residual cancer cells. This response requires collaboration between the innate immune system, in particular dendritic cells (DCs), and the adaptive immune system, namely CD4 helper and CD8 cytotoxic T lymphocytes. The development of adjuvants and of pathogen-mimicking molecules, such as TLR ligands or other agents facilitating antigen internalisation, is essential for breaking the immune system's tolerance of cancer cells in order to generate an effector and memory response against the tumour. The use of long synthetic peptides (LSPs) is a promising approach because their presentation as antigens requires internalisation and processing by dendritic cells (DCs), which are best able to avoid immune tolerance.
Recently, a subpopulation of DCs expressing the XCR1 receptor was described as having a superior capacity for antigen cross-presentation, hence the interest in developing vaccines targeting XCR1-expressing DCs. During my doctoral thesis, I explored different approaches to optimise LSP vaccines. The first part aimed to target XCR1+ DCs with the XCL1 chemokine, which is specific for the XCR1 receptor, either in the form of an XCL1-OVA LSP-Fc fusion protein or in association with nanoparticles. The second part consisted of testing the combination of LSPs with the saponin-derived adjuvant QS21, with the aim of optimising the cytosolic delivery of the long peptides. The XCL1-OVA-Fc fusion proteins developed in the first part of my work demonstrated specific binding to XCR1+ DCs together with chemoattractant capacity. When included in the immunisation of mice bearing established tumours, the XCL1-OVA LSP-Fc and XCL1-Fc plus OVA LSP fusion proteins induced a strong OVA-specific CD8 response, allowing complete regression of tumours in the EL4-OVA model and a significant growth delay of B16-OVA tumours. In order to optimise the drainage of LSPs to the lymph nodes, we also tested LSPs covalently bound to nanoparticles, co-injected or not with the XCL1 chemokine. This formulation also produced a strong CD8 response accompanied by a significant therapeutic effect, but the addition of the XCL1 chemokine did not provide any additional anti-tumour effect.
In the second part of my thesis, I compared the immunogenicity of the human Melan-A antigen either as an LSP including a CD4 and a CD8 epitope, or as a peptide containing only the CD8 epitope (SSP). The peptides were formulated with the QS21 adjuvant and tested in a mouse model transgenic for the human MHC class I and II molecules, HLA-A2 and DR1 respectively. The LSP and SSP peptides both generated a similarly strong CD8 response associated with an equivalent cytotoxic capacity upon in vivo transfer of target cells presenting the SSP peptide. However, mice immunised with the Melan-A LSP showed a higher percentage of CD8 cells with a central memory phenotype (CD44hi CD62L+ and CCR7+ CD62L+) than mice immunised with the SSP, even ten months after immunisation. Moreover, the CD8 response to the Melan-A LSP was strictly dependent on CD4 lymphocytes, unlike the immunisation with the Melan-A SSP, which was not affected. Overall, these results suggest that the XCL1 chemokine and the QS21 adjuvant improve the CD8 response to a long synthetic peptide, thus favouring the development of a durable anti-tumour memory response. These observations could be useful for the development of new therapeutic vaccines against tumours.
Abstract:
PURPOSE OF REVIEW: Current computational neuroanatomy based on MRI focuses on morphological measures of the brain. We present recent methodological developments in quantitative MRI (qMRI) that provide standardized measures of the brain, which go beyond morphology. We show how biophysical modelling of qMRI data can provide quantitative histological measures of brain tissue, leading to the emerging field of in-vivo histology using MRI (hMRI). RECENT FINDINGS: qMRI has greatly improved the sensitivity and specificity of computational neuroanatomy studies. qMRI metrics can also be used as direct indicators of the mechanisms driving observed morphological findings. For hMRI, biophysical models of the MRI signal are being developed to directly access histological information such as cortical myelination, axonal diameters or axonal g-ratio in white matter. Emerging results indicate promising prospects for the combined study of brain microstructure and function. SUMMARY: Non-invasive brain tissue characterization using qMRI or hMRI has significant implications for both research and clinical practice. Both approaches improve comparability across sites and time points, facilitating multicentre/longitudinal studies and standardized diagnostics. hMRI is expected to shed new light on the relationship between brain microstructure, function and behaviour, both in health and disease, and become an indispensable addition to computational neuroanatomy.
Abstract:
PURPOSE: According to estimates, around 230 people die as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics, and to develop mapping and predictive tools in order to improve local radon prediction. METHOD: About 240,000 indoor radon concentration (IRC) measurements in about 150,000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pairwise Kolmogorov distances between the IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). RESULTS: The automated classification groups lithological units well in terms of their IRC characteristics. In particular, the IRC differences in metamorphic rocks such as gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRCs in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variation in IRC data with random forests. Additionally, the variable importances evaluated by random forests show that building characteristics are less important predictors of IRCs than spatial/geological influences. BART could explain 29% of IRC variability and produced maps that indicate the prediction uncertainty. CONCLUSION: Ensemble regression trees are a powerful tool to model and understand the multidimensional influences on IRCs. Automatic clustering of lithological units complements this method by facilitating the interpretation of the radon properties of rock types. This study provides an important element for radon risk communication.
Future approaches should take into account further variables, such as soil-gas radon measurements, as well as more detailed geological information.
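The clustering step described above can be sketched in a few lines. This is an illustration under stated assumptions, not the study's code: the unit names and lognormal IRC parameters are hypothetical, the Kolmogorov distance is the two-sample Kolmogorov-Smirnov statistic, and the k-medoids routine is a plain PAM-style loop on a precomputed distance matrix.

```python
# Illustrative sketch: k-medoids clustering of lithological units from
# pairwise Kolmogorov distances between their indoor radon concentration
# (IRC) distributions. Unit names and parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
units = {  # synthetic IRC samples (Bq/m^3) per lithological unit
    "gneiss_a":  rng.lognormal(5.0, 0.9, 400),
    "gneiss_b":  rng.lognormal(5.1, 0.9, 400),
    "limestone": rng.lognormal(3.8, 0.6, 400),
    "molasse":   rng.lognormal(3.9, 0.6, 400),
}
names = list(units)

def kolmogorov(a, b):
    """sup_x |F_a(x) - F_b(x)|: the two-sample Kolmogorov-Smirnov statistic."""
    grid = np.sort(np.concatenate([a, b]))
    Fa = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    Fb = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return np.max(np.abs(Fa - Fb))

n = len(names)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = kolmogorov(units[names[i]], units[names[j]])

def k_medoids(D, k, iters=100):
    """Plain PAM-style k-medoids on a precomputed distance matrix."""
    medoids = list(np.linspace(0, len(D) - 1, k, dtype=int))  # spread init
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)
        new = []
        for c in range(k):
            members = np.flatnonzero(labels == c)
            new.append(int(members[np.argmin(
                D[np.ix_(members, members)].sum(axis=1))]))
        if new == medoids:
            break
        medoids = new
    return np.argmin(D[:, medoids], axis=1)

labels = k_medoids(D, k=2)
print(dict(zip(names, labels)))
```

Working on distributions rather than mean IRC values is what lets units with similar medians but different tails (the relevant part for radon risk) end up in different clusters.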
Abstract:
The enhanced functional sensitivity offered by ultra-high field imaging may significantly benefit simultaneous EEG-fMRI studies, but the concurrent increases in artifact contamination can strongly compromise EEG data quality. In the present study, we focus on EEG artifacts created by head motion in the static B0 field. A novel approach for motion artifact detection is proposed, based on a simple modification of a commercial EEG cap, in which four electrodes are non-permanently adapted to record only magnetic induction effects. Simultaneous EEG-fMRI data were acquired with this setup, at 7T, from healthy volunteers undergoing a reversing-checkerboard visual stimulation paradigm. Data analysis assisted by the motion sensors revealed that, after gradient artifact correction, EEG signal variance was largely dominated by pulse artifacts (81-93%), but contributions from spontaneous motion (4-13%) were still comparable to or even larger than those of actual neuronal activity (3-9%). Multiple approaches were tested to determine the most effective procedure for denoising EEG data incorporating motion sensor information. Optimal results were obtained by applying an initial pulse artifact correction step (AAS-based), followed by motion artifact correction (based on the motion sensors) and ICA denoising. On average, motion artifact correction (after AAS) yielded a 61% reduction in signal power and a 62% increase in VEP trial-by-trial consistency. Combined with ICA, these improvements rose to a 74% power reduction and an 86% increase in trial consistency. Overall, the improvements achieved were well appreciable at single-subject and single-trial levels, and set an encouraging quality mark for simultaneous EEG-fMRI at ultra-high field.
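As a generic illustration of the first correction step mentioned above, average artifact subtraction (AAS) removes from each marked artifact occurrence a template averaged over neighbouring occurrences. The sketch below is a single-channel toy version with assumed marker spacing, epoch length and window size, not the study's actual pipeline:

```python
# Hedged sketch of sliding-window average artifact subtraction (AAS) on one
# synthetic EEG channel; all signal parameters are illustrative assumptions.
import numpy as np

def aas_correct(signal, markers, epoch_len, win=11):
    """Subtract from each artifact epoch the mean of nearby artifact epochs."""
    out = signal.astype(float).copy()
    epochs = np.stack([signal[m:m + epoch_len] for m in markers])
    for i, m in enumerate(markers):
        lo, hi = max(0, i - win // 2), min(len(markers), i + win // 2 + 1)
        out[m:m + epoch_len] -= epochs[lo:hi].mean(axis=0)
    return out

rng = np.random.default_rng(2)
fs, n = 250, 250 * 20                        # 250 Hz sampling, 20 s of data
background = 5e-6 * rng.standard_normal(n)   # ~5 uV background EEG (as noise)
artifact = 50e-6 * np.hanning(fs)            # repeating 1 s, 50 uV pulse shape
markers = np.arange(0, n - fs + 1, fs)       # one artifact occurrence per second
signal = background.copy()
for m in markers:
    signal[m:m + fs] += artifact

clean = aas_correct(signal, markers, epoch_len=fs)
rms_before = np.sqrt(np.mean((signal - background) ** 2))
rms_after = np.sqrt(np.mean((clean - background) ** 2))
print(f"artifact RMS: {rms_before * 1e6:.1f} uV -> {rms_after * 1e6:.1f} uV")
```

The sliding window matters: because the template only averages nearby occurrences, slow drifts in artifact shape (e.g. from gradual head repositioning) are tracked rather than smeared into one global template.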
Abstract:
Aim: Obesity and smoking are major CVD risk factors and may be associated with other unfavourable lifestyle behaviours. Our aim was to investigate the significance of obesity, heavy smoking, and both combined, in terms of prevalence trends and their relationship with other lifestyle factors. Methods: We used data from the population-based cross-sectional Swiss Health Survey (5 waves, 1992-2012) comprising 85,575 individuals aged ≥18 years. Height, weight, and smoking status were self-reported. We used multinomial logistic regression to analyse differences in lifestyle for the combinations of BMI category and smoking status, focusing on obese individuals and heavy smokers. We defined normal-weight never-smokers as the reference.
Abstract:
Nowadays, species distribution models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species using presence data and a set of variables, often topoclimatic. Their range of use is quite broad, from understanding single-species requirements, to the creation of nature reserves based on species hotspots, to modelling climate-change impacts. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, these models are used with resolutions below the kilometre scale and are thus called high-resolution models (100 m x 100 m or 25 m x 25 m). Quite recently, a new kind of data has emerged enabling precision up to 1 m x 1 m and thus allowing very-high-resolution modelling. However, these new variables are very costly and require a considerable amount of time to be processed. This is especially the case when these variables are used in complex calculations such as model projections over large areas. Moreover, the importance of very-high-resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, or whether fine-scale temperature or topography are more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very-high-resolution data (2-5 m) in species distribution models using very-high-resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated more local responses of these variables for a subset of species living in this area at two precise elevation belts.
During this thesis I showed that high-resolution data necessitate very good datasets (species and variables for the models) to produce satisfactory results. Indeed, in mountain areas, temperature is the most important factor driving species distributions, and it needs to be modelled at very fine resolution, instead of being interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, results are mixed. Looking at the importance of variables over a large gradient, however, buffers their apparent importance. Indeed, topographic factors have been shown to be highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land-use factors are more important, high-resolution topographic data matter more at the subalpine level. Finally, the biggest improvement in the models occurs when edaphic variables are added. Indeed, adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in SDMs in terms of importance in the models. To conclude, high resolution is very important in modelling but necessitates very good datasets. Merely increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors has been highlighted as fundamental to producing significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments. -- In recent years, the use of species distribution models (SDMs) has steadily increased. These models use different statistical tools to reconstruct the realized niche of a species with the help of variables, notably climatic and topographic ones, and presence data collected in the field.
Their use covers many fields, from studying the ecology of a species to reconstructing communities or assessing the impact of global warming. Most of the time, these models use occurrences from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless make it possible to work at high resolution, hence below the kilometre scale, with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very-high-resolution data has appeared that allows working at the metre scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time. Indeed, any complex statistical computation, such as projecting species distributions over large areas, demands powerful computers and much time. Moreover, the factors governing species distributions at fine scale are still poorly known, and the importance in the models of high-resolution variables such as microtopography or temperature is not certain. Other factors, such as competition or natural stochasticity, could have an equally strong influence. My thesis work is set in this context. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Vaud Prealps. I also sought to understand the local impact of certain variables potentially neglected because of confounding effects along the elevation gradient.
During this thesis, I was able to show that high-resolution variables, whether related to temperature or microtopography, provide only a limited improvement of the models on their own. To obtain a substantial improvement, it is necessary to work with larger datasets, both in terms of species and of the variables used. For example, the usually interpolated climate layers must be replaced by temperature layers modeled at high resolution from field data. Working along a temperature gradient spanning 2000 m naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a 25 m resolution. At a more local scale, however, high resolution is extremely important in the subalpine belt. At the montane belt, by contrast, variables related to soils and land use are very important. Finally, species distribution models were particularly improved by the addition of edaphic variables, mainly pH, whose importance surpasses or equals that of the topographic variables when added to the usual species distribution models.
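The central quantitative claim above — that adding an edaphic predictor such as pH improves a distribution model beyond what a topoclimatic predictor alone achieves — can be illustrated with a minimal sketch on synthetic data. Everything here is hypothetical (invented response curves and uniform sampling); a real SDM would use field occurrences and dedicated tools (GLMs, GAMs, or ensemble platforms):

```python
import math
import random

random.seed(0)

# Hypothetical species: presence depends on temperature (an elevation proxy)
# and on soil pH (the species is assumed to favour cool, acidic sites).
n = 400
temp = [random.uniform(0.0, 15.0) for _ in range(n)]   # mean annual temp, deg C
ph = [random.uniform(3.5, 8.0) for _ in range(n)]      # soil pH

def true_prob(t, p):
    """Toy presence probability combining a temperature and a pH response."""
    z = 2.0 - 0.4 * t - 1.2 * (p - 5.0)
    return 1.0 / (1.0 + math.exp(-z))

y = [1 if random.random() < true_prob(t, p) else 0 for t, p in zip(temp, ph)]

def fit_logistic(X, y, lr=0.01, epochs=3000):
    """Plain gradient-descent logistic regression; X is a list of feature rows."""
    k = len(X[0])
    w = [0.0] * (k + 1)                                # intercept + slopes
    for _ in range(epochs):
        grad = [0.0] * (k + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad)]
    return w

def log_lik(w, X, y):
    """Log-likelihood of the fitted model; higher means a better fit."""
    ll = 0.0
    for xi, yi in zip(X, y):
        z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
        p = 1.0 / (1.0 + math.exp(-z))
        ll += yi * math.log(p) + (1 - yi) * math.log(1.0 - p)
    return ll

X_topo = [[t] for t in temp]                           # temperature only
X_full = [[t, p] for t, p in zip(temp, ph)]            # temperature + pH
w_topo = fit_logistic(X_topo, y)
w_full = fit_logistic(X_full, y)
print(log_lik(w_topo, X_topo, y), log_lik(w_full, X_full, y))
```

Because presence is partly driven by pH, the two-predictor model reaches a higher log-likelihood than the temperature-only model, mirroring the thesis finding that edaphic variables add information the topoclimatic predictors do not carry.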
Resumo:
It is well established that cytotoxic T lymphocytes play a pivotal role in the protection against intracellular pathogens and tumour cells. Such protective immune responses rely on the specific T cell receptor (TCR)-mediated recognition by CD8 T cells of small antigenic peptides presented in the context of class I Major Histocompatibility Complex molecules (pMHCs) on the surface of infected or malignant cells. The strength (affinity/avidity) of this interaction is a major correlate of protection. Although tumour-reactive CD8 T cells can be observed in cancer patients, anti-tumour immune responses are often ineffective in controlling or eradicating the disease due to the relatively low TCR affinity of these cells. To overcome this limitation, tumour-specific CD8 T cells can be genetically modified to express TCRs of improved binding strength against a defined tumour antigen before adoptive cell transfer into cancer patients. We previously generated a panel of TCRs specific for the cancer-testis antigen NY-ESO-1157-165 with progressively increased affinities for the pMHC complex, thus providing us with a unique tool to investigate the causal link between the surface expression of such TCRs and T cell activation and function. We recently demonstrated that anti-tumour CD8 T cell reactivity could only be improved within physiological affinity limits, beyond which drastic functional declines were observed, suggesting the presence of multiple regulatory mechanisms limiting T cell activation and function in a TCR affinity-dependent manner. The overarching goal of this thesis was (i) to assess the precise impact of TCR affinity on T cell activation and signalling at the molecular level and (ii) to gain further insights into the mechanisms that regulate and delimit maximal/optimized CD8 T cell activation and signalling. Specifically, by combining several technical approaches we characterized the activation status of proximal (i.e. CD3ζ, Lck, and ZAP-70) and distal (i.e. ERK1/2) signalling molecules along the TCR affinity gradient. Moreover, we assessed the extent of TCR downmodulation, a critical step for initial T cell activation. CD8 T cells engineered with the optimal TCR affinity variants showed increased activation levels of both proximal and distal signalling molecules when compared to the wild-type T cells. Our analyses also highlighted the "paradoxical" status of tumour-reactive CD8 T cells bearing very high TCR affinities, which retained strong proximal signalling capacity and TCR downmodulation, but were unable to propagate signalling distally (i.e. pERK1/2), resulting in impaired cell-mediated functions. Importantly, these very high affinity T cells displayed maximal levels of SHP-1 and SHP-2 phosphatases, two negative regulatory molecules, and this correlated with a partial pERK1/2 signalling recovery upon pharmacological SHP-1/SHP-2 inhibition. These findings revealed the putative presence of inhibitory regulators of the TCR signalling cascade acting very rapidly following tumour-specific stimulation. Moreover, the very high affinity T cells were only able to transiently express enhanced proximal signalling molecules, suggesting the presence of an additional level of regulation that operates through the activation of negative feedback loops over time, limiting the duration of the TCR-mediated signalling. Overall, the determination of TCR-pMHC binding parameters eliciting optimal CD8 T cell activation, signalling, and effector function while guaranteeing high antigen specificity, together with the identification of critical regulatory mechanisms acting proximally in the TCR signalling cascade, will directly contribute to optimize and support the development of future TCR-based adoptive T cell strategies for the treatment of malignant diseases. -- Cytotoxic CD8 T lymphocytes play a predominant role in the protection against intracellular pathogens and tumour cells.
These immune responses depend on the specificity with which the T cell receptors (TCRs) of CD8 lymphocytes recognize antigenic peptides presented by class I Major Histocompatibility Complex molecules (pMHC) on the surface of infected or malignant cells. The strength (or affinity/avidity) of the TCR-pMHC interaction is a major correlate of protection. Immune responses are, however, often ineffective and fail to control or eliminate tumour cells in cancer patients, owing to the relatively weak recognition of tumour antigens by the TCRs expressed on CD8 T lymphocytes. To overcome this limitation, anti-tumour T cells can be genetically modified by equipping them with TCRs previously optimized to increase their recognition of, or affinity for, tumour antigens before re-infusion into the patient. We recently generated CD8 T cells expressing a panel of TCRs specific for the tumour antigen NY-ESO-1157-165 with increasing affinities, thereby making it possible to investigate the direct causal link between TCR-pMHC affinity and CD8 T cell function. We demonstrated that anti-tumour reactivity could be improved by increasing TCR affinity within a physiological range, beyond which we observe a marked functional decline. These results suggest the presence of regulatory mechanisms limiting T cell activation in a TCR affinity-dependent manner. The aim of this thesis was (i) to define the precise impact of TCR affinity on CD8 T cell activation and signalling at the molecular level and (ii) to gain new insights into the mechanisms that regulate and delimit the maximal activation and signalling of optimized CD8 T cells. Specifically, by combining several technological approaches, we characterized the activation state of various proteins of the proximal (CD3ζ, Lck and ZAP-70) and distal (ERK1/2) signalling pathways along the TCR affinity gradient, as well as TCR internalization, a key step in initial T cell activation. CD8 T lymphocytes expressing TCRs of optimal affinity showed increased activation levels of both proximal and distal molecules compared with wild-type cells. Our analyses also revealed a paradox in CD8 T cells equipped with very-high-affinity TCRs. Indeed, these anti-tumour cells are able to activate their biochemical circuits at the proximal level and to internalize their TCR efficiently, but fail to propagate the TCR-dependent biochemical signals to the distal level (via phospho-ERK1/2), with a resulting limitation of their functional capacity. Finally, we demonstrated that SHP-1 and SHP-2, two phosphatases with negative regulatory properties, were expressed at the highest levels in very-high-affinity CD8 T cells. A partial recovery of ERK1/2 activation levels could be observed after pharmacological inhibition of these phosphatases. These findings reveal the presence of molecular regulators that inhibit the TCR signalling complex very rapidly after anti-tumour stimulation. Moreover, very-high-affinity T cells are able to activate the molecules of the proximal signalling cascade only transiently, suggesting a second level of regulation via negative feedback mechanisms taking place progressively over time and limiting the duration of TCR-dependent signalling. In summary, determining the parameters of the TCR-pMHC interaction that allow optimal activation of signalling pathways and effector functions, together with identifying the regulatory mechanisms acting at the proximal level of the TCR signalling cascade, contributes directly to the optimization and development of anti-tumour strategies based on TCR engineering for the treatment of malignant diseases.
Resumo:
Candida albicans adaptation to the host requires a profound reprogramming of the fungal transcriptome as compared to in vitro laboratory conditions. A detailed knowledge of the C. albicans transcriptome during the infection process is necessary in order to understand which of the fungal genes are important for host adaptation. Such genes could be thought of as potential targets for antifungal therapy. The acquisition of the C. albicans transcriptome is, however, technically challenging due to the low proportion of fungal RNA in host tissues. Two emerging technologies were used recently to circumvent this problem. One consists of the detection of low-abundance fungal RNA using capture and reporter gene probes, followed by emission and quantification of the resulting fluorescent signals (nanoString). The other is based first on the capture of fungal RNA by short biotinylated oligonucleotide baits covering the C. albicans ORFome, permitting fungal RNA purification. Next, the enriched fungal RNA is amplified and subjected to RNA sequencing (RNA-seq). Here we detail these two transcriptome approaches and discuss their advantages, limitations, and future perspectives in microbial transcriptomics from host material.
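The motivation for both approaches is the low fraction of fungal RNA in host tissue, and the benefit of the bait-capture step is usually reported as a fold enrichment of that fraction. The bookkeeping is simple and can be sketched as follows; all read counts here are invented for illustration and do not come from the study:

```python
def fungal_fraction(fungal_reads, total_reads):
    """Fraction of sequencing reads assigned to the fungal transcriptome."""
    return fungal_reads / total_reads

def enrichment_factor(frac_before, frac_after):
    """Fold increase in the fungal read fraction achieved by bait capture."""
    return frac_after / frac_before

# Hypothetical numbers: 0.1% fungal reads in the raw host-tissue extract,
# 40% after biotinylated-bait capture and amplification.
before = fungal_fraction(1_000, 1_000_000)
after = fungal_fraction(400_000, 1_000_000)
print(enrichment_factor(before, after))  # 400.0, i.e. 400-fold enrichment
```

The same two-line calculation applies to any enrichment protocol; only the mapping step that produces the read counts differs between nanoString-style probe detection and capture-based RNA-seq.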
Resumo:
Immunotherapy is emerging as a promising curative anti-cancer modality. However, in contrast to recent advances obtained with checkpoint blockade agents and T cell therapies, the clinical efficacy of therapeutic cancer vaccines is still limited. Most vaccination attempts in the clinic represent "off-the-shelf" approaches, since they target common "self" tumor antigens shared among different patients. In contrast, personalized vaccination approaches are tailor-made for each patient and, despite being laborious, hold great potential. Recent technical advances have enabled the first clinical steps of personalized vaccines that target patient-specific mutated neo-antigens. Such vaccines could induce an enhanced tumor-specific immune response, since neo-antigens are mutation-derived antigens that can be recognized by high-affinity T cells not limited by central tolerance. Alternatively, the use of personalized vaccines based on whole autologous tumor cells overcomes the need to identify specific tumor antigens. Whole autologous tumor cells could be administered alone, pulsed on dendritic cells as lysate, DNA, or RNA, or delivered to dendritic cells in vivo through encapsulation in nanoparticle vehicles. Such vaccines may provide a source of the full repertoire of patient-specific tumor antigens, including private neo-antigens. Furthermore, combining next-generation personalized vaccination with other immunotherapy modalities might be the key to achieving a significant therapeutic outcome.
Resumo:
Landslide processes can have direct and indirect consequences affecting human lives and activities. In order to improve landslide risk management procedures, this PhD thesis investigates the capabilities of active LiDAR and RaDAR sensors for landslide detection and characterization at regional scales, spatial risk assessment over large areas, and slope instability monitoring and modelling at site-specific scales. At regional scales, we first demonstrated the capabilities of recent boat-based mobile LiDAR to model the topography of the Normandy coastal cliffs. By comparing annual acquisitions, we also validated our approach for detecting surface changes and thus mapping the rock collapses, landslides and toe erosion affecting the shoreline at a county scale. Then, we applied a spaceborne InSAR approach to detect large slope instabilities in Argentina. Based on both the phase and the amplitude of the RaDAR signal, we extracted decisive information to detect, characterize and monitor two previously unknown extremely slow landslides, and to quantify water level variations of a nearby dam reservoir involved in one of them. Finally, advanced investigations of fragmental rockfall risk assessment were conducted along roads of the Val de Bagnes, by improving the Slope Angle Distribution approach and the FlowR software. Both rock-mass-failure susceptibilities and relative frequencies of block propagation were thereby assessed, and rockfall hazard and risk maps could be established at the valley scale. At slope-specific scales, in the Swiss Alps, we first integrated ground-based InSAR and terrestrial LiDAR acquisitions to map, monitor and model the Perraire rock slope deformation. By interpreting both methods individually as well as in an original integrated way, we delimited the rockslide borders, computed volumes and highlighted non-uniform translational displacements along a wedge failure surface. Finally, we studied the specific requirements and practical issues encountered in the early warning systems of some of the most studied landslides worldwide. As a result, we highlighted valuable key recommendations for designing new reliable systems; in addition, we also underlined conceptual issues that must be solved to improve current procedures. To sum up, the diversity of situations investigated provided extensive experience that revealed the potential and limitations of both methods, and also highlighted the necessity of their complementary and integrated use.
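The change-detection step described for the annual LiDAR comparisons reduces, once the point clouds are gridded and co-registered, to differencing elevation models and flagging cells whose change exceeds the survey uncertainty. A minimal sketch on a toy grid (elevations and threshold are hypothetical; real processing would first handle georeferencing, co-registration and gridding):

```python
# Two co-registered digital elevation models (metres) from successive annual surveys.
dem_2012 = [
    [52.0, 51.5, 50.9],
    [51.8, 51.2, 50.7],
    [51.5, 51.0, 50.4],
]
dem_2013 = [
    [52.0, 51.4, 50.9],
    [51.8, 49.9, 50.6],
    [51.5, 51.0, 50.4],
]

THRESHOLD = 0.5  # metres; changes below the survey uncertainty are ignored

def detect_changes(before, after, threshold):
    """Return (row, col, dz) for cells whose elevation change exceeds the threshold.

    Negative dz indicates material loss (e.g. a rock collapse or toe erosion);
    positive dz indicates accumulation (e.g. a landslide deposit).
    """
    changes = []
    for i, (row_b, row_a) in enumerate(zip(before, after)):
        for j, (zb, za) in enumerate(zip(row_b, row_a)):
            dz = za - zb
            if abs(dz) > threshold:
                changes.append((i, j, round(dz, 2)))
    return changes

print(detect_changes(dem_2012, dem_2013, THRESHOLD))  # [(1, 1, -1.3)]
```

Sub-threshold differences (here a few centimetres) are treated as noise; only the central cell, which lost about 1.3 m of material between the two surveys, is flagged as a change.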