143 results for discrete choice experiments


Relevance:

20.00%

Publisher:

Abstract:

SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences through steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on chromatin immunoprecipitation followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and previously unknown artifacts of the method. The sequence-tag distribution in the genome is not uniform, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence-tag accumulations will create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool for ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I (NFI) DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors act mainly as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin-barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
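The unbiased random-sampling idea described above can be sketched in a few lines. The sketch below is illustrative only, not the thesis's actual algorithm: it assumes aligned sequence tags are held in memory as (chromosome, position) tuples and draws a uniform sample without replacement, so every tag is equally likely to enter the subsample and peak statistics are not biased toward high-coverage loci.

```python
import random

def subsample_tags(tags, n, seed=0):
    """Draw an unbiased random subsample of n aligned sequence tags.

    `tags` is a list of (chromosome, position) tuples. Sampling
    without replacement keeps every tag equally likely to be chosen.
    (Hypothetical helper; names and data layout are assumptions.)
    """
    rng = random.Random(seed)  # fixed seed for reproducible subsamples
    if n >= len(tags):
        return list(tags)
    return rng.sample(tags, n)

# Toy data: 1,000 tags on one chromosome, subsampled down to 100.
tags = [("chr1", p) for p in range(1000)]
sample = subsample_tags(tags, 100)
```

Repeating the call with different seeds yields independent subsamples, which is what makes saturation-style analyses of massive ChIP-Seq datasets possible.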

Relevance:

20.00%

Publisher:

Abstract:

The human estrogen receptor (hER) is a trans-acting regulatory protein composed of a series of discrete functional domains. We have microinjected an hER expression vector (HEO) into Xenopus oocyte nuclei and demonstrate, using a Western blot assay, that the hER is synthesized. When nuclear extracts from oocytes were prepared and incubated in the presence of a 2.7 kb DNA fragment comprising the 5' end of the vitellogenin gene B2, formation of estrogen-dependent complexes over the estrogen-responsive element (ERE) could be visualized by electron microscopy. Of crucial importance is the observation that complex formation is inhibited by the estrogen antagonist tamoxifen, is restored by the addition of the hormone, and does not take place with extracts from control oocytes injected with an expression vector lacking the sequences encoding the receptor. The presence of the biologically active hER is confirmed in co-injection experiments, in which HEO is co-introduced with a CAT reporter gene under the control of a vitellogenin promoter containing or lacking the ERE. CAT assays and primer extension analyses reveal that both the receptor and the ERE are essential for estrogen-induced stimulation of transcription. The same approach was used to analyze selected hER mutants. We find that the DNA-binding domain (region C) is essential for protein-DNA complex formation at the ERE but is not by itself sufficient to activate transcription from the reporter gene. In addition to region C, both the hormone-binding (region E) and amino-terminal (region A/B) domains are needed for efficient transcription activation. (ABSTRACT TRUNCATED AT 250 WORDS)

Relevance:

20.00%

Publisher:

Abstract:

Lentiviral vectors infect quiescent cells and allow for the delivery of genes to discrete brain regions. The present study assessed whether stable lentiviral gene transduction can be achieved in the monkey nigrostriatal system. Three young adult Rhesus monkeys received injections of a lentiviral vector encoding the marker gene beta-galactosidase (beta Gal). On one side of the brain, each monkey received multiple lentivirus injections into the caudate and putamen. On the opposite side, each animal received a single injection aimed at the substantia nigra. The first two monkeys were sacrificed 1 month postinjection, while the third monkey was sacrificed 3 months postinjection. Robust incorporation of the beta Gal gene was seen in the striatum of all three monkeys. Stereological counts revealed that 930,218; 1,192,359; and 1,501,217 striatal cells were beta Gal positive in the monkeys sacrificed 1 month (n = 2) and 3 months (n = 1) postinjection, respectively. Only the third monkey had an injection placed directly into the substantia nigra, and 187,308 beta Gal-positive cells were identified in this animal. The injections induced only minor perivascular cuffing, and there was no apparent inflammatory response to the lentivirus injections. Double-label experiments revealed that between 80 and 87% of the beta Gal-positive cells were neurons. These data indicate that robust transduction of striatal and nigral cells can occur in the nonhuman primate brain for up to 3 months. Studies are now ongoing to test the ability of lentiviruses encoding dopaminergic trophic factors to augment the nigrostriatal system in nonhuman primate models of Parkinson's disease.

Relevance:

20.00%

Publisher:

Abstract:

Abstract: Occupational exposure assessment is an important stage in the management of chemical exposures. Few direct measurements are carried out in workplaces, and exposures are often estimated based on expert judgement. There is therefore a major need for simple, transparent tools to help occupational health specialists define exposure levels. The aim of the present research is to develop and improve modelling tools for predicting exposure levels. In a first step, a survey was conducted among occupational hygienists in Switzerland to define their expectations of modelling tools (types of results, models, and potential observable parameters). It was found that models are rarely used in practice in Switzerland and that exposures are mainly estimated from the expert's past experience. Moreover, chemical emissions and their dispersion near the source were considered key parameters. Experimental and modelling studies were also performed in specific cases in order to test the flexibility and drawbacks of existing tools. In particular, models were applied to assess occupational exposure to carbon monoxide in different situations, and the results were compared with exposure levels reported in the literature for similar situations. Further, exposure to waterproofing sprays was studied as part of an epidemiological study on a Swiss cohort. In this case, laboratory investigations were undertaken to characterize the waterproofing overspray emission rate, and a classical two-zone model was then used to assess aerosol dispersion in the near and far field during spraying. Experiments were also carried out to better understand the processes of emission and dispersion of tracer compounds, focusing on the characterization of near-field exposure. An experimental set-up was developed to perform simultaneous measurements with direct-reading instruments at several points of an exposure chamber. It was found that, from a statistical point of view, the compartmental theory makes sense, although the attribution to a given compartment could not be made on the basis of simple geometric considerations. In a further step, the experimental data were complemented by observations made in about 100 different workplaces, including exposure measurements and observation of predefined determinants. The various data obtained were used to improve an existing two-compartment exposure model. A tool was developed to include specific determinants in the choice of the compartment, thus largely improving the reliability of the predictions. All these investigations helped improve our understanding of modelling tools and identify their limitations. The integration of more accessible determinants, in accordance with experts' needs, may indeed encourage model application in field practice. Moreover, by increasing the quality of modelling tools, this research will not only encourage their systematic use, but may also improve the conditions in which expert judgements take place, and therefore workers' health protection.
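The classical two-zone model mentioned above has a simple steady-state form: the far field sees the full emission diluted by the room ventilation rate, and the near field adds an extra term governed by the inter-zone air exchange rate. The sketch below uses illustrative parameter values, not values from the study.

```python
def two_zone_steady_state(emission_rate, room_flow, interzone_flow):
    """Steady-state two-zone (near-field/far-field) concentrations.

    emission_rate  : contaminant emission rate G (mg/min)
    room_flow      : room ventilation rate Q (m^3/min)
    interzone_flow : air exchange rate beta between zones (m^3/min)

    At steady state, C_far = G / Q and C_near = G / Q + G / beta;
    the extra G / beta term reflects incomplete mixing near the source.
    (Function name and units are illustrative assumptions.)
    """
    c_far = emission_rate / room_flow
    c_near = c_far + emission_rate / interzone_flow
    return c_near, c_far

# Illustrative values: G = 100 mg/min, Q = 20 m^3/min, beta = 5 m^3/min
c_near, c_far = two_zone_steady_state(100.0, 20.0, 5.0)
```

As beta grows (vigorous mixing), the near-field term vanishes and the model collapses to the well-mixed single-zone case, which is why the choice of compartment determinants matters so much in practice.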

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: NovoTTF-100A is a portable device delivering low-intensity, intermediate-frequency electric fields via non-invasive transducer arrays. Tumour Treatment Fields (TTF), a completely new therapeutic modality in cancer treatment, physically interfere with cell division. METHODS: Phase III trial of chemotherapy-free treatment with NovoTTF (20-24 h/day) versus active chemotherapy in patients with recurrent glioblastoma. The primary end-point was improvement of overall survival. RESULTS: Patients (median age 54 years (range 23-80), Karnofsky performance status 80% (range 50-100)) were randomised to TTF alone (n=120) or active chemotherapy control (n=117). The median number of prior treatments was two (range 1-6). Median survival was 6.6 versus 6.0 months (hazard ratio 0.86 [95% CI 0.66-1.12]; p=0.27), the 1-year survival rate was 20% in both arms, and the progression-free survival rate at 6 months was 21.4% and 15.1% (p=0.13), respectively, in TTF and active-control patients. Responses were more common in the TTF arm (14% versus 9.6%, p=0.19). TTF-related adverse events were mild (14%) to moderate (2%) skin rash beneath the transducer arrays. Severe adverse events occurred in 6% and 16% (p=0.022) of patients treated with TTF and chemotherapy, respectively. Quality-of-life analyses favoured TTF therapy in most domains. CONCLUSIONS: This is the first controlled trial evaluating an entirely novel cancer treatment modality delivering electric fields rather than chemotherapy. No improvement in overall survival was demonstrated; however, the efficacy and activity of this chemotherapy-free treatment device appear comparable to chemotherapy regimens commonly used for recurrent glioblastoma. Toxicity and quality of life clearly favoured TTF.

Relevance:

20.00%

Publisher:

Abstract:

Warming experiments are increasingly relied on to estimate plant responses to global climate change. For experiments to provide meaningful predictions of future responses, they should reflect the empirical record of responses to temperature variability and recent warming, including advances in the timing of flowering and leafing. We compared phenology (the timing of recurring life history events) in observational studies and warming experiments spanning four continents and 1,634 plant species using a common measure of temperature sensitivity (change in days per degree Celsius). We show that warming experiments underpredict advances in the timing of flowering and leafing by 8.5-fold and 4.0-fold, respectively, compared with long-term observations. For species that were common to both study types, the experimental results did not match the observational data in sign or magnitude. The observational data also showed that species that flower earliest in the spring have the highest temperature sensitivities, but this trend was not reflected in the experimental data. These significant mismatches seem to be unrelated to the study length or to the degree of manipulated warming in experiments. The discrepancy between experiments and observations, however, could arise from complex interactions among multiple drivers in the observational data, or it could arise from remediable artefacts in the experiments that result in lower irradiance and drier soils, thus dampening the phenological responses to manipulated warming. Our results introduce uncertainty into ecosystem models that are informed solely by experiments and suggest that responses to climate change that are predicted using such models should be re-evaluated.
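The common sensitivity measure used above, change in days per degree Celsius, is the slope of a regression of event date on temperature. A minimal sketch with made-up numbers (not data from the study), where warmer years advance flowering:

```python
def temperature_sensitivity(temps, doys):
    """Ordinary least-squares slope of event day-of-year on mean
    temperature: phenological sensitivity in days per degree C.
    Negative values mean earlier flowering/leafing in warmer years.
    (Hypothetical helper for illustration.)"""
    n = len(temps)
    mean_t = sum(temps) / n
    mean_d = sum(doys) / n
    cov = sum((t - mean_t) * (d - mean_d) for t, d in zip(temps, doys))
    var = sum((t - mean_t) ** 2 for t in temps)
    return cov / var

# Four illustrative years: each extra degree advances flowering 3 days.
temps = [10.0, 11.0, 12.0, 13.0]          # mean spring temperature (deg C)
doys = [120.0, 117.0, 114.0, 111.0]       # day of year of first flowering
sens = temperature_sensitivity(temps, doys)  # -3.0 days per degree C
```

Computing the same slope from experimental plots and from long-term observations on a common scale is what makes the 8.5-fold and 4.0-fold mismatches above directly comparable.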

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The ambition of most molecular biologists is to understand the intricate network of molecular interactions that control biological systems. As scientists uncover the components and the connectivity of these networks, it becomes possible to study their dynamical behavior as a whole and to discover the specific role of each component. Since the behavior of a network is by no means intuitive, computational models are needed to understand its behavior and to make predictions about it. Unfortunately, most current computational models describe small networks, owing to the scarcity of available kinetic data. To overcome this problem, we previously published a methodology to convert a signaling network into a dynamical system, even in the total absence of kinetic information. In this paper we present a software implementation of that methodology. RESULTS: We developed SQUAD, a software tool for the dynamic simulation of signaling networks using the standardized qualitative dynamical systems approach. SQUAD converts the network into a discrete dynamical system and uses a binary decision diagram algorithm to identify all the steady states of the system. The software then creates a continuous dynamical system and locates its steady states, which lie near the steady states of the discrete system. The software permits simulations of the continuous system, allowing the modification of several parameters. Importantly, SQUAD includes a framework for perturbing networks in a manner similar to experimental laboratory protocols, for example by activating receptors or knocking out molecular components. Using this software we have been able to successfully reproduce the behavior of the regulatory network implicated in T-helper cell differentiation.
CONCLUSION: The simulation of regulatory networks aims at predicting the behavior of a whole system when subjected to stimuli, such as drugs, or at determining the role of specific components within the network. The predictions can then be used to interpret and/or drive laboratory experiments. SQUAD provides a user-friendly graphical interface, accessible to both computational and experimental biologists, for the fast qualitative simulation of large regulatory networks for which kinetic data are not necessarily available.
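To illustrate the discrete step described in the RESULTS: the steady states of a Boolean network are exactly the states in which every node reproduces its own value. The sketch below enumerates them exhaustively for a toy two-gene mutual-inhibition switch; SQUAD itself finds them symbolically with binary decision diagrams, so this is an illustration of the concept, not SQUAD's algorithm.

```python
from itertools import product

def steady_states(update_fns, n):
    """Enumerate all steady states of an n-node Boolean network.

    `update_fns[i]` maps a full state tuple to the next value of
    node i; a state is steady when every node maps to itself.
    Exhaustive enumeration (2**n states) is only feasible for
    small networks, which is why SQUAD uses BDDs instead.
    """
    fixed = []
    for state in product((0, 1), repeat=n):
        if all(f(state) == s for f, s in zip(update_fns, state)):
            fixed.append(state)
    return fixed

# Toy mutual-inhibition switch: each gene represses the other.
fns = [lambda s: 1 - s[1], lambda s: 1 - s[0]]
states = steady_states(fns, 2)  # -> [(0, 1), (1, 0)]
```

The two steady states correspond to the two mutually exclusive expression patterns of the switch, the kind of attractor SQUAD then refines in its continuous system.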