46 results for 380109 Psychological Methodology, Design and Analysis
at Université de Lausanne, Switzerland
Abstract:
SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences through steric interactions with the DNA, and these specific protein-DNA interactions underlie many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on chromatin immunoprecipitation followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and previously unrecognized artifacts of the method. The distribution of sequence tags in the genome is not uniform, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual tag accumulations create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool for ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some important biological properties of Nuclear Factor I (NFI) DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors act mainly as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors interact only with DNA wrapped around the nucleosome.
We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
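As an illustration of the two analysis steps described above (filtering artifactual hot-spots and unbiased random sampling of sequence tags), here is a minimal Python sketch. The function names, the representation of tags as (chromosome, position) pairs, and the blacklist intervals are illustrative assumptions of ours, not the thesis' actual implementation.

```python
import random

def filter_hotspots(tags, blacklist):
    """Discard tags falling inside known artifactual hot-spot regions.

    tags: list of (chrom, position) pairs
    blacklist: list of (chrom, start, end) half-open intervals
    """
    kept = []
    for chrom, pos in tags:
        if not any(c == chrom and s <= pos < e for c, s, e in blacklist):
            kept.append((chrom, pos))
    return kept

def sample_tags(tags, n, seed=0):
    """Draw an unbiased random subsample of n sequence tags."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return rng.sample(tags, n)

# Toy data: 50 tags on chr1, one blacklisted hot-spot region.
tags = [("chr1", i * 100) for i in range(50)]
blacklist = [("chr1", 1000, 2000)]
clean = filter_hotspots(tags, blacklist)     # removes the 10 tags in the hot-spot
subsample = sample_tags(clean, 10)           # unbiased subsample of 10 tags
```

Filtering before sampling ensures that the artifactual accumulations do not bias any downstream statistics computed on the subsamples.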
Abstract:
BACKGROUND AND STUDY AIMS: Appropriate use of colonoscopy is a key component of quality management in gastrointestinal endoscopy. In an update of a 1998 publication, the 2008 European Panel on the Appropriateness of Gastrointestinal Endoscopy (EPAGE II) defined appropriateness criteria for various colonoscopy indications. This introductory paper therefore deals with methodology, general appropriateness, and a review of colonoscopy complications. METHODS: The RAND/UCLA Appropriateness Method was used to evaluate the appropriateness of various diagnostic colonoscopy indications, with 14 multidisciplinary experts using a scale from 1 (extremely inappropriate) to 9 (extremely appropriate). Evidence reported in a comprehensive updated literature review was used for these decisions. Consolidation of the ratings into three appropriateness categories (appropriate, uncertain, inappropriate) was based on the median and the heterogeneity of the votes. The experts then met to discuss areas of disagreement in the light of existing evidence, followed by a second rating round, and a subsequent third voting round on necessity criteria, using much more stringent criteria (i.e., colonoscopy is deemed mandatory). RESULTS: Overall, 463 indications were rated, with 55%, 16%, and 29% of them judged appropriate, uncertain, and inappropriate, respectively. Perforation and hemorrhage rates, as reported in 39 studies, were in general <0.1% and <0.3%, respectively. CONCLUSIONS: The updated EPAGE II criteria constitute an aid to clinical decision-making but should in no way replace individual judgment. Detailed panel results are freely available on the internet (www.epage.ch) and will thus constitute a reference source of information for clinicians.
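The consolidation rule based on the median and the heterogeneity of the votes can be sketched as follows. This is an illustrative approximation only: the tertile cut-offs and the disagreement threshold are assumptions of ours, not the exact EPAGE II definition.

```python
from statistics import median

def classify(votes, disagreement_threshold=5):
    """Classify a panel's appropriateness votes (integers 1-9).

    Illustrative rule (thresholds are assumptions, not the EPAGE II
    definition): 'disagreement' means at least `disagreement_threshold`
    votes fall in each extreme tertile (1-3 and 7-9); with disagreement
    the indication is rated 'uncertain' regardless of the median.
    """
    low = sum(1 for v in votes if v <= 3)    # votes in the 1-3 tertile
    high = sum(1 for v in votes if v >= 7)   # votes in the 7-9 tertile
    if low >= disagreement_threshold and high >= disagreement_threshold:
        return "uncertain"
    m = median(votes)
    if m >= 7:
        return "appropriate"
    if m <= 3:
        return "inappropriate"
    return "uncertain"
```

For example, 14 votes of 8 yield "appropriate", while a panel split 7/7 between ratings of 1 and 9 yields "uncertain" despite any median.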
Abstract:
BACKGROUND AND PURPOSE: Stroke registries are valuable tools for obtaining information about stroke epidemiology and management. The Acute STroke Registry and Analysis of Lausanne (ASTRAL) prospectively collects epidemiological, clinical, laboratory, and multimodal brain imaging data on acute ischemic stroke patients at the Centre Hospitalier Universitaire Vaudois (CHUV). Here, we describe the design and methods used to create ASTRAL and present baseline data for our patients (2003 to 2008). METHODS: All consecutive patients admitted to CHUV between January 1, 2003 and December 31, 2008 with acute ischemic stroke within 24 hours of symptom onset were included in ASTRAL. Patients arriving beyond 24 hours, or with transient ischemic attack, intracerebral hemorrhage, subarachnoid hemorrhage, or cerebral sinus venous thrombosis, were excluded. Recurrent ischemic strokes were registered as new events. RESULTS: Between 2003 and 2008, 1633 patients and 1742 events were registered in ASTRAL. There was a preponderance of males, even among the elderly. Cardioembolic stroke was the most frequent type of stroke. Most strokes were of minor severity (National Institutes of Health Stroke Scale [NIHSS] score ≤ 4 in 40.8% of patients). Cardioembolic strokes and dissections presented with the most severe clinical picture. There was a significant number of patients with unknown-onset stroke, including wake-up stroke (n=568, 33.1%). Median time from last-well time to hospital arrival was 142 minutes for known-onset and 759 minutes for unknown-onset stroke. The rate of intravenous or intraarterial thrombolysis between 2003 and 2008 increased from 10.8% to 20.8% in patients admitted within 24 hours of last-well time. Acute brain imaging was performed in 1695 patients (97.3%) within 24 hours. Of the 1358 patients (78%) who underwent acute computed tomography angiography, 717 (52.8%) had significant abnormalities.
Of the 1068 supratentorial stroke patients who underwent acute perfusion computed tomography (61.3%), focal hypoperfusion was demonstrated in 786 patients (73.6%). CONCLUSIONS: This hospital-based prospective registry of consecutive acute ischemic strokes incorporates demographic, clinical, metabolic, acute perfusion, and arterial imaging. It is characterized by a high proportion of minor and unknown-onset strokes, short onset-to-admission time for known-onset patients, rapidly increasing thrombolysis rates, and significant vascular and perfusion imaging abnormalities in the majority of patients.
Abstract:
In Switzerland, organ procurement is well organized at the national level, but transplant outcomes have not been systematically monitored so far. Therefore, a novel project, the Swiss Transplant Cohort Study (STCS), was established. The STCS is a prospective multicentre study, designed as a dynamic cohort, which enrolls all solid-organ recipients at the national level. Its features include a flexible patient-case system that captures all transplant scenarios and the collection of both patient-specific and allograft-specific data. Beyond comprehensive clinical data, specific focus is directed at psychosocial and behavioral factors, infectious disease development, and bio-banking. Between May 2008 and the end of 2011, the six Swiss transplant centers recruited 1,677 patients involving 1,721 transplantations, with a total of 1,800 organs implanted in 15 different transplantation scenarios. 10% of all patients underwent re-transplantation and 3% had a second transplantation, either in the past or during follow-up. 34% of all kidney allografts originated from living donation. Until the end of 2011 we observed 4,385 infection episodes in our patient population. The STCS has demonstrated the operative capability to collect high-quality data and to adequately reflect the complexity of the post-transplantation process. It represents a promising novel project for comparative effectiveness research in transplantation medicine.
Abstract:
Objective. The aim of this study was to analyse associations between eating behaviour and psychological dysfunction in treatment-seeking obese patients and to identify parameters for the development of diagnostic tools for eating and psychological disorders. Design and Methods. Cross-sectional data from 138 obese women were analysed. The Bulimic Investigatory Test of Edinburgh and the Eating Disorder Inventory-2 assessed eating behaviours. The Beck Depression Inventory II, the Spielberger State-Trait Anxiety Inventory (form Y), the Rathus Assertiveness Schedule, and the Marks and Mathews Fear Questionnaire assessed the psychological profile. Results. 61% of patients showed moderate or major depressive symptoms and 77% showed symptoms of anxiety. Half of the participants presented with a low degree of assertiveness. No correlation was found between psychological profile and age or anthropometric measurements. The prevalence and severity of depression, anxiety, and low assertiveness increased with the degree of eating disorder. The feeling of ineffectiveness explained a large share of the score variance: 30 to 50% of the variability in assertiveness, phobias, anxiety, and depression. Conclusion. Psychological dysfunctions had a high prevalence, and their severity was correlated with the degree of eating disorder. The feeling of ineffectiveness constitutes the major predictor of the psychological profile and could open new ways to develop screening tools.
Abstract:
OBJECTIVE: Intervention during the pre-psychotic period of illness holds the potential of delaying or even preventing the onset of a full-threshold disorder, or at least of reducing the impact of such a disorder if it does develop. The first step in realizing this aim was achieved more than 10 years ago with the development and validation of criteria for the identification of young people at ultra-high risk (UHR) of psychosis. Results of three clinical trials have been published that provide mixed support for the effectiveness of psychological and pharmacological interventions in preventing the onset of psychotic disorder. METHOD: The present paper describes a fourth study that has now been undertaken, in which young people who met UHR criteria were randomized to one of three treatment groups: cognitive therapy plus risperidone (CogTher + Risp: n = 43); cognitive therapy plus placebo (CogTher + Placebo: n = 44); and supportive counselling plus placebo (Supp + Placebo: n = 28). A fourth group of young people who did not agree to randomization were also followed up (monitoring: n = 78). Baseline characteristics of participants are provided. RESULTS AND CONCLUSION: The present study improves on the previous ones in that treatment was provided for 12 months, so that the independent contributions of psychological and pharmacological treatments, both to preventing transition to psychosis in the UHR cohort and to levels of psychopathology and functioning, can be directly compared. Issues associated with recruitment and randomization are discussed.
Abstract:
Large animal models are an important resource for understanding human disease and for evaluating the applicability of new therapies to human patients. For many diseases, such as cone dystrophy, research effort is hampered by the lack of such models. Lentiviral transgenesis is a methodology broadly applicable to animals from many different species. When coupled to the expression of a dominant mutant protein, this technology offers an attractive approach to generating new large animal models in a heterogeneous genetic background. We adopted this strategy to mimic the phenotype diversity encountered in humans and generated a cohort of pigs modeling cone dystrophy by expressing a dominant mutant allele of the guanylate cyclase 2D (GUCY2D) gene. Sixty percent of the piglets were transgenic, with mutant GUCY2D mRNA detected in the retina of all animals tested. Functional impairment of vision was observed among the transgenic pigs at 3 months of age, with a follow-up at 1 year indicating a subsequently slower progression of the phenotype. Abnormal retinal morphology, notably among the cone photoreceptor cell population, was observed exclusively amongst the transgenic animals. Of particular note, these transgenic animals exhibited a range in phenotype severity, reflecting the human clinical situation. We demonstrate that a transgenic approach using lentiviral vectors offers a powerful tool for large animal model development. Not only is the efficiency of transgenesis higher than with conventional transgenic methodology, but this technique also produces a heterogeneous cohort of transgenic animals that mimics the genetic variation encountered in human patients.
Abstract:
Bacterial reporters are live, genetically engineered cells with promising application in bioanalytics. They contain genetic circuitry to produce a cellular sensing element, which detects the target compound and relays the detection to specific synthesis of so-called reporter proteins (the presence or activity of which is easy to quantify). Bioassays with bacterial reporters are a useful complement to chemical analytics because they measure biological responses rather than total chemical concentrations. Simple bacterial reporter assays may also replace more costly chemical methods as a first line sample analysis technique. Recent promising developments integrate bacterial reporter cells with microsystems to produce bacterial biosensors. This lecture presents an in-depth treatment of the synthetic biological design principles of bacterial reporters, the engineering of which started as simple recombinant DNA puzzles, but has now become a more rational approach of choosing and combining sensing, controlling and reporting DNA 'parts'. Several examples of existing bacterial reporter designs and their genetic circuitry will be illustrated. Besides the design principles, the lecture also focuses on the application principles of bacterial reporter assays. A variety of assay formats will be illustrated, and principles of quantification will be dealt with. In addition to this discussion, substantial reference material is supplied in various Annexes.
Abstract:
Gastric cancer affects about one million people per year worldwide and is the second leading cause of cancer mortality. The study of its etiology therefore remains a global issue, as it may allow the identification of major targets, besides eradication of Helicobacter pylori infection, for primary prevention. It has, however, received little attention, given its comparatively low incidence in most high-income countries. We introduce a consortium of epidemiological investigations named the 'Stomach cancer Pooling (StoP) Project'. Twenty-two studies agreed to participate, for a total of over 9,000 cases and 23,000 controls. Twenty studies have already shared the original data set. Of the patients, 40% are from Asia, 43% from Europe, and 17% from North America; 34% are women and 66% men; the median age is 61 years; 56% are from population-based case-control studies, 41% from hospital-based ones, and 3% from nested case-control studies derived from cohort investigations. Biological samples are available from 12 studies. The aim of the StoP Project is to analyze the role of lifestyle and genetic determinants in the etiology of gastric cancer through pooled analyses of individual-level data. The uniquely large data set will allow us to define and quantify the main effects of each risk factor of interest, including a number of infrequent habits, and to adequately address associations in subgroups of the population, as well as interactions within and between environmental and genetic factors. Further, we will carry out separate analyses according to different histotypes and subsites of gastric cancer, to identify potentially different risk patterns and etiological characteristics.
Abstract:
The antihypertensive effects of the beta-blocking agent betaxolol and the calcium entry blocker verapamil were compared in a crossover single-blind trial. Seventeen patients with uncomplicated essential hypertension took either betaxolol or a slow-release formulation of verapamil for two consecutive 6-week periods. The sequence of treatment phases was randomly allocated and a 2-week washout period preceded each treatment. The antihypertensive effect of the test drugs was assessed both at the physician's office and during everyday activities using a portable blood pressure recorder. The crossover design of the trial made it possible to evaluate the antihypertensive efficacy of betaxolol and verapamil both in the group as a whole and in the individual patient. The individual patient response to one of these agents was not a reliable indicator of the same patient's response to the alternative agent. Betaxolol brought both office and ambulatory recorded blood pressures under control in a larger fraction of patients than verapamil, although the magnitude of the blood pressure fall in the responders was equal for each drug. These observations stress the need for an individualized approach to the evaluation of antihypertensive therapy. The present results also demonstrate that optimal antihypertensive therapy is still a matter of trial and error. The precise methodology that ought to characterize crossover trials may make it possible to improve the therapeutic approach to hypertensive patients.
Abstract:
An active, solvent-free solid sampler was developed for the collection of 1,6-hexamethylene diisocyanate (HDI) aerosol and prepolymers. The sampler was made of a filter impregnated with 1-(2-methoxyphenyl)piperazine contained in a filter holder. Interferences with HDI were observed when a set of cellulose acetate filters and a polystyrene filter holder were used; a glass fiber filter and a polypropylene filter cassette gave better results. The applicability of the sampling and analytical procedure was validated with a test chamber constructed for the dynamic generation of HDI aerosol and prepolymers in commercial two-component spray paints (Desmodur(R) N75) used in car refinishing. The particle size distribution, temporal stability, and spatial uniformity of the simulated aerosol were established in order to test the sampler. The monitoring of aerosol concentrations was conducted with the solid sampler paired with the reference impinger technique (impinger flasks contained 10 mL of 0.5 mg/mL 1-(2-methoxyphenyl)piperazine in toluene) under a controlled atmosphere in the test chamber. Analyses of derivatized HDI and prepolymers were carried out using high-performance liquid chromatography with ultraviolet detection. The correlation between the solvent-free and impinger techniques appeared fairly good (Y = 0.979X - 0.161; R = 0.978) when the tests were conducted in the range of 0.1 to 10 times the threshold limit value (TLV) for HDI monomer and up to 60 µg/m³ (3 U.K. TLVs) for total -N=C=O groups.
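The reported comparison between the two techniques amounts to an ordinary least-squares fit with a Pearson correlation coefficient. A minimal, self-contained sketch (function name and sample data are ours, for illustration only):

```python
from math import sqrt

def linear_fit(x, y):
    """Ordinary least-squares fit y = a*x + b plus Pearson correlation r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)           # variance term of x
    syy = sum((yi - my) ** 2 for yi in y)           # variance term of y
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # covariance term
    a = sxy / sxx            # slope
    b = my - a * mx          # intercept
    r = sxy / sqrt(sxx * syy)  # Pearson correlation coefficient
    return a, b, r

# Toy paired measurements (solid sampler vs. impinger reference):
a, b, r = linear_fit([0.5, 1.2, 3.4, 6.0], [0.33, 1.01, 3.17, 5.72])
```

Applied to the paired chamber measurements, such a fit yields the slope, intercept, and R reported above (Y = 0.979X - 0.161; R = 0.978).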
Abstract:
The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model for the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
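The notion of an attractor as a terminal strongly connected component of the state transition graph can be made concrete with a short sketch. The following Python function is an illustration of ours, unrelated to the actual GINsim implementation: it computes strongly connected components with Kosaraju's algorithm and keeps those with no outgoing edges.

```python
from collections import defaultdict

def attractors(graph):
    """Return the terminal strongly connected components ('attractors')
    of a state transition graph given as {state: [successor states]}."""
    states = set(graph) | {s for succs in graph.values() for s in succs}
    order, seen = [], set()

    def dfs(start):
        # Iterative DFS recording states in postorder (finish time).
        stack = [(start, iter(graph.get(start, [])))]
        seen.add(start)
        while stack:
            node, it = stack[-1]
            for v in it:
                if v not in seen:
                    seen.add(v)
                    stack.append((v, iter(graph.get(v, []))))
                    break
            else:
                order.append(node)
                stack.pop()

    for s in states:
        if s not in seen:
            dfs(s)

    # Second pass on the transposed graph, in reverse finish order.
    transpose = defaultdict(list)
    for u, succs in graph.items():
        for v in succs:
            transpose[v].append(u)

    sccs, assigned = [], set()
    for s in reversed(order):
        if s in assigned:
            continue
        scc, stack = set(), [s]
        assigned.add(s)
        while stack:
            u = stack.pop()
            scc.add(u)
            for v in transpose.get(u, []):
                if v not in assigned:
                    assigned.add(v)
                    stack.append(v)
        sccs.append(scc)

    # An SCC is terminal (an attractor) if no edge leaves it.
    return [scc for scc in sccs
            if all(v in scc for u in scc for v in graph.get(u, []))]

# Toy asynchronous dynamics: 00 -> 01 -> 11, with 11 a stable state.
print(attractors({"00": ["01"], "01": ["11"], "11": ["11"]}))
```

A stable state appears as a singleton terminal SCC, while a cyclic attractor appears as a terminal SCC containing several states; states with paths leading out of their SCC are transient.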
Abstract:
We report the generation and analysis of functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project. These data have been further integrated and augmented by a number of evolutionary and computational analyses. Together, our results advance the collective knowledge about human genome function in several major areas. First, our studies provide convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts, and those that extensively overlap one another. Second, systematic examination of transcriptional regulation has yielded new understanding about transcription start sites, including their relationship to specific regulatory sequences and features of chromatin accessibility and histone modification. Third, a more sophisticated view of chromatin structure has emerged, including its inter-relationship with DNA replication and transcriptional regulation. Finally, integration of these new sources of information, in particular with respect to mammalian evolution based on inter- and intra-species sequence comparisons, has yielded new mechanistic and evolutionary insights concerning the functional landscape of the human genome. Together, these studies are defining a path for pursuit of a more comprehensive characterization of human genome function.