964 results for Genetic data quality control
Abstract:
Foodborne diseases represent operational risks in industrial restaurants. We describe an outbreak of nine clustered cases of acute illness resembling acute toxoplasmosis in an industrial plant with 2,300 employees. These patients and another 36 comparable asymptomatic employees were tested for anti-T. gondii IgG titer and avidity by ELISA. We excluded 14 individuals whose high IgG avidity indicated chronic toxoplasmosis: 13 from the controls and one whose acute disease was not attributable to T. gondii infection. We also identified three further asymptomatic employees with acute T. gondii infection, also positive for anti-T. gondii IgM, as additional acute cases. A case-control study was then conducted by interview with the 11 acute cases and 20 seronegative controls. Ingestion of green vegetables, but not of meat or water, was associated with acute disease. These data reinforce the importance of sanitation control in industrial restaurants and demonstrate the need for improved quality control of vegetables at risk of T. gondii oocyst contamination. We emphasize accurate diagnosis of the index cases and detection of asymptomatic infections to determine the full extent of a toxoplasmosis outbreak.
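The association test behind a case-control study of this size reduces to a 2x2 contingency table and Fisher's exact test. The sketch below is purely illustrative: the exposure counts are invented, not the study's actual data.

```python
# Hypothetical 2x2 analysis for a case-control study of this size
# (11 acute cases, 20 seronegative controls). Exposure counts are
# invented for illustration; they are not the study's data.
from scipy.stats import fisher_exact

# Rows: exposure to raw green vegetables (yes / no); columns: (cases, controls).
table = [[9, 7],    # exposed: 9 of 11 cases, 7 of 20 controls (assumed)
         [2, 13]]   # unexposed (assumed)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```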
Abstract:
Human movement analysis (HMA) aims to measure a subject's ability to stand and to walk. In HMA, tests are performed daily in research laboratories, hospitals and clinics to diagnose a disease, distinguish between disease entities, monitor the progress of a treatment and predict the outcome of an intervention [Brand and Crowninshield, 1981; Brand, 1987; Baker, 2006]. For these purposes, clinicians and researchers use measurement devices such as force platforms, stereophotogrammetric systems, accelerometers and baropodometric insoles. This thesis focuses on the force platform (FP), and in particular on the quality assessment of FP data. The principal objective of this work was the design and experimental validation of a portable system for the in situ calibration of FPs. The thesis is structured as follows.
Chapter 1. The physical principles on which an FP works and how they are used to build force transducers, such as strain gauges and piezoelectric transducers; the two categories of FPs, three- and six-component; signal acquisition (hardware structure) and signal calibration; finally, a brief description of the use of FPs in HMA for balance and gait analysis.
Chapter 2. Inverse dynamics, the most common method in HMA, which uses the signals measured by an FP to estimate kinetic quantities such as joint forces and moments. These variables cannot be measured directly without highly invasive techniques, so they can only be estimated with indirect methods such as inverse dynamics. The chapter closes with a brief description of the sources of error in gait analysis.
Chapter 3. State of the art in FP calibration. The selected literature is divided into sections describing systems for the periodic control of FP accuracy, systems for reducing errors in FP signals, and systems and procedures for constructing an FP. A calibration system designed by our group, based on the theoretical method proposed by ?, is described in detail; this system was the starting point for the new system presented in this thesis.
Chapter 4. The new system, described in its three parts: 1) the algorithm; 2) the device; and 3) the calibration procedure needed to perform the calibration correctly. The characteristics of the algorithm were optimized by simulation, and the results are presented. In addition, the different versions of the device are described.
Chapter 5. Experimental validation of the new system on four commercial FPs. The effectiveness of the calibration was verified by measuring, before and after calibration, the accuracy of each FP in locating the center of pressure of an applied force. The new system can estimate both local and global calibration matrices; with these, the non-linearity of the FPs was quantified and locally compensated. A non-linear calibration is also proposed, which compensates for the non-linear behavior of the FP caused by the bending of its upper plate. The experimental results are presented.
Chapter 6. The influence of FP calibration on the estimation of kinetic quantities with the inverse dynamics approach.
Chapter 7. Conclusions: the need for FP calibration and the consequent improvement in the quality of kinetic data.
Appendix. Calibration of the LC used in the presented system. Different calibration set-ups for a 3D force transducer are presented and an optimal set-up is proposed, with particular attention to the compensation of non-linearities; the optimal set-up is verified experimentally.
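As a rough illustration of the idea behind a global (linear) FP recalibration (not the thesis's actual algorithm), the following numpy sketch estimates a 6x6 calibration matrix from known applied loads and raw six-component outputs in the least-squares sense; all data here are simulated.

```python
import numpy as np

# Simulated global (linear) recalibration of a six-component force platform:
# given N known applied loads L_ref (6 x N) and the platform's raw outputs
# V_raw (6 x N), estimate the 6x6 matrix C minimizing ||L_ref - C @ V_raw||.
rng = np.random.default_rng(0)
N = 200
L_ref = rng.uniform(-500.0, 500.0, size=(6, N))           # known loads (invented)
C_true = np.eye(6) + 0.02 * rng.standard_normal((6, 6))   # unknown distortion
V_raw = np.linalg.solve(C_true, L_ref) + 0.5 * rng.standard_normal((6, N))

# Least-squares estimate: C = L V^T (V V^T)^(-1)
C_hat = L_ref @ V_raw.T @ np.linalg.inv(V_raw @ V_raw.T)

residual = C_hat @ V_raw - L_ref
print(f"post-calibration RMS residual: {np.sqrt(np.mean(residual**2)):.2f}")
```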
Abstract:
Biobanks are collections of bodily substances linked to extensive health-related, lifestyle-related and genealogical data about their donors. They serve the study of widespread diseases. These so-called common diseases are multifactorial: they result from a complicated interplay of environmental and behavioral factors with individual genetic predispositions. Research in pharmacogenomics and pharmacogenetics investigates the influence of genes and gene expression on the individual efficacy of drugs and on the occurrence of unwanted side effects, and could thus pave the way toward individualized medicine. Human material is an important component of this research, and the demand for collections that link samples with data is growing. On the one hand, physicians see biobanks as an opportunity to advance medical research and health care. On the other hand, biobanks also provoke fears and mistrust. In particular, there is concern that samples and data may be used without control, touching sensitive areas of personality rights and personal identity. These risks and concerns are not new; they have existed in the past for every form of donation of bodily substances. What is new, however, is the volume of information generated by genetic analysis, which can affect the donor in a very particular way. The storage and use of medical and genetic data thus creates a tension, above all between the affected donors' right to informational self-determination and the research interests of the data users. At its core, the ethical and legal assessment of biobanks revolves around the question of whether this research requires additional rules and, if so, how comprehensive they would have to be. At the center of this discussion are ethical questions concerning informed consent, data protection, the reuse of samples and data, the information of donors about research results, and the rights to use the data. The aim of this work is to examine data protection law, against the background of constitutional law and in particular the right to informational self-determination, with regard to the risks arising from the storage, processing and communication of personal genetic information in the establishment of biobanks. This leads to the further question of whether, and under which conditions, the opposing interests and rights can be reconciled from a constitutional perspective. An essential question is whether the existing legal framework is sufficient to guarantee both the protection of the stored, highly personal data and their appropriate use. The topic is interdisciplinary, at the intersection of data protection, constitutional law, and legal and medical ethics.
From the contents: Scientific and empirical foundations of biobanks – Overview of biobank projects in Europe and outside Europe – Legal bases for biobanks – Right to informational self-determination – Right not to know – Freedom of research – Quality assurance and procedures – Informed consent – Global consent – Data protection concepts – Research secrecy – Biobank secrecy – Biobank legislation
Abstract:
This article gives an overview of the methods used in the low-level analysis of gene expression data generated with DNA microarrays. This type of experiment determines relative levels of nucleic acid abundance in a set of tissues or cell populations for thousands of transcripts or loci simultaneously. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. This includes the design of probes, the experimental design, the image analysis of scanned microarray images, the normalization of fluorescence intensities, the assessment of microarray data quality and the incorporation of quality information into subsequent analyses, the combination of information across arrays and across sets of experiments, the discovery and recognition of expression patterns at the single-gene and multiple-gene levels, and the assessment of the significance of these findings, given the substantial noise and hence random features in the data. For all of these components, access to a flexible and efficient statistical computing environment is essential.
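One widely used technique for the normalization step mentioned above is quantile normalization, which forces every array to share the same empirical intensity distribution. The sketch below is a generic illustration of that technique, not necessarily the specific method the article reviews.

```python
import numpy as np

def quantile_normalize(X: np.ndarray) -> np.ndarray:
    """Quantile-normalize a genes x arrays matrix of (log) intensities."""
    order = np.argsort(X, axis=0)              # per-array ranks
    sorted_vals = np.sort(X, axis=0)
    mean_quantiles = sorted_vals.mean(axis=1)  # shared reference distribution
    Xn = np.empty_like(X, dtype=float)
    for j in range(X.shape[1]):
        Xn[order[:, j], j] = mean_quantiles    # reassign values by rank
    return Xn

# Toy data: 1000 "genes" on 4 arrays with lognormal intensities.
X = np.random.default_rng(1).lognormal(mean=7, sigma=1, size=(1000, 4))
Xn = quantile_normalize(np.log2(X))
```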
Abstract:
BACKGROUND: Since 1999, data on pulmonary hypertension (PH) patients have been prospectively collected from all PH centres in Switzerland. We analysed the epidemiological aspects of these data. METHODS: PH was defined as a mean pulmonary artery pressure of >25 mm Hg at rest or >30 mm Hg during exercise. Patients with pulmonary arterial hypertension (PAH), PH associated with lung diseases, PH due to chronic thrombotic and/or embolic disease (CTEPH), or PH due to miscellaneous disorders were registered. Data from adult patients included between January 1999 and December 2004 were analysed. RESULTS: 250 patients were registered (age 58 +/- 16 years; 104 (41%) males). 152 patients (61%) had PAH, 73 (29%) had CTEPH and 18 (7%) had PH associated with lung disease. Patients <50 years (32%) were more likely to have PAH than patients >50 years (76% vs. 53%, p < 0.005). Twenty-four patients (10%) were lost to follow-up, 58 (26%) died and 150 (66%) survived without transplantation or thromboendarterectomy. Survivors differed from patients who died in baseline six-minute walking distance (400 m [300-459] vs. 273 m [174-415]), functional impairment (NYHA class III/IV: 86% vs. 98%), mixed venous oxygen saturation (63% [57-68] vs. 56% [50-61]) and right atrial pressure (7 mm Hg [4-11] vs. 11 mm Hg [4-18]). DISCUSSION: PH is a disease affecting adults of all ages. The management of these patients in specialised centres guarantees a high quality of care. Analysis of the registry data could serve as an instrument for quality control and might help identify weak points in the assessment and treatment of these patients.
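The registry's haemodynamic inclusion rule stated in METHODS is simple enough to express directly. The sketch below applies that rule to catheterization values; the function and argument names are hypothetical, not part of the registry.

```python
# Inclusion rule from the abstract: mean pulmonary artery pressure (mPAP)
# > 25 mm Hg at rest, or > 30 mm Hg during exercise. Names are hypothetical.
from typing import Optional

def meets_ph_definition(mpap_rest_mmhg: float,
                        mpap_exercise_mmhg: Optional[float] = None) -> bool:
    if mpap_rest_mmhg > 25:
        return True
    return mpap_exercise_mmhg is not None and mpap_exercise_mmhg > 30

assert meets_ph_definition(28)                            # resting criterion
assert meets_ph_definition(22, mpap_exercise_mmhg=33)     # exercise criterion
assert not meets_ph_definition(22, mpap_exercise_mmhg=28) # neither met
```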
Abstract:
A wealth of genetic associations for cardiovascular and metabolic phenotypes in humans has accumulated over the last decade, in particular a large number of loci derived from recent genome-wide association studies (GWAS). True complex disease-associated loci often exert modest effects, so their delineation currently requires the integration of diverse phenotypic data from large studies to ensure robust meta-analyses. We have designed a gene-centric 50 K single nucleotide polymorphism (SNP) array to assess potentially relevant loci across a range of cardiovascular, metabolic and inflammatory syndromes. The array utilizes a "cosmopolitan" tagging approach to capture the genetic diversity across approximately 2,000 loci in populations represented in the HapMap and SeattleSNPs projects. The array content is informed by GWAS of vascular and inflammatory disease, expression quantitative trait loci implicated in atherosclerosis, pathway-based approaches and comprehensive literature searching. The custom flexibility of the array platform facilitated interrogation of loci at differing stringencies, according to a gene prioritization strategy that allows saturation of high-priority loci with a greater density of markers than the existing GWAS tools, particularly in African HapMap samples. We also demonstrate that the IBC array can be used to complement GWAS, increasing coverage in high-priority CVD-related loci across all major HapMap populations. DNA from over 200,000 extensively phenotyped individuals will be genotyped with this array, with a significant portion of the generated data being released into the academic domain, facilitating in silico replication attempts, analyses of rare variants and cross-cohort meta-analyses in diverse populations. These datasets will also facilitate more robust secondary analyses, such as explorations of alternative genetic models, epistasis and gene-environment interactions.
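The tagging idea at the heart of such an array can be illustrated with a simple greedy tag-SNP selection over a pairwise r² matrix: repeatedly pick the SNP that covers the most not-yet-tagged SNPs above an r² threshold. This is a generic sketch on toy data; the actual "cosmopolitan" multi-population tagging algorithm is considerably more involved.

```python
import numpy as np

def greedy_tags(r2: np.ndarray, threshold: float = 0.8) -> list:
    """Greedy tag-SNP selection over a pairwise r^2 matrix (generic sketch)."""
    untagged = set(range(r2.shape[0]))
    tags = []
    while untagged:
        # Pick the SNP covering the most still-untagged SNPs at r^2 >= threshold.
        best = max(untagged, key=lambda i: sum(r2[i, j] >= threshold for j in untagged))
        tags.append(best)
        untagged -= {j for j in untagged if r2[best, j] >= threshold}
    return tags

# Toy genotypes: 100 samples x 50 SNPs coded 0/1/2 (invented data).
rng = np.random.default_rng(2)
G = rng.integers(0, 3, size=(100, 50)).astype(float)
r2 = np.corrcoef(G.T) ** 2
print(f"{len(greedy_tags(r2))} tags cover 50 SNPs at r^2 >= 0.8")
```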
Abstract:
Heat shock protein 70 (Hsp70) plays a central role in protein homeostasis and quality control in conjunction with other chaperone machines, including Hsp90. The Hsp110 chaperone Sse1 promotes Hsp90 activity in yeast and functions as a nucleotide exchange factor (NEF) for cytosolic Hsp70, but the precise roles Sse1 plays in client maturation through the Hsp70-Hsp90 chaperone system are not fully understood. We find that upon pharmacological inhibition of Hsp90, a model protein kinase, Ste11DeltaN, is rapidly degraded, whereas heterologously expressed glucocorticoid receptor (GR) remains stable. Hsp70 binding and nucleotide exchange by Sse1 were required for GR maturation and signaling through endogenous Ste11, as well as to promote Ste11DeltaN degradation. Overexpression of another functional NEF partially compensated for loss of Sse1, whereas the paralog Sse2 fully restored GR maturation and Ste11DeltaN degradation. Sse1 was required for ubiquitinylation of Ste11DeltaN upon Hsp90 inhibition, providing a mechanistic explanation for its role in substrate degradation. Sse1/2 copurified with Hsp70 and other proteins comprising the "early-stage" Hsp90 complex, and were absent from "late-stage" Hsp90 complexes characterized by the presence of Sba1/p23. These findings support a model in which Hsp110 chaperones contribute significantly to the decision made by Hsp70 to fold or degrade a client protein.
Abstract:
The performance of high-resolution CZE for the determination of carbohydrate-deficient transferrin (CDT) in human serum, based on internal and external quality data gathered over a 10-year period, is reported. The assay comprises mixing serum with an Fe(III) ion-containing solution prior to analysis of the iron-saturated mixture in a dynamically double-coated capillary using a commercial buffer at alkaline pH. CDT values obtained with serum from a healthy individual and with commercial quality control sera are shown to vary by less than 10%. Values of a control from a specific lot were found to decrease slowly as a function of time (by less than 10% per year). Furthermore, for unknown reasons, gradual changes in the monitored pattern around pentasialo-transferrin were detected, which limit the use of commercial control sera of the same lot to less than 2 years. Analysis of external quality control sera revealed correct classification of the samples over the entire 10-year period. The data compare well with those of HPLC and CZE assays of other laboratories and demonstrate the robustness of the high-resolution CZE assay. This is the first account of a CZE-based CDT assay with complete internal and external quality assessment over an extended time period.
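The internal QC logic described here (control values expected to stay within 10% of target, lot drift below 10% per year) can be monitored with a few lines of bookkeeping. The sketch below is a hypothetical illustration with invented numbers, not the laboratory's actual procedure.

```python
import numpy as np

def qc_check(days: np.ndarray, cdt_percent: np.ndarray, target: float):
    """Flag control runs deviating >10% from target; estimate yearly drift."""
    rel_dev = (cdt_percent - target) / target
    flags = np.abs(rel_dev) > 0.10                 # 10% tolerance from the abstract
    slope_per_day = np.polyfit(days, cdt_percent, 1)[0]
    drift_per_year = 365 * slope_per_day / target  # fractional drift per year
    return flags, drift_per_year

# Invented control-serum series: biweekly runs over ~2 years with slow decay.
days = np.arange(0, 700, 14, dtype=float)
values = 1.20 - 0.00003 * days + np.random.default_rng(3).normal(0, 0.02, days.size)
flags, drift = qc_check(days, values, target=1.20)
print(f"{flags.sum()} flagged runs; drift = {drift:+.1%} per year")
```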
Abstract:
The Data Quality Campaign (DQC) has focused since 2005 on advocating for states to build robust statewide longitudinal data systems (SLDS). States have made great progress on their data infrastructure, and should continue to emphasize this work, but data systems alone will not improve outcomes. It is time for both DQC and the states to focus on building capacity to use the information these systems are producing at every level, from classrooms to state houses. To improve system performance and student achievement, the ingrained culture must be replaced with one that focuses on data use for continuous improvement. The effective use of data to inform decisions, provide transparency, improve the measurement of outcomes, and fuel continuous improvement will not come to fruition unless there is a system-wide focus on building capacity around the collection, analysis, dissemination, and use of these data, including through research.