935 results for Electronic data processing -- Quality control


Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

100.00%

Publisher:

Abstract:

Pós-graduação em Genética e Melhoramento Animal - FCAV

Relevance:

100.00%

Publisher:

Abstract:

Foodborne diseases represent operational risks in industrial restaurants. We describe an outbreak of nine clustered cases of acute illness resembling acute toxoplasmosis in an industrial plant with 2,300 employees. These patients and another 36 asymptomatic employees with similar exposure were tested for anti-T. gondii IgG titer and avidity by ELISA. We excluded 14 individuals whose high IgG avidity indicated chronic toxoplasmosis: 13 from the controls and one whose acute disease was not due to T. gondii infection. We also identified three additional asymptomatic, anti-T. gondii IgM-positive employees with acute T. gondii infection among the remaining acute cases. A case-control study was conducted by interviewing the 11 acutely infected patients and 20 seronegative controls. Ingestion of green vegetables, but not of meat or water, was associated with acute disease. These data reinforce the importance of sanitation control in industrial restaurants and demonstrate the need for better quality control of vegetables at risk of T. gondii oocyst contamination. We emphasize the accurate diagnosis of index cases and the detection of asymptomatic infections to determine the extent of a toxoplasmosis outbreak.
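
The association reported above is what a case-control analysis quantifies with an odds ratio. As a minimal sketch (not the authors' actual analysis; the 2x2 counts below are hypothetical placeholders), an odds ratio with a Woolf 95% confidence interval can be computed like this:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
    a = exposed cases,    b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # Woolf SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts, for illustration only (not from the study):
# 9 of 11 cases vs. 8 of 20 controls reporting green-vegetable intake.
print(odds_ratio_ci(9, 2, 8, 12))  # OR = 6.75 with its 95% CI
```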

Relevance:

100.00%

Publisher:

Abstract:

Hybrid technologies, combining integrated microelectronic devices with a new class of microfluidic structures, could open new perspectives on the way nanoscale events are discovered, monitored and controlled. The key point of this thesis is to evaluate the impact of such an approach on ion-channel High Throughput Screening (HTS) platforms. This approach offers promising opportunities for the development of new classes of sensitive, reliable and cheap sensors. Embedding microelectronic readout structures tightly coupled to the sensing elements has numerous advantages. On the one hand, the signal-to-noise ratio is increased as a result of scaling. On the other, readout miniaturization allows sensors to be organized into arrays, increasing the amount of data the platform can acquire, as required in the HTS approach, and improving sensing accuracy and reliability. However, careful interface design is required to establish efficient communication between ionic and electronic signals. The work presented in this thesis shows a first example of a complete parallel readout system with single-ion-channel resolution, using a compact and scalable hybrid architecture suitable for interfacing with large arrays of sensors, ensuring simultaneous signal recording and smart control of the signal-to-noise ratio and bandwidth trade-off. More specifically, an array of microfluidic polymer structures, hosting artificial lipid bilayer blocks in which single ion-channel pores are embedded, is coupled with an array of ultra-low-noise current amplifiers for signal amplification and data processing. As a working demonstration, the platform was used to acquire the ultra-small currents produced by single non-covalent molecular binding events between alpha-hemolysin pores and beta-cyclodextrin molecules in artificial lipid membranes.
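
Binding events such as the alpha-hemolysin/beta-cyclodextrin interactions mentioned above show up in the recorded trace as transient current blockades. A minimal sketch of threshold-based event detection on a synthetic trace (current levels, noise and threshold are illustrative assumptions, not values from the thesis):

```python
import numpy as np

def detect_blockades(current, threshold):
    """Return (start, end) sample-index pairs where current < threshold."""
    below = np.concatenate(([False], current < threshold, [False]))
    edges = np.flatnonzero(np.diff(below.astype(int)))
    return list(zip(edges[::2], edges[1::2]))  # rising/falling edge pairs

# Synthetic trace: ~100 pA open-pore level, noise, one transient blockade.
rng = np.random.default_rng(0)
trace = 100 + 2 * rng.standard_normal(5000)
trace[2000:2600] -= 40          # binding event lowers the current
for start, end in detect_blockades(trace, threshold=80.0):
    print(f"event: samples {start}-{end}, dwell {end - start} samples")
```

Dwell-time statistics gathered this way over many events are what characterize the binding kinetics.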

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents several data processing and compression techniques capable of addressing the strict requirements of wireless sensor networks. After a general overview of sensor networks, the energy problem is introduced, dividing the different energy-reduction approaches according to the subsystem they try to optimize. To manage the complexity brought by these techniques, a quick overview of the most common middlewares for WSNs is given, describing in detail SPINE2, a framework for data processing in the node environment. The focus then shifts to in-network aggregation techniques, which reduce the data sent by the network nodes in order to prolong the network lifetime as much as possible. Among the several techniques, the most promising approach is Compressive Sensing (CS). To investigate this technique, a practical implementation of the algorithm is compared against a simpler aggregation scheme, deriving a mixed algorithm able to successfully reduce the power consumption. The analysis then moves from compression on single nodes to CS for signal ensembles, exploiting the correlations among sensors and nodes to improve compression and reconstruction quality. The two main techniques for signal ensembles, Distributed CS (DCS) and Kronecker CS (KCS), are introduced and compared against a common set of data gathered by real deployments, and the best trade-off between reconstruction quality and power consumption is investigated. The use of CS is also addressed when the signal of interest is sampled at a sub-Nyquist rate, evaluating the reconstruction performance. Finally, group-sparsity CS (GS-CS) is compared to another well-known technique for the reconstruction of signals from a highly sub-sampled version. These two frameworks are again compared against a real data set, and an insightful analysis of the trade-off between reconstruction quality and lifetime is given.
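
As a toy illustration of the CS principle discussed above (a minimal sketch under standard CS assumptions, not the thesis implementation): a k-sparse signal is recovered from m < n random linear measurements by orthogonal matching pursuit.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = A @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n, m, k = 256, 64, 5                           # ambient dim, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                                 # compressed measurements
x_hat = omp(A, y, k)
print("max reconstruction error:", np.max(np.abs(x_hat - x_true)))
```

In the in-network setting, the m measurements are what the nodes actually transmit, in place of the n raw samples.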

Relevance:

100.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) comprises the measurement of drug concentrations in blood and relates the results to the patient's clinical presentation, on the assumption that blood concentrations correlate better with drug effect than the dose does. This also applies to antidepressants. Prerequisites for guiding therapy by TDM are the availability of valid laboratory assay methods and the correct application of the procedure in the clinic. The aim of this work was to analyse and improve the use of TDM in the treatment of depression. In a first step, a high-performance liquid chromatography (HPLC) method with column switching and spectrophotometric detection was established for the newly approved antidepressant duloxetine and applied to patients for TDM. Analysis of 280 patient samples showed that duloxetine concentrations of 60 to 120 ng/ml were associated with good clinical response and a low risk of side effects. With regard to its interaction potential, duloxetine proved to be a weak inhibitor of the cytochrome P450 (CYP) isoenzyme 2D6 compared with other antidepressants, with no indication of clinical relevance. In a second step, a method was to be developed for measuring as many different antidepressants as possible, including their metabolites. To this end, an HPLC method with ultraviolet (UV) detection was developed that allowed the quantitative analysis of ten antidepressant and two antipsychotic substances within 25 minutes with sufficient precision and accuracy (both above 85%) and sensitivity. Column switching permitted the automated analysis of blood plasma or serum; interfering matrix components were separated on a pre-column without prior sample preparation. This cost- and time-effective procedure was a clear improvement for handling samples in the routine laboratory and thus for the TDM of antidepressants. Analysis of the clinical use of TDM identified a number of application errors. An attempt was therefore made to improve the clinical application of TDM of antidepressants by switching from largely manual documentation to electronic processing, and this work examined the effect achieved by this intervention. An electronic data processing (EDV) system was introduced in the laboratory, handling the process from sample receipt to reporting of the results to the wards electronically, and the use of TDM was examined before and after the changeover. The changeover was well accepted by the treating physicians. The laboratory EDV system allowed cumulative retrieval of findings and a display of each individual patient's course of treatment, including previous hospital stays. However, the implementation of the system had only a small influence on the quality of TDM use. Many requests were erroneous both before and after its introduction; for example, measurements were frequently requested before steady state had been reached. The speed of sample processing was unchanged compared with the previous manual workflow, as was the analytical quality in terms of accuracy and precision. Explicit recommendations regarding the dosing strategy of the requested substances were frequently not followed. What did shorten was the mean latency with which a dose adjustment followed the reporting of the laboratory result. Overall, this work has contributed to improving the therapeutic drug monitoring of antidepressants. In clinical practice, however, interventions are needed to minimize application errors in the TDM of antidepressants.
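
One recurring application error noted above was requesting a level before steady state. Under standard pharmacokinetic assumptions, steady state is reached after roughly four to five elimination half-lives of unchanged dosing, so a simple plausibility check could flag premature requests (a sketch; the half-life value is an approximate literature figure, not from this work):

```python
def at_steady_state(hours_since_dose_change, half_life_h, n_half_lives=5):
    """Rule of thumb: ~5 elimination half-lives of unchanged dosing
    are needed before a trough level reflects steady state."""
    return hours_since_dose_change >= n_half_lives * half_life_h

# Duloxetine's half-life is about 12 h (approximate literature value),
# so a level drawn 36 h after a dose change would be flagged as too early:
print(at_steady_state(36, half_life_h=12))   # False
```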

Relevance:

100.00%

Publisher:

Abstract:

We have realized a data acquisition chain for the use and characterization of APSEL4D, a 32 x 128 monolithic active pixel sensor developed as a prototype for frontier experiments in high-energy particle physics. In particular, a transition board was realized for the conversion between the chip and FPGA voltage levels and for enhancing signal quality. A Xilinx Spartan-3 FPGA was used for real-time data processing, chip control, and communication with a personal computer through a USB 2.0 port; for this purpose, firmware was written in VHDL. Finally, a graphical user interface for online system monitoring, hit display, and chip control, based on windows and widgets, was realized in C++ using the dedicated Qt and Qwt libraries. APSEL4D and the full acquisition chain were characterized for the first time with the electron beam of a transmission electron microscope and with 55Fe and 90Sr radioactive sources. In addition, a beam test was performed at the T9 station of the CERN PS, where hadrons with a momentum of 12 GeV/c are available. The very high time resolution of APSEL4D (up to 2.5 Mfps, though used at 6 kfps) was fundamental in realizing a single-electron Young experiment using nanometric double slits obtained by a focused ion beam (FIB) technique. With high-statistics samples, it was possible to observe the interference and diffraction of single, isolated electrons traveling inside a transmission electron microscope. For the first time, information on the distribution of the arrival times of the single electrons was extracted.
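
The abstract does not specify the event format APSEL4D sends over USB; purely to illustrate the kind of host-side unpacking such an acquisition chain needs, here is a hypothetical decoder for fixed-size hit words (a 32 x 128 matrix needs 5 row bits and 7 column bits; this word layout is an assumption, not the real protocol):

```python
import struct

def decode_hits(buffer: bytes):
    """Hypothetical format: one little-endian uint32 per hit,
    bits 0-4 row (32 rows), bits 5-11 column (128 columns),
    bits 12-31 frame timestamp."""
    for (word,) in struct.iter_unpack("<I", buffer):
        row = word & 0x1F
        col = (word >> 5) & 0x7F
        ts = word >> 12
        yield row, col, ts

# Two fabricated example words, for illustration only:
frame = struct.pack("<II", (7 << 12) | (42 << 5) | 3, (7 << 12) | (99 << 5) | 12)
for row, col, ts in decode_hits(frame):
    print(f"hit at row {row}, col {col}, frame {ts}")
```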

Relevance:

100.00%

Publisher:

Abstract:

Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data have increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse these large data sets and gain new insights from them. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. Three practical tools are presented. Two of them are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third is an efficient algorithm for data segmentation implemented as part of Insight. Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining data from different sources at runtime, a variety of data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support have led to additional applications of the software, of which two examples are presented: the use of Insight as a WMS (web map service) server, and the automatic production of image sequences for the visualization of cyclone simulations. The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and results of the segmentation of upper-tropospheric jet streams and of cyclones as full 3D objects. Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, aimed primarily at students. As a web application, it avoids the need to retrieve all input data sets and to install and operate complex visualization tools on a local machine. The main challenge in providing customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
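
As a minimal sketch of the kind of 3D feature detection and overlap-based tracking described above (not the thesis algorithm, which adds refinements against under- and over-segmentation): threshold a field, label connected components, and match features across time steps by voxel overlap.

```python
import numpy as np
from scipy import ndimage

def segment(field, threshold):
    """Label connected 3D regions where the field exceeds a threshold."""
    labels, n_features = ndimage.label(field > threshold)
    return labels, n_features

def match_by_overlap(labels_t0, labels_t1):
    """Map each feature at t0 to the t1 feature it overlaps most."""
    matches = {}
    for f in range(1, labels_t0.max() + 1):
        overlap = labels_t1[labels_t0 == f]
        overlap = overlap[overlap > 0]
        if overlap.size:
            matches[f] = int(np.bincount(overlap).argmax())
    return matches

# Synthetic wind-speed-like field with one strong feature that drifts:
rng = np.random.default_rng(2)
t0 = rng.random((20, 40, 40)) * 10
t1 = rng.random((20, 40, 40)) * 10
t0[8:12, 5:15, 5:15] = 60.0    # feature at time t0
t1[8:12, 8:18, 8:18] = 60.0    # same feature, shifted, at t1
l0, _ = segment(t0, 50.0)
l1, _ = segment(t1, 50.0)
print(match_by_overlap(l0, l1))  # {1: 1}: the feature persists
```

A feature with no match forward in time would correspond to a lysis event, one with no match backward to a genesis event; several-to-one matches indicate merging or splitting.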

Relevance:

100.00%

Publisher:

Abstract:

Sexually transmitted infections other than HIV are important global health issues. They have, however, been neglected as a public-health priority and control efforts continue to fail. Sexually transmitted infections, by their nature, affect individuals, who are part of partnerships and larger sexual networks, and in turn populations. We propose a framework of individual, partnership, and population levels for examining the effects of sexually transmitted infections and interventions to control them. At the individual level we have a range of effective diagnostic tests, treatments, and vaccines. These options are unavailable or inaccessible in many resource-poor settings, where syndromic management remains the core intervention for individual case management. At the partnership level, partner notification and antenatal syphilis screening have the potential to prevent infection and re-infection. Interventions delivered to whole populations, or groups in whom the risks of infection and onward transmission are very high, have the greatest potential effect. Improvements to the infrastructure of treatment services can reduce the incidence of syphilis and gonorrhoea or urethritis. Strong evidence for the effectiveness of most other interventions on population-level outcomes is, however, scarce. Effective action requires a multifaceted approach including better basic epidemiological and surveillance data, high quality evidence about effectiveness of individual interventions and programmes, better methods to get effective interventions onto the policy agenda, and better advocacy and more commitment to get them implemented properly. We must not allow stigma, prejudice, and moral opposition to obstruct the goals of infectious disease control.

Relevance:

100.00%

Publisher:

Abstract:

This article gives an overview of the methods used in the low-level analysis of gene expression data generated using DNA microarrays. This type of experiment makes it possible to determine relative levels of nucleic acid abundance in a set of tissues or cell populations for thousands of transcripts or loci simultaneously. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. This includes the design of probes, the experimental design, the image analysis of scanned microarray images, the normalization of fluorescence intensities, the assessment of the quality of microarray data and the incorporation of quality information in subsequent analyses, the combination of information across arrays and across sets of experiments, the discovery and recognition of patterns in expression at the single-gene and multiple-gene levels, and the assessment of the significance of these findings, given the substantial noise and hence random features in the data. For all of these components, access to a flexible and efficient statistical computing environment is essential.
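
Normalization of fluorescence intensities is one of the steps listed above; quantile normalization is one standard choice. A minimal sketch (illustrative only; the article itself surveys several methods):

```python
import numpy as np

def quantile_normalize(X):
    """Quantile-normalize the columns (arrays) of a genes x arrays matrix:
    every array is forced onto the same empirical distribution."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # per-array ranks
    mean_of_sorted = np.sort(X, axis=0).mean(axis=1)   # reference distribution
    return mean_of_sorted[ranks]

# Three tiny 'arrays' with different intensity scales:
X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
print(quantile_normalize(X))   # columns now share one distribution
```

(For simplicity this sketch breaks ties arbitrarily; production implementations average the reference values over tied ranks.)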

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Short-acting agents for neuromuscular block (NMB) require frequent dosing adjustments to individual patients' needs. In this study, we verified a new closed-loop controller for mivacurium dosing in clinical trials. METHODS: Fifteen patients were studied. T1% measured with electromyography was used as the input signal for the model-based controller. After induction of propofol/opiate anaesthesia, stabilization of the baseline electromyography signal was awaited and a bolus of 0.3 mg kg-1 mivacurium was then administered to facilitate endotracheal intubation. Closed-loop infusion was started thereafter, targeting a neuromuscular block of 90%. Setpoint deviation, the number of manual interventions, and surgeons' complaints were recorded. Drug use and its variability between and within patients were evaluated. RESULTS: Median time of closed-loop control for the 11 patients included in the data processing was 135 [89-336] min (median [range]). Four patients had to be excluded because of sensor problems. Mean absolute deviation from the setpoint was 1.8 +/- 0.9 T1%. Neither manual interventions nor complaints from the surgeons were recorded. The mean required mivacurium infusion rate was 7.0 +/- 2.2 microg kg-1 min-1. The mean infusion rate over 30-min intervals differed by up to a factor of 1.8 between the highest and lowest requirement in the same patient. CONCLUSIONS: Neuromuscular block can be precisely controlled with mivacurium using our model-based controller. The amount of mivacurium needed to maintain T1% at defined constant levels differed largely between and within patients. Closed-loop control therefore seems advantageous for automatically maintaining neuromuscular block at constant levels.
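
The study uses a model-based controller; purely to illustrate the closed-loop structure (measured T1% fed back to adjust the infusion rate toward a 90% block setpoint), here is a simple PI-controller sketch with invented plant dynamics and gains:

```python
# Minimal closed-loop infusion sketch (illustrative PI control, NOT the
# model-based controller of the study; plant, gains and units are invented).
def simulate(setpoint=90.0, dt=1.0, steps=120):
    block, integral = 0.0, 0.0
    for t in range(steps):
        error = setpoint - block                          # T1% block error
        integral += error * dt
        rate = max(0.0, 0.05 * error + 0.002 * integral)  # PI law, µg/kg/min
        # Toy first-order plant: block rises with infusion, decays otherwise.
        block += dt * (0.8 * rate - 0.02 * block)
        if t % 20 == 0:
            print(f"t={t:3d} min  block={block:5.1f}%  rate={rate:5.2f}")

simulate()
```

The integral term is what lets the controller find each patient's individual steady-state requirement, which, as the results above show, can differ widely between and within patients.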

Relevance:

100.00%

Publisher:

Abstract:

Submicroscopic changes in chromosomal DNA copy number are common and have been implicated in many heritable diseases and cancers. Recent high-throughput technologies have a resolution that permits the detection of segmental changes in DNA copy number that span thousands of base pairs across the genome. Genome-wide association studies (GWAS) may simultaneously screen for copy number-phenotype and SNP-phenotype associations as part of the analytic strategy. However, genome-wide array analyses are particularly susceptible to batch effects, as the logistics of preparing DNA and processing thousands of arrays often involve multiple laboratories and technicians, or changes over calendar time to the reagents and laboratory equipment. Failure to adjust for batch effects can lead to incorrect inference and requires inefficient post-hoc quality control procedures that exclude regions associated with batch. Our work extends previous model-based approaches to copy number estimation by explicitly modeling batch effects and using shrinkage to improve locus-specific estimates of copy number uncertainty. Key features of this approach include the use of diallelic genotype calls from experimental data to estimate batch- and locus-specific parameters of background and signal without requiring training data. We illustrate these ideas using a study of bipolar disease and a study of chromosome 21 trisomy. The former has batch effects that dominate much of the observed variation in quantile-normalized intensities, while the latter illustrates the robustness of our approach to data sets in which as many as 25% of the samples have altered copy number. Locus-specific estimates of copy number can be plotted on the copy-number scale to investigate mosaicism and to guide the choice of appropriate downstream approaches for smoothing the copy number as a function of physical position. The software is open source and implemented in the R package CRLMM, available at Bioconductor (http://www.bioconductor.org).
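
A central ingredient above is shrinkage of noisy batch- and locus-specific estimates toward a common value. A minimal empirical-Bayes-style sketch of that idea (a simplification for illustration, not the CRLMM model itself):

```python
import numpy as np

def shrink_batch_means(values, batches, prior_strength=10.0):
    """Shrink each batch's mean intensity toward the grand mean.
    Batches with few samples are pulled in more strongly."""
    grand = values.mean()
    shrunk = {}
    for b in np.unique(batches):
        x = values[batches == b]
        n = x.size
        shrunk[b] = (n * x.mean() + prior_strength * grand) / (n + prior_strength)
    return shrunk

rng = np.random.default_rng(3)
batches = np.repeat(["plate1", "plate2", "plate3"], [50, 50, 4])
values = np.concatenate([rng.normal(2.0, 0.3, 50),
                         rng.normal(2.4, 0.3, 50),
                         rng.normal(3.5, 0.3, 4)])   # tiny, noisy batch
print(shrink_batch_means(values, batches))  # plate3 is pulled toward the grand mean
```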

Relevance:

100.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represent, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve is used to approximate the integrated data set. A single parameter determines how much overshoot and undershoot the user allows in the interpolation function. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms based on linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to reduce these interpolation errors significantly. The accuracy of the new algorithm was tested on a series of x-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
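
The core idea above, interpolating the integrated data with a shape-controlled Hermitian curve and then differencing it on the new grid, can be sketched as follows. SciPy's monotone piecewise-cubic Hermite interpolant (PCHIP) stands in here for the paper's parametrized curve; it corresponds to the zero-overshoot setting, conserves total mass, and cannot produce negative bins:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def rebin_conservative(edges_old, counts, edges_new):
    """Re-bin a histogram by interpolating its cumulative integral with a
    monotone Hermite (PCHIP) curve and differencing it on the new edges.
    Total mass is conserved; monotonicity of the cumulative curve
    prevents negative bins (no overshoot/undershoot)."""
    cumulative = np.concatenate(([0.0], np.cumsum(counts)))
    curve = PchipInterpolator(edges_old, cumulative)
    return np.diff(curve(edges_new))

# Coarse 4-bin histogram re-sampled onto 8 finer bins:
edges_old = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
counts = np.array([10.0, 40.0, 25.0, 5.0])
edges_new = np.linspace(0.0, 4.0, 9)
fine = rebin_conservative(edges_old, counts, edges_new)
print(fine, fine.sum())   # the fine bins sum to 80.0: mass conserved
```

The paper's single tuning parameter would relax the strict no-overshoot behaviour shown here, trading a controlled amount of overshoot for a closer fit to heavily fluctuating data.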

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Since 1999, data from pulmonary hypertension (PH) patients from all PH centres in Switzerland have been prospectively collected. We analyse the epidemiological aspects of these data. METHODS: PH was defined as a mean pulmonary artery pressure of >25 mm Hg at rest or >30 mm Hg during exercise. Patients with pulmonary arterial hypertension (PAH), PH associated with lung diseases, PH due to chronic thrombotic and/or embolic disease (CTEPH), or PH due to miscellaneous disorders were registered. Data from adult patients included between January 1999 and December 2004 were analysed. RESULTS: 250 patients were registered (age 58 +/- 16 years, 104 (41%) males). 152 patients (61%) had PAH, 73 (29%) had CTEPH and 18 (7%) had PH associated with lung disease. Patients <50 years (32%) were more likely to have PAH than patients >50 years (76% vs. 53%, p <0.005). Twenty-four patients (10%) were lost to follow-up, 58 patients (26%) died and 150 (66%) survived without transplantation or thromboendarterectomy. Survivors differed from patients who died in baseline six-minute walking distance (400 m [300-459] vs. 273 m [174-415]), functional impairment (NYHA class III/IV 86% vs. 98%), mixed venous saturation (63% [57-68] vs. 56% [50-61]) and right atrial pressure (7 mm Hg [4-11] vs. 11 mm Hg [4-18]). DISCUSSION: PH is a disease affecting adults of all ages. The management of these patients in specialised centres guarantees a high quality of care. Analysis of the registry data could serve as an instrument for quality control and might help identify weak points in the assessment and treatment of these patients.