935 results for Electronic data processing -- Quality control
Abstract:
Background and purpose Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. Methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
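A minimal sketch of the two sensitivity-testing strategies described in this abstract (excluding low-scoring populations versus weighting by quality score), written in Python against a hypothetical table of population-level estimates; the column names, score scale, and cut-off are illustrative and are not taken from the MONICA protocol.

    import pandas as pd

    # Hypothetical per-population estimates with a data-quality score (illustrative 0-2 scale)
    df = pd.DataFrame({
        "population": ["A", "B", "C", "D"],
        "risk_factor_mean": [5.6, 5.9, 6.1, 5.4],
        "quality_score": [2.0, 1.5, 0.5, 1.8],
    })

    # Strategy 1: exclude populations whose quality score falls below a chosen cut-off
    cutoff = 1.0
    pooled_excluding = df.loc[df.quality_score >= cutoff, "risk_factor_mean"].mean()

    # Strategy 2: weight every population by its quality score
    pooled_weighted = (df.risk_factor_mean * df.quality_score).sum() / df.quality_score.sum()

    print(pooled_excluding, pooled_weighted)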
Abstract:
This document does NOT address the issue of oxygen data quality control (either real-time or delayed mode). As a preliminary step towards that goal, this document seeks to ensure that all countries deploying floats equipped with oxygen sensors properly document the data and metadata related to these floats. We produced this document in response to action item 14 from the AST-10 meeting in Hangzhou (March 22-23, 2009). Action item 14: Denis Gilbert to work with Taiyo Kobayashi and Virginie Thierry to ensure DACs are processing oxygen data according to recommendations. If the recommendations contained herein are followed, we will end up with a more uniform set of oxygen data within the Argo data system, allowing users to begin analysing not only their own oxygen data, but also those of others, in the true spirit of Argo data sharing. Indications provided in this document are valid as of the date of writing. It is very likely that changes in sensors, calibrations and conversion equations will occur in the future. Please contact V. Thierry (vthierry@ifremer.fr) for any inconsistencies or missing information. A dedicated webpage on the Argo Data Management website (www) contains all information regarding Argo oxygen data management: current and previous versions of this cookbook, oxygen sensor manuals, calibration sheet examples, examples of matlab code to process oxygen data, test data, etc.
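As an illustration of the kind of conversion equation the cookbook deals with, the sketch below applies the widely used factor of 44.6596 µmol of O2 per mL to convert dissolved oxygen from mL/L to µmol/L, and then to µmol/kg using density; the function names and the example density are assumptions for illustration only, and the authoritative equations remain those given in the Argo documentation.

    ML_PER_L_TO_UMOL_PER_L = 44.6596  # micromoles of O2 per millilitre of O2

    def o2_ml_per_l_to_umol_per_l(o2_ml_per_l: float) -> float:
        """Convert dissolved oxygen from mL/L to umol/L."""
        return o2_ml_per_l * ML_PER_L_TO_UMOL_PER_L

    def o2_umol_per_l_to_umol_per_kg(o2_umol_per_l: float, density_kg_m3: float) -> float:
        """Convert umol/L to umol/kg using seawater density in kg/m3."""
        return o2_umol_per_l / (density_kg_m3 / 1000.0)

    o2_umol_l = o2_ml_per_l_to_umol_per_l(5.0)                     # e.g. 5 mL/L
    o2_umol_kg = o2_umol_per_l_to_umol_per_kg(o2_umol_l, 1026.5)   # illustrative density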
Abstract:
This work proposes a method based on both preprocessing and data mining with the objective of identifying harmonic current sources in residential consumers. In addition, the methodology can also be applied to identify linear and nonlinear loads. It should be emphasized that the entire database was obtained through laboratory tests, i.e., real data were acquired from residential loads. The residential system created in the laboratory was fed by a configurable power source, and the loads and the power quality analyzers were connected at its output (all measurements were stored on a microcomputer). The data were then submitted to pre-processing, based on attribute selection techniques, in order to reduce the complexity of load identification. A new database was generated keeping only the selected attributes, and Artificial Neural Networks were trained to perform the load identification. In order to validate the proposed methodology, the loads were fed both under ideal conditions (without harmonics) and under harmonic voltages within pre-established limits. These limits are in accordance with IEEE Std. 519-1992 and PRODIST (energy distribution procedures employed by Brazilian utilities). The results obtained support the proposed methodology and furnish a method that can serve as an alternative to conventional methods.
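A minimal sketch of the pipeline this abstract describes (attribute selection followed by a neural-network classifier), using scikit-learn on a hypothetical feature matrix of measured electrical quantities; the feature dimensions, class labels, and parameter choices are illustrative and are not taken from the paper.

    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Hypothetical data: rows = measurements from the power quality analyzers,
    # columns = candidate attributes (e.g. harmonic magnitudes, THD, power factor)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))
    y = rng.integers(0, 2, size=200)   # 0 = linear load, 1 = nonlinear load (illustrative)

    # Attribute selection to reduce complexity, then an ANN trained on the reduced attribute set
    model = make_pipeline(
        StandardScaler(),
        SelectKBest(score_func=f_classif, k=8),
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
    )
    model.fit(X, y)
    print(model.score(X, y))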
Abstract:
Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in coronary arteries are one direct risk factor. These can be assessed by the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose and to explain the flow of procedures used to quantify CAD. The main differences in the clinical results when automated or semiautomated post-processing is used will be shown, and the epidemiology, imaging, risk factors and prognosis of the disease described. The software steps and the values that allow the risk of developing CAD to be predicted will be presented. A 64-row multidetector CT scanner with dual source and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced. Two measurements were obtained in each of the three experimental protocols (64, 128, 256 mAs). Considerable changes appeared in the CS values across the protocol variations. The predefined standard protocol provided the lowest radiation dose (0.43 mGy). This study found that the variation in radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
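For reference, the Agatston method mentioned above scores each calcified lesion as its area multiplied by a density weighting factor derived from the lesion's peak attenuation, summed over all lesions and slices; a minimal Python sketch follows, with the lesion list purely illustrative.

    def agatston_weight(peak_hu: float) -> int:
        """Density weighting factor used by the Agatston method."""
        if peak_hu < 130:
            return 0
        if peak_hu < 200:
            return 1
        if peak_hu < 300:
            return 2
        if peak_hu < 400:
            return 3
        return 4

    def agatston_score(lesions):
        """lesions: iterable of (area_mm2, peak_hu) pairs, one per lesion, over all slices."""
        return sum(area * agatston_weight(hu) for area, hu in lesions)

    print(agatston_score([(4.5, 180), (7.2, 350), (2.1, 520)]))  # illustrative lesions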
Abstract:
As part of the development of the database Bgee (a dataBase for Gene Expression Evolution), we annotate and analyse expression data of different types and from different sources, notably Affymetrix data from GEO and ArrayExpress, and RNA-Seq data from SRA. During our quality control procedure, we have identified duplicated content in GEO and ArrayExpress, affecting ∼14% of our data: fully or partially duplicated experiments from independent data submissions, Affymetrix chips reused in several experiments, or reused within an experiment. We present here the procedure that we have established to filter such duplicates from Affymetrix data, and our procedure to identify future potential duplicates in RNA-Seq data. Database URL: http://bgee.unil.ch/
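A minimal sketch of one generic way to flag fully duplicated chips across experiments, by hashing raw CEL files and grouping identical digests; the directory layout and file extension are assumptions for illustration, and the actual filtering procedure used for Bgee is the one described in the paper.

    import hashlib
    from collections import defaultdict
    from pathlib import Path

    def md5_of(path: Path) -> str:
        h = hashlib.md5()
        with path.open("rb") as fh:
            for chunk in iter(lambda: fh.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Group CEL files by checksum; any group with more than one member is a candidate duplicate
    groups = defaultdict(list)
    for cel in Path("affymetrix_data").rglob("*.CEL"):   # assumed layout
        groups[md5_of(cel)].append(cel)

    duplicates = {digest: paths for digest, paths in groups.items() if len(paths) > 1}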
Abstract:
The main goal of CleanEx is to provide access to public gene expression data via unique gene names. A second objective is to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-data set comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of human genes and genes from model organisms. The completely automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resources, such as cDNA clones or Affymetrix probe sets. The web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria. CleanEx is accessible at: http://www.cleanex.isb-sib.ch/.
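A highly simplified picture of the kind of cross-reference index described above: each permanent target identifier maps to a current gene symbol, and the gene index collects all targets (and hence data sets) that reference that gene. The identifiers below are invented for illustration and do not come from CleanEx.

    # Hypothetical target-to-gene mapping produced by an automatic mapping procedure
    targets_to_gene = {
        "AFFY_HG-U133A_201820_at": "KRT5",   # illustrative Affymetrix probe set
        "cDNA_clone_IMAGE_12345": "KRT5",    # illustrative cDNA clone
    }

    # Gene index: gene symbol -> all cross-referenced expression targets
    gene_index: dict[str, list[str]] = {}
    for target, gene in targets_to_gene.items():
        gene_index.setdefault(gene, []).append(target)

    print(gene_index["KRT5"])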
Abstract:
A specification for contractor moisture quality control (QC) in roadway embankment construction has been in use for approximately 10 years in Iowa on about 190 projects. The use of this QC specification and the development of the soils certification program for the Iowa Department of Transportation (DOT) originated from Iowa Highway Research Board (IHRB) embankment quality research projects. Since this research, the Iowa DOT has applied compaction with moisture control on most embankment work under pavements. This study set out to independently evaluate the actual quality of compaction using the current specifications. Results show that Proctor tests conducted by Iowa State University (ISU), using representative material obtained from each test section where field testing was conducted, had optimum moisture contents and maximum dry densities that differed from the values selected by the Iowa DOT for QC/quality assurance (QA) testing. Comparisons between the measured and selected values showed a standard error of 2.9 lb/ft³ for maximum dry density and 2.1% for optimum moisture content. The difference in optimum moisture content was as high as 4%, and the difference in maximum dry density was as high as 6.5 lb/ft³. The differences at most test locations, however, were within the allowable variation suggested in AASHTO T 99 for test results between different laboratories. The ISU testing results showed higher rates of data outside of the specified target limits than indicated by the available contractor QC data for cohesive materials. Also, during construction observations, wet fill materials were often observed. Several test points indicated that materials were placed and accepted wet of the target moisture contents. The statistical analysis results indicate that the results obtained from this study showed improvements over results from previous embankment quality research projects (TR-401 Phases I through III and TR-492) in terms of the percentage of data that fell within the specification limits. Although there was evidence of improvement, QC/QA results are not consistently meeting the target limits/values. Recommendations are provided in this report for Iowa DOT consideration, with three proposed options for improvements to the current specifications. Option 1 provides enhancements to current specifications in terms of material-dependent control limits, training, sampling, and process control. Option 2 addresses development of alternative specifications that incorporate dynamic cone penetrometer or lightweight deflectometer testing into QC/QA. Option 3 addresses incorporating calibrated intelligent compaction measurements into QC/QA.
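A small sketch of the kind of measured-versus-selected comparison reported above, computing the root-mean-square of the paired differences (one common "standard error" convention) and the maximum absolute difference; the paired values are invented for illustration and the report's own statistical definitions take precedence.

    import numpy as np

    # Hypothetical paired maximum dry density values, lb/ft3 (ISU-measured vs Iowa DOT-selected)
    measured = np.array([108.5, 112.0, 105.3, 110.8])
    selected = np.array([110.0, 109.5, 108.0, 112.5])

    diff = measured - selected
    rmse = np.sqrt(np.mean(diff ** 2))   # one common "standard error" convention
    max_abs_diff = np.abs(diff).max()    # analogous to the maximum difference quoted above
    print(rmse, max_abs_diff)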
Abstract:
Background: Urine is still the matrix of choice in the fight against doping, because it can be collected non-invasively during anti-doping tests. Most of the World Anti-Doping Agency's accredited laboratories have more than 20 years' experience in analyzing this biological fluid, and the majority of the compounds listed in the 2010 Prohibited List - International Standard are eliminated through the urinary apparatus. Storing and transporting urine samples for doping analyses does not follow a specific protocol to prevent microbial and thermal degradation. The use of a rapid and reliable screening method could make it possible to determine reference intervals for urine specimens in doping control samples and, notably, to evaluate the prevalence of microbial contamination known to be responsible for the degradation of chemical substances in urine. Methods: The Sysmex® UF-500i is a recent urine flow cytometry analyzer capable of quantifying BACT and other urinary particles such as RBC, WBC, EC, DEBRIS, CAST, PATH.CAST, YLC and SRC, as well as measuring urine conductivity. To determine urine anti-doping reference intervals, 501 samples received in our laboratory over a period of two months were submitted to an immediate examination. All samples were collected and then transported at room temperature. Analysis of variance was performed to test the effects of factors such as gender, test type (in-competition, out-of-competition) and delivery time. Results: The data obtained showed that most of the urine samples were highly contaminated with bacteria. The other urine particle counts also differed markedly according to these factors. Conclusions: The Sysmex® UF-500i was capable of providing a snapshot of the urine particles present in the samples at the time of delivery to the laboratory. These particles, BACT in particular, gave a good idea of the possible microbial degradation which had occurred and/or could have occurred in the sample. This information could be used as the first quality control set up in WADA (World Anti-Doping Agency) accredited laboratories to determine whether steroid profiles and endogenous and prohibited substances have possibly been altered.
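A minimal sketch of the type of analysis of variance described above, applied to hypothetical bacterial counts with the same factors (gender, test type, delivery time), using statsmodels; the data values and the exact model specification are illustrative, not those of the study.

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Hypothetical records: bacterial counts (BACT) and the factors analysed in the study
    data = pd.DataFrame({
        "bact":      [120, 4500, 300, 9800, 75, 2600, 150, 7100],
        "gender":    ["F", "M", "F", "M", "M", "F", "M", "F"],
        "test_type": ["in", "out", "out", "in", "in", "out", "out", "in"],
        "delay_h":   [2, 18, 4, 30, 1, 24, 3, 12],
    })

    # Analysis of variance for the effects of gender, test type and delivery time
    model = ols("bact ~ C(gender) + C(test_type) + delay_h", data=data).fit()
    print(sm.stats.anova_lm(model, typ=2))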
Abstract:
BACKGROUND: Worldwide data for cancer survival are scarce. We aimed to initiate worldwide surveillance of cancer survival by central analysis of population-based registry data, as a metric of the effectiveness of health systems, and to inform global policy on cancer control. METHODS: Individual tumour records were submitted by 279 population-based cancer registries in 67 countries for 25·7 million adults (age 15-99 years) and 75 000 children (age 0-14 years) diagnosed with cancer during 1995-2009 and followed up to Dec 31, 2009, or later. We looked at cancers of the stomach, colon, rectum, liver, lung, breast (women), cervix, ovary, and prostate in adults, and adult and childhood leukaemia. Standardised quality control procedures were applied; errors were corrected by the registry concerned. We estimated 5-year net survival, adjusted for background mortality in every country or region by age (single year), sex, and calendar year, and by race or ethnic origin in some countries. Estimates were age-standardised with the International Cancer Survival Standard weights. FINDINGS: 5-year survival from colon, rectal, and breast cancers has increased steadily in most developed countries. For patients diagnosed during 2005-09, survival for colon and rectal cancer reached 60% or more in 22 countries around the world; for breast cancer, 5-year survival rose to 85% or higher in 17 countries worldwide. Liver and lung cancer remain lethal in all nations: for both cancers, 5-year survival is below 20% everywhere in Europe, in the range 15-19% in North America, and as low as 7-9% in Mongolia and Thailand. Striking rises in 5-year survival from prostate cancer have occurred in many countries: survival rose by 10-20% between 1995-99 and 2005-09 in 22 countries in South America, Asia, and Europe, but survival still varies widely around the world, from less than 60% in Bulgaria and Thailand to 95% or more in Brazil, Puerto Rico, and the USA. For cervical cancer, national estimates of 5-year survival range from less than 50% to more than 70%; regional variations are much wider, and improvements between 1995-99 and 2005-09 have generally been slight. For women diagnosed with ovarian cancer in 2005-09, 5-year survival was 40% or higher only in Ecuador, the USA, and 17 countries in Asia and Europe. 5-year survival for stomach cancer in 2005-09 was high (54-58%) in Japan and South Korea, compared with less than 40% in other countries. By contrast, 5-year survival from adult leukaemia in Japan and South Korea (18-23%) is lower than in most other countries. 5-year survival from childhood acute lymphoblastic leukaemia is less than 60% in several countries, but as high as 90% in Canada and four European countries, which suggests major deficiencies in the management of a largely curable disease. INTERPRETATION: International comparison of survival trends reveals very wide differences that are likely to be attributable to differences in access to early diagnosis and optimum treatment. Continuous worldwide surveillance of cancer survival should become an indispensable source of information for cancer patients and researchers and a stimulus for politicians to improve health policy and health-care systems. 
FUNDING: Canadian Partnership Against Cancer (Toronto, Canada), Cancer Focus Northern Ireland (Belfast, UK), Cancer Institute New South Wales (Sydney, Australia), Cancer Research UK (London, UK), Centers for Disease Control and Prevention (Atlanta, GA, USA), Swiss Re (London, UK), Swiss Cancer Research foundation (Bern, Switzerland), Swiss Cancer League (Bern, Switzerland), and University of Kentucky (Lexington, KY, USA).
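The age-standardisation step mentioned above is, in essence, a weighted average of age-specific net survival estimates; the short sketch below illustrates the arithmetic with placeholder numbers, since the actual International Cancer Survival Standard (ICSS) weights depend on the cancer site and age grouping and are not reproduced here.

    # Hypothetical age-specific 5-year net survival estimates, by age group
    age_specific_survival = [0.72, 0.68, 0.63, 0.55, 0.41]
    # Placeholder standard weights summing to 1 (NOT the actual ICSS weights)
    weights = [0.07, 0.12, 0.23, 0.29, 0.29]

    age_standardised = sum(w * s for w, s in zip(weights, age_specific_survival))
    print(round(age_standardised, 3))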
Abstract:
Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical procedures has often limited the comparison of data at national and international level. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to Human Biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to the participating laboratories. Training sessions were organized for field workers, and four external quality-assessment exercises (ICI/EQUAS), each followed by a corresponding web conference, were held between March 2011 and February 2012. ICI/EQUAS used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories met the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. The web conferences held after each ICI/EQUAS proved to be a new and effective tool for improving analytical performance and increasing capacity building. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale as regards implementation of the Minamata Convention on Mercury.
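For reference, the relative standard deviations quoted above follow directly from the spread of the laboratories' reported values for a given sample; a minimal sketch with invented measurements:

    import statistics

    # Hypothetical mercury values reported by participating laboratories for one ICI/EQUAS sample
    values = [0.62, 0.58, 0.66, 0.60, 0.64, 0.57]

    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    rsd_percent = 100 * sd / mean   # relative standard deviation, expressed as a percentage
    print(round(rsd_percent, 2))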
Abstract:
Evaluation of image quality (IQ) in Computed Tomography (CT) is important to ensure that diagnostic questions are correctly answered, whilst keeping radiation dose to the patient as low as is reasonably possible. The assessment of individual aspects of IQ is already a key component of routine quality control of medical x-ray devices. These values, together with standard dose indicators, can be used to give rise to 'figures of merit' (FOM) to characterise the dose efficiency of CT scanners operating in certain modes. The demand for clinically relevant IQ characterisation has naturally increased with the development of CT technology (detector efficiency, image reconstruction and processing), resulting in the adaptation and evolution of assessment methods. The purpose of this review is to present the spectrum of methods that have been used to characterise image quality in CT: from objective measurements of physical parameters to clinically task-based approaches (i.e. the model observer (MO) approach), including the pure human observer approach. When combined with a dose indicator, a generalised dose efficiency index can be explored in a framework of system and patient dose optimisation. We will focus on the IQ methodologies that are required for dealing with standard reconstruction, but also with iterative reconstruction algorithms. With this concept, the previously used FOM will be presented, together with a proposal to update them in order to keep them relevant and in step with technological progress. The MO, which objectively assesses IQ for clinically relevant tasks, represents the most promising method in terms of radiologist sensitivity performance and is therefore of most relevance in the clinical environment.
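One convention that appears in the CT literature for such a dose-efficiency figure of merit is the squared task-based detectability index divided by a dose indicator (for example CTDIvol); the sketch below illustrates that form only and is not necessarily the updated FOM proposed in this review.

    def dose_efficiency_fom(detectability_index: float, ctdi_vol_mgy: float) -> float:
        """One common convention: squared detectability index per unit dose
        (an assumption for illustration, not the review's definition)."""
        return detectability_index ** 2 / ctdi_vol_mgy

    # Illustrative comparison of two reconstruction settings acquired at the same dose
    print(dose_efficiency_fom(2.1, 8.0), dose_efficiency_fom(2.6, 8.0))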
Abstract:
The aim of this master's thesis is to research and analyze how purchase invoice processing can be automated and streamlined in a system renewal project. The impacts of workflow automation on invoice handling are studied in terms of time, cost and quality. Purchase invoice processing has a lot of potential for automation because of its labor-intensive and repetitive nature. As a case study combining both qualitative and quantitative methods, the topic is approached from a business process management point of view. The current process was first explored through interviews and workshop meetings to create a holistic understanding of the process at hand. Requirements for process streamlining were then researched focusing on specified vendors and their purchase invoices, which helped to identify the critical factors for successful invoice automation. To optimize the flow from invoice receipt to approval for payment, the invoice receiving process was outsourced and the automation functionalities of the new system were utilized in invoice handling. The quality of invoice data and the need for simply structured purchase order (PO) invoices were emphasized in the system testing phase. Hence, consolidated invoices containing references to multiple PO or blanket release numbers should be simplified in order to use automated PO matching. With non-PO invoices, it is important to receive the buyer reference details in an applicable invoice data field so that automation rules can be created to route invoices to a review and approval flow. At the beginning of the project, invoice processing was seen as ineffective both time- and cost-wise, and it required a lot of manual labor to carry out all tasks. According to the testing results, it was estimated that over half of the invoices could be automated within a year after system implementation. Processing times could be reduced remarkably, which would then result in savings of up to 40% in annual processing costs. Due to several advancements in the purchase invoice process, business process quality could also be perceived as improved.
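A toy sketch of the kind of routing rule described above, distinguishing PO invoices (candidates for automated PO matching) from non-PO invoices that carry a usable buyer reference; the field names and rule structure are illustrative assumptions, not the case company's configuration.

    def route_invoice(invoice: dict) -> str:
        """Illustrative routing logic; the field names are assumptions."""
        if invoice.get("po_number"):          # PO invoice: attempt automated PO matching
            return "auto_po_matching"
        if invoice.get("buyer_reference"):    # non-PO invoice with a usable buyer reference
            return "review_and_approval_flow"
        return "manual_handling"              # anything else falls back to manual work

    print(route_invoice({"po_number": "PO-1001"}))
    print(route_invoice({"buyer_reference": "jdoe"}))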
Abstract:
Aiming at improving the quality of Perna perna mussels cultivated and commercialized in Ubatuba, SP, Brazil, the growth and elimination of Staphylococcus aureus and Bacillus cereus artificially inoculated into mussels were studied. The inoculation was carried out in "in natura" and pre-cooked mussels for 30 min, after which the mussels were kept for 10 hours at room temperature (25 ± 1 °C) and under refrigeration (7 ± 1 °C). Six thermal treatments were evaluated, three using steam (5, 10 and 15 minutes) and three using boiling water (5, 10 and 15 minutes), in order to find the time/temperature combination best able to control the pathogens. Yield and physical-chemical and sensory characteristics were also evaluated. All thermal treatments were effective in reducing the microorganism counts by 2 logarithmic cycles, with the boiling-water treatments giving better results than the steam treatments. The physical-chemical and sensory analyses did not show statistical differences among the thermal treatments studied. Overall, the best performance was achieved with the shortest heat-exposure times, and the boiling-water treatments outperformed the steam treatments.
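For reference, the 2-logarithmic-cycle reduction mentioned above corresponds to log10 of the ratio of counts before and after treatment; a minimal sketch with invented counts:

    import math

    def log_reduction(cfu_before: float, cfu_after: float) -> float:
        """Log reduction achieved by a treatment: log10(initial count / surviving count)."""
        return math.log10(cfu_before / cfu_after)

    print(log_reduction(1e6, 1e4))   # a drop from 10^6 to 10^4 CFU/g is a 2-log reduction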
Abstract:
The objectives of this master's thesis were to understand the importance of bubbling fluidized bed (BFB) conditions and to find out how digital image processing and acoustic emission technology can help in monitoring bed quality. An acoustic emission (AE) measurement system and a bottom ash camera system were evaluated as means of acquiring information about the bed conditions. The theory part of the study describes the fundamentals of a BFB boiler and the characteristics of a bubbling bed. Causes and effects of bed material coarsening are explained, and the ways and methods available to monitor the behaviour of a BFB are reviewed. The study also introduces the operating principles of AE technology and digital image processing. The empirical part of the study describes the experimental arrangement and results of a case study at an industrial BFB boiler. Sand consumption of the boiler was reduced by optimizing bottom ash handling and sand feeding. Furthermore, data from the AE measurement system and the bottom ash camera system were collected, and the feasibility of these two systems was evaluated. The particle size of bottom ash and the changes in particle size distribution were monitored during the test period. Neither of the evaluated systems was yet able to support bed quality control with sufficient accuracy or speed. Particle size distributions according to the bottom ash camera did not correspond to the results of manual sieving, and comprehensive interpretation of the collected AE data requires considerable experience. Both technologies nevertheless have potential, and with further research and development they may provide reliable, real-time information about the bed conditions. This information could help maintain a disturbance-free combustion process and optimize the bottom ash handling system.
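A small sketch of one way to compare the camera-based and sieving-based particle size distributions mentioned above, by interpolating the median particle size (d50) from cumulative percent-passing curves; the size classes and percentages are invented for illustration.

    import numpy as np

    # Hypothetical cumulative percent-passing curves for the same bottom ash sample
    sieve_sizes_mm = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    passing_sieving = np.array([12.0, 35.0, 68.0, 90.0, 100.0])   # manual sieving
    passing_camera = np.array([20.0, 48.0, 80.0, 96.0, 100.0])    # bottom ash camera system

    def d50(sizes_mm, percent_passing):
        """Median particle size: the size at which 50 % of the mass passes (linear interpolation)."""
        return float(np.interp(50.0, percent_passing, sizes_mm))

    print(d50(sieve_sizes_mm, passing_sieving), d50(sieve_sizes_mm, passing_camera))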
Abstract:
Genetic studies, such as linkage and association studies, have greatly improved our knowledge of the etiology of several diseases affecting human populations. Even though some ten thousand genetic studies have been performed on hundreds of diseases and other traits, a large part of their heritability remains unexplained. Over the past decade, several breakthroughs have been made in genomics. For example, the use of high-density comparative genomic hybridization microarrays demonstrated the genome-wide existence of copy-number variations and polymorphisms, which can now be detected using DNA microarrays or high-throughput sequencing. In addition, recent studies using high-throughput sequencing have shown that the majority of the variants present in an individual's exome are rare or even private to that individual. This led to the design of a new DNA microarray that makes it possible to genotype, quickly and at low cost, several thousand rare variants in a large set of individuals at once. In this context, the general objective of this thesis is the development of new methodologies and new high-performance bioinformatics tools for detecting, to high quality standards, copy-number variations and rare nucleotide variants in genetic studies. In the long term, these advances will help explain a larger part of the missing heritability of complex traits, thereby advancing knowledge of their etiology. An algorithm for partitioning copy-number polymorphisms was therefore designed, making it possible to use these structural variants in genetic linkage studies of family data. Next, an exploratory study characterized the various problems associated with genetic studies that use rare copy-number variations in unrelated individuals; this study was carried out in collaboration with the Wellcome Trust Centre for Human Genetics at the University of Oxford. Subsequently, the performance of genotype-calling algorithms was compared when used with a new DNA microarray containing a majority of rare markers. Finally, a bioinformatics tool for filtering genetic data efficiently and quickly was implemented. This tool produces higher-quality data, with better reproducibility of results, while reducing the chances of obtaining a false association.
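A minimal sketch of the kind of marker-level quality filters such a tool typically applies (call rate and minor allele frequency thresholds); the genotype matrix and thresholds are invented for illustration and are not the filters implemented in the thesis.

    import numpy as np

    # Hypothetical genotype matrix: rows = samples, columns = markers,
    # values = allele counts 0/1/2, with -1 marking a missing call
    genotypes = np.array([
        [0, 2, 1, -1],
        [1, 2, 0, 1],
        [0, 1, 0, 1],
        [2, 2, -1, 0],
    ])

    missing = genotypes == -1
    marker_call_rate = 1 - missing.mean(axis=0)

    # Allele frequency per marker, ignoring missing calls, then folded to the minor allele
    masked = np.where(missing, np.nan, genotypes).astype(float)
    allele_freq = np.nanmean(masked, axis=0) / 2
    maf = np.minimum(allele_freq, 1 - allele_freq)

    # Keep markers with a high enough call rate and a non-trivial minor allele frequency
    keep = (marker_call_rate >= 0.95) & (maf >= 0.01)   # illustrative thresholds
    filtered = genotypes[:, keep]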