971 results for Data quality control
Abstract:
Background: The recent development of semi-automated techniques for staining and analyzing flow cytometry samples has presented new challenges. Quality control and quality assessment are critical when developing new high-throughput technologies and their associated information services. Our experience suggests that significant bottlenecks remain in the development of high-throughput flow cytometry methods for data analysis and display. In particular, data quality control and quality assessment are crucial steps in processing and analyzing high-throughput flow cytometry data. Methods: We propose a variety of graphical exploratory data analytic tools for exploring ungated flow cytometry data. We have implemented a number of specialized functions and methods in the Bioconductor package rflowcyt. We demonstrate the use of these approaches by investigating two independent sets of high-throughput flow cytometry data. Results: We found that graphical representations can reveal substantial non-biological differences between samples. Empirical cumulative distribution function (ECDF) plots and summary scatterplots were especially useful for rapidly identifying problems not found by manual review. Conclusions: Graphical exploratory data analytic tools are a quick and useful means of assessing data quality. We propose that the described visualizations be used as quality assessment tools and, where possible, for quality control.
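As a rough illustration of the ECDF idea (the paper's own tools are in the R/Bioconductor package rflowcyt, not reproduced here), the following Python sketch overlays empirical CDFs of one channel for two samples; the sample names, the "FSC-H" channel and the simulated values are hypothetical, and a horizontal offset between replicates is the kind of non-biological difference the abstract describes.

```python
# Minimal sketch of ECDF-based quality screening for ungated flow
# cytometry samples. Illustrative only: sample names, the "FSC-H"
# channel and the simulated values are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

def ecdf(values):
    """Return sorted values and their empirical cumulative probabilities."""
    x = np.sort(values)
    y = np.arange(1, len(x) + 1) / len(x)
    return x, y

rng = np.random.default_rng(0)
samples = {
    "plate1_A01": rng.normal(500, 80, 10_000),  # typical sample
    "plate1_A02": rng.normal(650, 80, 10_000),  # suspicious shift
}

for name, fsc in samples.items():
    x, y = ecdf(fsc)
    plt.step(x, y, where="post", label=name)

plt.xlabel("FSC-H (arbitrary units)")
plt.ylabel("Empirical CDF")
plt.legend()
plt.title("ECDF overlay: replicate samples should track each other")
plt.show()
```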
Abstract:
An array of Bio-Argo floats equipped with radiometric sensors has recently been deployed in various open ocean areas representative of the diversity of trophic and bio-optical conditions prevailing in so-called Case 1 waters. Around solar noon on almost every day, each float acquires 0-250 m vertical profiles of Photosynthetically Available Radiation and downward irradiance at three wavelengths (380, 412 and 490 nm). To date, more than 6500 profiles have been acquired for each radiometric channel. Because these radiometric data are collected outside the operator's control and regardless of meteorological conditions, specific, automatic data processing protocols have to be developed. Here, we present a data quality-control procedure aimed at verifying profile shapes and enabling near-real-time data distribution. The procedure is specifically designed to: 1) identify the main measurement issues (dark signal, atmospheric clouds, spikes and wave-focusing occurrences); and 2) validate the final data with a hierarchy of tests to ensure they are fit for scientific use. The procedure, adapted to each of the four radiometric channels, flags each profile in a way compliant with the data management procedure used by the Argo program. The new protocols identify the main perturbations in the light field with good performance over the whole dataset, highlighting their potential applicability at the global scale. Finally, comparison with modeled surface irradiances allows the accuracy of quality-controlled irradiance measurements to be assessed and any evolution over the float lifetime due to biofouling or instrumental drift to be identified.
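As a rough illustration of one of the checks listed above, the Python sketch below flags spikes in a synthetic downward-irradiance profile by comparing each point with a running median in log space; the window size and the factor-of-two criterion are illustrative assumptions, not the published tests.

```python
# Minimal sketch of a spike test for a radiometric profile.
# Assumptions: a 5-point running median and a factor-of-two
# deviation criterion (the operational thresholds differ).
import numpy as np

def flag_spikes(ed, window=5, max_ratio=2.0):
    """Flag points whose irradiance differs from a running median
    by more than a factor of `max_ratio` (computed in log space)."""
    x = np.log(ed)
    half = window // 2
    n = len(x)
    med = np.array([np.median(x[max(0, i - half):min(n, i + half + 1)])
                    for i in range(n)])
    return np.abs(x - med) > np.log(max_ratio)

rng = np.random.default_rng(42)
depth = np.arange(0.0, 250.0, 2.0)               # 0-250 m, 2 m steps
ed490 = 1.2 * np.exp(-0.04 * depth)              # smooth decay at 490 nm
ed490 *= rng.lognormal(0.0, 0.02, depth.size)    # mild sensor noise
ed490[[30, 80]] *= 3.0                           # wave-focusing-like spikes
bad = flag_spikes(ed490)
print(f"{bad.sum()} point(s) flagged at depths:", depth[bad])
```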
Abstract:
The Gaia space mission is a major project for the European astronomical community. The processing and analysis of the huge data flow incoming from Gaia is a challenging task and the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of both the different observing sites involved and the huge number of frames expected (≈100,000), it is essential to maintain maximum homogeneity in data quality, acquisition and treatment, and particular care must be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. The field I was personally responsible for, however, was photometry, in particular relative photometry for the production of short-term light curves. In this context I defined and tested a semi-automated pipeline for the pre-reduction of SPSS imaging data and the production of aperture photometry catalogues ready for further analysis. A series of semi-automated quality control criteria are included in the pipeline at various levels, from pre-reduction, to aperture photometry, to light-curve production and analysis.
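One quality criterion such a pipeline needs is a constancy check on the short-term light curves, since flux standards must be photometrically stable. The Python sketch below applies a reduced chi-square test against a constant (weighted-mean) model; the rejection threshold and the simulated light curves are hypothetical, not the thesis's actual criteria.

```python
# Minimal sketch of a light-curve constancy check for standard-star
# candidates. The chi-square threshold of 3 is an assumption.
import numpy as np

def is_constant(mag, mag_err, chi2_red_max=3.0):
    """Reduced chi-square of a light curve against its weighted mean;
    large values indicate variability."""
    w = 1.0 / mag_err**2
    mean_mag = np.sum(w * mag) / np.sum(w)
    chi2_red = np.sum(w * (mag - mean_mag)**2) / (len(mag) - 1)
    return chi2_red <= chi2_red_max, chi2_red

rng = np.random.default_rng(1)
err = np.full(50, 0.01)                              # 10 mmag errors
steady = 14.20 + rng.normal(0, 0.01, 50)             # constant star
variable = (14.20 + 0.05 * np.sin(np.linspace(0, 6, 50))
            + rng.normal(0, 0.01, 50))               # low-amplitude variable

for name, lc in [("steady", steady), ("variable", variable)]:
    ok, chi2 = is_constant(lc, err)
    verdict = "keep" if ok else "reject"
    print(f"{name}: reduced chi2 = {chi2:.1f} -> {verdict} as SPSS candidate")
```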
Abstract:
Cover title: Quality control of wind profiler data. Wind profiler training manual number two.
Abstract:
With the construction of operational oceanography systems, the need for real-time data has become increasingly important. Much work has been done in the past, within National Oceanographic Data Centres (NODCs) and the International Oceanographic Data and Information Exchange (IODE), to standardise delayed-mode quality control procedures. For quality control procedures applicable in real time (within hours to at most a week of acquisition), which must therefore run automatically, some recommendations were established for physical parameters, but mainly within individual projects and without consolidation across initiatives. Over the past ten years the EuroGOOS community has been working on such procedures within international programs such as Argo, OceanSITES and GOSUD, and within EC projects such as Mersea, MFSTEP, FerryBox, ECOOP and MyOcean. In collaboration with the FP7 SeaDataNet project, which is standardizing delayed-mode quality control procedures in NODCs, and the FP7 GMES MyOcean project, which is standardizing near-real-time quality control procedures for operational oceanography, the DATA-MEQ working group put together this document to summarize the recommendations for near-real-time QC procedures that it judged mature enough to be advertised and recommended to EuroGOOS.
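Two tests that such recommendations typically include are a global range check and a point-wise spike test of the general form used in the Argo real-time QC manual. The Python sketch below illustrates both on a synthetic temperature profile; the thresholds are illustrative assumptions (operational values vary by parameter and depth).

```python
# Minimal sketch of two near-real-time QC tests for a temperature
# profile. Thresholds are illustrative, not the operational values.
import numpy as np

def global_range_test(temp, tmin=-2.5, tmax=40.0):
    """Flag values outside a plausible ocean-temperature range."""
    return (temp < tmin) | (temp > tmax)

def spike_test(temp, threshold=6.0):
    """Flag interior points that deviate sharply from both neighbors:
    |V2 - (V1 + V3)/2| - |(V3 - V1)/2| > threshold."""
    flags = np.zeros(len(temp), dtype=bool)
    v1, v2, v3 = temp[:-2], temp[1:-1], temp[2:]
    test = np.abs(v2 - (v1 + v3) / 2.0) - np.abs((v3 - v1) / 2.0)
    flags[1:-1] = test > threshold
    return flags

profile = np.array([22.0, 21.5, 21.0, 35.0, 20.2, 19.8, 19.5])  # one spike
bad = global_range_test(profile) | spike_test(profile)
print("QC flags:", bad)  # the 35.0 value should be flagged
```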
Abstract:
BACKGROUND: Previous publications have documented the damage caused to red blood cells (RBCs) irradiated with X-rays produced by a linear accelerator and with gamma rays derived from a Cs-137 source. The biologic effects on RBCs of gamma rays from a Co-60 source, however, have not been characterized. STUDY DESIGN AND METHODS: This study investigated the effect of 3000 and 4000 cGy on the in vitro properties of RBCs preserved with a preservative solution and irradiated with a cobalt teletherapy unit. A thermal device equipped with a data acquisition system was used to maintain and monitor the blood temperature during irradiation. The device was rotated at 2 r.p.m. in the irradiation beam by means of an automated system. The spatial distribution of the absorbed dose over the irradiated volume was obtained with a phantom and thermoluminescent dosimeters (TLDs). Levels of Hb, K+, and Cl- were assessed by spectrophotometric techniques over a period of 45 days. The change in the topology of the RBC membrane was investigated by flow cytometry. RESULTS: Irradiation caused significant changes in the extracellular levels of K+ and Hb and in the organizational structure of the phospholipid bilayer of the RBC membrane. Blood temperature ranged from 2 to 4 degrees C during irradiation. Rotation at 2 r.p.m. distributed the dose homogeneously (92%-104%) and did not damage the RBCs. CONCLUSIONS: The method used to store the blood bags during irradiation guaranteed that all damage caused to the cells was exclusively due to the action of radiation at the doses applied. It was demonstrated that prolonged storage of Co-60-irradiated RBCs results in loss of membrane phospholipid asymmetry, exposing phosphatidylserine (PS) on the cells' surface in a time- and dose-dependent manner, which can reduce the in vivo recovery of these cells. A time- and dose-dependent effect on extracellular K+ and plasma free Hb levels was also observed. The magnitude of all these effects, however, does not appear to be clinically important and supports the storage of irradiated RBC units for at least 28 days.
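The reported 92%-104% homogeneity corresponds to each dosimeter reading expressed as a percentage of the nominal dose. The Python sketch below reproduces that arithmetic with hypothetical TLD readings and an assumed 90-110% acceptance band (the study does not state its tolerance).

```python
# Minimal sketch of a dose-homogeneity check from TLD readings.
# Readings and the 90-110% band are hypothetical assumptions.
nominal_dose_cgy = 3000.0
tld_readings_cgy = [2760.0, 2910.0, 3000.0, 3060.0, 3120.0]

percent_of_nominal = [100.0 * r / nominal_dose_cgy for r in tld_readings_cgy]
lo, hi = min(percent_of_nominal), max(percent_of_nominal)
print(f"dose spread: {lo:.0f}%-{hi:.0f}% of nominal")      # 92%-104%
print("homogeneity acceptable:",
      all(90.0 <= p <= 110.0 for p in percent_of_nominal))
```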
Abstract:
Objectives - To review available guidance for quality assurance (QA) in mammography and discuss its contribution to harmonising practices worldwide. Methods - A literature search of different sources was performed to identify guidance documents for QA in mammography issued worldwide by international bodies, healthcare providers and professional/scientific associations. The guidance documents identified were reviewed, and a selection was compared for type of guidance (clinical/technical), technology and proposed QA methodologies, focusing on dose and image quality (IQ) performance assessment. Results - Fourteen protocols (targeting conventional and digital mammography) were reviewed. All included recommendations for testing the acquisition, processing and display systems associated with mammographic equipment. All guidance reviewed highlighted the importance of dose assessment and of testing the Automatic Exposure Control (AEC) system. Recommended tests for assessing IQ showed variations in the proposed methodologies, with testing focused on the assessment of low-contrast detection, spatial resolution and noise. QC of image display is recommended following the American Association of Physicists in Medicine guidelines. Conclusions - The existing QA guidance for mammography is derived from key documents (American College of Radiology and European Union guidelines) and proposes similar tests, despite variations in detail and methodology. Studies reporting QA data should provide detail on the experimental technique to allow robust comparison of data. Countries aiming to implement a mammography QA programme may select/prioritise tests depending on the available technology and resources.
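As an example of the kind of IQ measurement such protocols prescribe, the Python sketch below computes a contrast-to-noise ratio (CNR) from two phantom-image regions of interest; the ROIs and pixel values are hypothetical, and CNR is only one of several metrics the reviewed documents recommend.

```python
# Minimal sketch of a CNR measurement on simulated phantom ROIs.
# The pixel statistics are hypothetical.
import numpy as np

def cnr(signal_roi, background_roi):
    """CNR = (mean_signal - mean_background) / std_background."""
    return ((signal_roi.mean() - background_roi.mean())
            / background_roi.std(ddof=1))

rng = np.random.default_rng(2)
background = rng.normal(1000.0, 15.0, size=(50, 50))  # background ROI
signal = rng.normal(1075.0, 15.0, size=(50, 50))      # contrast-disc ROI
print(f"CNR = {cnr(signal, background):.1f}")          # ~5 for these values
```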
Abstract:
The capacity to use geologic materials (soil and rock) available in the surrounding environment is inherent to human civilization and has contributed to the evolution of societies throughout history. The use of these materials in the construction of structures such as houses, roads, railways and dams spurred the improvement of socioeconomic and environmental conditions. Reports of structural problems in embankments can be found throughout history, and a considerable number of those records can be linked to inadequate compaction, demonstrating the importance of guaranteeing suitable soil compaction quality. Various methodologies and specifications for on-site compaction quality control of earthworks, based on fill moisture content and dry unit weight, were developed during the 20th century. Two widely known methodologies are the conventional and nuclear techniques. The conventional methods are based on the field sand cone test (or similar) and the sampling of material for laboratory testing to evaluate the fill dry unit weight and water content. The nuclear techniques measure both parameters in the field using a nuclear density gauge. A topic under discussion in the geotechnical community, particularly in Portugal, is the comparison between the accuracy of nuclear gauge and sand cone test results for assessing the compaction and density ratio of earth fills, particularly for dams. The main purpose of this dissertation is to compare the two techniques. The data used were acquired during compaction quality control operations at the Coutada/Tamujais dam trial embankment and core construction; this is a 25 m high earth dam located in Vila Velha de Rodão, Portugal. To analyse the spatial distribution of the compaction parameters (water content and compaction ratio), a 3D model was also developed. The main results are discussed, and some considerations are put forward on the suitability of both techniques for ensuring fill compaction quality and on additional research to complement the conclusions obtained.
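The arithmetic underlying both techniques is the same once the bulk density and water content are known: the dry unit weight is the bulk unit weight divided by (1 + w), and the degree of compaction compares it with the laboratory (Proctor) maximum. The Python sketch below shows this with hypothetical field values.

```python
# Minimal sketch of compaction-control arithmetic.
# All input values are hypothetical.
def dry_unit_weight(bulk_unit_weight_kn_m3, water_content):
    """gamma_d = gamma / (1 + w), with w as a decimal fraction."""
    return bulk_unit_weight_kn_m3 / (1.0 + water_content)

def degree_of_compaction(gamma_d_field, gamma_d_max_lab):
    """Field dry unit weight as a percentage of the Proctor maximum."""
    return 100.0 * gamma_d_field / gamma_d_max_lab

gamma_bulk = 20.6   # kN/m3, e.g. from a sand cone test (mass / hole volume)
w = 0.12            # water content from oven drying (12%)
gamma_d = dry_unit_weight(gamma_bulk, w)                 # ~18.4 kN/m3
rc = degree_of_compaction(gamma_d, gamma_d_max_lab=19.0)
print(f"dry unit weight = {gamma_d:.1f} kN/m3, compaction ratio = {rc:.0f}%")
print("meets a 95% specification:", rc >= 95.0)
```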
Abstract:
Objective: We aimed to critically evaluate the importance of quality control (QC) and quality assurance (QA) strategies in the routine work of uterine cervix cytology. Study Design: We reviewed the main principles of QC and QA already implemented worldwide, discussed their positive aspects and limitations, and proposed alternatives where pertinent. Results: After highlighting the main historical developments, a literature review is presented, followed by a critical evaluation of the principal innovations in screening programmes, with recommendations. Conclusions: Based on the analysed data, QC and QA are two essential arms that support the quality of a screening programme.