964 results for Genetic data quality control


Relevance: 100.00%

Publisher:

Abstract:

This paper presents a framework for considering quality control of volunteered geographic information (VGI). Different issues need to be considered during the conception, acquisition and post-acquisition phases of VGI creation. These include collecting metadata on the volunteer, providing suitable training, giving corrective feedback during the mapping process and using control data, among others. Two examples of VGI data collection are then considered with respect to this quality control framework, i.e., VGI data collection by National Mapping Agencies and by the most recent Geo-Wiki tool, a game called Cropland Capture. Although good practices are beginning to emerge, there is still a need for the development and sharing of best practice, especially if VGI is to be integrated with authoritative map products or used for calibration and/or validation of land cover in the future.

Relevance: 100.00%

Publisher:

Abstract:

This thesis chronicles the design and implementation of an Internet/Intranet- and database-based application for the quality control of hurricane surface wind observations. A quality control session consists of selecting the desired observation types to be viewed and determining a storm-track-based time window for viewing the data. All observations of the selected types are then plotted in a storm-relative view for the chosen time window, and geography is positioned for the storm-center time about which an objective analysis can be performed. Users then make decisions about data validity through visual nearest-neighbor comparison and inspection. The project employed an Object-Oriented iterative development method from beginning to end, and its implementation primarily features the Java programming language.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Although most gastrointestinal stromal tumours (GIST) carry oncogenic mutations in KIT exons 9, 11, 13 and 17, or in platelet-derived growth factor receptor alpha (PDGFRA) exons 12, 14 and 18, around 10% of GIST are free of these mutations. Genotyping and accurate detection of KIT/PDGFRA mutations in GIST are becoming increasingly useful for clinicians in the management of the disease. METHODS: To evaluate and improve laboratory practice in GIST mutation detection, we developed a mutational screening quality control program. Eleven laboratories were enrolled in this program and 50 DNA samples were analysed, each of them by four different laboratories, giving 200 mutational reports. RESULTS: In total, eight mutations were not detected by at least one laboratory, and one false positive result was reported in one sample. Thus, the mean global rate of error with clinical implication, based on 200 reports, was 4.5%. The detection rate for specific polymorphisms varied from 0% to 100%, depending on the laboratory. The way mutations were reported was very heterogeneous, and some errors were detected. CONCLUSION: This study demonstrated that such a program is necessary for laboratories to improve the quality of the analysis, because an error rate of 4.5% may have clinical consequences for the patient.
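The 4.5% figure quoted in this abstract follows directly from the reported counts: nine erroneous reports (eight missed mutations plus one false positive) out of 200. A minimal sketch of that arithmetic (the function name is ours, for illustration only):

```python
# Sketch of the error-rate arithmetic reported in the abstract:
# 8 mutations missed by at least one laboratory plus 1 false positive,
# out of 200 mutational reports (50 samples x 4 laboratories each).
def clinical_error_rate(false_negatives, false_positives, total_reports):
    """Return the global error rate with clinical implication, in percent."""
    return 100.0 * (false_negatives + false_positives) / total_reports

rate = clinical_error_rate(false_negatives=8, false_positives=1, total_reports=200)
```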

Relevance: 100.00%

Publisher:

Abstract:

Recommendation for Oxygen Measurements from Argo Floats: Implementation of In-Air-Measurement Routine to Assure Highest Long-term Accuracy

As Argo has entered its second decade and chemical/biological sensor technology is improving constantly, the marine biogeochemistry community is starting to embrace the successful Argo float program. An augmentation of the global float observatory, however, has to follow rather stringent constraints regarding sensor characteristics as well as data processing and quality control routines. Owing to the fairly advanced state of oxygen sensor technology and the high scientific value of oceanic oxygen measurements (Gruber et al., 2010), an expansion of the Argo core mission to routine oxygen measurements is perhaps the most mature and promising candidate (Freeland et al., 2010). In this context, SCOR Working Group 142 “Quality Control Procedures for Oxygen and Other Biogeochemical Sensors on Floats and Gliders” (www.scor-int.org/SCOR_WGs_WG142.htm) set out in 2014 to assess the current status of biogeochemical sensor technology with particular emphasis on float-readiness, to develop pre- and post-deployment quality control metrics and procedures for oxygen sensors, and to disseminate these procedures widely to ensure rapid adoption in the community.

Relevance: 100.00%

Publisher:

Abstract:

In April 2017, CMEMS plans to launch the WAVES near-real-time (NRT) products. This document focuses on the automatic real-time quality control (RTQC) of the collected wave data. The validation procedure includes the delayed-mode quality control of the data and will be specified in a separate guideline. To perform any kind of quality control on wave data, it is first necessary to know the nature of the measurements and the analysis performed on those measurements to obtain the wave parameters. For that reason, the next chapter is dedicated to presenting the usual wave analysis and the different parameters and estimators obtained.
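The wave parameters referred to here are conventionally derived from moments of the wave spectrum. As an illustrative sketch only (the guideline's own formulas are not quoted here), the common integral parameters Hm0 and Tm02 can be computed as:

```python
import math

def spectral_moment(freqs, spectrum, n):
    """n-th spectral moment m_n = integral of f^n * S(f) df (trapezoidal rule)."""
    total = 0.0
    for i in range(len(freqs) - 1):
        df = freqs[i + 1] - freqs[i]
        a = freqs[i] ** n * spectrum[i]
        b = freqs[i + 1] ** n * spectrum[i + 1]
        total += 0.5 * (a + b) * df
    return total

def wave_parameters(freqs, spectrum):
    """Standard integral parameters: Hm0 = 4*sqrt(m0), Tm02 = sqrt(m0/m2)."""
    m0 = spectral_moment(freqs, spectrum, 0)
    m2 = spectral_moment(freqs, spectrum, 2)
    return {"Hm0": 4.0 * math.sqrt(m0), "Tm02": math.sqrt(m0 / m2)}
```

An operational RTQC chain would typically apply range, spike, and stuck-value checks on the raw series before deriving such parameters.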

Relevance: 100.00%

Publisher:

Abstract:

Background: The present work aims at applying decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed in the framework of decision theory is whether to accept or reject a film lot of a radiology service. The probability of each decision, for a determined set of variables, was obtained from the selected films. Methods: Based on a radiology service routine, a decision probability function was determined for each considered group of combined characteristics. These characteristics were related to film quality control. These parameters were framed in a set of 8 possibilities, resulting in 256 possible decision rules. In order to determine a general utility function to assess the decision risk, we used a single parameter called r. The payoffs chosen were: diagnostic result (correct/incorrect), cost (high/low), and patient satisfaction (yes/no), resulting in eight possible combinations. Results: Depending on the value of r, more or less risk is associated with the decision-making. The utility function was evaluated in order to determine the probability of a decision. The decision was made with the opinions of patients or administrators from a radiology service center. Conclusion: The model is a formal quantitative approach to decision-making related to medical imaging quality, providing an instrument to discriminate what is really necessary in order to accept or reject a film or a film lot. The method presented herein can help to assess the risk level of an incorrect radiological diagnosis decision.
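The count of 256 decision rules follows from the eight binary payoff combinations: a rule assigns accept or reject to each combination, giving 2^8 = 256. A small sketch of that counting (labels ours):

```python
from itertools import product

# The three binary payoffs described in the abstract.
payoffs = list(product(["correct", "incorrect"],   # diagnostic result
                       ["high", "low"],            # cost
                       ["yes", "no"]))             # patient satisfaction

# A decision rule assigns accept/reject to each of the 8 combinations,
# so there are 2**8 = 256 possible decision rules.
n_rules = 2 ** len(payoffs)
```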

Relevance: 100.00%

Publisher:

Abstract:

This paper presents a novel adaptive control scheme, with improved convergence rate, for the equalization of harmonic disturbances such as engine noise. First, modifications for improving the convergence speed of the standard filtered-X LMS control are described. Equalization capabilities are then implemented, allowing the independent tuning of harmonics. Eventually, by providing the desired order vs. engine speed profiles, the pursued sound quality attributes can be achieved. The proposed control scheme is first demonstrated with a simple secondary path model and then experimentally validated with the aid of a vehicle mockup which is excited with engine noise. The engine excitation is provided by a real-time sound-quality-equivalent engine simulator. Stationary and transient engine excitations are used to assess the control performance. The results reveal that the proposed controller is capable of large order-level reductions (up to 30 dB) for stationary excitation, which allows a comfortable margin for equalization. The same holds for slow run-ups (> 15 s) thanks to the improved convergence rate. This margin, however, gets narrower with shorter run-ups (<= 10 s). (c) 2010 Elsevier Ltd. All rights reserved.
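For context, the baseline the authors modify is the standard filtered-X LMS algorithm. The following is a minimal single-channel sketch of that standard algorithm, not the paper's improved scheme; the pure-gain secondary path and all parameter values are illustrative assumptions:

```python
import math

# Minimal single-channel filtered-X LMS sketch (standard algorithm, not the
# authors' modified scheme; the secondary path here is a pure gain for brevity).
def fxlms(reference, disturbance, n_taps=8, mu=0.01, sec_path_gain=1.0):
    w = [0.0] * n_taps          # adaptive control filter
    x_buf = [0.0] * n_taps      # reference signal history
    errors = []
    for x, d in zip(reference, disturbance):
        x_buf = [x] + x_buf[:-1]
        y = sum(wi * xi for wi, xi in zip(w, x_buf))       # control output
        e = d + sec_path_gain * y                          # residual at sensor
        xf = [sec_path_gain * xi for xi in x_buf]          # filtered reference
        w = [wi - mu * e * xfi for wi, xfi in zip(w, xf)]  # LMS update
        errors.append(e)
    return errors

# Demo: cancel a synthetic single-tone "engine order".
ref = [math.sin(0.3 * n) for n in range(2000)]
dist = [0.8 * math.sin(0.3 * n) for n in range(2000)]
residual = fxlms(ref, dist)
```

With a tonal disturbance correlated with the reference, the residual decays as the filter converges; the paper's modifications target exactly this convergence speed.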

Relevance: 100.00%

Publisher:

Abstract:

Active control solutions appear to be a feasible approach to cope with the steadily increasing requirements for noise reduction in the transportation industry. Active controllers tend to be designed with a target on the sound pressure level reduction. However, the perceived control efficiency for the occupants can be more accurately assessed if psychoacoustic metrics are taken into account. Therefore, this paper aims to evaluate, numerically and experimentally, the effect of a feedback controller on the sound quality of a vehicle mockup excited with engine noise. The proposed simulation scheme is described and experimentally validated. The engine excitation is provided by a sound-quality-equivalent engine simulator, running on a real-time platform that delivers harmonic excitation as a function of the driving condition. The controller performance is evaluated in terms of specific loudness and roughness. It is shown that the use of a quite simple control strategy, such as velocity feedback, can result in satisfactory loudness reduction with slightly spread roughness, improving the overall perception of the engine sound. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Publisher:

Abstract:

A method was optimized for the analysis of omeprazole (OMZ) by ultra-high speed LC with diode array detection using a monolithic Chromolith Fast Gradient RP 18 endcapped column (50 × 2.0 mm i.d.). The analyses were performed at 30°C using a mobile phase consisting of 0.15% (v/v) trifluoroacetic acid (TFA) in water (solvent A) and 0.15% (v/v) TFA in acetonitrile (solvent B) under a linear gradient of 5 to 90% B in 1 min at a flow rate of 1.0 mL/min and detection at 220 nm. Under these conditions, the OMZ retention time was approximately 0.74 min. Validation parameters, such as selectivity, linearity, precision, accuracy, and robustness, showed results within the acceptance criteria. The method developed was successfully applied to OMZ enteric-coated pellets, showing that this assay can be used in the pharmaceutical industry for routine QC analysis. Moreover, the analytical conditions established allow for the simultaneous analysis of the OMZ metabolites 5-hydroxyomeprazole and omeprazole sulfone in the same run, showing that this method can be extended to other matrixes with adequate procedures for sample preparation.

Relevance: 100.00%

Publisher:

Abstract:

Background: The cerebrospinal fluid (CSF) biomarkers amyloid beta (Aβ)-42, total tau (T-tau), and phosphorylated tau (P-tau) demonstrate good diagnostic accuracy for Alzheimer's disease (AD). However, there are large variations in biomarker measurements between studies, and between and within laboratories. The Alzheimer's Association has initiated a global quality control program to estimate and monitor variability of measurements, quantify batch-to-batch assay variations, and identify sources of variability. In this article, we present the results from the first two rounds of the program. Methods: The program is open to laboratories using commercially available kits for Aβ, T-tau, or P-tau. CSF samples (aliquots of pooled CSF) are sent for analysis several times a year from the Clinical Neurochemistry Laboratory at the Mölndal campus of the University of Gothenburg, Sweden. Each round consists of three quality control samples. Results: Forty laboratories participated. Twenty-six used INNOTEST enzyme-linked immunosorbent assay kits, 14 used Luminex xMAP with the INNO-BIA AlzBio3 kit (both measure Aβ(1-42), P-tau(181P), and T-tau), and 5 used Meso Scale Discovery with the Aβ triplex (AβN-42, AβN-40, and AβN-38) or T-tau kits. The total coefficients of variation between the laboratories were 13% to 36%. Five laboratories analyzed the samples six times on different occasions. Within-laboratory precision differed considerably between biomarkers within individual laboratories. Conclusions: Measurements of CSF AD biomarkers show large between-laboratory variability, likely caused by factors related to analytical procedures and the analytical kits. Standardization of laboratory procedures and efforts by kit vendors to increase kit performance might lower variability and will likely increase the usefulness of CSF AD biomarkers. (C) 2011 The Alzheimer's Association. All rights reserved.
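The between-laboratory variability is summarized as total coefficients of variation (13% to 36%). As a sketch of how such a CV is computed (the example values are invented, not the study's data):

```python
from statistics import mean, stdev

def coefficient_of_variation(measurements):
    """Between-laboratory CV in percent: sample SD divided by the mean."""
    return 100.0 * stdev(measurements) / mean(measurements)

# Hypothetical Abeta-42 results (pg/mL) for one QC sample across laboratories.
abeta42 = [480.0, 520.0, 610.0, 455.0, 590.0, 505.0]
cv = coefficient_of_variation(abeta42)
```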

Relevance: 100.00%

Publisher:

Abstract:

Toxoplasma gondii causes severe disease in both humans and livestock, and its detection in meat after slaughtering requires PCR or biological tests. Meat packages contain retained exudate that could be used for serology due to its blood content, although similar studies have reported false negative results in such tests. We standardized an anti-T. gondii IgG ELISA in muscle juices from experimentally infected rabbits, with blood content determined by cyanhemoglobin spectrophotometry. IgG titers and immunoblotting profiles were similar in blood, serum, or meat juice after blood content correction. These assays were adequate regardless of storage time up to 120 days or freeze-thaw cycles, without false negative results. We also found one positive sample (1/74; 1.35%) in commercial Brazilian rabbit meat cuts by this assay. The blood content determination shows that ELISA of meat juice may be useful as a quality control tool for toxoplasmosis monitoring. (C) 2011 Elsevier Ltd. All rights reserved.
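The blood content correction can be pictured as scaling the juice titer by a hemoglobin-based dilution factor. This is an illustrative assumption about the form of the correction, not the paper's exact procedure:

```python
# Hypothetical correction: scale the meat-juice titer by the ratio of whole-blood
# hemoglobin to the hemoglobin measured in the juice (cyanhemoglobin method).
# Function name and formula are illustrative assumptions, not the study's.
def corrected_titer(juice_titer, juice_hb, blood_hb):
    """Estimate the equivalent whole-blood titer from a meat-juice reading."""
    if juice_hb <= 0:
        raise ValueError("hemoglobin concentration must be positive")
    return juice_titer * (blood_hb / juice_hb)
```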

Relevance: 100.00%

Publisher:

Abstract:

Many studies have used genetic markers to understand global migration patterns of our species. However, there are only a few studies of human migration on a local scale. We therefore researched migration dynamics in three Afro-Brazilian rural communities, using demographic data and ten Ancestry Informative Markers. In addition to the description of migration and marriage structures, we carried out genetic comparisons between the three populations, as well as between locals and migrants from each community. Genetic admixture analyses were conducted according to the gene-identity method, with Sub-Saharan Africans, Amerindians, and Europeans as parental populations. The three analyzed Afro-Brazilian rural communities comprised 16% to 30% migrants, most of them women. The age pyramid revealed a gap in the segment of men aged between 20 and 30 years. While endogamous marriages predominated, exogamous marriages were mainly patrilocal. Migration dynamics are apparently associated with matrimonial customs and other social practices of such communities. The impact of migration upon the populations' genetic composition was low but showed an increase in European alleles with a concomitant decrease in the Amerindian contribution. Admixture analysis evidenced a higher African contribution to the gene pool of the studied populations, followed by the contributions of Europeans and Amerindians, respectively.
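For intuition, single-locus admixture estimation in the two-parental case reduces to Bernstein's classic estimator; the gene-identity method used in the study handles three parental populations jointly over ten markers, generalizing this idea. A hedged sketch of the simple case:

```python
# Illustrative single-locus admixture estimate (Bernstein's classic estimator
# for two parental populations), not the gene-identity method used in the study.
def bernstein_admixture(p_hybrid, p_parent1, p_parent2):
    """Fraction of ancestry from parent1 given allele frequencies at one locus."""
    return (p_hybrid - p_parent2) / (p_parent1 - p_parent2)
```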

Relevance: 100.00%

Publisher:

Abstract:

Websites are nowadays the face of institutions, but they are often neglected, especially when it comes to content. In the present paper, we put forth an investigation whose final goal is the development of a model for the measurement of data quality in institutional websites of health units. To that end, we have carried out a bibliographic review of the available approaches for the evaluation of website content quality, in order to identify the most recurrent dimensions and attributes, and we are currently carrying out a Delphi method process, presently in its second stage, with the purpose of reaching an adequate set of attributes for the measurement of content quality.

Relevance: 100.00%

Publisher:

Abstract:

This article presents a research work whose goal was to achieve a model for the evaluation of data quality in institutional websites of health units in a broad and balanced way. We carried out a literature review of the available approaches for the evaluation of website content quality, in order to identify the most recurrent dimensions and attributes, and we also carried out a Delphi method process with experts in order to reach an adequate set of attributes and their respective weights for the measurement of content quality. The results obtained revealed a high level of consensus among the experts who participated in the Delphi process. Moreover, the different statistical analyses and techniques implemented are robust and lend confidence to our results and the consequent model obtained.
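A model of this kind typically combines attribute scores with the Delphi-derived weights into an overall content-quality index. As a sketch with invented attribute names and weights (the article's actual attribute set is not reproduced here):

```python
# Hypothetical weighted aggregation of attribute scores into a content-quality
# index; attribute names and weights are illustrative, not the study's.
def content_quality(scores, weights):
    """Weighted mean of attribute scores; weights are normalized to sum to 1."""
    total_w = sum(weights.values())
    return sum(scores[a] * w for a, w in weights.items()) / total_w

weights = {"accuracy": 0.4, "currency": 0.35, "completeness": 0.25}
scores = {"accuracy": 0.9, "currency": 0.6, "completeness": 0.8}
index = content_quality(scores, weights)
```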