935 results for Electronic data processing -- Quality control


Abstract:

The Quality Management Earthwork (QM-E) special provision was implemented on a pilot project to evaluate quality control (QC) and quality assurance (QA) testing in predominantly unsuitable soils. Control limits implemented on this pilot project included the following: a minimum of 95% relative compaction, moisture content within +/- 2% of optimum moisture content, soil strength with a dynamic cone penetrometer (DCP) index not exceeding 70 mm/blow, vertical uniformity with a variation in DCP index not exceeding 40 mm/blow, and lift thickness not exceeding the depth determined through construction of control strips. Four-point moving averages were used to allow for some variability in the measured parameter values. Management of the QC/QA data proved to be one of the most challenging aspects of the pilot project. Implementing the G-RAD data collection system has considerable potential to reduce the time required to develop and maintain QC/QA records for projects using the QM-E special provision. In many cases, results of a single Proctor test were used to establish control limits that were then applied for several months without retesting. While the data collected for the pilot project indicated that the DCP index control limits could be set more tightly, there is not enough evidence to support making a change. In situ borings, sampling, and testing in natural unsuitable cut material and compacted fill material revealed that, less than three months after the start of construction, the compacted fill had strength characteristics similar to those of the natural cut material.
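
The four-point moving-average acceptance check described above can be sketched in a few lines of code; this is a minimal illustration that assumes the control limits quoted in the abstract (95% relative compaction, DCP index of 70 mm/blow), while the variable names and sample readings are hypothetical.

    # Minimal sketch of a four-point moving-average QC check.
    # Control limits follow the abstract; sample readings are hypothetical.

    def moving_averages(values, window=4):
        """Return the moving averages of `values` over the given window."""
        return [sum(values[i - window + 1:i + 1]) / window
                for i in range(window - 1, len(values))]

    relative_compaction = [96.2, 94.8, 95.5, 96.0, 93.9, 95.1]   # percent
    dcp_index = [62.0, 71.5, 68.0, 65.0, 74.0, 66.0]             # mm/blow

    for avg in moving_averages(relative_compaction):
        if avg < 95.0:
            print(f"Compaction moving average {avg:.1f}% is below the 95% limit")

    for avg in moving_averages(dcp_index):
        if avg > 70.0:
            print(f"DCP moving average {avg:.1f} mm/blow exceeds the 70 mm/blow limit")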

Abstract:

This report describes test results from a full-scale embankment pilot study conducted in Iowa. The intent of the pilot project was to field-test and refine the proposed soil classification system and construction specifications developed in Phase II of this research and to evaluate the feasibility of implementing a contractor quality control (QC) and Iowa DOT quality assurance (QA) program for earthwork grading in the future. One of the primary questions for Phase III is “Was embankment quality improved?” The project involved a “quality conscious” contractor, well-qualified and experienced Iowa Department of Transportation field personnel, a good QC consultant technician, and some of our best soils in the state. If the answer to the above question is “yes” for this project, it would unquestionably be “yes” for other projects as well. The answer is yes, the quality was improved, even for this project, as evidenced by dynamic cone penetrometer test data and the amount of disking required to reduce the moisture content to within acceptable control limits (approximately 29% of soils by volume required disking). Perhaps as important is that we know what quality we have. Increased QC/QA field testing, however, increases construction costs, as expected. The quality management-earthwork program added $0.03 per cubic meter, or 1.6% of the total construction cost. Disking added about $0.04 per cubic meter, or 1.7% of the total project cost. In our opinion, this is a nominal cost increase for the improvement in quality. It is envisioned that future contractor innovations could negate this increase. The Phase III results show that the new soil classification system and the proposed field test methods worked well during the Iowa Department of Transportation soils design phase and during the construction phase. Recommendations are provided for future implementation of the results of this study by city, county, and state agencies.

Abstract:

Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring and requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and from the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. The calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d' as a function of detector air kerma, and a linear relationship was found between d' and the contrast-to-noise ratio. The threshold thickness values used to specify acceptable performance in the European Guidelines for 0.10 mm and 0.25 mm diameter discs were equivalent to calculated threshold detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in image quality specification, quality control and optimization of digital x-ray systems for screening mammography.
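
For orientation, a common textbook form of the NPWE detectability index is reproduced below in LaTeX; the normalization and exact task function used in this study may differ, so this should be read as an illustrative expression rather than the authors' exact formula.

    \[
      d'_{\mathrm{NPWE}} \;=\;
      \frac{\displaystyle \iint C^{2}\,\lvert S(u,v)\rvert^{2}\,
            \mathrm{MTF}^{2}(u,v)\,E^{2}(u,v)\,\mathrm{d}u\,\mathrm{d}v}
           {\displaystyle \Bigl[\iint C^{2}\,\lvert S(u,v)\rvert^{2}\,
            \mathrm{MTF}^{2}(u,v)\,E^{4}(u,v)\,\mathrm{NNPS}(u,v)\,\mathrm{d}u\,\mathrm{d}v\Bigr]^{1/2}}
    \]

Here S(u,v) is the Fourier transform of the disc to be detected, C its contrast, MTF the pre-sampling modulation transfer function, NNPS the normalized noise power spectrum, and E(u,v) the eye filter of the model observer.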

Abstract:

The aim of radiotherapy is to deliver enough radiation to the tumour to achieve maximum tumour control in the irradiated volume, with as few serious complications as possible and with the dose to normal tissue kept as low as possible. The quality of radiotherapy is essential for optimal treatment, and quality control serves to reduce bias in clinical trials by avoiding possible major deviations. Quality assurance and quality control programs have been developed since the 1980s in the large European (EORTC, GORTEC) and American (RTOG) cooperative groups in radiation oncology. We emphasize here the importance of quality assurance in radiotherapy, review the current status of this domain, and discuss the criteria for quality control, especially for current clinical trials within GORTEC.

Abstract:

The aim of this work was to compare grade change control methods on a paper machine. The comparison covered Metso Automation's IQGradeChange grade change software and grade changes performed manually by the operators. To obtain a comprehensive data set, grade change data were collected from the paper machine over a period of seven months. The collected grade changes were analyzed in a Matlab environment to determine the grade change times. In addition, the production change rate ((t/h)/min) between the old and the new grade was calculated for each grade change in order to assess the extent of the grade change and the performance of the different grade change methods. During the trial period, data on a total of 130 grade changes were collected from the paper machine. Of these, 58 were performed with the IQGradeChange software and 72 were made manually by the operators. Of the 130 collected grade changes, 27 ended in a web break. One of the tasks was therefore to examine the grade changes that ended in a break.
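
A minimal sketch of the production-change calculation mentioned above is given below; the actual analysis was carried out in Matlab on the mill data, and the time stamps and production rates here are purely hypothetical.

    # Hypothetical sketch: production change rate ((t/h)/min) over one grade change.
    # Values are illustrative, not taken from the study.

    grade_change_start = 0.0      # minutes
    grade_change_end = 18.0       # minutes, when the new grade is reached
    production_old = 42.0         # t/h at the old grade
    production_new = 48.5         # t/h at the new grade

    duration = grade_change_end - grade_change_start
    ramp_rate = (production_new - production_old) / duration   # (t/h)/min
    print(f"Grade change took {duration:.0f} min, "
          f"production change rate {ramp_rate:.2f} (t/h)/min")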

Abstract:

Automated genome sequencing and annotation, together with large-scale gene expression measurement methods, generate a massive amount of data for model organisms such as human and mouse. Searching for gene-specific or organism-specific information throughout all the different databases has become a very difficult task, and often results in fragmented and unrelated answers. A database that federates and integrates genomic and transcriptomic data can greatly improve search speed as well as the quality of the results by allowing a direct comparison of expression results obtained by different techniques. The main goal of this project, called the CleanEx database, is thus to provide access to public gene expression data via unique gene names and to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-dataset comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single gene expression experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of genes from model organisms, such as human and mouse. The completely automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resources, such as cDNA clones or Affymetrix probe sets. The Affymetrix mapping files are accessible as text files, for further use in external applications, and as individual entries via the web-based interfaces. The CleanEx web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria, as well as cross-dataset analysis tools and cross-chip gene comparison. These tools have proven to be very efficient in expression data comparison and even, to a certain extent, in the detection of differentially expressed splice variants. The CleanEx flat files and tools are available online at http://www.cleanex.isb-sib.ch/.
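
The target-to-gene mapping behind the weekly gene index can be pictured with a toy sketch such as the one below; all identifiers and table contents are hypothetical, and the real CleanEx pipeline relies on external resources such as UniGene and RefSeq.

    # Toy sketch of building a gene index from permanent expression targets.
    # All identifiers and mappings are hypothetical.

    # Each expression experiment refers to a permanent target (clone, probe set, ...).
    experiments = {
        "EXP001": "TARGET:clone_ab123",
        "EXP002": "TARGET:affy_987_at",
        "EXP003": "TARGET:clone_cd456",
    }

    # Periodically refreshed mapping of targets to official gene names.
    target_to_gene = {
        "TARGET:clone_ab123": "TP53",
        "TARGET:affy_987_at": "TP53",
        "TARGET:clone_cd456": "BRCA1",
    }

    # Gene index: official gene name -> experiments whose target maps to it.
    gene_index = {}
    for exp_id, target in experiments.items():
        gene = target_to_gene.get(target)
        if gene is not None:
            gene_index.setdefault(gene, []).append(exp_id)

    print(gene_index)   # {'TP53': ['EXP001', 'EXP002'], 'BRCA1': ['EXP003']}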

Abstract:

Reference collections of multiple Drosophila lines with accumulating collections of "omics" data have proven especially valuable for the study of population genetics and complex trait genetics. Here we present a description of a resource collection of 84 strains of Drosophila melanogaster whose genome sequences were obtained after 12 generations of full-sib inbreeding. The initial rationale for this resource was to foster development of a systems biology platform for modeling metabolic regulation by the use of natural polymorphisms as perturbations. As reference lines, they are amenable to repeated phenotypic measurements, and a large collection of metabolic traits has already been assayed. Another key feature of these strains is their widespread geographic origin: Beijing, Ithaca, the Netherlands, Tasmania, and Zimbabwe. After 12.5× coverage of paired-end Illumina sequence reads had been obtained, SNP and indel calls were made with the GATK platform. Thorough quality control was enabled by deep sequencing one line to >100×, and single-nucleotide polymorphisms and indels were validated using ddRAD-sequencing as an orthogonal platform. In addition, a series of preliminary population genetic tests were performed with these single-nucleotide polymorphism data to assess data quality. We found 83 segregating inversions among the lines and, as expected, these were especially abundant in the African sample. We anticipate that this collection will make a useful addition to the set of reference D. melanogaster strains, thanks to its geographic structuring and unusually high level of genetic diversity.
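
The orthogonal validation step can be pictured as a simple concordance check between two call sets, sketched below with hypothetical positions and genotypes; the study's actual pipeline used GATK calls and ddRAD-sequencing data.

    # Hypothetical concordance check between two genotype call sets at shared sites.
    # Positions and genotypes are illustrative only.

    illumina_calls = {("2L", 10432): "A/T", ("2R", 20981): "C/C", ("3L", 55210): "G/A"}
    ddrad_calls    = {("2L", 10432): "A/T", ("2R", 20981): "C/T", ("X", 77001): "T/T"}

    shared = set(illumina_calls) & set(ddrad_calls)
    matches = sum(1 for site in shared if illumina_calls[site] == ddrad_calls[site])
    print(f"Concordance at {len(shared)} shared sites: {matches / len(shared):.1%}")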

Abstract:

BACKGROUND & AIMS: Trace elements (TE) are involved in immune and antioxidant defences, which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality control study aimed at assessing the economic impact of computer-reminded blood sampling versus risk-guided, on-demand monitoring of plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of 2 cohorts of patients admitted during 6-month periods in 2006 and 2009 to the ICU of a university hospital. Inclusion criteria: receiving intravenous micronutrient supplements and/or having a TE sampling during the ICU stay. The TE samplings were triggered by a computerized reminder in 2006 versus guided by nutritionists in 2009. RESULTS: During the 2 periods, 636 patients met the inclusion criteria out of 2406 consecutive admissions, representing 29.7% and 24.9%, respectively, of the periods' admissions. The 2009 patients had higher SAPS2 scores (p = 0.02) and lower BMI compared to 2006 (p = 0.007). The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of the cohort, resulting in a 55% cost reduction. CONCLUSIONS: Monitoring of TE concentrations guided by a nutritionist reduced the sampling frequency and targeted the sickest, high-risk patients requiring adaptation of the nutritional prescription. This approach led to a cost reduction compared with an automated sampling prescription.

Abstract:

The emergence of electronic cigarettes (e-cigs) has given cannabis smokers a new method of inhaling cannabinoids. E-cigs differ from traditional marijuana cigarettes in several respects. First, it is assumed that vaporizing cannabinoids at lower temperatures is safer because it produces smaller amounts of toxic substances than the hot combustion of a marijuana cigarette. Recreational cannabis users can discreetly "vape" deodorized cannabis extracts with minimal annoyance to the people around them and less chance of detection. There are nevertheless several drawbacks worth mentioning: although manufacturing commercial (or homemade) cannabinoid-enriched electronic liquids (e-liquids) requires lengthy, complex processing, some are readily available on the Internet despite their lack of quality control, expiry dates, storage conditions and, above all, any toxicological and clinical assessment. Besides these safety problems, the regulatory situation surrounding e-liquids is often unclear. More simply, ground cannabis flowering heads or concentrated, oily THC extracts (such as butane honey oil, or BHO) can be vaped in specially designed, pen-sized marijuana vaporizers. Analysis of a commercial e-liquid rich in cannabidiol showed that it contained a smaller dose of active ingredient than advertised; testing our laboratory-made, purified BHO, however, confirmed that it could be vaped in an e-cig to deliver a psychoactive dose of THC. The health consequences specific to vaping these cannabis preparations remain largely unknown and speculative owing to the absence of comprehensive, robust scientific studies. The most significant health concerns involve the vaping of cannabinoids by children and teenagers. E-cigs could provide an alternative gateway to cannabis use for young people. Furthermore, vaping cannabinoids could lead to environmental and passive contamination.

Abstract:

Electronic cigarettes are devices that produce a vapour containing propylene glycol, flavourings and rapidly delivered nicotine. In Switzerland, 6.7% of the population, mainly smokers, have tried the electronic cigarette, while 0.1% use it daily. Despite the uncertainty due to the low level of evidence, electronic cigarettes might be effective for smoking cessation and reduction. The safety of electronic cigarettes has been demonstrated in the short term but not in the long term; however, their toxicity is likely to be much lower than that of tobacco. Use of electronic cigarettes by non-smokers and by young people who do not smoke is low and seems unlikely to lead them to tobacco use. Recommended public health measures include product regulation with quality control, a ban on use in public places, and prohibition of advertising and of sales to minors.

Abstract:

Objective: To develop procedures to ensure consistent printing quality of digital images, by means of quantitative analysis of hardcopies based on a standard image. Materials and Methods: The characteristics of DI-ML mammography films and DI-HL general-purpose films were studied with the QC-Test, using different processing techniques on a FujiFilm®-DryPix4000 printer. Software was developed for sensitometric evaluation, generating a digital image that includes a gray scale and a bar pattern for evaluating contrast and spatial resolution. Results: Mammography films showed a maximum optical density of 4.11 and general-purpose films, 3.22. The digital image was built with a 33-step wedge scale and a high-contrast bar pattern (1 to 30 lp/cm) for spatial resolution evaluation. Conclusion: Mammography films presented higher maximum optical density and contrast resolution than general-purpose films. The digital processing technique used could only change the pixel values of the image matrix and did not affect the printing standard. The proposed standard digital image allows greater control of the relationship between pixel values and the optical density obtained in the quality analysis of films and printing systems.
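
A gray-scale step wedge like the one in the proposed standard image can be generated with a few lines of array code; the strip dimensions and 12-bit scale below are hypothetical choices, not the specification used in the study.

    import numpy as np

    # Hypothetical sketch: a 33-step gray wedge as a 12-bit image strip.
    steps = 33
    step_width, height = 30, 100
    max_value = 4095                           # 12-bit gray scale

    levels = np.linspace(0, max_value, steps).astype(np.uint16)
    wedge_row = np.repeat(levels, step_width)  # one row of the wedge
    wedge = np.tile(wedge_row, (height, 1))    # full strip, shape (100, 990)
    print(wedge.shape, wedge.min(), wedge.max())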

Abstract:

Controlling the quality variables (such as basis weight, moisture, etc.) is a vital part of making top-quality paper or board. In this thesis, an advanced data assimilation tool is applied to the quality control system (QCS) of a paper or board machine. The functionality of the QCS is based on quality observations measured with a traversing scanner that follows a zigzag path. The basic idea is the following: the measured quality variable has to be separated into its machine direction (MD) and cross direction (CD) variations, because the QCS works separately in MD and CD. Traditionally this is done simply by assuming one scan of the zigzag path to be the CD profile and its mean value to be one point of the MD trend. In this thesis, a more advanced method is introduced. The fundamental idea is to use the signal's frequency components to represent the variation in both CD and MD. To get to the frequency domain, the Fourier transform is utilized. The frequency-domain representation, that is, the Fourier components, is then used as the state vector in a Kalman filter. The Kalman filter is a widely used data assimilation tool for combining noisy observations with a model. The observations here refer to the quality measurements and the model to the Fourier frequency components. By incorporating the two-dimensional Fourier transform into the Kalman filter, we obtain an advanced tool for separating the CD and MD components of the total variation or, more generally, for data assimilation. A piece of a paper roll is analyzed and this tool is applied to model the dataset. The results show that the Kalman filter algorithm is able to reconstruct the main features of the dataset from a zigzag path. Although the results are based on a very short sample of a paper roll, the method appears to have great potential for later use as part of the quality control system.
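
A minimal sketch of the idea, with Fourier coefficients as the Kalman filter state and zigzag scanner readings as observations, is given below; the basis functions, dimensions, noise levels and simulated path are hypothetical choices, not those used in the thesis.

    import numpy as np

    # Sketch: Kalman filter whose state is a small set of 2D Fourier coefficients
    # describing MD/CD variation, updated from samples along a zigzag scanner path.
    # Basis choice, dimensions and noise levels are hypothetical.

    L_md, L_cd = 100.0, 8.0                   # covered web length (m) and width (m)
    modes = [(0, 0), (1, 0), (0, 1), (2, 0), (0, 2), (1, 1)]   # (MD, CD) wave numbers

    def basis(x, y):
        """Evaluate the cosine basis functions at web position (x, y)."""
        return np.array([np.cos(2 * np.pi * (m * x / L_md + n * y / L_cd))
                         for m, n in modes])

    n = len(modes)
    a = np.zeros(n)                           # state: Fourier coefficients
    P = np.eye(n)                             # state covariance
    Q = 1e-4 * np.eye(n)                      # process noise (slow drift)
    R = 0.05                                  # measurement noise variance

    def kalman_step(a, P, x, y, z):
        """One predict/update step for a scanner reading z at position (x, y)."""
        P = P + Q                             # random-walk prediction
        H = basis(x, y)                       # observation row vector
        S = H @ P @ H + R
        K = (P @ H) / S                       # Kalman gain
        a = a + K * (z - H @ a)
        P = P - np.outer(K, H) @ P
        return a, P

    # Simulated zigzag path: the scanner sweeps in CD while the web moves in MD.
    rng = np.random.default_rng(0)
    true_a = rng.normal(0.0, 0.3, n)
    for t in range(400):
        x = 0.25 * t                                  # MD position of the sample
        y = 0.5 * L_cd * (1 + np.sin(0.15 * t))       # CD position along the zigzag
        z = basis(x, y) @ true_a + rng.normal(0.0, np.sqrt(R))
        a, P = kalman_step(a, P, x, y, z)

    print("estimated coefficients:", np.round(a, 2))
    print("true coefficients:     ", np.round(true_a, 2))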

Abstract:

In this article, the results of a modified SERVQUAL questionnaire (Parasuraman et al., 1991) are reported. The modifications consisted in substituting questionnaire items particularly suited to a specific service (banking) and context (county of Girona, Spain) for the original, rather general and abstract items. These modifications led to more interpretable factors which accounted for a higher percentage of item variance. The data were submitted to various structural equation models, which made it possible to conclude that the questionnaire contains items with a high measurement quality with respect to five identified dimensions of service quality, which differ from those specified by Parasuraman et al. and are specific to the banking service. The two dimensions relating to the behaviour of employees have the greatest predictive power on overall quality and satisfaction ratings, which enables managers to use a low-cost, reduced version of the questionnaire to monitor quality on a regular basis. It was also found that satisfaction and overall quality were perfectly correlated, showing that customers do not perceive these concepts as distinct.

Abstract:

Joints intended for welding frequently show variations in geometry and position, for which it is unfortunately not possible to apply a single set of operating parameters to ensure constant quality. This difficulty is caused by a number of factors, including inaccurate joint preparation and joint fit-up, tack welds, and thermal distortion of the workpiece. In plasma arc keyhole welding of butt joints, deviations in the gap width may cause weld defects such as an incomplete weld bead, excessive penetration and burn-through. Manual adjustment of welding parameters to compensate for variations in the gap width is very difficult, and unsatisfactory weld quality is often obtained. In this study, a control system for plasma arc keyhole welding was developed and used to study the effects of real-time control of welding parameters on gap tolerance when welding austenitic stainless steel AISI 304L. The welding tests demonstrated the beneficial effect of real-time control on weld quality. Compared with welding using constant parameters, the maximum tolerable gap width with acceptable weld quality was 47% larger when the parameters were controlled in real time, for a plate thickness of 5 mm. In addition, burn-through occurred at significantly larger gap widths when the parameters were controlled in real time. Increased gap tolerance allows joints to be prepared and fitted up less accurately, saving time and preparation costs. In addition to the control system, a novel technique for back-face monitoring is described in this study. The test results showed that the technique can be successfully applied to penetration monitoring when welding non-magnetic materials. The results also imply that it is possible to measure the dimensions of the plasma efflux or weld root and to use this information in a feedback control system and, thus, maintain the required weld quality.
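
The feedback idea mentioned at the end, adjusting a welding parameter from the back-face measurement, can be sketched as a simple proportional-integral loop; the set point, gains, limits and signal names below are hypothetical illustrations, not the values used in the study.

    # Hypothetical PI-control sketch: adjust welding current from the weld-root
    # width reported by the back-face monitoring system. Values are illustrative.

    SETPOINT_MM = 2.0               # desired weld-root width
    KP, KI = 15.0, 4.0              # proportional and integral gains (A per mm)
    I_MIN, I_MAX = 120.0, 220.0     # allowed welding current range (A)
    NOMINAL_A = 170.0               # nominal welding current (A)

    integral = 0.0

    def control_step(measured_root_mm, dt=0.1):
        """Return an updated welding current from one root-width measurement."""
        global integral
        error = SETPOINT_MM - measured_root_mm      # root too narrow -> raise current
        integral += error * dt
        current = NOMINAL_A + KP * error + KI * integral
        return min(max(current, I_MIN), I_MAX)      # clamp to a safe range

    for root in [1.6, 1.8, 2.1, 2.3, 2.0]:          # hypothetical measurements (mm)
        print(f"root {root:.1f} mm -> current {control_step(root):.1f} A")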

Abstract:

The topic of this thesis is the simulation of a combination of several control and data assimilation methods, meant to be used for controlling the quality of paper in a paper machine. Paper making is a very complex process, and the information obtained from the web is sparse. A paper web scanner can only measure a zigzag path on the web. An assimilation method is needed to compute estimates of the Machine Direction (MD) and Cross Direction (CD) profiles of the web. Quality control is based on these measurements. There is an increasing need for intelligent methods to assist in data assimilation. The target of this thesis is to study how such intelligent assimilation methods affect paper web quality. This work is based on a paper web simulator developed in the TEKES-funded MASI NoTes project. The simulator is a valuable tool for comparing different assimilation methods. The thesis contains a comparison of four assimilation methods: a first-order Bayesian model estimator, a higher-order Bayesian estimator based on an ARMA model, a Fourier-transform-based Kalman filter estimator, and a simple block estimator. The last one can be considered close to current operational methods. Of these methods, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one. The Kalman and ARMA estimators seem to be the best in overall performance.