935 results for Electronic data processing -- Quality control
Abstract:
Sample preparation procedures for AMS measurements of 129I and 127I in environmental materials, together with some methodological aspects of quality assurance, are discussed. Measurements from analyses of pre-nuclear soil and thyroid gland samples and from a systematic investigation of natural waters in Lower Saxony, Germany, are described. Although the lowest 129I/127I ratios reported to date in soils and thyroid glands were observed, they are still suspected of contamination, since they are significantly higher than the pre-nuclear equilibrium ratio in the marine hydrosphere. A survey of all available 129I/127I isotopic ratios in precipitation shows a dramatic increase until the mid-1980s and a stabilization since 1987 at high isotopic ratios of about (3.6–8.3)×10⁻⁷. In surface waters, ratios of (57–380)×10⁻¹⁰ are measured, while shallow ground waters show significantly lower values, with ratios of (1.3–200)×10⁻¹⁰ and a much larger spread. The data for 129I in soils and in precipitation are used to estimate pre-nuclear and modern 129I deposition densities.
Abstract:
Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 errors per 10,000 files to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This current lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractor's Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors that were not found in the literature and differed from the literature on 5 factors in the top 25%. The Delphi results refuted 7 factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support provided by a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working-memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping, or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
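As a side note on how such figures are pooled, the following minimal Python sketch (with invented study names and counts, not figures from the reviewed papers) shows error rates normalized to a common per-10,000-fields denominator and pooled by totals rather than by averaging rates.

```python
# Minimal sketch: normalizing and pooling data-accuracy error rates.
# The study names and counts below are illustrative placeholders, not
# figures from the papers summarized above.

studies = [
    # (name, errors found, fields inspected)
    ("study_A", 12, 4_500),
    ("study_B", 240, 18_000),
    ("study_C", 3, 10_000),
]

def per_10k(errors: int, fields: int) -> float:
    """Error rate expressed as errors per 10,000 fields."""
    return 10_000 * errors / fields

for name, errors, fields in studies:
    print(f"{name}: {per_10k(errors, fields):.1f} errors per 10,000 fields")

# Pooled rate: total errors over total fields, not a mean of the rates,
# so larger studies carry proportionally more weight.
total_errors = sum(e for _, e, _ in studies)
total_fields = sum(f for _, _, f in studies)
print(f"pooled: {per_10k(total_errors, total_fields):.1f} errors per 10,000 fields")
```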
Abstract:
In this paper, we use ARIMA modelling to estimate a set of characteristics of a short-term indicator (for example, the index of industrial production), such as trend, seasonal variation, cyclical oscillations, unpredictability, and deterministic effects (such as a strike). Thus, for each sector and product (more than 1000 in all), we construct a vector of values corresponding to the above-mentioned characteristics, which can be used for data editing.
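As an illustration only (not the authors' exact procedure), the sketch below fits a seasonal ARIMA model to a monthly indicator with statsmodels and collects a few summary statistics into a per-series characteristics vector of the kind described; the chosen model order and the specific characteristics are assumptions.

```python
# Minimal sketch: fit an ARIMA model to one monthly short-term indicator
# and summarize it as a vector of characteristics usable for data editing.
# Model order and summary statistics are illustrative choices.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

def characteristics_vector(series: pd.Series) -> dict:
    """Return a small set of per-series characteristics."""
    # Airline-type model: regular and seasonal differencing with MA terms.
    model = ARIMA(series, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12))
    res = model.fit()
    resid = res.resid.dropna()
    return {
        "trend": float(series.diff().mean()),  # average month-on-month change
        "seasonal_amplitude": float(series.groupby(series.index.month).mean().std()),
        "unpredictability": float(resid.std()),  # residual spread = hard to forecast
        "outlier_share": float((np.abs(resid) > 3 * resid.std()).mean()),
    }

# Illustrative series (random walk plus seasonality); in practice one per product.
idx = pd.date_range("2015-01-01", periods=96, freq="MS")
rng = np.random.default_rng(0)
values = 100 + np.cumsum(rng.normal(0.2, 1.0, 96)) + 5 * np.sin(2 * np.pi * idx.month.to_numpy() / 12)
indicator = pd.Series(values, index=idx)
print(characteristics_vector(indicator))
```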
Abstract:
Dust emission caused by wind from mining or industrial waste deposits, and by vehicle traffic on unpaved roads, is a problem that affects productive activities, the environment, and the health of people who remain in the contaminated area. In Chile, social sensitivity and environmental requirements on this issue have increased in recent years, as has the supply of different dust suppressants and application technologies. This work reviews the causes of dust emission and the technologies available in Chile for dust suppression, together with the methodologies and standards for assessing the performance of materials treated with different suppressants. In some cases it is not possible to compare performance properties such as durability, application dose, and frequency of application, among other aspects. The procedures described in the NCh3266-2012 standard allow wind erosion to be assessed in waste deposits, vacant lots, and unpaved roads, among other sites, and allow the performance of different types of dust suppressants to be evaluated from comparable, objective data. This makes it possible to select the most suitable suppressant, improve the efficiency of treatments, optimize costs, and improve production processes. Keywords: wind erosion, dust suppressant, mining waste, unpaved roads.
Abstract:
PAMELA (Phased Array Monitoring for Enhanced Life Assessment) SHM™ System is an integrated embedded system based on ultrasonic guided waves, consisting of several electronic devices and one system manager controller. The data collected by all PAMELA devices in the system must be transmitted to the controller, which is responsible for carrying out the advanced signal processing needed to obtain SHM maps. PAMELA devices consist of hardware based on a Virtex 5 FPGA with a PowerPC 440 running an embedded Linux distribution. Therefore, PAMELA devices, in addition to being able to perform tests and transmit the collected data to the controller, are capable of local data processing or pre-processing (reduction, normalization, pattern recognition, feature extraction, etc.). Local data processing decreases the data traffic over the network and allows the CPU load of the external computer to be reduced. PAMELA devices can even run autonomously, performing scheduled tests and communicating with the controller only when structural damage is detected or when programmed to do so. Each PAMELA device integrates a software management application (SMA) that allows a developer to download their own algorithm code and add new data processing algorithms to the device. Development of the SMA is done in a virtual machine with an Ubuntu Linux distribution that includes all the software tools needed for the entire development cycle. The Eclipse IDE (Integrated Development Environment) is used to develop the SMA project and to write the code of each data processing algorithm. This paper presents the developed software architecture and describes the steps needed to add new data processing algorithms to the SMA in order to increase the processing capabilities of PAMELA devices. An example of basic damage index estimation using the delay-and-sum algorithm is provided.
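The delay-and-sum step mentioned at the end can be illustrated with a minimal, self-contained Python sketch; the geometry, sampling rate, and group velocity below are placeholders, and this is not the PAMELA firmware or its SMA API.

```python
# Minimal delay-and-sum imaging sketch for guided-wave SHM, assuming a
# constant group velocity and known actuator/sensor positions. All
# geometry, sampling, and signal values are illustrative.
import numpy as np

FS = 1.0e6          # sampling rate [Hz]
VELOCITY = 5000.0   # assumed group velocity [m/s]

def delay_and_sum(signals, actuators, sensors, grid_x, grid_y):
    """Sum residual signals at the travel-time delay for each image pixel.

    signals: (n_paths, n_samples) baseline-subtracted envelopes
    actuators, sensors: (n_paths, 2) coordinates [m] for each path
    """
    image = np.zeros((grid_y.size, grid_x.size))
    for sig, act, sen in zip(signals, actuators, sensors):
        for iy, y in enumerate(grid_y):
            for ix, x in enumerate(grid_x):
                # actuator -> pixel -> sensor travel distance and sample delay
                d = np.hypot(x - act[0], y - act[1]) + np.hypot(x - sen[0], y - sen[1])
                idx = int(round(d / VELOCITY * FS))
                if idx < sig.size:
                    image[iy, ix] += sig[idx]
    return image

# Tiny synthetic example: one path, one echo 60 samples after excitation.
sig = np.zeros((1, 2048))
sig[0, 60] = 1.0
img = delay_and_sum(sig,
                    actuators=np.array([[0.0, 0.0]]),
                    sensors=np.array([[0.2, 0.0]]),
                    grid_x=np.linspace(0.0, 0.3, 31),
                    grid_y=np.linspace(-0.1, 0.1, 21))
print("peak damage index:", img.max())
```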
Abstract:
Nowadays, devices that monitor the health of structures consume a lot of power and need a lot of time to acquire, process, and send information about the structure to the main processing unit. To decrease this time, fast electronic devices are starting to be used to accelerate this processing. In this paper, some hardware algorithms implemented in a programmable logic device are described. The goal of this implementation is to accelerate the processing and to reduce the amount of information that has to be sent. By achieving this goal, the time the processor needs to handle all the information is reduced, and so is the power consumption.
Abstract:
The human asialoglycoprotein receptor H2a subunit contains a charged pentapeptide, EGHRG, in its ectodomain that is the only sequence absent from the H2b alternatively spliced variant. H2b exits the endoplasmic reticulum (ER) even when singly expressed, whereas H2a gives rise to a cleaved, soluble, secreted ectodomain fragment; uncleaved membrane-bound H2a molecules are completely retained and degraded in the ER. We have inserted the H2a pentapeptide into the sequence of the H1 subunit (H1i5), which caused complete ER retention but, unexpectedly, no degradation. This suggests that the pentapeptide is a determinant for ER retention that does not colocalize in H2a with the determinant for degradation. The state of sugar chain processing and the ER localization of H1i5, which was unchanged at 15°C or after treatment with nocodazole, indicate ER retention rather than retrieval from the cis-Golgi or the intermediate compartment. H1i5 folded similarly to H1, and both associated with calnexin. However, whereas H1 dissociated with a half-time of 45 min, H1i5 remained bound to the chaperone for prolonged periods. The correct global folding of H2a and H1i5, and of other normal precursors and unassembled proteins, together with their true ER retention rather than exit and retrieval, suggests that their quality control mechanism differs from that of misfolded proteins, which does involve retrieval. However, both pathways may involve calnexin.
Abstract:
Caenorhabditis elegans dauer formation is an alternative larval developmental pathway that the worm can take when environmental conditions become detrimental. Animals can survive several months in this stress-resistant stage and can resume normal development when growth conditions improve. Although the worms integrate a variety of sensory information to commit to dauer formation, it is currently unknown whether they also monitor internal cellular damage. The Ro ribonucleoprotein complex, which was initially described as a human autoantigen, is composed of one major 60-kDa protein, Ro60, that binds to one of four small RNA molecules, designated Y RNAs. Ro60 has been shown to bind mutant 5S rRNA molecules in Xenopus oocytes, suggesting a role for Ro60 in 5S rRNA biogenesis. Analysis of ribosomes from a C. elegans rop-1(−) strain, which is null for the expression of Ro60, demonstrated that they contain a high percentage of mutant 5S rRNA molecules, thereby strengthening the notion of a link between the rop-1 gene product and 5S rRNA quality control. The Ro particle was recently shown to be involved in the resistance of Deinococcus radiodurans to UV irradiation, suggesting a role for the Ro complex in stress resistance. We have studied the role of rop-1 in dauer formation. We present genetic and biochemical evidence that rop-1 interacts with dauer-formation genes and is involved in the regulation of the worms' entry into the dauer stage. Furthermore, we find that the rop-1 gene product undergoes a proteolytic processing step that is regulated by the dauer formation pathway via an aspartic proteinase. These results suggest that the Ro particle may function in an RNA quality-control checkpoint for dauer formation.
Abstract:
Degradation of proteins that, because of improper or suboptimal processing, are retained in the endoplasmic reticulum (ER) involves retrotranslocation to reach the cytosolic ubiquitin-proteasome machinery. We found that substrates of this pathway, the precursor of the human asialoglycoprotein receptor H2a and free heavy chains of murine class I major histocompatibility complex (MHC), accumulate in a novel pre-Golgi compartment that is adjacent to but not overlapping with the centrosome, the Golgi complex, and the ER-to-Golgi intermediate compartment (ERGIC). On its way to degradation, H2a associated increasingly after synthesis with the ER translocon Sec61. Nevertheless, it remained in the secretory pathway upon proteasomal inhibition, suggesting that its retrotranslocation must be tightly coupled to the degradation process. In the presence of proteasomal inhibitors, the ER chaperones calreticulin and calnexin, but not BiP, PDI, or glycoprotein glucosyltransferase, concentrate in the subcellular region of the novel compartment. The “quality control” compartment is possibly a subcompartment of the ER. It depends on microtubules but is insensitive to brefeldin A. We discuss the possibility that it is also the site for concentration and retrotranslocation of proteins that, like the mutant cystic fibrosis transmembrane conductance regulator, are transported to the cytosol, where they form large aggregates, the “aggresomes.”
Abstract:
Mode of access: Internet.
Abstract:
"Research was supported by the United States Air Force through the Air Force Office of Scientific Research, Air Research and Development Command."
Abstract:
"Contract no. AF 30(602)-2138, Project 5554, Task 55102."
Abstract:
The characterization of blood pressure in treatment trials assessing the benefits of blood pressure lowering regimens is a critical factor for the appropriate interpretation of study results. With numerous operators involved in the measurement of blood pressure in many thousands of patients being screened for entry into clinical trials, it is essential that operators follow pre-defined measurement protocols involving multiple measurements and standardized techniques. Blood pressure measurement protocols have been developed by international societies and emphasize the importance of an appropriate choice of cuff size, correct identification of Korotkoff sounds, and the avoidance of digit preference. Training of operators and auditing of blood pressure measurement may assist in reducing operator-related errors in measurement. This paper describes the quality control activities adopted for the screening stage of the 2nd Australian National Blood Pressure Study (ANBP2), a cardiovascular outcome trial of the treatment of hypertension in the elderly conducted entirely in general practices in Australia. A total of 54 288 subjects were screened; 3688 previously untreated subjects were identified as having blood pressure >140/90 mmHg at the initial screening visit, and 898 (24%) of these were not eligible for study entry after two further visits because the elevated reading was not sustained. For both systolic and diastolic blood pressure recordings, observed digit preference fell within 7 percentage points of the expected frequency. Protocol adherence, in terms of the required minimum blood pressure difference between the last two successive recordings, was 99.8%. These data suggest that adherence to blood pressure recording protocols and elimination of digit preference can be achieved through appropriate training programs and quality control activities in large multi-centre community-based trials in general practice. Repeated blood pressure measurement prior to initial diagnosis and study entry is essential to appropriately characterize hypertension in these elderly patients.
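For illustration, a terminal-digit-preference check of the kind described can be sketched as follows; the readings are invented, and the expected frequency assumes measurements rounded to the nearest 2 mmHg (five possible terminal digits), which may differ from the ANBP2 protocol.

```python
# Minimal sketch of a terminal-digit-preference check. The readings below
# are invented; the expected frequency assumes readings recorded to the
# nearest 2 mmHg (five possible terminal digits, 20% each).
from collections import Counter

readings = [142, 138, 156, 140, 150, 144, 162, 148, 140, 154, 146, 158, 160, 152, 148]

counts = Counter(r % 10 for r in readings)
expected = 100.0 / 5  # percent per terminal digit under the stated assumption

for digit in sorted(counts):
    observed = 100.0 * counts[digit] / len(readings)
    print(f"terminal digit {digit}: observed {observed:.1f}%, "
          f"deviation {observed - expected:+.1f} percentage points")
```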
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT phantoms. Finally, OCT phantoms were successfully designed and fabricated in fused silica, and their use to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated.

Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. The characterisation methods include measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity, and distortion. Processing of OCT data is computationally intensive: standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs); however, graphics processing unit (GPU) based data processing methods have recently been developed to minimize this processing and rendering time. These techniques include standard-processing methods, i.e. a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier-domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. The OCT phantoms have been heavily used for the qualitative characterization and adjustment/fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms.

The work presented in this thesis demonstrates several novel techniques for fabricating OCT phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
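To make the "standard processing" step concrete, the sketch below shows a minimal FD-OCT A-scan pipeline (background subtraction, apodization, FFT) written with NumPy; CuPy offers a largely drop-in API for the GPU variant described in the thesis, and all parameters and data here are synthetic placeholders rather than values from the actual system.

```python
# Minimal sketch of a Fourier-domain OCT "standard processing" chain:
# background subtraction, apodization, and FFT of the interference
# spectrum to form an A-scan. Parameters and data are synthetic.
import numpy as np

def a_scan(spectrum: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Convert one interference spectrum (assumed already linear in
    wavenumber k) into a depth profile (A-scan) in dB, arbitrary units."""
    fringe = spectrum - background                 # remove DC / source shape
    fringe = fringe * np.hanning(fringe.size)      # apodize to reduce side lobes
    depth = np.abs(np.fft.fft(fringe))[: fringe.size // 2]  # positive depths only
    return 20 * np.log10(depth + 1e-12)            # log scale, as usually displayed

# Synthetic spectrum: a single reflector produces a cosine fringe.
n = 2048
k = np.arange(n)
background = np.full(n, 100.0)
spectrum = background + 5.0 * np.cos(2 * np.pi * 180 * k / n)
profile = a_scan(spectrum, background)
print("reflector found at depth bin", int(np.argmax(profile)))
```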