962 results for "Data quality problems"


Relevance:

90.00%

Abstract:

Research in art conservation has developed since the early 1950s, making a significant contribution to the conservation and restoration of cultural heritage artefacts. Only through profound knowledge of the nature and condition of the constituent materials can suitable conservation and restoration measures be adopted and preservation practices enhanced. The study of ancient artworks is particularly challenging because they can be considered heterogeneous and multilayered systems in which numerous interactions between the different components, as well as degradation and ageing phenomena, take place. Difficulties in physically separating the different layers, owing to their thickness (1-200 µm), can result in the inaccurate attribution of identified compounds to a specific layer. Such details can therefore only be analysed when the sample preparation method leaves the layer structure intact, for example by embedding cross sections in synthetic resins. Hence, spatially resolved analytical techniques are required not only to characterize exactly the nature of the compounds but also to obtain precise chemical and physical information about ongoing changes. This thesis focuses on the application of FTIR microspectroscopic techniques to cultural heritage materials. The first section introduces the use of FTIR microscopy in conservation science, with particular attention to sampling criteria and sample preparation methods. The second section evaluates and validates different FTIR microscopic analytical methods applied to art conservation issues that may be encountered when dealing with cultural heritage artefacts: the characterisation of the artistic execution technique (chapter II-1), studies on degradation phenomena (chapter II-2) and the evaluation of protective treatments (chapter II-3). The third and last section is divided into three chapters that outline recent developments in FTIR spectroscopy for the characterisation of paint cross sections, and in particular of thin organic layers: a newly developed preparation method based on embedding in infrared-transparent salts (chapter III-1), the new opportunities offered by macro-ATR imaging spectroscopy (chapter III-2) and the possibilities achieved with the different FTIR microspectroscopic techniques available today (chapter III-3). In chapter II-1, FTIR microspectroscopy, as a molecular analysis technique, is presented in an integrated approach with other analytical techniques. The proposed sequence is optimized as a function of the limited quantity of sample available, and the methodology permits identification of the painting materials and characterisation of the execution technique and state of conservation. Chapter II-2 describes the characterisation of degradation products with FTIR microscopy, since the investigation of the ageing processes encountered in old artefacts represents one of the most important issues in conservation research. Metal carboxylates resulting from the interaction between pigments and binding media are characterized using synthesised metal palmitates, and their formation is detected on copper-, zinc-, manganese- and lead-based pigments (the latter associated with lead carbonate) dispersed either in oil or in egg tempera. Moreover, significant effects also seem to occur with iron and cobalt (acceleration of triglyceride hydrolysis). For the first time, manganese carboxylates are also observed on sienna and umber paints. Finally, in chapter II-3, FTIR microscopy is combined with further elemental analyses to characterise and assess the performance and stability of newly developed protective treatments, which should better address conservation-restoration problems. In the third section, chapter III-1 reports an innovative embedding system in potassium bromide, focusing on the characterisation and localisation of organic substances in cross sections. Not only the identification but also the distribution of proteinaceous, lipidic or resinaceous materials is evidenced directly on different paint cross sections, especially in thin layers of the order of 10 µm. Chapter III-2 describes the use of a conventional diamond ATR accessory coupled with a focal plane array to obtain chemical images of multi-layered paint cross sections. A rapid and simple identification of the different compounds is achieved without the use of any infrared microscope objectives. Finally, the latest FTIR techniques available are highlighted in chapter III-3 in a comparative study for the characterisation of paint cross sections. Results are presented in terms of spatial resolution, data quality and chemical information obtained; in particular, a new FTIR microscope equipped with a linear array detector, which permits reducing the spatial resolution limit to approximately 5 µm, provides very promising results and may represent a good alternative to either mapping or imaging systems.

Relevance:

90.00%

Abstract:

In recent years, the resolution of numerical weather prediction (NWP) models has increased steadily with advances in technology and knowledge. As a consequence, a large amount of initial data has become fundamental for a correct initialization of the models. The potential of radar observations for improving the initial conditions of high-resolution NWP models has long been recognized, and operational applications are becoming more frequent. The fact that many NWP centres have recently put convection-permitting forecast models into operation, many of which assimilate radar data, emphasizes the need for an approach to providing quality information, which is needed to avoid radar errors degrading the model's initial conditions and, therefore, its forecasts. Environmental risks can be related to various causes: meteorological, seismic, hydrological/hydraulic. Flash floods have a horizontal dimension of 1-20 km and belong to the meso-gamma scale; this scale can only be modelled with NWP models of the highest resolution, such as the COSMO-2 model. One of the problems in modelling extreme convective events is related to the atmospheric initial conditions: the scale at which atmospheric conditions are assimilated into a high-resolution model is about 10 km, a value too coarse for a correct representation of the initial conditions of convection. Assimilation of radar data, with a resolution of about one kilometre every 5 or 10 minutes, can be a solution to this problem. In this contribution, a pragmatic and empirical approach to deriving a radar data quality description is proposed for use in radar data assimilation and, more specifically, in the latent heat nudging (LHN) scheme. The convective capabilities of the COSMO-2 model are then investigated through some case studies. Finally, this work presents some preliminary experiments on coupling a high-resolution meteorological model with a hydrological one.
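
As an illustration of the latent heat nudging idea mentioned above, the sketch below rescales a model column's latent heating profile by the ratio of radar-observed to modelled precipitation rate, damped by a radar data quality weight. The function names, clipping bounds and quality weighting are illustrative assumptions, not the operational COSMO-2 implementation.

```python
import numpy as np

def lhn_scaling(obs_rate, model_rate, quality, min_scale=0.3, max_scale=3.0):
    """Illustrative latent heat nudging factor: ratio of observed to modelled
    precipitation rate, clipped to bounds and damped by a [0, 1] radar-quality
    weight (values here are assumptions, not operational settings)."""
    scale = np.where(model_rate > 0, obs_rate / np.maximum(model_rate, 1e-6), 1.0)
    scale = np.clip(scale, min_scale, max_scale)
    # Blend towards neutral scaling (1.0) where radar quality is low.
    return 1.0 + quality * (scale - 1.0)

def nudge_heating(latent_heating_profile, scale_factor):
    """Rescale the column latent heating profile by the LHN factor."""
    return latent_heating_profile * scale_factor

# Example: one grid column where radar reports more rain than the model.
heating = np.array([0.1, 0.4, 0.8, 0.5, 0.2])   # K/h, illustrative profile
factor = lhn_scaling(obs_rate=4.0, model_rate=2.0, quality=0.9)
print(nudge_heating(heating, factor))
```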

Relevance:

90.00%

Abstract:

Aimee Guidera, Director of the National Data Quality Campaign, delivered the second annual Lee Gurel '48 Lecture in Education, "From Dartboards to Dashboards: The Imperative of Using Data to Improve Student Outcomes." Aimee Rogstad Guidera is the Founding Executive Director of the Data Quality Campaign. She manages a growing partnership among national organizations collaborating to improve the quality, accessibility and use of education data to improve student achievement. Working with 10 Founding Partners, Aimee launched the DQC in 2005 with the goal of every state having a robust longitudinal data system in place by 2009. The Campaign is now in the midst of its second phase, focusing on State Actions to ensure effective data use. Aimee joined the National Center for Educational Accountability as Director of its Washington, DC office in 2003. During her eight previous years in various roles at the National Alliance of Business, Aimee supported the corporate community's efforts to increase achievement at all levels of learning. As NAB Vice President of Programs, she managed the Business Coalition Network, comprised of over 1,000 business-led coalitions focused on improving education in communities across the country. Prior to joining the Alliance, Aimee focused on school readiness, academic standards, education goals and accountability systems at the Center for Best Practices of the National Governors Association. She taught for the Japanese Ministry of Education in five Hiroshima high schools, where she interviewed educators and studied the Japanese education system, immediately after receiving her AB from Princeton University's Woodrow Wilson School of Public & International Affairs. Aimee also holds a master's degree in public policy from Harvard's John F. Kennedy School of Government.

Relevance:

90.00%

Abstract:

Recent brain imaging work has expanded our understanding of the mechanisms of perceptual, cognitive, and motor functions in human subjects, but research into the cerebral control of emotional and motivational function is at a much earlier stage. Important concepts and theories of emotion are briefly introduced, as are research designs and multimodal approaches to answering the central questions in the field. We provide a detailed inspection of the methodological and technical challenges in assessing the cerebral correlates of emotional activation, perception, learning, memory, and emotional regulation behavior in healthy humans. fMRI is particularly challenging in structures such as the amygdala, as it is affected by susceptibility-related signal loss, image distortion, physiological and motion artifacts, and colocalized Resting State Networks (RSNs). We review how these problems can be mitigated by using optimized echo-planar imaging (EPI) parameters, alternative MR sequences, and correction schemes. High-quality data can be acquired rapidly in these problematic regions with gradient-compensated multi-echo EPI or high-resolution EPI with parallel imaging and optimum gradient directions, combined with distortion correction. Although neuroimaging studies of emotion encounter many difficulties regarding the limitations of measurement precision, research design, and strategies for validating neuropsychological emotion constructs, considerable improvement in data quality and sensitivity to subtle effects can be achieved. The methods outlined offer the prospect for fMRI studies of emotion to provide more sensitive, reliable, and representative models of measurement that systematically relate the dynamics of emotional regulation behavior to topographically distinct patterns of activity in the brain. This will provide additional information to aid the assessment, categorization, and treatment of patients with emotional and personality disorders.
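
As a rough illustration of one multi-echo EPI strategy mentioned above, the sketch below combines echo images with a standard T2*-weighted average. The echo times, the assumed T2* value and the function name are illustrative assumptions, not the acquisition or analysis protocol of the work described.

```python
import numpy as np

def combine_multiecho(echo_images, echo_times_ms, t2star_ms=30.0):
    """T2*-weighted combination of multi-echo EPI images.

    Each echo n is weighted by w_n = TE_n * exp(-TE_n / T2*), a common
    weighting for boosting BOLD sensitivity; weights are normalized to sum
    to 1. The T2* of 30 ms is an illustrative assumption for a
    signal-dropout-prone region such as the amygdala.
    """
    te = np.asarray(echo_times_ms, dtype=float)
    w = te * np.exp(-te / t2star_ms)
    w /= w.sum()
    # echo_images: array of shape (n_echoes, x, y, z)
    return np.tensordot(w, np.asarray(echo_images, dtype=float), axes=1)

# Example: three echoes of a tiny 2 x 2 x 1 volume.
echoes = np.random.rand(3, 2, 2, 1)
combined = combine_multiecho(echoes, echo_times_ms=[12, 28, 44])
print(combined.shape)  # (2, 2, 1)
```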

Relevance:

90.00%

Abstract:

Increasing amounts of clinical research data are collected by manual data entry into electronic source systems and directly from research subjects. For such manually entered source data, common data cleaning methods such as post-entry identification and resolution of discrepancies and double data entry are not feasible. However, the data accuracy achieved without these mechanisms may be lower than desired for a particular research use. We evaluated a heuristic usability evaluation method for its utility as a tool to independently and prospectively identify data collection form questions associated with data errors. The method showed a promising sensitivity of 64% and a specificity of 67%. It was applied as described in the usability literature, with no further adaptation or specialization for predicting data errors. We conclude that usability evaluation methodology should be further investigated for use in data quality assurance.
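
For readers unfamiliar with the reported metrics, the sketch below computes sensitivity and specificity from a confusion matrix; the counts are hypothetical values chosen only to reproduce figures close to those quoted above, not the study's data.

```python
def sensitivity_specificity(tp, fp, tn, fn):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts of form questions flagged by the usability method
# versus questions actually associated with data errors.
sens, spec = sensitivity_specificity(tp=16, fp=10, tn=20, fn=9)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```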

Relevance:

90.00%

Abstract:

BACKGROUND We describe the setup of a neonatal quality improvement tool and list which peer-reviewed requirements it fulfils and which it does not. We report on the effects observed so far, on how units can identify quality improvement potential, and on how they can measure the effect of changes made to improve quality. METHODS Application of a prospective longitudinal national cohort data collection that uses algorithms to ensure high data quality (i.e. checks for completeness, plausibility and reliability) and to display the data (Plsek's p-charts and standardized mortality or morbidity ratio (SMR) charts). The collected data allow monitoring of a study collective of very low birth-weight (VLBW) infants born from 2009 to 2011 by applying a quality cycle following the steps 'guideline - perform - falsify - reform'. RESULTS The 2025 VLBW live births from 2009 to 2011, representing 96.1% of all VLBW live births in Switzerland, display a similar mortality rate but better morbidity rates when compared with other networks. Data quality in general is high but subject to improvement in some units. Seven measurements display quality improvement potential in individual units. The methods used fulfil several international recommendations. CONCLUSIONS The Quality Cycle of the Swiss Neonatal Network is a helpful instrument to monitor and gradually help improve the quality of care in a region with high quality standards and low statistical discrimination capacity.
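
A minimal sketch of the two chart types mentioned above: a Plsek-style p-chart control limit for a unit-level proportion, and a standardized mortality/morbidity ratio (SMR). The function names and counts are hypothetical and are not the Swiss Neonatal Network's implementation.

```python
import math

def p_chart_limits(p_bar, n):
    """Three-sigma control limits for a proportion p_bar observed in a unit of size n."""
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

def smr(observed, expected):
    """Standardized mortality/morbidity ratio: observed over expected events."""
    return observed / expected

# Hypothetical unit: 60 VLBW infants, 9 with a given morbidity,
# against a network-wide rate of 10% (6 expected cases).
lcl, ucl = p_chart_limits(p_bar=0.10, n=60)
print(f"unit rate {9/60:.2f} vs limits ({lcl:.2f}, {ucl:.2f}); SMR = {smr(9, 6.0):.2f}")
```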

Relevance:

90.00%

Abstract:

BACKGROUND The abstraction of data from medical records is a widespread practice in epidemiological research. However, studies using this means of data collection rarely report reliability. Within the Transition after Childhood Cancer Study (TaCC), which is based on medical record abstraction, we conducted a second independent abstraction of the data with the aim of assessing a) the intra-rater reliability of one rater at two time points; b) possible learning effects between these two time points compared with a gold standard; and c) inter-rater reliability. METHOD Within the TaCC study we conducted a systematic medical record abstraction in the 9 Swiss clinics with pediatric oncology wards. In a second phase we selected a subsample of medical records in 3 clinics and conducted a second independent abstraction. We then assessed intra-rater reliability at the two time points, the learning effect over time (comparing each rater at the two time points with a gold standard) and the inter-rater reliability of a selected number of variables. We calculated percentage agreement and Cohen's kappa. FINDINGS For the assessment of intra-rater reliability we included 154 records (80 for rater 1; 74 for rater 2). For inter-rater reliability we could include 70 records. Intra-rater reliability was substantial to excellent (Cohen's kappa 0.6-0.8) with an observed percentage agreement of 75%-95%. Learning effects were observed for all variables. Inter-rater reliability was substantial to excellent (Cohen's kappa 0.70-0.83) with high agreement ranging from 86% to 100%. CONCLUSIONS Our study showed that data abstracted from medical records are reliable. Investigating intra-rater and inter-rater reliability can give confidence to draw conclusions from the abstracted data and increase data quality by minimizing systematic errors.
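
A minimal sketch of the reported reliability measures, percentage agreement and Cohen's kappa, computed for two hypothetical raters; the ratings are invented for illustration and are not TaCC data.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Percentage agreement and Cohen's kappa for two raters on the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and
    p_e the agreement expected by chance from each rater's marginal frequencies.
    """
    n = len(ratings_a)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in set(freq_a) | set(freq_b)) / n**2
    return p_o, (p_o - p_e) / (1 - p_e)

# Hypothetical abstractions of one categorical variable from 10 records.
rater1 = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes"]
rater2 = ["yes", "yes", "no", "yes", "yes", "no", "yes", "no", "no", "yes"]
agreement, kappa = cohens_kappa(rater1, rater2)
print(f"agreement={agreement:.0%}, kappa={kappa:.2f}")
```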

Relevance:

90.00%

Abstract:

Systems for the identification and registration of cattle have gradually been receiving attention for use in syndromic surveillance, a relatively recent approach for the early detection of infectious disease outbreaks. Real-time or near real-time monitoring of deaths or stillbirths reported to these systems offers an opportunity to detect temporal or spatial clusters of increased mortality that could be caused by an infectious disease epidemic. In Switzerland, such data are recorded in the "Tierverkehrsdatenbank" (TVD). To investigate the potential of the Swiss TVD for syndromic surveillance, 3 years of data (2009-2011) were assessed in terms of data quality, including timeliness of reporting and completeness of geographic data. Two time series consisting of reported on-farm deaths and stillbirths were retrospectively analysed to define and quantify the temporal patterns that result from non-health-related factors. Geographic data were almost always present in the TVD data, often at different spatial scales. On-farm deaths were reported to the database by farmers in a timely fashion; stillbirths were reported less promptly. Timeliness and geographic coverage are two important features of disease surveillance systems, highlighting the suitability of the TVD for use in a syndromic surveillance system. Both time series exhibited different temporal patterns that were associated with non-health-related factors. To avoid false positive signals, these patterns need to be removed from the data or otherwise accounted for before aberration detection algorithms are applied in real time. Evaluating mortality data reported to systems for the identification and registration of cattle is of value for comparing national data systems and as a first step towards a Europe-wide early detection system for emerging and re-emerging cattle diseases.
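
As an illustration of the final step described above, the sketch below removes a periodic reporting pattern from a simulated mortality time series and then applies a simple EWMA-based aberration detector. The detector, its parameters and the simulated counts are assumptions chosen for illustration, not the methods or data of the TVD study.

```python
import numpy as np

def deseasonalize(counts, period=7):
    """Remove a periodic (e.g. day-of-week) reporting pattern by subtracting
    the mean count observed at each position in the cycle."""
    counts = np.asarray(counts, dtype=float)
    seasonal = np.array([counts[i::period].mean() for i in range(period)])
    return counts - seasonal[np.arange(len(counts)) % period]

def ewma_alarms(residuals, lam=0.3, threshold=3.0):
    """Flag time points where an exponentially weighted moving average of the
    standardized residuals exceeds its approximate steady-state control limit."""
    z = residuals / (residuals.std() + 1e-9)
    limit = threshold * np.sqrt(lam / (2.0 - lam))
    ewma, alarms = 0.0, []
    for value in z:
        ewma = lam * value + (1.0 - lam) * ewma
        alarms.append(ewma > limit)
    return alarms

# Hypothetical daily on-farm death counts with a weekly reporting pattern
# and an injected cluster of excess deaths (not TVD data).
rng = np.random.default_rng(0)
baseline = 20 + 5 * (np.arange(280) % 7 == 0)
counts = rng.poisson(baseline).astype(float)
counts[200:205] += 25
alarms = ewma_alarms(deseasonalize(counts))
print([day for day, flagged in enumerate(alarms) if flagged][:5])
```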

Relevance:

90.00%

Abstract:

Indoor Air Quality (IAQ) can have significant implications for health, productivity, job performance, and operating cost. Professional experience in the field of indoor air quality suggests that high expectations of workplace indoor air quality (better than nationally established standards such as those of the American Society of Heating, Refrigerating and Air-Conditioning Engineers, ASHRAE) lead to increased air quality complaints. To determine whether there is a positive association between expectations and indoor air quality complaints, a one-time descriptive and analytical cross-sectional pilot study was conducted. Area Safety Liaisons (n = 330) at the University of Texas Health Science Center at Houston were asked to answer a questionnaire regarding their expectations of four workplace indoor air quality indicators (temperature, relative humidity, carbon dioxide, and carbon monoxide) and whether they had experienced and reported indoor air quality problems. A chi-square test for independence was used to evaluate associations among the variables of interest. The response rate was 54% (n = 177). Results did not show significant associations between expectations and indoor air quality complaints. However, a greater proportion of Area Safety Liaisons who expected indoor air quality indicators to be better than the established standard experienced greater indoor air quality problems. Similarly, a slightly higher proportion of Area Safety Liaisons who expected indoor air quality indicators to be better than the standard reported more indoor air quality complaints. The findings indicate that a greater proportion of Area Safety Liaisons with high expectations (conditions beyond what is considered normal and acceptable by ASHRAE) experienced greater indoor air quality discomfort. This result suggests a positive association between high expectations and experienced and reported indoor air quality complaints. Future studies may be able to address whether the frequency of complaints and resulting investigations can be reduced through information and education about what constitutes acceptable conditions.
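
A minimal sketch of the chi-square test for independence used in the study, run on a hypothetical 2x2 table of expectation level versus experienced problems; the counts are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table for 177 respondents: expectation level (rows)
# versus whether an indoor air quality problem was experienced (columns).
table = np.array([[48, 32],    # high expectations: problem / no problem
                  [45, 52]])   # standard expectations: problem / no problem
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.3f}")
```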

Relevance:

90.00%

Abstract:

Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods used. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 errors per 10,000 fields to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70-5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractors' Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess abstractors' perceptions of these factors. The Delphi process identified 9 factors that were not found in the literature and differed from the literature by 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
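
A minimal sketch of how error rates per 10,000 fields can be computed and pooled across studies, as in the pooled analysis described above; the study counts are hypothetical and are not the reviewed data.

```python
def error_rate_per_10000(errors, fields_inspected):
    """Error rate expressed as errors per 10,000 fields."""
    return 10000.0 * errors / fields_inspected

def pooled_rate(studies):
    """Pool (errors, fields) pairs from several studies into one overall rate."""
    total_errors = sum(e for e, _ in studies)
    total_fields = sum(f for _, f in studies)
    return error_rate_per_10000(total_errors, total_fields)

# Hypothetical studies as (errors found, fields inspected) pairs.
studies = [(12, 60000), (340, 25000), (7, 8000)]
print([round(error_rate_per_10000(e, f), 1) for e, f in studies])
print(round(pooled_rate(studies), 1))
```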

Relevance:

90.00%

Abstract:

Two seismic surveys were carried out on the high-altitude glacier saddle, Colle Gnifetti, Monte Rosa, Italy/Switzerland. Explosive and vibroseismic sources were tested to explore the best way to generate seismic waves to deduce shallow and intermediate properties (<100 m) of firn and ice. The explosive source (SISSY) excites strong surface and diving waves, degrading data quality for processing; no englacial reflections besides the noisy bed reflector are visible. However, the strong diving waves are analyzed to derive the density distribution of the firn pack, yielding results similar to a nearby ice core. The vibrator source (ElViS), used in both P- and SH-wave modes, produces detectable laterally coherent reflections within the firn and ice column. We compare these with ice-core and radar data. The SH-wave data are particularly useful in providing detailed, high-resolution information on firn and ice stratigraphy. Our analyses demonstrate the potential of seismic methods to determine physical properties of firn and ice, particularly density and potentially also crystal-orientation fabric.
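
As a rough illustration of how firn density can be deduced from diving-wave velocities, the sketch below applies a Kohnen (1972)-type empirical velocity-density relation. The coefficients are illustrative, and whether this particular relation was used in the study above is an assumption.

```python
def firn_density_from_vp(vp_m_per_s, v_ice=3850.0, rho_ice=917.0):
    """Empirical P-wave-velocity-to-density conversion for firn, after a
    commonly cited Kohnen (1972)-type relation; the coefficients here are
    illustrative and may differ from those used in the study above."""
    if vp_m_per_s >= v_ice:
        return rho_ice
    return rho_ice / (1.0 + ((v_ice - vp_m_per_s) / 2250.0) ** 1.22)

# Example: velocities (m/s) obtained from diving-wave analysis at a few depths.
for vp in (2400.0, 3000.0, 3600.0):
    print(f"vp={vp:.0f} m/s -> density ~ {firn_density_from_vp(vp):.0f} kg/m^3")
```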