937 results for Quality of Data


Relevance: 100.00%

Publisher:

Abstract:

AIM: To study prospectively patients after heart transplantation with respect to quality of life, mortality, morbidity, and clinical parameters before and up to 10 years after the operation. METHODS: Sixty patients (47.9 +/- 10.9 years, 57 men, 3 women) were transplanted at the University of Vienna Hospital, Department for Heart and Thorax Surgery, and were included in this study. They were assessed when placed on the waiting list and then exactly 1, 5, and 10 years after the transplantation. The variables evaluated included physical and emotional complaints, well-being, mortality, and morbidity. In the sample of patients who survived 10 years (n = 23), morbidity (infections, malignancies, graft arteriosclerosis, and rejection episodes) as well as quality of life were evaluated. RESULTS: Actuarial survival rates were 83.3%, 66.7%, and 48.3% at 1, 5, and 10 years after transplantation, respectively. During the first year, infections were the most important cause of premature death. Malignancies were the main cause of mortality between years 1 and 5, and graft arteriosclerosis between years 5 and 10. Physical complaints diminished significantly after the operation but increased significantly during the period from 5 to 10 years (p < 0.001). However, trembling (p < 0.05) and paraesthesias (p < 0.01) diminished continuously. Emotional complaints such as depression and dysphoria (both p < 0.05) increased until the tenth year after their nadir at year 1. In long-term survivors, 3 malignancies (lung, skin, thyroid) were diagnosed 6 to 9 years postoperatively. Three patients (13%) had signs of graft arteriosclerosis at year 10; 9 patients (40%) suffered from rejection episodes during the course of 10 years. There were no serious rejection episodes requiring immediate therapy. Quality of life at 10 years was good in these patients. CONCLUSIONS: Heart transplantation is a successful therapy for patients with terminal heart disease. Long-term survivors feel well after 10 years and report a good quality of life.

Relevance: 100.00%

Publisher:

Abstract:

High-quality data are essential for veterinary surveillance systems, and their quality can be affected by the source and the method of collection. Data recorded on farms could provide detailed information about the health of a population of animals, but the accuracy of the data recorded by farmers is uncertain. The aims of this study were to evaluate the quality of the data on animal health recorded on 97 Swiss dairy farms, to compare the quality of the data obtained by different recording systems, and to obtain baseline data on the health of the animals on the 97 farms. Data on animal health were collected from the farms for a year. Their quality was evaluated by assessing the completeness and accuracy of the recorded information, and by comparing farmers' and veterinarians' records. The quality of the data provided by the farmers was satisfactory, although electronic recording systems made it easier to trace the animals treated. The farmers tended to record more health-related events than the veterinarians, although this varied with the event considered, and some events were recorded only by the veterinarians. The farmers' attitude towards data collection was positive. Factors such as motivation, feedback, training, and simplicity and standardisation of data collection were important because they influenced the quality of the data.
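To make the kind of completeness check described above concrete, here is a minimal sketch (not the study's actual method or data) that compares two sets of health-event records and reports what fraction of veterinarian-recorded events also appear in the farmer's records; the use of pandas, the column names and the example events are all assumptions for illustration.

```python
# Hypothetical sketch: comparing farmer-recorded and vet-recorded health events.
import pandas as pd

farmer = pd.DataFrame({
    "animal_id": ["CH001", "CH002", "CH003"],
    "event_date": ["2008-03-01", "2008-03-05", "2008-04-02"],
    "event_type": ["mastitis", "lameness", "mastitis"],
})
vet = pd.DataFrame({
    "animal_id": ["CH001", "CH003", "CH004"],
    "event_date": ["2008-03-01", "2008-04-02", "2008-05-10"],
    "event_type": ["mastitis", "mastitis", "metritis"],
})

# Vet-recorded events that the farmer also recorded (exact match on all fields).
matched = vet.merge(farmer, on=["animal_id", "event_date", "event_type"])
completeness = len(matched) / len(vet)
print(f"Farmer records cover {completeness:.0%} of vet-recorded events")
```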

Relevance: 100.00%

Publisher:

Abstract:

RATIONALE: Low-budget rain collectors for water isotope analysis, such as the 'ball-in-funnel type collector' (BiFC), are widely used in studies on stable water isotopes of rain. To date, however, an experimental quality assessment of such devices in relation to climatic factors does not exist. METHODS: We used Cavity Ring-Down Spectrometry (CRDS) to quantify the effects of evaporation on the δ18O values of reference water under controlled conditions as a function of the elapsed time between rainfall and collection for isotope analysis, the sample volume, and the relative humidity (RH: 31% and 67%; 25 °C). The climate chamber conditions were chosen to reflect the warm and dry end of field conditions that favor evaporative enrichment (EE). We also tested the performance of the BiFC in the field, and compared our δ2H/δ18O data obtained by isotope ratio mass spectrometry (IRMS) with those from the Swiss National Network for the Observation of Isotopes in the Water Cycle (ISOT). RESULTS: The EE increased with time, with a 1 part per thousand increase in the δ18O values after 10 days (RH: 25%; 25 °C; 35 mL, corresponding to a 5 mm rain event; p < 0.001). The sample volume strongly affected the EE (max. value +1.5 parts per thousand for 7 mL samples, i.e., 1 mm rain events, after 72 h at 31% and 67% RH; p < 0.001), whereas the relative humidity had no significant effect. Using the BiFC in the field, we obtained very tight relationships of the δ2H/δ18O values (r2 0.95) for three sites along an elevational gradient, not significantly different from that of the nearest ISOT station. CONCLUSIONS: Since the chosen experimental conditions were extreme compared with the field conditions, it was concluded that the BiFC is a highly reliable and inexpensive collector of rainwater for isotope analysis. Copyright (c) 2014 John Wiley & Sons, Ltd.
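As a rough illustration of how evaporative enrichment can be quantified from repeated measurements, the sketch below (not the paper's analysis; the numbers, the reference value and the linear fit are assumptions) expresses EE as the shift in δ18O relative to the reference water and fits a trend over the elapsed time between "rainfall" and collection.

```python
# Hypothetical sketch: evaporative enrichment (EE) as the delta-18O shift over time.
import numpy as np

reference_d18o = -10.0                       # per mil, assumed reference water value
elapsed_days = np.array([0, 2, 4, 7, 10])    # days between rainfall and collection
measured_d18o = np.array([-10.0, -9.8, -9.6, -9.3, -9.0])  # per mil, illustrative

enrichment = measured_d18o - reference_d18o  # EE in per mil
slope, intercept = np.polyfit(elapsed_days, enrichment, 1)
print(f"Enrichment after 10 days: {enrichment[-1]:.2f} per mil")
print(f"Fitted enrichment rate: {slope:.3f} per mil per day")
```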

Relevance: 100.00%

Publisher:

Abstract:

Citizens demand more and more data for making decisions in their daily life. Therefore, mechanisms that allow citizens to understand and analyze linked open data (LOD) in a user-friendly manner are highly required. To this aim, the concept of Open Business Intelligence (OpenBI) is introduced in this position paper. OpenBI enables non-expert users to (i) analyze and visualize LOD, thus generating actionable information by means of reporting, OLAP analysis, dashboards or data mining; and to (ii) share the newly acquired information as LOD to be reused by anyone. One of the most challenging issues of OpenBI is related to data mining, since non-experts (such as citizens) need guidance during preprocessing and application of mining algorithms due to the complexity of the mining process and the low quality of the data sources. This is even worse when dealing with LOD, not only because of the different kinds of links among data, but also because of its high dimensionality. As a consequence, in this position paper we advocate that data mining for OpenBI requires data quality-aware mechanisms for guiding non-expert users in obtaining and sharing the most reliable knowledge from the available LOD.
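To ground the argument, the following sketch shows one form such a data quality-aware step could take: pull a small LOD sample via SPARQL and profile the completeness of each attribute before handing the data to a mining algorithm. This is an illustration rather than the paper's proposal; the DBpedia endpoint, the query, and the use of SPARQLWrapper and pandas are assumptions.

```python
# Hypothetical sketch: profile completeness of a LOD sample before mining it.
import pandas as pd
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?city ?population ?areaTotal WHERE {
        ?city a dbo:City .
        OPTIONAL { ?city dbo:populationTotal ?population }
        OPTIONAL { ?city dbo:areaTotal ?areaTotal }
    } LIMIT 200
""")
sparql.setReturnFormat(JSON)
rows = sparql.query().convert()["results"]["bindings"]

# Flatten the SPARQL bindings and report the share of non-missing values per
# attribute, so a non-expert user sees up front which attributes are too sparse.
df = pd.DataFrame([{k: v["value"] for k, v in r.items()} for r in rows])
print(df.notna().mean().sort_values())
```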

Relevance: 100.00%

Publisher:

Abstract:

Purpose: To examine the visual performance of a single-optic accommodating intraocular lens (IOL) by correlating the defocus curve of IOL-implanted eyes with the intraocular aberrometric profile and the impact on quality of life (QOL). Methods: Prospective consecutive case series study including a total of 25 eyes of 14 patients aged between 52 and 79 years. All cases underwent cataract surgery with implantation of the single-optic accommodating IOL Crystalens HD (Bausch & Lomb). Distance and near visual acuity outcomes, intraocular aberrations, the defocus curve and QOL (NEI VFQ-25) were evaluated 3 months after surgery. Results: A significant improvement in distance visual acuity was found postoperatively (p = 0.02). Mean postoperative logMAR uncorrected near visual acuity was 0.44 ± 0.23 (20/30). 60% of eyes had a postoperative addition between 0 and 1.5 diopters (D). The defocus curve showed an area of maximum visual acuity for the levels of defocus corresponding to distance and intermediate vision (−1 to +0.5 D). Postoperative intermediate visual acuity correlated significantly with some QOL indices (r ≥ 0.51, p ≤ 0.03; difficulty in going down steps or seeing how people react to things the patient says) as well as with the J0 component of the manifest cylinder. Postoperative distance-corrected near visual acuity correlated significantly with age (r = 0.65, p < 0.01). Conclusions: This accommodating IOL seems to be able to restore distance visual function as well as to provide an improvement in intermediate and near vision, with a significant impact on patients' QOL, although limited by age and astigmatism. Future studies with larger sample sizes should confirm these trends.

Relevance: 100.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance: 100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance: 100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance: 100.00%

Publisher:

Abstract:

Background: Hospital performance reports based on administrative data should distinguish differences in quality of care between hospitals from case mix related variation and random error effects. A study was undertaken to determine which of 12 diagnosis-outcome indicators measured across all hospitals in one state had significant risk adjusted systematic (or special cause) variation (SV) suggesting differences in quality of care. For those that did, we determined whether SV persists within hospital peer groups, whether indicator results correlate at the individual hospital level, and how many adverse outcomes would be avoided if all hospitals achieved indicator values equal to the best performing 20% of hospitals. Methods: All patients admitted during a 12 month period to 180 acute care hospitals in Queensland, Australia with heart failure (n = 5745), acute myocardial infarction (AMI) (n = 3427), or stroke (n = 2955) were entered into the study. Outcomes comprised in-hospital deaths, long hospital stays, and 30 day readmissions. Regression models produced standardised, risk adjusted, diagnosis specific outcome event ratios for each hospital. Systematic and random variation in ratio distributions for each indicator were then apportioned using hierarchical statistical models. Results: Only five of 12 (42%) diagnosis-outcome indicators showed significant SV across all hospitals (long stays and same diagnosis readmissions for heart failure; in-hospital deaths and same diagnosis readmissions for AMI; and in-hospital deaths for stroke). Significant SV was only seen for two indicators within hospital peer groups (same diagnosis readmissions for heart failure in tertiary hospitals and in-hospital mortality for AMI in community hospitals). Only two pairs of indicators showed significant correlation. If all hospitals emulated the best performers, at least 20% of AMI and stroke deaths, heart failure long stays, and heart failure and AMI readmissions could be avoided. Conclusions: Diagnosis-outcome indicators based on administrative data require validation as markers of significant risk adjusted SV. Validated indicators allow quantification of realisable outcome benefits if all hospitals achieved best performer levels. The overall level of quality of care within single institutions cannot be inferred from the results of one or a few indicators.
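The core of the method above is a standardised (risk-adjusted) outcome ratio per hospital: observed events divided by the events expected from a patient-level case-mix model. The sketch below illustrates that single step on simulated data; the variable names, the toy risk model and the use of scikit-learn are assumptions, and the hierarchical partition of systematic versus random variation is omitted.

```python
# Hypothetical sketch: risk-adjusted outcome event ratios (observed / expected).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "hospital": rng.integers(0, 20, n),       # 20 toy hospitals
    "age": rng.normal(70, 10, n),
    "comorbidity": rng.integers(0, 4, n),
})
# Simulated in-hospital death driven by patient factors only.
risk = 1 / (1 + np.exp(-(-6 + 0.05 * df["age"] + 0.4 * df["comorbidity"])))
df["died"] = rng.random(n) < risk

# Case-mix model: expected probability of death given patient factors.
model = LogisticRegression().fit(df[["age", "comorbidity"]], df["died"])
df["expected"] = model.predict_proba(df[["age", "comorbidity"]])[:, 1]

# Standardised ratio per hospital: observed deaths / sum of expected probabilities.
totals = df.groupby("hospital")[["died", "expected"]].sum()
print((totals["died"] / totals["expected"]).round(2))
```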

Relevance: 100.00%

Publisher:

Abstract:

Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions: Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. Methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
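The two sensitivity checks mentioned above (excluding low-quality populations and weighting by quality scores) amount to a few lines of arithmetic; the sketch below shows them on made-up numbers. The population names, values and the 0-1 quality scale are assumptions, not MONICA data.

```python
# Hypothetical sketch: sensitivity testing a summary estimate with quality scores.
import numpy as np
import pandas as pd

pops = pd.DataFrame({
    "population": ["A", "B", "C", "D"],
    "risk_factor_mean": [5.6, 5.9, 6.3, 5.4],   # e.g. mean total cholesterol, mmol/L
    "quality_score": [0.9, 0.7, 0.4, 0.95],     # 0-1, higher = better data quality
})

overall = pops["risk_factor_mean"].mean()
excluded = pops.loc[pops["quality_score"] >= 0.5, "risk_factor_mean"].mean()
weighted = np.average(pops["risk_factor_mean"], weights=pops["quality_score"])

print(f"All populations:        {overall:.2f}")
print(f"Low quality excluded:   {excluded:.2f}")
print(f"Quality-score weighted: {weighted:.2f}")
```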

Relevance: 100.00%

Publisher:

Abstract:

Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities, and the omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this paper we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation. We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of Earth observation, we believe the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
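One way to picture the metadata inheritance mechanism mentioned above is a collection-level quality record that every dataset inherits unless it overrides individual fields. The sketch below is an illustration of that idea only; the field names and classes are assumptions, not the paper's actual quality model.

```python
# Hypothetical sketch: dataset-level quality metadata inherited from a collection.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class QualityInfo:
    accuracy: Optional[str] = None
    completeness: Optional[str] = None
    lineage: Optional[str] = None

@dataclass
class Collection:
    name: str
    quality: QualityInfo

@dataclass
class Dataset:
    name: str
    parent: Collection
    quality_override: QualityInfo = field(default_factory=QualityInfo)

    def effective_quality(self) -> QualityInfo:
        # Use the dataset's own value where present, otherwise inherit the parent's.
        parent_q = self.parent.quality
        return QualityInfo(
            accuracy=self.quality_override.accuracy or parent_q.accuracy,
            completeness=self.quality_override.completeness or parent_q.completeness,
            lineage=self.quality_override.lineage or parent_q.lineage,
        )

collection = Collection("SST_L3_collection", QualityInfo(accuracy="+/-0.4 K", lineage="reprocessing v2"))
granule = Dataset("SST_L3_2010-06-01", collection, QualityInfo(completeness="98% valid pixels"))
print(granule.effective_quality())
```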