969 results for Data Quality


Relevance:

60.00%

Publisher:

Abstract:

At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been carried out for many years, and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to today's object-oriented schemas. Unfortunately, researchers have been able to offer practitioners little theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty of applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, but easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques for which similar meta models have been defined. Moreover, this approach supports the identification of patterns of constructs that may be common across the meta models of different modelling techniques. Such findings are useful in extending and refining the BWW theory. (C) 2002 Elsevier Science Ltd. All rights reserved.
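As a rough illustration of the idea of expressing ontological constructs in a meta language familiar to IS professionals, the hypothetical Python sketch below encodes a few core BWW-style constructs (thing, property, state, transformation) as typed classes. The names and relationships are illustrative assumptions only, not the meta model defined in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Property:
    """A BWW-style property: something a thing possesses (intrinsic) or shares (mutual)."""
    name: str
    intrinsic: bool = True

@dataclass
class Thing:
    """A BWW-style thing: the elementary unit, characterized by its properties."""
    name: str
    properties: list[Property] = field(default_factory=list)

    def state(self) -> dict[str, bool]:
        # A state is the set of values of a thing's attributes at one moment;
        # simplified here to "which properties the thing currently possesses".
        return {p.name: True for p in self.properties}

@dataclass
class Transformation:
    """A BWW-style transformation: a mapping from one state of a thing to another."""
    name: str

    def apply(self, thing: Thing, gained: Property) -> None:
        thing.properties.append(gained)  # the state change corresponds to an event

# Usage: a 'customer' thing gains a mutual property through a transformation.
customer = Thing("customer", [Property("name"), Property("address")])
Transformation("open_account").apply(customer, Property("account_holder", intrinsic=False))
print(customer.state())
```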

Relevance:

60.00%

Publisher:

Abstract:

The aim of this study was to review the published literature values for the selenium content of Australian foods. A secondary aim was to compare the results for Australian foods with food composition data from international sources to investigate the extent of geographical variation. Published food composition data sources for the selenium content in Australian foods were identified and assessed for data quality using established criteria. The selenium content is available for 148 individual food items. The highest values found are for fish (12.0-63.2 µg/100 g), meats (4.75-37.9 µg/100 g) and eggs (9.00-41.4 µg/100 g), followed by cereals (1.00-20.3 µg/100 g). Moderate levels are seen in dairy products (2.00-7.89 µg/100 g) while most fruits and vegetables have low levels (trace-3.27 µg/100 g). High selenium foods show the greatest level of geographical variation, with foods from the United States generally having higher selenium levels than Australian foods and foods from the United Kingdom and New Zealand having lower levels. This is the first attempt to review the available literature for selenium composition of Australian foods. These data serve as an interim measure for the assessment of selenium intake for use in epidemiological studies of diet-disease relationships. (C) 2002 Published by Elsevier Science Ltd.
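A minimal sketch of how such min-max ranges per food group can be tabulated from a compiled composition table is shown below; the DataFrame rows are invented examples echoing the abstract's range endpoints, not the review's actual data.

```python
import pandas as pd

# Hypothetical excerpt of a compiled food composition table (values in µg/100 g);
# the entries below are illustrative assumptions, not the paper's data.
records = [
    ("fish",      "tuna, canned",    63.2),
    ("fish",      "white fish",      12.0),
    ("meat",      "beef, lean",      37.9),
    ("meat",      "chicken breast",   4.75),
    ("cereal",    "wheat bread",     20.3),
    ("cereal",    "white rice",       1.00),
    ("dairy",     "cheddar cheese",   7.89),
    ("dairy",     "whole milk",       2.00),
    ("vegetable", "potato",           3.27),
]
df = pd.DataFrame(records, columns=["group", "food", "se_ug_per_100g"])

# Min-max range per food group: the form in which the review reports its results.
ranges = df.groupby("group")["se_ug_per_100g"].agg(["min", "max", "count"])
print(ranges)
```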

Relevance:

60.00%

Publisher:

Abstract:

Although postal questionnaires, personal interviews, and telephone interviews are the main methods of survey-based research, e-mail is increasingly used as a data collection medium. However, little, if any, published research, Western in general and Turkish in particular, has investigated the e-mail survey technique from a pure survey research perspective. In an attempt to develop a framework for assessing e-mail as a data collection medium, the purpose of this study is to explore the e-mail-based questionnaire technique from complementary angles. To this end, sample representativeness, data quality, response rates, and the advantages and disadvantages of e-mail surveying are discussed.

Relevance:

60.00%

Publisher:

Abstract:

Dissertation submitted in fulfilment of the requirements for the degree of Master in Engineering, area of specialisation in Vias de Comunicação e Transportes (Roads and Transport).

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE: To analyze the cases of tuberculosis and the impact of direct follow-up on the assessment of treatment outcomes. METHODS: This open prospective cohort study evaluated 504 cases of tuberculosis reported in the Sistema de Informação de Agravos de Notificação (SINAN – Notifiable Diseases Information System) in Juiz de Fora, MG, Southeastern Brazil, between 2008 and 2009. The incidence of treatment outcomes was compared between a group of patients diagnosed with tuberculosis and directly followed up through monthly consultations during return visits (287) and a group for which the information was collected indirectly (217) through the city's surveillance system. The Chi-square test was used to compare percentages, with a significance level of 0.05. The relative risk (RR) was used to evaluate differences in the incidence rate of each type of treatment outcome between the two groups. RESULTS: Of the outcomes evaluated directly and indirectly, 18.5% and 3.2% corresponded to treatment default and 3.8% and 0.5% to treatment failure, respectively. The incidence of treatment default and failure was higher in the group with direct follow-up (p < 0.05) (RR = 5.72, 95%CI 2.65;12.34, and RR = 8.31, 95%CI 1.08;63.92, respectively). CONCLUSIONS: A higher incidence of treatment default and failure was observed in the directly followed up group, and most of these cases were missed by the disease reporting system. Effective measures are therefore needed to improve tuberculosis control and data quality.
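The reported effect sizes can be reproduced from the group sizes and percentages using the standard log-method confidence interval for a relative risk. In the sketch below, the event counts (53/287 and 7/217) are approximations back-calculated from the reported percentages, not the study's raw data.

```python
from math import exp, log, sqrt

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs group B with a log-method 95% CI."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)  # SE of log(RR)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# Approximate counts reconstructed from the abstract: 53/287 ≈ 18.5% defaults
# with direct follow-up, 7/217 ≈ 3.2% with indirect collection (assumptions).
print(relative_risk(53, 287, 7, 217))  # ≈ (5.72, 2.65, 12.34), as reported
```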

Relevance:

60.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

60.00%

Publisher:

Abstract:

The MAP-i Doctoral Programme in Informatics of the Universities of Minho, Aveiro and Porto.

Relevance:

60.00%

Publisher:

Abstract:

1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately.

2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment in which models were calibrated with the original, accurate data, and (2) an error treatment in which the data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate by a random number drawn from a normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distribution modelling techniques applied to 40 species in four distinct geographical regions.

3. Locational error in occurrences reduced model performance in three of these regions; relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, performed best in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors.

4. Synthesis and applications. To use the vast array of occurrence data that currently exists for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
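A minimal sketch of the error treatment described in point 2 is given below. It assumes occurrences in a projected coordinate system in metres and noise applied independently to each axis; both are interpretations for illustration, not details confirmed by the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def degrade(coords_m: np.ndarray, sd_m: float = 5000.0) -> np.ndarray:
    """Simulate locational error: shift each occurrence by Gaussian noise
    (mean 0, SD 5 km), drawn independently for the x and y coordinates."""
    return coords_m + rng.normal(loc=0.0, scale=sd_m, size=coords_m.shape)

# Usage: occurrences as an (n, 2) array of projected x/y coordinates in
# metres (an assumed CRS); models would be calibrated on the degraded points.
occurrences = np.array([[500_000.0, 4_200_000.0],
                        [512_300.0, 4_198_750.0]])
error_treatment = degrade(occurrences)
print(error_treatment - occurrences)  # per-point displacement in metres
```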

Relevance:

60.00%

Publisher:

Abstract:

Neurocritical care depends, in part, on careful patient monitoring, but there are as yet few data on which processes are the most important to monitor, how these should be monitored, and whether monitoring these processes is cost-effective and has an impact on outcome. At the same time, bioinformatics is a rapidly emerging field in critical care, but there is as yet little agreement or standardization on what information is important and how it should be displayed and analyzed. The Neurocritical Care Society, in collaboration with the European Society of Intensive Care Medicine, the Society of Critical Care Medicine, and the Latin America Brain Injury Consortium, organized an international, multidisciplinary consensus conference to begin to address these needs. International experts from neurosurgery, neurocritical care, neurology, critical care, neuroanesthesiology, nursing, pharmacy, and informatics were recruited on the basis of their research, publication record, and expertise. They undertook a systematic literature review to develop recommendations on specific topics concerning the physiologic processes important to the care of patients with disorders that require neurocritical care. This review does not make recommendations about treatment, imaging, or intraoperative monitoring. A multidisciplinary jury, selected for their expertise in clinical investigation and the development of practice guidelines, guided this process. The GRADE system was used to develop recommendations based on the literature review, discussion, integration of the literature with the participants' collective experience, and critical review by an impartial jury. Emphasis was placed on the principle that recommendations should be based both on data quality and on trade-offs and translation into clinical practice. Strong consideration was given to providing pragmatic guidance and recommendations for bedside neuromonitoring, even in the absence of high-quality data.

Relevance:

60.00%

Publisher:

Abstract:

The remit of the Institute of Public Health in Ireland (IPH) is to promote cooperation for public health between Northern Ireland and the Republic of Ireland in the areas of research and information, capacity building and policy advice. Our approach is to support the Departments of Health and their agencies in both jurisdictions, and to maximise the benefits of all-island cooperation to achieve practical benefits for people in Northern Ireland and the Republic of Ireland. IPH has previously responded to consultations on the Department of Health's Discussion Paper on the Proposed Health Information Bill (June 2008), the Health Information and Quality Authority's Corporate Plan (Oct 2007), and the Road Safety Authority of Ireland's Road Safety Strategy (Jul 2012). IPH supports the development of a national standard demographic dataset for use within the health and social care services. Provided the necessary safeguards are put in place (such as ethics and data protection) and the purpose of collecting the information is fully explained to subjects, mandatory provision of a minimum demographic dataset is usually the best way to achieve the necessary coverage and data quality. Demographic information is needed in several forms to support the public health function:
- detailed aggregated information for comparison with population counts, in order to assess equity of access to healthcare and to examine population patterns and trends in morbidity and mortality;
- accurate demographic information for the surveillance of infectious disease outbreaks, monitoring vaccination programmes, and setting priorities for public health interventions;
- information that can be linked to data outside health and social care, such as population data, survey data, and longitudinal studies, for research and analysis purposes;
- information to identify and address public health issues, tackle health inequalities, and monitor the success of efforts to tackle them.
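As a loose illustration only, the sketch below shows what one record of a minimum demographic dataset might look like in code; every field name here is a hypothetical assumption, not the national standard discussed in this consultation response.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DemographicRecord:
    """One entry of a hypothetical minimum demographic dataset; the fields
    are illustrative assumptions, not the standard under discussion."""
    unique_id: str                   # pseudonymised identifier enabling record linkage
    date_of_birth: date              # supports age-specific rates and trend analysis
    sex: str
    area_code: str                   # small-area geography for equity-of-access analyses
    ethnicity: Optional[str] = None  # collected only where safeguards permit

record = DemographicRecord("anon-000123", date(1970, 5, 1), "F", "D08")
print(record)
```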

Relevance:

60.00%

Publisher:

Abstract:

The EHLASS survey was set up in April 1986 as a five-year demonstration project. The objective was to monitor home and leisure accidents in a harmonised manner throughout the EU, to determine their causes, the circumstances of their occurrence and their consequences and, most importantly, to provide information on the consumer products involved. Armed with accurate information, it was felt that consumer policy could be directed at the most serious problems and the best use could be made of available resources.

Data collection systems were set up for the collection of EHLASS data in the casualty departments of selected hospitals in each of the member states. The information was subsequently gathered together by the European Commission in Brussels. Extensive analysis was undertaken on 778,838 accidents reported throughout the EU. Centralised analysis of EHLASS data proved problematic due to a lack of co-ordination in data quality. In 1989 it was decided that each member state should produce its own annual EHLASS report in a harmonised format specified by the European Commission. This report is the ninth such report for Ireland.

Relevance:

60.00%

Publisher:

Abstract:

The report provides an analysis of PCT participation levels and investigates data quality issues in the collection of the 2007/08 NCMP dataset.

Relevance:

60.00%

Publisher:

Abstract:

Astrocytes have recently become a major center of interest in neurochemistry following the discoveries on their major role in brain energy metabolism. An interesting way to probe this glial contribution is in vivo ¹³C NMR spectroscopy coupled with the infusion of a labeled glial-specific substrate, such as acetate. In this study, we infused alpha-chloralose-anesthetized rats with [2-¹³C]acetate and followed the dynamics of the fractional enrichment (FE) in the positions C4 and C3 of glutamate and glutamine with high sensitivity, using ¹H-[¹³C] magnetic resonance spectroscopy (MRS) at 14.1 T. Applying a two-compartment mathematical model to the measured time courses yielded a glial tricarboxylic acid (TCA) cycle rate (Vg) of 0.27 ± 0.02 μmol/g/min and a glutamatergic neurotransmission rate (VNT) of 0.15 ± 0.01 μmol/g/min. Glial oxidative ATP metabolism thus accounts for 38% of the total oxidative metabolism measured by NMR. The pyruvate carboxylase rate (VPC) was 0.09 ± 0.01 μmol/g/min, corresponding to 37% of the glial glutamine synthesis rate. The glial and neuronal transmitochondrial fluxes (Vx(g) and Vx(n)) were of the same order of magnitude as the respective TCA cycle fluxes. In addition, we estimated a glial glutamate pool size of 0.6 ± 0.1 μmol/g. The effect of spectral data quality on the flux estimates was analyzed by Monte Carlo simulations. In this ¹³C-acetate labeling study, we propose a refined two-compartment analysis of brain energy metabolism based on ¹³C turnover curves of acetate, glutamate and glutamine measured with state-of-the-art in vivo dynamic MRS at high magnetic field in rats, enabling a deeper understanding of the specific role of glial cells in brain oxidative metabolism. In addition, the robustness of the metabolic flux determination with respect to MRS data quality was carefully studied.
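The Monte Carlo analysis mentioned here can be sketched with a toy mono-exponential turnover curve standing in for the full two-compartment model; the rate, enrichment plateau and noise level below are assumptions chosen only to illustrate the principle of propagating spectral noise into flux-estimate uncertainty.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def turnover(t, v, fe_max):
    """Toy mono-exponential label-turnover curve; the paper fits a full
    two-compartment model, so this shows only the fitting principle."""
    return fe_max * (1.0 - np.exp(-v * t))

t = np.linspace(0.0, 120.0, 40)            # acquisition times (min), assumed
truth = turnover(t, v=0.05, fe_max=0.6)    # assumed 'true' enrichment curve

# Monte Carlo: add spectral noise of an assumed SD, refit, and collect the
# spread of the rate estimates to judge sensitivity to data quality.
estimates = []
for _ in range(1000):
    noisy = truth + rng.normal(0.0, 0.02, size=t.shape)
    popt, _ = curve_fit(turnover, t, noisy, p0=(0.03, 0.5))
    estimates.append(popt[0])

print(np.mean(estimates), np.std(estimates))  # bias and precision of the rate
```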

Relevance:

60.00%

Publisher:

Abstract:

The simultaneous recording of scalp electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) can provide unique insights into the dynamics of human brain function, and the increased functional sensitivity offered by ultra-high-field fMRI opens exciting perspectives for the future of this multimodal approach. However, simultaneous recordings are susceptible to various types of artifacts, many of which scale with magnetic field strength and can seriously compromise both EEG and fMRI data quality in recordings above 3 T. The aim of the present study was to implement and characterize an optimized setup for simultaneous EEG-fMRI in humans at 7 T. The effects of EEG cable length and geometry for signal transmission between the cap and amplifiers were assessed in a phantom model, with specific attention to noise contributions from the MR scanner coldheads. Cable shortening (down to 12 cm from cap to amplifiers) and bundling effectively reduced environment noise by up to 84% in average power and 91% in inter-channel power variability. Subject safety was assessed and confirmed via numerical simulations of RF power distribution and temperature measurements on a phantom model, building on the limited existing literature at ultra-high field. MRI data degradation effects due to the EEG system were characterized via B0 and B1+ field mapping on a human volunteer, demonstrating important, although not prohibitive, B1 disruption effects. With the optimized setup, simultaneous EEG-fMRI acquisitions were performed on 5 healthy volunteers undergoing two visual paradigms: an eyes-open/eyes-closed task and a visual evoked potential (VEP) paradigm using reversing-checkerboard stimulation. EEG data exhibited clear occipital alpha modulation and average VEPs, respectively, with concomitant BOLD signal changes. On a single-trial level, alpha power variations could be observed with relative confidence in all trials; VEP detection was more limited, although statistically significant responses could be detected in more than 50% of trials for every subject. Overall, we conclude that the proposed setup is well suited for simultaneous EEG-fMRI at 7 T.
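A minimal sketch of how the two reported noise metrics (average power across channels and inter-channel power variability) can be computed from multichannel recordings is shown below; the simulated channel counts and amplitudes are arbitrary assumptions, not phantom measurements from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

def noise_metrics(eeg: np.ndarray) -> tuple[float, float]:
    """Average noise power across channels and inter-channel power variability
    (SD of per-channel power): the two quantities compared between setups."""
    power = (eeg ** 2).mean(axis=1)  # per-channel mean power
    return float(power.mean()), float(power.std())

# Simulated 32-channel phantom recordings (channels x samples); the noise
# amplitudes are assumptions chosen only to exercise the comparison.
long_cables   = rng.normal(0.0, 10.0, (32, 5000))  # baseline setup
short_bundled = rng.normal(0.0, 4.0, (32, 5000))   # shortened, bundled cables

p0, v0 = noise_metrics(long_cables)
p1, v1 = noise_metrics(short_bundled)
print(f"average power reduced by {100 * (1 - p1 / p0):.0f}%")
print(f"inter-channel variability reduced by {100 * (1 - v1 / v0):.0f}%")
```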

Relevance:

60.00%

Publisher:

Abstract:

PRINCIPLES: International guidelines for heart failure (HF) care recommend the implementation of inter-professional disease management programmes. To date, no such programme has been tested in Switzerland. The aim of this randomised controlled trial (RCT) was to test the effect of an adult ambulatory disease management programme for patients with HF in Switzerland on hospitalisation, mortality, and quality of life. METHODS: Consecutive patients admitted to internal medicine in a Swiss university hospital were screened for decompensated HF. A total of 42 eligible patients were randomised to an intervention (n = 22) or a usual care group (n = 20). Medical treatment was optimised and lifestyle recommendations were given to all patients. Intervention patients additionally received a home visit by an HF nurse, followed by 17 telephone calls of decreasing frequency over 12 months, focusing on self-care. Calls from the HF nurse to primary care physicians communicated health concerns and identified goals of care. Data were collected at baseline and at 3, 6, 9 and 12 months. Mixed regression analysis was used for quality of life. Outcome assessment was conducted by researchers blinded to group assignment. RESULTS: After 12 months, 22 (52%) patients had been re-admitted for any cause or had died. Only 3 patients were hospitalised with HF decompensation. No significant effect of the intervention was found on HF-related quality of life. CONCLUSIONS: An inter-professional disease management programme is feasible in the Swiss healthcare setting, but effects on outcomes need to be confirmed in larger studies.
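A hedged sketch of the kind of mixed regression analysis mentioned for the repeated quality-of-life measurements follows, using statsmodels with a random intercept per patient; the column names and data values are invented for illustration and do not come from the trial.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy long-format data: one row per patient per assessment; a real analysis
# would use all five time points (baseline, 3, 6, 9 and 12 months).
data = pd.DataFrame({
    "patient": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "months":  [0, 12] * 6,
    "group":   ["intervention"] * 6 + ["usual_care"] * 6,
    "qol":     [55, 63, 48, 60, 50, 58, 52, 54, 47, 49, 58, 61],
})

# A random intercept per patient accounts for the repeated measures; the
# months x group interaction term estimates the intervention effect on QoL.
model = smf.mixedlm("qol ~ months * group", data, groups=data["patient"])
result = model.fit()
print(result.summary())
```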