22 results for Data Quality Management

in University of Queensland eSpace - Australia


Relevance: 100.00%

Abstract:

Although managers consider accurate, timely, and relevant information as critical to the quality of their decisions, evidence of large variations in data quality abounds. Over a period of twelve months, the action research project reported herein attempted to investigate and track data quality initiatives undertaken by the participating organisation. The investigation focused on two types of errors: transaction input errors and processing errors. Whenever the action research initiative identified non-trivial errors, the participating organisation introduced actions to correct the errors and prevent similar errors in the future. Data quality metrics were taken quarterly to measure improvements resulting from the activities undertaken during the action research project. The action research project results indicated that for a mission-critical database to ensure and maintain data quality, commitment to continuous data quality improvement is necessary. Also, communication among all stakeholders is required to ensure common understanding of data quality improvement goals. The action research project found that to further substantially improve data quality, structural changes within the organisation and to the information systems are sometimes necessary. The major goal of the action research study is to increase the level of data quality awareness within all organisations and to motivate them to examine the importance of achieving and maintaining high-quality data.
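The abstract does not spell out which metrics were taken each quarter; as an illustrative sketch, one plausible quarterly metric is an input-error rate over a sampled batch of transactions (the function name and numbers below are hypothetical, not from the study):

```python
def error_rate(records_checked, errors_found):
    """Proportion of sampled records containing a non-trivial error."""
    if records_checked == 0:
        raise ValueError("cannot compute a rate over zero records")
    return errors_found / records_checked

# Hypothetical quarterly samples: (records checked, errors found).
quarters = {"Q1": (2000, 180), "Q2": (2000, 130), "Q3": (2000, 90), "Q4": (2000, 55)}
rates = {quarter: error_rate(n, e) for quarter, (n, e) in quarters.items()}
```

Tracking such a rate across quarters is one simple way to quantify whether corrective actions are actually reducing error accumulation.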

Relevance: 100.00%

Abstract:

This paper reviews the key features of an environment to support domain users in spatial information system (SIS) development. It presents a full design and prototype implementation of a repository system for the storage and management of metadata, focusing on a subset of spatial data integrity constraint classes. The system is designed to support spatial system development and customization by users within the domain in which the system will operate.
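The paper's repository design is not reproduced in the abstract; as a minimal sketch under stated assumptions, such a metadata repository might store spatial integrity constraints and retrieve them by constraint class (all class names and fields below are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class IntegrityConstraint:
    name: str
    constraint_class: str  # e.g. "topological" or "attribute-range" (illustrative classes)
    rule: str              # the constraint expression, in whatever language the SIS uses

@dataclass
class MetadataRepository:
    _store: dict = field(default_factory=dict)

    def add(self, constraint: IntegrityConstraint) -> None:
        self._store[constraint.name] = constraint

    def by_class(self, constraint_class: str) -> list:
        """All stored constraints belonging to one constraint class."""
        return [c for c in self._store.values()
                if c.constraint_class == constraint_class]
```

Keyed storage with retrieval by class mirrors the idea of managing a subset of constraint classes as first-class metadata rather than burying them in application code.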

Relevance: 100.00%

Abstract:

Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis.

Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, the information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores.

Conclusions: Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. The methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
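The two sensitivity tests described — excluding low-scoring populations and weighting by quality score — can be sketched minimally, assuming one scalar quality score per population (the numbers in the test are hypothetical):

```python
def quality_weighted_mean(values, scores):
    """Mean of population-level values, weighted by their data-quality scores."""
    total = sum(scores)
    if total == 0:
        raise ValueError("all quality scores are zero")
    return sum(v * s for v, s in zip(values, scores)) / total

def exclude_low_quality(values, scores, threshold):
    """Keep only the values whose population met the quality threshold."""
    return [v for v, s in zip(values, scores) if s >= threshold]
```

Running both variants and comparing them against the unweighted analysis shows how sensitive the survey's conclusions are to data-quality differences between populations.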

Relevance: 100.00%

Abstract:

With the proliferation of relational database programs for PCs and other platforms, many business end-users are creating, maintaining, and querying their own databases. More importantly, business end-users use the output of these queries as the basis for operational, tactical, and strategic decisions. Inaccurate data reduce the expected quality of these decisions. Implementing various input validation controls, including higher levels of normalisation, can reduce the number of data anomalies entering the databases. Even in well-maintained databases, however, data anomalies will still accumulate. To improve the quality of data, databases can be queried periodically to locate and correct anomalies. This paper reports the results of two experiments that investigated the effects of different data structures on business end-users' abilities to detect data anomalies in a relational database. The results demonstrate that both unnormalised structures and higher levels of normalisation lower the effectiveness and efficiency of queries relative to first normal form. First normal form databases appear to provide the most effective and efficient data structure for business end-users formulating queries to detect data anomalies.
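The experimental queries themselves are not given in the abstract; as an illustrative sketch, one common anomaly check on a first normal form table is a functional-dependency violation, where the same key value maps to more than one value of a dependent attribute (the table and field names below are hypothetical):

```python
from collections import defaultdict

def fd_violations(rows, determinant, dependent):
    """Determinant values that map to more than one dependent value."""
    seen = defaultdict(set)
    for row in rows:
        seen[row[determinant]].add(row[dependent])
    return {key: vals for key, vals in seen.items() if len(vals) > 1}

# Hypothetical customer table in first normal form.
customers = [
    {"cust_id": 1, "cust_name": "Acme"},
    {"cust_id": 1, "cust_name": "Acme Pty Ltd"},  # anomaly: two names for one id
    {"cust_id": 2, "cust_name": "Bruce & Co"},
]
anomalies = fd_violations(customers, "cust_id", "cust_name")
```

Expressed as a query against a 1NF table this check stays simple, which is consistent with the abstract's finding that 1NF structures made anomaly-detection queries most effective.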

Relevance: 100.00%

Abstract:

The data structure of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. This research develops a methodology for evaluating, ex ante, the relative desirability of alternative data structures for end user queries. This research theorizes that the data structure that yields the lowest weighted average complexity for a representative sample of information requests is the most desirable data structure for end user queries. The theory was tested in an experiment that compared queries from two different relational database schemas. As theorized, end users querying the data structure associated with the less complex queries performed better. Complexity was measured using three different Halstead metrics. Each of the three metrics provided excellent predictions of end user performance. This research supplies strong evidence that organizations can use complexity metrics to evaluate, ex ante, the desirability of alternative data structures. Organizations can use these evaluations to enhance the efficient and effective retrieval of information by creating data structures that minimize end user query complexity.
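Halstead's metrics derive from counts of operators and operands; as a sketch, the standard vocabulary, length, volume, difficulty, and effort formulas can be applied to a toy tokenised query (the tokenisation below is illustrative — the abstract does not say which three metrics were used or how queries were tokenised):

```python
import math

def halstead(operators, operands):
    """Halstead metrics from the operator and operand occurrences of one query."""
    n1, n2 = len(set(operators)), len(set(operands))  # distinct counts
    N1, N2 = len(operators), len(operands)            # total counts
    volume = (N1 + N2) * math.log2(n1 + n2)           # length * log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"volume": volume, "difficulty": difficulty, "effort": difficulty * volume}

# Toy tokenisation of: SELECT name FROM customers WHERE id = 42
metrics = halstead(
    operators=["SELECT", "FROM", "WHERE", "="],
    operands=["name", "customers", "id", "42"],
)
```

Averaging such scores over a representative sample of information requests gives the per-schema weighted average complexity the methodology compares.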

Relevance: 90.00%

Abstract:

The Brisbane River and Moreton Bay Study, an interdisciplinary study of Moreton Bay and its major tributaries, was initiated to address water quality issues that link sewage and diffuse loading with environmental degradation. Runoff and deposition of fine-grained sediments into Moreton Bay, followed by resuspension, have been linked with increased turbidity and significant loss of seagrass habitat. Sewage-derived nutrient enrichment, particularly nitrogen (N), has been linked to algal blooms by sewage plume maps. Blooms of a marine cyanobacterium, Lyngbya majuscula, in Moreton Bay have had significant impacts on human health (e.g., contact dermatitis) and ecological health (e.g., seagrass loss), and the availability of dissolved iron from acid sulfate soil runoff has been hypothesised as a contributing factor. The impacts of catchment activities resulting in runoff of sediments, nutrients and dissolved iron on the health of the Moreton Bay waterways are addressed. The Study, established by six local councils in association with two state departments in 1994, forms a regional component of a national and state program to achieve ecologically sustainable use of the waterways by protecting and enhancing their health while maintaining economic and social development. The Study framework illustrates a unique integrated approach to water quality management in which scientific research, community participation and strategy development proceeded in parallel. This collaborative effort resulted in a water quality management strategy that focuses on integrating the socioeconomic and ecological values of the waterways. This work has led to significant cost savings in infrastructure by providing a clear focus for initiatives towards achieving healthy waterways. The Study's Stage 2 initiatives form the basis for this paper.

Relevance: 90.00%

Abstract:

X-ray crystallography is the most powerful method for determining the three-dimensional structure of biological macromolecules. One of the major obstacles in the process is the production of high-quality crystals for structure determination. All too often, crystals are produced that are of poor quality and are unsuitable for diffraction studies. This review provides a compilation of post-crystallization methods that can convert poorly diffracting crystals into data-quality crystals. Protocols for annealing, dehydration, soaking and cross-linking are outlined and examples of some spectacular changes in crystal quality are provided. The protocols are easily incorporated into the structure-determination pipeline and a practical guide is provided that shows how and when to use the different post-crystallization treatments for improving crystal quality.

Relevance: 90.00%

Abstract:

Even when data repositories exhibit near-perfect data quality, users may formulate queries that do not correspond to the information requested. Users' poor information retrieval performance may arise either from problems understanding the data models that represent the real-world systems or from limitations in their query skills. This research focuses on users' understanding of the data structures, i.e., their ability to map the information request onto the data model. The Bunge-Wand-Weber ontology was used to formulate three sets of hypotheses. Two laboratory experiments (one using a small data model and one using a larger data model) tested the effect of ontological clarity on users' performance when undertaking component-, record-, and aggregate-level tasks. For the hypotheses comparing different representations with equivalent semantics, the results indicate that participants using the parsimonious data model performed better on component-level tasks, whereas participants using the ontologically clearer data model performed better on record- and aggregate-level tasks.

Relevance: 80.00%

Abstract:

There is substantial disagreement among published epidemiological studies regarding environmental risk factors for Parkinson's disease (PD). Differences in the quality of measurement of environmental exposures may contribute to this variation. The current study examined the test–retest repeatability of self-report data on risk factors for PD obtained from a series of 32 PD cases recruited from neurology clinics and 29 healthy sex-, age- and residential suburb-matched controls. Exposure data were collected in face-to-face interviews using a structured questionnaire derived from previous epidemiological studies. High repeatability was demonstrated for 'lifestyle' exposures, such as smoking and coffee/tea consumption (kappas 0.70–1.00). Environmental exposures that involved some action by the person, such as pesticide application and use of solvents and metals, also showed high repeatability (kappas > 0.78). Lower repeatability was seen for rural residency and bore water consumption (kappas 0.39–0.74). In general, we found that case and control participants provided similar rates of incongruent and missing responses for categorical and continuous occupational, domestic, lifestyle and medical exposures.
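The kappa values quoted above measure chance-corrected agreement between the two interview rounds; a minimal sketch of Cohen's kappa for categorical responses (the toy response vectors in the test are illustrative, not the study's data):

```python
def cohens_kappa(first, second):
    """Cohen's kappa: chance-corrected test-retest agreement of two response vectors."""
    assert len(first) == len(second) and first
    n = len(first)
    # Observed agreement: fraction of identical responses across the two rounds.
    observed = sum(a == b for a, b in zip(first, second)) / n
    # Expected agreement by chance, from each round's marginal response frequencies.
    expected = sum(
        (first.count(c) / n) * (second.count(c) / n)
        for c in set(first) | set(second)
    )
    return (observed - expected) / (1 - expected)  # undefined when expected == 1
```

A kappa of 1.0 means perfect repeatability, while 0.0 means agreement no better than chance — the scale on which the lifestyle exposures (0.70–1.00) outperformed rural residency and bore water consumption (0.39–0.74).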