40 results for Data Warehousing Systems
in University of Queensland eSpace - Australia
Abstract:
Large amounts of information can be overwhelming and costly to process, especially when transmitting data over a network. A typical modern Geographical Information System (GIS) brings all types of data together based on the geographic component of the data and provides simple point-and-click query capabilities as well as complex analysis tools. Querying a Geographical Information System, however, can be prohibitively expensive due to the large amounts of data which may need to be processed. Since the use of GIS technology has grown dramatically in the past few years, there is now a greater need than ever to provide users with the fastest and least expensive query capabilities, especially since an estimated 80% of data stored in corporate databases has a geographical component. However, not every application requires the same high-quality data for its processing. In this paper we address the issues of reducing the cost and response time of GIS queries by pre-aggregating data, compromising data accuracy and precision in return. We present computational issues in the generation of multi-level resolutions of spatial data and show that the problem of finding the best approximation for a given region and a real-valued function on this region, under a predictable error, is in general NP-complete.
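The pre-aggregation idea described above can be sketched minimally: a spatial value grid is averaged into progressively coarser levels, so aggregate queries can be answered cheaply from a coarse level at the cost of bounded accuracy. All names and the halving scheme below are illustrative assumptions, not taken from the paper.

```python
def coarsen(grid, factor):
    """Average non-overlapping factor x factor blocks of a 2-D grid of floats."""
    h, w = len(grid), len(grid[0])
    out = []
    for i in range(0, h, factor):
        row = []
        for j in range(0, w, factor):
            block = [grid[i + di][j + dj]
                     for di in range(factor) for dj in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

def build_pyramid(grid, levels):
    """Multi-level resolutions: level 0 is full detail, each level halves it."""
    pyramid = [grid]
    for _ in range(levels):
        pyramid.append(coarsen(pyramid[-1], 2))
    return pyramid

# 4x4 toy grid with values 0..15; coarser levels answer aggregate
# queries (e.g. a regional mean) with fewer cells to scan.
grid = [[float(4 * r + c) for c in range(4)] for r in range(4)]
pyramid = build_pyramid(grid, 2)
```

Choosing which level to query for a given error bound is exactly the approximation problem the abstract shows to be NP-complete in general.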
Abstract:
An important characteristic of some conceptual modelling grammars is the features they provide to allow database designers to show that real-world things may or may not possess a particular attribute or relationship. In the entity-relationship model, for example, the fact that a thing may not possess an attribute can be represented by using a special symbol to indicate that the attribute is optional. Similarly, the fact that a thing may or may not be involved in a relationship can be represented by showing the minimum cardinality of the relationship as zero. Whether these practices should be followed, however, is a contentious issue. An alternative approach is to eliminate optional attributes and relationships from conceptual schema diagrams by using subtypes that have only mandatory attributes and relationships. In this paper, we first present a theory that led us to predict that optional attributes and relationships should be used in conceptual schema diagrams only when users of the diagrams require a surface-level understanding of the domain being represented by the diagrams. When users require a deep-level understanding, however, optional attributes and relationships should not be used because they undermine users' abilities to grasp important domain semantics. We describe three experiments which we then undertook to test our predictions. The results of the experiments support our predictions.
Abstract:
At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been occurring for many years, and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to object-oriented schemas today. Unfortunately, researchers have been able to give practitioners only little theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty in applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, but easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques that have similar meta models defined. Moreover, this approach supports the identification of patterns of constructs that might be common across meta models for modelling techniques. Such findings are useful in extending and refining the BWW theory. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
A finite difference method for simulating voltammograms of electrochemically driven enzyme catalysis is presented. The method enables any enzyme mechanism to be simulated. The finite difference equations can be represented as a matrix equation containing a nonlinear sparse matrix. This equation has been solved using the software package Mathematica. Our focus is on the use of cyclic voltammetry since this is the most commonly employed electrochemical method used to elucidate mechanisms. The use of cyclic voltammetry to obtain data from systems obeying Michaelis-Menten kinetics is discussed, and we then verify our observations on the Michaelis-Menten system using the finite difference simulation. Finally, we demonstrate how the method can be used to obtain mechanistic information on a real redox enzyme system, the complex bacterial molybdoenzyme xanthine dehydrogenase.
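As a rough illustration of the kind of finite-difference treatment the abstract refers to, the sketch below integrates substrate depletion under Michaelis-Menten kinetics with an explicit Euler step. It is a toy one-variable model with made-up parameters, not the paper's voltammetry simulation (which couples diffusion, the enzyme mechanism and electrode boundary conditions).

```python
def michaelis_menten_rate(s, v_max, k_m):
    """Michaelis-Menten rate law: v = Vmax * [S] / (Km + [S])."""
    return v_max * s / (k_m + s)

def simulate_substrate(s0, v_max, k_m, dt, steps):
    """Explicit (Euler) finite-difference integration of d[S]/dt = -v."""
    s = s0
    history = [s]
    for _ in range(steps):
        # Clamp at zero: the substrate concentration cannot go negative.
        s = max(s - dt * michaelis_menten_rate(s, v_max, k_m), 0.0)
        history.append(s)
    return history

# Illustrative parameters only; units are arbitrary.
hist = simulate_substrate(s0=1.0, v_max=0.5, k_m=0.2, dt=0.01, steps=1000)
```

A useful sanity check of the rate law is that at [S] = Km the rate equals Vmax/2.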
Abstract:
The effect of methyl jasmonate treatment on gene expression in sugarcane roots and on signalling between roots and shoots was studied. A collection of 829 ESTs was obtained from sugarcane roots treated with the defence regulator methyl jasmonate (MJ). A subset of 747 of these was combined with 4793 sugarcane ESTs obtained from stem tissues in a cDNA microarray, and experiments were undertaken to identify genes that were induced in roots 24-120 h following treatment with MJ. Two data analysis systems (t-statistic and tRMA) were used to analyse the microarray results, and these methods identified a common set of 21 ESTs corresponding to transcripts significantly induced by MJ in roots and 23 that were reduced in expression following MJ treatment. The induction of six transcripts identified in the microarray analysis was tested and confirmed using northern blotting. Homologues of genes encoding lipoxygenase and PR-10 proteins were induced 8-24 h after MJ treatment, while the other four selected transcripts were induced at later time points. Following treatment of roots with MJ, the lipoxygenase homologue, but not the PR-10 homologue, was induced in untreated stem and leaf tissues. The PR-10 homologue and a PR-1 homologue, but not the lipoxygenase homologue, were induced in untreated tissues after the application of salicylic acid (SA) to roots. Repeated foliar application of MJ had no apparent effects on plant growth and was demonstrated to increase lipoxygenase transcripts in roots, but did not increase transcript levels of other genes tested. These results lay a foundation for further studies of induced pest and disease resistance in sugarcane roots. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
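The abstract mentions a t-statistic analysis of the microarray results. A generic pooled two-sample t-statistic, of the kind typically used to flag differentially expressed transcripts, can be sketched as follows; this is the standard formula, not the paper's specific pipeline:

```python
import math

def t_statistic(a, b):
    """Pooled-variance two-sample t-statistic between two groups of
    expression measurements (e.g. treated vs. control replicates)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
```

Transcripts whose |t| exceeds a chosen cutoff across replicates would be called significantly induced or repressed.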
Abstract:
OBJECTIVE: This paper describes the Australian experience to date with a national 'roll out' of routine outcome measurement in public sector mental health services. METHODS: Consultations were held with 123 stakeholders representing a range of roles. RESULTS: Australia has made an impressive start to nationally implementing routine outcome measurement in mental health services, although it still has a long way to go. All States/Territories have established data collection systems, although some are more streamlined than others. Significant numbers of clinicians and managers have been trained in the use of routine outcome measures, and thought is now being given to ongoing training strategies. Outcome measurement is now occurring 'on the ground'; all States/Territories will be reporting data for 2003-04, and a number have been doing so for several years. Having said this, there is considerable variability regarding data coverage, completeness and compliance. Some States/Territories have gone to considerable lengths to 'embed' outcome measurement in day-to-day practice. To date, reporting of outcome data has largely been limited to reports profiling individual consumers and/or aggregate reports that focus on compliance and data quality issues, although a few States/Territories have begun to turn their attention to producing aggregate reports of consumers by clinician, team or service. CONCLUSION: Routine outcome measurement is possible if it is supported by a co-ordinated, strategic approach and strong leadership, and there is commitment from clinicians and managers. The Australian experience can provide lessons for other countries.
Abstract:
Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. Lack of a standardized measurement framework to permit comparisons across diseases and injuries, as well as risk factors, and failure to systematically evaluate data quality have impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) which simultaneously accounted for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease/injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and Region, of over 100 diseases and injuries. 
National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
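The Disability-Adjusted Life Year metric referred to above combines fatal and non-fatal burden. In its simplest undiscounted, non-age-weighted form (a simplification of the Global Burden of Disease Study's actual calculations), the arithmetic is:

```python
def years_of_life_lost(deaths, std_life_expectancy_at_death):
    """YLL: premature mortality, valued against a standard life expectancy."""
    return deaths * std_life_expectancy_at_death

def years_lived_with_disability(incident_cases, disability_weight, avg_duration_years):
    """YLD: non-fatal burden, weighted by severity (0 = full health, 1 = death)."""
    return incident_cases * disability_weight * avg_duration_years

def dalys(yll, yld):
    """DALYs combine years of life lost and years lived with disability."""
    return yll + yld

# Made-up illustrative numbers, not Study estimates:
yll = years_of_life_lost(deaths=10, std_life_expectancy_at_death=30.0)
yld = years_lived_with_disability(incident_cases=100, disability_weight=0.2,
                                  avg_duration_years=5.0)
total = dalys(yll, yld)
```

The YLD term is why conditions with mostly non-fatal outcomes, such as mental health conditions, rank highly under this metric.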
Abstract:
Regular and systematic monitoring of drug markets provides the basis for evidence-based policy. In Australia, trends in ecstasy and related drug (ERD) markets have been monitored in selected jurisdictions since 2000 and nationally since 2003, by the Party Drugs Initiative (PDI). The PDI maximises the validity of conclusions by triangulating information from (a) interviews with regular ecstasy users (REU), (b) interviews with key experts and (c) indicator data. There is currently no other system in Australia for monitoring these markets systematically; however, the value of the PDI has been constrained by the quality of available data. Difficulties in recruiting and interviewing appropriate consumers (REU) and key experts have been experienced, but largely overcome. Limitations of available indicator data from both health and law enforcement continue to present challenges and there remains considerable scope for enhancing existing routine data collection systems, to facilitate monitoring of ERD markets. With an expanding market for ecstasy and related drugs in Australia, and in the context of indicator data that continue to be limited in scope and detail, there is a strong argument for the continued collection of annual, comparable data from a sentinel group of REU, such as those recruited for the PDI.
Abstract:
Benchmarking of the performance of states, provinces, or districts in a decentralised health system is important for fostering of accountability, monitoring of progress, identification of determinants of success and failure, and creation of a culture of evidence. The Mexican Ministry of Health has, since 2001, used a benchmarking approach based on the World Health Organization (WHO) concept of effective coverage of an intervention, which is defined as the proportion of the potential health gain deliverable by the health system that is actually delivered. Using data collection systems, including state-representative examination surveys, vital registration, and hospital discharge registries, we have monitored the delivery of 14 interventions for 2005-06. Overall effective coverage ranges from 54.0% in Chiapas, a poor state, to 65.1% in the Federal District. Effective coverage for maternal and child health interventions is substantially higher than that for interventions that target other health problems. Effective coverage for the lowest wealth quintile is 52% compared with 61% for the highest quintile. Effective coverage is closely related to public-health spending per head across states; this relation is stronger for interventions that are not related to maternal and child health than for those that are. Considerable variation also exists in effective coverage at similar amounts of spending. We discuss the implications of these issues for the further development of the Mexican health-information system. Benchmarking of performance by measuring effective coverage encourages decision-makers to focus on quality service provision, not only service availability. The effective coverage calculation is an important device for health-system stewardship.
In adopting this approach, other countries should select interventions to be measured on the basis of the criteria of affordability, effect on population health, effect on health inequalities, and capacity to measure the effects of the intervention. The national institutions undertaking this benchmarking must have the mandate, skills, resources, and independence to succeed.
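The effective-coverage calculation described above is simple arithmetic: the health gain actually delivered divided by the potential health gain. A hedged sketch with made-up numbers (not the Mexican data) might look like:

```python
def effective_coverage(delivered_gain, potential_gain):
    """Share of the potential health gain that the system actually delivers."""
    if potential_gain <= 0:
        raise ValueError("potential gain must be positive")
    return delivered_gain / potential_gain

# Hypothetical interventions: (delivered gain, potential gain), same units.
interventions = {
    "measles_vaccination": (90.0, 100.0),
    "hypertension_treatment": (30.0, 100.0),
}

# A simple unweighted average across interventions; real benchmarking
# would weight interventions, e.g. by potential health gain.
overall = sum(effective_coverage(d, p)
              for d, p in interventions.values()) / len(interventions)
```

Because the numerator reflects gains actually realised, low-quality delivery lowers the score even where services are nominally available.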
Abstract:
A structurally based quasi-chemical viscosity model has been developed for the Al2O3-CaO-'FeO'-MgO-SiO2 system. The model links the slag viscosity to the internal structure of melts through the concentrations of various anionic/cationic viscous-flow structural units: Si0.5O, Me2/n^n+O and Me1/n^n+Si0.25O. The concentrations of structural units are derived from the quasi-chemical thermodynamic model. The focus of the work described in the present paper is the analysis of experimental data and the viscosity models for fully liquid slags in the Al2O3-CaO-MgO, Al2O3-MgO-SiO2 and CaO-MgO-SiO2 systems.
Abstract:
This paper discusses a multi-layer feedforward (MLF) neural network incident detection model that was developed and evaluated using field data. In contrast to published neural network incident detection models which relied on simulated or limited field data for model development and testing, the model described in this paper was trained and tested on a real-world data set of 100 incidents. The model uses speed, flow and occupancy data measured at dual stations, averaged across all lanes and only from time interval t. The off-line performance of the model is reported under both incident and non-incident conditions. The incident detection performance of the model is reported based on a validation-test data set of 40 incidents that were independent of the 60 incidents used for training. The false alarm rates of the model are evaluated based on non-incident data that were collected from a freeway section which was video-taped for a period of 33 days. A comparative evaluation between the neural network model and the incident detection model in operation on Melbourne's freeways is also presented. The results of the comparative performance evaluation clearly demonstrate the substantial improvement in incident detection performance obtained by the neural network model. The paper also presents additional results that demonstrate how improvements in model performance can be achieved using variable decision thresholds. Finally, the model's fault-tolerance under conditions of corrupt or missing data is investigated and the impact of loop detector failure/malfunction on the performance of the trained model is evaluated and discussed. The results presented in this paper provide a comprehensive evaluation of the developed model and confirm that neural network models can provide fast and reliable incident detection on freeways. (C) 1997 Elsevier Science Ltd. All rights reserved.
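The two ideas central to the abstract above, a multi-layer feedforward pass over speed/flow/occupancy inputs and a variable decision threshold, can be illustrated with a toy sketch. The weights and inputs below are invented for demonstration and have no relation to the trained Melbourne model.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_hidden, w_out):
    """One-hidden-layer feedforward pass producing an incident score in (0, 1)."""
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(w, x))) for w in w_hidden]
    return sigmoid(sum(wi * hi for wi, hi in zip(w_out, hidden)))

def detect_incident(score, threshold):
    """Variable decision threshold: raising it reduces false alarms at the
    cost of detection rate; lowering it does the opposite."""
    return score >= threshold

# Hypothetical weights; inputs are normalised speed, flow, occupancy.
w_hidden = [[0.5, -0.3, 0.8], [-0.2, 0.4, 0.6]]
w_out = [1.2, -0.7]
score = forward([0.3, 0.7, 0.9], w_hidden, w_out)
```

Sweeping the threshold over the validation set is how a detection-rate/false-alarm trade-off curve of the kind the paper reports would be produced.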