5 results for Data utility
in Aston University Research Archive
Abstract:
This article is aimed primarily at eye care practitioners who are undertaking advanced clinical research, and who wish to apply analysis of variance (ANOVA) to their data. ANOVA is a data analysis method of great utility and flexibility. This article describes why and how ANOVA was developed, the basic logic which underlies the method and the assumptions that the method makes for it to be validly applied to data from clinical experiments in optometry. The application of the method to the analysis of a simple data set is then described. In addition, the methods available for making planned comparisons between treatment means and for making post hoc tests are evaluated. The problem of determining the number of replicates or patients required in a given experimental situation is also discussed. Copyright (C) 2000 The College of Optometrists.
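The basic logic described above can be sketched in code: one-way ANOVA partitions total variation into a between-groups component (variation of treatment means about the grand mean) and a within-groups component (variation of observations about their own treatment mean), and compares the two via an F ratio. This is a minimal illustrative sketch, not the article's own worked example; the data values are hypothetical.

```python
# Minimal sketch of the core one-way ANOVA computation: partition total
# variation into between-group and within-group sums of squares, then
# form the F ratio. Data values below are hypothetical, not from the article.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a list of samples."""
    k = len(groups)                                 # number of treatment groups
    n = sum(len(g) for g in groups)                 # total observations
    grand_mean = sum(x for g in groups for x in g) / n

    # Between-groups SS: variation of group means about the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-groups SS: variation of observations about their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)

    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Three hypothetical treatment groups (e.g. scores under three treatments)
a = [5.1, 4.9, 5.3, 5.0]
b = [5.8, 6.0, 5.9, 6.1]
c = [4.2, 4.4, 4.1, 4.3]
F, df1, df2 = one_way_anova([a, b, c])
print(round(F, 1), df1, df2)
```

A large F relative to the F distribution with (df_between, df_within) degrees of freedom indicates that the treatment means differ by more than within-group noise would explain.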
Abstract:
The aims of the project were twofold: 1) To investigate classification procedures for remotely sensed digital data, in order to develop modifications to existing algorithms and propose novel classification procedures; and 2) To investigate and develop algorithms for contextual enhancement of classified imagery in order to increase classification accuracy. The following classifiers were examined: box, decision tree, minimum distance, maximum likelihood. In addition to these, the following algorithms were developed during the course of the research: deviant distance, look-up table and an automated decision tree classifier using expert systems technology. Clustering techniques for unsupervised classification were also investigated. Contextual enhancements investigated were: mode filters, small area replacement and Wharton's CONAN algorithm. Additionally, methods for noise- and edge-based declassification and contextual reclassification, non-probabilistic relaxation and relaxation based on Markov chain theory were developed. The advantages of per-field classifiers and Geographical Information Systems were investigated. The conclusions presented suggest suitable combinations of classifier and contextual enhancement, given user accuracy requirements and time constraints. These were then tested for validity using a different data set. A brief examination of the utility of the recommended contextual algorithms for reducing the effects of data noise was also carried out.
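Of the classifiers listed above, the minimum-distance classifier is the simplest to illustrate: each pixel is assigned to the class whose mean spectral vector lies nearest in Euclidean distance. The sketch below is a generic illustration under assumed two-band data, not the thesis's implementation; the class means and pixel values are hypothetical.

```python
import math

# Minimal sketch of a minimum-distance classifier: assign a pixel to the
# land-cover class whose mean spectral vector is nearest in Euclidean
# distance. Class means and pixel vectors are hypothetical two-band values.

def minimum_distance_classify(pixel, class_means):
    """Return the label of the class mean closest to the pixel vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(class_means, key=lambda label: dist(pixel, class_means[label]))

class_means = {            # hypothetical per-class training means (band1, band2)
    "water":  (20.0, 15.0),
    "forest": (45.0, 80.0),
    "urban":  (90.0, 60.0),
}
print(minimum_distance_classify((42.0, 78.0), class_means))  # nearest to the forest mean
```

Unlike maximum likelihood, this ignores class covariance structure, which is why the two classifiers can disagree on spectrally overlapping classes.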
Abstract:
The performance of a supply chain depends critically on the coordinating actions and decisions undertaken by the trading partners. The sharing of product and process information plays a central role in this coordination and is a key driver for the success of the supply chain. In this paper we propose the concept of "linked pedigrees" - linked datasets that enable the sharing of traceability information about products as they move along the supply chain. We present a distributed and decentralised, linked-data-driven architecture that consumes real-time supply chain linked data to generate linked pedigrees. We then present a communication protocol to enable the exchange of linked pedigrees among trading partners. We exemplify the utility of linked pedigrees by illustrating examples from the perishable goods logistics supply chain.
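The chaining idea behind a linked pedigree can be sketched simply: each trading partner publishes a pedigree record for a product batch that links back to the previous partner's pedigree by URI, so the full provenance chain can be recovered by following the links. The URIs, field names, and dictionary representation below are illustrative assumptions, not the paper's actual vocabulary or RDF model.

```python
# Minimal sketch of the linked-pedigree chaining idea: each partner's
# pedigree record links to the previous partner's record by URI, forming
# a traceable provenance chain. URIs and field names are illustrative only.

pedigrees = {
    "http://farm.example/ped/batch42": {
        "partner": "farm", "event": "harvested", "previous": None},
    "http://carrier.example/ped/batch42": {
        "partner": "carrier", "event": "shipped",
        "previous": "http://farm.example/ped/batch42"},
    "http://retailer.example/ped/batch42": {
        "partner": "retailer", "event": "received",
        "previous": "http://carrier.example/ped/batch42"},
}

def trace(uri):
    """Follow 'previous' links to recover the full provenance chain."""
    chain = []
    while uri is not None:
        record = pedigrees[uri]          # in practice, dereference the URI
        chain.append(record["partner"])
        uri = record["previous"]
    return chain

print(trace("http://retailer.example/ped/batch42"))  # ['retailer', 'carrier', 'farm']
```

Because each record lives at its publisher's own URI, no central authority needs to hold the whole chain, which is the decentralised property the architecture relies on.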
Abstract:
Since privatisation, maintenance of DNO LV feeder maximum demand information has gradually lapsed in some Utility Areas, and it is postulated that this lack of knowledge about 11kV and LV electrical networks is resulting in a less economical and energy-efficient network as a whole. In an attempt to quantify the negative impact, this paper examines ten postulated new-connection scenarios for a set of real LV load readings, in order to find the difference in design solutions when LV load readings were and were not known. The load profiles of the substations were examined in order to explore the utilisation profile. Significant cost differences were found in 70% of the scenarios explored, varying by an average of 1000% between schemes designed with and without load readings. Over-designing a system, and therefore operating more underutilised transformers, is both less financially beneficial and less energy efficient. The paper concludes that new-connection design is improved in terms of cost when carried out based on known LV load information, and this enhances the case for regular maximum feeder demand information and/or metering of LV feeders. © 2013 IEEE.
Abstract:
Emergency managers are faced with critical evacuation decisions. These decisions must balance conflicting objectives as well as high levels of uncertainty. Multi-Attribute Utility Theory (MAUT) provides a framework through which objective trade-offs can be analyzed to make optimal evacuation decisions. This paper is the result of data gathered during the European Commission project Evacuation Responsiveness by Government Organizations (ERGO), and outlines a preliminary decision model for the evacuation decision. The illustrative model identifies levels of risk at which evacuation actions should be taken by emergency managers in a storm surge scenario with forecasts at 12- and 9-hour intervals. The results illustrate how differences in forecast precision affect the optimal evacuation decision. Additional uses for this decision model are also discussed, along with improvements to the model through future ERGO data-gathering.
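The decision logic described above rests on a standard expected-utility comparison: for each action, weight the utility of each outcome by the forecast probability of a surge, and choose the action with the higher expected utility. The probabilities and utility values below are hypothetical stand-ins, not ERGO data or the paper's calibrated model.

```python
# Minimal sketch of the expected-utility comparison underlying a MAUT-style
# evacuation decision: pick the action with the higher probability-weighted
# utility. Probabilities and utilities are hypothetical, not ERGO results.

def expected_utility(p_surge, utilities):
    """utilities: mapping action -> (utility if surge, utility if no surge)."""
    return {action: p_surge * u_surge + (1 - p_surge) * u_clear
            for action, (u_surge, u_clear) in utilities.items()}

# Hypothetical utilities on a 0-1 scale: evacuating is costly but safe;
# waiting is cheap unless the surge actually arrives.
utilities = {"evacuate": (0.7, 0.6), "wait": (0.1, 1.0)}

for p in (0.2, 0.6):                     # two illustrative surge forecasts
    eu = expected_utility(p, utilities)
    best = max(eu, key=eu.get)
    print(p, best)                       # low risk -> wait; high risk -> evacuate
```

The crossover probability at which the best action flips from "wait" to "evacuate" is exactly the kind of risk threshold the paper's decision model is built to identify, and sharper forecasts shift how reliably that threshold can be acted on.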