895 results for Data alignment
Abstract:
Successful inclusive product design requires knowledge about the capabilities, needs and aspirations of potential users, and should cater for the different scenarios in which people will use products, systems and services: the individual at home, in the workplace and in business settings, and the products used in each of these contexts. It also needs to reflect the ongoing development of theory, tools and techniques as research moves on.
Abstract:
Spreadsheet for Creative City Index 2012
Abstract:
More than ever before, contemporary societies are characterised by the huge amounts of data being transferred. Authorities, companies, academia and other stakeholders refer to Big Data when discussing the importance of large and complex datasets and developing possible solutions for their use. Big Data promises to be the next frontier of innovation for institutions and individuals, yet it also offers possibilities to predict and influence human behaviour with ever-greater precision.
Abstract:
The decisions people make about medical treatments have a great impact on their lives. Health care practitioners, providers and patients often make decisions about medical treatments without a complete understanding of the circumstances. The main reason for this is that medical data are available in fragmented, disparate and heterogeneous data silos. Without a centralised data warehouse structure to integrate these silos, it is highly unlikely and impractical for users to get all the information they require in time to make a correct decision. In this research paper, a clinical data integration approach using SAS Clinical Data Integration Server tools is presented.
Abstract:
The health system is one sector dealing with very large amounts of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently, so a highly effective system is needed to capture, collate and distribute such data. A number of technologies have been identified for integrating data from different sources; data warehousing is one technology that can be used to manage clinical data in healthcare. This paper addresses how data warehousing can assist cardiac surgery decision making, using the cardiac surgery unit at The Prince Charles Hospital (TPCH) as the case study. To deal with other units efficiently, it is important to integrate disparate data into a single point of interrogation. We propose implementing a data warehouse for the cardiac surgery unit at TPCH. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data were analysed using SAS Enterprise Edition 4.3. This improves access to integrated clinical and financial data, with improved framing of data to the clinical context, potentially enabling better-informed decision making for both improved management and patient care.
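As a toy illustration of the "single point of interrogation" idea (the tables, fields and figures below are invented, and the actual prototype was built with SAS tools rather than Python), fragmented clinical and financial silos can be joined around a patient key:

# Toy sketch of integrating fragmented data silos around a patient key.
# All table contents are invented; the prototype described above used
# SAS Enterprise Data Integration Studio, not pandas.
import pandas as pd

clinical = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "procedure": ["CABG", "valve repair", "CABG"],
    "los_days": [7, 9, 6],                     # length of stay
})
financial = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "total_cost": [41200.0, 55300.0, 38900.0],
})

# One integrated view supports questions neither silo answers alone,
# e.g. average cost per day of stay, broken down by procedure.
warehouse = clinical.merge(financial, on="patient_id")
warehouse["cost_per_day"] = warehouse["total_cost"] / warehouse["los_days"]
print(warehouse.groupby("procedure")["cost_per_day"].mean())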
Abstract:
In Australia, as in some other western nations, governments impose accountability measures on educational institutions (Earl, 2005). One such accountability measure is the National Assessment Program - Literacy and Numeracy (NAPLAN), from which high-stakes assessment data are generated. In this article, a practical method of data analysis known as the Over Time Assessment Data Analysis (OTADA) is offered as an analytical process by which schools can monitor their current and over-time performance. This analysis, developed by the author, is currently used extensively in schools throughout Queensland. By analysing in this way, teachers, and in particular principals, can obtain a quick and insightful performance overview. For those seeking to track the achievements and progress of year-level cohorts, the OTADA should be considered.
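As an illustration only (the abstract does not specify OTADA's calculations, so the columns, scores and summary below are assumptions), an over-time overview of a year-level cohort could be tabulated along these lines:

# Hypothetical sketch of an over-time performance overview for a cohort.
# Column names and scale scores are invented; OTADA's actual method is
# not described in this abstract.
import pandas as pd

scores = pd.DataFrame({
    "year": [2019, 2019, 2021, 2021, 2023, 2023],
    "domain": ["Reading", "Numeracy"] * 3,
    "mean_scale_score": [412.0, 405.5, 420.3, 411.8, 431.1, 418.9],
})

# One row per assessment year, so current performance and the trend
# over time can be scanned at a glance.
overview = scores.pivot(index="year", columns="domain",
                        values="mean_scale_score")
overview["Reading_change"] = overview["Reading"].diff()
overview["Numeracy_change"] = overview["Numeracy"].diff()
print(overview)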
Abstract:
This research was a step forward in developing a data integration framework for Electronic Health Records. The outcome of the research is a conceptual and logical data warehousing model for integrating cardiac surgery electronic data records. This thesis investigated the main obstacles to healthcare data integration and proposed a data warehousing model suitable for integrating fragmented data in a Cardiac Surgery Unit.
Abstract:
This project researched the performance of emerging digital technology for high voltage electricity substations that significantly improves safety for staff and reduces the potential environmental impact of equipment failure. The experimental evaluation used a scale model of a substation control system that incorporated real substation control and networking equipment with real-time simulation of the power system. The outcomes confirm that it is possible to implement Ethernet networks in high voltage substations that meet the needs of utilities; however, component-level testing of devices is necessary to achieve this. The assessment results have been used to further develop international standards for substation communication and precision timing.
Abstract:
This document provides data for the case study presented in our recent earthwork planning papers. Some results are also provided in a graphical format using Excel.
Abstract:
Electricity cost has become a major expense in running data centers, and server consolidation using virtualization has become an important technique for improving their energy efficiency. In this research, a genetic algorithm and a simulated-annealing algorithm are proposed for the static virtual machine placement problem, which considers the energy consumption of both the servers and the communication network, and a trading algorithm is proposed for dynamic virtual machine placement. Experimental results show that the proposed methods are more energy efficient than existing solutions.
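As a rough sketch of the simulated-annealing side of such an approach (the energy model, cooling schedule and move set below are invented for illustration and are not taken from the paper), a placement that trades off powered-on servers against inter-server traffic might be searched like this:

# Hedged sketch: simulated annealing for static VM placement that counts
# both server energy and communication-network energy. All constants and
# the cost model are assumptions, not the paper's.
import math
import random

N_VMS, N_SERVERS = 8, 3
load = [0.2, 0.1, 0.3, 0.25, 0.15, 0.2, 0.1, 0.3]   # CPU demand per VM
traffic = [[random.random() for _ in range(N_VMS)] for _ in range(N_VMS)]

def energy(placement):
    # Server side: consolidation saves the idle power of switched-off hosts.
    idle = 100.0 * len(set(placement))
    dynamic = 50.0 * sum(load)          # placement-invariant linear term
    # Network side: traffic between VMs on different servers costs energy.
    net = sum(traffic[i][j]
              for i in range(N_VMS) for j in range(i + 1, N_VMS)
              if placement[i] != placement[j])
    return idle + dynamic + 10.0 * net

placement = [random.randrange(N_SERVERS) for _ in range(N_VMS)]
current_e = energy(placement)
best, best_e = placement[:], current_e
temp = 10.0
while temp > 0.01:
    vm = random.randrange(N_VMS)        # move: reassign one random VM
    old = placement[vm]
    placement[vm] = random.randrange(N_SERVERS)
    new_e = energy(placement)
    # Always accept improvements; accept worse moves with Boltzmann probability.
    if new_e <= current_e or random.random() < math.exp((current_e - new_e) / temp):
        current_e = new_e
        if new_e < best_e:
            best, best_e = placement[:], new_e
    else:
        placement[vm] = old             # undo the rejected move
    temp *= 0.995                       # geometric cooling
print("best placement:", best, "energy:", round(best_e, 2))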
Abstract:
The Council of Australian Governments (COAG) in 2003 gave in-principle approval to a best-practice report recommending a holistic approach to managing natural disasters in Australia, incorporating a move from a traditional response-centric approach to a greater focus on mitigation, recovery and resilience, with community well-being at the core. Since that time, there have been a range of complementary developments that have supported the COAG-recommended approach. Developments have been administrative, legislative and technological, both in reaction to the COAG initiative and resulting from regular natural disasters. This paper reviews the characteristics of the spatial data that are becoming increasingly available at Federal, state and regional jurisdictions with respect to their fitness for purpose for disaster planning and mitigation and for strengthening community resilience. In particular, Queensland foundation spatial data, which are increasingly accessible by the public under the provisions of the Right to Information Act 2009, the Information Privacy Act 2009 and recent open data reform initiatives, are evaluated. The Fitzroy River catchment and floodplain is used as a case study for the review undertaken. The catchment covers an area of 142,545 km², the largest river catchment flowing to the eastern coast of Australia. The Fitzroy River basin experienced extensive flooding during the 2010–2011 Queensland floods. The basin is an area of important economic, environmental and heritage values and contains significant infrastructure critical for the mining and agricultural sectors, the two most important economic sectors for the State of Queensland. Consequently, the spatial datasets for this area play a critical role in disaster management and in protecting critical infrastructure essential for economic and community well-being. The foundation spatial datasets are assessed for disaster planning and mitigation purposes using data quality indicators such as resolution, accuracy, integrity, validity and audit trail.
Abstract:
The ability to steer business operations in alignment with the true origins of costs, and to be informed about this on a real-time basis, allows businesses to increase profitability. In most organisations however, high-level cost-based managerial decisions are still being made separately from process-related operational decisions. In this paper, we describe how process-related decisions at the operational level can be guided by cost considerations and how these cost-informed decision rules can be supported by a workflow management system. The paper presents the conceptual framework together with data requirements and technical challenges that need to be addressed to realise cost-informed workflow execution. The feasibility of our approach is demonstrated using a prototype implementation in the YAWL workflow environment.
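As a minimal sketch of what a cost-informed decision rule could look like at the operational level (the resources, rates and rule below are hypothetical; the paper's own prototype lives in the YAWL environment), a work item could be allocated to whichever resource minimises expected cost:

# Hypothetical sketch of a cost-informed allocation rule for a work item.
# Resource rates and duration forecasts are invented; this is not the
# paper's YAWL implementation.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    hourly_rate: float      # cost per hour of work
    expected_hours: float   # forecast effort for this work item

def expected_cost(r: Resource) -> float:
    return r.hourly_rate * r.expected_hours

candidates = [
    Resource("senior_clerk", hourly_rate=60.0, expected_hours=1.0),
    Resource("junior_clerk", hourly_rate=35.0, expected_hours=2.5),
]

# The engine offers the work item to the cheapest expected option,
# so operational routing follows the true origins of cost.
chosen = min(candidates, key=expected_cost)
print(f"allocate to {chosen.name}, expected cost {expected_cost(chosen):.2f}")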
Abstract:
Health complaint commissions in Australia: time for a national approach
• There is considerable variation between jurisdictions in the ways complaint data are defined, collected and recorded by the health complaint commissions.
• Complaints from the public are an important accountability mechanism and an indicator of service quality.
• The lack of a consistent approach leads to fragmentation of complaint data and a lost opportunity to use national data to assist policy development and identify the main areas causing consumers to complain.
• We need a national approach to complaints data collection by the health complaints commissions in order to better respond to patients' concerns.
Abstract:
This thesis improves the process of recommending people to people in social networks using new clustering algorithms and ranking methods. The proposed system and methods are evaluated on data collected from a real-life social network. The empirical analysis in this research confirms that the proposed system and methods achieved improvements in the accuracy and efficiency of matching and recommending people, and overcame some of the problems from which social matching systems usually suffer.
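As an indicative sketch only (the thesis's actual clustering and ranking algorithms are not described in this abstract; the features, cluster count and similarity measure below are assumptions), clustering can shrink the candidate pool before matches are ranked by similarity:

# Hypothetical sketch: cluster users first, then rank candidate matches
# within the same cluster. Features and similarity are invented and do
# not reproduce the thesis's methods.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
profiles = rng.random((100, 8))     # 100 users, 8 interest features

clusters = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(profiles)

def recommend(user: int, k: int = 5) -> np.ndarray:
    # Restrict candidates to the user's own cluster, which cuts the
    # number of pairwise comparisons, then rank by cosine similarity.
    pool = np.flatnonzero(clusters == clusters[user])
    pool = pool[pool != user]
    sims = cosine_similarity(profiles[user:user + 1], profiles[pool])[0]
    return pool[np.argsort(sims)[::-1][:k]]

print(recommend(0))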