920 results for Data representation
Abstract:
The Queensland University of Technology (QUT) Library, like many other academic and research institution libraries in Australia, has been collaborating with a range of academic and service-provider partners to develop research data management services and collections. Three main strategies are being employed, and an overview of the process, infrastructure, usage and benefits of each of these service aspects is provided. The development of processes and infrastructure to facilitate the strategic identification and management of QUT-developed datasets has been a major focus. A number of Australian National Data Service (ANDS) sponsored projects, including Seeding the Commons, Metadata Hub / Store, Data Capture and Gold Standard Record Exemplars, have provided or will provide QUT with a data registry system, linkages to storage, processes for identifying and describing datasets, and a degree of academic awareness. QUT supports open access and has established a culture of making its research outputs available via the QUT ePrints institutional repository. Incorporating open access research datasets into the library collections is an equally important aspect of facilitating the adoption of data-centric eresearch methods. Some datasets are available commercially, and the library has collaborated with QUT researchers, particularly in the QUT Business School, to identify and procure a rapidly growing range of financial datasets to support research. The library undertakes the licensing and uses the Library Resource Allocation to pay for the subscriptions. This is a new area of collection development with much to be learned. The final strategy discussed is the library acting as “data broker”: QUT Library has been working with researchers to identify datasets and to undertake licensing, payment and access as a centrally supported service on behalf of researchers.
Abstract:
Management of groundwater systems requires realistic conceptual hydrogeological models, both as a framework for numerical simulation modelling and for system understanding and communicating that understanding to stakeholders and the broader community. To help meet these requirements we developed GVS (Groundwater Visualisation System), a stand-alone desktop software package that uses interactive 3D visualisation and animation techniques. The goal was a user-friendly groundwater management tool that could support a range of existing real-world and pre-processed data, both surface and subsurface, including geology and various types of temporal hydrological information. GVS allows these data to be integrated into a single conceptual hydrogeological model. In addition, 3D geological models produced externally using other software packages can readily be imported into GVS models, as can outputs of simulations (e.g. piezometric surfaces) produced by software such as MODFLOW or FEFLOW. Boreholes can be integrated, showing any down-hole data and properties, including screen information, intersected geology, water level data and water chemistry. Animation is used to display spatial and temporal changes, with time-series data such as rainfall, standing water levels and electrical conductivity displaying dynamic processes. Time and space variations can be presented using a range of contouring and colour-mapping techniques, in addition to interactive plots of time-series parameters. Other types of data, for example demographics and cultural information, can also be readily incorporated. The GVS software can execute on a standard Windows or Linux-based PC with a minimum of 2 GB RAM, and the model output is easy and inexpensive to distribute, by download or via USB/DVD/CD. Example models are described here for three groundwater systems in Queensland, northeastern Australia: two unconfined alluvial groundwater systems with intensive irrigation, the Lockyer Valley and the upper Condamine Valley, and the Surat Basin, a large sedimentary basin of confined artesian aquifers. This latter example required more detail in the hydrostratigraphy, correlation of formations with drillholes and visualisation of simulated piezometric surfaces. Both alluvial-system GVS models were developed during drought conditions to support government strategies to implement groundwater management. The Surat Basin model was industry-sponsored research for coal seam gas groundwater management and community information and consultation. The “virtual” groundwater systems in these 3D GVS models can be interactively interrogated using standard functions, plus production of 2D cross-sections, data selection from the 3D scene, and back-end database and plot displays. A unique feature is that GVS allows investigation of time-series data across different display modes, both 2D and 3D. GVS has been used successfully as a tool to enhance community/stakeholder understanding and knowledge of groundwater systems and is of value for training and educational purposes. Completed projects confirm that GVS provides powerful support for management and decision making, and is a useful tool for interpreting groundwater system hydrological processes. A highly effective visualisation output is the production of short videos (e.g. 2–5 min) based on sequences of camera ‘fly-throughs’ and screen images. Further work involves developing support for multi-screen displays and touch-screen technologies, distributed rendering, and gestural interaction systems.
To highlight the visualisation and animation capabilities of the GVS software, links to related multimedia hosted on online sites are included in the references.
Abstract:
This study uses borehole geophysical log data of sonic velocity and electrical resistivity to estimate permeability in sandstones in the northern Galilee Basin, Queensland. Prior estimates of permeability are calculated from deterministic log–log linear empirical correlations between electrical resistivity and measured permeability; both negative and positive relationships occur, influenced by the clay content. The prior estimates of permeability are then updated in a Bayesian framework for three boreholes, using both the cokriging (CK) method and a normal linear regression (NLR) approach to infer the likelihood function. The results show that the mean permeability estimated from the CK-based Bayesian method agrees better with the measured permeability when a fairly clear linear relationship exists between the logarithm of permeability and sonic velocity. In contrast, the NLR-based Bayesian approach gives better estimates of permeability for boreholes where no such linear relationship exists.
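For illustration only (this is not the authors' implementation, and every coefficient and log reading below is hypothetical), the NLR-based update can be viewed as a conjugate Gaussian combination of a resistivity-derived prior on log-permeability with a sonic-velocity regression likelihood:

```python
import numpy as np

# Hypothetical sketch of an NLR-based Bayesian update of log-permeability.
# Prior: log10(k) from a log-log resistivity correlation; likelihood: a
# normal linear regression of log10(k) on sonic velocity.

# Prior from the resistivity correlation: log10(k) ~ N(mu_prior, sigma_prior^2)
resistivity = 12.0                      # ohm-m (hypothetical log reading)
a, b = 1.5, -0.8                        # hypothetical correlation coefficients
mu_prior = a + b * np.log10(resistivity)
sigma_prior = 0.6                       # prior std dev of log10(k)

# NLR likelihood fitted to core measurements: log10(k) = c0 + c1 * v + noise
v = 3200.0                              # sonic velocity in m/s (hypothetical)
c0, c1, sigma_nlr = 4.0, -1.2e-3, 0.4   # hypothetical regression fit
mu_like = c0 + c1 * v

# Conjugate Gaussian update: precision-weighted average of prior and likelihood
w_prior, w_like = 1 / sigma_prior**2, 1 / sigma_nlr**2
mu_post = (w_prior * mu_prior + w_like * mu_like) / (w_prior + w_like)
sigma_post = (w_prior + w_like) ** -0.5

print(f"posterior log10(permeability) = {mu_post:.2f} +/- {sigma_post:.2f}")
```

When the velocity-permeability regression is weak, its variance term grows and the posterior falls back towards the resistivity-derived prior, which is consistent with the behaviour contrast reported above.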
Abstract:
“Supermassive” is a synchronised four-channel video installation with sound. Each video channel shows a different camera view of an animated three-dimensional scene that visually references galactic or astral imagery. This scene comprises forty-four separate clusters of slowly orbiting white text, each cluster referring to a different topic sourced online. The topics are diverse, with recurring subjects relating to spirituality, science, popular culture, food and experiences of contemporary urban life. The slow movements of the text and camera views are reinforced by a rhythmic, contemplative soundtrack. As an immersive installation, “Supermassive” operates somewhere between a meditational mind map and a representation of a contemporary data stream. “Supermassive” contributes to studies in the field of contemporary art. It is particularly concerned with the ways that graphic representations of language can operate in the exploration of contemporary lived experiences, whether actual or virtual. Artists such as Ed Ruscha and Christopher Wool have long explored the emotive and psychological potentials of graphic text. Other artists, such as Doug Aitken and Pipilotti Rist, have engaged with the physical and spatial potentials of audio-visual installations to create emotive and symbolic experiences for their audiences. Using a practice-led research methodology, “Supermassive” extends these creative inquiries. By creating a reflective atmosphere in which divergent textual subjects are pictured together, the work explores not only how we navigate information, but also how such navigations inform understandings of our physical and psychological realities. “Supermassive” was exhibited internationally at LA Louver Gallery, Venice, California in 2013, and nationally with GBK as part of Art Month Sydney, also in 2013. It has been critically reviewed in The Los Angeles Times.
Abstract:
While scientists continue to explore the extent of climate change's impact on weather patterns and the environment in general, there have been devastating natural disasters worldwide in the last two decades, and natural disasters are becoming a major concern for society. Yet in many previous examples, reconstruction efforts focused only on providing short-term necessities. How to develop long-term resilience is now a priority for research and industry practice. This paper introduces a research project aimed at exploring the relationship between resilience building and sustainability in order to identify key factors for reconstruction efforts. From an extensive literature study, the authors identified an inherent linkage between the two issues, as evidenced by past research. They found that sustainability considerations can improve the level of resilience but are not currently given due attention. Reconstruction efforts need to focus on resilience factors, but as part of urban development they must also respond to the sustainability challenge. Sustainability issues in reconstruction projects need to be identified, amplified, processed and managed properly. On-going research through empirical study aims to establish critical factors (CFs) that help stakeholders in disaster-prone areas plan for and develop new building infrastructure through holistic considerations and balanced approaches to sustainability. A questionnaire survey examined a range of potential factors, and the subsequent data analysis revealed six critical factors for sustainable Post Natural Disaster Reconstruction: carefully considered building materials and construction methods; good governance; multilateral coordination; appropriate land-use planning and policies; consideration of different social needs; and a balanced combination of long-term and short-term needs. Findings from this study should influence policy development towards Post Natural Disaster Reconstruction and help with the achievement of sustainability objectives.
Abstract:
Successful inclusive product design requires knowledge about the capabilities, needs and aspirations of potential users, and should cater for the different scenarios in which people will use products, systems and services: the individual at home, the individual in the workplace, businesses, and the products used in these contexts. It also needs to reflect the development of theory, tools and techniques as research moves on.
Abstract:
Spreadsheet for Creative City Index 2012
Abstract:
More than ever before, contemporary societies are characterised by the huge amounts of data being transferred. Authorities, companies, academia and other stakeholders refer to Big Data when discussing the importance of large and complex datasets and when developing possible solutions for their use. Big Data promises to be the next frontier of innovation for institutions and individuals, yet it also offers possibilities to predict and influence human behaviour with ever-greater precision.
Abstract:
The building sector is the dominant consumer of energy and therefore a major contributor to anthropogenic climate change. The rapid generation of photorealistic 3D environment models with incorporated surface temperature data has the potential to improve thermographic monitoring of building energy efficiency. In pursuit of this goal, we propose a system that combines a range sensor with a thermal-infrared camera. Our proposed system can generate dense 3D models of environments with both appearance and temperature information, and is the first such system to be developed using a low-cost RGB-D camera. The proposed pipeline processes depth maps successively, forming an ongoing pose estimate of the depth camera and optimizing a voxel occupancy map. Voxels are assigned four channels representing estimates of their true RGB and thermal-infrared intensity values. Poses corresponding to each RGB and thermal-infrared image are estimated through a combination of timestamp-based interpolation and pre-determined knowledge of the extrinsic calibration of the system. Raycasting is then used to color the voxels to represent both visual appearance using RGB and an estimate of the surface temperature. The output of the system is a dense 3D model that can simultaneously represent both RGB and thermal-infrared data using one of two alternative representation schemes. Experimental results demonstrate that the system is capable of accurately mapping difficult environments, even in complete darkness.
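A minimal sketch of the timestamp-based pose interpolation step described above, assuming poses are stored as translation vectors plus unit quaternions; the function names and all values are illustrative, not taken from the authors' pipeline:

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions."""
    dot = np.dot(q0, q1)
    if dot < 0.0:                # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:             # nearly parallel: plain lerp is stable
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def interpolate_pose(t_img, t0, trans0, quat0, t1, trans1, quat1):
    """Estimate the camera pose at an RGB or thermal-infrared image
    timestamp t_img lying between depth-frame pose estimates at t0 and t1."""
    alpha = (t_img - t0) / (t1 - t0)
    trans = (1 - alpha) * trans0 + alpha * trans1  # linear in translation
    quat = slerp(quat0, quat1, alpha)              # spherical in rotation
    return trans, quat

# Hypothetical usage: a thermal frame captured between two depth-frame poses.
trans, quat = interpolate_pose(
    0.45,
    0.40, np.array([0.00, 0.0, 1.00]), np.array([1.000, 0.0, 0.000, 0.0]),
    0.50, np.array([0.02, 0.0, 1.01]), np.array([0.999, 0.0, 0.035, 0.0]))
print(trans, quat)
```

The interpolated pose would then be composed with the fixed extrinsic calibration between the depth camera and the RGB or thermal-infrared sensor before raycasting.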
Abstract:
Topic modeling has been widely utilized in the fields of information retrieval, text mining and text classification. Most existing statistical topic modeling methods, such as LDA and pLSA, generate a term-based representation of each topic by selecting single words from the multinomial word distribution over that topic. There are two main shortcomings: firstly, popular or common words occur very often across different topics, which makes topics ambiguous to interpret; secondly, single words lack the coherent semantic meaning needed to accurately represent topics. In order to overcome these problems, in this paper we propose a two-stage model that combines text mining and pattern mining with statistical modeling to generate more discriminative and semantically rich topic representations. Experiments show that the optimized topic representations generated by the proposed methods outperform the typical statistical topic modeling method LDA in terms of accuracy and certainty.
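To make the baseline concrete: a standard LDA topic is just a ranked list of single words drawn from that topic's word distribution, as in this toy gensim sketch (the corpus and parameter values are illustrative). The proposed two-stage model would replace these single words with mined, semantically richer patterns.

```python
from gensim import corpora, models

# Toy corpus; a real experiment would use a full document collection.
texts = [
    ["battery", "cathode", "discharge", "lithium"],
    ["topic", "model", "word", "distribution"],
    ["cathode", "lithium", "phase", "model"],
]
dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(t) for t in texts]

# Baseline LDA: each topic is represented by its highest-probability single
# words, i.e. the term-based representation the two-stage model improves on.
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                      random_state=0, passes=10)
for topic_id in range(2):
    print(topic_id, lda.show_topic(topic_id, topn=4))
```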
Abstract:
The decisions people make about medical treatments have a great impact on their lives. Health care practitioners, providers and patients often make decisions about medical treatments without a complete understanding of the circumstances. The main reason for this is that medical data are available in fragmented, disparate and heterogeneous data silos. Without a centralised data warehouse structure to integrate these silos, it is highly unlikely and impractical for users to get all the information required in time to make a correct decision. In this research paper, a clinical data integration approach using SAS Clinical Data Integration Server tools is presented.
Abstract:
LiFePO4 is a commercially available battery material with good theoretical discharge capacity, excellent cycle life and increased safety compared with competing Li-ion chemistries. It has been the focus of considerable experimental and theoretical scrutiny in the past decade, resulting in LiFePO4 cathodes that perform well at high discharge rates. This scrutiny has raised several questions about the behaviour of LiFePO4 material during charge and discharge. In contrast to many other battery chemistries that intercalate homogeneously, LiFePO4 can phase-separate into highly and lowly lithiated phases, with intercalation proceeding by the advance of an interface between these two phases. The main objective of this thesis is to construct mathematical models of LiFePO4 cathodes that can be validated against experimental discharge curves, in an attempt to understand some of the multi-scale dynamics of LiFePO4 cathodes that can be difficult to determine experimentally. The first section of the thesis constructs a three-scale mathematical model of LiFePO4 cathodes that uses a simple Stefan problem (which has been used previously in the literature) to describe the assumed phase change. LiFePO4 crystals have been observed agglomerating in cathodes to form porous collections of crystals, and this morphology motivates the use of three size scales in the model. The multi-scale model validates well against experimental data, and this validated model is then used to examine the effect of manufacturing parameters (including the agglomerate radius) on battery performance. The remainder of the thesis investigates phase-field models as a replacement for the aforementioned Stefan problem. Phase-field models have recently been applied to LiFePO4 and are a far more accurate representation of experimentally observed crystal-scale behaviour. They are based around the Cahn-Hilliard-reaction (CHR) IBVP, a fourth-order PDE with electrochemical (flux) boundary conditions that is very stiff and possesses multiple time and space scales. Numerical solutions to the CHR IBVP can be difficult to compute, and hence a least-squares-based Finite Volume Method (FVM) is developed for discretising both the full CHR IBVP and the more traditional Cahn-Hilliard IBVP. Phase-field models are subject to two main physicality constraints, and the numerical scheme presented performs well under both. This least-squares-based FVM is then used to simulate the discharge of individual crystals of LiFePO4 in two dimensions. The discharge is subject to isotropic Li+ diffusion, based on experimental evidence suggesting that the normally orthotropic transport of Li+ in LiFePO4 may become more isotropic in the presence of lattice defects. Numerical investigation shows that two-dimensional Li+ transport results in crystals that phase-separate, even at very high discharge rates. This is very different from results in the literature, where phase-separation in LiFePO4 crystals is suppressed during discharge with orthotropic Li+ transport. Finally, the three-scale cathodic model used at the beginning of the thesis is modified to simulate modern, high-rate LiFePO4 cathodes. High-rate cathodes typically do not contain (large) agglomerates, and therefore a two-scale model is developed. The Stefan problem used previously is also replaced with the phase-field models examined in earlier chapters.
The results from this model fit the experimental data poorly, though a significant parameter regime could not be investigated numerically. Many-particle effects, however, are evident in the simulated discharges, matching the conclusions of recent literature. These effects result in crystals being subject to local currents very different from the discharge rate applied to the cathode, which affects the phase-separating behaviour of the crystals and raises questions about the validity of using cathode-scale experimental measurements to determine crystal-scale behaviour.
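For reference, phase-field models of this kind are built on Cahn-Hilliard dynamics; the following is a generic statement of that form, not the thesis's exact nondimensionalisation:

```latex
% Generic Cahn-Hilliard dynamics; the CHR IBVP additionally imposes
% electrochemical (reaction) flux boundary conditions at the crystal surface.
\begin{align}
  \frac{\partial c}{\partial t} &= \nabla \cdot \bigl( M \, \nabla \mu \bigr), \\
  \mu &= \frac{\partial f_{\mathrm{hom}}}{\partial c} - \kappa \, \nabla^{2} c.
\end{align}
% c: lithium concentration, M: mobility, \mu: chemical potential,
% f_hom: homogeneous free-energy density, \kappa: gradient-energy coefficient.
% Substituting \mu into the first equation gives the stiff fourth-order PDE
% referred to in the abstract.
```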
Abstract:
The health system is one sector dealing with a very large amount of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently; therefore, there is a need for a very effective system to capture, collate and distribute these data. A number of technologies have been identified for integrating data from different sources, and data warehousing is one technology that can be used to manage clinical data in healthcare. This paper addresses how data warehousing can assist in improving cardiac surgery decision making. This research used the cardiac surgery unit at The Prince Charles Hospital (TPCH) as the case study. In order to deal with other units efficiently, it is important to integrate disparate data into a single point of interrogation, and we propose implementing a data warehouse for the cardiac surgery unit at TPCH. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data were analysed using SAS Enterprise edition 4.3. This improves access to integrated clinical and financial data, with improved framing of the data in the clinical context, potentially giving better-informed decision making for both improved management and patient care.
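As a toy illustration of the single-point-of-interrogation idea (the TPCH prototype itself used SAS tooling; all table and column names here are hypothetical), integrating clinical and financial silos on a shared patient key yields one queryable structure:

```python
import pandas as pd

# Hypothetical fragmented silos: clinical outcomes and financial records.
clinical = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "procedure": ["CABG", "valve repair", "CABG"],
    "los_days": [7, 5, 9],                # length of stay in days
})
finance = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "cost_aud": [42000, 38500, 51000],
})

# Integrate the silos into a single warehouse-style table on a shared key.
warehouse = clinical.merge(finance, on="patient_id", how="left")

# Single point of interrogation: cost and length of stay by procedure.
print(warehouse.groupby("procedure")[["los_days", "cost_aud"]].mean())
```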
Abstract:
In Australia, as in some other western nations, governments impose accountability measures on educational institutions (Earl, 2005). One such accountability measure is the National Assessment Program - Literacy and Numeracy (NAPLAN), from which high-stakes assessment data are generated. In this article, a practical method of data analysis known as Over Time Assessment Data Analysis (OTADA) is offered as an analytical process by which schools can monitor their current and over-time performance. This analysis, developed by the author, is currently used extensively in schools throughout Queensland. By analysing in this way, teachers, and in particular principals, can obtain a quick and insightful performance overview. For those seeking to track the achievements and progress of year-level cohorts, the OTADA should be considered.