Abstract:
This study uses borehole geophysical log data of sonic velocity and electrical resistivity to estimate permeability in sandstones in the northern Galilee Basin, Queensland. The prior estimates of permeability are calculated from deterministic log–log linear empirical correlations between electrical resistivity and measured permeability; both the negative and positive relationships are influenced by the clay content. The prior estimates are then updated in a Bayesian framework for three boreholes, using both a cokriging (CK) method and a normal linear regression (NLR) approach to infer the likelihood function. The results show that the mean permeability estimated with the CK-based Bayesian method agrees better with the measured permeability when a fairly clear linear relationship exists between the logarithm of permeability and sonic velocity. In contrast, the NLR-based Bayesian approach gives better estimates of permeability for boreholes where no such linear relationship exists.
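As a minimal illustration of the NLR-based update described above, the sketch below combines a prior estimate of log-permeability (taken from a resistivity correlation) with a Gaussian regression likelihood on sonic velocity via a conjugate normal-normal update. All coefficients and variances are hypothetical placeholders, not the paper's values.

```python
# Minimal sketch of an NLR-based Bayesian update in log10(permeability) space,
# assuming Gaussian prior and likelihood (conjugate normal-normal update).
def posterior_log_k(log_k_prior, var_prior, sonic_v, a, b, var_lik):
    """Combine a prior estimate of log10(permeability) with a normal linear
    regression likelihood whose mean is a + b * sonic_velocity."""
    mu_lik = a + b * sonic_v                        # regression prediction
    w = var_lik / (var_prior + var_lik)             # weight on the prior mean
    mu_post = w * log_k_prior + (1.0 - w) * mu_lik  # precision-weighted mean
    var_post = (var_prior * var_lik) / (var_prior + var_lik)
    return mu_post, var_post

# Example: prior log10(k) = -1.2 from a resistivity correlation, updated at a
# sonic velocity of 3.9 km/s under a hypothetical regression log10(k) = 6 - 1.8*v.
mu, var = posterior_log_k(-1.2, 0.5, 3.9, a=6.0, b=-1.8, var_lik=0.3)
print(f"posterior mean log10(k) = {mu:.2f}, variance = {var:.2f}")
```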
Abstract:
While scientists continue to explore the extent of climate change's impact on weather patterns and our environment in general, there have been devastating natural disasters worldwide in the last two decades. Indeed, natural disasters are becoming a major concern in our society, yet in many past cases reconstruction efforts focused only on providing short-term necessities. How to develop long-term resilience is now a focus for research and industry practice. This paper introduces a research project aimed at exploring the relationship between resilience building and sustainability in order to identify key factors for reconstruction efforts. From an extensive literature study, the authors identified an inherent linkage between the two issues, as evidenced in past research. They found that sustainability considerations can improve the level of resilience but are not currently given due attention. Reconstruction efforts need to focus on resilience factors, but as part of urban development they must also respond to the sustainability challenge. Sustainability issues in reconstruction projects need to be given greater prominence and be properly identified, processed, and managed. Ongoing research through empirical study aims to establish critical factors (CFs) that help stakeholders in disaster-prone areas plan for and develop new building infrastructure through holistic considerations and balanced approaches to sustainability. A questionnaire survey examined a range of potential factors, and the subsequent data analysis revealed six critical factors for sustainable Post Natural Disaster Reconstruction: considerable building materials and construction methods; good governance; multilateral coordination; appropriate land-use planning and policies; consideration of different social needs; and a balanced combination of long-term and short-term needs. Findings from this study should influence policy development towards Post Natural Disaster Reconstruction and help with the achievement of sustainability objectives.
Abstract:
Successful inclusive product design requires knowledge about the capabilities, needs and aspirations of potential users, and should cater for the different scenarios in which people will use products, systems and services. These scenarios include the individual at home and in the workplace, businesses, and the products used in these contexts. It also needs to reflect the development of theory, tools and techniques as research moves on.
Abstract:
Spreadsheet for Creative City Index 2012
Abstract:
More than ever before, contemporary societies are characterised by the huge amounts of data being transferred. Authorities, companies, academia and other stakeholders refer to Big Data when discussing the importance of large and complex datasets and developing possible solutions for their use. Big Data promises to be the next frontier of innovation for institutions and individuals, yet it also offers possibilities to predict and influence human behaviour with ever-greater precision.
Abstract:
The decisions people make about medical treatments have a great impact on their lives. Health care practitioners, providers and patients often make decisions about medical treatments without a complete understanding of the circumstances. The main reason for this is that medical data are held in fragmented, disparate and heterogeneous data silos. Without a centralised data warehouse structure to integrate these silos, it is highly unlikely and impractical for users to obtain all the information they require in time to make a correct decision. In this research paper, a clinical data integration approach using SAS Clinical Data Integration Server tools is presented.
Abstract:
The health system is one sector dealing with very large amounts of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently, so there is a need for a very effective system to capture, collate and distribute them. A number of technologies have been identified for integrating data from different sources; data warehousing is one technology that can be used to manage clinical data in healthcare. This paper addresses how data warehousing can assist cardiac surgery decision making, using the cardiac surgery unit at The Prince Charles Hospital (TPCH) as a case study. In order to work with other units efficiently, it is important to integrate disparate data into a single point of interrogation, and we propose implementing a data warehouse for the cardiac surgery unit at TPCH. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data were analysed using SAS Enterprise Edition 4.3. This improves access to integrated clinical and financial data, with improved framing of data in the clinical context, potentially giving better-informed decision making for both improved management and patient care.
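As a minimal sketch of the single-point-of-interrogation idea, the snippet below integrates two hypothetical silos (clinical and financial) into one queryable view using sqlite3 for illustration. The actual prototype was built with SAS Enterprise Data Integration Studio, and all table and column names here are assumptions.

```python
# Sketch: integrate two data silos into a single queryable warehouse view.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Two hypothetical silos, each keyed by patient_id.
cur.executescript("""
    CREATE TABLE clinical (patient_id INTEGER, procedure TEXT, outcome TEXT);
    CREATE TABLE financial (patient_id INTEGER, cost REAL);
""")
cur.executemany("INSERT INTO clinical VALUES (?, ?, ?)",
                [(1, "CABG", "discharged"), (2, "valve repair", "discharged")])
cur.executemany("INSERT INTO financial VALUES (?, ?)",
                [(1, 42000.0), (2, 38000.0)])

# Integrated view: one point of interrogation across both silos.
cur.execute("""
    CREATE VIEW warehouse AS
    SELECT c.patient_id, c.procedure, c.outcome, f.cost
    FROM clinical AS c JOIN financial AS f USING (patient_id)
""")
for row in cur.execute("SELECT * FROM warehouse"):
    print(row)
```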
Abstract:
In Australia, as in some other western nations, governments impose accountability measures on educational institutions (Earl, 2005). One such accountability measure is the National Assessment Program - Literacy and Numeracy (NAPLAN), from which high-stakes assessment data are generated. In this article, a practical method of data analysis known as Over Time Assessment Data Analysis (OTADA) is offered as an analytical process by which schools can monitor their current and over-time performance. This analysis, developed by the author, is currently used extensively in schools throughout Queensland. By analysing in this way, teachers, and in particular principals, can obtain a quick and insightful performance overview. For those seeking to track the achievements and progress of year-level cohorts, the OTADA should be considered.
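The abstract does not spell out OTADA's internal computations, so the following is only a generic sketch of over-time cohort tracking with hypothetical NAPLAN-style scores, not the author's method: the mean scale score of one year-level cohort is followed across test years.

```python
# Generic sketch of over-time cohort tracking (hypothetical data, not OTADA itself).
import pandas as pd

scores = pd.DataFrame({
    "year_tested": [2011, 2011, 2013, 2013],
    "year_level":  [3, 3, 5, 5],   # same cohort: Year 3 in 2011, Year 5 in 2013
    "reading":     [412.0, 398.0, 488.0, 501.0],
})

# Over-time view: how the cohort's mean reading score moves between test years.
over_time = scores.groupby(["year_tested", "year_level"])["reading"].mean()
print(over_time)
```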
Abstract:
This research was a step towards developing a data integration framework for Electronic Health Records. The outcome of the research is a conceptual and logical data warehousing model for integrating cardiac surgery electronic data records. The thesis investigated the main obstacles to healthcare data integration and proposed a data warehousing model suitable for integrating fragmented data in a Cardiac Surgery Unit.
Abstract:
This project researched the performance of emerging digital technology for high voltage electricity substations that significantly improves safety for staff and reduces the potential environmental impact of equipment failure. The experimental evaluation used a scale model of a substation control system that incorporated real substation control and networking equipment with real-time simulation of the power system. The outcomes confirm that it is possible to implement Ethernet networks in high voltage substations that meet the needs of utilities; however, component-level testing of devices is necessary to achieve this. The assessment results have been used to further develop international standards for substation communication and precision timing.
Abstract:
This document provides data for the case study presented in our recent earthwork planning papers. Some results are also provided in a graphical format using Excel.
Abstract:
Ab initio Density Functional Theory (DFT) calculations are performed to study the diffusion of atomic hydrogen on the Mg(0001) surface and its migration into the subsurface layers. A carbon atom initially located on the Mg(0001) surface can migrate into the sub-surface layer and occupy an fcc site, with charge transfer to the C atom from neighboring Mg atoms. The cluster of positively charged Mg atoms surrounding a sub-surface C is then shown to facilitate the dissociative chemisorption of molecular hydrogen on the Mg(0001) surface, as well as the surface migration and subsequent subsurface diffusion of atomic hydrogen. This helps rationalize the experimentally observed improvement in the absorption kinetics of H2 when graphite or single-walled carbon nanotubes (SWCNT) are introduced into the Mg powder during ball milling.
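A minimal geometry-setup sketch of the system described above, assuming the ASE library is available; the slab size, vacuum and placement heights are illustrative assumptions, and the actual study used full DFT calculations rather than this bare structure construction.

```python
# Sketch: build a Mg(0001) slab with a surface H adatom and a sub-surface C.
from ase.build import hcp0001, add_adsorbate

# Four-layer Mg(0001) slab with vacuum above the surface (sizes are assumptions).
slab = hcp0001('Mg', size=(3, 3, 4), vacuum=10.0)

# Atomic H adsorbed at an fcc hollow site on the surface.
add_adsorbate(slab, 'H', height=1.0, position='fcc')

# Carbon placed below the top layer to mimic sub-surface occupation of an fcc site.
add_adsorbate(slab, 'C', height=-1.5, position='fcc')

# In practice a DFT calculator (e.g. GPAW or VASP) would be attached here, the
# structure relaxed, and nudged-elastic-band calculations used to obtain
# diffusion barriers for H migration into the subsurface.
print(slab)
```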
Abstract:
The impact-induced chemisorption of hydrocarbon molecules (CH3 and CH2) on the H-terminated diamond (001)-(2x1) surface was investigated by molecular dynamics simulation using the many-body Brenner potential. The deposition dynamics of the CH3 radical at impact energies of 0.1-50 eV per molecule was studied and the energy threshold for chemisorption was calculated. The impact-induced decomposition of hydrogen atoms and the dimer-opening mechanism on the surface were investigated. Furthermore, the probability of a dimer-opening event induced by chemisorption of CH3 was simulated by randomly varying the impact position as well as the orientation of the molecule relative to the surface. Finally, energetic hydrocarbons were modeled slowing down one after the other to simulate the initial fabrication of diamond-like carbon (DLC) films, and the structural characteristics of films synthesized with different hydrogen fluxes were studied. Our results indicate that CH3, CH2 and H are highly reactive and important species in diamond growth. In particular, the fraction of C atoms in the film having sp3 hybridization is enhanced in the presence of H atoms, in good agreement with experimental observations.
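The random sampling of impact position and orientation mentioned above can be sketched as follows; the surface cell vectors and the uniform-orientation scheme are assumptions for illustration, and the real simulations evolved each sampled event with molecular dynamics under the Brenner potential.

```python
# Sketch: Monte Carlo sampling of impact positions and molecular orientations.
import numpy as np

rng = np.random.default_rng(0)

def sample_impacts(a1, a2, n_events):
    """Sample random lateral impact positions within the surface cell spanned
    by vectors a1 and a2, and uniformly random 3D orientations (unit vectors)."""
    events = []
    for _ in range(n_events):
        u, v = rng.random(2)
        position = u * np.asarray(a1) + v * np.asarray(a2)  # lateral impact point
        axis = rng.normal(size=3)                           # isotropic direction
        axis /= np.linalg.norm(axis)                        # normalise to unit vector
        events.append((position, axis))
    return events

# Example: 100 impact events on a cell spanned by two hypothetical vectors (in angstroms).
for pos, axis in sample_impacts(a1=(5.0, 0.0), a2=(0.0, 2.5), n_events=100):
    pass  # each event would seed one MD deposition trajectory
```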
Abstract:
Introduction: The motivation for developing megavoltage (and kilovoltage) cone beam CT (MV CBCT) capabilities in the radiotherapy treatment room was primarily the need to improve patient set-up accuracy. There has recently been interest in using the cone beam CT data for treatment planning. Accurate treatment planning, however, requires knowledge of the electron density (ED) of the tissues receiving radiation in order to calculate dose distributions. This is obtained from CT, utilising a conversion between CT number and the electron density of various tissues. MV CBCT has particular advantages over kilovoltage CT for treatment planning in the presence of high atomic number materials, and requires the conversion of pixel values from the image sets to electron density. A study was therefore undertaken to characterise the pixel value to electron density relationship for the Siemens MV CBCT system, MVision, and to determine the effect, if any, of varying the number of monitor units (MU) used for acquisition. If a significant difference with the number of monitor units were seen, separate pixel value to ED conversions might be required for each clinical setting. Calibrating the MV CBCT images for electron density offers the possibility of daily recalculation of the dose distribution and the introduction of new adaptive radiotherapy treatment strategies.

Methods: A Gammex Electron Density CT Phantom was imaged with the MV CBCT system. The pixel value for each of the sixteen inserts, whose electron densities relative to the background solid water ranged from 0.292 to 1.707, was determined by taking the mean value within a region of interest centred on the insert, over 5 slices within the centre of the phantom. These results were averaged and plotted against the relative electron density of each insert, and a linear least-squares fit was performed. This procedure was carried out for images acquired with 5, 8, 15 and 60 monitor units.

Results: A linear relationship between MV CBCT pixel value and ED was demonstrated for all monitor unit settings and over the full range of electron densities. The number of monitor units was found to have no significant impact on this relationship.

Discussion: The number of MU used does not significantly alter the pixel value obtained for materials of different ED. However, to ensure the most accurate and reproducible pixel value to ED calibration, one MU setting should be chosen and used routinely, and it should correspond to the setting used clinically. If more than one MU setting is used clinically, an average of the CT values acquired with different numbers of MU could be utilised without loss of accuracy.

Conclusions: No significant differences were found in the pixel value to ED conversion for the Siemens MV CBCT unit as the number of monitor units changed; thus a single conversion curve can be utilised for MV CBCT treatment planning. To fully utilise MV CBCT imaging for radiotherapy treatment planning, further work will be undertaken to ensure all corrections have been made and dose calculations verified. These dose calculations may be for treatment planning purposes or for reconstructing the delivered dose distribution from transit dosimetry measurements made with electronic portal imaging devices.
This will potentially allow the cumulative dose distribution to be determined through the patient’s multi-fraction treatment and adaptive treatment strategies developed to optimize the tumour response.
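A minimal sketch of the calibration described in the Methods: fit a straight line to the mean insert pixel values against known relative electron density, then invert it to convert pixel values to ED. The pixel values below are hypothetical placeholders, not measured data; only the insert ED range matches the study.

```python
# Sketch: linear least-squares pixel-value-to-electron-density calibration.
import numpy as np

# Known relative EDs of phantom inserts (the study's inserts spanned 0.292-1.707).
relative_ed = np.array([0.292, 0.68, 1.0, 1.28, 1.707])
mean_pixel = np.array([210.0, 480.0, 700.0, 890.0, 1185.0])  # hypothetical ROI means

# Linear least-squares fit: pixel = slope * ED + intercept.
slope, intercept = np.polyfit(relative_ed, mean_pixel, deg=1)

def pixel_to_ed(pixel_value):
    """Invert the calibration to map a pixel value to relative electron density."""
    return (pixel_value - intercept) / slope

print(f"fit: pixel = {slope:.1f} * ED + {intercept:.1f}")
print("ED at pixel 800:", round(pixel_to_ed(800.0), 3))
```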