972 results for Distance measurement
Abstract:
This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used to improve the accuracy of bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because no readily available code included electron binding energy corrections for incoherent scattering, and one objective of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written.

The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions, and in comparisons with direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is therefore an accurate and valid modelling tool. A study of the significance of including electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made; the results show this significance to be highly dependent upon the type of application. The most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers.

To apply the Monte Carlo code effectively to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a theoretical framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained.
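A core sampling step in any such photon-transport code is drawing the free path length between interactions from the exponential attenuation law. A minimal sketch in Python follows; the attenuation coefficient used is illustrative, not a value from the thesis:

```python
import math
import random

def sample_interaction_depths(mu, n, seed=0):
    """Sample n photon free path lengths (cm) in a medium with linear
    attenuation coefficient mu (1/cm) by inverting the exponential
    CDF P(x) = 1 - exp(-mu * x)."""
    rng = random.Random(seed)
    # rng.random() lies in [0, 1), so 1 - rng.random() lies in (0, 1]
    # and the logarithm is always defined.
    return [-math.log(1.0 - rng.random()) / mu for _ in range(n)]

# For mu = 0.2 /cm the mean free path is 1/mu = 5 cm; the sample mean
# converges to that value.
depths = sample_interaction_depths(mu=0.2, n=100_000)
mean_path = sum(depths) / len(depths)
```

A full transport code would follow each sampled step with a choice between photoelectric absorption and coherent or incoherent scattering, according to the cross sections at the photon's current energy.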
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established and used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal; for the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements.

The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path, designated here as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, indicating the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique.

These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
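The two-component decomposition underlying DEXA can be illustrated as a 2x2 linear system: the log-attenuation at each energy is the sum, over components, of mass attenuation coefficient times areal density. The coefficients and measured values below are assumed for illustration only, not tabulated data:

```python
import numpy as np

# Illustrative (not tabulated) mass attenuation coefficients (cm^2/g)
# for bone mineral and soft tissue at a low and a high photon energy.
mu_rho = np.array([[0.60, 0.25],   # low energy:  [bone, soft tissue]
                   [0.30, 0.20]])  # high energy: [bone, soft tissue]

# Assumed measured log-attenuations ln(I0/I) at the two energies.
log_att = np.array([1.15, 0.70])

# Solve for the areal densities (g/cm^2) of the two components.
bone_areal, soft_areal = np.linalg.solve(mu_rho, log_att)
```

In this picture, the DPA(+) idea adds a third equation (the measured path length along the ray), allowing a third component to be separated, at the cost of the poorer precision noted above.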
Abstract:
A one-year mathematics project that focused on measurement was conducted with six Torres Strait Islander schools and communities. Its key focus was to contextualise the teaching and learning of measurement within the students' culture, communities and home languages. Six teachers and two teacher aides participated in the project. This paper reports on the findings from the teachers' and teacher aides' survey questionnaire used in the first Professional Development session to identify: a) teachers' experience of teaching in the Torres Strait Islands, b) teachers' beliefs about effective ways to teach Torres Strait Islander students, and c) ways of contextualising measurement within Torres Strait Islander culture, communities and home languages. A wide range of differing levels of knowledge and understanding about how to contextualise measurement to support student learning was identified and analysed. For example, an Indigenous teacher claimed that mathematics and the environment are relational; that is, they are not discrete and in isolation from one another, but rather interconnect, with mathematical ideas emerging from the environment of the Torres Strait communities.
Abstract:
The effect of conversion from forest to pasture upon soil carbon stocks has been intensively discussed, but few studies focus on how this land-use change affects carbon (C) distribution across soil fractions in the Amazon basin. We investigated this in the top 20 cm of soil along a chronosequence of sites from native forest to three successively older pastures. We performed a physicochemical fractionation of bulk soil samples to better understand the mechanisms by which soil C is stabilized and to evaluate the contribution of each C fraction to total soil C. Additionally, we used a two-pool model to estimate the mean residence time (MRT) of the slow and active pool C in each fraction. Soil C increased with conversion from forest to pasture in the particulate organic matter (> 250 µm), microaggregate (53-250 µm), and d-clay (< 2 µm) fractions. The microaggregate fraction contained the highest soil C content after the conversion from forest to pasture, while the C content of the d-silt fraction decreased with time since conversion. Forest-derived C remained in all fractions, with the highest concentration in the finest fractions and the largest proportion of forest-derived soil C associated with clay minerals. Results from this work indicate that microaggregate formation is sensitive to changes in management and might serve as an indicator for management-induced soil carbon changes, and that soil C changes in the fractions depend on soil texture.
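The two-pool model referred to above treats soil C as an active and a slow pool, each decaying at first order, with each pool's MRT being the reciprocal of its rate constant. A sketch with assumed pool sizes and rate constants, not values from the study:

```python
import math

def two_pool_c(t, c_active, k_active, c_slow, k_slow):
    """Soil C remaining at time t (years) under a two-pool
    first-order decay model; the MRT of each pool is 1/k."""
    return c_active * math.exp(-k_active * t) + c_slow * math.exp(-k_slow * t)

# Assumed pool sizes (g C per kg soil) and rate constants (1/yr).
c_initial = two_pool_c(0.0, c_active=5.0, k_active=0.5, c_slow=20.0, k_slow=0.02)
mrt_active = 1 / 0.5   # 2 years
mrt_slow = 1 / 0.02    # 50 years
```

Fitting the two rate constants to isotope or incubation time series for each fraction is what yields the per-fraction MRT estimates.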
Abstract:
High growth in the uptake of electrical appliances accounts for a significant increase in electricity consumption globally. In some developed countries, standby energy alone may account for about 10% of residential electricity use. The standby power of many appliances used in Australia is still well above the national goal of 1 W or less. In this paper, field measurements of standby power and operating power for a range of electrical appliances are presented. It was found that the difference between the minimum and maximum standby power could be quite large: up to 22.13 W for home theatre systems, for example. With the exception of home audio systems, however, the annual operating energy used by most electrical appliances was generally greater than the annual standby energy. Consumer behaviour and product choice can have a significant impact on standby power and operating power, which influences both energy demand and greenhouse gas emissions.
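The comparison between annual standby and operating energy reduces to simple arithmetic over a daily usage pattern. A sketch with an assumed duty cycle and an assumed operating power, using the 22.13 W maximum standby figure reported above:

```python
def annual_energy_kwh(power_w, hours_per_day):
    """Annual energy (kWh) for a constant power draw over a daily duty cycle."""
    return power_w * hours_per_day * 365 / 1000.0

# Assumed pattern for a home theatre system: 4 h/day operating at 60 W,
# 20 h/day in standby at the 22.13 W maximum reported in the paper.
operating_kwh = annual_energy_kwh(60.0, 4.0)
standby_kwh = annual_energy_kwh(22.13, 20.0)
```

Under these assumed figures the standby energy exceeds the operating energy, consistent with the abstract's note that home entertainment equipment can be the exception to the general pattern.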
Abstract:
Uncooperative iris identification systems at a distance and on the move often suffer from poor resolution and poor focus of the captured iris images. The lack of pixel resolution and well-focused images significantly degrades iris recognition performance. This paper proposes a new approach that incorporates a focus score into a reconstruction-based super-resolution process to generate a high resolution iris image from a low resolution and focus-inconsistent video sequence of an eye. A reconstruction-based technique is used which can incorporate middle and high frequency components from multiple low resolution frames into one desired super-resolved frame without introducing false high frequency components. A new focus assessment approach is proposed for uncooperative iris images captured at a distance and on the move, improving robustness to variations in lighting, size and occlusion. A novel fusion scheme is then proposed to incorporate the focus score into the super-resolution process. Experiments conducted on the Multiple Biometric Grand Challenge portal database show that our proposed approach achieves an EER of 2.1%, outperforming the existing state-of-the-art averaging signal-level fusion approach by 19.2% and the robust mean super-resolution approach by 8.7%.
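The paper's exact fusion scheme is not reproduced here, but the general idea of letting a focus score weight each registered low resolution frame can be sketched as a normalised weighted average; the frames and scores below are toy values:

```python
import numpy as np

def focus_weighted_fusion(frames, focus_scores):
    """Fuse registered low-resolution frames by weighting each with its
    normalised focus score, so well-focused frames dominate the result."""
    frames = np.asarray(frames, dtype=float)
    weights = np.asarray(focus_scores, dtype=float)
    weights = weights / weights.sum()
    # Weighted sum over the frame axis.
    return np.tensordot(weights, frames, axes=1)

# Three toy 2x2 frames; the last (sharpest) frame gets the largest weight.
frames = [[[1, 2], [3, 4]],
          [[5, 6], [7, 8]],
          [[9, 10], [11, 12]]]
fused = focus_weighted_fusion(frames, focus_scores=[0.2, 0.3, 0.5])
```

A full reconstruction-based pipeline would additionally register the frames to sub-pixel accuracy and deconvolve the imaging blur rather than simply averaging.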
Abstract:
National estimates of the prevalence of child abuse-related injuries are obtained from a variety of sectors, including welfare, justice and health, resulting in inconsistent estimates across sectors. The International Classification of Diseases (ICD) is the international standard for categorising health data and aggregating it for statistical purposes, though there has been limited validation of the quality, completeness or concordance of these data with other sectors. This study examined the quality of documentation and coding of child abuse recorded in hospital records in Queensland and the concordance of these data with child welfare records. A retrospective medical record review was used to examine the clinical documentation of over 1000 hospitalised injured children from 20 hospitals in Queensland, and a data linkage methodology was used to link these records with records in the child welfare database. Cases were sampled from three sub-groups according to the presence of target ICD codes: definite abuse, possible abuse and unintentional injury. Less than 2% of cases coded as unintentional were recoded after review as possible abuse, and only 5% of cases coded as possible abuse were reclassified as unintentional, though there was greater variation in the classification of cases as definite abuse compared to possible abuse. Concordance of health data with child welfare data varied across patient subgroups. This study will inform the development of strategies to improve the quality, consistency and concordance of information between health and welfare agencies to ensure adequate system responses to children at risk of abuse.
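Concordance between two coding sources is often summarised with an agreement statistic such as Cohen's kappa; the abstract does not state which statistic was used, and the counts below are invented for illustration:

```python
def cohens_kappa(confusion):
    """Cohen's kappa for a square confusion matrix of counts
    (rows: source A category, columns: source B category)."""
    n = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(len(confusion))) / n
    row_totals = [sum(row) for row in confusion]
    col_totals = [sum(col) for col in zip(*confusion)]
    expected = sum(r * c for r, c in zip(row_totals, col_totals)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical counts: original ICD category (rows) versus post-review
# category (columns), ordered definite abuse, possible abuse, unintentional.
kappa = cohens_kappa([[80, 15, 5],
                      [10, 85, 5],
                      [1, 1, 98]])
```

Kappa corrects the raw per-category agreement rates quoted above for the agreement expected by chance alone.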
Abstract:
Electrostatic discharge is the sudden and brief electric current that flashes between two objects at different voltages. It is a serious issue in applications ranging from solid-state electronics to spectacular and dangerous lightning strikes and arc flashes. The research herein presents work on the experimental simulation and measurement of the energy in an electrostatic discharge. The energy released in these discharges has been linked to ignitions and burning in a number of documented disasters and can be enormously hazardous in many other industrial scenarios. Simulations of electrostatic discharges were designed to the specifications of IEC standards. Energy estimates are typically based on the residual voltage/charge on the discharge capacitor, whereas this research examines the voltage and current in the actual spark in order to obtain a more precise comparative measurement of the energy dissipated.
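Measuring the spark's voltage and current directly allows the dissipated energy to be computed as the time integral of instantaneous power, E = ∫ v(t)·i(t) dt. A sketch with an invented toy waveform:

```python
def discharge_energy(voltages, currents, dt):
    """Energy (J) dissipated in the spark: trapezoidal integration of
    instantaneous power p(t) = v(t) * i(t) sampled at interval dt (s)."""
    power = [v * i for v, i in zip(voltages, currents)]
    return sum((power[k] + power[k + 1]) / 2.0 * dt
               for k in range(len(power) - 1))

# Toy waveform: a triangular current pulse at a constant 100 V arc
# voltage, sampled every microsecond.
v = [100.0, 100.0, 100.0, 100.0, 100.0]
i = [0.0, 2.0, 4.0, 2.0, 0.0]
energy_joules = discharge_energy(v, i, dt=1e-6)
```

By contrast, the residual-charge estimate E = ½C(V0² - Vr²) gives the energy released by the whole discharge circuit; integrating v·i in the spark itself isolates the energy dissipated in the gap.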
Abstract:
The link between measured sub-saturated hygroscopicity and the cloud activation potential of secondary organic aerosol particles, produced by the chamber photo-oxidation of α-pinene in the presence or absence of ammonium sulphate seed aerosol, was investigated using two models of varying complexity. A simple single-hygroscopicity-parameter model and a more complex model incorporating surface effects were used to assess the detail required to predict cloud condensation nucleus (CCN) activity from sub-saturated water uptake. Sub-saturated water uptake measured by three hygroscopicity tandem differential mobility analyser (HTDMA) instruments was used to determine the water activity for use in the models. The predicted CCN activity was compared to the measured CCN activation potential using a continuous flow CCN counter. Reconciliation of the more complex model formulation with measured cloud activation could be achieved with widely different assumed surface tension behaviours of the growing droplet; the assumed behaviour was entirely determined by the instrument used as the source of water activity data. This unreliable derivation of water activity as a function of solute concentration from sub-saturated hygroscopicity data indicates a limitation in the use of such data for predicting the cloud condensation nucleus behaviour of particles with a significant organic fraction. Similarly, the ability of the simpler single-parameter model to predict cloud activation behaviour depended on the instrument used to measure sub-saturated hygroscopicity and the relative humidity used to provide the model input. However, agreement was observed for inorganic salt solution particles, which were measured by all instruments in agreement with theory. Given the differences in HTDMA data from validated and extensively used instruments, the detail required to predict CCN activity from sub-saturated hygroscopicity cannot be stated with certainty.
In order to narrow the gap between measurements of hygroscopic growth and CCN activity, the processes involved must be understood and the instrumentation extensively quality assured. Owing to the differences in HTDMA data, it is impossible to say from the results presented here whether:
i) surface tension suppression occurs;
ii) bulk-to-surface partitioning is important; or
iii) the water activity coefficient changes significantly as a function of the solute concentration.
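The single-hygroscopicity-parameter model referred to above is commonly written as a water-activity term multiplied by a Kelvin (surface tension) term. A sketch that locates the critical supersaturation by scanning wet diameters, with an assumed kappa value and the surface tension of pure water:

```python
import math

def saturation_ratio(d_wet, d_dry, kappa, sigma=0.072, temp=298.15):
    """Equilibrium saturation ratio over a droplet in the
    single-hygroscopicity-parameter (kappa) model: water activity
    times the Kelvin term. Lengths in metres, SI units throughout."""
    mw, rho_w, r_gas = 0.018, 997.0, 8.314  # water molar mass, density, R
    water_activity = (d_wet**3 - d_dry**3) / (d_wet**3 - d_dry**3 * (1 - kappa))
    kelvin = math.exp(4 * sigma * mw / (r_gas * temp * rho_w * d_wet))
    return water_activity * kelvin

# Critical supersaturation (%) of an assumed 100 nm dry particle with
# kappa = 0.3, found by scanning wet diameters from 1.1x to 50x dry size.
d_dry = 100e-9
s_max = max(saturation_ratio(d_dry * x / 100.0, d_dry, 0.3)
            for x in range(110, 5000))
sc_percent = (s_max - 1.0) * 100.0
```

The choice of `sigma` is exactly the assumed surface tension behaviour at issue in the abstract: suppressing it below the pure-water value lowers the Kelvin term and hence the predicted critical supersaturation.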
Abstract:
The neutron logging method has been widely used for field measurement of soil moisture content. This non-destructive method permits the measurement of in-situ soil moisture content at various depths without the need to bury any sensor. Twenty-three sites around regional Melbourne have been selected for long term monitoring of soil moisture content using a neutron probe. Soil samples collected during installation are used for site characterisation and neutron probe calibration purposes. A linear relationship is obtained between the corrected neutron probe reading and moisture content for both the individual and the combined data from seven sites. It is observed that the linear relationship developed using the combined data can be used for all sites with an average accuracy of about 80%. Monitoring of the variation of soil moisture content with depth over six months for two sites is presented in this paper.
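The calibration step pairs corrected probe readings with moisture contents determined from the collected soil samples and fits a straight line by least squares. A sketch with invented calibration pairs:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = slope * x + intercept, here
    relating corrected neutron count ratio (x) to moisture content (y)."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
             / sum((xi - mean_x) ** 2 for xi in x))
    return slope, mean_y - slope * mean_x

# Invented calibration pairs: (corrected count ratio, volumetric moisture %).
count_ratios = [0.4, 0.6, 0.8, 1.0, 1.2]
moisture_pct = [10.0, 15.0, 20.0, 25.0, 30.0]
slope, intercept = linear_fit(count_ratios, moisture_pct)
```

The fitted line can then convert routine probe readings at any depth into moisture estimates, either per site or, as the paper finds, from the combined data across sites.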
Abstract:
Background: Heavy vehicle transportation continues to grow internationally, yet crash rates are high and the risk of injury and death extends to all road users. The work environment of the heavy vehicle driver poses many challenges; conditions such as scheduling and payment are proposed risk factors for crashes, yet their precise effects need quantifying. Other risk factors, such as sleep disorders including obstructive sleep apnoea, have been shown to increase crash risk in motor vehicle drivers; however, the heavy vehicle crash risk from this and related health conditions needs detailed investigation.
Methods and Design: The proposed case-control study will recruit 1034 long distance heavy vehicle drivers: 517 who have crashed and 517 who have not. All participants will be interviewed at length regarding their driving and crash history, typical workloads, scheduling and payment, trip history over several days, sleep patterns, health, and substance use. All participants will be fitted with a nasal flow monitor for the detection of obstructive sleep apnoea.
Discussion: Significant attention has been paid to the enforcement of legislation aiming to deter problems such as excess loading, speeding and substance use; however, there is inconclusive evidence as to the direction and strength of association of many other postulated risk factors for heavy vehicle crashes. The influence of factors such as remuneration and scheduling on crash risk is unclear, as is the association between sleep apnoea and the risk of a heavy vehicle driver crashing. Contributory factors such as sleep quality and quantity, body mass and health status will be investigated. Quantifying the effect of these factors on the heavy vehicle driver will inform policy development aimed at safer driving practices and a reduction in heavy vehicle crashes, protecting the lives of many on the road network.
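In a case-control design of this kind, the association between an exposure (for example, sleep apnoea) and crashing is typically expressed as an odds ratio. A sketch with invented counts, not study results:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 case-control table with an approximate
    95% confidence interval on the log scale:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_value = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_value) - z * se_log)
    upper = math.exp(math.log(or_value) + z * se_log)
    return or_value, lower, upper

# Invented counts: hypothetical sleep apnoea prevalence among the 517
# crashed drivers (cases) and 517 non-crashed drivers (controls).
or_value, lower, upper = odds_ratio_ci(120, 397, 70, 447)
```

A confidence interval excluding 1 would indicate a statistically significant association; the planned 517 + 517 sample size governs how narrow that interval can be.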
Abstract:
Training designed to support and strengthen higher-order mental abilities now often involves immersion in Virtual Reality, where dangerous real world scenarios can be safely replicated. However, despite the growing popularity of advanced training simulations, methods for evaluating their use rely heavily on subjective measures or analysis of final outcomes. Without dynamic, objective performance measures, the outcome of training, in terms of its impact on cognitive skills and the ability to transfer newly acquired skills to the real world, is unknown. The relationship between affective intensity and cognitive learning provides a potential new approach to ensure that the cognitive processes which occur prior to final outcomes, such as problem-solving and decision-making, are adequately evaluated. This paper describes the technical aspects of pilot work recently undertaken to develop a new measurement tool designed to objectively track individual affect levels during simulation-based training.
Abstract:
Process modeling is an emergent area of Information Systems research characterized by an abundance of conceptual work with little empirical research. To fill this gap, this paper reports on the development and validation of an instrument to measure user acceptance of process modeling grammars. We advance an extended model for a multi-stage measurement instrument development procedure which incorporates feedback from both expert and user panels. We identify two main contributions. First, we provide a validated measurement instrument for the study of user acceptance of process modeling grammars, which can be used in further empirical studies investigating phenomena associated with the business process modeling domain. Second, in doing so, we describe in detail a procedural model for developing measurement instruments that ensures high levels of reliability and validity, which may assist fellow scholars in executing their empirical research.
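Instrument validation of this kind usually includes a reliability check such as Cronbach's alpha over the items of each construct; the paper's actual statistics are not reproduced, and the responses below are invented:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale: `items` is a list of item score
    lists, one list of respondent scores per item."""
    def sample_var(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    sum_item_var = sum(sample_var(item) for item in items)
    return k / (k - 1) * (1 - sum_item_var / sample_var(totals))

# Invented 5-point Likert responses: 3 items, 4 respondents.
alpha = cronbach_alpha([[4, 5, 3, 4],
                        [4, 4, 3, 5],
                        [5, 5, 2, 4]])
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency; validity (content, convergent, discriminant) requires the expert- and user-panel steps the paper describes, not just this statistic.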
Abstract:
The process of offsetting land against unavoidable disturbance of development sites in Queensland will benefit from a method that allows the best possible selection of alternative lands. With site selection now advocated through a combination of Regional Ecosystem and Land Capability classifications state-wide, a case study has determined methods of assessing the functional lift (that is, measures of net environmental gain) of such action. Outcomes with potentially high functional lift are identified that offer promise not only for endangered ecosystems but also for the management of adjacent conservation reserves.