918 results for Scattered trees
Abstract:
Emerging data streaming applications in Wireless Sensor Networks require reliable and energy-efficient transport protocols. Our recent Wireless Sensor Network deployment in the Burdekin delta, Australia, for water monitoring [T. Le Dinh, W. Hu, P. Sikka, P. Corke, L. Overs, S. Brosnan, Design and deployment of a remote robust sensor network: experiences from an outdoor water quality monitoring network, in: Second IEEE Workshop on Practical Issues in Building Sensor Network Applications (SenseApp 2007), Dublin, Ireland, 2007] is one such example. This application involves streaming sensed data such as pressure, water flow rate, and salinity periodically from many scattered sensors to the sink node, which in turn relays them via an IP network to a remote site for archiving, processing, and presentation. While latency is not a primary concern in this class of application (the sampling rate is usually in terms of minutes or hours), energy efficiency is. Continuous long-term operation and reliable delivery of the sensed data to the sink are also desirable. This paper proposes ERTP, an Energy-efficient and Reliable Transport Protocol for Wireless Sensor Networks. ERTP is designed for data streaming applications, in which sensor readings are transmitted from one or more sensor sources to a base station (or sink). ERTP uses a statistical reliability metric which ensures that the number of data packets delivered to the sink exceeds a defined threshold. Our extensive discrete event simulations and experimental evaluations show that ERTP is significantly more energy-efficient than current approaches, reducing energy consumption by more than 45%. Consequently, sensor nodes are more energy-efficient and the lifespan of the unattended WSN is increased.
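The abstract does not specify how ERTP's statistical reliability metric is enforced, so the following is only a minimal sketch of the kind of calculation such a metric implies, assuming a per-hop delivery probability, a hop count, and a target delivery ratio; the function name and all numbers are illustrative, not taken from the paper.

```python
def min_retransmissions(p_hop: float, hops: int, target: float) -> int:
    """Smallest per-hop retransmission limit r such that the end-to-end
    delivery probability (1 - (1 - p_hop)**r)**hops meets the target."""
    r = 1
    while (1.0 - (1.0 - p_hop) ** r) ** hops < target:
        r += 1
    return r

# Illustrative numbers only: 80% per-hop link quality, 6 hops,
# and a 95% statistical reliability target.
print(min_retransmissions(p_hop=0.8, hops=6, target=0.95))  # -> 3
```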
Abstract:
Habitat models are widely used in ecology; however, there are relatively few studies of rare species, primarily because of a paucity of survey records and the lack of robust means of assessing the accuracy of modelled spatial predictions. We investigated the potential of compiled ecological data in developing habitat models for Macadamia integrifolia, a vulnerable mid-stratum tree endemic to lowland subtropical rainforests of southeast Queensland, Australia. We compared the performance of two binomial models, Classification and Regression Trees (CART) and Generalised Additive Models (GAM), developed from presence records and available absence data, with Maximum Entropy (MAXENT) models developed from presence records and background data. The GAM model was the best performer across the range of evaluation measures employed; however, all models were assessed as potentially useful for informing in situ conservation of M. integrifolia. A significant loss in the amount of M. integrifolia habitat has occurred (p < 0.05), with only 37% of former (pre-clearing) habitat remaining in 2003. Remnant patches are significantly smaller, have larger edge-to-area ratios and are more isolated from each other compared to pre-clearing configurations (p < 0.05). Whilst the network of suitable habitat patches is still largely intact, there are numerous smaller patches that are more isolated in the contemporary landscape compared with their connectedness before clearing. These results suggest that in situ conservation of M. integrifolia may be best achieved through a landscape approach that considers the relative contribution of small remnant habitat fragments to the species as a whole, as well as facilitating connectivity among the entire network of habitat patches.
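Purely as an illustration of the kind of model comparison described above (the study's data and its GAM and MAXENT implementations are not reproduced here), the sketch below fits a classification tree and a logistic regression stand-in to synthetic presence/absence data and compares them by AUC; the predictors and thresholds are invented.

```python
# Illustrative sketch only: synthetic presence/absence data and scikit-learn
# models stand in for the CART/GAM/MAXENT comparison described in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression  # stand-in for a smooth model
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical environmental predictors: annual rainfall (mm) and elevation (m).
X = np.column_stack([rng.uniform(800, 2000, 500), rng.uniform(0, 600, 500)])
# Hypothetical presence signal: the species favours wet, low-elevation sites.
y = ((X[:, 0] > 1400) & (X[:, 1] < 300)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("CART", DecisionTreeClassifier(max_depth=3, random_state=0)),
                    ("GLM (smooth-model stand-in)", LogisticRegression(max_iter=1000))]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```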
Abstract:
This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This theoretical framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques. Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated as the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components. Bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions and hence would indicate the potential to overcome a major problem of the two component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone and has poorer precision (approximately twice the coefficient of variation) than the standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:
1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three tissue component technique is poorer than that of the standard DEXA two tissue component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three component model of fat, lean soft tissue and bone mineral; and
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.
The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
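A minimal worked sketch of the two-component decomposition underlying DEXA, as discussed above: log-attenuation measured at two energies gives two linear equations in the areal densities of bone mineral and soft tissue. The mass attenuation coefficients and measured values below are placeholder assumptions for illustration, not values from the thesis.

```python
# Minimal sketch of two-component DEXA decomposition: attenuation measured at
# two energies yields two equations in two areal densities. Coefficients below
# are illustrative placeholders.
import numpy as np

# Rows: low energy, high energy; columns: bone mineral, soft tissue (cm^2/g).
mu_rho = np.array([[0.60, 0.25],
                   [0.30, 0.20]])

# Measured log-attenuations ln(I0/I) at the two energies (dimensionless).
log_atten = np.array([0.95, 0.55])

# Solve for areal densities (g/cm^2) of bone mineral and soft tissue.
sigma_bone, sigma_soft = np.linalg.solve(mu_rho, log_atten)
print(f"bone mineral: {sigma_bone:.2f} g/cm^2, soft tissue: {sigma_soft:.2f} g/cm^2")
```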
Abstract:
Greenhouse gas markets, where invisible gases are traded, must seem like black boxes to most people. Farmers can make money on these markets, such as the Chicago Climate Exchange, by installing methane capture technologies in animal-based systems, adopting no-till farming, establishing grasslands, and planting trees.
Abstract:
Fault tree analysis (FTA) is presented in this paper to model the reliability of a railway traction power system. First, the construction of the fault tree is introduced, integrating the components of the traction power system into a fault tree; then the binary decision diagram (BDD) method is used to evaluate the fault tree qualitatively and quantitatively. The components contributing to the reliability of the overall system are identified, along with their relative importance, through sensitivity analysis. Finally, an AC traction power system is evaluated using the proposed methods.
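As a toy illustration of quantitative fault tree evaluation (not the paper's traction power model or its BDD implementation), the sketch below computes a top-event probability for a small AND/OR tree with assumed component failure probabilities, plus a crude sensitivity check of the kind used to rank component importance.

```python
# Toy quantitative fault tree: top event = (rectifier fails) OR (both
# redundant feeders fail). All probabilities are assumed for illustration.
def p_or(*ps):
    """Probability that at least one independent event occurs."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """Probability that all independent events occur."""
    q = 1.0
    for p in ps:
        q *= p
    return q

p_rectifier, p_feeder_a, p_feeder_b = 0.002, 0.01, 0.01  # assumed values

p_top = p_or(p_rectifier, p_and(p_feeder_a, p_feeder_b))
print(f"top event probability: {p_top:.6f}")

# Crude importance check: how much does making each component perfect reduce P(top)?
reduction_rectifier = p_top - p_or(0.0, p_and(p_feeder_a, p_feeder_b))
reduction_feeder_a = p_top - p_or(p_rectifier, p_and(0.0, p_feeder_b))
print(f"rectifier: {reduction_rectifier:.6f}, feeder A: {reduction_feeder_a:.6f}")
```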
Abstract:
Various countries have introduced sustainability assessment tools for real estate design to produce integrated sustainability components, not just for the building but also for the landscape component of the development. This paper aims to compare international and local assessment tools of landscape design for housing estate developments in the Bangkok Metropolitan Region (BMR), Thailand. The methodology is to review the tools, then compare them and identify discrepancies among their indicators. The paper examines four international tools: LEED for Neighbourhood Development (LEED-ND) of the United States of America (USA), the EnviroDevelopment standards of Australia, Residential Landscape Sustainability of the United Kingdom (UK) and Green Mark for Infrastructure of Singapore; and three of BMR's existing tools: the Land Subdivision Act B.E. 2543, the Environmental Impact Assessment Monitoring Awards (EIA-MA) and Thai's Rating for Energy and Environmental Sustainability of New construction and major renovation (TREES-NC). The findings show twenty-two elements across three categories: neighbourhood design, community management, and environmental condition. Only one element in neighbourhood design differs between the international and local tools. Sustainability assessment tools already exist in BMR, but no single tool covers all elements. Thus, the development of a new comprehensive assessment tool will be necessary in BMR; however, it should suit the specific environmental and climatic conditions for housing estate development in BMR.
Abstract:
Background: For CAM to feature prominently in health care decision-making, there is a need to expand the evidence base and to further incorporate economic evaluation into research priorities. In a world of scarce health care resources and an emphasis on efficiency and clinical efficacy, CAM, as indeed do all other treatments, requires rigorous evaluation to be considered in budget decision-making. Methods: Economic evaluation provides the tools to measure the costs and health consequences of CAM interventions and thereby inform decision making. This article offers CAM researchers an introductory framework for understanding, undertaking and disseminating economic evaluation. The types of economic evaluation available for the study of CAM are discussed, and decision modelling is introduced as a method for economic evaluation with much potential for use in CAM. Two types of decision models are introduced, decision trees and Markov models, along with a worked example of how each method is used to examine costs and health consequences. This is followed by a discussion of how this information is used by decision makers. Conclusions: Undoubtedly, economic evaluation methods form an important part of health care decision making. Without formal training it can seem a daunting task to consider economic evaluation; however, multidisciplinary teams provide an opportunity for health economists, CAM practitioners and other interested researchers to work together to further develop the economic evaluation of CAM.
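A minimal worked sketch of the two model types named above, using entirely hypothetical costs, probabilities and health outcomes: a two-branch decision tree yielding an incremental cost-effectiveness ratio, and a small Markov cohort model accumulating expected costs over time. Nothing here reproduces the article's own worked examples.

```python
# Hypothetical numbers only: a two-branch decision tree and a tiny Markov
# cohort model, illustrating the expected-cost / expected-outcome calculations.
import numpy as np

# Decision tree: intervention A or B, each with a probability of recovery.
def expected(cost, p_recover, qaly_recover=1.0, qaly_not=0.6):
    return cost, p_recover * qaly_recover + (1 - p_recover) * qaly_not

cost_a, qaly_a = expected(cost=300.0, p_recover=0.70)
cost_b, qaly_b = expected(cost=150.0, p_recover=0.55)
icer = (cost_a - cost_b) / (qaly_a - qaly_b)   # incremental cost per QALY gained
print(f"ICER of A vs B: {icer:.0f} per QALY")

# Markov model: yearly transitions between Well, Ill and Dead for a closed cohort.
P = np.array([[0.85, 0.10, 0.05],   # from Well
              [0.20, 0.70, 0.10],   # from Ill
              [0.00, 0.00, 1.00]])  # Dead is absorbing
state = np.array([1.0, 0.0, 0.0])   # everyone starts Well
yearly_cost = np.array([100.0, 800.0, 0.0])
total_cost = 0.0
for year in range(10):
    total_cost += state @ yearly_cost
    state = state @ P
print(f"expected 10-year cost per patient: {total_cost:.0f}")
```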
Abstract:
The purpose of this thesis is to outline the relationship that existed in the past, and exists in the present, between Australians and the War Graves and Memorials to the Missing, the commemorations of Australians who died during the First World War. Their final resting places are scattered all over the world and provide a tangible record of the sacrifice of men and women in the war, and represent the final result of the attempt by Official Agencies, such as the Imperial, and later Commonwealth, War Graves Commission and its agency representative, the Office of Australian War Graves, to appropriately commemorate them. The study follows the path of history from the death of an individual in the First World War, through their burial; temporary grave or memorial commemoration; the permanent commemoration; the family and public reaction to the deaths; how the Official Agencies of related Commonwealth Governments dealt with the dead; and finally, how the Australian dead are represented on the battlefields of the world in the 21st century. Australia's war dead of the First World War are scattered around the globe in more than 40 countries and are represented in war cemeteries and civil cemeteries, and listed on large 'Memorials to the Missing', which commemorate individuals without a known grave or final resting place.
Abstract:
The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data being generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices are storing data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that the progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify these slightly for higher density storage. Alternatively, three dimensional optical data storage is a promising field for the information storage needs of the future, offering very high density, high speed memory. There are two ways in which data may be recorded in a three dimensional optical medium: either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates, due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but with few commercial products presently available. Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two dimensional pages of data are recorded into a photorefractive crystal, as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes and a CCD camera can be used to monitor the readout beam, and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal, lithium niobate (LiNbO3). Firstly, the experimental methods for storing the two dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask which contains a pattern of the information to be stored, illuminates the lithium niobate crystal and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing.
Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. A method to avoid this is by using thermal fixing. Here the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium. This ionic grating is insensitive to the readout beam and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to a situation where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered. The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the time at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any size smaller than this results in incomplete recovery. The degradation and recovery process could be applied to an application in image scrambling or cryptography for optical information storage. A two dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest quality pattern storage would be achieved with a thin 0.5 mm medium; however this type of medium would also remove the degradation property of the patterns and the subsequent recovery process.
To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly for the degradation and recovery process, and confirms the theory that the recovery of the degraded patterns is possible since the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes, compared with the stripes of smaller widths. As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm, for accurate and reliable pattern storage.
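A minimal sketch of the fringe-counting relation described above: one full intensity oscillation corresponds to the optical path difference changing by one wavelength, so the temperature change follows from the number of oscillations. The wavelength, crystal length and thermo-optic coefficient below are placeholder assumptions, not the thesis's measured values.

```python
# Sketch of the fringe-counting relation: one full intensity oscillation
# corresponds to the birefringent path difference changing by one wavelength.
# All numerical values are placeholders, not the thesis's measurements.
def temperature_change(n_fringes: int, wavelength_m: float,
                       crystal_length_m: float, d_birefringence_dT: float) -> float:
    """Temperature change implied by counting n_fringes intensity oscillations."""
    dT_per_fringe = wavelength_m / (crystal_length_m * abs(d_birefringence_dT))
    return n_fringes * dT_per_fringe

# Assumed values: 633 nm probe, 10 mm crystal, |d(Δn)/dT| ~ 4e-5 per kelvin.
print(f"ΔT ≈ {temperature_change(25, 633e-9, 10e-3, 4e-5):.1f} K")
```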
Abstract:
This study examined the distribution of major mosquito species and their roles in the transmission of Ross River virus (RRV) infection in coastline and inland areas of Brisbane, Australia (27°28′ S, 153°2′ E). We obtained data on the monthly counts of RRV cases in Brisbane between November 1998 and December 2001 by statistical local areas from the Queensland Department of Health, and the monthly mosquito abundance from the Brisbane City Council. Correlation analysis was used to assess the pairwise relationships between mosquito density and the incidence of RRV disease. This study showed that the abundances of Aedes vigilax (Skuse), Culex annulirostris (Skuse), and Aedes vittiger (Skuse) were significantly associated with the monthly incidence of RRV in the coastline area, whereas Aedes vigilax, Culex annulirostris, and Aedes notoscriptus (Skuse) were significantly associated with the monthly incidence of RRV in the inland area. The results of the classification and regression tree (CART) analysis show that both the occurrence and incidence of RRV were influenced by interactions between species in both coastal and inland regions. We found that there was an 89% chance of an occurrence of RRV if the abundance of Ae. vigilax was between 64 and 90 in the coastline region. There was an 80% chance of an occurrence of RRV if the density of Cx. annulirostris was between 53 and 74 in the inland area. The results of this study may have applications as a decision support tool in planning disease control of RRV and other mosquito-borne diseases.
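As an illustration only (the study's surveillance data are not reproduced here), the sketch below correlates synthetic monthly mosquito abundance with synthetic RRV incidence and fits a shallow classification tree to recover a threshold rule of the kind reported above; all data and thresholds are invented.

```python
# Illustrative sketch with synthetic monthly data: correlate mosquito abundance
# with RRV incidence, then fit a shallow tree to extract a threshold rule.
import numpy as np
from scipy.stats import spearmanr
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
abundance = rng.uniform(20, 120, 36)                    # monthly trap counts
incidence = 0.05 * abundance + rng.normal(0, 1.0, 36)   # synthetic RRV cases
occurrence = (incidence > 3.5).astype(int)              # months with an outbreak

rho, p_value = spearmanr(abundance, incidence)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")

tree = DecisionTreeClassifier(max_depth=1, random_state=0)
tree.fit(abundance.reshape(-1, 1), occurrence)
print(export_text(tree, feature_names=["Ae. vigilax abundance"]))
```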
Abstract:
After bone fracture, various cellular activities lead to the formation of different tissue types, which form the basis for the process of secondary bone healing. Although these tissues have been quantified by histology, their material properties are not well understood. Thus, the aim of this study is to correlate the spatial and temporal variations in the mineral content and the nanoindentation modulus of the callus formed via intramembranous ossification over the course of bone healing. Midshaft tibial samples from a sheep osteotomy model at time points of 2, 3, 6 and 9 weeks were employed. PMMA-embedded blocks were used for quantitative backscattered electron imaging and nanoindentation of the newly formed periosteal callus near the cortex. The resulting indentation modulus maps show the heterogeneity of the modulus in the selected regions of the callus. The indentation modulus of the embedded callus is about 6 GPa at the early stage. At later stages of mineralization, the average indentation modulus reaches 14 GPa. There is a slight decrease in average indentation modulus in regions distant from the cortex, probably due to remodelling of the peripheral callus. The spatial and temporal distribution of mineral content in the callus tissue also illustrates the ongoing remodelling process observed from histological analysis. Most interestingly, the average indentation modulus, even at 9 weeks, remains as low as 13 GPa, which is roughly 60% of that for cortical sheep bone. The decreased indentation modulus in the callus compared to the cortex is due to the lower average mineral content, and perhaps also to the properties of the organic matrix, which might differ from those of normal bone.
Abstract:
Developing safe and sustainable road systems is a common goal in all countries. Applications to assist with road asset management and crash minimization are sought universally. This paper presents a data mining methodology that uses decision trees to model the crash proneness of road segments from available road and crash attributes. The models quantify the concept of crash proneness and demonstrate that road segments with only a few crashes have more in common with non-crash roads than with roads with higher crash counts. The paper also examines ways of dealing with the highly unbalanced data sets encountered in the study.
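A sketch of the general approach described above, on synthetic data: a decision tree classifies road segments as crash-prone from hypothetical road attributes, with class weighting shown as one common way of handling unbalanced classes (the paper's own treatment of imbalance is not detailed in the abstract).

```python
# Synthetic sketch: classify road segments as crash-prone from invented
# attributes, using class weighting for the unbalanced classes.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 2000
# Hypothetical road attributes: traffic volume (AADT), speed limit, curvature.
X = np.column_stack([rng.lognormal(8, 1, n),
                     rng.choice([60, 80, 100], n),
                     rng.uniform(0, 1, n)])
# Crash-prone segments are rare, giving unbalanced classes.
y = (rng.uniform(0, 1, n) < 0.05 + 0.1 * X[:, 2]).astype(int)

clf = DecisionTreeClassifier(max_depth=4, class_weight="balanced", random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print(f"balanced accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```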
Abstract:
Road crashes cost the world and Australian society a significant proportion of GDP, affecting productivity and causing significant suffering for communities and individuals. This paper presents a case study that generates data mining models which contribute to the understanding of road crashes by allowing examination of the role of skid resistance (F60) and other road attributes in road crashes. Predictive data mining algorithms, primarily regression trees, were used to produce road segment crash count models from the road and traffic attributes of crash scenarios. The rules derived from the regression trees provide evidence of the significance of road attributes in contributing to crashes, with a focus on the evaluation of skid resistance.
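As an illustrative sketch only, the snippet below fits a regression tree relating synthetic segment crash counts to two assumed road attributes, a skid-resistance variable named F60 after the paper and a traffic volume, and prints the derived rules; the data, thresholds and attribute set are invented.

```python
# Synthetic sketch: a regression tree for segment crash counts with a
# skid-resistance variable (F60) and traffic volume as predictors.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(3)
n = 1500
f60 = rng.uniform(0.2, 0.8, n)          # skid resistance
aadt = rng.lognormal(8, 0.8, n)         # traffic volume
# Synthetic crash counts: higher where skid resistance is low and traffic is high.
crashes = rng.poisson(0.5 + 3.0 * (f60 < 0.35) + aadt / 20000.0)

X = np.column_stack([f60, aadt])
reg = DecisionTreeRegressor(max_depth=3, min_samples_leaf=50, random_state=0)
reg.fit(X, crashes)
print(export_text(reg, feature_names=["F60", "AADT"]))
```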