896 results for Weather Conditions
Abstract:
The Street Computing workshop, held in conjunction with OZCHI 2009, solicits papers discussing new research directions, early research results, works-in-progress and critical surveys of prior research in the areas of ubiquitous computing and interaction design for urban environments. Urban spaces have unique characteristics. Typically, they are densely populated, buzzing with life twenty-four hours a day, seven days a week. These traits afford many opportunities, but they also present many challenges: traffic jams, smog and pollution, stress placed on public services, and more. Computing technology, particularly the kind that can be placed in the hands of citizens, holds much promise for combating some of these challenges. Yet computation is not merely a tool for overcoming challenges; rather, when embedded appropriately in our everyday lives, it becomes a tool of opportunity: for shaping how our cities evolve, for enabling us to interact with our city and its people in new ways, and for uncovering useful but hidden relationships and correlations between elements of the city. The increasing availability of an urban computing infrastructure has led to new and exciting ways for inhabitants to interact with their city. These include interaction with a wide range of services (e.g. public transport, public services), conceptual representations of the city (e.g. local weather and traffic conditions), the availability of a variety of shared and personal displays (e.g. public, ambient, mobile) and the use of different interaction modes (e.g. tangible, gesture-based, token-based). This workshop solicits papers that address the above themes in some way. We encourage researchers to submit work that deals with the challenges and possibilities posed by the availability of urban computing infrastructure such as sensors and middleware for sensor networks.
This includes new and innovative ways of interacting with and within urban environments; user experience design and participatory design approaches for urban environments; social aspects of urban computing; and other related areas.
Abstract:
Purpose: To evaluate the on-road driving performance of persons with homonymous hemianopia or quadrantanopia in comparison to age-matched controls with normal visual fields. Methods: Participants were 22 hemianopes and eight quadrantanopes (mean age 53 years) and 30 persons with normal visual fields (mean age 52 years), all of whom were either current drivers or aiming to resume driving. All participants completed a battery of vision tests (ETDRS visual acuity, Pelli-Robson letter contrast sensitivity, Humphrey visual fields), cognitive tests (Trails A and B, Mini-Mental State Examination, Digit Symbol Substitution) and an on-road driving assessment. Driving performance was assessed in a dual-brake vehicle with safety monitored by a certified driving rehabilitation specialist. Backseat evaluators masked to the clinical characteristics of participants independently rated driving performance along a 22.7-kilometre route involving urban and interstate driving. Results: Seventy-three per cent of the hemianopes, 88 per cent of the quadrantanopes and all of the drivers with normal fields received safe driving ratings. Those hemianopic and quadrantanopic drivers rated as unsafe tended to have problems with maintaining appropriate lane position, steering steadiness and gap judgment compared to controls. Unsafe driving was associated with slower visual processing speed and impairments in contrast sensitivity, visual field sensitivity and executive function. Conclusions: Our findings suggest that some drivers with hemianopia or quadrantanopia are capable of safe driving performance when compared with those of the same age with normal visual fields. This finding has important implications for the assessment of fitness to drive in this population.
Abstract:
The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of climate change on the built environment, the use of building simulation techniques together with forecast weather data is often necessary. Since most building simulation programs require hourly meteorological input data for their thermal comfort and energy evaluations, the provision of suitable weather data becomes critical. In this paper, the methods used to prepare future weather data for the study of the impact of climate change are reviewed. The advantages and disadvantages of each method are discussed, and the inherent relationships between these methods are illustrated. Based on these discussions and an analysis of Australian historic climatic data, an effective framework and procedure to generate future hourly weather data is presented. It is shown that this method is not only able to deal with different levels of available information regarding climate change, but can also retain the key characteristics of "typical" year weather data for a desired period.
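One common way of generating future hourly data from a "typical" year, which the review's family of methods includes, is to shift the monthly mean and stretch the hourly anomalies about it (often called morphing). A minimal sketch follows; the function name and the climate-change figures are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def morph_hourly(series, mean_shift, stretch):
    """Shift-and-stretch adjustment of one month of hourly data.

    series: hourly values for one month from a typical-year weather file.
    mean_shift: assumed change in the monthly mean (illustrative figure).
    stretch: assumed fractional change in monthly variability (illustrative).
    The hour-to-hour pattern of the typical year is retained, which is the
    "key characteristics" property the abstract emphasises.
    """
    baseline = series.mean()
    return series + mean_shift + stretch * (series - baseline)

hours = np.arange(24)
temps = 20 + 5 * np.sin(2 * np.pi * (hours - 9) / 24)  # synthetic diurnal cycle
future = morph_hourly(temps, mean_shift=2.0, stretch=0.1)
# The mean rises by exactly mean_shift; the diurnal swing grows by 10%.
```

The same adjustment would be applied month by month across the year, with the shift and stretch factors taken from whatever level of climate-change information is available.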
Abstract:
Local climate is a critical element in the design of energy-efficient buildings. In this paper, ten years of historical weather data for Australia's eight capital cities were profiled and analysed to characterize the variations of climatic variables in Australia. The method of descriptive statistics was employed, and the pattern of the cumulative distribution and/or the profile of the percentage distribution is presented for each variable. It was found that although weather variables vary with location, there is often a good, nearly linear relation between a weather variable and its cumulative percentage over the middle part of the cumulative curve. By comparing the slopes of these distribution profiles, it may be possible to determine the relative range of change of a particular weather variable for a given city. The implications of these distribution profiles of key weather variables for energy-efficient building design are also discussed.
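The slope comparison described above can be sketched in a few lines: sort the observations to obtain the empirical cumulative curve, then fit a straight line to its middle section. This is a generic illustration of the idea, not the paper's own procedure, and the data are synthetic:

```python
import numpy as np

def middle_slope(values, lo=0.25, hi=0.75):
    """Slope of the cumulative-percentage curve over its middle section.

    Sorting the observations gives the empirical cumulative distribution;
    a near-linear middle section is what the paper reports for most
    weather variables, and its slope indicates how widely the variable
    ranges (a steeper slope implies a narrower range).
    """
    x = np.sort(np.asarray(values, dtype=float))
    p = np.arange(1, x.size + 1) / x.size        # cumulative fraction (0-1)
    mask = (p >= lo) & (p <= hi)                 # middle part of the curve
    slope, _ = np.polyfit(x[mask], p[mask], 1)   # least-squares straight line
    return slope

rng = np.random.default_rng(0)
temps = rng.uniform(10, 30, size=1000)           # synthetic daily temperatures
s = middle_slope(temps)                          # close to 1/20 for this range
```

Comparing `middle_slope` across cities would then rank how variable each city's climate is for the weather element in question.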
Abstract:
Seasonal patterns have been found in a remarkable range of health conditions, including birth defects, respiratory infections and cardiovascular disease. Accurately estimating the size and timing of seasonal peaks in disease incidence is an aid to understanding the causes and possibly to developing interventions. With global warming increasing the intensity of seasonal weather patterns around the world, a review of the methods for estimating seasonal effects on health is timely. This is the first book on statistical methods for seasonal data written for a health audience. It describes methods for a range of outcomes (including continuous, count and binomial data) and demonstrates appropriate techniques for summarising and modelling these data. It has a practical focus and uses interesting examples to motivate and illustrate the methods. The statistical procedures and example data sets are available in an R package called ‘season’. Adrian Barnett is a senior research fellow at Queensland University of Technology, Australia. Annette Dobson is a Professor of Biostatistics at The University of Queensland, Australia. Both are experienced medical statisticians with a commitment to statistical education and have previously collaborated in research in the methodological developments and applications of biostatistics, especially to time series data. Among other projects, they worked together on revising the well-known textbook "An Introduction to Generalized Linear Models," third edition, Chapman Hall/CRC, 2008. In their new book they share their knowledge of statistical methods for examining seasonal patterns in health.
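The book's methods are implemented in the R package 'season'; a minimal Python analogue of the simplest of them, a least-squares cosinor fit estimating the size and timing of a seasonal peak, might look like this (synthetic monthly data, all names illustrative):

```python
import numpy as np

def fit_cosinor(t, y, period=12.0):
    """Least-squares fit of y ~ mean + amplitude*cos(2*pi*t/period - phase).

    A cosinor model is the simplest way to estimate the size (amplitude)
    and timing (phase) of a seasonal peak from regularly spaced data.
    """
    w = 2 * np.pi * t / period
    X = np.column_stack([np.ones_like(t), np.cos(w), np.sin(w)])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    amplitude = np.hypot(b[1], b[2])
    phase = np.arctan2(b[2], b[1])   # radians; phase*period/(2*pi) = peak time
    return b[0], amplitude, phase

months = np.arange(48, dtype=float) % 12
counts = 100 + 20 * np.cos(2 * np.pi * (months - 1) / 12)  # peak in month 1
mean, amp, phase = fit_cosinor(months, counts)
peak_month = phase * 12 / (2 * np.pi)
```

For count or binomial outcomes the book fits the same sinusoidal terms inside a generalized linear model rather than by ordinary least squares.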
Abstract:
Visual localization systems that are practical for autonomous vehicles in outdoor industrial applications must perform reliably in a wide range of conditions. Changing outdoor conditions cause difficulty by drastically altering the information available in the camera images. To confront the problem, we have developed a visual localization system that uses a surveyed three-dimensional (3D)-edge map of permanent structures in the environment. The map has the invariant properties necessary to achieve long-term robust operation. Previous 3D-edge map localization systems usually maintain a single pose hypothesis, making it difficult to initialize without an accurate prior pose estimate and also making them susceptible to misalignment with unmapped edges detected in the camera image. A multihypothesis particle filter is employed here to perform the initialization procedure with significant uncertainty in the vehicle's initial pose. A novel observation function for the particle filter is developed and evaluated against two existing functions. The new function is shown to further improve the abilities of the particle filter to converge given a very coarse estimate of the vehicle's initial pose. An intelligent exposure control algorithm is also developed that improves the quality of the pertinent information in the image. Results gathered over an entire sunny day and also during rainy weather illustrate that the localization system can operate in a wide range of outdoor conditions. The conclusion is that an invariant map, a robust multihypothesis localization algorithm, and an intelligent exposure control algorithm all combine to enable reliable visual localization through challenging outdoor conditions.
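A toy one-dimensional analogue of the multihypothesis particle filter described above — hypotheses spread widely at initialization, an observation function scoring each hypothesis against a mapped landmark (a stand-in for matching image edges to the 3D-edge map), then resampling — can be sketched as follows. All numbers, names and the range-style observation model are illustrative, not the paper's system:

```python
import numpy as np

rng = np.random.default_rng(1)

def pf_step(particles, weights, control, z, landmark, sigma=0.5):
    """One predict-weight-resample cycle of a 1-D particle filter.

    particles: hypothesised vehicle positions. Starting them spread over a
    wide interval mimics initialisation without an accurate prior pose.
    """
    particles = particles + control + rng.normal(0.0, 0.1, particles.size)  # predict
    predicted = np.abs(landmark - particles)            # expected range to landmark
    weights = weights * np.exp(-0.5 * ((z - predicted) / sigma) ** 2)       # weight
    weights /= weights.sum()
    idx = rng.choice(particles.size, size=particles.size, p=weights)        # resample
    return particles[idx], np.full(particles.size, 1.0 / particles.size)

# Coarse initialisation: hypotheses spread over 0-20 m; true pose starts at 5 m.
particles = rng.uniform(0.0, 20.0, 500)
weights = np.full(500, 1.0 / 500)
true_pose, landmark = 5.0, 12.0
for _ in range(10):
    true_pose += 1.0                                    # vehicle drives 1 m/step
    z = abs(landmark - true_pose) + rng.normal(0.0, 0.1)
    particles, weights = pf_step(particles, weights, 1.0, z, landmark)
estimate = particles.mean()
```

Even with the mirror-image ambiguity of a single range measurement, the initially bimodal particle cloud collapses onto the true trajectory after a few steps, which is the convergence-from-a-coarse-prior behaviour the paper's observation functions are evaluated on.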
Abstract:
The increasing use of biodegradable devices in tissue engineering and regenerative medicine means it is essential to study and understand their degradation behaviour. Accelerated degradation systems aim to achieve similar degradation profiles within a shorter period of time, compared with standard conditions. However, these conditions only partially mimic the actual situation, and subsequent analyses and derived mechanisms must be treated with caution and should always be supported by actual long-term degradation data obtained under physiological conditions. Our studies revealed that polycaprolactone (PCL) and PCL-composite scaffolds degrade very differently under these different degradation conditions, whilst still undergoing hydrolysis. Molecular weight and mass loss results differ due to the different degradation pathways followed (a surface degradation pathway under accelerated conditions and a bulk degradation pathway under simulated physiological conditions). Crystallinity studies revealed similar patterns of recrystallization dynamics, and mechanical data indicated that the scaffolds retained their functional stability, in both instances, over the course of degradation. Ultimately, polymer degradation was shown to be chiefly governed by molecular weight, crystallinity, susceptibility to hydrolysis and device architecture considerations, whilst maintaining its thermodynamic equilibrium.
Abstract:
The paper examines whether there was an excess of deaths, and the relative roles of temperature and ozone, during a heatwave on 7–26 February 2004 in Brisbane, Australia, a subtropical city accustomed to warm weather. Data on daily counts of deaths from cardiovascular disease and non-external causes, meteorological conditions, and air pollution in Brisbane from 1 January 2001 to 31 October 2004 were supplied by the Australian Bureau of Statistics, the Australian Bureau of Meteorology, and the Queensland Environmental Protection Agency, respectively. The relationship between temperature and mortality was analysed using a Poisson time series regression model with smoothing splines to control for nonlinear effects of confounding factors. The highest temperature recorded in the 2004 heatwave was 42°C, compared with a highest recorded temperature of 34°C during the same periods of 2001–2003. There was a significant relationship between exposure to heat and excess deaths in the 2004 heatwave (estimated increase in non-external deaths: 75 [95% confidence interval, CI: 11–138]; cardiovascular deaths: 41 [95% CI: −2 to 84]). There was no apparent evidence of substantial short-term mortality displacement. The excess deaths were mainly attributable to temperature, but exposure to ozone also contributed.
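The headline quantity — excess deaths with a confidence interval — can be illustrated with a crude back-of-envelope calculation. The paper itself used Poisson regression with smoothing splines to form the expected counts; the sketch below instead takes the expected counts as given and uses entirely made-up numbers:

```python
import numpy as np

def excess_deaths(observed, expected):
    """Excess deaths over a period, with an approximate 95% CI.

    observed: daily counts during the heatwave; expected: counts predicted
    for the same dates from a reference period. Treating the observed total
    as Poisson (variance = mean) gives a normal-approximation interval.
    """
    excess = observed.sum() - expected.sum()
    se = np.sqrt(observed.sum())
    return excess, (excess - 1.96 * se, excess + 1.96 * se)

obs = np.array([40, 45, 50, 48, 42])    # illustrative heatwave counts
base = np.array([38, 38, 38, 38, 38])   # illustrative expected counts
excess, (lo, hi) = excess_deaths(obs, base)
# excess = 35 with a CI of roughly (5.6, 64.4) for these invented numbers
```

An interval whose lower bound stays above zero, as in the paper's non-external-deaths estimate, is what supports the conclusion of a genuine excess.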
Abstract:
The Inflatable Rescue Boat (IRB) is arguably the most effective rescue tool used by Australian surf lifesavers. The exceptional features of high mobility and rapid response have enabled it to become an icon on Australia's popular beaches. However, the IRB's extensive use within an environment that is as rugged as it is spectacular has led it to become a danger to those who risk their lives to save others. Epidemiological research revealed lower limb injuries to be predominant, particularly to the right leg. The common types of injuries were fractures and dislocations, as well as muscle or ligament strains and tears. The concern expressed by Surf Life Saving Queensland (SLSQ) and Surf Life Saving Australia (SLSA) led to a biomechanical investigation into this unique and relatively unresearched field. The aim of the research was to identify the causes of injury and propose processes that may reduce the incidence and severity of injuries to surf lifesavers during IRB operation. Following a review of related research, a design analysis of the craft was undertaken as an introduction to the craft, its design and uses. The mechanical characteristics of the vessel were then evaluated, and the accelerations applied to the crew in the IRB were established through field tests. The data were then combined and modelled in the 3-D mathematical modelling and simulation package MADYMO. A tool was created to compare various scenarios of boat design and methods of operation to determine possible mechanisms to reduce injuries. The results of this study showed that under simulated wave loading the boats flex around a pivot point determined by the position of the hinge in the floorboard. It was also found that the accelerations experienced by the crew exhibited similar characteristics to road vehicle accidents. Staged simulations indicated the attributes of an optimum foam in terms of thickness and density.
Likewise, modelling of the boat and crew produced simulations that predicted realistic crew response to tested variables. Unfortunately, the observed lack of adherence to the SLSA footstrap Standard has impeded successful epidemiological and modelling outcomes. If uniformity of boat setup can be assured then epidemiological studies will be able to highlight the influence of implementing changes to the boat design. In conclusion, the research provided a tool to successfully link the epidemiology and injury diagnosis to the mechanical engineering design through the use of biomechanics. This was a novel application of the mathematical modelling software MADYMO. Other craft can also be investigated in this manner to provide solutions to the problem identified and therefore reduce risk of injury for the operators.
Abstract:
Speaker verification is the process of verifying the identity of a person by analysing their speech. There are several important applications for automatic speaker verification (ASV) technology, including suspect identification, tracking terrorists and detecting a person's presence at a remote location in the surveillance domain, as well as person authentication for phone banking and credit card transactions in the private sector. Telephones and telephony networks provide a natural medium for these applications. The aim of this work is to improve the usefulness of ASV technology for practical applications in the presence of adverse conditions. In a telephony environment, background noise, handset mismatch, channel distortions, room acoustics and restrictions on the available testing and training data are common sources of errors for ASV systems. Two research themes were pursued to overcome these adverse conditions: modelling mismatch and modelling uncertainty. To address the performance degradation incurred through mismatched conditions, it was proposed to model this mismatch directly. Feature mapping was evaluated for combating handset mismatch and was extended through the use of a blind clustering algorithm to remove the need for accurate handset labels for the training data. Mismatch modelling was then generalised by explicitly modelling the session conditions as a constrained offset of the speaker model means. This session variability modelling approach enabled the modelling of arbitrary sources of mismatch, including handset type, and halved the error rates in many cases. Methods to model the uncertainty in speaker model estimates and verification scores were developed to address the difficulties of limited training and testing data. The Bayes factor was introduced to account for the uncertainty of the speaker model estimates in testing by applying Bayesian theory to the verification criterion, with improved performance in matched conditions.
Modelling the uncertainty in the verification score itself met with significant success. Estimating a confidence interval for the "true" verification score enabled an order of magnitude reduction in the average quantity of speech required to make a confident verification decision based on a threshold. The confidence measures developed in this work may also have significant applications for forensic speaker verification tasks.
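The early-stopping idea — deciding as soon as the confidence interval for the verification score clears the decision threshold — can be sketched as follows. This uses a plain normal-approximation interval on the running mean of synthetic per-frame scores; it illustrates the mechanism, not the thesis's actual confidence estimator:

```python
import numpy as np

def confident_decision(frame_scores, threshold, z=1.96):
    """Accept/reject once the score's confidence interval clears the threshold.

    frame_scores: per-frame verification scores as they arrive. A normal-
    approximation CI on the running mean lets the system stop listening
    as soon as the decision is already confident, which is the mechanism
    behind needing far less speech on average.
    """
    for n in range(2, frame_scores.size + 1):
        seen = frame_scores[:n]
        half = z * seen.std(ddof=1) / np.sqrt(n)   # CI half-width shrinks as 1/sqrt(n)
        if seen.mean() - half > threshold:
            return "accept", n
        if seen.mean() + half < threshold:
            return "reject", n
    return "undecided", frame_scores.size

rng = np.random.default_rng(2)
scores = rng.normal(1.0, 0.5, 200)   # synthetic genuine-speaker frame scores
decision, frames_used = confident_decision(scores, threshold=0.0)
```

When the true score sits well away from the threshold, the interval clears it after only a handful of frames, so the vast majority of the speech never needs to be scored.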