941 results for Biodosimetry errors


Relevance: 10.00%

Abstract:

Coal Seam Gas (CSG) production is achieved by extracting groundwater to depressurize coal seam aquifers in order to promote methane gas desorption from coal micropores. CSG waters characteristically have high alkalinity, a near-neutral pH (~7), are of the Na-HCO3-Cl type, and exhibit brackish salinity. In 2004, a CSG exploration company carried out a gas flow test in an exploration well located in Maramarua (Waikato Region, New Zealand). This yielded 33 water samples exhibiting noteworthy chemical variations induced by pumping. This research identifies the main causes of hydrochemical variations in CSG water, makes recommendations to manage this effect, and discusses potential environmental implications. Hydrochemical variations were studied using Factor Analysis, supported by hydrochemical modelling and a laboratory experiment. This reveals carbon dioxide (CO2) degassing as the principal source of hydrochemical variability (about 33%). Factor Analysis also shows that major ion variations could reflect changes in hydrochemical composition induced by different pumping regimes. Subsequent chloride, calcium, and TDS variations could be a consequence of analytical errors potentially committed during laboratory determinations. CSG water chemical variations due to degassing during pumping can be minimized with good completion and production techniques; variations due to sample degassing can be controlled by taking precautions during sampling, transit, storage and analysis. In addition, the degassing effect observed in CSG waters can lead to an underestimation of their potential environmental effect. Calcium precipitation due to exposure to normal atmospheric pressure results in a 23% increase in SAR values for Maramarua CSG water samples.
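The sodium adsorption ratio (SAR) referred to above is conventionally computed as SAR = Na / sqrt((Ca + Mg)/2), with all concentrations in meq/L. A minimal sketch with purely hypothetical concentrations illustrates why calcium loss through precipitation inflates SAR even when sodium is unchanged:

```python
from math import sqrt

def sar(na_meq, ca_meq, mg_meq):
    """Sodium adsorption ratio; all concentrations in meq/L."""
    return na_meq / sqrt((ca_meq + mg_meq) / 2.0)

# Hypothetical values: calcium precipitation on degassing lowers Ca,
# which raises SAR even though Na is unchanged.
before = sar(30.0, 2.0, 1.0)   # before atmospheric exposure
after = sar(30.0, 1.0, 1.0)    # after some Ca has precipitated
```

Because Ca appears only in the denominator, any process that removes dissolved calcium (such as carbonate precipitation on degassing) necessarily increases the computed SAR.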

Relevance: 10.00%

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the clocks of different robots can disagree by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also to timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements.
Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond.
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
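The thesis bases its synchronisation protocol on elements of IEEE-1588 PTP. The classic PTP two-way exchange (shown here as a generic sketch, not necessarily the exact scheme BabelFuse implements) recovers both the slave's clock offset and the path delay from four timestamps, assuming a symmetric path:

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Classic IEEE-1588 two-way exchange.
    t1: master sends Sync; t2: slave receives it;
    t3: slave sends Delay_Req; t4: master receives it.
    Assumes a symmetric path delay in each direction."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way path delay
    return offset, delay

# Slave clock 5 us ahead of the master, 20 us symmetric path delay:
off, dly = ptp_offset_delay(0.0, 25e-6, 40e-6, 55e-6)
```

Any asymmetry between the forward and return paths appears directly as an offset error, which is why PTP implementations timestamp packets as close to the physical layer as possible.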

Relevance: 10.00%

Abstract:

The aim of this study was to investigate the effect of court surface (clay vs. hard court) on technical, physiological and perceptual responses to on-court training. Four high-performance junior male players performed two identical training sessions, one on a hard court and one on a clay court. Sessions included both physical conditioning and technical elements as led by the coach. Each session was filmed for later notational analysis of stroke count and error rates. Further, players wore a global positioning satellite device to measure distance covered during each session, whilst heart rate, countermovement jump distance and capillary blood measures of metabolites were measured before, during and following each session. Additionally, coach and athlete ratings of perceived exertion (RPE) were recorded following each session. Total duration and distance covered during each session were comparable (P>0.05; d<0.20). While forehand and backhand stroke volumes did not differ between sessions (P>0.05; d<0.30), large effects for increased unforced and forced errors were present on the hard court (P>0.05; d>0.90). Furthermore, large effects for increased heart rate, blood lactate and RPE values were evident on clay compared to hard courts (P>0.05; d>0.90). Additionally, while player and coach RPE on hard courts were similar, there were large effects for coaches to underrate the RPE of players on clay courts (P>0.05; d>0.90). In conclusion, training on clay courts results in trends for increased heart rate, lactate and RPE values, suggesting sessions on clay tend towards higher physiological and perceptual loads than hard courts. Further, coaches appear effective at rating player RPE on hard courts, but may underrate the perceived exertion of sessions on clay courts.
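The "large effects" reported above are Cohen's d values, where d > 0.9 is conventionally read as large. A minimal sketch of the pooled-standard-deviation form of Cohen's d, using hypothetical heart-rate data (not the study's actual measurements):

```python
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d between two samples, using the pooled
    sample standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled_sd

# Hypothetical session-average heart rates (bpm):
clay = [172, 168, 175, 170]
hard = [158, 162, 156, 160]
d = cohens_d(clay, hard)   # d > 0.9 counts as a "large" effect
```

With only four players per condition, effect sizes like this carry wide uncertainty, which is consistent with the study reporting large d values alongside non-significant P values.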

Relevance: 10.00%

Abstract:

The aim of children's vision screenings is to detect visual problems that are common in this age category through valid and reliable tests. Nevertheless, the cost effectiveness of paediatric vision screenings, the nature of the tests included in the screening batteries and the ideal screening age have been the cause of much debate in Australia and worldwide. Therefore, the purpose of this review is to report on the current practice of children's vision screenings in Australia and other countries, as well as to evaluate the evidence for and against the provision of such screenings. This was undertaken through a detailed investigation of peer-reviewed publications on this topic. The current review demonstrates that there is no agreed vision screening protocol for children in Australia. This appears to be a result of the lack of strong evidence supporting the benefit of such screenings. While amblyopia, strabismus and, to a lesser extent, refractive error are targeted by many screening programs during pre-school and at school entry, there is less agreement regarding the value of screening for other visual conditions, such as binocular vision disorders, ocular health problems and refractive errors that are less likely to reduce distance visual acuity. In addition, in Australia, little agreement exists on the frequency and coverage of screening programs between states and territories, and the screening programs that are offered are ad hoc and poorly documented. Australian children stand to benefit from improved cohesion and communication between jurisdictions and health professionals to enable an equitable provision of validated vision screening services that have the best chance of early detection and intervention for a range of paediatric visual problems.

Relevance: 10.00%

Abstract:

Motorcycle trauma is a serious road safety issue in Queensland and throughout Australia. In 2009, Queensland Transport (later Transport and Main Roads, or TMR) appointed CARRS-Q to provide a three-year program of Road Safety Research Services for Motorcycle Rider Safety. Funding for this research originated from the Motor Accident Insurance Commission. This program of research was undertaken to produce knowledge to assist TMR to improve motorcycle safety by further strengthening the licensing and training system to make learner riders safer by developing a pre-learner package (Deliverable 1, the focus of this report), and by evaluating the Q-Ride CAP program to ensure that it is maximally effective and contributes to the best possible training for new riders (Deliverable 2). Deliverable 3 of the program identified potential new licensing components to reduce the incidence of risky riding and improve higher-order cognitive skills in new riders. While fatality and injury rates for learner car drivers are typically lower than for those with intermediate licences, this pattern is not found for learner motorcycle riders. Learner riders cannot be supervised as effectively as learner car drivers, and errors are more likely to result in injury for learner riders than learner drivers. It is therefore imperative to improve safety for learner riders. Deliverable 1 examines the potential for improving the motorcycle learner and licence scheme by introducing a pre-learner motorcycle licensing and training scheme within Queensland. The tasks undertaken for Deliverable 1 were a literature review, analysis of learner motorcyclist crash and licensing data, and the development of a potential pre-learner motorcycle rider program.

Relevance: 10.00%

Abstract:

This study compared the performance of a local and three robust optimality criteria, in terms of the standard error, for a one-parameter and a two-parameter nonlinear model with uncertainty in the parameter values. The designs were also compared under misspecification of the prior parameter distribution. The impact of different correlations between the parameters on the optimal design was examined in the two-parameter model. The designs and standard errors were solved analytically whenever possible and numerically otherwise.
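For a one-parameter nonlinear model the asymptotic standard error of the estimate is 1/sqrt(I(theta)), where I(theta) is the Fisher information built from the model's parameter sensitivity. The sketch below uses an illustrative one-parameter exponential model eta(x) = exp(-theta*x) (an assumed model, not necessarily the one in the study) to show why a locally optimal design point depends on the assumed parameter value:

```python
import math

def se_theta(x, theta, sigma=1.0, n=1):
    """Asymptotic standard error of the least-squares estimate of
    theta in the illustrative model eta(x) = exp(-theta * x), with
    n replicate observations of error SD sigma at design point x."""
    deriv = -x * math.exp(-theta * x)      # d eta / d theta (sensitivity)
    info = n * deriv ** 2 / sigma ** 2     # Fisher information at x
    return 1.0 / math.sqrt(info)

theta = 2.0
# For this model the locally optimal single design point is x = 1/theta:
se_opt = se_theta(1.0 / theta, theta)
se_off = se_theta(1.5 / theta, theta)   # design based on a wrong prior guess
```

Because the optimal x is 1/theta, a misspecified prior value of theta shifts the design point and inflates the standard error; robust criteria hedge against exactly this by averaging the information over a prior distribution rather than a single guess.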

Relevance: 10.00%

Abstract:

The accuracy and reliability of urban stormwater quality modelling outcomes are important for stormwater management decision making. The commonly adopted approach, where only a limited number of factors are used to predict urban stormwater quality, may not adequately represent the complexity of the quality response to a rainfall event or site-to-site differences, and so may not support efficient treatment design. This paper discusses an investigation into the influence of rainfall and catchment characteristics on urban stormwater quality, in order to identify potential sources of error in current stormwater quality modelling practices. It was found that the influence of rainfall characteristics on pollutant wash-off is step-wise, based on specific thresholds. This means that a modelling approach where the wash-off process is predicted as a continuous function of rainfall intensity and duration is not appropriate. Additionally, beyond conventional catchment characteristics, namely land use and impervious surface fraction, other catchment characteristics such as impervious area layout, urban form and site-specific characteristics have an important influence on both pollutant build-up and wash-off processes. Finally, the use of solids as a surrogate to estimate other pollutant species was found to be inappropriate. Individually considering build-up and wash-off processes for each pollutant species should be the preferred option.
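The step-wise (threshold-based) wash-off behaviour described above contrasts with the continuous functions used in conventional models. A minimal sketch of the idea, with entirely illustrative threshold values and fractions (the paper does not report these specific numbers):

```python
def washoff_fraction(intensity_mm_h):
    """Step-wise wash-off fraction keyed to rainfall-intensity
    thresholds, as opposed to a continuous function of intensity.
    Threshold values and fractions are purely illustrative."""
    if intensity_mm_h < 10:
        return 0.1   # low-intensity band mobilises little pollutant
    elif intensity_mm_h < 40:
        return 0.5   # intermediate band
    else:
        return 0.9   # high-intensity band approaches full wash-off

# Within a band, the predicted fraction does not change with intensity:
assert washoff_fraction(15) == washoff_fraction(35)
```

The practical consequence is that refining rainfall intensity within a band buys no extra predictive accuracy; what matters is identifying which threshold band an event falls into.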

Relevance: 10.00%

Abstract:

Flood-related scientific and community-based data are rarely systematically collected and analysed in the Philippines. Over the last decades the Pagsangaan River Basin, Leyte, has experienced several flood events. However, documentation describing flood characteristics such as extent, duration or height of these floods is close to non-existent. To address this issue, computerized flood modelling was used to reproduce past events for which data were available for at least partial calibration and validation. The model was also used to provide scenario-based predictions based on A1B climate change assumptions for the area. The most important input for flood modelling is a Digital Elevation Model (DEM) of the river basin. No accurate topographic maps or Light Detection And Ranging (LIDAR)-generated data are available for the Pagsangaan River. Therefore, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Map (GDEM), Version 1, was chosen as the DEM. Although its horizontal spatial resolution of 30 m is desirable, it contains substantial vertical errors. These were identified, different correction methods were tested, and the resulting DEM was used for flood modelling. The above-mentioned data were combined with cross-sections at various strategic locations of the river network, meteorological records, river water level, and current velocity to develop the 1D-2D flood model. SOBEK was used as modelling software to create different rainfall scenarios, including historic flooding events. Due to the lack of scientific data for verifying the model quality, interviews with local stakeholders served to judge the quality of the generated flood maps. According to interviewees, the model reflects reality more accurately than previously available flood maps. The resulting flood maps are now used by the operations centre of a local flood early warning system for warnings and evacuation alerts.
Furthermore, these maps can serve as a basis to identify flood hazard areas for spatial land use planning purposes.
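One simple family of DEM vertical-error corrections removes a systematic bias estimated from reference elevations (for example, surveyed spot heights). The sketch below shows a mean-bias correction only; the study tested several correction methods, and this is an assumed, deliberately minimal example:

```python
def mean_bias_correct(dem, reference_points):
    """Remove the mean vertical bias from a DEM (list of rows of
    elevations in metres), using reference elevations at known cells.
    reference_points: iterable of (row, col, reference_elevation)."""
    residuals = [dem[r][c] - z_ref for (r, c, z_ref) in reference_points]
    bias = sum(residuals) / len(residuals)      # mean vertical offset
    return [[z - bias for z in row] for row in dem]

# Toy 2x2 DEM with a consistent +2 m bias at the two reference cells:
dem = [[12.0, 14.0], [13.0, 15.0]]
refs = [(0, 0, 10.0), (1, 1, 13.0)]
corrected = mean_bias_correct(dem, refs)
```

A constant-bias correction handles systematic offsets (common in ASTER GDEM v1) but not spatially varying artefacts, which require more elaborate methods such as surface fitting or filtering.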

Relevance: 10.00%

Abstract:

Deploying networked control systems (NCSs) over wireless networks is becoming more and more popular. However, the widely used transport layer protocols, Transmission Control Protocol (TCP) and User Datagram Protocol (UDP), are not designed for real-time applications. They may therefore be unsuitable for many NCS application scenarios because of their limitations in reliability and/or delay performance, both of which are critical for real-time control systems. Considering a typical type of NCS with periodic and sporadic real-time traffic, this paper proposes a highly reliable transport layer protocol featuring a packet loss-sensitive retransmission mechanism and a prioritized transmission mechanism. The packet loss-sensitive retransmission mechanism is designed to improve the reliability of all traffic flows, while the prioritized transmission mechanism offers differentiated services for periodic and sporadic flows. Simulation results show that the proposed protocol has better reliability than UDP and better delay performance than TCP over wireless networks, particularly when channel errors and congestion occur.
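The prioritized transmission mechanism can be pictured as a send queue that orders packets by flow class. The sketch below is an assumed illustration, not the paper's actual protocol: it arbitrarily lets sporadic (event-driven) packets pre-empt periodic samples, with sequence numbers breaking ties in FIFO order:

```python
import heapq

# Lower priority value = transmitted first. The assignment of sporadic
# traffic to the higher priority class is an assumption for illustration.
PRIO = {"sporadic": 0, "periodic": 1}

def enqueue(q, flow_type, seq, payload):
    """Push a packet; heap orders by (priority, sequence number)."""
    heapq.heappush(q, (PRIO[flow_type], seq, payload))

def dequeue(q):
    """Pop the next packet to transmit."""
    return heapq.heappop(q)[2]

q = []
enqueue(q, "periodic", 1, "sample-1")
enqueue(q, "sporadic", 2, "alarm")
enqueue(q, "periodic", 3, "sample-2")
first = dequeue(q)   # the sporadic alarm jumps ahead of queued samples
```

Within a class, the sequence number preserves arrival order, so periodic samples are still delivered in the order they were generated.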

Relevance: 10.00%

Abstract:

Reliable approaches for predicting pollutant build-up are essential for accurate urban stormwater quality modelling. Based on an in-depth investigation of metal build-up on residential road surfaces, this paper presents empirical models for predicting metal loads on these surfaces. The study investigated metals commonly present in the urban environment. The analysis found that the build-up processes for metals primarily originating from anthropogenic (copper and zinc) and geogenic (aluminium, calcium, iron and manganese) sources were different. Chromium and nickel were below detection limits. Lead was primarily associated with geogenic sources, but also exhibited a significant relationship with anthropogenic sources. The empirical prediction models developed were validated using an independent data set and found to have relative prediction errors of 12-50%, which is generally acceptable for complex systems such as urban road surfaces. Also, the predicted values were very close to the observed values and well within the 95% prediction interval.
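The relative prediction error quoted above is the standard validation measure |predicted - observed| / observed, expressed as a percentage. A minimal sketch with hypothetical metal loads (not the study's data):

```python
def relative_errors(observed, predicted):
    """Relative prediction error (%) for each observation pair."""
    return [abs(p - o) / o * 100.0 for o, p in zip(observed, predicted)]

# Hypothetical metal loads (e.g. mg/m^2) on a validation data set:
obs = [2.0, 4.0, 5.0]
pred = [2.4, 3.5, 5.5]
errs = relative_errors(obs, pred)
max_err = max(errs)
```

Reporting the full range (here 10-20%) rather than a single average conveys how prediction quality varies across validation sites, which is how the 12-50% figure above should be read.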

Relevance: 10.00%

Abstract:

As a key department within a healthcare organisation, the operating room is a hazardous environment where the consequences of errors are high, despite the relatively low rates of occurrence. Team performance in surgery is increasingly being considered crucial for a culture of safety. The aim of this study was to describe team communication and the ways it fostered or threatened safety culture in surgery. Ethnography was used, involving a 6-month fieldwork period of observation and 19 interviews with 24 informants from nursing, anaesthesia and surgery. Data were collected during 2009 in the operating rooms of a tertiary care facility in Queensland, Australia. Through analysis of the textual data, three themes that exemplified teamwork culture in surgery were generated: "building shared understandings through open communication"; "managing contextual stressors in a hierarchical environment"; and "intermittent membership influences team performance". In creating a safety culture in a healthcare organisation, a team's optimal performance relies on the open discussion of teamwork and team expectations, and significantly depends on how the organisational culture promotes such discussions.

Relevance: 10.00%

Abstract:

None of the currently used tonometers produces estimated IOP values that are free of errors. Measurement uncertainty arises from the indirect measurement of corneal deformation and the fact that pressure calculations are based on population-averaged parameters of the anterior segment. Reliable IOP values are crucial for understanding and monitoring a number of eye pathologies, e.g. glaucoma. We have combined high-speed swept source OCT with an air-puff chamber. The system provides direct measurement of the deformation of the cornea and the anterior surface of the lens. This paper describes in detail the performance of the air-puff ssOCT instrument. We present different approaches to data presentation and analysis. Changes in deformation amplitude appear to be a good indicator of IOP changes. However, it seems that in order to provide accurate intraocular pressure values, additional information on corneal biomechanics is necessary. We believe that such information could be extracted from the data provided by the air-puff ssOCT.

Relevance: 10.00%

Abstract:

Many older people have difficulties using modern consumer products due to increased product complexity, both in terms of functionality and interface design. Previous research has shown that older people have more difficulty than younger people in using complex devices intuitively. Furthermore, increased life expectancy and a falling birth rate have been catalysts for changes in world demographics over the past two decades. This trend also suggests a proportional increase of older people in the workforce. This realisation has led to research on the effective use of technology by older populations in an effort to engage them more productively and to assist them in leading independent lives. Ironically, not enough attention has been paid to the development of interaction design strategies that would actually enable older users to better exploit new technologies. Previous research suggests that if products are designed to reflect people's prior knowledge, they will appear intuitive to use. Since intuitive interfaces utilise domain-specific prior knowledge of users, they require minimal learning for effective interaction. However, older people are very diverse in their capabilities and domain-specific prior knowledge. In addition, ageing also slows down the process of acquiring new knowledge. Keeping these suggestions and limitations in view, the aim of this study was to investigate possible approaches to developing interfaces that facilitate their intuitive use by older people. In this quest to develop intuitive interfaces for older people, two experiments were conducted that systematically investigated redundancy (the use of both text and icons) in interface design, complexity of interface structure (nested versus flat), and personal user factors such as cognitive abilities, perceived self-efficacy and technology anxiety. All of these factors could interfere with intuitive use.
The results from the first experiment suggest that, contrary to what was hypothesised, older people (65+ years) completed the tasks faster on the text-only interface design than on the redundant interface design. The outcome of the second experiment showed that, as expected, older people took more time on a nested interface. However, they did not make significantly more errors compared with younger age groups. Contrary to what was expected, older age groups also did better under anxious conditions. The findings of this study also suggest that older age groups are more heterogeneous in their capabilities, and that their intuitive use of contemporary technological devices is mediated more by domain-specific technology prior knowledge and by their cognitive abilities than by chronological age. This makes it extremely difficult to develop product interfaces that are entirely intuitive to use. However, keeping the cognitive limitations of older people in view when interfaces are developed, and using simple text-based interfaces with a flat interface structure, would help them intuitively learn and successfully use complex technological products during early encounters with a product. These findings indicate that it might be more pragmatic if interfaces are designed for intuitive learning rather than for intuitive use. Based on this research and the existing literature, a model for adaptable interface design is proposed as a strategy for developing intuitively learnable product interfaces. An adaptable interface can initially use a simple text-only interface to help older users learn and successfully use the new system. Over time, this can be progressively changed to a symbols-based nested interface for more efficient and intuitive use.

Relevance: 10.00%

Abstract:

There has been a low level of interest in peripheral aberrations and corresponding image quality for over 200 years. Most work has been concerned with the second-order aberrations of defocus and astigmatism that can be corrected with conventional lenses. Studies have found high levels of aberration, often amounting to several dioptres, even in eyes with only small central defocus and astigmatism. My investigations have contributed to understanding shape changes in the eye with increases in myopia, changes in eye optics with ageing, and how surgical interventions intended to correct central refractive errors have unintended effects on peripheral optics. My research group has measured peripheral second- and higher-order aberrations over a 42° horizontal × 32° vertical diameter visual field. There is substantial variation in individual aberrations with age and pathology. While the higher-order aberrations in the periphery are usually small compared with second-order aberrations, they can be substantial and change considerably after refractive surgery. The thrust of my research in the next few years is to understand more about the peripheral aberrations of the human eye, to measure visual performance in the periphery and determine whether this can be improved by adaptive optics correction, to use measurements of peripheral aberrations to learn more about the optics of the eye and in particular the gradient index structure of the lens, and to investigate ways of increasing the size of the field of good retinal image quality.

Relevance: 10.00%

Abstract:

The primary objective of this study is to develop a robust queue estimation algorithm for motorway on-ramps. Real-time queue information is a vital input for dynamic queue management on metered on-ramps. Accurate and reliable queue information enables on-ramp queues to be managed adaptively to the actual queue size, and thus minimises the adverse impacts of queue flush while increasing the benefit of ramp metering. The proposed algorithm is developed within the Kalman filter framework. The fundamental conservation model is used to estimate the system state (queue size) from the flow-in and flow-out measurements. These projection results are then updated via the measurement equation, using the time occupancies from mid-link and link-entrance loop detectors. This study also proposes a novel single point correction method, which resets the estimated system state to eliminate the counting errors that accumulate over time. In the performance evaluation, the proposed algorithm demonstrated accurate and reliable performance and consistently outperformed the benchmarked Single Occupancy Kalman filter (SOKF) method. The improvements over SOKF are 62% and 63% on average in terms of estimation accuracy (MAE) and reliability (RMSE), respectively. The benefit of the algorithm's innovative concepts is well justified by the improved estimation performance in congested ramp traffic conditions, where long queues may significantly compromise the benchmark algorithm's performance.
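The predict-update cycle described above (conservation-model prediction, occupancy-based correction) can be sketched as a scalar Kalman filter. The function names, noise variances and numbers below are illustrative assumptions, not values from the study:

```python
def kf_queue_step(q_est, p_est, flow_in, flow_out, z_meas,
                  q_noise=4.0, r_noise=25.0):
    """One scalar Kalman-filter step for on-ramp queue estimation.
    Prediction uses the conservation model (vehicles in minus vehicles
    out over the interval); the update corrects the prediction with a
    queue-size measurement derived from loop-detector occupancy.
    q_noise / r_noise are illustrative process/measurement variances."""
    # Predict: conservation of vehicles over the interval
    q_pred = q_est + flow_in - flow_out
    p_pred = p_est + q_noise
    # Update with the occupancy-based queue measurement
    k_gain = p_pred / (p_pred + r_noise)
    q_new = q_pred + k_gain * (z_meas - q_pred)
    p_new = (1.0 - k_gain) * p_pred
    return q_new, p_new

q, p = 10.0, 9.0
# 6 vehicles joined, 4 were metered out, detectors suggest 13 queued:
q, p = kf_queue_step(q, p, 6, 4, 13.0)
```

Because the prediction step integrates flow counts, any count error compounds over time; the single point correction described above addresses exactly this by periodically resetting the estimated state.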