941 results for Roundoff errors.


Relevance: 10.00%

Abstract:

Physical access control systems play a central role in the protection of critical infrastructures, where both providing timely access and preserving the security of sensitive areas are paramount. In this paper we discuss the shortcomings of existing approaches to the administration of physical access control in complex environments. At the heart of the problem is the current dependency on human administrators to reason about the implications of granting or revoking staff access to an area within these facilities. We demonstrate how utilising Building Information Models (BIMs) and the capabilities they provide, including 3D representation of a facility and path-finding, can reduce intentional or accidental errors made by security administrators.
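The abstract does not give an algorithm, but the core idea of combining BIM-derived spatial data with path-finding to check an access change can be conveyed with a minimal sketch (Python; the room names, graph and permission sets are hypothetical): a breadth-first search over a room adjacency graph reveals whether a proposed permission opens a new route into a sensitive area.

```python
from collections import deque

# Hypothetical sketch: rooms extracted from a BIM model form an adjacency graph.
# Granting a user access to an area may open a path into a sensitive zone;
# a breadth-first search over the rooms the user can traverse exposes that risk.

def reachable_rooms(adjacency, accessible, start):
    """Return all rooms reachable from `start` using only rooms the user may enter."""
    seen, queue = {start}, deque([start])
    while queue:
        room = queue.popleft()
        for neighbour in adjacency.get(room, []):
            if neighbour in accessible and neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

# Toy facility: a corridor connects the lobby, a lab and a server room.
adjacency = {
    "lobby": ["corridor"],
    "corridor": ["lobby", "lab", "server_room"],
    "lab": ["corridor"],
    "server_room": ["corridor"],
}
granted = {"lobby", "corridor", "lab"}      # existing permissions
proposed = granted | {"server_room"}        # administrator's proposed change
sensitive = {"server_room"}

newly_exposed = (reachable_rooms(adjacency, proposed, "lobby")
                 - reachable_rooms(adjacency, granted, "lobby")) & sensitive
if newly_exposed:
    print("Warning: change opens a path to sensitive areas:", newly_exposed)
```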

Relevance: 10.00%

Abstract:

At present, many approaches have been proposed for deformable face alignment, with varying degrees of success. However, the common drawback of nearly all these approaches is inaccurate landmark registration. The registration errors that occur are predominantly heterogeneous (i.e. low error for some frames in a sequence and higher error for others). In this paper we propose an approach for simultaneously aligning an ensemble of deformable face images stemming from the same subject, given noisy heterogeneous landmark estimates. We propose that these initial noisy landmark estimates can be used as an “anchor” in conjunction with known state-of-the-art objectives for unsupervised image ensemble alignment. Impressive alignment performance is obtained using well-known deformable face fitting algorithms as “anchors”.
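The paper's objective is not reproduced in the abstract; the numpy sketch below only conveys the anchoring idea in a simplified, warp-free setting. Per-frame landmark estimates are pulled towards an evolving ensemble consensus while a penalty (weight `lam`, a hypothetical name) keeps them close to the noisy initial "anchor" estimates.

```python
import numpy as np

# Simplified, hypothetical sketch of an anchored ensemble objective: jointly
# refine the landmarks of N frames of the same subject so they agree with a
# common mean shape while staying near the noisy per-frame estimates produced
# by an off-the-shelf face-fitting algorithm. No image warps are modelled here.

def refine_landmarks(anchors, lam=1.0, iters=50):
    """anchors: (N, K, 2) noisy landmark estimates for N frames, K points each."""
    estimates = anchors.copy()
    for _ in range(iters):
        mean_shape = estimates.mean(axis=0)           # current consensus shape
        # Closed-form minimiser of ||x_i - mean||^2 + lam * ||x_i - anchor_i||^2
        estimates = (mean_shape + lam * anchors) / (1.0 + lam)
    return estimates

rng = np.random.default_rng(0)
true_shape = rng.normal(size=(68, 2))
anchors = true_shape + rng.normal(scale=0.1, size=(30, 68, 2))  # heterogeneous noise
refined = refine_landmarks(anchors, lam=0.5)
print("mean error before:", np.abs(anchors - true_shape).mean())
print("mean error after: ", np.abs(refined - true_shape).mean())
```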

Relevance: 10.00%

Abstract:

LexisNexis Questions & Answers - Contract Law provides an understanding of contract law and gives a clear and systematic approach to analysing and answering problem and exam questions. Each chapter commences with a summary of the relevant cases and identification of the key issues. Each question is followed by a suggested answer plan, a sample answer and comments on how the answer might be assessed by an examiner. The author also offers advice on common errors to avoid and practical hints and tips on how to achieve higher marks.

Relevance: 10.00%

Abstract:

Purpose: The prevalence of refractive errors in children has been extensively researched. Comparisons between studies can, however, be compromised by differences in the accommodation control methods and the techniques used for measuring refractive error. The aim of this study was to compare spherical refractive error results obtained at baseline and using two different accommodation control methods (extended optical fogging and cycloplegia) for two measurement techniques (autorefraction and retinoscopy). Methods: Participants comprised twenty-five school children aged between 6 and 13 years (mean age: 9.52 ± 2.06 years). The refractive error of one eye was measured at baseline and again under two different accommodation control conditions: extended optical fogging (+2.00 DS for 20 minutes) and cycloplegia (1% cyclopentolate). Autorefraction and retinoscopy were both used to measure the most plus spherical power for each condition. Results: A significant interaction was demonstrated between measurement technique and accommodation control method (p = 0.036), with significant differences in spherical power evident between accommodation control methods for each of the measurement techniques (p < 0.005). For retinoscopy, refractive errors were significantly more positive for cycloplegia than for optical fogging, which were in turn significantly more positive than baseline, while for autorefraction there were significant differences only between cycloplegia and extended optical fogging and between cycloplegia and baseline. Conclusions: Determination of refractive error under cycloplegia elicits more plus than extended optical fogging as a method of relaxing accommodation. These findings support the use of cycloplegic refraction rather than extended optical fogging as a means of controlling accommodation in population-based refractive error studies in children.
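The abstract reports a technique × accommodation-method interaction (p = 0.036); as a hedged illustration, a two-way repeated-measures analysis of this kind could be run as sketched below. The data generated here are synthetic placeholders, and the study's actual statistical procedure is not stated in the abstract.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Synthetic placeholder data: 25 subjects, 2 techniques, 3 accommodation conditions.
rng = np.random.default_rng(1)
rows = []
for subject in range(25):
    base = rng.normal(0.5, 1.0)                       # subject's baseline sphere (D)
    for technique in ("autorefraction", "retinoscopy"):
        for condition, shift in (("baseline", 0.0), ("fogging", 0.3), ("cycloplegia", 0.6)):
            rows.append({"subject": subject, "technique": technique,
                         "condition": condition,
                         "sphere": base + shift + rng.normal(0, 0.2)})
df = pd.DataFrame(rows)

# Two-way repeated-measures ANOVA; the technique:condition row corresponds to
# the interaction effect reported in the abstract.
result = AnovaRM(df, depvar="sphere", subject="subject",
                 within=["technique", "condition"]).fit()
print(result)
```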

Relevance: 10.00%

Abstract:

For decades there have been two young driver concepts: the ‘young driver problem’, where the driver cohort as a whole represents a key problem for road safety; and the ‘problem young driver’, where a sub-sample of drivers represents the greatest road safety problem. Given the difficulties associated with identifying and then modifying the behaviour of the latter group, broad countermeasures such as graduated driver licensing (GDL) have generally been relied upon to address the young driver problem. GDL evaluations reveal general road safety benefits for young drivers, yet they continue to be overrepresented in fatality and injury statistics. It is therefore timely for researchers to revisit the problem young driver concept and assess its countermeasure implications. Personal characteristics, behaviours and attitudes of 378 Queensland novice drivers aged 17-25 years were explored during their pre-Licence, Learner and Provisional 1 (intermediate) licence periods as part of a larger longitudinal research project. Self-reported risky driving was measured with the Behaviour of Young Novice Drivers Scale (BYNDS), and five subscale scores were used to cluster the drivers into three groups (high risk n = 49, medium risk n = 163, low risk n = 166). High risk ‘problem young’ drivers were characterised by self-reported pre-Licence driving, unsupervised Learner driving, and speeding, driving errors, risky driving exposure, crash involvement and offence detection during the Provisional period. Medium risk drivers were also characterised by more risky road use behaviours than the low risk group. Interestingly, problem young drivers appear to have some insight into their high-risk driving, and they report significantly greater intentions to bend road rules in future driving. The results suggest that, in addition to broad countermeasures such as GDL which target the young driver problem, tailored intervention efforts may need to target problem young drivers. Driving behaviours and crash involvement could be used as pre-intervention screening measures to identify these drivers.
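The abstract does not say which clustering method produced the three risk groups; the sketch below shows one plausible approach (k-means on standardised subscale scores), with random numbers standing in for the 378 drivers' five BYNDS subscale scores.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative only: the study's actual clustering procedure is not stated here.
rng = np.random.default_rng(42)
subscales = rng.normal(size=(378, 5))          # 378 drivers x 5 BYNDS subscales (placeholder)
scaled = StandardScaler().fit_transform(subscales)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)

for cluster in range(3):
    members = labels == cluster
    print(f"cluster {cluster}: n={members.sum()}, "
          f"mean subscale scores={scaled[members].mean(axis=0).round(2)}")
```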

Relevance: 10.00%

Abstract:

Coal Seam Gas (CSG) production is achieved by extracting groundwater to depressurize coal seam aquifers in order to promote methane gas desorption from coal micropores. CSG waters are characteristically alkaline, have a neutral pH (~7), are of the Na-HCO3-Cl type, and exhibit brackish salinity. In 2004, a CSG exploration company carried out a gas flow test in an exploration well located in Maramarua (Waikato Region, New Zealand), which yielded 33 water samples exhibiting noteworthy pumping-induced chemical variations. This research identifies the main causes of hydrochemical variation in CSG water, makes recommendations for managing this effect, and discusses potential environmental implications. Hydrochemical variations were studied using Factor Analysis, supported by hydrochemical modelling and a laboratory experiment. This revealed carbon dioxide (CO2) degassing as the principal source of hydrochemical variability (about 33%). Factor Analysis also showed that major ion variations could reflect changes in hydrochemical composition induced by different pumping regimes, while the chloride, calcium and TDS variations could be a consequence of analytical errors potentially committed during laboratory determinations. CSG water chemical variations due to degassing during pumping can be minimized with good completion and production techniques; variations due to sample degassing can be controlled by taking precautions during sampling, transit, storage and analysis. In addition, the degassing effect observed in CSG waters can lead to an underestimation of their potential environmental effect: calcium precipitation on exposure to normal atmospheric pressure results in a 23% increase in SAR values for the Maramarua CSG water samples.
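SAR, the index whose 23% increase is reported above, is the sodium adsorption ratio. The short sketch below uses the standard formula (cation concentrations converted to milliequivalents per litre) with illustrative values rather than Maramarua data, and shows why removing calcium from solution pushes the value up.

```python
# Sodium adsorption ratio: SAR = Na / sqrt((Ca + Mg) / 2), with all
# concentrations in meq/L. Sample concentrations below are illustrative only.

EQ_WEIGHT = {"Na": 22.99, "Ca": 40.08 / 2, "Mg": 24.31 / 2}   # g per equivalent

def sar(na_mg_l, ca_mg_l, mg_mg_l):
    na = na_mg_l / EQ_WEIGHT["Na"]
    ca = ca_mg_l / EQ_WEIGHT["Ca"]
    mg = mg_mg_l / EQ_WEIGHT["Mg"]
    return na / ((ca + mg) / 2) ** 0.5

# Calcite precipitation after degassing lowers dissolved Ca, which raises SAR:
print(sar(na_mg_l=600, ca_mg_l=10, mg_mg_l=5))   # before Ca loss
print(sar(na_mg_l=600, ca_mg_l=5,  mg_mg_l=5))   # after partial Ca precipitation
```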

Relevance: 10.00%

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic, readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate, wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application of this functionality is a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE 1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results.
Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement: the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
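The abstract names IEEE 1588 PTP as the basis of the synchronisation firmware. For context, the standard two-step PTP exchange estimates a slave's clock offset as shown in this sketch; the timestamps are invented and the specifics of the BabelFuse firmware protocol are not described in the abstract.

```python
# Standard PTP-style offset/delay estimate from a Sync / Delay_Req exchange.
# t1/t4 are master timestamps, t2/t3 are slave timestamps of the same exchange;
# a symmetric network path is assumed.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """Return (slave clock offset, one-way path delay)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# Example exchange (seconds): Sync sent at t1, received at t2;
# Delay_Req sent at t3, received back at the master at t4.
t1, t2, t3, t4 = 100.000000, 100.000150, 100.000400, 100.000450
offset, delay = ptp_offset_and_delay(t1, t2, t3, t4)
print(f"offset = {offset * 1e6:.1f} us, delay = {delay * 1e6:.1f} us")
```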

Relevance: 10.00%

Abstract:

The aim of this study was to investigate the effect of court surface (clay versus hard court) on technical, physiological and perceptual responses to on-court training. Four high-performance junior male players performed two identical training sessions, one on a hard court and one on a clay court. Sessions included both physical conditioning and technical elements as led by the coach. Each session was filmed for later notational analysis of stroke count and error rates. Further, players wore a global positioning system (GPS) device to measure distance covered during each session, whilst heart rate, countermovement jump distance and capillary blood measures of metabolites were measured before, during and following each session. Additionally, coach and athlete ratings of perceived exertion (RPE) were recorded following each session. Total duration and distance covered during each session were comparable (P > 0.05; d < 0.20). While forehand and backhand stroke volumes did not differ between sessions (P > 0.05; d < 0.30), large effects for increased unforced and forced errors were present on the hard court (P > 0.05; d > 0.90). Furthermore, large effects for increased heart rate, blood lactate and RPE values were evident on clay compared to hard courts (P > 0.05; d > 0.90). Additionally, while player and coach RPE on hard courts were similar, there were large effects for coaches to underrate the RPE of players on clay courts (P > 0.05; d > 0.90). In conclusion, training on clay courts results in trends for increased heart rate, lactate and RPE values, suggesting sessions on clay tend towards higher physiological and perceptual loads than sessions on hard courts. Further, coaches appear effective at rating player RPE on hard courts, but may underrate the perceived exertion of sessions on clay courts.
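The effect-size thresholds quoted throughout (d < 0.20, d > 0.90) refer to Cohen's d; a minimal sketch of that calculation follows, using invented heart-rate values rather than the study's data.

```python
import numpy as np

# Cohen's d: standardised mean difference using the pooled standard deviation.
def cohens_d(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    pooled_var = ((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1)) \
                 / (len(a) + len(b) - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

clay_hr = [172, 168, 175, 170]   # bpm, hypothetical session means on clay
hard_hr = [165, 162, 169, 164]   # bpm, hypothetical session means on hard court
print(f"d = {cohens_d(clay_hr, hard_hr):.2f}")
```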

Relevance: 10.00%

Abstract:

The aim of children's vision screenings is to detect visual problems that are common in this age group through valid and reliable tests. Nevertheless, the cost-effectiveness of paediatric vision screenings, the nature of the tests included in the screening batteries and the ideal screening age have been the cause of much debate in Australia and worldwide. Therefore, the purpose of this review is to report on the current practice of children's vision screenings in Australia and other countries, and to evaluate the evidence for and against the provision of such screenings. This was undertaken through a detailed investigation of peer-reviewed publications on this topic. The current review demonstrates that there is no agreed vision screening protocol for children in Australia. This appears to be a result of the lack of strong evidence supporting the benefit of such screenings. While amblyopia, strabismus and, to a lesser extent, refractive error are targeted by many screening programs during pre-school and at school entry, there is less agreement regarding the value of screening for other visual conditions, such as binocular vision disorders, ocular health problems and refractive errors that are less likely to reduce distance visual acuity. In addition, in Australia there is little agreement between states and territories on the frequency and coverage of screening programs, and the programs that are offered are ad hoc and poorly documented. Australian children stand to benefit from improved cohesion and communication between jurisdictions and health professionals to enable an equitable provision of validated vision screening services that have the best chance of early detection and intervention for a range of paediatric visual problems.

Relevance: 10.00%

Abstract:

Motorcycle trauma is a serious road safety issue in Queensland and throughout Australia. In 2009, Queensland Transport (later Transport and Main Roads, or TMR) appointed CARRS-Q to provide a three-year program of Road Safety Research Services for Motorcycle Rider Safety. Funding for this research originated from the Motor Accident Insurance Commission. This program of research was undertaken to produce knowledge to assist TMR to improve motorcycle safety by further strengthening the licensing and training system: by developing a pre-learner package to make learner riders safer (Deliverable 1, which is the focus of this report), and by evaluating the Q-Ride CAP program to ensure that it is maximally effective and contributes to the best possible training for new riders (Deliverable 2). Deliverable 3 of the program identified potential new licensing components that will reduce the incidence of risky riding and improve higher-order cognitive skills in new riders. While fatality and injury rates for learner car drivers are typically lower than for those with intermediate licences, this pattern is not found for learner motorcycle riders. Learner riders cannot be supervised as effectively as learner car drivers, and errors are more likely to result in injury for learner riders than for learner drivers. It is therefore imperative to improve safety for learner riders. Deliverable 1 examines the potential for improving the motorcycle learner and licensing scheme by introducing a pre-learner motorcycle licensing and training scheme within Queensland. The tasks undertaken for Deliverable 1 were a literature review, analysis of learner motorcyclist crash and licensing data, and the development of a potential pre-learner motorcycle rider program.

Relevance: 10.00%

Abstract:

This study compared the performance of a local and three robust optimality criteria in terms of the standard error for a one-parameter and a two-parameter nonlinear model with uncertainty in the parameter values. The designs were also compared under misspecification of the prior parameter distribution. The impact of different correlations between the parameters on the optimal design was examined for the two-parameter model. The designs and standard errors were solved analytically wherever possible and numerically otherwise.
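The abstract does not specify the models. As an illustration only, the sketch below uses a simple one-parameter exponential decay (not the paper's model) and shows how the asymptotic standard error of the estimate, obtained from the Fisher information, grows when the design was built from a misspecified prior guess of the parameter.

```python
import numpy as np

# Illustrative sketch: eta(x; theta) = exp(-theta * x), Gaussian errors with
# standard deviation sigma. For this model the locally optimal single design
# point is x = 1 / theta, so a wrong prior guess of theta inflates the
# standard error at the true parameter value.

def std_error(design_points, theta, sigma=0.1):
    x = np.asarray(design_points, float)
    deriv = -x * np.exp(-theta * x)               # d eta / d theta at each design point
    fisher = np.sum(deriv ** 2) / sigma ** 2      # scalar Fisher information
    return 1.0 / np.sqrt(fisher)

theta_true = 2.0
design_from_good_prior = [1 / 2.0]   # designed assuming theta ~ 2 (correct)
design_from_bad_prior = [1 / 0.5]    # designed assuming theta ~ 0.5 (misspecified)
print(std_error(design_from_good_prior, theta_true))
print(std_error(design_from_bad_prior, theta_true))
```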

Relevance: 10.00%

Abstract:

The accuracy and reliability of urban stormwater quality modelling outcomes are important for stormwater management decision making. The commonly adopted approach, where only a limited number of factors are used to predict urban stormwater quality, may not adequately represent the complexity of the quality response to a rainfall event or site-to-site differences, and so may not support efficient treatment design. This paper discusses an investigation into the influence of rainfall and catchment characteristics on urban stormwater quality, in order to identify potential sources of error in current stormwater quality modelling practices. It was found that the influence of rainfall characteristics on pollutant wash-off is step-wise, based on specific thresholds. This means that a modelling approach in which the wash-off process is predicted as a continuous function of rainfall intensity and duration is not appropriate. Additionally, beyond the conventional catchment characteristics of land use and impervious surface fraction, other characteristics such as impervious area layout, urban form and site-specific features have an important influence on both pollutant build-up and wash-off processes. Finally, the use of solids as a surrogate to estimate other pollutant species was found to be inappropriate; build-up and wash-off should instead be considered individually for each pollutant species.
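As an illustration only (the intensity thresholds and wash-off fractions below are invented, not the study's findings), the contrast between the conventional continuous wash-off assumption and the step-wise, threshold-based behaviour described above can be sketched as follows.

```python
# Placeholder values; only the shape of the two functions matters here.

def washoff_continuous(intensity_mm_h, k=0.02):
    """Conventional assumption: wash-off fraction grows smoothly with intensity."""
    return min(1.0, k * intensity_mm_h)

def washoff_stepwise(intensity_mm_h):
    """Threshold-based behaviour: distinct regimes rather than a smooth curve."""
    if intensity_mm_h < 20:
        return 0.1     # low-intensity regime mobilises only a small fraction
    elif intensity_mm_h < 40:
        return 0.4     # intermediate regime
    return 0.8         # high-intensity regime mobilises most of the load

for i in (5, 15, 25, 35, 45, 60):
    print(i, round(washoff_continuous(i), 2), washoff_stepwise(i))
```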

Relevance: 10.00%

Abstract:

Flood-related scientific and community-based data are rarely systematically collected and analysed in the Philippines. Over the last decades the Pagsangaan River Basin, Leyte, has experienced several flood events; however, documentation describing flood characteristics such as the extent, duration or height of these floods is close to non-existent. To address this issue, computerised flood modelling was used to reproduce past events for which data were available for at least partial calibration and validation. The model was also used to provide scenario-based predictions based on A1B climate change assumptions for the area. The most important input for flood modelling is a Digital Elevation Model (DEM) of the river basin. No accurate topographic maps or Light Detection And Ranging (LIDAR)-generated data are available for the Pagsangaan River. Therefore, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Map (GDEM), Version 1, was chosen as the DEM. Although its horizontal spatial resolution of 30 m is suitable, it contains substantial vertical errors. These were identified, different correction methods were tested, and the resulting DEM was used for flood modelling. The above-mentioned data were combined with cross-sections at various strategic locations of the river network, meteorological records, river water levels and current velocities to develop the 1D-2D flood model. SOBEK was used as the modelling software to create different rainfall scenarios, including historic flooding events. Due to the lack of scientific data for verifying the model quality, interviews with local stakeholders served as the gauge of the quality of the generated flood maps. According to the interviewees, the model reflects reality more accurately than previously available flood maps. The resulting flood maps are now used by the operations centre of a local flood early warning system for warnings and evacuation alerts. Furthermore, these maps can serve as a basis for identifying flood hazard areas for spatial land use planning purposes.

Relevance: 10.00%

Abstract:

Deploying networked control systems (NCSs) over wireless networks is becoming more and more popular. However, the widely used transport layer protocols, the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP), are not designed for real-time applications. They may therefore be unsuitable for many NCS application scenarios because of their limitations in reliability and/or delay performance, both of which are key concerns for real-time control systems. Considering a typical class of NCSs with periodic and sporadic real-time traffic, this paper proposes a highly reliable transport layer protocol featuring a packet loss-sensitive retransmission mechanism and a prioritized transmission mechanism. The packet loss-sensitive retransmission mechanism is designed to improve the reliability of all traffic flows, while the prioritized transmission mechanism offers differentiated services for periodic and sporadic flows. Simulation results show that the proposed protocol has better reliability than UDP and better delay performance than TCP over wireless networks, particularly when channel errors and congestion occur.
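The paper's protocol is not specified in the abstract beyond the two mechanisms named; the sketch below only illustrates the general idea under assumed details: a priority queue that lets one traffic class go first (which class is favoured, and the retry limits, are assumptions here) and a per-packet retry budget that drives loss-sensitive retransmission.

```python
import heapq
import itertools

# Illustrative sketch only: priorities, retry limits and which class is favoured
# are assumptions, not details from the paper.
PRIORITY = {"sporadic": 0, "periodic": 1}   # lower value is sent first (assumed)
_seq = itertools.count()                    # tie-breaker so dicts are never compared

def enqueue(queue, flow_class, payload, retries_left=3):
    heapq.heappush(queue, (PRIORITY[flow_class], next(_seq),
                           {"class": flow_class, "payload": payload,
                            "retries_left": retries_left}))

def transmit(queue, send):
    """Send queued packets in priority order, re-queueing packets that are lost."""
    while queue:
        _, _, pkt = heapq.heappop(queue)
        if not send(pkt) and pkt["retries_left"] > 0:
            enqueue(queue, pkt["class"], pkt["payload"], pkt["retries_left"] - 1)

def lossy_send(pkt, _state={"n": 0}):
    """Toy channel: the first two transmissions are lost."""
    _state["n"] += 1
    delivered = _state["n"] > 2
    print("delivered" if delivered else "lost     ", pkt["payload"])
    return delivered

queue = []
enqueue(queue, "periodic", "sensor sample #42")
enqueue(queue, "sporadic", "alarm: actuator fault")
transmit(queue, lossy_send)   # the sporadic alarm is retried and delivered before the sample
```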

Relevance: 10.00%

Abstract:

Reliable approaches for predicting pollutant build-up are essential for accurate urban stormwater quality modelling. Based on an in-depth investigation of metal build-up on residential road surfaces, this paper presents empirical models for predicting metal loads on these surfaces. The study investigated metals commonly present in the urban environment. The analysis undertaken found that the build-up processes for metals primarily originating from anthropogenic sources (copper and zinc) and geogenic sources (aluminium, calcium, iron and manganese) were different. Chromium and nickel were below detection limits. Lead was primarily associated with geogenic sources, but also exhibited a significant relationship with anthropogenic sources. The empirical prediction models developed were validated using an independent data set and found to have relative prediction errors of 12-50%, which is generally acceptable for complex systems such as urban road surfaces. The predicted values were also very close to the observed values and well within the 95% prediction interval.
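The validation metric quoted above can be sketched as follows; the exact formula used in the paper is not given in the abstract, so this shows one common definition of relative prediction error, with placeholder load values rather than the study's measurements.

```python
import numpy as np

# One common definition: |predicted - observed| / observed, expressed as a percentage.
def relative_prediction_error(observed, predicted):
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    return np.abs(predicted - observed) / observed * 100.0

observed_zn = [1.8, 2.4, 3.1, 4.0]    # hypothetical zinc loads (e.g. mg/m^2)
predicted_zn = [2.0, 2.1, 3.5, 3.6]
errors = relative_prediction_error(observed_zn, predicted_zn)
print(errors.round(1), "% ; mean =", errors.mean().round(1), "%")
```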