Abstract:
The primary objective of this study is to develop a robust queue estimation algorithm for motorway on-ramps. Real-time queue information is the most vital input for dynamic queue management that can treat long queues on metered on-ramps more effectively. The proposed algorithm is developed within the Kalman filter framework. The fundamental conservation model is used to predict the system state (queue size) from the flow-in and flow-out measurements. These predictions are then updated with the measurement equation using time occupancies from mid-link and link-entrance loop detectors. This study also proposes a novel single-point correction method, which resets the estimated system state to eliminate the counting errors that accumulate over time. In the performance evaluation, the proposed algorithm demonstrated accurate and reliable performance and consistently outperformed the benchmark Single Occupancy Kalman Filter (SOKF) method. The improvements over SOKF average 62% and 63% in terms of estimation accuracy (MAE) and reliability (RMSE), respectively. The benefit of the algorithm’s innovative concepts is well justified by the improved estimation performance in congested ramp traffic conditions, where long queues may significantly compromise the benchmark algorithm’s performance.
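The prediction-update cycle described in the abstract can be sketched as follows: the conservation model projects the queue forward from flow counts, and the occupancy measurement corrects the accumulated drift. The noise variances (`q_var`, `r_var`) and the occupancy-to-queue gain (`gain_occ`) are illustrative assumptions, not the study's calibrated values.

```python
def estimate_queue(flow_in, flow_out, occupancy, q_var=4.0, r_var=9.0,
                   gain_occ=50.0):
    """Sketch of a conservation-model Kalman filter for ramp queue size.

    flow_in/flow_out: vehicles counted per interval at the ramp entrance/exit.
    occupancy: mid-link time occupancy (0-1) used as the measurement.
    gain_occ: assumed linear map from occupancy to queue size (vehicles).
    """
    n = 0.0          # state: queue size (vehicles)
    p = 1.0          # state variance
    estimates = []
    for fin, fout, occ in zip(flow_in, flow_out, occupancy):
        # Prediction: conservation of vehicles on the ramp
        n = n + fin - fout
        p = p + q_var
        # Update: occupancy-based measurement of queue size
        z = gain_occ * occ                 # measured queue (vehicles)
        k = p / (p + r_var)                # Kalman gain
        n = n + k * (z - n)
        p = (1.0 - k) * p
        estimates.append(max(n, 0.0))
    return estimates
```

A single-point correction, as proposed in the study, would additionally reset `n` whenever an external observation (e.g. an empty ramp) is available.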
Abstract:
This paper describes the implementation of the first portable, embedded data acquisition unit (BabelFuse) that is able to acquire and timestamp generic sensor data and trigger General Purpose I/O (GPIO) events against a microsecond-accurate wirelessly-distributed ‘global’ clock. A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM) where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment especially if non-deterministic communication hardware (such as IEEE-802.11-based wireless) and inaccurate clock synchronisation protocols are used. The issue of differing timebases makes correlation of data difficult and prevents the units from reliably performing synchronised operations or manoeuvres. By utilising hardware-assisted timestamping, clock synchronisation protocols based on industry standards and firmware designed to minimise indeterminism, an embedded data acquisition unit capable of microsecond-level clock synchronisation is presented.
Abstract:
Process modeling grammars are used to create scripts of a business domain that a process-aware information system is intended to support. A key grammatical construct of such grammars is known as a Gateway. A Gateway construct is used to describe scenarios in which the workflow of a process diverges or converges according to relevant conditions. Gateway constructs have been subjected to much academic discussion about their meaning, role and usefulness, and have been linked to both process-modeling errors and process-model understandability. This paper examines perceptual discriminability effects of Gateway constructs on an individual's abilities to interpret process models. We compare two ways of expressing two convergence and divergence patterns – Parallel Split and Simple Merge – implemented in a process modeling grammar. On the basis of an experiment with 98 students, we provide empirical evidence that Gateway constructs aid the interpretation of process models due to a perceptual discriminability effect, especially when models are complex. We discuss the emerging implications for research and practice, in terms of revisions to grammar specifications, guideline development and design choices in process modeling.
Abstract:
Background: There is a well-developed literature investigating the relationship between various driving behaviours and road crash involvement. However, this research has predominantly been conducted in developed economies dominated by western cultural environments. To date, no research has been published that empirically investigates this relationship within the context of emerging economies such as Oman. Objective: The present study aims to investigate driving behaviour as indexed by the Driving Behaviour Questionnaire (DBQ) among a group of Omani university students and staff. Methods: A convenience non-probability self-selection sampling approach was utilized with Omani university students and staff. Results: A total of 1003 Omani students (n=632) and staff (n=371) participated in the survey. Factor analysis of the DBQ revealed four main factors: errors, speeding violations, lapses and aggressive violations. In the multivariate backward logistic regression analysis, the following factors were identified as significant predictors of being involved in causing at least one crash: driving experience, history of offences and two DBQ components, i.e. errors and aggressive violations. Conclusion: This study indicates that errors and aggressive violations of the traffic regulations, as well as a history of traffic offences, are major risk factors for road traffic crashes among the sample. While previous international research has demonstrated that speeding is a primary cause of crashing, in the current context the results indicate that an array of factors is associated with crashes. Further research using more rigorous methodology is warranted to inform the development of road safety countermeasures in Oman that improve the overall traffic safety culture.
Abstract:
Health care is an information-intensive business. Sharing information in health care processes is a smart use of data, enabling informed decision-making whilst ensuring the privacy and security of patient information. To achieve this, we propose an Information Accountability Framework (IAF) with embedded data encryption techniques that establishes transitions of the technological concept, thus enabling shared responsibility, accessibility, and efficient, cost-effective informed decisions between health care professionals and patients. The IAF results reveal possibilities for efficient informed medical decision-making and the minimisation of medical errors. Achieving this will require significant cultural changes and research synergies to ensure the sustainability, acceptability and durability of the IAF.
Abstract:
For decades there have been two young driver concepts: the ‘young driver problem’, where the driver cohort represents a key problem for road safety; and the ‘problem young driver’, where a sub-sample of drivers represents the greatest road safety problem. Given the difficulties associated with identifying and then modifying the behaviour of the latter group, broad countermeasures such as graduated driver licensing (GDL) have generally been relied upon to address the young driver problem. GDL evaluations reveal general road safety benefits for young drivers, yet they continue to be overrepresented in fatality and injury statistics. Therefore it is timely for researchers to revisit the ‘problem young driver’ concept to assess its potential countermeasure implications. This is particularly relevant within the context of broader countermeasures that have been designed to address the ‘young driver problem’. Personal characteristics, behaviours and attitudes of 378 Queensland novice drivers aged 17-25 years were explored during their pre-Licence, Learner and Provisional 1 (intermediate) licence periods as part of a larger longitudinal project. Self-reported risky driving was measured by the Behaviour of Young Novice Drivers Scale (BYNDS), and five subscale scores were used to cluster the drivers into three groups (high risk n=49, medium risk n=163, low risk n=166). High risk ‘problem young drivers’ were characterised by greater self-reported pre-Licence driving, unsupervised Learner driving, and speeding, driving errors, risky driving exposure, crash involvement, and offence detection during the Provisional period. Medium risk drivers were also characterised by more risky road use than the low risk group. Interestingly, problem young drivers appear to have some insight into their high-risk driving, since they report significantly greater intentions to bend road rules in future driving.
The results suggest that tailored intervention efforts may need to target problem young drivers within the context of broad countermeasures such as GDL which address the young driver problem in general. Experiences such as crash-involvement could be used to identify these drivers as a pre-intervention screening measure.
Abstract:
Iris based identity verification is highly reliable but it can also be subject to attacks. Pupil dilation or constriction stimulated by the application of drugs are examples of sample presentation security attacks which can lead to higher false rejection rates. Suspects on a watch list can potentially circumvent the iris based system using such methods. This paper investigates a new approach using multiple parts of the iris (instances) and multiple iris samples in a sequential decision fusion framework that can yield robust performance. Results are presented and compared with the standard full iris based approach for a number of iris degradations. An advantage of the proposed fusion scheme is that the trade-off between detection errors can be controlled by setting parameters such as the number of instances and the number of samples used in the system. The system can then be operated to match security threat levels. It is shown that for optimal values of these parameters, the fused system also has a lower total error rate.
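A minimal sketch of the sequential decision fusion idea, assuming a running-mean score rule over successive instance/sample comparisons with accept and reject thresholds; the paper's actual fusion rule, thresholds and score normalisation may differ:

```python
import statistics

def sequential_fusion_decision(scores, accept_thresh=0.35, reject_thresh=0.65,
                               max_samples=5):
    """Fuse normalised dissimilarity scores (0 = perfect match) sequentially.

    Scores arrive one per (iris instance, sample) comparison. A decision is
    made as soon as the running mean crosses a threshold; otherwise the system
    keeps requesting scores up to max_samples. Raising max_samples trades
    throughput for lower error rates, matching the tunable security levels
    described in the abstract.
    """
    seen = []
    for s in scores:
        seen.append(s)
        mean = statistics.fmean(seen)
        if mean <= accept_thresh:
            return "accept"
        if mean >= reject_thresh:
            return "reject"
        if len(seen) >= max_samples:
            break
    return "undecided"
```

An "undecided" outcome would typically trigger a re-capture or a fallback to the full-iris comparison.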
Abstract:
Background and purpose: The purpose of the work presented in this paper was to determine whether patient positioning and delivery errors could be detected using electronic portal images of intensity modulated radiotherapy (IMRT). Patients and methods: We carried out a series of controlled experiments delivering an IMRT beam to a humanoid phantom using both the dynamic and multiple static field methods of delivery. The beams were imaged, the images calibrated to remove the IMRT fluence variation, and then compared with calibrated images of the reference beams without any delivery or position errors. The first set of experiments involved translating the position of the phantom both laterally and in a superior/inferior direction by distances of 1, 2, 5 and 10 mm. The phantom was also rotated by 1° and 2°. For the second set of measurements the phantom position was kept fixed and delivery errors were introduced to the beam. The delivery errors took the form of leaf position and segment intensity errors. Results: The method was able to detect shifts in the phantom position of 1 mm, leaf position errors of 2 mm, and dosimetry errors of 10% on a single segment of a 15-segment step-and-shoot IMRT delivery (significantly less than 1% of the total dose). Conclusions: The results of this work have shown that the method of imaging the IMRT beam and calibrating the images to remove the intensity modulations could be a useful tool in verifying both the patient position and the delivery of the beam.
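The calibration step (dividing the measured portal image by the predicted IMRT fluence so that an open-field-equivalent image of the patient anatomy remains) can be sketched as follows; the pixel-wise division and the epsilon guard against empty fluence regions are illustrative assumptions, not the paper's exact calibration procedure:

```python
import numpy as np

def remove_fluence(portal_image, predicted_fluence, eps=1e-6):
    """Divide out the predicted IMRT fluence from a portal image.

    portal_image: 2-D array of measured portal dose/intensity.
    predicted_fluence: 2-D array of the planned intensity modulation
    on the same grid. The quotient approximates an open-field image
    showing patient anatomy, which can then be compared against a
    calibrated reference image to detect set-up errors.
    """
    return portal_image / np.maximum(predicted_fluence, eps)
```

Residual structure in the quotient that is not anatomy (e.g. a displaced leaf edge) would show up as a localised deviation from the reference image, which is how delivery errors become visible.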
Abstract:
Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
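The gamma evaluation used above for comparing TPS and MC dose distributions can be sketched in one dimension as follows; the 3%/3 mm global criteria and the brute-force search over evaluation points are illustrative assumptions (the abstract does not state the exact criteria, and practical implementations interpolate and restrict the search radius):

```python
import numpy as np

def gamma_index_1d(dose_ref, dose_eval, positions, dose_crit=0.03,
                   dist_crit=3.0):
    """Minimal 1-D gamma evaluation (global 3%/3 mm criteria assumed).

    dose_ref, dose_eval: dose profiles sampled on the same grid
    `positions` (mm). Returns the gamma value at each reference point;
    gamma <= 1 is conventionally taken as a pass.
    """
    d_max = dose_ref.max()
    gammas = np.empty_like(dose_ref, dtype=float)
    for i, (x_r, d_r) in enumerate(zip(positions, dose_ref)):
        dd = (dose_eval - d_r) / (dose_crit * d_max)   # dose-difference term
        dx = (positions - x_r) / dist_crit             # distance-to-agreement term
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas
```

Identical distributions yield gamma = 0 everywhere; a spatial shift or dose error raises gamma in the affected region, which is what makes the metric useful for flagging localised disagreement.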
Abstract:
We have applied a new method of calibrating portal images of IMRT beams and used it to measure patient set-up accuracy and delivery errors, such as leaf errors and segment intensity errors, during treatment. A calibration technique was used to remove the intensity modulations from the images, leaving equivalent open field images that show patient anatomy and can be used for verification of the patient position. The images of the treatment beam can also be used to verify the delivery of the beam in terms of multileaf collimator leaf position and dosimetric errors. A series of controlled experiments delivering an IMRT anterior beam to the head and neck of a humanoid phantom were undertaken. A 2 mm translation in the position of the phantom could be detected. With the intentional introduction of delivery errors into the beam, this method allowed us to detect leaf positioning errors of 2 mm and variations in monitor units of 1%. The method was then applied to the case of a patient who received IMRT treatment to the larynx and cervical nodes. The anterior IMRT beam was imaged during four fractions and the images calibrated and investigated for the characteristic signs of patient position error and delivery error that were shown in the control experiments. No significant errors were seen. The method of imaging the IMRT beam and calibrating the images to remove the intensity modulations can be a useful tool in verifying both the patient position and the delivery of the beam.
Abstract:
We introduce a broad lattice manipulation technique for expressive cryptography, and use it to realize functional encryption for access structures from post-quantum hardness assumptions. Specifically, we build an efficient key-policy attribute-based encryption scheme, and prove its security in the selective sense from learning-with-errors intractability in the standard model.
Abstract:
Modernized GPS and GLONASS, together with the new GNSS systems BeiDou and Galileo, offer code and phase ranging signals on three or more carriers. Traditionally, dual-frequency code and/or phase GPS measurements are linearly combined to eliminate the effects of ionospheric delays in various positioning and analysis tasks. This typical treatment has limitations in processing signals at three or more frequencies from more than one system, and can hardly be adapted to cope with the booming variety of receivers tracking a broad range of signals. In this contribution, a generalized positioning model that is independent of the navigation system and unrelated to the number of carriers is proposed, suitable for both single- and multi-site data processing. For the synchronization of different signals, uncalibrated signal delays (USD) are defined more generally to compensate for the signal-specific offsets in code and phase signals, respectively. In addition, the ionospheric delays are included in the parameterization with elaborate consideration. Based on an analysis of the algebraic structures, this generalized positioning model is further refined with a set of proper constraints to regularize the datum deficiency of the observation equation system. With this new model, uncalibrated signal delays (USD) and ionospheric delays are derived for both GPS and BeiDou with a large data set. Numerical results demonstrate that, with a limited number of stations, the uncalibrated code delays (UCD) are determined to a precision of about 0.1 ns for GPS and 0.4 ns for BeiDou signals, while the uncalibrated phase delays (UPD) for L1 and L2 are generated with 37 stations evenly distributed in China for GPS with a consistency of about 0.3 cycles. Further experiments concerning the performance of this novel model in point positioning with mixed frequencies of mixed constellations are analyzed, in which the USD parameters are fixed to our generated values.
The results are evaluated in terms of both positioning accuracy and convergence time.
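The traditional dual-frequency ionosphere-free combination that the abstract contrasts against can be written down directly: because the first-order ionospheric delay scales with 1/f^2, a frequency-weighted difference of the two observables cancels it. This sketch uses the GPS L1/L2 code observables; extending it to three or more frequencies across systems is exactly where the combination approach becomes unwieldy, motivating the generalized model.

```python
# GPS L1/L2 carrier frequencies (Hz)
F1, F2 = 1575.42e6, 1227.60e6

def iono_free(p1, p2, f1=F1, f2=F2):
    """Classical dual-frequency ionosphere-free code combination (metres).

    p1, p2: pseudoranges on frequencies f1, f2. The first-order
    ionospheric delay on frequency f is I * (f_ref/f)^2, so the
    combination (f1^2*P1 - f2^2*P2) / (f1^2 - f2^2) removes it,
    leaving geometry, clocks and frequency-independent biases.
    """
    return (f1 ** 2 * p1 - f2 ** 2 * p2) / (f1 ** 2 - f2 ** 2)
```

Note the combination amplifies measurement noise (the coefficients are roughly +2.55 and -1.55 for GPS L1/L2), one of the practical drawbacks of the traditional approach.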
Abstract:
This research aims to develop a reliable density estimation method for signalised arterials based on cumulative counts from upstream and downstream detectors. In order to overcome the counting errors associated with urban arterials with mid-link sinks and sources, CUmulative plots and Probe Integration for Travel timE estimation (CUPRITE) is employed for density estimation. The method, by utilizing probe vehicle samples, reduces or cancels the counting inconsistencies that arise when vehicle conservation is not satisfied within a section. The method is tested in a controlled environment; the authors demonstrate the effectiveness of CUPRITE for density estimation in a signalised section and discuss issues associated with the method.
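The cumulative-count principle underlying the method can be sketched as follows: the number of vehicles stored on the link at any instant is the gap between the upstream and downstream cumulative plots, and density is that number divided by link length. The scalar `probe_correction` term is a stand-in assumption for CUPRITE's probe-based correction of counting inconsistencies, not its actual algorithm.

```python
def density_from_cumulative(up_counts, down_counts, link_length_km,
                            probe_correction=0.0):
    """Link density (veh/km) from cumulative detector counts.

    up_counts/down_counts: cumulative vehicle counts at the upstream
    and downstream detectors at each time step. Stored vehicles are
    N(t) = U(t) - D(t); without mid-link sinks/sources this is exact,
    otherwise probe data must correct the accumulated drift.
    """
    densities = []
    for u, d in zip(up_counts, down_counts):
        stored = max(u - d + probe_correction, 0.0)
        densities.append(stored / link_length_km)
    return densities
```

Because detector miscounts accumulate in U(t) - D(t) without bound, even small per-cycle errors eventually dominate, which is why the probe-based re-anchoring is central to the method.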
Abstract:
The process of building safer roads and roadsides needs to be managed to minimise risks to both the road using public and roadworkers. However, detailed and accurate data on fatalities and injuries at roadworks across Australia are not available. The lack of reliable safety records and consequent poor understanding of the hazards at roadworks motivated this research to examine the common trends in incidents and to understand workers' perceptions of the causes of incidents at roadworks. To achieve these aims, 66 roadworks personnel were interviewed in Queensland including road construction workers, traffic controllers, engineers, and managers. Qualitative analyses identified several major issues and themes. Vehicles driving into work areas, traffic controllers hit by vehicles, rear end crashes at roadwork approaches, and reversing incidents involving work vehicles and machinery were the most common types of incidents. Roadworkers perceived driver errors, such as violation of speed limits, distracted driving, and ignoring signage and traffic controllers' instructions as the main causes of the incidents.
Abstract:
This thesis investigates the possibility of using an adaptive tutoring system for beginning programming students. The work involved designing, developing and evaluating such a system and showing that it was effective in increasing the students’ test scores. In doing so, Artificial Intelligence techniques were used to analyse PHP programs written by students and to provide feedback based on the specific errors they made. Methods were also included to provide students with the next best exercise to suit their particular level of knowledge.