941 results for Biodosimetry errors


Relevance: 10.00%

Abstract:

Expert knowledge is valuable in many modelling endeavours, particularly where data is not extensive or sufficiently robust. In Bayesian statistics, expert opinion may be formulated as informative priors, to provide an honest reflection of the current state of knowledge before updating it with new information. Technology is increasingly being exploited to help support the process of eliciting such information. This paper reviews the benefits that have been gained from utilizing technology in this way. These benefits can be structured within a six-step elicitation design framework proposed recently (Low Choy et al., 2009). We assume that the purpose of elicitation is to formulate a Bayesian statistical prior, either to provide a standalone expert-defined model, or for updating with new data within a Bayesian analysis. We also assume that the model has been pre-specified before selecting the software. In this case, technology has the most to offer in targeting what experts know (E2), eliciting and encoding expert opinions (E4), enhancing accuracy (E5), and providing an effective and efficient protocol (E6). Benefits include:

- providing an environment with familiar nuances (to make the expert comfortable) where experts can explore their knowledge from various perspectives (E2);
- automating tedious or repetitive tasks, thereby minimizing calculation errors, as well as encouraging interaction between elicitors and experts (E5);
- cognitive gains from educating users, enabling instant feedback (E2, E4-E5), and providing alternative methods of communicating assessments and feedback information, since experts think and learn differently; and
- ensuring that a repeatable and transparent protocol is used (E6).
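As a concrete illustration of the elicitation target described above, here is a minimal sketch assuming a hypothetical expert opinion about a proportion (the prior parameters, data counts, and feedback step are our illustrative assumptions, not taken from the paper): the opinion is encoded as an informative Beta prior, fed back to the expert as a credible interval, then updated with new data.

```python
# A minimal sketch (not the paper's software): encoding an expert's
# elicited opinion about a proportion as an informative Beta prior (E4)
# and updating it with data via Bayes' rule.
from scipy import stats

# Hypothetical elicitation: the expert believes the proportion is
# "about 0.3, very unlikely above 0.5"; Beta(6, 14) is one encoding.
prior_a, prior_b = 6.0, 14.0

# Instant feedback to the expert (E5): report the implied 95% interval
# so they can confirm the encoded prior matches their opinion.
lo, hi = stats.beta.interval(0.95, prior_a, prior_b)
print(f"Prior 95% interval: ({lo:.2f}, {hi:.2f})")

# New data: 12 successes in 40 trials (illustrative numbers).
successes, trials = 12, 40

# The Beta prior is conjugate for binomial data, so updating is additive.
post_a = prior_a + successes
post_b = prior_b + (trials - successes)
print(f"Posterior mean: {post_a / (post_a + post_b):.3f}")
```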

Relevance: 10.00%

Abstract:

Ubiquitous access to patient medical records is an important aspect of safe patient care. Unavailability of sufficient medical information at the point-of-care could lead to a fatality. The U.S. Institute of Medicine has reported that between 44,000 and 98,000 people die each year due to medical errors, such as incorrect medication dosages caused by poor legibility in manual records, or delays in consolidating the information needed to discern the proper intervention. In this research we propose employing emergent technologies such as Java SIM Cards (JSC), Smart Phones (SP), Next Generation Networks (NGN), Near Field Communications (NFC), Public Key Infrastructure (PKI), and Biometric Identification to develop a secure framework and related protocols for ubiquitous access to Electronic Health Records (EHR). A partial EHR contained within a JSC can be used at the point-of-care to support quick diagnosis of a patient's problems. The full EHR can be accessed from an Electronic Health Records Centre (EHRC) when time and network availability permit. Moreover, this framework and related protocols enable patients to give their explicit consent, via their Smart Phone, for a doctor to access their personal medical data when the doctor needs to see or update the patient's medical information during an examination. Our proposed solution would also give patients the power to modify the Access Control List (ACL) related to their EHRs and to view their EHRs through their Smart Phone. Currently, very limited research has been done on using JSCs and similar technologies as a portable repository of EHRs, or on the specific security issues that are likely to arise when JSCs are used for ubiquitous access to EHRs. Previous research has been concerned with using Medicare cards, a kind of Smart Card, as a repository of medical information at the patient point-of-care. However, this imposes some limitations on the patient's emergency medical care, including the inability to detect the patient's location, to call and send information to an emergency room automatically, and to interact with the patient in order to get consent. The aim of our framework and related protocols is to overcome these limitations by taking advantage of the SIM card and the technologies mentioned above. Briefly, our framework and related protocols will offer the full benefits of accessing an up-to-date, precise, and comprehensive medical history of a patient, whilst its mobility will provide ubiquitous access to medical and patient information wherever it is needed. The objective of our framework and related protocols is to automate interactions between patients, healthcare providers and insurance organisations, increase patient safety, improve quality of care, and reduce costs.
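To make the consent mechanism concrete, below is a minimal sketch, with hypothetical type and field names rather than the paper's actual protocol, of a patient-controlled Access Control List for a partial EHR: the patient grants or revokes a practitioner's access from their Smart Phone, and updates are permitted only with explicit consent.

```python
# A minimal sketch (hypothetical names, not the paper's protocol) of a
# patient-controlled Access Control List (ACL) for a partial EHR held
# on a Java SIM Card, with explicit consent granted per practitioner.
from dataclasses import dataclass, field

@dataclass
class ACLEntry:
    practitioner_id: str   # verified via PKI certificate in the framework
    can_read: bool = True
    can_update: bool = False

@dataclass
class PartialEHR:
    patient_id: str
    acl: dict = field(default_factory=dict)  # practitioner_id -> ACLEntry

    def grant_consent(self, practitioner_id: str, can_update: bool = False):
        """Patient grants access from their Smart Phone (explicit consent)."""
        self.acl[practitioner_id] = ACLEntry(practitioner_id, True, can_update)

    def revoke_consent(self, practitioner_id: str):
        """Patient modifies the ACL, removing a practitioner's access."""
        self.acl.pop(practitioner_id, None)

    def may_update(self, practitioner_id: str) -> bool:
        entry = self.acl.get(practitioner_id)
        return bool(entry and entry.can_update)

# Usage: patient consents during an examination, doctor updates the record.
ehr = PartialEHR(patient_id="patient-001")
ehr.grant_consent("dr-smith", can_update=True)
assert ehr.may_update("dr-smith")
```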

Relevance: 10.00%

Abstract:

This work presents a new approach to the problem of simultaneous localization and mapping (SLAM), inspired by computational models of the rodent hippocampus. The rodent hippocampus has been extensively studied with respect to navigation tasks and displays many of the properties of a desirable SLAM solution. RatSLAM is an implementation of a hippocampal model that can perform SLAM in real time on a real robot. It uses a competitive attractor network to integrate odometric information with landmark sensing to form a consistent representation of the environment. Experimental results show that RatSLAM can operate with ambiguous landmark information and recover from both minor and major path integration errors.
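RatSLAM's competitive attractor network is too involved to sketch here, but the toy example below illustrates the two ingredients the abstract names: dead-reckoned path integration from odometry, which accumulates error, and a correction step driven by landmark sensing. It is a simplified stand-in, not the RatSLAM algorithm.

```python
import math

# A toy illustration (not RatSLAM's attractor network): dead-reckoning
# pose update from odometry, with a simple blend toward a pose
# re-estimated from a recognised landmark, mimicking how landmark
# sensing can recover from accumulated path-integration errors.
def integrate_odometry(pose, dist, dtheta):
    x, y, theta = pose
    theta += dtheta
    return (x + dist * math.cos(theta), y + dist * math.sin(theta), theta)

def landmark_correction(pose, landmark_pose, weight=0.5):
    """Blend the dead-reckoned pose toward the landmark-derived pose."""
    return tuple((1 - weight) * p + weight * l
                 for p, l in zip(pose, landmark_pose))

pose = (0.0, 0.0, 0.0)
for _ in range(100):                      # drift accumulates over 100 steps
    pose = integrate_odometry(pose, 0.1, 0.01)
pose = landmark_correction(pose, (10.0, 0.5, 1.0))  # known landmark view
print(pose)
```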

Relevance: 10.00%

Abstract:

Most statistical methods use hypothesis testing. Analysis of variance, regression, discrete choice models, contingency tables, and other analysis methods commonly used in transportation research share hypothesis testing as the means of making inferences about the population of interest. Although hypothesis testing has been a cornerstone of empirical research for many years, various aspects of hypothesis tests are commonly misapplied, misinterpreted, and ignored, by novices and expert researchers alike. At first glance, hypothesis testing appears straightforward: develop the null and alternative hypotheses, compute the test statistic to compare to a standard distribution, estimate the probability of rejecting the null hypothesis, and then make claims about the importance of the finding. This, however, is an oversimplification of the process of hypothesis testing. Hypothesis testing as applied in empirical research is examined here. The reader is assumed to have a basic knowledge of the role of hypothesis testing in various statistical methods. Through the use of an example, the mechanics of hypothesis testing are first reviewed. Then, five precautions surrounding the use and interpretation of hypothesis tests are developed; examples of each are provided to demonstrate how errors are made, and solutions are identified so that similar errors can be avoided. Remedies are provided for common errors, and conclusions are drawn on how to use the results of this paper to improve the conduct of empirical research in transportation.
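As a worked illustration of those mechanics (our example, not the paper's): a one-sample t-test with the hypotheses stated up front, a test statistic compared against a reference distribution, and a conclusion that avoids the common error of treating a non-significant result as proof of the null.

```python
from scipy import stats

# Illustrative example (hypothetical data): test whether mean speed on a
# road segment differs from a 60 km/h design assumption.
# H0: mu = 60   vs.   H1: mu != 60
speeds = [62.1, 58.4, 61.7, 63.2, 59.9, 64.0, 60.8, 62.5]

t_stat, p_value = stats.ttest_1samp(speeds, popmean=60.0)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

# Compare p to a significance level chosen *before* looking at the data.
alpha = 0.05
if p_value < alpha:
    print("Reject H0: evidence the mean differs from 60 km/h.")
else:
    print("Fail to reject H0 (which is not the same as proving H0 true).")
```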

Relevance: 10.00%

Abstract:

Greenhouse gas emissions from a well-established, unfertilized tropical grass-legume pasture were monitored over two consecutive years using high-resolution automatic sampling. Nitrous oxide emissions were highest during the summer months and were highly episodic, related more to the size and distribution of rain events than to water-filled pore space (WFPS) alone. Mean annual emissions were significantly higher during 2008 (5.7 ± 1.0 g N2O-N/ha/day) than during 2007 (3.9 ± 0.4 g N2O-N/ha/day), despite nearly 500 mm less rain. Mean CO2 emission (28.2 ± 1.5 kg CO2-C/ha/day) was not significantly different (P < 0.01) between measurement years, emissions being highly dependent on temperature. A negative correlation between CO2 and WFPS at >70% indicated a threshold for soil conditions favouring denitrification. The use of automatic chambers for high-resolution greenhouse gas sampling can greatly reduce emission estimation errors associated with temperature and WFPS changes.

Relevance: 10.00%

Abstract:

There are a number of gel dosimeter calibration methods in contemporary usage. The present study is a detailed Monte Carlo investigation into the accuracy of several calibration techniques. Results show that for most arrangements the dose to gel accurately reflects the dose to water, with the most accurate method involving a large-diameter flask of gel into which multiple small fields of varying dose are directed. The least accurate method was found to be a long test tube in a water phantom, coaxial with the beam. The large-flask method is also the most straightforward and the least likely to introduce errors during setup, though, to its detriment, it requires much more gel than the other methods.

Relevance: 10.00%

Abstract:

Background: Most questionnaires used for physical activity (PA) surveillance have been developed for adults aged ≤65 years. Given the health benefits of PA for older adults and the aging of the population, it is important to include adults aged 65+ years in PA surveillance. However, few studies have examined how well older adults understand PA surveillance questionnaires. This study aimed to document older adults' understanding of questions from the International PA Questionnaire (IPAQ), which is used worldwide for PA surveillance.

Methods: Participants were 41 community-dwelling adults aged 65-89 years. They each completed IPAQ in a face-to-face semi-structured interview, using the "think-aloud" method, in which they expressed their thoughts out loud as they answered IPAQ questions. Interviews were transcribed and coded according to a three-stage model: understanding the intent of the question; performing the primary task (conducting the mental operations required to formulate a response); and response formatting (mapping the response into pre-specified response options).

Results: Most difficulties occurred during the understanding and performing-the-primary-task stages. Errors included recalling PA in an "average" week, not in the previous 7 days; including PA lasting ≤10 minutes/session; reporting the same PA twice or thrice; and including the total time of an activity for which only a part of that time was at the intensity specified in the question. Participants were unclear what activities fitted within a question's scope and used a variety of strategies for determining the frequency and duration of their activities. Participants experienced more difficulties with the moderate-intensity PA and walking questions than with the vigorous-intensity PA questions. The sitting time question, particularly difficult for many participants, required the use of an answer strategy different from that used to answer questions about PA.

Conclusions: These findings indicate a need for caution in administering IPAQ to adults aged ≥65 years. Most errors resulted in over-reporting, although errors resulting in under-reporting were also noted. Given the nature of the errors made by participants, it is possible that similar errors occur when IPAQ is used in younger populations and that the errors identified could be minimized with small modifications to IPAQ.

Relevance: 10.00%

Abstract:

This research discusses some of the issues encountered while developing a set of WGEN parameters for Chile and offers advice for others interested in developing WGEN parameters for arid climates. The WGEN program is a commonly used and valuable research tool; however, it has specific limitations in arid climates that need careful consideration. These limitations are analysed in the context of generating a set of WGEN parameters for Chile. Fourteen to 26 years of precipitation data are used to calculate precipitation parameters for 18 locations in Chile, and 3–8 years of temperature and solar radiation data are analysed to generate parameters for seven of these locations. Results indicate that weather generation parameters in arid regions are sensitive to erroneous or missing precipitation data. Research shows that the WGEN-estimated gamma distribution shape parameter (α) for daily precipitation in arid zones will tend to cluster around discrete values of 0 or 1, masking the high sensitivity of these parameters to additional data. Rather than focus on the length in years when assessing the adequacy of a data record for estimation of precipitation parameters, researchers should focus on the number of wet days in dry months in a data set. Analysis of the WGEN routines for the estimation of temperature and solar radiation parameters indicates that errors can occur when individual 'months' have fewer than two wet days in the data set. Recommendations are provided to improve methods for estimation of WGEN parameters in arid climates.
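The sensitivity described above can be illustrated with a short sketch, using synthetic data rather than the WGEN routines: fitting the gamma shape parameter α to wet-day rainfall amounts shows how unstable the estimate becomes when a 'month' contains only a handful of wet days.

```python
import numpy as np
from scipy import stats

# Illustrative sketch (not WGEN's estimator): fit the gamma shape
# parameter (alpha) to wet-day rainfall amounts. With very few wet
# days, as in arid-zone dry months, the estimate is highly unstable.
rng = np.random.default_rng(0)

def fit_shape(n_wet_days):
    # Hypothetical wet-day amounts (mm), drawn from a gamma with shape 0.8.
    amounts = rng.gamma(shape=0.8, scale=5.0, size=n_wet_days)
    alpha, loc, scale = stats.gamma.fit(amounts, floc=0)  # fix location at 0
    return alpha

for n in (3, 10, 100):
    estimates = [fit_shape(n) for _ in range(200)]
    print(f"{n:3d} wet days: alpha 5th-95th percentile "
          f"{np.percentile(estimates, 5):.2f}-{np.percentile(estimates, 95):.2f}")
```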

Relevance: 10.00%

Abstract:

Safety interventions (e.g., median barriers, photo enforcement) and road features (e.g., median type and width) can influence crash severity, crash frequency, or both. Both dimensions, crash frequency and crash severity, are needed to obtain a full accounting of road safety. Extensive literature and common sense both dictate that crashes are not created equal, with fatalities costing society more than 1,000 times the cost of property damage crashes on average. Despite this glaring disparity, the profession has not unanimously embraced or successfully defended a nonarbitrary severity weighting approach for analyzing safety data and conducting safety analyses. It is argued here that the two dimensions (frequency and severity) can be combined by intelligently and reliably weighting crash frequencies, converting all crashes to property-damage-only crash equivalents (PDOEs) using comprehensive societal unit crash costs. This approach is analogous to calculating axle load equivalents in the prediction of pavement damage: for instance, a 40,000-lb truck causes 4,025 times more stress than does a 4,000-lb car, so simply counting axles is not sufficient. Calculating PDOEs using unit crash costs is the most defensible and nonarbitrary weighting scheme, allows for the simple incorporation of severity and frequency, and leads to crash models that are sensitive to factors that affect crash severity. Moreover, using PDOEs diminishes the errors introduced by underreporting of less severe crashes, an added benefit of the PDOE analysis approach. The method is illustrated with rural road segment data from South Korea (which in practice would develop PDOEs with Korean crash cost data).
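A minimal sketch of the PDOE weighting idea follows; the unit costs and crash counts are hypothetical placeholders, not the Korean cost data the paper would use in practice.

```python
# Illustrative sketch of the PDOE weighting idea: each crash is weighted
# by its comprehensive societal cost relative to a property-damage-only
# (PDO) crash. All figures below are hypothetical placeholders.
unit_costs = {            # hypothetical comprehensive societal costs ($)
    "fatal": 4_000_000,
    "injury": 100_000,
    "pdo": 4_000,
}

crash_counts = {"fatal": 2, "injury": 35, "pdo": 210}  # hypothetical segment

pdoe = sum(
    count * unit_costs[severity] / unit_costs["pdo"]
    for severity, count in crash_counts.items()
)
print(f"PDO equivalents: {pdoe:.0f}")
# The two fatal crashes alone contribute 2 * (4,000,000 / 4,000) = 2,000
# PDOEs, consistent with the point that fatalities cost society more than
# 1,000 times a property-damage crash on average.
```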

Relevance: 10.00%

Abstract:

Most infrastructure project developments are complex in nature, particularly in the planning phase. During this stage, many vague alternatives are tabled, from the strategic to the operational level. Human judgement and decision making are characterised by biases, errors and the use of heuristics. These factors are intangible and hard to measure because they are subjective and qualitative in nature. The problem with human judgement becomes more complex when a group of people is involved: the variety of stakeholders may cause conflict due to differences in personal judgements, and the available alternatives further complicate the decision making process. It is therefore desirable to find ways of enhancing the efficiency of decision making, to avoid misunderstandings and conflict within organisations. Numerous attempts have been made to solve problems in this area by leveraging technologies such as decision support systems. However, most construction project management decision support systems concentrate only on model development and neglect fundamentals of computing such as requirements engineering, data communication, data management and human-centred computing. Thus, decision support systems are complicated and less efficient in supporting the decision making of project team members. It is desirable for decision support systems to be simpler, to provide a better collaborative platform, to allow for efficient data manipulation, and to adequately reflect user needs. In this chapter, a framework for a more desirable decision support system environment is presented. Some key issues related to decision support system implementation are also described.

Relevance: 10.00%

Abstract:

In rural low-voltage networks, distribution lines are usually highly resistive. When many distributed generators are connected to such lines, power sharing among them is difficult with conventional droop control, as real and reactive power are strongly coupled with each other. A high droop gain can alleviate this problem but may drive the system to instability. To overcome this, two droop control methods are proposed for accurate load sharing with a frequency droop controller. The first method assumes no communication among the distributed generators and regulates the output voltage and frequency, ensuring acceptable load sharing; for this purpose, the droop equations are modified with a transformation matrix based on the line R/X ratio. The second proposed method, requiring minimal low-bandwidth communication, modifies the reference frequency of the distributed generators based on the active and reactive power flow in the lines connected to the points of common coupling. The performance of these two proposed controllers is compared, through time-domain simulation of a test system, with that of a controller that relies on an expensive high-bandwidth communication system. The magnitudes of the power-sharing errors of these three droop control schemes are evaluated and tabulated.
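One commonly used form of such a transformation in the droop-control literature rotates (P, Q) by an angle set by the line impedance angle; the paper's exact matrix may differ. A minimal sketch:

```python
import math

# A sketch of one commonly used orthogonal power transformation from the
# droop-control literature (the paper's exact matrix may differ): rotate
# (P, Q) by an angle set by the line R/X ratio so that the transformed
# quantities decouple for frequency and voltage droop.
def transform_powers(p, q, r_line, x_line):
    z = math.hypot(r_line, x_line)
    sin_t, cos_t = x_line / z, r_line / z   # angle from the R/X ratio
    p_t = p * sin_t - q * cos_t             # drives the frequency droop
    q_t = p * cos_t + q * sin_t             # drives the voltage droop
    return p_t, q_t

# Highly resistive rural line (R/X >> 1): real power mostly maps to the
# voltage-droop channel and reactive power to the frequency-droop channel.
print(transform_powers(p=10.0, q=2.0, r_line=0.9, x_line=0.1))
```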

Relevance: 10.00%

Abstract:

Purpose: To investigate the effect of various presbyopic vision corrections on nighttime driving performance on a closed-road driving circuit.

Methods: Participants were 11 presbyopes (mean age, 57.3 ± 5.8 years), with a mean best sphere distance refractive error of R +0.23 ± 1.53 DS and L +0.20 ± 1.50 DS, whose only experience of wearing presbyopic vision correction was reading spectacles. The study involved a repeated-measures design by which a participant's nighttime driving performance was assessed on a closed-road circuit while wearing each of four power-matched vision corrections. These included single-vision distance lenses (SV), progressive-addition spectacle lenses (PAL), monovision contact lenses (MV), and multifocal contact lenses (MTF CL), worn in a randomized order. Measures included low-contrast road hazard detection and avoidance, road sign and near target recognition, lane-keeping, driving time, and legibility distance for street signs. Eye movement data (fixation duration and number of fixations) were also recorded.

Results: Street sign legibility distances were shorter when wearing MV and MTF CL than SV and PAL (P < 0.001), and participants drove more slowly with MTF CL than with PAL (P = 0.048). Wearing SV resulted in more errors (P < 0.001) and in more (P = 0.002) and longer (P < 0.001) fixations when responding to near targets. Fixation duration was also longer when viewing distant signs with MTF CL than with PAL (P = 0.031).

Conclusions: Presbyopic vision corrections worn by naive, unadapted wearers affected nighttime driving. Overall, spectacle corrections (PAL and SV) performed well for distance driving tasks, but SV negatively affected viewing near dashboard targets. MTF CL resulted in the shortest legibility distance for street signs and longer fixation times.

Relevance: 10.00%

Abstract:

Background, aim, and scope: Urban motor vehicle fleets are a major source of particulate matter pollution, especially of ultrafine particles (diameters < 0.1 µm), and exposure to particulate matter has known serious health effects. A considerable body of literature is available on vehicle particle emission factors derived using a wide range of different measurement methods for different particle sizes, conducted in different parts of the world. Choosing the most suitable particle emission factors to use in transport modelling and health impact assessments is therefore a very difficult task. The aim of this study was to derive a comprehensive set of tailpipe particle emission factors for different vehicle and road type combinations, covering the full size range of particles emitted, that are suitable for modelling urban fleet emissions.

Materials and methods: A large body of data available in the international literature on particle emission factors for motor vehicles derived from measurement studies was compiled and subjected to advanced statistical analysis, to determine the most suitable emission factors to use in modelling urban fleet emissions.

Results: This analysis resulted in the development of five statistical models which explained 86%, 93%, 87%, 65% and 47% of the variation in published emission factors for particle number, particle volume, PM1, PM2.5 and PM10, respectively. A sixth model, for total particle mass, was proposed, but no significant explanatory variables were identified in the analysis. From the outputs of these statistical models, the most suitable particle emission factors were selected. This selection was based on examination of the robustness of the statistical model outputs, including consideration of conservative average particle emission factors with the lowest standard errors, narrowest 95% confidence intervals and largest sample sizes, and on the explanatory model variables, which were Vehicle Type (all particle metrics), Instrumentation (particle number and PM2.5), Road Type (PM10), and Size Range Measured and Speed Limit on the Road (particle volume).

Discussion: A multiplicity of factors needs to be considered in determining emission factors that are suitable for modelling motor vehicle emissions, and this study derived a set of average emission factors suitable for quantifying motor vehicle tailpipe particle emissions in developed countries.

Conclusions: The comprehensive set of tailpipe particle emission factors presented in this study for different vehicle and road type combinations enables the full size range of particles generated by fleets to be quantified, including ultrafine particles (measured in terms of particle number). These emission factors have particular application for regions which may lack the funding to undertake measurements, or which have insufficient measurement data upon which to derive emission factors for their region.

Recommendations and perspectives: In urban areas motor vehicles continue to be a major source of particulate matter pollution and of ultrafine particles. To manage this major pollution source, it is critical that methods are available to quantify the full size range of particles emitted, for traffic modelling and health impact assessments.
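As a sketch of the selection criteria described, with hypothetical values rather than the study's data, the following computes the mean, standard error, and 95% confidence interval for a set of published emission factors:

```python
import math
import statistics

# Illustrative sketch (hypothetical values, not the study's data): the
# kind of summary used to compare candidate emission factors - mean,
# standard error, and 95% confidence interval across published studies.
published_factors = [2.1e14, 3.4e14, 1.8e14, 2.9e14, 2.5e14]  # particles/veh/km

n = len(published_factors)
mean = statistics.mean(published_factors)
se = statistics.stdev(published_factors) / math.sqrt(n)

# Normal approximation for brevity; a t-interval is better for small n.
ci_low, ci_high = mean - 1.96 * se, mean + 1.96 * se
print(f"mean = {mean:.2e}, SE = {se:.2e}, 95% CI = ({ci_low:.2e}, {ci_high:.2e})")
```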

Relevance: 10.00%

Abstract:

We describe the introduction, service growth, benefits, and holistic support approach of a centrally supported, university-wide online survey tool for researchers at QUT. The online survey service employs the Key Survey software and has grown into a significant service for QUT researchers since being introduced in 2009. Key benefits of the approach include QUT's ability to handle important issues relating to data, such as security, privacy, integrity, and archiving and disposal. The service also incorporates a workflow process that enhances the institution's ability to ensure survey quality control through controlled approval and pilot testing before any survey is widely released. An important issue is that a tool like this can make it very easy to do very poor research very quickly, while creating lots of data, in the absence of a rigorous methodology designed to reduce errors and collect accurate, comprehensive, timely data. With this in mind, a holistic approach to service provision and support has been taken, which has included the introduction of an integrated system of seminars, tools, and workshops to get researchers thinking about the quality of their research while becoming operational quickly. The system of seminars, workshops, checks, and approvals we have put in place at QUT is designed to ensure better quality outcomes for QUT's research and for the individual researchers concerned.

Relevance: 10.00%

Abstract:

Purpose: The goal of this conceptual paper is to provide tools to help maximise the value delivered by infrastructure projects, by developing methods to increase the adoption of innovative products during construction.

Methods: The role of knowledge flows in determining innovation adoption rates is conceptually examined and a promising new approach is developed. Open innovation system theory is extended by reviewing the role of three frameworks: (1) knowledge intermediaries, (2) absorptive capacity, and (3) governance arrangements.

Originality: We develop a novel open innovation system model to guide further research into the adoption of innovation on infrastructure projects. The open innovation system model currently lacks definition of core concepts, especially with regard to the impact of different degrees and types of openness. The three frameworks address this issue and add substance to the open innovation system model, addressing widespread criticism that it is underdeveloped. The novelty of our model lies in the combination of the three frameworks to explore the system. These frameworks promise new insights into system dynamics and facilitate the development of new methods to optimise the diffusion of innovation.

Practical implications: The framework will help to reveal gaps in knowledge flows that impede the uptake of innovations. In the past, identifying these gaps has been difficult given the lack of nuance in existing theory. The knowledge maps proposed will enable informed policy advice to effectively harness the power of knowledge networks, increase innovation diffusion, and improve the performance of infrastructure projects. The models developed in this paper will be used in planned empirical research into innovation on large-scale infrastructure projects in the Australian built environment.