Abstract:
This issue of Precedent is concerned with professional legal ethics. In my view, professional ethics are rules about how you do your job, based on moral principles. By virtue of the nature of the work they do, the reputation of the institution through which they are admitted to practice (the court), and the consequences that can flow if they act inappropriately or incompetently, lawyers are under constant scrutiny in all aspects of their lives. Errors, omissions or misdeeds in both their professional and their personal lives have the potential to damage them, their clients, the profession itself and the court. We ought never to take for granted the trust the public places in us to preserve the integrity of the legal system itself, especially in times when that system may be under threat, either from without or from within.
Abstract:
The Driver Behaviour Questionnaire (DBQ) continues to be the most widely utilised self-report scale globally to assess crash risk and aberrant driving behaviours among motorists. However, the scale also attracts criticism regarding its perceived limited ability to accurately identify those most at risk of crash involvement. This study reports on the utilisation of the DBQ to examine the self-reported driving behaviours (and crash outcomes) of drivers in three separate Australian fleet samples (N = 443, N = 3414, and N = 4792), and whether combining the samples increases the tool's predictive ability. Fleet employees in three organisations completed either online or paper versions of the questionnaire. Factor analytic techniques identified either three- or four-factor solutions (in each of the separate studies), and the combined sample produced the expected factors of: (a) errors, (b) highway-code violations and (c) aggressive driving violations. Highway-code violations (and mean scores) were comparable across the studies. However, across the three samples, multivariate analyses revealed that exposure to the road, rather than DBQ constructs, was the best predictor of crash involvement at work. Furthermore, combining the scores to produce a sample of 8649 drivers did not improve the predictive ability of the tool for identifying crashes (only 0.4% correctly identified) or demerit point loss (0.3%). The paper outlines the major findings of this comparative sample study in regard to utilising self-report measurement tools to identify "at risk" drivers, as well as the application of such data to future research endeavours.
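To make the pipeline above concrete, here is a minimal sketch of this kind of two-step analysis: factor analysis of questionnaire items, then a comparison of DBQ factor scores against road exposure as crash predictors. All data, item counts and variable names below are hypothetical; this illustrates the general approach, not the authors' analysis.

# Illustrative sketch (hypothetical data): factor-analyse DBQ-style items, then
# compare DBQ factors vs. exposure as predictors of crash involvement.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_drivers, n_items = 500, 28                         # hypothetical sample and item count
items = rng.integers(1, 7, size=(n_drivers, n_items)).astype(float)
exposure = rng.gamma(2.0, 10000.0, n_drivers)        # annual km driven (hypothetical)
crashed = (rng.random(n_drivers) < 0.1).astype(int)  # crash involvement (hypothetical)

# Step 1: extract a small number of latent factors (errors, violations, aggression).
scores = FactorAnalysis(n_components=3, random_state=0).fit_transform(items)

# Step 2: compare the predictive ability of DBQ factors against exposure alone.
for name, X in [("DBQ factors", scores), ("exposure", exposure.reshape(-1, 1))]:
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, crashed, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy = {acc:.3f}")

With purely random data both predictors perform at chance; the point is the structure of the comparison, into which real questionnaire responses and crash records would be substituted.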
Abstract:
The travel and tourism industry has come to rely heavily on information and communication technologies to facilitate relations with consumers. Compiling consumer data profiles has become easier, and it is widely thought that consumers place great importance on how firms handle that data. Lack of trust may cause consumers to have privacy concerns and may, in turn, adversely affect their willingness to purchase online. Three specific aspects of privacy that have received attention from researchers are unauthorized use of secondary data, invasion of privacy, and errors. A survey study was undertaken to examine the effects of these factors on both prior purchase of travel products via the Internet and future purchase probability. Surprisingly, no significant relationships were found to indicate that such privacy concerns affect online purchase behavior within the travel industry. Implications for managers are discussed.
Abstract:
Path integration is a process by which navigators derive their current position and orientation by integrating self-motion signals along a locomotion trajectory. It has been suggested that path integration becomes disproportionately erroneous when the trajectory crosses itself. However, there is a possibility that this previous finding was confounded by effects of the length of a traveled path and the number of turns experienced along the path, two factors that are known to affect path integration performance. The present study was designed to investigate whether the crossover of a locomotion trajectory truly increases errors of path integration. In an experiment, blindfolded human navigators were guided along four paths that varied in their lengths and turns, and attempted to walk directly back to the beginning of the paths. Only one of the four paths contained a crossover. Results showed that errors yielded from the path containing the crossover were not always larger than those observed in other paths, and the errors were attributed solely to the effects of longer path lengths or greater degrees of turns. These results demonstrated that path crossover does not always cause significant disruption in path integration processes. Implications of the present findings for models of path integration are discussed.
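As a concrete illustration of the integration process referred to above, the sketch below dead-reckons position from noisy step and turn signals, which is the computation a homing response depends on. The path, noise model and parameter values are invented for illustration only.

# Minimal path-integration (dead-reckoning) sketch with noisy self-motion signals.
import numpy as np

def integrate_path(step_lengths, turn_angles, noise_sd=0.05, seed=0):
    """Integrate noisy step/turn signals; return the estimated final position."""
    rng = np.random.default_rng(seed)
    pos, heading = np.zeros(2), 0.0
    for step, turn in zip(step_lengths, turn_angles):
        heading += turn + rng.normal(0.0, noise_sd)            # noisy turn signal
        noisy_step = step * (1.0 + rng.normal(0.0, noise_sd))  # noisy distance signal
        pos += noisy_step * np.array([np.cos(heading), np.sin(heading)])
    return pos

# Hypothetical outbound path: three 5 m legs with two 90-degree left turns.
estimate = integrate_path([5.0, 5.0, 5.0], [0.0, np.pi / 2, np.pi / 2])
true_end = np.array([0.0, 5.0])  # true endpoint of this path
# A navigator homing along -estimate misses the origin by this distance:
print("homing error (m):", np.linalg.norm(estimate - true_end))

Longer paths and larger accumulated turns inflate this error even without any crossover, which is precisely the confound the study controls for.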
Abstract:
Over the past decades there has been considerable development in the modeling of car-following (CF) behavior as a result of research undertaken by both traffic engineers and traffic psychologists. While traffic engineers seek to understand the behavior of a traffic stream, traffic psychologists seek to describe the human abilities and errors involved in the driving process. This paper provides a comprehensive review of these two research streams. It is necessary to consider human factors in CF modeling for a more realistic representation of CF behavior in complex driving situations (for example, in traffic breakdowns, crash-prone situations, and adverse weather conditions) to improve traffic safety and to better understand widely reported puzzling traffic flow phenomena, such as capacity drop, stop-and-go oscillations, and traffic hysteresis. While there are some excellent reviews of CF models available in the literature, none of these specifically focuses on the human factors in these models. This paper addresses this gap by reviewing the available literature with a specific focus on the latest advances in car-following models from both the engineering and human behavior points of view. In so doing, it analyses the benefits and limitations of various models and highlights future research needs in the area.
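As a representative example of the engineering stream of CF models reviewed here, the widely used Intelligent Driver Model (Treiber, Hennecke & Helbing, 2000) expresses a follower's acceleration in terms of its speed v, the gap s and the speed difference \Delta v. It is quoted as standard background, not as a model proposed in this paper:

\dot{v} = a\left[1 - \left(\frac{v}{v_0}\right)^{\delta} - \left(\frac{s^{*}(v,\Delta v)}{s}\right)^{2}\right], \qquad s^{*}(v,\Delta v) = s_0 + vT + \frac{v\,\Delta v}{2\sqrt{ab}}

where v_0 is the desired speed, T the desired time headway, s_0 the minimum gap, a the maximum acceleration, b the comfortable deceleration and \delta the acceleration exponent. Human-factors extensions typically act on exactly these quantities, for example through perception errors in s and \Delta v or reaction-time delays.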
Abstract:
Saliva contains a number of biochemical components which may be useful for diagnosis/monitoring of metabolic disorders, and as markers of cancer or heart disease. Saliva collection is attractive as a non-invasive sampling method for infants and elderly patients. We present a method suitable for saliva collection from neonates. We have applied this technique to the determination of salivary nucleotide metabolites. Saliva was collected from 10 healthy neonates using washed cotton swabs, and directly from 10 adults. Two methods for saliva extraction from oral swabs were evaluated. The analytes were then separated using high performance liquid chromatography (HPLC) with tandem mass spectrometry (MS/MS). The limits of detection for 14 purine/pyrimidine metabolites were variable, ranging from 0.01 to 1.0 μM. Recovery of hydrophobic purine/pyrimidine metabolites from cotton tips was consistently high using water/acetonitrile extraction (92.7–111%) compared with water extraction alone. The concentrations of these metabolites were significantly higher in neonatal saliva than in adults. Preliminary ranges for nucleotide metabolites in neonatal and adult saliva are reported. Hypoxanthine and xanthine were grossly raised in neonates (49.3 ± 25.4 and 30.9 ± 19.5 μM, respectively) compared to adults (4.3 ± 3.3 and 4.6 ± 4.5 μM); nucleosides were also markedly raised in neonates. This study focuses on three essential details: contamination of oral swabs during manufacturing and how to overcome this; weighing swabs to accurately measure small saliva volumes; and methods for extracting saliva metabolites of interest from cotton swabs. A method is described for determining nucleotide metabolites using HPLC with photo-diode array or MS/MS. The advantages of utilising saliva are highlighted. Nucleotide metabolites were not simply in equilibrium with plasma, but may be actively secreted into saliva, and this process is more active in neonates than in adults.
Abstract:
Flexible graphene-based thin film supercapacitors were made using carbon nanotube (CNT) films as current collectors and graphene films as electrodes. The graphene sheets were produced by simple electrochemical exfoliation, while the graphene films with controlled thickness were prepared by vacuum filtration. The solid-state supercapacitor was made by using two graphene/CNT films on plastic substrates to sandwich a thin layer of gelled electrolyte. We found that a thin graphene film with thickness <1 μm can greatly increase the capacitance. Using only CNT films as electrodes, the device exhibited a capacitance as low as ~0.4 mF cm−2, whereas adding a 360 nm thick graphene film to the CNT electrodes increased this to ~4.3 mF cm−2. We experimentally demonstrated that the conductive CNT film is equivalent to gold as a current collector while providing a stronger binding force to the graphene film. Combining the high capacitance of the thin graphene film and the high conductivity of the CNT film, our devices exhibited high energy density (8–14 Wh kg−1) and power density (250–450 kW kg−1).
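For orientation, the quoted energy and power figures follow from the standard supercapacitor relations

E = \frac{1}{2} C V^{2}, \qquad P_{\max} = \frac{V^{2}}{4 R_{s}}

where C is the cell capacitance, V the operating voltage window and R_s the equivalent series resistance; dividing by the active mass gives the gravimetric densities reported above. The specific voltage window, resistance and mass of these devices are not restated here.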
Abstract:
Purpose: Older adults have increased visual impairment, including refractive blur from presbyopic multifocal spectacle corrections, and are less able to extract visual information from the environment to plan and execute appropriate stepping actions; these factors may collectively contribute to their higher risk of falls. The aim of this study was to examine the effect of refractive blur and target visibility on the stepping accuracy and visuomotor stepping strategies of older adults during a precision stepping task. Methods: Ten healthy, visually normal older adults (mean age 69.4 ± 5.2 years) walked up and down a 20 m indoor corridor, stepping onto selected high- and low-contrast targets while viewing under three visual conditions: best-corrected vision, +2.00 DS blur and +3.00 DS blur; the order of blur conditions was randomised between participants. Stepping accuracy and gaze behaviours were recorded using an eye tracker and a secondary hand-held camera. Results: Older adults made significantly more stepping errors with increasing levels of blur, particularly exhibiting under-stepping (stepping more posteriorly) onto the targets (p<0.05), while visuomotor stepping strategies did not alter significantly. Stepping errors were also significantly greater for the low- compared to the high-contrast targets, and differences in visuomotor stepping strategies were found, including increased duration of gaze and an increased interval between gaze onset and initiation of the leg swing when stepping onto the low-contrast targets. Conclusions: These findings highlight that stepping accuracy is reduced for low-visibility targets and for high levels of refractive blur at levels typically present in multifocal spectacle corrections, despite significant changes in some of the visuomotor stepping strategies. These findings highlight the importance of maximising the contrast of objects in the environment, and may help explain why older adults wearing multifocal spectacle corrections exhibit an increased risk of falling.
Abstract:
The textual turn is a good friend of expert spectating, where it assumes the role of writing-productive apparatus, but no friend at all of expert practices or practitioners (Melrose, 2003). Introduction: The challenge of time-based embodied performance when the artefact is unstable. As a former full-time professional practitioner with an embodied dance practice as performer, choreographer and artistic director for three decades, I somewhat unexpectedly entered the world of academia in 2000 after completing a practice-based PhD, which was described by its examiners as 'pioneering'. Like many artists, my intention was to deepen and extend my practice through formal research into my work and its context (which was intercultural) and to privilege the artist's voice in a research world where it was too often silent. Practice as research, practice-based research and practice-led research were not yet fully named; the field was in its infancy, and my biggest challenge was to find a serviceable methodology which did not betray my intention to keep practice at the centre of the research. Over the last 15 years, practice-led doctoral research, where examinable creative work is placed alongside an accompanying (exegetical) written component, has come a long way. It has been extensively debated, with a range of theories and models proposed (Barrett & Bolt, 2007; Pakes, 2003, 2004; Piccini, 2005; Philips, Stock & Vincs, 2009; Stock, 2009, 2010; Riley & Hunter, 2009; Haseman, 2006; Hecq, 2012). Much of this writing is based around epistemological concerns, where the research methodologies proposed normally incorporate a contextualisation of the creative work in its field of practice and, more importantly, validation and interrogation of the processes of the practice as the central 'data gathering' method. It is now widely accepted, at least in the Australian creative arts context, that knowledge claims in creative practice research arise from the material activities of the practice itself (Carter, 2004). The creative work explicated as the tangible outcome of that practice is sometimes referred to as the 'artefact'. Although the making of the artefact, according to Colbert (2009, p. 7), is influenced by "personal, experiential and iterative processes", mapping them through a research pathway is "difficult to predict [for] the adjustments made to the artefact in the light of emerging knowledge and insights cannot be foreshadowed". Linking the process and the practice outcome most often occurs through the textual intervention of an exegesis which builds, and/or builds on, theoretical concerns arising in and from the work. This linking produces what Barrett (2007) refers to as "situated knowledge… that operates in relation to established knowledge" (p. 145). But what if those material forms or 'artefacts' are not objects or code or digitised forms, but live within the bodies of artist/researchers, where the nature of the practice itself is live, ephemeral and constantly transforming, as in dance and physical performance? Even more unsettling is when the 'artefact' is literally embedded and embodied in the work and in the maker/researcher; when subject and object are merged. To complicate matters, the performing arts are necessarily collaborative, relying not only on technical mastery and creative/interpretive processes, but on social and artistic relationships which collectively make up the 'artefact'.
This chapter explores issues surrounding live dance and physical performance when placed in a research setting, specifically the complexities of being required to translate embodied dance findings into textual form. Exploring how embodied knowledge can be shared in a research context with those who have no experiential knowledge of communicating through and in dance, I draw on theories of "dance enaction" (Warburton, 2011) together with notions of "affective intensities" and "performance mastery" (Melrose, 2003), "intentional activity" (Pakes, 2004) and the place of memory. In seeking ways to capture in another form the knowledge residing in live dance practice, thus making implicit knowledge explicit, I further propose that there is a process of triple translation as the performance (the living 'artefact') is documented in multi-faceted ways to produce something durable which can be re-visited. This translation becomes more complex if the embodied knowledge resides in culturally specific practices, formed by world views and processes quite different from the accepted norms and conventions (even radical ones) of international doctoral research inquiry. But whatever the combination of cultural, virtual and genre-related dance practices being researched, embodiment is central to the process, outcome and findings, and the question remains of how we will use text and what forms that text might take.
Abstract:
Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with the multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock the value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better-informed decision making.
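The scale dependence of uncertainty described above can be illustrated with a toy variance-components computation on a two-level hierarchy. The paper's actual model is Bayesian and semi-parametric; the sketch below uses invented data and simple pooled variances purely to show why conclusions degrade when aggregating across reefs.

# Toy illustration: coral-cover variability at two scales of a spatial hierarchy.
import numpy as np

rng = np.random.default_rng(1)
n_reefs, n_sites = 20, 8
reef_means = rng.normal(0.35, 0.10, n_reefs)   # between-reef variation (hypothetical)
cover = np.clip(reef_means[:, None] + rng.normal(0.0, 0.05, (n_reefs, n_sites)), 0.0, 1.0)

within_reef_var = cover.var(axis=1, ddof=1).mean()   # average within-reef variance
between_reef_var = cover.mean(axis=1).var(ddof=1)    # variance across reef means

print(f"within-reef variance:  {within_reef_var:.4f}")
print(f"between-reef variance: {between_reef_var:.4f}")
# When between-reef variance dominates (as with these invented values), regional
# trajectories are far more uncertain than reef-level ones -- consistent with
# focusing management at the scale of individual reefs.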
Abstract:
Purpose: To provide a comprehensive overview of research examining the impact of astigmatism on clinical and functional measures of vision, the short- and longer-term adaptations to astigmatism that occur in the visual system, and the currently available clinical options for the management of patients with astigmatism. Recent findings: The presence of astigmatism can lead to substantial reductions in visual performance in a variety of clinical vision measures and functional visual tasks. Recent evidence demonstrates that astigmatic blur results in short-term adaptations in the visual system that appear to reduce the perceived impact of astigmatism on vision. In the longer term, uncorrected astigmatism in childhood can also significantly impact visual development, resulting in amblyopia. Astigmatism is also associated with the development of spherical refractive errors. Although the clinical correction of small magnitudes of astigmatism is relatively straightforward, the precise, reliable correction of astigmatism (particularly high astigmatism) can be challenging. A wide variety of refractive corrections is now available for the patient with astigmatism, including spectacle, contact lens and surgical options. Conclusion: Astigmatism is one of the most common refractive errors managed in clinical ophthalmic practice. The significant visual and functional impacts of astigmatism emphasise the importance of its reliable clinical management. With continued improvements in ocular measurement techniques and developments in a range of different refractive correction technologies, the future promises more precise and comprehensive correction options for astigmatic patients.
Abstract:
Purpose: To examine macular retinal thickness and retinal layer thickness with spectral domain optical coherence tomography (OCT) in a population of children with normal ocular health and minimal refractive errors. Methods: High resolution macular OCT scans from 196 children aged 4 to 12 years (mean age 8 ± 2 years) were analysed to determine total retinal thickness and the thickness of six retinal layers across the central 5 mm of the posterior pole. Automated segmentation with manual correction was used to derive retinal thickness values. Results: The mean total retinal thickness in the central 1 mm foveal zone was 255 ± 16 μm, and this increased significantly with age (mean increase of 1.8 μm per year) in childhood (p<0.001). Age-related increases in the thickness of some retinal layers were also observed, with the changes of highest statistical significance found in the outer retinal layers of the central foveal region (p<0.01). Significant topographical variations in the thickness of each of the retinal layers were also observed (p<0.001). Conclusions: Small magnitude, statistically significant increases in total retinal thickness and retinal layer thickness occur from early childhood to adolescence. The most prominent changes appear to occur in the outer retinal layers of the central fovea.
Abstract:
To enhance the efficiency of regression parameter estimation by modeling the correlation structure of correlated binary error terms in quantile regression with repeated measurements, we propose a Gaussian pseudolikelihood approach for estimating correlation parameters and selecting the most appropriate working correlation matrix simultaneously. The induced smoothing method is applied to estimate the covariance of the regression parameter estimates, which can bypass density estimation of the errors. Extensive numerical studies indicate that the proposed method performs well in selecting an accurate correlation structure and improving regression parameter estimation efficiency. The proposed method is further illustrated by analyzing a dental dataset.
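The building block underneath this approach, quantile regression via the check (pinball) loss, is what induced smoothing works around: the loss is non-smooth at zero residual. It can be sketched generically as follows (standard quantile regression on invented data, not the authors' estimator).

# Minimal quantile-regression sketch: minimise the Koenker-Bassett check loss.
import numpy as np
from scipy.optimize import minimize

def check_loss(beta, X, y, tau):
    """rho_tau(u) = u * (tau - 1{u < 0}), summed over residuals."""
    u = y - X @ beta
    return np.sum(u * (tau - (u < 0)))

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
y = 1.0 + 2.0 * X[:, 1] + rng.standard_t(df=3, size=n)  # heavy-tailed errors

tau = 0.5  # median regression
fit = minimize(check_loss, x0=np.zeros(2), args=(X, y, tau), method="Nelder-Mead")
print("estimated coefficients:", fit.x)

Because the loss has a kink at zero residual, derivative-based inference is awkward; replacing it with a smooth surrogate is what lets the covariance of the estimates be computed without estimating the error density, as the abstract notes.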
Abstract:
Background: Hot air ballooning incidents are relatively rare; however, when they do occur they are likely to result in a fatality or serious injury. Human error is commonly attributed as the cause of hot air ballooning incidents; however, error in itself is not an explanation for safety failures. This research aims to identify, and establish the relative importance of, factors contributing towards hot air ballooning incidents. Methods: Twenty-two Australian Ballooning Federation (ABF) incident reports were thematically coded using a bottom-up approach to identify causal factors. Subsequently, 69 balloonists (mean 19.51 years' experience) participated in a survey to identify additional causal factors and to rate (out of seven) the perceived frequency and potential impact on ballooning operations of each of the previously identified causal factors. Perceived associated risk was calculated by multiplying the mean perceived frequency and impact ratings. Results: Incident report coding identified 54 causal factors within nine higher-level areas: Attributes, Crew resource management, Equipment, Errors, Instructors, Organisational, Physical environment, Regulatory body and Violations. Overall, 'weather', 'inexperience' and 'poor/inappropriate decisions' were rated as having the greatest perceived associated risk. Discussion: Although errors were nominated as a prominent cause of hot air ballooning incidents, the physical environment and personal attributes are also particularly important for safe hot air ballooning operations. In identifying a range of causal factors, the areas of weakness surrounding ballooning operations have been defined; it is hoped that targeted safety and training strategies can now be put into place, removing these contributing factors and reducing the chance of pilot error.
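The risk metric above is a simple product of mean ratings. A minimal sketch of how factors would be ranked under it (the ratings below are invented, not the survey's values):

# Rank causal factors by perceived risk = mean frequency x mean impact (ratings out of 7).
factors = {                       # hypothetical mean (frequency, impact) ratings
    "weather": (5.1, 6.2),
    "inexperience": (4.4, 5.8),
    "poor/inappropriate decisions": (4.0, 6.0),
    "equipment": (2.1, 4.5),
}
risk = {name: f * i for name, (f, i) in factors.items()}
for name, score in sorted(risk.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.1f}")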
Abstract:
Accurate process model elicitation continues to be a time consuming task, requiring skill on the part of the interviewer to extract explicit and tacit process information from the interviewee. Many errors occur in this elicitation stage that would be avoided by better activity recall, more consistent specification methods and greater engagement in the elicitation process by interviewees. Metasonic GmbH has developed a process elicitation tool for their process suite. As part of a research engagement with Metasonic, staff from QUT, Australia, have developed a 3D virtual world approach to the same problem, viz. eliciting process models from stakeholders in an intuitive manner. This book chapter tells the story of how QUT staff developed a 3D virtual world tool for process elicitation and took the outcomes of their research project to Metasonic for evaluation, and describes Metasonic's response to the initial proof of concept.