Abstract:
We compare the consistency of choices in two methods used to elicit risk preferences on an aggregate as well as on an individual level. We asked subjects to choose twice from a list of nine decisions between two lotteries, as introduced by Holt and Laury (2002, 2005), alternating with nine decisions using the budget approach introduced by Andreoni and Harbaugh (2009). We find that while on an aggregate (subject pool) level the results are (roughly) consistent, on an individual (within-subject) level, behavior is far from consistent. Within each method as well as across methods we observe low correlations. This again questions the reliability of experimental risk elicitation measures and the ability to use results from such methods to control for the risk aversion of subjects when explaining effects in other experimental games.
Abstract:
The field of Arts-Health practice and research has grown exponentially in the past 30 years. While researchers are using applied arts as the subject of investigation in research, the evaluation of practice and participant benefits has a limited general focus. In recent years, the field has witnessed a growing concentration on the evaluation of health outcomes, outputs and tangential benefits for participants engaging in Arts-Health practice. The wide range of methodological approaches that applied arts practitioners implement makes the field difficult to define. This article introduces the term Arts-Health intersections as a model of practice and framework to promote consistency in design, implementation and evaluative processes in applied arts programmes promoting health outcomes. The article challenges the current trend to solely evaluate health outcomes in the field, and promotes a concurrent and multidisciplinary methodological approach that can be adopted to promote evaluation, consistency and best practice in the field of Arts-Health intersections. The article provides a theoretical overview of Arts-Health intersections, then takes this theoretical platform and details a model of best practice for developing Arts-Health intersections, presenting this model as a guide.
Abstract:
- The RAH was activated for over 2500 trauma calls in 2009. This figure is over twice the number of calls put out by similar services.
- Many trauma calls (in particular L2 trauma calls) from the existing system do not warrant activation of the trauma team.
- Sometimes trauma calls are activated for non-trauma reasons (e.g. rapid access to radiology, departmental pressures, etc.).
- The excess of trauma calls has several deleterious effects, particularly on time management for the trauma service staff: ward rounds/tertiary survey rounds, education, quality improvement, research.
Abstract:
Introduction: Road safety researchers rely heavily on self-report data to explore the aetiology of crash risk. However, researchers consistently acknowledge a range of limitations associated with this methodological approach (e.g., self-report bias), which has been hypothesised to reduce the predictive efficacy of scales. Although well researched in other areas, one important factor often neglected in road safety studies is the fallibility of human memory. Given that accurate recall is a key assumption in many studies, the validity and consistency of self-report data warrant investigation. The aim of the current study was to examine the consistency of self-report data on crash history and details of the most recent reported crash on two separate occasions.
Materials & Method: A repeated measures design was utilised to examine the self-reported crash involvement history of 214 general motorists over a two-month period.
Results: A number of interesting discrepancies were noted in relation to the number of lifetime crashes reported by the participants and the descriptions of their most recent crash across the two occasions. Of the 214 participants who reported having been involved in a crash, 35 (22.3%) reported a lower number of lifetime crashes at Time 2 than at Time 1. Of the 88 drivers who reported no change in number of lifetime crashes, 10 (11.4%) described a different most recent crash. Additionally, of the 34 reporting an increase in the number of lifetime crashes, 29 (85.3%) described the same crash on both occasions. Assessed as a whole, at least 47.1% of participants made a confirmed mistake at Time 1 or Time 2.
Conclusions: These results raise some doubt in regard to the accuracy of memory recall across time. Given that self-reported crash involvement is the predominant dependent variable used in the majority of road safety research, this issue warrants further investigation. Replication of the study with a larger sample size that includes multiple recall periods would enhance understanding of the significance of this issue for road safety methodology.
Abstract:
We compare the consistency of choices in two methods used to elicit risk preferences on an aggregate as well as on an individual level. We ask subjects to choose twice from a list of nine decisions between two lotteries, as introduced by Holt and Laury (2002, 2005) alternating with nine decisions using the budget approach introduced by Andreoni and Harbaugh (2009). We find that, while on an aggregate (subject pool) level the results are consistent, on an individual (within-subject) level, behaviour is far from consistent. Within each method as well as across methods we observe low (simple and rank) correlations.
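The "simple and rank" correlations mentioned here are conventionally Pearson and Spearman coefficients computed over subjects' paired choices. A minimal sketch of that within-subject consistency check (the scores below are invented solely to exercise the computation, not data from the study):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical scores: each subject's number of "safe" choices in two
# repetitions of a nine-decision lottery list (values invented).
round1 = np.array([4, 6, 5, 7, 3, 5, 6, 4, 8, 5])
round2 = np.array([6, 4, 5, 3, 7, 6, 4, 8, 5, 5])

r, _ = pearsonr(round1, round2)      # simple (Pearson) correlation
rho, _ = spearmanr(round1, round2)   # rank (Spearman) correlation
print(f"Pearson r = {r:.3f}, Spearman rho = {rho:.3f}")
```

A low (or here, even negative) coefficient across repetitions is the kind of result the authors interpret as weak within-subject consistency.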
Abstract:
Until quite recently, most Australian jurisdictions gave statutory force to the principle of imprisonment as a sanction of last resort, reflecting its status as the most punitive sentencing option open to the court. That principle gave primary discretion as to whether incarceration was the most appropriate means of achieving the purpose of a sentence to the sentencing court, which received all of the information relevant to the offence, the offender and any victim(s). The disestablishment of this principle is symptomatic of an increasing erosion of judicial discretion with respect to sentencing, which appears to be resulting in some extremely punitive consequences.
Abstract:
The ultimate goal of profiling is to identify the major behavioral and personality characteristics to narrow the suspect pool. Inferences about offender characteristics can be accomplished deductively, based on the analysis of discrete offender behaviors established within a particular case. They can also be accomplished inductively, involving prediction based on abstract offender averages from group data (these methods and the logic on which they are based is detailed extensively in Chapters 2 and 4). As discussed, these two approaches are by no means equal.
Abstract:
Criminal profiling is an investigative tool used around the world to infer the personality and behavioural characteristics of an offender based on their crime. Case linkage, the process of determining discrete connections between crimes of the same offender, is a practice that falls under the general banner of criminal profiling and has been widely criticized. Two theories, behavioural consistency and the homology assumption, are examined and their impact on profiling in general and case linkage specifically is discussed...
Abstract:
Given that there is increasing recognition of the effect that submillimetre changes in collimator position can have on radiotherapy beam dosimetry, this study aimed to evaluate the potential variability in small field collimation that may exist between otherwise matched linacs. Field sizes and field output factors were measured using radiochromic film and an electron diode, for jaw- and MLC-collimated fields produced by eight dosimetrically matched Varian iX linacs (Varian Medical Systems, Palo Alto, USA). This study used nominal sizes from 0.6×0.6 to 10×10 cm2 for jaw-collimated fields, and from 1×1 to 10×10 cm2 for MLC-collimated fields, delivered from a zero (head up, beam directed vertically downward) gantry angle. Differences between the field sizes measured for the eight linacs exceeded the uncertainty of the film measurements and the repositioning uncertainty of the jaws and MLCs on one linac. The dimensions of fields defined by MLC leaves were more consistent between linacs, while also differing more from their nominal values than fields defined by orthogonal jaws. The field output factors measured for the different linacs generally increased with increasing measured field size for the nominal 0.6×0.6 and 1×1 cm2 fields, and became consistent between linacs for nominal field sizes of 2×2 cm2 and larger. The inclusion in radiotherapy treatment planning system beam data of small field output factors acquired in fields collimated by jaws (rather than the more-reproducible MLCs), associated with either the nominal or the measured field sizes, should be viewed with caution. The size and reproducibility of the fields (especially the small fields) used to acquire treatment planning data should be investigated thoroughly as part of the linac or planning system commissioning process. Further investigation of these issues, using different linac models, collimation systems and beam orientations, is recommended.
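Measured field size from a film profile is commonly taken as the full width at half maximum (FWHM) of the cross-beam dose profile. The abstract does not specify the analysis used, so the following is only a generic sketch of that convention, with linear interpolation at the 50% level:

```python
import numpy as np

def field_size_fwhm(positions_mm, dose):
    """Estimate field size (mm) as the full width at half maximum of a
    measured cross-beam dose profile, interpolating linearly at the
    50%-of-maximum level."""
    positions_mm = np.asarray(positions_mm, dtype=float)
    dose = np.asarray(dose, dtype=float)
    half = dose.max() / 2.0
    idx = np.where(dose >= half)[0]          # samples inside the field
    left, right = idx[0], idx[-1]

    def cross(i0, i1):
        # position where the profile crosses `half` between samples i0, i1
        x0, x1 = positions_mm[i0], positions_mm[i1]
        d0, d1 = dose[i0], dose[i1]
        return x0 + (half - d0) * (x1 - x0) / (d1 - d0)

    x_left = cross(left - 1, left) if left > 0 else positions_mm[0]
    x_right = cross(right, right + 1) if right < len(dose) - 1 else positions_mm[-1]
    return x_right - x_left
```

On a synthetic triangular profile peaking at 1.0 and falling to 0.5 at ±5 mm, this returns a field size of 10 mm, which is how the sub-millimetre inter-linac differences reported above would be resolved in practice.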
Abstract:
Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to "catastrophic fusion" in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for absolute spatial distance threshold parameters as required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
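The test described, accept a candidate point only if fusing it does not degrade the model's log marginal likelihood, can be sketched as follows. This is not the authors' implementation: the kernel choice, the per-point normalisation, and the tolerance `tol` are all assumptions made for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def is_consistent(X, y, x_new, y_new, tol=0.1):
    """Deem (x_new, y_new) consistent with (X, y) if fusing it does not
    lower the per-point GP log marginal likelihood by more than `tol` nats.
    No absolute spatial distance threshold is involved."""
    kernel = RBF(length_scale=0.2) + WhiteKernel(
        noise_level=1e-4, noise_level_bounds=(1e-6, 1e1))

    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(X, y)
    lml_before = gp.log_marginal_likelihood_value_ / len(y)

    # Refit with the candidate point fused in and compare.
    X2, y2 = np.vstack([X, x_new]), np.append(y, y_new)
    gp2 = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp2.fit(X2, y2)
    lml_after = gp2.log_marginal_likelihood_value_ / len(y2)

    return lml_after + tol >= lml_before

# Hypothetical range data: noise-free samples of a smooth surface profile.
X = np.linspace(0.0, 1.0, 25).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()
```

A point lying on the underlying surface leaves the per-point likelihood essentially unchanged and is accepted, while a spatially proximal but inconsistent return (e.g. a wildly different range at the same location) drags the likelihood down and is rejected, which mirrors the relative, threshold-free behaviour the abstract describes.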
Abstract:
We report the results of two studies of aspects of the consistency of truncated nonlinear integral equation based theories of freezing: (i) We show that the self-consistent solutions to these nonlinear equations are unfortunately sensitive to the level of truncation. For the hard sphere system, if the Wertheim–Thiele representation of the pair direct correlation function is used, the inclusion of part but not all of the triplet direct correlation function contribution, as has been common, worsens the predictions considerably. We also show that the convergence of the solutions found, with respect to number of reciprocal lattice vectors kept in the Fourier expansion of the crystal singlet density, is slow. These conclusions imply great sensitivity to the quality of the pair direct correlation function employed in the theory. (ii) We show the direct correlation function based and the pair correlation function based theories of freezing can be cast into a form which requires solution of isomorphous nonlinear integral equations. However, in the pair correlation function theory the usual neglect of the influence of inhomogeneity of the density distribution on the pair correlation function is shown to be inconsistent to the lowest order in the change of density on freezing, and to lead to erroneous predictions. The Journal of Chemical Physics is copyrighted by The American Institute of Physics.
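For orientation, the truncated nonlinear integral equation at issue is, in the standard notation of density functional theories of freezing (the symbols below follow the conventional Ramakrishnan-Yussouff form rather than anything quoted from this abstract):

```latex
\ln\frac{\rho(\mathbf r)}{\rho_\ell}
  = \int c^{(2)}\!\left(|\mathbf r-\mathbf r'|\right)\,
      \Delta\rho(\mathbf r')\, d\mathbf r'
  + \frac{1}{2}\iint c^{(3)}(\mathbf r,\mathbf r',\mathbf r'')\,
      \Delta\rho(\mathbf r')\,\Delta\rho(\mathbf r'')\,
      d\mathbf r'\, d\mathbf r'' + \cdots ,
\qquad \Delta\rho \equiv \rho - \rho_\ell ,
```

where $c^{(2)}$ and $c^{(3)}$ are the pair and triplet direct correlation functions of the liquid at density $\rho_\ell$, and the crystal singlet density is expanded over reciprocal lattice vectors $\mathbf G$ as $\rho(\mathbf r) = \rho_\ell\,[\,1 + \eta + \sum_{\mathbf G \neq 0} \mu_{\mathbf G}\, e^{i\mathbf G\cdot\mathbf r}\,]$. The truncation sensitivity reported above concerns keeping only part of the $c^{(3)}$ term, and the slow convergence concerns the number of $\mathbf G$ vectors retained in this expansion.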