29 results for ERROR rates

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

70.00%

Publisher:

Abstract:

Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led, information-technology-based complex intervention, compared with simple feedback, in reducing the proportion of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice.

Methods: Research subject group: "at-risk" patients registered with computerised general practices in two geographical regions in England. Design: parallel-group pragmatic cluster randomised trial. Interventions: practices will be randomised to either (i) computer-generated feedback, or (ii) a pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: the proportion of patients in each practice, at six and 12 months post-intervention:
- with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs;
- with a computer-recorded diagnosis of asthma being prescribed beta-blockers;
- aged 75 years and older receiving long-term prescriptions for angiotensin-converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months.
Secondary outcome measures: these relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: an economic evaluation will be done of the cost per error avoided, from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: a qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and to investigate possible reasons why the interventions prove effective or, conversely, ineffective. Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm, compared with an 11% reduction in the simple feedback arm.

Discussion: At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.

Relevance:

70.00%

Publisher:

Abstract:

In order to examine metacognitive accuracy (i.e., the relationship between metacognitive judgment and memory performance), researchers often rely on by-participant analysis, where metacognitive accuracy (e.g., resolution, as measured by the gamma coefficient or signal detection measures) is computed for each participant and the computed values are entered into group-level statistical tests such as the t-test. In the current work, we argue that the by-participant analysis, regardless of the accuracy measure used, produces a substantial inflation of Type-1 error rates when a random item effect is present. A mixed-effects model is proposed as a way to effectively address the issue, and our simulation studies examining Type-1 error rates indeed showed superior performance of the mixed-effects model analysis compared with the conventional by-participant analysis. We also present real data applications to illustrate further strengths of the mixed-effects model analysis. Our findings imply that caution is needed when using the by-participant analysis, and we recommend the mixed-effects model analysis instead.
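As a flavour of the recommended approach, the sketch below fits a linear mixed-effects model with crossed participant and item random effects to simulated judgment data using statsmodels. All variable names, simulation parameters, and the use of a simple linear model (rather than the paper's signal-detection measures) are illustrative assumptions, not the authors' exact analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated study: 30 participants judge 40 items (illustrative sizes).
n_subj, n_item = 30, 40
subj = np.repeat(np.arange(n_subj), n_item)
item = np.tile(np.arange(n_item), n_subj)

# Random item effect: items differ in memorability, and the same item
# effect also drives judgments, so judgments and accuracy are related
# item-wise even with no participant-level metacognitive resolution.
item_eff = rng.normal(0.0, 1.0, n_item)
subj_eff = rng.normal(0.0, 0.5, n_subj)
correct = rng.binomial(1, 1.0 / (1.0 + np.exp(-item_eff[item])))
judgment = (subj_eff[subj] + item_eff[item]
            + rng.normal(0.0, 1.0, n_subj * n_item))

df = pd.DataFrame({"subj": subj, "item": item,
                   "correct": correct, "judgment": judgment})
df["all"] = 1  # one dummy grouping so both factors enter as crossed
               # variance components

# The item variance component absorbs the spurious item-driven
# association; a by-participant gamma/t-test analysis would not.
model = smf.mixedlm("judgment ~ correct", df, groups="all",
                    vc_formula={"subj": "0 + C(subj)",
                                "item": "0 + C(item)"})
print(model.fit().summary())
```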

Relevance:

60.00%

Publisher:

Abstract:

In clinical trials, situations often arise where more than one response from each patient is of interest, and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between the test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates. It is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of the correlation from data collected as part of the trial. An adaptive approach that makes use of these formulas is proposed and evaluated, and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
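The paper's formulas for the correlation between the monitored test statistics are not reproduced in the abstract. Purely as an illustration of the adaptive idea, the sketch below re-estimates ρ from paired efficacy and safety observations at a hypothetical interim look; the names and numbers are assumptions.

```python
import numpy as np

def interim_rho(efficacy, safety):
    """Pearson estimate of the correlation between paired efficacy and
    safety responses from the data accumulated so far. (The paper
    derives formulas for the correlation between the monitored test
    statistics; this simple estimate stands in for them here.)"""
    return np.corrcoef(np.asarray(efficacy, float),
                       np.asarray(safety, float))[0, 1]

# Hypothetical interim look: 40 patients observed so far.
rng = np.random.default_rng(1)
true_rho = 0.5
responses = rng.multivariate_normal([0.3, 0.1],
                                    [[1.0, true_rho], [true_rho, 1.0]],
                                    size=40)
rho_hat = interim_rho(responses[:, 0], responses[:, 1])
print(f"estimated rho at interim: {rho_hat:.2f}")
# An adaptive design would now update the joint stopping boundaries
# using rho_hat rather than a conservatively guessed lower bound.
```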

Relevance:

60.00%

Publisher:

Abstract:

A sequential study design generally makes more efficient use of available information than a fixed sample counterpart of equal power. This feature is gradually being exploited by researchers in genetic and epidemiological investigations that utilize banked biological resources and in studies where time, cost and ethics are prominent considerations. Recent work in this area has focussed on the sequential analysis of matched case-control studies with a dichotomous trait. In this paper, we extend the sequential approach to a comparison of the associations within two independent groups of paired continuous observations. Such a comparison is particularly relevant in familial studies of phenotypic correlation using twins. We develop a sequential twin method based on the intraclass correlation and show that use of sequential methodology can lead to a substantial reduction in the number of observations without compromising the study error rates. Additionally, our approach permits straightforward allowance for other explanatory factors in the analysis. We illustrate our method in a sequential heritability study of dysplasia that allows for the effect of body mass index and compares monozygotes with pairs of singleton sisters. Copyright (c) 2006 John Wiley & Sons, Ltd.
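For orientation, the intraclass correlation on which the twin method is based can be estimated with the usual one-way ANOVA estimator for pairs. Below is a minimal sketch on simulated monozygotic and singleton-sister data; all parameters are illustrative, and the covariate adjustment the paper allows for is omitted.

```python
import numpy as np

def intraclass_corr(pairs):
    """One-way ANOVA estimate of the intraclass correlation for
    paired observations (k = 2 members per pair)."""
    pairs = np.asarray(pairs, dtype=float)   # shape (n_pairs, 2)
    n = pairs.shape[0]
    grand = pairs.mean()
    pair_means = pairs.mean(axis=1)
    msb = 2.0 * np.sum((pair_means - grand) ** 2) / (n - 1)  # between
    msw = np.sum((pairs - pair_means[:, None]) ** 2) / n     # within
    return (msb - msw) / (msb + msw)

# Hypothetical comparison: monozygotic twins vs singleton sister pairs.
rng = np.random.default_rng(2)
mz = rng.multivariate_normal([0, 0], [[1, .7], [.7, 1]], size=100)
sisters = rng.multivariate_normal([0, 0], [[1, .3], [.3, 1]], size=100)
print(intraclass_corr(mz), intraclass_corr(sisters))
# A sequential version would recompute these after each new group of
# pairs and compare them against pre-specified stopping boundaries.
```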

Relevance:

60.00%

Publisher:

Abstract:

Studies in the literature have proposed techniques to facilitate pointing in graphical user interfaces through the use of proxy targets. Proxy targets effectively bring the target to the cursor, thereby reducing the distance that the cursor must travel. This paper describes a study which aims to provide an initial understanding of how older adults respond to proxy targets, and compares older with younger users. We found that users in both age groups adjusted to the proxy targets without difficulty, and there was no indication in the cursor trajectories that users were confused about which target, i.e. the original versus the proxy, was to be selected. In terms of times, preliminary results show that for younger users, proxies did not provide any benefits over direct selection, while for older users, times were increased with proxy targets. A full analysis of the movement times, error rates, throughput and subjective feedback is currently underway.

Relevance:

60.00%

Publisher:

Abstract:

Embodied theories of cognition propose that neural substrates used in experiencing the referent of a word, for example perceiving upward motion, should be engaged in weaker form when that word, for example ‘rise’, is comprehended. Motivated by the finding that the perception of irrelevant background motion at near-threshold, but not supra-threshold, levels interferes with task execution, we assessed whether interference from near-threshold background motion was modulated by its congruence with the meaning of words (semantic content) while participants completed a lexical decision task (deciding whether a string of letters is a real word or not). Reaction times for motion words, such as ‘rise’ or ‘fall’, were slower when the direction of visual motion and the ‘motion’ of the word were incongruent, but only when the visual motion was at near-threshold levels. When motion was supra-threshold, the distribution of error rates, not reaction times, implicated low-level motion processing in the semantic processing of motion words. As the perception of near-threshold signals is unlikely to be influenced by strategies, our results support close contact between semantic information and perceptual systems.

Relevance:

60.00%

Publisher:

Abstract:

Background: Recent studies have indicated that many children with autism spectrum disorders present with language difficulties that are similar to those of children with specific language impairments, leading some to argue for similar structural deficits in these two disorders. Aims: Repetition of sentences involving long-distance dependencies was used to investigate complex syntax in these groups. Methods & Procedures: Adolescents with specific language impairments (mean age = 15;3, n = 14) and autism spectrum disorders plus language impairment (autism plus language impairment; mean age = 14;8, n = 16) were recruited alongside typically developing adolescents (mean age = 14;4, n = 17). They were required to repeat sentences containing relative clauses that varied in syntactic complexity. Outcomes & Results: The adolescents with specific language impairments presented with greater syntactic difficulties than the adolescents with autism plus language impairment, as manifested by higher error rates on the more complex object relative clauses, and a greater tendency to make syntactic changes during repetition. Conclusions & Implications: Adolescents with specific language impairments may have more severe syntactic difficulties than adolescents with autism plus language impairment, possibly due to their short-term memory limitations.

Relevance:

60.00%

Publisher:

Abstract:

Non-word repetition (NWR) was investigated in adolescents with typical development, Specific Language Impairment (SLI) and Autism plus Language Impairment (ALI) (n = 17, 13 and 16, with mean ages of 14;4, 15;4 and 14;8 respectively). The study evaluated the hypothesis that poor NWR performance in both clinical groups indicates an overlapping language phenotype (Kjelgaard & Tager-Flusberg, 2001). Performance was investigated both quantitatively, e.g. overall error rates, and qualitatively, e.g. the effect of length on repetition, the proportion of errors affecting phonological structure, and the proportion of consonant substitutions involving manner changes. Findings were consistent with previous research (Whitehouse, Barry, & Bishop, 2008) demonstrating a greater effect of length in the SLI group than in the ALI group, which may be due to greater short-term memory limitations. In addition, an automated count of phoneme errors identified poorer performance in the SLI group than in the ALI group. These findings indicate differences in the language profiles of individuals with SLI and ALI, but do not rule out a partial overlap. Errors affecting phonological structure were relatively frequent, accounting for around 40% of phonemic errors, but less frequent than straight consonant-for-consonant or vowel-for-vowel substitutions. It is proposed that these two different types of error may reflect separate contributory mechanisms. Around 50% of consonant substitutions in the clinical groups involved manner changes, suggesting poor auditory-perceptual encoding. From a clinical perspective, algorithms which automatically count phoneme errors may enhance the sensitivity of NWR as a diagnostic marker of language impairment. Learning outcomes: readers will be able to (1) describe and evaluate the hypothesis that there is a phenotypic overlap between SLI and Autism Spectrum Disorders, (2) describe differences in the NWR performance of adolescents with SLI and ALI, and discuss whether these differences support or refute the phenotypic overlap hypothesis, and (3) understand how computational algorithms such as the Levenshtein Distance may be used to analyse NWR data.
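The Levenshtein Distance mentioned in the learning outcomes can be computed with the standard dynamic-programming recurrence. A minimal sketch operating on phoneme sequences follows; the example target and response strings are hypothetical.

```python
def levenshtein(target, response):
    """Minimum number of phoneme insertions, deletions and
    substitutions needed to turn `response` into `target`. Operates
    on sequences, so phonemes can be multi-character symbols (e.g.
    IPA strings) rather than single letters."""
    m, n = len(target), len(response)
    # dist[i][j] = distance between target[:i] and response[:j]
    dist = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dist[i][0] = i
    for j in range(n + 1):
        dist[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if target[i - 1] == response[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,         # deletion
                             dist[i][j - 1] + 1,         # insertion
                             dist[i - 1][j - 1] + cost)  # substitution
    return dist[m][n]

# Hypothetical non-word repetition scoring: target vs child's attempt.
print(levenshtein(list("blonterstaping"), list("bonterstaping")))  # 1
```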

Relevance:

60.00%

Publisher:

Abstract:

A role for sequential test procedures is emerging in genetic and epidemiological studies using banked biological resources. This stems from the methodology's potential for improved use of information relative to comparable fixed sample designs. Studies in which cost, time and ethics feature prominently are particularly suited to a sequential approach. In this paper sequential procedures for matched case–control studies with binary data will be investigated and assessed. Design issues such as sample size evaluation and error rates are identified and addressed. The methodology is illustrated and evaluated using both real and simulated data sets.
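The paper's own sequential procedures are not detailed in the abstract. As a flavour of sequential testing with matched binary data, here is a Wald SPRT on the discordant pairs, where under a null odds ratio of 1 each discordant pair has probability 0.5 of being case-exposed; this is an illustrative stand-in, not necessarily the authors' design.

```python
import math

def sprt_matched_pairs(discordant, psi1, alpha=0.05, beta=0.10):
    """Wald SPRT over the discordant pairs of a matched case-control
    study. `discordant` yields 1 if the case (not the control) was
    exposed, 0 otherwise. Under H0 (odds ratio 1) each discordant
    pair is case-exposed with probability 0.5; under H1 (odds ratio
    psi1) that probability is p1 = psi1 / (1 + psi1)."""
    p1 = psi1 / (1.0 + psi1)
    upper = math.log((1 - beta) / alpha)    # cross -> reject H0
    lower = math.log(beta / (1 - alpha))    # cross -> accept H0
    llr = 0.0
    for n, x in enumerate(discordant, start=1):
        llr += math.log(p1 / 0.5) if x else math.log((1 - p1) / 0.5)
        if llr >= upper:
            return "reject H0", n
        if llr <= lower:
            return "accept H0", n
    return "continue sampling", None

# Hypothetical stream of discordant pairs, testing odds ratio 2 vs 1.
pairs = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1]
print(sprt_matched_pairs(pairs, psi1=2.0))
```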

Relevance:

60.00%

Publisher:

Abstract:

Although a number of studies have reported that force feedback gravity wells can improve performance in "point-and-click" tasks, there have been few studies addressing issues surrounding the use of gravity wells for multiple on-screen targets. This paper investigates the performance of users, both with and without motion-impairments, in a "point-and-click" task when an undesired haptic distractor is present. The importance of distractor location is studied explicitly. Results showed that gravity wells can still improve times and error rates, even on occasions when the cursor is pulled into a distractor. The greatest improvement is seen for the most impaired users. In addition to traditional measures such as time and errors, performance is studied in terms of measures of cursor movement along a path. Two cursor measures, angular distribution and temporal components, are proposed and their ability to explain performance differences is explored.
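The abstract does not define the two proposed cursor measures. One plausible reading of "angular distribution" is the distribution of angles between successive cursor steps and the straight line to the target, which the sketch below computes; this is an illustrative guess at the measure, not the paper's definition.

```python
import numpy as np

def angular_distribution(path, target, bins=12):
    """Histogram of the angles between successive cursor steps and the
    straight line from each sample to the target. Direct movements
    concentrate mass near 0 degrees; pulls toward haptic distractors
    appear as mass at larger angles. (Illustrative definition only:
    the paper's exact measure may differ.)"""
    path = np.asarray(path, dtype=float)        # (n_samples, 2) x, y
    target = np.asarray(target, dtype=float)    # (2,) target centre
    steps = np.diff(path, axis=0)
    to_target = target - path[:-1]
    dots = np.einsum("ij,ij->i", steps, to_target)
    norms = (np.linalg.norm(steps, axis=1)
             * np.linalg.norm(to_target, axis=1))
    valid = norms > 0                           # skip stationary samples
    angles = np.degrees(np.arccos(np.clip(dots[valid] / norms[valid],
                                          -1.0, 1.0)))
    hist, edges = np.histogram(angles, bins=bins, range=(0.0, 180.0))
    return hist / max(hist.sum(), 1), edges
```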

Relevance:

60.00%

Publisher:

Abstract:

This paper presents practical approaches to the problem of sample size re-estimation in clinical trials with survival data when proportional hazards can be assumed. When data on a full range of survival experiences across the recruited patients are readily available at the time of the review, it is shown that, as expected, performing a blinded re-estimation procedure is straightforward and can help to maintain the trial's pre-specified error rates. Two alternative methods for dealing with the situation where limited survival experience is available at the time of the sample size review are then presented and compared. In this instance, extrapolation is required in order to undertake the sample size re-estimation. Worked examples, together with results from a simulation study, are described. It is concluded that, as in the standard case, use of either extrapolation approach successfully protects the trial error rates. Copyright © 2012 John Wiley & Sons, Ltd.
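The re-estimation procedures themselves are not given in the abstract. For orientation, under proportional hazards the driver of power is the number of events, which Schoenfeld's approximation links to the hazard ratio; the sketch below assumes 1:1 allocation and a two-sided log-rank test.

```python
import math
from scipy.stats import norm

def required_events(log_hr, alpha=0.05, power=0.9):
    """Schoenfeld's approximation to the total number of events needed
    to detect a hazard ratio of exp(log_hr) with a two-sided,
    level-alpha log-rank test under 1:1 allocation."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 4.0 * z ** 2 / log_hr ** 2

# Hypothetical design: detect HR = 0.7 with 90% power -> ~331 events.
print(math.ceil(required_events(math.log(0.7))))
# At a blinded review, if the pooled event rate is lower than planned,
# accrual or follow-up is extended until the event target is met; the
# target itself depends only on the hypothesised hazard ratio.
```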

Relevance:

60.00%

Publisher:

Abstract:

Point and click interactions using a mouse are an integral part of computer use for current desktop systems. Compared with younger users, though, older adults experience greater difficulties performing cursor positioning tasks, and this can limit their ability to use a computer easily and effectively. Target expansion is a technique for improving pointing performance, where the target dynamically grows as the cursor approaches. This has the advantage that targets conserve screen real estate in their unexpanded state, yet can still provide the benefits of a larger area to click on. This paper presents two studies of target expansion with older and younger participants, involving multidirectional point-select tasks with a computer mouse. Study 1 compares static versus expanding targets, and Study 2 compares static targets with three alternative techniques for expansion. Results show that expansion can improve times by up to 14% and reduce error rates by up to 50%. Additionally, expanding targets are beneficial even when the expansion happens late in the movement, i.e. after the cursor has reached the expanded target area or even after it has reached the original target area. Participants’ subjective feedback on target expansion was generally favorable, lending further support to the technique.

Relevance:

60.00%

Publisher:

Abstract:

Synesthesia entails a special kind of sensory perception in which stimulation of one sensory modality leads to an internally generated perceptual experience in another, non-stimulated sensory modality. This phenomenon can be viewed as an abnormal multisensory integration process, as the synesthetic percept is aberrantly fused with the stimulated modality. Indeed, recent synesthesia research has focused on multimodal processing even outside of the specific synesthesia-inducing context and has revealed changed multimodal integration, thus suggesting perceptual alterations at a global level. Here, we focused on audio-visual processing in synesthesia using a semantic classification task combined with visually or audio-visually presented animate and inanimate objects, shown in audio-visually congruent and incongruent combinations. Fourteen subjects with auditory-visual and/or grapheme-color synesthesia and 14 control subjects participated in the experiment. During presentation of the stimuli, event-related potentials were recorded from 32 electrodes. The analysis of reaction times and error rates revealed no group differences, with best performance for audio-visually congruent stimulation, indicating the well-known multimodal facilitation effect. We found enhanced amplitude of the N1 component over occipital electrode sites for synesthetes compared with controls. The differences occurred irrespective of the experimental condition and therefore suggest a global influence on early sensory processing in synesthetes.

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we develop an energy-efficient resource-allocation scheme with proportional fairness for downlink multiuser orthogonal frequency-division multiplexing (OFDM) systems with distributed antennas. Our aim is to maximize energy efficiency (EE) under constraints on the overall transmit power of each remote access unit (RAU), proportional fairness of the data rates, and the bit error rates (BERs). Because of the nonconvex nature of the optimization problem, obtaining the optimal solution is extremely computationally complex. Therefore, we develop a low-complexity suboptimal algorithm that separates subcarrier allocation from power allocation. For the low-complexity algorithm, we first allocate subcarriers by assuming equal power distribution. Then, by exploiting the properties of fractional programming, we transform the nonconvex optimization problem in fractional form into an equivalent optimization problem in subtractive form, which admits a tractable solution. Next, an optimal energy-efficient power-allocation algorithm is developed to maximize EE while maintaining proportional fairness. Through computer simulation, we demonstrate the effectiveness of the proposed low-complexity algorithm and illustrate the fundamental trade-off between energy- and spectral-efficient transmission designs.
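The fractional-to-subtractive transformation described above is commonly realised with Dinkelbach's algorithm. The generic sketch below iterates the subtractive problem max_x R(x) - q·P(x) until the surplus vanishes; the toy single-link model and all numbers are illustrative assumptions, not the paper's system model.

```python
import numpy as np

def dinkelbach(rate, power, x0, argmax_sub, tol=1e-6, max_iter=50):
    """Dinkelbach iteration for maximising the ratio rate(x)/power(x)
    (energy efficiency) via a sequence of subtractive-form problems
    max_x rate(x) - q * power(x). `argmax_sub(q)` returns a maximiser
    of the subtractive problem; it is supplied by the caller because
    the subcarrier/power-allocation step is system specific."""
    x = x0
    q = rate(x) / power(x)
    for _ in range(max_iter):
        x = argmax_sub(q)
        if abs(rate(x) - q * power(x)) < tol:  # surplus ~ 0: q optimal
            break
        q = rate(x) / power(x)                 # update the EE estimate
    return x, q

# Toy single-link example (illustrative numbers, not the paper's
# model): rate = log2(1 + g*p), power = circuit + transmit power.
g, p_c = 4.0, 0.5
rate = lambda p: np.log2(1.0 + g * p)
power = lambda p: p_c + p
grid = np.linspace(1e-6, 10.0, 100_000)
argmax_sub = lambda q: grid[np.argmax(rate(grid) - q * power(grid))]
print(dinkelbach(rate, power, 1.0, argmax_sub))
```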

Relevance:

30.00%

Publisher:

Abstract:

New data show that island arc rocks have (Pb-210/Ra-226)_0 ratios which range from as low as 0.24 up to 2.88. In contrast, (Ra-228/Th-232) appears always to be within error of 1, suggesting that the large Ra-226 excesses observed in arc rocks were generated more than 30 years ago. This places a maximum estimate on melt ascent velocities of around 4000 m/year and provides further confidence that the Ra-226 excesses reflect deep (source) processes rather than shallow-level alteration or seawater contamination. Conversely, partial melting must have occurred more than 30 years prior to eruption. The Pb-210 deficits are most readily explained by protracted magma degassing. Using published numerical models, the data suggest that degassing occurred continuously for periods of up to several decades just prior to eruption, but no link with eruption periodicity was found. Longer periods are required if degassing is discontinuous, less than 100% efficient, or if magma is recharged or stored after degassing. The long durations suggest much of this degassing occurs at depth, with implications for the formation of hydrothermal and copper-porphyry systems. A suite of lavas erupted in 1985-1986 from Sangeang Api volcano in the Sunda arc is characterised by deficits of Pb-210 relative to Ra-226, from which 6-8 years of continuous Rn-222 degassing would be inferred from recent numerical models. These data also form a linear (Pb-210)/Pb versus (Ra-226)/Pb array which might be interpreted as a 71-year isochron. However, the array passes through the origin, suggesting displacement downwards from the equiline in response to degassing, and so the slope of the array is inferred not to have any age significance. Simple modelling shows that the range of (Ra-226)/Pb ratios requires thousands of years to develop, consistent with differentiation occurring in response to cooling at the base of the crust. Thus, degassing post-dated, and was not responsible for, magma differentiation. The formation, migration and extraction of gas bubbles must be extremely efficient in mafic magma, whereas the higher viscosity of more siliceous magmas retards the process and can lead to Pb-210 excesses. A possible negative correlation between (Pb-210/Ra-226)_0 and SO2 emission rate requires further testing but may have implications for future eruptions. (C) 2004 Elsevier B.V. All rights reserved.
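The "more than 30 years" argument rests on the standard return of a daughter/parent activity ratio to secular equilibrium. A minimal numerical illustration for Ra-228 follows; the 50% initial excess is an arbitrary example, not a value from the paper.

```python
import math

# Return of a daughter/parent activity ratio to secular equilibrium:
#   (D/P)(t) = 1 + [(D/P)_0 - 1] * exp(-lambda_D * t)
# For Ra-228 (half-life ~5.75 yr), an initial excess decays to within
# typical measurement error after roughly five half-lives, which is
# the basis of the ">30 years" inference above.
half_life_ra228 = 5.75                       # years
lam = math.log(2) / half_life_ra228

def ratio(t, r0):
    """Daughter/parent activity ratio at time t, given initial ratio r0."""
    return 1.0 + (r0 - 1.0) * math.exp(-lam * t)

for t in (0, 10, 20, 30):
    print(t, round(ratio(t, r0=1.5), 3))     # e.g. a 50% initial excess
# After 30 yr the 50% excess has decayed to ~1.3% above equilibrium,
# i.e. within error of 1 for typical (Ra-228/Th-232) measurements.
```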