925 results for Timing errors
Abstract:
Objective To undertake a process evaluation of pharmacists' recommendations arising in the context of a complex IT-enabled pharmacist-delivered randomised controlled trial (PINCER trial) to reduce the risk of hazardous medicines management in general practices. Methods PINCER pharmacists manually recorded patients' demographics, details of interventions recommended, actions undertaken by practice staff and time taken to manage individual cases of hazardous medicines management. Data were coded and double entered into SPSS v15, and then summarised using percentages for categorical data (with 95% CI) and, as appropriate, means (SD) or medians (IQR) for continuous data. Key findings Pharmacists spent a median of 20 minutes (IQR 10, 30) reviewing medical records, recommending interventions and completing actions in each case of hazardous medicines management. Pharmacists judged 72% (95% CI 70, 74) (1463/2026) of cases of hazardous medicines management to be clinically relevant. Pharmacists recommended 2105 interventions in 74% (95% CI 73, 76) (1516/2038) of cases, and 1685 actions were taken in 61% (95% CI 59, 63) (1246/2038) of cases; 66% (95% CI 64, 68) (1383/2105) of interventions recommended by pharmacists were completed, and 5% (95% CI 4, 6) (104/2105) of recommendations were accepted by general practitioners (GPs) but not completed by the end of the pharmacists' placement; the remaining recommendations were rejected or considered not relevant by GPs. Conclusions The outcome measures were used to target pharmacist activity in general practice towards patients at risk from hazardous medicines management. Recommendations from trained PINCER pharmacists were found to be broadly acceptable to GPs and led to ameliorative action in the majority of cases. It seems likely that the approach used by the PINCER pharmacists could be employed by other practice pharmacists following appropriate training.
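The interval estimates quoted in this abstract can be reproduced directly from the reported counts. A minimal sketch using a normal-approximation (Wald) interval for a proportion — an assumption on my part, since the abstract does not state which interval method was used:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# 1463 of 2026 cases judged clinically relevant
p, lo, hi = proportion_ci(1463, 2026)
print(f"{100 * p:.0f}% (95% CI {100 * lo:.0f}, {100 * hi:.0f})")  # 72% (95% CI 70, 74)
```

With denominators in the thousands, the Wald interval agrees with the reported figures to the nearest percentage point.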
Abstract:
Objective To determine the prevalence and nature of prescribing and monitoring errors in general practices in England. Design Retrospective case note review of unique medication items prescribed over a 12 month period to a 2% random sample of patients. Mixed effects logistic regression was used to analyse the data. Setting Fifteen general practices across three primary care trusts in England. Data sources Examination of 6048 unique prescription items prescribed over the previous 12 months for 1777 patients. Main outcome measures Prevalence of prescribing and monitoring errors, and severity of errors, using validated definitions. Results Prescribing and/or monitoring errors were detected in 4.9% (296/6048) of all prescription items (95% confidence interval 4.4% to 5.5%). The vast majority of errors were of mild to moderate severity, with 0.2% (11/6048) of items having a severe error. After adjusting for covariates, patient-related factors associated with an increased risk of prescribing and/or monitoring errors were: age less than 15 years (odds ratio (OR) 1.87, 1.19 to 2.94, p=0.006) or greater than 64 years (OR 1.68, 1.04 to 2.73, p=0.035), and higher numbers of unique medication items prescribed (OR 1.16, 1.12 to 1.19, p<0.001). Conclusion Prescribing and monitoring errors are common in English general practice, although severe errors are unusual. Many factors increase the risk of error. Having identified the most common and important errors, and the factors associated with these, strategies to prevent future errors should be developed based on the study findings.
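Odds ratios and their confidence limits are estimated on the log-odds scale and back-transformed, which is why the reported intervals are asymmetric around the point estimate. A sketch that recovers the underlying log-scale quantities from the age-under-15 figures above, assuming a standard Wald-type interval (the abstract does not say which was used):

```python
import math

def or_to_log_scale(or_point, ci_low, ci_high, z=1.96):
    """Recover the log-odds coefficient and its standard error from a
    reported odds ratio and 95% CI (assumes a Wald-type interval)."""
    beta = math.log(or_point)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * z)
    return beta, se

# Age < 15 years: OR 1.87 (1.19 to 2.94)
beta, se = or_to_log_scale(1.87, 1.19, 2.94)
```

Exponentiating beta ± 1.96·se reproduces the published limits to two decimal places, confirming the interval is symmetric on the log scale.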
Abstract:
Remote sensing observations often have correlated errors, but the correlations are typically ignored in data assimilation for numerical weather prediction. The assumption of zero correlations is often used with data thinning methods, resulting in a loss of information. As operational centres move towards higher-resolution forecasting, there is a requirement to retain data providing detail on appropriate scales. Thus an alternative approach to dealing with observation error correlations is needed. In this article, we consider several approaches to approximating observation error correlation matrices: diagonal approximations, eigendecomposition approximations and Markov matrices. These approximations are applied in incremental variational assimilation experiments with a 1-D shallow water model using synthetic observations. Our experiments quantify analysis accuracy in comparison with a reference or ‘truth’ trajectory, as well as with analyses using the ‘true’ observation error covariance matrix. We show that it is often better to include an approximate correlation structure in the observation error covariance matrix than to incorrectly assume error independence. Furthermore, by choosing a suitable matrix approximation, it is feasible and computationally cheap to include error correlation structure in a variational data assimilation algorithm.
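The trade-off this abstract describes can be illustrated with a toy correlation model. A minimal sketch, assuming a first-order Markov (exponentially decaying) correlation structure and comparing a diagonal (independence) approximation against a truncated eigendecomposition — the matrix size and parameter values are illustrative, not taken from the article:

```python
import numpy as np

def markov_corr(n, rho):
    """First-order Markov correlation matrix: R[i, j] = rho**|i - j|."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def eig_truncate(R, k):
    """Low-rank approximation keeping the k leading eigenpairs of R."""
    w, V = np.linalg.eigh(R)        # eigenvalues in ascending order
    w, V = w[::-1], V[:, ::-1]      # reorder to descending
    return (V[:, :k] * w[:k]) @ V[:, :k].T

R = markov_corr(20, 0.7)
err_diag = np.linalg.norm(R - np.eye(20))          # assume independence
err_eig = np.linalg.norm(R - eig_truncate(R, 5))   # keep 5 of 20 eigenpairs
print(err_eig < err_diag)  # the truncated spectrum retains far more structure
```

Even a heavily truncated spectrum is a better approximation (in Frobenius norm) than discarding all off-diagonal structure, which is the point the experiments in the article make for analysis accuracy.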
Abstract:
In this paper, we present case studies of the optical and magnetic signatures of the characteristics of the first minute of substorm expansion phase onset observed in the ionosphere. We find that for two isolated substorms, the onset of magnetic pulsations in the 24–96 s period wavelet band are colocated in time and space with the formation and development of small-scale optical undulations along the most equatorward preexisting auroral arc prior to auroral breakup. These undulations undergo an inverse spatial cascade into vortices prior to the release of the westward traveling surge. We also present a case study of a multiple activation substorm, whereby discrete onsets of ULF wave power above a predetermined quiet time threshold are shown to be associated with specific optical intensifications and brightenings. Moreover, in the multiple activation substorm event, we show that neither the formation of the small-scale undulations nor the formation of similar structures along a north–south aligned arc is sufficient to produce auroral breakup associated with expansion phase onset. It is only ∼10 min after these two disparate activation regions initiate that auroral breakup and the subsequent formation of a westward traveling surge occur. We discuss the implications of these results in terms of the triggering mechanisms likely to be occurring during these specific events.
Abstract:
Aim: To examine the causes of prescribing and monitoring errors in English general practices and provide recommendations for how they may be overcome. Design: Qualitative interview and focus group study with purposive sampling and thematic analysis informed by Reason's accident causation model. Participants: General practice staff participated in a combination of semi-structured interviews (n=34) and six focus groups (n=46). Setting: Fifteen general practices across three primary care trusts in England. Results: We identified seven categories of high-level error-producing conditions: the prescriber, the patient, the team, the task, the working environment, the computer system, and the primary-secondary care interface. Each of these was further broken down to reveal various error-producing conditions. The prescriber's therapeutic training, drug knowledge and experience, knowledge of the patient, perception of risk, and physical and emotional health were all identified as possible causes. The patient's characteristics and the complexity of the individual clinical case were also found to have contributed to prescribing errors. The importance of feeling comfortable within the practice team was highlighted, as were the safety implications of general practitioners (GPs) signing prescriptions generated by nurses when they had not seen the patient themselves. The working environment, with its high workload, time pressures and interruptions, and computer-related issues associated with mis-selecting drugs from electronic pick-lists and overriding alerts, were all highlighted as possible and often interconnected causes of prescribing errors. Conclusion: This study has highlighted the complex underlying causes of prescribing and monitoring errors in general practices, several of which are amenable to intervention.
Abstract:
We show that retrievals of sea surface temperature from satellite infrared imagery are prone to two forms of systematic error: prior error (familiar from the theory of atmospheric sounding) and error arising from nonlinearity. These errors have different complex geographical variations, related to the differing geographical distributions of the main geophysical variables that determine clear-sky brightness-temperatures over the oceans. We show that such errors arise as an intrinsic consequence of the form of the retrieval (rather than as a consequence of sub-optimally specified retrieval coefficients, as is often assumed) and that the pattern of observed errors can be simulated in detail using radiative-transfer modelling. The prior error has the linear form familiar from atmospheric sounding. A quadratic equation for nonlinearity error is derived, and it is verified that the nonlinearity error exhibits predominantly quadratic behaviour in this case.
Abstract:
Using the eye-movement monitoring technique in two reading comprehension experiments, we investigated the timing of constraints on wh-dependencies (so-called ‘island’ constraints) in native and nonnative sentence processing. Our results show that both native and nonnative speakers of English are sensitive to extraction islands during processing, suggesting that memory storage limitations affect native and nonnative comprehenders in essentially the same way. Furthermore, our results show that the timing of island effects in native compared to nonnative sentence comprehension is affected differently by the type of cue (semantic fit versus filled gaps) signalling whether dependency formation is possible at a potential gap site. Whereas English native speakers showed immediate sensitivity to filled gaps but not to lack of semantic fit, proficient German-speaking learners of L2 English showed the opposite sensitivity pattern. This indicates that initial wh-dependency formation in nonnative processing is based on semantic feature-matching rather than being structurally mediated as in native comprehension.
Processing reflexives in a second language: the timing of structural and discourse-level information
Abstract:
We report the results from two eye-movement monitoring experiments examining the processing of reflexive pronouns by proficient German-speaking learners of second language (L2) English. Our results show that the nonnative speakers initially tried to link English argument reflexives to a discourse-prominent but structurally inaccessible antecedent, thereby violating binding condition A. Our native speaker controls, in contrast, showed evidence of applying condition A immediately during processing. Together, our findings show that L2 learners’ initial focusing on a structurally inaccessible antecedent cannot be due to first language influence and is also independent of whether the inaccessible antecedent c-commands the reflexive. This suggests that unlike native speakers, nonnative speakers of English initially attempt to interpret reflexives through discourse-based coreference assignment rather than syntactic binding.
Abstract:
The extensive shoreline deposits of Lake Chilwa, southern Malawi, a shallow water body today covering 600 km² of a basin of 7500 km², are investigated for their record of late Quaternary highstands. OSL dating, applied to 36 samples from five sediment cores from the northern and western marginal sand ridges, reveals a highstand record spanning 44 ka. Using two different grouping methods, highstand phases are identified at 43.7–33.3 ka, 26.2–21.0 ka and 17.9–12.0 ka (total error method) or 38.4–35.5 ka, 24.3–22.3 ka, 16.2–15.1 ka and 13.5–12.7 ka (Finite Mixture Model age components), with two further discrete events recorded at 11.01 ± 0.76 ka and 8.52 ± 0.56 ka. The highstands are comparable in timing to wet phases from other basins in East and southern Africa, demonstrating wet conditions in the region before the LGM, which was dry, and a wet Lateglacial, which commenced earlier in the southern than in the northern hemisphere in East Africa. We find no evidence that the wet phases are insolation driven, but analysis of the dataset and GCM modelling experiments suggests that Heinrich events may be associated with enhanced monsoon activity in East Africa, both in timing and as a possible causal mechanism.
Abstract:
Introduction: Care home residents are at particular risk from medication errors, and our objective was to determine the prevalence and potential harm of prescribing, monitoring, dispensing and administration errors in UK care homes, and to identify their causes. Methods: A prospective study of a random sample of residents within a purposive sample of homes in three areas. Errors were identified by patient interview, note review, observation of practice and examination of dispensed items. Causes were understood by observation and from theoretically framed interviews with home staff, doctors and pharmacists. Potential harm from errors was assessed by expert judgement. Results: The 256 residents recruited in 55 homes were taking a mean of 8.0 medicines. One hundred and seventy-eight (69.5%) residents had one or more errors. The mean number per resident was 1.9 errors. The mean potential harm from prescribing, monitoring, administration and dispensing errors was 2.6, 3.7, 2.1 and 2.0 (0 = no harm, 10 = death), respectively. Contributing factors from the 89 interviews included doctors who were not accessible, did not know the residents and lacked information in homes when prescribing; home staff's high workload, lack of medicines training and drug round interruptions; lack of teamwork among home, practice and pharmacy; inefficient ordering systems; inaccurate medicine records and the prevalence of verbal communication; and difficult-to-fill (and check) medication administration systems. Conclusions: That two thirds of residents were exposed to one or more medication errors is of concern. The will to improve exists, but there is a lack of overall responsibility. Action is required from all concerned.
Abstract:
This article elucidates the Typological Primacy Model (TPM; Rothman, 2010, 2011, 2013) for the initial stages of adult third language (L3) morphosyntactic transfer, addressing questions that stem from the model and its application. The TPM maintains that structural proximity between the L3 and the L1 and/or the L2 determines L3 transfer. In addition to demonstrating empirical support for the TPM, this article articulates a proposal for how the mind unconsciously determines typological (structural) proximity based on linguistic cues from the L3 input stream used by the parser early on to determine holistic transfer of one previous (the L1 or the L2) system. This articulated version of the TPM is motivated by argumentation appealing to cognitive and linguistic factors. Finally, in line with the general tenets of the TPM, I ponder if and why L3 transfer might obtain differently depending on the type of bilingual (e.g. early vs. late) and proficiency level of bilingualism involved in the L3 process.
Abstract:
Bayesian analysis is given of an instrumental variable model that allows for heteroscedasticity in both the structural equation and the instrument equation. Specifically, the approach for dealing with heteroscedastic errors in Geweke (1993) is extended to the Bayesian instrumental variable estimator outlined in Rossi et al. (2005). Heteroscedasticity is treated by modelling the variance for each error using a hierarchical prior that is Gamma distributed. The computation is carried out by using a Markov chain Monte Carlo sampling algorithm with an augmented draw for the heteroscedastic case. An example using real data illustrates the approach and shows that ignoring heteroscedasticity in the instrument equation when it exists may lead to biased estimates.
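The hierarchical Gamma prior on the error precisions is, marginally, a scale mixture of normals: with precision λ_i ~ Gamma(ν/2, rate ν/2), each error is N(0, 1/λ_i) and the marginal error distribution is Student-t with ν degrees of freedom — Geweke's construction. A minimal forward simulation of that mixture, illustrative only and not the paper's MCMC sampler:

```python
import numpy as np

rng = np.random.default_rng(0)
n, nu = 100_000, 5.0

# Per-observation precisions lambda_i ~ Gamma(nu/2, rate = nu/2)
# (numpy parameterises Gamma by scale, so scale = 2/nu),
# then each error is N(0, 1/lambda_i).
lam = rng.gamma(shape=nu / 2, scale=2 / nu, size=n)
errors = rng.normal(0.0, 1.0 / np.sqrt(lam))

# The mixture has heavier tails than a homoscedastic normal benchmark.
normal = rng.normal(size=n)
print(np.mean(np.abs(errors) > 3), np.mean(np.abs(normal) > 3))
```

In the full sampler the λ_i are updated conditional on the current residuals, which is consistent with the "augmented draw" the abstract describes.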
Abstract:
Using annual observations on industrial production over the last three centuries, and on GDP over a 100-year period, we seek an historical perspective on the forecastability of these UK output measures. The series are dominated by strong upward trends, so we consider various specifications of the trend, including the local linear trend structural time-series model, which allows the level and slope of the trend to vary. Our results are not unduly sensitive to how the trend in the series is modelled: the average sizes of the forecast errors of all models, and the wide span of prediction intervals, attest to a great deal of uncertainty in the economic environment. It appears that, from an historical perspective, the postwar period has been relatively more forecastable.
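The local linear trend model mentioned above stacks two random walks: the observed series is a noisy level, the level drifts by a slope, and the slope itself wanders. A minimal simulation sketch — the parameter values are illustrative, not estimated from the UK series:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_llt(n, sig_eps=1.0, sig_xi=0.1, sig_zeta=0.05):
    """Local linear trend: y_t = mu_t + eps_t,
    mu_{t+1} = mu_t + beta_t + xi_t, beta_{t+1} = beta_t + zeta_t."""
    mu, beta = 0.0, 0.0
    y = np.empty(n)
    for t in range(n):
        y[t] = mu + rng.normal(0.0, sig_eps)
        mu += beta + rng.normal(0.0, sig_xi)
        beta += rng.normal(0.0, sig_zeta)
    return y

y = simulate_llt(300)
```

Because the slope is itself a random walk, forecast uncertainty from this model grows rapidly with horizon, which is one reason prediction intervals for long historical series are so wide.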
Abstract:
It is increasingly important to know when energy is used in the home, at work and on the move. Issues of time and timing have not featured strongly in energy policy analysis or in modelling, much of which has focused on estimating and reducing total average annual demand per capita. If smarter ways of balancing supply and demand are to take hold, and if we are to make better use of decarbonised forms of supply, it is essential to understand and intervene in patterns of societal synchronisation. This calls for detailed knowledge of when, and on what occasions, many people engage in the same activities at the same time, of how such patterns are changing, and of how they might be shaped. In addition, the impact of smart meters and controls partly depends on whether there is, in fact, scope for shifting the timing of what people do and for changing the rhythm of the day. Is the scheduling of daily life an arena that policy can influence, and if so, how? The DEMAND Centre has been linking time use, energy consumption and travel diary data as a means of addressing these questions, and in this working paper we present some of the issues and results arising from that exercise.
Abstract:
Understanding the sources of systematic errors in climate models is challenging because of coupled feedbacks and error compensation. The developing seamless approach proposes that identifying and correcting short-term climate model errors has the potential to improve the modelled climate on longer time scales. In previous studies, initialised atmospheric simulations of a few days have been used to compare fast physics processes (convection, cloud processes) among models. The present study explores how initialised seasonal to decadal hindcasts (re-forecasts) relate transient week-to-month errors of the ocean and atmospheric components to the coupled model's long-term pervasive SST errors. A protocol is designed to attribute the SST biases to their source processes. It includes five steps: (1) identify and describe biases in a coupled stabilised simulation, (2) determine the time scale of the advent of the bias and its propagation, (3) find the geographical origin of the bias, (4) evaluate the degree of coupling in the development of the bias, and (5) find the field responsible for the bias. This strategy has been implemented with a set of experiments based on the initial adjustment of initialised simulations, exploring various degrees of coupling. In particular, hindcasts give the time scale of the advent of biases, regionally restored experiments show their geographical origin, and ocean-only simulations isolate the field responsible for a bias and evaluate the degree of coupling in its development. The strategy is applied to four prominent SST biases of the IPSL-CM5A-LR coupled model in the tropical Pacific that are largely shared by other coupled models, including the Southeast Pacific warm bias and the equatorial cold tongue bias. Using the proposed protocol, we demonstrate that the East Pacific warm bias appears within a few months and is caused by a lack of upwelling due to overly weak meridional coastal winds off Peru. The cold equatorial bias, which surprisingly takes 30 years to develop, is the result of equatorward advection of midlatitude cold SST errors. Despite large development efforts, the current generation of coupled models shows only limited improvement. The strategy proposed in this study is a further step towards moving from the current ad hoc approach to a bias-targeted, priority-setting, systematic approach to model development.