97 results for probability of error
Abstract:
Background: Medication errors in general practice are an important source of potentially preventable morbidity and mortality. Building on previous descriptive, qualitative and pilot work, we sought to investigate the effectiveness, cost-effectiveness and likely generalisability of a complex pharmacist-led IT-based intervention aiming to improve prescribing safety in general practice.
Objectives: We sought to:
• Test the hypothesis that a pharmacist-led IT-based complex intervention using educational outreach and practical support is more effective than simple feedback in reducing the proportion of patients at risk from errors in prescribing and medicines management in general practice.
• Conduct an economic evaluation of the cost per error avoided, from the perspective of the National Health Service (NHS).
• Analyse data recorded by pharmacists, summarising the proportions of patients judged to be at clinical risk, the actions recommended by pharmacists, and actions completed in the practices.
• Explore the views and experiences of healthcare professionals and NHS managers concerning the intervention; investigate potential explanations for the observed effects; and inform decisions on the future roll-out of the pharmacist-led intervention.
• Examine secular trends in the outcome measures of interest, allowing for informal comparison between trial practices and practices contributing to the QRESEARCH database that did not participate in the trial.
Methods: Two-arm cluster randomised controlled trial of 72 English general practices with embedded economic analysis and longitudinal descriptive and qualitative analysis. Informal comparison of the trial findings with a national descriptive study of secular trends undertaken using data from practices contributing to the QRESEARCH database. The main outcomes of interest were prescribing errors and medication monitoring errors at six and 12 months following the intervention.
Results: Participants in the pharmacist intervention arm practices were significantly less likely to have been prescribed a non-selective NSAID without a proton pump inhibitor (PPI) if they had a history of peptic ulcer (OR 0.58, 95% CI 0.38, 0.89), to have been prescribed a beta-blocker if they had asthma (OR 0.73, 95% CI 0.58, 0.91) or (in those aged 75 years and older) to have been prescribed an ACE inhibitor or diuretic without a measurement of urea and electrolytes in the last 15 months (OR 0.51, 95% CI 0.34, 0.78). The economic analysis suggests that the PINCER pharmacist intervention has a 95% probability of being cost-effective if the decision-maker's ceiling willingness to pay reaches £75 (6 months) or £85 (12 months) per error avoided. The intervention addressed an issue that was important to professionals and their teams and was delivered in a way that was acceptable to practices, with minimum disruption of normal work processes. Comparison of the trial findings with changes seen in QRESEARCH practices indicated that any reductions achieved in the simple feedback arm were likely, in the main, to have been related to secular trends rather than the intervention.
Conclusions: Compared with simple feedback, the pharmacist-led intervention resulted in reductions in the proportions of patients at risk of prescribing and monitoring errors for the primary outcome measures and the composite secondary outcome measures at six months and (with the exception of the NSAID/peptic ulcer outcome measure) 12 months post-intervention. The intervention is acceptable to pharmacists and practices, and is likely to be seen as cost-effective by decision makers.
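The odds ratios with 95% confidence intervals reported above can be illustrated with a minimal sketch. The counts below are hypothetical, not trial data, and the trial's ORs came from multilevel models that account for clustering, which a plain Wald interval ignores:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = intervention at-risk, b = intervention not at-risk,
    c = control at-risk,      d = control not at-risk."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, for illustration only:
or_, lo, hi = odds_ratio_ci(30, 970, 50, 950)
```

An OR below 1 with an upper confidence limit below 1, as in the trial's primary outcomes, indicates a statistically significant reduction in risk.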
Abstract:
Aircraft flying through cold ice-supersaturated air produce persistent contrails which contribute to the climate impact of aviation. Here, we demonstrate the importance of the weather situation, together with the route and altitude of the aircraft through this, on estimating contrail coverage. The results have implications for determining the climate impact of contrails as well as potential mitigation strategies. Twenty-one years of re-analysis data are used to produce a climatological assessment of conditions favorable for persistent contrail formation between 200 and 300 hPa over the north Atlantic in winter. The seasonal-mean frequency of cold ice-supersaturated regions is highest near 300 hPa, and decreases with altitude. The frequency of occurrence of ice-supersaturated regions varies with large-scale weather pattern; the most common locations are over Greenland, on the southern side of the jet stream and around the northern edge of high pressure ridges. Assuming aircraft take a great circle route, as opposed to a more realistic time-optimal route, is likely to lead to an error in the estimated contrail coverage, which can exceed 50% for westbound north Atlantic flights. The probability of contrail formation can increase or decrease with height, depending on the weather pattern, indicating that the generic suggestion that flying higher leads to fewer contrails is not robust.
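The great-circle assumption discussed above can be made concrete with the haversine formula, which gives the shortest distance between two points on a sphere; real transatlantic routes deviate from this to exploit winds. The coordinates below are approximate and purely illustrative:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Approximate London Heathrow -> New York JFK
d = haversine_km(51.47, -0.46, 40.64, -73.78)
```

Sampling ice-supersaturation fields along the great circle versus along a time-optimal track through the same weather pattern is what produces the contrail-coverage discrepancy described in the abstract.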
Abstract:
The present study investigates the growth of error in baroclinic waves. It is found that stable or neutral waves are particularly sensitive to errors in the initial condition. Short stable waves are mainly sensitive to phase errors and the ultra-long waves to amplitude errors. Analysis simulation experiments have indicated that the amplitudes of the very long waves usually become too small in the free atmosphere, owing to the sparse and very irregular distribution of upper-air observations. This also applies to the four-dimensional data assimilation experiments, since the amplitudes of the very long waves are usually underpredicted. The numerical experiments reported here show that if the very long waves have these kinds of amplitude errors in the upper troposphere or lower stratosphere, the error is rapidly propagated (within a day or two) to the surface and to the lower troposphere.
Abstract:
In this paper, we investigate half-duplex two-way dual-hop channel state information (CSI)-assisted amplify-and-forward (AF) relaying in the presence of in-phase and quadrature-phase (I/Q) imbalance. A compensation approach for the I/Q imbalance is proposed, which employs the received signals together with their conjugations to detect the desired signal. We also derive the average symbol error probability of the considered half-duplex two-way dual-hop CSI-assisted AF relaying networks with and without compensation for I/Q imbalance in Rayleigh fading channels. Numerical results are provided and show that the proposed compensation method mitigates the impact of I/Q imbalance to a certain extent.
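For context on what an "average symbol error probability" computation involves, the sketch below checks a Monte Carlo estimate against the closed form for the simplest baseline: coherent BPSK over a single-hop flat Rayleigh channel with ideal I/Q. The paper's two-way dual-hop AF setting with I/Q imbalance is considerably more involved; all names here are ours:

```python
import math
import random

def bpsk_rayleigh_sep(snr_db, n=100_000, seed=1):
    """Monte Carlo symbol error probability for coherent BPSK over a
    flat Rayleigh fading channel (single hop, no relay, ideal I/Q)."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)
    errors = 0
    for _ in range(n):
        # |h| for h ~ CN(0, 1): Rayleigh-distributed fading amplitude
        h = math.hypot(rng.gauss(0, math.sqrt(0.5)),
                       rng.gauss(0, math.sqrt(0.5)))
        y = h * 1.0 + rng.gauss(0, math.sqrt(1 / (2 * snr)))  # send s = +1
        if y < 0:  # ML decision for BPSK with known h > 0
            errors += 1
    return errors / n

def bpsk_rayleigh_sep_exact(snr_db):
    """Closed-form average SEP for BPSK in Rayleigh fading."""
    g = 10 ** (snr_db / 10)
    return 0.5 * (1 - math.sqrt(g / (1 + g)))

sim = bpsk_rayleigh_sep(10.0)
theory = bpsk_rayleigh_sep_exact(10.0)
```

Agreement between simulation and closed form at a given SNR is the standard sanity check behind the "numerical results" such papers report.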
Abstract:
In probabilistic decision tasks, an expected value (EV) of a choice is calculated, and after the choice has been made, this can be updated based on a temporal difference (TD) prediction error between the EV and the reward magnitude (RM) obtained. The EV is measured as the probability of obtaining a reward multiplied by the RM. To understand the contribution of different brain areas to these decision-making processes, functional magnetic resonance imaging activations related to EV versus RM (or outcome) were measured in a probabilistic decision task. Activations in the medial orbitofrontal cortex were correlated with both RM and with EV and confirmed in a conjunction analysis to extend toward the pregenual cingulate cortex. From these representations, TD reward prediction errors could be produced. Activations in areas that receive from the orbitofrontal cortex including the ventral striatum, midbrain, and inferior frontal gyrus were correlated with the TD error. Activations in the anterior insula were correlated negatively with EV, occurring when low reward outcomes were expected, and also with the uncertainty of the reward, implicating this region in basic and crucial decision-making parameters, low expected outcomes, and uncertainty.
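The EV and TD-error quantities defined above reduce to simple arithmetic for a single trial; a minimal worked example (values are illustrative):

```python
# Expected value and temporal-difference prediction error for one trial.
p_reward = 0.7      # probability of obtaining the reward
magnitude = 10.0    # reward magnitude (RM)
ev = p_reward * magnitude            # EV = p x RM = 7.0

outcome = 10.0                       # reward actually delivered
td_error = outcome - ev              # positive: better than expected

outcome_omitted = 0.0                # reward omitted
td_error_omitted = outcome_omitted - ev  # negative: worse than expected
```

A positive TD error signals "better than expected" and a negative one "worse than expected", which is the quantity the ventral striatum and midbrain activations were found to track.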
Abstract:
Representation error arises from the inability of the forecast model to accurately simulate the climatology of the truth. We present a rigorous framework for understanding this kind of error of representation. This framework shows that the lack of an inverse in the relationship between the true climatology (true attractor) and the forecast climatology (forecast attractor) leads to the error of representation. A new gain matrix for the data assimilation problem is derived that illustrates the proper approaches one may take to perform Bayesian data assimilation when the observations are of states on one attractor but the forecast model resides on another. This new data assimilation algorithm is the optimal scheme for the situation where the distributions on the true attractor and the forecast attractor are separately Gaussian and there exists a linear map between them. The results of this theory are illustrated in a simple Gaussian multivariate model.
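As a baseline for the gain-matrix idea, the sketch below shows a minimal one-dimensional Kalman-style update; the paper's contribution generalises the gain to handle a linear map between the true and forecast attractors. Nothing here is the paper's notation, just the standard update it builds on:

```python
def kalman_update_1d(xb, b_var, y, r_var, h=1.0):
    """Scalar Kalman-style analysis update.
    xb: background (forecast) state, b_var: background error variance,
    y: observation, r_var: observation error variance,
    h: linear map from state space to observation space."""
    s = h * b_var * h + r_var        # innovation variance
    k = b_var * h / s                # gain
    xa = xb + k * (y - h * xb)       # analysis state
    a_var = (1 - k * h) * b_var      # analysis error variance
    return xa, a_var, k

# Background 1.0 (variance 1), observation 2.0 (variance 0.5):
xa, a_var, k = kalman_update_1d(1.0, 1.0, 2.0, 0.5)
```

The analysis lands between background and observation, weighted by the gain; in the paper's framework, `h` is replaced by the linear map between attractors rather than a simple observation operator.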
Abstract:
Recent work has shown that both the amplitude of upper-level Rossby waves and the tropopause sharpness decrease with forecast lead time for several days in some operational weather forecast systems. In this contribution, the evolution of error growth in a case study of this forecast error type is diagnosed through analysis of operational forecasts and hindcast simulations. Potential vorticity (PV) on the 320-K isentropic surface is used to diagnose Rossby waves. The Rossby-wave forecast error in the operational ECMWF high-resolution forecast is shown to be associated with errors in the forecast of a warm conveyor belt (WCB) through trajectory analysis and an error metric for WCB outflows. The WCB forecast error is characterised by an overestimation of WCB amplitude, a location of the WCB outflow regions that is too far to the southeast, and a resulting underestimation of the magnitude of the negative PV anomaly in the outflow. Essentially the same forecast error development also occurred in all members of the ECMWF Ensemble Prediction System and the Met Office MOGREPS-15, suggesting that in this case model error made an important contribution to the development of forecast error in addition to initial condition error. Exploiting this forecast error robustness, a comparison was performed between the realised flow evolution, proxied by a sequence of short-range simulations, and a contemporaneous forecast. Both the proxy to the realised flow and the contemporaneous forecast were produced with the Met Office Unified Model enhanced with tracers of diabatic processes modifying potential temperature and PV. Clear differences were found in the way potential temperature and PV are modified in the WCB between proxy and forecast. These results demonstrate that differences in potential temperature and PV modification in the WCB can be responsible for forecast errors in Rossby waves.
Abstract:
Satellite-based rainfall monitoring is widely used for climatological studies because of its full global coverage, but it is also of great importance for operational purposes, especially in areas such as Africa where there is a lack of ground-based rainfall data. Satellite rainfall estimates have enormous potential benefits as input to hydrological and agricultural models because of their real-time availability, low cost and full spatial coverage. One issue that needs to be addressed is the uncertainty on these estimates. This is particularly important in assessing the likely errors on the output from non-linear models (rainfall-runoff or crop yield) which make use of the rainfall estimates, aggregated over an area, as input. Correct assessment of the uncertainty on the rainfall is non-trivial, as it must take account of:
• the difference in spatial support of the satellite information and the independent data used for calibration
• uncertainties on the independent calibration data
• the non-Gaussian distribution of rainfall amount
• the spatial intermittency of rainfall
• the spatial correlation of the rainfall field
This paper describes a method for estimating the uncertainty on satellite-based rainfall values taking account of these factors. The method involves firstly a stochastic calibration, which completely describes the probability of rainfall occurrence and the pdf of rainfall amount for a given satellite value, and secondly the generation of an ensemble of rainfall fields based on the stochastic calibration but with the correct spatial correlation structure within each ensemble member. This is achieved by the use of geostatistical sequential simulation. The ensemble generated in this way may be used to estimate uncertainty at larger spatial scales. A case study of daily rainfall monitoring in the Gambia, west Africa for the purpose of crop yield forecasting is presented to illustrate the method.
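The two-step stochastic structure (occurrence, then amount) can be sketched for a single pixel as below. The occurrence probability and lognormal parameters are illustrative placeholders, not a fitted calibration, and the sketch deliberately omits the spatial correlation that the paper's geostatistical sequential simulation supplies:

```python
import math
import random

def sample_rainfall(sat_value, rng):
    """Two-step stochastic draw conditioned on a satellite value:
    1) Bernoulli rain/no-rain occurrence (spatial intermittency),
    2) lognormal amount if rain occurs (skewed, non-Gaussian)."""
    p_rain = min(0.95, 0.1 + 0.02 * sat_value)  # placeholder calibration
    if rng.random() > p_rain:
        return 0.0                               # dry pixel
    mu = math.log(1 + 0.3 * sat_value)           # placeholder amount model
    return rng.lognormvariate(mu, 0.8)

rng = random.Random(42)
ensemble = [sample_rainfall(20.0, rng) for _ in range(1000)]
```

Drawing many such realisations per pixel, with spatially correlated draws across pixels, yields the ensemble of rainfall fields used to propagate uncertainty through non-linear crop or runoff models.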
Abstract:
Identifying the signature of global warming in the world's oceans is challenging because low-frequency circulation changes can dominate local temperature changes. The IPCC fourth assessment reported an average ocean heating rate of 0.21 ± 0.04 Wm−2 over the period 1961–2003, with considerable spatial, interannual and inter-decadal variability. We present a new analysis of millions of ocean temperature profiles designed to filter out local dynamical changes to give a more consistent view of the underlying warming. Time series of temperature anomaly for all waters warmer than 14°C show large reductions in interannual to inter-decadal variability and a more spatially uniform upper ocean warming trend (0.12 Wm−2 on average) than previous results. This new measure of ocean warming is also more robust to some sources of error in the ocean observing system. Our new analysis provides a useful addition to the traditional fixed-depth analyses for the evaluation of coupled climate models.
Abstract:
Remote sensing can potentially provide information useful in improving pollution transport modelling in agricultural catchments. Realisation of this potential will depend on the availability of the raw data, development of information extraction techniques, and the impact of the assimilation of the derived information into models. High spatial resolution hyperspectral imagery of a farm near Hereford, UK is analysed. A technique is described to automatically identify the soil and vegetation endmembers within a field, enabling vegetation fractional cover estimation. The aerially-acquired laser altimetry is used to produce digital elevation models of the site. At the subfield scale the hypothesis that higher resolution topography will make a substantial difference to contaminant transport is tested using the AGricultural Non-Point Source (AGNPS) model. Slope aspect and direction information are extracted from the topography at different resolutions to study the effects on soil erosion, deposition, runoff and nutrient losses. Field-scale models are often used to model drainage water, nitrate and runoff/sediment loss, but the demanding input data requirements make scaling up to catchment level difficult. By determining the input range of spatial variables gathered from EO data, and comparing the response of models to the range of variation measured, the critical model inputs can be identified. Response surfaces to variation in these inputs constrain uncertainty in model predictions and are presented. Although optical earth observation analysis can provide fractional vegetation cover, cloud cover and semi-random weather patterns can hinder data acquisition in Northern Europe. A Spring and Autumn cloud cover analysis is carried out over seven UK sites close to agricultural districts, using historic satellite image metadata, climate modelling and historic ground weather observations. 
Results are assessed in terms of probability of acquisition and implications for future earth observation missions.
Abstract:
Structure is an important physical feature of the soil that is associated with water movement, the soil atmosphere, microorganism activity and nutrient uptake. A soil without any obvious organisation of its components is known as apedal, and this state can have marked effects on several soil processes. Accurate maps of topsoil and subsoil structure are desirable for a wide range of models that aim to predict erosion, solute transport, or flow of water through the soil. Such maps would also be useful to precision farmers when deciding how to apply nutrients and pesticides in a site-specific way, and to target subsoiling and soil structure stabilization procedures. Typically, soil structure is inferred from bulk density or penetrometer resistance measurements and, more recently, from soil resistivity and conductivity surveys. Measuring the former is both time-consuming and costly, whereas observations by the latter methods can be made automatically and swiftly using a vehicle-mounted penetrometer or resistivity and conductivity sensors. The results of each of these methods, however, are affected by other soil properties, in particular moisture content at the time of sampling, texture, and the presence of stones. Traditional methods of observing soil structure identify the type of ped and its degree of development. Methods of ranking such observations from good to poor for different soil textures have been developed. Indicator variograms can be computed for each category or rank of structure, and these can be summed to give the sum of indicator variograms (SIV). Observations of the topsoil and subsoil structure were made at four field sites where the soil had developed on different parent materials. The observations were ranked by four methods, and the indicator variograms and their sums were computed and modelled for each method of ranking.
The individual indicators were then kriged with the parameters of the appropriate indicator variogram model to map the probability of encountering soil with the structure represented by that indicator. The model parameters of the SIVs for each ranking system were used with the data to krige the soil structure classes, and the results are compared with those for the individual indicators. The relations between maps of soil structure and selected wavebands from aerial photographs are examined as a basis for planning surveys of soil structure.
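An experimental indicator variogram of the kind summed into the SIV can be sketched as follows: each observation is coded 1 if it belongs to the target structure class and 0 otherwise, and the semivariance of these indicators is computed per lag. This is a minimal illustration, not the paper's implementation:

```python
import math

def indicator_variogram(coords, categories, target, lags, tol):
    """Experimental indicator semivariogram for one structure class.
    Indicator i(x) = 1 where the observed class equals `target`.
    Returns gamma(h) = 0.5 * mean squared indicator difference
    over all point pairs whose separation is within `tol` of each lag."""
    ind = [1.0 if c == target else 0.0 for c in categories]
    gammas = []
    for h in lags:
        sq, n = 0.0, 0
        for a in range(len(coords)):
            for b in range(a + 1, len(coords)):
                if abs(math.dist(coords[a], coords[b]) - h) <= tol:
                    sq += (ind[a] - ind[b]) ** 2
                    n += 1
        gammas.append(0.5 * sq / n if n else float("nan"))
    return gammas

# Tiny transect: two 'good'-structure points then two 'poor' ones
coords = [(0.0,), (1.0,), (2.0,), (3.0,)]
cats = ["good", "good", "poor", "poor"]
g = indicator_variogram(coords, cats, "good", [1.0, 2.0], 0.01)
```

Fitting a model to such a variogram for each class, then kriging the indicators, yields the probability maps described in the abstract; summing the class variograms gives the SIV.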
Abstract:
To make informed decisions about taking medicinal drugs, people need accurate information about side-effects. A European Union guideline now recommends use of qualitative descriptions for five bands of risk, ranging from very rare (affecting < 0·01% of the population), to very common (>10%). We did four studies of more than 750 people, whom we asked to estimate the probability of having a side-effect on the basis of qualitative and quantitative descriptions. Our results showed that qualitative descriptions led to gross overestimation of risk. Until further work is done on how patients taking the drugs interpret these terms, the terms should not be used in drug information leaflets.
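The EU banding scheme can be written down as a simple mapping. Note the abstract gives only the extreme bands (very rare < 0.01%, very common > 10%); the middle cut-offs below follow the standard EU convention and should be treated as an assumption here:

```python
def descriptor_for(pct):
    """Map a side-effect frequency (percent of patients affected) to
    the EU verbal descriptor. Middle band boundaries are assumed from
    the standard EU convention, not stated in the abstract."""
    if pct >= 10.0:
        return "very common"
    if pct >= 1.0:
        return "common"
    if pct >= 0.1:
        return "uncommon"
    if pct >= 0.01:
        return "rare"
    return "very rare"
```

The studies' finding is that lay readers given the word "rare", for example, estimate a probability far above the band's true upper bound of 0.1%, which is why the authors caution against using the verbal terms alone.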
Abstract:
Objectives - To assess the general public's interpretation of the verbal descriptors for side effect frequency recommended for use in medicine information leaflets by a European Union (EU) guideline, and to examine the extent to which differences in interpretation affect people's perception of risk and their judgments of intention to comply with the prescribed treatment. Method - Two studies used a controlled empirical methodology in which people were presented with a hypothetical, but realistic, scenario about visiting their general practitioner and being prescribed medication. They were given an explanation that focused on the side effects of the medicine, together with information about the probability of occurrence using either numerical percentages or the corresponding EU verbal descriptors. Interpretation of the descriptors was assessed. In study 2, participants were also required to make various judgments, including risk to health and intention to comply. Key findings - In both studies, use of the EU recommended descriptors led to significant overestimations of the likelihood of particular side effects occurring. Study 2 further showed that the "overestimation" resulted in significantly increased ratings of perceived severity of side effects and risk to health, as well as significantly reduced ratings of intention to comply, compared with those for people who received the probability information in numerical form. Conclusion - While it is recognised that the current findings require replication in a clinical setting, the European and national authorities should suspend the use of the EU recommended terms until further research is available to allow the use of an evidence-based approach.
Abstract:
Model catalysts of Pd nanoparticles and films on TiO2(110) were fabricated by metal vapour deposition (MVD). Molecular beam measurements show that the particles are active for CO adsorption, with a global sticking probability of 0.25, but that they are deactivated by annealing above 600 K, an effect indicative of SMSI. The Pd nanoparticles are single crystals oriented with their (111) plane parallel to the surface plane of the titania. Analysis of the surface by atomic resolution STM shows that new structures have formed at the surface of the Pd nanoparticles and films after annealing above 800 K. There are only two structures: a zigzag arrangement and a much more complex "pinwheel" structure. The former has a unit cell containing 7 atoms, and the latter a bigger unit cell containing 25 atoms. These new structures are due to an overlayer of titania that has appeared on the surface of the Pd nanoparticles after annealing, and it is proposed that the surface layer that causes the SMSI effect is a mixed alloy of Pd and Ti with only two discrete ratios of atoms: Pd/Ti of 1:1 (pinwheel) and 1:2 (zigzag).
Abstract:
Aims To investigate the effects of electronic prescribing (EP) on prescribing quality, as indicated by prescribing errors and pharmacists' clinical interventions, in a UK hospital. Methods Prescribing errors and pharmacists' interventions were recorded by the ward pharmacist during a 4 week period both pre- and post-EP, with a second check by the principal investigator. The percentage of new medication orders with a prescribing error and/or pharmacist's intervention was calculated for each study period. Results Following the introduction of EP, there was a significant reduction in both pharmacists' interventions and prescribing errors. Interventions reduced from 73 (3.0% of all medication orders) to 45 (1.9%) (95% confidence interval (CI) for the absolute reduction 0.2, 2.0%), and errors from 94 (3.8%) to 48 (2.0%) (95% CI 0.9, 2.7%). Ten EP-specific prescribing errors were identified. Only 52% of pharmacists' interventions related to a prescribing error pre-EP, and 60% post-EP; only 40% and 56% of prescribing errors resulted in an intervention pre- and post-EP, respectively. Conclusions EP improved the quality of prescribing by reducing both prescribing errors and pharmacists' clinical interventions. Prescribers and pharmacists need to be aware of new types of error with EP, so that they can best target their activities to reduce clinical risk. Pharmacists may need to change the way they work to complement, rather than duplicate, the benefits of EP.
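The reported absolute reductions with 95% confidence intervals can be reproduced approximately with a two-proportion Wald interval. The denominators below are back-calculated from the reported percentages and are assumptions, not the study's exact figures:

```python
import math

def two_proportion_ci(e1, n1, e2, n2, z=1.96):
    """Absolute difference in proportions (p1 - p2) with a Wald 95% CI,
    e.g. pre- vs post-EP prescribing error rates per medication order."""
    p1, p2 = e1 / n1, e2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# 94 errors at ~3.8% pre-EP, 48 errors at ~2.0% post-EP
# (denominators inferred, for illustration only):
diff, lo, hi = two_proportion_ci(94, 2474, 48, 2400)
```

With these assumed denominators the interval comes out close to the reported 0.9% to 2.7% absolute reduction in prescribing errors; a lower limit above zero is what makes the reduction statistically significant.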