818 results for prescription misuse


Relevance: 10.00%

Abstract:

Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the proportion of patients at risk from hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods: In this pragmatic, cluster randomised trial, general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; and long-term prescription of angiotensin converting enzyme (ACE) inhibitors or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings: 72 general practices with a combined list size of 480 942 patients were randomised. At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0.58, 95% CI 0.38–0.89); a β blocker if they had asthma (0.73, 0.58–0.91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0.51, 0.34–0.78). PINCER has a 95% probability of being cost effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding: Patient Safety Research Portfolio, Department of Health, England.
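As a rough illustration of how odds ratios of this kind are computed, the sketch below derives a Wald 95% confidence interval for an odds ratio from a 2×2 table. The counts are hypothetical, not taken from the trial, and the trial itself used models accounting for clustering by practice, so this is only a minimal sketch of the underlying formula.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Wald 95% CI for the odds ratio of a 2x2 table.

    a: intervention group, error present   b: intervention group, error absent
    c: control group, error present        d: control group, error absent
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts of at-risk patients still exposed to an error at 6 months.
print(odds_ratio_ci(a=30, b=470, c=50, d=450))  # ~ (0.57, 0.36, 0.92)
```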

Relevance: 10.00%

Abstract:

Aim: To determine the prevalence and nature of prescribing errors in general practice, to explore their causes, and to identify defences against error. Methods: 1) systematic reviews; 2) retrospective review of unique medication items prescribed over a 12 month period to a 2% sample of patients from 15 general practices in England; 3) interviews with 34 prescribers regarding 70 potential errors, 15 root cause analyses, and six focus groups involving 46 primary health care team members. Results: The study involved examination of 6,048 unique prescription items for 1,777 patients. Prescribing or monitoring errors were detected for one in eight patients, involving around one in 20 of all prescription items. The vast majority of the errors were of mild to moderate severity, with one in 550 items being associated with a severe error. The following factors were associated with increased risk of prescribing or monitoring errors: male gender, age less than 15 years or greater than 64 years, number of unique medication items prescribed, and being prescribed preparations in the following therapeutic areas: cardiovascular, infections, malignant disease and immunosuppression, musculoskeletal, eye, ENT and skin. Prescribing or monitoring errors were not associated with the grade of GP or whether prescriptions were issued as acute or repeat items. A wide range of underlying causes of error were identified, relating to the prescriber, the patient, the team, the working environment, the task, the computer system and the primary/secondary care interface. Many defences against error were also identified, including strategies employed by individual prescribers and primary care teams, and making best use of health information technology. Conclusion: Prescribing errors in general practices are common, although severe errors are unusual. Many factors increase the risk of error. Strategies for reducing the prevalence of error should focus on GP training, continuing professional development for GPs, clinical governance, effective use of clinical computer systems, and improving safety systems within general practices and at the interface with secondary care.

Relevance: 10.00%

Abstract:

Recent concerns over the valuation process in collective leasehold enfranchisement and lease extension cases have culminated in new legislation. To underpin this, the Government (Department of the Environment, Transport and the Regions (DETR)) commissioned new research, which examined whether the valuation of the freehold in such cases could be simplified through the prescription of either yield or marriage value/relativity. This paper, which is based on that research, examines whether it is possible or desirable to prescribe such factors in the valuation process. Market, settlement and Local Valuation Tribunal (LVT) decisions are analysed, and the basis of 'relativity charts' used in practice is critically examined. Ultimately, the imperfect nature of the market in freehold investment sales and leasehold vacant possession sales means that recommendations must rest on an analysis of LVT data. New relativity curves are developed from these data and used in conjunction with an alternative approach to valuation yields (based on other investment assets). However, the paper concludes that although the prescription of yields and relativity is possible, it is not fully defensible: there are problems in determining risk premia, the evidential basis for relativity consists of LVT decisions, and a formula approach would tend to 'lead' the market as a whole.
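For readers unfamiliar with the terminology, the two quantities the research considered prescribing can be written as follows. This is a simplified sketch of the standard definitions, not the statutory valuation itself:

\[
\text{relativity} = \frac{\text{value of the existing lease}}{\text{freehold vacant possession value}},
\qquad
\text{marriage value} = V_{\text{merged}} - \left(V_{\text{freeholder}} + V_{\text{leaseholder}}\right),
\]

where \(V_{\text{merged}}\) is the value of the combined interest after enfranchisement and the bracketed terms are the values of the separate interests beforehand.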

Relevance: 10.00%

Abstract:

Around 40% of total energy consumption in the UK is used to create comfortable indoor environments for occupants. Occupants' behaviour in achieving thermal comfort can have a significant impact on a building's energy consumption. Understanding how occupants interact with their buildings is therefore essential for providing thermally comfortable environments that are less reliant on energy-intensive heating, ventilation and air-conditioning systems, and for meeting energy-saving and carbon emission targets. This paper presents the findings of a year-long field study conducted in non-air-conditioned office buildings in the UK. Occupants' adaptive responses, in both their technological and personal dimensions, are dynamic processes that vary with indoor and outdoor thermal conditions. The adaptive behaviours of occupants in the surveyed buildings show substantial seasonal and daily variations. Our study shows that non-physical factors such as habit can influence occupants' adaptive responses. However, occupants sometimes displayed inappropriate adaptive behaviour, which can lead to a misuse of energy. This paper illustrates how occupants adapt to and interact with their built environment, and thereby contributes to the development of a guide for the future design and refurbishment of buildings and of energy management systems for a comfortable built environment.

Relevance: 10.00%

Abstract:

Organizations introduce acceptable use policies to deter employee computer misuse. Despite the controls, monitoring and other interventions employed, some employees misuse organizational computers to carry out personal work such as sending emails, surfing the Internet, chatting and playing games. These activities not only waste employees' productive time but also pose a risk to the organization. A questionnaire was administered to a random sample of employees from large and medium scale software development organizations, measuring levels of work computer misuse and the factors that influence such behavior. The presence of guidelines showed no significant effect on the level of employee computer misuse. Lack of access to the Internet and email away from work, and organizational settings, were identified as the most significant influences on work computer misuse.

Relevance: 10.00%

Abstract:

Objective: To determine the prevalence and nature of prescribing and monitoring errors in general practices in England. Design: Retrospective case note review of unique medication items prescribed over a 12 month period to a 2% random sample of patients. Mixed effects logistic regression was used to analyse the data. Setting: Fifteen general practices across three primary care trusts in England. Data sources: Examination of 6048 unique prescription items prescribed over the previous 12 months for 1777 patients. Main outcome measures: Prevalence and severity of prescribing and monitoring errors, using validated definitions. Results: Prescribing and/or monitoring errors were detected in 4.9% (296/6048) of all prescription items (95% confidence interval 4.4% to 5.5%). The vast majority of errors were of mild to moderate severity, with 0.2% (11/6048) of items having a severe error. After adjusting for covariates, patient-related factors associated with an increased risk of prescribing and/or monitoring errors were: age less than 15 years (odds ratio (OR) 1.87, 1.19 to 2.94, p=0.006) or greater than 64 years (OR 1.68, 1.04 to 2.73, p=0.035), and a higher number of unique medication items prescribed (OR 1.16, 1.12 to 1.19, p<0.001). Conclusion: Prescribing and monitoring errors are common in English general practice, although severe errors are unusual. Many factors increase the risk of error. Having identified the most common and important errors and the factors associated with them, strategies to prevent future errors should be developed from the study findings.
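As a quick plausibility check on the headline figure (this is not the paper's method, which was a mixed effects logistic regression accounting for clustering), a simple normal-approximation binomial interval for 296/6048 comes out close to the quoted 4.4% to 5.5%:

```python
import math

errors, items = 296, 6048
p = errors / items                       # 0.0489 -> 4.9%
se = math.sqrt(p * (1 - p) / items)      # normal-approximation standard error
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # 4.9% (4.4% to 5.4%)
```

The slightly wider published upper bound (5.5%) is expected, since the mixed effects model allows for clustering of errors within practices.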

Relevance: 10.00%

Abstract:

There are significant discrepancies between observational datasets of Arctic sea ice concentrations covering the last three decades, which result in differences of over 20% in Arctic summer sea ice extent/area and 5%–10% in winter. Previous modeling studies have shown that idealized sea ice anomalies can have a substantial impact on climate. In this paper, this line of work is developed further by performing a set of simulations using the atmospheric component of the third Hadley Centre Coupled Model (HadAM3). The model was driven with monthly climatologies of sea ice fractions derived from three of these records to investigate the potential implications of sea ice inaccuracies for climate simulations; the standard sea ice climatology from the Met Office provided a control. This study focuses on the effects of actual inaccuracies in concentration retrievals, which vary spatially and are larger in summer than in winter. The smaller sea ice discrepancies in winter have a much larger influence on climate than the much greater summer differences. High sensitivity to the sea ice prescription was observed, even though no SST feedbacks were included. Significant effects on surface fields were observed in the Arctic, North Atlantic, and North Pacific. Arctic average surface air temperature anomalies in winter vary by 2.5°C, and locally exceed 12°C. Arctic mean sea level pressure varies by up to 5 mb locally. Anomalies extend to 45°N over North America and Eurasia but not to lower latitudes, with limited changes in circulation above the boundary layer. No statistically significant impact on climate variability, in terms of the North Atlantic Oscillation, was simulated. The results suggest that uncertainty in the summer sea ice prescription is not critical but that winter values require greater accuracy, with the caveat that ocean–sea ice feedbacks were not included in this study.
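Arctic-average quantities like the temperature anomalies quoted above are conventionally computed as cosine-of-latitude weighted means. The sketch below shows that convention; the grid, the 70°N cap and the placeholder data are assumptions for illustration, not details from the paper:

```python
import numpy as np

def arctic_mean(field, lats, lat_min=70.0):
    """Cos-latitude weighted mean of a (lat, lon) field north of lat_min."""
    mask = lats >= lat_min
    weights = np.cos(np.deg2rad(lats[mask]))   # area weight per latitude band
    zonal = field[mask].mean(axis=1)           # average over longitude first
    return np.sum(weights * zonal) / np.sum(weights)

lats = np.linspace(-90.0, 90.0, 73)            # hypothetical 2.5-degree grid
anomaly = np.random.randn(73, 96)              # placeholder temperature anomaly
print(arctic_mean(anomaly, lats))
```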

Relevance: 10.00%

Abstract:

Simulated multi-model "diversity" in aerosol direct radiative forcing estimates is often perceived as a measure of aerosol uncertainty. However, current models used for aerosol radiative forcing calculations vary considerably in the model components relevant to forcing calculations, and the associated "host-model uncertainties" are generally convoluted with the actual aerosol uncertainty. In this AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties in aerosol forcing experiments through the prescription of identical aerosol radiative properties in twelve participating models. Even with prescribed aerosol radiative properties, simulated clear-sky and all-sky aerosol radiative forcings show significant diversity. For a purely scattering case with a globally constant optical depth of 0.2, the global-mean all-sky top-of-atmosphere radiative forcing is −4.47 W m−2 and the inter-model standard deviation is 0.55 W m−2, corresponding to a relative standard deviation of 12%. For a case with partially absorbing aerosol with an aerosol optical depth of 0.2 and single scattering albedo of 0.8, the forcing changes to 1.04 W m−2, and the standard deviation increases to 1.01 W m−2, corresponding to a significant relative standard deviation of 97%. However, the top-of-atmosphere forcing variability owing to absorption (subtracting the scattering case from the case with scattering and absorption) is low, with absolute (relative) standard deviations of 0.45 W m−2 (8%) clear-sky and 0.62 W m−2 (11%) all-sky. Scaling the forcing standard deviation for the purely scattering case to match the sulfate radiative forcing in the AeroCom Direct Effect experiment demonstrates that host model uncertainties could explain about 36% of the overall sulfate forcing diversity of 0.11 W m−2 in the AeroCom Direct Radiative Effect experiment.
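The relative standard deviations quoted above follow directly from the stated means and inter-model standard deviations; a two-line check:

```python
# (global-mean forcing, inter-model standard deviation), both in W m-2
cases = {"scattering only, all-sky": (-4.47, 0.55),
         "scattering + absorption, all-sky": (1.04, 1.01)}
for name, (mean, sd) in cases.items():
    print(f"{name}: {sd / abs(mean):.0%}")  # -> 12% and 97%
```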

Relevance: 10.00%

Abstract:

Anxiety disorders in childhood and adolescence are extremely common and are often associated with lifelong psychiatric disturbance. Consistent with DSM-5 and the extant literature, this review concerns the assessment and treatment of specific phobias, separation anxiety disorder, generalised anxiety disorder, social anxiety disorder, panic disorder and agoraphobia. Evidence-based psychological treatments (cognitive behaviour therapy; CBT) for these disorders have been developed and investigated, and in recent years promising low-intensity versions of CBT interventions have been proposed that offer a means of increasing access to evidence-based treatments. There is some evidence of the effectiveness of pharmacological treatments for anxiety disorders in children and young people; however, routine prescription is not recommended because of concerns about potential harm.

Relevance: 10.00%

Abstract:

It is a known fact that some employees misuse organizational computers for personal work such as sending emails, surfing the Internet, chatting and playing games. These activities not only waste employees' productive time but also bring a risk factor to the organization. This particularly affects organizations in the software industry, as almost all of their employees are connected to the Internet throughout the day.

By introducing an Acceptable Use Policy (AUP), it is believed that computer misuse by employees can be reduced. Acceptable Use Policies are used in many countries and have been studied from various perspectives; in the Sri Lankan context, research in this area is scarce. This research explored the situation in Sri Lanka with respect to AUPs and their effectiveness.

A descriptive study was carried out to identify the large and medium scale software development organizations that had implemented computer usage guidelines for employees. A questionnaire was used to gather information regarding employees' usual computer usage behavior. Stratified random sampling was employed to draw a representative sample from the population.

The majority of the organizations had not employed a written guideline on acceptable use of work computers. The results did not provide evidence to conclude that the presence or absence of an AUP makes a significant difference in employees' computer use behaviors. A significant negative correlation was observed between the level of awareness about the AUP and misuse. Access to the Internet and organizational settings were identified as significant factors influencing employee computer misuse behavior.

Relevance: 10.00%

Abstract:

The climate over the Arctic has undergone changes in recent decades. In order to evaluate the coupled response of the Arctic system to external and internal forcing, our study focuses on the estimation of regional climate variability and its dependence on large-scale atmospheric and regional ocean circulations. A global ocean–sea ice model with regionally high horizontal resolution is coupled to an atmospheric regional model and a global terrestrial hydrology model. This way of coupling divides the global ocean model setup into two domains: one coupled, where the ocean and the atmosphere interact, and one uncoupled, where the ocean model is driven by prescribed atmospheric forcing and runs in a so-called stand-alone mode. Selecting a specific area for the regional atmosphere therefore means that the ocean–atmosphere system can develop 'freely' in that area, whereas over the rest of the global ocean the circulation is driven by prescribed atmospheric forcing without any feedbacks. Five different coupled setups were chosen for ensemble simulations. The coupled domains were chosen to estimate the influences of the Subtropical Atlantic, Eurasian and North Pacific regions on northern North Atlantic and Arctic climate. Our simulations show that the regional coupled ocean–atmosphere model is sensitive to the choice of the modelled area: the different model configurations reproduce both the mean climate and its variability differently. Only two out of five model setups were able to reproduce the Arctic climate as observed under recent climate conditions (ERA-40 reanalysis). Evidence is found that the main source of uncertainty for Arctic climate variability and its predictability is the North Pacific. The prescription of North Pacific conditions in the regional model leads to significant correlation with observations, even if the whole North Atlantic is within the coupled model domain. However, the inclusion of the North Pacific area in the coupled system drastically changes the Arctic climate variability, to the point where the Arctic Oscillation becomes an 'internal mode' of variability and correlations of year-to-year variability with observational data vanish. In line with previous studies, our simulations provide evidence that Arctic sea ice export is mainly driven by 'internal variability' within the Arctic region. We conclude that the choice of model domains should be based on physical knowledge of the atmospheric and oceanic processes, not on 'geographic' reasons. This is particularly the case for areas like the Arctic, where there are very complex feedbacks between components of the regional climate system.

Relevance: 10.00%

Abstract:

This paper presents a novel application of virtual environments to assist in encouraging behavior change in individuals who misuse drugs or alcohol. We describe the user-centered design of a series of scenes to engage users in the identification of triggers and to encourage discussions about relevant coping skills. Results from the initial testing of this application with six service users showed variation in user responses. Results also suggested that the system should encourage group discussion and that it was linked to a small improvement in users’ confidence in understanding and identifying triggers.

Relevance: 10.00%

Abstract:

This article explores the fine detail of the practice of three teachers recognised as effective teachers of literacy. All three were observed during nine literacy lessons, working with Year 2 classes (6–7 year olds) in successful inner-city primary schools in the South of England. Data collection took place in 2003, just as their schools were moving away from the early prescription of the National Literacy Strategy (NLS), and follow-up visits were made in 2005. My initial interest was in what these three teachers did with the NLS to motivate pupils and ensure high pupil attainment. Following observations, interviews and coding of teacher-pupil interaction, it became clear that the NLS Framework for Teaching (DfES, 2001) was not the driver of their success but a valuable vehicle for subtle and intuitive teacher behaviours that grew from a detailed understanding of how children develop as readers and writers. Implications for training student teachers to marry theoretical understanding with the expectations of a prescribed curriculum for literacy are discussed.

Relevance: 10.00%

Abstract:

A causal explanation provides information about the causal history of whatever is being explained. However, most causal histories extend back almost infinitely and can be described in almost infinite detail. Causal explanations therefore involve choices about which elements of causal histories to pick out. These choices are pragmatic: they reflect our explanatory interests. When adjudicating between competing causal explanations, we must therefore consider not only questions of epistemic adequacy (whether we have good grounds for identifying certain factors as causes) but also questions of pragmatic adequacy (whether the aspects of the causal history picked out are salient to our explanatory interests). Recognizing that causal explanations differ pragmatically as well as epistemically is crucial for identifying what is at stake in competing explanations of the relative peacefulness of the nineteenth-century Concert system. It is also crucial for understanding how explanations of past events can inform policy prescription.

Relevance: 10.00%

Abstract:

Although there are signs of decline, homicides and traffic-related injuries and deaths in Brazil account for almost two-thirds of all deaths from external causes. In 2007, the homicide rate was 26.8 per 100 000 people and traffic-related mortality was 23.5 per 100 000. Domestic violence might not lead to as many deaths, but its share of violence-related morbidity is large. These are important public health problems that lead to enormous individual and collective costs. Young, black, and poor men are the main victims and perpetrators of community violence, whereas poor black women and children are the main victims of domestic violence. Regional differentials are also substantial. Besides the sociocultural determinants, much of the violence in Brazil has been associated with the misuse of alcohol and illicit drugs, and the wide availability of firearms. The high traffic-related morbidity and mortality in Brazil have been linked to the chosen model for the transport system that has given priority to roads and private-car use without offering adequate infrastructure. The system is often poorly equipped to deal with violations of traffic rules. In response to the major problems of violence and injuries, Brazil has greatly advanced in terms of legislation and action plans. The main challenge is to assess these advances to identify, extend, integrate, and continue the successful ones.