27 results for PBS prescriptions
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
This paper focuses upon the policy and institutional change that has taken place within the Argentine electricity market since the country’s economic and social crisis of 2001/2. As one of the first less developed countries (LDCs) to liberalise and privatise its electricity industry, Argentina has since moved away from the orthodox market model after consumer prices were frozen by the Government in early 2002 when the national currency was devalued by 70%. Although its reforms were widely praised during the 1990s, the electricity market has undergone a number of interventions, ostensibly to keep consumer prices low and to avert the much-discussed energy ‘crisis’ caused by a dearth of new investment combined with rising demand levels. This paper explores how the economic crisis and its consequences have both enabled and legitimised these policy and institutional amendments, while drawing upon the specifics of the post-neoliberal market ‘re-reforms’ to consider the extent to which the Government appears to be moving away from market-based prescriptions. In addition, this paper contributes to sector-specific understandings of how, despite these changes, neoliberal ideas and assumptions continue to dominate Argentine public policy well beyond the post-crisis era.
Abstract:
This study evaluates computer-generated written explanations about drug prescriptions that are based on an analysis of both patient and doctor informational needs. Three experiments examine the effects of varying the type of information given about the possible side effects of the medication, and the order of information within the explanation. Experiment 1 investigated the effects of these two factors on people's ratings of how good they consider the explanations to be and of their perceived likelihood of taking the medication, as well as on their memory for the information in the explanation. Experiment 2 further examined the effects of varying information about side effects by separating out the contribution of number and severity of side effects. It was found that participants in this study did not “like” explanations that described severe side effects, and also judged that they would be less likely to take the medication if given such explanations. Experiment 3 therefore investigated whether information about severe side effects could be presented in such a way as to increase judgements of how good explanations are thought to be, as well as the perceived likelihood of adherence. The results showed some benefits of providing additional explanatory information.
Abstract:
A two-phase study is reported. In the first phase, we asked a number of doctors to rate a list of information categories (identified by Berry, Gillie and Banbury 1995) in terms of how important they felt it was for the items to be included in an explanation to a patient about a drug prescription. In the second phase, we presented a large sample of people with a scenario about visiting their doctor and being prescribed medication, together with an explanation about the prescription which was said to be provided by the doctor. Four different explanations were compared, which were either based on what people in our earlier study wanted to know about drug prescriptions or on what the doctors thought it was important to tell them. We also manipulated whether or not the explanations conveyed negative information (e.g. about the possible side effects of the medication). The results showed that people 'preferred' the explanations based on what the participants in the earlier study wanted to know about their medicines, rather than those based on what the doctors thought they should be told. They also 'preferred' the explanations that did not convey negative information, rather than those that did convey some negative information. In addition, the inclusion of negative information affected ratings of likely compliance with the prescribed medication.
Abstract:
We present a procedure for estimating two quantities defining the spatial externality in discrete choice commonly referred to as 'the neighbourhood effect'. One quantity, the propensity for neighbours to make the same decision, reflects traditional preoccupations; the other quantity, the magnitude of the neighbourhood itself, is novel. Because both quantities have fundamental bearing on the magnitude of the spatial externality, it is desirable to have a robust algorithm for their estimation. Using recent advances in Bayesian estimation and model comparison, we devise such an algorithm and illustrate its application to a sample of northern-Filipino smallholders. We determine that a significant, positive, neighbourhood effect exists; that, among the 12 geographical units comprising the sample, the neighbourhood spans a three-unit radius; and that policy prescriptions are significantly altered when calculations account for the spatial externality.
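The abstract above does not spell out the estimation machinery, so the following is only a minimal sketch of the underlying idea, not the authors' Bayesian algorithm: for each candidate neighbourhood radius, a unit's binary decision is regressed on the mean decision of its neighbours within that radius, and candidate radii are compared by model fit. The data, variable names and the use of a plain logit in place of Bayesian estimation and model comparison are all illustrative assumptions.

```python
# Simplified frequentist stand-in for the neighbourhood-effect estimation
# described above; all data and names below are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

n_units = 120
coords = rng.uniform(0, 12, size=(n_units, 2))   # hypothetical unit locations
income = rng.normal(0, 1, n_units)               # hypothetical covariate
decision = rng.binomial(1, 0.5, n_units)         # hypothetical binary choices

# Pairwise distances between units.
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)

def neighbour_mean(radius):
    """Mean decision of neighbours within `radius` (self excluded); 0 if none."""
    mask = (dist <= radius) & (dist > 0)
    counts = mask.sum(axis=1)
    totals = (mask * decision).sum(axis=1).astype(float)
    return np.where(counts > 0, totals / np.maximum(counts, 1), 0.0)

fits = {}
for radius in (1, 2, 3, 4, 5):                   # candidate neighbourhood sizes
    X = sm.add_constant(np.column_stack([income, neighbour_mean(radius)]))
    res = sm.Logit(decision, X).fit(disp=0)
    fits[radius] = (res.aic, res.params[-1])     # (fit quality, neighbourhood coefficient)

best = min(fits, key=lambda r: fits[r][0])
print("preferred radius:", best, "| neighbourhood coefficient:", round(fits[best][1], 3))
```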
Abstract:
This paper presents the findings from a recent study funded by the Joseph Rowntree Foundation examining the housing and neighbourhood needs of 44 visually impaired children. Our research found that provision for disabled people’s needs has been too narrowly based on ‘accessibility’ criteria, which do not take into account the health and safety issues so important for children. Indeed, the home environment is the main site of accidental death or injury for young children under 4 years, and children from low income families are particularly susceptible to burns, scalds, falls, swallowing foreign objects or poisonous substances within it (CRDU 1994). As disabled children are statistically more likely to be in low income families, this places them at high risk. If ‘accessibility’ is to be reconceived as design for usability throughout the lifecourse, this challenges us to move beyond the pragmatic but limited application of design prescriptions for disabled people as a separate and adult group, and to re-think all of the dimensions of the housing quality framework in the light of this expanded approach.
Abstract:
Our objective in this study was to develop and implement an effective intervention strategy to manipulate the amount and composition of dietary fat and carbohydrate (CHO) in free-living individuals in the RISCK study. The study was a randomized, controlled dietary intervention study that was conducted in 720 participants identified as higher risk for or with metabolic syndrome. All followed a 4-wk run-in reference diet [high saturated fatty acids (SF)/high glycemic index (GI)]. Volunteers were randomized to continue this diet for a further 24 wk or to 1 of 4 isoenergetic prescriptions [high monounsaturated fatty acids (MUFA)/high GI; high MUFA/low GI; low fat (LF)/high GI; and LF/low GI]. We developed a food exchange model to implement each diet. Dietary records and plasma phospholipid fatty acids were used to assess the effectiveness of the intervention strategy. Reported fat intake from the LF diets was significantly reduced to 28% of energy (%E) compared with 38% E from the HM and reference diets. SF intake was successfully decreased in the HM and LF diets to ~10% E compared with 17% E in the reference diet (P = 0.001). Dietary MUFA in the HM diets was ~17% E, significantly higher than in the reference (12% E) and LF diets (10% E) (P = 0.001). Changes in plasma phospholipid fatty acids provided further evidence for the successful manipulation of fat intake. The GI of the HGI and LGI arms differed by ~9 points (P = 0.001). The food exchange model provided an effective dietary strategy for the design and implementation across multiple sites of 5 experimental diets with specific targets for the proportion of fat and CHO. J. Nutr. 139: 1534-1540, 2009.
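As an aside on the "% of energy" (%E) units used in the abstract above: targets such as 28 %E versus 38 %E fat are conventionally computed from reported gram intakes using the standard Atwater energy factors. The snippet below is purely illustrative; the intake values are made up and it is not code or data from the RISCK study.

```python
# Convert reported daily macronutrient intakes (grams) into % of energy (%E)
# using standard Atwater factors. Example values are hypothetical.
ATWATER_KCAL_PER_G = {"fat": 9.0, "cho": 4.0, "protein": 4.0, "alcohol": 7.0}

def percent_energy(intake_g: dict) -> dict:
    """Return each macronutrient's share of total energy intake, in %E."""
    energy = {k: g * ATWATER_KCAL_PER_G[k] for k, g in intake_g.items()}
    total = sum(energy.values())
    return {k: round(100 * e / total, 1) for k, e in energy.items()}

example_day = {"fat": 70, "cho": 260, "protein": 90, "alcohol": 0}  # grams, made up
print(percent_energy(example_day))  # fat works out to roughly 31 %E here
```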
Abstract:
The assessment of cellular effects by the aqueous phase of human feces (fecal water, FW) is a useful biomarker approach to study cancer risks and protective activities of food. In order to refine and develop the biomarker, different protocols of preparing FW were compared. Fecal waters were prepared by 3 methods: (A) direct centrifugation; (B) extraction of feces in PBS before centrifugation; and (C) centrifugation of lyophilized and reconstituted feces. Genotoxicity was determined in colon cells using the Comet assay. Selected samples were investigated for additional parameters related to carcinogenesis. Two of 7 FWs obtained by methods A and B were similarly genotoxic. Method B, however, yielded higher volumes of FW, allowing sterile filtration for long-term culture experiments. Four of 7 samples were non-genotoxic when prepared according to all 3 methods. FW from lyophilized feces and from fresh samples were equally genotoxic. FWs modulated cytotoxicity, paracellular permeability, and invasion, independent of their genotoxicity. All 3 methods of FW preparation can be used to assess genotoxicity. The higher volumes of FW obtained by preparation method B greatly enhance the perspectives of measuring different types of biological parameters and using these to disclose activities related to cancer development.
Abstract:
Background: Medication errors are an important cause of morbidity and mortality in primary care. The aims of this study are to determine the effectiveness, cost effectiveness and acceptability of a pharmacist-led information-technology-based complex intervention compared with simple feedback in reducing proportions of patients at risk from potentially hazardous prescribing and medicines management in general (family) practice. Methods: Research subject group: "At-risk" patients registered with computerised general practices in two geographical regions in England. Design: Parallel group pragmatic cluster randomised trial. Interventions: Practices will be randomised to either: (i) Computer-generated feedback; or (ii) Pharmacist-led intervention comprising computer-generated feedback, educational outreach and dedicated support. Primary outcome measures: The proportion of patients in each practice at six and 12 months post intervention: - with a computer-recorded history of peptic ulcer being prescribed non-selective non-steroidal anti-inflammatory drugs - with a computer-recorded diagnosis of asthma being prescribed beta-blockers - aged 75 years and older receiving long-term prescriptions for angiotensin converting enzyme inhibitors or loop diuretics without a recorded assessment of renal function and electrolytes in the preceding 15 months. Secondary outcome measures: These relate to a number of other examples of potentially hazardous prescribing and medicines management. Economic analysis: An economic evaluation of the cost per error avoided will be conducted from the perspective of the UK National Health Service (NHS), comparing the pharmacist-led intervention with simple feedback. Qualitative analysis: A qualitative study will be conducted to explore the views and experiences of health care professionals and NHS managers concerning the interventions, and to investigate possible reasons why the interventions prove effective, or conversely prove ineffective. Sample size: 34 practices in each of the two treatment arms would provide at least 80% power (two-tailed alpha of 0.05) to demonstrate a 50% reduction in error rates for each of the three primary outcome measures in the pharmacist-led intervention arm compared with an 11% reduction in the simple feedback arm. Discussion: At the time of submission of this article, 72 general practices have been recruited (36 in each arm of the trial) and the interventions have been delivered. Analysis has not yet been undertaken.
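For readers unfamiliar with the sample-size statement at the end of the abstract, the sketch below shows the generic shape of such a calculation: a two-proportion normal-approximation formula inflated by a design effect for clustering. The baseline error rate, average cluster size and intracluster correlation are placeholders that are not reported in the abstract, so this is not the protocol's actual calculation.

```python
# Generic cluster-adjusted sample-size sketch for comparing two proportions.
# Placeholder inputs only; not the trial protocol's actual power calculation.
from scipy.stats import norm

def n_per_arm(p_control, p_intervention, alpha=0.05, power=0.80,
              cluster_size=1, icc=0.0):
    """Approximate participants per arm, inflated by the design effect
    1 + (cluster_size - 1) * icc used for cluster randomised designs."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    var = p_control * (1 - p_control) + p_intervention * (1 - p_intervention)
    n = (z_a + z_b) ** 2 * var / (p_control - p_intervention) ** 2
    return n * (1 + (cluster_size - 1) * icc)

baseline = 0.10  # hypothetical baseline proportion of at-risk patients
print(n_per_arm(p_control=baseline * (1 - 0.11),       # 11% relative reduction
                p_intervention=baseline * (1 - 0.50),  # 50% relative reduction
                cluster_size=100, icc=0.05))           # placeholder clustering values
```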
Abstract:
This paper describes a prototype grid infrastructure, called the eMinerals minigrid, for molecular simulation scientists, which is based on an integration of shared compute and data resources. We describe the key components, namely the use of Condor pools, Linux/Unix clusters with PBS and IBM's LoadLeveller job handling tools, the use of Globus for security handling, the use of Condor-G tools for wrapping Globus job submit commands, Condor's DAGMan tool for handling workflow, the Storage Resource Broker for handling data, and the CCLRC dataportal and associated tools for both archiving data with metadata and making data available to other workers.
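To give a flavour of the PBS job-handling layer mentioned above, the sketch below writes a minimal PBS batch script and submits it with qsub. It is illustrative only: the resource requests, file names and simulation executable are hypothetical, and the minigrid's Condor-G/Globus wrapping and DAGMan workflow handling are not reproduced here.

```python
# Write a minimal PBS batch script and submit it via qsub.
# Resource requests, paths and the executable name are hypothetical.
import subprocess
from pathlib import Path

PBS_SCRIPT = """#!/bin/bash
#PBS -N eminerals_demo
#PBS -l nodes=1:ppn=4
#PBS -l walltime=01:00:00
#PBS -j oe

cd $PBS_O_WORKDIR
./run_simulation input.dat   # hypothetical simulation executable
"""

def submit(script_text: str, path: str = "job.pbs") -> str:
    """Write the batch script to disk and submit it, returning the qsub job id."""
    Path(path).write_text(script_text)
    result = subprocess.run(["qsub", path], capture_output=True, text=True, check=True)
    return result.stdout.strip()

if __name__ == "__main__":
    print("submitted:", submit(PBS_SCRIPT))
```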
Abstract:
The self-assembly of a fragment of the amyloid beta peptide that has been shown to be critical in amyloid fibrillization has been studied in aqueous solution. There are conflicting reports in the literature on the fibrillization of A beta (16-20), i.e., KLVFF, and our results shed light on this. In dilute solution, self-assembly of NH2-KLVFF-COOH is strongly influenced by aromatic interactions between phenylalanine units, as revealed by UV spectroscopy and circular dichroism. Fourier transform infrared (FTIR) spectroscopy reveals beta-sheet features in spectra taken for more concentrated solutions and also dried films. X-ray diffraction and cryo-transmission electron microscopy (cryo-TEM) provide further support for beta-sheet amyloid fibril formation. A comparison of cryo-TEM images with those from conventional dried and negatively stained TEM specimens highlights the pronounced effects of sample preparation on the morphology. A comparison of FTIR data for samples in solution and dried samples also highlights the strong effect of drying on the self-assembled structure. In more concentrated phosphate-buffered saline (PBS) solution, gelation of NH2-KLVFF-COOH is observed. This is believed to be caused by screening of the electrostatic charge on the peptide, which enables beta sheets to aggregate into a fibrillar gel network. The rheology of the hydrogel is probed, and the structure is investigated by light scattering and small-angle X-ray scattering.
Abstract:
Howard Barker is a writer who has made several notable excursions into what he calls ‘the charnel house…of European drama.’ David Ian Rabey has observed that a compelling property of these classical works lies in what he calls ‘the incompleteness of [their] prescriptions’, and Barker’s Women Beware Women (1986), Seven Lears (1990) and Gertrude: The Cry (2002) are in turn based around the gaps and interstices found in Thomas Middleton’s Women Beware Women (c1627), Shakespeare’s King Lear (c1604) and Hamlet (c1601) respectively. This extends from representing the missing queen from King Lear, who, Barker observes, ‘is barely quoted even in the depths of rage or pity’, to his new ending for Middleton’s Jacobean tragedy and the erotic revivification of Hamlet’s mother. This paper will argue that each modern reappropriation accentuates a hidden but powerful feature in these Elizabethan and Jacobean plays – namely their clash between obsessive desire, sexual transgression and death against the imposed restitution of a prescribed morality. This contradiction acts as the basis for Barker’s own explorations of eroticism, death and tragedy. The paper will also discuss Barker’s project for these ‘antique texts’, one that goes beyond what he derisively calls ‘relevance’, but attempts instead to recover ‘smothered genius’, whereby the transgressive is ‘concealed within structures that lend an artificial elegance.’ Together with Barker’s own rediscovery of tragedy, the paper will assert that these rewritings of Elizabethan and Jacobean drama expose their hidden, yet unsettling and provocative ideologies concerning the relationship between political corruption / justice through the power of sexuality (notably through the allure and danger of the mature woman), and an erotics of death that produces tragedy for the contemporary age.
Abstract:
Aim: To determine the prevalence and nature of prescribing errors in general practice; to explore the causes, and to identify defences against error. Methods: 1) Systematic reviews; 2) Retrospective review of unique medication items prescribed over a 12 month period to a 2% sample of patients from 15 general practices in England; 3) Interviews with 34 prescribers regarding 70 potential errors; 15 root cause analyses, and six focus groups involving 46 primary health care team members. Results: The study involved examination of 6,048 unique prescription items for 1,777 patients. Prescribing or monitoring errors were detected for one in eight patients, involving around one in 20 of all prescription items. The vast majority of the errors were of mild to moderate severity, with one in 550 items being associated with a severe error. The following factors were associated with increased risk of prescribing or monitoring errors: male gender, age less than 15 years or greater than 64 years, number of unique medication items prescribed, and being prescribed preparations in the following therapeutic areas: cardiovascular, infections, malignant disease and immunosuppression, musculoskeletal, eye, ENT and skin. Prescribing or monitoring errors were not associated with the grade of GP or whether prescriptions were issued as acute or repeat items. A wide range of underlying causes of error were identified relating to the prescriber, patient, the team, the working environment, the task, the computer system and the primary/secondary care interface. Many defences against error were also identified, including strategies employed by individual prescribers and primary care teams, and making best use of health information technology. Conclusion: Prescribing errors in general practices are common, although severe errors are unusual. Many factors increase the risk of error. Strategies for reducing the prevalence of error should focus on GP training, continuing professional development for GPs, clinical governance, effective use of clinical computer systems, and improving safety systems within general practices and at the interface with secondary care.
Abstract:
The presence of resident Langerhans cells (LCs) in the epidermis makes the skin an attractive target for DNA vaccination. However, reliable animal models for cutaneous vaccination studies are limited. We demonstrate an ex vivo human skin model for cutaneous DNA vaccination which can potentially bridge the gap between pre-clinical in vivo animal models and clinical studies. Cutaneous transgene expression was utilised to demonstrate epidermal tissue viability in culture. LC response to the culture environment was monitored by immunohistochemistry. Full-thickness and split-thickness skin remained genetically viable in culture for at least 72 h in both phosphate-buffered saline (PBS) and full organ culture medium (OCM). The epidermis of explants cultured in OCM remained morphologically intact throughout the culture duration. LCs in full-thickness skin exhibited a delayed response (reduction in cell number and increase in cell size) to the culture conditions compared with split-thickness skin, whose response was immediate. In conclusion, excised human skin can be cultured for a minimum of 72 h for analysis of gene expression and immune cell activation. However, the use of split-thickness skin for vaccine formulation studies may not be appropriate because of the nature of the activation. Full-thickness skin explants are a more suitable model to assess cutaneous vaccination ex vivo.