877 results for pacs: equipment and software evaluation methods


Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE: Accelerated atherosclerosis and premature coronary heart disease (CHD) are recognized complications of systemic lupus erythematosus (SLE), but the exact etiology remains unclear and is likely to be multifactorial. We hypothesized that SLE patients with CHD have increased exposure to traditional risk factors as well as differing disease phenotype and therapy-related factors compared to SLE patients free of CHD. Our aim was to examine risk factors for development of clinical CHD in SLE in the clinical setting. METHODS: In a UK-wide multicenter retrospective case-control study we recruited 53 SLE patients with verified clinical CHD (myocardial infarction or angina pectoris) and 96 SLE patients without clinical CHD. Controls were recruited from the same center as the case and matched by disease duration. Charts were reviewed up to time of event for cases, or the same "dummy-date" in controls. RESULTS: SLE patients with clinical CHD were older at the time of event [mean (SD) 53 (10) vs 42 (10) yrs; p

Relevance:

100.00%

Publisher:

Abstract:

Randomising set index functions can reduce the number of conflict misses in data caches by spreading the cache blocks uniformly over all sets. Typically, the randomisation functions compute the exclusive ors of several address bits. Not all randomising set index functions perform equally well, which calls for the evaluation of many set index functions. This paper discusses and improves a technique that tackles this problem by predicting the miss rate incurred by a randomisation function, based on profiling information. A new way of looking at randomisation functions is used, namely the null space of the randomisation function. The members of the null space describe pairs of cache blocks that are mapped to the same set. This paper presents an analytical model of the error made by the technique and uses this to propose several optimisations to the technique. The technique is then applied to generate a conflict-free randomisation function for the SPEC benchmarks. (C) 2003 Elsevier Science B.V. All rights reserved.
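
The null-space view lends itself to a compact illustration. A minimal sketch, assuming a hypothetical cache with 64 sets and made-up XOR bit masks (none of these values come from the paper): each set-index bit is the exclusive OR of a subset of block-address bits, and two blocks collide exactly when the XOR of their addresses lies in the null space of that mapping.

# Illustrative sketch only (not taken from the paper): a randomising set
# index function built by XOR-folding block-address bits, together with the
# null-space view described above. The cache geometry (64 sets) and the
# bit masks are made-up values.

def parity(x: int) -> int:
    """XOR of all bits of x."""
    return bin(x).count("1") & 1

# Each of the 6 set-index bits is the XOR of the address bits selected by
# its mask; a conventional modulo index would simply use bits 0..5.
BIT_MASKS = [
    (1 << 0) | (1 << 6),    # index bit 0 = a0 ^ a6
    (1 << 1) | (1 << 7),    # index bit 1 = a1 ^ a7
    (1 << 2) | (1 << 8),
    (1 << 3) | (1 << 9),
    (1 << 4) | (1 << 10),
    (1 << 5) | (1 << 11),
]

def set_index(block_addr: int) -> int:
    """Randomised set index of a cache block address (6 bits -> 64 sets)."""
    return sum(parity(block_addr & mask) << i for i, mask in enumerate(BIT_MASKS))

def in_null_space(delta: int) -> bool:
    """A difference vector (the XOR of two block addresses) lies in the null
    space of the mapping exactly when it maps to set index 0."""
    return set_index(delta) == 0

# Because the mapping is linear over GF(2), two blocks are placed in the
# same set precisely when the XOR of their addresses lies in the null space.
a = 0x1234
b = a ^ (1 << 0) ^ (1 << 6)     # flip bits 0 and 6: the XOR difference is in the null space
assert in_null_space(a ^ b) == (set_index(a) == set_index(b))
print(hex(a), hex(b), set_index(a), set_index(b))   # both map to the same set

Because the mapping is linear over GF(2), checking whether a ^ b maps to set 0 is enough to predict a conflict between a and b, which is what makes the null-space view convenient for predicting miss rates from profiling information.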

Relevance:

100.00%

Publisher:

Abstract:

Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research. Data sources: Major electronic databases were searched up to December 2005. Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored highly sensitive electronic searches were undertaken. Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, without treatment the mean time to blindness in at least one eye was approximately 23 years, compared to 35 years with treatment. Prevalence would have to be about 3-4% in 40 year olds with a screening interval of 10 years to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age, thus, appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would only cover 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective. Screening using a test with initial automated classification followed by assessment by a specialised optometrist, for test positives, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed. However, a highly specific test is required to reduce large numbers of false-positive referrals. The findings that population screening is unlikely to be cost-effective are based on an economic model whose parameter estimates have considerable uncertainty, in particular, if rate of progression and/or costs of visual impairment are higher than estimated then screening could be cost-effective. Conclusions: While population screening is not cost-effective, the targeted screening of high-risk groups may be. 
Procedures for identifying those at risk, for quality assuring the programme, as well as adequate service provision for those screened positive would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination, and improving the performance of current testing by either refining practice or adding in a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on costs of blindness, risk of progression and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing. © Queen's Printer and Controller of HMSO 2007. All rights reserved.
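
To make concrete why a highly specific test matters at such low prevalence, a back-of-the-envelope positive predictive value calculation helps. This is a sketch: the 85% specificity and 3% prevalence come from the figures above, while the 85% sensitivity is an assumed value for illustration only.

# Sketch: positive predictive value (PPV) of a screening test at low
# disease prevalence. Specificity (85%) and prevalence (3%) are taken from
# the figures above; the sensitivity used here (85%) is an assumption.

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Probability that a positive screening result is a true positive."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

print(round(ppv(0.85, 0.85, 0.03), 2))   # ~0.15

At 3% prevalence roughly 85 of every 100 positive screening results would be false positives, which is why a highly specific test is needed to limit false-positive referrals.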

Relevance:

100.00%

Publisher:

Abstract:

Background: The postpartum period is a vulnerable time for excess weight retention, particularly for the increasing number of women who are overweight at the start of their pregnancy and subsequently find it difficult to lose additional weight gained during pregnancy. Although postpartum weight management interventions play an important role in breaking this potentially vicious cycle of weight gain, the effectiveness of such interventions in breastfeeding women remains unclear. Our aim was to systematically review the literature about the effectiveness of weight management interventions in breastfeeding women.

Methods: Seven electronic databases were searched for eligible papers. Intervention studies included were carried out exclusively in breastfeeding mothers, ≤2 years postpartum and with a body mass index greater than 18.5 kg/m2, with an outcome measure of change in weight and/or body composition.

Results: Six studies met the selection criteria, and were stratified according to the type of intervention and outcome measures. Despite considerable heterogeneity among studies, the dietary-based intervention studies appeared to be the most efficacious in promoting weight loss; however, few studies were tailored toward the needs of breastfeeding women.

Conclusions: Weight management interventions which include an energy-restricted diet may play a key role in successful postpartum weight loss for breastfeeding mothers.

Relevance:

100.00%

Publisher:

Abstract:

Background

Organ dysfunction consequent to infection (‘severe sepsis’) is the leading cause of admission to an intensive care unit (ICU). In both animal models and early clinical studies, the calcium sensitizer levosimendan has been demonstrated to have potentially beneficial effects on organ function. The aims of the Levosimendan for the Prevention of Acute oRgan Dysfunction in Sepsis (LeoPARDS) trial are to identify whether a 24-hour infusion of levosimendan will improve organ dysfunction in adults who have septic shock and to establish the safety profile of levosimendan in this group of patients.

Methods/Design

This is a multicenter, randomized, double-blind, parallel group, placebo-controlled trial. Adults fulfilling the criteria for systemic inflammatory response syndrome due to infection, and requiring vasopressor therapy, will be eligible for inclusion in the trial. Within 24 hours of meeting these inclusion criteria, patients will be randomized in a 1:1 ratio stratified by ICU to receive either levosimendan (0.05 to 0.2 μg/kg/min) or placebo for 24 hours in addition to standard care. The primary outcome measure is the mean Sequential Organ Failure Assessment (SOFA) score while in the ICU. Secondary outcomes include: central venous oxygen saturations and cardiac output; incidence and severity of renal failure using the Acute Kidney Injury Network criteria; duration of renal replacement therapy; serum bilirubin; time to liberation from mechanical ventilation; 28-day, hospital, and 3- and 6-month survival; ICU and hospital length-of-stay; and days free from catecholamine therapy. Blood and urine samples will be collected on the day of inclusion, at 24 hours, and on days 4 and 6 post-inclusion for investigation of the mechanisms by which levosimendan might improve organ function. Eighty patients will have additional blood samples taken to measure levels of levosimendan and its active metabolites OR-1896 and OR-1855. A total of 516 patients will be recruited from approximately 25 ICUs in the United Kingdom.

Discussion

This trial will test the efficacy of levosimendan to reduce acute organ dysfunction in adult patients who have septic shock and evaluate its biological mechanisms of action.


Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Age-related macular degeneration is the most common cause of sight impairment in the UK. In neovascular age-related macular degeneration (nAMD), vision worsens rapidly (over weeks) due to abnormal blood vessels developing that leak fluid and blood at the macula.

OBJECTIVES: To determine the optimal role of optical coherence tomography (OCT) in diagnosing people newly presenting with suspected nAMD and monitoring those previously diagnosed with the disease.

DATA SOURCES: Databases searched: MEDLINE (1946 to March 2013), MEDLINE In-Process & Other Non-Indexed Citations (March 2013), EMBASE (1988 to March 2013), Biosciences Information Service (1995 to March 2013), Science Citation Index (1995 to March 2013), The Cochrane Library (Issue 2 2013), Database of Abstracts of Reviews of Effects (inception to March 2013), Medion (inception to March 2013), Health Technology Assessment database (inception to March 2013).

REVIEW METHODS: Types of studies: direct/indirect studies reporting diagnostic outcomes.

INDEX TEST: time domain optical coherence tomography (TD-OCT) or spectral domain optical coherence tomography (SD-OCT).

COMPARATORS: clinical evaluation, visual acuity, Amsler grid, colour fundus photographs, infrared reflectance, red-free images/blue reflectance, fundus autofluorescence imaging, indocyanine green angiography, preferential hyperacuity perimetry, microperimetry. Reference standard: fundus fluorescein angiography (FFA). Risk of bias was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool, version 2 (QUADAS-2). Meta-analysis models were fitted using hierarchical summary receiver operating characteristic curves. A Markov model was developed (65-year-old cohort, nAMD prevalence 70%), with nine strategies for diagnosis and/or monitoring, and a cost-utility analysis was conducted. An NHS and Personal Social Services perspective was adopted. Costs (2011/12 prices) and quality-adjusted life-years (QALYs) were discounted (3.5%). Deterministic and probabilistic sensitivity analyses were performed.

RESULTS: In pooled estimates of diagnostic studies (all TD-OCT), sensitivity and specificity [95% confidence interval (CI)] were 88% (46% to 98%) and 78% (64% to 88%) respectively. For monitoring, the pooled sensitivity and specificity (95% CI) were 85% (72% to 93%) and 48% (30% to 67%) respectively. The FFA for diagnosis and nurse-technician-led monitoring strategy had the lowest cost (£39,769; QALYs 10.473) and dominated all others except FFA for diagnosis and ophthalmologist-led monitoring (£44,649; QALYs 10.575; incremental cost-effectiveness ratio £47,768). The least costly strategy had a 46.4% probability of being cost-effective at a £30,000 willingness-to-pay threshold.
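
As a rough check, the incremental cost-effectiveness ratio quoted above for ophthalmologist-led over nurse-technician-led monitoring can be reproduced from the reported costs and QALYs (the small difference from the reported £47,768 reflects rounding of the published QALY values):

\[
\mathrm{ICER} \;=\; \frac{\Delta C}{\Delta E} \;=\; \frac{\pounds 44{,}649 - \pounds 39{,}769}{10.575 - 10.473} \;=\; \frac{\pounds 4{,}880}{0.102} \;\approx\; \pounds 47{,}800 \text{ per QALY gained.}
\]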

LIMITATIONS: Very few studies provided sufficient information for inclusion in meta-analyses. Only a few studies reported other tests; for some tests no studies were identified. The modelling was hampered by a lack of data on the diagnostic accuracy of strategies involving several tests.

CONCLUSIONS: Based on a small body of evidence of variable quality, OCT had high sensitivity and moderate specificity for diagnosis, and relatively high sensitivity but low specificity for monitoring. Strategies involving OCT alone for diagnosis and/or monitoring were unlikely to be cost-effective. Further research is required on (i) the performance of SD-OCT compared with FFA, especially for monitoring but also for diagnosis; (ii) the performance of strategies involving combinations/sequences of tests, for diagnosis and monitoring; (iii) the likelihood of active and inactive nAMD becoming inactive or active respectively; and (iv) assessment of treatment-associated utility weights (e.g. decrements), through a preference-based study.

STUDY REGISTRATION: This study is registered as PROSPERO CRD42012001930.

FUNDING: The National Institute for Health Research Health Technology Assessment programme.

Relevance:

100.00%

Publisher:

Abstract:

The rock/atmosphere interface is inhabited by a complex microbial community including bacteria, algae and fungi. These communities are prominent biodeterioration agents and markedly influence the condition of stone monuments and buildings. Deeper comprehension of natural biodeterioration processes on stone surfaces has given rise to the concept of complex microbial communities referred to as "subaerial biofilms". The practical implication of biofilm formation is that control strategies must be devised both for testing the susceptibility of the organisms within the biofilm and for treating the established biofilm. Model multi-species biofilms associated with mineral surfaces, which are frequently refractory to conventional treatment, have been used as test targets. A combination of scanning microscopy with image analysis was applied along with traditional cultivation methods and fluorescent activity stains. Such a polyphasic approach allowed a comprehensive quantitative evaluation of biofilm status and development. Effective treatment strategies incorporating chemical and physical agents have been demonstrated to prevent biofilm growth in vitro. Model biofilm growth on an inorganic support was significantly reduced by a combination of PDT and biocides.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Diabetic retinopathy is an important cause of visual loss. Laser photocoagulation preserves vision in diabetic retinopathy but is currently used at the stage of proliferative diabetic retinopathy (PDR).

OBJECTIVES: The primary aim was to assess the clinical effectiveness and cost-effectiveness of pan-retinal photocoagulation (PRP) given at the non-proliferative stage of diabetic retinopathy (NPDR) compared with waiting until the high-risk PDR (HR-PDR) stage was reached. There have been recent advances in laser photocoagulation techniques, and in the use of laser treatments combined with anti-vascular endothelial growth factor (VEGF) drugs or injected steroids. Our secondary questions were: (1) If PRP were to be used in NPDR, which form of laser treatment should be used? and (2) Is adjuvant therapy with intravitreal drugs clinically effective and cost-effective in PRP?

ELIGIBILITY CRITERIA: Randomised controlled trials (RCTs) for efficacy but other designs also used.


REVIEW METHODS: Systematic review and economic modelling.

RESULTS: The Early Treatment Diabetic Retinopathy Study (ETDRS), published in 1991, was the only trial designed to determine the best time to initiate PRP. It randomised one eye of 3711 patients with mild-to-severe NPDR or early PDR to early photocoagulation, and the other to deferral of PRP until HR-PDR developed. The risk of severe visual loss after 5 years for eyes assigned to PRP for NPDR or early PDR compared with deferral of PRP was reduced by 23% (relative risk 0.77, 99% confidence interval 0.56 to 1.06). However, the ETDRS did not provide results separately for NPDR and early PDR. In economic modelling, the base case found that early PRP could be more effective and less costly than deferred PRP. Sensitivity analyses gave similar results, with early PRP continuing to dominate or having a low incremental cost-effectiveness ratio. However, there are substantial uncertainties. For our secondary aims we found 12 trials of lasers in diabetic retinopathy, with 982 patients in total (trial sizes ranging from 40 to 150). Most were in PDR but five included some patients with severe NPDR. Three compared multi-spot pattern lasers against the argon laser. RCTs comparing laser applied in a lighter manner (less-intensive burns) with conventional methods (more intense burns) reported little difference in efficacy but fewer adverse effects. One RCT suggested that selective laser treatment targeting only ischaemic areas was effective. Observational studies showed that the most important adverse effect of PRP was macular oedema (MO), which can cause visual impairment, usually temporary. Ten trials of laser and anti-VEGF or steroid drug combinations were consistent in reporting a reduction in risk of PRP-induced MO.

LIMITATION: The current evidence is insufficient to recommend PRP for severe NPDR.

CONCLUSIONS: There is, as yet, no convincing evidence that modern laser systems are more effective than the argon laser used in the ETDRS, but they appear to have fewer adverse effects. We recommend a trial of PRP for severe NPDR and early PDR compared with deferring PRP until the HR-PDR stage. The trial would use modern laser technologies, and investigate the value of adjuvant prophylactic anti-VEGF or steroid drugs.

STUDY REGISTRATION: This study is registered as PROSPERO CRD42013005408.

FUNDING: The National Institute for Health Research Health Technology Assessment programme.

Relevance:

100.00%

Publisher:

Abstract:

Background: NF2 patients develop multiple nervous system tumors including bilateral vestibular schwannomas (VS). The tumors and their surgical treatment are associated with deafness, neurological disability, and mortality. Medical treatment with bevacizumab has been reported to reduce VS growth and to improve hearing. In addition to evaluating these effects, this study also aimed to determine other important consequences of treatment including patient-reported quality of life and the impact of treatment on surgical VS rates. Methods: Patients treated with bevacizumab underwent serial prospective MRI, audiology, clinical, CTCAE-4.0 adverse events, and NFTI-QOL quality-of-life assessments. Tumor volumetrics were classified according to the REiNs criteria and annual VS surgical rates reviewed. Results: Sixty-one patients (59% male), median age 25 years (range, 10–57), were reviewed. Median follow-up was 23 months (range, 3–53). Partial volumetric tumor response (all tumors) was seen in 39% and 51% had stabilization of previously growing tumors. Age and pretreatment growth rate were predictors of response. Hearing was maintained or improved in 86% of assessable patients. Mean NFTI-QOL scores improved from 12.0 to 10.7 (P < .05). Hypertension was observed in 30% and proteinuria in 16%. Twelve treatment breaks occurred due to adverse events. The rates of VS surgery decreased after the introduction of bevacizumab. Conclusion: Treatment with bevacizumab in this large, UK-wide cohort decreased VS growth rates and improved hearing and quality of life. The potential risk of surgical iatrogenic damage was also reduced due to an associated reduction in VS surgical rates. Ongoing follow-up of this cohort will determine the long-term benefits and risks of bevacizumab treatment.

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, I am interested in the partial identification of treatment effects in various discrete choice models with endogenous treatments. Treatment effect models aim to measure the impact of certain interventions on certain outcome variables. The treatment and the outcome variable can be defined broadly so that the framework applies in many different contexts. There are many examples of treatments in labour economics, health economics, the economics of education, or industrial organization, such as job training programmes, medical procedures, investment in research and development, or union membership. The decision to take up a treatment is generally not random but is based on individual choices and preferences. In such a context, measuring the treatment effect becomes problematic because selection bias must be taken into account. Several parametric versions of these models have been widely studied in the literature; however, in models with discrete variation, the parametrisation is itself an important source of identification. It is therefore difficult to know whether the empirical results obtained are driven by the data or by the parametrisation imposed on the model. Since the parametric forms proposed for these types of models generally have no economic foundation, in this thesis I propose to consider the nonparametric version of these models, which makes it possible to propose more robust economic policies. The main difficulty in the nonparametric identification of structural functions is that the suggested structure does not identify a unique data-generating process, either because of the presence of multiple equilibria or because of constraints on the observables. In such situations, traditional identification methods become inapplicable, hence the recent development of the literature on identification in incomplete models. This literature pays particular attention to identifying the set of structural functions of interest that are compatible with the true distribution of the data; this set is called the identified set. Accordingly, in the first chapter of the thesis, I characterize the identified set for treatment effects in the binary triangular model. In the second chapter, I consider the discrete Roy model and characterize the identified set for treatment effects in a model of sector choice when the outcome variable is discrete. The sector-selection assumptions cover the simple, extended, and generalized Roy selection models. In the last chapter, I consider a binary dependent variable model with several dimensions of heterogeneity, such as entry or participation games, and characterize the identified set for the firms' profit functions in a two-firm game with complete information. In all chapters, the identified sets of the functions of interest are written in the form of bounds and are simple enough to be estimated with existing inference methods.
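
The notion of the identified set used throughout can be stated compactly. This is the generic definition from the partial-identification literature, with illustrative notation rather than the thesis's own: writing $P_0$ for the observed distribution of the data and $\mathcal{P}(\theta)$ for the set of distributions of observables the model can generate at a structure $\theta \in \Theta$ (allowing, for example, for multiple equilibria),

\[
\Theta_I(P_0) \;=\; \bigl\{\, \theta \in \Theta \;:\; \exists\, P \in \mathcal{P}(\theta) \text{ with } P = P_0 \,\bigr\},
\qquad
\Delta \;\in\; \Bigl[\, \inf_{\theta \in \Theta_I(P_0)} \Delta(\theta),\; \sup_{\theta \in \Theta_I(P_0)} \Delta(\theta) \,\Bigr],
\]

so that a treatment effect of interest $\Delta(\theta)$ is identified only up to bounds, as described above.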

Relevance:

100.00%

Publisher:

Abstract:

Using 6-benzo[1,3]dioxolefulvene (1a), a series of benzodioxole-substituted titanocenes was synthesized. The benzyl-substituted titanocene bis[(benzo[1,3]dioxole)-5-methylcyclopentadienyl] titanium(IV) dichloride (2a) was synthesized from the reaction of Super Hydride with 1a. An X-ray-determined crystal structure was obtained for 2a. The ansa-titanocene (1,2-di(cyclopentadienyl)-1,2-di(benzo[1,3]dioxole)ethanediyl) titanium(IV) dichloride (2b) was synthesized by reductive dimerisation of 1a with titanium dichloride. The diarylmethyl-substituted titanocene bis(di(benzo[1,3]dioxole)-5-methylcyclopentadienyl) titanium(IV) dichloride (2c) was synthesized by reacting 1a with the para-lithiated benzodioxole followed by transmetallation with titanium tetrachloride. When titanocenes 2a-c were tested against pig kidney (LLC-PK) cells, inhibitory concentrations (IC50) of 2.8 × 10⁻⁴, 1.6 × 10⁻⁴ and 7.6 × 10⁻⁵ M, respectively, were observed. These values represent improved cytotoxicity against LLC-PK cells when compared with unsubstituted titanocene dichloride, but are not as impressive as values obtained for titanocenes previously synthesized using the above methods. Copyright (c) 2006 John Wiley & Sons, Ltd.

Relevance:

100.00%

Publisher:

Abstract:

There are a number of challenges associated with managing knowledge and information in construction organizations delivering major capital assets. These include the ever-increasing volumes of information, the loss of people to retirement or to competitors, the continuously changing nature of information, the lack of methods for eliciting useful knowledge, the development of new information technologies, and changes in management and innovation practices. Existing tools and methodologies for valuing intangible assets in fields such as engineering, project management, finance and accounting do not fully address the issues associated with the valuation of information and knowledge. Information is rarely recorded in a way that allows a document to be valued, either when it is produced or when it is subsequently retrieved and re-used. In addition, there is a wealth of tacit personal knowledge which, if codified into documentary information, may prove very valuable to operators of the finished asset or to future designers. This paper addresses the problem of information overload and identifies the differences between data, information and knowledge. An exploratory study was conducted with a leading construction consultant, using structured interviews to examine three perspectives (business, project management and document management) and, specifically, how to value information in practical terms. Major challenges in information management are identified. A through-life Information Evaluation Methodology (IEM) is presented to reduce information overload and to make information more valuable in the future.

Relevance:

100.00%

Publisher:

Abstract:

A precipitation downscaling method is presented using precipitation from a general circulation model (GCM) as predictor. The method extends a previous method from monthly to daily temporal resolution. The simplest form of the method corrects for biases in wet-day frequency and intensity. A more sophisticated variant also takes account of flow-dependent biases in the GCM. The method is flexible and simple to implement. It is proposed here as a correction of GCM output for applications where sophisticated methods are not available, or as a benchmark for the evaluation of other downscaling methods. Applied to output from reanalyses (ECMWF, NCEP) in the region of the European Alps, the method is capable of reducing large biases in the precipitation frequency distribution, even for high quantiles. The two variants exhibit similar performances, but the ideal choice of method can depend on the GCM/reanalysis and it is recommended to test the methods in each case. Limitations of the method are found in small areas with unresolved topographic detail that influence higher-order statistics (e.g. high quantiles). When used as benchmark for three regional climate models (RCMs), the corrected reanalysis and the RCMs perform similarly in many regions, but the added value of the latter is evident for high quantiles in some small regions.
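
A minimal sketch of the simplest variant described above may make the correction concrete. Everything here (the 1 mm/day wet-day threshold, a single grid point, synthetic data, variable names) is an assumption for illustration; it is not the paper's code.

# Minimal sketch, under assumed conventions (a 1 mm/day wet-day threshold,
# a single grid point, synthetic data): the "simplest form" of the method,
# which corrects GCM daily precipitation for biases in wet-day frequency
# and wet-day intensity. This is an illustration, not the paper's exact
# algorithm.

import numpy as np

WET_DAY = 1.0  # observed wet-day threshold in mm/day (assumed convention)

def calibrate(gcm, obs):
    """Derive a GCM wet-day threshold and an intensity scaling factor from
    a calibration period with paired GCM and observed daily precipitation."""
    # Frequency correction: choose the GCM threshold so the GCM has the
    # same number of wet days as the observations.
    n_wet = int(np.sum(obs >= WET_DAY))
    gcm_thresh = np.sort(gcm)[::-1][n_wet - 1]
    # Intensity correction: match the mean precipitation on wet days.
    scale = obs[obs >= WET_DAY].mean() / gcm[gcm >= gcm_thresh].mean()
    return gcm_thresh, scale

def correct(gcm, gcm_thresh, scale):
    """Apply the wet-day frequency and intensity correction to GCM output."""
    return np.where(gcm >= gcm_thresh, gcm * scale, 0.0)

# Synthetic example: a drizzle-prone GCM with too-weak intensities.
rng = np.random.default_rng(0)
obs = np.where(rng.random(3650) < 0.3, rng.gamma(0.8, 10.0, 3650), 0.0)
gcm = np.where(rng.random(3650) < 0.6, rng.gamma(0.8, 4.0, 3650), 0.0)

thresh, scale = calibrate(gcm, obs)
corrected = correct(gcm, thresh, scale)
print(np.mean(corrected > 0), np.mean(obs >= WET_DAY))              # similar wet-day frequencies
print(corrected[corrected > 0].mean(), obs[obs >= WET_DAY].mean())  # similar wet-day intensities

The more sophisticated, flow-dependent variant mentioned above would, roughly speaking, apply such corrections conditionally on the atmospheric circulation rather than with a single threshold and scale.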

Relevance:

100.00%

Publisher:

Abstract:

Objective: To undertake a process evaluation of pharmacists' recommendations arising in the context of a complex IT-enabled pharmacist-delivered randomised controlled trial (PINCER trial) to reduce the risk of hazardous medicines management in general practices.

Methods: PINCER pharmacists manually recorded patients’ demographics, details of interventions recommended, actions undertaken by practice staff and time taken to manage individual cases of hazardous medicines management. Data were coded and double-entered into SPSS v15, and then summarised using percentages for categorical data (with 95% CI) and, as appropriate, means (SD) or medians (IQR) for continuous data.

Key findings: Pharmacists spent a median of 20 minutes (IQR 10, 30) reviewing medical records, recommending interventions and completing actions in each case of hazardous medicines management. Pharmacists judged 72% (95% CI 70, 74) (1463/2026) of cases of hazardous medicines management to be clinically relevant. Pharmacists recommended 2105 interventions in 74% (95% CI 73, 76) (1516/2038) of cases and 1685 actions were taken in 61% (95% CI 59, 63) (1246/2038) of cases; 66% (95% CI 64, 68) (1383/2105) of interventions recommended by pharmacists were completed and 5% (95% CI 4, 6) (104/2105) of recommendations were accepted by general practitioners (GPs), but not completed at the end of the pharmacists’ placement; the remaining recommendations were rejected or considered not relevant by GPs.

Conclusions: The outcome measures were used to target pharmacist activity in general practice towards patients at risk from hazardous medicines management. Recommendations from trained PINCER pharmacists were found to be broadly acceptable to GPs and led to ameliorative action in the majority of cases. It seems likely that the approach used by the PINCER pharmacists could be employed by other practice pharmacists following appropriate training.