127 results for Health impact assessment
Abstract:
Meat and meat products can be contaminated with different species of bacteria resistant to various antimicrobials. The human health risk posed by emerging antimicrobial resistance in a given type of meat or meat product depends on (i) the prevalence of contamination with resistant bacteria, (ii) the human health consequences of an infection with a specific bacterium resistant to a specific antimicrobial and (iii) the consumption volume of the specific product. The objective of this study was to compare the risk for consumers arising from their exposure to antibiotic-resistant bacteria from meat of four different types (chicken, pork, beef and veal), distributed across four product categories (fresh meat, frozen meat, dried raw meat products and heat-treated meat products). A semi-quantitative risk assessment model, evaluating each food chain step, was built to obtain an estimated score for the prevalence of Campylobacter spp., Enterococcus spp. and Escherichia coli in each product category. To assess human health impact, nine combinations of bacterial species and antimicrobial agents were considered, based on a published risk profile. Combining the prevalence at retail, the human health impact and the amount of meat or product consumed provided the relative proportion of total risk attributed to each product category, resulting in a high, medium or low human health risk. According to the results of the model, chicken (mostly fresh and frozen meat) contributed 6.7% of the overall risk in the highest category and pork (mostly fresh meat and dried raw meat products) contributed 4.0%. The contributions of beef and veal were 0.4% and 0.1%, respectively. The results were tested and discussed for single-parameter changes to the model. This risk assessment was a useful tool for targeting antimicrobial resistance monitoring to those meat product categories where the expected risk for public health was greatest.
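The risk combination described in this abstract (prevalence at retail × human health impact × consumption volume → relative share of total risk) can be sketched in a few lines. The scores below are illustrative placeholders, not the study's actual values; the published model derives its prevalence scores from each food-chain step and its impact scores from nine bacterium/antimicrobial combinations.

```python
# Minimal sketch of a semi-quantitative risk combination, assuming
# multiplicative aggregation of three factor scores per product category.
# All numeric scores here are hypothetical, for illustration only.

def relative_risk_shares(categories):
    """Combine prevalence, health-impact and consumption scores into
    each category's relative share of the total risk.

    categories: dict mapping category name -> (prevalence_score,
                health_impact_score, consumption_volume).
    Returns a dict of fractions summing to 1.
    """
    raw = {name: p * h * c for name, (p, h, c) in categories.items()}
    total = sum(raw.values())
    return {name: score / total for name, score in raw.items()}

# Hypothetical example: two product categories with made-up scores.
shares = relative_risk_shares({
    "fresh chicken": (3, 2, 4),   # higher prevalence and consumption
    "dried raw pork": (2, 2, 1),  # lower prevalence and consumption
})
```

A category's share can then be binned into high, medium or low risk by thresholding, which is the kind of output the abstract reports (e.g. chicken contributing the largest share in the highest category).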
Abstract:
BACKGROUND HIV treatment recommendations are updated as clinical trials are published. Whether recommendations drive clinicians to change antiretroviral therapy in well-controlled patients is unexplored. METHODS We selected patients with undetectable viral loads (VLs) on nonrecommended regimens containing double-boosted protease inhibitors (DBPIs), triple-nucleoside reverse transcriptase inhibitors (NRTIs), or didanosine (ddI) plus stavudine (d4T) at publication of the 2006 International AIDS Society recommendations. We compared demographic and clinical characteristics with those of control patients with undetectable VL not on these regimens and examined clinical outcomes and reasons for treatment modification. RESULTS At inclusion, 104 patients were in the DBPI group, 436 in the triple-NRTI group, and 19 in the ddI/d4T group. By 2010, 28 (29%), 204 (52%), and 1 (5%) patients were still on DBPIs, triple-NRTIs, and ddI plus d4T, respectively. 'Physician decision,' excluding toxicity/virological failure, drove 30% of treatment changes. Predictors of recommendation nonobservance included female sex [adjusted odds ratio (aOR) 2.69, 95% confidence interval (CI) 1 to 7.26; P = 0.01] for DBPIs, and undetectable VL (aOR 3.53, 95% CI 1.6 to 7.8; P = 0.002) and lack of cardiovascular events (aOR 2.93, 95% CI 1.23 to 6.97; P = 0.02) for triple-NRTIs. All patients on DBPIs with documented diabetes or a cardiovascular event changed treatment. Recommendation observance resulted in lower cholesterol values in the DBPI group (P = 0.06) and in more patients having undetectable VL (P = 0.02) in the triple-NRTI group. CONCLUSION The physician's decision is the main factor driving the change from nonrecommended to recommended regimens, whereas virological suppression is associated with not switching. Positive clinical outcomes observed after switching underline the importance of observing recommendations, even in well-controlled patients.
Abstract:
BACKGROUND Partner notification is essential to the comprehensive case management of sexually transmitted infections. Systematic reviews and mathematical modelling can be used to synthesise information about the effects of new interventions to enhance the outcomes of partner notification. OBJECTIVE To study the effectiveness and cost-effectiveness of traditional and new partner notification technologies for curable sexually transmitted infections (STIs). DESIGN Secondary data analysis of clinical audit data; systematic reviews of randomised controlled trials (MEDLINE, EMBASE and Cochrane Central Register of Controlled Trials) published from 1 January 1966 to 31 August 2012 and of studies of health-related quality of life (HRQL) [MEDLINE, EMBASE, ISI Web of Knowledge, NHS Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment (HTA)] published from 1 January 1980 to 31 December 2011; static models of clinical effectiveness and cost-effectiveness; and dynamic modelling studies to improve parameter estimation and examine effectiveness. SETTING General population and genitourinary medicine clinic attenders. PARTICIPANTS Heterosexual women and men. INTERVENTIONS Traditional partner notification by patient or provider referral, and new partner notification by expedited partner therapy (EPT) or its UK equivalent, accelerated partner therapy (APT). MAIN OUTCOME MEASURES Population prevalence; index case reinfection; and partners treated per index case. RESULTS Expedited partner therapy reduced reinfection in index cases with curable STIs more than simple patient referral [risk ratio (RR) 0.71; 95% confidence interval (CI) 0.56 to 0.89]. There are no randomised trials of APT. The median number of partners treated for chlamydia per index case in UK clinics was 0.60. The number of partners needed to treat to interrupt transmission of chlamydia was lower for casual than for regular partners.
In dynamic model simulations, > 10% of partners are chlamydia positive with look-back periods of up to 18 months. In the presence of a chlamydia screening programme that reduces population prevalence, treatment of current partners achieves most of the additional reduction in prevalence attributable to partner notification. Dynamic model simulations show that cotesting and treatment for chlamydia and gonorrhoea reduce the prevalence of both STIs. APT has a limited additional effect on prevalence but reduces the rate of index case reinfection. Published quality-adjusted life-year (QALY) weights were of insufficient quality to be used in a cost-effectiveness study of partner notification in this project. Using an intermediate outcome of cost per infection diagnosed, doubling the efficacy of partner notification from 0.4 to 0.8 partners treated per index case was more cost-effective than increasing chlamydia screening coverage. CONCLUSIONS There is evidence to support the improved clinical effectiveness of EPT in reducing index case reinfection. In a general heterosexual population, partner notification identifies new infected cases, but its impact on chlamydia prevalence is limited. In genitourinary clinic populations, partner notification of casual partners might have a greater impact than notification of regular partners. Recommendations for future research are (1) to conduct randomised controlled trials, using biological outcomes, of the effectiveness of APT and of methods to increase testing for human immunodeficiency virus (HIV) and STIs after APT; (2) to prioritise the collection of HRQL data to determine QALYs associated with the sequelae of curable STIs; and (3) to develop standardised parameter sets for curable STIs for the mathematical models of STI transmission that are used for policy-making. FUNDING The National Institute for Health Research Health Technology Assessment programme.
Abstract:
BACKGROUND Overlapping first-generation sirolimus- and paclitaxel-eluting stents are associated with persistent inflammation, fibrin deposition and delayed endothelialisation in preclinical models, and with adverse angiographic and clinical outcomes--including death and myocardial infarction (MI)--in clinical studies. OBJECTIVES To establish whether there are any safety concerns with newer generation drug-eluting stents (DES). DESIGN Propensity score adjustment of baseline anatomical and clinical characteristics was used to compare clinical outcomes (Kaplan-Meier estimates) between patients implanted with overlapping DES (Resolute zotarolimus-eluting stent (R-ZES) or R-ZES/other DES) and patients without overlapping DES. Additionally, angiographic outcomes for overlapping R-ZES and everolimus-eluting stents were evaluated in the randomised RESOLUTE All-Comers Trial. SETTING Patient-level data from five controlled studies of the RESOLUTE Global Clinical Program evaluating the R-ZES were pooled. Enrollment criteria were generally unrestrictive. PATIENTS 5130 patients. MAIN OUTCOME MEASURES 2-year clinical outcomes and 13-month angiographic outcomes. RESULTS 644 of 5130 patients (12.6%) in the RESOLUTE Global Clinical Program underwent overlapping DES implantation. Implantation of overlapping DES was associated with an increased frequency of MI and more complex/calcified lesion types at baseline. Adjusted in-hospital, 30-day and 2-year clinical outcomes indicated comparable cardiac death (2-year overlap vs non-overlap: 3.0% vs 2.1%, p=0.36), major adverse cardiac events (13.3% vs 10.7%, p=0.19), target-vessel MI (3.9% vs 3.4%, p=0.40), clinically driven target vessel revascularisation (7.7% vs 6.5%, p=0.32), and definite/probable stent thrombosis (1.4% vs 0.9%, p=0.28). 13-month adjusted angiographic outcomes were comparable between overlapping and non-overlapping DES.
CONCLUSIONS Overlapping newer generation DES are safe and effective, with comparable angiographic and clinical outcomes--including repeat revascularisation--to non-overlapping DES.
Abstract:
OBJECTIVE: The importance of the costimulatory molecules CD28 and CTLA-4 in the pathologic mechanism of rheumatoid arthritis (RA) has been demonstrated by genetic associations and the successful clinical application of CTLA-4Ig for the treatment of RA. This study was undertaken to investigate the role of the CTLA-4/CD28 axis in the local application of CTLA-4Ig in the synovial fluid (SF) of RA patients. METHODS: Quantitative polymerase chain reaction was used to analyze the expression of proinflammatory and antiinflammatory cytokines in ex vivo fluorescence-activated cell sorted CTLA-4+ and CTLA-4- T helper cells from the peripheral blood and SF of RA patients. T helper cells were also analyzed for cytokine expression in vitro after the blockade of CTLA-4 by anti-CTLA-4 Fab fragments or of B7 (CD80/CD86) molecules by CTLA-4Ig. RESULTS: CTLA-4+ T helper cells were unambiguously present in the SF of all RA patients examined, and they expressed increased amounts of interferon-γ (IFNγ), interleukin-17 (IL-17), and IL-10 as compared to CTLA-4- T helper cells. The selective blockade of CTLA-4 in T helper cells from the SF in vitro led to increased levels of IFNγ, IL-2, and IL-17. The concomitant blockade of CD28 and CTLA-4 in T helper cells from RA SF by CTLA-4Ig in vitro resulted in reduced levels of the proinflammatory cytokines IFNγ and IL-2 and increased levels of the antiinflammatory cytokines IL-10 and transforming growth factor β. CONCLUSION: Our ex vivo and in vitro results demonstrate that the CTLA-4/CD28 axis constitutes a drug target for not only the systemic, but potentially also the local, application of the costimulation blocking agent CTLA-4Ig for the treatment of RA.
Abstract:
BACKGROUND: Harvesting techniques can affect cellular parameters of autogenous bone grafts in vitro. Whether these differences translate to in vivo bone formation, however, remains unknown. OBJECTIVE: The purpose of this study was to assess the impact of different harvesting techniques on bone formation and graft resorption in vivo. MATERIAL AND METHODS: Four harvesting techniques were used: (i) corticocancellous blocks particulated by a bone mill; (ii) bone scraper; (iii) piezosurgery; and (iv) bone slurry collected from a filter device upon drilling. The grafts were placed into bone defects in the mandibles of 12 minipigs. The animals were sacrificed after 1, 2, 4 and 8 weeks of healing. Histology and histomorphometrical analyses were performed to assess bone formation and graft resorption. An explorative statistical analysis was performed. RESULTS: The amount of new bone increased, while the amount of residual bone decreased over time with all harvesting techniques. At all given time points, no significant advantage of any harvesting technique on bone formation was observed. The harvesting technique, however, affected bone formation and the amount of residual graft within the overall healing period. Friedman test revealed an impact of the harvesting technique on residual bone graft after 2 and 4 weeks. At the later time point, post hoc testing showed more newly formed bone in association with bone graft processed by bone mill than harvested by bone scraper and piezosurgery. CONCLUSIONS: Transplantation of autogenous bone particles harvested with four techniques in the present model resulted in moderate differences in terms of bone formation and graft resorption.
Abstract:
BACKGROUND To standardize multiple-breath washout (MBW) measurements, 1L tidal volume (VT) protocols have been suggested. Their effect on MBW-derived ventilation inhomogeneity (VI) indices is unclear. METHODS We compared VI indices from free-breathing MBW at baseline with those from 1L VT MBW performed in triplicate in 35 children (20 with CF). Mean (range) age was 12.8 (7.0-16.7) years, weight 42 (20-64) kg and height 151 (117-170) cm. RESULTS Baseline lung clearance index (LCI) increased from mean (SD) 11.0 (2.2) to 13.0 (2.6), p=0.011, in CF and from 6.8 (0.5) to 7.7 (1.4), p=0.004, in controls. Moment ratio and Scond increased similarly. While the change in VI indices was heterogeneous across individuals, a decrease in functional residual capacity was most strongly associated with the LCI increase. CONCLUSION MBW protocols strongly influence measures of VI. The 1L VT MBW protocol leads to overestimation of VI and is not recommended in children.
Abstract:
We assessed the impact of antiviral prophylaxis and preemptive therapy on the incidence and outcomes of cytomegalovirus (CMV) disease in a nationwide prospective cohort of solid organ transplant recipients. Risk factors associated with CMV disease and graft failure-free survival were analyzed using Cox regression models. One thousand two hundred thirty-nine patients transplanted from May 2008 until March 2011 were included; 466 (38%) patients received CMV prophylaxis and 522 (42%) patients were managed preemptively. Overall incidence of CMV disease was 6.05% and was linked to CMV serostatus (D+/R− vs. R+, hazard ratio [HR] 5.36 [95% CI 3.14–9.14], p < 0.001). No difference in the incidence of CMV disease was observed in patients receiving antiviral prophylaxis as compared to the preemptive approach (HR 1.16 [95% CI 0.63–2.17], p = 0.63). CMV disease was not associated with a lower graft failure-free survival (HR 1.27 [95% CI 0.64–2.53], p = 0.50). Nevertheless, patients followed by the preemptive approach had an inferior graft failure-free survival after a median of 1.05 years of follow-up (HR 1.63 [95% CI 1.01–2.64], p = 0.044). The incidence of CMV disease in this cohort was low and not influenced by the preventive strategy used. However, patients on CMV prophylaxis were more likely to be free from graft failure.
Abstract:
Context: In virologically suppressed, antiretroviral-treated patients, the effect of switching to tenofovir (TDF) on bone biomarkers compared to patients remaining on stable antiretroviral therapy is unknown. Methods: We examined bone biomarkers (osteocalcin [OC], procollagen type 1 amino-terminal propeptide, and C-terminal cross-linking telopeptide of type 1 collagen) and bone mineral density (BMD) over 48 weeks in virologically suppressed patients (HIV RNA < 50 copies/ml) randomized to switch to TDF/emtricitabine (FTC) or remain on first-line zidovudine (AZT)/lamivudine (3TC). PTH was also measured. Between-group differences in bone biomarkers and associations between change in bone biomarkers and BMD measures were assessed by Student's t tests, Pearson correlation, and multivariable linear regression, respectively. All data are expressed as mean (SD), unless otherwise specified. Results: Of 53 subjects (aged 46.0 y; 84.9% male; 75.5% Caucasian), 29 switched to TDF/FTC. There were reductions in total hip and lumbar spine BMD in those switching to TDF/FTC (total hip, TDF/FTC, −1.73 (2.76)% vs AZT/3TC, −0.39 (2.41)%; between-group P = .07; lumbar spine, TDF/FTC, −1.50 (3.49)% vs AZT/3TC, +0.25 (2.82)%; between-group P = .06), but they did not reach statistical significance. Greater declines in lumbar spine BMD correlated with greater increases in OC (r = −0.28; P = .05). The effect of TDF/FTC on bone biomarkers remained significant when adjusted for baseline biomarker levels, gender, and ethnicity. There was no difference in change in PTH levels over 48 weeks between treatment groups (between-group P = .23). All biomarkers increased significantly from weeks 0 to 48 in the switch group, with no significant change in those remaining on AZT/3TC (between-group, all biomarkers, P < .0001). 
Conclusion: A switch to TDF/FTC compared to remaining on a stable regimen is associated with increases in bone turnover that correlate with reductions in BMD, suggesting that TDF exposure directly affects bone metabolism in vivo.
Abstract:
During recent years, mindfulness-based approaches have been gaining relevance for treatment in clinical populations. Correspondingly, the empirical study of mindfulness has steadily grown; thus, the availability of valid measures of the construct is critically important. This paper gives an overview of the current status in the field of self-report assessment of mindfulness. All eight currently available and validated mindfulness scales (for adults) are evaluated, with a particular focus on their virtues and limitations and on differences among them. It will be argued that none of these scales may be a fully adequate measure of mindfulness, as each of them offers unique advantages but also disadvantages. In particular, none of them seems to provide a comprehensive assessment of all aspects of mindfulness in samples from the general population. Moreover, some scales may be particularly indicated in investigations focusing on specific populations such as clinical samples (Cognitive and Affective Mindfulness Scale, Southampton Mindfulness Questionnaire) or meditators (Freiburg Mindfulness Inventory). Three main open issues are discussed: (1) the coverage of aspects of mindfulness in questionnaires; (2) the nature of the relationships between these aspects; and (3) the validity of self-report measures of mindfulness. These issues should be considered in future developments in the self-report assessment of mindfulness.
Abstract:
BACKGROUND: The burden of enterococcal infections has increased over the last decades, with vancomycin-resistant enterococci (VRE) being a major health problem. Solid organ transplantation is considered a risk factor. However, little is known about the relevance of enterococci in solid organ transplant recipients in areas with a low VRE prevalence. METHODS: We examined the epidemiology of enterococcal events in patients followed in the Swiss Transplant Cohort Study between May 2008 and September 2011 and analyzed risk factors for infection, aminopenicillin resistance, treatment, and outcome. RESULTS: Of the 1234 patients, 255 (20.7%) suffered from 392 enterococcal events (185 [47.2%] infections, 205 [52.3%] colonizations, and 2 events with missing clinical information). Only 2 isolates were VRE. The highest infection rates were found early after liver transplantation (0.24/person-year), with Enterococcus faecium accounting for 58.6%. The highest colonization rates were documented in lung transplant recipients (0.33/person-year), with 46.5% E. faecium. Age, prophylaxis with a betalactam antibiotic, and liver transplantation were significantly associated with infection. Previous antibiotic treatment, intensive care unit stay, and lung transplantation were associated with aminopenicillin resistance. Only 4/205 (2%) colonization events led to an infection. Adequate treatment did not affect microbiological clearance rates. Overall mortality was 8%; no deaths were attributable to enterococcal events. CONCLUSIONS: Enterococcal colonizations and infections are frequent in transplant recipients. Progression from colonization to infection is rare. Therefore, antibiotic treatment should be used restrictively in colonization. No increased mortality due to enterococcal infection was noted.
Abstract:
BACKGROUND Multidetector computed tomography (MDCT) may be useful to identify patients with patent foramen ovale (PFO). The aim of this study was to analyze whether MDCT performed before pulmonary vein isolation reliably detects a PFO that may be used for access to the left atrium. METHODS AND RESULTS In 79 consecutive patients referred for catheter ablation of symptomatic paroxysmal or persistent atrial fibrillation (AF), the presence of a PFO was explored by MDCT and transesophageal echocardiography (TEE). TEE was considered the gold standard, and TEE quality was good in all patients. In 16 patients (20.3%), MDCT could not be used for analysis because of artifacts, mainly due to AF. On TEE, a PFO was found in 15 (23.8%) of the 63 patients with usable MDCT. MDCT detected six PFO, of which four were present on TEE. This corresponded to a sensitivity of 26.7%, a specificity of 95.8%, a negative predictive value of 80.7%, and a positive predictive value of 66.7%. The area under the receiver operating characteristic curve of MDCT for the detection of PFO was 0.613 (95% confidence interval 0.493-0.732). CONCLUSIONS MDCT may detect a PFO before pulmonary vein isolation. However, the presence of AF may lead to artifacts on MDCT, impeding meaningful analysis. Furthermore, in this study the sensitivity and positive predictive value of MDCT were low; MDCT is therefore not a reliable screening tool for the detection of PFO.
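The accuracy figures in this abstract follow from the 2x2 table implied by its counts (63 usable scans, 15 PFO on TEE, 6 flagged by MDCT, 4 concordant). A minimal check, with an illustrative helper function not taken from the study:

```python
# Sketch recovering the reported diagnostic accuracy of MDCT vs TEE
# (gold standard) from the 2x2 contingency table implied by the abstract.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV for a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives / all diseased
        "specificity": tn / (tn + fp),  # true negatives / all non-diseased
        "ppv": tp / (tp + fp),          # precision of a positive MDCT
        "npv": tn / (tn + fn),          # reliability of a negative MDCT
    }

# 4 true positives, 2 false positives (6 MDCT-positive minus 4 confirmed),
# 11 missed PFO (15 TEE-positive minus 4), 46 true negatives (63 - 15 - 2).
m = diagnostic_metrics(tp=4, fp=2, fn=11, tn=46)
```

Rounded to one decimal, this reproduces the abstract's 26.7% sensitivity, 95.8% specificity, 66.7% PPV and 80.7% NPV.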