Abstract:
OBJECTIVE To determine if adequacy of randomisation and allocation concealment is associated with changes in effect sizes (ES) when comparing physical therapy (PT) trials with and without these methodological characteristics. DESIGN Meta-epidemiological study. PARTICIPANTS A random sample of randomised controlled trials (RCTs) included in meta-analyses in the PT discipline was identified. INTERVENTION Data extraction, including assessments of random sequence generation and allocation concealment, was conducted independently by two reviewers. To determine the association between random sequence generation and allocation concealment and ES, a two-level analysis was conducted using a meta-meta-analytic approach. PRIMARY AND SECONDARY OUTCOME MEASURES Association between random sequence generation and allocation concealment and ES in PT trials. RESULTS 393 trials included in 43 meta-analyses, analysing 44 622 patients, contributed to this study. Adequate random sequence generation and appropriate allocation concealment were accomplished in only 39.7% and 11.5% of PT trials, respectively. Although trials with inappropriate allocation concealment tended to have an overestimated treatment effect when compared with trials with adequate concealment of allocation, the difference was not statistically significant (ES=0.12; 95% CI -0.06 to 0.30). When pooling our results with those of Nuesch et al, we obtained a pooled statistically significant value (ES=0.14; 95% CI 0.02 to 0.26). There was no difference in ES between trials with appropriate and inappropriate random sequence generation (ES=0.02; 95% CI -0.12 to 0.15). CONCLUSIONS Our results suggest that when evaluating risk of bias of primary RCTs in the PT area, systematic reviewers and clinicians implementing research into practice should pay attention to these biases since they could exaggerate treatment effects. Systematic reviewers should perform sensitivity analyses including trials at low risk of bias in these domains as the primary analysis and/or in combination with less restrictive analyses. Authors and editors should make sure that allocation concealment and random sequence generation are properly reported in trial reports.
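The pooled estimate above (ES=0.14; 95% CI 0.02 to 0.26) combines this study's allocation-concealment result with that of Nuesch et al. A minimal sketch of fixed-effect, inverse-variance pooling of two effect-size differences follows; the second estimate is a hypothetical stand-in for the Nuesch et al result, which is not reported in this abstract.

```python
import math

def pool_fixed_effect(estimates):
    """Fixed-effect inverse-variance pooling of (estimate, lower, upper) tuples.

    Standard errors are recovered from the 95% CIs (CI width = 2 * 1.96 * SE).
    """
    total_weight, weighted_sum = 0.0, 0.0
    for est, lo, hi in estimates:
        se = (hi - lo) / (2 * 1.96)
        w = 1.0 / se ** 2
        total_weight += w
        weighted_sum += w * est
    pooled = weighted_sum / total_weight
    pooled_se = math.sqrt(1.0 / total_weight)
    return pooled, pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

# ES difference for inadequate vs adequate allocation concealment (this study)
this_study = (0.12, -0.06, 0.30)
# Hypothetical values standing in for the Nuesch et al estimate (not given here)
nuesch_et_al = (0.15, 0.01, 0.29)

print(pool_fixed_effect([this_study, nuesch_et_al]))
```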
Abstract:
BACKGROUND The Valve Academic Research Consortium (VARC) has proposed a standardized definition of bleeding in patients undergoing transcatheter aortic valve interventions (TAVI). The VARC bleeding definition has not yet been validated or compared with other established bleeding definitions. Thus, we aimed to investigate the impact of bleeding and compare the predictive value of VARC bleeding events with that of established bleeding definitions. METHODS AND RESULTS Between August 2007 and April 2012, 489 consecutive patients with severe aortic stenosis were included in the Bern-TAVI-Registry. Every bleeding complication was adjudicated according to the definitions of VARC, BARC, TIMI, and GUSTO. Periprocedural blood loss was added to the definition of VARC, providing a modified VARC definition. A total of 152 bleeding events were observed during the index hospitalization. Bleeding severity according to VARC was associated with a gradual increase in mortality, which was comparable to the BARC, TIMI, GUSTO, and modified VARC classifications. The predictive precision of a multivariable model for mortality at 30 days was significantly improved by adding the most serious bleeding event according to VARC (area under the curve [AUC], 0.773; 95% confidence interval [CI], 0.706 to 0.839), BARC (AUC, 0.776; 95% CI, 0.694 to 0.857), TIMI (AUC, 0.768; 95% CI, 0.692 to 0.844), and GUSTO (AUC, 0.791; 95% CI, 0.714 to 0.869), with the modified VARC definition resulting in the best predictive value (AUC, 0.814; 95% CI, 0.759 to 0.870). CONCLUSIONS The VARC bleeding definition offers a severity stratification that is associated with a gradual increase in mortality and prognostic information comparable to established bleeding definitions. Adding the information of periprocedural blood loss to VARC may increase the sensitivity and the predictive power of this classification.
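The gain in predictive precision is assessed by comparing the AUC of a 30-day-mortality model with and without the most serious bleeding category. Below is a minimal sketch on simulated data, assuming a simple logistic baseline model; the variables and values are illustrative and are not drawn from the Bern-TAVI-Registry.

```python
# Illustrative only: simulated data, not the Bern-TAVI-Registry.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 489
age = rng.normal(82, 6, n)                      # years
bleeding_grade = rng.integers(0, 5, n)          # 0 = none .. 4 = most severe bleeding
logit = -9 + 0.08 * age + 0.5 * bleeding_grade  # simulated true model
died_30d = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Baseline model (age only) vs model additionally including bleeding severity.
base = LogisticRegression().fit(age.reshape(-1, 1), died_30d)
full = LogisticRegression().fit(np.column_stack([age, bleeding_grade]), died_30d)

auc_base = roc_auc_score(died_30d, base.predict_proba(age.reshape(-1, 1))[:, 1])
auc_full = roc_auc_score(died_30d, full.predict_proba(
    np.column_stack([age, bleeding_grade]))[:, 1])
print(f"AUC without bleeding: {auc_base:.3f}, with bleeding: {auc_full:.3f}")
```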
Abstract:
OBJECTIVE To assess the current state of reporting of pain outcomes in Cochrane reviews on chronic musculoskeletal painful conditions and to elicit opinions of patients, healthcare practitioners, and methodologists on presenting pain outcomes to patients, clinicians, and policymakers. METHODS We identified all reviews in the Cochrane Library of chronic musculoskeletal pain conditions from Cochrane review groups (Back, Musculoskeletal, and Pain, Palliative, and Supportive Care) that contained a summary of findings (SoF) table. We extracted data on reported pain domains and instruments and conducted a survey and interviews on considerations for SoF tables (e.g., pain domains, presentation of results). RESULTS Fifty-seven SoF tables in 133 Cochrane reviews were eligible. Pain was reported in 56/57 SoF tables: all presented results for pain intensity (20 different outcome instruments), 8 reported pain interference (5 different outcome instruments), and 1 reported pain frequency within a multiple-domain instrument. Other domains such as pain quality or pain affect were not reported. From the survey and interviews [response rate 80% (36/45)], we derived 4 themes for a future research agenda: pain domains; considerations for assessing truth, discrimination, and feasibility; clinically important thresholds for responder analyses and presenting results; and establishing hierarchies of outcome instruments. CONCLUSION There is a lack of standardization in the domains of pain selected and the manner in which pain outcomes are reported in SoF tables, hampering efforts to synthesize evidence. Future research should focus on the themes identified, building partnerships to achieve consensus and develop guidance on best practices for reporting pain outcomes.
Abstract:
BACKGROUND Inability to predict the therapeutic effect of a drug in individual pain patients prolongs the process of drug and dose finding until satisfactory pharmacotherapy can be achieved. Many chronic pain conditions are associated with hypersensitivity of the nervous system or impaired endogenous pain modulation. Pharmacotherapy often aims at influencing these disturbed nociceptive processes. Its effect might therefore depend on the extent to which they are altered. Quantitative sensory testing (QST) can evaluate various aspects of pain processing and might therefore be able to predict the analgesic efficacy of a given drug. In the present study, three drugs commonly used in the pharmacological management of chronic low back pain are investigated. The primary objective is to examine the ability of QST to predict pain reduction. As a secondary objective, the analgesic effects of these drugs and their effect on QST are evaluated. METHODS/DESIGN In this randomized, double-blinded, placebo-controlled cross-over study, patients with chronic low back pain are randomly assigned to imipramine, oxycodone or clobazam versus active placebo. QST is assessed at baseline and at 1 and 2 h after drug administration. Pain intensity, side effects and patients' global impression of change are assessed at 30-min intervals up to 2 h after drug intake. Baseline QST is used as an explanatory variable to predict drug effect. The change in QST over time is analyzed to describe the pharmacodynamic effects of each drug on experimental pain modalities. Genetic polymorphisms are analyzed as covariates. DISCUSSION Pharmacotherapy is a mainstay of chronic pain treatment. Antidepressants, anticonvulsants and opioids are frequently prescribed in a "trial and error" fashion, without knowing, however, which drug best suits which patient. The present study addresses the important need to translate recent advances in pain research to clinical practice. Assessing the predictive value of central hypersensitivity and endogenous pain modulation could allow for the implementation of a mechanism-based treatment strategy in individual patients. TRIAL REGISTRATION Clinicaltrials.gov, NCT01179828.
Abstract:
HIV-infected women are at increased risk of cervical intra-epithelial neoplasia (CIN) and invasive cervical cancer (ICC), but it has been difficult to disentangle the influences of heavy exposure to HPV infection, inadequate screening, and immunodeficiency. A case-control study including 364 CIN2/3 and 20 ICC cases matched to 1,147 controls was nested in the Swiss HIV Cohort Study (1985-2013). CIN2/3 risk was significantly associated with low CD4+ cell counts, whether measured as nadir (odds ratio (OR) per 100-cell/μL decrease = 1.15, 95% CI: 1.08, 1.22) or at CIN2/3 diagnosis (OR = 1.10, 95% CI: 1.04, 1.16). An association was evident even for nadir CD4+ 200-349 versus ≥350 cells/μL (OR = 1.57, 95% CI: 1.09, 2.25). After adjustment for nadir CD4+, a protective effect of >2-year cART use was seen against CIN2/3 (OR versus never cART use = 0.64, 95% CI: 0.42, 0.98). Despite low study power, similar associations were seen for ICC, notably with nadir CD4+ (OR for <50 versus ≥350 cells/μL = 11.10, 95% CI: 1.24, 100). HPV16-L1 antibodies were significantly associated with CIN2/3, but HPV16-E6 antibodies were nearly exclusively detected in ICC. In conclusion, worsening immunodeficiency, even at only moderately decreased CD4+ cell counts (200-349 CD4+ cells/μL), is a significant risk factor for CIN2/3 and cervical cancer.
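The odds ratios above are expressed per 100-cell/μL decrease in CD4+ count, which amounts to rescaling (and sign-flipping) a per-cell logistic-regression coefficient. A minimal sketch on simulated data follows, assuming an ordinary (unconditional) logistic model rather than the matched case-control analysis used in the study; all variable names and values are illustrative.

```python
# Illustrative rescaling of a logistic-regression coefficient to an OR
# per 100-cell/uL *decrease* in CD4+ count. Simulated, unmatched data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1500
cd4_nadir = rng.gamma(shape=4, scale=100, size=n)            # cells/uL, illustrative
p_case = 1 / (1 + np.exp(0.5 + 0.0014 * (cd4_nadir - 400)))  # lower CD4 -> higher risk
case = rng.binomial(1, p_case)                               # 1 = CIN2/3 case

fit = sm.Logit(case, sm.add_constant(cd4_nadir)).fit(disp=False)
beta_per_cell = fit.params[1]                                # log-odds per +1 cell/uL

# A 100-cell decrease corresponds to -100 * beta on the log-odds scale.
or_per_100_decrease = np.exp(-100 * beta_per_cell)
print(or_per_100_decrease)
```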
Uterine torsion in Brown Swiss cattle: retrospective analysis from an alpine practice in Switzerland
Abstract:
The incidence of uterine torsion in cattle is 0.5–1 per cent of all calvings and up to 30 per cent of all dystocia cases (Berchtold and Rüsch 1993). The unstable suspension of the bovine uterus is a predisposition cited by different authors (Pearson 1971, Schulz and others 1975, Berchtold and Rüsch 1993). Age of the cow, season, and weight and sex of the calf have been inconsistently reported to be associated with uterine torsion (Distl 1991, Frazer and others 1996, Tamm 1997). A small amount of fetal fluids and a large abdomen may contribute to uterine torsion (Berchtold and Rüsch 1993). Furthermore, some authors describe a predisposition in the Brown Swiss breed (Distl 1991, Schmid 1993, Frazer and others 1996) and in cows kept in alpine regions (Schmid 1993). Uterine torsion is predominantly seen during parturition, and the degree of torsion is most often between 180° and 360°. The direction is counter-clockwise in 60–90 per cent of cases (Pearson 1971, Berchtold and Rüsch 1993, Erteld and others 2012). Vaginal delivery is possible after manual detorsion or after rolling of the cow, whereas caesarean section has to be performed after unsuccessful detorsion or if the cervix does not dilate adequately following successful correction of the torsion (Berchtold and Rüsch 1993, Frazer and others 1996). Of all veterinary-assisted dystocia cases, 20 per cent (Aubry and others 2008) to 30 per cent (Berchtold and Rüsch 1993) are due to uterine torsion. Many publications describe fertility variables after dystocia, but only Schönfelder and coworkers reported that 40 per cent of cows became pregnant after uterine torsion followed by caesarean section (Schönfelder and Sobiraj 2005).
Abstract:
INTRODUCTION Late presentation to HIV care leads to increased morbidity and mortality. We explored risk factors and reasons for late HIV testing and presentation to care in the nationally representative Swiss HIV Cohort Study (SHCS). METHODS Adult patients enrolled in the SHCS between July 2009 and June 2012 were included. An initial CD4 count <350 cells/µl or an AIDS-defining illness defined late presentation. Demographic and behavioural characteristics of late presenters (LPs) were compared with those of non-late presenters (NLPs). Information on self-reported, individual barriers to HIV testing and care was obtained during face-to-face interviews. RESULTS Of 1366 patients included, 680 (49.8%) were LPs. Seventy-two percent of eligible patients took part in the survey. LPs were more likely to be female (p<0.001) or from sub-Saharan Africa (p<0.001) and less likely to be highly educated (p=0.002) or men who have sex with men (p<0.001). LPs were more likely to have their first HIV test following a doctor's suggestion (p=0.01), whereas NLPs were more likely to be tested in the context of a regular check-up (p=0.02) or after a specific risk situation (p<0.001). The main reasons for late HIV testing were "did not feel at risk" (72%), "did not feel ill" (65%) and "did not know the symptoms of HIV" (51%). Seventy-one percent of the participants were symptomatic during the year preceding HIV diagnosis and the majority consulted a physician for these symptoms. CONCLUSIONS In Switzerland, late presentation to care is driven by late HIV testing due to low risk perception and lack of awareness about HIV. Tailored HIV testing strategies and enhanced provider-initiated testing are urgently needed.
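The definition of late presentation above (initial CD4 count <350 cells/µl or an AIDS-defining illness) is a simple rule that can be stated as code. A minimal sketch, assuming per-patient values are available; the function and argument names are illustrative.

```python
def is_late_presenter(initial_cd4_cells_per_ul: float, aids_defining_illness: bool) -> bool:
    """Late presentation as defined in the abstract:
    initial CD4 count < 350 cells/ul OR an AIDS-defining illness."""
    return aids_defining_illness or initial_cd4_cells_per_ul < 350

assert is_late_presenter(200, False)      # low CD4 count alone qualifies
assert is_late_presenter(500, True)       # AIDS-defining illness alone qualifies
assert not is_late_presenter(500, False)  # neither criterion met
```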
Abstract:
OBJECTIVE To assess recommended and actual use of statins in primary prevention of cardiovascular disease (CVD) based on clinical prediction scores in adults who develop their first acute coronary syndrome (ACS). METHODS Cross-sectional study of 3172 adults without previous CVD hospitalized with ACS at 4 university centers in Switzerland. The number of participants eligible for statins before hospitalization was estimated based on the European Society of Cardiology (ESC) guidelines and compared with the observed number of participants on statins at hospital entry. RESULTS Overall, 1171 (37%) participants were classified as high risk (10-year risk of cardiovascular mortality ≥5% or diabetes); 1025 (32%) as intermediate risk (10-year risk <5% but ≥1%); and 976 (31%) as low risk (10-year risk <1%). Before hospitalization, 516 (16%) were on statins; among high-risk participants, only 236 of 1171 (20%) were on statins. If ESC primary prevention guidelines had been fully implemented, an additional 845 high-risk adults (27% of the whole sample) would have been eligible for statins before hospitalization. CONCLUSION Although statins are recommended for primary prevention in high-risk adults, only one-fifth of them are on statins when hospitalized for a first ACS.
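The risk stratification above follows the ESC categories used in the analysis (high: 10-year cardiovascular mortality risk ≥5% or diabetes; intermediate: ≥1% but <5%; low: <1%). A minimal sketch of that categorization follows; the thresholds are those reported in the abstract, while the function name and input format are illustrative.

```python
def esc_risk_category(ten_year_cv_mortality_risk: float, has_diabetes: bool) -> str:
    """Classify a participant into the risk strata used in the abstract.

    ten_year_cv_mortality_risk: predicted 10-year risk of cardiovascular
    mortality as a fraction (e.g. 0.05 for 5%).
    """
    if has_diabetes or ten_year_cv_mortality_risk >= 0.05:
        return "high"            # statins recommended for primary prevention
    if ten_year_cv_mortality_risk >= 0.01:
        return "intermediate"
    return "low"

assert esc_risk_category(0.06, False) == "high"
assert esc_risk_category(0.02, False) == "intermediate"
assert esc_risk_category(0.004, False) == "low"
```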
Abstract:
IMPORTANCE Associations between subclinical thyroid dysfunction and fractures are unclear and clinical trials are lacking. OBJECTIVE To assess the association of subclinical thyroid dysfunction with hip, nonspine, spine, or any fractures. DATA SOURCES AND STUDY SELECTION The databases of MEDLINE and EMBASE (inception to March 26, 2015) were searched without language restrictions for prospective cohort studies with thyroid function data and subsequent fractures. DATA EXTRACTION Individual participant data were obtained from 13 prospective cohorts in the United States, Europe, Australia, and Japan. Levels of thyroid function were defined as euthyroidism (thyroid-stimulating hormone [TSH], 0.45-4.49 mIU/L), subclinical hyperthyroidism (TSH <0.45 mIU/L), and subclinical hypothyroidism (TSH ≥4.50-19.99 mIU/L) with normal thyroxine concentrations. MAIN OUTCOMES AND MEASURES The primary outcome was hip fracture. Any fractures, nonspine fractures, and clinical spine fractures were secondary outcomes. RESULTS Among 70,298 participants, 4092 (5.8%) had subclinical hypothyroidism and 2219 (3.2%) had subclinical hyperthyroidism. During 762,401 person-years of follow-up, hip fracture occurred in 2975 participants (4.6%; 12 studies), any fracture in 2528 participants (9.0%; 8 studies), nonspine fracture in 2018 participants (8.4%; 8 studies), and spine fracture in 296 participants (1.3%; 6 studies). In age- and sex-adjusted analyses, the hazard ratio (HR) for subclinical hyperthyroidism vs euthyroidism was 1.36 for hip fracture (95% CI, 1.13-1.64; 146 events in 2082 participants vs 2534 in 56,471); for any fracture, HR was 1.28 (95% CI, 1.06-1.53; 121 events in 888 participants vs 2203 in 25,901); for nonspine fracture, HR was 1.16 (95% CI, 0.95-1.41; 107 events in 946 participants vs 1745 in 21,722); and for spine fracture, HR was 1.51 (95% CI, 0.93-2.45; 17 events in 732 participants vs 255 in 20,328). Lower TSH was associated with higher fracture rates: for TSH of less than 0.10 mIU/L, HR was 1.61 for hip fracture (95% CI, 1.21-2.15; 47 events in 510 participants); for any fracture, HR was 1.98 (95% CI, 1.41-2.78; 44 events in 212 participants); for nonspine fracture, HR was 1.61 (95% CI, 0.96-2.71; 32 events in 185 participants); and for spine fracture, HR was 3.57 (95% CI, 1.88-6.78; 8 events in 162 participants). Risks were similar after adjustment for other fracture risk factors. Endogenous subclinical hyperthyroidism (excluding thyroid medication users) was associated with HRs of 1.52 (95% CI, 1.19-1.93) for hip fracture, 1.42 (95% CI, 1.16-1.74) for any fracture, and 1.74 (95% CI, 1.01-2.99) for spine fracture. No association was found between subclinical hypothyroidism and fracture risk. CONCLUSIONS AND RELEVANCE Subclinical hyperthyroidism was associated with an increased risk of hip and other fractures, particularly among those with TSH levels of less than 0.10 mIU/L and those with endogenous subclinical hyperthyroidism. Further study is needed to determine whether treating subclinical hyperthyroidism can prevent fractures.
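The age- and sex-adjusted hazard ratios reported above come from time-to-event models comparing subclinical hyperthyroidism with euthyroidism. A minimal sketch of such a Cox proportional-hazards fit using the lifelines package on simulated data follows; the column names, follow-up times, and effect sizes are illustrative, not values from the pooled cohorts.

```python
# Illustrative Cox model: HR for subclinical hyperthyroidism vs euthyroidism.
# Simulated data only; not the individual participant data from the abstract.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "subclin_hyper": rng.binomial(1, 0.03, n),   # 1 = TSH < 0.45 mIU/L, normal thyroxine
    "age": rng.normal(72, 8, n),                 # years
    "female": rng.binomial(1, 0.55, n),
})
# Simulated hazard increases with age and with subclinical hyperthyroidism.
hazard = 0.002 * np.exp(0.04 * (df["age"] - 72) + 0.3 * df["subclin_hyper"])
df["time"] = rng.exponential(1 / hazard)         # years until hip fracture
df["event"] = (df["time"] < 12).astype(int)      # administrative censoring at 12 years
df["time"] = df["time"].clip(upper=12)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")  # age- and sex-adjusted by inclusion
cph.print_summary()  # exp(coef) for subclin_hyper approximates the adjusted HR
```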
Abstract:
AIMS We aimed to assess the prevalence and management of clinical familial hypercholesterolaemia (FH) among patients with acute coronary syndrome (ACS). METHODS AND RESULTS We studied 4778 patients with ACS from a multi-centre cohort study in Switzerland. Based on personal and familial history of premature cardiovascular disease and LDL-cholesterol levels, two validated algorithms for diagnosis of clinical FH were used: the Dutch Lipid Clinic Network algorithm to assess possible (score 3-5 points) or probable/definite FH (>5 points), and the Simon Broome Register algorithm to assess possible FH. At the time of hospitalization for ACS, 1.6% had probable/definite FH [95% confidence interval (CI) 1.3-2.0%, n = 78] and 17.8% had possible FH (95% CI 16.8-18.9%, n = 852) according to the Dutch Lipid Clinic algorithm. The Simon Broome algorithm identified 5.4% (95% CI 4.8-6.1%, n = 259) patients with possible FH. Among 1451 young patients with premature ACS, the Dutch Lipid Clinic algorithm identified 70 (4.8%, 95% CI 3.8-6.1%) patients with probable/definite FH, and 684 (47.1%, 95% CI 44.6-49.7%) patients with possible FH. Excluding patients with secondary causes of dyslipidaemia, such as alcohol consumption, acute renal failure, or hyperglycaemia, did not change the prevalence. One year after ACS, among 69 survivors with probable/definite FH and available follow-up information, 64.7% were using high-dose statins, 69.0% had decreased LDL-cholesterol by at least 50%, and 4.6% had LDL-cholesterol ≤1.8 mmol/L. CONCLUSION A phenotypic diagnosis of possible FH is common in patients hospitalized with ACS, particularly among those with premature ACS. Optimizing long-term lipid treatment of patients with FH after ACS is required.
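The Dutch Lipid Clinic Network algorithm sums points from personal and family history, clinical signs, and LDL-cholesterol; the abstract applies cut-offs of 3-5 points for possible FH and >5 points for probable/definite FH. A minimal sketch of that final classification step follows; the point assignment itself is not reproduced, and the "unlikely FH" label for scores below 3 is an assumption not stated in the abstract.

```python
def dlcn_category(total_points: int) -> str:
    """Map a Dutch Lipid Clinic Network score to the categories used in the abstract."""
    if total_points > 5:
        return "probable/definite FH"
    if 3 <= total_points <= 5:
        return "possible FH"
    return "unlikely FH"   # label for scores < 3 assumed for illustration

assert dlcn_category(7) == "probable/definite FH"
assert dlcn_category(4) == "possible FH"
assert dlcn_category(2) == "unlikely FH"
```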
Abstract:
OBJECTIVES The aim of this study was to assess the safety of the concurrent administration of a clopidogrel and prasugrel loading dose in patients undergoing primary percutaneous coronary intervention. BACKGROUND Prasugrel is one of the preferred P2Y12 platelet receptor antagonists for ST-segment elevation myocardial infarction (STEMI) patients. The use of prasugrel was evaluated clinically in clopidogrel-naive patients. METHODS Between September 2009 and October 2012, a total of 2,023 STEMI patients were enrolled in the COMFORTABLE (Comparison of Biomatrix Versus Gazelle in ST-Elevation Myocardial Infarction [STEMI]) and the SPUM-ACS (Inflammation and Acute Coronary Syndromes) studies. Patients receiving a prasugrel loading dose were divided into 2 groups: 1) clopidogrel and a subsequent prasugrel loading dose; and 2) a prasugrel loading dose alone. The primary safety endpoint was Bleeding Academic Research Consortium types 3 to 5 bleeding in hospital at 30 days. RESULTS Of 2,023 patients undergoing primary percutaneous coronary intervention, 427 (21.1%) received clopidogrel and a subsequent prasugrel loading dose, 447 (22.1%) received a prasugrel loading dose alone, and the remaining patients received clopidogrel only. At 30 days, the primary safety endpoint was observed in 1.9% of those receiving clopidogrel and a subsequent prasugrel loading dose and 3.4% of those receiving a prasugrel loading dose alone (adjusted hazard ratio [HR]: 0.57; 95% confidence interval [CI]: 0.25 to 1.30, p = 0.18). The HAS-BLED (hypertension, abnormal renal/liver function, stroke, bleeding history or predisposition, labile international normalized ratio, elderly, drugs/alcohol concomitantly) bleeding score tended to be higher in prasugrel-treated patients (p = 0.076). The primary safety endpoint results, however, remained unchanged after adjustment for these differences (clopidogrel and a subsequent prasugrel loading dose vs. prasugrel only; HR: 0.54 [95% CI: 0.23 to 1.27], p = 0.16). No differences in the composite of cardiac death, myocardial infarction, or stroke were observed at 30 days (adjusted HR: 0.66, 95% CI: 0.27 to 1.62, p = 0.36). CONCLUSIONS This observational, nonrandomized study of ST-segment elevation myocardial infarction patients suggests that the administration of a loading dose of prasugrel in patients pre-treated with a loading dose of clopidogrel is not associated with an excess of major bleeding events. (Comparison of Biomatrix Versus Gazelle in ST-Elevation Myocardial Infarction [STEMI] [COMFORTABLE]; NCT00962416; and Inflammation and Acute Coronary Syndromes [SPUM-ACS]; NCT01000701).
Abstract:
BACKGROUND The objective of the study was to evaluate the implications of different classifications of rheumatic heart disease (RHD) for estimated prevalence, and to systematically assess the importance of incidental findings from echocardiographic screening among schoolchildren in Peru. METHODS We performed a cluster randomized observational survey using portable echocardiography among schoolchildren aged 5 to 16 years from randomly selected public and private schools in Arequipa, Peru. Rheumatic heart disease was defined according to the modified World Health Organization (WHO) criteria and the World Heart Federation (WHF) criteria. FINDINGS Among 1395 eligible students from 40 classes and 20 schools, 1023 (73%) participated in the present survey. The median age of the children was 11 years (interquartile range [IQR] 8-13 years) and 50% were girls. The prevalence of possible, probable and definite rheumatic heart disease according to the modified WHO criteria amounted to 19.7/1000 children and ranged from 10.2/1000 among children 5 to 8 years of age to 39.8/1000 among children 13 to 16 years of age; the prevalence of borderline/definite rheumatic heart disease according to the WHF criteria was 3.9/1000 children. 21 children (2.1%) were found to have congenital heart disease, 8 of whom were referred for percutaneous or surgical intervention. CONCLUSIONS The prevalence of RHD in Peru was considerably lower than in endemic regions of sub-Saharan Africa, southeast Asia, and Oceania, and was paralleled by a comparable number of cases of undetected congenital heart disease. Strategies to address collateral findings from echocardiographic screening are necessary in the setup of active surveillance programs for RHD. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT02353663.
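The prevalence figures above are expressed per 1000 children screened. Below is a minimal sketch of converting a case count into a rate per 1000 with an exact (Clopper-Pearson) confidence interval using statsmodels; the counts shown are illustrative and the calculation ignores the cluster sampling design.

```python
# Illustrative prevalence-per-1000 calculation (ignores the cluster design).
from statsmodels.stats.proportion import proportion_confint

def prevalence_per_1000(cases: int, screened: int):
    """Return the rate per 1000 with an exact (Clopper-Pearson) 95% CI."""
    rate = 1000 * cases / screened
    lo, hi = proportion_confint(cases, screened, alpha=0.05, method="beta")
    return rate, 1000 * lo, 1000 * hi

# e.g. an illustrative 20 of 1023 children meeting the modified WHO criteria
print(prevalence_per_1000(20, 1023))
```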