Abstract:
BACKGROUND: Guidelines for the management of anaemia in patients with chronic kidney disease (CKD) recommend a minimal haemoglobin (Hb) target of 11 g/dL. Recent surveys indicate that this requirement is not met in many patients in Europe. In most studies, Hb is only assessed over a short-term period. The aim of this study was to examine the control of anaemia over a continuous long-term period in Switzerland. METHODS: A prospective multi-centre observational study was conducted in dialysed patients treated with recombinant human epoetin (EPO) beta, over a one-year follow-up period, with monthly assessments of anaemia parameters. RESULTS: Three hundred and fifty patients from 27 centres, representing 14% of the dialysis population in Switzerland, were included. Mean Hb was 11.9 ± 1.0 g/dL and remained stable over time. Eighty-five percent of the patients achieved mean Hb ≥ 11 g/dL. Mean EPO dose was 155 ± 118 IU/kg/week, delivered mostly by the subcutaneous route (64-71%). Mean serum ferritin and transferrin saturation were 435 ± 253 µg/L and 30 ± 11%, respectively. At month 12, adequate iron stores were found in 72.5% of patients, whereas absolute and functional iron deficiencies were observed in only 5.1% and 17.8%, respectively. Multivariate analysis showed that diabetes unexpectedly influenced Hb towards higher levels (12.1 ± 0.9 g/dL; p = 0.02). One-year survival was significantly higher in patients with Hb ≥ 11 g/dL than in those with Hb < 11 g/dL (19.7% vs 7.3%, p = 0.006). CONCLUSION: In comparison with European reference studies, this survey shows remarkable and continuous control of anaemia in Swiss dialysis centres. These results were achieved through moderately high EPO doses, mostly given subcutaneously, and careful iron therapy management.
Abstract:
OBJECTIVES: Etravirine (ETV) is a novel nonnucleoside reverse transcriptase inhibitor (NNRTI) with reduced cross-resistance to first-generation NNRTIs, which has been primarily studied in randomized clinical trials and not in routine clinical settings. METHODS: ETV resistance-associated mutations (RAMs) were investigated by analysing 6072 genotypic tests. The antiviral activity of ETV was predicted using different interpretation systems: International AIDS Society-USA (IAS-USA), Stanford, Rega and Agence Nationale de Recherches sur le Sida et les hépatites virales (ANRS). RESULTS: The prevalence of ETV RAMs was higher in NNRTI-exposed patients [44.9%, 95% confidence interval (CI) 41.0-48.9%] than in treatment-naïve patients (9.6%, 95% CI 8.5-10.7%). ETV RAMs in treatment-naïve patients mainly represent polymorphism, as prevalence estimates in genotypic tests for treatment-naïve patients with documented recent (<1 year) infection, who had acquired HIV before the introduction of NNRTIs, were almost identical (9.8%, 95% CI 3.3-21.4). Discontinuation of NNRTI treatment led to a marked drop in the detection of ETV RAMs, from 51.7% (95% CI 40.8-62.6%) to 34.5% (95% CI 24.6-45.4%, P=0.032). Differences in prevalence among subtypes were found for V90I and V179T (P<0.001). Estimates of restricted virological response to ETV varied among algorithms in patients with exposure to efavirenz (EFV)/nevirapine (NVP), ranging from 3.8% (95% CI 2.5-5.6%) for ANRS to 56.2% (95% CI 52.2-60.1%) for Stanford. The predicted activity of ETV decreased as the sensitivity of potential optimized background regimens decreased. The presence of major IAS-USA mutations (L100I, K101E/H/P and Y181C/I/V) reduced the treatment response at week 24. CONCLUSIONS: Most ETV RAMs in drug-naïve patients are polymorphisms rather than transmitted RAMs. Uncertainty regarding predictions of antiviral activity for ETV in NNRTI-treated patients remains high. 
The lowest activity was predicted for patients harbouring extensive multidrug-resistant viruses, thus limiting ETV use in those who are most in need.
Abstract:
BACKGROUND: The optimal strategy for percutaneous coronary intervention (PCI) of ST-segment elevation myocardial infarction (STEMI) in multi-vessel disease (MVD), i.e., multi-vessel PCI (MV-PCI) vs. PCI of the infarct-related artery only (IRA-PCI), still remains unknown. METHODS: Patients of the AMIS Plus registry admitted with an acute coronary syndrome were contacted after a median of 378 days (interquartile range 371-409). The primary end-point was all-cause death. The secondary end-point included all major adverse cardiovascular and cerebrovascular events (MACCE) including death, re-infarction, re-hospitalization for cardiac causes, any cardiac re-intervention, and stroke. RESULTS: Between 2005 and 2012, 8330 STEMI patients were identified, of whom 1909 (24%) had MVD. Of these, 442 (23%) received MV-PCI and 1467 (77%) IRA-PCI. While all-cause mortality was similar in both groups (2.7% both, p>0.99), MACCE was significantly lower after MV-PCI vs. IRA-PCI (15.6% vs. 20.0%, p=0.038), mainly driven by lower rates of cardiac re-hospitalization and cardiac re-intervention. Patients undergoing MV-PCI with drug-eluting stents had lower rates of all-cause mortality (2.1% vs. 7.4%, p=0.026) and MACCE (14.1% vs. 25.9%, p=0.042) compared with those receiving bare metal stents (BMS). In multivariate analysis, MV-PCI (odds ratio, OR 0.69, 95% CI 0.51-0.93, p=0.017) and comorbidities (Charlson index ≥ 2; OR 1.42, 95% CI 1.05-1.92, p=0.025) were independent predictors for 1-year MACCE. CONCLUSION: In an unselected nationwide real-world cohort, an approach using immediate complete revascularization may be beneficial in STEMI patients with MVD regarding MACCE, specifically when drug-eluting stents are used, but not regarding mortality. This has to be tested in a randomized controlled trial.
Abstract:
BACKGROUND: Mental disorders, common in primary care, are often associated with physical complaints. While exposure to psychosocial stressors and the development or presence of principal mental disorders (i.e. depression, anxiety, and somatoform disorders defined as multisomatoform disorders) are commonly correlated, a temporal association remains unproven. This study explores the onset of such disorders after exposure to psychosocial stressors in a cohort of primary care patients with at least one physical symptom. METHOD: The SODA (SOmatization, Depression and Anxiety) cohort study was conducted by 21 private-practice GPs and three fellow physicians in a Swiss academic primary care centre. GPs included patients via randomized daily identifiers. Depression, anxiety, or somatoform disorders were identified by the full Patient Health Questionnaire (PHQ), a validated procedure for identifying mental disorders based on DSM-IV criteria. The PHQ was also used to investigate exposure to psychosocial stressors (before the index consultation and during follow-up) and the onset of principal mental disorders after one year of follow-up. RESULTS: From November 2004 to July 2005, 1020 patients were screened for inclusion; 627 were eligible, and 482 (77%) completed the PHQ one year later and were included in the analysis. At one year, the prevalence of principal mental disorders was 30/153 (19.6%, 95% CI 13.6-26.8) in those initially exposed to a major psychosocial stressor and 26/329 (7.9%, 95% CI 5.2-11.4) in those not exposed. The association with psychosocial stressors was stronger for depression (RR = 2.4) and anxiety (RR = 3.5) than for multisomatoform disorders (RR = 1.8). Patients who were "bothered a lot" (subjective distress) by a stressor were 2.5 times (95% CI 1.5-4.0) more likely to experience a mental disorder at one year.
A history of psychiatric comorbidities or psychological treatment was not a confounding factor for developing a principal mental disorder after exposure to psychosocial stressors. CONCLUSION: This primary care study shows that patients with physical complaints who were exposed to psychosocial stressors had a higher risk of developing mental disorders one year later. This temporal association opens the field for further research on preventive care for mental disorders in primary care patients.
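The overall exposed-versus-unexposed risk ratio implied by the counts above (30/153 vs 26/329) can be reproduced with a standard log-scale confidence interval. A minimal sketch, assuming the conventional Wald-type CI on the log risk ratio (the abstract does not state its exact method):

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Risk ratio with a Wald-type 95% CI on the log scale.
    a/n1: events/total in the exposed group; b/n2: in the unexposed group."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Counts reported for the SODA cohort at one year
rr, lo, hi = relative_risk(30, 153, 26, 329)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The point estimate comes out near 2.5, in line with the subjective-distress figure quoted in the abstract, though the published RRs are disorder-specific and presumably adjusted.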
Safety of artemether-lumefantrine exposure in first trimester of pregnancy: an observational cohort.
Abstract:
BACKGROUND: There are limited data available regarding the safety profile of artemisinins in early pregnancy. They are, therefore, not recommended by WHO as first-line treatment for malaria in the first trimester due to embryo-foetal toxicity observed in animal studies. This study assessed birth outcomes among pregnant women inadvertently exposed to artemether-lumefantrine (AL) during the first trimester, compared with those of women exposed to other anti-malarial drugs, or to no drug at all, during the same period of pregnancy. METHODS: Pregnant women with gestational age <20 weeks were recruited from Maternal Health clinics or from monthly house visits (demographic surveillance) and followed prospectively until delivery. RESULTS: 2167 pregnant women were recruited and 1783 (82.3%) completed the study until delivery. 319 (17.9%) used anti-malarials in the first trimester, of whom 172 (53.9%) used AL, 78 (24.4%) quinine, 66 (20.7%) sulphadoxine-pyrimethamine (SP), and 11 (3.4%) amodiaquine. Quinine exposure in the first trimester was associated with an increased risk of miscarriage/stillbirth (OR 2.5; 1.3-5.1) and premature birth (OR 2.6; 1.3-5.3), as opposed to AL (OR 1.4; 0.8-2.5 for miscarriage/stillbirth and OR 0.9; 0.5-1.8 for preterm birth). Congenital anomalies were identified in 4 exposure groups, namely AL only (1/164 [0.6%]), quinine only (1/70 [1.4%]), SP (2/66 [3.0%]), and the non-anti-malarial exposure group (19/1464 [1.3%]). CONCLUSION: Exposure to AL in the first trimester was more common than exposure to any other anti-malarial drug. Quinine exposure was associated with adverse pregnancy outcomes, which was not the case for the other anti-malarials. Since AL and quinine were used according to their availability rather than to disease severity, it is likely that the effect observed was related to the drug and not to the disease itself. Even with this caveat, a change of policy from quinine to AL for the treatment of uncomplicated malaria throughout pregnancy could already be envisaged.
Abstract:
Purpose: To assess the global cardiovascular (CV) risk of an individual, several scores have been developed. However, their accuracy and comparability need to be evaluated in populations other than those from which they were derived. The aim of this study was to compare the predictive accuracy of 4 CV risk scores using data from a large population-based cohort. Methods: Prospective cohort study including 4980 participants (2698 women; mean age ± SD: 52.7 ± 10.8 years) in Lausanne, Switzerland, followed for an average of 5.5 years (range 0.2-8.5). Two end points were assessed: 1) coronary heart disease (CHD) and 2) CV disease (CVD). Four risk scores were compared: the original and recalibrated Framingham coronary heart disease scores (1998 and 2001); the original PROCAM score (2002) and its recalibrated version for Switzerland (IAS-AGLA); and the Reynolds risk score. Discrimination was assessed using Harrell's C statistic, model fitness using Akaike's information criterion (AIC), and calibration using a pseudo Hosmer-Lemeshow test. Sensitivity, specificity, and the corresponding 95% confidence intervals were assessed for each risk score using the highest risk category (≥20% at 10 years) as the "positive" test. Results: The recalibrated and original 1998 and original 2001 Framingham scores showed better discrimination (>0.720) and model fitness (low AIC) for CHD and CVD. All 4 scores were correctly calibrated (Chi² < 20). The recalibrated Framingham 1998 score had the best sensitivities, 37.8% and 40.4%, for CHD and CVD, respectively. All scores presented specificities >90%. The Framingham 1998, PROCAM, and IAS-AGLA scores placed the greatest number of subjects (>200) in the high-risk category, whereas the recalibrated Framingham 2001 and Reynolds scores placed ≤44 subjects there. Conclusion: In this cohort, we observed variations in accuracy between risk scores, with the original Framingham 2001 score demonstrating the best compromise between accuracy and a limited selection of subjects in the highest risk category.
We advocate that national guidelines, based on independently validated data, take into account CV risk scores calibrated for their respective countries.
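Treating membership of the highest risk category as the "positive" test reduces sensitivity and specificity to an ordinary 2×2 confusion table. A minimal sketch with hypothetical counts chosen for illustration only (the abstract does not report the underlying cell counts):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table where 'positive'
    means classified into the highest risk category (>=20% at 10 years)."""
    sensitivity = tp / (tp + fn)  # events correctly flagged as high risk
    specificity = tn / (tn + fp)  # non-events correctly left out
    return sensitivity, specificity

# Hypothetical counts: 90 events (34 flagged), 4890 non-events (390 flagged)
sens, spec = sens_spec(tp=34, fn=56, tn=4500, fp=390)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```

With these invented counts the output (sensitivity ≈ 37.8%, specificity ≈ 92%) mirrors the magnitudes reported above, but the real cohort's table would be needed to reproduce the published figures.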
Abstract:
OBJECTIVES: Toll-like receptors (TLRs) are innate immune sensors that are integral to resisting chronic and opportunistic infections. Mounting evidence implicates TLR polymorphisms in susceptibilities to various infectious diseases, including HIV-1. We investigated the impact of TLR single nucleotide polymorphisms (SNPs) on clinical outcome in a seroincident cohort of HIV-1-infected volunteers. DESIGN: We analyzed TLR SNPs in 201 antiretroviral treatment-naive HIV-1-infected volunteers from a longitudinal seroincident cohort with regular follow-up intervals (median follow-up 4.2 years, interquartile range 4.4). Participants were stratified into two groups according to either disease progression, defined as peripheral blood CD4(+) T-cell decline over time, or peak and setpoint viral load. METHODS: Haplotype tagging SNPs from TLR2, TLR3, TLR4, and TLR9 were detected by mass array genotyping, and CD4(+) T-cell counts and viral load measurements were determined prior to antiretroviral therapy initiation. The association of TLR haplotypes with viral load and rapid progression was assessed by multivariate regression models using age and sex as covariates. RESULTS: Two TLR4 SNPs in strong linkage disequilibrium [1063 A/G (D299G) and 1363 C/T (T399I)] were more frequent among individuals with high peak viral load compared with low/moderate peak viral load (odds ratio 6.65, 95% confidence interval 2.19-20.46, P < 0.001; adjusted P = 0.002 for 1063 A/G). In addition, a TLR9 SNP previously associated with slow progression was found less frequently among individuals with high viral setpoint compared with low/moderate setpoint (odds ratio 0.29, 95% confidence interval 0.13-0.65, P = 0.003, adjusted P = 0.04). CONCLUSION: This study suggests a potentially new role for TLR4 polymorphisms in HIV-1 peak viral load and confirms a role for TLR9 polymorphisms in disease progression.
Abstract:
BACKGROUND AND OBJECTIVES: Combination antiretroviral therapy (cART) is changing, and this may affect the type and occurrence of side effects. We examined the frequency of lipodystrophy (LD) and weight changes in relation to the use of specific drugs in the Swiss HIV Cohort Study (SHCS). METHODS: In the SHCS, patients are followed twice a year and scored by the treating physician as having 'fat accumulation', 'fat loss', or neither. Treatments, and reasons for change thereof, are recorded. Our study sample included all patients treated with cART between 2003 and 2006 and, in addition, all patients who started cART between 2000 and 2003. RESULTS: From 2003 to 2006, the percentage of patients taking stavudine, didanosine and nelfinavir decreased, the percentage taking lopinavir, nevirapine and efavirenz remained stable, and the percentage taking atazanavir and tenofovir increased by 18.7 and 22.2%, respectively. In life-table Kaplan-Meier analysis, patients starting cART in 2003-2006 were less likely to develop LD than those starting cART from 2000 to 2002 (P<0.02). LD was quoted as the reason for treatment change or discontinuation for 4% of patients on cART in 2003, and for 1% of patients treated in 2006 (P for trend <0.001). In univariate and multivariate regression analysis, patients with a weight gain of ≥5 kg were more likely to take lopinavir or atazanavir than patients without such a weight gain [odds ratio (OR) 2, 95% confidence interval (CI) 1.3-2.9, and OR 1.7, 95% CI 1.3-2.1, respectively]. CONCLUSIONS: LD has become less frequent in the SHCS from 2000 to 2006. A weight gain of more than 5 kg was associated with the use of atazanavir and lopinavir.
Abstract:
OBJECTIVES: Darunavir was designed for activity against HIV resistant to other protease inhibitors (PIs). We assessed the efficacy, tolerability and risk factors for virological failure of darunavir for treatment-experienced patients seen in clinical practice. METHODS: We included all patients in the Swiss HIV Cohort Study starting darunavir after recording a viral load above 1000 HIV-1 RNA copies/mL given prior exposure to both PIs and nonnucleoside reverse transcriptase inhibitors. We followed these patients for up to 72 weeks, assessed virological failure using different loss of virological response algorithms and evaluated risk factors for virological failure using a Bayesian method to fit discrete Cox proportional hazard models. RESULTS: Among 130 treatment-experienced patients starting darunavir, the median age was 47 years, the median duration of HIV infection was 16 years, and 82% received mono or dual antiretroviral therapy before starting highly active antiretroviral therapy. During a median patient follow-up period of 45 weeks, 17% of patients stopped taking darunavir after a median exposure of 20 weeks. In patients followed beyond 48 weeks, the rate of virological failure at 48 weeks was at most 20%. Virological failure was more likely where patients had previously failed on both amprenavir and saquinavir and as the number of previously failed PI regimens increased. CONCLUSIONS: As a component of therapy for treatment-experienced patients, darunavir can achieve a similar efficacy and tolerability in clinical practice to that seen in clinical trials. Clinicians should consider whether a patient has failed on both amprenavir and saquinavir and the number of failed PI regimens before prescribing darunavir.
Abstract:
Molecular and genetic investigations in endometrial carcinogenesis may have prognostic and therapeutic implications. We studied the expression of EGFR, c-Met, PTEN and the mTOR signalling pathway (phospho-AKT/phospho-mTOR/phospho-RPS6) in 69 consecutive tumours and 16 tissue microarrays. We also analysed PIK3CA, K-Ras mutations and microsatellite instability (MSI). We distinguished two groups: group 1 (grade 1 and 2 endometrioid cancers) and group 2 (grade 3 endometrioid and type II clear and serous cell cancers). We hypothesised that these histological groups might have different features. We found that a) survival was higher in group 1 with less aggressive tumours (P<0.03); b) EGFR (P=0.01), PTEN and the AKT/mTOR/RPS6 signalling pathway were increased in group 1 versus group 2 (P=0.05 for phospho-mTOR); c) conversely, c-Met was higher (P<0.03) in group 2 than in group 1; d) in group 1, EGFR was correlated with c-Met, phospho-mTOR, phospho-RPS6 and the global activity of the phospho-AKT/phospho-mTOR/phospho-RPS6 pathway, whereas in group 2, EGFR was correlated only with the phospho-AKT/phospho-mTOR/phospho-RPS6 pathway, and c-Met was correlated with PTEN; e) survival was higher for tumours with more than 50% PTEN-positive cells; f) K-RAS and PIK3CA mutations occurred in 10-12% of the available tumours and MSI in 40.4%, with a loss of MLH1 and PMS2 expression. Our results for endometrial cancers provide the first evidence for a difference in status between groups 1 and 2. The patients may benefit from different targeted treatments: anti-EGFR agents and rapamycin derivatives (anti-mTOR) for group 1 and an anti c-MET/ligand complex for group 2.
Abstract:
Background and Aims: The 2007 ECCO guidelines on anemia in inflammatory bowel disease (IBD) favour intravenous (iv) over oral (po) iron supplementation due to better effectiveness and tolerance. Application of guidelines in clinical practice may require time. We aimed to determine the percentage of IBD patients under iron supplementation therapy and its application mode over time in a large IBD cohort. Methods: Helsana, a leading Swiss health insurance company, provides coverage for approximately 18% of the Swiss population, corresponding to about 1.2 million enrollees. Patients with Crohn's disease (CD) and ulcerative colitis (UC) were identified by keyword search from the anonymised Helsana database. Results: In total, 629 CD (61% female) and 403 UC (56% female) patients were identified; mean retrospective observation time was 20.4 months for CD and 13 months for UC patients. Of the entire study population, 29.3% were prescribed iron. Occurrence of iron prescription was 21.3% in males and 31.2% in females (odds ratio [OR] 1.69, 95% confidence interval [CI] 1.26-2.28). The prescription of iv iron increased from 2006/2007 (48.8% with iv iron) to 2008/2009 (65.2% with iv iron) by a factor of 1.89. Conclusions: One third of the IBD population was treated with iron supplementation. A gradual shift from oral to iv iron was observed over time in a large Swiss IBD cohort. This switch in prescription habits goes along with the implementation of the ECCO consensus guidelines on anemia in IBD.
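The female-versus-male odds ratio for iron prescription can be approximated directly from the reported prevalences (31.2% vs 21.3%). A minimal sketch; the published OR of 1.69 was presumably computed from the underlying patient counts (and possibly adjusted), so this back-of-the-envelope value differs slightly:

```python
def odds_ratio_from_props(p_exposed, p_reference):
    """Odds ratio comparing two prevalence proportions."""
    def odds(p):
        return p / (1 - p)
    return odds(p_exposed) / odds(p_reference)

# Iron prescription: 31.2% in females vs 21.3% in males
or_female_vs_male = odds_ratio_from_props(0.312, 0.213)
print(f"OR ≈ {or_female_vs_male:.2f}")
```

This yields roughly 1.68, consistent with the published 1.69 given rounding of the percentages.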
Abstract:
BACKGROUND AND PURPOSE: Intravenous thrombolysis for acute ischemic stroke is beneficial within 4.5 hours of symptom onset, but the effect rapidly decreases over time, necessitating quick diagnostic in-hospital work-up. Initial time strain occasionally results in treatment of patients with an alternate diagnosis (stroke mimics). We investigated whether intravenous thrombolysis is safe in these patients. METHODS: In this multicenter observational cohort study containing 5581 consecutive patients treated with intravenous thrombolysis, we determined the frequency and the clinical characteristics of stroke mimics. For safety, we compared the symptomatic intracranial hemorrhage (European Cooperative Acute Stroke Study II [ECASS-II] definition) rate of stroke mimics with ischemic strokes. RESULTS: One hundred stroke mimics were identified, resulting in a frequency of 1.8% (95% confidence interval, 1.5-2.2). Patients with a stroke mimic were younger, more often female, and had fewer risk factors except smoking and previous stroke or transient ischemic attack. The symptomatic intracranial hemorrhage rate in stroke mimics was 1.0% (95% confidence interval, 0.0-5.0) compared with 7.9% (95% confidence interval, 7.2-8.7) in ischemic strokes. CONCLUSIONS: In experienced stroke centers, among patients treated with intravenous thrombolysis, only a few had a final diagnosis other than stroke. The complication rate in these stroke mimics was low.
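The 1.0% hemorrhage rate in stroke mimics rests on a single event among 100 patients, which is why its confidence interval is so wide. A minimal sketch using the Wilson score interval; the abstract does not state which interval method was used, so these bounds differ slightly from the published 0.0-5.0:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion k/n."""
    p = k / n
    centre = p + z * z / (2 * n)
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    denom = 1 + z * z / n
    return (centre - half) / denom, (centre + half) / denom

# 1 symptomatic intracranial hemorrhage among 100 stroke mimics
lo, hi = wilson_ci(1, 100)
print(f"rate 1.0% (95% CI {lo:.1%}-{hi:.1%})")
```

The Wilson interval here spans roughly 0.2% to 5.4%, illustrating how little precision a single event provides.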
Abstract:
Background: Long-term treatment of primary HIV-1 infection (PHI) may allow the immune reconstitution of responses lost during the acute viremic phase and a decrease of peripheral reservoirs. This in turn may represent the best setting for the use of therapeutic vaccines aimed at lowering the viral set-point or controlling viral rebound upon ART discontinuation. Methods: We investigated a cohort of 16 patients who started ART at PHI, with a treatment duration of ≥4 years and persistent aviremia (<50 HIV-1 copies/ml). The cohort was characterized in terms of viral subtype, cell-associated RNA, proviral DNA, and HLA genotype. Secretion of IFN-γ, IL-2, and TNF-α by CD8 T-cells was analysed by polychromatic flow cytometry using a panel of 192 HIV-1-derived epitopes. Results: The cohort was highly homogeneous in terms of viral subtype: 81% clade B. We identified 44 epitope-specific responses; all patients had detectable responses to >1 epitope, and the mean number of responding epitopes per patient was 3. The mean frequency of cytokine-secreting CD8 T-cells was 0.32%. CD8 T-cells secreting IFN-γ, IL-2, and TNF-α simultaneously made up about 40% of the response, and cells secreting at least 2 cytokines about 80%, consistent with a highly polyfunctional CD8 T-cell profile. There was no difference in terms of polyfunctionality when HLA restriction or the recognized viral regions and epitopes were considered. Proviral DNA was detectable in all patients but at low levels (mean = 108 copies/million PBMCs), while cell-associated mRNA was not detectable in 19% of patients (mean = 11 copies/million PBMCs when detectable). Conclusion: Patients with sustained virological suppression after initiation of ART at PHI show polyfunctional CD8 T-cells and low levels of proviral DNA, with an absence of residual replication in a substantial percentage of patients.
The use of therapeutic vaccines in this population may promote a low level of rebound viremia or control of viral replication upon ART cessation.
Abstract:
Despite recent medical progress in patient support, the mortality of sepsis remains high. Recently, new supportive strategies were proposed to improve outcome. Whereas such strategies are currently considered standard of care, their real impact on mortality, morbidity, length of stay, and hence health care resource utilization, has been only weakly evaluated so far. There is clearly a critical need for epidemiologic surveys of sepsis to better address these major issues. The Lausanne Cohort of septic patients aims at building a large clinical, biological, and microbiological database that will be used as a multidisciplinary research platform to study the various pathogenic mechanisms of sepsis in collaboration with the various specialists. This could be an opportunity to strengthen collaboration within the Swiss Latin network of Intensive Care Medicine.
Abstract:
The aim of this study was to assess the prevalence of malignant lymphomas in patients with long-standing primary Sjögren's syndrome (pSS). We retrospectively studied a cohort of 55 patients with pSS over a mean follow-up period of 12 years. Five patients (9%) developed malignant lymphoma. The interval between the diagnoses of SS and lymphoma ranged from four to 12 years (mean = 6.5 years). The lymphoma arose in the lymph nodes in two cases, the parotid gland in one case, the lacrimal gland in one case, and the lung in one case. All five cases were B-cell low-grade lymphomas. Among our SS patients, those with extraglandular manifestations and/or a mixed cryoglobulin were at increased risk for lymphoma development. Secondary lymphoma carried a poor prognosis in our study. Three of the six SS patients who died during the follow-up period had lymphoma.