283 results for Cohort study
Abstract:
In a cohort study among 2751 members (71.5% female) of the German and Swiss RLS patient organizations, changes in restless legs syndrome (RLS) severity over time were assessed and the impact on quality of life, sleep quality and depressive symptoms was analysed. A standard set of scales (the RLS severity scale IRLS, SF-36, Pittsburgh Sleep Quality Index and the Centre for Epidemiologic Studies Depression Scale) in mailed questionnaires was used repeatedly to assess RLS severity and health status over time, and a 7-day diary was used once to assess short-term variations. A clinically relevant change in RLS severity was defined as a change of at least 5 points on the IRLS scale. During 36 months of follow-up, minimal improvement of RLS severity between assessments was observed. Men consistently reported higher severity scores. RLS severity increased with age, reaching a plateau in the age group 45-54 years. Over 3 years, 60.2% of the participants had no relevant (±5 points) change in RLS severity. RLS worsening was significantly related to an increase in depressive symptoms and a decrease in sleep quality and quality of life. The short-term variation showed distinctive circadian patterns, with rhythm magnitudes strongly related to RLS severity. The majority of participants had a stable course of severe RLS over three years. An increase in RLS severity had a small to moderate negative influence, and a decrease a small positive influence, on quality of life, depressive symptoms and sleep quality.
Abstract:
BACKGROUND High-risk prostate cancer (PCa) is an extremely heterogeneous disease. A clear definition of prognostic subgroups is mandatory. OBJECTIVE To develop a pretreatment prognostic model for PCa-specific survival (PCSS) in high-risk PCa based on combinations of unfavorable risk factors. DESIGN, SETTING, AND PARTICIPANTS We conducted a retrospective multicenter cohort study including 1360 consecutive patients with high-risk PCa treated at eight European high-volume centers. INTERVENTION Retropubic radical prostatectomy with pelvic lymphadenectomy. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Two Cox multivariable regression models were constructed to predict PCSS as a function of dichotomization of clinical stage (< cT3 vs cT3-4), Gleason score (GS) (2-7 vs 8-10), and prostate-specific antigen (PSA; ≤ 20 ng/ml vs > 20 ng/ml). The first "extended" model includes all seven possible combinations; the second "simplified" model includes three subgroups: a good prognosis subgroup (one single high-risk factor); an intermediate prognosis subgroup (PSA >20 ng/ml and stage cT3-4); and a poor prognosis subgroup (GS 8-10 in combination with at least one other high-risk factor). The predictive accuracy of the models was summarized and compared. Survival estimates and clinical and pathologic outcomes were compared between the three subgroups. RESULTS AND LIMITATIONS The simplified model yielded an R² of 33% with a 5-yr area under the curve (AUC) of 0.70, with no significant loss of predictive accuracy compared with the extended model (R²: 34%; AUC: 0.71). The 5- and 10-yr PCSS rates were 98.7% and 95.4%, 96.5% and 88.3%, and 88.8% and 79.7% for the good, intermediate, and poor prognosis subgroups, respectively (p = 0.0003). Overall survival, clinical progression-free survival, and histopathologic outcomes worsened significantly in a stepwise fashion from the good to the poor prognosis subgroups.
Limitations of the study are the retrospective design and the long study period. CONCLUSIONS This study presents an intuitive and easy-to-use stratification of high-risk PCa into three prognostic subgroups. The model is useful for counseling and decision making in the pretreatment setting.
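The simplified three-subgroup stratification described in this abstract is a plain decision rule, so it can be sketched as a small classifier. The cutoffs (stage cT3-4, Gleason score 8-10, PSA > 20 ng/ml) come from the abstract; the function name, argument names, and the "unclassified" fallback are illustrative assumptions, not part of the published model.

```python
def prognosis_subgroup(stage_ct3_4: bool, gleason: int, psa: float) -> str:
    """Assign a high-risk PCa patient to the simplified model's subgroup.

    Cutoffs follow the abstract: clinical stage cT3-4, Gleason score (GS)
    8-10, PSA > 20 ng/ml. The model assumes the patient already meets at
    least one high-risk criterion.
    """
    n_factors = sum([stage_ct3_4, gleason >= 8, psa > 20.0])
    if gleason >= 8 and n_factors >= 2:
        return "poor"          # GS 8-10 plus at least one other high-risk factor
    if stage_ct3_4 and psa > 20.0:
        return "intermediate"  # PSA > 20 ng/ml together with stage cT3-4
    if n_factors == 1:
        return "good"          # one single high-risk factor
    return "unclassified"      # does not meet the model's high-risk definition

# Examples (hypothetical patients)
print(prognosis_subgroup(stage_ct3_4=False, gleason=9, psa=8.0))   # good
print(prognosis_subgroup(stage_ct3_4=True, gleason=6, psa=35.0))   # intermediate
print(prognosis_subgroup(stage_ct3_4=True, gleason=9, psa=10.0))   # poor
```

Ordering matters: the "poor" branch is checked first so that GS 8-10 combined with another factor is never misread as the intermediate (PSA + stage) combination.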
Abstract:
We assessed the feasibility and the procedural and long-term safety of intracoronary (i.c.) imaging for documentary purposes with optical coherence tomography (OCT) and intravascular ultrasound (IVUS) in patients with acute ST-elevation myocardial infarction (STEMI) undergoing primary PCI in the setting of the IBIS-4 study. IBIS-4 (NCT00962416) is a prospective cohort study conducted at five European centers including 103 STEMI patients who underwent serial three-vessel coronary imaging during primary PCI and at 13 months. The feasibility parameter was successful imaging, defined as the number of pullbacks suitable for analysis. Safety parameters included the frequency of peri-procedural complications and major adverse cardiac events (MACE), a composite of cardiac death, myocardial infarction (MI) and any clinically indicated revascularization at 2 years. Clinical outcomes were compared with the results from a cohort of 485 STEMI patients undergoing primary PCI without additional imaging. Imaging of the infarct-related artery at baseline (and follow-up) was successful in 92.2% (96.6%) of patients using OCT and in 93.2% (95.5%) using IVUS. Imaging of the non-infarct-related vessels was successful in 88.7% (95.6%) using OCT and in 90.5% (93.3%) using IVUS. Peri-procedural complications occurred in <2.0% of OCT pullbacks and in none during IVUS. There were no differences throughout 2 years between the imaging and control groups in terms of MACE (16.7 vs. 13.3%, adjusted HR 1.40, 95% CI 0.77-2.52, p = 0.27). Multi-modality three-vessel i.c. imaging in STEMI patients undergoing primary PCI is associated with a high degree of success and can be performed safely, without impact on cardiovascular events at long-term follow-up.
Abstract:
BACKGROUND AND AIMS Limited data from large cohorts are available on switching between tumor necrosis factor (TNF) antagonists (infliximab, adalimumab, certolizumab pegol) over time. We aimed to evaluate the prevalence of switching from one TNF antagonist to another and to identify associated risk factors. METHODS Data from the Swiss Inflammatory Bowel Diseases Cohort Study (SIBDCS) were analyzed. RESULTS Of 1731 patients included in the SIBDCS (956 with Crohn's disease [CD] and 775 with ulcerative colitis [UC]), 347 CD patients (36.3%) and 129 UC patients (16.6%) were treated with at least one TNF antagonist. A total of 53/347 (15.3%) CD patients (median disease duration 9 years) and 20/129 (15.5%) UC patients (median disease duration 7 years) needed to switch to a second and/or third TNF antagonist. Median treatment duration was longest for the first TNF antagonist used (CD 25 months; UC 14 months), followed by the second (CD 13 months; UC 4 months) and third TNF antagonist (CD 11 months; UC 15 months). Primary nonresponse, loss of response and side effects were the major reasons to stop and/or switch TNF antagonist therapy. A low body mass index, a short diagnostic delay and extraintestinal manifestations at inclusion were identified as risk factors for a switch from the first TNF antagonist within 24 months of its use in CD patients. CONCLUSION Switching of the TNF antagonist over time is a common issue. The median treatment duration with a specific TNF antagonist diminishes as the number of TNF antagonists used increases.
Abstract:
BACKGROUND: Infliximab (IFX) has been used for over a decade worldwide. Less is known about the natural history of IFX use beyond a few years and about which patients are more likely to sustain benefits. METHODS: Patients with Crohn's disease (CD) exposed to IFX at Massachusetts General Hospital, Boston; Saint-Antoine Hospital, Paris; and the Swiss IBD Cohort Study were identified through retrospective and prospective data collection, complemented by chart abstraction of electronic medical records. We compared long-term users of IFX (LTUI; >5 yr of treatment) with non-LTUI patients to identify prognostic factors. RESULTS: We pooled data on 1014 patients with CD from 3 different databases, of whom 250 were defined as LTUI. The comparison group comprised 290 patients with CD who discontinued IFX: 48 for primary nonresponse, 95 for loss of response, and 147 for adverse events. Factors associated with LTUI were colonic involvement and a younger age at the start of IFX. The prevalence of active smokers and obese patients differed markedly, but inversely, between American and European centers but did not impact outcome. The discontinuation rate was stable at around 3% to 6% per year from years 3 to 10. CONCLUSIONS: Young age at the start of IFX and colonic CD are factors associated with beneficial long-term use of IFX. After 5 years of IFX, there is still a 3% to 5% annual discontinuation rate. Several factors associated with a good initial response, such as being a nonsmoker and a shorter disease duration at IFX initiation, do not seem to be associated with a longer term response.
Abstract:
OBJECTIVES Systemic lupus erythematosus (SLE) is associated with considerable cardiovascular morbidity that has not yet been directly compared with that of other diseases with known cardiovascular risk. METHODS Two hundred and forty-one patients of the multicentre Swiss SLE Cohort Study (SSCS) were cross-sectionally assessed for coronary heart disease (CHD), cerebrovascular disease (CVD) and peripheral artery disease (PAD). SLE patients were compared with a cohort of 193 patients with type 1 diabetes mellitus being followed at the University Hospital Basel. A subgroup analysis of 50 age- and sex-matched patients from the University Hospital Basel was performed. RESULTS Of patients within the SSCS, 13.3% had one or more vascular events: 8.3% CHD, 5% CVD and 1.2% PAD. Of the type 1 diabetes mellitus patients, 15% had vascular events: 9.3% CHD, 3.1% CVD and 5.6% PAD. In the matched subgroup, 26% of SLE patients had vascular events (14% CHD) compared with 12% of type 1 diabetes mellitus patients (2% CHD). Cardiovascular risk factors were similar in both groups. Vascular events in SLE patients were associated with age, longer disease duration, dyslipidaemia and hypertension. CONCLUSION Cardiovascular morbidity in SLE is at least as frequent as in age- and sex-matched type 1 diabetes mellitus patients. Therefore, aggressive screening and management of cardiovascular risk factors should be performed.
Abstract:
OBJECTIVES To report trends in tuberculosis ascertainment among HIV patients in a rural HIV cohort in Tanzania, and to assess the impact of a bundle of services implemented in December 2012 consisting of three components: (i) integration of HIV and tuberculosis services; (ii) GeneXpert for tuberculosis diagnosis; and (iii) electronic data collection. DESIGN Retrospective cohort study of patients enrolled in the Kilombero Ulanga Antiretroviral Cohort (KIULARCO), Tanzania. METHODS HIV patients without a prior history of tuberculosis enrolled in the KIULARCO cohort between 2005 and 2013 were included. Cox proportional hazards models were used to estimate rates and predictors of tuberculosis ascertainment. RESULTS Of 7114 HIV-positive patients enrolled, 5123 (72%) had no history of tuberculosis. Of these, 66% were female, the median age was 38 years, the median baseline CD4+ cell count was 243 cells/µl, and 43% had WHO clinical stage 3 or 4. During follow-up, 421 incident tuberculosis cases were notified, with an estimated incidence of 3.6 per 100 person-years (p-y) [95% confidence interval (CI) 3.26-3.97]. The incidence rate varied over time and increased significantly from 2.96 to 43.98 cases per 100 p-y after the introduction of the bundle of services in December 2012. Four independent predictors of tuberculosis ascertainment were identified: poor clinical condition at baseline (hazard ratio [HR] 3.89, 95% CI 2.87-5.28), WHO clinical stage 3 or 4 (HR 2.48, 95% CI 1.88-3.26), being antiretroviral-naïve (HR 2.97, 95% CI 2.25-3.94), and registration in 2013 (HR 6.07, 95% CI 4.39-8.38). CONCLUSION The integration of tuberculosis and HIV services, together with comprehensive electronic data collection and use of GeneXpert, dramatically increased the ascertainment of tuberculosis in this rural African HIV cohort.
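The incidence figures in this abstract are crude rates per 100 person-years. A minimal sketch of that calculation follows; the person-year denominator used here is back-derived for illustration so the result matches the reported 3.6 per 100 p-y, since the abstract does not state the total person-years directly.

```python
def incidence_per_100py(events: int, person_years: float) -> float:
    """Crude incidence rate: events divided by total follow-up time,
    expressed per 100 person-years."""
    return events / person_years * 100

# 421 incident TB cases over an assumed ~11,700 person-years of follow-up
# (a denominator consistent with the reported 3.6 per 100 p-y).
rate = incidence_per_100py(421, 11_700)
print(f"{rate:.1f} per 100 person-years")
```

The same formula applied to the pre- and post-intervention periods (with their own event counts and person-year denominators) yields the 2.96 and 43.98 per 100 p-y figures quoted above.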
Abstract:
BACKGROUND Most guidelines recommend at least a 2-cm excision margin for melanomas thicker than 2 mm. OBJECTIVE We evaluated whether 1- or 2-cm excision margins for melanoma (>2 mm) result in different outcomes. METHODS This is a retrospective cohort study of patients with melanomas (>2 mm) who underwent tumor excision with 1-cm (228 patients) or 2-cm (97 patients) margins, investigating the presence of local recurrences, locoregional and distant metastases, and disease-free and overall survival. RESULTS In all, 325 patients with a mean age of 61.84 years and a Breslow thickness of 4.36 mm were considered for the study, with a median follow-up of 1852 days (1995-2012). There was no significant difference in the frequency of locoregional and distant metastasis between the 2 groups (P = .311 and .571). The survival analysis showed no differences for disease-free (P = .800; hazard ratio 0.948; 95% confidence interval 0.627-1.433) and overall (P = .951; hazard ratio 1.018; 95% confidence interval 0.575-1.803) survival. LIMITATIONS The study was not prospectively randomized. CONCLUSIONS Our study did not show any significant differences in important outcome parameters such as local or distant metastases and overall survival. A prospective study testing 1- versus 2-cm excision margins is warranted.
Abstract:
BACKGROUND Data evaluating the chronological order of appearance of extraintestinal manifestations (EIMs) relative to the time of inflammatory bowel disease (IBD) diagnosis are currently lacking. We aimed to assess the type, frequency, and chronological order of appearance of EIMs in patients with IBD. METHODS Data from the Swiss Inflammatory Bowel Disease Cohort Study were analyzed. RESULTS Data on 1249 patients were analyzed (49.8% female; median age: 40 yr [interquartile range, 30-51 yr]; 735 [58.8%] with Crohn's disease, 483 [38.7%] with ulcerative colitis, and 31 [2.5%] with indeterminate colitis). A total of 366 patients (29.3%) presented with EIMs. Of those, 63.4% presented with 1, 26.5% with 2, 4.9% with 3, 2.5% with 4, and 2.7% with 5 EIMs during their lifetime. Patients presented with the following diseases as their first EIM: peripheral arthritis 70.0%, aphthous stomatitis 21.6%, axial arthropathy/ankylosing spondylitis 16.4%, uveitis 13.7%, erythema nodosum 12.6%, primary sclerosing cholangitis 6.6%, pyoderma gangrenosum 4.9%, and psoriasis 2.7%. In 25.8% of cases, patients presented with their first EIM before IBD was diagnosed (median: 5 mo before IBD diagnosis; range, 0-25 mo), and in 74.2% of cases the first EIM manifested itself after IBD diagnosis (median: 92 mo; range, 29-183 mo). CONCLUSIONS In one quarter of patients with IBD, EIMs appeared before the time of IBD diagnosis. The occurrence of EIMs should prompt physicians to look for potential underlying IBD.
Abstract:
Data concerning the link between the severity of abdominal aortic calcification (AAC) and fracture risk in postmenopausal women are discordant. This association may vary by skeletal site and duration of follow-up. Our aim was to assess the association between AAC severity and fracture risk in older women over the short and long term. This is a case-cohort study nested in a large multicenter prospective cohort study. The association between AAC and fracture was assessed using odds ratios (OR) and 95% confidence intervals (95% CI) for vertebral fractures and using hazard ratios (HR) and 95% CI for non-vertebral and hip fractures. AAC severity was evaluated from lateral spine radiographs using Kauppila's semiquantitative score. Severe AAC (AAC score 5+) was associated with a higher risk of vertebral fracture during 4 years of follow-up, after adjustment for confounders (age, BMI, walking, smoking, hip bone mineral density, prevalent vertebral fracture, systolic blood pressure, hormone replacement therapy) (OR = 2.31, 95% CI: 1.24-4.30, p < 0.01). In a similar model, severe AAC was associated with an increase in hip fracture risk (HR = 2.88, 95% CI: 1.00-8.36, p = 0.05). AAC was not associated with the risk of any non-vertebral fracture, nor with fracture risk after 15 years of follow-up. In elderly women, severe AAC is associated with a higher short-term risk of vertebral and hip fractures, but not with the long-term risk of these fractures. There is no association between AAC and the risk of non-vertebral, non-hip fracture in older women. Our findings lend further support to the hypothesis that AAC and skeletal fragility are related.
Abstract:
CONTEXT The polyuria-polydipsia syndrome comprises primary polydipsia (PP) and central and nephrogenic diabetes insipidus (DI). Correctly discriminating these entities is mandatory, given that inadequate treatment causes serious complications. The diagnostic "gold standard" is the water deprivation test with assessment of arginine vasopressin (AVP) activity. However, test interpretation and AVP measurement are challenging. OBJECTIVE The objective was to evaluate the accuracy of copeptin, a stable peptide stoichiometrically cosecreted with AVP, in the differential diagnosis of polyuria-polydipsia syndrome. DESIGN, SETTING, AND PATIENTS This was a prospective multicenter observational cohort study from four Swiss or German tertiary referral centers of adults >18 years old with a history of polyuria and polydipsia. MEASUREMENTS A standardized combined water deprivation/3% saline infusion test was performed and terminated when serum sodium exceeded 147 mmol/L. Circulating copeptin and AVP levels were measured regularly throughout the test. The final diagnosis was based on the water deprivation/saline infusion test results, clinical information, and the treatment response. RESULTS Fifty-five patients were enrolled (11 with complete central DI, 16 with partial central DI, 18 with PP, and 10 with nephrogenic DI). Without prior thirsting, a single baseline copeptin level >21.4 pmol/L differentiated nephrogenic DI from the other etiologies with 100% sensitivity and specificity, rendering water deprivation testing unnecessary in such cases. A stimulated copeptin >4.9 pmol/L (at sodium levels >147 mmol/L) differentiated between patients with PP and patients with partial central DI with a 94.0% specificity and a 94.4% sensitivity. A stimulated AVP >1.8 pg/mL differentiated between the same categories with a 93.0% specificity and an 83.0% sensitivity. LIMITATION This study was limited by incorporation bias from including AVP levels as a diagnostic criterion.
CONCLUSION Copeptin is a promising new tool in the differential diagnosis of the polyuria-polydipsia syndrome, and a valid surrogate marker for AVP. Primary Funding Sources: Swiss National Science Foundation, University of Basel.
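The copeptin cutoffs reported in this abstract amount to a two-step decision rule, sketched below. The thresholds (baseline >21.4 pmol/L; stimulated >4.9 pmol/L at sodium >147 mmol/L) are from the study; the function name, argument names, and the simplification that collapses partial central DI into a single "central diabetes insipidus" label are illustrative assumptions, not a validated diagnostic algorithm.

```python
from typing import Optional

def classify_polyuria(baseline_copeptin: float,
                      stimulated_copeptin: Optional[float] = None) -> str:
    """Apply the copeptin cutoffs (pmol/L) reported in the abstract.

    Step 1: baseline copeptin > 21.4 pmol/L identifies nephrogenic DI
    without water deprivation testing.
    Step 2: otherwise, stimulated copeptin (measured at serum sodium
    > 147 mmol/L) > 4.9 pmol/L favors primary polydipsia over
    partial central DI.
    """
    if baseline_copeptin > 21.4:
        return "nephrogenic diabetes insipidus"
    if stimulated_copeptin is None:
        return "water deprivation/saline infusion test required"
    if stimulated_copeptin > 4.9:
        return "primary polydipsia"
    return "central diabetes insipidus"

# Examples (hypothetical values)
print(classify_polyuria(25.0))        # nephrogenic DI, no further testing
print(classify_polyuria(5.0, 6.2))    # primary polydipsia
print(classify_polyuria(5.0, 2.1))    # central diabetes insipidus
```

The two-step structure mirrors the abstract's key practical point: a single unstimulated copeptin value can rule nephrogenic DI in or out, and the stimulation test is only needed for the remaining differential.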
Abstract:
Whether anticoagulation management practices are associated with improved outcomes in elderly patients with acute venous thromboembolism (VTE) is uncertain. Thus, we aimed to examine whether practices recommended by the American College of Chest Physicians guidelines are associated with outcomes in elderly patients with VTE. We studied 991 patients aged ≥65 years with acute VTE in a Swiss prospective multicenter cohort study and assessed adherence to four management practices: parenteral anticoagulation for ≥5 days, INR ≥2.0 for ≥24 hours before stopping parenteral anticoagulation, early start with vitamin K antagonists (VKA) within ≤24 hours of VTE diagnosis, and the use of low-molecular-weight heparin (LMWH) or fondaparinux. The outcomes were all-cause mortality, VTE recurrence, and major bleeding at 6 months, and the length of hospital stay (LOS). We used Cox regression and lognormal survival models, adjusting for patient characteristics. Overall, 9% of patients died, 3% had VTE recurrence, and 7% had major bleeding. Early start with VKA was associated with a lower risk of major bleeding (adjusted hazard ratio 0.37, 95% CI 0.20-0.71). Early start with VKA (adjusted time ratio [TR] 0.77, 95% CI 0.69-0.86) and use of LMWH/fondaparinux (adjusted TR 0.87, 95% CI 0.78-0.97) were associated with a shorter LOS. An INR ≥2.0 for ≥24 hours before stopping parenteral anticoagulants was associated with a longer LOS (adjusted TR 1.2, 95% CI 1.08-1.33). In elderly patients with VTE, adherence to recommended anticoagulation management practices showed mixed results. In conclusion, only early start with VKA and use of parenteral LMWH/fondaparinux were associated with better outcomes.
Abstract:
BACKGROUND Drug resistance is a major barrier to successful antiretroviral treatment (ART). Therefore, it is important to monitor time trends at a population level. METHODS We included 11,084 ART-experienced patients from the Swiss HIV Cohort Study (SHCS) between 1999 and 2013. The SHCS is highly representative and includes 72% of patients receiving ART in Switzerland. Drug resistance was defined as the presence of at least one major mutation in a genotypic resistance test. To estimate the prevalence of drug resistance, data for patients with no resistance test were imputed based on each patient's risk of harboring drug-resistant viruses. RESULTS The emergence of new drug resistance mutations declined dramatically, from 401 to 23 patients between 1999 and 2013. The upper estimated prevalence limit of drug resistance among ART-experienced patients decreased from 57.0% in 1999 to 37.1% in 2013. The prevalence of three-class resistance decreased from 9.0% to 4.4% and was always <0.4% for patients who initiated ART after 2006. Most patients actively participating in the SHCS in 2013 with drug-resistant viruses had initiated ART before 1999 (59.8%). Nevertheless, in 2013, 94.5% of patients who initiated ART before 1999 had good remaining treatment options based on the Stanford algorithm. CONCLUSION HIV-1 drug resistance among ART-experienced patients in Switzerland is a well-controlled relic from the pre-combination ART era. The emergence of drug resistance can be virtually stopped with new potent therapies and close monitoring.