936 results for Long-term follow-up study
Abstract:
BACKGROUND: Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. METHODS: On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk group-specific probabilities for the presence of drug resistance mutations (upper estimate). RESULTS: Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients who were taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had ≥1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence became smaller in later years. CONCLUSIONS: Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level. Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
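The two prevalence estimates described in the methods are simple arithmetic: the lower bound is the proportion of patients with virologic failure or confirmed resistance, and the upper bound is a frequency-weighted average of risk group-specific probabilities. A minimal Python sketch, with invented group sizes and probabilities (only the cohort total of 8064 matches the study):

```python
# Hypothetical illustration of the lower/upper prevalence estimates described
# above. Risk groups, counts, and probabilities are invented for the example.
groups = {
    # risk group: (patients, assumed probability of resistance mutations)
    "mono/dual-NRTI exposed":      (1200, 0.80),
    "early combination ART":       (3800, 0.60),
    "boosted-PI/NNRTI first-line": (3064, 0.15),
}
confirmed_resistant = 2980  # virologic failure or confirmed resistance (invented)

total = sum(n for n, _ in groups.values())              # 8064
lower = confirmed_resistant / total                     # direct proportion
upper = sum(n * p for n, p in groups.values()) / total  # weighted average

print(f"lower estimate: {lower:.1%}, upper estimate: {upper:.1%}")
```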
Abstract:
BACKGROUND: This study aimed to investigate the influence of deep sternal wound infection (DSWI) on long-term survival following cardiac surgery. MATERIAL AND METHODS: In our institutional database we retrospectively evaluated the medical records of 4732 adult patients who underwent open-heart surgery from January 1995 through December 2005. Predictive factors for DSWI were determined using logistic regression analysis. Each patient with DSWI was then matched with 2 controls without DSWI according to the risk factors identified previously. After checking the balance achieved by matching, short-term mortality was compared between groups using a paired test, and long-term survival was compared using Kaplan-Meier analysis and a Cox proportional hazards model. RESULTS: Overall, 4732 records were analyzed. The mean age of the investigated population was 69.3±12.8 years. DSWI occurred in 74 (1.56%) patients. Significant independent predictive factors for deep sternal infection were active smoking (OR 2.19, 95% CI 1.35-3.53, p=0.001), obesity (OR 1.96, 95% CI 1.20-3.21, p=0.007), and insulin-dependent diabetes mellitus (OR 2.09, 95% CI 1.05-10.06, p=0.016). Mean follow-up in the matched set was 125 months (IQR 99-162). After matching, in-hospital mortality was higher in the DSWI group (8.1% vs. 2.7%, p=0.03), but DSWI was not an independent predictor of long-term survival (adjusted HR 1.5, 95% CI 0.7-3.2, p=0.33). CONCLUSIONS: The results presented in this report show that post-sternotomy deep wound infection does not influence long-term survival in a general adult cardiac surgical patient population.
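The analysis pipeline described above (Kaplan-Meier curves plus a Cox proportional hazards model on the matched set) can be sketched with the lifelines library; the data file and column names here are hypothetical, not the study's:

```python
# Sketch of the matched survival comparison described above, using lifelines.
# `matched_cohort.csv` and all column names are hypothetical:
# months = follow-up time, died = 1/0, dswi = 1 if deep sternal wound infection.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

matched = pd.read_csv("matched_cohort.csv")

# Kaplan-Meier survival curves by DSWI status
kmf = KaplanMeierFitter()
for label, grp in matched.groupby("dswi"):
    kmf.fit(grp["months"], event_observed=grp["died"], label=f"DSWI={label}")
    kmf.plot_survival_function()

# Cox model: the adjusted HR for DSWI is exp(coef) of the `dswi` term
cph = CoxPHFitter()
cph.fit(matched[["months", "died", "dswi", "smoking", "obesity"]],
        duration_col="months", event_col="died")
cph.print_summary()
```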
Abstract:
In this prospective case series study, 20 patients with an implant-borne single crown following early implant placement with simultaneous contour augmentation were followed for 6 years. Clinical, radiologic, and esthetic parameters were assessed. In addition, cone beam computed tomography (CBCT) was used at 6 years to examine the facial bone wall. During the study period, all 20 implants were successfully integrated, and the clinical parameters remained stable over time. Pleasing esthetic outcomes were noted, as assessed by pink esthetic scores. None of the implants developed mucosal recession of 1 mm or more. The periapical radiographs showed stable peri-implant bone levels, with a mean distance from the implant shoulder to the first bone-implant contact (DIB) of 0.44 mm at 6 years. The CBCT scans showed that all 20 implants had a detectable facial bone wall at 6 years, with a mean thickness of around 1.9 mm. In summary, this prospective case series study demonstrated stable peri-implant hard and soft tissues for all 20 implants and pleasing overall esthetic outcomes. The 6-year follow-up confirmed that the risk of mucosal recession is low with early implant placement. In addition, contour augmentation with guided bone regeneration (GBR) was able to establish and maintain a facial bone wall in all 20 patients.
Abstract:
Mortality of HIV/tuberculosis (TB) patients in Eastern Europe is high, but little is known about their causes of death. This study aimed to assess and compare mortality rates and causes of death in HIV/TB patients across Eastern Europe and Western Europe and Argentina (WEA) in an international cohort study. Mortality rates and causes of death were analysed by time from TB diagnosis (<3 months, 3-12 months or >12 months) in 1078 consecutive HIV/TB patients. Factors associated with TB-related death were examined in multivariate Poisson regression analysis. 347 patients died during 2625 person-years of follow-up. Mortality in Eastern Europe was three- to ninefold higher than in WEA. TB was the main cause of death in Eastern Europe in 80%, 66% and 61% of patients who died <3 months, 3-12 months or >12 months after TB diagnosis, compared to 50%, 0% and 15% in the same time periods in WEA (p<0.0001). In multivariate analysis, follow-up in WEA (incidence rate ratio (IRR) 0.12, 95% CI 0.04-0.35), standard TB treatment (IRR 0.45, 95% CI 0.20-0.99) and antiretroviral therapy (IRR 0.32, 95% CI 0.14-0.77) were associated with a reduced risk of TB-related death. Persistently higher mortality rates were observed in HIV/TB patients in Eastern Europe, where TB was the dominant cause of death at any time during follow-up. This has important implications for HIV/TB programmes aiming to optimise the management of HIV/TB patients and limit TB-associated mortality in this region.
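Incidence rate ratios of the kind reported above typically come from a Poisson model with person-years as an offset; a sketch with statsmodels, where the dataset and column names are hypothetical:

```python
# Sketch of a Poisson regression with person-years offset, as used above to
# estimate IRRs for TB-related death. Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("hivtb_cohort.csv")
# expected columns: tb_deaths, person_years, region ("EE"/"WEA"),
# std_tb_treatment (0/1), art (0/1)

model = smf.glm(
    "tb_deaths ~ C(region) + std_tb_treatment + art",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
).fit()

# Exponentiated coefficients are incidence rate ratios with 95% CIs
print(pd.concat([np.exp(model.params).rename("IRR"),
                 np.exp(model.conf_int())], axis=1))
```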
Abstract:
BACKGROUND Coronary atherosclerosis begins early in life, but acute coronary syndromes in adults aged <30 years are exceedingly rare. We aimed to investigate the rate of occurrence, clinical and angiographic characteristics, and long-term clinical outcome of acute coronary syndrome (ACS) in young patients referred to two Swiss hospitals. METHODS From 1994 to 2010, data on all patients with ACS aged <30 years were retrospectively retrieved from our database, and the patients were contacted by phone or at a physician's visit. Baseline, lesion, and procedural characteristics and clinical outcome were compared between patients in whom an underlying atypical aetiology was found (non-ATS group; ATS: atherosclerosis) and patients in whom no such aetiology was detected (ATS group). The clinical endpoint was freedom from any major adverse cardiac event (MACE) during follow-up. RESULTS A total of 27 young patients with ACS aged <30 years were admitted during the study period, accounting for 0.05% of all coronary angiograms performed. Mean patient age was 26.8 ± 3.5 years and 22 patients (81%) were men. Current smoking (81%) and dyslipidaemia (59%) were the most frequent risk factors. Typical chest pain (n = 23; 85%) and ST-segment elevation myocardial infarction (STEMI; n = 18 [67%]) were the most frequent presentations. The ATS group consisted of 17 patients (63%) and the non-ATS group of 10 patients (37%). Hereditary thrombophilia was the most frequently encountered atypical aetiology (n = 4; 15%). At 5 years, the mortality and MACE rates were 7% and 19%, respectively. CONCLUSION ACS in young patients is an uncommon condition with a variety of possible aetiologies and distinct risk factors. In-hospital and 5-year clinical outcomes are satisfactory.
Abstract:
The role of the electrophysiologic (EP) study for risk stratification in patients with arrhythmogenic right ventricular cardiomyopathy is controversial. We investigated the role of inducible sustained monomorphic ventricular tachycardia (SMVT) for the prediction of an adverse outcome, defined as the occurrence of cardiac death, heart transplantation, sudden cardiac death, ventricular fibrillation, ventricular tachycardia with hemodynamic compromise, or syncope. Of 62 patients who fulfilled the 2010 Arrhythmogenic Right Ventricular Cardiomyopathy Task Force criteria and underwent an EP study, 30 (48%) experienced an adverse outcome during a median follow-up of 9.8 years. SMVT was inducible in 34 patients (55%), 22 (65%) of whom had an adverse outcome; in contrast, 8 of the 28 patients (29%) without inducible SMVT had an adverse outcome. Kaplan-Meier analysis showed an event-free survival benefit for patients without inducible SMVT (log-rank p = 0.008), with a 10-year cumulative survival free of an adverse outcome of 72% (95% confidence interval [CI] 56% to 92%) in the group without inducible SMVT compared with 26% (95% CI 14% to 50%) in the group with inducible SMVT. The inducibility of SMVT during the EP study (hazard ratio [HR] 2.99, 95% CI 1.23 to 7.27), nonadherence (HR 2.74, 95% CI 1.3 to 5.77), and heart failure New York Heart Association functional class II or III (HR 2.25, 95% CI 1.04 to 4.87) were associated with an adverse outcome on univariate Cox regression analysis. The inducibility of SMVT (HR 2.52, 95% CI 1.03 to 6.16, p = 0.043) and nonadherence (HR 2.34, 95% CI 1.1 to 4.99, p = 0.028) remained significant predictors on multivariate analysis. These long-term observational data suggest that SMVT inducibility during an EP study might predict an adverse outcome in patients with arrhythmogenic right ventricular cardiomyopathy, supporting a role for the EP study in risk stratification.
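The log-rank comparison of event-free survival by SMVT inducibility can be sketched as follows; the dataset and column names are hypothetical:

```python
# Sketch of the log-rank test behind the Kaplan-Meier comparison above.
# `arvc_cohort.csv` and its columns are hypothetical:
# years = follow-up, event = 1 if adverse outcome, inducible = 1 if SMVT induced.
import pandas as pd
from lifelines.statistics import logrank_test

df = pd.read_csv("arvc_cohort.csv")
pos = df[df["inducible"] == 1]
neg = df[df["inducible"] == 0]

result = logrank_test(
    pos["years"], neg["years"],
    event_observed_A=pos["event"],
    event_observed_B=neg["event"],
)
print(result.p_value)  # the abstract reports log-rank p = 0.008
```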
Abstract:
BACKGROUND The objective of this study was to compare transtelephonic ECG every 2 days and serial 7-day Holter monitoring as two methods of follow-up for judging ablation success after atrial fibrillation (AF) catheter ablation. Patients with highly symptomatic AF are increasingly treated with catheter ablation. Several methods of follow-up have been described, and the judgment of ablation success often relies on patients' symptoms. However, the optimal follow-up strategy for objectively detecting most AF recurrences remains unclear. METHODS Thirty patients with highly symptomatic AF were selected for circumferential pulmonary vein ablation. During follow-up, a transtelephonic ECG was transmitted once every 2 days for half a year. Additionally, a 7-day Holter was recorded before ablation, after ablation, and after 3 and 6 months, respectively. With both procedures, symptoms and actual rhythm were correlated thoroughly. RESULTS A total of 2,600 transtelephonic ECGs were collected, 216 of which showed AF; 25% of those episodes were asymptomatic. On Kaplan-Meier analysis, 45% of the patients with paroxysmal AF were still in continuous sinus rhythm (SR) after 6 months. Simulating a follow-up based on symptomatic recurrences only, that number would have increased to 70%. Using serial 7-day ECG, 113 Holter recordings with over 18,900 hours of ECG data were acquired. After 6 months, the percentage of patients classified as free from AF was 50%. Of the patients with recurrences, 30-40% were completely asymptomatic. The percentage of asymptomatic AF episodes increased stepwise from 11% before ablation to 53% at 6 months after. CONCLUSIONS The success rate in terms of freedom from AF was 70% on a symptom-only-based follow-up; using serial 7-day Holter it decreased to 50%, and on transtelephonic monitoring to 45%. Transtelephonic ECG and serial 7-day Holter were equally effective in objectively determining long-term success and detecting asymptomatic patients.
Abstract:
BACKGROUND Trials assessing the benefit of immediate androgen-deprivation therapy (ADT) for treating prostate cancer (PCa) have often done so on the basis of differences in detectable prostate-specific antigen (PSA) relapse or metastatic disease rates at a specific time after randomization. OBJECTIVE Based on the long-term results of European Organization for Research and Treatment of Cancer (EORTC) trial 30891, we asked whether differences in time to progression predict differences in survival. DESIGN, SETTING, AND PARTICIPANTS EORTC trial 30891 compared immediate ADT (n=492) with deferred ADT (n=493), delivered in both arms as orchiectomy or a luteinizing hormone-releasing hormone analog, with deferred ADT initiated only upon symptomatic disease progression or life-threatening complications, in randomly assigned T0-4 N0-2 M0 PCa patients. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Time to first objective progression (documented metastases or ureteric obstruction, not PSA rise) and time to objective castration-resistant progressive disease were compared, as were PCa mortality and overall survival. RESULTS AND LIMITATIONS After a median of 12.8 yr, 769 of the 985 patients had died (78%), 269 of PCa (27%). For patients receiving deferred ADT, the overall treatment time was 31% of that for patients on immediate ADT. Deferred ADT was significantly worse than immediate ADT for time to first objective disease progression (p<0.0001; 10-yr progression rates 42% vs 30%). However, time to objective castration-resistant disease after deferred ADT did not differ significantly (p=0.42) from that after immediate ADT. In addition, PCa mortality did not differ significantly, except in patients with aggressive PCa resulting in death within 3-5 yr after diagnosis. Deferred ADT was inferior to immediate ADT in terms of overall survival (hazard ratio: 1.21; 95% confidence interval, 1.05-1.39; p [noninferiority]=0.72, p [difference]=0.0085). CONCLUSIONS This study shows that if hormonal manipulation is used at different times during the disease course, differences in time to first disease progression cannot predict differences in disease-specific survival. A deferred ADT policy may substantially reduce the time on treatment, but it is not suitable for patients with rapidly progressing disease.
Abstract:
OBJECTIVE: In recent years, research investigating the various health benefits of Taiji practice has markedly increased. Despite this growing scientific interest, essential questions, such as to what extent a Taiji course may exert noticeable effects in participants' everyday life, what these effects are, and how and where potential transfer effects occur, have hardly been considered. The aim of our study was to explore transfer effects from a Taiji course into participants' daily lives. METHODS: We conducted a longitudinal observational study in 45 healthy participants at the end of their three-month Taiji beginner course (tp1), at two months (tp2), and at one year after course completion (tp3). Participants were asked to report their Taiji practice behavior at all time points, and to rate and describe perceived transfer effects of Taiji course contents on their daily life at tp1 and tp3. RESULTS: Transfer effects were reported by 91.1% of all respondents after course completion (tp1) and persisted in 73.3% at the one-year follow-up assessment (tp3), with "increase of self-efficacy", "improvement of stress management", and "increase of body awareness" the most frequently mentioned effects. Transfer effects predominantly occurred in participants' work and social environments, as well as during everyday activities in public areas. While the frequency of self-reliant Taiji practice significantly decreased from 82.2% at tp1 to 55.6% at tp3 (P < 0.001), the magnitude of self-reported transfer effects did not (P = 0.35). Explorative analyses revealed that regular Taiji course attendance was highly correlated with stronger transfer effects at tp1 (r = 0.51; P < 0.001) and tp3 (r = 0.35; P = 0.020). Participants reporting a high frequency of self-reliant Taiji practice at tp2 were likely to maintain a regular practice routine at tp3 (r = 0.42; P < 0.004), whereas self-reliant practice frequency and transfer effects at tp1 were positively correlated with self-reliant practice frequency at tp3 at a trend level (r < 0.27; P > 0.08). CONCLUSION: Our data underline the importance of regular course participation for pronounced and long-lasting transfer effects into participants' everyday life. We discuss several context- and process-related aspects of a Taiji intervention as potentially relevant factors for enhancing transfer effects.
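The attendance/transfer-effect associations above are plain Pearson correlations; a minimal sketch on simulated data (the study's actual ratings are not reproduced here):

```python
# Sketch of a Pearson correlation like the r = 0.51 reported above,
# computed on simulated data rather than the study's ratings.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
attendance = rng.uniform(0.5, 1.0, 45)                 # fraction of sessions attended
transfer = 2.0 * attendance + rng.normal(0, 0.4, 45)   # simulated transfer rating

r, p = pearsonr(attendance, transfer)
print(f"r = {r:.2f}, P = {p:.3f}")
```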
Abstract:
The importance of the cerebellum for non-motor functions is becoming more and more evident. The influence of cerebellar lesions acquired during childhood on cognitive functions, however, is not well known. We present follow-up data from 24 patients who were operated upon during childhood for benign cerebellar tumours. Given the benign histology of these tumours, neither radiotherapy nor chemotherapy was required. Post-operatively, these children were of normal intelligence, with a mean IQ of 99.1, performance intelligence quotient (PIQ) of 101.3, and verbal intelligence quotient (VIQ) of 96.8. However, 57% of patients showed abnormalities in subtesting. In addition, more extensive neuropsychological testing revealed significant problems with attention, memory, processing speed, and interference. Visuo-constructive problems were marked for copying the Rey figure, but less pronounced for recall of the figure. Verbal fluency was more affected than design fluency. Behavioural deficits were detected in 33% of patients. Attention deficit problems were marked in 12.5%, whereas others demonstrated psychiatric symptoms such as mutism, addiction problems, anorexia, uncontrolled temper tantrums, and phobia. Age at tumour operation and size of tumour had no influence on outcome. Vermis involvement was related to an increase in neuropsychological and psychiatric problems. The observation that patients with left-sided cerebellar tumours were more affected than patients with right-sided tumours is probably also influenced by the more pronounced vermian involvement in the former group. In summary, this study confirms the importance of the cerebellum for cognitive development and points to the necessity of careful follow-up for these children to provide them with the help they need to achieve full integration into professional life.
Abstract:
AIMS: Second-generation everolimus-eluting stents (EES) are safer and more effective than first-generation paclitaxel-eluting stents (PES). Third-generation biolimus-eluting stents (BES) have been found to be non-inferior to PES. To date, no comparative study of EES versus BES is available. We aimed to investigate the safety and efficacy of BES with a biodegradable polymer compared with EES with a durable polymer at two years of follow-up in an unselected population of consecutively enrolled patients. METHODS AND RESULTS: A group of 814 consecutive patients undergoing percutaneous coronary intervention (PCI) was enrolled between 2007 and 2010, of whom 527 were treated with EES and 287 with BES implantation. Clinical outcome was compared in 200 pairs using propensity score matching. The primary endpoint was a composite of death, myocardial infarction (MI), and target vessel revascularisation (TVR) at two-year follow-up. Median follow-up was 22 months. The primary outcome occurred in 11.5% of EES and 10.5% of BES patients (HR 1.11, 95% CI: 0.61-2.00, p=0.74). At two years, there was no significant difference with regard to death (HR 0.49, 95% CI: 0.18-1.34, p=0.17), cardiac death (HR 0.14, 95% CI: 0.02-1.14, p=0.66) or MI (HR 6.10, 95% CI: 0.73-50.9, p=0.10). Stent thrombosis (ST) incidence was evenly distributed between EES (n=2) and BES (n=2) (p=1.0). CONCLUSIONS: This first clinical study failed to demonstrate any significant difference in safety or efficacy between these two types and generations of drug-eluting stents (DES).
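Propensity score matching as described above pairs each BES patient with the EES patient whose predicted probability of receiving BES is closest; a sketch with scikit-learn, using a hypothetical dataset and covariates:

```python
# Sketch of 1:1 nearest-neighbour propensity score matching, as used above to
# form the 200 matched pairs. Dataset and covariate names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("pci_cohort.csv")  # bes = 1 for BES, 0 for EES
covariates = ["age", "diabetes", "lvef", "acs_presentation"]

# Propensity score: probability of receiving BES given baseline covariates
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["bes"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

bes = df[df["bes"] == 1]
ees = df[df["bes"] == 0]

# Match each BES patient to the EES patient with the closest propensity score
nn = NearestNeighbors(n_neighbors=1).fit(ees[["ps"]])
_, idx = nn.kneighbors(bes[["ps"]])
matched = pd.concat([bes, ees.iloc[idx.ravel()]])  # pairs for outcome comparison
```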
Abstract:
BACKGROUND: Little is known about the "very" long-term incidence of major adverse cardiac events (MACE), target-lesion revascularization (TLR), target-vessel revascularization, and stent thrombosis after sirolimus-eluting stent (SES) implantation. We present the first study to provide 10-year clinical follow-up in an unselected patient population who underwent SES implantation. METHODS AND RESULTS: We conducted a systematic 10-year clinical follow-up of a series of 200 consecutive patients treated with unrestricted SES implantation between April 2002 and April 2003 in two Swiss hospitals. Outcomes and follow-up were obtained in all 200 patients. The cumulative 10-year MACE rate was 47%, with all-cause death in 20%, cardiac death in 9%, myocardial infarction in 7%, and TLR and target-vessel revascularization in 8% and 11%, respectively. The Academic Research Consortium-defined "definite and probable" stent thrombosis rate was 2.5%. TLR risk was maximal between 3 and 6 years. New-lesion revascularization increased throughout the study period. CONCLUSION: The incidence of TLR was maximal 3 to 6 years after SES implantation and decreased thereafter. MACE and non-TLR revascularization rates increased steadily throughout follow-up, underscoring the progression of coronary artery disease.
Abstract:
Background Extensive evidence is now available showing the efficacy of cognitive remediation therapy (CRT). Integrative approaches seem superior regarding the maintenance of proximal outcomes at follow-up as well as generalization to other areas of functioning. To date, only limited evidence is available concerning the efficacy of CRT in older schizophrenia patients. Integrated Neurocognitive Therapy (INT) is a newly developed cognitive remediation approach: a manualized group therapy targeting all 11 NIMH-MATRICS dimensions within one therapy concept. In this study we compared the effects of INT in an early-course group (duration of illness <5 years) with those in a long-term group of schizophrenia outpatients (duration of illness >15 years). Methods In an international multicenter study carried out in Germany, Switzerland, and Austria, a total of 90 outpatients diagnosed with schizophrenia (DSM-IV-TR) were randomly assigned either to INT or to treatment as usual (TAU). Fifty of the 90 patients formed an early-course (EC) group, suffering from schizophrenia for less than 5 years (mean age 29 years, mean duration of illness 3.3 years). The other 40 formed a long-term course (LC) group, suffering from schizophrenia for longer than 15 years (mean age 45 years, mean duration of illness 22 years). Treatment comprised 15 biweekly sessions. An extensive assessment battery was administered before and after treatment and at follow-up (1 year). Multivariate general linear models (GLM) (duration of illness x treatment x time) tested our hypothesis that an EC group of schizophrenia outpatients differs in proximal and distal outcome from an LC group. Results Irrespective of the duration of illness, both groups (EC and LC) benefited from INT. INT was superior to TAU in most of the assessed domains. The dropout rate during the therapy phase was much higher in the EC group (21.4%) than in the LC group (8%). However, interaction effects showed that the LC group had significantly larger effects in the neurocognitive domains of speed of processing (F>3.6) and vigilance (F>2.4). In social cognition, the EC group showed significantly larger effects in social schema (F>2.5) and social attribution (blame; F>6.0) compared with the LC group. Regarding more distal outcomes, patients treated with INT showed reduced general symptoms, unaffected by duration of illness, during the therapy phase and at follow-up (F>4.3). Discussion The results suggest that INT is a valid goal-oriented treatment to improve cognitive functions in schizophrenia outpatients. Significant treatment effects were evident irrespective of the duration of illness. Against common expectations, long-term, more chronic patients showed larger effects in basal cognitive functions compared with younger patients and with patients without any active therapy (TAU). Consequently, more integrated therapy offers are also recommended for long-term course schizophrenia patients.
Abstract:
BACKGROUND Spinal myxopapillary ependymomas (MPEs) are slowly growing ependymal gliomas with preferential manifestation in young adults. The aim of this study was to assess the outcome of patients with MPE treated with surgery, radiotherapy (RT), and/or chemotherapy. METHODS The medical records of 183 MPE patients (59% male) treated at the MD Anderson Cancer Center and 11 institutions from the Rare Cancer Network were retrospectively reviewed. Mean patient age at diagnosis was 35.5 ± 15.8 years. Ninety-seven (53.0%) patients underwent surgery without RT, and 86 (47.0%) were treated with surgery and/or RT. Median RT dose was 50.4 Gy. Median follow-up was 83.9 months. RESULTS Fifteen (8.2%) patients died, 7 of unrelated causes. The estimated 10-year overall survival was 92.4% (95% CI: 87.7-97.1). Treatment failure was observed in 58 (31.7%) patients. Local failure, distant spinal relapse, and brain failure were observed in 49 (26.8%), 17 (9.3%), and 11 (6.0%) patients, respectively. The estimated 10-year progression-free survival was 61.2% (95% CI: 52.8-69.6). Age (<36 vs ≥36 y), treatment modality (surgery alone vs surgery and RT), and extent of surgery were prognostic factors for local control and progression-free survival on univariate and multivariate analysis. CONCLUSIONS In this series, treatment failure of MPE occurred in approximately one third of patients. The observed recurrence pattern of primary spinal MPE was mainly local, but a substantial number of patients failed nonlocally. Younger patients, those not treated initially with adjuvant RT, and those not undergoing gross total resection were significantly more likely to present with tumor recurrence/progression.
Abstract:
BACKGROUND The optimal management of high-risk prostate cancer remains uncertain. In this study we assessed the safety and efficacy of a novel multimodal treatment paradigm for high-risk prostate cancer. METHODS This was a prospective phase II trial including 35 patients with newly diagnosed high-risk localized or locally advanced prostate cancer treated with high-dose intensity-modulated radiation therapy, with or without preceding radical prostatectomy, plus concurrent intensified-dose docetaxel-based chemotherapy and long-term androgen deprivation therapy. The primary endpoint was acute and late toxicity, evaluated with the Common Terminology Criteria for Adverse Events version 3.0. The secondary endpoint was biochemical and clinical recurrence-free survival, explored with the Kaplan-Meier method. RESULTS Acute gastrointestinal and genitourinary toxicity was grade 2 in 23% and 20% of patients, and grade 3 in 9% and 3% of patients, respectively. Acute blood/bone marrow toxicity was grade 2 in 20% of patients. No acute grade ≥4 toxicity was observed. Late gastrointestinal and genitourinary toxicity was grade 2 in 9% of patients each. No late grade ≥3 toxicity was observed. Median follow-up was 63 months (interquartile range 31-79). Actuarial 5-year biochemical and clinical recurrence-free survival rates were 55% (95% confidence interval, 35-75%) and 70% (95% confidence interval, 52-88%), respectively. CONCLUSIONS In our phase II trial testing a novel multimodal treatment paradigm for high-risk prostate cancer, toxicity was acceptably low and mid-term oncological outcomes were good. This treatment paradigm may therefore warrant further evaluation in phase III randomized trials.