936 results for "Long-term follow-up study"
Abstract:
Ileal pouch-anal anastomosis was an important advance in the treatment of ulcerative colitis. The aim of this study was to determine whether early complications of ileal pouch-anal anastomosis in patients with ulcerative colitis are associated with poor late functional results. PATIENTS AND METHODS: Eighty patients were operated on from 1986 to 2000, 62 with ileostomy and 18 without. Early and late complications were recorded, with specific emphasis on the incidence of pouchitis over prolonged follow-up. RESULTS: The ileostomy was closed an average of 9.2 months after the first operation. Fourteen patients were excluded from the long-term evaluation: 6 were lost to regular follow-up, 4 died, and 4 still have the ileostomy. Of the 4 patients who died, 1 died of surgical complications. Forty-one early postoperative complications occurred in 34 patients (42.5%). Twenty-nine late complications occurred in 25 patients, as follows: 16 had pouchitis, 3 associated with stenosis and 1 with sexual dysfunction; 5 had stenosis; and there was 1 case each of incisional hernia, ileoanal fistula, hepatic cancer, and endometriosis. Pouchitis occurred in 6 patients (9.8%) 1 year after ileal pouch-anal anastomosis, in 9 (14.8%) after 3 years, in 13 (21.3%) after 5 years, and in 16 (26.2%) after more than 6 years. The mean daily stool frequency was 12 before and 5.8 after operation. One pouch was removed because of fistulas that appeared 2 years after operation. CONCLUSIONS: Ileal pouch-anal anastomosis is associated with a considerable number of early complications. There was no correlation between pouchitis and severe disease, operation with or without ileostomy, or early postoperative complications. The incidence of pouchitis was directly proportional to the duration of follow-up.
Abstract:
PURPOSE: Infection is the leading complication of long-term central venous catheters, and its incidence may vary according to catheter type. The objective of this study was to compare the frequency and probability of infection between two types of long-term intravenous devices. METHODS: Retrospective study of 96 onco-hematology patients with partially implanted catheters (n = 55) or completely implanted ones (n = 42). Demographic data and catheter care were similar in both groups. Infection incidence and infection-free survival were used to compare the two devices. RESULTS: Over a median follow-up of 210 days, the catheter-related infection incidence was 0.2102/100 catheter-days for the partially implanted devices and 0.0045/100 catheter-days for the completely implanted devices; the incidence rate ratio was 46.7 (95% CI, 6.2 to 348.8). The 1-year first-infection-free survival was 45% versus 97%, and the 1-year survival free of removal due to infection was 42% versus 97%, for partially and totally implanted catheters, respectively (P < .001 for both comparisons). CONCLUSION: In the present study, the infection risk was lower in completely implanted devices than in partially implanted ones.
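The incidence rate ratio quoted above follows directly from the two per-100-catheter-day rates; a minimal check (the confidence interval would additionally require the raw infection counts and catheter-days, which the abstract does not report):

```python
# Reported catheter-related infection rates (per 100 catheter-days)
partial_rate = 0.2102   # partially implanted devices
complete_rate = 0.0045  # completely implanted devices

# Incidence rate ratio: how many times higher the infection rate is
# with partially implanted catheters
irr = partial_rate / complete_rate
print(round(irr, 1))  # 46.7, matching the reported rate ratio
```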
Abstract:
Background: Cardiac magnetic resonance imaging provides detailed anatomical information on infarction. However, few studies have investigated the association of these data with mortality after acute myocardial infarction. Objective: To study the association between data on infarct size and anatomy, as obtained from cardiac magnetic resonance imaging after acute myocardial infarction, and long-term mortality. Methods: A total of 1959 reports of "infarct size" were identified among 7119 cardiac magnetic resonance imaging studies, of which 420 had clinical and laboratory confirmation of previous myocardial infarction. The variables studied were the classic risk factors, left ventricular ejection fraction, categorized ventricular function, and location of acute myocardial infarction. Infarct size and acute myocardial infarction extent and transmurality were analyzed alone and together, using the variable named "MET-AMI". The statistical analysis was carried out using elastic net regularization, with the Cox model and survival trees. Results: The mean age was 62.3 ± 12 years, and 77.3% were males. During a mean follow-up of 6.4 ± 2.9 years, there were 76 deaths (18.1%). Serum creatinine, diabetes mellitus and previous myocardial infarction were independently associated with mortality. Age was the main explanatory factor. The cardiac magnetic resonance imaging variables independently associated with mortality were transmurality of acute myocardial infarction (p = 0.047), ventricular dysfunction (p = 0.0005) and infarct size (p = 0.0005); the latter was the main explanatory variable for ischemic heart disease death. The MET-AMI variable was the most strongly associated with risk of ischemic heart disease death (HR: 16.04; 95% CI: 2.64-97.5; p = 0.003).
Conclusion: The anatomical data of infarction, obtained from cardiac magnetic resonance imaging after acute myocardial infarction, were independently associated with long-term mortality, especially for ischemic heart disease death.
Abstract:
Background: BNP has been extensively evaluated for determining short- and intermediate-term prognosis in patients with acute coronary syndrome, but its role in long-term mortality is not known. Objective: To determine the very long-term prognostic role of B-type natriuretic peptide (BNP) for all-cause mortality in patients with non-ST segment elevation acute coronary syndrome (NSTEACS). Methods: A cohort of 224 consecutive patients with NSTEACS, prospectively seen in the Emergency Department, had BNP measured on arrival to establish prognosis and underwent a median 9.34-year follow-up for all-cause mortality. Results: Unstable angina was diagnosed in 52.2% and non-ST segment elevation myocardial infarction in 47.8%. Median admission BNP was 81.9 pg/mL (interquartile range, 22.2 to 225), and the mortality rate rose across increasing BNP quartiles: 14.3%, 16.1%, 48.2%, and 73.2% (p < 0.0001). The ROC curve identified 100 pg/mL as the best BNP cut-off value for mortality prediction (area under the curve = 0.789, 95% CI = 0.723-0.854), and this cut-off was a strong predictor of late mortality: BNP < 100 = 17.3% vs. BNP ≥ 100 = 65.0%, RR = 3.76 (95% CI = 2.49-5.63, p < 0.001). On logistic regression analysis, age > 72 years (OR = 3.79, 95% CI = 1.62-8.86, p = 0.002), BNP ≥ 100 pg/mL (OR = 6.24, 95% CI = 2.95-13.23, p < 0.001) and estimated glomerular filtration rate (OR = 0.98, 95% CI = 0.97-0.99, p = 0.049) were independent late-mortality predictors. Conclusions: BNP measured at hospital admission in patients with NSTEACS is a strong, independent predictor of very long-term all-cause mortality. These findings raise the hypothesis that BNP should be measured in all patients with NSTEACS at the index event for long-term risk stratification.
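The relative risk reported for the 100 pg/mL cut-off can be reproduced from the two group mortality rates; a quick check:

```python
# Late mortality by admission BNP group (reported percentages)
mortality_bnp_high = 65.0  # BNP >= 100 pg/mL
mortality_bnp_low = 17.3   # BNP < 100 pg/mL

# Relative risk of death for the high-BNP group
rr = mortality_bnp_high / mortality_bnp_low
print(round(rr, 2))  # 3.76, as reported
```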
Abstract:
Background. Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. Methods. On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk group-specific probabilities for the presence of drug resistance mutations (upper estimate). Results. Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates range from 72% to 75%) and continued enrollment of low-risk patients taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates range from 7% to 12%). A subset of 4184 participants (52%) had 1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence became smaller in later years. Conclusions. Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level.
Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
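The upper estimate described in the Methods is a frequency-weighted average of risk-group-specific probabilities. A minimal sketch with hypothetical group fractions and probabilities (the actual Swiss cohort values are not given in the abstract):

```python
# Hypothetical risk groups: (fraction of the cohort, probability that
# drug resistance mutations are present in that group)
risk_groups = [
    (0.30, 0.75),  # e.g., prior mono/dual-NRTI exposure (high risk)
    (0.50, 0.40),  # e.g., earlier combination ART (intermediate risk)
    (0.20, 0.10),  # e.g., first-line boosted-PI or NNRTI ART (low risk)
]

# Upper estimate: frequency-weighted average across the groups
upper_estimate = sum(fraction * prob for fraction, prob in risk_groups)
print(round(upper_estimate, 3))  # 0.445 for these illustrative numbers
```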
Abstract:
STATEMENT OF PROBLEM: Identifying the owner of a lost denture is a common and expensive problem in long-term care facilities (LTCFs) and hospitals. PURPOSE: The purpose of this study was to evaluate the reliability of radiofrequency identification (RFID) for identifying the dentures of LTCF residents after 3 and 6 months. MATERIAL AND METHODS: Thirty-eight residents of 2 LTCFs in Switzerland agreed to participate after providing informed consent. Each tag was programmed with the family and first names of the participant and then inserted in the denture. After placement of the tag, the information was read. A second and third assessment to review the functioning of the tags occurred at 3 and 6 months, and defective tags (if present) were reported and replaced. The data were analyzed with descriptive statistics. RESULTS: At the 3-month assessment of 34 residents (63 tags), 1 tag was unreadable and 62 tags (98.2%) were operational. At 6 months, the tags of 27 of the enrolled residents (50 tags) were available for review. No examined tag was defective at this time point. CONCLUSIONS: Within the limits of this study (number of patients, 6-month time span), RFID appears to be a reliable method of tracking and identifying dentures, with only 1 of 65 devices being unreadable at 3 months and 100% of the 50 tags reviewed being readable at the end of the trial.
Perineal stapled prolapse resection for external rectal prolapse: is it worthwhile in the long-term?
Abstract:
BACKGROUND: Perineal stapled prolapse (PSP) resection is a novel operation for treating external rectal prolapse. However, no long-term results have been reported in the literature. This study analyses the long-term recurrence rate, functional outcome, and morbidity associated with PSP resection. METHODS: Nine consecutive patients undergoing PSP resection between 2007 and 2011 were prospectively followed. Surgery was performed by the same surgeons in a standardised technique. Recurrence rate, functional outcome, and complication grade were prospectively assessed. RESULTS: All 9 patients undergoing PSP resection were investigated. The median age was 72 years (range 25-88 years). No intraoperative complications occurred. Faecal incontinence, present preoperatively in 2 patients, worsened postoperatively in one patient (Vaizey 18 to 22). One patient developed new-onset faecal incontinence (Vaizey 18). The median obstructive defecation syndrome score decreased significantly after surgery, from 11 (range 8-13) to 5 (range 4-8) (p < 0.005). At a median follow-up of 40 months (range 14-58 months), the prolapse recurrence rate was 44% (4/9 patients). CONCLUSIONS: PSP resection is a fast and safe procedure associated with low morbidity. However, the poor long-term functional outcome and the recurrence rate of 44% warrant cautious patient selection.
Abstract:
Background: Gastric banding is currently one of the most commonly performed procedures for morbid obesity. Results are related in part to the surgical technique and the quality of follow-up. Several bands are currently on the market, and it may be that the type of band also plays a role in long-term results. The aim of this prospective randomized study was to compare the long-term results of the Lapband and the SAGB. Patients and methods: In three institutions with a common bariatric surgeon, consecutive patients undergoing laparoscopic gastric banding for morbid obesity were randomized to receive either a Lapband or a SAGB. The Lapband was placed using the perigastric technique and the SAGB with the pars flaccida technique. All data were collected prospectively. The median duration of follow-up was 131 months (103-147). Patients who lost their band were excluded from analysis as of band removal. Results: 180 patients were included between December 1998 and June 2002, 90 in each group. Except for age, which was lower in SAGB patients, the preoperative characteristics were similar in the two groups. Early band-related morbidity was higher in the SAGB group (6.6 vs 0%, p=0.03). Patients with a Lapband lost weight more quickly than those with a SAGB (EBMIL 50.8 vs 39.8% after 12 months, p<0.001), but the two weight-loss curves converged after 24 months, and no difference could be observed later on, up to 12 years after surgery (EBMIL 53.7 vs 58.1% after 10 years, p=0.68). Long-term complications developed in 91 patients (50.5%). Severe complications leading to band removal, with or without conversion to another procedure, developed in 30 and 40 patients in the Lapband and SAGB groups, respectively (33.3 vs 44.4%, p=0.16). Conclusions: This prospective randomized study shows no significant difference in the long-term results of gastric banding between the Lapband and the SAGB. Both bands were associated with significant long-term complication and band removal/conversion rates.
Patients who retain their band have acceptable long-term weight loss. It is likely that the concept of gastric banding rather than the device itself plays the most important role in long-term results.
Abstract:
Object The purpose of this study was to establish the safety and efficacy of repeat Gamma Knife surgery (GKS) for recurrent trigeminal neuralgia (TN). Methods Using the prospective database of TN patients treated with GKS in Timone University Hospital (Marseille, France), data were analyzed for 737 patients undergoing GKS for TN Type 1 from July 1992 to November 2010. Among the 497 patients with initial pain cessation, 34.4% (157/456 with ≥ 1-year follow-up) experienced at least 1 recurrence. Thirteen patients (1.8%) were considered for a second GKS, proposed only if the patients had good and prolonged initial pain cessation after the first GKS, with no other treatment alternative at the moment of recurrence. As for the first GKS, a single 4-mm isocenter was positioned in the cisternal portion of the trigeminal nerve at a median distance of 7.6 mm (range 4-14 mm) anterior to the emergence of the nerve (retrogasserian target). A median maximum dose of 90 Gy (range 70-90 Gy) was delivered. Data for 9 patients with at least 1 year of follow-up were analyzed. A systematic review of the literature was also performed, and its results are compared with those of the Marseille study. Results The median time to retreatment was 72 months (range 12-125 months) in the Marseille study and 17 months (range 3-146 months) in the literature. In the Marseille study, the median follow-up period was 33.9 months (range 12-96 months), and 8 of 9 patients (88.9%) had initial pain cessation after a median of 6.5 days (range 1-180 days). The actuarial rate of new hypesthesia was 33.3% at 6 months and 50% at 1 year, remaining stable for 7 years. The actuarial probabilities of maintaining pain relief without medication at 6 months and 1 year were 100% and 75%, respectively, and remained stable for 7 years. The systematic review analyzed 20 peer-reviewed studies reporting outcomes of repeat GKS for recurrent TN, with a total of 626 patients.
Both the selection of cases for retreatment and the reporting of outcomes vary widely among studies, with a median rate of 88% (range 60%-100%) for initial pain cessation and 33% (range 11%-80%) for new hypesthesia. Conclusions Results from the Marseille study raise the question of surgical alternatives after failed GKS for TN. The rates of initial pain cessation and recurrence seem comparable to, or even better than, those of the first GKS, according to different studies, but toxicity is much higher, both in the Marseille study and in the published data. Neither the Marseille data nor the literature answer the 3 cardinal questions regarding repeat radiosurgery in recurrent TN: which patients to retreat, which target is optimal, and which dose to use.
Partial cricotracheal resection for pediatric subglottic stenosis: long-term outcome in 57 patients.
Abstract:
OBJECTIVE: We sought to assess the long-term outcome of 57 pediatric patients who underwent partial cricotracheal resection for subglottic stenosis. METHODS: Eighty-one pediatric partial cricotracheal resections were performed in our tertiary care institution between 1978 and 2004. Fifty-seven patients had a minimal follow-up time of 1 year and were included in this study. Evaluation was based on the last laryngotracheal endoscopy, the responses to a questionnaire, and a retrospective review of the patients' data. The following parameters were analyzed: decannulation rate, breathing, voice quality, and deglutition. RESULTS: A single-stage partial cricotracheal resection was performed in 38 patients, and a double-stage procedure was performed in 19 patients. Sixteen patients underwent an extended partial cricotracheal resection (ie, partial cricotracheal resection combined with another open procedure). At a median follow-up time of 5.1 years, the decannulation rates after a single- or double-stage procedure were 97.4% and 95%, respectively. Two patients remained tracheotomy dependent. One patient had moderate exertional dyspnea, and all other patients had no exertional dyspnea. Voice quality improved after surgical intervention by 1 +/- 1.34 grades of dysphonia (P < .0001) according to the adapted GRBAS grading system (Grade, Roughness, Breathiness, Asthenia, and Strain). CONCLUSIONS: Partial cricotracheal resection provides good results for grades III and IV subglottic stenosis as a primary or salvage operation. The procedure has no deleterious effects on laryngeal growth and function. Voice quality improves significantly after surgical intervention but largely depends on the preoperative condition.
Abstract:
BACKGROUND: Anti-TNFα agents are commonly used for ulcerative colitis (UC) therapy in the event of non-response to conventional strategies or as colon-salvaging therapy. The objectives were to assess the appropriateness of biological therapies for UC patients and to study treatment discontinuation over time, according to appropriateness of treatment, as a measure of outcome. METHODS: We selected adult ulcerative colitis patients from the Swiss IBD cohort who had been treated with anti-TNFα agents. Appropriateness of the first-line anti-TNFα treatment was assessed using detailed criteria developed during the European Panel on the Appropriateness of Therapy for UC. Treatment discontinuation as an outcome was assessed for each category of appropriateness. RESULTS: Appropriateness of the first-line biological treatment was determined in 186 UC patients. For 64% of them, this treatment was considered appropriate. During follow-up, 37% of all patients discontinued biological treatment, 17% specifically because of failure. Time-to-failure of treatment differed significantly between patients on an appropriate biological treatment and those for whom the treatment was considered not appropriate (p=0.0007). Discontinuation rates after 2 years were 26% and 54%, respectively, in these two groups. Patients on inappropriate biological treatment were more likely to have severe disease and concomitant steroids and/or immunomodulators. They were also consistently more likely to suffer a failure of efficacy and to stop therapy during follow-up. CONCLUSION: Appropriateness of first-line anti-TNFα therapy results in a greater likelihood of continuing with the therapy. In situations where biological treatment is of uncertain appropriateness or inappropriate, physicians should consider other options instead of prescribing anti-TNFα agents.
Abstract:
BACKGROUND. Either higher levels of initial DNA damage or lower levels of radiation-induced apoptosis in peripheral blood lymphocytes have been associated with an increased risk of developing late radiation-induced toxicity. It has recently been published that these two predictive tests are inversely related. The aim of the present study was to investigate the combined role of both tests in relation to clinical radiation-induced toxicity in a set of breast cancer patients treated with high-dose hyperfractionated radical radiotherapy. METHODS. Peripheral blood lymphocytes were taken from 26 consecutive patients with locally advanced breast carcinoma treated with high-dose hyperfractionated radical radiotherapy. Acute and late cutaneous and subcutaneous toxicity was evaluated using the Radiation Therapy Oncology Group morbidity scoring schema. The mean follow-up of survivors (n = 13) was 197.23 months. Radiosensitivity of lymphocytes was quantified as the initial number of DNA double-strand breaks (DSB) induced per Gy and per DNA unit (200 Mbp). Radiation-induced apoptosis (RIA) at 1, 2 and 8 Gy was measured by flow cytometry using annexin V/propidium iodide. RESULTS. The mean DSB/Gy/DNA unit was 1.70 ± 0.83 (range 0.63-4.08; median 1.46). Radiation-induced apoptosis increased with radiation dose (median 12.36, 17.79 and 24.83 for 1, 2, and 8 Gy, respectively). In multivariate analysis, "expected resistant patients" (DSB values lower than 1.78 DSB/Gy per 200 Mbp and RIA values over 9.58, 14.40 or 24.83 for 1, 2 and 8 Gy, respectively) were at low risk of suffering severe subcutaneous late toxicity (HR 0.223, 95% CI 0.073-0.678, P = 0.008; HR 0.206, 95% CI 0.063-0.677, P = 0.009; HR 0.239, 95% CI 0.062-0.929, P = 0.039, for RIA at 1, 2 and 8 Gy, respectively). CONCLUSIONS.
A radiation-resistant profile is proposed: in our series, patients who presented lower levels of initial DNA damage and higher levels of radiation-induced apoptosis were at low risk of suffering severe subcutaneous late toxicity after clinical treatment at high radiation doses. However, given the small sample size, prospective studies with larger numbers of patients are needed to validate these results.
Abstract:
BACKGROUND. Transsexual persons afford a very suitable model to study the effect of sex steroids on uric acid metabolism. DESIGN. This was a prospective study to evaluate the uric acid levels and fractional excretion of uric acid (FEUA) in a cohort of 69 healthy transsexual persons, 22 male-to-female transsexuals (MFTs) and 47 female-to-male transsexuals (FMTs). The subjects were studied at baseline and 1 and 2 yr after starting cross-sex hormone treatment. RESULTS. The baseline levels of uric acid were higher in the MFT group. Compared with baseline, uric acid levels had fallen significantly after 1 yr of hormone therapy in the MFT group and had risen significantly in the FMT group. The baseline FEUA was greater in the FMT group. After 2 yr of cross-sex hormone therapy, the FEUA had increased in MFTs (P = 0.001) and fallen in FMTs (P = 0.004). In MFTs, the levels of uric acid at 2 yr were lower in those who had received higher doses of estrogens (P = 0.03), and the FEUA was higher (P = 0.04). The FEUA at 2 yr was associated with both the estrogen dose (P = 0.02) and the serum levels of estradiol-17beta (P = 0.03). In MFTs, a correlation was found after 2 yr of therapy between the homeostasis model assessment of insulin resistance and the serum uric acid (r = 0.59; P = 0.01). CONCLUSIONS. Serum levels of uric acid and the FEUA are altered in transsexuals as a result of cross-sex hormone therapy. The results concerning the MFT group support the hypothesis that the lower levels of uric acid in women are due to estrogen-induced increases in FEUA.
Abstract:
BACKGROUND: The objective of the present study was to compare current results of prosthetic valve replacement following acute infective native valve endocarditis (NVE) with those of prosthetic valve endocarditis (PVE). Prosthetic valve replacement is often necessary for acute infective endocarditis. Although valve repair and homografts have been associated with excellent outcomes, homograft availability and the extent of valvular destruction often dictate prosthetic valve replacement in patients with acute bacterial endocarditis. METHODS: A retrospective analysis of the experience with prosthetic valve replacement following acute NVE and PVE between 1988 and 1998 was performed at the Montreal Heart Institute. RESULTS: Seventy-seven patients (57 men and 20 women, mean age 48 +/- 16 years) with acute infective endocarditis underwent valve replacement. Fifty patients had NVE and 27 had PVE. Four patients (8%) with NVE died within 30 days of operation, and there were no hospital deaths in patients with PVE. Survival at 1, 5, and 7 years averaged 80% +/- 6%, 76% +/- 6%, and 76% +/- 6% for NVE and 70% +/- 9%, 59% +/- 10%, and 55% +/- 10% for PVE, respectively (p = 0.15). Reoperation-free survival at 1, 5, and 7 years averaged 80% +/- 6%, 76% +/- 6%, and 76% +/- 6% for NVE and 45% +/- 10%, 40% +/- 10%, and 36% +/- 9% for PVE (p = 0.003). Five-year survival for NVE averaged 75% +/- 9% following aortic valve replacement and 79% +/- 9% following mitral valve replacement. Five-year survival for PVE averaged 66% +/- 12% following aortic valve replacement and 43% +/- 19% following mitral valve replacement (p = 0.75). Nine patients underwent reoperation during follow-up: indications were prosthesis infection in 4 patients (3 mitral, 1 aortic), dehiscence of mitral prosthesis in 3, and dehiscence of aortic prosthesis in 2.
CONCLUSIONS: Prosthetic valve replacement for NVE resulted in good long-term patient survival with a minimal risk of reoperation compared with patients who underwent valve replacement for PVE. In patients with PVE, those who needed reoperation had recurrent endocarditis or noninfectious periprosthetic dehiscence.
Abstract:
BACKGROUND: This study aimed to investigate the influence of deep sternal wound infection (DSWI) on long-term survival following cardiac surgery. MATERIAL AND METHODS: In our institutional database, we retrospectively evaluated the medical records of 4732 adult patients who underwent open-heart surgery from January 1995 through December 2005. Predictive factors for DSWI were determined using logistic regression analysis. Then, each patient with DSWI was matched with 2 controls without DSWI, according to the risk factors identified previously. After checking the balance achieved by matching, short-term mortality was compared between groups using a paired test, and long-term survival was compared using Kaplan-Meier analysis and a Cox proportional hazards model. RESULTS: Overall, 4732 records were analyzed. The mean age of the investigated population was 69.3±12.8 years. DSWI occurred in 74 (1.56%) patients. Significant independent predictive factors for deep sternal infection were active smoking (OR 2.19, 95% CI 1.35-3.53, p=0.001), obesity (OR 1.96, 95% CI 1.20-3.21, p=0.007), and insulin-dependent diabetes mellitus (OR 2.09, 95% CI 1.05-10.06, p=0.016). Mean follow-up in the matched set was 125 months (IQR 99-162). After matching, in-hospital mortality was higher in the DSWI group (8.1% vs. 2.7%, p=0.03), but DSWI was not an independent predictor of long-term survival (adjusted HR 1.5, 95% CI 0.7-3.2, p=0.33). CONCLUSIONS: These results indicate that post-sternotomy deep wound infection did not influence long-term survival in a general adult cardiac surgical population.
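The reported DSWI incidence is a simple proportion of the cohort, which can be verified directly:

```python
# Cohort size and DSWI cases as reported in the abstract
dswi_cases = 74
cohort_size = 4732

# Incidence as a percentage of the cohort
incidence_pct = 100 * dswi_cases / cohort_size
print(round(incidence_pct, 2))  # 1.56, matching the reported 1.56%
```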