223 results for LONG-TERM HEALTH EFFECTS
Abstract:
PURPOSE As survival rates of adolescent and young adult (AYA) cancer patients increase, a growing number of AYA cancer survivors need follow-up care. However, there is little research on their preferences for follow-up care. We aimed to (1) describe AYA cancer survivors' preferences for the organization and content of follow-up care, (2) describe their preferences for different models of follow-up, and (3) investigate clinical and sociodemographic characteristics associated with preferences for the different models. METHODS AYA cancer survivors (diagnosed with cancer at age 16-25 years; ≥5 years after diagnosis) were identified through the Cancer Registry Zurich and Zug. Survivors completed a questionnaire on follow-up attendance, preferences for organizational aspects of follow-up care (what is important during follow-up, what should be included during appointments, what specialists should be involved, location), models of follow-up (telephone/questionnaire, general practitioner (GP), pediatric oncologist, medical oncologist, multidisciplinary team), and sociodemographic characteristics. Information on tumor and treatment was available through the Cancer Registry Zurich and Zug. RESULTS Of 389 contacted survivors, 160 (41.1 %) participated and 92 (57.5 %) reported still attending follow-up. Medical aspects of follow-up care were more important than general aspects (p < 0.001). Among different organizational models, follow-up by a medical oncologist was rated higher than all other models (p = 0.002). Non-attenders of follow-up rated GP-led follow-up significantly higher than attenders (p = 0.001). CONCLUSION Swiss AYA cancer survivors valued medical content of follow-up and showed a preference for medical oncologist-led follow-up. Implementation of different models of follow-up care might improve accessibility and attendance among AYA cancer survivors.
Abstract:
Background Malabsorptive bariatric surgery requires lifelong micronutrient supplementation. Based on the recommendations, we assessed the number of adjustments of micronutrient supplementation and the prevalence of vitamin and mineral deficiencies at a minimum follow-up of 5 years after biliopancreatic diversion with duodenal switch (BPD-DS). Methods Between October 2010 and December 2013, a total of 51 patients at a minimum follow-up of 5 years after BPD-DS were invited for a clinical check-up with a nutritional blood screening test for vitamins and minerals. Results Forty-three of fifty-one patients (84.3 %) completed the blood sampling, with a median follow-up of 71.2 (range 60–102) months after BPD-DS. At that time, all patients were supplemented with at least one multivitamin. However, 35 patients (81.4 %) showed either a vitamin or a mineral deficiency, or a combination of the two. Nineteen patients (44.1 %) were anemic, and 17 patients (39.5 %) had an iron deficiency. High deficiency rates for fat-soluble vitamins were also present: 23.2 % for vitamin A, 76.7 % for vitamin D, 7.0 % for vitamin E, and 11.6 % for vitamin K. Conclusions The results of our study show that the prevalence of vitamin and mineral deficiencies after BPD-DS is 81.4 % at a minimum follow-up of 5 years. The initial prescription of micronutrient supplementation and further adjustments during the first follow-up were insufficient to avoid long-term micronutrient deficiencies. Life-long monitoring of micronutrients at a specialized bariatric center, and possibly better micronutrient supplementation, is crucial to avoid a deficient micronutrient status at every stage after malabsorptive bariatric surgery.
Abstract:
STUDY DESIGN Retrospective analysis of prospectively collected clinical data. OBJECTIVE To assess the long-term outcome of patients with monosegmental L4/5 degenerative spondylolisthesis treated with the dynamic Dynesys device. SUMMARY OF BACKGROUND DATA The Dynesys system has been used as a semirigid, lumbar dorsal pedicular stabilization device since 1994. Good short-term results have been reported, but little is known about the long-term outcome after treatment for degenerative spondylolisthesis at the L4/5 level. METHODS A total of 39 consecutive patients with symptomatic degenerative lumbar spondylolisthesis at the L4/5 level were treated with bilateral decompression and Dynesys instrumentation. At a mean follow-up of 7.2 years (range, 5.0-11.2 y), they underwent clinical and radiographic evaluation and quality of life assessment. RESULTS At final follow-up, back pain improved in 89% and leg pain improved in 86% of patients compared with preoperative status. Eighty-three percent of patients reported global subjective improvement. Ninety-two percent would undergo the surgery again. Eight patients (21%) required further surgery because of symptomatic adjacent segment disease (6 cases), late-onset infection (1 case), and screw breakage (1 case). In 9 cases, radiologic progression of spondylolisthesis at the operated segment was found. Seventy-four percent of operated segments showed limited flexion-extension range of <4 degrees. Adjacent segment pathology, although without clinical correlation, was diagnosed at the L5/S1 (17.9%) and L3/4 (28.2%) segments. In 4 cases, asymptomatic screw loosening was observed. CONCLUSIONS Monosegmental Dynesys instrumentation of degenerative spondylolisthesis at L4/5 shows good long-term results. The rate of secondary surgeries is comparable to other dorsal instrumentation devices. Residual range of motion in the stabilized segment is reduced, and the rate of radiologic and symptomatic adjacent segment degeneration is low. Patient satisfaction is high. Dynesys stabilization of symptomatic L4/5 degenerative spondylolisthesis is a possible alternative to other stabilization devices.
Abstract:
OBJECTIVES We sought to assess the safety and efficacy of percutaneous closure of atrial septal defects (ASDs) under fluoroscopic guidance only, without periprocedural echocardiographic guidance. BACKGROUND Percutaneous closure of ASDs is usually performed using simultaneous fluoroscopic and transthoracic, transesophageal (TEE), or intracardiac echocardiographic (ICE) guidance. However, TEE requires deep sedation or general anesthesia, which considerably lengthens the procedure. TEE and ICE increase costs. METHODS Between 1997 and 2008, a total of 217 consecutive patients (age, 38 ± 22 years; 155 females and 62 males), of whom 44 were children ≤16 years, underwent percutaneous ASD closure with an Amplatzer ASD occluder (AASDO). TEE guidance and general anesthesia were restricted to the children, while devices were implanted under fluoroscopic guidance only in the adults. For comparison of technical safety and feasibility of the procedure without echocardiographic guidance, the children served as a control group. RESULTS The implantation procedure was successful in all but 3 patients (1 child and 2 adults; 1.4%). Mean device size was 23 ± 8 mm (range, 4-40 mm). There was 1 postprocedural complication (0.5%; transient perimyocarditis in an adult patient). At last echocardiographic follow-up, 13 ± 23 months after the procedure, 90% of patients had no residual shunt, whereas a minimal, moderate, or large shunt persisted in 7%, 1%, and 2%, respectively. Four adult patients (2%) underwent implantation of a second device for a residual shunt. During a mean follow-up period of 3 ± 2 years, 2 deaths and 1 ischemic stroke occurred. CONCLUSION According to these results, percutaneous ASD closure using the AASDO without periprocedural echocardiographic guidance seems safe and feasible.
Abstract:
The Ross operation remains a controversially discussed procedure when performed in the full root technique because concern exists regarding late dilatation of the pulmonary autograft and regurgitation of the neo-aortic valve. In 2008, we published our short-term experience when using external reinforcement of the autograft, which was inserted into a prosthetic Dacron graft. This detail was thought to prevent neoaortic root dilatation. Since 2006, 22 adult patients have undergone a Ross procedure using this technique. Indications were aortic regurgitation (n = 2), aortic stenosis (n = 15), and combined aortic stenosis and insufficiency (n = 5). A bicuspid aortic valve was present in 10 patients. Prior balloon valvuloplasty had been performed in seven patients. No early or late deaths occurred in this small series. One patient required aortic valve replacement early postoperatively, but freedom from late reoperation is 100% in the 21 remaining patients. Echocardiography confirmed the absence of more than trivial aortic insufficiency in 15 patients after a mean of 70 months (range, 14 to 108 months). No autograft dilatation was observed during follow-up and all patients are in New York Heart Association Class I. Autograft reinforcement is a simple and reproducible technical adjunct that may be especially useful in situations known for late autograft dilatation, namely, bicuspid aortic valve, predominant aortic insufficiency, and ascending aortic enlargement. The mid- to long-term results are encouraging because no late aortic root enlargement has been observed and the autograft valve is well functioning in all cases.
Abstract:
OBJECTIVE Candida esophagitis belongs to the most common AIDS-defining diseases, however, a comprehensive immune pathogenic concept is lacking. DESIGN We investigated the immune status of 37 HIV-1-infected patients from the Swiss HIV cohort study at diagnosis of Candida esophagitis, 1 year before, 1 year later and after 2 years of suppressed HIV RNA. We compared these patients to 3 groups: 37 HIV-1-infected patients without Candida esophagitis but similar CD4 counts as the patients at diagnosis (advanced HIV group), 15 HIV-1-infected patients with CD4 counts >500 cells/μl, CD4 nadir >350 cells/μl and suppressed HIV RNA under combination antiretroviral therapy (cART) (early cART group), and 20 healthy individuals. METHODS We investigated phenotype, cytokine production and proliferative capacity of different immune cells by flow cytometry and ELISpot. RESULTS We found that patients with Candida esophagitis had nearly abolished CD4 proliferation in response to C. albicans, significantly increased percentages of dysfunctional CD4 cells, significantly decreased cytotoxic NK-cell counts and peripheral innate lymphoid cells and significantly reduced IFN-γ and IL-17 production compared to the early cART group and healthy individuals. Most of these defects remained for more than 2 years despite viral suppression. The advanced HIV group without opportunistic infection showed partly improved immune recovery. CONCLUSIONS Our data indicate that Candida esophagitis in HIV-1-infected patients is caused by an accumulation of multiple, partly Candida-specific immunological defects. Long-term immune recovery is impaired, illustrating that specific immunological gaps persist despite cART. These data also support the rationale for early cART initiation to prevent irreversible immune defects.
Abstract:
PURPOSE Paroxysmal atrial fibrillation (PAF) often remains undiagnosed. Long-term surface ECG is used for screening, but has limitations. Esophageal ECG (eECG) allows recording of high-quality atrial signals, which were used to identify markers for PAF. METHODS In 50 patients (25 patients with PAF; 25 controls), an eECG and a surface ECG were recorded simultaneously. Partially A-V blocked atrial runs (PBARs) were quantified, and atrial signal duration in the eECG was measured. RESULTS eECG revealed that 1.8‰ of atrial premature beats in patients with known PAF were PBARs, with a median duration of 853 ms (interquartile range (IQR) 813-1836 ms) and a median atrial cycle length of 366 ms (IQR 282-432 ms). Even during a short recording duration of 2.1 h (IQR 1.2-17.2 h), PBARs occurred in 20% of PAF patients but not in controls (p=0.05). Left atrial signal duration was predictive for PAF (72% sensitivity, 80% specificity). CONCLUSIONS eECG reveals partially blocked atrial runs and prolonged left atrial signal duration - two novel surrogate markers for PAF.
Abstract:
The steep environmental gradients of mountain ecosystems over short distances reflect large gradients of several climatic parameters and hence provide excellent possibilities for ecological research on the effects of environmental change. To gain a better understanding of the dynamics of abiotic and biotic parameters of mountain ecosystems, long-term records are required, since permanent plots in mountain regions cover at best about 50-70 years. In order to extend investigations of ecological dynamics beyond these temporal limitations of permanent plots, paleoecological approaches can be used if the sampling resolution can be adapted to ecological research questions, e.g. a sample every 10 years. Paleoecological studies in mountain ecosystems can provide new ecological insights through the combination of different spatial and temporal scales. If we thus improve our understanding of processes across both steep environmental gradients and different time scales, we may be able to better estimate ecosystem responses to current and future environmental change (Ammann et al. 1993; Lotter et al. 1997). The complexity of ecological interactions in mountain regions forces us to concentrate on a number of sub-systems - without losing sight of the wider context. Here, we summarize a few case studies on the effects of Holocene climate change and disturbance on the vegetation of the Western Alps. To categorize the main response modes of vegetation to climatic change and disturbance in the Alps, we use three classes of ecological behaviour: "resilience", "adjustment", and "vulnerability". We assume a resilient (or elastic) behaviour if vegetation is able to recover to its former state, regaining important ecosystem characteristics, such as floristic composition, biodiversity, species abundances, and biomass (e.g. Küttel 1990; Aber and Melillo 1991). Conversely, vegetation displacements may occur in response to climatic change and/or disturbance. In some cases, this may culminate in irreversible large-scale processes such as species and/or community extinctions. Such drastic developments indicate high ecosystem vulnerability (or inelasticity or instability; for detailed definitions see Küttel 1990; Aber and Melillo 1991) to climatic change and/or disturbance. In this sense, the "vulnerability" (or instability) of an ecosystem is expressed by the degree of failure to recover to the original state before disturbance and/or climatic change. Between these two extremes (resilience vs. vulnerability), ecosystem adjustments to climatic change and/or disturbance may occur, including the appearance of new and/or the disappearance of old species. The term "adjustment" is hence used to indicate the response of vegetational communities that adapt to new environmental conditions without losing their main character. For forest ecosystems, we assume vegetational adjustments (rather than vulnerability) if the dominant (or co-dominant) tree species are not outnumbered or replaced by formerly unimportant plant species or new invaders. Adaptation as a genetic process is not discussed here and will require additional phylogeographical studies (that incorporate the analysis of ancient DNA) in order to fully understand the distributions of ecotypes.
Abstract:
Circulating miRNAs in body fluids, particularly serum, are promising candidates for future routine biomarker profiling in various pathologic conditions in human and veterinary medicine. However, reliable standardized methods for miRNA extraction from equine serum and fresh or archived whole blood are sorely lacking. We systematically compared various miRNA extraction methods from serum and whole blood after short and long-term storage without addition of RNA stabilizing additives prior to freezing. Time of storage at room temperature prior to freezing did not affect miRNA quality in serum. Furthermore, we showed that miRNA of NGS-sufficient quality can be recovered from blood samples after >10 years of storage at -80 °C. This allows retrospective analyses of miRNAs from archived samples.
Abstract:
BACKGROUND Long-term outcomes following ventricular tachycardia (VT) ablation are sparsely described. OBJECTIVES To describe long-term prognosis following VT ablation in patients with no structural heart disease (no SHD), ischemic cardiomyopathy (ICM), and non-ischemic cardiomyopathy (NICM). METHODS Consecutive patients (n=695; no SHD 98, ICM 358, NICM 239 patients) ablated for sustained VT were followed for a median of 6 years. Acute procedural parameters (complete success [non-inducibility of any VT]) and outcomes after multiple procedures were reported. RESULTS Compared with patients with no SHD or NICM, ICM patients were the oldest, included more males, and had the lowest left ventricular ejection fraction (LVEF), the highest rates of drug failure and VT storm, and the highest number of inducible VTs. Complete procedural success was highest in patients with no SHD compared with ICM and NICM patients (79%, 56%, and 60%, respectively; P<0.001). At 6 years, ventricular arrhythmia (VA)-free survival was highest in patients with no SHD (77%) compared with ICM (54%) and NICM (38%, P<0.001), and overall survival was lowest in ICM (48%), followed by NICM (74%) and no SHD patients (100%, P<0.001). Age, LVEF, presence of SHD, acute procedural success (non-inducibility of any VT), major complications, need for non-radiofrequency ablation modalities, and VA recurrence were independently associated with all-cause mortality. CONCLUSIONS Long-term follow-up after VT ablation shows excellent prognosis in the absence of SHD, the highest VA recurrence and transplantation rates in NICM, and the highest mortality in patients with ICM. The extremely low mortality for those without SHD suggests that VT in this population is very rarely an initial presentation of a myopathic process.
Abstract:
PURPOSE To identify the prevalence and progression of macular atrophy (MA) in neovascular age-related macular degeneration (AMD) patients under long-term anti-vascular endothelial growth factor (VEGF) therapy and to determine risk factors. METHOD This retrospective study included patients with neovascular AMD and ≥30 anti-VEGF injections. MA was measured using near-infrared and spectral-domain optical coherence tomography (SD-OCT) imaging. Yearly growth rate was estimated using square-root transformation to adjust for baseline area and allow for linearization of growth rate. Multiple regression with the Akaike information criterion (AIC) as model selection criterion was used to estimate the influence of various parameters on MA area. RESULTS Forty-nine eyes (47 patients, mean age 77 ± 14 years) were included, with a mean of 48 ± 13 intravitreal anti-VEGF injections (ranibizumab: 37 ± 11, aflibercept: 11 ± 6; mean number of injections/year: 8 ± 2.1) over a mean treatment period of 6.2 ± 1.3 years (range 4-8.5). Mean best-corrected visual acuity improved from 57 ± 17 letters at baseline (= treatment start) to 60 ± 16 letters at last follow-up. The MA prevalence within and outside the choroidal neovascularization (CNV) border at initial measurement was 45% and increased to 74%. Mean MA area increased from 1.8 ± 2.7 mm² within and 0.5 ± 0.98 mm² outside the CNV boundary to 2.7 ± 3.4 mm² and 1.7 ± 1.8 mm², respectively. Multivariate regression determined posterior vitreous detachment (PVD) and presence/development of intraretinal cysts (IRCs) as significant factors for total MA size (R² = 0.16, p = 0.02). MA area outside the CNV border was best explained by the presence of reticular pseudodrusen (RPD) and IRC (R² = 0.24, p = 0.02). CONCLUSION A majority of patients show MA after long-term anti-VEGF treatment. RPD, IRC and PVD, but not the number of injections or treatment duration, seem to be associated with MA size.
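The growth-rate estimation named in this abstract (square-root transformation of the atrophy area to adjust for baseline lesion size) can be written out as a short Python sketch. This is only an illustration of the general technique, not the study's actual analysis code; the function name is made up, and the numbers plugged in below are the mean areas and treatment duration reported in the abstract, used purely as example inputs.

import math

def sqrt_transformed_growth_rate(area_baseline_mm2, area_final_mm2, years):
    # Yearly growth rate on the square-root scale (mm/year).
    # Working on sqrt(area) reduces the dependence of the estimate on
    # baseline lesion size and linearizes growth over time.
    if years <= 0:
        raise ValueError("follow-up duration must be positive")
    return (math.sqrt(area_final_mm2) - math.sqrt(area_baseline_mm2)) / years

# Illustrative values only: mean MA area within the CNV border from the
# abstract (1.8 mm^2 at baseline, 2.7 mm^2 at last follow-up) over the
# mean treatment period of 6.2 years.
rate = sqrt_transformed_growth_rate(1.8, 2.7, 6.2)
print(f"square-root transformed growth rate: {rate:.3f} mm/year")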