223 results for LONG-TERM HEALTH EFFECTS


Relevance: 100.00%

Abstract:

Long-term electrocardiography (ECG) featuring adequate atrial and ventricular signal quality is highly desirable. Routinely used surface leads are limited in atrial signal sensitivity and recording capability, impeding complete ECG delineation, e.g. in the presence of supraventricular arrhythmias. Long-term esophageal ECG might overcome these limitations but requires a dedicated lead system and recorder design. To this end, we analysed multiple-lead esophageal ECGs with respect to signal quality by describing the ECG waves as a function of the insertion level, interelectrode distance, electrode shape and amplifier's input range. The results derived from clinical data show that two bipolar esophageal leads, an atrial lead with short (15 mm) interelectrode distance and a ventricular lead with long (80 mm) interelectrode distance, provide non-inferior ventricular signal strength and superior atrial signal strength compared to standard surface lead II. A high atrial signal slope in particular is observed with the atrial esophageal lead. The proposed esophageal lead system in combination with an increased recorder input range of ±20 mV minimizes signal loss due to excessive electrode motion typically observed in esophageal ECGs. The design proposal might help to standardize long-term esophageal ECG registrations and facilitate novel ECG classification systems based on the independent detection of ventricular and atrial electrical activity.
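
A minimal sketch of the two signal-chain ideas above, assuming entirely hypothetical waveforms: a bipolar esophageal lead obtained as the potential difference between two electrodes, and a check of how many samples would exceed a ±20 mV recorder input range. Sampling rate, amplitudes and frequencies are illustrative assumptions, not values from the study.

```python
import numpy as np

fs = 500                                  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)              # 10 s of signal

# Hypothetical unipolar potentials (mV) at two closely spaced esophageal
# electrodes; in a real recording these come from the acquisition hardware.
proximal = 12.0 * np.sin(2 * np.pi * 1.2 * t)
distal = 9.0 * np.sin(2 * np.pi * 1.2 * t + 0.4)

atrial_lead = proximal - distal           # bipolar lead = potential difference

input_range_mv = 20.0                     # assumed +/-20 mV recorder input range
saturated = np.abs(atrial_lead) >= input_range_mv
print(f"samples that would saturate the recorder: {saturated.mean():.1%}")
```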

Relevance: 100.00%

Abstract:

Aims: Arterial plaque rupture and thrombus characterise ST-elevation myocardial infarction (STEMI) and may aggravate delayed arterial healing following durable polymer drug-eluting stent (DP-DES) implantation. Biodegradable polymer (BP) may improve biocompatibility. We compared long-term outcomes in STEMI patients receiving BP-DES vs. durable polymer sirolimus-eluting stents (DP-SES). Methods and results: We pooled individual patient-level data from three randomised clinical trials (ISAR-TEST-3, ISAR-TEST-4 and LEADERS) comparing outcomes of BP-DES with DP-SES at four years. The primary endpoint, major adverse cardiac events (MACE), was the composite of cardiac death, myocardial infarction (MI), or target lesion revascularisation (TLR). Secondary endpoints were TLR, cardiac death or MI, and definite or probable stent thrombosis. Of 497 patients with STEMI, 291 received BP-DES and 206 DP-SES. At four years, MACE was significantly reduced following treatment with BP-DES (hazard ratio [HR] 0.59, 95% CI: 0.39-0.90; p=0.01), driven by reduced TLR (HR 0.54, 95% CI: 0.30-0.98; p=0.04). Trends towards reduction were seen for cardiac death or MI (HR 0.63, 95% CI: 0.37-1.05; p=0.07) and definite or probable stent thrombosis (3.6% vs. 7.1%; HR 0.49, 95% CI: 0.22-1.11; p=0.09). Conclusions: In STEMI, BP-DES demonstrated superior clinical outcomes to DP-SES at four years. Trends towards reduced cardiac death or myocardial infarction and reduced stent thrombosis require corroboration in specifically powered trials.
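
As a hedged illustration of how a pooled patient-level hazard ratio of this kind is typically estimated, the sketch below fits a Cox proportional hazards model (lifelines) to simulated data with administrative censoring at four years. The column names, event rates and effect size are assumptions for illustration, not the trials' data or analysis code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
bp_des = rng.integers(0, 2, n)                      # 1 = BP-DES, 0 = DP-SES (hypothetical)

# Simulate exponential event times with a lower hazard in the BP-DES arm.
hazard = 0.10 * np.where(bp_des == 1, 0.6, 1.0)
event_time = rng.exponential(1 / hazard)
time_years = np.minimum(event_time, 4.0)            # administrative censoring at 4 years
mace = (event_time <= 4.0).astype(int)

df = pd.DataFrame({"time_years": time_years, "mace": mace, "bp_des": bp_des})
cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="mace")
print(cph.hazard_ratios_)                           # exp(coef) for bp_des ~ hazard ratio
```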

Relevance: 100.00%

Abstract:

BACKGROUND We describe the long-term outcome after clinical introduction and dose escalation of somatostatin receptor targeted therapy with [90Y-DOTA]-TOC in patients with metastasized neuroendocrine tumors. METHODS In a clinical phase I dose escalation study we treated patients with increasing [90Y-DOTA]-TOC activities. Multivariable Cox regression and competing risk regression were used to compare efficacy and toxicities of the different dosage protocols. RESULTS Overall, 359 patients were recruited; 60 patients were enrolled for low-dose (median: 2.4 GBq/cycle, range: 0.9-7.8 GBq/cycle), 77 patients for intermediate-dose (median: 3.3 GBq/cycle, range: 2.0-7.4 GBq/cycle) and 222 patients for high-dose (median: 6.7 GBq/cycle, range: 3.7-8.1 GBq/cycle) [90Y-DOTA]-TOC treatment. The incidences of grade 1-4 hematotoxicities were 65.0%, 64.9% and 74.8%; the incidences of grade 4/5 kidney toxicities were 8.4%, 6.5% and 14.0%; and the median survival was 39 (range: 1-158) months, 34 (range: 1-118) months and 29 (range: 1-113) months, respectively. The high-dose protocol was associated with an increased risk of kidney toxicity (hazard ratio: 3.12 (1.13-8.59) vs. intermediate dose, p = 0.03) and a shorter overall survival (hazard ratio: 2.50 (1.08-5.79) vs. low dose, p = 0.03). CONCLUSIONS Increasing [90Y-DOTA]-TOC activities may be associated with increasing hematological toxicities. The dose-related hematotoxicity profile of [90Y-DOTA]-TOC could facilitate tailoring [90Y-DOTA]-TOC treatment in patients with preexisting hematotoxicities. The results of the long-term outcome suggest that fractionated [90Y-DOTA]-TOC treatment might reduce renal toxicity and improve overall survival. (ClinicalTrials.gov number NCT00978211).

Relevance: 100.00%

Abstract:

Background Atrial fibrillation (AF) is common and may have severe consequences. Continuous long-term electrocardiography (ECG) is widely used for AF screening. Recently, commercial ECG analysis software was launched that automatically detects AF in long-term ECGs. It has been claimed that such tools offer reliable AF screening and save time for ECG analysis. However, this has not been investigated in a real-life patient cohort. Objective To investigate the performance of automatic software-based screening for AF in long-term ECGs. Methods Two independent physicians manually screened 22,601 hours of continuous long-term ECGs from 150 patients for AF. Presence, number, and duration of AF episodes were registered. Subsequently, the recordings were screened for AF by an established ECG analysis software (Pathfinder SL), and its performance was validated against the thorough manual analysis (gold standard). Results Sensitivity and specificity for AF detection were 98.5% (95% confidence interval 91.72%–99.96%) and 80.21% (95% confidence interval 70.83%–87.64%), respectively. Software-based AF detection was inferior to manual analysis by physicians (P < .0001). Median AF duration was underestimated (19.4 hours vs 22.1 hours; P < .001) and the median number of AF episodes was overestimated (32 episodes vs 2 episodes; P < .001) by the software. In comparison to extensive quantitative manual ECG analysis, software-based analysis saved time (2 minutes vs 19 minutes; P < .001). Conclusion Owing to its high sensitivity and ability to save time, software-based ECG analysis may be used as a screening tool for AF. An additional manual confirmatory analysis may be required to reduce the number of false-positive findings.
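
For readers who want the arithmetic behind such figures, the sketch below computes sensitivity, specificity and exact Clopper-Pearson 95% confidence intervals from a 2x2 confusion table; the counts are hypothetical and are not the study's data.

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) confidence interval for a binomial proportion."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# Hypothetical per-patient counts: AF present vs. detected by the software.
tp, fn = 65, 1        # AF patients correctly detected / missed
tn, fp = 68, 16       # non-AF patients correctly negative / falsely flagged

sens, spec = tp / (tp + fn), tn / (tn + fp)
print(f"sensitivity {sens:.1%}, 95% CI {clopper_pearson(tp, tp + fn)}")
print(f"specificity {spec:.1%}, 95% CI {clopper_pearson(tn, tn + fp)}")
```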

Relevance: 100.00%

Abstract:

Patient self-management (PSM) of oral anticoagulation is under discussion because evidence from real-life settings is lacking. Using data from a nationwide, prospective cohort study in Switzerland, we assessed the overall long-term efficacy and safety of PSM and examined subgroups. Data from 1140 patients (5818.9 patient-years) were analysed, and no patients were lost to follow-up. Median follow-up was 4.3 years (range 0.2-12.8 years). Median age at the time of training was 54.2 years (range 18.2-85.2) and 34.6% were women. All-cause mortality was 1.4 per 100 patient-years (95% CI 1.1-1.7), with higher rates in patients with atrial fibrillation (2.5; 1.6-3.7; p<0.001), patients >50 years of age (2.0; 1.6-2.6; p<0.001), and men (1.6; 1.2-2.1; p = 0.036). The rate of thromboembolic events was 0.4 (0.2-0.6) and independent of indication, sex and age. Major bleeding was observed at a rate of 1.1 (0.9-1.5) per 100 patient-years. Efficacy was comparable to standard care and new oral anticoagulants in a network meta-analysis. PSM by properly trained patients is effective and safe in a long-term real-life setting and robust across clinical subgroups. Adoption in various clinical settings, including rural areas and those with limited access to medical care, is warranted.
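
A small sketch of how a rate per 100 patient-years and its exact Poisson 95% confidence interval are obtained. The denominator is the one reported above; the event count is back-calculated from the reported rate (1.4 per 100 patient-years) purely for illustration and should not be read as the study's exact figure.

```python
from scipy.stats import chi2

def poisson_rate_ci(events, person_years, alpha=0.05):
    """Exact (Garwood) confidence interval for an event rate per 100 person-years."""
    lo = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * events + 2) / 2
    return 100 * lo / person_years, 100 * hi / person_years

person_years = 5818.9        # reported follow-up
deaths = 81                  # ~1.4 per 100 patient-years, back-calculated (illustrative)

rate = 100 * deaths / person_years
lo, hi = poisson_rate_ci(deaths, person_years)
print(f"all-cause mortality: {rate:.1f} per 100 patient-years (95% CI {lo:.1f}-{hi:.1f})")
```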

Relevance: 100.00%

Abstract:

Purpose: To assess liver remnant volume regeneration and maintenance, and complications in the long-term follow-up of donors after living donor liver transplantation (LDLT) using CT and MRI. Materials and Methods: 47 donors with a mean age of 33.5 years who donated liver tissue for transplantation and who were available for follow-up imaging were included in this retrospective study. Contrast-enhanced CT and MR studies were acquired for routine follow-up. Two observers evaluated pre- and postoperative images regarding anatomy and pathological findings. Volumes were manually measured on contrast-enhanced images in the portal venous phase, and potential postoperative complications were documented. Pre- and postoperative liver volumes were compared to evaluate liver remnant regeneration. Results: 47 preoperative and 89 follow-up studies covered a period of 22.4 months (range: 1-84). After right liver lobe (RLL) donation, the mean liver remnant volume was 522.0 ml (± 144.0; 36.1%; n = 18), after left lateral section (LLS) donation 1,121.7 ml (± 212.8; 79.9%; n = 24), and after left liver lobe (LLL) donation 1,181.5 ml (± 279.5; 72.0%; n = 5). Twelve months after donation, the liver remnant volumes were 87.3% (RLL; ± 11.8; n = 11), 95.0% (LLS; ± 11.6; n = 18), and 80.1% (LLL; ± 2.0; n = 2) of the preoperative total liver volume. Rapid initial regeneration and maintenance at 80% of the preoperative liver volume were observed over the total follow-up period. Minor postoperative complications were found early in 4 patients. No severe or late complications or mortality occurred. Conclusion: Rapid regeneration of liver remnant volumes in all donors and volume maintenance over the long-term follow-up period of up to 84 months without severe or late complications are important observations for assessing the safety of LDLT donors. Key Points: Liver remnant volumes of LDLT donors rapidly regenerated after donation, and volumes were maintained over the long-term follow-up period of up to 84 months without severe or late complications.
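
As a quick check of how the percentages above relate to absolute volumes (illustrative arithmetic only, not data from the study), the implied mean preoperative total liver volume follows directly from remnant fraction = remnant volume / preoperative total volume:

```python
remnant_ml = 522.0          # reported mean remnant volume after right liver lobe donation
remnant_fraction = 0.361    # reported 36.1% of the preoperative total liver volume

preop_total_ml = remnant_ml / remnant_fraction
print(f"implied mean preoperative total liver volume: {preop_total_ml:.0f} ml")  # ~1446 ml
```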

Relevance: 100.00%

Abstract:

Microbial exposure following birth profoundly impacts mammalian immune system development. Microbiota alterations are associated with an increased incidence of allergic and autoimmune disorders, with elevated serum IgE as a hallmark. The previously reported abnormally high serum IgE levels in germ-free mice suggest that immunoregulatory signals from the microbiota are required to control basal IgE levels. We report that germ-free mice and those with low-diversity microbiota develop elevated serum IgE levels in early life. B cells in neonatal germ-free mice undergo isotype switching to IgE at mucosal sites in a CD4 T-cell- and IL-4-dependent manner. A critical level of microbial diversity following birth is required to inhibit IgE induction. Elevated IgE levels in germ-free mice lead to increased mast-cell-surface-bound IgE and exaggerated oral-induced systemic anaphylaxis. Thus, appropriate intestinal microbial stimuli during early life are critical for inducing an immunoregulatory network that protects from induction of IgE at mucosal sites.

Relevance: 100.00%

Abstract:

The importance of the cerebellum for non-motor functions is becoming more and more evident. The influence of cerebellar lesions acquired during childhood on cognitive functions, however, is not well known. We present follow-up data from 24 patients who were operated on during childhood for benign cerebellar tumours. Owing to the benign histology of these tumours, neither radiotherapy nor chemotherapy was required. Post-operatively, these children were of normal intelligence, with a mean IQ of 99.1, performance intelligence quotient (PIQ) of 101.3 and verbal intelligence quotient (VIQ) of 96.8. However, 57% of patients showed abnormalities in subtesting. In addition, more extensive neuropsychological testing revealed significant problems with attention, memory, processing speed and interference. Visuo-constructive problems were marked for copying the Rey figure, but less pronounced for recall of the figure. Verbal fluency was more affected than design fluency. Behavioural deficits were detected in 33% of patients. Attention deficit problems were marked in 12.5%, whereas others demonstrated psychiatric symptoms such as mutism, addiction problems, anorexia, uncontrolled temper tantrums and phobia. Age at tumour operation and size of tumour had no influence on outcome. Vermis involvement was related to an increase in neuropsychological and psychiatric problems. The observation that patients with left-sided cerebellar tumours were more affected than patients with right-sided tumours is probably also influenced by more pronounced vermian involvement in the former group. In summary, this study confirms the importance of the cerebellum for cognitive development and points to the necessity of careful follow-up for these children to provide them with the help necessary to achieve full integration into professional life.

Relevance: 100.00%

Abstract:

Objective To determine the comparative effectiveness and safety of current maintenance strategies in preventing exacerbations of asthma. Design Systematic review and network meta-analysis using Bayesian statistics. Data sources Cochrane systematic reviews on chronic asthma, complemented by an updated search when appropriate. Eligibility criteria Trials of adults with asthma randomised to maintenance treatments of at least 24 weeks' duration that reported on asthma exacerbations in full text. Low dose inhaled corticosteroid treatment was the comparator strategy. The primary effectiveness outcome was the rate of severe exacerbations. The secondary outcome was the composite of moderate or severe exacerbations. The rate of withdrawal was analysed as a safety outcome. Results 64 trials with 59 622 patient-years of follow-up comparing 15 strategies and placebo were included. For prevention of severe exacerbations, combined inhaled corticosteroids and long acting β agonists as maintenance and reliever treatment and combined inhaled corticosteroids and long acting β agonists in a fixed daily dose performed equally well and were ranked first for effectiveness. The rate ratios compared with low dose inhaled corticosteroids were 0.44 (95% credible interval 0.29 to 0.66) and 0.51 (0.35 to 0.77), respectively. Other combined strategies were not superior to inhaled corticosteroids, and all single drug treatments were inferior to single low dose inhaled corticosteroids. Safety was best for conventional best (guideline-based) practice and combined maintenance and reliever therapy. Conclusions Strategies with combined inhaled corticosteroids and long acting β agonists are the most effective and safe in preventing severe exacerbations of asthma, although some heterogeneity was observed in this network meta-analysis of full text reports.

Relevance: 100.00%

Abstract:

Understanding the causes and consequences of wildfires in forests of the western United States requires integrated information about fire, climate changes, and human activity on multiple temporal scales. We use sedimentary charcoal accumulation rates to reconstruct long-term variations in fire during the past 3,000 y in the American West and compare this record with independent fire-history data from historical records and fire scars. There has been a slight decline in burning over the past 3,000 y, with the lowest levels attained during the 20th century and during the Little Ice Age (LIA, ca. 1400-1700 CE). Prominent peaks in forest fires occurred during the Medieval Climate Anomaly (ca. 950-1250 CE) and during the 1800s. Analysis of climate reconstructions beginning in 500 CE and of population data shows that temperature and drought predict changes in biomass burning up to the late 1800s CE. Since the late 1800s, human activities and the ecological effects of recent high fire activity have caused a large, abrupt decline in burning similar to the LIA fire decline. Consequently, there is now a forest "fire deficit" in the western United States attributable to the combined effects of human activities, ecological changes, and climate changes. Large fires in the late 20th and early 21st centuries have begun to address the fire deficit, but it is continuing to grow.
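
To make the charcoal-based fire metric concrete: a charcoal accumulation rate (CHAR) is commonly obtained as charcoal concentration multiplied by the sediment accumulation rate. The sketch below shows that arithmetic with hypothetical values; it is not the paper's dataset or processing code.

```python
# Hypothetical values for one sediment sample
charcoal_concentration = 45.0     # charcoal particles per cm^3 of sediment
sediment_accum_rate = 0.08        # cm of sediment deposited per year

char = charcoal_concentration * sediment_accum_rate
print(f"CHAR = {char:.1f} particles cm^-2 yr^-1")
```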

Relevance: 100.00%

Abstract:

AIMS Due to a high burden of systemic cardiovascular events, current guidelines recommend the use of statins in all patients with peripheral artery disease (PAD). We sought to study the impact of statin use on limb prognosis in patients with symptomatic PAD enrolled in the international REACH registry. METHODS Statin use was assessed at study enrolment and as a time-varying covariate. Rates of the primary adverse limb outcome (worsening claudication/new episode of critical limb ischaemia, new percutaneous/surgical revascularization, or amputation) at 4 years and of the composite of cardiovascular death/myocardial infarction/stroke were compared between statin users and non-users. RESULTS A total of 5861 patients with symptomatic PAD were included. Statin use at baseline was 62.2%. Patients who were on statins had a significantly lower risk of the primary adverse limb outcome at 4 years when compared with those who were not taking statins [22.0 vs. 26.2%; hazard ratio (HR), 0.82; 95% confidence interval (CI), 0.72-0.92; P = 0.0013]. Results were similar when statin use was considered as a time-dependent variable (P = 0.018) and on propensity analysis (P < 0.0001). The composite of cardiovascular death/myocardial infarction/stroke was similarly reduced (HR, 0.83; 95% CI, 0.73-0.96; P = 0.01). CONCLUSION Among patients with PAD in the REACH registry, statin use was associated with an ∼18% lower rate of adverse limb outcomes, including worsening symptoms, peripheral revascularization, and ischaemic amputations. These findings suggest that statin therapy not only reduces the risk of adverse cardiovascular events but also favourably affects limb prognosis in patients with PAD.

Relevance: 100.00%

Abstract:

AIM To investigate risk factors for the loss of multi-rooted teeth (MRT) in subjects treated for periodontitis and enrolled in supportive periodontal therapy (SPT). MATERIAL AND METHODS A total of 172 subjects were examined before (T0) and after active periodontal therapy (APT) (T1) and following a mean of 11.5 ± 5.2 (SD) years of SPT (T2). The association of risk factors with loss of MRT was analysed with multilevel logistic regression. The tooth was the unit of analysis. RESULTS Furcation involvement (FI) = 1 before APT was not a risk factor for tooth loss compared with FI = 0 (p = 0.37). Between T0 and T2, MRT with FI = 2 (OR: 2.92, 95% CI: 1.68, 5.06, p = 0.0001) and FI = 3 (OR: 6.85, 95% CI: 3.40, 13.83, p < 0.0001) were at significantly higher risk of being lost compared with those with FI = 0. During SPT, smokers lost significantly more MRT than non-smokers (OR: 2.37, 95% CI: 1.05, 5.35, p = 0.04). Non-smoking and compliant subjects with FI = 0/1 at T1 lost significantly fewer MRT during SPT compared with non-compliant smokers with FI = 2 (OR: 10.11, 95% CI: 2.91, 35.11, p < 0.0001) and FI = 3 (OR: 17.18, 95% CI: 4.98, 59.28, p < 0.0001), respectively. CONCLUSIONS FI = 1 was not a risk factor for tooth loss compared with FI = 0. FI = 2/3, smoking and lack of compliance with regular SPT represented risk factors for the loss of MRT in subjects treated for periodontitis.
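
The study's risk estimates come from multilevel logistic regression with the tooth as the unit of analysis, which the snippet below does not reproduce; it only illustrates, with hypothetical counts, how an unadjusted odds ratio and a Woolf-type 95% confidence interval are computed from a 2x2 table of tooth loss by furcation involvement (FI = 2 vs. FI = 0).

```python
import math

# Hypothetical tooth counts (lost / retained) by baseline furcation involvement
lost_fi2, kept_fi2 = 40, 160      # FI = 2 teeth
lost_fi0, kept_fi0 = 50, 550      # FI = 0 teeth

odds_ratio = (lost_fi2 / kept_fi2) / (lost_fi0 / kept_fi0)
se_log_or = math.sqrt(1 / lost_fi2 + 1 / kept_fi2 + 1 / lost_fi0 + 1 / kept_fi0)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"unadjusted OR {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```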

Relevance: 100.00%

Abstract:

Rainfall controls fire in tropical savanna ecosystems by affecting both the amount and the flammability of plant biomass, and consequently, predicted changes in tropical precipitation over the next century are likely to have contrasting effects on the fire regimes of wet and dry savannas. We reconstructed the long-term dynamics of biomass burning in equatorial East Africa, using fossil charcoal particles from two well-dated lake-sediment records in western Uganda and central Kenya. We compared these high-resolution (5 years/sample) time series of biomass burning, spanning the last 3800 and 1200 years, with independent data on past hydroclimatic variability and vegetation dynamics. In western Uganda, a rapid (<100 years) and permanent increase in burning occurred around 2170 years ago, when climatic drying replaced semideciduous forest with wooded grassland. At the century time scale, biomass burning was inversely related to moisture balance for much of the next two millennia until ca. 1750 AD, when burning increased strongly despite the regional climate becoming wetter. A sustained decrease in burning since the mid-20th century reflects the intensified modern-day conversion of the landscape into cropland and plantations. In contrast, in semiarid central Kenya, biomass burning peaked at intermediate moisture-balance levels, whereas it was lower during both the wettest and the driest multidecadal periods of the last 1200 years. Here, burning has steadily increased since the mid-20th century, presumably due to more frequent deliberate ignitions for bush clearing and cattle ranching. Both the observed historical trends and the regional contrasts in biomass burning are consistent with the spatial variability in fire regimes across the African savanna biome today. They demonstrate the strong dependence of East African fire regimes on both climatic moisture balance and vegetation, and the extent to which this dependence is now being overridden by anthropogenic activity.