962 results for Long-term data
Abstract:
Patient self-management (PSM) of oral anticoagulation is under discussion because evidence from real-life settings is missing. Using data from a nationwide, prospective cohort study in Switzerland, we assessed the overall long-term efficacy and safety of PSM and examined subgroups. Data from 1140 patients (5818.9 patient-years) were analysed, and no patients were lost to follow-up. Median follow-up was 4.3 years (range 0.2-12.8 years). Median age at the time of training was 54.2 years (range 18.2-85.2) and 34.6% were women. All-cause mortality was 1.4 per 100 patient-years (95% CI 1.1-1.7), with higher rates in patients with atrial fibrillation (2.5; 1.6-3.7; p<0.001), patients >50 years of age (2.0; 1.6-2.6; p<0.001), and men (1.6; 1.2-2.1; p = 0.036). The rate of thromboembolic events was 0.4 (0.2-0.6) and independent of indication, sex, and age. Major bleeding was observed at a rate of 1.1 (0.9-1.5) per 100 patient-years. Efficacy was comparable to standard care and new oral anticoagulants in a network meta-analysis. PSM of properly trained patients is effective and safe in a long-term real-life setting and robust across clinical subgroups. Adoption in various clinical settings, including rural areas and settings with limited access to medical care, is warranted.
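As a worked illustration of the incidence-rate arithmetic used above (events divided by accumulated patient-years, scaled to 100), the Python sketch below computes a rate with an exact (Garwood) Poisson confidence interval. The event count of 81 is hypothetical, back-calculated from the reported 1.4 per 100 patient-years; it is not a figure taken from the study.

```python
from scipy.stats import chi2

def poisson_rate_ci(events: int, person_years: float, per: float = 100, alpha: float = 0.05):
    """Incidence rate per `per` person-years with an exact (Garwood) Poisson CI."""
    lower = chi2.ppf(alpha / 2, 2 * events) / 2 if events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2
    scale = per / person_years
    return events * scale, lower * scale, upper * scale

# Hypothetical count: 1.4 per 100 patient-years over 5818.9 patient-years ~ 81 deaths.
rate, lo, hi = poisson_rate_ci(events=81, person_years=5818.9)
print(f"{rate:.1f} per 100 patient-years (95% CI {lo:.1f}-{hi:.1f})")
```

With these inputs the sketch reproduces the reported interval of 1.4 (1.1-1.7) per 100 patient-years.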
Abstract:
The importance of the cerebellum for non-motor functions is becoming increasingly evident. The influence of cerebellar lesions acquired during childhood on cognitive functions, however, is not well known. We present follow-up data from 24 patients who were operated on during childhood for benign cerebellar tumours. Owing to their benign histology, these tumours required neither radiotherapy nor chemotherapy. Post-operatively, these children were of normal intelligence, with a mean IQ of 99.1, performance intelligence quotient (PIQ) of 101.3, and verbal intelligence quotient (VIQ) of 96.8. However, 57% of patients showed abnormalities in subtesting. In addition, more extensive neuropsychological testing revealed significant problems with attention, memory, processing speed, and interference. Visuo-constructive problems were marked for copying the Rey figure but less pronounced for recall of the figure. Verbal fluency was more affected than design fluency. Behavioural deficits were detected in 33% of patients. Attention deficit problems were marked in 12.5%, whereas other patients demonstrated psychiatric symptoms such as mutism, addiction problems, anorexia, uncontrolled temper tantrums, and phobia. Age at tumour operation and size of tumour had no influence on outcome. Vermis involvement was related to an increase in neuropsychological and psychiatric problems. The observation that patients with left-sided cerebellar tumours were more affected than patients with right-sided tumours is probably also influenced by more pronounced vermian involvement in the former group. In summary, this study confirms the importance of the cerebellum for cognitive development and points to the necessity of careful follow-up for these children, to provide them with the help they need to achieve full integration into professional life.
Abstract:
Objective To determine the comparative effectiveness and safety of current maintenance strategies in preventing exacerbations of asthma. Design Systematic review and network meta-analysis using Bayesian statistics. Data sources Cochrane systematic reviews on chronic asthma, complemented by an updated search when appropriate. Eligibility criteria Trials of adults with asthma randomised to maintenance treatments of at least 24 weeks' duration that reported asthma exacerbations in full text. Low dose inhaled corticosteroid treatment was the comparator strategy. The primary effectiveness outcome was the rate of severe exacerbations. The secondary outcome was the composite of moderate or severe exacerbations. The rate of withdrawal was analysed as a safety outcome. Results 64 trials with 59 622 patient-years of follow-up comparing 15 strategies and placebo were included. For prevention of severe exacerbations, combined inhaled corticosteroids and long acting β agonists as maintenance and reliever treatment and combined inhaled corticosteroids and long acting β agonists in a fixed daily dose performed equally well and were ranked first for effectiveness. The rate ratios compared with low dose inhaled corticosteroids were 0.44 (95% credible interval 0.29 to 0.66) and 0.51 (0.35 to 0.77), respectively. Other combined strategies were not superior to inhaled corticosteroids, and all single drug treatments were inferior to single low dose inhaled corticosteroids. Safety was best for conventional best (guideline-based) practice and combined maintenance and reliever therapy. Conclusions Strategies with combined inhaled corticosteroids and long acting β agonists are most effective and safe in preventing severe exacerbations of asthma, although some heterogeneity was observed in this network meta-analysis of full-text reports.
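Underlying any network meta-analysis is the combination of direct and indirect evidence through a common comparator. The Python sketch below shows the simplest frequentist version of the indirect step, a Bucher adjusted indirect comparison; the standard errors of 0.2 are invented placeholders, and the review itself used a Bayesian hierarchical model rather than this shortcut.

```python
import math

def bucher_indirect(log_rr_ac: float, se_ac: float, log_rr_bc: float, se_bc: float):
    """Indirect A-vs-B contrast from two contrasts sharing comparator C."""
    log_rr_ab = log_rr_ac - log_rr_bc
    se_ab = math.sqrt(se_ac**2 + se_bc**2)
    lo = math.exp(log_rr_ab - 1.96 * se_ab)
    hi = math.exp(log_rr_ab + 1.96 * se_ab)
    return math.exp(log_rr_ab), lo, hi

# Point estimates from the review (0.44 and 0.51 vs low dose inhaled
# corticosteroids); the standard errors are assumed for illustration only.
rr, lo, hi = bucher_indirect(math.log(0.44), 0.2, math.log(0.51), 0.2)
print(f"maintenance-and-reliever vs fixed dose: RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The wide interval of the indirect contrast illustrates why the two top-ranked strategies could not be separated.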
Abstract:
Understanding the causes and consequences of wildfires in forests of the western United States requires integrated information about fire, climate changes, and human activity on multiple temporal scales. We use sedimentary charcoal accumulation rates to reconstruct long-term variations in fire during the past 3,000 y in the American West and compare this record to independent fire-history data from historical records and fire scars. There has been a slight decline in burning over the past 3,000 y, with the lowest levels attained during the 20th century and during the Little Ice Age (LIA, ca. 1400-1700 CE). Prominent peaks in forest fires occurred during the Medieval Climate Anomaly (ca. 950-1250 CE) and during the 1800s. Analysis of climate reconstructions beginning in 500 CE, together with population data, shows that temperature and drought predict changes in biomass burning up to the late 1800s CE. Since the late 1800s, human activities and the ecological effects of recent high fire activity have caused a large, abrupt decline in burning similar to the LIA fire decline. Consequently, there is now a forest "fire deficit" in the western United States attributable to the combined effects of human activities and ecological and climate changes. Large fires in the late 20th and early 21st centuries have begun to address the fire deficit, but it continues to grow.
Abstract:
BACKGROUND Copper and its main transport protein, ceruloplasmin, have been suggested to promote the development of atherosclerosis. Most of the data come from experimental and animal-model studies. Copper and mortality have not been simultaneously evaluated in patients undergoing coronary angiography. METHODS AND RESULTS We examined whether serum copper and ceruloplasmin concentrations are associated with angiographic coronary artery disease (CAD) and with mortality from all causes and from cardiovascular causes in 3253 participants of the Ludwigshafen Risk and Cardiovascular Health Study. Age- and sex-adjusted hazard ratios (HR) for death from any cause were 2.23 (95% CI, 1.85-2.68) for copper and 2.63 (95% CI, 2.17-3.20) for ceruloplasmin when we compared the highest with the lowest quartiles. Corresponding HRs for death from cardiovascular causes were 2.58 (95% CI, 2.05-3.25) and 3.02 (95% CI, 2.36-3.86), respectively. Further adjustment for various risk factors and clinical variables considerably attenuated these associations, which nevertheless remained statistically significant, and the results were consistent across subgroups. CONCLUSIONS Elevated concentrations of both copper and ceruloplasmin are independently associated with increased risk of mortality from all causes and from cardiovascular causes.
Abstract:
Rainfall controls fire in tropical savanna ecosystems by affecting both the amount and the flammability of plant biomass; consequently, predicted changes in tropical precipitation over the next century are likely to have contrasting effects on the fire regimes of wet and dry savannas. We reconstructed the long-term dynamics of biomass burning in equatorial East Africa, using fossil charcoal particles from two well-dated lake-sediment records in western Uganda and central Kenya. We compared these high-resolution (5 years/sample) time series of biomass burning, spanning the last 3800 and 1200 years, with independent data on past hydroclimatic variability and vegetation dynamics. In western Uganda, a rapid (<100 years) and permanent increase in burning occurred around 2170 years ago, when climatic drying replaced semideciduous forest with wooded grassland. At the century time scale, biomass burning was inversely related to moisture balance for much of the next two millennia until ca. 1750 AD, when burning increased strongly despite the regional climate becoming wetter. A sustained decrease in burning since the mid-20th century reflects the intensified modern-day conversion of the landscape into cropland and plantations. In contrast, in semiarid central Kenya, biomass burning peaked at intermediate moisture-balance levels, whereas it was lower during both the wettest and driest multidecadal periods of the last 1200 years. Here, burning has increased steadily since the mid-20th century, presumably due to more frequent deliberate ignitions for bush clearing and cattle ranching. Both the observed historical trends and the regional contrasts in biomass burning are consistent with the spatial variability in fire regimes across the African savanna biome today. They demonstrate the strong dependence of East African fire regimes on both climatic moisture balance and vegetation, and the extent to which this dependence is now being overridden by anthropogenic activity.
Abstract:
BACKGROUND
Renal impairment (RI) is associated with a worse prognosis in patients with coronary artery disease. Clinical and angiographic outcomes of patients undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES) in this population are not well established.
METHODS
We pooled individual data for 5,011 patients from 3 trials with exclusive and unrestricted use of DES (SIRTAX, N = 1,012; LEADERS, N = 1,707; RESOLUTE AC, N = 2,292). Angiographic follow-up was available for 1,544 lesions. Outcomes through 2 years were stratified according to glomerular filtration rate (normal renal function: GFR ≥ 90 ml/min; mild RI: 90 > GFR ≥ 60 ml/min; moderate or severe RI: GFR < 60 ml/min).
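A minimal sketch of the GFR-based stratification in Python; only the ≥ 90 ml/min boundary for normal renal function is explicit in the text above, so the lower cutoffs are assumed here from standard KDOQI-style staging.

```python
def renal_category(gfr_ml_min: float) -> str:
    """Bin glomerular filtration rate into renal-impairment strata.
    Cutoffs below 90 ml/min are assumed from standard KDOQI staging."""
    if gfr_ml_min >= 90:
        return "normal renal function"
    if gfr_ml_min >= 60:
        return "mild RI"
    return "moderate or severe RI"

print(renal_category(72))  # -> mild RI
```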
Abstract:
The purpose of this long-term follow-up study was twofold: first, to assess the prevalence of relapse after treatment of deep bite malocclusion, and second, to identify risk factors that predispose patients with deep bite malocclusion to relapse. Sixty-one former patients with an overbite of more than 50% incisor overlap before treatment were successfully recalled. Clinical data, morphometric measurements on plaster casts before treatment, after treatment, and at long-term follow-up, as well as cephalometric measurements before and after treatment, were collected. The median follow-up period was 11.9 years. Patients were treated with various treatment modalities, and the majority received at least a lower fixed retainer and an upper removable bite plate during retention. Relapse was defined as an increase in incisor overlap from below 50% after treatment to 50% or more at long-term follow-up. Ten per cent of the patients relapsed to 50% or more incisor overlap, and the amount of overbite increase in these patients was low. Among all cases with a deep bite at follow-up, gingival contact and palatal impingement were more prevalent in partially corrected non-compliant cases than in relapse cases. In this sample, the prevalence and amount of relapse were too low to identify risk factors for relapse.
Abstract:
INTRODUCTION Anemia and renal impairment are important co-morbidities among patients with coronary artery disease undergoing percutaneous coronary intervention (PCI). Disease progression to eventual death can be understood as the combined effect of baseline characteristics and intermediate outcomes. METHODS Using data from a prospective cohort study, we investigated clinical pathways reflecting the transitions from PCI through intermediate ischemic or hemorrhagic events to all-cause mortality in a multi-state analysis, as a function of anemia (hemoglobin concentration <120 g/l and <130 g/l for women and men, respectively) and renal impairment (creatinine clearance <60 ml/min) at baseline. RESULTS Among 6029 patients undergoing PCI, anemia alone, renal impairment alone, and their combination were observed in 990 (16.4%), 384 (6.4%), and 309 (5.1%) patients, respectively. The most frequent transition was from PCI to death (6.7%, 95% CI 6.1-7.3), followed by ischemic events (4.8%, 95% CI 4.3-5.4) and bleeding (3.4%, 95% CI 3.0-3.9). Among patients with both anemia and renal impairment, the risk of death was increased almost 4-fold compared with the reference group (HR 3.9, 95% CI 2.9-5.4) and roughly doubled compared with patients with either anemia (HR 1.7, 95% CI 1.3-2.2) or renal impairment (HR 2.1, 95% CI 1.5-2.9) alone. Hazard ratios indicated an increased risk of bleeding in all three groups compared with patients with neither anemia nor renal impairment. CONCLUSIONS Applying a multi-state model, we found evidence for a gradient of risk for the composite of bleeding, ischemic events, or death as a function of hemoglobin value and estimated glomerular filtration rate at baseline.
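A multi-state analysis estimates one hazard per transition in the state structure (here PCI to bleeding, to ischemic events, and to death). As a hedged sketch of a single transition, the Python snippet below simulates baseline anemia and renal impairment and fits a Cox model for the PCI-to-death hazard with lifelines; all data and effect sizes are simulated, not taken from the cohort.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 6000
anemia = rng.binomial(1, 0.2, n)
renal = rng.binomial(1, 0.1, n)

# Exponential times for one transition (PCI -> death), with the hazard
# increased by each baseline condition; administrative censoring at 60 months.
hazard = 0.005 * np.exp(0.5 * anemia + 0.7 * renal)
t = rng.exponential(1 / hazard)
df = pd.DataFrame({
    "time": np.minimum(t, 60),
    "death": (t <= 60).astype(int),
    "anemia": anemia,
    "renal": renal,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="death")
print(cph.hazard_ratios_)  # should be near exp(0.5) ~ 1.6 and exp(0.7) ~ 2.0
```

Repeating the fit with the event column redefined for bleeding or ischemic events, each on its own risk set, yields the transition-specific hazard ratios that a full multi-state model reports.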
Abstract:
BACKGROUND Contour augmentation around early-placed implants (Type 2 placement) using autogenous bone chips combined with deproteinized bovine bone mineral (DBBM) and a collagen barrier membrane has been documented to predictably provide esthetically satisfactory clinical outcomes. In addition, recent data from cone beam computed tomography studies have shown the augmented volume to be stable in the long term. However, no human histologic data are available to document the tissue reactions to this bone augmentation procedure. METHODS Over an 8-year period, 12 biopsies were harvested 14 to 80 months after implant placement with simultaneous contour augmentation in 10 patients. The biopsies were subjected to histologic and histomorphometric analysis. RESULTS The biopsies consisted of 32.0% ± 9.6% DBBM particles and 40.6% ± 14.6% mature bone. Bone covered 70.3% ± 14.5% of the DBBM particle surfaces. On the remaining surfaces, multinucleated giant cells with varying intensity of tartrate-resistant acid phosphatase staining were regularly present. No signs of inflammation were visible, and no tendency toward a decreasing volume fraction of DBBM over time was observed. CONCLUSIONS The present study confirms previous findings that osseointegrated DBBM particles do not tend to undergo substitution over time. This low substitution rate may be the reason for the clinically and radiographically documented long-term stability of contour augmentation using a combination of autogenous bone chips, DBBM particles, and a collagen membrane.
Abstract:
The role of soil organic carbon (SOC) in mitigating climate change, indicating soil quality, and supporting ecosystem function has created research interest in understanding the nature of SOC at the landscape level. The objective of this study was to examine the variation and distribution of SOC under long-term land management at the watershed and plot levels. The study was based on a meta-analysis of three case studies and 128 surface soil samples from Ethiopia. Three sites (Gununo, Anjeni, and Maybar) were compared after considering two land management categories (LMC) and three land-use types (LUT) in a quasi-experimental design. Shapiro-Wilk tests showed a non-normal distribution of the data (p = 0.002, α = 0.05). Median SOC values showed the effect of long-term land management, with values of 2.29 and 2.38 g kg-1 for the less and better-managed watersheds, respectively. SOC values were 1.7, 2.8, and 2.6 g kg-1 for crop (CLU), grass (GLU), and forest land use (FLU), respectively. The rank order of SOC variability was FLU > GLU > CLU. Mann-Whitney U and Kruskal-Wallis tests showed a significant difference in the medians and distribution of SOC among the LUT and between soil profiles (p < 0.05, 95% confidence interval, α = 0.05), whereas the difference was not significant (p > 0.05) for the LMC. The mean and sum of ranks from the Mann-Whitney U and Kruskal-Wallis tests also showed these differences at the watershed and plot levels. Using SOC as a predictor, cross-validated discriminant analysis correctly classified 46% and 49% of cases for LUT and LMC, respectively. The study shows how SOC can be used to categorize landscapes with respect to land management, providing information for decision-makers.
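The non-parametric battery above (Shapiro-Wilk for normality, then rank-based comparisons) is straightforward to reproduce. Below is a minimal Python sketch on simulated SOC values; the numbers are hypothetical stand-ins for the 128 field samples, with medians loosely matching those reported.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated SOC (g kg-1) for the three land-use types (not the field data).
clu = rng.lognormal(np.log(1.7), 0.3, 40)  # cropland
glu = rng.lognormal(np.log(2.8), 0.4, 40)  # grassland
flu = rng.lognormal(np.log(2.6), 0.5, 48)  # forest

print("Shapiro-Wilk p:", stats.shapiro(np.concatenate([clu, glu, flu])).pvalue)
print("Kruskal-Wallis p (CLU/GLU/FLU):", stats.kruskal(clu, glu, flu).pvalue)
print("Mann-Whitney U p (CLU vs GLU):", stats.mannwhitneyu(clu, glu).pvalue)
```

A significant Shapiro-Wilk result (non-normal data) is what motivates the rank-based tests instead of ANOVA, mirroring the study's workflow.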
Abstract:
OBJECTIVES Despite new treatment modalities, cyclophosphamide (CYC) remains a cornerstone in the treatment of organ- or life-threatening vasculitides and connective tissue disorders. We aimed to analyse the short- and long-term side effects of CYC treatment in patients with systemic autoimmune diseases. METHODS Chart review and phone interviews regarding side effects of CYC in patients with systemic autoimmune diseases treated between 1984 and 2011 in a single university centre. Adverse events were stratified according to the "Common Terminology Criteria for Adverse Events", version 4. RESULTS A total of 168 patients were included. Cumulative CYC dose was 7.45 g (range 0.5-205 g). Gastro-intestinal side effects were seen in 68 events, and hair loss occurred in 38. A total of 58 infections were diagnosed in 44/168 patients (26.2%), with 9/44 suffering multiple infections. Severity grading of the infections was low in 37/58 cases (63.8%). One CYC-related infection-induced death (0.6%) was registered. Amenorrhoea occurred in 7/92 females (7.6%) and remained irreversible in 5/7. The females with reversible amenorrhoea had received prophylaxis with nafarelin. Malignancy was registered in 19 patients after a median of 4.7 years (range 0.25-22.25), presenting as 4 premalignancies and 18 malignancies; 3 patients suffered 2 premalignancies/malignancies each. Patients with malignancies were older and had a higher cumulative CYC dose. Death was registered in 28 patients (16.6%), with 2/28 probably related to CYC. CONCLUSIONS Considering the organ- or life-threatening conditions that indicate the use of CYC, severe drug-induced health problems were rare. Our data confirm the necessity of long-term follow-up for timely diagnosis of malignancies. CYC side effects do not per se justify prescription of newer drugs or biologic agents in the treatment of autoimmune diseases.
Abstract:
¹H-MRS is regularly applied to determine lipid content in ectopic tissue - mostly skeletal muscle and liver - to investigate physiological and/or pathologic conditions, e.g. insulin resistance. Technical developments also allow non-invasive in vivo assessment of cardiac lipids; however, basic data about methodological reliability (repeatability) and physiological variation are scarce. The aim of the present work was to determine potential diurnal changes of cardiac lipid stores in humans, and to put the results in relation to methodological repeatability and normal physiological day-to-day variation. Optimized cardiac- and respiratory-gated ¹H-MRS was used for non-invasive quantification of intracardiomyocellular lipids (ICCL), creatine, trimethyl-ammonium compounds (TMA), and taurine in nine healthy young men at three time points per day on two days separated by one week. This design allowed determination of (a) diurnal changes, (b) physiological variation over one week, and (c) methodological repeatability of the ICCL levels. Comparison of fasted morning with post-absorptive evening measurements revealed a significant 37 ± 19% decrease of ICCL during the day (p = 0.0001). There was a significant linear correlation between ICCL levels in the morning and their decrease during the day (p = 0.015). Methodological repeatability for the ICCL/creatine ratio was excellent, with a coefficient of variation of ~5%, whereas physiological variation was considerably higher (22%) despite a standardized physiological preparation protocol. In contrast, TMA levels remained stable over this time period. The proposed ¹H-MRS technique provides a robust way to investigate relevant physiological changes in cardiac metabolites, in particular ICCL. The present results suggest that ICCL levels follow a diurnal course, with higher levels in the morning than in the evening. In addition, a considerable long-term variation of ICCL levels, in both the morning and the evening, was documented. Given the high methodological repeatability, these effects should be taken into account in studies investigating the metabolic role of ICCL.
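The contrast between methodological repeatability and physiological variation comes down to comparing coefficients of variation computed over different replicate sets. A minimal Python sketch with made-up ICCL/creatine ratios (not study data):

```python
import numpy as np

def cv_percent(values) -> float:
    """Coefficient of variation: sample SD as a percentage of the mean."""
    v = np.asarray(values, dtype=float)
    return 100 * v.std(ddof=1) / v.mean()

# Hypothetical ICCL/creatine ratios from repeated scans of one subject
# (repeatability) and from weekly morning visits (physiological variation).
repeat_scans = [0.60, 0.57, 0.64, 0.59]
weekly_mornings = [0.62, 0.45, 0.71, 0.52]
print(f"methodological CV: {cv_percent(repeat_scans):.1f}%")
print(f"physiological CV:  {cv_percent(weekly_mornings):.1f}%")
```

A physiological CV several times larger than the methodological CV, as reported in the study, indicates that observed day-to-day differences reflect biology rather than measurement noise.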
Abstract:
Objectives: To examine the predictive value of early improvement for short- and long-term outcome in the treatment of depressed female inpatients and to explore the influence of comorbid disorders (CD). Methods: Archival data from a naturalistic sample of 277 female inpatients diagnosed with a depressive disorder were analyzed, assessing the Beck Depression Inventory (BDI) at baseline, after 20 days and 30 days, at posttreatment, and at follow-up after 3 to 6 months. Early improvement, defined as a decrease in the BDI score of at least 30% after 20 days and after 30 days, and CD were analyzed using binary logistic regression. Results: Both early improvement definitions were predictive of remission at posttreatment. Early improvement after 30 days showed a sustained treatment effect in the follow-up phase, whereas early improvement after 20 days failed to show a persistent effect on remission at follow-up. CD were not significantly related to outcome either at posttreatment or at follow-up. At no time point did CD moderate the prediction by early improvement. Conclusions: We show that early improvement is a valid predictor of remission both in the short term and at follow-up in an inpatient setting. CD did not predict outcome. Further studies are needed to identify patient subgroups amenable to more tailored treatments.
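A small sketch of the binary logistic regression described in the methods, predicting remission from early-improvement status and comorbidity; the data, effect sizes, and statsmodels setup are all simulated for illustration and not taken from the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 277
early = rng.binomial(1, 0.5, n)  # >=30% BDI decrease by day 20 (simulated)
cd = rng.binomial(1, 0.4, n)     # comorbid disorder present (simulated)
# Simulated outcome: early improvement raises the odds of remission, CD does not.
p = 1 / (1 + np.exp(-(-0.8 + 1.2 * early + 0.0 * cd)))
remission = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([early, cd]))
fit = sm.Logit(remission, X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios: intercept, early improvement, CD
```

An odds ratio near 1 for CD alongside a clearly elevated odds ratio for early improvement would reproduce the pattern the study reports.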
Abstract:
BACKGROUND & AIMS Wilson disease is an autosomal recessive disorder that affects copper metabolism, leading to copper accumulation in the liver, central nervous system, and kidneys. There are few data on long-term outcomes and survival from large cohorts; we studied these features in a well-characterized Austrian cohort of patients with Wilson disease. METHODS We analyzed data from 229 patients diagnosed with Wilson disease from 1961 through 2013; 175 regularly attended a Wilson disease outpatient clinic and/or their physicians were contacted for information on disease and treatment status and outcomes. For 53 patients lost during the follow-up period, deaths and the reasons for death were identified from the Austrian death registry. RESULTS The mean observation period was 14.8 ± 11.4 years (range, 0.5-52.0 years), resulting in 3116 patient-years. Of the patients, 61% presented with hepatic disease, 27% with neurologic symptoms, and 10% were diagnosed by family screening at presymptomatic stages. Patients with a hepatic presentation were diagnosed younger (21.2 ± 12.0 years) than patients with neurologic disease (28.8 ± 12.0 years; P < .001). In 2% of patients, neither the symptoms nor the onset of symptoms could be determined with certainty. Most patients stabilized (35%) or improved on chelation therapy (26% fully recovered, 24% improved), but 15% deteriorated; 8% required a liver transplant, and 7.4% died within the observation period (71% of deaths were related to Wilson disease). A lower proportion of patients with Wilson disease survived for 20 years (92%) than healthy Austrians (97%), adjusted for age and sex (P = .03). Cirrhosis at diagnosis was the best predictor of death (odds ratio, 6.8; 95% confidence interval, 1.5-31.03; P = .013) and of the need for a liver transplant (odds ratio, 0.07; 95% confidence interval, 0.016-0.307; P < .001). Only 84% of patients with cirrhosis survived 20 years after diagnosis (compared with healthy Austrians, P = .008). CONCLUSION Overall, patients who receive adequate care for Wilson disease have a good long-term prognosis. However, cirrhosis increases the risk of death and the need for a liver transplant. Early diagnosis, at a precirrhotic stage, might increase survival times and reduce the need for a liver transplant.