979 results for long-term change
Abstract:
Several long-term studies of breast cancer survival have shown continued excess mortality from breast cancer up to 20-40 years after treatment. The purpose of this report was to investigate temporal trends in long-term survival from breast cancer in all New South Wales (NSW) women. Breast cancer cases incident in 1972-1996 (54,228) were derived from the NSW Central Cancer Registry, a population-based registry that began in 1972. All cases of breast cancer not known to be dead were matched against death records. The expected survival for NSW women was derived from published annual life tables. Relative survival analysis compared the survival of the cancer cases with the age-, sex- and period-matched mortality of the total population. Cases were considered alive at the end of 1996, except when known to be dead. Proportional hazards regression was used to model survival on age, period and degree of spread at diagnosis. Survival at 5, 10, 15, 20 and 25 years of follow-up was 76, 65, 60, 57 and 56 per cent, respectively. The annual hazard rate for excess mortality was 4.3 per cent in year 1, maximal at 6.5 per cent in year 3, declining to 4.7 per cent in year 5, 2.7 per cent in year 10, 1.4 per cent in year 15, 1.0 per cent for years 16-20, and 0.4 per cent for years 20-25 of follow-up. Relative survival was highest in 40-49 year-olds. Cases diagnosed most recently (1992-1996) had the highest survival compared with cases diagnosed in earlier periods. Five-year survival improved over time, especially from the late 1980s for women in the screening age group (50-69 years). Survival was highest for those with localised cancer at diagnosis: 88.4, 79.1, 74.6, 72.7 and 72.8 per cent at 5, 10, 15, 20 and 25 years of follow-up (excluding those aged 70 years or over). There was no significant difference between the survival of the breast cancer cases and that of the general population at 20-25 years of follow-up. Degree of spread was less predictive of survival 5-20 years after diagnosis than 0-5 years after diagnosis, and was not significant at 20-25 years of follow-up. Relative survival from breast cancer in NSW women continues to decrease up to 25 years after diagnosis, but there is little excess mortality after 15 years of follow-up, especially for those with localised cancer at diagnosis, and the minimal excess mortality at 20-25 years of follow-up is not statistically significant.
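For orientation, the following is a minimal sketch of how relative-survival figures and interval-specific excess-mortality hazards of the kind quoted above are typically derived (a standard Ederer-style ratio plus the additive-hazards identity; the example reuses only the 5- and 10-year figures from this abstract):

    import math

    def relative_survival(observed_surv, expected_surv):
        # Observed survival of the cases divided by the survival expected
        # from age-, sex- and period-matched population life tables.
        return observed_surv / expected_surv

    def mean_annual_excess_hazard(rel_surv_start, rel_surv_end, years):
        # Average annual excess-mortality hazard implied by the decline in
        # relative survival over an interval (additive-hazards model).
        return -math.log(rel_surv_end / rel_surv_start) / years

    # 76 per cent at 5 years, 65 per cent at 10 years (from the abstract):
    print(mean_annual_excess_hazard(0.76, 0.65, 5.0))  # ~0.031, i.e. ~3.1% per year

That average of roughly 3.1 per cent per year sits between the reported year-5 (4.7 per cent) and year-10 (2.7 per cent) hazards, as expected for an interval mean.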
Abstract:
1. Ice-volume-forced glacial-interglacial cyclicity is the major cause of global climate variation within the late Quaternary period. Within the Australian region, this variation is expressed predominantly as oscillations in moisture availability. Glacial periods were substantially drier than today, with restricted distribution of mesic plant communities, shallow or ephemeral water bodies and extensive aeolian dune activity. 2. Superimposed on this cyclicity in Australia is a trend towards drier and/or more variable climates within the last 350 000 years. This trend may have been initiated by changes in atmospheric and ocean circulation resulting from Australia's continued movement into the Southeast Asian region, involving the onset or intensification of the El Niño-Southern Oscillation system and a reduction in summer monsoon activity. 3. Increased biomass burning, stemming originally from increased climatic variability and later enhanced by the activities of indigenous people, resulted in a more open and sclerophyllous vegetation, increased salinity and a further reduction in water availability. 4. Past records combined with recent observations suggest that the degree of environmental variability will increase and the drying trend will be enhanced in the foreseeable future, regardless of the extent or nature of human intervention.
Abstract:
We model and calibrate the arguments for and against short-term and long-term debt. These arguments broadly comprise the maturity premium, sustainability, and service smoothing. We use a dynamic-equilibrium model with tax distortions and uncertainty in government outlays, and model maturity as the fraction of debt that must be rolled over every period. In the model, the benefits of defaulting are tempered by higher future interest rates. We then calibrate our artificial economy and solve for the optimal debt maturity for Brazil, as an example of a developing country, and for the US, as an example of a mature economy. We find that the calibrated costs of defaulting on long-term debt more than offset the costs associated with short-term debt. Therefore, short-term debt implies higher welfare levels.
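As a note on the modeling device: if a constant fraction \(\delta\) of the outstanding stock must be rolled over each period, time to maturity is geometrically distributed, and the implied average maturity follows from a standard identity (not specific to this paper's calibration):

\[ \mathbb{E}[\text{maturity}] \;=\; \sum_{t=1}^{\infty} t\,\delta\,(1-\delta)^{t-1} \;=\; \frac{1}{\delta}. \]

For example, rolling over a quarter of the debt each period (\(\delta = 0.25\)) corresponds to an average maturity of four periods.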
Abstract:
Background: There are few population-based data on the long-term management of patients after coronary artery bypass grafting (CABG), despite the high risk of future major vascular events in this group. We assessed the prevalence and correlates of pharmacotherapy for the prevention of new cardiac events in a large population-based series. Methods: A postal survey was conducted of 2500 randomly selected survivors, drawn from a state population of patients 6 to 20 years after first CABG. Results: The response rate was 82% (n = 2061). Use of antiplatelet agents (80%) and statins (64%) declined with increasing age. Other independent predictors of antiplatelet use included statin use (odds ratio [OR] 1.6, 95% CI 1.26-2.05) and recurrent angina (OR 1.6, CI 1.17-2.06). Current smokers were less likely to use aspirin (OR 0.59, CI 0.4-0.89). Statin use was associated with reported high cholesterol (OR 24.4, CI 8.4-32.4), management by a cardiologist (OR 2.3, CI 1.8-3.0), and the use of calcium channel blockers. Patients reporting hypertension or heart failure, in addition to high cholesterol, were less likely to use statins. Angiotensin-converting enzyme inhibitors were the most commonly prescribed agents for the management of hypertension (59%) and were more frequently used among patients with diabetes and those with symptoms of heart failure. Overall, 42% of patients were on angiotensin-converting enzyme inhibitors and 36% on beta-blockers. Conclusions: Gaps exist in the use of recommended medications after CABG. Lower antiplatelet and statin use was associated with older age, freedom from angina, comorbid heart failure or hypertension, and not regularly visiting a cardiologist. Patients who continue to smoke might be less likely to adhere to prescribed medications.
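For orientation, an odds ratio and a Woolf-type 95% confidence interval can be computed from a 2x2 table as sketched below; note that the abstract's estimates are multivariable-adjusted, and the counts here are hypothetical, chosen only to land near the reported OR of 1.6 for statin use as a predictor of antiplatelet use:

    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        # 2x2 table: a = outcome present among exposed, b = outcome absent among exposed,
        #            c = outcome present among unexposed, d = outcome absent among unexposed.
        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # Woolf standard error of log(OR)
        lo = math.exp(math.log(or_) - z * se_log_or)
        hi = math.exp(math.log(or_) + z * se_log_or)
        return or_, lo, hi

    # Hypothetical counts: antiplatelet use among statin users vs non-users.
    print(odds_ratio_ci(480, 120, 900, 360))  # ~ (1.60, 1.27, 2.02)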
Abstract:
To better understand the biochemical mechanisms underlying anisosmotic extracellular regulation in the freshwater Brachyura, we kinetically characterized the V-ATPase from the posterior gills of Dilocarcinus pagei acclimated for 10 days to salinities up to 21 parts per thousand. Specific activity was highest in fresh water (26.5 +/- 2.1 U mg(-1)), decreasing from 5 to 21 parts per thousand and attaining a 3-fold lower value at 15 parts per thousand. Apparent affinities for ATP and Mg(2+) respectively increased 3.2- and 2-fold at 10 parts per thousand, suggesting expression of different isoenzymes. In a 240-h time-course study of exposure to 21 parts per thousand, maximum specific activity decreased 2.5- to 4-fold within 1 to 24 h, while apparent affinities for ATP and Mg(2+) respectively increased 12-fold within 24 h and 2.4-fold after 1 h, remaining unchanged thereafter. K(I) for bafilomycin A(1) decreased 150-fold after 1 h, remaining constant up to 120 h. This is the first kinetic analysis of V-ATPase specific activity in crustacean gills during salinity acclimation. Our findings indicate active gill Cl(-) uptake by D. pagei in fresh water, and short- and long-term down-regulation of V-ATPase-driven ion uptake processes during salinity exposure, aiding comprehension of the biochemical adaptations underpinning the establishment of the Brachyura in fresh water.
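The apparent affinities reported above are Michaelis constants obtained from activity-versus-substrate curves; a minimal nonlinear-fitting sketch is given below (the data points are synthetic, not the D. pagei measurements):

    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(s, vmax, km):
        # v = Vmax * [S] / (Km + [S]); Km is the apparent affinity constant.
        return vmax * s / (km + s)

    # Synthetic ATP concentrations (mmol L-1) and specific activities (U mg-1):
    s = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0])
    v = np.array([8.8, 13.1, 18.8, 22.0, 24.1, 25.2])

    (vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=(25.0, 0.2))
    print(f"Vmax ~ {vmax:.1f} U mg(-1); apparent Km ~ {km:.2f} mmol L(-1)")

A rise in apparent affinity, as seen at 10 parts per thousand, corresponds to a fall in the fitted Km.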
Abstract:
Two longitudinal experiments involving Merino sheep challenged with either bovine or ovine strains of Mycobacterium avium subsp. paratuberculosis (Map) were conducted over periods of 54 and 35 months, respectively. Blood samples for the interferon-gamma test and the absorbed ELISA, and faecal samples for bacteriological culture, were taken pre-challenge and monthly post-challenge. Infections were induced with either a bovine or an ovine strain of Map in separate experiments; infections were more easily established, in terms of faecal bacterial shedding and clinical disease, when the challenge inoculum was prepared from gut mucosal tissue rather than from cultured bacteria. The patterns of response for shedding and clinical disease were similar. Cell-mediated immune responses were elevated by at least an order of magnitude in all sheep dosed with either a bovine or an ovine strain of Map. Conversely, antibody responses were elevated in only a relatively small proportion of infected sheep. Neither of the clinically affected, tissue-challenged sheep developed an antibody response, despite persistent shedding and the development and decline of cell-mediated immunity. The results indicate that, for sheep, the interferon-gamma test may be useful for determining whether a flock has been exposed to ovine Johne's disease.
Abstract:
Two longitudinal experiments involving Angora goats challenged with either bovine or ovine strains of Mycobacterium avium subspecies paratuberculosis (Map) were conducted over periods of 54 and 35 months, respectively. Blood samples for the interferon-gamma (IFN-gamma) test and the absorbed ELISA, and faecal samples for bacteriological culture, were taken pre-challenge and monthly post-challenge. Persistent shedding, IFN-gamma production, seroconversion and clinical disease occurred earlier with the bovine Map gut mucosal tissue challenge inoculum than with cultured bacteria. The IFN-gamma responses of the gut mucosal tissue and bacterial challenge groups were substantially and consistently higher than those of the control group. The in vivo and cultured cattle strains were much more pathogenic for goats than the sheep strains, with persistent faecal shedding, seroconversion and clinical disease occurring in the majority of bovine Map-challenged goats. With the ovine Map, three goats developed persistent antibody responses, but only one of these also developed persistent faecal shedding and clinical disease. However, there was no significant difference between the IFN-gamma responses of the tissue-challenged, bacterial-challenged and control groups. Compared with sheep, the ELISA appeared to have higher sensitivity and the IFN-gamma test lower specificity in goats.
Abstract:
Background: Randomized trials that studied clinical outcomes after percutaneous coronary intervention (PCI) with bare metal stenting versus coronary artery bypass grafting (CABG) are underpowered to properly assess safety end points such as death, stroke, and myocardial infarction. Pooling data from randomized controlled trials increases statistical power and allows better assessment of the treatment effect in high-risk subgroups. Methods and Results: We performed a pooled analysis of 3051 patients in 4 randomized trials evaluating the relative safety and efficacy of PCI with stenting and CABG at 5 years for the treatment of multivessel coronary artery disease. The primary end point was the composite of death, stroke, or myocardial infarction. The secondary end point was the occurrence of major adverse cardiac and cerebrovascular events: death, stroke, myocardial infarction, and repeat revascularization. We tested for heterogeneity of treatment effect in patient subgroups. At 5 years, the cumulative incidence of death, myocardial infarction, and stroke was similar in patients randomized to PCI with stenting versus CABG (16.7% versus 16.9%, respectively; hazard ratio, 1.04; 95% confidence interval, 0.86 to 1.27; P = 0.69). Repeat revascularization, however, occurred significantly more frequently after PCI than after CABG (29.0% versus 7.9%, respectively; hazard ratio, 0.23; 95% confidence interval, 0.18 to 0.29; P < 0.001). Major adverse cardiac and cerebrovascular events were significantly more frequent in the PCI group than in the CABG group (39.2% versus 23.0%, respectively; hazard ratio, 0.53; 95% confidence interval, 0.45 to 0.61; P < 0.001). No heterogeneity of treatment effect was found in the subgroups, including diabetic patients and those presenting with 3-vessel disease. Conclusions: In this pooled analysis of 4 randomized trials, PCI with stenting was associated with a long-term safety profile similar to that of CABG. However, as a result of persistently lower repeat revascularization rates in the CABG patients, overall major adverse cardiac and cerebrovascular event rates were significantly lower in the CABG group at 5 years.
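In pooled patient-level analyses of this kind, hazard ratios are typically estimated with a Cox proportional-hazards model on the combined data set; below is a minimal sketch using the lifelines package (the data frame and its columns are hypothetical stand-ins, not the trial data):

    import pandas as pd
    from lifelines import CoxPHFitter

    # One row per patient: follow-up time in years, indicator for the
    # composite end point (death, stroke or MI), and randomized arm
    # (1 = PCI with stenting, 0 = CABG).
    df = pd.DataFrame({
        "time":  [4.8, 5.0, 2.1, 5.0, 3.7, 5.0, 1.2, 5.0],
        "event": [1,   0,   1,   0,   1,   0,   1,   0],
        "pci":   [1,   0,   1,   0,   0,   1,   0,   1],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()  # exp(coef) for "pci" is the hazard ratio with 95% CI

In practice the model would also include a trial indicator (or stratification by trial) so that between-trial differences do not confound the treatment comparison.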
Abstract:
Background: The long-term effectiveness of heart failure disease management programs in patients under cardiologists' care is not established. Methods and Results: We investigated the effects of a disease management program with repetitive education and telephone monitoring on primary (combined death or unplanned first hospitalization, and quality-of-life changes) and secondary end points (hospitalization, death, and adherence). The REMADHE (Repetitive Education and Monitoring for ADherence for Heart Failure) trial is a long-term randomized, prospective, parallel trial designed to compare the intervention with control. One hundred seventeen patients were randomized to usual care and 233 to the additional intervention. The mean follow-up was 2.47 +/- 1.75 years, with 54% adherence to the program. In the intervention group, the primary composite end point of death or unplanned hospitalization was reduced (hazard ratio, 0.64; confidence interval, 0.43 to 0.88; P = 0.008), driven by a reduction in hospitalization. The quality-of-life questionnaire score improved only in the intervention group (P < 0.003). Mortality was similar in both groups. The number of hospitalizations (1.3 +/- 1.7 versus 0.8 +/- 1.3, P < 0.0001), total hospital days during follow-up (19.9 +/- 51 versus 11.1 +/- 24 days, P < 0.0001), and the need for emergency visits (4.5 +/- 10.6 versus 1.6 +/- 2.4, P < 0.0001) were all lower in the intervention group. Beneficial effects were homogeneous across sex, race, diabetes status, age, functional class, and etiology. Conclusions: Over a longer follow-up period than in previous studies, this heart failure disease management program model, in patients under the supervision of a cardiologist, was associated with a reduction in unplanned hospitalization, total hospital days, and the need for emergency care, as well as improved quality of life, despite modest program adherence over time. (Circ Heart Fail. 2008;1:115-124.)
Abstract:
There are few studies on the relationship between the morphology of acute tubular necrosis (ATN) in native kidneys and late functional recovery. Eighteen patients with acute renal failure (ARF) who had undergone renal biopsy were studied. All had a histological diagnosis of ATN and were followed for at least six months. Clinical characteristics of the ARF were analyzed, and histological features were semi-quantitatively evaluated (tubular atrophy, interstitial inflammatory infiltrate, interstitial fibrosis, and ATN). According to the maximal GFR achieved during follow-up, patients were divided into two groups: complete recovery (GFR >= 90 mL/min/1.73 m(2)) and partial recovery (GFR < 90 mL/min/1.73 m(2)). Only 39% of the patients achieved complete recovery. Patients with partial recovery achieved their maximal GFR (63 +/- 9 mL/min/1.73 m(2)) 37 +/- 14 months after ARF, a period similar to that of patients with complete recovery (54 +/- 22 months). Patients with partial recovery had more severe ARF: oliguria was more frequent (90 versus 17%, p < 0.01), and they had a higher peak creatinine (13.85 +/- 1.12 versus 8.95 +/- 1.30 mg/dL, p = 0.01) and longer hospitalization (45 +/- 7 versus 20 +/- 4 days, p = 0.03). No single histological parameter was associated with partial recovery, but the sum of all parameters, expressed as an injury index, was [4.00 (2.73-5.45) versus 2.00 (1.25-3.31), p < 0.05]. In conclusion, among patients with an atypical ATN course, those with more severe ARF and tubulo-interstitial lesions are more prone to partial recovery.
Abstract:
Background: Despite all the benefits offered by mandible distraction, its complications and long-term consequences need to be evaluated to define its safety and morbidity. Forty mandible distractions were studied. Panoramic mandible radiographs obtained preoperatively, during distraction, and during the postoperative period were reviewed with the intention of evaluating the development of, and complications involving, molar buds and teeth in the distraction area. Methods: The mean patient age was 8.1 years. Twenty-five patients had craniofacial microsomia (one associated with a no. 10 facial cleft), five had temporomandibular joint ankylosis, two had familial cases of auriculocondylar syndrome, one had a Tessier no. 30 facial cleft, and one had Treacher Collins syndrome. The severity of mandible hypoplasia was Pruzansky grade I in four cases, grade IIA in eight cases, grade IIB in 16 cases, and grade III in one case. Mean radiographic follow-up was 44.8 months. Results: Molar buds located in the distraction area erupted without any deformity or displacement in 18 sides (45 percent). Fourteen cases presented distalization of a dental bud to a superior position in the mandibular ramus (four migrated back to their original position). Six molar buds presented perforations, four had shape deformities (two caused by dental fracture), and two had dental root injuries followed later by root resorption. One case developed a dentigerous cyst. Conclusions: Almost half of the patients did not have any molar bud or tooth alterations after mandible distraction, and more than 20 percent presented only bud distalization. Therefore, preventive bud enucleation or tooth extraction should be avoided before mandible distraction.
Abstract:
Short-term (one week) and chronic (six week) cardiovascular effects of orally administered perindopril were examined in the rabbit to determine whether short-term results can predict chronic outcomes. In the short-term study, five doses of perindopril were examined in random order, separated by one-week recovery periods, in each of six rabbits. Two doses of perindopril, one producing a moderate hypotensive effect (-14 mmHg) and one producing no hypotensive effect, were then selected for long-term treatment. Each rabbit in the short-term study received perindopril in doses of 0.01, 0.06, 0.32, 1.8 and 10 mg kg(-1) day(-1) for a week at a time. Rabbits on long-term treatment received either 0.3 or 0.01 mg kg(-1) day(-1) perindopril for six weeks. Mean arterial blood pressure (MAP) and heart rate were recorded in all rabbits throughout treatment. Plasma angiotensin I (AngI), perindoprilat, and angiotensin-converting enzyme (ACE) inhibition were also assayed. Perindopril treatment for one week produced a dose-dependent hypotensive effect, with the threshold dose, 0.06 mg kg(-1) day(-1), producing a 6.5 +/- 1.8 mmHg fall in MAP. The highest dose (10.0 mg kg(-1) day(-1)) produced a large fall in blood pressure of 29.6 +/- 4.2 mmHg. The 0.01 and 0.06 mg kg(-1) day(-1) doses of perindopril produced an average 2.65-fold increase in plasma AngI levels compared with the initial control. The three higher doses (0.32-10.0 mg kg(-1) day(-1)) produced an equivalent 5.7-fold increase in plasma AngI levels compared with the initial controls. However, over six weeks, 0.01 mg kg(-1) day(-1) perindopril induced a decrease in MAP similar to that of the 30-fold higher dose (-9.3 versus -11.7 mmHg), in spite of a 3-fold difference in plasma perindoprilat concentrations between the high- and low-dose groups. Plasma ACE inhibition was >80% with both doses of perindopril. The results indicate that while perindopril decreases MAP in a dose-dependent manner over short-term (one week) periods, over longer treatment times (six weeks) low doses of perindopril, non-hypotensive with short-term treatment, may be as anti-hypertensive as considerably higher doses.
Abstract:
Purpose: The impact of pelvic floor muscle training on the recovery of urinary continence after radical prostatectomy is still controversial. We tested the effectiveness of biofeedback-pelvic floor muscle training in improving urinary incontinence in the 12 months following radical prostatectomy. Materials and Methods: A total of 73 patients who underwent radical prostatectomy were randomized to a treatment group (36), which received biofeedback-pelvic floor muscle training once a week for 3 months plus home exercises, or to a control group (37). Patients were evaluated 1, 3, 6 and 12 months postoperatively. Continence was defined as the use of 1 pad or less daily, and incontinence severity was measured by the 24-hour pad test. Incontinence symptoms and quality of life were assessed with the International Continence Society male Short Form questionnaire and the Incontinence Impact Questionnaire. Pelvic floor muscle strength was evaluated with the Oxford score. Results: A total of 54 patients (26 pelvic floor muscle training and 28 controls) completed the trial. Duration of incontinence was shorter in the treatment group. At postoperative month 12, 25 patients (96.15%) in the treatment group and 21 (75.0%) in the control group were continent (p = 0.028). The absolute risk reduction was 21.2% (95% CI 3.45-38.81) and the relative risk of recovering continence was 1.28 (95% CI 1.02-1.69). The number needed to treat was 5 (95% CI 2.6-28.6). Overall there were significant changes in both groups in incontinence symptoms, lower urinary tract symptoms, quality of life and pelvic floor muscle strength (p < 0.0001). Conclusions: Early biofeedback-pelvic floor muscle training not only hastens the recovery of urinary continence after radical prostatectomy but also allows significant improvements in the severity of incontinence, voiding symptoms and pelvic floor muscle strength 12 months postoperatively.
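The effect measures quoted can be reproduced from the continence proportions with elementary arithmetic; a quick check using the abstract's own numbers:

    continent_treated = 25 / 26   # 96.15% continent in the treatment group
    continent_control = 21 / 28   # 75.0% continent in the control group

    arr = continent_treated - continent_control   # absolute risk reduction
    rr = continent_treated / continent_control    # relative "risk" of recovering continence
    nnt = 1 / arr                                 # number needed to treat

    print(f"ARR = {arr:.1%}, RR = {rr:.2f}, NNT = {nnt:.1f} (rounded up to 5)")

This reproduces the reported absolute risk reduction of 21.2%, relative risk of 1.28 and number needed to treat of 5; the confidence intervals require the corresponding standard-error formulas and are not reproduced here.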
Abstract:
The role of natural killer (NK) T cells in the development of lupus-like disease in mice is still controversial. We treated NZB/W mice with anti-NK1.1 monoclonal antibodies (mAbs), and our results revealed that administration of either an irrelevant immunoglobulin G2a (IgG2a) mAb or an IgG2a anti-NK1.1 mAb increased the production of anti-dsDNA antibodies in young NZB/W mice. However, continuous administration of the anti-NK1.1 mAb protected aged NZB/W mice from glomerular injury, leading to prolonged survival and stabilization of proteinuria, whereas administration of the control IgG2a mAb led to aggravation of the lupus-like disease. Augmented titres of anti-dsDNA in NZB/W mice upon IgG2a administration correlated with the production of BAFF/BLyS by dendritic, B and T cells. Treatment with the anti-NK1.1 mAb reduced the levels of interleukin-16, produced by T cells, in spleen cell culture supernatants from aged NZB/W mice. Adoptive transfer of NK T cells from aged to young NZB/W mice accelerated the production of anti-dsDNA in the recipients, suggesting that NK T cells from aged NZB/W mice are endowed with B-cell helper activity. In vitro studies using purified NK T cells from aged NZB/W mice showed that these cells provided B-cell help for the production of anti-dsDNA. We conclude that NK T cells are involved in the progression of lupus-like disease in mature NZB/W mice, and that immunoglobulin of the IgG2a isotype has an enhancing effect on antibody synthesis through the induction of BAFF/BLyS and therefore a deleterious effect on NZB/W mouse physiology.
Abstract:
The association of cyclophosphamide (CYC) and prednisone (PRED) for the treatment of lung fibrosis in systemic sclerosis (SSc) has been evaluated only in uncontrolled studies, although in idiopathic interstitial lung disease (ILD) this association appears beneficial in patients with non-specific interstitial pneumonia (NSIP). Objectives: To treat SSc-ILD in a prospective, open-label, controlled study based on the lung pattern, over 12 months of treatment; a 3-year analysis was also performed. Methods: Twenty-four consecutive patients with SSc and ILD underwent open lung biopsy. Eighteen patients (all with NSIP) were randomized into two groups, CYC versus CYC + PRED, for 12 months. Lung function tests (diffusing capacity of the lung for carbon monoxide corrected for hemoglobin concentration (DLCO-Hb), forced vital capacity (FVC), total lung capacity) and the Modified Rodnan Skin Score (MRSS) were performed before treatment, after one year of treatment, and 3 years after the end of treatment. Results: Pulmonary function tests were similar in both groups at baseline. After 1 year of treatment, FVC% was comparable with baseline in the CYC group (p = 0.72) and in the CYC + PRED group (p = 0.40). Three years after the end of treatment, FVC% values (p = 0.39 in the CYC group, p = 0.61 in CYC + PRED and p = 0.22 in CYC + PRED) and DLCO-Hb values (p = 0.54 in CYC and p = 0.28 in CYC + PRED) remained similar to those at 1 year of treatment. We observed a reduction of the MRSS in the CYC + PRED group after 1 year of treatment (p = 0.02), and after 3 years MRSS values remained stable in both groups. Conclusions: CYC was effective in stabilizing lung function parameters in the NSIP lung pattern of SSc for 3 years after the end of 1 year of therapy.