901 results for INTERMITTENT HEMODIALYSIS
Abstract:
The acute poisoning of chronic renal patients during hemodialysis sessions in 1996 in Caruaru City (Pernambuco State, Brazil) stimulated an intensive search for the cause of this severe complication. This search culminated in the identification of microcystins (MC), hepatotoxic cyclic heptapeptides produced by cyanobacteria, as the causative agents. More than ten years later, additional research data provides us with a better understanding of the factors related to cyanobacterial bloom occurrence and production of MC in Brazil and other South American countries. The contamination of water bodies and formation of toxic blooms remains a very serious concern, especially in countries in which surface water is used as the main source for human consumption. The purpose of this review is to highlight the discoveries of the past 15 years that have brought South American researchers to their current level of understanding of toxic cyanobacteria species and that have contributed to their knowledge of factors related to MC production, mechanisms of action and consequences for human health and the environment. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Physiological and kinematic data were collected from elite under-19 rugby union players to provide a greater understanding of the physical demands of rugby union. Heart rate, blood lactate and time-motion analysis data were collected from 24 players (mean ± s: body mass 88.7 ± 9.9 kg, height 185 ± 7 cm, age 18.4 ± 0.5 years) during six competitive premiership fixtures. Six players were chosen at random from each of four groups: props and locks, back row forwards, inside backs and outside backs. Heart rate records were classified by the percentage of time spent in four zones (>95%, 85-95%, 75-84% and <75% HRmax). Blood lactate concentration was measured periodically throughout each match. Movements were classified as standing, walking, jogging, cruising, sprinting, utility, rucking/mauling and scrummaging. The heart rate data indicated that props and locks (58.4%) and back row forwards (56.2%) spent significantly more time in high exertion (85-95% HRmax) than inside backs (40.5%) and outside backs (33.9%) (P < 0.001). Inside backs (36.5%) and outside backs (38.5%) spent significantly more time in moderate exertion (75-84% HRmax) than props and locks (22.6%) and back row forwards (19.8%) (P < 0.05). Outside backs (20.1%) spent significantly more time in low exertion (<75% HRmax) than props and locks (5.8%) and back row forwards (5.6%) (P < 0.05). Mean blood lactate concentration did not differ significantly between groups (range: 4.67 mmol/L for outside backs to 7.22 mmol/L for back row forwards; P > 0.05). The motion analysis data indicated that outside backs (5750 m) covered a significantly greater total distance than either props and locks or back row forwards (4400 and 4080 m, respectively; P < 0.05).
Inside backs and outside backs covered significantly greater distances walking (1740 and 1780 m, respectively; P < 0.001), in utility movements (417 and 475 m, respectively; P < 0.001) and sprinting (208 and 340 m, respectively; P < 0.001) than either props and locks or back row forwards (walking: 1000 and 991 m; utility movements: 106 and 154 m; sprinting: 72 and 94 m, respectively). Outside backs covered a significantly greater distance sprinting than inside backs (340 vs 208 m; P < 0.001). Forwards maintained a higher level of exertion than backs, due to more constant motion and a large involvement in static high-intensity activities. A mean blood lactate concentration of 4.8-7.2 mmol/L indicated a need for 'lactate tolerance' training to improve hydrogen ion buffering and facilitate removal following high-intensity efforts. Furthermore, the large distances (4.2-5.6 km) covered during, and intermittent nature of, match-play indicated a need for sound aerobic conditioning in all groups (particularly backs) to minimize fatigue and facilitate recovery between high-intensity efforts.
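The four heart-rate exertion zones used in the rugby study lend themselves to a simple classifier; a minimal sketch, where the function names and sample data are illustrative and only the zone boundaries come from the abstract:

```python
from collections import Counter

def hr_zone(pct_hrmax):
    """Classify a heart-rate sample (% of HRmax) into the four
    exertion zones used in the time-motion analysis above."""
    if pct_hrmax > 95:
        return "maximal"
    if pct_hrmax >= 85:
        return "high"
    if pct_hrmax >= 75:
        return "moderate"
    return "low"

def time_in_zones(samples):
    """Fraction of match time spent in each zone, given %HRmax
    samples recorded at a fixed interval."""
    counts = Counter(hr_zone(p) for p in samples)
    return {zone: n / len(samples) for zone, n in counts.items()}
```

Applied to a whole match record, `time_in_zones` yields per-zone time fractions directly comparable to the percentages reported for each positional group.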
Abstract:
Objective: To determine the effectiveness of twice-weekly directly observed therapy (DOT) for tuberculosis (TB) in HIV-infected and uninfected patients, irrespective of their previous treatment history. Also to determine the predictive value of 2-3 month smears on treatment outcome. Methods: Four hundred and sixteen new and 113 previously treated adults with culture positive pulmonary TB (58% HIV infected, 9% combined drug resistance) in Hlabisa, South Africa. Daily isoniazid (H), rifampicin (R), pyrazinamide (Z) and ethambutol (E) given in hospital (median 17 days), followed by HRZE twice a week to 2 months and HR twice a week to 6 months in the community. Results: Outcomes at 6 months among the 416 new patients were: transferred out 2%; interrupted treatment 17%; completed treatment 3%; failure 2%; and cured 71%. Outcomes were similar among HIV-infected and uninfected patients except for death (6 versus 2%; P = 0.03). Cure was frequent among adherent HIV-infected (97%; 95% CI 94-99%) and uninfected (96%; 95% CI 92-99%) new patients. Outcomes were similar among previously treated and new patients, except for death (11 versus 4%; P = 0.01), and cure among adherent previously treated patients 97% (95% CI 92-99%) was high. Smear results at 2 months did not predict the final outcome. Conclusion: A twice-weekly rifampicin-containing drug regimen given under DOT cures most adherent patients irrespective of HIV status and previous treatment history. The 2 month smear may be safely omitted. Relapse rates need to be determined, and an improved system of keeping treatment interrupters on therapy is needed. Simplified TB treatment may aid implementation of the DOTS strategy in settings with high TB caseloads secondary to the HIV epidemic. (C) 1999 Lippincott Williams & Wilkins.
Abstract:
The physical nonequilibrium of solute concentration resulting from preferential flow of soil water has often led to models where the soil is partitioned into two regions: preferential flow paths, where solute transport occurs mainly by advection, and the remaining region, where significant solute transport occurs through diffusive exchange with the flow paths. These two-region models commonly ignore concentration gradients within the regions. Our objective was to develop a simple model to assess the influence of concentration gradients on solute transport and to compare model results with experiments conducted on structured materials. The model calculates the distribution of solutes in a single spherical aggregate surrounded by preferential flow paths and subjected to alternating boundary conditions representing either an exchange of solutes between the two regions (a wet period) or no exchange but redistribution of solutes within the aggregate (a dry period). The key parameter in the model is the aggregate radius, which defines the diffusive time scales. We conducted intermittent leaching experiments on a column of packed porous spheres and on a large (300 mm long by 216 mm diameter) undisturbed field soil core to test the validity of the model and its application to field soils. Alternating wet and dry periods enhanced leaching by up to 20% for this soil, which was consistent with the model's prediction, given a fitted equivalent aggregate radius of 1.8 cm. If similar results are obtained for other soils, use of alternating wet and dry periods could improve management of solutes, for example in salinity control and in soil remediation.
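Since the aggregate radius defines the diffusive time scales, those scales can be estimated roughly as tau ~ a^2 / D_e; a sketch, where the effective diffusivity D_e is an assumed order-of-magnitude value for a small ion in water-filled pores, not a figure from the study:

```python
def diffusive_time_scale(radius_m, d_eff):
    """Characteristic time (s) for solutes to diffuse across an
    aggregate of the given radius: tau ~ a^2 / D_e."""
    return radius_m ** 2 / d_eff

# Fitted equivalent aggregate radius from the study: 1.8 cm.
# D_e = 5e-10 m^2/s is an assumed effective diffusivity (free-water
# value reduced by tortuosity); tau comes out on the order of a week.
tau_days = diffusive_time_scale(0.018, 5e-10) / 86400
```

This order-of-days time scale is what makes the duration of wet and dry periods matter for leaching efficiency.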
Abstract:
A method by which to overcome the clinical symptoms of atherosclerosis is the insertion of a graft to bypass an artery blocked or impeded by plaque. However, there may be insufficient autologous mammary artery for multiple or repeat bypass, saphenous vein may have varicose degenerative alterations that can lead to aneurysm in high-pressure sites, and small-caliber synthetic grafts are prone to thrombus induction and occlusion. Therefore, the aim of the present study was to develop an artificial blood conduit of any required length and diameter from the cells of the host for autologous transplantation. Silastic tubing, of variable length and diameter, was inserted into the peritoneal cavity of rats or rabbits. By 2 weeks, it had become covered by several layers of myofibroblasts, collagen matrix, and a single layer of mesothelium. The Silastic tubing was removed from the harvested implants, and the tube of living tissue was everted such that it now resembled a blood vessel with an inner lining of nonthrombotic mesothelial cells (the intima), with a media of smooth muscle-like cells (myofibroblasts), collagen, and elastin, and with an outer collagenous adventitia. The tube of tissue (10 to 20 mm long) was successfully grafted by end-to-end anastomoses into the severed carotid artery or abdominal aorta of the same animal in which it was grown. The transplant remained patent for at least 4 months and developed structures resembling elastic lamellae. The myofibroblasts gained a higher volume fraction of myofilaments and became responsive to contractile agonists, similar to the vessel into which they had been grafted. It is suggested that these nonthrombogenic tubes of living tissue, grown in the peritoneal cavity of the host, may be developed as autologous coronary artery bypass grafts or as arteriovenous access fistulae for hemodialysis patients.
Abstract:
Most soils contain preferential flow paths that can impact on solute mobility. Solutes can move rapidly down the preferential flow paths with high pore-water velocities, but can be held in the less permeable region of the soil matrix with low pore-water velocities, thereby reducing the efficiency of leaching. In this study, we conducted leaching experiments with interruption of the flow and drainage of the main flow paths to assess the efficiency of this type of leaching. We compared our experimental results to a simple analytical model, which predicts the influence of the variations in concentration gradients within a single spherical aggregate (SSA) surrounded by preferential flow paths on leaching. We used large (length: 300 mm, diameter: 216 mm) undisturbed field soil cores from two contrasting soil types. To carry out intermittent leaching experiments, the field soil cores were first saturated with tracer solution (CaBr2), and background solution (CaCl2) was applied to mimic a leaching event. The cores were then drained at 25- to 30-cm suction to empty the main flow paths to mimic a dry period during which solutes could redistribute within the undrained region. We also conducted continuous leaching experiments to assess the impact of the dry periods on the efficiency of leaching. The flow interruptions with drainage enhanced leaching by 10-20% for our soils, which was consistent with the model's prediction, given an optimised equivalent aggregate radius for each soil. This parameter quantifies the time scales that characterise diffusion within the undrained region of the soil, and allows us to calculate the duration of the leaching events and interruption periods that would lead to more efficient leaching. Application of these methodologies will aid development of strategies for improving management of chemicals in soils, needed in managing salts in soils, in improving fertiliser efficiency, and in reclaiming contaminated soils. (C) 2000 Elsevier Science B.V. All rights reserved.
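The wet-period exchange in an SSA-type model can be idealised as diffusion out of a uniform sphere whose surface is held at zero concentration; a sketch of the classical series solution, offered as an illustration of the kind of calculation involved rather than the authors' exact model (which alternates boundary conditions between wet and dry periods):

```python
import math

def fraction_remaining(t, radius, d_eff, terms=50):
    """Fraction of the initial solute still inside a uniform sphere
    with a perfect-sink surface (classical series solution):
    M(t)/M(0) = (6/pi^2) * sum_n (1/n^2) * exp(-n^2 pi^2 D t / a^2)."""
    a2 = radius ** 2
    s = sum(math.exp(-n * n * math.pi ** 2 * d_eff * t / a2) / (n * n)
            for n in range(1, terms + 1))
    return 6.0 / math.pi ** 2 * s
```

Evaluating this curve for a candidate aggregate radius and diffusivity gives the time needed for the undrained region to release most of its solute, which is the quantity that sets useful leaching-event and interruption durations.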
Abstract:
Background and Purpose-Few community-based studies have examined the long-term survival and prognostic factors for death within 5 years after an acute first-ever stroke. This study aimed to determine the absolute and relative survival and the independent baseline prognostic factors for death over the next 5 years among all individuals and among 30-day survivors after a first-ever stroke in a population of Perth, Western Australia. Methods-Between February 1989 and August 1990, all individuals with a suspected acute stroke or transient ischemic attack of the brain who were resident in a geographically defined region of Perth, Western Australia, with a population of 138 708 people, were registered prospectively and assessed according to standardized diagnostic criteria. Patients were followed up prospectively at 4 months, 12 months, and 5 years after the index event. Results-Three hundred seventy patients with first-ever stroke were registered, and 362 (98%) were followed up at 5 years, by which time 210 (58%) had died. In the first year after stroke the risk of death was 36.5% (95% CI, 31.5% to 41.4%), which was 10-fold (95% CI, 8.3 to 11.7) higher than that expected among the general population of the same age and sex. The most common cause of death was the index stroke (64%). Between 1 and 5 years after stroke, the annual risk of death was approximately 10% per year, which was approximately 2-fold greater than expected, and the most common cause of death was cardiovascular disease (41%). The independent baseline factors among 30-day survivors that predicted death over 5 years were intermittent claudication (hazard ratio [HR], 1.9; 95% CI, 1.2 to 2.9), urinary incontinence (HR, 2.0; 95% CI, 1.3 to 3.0), previous transient ischemic attack (HR, 2.4; 95% CI, 1.3 to 4.1), and prestroke Barthel Index <20/20 (HR, 2.0; 95% CI, 1.3 to 3.2).
Conclusions-One-year survivors of first-ever stroke continue to die over the next 4 years at a rate of approximately 10% per year, which is twice the rate expected among the general population of the same age and sex. The most common cause of death is cardiovascular disease. Long-term survival after stroke may be improved by early, active, and sustained implementation of effective strategies for preventing subsequent cardiovascular events.
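Cohort survival figures of this kind are conventionally computed with the Kaplan-Meier product-limit estimator; a minimal sketch on hypothetical data (the study's own analysis is not reproduced here):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates S(t) at each distinct event
    time. times: follow-up times; events: 1 = died, 0 = censored."""
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # deaths at this time, and subjects still at risk just before it
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)  # skip ties
    return curve
```

Comparing such a curve against the expected survival of an age- and sex-matched general population is what yields relative-survival statements like the 10-fold first-year excess above.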
Abstract:
The objective of the present study was to evaluate the performance of a new bioelectrical impedance instrument, the Soft Tissue Analyzer (STA), which predicts a subject's body composition. In a cross-sectional population study, the impedance of 205 healthy adult subjects was measured using the STA. Extracellular water (ECW) volume (as a percentage of total body water, TBW) and fat-free mass (FFM) were predicted by both the STA and a compartmental model, and compared, by correlation and limits-of-agreement analysis, with the equivalent data obtained by independent reference methods of measurement (TBW measured by D2O dilution, and FFM measured by dual-energy X-ray absorptiometry). There was a small (2.0 kg) but significant (P < 0.02) difference in mean FFM predicted by the STA compared with the reference technique in the males, but not in the females (-0.4 kg) or in the combined group (0.8 kg). Both methods were highly correlated. Similarly, small but significant differences for predicted mean ECW volume were observed. The limits of agreement were -7.5 to 9.9 kg for FFM and -4.1 to 3.0 kg for ECW. Both FFM and ECW (as a percentage of TBW) are well predicted by the STA on a population basis, but the magnitude of the limits of agreement with reference methods may preclude its usefulness for predicting body composition in an individual. In addition, the theoretical basis of an impedance method that does not include a measure of conductor length requires further validation. (C) Elsevier Science Inc. 2000.
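Limits of agreement here are in the Bland-Altman sense: mean difference between the paired measurements plus or minus 1.96 standard deviations of the differences. A minimal sketch (the input arrays in the test are hypothetical):

```python
import statistics

def limits_of_agreement(method_a, method_b):
    """Bland-Altman 95% limits of agreement between paired
    measurements from two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the differences
    return bias - 1.96 * sd, bias + 1.96 * sd
```

Wide limits, as for FFM above, mean an individual prediction can miss by several kilograms even when the group means agree closely.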
Improving maximum walking distance in early peripheral arterial disease: Randomised controlled trial
Abstract:
The purpose of this study was to determine the impact of increased physical activity and cessation of smoking on the natural history of early peripheral arterial disease. We conducted a randomised controlled trial in Perth, Western Australia, involving 882 men with early peripheral arterial disease identified via population-based screening using the Edinburgh Claudication Questionnaire and the ankle:brachial index. Members of the control group (n = 441) received usual care from their general practitioner, while members of the intervention group (n = 441) were allocated to a 'stop smoking and keep walking' regime, a combined community-based intervention of cessation of smoking (where applicable) and increased physical activity. Postal follow-up occurred at two and 12 months post-entry into the trial. The main outcome of interest was maximum walking distance. There were no statistically significant differences in the characteristics of the intervention and usual care groups at recruitment. Follow-up information at two and 12 months was available for 85% and 84% of participants, respectively. At 12 months, more men allocated to the intervention group had improved their maximum walking distance (23% vs 15%; chi(2) = 9.74, df = 2, p = 0.008). In addition, more men in the intervention group reported walking more than three times per week for recreation (34% vs 25%, p = 0.01). Although the difference was not statistically significant, more men in the intervention group who were smokers when enrolled in the trial had stopped smoking (12% vs 8%, p = 0.43). It is concluded that referral of older patients with intermittent claudication to established physiotherapy programs in the community can increase levels of physical activity and reduce disability related to peripheral arterial disease. A combination of simple and safe interventions that are readily available in the community through physiotherapists and general practitioners has the potential to improve early peripheral arterial disease.
Abstract:
Objective: To determine the age-standardised prevalence of peripheral arterial disease (PAD) and associated risk factors, particularly smoking. Method: Design: Cross-sectional survey of a randomly selected population. Setting: Metropolitan area of Perth, Western Australia. Participants: Men aged 65-83 years. Results: The adjusted response fraction was 77.2%. Of 4,470 men assessed, 744 were identified as having PAD by the Edinburgh Claudication Questionnaire and/or the ankle-brachial index of systolic blood pressure, yielding an age-standardised prevalence of PAD of 15.6% (95% confidence interval (CI): 14.5%, 16.6%). The main risk factors identified in univariate analyses were increasing age, current smoking (OR=3.9, 95% CI 2.9-5.1), former smoking (OR=2.0, 95% CI 1.6-2.4), physical inactivity (OR=1.4, 95% CI 1.2-1.7), a history of angina (OR=2.2, 95% CI 1.8-2.7) and diabetes mellitus (OR=2.1, 95% CI 1.7-2.6). The multivariate analysis showed that the highest relative risk associated with PAD was for current smoking of 25 or more cigarettes daily (OR=7.3, 95% CI 4.2-12.8). In this population, 32% of PAD was attributable to current smoking and a further 40% was attributable to past smoking by men who did not smoke currently. Conclusions: This large observational study shows that PAD is relatively common in older, urban Australian men. In contrast with its relationship to coronary disease and stroke, previous smoking appears to have a long legacy of increased risk of PAD. Implications: This research emphasises the importance of smoking as a preventable cause of PAD.
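Attributable fractions of the kind quoted above are conventionally computed with Levin's formula, PAF = p(RR - 1) / (1 + p(RR - 1)), with the odds ratio often standing in for the relative risk; a sketch (the test inputs are illustrative round numbers, not the study's adjusted estimates):

```python
def attributable_fraction(exposure_prevalence, relative_risk):
    """Levin's population attributable fraction:
    PAF = p * (RR - 1) / (1 + p * (RR - 1)),
    where p is the exposure prevalence in the population."""
    excess = exposure_prevalence * (relative_risk - 1)
    return excess / (1 + excess)
```

The formula makes explicit why a common exposure with a moderate odds ratio (past smoking) can account for more cases than a rarer exposure with a high odds ratio (heavy current smoking).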
Abstract:
WO(3)/chitosan and WO(3)/chitosan/poly(ethylene oxide) (PEO) films were prepared by the layer-by-layer method. The presence of chitosan enabled PEO to be carried into the self-assembled structure, contributing to an increase in the Li(+) diffusion rate. On the basis of the galvanostatic intermittent titration technique (GITT) and the quadratic logistic equation (QLE), a spectroelectrochemical method was used for determination of the "optical" diffusion coefficient (D(op)), enabling analysis of the Li(+) diffusion rate and, consequently, the coloration front rate in these host matrices. The D(op) values within the WO(3)/chitosan/PEO film were significantly higher than those within the WO(3)/chitosan film, mainly for higher values of injected charge. The presence of PEO also ensured larger accessibility to the electroactive sites, in accordance with the method employed here. Hence, this spectroelectrochemical method allowed us to separate the contribution of the diffusion process from the number of accessible electroactive sites in the materials, thereby aiding a better understanding of the useful electrochemical and electrochromic properties of these films for use in electrochromic devices. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Layer-by-layer (LbL) nanocomposite films from TiO(2) nanoparticles and tungsten-based oxides (WO(x)H(y)), as well as dip-coating films of TiO(2) nanoparticles, were prepared and investigated by electrochemical techniques under visible light beams, aiming to evaluate the lithium ion storage and chromogenic properties. Atomic force microscopy (AFM) images were obtained for morphological characterization of the surface of the materials, which have similar roughness. Cyclic voltammetry and chronoamperometry measurements indicated a higher storage capacity of lithium ions in the LbL nanocomposite compared with the dip-coating film, which was attributed to the faster lithium ion diffusion rate within the self-assembled matrix. On the basis of the data obtained from the galvanostatic intermittent titration technique (GITT), the values of the lithium ion diffusion coefficient (D(Li)) for TiO(2)/WO(x)H(y) were larger than those for TiO(2). The rate of the coloration front in the matrices was investigated using a spectroelectrochemical method based on GITT, allowing the determination of the "optical" diffusion coefficient (D(op)) as a function of the amount of lithium ions previously inserted into the matrices. The values of D(Li) and D(op) suggested the existence of phases with distinct contributions to lithium ion diffusion rates and electrochromic efficiency. Moreover, these results aided a better understanding of the temporal change of current density and absorbance during the ionic electro-insertion, which is important for the possible application of these materials in lithium ion batteries and electrochromic devices.
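The electrochemical GITT analysis that both film studies build on rests on the Weppner-Huggins relation for a single current pulse; a sketch with illustrative parameter names (the "optical" D(op) variant, which substitutes an absorbance transient for the voltage transient, is not reproduced here):

```python
import math

def gitt_diffusion_coefficient(tau, mass, molar_mass, molar_volume,
                               area, delta_e_s, delta_e_t):
    """Chemical diffusion coefficient from one GITT current pulse
    (Weppner-Huggins relation, valid for pulse times tau << L^2/D):
    D = (4 / (pi * tau)) * (m_B * V_M / (M_B * S))^2 * (dE_s / dE_t)^2,
    where dE_s is the steady-state voltage change per pulse and dE_t
    the transient voltage change during the pulse."""
    geom = (mass * molar_volume) / (molar_mass * area)
    return 4.0 / (math.pi * tau) * geom ** 2 * (delta_e_s / delta_e_t) ** 2
```

Repeating the calculation pulse by pulse gives D as a function of the amount of lithium already inserted, which is how composition-dependent diffusion profiles like those discussed above are built up.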
Abstract:
Background: We tested the hypothesis that the universal application of myocardial scanning with single-photon emission computed tomography (SPECT) would result in better risk stratification in renal transplant candidates (RTC) compared with SPECT being restricted to patients who, in addition to renal disease, had other clinical risk factors. Methods: RTCs (n=363) underwent SPECT and clinical risk stratification according to the American Society of Transplantation (AST) algorithm and were followed up until a major adverse cardiovascular event (MACE) or death. Results: Of the 363 patients, 79 (22%) had an abnormal SPECT scan and 270 (74%) were classified as high risk. Both methods correctly identified patients with an increased probability of MACE. However, clinical stratification performed better (sensitivity and negative predictive value 99% and 99% vs. 25% and 87%, respectively). High-risk patients with an abnormal SPECT scan had a modestly increased risk of events (log-rank P = 0.03; hazard ratio [HR] = 1.37; 95% confidence interval [95% CI], 1.02-1.82). Eighty-six patients underwent coronary angiography, and coronary artery disease (CAD) was found in 60%. High-risk patients with CAD had an increased incidence of events (log-rank P = 0.008; HR = 3.85; 95% CI, 1.46-13.22), but in those with an abnormal SPECT scan, the incidence of events was not influenced by CAD (log-rank P = 0.23). Forty-six patients died. Clinical stratification, but not SPECT, correlated with the probability of death (log-rank P = 0.02; HR = 3.25; 95% CI, 1.31-10.82). Conclusion: SPECT should be restricted to high-risk patients. Moreover, in contrast to SPECT, the AST algorithm was also useful for predicting death from any cause in RTCs and for selecting patients for invasive coronary testing.
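The head-to-head comparison of the two stratification methods rests on sensitivity and negative predictive value computed from a 2x2 table of test result against observed events; a minimal sketch (the counts in the test are hypothetical, not the study's data):

```python
def sensitivity_npv(tp, fp, fn, tn):
    """Sensitivity and negative predictive value from the cells of a
    2x2 table: tp = positive test & event, fp = positive & no event,
    fn = negative & event, tn = negative & no event."""
    sensitivity = tp / (tp + fn)   # events correctly flagged
    npv = tn / (tn + fn)           # negatives that truly stay event-free
    return sensitivity, npv
```

A near-perfect NPV, as reported for the clinical algorithm, is what justifies safely withholding further testing from patients the algorithm calls low risk.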
Abstract:
Vascular calcification is a strong prognostic marker of mortality in hemodialysis patients and has been associated with bone metabolism disorders in this population. In earlier stages of chronic kidney disease (CKD), vascular calcification has also been documented. This study evaluated the association between coronary artery calcification (CAC) and bone histomorphometric parameters in predialysis CKD patients, assessed by multislice coronary tomography and by undecalcified bone biopsy. CAC was detected in 33 (66%) patients, and their median calcium score was 89.7 (0.4-2299.3 AU). The most frequent bone histologic alterations observed included low trabecular bone volume, increased eroded and osteoclast surfaces, and low bone-formation rate (BFR/BS). Multiple logistic regression analysis, adjusted for age, sex, and diabetes, showed that BFR/BS was independently associated with the presence of coronary calcification [p = .009; odds ratio (OR) = 0.15; 95% confidence interval (CI), 0.036-0.619]. This study showed a high prevalence of CAC in asymptomatic predialysis CKD patients. Also, there was an independent association between low bone formation and CAC in this population. In conclusion, our results provide evidence that low bone-formation rate constitutes another nontraditional risk factor for cardiovascular disease in CKD patients. (C) 2010 American Society for Bone and Mineral Research.
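The adjusted OR of 0.15 per unit of BFR/BS is exp(beta) for the corresponding logistic-regression coefficient beta; a one-line sketch of that standard conversion:

```python
import math

def odds_ratio(beta):
    """Odds ratio for a one-unit increase in a predictor, from its
    logistic-regression coefficient: OR = exp(beta)."""
    return math.exp(beta)
```

An OR of 0.15 corresponds to beta = ln(0.15), about -1.9, i.e. each unit increase in bone-formation rate is associated with substantially lower odds of coronary calcification, consistent with the protective direction reported above.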