896 results for "Refinery treated effluent"



Abstract:

INTRODUCTION: Electroencephalography (EEG) has a central role in outcome prognostication in subjects with anoxic/hypoxic encephalopathy following a cardiac arrest (CA). Continuous EEG monitoring (cEEG) has been steadily developed and studied; however, its yield compared with repeated standard EEG (sEEG) is unknown. METHODS: We studied a prospective cohort of comatose adults treated with therapeutic hypothermia (TH) after a CA. cEEG data regarding background activity and epileptiform components were compared with two 20-minute sEEG epochs extracted from the cEEG recording (one during TH and one in early normothermia). RESULTS: In this cohort, 34 recordings were studied. During TH, the agreement between cEEG and sEEG was 97.1% (95% CI: 84.6-99.9%) for the evaluation of background discontinuity and reactivity, and 94.1% (95% CI: 80.3-99.2%) for epileptiform activity. In early normothermia, we did not find any discrepancies. Thus, concordance was very good during TH (kappa = 0.83) and optimal during normothermia (kappa = 1). The median delay between CA and the first EEG reactivity testing was 18 hours (range: 4.75-25) for patients with perfect agreement and 10 hours (range: 5.75-10.5) for the three patients with discordant findings (P = 0.02, Wilcoxon). CONCLUSION: Standard intermittent EEG performed comparably to continuous EEG both for variables important for outcome prognostication (EEG reactivity) and for the identification of epileptiform transients in this relatively small sample of comatose survivors of CA. This finding has an important practical implication, especially for centers where EEG resources are limited.
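
As a side note on the agreement statistics quoted above, Cohen's kappa compares the observed agreement between two readings (here, cEEG versus sEEG interpretations) with the agreement expected by chance. A minimal sketch with hypothetical labels, not the study data:

```python
from sklearn.metrics import cohen_kappa_score

# hypothetical paired readings (1 = epileptiform activity present, 0 = absent)
ceeg_reading = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0]
seeg_reading = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0]

# raw (percentage) agreement and chance-corrected agreement (kappa)
raw_agreement = sum(a == b for a, b in zip(ceeg_reading, seeg_reading)) / len(ceeg_reading)
kappa = cohen_kappa_score(ceeg_reading, seeg_reading)
print(f"raw agreement = {raw_agreement:.1%}, kappa = {kappa:.2f}")
```

With this toy data the raw agreement is about 92% and kappa about 0.80, illustrating how kappa discounts the agreement that two mostly-negative readings would show by chance alone.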


Abstract:

Application of wild-type or genetically modified bacteria to the soil environment entails the risk of dissemination of these organisms to the groundwater. To measure vertical transport of bacteria under natural climatic conditions, Pseudomonas fluorescens strain CHA0 was released together with bromide as a mobile tracer at the surface of large outdoor lysimeters. Two experiments, one starting in autumn 1993 and the other in spring 1994, were performed. Shortly after a heavy rainfall in late spring 1994, the released bacteria were detected for the first time in effluent water from the 2.5-m-deep lysimeters in both experiments, i.e. 210 d and 21 d after inoculation, respectively. Only a 10⁻⁹ to 10⁻⁸ fraction of the inoculum was recovered as culturable cells in the effluent water, but a larger fraction of the CHA0 cells was in a non-culturable state, as detected by immunofluorescence microscopy. As much as 50% of the mobile tracer percolated through the lysimeters, indicating that, compared with bromide, bacterial cells were retained in the soil. In the second part of this study, persistence of CHA0 in groundwater microcosms consisting of lysimeter effluent water was studied for 380 d. Survival of the inoculant as culturable cells was better under anaerobic than under aerobic conditions. However, a large fraction of the cells became non-culturable in both cases. When the experiment was performed with filter-sterilized effluent water, the total count of introduced bacteria did not decline with time. In conclusion, the biocontrol strain was transported in low numbers to a potential groundwater level under natural climatic conditions, but could persist for an extended period in groundwater microcosms.


Abstract:

Organic residue application to soil alters the emission of gases to the atmosphere, and CO2, CH4, and N2O may contribute to intensifying the greenhouse effect. This experiment was carried out in a restoration area on a dystrophic Ultisol (PVAd) to quantify greenhouse gas (GHG) emissions from soil under castor bean cultivation treated with sewage sludge (SS) or mineral fertilizer. The following treatments were tested: control without N; FertMin = mineral fertilizer; SS5 = 5 t ha-1 SS (37.5 kg ha-1 N); SS10 = 10 t ha-1 SS (75 kg ha-1 N); and SS20 = 20 t ha-1 SS (150 kg ha-1 N). The amount of sludge was based on the recommended N rate for castor bean (75 kg ha-1), the N content of the SS, and the mineralization fraction of N from SS. Soil gas emission was measured for 21 days. Sewage sludge and mineral fertilizer altered the CO2, CH4, and N2O fluxes. Soil moisture had no effect on GHG emissions, and the gas fluxes were statistically equivalent after the application of FertMin and of 5 t ha-1 SS. Applying the entire crop N requirement in the form of SS practically doubled the Global Warming Potential (GWP) and the C-equivalent emissions in comparison with the FertMin treatment.
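
For context on how a GWP comparison of this kind is typically computed, the sketch below aggregates cumulative CO2, CH4, and N2O fluxes into CO2-equivalent emissions. The 100-year GWP factors shown are the IPCC AR4 values, and the flux numbers are placeholders for illustration, not data from this experiment.

```python
# Illustrative GWP-weighted aggregation of soil GHG fluxes into CO2-equivalents.
# GWP100 factors: IPCC AR4 values (assumed here; the study's factors are not stated).
GWP100 = {"CO2": 1, "CH4": 25, "N2O": 298}

def co2_equivalent(cumulative_flux_kg_ha: dict) -> float:
    """Return total emissions in kg CO2-eq ha-1 for the given cumulative gas fluxes (kg ha-1)."""
    return sum(GWP100[gas] * flux for gas, flux in cumulative_flux_kg_ha.items())

# hypothetical cumulative fluxes over the measurement period (kg ha-1)
fert_min = {"CO2": 420.0, "CH4": 0.2, "N2O": 0.6}
ss20 = {"CO2": 610.0, "CH4": 0.5, "N2O": 1.3}
print(co2_equivalent(fert_min), co2_equivalent(ss20))  # kg CO2-eq ha-1
```

The weighting makes clear why even small absolute N2O fluxes can dominate the CO2-equivalent total.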


Abstract:

OBJECTIVE: To evaluate the results of closed and grade I and II open tibial shaft fractures treated by reamed intramedullary nailing. SUBJECTS AND METHODS: Between 1997 and 2000, 119 patients with tibial shaft fractures were treated with reamed tibial nails. Postoperatively, 96 patients (70 closed and 26 grade I and II open fractures) were followed clinically and radiologically for up to 18 months. The nail was inserted after overreaming by 1.5 mm, using either a patellar tendon-splitting or a non-splitting approach. Postoperatively, patients with an isolated tibial fracture were mobilized with partial weight bearing on the injured leg for 6 weeks. Patients with associated ankle fractures were allowed to walk with a Sarmiento cast. RESULTS: Six (6.3%) patients developed a compartment syndrome after surgery. In 48 (50%) cases, dynamization of the nail was carried out after a mean period of 12 weeks for delayed union. Overall, a 90.6% union rate was obtained at a mean of 24 weeks, with no difference between closed and open fractures. Two (2.1%) patients with a grade II open fracture developed a deep infection requiring treatment. A 9.4% rate of malunion was observed. Eight (8.3%) patients developed screw failure without clinical consequences. At the last follow-up, 52% of patients with patellar tendon splitting had anterior knee pain, compared with 14% of those without tendon splitting. CONCLUSION: The reamed intramedullary nail is a suitable implant for treating closed as well as grade I and II open tibial shaft fractures.


Abstract:

Sewage sludge is a by-product of wastewater treatment plants. With treatment and processing, the sludge can be made suitable for rational and environmentally safe use in agriculture. The aim of this study was to assess the influence of different doses of limed sewage sludge (50%) on clay dispersion in soil samples with different textures (clayey and medium). The study was conducted with soil samples collected under native forest: a clayey Red Latosol (Brazilian classification: Latossolo Vermelho distroférrico) in Londrina (PR) and a medium-texture Red-Yellow Latosol (BC: Latossolo Vermelho-Amarelo distrófico) in Jaguapitã (PR). Pots were filled with 3 kg of air-dried fine earth and kept in a greenhouse. The experiment was arranged in a randomized block design with six treatments and five replications: T1, control; and limed sewage sludge (50%) treatments as follows: T2 (3 t ha-1), T3 (6 t ha-1), T4 (12 t ha-1), T5 (24 t ha-1), and T6 (48 t ha-1). The incubation time was 180 days. At the end of this period, the pots were opened and two sub-samples per treatment were collected to determine pH-H2O, pH-KCl (1 mol L-1), organic matter content, water-dispersible clay, ΔpH (pH-KCl - pH-H2O), and the estimated PZC (point of zero charge), PZC = 2 pH-KCl - pH-H2O, as well as the mineralogy of the clay fraction, determined by X-ray diffraction. The results showed no significant difference in the average values of water-dispersible clay between the control and the other treatments for the two soil samples studied, and ΔpH was the variable that correlated best with water-dispersible clay in both soils.
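
Written out explicitly, the two indices derived from the paired pH readings are (matching the formula quoted above; the numerical example uses hypothetical values):

$$\Delta\mathrm{pH} = \mathrm{pH_{KCl}} - \mathrm{pH_{H_2O}}, \qquad \mathrm{PZC} = 2\,\mathrm{pH_{KCl}} - \mathrm{pH_{H_2O}}$$

For instance, hypothetical readings of pH-KCl = 4.6 and pH-H2O = 5.3 give ΔpH = -0.7 and an estimated PZC of 3.9; a negative ΔpH indicates a net negative surface charge, since the soil pH lies above the PZC.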


Abstract:

PURPOSE: To quantify the relationship between the bone marrow (BM) response to radiation and radiation dose using [18F]-fluorodeoxyglucose positron emission tomography ([18F]FDG-PET) standardized uptake values (SUV), and to correlate these findings with hematological toxicity (HT) in cervical cancer (CC) patients treated with chemoradiation therapy (CRT). METHODS AND MATERIALS: Seventeen women with a diagnosis of CC were treated with standard doses of CRT. All patients underwent pre- and post-therapy [18F]FDG-PET/computed tomography (CT). Hemograms were obtained before and during treatment, 3 months after treatment, and at the last follow-up. Pelvic bone was autosegmented as total bone marrow (BMTOT). Active bone marrow (BMACT) was contoured based on SUV greater than the mean SUV of BMTOT. The volumes (V) of each region receiving 10, 20, 30, and 40 Gy (V10, V20, V30, and V40, respectively) were calculated. Metabolic volume histograms and voxel SUV map response graphs were created. Relative changes in SUV before and after therapy were calculated by separating SUV voxels into radiation therapy dose ranges of 5 Gy. The relationships among SUV decrease, radiation dose, and HT were investigated using multiple regression models. RESULTS: Mean relative pre- to post-therapy SUV reductions in BMTOT and BMACT were 27% and 38%, respectively. BMACT volume was significantly reduced after treatment (from 651.5 to 231.6 cm³; P<.0001). BMACT V30 was significantly correlated with the reduction in BMACT SUV (R² = 0.14; P<.001). The reduction in BMACT SUV correlated significantly with the reduction in white blood cells (WBCs) at 3 months post-treatment (R² = 0.27; P=.04) and at the last follow-up (R² = 0.25; P=.04). Different dosimetric parameters of BMTOT and BMACT correlated with long-term hematological outcome. CONCLUSIONS: The volumes of BMTOT and BMACT exposed to even relatively low doses of radiation are associated with a decrease in WBC counts following CRT. The loss of proliferative BM SUV uptake translates into low WBC nadirs after treatment. These results suggest the potential of intensity-modulated radiation therapy to spare BMTOT and thereby reduce long-term hematological toxicity.
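
To make the dose-volume metrics above concrete, the sketch below computes V10-V40 (here expressed as the percentage of the region's voxels at or above each dose) and the mean relative SUV reduction from per-voxel dose and SUV arrays; the arrays are synthetic placeholders, not the study's imaging data.

```python
import numpy as np

def v_dose(dose_gy: np.ndarray, threshold_gy: float) -> float:
    """Percentage of voxels in the region receiving at least `threshold_gy`."""
    return 100.0 * float(np.mean(dose_gy >= threshold_gy))

rng = np.random.default_rng(0)
dose = rng.uniform(0.0, 50.0, size=10_000)                 # per-voxel dose (Gy), synthetic
suv_pre = rng.uniform(1.0, 3.0, size=10_000)               # pre-therapy SUV, synthetic
suv_post = suv_pre * rng.uniform(0.5, 0.9, size=10_000)    # post-therapy SUV, synthetic

v10, v20, v30, v40 = (v_dose(dose, t) for t in (10, 20, 30, 40))
suv_reduction = 100.0 * (suv_pre.mean() - suv_post.mean()) / suv_pre.mean()
print(f"V10={v10:.0f}%  V20={v20:.0f}%  V30={v30:.0f}%  V40={v40:.0f}%  "
      f"mean SUV reduction={suv_reduction:.0f}%")
```

In the study these metrics were then related to SUV change and blood counts with regression models; the sketch only shows how the per-region quantities themselves are derived.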


Abstract:

OBJECTIVES: To evaluate the prevalence of 25-hydroxyvitamin D [25(OH)D] deficiency in HIV-positive patients, a population at risk for osteoporosis. DESIGN: Retrospective assessment of vitamin D levels by season and by initiation of combined antiretroviral therapy (cART). METHODS: 25(OH)D was measured in 211 HIV-positive patients: samples were taken before initiation of cART, from February to April or from August to October, as well as 12 months (same season) and 18 months (alternate season) after starting cART. 1,25-Dihydroxyvitamin D [1,25(OH)2D] was measured in a subset of 74 patients. Multivariable analyses included season, sex, age, ethnicity, BMI, intravenous drug use (IDU), renal function, time since HIV diagnosis, previous AIDS, CD4 cell count, and cART, in particular nonnucleoside reverse transcriptase inhibitor (NNRTI) and tenofovir (TDF) use. RESULTS: At baseline, median 25(OH)D levels were 37 (interquartile range 20-49) nmol/l in spring and 57 (39-74) nmol/l in the fall; 25(OH)D deficiency (less than 30 nmol/l) was more prevalent in spring (42%) than in fall (14%) and remained unchanged regardless of cART exposure. In multivariable analysis, 25(OH)D levels were higher in white patients and in those with a longer time since HIV diagnosis, and lower in springtime measurements and in those with active IDU and NNRTI use. 1-Hydroxylation rates were significantly higher in patients with low 25(OH)D. Hepatitis C seropositivity, previous AIDS, and higher CD4 cell counts correlated with lower 1,25(OH)2D levels, whereas BMI and TDF use were associated with higher levels. In TDF-treated patients, higher 1,25(OH)2D correlated with increases in serum alkaline phosphatase. CONCLUSION: Given the high rate of vitamin D deficiency in HIV-positive patients, systematic screening with consideration of seasonality is warranted. The impact of NNRTIs on 25(OH)D and of TDF on 1,25(OH)2D needs further attention.


Abstract:

In patients with venous thromboembolism (VTE), assessment of the risk of fatal recurrent VTE and fatal bleeding during anticoagulation may help to guide the intensity and duration of therapy. We aimed to provide estimates of the case-fatality rate (CFR) of recurrent VTE and major bleeding during anticoagulation in a 'real life' population, and to assess these outcomes according to the initial presentation of VTE and its etiology. The study included 41,826 patients with confirmed VTE from the RIETE registry who received different durations of anticoagulation (mean 7.8 ± 0.6 months). During 27,110 patient-years, the CFR was 12.1% (95% CI, 10.2-14.2) for recurrent VTE and 19.7% (95% CI, 17.4-22.1) for major bleeding. During the first three months of anticoagulant therapy, the CFR of recurrent VTE was 16.1% (95% CI, 13.6-18.9), compared with 2.0% (95% CI, 0-4.2) beyond this period. The CFR of bleeding was 20.2% (95% CI, 17.5-23.1) during the first three months, compared with 18.2% (95% CI, 14.0-23.2) beyond this period. The CFR of recurrent VTE was higher in patients initially presenting with pulmonary embolism (PE) (18.5%; 95% CI, 15.3-22.1) than in those with deep vein thrombosis (DVT) (6.3%; 95% CI, 4.5-8.6), and in patients with provoked VTE (16.3%; 95% CI, 13.6-19.4) than in those with unprovoked VTE (5.5%; 95% CI, 3.5-8.0). In conclusion, the CFR of recurrent VTE decreased over time during anticoagulation, while the CFR of major bleeding remained stable. The CFR of recurrent VTE was higher in patients initially presenting with PE and in those with provoked VTE.
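
The case-fatality rates reported above are proportions of fatal events among all events of each type. A minimal sketch of such a calculation with a 95% confidence interval is shown below; the Wilson score interval and the event counts are assumptions for illustration, since the registry's exact method is not stated in the abstract.

```python
from math import sqrt

def case_fatality_rate(fatal: int, total: int, z: float = 1.96) -> tuple:
    """CFR (%) with a Wilson score 95% CI (illustrative choice of interval)."""
    p = fatal / total
    denom = 1.0 + z**2 / total
    centre = (p + z**2 / (2 * total)) / denom
    half = z * sqrt(p * (1.0 - p) / total + z**2 / (4 * total**2)) / denom
    return 100 * p, 100 * (centre - half), 100 * (centre + half)

# hypothetical counts: 60 fatal recurrences among 495 recurrent VTE events
cfr, lo, hi = case_fatality_rate(60, 495)
print(f"CFR = {cfr:.1f}% (95% CI {lo:.1f}-{hi:.1f})")
```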


Abstract:

From data collected during routine therapeutic drug monitoring (TDM), plasma concentrations of citalopram (CIT) and its metabolites demethylcitalopram (DCIT) and didemethylcitalopram (DDCIT) were measured in 345 plasma samples collected under steady-state conditions. They came from 258 patients treated with usual doses (20-60 mg/d) and from patients medicated with 80-360 mg/d CIT. Most patients had one or several comedications, including other antidepressants, antipsychotics, lithium, anticonvulsants, psychostimulants, and somatic medications. Dose-corrected CIT plasma concentrations (C/D ratio) were 2.51 +/- 2.25 ng mL-1 mg-1 (n = 258; mean +/- SD). Patients >65 years had significantly higher dose-corrected CIT plasma concentrations (n = 56; 3.08 +/- 1.35 ng mL-1 mg-1) than younger patients (n = 195; 2.35 +/- 2.46 ng mL-1 mg-1) (P = 0.03). CIT plasma concentrations in the generally recommended dose range were [mean +/- SD (median)]: 57 +/- 64 (45) ng/mL (10-20 mg/d; n = 64) and 117 +/- 95 (91) ng/mL (21-60 mg/d; n = 96). At higher than usual doses, the following CIT concentrations were measured: 61-120 mg/d, 211 +/- 103 (190) ng/mL (n = 93); 121-200 mg/d, 339 +/- 143 (322) ng/mL (n = 70); 201-280 mg/d, 700 +/- 408 (565) ng/mL (n = 18); 281-360 mg/d, 888 +/- 620 (616) ng/mL (n = 4). When only one sample per patient is considered (at the highest daily dose in the case of repeated dosages), there is a significant linear correlation (n = 48, r = 0.730; P < 0.001) between daily dose (10-200 mg/d) and CIT plasma concentration. In experiments with dogs, DDCIT was reported to affect the QT interval when present at concentrations >300 ng/mL. In this study, the DDCIT concentration reached 100 ng/mL in a patient treated with 280 mg/d CIT. Twelve other patients treated with 140-320 mg/d CIT had plasma DDCIT concentrations within the range 52-73 ng/mL. In a subgroup comprising patients treated with ≥160 mg/d CIT with CIT plasma concentrations ≤300 ng/mL, and patients treated with ≤200 mg/d CIT with CIT plasma concentrations ≥600 ng/mL, the enantiomers of CIT and DCIT were also analyzed. The highest S-CIT concentration measured in this subgroup was 327 ng/mL, in a patient treated with 140 mg/d CIT, but the highest S-CIT concentration overall (632 ng/mL) was measured in a patient treated with 360 mg/d CIT. In conclusion, there is a highly linear correlation between CIT plasma concentrations and CIT doses, even well above the usual dose range.
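
As a small illustration of the dose-corrected concentration (C/D ratio) and the dose-concentration correlation described above, the sketch below computes both from hypothetical TDM records; the numbers are placeholders, not the study data.

```python
import numpy as np

# hypothetical TDM records: daily CIT dose (mg/d) and plasma concentration (ng/mL)
dose_mg_day = np.array([20, 40, 60, 80, 120, 160, 200])
conc_ng_ml = np.array([48, 95, 150, 190, 310, 420, 540])

c_over_d = conc_ng_ml / dose_mg_day              # dose-corrected concentration, ng mL-1 mg-1
r = np.corrcoef(dose_mg_day, conc_ng_ml)[0, 1]   # Pearson correlation, dose vs. concentration

print("C/D ratios:", np.round(c_over_d, 2))
print(f"Pearson r = {r:.3f}")
```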


Abstract:

Synthetic root exudates were formulated based on the organic acid composition of root exudates derived from the rhizosphere of aseptically grown corn plants, the pH of the rhizosphere, and the background chemical matrices of the soil solutions. The synthetic root exudates, which mimic the chemical conditions of the rhizosphere environment where soil-borne metals are dissolved and absorbed by plants, were used to extract metals from sewage-sludge-treated soils in 16 successive extractions. The concentrations of Zn, Cd, Ni, Cr, and Cu in the sludge-treated soil were 71.74, 0.21, 15.90, 58.12, and 37.44 mg kg-1, respectively. The synthetic root exudates consisted of acetic, butyric, glutaric, lactic, maleic, propionic, pyruvic, succinic, tartaric, and valeric acids. The organic acid mixtures had concentrations of 0.05 and 0.1 mol L-1 -COOH. The trace elements removed by successive extractions may be considered representative of the availability of these metals to plants in these soils. The chemical speciation of the metals in the liquid phase was calculated; the results showed that metals in sludge-treated soils were dissolved and formed soluble complexes with the different organic acid-based root exudates. The most reactive organic acid ligands were lactate, maleate, tartrate, and acetate. The inorganic ligands chloride and sulfate played insignificant roles in metal dissolution. Except for Cd, free ions did not represent an important chemical species of the metals in the soil rhizosphere. As different metals formed soluble complexes with different ligands in the rhizosphere, no extractant based on a single reagent would be able to recover all of the potentially plant-available metals from soils; the root exudate-derived organic acid mixtures tested in this study may be better suited to recovering potentially plant-available metals from soils than conventional extractants.


Abstract:

Studies on sewage sludge (SS) have confirmed the possibility of using this waste as a fertilizer and/or soil conditioner in crop production areas. Despite restrictions with regard to the levels of potentially toxic elements (PTE) and pathogens, it is believed that properly treated SS with low PTE levels, applied to soil at adequate rates, may improve the soil's chemical and microbiological properties. This study consisted of a long-term field experiment conducted on a Typic Haplorthox (eutroferric Red Latosol) treated with SS for seven successive years of maize production, to evaluate changes in the soil chemical and microbiological properties. The treatments consisted of two SS rates (a single and a double dose of the crop N requirement) and a mineral fertilizer treatment. Soil was sampled in the 0-0.20 m layer and analyzed for chemical properties (organic C, pH, P, K, Ca, Mg, CEC, B, Cu, Fe, Mn, Zn, Cd, Ni, and Pb) and microbiological properties (basal respiration, microbial biomass activity, microbial biomass C, metabolic quotient, microbial quotient, and protease and dehydrogenase enzyme activities). Successive SS applications to the soil increased macro- and micronutrient availability, but the highest SS dose reduced the soil pH significantly, indicating a need for periodic corrections. The SS treatments also affected soil microbial activity and biomass negatively. There were no significant differences among treatments in maize grain yield. After seven annual applications of the recommended sludge rate, the heavy metal levels in the soil had not reached toxic levels.
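
For reference, the metabolic and microbial quotients listed among the microbiological properties are derived quantities; their usual definitions (not stated explicitly in the abstract, so given here as the standard convention) are:

$$q\mathrm{CO_2} = \frac{\text{basal respiration (CO}_2\text{-C evolved)}}{\text{microbial biomass C}}, \qquad \text{microbial quotient} = \frac{C_\mathrm{mic}}{C_\mathrm{org}}$$

A rising qCO2 combined with a falling microbial quotient is commonly interpreted as a sign of stress on the soil microbial community.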


Abstract:

AIM: To assess the influence of hemoglobin (Hb) levels in locally advanced head and neck cancer (LAHNC) patients treated with surgery and postoperative radiotherapy (PORT). MATERIAL AND METHODS: Pre- and postoperative Hb levels were collected in 79 patients treated with surgery followed by accelerated PORT for LAHNC. Median follow-up was 52 months (range 12-95 months). RESULTS AND DISCUSSION: The 4-year overall survival (OS) rate was 51%. Neither the pre- nor the postoperative Hb level (<120 g/l in women or <130 g/l in men) influenced the outcome. However, when the decrease between pre- and postoperative Hb values was taken into account, 4-year OS was significantly higher in patients with an Hb difference of less than 38 g/l (quartile value) than in those with an Hb decrease of 38 g/l or more (61% versus 16%, P = 0.008). CONCLUSION: A decrease in Hb level of 38 g/l or more after surgery, secondary to blood loss, influences the outcome when postoperative RT is indicated.