220 results for TREATED WASTEWATER REUSE
Abstract:
BACKGROUND: We retrospectively reviewed the long-term outcomes and late side effects in endometrial cancer (EC) patients treated with different techniques of postoperative radiotherapy (PORT). METHODS: Between 1999 and 2012, 237 patients with EC were treated with PORT. Two-dimensional external beam radiotherapy (2D-EBRT) was used in 69 patients (30%), three-dimensional EBRT (3D-EBRT) in 51 (21%), and intensity-modulated RT (IMRT) with helical Tomotherapy in 47 (20%); all of these patients also received a vaginal brachytherapy (VB) boost. Seventy patients (29%) received VB alone. RESULTS: After a median follow-up of 68 months (range, 6-154), overall survival was 75% [95% confidence interval (CI), 69-81], disease-free survival was 72% (95% CI, 66-78), cancer-specific survival was 85% (95% CI, 80-89), and locoregional control was 86% (95% CI, 81-91). The 5-year estimates of grade 3 or higher toxicity and second cancer rates were 0% and 7% (95% CI, 1-13) for VB alone, 6% (95% CI, 1-11) and 0% for IMRT + VB, 9% (95% CI, 1-17) and 5% (95% CI, 1-9) for 3D-EBRT + VB, and 22% (95% CI, 12-32) and 12% (95% CI, 4-20) for 2D-EBRT + VB (P = 0.002 and P = 0.01, respectively). CONCLUSIONS: Pelvic EBRT should be reserved for patients with high-risk EC because the severe late toxicity observed might outweigh the benefits. When EBRT is prescribed for EC, IMRT should be considered, because it was associated with a significant reduction of severe late side effects.
Abstract:
INTRODUCTION: Electroencephalography (EEG) has a central role in outcome prognostication in subjects with anoxic/hypoxic encephalopathy following a cardiac arrest (CA). Continuous EEG monitoring (cEEG) has been consistently developed and studied; however, its yield compared to repeated standard EEG (sEEG) is unknown. METHODS: We studied a prospective cohort of comatose adults treated with therapeutic hypothermia (TH) after a CA. cEEG data regarding background activity and epileptiform components were compared to two 20-minute sEEG recordings extracted from the cEEG recording (one during TH, and one in early normothermia). RESULTS: In this cohort, 34 recordings were studied. During TH, the agreement between cEEG and sEEG was 97.1% (95% CI: 84.6 - 99.9%) for background discontinuity and reactivity evaluation, while it was 94.1% (95% CI: 80.3 - 99.2%) regarding epileptiform activity. In early normothermia, we did not find any discrepancies. Thus, concordance was very good during TH (kappa = 0.83) and optimal during normothermia (kappa = 1). The median delay between CA and the first EEG reactivity testing was 18 hours (range: 4.75 - 25) for patients with perfect agreement and 10 hours (range: 5.75 - 10.5) for the three patients with discordant findings (P = 0.02, Wilcoxon). CONCLUSION: Standard intermittent EEG has comparable performance to continuous EEG, both for variables important for outcome prognostication (EEG reactivity) and for identification of epileptiform transients, in this relatively small sample of comatose survivors of CA. This finding has an important practical implication, especially for centers where EEG resources are limited.
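The agreement statistics used above (percent agreement and Cohen's kappa) can be reproduced from paired categorical readings. The sketch below uses invented sEEG/cEEG ratings, not the study's data, and the function names are my own:

```python
def percent_agreement(a, b):
    """Observed agreement: fraction of paired readings that match."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_obs = percent_agreement(a, b)
    # Chance agreement estimated from the marginal frequency of each label
    p_exp = sum((a.count(lab) / n) * (b.count(lab) / n) for lab in set(a) | set(b))
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical paired readings (1 = epileptiform activity seen, 0 = not seen)
ceeg = [0, 0, 0, 0, 0, 0, 1, 1, 1, 0]
seeg = [0, 0, 0, 0, 0, 0, 1, 1, 0, 0]
print(percent_agreement(ceeg, seeg))           # 0.9
print(round(cohens_kappa(ceeg, seeg), 3))      # 0.737
```

Note that kappa is well below the raw agreement here: with mostly "no activity" readings, much of the agreement is expected by chance, which is why the abstract reports both measures.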
Abstract:
OBJECTIVE: To evaluate the results of closed and grade I and II open tibial shaft fractures treated by reamed nailing. SUBJECTS AND METHODS: Between 1997 and 2000, 119 patients with tibial shaft fractures were treated with reamed tibial nails. Postoperatively, 96 patients (70 closed and 26 grade I and II open fractures) were followed clinically and radiologically for up to 18 months. The nail was inserted either by a patellar tendon splitting or by a nonsplitting technique, after overreaming by 1.5 mm. Postoperatively, patients with an isolated tibial fracture were mobilized with partial weight bearing on the injured leg for 6 weeks. Patients with associated ankle fractures were allowed to walk with a Sarmiento cast. RESULTS: Six (6.3%) patients developed a compartment syndrome after surgery. In 48 (50%) cases, dynamization of the nail was carried out for delayed union after a mean period of 12 weeks. Overall, a 90.6% union rate was obtained at a mean of 24 weeks, with no difference between closed and open fractures. Two (2.1%) patients with an open grade II fracture developed a deep infection requiring treatment. A 9.4% rate of malunion was observed. Eight (8.3%) patients developed screw failure without clinical consequences. At the last follow-up, 52% of patients with patellar tendon splitting had anterior knee pain, compared to 14% of those without tendon splitting. CONCLUSION: A reamed intramedullary nail is a suitable implant for treating closed as well as grade I and II open tibial shaft fractures.
Abstract:
PURPOSE: To quantify the relationship between bone marrow (BM) response to radiation and radiation dose by using ¹⁸F-labeled fluorodeoxyglucose positron emission tomography ([¹⁸F]FDG-PET) standard uptake values (SUV), and to correlate these findings with hematological toxicity (HT) in cervical cancer (CC) patients treated with chemoradiation therapy (CRT). METHODS AND MATERIALS: Seventeen women with a diagnosis of CC were treated with standard doses of CRT. All patients underwent pre- and post-therapy [¹⁸F]FDG-PET/computed tomography (CT). Hemograms were obtained before and during treatment, 3 months after treatment, and at last follow-up. Pelvic bone was autosegmented as total bone marrow (BMTOT). Active bone marrow (BMACT) was contoured based on SUV greater than the mean SUV of BMTOT. The volumes (V) of each region receiving 10, 20, 30, and 40 Gy (V10, V20, V30, and V40, respectively) were calculated. Metabolic volume histograms and voxel SUV map response graphs were created. Relative changes in SUV before and after therapy were calculated by separating SUV voxels into radiation therapy dose ranges of 5 Gy. The relationships among SUV decrease, radiation dose, and HT were investigated using multiple regression models. RESULTS: Mean relative pre- to post-therapy SUV reductions in BMTOT and BMACT were 27% and 38%, respectively. BMACT volume was significantly reduced after treatment (from 651.5 to 231.6 cm³; P<.0001). BMACT V30 was significantly correlated with the reduction in BMACT SUV (R² = 0.14; P<.001). The reduction in BMACT SUV correlated significantly with the reduction in white blood cells (WBCs) at 3 months post-treatment (R² = 0.27; P=.04) and at last follow-up (R² = 0.25; P=.04). Different dosimetric parameters of BMTOT and BMACT correlated with long-term hematological outcome. CONCLUSIONS: The volumes of BMTOT and BMACT exposed to even relatively low doses of radiation are associated with a decrease in WBC counts following CRT. The loss in proliferative BM SUV uptake translates into low WBC nadirs after treatment. These results suggest the potential of intensity modulated radiation therapy to spare BMTOT in order to reduce long-term hematological toxicity.
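The voxel-level analysis described in the methods (relative SUV change grouped into 5-Gy dose bins) can be sketched as below. The per-voxel triple layout and the bin edges are assumptions for illustration only, not the study's actual data format:

```python
from collections import defaultdict

def mean_relative_suv_change_by_dose(voxels, bin_width=5.0):
    """Average relative SUV reduction, (pre - post) / pre, per dose bin.

    voxels: iterable of (dose_gy, suv_pre, suv_post) triples (hypothetical layout).
    Returns {bin_lower_edge_gy: mean relative reduction}.
    """
    bins = defaultdict(list)
    for dose, pre, post in voxels:
        edge = int(dose // bin_width) * bin_width  # lower edge of the 5-Gy bin
        bins[edge].append((pre - post) / pre)
    return {edge: sum(vals) / len(vals) for edge, vals in sorted(bins.items())}

# Toy voxels: a 25% reduction in the 0-5 Gy bin, 50% in the 5-10 Gy bin
voxels = [(2.0, 4.0, 3.0), (7.0, 4.0, 2.0)]
print(mean_relative_suv_change_by_dose(voxels))  # {0.0: 0.25, 5.0: 0.5}
```

A dose-response curve of this kind is what lets the authors relate a dosimetric parameter such as V30 to the metabolic response of the marrow.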
Abstract:
OBJECTIVES: To evaluate the prevalence of 25-hydroxyvitamin D [25(OH)D] deficiency in HIV-positive patients, a population at risk for osteoporosis. DESIGN: Retrospective assessment of vitamin D levels by season and initiation of combined antiretroviral therapy (cART). METHODS: 25(OH)D was measured in 211 HIV-positive patients: samples were taken before initiation of cART from February to April or from August to October as well as 12 (same season) and 18 months (alternate season) after starting cART. 1,25-Dihydroxyvitamin D [1,25(OH)2D] was measured in a subset of 74 patients. Multivariable analyses included season, sex, age, ethnicity, BMI, intravenous drug use (IDU), renal function, time since HIV diagnosis, previous AIDS, CD4 cell count and cART, in particular nonnucleoside reverse transcriptase inhibitor (NNRTI) and tenofovir (TDF) use. RESULTS: At baseline, median 25(OH)D levels were 37 (interquartile range 20-49) nmol/l in spring and 57 (39-74) nmol/l in the fall; 25(OH)D deficiency less than 30 nmol/l was more prevalent in spring (42%) than in fall (14%), but remained unchanged regardless of cART exposure. In multivariable analysis, 25(OH)D levels were higher in white patients and those with a longer time since HIV diagnosis and lower in springtime measurements and in those with active IDU and NNRTI use. 1-Hydroxylation rates were significantly higher in patients with low 25(OH)D. Hepatitis C seropositivity, previous AIDS and higher CD4 cell counts correlated with lower 1,25(OH)2D levels, whereas BMI and TDF use were associated with higher levels. In TDF-treated patients, higher 1,25(OH)2D correlated with increases in serum alkaline phosphatase. CONCLUSION: Based on the high rate of vitamin D deficiency in HIV-positive patients, systematic screening with consideration of seasonality is warranted. The impact of NNRTIs on 25(OH)D and TDF on 1,25(OH)2D needs further attention.
Abstract:
In patients with venous thromboembolism (VTE), assessment of the risk of fatal recurrent VTE and fatal bleeding during anticoagulation may help to guide the intensity and duration of therapy. We aimed to provide estimates of the case-fatality rate (CFR) of recurrent VTE and major bleeding during anticoagulation in a 'real life' population, and to assess these outcomes according to the initial presentation of VTE and its etiology. The study included 41,826 patients with confirmed VTE from the RIETE registry who received different durations of anticoagulation (mean 7.8 ± 0.6 months). During 27,110 patient-years, the CFR was 12.1% (95% CI, 10.2-14.2) for recurrent VTE and 19.7% (95% CI, 17.4-22.1) for major bleeding. During the first three months of anticoagulant therapy, the CFR of recurrent VTE was 16.1% (95% CI, 13.6-18.9), compared to 2.0% (95% CI, 0-4.2) beyond this period. The CFR of bleeding was 20.2% (95% CI, 17.5-23.1) during the first three months, compared to 18.2% (95% CI, 14.0-23.2) beyond this period. The CFR of recurrent VTE was higher in patients initially presenting with pulmonary embolism (PE) (18.5%; 95% CI, 15.3-22.1) than in those with deep vein thrombosis (DVT) (6.3%; 95% CI, 4.5-8.6), and in patients with provoked VTE (16.3%; 95% CI, 13.6-19.4) than in those with unprovoked VTE (5.5%; 95% CI, 3.5-8.0). In conclusion, the CFR of recurrent VTE decreased over time during anticoagulation, while the CFR of major bleeding remained stable. The CFR of recurrent VTE was higher in patients initially presenting with PE and in those with provoked VTE.
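A case-fatality rate of this kind is simply fatal events divided by total events. A minimal sketch with a normal-approximation (Wald) 95% CI is shown below; the counts are invented for illustration, and the abstract does not state which CI method the registry analysis used:

```python
import math

def case_fatality_rate(fatal, total, z=1.96):
    """CFR as a proportion, with a Wald (normal-approximation) 95% CI."""
    p = fatal / total
    se = math.sqrt(p * (1 - p) / total)  # standard error of a proportion
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical example: 12 fatal outcomes among 100 recurrent VTE events
cfr, lo, hi = case_fatality_rate(12, 100)
print(f"CFR {cfr:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # CFR 12.0% (95% CI 5.6%-18.4%)
```

The Wald interval is a rough sketch; for small event counts an exact or Wilson interval would be preferable.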
Abstract:
From data collected during routine TDM, plasma concentrations of citalopram (CIT) and its metabolites demethylcitalopram (DCIT) and didemethylcitalopram (DDCIT) were measured in 345 plasma samples collected under steady-state conditions. They came from 258 patients treated with usual doses (20-60 mg/d) and from patients medicated with 80-360 mg/d CIT. Most patients had one or several comedications, including other antidepressants, antipsychotics, lithium, anticonvulsants, psychostimulants, and somatic medications. Dose-corrected CIT plasma concentrations (C/D ratio) were 2.51 ± 2.25 ng·mL⁻¹·mg⁻¹ (n = 258; mean ± SD). Patients >65 years had significantly higher dose-corrected CIT plasma concentrations (n = 56; 3.08 ± 1.35 ng·mL⁻¹·mg⁻¹) than younger patients (n = 195; 2.35 ± 2.46 ng·mL⁻¹·mg⁻¹) (P = 0.03). CIT plasma concentrations in the generally recommended dose range were [mean ± SD (median)]: 57 ± 64 (45) ng/mL (10-20 mg/d; n = 64) and 117 ± 95 (91) ng/mL (21-60 mg/d; n = 96). At higher than usual doses, the following CIT concentrations were measured: 61-120 mg/d, 211 ± 103 (190) ng/mL (n = 93); 121-200 mg/d, 339 ± 143 (322) ng/mL (n = 70); 201-280 mg/d, 700 ± 408 (565) ng/mL (n = 18); 281-360 mg/d, 888 ± 620 (616) ng/mL (n = 4). When only one sample per patient is considered (at the highest daily dose in the case of repeated dosages), there is a significant linear correlation (n = 48, r = 0.730; P < 0.001) between daily dose (10-200 mg/d) and CIT plasma concentration. In experiments with dogs, DDCIT was reported to affect the QT interval when present at concentrations >300 ng/mL. In this study, the DDCIT concentration reached 100 ng/mL in a patient treated with 280 mg/d CIT. Twelve other patients treated with 140-320 mg/d CIT had plasma DDCIT concentrations within the range 52-73 ng/mL.
In a subgroup comprising patients treated with ≥160 mg/d CIT who had CIT plasma concentrations ≤300 ng/mL, and patients treated with ≤200 mg/d CIT who had CIT plasma concentrations ≥600 ng/mL, the enantiomers of CIT and DCIT were also analyzed. The highest S-CIT concentration measured in this subgroup was 327 ng/mL, in a patient treated with 140 mg/d CIT; the highest S-CIT concentration overall (632 ng/mL) was measured in a patient treated with 360 mg/d CIT. In conclusion, there is a strong linear correlation between CIT doses and CIT plasma concentrations, well above the usual dose range.
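The dose-corrected concentration (C/D ratio) and the dose-concentration correlation reported above reduce to simple arithmetic. A minimal sketch with invented numbers (not patient data from the study):

```python
import math

def cd_ratio(conc_ng_ml, daily_dose_mg):
    """Dose-corrected plasma concentration, in ng/mL per mg of daily dose."""
    return conc_ng_ml / daily_dose_mg

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

print(cd_ratio(100.0, 40))  # 2.5 ng/mL per mg for 100 ng/mL at 40 mg/d

# Invented dose/concentration pairs loosely following a linear trend
doses = [20, 40, 60, 120, 200]
concs = [45, 91, 140, 211, 339]
print(round(pearson_r(doses, concs), 3))  # close to 1 for a near-linear relation
```

Normalizing by dose is what makes the between-group comparison (older vs. younger patients) meaningful despite their different prescribed doses.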
Abstract:
Endotoxin causes inflammation at the bronchial and alveolar level. The inflammation-induced increase in permeability of the bronchoalveolar epithelial barrier is thought to cause a leakage of pneumoproteins, whose concentrations are therefore expected to increase in the bloodstream. This study aimed to examine the association between occupational exposure to endotoxin and a serum pneumoprotein, surfactant protein A; to look for nonoccupational factors capable of confounding this association; and to examine the relation between surfactant protein A and spirometry. There were 369 control subjects, 325 wastewater workers, and 84 garbage collectors in the study. Exposure to endotoxin was assessed through personal sampling and the Limulus amebocyte lysate assay. Surfactant protein A was determined by an in-house sandwich enzyme-linked immunosorbent assay (ELISA) in 697 subjects. Clinical and smoking histories were ascertained, and spirometry was carried out according to American Thoracic Society criteria. Multiple linear regression was used for statistical analysis. Exposure was fairly high during some tasks in wastewater workers but did not influence surfactant protein A. Surfactant protein A was lower in asthmatics. Interindividual variability was large. No correlation with spirometry was found. Endotoxin has no effect on surfactant protein A at these endotoxin levels, and serum surfactant protein A does not correlate with spirometry. The decreased surfactant protein A secretion in asthmatics requires further study.
Abstract:
INTRODUCTION: Occupational exposure to bioaerosols in wastewater treatment plants (WWTP) and its consequences for workers' health are well documented. Most studies have been devoted to enumerating and identifying cultivable bacteria and fungi, as well as measuring concentrations of airborne endotoxins, as these are the main health-related factors found in WWTP. Surprisingly, very few studies have investigated the presence and concentrations of airborne viruses in WWTP. However, many enteric viruses are present in wastewater and, due to their small size, they can become aerosolized. Two in particular, norovirus and adenovirus, are extremely widespread and are major causes of infectious gastrointestinal disease reported around the world. A third, hepatitis E virus, has an emerging status. GOAL AND METHODS: This study's objectives were to detect and quantify 3 different viruses (adenovirus, norovirus, and hepatitis E virus) in air samples from 31 WWTPs by using quantitative polymerase chain reaction (qPCR), during two different seasons in two consecutive years. RESULTS: Adenovirus was present in 100% of summer WWTP samples and 97% of winter samples. The highest airborne concentration measured was 2.27 × 10⁶ genome equivalents/m³, and concentrations were, on average, higher in summer than in winter. Norovirus was detected in only 3 of the 123 air samples, and hepatitis E virus was not detected. CONCLUSIONS: Concentrations of potentially pathogenic viral particles in WWTP air are non-negligible and could partly explain the work-related gastrointestinal symptoms often reported by employees in this sector.
Abstract:
AIM: To assess the influence of hemoglobin (Hb) levels in locally advanced head and neck cancer (LAHNC) patients treated with surgery and postoperative radiotherapy (PORT). MATERIAL AND METHODS: Pre- and postoperative Hb levels were collected in 79 patients treated with surgery followed by accelerated PORT for LAHNC. Median follow-up was 52 months (range 12-95 months). RESULTS AND DISCUSSION: The 4-year overall survival (OS) rate was 51%. Neither pre- nor postoperative Hb level (<120 or <130 g/l in women or men, respectively) influenced the outcome. However, when the decrease between pre- and postoperative Hb values was taken into account, 4-year OS was significantly higher in patients with an Hb decrease of less than 38 g/l (quartile value) than in those with an Hb decrease of 38 g/l or more (61% versus 16%, P = 0.008). CONCLUSION: A decrease in Hb level of 38 g/l or more after surgery, secondary to blood loss, influences the outcome when postoperative RT is indicated.
Abstract:
Background: The anti-angiogenic drug bevacizumab (Bv) is currently used in the treatment of different malignancies, including breast cancer. Many angiogenesis-associated molecules are found in the circulation of cancer patients. Until now, no prognostic or predictive factors have been identified in breast cancer patients treated with Bv. We present here the first results of the prospective monitoring of 6 angiogenesis-related molecules in the peripheral blood of breast cancer patients treated with a combination of Bv and pegylated liposomal doxorubicin (PLD) in the phase II trial SAKK 24/06. Methods: Patients were treated with PLD (20 mg/m2) and Bv (10 mg/kg) on days 1 and 15 of each 4-week cycle for a maximum of 6 cycles, followed by Bv monotherapy maintenance (10 mg/kg q2 weeks) until progression or severe toxicity. Plasma and serum samples were collected at baseline, after 2 months of therapy, then every 3 months, and at treatment discontinuation. Enzyme-linked immunosorbent assays (Quantikine, R&D Systems; and Reliatech) were used to measure the levels of human vascular endothelial growth factor (hVEGF), placental growth factor (hPlGF), matrix metalloproteinase 9 (hMMP9), and the soluble VEGF receptors hsVEGFR-1, hsVEGFR-2, and hsVEGFR-3. The log-transformed data (to reduce skewness) for each marker were analyzed using an analysis of variance (ANOVA) model to determine whether there was a difference between the means of the subgroups of interest (α = 0.05). The untransformed data were also analyzed in the same manner as a "sensitivity" check. Results: 132 blood samples were collected from 41 of 43 enrolled patients. Baseline levels of the molecules were compared to disease status according to RECIST.
There was a statistically significant difference in the mean of the log-transformed levels of hMMP9 between responders [CR+PR] and patients with PD (p-value = 0.0004, log fold change = 0.7536), and between patients with disease control [CR+PR+SD] and those with PD (p-value < 0.0001, log fold change = 0.81559), with the log-transformed level of hMMP9 being higher in the responder group. The mean of the log-transformed levels of hsVEGFR-1 was statistically significantly different between patients with disease control [CR+PR+SD] and those with PD (p-value = 0.0068, log fold change = -0.6089), with the log-transformed level of hsVEGFR-1 being lower in the responder group. The log-transformed level of hMMP9 at baseline was identified as a significant prognostic factor for progression-free survival (PFS): p-value = 0.0417, hazard ratio (HR) = 0.574, with a corresponding 95% confidence interval of 0.336 - 0.979. No strong correlation was shown between the log-transformed levels of hVEGF, hPlGF, hsVEGFR-2, or hsVEGFR-3 and clinical response or the occurrence of severe toxicity, or between the levels of the different molecules. Conclusions: Our results suggest that the baseline plasma level of the matrix metalloproteinase hMMP9 could predict tumor response and PFS in patients treated with a combination of Bv and PLD. These data justify further investigation in breast cancer patients treated with anti-angiogenic therapy.
Abstract:
OBJECTIVES: Capillary rarefaction is a hallmark of untreated hypertension. Recent data indicate that rarefaction may be reversed by antihypertensive treatment in nondiabetic hypertensive patients. Despite the frequent association of diabetes with hypertension, nothing is known about the capillary density of treated diabetic patients with hypertension. METHODS: We enrolled 21 normotensive healthy, 25 hypertensive-only, and 21 diabetic (type 2) hypertensive subjects. All hypertensive patients were treated with a blocker of the renin-angiotensin system, and a majority had a home blood pressure ≤135/85 mmHg. Capillary density was assessed with videomicroscopy on dorsal finger skin and with laser Doppler imaging on forearm skin (maximal vasodilation elicited by local heating). RESULTS: There was no difference between any of the study groups in either dorsal finger skin capillary density (controls 101 ± 11 capillaries/mm², nondiabetic hypertensive 99 ± 16, diabetic hypertensive 96 ± 18; p > 0.5) or maximal blood flow in forearm skin (controls 666 ± 114 perfusion units, nondiabetic hypertensive 612 ± 126, diabetic hypertensive 620 ± 103; p > 0.5). CONCLUSIONS: Irrespective of the presence or absence of type 2 diabetes, capillary density is normal in hypertensive patients with reasonable blood pressure control achieved with a blocker of the renin-angiotensin system.