Abstract:
PURPOSE: To use diffusion-tensor (DT) magnetic resonance (MR) imaging in patients with essential tremor who were treated with transcranial MR imaging-guided focused ultrasound lesion inducement to identify the structural connectivity of the ventralis intermedius nucleus of the thalamus and determine how DT imaging changes correlated with tremor changes after lesion inducement. MATERIALS AND METHODS: With institutional review board approval, and with prospective informed consent, 15 patients with medication-refractory essential tremor were enrolled in a HIPAA-compliant pilot study and were treated with transcranial MR imaging-guided focused ultrasound surgery targeting the ventralis intermedius nucleus of the thalamus contralateral to their dominant hand. Fourteen patients were ultimately included. DT MR imaging studies at 3.0 T were performed preoperatively and 24 hours, 1 week, 1 month, and 3 months after the procedure. Fractional anisotropy (FA) maps were calculated from the DT imaging data sets for all time points in all patients. Voxels where FA consistently decreased over time were identified, and FA change in these voxels was correlated with clinical changes in tremor over the same period by using Pearson correlation. RESULTS: Ipsilateral brain structures that showed prespecified negative correlation values of FA over time of -0.5 or less included the pre- and postcentral subcortical white matter in the hand knob area; the region of the corticospinal tract in the centrum semiovale, in the posterior limb of the internal capsule, and in the cerebral peduncle; the thalamus; the region of the red nucleus; the location of the central tegmental tract; and the region of the inferior olive. The contralateral middle cerebellar peduncle and bilateral portions of the superior vermis also showed persistent decrease in FA over time. There was strong correlation between decrease in FA and clinical improvement in hand tremor 3 months after lesion inducement (P < .001). CONCLUSION: DT MR imaging after MR imaging-guided focused ultrasound thalamotomy depicts changes in specific brain structures. The magnitude of the DT imaging changes after thalamic lesion inducement correlates with the degree of clinical improvement in essential tremor.
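The correlation analysis described above reduces to a per-patient Pearson correlation between mean FA decrease in the identified voxels and the change in clinical tremor score. A minimal sketch in Python, with hypothetical placeholder values standing in for the 14 patients' data:

```python
# Minimal sketch of the correlation step described above: per-patient
# mean FA decrease vs. change in clinical tremor score.
# All data values here are hypothetical placeholders.
import numpy as np
from scipy.stats import pearsonr

# One entry per patient (n = 14 in the study); illustrative values only.
fa_decrease = np.array([0.02, 0.05, 0.03, 0.08, 0.04, 0.06, 0.01,
                        0.07, 0.05, 0.03, 0.06, 0.04, 0.02, 0.05])
tremor_improvement = np.array([3, 8, 5, 12, 6, 9, 2,
                               11, 7, 4, 10, 6, 3, 8])

r, p = pearsonr(fa_decrease, tremor_improvement)
print(f"Pearson r = {r:.2f}, P = {p:.4f}")
```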
Abstract:
Metastatic melanoma has a poor prognosis, with high resistance to chemotherapy and radiation. Recently, the anti-CTLA-4 antibody ipilimumab has demonstrated clinical efficacy, being the first agent to significantly prolong the overall survival of inoperable stage III/IV melanoma patients. A major aim of patient immune monitoring is the identification of biomarkers that predict clinical outcome. We studied circulating myeloid-derived suppressor cells (MDSC) in ipilimumab-treated patients to detect alterations in the myeloid cell compartment and possible correlations with clinical outcome. Lin(-) CD14(+) HLA-DR(-) monocytic MDSC were enriched in the peripheral blood of melanoma patients compared to healthy donors (HD). Tumor resection did not significantly alter MDSC frequencies. During ipilimumab treatment, MDSC frequencies did not change significantly compared to baseline levels, although inter-patient differences were large. MDSC frequencies in ipilimumab-treated patients were independent of baseline serum lactate dehydrogenase levels but tended to increase in patients with severe metastatic disease (M1c) compared to patients with metastases in skin or lymph nodes only (M1a), who had frequencies comparable to HD. Interestingly, clinical responders to ipilimumab therapy showed significantly fewer lin(-) CD14(+) HLA-DR(-) cells than non-responders. These data suggest that the frequency of monocytic MDSC may serve as a predictive marker of response, as low frequencies identify patients more likely to benefit from ipilimumab treatment. Prospective clinical trials assessing MDSC frequencies as potential biomarkers are warranted to validate these observations.
Abstract:
BACKGROUND: We retrospectively reviewed the long-term outcomes and late side effects of endometrial cancer (EC) patients treated with different techniques of postoperative radiotherapy (PORT). METHODS: Between 1999 and 2012, 237 patients with EC were treated with PORT. Two-dimensional external beam radiotherapy (2D-EBRT) was used in 69 patients (30%), three-dimensional EBRT (3D-EBRT) in 51 (21%), and intensity-modulated RT (IMRT) with helical Tomotherapy in 47 (20%); all of these patients received a vaginal brachytherapy (VB) boost. Seventy patients (29%) received VB alone. RESULTS: After a median follow-up of 68 months (range, 6-154), overall survival was 75% [95% confidence interval (CI), 69-81], disease-free survival was 72% (95% CI, 66-78), cancer-specific survival was 85% (95% CI, 80-89), and locoregional control was 86% (95% CI, 81-91). The 5-year estimates of grade 3 or higher toxicity and second cancer rates were 0% and 7% (95% CI, 1-13) for VB alone, 6% (95% CI, 1-11) and 0% for IMRT + VB, 9% (95% CI, 1-17) and 5% (95% CI, 1-9) for 3D-EBRT + VB, and 22% (95% CI, 12-32) and 12% (95% CI, 4-20) for 2D-EBRT + VB (P = 0.002 and P = 0.01, respectively). CONCLUSIONS: Pelvic EBRT should be tailored to patients with high-risk EC because the severe late toxicity observed might outweigh the benefits. When EBRT is prescribed for EC, IMRT should be considered, because it was associated with a significant reduction in severe late side effects.
Abstract:
INTRODUCTION: Electroencephalography (EEG) has a central role in outcome prognostication in subjects with anoxic/hypoxic encephalopathy following cardiac arrest (CA). Continuous EEG monitoring (cEEG) is increasingly used and studied; however, its yield compared to repeated standard EEG (sEEG) is unknown. METHODS: We studied a prospective cohort of comatose adults treated with therapeutic hypothermia (TH) after CA. cEEG data regarding background activity and epileptiform components were compared to two 20-minute sEEGs extracted from the cEEG recording (one during TH and one in early normothermia). RESULTS: In this cohort, 34 recordings were studied. During TH, the agreement between cEEG and sEEG was 97.1% (95% CI: 84.6-99.9%) for background discontinuity and reactivity evaluation, and 94.1% (95% CI: 80.3-99.2%) for epileptiform activity. In early normothermia, we did not find any discrepancies. Thus, concordance was very good during TH (kappa = 0.83) and optimal during normothermia (kappa = 1). The median delay between CA and the first EEG reactivity testing was 18 hours (range: 4.75-25) for patients with perfect agreement and 10 hours (range: 5.75-10.5) for the three patients with discordant findings (P = 0.02, Wilcoxon). CONCLUSION: Standard intermittent EEG had performance comparable to continuous EEG, both for variables important for outcome prognostication (EEG reactivity) and for identification of epileptiform transients, in this relatively small sample of comatose survivors of CA. This finding has an important practical implication, especially for centers where EEG resources are limited.
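The agreement measures reported above combine raw percent agreement with Cohen's kappa, which corrects for chance agreement. A minimal sketch, using hypothetical paired cEEG/sEEG labels (the study's actual classifications are not reproduced here):

```python
# Minimal sketch of the agreement analysis: percent agreement and
# Cohen's kappa between paired cEEG and sEEG classifications.
# The paired labels below are hypothetical placeholders.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# One label per recording (e.g., reactive vs. non-reactive background);
# 34 recordings, to match the cohort size, with one disagreement.
ceeg = np.array(["reactive"] * 20 + ["non-reactive"] * 14)
seeg = np.array(["reactive"] * 19 + ["non-reactive"] * 15)

agreement = np.mean(ceeg == seeg) * 100
kappa = cohen_kappa_score(ceeg, seeg)
print(f"Agreement = {agreement:.1f}%, kappa = {kappa:.2f}")
```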
Abstract:
The application of organic residues to soil alters the emission of gases to the atmosphere, and CO2, CH4, and N2O may contribute to increasing the greenhouse effect. This experiment was carried out in a restoration area on a dystrophic Ultisol (PVAd) to quantify greenhouse gas (GHG) emissions from soil under castor bean cultivation treated with sewage sludge (SS) or mineral fertilizer. The following treatments were tested: control without N; FertMin = mineral fertilizer; SS5 = 5 t ha-1 SS (37.5 kg ha-1 N); SS10 = 10 t ha-1 SS (75 kg ha-1 N); and SS20 = 20 t ha-1 SS (150 kg ha-1 N). The amount of sludge was based on the recommended N rate for castor bean (75 kg ha-1), the N content of the SS, and the mineralization fraction of N from SS. Soil gas emissions were measured for 21 days. Sewage sludge and mineral fertilizer altered the CO2, CH4, and N2O fluxes. Soil moisture had no effect on GHG emissions, and the gas fluxes were statistically equivalent after the application of FertMin and of 5 t ha-1 SS. The application of the entire crop N requirement in the form of SS practically doubled the Global Warming Potential (GWP) and the C-equivalent emissions in comparison with the FertMin treatment.
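The Global Warming Potential comparison reported above aggregates the three gas fluxes into a single CO2-equivalent figure by weighting each gas. A minimal sketch, assuming the widely used IPCC AR4 100-year GWP factors (1 for CO2, 25 for CH4, 298 for N2O) and hypothetical cumulative fluxes:

```python
# Minimal sketch of a CO2-equivalent (GWP) calculation for soil GHG
# fluxes. GWP factors are the IPCC AR4 100-year values commonly used
# for this purpose; the cumulative fluxes are hypothetical placeholders.
GWP = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

# Hypothetical cumulative emissions over the measurement period (kg ha-1).
fluxes_fertmin = {"CO2": 900.0, "CH4": 0.4, "N2O": 0.6}
fluxes_ss20 = {"CO2": 1400.0, "CH4": 0.9, "N2O": 1.4}

def co2_equivalent(fluxes):
    """Sum each gas flux weighted by its GWP factor (kg CO2-eq ha-1)."""
    return sum(GWP[gas] * amount for gas, amount in fluxes.items())

for label, fluxes in [("FertMin", fluxes_fertmin), ("SS20", fluxes_ss20)]:
    print(f"{label}: {co2_equivalent(fluxes):.0f} kg CO2-eq ha-1")
```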
Abstract:
OBJECTIVE: To evaluate the results of closed and open grade I and II tibial shaft fractures treated by reamed intramedullary nailing. SUBJECTS AND METHODS: Between 1997 and 2000, 119 patients with tibial shaft fractures were treated with reamed tibial nails. Postoperatively, 96 patients (70 with closed and 26 with grade I and II open fractures) were followed clinically and radiologically for up to 18 months. The nail was inserted either by a patellar tendon-splitting or a non-splitting technique, after overreaming by 1.5 mm. Postoperatively, patients with an isolated tibial fracture were mobilized with partial weight bearing on the injured leg for 6 weeks. Patients with associated ankle fractures were allowed to walk with a Sarmiento cast. RESULTS: Six (6.3%) patients developed a compartment syndrome after surgery. In 48 (50%) cases, dynamization of the nail was carried out after a mean period of 12 weeks for delayed union. Overall, union was obtained in 90.6% at a mean of 24 weeks, with no difference between closed and open fractures. Two (2.1%) patients with a grade II open fracture developed a deep infection requiring treatment. A 9.4% rate of malunion was observed. Eight (8.3%) patients developed screw failure without clinical consequences. At the last follow-up, 52% of patients treated with patellar tendon splitting had anterior knee pain, compared with 14% of those without tendon splitting. CONCLUSION: The reamed intramedullary nail is a suitable implant for treating closed as well as grade I and II open tibial shaft fractures.
Abstract:
Sewage sludge is a by-product of wastewater treatment plants. With treatment and processing, the sludge can be made suitable for rational and environmentally safe use in agriculture. The aim of this study was to assess the influence of different doses of limed sewage sludge (50%) on clay dispersion in soil samples with different textures (clayey and medium). The study was conducted with soil samples collected under native forest from a clayey Red Latosol (Brazilian classification: Latossolo Vermelho distroférrico) in Londrina (PR) and a medium-texture Red-Yellow Latosol (BC: Latossolo Vermelho-Amarelo distrófico) in Jaguapitã (PR). Pots were filled with 3 kg of air-dried fine earth and kept in a greenhouse. The experiment was arranged in a randomized block design with five replications and six treatments: T1, control; and limed sewage sludge (50%) at increasing doses: T2 (3 t ha-1), T3 (6 t ha-1), T4 (12 t ha-1), T5 (24 t ha-1), and T6 (48 t ha-1). The incubation time was 180 days. At the end of this period, the pots were opened and two sub-samples per treatment were collected to determine pH-H2O, pH-KCl (1 mol L-1), organic matter content, water-dispersible clay, ΔpH (pH-KCl - pH-H2O), and the estimated PZC (point of zero charge), PZC = 2 pH-KCl - pH-H2O, as well as the mineralogy of the clay fraction, determined by X-ray diffraction. The results showed no significant difference in the mean values of water-dispersible clay between the control and the other treatments for the two soil samples studied, and ΔpH was the variable that correlated best with water-dispersible clay in both soils.
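The two derived quantities used above follow directly from the paired pH measurements: ΔpH = pH-KCl - pH-H2O and estimated PZC = 2 pH-KCl - pH-H2O. A minimal sketch with hypothetical measured values:

```python
# Minimal sketch of the derived pH quantities defined above:
# delta_pH = pH(KCl) - pH(H2O), and the estimated point of zero charge
# PZC = 2*pH(KCl) - pH(H2O). The measured values are hypothetical.
def delta_ph(ph_kcl, ph_h2o):
    return ph_kcl - ph_h2o

def estimated_pzc(ph_kcl, ph_h2o):
    return 2 * ph_kcl - ph_h2o

ph_h2o, ph_kcl = 5.8, 5.1  # hypothetical paired measurements
print(f"delta pH = {delta_ph(ph_kcl, ph_h2o):+.1f}")  # negative: net negative surface charge
print(f"estimated PZC = {estimated_pzc(ph_kcl, ph_h2o):.1f}")
```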
Abstract:
PURPOSE: To quantify the relationship between bone marrow (BM) response to radiation and radiation dose by using (18)F-labeled fluorodeoxyglucose positron emission tomography ([(18)F]FDG-PET) standard uptake values (SUV), and to correlate these findings with hematological toxicity (HT) in cervical cancer (CC) patients treated with chemoradiation therapy (CRT). METHODS AND MATERIALS: Seventeen women with a diagnosis of CC were treated with standard doses of CRT. All patients underwent pre- and post-therapy [(18)F]FDG-PET/computed tomography (CT). Hemograms were obtained before, during, and 3 months after treatment, and at last follow-up. Pelvic bone was autosegmented as total bone marrow (BMTOT). Active bone marrow (BMACT) was contoured based on SUV greater than the mean SUV of BMTOT. The volumes (V) of each region receiving 10, 20, 30, and 40 Gy (V10, V20, V30, and V40, respectively) were calculated. Metabolic volume histograms and voxel SUV map response graphs were created. Relative changes in SUV before and after therapy were calculated by separating SUV voxels into radiation therapy dose ranges of 5 Gy. The relationships among SUV decrease, radiation dose, and HT were investigated using multiple regression models. RESULTS: Mean relative pre-post-therapy SUV reductions in BMTOT and BMACT were 27% and 38%, respectively. BMACT volume was significantly reduced after treatment (from 651.5 to 231.6 cm(3); P<.0001). BMACT V30 was significantly correlated with the reduction in BMACT SUV (R(2), 0.14; P<.001). The reduction in BMACT SUV correlated significantly with the reduction in white blood cells (WBCs) at 3 months post-treatment (R(2), 0.27; P=.04) and at last follow-up (R(2), 0.25; P=.04). Several dosimetric parameters of BMTOT and BMACT correlated with long-term hematological outcome. CONCLUSIONS: The volumes of BMTOT and BMACT exposed to even relatively low doses of radiation are associated with a decrease in WBC counts following CRT. The loss of proliferative BM SUV uptake translates into low WBC nadirs after treatment. These results suggest the potential of intensity-modulated radiation therapy to spare BMTOT and thereby reduce long-term hematological toxicity.
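The V10-V40 parameters above are threshold volumes: the volume of a structure receiving at least x Gy. A minimal sketch, assuming a hypothetical per-voxel dose array for the segmented bone marrow:

```python
# Minimal sketch of the Vx dosimetric parameters used above: the
# absolute volume (and fraction) of a structure receiving at least
# x Gy. The voxel dose array and voxel volume are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
voxel_doses_gy = rng.uniform(0, 50, size=100_000)  # hypothetical BM doses
voxel_volume_cm3 = 0.008  # e.g., 2 x 2 x 2 mm voxels

for x in (10, 20, 30, 40):
    vx_cm3 = np.sum(voxel_doses_gy >= x) * voxel_volume_cm3
    vx_pct = np.mean(voxel_doses_gy >= x) * 100
    print(f"V{x}: {vx_cm3:.1f} cm3 ({vx_pct:.1f}% of volume)")
```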
Abstract:
OBJECTIVES: To evaluate the prevalence of 25-hydroxyvitamin D [25(OH)D] deficiency in HIV-positive patients, a population at risk for osteoporosis. DESIGN: Retrospective assessment of vitamin D levels by season and by initiation of combination antiretroviral therapy (cART). METHODS: 25(OH)D was measured in 211 HIV-positive patients: samples were taken before initiation of cART, from February to April or from August to October, as well as 12 months (same season) and 18 months (alternate season) after starting cART. 1,25-Dihydroxyvitamin D [1,25(OH)2D] was measured in a subset of 74 patients. Multivariable analyses included season, sex, age, ethnicity, BMI, intravenous drug use (IDU), renal function, time since HIV diagnosis, previous AIDS, CD4 cell count, and cART, in particular nonnucleoside reverse transcriptase inhibitor (NNRTI) and tenofovir (TDF) use. RESULTS: At baseline, median 25(OH)D levels were 37 (interquartile range, 20-49) nmol/l in spring and 57 (39-74) nmol/l in fall; 25(OH)D deficiency (<30 nmol/l) was more prevalent in spring (42%) than in fall (14%) but remained unchanged regardless of cART exposure. In multivariable analysis, 25(OH)D levels were higher in white patients and in those with a longer time since HIV diagnosis, and lower in springtime measurements and in those with active IDU and NNRTI use. 1-Hydroxylation rates were significantly higher in patients with low 25(OH)D. Hepatitis C seropositivity, previous AIDS, and higher CD4 cell counts correlated with lower 1,25(OH)2D levels, whereas BMI and TDF use were associated with higher levels. In TDF-treated patients, higher 1,25(OH)2D correlated with increases in serum alkaline phosphatase. CONCLUSION: Given the high rate of vitamin D deficiency in HIV-positive patients, systematic screening with consideration of seasonality is warranted. The impact of NNRTIs on 25(OH)D and of TDF on 1,25(OH)2D needs further attention.
Abstract:
In patients with venous thromboembolism (VTE), assessment of the risk of fatal recurrent VTE and fatal bleeding during anticoagulation may help to guide the intensity and duration of therapy. We aimed to provide estimates of the case-fatality rate (CFR) of recurrent VTE and major bleeding during anticoagulation in a 'real life' population, and to assess these outcomes according to the initial presentation of VTE and its etiology. The study included 41,826 patients with confirmed VTE from the RIETE registry who received different durations of anticoagulation (mean 7.8 ± 0.6 months). During 27,110 patient-years, the CFR was 12.1% (95% CI, 10.2-14.2) for recurrent VTE and 19.7% (95% CI, 17.4-22.1) for major bleeding. During the first three months of anticoagulant therapy, the CFR of recurrent VTE was 16.1% (95% CI, 13.6-18.9), compared to 2.0% (95% CI, 0-4.2) beyond this period. The CFR of bleeding was 20.2% (95% CI, 17.5-23.1) during the first three months, compared to 18.2% (95% CI, 14.0-23.2) beyond this period. The CFR of recurrent VTE was higher in patients initially presenting with pulmonary embolism (PE) (18.5%; 95% CI, 15.3-22.1) than in those with deep vein thrombosis (DVT) (6.3%; 95% CI, 4.5-8.6), and in patients with provoked VTE (16.3%; 95% CI, 13.6-19.4) than in those with unprovoked VTE (5.5%; 95% CI, 3.5-8.0). In conclusion, the CFR of recurrent VTE decreased over time during anticoagulation, while the CFR of major bleeding remained stable. The CFR of recurrent VTE was higher in patients initially presenting with PE and in those with provoked VTE.
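The case-fatality rate used above is the proportion of events (recurrent VTE or major bleeds) that proved fatal. A minimal sketch with hypothetical counts and a normal-approximation confidence interval (the registry's exact CI method may differ):

```python
# Minimal sketch of a case-fatality rate (CFR) with a normal-
# approximation 95% CI: fatal events as a share of all events.
# The counts below are hypothetical placeholders.
import math

def cfr_with_ci(fatal, total, z=1.96):
    p = fatal / total
    se = math.sqrt(p * (1 - p) / total)
    return 100 * p, 100 * (p - z * se), 100 * (p + z * se)

cfr, lo, hi = cfr_with_ci(fatal=97, total=800)  # e.g., recurrent VTE events
print(f"CFR = {cfr:.1f}% (95% CI, {lo:.1f}-{hi:.1f})")
```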
Abstract:
From data collected during routine therapeutic drug monitoring (TDM), plasma concentrations of citalopram (CIT) and its metabolites demethylcitalopram (DCIT) and didemethylcitalopram (DDCIT) were measured in 345 plasma samples collected under steady-state conditions. They were from 258 patients treated with usual doses (20-60 mg/d) and from patients medicated with 80-360 mg/d CIT. Most patients had one or several comedications, including other antidepressants, antipsychotics, lithium, anticonvulsants, psychostimulants, and somatic medications. Dose-corrected CIT plasma concentrations (C/D ratio) were 2.51 +/- 2.25 ng mL-1 mg-1 (n = 258; mean +/- SD). Patients >65 years had significantly higher dose-corrected CIT plasma concentrations (n = 56; 3.08 +/- 1.35 ng mL-1 mg-1) than younger patients (n = 195; 2.35 +/- 2.46 ng mL-1 mg-1) (P = 0.03). CIT plasma concentrations in the generally recommended dose range were [mean +/- SD (median)]: 57 +/- 64 (45) ng/mL (10-20 mg/d; n = 64) and 117 +/- 95 (91) ng/mL (21-60 mg/d; n = 96). At higher than usual doses, the following CIT concentrations were measured: 61-120 mg/d, 211 +/- 103 (190) ng/mL (n = 93); 121-200 mg/d, 339 +/- 143 (322) ng/mL (n = 70); 201-280 mg/d, 700 +/- 408 (565) ng/mL (n = 18); 281-360 mg/d, 888 +/- 620 (616) ng/mL (n = 4). When only one sample per patient (at the highest daily dose in the case of repeated dosages) is considered, there is a significant linear correlation (n = 48; r = 0.730; P < 0.001) between daily dose (10-200 mg/d) and CIT plasma concentration. In experiments with dogs, DDCIT was reported to affect the QT interval when present at concentrations >300 ng/mL. In this study, the DDCIT concentration reached 100 ng/mL in a patient treated with 280 mg/d CIT. Twelve other patients treated with 140-320 mg/d CIT had plasma DDCIT concentrations within the range 52-73 ng/mL. In a subgroup comprising patients treated with ≥160 mg/d CIT whose CIT plasma concentrations were ≤300 ng/mL, and patients treated with ≤200 mg/d CIT whose CIT plasma concentrations were ≥600 ng/mL, the enantiomers of CIT and DCIT were also analyzed. The highest S-CIT concentration measured in this subgroup was 327 ng/mL in a patient treated with 140 mg/d CIT, but the highest S-CIT concentration overall (632 ng/mL) was measured in a patient treated with 360 mg/d CIT. In conclusion, CIT plasma concentrations correlate linearly with CIT doses, even well above the usual dose range.
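The dose-corrected concentration (C/D ratio) used throughout is simply the plasma concentration divided by the daily dose, yielding ng mL-1 mg-1. A minimal sketch with hypothetical values:

```python
# Minimal sketch of the dose-corrected concentration (C/D ratio):
# plasma concentration (ng/mL) divided by daily dose (mg/d), giving
# ng mL-1 mg-1. The sample values are hypothetical placeholders.
def cd_ratio(concentration_ng_ml, dose_mg_per_day):
    return concentration_ng_ml / dose_mg_per_day

# e.g., 91 ng/mL at 40 mg/d -> 2.28 ng mL-1 mg-1, close to the
# reported mean of 2.51 ng mL-1 mg-1.
print(f"C/D = {cd_ratio(91, 40):.2f} ng mL-1 mg-1")
```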
Abstract:
Synthetic root exudates were formulated based on the organic acid composition of root exudates derived from the rhizosphere of aseptically grown corn plants, the pH of the rhizosphere, and the background chemical matrices of the soil solutions. The synthetic root exudates, which mimic the chemical conditions of the rhizosphere environment where soil-borne metals are dissolved and absorbed by plants, were used to extract metals from sewage-sludge-treated soils in 16 successive extractions. The concentrations of Zn, Cd, Ni, Cr, and Cu in the sludge-treated soil were 71.74, 0.21, 15.90, 58.12, and 37.44 mg kg-1, respectively. The synthetic root exudates consisted of acetic, butyric, glutaric, lactic, maleic, propionic, pyruvic, succinic, tartaric, and valeric acids, with the organic acid mixtures at concentrations of 0.05 and 0.1 mol L-1 -COOH. The trace elements removed by the successive extractions may be considered representative of the availability of these metals to plants in these soils. The chemical speciation of the metals in the liquid phase was calculated; the results showed that metals in the sludge-treated soils were dissolved and formed soluble complexes with the different organic acid-based root exudates. The most reactive organic acid ligands were lactate, maleate, tartrate, and acetate. The inorganic ligands chloride and sulfate played insignificant roles in metal dissolution. Except for Cd, free ions did not represent an important chemical species of the metals in the soil rhizosphere. As different metals formed soluble complexes with different ligands in the rhizosphere, no extractant based on a single reagent would be able to recover all of the potentially plant-available metals from soils; the root exudate-derived organic acid mixtures tested in this study may be better suited to recovering potentially plant-available metals than conventional extractants.
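The speciation calculation above solves metal-ligand equilibria across many ligands simultaneously. As a single-ligand illustration only (not the multicomponent model used in the study), for a 1:1 complex M + L <-> ML with stability constant K = [ML]/([M][L]), the fraction of metal bound is K[L]/(1 + K[L]) at a given free-ligand concentration. A minimal sketch with hypothetical constants:

```python
# Minimal single-ligand illustration of metal speciation: the bound
# fraction of a metal in a 1:1 complex as a function of the stability
# constant K and the free-ligand concentration. This is a toy version
# of the multicomponent speciation model used in the study; the K and
# ligand values are hypothetical placeholders.
def bound_fraction(log_k, free_ligand_mol_l):
    k = 10 ** log_k
    return k * free_ligand_mol_l / (1 + k * free_ligand_mol_l)

# e.g., weak organic-acid complexes at millimolar ligand levels
for log_k in (1.5, 2.5, 3.5):
    f = bound_fraction(log_k, free_ligand_mol_l=1e-3)
    print(f"log K = {log_k}: {100 * f:.1f}% of metal complexed")
```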
Abstract:
Studies on sewage sludge (SS) have confirmed the potential of using this waste as a fertilizer and/or soil conditioner in crop production areas. Despite restrictions regarding the levels of potentially toxic elements (PTE) and pathogens, it is believed that properly treated SS with low PTE levels, applied to soil at adequate rates, may improve the soil chemical and microbiological properties. This study consisted of a long-term field experiment conducted on a Typic Haplorthox (eutroferric Red Latosol) treated with SS for seven successive years of maize production, to evaluate changes in the soil chemical and microbiological properties. The treatments consisted of two SS rates (a single and a double dose of the crop N requirement) and a mineral fertilizer treatment. Soil was sampled in the 0-0.20 m layer and analyzed for chemical properties (organic C, pH, P, K, Ca, Mg, CEC, B, Cu, Fe, Mn, Zn, Cd, Ni, and Pb) and microbiological properties (basal respiration, microbial biomass activity, microbial biomass C, metabolic quotient, microbial quotient, and protease and dehydrogenase enzyme activities). Successive SS applications to soil increased the macro- and micronutrient availability, but the highest SS dose reduced the soil pH significantly, indicating a need for periodic corrections. The SS treatments also affected soil microbial activity and biomass negatively. There were no significant differences among treatments for maize grain yield. After seven annual applications of the recommended sludge rate, the heavy metal levels in the soil had not reached toxic levels.