Abstract:
Switchgrass (Panicum virgatum L.) is a perennial grass holding great promise as a biofuel resource. While Michigan’s Upper Peninsula has an appropriate land base and climatic conditions, there is little research exploring the possibilities of switchgrass production there. The overall objectives of this research were to investigate switchgrass establishment at the northern edge of its distribution by: (i) investigating the effects of competition on the germination and establishment of switchgrass through the developmental and competitive characteristics of Cave-in-Rock switchgrass and large crabgrass (Digitaria sanguinalis L.) in Michigan’s Upper Peninsula; and (ii) determining the optimum planting depths and timing for switchgrass in Michigan’s Upper Peninsula. For the competition study, a randomized complete block design was installed in June 2009 at two locations in Michigan’s Upper Peninsula. Four treatments (0, 1, 4, and 8 plants/m²) of crabgrass were planted with one switchgrass plant. There was a significant difference in switchgrass biomass produced in year one as a function of crabgrass weed pressure, but no significant difference between switchgrass biomass produced in year two and previous crabgrass weed pressure. There was also a significant difference between switchgrass biomass produced in years one and two. For the depth and timing study, a completely randomized design was installed at two locations in Michigan’s Upper Peninsula on seven planting dates (three in fall 2009 and four in spring 2010); 25 seeds were planted 2 cm apart along 0.5-m rows at depths of 0.6 cm, 1.3 cm, and 1.9 cm. Emergence and biomass yields were compared by planting date and depth. A greenhouse seeding experiment was established using the same planting depths and parameters as the field study; the number of seedlings was tallied daily for 30 days. There was a significant difference in survivorship between the fall and spring planting dates, with spring being more successful. Among the four spring planting dates, there was a significant difference between May and June in emergence and biomass yield; June planting dates had the highest percent emergence and total survivorship. There was no significant difference among planting depths of 0.6 cm, 1.3 cm, and 1.9 cm. In conclusion, switchgrass showed no sign of a legacy effect of year-one competition on biomass production, but increasing weed pressure had an antagonistic effect on switchgrass biomass yield during the establishment period. When planting switchgrass in Michigan’s Upper Peninsula, it should be done in spring, within the first two weeks of June, at any depth from 0.6 cm to 1.9 cm.
Abstract:
OBJECTIVE: To determine the effects of cognitive-behavioral stress management (CBSM) training on clinical and psychosocial markers in HIV-infected persons. METHODS: A randomized controlled trial in four HIV outpatient clinics of 104 HIV-infected persons taking combination antiretroviral therapy (cART), measuring HIV-1 surrogate markers, adherence to therapy, and well-being 12 months after 12 group sessions of 2-h CBSM training. RESULTS: Intent-to-treat analyses showed no effects on HIV-1 surrogate markers in the CBSM group compared with the control group: HIV-1 RNA < 50 copies/ml in 81.1% [95% confidence interval (CI), 68.0-90.6] and 74.5% (95% CI, 60.4-85.7), respectively (P = 0.34), and mean CD4 cell change from baseline of 53.0 cells/microl (95% CI, 4.1-101.8) and 15.5 cells/microl (95% CI, -34.3 to 65.4), respectively (P = 0.29). Self-reported adherence to therapy did not differ between groups at baseline (P = 0.53) or at 12 months post-intervention (P = 0.47). Significant benefits of CBSM over no intervention were observed in mean change of quality of life scores: physical health 2.9 (95% CI, 0.7-5.1) versus -0.2 (95% CI, -2.1 to 1.8) (P = 0.05); mental health 4.8 (95% CI, 1.8-7.3) versus -0.5 (95% CI, -3.3 to 2.2) (P = 0.02); anxiety -2.1 (95% CI, -3.6 to -1.0) versus 0.3 (95% CI, -0.7 to 1.4) (P = 0.002); and depression -2.1 (95% CI, -3.2 to -0.9) versus 0.02 (95% CI, -1.0 to 1.1) (P = 0.001). Alleviation of depression and anxiety symptoms was most pronounced among participants with high psychological distress at baseline. CONCLUSION: CBSM training of HIV-infected persons taking cART does not improve clinical outcome but has lasting effects on quality of life and psychological well-being.
Abstract:
BACKGROUND: This multicenter phase II study investigated the efficacy and feasibility of preoperative induction chemotherapy followed by chemoradiation and surgery in patients with esophageal carcinoma. PATIENTS AND METHODS: Patients with locally advanced resectable squamous cell carcinoma or adenocarcinoma of the esophagus received induction chemotherapy with cisplatin 75 mg/m² and docetaxel (Taxotere) 75 mg/m² on days 1 and 22, followed by radiotherapy of 45 Gy (25 × 1.8 Gy) with concurrent chemotherapy comprising cisplatin 25 mg/m² and docetaxel 20 mg/m² weekly for 5 weeks, followed by surgery. RESULTS: Sixty-six patients were enrolled at eleven centers and 57 underwent surgery. R0 resection was achieved in 52 patients. Fifteen patients showed complete, 16 nearly complete, and 26 poor pathological remission. Median overall survival was 36.5 months and median event-free survival was 22.8 months. Squamous cell carcinoma and a good documented pathological response were associated with longer survival. Eighty-two percent of all included patients completed neoadjuvant therapy and survived for 30 days after surgery. Grade 3/4 dysphagia and mucositis were infrequent (<9%) during chemoradiation. Five patients (9%) died of surgical complications. CONCLUSIONS: This neoadjuvant, taxane-containing regimen was efficacious and feasible in patients with locally advanced esophageal cancer in a multicenter, community-based setting and represents a suitable backbone for further investigation.
Abstract:
BACKGROUND AND PURPOSE: In order to use a single implant with one treatment plan in fractionated high-dose-rate brachytherapy (HDR-B), applicator position shifts must be corrected prior to each fraction. The authors investigated the use of gold markers for X-ray-based setup and position control between the single fractions. PATIENTS AND METHODS: Caudad-cephalad movement of the applicators prior to each HDR-B fraction was determined on radiographs using two to three gold markers, which had been inserted into the prostate as intraprostatic reference, and one to two radiopaque-labeled reference applicators. 35 prostate cancer patients, treated with HDR-B as monotherapy between 10/2003 and 06/2006 with four fractions of 9.5 Gy each, were analyzed. Toxicity was scored according to the CTCAE score, version 3.0. Median follow-up was 3 years. RESULTS: The mean change of applicator positions compared to baseline varied substantially between HDR-B fractions: 1.4 mm before fraction 1 (range, -4 to 2 mm), -13.1 mm before fraction 2 (range, -36 to 0 mm), -4.1 mm before fraction 3 (range, -21 to 9 mm), and -2.6 mm before fraction 4 (range, -16 to 9 mm). The original position of the applicators could be readjusted easily prior to each fraction in every patient. In 18 patients (51%), the applicators were readjusted by >10 mm at least once; however, acute or late grade ≥2 genitourinary toxicity was not increased in these patients (p = 1.0). CONCLUSION: Caudad position shifts of up to 36 mm were observed. Gold markers represent a valuable tool to ensure setup accuracy and precise dose delivery in fractionated HDR-B monotherapy of prostate cancer.
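The abstract does not spell out the geometry of the measurement; one plausible reading, sketched below, measures each applicator's craniocaudal position relative to the centroid of the gold markers on each film and takes the change from baseline. All names, numbers, and the data layout are hypothetical.

```python
import numpy as np

def applicator_shift(markers_base, applicator_base, markers_frac, applicator_frac):
    """Caudad-cephalad applicator shift (mm) between baseline and a fraction.

    Positions are craniocaudal coordinates (mm) read off the radiographs;
    the gold-marker centroid serves as the intraprostatic reference, so
    patient setup differences between films cancel out.
    """
    offset_base = applicator_base - np.mean(markers_base)
    offset_frac = applicator_frac - np.mean(markers_frac)
    return offset_frac - offset_base  # negative = caudad shift

# Example: an applicator that slipped 13 mm caudad before fraction 2
print(applicator_shift(np.array([102.0, 98.0, 100.0]), 120.0,
                       np.array([101.5, 97.5, 99.5]), 106.5))  # ≈ -13.0
```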
Abstract:
A method using gas chromatography-mass spectrometry (GC-MS) and solid-phase extraction (SPE) was developed for the determination of ajulemic acid (AJA), a non-psychoactive synthetic cannabinoid with interesting therapeutic potential, in human plasma. Using two calibration graphs, the assay was linear from 10 to 750 ng/ml and from 750 to 3000 ng/ml AJA. The intra- and inter-day precision (R.S.D., %), assessed across the linear ranges of the assay, was between 1.5 and 7.0, and between 3.6 and 7.9, respectively. The limit of quantitation (LOQ) was 10 ng/ml. The amount of AJA glucuronide was determined from the difference in AJA concentration before ("free AJA") and after enzymatic hydrolysis ("total AJA"). The method was applied within a clinical study of 21 patients suffering from neuropathic pain with hyperalgesia and allodynia. For example, plasma levels of 599.4 ± 37.2 ng/ml (mean ± R.S.D., n = 9) AJA were obtained for samples taken 2 h after administration of an oral dose of 20 mg AJA. The mean AJA glucuronide concentration at 2 h was 63.8 ± 127.9 ng/ml.
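A minimal sketch of the indirect glucuronide quantitation described above: the conjugated fraction is simply the concentration gained upon enzymatic hydrolysis. The function name is illustrative, and the example values are chosen only to mirror the abstract's means, not taken from an individual sample.

```python
def aja_glucuronide(total_aja_ng_ml: float, free_aja_ng_ml: float) -> float:
    """AJA glucuronide, expressed as AJA equivalents (ng/ml).

    total_aja_ng_ml: AJA measured after enzymatic hydrolysis ("total AJA")
    free_aja_ng_ml:  AJA measured before hydrolysis ("free AJA")
    """
    if total_aja_ng_ml < free_aja_ng_ml:
        raise ValueError("total AJA should not be below free AJA")
    return total_aja_ng_ml - free_aja_ng_ml

# e.g. a sample reading 663.2 ng/ml after and 599.4 ng/ml before hydrolysis
print(aja_glucuronide(663.2, 599.4))  # 63.8 ng/ml of conjugated AJA
```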
Abstract:
Recent studies have indicated that parathyroid hormone-related protein (PTHrP) may have important actions in lactation, affecting the mammary gland as well as calcium metabolism in the newborn and the mother. However, there are as yet no longitudinal studies to support the notion of an endocrine role of this peptide during nursing. We studied a group of 12 nursing mothers, mean age 32 years, after they had been nursing for an average of 7 weeks (B) and again 4 months after stopping nursing (A). It was assumed that changes occurring between A and B correspond to the effect of lactation. Blood was assayed for prolactin (PRL), PTHrP (two-site immunoradiometric assay with a sheep antibody against PTHrP(1-40) and a goat antibody against PTHrP(60-72); detection limit 0.3 pmol/l), intact PTH (iPTH), ionized calcium (Ca2+), 25-hydroxyvitamin D3 (25(OH)D3) and 1,25-dihydroxyvitamin D3 (1,25(OH)2D3), alkaline phosphatase (alkP), as well as creatinine (Cr), protein, phosphorus (P), and total calcium (Ca). Fasting 2-h urine samples were analyzed for Ca excretion (CaE) and the renal phosphate threshold (TmP/GFR). PRL was significantly higher during lactation than after weaning (39 ± 10 vs. 13 ± 9 µg/l; p = 0.018), and so was PTHrP (2.8 ± 0.35 vs. 0.52 ± 0.04 pmol/l; p = 0.002), values during lactation being above the normal limit (1.3 pmol/l) in all 12 mothers. There was a significant correlation between PRL and PTHrP during lactation (r = 0.8, p = 0.002). Whole-blood Ca2+ did not change significantly from A (1.20 ± 0.02 mmol/l) to B (1.22 ± 0.02 mmol/l), whereas total Ca, corrected for protein (2.18 ± 0.02 mmol/l) or uncorrected (2.18 ± 0.02 mmol/l), significantly rose during lactation (2.31 ± 0.02 mmol/l, p = 0.003 and 2.37 ± 0.03 mmol/l, p = 0.002, respectively). Conversely, iPTH decreased during lactation (3.47 ± 0.38 vs. 2.11 ± 0.35 pmol/l, A vs. B, p = 0.02). Serum levels of 25(OH)D3 and 1,25(OH)2D3 did not change significantly from A to B (23 ± 2.3 vs. 24 ± 1.9 ng/ml and 29.5 ± 6.0 vs. 21.9 ± 1.8 pg/ml, respectively). Both TmP/GFR and P were higher during lactation than after weaning (1.15 ± 0.03 vs. 0.86 ± 0.05 mmol/l GF, p = 0.003 and 1.25 ± 0.03 vs. 0.96 ± 0.05 mmol/l, p = 0.002, respectively), as was alkP (74.0 ± 7.1 vs. 52.6 ± 6.9 U/l, p = 0.003). CaE did not differ between A and B (0.015 ± 0.003 vs. 0.017 ± 0.003 mmol/l GF, A vs. B, NS). We conclude that lactation is accompanied by an increase in serum PRL. This is associated with a release of PTHrP into the maternal circulation. A rise in total plasma Ca ensues, probably in part through increased bone turnover, as suggested by the elevation of alkP. PTH secretion falls, with a subsequent rise of TmP/GFR and plasma P despite high plasma levels of PTHrP.
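The abstract does not say how TmP/GFR was derived from the fasting urine samples; a common approach is the Walton-Bijvoet method, sketched below under that assumption. The linear form used here is valid for tubular reabsorption of phosphate (TRP) up to about 0.86; above that, a nomogram correction is usually applied and is omitted here for brevity. The numbers are illustrative, not study data.

```python
def tmp_gfr(serum_p, urine_p, serum_cr, urine_cr):
    """Renal phosphate threshold (TmP/GFR, mmol/l GF), Walton-Bijvoet style.

    serum_p, urine_p:   phosphate (mmol/l) in plasma and urine
    serum_cr, urine_cr: creatinine (same units in both fluids)
    """
    # tubular reabsorption of phosphate
    trp = 1.0 - (urine_p * serum_cr) / (serum_p * urine_cr)
    if trp > 0.86:
        raise ValueError("linear approximation invalid; use the nomogram form")
    return trp * serum_p

# Illustrative inputs only:
print(round(tmp_gfr(serum_p=1.25, urine_p=20.0, serum_cr=0.07, urine_cr=7.0), 2))  # 1.05
```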
Abstract:
Two consecutive in situ studies were conducted to determine the effects of maturity and frost killing of forages (alfalfa and berseem clover) on degradation kinetics and escape protein concentrations. Four maturities (3, 5, 7, and 9 weeks after second harvest) of forages collected from three locations were used to determine the effects of maturity. Four weeks after a killing frost (−2 °C), berseem clover was harvested from the same locations previously sampled. To evaluate maturity, 336 Dacron® bags containing all maturities of either alfalfa or berseem clover were placed into the rumen of two fistulated steers fed alfalfa-grass hay. Frost-killing effects on berseem clover were compared with mature-cut berseem clover by placing Dacron® bags into the rumen of one fistulated steer fed alfalfa hay. Bags were incubated for periods of 0 to 48 hours. With increasing maturity, the proportion of non-degradable protein (NDP) and the rate of crude protein (CP) degradation increased in both forages. While the rate of neutral detergent fiber (NDF) degradation and the proportion of potentially degradable protein (PDP) increased with increasing maturity in alfalfa, in berseem clover the rate of NDF degradation and the PDP proportion decreased and the proportion of water-soluble protein (WSP) increased. The proportion of protein escaping rumen degradation (PEP) was greater in berseem clover than in alfalfa, but was not affected by maturity. Frost killing of mature berseem clover decreased the WSP proportion and increased the PDP proportion compared to mature berseem clover harvested live. Even though ADIN concentration was higher for frost-killed berseem clover, PEP and total escape protein concentration (CEP) were also higher for frost-killed berseem clover than for mature berseem clover harvested live, due to decreases in the rate of ruminal N degradation with frost killing.
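The abstract reports degradation rates from the in situ bag disappearance curves without naming the kinetic model; the first-order Ørskov-McDonald model, p(t) = a + b(1 − e^(−ct)), is the usual choice for such Dacron bag data, so the sketch below assumes it. All data points and the passage rate are fabricated for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def orskov_mcdonald(t, a, b, c):
    """First-order in situ disappearance: a = soluble fraction (%),
    b = potentially degradable fraction (%), c = degradation rate (/h)."""
    return a + b * (1.0 - np.exp(-c * t))

hours = np.array([0, 2, 4, 8, 16, 24, 48], dtype=float)
cp_disappeared = np.array([28, 41, 50, 62, 74, 79, 83], dtype=float)  # fabricated CP disappearance (%)

(a, b, c), _ = curve_fit(orskov_mcdonald, hours, cp_disappeared, p0=(25, 55, 0.08))
k = 0.06  # assumed ruminal passage rate (/h)
effective_degradability = a + b * c / (c + k)
escape_protein = 100.0 - effective_degradability
print(f"a={a:.1f}%, b={b:.1f}%, c={c:.3f}/h, escape protein ≈ {escape_protein:.1f}%")
```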
Abstract:
OBJECTIVES Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa and CD4 count in Malawi and Zambia) could explain this mortality difference. DESIGN Mathematical modelling study based on data from ART programmes. METHODS We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A, all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates, due to worse adherence, when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals and as the percentage of the observed mortality difference explained. RESULTS RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74-1.03) in scenario A, 0.94 (0.77-1.02) with delayed switching (scenario B), and 0.80 (0.44-1.07) when assuming a threefold higher failure rate (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a threefold higher failure rate). Eleven percent was explained by non-HIV-related mortality. CONCLUSIONS VL monitoring reduces mortality moderately, and only when improved adherence and decreased failure rates are assumed.
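A minimal sketch of how a "percent of mortality difference explained" figure can be derived from a modelled relative risk. This is an illustrative reconstruction, not the paper's actual calculation; the scenario-specific inputs below are purely hypothetical and will not reproduce the abstract's 4-32% range.

```python
def pct_difference_explained(mortality_cd4, rr_vl_vs_cd4, observed_difference):
    """Share of an observed mortality gap attributable to VL monitoring.

    mortality_cd4:       modelled 3-year mortality under CD4-only monitoring (%)
    rr_vl_vs_cd4:        modelled relative risk of death, VL vs. CD4 monitoring
    observed_difference: observed mortality gap between settings (percentage points)
    """
    absolute_reduction = mortality_cd4 * (1.0 - rr_vl_vs_cd4)
    return 100.0 * absolute_reduction / observed_difference

# Hypothetical inputs: 10% mortality under CD4 monitoring, RR 0.99,
# over an observed gap of 2.5 percentage points:
print(round(pct_difference_explained(10.0, 0.99, 2.5)))  # 4 (%)
```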
Abstract:
OBJECTIVES The purpose of this study was to assess the occurrence, predictors, and mechanisms of optical coherence tomography (OCT)-detected coronary evaginations following drug-eluting stent (DES) implantation. BACKGROUND Angiographic ectasias and aneurysms in stented segments have been associated with a risk of late stent thrombosis. On OCT, some stented segments show coronary evaginations reminiscent of ectasias. METHODS Evaginations were defined as outward bulges in the luminal contour between struts. They were considered major evaginations (MEs) when extending ≥3 mm along the vessel length, with a depth ≥10% of the stent diameter. A total of 228 patients who had sirolimus (SES)-, paclitaxel-, biolimus-, everolimus (EES)-, or zotarolimus (ZES)-eluting stents implanted in 254 lesions were analysed after 1, 2, or 5 years; serial assessment using OCT and intravascular ultrasound (IVUS) was performed post-intervention and after 1 year in 42 patients. RESULTS Major evaginations occurred frequently at all time points in SES (∼26%) but were rarely seen in EES (3%) and ZES (2%; P = 0.003). Sirolimus-eluting stent implantation was the strongest independent predictor of ME [adjusted OR (95% CI) 9.1 (1.1-77.4), P = 0.008]. Malapposed and uncovered struts were more common in lesions with vs. without ME (77 vs. 25%, P < 0.001, and 95 vs. 20%, P < 0.001, respectively), as was thrombus [49 vs. 14%, OR 7.3 (95% CI 1.7-31.2), P = 0.007]. Post-intervention intra-stent dissection and protrusion of the vessel wall into the lumen were associated with an increased risk of evagination at follow-up [OR (95% CI) 2.9 (1.8-4.9), P < 0.001, and 3.3 (1.6-6.9), P = 0.001, respectively]. In paired IVUS analyses, lesions with ME showed a larger increase in external elastic membrane area (20% area change) than lesions without ME (5% area change, P < 0.001). CONCLUSION OCT-detected MEs are a specific morphological footprint of early-generation SES and are nearly absent in newer-generation ZES and EES. Evaginations appear to be related to vessel injury at baseline; are associated with positive vessel remodelling; and correlate with uncoverage, malapposition, and thrombus at follow-up.
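The ME definition above maps directly onto a threshold rule; a minimal sketch of that classification (function name and units are illustrative, the thresholds are the study's own):

```python
def is_major_evagination(length_mm: float, depth_mm: float, stent_diameter_mm: float) -> bool:
    """Apply the study's ME definition: an outward bulge qualifies as a
    major evagination when it extends >=3 mm along the vessel and its
    depth is >=10% of the stent diameter."""
    return length_mm >= 3.0 and depth_mm >= 0.10 * stent_diameter_mm

# A 4 mm-long, 0.4 mm-deep bulge in a 3.0 mm stent qualifies (0.4 >= 0.3):
print(is_major_evagination(4.0, 0.4, 3.0))  # True
```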
Abstract:
Background Open-irrigated radiofrequency catheter ablation (oiRFA) of atrial fibrillation (AF) imposes a volume load and a risk of pulmonary edema. We sought to assess the effect of volume administration during ablation on left atrial (LA) pressure and B-type natriuretic peptide (BNP). Methods LA pressure was measured via the transseptal sheath at the beginning and end of 44 LA ablation procedures in 42 patients. BNP plasma levels were measured before and after 10 procedures. Results A median of 3,255 mL (interquartile range [IQR], 2,014 mL) of saline was administered during the procedure. During LA ablation, the median fluid balance was +1,438 (IQR, 1,109) mL and LA pressure increased by a median of 3.7 (IQR, 5.9) mm Hg (P < 0.001). LA pressure did not change in the 19 procedures with furosemide administration (median ΔP = −0.3 [IQR, 7.1] mm Hg, P = 0.334). The correlation of LA pressure and fluid balance was weak (rs = 0.383, P = 0.021). BNP decreased in all four procedures starting in AF or atrial tachycardia and converting to sinus rhythm (P = 0.068), and increased in all six procedures starting and finishing in sinus rhythm (P = 0.028). After ablation, symptomatic volume overload responding to diuresis occurred in three patients. Conclusions A substantial intravascular volume load during oiRFA can be absorbed with little change in LA pressure, so LA pressure is not a reliable indicator of the fluid balance. Subsequent redistribution of the volume load imposes a risk after the procedure. Conversion to sinus rhythm may improve the ability to acutely accommodate the volume load.
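A minimal sketch of the reported rank correlation between per-procedure fluid balance and the change in LA pressure, using scipy. The per-procedure values are fabricated; the abstract gives only the summary result (rs = 0.383, P = 0.021).

```python
import numpy as np
from scipy.stats import spearmanr

# Fabricated per-procedure data: net fluid balance (mL) and LA pressure change (mm Hg)
fluid_balance_ml = np.array([600, 900, 1200, 1400, 1500, 1800, 2100, 2500, 2800, 3200])
delta_la_pressure = np.array([1.0, -0.5, 2.5, 4.0, 3.0, 2.0, 6.5, 4.5, 3.5, 8.0])

rs, p_value = spearmanr(fluid_balance_ml, delta_la_pressure)
print(f"Spearman rs = {rs:.3f}, P = {p_value:.3f}")  # study reported rs = 0.383, P = 0.021
```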
Abstract:
We analyzed the species distribution of Candida blood isolates (CBIs), prospectively collected between 2004 and 2009 within FUNGINOS, and compared their antifungal susceptibility according to clinical breakpoints defined by the European Committee on Antimicrobial Susceptibility Testing (EUCAST) in 2013, and by the Clinical and Laboratory Standards Institute (CLSI) in 2008 (old CLSI breakpoints) and 2012 (new CLSI breakpoints). CBIs were tested for susceptibility to fluconazole, voriconazole, and caspofungin by microtitre broth dilution (Sensititre® YeastOne™ test panel). Of 1090 CBIs, 675 (61.9%) were C. albicans, 191 (17.5%) C. glabrata, 64 (5.9%) C. tropicalis, 59 (5.4%) C. parapsilosis, 33 (3%) C. dubliniensis, 22 (2%) C. krusei, and 46 (4.2%) rare Candida species. Independently of the breakpoints applied, C. albicans was almost uniformly (>98%) susceptible to all three antifungal agents. In contrast, the proportions of fluconazole- and voriconazole-susceptible C. tropicalis and fluconazole-susceptible C. parapsilosis were lower according to the EUCAST/new CLSI breakpoints than to the old CLSI breakpoints. For caspofungin, non-susceptibility occurred mainly in C. krusei (63.3%) and C. glabrata (9.4%). Nine isolates (five C. tropicalis, three C. albicans and one C. parapsilosis) were cross-resistant to azoles according to EUCAST breakpoints, compared with three isolates (two C. albicans and one C. tropicalis) according to new, and two (two C. albicans) according to old, CLSI breakpoints. Four species (C. albicans, C. glabrata, C. tropicalis and C. parapsilosis) represented >90% of all CBIs. In vitro resistance to fluconazole, voriconazole and caspofungin was rare among C. albicans, but an increase of non-susceptible isolates was observed among C. tropicalis/C. parapsilosis for the azoles, and among C. glabrata/C. krusei for caspofungin, according to EUCAST and new CLSI breakpoints compared with old CLSI breakpoints.
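The breakpoint comparison above boils down to re-interpreting the same MIC against different S/R thresholds; a minimal sketch of that interpretation step. The breakpoint values below are placeholders, not the actual EUCAST or CLSI values.

```python
def classify_mic(mic_mg_l: float, s_max: float, r_min: float) -> str:
    """Interpret an MIC against a breakpoint pair: susceptible (S) if
    MIC <= s_max, resistant (R) if MIC > r_min, intermediate (I) otherwise."""
    if mic_mg_l <= s_max:
        return "S"
    if mic_mg_l > r_min:
        return "R"
    return "I"

# Placeholder breakpoints showing how the same MIC can change category
# when breakpoints are revised:
mic = 4.0
print(classify_mic(mic, s_max=8.0, r_min=32.0))  # "S" under a permissive standard
print(classify_mic(mic, s_max=2.0, r_min=4.0))   # "I" under a stricter standard
```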
Abstract:
BACKGROUND Enterococci are an important cause of central venous catheter (CVC)-associated bloodstream infections (CA-BSI). It is unclear whether CVC removal is necessary to successfully manage enterococcal CA-BSI. METHODS A 12-month retrospective cohort study of adults with enterococcal CA-BSI was conducted at a tertiary care hospital; clinical, microbiological, and outcome data were collected. RESULTS A total of 111 patients had an enterococcal CA-BSI. The median age was 58.2 years (range 21 to 94 years). There were 45 (40.5%) infections caused by Enterococcus faecalis (among which 10 [22%] were vancomycin resistant), 61 (55%) by Enterococcus faecium (57 [93%] vancomycin resistant) and five (4.5%) by other Enterococcus species. Patients were treated with linezolid (n=51 [46%]), vancomycin (n=37 [33%]), daptomycin (n=11 [10%]), ampicillin (n=2 [2%]) or quinupristin/dalfopristin (n=2 [2%]); seven (6%) patients did not receive adequate enterococcal treatment. Additionally, 24 (22%) patients received adjunctive gentamicin treatment. The CVC was retained in 29 (26.1%) patients. Patients with removed CVCs showed lower rates of in-hospital mortality (15 [18.3%] versus 11 [37.9%]; P=0.03), but similar rates of recurrent bacteremia (nine [11.0%] versus two [7.0%]; P=0.7) and a similar post-BSI length of hospital stay (median [range]: 11.1 [1.7 to 63.1] days versus 9.3 [1.9 to 31.8] days; P=0.3). Catheter retention was an independent predictor of mortality (OR 3.34 [95% CI 1.21 to 9.26]). CONCLUSIONS To the authors' knowledge, the present article describes the largest enterococcal CA-BSI series to date. Mortality was increased among patients whose catheter was retained. Additional prospective studies are necessary to determine the optimal management of enterococcal CA-BSI.
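A minimal sketch of a crude odds ratio with a Woolf 95% CI from the reported counts (11/29 deaths with the catheter retained vs. 15/82 with it removed). Note the abstract's OR of 3.34 is an adjusted estimate from a multivariable model, so the crude value computed here differs.

```python
import math

def odds_ratio_woolf(a, b, c, d):
    """Crude OR and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo, hi = (math.exp(math.log(or_) + z * se_log) for z in (-1.96, 1.96))
    return or_, lo, hi

# Retained CVC: 11 died of 29; removed CVC: 15 died of 82
print(odds_ratio_woolf(11, 29 - 11, 15, 82 - 15))  # crude OR ≈ 2.7 (≈1.1-7.0)
```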
Abstract:
The interplay of language and cognition in children’s development has been subject to research for a long time. The present study followed up on recently reported deleterious effects of articulatory suppression on children’s executive functioning (Fatzer & Roebers, 2012), aiming to provide more empirical evidence on the differential influence of language on executive functioning. In the present study, verbal strategies were induced in three executive functioning tasks. The tasks were linked to the three central executive functioning dimensions of updating (Complex Span task), shifting (Cognitive Flexibility task) and inhibition (Flanker task). It was expected that the effects of the verbal strategy instruction would counter the results of articulatory suppression and thus be strong in the Complex Span task, weak but present in the Cognitive Flexibility task and small or nonexistent in the Flanker task. N = 117 children participated in the study, with n = 39 four-year-olds, n = 38 six-year-olds, and n = 40 nine-year-olds. As expected, results revealed a benefit from induced verbal strategies in the Complex Span and the Cognitive Flexibility task, but not in the Flanker task. The positive effect of strategy instruction declined with increasing age, pointing to more frequent spontaneous and self-initiated use of verbal strategies over the course of development. The effect of strategy instruction in the Cognitive Flexibility task was unexpectedly strong in the light of the only small detrimental effect of articulatory suppression in the preceding study. Implications for language’s involvement in the different executive functioning dimensions and for practice are discussed.
Abstract:
High-resolution quantitative computed tomography (HRQCT)-based analysis of spinal bone density and microstructure, finite element analysis (FEA), and DXA were used to investigate the vertebral bone status of men with glucocorticoid-induced osteoporosis (GIO). DXA of L1–L3 and total hip, QCT of L1–L3, and HRQCT of T12 were available for 73 men (54.6 ± 14.0 years) with GIO. Prevalent vertebral fracture status was evaluated on radiographs using a semi-quantitative (SQ) score (normal = 0 to severe fracture = 3) and the spinal deformity index (SDI) score (sum of the SQ scores of the T4 to L4 vertebrae). Thirty-one (42.4%) subjects had prevalent vertebral fractures. Cortical BMD (Ct.BMD) and thickness (Ct.Th), trabecular BMD (Tb.BMD), apparent trabecular bone volume fraction (app.BV/TV), and apparent trabecular separation (app.Tb.Sp) were analyzed by HRQCT. Stiffness and strength of T12 were computed by HRQCT-based nonlinear FEA for axial compression, anterior bending, and axial torsion. In logistic regressions adjusted for age, glucocorticoid dose, and osteoporosis treatment, Tb.BMD was most closely associated with vertebral fracture status (standardized odds ratio [sOR]: Tb.BMD T12: 4.05 [95% CI 1.8–9.0]; Tb.BMD L1–L3: 3.95 [1.8–8.9]). Among the FEA variables, strength divided by cross-sectional area for axial compression showed the most significant association with spine fracture status (2.56 [1.29–5.07]). SDI was best predicted by a microstructural model using Ct.Th and app.Tb.Sp (r² = 0.57, p < 0.001). Spinal and hip DXA measurements did not show significant associations with fracture status or severity. In this cross-sectional study of males with GIO, QCT- and HRQCT-based measurements and FEA variables were superior to DXA in discriminating between patients of differing prevalent vertebral fracture status. A microstructural model combining aspects of cortical and trabecular bone reflected fracture severity most accurately.
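A standardized odds ratio of the kind reported above is usually the OR per one-standard-deviation increase in the predictor; a minimal sketch under that reading, using statsmodels. The function name, data layout, and adjustment set are illustrative, not the paper's code.

```python
import numpy as np
import statsmodels.api as sm

def standardized_or(predictor, outcome, covariates):
    """OR per 1-SD increase of `predictor` for a binary `outcome`,
    adjusted for `covariates` (n x k array), with a Wald 95% CI."""
    z = (predictor - predictor.mean()) / predictor.std(ddof=1)  # z-score the predictor
    X = sm.add_constant(np.column_stack([z, covariates]))
    fit = sm.Logit(outcome, X).fit(disp=0)
    beta, se = fit.params[1], fit.bse[1]
    return np.exp([beta, beta - 1.96 * se, beta + 1.96 * se])  # OR, lower, upper
```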
Abstract:
BACKGROUND Neoadjuvant chemotherapy for locally advanced gastric cancer leads to a major histopathological response in fewer than 30% of patients. Data on interim endoscopic response assessment do not exist. This exploratory prospective study evaluated early endoscopy after 50% of the chemotherapy as a predictor of later response and prognosis. METHODS Forty-seven consecutive patients were included (45 resected; 33 R0 resections). All patients received endoscopy and CT scans at baseline, after 50% of their chemotherapy (EGD-1, CT-1), and after completion of chemotherapy (EGD-2, CT-2). Interim endoscopic response (EGD-1) was assessed after patients had received 50% (6 weeks) of the planned 12 weeks of neoadjuvant chemotherapy. Post-chemotherapy response was assessed clinically by a combination of CT scan (CT-2) and endoscopy (EGD-2). Histopathological response was determined by a standardized scoring system (Becker criteria). Endoscopic response was defined as a reduction of >75% of the tumor mass. RESULTS Twelve patients were responders at EGD-1 and 13 at EGD-2. Nine patients (19.1%) were clinical responders and 7 patients (15.6%) were histopathological responders after chemotherapy. Specificity, accuracy, and negative predictive value of the interim EGD-1 for subsequent histopathological response were 31/38 (82%), 36/47 (76%), and 31/33 (93%), and for recurrence or death, 28/30 (93.3%), 38/47 (80.9%), and 28/35 (80.0%). Response at EGD-1 was significantly associated with histopathological response (p = 0.010), survival (p < 0.001), and recurrence-free survival (p = 0.009). CONCLUSIONS Interim endoscopy after 6 weeks predicts response and prognosis. Tailoring treatment according to interim endoscopic assessment could therefore be feasible, but the findings of this study should be validated in a larger patient cohort.
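The predictive performance figures above come straight from a 2×2 table of interim endoscopic response vs. eventual histopathological response; a minimal sketch of those computations. The counts are hypothetical, chosen only so that specificity (31/38) and NPV (31/33) match the reported ratios; the abstract does not print the full table.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, accuracy, PPV and NPV from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "ppv":         tp / (tp + fp),
        "npv":         tn / (tn + fn),
    }

# Hypothetical counts: 5 true positives, 7 false positives,
# 2 false negatives, 31 true negatives
print(diagnostic_metrics(tp=5, fp=7, fn=2, tn=31))
```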