999 results for Surf Smart II


Relevance: 20.00%

Abstract:

BACKGROUND AND STUDY AIMS: To summarize the published literature on assessment of the appropriateness of colonoscopy for surveillance after polypectomy and after curative-intent resection of colorectal cancer (CRC), and to report appropriateness criteria developed by an expert panel, the 2008 European Panel on the Appropriateness of Gastrointestinal Endoscopy (EPAGE II).

METHODS: A systematic search of guidelines, systematic reviews, and primary studies regarding the evaluation and management of surveillance colonoscopy after polypectomy and after resection of CRC was performed. The RAND/UCLA Appropriateness Method was applied to develop appropriateness criteria for colonoscopy for these conditions.

RESULTS: Most CRCs arise from adenomatous polyps. The characteristics of removed polyps, especially the distinction between low-risk adenomas (1 or 2, small [< 1 cm], tubular, no high-grade dysplasia) and high-risk adenomas (large [≥ 1 cm], multiple [> 3], high-grade dysplasia, or villous features), have an impact on advanced adenoma recurrence. Most guidelines recommend a 3-year follow-up colonoscopy for high-risk adenomas and a 5-year colonoscopy for low-risk adenomas. Despite the lack of evidence to support or refute any survival benefit of follow-up colonoscopy after curative-intent CRC resection, surveillance colonoscopy is recommended by most guidelines, although the recommended timing of the first surveillance colonoscopy differs. The expert panel considered 56% of the clinical indications for surveillance colonoscopy after polypectomy to be appropriate. For surveillance after CRC resection, it considered colonoscopy appropriate 1 year after resection.

CONCLUSIONS: Colonoscopy is recommended as the first-choice procedure for surveillance after polypectomy by all published guidelines and by the EPAGE II criteria. Despite the limitations of the published studies, colonoscopy is also recommended by most guidelines and by the EPAGE II criteria for surveillance after curative-intent CRC resection.
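
The RAND/UCLA Appropriateness Method applied by the panel reduces, at its core, to a simple rule over expert ratings on a 1-9 scale. Below is a minimal sketch of that rule in Python, assuming a nine-member panel and the commonly used cut-offs and disagreement definition; neither detail is reported in this abstract.

```python
from statistics import median

def classify_indication(ratings):
    """Classify a clinical indication from panel ratings on a 1-9
    appropriateness scale, following the usual RAND/UCLA rule:
    median 7-9 -> appropriate, median 1-3 -> inappropriate,
    otherwise (or in case of disagreement) -> uncertain."""
    # Common disagreement definition: at least one third of the panel
    # rates in the 1-3 range AND at least one third in the 7-9 range.
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= len(ratings) / 3 and high >= len(ratings) / 3:
        return "uncertain"
    med = median(ratings)
    if med >= 7:
        return "appropriate"
    if med <= 3:
        return "inappropriate"
    return "uncertain"

# Hypothetical ratings for a 3-year surveillance colonoscopy after
# removal of a high-risk adenoma:
print(classify_indication([7, 8, 8, 9, 7, 8, 6, 7, 8]))  # appropriate
```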

Relevance: 20.00%

Abstract:

BACKGROUND: Socioeconomic adversity in early life has been hypothesized to "program" a vulnerable phenotype with exaggerated inflammatory responses, thereby increasing the risk of developing type 2 diabetes in adulthood. The aim of this study was to test this hypothesis by assessing the extent to which the association between lifecourse socioeconomic status and type 2 diabetes incidence is explained by chronic inflammation.

METHODS AND FINDINGS: We used data from the British Whitehall II study, a prospective occupational cohort of adults established in 1985. The inflammatory markers C-reactive protein and interleukin-6 were measured repeatedly, and type 2 diabetes incidence (new cases) was monitored over an 18-year follow-up (from 1991-1993 until 2007-2009). The analytical sample consisted of 6,387 non-diabetic participants (1,818 women), of whom 731 (207 women) developed type 2 diabetes over the follow-up. Cumulative exposure to low socioeconomic status from childhood to middle age was associated with an increased risk of developing type 2 diabetes in adulthood (hazard ratio [HR] = 1.96, 95% confidence interval [CI] 1.48-2.58 for a low cumulative lifecourse socioeconomic score, and HR = 1.55, 95% CI 1.26-1.91 for a low-low socioeconomic trajectory). Twenty-five percent of the excess risk associated with cumulative socioeconomic adversity across the lifecourse and 32% of the excess risk associated with a low-low socioeconomic trajectory were attributable to chronically elevated inflammation (95% CIs 16%-58%).

CONCLUSIONS: In the present study, chronic inflammation explained a substantial part of the association between lifecourse socioeconomic disadvantage and type 2 diabetes. Further studies should confirm these findings in population-based samples, as the Whitehall II cohort is not representative of the general population, and examine the extent to which the social inequalities attributable to chronic inflammation are reversible.
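
The "excess risk attributable to inflammation" figures quoted above follow a standard attenuation calculation on the hazard-ratio scale: the share of the excess risk (HR − 1) that disappears after adjustment for the putative mediator. A minimal sketch of the arithmetic; the inflammation-adjusted HR used below is a hypothetical value chosen only to reproduce the reported 25%, as the abstract does not report the adjusted estimates.

```python
def excess_risk_explained(hr_base, hr_adjusted):
    """Percentage of the excess risk (HR - 1) explained by a mediator:
    (HR_base - HR_adjusted) / (HR_base - 1) * 100."""
    return (hr_base - hr_adjusted) / (hr_base - 1.0) * 100.0

# HR = 1.96 for low cumulative lifecourse socioeconomic score (reported);
# 1.72 is a hypothetical CRP/IL-6-adjusted HR, used for illustration only.
print(round(excess_risk_explained(1.96, 1.72)))  # -> 25
```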

Relevance: 20.00%

Abstract:

Aggregating fetal liver cell cultures were tested for their ability to metabolize xenobiotics, using ethoxycoumarin-O-deethylase (ECOD) as a marker of phase I metabolism and glutathione S-transferase (GST) as a marker of phase II reactions. Significant basal ECOD and GST activities, stable over 14 days in culture, were measured. The prototype cytochrome P450 inducers 3-methylcholanthrene (3-MC) and phenobarbital (PB) increased ECOD and GST activities, which reached an optimum 7 days after culturing and then declined. This decline was partially prevented by 1% dimethyl sulfoxide (DMSO) added chronically to the culture medium. DMSO was also found to induce ECOD activity and, to a lesser extent, GST activity. Furthermore, it potentiated the induction of ECOD by PB in a dose-dependent manner. The food-borne carcinogen 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx) is metabolically transformed through a number of pathways in vivo; it was therefore used to examine the metabolic capacity of fetal and adult liver cell aggregates. In non-induced fetal liver cells, metabolism of MeIQx proceeded mainly through N2-conjugation, resulting in formation of the N2-glucuronide and sulfamate conjugates. These metabolites were also found in large amounts in non-induced adult liver cells. Low levels of cytochrome P450-mediated ring-hydroxylated metabolites were detected in both non-induced fetal and adult liver cells. After induction with Aroclor (PCB) or 3-MC, the major pathway was cytochrome P450-dependent ring-hydroxylation, followed by conjugation to beta-glucuronic or sulfuric acid. The presence of the glucuronide conjugate of N-hydroxy-MeIQx, a mutagenic metabolite, suggested induction of cytochrome P450 CYP1A2. The metabolism of MeIQx by liver cell aggregates is very similar to that observed in vivo, suggesting that aggregating liver cell cultures are a useful model for in vitro metabolic studies in toxicology.

Relevance: 20.00%

Abstract:

OBJECTIVES: We have reported previously that 80 mg valsartan and 50 mg losartan provide less receptor blockade than 150 mg irbesartan in normotensive subjects. In this study we investigated the importance of drug dosing in mediating these differences by comparing the AT1-receptor blockade induced by 3 doses of valsartan with that obtained with 3 other antagonists at given doses.

METHODS: Valsartan (80, 160, and 320 mg), 50 mg losartan, 150 mg irbesartan, and 8 mg candesartan were administered to 24 healthy subjects in a randomized, open-label, 3-period crossover study. All doses were given once daily for 8 days. Angiotensin II receptor blockade was assessed with two techniques: the reactive rise in plasma renin activity, and an in vitro radioreceptor binding assay quantifying the displacement of angiotensin II by the blocking agents. Measurements were obtained before and 4 and 24 hours after drug intake on days 1 and 8.

RESULTS: At 4 and 24 hours, valsartan induced a dose-dependent blockade of AT1 receptors. Compared with the other antagonists, 80 mg valsartan and 50 mg losartan had comparable profiles. The 160-mg and 320-mg doses of valsartan blocked AT1 receptors at 4 hours by 80%, similar to the effect of 150 mg irbesartan. At trough, however, the valsartan-induced blockade was slightly less than that obtained with irbesartan. Using plasma renin activity as a marker of receptor blockade, on day 8, 160 mg valsartan was equivalent to 150 mg irbesartan and 8 mg candesartan.

CONCLUSIONS: These results show that the differences in angiotensin II receptor blockade observed with the various AT1 antagonists are explained mainly by differences in dosing. At the 160-mg and 320-mg doses, the effects of valsartan hardly differed from those obtained with the recommended doses of irbesartan and candesartan.
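
Dose-dependent receptor blockade of the kind reported here is conventionally summarized with a sigmoid Emax model, E = Emax · Dⁿ / (ED50ⁿ + Dⁿ). The sketch below is purely illustrative: the ED50 and Hill coefficient are assumptions, not parameters fitted to this study's data.

```python
def emax_blockade(dose_mg, emax=100.0, ed50=80.0, hill=1.0):
    """Sigmoid Emax model: percent receptor blockade as a function of dose."""
    return emax * dose_mg**hill / (ed50**hill + dose_mg**hill)

# Assumed ED50 of 80 mg, for illustration only:
for dose in (80, 160, 320):
    print(f"{dose} mg -> {emax_blockade(dose):.0f}% blockade")
# 80 mg -> 50% blockade, 160 mg -> 67% blockade, 320 mg -> 80% blockade
```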

Relevance: 20.00%

Abstract:

PURPOSE: This randomized phase II trial evaluated two docetaxel-based regimens to determine which would be most promising, according to overall response rate (ORR), for comparison with epirubicin-cisplatin-fluorouracil (ECF) in a phase III trial of first-line therapy for advanced gastric cancer.

PATIENTS AND METHODS: Chemotherapy-naïve patients with measurable unresectable and/or metastatic gastric carcinoma, a performance status ≤ 1, and adequate hematologic, hepatic, and renal function randomly received up to eight 3-weekly cycles of ECF (epirubicin 50 mg/m² on day 1, cisplatin 60 mg/m² on day 1, and fluorouracil [FU] 200 mg/m²/d on days 1 to 21), TC (docetaxel initially 85 mg/m² on day 1 [later reduced to 75 mg/m² as a result of toxicity] and cisplatin 75 mg/m² on day 1), or TCF (TC plus FU 300 mg/m²/d on days 1 to 14). Study objectives included response (primary), survival, toxicity, and quality of life (QOL).

RESULTS: ORR was 25.0% (95% CI, 13% to 41%) for ECF, 18.5% (95% CI, 9% to 34%) for TC, and 36.6% (95% CI, 23% to 53%) for TCF (n = 119). Median overall survival times were 8.3, 11.0, and 10.4 months for ECF, TC, and TCF, respectively. Toxicity was acceptable, with one toxic death (TC arm). Grade 3 or 4 neutropenia occurred in more treatment cycles with docetaxel (TC, 49%; TCF, 57%; ECF, 34%). Global health status/QOL substantially improved with ECF and remained similar to baseline with both docetaxel regimens.

CONCLUSION: Time to response and ORR favor TCF over TC for further evaluation, particularly in the neoadjuvant setting. A trend towards increased myelosuppression and infectious complications with TCF versus TC or ECF was observed.
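
The response rates with 95% CIs reported above are standard exact binomial estimates. A minimal sketch of a Clopper-Pearson interval using SciPy's beta distribution; the responder count (15 of 41 in the TCF arm) is an assumption consistent with the reported 36.6% ORR, not a figure taken from the abstract.

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) two-sided CI for a binomial proportion k/n."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

lo, hi = clopper_pearson(15, 41)  # 15/41 = 36.6%
print(f"ORR 36.6% (95% CI {lo:.0%} to {hi:.0%})")  # close to the reported 23% to 53%
```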