Abstract:
Background: Posterior reconstruction (PR) of the rhabdosphincter has been previously described during retropubic radical prostatectomy, and shorter times to return of urinary continence were reported using this technical modification. This technique has also been applied during robot-assisted radical prostatectomy (RARP); however, contradictory results have been reported. Objective: We describe here a modified technique for PR of the rhabdosphincter during RARP and report its impact on early recovery of urinary continence and on cystographic leakage rates. Design, setting, and participants: We analyzed 803 consecutive patients who underwent RARP by a single surgeon over a 12-mo period: 330 without PR and 473 with PR. Surgical procedure: The reconstruction was performed using two 6-in 3-0 Poliglecaprone sutures tied together. The free edge of the remaining Denonvilliers' fascia was identified after prostatectomy and approximated to the posterior aspect of the rhabdosphincter and the posterior median raphe using one arm of the continuous suture. The second layer of the reconstruction was then performed with the other arm of the suture, approximating the posterior lip of the bladder neck and vesicoprostatic muscle to the posterior urethral edge. Measurements: Continence rates were assessed with a self-administered, validated questionnaire (Expanded Prostate Cancer Index Composite) at 1, 4, 12, and 24 wk after catheter removal. Continence was defined as the use of "no absorbent pads." A cystogram was performed in all patients on postoperative day 4 or 5 before catheter removal. Results and limitations: There was no significant difference between the groups with respect to patient age, body mass index, prostate-specific antigen levels, prostate weight, American Urological Association symptom score, estimated blood loss, operative time, number of nerve-sparing procedures, or days with catheter. In the PR group, the continence rates at 1, 4, 12, and 24 wk postoperatively were 28.7%, 51.6%, 91.1%, and 97%, respectively; in the non-PR group, they were 22.7%, 42.7%, 91.8%, and 96.3%, respectively. The modified PR technique resulted in significantly higher continence rates at 1 and 4 wk after catheter removal (p = 0.048 and 0.016, respectively), although the continence rates at 12 and 24 wk were not significantly affected (p = 0.908 and p = 0.741, respectively). The median interval to recovery of continence was also significantly shorter in the PR group (median: 4 wk; 95% confidence interval [CI]: 3.39-4.61) than in the non-PR group (median: 6 wk; 95% CI: 5.18-6.82; log-rank test, p = 0.037). Finally, the incidence of cystographic leaks was lower in the PR group (0.4% vs 2.1%; p = 0.036). Although the patients' baseline characteristics were similar between the groups, the patients were not preoperatively randomized, and unknown confounding factors may have influenced the results. Conclusions: Our modified PR combines the benefits of early recovery of continence reported with the original PR technique with a reinforced watertight closure of the posterior anastomotic wall. A shorter interval to recovery of continence and a lower incidence of cystographic leaks were demonstrated with our PR technique when compared to RARP with no reconstruction. (C) 2010 European Association of Urology. Published by Elsevier B.V. All rights reserved.
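As a hedged illustration of the time-to-event analysis reported above (median weeks to continence and a log-rank comparison), the Python sketch below uses the open-source lifelines package. All durations are synthetic stand-ins, not study data; only the group sizes echo the abstract.

```python
# Illustrative sketch: Kaplan-Meier median time to continence and a log-rank
# test between PR and non-PR groups. Durations (weeks) are synthetic.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
weeks_pr = rng.exponential(scale=5.0, size=473)    # PR group, synthetic
weeks_nopr = rng.exponential(scale=7.5, size=330)  # non-PR group, synthetic
events_pr = np.ones(len(weeks_pr))                 # 1 = continence regained
events_nopr = np.ones(len(weeks_nopr))

km = KaplanMeierFitter()
km.fit(weeks_pr, event_observed=events_pr, label="PR")
print("PR median time to continence (wk):", km.median_survival_time_)

res = logrank_test(weeks_pr, weeks_nopr,
                   event_observed_A=events_pr, event_observed_B=events_nopr)
print("log-rank p =", res.p_value)
```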
Abstract:
Background. The loss of a child is considered the hardest moment in a parent's life. Studies addressing length of survival under pediatric palliative care are rare. The aim of this study was to improve a survival prediction model for children in palliative care, as accurate information positively impacts parent and child preparation for palliative care. Procedure. Sixty-five children referred to a pediatric palliative care team were followed from August 2003 until December 2006. Variables investigated (also included in previous studies) were: diagnosis, home care provider, presence of anemia, and performance status score given by the home care provider. Clinical variables such as symptom number were also used to test the score's predictive ability. Results. The score was pre-validated using the above variables; the number of symptoms at transition to palliative care did not improve the score's predictive ability. The sum of the single scores gives an overall score for each patient, dividing the population into three groups by probability of 60-day survival: Group A 80.0%, Group B 38.0%, and Group C 28.5% (P < 0.001). Conclusion. A pediatric palliative care score based on easily accessible variables was statistically significant in multivariate analysis. Factors that increase the accuracy of life expectancy prediction enable adequate information to be given to patients and families, contributing to therapeutic decision-making. Pediatr Blood Cancer. 2010;55:1167-1171. (C) 2010 Wiley-Liss, Inc.
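A minimal sketch of how such a summed score might stratify patients into the three survival groups; the item scores and cut-offs below are hypothetical placeholders, not the published model.

```python
# Hypothetical scoring sketch: sum single-variable scores and map the total
# to a 60-day survival group. Cut-offs (2 and 4) are invented for illustration.
def overall_score(item_scores):
    """Sum the single scores (e.g. diagnosis, provider, anemia, performance)."""
    return sum(item_scores)

def survival_group(score, cutoffs=(2, 4)):
    if score <= cutoffs[0]:
        return "A"  # ~80.0% 60-day survival in the study
    if score <= cutoffs[1]:
        return "B"  # ~38.0%
    return "C"      # ~28.5%

print(survival_group(overall_score([1, 0, 1, 1])))  # -> "B"
```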
Abstract:
High salt intake is a known cardiovascular risk factor and is associated with cardiac alterations. To better understand this effect, male Wistar rats were fed a normal (NSD: 1.3% NaCl), high 4% (HSD4), or high 8% (HSD8) salt diet from weaning until 18 wk of age. The HSD8 group was subdivided into HSD8, HSD8+HZ (15 mg·kg⁻¹·d⁻¹ hydralazine in the drinking water), and HSD8+LOS (20 mg·kg⁻¹·d⁻¹ losartan in the drinking water) groups. The cardiomyocyte diameter was greater in the HSD4 and HSD8 groups than in the HSD8+LOS and NSD groups. Interstitial fibrosis was greater in the HSD4 and HSD8 groups than in the HSD8+HZ and NSD groups. Hydralazine prevented high blood pressure (BP) and fibrosis, but not cardiomyocyte hypertrophy. Losartan prevented high BP and cardiomyocyte hypertrophy, but not fibrosis. Angiotensin II type 1 receptor (AT1) protein expression in both ventricles was greater in the HSD8 group than in the NSD group. Losartan, but not hydralazine, prevented this effect. Compared with the NSD group, the binding of an AT1 conformation-specific antibody that recognizes the activated form of the receptor was lower in both ventricles in all other groups. Losartan further lowered the binding of the anti-AT1 antibody in both ventricles compared with all other experimental groups. Angiotensin II was greater in both ventricles in all groups compared with the NSD group. Myocardial structural alterations in response to HSD are independent of the effect on BP. Salt-induced cardiomyocyte hypertrophy and interstitial fibrosis are possibly due to different mechanisms. Evidence from the present study suggests that salt-induced AT1 receptor internalization is probably due to angiotensin II binding. J. Nutr. 140: 1742-1751, 2010.
Abstract:
Background: The relation between left ventricular filling velocities determined by Doppler echocardiography and autonomic nervous system function assessed by heart rate variability (HRV) is unclear. The aim of this study was to evaluate the influence of the autonomic nervous system, assessed by time- and frequency-domain indices of HRV, on the Doppler indices of left ventricular diastolic filling velocities in patients without heart disease. Methods: We studied 451 healthy individuals (255 female [56.4%]) with normal blood pressure, electrocardiogram, chest x-ray, and treadmill electrocardiographic exercise stress test results, with a mean age of 43 +/- 12 (range 15-82) years, who underwent transthoracic Doppler echocardiography and 24-hour electrocardiographic ambulatory monitoring. We studied indices of HRV in the time domain (standard deviation [SD] of all normal sinus RR intervals during 24 hours, SD of averaged normal sinus RR intervals for all 5-minute segments, mean of the SD of all normal sinus RR intervals for all 5-minute segments, root-mean-square of the successive normal sinus RR interval difference, and percentage of successive normal sinus RR intervals > 50 ms) and the frequency domain (low frequency, high frequency, very low frequency, low frequency/high frequency ratio) relative to peak flow velocity during the rapid passive filling phase (E), atrial contraction (A), E/A ratio, E-wave deceleration time, and isovolumic relaxation time. Statistical analysis was performed with Pearson correlation and logistic regression. Results: Peak flow velocity during rapid passive filling (E) and atrial contraction (A), E/A ratio, and deceleration time of early mitral inflow did not demonstrate a significant correlation with indices of HRV in the time and frequency domains. We found that the E/A ratio was < 1 in 45 individuals (10%). Individuals with an E/A ratio < 1 had lower indices of HRV in the frequency domain (except low frequency/high frequency) and lower values of the mean of the SD of all normal sinus RR intervals for all 5-minute segments, root-mean-square of the successive normal sinus RR interval difference, and percentage of successive normal sinus RR intervals > 50 ms in the time domain. Logistic regression demonstrated that an E/A ratio < 1 was associated with lower high-frequency power. Conclusion: Individuals with no evidence of heart disease and an E/A ratio < 1 demonstrated a significant decrease in indices of HRV associated with parasympathetic modulation. (J Am Soc Echocardiogr 2010;23:762-5.)
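Three of the time-domain indices listed above (SDNN, RMSSD, pNN50) have standard definitions; the sketch below computes them from a normal sinus RR series in milliseconds. The RR values are illustrative, not patient data, and the 5-minute-segment indices (SDANN, SDNN index) are omitted for brevity.

```python
# Time-domain HRV indices from a list of normal sinus RR intervals (ms).
import numpy as np

def time_domain_hrv(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "SDNN": rr.std(ddof=1),                         # SD of all NN intervals
        "RMSSD": np.sqrt(np.mean(diff ** 2)),           # RMS of successive differences
        "pNN50": 100.0 * np.mean(np.abs(diff) > 50.0),  # % successive diffs > 50 ms
    }

rr = [812, 790, 845, 830, 901, 864, 798, 880]  # illustrative RR series (ms)
print(time_domain_hrv(rr))
```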
Abstract:
Objectives We studied the relationship between changes in body composition and changes in blood pressure levels. Background The mechanisms underlying the frequently observed progression from pre-hypertension to hypertension are poorly understood. Methods We examined 1,145 subjects from a population-based survey at baseline in 1994/1995 and at follow-up in 2004/2005. First, we studied individuals pre-hypertensive at baseline who, during 10 years of follow-up, either had normalized blood pressure (PreNorm, n = 48), persistently had pre-hypertension (PrePre, n = 134), or showed progression to hypertension (PreHyp, n = 183). In parallel, we studied predictors for changes in blood pressure category in individuals hypertensive at baseline (n = 429). Results After 10 years, the PreHyp group was characterized by a marked increase in body weight (+5.71% [95% confidence interval (CI): 4.60% to 6.83%]) that was largely the result of an increase in fat mass (+17.8% [95% CI: 14.5% to 21.0%]). In the PrePre group, both the increases in body weight (+1.95% [95% CI: 0.68% to 3.22%]) and fat mass (+8.09% [95% CI: 4.42% to 11.7%]) were significantly less pronounced than in the PreHyp group (p < 0.001 for both). The PreNorm group showed no significant change in body weight (-1.55% [95% CI: -3.70% to 0.61%]) and fat mass (+0.20% [95% CI: -6.13% to 6.52%], p < 0.05 for both, vs. the PrePre group). Conclusions After 10 years of follow-up, hypertension developed in 50.1% of individuals with pre-hypertension and only 6.76% went from hypertensive to pre-hypertensive blood pressure levels. An increase in body weight and fat mass was a risk factor for the development of sustained hypertension, whereas a decrease was predictive of a decrease in blood pressure. (J Am Coll Cardiol 2010; 56: 65-76) (C) 2010 by the American College of Cardiology Foundation
Abstract:
BACKGROUND: The profile of blood donors changed dramatically in Brazil over the past 20 years, from remunerated to nonremunerated and then from replacement to community donors. Donor demographic data from three major blood centers establish current donation profiles in Brazil, serving as a baseline for future analyses and for tracking longitudinal changes in donor characteristics. STUDY DESIGN AND METHODS: Data were extracted from the blood centers, compiled in a data warehouse, and analyzed. Population data were obtained from the Brazilian census. RESULTS: During 2007 to 2008, there were 615,379 blood donations from 410,423 donors. A total of 426,142 (69.2%) were from repeat (Rpt) donors and 189,237 (30.8%) were from first-time (FT) donors. Twenty percent of FT donors returned to donate in the period. FT donors were more likely to be younger, and Rpt donors were more likely to be community donors. All were predominantly male. Replacement donors still represent 50% of FT and 30% of Rpt donors. The mean percentage of the general population who were donors was approximately 1.2% for the three centers (0.7, 1.5, and 3.1%). Adjusting for the catchment area, the first two were 2.1 and 1.6%. CONCLUSIONS: Donors in the three Brazilian centers tended to be younger, with a higher proportion of males than in the general population. Donation rates were lower than desirable. There were substantial differences in sex, age, and community/replacement status by center. Studies on the safety, donation frequencies, and motivations of donors are in progress to orient efforts to enhance the availability of blood.
Abstract:
Background. Lung transplantation is the procedure of choice in several end-stage lung diseases. Despite improvements in surgical techniques and immunosuppression, early postoperative complications occur frequently. Objective. To evaluate the pleural inflammatory response after surgery. Patients and Methods. Twenty patients aged 18 to 63 years underwent unilateral or bilateral lung transplantation between August 2006 and March 2008. The proinflammatory cytokines interleukin (IL)-1 beta, IL-6, and IL-8 and vascular endothelial growth factor were analyzed in pleural fluid and serum. For cytokine evaluation, 20-mL samples of pleural fluid and blood (right, left, or both chest cavities) were obtained at 6 hours after surgery and daily until removal of the chest tube or for a maximum of 10 days. Data were analyzed using analysis of variance followed by the Holm-Sidak test. Results. All effusions were exudates according to Light's criteria. Pleural fluid cytokine concentrations were highest at 6 hours after surgery. Serum concentrations were lower than those in pleural fluid, and IL-1 beta, IL-6, and IL-8 were undetectable in serum at all time points. Conclusions. There is a peak concentration of inflammatory cytokines in the first 6 hours after transplantation, probably reflecting the effects of surgical manipulation. The decrease observed from postoperative day 1 onward suggests the action of the immunosuppressive agents and a temporal reduction in pleural inflammation.
Abstract:
Aims: We conducted a meta-analysis to evaluate the accuracy of quantitative stress myocardial contrast echocardiography (MCE) in coronary artery disease (CAD). Methods and results: A database search was performed through January 2008. We included studies evaluating the accuracy of quantitative stress MCE for detection of CAD compared with coronary angiography or single-photon emission computed tomography (SPECT) and measuring reserve parameters of A, beta, and A beta. Data from studies were verified and supplemented by the authors of each study. Using random-effects meta-analysis, we estimated weighted mean differences (WMDs), likelihood ratios (LRs), diagnostic odds ratios (DORs), and summary area under the curve (AUC), all with 95% confidence intervals (CIs). Of 1443 studies, 13 including 627 patients (age range, 38-75 years) and comparing MCE with angiography (n = 10), SPECT (n = 1), or both (n = 2) were eligible. WMDs (95% CI) were significantly lower in the CAD group than in the no-CAD group: 0.12 (0.06-0.18) (P < 0.001), 1.38 (1.28-1.52) (P < 0.001), and 1.47 (1.18-1.76) (P < 0.001) for A, beta, and A beta reserves, respectively. Pooled LRs for a positive test were 1.33 (1.13-1.57), 3.76 (2.43-5.80), and 3.64 (2.87-4.78), and LRs for a negative test were 0.68 (0.55-0.83), 0.30 (0.24-0.38), and 0.27 (0.22-0.34) for A, beta, and A beta reserves, respectively. Pooled DORs were 2.09 (1.42-3.07), 15.11 (7.90-28.91), and 14.73 (9.61-22.57), and AUCs were 0.637 (0.594-0.677), 0.851 (0.828-0.872), and 0.859 (0.842-0.750) for A, beta, and A beta reserves, respectively. Conclusion: Evidence supports the use of quantitative MCE as a non-invasive test for detection of CAD. Standardizing MCE quantification analysis and adherence to reporting standards for diagnostic tests could enhance the quality of evidence in this field.
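The pooled DORs and CIs above come from random-effects meta-analysis; a common choice is DerSimonian-Laird pooling on the log scale, sketched below with illustrative per-study values (the paper's exact weighting scheme may differ).

```python
# DerSimonian-Laird random-effects pooling of log diagnostic odds ratios.
# Per-study DORs and standard errors are illustrative, not the studies' data.
import numpy as np

def dersimonian_laird(effects, se):
    y, v = np.asarray(effects, float), np.asarray(se, float) ** 2
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)            # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)     # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se_p = np.sqrt(1.0 / np.sum(w_star))
    return pooled, (pooled - 1.96 * se_p, pooled + 1.96 * se_p)

log_dor = np.log([12.0, 18.5, 9.7, 22.1])       # illustrative study DORs
se = [0.42, 0.51, 0.38, 0.60]
pooled, ci = dersimonian_laird(log_dor, se)
print("pooled DOR:", np.exp(pooled), "95% CI:", tuple(np.exp(ci)))
```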
Abstract:
Studies that have investigated ascorbic acid (AA) concentrations in cord blood have pointed to significant associations with maternal blood AA concentrations, smoking, age, diet, type of delivery, duration of gestation, fetal distress, and birth weight. The aim of the present study was to determine the relationship between cord blood AA concentrations in newborns and maternal characteristics. A total of 117 healthy Brazilian parturients were included in this cross-sectional study. The concentrations of AA in blood were determined by HPLC. Data concerning socio-economic, demographic, obstetric, nutritional and health characteristics of the parturients, including alcohol consumption and smoking habit, were assessed by a standardised questionnaire. A FFQ was used to investigate the intake of foods rich in vitamin C. Cord blood AA concentration was significantly correlated with per capita income (r 0.26; P=0.005), maternal blood AA concentration (r 0.48; P<0.001) and maternal vitamin C-rich food intake score (r 0.36; P<0.001). The linear regression model including maternal AA concentration, alcohol consumption, smoking, parity, vitamin C-rich food intake score and per capita income explained 31.13% of the variation in cord blood AA concentrations in newborns. We recommend further experimental studies to assess the effects of ethanol on placental AA uptake, and epidemiological cohort studies to evaluate in detail the influence of maternal alcohol consumption on cord blood AA concentrations.
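As a sketch of the kind of model behind the "31.13% of the variation" figure, the snippet below fits an ordinary least squares regression with statsmodels and reads off R². The data frame, column names, and effect sizes are fabricated for illustration only.

```python
# Illustrative OLS: cord-blood AA regressed on maternal predictors; R-squared
# is the share of variance explained. All data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 117  # same sample size as the study, data entirely synthetic
df = pd.DataFrame({
    "maternal_aa": rng.normal(45, 12, n),
    "income": rng.lognormal(6, 0.5, n),
    "vitc_score": rng.integers(0, 20, n),
    "smoker": rng.integers(0, 2, n),
})
df["cord_aa"] = (0.5 * df["maternal_aa"] + 0.2 * df["vitc_score"]
                 - 3.0 * df["smoker"] + rng.normal(0, 8, n))

fit = smf.ols("cord_aa ~ maternal_aa + income + vitc_score + smoker", data=df).fit()
print(f"R^2 = {fit.rsquared:.2%}")
```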
Abstract:
Strategies to minimize the immunogenicity and toxicity of murine anti-CD3 antibodies (e.g. OKT3) are of special interest for organ transplantation and for the treatment of autoimmune diseases. In the present work, we developed two humanized anti-CD3 antibodies. These molecules were shown to bind to human CD3, though less efficiently, and to display less mitogenic activity than OKT3. These results prompted us to investigate whether this reduced mitogenic potential was associated with the development of anti-inflammatory properties. Indeed, in peripheral blood mononuclear cells (PBMCs), the humanized antibody versions induced a predominantly anti-inflammatory cytokine profile, in contrast with the pro-inflammatory profile induced by OKT3. Neither OKT3 nor the humanized versions induced the expression of IL-4, IL-2 or TGF-beta. Both humanized antibodies induced significantly lower production of IFN-gamma and IL-5 and slightly higher production of IL-10 than OKT3. This immunomodulatory profile was most evident in the 80-fold higher ratio of IL-10/IFN-gamma production in PBMCs cultured in the presence of the humanized antibodies, compared to those stimulated with OKT3. Furthermore, these humanized anti-CD3 antibodies induced late FOXP3 gene expression, while OKT3 led to a more transient expression of FOXP3. Taken together, our results suggest that these humanized anti-CD3 antibodies may promote the development of T cells with immunoregulatory activity. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Blood pressure (BP) measurement is the basis for the diagnosis and management of arterial hypertension. The aim of this study was to compare BP measurements performed in the office and at home (home blood pressure monitoring, HBPM) in children and adolescents with chronic arterial hypertension. HBPM was performed by the patient or by his/her legal guardian. During a 14-day period, three BP measurements were performed in the morning or in the afternoon (daytime measurement) and in the evening (night-time measurement), with 1-min intervals between measurements, totalling six measurements per day. HBPM values were defined for systolic blood pressure (SBP) and diastolic blood pressure (DBP). HBPM was evaluated in 40 patients (26 boys) with a mean age of 12.1 years (4-18 years). SBP and DBP records were analysed. The mean differences between average home BP and doctor's office BP were 0.6 +/- 14 and 4 +/- 13 mm Hg for SBP and DBP, respectively. Average systolic HBPM (daytime and night-time) did not differ from average office BP, and diastolic HBPM (daytime and night-time) was statistically lower than office BP. Comparison of individual BP measurements over the study period (13 days) by the s.d. of differences shows a significant decline only for DBP values from day 5 onward, with the difference tending to disappear towards the end of the study. Mean daytime and night-time SBP and DBP values remained stable throughout the study period, confirming HBPM as an acceptable methodology for BP evaluation in hypertensive children and adolescents. Journal of Human Hypertension (2009) 23, 464-469; doi:10.1038/jhh.2008.167; published online 12 March 2009
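The "mean difference +/- s.d. of differences" comparison above is a paired, Bland-Altman-style computation; a short sketch follows, with illustrative paired readings rather than study data.

```python
# Home-vs-office BP agreement: mean difference, s.d. of differences, and
# 95% limits of agreement. The five paired readings are illustrative.
import numpy as np

office_sbp = np.array([128, 135, 122, 140, 131], dtype=float)  # mm Hg
home_sbp = np.array([126, 137, 120, 138, 133], dtype=float)    # mm Hg

d = home_sbp - office_sbp
mean_d, sd_d = d.mean(), d.std(ddof=1)
print(f"mean difference: {mean_d:.1f} +/- {sd_d:.1f} mm Hg")
print(f"limits of agreement: {mean_d - 1.96 * sd_d:.1f} to {mean_d + 1.96 * sd_d:.1f}")
```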
Abstract:
The Wisconsin Card Sorting Test (WCST) is the gold standard in the evaluation of executive dysfunction (ED) in patients with temporal lobe epilepsy (TLE). We evaluated 35 children with TLE and 25 healthy controls with the WCST and with a more comprehensive battery. Among the children with TLE, 77.14% showed impairment on the WCST. On other tests (Wechsler Intelligence Scale for Children-Digit Forward, Matching Familiar Figures Test, Trail Making Test, Word Fluency, Finger Windows, and Number-Letter Memory), impairment was demonstrated in 94.29%. The authors concluded that the WCST is a good paradigm to measure executive impairment in children with TLE; however, it may not be enough. Evaluation performed only with the WCST not only underestimated the number of patients with ED, but also missed relevant information regarding the type of ED. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Objective: Biofuel from sugarcane is widely produced in developing countries and is a clean and renewable alternative source of energy. However, sugarcane harvesting is mostly performed after biomass burning. The aim of this study was to evaluate the effects of harvesting after biomass burning on nasal mucociliary clearance and the nasal mucus properties of farm workers. Methods: Twenty-seven sugarcane workers (21-45 years old) were evaluated at the end of two successive time-periods: first at the end of a 6-month harvesting period (harvesting), and then at the end of a 3-month period without harvesting (non-harvesting). Nasal mucociliary clearance was evaluated by the saccharine transit test, and mucus properties were analyzed using in vitro mucus contact angle and mucus transportability by sneeze. Arterial blood pressure, heart rate, respiratory rate, pulse oximetry, body temperature, associated illness, and exhaled carbon monoxide were registered. Results: Data are presented as mean values (95% confidence interval). The multivariate model analysis, adjusted for age, body-mass index, smoking status and years of working in this agricultural practice, showed that harvesting prolonged the saccharine transit test by 7.83 min (1.88-13.78), increased the mucus contact angle by 8.68 degrees (3.18-14.17), and decreased transportability by sneeze by 32.12 mm (-44.83 to -19.42) compared with the non-harvesting period. No significant differences were detected in any of the clinical parameters at either time-period. Conclusion: Sugarcane harvesting after biomass burning negatively affects the first barrier of the respiratory system in farm workers by impairing nasal mucociliary clearance and inducing abnormal mucus properties. (C) 2011 Elsevier Inc. All rights reserved.
Abstract:
Background: Impairment in pulmonary capacity due to pleural effusion compromises daily activity. Removal of fluid improves symptoms, but the impact, especially on exercise capacity, has not been determined. Methods: Twenty-five patients with unilateral pleural effusion documented by chest radiograph were included. The 6-min walk test, modified Borg dyspnea score, FVC, and FEV1 were analyzed before and 48 h after the removal of large pleural effusions. Results: The mean fluid removed was 1,564 +/- 695 mL. After the procedure, values of FVC, FEV1, and 6-min walk distance increased (P<.001), whereas dyspnea decreased (P<.001). Statistical correlations (P<.001) between 6-min walk distance and FVC (r=0.725) and between 6-min walk distance and FEV1 (r=0.661) were observed. Correlations also were observed between the deltas (pre- vs post-thoracentesis) of the 6-min walk test and the percentage of FVC (r=0.450) and of FEV1 (r=0.472) divided by the volume of fluid removed (P<.05). Conclusion: In addition to the improvement in lung function after thoracentesis, the benefits of fluid removal are more evident in situations of exertion, allowing better readaptation of patients to routine activities. CHEST 2011; 139(6):1424-1429
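The r values above are Pearson correlations between post-minus-pre deltas; a minimal sketch with scipy follows, using made-up paired deltas rather than the study's measurements.

```python
# Pearson correlation between the change in 6-min walk distance and the
# change in FVC after thoracentesis. All deltas are made-up examples.
import numpy as np
from scipy.stats import pearsonr

delta_6mwd = np.array([40, 65, 22, 80, 55, 30], dtype=float)  # meters
delta_fvc = np.array([0.30, 0.55, 0.15, 0.70, 0.40, 0.20])    # liters

r, p = pearsonr(delta_6mwd, delta_fvc)
print(f"r = {r:.3f}, p = {p:.4f}")
```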
Abstract:
Background: Although obesity is usually observed in peripheral arterial disease (PAD) patients, the effects of the association between these diseases on walking capacity are not well documented. Objective: The main objectives of this study were to determine the effects of obesity on exercise tolerance and post-exercise hemodynamic recovery in elderly PAD patients. Methods: 46 patients with stable symptoms of intermittent claudication were classified according to their body mass index (BMI) into a normal group (NOR: BMI < 28.0) and an obese or at risk of obesity group (OBE: BMI >= 28.0). All patients performed a progressive graded treadmill test. During exercise, ventilatory responses were evaluated, and pre- and post-exercise ankle and arm blood pressures were measured. Results: Exercise tolerance and oxygen consumption at total walking time were similar between OBE and NOR. However, OBE showed a shorter claudication time (309 +/- 151 vs. 459 +/- 272 s, p = 0.02) with similar oxygen consumption at this time. In addition, OBE presented a longer time for ankle-brachial index recovery after exercise (7.8 +/- 2.8 vs. 6.3 +/- 2.6 min, p = 0.02). Conclusion: Obesity in elderly PAD patients decreased time to claudication and delayed post-exercise hemodynamic recovery. These results suggest that muscle metabolic demand, and not total workload, is responsible for the onset of claudication and maximal exercise tolerance in PAD patients. Moreover, claudication duration might be responsible for the time needed for complete hemodynamic recovery after exercise. Copyright (c) 2008 S. Karger AG, Basel
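Two bedside calculations sit behind the grouping and recovery measure above: BMI against the study's 28.0 cut-off, and the ankle-brachial index (ankle systolic BP over arm systolic BP). A small sketch with illustrative numbers:

```python
# BMI-based NOR/OBE classification (cut-off 28.0, as in the study) and the
# ankle-brachial index. Input values are illustrative.
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def classify(bmi_value, cutoff=28.0):
    return "OBE" if bmi_value >= cutoff else "NOR"

def ankle_brachial_index(ankle_sbp, arm_sbp):
    return ankle_sbp / arm_sbp

b = bmi(82.0, 1.68)                              # ~29.1 -> OBE
print(classify(b), round(b, 1))
print(round(ankle_brachial_index(96, 128), 2))   # 0.75, consistent with PAD
```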