119 results for log-convexity
Abstract:
The influence of inulin, oligofructose and oligosaccharides from honey, combined in different proportions, on the consumers' sensory acceptance, probiotic viable count and fructan content of novel potentially synbiotic petit-suisse cheeses was investigated. Probiotic populations varied from 7.20 up to 7.69 log CFU g⁻¹ (Bifidobacterium animalis subsp. lactis) and from 6.08 up to 6.99 log CFU g⁻¹ (Lactobacillus acidophilus). The highest fructan contents were achieved by the cheese trials containing oligofructose and/or inulin (above 8.90 g/100 g). The control trial showed the lowest mean acceptance (6.63) after 28 days of refrigerated storage, whereas the highest acceptance (7.43) was observed for the trial containing 10 g/100 g oligofructose. Acceptance increased significantly during storage (P < 0.05) only for cheeses supplemented with oligofructose and/or inulin. Cheeses containing honey did not perform as well as the cheeses with added inulin and/or oligofructose, and the best synbiotic petit-suisse cheese in terms of sensory and technological functional features was the one containing oligofructose and inulin combined, which encourages its commercial use. (c) 2007 Swiss Society of Food Science and Technology. Published by Elsevier Ltd. All rights reserved.
Abstract:
Minimally processed leafy vegetables are ready-to-eat (RTE) products that are very attractive to consumers looking for healthy and convenient meals. However, the microbiological safety of these foods is of special concern due to the absence of lethal treatments during processing. In the present study, indicator microorganisms, Listeria spp. and Salmonella spp. were determined for 162 samples of minimally processed leafy vegetables commercialized in Brazil. Psychrotrophic aerobic bacterial populations >5 log CFU/g were found in 96.7% of the samples, while total and thermotolerant coliforms were detected in 132 (81.5%) and 107 (66%) of the samples analyzed, respectively. Escherichia coli was present in 86 (53.1%) samples, and Listeria spp. and Salmonella spp. were detected in 6 (3.7%) and 2 (1.2%) samples, respectively. These results indicate the need to implement quality programs in the production chain of RTE vegetables to improve shelf life and microbiological safety. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Introduction: Whole blood is used for diagnosis of lead exposure. A non-invasive method to obtain samples for the biomonitoring of lead contamination has become a necessity. This study 1) compares the lead content in whole saliva samples (Pb-saliva) of children from a city with no reported lead contamination (Ribeirao Preto, Sao Paulo State, Brazil) and children of a region notoriously contaminated with lead (Bauru, Sao Paulo State, Brazil), and 2) correlates Pb-saliva with the lead content in enamel microbiopsy samples (Pb-enamel) in these two populations. Methods: From a population of our previous study, which had included 247 children (4 to 6 years old) from Ribeirao Preto and 26 children from Bauru, Pb-saliva was analyzed in 125 children from Ribeirao Preto and 19 children from Bauru by inductively coupled plasma mass spectrometry (ICP-MS). To correlate Pb-saliva with Pb-enamel, we used the Pb-enamel data obtained in our previous study. The Mann-Whitney test was employed to compare the Pb-saliva data of the two cities. Pb-saliva and Pb-enamel values were then log10 transformed to normalize the data, and Pb-saliva and Pb-enamel were correlated using Pearson's correlation coefficient. Results: Median Pb-saliva in the Ribeirao Preto population (1.64 µg/L) and the Bauru population (5.85 µg/L) differed significantly (p < 0.0001). Pearson's correlation coefficient for log10 Pb-saliva versus log10 Pb-enamel was 0.15 (p = 0.08) for Ribeirao Preto and 0.38 (p = 0.11) for Bauru. Conclusions: A clear relationship between Pb-saliva and environmental contamination by lead is shown. Further studies on Pb-saliva should be undertaken to elucidate the usefulness of saliva as a biomarker of lead exposure, particularly in children. (C) 2008 Elsevier B.V. All rights reserved.
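The statistical workflow described in this abstract (Mann-Whitney comparison of Pb-saliva between the two cities, then Pearson correlation of log10-transformed Pb-saliva and Pb-enamel) can be sketched as follows; the arrays are hypothetical placeholders, not the study data:

```python
# Minimal sketch of the analysis described above. Hypothetical values only.
import numpy as np
from scipy.stats import mannwhitneyu, pearsonr

pb_saliva_city_a = np.array([1.2, 1.6, 2.0, 1.5, 1.8])   # µg/L, hypothetical
pb_saliva_city_b = np.array([4.9, 6.1, 5.7, 6.5, 5.3])   # µg/L, hypothetical

# Non-parametric comparison of the two cities
u_stat, p_city = mannwhitneyu(pb_saliva_city_a, pb_saliva_city_b,
                              alternative="two-sided")

# Correlation of log10-transformed saliva and enamel lead within one city
pb_saliva = np.array([1.2, 1.6, 2.0, 1.5, 1.8])           # µg/L, hypothetical
pb_enamel = np.array([80.0, 95.0, 120.0, 88.0, 101.0])    # hypothetical units
r, p_corr = pearsonr(np.log10(pb_saliva), np.log10(pb_enamel))

print(f"Mann-Whitney p = {p_city:.3f}; Pearson r = {r:.2f} (p = {p_corr:.3f})")
```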
Abstract:
Listeria monocytogenes is of particular concern for the food industry due to its psychrotolerant and ubiquitous nature. In this work, the ability of L. monocytogenes culturable cells to adhere to stainless steel coupons was studied in co-culture with the bacteriocin-producing food isolate Lactobacillus sakei 1, as well as in the presence of the cell-free neutralized supernatant of L. sakei 1 (CFNS-S1) containing sakacin 1. Results were compared with counts obtained using a non-bacteriocin-producing strain (L. sakei ATCC 15521) and its bacteriocin-free supernatant (CFNS-SA). Culturable adherent L. monocytogenes and lactobacilli cells were enumerated on PALCAM and MRS agars, respectively, at 3-h intervals for up to 12 h and after 24 and 48 h of incubation. Bacteriocin activity was evaluated by the critical dilution method. The number of adhered L. monocytogenes cells in pure culture increased from 3.8 log CFU/cm² after 6 h of incubation to 5.3 log CFU/cm² at 48 h. Co-culture with L. sakei 1 decreased the number of adhered L. monocytogenes cells (P < 0.001) at all sampling times, with counts lower than 3.0 log CFU/cm². The CFNS-S1 also led to a significant and similar reduction in culturable adhered L. monocytogenes counts for up to 24 h of incubation; however, after 48 h of incubation, regrowth of adhered L. monocytogenes cells was observed, likely due to the lack of competition for nutrients. L. sakei ATCC 15521 and its supernatant (CFNS-SA) did not reduce the number of adhered L. monocytogenes cells on the stainless steel surface, and from 6 h of incubation onward, listerial counts were between 4.3 and 4.5 log CFU/cm². These results indicate that L. sakei 1 and its bacteriocin sakacin 1 may be useful to inhibit the early stages of L. monocytogenes adherence to abiotic surfaces. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Iron and oxidative stress have a regulatory interplay. During the oxidative burst, phagocytic cells produce free radicals such as hypochlorous acid (HOCl). Nevertheless, few studies have evaluated the effect of either iron deficiency anemia (IDA) or anemia of chronic disease (ACD) on phagocyte function in the elderly. The aim of the present study was to determine the oxidative burst, phagocytosis, and the production of nitric oxide (•NO) and HOCl, reactive species produced by monocytes and neutrophils, in elderly subjects with ACD or IDA. Iron status was determined by soluble transferrin receptor, serum ferritin, and the soluble transferrin receptor/log ferritin (TfR-F) index. The study comprised 39 patients aged over 60 (28 women and 11 men) recruited from the Brazilian Public Health System. Oxidative burst fluorescence intensity per neutrophil in the IDA group and HOCl generation in both the ACD and IDA groups were lower (p < 0.05). The percentages of neutrophils and monocytes expressing phagocytosis in the ACD group were higher (p < 0.05). There was an overproduction of •NO by monocytes, whereas HOCl generation appeared to be lower. Phagocytosis, oxidative burst, and •NO and HOCl production are involved in iron metabolism regulation in elderly patients with ACD and IDA.
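The TfR-F index named above is conventionally computed as soluble transferrin receptor divided by the base-10 logarithm of serum ferritin; a minimal sketch under that assumption (the example values and units are illustrative, not data from this study):

```python
import math

def tfr_f_index(stfr_mg_per_l: float, ferritin_ug_per_l: float) -> float:
    """Soluble transferrin receptor / log10(ferritin) index (TfR-F index)."""
    return stfr_mg_per_l / math.log10(ferritin_ug_per_l)

# Illustrative values only (not data from the study)
print(tfr_f_index(3.2, 25.0))  # a higher index points toward iron-deficient erythropoiesis
```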
Abstract:
The supervised pattern recognition methods K-Nearest Neighbors (KNN), stepwise discriminant analysis (SDA), and soft independent modelling of class analogy (SIMCA) were employed in this work with the aim of investigating the relationship between the molecular structure of 27 cannabinoid compounds and their analgesic activity. Previous analyses using two unsupervised pattern recognition methods (PCA, principal component analysis, and HCA, hierarchical cluster analysis) had been performed, and five descriptors were selected as the most relevant for the analgesic activity of the compounds studied: R3 (charge density on the substituent at position C3), Q1 (charge on atom C1), A (surface area), log P (logarithm of the partition coefficient) and MR (molecular refractivity). The supervised pattern recognition methods (SDA, KNN, and SIMCA) were employed in order to construct a reliable model able to predict the analgesic activity of new cannabinoid compounds and to validate our previous study. The results obtained using the SDA, KNN, and SIMCA methods agree perfectly with our previous model. Comparing the SDA, KNN, and SIMCA results with the PCA and HCA ones, we observed that all the multivariate statistical methods classified the cannabinoid compounds studied into three groups in exactly the same way: active, moderately active, and inactive.
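As an illustration of the KNN step, a hedged sketch with scikit-learn is shown below; the descriptor matrix (columns R3, Q1, A, log P, MR) and activity labels are hypothetical placeholders, not the published data set, and leave-one-out cross-validation is assumed only because the data set is small:

```python
# Sketch of KNN classification of compounds from molecular descriptors.
# The descriptor values and labels are hypothetical, not the study's data.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Columns: R3, Q1, A (surface area), log P, MR -- hypothetical values
X = np.array([
    [-0.12,  0.08, 310.0, 6.1, 95.0],
    [-0.10,  0.05, 295.0, 5.8, 92.0],
    [-0.03,  0.01, 250.0, 4.2, 80.0],
    [-0.02,  0.02, 245.0, 4.0, 78.0],
    [ 0.04, -0.01, 210.0, 3.1, 70.0],
    [ 0.05, -0.02, 205.0, 3.0, 69.0],
])
y = np.array(["active", "active", "moderate", "moderate", "inactive", "inactive"])

# Autoscaling is the usual preprocessing before distance-based methods like KNN
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))

# Leave-one-out cross-validation is common for small QSAR data sets
scores = cross_val_score(model, X, y, cv=LeaveOneOut())
print(f"LOO accuracy: {scores.mean():.2f}")

model.fit(X, y)
print(model.predict([[-0.08, 0.04, 280.0, 5.2, 88.0]]))  # hypothetical new compound
```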
Abstract:
The present study aimed to determine the richness, occurrence constancy, reproductive modes, pattern of abundance distribution, and season of vocalization, and to test correlations between climatic variables and the vocalization activity of anurans in a region of the Pampa Biome, Santa Maria, Rio Grande do Sul State. From November 2001 to October 2002, monthly collections were carried out using the 'survey at breeding site' method and examination of specimens kept in the Colecao Herpetologica do Setor de Zoologia da Universidade Federal de Santa Maria (ZUFSM). The occurrence of 25 species of anurans was recorded. The anurofauna recorded represents 30% of the species known to occur in Rio Grande do Sul, and comprises species generally associated with grasslands in this state and neighboring countries. Four reproductive modes were recorded: mode 1 (14 species; 58.3%), modes 11 and 30 (9 species; 37.5%), and mode 24 (1 species; 4.2%). The low diversification of reproductive modes is likely related to the homogeneity of the grassland habitat. Most species were constant or accessory in the study area, and the species abundance distribution patterns fit the broken-stick and log-normal models, characterized by homogeneity of species abundance distribution. Most species showed great plasticity in habitat use, but few were plastic in their use of vocalization sites. There was a weak positive correlation between species richness and precipitation. There was also a weak positive correlation between the abundance of species in calling activity and maximum average temperatures. These correlations indicate that, in the study area, the abundance of calling males is more affected by temperature, and species richness is more affected by precipitation, although significantly higher species richness occurs during the hottest period of the year. These results show that the climatological variables examined were not sufficient to explain the seasonal occurrence of species, so the influence of other environmental variables merits testing in future studies.
Abstract:
Objectives: We compared 12-month outcomes, regarding ischemic events, repeat intervention, and stent thrombosis (ST), between diabetic and nondiabetic patients treated with the Genous (TM) EPC-capturing R stent (TM) during routine nonurgent percutaneous coronary intervention (PCI), using data from the multicenter, prospective worldwide e-HEALING registry. Background: Diabetic patients have an increased risk for restenosis and ST. Methods: In the 4,996-patient e-HEALING registry, 273 were insulin-requiring diabetics (IRD), 963 were non-IRD (NIRD), and 3,703 were nondiabetics. The 12-month primary outcome was target vessel failure (TVF), defined as target vessel-related cardiac death or myocardial infarction (MI) and target vessel revascularization. Secondary outcomes were the composite of cardiac death, MI or target lesion revascularization (TLR), and individual outcomes including ST. Cumulative event rates were estimated with the Kaplan-Meier method and compared with a log-rank test. Results: TVF rates were 13.4% in IRD, 9.0% in NIRD, and 7.9% in nondiabetics (P < 0.01). This was mainly driven by a higher mortality hazard in IRD (P < 0.001) and NIRD (P = 0.07) compared with nondiabetics. TLR rates were comparable in NIRD and nondiabetics, but significantly higher in IRD (P = 0.04). No difference was observed in ST. Conclusion: The 1-year results of the Genous stent in a real-world population of diabetics show higher TVF rates in diabetics compared with nondiabetics, mainly driven by a higher mortality hazard. IRD is associated with a significantly higher TLR hazard. Definite or probable ST in all diabetic patients was comparable with that in nondiabetics. (J Interven Cardiol 2011;24:285-294)
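The Kaplan-Meier/log-rank comparison mentioned in the Methods can be sketched as follows with the lifelines library; the follow-up times and event indicators are synthetic placeholders, not registry data:

```python
# Illustrative Kaplan-Meier estimate with a log-rank comparison between two
# groups. Synthetic data only -- not the registry's actual analysis code.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

# Synthetic follow-up times (months, capped at 12) and event indicators
t_diab = np.minimum(rng.exponential(60, 200), 12)
e_diab = rng.random(200) < 0.12
t_nondiab = np.minimum(rng.exponential(100, 600), 12)
e_nondiab = rng.random(600) < 0.08

kmf = KaplanMeierFitter()
kmf.fit(t_diab, event_observed=e_diab, label="diabetic")
print(kmf.survival_function_.tail(1))  # cumulative event-free survival at 12 months

result = logrank_test(t_diab, t_nondiab,
                      event_observed_A=e_diab, event_observed_B=e_nondiab)
print(f"log-rank p = {result.p_value:.3f}")
```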
Abstract:
Background: We tested the hypothesis that the universal application of myocardial scanning with single-photon emission computed tomography (SPECT) would result in better risk stratification in renal transplant candidates (RTC) compared with SPECT restricted to patients who, in addition to renal disease, had other clinical risk factors. Methods: RTCs (n = 363) underwent SPECT and clinical risk stratification according to the American Society of Transplantation (AST) algorithm and were followed up until a major adverse cardiovascular event (MACE) or death. Results: Of the 363 patients, 79 (22%) had an abnormal SPECT scan and 270 (74%) were classified as high risk. Both methods correctly identified patients with an increased probability of MACE. However, clinical stratification performed better (sensitivity and negative predictive value 99% and 99% vs. 25% and 87%, respectively). High-risk patients with an abnormal SPECT scan had a modestly increased risk of events (log-rank P = 0.03; hazard ratio [HR] = 1.37; 95% confidence interval [95% CI], 1.02-1.82). Eighty-six patients underwent coronary angiography, and coronary artery disease (CAD) was found in 60%. High-risk patients with CAD had an increased incidence of events (log-rank P = 0.008; HR = 3.85; 95% CI, 1.46-13.22), but in those with an abnormal SPECT scan, the incidence of events was not influenced by CAD (log-rank P = 0.23). Forty-six patients died. Clinical stratification, but not SPECT, correlated with the probability of death (log-rank P = 0.02; HR = 3.25; 95% CI, 1.31-10.82). Conclusion: SPECT should be restricted to high-risk patients. Moreover, in contrast to SPECT, the AST algorithm was also useful for predicting death from any cause in RTCs and for selecting patients for invasive coronary testing.
Abstract:
Background: Prolonged use of lamivudine in patients coinfected with HIV and hepatitis B virus (HBV) leads to an increasing risk of lamivudine resistance in both diseases. We investigated the addition of entecavir, a potent inhibitor of HBV polymerase, to lamivudine-containing highly active antiretroviral therapy (HAART) in patients who experienced rebound in HBV viremia while maintaining suppression of plasma HIV RNA below 400 copies/ml. Methods: Sixty-eight patients were randomized to entecavir 1 mg (n = 51) or placebo (n = 17) once daily for 24 weeks; 65 patients continued the study with entecavir for an additional 24 weeks. Lamivudine-containing HAART was continued throughout. Results: At week 24, the mean HBV DNA in entecavir-treated patients was 5.52 log10 copies/ml versus 9.27 log10 copies/ml for placebo, and at week 48, it was 4.79 log10 copies/ml versus 5.63 log10 copies/ml, respectively. The mean HBV DNA change from baseline for entecavir was -3.65 log10 copies/ml (versus +0.11 for placebo, P < 0.0001), and alanine aminotransferase normalized in 34% of patients (versus 8% for placebo, P = 0.08). At 48 weeks, the mean change in HBV DNA reached -4.20 log10 copies/ml in patients who received entecavir for the entire 48 weeks. The frequency of adverse events with entecavir and placebo was comparable. Through 48 weeks, no clinically relevant changes in HIV viremia or CD4 cell counts were identified. Conclusion: In this study, entecavir was associated with rapid, clinically significant reductions in HBV DNA, with maintenance of HIV viremia suppression, in HIV/HBV-coinfected patients with HBV viremia while on lamivudine treatment. (C) 2008 Wolters Kluwer Health | Lippincott Williams & Wilkins.
Abstract:
Background: Progression and long-term renal outcome of lupus nephritis (LN) in male patients is a controversial subject in the literature. The aim of this study was to evaluate the influence of male gender on the renal outcome of LN. Methods: All male (M) LN patients who fulfilled American College of Rheumatology lupus criteria and who were referred for a kidney biopsy from 1999 to 2009 were enrolled in the study. Subjects with end-stage renal disease at baseline, or a follow-up time below 6 months, were excluded. Cases were randomly matched to female (F) patients according to the class of LN, baseline estimated glomerular filtration rate (eGFR, Modification of Diet in Renal Disease simplified formula) and follow-up time. Treatment was decided by the clinical staff based on usual literature protocols. The primary endpoint was doubling of serum creatinine and/or end-stage renal disease. The secondary endpoint was defined as the variation of glomerular filtration rate (GFR) per year (ΔGFR/y index), calculated as the difference between final and initial eGFR adjusted by follow-up time for each patient. Results: We included 93 patients (31 M : 62 F). At baseline, M and F patients were not statistically different regarding WHO LN class (II 9.7%, IV 71%, V 19.3%), eGFR (M 62.4 ± 36.4 ml/min/1.73 m² versus F 59.9 ± 32.7 ml/min/1.73 m²), follow-up time (M 44.2 ± 27.3 months versus F 39.9 ± 27.9 months), and 24-hour proteinuria (M 5.3 ± 4.6 g/day versus F 5.2 ± 3.0 g/day), as well as age, albumin, C3, antinuclear antibody, anti-DNA antibody and haematuria. There was no difference in the primary outcome (M 19% versus F 13%, log-rank p = 0.62). However, male gender was significantly associated with worse renal function progression, as measured by the ΔGFR/y index (beta coefficient for male gender -12.4, 95% confidence interval -22.8 to -2.1, p = 0.02). The multivariate linear regression model showed that male gender remained statistically associated with a worse renal outcome even after adjustment for eGFR, proteinuria, albumin and C3 complement at baseline. Conclusion: In our study, male gender presented a worse evolution of LN (as measured by lower GFR recovery) when compared with female patients with similar baseline features and treatment. Factors that influence the progression of LN in men and sex-specific treatment protocols should be further addressed in new studies. Lupus (2011) 20, 561-567.
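A minimal sketch of the ΔGFR/y index as defined above (final minus initial eGFR, adjusted by each patient's follow-up time); the example values are hypothetical, not patient data from the study:

```python
# ΔGFR/y index: change in eGFR divided by follow-up time in years.
# Hypothetical values only.

def delta_gfr_per_year(egfr_initial: float, egfr_final: float,
                       follow_up_months: float) -> float:
    """Change in eGFR (ml/min/1.73 m^2) per year of follow-up."""
    return (egfr_final - egfr_initial) / (follow_up_months / 12.0)

# Hypothetical patient: eGFR falls from 62 to 45 over 44 months of follow-up
print(f"{delta_gfr_per_year(62.0, 45.0, 44.0):.1f} ml/min/1.73 m^2 per year")
```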
Abstract:
SETTING: Chronic obstructive pulmonary disease (COPD) is the third leading cause of death among adults in Brazil. OBJECTIVE: To evaluate mortality and hospitalisation trends due to COPD in Brazil during the period 1996-2008. DESIGN: We used the official health statistics system to obtain data on mortality (1996-2008) and morbidity (1998-2008) due to COPD and all respiratory diseases (tuberculosis: codes A15-A16; lung cancer: code C34; and all diseases coded J40-J47 in the 10th Revision of the International Classification of Diseases) as the underlying cause, in persons aged 45-74 years. We used the log-linear (Poisson regression) model of the Joinpoint Regression Program, which applies a Monte Carlo permutation test to identify points where trend lines change significantly in magnitude or direction, to verify peaks and trends. RESULTS: The annual per cent change in age-adjusted death rates due to COPD declined by 2.7% in men (95%CI -3.6 to -1.8) and by 2.0% in women (95%CI -2.9 to -1.0); for all respiratory causes, it declined by 1.7% (95%CI -2.4 to -1.0) in men and by 1.1% (95%CI -1.8 to -0.3) in women. Although hospitalisation rates for COPD are declining, the hospital admission fatality rate increased in both sexes. CONCLUSION: COPD is still a leading cause of mortality in Brazil despite the observed decline in mortality and hospitalisation rates for both sexes.
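In a log-linear Poisson trend model of the kind fitted by the Joinpoint Regression Program, the annual per cent change is recovered from the slope on calendar year as 100·(e^β − 1). A sketch under that assumption, using statsmodels and synthetic counts rather than the Brazilian data:

```python
# Sketch of an annual per cent change (APC) estimate from a log-linear Poisson
# trend model. Synthetic death counts and populations -- not the Brazilian data.
import numpy as np
import statsmodels.api as sm

years = np.arange(1996, 2009)
population = np.full(years.size, 50_000_000.0)
deaths = np.round(30_000 * np.exp(-0.027 * (years - 1996)))  # ~ -2.7% per year

X = sm.add_constant(years - years[0])
model = sm.GLM(deaths, X, family=sm.families.Poisson(),
               offset=np.log(population)).fit()

beta_year = model.params[1]
apc = 100 * (np.exp(beta_year) - 1)
print(f"APC = {apc:.1f}% per year")
```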
Abstract:
Background: We validated a strategy for the diagnosis of coronary artery disease (CAD) and prediction of cardiac events in high-risk renal transplant candidates (at least one of the following: age >= 50 years, diabetes, cardiovascular disease). Methods: A diagnosis and risk assessment strategy was used in 228 renal transplant candidates to validate an algorithm. Patients underwent dipyridamole myocardial stress testing and coronary angiography and were followed up until death, renal transplantation, or cardiac events. Results: The prevalence of CAD was 47%. Stress testing did not detect significant CAD in one-third of patients. The sensitivity, specificity, and positive and negative predictive values of the stress test for detecting CAD were 70, 74, 69, and 71%, respectively. CAD, defined by angiography, was associated with an increased probability of cardiac events [log-rank P = 0.001; hazard ratio: 1.90, 95% confidence interval (CI): 1.29-2.92]. Diabetes (P = 0.03; hazard ratio: 1.58, 95% CI: 1.06-2.45) and angiographically defined CAD (P = 0.03; hazard ratio: 1.69, 95% CI: 1.08-2.78) were the independent predictors of events. Conclusion: The results validate our observations in a smaller number of high-risk transplant candidates and indicate that stress testing is not appropriate for the diagnosis of CAD or the prediction of cardiac events in this group of patients. Coronary angiography correlated with events but, because less than 50% of patients had significant disease, it seems premature to recommend the test to all high-risk renal transplant candidates. The results suggest that angiography is necessary in many high-risk renal transplant candidates and that better noninvasive methods are still lacking to identify with precision the patients who will benefit from invasive procedures. Coron Artery Dis 21: 164-167 (C) 2010 Wolters Kluwer Health | Lippincott Williams & Wilkins.
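The sensitivity, specificity and predictive values reported above are the standard 2x2-table quantities; a short sketch (the cell counts are hypothetical, not the study's cross-tabulation):

```python
# How diagnostic accuracy measures are computed from a 2x2 table of
# stress-test result vs. angiographically defined CAD. Hypothetical counts.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # positive tests among patients with CAD
        "specificity": tn / (tn + fp),   # negative tests among patients without CAD
        "ppv": tp / (tp + fp),           # true positives among all positive tests
        "npv": tn / (tn + fn),           # true negatives among all negative tests
    }

# Hypothetical 2x2 counts for illustration only
print(diagnostic_metrics(tp=75, fp=32, fn=32, tn=89))
```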
A Randomized Trial of a Skin Sealant to Reduce the Risk of Incision Contamination in Cardiac Surgery
Abstract:
Background. Immobilizing skin microbes is a rational approach to reducing contamination of surgical sites by endogenous microorganisms. Methods. This randomized, controlled, parallel-group, multicenter, open-label clinical trial (ClinicalTrials.gov NCT00467857) enrolled 300 adults scheduled for elective coronary artery bypass graft surgery. Patients received iodine-based skin preparations followed by a cyanoacrylate-based skin sealant, or skin preparations alone. Microbiological samples collected from sternal and graft incision sites immediately before any skin preparation, at the wound border after skin incision, and at the incision after fascial closure were evaluated quantitatively. Results. In evaluable patients, mean microbial counts in collected samples increased at the sternal site after fascial closure, compared with after skin incision, by 0.37 log10 colony-forming units (CFU)/mL in the skin sealant group (n = 120) and by 0.57 log10 CFU/mL in the control group (n = 132) (p = 0.047, Wilcoxon rank sum test). At the graft site, mean microbial counts increased by 0.09 (n = 119) and 0.27 (n = 127) log10 CFU/mL, respectively (p = 0.037). There was a 35.3% relative risk reduction in surgical site infection (SSI) in the skin sealant group (9 of 146 patients, 6.2%) versus the control group (14 of 147 patients, 9.5%). In obese patients (body mass index [BMI] > 30.0 to <= 37.0 kg/m²), the relative risk reduction for SSI associated with skin sealant was 83.3%. Conclusions. Pretreatment with skin sealant protects against contamination of the surgical incision by migration of skin microbes. Further data are needed to confirm the impact of this technology on SSI rates in clinical practice. (Ann Thorac Surg 2011;92:632-7) (C) 2011 by The Society of Thoracic Surgeons
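The 35.3% relative risk reduction follows directly from the event counts quoted in the abstract (9/146 in the sealant group versus 14/147 in the control group); a one-line check:

```python
# Quick check of the relative risk reduction quoted above, using the event
# counts reported in the abstract.
risk_sealant = 9 / 146
risk_control = 14 / 147
rrr = 1 - risk_sealant / risk_control
print(f"RRR = {rrr:.1%}")   # -> 35.3%
```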
Abstract:
Background: Left atrial volume indexed (LAVI) has been reported to be a predictor of cardiovascular events. We sought to determine the prognostic value of LAVI for predicting the outcome of patients who underwent dobutamine stress echocardiography (DSE) for known or suspected coronary artery disease (CAD). Methods: From January 2000 to July 2005, we studied 981 patients who underwent DSE and off-line measurement of LAVI. The value of DSE over clinical and LAVI data was examined using a stepwise log-rank test. Results: During a median follow-up of 24 months, 56 (6%) events occurred. By univariate analysis, predictors of events were male sex, diabetes mellitus, previous myocardial infarction, left ventricular ejection fraction (LVEF), left atrial diameter indexed, LAVI, and abnormal DSE. By multivariate analysis, the independent predictors were LVEF (relative risk [RR] = 0.98, 95% CI 0.95-1.00), LAVI (RR = 1.04, 95% CI 1.02-1.05), and abnormal DSE (RR = 2.70, 95% CI 1.28-5.69). In an incremental multivariate model, LAVI added to the clinical data for predicting events (χ² = 36.8, P < .001). The addition of DSE to the clinical and LAVI data yielded incremental information (χ² = 55.3, P < .001). The 3-year event-free survival was 96% in patients with normal DSE and LAVI <= 33 mL/m²; 91% with abnormal DSE and LAVI <= 33 mL/m²; 83% with normal DSE and LAVI >34 mL/m²; and 51% with abnormal DSE and LAVI >34 mL/m². Conclusion: Left atrial volume indexed provides independent prognostic information in patients who underwent DSE for known or suspected CAD. Among patients with normal DSE, those with larger LAVI had a worse outcome, and among patients with abnormal DSE, LAVI was still predictive. (Am Heart J 2008;156:1110-6.)
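One plausible way to quantify the "incremental information" reported above is a likelihood-ratio comparison of nested Cox models (clinical only, clinical + LAVI, clinical + LAVI + DSE). This is a hedged sketch with the lifelines library and synthetic data, not the study's actual stepwise log-rank procedure:

```python
# Sketch of assessing incremental prognostic value with nested Cox models,
# one plausible way to obtain chi-square statistics like those reported above.
# Synthetic data only; not the study's implementation.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from scipy.stats import chi2

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "time": rng.exponential(36, n),                 # months of follow-up
    "event": rng.random(n) < 0.1,                   # cardiac event indicator
    "age": rng.normal(60, 10, n),
    "lavi": rng.normal(30, 8, n),                   # mL/m^2
    "abnormal_dse": rng.integers(0, 2, n),
})

def log_likelihood(cols):
    cph = CoxPHFitter()
    cph.fit(df[["time", "event"] + cols], duration_col="time", event_col="event")
    return cph.log_likelihood_

ll_clinical = log_likelihood(["age"])
ll_lavi = log_likelihood(["age", "lavi"])
ll_dse = log_likelihood(["age", "lavi", "abnormal_dse"])

for label, ll1, ll0 in [("LAVI over clinical", ll_lavi, ll_clinical),
                        ("DSE over clinical + LAVI", ll_dse, ll_lavi)]:
    stat = 2 * (ll1 - ll0)                          # likelihood-ratio chi-square, 1 df
    print(f"{label}: chi2 = {stat:.1f}, p = {chi2.sf(stat, 1):.3f}")
```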