Abstract:
It is well known that eosinophilia is a key pathogenetic component of toxocariasis. The objective of the present study was to determine if there is an association between peritoneal and blood eosinophil influx, mast cell hyperplasia and leukotriene B4 (LTB4) production after Toxocara canis infection. Oral inoculation of 56-day-old Wistar rats (N = 5-7 per group) with 1000 embryonated eggs containing third-stage (L3) T. canis larvae led to a robust accumulation of total leukocytes in blood beginning on day 3 and peaking on day 18, mainly consisting of eosinophils and accompanied by higher serum LTB4 levels. At that time, we also noted increased eosinophil numbers in the peritoneal cavity, along with an increased mast cell number in this compartment, which correlated with the time course of eosinophilia during toxocariasis. We also demonstrated that mast cell hyperplasia in the intestines and lungs began soon after the T. canis larvae migrated to these compartments, reaching maximal levels on day 24, which coincided with the complete elimination of the parasite. Therefore, mast cells appear to be involved in peritoneal and blood eosinophil infiltration through an LTB4-dependent mechanism following T. canis infection in rats. Our data also demonstrate a tight association between larval migratory stages and intestinal and pulmonary mast cell hyperplasia in this toxocariasis model.
Abstract:
The objective of the present study was to determine if there is a relationship between serum levels of brain-derived neurotrophic factor (BDNF) and the number of T2/fluid-attenuated inversion recovery (T2/FLAIR) lesions in multiple sclerosis (MS). The use of magnetic resonance imaging (MRI) has revolutionized the study of MS. However, MRI has limitations, and other biomarkers such as BDNF may be useful for clinical assessment and for studying the disease. Serum was obtained from 28 MS patients, 18-50 years old (median 38), 21 women, 0.5-10 years (median 5) of disease duration, EDSS 1-4 (median 1.5), and 28 healthy controls, 19-49 years old (median 33), 19 women. BDNF levels were measured by ELISA. T1, T2/FLAIR and gadolinium-enhanced lesions were counted by a trained radiologist. BDNF was reduced in MS patients (median [range]: 1160 [352.6-2640] pg/mL) compared to healthy controls (1640 [632.4-4268] pg/mL; P = 0.03, Mann-Whitney test) and was negatively correlated (Spearman correlation test, r = -0.41; P = 0.02) with the number of T2/FLAIR lesions (11-81 lesions, median 42). We found that serum BDNF levels were inversely correlated with the number of T2/FLAIR lesions in patients with MS. BDNF may be a promising biomarker of MS.
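To make the reported analysis concrete, the following is a minimal Python sketch of how a Mann-Whitney group comparison and a Spearman correlation of this kind can be computed with scipy; the BDNF values and lesion counts below are placeholders, not the study data.

```python
# Illustrative sketch only: placeholder values, not the patients' measurements.
from scipy.stats import mannwhitneyu, spearmanr

# Hypothetical serum BDNF levels (pg/mL)
bdnf_ms = [1160.0, 980.5, 1420.0, 760.2, 1333.1]    # MS patients (placeholder)
bdnf_hc = [1640.0, 1890.4, 1210.7, 2050.0, 1475.3]  # healthy controls (placeholder)

# Group comparison: two-sided Mann-Whitney U test (MS vs healthy controls)
u_stat, p_group = mannwhitneyu(bdnf_ms, bdnf_hc, alternative="two-sided")

# Correlation with lesion load: Spearman rank correlation between BDNF and T2/FLAIR lesion counts
lesions_t2flair = [42, 55, 18, 81, 30]              # placeholder lesion counts
rho, p_corr = spearmanr(bdnf_ms, lesions_t2flair)

print(f"Mann-Whitney U = {u_stat:.1f}, P = {p_group:.3f}")
print(f"Spearman r = {rho:.2f}, P = {p_corr:.3f}")
```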
Abstract:
We aimed to evaluate knowledge of first aid among new undergraduates and whether it is affected by their chosen course. A questionnaire was developed to assess knowledge of how to activate the Mobile Emergency Attendance Service - MEAS (Serviço de Atendimento Móvel de Urgência; SAMU), recognize a pre-hospital emergency situation, and provide the first aid required for cardiac arrest. The students were also asked about enrolling in a first aid course. Responses were received from 1038 of 1365 (76.04%) new undergraduates. The questionnaires were completed within a 2-week period 1 month after the beginning of classes. Of the 1038 respondents (59.5% studying biological sciences, 11.6% physical sciences, and 28.6% humanities), 58.5% knew how to activate the MEAS/SAMU (54.3% non-biological vs 61.4% biological, P=0.02), with an odds ratio (OR) of 1.39 (95%CI=1.07-1.81) adjusted for age, sex, origin, previous degree, and having a relative with cardiac disease. The majority could distinguish emergency from non-emergency situations. When faced with a possible cardiac arrest, 17.7% of the students would perform chest compressions (15.5% non-biological vs 19.1% biological first-year university students, P=0.16) and 65.2% would enroll in a first aid course (51.1% non-biological vs 74.7% biological, P<0.01), with an OR of 2.61 (95%CI=1.98-3.44) adjusted for the same confounders. Even though a high percentage of the students recognized emergency situations, a significant proportion did not know the MEAS/SAMU number and only a minority had sufficient basic life support skills to help in a cardiac arrest. A significant proportion would not enroll in a first aid course. Biological first-year university students were more prone to enroll in a basic life support course.
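For illustration, here is a hedged Python sketch of how an adjusted odds ratio with a 95% confidence interval, such as the OR of 1.39 reported above, can be estimated with logistic regression in statsmodels; the data frame, column names, and simulated responses are hypothetical, and only part of the study's confounder set is included.

```python
# Illustrative sketch only: simulated data and hypothetical column names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1038  # number of respondents in the survey

# Placeholder data frame: one row per respondent
df = pd.DataFrame({
    "knows_samu": rng.binomial(1, 0.58, n),  # knew how to activate MEAS/SAMU
    "biological": rng.binomial(1, 0.60, n),  # enrolled in a biological-sciences course
    "age":        rng.integers(17, 40, n),
    "female":     rng.binomial(1, 0.55, n),
})

# Adjusted logistic model (the study also adjusted for origin, previous degree,
# and having a relative with cardiac disease)
model = smf.logit("knows_samu ~ biological + age + female", data=df).fit(disp=False)

or_biological = np.exp(model.params["biological"])            # adjusted odds ratio
ci_low, ci_high = np.exp(model.conf_int().loc["biological"])  # 95% confidence interval
print(f"OR = {or_biological:.2f} (95%CI {ci_low:.2f}-{ci_high:.2f})")
```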
Abstract:
Retrograde autologous priming (RAP) has been routinely applied in pediatric cardiopulmonary bypass (CPB) for cardiac surgery. However, this technique is usually performed in pediatric patients weighing more than 20 kg, and research on its application in patients weighing less than 20 kg is still scarce. This study explored the clinical application of RAP during CPB in pediatric patients undergoing cardiac surgery. Sixty pediatric patients scheduled for cardiac surgery were randomly divided into control and experimental groups. The experimental group underwent CPB with RAP, while the control group underwent conventional CPB (priming with suspended red blood cells, plasma, and albumin). The hematocrit (Hct) and lactate (Lac) levels at different perioperative time-points, mechanical ventilation time, hospitalization duration, and intraoperative and postoperative blood usage were recorded. Results showed that Hct levels at 15 min after the start of CPB (T2) and at the end of CPB (T3), as well as the number of intraoperative blood transfusions, were significantly lower in the experimental group (P<0.05). There were no significant differences in CPB time, aortic cross-clamping time, T2-Lac, or T3-Lac between the two groups (P>0.05). Postoperatively, there were no significant differences in Hct (2 h after surgery), mechanical ventilation time, intensive care unit time, or postoperative blood transfusion between the two groups (P>0.05). RAP can effectively reduce hemodilution while using less or no banked blood, meet intraoperative perfusion requirements, and decrease the perioperative blood transfusion volume in pediatric patients.
Abstract:
The artisanal production of cachaça, a beverage obtained by the fermentation of sugar cane juice followed by distillation, especially by small-scale producers, has traditionally used a natural ferment ("fermento caipira") consisting of sugar cane juice with crushed corn, powdered rice, or citrus fruits. Despite the difficulties in quality control due to the high level of contaminants and the longer preparation periods, the sensorial quality of the beverage may be attributed to the physiological activities of the wild yeasts and even bacteria present during fermentation when this ferment is used. In this context, the aim here was to evaluate the microbiological (yeast) and physicochemical characteristics of sugar cane juice extracted from different sections (lower, middle, and upper) of the cane stalk of three varieties (RB72454, RB835486, and RB867515) in three harvesting periods (from May to December 2007) in an area under organic management. The juice from the upper section (from the eleventh internode to the top) of the sugar cane stalk can be recommended for the preparation of the natural ferment since it is a source of yeasts and reducing sugars, especially in the variety RB867515. Owing to seasonality, the best period for using this part of the sugar cane stalk is at the beginning of the harvest, when phenolic compounds are at low concentrations and the Saccharomyces population and other yeast species are more abundant. The high acidity in this section of the plant could result in better control of bacterial contamination. These findings explain the traditional practice of adding the upper sections when preparing the natural ferment and can help its management to achieve better performance in organic cachaça production.
Abstract:
The aim of this study was to evaluate some physical and chemical parameters (total solids, pH, acidity, fat, acid degree value of fat, salt, protein, and nitrogen fractions) and their effects on the beneficial (lactic acid bacteria: LAB) and undesirable microbial populations (coliforms, Escherichia coli, Staphylococcus aureus, moulds, and yeasts) during ripening of Artisanal Corrientes Cheese, an Argentinian cow's milk variety, to determine whether a longer ripening period than usual improves its hygienic-sanitary quality. The protein content was much higher than that of other cow's milk cheeses with similar fat values. The larger peptides showed values three times higher in the 30-day-old cheese than at the beginning of the process. Staphylococcus aureus and Escherichia coli were still detected (3.04 ± 1.48 log10 cfu/g of cheese and 2.21 ± 0.84 log10 MPN/g of cheese) at 15 and 30 days of ripening, respectively. The distribution of three hundred LAB strains classified to the genus level (lactococci:lactobacilli:leuconostocs) was maintained during the ripening period. The high number of LAB in the rennet may have contributed to the fermentation as a natural whey starter, a previously unknown source of LAB for this specific cheese. The physicochemical changes that occurred during ripening were not sufficient to inhibit the growth of undesirable microorganisms.
Abstract:
This study aimed to evaluate the efficiency of natural biocides, brown and green propolis, for the control of bacterial contamination in the production of sugarcane spirit. The treatments consisted of brown and green propolis extracts, ampicillin, and a control, and were assessed at the beginning and end of the harvest season over ten fermentation cycles. In the microbiological analyses, lactic acid bacteria were quantified in the inoculum before and after treatment with the biocides, and the viability of yeast cells during fermentation was evaluated. The levels of acids, glycerol, total residual reducing sugars, and ethanol were analyzed in the wine resulting from each fermentation cycle. A reduction in the number of bacterial contaminants in the inoculum was observed in the treatments with the natural biocides, without affecting the viability of yeast cells. The control of the contaminants led to the production of higher levels of ethanol and reduced acidity in the wine produced. The use of brown and green propolis to control the growth of microorganisms in the fermentation of sugarcane spirit may be of great importance as an alternative to synthetic antibacterials in fermentation processes, including those for other distilled beverages and spirits.
Abstract:
INTRODUCTION: Some beneficial effects from long-term use of corticosteroids have been reported in patients with IgA nephropathy. OBJECTIVE: This retrospective study aimed to evaluate the outcome of proteinuria and renal function according to a protocol based on a 6-month course of steroid treatment. METHOD: Twelve patients were treated with 1 g/day intravenous methylprednisolone for 3 consecutive days at the beginning of months 1, 3, and 5 plus 0.5 mg/kg oral prednisone on alternate days for 6 months (treated group). The control group included 9 untreated patients. RESULTS: Proteinuria (median and 25th and 75th percentiles) at baseline in the treated group was 1861 mg/24h (1518; 2417 mg/24h) and was 703 mg/24h (245; 983) and 684 mg/24h (266; 1023) at the 6th (p < 0.05 vs. baseline) and 12th months (p < 0.05 vs. baseline), respectively. In the control group the proteinuria was 1900 mg/24h (1620; 3197) at baseline and was 2290 mg/24h (1500; 2975) and 1600 mg/24h (1180; 2395) at the 6th and 12th months, respectively (not significant vs. baseline). When compared with the control group, the treated group showed lower proteinuria (p < 0.05) during the follow-up and a higher number of patients in remission (p < 0.05) at the 6th and 12th months. Renal function did not change during the follow-up and the adverse effects were mild in most of the patients. CONCLUSION: The 6-month course of steroid treatment was effective in reducing proteinuria during the 12 months of the follow-up, and was well-tolerated by most of the patients.
Abstract:
The objective of the study was to characterize annual ryegrass seed population dynamics, managed for natural re-sowing, in no-till systems in rotation with soybean, in different chronosequences. An area was cultivated for two years with soybean, left as fallow land for the next two years, and then cultivated again with soybean for the following two years. The four chronosequences represented different management periods, two with soybean (6 and 8 years old) and the other two resting (3 and 9 years old). Soil samples were taken every month for one year at two depths (0-5 and 5-10 cm). Vegetation dynamics were also evaluated (number of plants, inflorescences, and seedlings). Soil seed bank (SSB) dynamics showed structural patterns over time, with a "storage period" in summer, an "exhaustion period" during autumn, and a "transition period" in winter and spring. Pasture establishment by natural re-sowing was totally dependent on the annual recruitment of seeds from the soil. The influence of the management practices on the SSB was more important than the number of years these practices had been implemented. Places where soybean was sown showed the largest SSBs. Most of the seeds overcame dormancy and germinated at the end of summer and the beginning of autumn, indicating a typically transitory SSB, but with a small proportion of persistent seeds.