909 results for "Fecal contamination indicators"
Abstract:
The behavioral repertoire of Atta sexdens rubropilosa Forel (Hymenoptera: Formicidae) workers marked by size category was studied during the preparation of the leaf substrate in the laboratory. The workers were marked according to three physical castes, i.e., minima, generalist, and forager. Seven types of behavioral acts were recorded for each caste, with the following frequencies: licking leaf fragments (64.6%), holding fragments on the surface of the fungus garden (16.4%), shredding the fragments (6.0%), chewing and crimping the edges of the fragments (9.0%), incorporating the fragments (2.7%), touching the surface of the fungus with their mandibles and other mouthparts after incorporation (0.3%), and depositing fecal fluid (0.1%). The minima workers were found to be the most specialized in the activities related to the preparation of the leaf substrate, performing 52% of the total number of tasks recorded; the generalists performed 40.3% of these tasks, and the foragers 7.9%. Licking the substrate was the behavior recorded most frequently and performed for the longest period of time. In this way, the workers may feed and at the same time eliminate microorganisms that are harmful to the symbiont fungus. The smaller castes, minima and generalists, are those most responsible for the preparation of the leaf substrate and predominate within a colony. From a practical viewpoint, with the introduction of toxic bait containing insecticides, for example, these size categories will be the most intensely intoxicated, especially through the behavior of licking bait pellets. On the basis of these behavioral data, we hypothesize that trophallaxis is not the major factor triggering contamination with an insecticide among the workers of a colony.
Abstract:
Objective. To assess the potential for contamination of wastewaters from pig farming. Methods. Wastewaters from pig farming were stored in a tank. After 0, 30, 60, 90, and 120 days of hydraulic retention, they were added to lysimeters filled with argillaceous, sandy, or medium-texture soil. Finally, these lysimeters were submitted to simulations of either a rainy season or a dry season. The number of colony-forming units (CFU) of total coliforms, fecal coliforms, and fecal streptococci was measured in the effluents of the storage tank (for the various periods of hydraulic retention), in the percolate from the lysimeters, and in the three types of soil. The microbiological analyses were carried out using the membrane filter technique, and the pH analyses were done potentiometrically. Results. For the three microorganism groups, the largest decrease in bacterial counts in the storage tank occurred with 90 or 120 days of retention. There was a marked decrease in the bacterial counts in the percolates of the three soils. Among the three soil types, the greatest reduction in bacterial counts was found in the medium-texture soil, owing to its acidity (pH < 7.0). Hydraulic retention alone was not sufficient to ensure the sanitary adequacy of the wastewaters for use in irrigation, given that fecal coliform values remained above 1,000 CFU per 100 mL. Therefore, adding the residues to the soil was considered a second stage of treatment. Conclusions. The retention of wastewaters followed by their addition to soil was effective in minimizing the contaminating effect of pig farming residues. The storage time for wastewaters from pig farming could be decreased from 120 to 90 days.
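The abstract compares its coliform results against an irrigation limit of 1,000 CFU per 100 mL but does not show the membrane-filter arithmetic. The sketch below illustrates the standard conversion from a colony count on a filtered plate to CFU per 100 mL; the plate values are purely hypothetical and are not taken from the study.

```python
def cfu_per_100ml(colonies_counted: float, volume_filtered_ml: float) -> float:
    # Standard membrane-filter conversion: colonies grown on the filter,
    # scaled from the filtered sample volume to a 100 mL basis.
    return colonies_counted * 100.0 / volume_filtered_ml

# Hypothetical plate (values NOT from the study): 220 fecal coliform colonies
# from a 10 mL aliquot exceed the 1,000 CFU/100 mL irrigation limit cited above.
count = cfu_per_100ml(colonies_counted=220, volume_filtered_ml=10)
print(f"{count:.0f} CFU/100 mL ->", "above limit" if count > 1000 else "acceptable")
```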
Abstract:
Purpose: This study tested the null hypothesis that different treatments of a saliva-contaminated substrate would not affect microgap formation at the dentin walls of bonded restorations. Materials and Methods: Forty freshly extracted human molars received standardized Class V preparations on the buccal and lingual surfaces. The specimens were assigned to four experimental groups (n = 20): [G1] no contamination (control group); [G2] saliva contamination (10 s) after etching, followed by a 5 s air stream; [G3] saliva contamination after etching, rinsed for 10 s; and [G4] re-etching for 10 s after saliva contamination. All specimens were restored with a one-bottle adhesive (Single Bond, 3M ESPE) and a microhybrid composite resin (Filtek Z250, 3M ESPE) according to the manufacturer's instructions. The specimens were thermocycled, sectioned through the center of the restoration, and then processed for SEM. Microgaps were measured at the axial wall at 1500× magnification. The data were submitted to Kruskal-Wallis nonparametric statistical analysis at p < 0.05. Results: The treatments produced a statistically significant difference (p < 0.01) in gap formation. Air drying [G2] and rinsing [G3] the saliva-contaminated dentin resulted in similar microgap values (p > 0.05). However, re-etching the dentin after saliva contamination [G4] increased microgap formation (p < 0.05) compared with groups G1 and G2. Although air drying and rinsing produced results comparable to noncontaminated dentin, the presence of microgaps was not completely eliminated. Conclusion: Saliva contamination did not prevent hybrid layer formation; however, it did reduce the adaptation of the restorative material to the bonded surfaces.
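The abstract names a Kruskal-Wallis comparison at p < 0.05 without showing how the four groups would be tested. Below is a minimal sketch of such a comparison using SciPy; the microgap values are placeholders labeled as hypothetical, since the study's raw measurements are not reported in the abstract.

```python
from scipy.stats import kruskal

# Placeholder microgap values (µm) for the four groups; the abstract does not
# report the raw measurements, so these numbers are purely illustrative.
g1 = [0.8, 1.1, 0.9, 1.3, 1.0]   # G1: no contamination (control)
g2 = [1.2, 1.5, 1.1, 1.4, 1.3]   # G2: saliva + 5 s air stream
g3 = [1.3, 1.2, 1.6, 1.1, 1.4]   # G3: saliva + 10 s rinse
g4 = [2.4, 2.9, 2.2, 3.1, 2.7]   # G4: saliva + 10 s re-etch

h_stat, p_value = kruskal(g1, g2, g3, g4)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 would reject the null hypothesis
```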
Abstract:
Background. Iron-deficiency anemia is currently the most frequently occurring nutritional disorder worldwide. Previous Brazilian studies have demonstrated that drinking water fortified with iron and ascorbic acid is an adequate vehicle for improving the iron supply of children attending day-care centers. Objective. The objective of this study was to clarify the role of ascorbic acid as a vehicle for improving iron intake in children in day-care centers in Brazil. Methods. A six-month study was conducted on 150 children attending six day-care centers, divided into two groups of three day-care centers by drawing lots: the iron-C group (3 day-care centers, n = 74), which used water fortified with 10 mg of elemental iron and 100 mg of ascorbic acid per liter, and the comparison group (3 day-care centers, n = 76), which used water containing only 100 mg of ascorbic acid per liter. Anthropometric measurements and determinations of capillary hemoglobin were performed at the beginning of the study and after six months of intervention. The food offered at the day-care centers was also analyzed. Results. The food offered at the day-care centers was found to be deficient in ascorbic acid, poor in heme iron, and adequate in non-heme iron. Supplementation with fortified drinking water resulted in a decrease in the prevalence of anemia and an increase in mean hemoglobin levels, associated with height gain in both groups. Conclusions. Fortification of drinking water with iron has previously demonstrated effectiveness in increasing iron supplies, and this simple strategy was confirmed in the present study. The present study also demonstrated that for populations receiving an abundant supply of non-heme iron, it is possible to control anemia in a simple, safe, and inexpensive manner by adding ascorbic acid to drinking water. © 2005, The United Nations University.
Abstract:
This project was developed to evaluate the extent to which cesspits (pit latrines) degrade the quality of groundwater. The topic is important because, in rural communities of the State of São Paulo (Brazil), this type of cesspit is a very common means of sewage disposal, and these same communities rely on groundwater for their drinking-water supply. Rural properties distributed across the rural area of the municipality of São José do Rio Preto were selected. A preliminary study was then set up to characterize the social and health situation of the households, together with qualitative evaluations of the type of water supply and sewage disposal in these communities. Water sampling campaigns followed, and laboratory analyses of water taken from the wells were carried out. Potability was evaluated according to the parameters of Brazilian legislation (2004), with attention to microbiological indicators (coliforms, Cryptosporidium sp., and adenovirus). The analyses showed evidence of possible interaction between the wells and the sewage effluents and drainage in these communities. PCR assays detected adenovirus in 53.3% of the samples, whereas all tests for the detection of Cryptosporidium sp. were negative.
Abstract:
This work aimed to study the bacterial contamination of the stings of the catfishes Genidens genidens and Cathorops agassizii found in the São Vicente estuarine system (São Paulo State, Brazil). For the bacteriological analyses, we used fish samples distributed into a group of 50 specimens (25 C. agassizii and 25 G. genidens) and a group of 14 specimens (7 C. agassizii and 7 G. genidens). The results showed contamination by 13 different bacterial species of Enterobacteriaceae, with Klebsiella pneumoniae being the most frequent (26.80%), followed by Enterobacter sp. and Escherichia coli (16.27%), and Serratia marcescens, Serratia sp., and Proteus mirabilis (1.16%). Neither Gram-positive bacteria nor fungi were detected in the samples. Given the Gram-negative species characterized and the environmental conditions, accidents with these catfish stings may lead to significant acute secondary infections in humans.
Abstract:
The objective of this study was to describe a new platelet-rich plasma (PRP) protocol yielding a reduced concentration of leukocytes and intact platelets. We collected 8 mL of venous blood (VB) from the marginal ear veins of 10 male New Zealand white rabbits into acid citrate dextrose Vacutainer tubes. The tubes were centrifuged at 302 × g for 10 minutes. All plasma was collected in plastic tubes, avoiding buffy-coat contamination, and centrifuged at 2,862 × g for 5 minutes. A 10% calcium chloride activator (10 PRP : 2 CaCl2) was added to the lower third of this plasma (the PRP), and the PRP gel was obtained. The mean platelet count was 317.7 × 10³ ± 39.9/µL in VB and 1,344.9 × 10³ ± 347.5/µL in PRP. Leukocyte counts were 3.96 × 10³ ± 2.01/µL and 0.46 × 10³ ± 0.45/µL in VB and PRP, respectively. Mean platelet enrichment was 327.4 ± 97.8%. All differences were statistically significant (P < .05). This protocol is practical and reproducible, resulting in a high concentration of intact platelets to help tissue repair and low levels of leukocytes.
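As a rough consistency check on the figures quoted above (a sketch, not part of the abstract), platelet enrichment is commonly computed as the percentage increase of the PRP count over the venous-blood count; assuming that definition, the mean counts reproduce the reported enrichment to within a few percent.

```python
# Rough consistency check of the reported enrichment, using the mean counts above.
vb_platelets  = 317.7e3    # mean platelet count in venous blood (platelets/µL)
prp_platelets = 1344.9e3   # mean platelet count in PRP (platelets/µL)

# Assumed definition: percentage increase of the PRP count over the VB count.
enrichment = (prp_platelets - vb_platelets) / vb_platelets * 100
print(f"enrichment ≈ {enrichment:.1f}%")   # ≈ 323%, close to the reported 327.4% mean
```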
Abstract:
Three grazing management systems were compared to examine pasture decontamination of gastrointestinal nematode (GIN) parasites of sheep (Ovis aries) and cattle (Bos taurus). They consisted of sheep and cattle grazing paddocks alternately for 32, 96, or 192 days over 2 years. The pastureland (8.43 ha) was subdivided into six areas of eight paddocks each to produce an eight-paddock rotational grazing system. Every paddock was grazed for 4 days and then rested for 28 days. Sixty-six Ile de France ewes and 12 steers were randomly divided into three groups (22 sheep and four cattle per group). Each grazing system included a cattle area and a sheep area. Sheep and cattle interchanged areas every 32 days in system 1 (Group 1), every 96 days in system 2 (Group 2), and every 192 days in system 3 (Group 3). Fecal examinations and larval counts on pasture were performed every 32 days. During the summer, winter, and spring of 2005, tracer lambs free of nematode infection were introduced into each sheep group and later sacrificed for quantification and identification of GIN species. All cattle were sacrificed for the same purpose. The main parasites found in tracer lambs were Haemonchus contortus and Trichostrongylus colubriformis, and in cattle, Haemonchus similis, Cooperia punctata, and Oesophagostomum radiatum. Pasture contamination by sheep-infective GIN larvae was considerably reduced after 96 or 192 days of cattle grazing. Cross-infections between sheep and cattle GIN were not significant, which suggested that integrated grazing with these species could be used for pasture decontamination. However, in the absence of effective anthelmintics, decontamination was not sufficient for proper prophylaxis of GIN infections in Ile de France sheep, which are quite susceptible to such parasites. © 2007 Elsevier B.V. All rights reserved.
Abstract:
BACKGROUND AND OBJECTIVES: Based on the known anti-inflammatory and antibacterial actions of local anesthetics (LA), the objective of this study was to determine the effects of peritoneal lavage with bupivacaine on the survival of rats with fecal peritonitis. METHODS: Forty-eight Wistar rats, weighing between 300 and 330 g (311.45 ± 9.67 g) and undergoing laparotomy 6 hours after induction of peritonitis, were randomly divided into 4 groups: 1 - control, without treatment (n = 12); 2 - drying of the abdominal cavity (n = 12); 3 - lavage with 3 mL of normal saline (NS) followed by drying of the abdominal cavity (n = 12); and 4 - lavage with 8 mg.kg-1 (≈ 0.5 mL) of 0.5% bupivacaine added to 2.5 mL of NS, followed by drying of the abdominal cavity (n = 12). Animals that died underwent necropsy, and the time of death was recorded. Surviving animals were killed on the 11th postoperative day and underwent necropsy. RESULTS: Group 1 reached a 100% mortality rate within 52 hours, Group 2 reached 100% mortality within 126 hours, and Group 3 had a 50% mortality rate at 50 hours. All animals in Group 4 survived. Survival on the 11th day was greater in Groups 3 and 4 than in Groups 1 and 2 (p < 0.001), and greater in Group 4 than in Group 3 (p < 0.01). CONCLUSIONS: Peritoneal lavage with a solution of bupivacaine diluted in NS was effective in preventing death for 11 days in 100% of animals with fecal peritonitis. © Sociedade Brasileira de Anestesiologia, 2008.
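A quick arithmetic check of the dose volume stated in the abstract (a sketch, not taken from the paper): a 0.5% bupivacaine solution contains 5 mg/mL, so 8 mg.kg-1 for an animal of roughly 310 g works out to about 0.5 mL, matching the volume quoted for Group 4.

```python
# Rough check of the stated bupivacaine dose volume, using figures from the abstract.
body_weight_kg = 0.311   # mean body weight reported (311.45 g)
dose_mg_per_kg = 8.0     # 8 mg.kg-1 of bupivacaine
mg_per_ml      = 5.0     # 0.5% solution = 5 mg/mL

dose_mg   = dose_mg_per_kg * body_weight_kg   # ≈ 2.5 mg per animal
volume_ml = dose_mg / mg_per_ml               # ≈ 0.5 mL, matching the quoted volume
print(f"dose ≈ {dose_mg:.2f} mg -> ≈ {volume_ml:.2f} mL of 0.5% bupivacaine")
```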