712 results for "Diarrhea incidenc"
Abstract:
A demonstration project entailing disease surveillance was conducted in the western Cayo District, Belize, from November 1981 to March 1982. The purpose was to test and demonstrate the feasibility of community-based surveillance. Interviews were conducted in 320 households at monthly intervals over a five-month period. Information regarding disease prevalence and medical care utilization relevant to public health practice was analyzed by staff attached to the health center in Benque Viejo. Data collected at the health center were used to validate reported findings. Differences between reported and actual study findings regarding clinic visits were small, though in many instances statistically significant. The proportion of underreported clinic visits was greater than the proportion overreported. Overall, reporting accuracy improved with time, particularly from the first to the second month. Clinic utilization reported for males was as accurate as that reported for females. There was agreement between interview and clinic disease findings: the proportion of conditions defined in the interview and matched to clinic findings was high for malaria, diarrhea, dysentery, skin sores and ulcerations, and nutritional problems, but not for upper respiratory disorders. Finally, some conditions were more likely than others to be taken to the health center; for example, children with diarrhea or skin sores and ulcerations were less likely to be taken to the health center than children with malaria.
Abstract:
The vast majority of Bangladesh's people are poor and unable to provide even the most basic human needs. These are the landless and marginal farmers of Bangladesh. They constitute 70% of the rural population, which in turn constitutes about 90% of the country's population. Effective development of Bangladesh would therefore largely mean the development of the landless and marginal farmers. Past development efforts aimed at this section of the population, including those of the government, have not succeeded. One of the development goals of the government of Bangladesh is to improve the quality of life of the rural population through health and population control measures; overpopulation, malnutrition, and diarrhea are the major impediments to socioeconomic development in Bangladesh. The current study was designed to identify whether there is effective opinion leadership among the marginal and landless peasants affecting decisions on acceptance or nonacceptance of family planning methods and oral rehydration therapy (ORT) in selected rural areas of Bangladesh. The study was conducted in eight randomly selected villages with funding from the Ministry of Health and Family Planning, government of Bangladesh. One hundred twenty-five opinion leaders were interviewed after they were identified by 408 rural couples owning less than 2 acres of land and with wives under age 50. The study was conducted in two phases; the couples' interviews preceded those of the leaders. The findings reveal that the opinion leaders influencing adoption of health and family planning among the landless and marginal farmers belong to the same class. These opinion leaders own much less land than the rich farmers and the formal leaders in the rural areas. The majority of these opinion leaders are friends, neighbors, and relatives; others are businessmen and professionals such as doctors, while the remaining few are field workers of health and family planning.
Source of influence contributes most to differentiating use and non-use of family planning and ORT among both couples and leaders. The sources of influence most frequently cited by the couples and the leaders are the field workers of health and family planning, followed by peer opinion leaders (friends, neighbors, relatives) and spouses. The opinion leaders do not differ much from the poor couples in landholding, a strong indicator of economic status; however, they differ considerably on social factors such as family planning practice, education, and exposure to mass media. The study suggests that future development efforts in Bangladesh must ensure community participation by the landless and marginal farmers and by opinion leaders belonging to their class.
A systematic review of Clostridium difficile infection in patients with iatrogenic immunosuppression
Abstract:
Background: The incidence of C. difficile infection (CDI) has increased dramatically in the past decade, and CDI is now the most frequent cause of nosocomial infectious diarrhea. The outcome of infection ranges from mild diarrhea to life-threatening pseudomembranous colitis, depending on the immunological response of the host, which is highly compromised in this special population of bone marrow transplant (BMT), solid organ transplant (SOT), and cancer patients on cytotoxic chemotherapy. Objectives: We conducted a meta-analysis to assess the incidence rates of CDI and the time to onset of infection in patients with iatrogenic immune suppression. Methods: Original studies were identified through an extensive search of electronic databases, including PubMed, Ovid Medline (R), RefWorks, and Biological Abstracts, and their references. The overall incidence rate of CDI in the immune-suppressed population was calculated using a random-effects model, and its 95% confidence interval was derived. Differences in the incidence of CDI and time to onset of infection were calculated between and within groups. Publication bias was assessed using a funnel plot. Results: Twenty-nine published articles involving 7,424 patients met the eligibility requirements. The overall incidence of CDI in the immune-suppressed population was 11.1% (95% confidence interval (CI): 9.2-13.4%). The incidence of CDI was higher in SOT patients (14.2%, 95% CI: 6.8-21.5%; p = 0.022) and in cancer patients on cytotoxic chemotherapy (11.4%, 95% CI: 8.4-15.4%; p = 0.042) than in BMT patients (10.5%, 95% CI: 7.9-13.1%). In a subgroup analysis of the BMT population, the incidence of CDI was significantly higher in patients who received allogeneic BMT (15.1%, 95% CI: 11.2-20.0%; p < 0.0001). Similarly, in the SOT population, the incidence of CDI was higher in patients who underwent liver transplantation (11.0%, 95% CI: 5.6-20.3%; p = 0.0672).
The median time to onset of infection was shorter in BMT patients (p = 0.0025). Conclusions: It is evident from the combined analysis of these 29 published studies that the incidence of CDI in the immune-suppressed population is high. Early diagnosis and treatment of CDI will help reduce the morbidity and mortality due to CDI in this special population.
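The pooled incidence above comes from a random-effects meta-analysis; the study-level data are not reproduced in this abstract. As a hedged illustration of the technique (the event counts below are invented for the example, not taken from the 29 studies), a minimal DerSimonian-Laird random-effects pooling of incidence proportions might look like:

```python
import math

def dersimonian_laird(estimates, variances):
    """Pool study-level estimates with the DerSimonian-Laird random-effects model."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical (events, n) per study -- illustrative only
studies = [(12, 110), (25, 180), (9, 95)]
props = [e / n for e, n in studies]
varis = [p * (1 - p) / n for p, (_, n) in zip(props, studies)]
pooled, ci = dersimonian_laird(props, varis)
```

The random-effects weights shrink toward equality as the between-study variance tau2 grows, which is why heterogeneous studies are pooled less aggressively than under a fixed-effect model.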
Abstract:
Groundwater constitutes approximately 30% of freshwater globally and serves as a source of drinking water in many regions. Groundwater sources are subject to contamination with human pathogens (viruses, bacteria, and protozoa) from a variety of sources that can cause diarrhea and contribute to the devastating global burden of this disease. To describe the extent of this public health concern in developing countries, a systematic review was conducted of the evidence for microbial contamination of groundwater at its source as a risk factor for enteric illness under endemic (non-outbreak) conditions in these countries. Epidemiologic studies published in English-language journals between January 2000 and January 2011, and meeting certain other criteria, were selected, resulting in eleven studies reviewed. Data were extracted on the microbes detected (and their concentrations, if reported) and on associations measured between the microbial quality of, or consumption of, groundwater and enteric illness; other relevant findings are also reported. In groundwater samples, several studies found bacterial indicators of fecal contamination (total coliforms, fecal coliforms, fecal streptococci, enterococci, and E. coli), all in a wide range of concentrations. Rotavirus and a number of enteropathogenic bacteria and parasites were found in stool samples from study subjects who had consumed groundwater, but no concentrations were reported. Consumption of groundwater was associated with increased risk of diarrhea, with odds ratios ranging from 1.9 to 6.1. However, limitations of the selected studies, especially potential confounding factors, limited the conclusions that could be drawn from them. These results support the contention that microbial contamination of groundwater reservoirs, including with human enteropathogens and from a variety of sources, is a reality in developing countries.
While microbially contaminated groundwater poses a risk for diarrhea, other factors are also important, including water treatment, water storage practices, consumption of other water sources, water quantity and access, sanitation and hygiene, housing conditions, and socioeconomic status. Further understanding of the interrelationships among the various sources of microbial contamination of groundwater, and of their relative contributions to disease risk, can guide the allocation of resources to the interventions with the greatest public health benefit. Several recommendations for future research, and for practitioners and policymakers, are presented.
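The odds ratios of 1.9 to 6.1 cited above summarize associations between groundwater consumption and diarrhea from individual 2x2 tables. As an illustration (the cell counts below are hypothetical, not drawn from the reviewed studies), an odds ratio with a Woolf log-based confidence interval can be computed as:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-based) CI for a 2x2 table.
    a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical counts: 40/100 groundwater drinkers vs 20/100 others had diarrhea
or_, ci = odds_ratio_ci(40, 60, 20, 80)
```

Note that an unadjusted odds ratio like this one is exactly the quantity the review flags as vulnerable to confounding; the reviewed studies would need stratified or regression-adjusted estimates to address that.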
Abstract:
Clostridium difficile is the leading definable cause of nosocomial diarrhea worldwide owing to its virulence, multi-drug resistance, spore-forming ability, and environmental persistence. The incidence of C. difficile infection (CDI) has increased exponentially in the last decade. Virulent strains of C. difficile produce toxin A and/or toxin B, which are essential for the pathogenesis of this bacterium. Current methods for diagnosing CDI are mostly qualitative tests that detect the bacterium, the toxins, or the toxin genes. These methods do not differentiate virulent C. difficile strains that produce active toxins from non-virulent strains that produce no toxins or produce inactive toxins. Based on the knowledge that C. difficile toxins A and B cleave a substrate that is stereochemically similar to their native substrate, uridine diphosphoglucose, a quantitative, cost-efficient assay, the Cdifftox activity assay, was developed to measure C. difficile toxin activity. The concept behind the activity assay was then adapted to develop a novel, rapid, sensitive, and specific assay for C. difficile toxins in the form of a selective and differential agar plate culture medium, the Cdifftox Plate assay (CDPA). This assay combines in a single step the specific identification of C. difficile strains and the detection of active toxin(s). The CDPA was determined to be extremely accurate (99.8% effective) at detecting toxin-producing strains, based on the analysis of 528 C. difficile isolates selected from 50 tissue-culture cytotoxicity assay-positive clinical stool samples. This new assay advances and improves the culture methodology in that only C. difficile strains grow on this medium, and virulent strains producing active toxins can be differentiated from non-virulent strains. This new method reduces the time and effort required to isolate and confirm toxin-producing C. difficile strains and provides a clinical isolate for antibiotic susceptibility testing and strain typing. The Cdifftox activity assay was also used to screen for inhibitors of toxin activity. Physiological levels of the common human conjugated bile salt taurocholate were found to inhibit the in vitro activities of toxins A and B. When co-incubated ex vivo with purified toxin B, taurocholate protected Caco-2 colonic epithelial cells from the damaging effects of the toxin. Furthermore, in a caspase-3 detection assay, taurocholate reduced the extent of toxin B-induced Caco-2 cell apoptosis. These results suggest that bile salts can be effective in protecting the gut epithelium from C. difficile toxin damage; thus, delivery of physiologic amounts of taurocholate to the colon, where it is normally at low concentration, could be useful in CDI treatment. These findings may help to explain why the bile-rich small intestine is spared damage in CDI while the bile-salt-poor colon is vulnerable. Toxin synthesis in C. difficile occurs during stationary phase, but little is known about its regulation. It was hypothesized that C. difficile toxin synthesis is regulated by a quorum-sensing mechanism, and two lines of evidence supported this hypothesis. First, a small (low-kilodalton), diffusible, heat-stable toxin-inducing activity accumulates in the medium of high-density C. difficile cells. This conditioned medium, when incubated with low-density log-phase cells, causes them to produce toxin early (2-4 hrs instead of 12-16 hrs) and at elevated levels compared with cells grown in fresh medium. These data suggest that C. difficile cells release an inducing molecule during growth that is able to activate toxin synthesis prematurely, and demonstrate for the first time that toxin synthesis in C. difficile is regulated by quorum signaling.
Second, this toxin-inducing activity was partially purified from high-density stationary-phase culture supernatant fluid by HPLC and confirmed to induce early toxin synthesis, even in virulent C. difficile strains that over-produce the toxins. Mass spectrometry analysis of the purified toxin-inducing HPLC fraction revealed a cyclic compound with a mass of 655.8 Da. It is anticipated that identification of this toxin-inducing compound will advance our understanding of the mechanism involved in the quorum-dependent regulation of C. difficile toxin synthesis. This finding should lead to the development of even more sensitive tests to diagnose CDI and may lead to the discovery of promising novel therapeutic targets that could be harnessed for the treatment of C. difficile infections.
Abstract:
C. difficile causes gastrointestinal infections in humans, including severe diarrhea. It is implicated in 20%-30% of cases of antibiotic-associated diarrhea, in 50%-70% of cases of antibiotic-associated colitis, and in >90% of cases of antibiotic-associated pseudomembranous colitis. Exposure to antimicrobial agents, hospitalization, and age are some of the risk factors that predispose to CDI. Virtually all hospitalized patients with nosocomially acquired CDI have a history of treatment with antimicrobial or antineoplastic agents within the previous 2 months. CDI usually develops during treatment with antibiotics or within some weeks of completing the course. After exposure to the organism (often in a hospital), the median incubation period is less than 1 week, with a median time of onset of 2 days. The difference in time between antibiotic use and the development of disease relates to the timing of exogenous acquisition of C. difficile. This paper reviewed the literature from 1984 to 2012 for studies on different classes of antibiotics in association with the rates of primary CDI and recurrent CDI (RCDI). The databases searched in this systematic review were PubMed (National Library of Medicine) and Medline (R) (Ovid); RefWorks was used to store bibliographic data. The search strategy yielded 733 studies, 692 articles from Ovid Medline (R) and 41 from PubMed, after removing all duplicates. Only 11 studies were included as high-quality studies. Of the 11 studies reviewed, 6 described the development of CDI in non-CDI patients taking antibiotics for other purposes, and 5 identified the risk factors associated with the development of recurrent CDI after exposure to antibiotics. The risk of developing CDI in non-CDI patients receiving beta-lactam antibiotics was 2.35%, while fluoroquinolones, clindamycin/macrolides, and other antibiotics were associated with risks of 2.64%, 2.54%, and 2.35%, respectively.
Of those who received a beta-lactam antibiotic, 26.7% developed RCDI, while 36.8% of those who received any fluoroquinolone, 26.5% of those who received clindamycin or macrolides, and 29.1% of those who received other antibiotics developed RCDI. Continued use of non-C. difficile antibiotics, especially fluoroquinolones, was identified as an important risk factor for both primary and recurrent CDI.
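The RCDI percentages above are simple proportions, and their precision depends on denominators the abstract does not give. As an illustration with invented counts (30 RCDI cases out of a hypothetical 100 beta-lactam recipients), a Wilson score interval, which behaves better than the naive normal interval at these sample sizes, can be computed as:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half

# Hypothetical: 30 of 100 patients developed RCDI
lo, hi = wilson_interval(30, 100)
```

With a denominator of 100, the interval spans roughly 20 percentage points, which is why the between-class differences reported above would need the underlying counts to judge their significance.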
Abstract:
Between 1999 and 2011, 4,178 suspected dengue cases in children less than 18 months of age were reported to the Centers for Disease Control and Prevention Dengue Branch in Puerto Rico. Of the 4,178, 813 were determined to be laboratory-positive and 737 laboratory-negative; the remainder were laboratory-indeterminate, not processed, or positive for Leptospira. On average, 63 laboratory-positive cases were reported per year. Laboratory-positive cases had a median age of 8.5 months; among these cases, the median age was 8.7 months for those with dengue fever and 7.9 months for those with dengue hemorrhagic fever. Clinical signs and symptoms indicative of dengue were most frequent among laboratory-positive cases and included fever, rash, thrombocytopenia, bleeding manifestations, and petechiae. The most common symptoms among laboratory-negative patients were fever, nasal congestion, cough, diarrhea, and vomiting. Using the 1997 WHO guidelines, nearly 50% of the laboratory-positive cases met the case definition for dengue fever, and 61 of these were further determined to meet the case definition for dengue hemorrhagic fever. In comparison, 15% of laboratory-negative cases met the case definition for dengue fever and less than 1% for dengue hemorrhagic fever. None of the laboratory-positive or laboratory-negative cases met the criteria for dengue shock syndrome.
Abstract:
Diarrhea remains a significant cause of worldwide morbidity and mortality; over 4 million children die of diarrhea annually. Although antibiotics can be used as prophylaxis or treatment for diarrhea, concern remains over antibiotic resistance. Rifaximin is a semi-synthetic rifamycin derivative used to treat symptoms of infectious diarrhea, irritable bowel syndrome, bacterial overgrowth of the small bowel, pouchitis, and fulminant ulcerative colitis. Rifaximin is of particular interest because it is poorly absorbed in the intestines, shows no indication of inducing bacterial resistance, and has minimal effect on the intestinal flora. To better understand how rifaximin functions, we compared the protein expression profile of cells pretreated with rifaximin to that of cells treated with acetone, rifamycin (control antibiotic), or media alone (untreated). Two-dimensional gel electrophoresis identified 38 protein spots that were up- or down-regulated by more than 2-fold in rifaximin-treated cells compared with controls. Sixteen of these spots were down-regulated, including keratin, annexin A5, intestinal-type alkaline phosphatase, histone H4, and the histone-binding protein RbBP4; 22 were up-regulated, including heat shock protein HSP 90-alpha, alkaline phosphatase, and fascin. Many of the identified proteins are associated with cell structure and the cytoskeleton, transcription and translation, and cellular metabolism. A better understanding of the functionality of rifaximin will identify additional potential uses for the drug and help determine for whom it is best suited.
Abstract:
This research examined the relationship between concomitant non-CDI antibiotic use and complications arising from Clostridium difficile infection. To test the hypothesized association, 160 CDI patients between the ages of 50 and 90 were selected, 80 exposed to concomitant antibiotics and 80 unexposed. Patients were matched on age and Horn's index, a severity score for underlying illness, de-identified by a third party, and analyzed retrospectively for differences between the two groups. In addition, patients exposed to broad-spectrum antibiotics at the time of CDI treatment were further studied to determine whether antibiotics had any effect on CDI complications. Between the two groups, the outcomes of interest (recurrent CDI, refractory CDI, mortality, ICU stay, and length of hospitalization) were not associated with concomitant antibiotic use at the time of CDI therapy. However, within the exposed population, certain classes of antibiotics, such as cephalosporins, antifungals, and tetracyclines, were more common than other types of therapy. In addition, days of therapy provided evidence that sustained use of antibiotics affected CDI (p = 0.08), although a larger sample size and additional study would be needed. Finally, refractory CDI may have been overestimated within the exposed population because of the possibility of antibiotic-associated diarrhea.
Abstract:
OBJECTIVE: To systematically review the published literature examining the complications associated with the use of misoprostol and to compare these complications with those associated with other forms of abortion induction. DATA SOURCES: Studies were identified through searches of medical literature databases, including Medline (Ovid), PubMed (NLM), LILACS, SciELO, and AIM (AFRO), and through review of the references of relevant articles. STUDY SELECTION AND METHODS: A descriptive systematic review that included studies reported in English and published before December 2012. Eligibility criteria included use of misoprostol (with or without other methods) and of any other method of abortion in a developing country, as well as quantitative data on the complications of each method. The following information was extracted from each study: author/year, country/city, study design/study sample, age range, setting of data collection, sample size, method of abortion induction, number of cases for each method, and percentage of complications with each method. RESULTS: A total of 4 studies were identified (all in Latin America) describing post-abortion complications of misoprostol and other methods in countries where abortion is generally considered unsafe and/or illegal. The four studies reported a range of complications, including bleeding, infection, incomplete abortion, intense pelvic pain, uterine perforation, headache, diarrhea, nausea, mechanical lesions, and systemic collapse. The most prevalent complications reported for misoprostol-induced abortion were bleeding (7-82%), incomplete abortion (33-70%), and infection (0.8-67%). The prevalence of these complications reported for other abortion methods included bleeding (16-25%), incomplete abortion (15-82%), and infection (13-50%). CONCLUSION: The literature identified by this systematic review is inadequate for determining the complications of misoprostol used in unsafe settings.
Abortion is considered an illicit behavior in these countries, making it difficult to investigate the details needed to conduct a study on abortion complications. Given the differences between the reviewed studies, as well as a variety of study limitations, it is not possible to draw firm conclusions about the rates of specific abortion-related complications.
Abstract:
Vaccination is a management strategy used to help reduce the prevalence of bovine respiratory disease in feedlots. However, not all animals respond similarly to vaccination. It is believed that an animal's genetics control part of its ability to respond to a vaccination protocol. To evaluate the genetic control of a new trait such as response to vaccination, it is important to understand the non-genetic factors that affect an animal's response. The objective of this study was to characterize the non-genetic factors affecting the overall response to a two-shot vaccination for bovine viral diarrhea virus type 2 (BVDV2) in weanling Angus calves.
Abstract:
Dispensing medications without a medical prescription is presumed to be a frequent practice in pharmacies. The objective was to determine the behavior of pharmacy staff when consulted by medical students trained to act as simulated patients presenting the following scenarios: 1: upper respiratory infection; 2: acute diarrhea; 3: dysuria; 4: genital ulcer; 5: arterial hypertension; 6: acute headache; 7: ankle arthralgia. One hundred interviews were conducted, and each scenario was presented at least 12 times. In only 28% of cases was no treatment recommended; the 72 prescriptions were issued by 38 pharmacists and 34 non-professionals. The medication was considered inappropriate in 58.3% of cases, iatrogenic in 51.4%, and the dosage incorrect in 50%. The drugs most frequently recommended were antibiotics (23.6%), NSAIDs (20.8%), antidiarrheals (11.8%), and cold remedies (9.7%). Scenarios 7 (100%), 1 (93.3%), and 2 (84.6%) had the highest frequency of treatment recommendation, and refusal to medicate was significantly more frequent in scenarios 4 (OR 0.16) and 5 (OR 0.22) (p < 0.05). The prescription was incorrect in 100% of scenarios 2 and 4, and iatrogenic in 100% of scenarios 2, 4, and 5. In 48 cases a medical consultation was suggested, and scenario 5 was 4.27 times more likely to be referred (p = 0.01). This study demonstrates that the sale of medications without a medical prescription is common in pharmacies of greater Mendoza, compromising people's safety and health.
Abstract:
Los objetivos principales de esta Tesis Doctoral fueron estudiar en 4 ensayos los efectos a) del procesado del maíz y la inclusión en los piensos de ingredientes de alta calidad como harina de pescado o fuentes de lactosa en lechones blancos b) inclusión en el pienso de diferentes productos derivados del haba de soja, con diferente contenido de proteína bruta (PB), tamaño de partícula y origen en lechones blancos e ibéricos y c) inclusión en el pienso de lechones ibéricos de ingredientes de alta calidad; forma de presentación del pienso y la duración del suministro del pienso prestárter sobre los parámetros productivos, la digestibilidad de los nutrientes, y las características morfológicas de la mucosa digestiva en lechones blancos e ibéricos recién destetados. En el experimento 1, los efectos de la complejidad del pienso prestárter sobre los parámetros productivos y la digestibilidad total aparente (TTAD) de los nutrientes fueron estudiados en lechones blancos recién destetados. Se utilizaron 10 tratamientos experimentales como resultado de 5 piensos prestárter (21 a 41 d de edad) y 2 piensos estárter (42 a 62 d de edad). Los piensos prestárter consistieron en un control negativo que incluía 40% de maíz crudo, 4% de harina de pescado y 7% de lactosa, un control positivo que incluía 40% de maíz cocido, 10% de harina de pescado, y 14% de lactosa, y 3 piensos adicionales con similares ingredientes que el pienso control positivo pero en los que a) 40% de maíz cocido fue sustituido por el mismo porcentaje de maíz crudo, b) se redujo el nivel de harina de pescado del 10 al 4%, y c) se redujo el nivel de lactosa del 14 al 7%. Cada tratamiento se replicó 6 veces (6 lechones/departamento). 
De 42 a 62 d de edad, la mitad de cada uno de los 5 piensos prestárter recibió un pienso estándar compuesto por harina de soja- maíz crudo y manteca y la otra mitad un pienso con similar perfil nutricional pero incluyendo un 20% de maíz cocido, 5% de harina de pescado, 1.3% de lactosa, 2% de concentrado de proteína de soja obtenido por fermentación y 1% de aceite de soja en lugar de harina de soja, maíz sin procesar y manteca. La complejidad del pienso no afectó a los parámetros productivos en ninguno de los periodos estudiados, pero el índice de diarreas durante la fase prestárter fue mayor en los lechones que recibieron el pienso control negativo que en los alimentados con cualquiera de los otros piensos (P<0.05). A los 30 días de edad (piensos prestárter), la digestibilidad de la materia orgánica (MO) y de la energía bruta (EB) fue menor (P<0.001) en los lechones que consumieron el pienso control negativo que en los lechones que consumieron cualquiera de los otros piensos. Sin embrago, la digestibilidad fecal de la PB no fue afectada. A los 50 días de edad (piensos estárter), la digestibilidad de los nutrientes fue similar en ambos piesnsos. Se concluye que la utilización de niveles elevados de ingredientes de alta calidad en los piensos no mejora los parámetros productivos de los lechones blancos en ninguno de los períodos estudiados. De 21 a 41 días de edad, el índice de diarreas se redujo y la digestibilidad de los nutrientes aumentó con la utilización de piensos de mayor calidad. Por lo tanto, la utilización de piensos con niveles elevados de ingredientes de calidad para reducir problemas digestivos y por lo tanto, mejorar los parámetros productivos podría estar justificada en algunos casos. En el experimento 2, se estudiaron los efectos de la inclusión en el pienso de harina de soja con diferente contenido de PB (44 vs. 
49 % PB), la micronización de la harina de soja de alta proteína (AP-HS; 49% PB) y la utilización de concentrado de proteína de soja (CPS; 65% PB) sobre los parámetros productivos y la TTAD de los nutrientes en lechones blancos recién destetados de 28 a 63 días de edad. De 28 a 49 días de edad (fase I), hubo un pienso control positivo con un 10% de CPS, un pienso control negativo con 14.8% de harina de soja estándar (R-HS; 44% de PB) y otros 4 piensos que incluían 13.3% de AP-HS de origen Americano (USA) o Argentino (ARG) y molidas groseramente (980 μm) o micronizadas (80 μm). Cada tratamiento se replicó 8 veces (6 lechones/departamento). De 49 a 63 días de edad (fase II), todos los lechones recibieron un pienso comercial común en forma de harina. En el global de la fase I, el tratamiento experimental no afectó a ninguno de los parámetros productivos estudiados. Sin embargo, de 28 a 35 días de edad, los lechones alimentados con AP-HS micronizadas tuvieron un mejor índice de conversión (IC; 1.11 vs. 0.98; P<0.05) que los alimentados con AP-HS molidas groseramente. También, de 35 a 42 días de edad, los lechones que recibieron el pienso con AP-HS micronizada tendieron (P=0.08) a consumir más pienso que los lechones que consumieron el pienso con AP-HS molida. Durante la fase II (49 a 63 días de edad), cuando todos los lechones recibieron un pienso común, no se observaron diferencias en productividad de los lechones debido al tratamiento previo. En general, la digestibilidad de los nutrientes a los 35 días de edad fue mayor para los lechones que consumieron CPS que para los lechones que consumieron R-HS con los lechones que consumieron AP-HS en una posición intermedia. La digestibilidad de la PB fue mayor (P≤0.01) para el pienso que contenía CPS que para el promedio de los 5 tratamientos en base a HS. 
También, la digestibilidad de la MO y de la materia seca (MS) fue mayor para el pienso que contenía AP-HS micronizada o molida groseramente que para el pienso que contenía R-HS. La micronización de la AP-HS no tuvo efecto alguno sobre la digestibilidad de los nutrientes. Se concluye que cuando el CPS sustituye en el pienso a R-HS, la digestibilidad de la PB aumenta pero no tiene efecto alguno sobre los parámetros productivos. La utilización de AP-HS en sustitución de R-HS en el pienso mejora la digestibilidad de los nutrientes pero no afecta a los parámetros productivos. La utilización de harina de soja micronizada en los piensos mejora la eficiencia alimenticia durante la primera semana post-destete pero no tiene efecto alguno sobre la digestibilidad de los nutrientes. En general, la inclusión de productos derivados del haba de soja con un alto valor añadido (CPS o AP-HS) en el pienso presenta pocas ventajas en términos productivos al uso de AP-HS en lechones blancos recién destetados. En el experimento 3, se estudiaron los mismos productos de soja y piensos similares al experimento 2 en lechones ibéricos recién destetados. Además de los parámetros productivos y la TTAD de los nutrientes, en este ensayo se estudió también la digestibilidad ileal aparente (AID) de los nutrientes, así como las características histológicas y morfometría de la mucosa ileal. Cada uno de los 6 tratamientos fue replicado 6 veces (6 lechones/departamento). De 30 a 51 días de edad la fuente de harina de soja no afectó a los parámetros productivos, pero el índice de diarreas fue mayor (P<0.001) y la TTAD y AID de los nutrientes menor en los lechones alimentados con R-HS que en los alimentados con CPS o AP-HS. Sin embargo, no se encontró ninguna diferencia para éstos parámetros entre los piensos que contenían AP-HS y CPS. La TTAD de la MO (P=0.07) y de la EB (P=0.05) tendieron a ser mayores en los piensos basados en AP-HS micronizada que en los basados en AP-HS molida. 
The TTAD of GE was higher (P<0.05) for the HP-SBM of USA origin than for the HP-SBM of ARG origin. Piglets fed R-SBM had shorter villi (P<0.01) than piglets fed HP-SBM or SPC, but no differences were observed between piglets fed the diets containing HP-SBM or SPC. It is concluded that the inclusion of HP-SBM or SPC in the diet in substitution of R-SBM reduces the incidence of diarrhea and improves nutrient digestibility and ileal morphology without affecting growth performance. The use of diets based on added-value soy products (SPC or HP-SBM) in substitution of R-SBM improves the TTAD of all nutrients and reduces the incidence of diarrhea without affecting growth performance. In experiment 4, the effects of the CP content and complexity of the diet, feed form, and duration of supply of the prestarter diet on growth performance and the TTAD of nutrients were studied in newly weaned Iberian piglets from 28 to 63 d of age. There were 12 experimental treatments with 2 types of feed (HQ, high quality, and LQ, medium quality), 2 feed forms (pellet and mash), and 3 durations of supply of the prestarter diet (7, 14, and 21 d). From d 7, 14, or 21 of the experiment (depending on treatment) to d 35, all piglets received a commercial diet in mash form. Each treatment was replicated 3 times (6 piglets/pen). For the entire experiment, average daily gain (ADG; P<0.05) and average daily feed intake (ADFI; P<0.01) were lower in piglets fed the HQ diet than in those fed the LQ diet, although FCR was not affected. Pelleting of the prestarter diet did not affect growth but improved feed efficiency.
Feeding the prestarter diet from d 0 to 21 of the trial improved FCR (P<0.05) but reduced ADG (P<0.01) compared with feeding this diet for only 7 or 14 d. The incidence of diarrhea tended to be higher (P=0.06) in piglets fed the HQ diets than in those fed the LQ diets. Likewise, the incidence of diarrhea was higher in piglets fed pellets than in those fed mash (P<0.001). In addition, the incidence of diarrhea was higher in piglets fed the prestarter diet for 14 or 21 d than in those fed it for only 7 d (P<0.01). From 28 to 49 d of age, ADG and FCR were not affected by diet complexity, but pelleting or increasing the duration of supply of the prestarter diet improved FCR (P<0.01). Also, in this period the incidence of diarrhea was higher in piglets fed pelleted diets than in those fed mash diets, and higher in piglets fed the prestarter diet for 14 or 21 d than in those fed it for only 7 d (P<0.01). From 49 to 63 d of age, piglets previously fed the LQ diets grew more than those fed the HQ diets (P<0.001). Likewise, piglets fed the prestarter diet for 21 d ate less (P<0.001) and grew less (P<0.05), showing poorer feed efficiency (P<0.05), than piglets fed it for only 7 or 14 d. The digestibility of OM was higher in piglets fed the HQ diets than in those fed the LQ diets (P<0.05). Pelleting improved the digestibility of the main nutrients. The HQ prestarter diets improved nutrient digestibility but not feed efficiency in Iberian piglets from 28 to 63 d of age. Pelleting improved feed efficiency.
Increasing the duration of supply of the prestarter diet from 7 to 21 d improved feed efficiency but reduced ADG. Therefore, the use of high-quality pelleted diets during the prestarter period is recommended in Iberian piglets, but only during the first week post-weaning. ABSTRACT The main objectives of this PhD Thesis were to study the effects of a) heat processing (HP) of corn and inclusion of high quality ingredients of animal origin such as fish meal (FM) and dried milk products in the diet, b) inclusion of different soy products varying in crude protein (CP) content, particle size, and origin of the beans in diets for conventional white and Iberian weanling pigs, and c) effects of ingredient quality, feed form, and duration of supply of the phase I diets on growth performance, nutrient digestibility, and intestinal morphology of weanling pigs. In experiment 1, the effect of diet complexity on total tract apparent digestibility (TTAD) and growth performance was studied in piglets from 21 to 62 d of age. There were 10 experimental treatments which resulted from the combination of 5 phase I (21 to 41 d of age) and 2 phase II (42 to 62 d of age) diets. The 5 phase I diets consisted of a negative control diet that contained 40% raw corn, 4% FM, and 7% lactose (LAC); a positive control diet that contained 40% HP corn, 10% FM, and 14% LAC; and 3 extra diets that used similar ingredients to those of the positive control diet but in which a) 40% of HP corn was substituted by raw corn, b) 4% FM rather than 10% FM, and c) 7% LAC instead of 14% LAC were included in the diet. Each treatment was replicated 6 times (6 pigs per pen).
From 42 to 62 d of age, half of the pens of each of the 5 phase I treatments received a standard soybean meal (SBM)–native corn–lard diet whereas the other half received a diet with a similar nutrient profile but that included 20% HP corn, 5% FM, 1.3% lactose, 2% fermented soy protein concentrate, and 1% soybean oil in substitution of variable amounts of non-processed corn, SBM, and lard. Dietary treatment did not affect piglet performance at any age, but the incidence of post-weaning diarrhea (PWD) was higher during phase I in piglets fed the negative control diet than in piglets fed any of the other diets (P<0.05). At 30 d of age (phase I diets), the TTAD of organic matter (OM) and gross energy (GE) was lower (P<0.001) in pigs fed the negative control diet than in pigs fed the other diets, but CP digestibility was not affected. At 50 d of age (phase II diets), dietary treatment did not affect the TTAD of any dietary component. It is concluded that the use of high quality ingredients at high levels in the diet did not improve growth performance of piglets at any age. From 21 to 41 d of age, PWD was reduced and nutrient digestibility was increased in pigs fed the more complex diets. Consequently, the inclusion of high levels of high quality ingredients in piglet diets to maximize growth performance might not be justified under all circumstances. In experiment 2, the effect of CP content (44 vs. 49% CP) of SBM, micronization (fine grinding) of the high CP SBM (HP-SBM; 49% CP), and soy protein concentrate (SPC; 65% CP) on TTAD and growth performance was studied in conventional white piglets from 28 to 63 d of age. From 28 to 49 d of age (phase I), there was a positive control diet that included 6.5% CP from SPC and a negative control diet that supplied the same amount of CP as regular SBM (R-SBM; 44% CP) of Argentina (ARG) origin.
The other 4 diets included the same amount of dietary CP from 2 sources of HP-SBM (USA or ARG origin), either ground (990 μm) or micronized (60 μm). Each treatment was replicated 8 times (6 pigs per pen). From 49 to 63 d of age (phase II), all pigs were fed a common commercial starter diet. For the entire phase I, the type of soy product included in the diet did not affect growth performance of the pigs. However, from 28 to 35 d of age pigs fed the micronized HP-SBM had a better feed conversion ratio (FCR; 0.90 vs. 1.01; P<0.05) than pigs fed the ground HP-SBM. Also, from 35 to 42 d of age, average daily feed intake (ADFI) tended to be higher (P=0.08) for pigs fed the micronized HP-SBM than for pigs fed the ground HP-SBM. During phase II, when all the pigs received the same diet, no differences among treatments were observed. In general, the TTAD of nutrients at 35 d of age was higher for the SPC than for the R-SBM diet, with the HP-SBM diets being intermediate. The TTAD of CP was higher (83.8% vs. 81.9%; P≤0.01) for the SPC diet than for the average of the 5 SBM-containing diets. Also, the digestibility of OM and dry matter (DM) was higher (P<0.01) for the HP-SBM diets, either ground or micronized, than for the R-SBM diet. Micronization of the HP-SBM did not affect nutrient digestibility. It is concluded that when R-SBM was substituted by SPC, CP digestibility was improved but no effects on growth performance were observed. The use of HP-SBM in substitution of R-SBM in the diet improved nutrient digestibility but did not affect piglet performance. The inclusion of micronized HP-SBM in the diet improved FCR during the first week post-weaning but did not affect the TTAD of nutrients. Therefore, the inclusion of added-value soy products (SPC or micronized SBM) in the diet presents little advantage in terms of growth performance over the use of HP-SBM in pigs weaned at 28 d of age.
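The digestibility comparisons above rest on total tract apparent digestibility (TTAD). The abstracts do not state how TTAD was measured, so as a hedged illustration only, here is a minimal sketch of the indigestible-marker approach commonly used for such trials (all concentrations below are invented, not study data):

```python
# Sketch of the indigestible-marker method for total tract apparent
# digestibility (TTAD). Assumes an inert marker (e.g., acid-insoluble ash)
# measured in diet and feces; all numbers are illustrative only.

def ttad(marker_diet: float, marker_feces: float,
         nutrient_diet: float, nutrient_feces: float) -> float:
    """TTAD (%) = 100 * [1 - (marker_diet / marker_feces)
                           * (nutrient_feces / nutrient_diet)].

    As the marker concentrates in feces relative to the diet, the ratio
    marker_diet/marker_feces estimates the indigestible fraction of DM.
    """
    return 100.0 * (1.0 - (marker_diet / marker_feces)
                    * (nutrient_feces / nutrient_diet))

# Hypothetical CP concentrations (% of DM): 1% marker in diet, 5% in feces,
# 20% CP in diet, 16% CP in feces.
print(round(ttad(1.0, 5.0, 20.0, 16.0), 1))  # → 84.0
```

A TTAD of CP near 84%, as in this made-up example, is in the same range as the 83.8% reported for the SPC diet, but the inputs here are purely illustrative.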
In experiment 3, the effects of the same sources of soy protein used in experiment 2 on TTAD and growth performance of crossbred Iberian pigs from 30 to 61 d of age were studied. In addition, the apparent ileal digestibility (AID) of nutrients and ileal mucosal morphology were also determined. Dietary treatment did not affect growth performance of the pigs at any age, but from 30 to 51 d of age (phase I diets), PWD was higher (P<0.001) and the TTAD and AID of all nutrients were lower for pigs fed the R-SBM diet than for pigs fed the HP-SBM or the SPC diets. However, no differences between the HP-SBM and the SPC containing diets were detected for digestibility of any dietary component. The TTAD of OM (P=0.07) and GE (P=0.05) tended to be higher for the micronized HP-SBM than for the ground HP-SBM, and that of GE was higher (P<0.05) for the USA meal than for the ARG meal. Pigs fed R-SBM had lower villus height (P<0.01) than pigs fed HP-SBM or SPC, but no differences in ileal mucosal morphology were detected between the SPC and HP-SBM containing diets. It is concluded that feeding the HP-SBM or SPC in substitution of R-SBM reduced PWD and improved nutrient digestibility and ileal morphology in piglets as compared with feeding the R-SBM, but had no effect on growth performance. The inclusion in the diet of added-value soy products (micronized SBM or SPC) in substitution of the R-SBM increased the TTAD of all nutrients and reduced PWD but had no advantage in terms of growth performance over the use of ground HP-SBM. In experiment 4, the effect of CP content and ingredient complexity, feed form, and duration of feeding of the phase I diets on growth performance and TTAD of nutrients were studied in Iberian pigs from 28 to 63 d of age. There were 12 dietary treatments with 2 types of feed (HQ, higher quality, and LQ, medium quality), 2 feed forms (pellets vs. mash), and 3 durations of supply (7, 14, and 21 d) of the phase I diets.
From d 7, 14, or 21 (depending on treatment) to d 35 of experiment, all pigs received a common diet in mash form. Each treatment was replicated 3 times (6 pigs/pen). For the entire experiment, average daily gain (ADG; P<0.05) and ADFI (P<0.01) were lower with the HQ than with the LQ phase I diets but FCR was not affected. Pelleting of the phase I diets did not affect ADG but improved FCR (P<0.01). Feeding the phase I diets from d 0 to 21 improved FCR (P<0.05) but decreased ADG (P<0.01) as compared with 7 or 14 d of feeding. Post-weaning diarrhea tended to be higher (P=0.06) for pigs fed the HQ diets than for pigs fed the LQ diets, and was higher for pigs fed pellets than for pigs fed mash (P<0.001). Also, PWD was higher for pigs fed the phase I diet for 14 or 21 d than for pigs fed this diet for 7 d (P<0.01). From d 0 to 21, ADG and FCR were not affected by feed quality, but feeding pellets or increasing the duration of feeding the phase I diets improved FCR (P<0.01). Also, in this period PWD was higher with pellets than with mash and for pigs fed the phase I diets for 14 or 21 d than for pigs fed this diet for only 7 d (P<0.01). From d 21 to 35, pigs previously fed the LQ diet had higher ADG than pigs fed the HQ phase I diets (P<0.001). Also, pigs that were fed the phase I diets for 21 d had lower ADG (P<0.05) and ADFI (P<0.001) and poorer FCR (P<0.05) than pigs fed these diets for 7 or 14 d. Organic matter digestibility was higher for pigs fed the HQ phase I diets than for pigs fed the LQ phase I diets (P<0.05). Pelleting improved TTAD of all nutrients (P<0.01). It is concluded that HQ phase I diets increased TTAD of nutrients but not feed efficiency of Iberian pigs from 28 to 63 d of age. Also, pelleting improved nutrient digestibility and feed efficiency. Increasing the duration of supply of the phase I diets from 7 to 21 d improved feed efficiency but reduced ADG.
Therefore, the use of LQ phase I diets in pellet form for no more than 7 d after weaning is recommended in Iberian pigs.
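The performance traits compared throughout these experiments (ADG, ADFI, FCR) are simple pen-level ratios. As a hedged sketch with invented pen data (none of these weights or intakes come from the thesis), they can be computed as:

```python
# Hypothetical sketch of pen-level growth performance metrics.
# All input values are illustrative, not data from the experiments above.

def pen_performance(start_wt_kg: float, end_wt_kg: float,
                    feed_kg: float, days: int, pigs: int):
    """Return (ADG, ADFI, FCR) for one pen.

    ADG  = average daily gain per pig (kg/d)
    ADFI = average daily feed intake per pig (kg/d)
    FCR  = feed conversion ratio (kg feed per kg gain)
    """
    pig_days = days * pigs
    adg = (end_wt_kg - start_wt_kg) / pig_days   # pen gain spread over pig-days
    adfi = feed_kg / pig_days                    # pen feed disappearance per pig-day
    fcr = adfi / adg                             # lower is better
    return adg, adfi, fcr

# Invented example: a 6-pig pen gaining 42 kg on 58.8 kg of feed in 14 d.
adg, adfi, fcr = pen_performance(start_wt_kg=48.0, end_wt_kg=90.0,
                                 feed_kg=58.8, days=14, pigs=6)
print(round(adg, 2), round(adfi, 2), round(fcr, 2))  # → 0.5 0.7 1.4
```

An FCR near 1.0, as reported for the micronized HP-SBM piglets in experiment 2, would mean roughly one kilogram of feed per kilogram of gain; this example's 1.4 is merely illustrative.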
Resumo:
Gluten is the main structural protein complex of wheat, with equivalent toxic proteins found in other cereals (rye, barley, and oats); these proteins are responsible for different immunologic responses with distinct clinical expressions of disease. The spectrum of gluten-related disorders has been classified, according to pathogenic, clinical, and epidemiological differences, into three main forms: (i) wheat allergy (WA), an IgE-mediated disease; (ii) autoimmune disease, including celiac disease (CD), dermatitis herpetiformis, and gluten ataxia; and (iii) gluten sensitivity, possibly immune-mediated [1]. WA is an immunologic Th2 response with typical manifestations that can vary from dermatological, respiratory, and/or intestinal symptoms to anaphylactic reactions. In contrast, CD is an autoimmune disorder, a gliadin-specific T-cell response enhanced by the action of intestinal tissue transglutaminase (tTG), with a wide clinical spectrum including symptomatic cases with either intestinal (e.g., chronic diarrhea, weight loss) or extraintestinal features (e.g., anemia, osteoporosis, neurologic disturbances) and silent forms that are occasionally discovered as a result of serological screening [1]. We studied wheat allergy in two children with an early diagnosis of CD who developed immediate allergic symptoms after eating small amounts of wheat flour.
Resumo:
Enterotoxigenic Escherichia coli associated with human diarrheal disease utilize any of a limited group of serologically distinguishable pili for attachment to intestinal cells. These include CS1 and CFA/I pili. We show here that chemical modification of arginyl residues in CS1 pili abolishes CS1-mediated agglutination of bovine erythrocytes, which serves as a model system for attachment. Alanine substitution of the single arginyl residue in CooA, the major pilin, had no effect on the assembly of pili or on hemagglutination. In contrast, substitution of alanine for R181 in CooD, the minor pilin associated with the pilus tip, abolished hemagglutination, and substitution of R20 reduced hemagglutination. Neither of these substitutions affected CS1 pilus assembly. This shows that CooD is essential for CS1-mediated attachment and identifies specific residues that are involved in receptor binding but not in pilus assembly. In addition to mediating agglutination of bovine erythrocytes, CFA/I also mediates agglutination of human erythrocytes. Substitution of R181 by alanine in the CooD homolog, CfaE, abolished both of these reactions. We conclude that the same region of the pilus tip protein is involved in adherence of CS1 and CFA/I pili, although their receptor specificities differ. This suggests that the region of the pilus tip adhesin protein that includes R181 might be an appropriate target for therapeutic intervention or for a vaccine to protect against human diarrhea caused by enterotoxigenic E. coli strains that have serologically different pili.