Abstract:
Background. Large field studies of travelers' diarrhea (TD) in multiple destinations are limited by the need to perform stool cultures on site in a timely manner. A method for the collection, transport and storage of fecal specimens that does not require immediate processing or refrigeration and is stable for months would be advantageous.

Objectives. To determine whether enteric bacterial pathogen DNA can be identified on cards routinely used for evaluation of fecal occult blood.

Methods. U.S. students traveling to Mexico in 2005-07 were followed for occurrence of diarrheal illness. When ill, students provided a stool specimen for culture and occult blood testing by the standard method. Cards were then stored at room temperature prior to DNA extraction. A multiplex fecal PCR was performed to identify enterotoxigenic Escherichia coli (ETEC) and enteroaggregative E. coli (EAEC) in DNA extracted from stools and from occult blood cards.

Results. Significantly more EAEC cases were identified by PCR of DNA extracted from cards (49%) or from frozen feces (40%) than by culture followed by HEp-2 adherence assays (13%). Similarly, more ETEC cases were detected in card DNA (38%) than in fecal DNA (30%) or by culture followed by hybridization (10%). Compared to culture, the sensitivity and specificity of the card test were 75% and 62% for EAEC and 50% and 63% for ETEC; compared to multiplex fecal PCR, they were 53% and 51% for EAEC and 56% and 70% for ETEC.

Conclusions. DNA extracted from fecal cards used for detection of occult blood is of use in detecting enteric pathogens.
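The card test's accuracy is summarized as sensitivity and specificity against culture as the reference method. As a minimal sketch of how these two quantities come out of a 2x2 contingency table (the counts below are hypothetical illustrations, not the study's data):

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Return (sensitivity, specificity) from 2x2 table counts:
    tp/fn = reference-positives the new test did/did not detect,
    tn/fp = reference-negatives the new test did/did not clear."""
    sensitivity = tp / (tp + fn)   # true positives among reference-positives
    specificity = tn / (tn + fp)   # true negatives among reference-negatives
    return sensitivity, specificity

# Hypothetical counts chosen to mimic the card test's 75%/62.5% performance
sens, spec = sensitivity_specificity(tp=15, fp=9, fn=5, tn=15)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")  # prints sensitivity=75.0%, specificity=62.5%
```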
Abstract:
Acute diarrhea is the most common medical problem in developing countries, and infectious agents are responsible for the majority of cases. Knowing the cause of acute diarrhea is important for developing plans for disease prevention, control and therapy. Acute diarrhea is caused by many viruses, bacteria and parasites.

Travelers to developing countries commonly develop diarrhea as a result of eating contaminated food or drinking contaminated water. About 30-50% of travelers from industrialized countries such as the United States to developing countries are at risk of developing diarrhea. High-risk areas for travelers' diarrhea are Mexico, Latin America and Southeast Asia. Public restaurants are common sites of exposure to this type of food-borne infection in travelers. Food becomes contaminated when it is handled by people with fecal material on their hands.

The importance of diffusely adherent Escherichia coli (DAEC) in travelers to these areas has not been well studied. Some studies of DAEC have shown the organism to be present in children without symptoms; others have shown a relationship between DAEC infection and the presence of symptoms. I selected this topic because the patho-physiological processes in DAEC infection that allow intestinal and extra-intestinal infections to develop are not fully understood. DAEC-related acute diarrhea is a relatively new topic of public health significance, and there are few studies of the virulence and pathogenic mechanisms of DAEC. The presumed virulence factor of the organism is diffuse attachment to the intestinal lining of the infected host. However, more research is needed to identify the pathogenic mechanisms and virulence factors associated with DAEC infection for better treatment planning and diarrhea prevention.
Abstract:
Individuals who do not respond to medical therapy for ulcerative colitis (UC) often undergo proctocolectomy followed by ileal pouch-anal anastomosis (IPAA) in hopes of resolving symptoms associated with UC. Inflammation of the ileal pouch, better known as pouchitis, is the most common complication of the IPAA procedure, and its causes and development are not well understood. To better understand the pathogenesis of pouchitis, pouch aspirates of patients who had undergone IPAA were quantitatively analyzed for fecal IL-8, IL-17, and IL-23 levels. According to the published literature, IL-8 has been linked to pouchitis, whereas IL-17 and IL-23 are associated with intestinal inflammation. The study had 80 participants: 33 patients diagnosed with Crohn's disease (CD) of the pouch, 19 diagnosed with pouchitis, and 28 diagnosed with normal pouches. Patient characteristics and histopathological findings for all patients were noted and statistically compared in addition to fecal cytokine levels. This study supported previous literature associating IL-8 production with pouch inflammation. However, IL-17 and IL-23 levels in both CD of the pouch and pouchitis were not significantly different from the levels noted in normal pouches.
Abstract:
Histo-blood group antigens (HBGAs) have been associated with susceptibility to enteric pathogens including noroviruses (NoVs), enterotoxigenic Escherichia coli (ETEC), Campylobacter jejuni, and Vibrio cholerae. We performed a retrospective cohort study to evaluate the relationship between traveler HBGA phenotypes and susceptibility to travelers' diarrhea (TD) and post-infectious complications. 364 travelers to Guadalajara, Mexico were followed prospectively from June 1-September 30, 2007 and from June 7-July 28, 2008 for the development of TD, and at 6 months for post-infectious irritable bowel syndrome (PI-IBS). Noroviruses were detected in illness stool specimens with RT-PCR. Diarrheal stool samples were also assayed for enterotoxigenic and enteroaggregative E. coli, Salmonella species, Shigella species, Vibrio species, Campylobacter jejuni, Yersinia enterocolitica, Aeromonas species, and Plesiomonas species. Diarrheal stools were evaluated for inflammation with fecal leukocytes, mucus, and occult blood. Phenotyping for ABO and Lewis antigens with an ELISA assay and FUT2 gene PCR genotyping for secretor status were performed on saliva. 171 of 364 (47%) subjects developed TD. HBGA typing of the travelers revealed O (62.9%), A (34.6%), B (1.6%), and AB (0.8%) phenotypes. There were 7% nonsecretors and 93% secretors among the travelers. The AB phenotype was more commonly associated with Cryptosporidium species (P=0.04) and ETEC (P=0.08) as causes of TD. AB and B phenotype individuals were more likely to experience inflammatory diarrhea, particularly mucoid diarrhea (P=0.02). However, there were relatively few individuals with AB and B phenotypes. GI and GII NoV and Cryptosporidium species infections and PI-IBS were identified only in secretors, but these differences were not statistically significant (P=1.00, P=1.00, and P=0.60, respectively).

Additional studies are needed to evaluate whether AB phenotype individuals may be more susceptible to developing TD associated with Cryptosporidium species or ETEC, and whether AB and B phenotype individuals may be more likely to develop inflammatory TD. Further studies are needed to investigate whether nonsecretor travelers may be at lower risk of developing infections with NoVs and Cryptosporidium species and PI-IBS.
Abstract:
Background. Colorectal cancer (CRC) is the third most commonly diagnosed cancer (excluding skin cancer) in both men and women in the United States, with an estimated 148,810 new cases and 49,960 deaths in 2008 (1). Racial/ethnic disparities have been reported across the CRC care continuum. Studies have documented racial/ethnic disparities in CRC screening (2-9), but only a few have looked at these differences in CRC screening over time (9-11). No studies have compared these trends in a population with CRC and a population without cancer. Additionally, although there is evidence suggesting that hospital factors (e.g. teaching hospital status and NCI designation) are associated with CRC survival (12-16), no studies have sought to explain the racial/ethnic differences in survival by looking at differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital characteristics.

Objectives and Methods. The overall goals of this dissertation were to describe the patterns and trends of racial/ethnic disparities in CRC screening (i.e. fecal occult blood test (FOBT), sigmoidoscopy (SIG) and colonoscopy (COL)) and to determine whether racial/ethnic disparities in CRC survival are explained by differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital factors. These goals were accomplished in a two-paper format.

In Paper 1, "Racial/Ethnic Disparities and Trends in Colorectal Cancer Screening in Medicare Beneficiaries with Colorectal Cancer and without Cancer in SEER Areas, 1992-2002", the study population consisted of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and 62,917 Medicare beneficiaries without cancer during the same period. Both cohorts were aged 67 to 89 years and resided in 16 Surveillance, Epidemiology and End Results (SEER) regions of the United States. Screening procedures between 6 months and 3 years prior to the date of diagnosis for CRC patients, and prior to the index date for persons without cancer, were identified in Medicare claims. The crude and age-gender-adjusted percentages and odds ratios of receiving FOBT, SIG, or COL were calculated. Multivariable logistic regression was used to assess the effect of race/ethnicity on the odds of receiving CRC screening over time.

Paper 2, "Racial/Ethnic Disparities in Colorectal Cancer Survival: To what extent are racial/ethnic disparities in survival explained by racial differences in socio-demographics, screening, co-morbidities, treatment, tumor or hospital characteristics", included a cohort of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and residing in 16 SEER regions of the United States, identified in the SEER-Medicare linked database. Survival was estimated using the Kaplan-Meier method. Cox proportional hazards modeling was used to estimate hazard ratios (HR) of mortality and 95% confidence intervals (95% CI).

Results. The screening analysis demonstrated racial/ethnic disparities in screening over time among the cohort without cancer. From 1992 to 1995, Blacks and Hispanics were less likely than Whites to receive FOBT (OR=0.75, 95% CI: 0.65-0.87; OR=0.50, 95% CI: 0.34-0.72, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.72-0.85; OR=0.67, 95% CI: 0.54-0.75, respectively). Blacks and Hispanics were also less likely than Whites to receive SIG from 1992 to 1995 (OR=0.75, 95% CI: 0.57-0.98; OR=0.29, 95% CI: 0.12-0.71, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.68-0.93; OR=0.50, 95% CI: 0.35-0.72, respectively).

The survival analysis showed that Blacks had worse CRC-specific survival than Whites (HR: 1.33, 95% CI: 1.23-1.44), but this was reduced for stage I-III disease after full adjustment for socio-demographics, tumor characteristics, screening, co-morbidities, treatment and hospital characteristics (aHR=1.24, 95% CI: 1.14-1.35). Socioeconomic status, tumor characteristics, treatment and co-morbidities contributed to the reduction in hazard ratios between Blacks and Whites with stage I-III disease. Asians had better survival than Whites both before (HR: 0.73, 95% CI: 0.64-0.82) and after (aHR: 0.80, 95% CI: 0.70-0.92) adjusting for all predictors for stage I-III disease. For stage IV, both Asians and Hispanics had better survival than Whites, and after full adjustment, survival improved (aHR=0.73, 95% CI: 0.63-0.84; aHR=0.74, 95% CI: 0.61-0.92, respectively).

Conclusion. Screening disparities remain between Blacks and Whites, and between Hispanics and Whites, but have decreased in recent years. Future studies should explore other factors that may contribute to screening disparities, such as physician recommendations and language/cultural barriers, in this and younger populations. There were substantial racial/ethnic differences in CRC survival among older Whites, Blacks, Asians and Hispanics. Co-morbidities, SES, tumor characteristics, treatment and other predictor variables contributed to, but did not fully explain, the CRC survival differences between Blacks and Whites. Future research should examine the role of quality of care, particularly the benefit of treatment and post-treatment surveillance, in racial disparities in survival.
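The screening comparisons above are reported as odds ratios with 95% confidence intervals. As a sketch of how a crude odds ratio and its Wald confidence interval come from a 2x2 table (the counts below are hypothetical, and the published estimates are model-adjusted rather than crude):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and 95% Wald CI from a 2x2 table:
    a = exposed group screened,   b = exposed group not screened,
    c = comparison group screened, d = comparison group not screened."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts chosen to give OR = 0.75
or_, lo, hi = odds_ratio_ci(30, 70, 40, 70)
print(f"OR={or_:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```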
Abstract:
Background. A few studies have reported gender differences along the colorectal cancer (CRC) continuum, but none has done so longitudinally to compare a cancer and a non-cancer population.

Objectives and Methods. To examine gender differences in colorectal cancer screening (CRCS); to examine trends in gender differences in CRC screening among two groups of patients (Medicare beneficiaries with and without cancer); to examine gender differences in CRC incidence; and to examine whether any of these differences changed over time. In Paper 1, the study population consisted of men and women, ages 67-89 years, with CRC (73,666) or without any cancer (39,006), residing in 12 U.S. Surveillance, Epidemiology and End Results (SEER) regions. Crude and age-adjusted percentages and odds ratios of receiving fecal occult blood test (FOBT), sigmoidoscopy (SIG), or colonoscopy (COL) were calculated. Multivariable logistic regression was used to assess the effect of gender on the odds of receiving CRC screening over time.

In Paper 2, age-adjusted incidence rates and proportions over time were reported across race, CRC subsite, CRC stage and SEER region for 373,956 patients, ages 40+ years, residing in 9 SEER regions and diagnosed with malignant CRC.

Results. Overall, women had higher CRC screening rates than men, and screening rates in general were higher in the SEER sample of persons with a CRC diagnosis. Significant temporal divergence in FOBT screening was observed between men and women in both cohorts. Although the largest temporal increases in screening rates were found for COL, especially in the cohort with CRC, little change in the gender gap was observed over time. Receipt of FOBT was significantly associated with female gender, especially in the period of full Medicare coverage. Receipt of COL was significantly associated with male gender, especially in the period of limited Medicare coverage.

Overall, approximately equal numbers of men (187,973) and women (185,983) were diagnosed with malignant CRC. Men had significantly higher age-adjusted CRC incidence rates than women across all categories of age, race, subsite, stage and SEER region, even though rates declined in all categories over time. Significant moderate increases in the rate difference occurred among 40-59 year olds; significant reductions occurred among patients aged 70+, within the rectum subsite, unstaged and distant-stage CRC, and eastern and western SEER regions.

Conclusions. Persistent gender differences in CRC incidence across time may have implications for gender-based interventions that take age into consideration. A shift toward proximal cancer was observed over time for both genders, but the high proportion of men who develop rectal cancer suggests that a greater proportion of men may need to be targeted with newer screening methods such as fecal DNA or COL. Although previous reports have documented higher CRC screening among men, the higher incidence of CRC observed among men suggests that higher-risk categories of men are probably not being reached. FOBT utilization rates among women have increased over time, and the gender gap widened between 1998 and 2005. COL utilization is associated with male gender, but the differences over time are small.
Abstract:
Diarrhea is a major public health problem among infants and young children in developing countries. Not all episodes of diarrhea are confirmed as infectious, suggesting alternate mechanisms. One such mechanism is immunoglobulin E (IgE)-mediated, or allergic, diarrhea, which can be seen in food allergy. To determine the relation between allergic gastroenteritis and feeding practice, data from a cohort of 152 infants followed from birth to one year of age in a rural community of Egypt between October 1987 and April 1988 were analyzed. In multivariate analysis, a statistically significant higher risk was observed in the presence of factors such as consumption of milk pudding (RR = 7.4, CI = 1.5-36.2, p = 0.01), infant age 3-6 months (RR = 7.7, CI = 1.3-45.9, p = 0.02), antenatal maternal vaccination (RR = 3.1, CI = 1.3-7.0, p = 0.0) and wet-nursing (RR = 2.7, CI = 1.1-6.5, p = 0.02). In contrast, infants who were completely breast-fed (RR = 0.13, CI = 0.02-0.6, p = 0.01) and infants whose family owned a television set (RR = 0.29, CI = 0.1-0.6, p = 0.0) were less likely to develop allergic gastroenteritis. The role of IgE in the development of persistent diarrhea was also examined in a nested case-control design. Multivariate analysis revealed a significant association between detection of fecal IgE and development of persistent diarrhea compared to acute diarrhea controls (OR = 3.32, CI = 1.0-10.9, p = 0.04) and healthy or non-diarrhea controls (OR = 4.8, CI = 1.07-21.7, p = 0.03).
Abstract:
The natural history of placebo-treated travelers' diarrhea and the prognostic factors of recovery from diarrhea were evaluated using 9 groups of placebo-treated subjects from 9 clinical trials conducted since 1975, for use as a historical control in future clinical trials of antidiarrheal agents. All of these studies were done by the same group of investigators at one site (Guadalajara, Mexico). The studies are similar in terms of population, measured parameters, microbiologic identification of enteropathogens and definitions of parameters. The studies had two different durations of follow-up: in some studies subjects were followed for two days, and in others for five days.

Using definitions established by the Infectious Diseases Society of America and the Food and Drug Administration, the following efficacy parameters were evaluated: time to last unformed stool (TLUS), number of unformed stools post-initiation of placebo treatment for five consecutive days of follow-up, microbiologic cure, and improvement of diarrhea. Among the groups followed for five days, the mean TLUS ranged from 59.1 to 83.5 hours. Fifty percent to 78% had diarrhea lasting more than 48 hours, and 25% had diarrhea for more than five days. The mean number of unformed stools passed on the first day post-initiation of therapy ranged from 3.6 to 5.8, and on the fifth day from 0.5 to 1.5. By the end of follow-up, diarrhea had improved in 82.6% to 90% of subjects. Subjects with enterotoxigenic E. coli had microbiologic cure rates of 21.6% to 90.0%, and subjects with Shigella species experienced microbiologic cure rates of 14.3% to 60.0%.

To evaluate the prognostic factors of recovery from diarrhea (the primary efficacy parameter in evaluating the efficacy of antidiarrheal agents against travelers' diarrhea), subjects from five studies were pooled and the Cox proportional hazards model was used to evaluate predictors of prolonged diarrhea. After adjusting for design characteristics of each trial, fever (rate ratio (RR) 0.40), presence of invasive pathogens (RR 0.41), presence of severe abdominal pain and cramps (RR 0.50), more than five watery stools (RR 0.60), and presence of non-invasive pathogens (RR 0.84) predicted a longer duration of diarrhea. Severe vomiting (RR 2.53) predicted a shorter duration of diarrhea. The number of soft stools, presence of fecal leukocytes, presence of nausea, and duration of diarrhea before enrollment were not associated with duration of diarrhea.
Abstract:
In this dissertation, attempts were made to develop a purified antigen with high sensitivity and specificity for the diagnosis of Schistosoma mansoni (Sm) infection using the hybridoma technique.

Spleen cells obtained from mice immunized by infection with Sm and boosted with cercarial antigens, or by injection of circulating antigen (CA) in serum from infected mice, were fused with Sp2/0 myeloma cells. Active infection resulted in a higher yield of hybridomas (100%) than CA (20%), and higher levels of antibody reactivity as measured by ELISA.

The IgM and IgG monoclonal antibodies (MCAbs) were purified by gel filtration, DE 52 ion-exchange column and protein A affinity column, respectively. The cercarial and egg antigens were purified by affinity chromatography through an MCAb/Affi-Gel column. The reactivity of the purified antigens was then monitored by ELISA, SDS-PAGE silver stain and EITB.

The respective MCAbs recognized varying antigenic determinants (AD) present in the adult, cercaria and egg stages. By EITB, the IgM and IgG MCAbs, when reacted with nine antigens from the various stages, revealed identical bands, suggesting that the two MCAb classes originated from identical AD. By ELISA and COPT, the MCAbs from thirteen cell lines gave the same results. But by CHR, two MCAbs showed negative results while the other eleven showed strong positives. It is assumed that the AD in the immunogen that elicited the MCAbs were immunochemically closely related.

Egg antigen purified by immunoaffinity indicated that the epitopes recognized by MCAb were present on four antigenic components with molecular weights (Mr) of approximately 19, 25, 60 and >224 kd, respectively. By EITB, the Mr 19 doublet appeared to be species-specific and the Mr 25 kd band genus-specific. They reacted with mouse serum from 13-16 weeks after infection. In monkey serum, the Mr 19 doublet appeared 8-10 weeks after infection and disappeared 8-12 weeks after Droncit treatment, paralleling the disappearance of fecal eggs. The Mr 60 and >224 kd bands were also demonstrated with S. japonicum, S. haematobium and Trichinella spiralis infection sera and may be the cause of cross-reactions in conventional serological tests.
Abstract:
A longitudinal investigation of the health effects and reservoirs of Giardia lamblia was undertaken in forty households located in a rural Nile Delta region of Egypt. Stool specimens obtained once weekly for six months from two- to four-year-old children were cyst- or trophozoite-positive in 42 percent of the 724 examined. All children but one, who remained Giardia-negative, excreted Giardia at some point; the mean duration of excretion was seven and one-half weeks, with a range of one to 17 weeks. Clinical symptoms of illness were frequently observed within a month before or after Giardia excretion in children's stool, but a statistical inference of association was not demonstrated.

Seventeen percent of 697 specimens obtained from their mothers were Giardia-positive, for a mean duration of four weeks and a range of one to 18 weeks. Mothers were observed to excrete Giardia in stool less frequently during pregnancy than during lactation.

Nine hundred sixty-two specimens were collected from 13 species of household livestock. Giardia was detected in a total of 22 specimens, from cows, goats, sheep and one duck. Giardia cysts were detected in three of 899 samples of household drinking water.

An ELISA technique for Giardia detection in human and animal stool was field tested under variable environmental conditions. The overall sensitivity of the assay for human specimens was 74 percent and the specificity was 97 percent. These values for animal specimens were 82 percent and 98 percent, respectively.

Surface antigen studies reported by the NIH Laboratory of Parasitic Diseases show that antigens of three Egyptian human isolates are different from each other and from most other isolates against which they were tested.

The ubiquity of human and animal fecal contamination, combined with estimates of ill days per child per year in this setting, is a substantial argument for the introduction of a mass parasite control program to intervene in the cyclical transmission of agents of enteric disease.
Abstract:
Biodegradability is a desirable, if not necessary, characteristic of pesticides. Carbaryl, sold as Sevin, is one of the more widely used insecticides for the control of agricultural pests and has been reported to be readily degraded by microorganisms. Because of its broad application, the concentration of Sevin has been reported to reach nearly four parts per million (ppm) in surface waters, where it has been reported to affect the growth and metabolic rates of aquatic bacterial populations. Given these reports, it is of public health importance to determine the effects of this insecticide on the growth and metabolic rates of bacteria used to indicate water pollution, and on pathogenic organisms found in polluted water.

This study was conducted to determine the effect of carbaryl on the growth and metabolic rates of indicator and pathogenic organisms. Escherichia coli and Streptococcus faecalis were used as indicators, while Staphylococcus aureus and Salmonella typhimurium were the pathogens studied. Pure and mixed cultures of these organisms were exposed to two concentrations of carbaryl (Sevin).

The study demonstrated that the fecal pollution indicator organisms, E. coli and S. faecalis, respond differently to the presence of small concentrations of carbaryl in water than do the two pathogens tested, S. typhimurium and S. aureus. The growth of all test organisms, as measured by spread plate counts, was reduced by the presence of either one mg/l or five mg/l carbaryl within a period of eight days. Survival of the organisms in the presence of five mg/l carbaryl varied depending on whether the organism was in pure or mixed culture. In the presence of five mg/l carbaryl, both pure and mixed cultures of E. coli showed longer survival. S. faecalis survived for more than eight days in pure culture, whereas neither S. typhimurium nor S. aureus survived for eight days in pure culture.

The metabolic rates of S. faecalis and S. aureus were reduced by both five mg/l and one mg/l Sevin, in contrast to E. coli and S. typhimurium, whose metabolic rates were reduced by the introduction of five mg/l Sevin but increased with one mg/l Sevin. There was no difference between test and control when mixed populations were exposed to five mg/l Sevin and the metabolic rate was tested. A mixture of E. coli and S. typhimurium populations showed a respiration increase over the control when exposed to one mg/l Sevin. If similar effects occur in polluted surface waters, bacteriological water quality testing may yield misleading results.
Abstract:
An investigation was undertaken to evaluate the role of fomites in the transmission of diarrhea in day-care centers (DCCs) and to elucidate the paths by which enteric organisms spread within this setting.

During a nine-month period (December 1980-August 1981), extensive culturing of inanimate objects, as well as of children and staff, was done routinely each month and repeated during diarrhea outbreaks. Air was sampled from the classrooms and toilets using a Single-Stage Sieve Sampler (Ross Industries, Midland, VA). Stool samples were collected from both ill and well children and staff in the affected rooms only during outbreaks. Environmental samples were processed for Shigella, Salmonella and fecal coliforms, while stools were screened for miscellaneous enteropathogens.

A total of 11 outbreaks occurred in the 5 DCCs during the study period. Enteric pathogens were recovered in 7 (64%) of the outbreaks, and multiple pathogens were identified in 3. The most frequently identified pathogen in stools was Giardia lamblia, which was recovered in 5 (45%) of the outbreaks. Ten of the 11 (91%) outbreaks occurred in children less than 12 months of age.

Environmental microbiology studies together with epidemiologic information revealed that enteric organisms were transmitted from person to person. On routine sampling, fecal coliforms were most frequently isolated from tap handles and diaper change areas. Contamination with fecal coliforms was widespread during diarrhea outbreaks. Fecal coliforms were recovered with significantly greater frequency from hands, toys and other classroom objects during outbreaks than during non-outbreak periods. Salmonella typhimurium was recovered from a table top during an outbreak of salmonellosis. There was no association between the level of enteric microbial contamination in the toilet areas and the occurrence of outbreaks. No evidence was found to indicate that enteric organisms were spread by the airborne route via aerosols.

Toys, other classroom objects and contaminated hands probably play a major role in the transmission of enteropathogens during day-care center outbreaks. The presence of many enteric agents in the environment likely explains the polymicrobial etiology of day-care-center-associated diarrhea outbreaks.
Abstract:
This study establishes the extent and relevance of bias in population estimates of prevalence, incidence, and intensity of infection with Schistosoma mansoni caused by the relative sensitivity of stool examination techniques. The population studied was Parcelas de Boqueron in Las Piedras, Puerto Rico, where the Centers for Disease Control had undertaken a prospective community-based study of infection with S. mansoni in 1972. During each January of the succeeding years, stool specimens from this population were processed according to the modified Ritchie concentration (MRC) technique. During January 1979, additional stool specimens were collected from 30 individuals selected on the basis of their mean S. mansoni egg output during previous years. Each specimen was divided into ten 1-g aliquots and three 42-mg aliquots. The relationship of egg counts obtained with the Kato-Katz (KK) thick smear technique as a function of the mean of ten counts obtained with the MRC technique was established by regression analysis. Additionally, the effect of fecal sample size and egg excretion level on technique sensitivity was evaluated during a blind assessment of single stool specimen samples, using both examination methods, from 125 residents with documented S. mansoni infections. The regression equation was ln KK = 2.3324 + 0.6319 ln MRC, with a coefficient of determination (r²) of 0.73. The regression equation was then utilized to correct the term "m" for sample size in the expression P(≥1 egg) = 1 − e^(−ms), which estimates the probability P of finding at least one egg as a function of the mean S. mansoni egg output "m" of the population and the effective stool sample size "s" utilized by the coprological technique. This algorithm closely approximated the observed sensitivity of the KK and MRC tests when these were utilized to blindly screen a population of known parasitologic status for infection with S. mansoni. In addition, the algorithm was utilized to adjust the apparent prevalence of infection for the degree of functional sensitivity exhibited by the diagnostic test. This permitted estimation of the true prevalence of infection and, hence, a means of correcting estimates of incidence of infection.
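The sample-size correction described in this abstract can be sketched numerically. Under a Poisson assumption, the probability of detecting at least one egg is P(≥1 egg) = 1 − e^(−ms), and the fitted regression converts MRC counts to the KK scale. The regression coefficients below come from the abstract; the egg densities are hypothetical examples:

```python
import math

def kk_from_mrc(mrc_count):
    """Predicted Kato-Katz count from a mean MRC count, using the
    fitted regression ln KK = 2.3324 + 0.6319 ln MRC."""
    return math.exp(2.3324 + 0.6319 * math.log(mrc_count))

def p_detect(m, s):
    """P(>=1 egg) = 1 - exp(-m*s): probability that a technique
    examining s grams of stool finds at least one egg when the mean
    egg density is m eggs per gram (Poisson assumption)."""
    return 1.0 - math.exp(-m * s)

# Hypothetical light infection: 10 eggs per gram of stool.
print(p_detect(10, 0.042))  # KK smear, ~42 mg examined: ~0.34
print(p_detect(10, 1.0))    # MRC, 1-g aliquot examined: ~0.99995
```

The example shows why the smaller KK sample size depresses apparent prevalence at low egg densities, which is exactly the bias the study's algorithm corrects for.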
Abstract:
Groundwater constitutes approximately 30% of freshwater globally and serves as a source of drinking water in many regions. Groundwater sources are subject to contamination with human pathogens (viruses, bacteria and protozoa) from a variety of sources that can cause diarrhea and contribute to the devastating global burden of this disease. To describe the extent of this public health concern in developing countries, a systematic review was conducted of the evidence for groundwater microbially contaminated at its source as a risk factor for enteric illness under endemic (non-outbreak) conditions in these countries. Epidemiologic studies published in English-language journals between January 2000 and January 2011, and meeting certain other criteria, were selected, resulting in eleven studies reviewed. Data were extracted on microbes detected (and their concentrations, if reported) and on associations measured between microbial quality of, or consumption of, groundwater and enteric illness; other relevant findings are also reported. In groundwater samples, several studies found bacterial indicators of fecal contamination (total coliforms, fecal coliforms, fecal streptococci, enterococci and E. coli), all in a wide range of concentrations. Rotavirus and a number of enteropathogenic bacteria and parasites were found in stool samples from study subjects who had consumed groundwater, but no concentrations were reported. Consumption of groundwater was associated with increased risk of diarrhea, with odds ratios ranging from 1.9 to 6.1. However, limitations of the selected studies, especially potential confounding factors, limited the conclusions that could be drawn from them. These results support the contention that microbial contamination of groundwater reservoirs, including with human enteropathogens and from a variety of sources, is a reality in developing countries.
While microbially contaminated groundwater poses a risk of diarrhea, other factors are also important, including water treatment, water storage practices, consumption of other water sources, water quantity and access, sanitation and hygiene, housing conditions, and socio-economic status. Further understanding of the interrelationships between, and the relative contributions to disease risk of, the various sources of microbial contamination of groundwater can guide the allocation of resources to interventions with the greatest public health benefit. Several recommendations for future research, and for practitioners and policymakers, are presented.
Abstract:
Early detection by screening is the key to colorectal cancer control. However, colorectal cancer screening and its determinants in rural areas have not been adequately studied. The goal of this study was to investigate screening participation and determinants of colonoscopy, sigmoidoscopy, and/or fecal occult blood test (FOBT) in subjects of Project FRONTIER from the rural Texas counties of Cochran, Bailey and Parmer. Subjects (n=820: 435 Hispanics, 355 non-Hispanic Whites, 26 African Americans, and 4 of unknown ethnicity; 255 males and 565 females, aged 40 to 92 years) were from Project FRONTIER. Stepwise logistic regression analysis was performed. Explanatory variables included ethnicity (Hispanic, non-Hispanic White and African American), gender, health insurance, smoking status, household income, education (years), physical activity, overweight, other health screenings, personal physician, family history (first-degree relatives) of cancers, and preferred language (English vs. Spanish) for interview/testing. The screening percentage for ever having had a colonoscopy/sigmoidoscopy (51.8%) in this cohort aged 50 years or older is well below the national (65.2%) and Texas (64.6%) percentages, while the percentage for FOBT (29.2%) is higher than in the nation (17.2%) and Texas (14.9%). Hispanics had significantly lower participation than non-Hispanic Whites for both colonoscopy/sigmoidoscopy (37.0% vs. 66.0%) and FOBT (16.5% vs. 41.7%). Stepwise logistic regression showed that predictors of colonoscopy, sigmoidoscopy or FOBT included Hispanic ethnicity (p = 0.0045), age (p < 0.0001), other screening procedures (p < 0.0001), insurance status (p < 0.0001) and physician status (p = 0.0053). The screening percentage for colonoscopy/sigmoidoscopy in this rural cohort is well below the national and Texas levels, mainly due to the lower participation of Hispanics vs. non-Hispanic Whites. Health insurance, having a personal physician, having had screenings for other cancers, ethnicity, and older age are among the main predictors.