Abstract:
BACKGROUND Anxiety disorders have been linked to an increased risk of incident coronary heart disease, in which inflammation plays a key pathogenic role. To date, no studies have examined the association between proinflammatory markers and agoraphobia. METHODS In a random Swiss population sample of 2890 persons (35-67 years, 53% women), we diagnosed a total of 124 individuals (4.3%) with agoraphobia using a validated semi-structured psychiatric interview. We also assessed socioeconomic status, traditional cardiovascular risk factors (i.e., body mass index, hypertension, blood glucose levels, total cholesterol/high-density lipoprotein-cholesterol ratio), health behaviors (i.e., smoking, alcohol consumption, and physical activity), and other major psychiatric diseases (other anxiety disorders, major depressive disorder, drug dependence), which were treated as covariates in linear regression models. Circulating levels of inflammatory markers were determined at a mean follow-up of 5.5 ± 0.4 years (range 4.7-8.5) and statistically controlled for the baseline demographic and health-related measures. RESULTS Individuals with agoraphobia had significantly higher follow-up levels of C-reactive protein (p = 0.007) and tumor necrosis factor-α (p = 0.042), as well as lower levels of the cardioprotective marker adiponectin (p = 0.032), than their non-agoraphobic counterparts. Follow-up levels of interleukin (IL)-1β and IL-6 did not significantly differ between the two groups. CONCLUSIONS Our results suggest an increase in chronic low-grade inflammation in agoraphobia over time. Such a mechanism might link agoraphobia with an increased risk of atherosclerosis and coronary heart disease, and needs to be tested in longitudinal studies.
Abstract:
BACKGROUND Eosinophilic esophagitis (EoE) is a chronic, inflammatory disease of the esophagus with a rapidly increasing incidence. However, population-based epidemiologic data on EoE are rare and limited to regions with fewer than 200 000 inhabitants. We evaluated the incidence and prevalence of EoE over time in the Canton of Vaud, Switzerland. MATERIALS AND METHODS The Canton of Vaud lies in the French-speaking, western part of Switzerland. As of December 2013, it had a population of 743 317 inhabitants. We contacted all pathology institutes (n = 6) in this canton to identify patients who had been diagnosed with esophageal eosinophilia between 1993 and 2013. We then performed a chart review in all adult and pediatric gastroenterology practices to identify patients with EoE. RESULTS Of 263 patients with esophageal eosinophilia, a total of 179 fulfilled the diagnostic criteria for EoE. Median diagnostic delay was 4 (IQR 1-9) years. No patient was diagnosed with EoE prior to 2003. The incidence of EoE increased from 0.16/100 000 inhabitants in 2004 to 6.3/100 000 inhabitants in 2013 (P < 0.001). The cumulative EoE prevalence in 2013 was 24.1/100 000. The incidence in males was 2.8 times higher (95% CI 2.01-3.88, P < 0.001) than in females. The annual EoE incidence was 10.6 times higher (95% CI 7.61-14.87, P < 0.001) in the period from 2010 to 2013 than in the period from 1993 to 2009. CONCLUSIONS The incidence and cumulative prevalence of EoE in the Canton of Vaud, Switzerland, have increased rapidly over the past 10 years.
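The incidence figures above follow the standard rate definition (new cases divided by the population at risk, scaled per 100 000). A minimal sketch of that arithmetic, where the 2013 case count is back-calculated from the reported rate rather than taken from the study's raw data:

```python
# Hedged illustration of an incidence-rate calculation; the case count
# below is inferred from the abstract's figures, not from the study itself.

def incidence_per_100k(new_cases: int, population: int) -> float:
    """Annual incidence per 100 000 inhabitants."""
    return new_cases / population * 100_000

# With the canton's 2013 population of 743 317, a rate of 6.3/100 000
# corresponds to roughly 47 newly diagnosed patients that year.
cases_2013 = round(6.3 / 100_000 * 743_317)  # ≈ 47
```

The same formula, applied per calendar year, yields the trend the abstract tests for significance.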
Abstract:
OBJECTIVE This study aims to assess the odds of developing incident gout in association with the use of postmenopausal estrogen-progestogen therapy, according to type, timing, duration, and route of administration. METHODS We conducted a retrospective population-based case-control analysis using the United Kingdom-based Clinical Practice Research Datalink. We identified women (aged 45 years or older) who had a first-time diagnosis of gout recorded between 1990 and 2010. We matched one female control with each case on age, general practice, calendar time, and years of active history in the database. We used multivariate conditional logistic regression to calculate odds ratios (ORs) with 95% CIs, adjusted for confounders. RESULTS The adjusted OR for gout with current use of oral formulations of opposed estrogens (estrogen-progestogen) was 0.69 (95% CI, 0.56-0.86) compared with never use. Current use was associated with a decreased OR for gout in women without renal failure (adjusted OR, 0.71; 95% CI, 0.57-0.87) and hypertension (adjusted OR, 0.62; 95% CI, 0.44-0.87) compared with never use. Tibolone was associated with a decreased OR for gout (adjusted OR, 0.77; 95% CI, 0.63-0.95) compared with never use. Estrogens alone did not alter the OR for gout. CONCLUSIONS Current use of oral opposed estrogens, but not unopposed estrogens, is associated with a decreased OR for incident gout in women without renal failure and is more pronounced in women with hypertension. Use of tibolone is associated with a decreased OR for incident gout. The decreased OR for gout may be related to the progestogen component rather than the estrogen component.
Abstract:
OBJECTIVE AND BACKGROUND Anemia and thyroid dysfunction are common and often co-occur. Current guidelines recommend the assessment of thyroid function in the work-up of anemia, although evidence for this association is scarce. PATIENTS AND METHODS In the European Prospective Investigation into Cancer (EPIC)-Norfolk population-based cohort, we aimed to examine the prevalence and type of anemia (defined as hemoglobin <13 g/dl for men and <12 g/dl for women) according to different thyroid function groups. RESULTS The mean age of the 8791 participants was 59.4 (SD 9.1) years, and 55.2% were women. Thyroid dysfunction was present in 437 (5.0%) and anemia in 517 (5.9%) participants. After excluding 121 participants with the three most common causes of anemia (chronic kidney disease, inflammation, iron deficiency), anemia was found in 4.7% of euthyroid participants. Compared with the euthyroid group, the prevalence of anemia was significantly higher in overt hyperthyroidism (14.6%, P < .01), higher with borderline significance in overt hypothyroidism (7.7%, P = .05), and not increased in subclinical thyroid dysfunction (5.0% in subclinical hypothyroidism, 3.3% in subclinical hyperthyroidism). Anemia associated with thyroid dysfunction was mainly normocytic (94.0%) and rarely macrocytic (6.0%). CONCLUSION The prevalence of anemia was higher in overt hyperthyroidism but not increased in subclinical thyroid dysfunction. Systematic measurement of thyroid-stimulating hormone in anemic patients is likely to be useful only after excluding common causes of anemia.
Abstract:
OBJECTIVE To assess whether palliative primary tumor resection in colorectal cancer patients with incurable stage IV disease is associated with improved survival. BACKGROUND There is a heated debate regarding whether or not an asymptomatic primary tumor should be removed in patients with incurable stage IV colorectal disease. METHODS Stage IV colorectal cancer patients were identified in the Surveillance, Epidemiology, and End Results database between 1998 and 2009. Patients undergoing surgery on metastatic sites were excluded. Overall survival and cancer-specific survival were compared between patients with and without palliative primary tumor resection using risk-adjusted Cox proportional hazards regression models and stratified propensity score methods. RESULTS Overall, 37,793 stage IV colorectal cancer patients were identified. Of those, 23,004 (60.9%) underwent palliative primary tumor resection. The rate of patients undergoing palliative primary cancer resection decreased from 68.4% in 1998 to 50.7% in 2009 (P < 0.001). In Cox regression analysis after propensity score matching, primary cancer resection was associated with significantly improved overall survival [hazard ratio (HR) of death = 0.40, 95% confidence interval (CI) = 0.39-0.42, P < 0.001] and cancer-specific survival (HR of death = 0.39, 95% CI = 0.38-0.40, P < 0.001). The benefit of palliative primary cancer resection persisted during the period 1998 to 2009, with HRs equal to or less than 0.47 for both overall and cancer-specific survival. CONCLUSIONS On the basis of this population-based cohort of stage IV colorectal cancer patients, palliative primary tumor resection was associated with improved overall and cancer-specific survival. Therefore, the dogma that an asymptomatic primary tumor should never be resected in patients with unresectable colorectal cancer metastases must be questioned.
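Survival comparisons like the one above rest on nonparametric estimators such as Kaplan-Meier; the study itself used Cox models and propensity scores, so the following pure-Python product-limit sketch is only an illustration of the underlying survival-curve arithmetic, with made-up follow-up times:

```python
# Illustrative Kaplan-Meier (product-limit) estimator; not the study's
# actual SEER analysis, which used risk-adjusted Cox regression.

def kaplan_meier(times, events):
    """Return (time, survival) steps; events: 1 = death, 0 = censored."""
    data = sorted(zip(times, events))
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)     # events at time t
        at_risk = sum(1 for tt, _ in data if tt >= t)    # still under observation
        if deaths:
            surv *= 1 - deaths / at_risk                 # product-limit step
            curve.append((t, surv))
        i += sum(1 for tt, _ in data if tt == t)         # skip ties
    return curve

# Four hypothetical patients: deaths at months 2, 2, 5; one censored at 8.
curve = kaplan_meier([5, 2, 8, 2], [1, 1, 0, 1])  # → [(2, 0.5), (5, 0.25)]
```

Hazard ratios then compare how quickly two such curves decline, with the propensity score balancing the resected and non-resected groups on measured covariates.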
Abstract:
INTRODUCTION Known genetic variants associated with preeclampsia explain only a proportion of the heritable contribution to the development of this condition. The association between preeclampsia and the risk of cardiovascular disease later in life has encouraged the study of genetic variants important in thrombosis and vascular inflammation in relation to preeclampsia as well. The von Willebrand factor-cleaving protease, ADAMTS13, plays an important role in microvascular thrombosis, and partial deficiencies of this enzyme have been observed in association with cardiovascular disease and preeclampsia. However, it remains unknown whether decreased ADAMTS13 levels represent a cause or an effect of the event in placental and cardiovascular disease. METHODS We studied the distribution of three functional genetic variants of ADAMTS13, c.1852C>G (rs28647808), c.4143_4144dupA (rs387906343), and c.3178C>T (rs142572218), in women with preeclampsia and their controls in a nested case-control study from the second Nord-Trøndelag Health Study (HUNT2). We also studied the association between ADAMTS13 activity and preeclampsia in serum samples obtained at time points unrelated to the preeclamptic pregnancy. RESULTS No differences in genotype, allele, or haplotype frequencies of the ADAMTS13 variants were observed between cases and controls, and lower levels of ADAMTS13 activity were not associated with preeclampsia. CONCLUSION Our findings indicate that ADAMTS13 variants and ADAMTS13 activity do not contribute to an increased risk of preeclampsia in the general population.
Abstract:
Allostatic load (AL) is a marker of physiological dysregulation that reflects exposure to chronic stress. High AL has been related to poorer health outcomes, including mortality. We examine here the association of socioeconomic and lifestyle factors with AL. Additionally, we investigate the extent to which AL is genetically determined. We included 803 participants (52% women, mean age 48 ± 16 years) from a population- and family-based Swiss study. We computed an AL index aggregating 14 markers from the cardiovascular, metabolic, lipid, oxidative, hypothalamus-pituitary-adrenal, and inflammatory homeostatic axes. Education and occupational position were used as indicators of socioeconomic status. Marital status, stress, alcohol intake, smoking, dietary patterns, and physical activity were considered as lifestyle factors. Heritability of AL was estimated by maximum likelihood. Women with a low occupational position had higher AL (low vs. high OR = 3.99, 95% CI [1.22; 13.05]), while the opposite was observed for men (middle vs. high OR = 0.48, 95% CI [0.23; 0.99]). Education tended to be inversely associated with AL in both sexes (low vs. high OR = 3.54, 95% CI [1.69; 7.40] in women and OR = 1.59, 95% CI [0.88; 2.90] in men). Heavy-drinking men as well as women abstaining from alcohol had higher AL than moderate drinkers. Physical activity was protective against AL, while high salt intake was related to an increased AL risk. The heritability of AL was estimated at 29.5 ± 7.9%. Our results suggest that generalized physiological dysregulation, as measured by AL, is determined by both environmental and genetic factors. The genetic contribution to AL remains modest compared with the environmental component, which explains approximately 70% of the phenotypic variance.
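A common way to build an AL index like the one described above is to give each biomarker one point when it falls in the high-risk quartile of the sample distribution. The study aggregated 14 markers across six axes; its exact scoring rule is not given in the abstract, so the cutoff-count scheme and the marker values below are assumptions:

```python
# Hedged sketch of a cutoff-count allostatic load index; the scoring rule
# and all values are illustrative, not taken from the study.

def allostatic_load(marker_values, risk_cutoffs):
    """Count how many markers reach or exceed their high-risk cutoff."""
    return sum(1 for v, cut in zip(marker_values, risk_cutoffs) if v >= cut)

# Three hypothetical markers (e.g. HbA1c %, systolic BP mmHg, a lipid ratio)
# against hypothetical high-risk-quartile cutoffs: only the second scores.
score = allostatic_load([5.8, 130, 0.9], [6.0, 120, 1.2])  # → 1
```

Dichotomizing the resulting count (or modeling its upper tail) is what allows the abstract's odds ratios for "high AL" to be estimated by logistic regression.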
Abstract:
OBJECTIVE Renal resistive index (RRI) varies directly with renal vascular stiffness and pulse pressure. RRI correlates positively with arteriolosclerosis in damaged kidneys and predicts progressive renal dysfunction. Matrix Gla-protein (MGP) is a vascular calcification inhibitor that needs vitamin K to be activated. Inactive MGP, known as desphospho-uncarboxylated MGP (dp-ucMGP), can be measured in plasma and has been associated with various cardiovascular (CV) markers, CV outcomes, and mortality. In this study we hypothesized that increased RRI is associated with high levels of dp-ucMGP. DESIGN AND METHOD We recruited participants via a multi-center family-based cross-sectional study in Switzerland exploring the role of genes and kidney hemodynamics in blood pressure regulation. Dp-ucMGP was quantified in plasma samples by sandwich ELISA. Renal Doppler sonography was performed using a standardized protocol to measure RRI on 3 segmental arteries in each kidney; the mean of the 6 measures was reported. Multiple regression analysis was performed to estimate associations between RRI and dp-ucMGP, adjusting for sex, age, pulse pressure, mean pressure, renal function, and other CV risk factors. RESULTS We included 1035 participants in our analyses. Mean values were 0.64 ± 0.06 for RRI and 0.44 ± 0.21 nmol/L for dp-ucMGP. RRI was positively associated with dp-ucMGP both before and after adjustment for sex, age, body mass index, pulse pressure, mean pressure, heart rate, renal function, low- and high-density lipoprotein, smoking status, diabetes, blood pressure- and cholesterol-lowering drugs, and history of CV disease (P < 0.001). CONCLUSIONS RRI is independently and positively associated with high levels of dp-ucMGP after adjustment for pulse pressure and common CV risk factors. Further studies are needed to determine whether vitamin K supplementation can have a positive effect on renal vascular stiffness and kidney function.
Abstract:
Survivors of childhood cancer have higher mortality than the general population. We describe cause-specific long-term mortality in a population-based cohort of childhood cancer survivors. We included all children diagnosed with cancer in Switzerland (1976-2007) at age 0-14 years who survived ≥5 years after diagnosis, and followed survivors until December 31, 2012. We obtained causes of death (COD) from the Swiss mortality statistics and used data from the Swiss general population to calculate age-, calendar year-, and sex-standardized mortality ratios (SMR) and absolute excess risks (AER) for different COD, by Poisson regression. We included 3965 survivors and 49,704 person-years at risk. Of these, 246 (6.2%) died, which was 11 times higher than expected (SMR 11.0). Mortality was particularly high for diseases of the respiratory (SMR 14.8) and circulatory systems (SMR 12.7), and for second cancers (SMR 11.6). The pattern of cause-specific mortality differed by primary cancer diagnosis and changed with time since diagnosis. In the first 10 years after 5-year survival, 78.9% of excess deaths were caused by recurrence of the original cancer (AER 46.1). Twenty-five years after diagnosis, only 36.5% (AER 9.1) were caused by recurrence, while 21.3% were caused by second cancers (AER 5.3) and 33.3% by circulatory diseases (AER 8.3). Our study confirms elevated mortality in survivors of childhood cancer for at least 30 years after diagnosis, with an increasing proportion of deaths caused by late toxicities of treatment. The results underline the importance of clinical follow-up continuing for years after the end of treatment for childhood cancer.
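The SMR and AER reported above follow standard definitions: the SMR is observed deaths divided by the deaths expected under general-population rates, and the AER is the excess deaths per unit of person-time (commonly per 10 000 person-years). A minimal sketch, where the expected-death count is back-calculated from the abstract's SMR rather than taken from the study:

```python
# Hedged sketch of SMR / AER arithmetic; expected deaths here are inferred
# from the reported SMR, not computed from Swiss population rates.

def smr(observed: float, expected: float) -> float:
    """Standardized mortality ratio: observed / expected deaths."""
    return observed / expected

def aer(observed: float, expected: float, person_years: float,
        per: float = 10_000) -> float:
    """Absolute excess risk: excess deaths per `per` person-years."""
    return (observed - expected) / person_years * per

# 246 observed deaths with SMR 11.0 implies roughly 22.4 expected deaths.
expected = 246 / 11.0
```

Cause-specific SMRs (e.g. 14.8 for respiratory disease) are the same ratio computed within one cause-of-death category.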
Abstract:
Gender and racial/ethnic disparities in colorectal cancer (CRC) screening have been observed and associated with income status, education level, treatment, and late diagnosis. According to the American Cancer Society, CRC is the third most frequently diagnosed type of cancer among both males and females and accounts for 10% of cancer deaths in the United States. Differences in CRC test use have been documented and attributed to access to health care, demographics, and health behaviors, but few studies have examined the correlates of CRC screening test use by gender. The present study examined the prevalence of CRC screening test use and assessed whether disparities are explained by gender and racial/ethnic differences. To assess these associations, the study utilized a cross-sectional design and examined the distribution of the covariates for gender and racial/ethnic group differences using the chi-square statistic. Logistic regression was used to estimate prevalence odds ratios and to adjust for the confounding effects of the covariates.
Results indicated disparities in CRC screening test use, with statistically significant differences in the prevalence of both FOBT and endoscopy screening between genders (χ², p ≤ 0.003). Females had a lower prevalence of endoscopic colorectal cancer screening than males when adjusting for age and education (OR 0.88, 95% CI 0.82-0.95). However, no statistically significant difference was found between racial/ethnic groups (χ², p ≤ 0.179) after adjusting for age, education, and gender. For both FOBT and endoscopy screening, Non-Hispanic Blacks and Hispanics had a lower prevalence of screening than Non-Hispanic Whites. In the multivariable regression model, the gender disparities could largely be explained by age, income status, education level, and marital status.
Overall, individuals aged 70-79 years who were married, had some college education, and had an income greater than $20,000 had a higher prevalence of colorectal cancer screening test use within gender and racial/ethnic groups.
Abstract:
Background. Colorectal cancer (CRC) is the third most commonly diagnosed cancer (excluding skin cancer) in both men and women in the United States, with an estimated 148,810 new cases and 49,960 deaths in 2008 (1). Racial/ethnic disparities have been reported across the CRC care continuum. Studies have documented racial/ethnic disparities in CRC screening (2-9), but only a few have examined these differences over time (9-11). No studies have compared these trends between a population with CRC and a population without cancer. Additionally, although there is evidence suggesting that hospital factors (e.g., teaching hospital status and NCI designation) are associated with CRC survival (12-16), no studies have sought to explain the racial/ethnic differences in survival by examining differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital characteristics. Objectives and Methods. The overall goals of this dissertation were to describe the patterns and trends of racial/ethnic disparities in CRC screening (i.e., fecal occult blood test (FOBT), sigmoidoscopy (SIG), and colonoscopy (COL)) and to determine whether racial/ethnic disparities in CRC survival are explained by differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital factors. These goals were accomplished in a two-paper format. In Paper 1, "Racial/Ethnic Disparities and Trends in Colorectal Cancer Screening in Medicare Beneficiaries with Colorectal Cancer and without Cancer in SEER Areas, 1992-2002", the study population consisted of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and 62,917 Medicare beneficiaries without cancer during the same period. Both cohorts were aged 67 to 89 years and resided in 16 Surveillance, Epidemiology and End Results (SEER) regions of the United States.
Screening procedures between 6 months and 3 years prior to the date of diagnosis for CRC patients, and prior to the index date for persons without cancer, were identified in Medicare claims. The crude and age-gender-adjusted percentages and odds ratios of receiving FOBT, SIG, or COL were calculated. Multivariable logistic regression was used to assess the effect of race/ethnicity on the odds of receiving CRC screening over time. Paper 2, "Racial/Ethnic Disparities in Colorectal Cancer Survival: To what extent are racial/ethnic disparities in survival explained by racial differences in socio-demographics, screening, co-morbidities, treatment, tumor or hospital characteristics", included a cohort of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and residing in 16 SEER regions of the United States, identified in the SEER-Medicare linked database. Survival was estimated using the Kaplan-Meier method. Cox proportional hazards modeling was used to estimate hazard ratios (HR) of mortality and 95% confidence intervals (95% CI). Results. The screening analysis demonstrated racial/ethnic disparities in screening over time among the cohort without cancer. From 1992 to 1995, Blacks and Hispanics were less likely than Whites to receive FOBT (OR=0.75, 95% CI: 0.65-0.87; OR=0.50, 95% CI: 0.34-0.72, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.72-0.85; OR=0.67, 95% CI: 0.54-0.75, respectively).
Blacks and Hispanics were less likely than Whites to receive SIG from 1992 to 1995 (OR=0.75, 95% CI: 0.57-0.98; OR=0.29, 95% CI: 0.12-0.71, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.68-0.93; OR=0.50, 95% CI: 0.35-0.72, respectively). The survival analysis showed that Blacks had worse CRC-specific survival than Whites (HR: 1.33, 95% CI: 1.23-1.44), but this difference was reduced for stage I-III disease after full adjustment for socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital characteristics (aHR=1.24, 95% CI: 1.14-1.35). Socioeconomic status, tumor characteristics, treatment, and co-morbidities contributed to the reduction in hazard ratios between Blacks and Whites with stage I-III disease. Asians had better survival than Whites both before (HR: 0.73, 95% CI: 0.64-0.82) and after (aHR: 0.80, 95% CI: 0.70-0.92) adjusting for all predictors for stage I-III disease. For stage IV disease, both Asians and Hispanics had better survival than Whites, which improved further after full adjustment (aHR=0.73, 95% CI: 0.63-0.84; aHR=0.74, 95% CI: 0.61-0.92, respectively). Conclusion. Screening disparities remain between Blacks and Whites and between Hispanics and Whites, but have decreased in recent years. Future studies should explore other factors that may contribute to screening disparities, such as physician recommendations and language/cultural barriers, in this and younger populations. There were substantial racial/ethnic differences in CRC survival among older Whites, Blacks, Asians, and Hispanics. Co-morbidities, SES, tumor characteristics, treatment, and other predictor variables contributed to, but did not fully explain, the CRC survival differences between Blacks and Whites. Future research should examine the role of quality of care, particularly the benefit of treatment and post-treatment surveillance, in racial disparities in survival.
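The screening comparisons above report odds ratios with 95% confidence intervals. A textbook way to obtain both from a 2×2 table is the Woolf (log) interval; the cell counts below are illustrative assumptions, since the dissertation's abstract gives only the resulting ORs:

```python
# Hedged sketch: odds ratio with a Woolf 95% CI from a 2x2 table.
# Cell counts are made up for illustration; the study used multivariable
# logistic regression, not a single unadjusted table.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b = screened/unscreened in group 1, c/d = same in group 2."""
    or_ = (a * d) / (b * c)                          # cross-product ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)            # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 30/70 screened/unscreened vs 40/60.
or_, lo, hi = odds_ratio_ci(30, 70, 40, 60)
```

An OR below 1 with an interval excluding 1 (as in several of the reported comparisons) indicates significantly lower screening odds in the first group.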
Abstract:
Background. A few studies have reported gender differences along the colorectal cancer (CRC) continuum, but none has done so longitudinally to compare a cancer and a non-cancer population. Objectives and Methods. To examine gender differences in colorectal cancer screening (CRCS); to examine trends in gender differences in CRC screening among two groups of patients (Medicare beneficiaries with and without cancer); to examine gender differences in CRC incidence; and to examine any differences over time. In Paper 1, the study population consisted of men and women, ages 67-89 years, with CRC (73,666) or without any cancer (39,006), residing in 12 U.S. Surveillance, Epidemiology and End Results (SEER) regions. Crude and age-adjusted percentages and odds ratios of receiving fecal occult blood test (FOBT), sigmoidoscopy (SIG), or colonoscopy (COL) were calculated. Multivariable logistic regression was used to assess the effect of gender on the odds of receiving CRC screening over time. In Paper 2, age-adjusted incidence rates and proportions over time were reported across race, CRC subsite, CRC stage, and SEER region for 373,956 patients, ages 40+ years, residing in 9 SEER regions and diagnosed with malignant CRC. Results. Overall, women had higher CRC screening rates than men, and screening rates in general were higher in the SEER sample of persons with a CRC diagnosis. Significant temporal divergence in FOBT screening was observed between men and women in both cohorts. Although the largest temporal increases in screening rates were found for COL, especially among the cohort with CRC, little change in the gender gap was observed over time. Receipt of FOBT was significantly associated with female gender, especially in the period of full Medicare coverage. Receipt of COL was significantly associated with male gender, especially in the period of limited Medicare coverage. Overall, approximately equal numbers of men (187,973) and women (185,983) were diagnosed with malignant CRC.
Men had significantly higher age-adjusted CRC incidence rates than women across all categories of age, race, subsite, stage, and SEER region, even though rates declined in all categories over time. Significant moderate increases in the rate difference occurred among 40-59 year olds; significant reductions occurred among patients aged 70+, within the rectal subsite, unstaged and distant-stage CRC, and the eastern and western SEER regions. Conclusions. Persistent gender differences in CRC incidence across time may have implications for gender-based interventions that take age into consideration. A shift toward proximal cancer was observed over time for both genders, but the high proportion of men who develop rectal cancer suggests that a greater proportion of men may need to be targeted with newer screening methods such as fecal DNA testing or COL. Although previous reports have documented higher CRC screening among men, the higher incidence of CRC observed among men suggests that higher-risk categories of men are probably not being reached. FOBT utilization rates among women have increased over time, and the gender gap widened between 1998 and 2005. COL utilization is associated with male gender, but the differences over time are small.
Abstract:
Objective: To investigate the association between processed and unprocessed red meat consumption and prostate cancer (PCa) stage in a homogeneous Mexican-American population. Methods: This population-based case-control study had a total of 582 participants (287 cases with histologically confirmed adenocarcinoma of the prostate gland and 295 age- and ethnicity-matched controls), all residing in the Southeast region of Texas from 1998 to 2006. All questionnaire information was collected using a validated data collection instrument. Statistical Analysis: Descriptive analyses included Student's t-test and Pearson's chi-square tests. Odds ratios and 95% confidence intervals were calculated to quantify the association between nutritional factors and PCa stage. A multivariable model was used for unconditional logistic regression. Results: After adjusting for relevant covariates, those who consumed high amounts of processed red meat had non-significantly increased odds of being diagnosed with localized PCa (OR = 1.60, 95% CI: 0.85-3.03) and total PCa (OR = 1.43, 95% CI: 0.81-2.52), but not with advanced PCa (OR = 0.91, 95% CI: 1.37-2.23). Interestingly, high consumption of carbohydrates showed a significant reduction in the odds of being diagnosed with total PCa and advanced PCa (OR = 0.43, 95% CI: 0.24-0.77; OR = 0.27, 95% CI: 0.10-0.71, respectively). However, consuming high amounts of energy from protein and fat increased the odds of being diagnosed with advanced PCa (OR = 4.62, 95% CI: 1.69-12.59; OR = 2.61, 95% CI: 1.04-6.58, respectively). Conclusion: Mexican-Americans who consumed high amounts of energy from protein and fat had increased odds of being diagnosed with advanced PCa, while high amounts of carbohydrates reduced the odds of being diagnosed with total and advanced PCa.
Abstract:
Few studies have been conducted on the epidemiology of enteric infectious diseases of public health importance in communities along the United States-Mexico border, and these studies typically focus on bacterial and viral diseases. The epidemiology of intestinal helminth infections along the border has not recently been explored, and there are no published reports for El Paso and Ciudad Juarez, both of which are high-traffic urban areas along the Texas-Mexico border. The purpose of this research project was to conduct a cross-sectional epidemiologic survey for enteric helminths of medical importance in the Texas-Mexico border region of El Paso and Ciudad Juarez and to evaluate risk factors for exposure to these parasites. In addition, an emphasis was placed on the zoonotic tapeworm Taenia solium. This tapeworm is especially important in this region because of the increasing incidence of neurocysticercosis, a severe disease spread by carriers of intestinal T. solium. Fecal samples were collected from individuals of all ages in a population-based cross-sectional household survey and evaluated for the presence of helminth parasites using fecal flotations. In addition, a Taenia coproantigen enzyme-linked immunosorbent assay (ELISA) was performed on each stool sample to identify tapeworm carriers. A standardized questionnaire was administered to identify risk factors and routes of exposure for enteric helminth infections, with additional questions to assess risk factors specific to taeniasis. The actual prevalence of taeniasis along the Texas-Mexico border was unknown, and this is the first population-based study performed in this region. Flotations were performed on 395 samples, and four (1%) were positive for helminths, including Ascaris, hookworms, and Taenia species. Immunodiagnostic testing demonstrated a prevalence of 2.9% (11/378) for taeniasis. Based on the case definition, a 3% (12/395) prevalence of taeniasis was detected in this area.
In addition, statistical analyses indicate that residents of El Paso are 8.5 times more likely than residents of Ciudad Juarez to be tapeworm carriers (PR = 8.5, 95% CI: 2.35-30.81). This finding has important implications for planning effective health education campaigns to decrease the prevalence of enteric helminths in populations along the Texas-Mexico border.
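The prevalence ratio (PR) above is simply the prevalence in one group divided by the prevalence in the other. A minimal sketch, with group-level counts chosen for illustration to reproduce a PR of 8.5 (the study's actual per-city counts are not given in the abstract):

```python
# Hedged sketch of a prevalence ratio; the counts are hypothetical and
# chosen only so the ratio matches the reported PR of 8.5.

def prevalence_ratio(cases_a, n_a, cases_b, n_b):
    """Prevalence in group A divided by prevalence in group B."""
    return (cases_a / n_a) / (cases_b / n_b)

# E.g. 10 carriers among 200 residents vs 1 among 170 gives PR = 8.5.
pr = prevalence_ratio(10, 200, 1, 170)
```

The wide confidence interval (2.35-30.81) reflects the small number of carriers detected overall (12 of 395 samples).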