19 results for Residual-based tests
in DigitalCommons@The Texas Medical Center
Abstract:
Linkage disequilibrium methods can be used to find genes influencing quantitative trait variation in humans. They can require smaller sample sizes than linkage methods, such as the variance component approach, to find loci with a specific effect size. The increase in power comes at the expense of requiring more markers to be typed to scan the entire genome. This thesis compares different linkage disequilibrium methods to determine which factors influence the power to detect disequilibrium. The costs of disequilibrium and equilibrium tests were compared to determine whether the savings in phenotyping costs when using disequilibrium methods outweigh the additional genotyping costs. Nine linkage disequilibrium tests were examined by simulation. Five tests involve selecting isolated unrelated individuals, while four involve the selection of parent-child trios (TDT). All nine tests were able to identify disequilibrium with the correct significance level in Hardy-Weinberg populations. Increasing linked genetic variance and trait allele frequency increased the power to detect disequilibrium, while increasing the number of generations and the distance between marker and trait loci decreased it. Discordant sampling was used for several of the tests; the more stringent the sampling, the greater the power to detect disequilibrium in a sample of a given size. The power to detect disequilibrium was not affected by the presence of polygenic effects. When the trait locus had more than two trait alleles, the power of the tests plateaued below one.
For the simulation methods used here, when there were more than two trait alleles there was a probability, equal to one minus the heterozygosity of the marker locus, that both trait alleles were in disequilibrium with the same marker allele, rendering the marker uninformative for disequilibrium. The five tests using isolated unrelated individuals were found to have excess error rates when there was disequilibrium due to population admixture. Increased error rates also resulted from increased unlinked major gene effects, discordant trait allele frequency, and increased disequilibrium. Polygenic effects did not affect the error rates. The TDT (Transmission Disequilibrium Test)-based tests were not subject to any increase in error rates. For all sample ascertainment costs, for recent mutations (<100 generations), linkage disequilibrium tests were less expensive to carry out than the variance component test. Candidate gene scans reduced costs even further. The use of recently admixed populations also decreased the cost of performing a linkage disequilibrium test.
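The TDT referred to above can be summarized with a short sketch. This is a generic illustration of the classic TDT chi-square statistic (a McNemar-type test on transmissions from heterozygous parents), not code from the thesis; the transmission counts are hypothetical.

```python
# Generic sketch of the Transmission Disequilibrium Test (TDT) statistic.
# b = number of times heterozygous (M1/M2) parents transmitted marker allele M1
#     to an affected child; c = number of transmissions of M2.
# Under no linkage/association, (b - c)^2 / (b + c) is ~ chi-square with 1 df.

def tdt_statistic(b: int, c: int) -> float:
    if b + c == 0:
        raise ValueError("no informative (heterozygous) transmissions")
    return (b - c) ** 2 / (b + c)

# Hypothetical counts: 45 transmissions of M1 vs 25 of M2.
stat = tdt_statistic(45, 25)  # compare against the chi-square(1) critical value 3.84
```

Because only transmissions from heterozygous parents enter the statistic, the test is robust to population admixture, which is why the TDT-based tests above show no inflated error rates.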
Abstract:
Apolipoprotein E (ApoE) plays a major role in the metabolism of high-density and low-density lipoproteins (HDL and LDL). Its common protein isoforms (E2, E3, E4) are risk factors for coronary artery disease (CAD) and explain between 16% and 23% of the inter-individual variation in plasma apoE levels. Linkage analysis has been completed for plasma apoE levels in the GENOA study (Genetic Epidemiology Network of Atherosclerosis). After stratification of the population by lipoprotein levels and body mass index (BMI), to create more homogeneity with regard to the biological context for apoE levels, Hispanic families showed significant linkage on chromosome 17q for two strata (LOD=2.93 at 104 cM for a low cholesterol group, LOD=3.04 at 111 cM for a low cholesterol, high HDLC group). Replication of the 17q linkage was observed for apoB and apoE levels in the unstratified Hispanic and African-American populations, and for apoE levels in African-American families. Replication of this 17q linkage in different populations and strata provides strong support for the presence of gene(s) in this region with significant roles in the determination of inter-individual variation in plasma apoE levels. Through a positional and functional candidate gene approach, ten genes were identified in the 17q linked region, and 62 polymorphisms in these genes were genotyped in the GENOA families. Association analysis was performed with FBAT, GEE, and variance-component based tests, followed by conditional linkage analysis. Association studies with partial coverage of tagSNPs in the gene coding for apolipoprotein H (APOH) were performed, and significant results were found for 2 SNPs (APOH_20951 and APOH_05407) in the Hispanic low cholesterol stratum, accounting for 3.49% of the inter-individual variation in plasma apoE levels.
Among the other candidate genes, we identified a haplotype block in the ACE1 gene that contains two major haplotypes associated with apoE levels, as well as with total cholesterol, apoB, and LDLC levels, in the unstratified Hispanic population. Identifying the genes responsible for the remaining inter-individual variation in plasma apoE levels will yield new insights into the genetic interactions involved in lipid metabolism and a more precise understanding of the risk factors leading to CAD.
Abstract:
Genome-wide association studies (GWAS) have successfully identified several genetic loci associated with inherited predisposition to primary biliary cirrhosis (PBC), the most common autoimmune disease of the liver. Pathway-based tests constitute a novel paradigm for GWAS analysis. By evaluating genetic variation across a biological pathway (gene set), these tests have the potential to determine the collective impact of variants with subtle effects that are individually too weak to be detected in traditional single-variant GWAS analysis. To identify biological pathways associated with the risk of developing PBC, GWAS of PBC from Italy (449 cases and 940 controls) and Canada (530 cases and 398 controls) were independently analyzed. The linear combination test (LCT), a recently developed pathway-level statistical method, was used for this analysis. For additional validation, pathways that were replicated at the P<0.05 level of significance in both GWAS on LCT analysis were also tested for association with PBC in each dataset using two complementary GWAS pathway approaches: a modification of the gene set enrichment analysis algorithm (i-GSEA4GWAS) and Fisher's exact test for pathway enrichment ratios. Twenty-five pathways were associated with PBC risk on LCT analysis in the Italian dataset at P<0.05, of which eight had an FDR<0.25. The top pathway in the Italian dataset was the TNF/stress-related signaling pathway (p=7.38×10⁻⁴, FDR=0.18). Twenty-six pathways were associated with PBC at the P<0.05 level using the LCT in the Canadian dataset, with the regulation and function of ChREBP in liver pathway (p=5.68×10⁻⁴, FDR=0.285) emerging as the most significant. Two pathways, the phosphatidylinositol signaling system (Italian: p=0.016, FDR=0.436; Canadian: p=0.034, FDR=0.693) and hedgehog signaling (Italian: p=0.044, FDR=0.636; Canadian: p=0.041, FDR=0.693), were replicated at LCT P<0.05 in both datasets.
Statistically significant association of both pathways with PBC genetic susceptibility was confirmed in the Italian dataset with i-GSEA4GWAS. Results for the phosphatidylinositol signaling system were also significant in both datasets when applying Fisher's exact test for pathway enrichment ratios. This study identified a combination of known and novel pathway-level associations with PBC risk. If functionally validated, the findings may yield fresh insights into the etiology of this complex autoimmune disease, with possible preventive and therapeutic applications.
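As an illustration of the last of those complementary approaches, a one-sided Fisher's exact test for pathway enrichment can be computed directly from the hypergeometric distribution. This is a generic sketch, not the study's code, and the 2x2 counts are hypothetical.

```python
# One-sided Fisher's exact test (hypergeometric upper tail) for a 2x2
# enrichment table:
#   a = associated SNPs inside the pathway,  b = associated SNPs outside,
#   c = other SNPs inside the pathway,       d = other SNPs outside.
from math import comb

def fisher_exact_greater(a: int, b: int, c: int, d: int) -> float:
    n = a + b + c + d
    row1, col1 = a + b, a + c          # margins: associated SNPs, pathway SNPs
    p = 0.0
    for x in range(a, min(row1, col1) + 1):   # P(X >= a) under independence
        p += comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    return p

p_enrich = fisher_exact_greater(2, 3, 3, 2)   # hypothetical counts
```

A small p-value here indicates more associated SNPs fall inside the pathway than the marginal totals would predict by chance.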
Abstract:
Since interferon-gamma release assays (IGRAs) were introduced in the 2000s, tuberculin skin testing (TST) and IGRAs have been used in various latent tuberculosis infection (LTBI) screening settings. IGRAs are laboratory-based tests and are considered not to be affected by previous Bacille de Calmette et Guérin (BCG) vaccination; however, they are more costly when compared directly with TST, which does not require specimen processing in a laboratory. This study aimed to examine TST and two types of IGRAs, QuantiFERON-TB Gold in Tube (QFT-GIT) and T-SPOT.TB (TSPOT), from an economic viewpoint. First, a systematic literature review was conducted to identify cost-related analyses of LTBI screening. Second, specific cost information detailing each test's items and labor was collected from an LTBI screening program for health care workers in Houston, and the cost of each test was computed. Third, using the computed cost estimate of each test, cost-effectiveness analyses were conducted to compare TST and IGRAs. The literature search showed that a limited number of studies have been conducted, but the IGRAs' economic advantages were consistent across studies. Cost analyses showed that IGRAs were much more costly than TST, consistent with previous studies. In cost-effectiveness analyses, where test cost and consequential TB-related costs were considered, IGRAs showed variable advantages over TST depending on the targeted population. When only non-BCG-vaccinated people were considered, TST was the least costly option among the three tests. On the other hand, when only BCG-vaccinated people were considered, IGRAs were the less costly options. These results were mostly consistent even with varying assumption parameters. IGRAs can be more costly than TST, but their economic disadvantage is alleviated when the target population is BCG-vaccinated. Based on current knowledge, IGRAs may be recommended in a population with a mixed BCG history.
Additional studies are needed to better understand the IGRAs' reliability in low-risk populations in which background TB prevalence is low.
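The cost-effectiveness logic in this abstract can be sketched as an expected-cost calculation per person screened: a cheap test that yields more BCG-driven false positives can end up costlier overall than an expensive, more specific one. All prices and probabilities below are hypothetical placeholders, not figures from this study.

```python
# Hypothetical expected screening cost per person: test cost plus downstream
# TB-related costs weighted by the chance of false results.
def expected_cost(test_cost: float,
                  p_false_positive: float, cost_fp_followup: float,
                  p_false_negative: float, cost_missed_tb: float) -> float:
    return (test_cost
            + p_false_positive * cost_fp_followup
            + p_false_negative * cost_missed_tb)

# In a BCG-vaccinated group (placeholder numbers), TST's false positives from
# prior vaccination can outweigh its lower up-front test cost.
tst = expected_cost(10.0, 0.40, 150.0, 0.05, 2000.0)   # = 170.0
igra = expected_cost(60.0, 0.03, 150.0, 0.05, 2000.0)  # = 164.5
```

With these placeholder inputs the IGRA comes out cheaper for a BCG-vaccinated population, mirroring the direction of the study's finding; with a low false-positive rate for TST (no BCG history), the ordering reverses.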
Abstract:
Students arrive at classes with varying social situations and levels of course subject knowledge. Blackboard is a web-based course delivery program that permits testing of students before they arrive at the first class. A pretest was used to assess preexisting subject knowledge (S), and a survey was used to assess non-subject (N) factors that might impact a student's final grade. A posttest was administered after all content was delivered and used to assess the change in S. [See PDF for complete abstract]
Devices in heart failure: potential methods for device-based monitoring of congestive heart failure.
Abstract:
Congestive heart failure has long been one of the most serious medical conditions in the United States; in fact, in the United States alone, heart failure accounts for 6.5 million days of hospitalization each year. One important goal of heart-failure therapy is to inhibit the progression of congestive heart failure through pharmacologic and device-based therapies. Therefore, there have been efforts to develop device-based therapies aimed at improving cardiac reserve and optimizing pump function to meet metabolic requirements. The course of congestive heart failure is often worsened by other conditions, including new-onset arrhythmias, ischemia and infarction, valvulopathy, decompensation, end-organ damage, and therapeutic refractoriness, all of which have an impact on outcomes. The onset of such conditions is sometimes heralded by subtle pathophysiologic changes, and the timely identification of these changes may promote the use of preventive measures. Consequently, device-based methods could in the future have an important role in the timely identification of the subtle pathophysiologic changes associated with congestive heart failure.
Abstract:
PURPOSE: The purpose of this study was to assess the impact of different policies on access to hormonal contraception and pregnancy rates at two high school-based clinics. METHODS: Two clinics in high schools (Schools A and B), located in a large urban district in the southwest US, provide primary medical care to enrolled students with parental consent, the majority of whom have no health insurance coverage. The dispensing policy at School clinic A provides barrier, hormonal, and emergency contraceptive services on site. School clinic B uses a referral policy that directs students to obtain contraception at an off-campus affiliated family planning clinic. Baseline data (age, race, and history of prior pregnancy) on female students seeking hormonal contraception at the two clinics between 9/2008 and 12/2009 were extracted from an electronic administrative database (AHLERS Integrated System). Data on birth control use and pregnancy tests for each student were then tracked electronically through 3/31/2010. The outcome measures were access to hormonal contraception and positive pregnancy tests at any point during or after initiation of birth control, through 12/2009. The appointment-keeping rate for contraceptive services and the overall pregnancy rates were compared between the two schools. In addition, the pregnancy rates were compared between the two schools for students with and without a prior history of pregnancy. RESULTS: School clinic A: 79 students sought hormonal contraception; mean age 17.5 years; 68% were >18 years; 77% were Hispanic; and 20% reported a prior pregnancy. The mean duration of the observation period was 13 months (range 4-19 months). All 79 students received hormonal contraception (65% pill and 35% long-acting progestin injection) onsite. During the observation period, the overall pregnancy rate was 6% (5/79), and 4.7% (3/63) among students with no prior pregnancy.
School clinic B: 40 students sought hormonal contraception; mean age 17.5 years; 52% were >18 years; 88% were Hispanic; and 7.5% reported a prior pregnancy. All 40 students were referred to the affiliated clinic. The mean duration of the observation period was 11.9 months (range 4-19 months). 50% (20/40) kept their appointment. Pills were dispensed to 85% (17/20), and 15% (3/20) received long-acting progestin injection. The overall pregnancy rate was 20% (8/40), and 21.6% (8/37) among students with no prior pregnancy. A significantly higher proportion of students seeking hormonal contraception kept their initial appointment for birth control at the school dispensing contraception onsite than at the school with a referral policy (p<0.05). The pregnancy rate was significantly higher for the school with a referral policy than for the school with onsite contraceptive services (p<0.05). The pregnancy rate was also significantly higher for students without a prior history of pregnancy in the school with a referral policy (21.6%) versus the school with onsite contraceptive services (4.7%) (p<0.05). CONCLUSION: This preliminary study showed that School clinic B, with a referral policy, had a lower appointment-keeping rate for contraceptive services and a higher pregnancy rate than School clinic A, with on-site contraceptive services. An on-site dispensing policy for hormonal contraceptives at high school-based health clinics may be a convenient and effective approach to preventing unintended first and repeat pregnancies among adolescents who seek hormonal contraception. This study has strong implications for reproductive health policy, especially as directed toward high-risk teenage populations.
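A two-proportion comparison of the reported pregnancy rates (5/79 onsite vs 8/40 referral) can be sketched as below. The abstract does not state which test produced its p<0.05, so a pooled z-test is shown purely as an illustration.

```python
# Pooled two-proportion z-test (normal approximation), two-sided p-value.
from math import sqrt, erf

def two_proportion_test(x1: int, n1: int, x2: int, n2: int):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF tail
    return z, p_two_sided

# Pregnancy rates from the abstract: 6% (5/79) onsite vs 20% (8/40) referral.
z, p = two_proportion_test(5, 79, 8, 40)
```

With some cell counts this small, an exact test would be the more conservative choice; the z-test is shown only because it makes the pooled-proportion arithmetic explicit.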
Abstract:
Genetic anticipation is defined as a decrease in the age of onset, or an increase in severity, as a disorder is transmitted through subsequent generations. Anticipation has been noted in the literature for over a century. Recently, anticipation in several diseases, including Huntington's Disease, Myotonic Dystrophy, and Fragile X Syndrome, was shown to be caused by the expansion of triplet repeats. Anticipation effects have also been observed in numerous mental disorders (e.g., Schizophrenia, Bipolar Disorder), cancers (Li-Fraumeni Syndrome, Leukemia), and other complex diseases. Several statistical methods have been applied to determine whether anticipation is a true phenomenon in a particular disorder, including standard statistical tests and newly developed affected-parent/affected-child pair methods. These methods have been shown to be inappropriate for assessing anticipation for a variety of reasons, including familial correlation and low power. Therefore, we have developed family-based likelihood modeling approaches that model the underlying transmission of the disease gene and the penetrance function, and hence detect anticipation. These methods can be applied in extended families, improving the power to detect anticipation compared with existing methods based only upon parents and children. The first proposed method is based on the regressive logistic hazard model and models anticipation with a generational covariate. The second method allows alleles to mutate as they are transmitted from parents to offspring and is appropriate for modeling the known triplet repeat diseases, in which the disease alleles can become more deleterious as they are transmitted across generations. To evaluate the new methods, we performed extensive simulation studies under different conditions to assess the effectiveness of the algorithms in detecting genetic anticipation.
Analysis by the first method yielded empirical power greater than 87%, based on the 5% type I error critical value identified in each simulation, depending on the method of data generation and the current-age criteria. Analysis by the second method was not possible because of the current formulation of the software. Application of this method to Huntington's Disease and Li-Fraumeni Syndrome data sets revealed evidence for a generation effect in both cases.
Abstract:
Linkage and association studies are major analytical tools in the search for susceptibility genes for complex diseases. With the availability of large collections of single nucleotide polymorphisms (SNPs) and rapid progress in high-throughput genotyping technologies, together with the ambitious goals of the International HapMap Project, genetic markers covering the whole genome are becoming available for genome-wide linkage and association studies. To avoid inflating the type I error rate in genome-wide studies, the significance level must be adjusted for each independent linkage and/or association test, which has led to suggested genome-wide significance cut-offs as low as 5×10⁻⁷. Almost no linkage and/or association study can meet such a stringent threshold using standard statistical methods, so new statistics with high power are urgently needed. This dissertation proposes and explores a class of novel test statistics that can be used with both population-based and family-based genetic data by employing a completely new strategy: nonlinear transformations of the sample means are used to construct test statistics for linkage and association studies. Extensive simulation studies illustrate the properties of the nonlinear test statistics. Power calculations are performed using both analytical and empirical methods. Finally, real data sets are analyzed with the nonlinear test statistics. Results show that the nonlinear test statistics have correct type I error rates, and most of the studied nonlinear test statistics have higher power than the standard chi-square test. This dissertation introduces a new way to design test statistics with high power and may open new avenues for mapping susceptibility genes for complex diseases.
Abstract:
With the recognition of the importance of evidence-based medicine, there is an emerging need for methods to systematically synthesize available data. Specifically, methods that provide accurate estimates of test characteristics for diagnostic tests are needed to help physicians make better clinical decisions. To provide more flexible approaches for meta-analysis of diagnostic tests, we developed three Bayesian generalized linear models. Two of these models, a bivariate normal and a binomial model, analyze pairs of sensitivity and specificity values while incorporating the correlation between these two outcome variables. Noninformative independent uniform priors were used for the variances of sensitivity and specificity and for the correlation. We also applied an inverse Wishart prior to check the sensitivity of the results. The third model is a multinomial model in which the test results are modeled as multinomial random variables. All three models can include specific imaging techniques as covariates in order to compare performance. Vague normal priors were assigned to the coefficients of the covariates. The computations were carried out using the 'Bayesian inference Using Gibbs Sampling' (BUGS) implementation of Markov chain Monte Carlo techniques. We investigated the properties of the three proposed models through extensive simulation studies. We also applied these models to a previously published meta-analysis dataset on cervical cancer, as well as to an unpublished melanoma dataset. In general, our findings show that the point estimates of sensitivity and specificity were consistent between the Bayesian and frequentist bivariate normal and binomial models. However, in the simulation studies, the estimates of the correlation coefficient from the Bayesian bivariate models were not as good as those obtained from frequentist estimation, regardless of which prior distribution was used for the covariance matrix.
The Bayesian multinomial model consistently underestimated sensitivity and specificity regardless of the sample size and correlation coefficient. In conclusion, the Bayesian bivariate binomial model provides the most flexible framework for future applications because of the following strengths: (1) it facilitates direct comparison between different tests; (2) it captures the variability in both sensitivity and specificity simultaneously, as well as the intercorrelation between the two; and (3) it can be applied directly to sparse data without ad hoc correction.
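The per-study quantities that all three models share are the sensitivity/specificity pair from each diagnostic 2x2 table, commonly analyzed on the logit scale in bivariate-normal formulations. A minimal sketch with hypothetical counts:

```python
# Sensitivity/specificity from a diagnostic 2x2 table, plus the logit transform
# used by bivariate-normal meta-analysis models. Counts are hypothetical.
from math import log

def sens_spec(tp: int, fn: int, fp: int, tn: int):
    sens = tp / (tp + fn)     # P(test positive | disease present)
    spec = tn / (tn + fp)     # P(test negative | disease absent)
    return sens, spec

def logit(p: float) -> float:
    return log(p / (1 - p))

sens, spec = sens_spec(tp=45, fn=5, fp=10, tn=40)   # 0.9 and 0.8
pair = (logit(sens), logit(spec))                   # the pair modeled jointly
```

The bivariate binomial variant favored in the conclusion models the table counts directly, which is what lets it handle sparse tables without the ad hoc continuity corrections the logit approach needs when a cell is zero.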
Abstract:
Monte Carlo simulation was conducted to investigate parameter estimation and hypothesis testing in some well-known adaptive randomization procedures. The four urn models studied are the Randomized Play-the-Winner (RPW), Randomized Pólya Urn (RPU), Birth and Death Urn with Immigration (BDUI), and Drop-the-Loser (DL) urn. Two sequential estimation methods, sequential maximum likelihood estimation (SMLE) and the doubly adaptive biased coin design (DABC), are simulated at three optimal allocation targets that minimize the expected number of failures under the assumption of constant variance of the simple difference (RSIHR), the relative risk (ORR), and the odds ratio (OOR), respectively. The log likelihood ratio test and three Wald-type tests (simple difference, log of relative risk, log of odds ratio) are compared across the adaptive procedures. Simulation results indicate that although RPW is slightly better at assigning more patients to the superior treatment, the DL method is considerably less variable and its test statistics have better normality. Compared with SMLE, DABC has a slightly higher overall response rate with lower variance, but larger bias and variance in parameter estimation. Additionally, the test statistics in SMLE have better normality and a lower type I error rate, and the power of hypothesis testing is more comparable with that of equal randomization. RSIHR usually has the highest power among the three optimal allocation ratios; however, the ORR allocation has better power and a lower type I error rate when the log of the relative risk is the test statistic, and the expected number of failures under ORR is smaller than under RSIHR. The simple difference of response rates has the worst normality among all four test statistics, and the power of the hypothesis test is always inflated when the simple difference is used. On the other hand, the normality of the log likelihood ratio test statistic is robust to changes in the adaptive randomization procedure.
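A minimal simulation of the RPW urn illustrates how the allocation skews toward the better arm. This is a generic RPW(1, 1) sketch with hypothetical success probabilities, not the dissertation's simulation code.

```python
# Randomized Play-the-Winner urn, RPW(1, 1): draw a ball to assign the arm;
# a success adds a ball of the same arm, a failure adds a ball of the other arm.
import random

def simulate_rpw(p_a: float, p_b: float, n_patients: int, seed: int = 0):
    rng = random.Random(seed)
    urn = ["A", "B"]                       # one ball per arm to start
    assigned = {"A": 0, "B": 0}
    for _ in range(n_patients):
        arm = rng.choice(urn)
        assigned[arm] += 1
        success = rng.random() < (p_a if arm == "A" else p_b)
        other = "B" if arm == "A" else "A"
        urn.append(arm if success else other)  # reinforce winner, switch on loss
    return assigned

# Hypothetical response rates 0.8 vs 0.2: allocation drifts toward arm A
# (limiting proportion q_B / (q_A + q_B), where q denotes the failure rate).
counts = simulate_rpw(0.8, 0.2, 2000)
```

Running this repeatedly also shows the high run-to-run variability of RPW allocations, the property that makes the less variable DL urn attractive in the comparison above.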
Abstract:
Hypertension is usually defined as systolic blood pressure ≥140 mmHg or diastolic blood pressure ≥90 mmHg. Hypertension is one of the main adverse effects of glucocorticoids on the cardiovascular system. Glucocorticoids are essential hormones, secreted from the adrenal glands in a circadian fashion. The glucocorticoid effect on blood pressure is conveyed by the glucocorticoid receptor (NR3C1), an omnipresent nuclear transcription factor. Although polymorphisms in this gene have long been implicated as a causal factor for cardiovascular diseases such as hypertension, no study has yet thoroughly interrogated the gene's polymorphisms for their effect on blood pressure levels. Therefore, I first resequenced ∼30 kb of the gene, encompassing all exons, promoter regions, 5'/3' UTRs, and at least 1.5 kb of the gene's flanking regions, from 114 chromosome 5 monosomic cell lines representing three major American ethnic groups: European American, African American, and Mexican American. I observed 115 polymorphisms and 14 common molecularly phased haplotypes. A subset of markers was chosen for genotyping in the GENOA study populations (Genetic Epidemiology Network of Atherosclerosis; 1022 non-Hispanic whites, 1228 African Americans, and 954 Mexican Americans). Since these study populations include sibships, the family-based association test was performed on four blood pressure-related quantitative variables: pulse, systolic blood pressure, diastolic blood pressure, and mean arterial pressure. In these analyses, multiple correlated SNPs were significantly protective against high systolic blood pressure in non-Hispanic whites, including rsb198, a SNP formerly associated with beneficial body compositions. Haplotype association analysis also supports this finding, and all p-values remained significant after permutation tests. I therefore conclude that multiple correlated SNPs in the gene may confer protection against high blood pressure in non-Hispanic whites.
Abstract:
This study retrospectively evaluated the spatial and temporal disease patterns associated with influenza-like illness (ILI), positive rapid influenza antigen detection tests (RIDT), and confirmed H1N1 S-OIV cases reported to the Cameron County Department of Health and Human Services between April 26 and May 13, 2009, using the space-time permutation scan statistic software SaTScan in conjunction with the geographical information system (GIS) software ArcGIS 9.3. The rate and the age-adjusted relative risk of each influenza measure were calculated, and a cluster analysis was conducted to determine the geographic regions with statistically higher incidence of disease. A Poisson distribution model was developed to identify the effects that socioeconomic status, population density, and certain population attributes of a census block group had on that area's frequency of S-OIV confirmed cases over the entire outbreak. Predominant among the spatiotemporal analyses of ILI, RIDT, and S-OIV cases in Cameron County is a consistent pattern of high concentration of cases along the southern border with Mexico. These findings, together with the slight northward space-time shifts of the ILI and RIDT cluster centers, highlight the southern border as the primary site for public health interventions. Finally, the community-based multiple regression model revealed that three factors (the percentage of the population under age 15, average household size, and the number of high school graduates over age 25) were significantly associated with laboratory-confirmed S-OIV in the Lower Rio Grande Valley. Together, these findings underscore the need for community-based surveillance, improve our understanding of the distribution of the burden of influenza within the community, and have implications for vaccination and community outreach initiatives.
Abstract:
Objectives. Triple-negative breast cancers (TNBC) lack expression of estrogen receptors (ER) and progesterone receptors (PR) and show no Her2 gene amplification. The current literature has identified TNBC and over-expression of the cyclo-oxygenase-2 (COX-2) protein in primary breast cancer as independent markers of poor prognosis in terms of overall and distant disease-free survival. The purpose of this study was to compare COX-2 over-expression in TNBC patients with that in patients who expressed one or more of the three tumor markers (i.e., ER, PR, and/or Her2). Methods. In a secondary data analysis, a cross-sectional design was used to examine the association of interest. Data collected from two ongoing protocols, "LAB04-0657: a model for COX-2 mediated bone metastasis (Specific aim 3)" and "LAB04-0698: correlation of circulating tumor cells and COX-2 expression in primary breast cancer metastasis", were used for the analysis. A sample of 125 female patients was analyzed using chi-square tests and logistic regression models. Results. COX-2 over-expression was present in 33% (41/125) of patients, and 28% (35/124) were identified as having TNBC. TNBC status was associated with elevated COX-2 expression (OR=3.34; 95% CI=1.40-8.22) and high tumor grade (OR=4.09; 95% CI=1.58-10.82). In a multivariable analysis, TNBC status remained an important predictor of COX-2 expression after adjusting for age, menopausal status, BMI, and lymph node status (OR=3.31; 95% CI=1.26-8.67; p=0.01). Conclusion. TNBC is associated with COX-2 expression, a known marker of poor prognosis in patients with operable breast cancer. Replication of these results in a study with a larger sample size, or a future randomized clinical trial demonstrating improved prognosis with COX-2 suppression in these patients, would support this hypothesis.
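The unadjusted odds ratios above are the kind produced by a 2x2 chi-square/logistic setup; a generic sketch with a Wald confidence interval follows. The cell counts used are hypothetical, since the abstract reports only the ORs and CIs.

```python
# Unadjusted odds ratio from a 2x2 table [[a, b], [c, d]] with a Wald 95% CI
# computed on the log-odds scale. Counts below are hypothetical, not study data.
from math import exp, log, sqrt

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical layout: exposure rows (TNBC yes/no) x outcome (COX-2 high/low).
or_, lo, hi = odds_ratio_ci(20, 15, 21, 69)
```

An interval whose lower bound stays above 1, as in the reported CI of 1.40-8.22, is what supports calling the association statistically significant.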
Abstract:
The objective of this dissertation was to determine the initiation and completion rates of adjuvant chemotherapy, its toxicity, and the compliance rates of post-treatment surveillance for elderly patients with colon cancer, using the linked Surveillance, Epidemiology, and End Results-Medicare database. The first study assessed the initiation and completion rates of 5-fluorouracil-based adjuvant chemotherapy and their relationship with patient characteristics. Of the 12,265 patients diagnosed with stage III colon adenocarcinoma in 1991-2005, 64.4% received adjuvant chemotherapy within 3 months after tumor resection, and 40% of them completed the treatment. Age, marital status, and comorbidity score were significant predictors of chemotherapy initiation and completion. The second study estimated the incidence rates of toxicity-related endpoints among stage III colon adenocarcinoma patients treated with chemotherapy in 1991-2005. Of the 12,099 patients, 63.9% underwent chemotherapy; these patients experienced volume depletion disorder (3-month cumulative incidence rate [CIR]=9.1%), agranulocytosis (CIR=3.4%), diarrhea (CIR=2.4%), and nausea and vomiting (CIR=2.3%). Cox regression analysis confirmed the association between chemotherapy and these toxicities (HR=2.76; 95% CI=2.42-3.15). The risk of ischemic heart disease was only slightly associated with chemotherapy overall (HR=1.08), but significantly so among patients aged <75 with no comorbidity (HR=1.70). The third study determined adherence rates for follow-up care among patients diagnosed with stage I-III colon adenocarcinoma between 2000 and June 2002. We identified 7,348 patients with a median follow-up of 59 months. The adherence rate was 83.9% for office visits, 29.4% for CEA tests, and 74.3% for colonoscopy. Overall, 25.2% met the recommended post-treatment care. Younger age at diagnosis, white race, being married, advanced stage, fewer comorbidities, and chemotherapy use were significantly associated with guideline adherence. In conclusion, not all colon cancer patients received chemotherapy.
Receiving chemotherapy was associated with an increased risk of developing gastrointestinal, hematological, and cardiac toxicities. Patients were more likely to comply with the schedules for office visits and colonoscopy than with CEA testing.