942 results for Cox Proportional Hazards Model


Relevance: 100.00%

Abstract:

In the general population, HDL cholesterol (HDL-C) is associated with reduced cardiovascular events. However, recent experimental data suggest that the vascular effects of HDL can be heterogeneous. We examined the association of HDL-C with all-cause and cardiovascular mortality in the Ludwigshafen Risk and Cardiovascular Health study comprising 3307 patients undergoing coronary angiography. Patients were followed for a median of 9.9 years. Estimated GFR (eGFR) was calculated using the Chronic Kidney Disease Epidemiology Collaboration eGFR creatinine-cystatin C (eGFRcreat-cys) equation. The effect of increasing HDL-C serum levels was assessed using Cox proportional hazard models. In participants with normal kidney function (eGFR>90 ml/min per 1.73 m²), higher HDL-C was associated with reduced risk of all-cause and cardiovascular mortality and with coronary artery disease severity (hazard ratio [HR], 0.51, 95% confidence interval [95% CI], 0.26-0.92 [P=0.03]; HR, 0.30, 95% CI, 0.13-0.73 [P=0.01]). Conversely, in patients with mildly (eGFR=60-89 ml/min per 1.73 m²) and more markedly reduced kidney function (eGFR<60 ml/min per 1.73 m²), higher HDL-C was not associated with lower risk of mortality (eGFR=60-89 ml/min per 1.73 m²: HR, 0.68, 95% CI, 0.45-1.04 [P=0.07]; HR, 0.84, 95% CI, 0.50-1.40 [P=0.50]; eGFR<60 ml/min per 1.73 m²: HR, 1.18, 95% CI, 0.60-1.81 [P=0.88]; HR, 0.82, 95% CI, 0.40-1.69 [P=0.60]). Moreover, Cox regression analyses revealed an interaction between HDL-C and eGFR in predicting all-cause and cardiovascular mortality (P=0.04 and P=0.02, respectively). We confirmed the lack of association between higher HDL-C and lower mortality in an independent cohort of patients with definite CKD (P=0.63). In summary, higher HDL-C levels were not associated with reduced mortality risk or coronary artery disease severity in patients with reduced kidney function. Indeed, abnormal HDL function might confound the outcome of HDL-targeted therapies in these patients.
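The interaction test reported above can be reproduced in outline with a Cox model that contains an HDL-C-by-eGFR product term. Below is a minimal, hedged sketch in Python using the lifelines package; the file name and column names (time_years, died, hdl_c, egfr_group) are hypothetical placeholders, not the study's actual variables.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("luric_cohort.csv")                # hypothetical analysis file
    df["hdl_x_egfr"] = df["hdl_c"] * df["egfr_group"]   # product term for the interaction

    cph = CoxPHFitter()
    cph.fit(df[["time_years", "died", "hdl_c", "egfr_group", "hdl_x_egfr"]],
            duration_col="time_years", event_col="died")
    cph.print_summary()  # p-value on hdl_x_egfr tests whether eGFR modifies the HDL-C effect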

Relevance: 100.00%

Abstract:

OBJECTIVES To report on trends in tuberculosis ascertainment among HIV patients in a rural HIV cohort in Tanzania, and to assess the impact of a bundle of services implemented in December 2012, consisting of three components: (i) integration of HIV and tuberculosis services; (ii) GeneXpert for tuberculosis diagnosis; and (iii) electronic data collection. DESIGN Retrospective cohort study of patients enrolled in the Kilombero Ulanga Antiretroviral Cohort (KIULARCO), Tanzania. METHODS HIV patients without a prior history of tuberculosis enrolled in the KIULARCO cohort between 2005 and 2013 were included. Cox proportional hazard models were used to estimate rates and predictors of tuberculosis ascertainment. RESULTS Of 7114 HIV-positive patients enrolled, 5123 (72%) had no history of tuberculosis. Of these, 66% were female, median age was 38 years, median baseline CD4+ cell count was 243 cells/µl, and 43% had WHO clinical stage 3 or 4. During follow-up, 421 incident tuberculosis cases were notified, with an estimated incidence of 3.6 per 100 person-years (p-y) [95% confidence interval (CI) 3.26-3.97]. The incidence rate varied over time and increased significantly from 2.96 to 43.98 cases per 100 p-y after the introduction of the bundle of services in December 2012. Four independent predictors of tuberculosis ascertainment were identified: poor clinical condition at baseline (hazard ratio (HR) 3.89, 95% CI 2.87-5.28), WHO clinical stage 3 or 4 (HR 2.48, 95% CI 1.88-3.26), being antiretroviral-naïve (HR 2.97, 95% CI 2.25-3.94), and registration in 2013 (HR 6.07, 95% CI 4.39-8.38). CONCLUSION The integration of tuberculosis and HIV services, together with comprehensive electronic data collection and use of GeneXpert, dramatically increased the ascertainment of tuberculosis in this rural African HIV cohort.
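As a rough illustration of the two quantities reported in the results (a crude incidence per 100 person-years and multivariable Cox hazard ratios for the predictors), a hedged Python/lifelines sketch follows; the data file and variable names are assumptions, not the KIULARCO data dictionary.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("kiularco.csv")                 # hypothetical extract
    cases = df["tb"].sum()                           # incident tuberculosis events
    person_years = df["followup_years"].sum()
    print(f"incidence: {100 * cases / person_years:.2f} per 100 p-y")

    cph = CoxPHFitter()
    cph.fit(df[["followup_years", "tb", "poor_condition", "who_stage_3_4",
                "art_naive", "enrolled_2013"]],
            duration_col="followup_years", event_col="tb")
    cph.print_summary()                              # hazard ratios for the four predictors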

Relevance: 100.00%

Abstract:

OBJECTIVES We studied the influence of noninjecting and injecting drug use on mortality, dropout rate, and the course of antiretroviral therapy (ART) in the Swiss HIV Cohort Study (SHCS). METHODS Cohort participants, registered prior to April 2007 and with at least one drug use questionnaire completed by May 2013, were categorized according to their self-reported drug use behaviour. The probabilities of death and dropout were analysed separately using multivariable competing risks proportional hazards regression models with mutual correction for the other endpoint. Furthermore, we describe the influence of drug use on the course of ART. RESULTS A total of 6529 participants (including 31% women) were followed during 31,215 person-years; 5.1% of participants died and 10.5% were lost to follow-up. Among persons with homosexual or heterosexual HIV transmission, noninjecting drug use was associated with higher all-cause mortality [subhazard ratio (SHR) 1.73; 95% confidence interval (CI) 1.07-2.83] compared with no drug use. Mortality was also increased among former injecting drug users (IDUs) who reported noninjecting drug use (SHR 2.34; 95% CI 1.49-3.69). Noninjecting drug use was associated with higher dropout rates. The mean proportion of time with suppressed viral replication was 82.2% in all participants, irrespective of ART status, and 91.2% in those on ART. Drug use lowered adherence and increased rates of ART change and ART interruption. Virological failure on ART was more frequent in participants who reported concomitant drug injection while on opiate substitution and in current IDUs, but not among noninjecting drug users. CONCLUSIONS Noninjecting and injecting drug use are modifiable risk factors for death; they also lower retention in the cohort and complicate ART.
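The competing-risks (subhazard) regression used here has no widely used drop-in Python implementation, but the underlying idea of treating death and dropout as competing endpoints can be illustrated nonparametrically with the Aalen-Johansen estimator in lifelines. This is a hedged sketch of that related step only, under an assumed event coding (0 = censored, 1 = death, 2 = dropout) and hypothetical column names.

    import pandas as pd
    from lifelines import AalenJohansenFitter

    df = pd.read_csv("shcs_drug_use.csv")            # hypothetical extract
    ajf = AalenJohansenFitter()
    ajf.fit(durations=df["years"], event_observed=df["event_code"], event_of_interest=1)
    print(ajf.cumulative_density_.tail())            # cumulative incidence of death, with dropout as competing event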

Relevance: 100.00%

Abstract:

BACKGROUND The impact of early treatment with immunomodulators (IM) and/or TNF antagonists on bowel damage in Crohn's disease (CD) patients is unknown. AIM To assess whether 'early treatment' with IM and/or TNF antagonists, defined as treatment within a 2-year period from the date of CD diagnosis, was associated with development of fewer disease complications when compared to 'late treatment', defined as treatment initiation more than 2 years after CD diagnosis. METHODS Data from the Swiss IBD Cohort Study were analysed. The following outcomes were assessed using Cox proportional hazard modelling: bowel strictures, perianal fistulas, internal fistulas, intestinal surgery, perianal surgery and any of the aforementioned complications. RESULTS The 'early treatment' group of 292 CD patients was compared to the 'late treatment' group of 248 CD patients. We found that 'early treatment' with IM or TNF antagonists alone was associated with reduced risk of bowel strictures [hazard ratio (HR) 0.496, P = 0.004 for IM; HR 0.276, P = 0.018 for TNF antagonists]. Furthermore, 'early treatment' with IM was associated with reduced risk of undergoing intestinal surgery (HR 0.322, P = 0.005) and perianal surgery (HR 0.361, P = 0.042), as well as of developing any complication (HR 0.567, P = 0.006). CONCLUSIONS Treatment with immunomodulators or TNF antagonists within the first 2 years after CD diagnosis was associated with reduced risk of developing bowel strictures compared with initiating these drugs more than 2 years after diagnosis. Furthermore, early immunomodulator treatment was associated with reduced risk of intestinal surgery, perianal surgery and any complication.
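One way to organize the endpoint-by-endpoint Cox analyses described above is a simple loop over outcomes with 'early treatment' as a binary covariate. The sketch below (Python, lifelines) is illustrative only; column names such as early_tx and time_to_stricture are hypothetical.

    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("sibdcs_cd.csv")        # hypothetical extract; early_tx is 1 for early, 0 for late
    endpoints = ["stricture", "perianal_fistula", "internal_fistula",
                 "intestinal_surgery", "perianal_surgery", "any_complication"]

    for ep in endpoints:
        cph = CoxPHFitter()
        cph.fit(df[[f"time_to_{ep}", ep, "early_tx"]],
                duration_col=f"time_to_{ep}", event_col=ep)
        print(f"{ep}: HR = {cph.hazard_ratios_['early_tx']:.3f}")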

Relevance: 100.00%

Abstract:

BACKGROUND There are limited published data on the outcomes of infants starting antiretroviral therapy (ART) in routine care in Southern Africa. This study aimed to examine the baseline characteristics and outcomes of infants initiating ART. METHODS We analyzed prospectively collected cohort data from routine ART initiation in infants from 11 cohorts contributing to the International Epidemiologic Database to Evaluate AIDS in Southern Africa. We included ART-naive HIV-infected infants aged <12 months initiating ≥3 antiretroviral drugs between 2004 and 2012. Kaplan-Meier estimates were calculated for mortality, loss to follow-up (LTFU), transfer out, and virological suppression. We used Cox proportional hazard models stratified by cohort to determine the baseline characteristics associated with mortality and virological suppression. RESULTS The median (interquartile range) age at ART initiation of the 4945 infants was 5.9 months (3.7-8.7), with follow-up of 11.2 months (2.8-20.0). At ART initiation, 77% had WHO clinical stage 3 or 4 disease and 87% were severely immunosuppressed. The three-year mortality probability was 16% and LTFU 29%. Severe immunosuppression, WHO stage 3 or 4 disease, anemia, being severely underweight, and initiation of treatment before 2010 were associated with higher mortality. At 12 months after ART initiation, 17% of infants were severely immunosuppressed and the probability of attaining virological suppression was 56%. CONCLUSIONS Most infants initiating ART in Southern Africa had severe disease, with high probability of LTFU and mortality on ART. Although the majority of infants remaining in care showed immune recovery and virological suppression, these responses were suboptimal.
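A hedged sketch of the two analysis steps named in the methods (Kaplan-Meier estimates and Cox models stratified by contributing cohort), again in Python with lifelines; the data file and variable names are assumptions.

    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    df = pd.read_csv("iedea_infants.csv")            # hypothetical extract
    kmf = KaplanMeierFitter()
    kmf.fit(df["years"], event_observed=df["died"])
    print(1 - kmf.survival_function_at_times(3.0))   # approximate 3-year mortality probability

    cph = CoxPHFitter()
    cph.fit(df[["years", "died", "severe_immunosuppression", "who_stage_3_4",
                "anemia", "severely_underweight", "started_before_2010", "cohort"]],
            duration_col="years", event_col="died", strata=["cohort"])
    cph.print_summary()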

Relevance: 100.00%

Abstract:

BACKGROUND The impact of contemporary treatment of pre-invasive breast cancer (ductal carcinoma in situ [DCIS]) on long-term outcomes remains poorly defined. We aimed to evaluate national treatment trends for DCIS and to determine their impact on disease-specific (DSS) and overall survival (OS). METHODS The Surveillance, Epidemiology, and End Results (SEER) registry was queried for patients diagnosed with DCIS from 1991 to 2010. Treatment pattern trends were analyzed using the Cochran-Armitage trend test. Survival analyses were performed using inverse probability weight (IPW)-adjusted competing risk analyses for DSS and Cox proportional hazard regression for OS. All tests performed were two-sided. RESULTS A total of 121,080 DCIS patients were identified. The greatest proportion of patients was treated with lumpectomy and radiation therapy (43.0%), followed by lumpectomy alone (26.5%) and unilateral (23.8%) or bilateral mastectomy (4.5%), with significant shifts over time. The rate of sentinel lymph node biopsy increased from 9.7% to 67.1% for mastectomy and from 1.4% to 17.8% for lumpectomy. Compared with mastectomy, OS was higher for lumpectomy with radiation (hazard ratio [HR] = 0.79, 95% confidence interval [CI] = 0.76 to 0.83, P < .001) and lower for lumpectomy alone (HR = 1.17, 95% CI = 1.13 to 1.23, P < .001). IPW-adjusted ten-year DSS was highest for lumpectomy with radiation (98.9%), followed by mastectomy (98.5%) and lumpectomy alone (98.4%). CONCLUSIONS We identified substantial shifts in treatment patterns for DCIS from 1991 to 2010. When outcomes between locoregional treatment options were compared, we observed greater differences in OS than in DSS, likely reflecting both a prevailing patient selection bias and clinically negligible differences in breast cancer outcomes between groups.
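The IPW-adjusted analyses can be sketched generically as: estimate treatment probabilities with a multinomial logistic model, invert them into weights, and fit a weighted Cox model. The Python code below illustrates that recipe only, not the authors' specification; the dataset, covariates, and 0-3 coding of treatment are assumptions.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter

    df = pd.read_csv("seer_dcis.csv")                       # hypothetical extract
    X = sm.add_constant(df[["age", "year_dx", "grade"]])
    mnl = sm.MNLogit(df["treatment"], X).fit(disp=0)        # treatment coded 0..3
    probs = np.asarray(mnl.predict(X))
    df["ipw"] = 1.0 / probs[np.arange(len(df)), df["treatment"].to_numpy()]

    cph = CoxPHFitter()
    cph.fit(df[["months", "died", "lump_rt", "lump_only", "bilateral_mastectomy", "ipw"]],
            duration_col="months", event_col="died", weights_col="ipw", robust=True)
    cph.print_summary()                                     # OS hazard ratios vs. the reference treatment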

Relevance: 100.00%

Abstract:

OBJECTIVE To assess whether palliative primary tumor resection in colorectal cancer patients with incurable stage IV disease is associated with improved survival. BACKGROUND There is a heated debate regarding whether or not an asymptomatic primary tumor should be removed in patients with incurable stage IV colorectal disease. METHODS Stage IV colorectal cancer patients were identified in the Surveillance, Epidemiology, and End Results database between 1998 and 2009. Patients undergoing surgery to metastatic sites were excluded. Overall survival and cancer-specific survival were compared between patients with and without palliative primary tumor resection using risk-adjusted Cox proportional hazard regression models and stratified propensity score methods. RESULTS Overall, 37,793 stage IV colorectal cancer patients were identified. Of those, 23,004 (60.9%) underwent palliative primary tumor resection. The rate of patients undergoing palliative primary cancer resection decreased from 68.4% in 1998 to 50.7% in 2009 (P < 0.001). In Cox regression analysis after propensity score matching, primary cancer resection was associated with significantly improved overall survival [hazard ratio (HR) of death = 0.40, 95% confidence interval (CI) = 0.39-0.42, P < 0.001] and cancer-specific survival (HR of death = 0.39, 95% CI = 0.38-0.40, P < 0.001). The benefit of palliative primary cancer resection persisted throughout the period 1998 to 2009, with HRs equal to or less than 0.47 for both overall and cancer-specific survival. CONCLUSIONS On the basis of this population-based cohort of stage IV colorectal cancer patients, palliative primary tumor resection was associated with improved overall and cancer-specific survival. Therefore, the dogma that an asymptomatic primary tumor should never be resected in patients with unresectable colorectal cancer metastases must be questioned.
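A hedged sketch of a stratified propensity score analysis of this kind: model the probability of palliative resection with logistic regression, cut the score into quintiles, and stratify the Cox model on those quintiles. The variable names and covariates below are hypothetical.

    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter

    df = pd.read_csv("seer_stage4_crc.csv")            # hypothetical extract
    X = sm.add_constant(df[["age", "female", "year_dx", "grade"]])
    ps = sm.Logit(df["resection"], X).fit(disp=0).predict(X)
    df["ps_quintile"] = pd.qcut(ps, 5, labels=False)   # propensity score quintiles

    cph = CoxPHFitter()
    cph.fit(df[["months", "died", "resection", "ps_quintile"]],
            duration_col="months", event_col="died", strata=["ps_quintile"])
    print(cph.hazard_ratios_["resection"])             # propensity-stratified HR for resection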

Relevance: 100.00%

Abstract:

BACKGROUND & AIMS Cirrhotic patients with acute decompensation frequently develop acute-on-chronic liver failure (ACLF), which is associated with high mortality rates. Recently, a specific score for these patients was developed using the CANONIC study database. The aims of this study were to develop and validate the CLIF-C AD score, a specific prognostic score for hospitalised cirrhotic patients with acute decompensation (AD) but without ACLF, and to compare it with the Child-Pugh, MELD, and MELD-Na scores. METHODS The derivation set included 1016 CANONIC study patients without ACLF. Proportional hazards models considering liver transplantation as a competing risk were used to identify score parameters. Estimated coefficients were used as relative weights to compute the CLIF-C ADs. External validation was performed in 225 cirrhotic AD patients. The CLIF-C ADs was also tested for sequential use. RESULTS Age, serum sodium, white-cell count, creatinine and INR were selected as the best predictors of mortality. The C-index was higher for the CLIF-C ADs than for the Child-Pugh, MELD, and MELD-Na scores for predicting 3- and 12-month mortality in the derivation, internal validation and external datasets. The ability of the CLIF-C ADs to predict 3-month mortality improved when using data from days 2, 3-7, and 8-15 (C-index: 0.72, 0.75, and 0.77, respectively). CONCLUSIONS The new CLIF-C ADs is more accurate than other liver scores in predicting prognosis in hospitalised cirrhotic patients without ACLF. The CLIF-C ADs may therefore be used to identify a high-risk cohort for intensive management and a low-risk group that may be discharged early.
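The abstract's idea of using estimated Cox coefficients as relative weights and judging the resulting score by its C-index can be sketched as follows. This is purely illustrative: the coefficients produced here are whatever the toy fit yields, not the published CLIF-C AD formula, and all names are hypothetical.

    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.utils import concordance_index

    df = pd.read_csv("canonic_ad.csv")                   # hypothetical extract
    predictors = ["age", "sodium", "wbc", "creatinine", "inr"]

    cph = CoxPHFitter()
    cph.fit(df[["months", "died"] + predictors], duration_col="months", event_col="died")

    df["score"] = df[predictors].mul(cph.params_, axis=1).sum(axis=1)   # linear predictor as a score
    cindex = concordance_index(df["months"], -df["score"], df["died"])  # higher score = higher risk
    print(f"C-index of the derived score: {cindex:.3f}")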

Relevance: 100.00%

Abstract:

The objective of this study was to determine the impact of different follow-up cystoscopy frequencies on time to development of invasive bladder cancer in a cohort of 3,658 eligible patients aged 65 and older with an initial diagnosis of superficial bladder cancer between 1994 and 1998. Bladder cancer patients in the Surveillance, Epidemiology, and End Results (SEER)-Medicare database were used as the study population. It was hypothesized that superficial bladder cancer patients receiving less frequent cystoscopy follow-up would develop invasive bladder cancer sooner after initial diagnosis and treatment than patients seen more frequently for cystoscopy follow-up. Cox proportional hazard regression revealed that patients seen for cystoscopy every 3 or more months were 83–89% less likely to develop invasive cancer than patients seen every 1 to 2 months. A comparison of the two groups (1 to 2 months vs. ≥3 months) revealed that the 1 to 2 month group may have had more aggressive disease and was therefore seen more frequently. These findings suggest that there are two groups of superficial bladder cancer patients: those at high risk of developing invasive bladder cancer and those at low risk. Patients who developed invasive bladder cancer sooner after initial diagnosis and treatment were seen more frequently for cystoscopy follow-up. The recommendation is that cystoscopy frequency should be based on disease status at 3 months. Standardized schedules give all patients the same number of cystoscopies regardless of their risk factors. This could lead to unnecessary cystoscopies in low-risk patients and fewer than optimal cystoscopies in high-risk patients.

Relevance: 100.00%

Abstract:

Ordinal logistic regression models are used to analyze dependent variables with multiple outcomes that can be ranked, but they have been underutilized. In this methodological study, we describe four logistic regression models for analyzing an ordinal response variable. The first is the multinomial logistic model. The second is the adjacent-category logit model. The third is the proportional odds model, and the fourth is the continuation-ratio model. We illustrate and compare the fit of these models using data from the survey designed by The University of Texas School of Public Health research project PCCaSO (Promoting Colon Cancer Screening in people 50 and Over), to study patients' confidence in the completion of colorectal cancer screening (CRCS). The purpose of this study is twofold: first, to provide a synthesized review of models for analyzing data with ordinal responses, and second, to evaluate their usefulness in epidemiological research, with particular emphasis on model formulation, interpretation of model coefficients, and their implications. The four ordinal logistic models used in this study are (1) the multinomial logistic model, (2) the adjacent-category logistic model [9], (3) the continuation-ratio logistic model [10], and (4) the proportional odds logistic model [11]. We recommend that the analyst perform (1) goodness-of-fit tests and (2) sensitivity analysis by fitting and comparing different models.
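Two of the four models (the multinomial logit, which ignores the ordering, and the proportional odds model) have direct implementations in statsmodels; the adjacent-category and continuation-ratio logits are usually fit by restructuring the data into a set of binary logits. The sketch below shows the two built-in fits under assumed variable names; it is illustrative only, not the PCCaSO analysis code.

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    df = pd.read_csv("pccaso_survey.csv")       # hypothetical extract; confidence coded 0..3 (ordered)
    X = df[["age", "female", "prior_crcs"]]

    # (1) Multinomial logistic model: treats the response categories as unordered
    mnl = sm.MNLogit(df["confidence"], sm.add_constant(X)).fit(disp=0)

    # (3) Proportional odds (cumulative logit) model: one slope per covariate, ordered cutpoints
    po = OrderedModel(df["confidence"], X, distr="logit").fit(method="bfgs", disp=0)
    print(po.summary())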

Relevance: 100.00%

Abstract:

Hereditary nonpolyposis colorectal cancer (HNPCC) is an autosomal dominant disease caused by germline mutations in DNA mismatch repair (MMR) genes. The nucleotide excision repair (NER) pathway plays a very important role in cancer development. We systematically studied interactions between NER and MMR genes to identify NER gene single nucleotide polymorphism (SNP) risk factors that modify the effect of MMR mutations on cancer risk in HNPCC. We analyzed data from polymorphisms in 10 NER genes that had been genotyped in HNPCC patients carrying MSH2 and MLH1 gene mutations. The influence of the NER gene SNPs on time to onset of colorectal cancer (CRC) was assessed using survival analysis and a semiparametric proportional hazard model. We found that the median age of onset of CRC among MMR mutation carriers with the ERCC1 variant was 3.9 years earlier than in patients with wild-type ERCC1 (median 47.7 vs 51.6, log-rank test p=0.035). The influence of the Rad23B A249V SNP on age of onset of HNPCC is age dependent (likelihood ratio test p=0.0056). Interestingly, using the likelihood ratio test, we also found evidence of genetic interactions between the MMR gene mutations and SNPs in the ERCC1 gene (C8092A) and the XPG/ERCC5 gene (D1104H), with p-values of 0.004 and 0.042, respectively. An assessment using tree-structured survival analysis (TSSA) showed distinct gene interactions in MLH1 mutation carriers and MSH2 mutation carriers. ERCC1 SNP genotypes greatly modified the age of onset of HNPCC in MSH2 mutation carriers, while no effect was detected in MLH1 mutation carriers. Given that the NER genes in this study play different roles in the NER pathway, they may have distinct influences on the development of HNPCC. The findings are important for elucidating the molecular mechanism of colon cancer development and for understanding why some MSH2 and MLH1 mutation carriers develop CRC early while others never develop CRC. Overall, the findings also have important implications for the development of early detection and prevention strategies, as well as for understanding the mechanism of colorectal carcinogenesis in HNPCC.
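The two inferential tools named above (a log-rank comparison of onset curves by genotype and a likelihood ratio test for a gene-gene interaction) can be sketched with lifelines and scipy, using age at onset as the time scale. Column names and the 0/1 coding below are assumptions for illustration.

    import pandas as pd
    from scipy import stats
    from lifelines import CoxPHFitter
    from lifelines.statistics import logrank_test

    df = pd.read_csv("hnpcc_carriers.csv")                 # hypothetical extract
    wt, var = df[df["ercc1_variant"] == 0], df[df["ercc1_variant"] == 1]
    lr = logrank_test(wt["age_onset"], var["age_onset"], wt["crc"], var["crc"])
    print(lr.p_value)                                      # log-rank p-value by ERCC1 genotype

    base = CoxPHFitter().fit(df[["age_onset", "crc", "msh2_carrier", "ercc1_variant"]],
                             duration_col="age_onset", event_col="crc")
    df["msh2_x_ercc1"] = df["msh2_carrier"] * df["ercc1_variant"]
    full = CoxPHFitter().fit(df[["age_onset", "crc", "msh2_carrier", "ercc1_variant", "msh2_x_ercc1"]],
                             duration_col="age_onset", event_col="crc")
    lrt = 2 * (full.log_likelihood_ - base.log_likelihood_)
    print(stats.chi2.sf(lrt, df=1))                        # likelihood ratio test for the interaction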

Relevance: 100.00%

Abstract:

Introduction and objective. A number of prognostic factors have been reported for predicting survival in patients with renal cell carcinoma, yet few studies have analyzed the effects of those factors at different stages of the disease process. In this study, different stages of disease progression were evaluated: from nephrectomy to metastasis, from metastasis to death, and from evaluation to death. Methods. In this retrospective follow-up study, records of 97 deceased renal cell carcinoma (RCC) patients were reviewed between September 2006 and October 2006. Patients with TNM stage IV disease before nephrectomy or with cancer diagnoses other than RCC were excluded, leaving 64 records for analysis. Patient TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology and patient gender were analyzed in relation to time to metastasis. Time from nephrectomy to metastasis, TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology and patient gender were tested for significance in relation to time from metastasis to death. Finally, laboratory values at the time of evaluation, Eastern Cooperative Oncology Group (ECOG) performance status, the UCLA Integrated Staging System (UISS), time from nephrectomy to metastasis, TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology and patient gender were tested for significance in relation to time from evaluation to death. Linear regression and Cox proportional hazard analyses (univariate and multivariate) were used for testing significance. The Kaplan-Meier log-rank test was used to detect any significance between groups at various endpoints. Results. Compared to negative lymph nodes at the time of nephrectomy, a single positive lymph node was associated with significantly shorter time to metastasis (p<0.0001). Compared to other histological types, clear cell histology was associated with significant metastasis-free survival (p=0.003). Clear cell histology compared to other types (p=0.0002 univariate, p=0.038 multivariate) and time to metastasis with log conversion (p=0.028) significantly affected time from metastasis to death. Metastasis-free intervals of greater than one year and greater than two years, compared with metastasis before one and two years, conferred a statistically significant survival benefit (p=0.004 and p=0.0318). Time from evaluation to death was affected by a greater than one year metastasis-free interval (p=0.0459), alcohol consumption (p=0.044), LDH (p=0.006), ECOG performance status (p<0.001), and hemoglobin level (p=0.0092). The UISS risk-stratified the patient population in a statistically significant manner for survival (p=0.001). No other factors were found to be significant. Conclusion. Clear cell histology is predictive of both time to metastasis and time from metastasis to death. Nodal status at the time of nephrectomy may predict risk of metastasis. The time interval to metastasis significantly predicts time from metastasis to death and time from evaluation to death. ECOG performance status and hemoglobin level predict survival outcome at evaluation. Finally, the UISS appropriately stratifies risk in our population.

Relevance: 100.00%

Abstract:

The overall purpose of this study was to assess the relationship between the promoter region polymorphism (-2607 1G/2G) of matrix metalloproteinase-1 (MMP-1) and outcome in brain tumor patients diagnosed with a primary brain tumor between 1994 and 2000 at The University of Texas M. D. Anderson Cancer Center. The MMP-1 polymorphism was genotyped for all brain tumor patients who participated in the Family Brain Tumor Study and for whom blood samples were available. Relevant covariates were abstracted from medical records for all cases from the original protocol, including information on demographics, tumor histology, therapy and outcome. The hypothesis was that brain tumor patients with the 2G allele have a poorer prognosis and shorter survival than brain tumor patients with the 1G allele. Experimental design: Genetic variants of the MMP-1 enzyme were determined by a polymerase chain reaction-restriction fragment length polymorphism assay. Overall survival was compared between cases with the 2G allele and cases with the 1G allele using multivariable Cox proportional hazard analysis, controlling for age, sex, Karnofsky Performance Scale (KPS) score, extent of surgery, tumor histology and treatment received. Kaplan-Meier and Cox proportional hazard analyses were used to assess whether the MMP-1 polymorphism was related to overall survival. Results: Overall survival was not statistically significantly different between brain tumor patients with the 2G allele and those with the 1G allele, and there was no statistically significant difference between tumor types. Conclusions: No association was found between the MMP-1 polymorphism and survival in patients with malignant gliomas.

Relevance: 100.00%

Abstract:

Bladder cancer is the fourth most common cancer in men in the United States. There is compelling evidence that genetic variations contribute to the risk and outcomes of bladder cancer. The PI3K-AKT-mTOR pathway is a major cellular pathway involved in proliferation, invasion, inflammation, tumorigenesis, and drug response. Somatic aberrations of the PI3K-AKT-mTOR pathway are frequent events in several cancers, including bladder cancer; however, no studies have investigated the role of germline genetic variations in this pathway in bladder cancer. In this project, we used a large case-control study to evaluate the associations of a comprehensive catalogue of SNPs in this pathway with bladder cancer risk and outcomes. Three SNPs in RAPTOR were significantly associated with susceptibility: rs11653499 (OR: 1.79, 95% CI: 1.24–2.60), rs7211818 (OR: 2.13, 95% CI: 1.35–3.36), and rs7212142 (OR: 1.57, 95% CI: 1.19–2.07). Two haplotypes constructed from these 3 SNPs were also associated with bladder cancer risk. In combined analysis, a significant trend was observed for increased risk with an increasing number of unfavorable genotypes (P for trend<0.001). Classification and regression tree analysis identified potential gene-environment interactions between RPS6KA5 rs11653499 and smoking. In superficial bladder cancer, we found that PTEN rs1234219 and rs11202600, TSC1 rs7040593, RAPTOR rs901065, and PIK3R1 rs251404 were significantly associated with recurrence in patients receiving BCG. In muscle-invasive and metastatic bladder cancer, AKT2 rs3730050, PIK3R1 rs10515074, and RAPTOR rs9906827 were associated with survival. Survival tree analysis revealed potential gene-gene interactions: patients carrying the unfavorable genotypes of PTEN rs1234219 and TSC1 rs704059 exhibited a 5.24-fold (95% CI: 2.44–11.24) increased risk of recurrence. In combined analysis, an increasing number of unfavorable genotypes was associated with a significant trend toward higher risk of recurrence and death (P for trend<0.001) in Cox proportional hazard regression analysis, and with shorter event-free (recurrence and death) survival in Kaplan-Meier estimates (P log-rank<0.001). This study strongly suggests that genetic variations in the PI3K-AKT-mTOR pathway play an important role in bladder cancer development. The identified SNPs, if validated in further studies, may become valuable biomarkers for assessing an individual's cancer risk, predicting prognosis and treatment response, and helping physicians make individualized treatment decisions.
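The "P for trend" over the number of unfavorable genotypes and the accompanying log-rank comparison can be sketched as below: the genotype count enters the Cox model as a single numeric covariate, and the Kaplan-Meier groups are compared with a multi-group log-rank test. The column names and the three risk indicators are hypothetical stand-ins.

    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.statistics import multivariate_logrank_test

    df = pd.read_csv("bladder_cohort.csv")                  # hypothetical extract
    df["n_unfavorable"] = df[["pten_risk", "tsc1_risk", "raptor_risk"]].sum(axis=1)

    cph = CoxPHFitter()
    cph.fit(df[["months", "recurrence", "n_unfavorable"]],
            duration_col="months", event_col="recurrence")
    print(cph.summary.loc["n_unfavorable", "p"])            # P for trend across genotype counts

    mlr = multivariate_logrank_test(df["months"], df["n_unfavorable"], df["recurrence"])
    print(mlr.p_value)                                      # log-rank across the count groups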

Relevance: 100.00%

Abstract:

Background. Colorectal cancer (CRC) is the third most commonly diagnosed cancer (excluding skin cancer) in both men and women in the United States, with an estimated 148,810 new cases and 49,960 deaths in 2008 (1). Racial/ethnic disparities have been reported across the CRC care continuum. Studies have documented racial/ethnic disparities in CRC screening (2-9), but only a few studies have looked at these differences in CRC screening over time (9-11), and no studies have compared these trends between a population with CRC and a population without cancer. Additionally, although there is evidence suggesting that hospital factors (e.g. teaching hospital status and NCI designation) are associated with CRC survival (12-16), no studies have sought to explain the racial/ethnic differences in survival by looking at differences in socio-demographics, tumor characteristics, screening, co-morbidities, treatment, and hospital characteristics. Objectives and methods. The overall goals of this dissertation were to describe the patterns and trends of racial/ethnic disparities in CRC screening (i.e. fecal occult blood test (FOBT), sigmoidoscopy (SIG) and colonoscopy (COL)) and to determine whether racial/ethnic disparities in CRC survival are explained by differences in socio-demographic factors, tumor characteristics, screening, co-morbidities, treatment, and hospital factors. These goals were accomplished in a two-paper format. In Paper 1, "Racial/Ethnic Disparities and Trends in Colorectal Cancer Screening in Medicare Beneficiaries with Colorectal Cancer and without Cancer in SEER Areas, 1992-2002", the study population consisted of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and 62,917 Medicare beneficiaries without cancer during the same period. Both cohorts were aged 67 to 89 years and resided in 16 Surveillance, Epidemiology and End Results (SEER) regions of the United States. Screening procedures between 6 months and 3 years prior to the date of diagnosis for CRC patients, and prior to the index date for persons without cancer, were identified in Medicare claims. The crude and age-gender-adjusted percentages and odds ratios of receiving FOBT, SIG, or COL were calculated. Multivariable logistic regression was used to assess the effect of race/ethnicity on the odds of receiving CRC screening over time. Paper 2, "Racial/Ethnic Disparities in Colorectal Cancer Survival: To what extent are racial/ethnic disparities in survival explained by racial differences in socio-demographics, screening, co-morbidities, treatment, tumor or hospital characteristics", included a cohort of 50,186 Medicare beneficiaries diagnosed with CRC from 1992 to 2002 and residing in 16 SEER regions of the United States, identified in the SEER-Medicare linked database. Survival was estimated using the Kaplan-Meier method. Cox proportional hazard modeling was used to estimate hazard ratios (HR) of mortality and 95% confidence intervals (95% CI). Results. The screening analysis demonstrated racial/ethnic disparities in screening over time among the cohort without cancer. From 1992 to 1995, Blacks and Hispanics were less likely than Whites to receive FOBT (OR=0.75, 95% CI: 0.65-0.87; OR=0.50, 95% CI: 0.34-0.72, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.72-0.85; OR=0.67, 95% CI: 0.54-0.75, respectively).
Blacks and Hispanics were also less likely than Whites to receive SIG from 1992 to 1995 (OR=0.75, 95% CI: 0.57-0.98; OR=0.29, 95% CI: 0.12-0.71, respectively), but their odds of screening increased from 2000 to 2002 (OR=0.79, 95% CI: 0.68-0.93; OR=0.50, 95% CI: 0.35-0.72, respectively). The survival analysis showed that Blacks had worse CRC-specific survival than Whites (HR: 1.33, 95% CI: 1.23-1.44), but this difference was reduced for stage I-III disease after full adjustment for socio-demographic factors, tumor characteristics, screening, co-morbidities, treatment and hospital characteristics (aHR=1.24, 95% CI: 1.14-1.35). Socioeconomic status, tumor characteristics, treatment and co-morbidities contributed to the reduction in hazard ratios between Blacks and Whites with stage I-III disease. Asians had better survival than Whites both before (HR: 0.73, 95% CI: 0.64-0.82) and after (aHR: 0.80, 95% CI: 0.70-0.92) adjusting for all predictors for stage I-III disease. For stage IV disease, both Asians and Hispanics had better survival than Whites, and this advantage persisted after full adjustment (aHR=0.73, 95% CI: 0.63-0.84; aHR=0.74, 95% CI: 0.61-0.92, respectively). Conclusion. Screening disparities between Blacks and Whites and between Hispanics and Whites remain, but have decreased in recent years. Future studies should explore other factors that may contribute to screening disparities, such as physician recommendations and language/cultural barriers, in this and younger populations. There were substantial racial/ethnic differences in CRC survival among older Whites, Blacks, Asians and Hispanics. Co-morbidities, SES, tumor characteristics, treatment and other predictor variables contributed to, but did not fully explain, the CRC survival differences between Blacks and Whites. Future research should examine the role of quality of care, particularly the benefit of treatment and post-treatment surveillance, in racial disparities in survival.
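The period-specific odds ratios reported in the screening analysis follow the pattern of one multivariable logistic regression per era with race/ethnicity as the exposure. A hedged statsmodels sketch of that pattern is shown below; the file, covariates, and category labels are assumptions, not the SEER-Medicare variable names.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("seer_medicare_screening.csv")          # hypothetical extract
    for period in ["1992-1995", "1996-1999", "2000-2002"]:
        sub = df[df["period"] == period]
        fit = smf.logit("fobt ~ C(race, Treatment('White')) + age + female", data=sub).fit(disp=0)
        ors = np.exp(fit.params).filter(like="race")          # odds ratios vs. Whites
        print(period, ors.round(2).to_dict())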