61 results for switching regression model


Relevance: 80.00%

Abstract:

Placenta previa is alleged to be more common among women with a history of prior induced abortion. To investigate further whether there is a relationship between previous induced abortion and the subsequent pregnancy complication of placenta previa, a matched case-comparison study was conducted comparing the reproductive histories of 256 women with placenta previa, matched on age, date of delivery, and hospital, with those of 256 women having normal deliveries or cesarean section deliveries without placental complications.

Women with placenta previa had a twofold increase in the odds of having had one previous induced abortion (odds ratio 2.25) over women with no placental complications. Women with placenta previa and two or more previous induced abortions had a sevenfold increase in odds.

The significant association of placenta previa and previous induced abortion remained after including gravida status, previous dilatation and curettage (D&C) status, previous spontaneous abortion, and race in a conditional logistic regression model. There is interaction between high gravidity and previous spontaneous abortion. Dilatation and curettage is associated with placenta previa primarily because women with abortion histories have also had a dilatation and curettage.

Women who are seeking abortion and wish to have children later should be informed that there may be a long-term effect of developing placental complications in subsequent pregnancies. Women who have had at least one induced abortion or any dilatation and curettage procedure should be monitored carefully during any subsequent pregnancy for the risk of placenta previa. This knowledge should alert the physician or nurse-midwife to treat women with a history of previous induced abortions as potential high-risk pregnancies and could perhaps reduce maternal and fetal morbidity rates.
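The reported odds ratios can be reproduced from a 2x2 exposure table. A minimal Python sketch, using hypothetical cell counts (the abstract does not report the underlying table) chosen so the unadjusted odds ratio equals the reported 2.25:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts chosen only so the OR matches the reported 2.25
or_, lo, hi = odds_ratio_ci(60, 30, 40, 45)
```

Note that a matched design like this study's would properly use a conditional (matched-set) estimator, as the abstract's conditional logistic regression does; the unadjusted version above is only for illustration.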

Abstract:

This study described the relationship of sexual maturation and blood pressure in a sample (n = 361) of white females, ages seven through 18, attending public schools in a defined area of Central Texas during October through December 1984. Other correlates of blood pressure were also described for this sample.

A survey was performed to obtain data on height, weight, body mass, pulse rate, upper arm circumference and length, and blood pressure. Each subject self-assessed her secondary sex characteristics (breast and pubic hair) according to drawings of the Tanner stages of maturation. The subjects were interviewed to obtain data on personal health habits and menstrual status. Student age, ethnic group, and place of residence were abstracted from school records. Parents or guardians of the subjects responded to a questionnaire pertaining to parental and subject health history and parents' occupation and educational attainment.

In the simple linear regression analysis, sexual maturation and variables of body size were significantly (p < 0.001) and positively associated with systolic and fourth- and fifth-phase diastolic blood pressure. The demographic and socioeconomic variables were not sufficiently variant in this population to have differential effects on the relation between blood pressure and maturation. Stepwise multiple regression was used to assess the contribution of sexual maturation to the variance of blood pressure after accounting for the variables of body size. Sexual maturation (breast stage), along with weight, height, and body mass, remained in the multiple regression models for fourth- and fifth-phase diastolic blood pressure. Only height and body mass remained in the regression model for systolic blood pressure; sexual maturation did not contribute more to the explanation of the systolic blood pressure variance.

The association of sexual maturation with blood pressure level was established in this sample of young white females. More research is needed, first, to determine whether this relationship prevails in other populations of young females and, second, to determine the relationship of sexual maturation sequence and change with the change of blood pressure during childhood and adolescence.
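The simple linear regression step can be sketched with the closed-form least-squares estimates; the data below are hypothetical placeholders, not the study's measurements:

```python
def simple_linreg(x, y):
    """Ordinary least squares for y = b0 + b1*x (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b1 = sxy / sxx          # slope
    b0 = my - b1 * mx       # intercept
    return b0, b1

# Hypothetical example: Tanner breast stage vs. diastolic BP (mm Hg)
stage = [1, 2, 3, 4, 5]
dbp = [58, 62, 65, 70, 73]
b0, b1 = simple_linreg(stage, dbp)   # b1 > 0: BP rises with maturation
```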

Abstract:

The purpose of this study was to assess the impact of the Arkansas Long-Term Care Demonstration Project upon Arkansas' Medicaid expenditures and upon the clients it serves. A retrospective Medicaid expenditure study component used analysis of variance techniques to test for the Project's effects upon aggregated expenditures for 28 demonstration and control counties, representing 25 percent of the State's population, over four years, 1979-1982.

A second approach to the study question utilized a 1982 prospective sample of 458 demonstration and control clients from the same 28 counties. The disability level or need for care of each patient was established a priori. The extent to which an individual's variation in Medicaid utilization and costs was explained by patient need, presence or absence of the channeling project's placement decision, or some other patient characteristic was examined by multiple regression analysis. Long-term and acute care Medicaid, Medicare, third-party, self-pay, and the grand total of all Medicaid claims were analyzed for project effects and explanatory relationships.

The main project effect was to increase personal care costs without reducing nursing home or acute care costs (prospective study). Expansion of clients appeared to occur in personal care (prospective study) and minimum-care nursing homes (retrospective study) in the project areas. Cost-shifting between Medicaid and Medicare in the project areas and two different patterns of utilization in the North and South projects tended to offset each other, such that no differences in total costs occurred between the project areas and demonstration areas. The project was significant (β = .22, p < .001) only for personal care costs. The explanatory power of this personal care regression model (R² = .36) was comparable to other reported health services utilization models. Other variables (Medicare buy-in, level of disability, Social Security Supplemental Income (SSI), net monthly income, North/South areas, and age) explained more variation in the other twelve cost regression models.

Abstract:

Trauma and severe head injuries are important issues because they are prevalent, because they occur predominantly in the young, and because variations in clinical management may matter. Trauma is the leading cause of death for those under age 40. The focus of this head injury study is to determine whether variations in time from the scene of the accident to a trauma center hospital make a difference in patient outcomes.

A trauma registry is maintained in the Houston-Galveston area and includes all patients admitted to any one of three trauma center hospitals with mild or severe head injuries. A study cohort, derived from the Registry, includes 254 severe head injury cases, for 1980, with a Glasgow Coma Score of 8 or less.

Multiple influences relate to patient outcomes from severe head injury. Two primary variables and four confounding variables are identified: time to emergency room, time to intubation, patient age, severity of injury, type of injury, and mode of transport to the emergency room. Regression analysis, analysis of variance, and chi-square analysis were the principal statistical methods utilized.

Analysis indicates that within an urban setting, with a four-hour time span, variations in time to emergency room do not provide any strong influence or predictive value for patient outcome. However, the data suggest that longer time periods have a negative influence on outcomes. Age is influential only when the older group (55-64) is included. Mode of transport (helicopter or ambulance) did not indicate any significant difference in outcome.

In a multivariate regression model, outcomes are influenced primarily by severity of injury and age, which explain 36% (R²) of the variance. Inclusion of time to emergency room, time to intubation, transport mode, and type of injury adds only 4% (R²) additional contribution to explaining variation in patient outcome.

The research concludes that since the group most at risk of head trauma is the young adult male involved in automobile/motorcycle accidents, more may be gained by modifying driving habits and other preventive measures. Continuous clinical and evaluative research is required to provide updated clinical wisdom in patient management and trauma treatment protocols. A National Institute of Trauma may be required to develop a national public policy and evaluate the many medical, behavioral, and social changes required to cope with the country's number 3 killer and the primary killer of young adults.
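The incremental contribution of a block of predictors, as in the 36% plus 4% breakdown above, is the difference in R² between nested models. A minimal sketch (the fitted values below are hypothetical, not from the study's model):

```python
def r_squared(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ybar = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

def incremental_r2(y, yhat_reduced, yhat_full):
    """Extra variance explained when the added predictors enter."""
    return r_squared(y, yhat_full) - r_squared(y, yhat_reduced)

# Hypothetical outcome and fitted values from two nested models
y = [1, 2, 3, 4]
yhat_reduced = [1.5, 1.5, 3.5, 3.5]   # severity + age only
yhat_full = [1, 2, 3, 4]              # plus time-to-ER variables
delta = incremental_r2(y, yhat_reduced, yhat_full)
```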

Abstract:

The relationship between degree of diastolic blood pressure (DBP) reduction and mortality was examined among hypertensives, ages 30-69, in the Hypertension Detection and Follow-up Program (HDFP). The HDFP was a multi-center community-based trial, which followed 10,940 hypertensive participants for five years. One-year survival was required for inclusion in this investigation, since the one-year annual visit was the first occasion at which change in blood pressure could be measured on all participants. During the subsequent four years of follow-up on 10,052 participants, 568 deaths occurred. For levels of change in DBP and for categories of variables related to mortality, the crude mortality rate was calculated. Time-dependent life tables were also calculated so as to utilize available blood pressure data over time. In addition, the Cox life table regression model, extended to take into account both time-constant and time-dependent covariates, was used to examine the relationship between change in blood pressure over time and mortality.

The results of the time-dependent life table and time-dependent Cox life table regression analyses supported the existence of a quadratic function modeling the relationship between DBP reduction and mortality, even after adjusting for other risk factors. The minimum mortality hazard ratio, based on a particular model, occurred at a DBP reduction of 22.6 mm Hg (standard error = 10.6) in the whole population and 8.5 mm Hg (standard error = 4.6) in the baseline DBP stratum 90-104. Beyond this reduction, there was a small increase in the risk of death. There was no evidence of the quadratic function after fitting the same model using systolic blood pressure. Methodologic issues involved in studying a particular degree of blood pressure reduction were considered. The confidence interval around the change corresponding to the minimum hazard ratio was wide, and the obtained blood pressure level should not be interpreted as a goal for treatment. Blood pressure reduction was attributed not only to pharmacologic therapy, but also to regression to the mean and to other unknown factors unrelated to treatment. Therefore, the surprising results of this study do not provide direct implications for treatment, but strongly suggest replication in other populations.
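With a quadratic term for DBP reduction in the log-hazard, log h(x) = b1*x + b2*x², the reduction minimizing the hazard is the vertex -b1/(2*b2). A sketch with hypothetical coefficients (the abstract does not report the fitted values; these are back-calculated only so the vertex lands at 22.6 mm Hg):

```python
import math

def hazard_minimizing_reduction(b1, b2):
    """Vertex of the quadratic log-hazard b1*x + b2*x**2 (b2 > 0)."""
    return -b1 / (2 * b2)

def hazard_ratio(x, b1, b2):
    """Hazard ratio at DBP reduction x relative to no reduction."""
    return math.exp(b1 * x + b2 * x ** 2)

# Hypothetical coefficients, for illustration only
b1, b2 = -0.0452, 0.001
x_min = hazard_minimizing_reduction(b1, b2)   # 22.6 mm Hg
```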

Abstract:

Invasive pneumococcal disease (IPD) causes a significant health burden in the US: it is responsible for the majority of bacterial meningitis and causes more deaths than any other vaccine-preventable bacterial disease in the US. The estimated national IPD rate is 14.3 cases per 100,000 population, with a case-fatality rate of 1.5 cases per 100,000 population. Although cases of IPD are routinely reported to the local health department in Harris County, Texas, the incidence (IR) and case-fatality (CFR) rates have not been reported. Additionally, it is important to know which serotypes of S. pneumoniae are circulating in Harris County, Texas, and to determine whether 'replacement disease' is occurring.

This study reported incidence and case-fatality rates from 2003 to 2009 and described the trends in IPD, including the IPD serotypes circulating in Harris County, Texas, during the study period, particularly in 2008 and 2010. Annual incidence rates were calculated and reported for 2003 to 2009, using complete surveillance-year data.

Geographic information system (GIS) software was used to create a series of maps of the data reported during the study period. Cluster and outlier analysis and hot spot analysis were conducted using both case counts by census tract and disease rate by census tract.

IPD age- and race-adjusted IRs for Harris County, Texas, and their 95% confidence intervals (CIs) were 1.40 (95% CI 1.0, 1.8), 1.71 (95% CI 1.24, 2.17), 3.13 (95% CI 2.48, 3.78), 3.08 (95% CI 2.43, 3.74), 5.61 (95% CI 4.79, 6.43), 8.11 (95% CI 7.11, 9.1), and 7.65 (95% CI 6.69, 8.61) for the years 2003 to 2009, respectively (rates were age- and race-adjusted to each year's midyear US population estimates). A Poisson regression model demonstrated a statistically significant increasing trend of about 32 percent per year in the IPD rates over the course of the study period. IPD age- and race-adjusted case-fatality rates (CFRs) for Harris County, Texas, were also calculated and reported. A Poisson regression model demonstrated a statistically significant increasing trend of about 26 percent per year in the IPD case-fatality rates from 2003 through 2009. A logistic regression model associated the risk of dying from IPD with alcohol abuse (OR 4.69, 95% CI 2.57, 8.56) and with meningitis (OR 2.42, 95% CI 1.46, 4.03).

The prevalence of non-vaccine serotypes (NVT) among IPD cases with serotyped isolates was 98.2 percent. In 2008, the year with the sample most geographically representative of all areas of Harris County, Texas, the prevalence was 96 percent. Given these findings, it is reasonable to conclude that 'replacement disease' is occurring in Harris County, Texas, meaning that the majority of IPD is caused by serotypes not included in the PCV7 vaccine. In conclusion, IPD rates increased during the study period in Harris County, Texas.
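In a log-linear Poisson trend model, log(rate) = b0 + b1*year, the annual percent change is 100*(e^b1 - 1). A minimal sketch (the coefficient below is back-calculated from the reported 32 percent, not taken from the study output):

```python
import math

def annual_percent_change(beta_year):
    """Percent change per year implied by a Poisson (log-linear)
    trend coefficient: the rate is multiplied by exp(beta_year)
    each year."""
    return 100 * (math.exp(beta_year) - 1)

beta = math.log(1.32)                 # implies a 32% annual increase
apc = annual_percent_change(beta)
```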

Abstract:

Early and accurate detection of TB disease in HIV-infected individuals is a critical step for a successful TB program. In Vietnam, the diagnosis of TB disease, which is based predominantly on the clinical examination, chest radiography (CXR), and acid-fast bacilli (AFB) sputum smear, has been shown to have low sensitivity in immunocompromised patients. Sputum culture is not routinely performed for patients with AFB-negative smears, even in HIV-infected individuals.

Against that background, we conducted this cross-sectional study to estimate the prevalence of sputum culture-confirmed pulmonary tuberculosis (PTB), smear-negative PTB, and multidrug-resistant TB (MDR-TB) in the HIV-infected population in Ho Chi Minh City (HCMC), the largest city in Vietnam, where both TB and HIV are highly prevalent. We also evaluated the diagnostic performance of various algorithms based on tools routinely available in Vietnam, such as symptom screening, CXR, and AFB smear. Nearly 400 subjects were consecutively recruited from HIV-infected patients seeking care at the An Hoa Clinic in District 6 of Ho Chi Minh City from August 2009 through June 2010. Participants' demographic data, clinical status, CXR, and laboratory results were collected. A multiple logistic regression model was developed to assess the association of covariates and PTB.

The prevalence of smear-positive TB, smear-negative TB, resistant TB, and MDR-TB were 7%, 2%, 5%, 2.5%, and 0.3%, respectively. Adjusted odds ratios for low CD4+ cell count, positive sputum smear, and CXR to positive sputum culture were 3.17, 32.04, and 4.28, respectively. Clinical findings alone had poor sensitivity, but the combination of CD4+ cell count, sputum smear, and CXR yielded a more accurate diagnosis.

These study results support the routine use of sputum culture to improve the detection of TB disease in HIV-infected individuals in Vietnam. When routine sputum culture is not available, an algorithm combining CD4+ cell count, sputum smear, and CXR is recommended for diagnosing PTB. Future studies on more affordable, rapid, and accurate tests for TB infection would also be necessary to provide timely specific treatments for patients in need, reduce mortality, and minimize TB transmission to the general population.
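One way a combined algorithm gains sensitivity is an any-positive rule across the component tests, which trades specificity for sensitivity. A sketch of that arithmetic with hypothetical per-test sensitivities (the abstract reports adjusted odds ratios, not individual sensitivities, and the independence assumption below is a simplification):

```python
def parallel_sensitivity(sensitivities):
    """Sensitivity of an any-positive rule: 1 minus the probability
    that every component test misses a true case (assumes the tests
    miss independently, a simplifying assumption)."""
    miss = 1.0
    for s in sensitivities:
        miss *= (1 - s)
    return 1 - miss

# Hypothetical sensitivities for smear, CXR, and low-CD4 screening
combined = parallel_sensitivity([0.5, 0.6, 0.4])
```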

Abstract:

The association between fine particulate matter air pollution (PM2.5) and cardiovascular disease (CVD) mortality was spatially analyzed for Harris County, Texas, at the census tract level. The objective was to assess how increased PM2.5 exposure related to CVD mortality in this area while controlling for race, income, education, and age. An estimated exposure raster was created for Harris County using Kriging to estimate the PM2.5 exposure at the census tract level. The PM2.5 exposure and the CVD mortality rates were analyzed in an ordinary least squares (OLS) regression model, and the residuals were subsequently assessed for spatial autocorrelation. Race, median household income, and age were all found to be significant (p < 0.05) predictors in the model. This study found that for every one μg/m³ increase in PM2.5 exposure, holding age and education variables constant, an increase of 16.57 CVD deaths per 100,000 would be predicted for increased minimum exposure values, and an increase of 14.47 CVD deaths per 100,000 would be predicted for increased maximum exposure values. This finding supports previous studies associating PM2.5 exposure with CVD mortality. This study further identified the areas of greatest PM2.5 exposure in Harris County as being the geographical locations of populations with the highest risk of CVD (i.e., predominantly older, low-income populations with a predominance of African Americans). The magnitude of the effect of PM2.5 exposure on CVD mortality rates in the study region indicates a need for further community-level studies in Harris County and suggests that reducing excess PM2.5 exposure would reduce CVD mortality.
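Interpolating monitor readings to census tracts can be sketched with inverse-distance weighting, a simpler stand-in for the Kriging the study used (Kriging additionally fits a variogram to model spatial covariance). Coordinates and values here are hypothetical:

```python
def idw_estimate(monitors, x, y, power=2):
    """Inverse-distance-weighted PM2.5 estimate at (x, y) from
    monitors = [(mx, my, pm25), ...]. A simplified stand-in for
    Kriging: weights decay with distance, but no spatial
    covariance structure is modeled."""
    num = den = 0.0
    for mx, my, v in monitors:
        d2 = (mx - x) ** 2 + (my - y) ** 2
        if d2 == 0.0:
            return v                      # exactly at a monitor
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Hypothetical monitors: (x, y, PM2.5 in ug/m3)
monitors = [(0, 0, 10.0), (4, 0, 14.0), (0, 4, 12.0)]
tract_pm25 = idw_estimate(monitors, 1.0, 1.0)
```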

Abstract:

Background. The mTOR pathway is commonly altered in human tumors and promotes cell survival and proliferation. Preliminary evidence suggests this pathway's involvement in chemoresistance to platinum and taxanes, the first-line therapy for epithelial ovarian cancer. A pathway-based approach was used to identify individual germline single nucleotide polymorphisms (SNPs) and cumulative effects of multiple genetic variants in mTOR pathway genes and their association with clinical outcome in women with ovarian cancer.

Methods. The case series was restricted to 319 non-Hispanic white women with high-grade ovarian cancer treated with surgery and platinum-based chemotherapy. 135 SNPs in 20 representative genes in the mTOR pathway were genotyped. Hazard ratios (HRs) for death and odds ratios (ORs) for failure to respond to primary therapy were estimated for each SNP using the multivariate Cox proportional hazards model and multivariate logistic regression model, respectively, while adjusting for age, stage, histology, and treatment sequence. A survival tree analysis of SNPs with a statistically significant association (p < 0.05) was performed to identify higher-order gene-gene interactions and their association with overall survival.

Results. There was no statistically significant difference in survival by tumor histology or treatment regimen. The median survival for the cohort was 48.3 months. Seven SNPs were significantly associated with decreased survival. Compared to those with no unfavorable genotypes, the HR for death increased significantly with the number of unfavorable genotypes, and women in the highest risk category had an HR of 4.06 (95% CI 2.29-7.21). The survival tree analysis also identified patients with different survival patterns based on their genetic profiles. Thirteen SNPs on five different genes were found to be significantly associated with treatment response, defined as no evidence of disease after completion of primary therapy. The rare homozygous genotype of SNP rs6973428 showed a 5.5-fold increased risk compared to the wild-type-carrying genotypes. In the cumulative effect analysis, the highest risk group (individuals with ≥8 unfavorable genotypes) was significantly less likely to respond to chemotherapy (OR = 8.40, 95% CI 3.10-22.75) compared to the low-risk group (≤4 unfavorable genotypes).

Conclusions. A pathway-based approach can demonstrate cumulative effects of multiple genetic variants on clinical response to chemotherapy and survival. Therapy targeting the mTOR pathway may modify outcome in select patients.
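The cumulative-effect analysis reduces each subject to a count of unfavorable genotypes and then bins the counts into risk groups. A minimal sketch (the SNP identifiers and genotypes are placeholders; only the ≤4 and ≥8 cut-points come from the abstract):

```python
def unfavorable_count(subject_genotypes, unfavorable_by_snp):
    """Count SNPs at which a subject carries the unfavorable genotype.
    Both arguments map SNP id -> genotype string."""
    return sum(
        1 for snp, geno in subject_genotypes.items()
        if unfavorable_by_snp.get(snp) == geno
    )

def risk_group(count):
    """Bin a count using the abstract's cut-points: <=4 low, >=8 high."""
    if count <= 4:
        return "low"
    if count >= 8:
        return "high"
    return "intermediate"

# Hypothetical SNP ids and genotypes, for illustration only
unfav = {"rs6973428": "AA", "rsA": "GG", "rsB": "CT"}
subject = {"rs6973428": "AA", "rsA": "GG", "rsB": "CC"}
n = unfavorable_count(subject, unfav)     # 2 unfavorable genotypes
```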

Abstract:

Numerous harmful occupational exposures affect working teens in the United States. Teens working in agriculture and other heavy-labor industries may be at risk for occupational exposures to pesticides and solvents. The neurotoxicity of pesticides and solvents at high doses is well known; however, the long-term effects of these substances at low doses on occupationally exposed adolescents have not been well studied. To address this research gap, a secondary analysis of cross-sectional data was completed in order to estimate the prevalence of self-reported symptoms of neurotoxicity among a cohort of high school students from Starr County, Texas, a rural area along the Texas-Mexico border. Multivariable linear regression was used to estimate the association between work status (i.e., no work, farm work, and non-farm work) and symptoms of neurotoxicity, while controlling for age, gender, Spanish-speaking preference, inhalant use, tobacco use, and alcohol use. The sample included 1,208 students. Of these, the majority (85.84%) did not report having worked during the prior nine months, compared to 4.80% who did only farm work, 6.21% who did only non-farm work, and 3.15% who did both types of work. On average, students reported 3.26 symptoms, with a range from 0-16. The most commonly endorsed items across work statuses were those related to memory impairment. Adolescents employed in non-farm jobs reported more neurotoxicity symptoms (mean 4.31; SD 3.97) than those who reported that they did not work. In the adjusted multivariable regression model, adolescents reporting non-farm work status reported an average of 0.77 more neurotoxicity symptoms on the Q16 than those who did not work (p = 0.031). The confounding variables included in the final model were all found to be significantly associated with report of neurotoxicity symptoms. Future research should examine the relationship between these variables and self-report of symptoms of neurotoxicity.

Abstract:

Trastuzumab is a humanized monoclonal antibody developed specifically for HER2-neu over-expressing breast cancer patients. Although highly effective and well tolerated, it has been reported to be associated with congestive heart failure (CHF) in clinical trial settings (up to 27%). This leaves a gap: the Trastuzumab-related CHF rate in the general population, especially among older breast cancer patients with long-term Trastuzumab treatment, remains unknown. This thesis examined the rates and risk factors associated with Trastuzumab-related CHF in a large population of older breast cancer patients.

A retrospective cohort study using the existing Surveillance, Epidemiology and End Results (SEER) and Medicare linked de-identified database was performed. Breast cancer patients ≥ 66 years old, stage I-IV, diagnosed in 1998-2007, fully covered by Medicare with no HMO within one year before and after the first diagnosis month, who received their first chemotherapy no earlier than 30 days prior to diagnosis, were selected as the study cohort. The primary outcome of this study is a diagnosis of CHF after starting chemotherapy with no CHF claims on or before the cancer diagnosis date. ICD-9 and HCPCS codes were used to pool the claims for Trastuzumab use, chemotherapy, comorbidities, and CHF. Statistical analyses, including comparison of characteristics, Kaplan-Meier survival estimates of CHF rates over long-term follow-up, and a multivariable Cox regression model using Trastuzumab as a time-dependent variable, were performed.

Of the 17,684 patients in the selected cohort, 2,037 (12%) received Trastuzumab. Among them, 35% (714 of 2,037) were diagnosed with CHF, compared to a 31% (4,784 of 15,647) CHF rate in other chemotherapy recipients (p < .0001). After 10 years of follow-up, 65% of Trastuzumab users developed CHF, compared to 47% of their counterparts. After adjusting for patient demographic, tumor, and clinical characteristics, older breast cancer patients who used Trastuzumab showed a significantly higher risk of developing CHF than other chemotherapy recipients (HR 1.69, 95% CI 1.54-1.85), and this risk increased with age (p < .0001). Among Trastuzumab users, these covariates also significantly increased the risk of CHF: older age, stage IV disease, non-Hispanic black race, unmarried status, comorbidities, Anthracycline use, Taxane use, and lower educational level. It is concluded that Trastuzumab users among older breast cancer patients had a 69% higher risk of developing CHF than non-Trastuzumab users, much higher than the 27% increase reported in younger clinical trial patients. Older age, non-Hispanic black race, unmarried status, comorbidity, and combined use with Anthracycline or Taxane also significantly increase the risk of CHF development in older patients treated with Trastuzumab.
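The cumulative CHF rates over follow-up come from a Kaplan-Meier estimator, which discounts patients as they are censored. A minimal pure-Python sketch (the event times below are hypothetical, not SEER-Medicare records):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up times in years;
    events: 1 = event (e.g. CHF diagnosed) at that time, 0 = censored.
    Returns (time, S(t)) pairs at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        block = [e for tt, e in data if tt == t]   # everyone leaving at t
        d = sum(block)                             # events at t
        if d > 0:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= len(block)
        i += len(block)
    return curve

# Hypothetical follow-up: 1 = CHF event, 0 = censored
times = [1.0, 2.0, 2.0, 3.5, 4.0, 6.0]
events = [1, 0, 1, 0, 1, 0]
km = kaplan_meier(times, events)
```

The cumulative CHF rate at time t is then 1 - S(t).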

Abstract:

Purpose: School districts in the U.S. regularly offer foods that compete with the USDA reimbursable meal, known as 'a la carte' foods. These foods must adhere to state nutritional regulations; however, the implementation of these regulations often differs across districts. The purpose of this study is to compare two methods of offering a la carte foods on students' lunch intake: 1) an extensive a la carte program, in which schools have a separate area for a la carte food sales that includes non-reimbursable entrees; and 2) a moderate a la carte program, which offers the sale of a la carte foods on the same serving line as reimbursable meals.

Methods: Direct observation was used to assess children's lunch consumption in six schools across two districts in Central Texas (n = 373 observations). Schools were matched on socioeconomic status. Data collectors were randomly assigned to students and recorded foods obtained, foods consumed, source of food, gender, grade, and ethnicity. Observations were entered into a nutrient database program, FIAS Millennium Edition, to obtain nutritional information. Differences in energy and nutrient intake across lunch sources and districts were assessed using ANOVA and independent t-tests. A linear regression model was applied to control for potential confounders.

Results: Students at schools with extensive a la carte programs consumed significantly more calories, carbohydrates, total fat, saturated fat, calcium, and sodium compared to students in schools with moderate a la carte offerings (p < .05). Students in the extensive a la carte program consumed approximately 94 calories more than students in the moderate a la carte program. There was no significant difference in the energy consumption of students who consumed any amount of a la carte food compared to students who consumed none. In both districts, students who consumed a la carte offerings were more likely to consume sugar-sweetened beverages, sweets, chips, and pizza compared to students who consumed no a la carte foods.

Conclusion: The amount, type, and method of offering a la carte foods can significantly affect student dietary intake. This pilot study indicates that when a la carte foods are more available, students consume more calories. Findings underscore the need for further investigation of how the availability of a la carte foods affects children's diets. Guidelines for school a la carte offerings should be maximized to encourage the consumption of healthful foods and appropriate energy intake.
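The between-district comparison of mean intake can be sketched with Welch's two-sample t statistic (the unequal-variance form of the independent t-test). The calorie values below are hypothetical placeholders, not the study's observations:

```python
import math

def welch_t(x, y):
    """Welch's t statistic for the difference in means of two
    independent samples with unequal variances."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((xi - mx) ** 2 for xi in x) / (nx - 1)
    vy = sum((yi - my) ** 2 for yi in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

# Hypothetical lunch calories, extensive vs. moderate program
extensive = [720, 680, 750, 700, 690]
moderate = [610, 640, 600, 630, 620]
t = welch_t(extensive, moderate)      # positive: extensive > moderate
```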

Abstract:

Bisphosphonates represent a unique class of drugs that effectively treat and prevent a variety of bone-related disorders, including metastatic bone disease and osteoporosis. High tolerance and high efficacy rates quickly ranked bisphosphonates as the standard of care for bone-related diseases. However, in the early 2000s, case reports began to surface that linked bisphosphonates with osteonecrosis of the jaw (ONJ). Since that time, studies have corroborated the linkage. However, as with most disease states, many factors can contribute to the onset of disease. The aim of this study was to determine which comorbid factors presented an increased risk for developing ONJ in cancer patients.

Using a case-control study design, investigators used a combination of ICD-9 codes and chart review to identify confirmed cases of ONJ at The University of Texas M. D. Anderson Cancer Center (MDACC). Each case was then matched to five controls based on age, gender, race/ethnicity, and primary cancer diagnosis. Data querying and chart review provided information on the variables of interest, including bisphosphonate exposure, glucocorticoid exposure, smoking history, obesity, and diabetes. Statistical analysis was conducted using PASW (Predictive Analytics Software) Statistics, Version 18 (SPSS Inc., Chicago, Illinois).

One hundred twelve (112) cases were identified as confirmed cases of ONJ. Variables were screened using univariate logistic regression to determine significance (p < .05); significant variables were included in the final conditional logistic regression model. Concurrent use of bisphosphonates and glucocorticoids (OR, 18.60; CI, 8.85 to 39.12; p < .001), current smoking (OR, 2.52; CI, 1.21 to 5.25; p = .014), and presence of diabetes (OR, 1.84; CI, 1.06 to 3.20; p = .030) were found to increase the risk of developing ONJ. Obesity was not significantly associated with ONJ development.

In this study, cancer patients who received bisphosphonates as part of their therapeutic regimen were found to have an 18-fold increase in their risk of developing ONJ. Other factors included smoking and diabetes. More studies examining the concurrent use of glucocorticoids and bisphosphonates may be able to strengthen any correlations.
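For 1:5 matched sets, the conditional logistic model maximizes a likelihood in which each case's linear predictor is compared only against its own matched controls. A sketch of that conditional log-likelihood for a single scalar exposure (the data and coefficient below are hypothetical, not the study's):

```python
import math

def cond_logit_loglik(matched_sets, beta):
    """Conditional logistic log-likelihood for 1:M matched sets.
    matched_sets: list of (case_exposure, [control_exposures]).
    Each set contributes log of exp(beta*x_case) over the sum of
    exp(beta*x) across the whole set (case plus its own controls)."""
    ll = 0.0
    for case_x, control_xs in matched_sets:
        denom = math.exp(beta * case_x)
        denom += sum(math.exp(beta * x) for x in control_xs)
        ll += beta * case_x - math.log(denom)
    return ll

# Hypothetical 1:5 sets: exposure 1 = concurrent bisphosphonate use
sets = [(1, [0, 0, 1, 0, 0]), (1, [0, 0, 0, 0, 0]), (0, [1, 0, 0, 0, 0])]
ll_null = cond_logit_loglik(sets, 0.0)   # each set contributes -log(6)
```

In practice the coefficient is found by maximizing this function numerically; exp(beta) is then the matched odds ratio.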

Abstract:

Pancreatic cancer is the fourth most common cause of cancer death in the United States, with a five-year survival rate of less than 5% under current treatments, particularly because it is usually detected at a late stage. Identifying a high-risk population in which to launch effective preventive strategies and interventions to control this highly lethal disease is desperately needed. The genetic etiology of pancreatic cancer has not been well profiled. We hypothesized that genetic variants left unidentified by previous genome-wide association studies (GWAS) of pancreatic cancer, due to stringent statistical thresholds or missing interaction analyses, may be unveiled using alternative approaches. To achieve this aim, we explored genetic susceptibility to pancreatic cancer in terms of marginal associations of pathways and genes, as well as their interactions with risk factors. We conducted pathway- and gene-based analyses using GWAS data from 3141 pancreatic cancer patients and 3367 controls of European ancestry. Using the gene set ridge regression in association studies (GRASS) method, we analyzed 197 pathways from the Kyoto Encyclopedia of Genes and Genomes (KEGG) database. Using the logistic kernel machine (LKM) test, we analyzed 17906 genes defined by the University of California Santa Cruz (UCSC) database. Using the likelihood ratio test (LRT) in a logistic regression model, we analyzed 177 pathways and 17906 genes for interactions with risk factors in 2028 pancreatic cancer patients and 2109 controls of European ancestry.

After adjusting for multiple comparisons, six pathways were marginally associated with risk of pancreatic cancer (P < 0.00025): Fc epsilon RI signaling, maturity onset diabetes of the young, neuroactive ligand-receptor interaction, and long-term depression (Ps < 0.0002), and the olfactory transduction and vascular smooth muscle contraction pathways (P = 0.0002). Nine genes were marginally associated with pancreatic cancer risk (P < 2.62 × 10−5), including five reported genes (ABO, HNF1A, CLPTM1L, SHH and MYC), as well as four novel genes (OR13C4, OR13C3, KCNA6 and HNF4G). Three pathways significantly interacted with risk factors in modifying the risk of pancreatic cancer (P < 2.82 × 10−4): the chemokine signaling pathway with obesity (P < 1.43 × 10−4), and the calcium signaling (P < 2.27 × 10−4) and MAPK signaling (P < 2.77 × 10−4) pathways with diabetes. However, none of the 17906 genes tested for interactions survived the multiple comparisons correction. In summary, our GWAS study unveiled previously unidentified genetic susceptibility to pancreatic cancer using alternative methods. These novel findings provide new perspectives on genetic susceptibility to and molecular mechanisms of pancreatic cancer and, once confirmed, will shed promising light on the prevention and treatment of this disease.
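The adjusted significance cut-offs quoted above are consistent with Bonferroni corrections: the family-wise alpha divided by the number of tests. A quick check (that the pathway threshold is 0.05/197 is an assumption about how it was derived, but the arithmetic matches):

```python
def bonferroni_threshold(alpha, n_tests):
    """Per-test P-value cut-off controlling the family-wise error
    rate at alpha across n_tests independent tests."""
    return alpha / n_tests

pathway_cut = bonferroni_threshold(0.05, 197)   # ~0.000254, close to
                                                # the quoted P < 0.00025
```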

Abstract:

Background: The follow-up care of women with breast cancer requires an understanding of disease recurrence patterns, and the follow-up visit schedule should be determined according to the times when recurrences are most likely to occur, so that preventive measures can be taken to avoid or minimize recurrence.

Objective: To model breast cancer recurrence as a stochastic process, with the aim of generating a hazard function for determining a follow-up schedule.

Methods: We modeled the process of disease progression as a time-transformed Wiener process, and the first hitting time was used as an approximation of the true failure time. A woman's recurrence-free survival time is modeled as the time it takes the Wiener process to cross a threshold value, which represents the woman experiencing a breast cancer recurrence event. Following development of the first-hitting-time model, we explored a threshold regression model, which takes into account covariates that contribute to the prognosis of breast cancer. Using real data from SEER-Medicare, we proposed models of the follow-up visit schedule on the basis of a constant probability of disease recurrence between consecutive visits.

Results: We demonstrated that threshold regression based on the first-hitting-time modeling approach can provide useful predictive information about breast cancer recurrence. Our results suggest that the surveillance and follow-up schedule can be determined for women based on prognostic factors such as tumor stage. Women with early-stage disease may be seen less frequently for follow-up visits than women with locally advanced stages. Our results from SEER-Medicare data support the idea of risk-controlled follow-up strategies for groups of women.

Conclusion: The methodology we propose allows one to determine individual follow-up scheduling based on a parametric hazard function that incorporates known prognostic factors.
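For a Wiener process with drift mu > 0 and diffusion sigma, the first hitting time of a barrier a > 0 follows an inverse Gaussian distribution; visit times with a constant recurrence probability p between consecutive visits solve F(t_k) = k*p. A sketch under hypothetical parameter values (mu, sigma, and a below are illustrative, not fitted to SEER-Medicare):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def fht_cdf(t, mu, sigma, a):
    """P(first hitting time <= t): inverse Gaussian CDF for a Wiener
    process with drift mu > 0 hitting barrier a > 0. Hitting the
    barrier represents a recurrence event."""
    s = sigma * math.sqrt(t)
    return (norm_cdf((mu * t - a) / s)
            + math.exp(2 * mu * a / sigma ** 2) * norm_cdf((-mu * t - a) / s))

def constant_risk_schedule(mu, sigma, a, p, horizon=20.0):
    """Visit times t_1 < t_2 < ... at which the cumulative recurrence
    probability rises by p between consecutive visits, found by
    bisection on the monotone CDF."""
    visits, k = [], 1
    while k * p < fht_cdf(horizon, mu, sigma, a):
        lo, hi = 1e-9, horizon
        for _ in range(60):
            mid = (lo + hi) / 2
            if fht_cdf(mid, mu, sigma, a) < k * p:
                lo = mid
            else:
                hi = mid
        visits.append((lo + hi) / 2)
        k += 1
    return visits

# Hypothetical parameters: early visits cluster where risk accrues fastest
schedule = constant_risk_schedule(mu=1.0, sigma=1.0, a=1.0, p=0.2)
```

In threshold regression the drift (and possibly the barrier) is linked to covariates such as tumor stage, so each prognostic profile yields its own schedule.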