902 results for Survival-Curve
Abstract:
OBJECTIVES: In HIV-negative populations, light to moderate alcohol consumption is associated with lower cardiovascular morbidity and mortality than alcohol abstention. Whether the same holds true for HIV-infected individuals has not been evaluated in detail. DESIGN: Cohort study. METHODS: Adults on antiretroviral therapy in the Swiss HIV Cohort Study with follow-up after August 2005 were included. We categorized alcohol consumption as abstention, low (1-9 g/d), moderate (10-29 g/d in women and 10-39 g/d in men) and high intake. Cox proportional hazards models were used to describe the association between alcohol consumption and cardiovascular disease-free survival (combined endpoint), as well as cardiovascular disease events (CADE) and overall survival. Baseline and time-updated risk factors for CADE were included in the models. RESULTS: Among the 9,741 individuals included, there were 788 events of major CADE or death during 46,719 person-years of follow-up, corresponding to an incidence of 1.69 events/100 person-years. Follow-up time according to alcohol consumption level was 51% abstention, 20% low, 23% moderate and 6% high intake. Compared with abstention, low (hazard ratio 0.79, 95% confidence interval 0.63-0.98) and moderate alcohol intake (0.78, 0.64-0.95) were associated with a lower incidence of the combined endpoint. There was no significant association between alcohol consumption and CADE alone. CONCLUSIONS: Compared with abstention, low and moderate alcohol intake were associated with better CADE-free survival. However, this result was mainly driven by mortality, and the specific impact of drinking patterns and type of alcoholic beverage on this outcome remains to be determined.
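The analysis described above hinges on a Cox proportional hazards model with alcohol intake entered as a categorical covariate and abstention as the reference level. The sketch below shows, on simulated data, how such a model could be fit with the lifelines package in Python; the variable names, covariates and effect sizes are hypothetical and are not taken from the Swiss HIV Cohort Study.

```python
# Illustrative sketch only: Cox proportional hazards model relating
# alcohol-intake category to a combined endpoint (CADE or death).
# Data are simulated; column names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
alcohol = rng.choice(["abstention", "low", "moderate", "high"],
                     size=n, p=[0.51, 0.20, 0.23, 0.06])
age = rng.normal(45, 10, n)

# Simulated exponential event times with a lower hazard for low/moderate intake.
base_hazard = 0.04 * np.exp(0.02 * (age - 45))
multiplier = np.select(
    [alcohol == "low", alcohol == "moderate", alcohol == "high"],
    [0.8, 0.8, 1.1], default=1.0)
event_time = rng.exponential(1.0 / (base_hazard * multiplier))
censor_time = rng.uniform(1, 10, n)

df = pd.DataFrame({
    "time": np.minimum(event_time, censor_time),
    "event": (event_time <= censor_time).astype(int),
    "age": age,
})
# Dummy-code alcohol with abstention as the reference level.
dummies = pd.get_dummies(alcohol, prefix="alcohol", dtype=float)
df = pd.concat([df, dummies.drop(columns=["alcohol_abstention"])], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()  # hazard ratios vs. abstention with 95% CIs
```

In the study itself, baseline and time-updated covariates were included; a time-updated analysis would additionally require a long-format dataset and a time-varying Cox fitter, which is omitted here for brevity.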
Abstract:
Breast cancer is the most common non-skin cancer and the second leading cause of cancer-related death in women in the United States. Studies on ipsilateral breast tumor relapse (IBTR) status and disease-specific survival will help guide clinical treatment and predict patient prognosis. After breast conservation therapy, patients with breast cancer may experience breast tumor relapse. This relapse is classified into two distinct types: true local recurrence (TR) and new ipsilateral primary tumor (NP). However, the methods used to classify the relapse types are imperfect and prone to misclassification. In addition, some observed survival data (e.g., time to relapse and time from relapse to death) are strongly correlated with relapse type. The first part of this dissertation presents a Bayesian approach to (1) modeling the potentially misclassified relapse status and the correlated survival information, (2) estimating the sensitivity and specificity of the diagnostic methods, and (3) quantifying the covariate effects on event probabilities. A shared frailty was used to account for the within-subject correlation between survival times. Inference was conducted in a Bayesian framework via Markov chain Monte Carlo simulation implemented in the software WinBUGS. Simulation was used to validate the Bayesian method and assess its frequentist properties. The new model has two important innovations: (1) it utilizes the additional survival times correlated with the relapse status to improve parameter estimation, and (2) it provides tools to address the correlation between the two diagnostic methods conditional on the true relapse types. Prediction of the patients at highest risk for IBTR after local excision of ductal carcinoma in situ (DCIS) remains a clinical concern. The goals of the second part of this dissertation were to evaluate a published nomogram from Memorial Sloan-Kettering Cancer Center, to determine the risk of IBTR in patients with DCIS treated with local excision, and to determine whether there is a subset of patients at low risk of IBTR. Patients who had undergone local excision from 1990 through 2007 at MD Anderson Cancer Center with a final diagnosis of DCIS (n=794) were included in this part. Clinicopathologic factors and the performance of the Memorial Sloan-Kettering Cancer Center nomogram for prediction of IBTR were assessed for 734 patients with complete data. The nomogram's predictions of 5- and 10-year IBTR probabilities demonstrated imperfect calibration and discrimination, with an area under the receiver operating characteristic curve of 0.63 and a concordance index of 0.63. In conclusion, predictive models for IBTR in DCIS patients treated with local excision are imperfect, and our current ability to accurately predict recurrence based on clinical parameters is limited. The American Joint Committee on Cancer (AJCC) staging of breast cancer is widely used to determine prognosis, yet survival within each AJCC stage shows wide variation and remains unpredictable. For the third part of this dissertation, biologic markers were hypothesized to be responsible for some of this variation, and the addition of biologic markers to current AJCC staging was examined for possible improvement in prognostication. The initial cohort included patients treated with surgery as first intervention at MDACC from 1997 to 2006. Cox proportional hazards models were used to create prognostic scoring systems. AJCC pathologic staging parameters and biologic tumor markers were investigated to devise the scoring systems. Surveillance, Epidemiology, and End Results (SEER) data were used as the external cohort to validate the scoring systems. Binary indicators for pathologic stage (PS), estrogen receptor status (E), and tumor grade (G) were summed to create PS+EG scoring systems devised to predict 5-year patient outcomes. These scoring systems facilitated separation of the study population into more refined subgroups than the current AJCC staging system. The ability of the PS+EG score to stratify outcomes was confirmed in both internal and external validation cohorts. The current study proposes and validates a new staging system that incorporates tumor grade and ER status into current AJCC staging. We recommend that biologic markers be incorporated into revised versions of the AJCC staging system for patients receiving surgery as the first intervention. Chapter 1 focuses on developing a Bayesian method for handling misclassified relapse status and its application to breast cancer data. Chapter 2 focuses on evaluation of a breast cancer nomogram for predicting the risk of IBTR in patients with DCIS after local excision. Chapter 3 focuses on validation of a novel staging system for disease-specific survival in patients with breast cancer treated with surgery as the first intervention.
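The nomogram evaluation in the second part reports two discrimination measures: the area under the ROC curve for IBTR within 5 years and a concordance index for the time-to-event data. The following sketch shows, on simulated data, how these quantities could be computed with scikit-learn and lifelines; the predicted risks, follow-up times and censoring scheme are invented, and the crude 5-year binary outcome ignores censoring purely for illustration.

```python
# Sketch of the two discrimination measures reported above (AUC and c-index,
# both ~0.63 in the abstract). "predicted_risk" stands in for the nomogram's
# predicted probability of IBTR; all data here are simulated.
import numpy as np
from sklearn.metrics import roc_auc_score
from lifelines.utils import concordance_index

rng = np.random.default_rng(1)
n = 734                                   # size of the complete-data cohort above
predicted_risk = rng.uniform(0, 0.4, n)   # hypothetical nomogram output
time_to_ibtr = rng.exponential(20 / (0.5 + predicted_risk))
admin_censor = rng.uniform(0, 15, n)
time = np.minimum(time_to_ibtr, admin_censor)
event = (time_to_ibtr <= admin_censor).astype(int)

# AUC for the binary outcome "IBTR observed within 5 years".
ibtr_5y = ((time <= 5) & (event == 1)).astype(int)
auc = roc_auc_score(ibtr_5y, predicted_risk)

# Concordance index: higher predicted risk should pair with shorter
# event-free time, hence the negation.
cindex = concordance_index(time, -predicted_risk, event)

print(f"AUC = {auc:.2f}, c-index = {cindex:.2f}")
```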
Abstract:
Head and neck squamous cell carcinoma (HNSCC) is the sixth most common malignancy in the world, with high rates of second primary malignancy (SPM) development and moderately low survival rates. This disease has become an enormous challenge in cancer research and treatment. For HNSCC patients, a highly significant cause of post-treatment mortality and morbidity is the development of SPM. Hence, a tool for predicting the risk of developing SPM would be very helpful for patients, clinicians and policy makers in estimating the survival of patients with HNSCC. In this study, we built a prognostic model to predict the risk of developing SPM in patients with newly diagnosed HNSCC. The dataset used in this research was obtained from The University of Texas MD Anderson Cancer Center. For the first aim, we used stepwise logistic regression to identify prognostic factors for the development of SPM. Our final model contained cancer site and overall cancer stage as risk factors for SPM. The Hosmer-Lemeshow test (p-value = 0.15 > 0.05) showed that the final prognostic model fit the data well. The area under the ROC curve was 0.72, suggesting that the discrimination ability of our model was acceptable. Internal validation confirmed that the prognostic model fit well and would not over-optimistically predict the risk of SPM. The model needs external validation with a larger sample before it can be generalized to predict SPM risk for other HNSCC patients. For the second aim, we used a multistate survival analysis approach to estimate the probability of death for HNSCC patients while taking into consideration the possibility of SPM. Patients without SPM had longer survival. These findings suggest that the development of SPM could be a predictor of survival among patients with HNSCC.
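As an illustration of the first aim's workflow, the sketch below fits a logistic regression for SPM on simulated data, applies a Hosmer-Lemeshow goodness-of-fit test and computes the area under the ROC curve with statsmodels, scipy and scikit-learn. Only the final-model fit is shown (the stepwise selection step is omitted), and the predictors, coefficients and sample size are hypothetical.

```python
# Sketch: logistic regression for SPM risk, Hosmer-Lemeshow calibration test,
# and ROC AUC. Data and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 1000
site_oral = rng.integers(0, 2, n)      # hypothetical binary site indicator
late_stage = rng.integers(0, 2, n)     # hypothetical advanced-stage indicator
logit = -2.0 + 0.6 * site_oral + 0.8 * late_stage
spm = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(pd.DataFrame({"site_oral": site_oral,
                                  "late_stage": late_stage}))
model = sm.Logit(spm, X).fit(disp=0)
pred = model.predict(X)

def hosmer_lemeshow(y, p, groups=10):
    """Hosmer-Lemeshow goodness-of-fit test on quantile groups of predicted risk."""
    d = pd.DataFrame({"y": y, "p": p})
    d["grp"] = pd.qcut(d["p"], groups, labels=False, duplicates="drop")
    obs = d.groupby("grp")["y"].sum()
    exp = d.groupby("grp")["p"].sum()
    count = d.groupby("grp")["y"].count()
    stat = (((obs - exp) ** 2) / (exp * (1 - exp / count))).sum()
    g = d["grp"].nunique()
    return stat, chi2.sf(stat, g - 2)

hl_stat, hl_p = hosmer_lemeshow(spm, pred)
print(f"Hosmer-Lemeshow p = {hl_p:.2f}, AUC = {roc_auc_score(spm, pred):.2f}")
```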
Abstract:
To provide a more general method for comparing survival experience, we propose a model that independently scales both hazard and time dimensions. To test the curve shape similarity of two time-dependent hazards, h1(t) and h2(t), we apply the proposed hazard relationship, h12(Kt·t)/h1(t) = Kh, to h1. This relationship doubly scales h1 by the constant hazard and time scale factors, Kh and Kt, producing a transformed hazard, h12, with the same underlying curve shape as h1. We optimize the match of h12 to h2 by adjusting Kh and Kt. The corresponding survival relationship, S12(Kt·t) = [S1(t)]^(Kt·Kh), transforms S1 into a new curve S12 of the same underlying shape that can be matched to the original S2. We apply this model to the curves for regional and local breast cancer contained in the National Cancer Institute's End Results Registry (1950-1973). Scaling the original regional curves h1 and S1 with Kt = 1.769 and Kh = 0.263 produces transformed curves h12 and S12 that display congruence with the respective local curves, h2 and S2. This similarity of curve shapes suggests the application of the more complete curve shapes for regional disease as templates to predict the long-term survival pattern for local disease. By extension, this similarity raises the possibility of scaling early data for clinical trial curves according to templates of registry or previous trial curves, projecting long-term outcomes and reducing costs. The proposed model includes as special cases the widely used proportional hazards (Kt = 1) and accelerated life (Kt·Kh = 1) models.
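A minimal numerical sketch of this double-scaling relationship is given below, using a synthetic survival curve in place of the registry data. It builds a "local" curve by applying S12(Kt·t) = [S1(t)]^(Kt·Kh) with known factors and then recovers Kt and Kh by least squares with scipy; the curve shapes, time grid and optimizer settings are assumptions made purely for illustration.

```python
# Sketch of the doubly scaled survival relationship S12(Kt*t) = S1(t)**(Kt*Kh),
# equivalently S12(u) = S1(u/Kt)**(Kt*Kh). The "regional" curve S1 is synthetic;
# the "local" curve S2 is built from it with known scale factors so that the
# optimizer can be checked against the truth.
import numpy as np
from scipy.interpolate import interp1d
from scipy.optimize import minimize

t = np.linspace(0.0, 25.0, 250)                      # years
S1 = 0.25 + 0.75 * np.exp(-(t / 4.0) ** 1.3)         # synthetic "regional" curve
S1_interp = interp1d(t, S1, bounds_error=False, fill_value=(1.0, S1[-1]))

def transformed(u, Kt, Kh):
    """Apply the double scaling: S12(u) = S1(u/Kt) ** (Kt*Kh)."""
    return S1_interp(u / Kt) ** (Kt * Kh)

# Build a synthetic "local" curve with known factors, then recover them.
true_Kt, true_Kh = 1.769, 0.263                      # values quoted in the abstract
S2 = transformed(t, true_Kt, true_Kh)

def loss(params):
    Kt, Kh = params
    return np.sum((transformed(t, Kt, Kh) - S2) ** 2)

fit = minimize(loss, x0=[1.0, 1.0], bounds=[(0.1, 10.0), (0.01, 10.0)])
print("fitted Kt, Kh:", fit.x)   # should be close to (1.769, 0.263)
# Special cases: Kt = 1 gives proportional hazards; Kt*Kh = 1 gives accelerated life.
```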
Abstract:
Reflecting the natural biology of mass-spawning fish, aquaculture production of fish larvae is often hampered by high and unpredictable mortality rates. The present study aimed to enhance larval performance and immunity via the oral administration of an immunomodulator, beta-glucan (MacroGard®), in turbot (Scophthalmus maximus). Rotifers (Brachionus plicatilis) were incubated with or without yeast beta-1,3/1,6-glucan in the form of MacroGard® at a concentration of 0.5 g/L. Rotifers were fed to first-feeding turbot larvae once a day. From 13 dph onwards, all tanks were additionally fed untreated Artemia sp. nauplii (1 nauplius/mL). Daily mortality was monitored, and larvae were sampled at 11 and 24 dph for expression of 30 genes, trypsin activity and size measurements. Along with the feeding of beta-glucan, daily mortality was significantly reduced by ca. 15% and an alteration of the larval microbiota was observed. At 11 dph, gene expression of trypsin and chymotrypsin was elevated in the MacroGard®-fed fish, which resulted in heightened tryptic enzyme activity. No effect on genes encoding antioxidative proteins was observed, whilst the immune response was clearly modulated by beta-glucan. At 11 dph, complement component c3 was elevated, whilst cytokines, antimicrobial peptides, toll-like receptor 3 and heat shock protein 70 were not affected. At the later time point (24 dph), an anti-inflammatory effect in the form of a down-regulation of hsp70, tnf-alpha and il-1beta was observed. We conclude that the administration of beta-glucan induced an immunomodulatory response and could be used as an effective measure to increase survival in the rearing of turbot.
Abstract:
Survival of vegetation on soil-capped mining wastes is often impaired during dry seasons due to the limited amount of water stored in the shallow soil capping. Growth and survival of Rhodes grass (Chloris gayana) during soil drying on various layered capping sequences, constructed of combinations of topsoil, subsoil, seawater-neutralised residue sand and low-grade bauxite, were determined in a glasshouse. The aim was to describe the survival of Rhodes grass in terms of plant and soil water relationships. The soil water characteristic curve and soil texture analysis were good predictors of plant survival. The combination of a soil with high water-holding capacity and low soil water diffusivity (e.g. subsoil with high clay content) with a soil having high water-holding capacity and high diffusivity (e.g. residue sand) gave the best survival during dry-down (up to 88 days without water), whereas topsoil and low-grade bauxite were unsuitable (plants died within 18-39 days). Clayey soil improved plant survival by triggering a water stress response during peak evaporative water demand once the residue sand dried down and its diffusivity fell below a critical range. Thus, for revegetation in seasonally dry climates, soil capping should combine one soil with low diffusivity and one or more soils with high total water-holding capacity and high diffusivity.
Abstract:
The growth of Pseudomonas aeruginosa 6750 as a biofilm was investigated using a novel system based on that of Gilbert et al. (1989). The aim was to test the effect of controlled growth of the organism on antibiotic susceptibility and to examine the survival of the organism as a biofilm. During the investigations it became clear that, because of the increasing growth of P. aeruginosa and production of exopolysaccharide, a growth-rate-controlled monolayer could not be achieved, and so the method was not used further. The data, however, showed that there was an increase in the smooth colony type of the organism during growth. Investigations were then focused on the survival of P. aeruginosa in batch and chemostat studies. Survival, or percentage culturability, as measured by the ratio of colony count to total count, was found to decrease both in extended batch culture and for chemostat cells with decreasing growth rate. Extended batch culture, however, did not exhibit further increases in resistance to ciprofloxacin and polymyxin B. Survival was also measured using other parameters, namely the direct viable count, vital staining, the effect of temperature downshift and measurement of lag. In batch culture, the most notable change was a decrease in cell size along the growth curve. This was accompanied by an increase in cellular protein content. Protein per volume, calculated from these data, showed a marked increase in batch culture that was not demonstrated for chemostat cells with decreasing growth rate. Outer membrane protein profiles were obtained for batch and chemostat cells. An LPS profile of batch culture cells was also obtained. In general, there was little difference in the outer membrane protein profiles of cells from early and late stationary phases. The LPS profile showed an apparent increase in the B-band region of the LPS in the older stationary-phase cultures.
Abstract:
Amphibians have been declining worldwide, and our comprehension of the threats they face could be improved by using mark-recapture models to estimate vital rates of natural populations. Recently, the consequences of marking amphibians have been under discussion, and the effects of toe clipping on survival are debated, although it is still the most common technique for individually identifying amphibians. The passive integrated transponder (PIT tag) is an alternative technique, but comparisons among marking techniques in free-ranging populations are still lacking. We compared these two marking techniques using mark-recapture models to estimate apparent survival and recapture probability in a neotropical population of the blacksmith tree frog, Hypsiboas faber. We tested the effects of marking technique and number of toe pads removed while controlling for sex. Survival was similar among groups, although it decreased slightly from individuals with one toe pad removed, to individuals with two and three toe pads removed, and finally to PIT-tagged individuals. No sex differences were detected. Recapture probability increased slightly with the number of toe pads removed and was lowest for PIT-tagged individuals. Sex was an important predictor of recapture probability, with males being nearly five times more likely to be recaptured. Potential negative effects of both techniques may include reduced locomotion and high stress levels. We recommend the use of covariates in models to better understand the effects of marking techniques on frogs. The effect of the technique should be accounted for when interpreting results, because most techniques may reduce survival. Based on our results, but also on the logistical and cost issues associated with PIT tagging, we suggest the use of toe clipping with anurans like the blacksmith tree frog.
Abstract:
We assessed associations between steroid receptors (estrogen receptor-alpha, estrogen receptor-beta, androgen receptor and progesterone receptor), HER2 status, triple-negative epithelial ovarian cancer (ERα-/PR-/HER2-; TNEOC) status and survival in women with epithelial ovarian cancer. The study included 152 women with primary epithelial ovarian cancer. Steroid receptor and HER2 status was determined by immunohistochemistry. Disease-free and overall survival were calculated and compared with steroid receptor and HER2 status, as well as clinicopathological features, using the Cox proportional hazards model. The mean follow-up period was 43.6 months (interquartile range = 41.4 months). Forty-four percent of patients had serous tumors, followed by mucinous (23%), endometrioid (9%), mixed (9%), undifferentiated (8.5%) and clear cell tumors (5.3%). ER-alpha staining was associated with grade II-III tumors. Progesterone receptor staining was positively associated with a body mass index ≥ 25. Androgen receptor positivity was higher in serous tumors. In stand-alone analysis of receptor contribution to survival, estrogen-alpha positivity was associated with greater disease-free survival. However, there was no significant association between steroid receptor expression, HER2 status or TNEOC status and overall survival. Although estrogen receptor-alpha, androgen receptor, progesterone receptor and HER2 status were associated with key clinical features of the women and pathological characteristics of the tumors, these associations did not translate into differences in survival. Interestingly, women with TNEOC seem to fare much the same as their counterparts with non-TNEOC.
Abstract:
Galectin-3 (gal-3) is a β-galactoside-binding protein involved in many aspects of tumor biology, e.g. angiogenesis, cell growth and motility, and resistance to cell death. Evidence has shown its upregulation upon hypoxia, a common feature in solid tumors such as glioblastoma multiforme (GBM). This tumor presents a unique feature described as pseudopalisading cells, which accumulate large amounts of gal-3. Tumor cells far from hypoxic/nutrient-deprived areas express little, if any, gal-3. Here, we have shown that the hybrid glioma cell line NG97ht recapitulates GBM growth, forming gal-3-positive pseudopalisades even when cells are grafted subcutaneously in nude mice. In vitro experiments were performed exposing these cells to conditions mimicking tumor areas subject to oxygen and nutrient deprivation. The results indicated that gal-3 transcription under hypoxic conditions requires previous protein synthesis and is triggered in a HIF-1α- and NF-κB-dependent manner. In addition, a significant proportion of cells died only when exposed simultaneously to hypoxia and nutrient deprivation, and showed ROS induction. Inhibition of gal-3 expression using siRNA led to protein knockdown followed by a 1.7-2.2-fold increase in cell death. Similar results were also found in a human GBM cell line, T98G. In vivo, U87MG gal-3 knockdown cells inoculated subcutaneously in nude mice demonstrated decreased tumor growth and increased time to tumor engraftment. These results indicate that gal-3 protected cells from death under hypoxia and nutrient deprivation in vitro and that gal-3 is a key factor in tumor growth and engraftment in hypoxic and nutrient-deprived microenvironments. Overexpression of gal-3 is thus part of an adaptive program leading to tumor cell survival under these stressful conditions.
Abstract:
Health economic evaluations require estimates of expected survival from patients receiving different interventions, often over a lifetime. However, data on the patients of interest are typically only available for a much shorter follow-up time, from randomised trials or cohorts. Previous work showed how to use general population mortality to improve extrapolations of the short-term data, assuming a constant additive or multiplicative effect on the hazards for all-cause mortality for study patients relative to the general population. A more plausible assumption may be a constant effect on the hazard for the specific cause of death targeted by the treatments. To address this problem, we use independent parametric survival models for cause-specific mortality among the general population. Because causes of death are unobserved for the patients of interest, a polyhazard model is used to express their all-cause mortality as a sum of latent cause-specific hazards. Assuming proportional cause-specific hazards between the general and study populations then allows us to extrapolate mortality of the patients of interest to the long term. A Bayesian framework is used to jointly model all sources of data. By simulation, we show that ignoring cause-specific hazards leads to biased estimates of mean survival when the proportion of deaths due to the cause of interest changes through time. The methods are applied to an evaluation of implantable cardioverter defibrillators for the prevention of sudden cardiac death among patients with cardiac arrhythmia. After accounting for cause-specific mortality, substantial differences are seen in estimates of life years gained from implantable cardioverter defibrillators.
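The key idea here, expressing all-cause survival through latent cause-specific hazards and applying the treatment effect only to the targeted cause, can be illustrated numerically. The sketch below compares restricted mean survival under a cause-specific versus an all-cause proportional effect; the Weibull cause-specific hazards, the hazard ratio of 0.6 and the 40-year horizon are invented for illustration and do not come from the evaluation described above.

```python
# Sketch: all-cause survival as exp(-(H_cardiac + H_other)), with an assumed
# hazard ratio applied either to the targeted cause-specific hazard
# (the polyhazard-style assumption) or, naively, to all-cause mortality.
# All parameter values are hypothetical.
import numpy as np
from scipy.integrate import trapezoid

t = np.linspace(0.0, 40.0, 4001)                 # years

def weibull_cum_hazard(t, shape, scale):
    return (t / scale) ** shape

H_cardiac = weibull_cum_hazard(t, 1.1, 18.0)     # targeted cause (e.g. sudden cardiac death)
H_other = weibull_cum_hazard(t, 1.4, 25.0)       # all other causes
hr = 0.6                                         # assumed treatment hazard ratio

S_untreated = np.exp(-(H_cardiac + H_other))
S_cause_specific = np.exp(-(hr * H_cardiac + H_other))   # effect on targeted cause only
S_all_cause = np.exp(-hr * (H_cardiac + H_other))        # same HR applied to all causes

rmst = lambda S: trapezoid(S, t)                 # restricted mean survival time
print(f"life years gained, cause-specific effect: {rmst(S_cause_specific) - rmst(S_untreated):.2f}")
print(f"life years gained, all-cause effect:      {rmst(S_all_cause) - rmst(S_untreated):.2f}")
```

The gap between the two printed values illustrates why, as the abstract reports, ignoring cause-specific hazards can bias mean-survival estimates when the share of deaths due to the targeted cause changes over time.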
Abstract:
BACKGROUND: The model for end-stage liver disease (MELD) was developed to predict short-term mortality in patients with cirrhosis. There are few reports studying the correlation between MELD and long-term posttransplantation survival. AIM: To assess the value of pretransplant MELD in the prediction of posttransplant survival. METHODS: Adult patients (age > 18 years) who underwent liver transplantation were examined in a retrospective longitudinal cohort drawn from a prospective database. We excluded acute liver failure, retransplantation and reduced or split livers. The liver donors were evaluated according to age, sex, weight, creatinine, bilirubin, sodium, aspartate aminotransferase, personal antecedents, cause of brain death, steatosis, expanded criteria donor number and index donor risk (IDR). The recipients' data were sex, age, weight, chronic hepatic disease, Child-Turcotte-Pugh points, pretransplant and initial MELD score, pretransplant creatinine clearance, sodium, cold and warm ischemia times, hospital length of stay, blood requirements, and alanine aminotransferase (ALT > 1,000 IU/L = liver dysfunction). The Kaplan-Meier method with the log-rank test was used for the univariable analyses of posttransplant patient survival. For the multivariable analyses, Cox proportional hazards regression with a stepwise procedure was used, with sodium and MELD as stratifying variables. ROC curves were used to define the area under the curve for MELD and Child-Turcotte-Pugh. RESULTS: A total of 232 patients with 10 years of follow-up were available. The MELD cutoff was 20 and the Child-Turcotte-Pugh cutoff was 11.5. For MELD score > 20, the risk factors for death were red cell requirements, liver dysfunction and donor sodium. For the patients with hyponatremia, the risk factors were negative delta-MELD score, red cell requirements, liver dysfunction and donor sodium. The univariable regression analyses identified the following risk factors for death: MELD score > 25, blood requirements, pretransplant recipient creatinine clearance and donor age > 50 years. After stepwise analysis, only red cell requirement remained predictive. Patients with a MELD score < 25 had 68.86%, 50.44% and 41.50% 1-, 5- and 10-year survival, whereas those with a score > 25 had 39.13%, 29.81% and 22.36%, respectively. Patients without hyponatremia had 65.16%, 50.28% and 41.98%, and those with hyponatremia 44.44%, 34.28% and 28.57%, respectively. Patients with an IDR > 1.7 showed 53.7%, 27.71% and 13.85%, and those with an IDR < 1.7 showed 63.62%, 51.4% and 44.08%, respectively. Donor age > 50 years showed 38.4%, 26.21% and 13.1%, and donor age < 50 years showed 65.58%, 26.21% and 13.1%. Association with delta-MELD score did not show any significant difference. Expanded criteria donors were associated with primary non-function and severe liver dysfunction. Predictive factors for death were blood requirements, hyponatremia, liver dysfunction and donor sodium. CONCLUSION: MELD over 25, recipient hyponatremia, blood requirements and donor sodium were associated with poor survival.
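The univariable step described above, Kaplan-Meier curves compared with a log-rank test across MELD groups, can be sketched as follows with lifelines. The data are simulated, the variable names are hypothetical, and the grouping cutoff of 25 is simply the one used in the survival comparison above; nothing here reproduces the study's actual numbers.

```python
# Sketch: Kaplan-Meier estimates and a log-rank test for recipients grouped
# by pretransplant MELD score. All data are simulated.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
n = 232                                     # cohort size reported above
meld = rng.integers(8, 40, n)
high = meld > 25
# Simulated survival times (years), worse on average for the high-MELD group.
time = rng.exponential(np.where(high, 4.0, 8.0))
event = (rng.uniform(size=n) < 0.7).astype(int)   # roughly 30% censored

km_low, km_high = KaplanMeierFitter(), KaplanMeierFitter()
km_low.fit(time[~high], event[~high], label="MELD <= 25")
km_high.fit(time[high], event[high], label="MELD > 25")
print(km_low.survival_function_at_times([1, 5, 10]).values)    # 1-, 5-, 10-year survival
print(km_high.survival_function_at_times([1, 5, 10]).values)

result = logrank_test(time[~high], time[high],
                      event_observed_A=event[~high],
                      event_observed_B=event[high])
print(f"log-rank p-value: {result.p_value:.3f}")
```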
Abstract:
OBJECTIVE: This study aimed to assess the survival and quality-of-life evolution of patients subjected to surgical excision of oral and oropharyngeal squamous cell carcinoma. MATERIAL AND METHODS: Forty-seven patients treated at a Brazilian healthcare unit specialized in head and neck surgery between 2006 and 2007 were enrolled in the study. Data gathering comprised reviewing hospital files and applying the University of Washington Quality of Life (UW-QOL) questionnaire before and 1 year after surgery. The comparative analysis used Poisson regression to assess factors associated with survival and a paired t-test to compare preoperative and 1-year postoperative QOL ratings. RESULTS: One year after surgery, 7 patients could not be located (cohort dropout), 15 had died and 25 completed the UW-QOL again. The risk of death was associated with regional metastasis prior to surgery (relative risk = 2.18; 95% confidence interval = 1.09-5.17) and tumor size T3 or T4 (RR = 2.30; 95% CI = 1.05-5.04). Survivors presented significantly (p < 0.05) poorer overall and domain-specific ratings of quality of life. Chewing presented the largest reduction, from 74.0 before surgery to 34.0 one year later. Anxiety was the only domain whose average rating increased (from 36.0 to 70.7). CONCLUSIONS: The prospective assessment of survival and quality of life may help to anticipate interventions aimed at reducing the incidence of functional limitations in patients with oral and oropharyngeal cancer.
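For illustration, the sketch below mirrors the two comparative analyses named above on simulated data: a Poisson regression for relative risks of death, here fit with robust (sandwich) standard errors as one common way to estimate RRs for a binary outcome (the abstract does not specify the variance estimator), and a paired t-test on pre- versus postoperative UW-QOL ratings. The variable names, effect sizes and simulated values are hypothetical.

```python
# Sketch: Poisson regression with robust standard errors for relative risks,
# and a paired t-test on quality-of-life ratings. All data are simulated.
import numpy as np
import statsmodels.api as sm
from scipy.stats import ttest_rel

rng = np.random.default_rng(4)
n = 40
regional_mets = rng.integers(0, 2, n)          # hypothetical binary predictors
large_tumor = rng.integers(0, 2, n)            # T3/T4 indicator
p_death = 0.15 * np.exp(0.7 * regional_mets + 0.8 * large_tumor)
death = rng.binomial(1, np.clip(p_death, 0, 1))

X = sm.add_constant(np.column_stack([regional_mets, large_tumor]))
poisson = sm.GLM(death, X, family=sm.families.Poisson()).fit(cov_type="HC0")
print("relative risks:", np.exp(poisson.params[1:]))   # exp(coef) ~ RR

# Paired t-test on UW-QOL ratings before and 1 year after surgery.
qol_pre = rng.normal(74, 15, 25)
qol_post = qol_pre - rng.normal(30, 10, 25)
t_stat, p_value = ttest_rel(qol_pre, qol_post)
print(f"paired t-test p-value: {p_value:.3f}")
```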
Abstract:
In order to study the effect of pH on defaunation in the rumen, four rumen-fistulated steers were fed a basal roughage diet for a 4-week adaptation period, followed by 17 weeks of feeding with three diets and two feeding levels of a high-concentrate diet. The rumen fluid outflow rate was evaluated at both ration levels. The rumen protozoa population was monitored weekly, and when animals became defaunated, protozoa were reinoculated using rumen contents from one of the faunated steers. Every two weeks throughout the experimental period, rumen pH was measured in all animals at 0, 4, 8 and 12 h after feeding. An individual animal influence on the establishment and maintenance of the rumen ciliate protozoa population was observed. At all sampling times, mean rumen pH values were higher in faunated steers than in defaunated ones. No differences were observed in rumen fluid outflow rates between the two ration levels. Extended periods of low rumen pH are probably more detrimental to the survival of ciliate protozoa in the rumen than other factors.
Abstract:
Tolerance to the combined effects of temperature and salinity was investigated in the interstitial isopod Coxicerberus ramosae (Albuquerque, 1978), a species of the intertidal zone of sandy beaches in Rio de Janeiro, Brazil. The animals were collected at Praia Vermelha Beach. The experiments lasted 24 h, and nine salinities and seven temperatures were used, for a total of 63 combinations. Thirty animals were tested in each combination. The species showed high survival in most of the combinations. A temperature of 35 °C was lethal, and at 5 °C the animals tolerated only a narrow range of salinities. The statistical analyses showed that the effects of temperature and salinity on survival were significant, confirming the euryhalinity and eurythermy of this species.