936 results for PROPORTIONAL HAZARD AND ACCELERATED FAILURE MODELS
Abstract:
This dissertation studies newly founded U.S. firms' survival using three different releases of the Kauffman Firm Survey. I study firms' survival from a different perspective in each chapter.

The first essay studies firms' survival through an analysis of their initial state at startup and their current state as they gain maturity. The probability of survival is determined using three probit models, with both firm-specific variables and an industry scale variable to control for the environment of operation. The firm-specific variables include size, experience, and leverage as a debt-to-value ratio. The results indicate that size and relevant experience are both positive predictors for the initial and current states. Debt appears to be a predictor of exit unless it is justified by the acquisition of assets. As suggested previously in the literature, entering a smaller-scale industry is a positive predictor of survival from birth. Finally, a smaller-scale industry diminishes the negative effects of debt.

The second essay makes use of a hazard model to confirm that new service-providing (SP) firms are more likely to survive than new product providers (PPs). I investigate possible explanations for the higher survival rate of SPs using a Cox proportional hazard model. I examine six hypotheses (variations in capital per worker, expenses per worker, owners' experience, industry wages, assets, and size), none of which appears to explain why SPs are more likely than PPs to survive. Two other possibilities, tax evasion and human/social relations, are discussed but could not be tested due to lack of data.

The third essay investigates women-owned firms' higher failure rates using a Cox proportional hazard model in two specifications. I make use of a never-before-used variable that proxies for owners' confidence: the owners' self-evaluated competitive advantage.

The first empirical model allows me to compare women's and men's hazard rates for each variable. In the second model I successively add the variables that could potentially explain why women have a higher failure rate. Unfortunately, I am not able to fully explain the gender effect on firms' survival. Nonetheless, the second empirical approach confirms that social and psychological differences between genders are important in explaining the higher likelihood of failure among women-owned firms.
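As a rough illustration of the kind of probit specification described above, the sketch below fits a probit of a binary survival indicator on firm size, experience, and leverage. The data and column names (survived, log_size, experience, debt_to_value) are simulated stand-ins, not the Kauffman Firm Survey variables.

```python
# Probit sketch of firm survival on size, experience and leverage.
# Simulated data; variable names are illustrative, not KFS fields.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
firms = pd.DataFrame({
    "log_size":      rng.normal(2.0, 0.8, n),
    "experience":    rng.poisson(6, n),
    "debt_to_value": rng.uniform(0, 1, n),
})
# Illustrative data-generating process: larger, more experienced,
# less indebted firms are more likely to survive.
latent = (-1.0 + 0.8 * firms["log_size"] + 0.1 * firms["experience"]
          - 1.5 * firms["debt_to_value"])
firms["survived"] = (latent + rng.normal(0, 1, n) > 0).astype(int)

X = sm.add_constant(firms[["log_size", "experience", "debt_to_value"]])
probit = sm.Probit(firms["survived"], X).fit(disp=0)
print(probit.summary())
print(probit.get_margeff().summary())  # average marginal effects
```

Average marginal effects are printed because raw probit coefficients are not directly interpretable as changes in survival probability.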
Abstract:
Despite research showing the benefits of glycemic control, it remains suboptimal among adults with diabetes in the United States. Possible reasons include unaddressed risk factors as well as lack of awareness of its immediate and long-term consequences. The objectives of this study were, using cross-sectional data, to (1) ascertain the association between suboptimal (Hemoglobin A1c (HbA1c) ≥7%), borderline (HbA1c 7-8.9%), and poor (HbA1c ≥9%) glycemic control and potentially new risk factors (e.g. work characteristics), and (2) assess whether aspects of poor health and well-being such as poor health-related quality of life (HRQOL), unemployment, and missed work are associated with glycemic control; and, (3) using prospective data, to assess the relationship between mortality risk and glycemic control in US adults with type 2 diabetes. Data from the 1988-1994 and 1999-2004 National Health and Nutrition Examination Surveys were used. HbA1c values were used to create dichotomous glycemic control indicators. Binary logistic regression models were used to assess relationships between risk factors, employment status and glycemic control. Multinomial logistic regression analyses were conducted to assess relationships between glycemic control and HRQOL variables. Zero-inflated Poisson regression models were used to assess relationships between missed work days and glycemic control. Cox proportional hazard models were used to assess effects of glycemic control on mortality risk. Using Stata software, analyses were weighted to account for the complex survey design and non-response. Multivariable models adjusted for socio-demographics and body mass index, among other variables. Results revealed that being a farm worker and working over 40 hours/week were risk factors for suboptimal glycemic control. A greater number of days of poor mental health was associated with suboptimal, borderline, and poor glycemic control. A greater number of days of inactivity was associated with poor glycemic control, while a greater number of days of poor physical health was associated with borderline glycemic control. There were no statistically significant relationships between glycemic control and self-reported general health, employment, or missed work. Finally, having an HbA1c value less than 6.5% was protective against mortality. The findings suggest that work-related factors are important in a person’s ability to reach optimal diabetes management levels. Poor glycemic control appears to have significant detrimental effects on HRQOL.
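A minimal sketch of the kind of weighted Cox proportional hazards model used for the mortality objective might look as follows; the columns and data are hypothetical, and sampling weights with a robust (sandwich) variance only approximate the full complex-survey adjustment the study performed in Stata.

```python
# Weighted Cox proportional hazards sketch for mortality vs. glycemic control.
# Simulated data with hypothetical column names; sampling weights + robust
# variance only approximate a full complex-survey analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "followup_years": rng.exponential(8, n),
    "died":           rng.integers(0, 2, n),
    "hba1c_lt_6_5":   rng.integers(0, 2, n),   # 1 = HbA1c < 6.5%
    "age":            rng.uniform(20, 80, n),
    "bmi":            rng.normal(29, 5, n),
    "survey_weight":  rng.uniform(0.5, 3.0, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died",
        weights_col="survey_weight", robust=True)  # sandwich variance
cph.print_summary()  # an HR below 1 for hba1c_lt_6_5 would indicate protection
```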
Abstract:
The neoliberal period was accompanied by a momentous transformation within the US health care system. As the result of a number of political and historical dynamics, the healthcare law signed by President Barack Obama in 2010, the Affordable Care Act (ACA), drew less on universal models from abroad than it did on earlier conservative healthcare reform proposals. This was in part the result of the influence of powerful corporate healthcare interests. While the ACA expands healthcare coverage, it does so incompletely and unevenly, with persistent uninsurance and disparities in access based on insurance status. Additionally, the law accommodates an overall shift towards a consumerist model of care characterized by high cost sharing at time of use. Finally, the law encourages the further consolidation of the healthcare sector, for instance into units named “Accountable Care Organizations” that closely resemble the health maintenance organizations favored by managed care advocates. The overall effect has been to maintain a fragmented system that is neither equitable nor efficient. A single payer universal system would, in contrast, help transform healthcare into a social right.
Abstract:
BACKGROUND: Abnormalities in serum phosphorus, calcium and parathyroid hormone (PTH) have been associated with poor survival in haemodialysis patients. This COSMOS (Current management Of Secondary hyperparathyroidism: a Multicentre Observational Study) analysis assesses the association of high and low serum phosphorus, calcium and PTH with the relative risk of mortality. Furthermore, the impact of changes in these parameters on the relative risk of mortality throughout the 3-year follow-up was investigated. METHODS: COSMOS is a 3-year, multicentre, open-cohort, prospective study carried out in 6797 adult chronic haemodialysis patients randomly selected from 20 European countries. RESULTS: Using Cox proportional hazard regression models and penalized splines analysis, it was found that both high and low serum phosphorus, calcium and PTH were associated with a higher risk of mortality. The serum values associated with the minimum relative risk of mortality were 4.4 mg/dL for serum phosphorus, 8.8 mg/dL for serum calcium and 398 pg/mL for serum PTH. The lowest mortality risk ranges, anchored at those values, were 3.6-5.2 mg/dL for serum phosphorus, 7.9-9.5 mg/dL for serum calcium and 168-674 pg/mL for serum PTH. Decreases in serum phosphorus and calcium and increases in serum PTH in patients with baseline values of >5.2 mg/dL (phosphorus), >9.5 mg/dL (calcium) and <168 pg/mL (PTH), respectively, were associated with improved survival. CONCLUSIONS: COSMOS provides evidence of the association between serum phosphorus, calcium and PTH and mortality, and suggests survival benefits of controlling chronic kidney disease-mineral and bone disorder biochemical parameters in CKD5D patients.
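To illustrate how a U-shaped covariate-mortality relationship like the ones reported here can be modelled, the sketch below fits a Cox model on an unpenalized B-spline basis (via patsy) for simulated serum phosphorus values; this is only a stand-in for the penalized splines used in COSMOS, and all data are fabricated for illustration.

```python
# Cox model with a B-spline basis for serum phosphorus (stand-in for the
# penalized splines used in COSMOS). All values are simulated.
import numpy as np
import pandas as pd
from patsy import dmatrix
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 400
phosphorus = rng.uniform(2.0, 8.0, n)                 # mg/dL
# Simulated U-shaped risk with the lowest hazard near 4.4 mg/dL.
hazard = 0.05 * np.exp(0.4 * (phosphorus - 4.4) ** 2)
time = rng.exponential(1.0 / hazard)
event = (time < 3.0).astype(int)                      # 3-year follow-up
time = np.minimum(time, 3.0)

basis = dmatrix("bs(phosphorus, df=4) - 1", {"phosphorus": phosphorus},
                return_type="dataframe")
df = pd.concat([basis, pd.DataFrame({"T": time, "E": event})], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()  # predicted log-hazards across phosphorus trace the U shape
```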
Abstract:
Background: The role of temporary ovarian suppression with luteinizing hormone-releasing hormone agonists (LHRHa) in the prevention of chemotherapy-induced premature ovarian failure (POF) is still controversial. Our meta-analysis of randomized, controlled trials (RCTs) investigates whether the use of LHRHa during chemotherapy in premenopausal breast cancer patients reduces treatment-related POF rate, increases pregnancy rate, and impacts disease-free survival (DFS). Methods: A literature search using PubMed, Embase, and the Cochrane Library, and the proceedings of major conferences, was conducted up to 30 April 2015. Odds ratios (ORs) and 95% confidence intervals (CIs) for POF (i.e. POF by study definition, and POF defined as amenorrhea 1 year after chemotherapy completion) and for patients with pregnancy, as well as hazard ratios (HRs) and 95% CIs for DFS, were calculated for each trial. Pooled analysis was carried out using fixed- and random-effects models. Results: A total of 12 RCTs were eligible, including 1231 breast cancer patients. The use of LHRHa was associated with a significantly reduced risk of POF (OR 0.36, 95% CI 0.23-0.57; P < 0.001), yet with significant heterogeneity (I² = 47.1%, P for heterogeneity = 0.026). In eight studies reporting amenorrhea rates 1 year after chemotherapy completion, the addition of LHRHa reduced the risk of POF (OR 0.55, 95% CI 0.41-0.73, P < 0.001) without heterogeneity (I² = 0.0%, P for heterogeneity = 0.936). In five studies reporting pregnancies, more patients treated with LHRHa achieved pregnancy (33 versus 19 women; OR 1.83, 95% CI 1.02-3.28, P = 0.041; I² = 0.0%, P for heterogeneity = 0.629). In three studies reporting DFS, no difference was observed (HR 1.00, 95% CI 0.49-2.04, P = 0.939; I² = 68.0%, P for heterogeneity = 0.044). Conclusion: Temporary ovarian suppression with LHRHa in young breast cancer patients is associated with a reduced risk of chemotherapy-induced POF and seems to increase the pregnancy rate, without an apparent negative impact on prognosis.
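The pooled odds ratios and heterogeneity statistics quoted above come from inverse-variance meta-analysis; a worked fixed-effect version of that calculation, with made-up trial results, is sketched below.

```python
# Fixed-effect (inverse-variance) pooling of odds ratios on the log scale,
# with Cochran's Q and I^2 for heterogeneity. Trial values are made up.
import numpy as np
from scipy import stats

or_point = np.array([0.40, 0.55, 0.30])    # hypothetical per-trial ORs
ci_low   = np.array([0.20, 0.35, 0.10])
ci_high  = np.array([0.80, 0.86, 0.90])

log_or = np.log(or_point)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE from 95% CI width
w = 1.0 / se ** 2                                     # inverse-variance weights

pooled_log_or = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp(pooled_log_or + np.array([-1.96, 1.96]) * pooled_se)

q = np.sum(w * (log_or - pooled_log_or) ** 2)         # Cochran's Q
i2 = max(0.0, (q - (len(log_or) - 1)) / q) * 100
p_het = stats.chi2.sf(q, df=len(log_or) - 1)

print(f"Pooled OR {np.exp(pooled_log_or):.2f} "
      f"(95% CI {ci[0]:.2f}-{ci[1]:.2f}), I^2 = {i2:.1f}%, "
      f"P for heterogeneity = {p_het:.3f}")
```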
Abstract:
Background: Oral cancer is a significant public health problem world-wide and exerts high economic, social, psychological, and physical burdens on patients, their families, and their primary care providers. We set out to describe the changing trends in incidence and survival rates of oral cancer in Ireland between 1994 and 2009. Methods: National data on incident oral cancers [ICD-10 codes C01-C06] were obtained from the National Cancer Registry Ireland for 1994 to 2009. We estimated the annual percentage change (APC) in oral cancer incidence during 1994-2009 using joinpoint regression software (version 4.2.0.2). The lifetime risk of oral cancer to age 79 was estimated using Irish incidence and population data from 2007 to 2009. Survival was also examined using Kaplan-Meier curves and Cox proportional hazard models to explore the influence of several demographic/lifestyle covariates, with follow-up to the end of 2012. Results: Data were obtained on 2,147 incident oral cancer cases. Men accounted for two-thirds of oral cancer cases (n = 1,430). Annual rates in men decreased significantly during 1994-2001 (APC = -4.8%, 95% CI: -8.7 to -0.7) and then increased moderately (APC = 2.3%, 95% CI: -0.9 to 5.6). In contrast, annual incidence increased significantly in women throughout the study period (APC = 3.2%, 95% CI: 1.9 to 4.6). There was an elevated risk of death among oral cancer patients who were older than 60 years of age, smokers, unemployed or retired, living in the most deprived areas, or whose tumour was sited in the base of the tongue. Being married and being diagnosed in more recent years were associated with a reduced risk of death. Conclusion: Oral cancer increased significantly in both sexes between 1999 and 2009 in Ireland. Our analyses demonstrate the influence of measured factors such as smoking, time of diagnosis and age on the observed trends. Unmeasured factors such as alcohol use, HPV and dietary factors may also be contributing to increasing trends. Several of these are modifiable risk factors, which is crucial for informing public health policies, and thus more research is needed.
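A minimal sketch of the Kaplan-Meier and log-rank portion of such a survival analysis, using the lifelines library and simulated data with hypothetical column names (years, died, smoker), might look like this:

```python
# Kaplan-Meier curves by group and a log-rank test with lifelines.
# Simulated data; column names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({
    "years":  rng.exponential(4, n),
    "died":   rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})

kmf = KaplanMeierFitter()
for label, group in df.groupby("smoker"):
    kmf.fit(group["years"], event_observed=group["died"], label=f"smoker={label}")
    print(label, kmf.median_survival_time_)

a, b = df[df.smoker == 1], df[df.smoker == 0]
res = logrank_test(a["years"], b["years"],
                   event_observed_A=a["died"], event_observed_B=b["died"])
print("log-rank p-value:", res.p_value)
```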
Abstract:
BACKGROUND: The model for end-stage liver disease (MELD) was developed to predict short-term mortality in patients with cirrhosis. There are few reports studying the correlation between MELD and long-term posttransplantation survival. AIM: To assess the value of pretransplant MELD in the prediction of posttransplant survival. METHODS: Adult patients (age >18 years) who underwent liver transplantation were examined in a retrospective longitudinal cohort drawn from a prospective database. We excluded acute liver failure, retransplantation and reduced or split livers. The liver donors were evaluated according to: age, sex, weight, creatinine, bilirubin, sodium, aspartate aminotransferase, personal antecedents, cause of brain death, steatosis, expanded criteria donor number and index donor risk. The recipients' data were: sex, age, weight, chronic hepatic disease, Child-Turcotte-Pugh points, pretransplant and initial MELD score, pretransplant creatinine clearance, sodium, cold and warm ischemia times, hospital length of stay, blood requirements, and alanine aminotransferase (ALT >1,000 IU/L = liver dysfunction). The Kaplan-Meier method with the log-rank test was used for the univariable analyses of posttransplant patient survival. For the multivariable analyses, the Cox proportional hazard regression method with the stepwise procedure was used, stratifying by sodium and MELD. ROC curves were used to define the area under the curve for MELD and Child-Turcotte-Pugh. RESULTS: A total of 232 patients with 10 years of follow-up were available. The MELD cutoff was 20 and the Child-Turcotte-Pugh cutoff was 11.5. For a MELD score >20, the risk factors for death were: red cell requirements, liver dysfunction and donor sodium. For the patients with hyponatremia, the risk factors were: negative delta-MELD score, red cell requirements, liver dysfunction and donor sodium. The univariate regression analyses identified the following risk factors for death: MELD score >25, blood requirements, pretransplant recipient creatinine clearance and donor age >50. After stepwise analysis, only red cell requirement was predictive. Patients with a MELD score <25 had 68.86%, 50.44% and 41.50% survival at 1, 5 and 10 years, versus 39.13%, 29.81% and 22.36%, respectively, for a score >25. Patients without hyponatremia had 65.16%, 50.28% and 41.98%, and those with hyponatremia 44.44%, 34.28% and 28.57%, respectively. Patients with an index donor risk (IDR) >1.7 showed 53.7%, 27.71% and 13.85%, and those with an IDR <1.7 showed 63.62%, 51.4% and 44.08%, respectively. Donor age >50 years showed 38.4%, 26.21% and 13.1%, and donor age <50 years showed 65.58%, 26.21% and 13.1%. The delta-MELD score did not show any significant association. Expanded criteria donors were associated with primary non-function and severe liver dysfunction. Predictive factors for death were blood requirements, hyponatremia, liver dysfunction and donor sodium. CONCLUSION: In conclusion, a MELD score over 25, recipient hyponatremia, blood requirements and donor sodium were associated with poor survival.
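The ROC comparison of MELD and Child-Turcotte-Pugh described above can be sketched as follows; the data are simulated and the outcome is generated from MELD purely for illustration, so the printed AUCs and cutoff are not the study's values.

```python
# Comparing two severity scores as predictors of death via ROC AUC,
# and picking a cutoff with Youden's J. Simulated data only.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(4)
n = 232
meld = rng.uniform(6, 40, n)
ctp = rng.uniform(5, 15, n)
# Hypothetical outcome loosely driven by MELD, for illustration only.
p_death = 1 / (1 + np.exp(-(meld - 25) / 5))
died = (rng.uniform(0, 1, n) < p_death).astype(int)

print("MELD AUC:", roc_auc_score(died, meld))
print("CTP  AUC:", roc_auc_score(died, ctp))

fpr, tpr, thresholds = roc_curve(died, meld)
cutoff = thresholds[np.argmax(tpr - fpr)]  # maximizes sensitivity + specificity - 1
print("MELD cutoff by Youden's J:", cutoff)
```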
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Interval-censored survival data, in which the event of interest is not observed exactly but is only known to occur within some time interval, occur very frequently. In some situations, event times might be censored into different, possibly overlapping intervals of variable widths; however, in other situations, information is available for all units at the same observed visit time. In the latter cases, interval-censored data are termed grouped survival data. Here we present alternative approaches for analyzing interval-censored data. We illustrate these techniques using a survival data set involving mango tree lifetimes. This study is an example of grouped survival data.
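As a sketch of one alternative approach, the snippet below fits a parametric Weibull model to interval-censored lifetimes with lifelines; the intervals are invented, and right-censored units (trees still alive at the last visit) are represented by an infinite upper bound. This is not the authors' analysis, only an illustration under those assumptions.

```python
# Parametric Weibull fit to interval-censored lifetimes with lifelines.
# Intervals are invented; np.inf marks trees still alive at the last visit.
import numpy as np
import pandas as pd
from lifelines import WeibullFitter

trees = pd.DataFrame({
    "lower": [2, 4, 4, 6, 8, 10, 10, 12],           # last visit alive (years)
    "upper": [4, 6, 8, 8, 10, 12, np.inf, np.inf],  # first visit found dead
})

wf = WeibullFitter()
wf.fit_interval_censoring(trees["lower"], trees["upper"])
wf.print_summary()
print("estimated median lifetime:", wf.median_survival_time_)
```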
Abstract:
Objective - To assess the relationship between infrarenal aortic diameter and subsequent all-cause mortality in men aged 65 years or older. Methods and Results - Aortic diameter was measured using ultrasound in 12,203 men aged 65 to 83 years as part of a trial of screening for abdominal aortic aneurysms. A range of cardiovascular risk factors was also documented. Mortality over the next 3 to 7 years was assessed using record linkage. Initial aortic diameter was categorized into 10 intervals, and the relationship between increasing diameter and subsequent mortality was explored using Cox proportional hazard models. Median diameter increased from 21.4 mm in the youngest men to 22.1 mm in the oldest men. The cumulative all-cause mortality increased in a graded fashion with increasing aortic diameter. Using the diameter interval 19 to 22 mm as the reference, the adjusted hazard ratio for all-cause mortality increased from 1.26 (95% CI: 1.09, 1.44; P = 0.001) for aortic diameters of 23 to 26 mm to 2.38 (95% CI: 1.22, 4.61; P = 0.011) for aortic diameters of 47 to 50 mm. Analysis of causes of death indicated that cardiovascular disease was an important contributor to this increase. Conclusion - Infrarenal aortic diameter is an independent marker of subsequent all-cause mortality.
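The sketch below illustrates the general approach of categorizing a continuous measurement and estimating adjusted hazard ratios against a reference interval with a Cox model; the cut points, covariates, and data are illustrative rather than those of the study.

```python
# Adjusted hazard ratios for a categorized measurement versus a reference
# interval, using dummy coding and a Cox model. Simulated, illustrative data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "years":    rng.uniform(3, 7, n),
    "died":     rng.integers(0, 2, n),
    "diameter": rng.normal(22, 4, n),   # infrarenal aortic diameter, mm
    "age":      rng.uniform(65, 83, n),
    "smoker":   rng.integers(0, 2, n),
})

df["diam_cat"] = pd.cut(df["diameter"], bins=[0, 18, 22, 26, 100],
                        labels=["<=18", "19-22", "23-26", ">26"])
dummies = pd.get_dummies(df["diam_cat"], prefix="diam").astype(int)
dummies = dummies.drop(columns="diam_19-22")  # 19-22 mm is the reference

model_df = pd.concat([df[["years", "died", "age", "smoker"]], dummies], axis=1)
cph = CoxPHFitter()
cph.fit(model_df, duration_col="years", event_col="died")
print(cph.hazard_ratios_)  # adjusted HRs relative to the reference interval
```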
Abstract:
Study Objectives: To test the effects of exercise training on sleep and neurovascular control in patients with systolic heart failure with and without sleep disordered breathing. Design: Prospective interventional study. Setting: Cardiac rehabilitation and exercise physiology unit and sleep laboratory. Patients: Twenty-five patients with heart failure, aged 42 to 70 years, and New York Heart Association Functional Class I-III were divided into 1 of 3 groups: obstructive sleep apnea (n = 8), central sleep apnea (n = 9) and no sleep apnea (n = 7). Interventions: Four months of no training (control) followed by 4 months of an exercise training program (three 60-minute, supervised, exercise sessions per week). Measures and Results: Sleep (polysomnography), microneurography, forearm blood flow (plethysmography), peak VO(2), and quality of life were evaluated at baseline and at the end of the control and trained periods. No significant changes occurred in the control period. Exercise training reduced muscle sympathetic nerve activity (P < 0.001) and increased forearm blood flow (P < 0.01), peak VO(2) (P < 0.01), and quality of life (P < 0.01) in all groups, independent of the presence of sleep apnea. Exercise training improved the apnea-hypopnea index, minimum O(2) saturation, and amount of stage 3-4 sleep (P < 0.05) in patients with obstructive sleep apnea but had no significant effects in patients with central sleep apnea. Conclusions: The beneficial effects of exercise training on neurovascular function, functional capacity, and quality of life in patients with systolic dysfunction and heart failure occur independently of sleep disordered breathing. Exercise training lessens the severity of obstructive sleep apnea but does not affect central sleep apnea in patients with heart failure and sleep disordered breathing.
Abstract:
Background. A consistent association between paternal age and the offspring's risk of schizophrenia has been observed, with no independent association with maternal age. The relationship of paternal and maternal ages with risk of bipolar affective disorders (BPAD) in the offspring is less clear. The present study aimed at testing the hypothesis that paternal age is associated with the offspring's risk of BPAD, whereas maternal age is not. Method. This population-based cohort study was conducted with individuals born in Sweden during 1973-1980 and still resident there at age 16 years. The outcome was first hospital admission with a diagnosis of BPAD. Hazard ratios (HRs) were calculated using Cox's proportional hazard regression. Results. After adjustment for all potential confounding variables except maternal age, the HR for risk of BPAD for each 10-year increase in paternal age was 1.28 [95% confidence interval (CI) 1.11-1.48], but this fell to 1.20 (95% CI 0.97-1.48) after adjusting for maternal age. A similar result was found for maternal age and risk of BPAD [HR 1.30 (95% CI 1.08-1.56) before adjustment for paternal age, HR 1.12 (95% CI 0.86-1.45) after adjustment]. The HR associated with having either parent aged 30 years or over was 1.26 (95% CI 1.01-1.57), and it was 1.45 (95% CI 1.16-1.81) if both parents were >30 years. Conclusions. Unlike schizophrenia, the risk of BPAD seems to be associated with both paternal and maternal ages.
Abstract:
OBJECTIVES We sought to assess the prognostic value and risk classification improvement of contemporary single-photon emission computed tomography myocardial perfusion imaging (SPECT-MPI) for predicting all-cause mortality. BACKGROUND Myocardial perfusion is a strong estimator of prognosis. Evidence published to date has not established the added prognostic value of SPECT-MPI nor defined an approach to improve risk classification in women from a developing nation. METHODS A total of 2,225 women referred for SPECT-MPI were followed for a mean period of 3.7 +/- 1.4 years. SPECT-MPI results were classified as abnormal based on the presence of any perfusion defect. Abnormal scans were further classified as showing mild/moderate reversible, severe reversible, partially reversible, or fixed perfusion defects. Risk estimates for incident mortality were categorized as <1%/year, 1% to 2%/year, and >2%/year using Cox proportional hazard models. Risk-adjusted models incorporated clinical risk factors, left ventricular ejection fraction (LVEF), and perfusion variables. RESULTS All-cause death occurred in 139 patients. SPECT-MPI significantly risk-stratified the population; patients with abnormal scans had significantly higher death rates than patients with normal scans (13.1% versus 4.0%, respectively; p < 0.001). Cox analysis demonstrated that, after adjusting for clinical risk factors and LVEF, SPECT-MPI improved model discrimination (integrated discrimination index = 0.009; p = 0.02), added significant incremental prognostic information (global chi-square increased from 87.7 to 127.1; p < 0.0001), and improved risk prediction (net reclassification improvement = 0.12; p = 0.005). CONCLUSIONS SPECT-MPI added significant incremental prognostic information to clinical and left ventricular functional variables while enhancing the ability to classify this Brazilian female population into low- and high-risk categories of all-cause mortality. (J Am Coll Cardiol Img 2011;4:880-8)
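One common way to quantify the "added prognostic information" reported above is a likelihood ratio test between nested Cox models with and without the new predictor; a sketch on simulated data with hypothetical columns follows.

```python
# Likelihood ratio test for the incremental prognostic value of an added
# predictor, comparing nested Cox models. Simulated data, hypothetical columns.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from scipy import stats

rng = np.random.default_rng(6)
n = 800
df = pd.DataFrame({
    "years":        rng.exponential(4, n),
    "died":         rng.integers(0, 2, n),
    "age":          rng.uniform(40, 85, n),
    "lvef":         rng.normal(60, 10, n),
    "abnormal_mpi": rng.integers(0, 2, n),
})

base = CoxPHFitter().fit(df[["years", "died", "age", "lvef"]],
                         duration_col="years", event_col="died")
full = CoxPHFitter().fit(df, duration_col="years", event_col="died")

lr_chi2 = 2 * (full.log_likelihood_ - base.log_likelihood_)
p = stats.chi2.sf(lr_chi2, df=1)  # one added parameter (abnormal_mpi)
print(f"LR chi-square = {lr_chi2:.2f}, p = {p:.3f}")
```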
Abstract:
To study and characterize the in vivo effect of the lectin from Luetzelburgia auriculata seeds in acute inflammation models. The lectin was purified from the crude saline extract by affinity chromatography on a guar-gum matrix. Native, heat-treated, and digested lectin was evaluated for anti-inflammatory activity using peritonitis and paw edema models. The anti-inflammatory activity was characterized by intravital microscopy, nitric oxide production, and myeloperoxidase activity. The lectin exhibited anti-inflammatory activity (2 mg/kg) in both models, reducing local myeloperoxidase activity. Galactose or heat treatment (100 °C, 10 min) reduced the anti-inflammatory action. The anti-inflammatory effect involves inhibition of leukocyte adhesion and rolling, along with augmentation of nitric oxide in serum. The lectin inhibited the edematogenic effect of histamine and prostaglandins (PGE2) but did not alter the chemoattractant effect of IL-8. The results indicate that this lectin is a potent anti-inflammatory molecule whose effects engage diverse modulatory events.