962 results for competing risks model


Relevance: 80.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 80.00%

Abstract:

Peritoneal dialysis (PD) should be considered a suitable method of renal replacement therapy for acute kidney injury (AKI) patients. This study is the largest cohort describing patient characteristics, clinical practice patterns, and their relationship to outcomes in a developing country. Its objective was to describe the main determinants of patient and technique survival, including trends over time, in AKI patients treated with PD. This was a Brazilian prospective cohort study in which all adult AKI patients on PD were studied from January 2004 to January 2014. For comparison purposes, patients were divided into two groups according to the year of treatment: 2004-2008 and 2009-2014. Patient survival and technique failure (TF) were analyzed using the competing risks model of Fine and Gray. A total of 301 patients were included, of whom 51 (16.9%) were transferred to hemodialysis during the study period. The main cause of TF was mechanical complication (47%), followed by peritonitis (41.2%). TF changed over the study period: compared with 2004-2008, patients treated in 2009-2014 had a relative risk (RR) of 0.86 (95% CI 0.77-0.96), and three independent risk factors for TF were identified: treatment period (2009-2014), sepsis, and age > 65 years. There were 180 deaths (59.8%) during the study. Death was the leading cause of dropout (77.9% of all cases), mainly from sepsis (58.3%), followed by cardiovascular disease (36.1%). Overall patient survival was 41% at 30 days. Patient survival improved across the study periods: compared with 2004-2008, patients treated in 2009-2014 had an RR of 0.87 (95% CI 0.79-0.98). The independent risk factors for mortality were sepsis, age > 70 years, ATN-ISS > 0.65, and positive fluid balance. In conclusion, we observed improvements in patient survival and TF over the years, even after adjustment for several confounders and with a competing risks approach.
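
Because several of the studies in this listing analyse technique failure with death as a competing event, a brief illustration of the underlying quantity may help. The sketch below is a minimal, self-contained nonparametric cumulative incidence estimate (Aalen-Johansen form) written in Python and run on synthetic data; the event coding, parameter values, and simulated lifetimes are assumptions made for illustration, not data from the study above.

import numpy as np

def cumulative_incidence(time, event, cause):
    """Nonparametric cumulative incidence of `cause` in the presence of competing events.

    time  : follow-up time for each subject
    event : 0 = censored, 1 = technique failure, 2 = death (competing event)
    cause : event code whose cumulative incidence is wanted
    """
    order = np.argsort(time)
    time, event = time[order], event[order]
    at_risk = len(time)
    surv = 1.0                      # overall event-free survival just before t
    cif = 0.0
    times, curve = [], []
    for t in np.unique(time):
        mask = time == t
        d_cause = np.sum(mask & (event == cause))    # events of interest at t
        d_any = np.sum(mask & (event != 0))          # events of any type at t
        if at_risk > 0:
            cif += surv * d_cause / at_risk          # hazard of the cause, weighted by prior survival
            surv *= 1.0 - d_any / at_risk            # update overall event-free survival
        at_risk -= np.sum(mask)                      # remove events and censorings from the risk set
        times.append(t)
        curve.append(cif)
    return np.array(times), np.array(curve)

# Synthetic cohort: exponential latent times to technique failure, death and censoring.
rng = np.random.default_rng(0)
n = 301
t_tf, t_death, t_cens = (rng.exponential(s, n) for s in (60.0, 30.0, 90.0))
time = np.minimum.reduce([t_tf, t_death, t_cens])
event = np.select([time == t_cens, time == t_tf], [0, 1], default=2)

times, cif_tf = cumulative_incidence(time, event, cause=1)
print(f"Estimated cumulative incidence of technique failure by day 30: {np.interp(30, times, cif_tf):.2f}")

Unlike one minus the Kaplan-Meier estimate, this quantity never overstates the probability of technique failure when deaths remove patients from risk, which is exactly why the Fine and Gray approach is used in the study above.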

Relevance: 80.00%

Abstract:

Background and Objectives: Hypokalemia has been consistently associated with a high mortality rate in peritoneal dialysis. However, whether hypokalemia merely acts as a surrogate marker of comorbidities or has a direct effect on the risk of mortality has not been studied. Thus, the aim of this study was to analyze the effect of hypokalemia on overall and cause-specific mortality. Design, Setting, Participants and Measurements: This is an analysis of BRAZPD II, a nationwide prospective cohort study. All patients on PD for longer than 90 days with measured serum potassium levels were included to assess the association of hypokalemia with overall and cause-specific mortality, using propensity score matching to reduce selection bias. In addition, competing risks were taken into account in the analysis of cause-specific mortality. Results: There was a U-shaped relationship between time-averaged serum potassium and all-cause mortality in PD patients. Cardiovascular disease was the main cause of death in the normokalemic group, with 133 events (41.8%), followed by non-PD-related infections (n = 105; 33.0%). Hypokalemia was associated with a 49% increased risk of cardiovascular (CV) mortality after adjustment for covariates and the presence of competing risks (SHR 1.49; 95% CI 1.01-2.21). In contrast, in the group of patients with K < 3.5 mEq/L, non-PD-related infections were the main cause of death, with 43 events (44.3%), followed by cardiovascular disease (n = 36; 37.1%). For non-PD-related infections the SHR was 2.19 (95% CI 1.52-3.14), while for peritonitis it was 1.09 (95% CI 0.47-2.49). Conclusions: Hypokalemia had a significant impact on overall, cardiovascular, and infectious mortality even after adjustment for competing risks. The causative nature of this association suggested by our study raises the need for intervention studies of the effect of potassium supplementation on clinical outcomes in PD patients.
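
For readers unfamiliar with the subdistribution hazard ratios (SHRs) quoted throughout these abstracts, the standard Fine and Gray quantities can be written as follows; this is textbook material, not notation taken from the paper above:

F_k(t) = \Pr(T \le t,\ \text{cause} = k), \qquad
\lambda_k^{sd}(t) = -\frac{d}{dt}\,\log\bigl(1 - F_k(t)\bigr), \qquad
\lambda_k^{sd}(t \mid x) = \lambda_{k,0}^{sd}(t)\,\exp(x^\top \beta).

Because the subdistribution hazard is tied directly to the cumulative incidence F_k, an SHR of exp(beta) = 1.49 for cardiovascular death means that the cumulative incidence of cardiovascular death is shifted upward in hypokalemic patients relative to the reference group, with the competing causes of death already accounted for.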

Relevance: 80.00%

Abstract:

Background: The prevalence of systemic lupus erythematosus (SLE) patients requiring renal replacement therapy (RRT) is increasing, but data on clinical outcomes are scarce. Interestingly, data on technique failure and peritoneal dialysis (PD)-related infections are even scarcer, despite SLE patients being considered at high risk for infections. The aim of our study was to compare clinical outcomes of SLE and non-SLE patients on PD in a large PD cohort. Methods: We conducted a nationwide prospective observational study from the BRAZPD II cohort. For this study we identified all patients on PD for more than 90 days. Within that subset, all those with SLE as the primary renal disease were matched with PD patients without SLE for comparison of clinical outcomes, namely patient mortality, technique survival, and time to first peritonitis; all analyses took into account the presence of competing risks. Results: Out of a total of 9907 patients, we identified 102 incident SLE patients with more than 90 days on PD. After matching, the groups consisted of 92 patients with SLE and 340 matched controls. Mean age was 46.9 +/- 16.8 years, 77.3% were female, and 58.1% were Caucasian. After adjustment, the subdistribution hazard ratio for SLE was 1.06 (95% CI 0.55-2.05) for mortality, 1.01 (95% CI 0.54-1.91) for technique failure, and 1.40 (95% CI 0.92-2.11) for time to first peritonitis episode. The probability of occurrence of competing risks was similar between groups for all three outcomes. Conclusion: PD was shown to be a safe and equally successful therapy for SLE patients compared with matched non-SLE patients.

Relevance: 80.00%

Abstract:

The impact of peritoneal dialysis modality on patient survival and peritonitis rates is not fully understood, and no large-scale randomized clinical trial (RCT) is available. In the absence of an RCT, the use of an advanced matching procedure to reduce selection bias in large cohort studies may be the best approach. The aim of this study was to compare automated peritoneal dialysis (APD) and continuous ambulatory peritoneal dialysis (CAPD) with respect to peritonitis risk, technique failure, and patient survival in a large nationwide PD cohort. This prospective cohort study included all incident PD patients with at least 90 days of PD recruited in the BRAZPD study. All patients who were treated exclusively with either APD or CAPD were matched on 15 different covariates using a propensity score calculated with the nearest neighbor method. The clinical outcomes analyzed were overall mortality, technique failure, and time to first peritonitis. All analyses were also adjusted for the presence of competing risks using the Fine and Gray approach. After the matching procedure, 2,890 patients were included in the analysis (1,445 in each group). Baseline characteristics were similar for all covariates, including age, diabetes, BMI, center experience, coronary artery disease, cancer, literacy, hypertension, race, previous HD, gender, pre-dialysis care, family income, peripheral artery disease, and year of starting PD. The mortality rate was higher in CAPD patients than in APD patients (SHR 1.44, 95% CI 1.21-1.71), but no difference was observed for technique failure (SHR 0.83, 95% CI 0.69-1.02) or for time to the first peritonitis episode (SHR 0.96, 95% CI 0.93-1.11). In this first large PD cohort study with groups balanced for several covariates by propensity score matching, PD modality was not associated with differences in either time to first peritonitis or technique failure. Nevertheless, patient survival was significantly better in APD patients.
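
The matching step described above (a propensity score with nearest neighbor selection) can be sketched in a few lines. The Python code below is a generic illustration on synthetic data, not the BRAZPD analysis: the covariates, the logistic-regression propensity model, and greedy 1:1 matching without replacement are all assumptions made for the example.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic cohort: X holds baseline covariates, `treated` marks APD (1) versus CAPD (0).
n = 2000
X = rng.normal(size=(n, 5))
treated = (rng.random(n) < 1.0 / (1.0 + np.exp(-X[:, 0] + 0.5 * X[:, 1]))).astype(int)

# 1. Propensity score: estimated probability of receiving APD given the covariates.
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# 2. Greedy 1:1 nearest neighbor matching on the propensity score, without replacement.
treated_idx = np.where(treated == 1)[0]
controls = list(np.where(treated == 0)[0])
pairs = []
for i in treated_idx:
    if not controls:
        break
    j = min(controls, key=lambda c: abs(ps[c] - ps[i]))   # closest remaining control
    pairs.append((i, j))
    controls.remove(j)

pairs = np.array(pairs)
print(f"{len(pairs)} matched pairs; mean |propensity difference| = "
      f"{np.mean(np.abs(ps[pairs[:, 0]] - ps[pairs[:, 1]])):.4f}")

In practice one would then check covariate balance in the matched sample (for example with standardized mean differences) before feeding it, as in the study above, into a Fine and Gray competing-risks model.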

Relevance: 80.00%

Abstract:

A method is presented for estimating age-specific mortality based on minimal information: a model life table and an estimate of longevity. This approach uses expected patterns of mammalian survivorship to define a general model of age-specific mortality rates. One such model life table is based on data for northern fur seals (Callorhinus ursinus) using Siler's (1979) five-parameter competing-risk model. Alternative model life tables are based on historical data for human females and on a published model for Old World monkeys. Survival rates for a marine mammal species are then calculated by scaling these models by the longevity of that species. By using a realistic model (instead of assuming constant mortality), one can more easily see the real biological limits to population growth. The mortality estimation procedure is illustrated with examples for spotted dolphins (Stenella attenuata) and harbor porpoise (Phocoena phocoena).
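
Siler's competing-risk hazard is the sum of a declining juvenile component, a constant background component, and an exponentially increasing senescent component. The Python sketch below writes that hazard and the implied survivorship in closed form; the parameter values and the age-scaling convention are placeholders chosen for illustration, not the fur-seal fit used in the paper.

import numpy as np

def siler_hazard(x, a1, b1, a2, a3, b3):
    """Siler (1979) five-parameter hazard at age x: juvenile + constant + senescent risks."""
    return a1 * np.exp(-b1 * x) + a2 + a3 * np.exp(b3 * x)

def siler_survivorship(x, a1, b1, a2, a3, b3):
    """l(x) = exp(-cumulative hazard), with the integral of the Siler hazard in closed form."""
    cum_hazard = (a1 / b1) * (1.0 - np.exp(-b1 * x)) + a2 * x + (a3 / b3) * (np.exp(b3 * x) - 1.0)
    return np.exp(-cum_hazard)

# Placeholder parameters for a hypothetical model life table (not estimates from the paper).
params = dict(a1=0.30, b1=1.00, a2=0.02, a3=0.002, b3=0.25)

# One way to scale the model by a species' longevity is to work in relative age
# (age divided by assumed maximum age), so the same shape serves species with different life spans.
max_age = 20.0                                   # assumed longevity of the target species, in years
ages = np.linspace(0.0, max_age, 5)
print(np.round(siler_survivorship(ages / max_age, **params), 3))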

Relevance: 80.00%

Abstract:

In this paper, we propose a new three-parameter long-term lifetime distribution induced by a latent complementary risk framework, with decreasing, increasing, and unimodal hazard functions: the long-term complementary exponential geometric distribution. The new distribution arises from latent complementary risk scenarios in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks, together with the presence of long-term survivors. The properties of the proposed distribution are discussed, including its probability density function and explicit algebraic formulas for its reliability, hazard, and quantile functions and order statistics. Parameter estimation is based on the usual maximum-likelihood approach. A simulation study assesses the performance of the estimation procedure. We compare the new distribution with its particular cases, as well as with the long-term Weibull distribution, on three real data sets, observing its potential and competitiveness in comparison with some usual long-term lifetime distributions.
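
As a hedged illustration of the latent mechanism described (the observed lifetime is the maximum of a geometric number of exponential latent lifetimes, and a fraction of the population never fails), the Python sketch below simulates that mechanism and compares it with the survival function it implies, S_pop(t) = p + (1 - p) * [1 - theta*(1 - e^{-lambda*t}) / (theta + (1 - theta)*e^{-lambda*t})]. This closed form follows from the stated construction with a Geometric(theta) count on {1, 2, ...}; it is offered only as an illustration and may differ from the authors' exact parameterization.

import numpy as np

rng = np.random.default_rng(2)
lam, theta, p = 0.5, 0.3, 0.2      # rate, geometric parameter, long-term (cured) fraction -- placeholders
n = 50_000

# Latent mechanism: a cured subject never fails; otherwise the lifetime is the maximum
# of M ~ Geometric(theta) independent exponential latent lifetimes.
cured = rng.random(n) < p
m = rng.geometric(theta, size=n)                               # number of latent complementary risks (>= 1)
t_susceptible = np.array([rng.exponential(1.0 / lam, size=k).max() for k in m])
t = np.where(cured, np.inf, t_susceptible)

def survival_closed_form(t0, lam, theta, p):
    """Population survival implied by the construction above (assumed parameterization)."""
    u = 1.0 - np.exp(-lam * t0)
    cdf_susceptible = theta * u / (theta + (1.0 - theta) * np.exp(-lam * t0))
    return p + (1.0 - p) * (1.0 - cdf_susceptible)

for t0 in (1.0, 3.0, 10.0):
    print(f"t = {t0:>4}: simulated S(t) = {np.mean(t > t0):.3f}, "
          f"closed form = {survival_closed_form(t0, lam, theta, p):.3f}")

As t grows, both curves level off near p, which is the defining feature of a long-term (cure-fraction) lifetime model.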

Relevance: 80.00%

Abstract:

Objective: In South Africa, many HIV-infected patients experience delays in accessing antiretroviral therapy (ART). We examined pretreatment mortality and access to treatment in patients waiting for ART. Design: Cohort of HIV-infected patients assessed for ART eligibility at 36 facilities participating in the Comprehensive HIV and AIDS Management (CHAM) program in the Free State Province. Methods: The proportion of patients initiating ART, pre-ART mortality, and risk factors associated with these outcomes were estimated using competing risks survival analysis. Results: Forty-four thousand, eight hundred and forty-four patients enrolled in CHAM between May 2004 and December 2007, of whom 22,083 (49.2%) were eligible for ART; pre-ART mortality was 53.2 per 100 person-years [95% confidence interval (CI) 51.8-54.7]. Median CD4 cell count at eligibility increased from 87 cells/μl in 2004 to 101 cells/μl in 2007. Two years after eligibility, an estimated 67.7% (67.1-68.4%) of patients had started ART, and 26.2% (25.6-26.9%) had died before starting ART. Among patients with CD4 cell counts below 25 cells/μl at eligibility, 48% died before ART and 51% initiated ART. Men were less likely to start treatment and more likely to die than women. Patients in rural clinics or clinics with low staffing levels had lower rates of starting treatment and higher mortality compared with patients in urban/peri-urban or better staffed clinics. Conclusions: Mortality is high in eligible patients waiting for ART in the Free State Province. The most immunocompromised patients had the lowest probability of starting ART and the highest risk of pre-ART death. Prioritization of these patients should reduce waiting times and pre-ART mortality.
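
A small arithmetic check clarifies why competing risks methods matter here: because starting ART and pre-ART death are competing events, their cumulative incidences at any time point sum to at most one. Using the estimates reported above,

F_ART(2 years) + F_death(2 years) = 0.677 + 0.262 = 0.939,

so roughly 6% of eligible patients were estimated to be alive but still not on ART two years after eligibility. Naive one-minus-Kaplan-Meier estimates computed separately for each outcome would treat the competing event as ordinary censoring, overstate both proportions, and could sum to more than one.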

Relevance: 80.00%

Abstract:

BACKGROUND: Extracapsular tumor spread (ECS) has been identified as a possible risk factor for breast cancer recurrence, but controversy exists regarding its role in decision making for regional radiotherapy. This study evaluates ECS as a predictor of local, axillary, and supraclavicular recurrence. PATIENTS AND METHODS: International Breast Cancer Study Group Trial VI accrued 1475 eligible pre- and perimenopausal women with node-positive breast cancer who were randomly assigned to receive three to nine courses of classical combination chemotherapy with cyclophosphamide, methotrexate, and fluorouracil. ECS status was determined retrospectively in 933 patients based on review of pathology reports. Cumulative incidence and hazard ratios (HRs) were estimated using methods for competing risks analysis. Adjustment factors included treatment group and baseline patient and tumor characteristics. The median follow-up was 14 years. RESULTS: In univariable analysis, ECS was significantly associated with supraclavicular recurrence (HR = 1.96; 95% confidence interval 1.23-3.13; P = 0.005). HRs for local and axillary recurrence were 1.38 (P = 0.06) and 1.81 (P = 0.11), respectively. Following adjustment for number of lymph node metastases and other baseline prognostic factors, ECS was not significantly associated with any of the three recurrence types studied. CONCLUSIONS: Our results indicate that the decision for additional regional radiotherapy should not be based solely on the presence of ECS.

Relevance: 80.00%

Abstract:

PURPOSE Somatostatin-based radiopeptide treatment is generally performed using the β-emitting radionuclides (90)Y or (177)Lu. The present study aimed at comparing benefits and harms of both therapeutic approaches. METHODS In a comparative cohort study, patients with advanced neuroendocrine tumours underwent repeated cycles of [(90)Y-DOTA]-TOC or [(177)Lu-DOTA]-TOC until progression of disease or permanent adverse events. Multivariable Cox regression and competing risks regression were employed to examine predictors of survival and adverse events for both treatment groups. RESULTS Overall, 910 patients underwent 1,804 cycles of [(90)Y-DOTA]-TOC and 141 patients underwent 259 cycles of [(177)Lu-DOTA]-TOC. The median survival after [(177)Lu-DOTA]-TOC and after [(90)Y-DOTA]-TOC was comparable (45.5 months versus 35.9 months, hazard ratio 0.91, 95% confidence interval 0.63-1.30, p = 0.49). Subgroup analyses revealed a significantly longer survival for [(177)Lu-DOTA]-TOC over [(90)Y-DOTA]-TOC in patients with low tumour uptake, solitary lesions and extra-hepatic lesions. The rate of severe transient haematotoxicities was lower after [(177)Lu-DOTA]-TOC treatment (1.4 vs 10.1%, p = 0.001), while the rate of severe permanent renal toxicities was similar in both treatment groups (9.2 vs 7.8%, p = 0.32). CONCLUSION The present results revealed no difference in median overall survival after [(177)Lu-DOTA]-TOC and [(90)Y-DOTA]-TOC. Furthermore, [(177)Lu-DOTA]-TOC was less haematotoxic than [(90)Y-DOTA]-TOC.

Relevance: 80.00%

Abstract:

BACKGROUND No data exist on the patterns of biochemical recurrence (BCR) and their effect on survival in patients with high-risk prostate cancer (PCa) treated with surgery. The aim of our investigation was to evaluate the natural history of PCa in patients treated with radical prostatectomy (RP) alone. MATERIALS AND METHODS Overall, 2,065 patients with high-risk PCa treated with RP at 7 tertiary referral centers between 1991 and 2011 were identified. First, we calculated the probability of experiencing BCR after surgery. In particular, we relied on conditional survival estimates for BCR after RP. Competing-risks regression analyses were then used to evaluate the effect of time to BCR on the risk of cancer-specific mortality (CSM). RESULTS Median follow-up was 70 months. Overall, the 5-year BCR-free survival rate was 55.2%. Given BCR-free survivorship at 1, 2, 3, 4, and 5 years, the BCR-free survival rates improved by +7.6%, +4.1%, +4.8%, +3.2%, and +3.7%, respectively. Overall, the 10-year CSM rate was 14.8%. When patients were stratified according to time to BCR, those experiencing BCR within 36 months of surgery had higher 10-year CSM rates than those experiencing late BCR (19.1% vs. 4.4%; P < 0.001). In multivariable analyses, time to BCR was an independent predictor of CSM (P < 0.001). CONCLUSIONS Increasing time from surgery is associated with a reduction in the risk of subsequent BCR. Additionally, time to BCR is a predictor of CSM in these patients. These results might help provide clinicians with better follow-up strategies and more aggressive treatment for early BCR.
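
Conditional survival, used above, is simply the survival probability recomputed among patients who have already remained event-free for some time: S(t + s | t) = S(t + s) / S(t). The short Python sketch below works this out from made-up Kaplan-Meier values (placeholders, not the study's curve) to show why the estimates improve as BCR-free time accrues.

# Conditional BCR-free survival from a Kaplan-Meier curve.
# The survival values below are illustrative placeholders, not the study's estimates.
km = {0: 1.00, 1: 0.90, 2: 0.82, 3: 0.75, 4: 0.68, 5: 0.62}    # years -> S(t)

def conditional_survival(km, t, s):
    """P(event-free at t + s | event-free at t) = S(t + s) / S(t)."""
    return km[t + s] / km[t]

# Five-year BCR-free survival conditional on already being BCR-free at 1, 2 and 3 years.
for t in (1, 2, 3):
    print(f"S(5 | {t}) = {conditional_survival(km, t, 5 - t):.3f} versus unconditional S(5) = {km[5]:.3f}")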

Relevance: 80.00%

Abstract:

OBJECTIVES We studied the influence of noninjecting and injecting drug use on mortality, dropout rate, and the course of antiretroviral therapy (ART) in the Swiss HIV Cohort Study (SHCS). METHODS Cohort participants registered prior to April 2007 who had completed at least one drug use questionnaire by May 2013 were categorized according to their self-reported drug use behaviour. The probabilities of death and dropout were analysed separately using multivariable competing risks proportional hazards regression models, with mutual correction for the other endpoint. Furthermore, we describe the influence of drug use on the course of ART. RESULTS A total of 6529 participants (including 31% women) were followed during 31,215 person-years; 5.1% of participants died and 10.5% were lost to follow-up. Among persons with homosexual or heterosexual HIV transmission, noninjecting drug use was associated with higher all-cause mortality [subhazard ratio (SHR) 1.73; 95% confidence interval (CI) 1.07-2.83] compared with no drug use. Mortality was also increased among former injecting drug users (IDUs) who reported noninjecting drug use (SHR 2.34; 95% CI 1.49-3.69). Noninjecting drug use was associated with higher dropout rates. The mean proportion of time with suppressed viral replication was 82.2% in all participants, irrespective of ART status, and 91.2% in those on ART. Drug use lowered adherence and increased the rates of ART change and ART interruption. Virological failure on ART was more frequent in participants who reported concomitant drug injection while on opiate substitution, and in current IDUs, but not among noninjecting drug users. CONCLUSIONS Noninjecting and injecting drug use are modifiable risk factors for death; they also lower retention in the cohort and complicate ART.

Relevance: 80.00%

Abstract:

BACKGROUND As access to antiretroviral therapy (ART) expands, increasing numbers of older patients will start treatment and need specialised long-term care. However, the effect of age in ART programmes in resource-constrained settings is poorly understood. The HIV epidemic is ageing rapidly and South Africa has one of the highest HIV population prevalences worldwide. We explored the effect of age on mortality of patients on ART in South Africa and whether this effect is mediated by baseline immunological status. METHODS In this retrospective cohort analysis, we studied HIV-positive patients aged 16-80 years who started ART for the first time in six large South African cohorts of the International Epidemiologic Databases to Evaluate AIDS-Southern Africa collaboration, in KwaZulu-Natal, Gauteng, and Western Cape (two primary care clinics, three hospitals, and a large rural cohort). The primary outcome was mortality. We ascertained patients' vital status through linkage to the National Population Register. We used inverse probability weighting to correct mortality for loss to follow-up. We estimated mortality using Cox proportional hazards and competing risks regression. We tested the interaction between baseline CD4 cell count and age. FINDINGS Between Jan 1, 2004, and Dec 31, 2013, 84,078 eligible adults started ART. Of these, we followed up 83,566 patients for 174,640 patient-years. 8% (1817 of 23,258) of patients aged 16-29 years died, compared with 19% (93 of 492) of patients aged 65 years or older. The age-adjusted mortality hazard ratio was 2.52 (95% CI 2.01-3.17) for people aged 65 years or older compared with those 16-29 years of age. In patients starting ART with a CD4 count of less than 50 cells per μL, the adjusted mortality hazard ratio was 2.52 (2.04-3.11) for people aged 50 years or older compared with those 16-39 years old. Mortality was highest in patients with CD4 counts of less than 50 cells per μL, and 15% (1103 of 7295) of all patients aged 50 years or older starting ART were in this group. The proportion of patients aged 50 years or older enrolling in ART increased with successive years, from 6% (290 of 4999) in 2004 to 10% (961 of 9657) in 2012-13, comprising 9% of total enrolment (7295 of 83,566). At the end of the study, 6304 (14%) of 44,909 patients still alive and in care were aged 50 years or older. INTERPRETATION Health services need reorientation towards HIV diagnosis and initiation of ART in older individuals. Policies are needed for the long-term care of older people with HIV. FUNDING National Institutes of Health (National Institute of Allergy and Infectious Diseases), US Agency for International Development, and South African Centre for Epidemiological Modelling and Analysis.
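
The loss-to-follow-up correction mentioned above (inverse probability weighting) can be sketched generically: model each patient's probability of remaining under observation, then weight the survival model by the inverse of that probability. The Python snippet below is an illustrative sketch on synthetic data, assuming lifelines' CoxPHFitter (whose fit method accepts weights_col and robust arguments); the column names, covariates, and weighting model are assumptions for the example, not the study's specification.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5000

# Synthetic cohort (illustrative only): two baseline covariates, follow-up time, death indicator.
age_50plus = rng.integers(0, 2, n)
cd4_lt50 = rng.integers(0, 2, n)
time = rng.exponential(5.0, n) * np.exp(-0.5 * age_50plus)
death = (rng.random(n) < 0.4).astype(int)
# Retention (not being lost to follow-up) depends on the covariates, creating informative dropout.
p_retained = 1.0 / (1.0 + np.exp(-(1.0 - 0.8 * cd4_lt50 + 0.3 * age_50plus)))
retained = rng.random(n) < p_retained

# 1. Model the probability of remaining under observation, given baseline covariates.
X = np.column_stack([age_50plus, cd4_lt50])
p_hat = LogisticRegression(max_iter=1000).fit(X, retained).predict_proba(X)[:, 1]

# 2. Fit a Cox model among retained patients, weighting each by 1 / P(retained).
df = pd.DataFrame({"age_50plus": age_50plus, "cd4_lt50": cd4_lt50, "time": time,
                   "death": death, "ipw": 1.0 / p_hat})
cph = CoxPHFitter()
cph.fit(df[retained], duration_col="time", event_col="death",
        weights_col="ipw", robust=True)           # robust (sandwich) errors are advisable with weights
print(cph.summary[["coef", "exp(coef)"]])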

Relevance: 80.00%

Abstract:

A life table methodology was developed which estimates the expected remaining Army service time and the expected remaining Army sick time by years of service for the United States Army population. A measure of illness impact was defined as the ratio of expected remaining Army sick time to the expected remaining Army service time. The variances of the resulting estimators were developed on the basis of current data. The theory of partial and complete competing risks was considered for each type of decrement (death, administrative separation, and medical separation) and for the causes of sick time.

The methodology was applied to world-wide U.S. Army data for calendar year 1978. A total of 669,493 enlisted personnel and 97,704 officers were reported on active duty as of 30 September 1978. During calendar year 1978, the Army Medical Department reported 114,647 inpatient discharges and 1,767,146 sick days. Although the methodology is completely general with respect to the definition of sick time, only sick time associated with an inpatient episode was considered in this study.

Since the temporal measure was years of Army service, an age-adjusting process was applied to the life tables for comparative purposes. Analyses were conducted by rank (enlisted and officer), race and sex, and were based on the ratio of expected remaining Army sick time to expected remaining Army service time. Seventeen major diagnostic groups, classified by the Eighth Revision, International Classification of Diseases, Adapted for Use in the United States, were ranked according to their cumulative (across years of service) contribution to expected remaining sick time.

The study results indicated that enlisted personnel tend to have more expected hospital-associated sick time relative to their expected Army service time than officers. Non-white officers generally have more expected sick time relative to their expected Army service time than white officers. This racial differential was not supported within the enlisted population. Females tend to have more expected sick time relative to their expected Army service time than males. This tendency remained after diagnostic groups 580-629 (Genitourinary System) and 630-678 (Pregnancy and Childbirth) were removed. Problems associated with the circulatory system, digestive system and musculoskeletal system were among the three leading causes of cumulative sick time across years of service.
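
The central quantity described here, the ratio of expected remaining sick time to expected remaining service time by years of service, can be sketched with a toy multiple-decrement life table. All decrement probabilities and sick-day rates in the Python sketch below are invented placeholders; only the structure (survivorship under competing decrements, then conditional expectations of remaining time) mirrors the methodology described.

import numpy as np

# Toy multiple-decrement life table by years of service (all numbers are illustrative).
q_death = np.full(10, 0.001)                                             # annual probability of death
q_admin = np.array([0.20, 0.15, 0.12, 0.10, 0.08, 0.08, 0.07, 0.07, 0.06, 0.06])  # administrative separation
q_medical = np.full(10, 0.01)                                            # medical separation
sick_days = np.array([4, 4, 5, 5, 6, 6, 7, 7, 8, 8], dtype=float)        # expected sick days per person-year

# Survivorship: probability of still being in service at the start of each year of service.
q_total = q_death + q_admin + q_medical
l = np.concatenate([[1.0], np.cumprod(1.0 - q_total)])                   # l(0), ..., l(10)

def expected_remaining(per_year_amount):
    """Conditional expectation of the remaining amount (years or sick days) at each year of service.

    Simplification: a year contributes only if it is completed (no credit for partial years).
    """
    completed = l[1:]                                                    # probability of completing each year
    return np.array([np.sum(per_year_amount[x:] * completed[x:]) / l[x] for x in range(10)])

e_service = expected_remaining(np.ones(10))                              # expected remaining service time (years)
e_sick = expected_remaining(sick_days)                                   # expected remaining sick time (days)
impact = e_sick / (e_service * 365.25)                                   # illness impact: sick time / service time

for x in (0, 5, 9):
    print(f"{x} years of service: remaining service ~ {e_service[x]:.2f} y, "
          f"remaining sick ~ {e_sick[x]:.1f} d, impact ratio ~ {impact[x]:.4f}")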