952 results for mortality-incidence ratio
Abstract:
BACKGROUND: The extent to which mortality differs following individual acquired immunodeficiency syndrome (AIDS)-defining events (ADEs) has not been assessed among patients initiating combination antiretroviral therapy. METHODS: We analyzed data from 31,620 patients with no prior ADEs who started combination antiretroviral therapy. Cox proportional hazards models were used to estimate mortality hazard ratios for each ADE that occurred in >50 patients, after stratification by cohort and adjustment for sex, HIV transmission group, number of antiretroviral drugs initiated, regimen, age, date of starting combination antiretroviral therapy, and CD4+ cell count and HIV RNA load at initiation of combination antiretroviral therapy. ADEs that occurred in <50 patients were grouped together to form a "rare ADEs" category. RESULTS: During a median follow-up period of 43 months (interquartile range, 19-70 months), 2880 ADEs were diagnosed in 2262 patients; 1146 patients died. The most common ADEs were esophageal candidiasis (in 360 patients), Pneumocystis jiroveci pneumonia (320 patients), and Kaposi sarcoma (308 patients). The greatest mortality hazard ratios were associated with non-Hodgkin's lymphoma (hazard ratio, 17.59; 95% confidence interval, 13.84-22.35) and progressive multifocal leukoencephalopathy (hazard ratio, 10.0; 95% confidence interval, 6.70-14.92). Three groups of ADEs were identified on the basis of the ranked hazard ratios with bootstrapped confidence intervals: severe (non-Hodgkin's lymphoma and progressive multifocal leukoencephalopathy [hazard ratio, 7.26; 95% confidence interval, 5.55-9.48]), moderate (cryptococcosis, cerebral toxoplasmosis, AIDS dementia complex, disseminated Mycobacterium avium complex, and rare ADEs [hazard ratio, 2.35; 95% confidence interval, 1.76-3.13]), and mild (all other ADEs [hazard ratio, 1.47; 95% confidence interval, 1.08-2.00]). CONCLUSIONS: In the combination antiretroviral therapy era, mortality rates subsequent to an ADE depend on the specific diagnosis. The proposed classification of ADEs may be useful in clinical end point trials, prognostic studies, and patient management.
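As an illustration of the method, the following is a minimal sketch of a cohort-stratified, covariate-adjusted Cox proportional hazards fit in Python using the lifelines library. The data frame is synthetic and the column names are hypothetical stand-ins; in particular, the real analysis treated ADEs as time-updated covariates, which this time-fixed sketch does not attempt.

```python
# Minimal sketch of a stratified Cox PH model (synthetic data, hypothetical columns).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "months": rng.exponential(40, n),   # follow-up time in months
    "died": rng.integers(0, 2, n),      # 1 = death observed
    "age": rng.normal(38, 10, n),
    "cd4": rng.normal(250, 100, n),     # CD4+ count at therapy initiation
    "nhl": rng.integers(0, 2, n),       # simplified time-fixed ADE indicator
    "cohort": rng.integers(0, 5, n),    # cohort membership for stratification
})

cph = CoxPHFitter()
# strata=["cohort"] gives each cohort its own baseline hazard, mirroring
# the stratification by cohort described in the abstract.
cph.fit(df, duration_col="months", event_col="died", strata=["cohort"])
cph.print_summary()  # hazard ratios are reported as exp(coef)
```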
Abstract:
BACKGROUND: There are differences in the literature regarding outcomes of premature small-for-gestational-age (SGA) and appropriate-for-gestational-age (AGA) infants, possibly due to failure to take into account gestational age (GA) at birth. OBJECTIVE: To compare mortality and respiratory morbidity of SGA and AGA premature newborn infants. DESIGN/METHODS: A retrospective study was done of the 2,487 infants born without congenital anomalies. RESULTS: Controlling for GA, premature SGA infants were at a higher risk for mortality (odds ratio 3.1, P = 0.001) and at lower risk of respiratory distress syndrome (OR = 0.71, p = 0.02) than AGA infants. However, multivariate logistic regression modeling found that the odds of having respiratory distress syndrome (RDS) varied between SGA and AGA infants by GA. RDS risk was reduced in SGA infants born at ≥32 wk GA (OR = 0.41, 95% CI 0.27-0.63; p < 0.01). After controlling for GA, SGA infants were at a significantly higher risk for developing chronic lung disease than AGA infants (OR = 2.2, 95% CI = 1.2-3.9, P = 0.01). There was no significant difference between SGA and AGA infants in total days on ventilator. Among infants who survived, mean length of hospital stay was significantly longer in SGA infants born between 26 and 36 wk GA than in AGA infants. CONCLUSIONS: Premature SGA infants have significantly higher mortality, significantly higher risk of developing chronic lung disease, and longer hospital stay than premature AGA infants. Even the reduced risk of RDS in infants born at ≥32 wk GA (conferred possibly by intrauterine stress leading to accelerated lung maturation) appears to be transient and is counterbalanced by the adverse effects of poor intrauterine growth on long-term pulmonary outcomes such as chronic lung disease.
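The finding that the SGA-versus-AGA odds of RDS vary with gestational age is what an SGA × GA interaction term expresses in a logistic model. Below is a hedged sketch with synthetic data using statsmodels; variable names and coefficients are invented for illustration, not taken from the study.

```python
# Sketch: logistic regression with an SGA-by-GA interaction (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
ga = rng.integers(24, 37, n)        # gestational age, weeks
sga = rng.integers(0, 2, n)         # 1 = small for gestational age
# Synthetic outcome: RDS risk falls with GA, and SGA modifies that slope.
logit_p = 10 - 0.35 * ga - 2.0 * sga + 0.05 * sga * ga
rds = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

df = pd.DataFrame({"rds": rds, "sga": sga, "ga": ga})
fit = smf.logit("rds ~ sga * ga", data=df).fit(disp=0)
print(np.exp(fit.params))                # odds ratios, incl. the interaction
print(fit.conf_int().apply(np.exp))      # 95% CIs on the OR scale
```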
Abstract:
BACKGROUND Treatment of patients with paediatric acute lymphoblastic leukaemia has evolved such that the risk of late effects in survivors treated in accordance with contemporary protocols could be different from that noted in those treated decades ago. We aimed to estimate the risk of late effects in children with standard-risk acute lymphoblastic leukaemia treated with contemporary protocols. METHODS We used data from similarly treated members of the Childhood Cancer Survivor Study cohort. The Childhood Cancer Survivor Study is a multicentre, North American study of 5-year survivors of childhood cancer diagnosed between 1970 and 1986. We included cohort members if they were aged 1·0-9·9 years at the time of diagnosis of acute lymphoblastic leukaemia and had received treatment consistent with contemporary standard-risk protocols for acute lymphoblastic leukaemia. We calculated mortality rates and standardised mortality ratios, stratified by sex and survival time, after diagnosis of acute lymphoblastic leukaemia. We calculated standardised incidence ratios and absolute excess risk for subsequent neoplasms with age-specific, sex-specific, and calendar-year-specific rates from the Surveillance, Epidemiology and End Results Program. Outcomes were compared with a sibling cohort and the general US population. FINDINGS We included 556 (13%) of 4329 cohort members treated for acute lymphoblastic leukaemia. Median follow-up of the survivors from 5 years after diagnosis was 18·4 years (range 0·0-33·0). 28 (5%) of 556 participants had died (standardised mortality ratio 3·5, 95% CI 2·3-5·0). 16 (57%) deaths were due to causes other than recurrence of acute lymphoblastic leukaemia. Six (1%) survivors developed a subsequent malignant neoplasm (standardised incidence ratio 2·6, 95% CI 1·0-5·7). 107 participants (95% CI 81-193) in each group would need to be followed up for 1 year to observe one extra chronic health disorder in the survivor group compared with the sibling group. 415 participants (376-939) in each group would need to be followed up for 1 year to observe one extra severe, life-threatening, or fatal disorder in the group of survivors. Survivors did not differ from siblings in their educational attainment, rate of marriage, or independent living. INTERPRETATION The prevalence of adverse long-term outcomes in children treated for standard-risk acute lymphoblastic leukaemia according to contemporary protocols is low, but regular care from a knowledgeable primary-care practitioner is warranted. FUNDING National Cancer Institute, American Lebanese-Syrian Associated Charities, Swiss Cancer Research.
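A standardised mortality ratio is simply observed deaths divided by the deaths expected from reference-population rates, with an exact Poisson confidence interval. The sketch below reproduces the abstract's headline figure; the expected count of 8 is back-calculated from 28 observed deaths and an SMR of 3.5, so treat it as illustrative. The standardised incidence ratio for subsequent neoplasms follows the same construction with expected case counts.

```python
# Sketch: SMR with an exact Poisson 95% CI via the chi-squared link.
# The expected count is back-calculated for illustration (28 / 3.5 = 8).
from scipy.stats import chi2

def smr_with_ci(observed: int, expected: float, alpha: float = 0.05):
    smr = observed / expected
    lo = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected) if observed else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return smr, lo, hi

smr, lo, hi = smr_with_ci(observed=28, expected=8.0)
print(f"SMR {smr:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # ~3.5 (2.3-5.1)
```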
Abstract:
BACKGROUND Febrile neutropenia (FN) and other infectious complications are some of the most serious treatment-related toxicities of chemotherapy for cancer, with a mortality rate of 2% to 21%. The two main types of prophylactic regimens are granulocyte (macrophage) colony-stimulating factors (G(M)-CSF) and antibiotics, frequently quinolones or cotrimoxazole. Current guidelines recommend the use of colony-stimulating factors when the risk of febrile neutropenia is above 20%, but they do not mention the use of antibiotics. However, both regimens have been shown to reduce the incidence of infections. Since no systematic review has compared the two regimens, a systematic review was undertaken. OBJECTIVES To compare the efficacy and safety of G(M)-CSF versus antibiotics in cancer patients receiving myelotoxic chemotherapy. SEARCH METHODS We searched The Cochrane Library, MEDLINE, EMBASE, databases of ongoing trials, and conference proceedings of the American Society of Clinical Oncology and the American Society of Hematology (1980 to December 2015). We planned to include both full-text and abstract publications. Two review authors independently screened search results. SELECTION CRITERIA We included randomised controlled trials (RCTs) comparing prophylaxis with G(M)-CSF versus antibiotics for the prevention of infection in cancer patients of all ages receiving chemotherapy. All study arms had to receive identical chemotherapy regimens and other supportive care. We included full-text articles, abstracts, and unpublished data if sufficient information on study design, participant characteristics, interventions, and outcomes was available. We excluded cross-over trials, quasi-randomised trials, and post-hoc retrospective trials. DATA COLLECTION AND ANALYSIS Two review authors independently screened the results of the search strategies, extracted data, assessed risk of bias, and analysed data according to standard Cochrane methods. We did the final interpretation together with an experienced clinician. MAIN RESULTS In this updated review, we included no new randomised controlled trials. We included two trials in the review: one with 40 breast cancer patients receiving high-dose chemotherapy and G-CSF compared to antibiotics, and a second evaluating 155 patients with small-cell lung cancer receiving GM-CSF or antibiotics. We judged the overall risk of bias as high in the G-CSF trial, as neither patients nor physicians were blinded and not all included patients were analysed as randomised (7 out of 40 patients). We considered the overall risk of bias in the GM-CSF trial to be moderate because of the risk of performance bias (neither patients nor personnel were blinded), but low risk of selection and attrition bias. For the trial comparing G-CSF to antibiotics, all-cause mortality was not reported. There was no evidence of a difference for infection-related mortality, with zero events in each arm. Microbiologically or clinically documented infections, severe infections, quality of life, and adverse events were not reported. There was no evidence of a difference in frequency of febrile neutropenia (risk ratio (RR) 1.22; 95% confidence interval (CI) 0.53 to 2.84). The quality of the evidence for the two reported outcomes, infection-related mortality and frequency of febrile neutropenia, was very low, due to the low number of patients evaluated (high imprecision) and the high risk of bias. There was no evidence of a difference in terms of median survival time in the trial comparing GM-CSF and antibiotics.
Two-year survival rates were 6% (0 to 12%) in both arms (high imprecision, low quality of evidence). There were four toxic deaths in the GM-CSF arm and three in the antibiotics arm (3.8%), without evidence of a difference (RR 1.32; 95% CI 0.30 to 5.69; P = 0.71; low quality of evidence). Grade III or IV infections occurred in 28% of patients in the GM-CSF arm and 18% in the antibiotics arm, without evidence of a difference (RR 1.55; 95% CI 0.86 to 2.80; P = 0.15; low quality of evidence). There were 5 episodes of grade IV infection out of 360 cycles in the GM-CSF arm and 3 episodes out of 334 cycles in the cotrimoxazole arm (0.8%), with no evidence of a difference (RR 1.55; 95% CI 0.37 to 6.42; P = 0.55; low quality of evidence). There was no significant difference between the two arms for non-haematological toxicities such as diarrhoea, stomatitis, infections, or neurologic, respiratory, or cardiac adverse events. Grade III and IV thrombocytopenia occurred significantly more frequently in the GM-CSF arm (60.8%) than in the antibiotics arm (28.9%) (RR 2.10; 95% CI 1.41 to 3.12; P = 0.0002; low quality of evidence). Infection-related mortality, incidence of febrile neutropenia, and quality of life were not reported in this trial. AUTHORS' CONCLUSIONS As we found only two small trials with 195 patients altogether, no conclusion for clinical practice is possible. More trials are necessary to assess the benefits and harms of G(M)-CSF compared to antibiotics for infection prevention in cancer patients receiving chemotherapy.
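The risk ratios quoted throughout are ratios of event proportions, with confidence intervals usually derived from the standard error of log(RR) (the Katz method). A sketch of the arithmetic in Python; the review reports percentages rather than raw counts, so the counts below are hypothetical and chosen only to land near the reported figures.

```python
# Sketch: risk ratio with a 95% CI from the log-RR (Katz) standard error.
# Counts are hypothetical; the review reports only percentages.
import math

def risk_ratio_ci(a: int, n1: int, b: int, n2: int, z: float = 1.96):
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)   # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical split of the 155 patients: 22/77 (~29%) vs. 14/78 (~18%).
rr, lo, hi = risk_ratio_ci(22, 77, 14, 78)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```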
Abstract:
The relationship between degree of diastolic blood pressure (DBP) reduction and mortality was examined among hypertensives, ages 30-69, in the Hypertension Detection and Follow-up Program (HDFP). The HDFP was a multi-center, community-based trial that followed 10,940 hypertensive participants for five years. One-year survival was required for inclusion in this investigation, since the one-year annual visit was the first occasion on which change in blood pressure could be measured on all participants. During the subsequent four years of follow-up of 10,052 participants, 568 deaths occurred. For levels of change in DBP and for categories of variables related to mortality, the crude mortality rate was calculated. Time-dependent life tables were also calculated so as to utilize available blood pressure data over time. In addition, the Cox life table regression model, extended to take into account both time-constant and time-dependent covariates, was used to examine the relationship between change in blood pressure over time and mortality. The results of the time-dependent life table and time-dependent Cox life table regression analyses supported the existence of a quadratic function modeling the relationship between DBP reduction and mortality, even after adjusting for other risk factors. The minimum mortality hazard ratio, based on a particular model, occurred at a DBP reduction of 22.6 mm Hg (standard error = 10.6) in the whole population and 8.5 mm Hg (standard error = 4.6) in the baseline DBP stratum 90-104. After this reduction, there was a small increase in the risk of death. There was no evidence of the quadratic function after fitting the same model using systolic blood pressure. Methodologic issues involved in studying a particular degree of blood pressure reduction were considered. The confidence interval around the change corresponding to the minimum hazard ratio was wide, and the obtained blood pressure level should not be interpreted as a goal for treatment. Blood pressure reduction was attributable not only to pharmacologic therapy but also to regression to the mean and to other unknown factors unrelated to treatment. Therefore, the surprising results of this study do not provide direct implications for treatment, but strongly suggest replication in other populations.
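With a quadratic term in the log hazard, the reduction that minimises risk sits at the vertex of the parabola: for a log hazard of b1·x + b2·x², the minimum is at x* = -b1/(2·b2). A numeric sketch follows; the coefficients are invented (chosen so the vertex lands at 22.6 mm Hg), and the actual HDFP model included additional time-dependent covariates not shown here.

```python
# Sketch: the hazard-minimising DBP reduction under a quadratic log hazard.
# Coefficients are invented for illustration, not the HDFP estimates.
import numpy as np

b1, b2 = -0.0226, 0.0005          # linear and quadratic terms for DBP change
x_star = -b1 / (2 * b2)           # vertex of the parabola
print(f"hazard-minimising DBP reduction: {x_star:.1f} mm Hg")  # 22.6

# The same vertex recovered from a degree-2 polynomial fit:
x = np.linspace(-10.0, 50.0, 200)
log_hr = b1 * x + b2 * x**2
c2, c1, c0 = np.polyfit(x, log_hr, 2)   # coefficients, highest power first
print(f"fitted vertex: {-c1 / (2 * c2):.1f} mm Hg")
```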
Abstract:
OBJECTIVE - To assess the performance of health systems using diabetes as a tracer condition. RESEARCH DESIGN AND METHODS - We generated a measure of case fatality among young people with diabetes using the mortality-to-incidence ratio (M/I ratio) for 29 industrialized countries, based on published data on diabetes incidence and mortality. Standardized incidence rates for ages 0-14 years were extracted from the World Health Organization DiaMond Study for the period 1990-1994; data on death from diabetes for ages 0-39 years were obtained from the World Health Organization Mortality database and converted into age-standardized death rates for the period 1994-1998, using the European standard population. RESULTS - The M/I ratio varied >10-fold. These relative differences appear similar to those observed in cohort studies of mortality among young people with type 1 diabetes in five countries. A sensitivity analysis showed that using plausible assumptions about potential overestimation of diabetes as a cause of death and underestimation of incidence rates in the U.S. yields an M/I ratio that would still be twice as high as in the U.K. or Canada. CONCLUSIONS - The M/I ratio for diabetes provides a means of differentiating countries on quality of care for people with diabetes. It is solely an indicator of potential problems, a basis for stimulating more detailed assessments of whether such problems exist and what can be done to address them.
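The M/I ratio itself is one division: the age-standardised death rate over the standardised incidence rate. Below is a sketch with hypothetical rates per 100,000 person-years; the paper's country-level inputs are not reproduced here.

```python
# Sketch: mortality-to-incidence (M/I) ratio as a case-fatality proxy.
# Rates are hypothetical and expressed per 100,000 person-years.
def mi_ratio(death_rate: float, incidence_rate: float) -> float:
    """Age-standardised death rate divided by standardised incidence rate."""
    return death_rate / incidence_rate

# Two hypothetical countries with similar incidence: a >2-fold gap in M/I
# points to a difference in care quality rather than in disease burden.
country_a = mi_ratio(death_rate=0.60, incidence_rate=15.0)   # 0.040
country_b = mi_ratio(death_rate=0.25, incidence_rate=14.0)   # ~0.018
print(f"M/I, country A: {country_a:.3f}; country B: {country_b:.3f}")
```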
Abstract:
The prognostic significance of an excessive blood pressure increase with exercise (EBPIE) for cardiovascular outcomes remains controversial. We sought to assess its impact on the risk of all-cause mortality and major cardiac events in patients with known or suspected coronary artery disease (CAD) referred for stress testing. Exercise echocardiography was performed in 10,047 patients with known or suspected CAD. An EBPIE was defined as an increase in systolic blood pressure with exercise ≥80 mmHg. The endpoints were all-cause mortality and major cardiac events (MACE), including cardiac death or nonfatal myocardial infarction (MI). Overall, 573 patients exhibited an EBPIE during the tests. Over a mean follow-up of 4.8 years, there were 1,950 deaths (including 725 cardiac deaths), 1,477 MIs, and 1,900 MACE. The cumulative 10-year rates of all-cause mortality, cardiac death, nonfatal MI, and MACE were 32.9%, 13.1%, 26.9%, and 33% in patients who did not develop an EBPIE vs. 18.9%, 4.7%, 17.5%, and 20.7% in those experiencing an EBPIE, respectively (p < 0.001 for all comparisons). In Cox regression analyses, an EBPIE remained independently associated with lower risks of all-cause mortality (hazard ratio [HR] 0.73, 95% confidence interval [CI] 0.59-0.91, p = 0.004), cardiac death (HR 0.67, 95% CI 0.46-0.98, p = 0.04), MI (HR 0.67, 95% CI 0.52-0.86, p = 0.002), and MACE (HR 0.69, 95% CI 0.56-0.86, p = 0.001). An EBPIE was associated with a significantly lower risk of mortality and MACE in patients with known or suspected CAD referred for stress testing.
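Cumulative 10-year event rates such as these are typically read off Kaplan-Meier curves as 1 - S(10), which is valid only in the absence of competing risks. A sketch with simulated follow-up using lifelines; all numbers are synthetic.

```python
# Sketch: 10-year cumulative event rate from a Kaplan-Meier estimate.
# Synthetic data; 1 - S(t) ignores competing risks (a simplification).
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
n = 1000
event_time = rng.exponential(25.0, n)    # years to event (synthetic)
censor_time = rng.uniform(0.0, 12.0, n)  # administrative censoring
t = np.minimum(event_time, censor_time)
observed = event_time <= censor_time

kmf = KaplanMeierFitter()
kmf.fit(t, event_observed=observed)
s10 = kmf.survival_function_at_times(10.0).iloc[0]
print(f"cumulative 10-year event rate: {1 - s10:.1%}")
```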
Abstract:
Infective endocarditis (IE) is associated with high in-hospital mortality. New microbiological diagnostic techniques have reduced the proportion of patients without an etiological diagnosis, but in a significant number of patients the cause is still unknown. Our aim was to study the association of the absence of microbiological diagnosis with in-hospital prognosis. We studied a prospective cohort of 2,000 consecutive patients with IE. Data were collected in 26 Spanish hospitals. Modified Duke criteria were used to diagnose patients with suspected IE. A total of 290 patients (14.8%) had negative blood cultures. An etiological diagnosis was achieved with other methods (polymerase chain reaction, serology, and other cultures) in 121 (6.1%). Finally, there were 175 patients (8.8%) without microbiological diagnosis (Group A) and 1,825 with a diagnosis (Group B). In-hospital mortality occurred in 58 patients in Group A (33.1%) vs. 487 (26.7%) in Group B, p = 0.07. Patients in Group A had a lower risk profile than those in Group B, with less comorbidity (Charlson index 1.9 ± 2.0 vs. 2.3 ± 2.1, p = 0.03) and lower surgical risk (EuroSCORE 23.6 ± 21.8 vs. 29.6 ± 25.2, p = 0.02). However, they presented with heart failure more frequently (53% vs. 40%, p = 0.005). Multivariate analysis showed that the absence of microbiological diagnosis was an independent predictor of in-hospital mortality (odds ratio 1.8, 95% confidence interval 1.1-2.9, p = 0.016). Approximately 9% of patients with IE had no microbiological diagnosis. Absence of microbiological diagnosis was an independent predictor of in-hospital mortality.
Abstract:
Background: In Brazil, heart failure (HF) leads to approximately 25,000 deaths per year. Abnormal calcium handling is a hallmark of heart failure, and genes encoding proteins involved in the re-uptake of calcium may harbor mutations leading to inherited cardiomyopathies. Phospholamban (PLN) plays a prime role in cardiac contractility and relaxation, and mutations in the gene encoding PLN have been associated with dilated cardiomyopathy. In this study, our objective was to determine the presence of the -36A>C alteration in the PLN gene in a Brazilian population of individuals with HF and to test whether this alteration is associated with heart failure or with a worse prognosis in patients with HF. Methods: We genotyped the -36A>C alteration in the PLN gene in a cohort of 881 patients with HF and in 1,259 individuals from a cohort drawn from the general population. Allele and genotype frequencies were compared between groups (patients and controls). In addition, frequencies or mean values of different phenotypes associated with cardiovascular disease were compared between genotypic groups. Finally, patients were prospectively followed up, and genotypes for the -36A>C alteration were compared with regard to mortality incidence in HF patients. Results: No significant association was found between the studied polymorphism and HF in our population. In addition, no association was observed between the PLN -36A>C polymorphism and demographic, clinical, or functional characteristics, or mortality incidence, in this sample of HF patients. Conclusion: Our data do not support a role for the PLN -36A>C alteration in modulating the heart failure phenotype, including its clinical course, in humans.
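Comparing genotype frequencies between patients and population controls, as here and in the NFKB1 study below, reduces to a contingency-table chi-squared test. A sketch with scipy; the row totals match the cohort sizes in the abstract, but the genotype split is invented.

```python
# Sketch: case-control genotype comparison via a chi-squared test.
# Rows = HF patients / controls; columns = AA / AC / CC (invented split).
from scipy.stats import chi2_contingency

table = [
    [610, 240, 31],   # 881 HF patients (hypothetical genotype counts)
    [880, 340, 39],   # 1,259 population controls (hypothetical)
]
chi2_stat, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2_stat:.2f}, df = {dof}, p = {p_value:.3f}")
# A p-value above 0.05 would be consistent with the study's null finding.
```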
Abstract:
Background: Cardiac remodeling is generally an adverse sign and is associated with heart failure (HF) progression. NFkB, an important transcription factor involved in many cell survival pathways, has been implicated in the remodeling process, but its role in the heart is still controversial. Recently, a promoter polymorphism associated with lower activation of the NFKB1 gene was also associated with dilated cardiomyopathy. The purpose of this study was to evaluate the association of this polymorphism with clinical and functional characteristics of heart failure patients of different etiologies. Methods: A total of 493 patients with HF and 916 individuals from a cohort drawn from the general population were investigated. The NFKB1 -94 insertion/deletion ATTG polymorphism was genotyped by high-resolution melt discrimination. Allele and genotype frequencies were compared between groups. In addition, frequencies or mean values of different phenotypes associated with cardiovascular disease were compared between genotype groups. Finally, patients were prospectively followed up, and genotypes were compared with regard to disease onset and mortality incidence in HF patients. Results: We did not find differences in genotype or allele frequencies between cases and controls. Interestingly, we found an association of the ATTG(1)/ATTG(1) genotype with right ventricle diameter (P = 0.001), left ventricle diastolic diameter (P = 0.04), and ejection fraction (EF) (P = 0.016), with the ATTG(1)/ATTG(1) genotype being more frequent in patients with EF lower than 50% (P = 0.01). Finally, we observed a significantly earlier disease onset in ATTG(1)/ATTG(1) carriers. Conclusion: There is no genotype or allele association between the studied polymorphism and the occurrence of HF in the tested population. However, our data suggest that diminished activation of NFKB1, previously associated with the ATTG(1)/ATTG(1) genotype, may modulate the onset of disease and, once the individual has HF, may modulate disease severity by increasing cardiac remodeling and deterioration of cardiac function.
Abstract:
To describe the incidence of cancer in coal miners in New South Wales (NSW) between 1973 and 1992, an inception cohort of all male coal industry employees who entered the industry between 1 January 1973 and 31 December 1992 was constructed from the medical examination records of the Joint Coal Board. This cohort was matched with the NSW State Cancer Registry to determine the occurrence and type of cancer. In the cohort of 23,630 men, 297 developed 301 primary cancers in the 20-year period of observation. The standardised incidence ratio (SIR) for all cancers was 0.82. Stomach cancer has been reported to be common in coal miners, but the SIR for stomach cancer was not higher than average in this cohort. A cluster of non-Hodgkin's lymphoma has been reported in a NSW coal mine, but an increased risk of this cancer was not evident in the industry as a whole. Similarly, a cluster of cases of brain tumour has been reported. In this cohort, the SIR for brain tumour was 1.05 (95 per cent confidence interval (CI) 0.57 to 1.76), and a risk for brain tumour remains unconfirmed. The SIR for malignant melanoma was 1.13 (CI 0.90 to 1.39) overall and 2.02 (CI 1.31 to 2.98) for those workers who started in an open-cut mine. Overall, there does not appear to be a general risk of cancer in the NSW coal industry. Open-cut miners have an increased risk of malignant melanoma, which may be related to their exposure to the sun at work.
Abstract:
Purpose: The purpose of this study was to assess risk factors associated with the development of acute respiratory failure (ARF) and death in general intensive care units (ICUs). Materials and Methods: Adults hospitalized in 12 surgical and nonsurgical ICUs were prospectively followed up. Multivariable analyses were performed to determine the risk factors for ARF and to identify the prognostic factors for mortality in these patients. Results: A total of 1,732 patients were evaluated, with an ARF prevalence of 57%. Of the 889 patients admitted without ARF, 141 (16%) developed this syndrome in the ICU. The independent risk factors for developing ARF were age 64 years or older, longer time between hospital and ICU admission, unscheduled surgical or clinical reason for ICU admission, and severity of illness. Of the 984 patients with ARF, 475 (48%) died during the ICU stay. Independent prognostic factors for death were age older than 64 years, time between hospital and ICU admission of more than 4 days, history of hematologic malignancy or AIDS, development of ARF in the ICU, acute lung injury, and severity of illness. Conclusions: Acute respiratory failure represents a large percentage of all ICU patients, and the high mortality is related to some preventable factors, such as the time to ICU admission.
Abstract:
OBJECTIVE: To evaluate the early and late outcomes of patients undergoing primary coronary angioplasty for acute myocardial infarction. METHODS: A prospective study of 135 patients with acute myocardial infarction undergoing primary percutaneous transluminal coronary angioplasty (PTCA). Success was defined as TIMI 3 flow and a residual lesion <50%. We performed univariate and multivariate statistical analyses, with survival analysis by the Kaplan-Meier method. RESULTS: The PTCA success rate was 78% and early mortality 18.5%. Killip classes III and IV were associated with higher mortality, odds ratio 22.9 (95% CI: 5.7 to 91.8), and mortality was inversely related to age <75 years (OR = 0.93; 95% CI: 0.88 to 0.98). If we had defined success as TIMI 2 flow and excluded patients in Killip classes III/IV, the success rate would have been 86% and mortality 8%. The survival probability at the end of the study (follow-up time 142 ± 114 days) was 80%, and event-free survival was 35%. Greater survival was associated with stenting (OR = 0.09; 0.01 to 0.75) and single-vessel disease (OR = 0.21; 0.07 to 0.61). CONCLUSION: The success rate was lower and mortality higher than in randomized trials, but similar to those of nonrandomized studies. This demonstrates the efficacy of primary PTCA in our local conditions.
Abstract:
OBJECTIVES: This study sought to assess outcomes in patients with ST-segment elevation myocardial infarction undergoing primary percutaneous coronary intervention (PCI) for unprotected left main (LM) disease. BACKGROUND: Limited data are available on outcomes in patients with ST-segment elevation myocardial infarction undergoing LM PCI. METHODS: Of 9,075 patients with ST-segment elevation myocardial infarction enrolled in the AMIS (Acute Myocardial Infarction in Switzerland) Plus registry between 2005 and June 30, 2010, 6,666 underwent primary PCI. Of them, 348 (5.2%; mean age: 63.5 ± 12.6 years) underwent LM PCI, either isolated (n = 208) or concomitant with PCI for other vessel segments (n = 140). They were compared with 6,318 patients (94.8%; mean age: 61.9 ± 12.5 years) undergoing PCI of non-LM vessel segments only. RESULTS: The LM patients had higher rates of cardiogenic shock (12.2% vs. 3.5%; p < 0.001), cardiac arrest (10.6% vs. 6.3%; p < 0.01), in-hospital mortality (10.9% vs. 3.8%; p < 0.001), and major adverse cardiac and cerebrovascular events (12.4% vs. 5.0%; p < 0.001) than the non-LM PCI patients. Rates of mortality and major adverse cardiac and cerebrovascular events were highest for concurrent LM and non-LM PCI (17.9% and 18.6%, respectively), intermediate for isolated LM PCI (6.3% and 8.3%, respectively), and lowest for non-LM PCI (3.8% and 5.0%, respectively). Rates of mortality and major adverse cardiac and cerebrovascular events for LM PCI were higher than for non-LM multivessel PCI (10.9% vs. 4.9%, p < 0.001, and 12.4% vs. 6.4%, p < 0.001, respectively). LM disease independently predicted in-hospital death (odds ratio: 2.36; 95% confidence interval: 1.34 to 4.17; p = 0.003). CONCLUSIONS: Emergent LM PCI in the context of acute myocardial infarction, even with 12% of patients in cardiogenic shock, appears to have a remarkably high (89%) in-hospital survival. Concurrent LM and non-LM PCI has worse outcomes than isolated LM PCI.
Abstract:
Cancer is a major health problem in our Autonomous Community and is the second leading cause of death in both males and females. The incidence, mortality, potential years of life lost, and resource consumption, alongside the suffering endured by patients and families, call for a commitment from the Health Administration, healthcare professionals, patients, and caregivers. This Plan is based on updated analyses of the mortality, incidence, and survival of cancer in Andalusia, of the state of cancer care and the resources available, and of the expectations of patients and main caregivers, as well as on the conclusions of different Work Groups on the Management of Processes related to Cancer. The Andalusian Comprehensive Cancer Plan establishes an action programme that involves organisational and functional changes, new proposals for the training of professionals, and a specific funding base.