Abstract:
INTRODUCTION Spinal disc herniation, lumbar spinal stenosis and spondylolisthesis are known to be leading causes of lumbar back pain. The costs of low back pain management and related operations are continuously increasing in the healthcare sector. There are many studies on complications after spine surgery, but little is known about the factors predicting length of stay in hospital. The purpose of this study was to identify these factors in lumbar spine surgery in order to adapt postoperative treatment. MATERIAL AND METHODS The current study was carried out as a post hoc analysis on the basis of the German spine registry. Patients who underwent lumbar spine surgery via posterior surgical access with posterior fusion and/or rigid stabilization were included; procedures with dynamic stabilization were excluded. Patient characteristics were tested for association with length of stay (LOS) using bivariate and multivariate analyses. RESULTS A total of 356 patients met the inclusion criteria. The average age of all patients was 64.6 years and the mean LOS was 11.9 ± 6.0 days, with a range of 2-44 days. Independent factors influencing LOS were increased age at the time of surgery, higher body mass index, male gender, blood transfusion of 1-2 erythrocyte concentrates and the presence of surgical complications. CONCLUSION Identification of predictive factors for prolonged LOS may allow for estimation of patient hospitalization time and for optimization of postoperative care. In individual cases this may result in a reduction in LOS.
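The abstract does not report the model specification. As a hedged illustration of a multivariable analysis of LOS against the reported predictors, a minimal sketch in Python (statsmodels), with invented data and hypothetical column names, might look like this:

```python
# Hedged sketch only: multivariable linear model of length of stay (LOS).
# Data and column names are invented, not from the German spine registry.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "los_days":     [8, 15, 11, 22, 9, 14, 30, 7, 12, 18],
    "age":          [55, 72, 60, 78, 49, 66, 81, 52, 63, 70],
    "bmi":          [24, 31, 27, 33, 22, 29, 35, 25, 28, 30],
    "male":         [1, 0, 1, 1, 0, 1, 0, 1, 0, 1],
    "transfusion":  [0, 1, 0, 1, 0, 0, 1, 0, 0, 1],  # 1-2 erythrocyte concentrates
    "complication": [0, 0, 0, 1, 0, 1, 1, 0, 0, 1],  # surgical complication present
})

fit = smf.ols("los_days ~ age + bmi + male + transfusion + complication",
              data=df).fit()
print(fit.summary())  # each coefficient: days of LOS added per unit of predictor
```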
Abstract:
PURPOSES Geriatric problems frequently go undetected in older patients in emergency departments (EDs), increasing their risk of adverse outcomes. We evaluated a novel emergency geriatric screening (EGS) tool designed to detect geriatric problems. BASIC PROCEDURES The EGS tool consisted of short validated instruments used to screen 4 domains (cognition, falls, mobility, and activities of daily living). Emergency geriatric screening was introduced for ED patients 75 years or older throughout a 4-month period. We analyzed the prevalence of abnormal EGS findings and whether EGS increased the number of EGS-related diagnoses in the ED during the screening period, as compared with a preceding control period. MAIN FINDINGS Emergency geriatric screening was performed on 338 (42.5%) of 795 patients presenting during the screening period. Screening was unfeasible in 175 patients (22.0%) because of life-threatening conditions and was not performed in 282 (35.5%) for logistical reasons. Emergency geriatric screening took less than 5 minutes to perform in most (85.8%) cases. Among screened patients, 285 (84.3%) had at least 1 abnormal EGS finding. In 270 of these patients, at least 1 abnormal EGS finding did not result in a diagnosis in the ED and was reported for further workup in subsequent care. During screening, 142 patients (42.0%) had at least 1 diagnosis listed within the 4 EGS domains, significantly more than the 29.3% in the control period (odds ratio, 1.75; 95% confidence interval, 1.34-2.29; P<.001). Emergency geriatric screening predicted nursing home admission after the in-hospital stay (odds ratio for ≥3 vs <3 abnormal domains, 12.13; 95% confidence interval, 2.79-52.72; P=.001). PRINCIPAL CONCLUSIONS The novel EGS tool is feasible, identifies previously undetected geriatric problems, and predicts determinants of subsequent care.
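The reported odds ratio of 1.75 can be reproduced from a 2x2 table. A brief sketch (Python, statsmodels); the screening-period counts (142 of 338) are from the abstract, while the control-period counts are placeholders chosen only to match the reported 29.3%:

```python
# Odds ratio with 95% CI from a 2x2 table. Screening-period counts come from
# the abstract; control-period counts below are illustrative placeholders.
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

table = np.array([
    [142, 196],   # screening period: diagnosis yes / no (338 patients)
    [120, 290],   # control period: placeholder counts (~29.3% with diagnosis)
])
t22 = Table2x2(table)
print("OR =", round(t22.oddsratio, 2))                        # ~1.75
print("95% CI:", [round(x, 2) for x in t22.oddsratio_confint()])
```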
Abstract:
OBJECTIVE Blood-borne biomarkers reflecting atherosclerotic plaque burden have great potential to improve the clinical management of atherosclerotic coronary artery disease and acute coronary syndrome (ACS). APPROACH AND RESULTS Using data integration from gene expression profiling of coronary thrombi versus peripheral blood mononuclear cells and proteomic analysis of atherosclerotic plaque-derived secretomes versus healthy tissue secretomes, we identified fatty acid-binding protein 4 (FABP4) as a biomarker candidate for coronary artery disease. Its diagnostic and prognostic performance was validated in 3 different clinical settings: (1) in a cross-sectional cohort of patients with stable coronary artery disease, patients with ACS, and healthy individuals (n=820); (2) in a nested case-control cohort of patients with ACS with 30-day follow-up (n=200); and (3) in a population-based nested case-control cohort of asymptomatic individuals with 5-year follow-up (n=414). Circulating FABP4 was marginally higher in patients with ST-segment-elevation myocardial infarction (24.9 ng/mL) compared with controls (23.4 ng/mL; P=0.01). However, elevated FABP4 was associated with adverse secondary cerebrovascular or cardiovascular events during 30-day follow-up after index ACS, independent of age, sex, renal function, and body mass index (odds ratio, 1.7; 95% confidence interval, 1.1-2.5; P=0.02). Circulating FABP4 predicted adverse events with prognostic performance similar to that of the GRACE in-hospital risk score or N-terminal pro-brain natriuretic peptide. Finally, no significant difference in baseline FABP4 was found between asymptomatic individuals with and without coronary events during 5-year follow-up. CONCLUSIONS Circulating FABP4 may prove useful as a prognostic biomarker in risk stratification of patients with ACS.
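The abstract does not state which metric was used to compare prognostic performance; ROC AUC is one common choice. A purely illustrative sketch with simulated values:

```python
# Purely illustrative: comparing prognostic performance via ROC AUC.
# Values are simulated; the abstract reports neither raw data nor the metric.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
events = rng.integers(0, 2, size=200)              # 30-day adverse event yes/no
fabp4 = 20 + 5 * events + rng.normal(0, 4, 200)    # ng/mL, shifted up with events
risk_score = 100 + 30 * events + rng.normal(0, 25, 200)  # GRACE-like score

print("AUC, FABP4:     ", round(roc_auc_score(events, fabp4), 2))
print("AUC, risk score:", round(roc_auc_score(events, risk_score), 2))
```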
Abstract:
BACKGROUND Bacterial meningitis (BM) is an infectious disease that results in high mortality and morbidity. Despite efficacious antibiotic therapy, neurological sequelae are often observed after the disease. Currently, the main challenge in BM treatment is to develop adjuvant therapies that reduce the occurrence of sequelae. In recent papers published by our group, we described associations between BM and the single nucleotide polymorphisms (SNPs) AADAT +401C > T, APEX1 Asn148Glu, OGG1 Ser326Cys and PARP1 Val762Ala. In this study, we analyzed the associations between the SNPs TNF -308G > A, TNF -857C > T and IL-8 -251A > T and BM, and investigated gene-gene interactions including the SNPs that we published previously. METHODS The study was conducted with 54 BM patients and 110 healthy volunteers (as the control group). The genotypes were investigated via primer-introduced restriction analysis-polymerase chain reaction (PIRA-PCR) or polymerase chain reaction-based restriction fragment length polymorphism (PCR-RFLP) analysis. Allelic and genotypic frequencies were also associated with cytokine and chemokine levels, measured with the x-MAP method, and with cell counts. We analyzed gene-gene interactions among SNPs using the generalized multifactor dimensionality reduction (GMDR) method. RESULTS We did not find a significant association between the SNPs TNF -857C > T and IL-8 -251A > T and the disease. However, a higher frequency of the variant allele TNF -308A was observed in the control group, associated with changes in cytokine levels compared with individuals with wild-type genotypes, suggesting a possible protective role. In addition, combined inter-gene interaction analysis indicated a significant association between certain genotypes and BM, mainly involving the alleles APEX1 148Glu, IL8 -251T and AADAT +401T. These genotypic combinations were shown to affect cyto-/chemokine levels and cell counts in CSF samples from BM patients. CONCLUSIONS This study revealed a significant association between genetic variability and altered inflammatory responses involving important pathways that are activated during BM. This knowledge may be useful for a better understanding of BM pathogenesis and for the development of new therapeutic approaches.
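As a hedged illustration of the allele-frequency association testing described (not the study's actual counts), a chi-square test on a 2x2 allele table might look like this:

```python
# Hedged sketch of an allele-frequency association test (cases vs controls).
# 54 patients and 110 controls give 108 and 220 alleles; the allele counts
# below are placeholders, not the study's data.
from scipy.stats import chi2_contingency

#         TNF -308A  TNF -308G
table = [[14, 94],    # BM patients (108 alleles)
         [52, 168]]   # controls (220 alleles)
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```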
Abstract:
OBJECTIVES The aim of this study was to assess the safety of the concurrent administration of clopidogrel and prasugrel loading doses in patients undergoing primary percutaneous coronary intervention. BACKGROUND Prasugrel is one of the preferred P2Y12 platelet receptor antagonists for ST-segment elevation myocardial infarction (STEMI) patients. The use of prasugrel was evaluated clinically in clopidogrel-naive patients. METHODS Between September 2009 and October 2012, a total of 2,023 STEMI patients were enrolled in the COMFORTABLE (Comparison of Biomatrix Versus Gazelle in ST-Elevation Myocardial Infarction [STEMI]) and SPUM-ACS (Inflammation and Acute Coronary Syndromes) studies. Patients receiving a prasugrel loading dose were divided into 2 groups: 1) those given clopidogrel and a subsequent prasugrel loading dose, and 2) those given a prasugrel loading dose alone. The primary safety endpoint was Bleeding Academic Research Consortium types 3 to 5 bleeding in hospital at 30 days. RESULTS Of the 2,023 patients undergoing primary percutaneous coronary intervention, 427 (21.1%) received clopidogrel and a subsequent prasugrel loading dose, 447 (22.1%) received a prasugrel loading dose alone, and the remainder received clopidogrel only. At 30 days, the primary safety endpoint was observed in 1.9% of those receiving clopidogrel and a subsequent prasugrel loading dose and in 3.4% of those receiving a prasugrel loading dose alone (adjusted hazard ratio [HR]: 0.57; 95% confidence interval [CI]: 0.25 to 1.30; p = 0.18). The HAS-BLED (hypertension, abnormal renal/liver function, stroke, bleeding history or predisposition, labile international normalized ratio, elderly, drugs/alcohol concomitantly) bleeding score tended to be higher in prasugrel-treated patients (p = 0.076). The primary safety endpoint results, however, remained unchanged after adjustment for these differences (clopidogrel and a subsequent prasugrel loading dose vs. prasugrel only: HR: 0.54 [95% CI: 0.23 to 1.27]; p = 0.16). No differences in the composite of cardiac death, myocardial infarction, or stroke were observed at 30 days (adjusted HR: 0.66; 95% CI: 0.27 to 1.62; p = 0.36). CONCLUSIONS This observational, nonrandomized study of STEMI patients suggests that the administration of a loading dose of prasugrel in patients pre-treated with a loading dose of clopidogrel is not associated with an excess of major bleeding events. (Comparison of Biomatrix Versus Gazelle in ST-Elevation Myocardial Infarction [STEMI] [COMFORTABLE]; NCT00962416; and Inflammation and Acute Coronary Syndromes [SPUM-ACS]; NCT01000701).
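The adjusted hazard ratios reported were presumably derived from a proportional hazards model; the abstract does not name the software. A minimal sketch with simulated data, using lifelines' CoxPHFitter and invented covariates:

```python
# Hedged sketch of an adjusted hazard-ratio analysis of 30-day bleeding.
# Data are simulated and covariates invented; lifelines is one possible tool.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
dual = rng.integers(0, 2, n)                    # clopidogrel + prasugrel loading
age = rng.normal(62, 10, n)
t = rng.exponential(60 * (1 + 0.5 * dual), n)   # longer time-to-bleed with dual
bleed = (t <= 30).astype(int)                   # BARC 3-5 bleeding within 30 days

df = pd.DataFrame({"time": np.minimum(t, 30.0), "bleed": bleed,
                   "dual_loading": dual, "age": age})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="bleed")
cph.print_summary()  # exp(coef) column holds the adjusted hazard ratios
```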
Abstract:
BACKGROUND Living at higher altitude has been dose-dependently associated with a lower risk of ischaemic heart disease (IHD). Higher altitudes have different climatic, topographic and built environment properties than lowland regions. It is unclear whether these environmental factors mediate or confound the association between altitude and IHD. We examined how much of the altitude-IHD association is explained by variations in exposure at place of residence to sunshine, temperature, precipitation, aspect, slope and distance to main roads. METHODS We included 4.2 million individuals aged 40-84 years at baseline living in Switzerland at altitudes of 195-2971 m above sea level (i.e., the full range of residence), providing 77,127 IHD deaths. Mortality data for 2000-2008, sociodemographic/economic information and coordinates of residence were obtained from the Swiss National Cohort, a longitudinal, census-based record linkage study. Environmental information was modelled to residence level, and associations with IHD mortality were estimated using Weibull regression models. RESULTS In the model not adjusted for other environmental factors, IHD mortality decreased linearly with increasing altitude, resulting in a lower risk (HR 0.67, 95% CI 0.60 to 0.74) for those living at >1500 m (vs <600 m). This association remained after adjustment for all other environmental factors (HR 0.74, 95% CI 0.66 to 0.82). CONCLUSIONS The benefit of living at higher altitude was only partially confounded by variations in climate, topography and built environment. Rather, physical environment factors appear to have an independent effect and may affect cardiovascular health in a cumulative way. Inclusion of additional modifiable factors, as well as individual information on traditional IHD risk factors, in our combined environmental model could help to identify strategies for the reduction of inequalities in IHD mortality.
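As a sketch of a Weibull survival regression with an altitude covariate, in the spirit of the models described (simulated data, hypothetical variable names):

```python
# Sketch of a Weibull survival regression with an altitude covariate.
# Simulated data and hypothetical names; not the Swiss National Cohort model.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(2)
n = 1000
high_altitude = rng.integers(0, 2, n)              # residence >1500 m vs <600 m
t = rng.weibull(1.2, n) * (8 + 2 * high_altitude)  # longer survival at altitude
ihd_death = (t <= 9).astype(int)                   # death within ~9-year window

df = pd.DataFrame({"years": np.minimum(t, 9.0), "ihd_death": ihd_death,
                   "high_altitude": high_altitude})
aft = WeibullAFTFitter()
aft.fit(df, duration_col="years", event_col="ihd_death")
aft.print_summary()  # positive lambda_ coefficient => longer time to IHD death
```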
Abstract:
The long-term risk associated with different coronary artery disease (CAD) presentations in women undergoing percutaneous coronary intervention (PCI) with drug-eluting stents (DES) is poorly characterized. We pooled patient-level data for women enrolled in 26 randomized clinical trials. Of the 11,577 women included in the pooled database, 10,133 with known clinical presentation received a DES. Of them, 5,760 (57%) presented with stable angina pectoris (SAP), 3,594 (35%) with unstable angina pectoris (UAP) or non-ST-segment-elevation myocardial infarction (NSTEMI), and 779 (8%) with ST-segment-elevation myocardial infarction (STEMI). A stepwise increase in 3-year crude cumulative mortality was observed in the transition from SAP to STEMI (4.9% vs 6.1% vs 9.4%; p <0.01). Conversely, no differences in crude mortality rates were observed between 1 and 3 years across clinical presentations. After multivariable adjustment, STEMI was independently associated with a greater risk of 3-year mortality (hazard ratio [HR] 3.45; 95% confidence interval [CI] 1.99 to 5.98; p <0.01), whereas no difference was observed between UAP or NSTEMI and SAP (HR 0.99; 95% CI 0.73 to 1.34; p = 0.94). In women with acute coronary syndromes (ACS), use of new-generation DES was associated with a reduced risk of major adverse cardiac events (HR 0.58; 95% CI 0.34 to 0.98). The magnitude and direction of the effect of new-generation DES was uniform between women with and without ACS (p for interaction = 0.66). In conclusion, in women across the clinical spectrum of CAD, STEMI was associated with a greater risk of long-term mortality, whereas the adjusted risk of mortality was similar between UAP or NSTEMI and SAP. New-generation DES provide improved long-term clinical outcomes irrespective of clinical presentation in women.
Abstract:
Pleural infection is a frequent clinical condition. Prompt treatment has been shown to reduce hospital costs, morbidity and mortality. Recent advances in treatment have been variably implemented in clinical practice. This statement reviews the latest developments and concepts to improve clinical management and stimulate further research. The European Association for Cardio-Thoracic Surgery (EACTS) Thoracic Domain and the EACTS Pleural Diseases Working Group established a team of thoracic surgeons to produce a comprehensive review of the available scientific evidence, with the aim of covering all aspects of surgical practice related to the treatment of pleural infection, in particular: surgical treatment of empyema in adults; surgical treatment of empyema in children; and surgical treatment of post-pneumonectomy empyema (PPE). In the management of Stage 1 empyema, prompt chest tube drainage of the pleural space is required. In patients with Stage 2 or 3 empyema who are fit enough to undergo an operative procedure, there is a demonstrated benefit of surgical debridement or decortication [possibly by video-assisted thoracoscopic surgery (VATS)] over tube thoracostomy alone in terms of treatment success and reduction in hospital stay. In children, a primary operative approach is an effective management strategy, associated with a lower mortality rate and reductions in tube thoracostomy duration, length of antibiotic therapy, reintervention rate and hospital stay. Intrapleural fibrinolytic therapy is a reasonable alternative to primary operative management. Uncomplicated PPE [without bronchopleural fistula (BPF)] can be effectively managed with minimally invasive techniques, including fenestration, pleural space irrigation and VATS debridement. PPE associated with BPF can be effectively managed with individualized open surgical techniques, including direct repair and myoplastic and thoracoplastic techniques. Intrathoracic vacuum-assisted closure may be considered as an adjunct to standard treatment. The current literature cements the role of VATS in the management of pleural empyema, although the choice of surgical approach still relies on the individual surgeon's preference.
Abstract:
BACKGROUND There has been little research on bathroom accidents. It is unknown whether the shower or bathtub is connected with particular dangers in different age groups, or whether there are specific risk factors for adverse outcomes. METHODS This cross-sectional analysis included all direct admissions to the Emergency Department at the Inselspital Bern, Switzerland, from 1 January 2000 to 28 February 2014 after accidents associated with the bathtub or shower. Time, age, location, mechanism and diagnosis were assessed, and specific risk factors were examined. Patient groups with and without intracranial bleeding were compared with the Mann-Whitney U test. The association of risk factors with intracranial bleeding was investigated using univariate analysis with Fisher's exact test or logistic regression. The effects of different variables on cerebral bleeding were analysed by multivariate logistic regression. RESULTS A total of 280 patients with accidents associated with the bathtub or shower were included in our study. Direct trauma from hitting an object occurred in 235 patients (83.9%), and traumatic brain injury (TBI) was detected in 28 patients (10%). Eight (29.6%) of the 27 patients with mild traumatic brain injury (GCS 13-15) exhibited intracranial haemorrhage. All patients with intracranial haemorrhage were older than 48 years and needed in-hospital treatment. Patients with intracranial haemorrhage were significantly older and had higher haemoglobin levels than the control group with TBI but without intracranial bleeding (p<0.05 for both). In univariate analysis, intracranial haemorrhage in patients with TBI was associated with direct trauma in general and with age (both p<0.05), but not with the mechanism of the fall, its location (shower or bathtub) or the gender of the patient. Multivariate logistic regression analysis identified only age as a risk factor for cerebral bleeding (p<0.05; OR 1.09, 95% CI 1.01 to 1.17). CONCLUSION In patients admitted to the ED after accidents associated with the bathtub or shower, direct trauma and age are risk factors for intracranial haemorrhage. Additional effort in prevention should be considered, especially in the elderly.
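The multivariable logistic regression reported an OR of 1.09 per year of age. A hedged sketch of that kind of model (simulated data; variable names are illustrative):

```python
# Hedged sketch of a logistic regression with age as a predictor of
# intracranial bleeding (reported OR 1.09 per year). Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 200
age = rng.uniform(20, 90, n)
direct_trauma = rng.integers(0, 2, n)
log_odds = -7 + 0.09 * age + 0.3 * direct_trauma   # exp(0.09) ~ OR 1.09 per year
bleed = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

df = pd.DataFrame({"bleed": bleed, "age": age, "direct_trauma": direct_trauma})
fit = smf.logit("bleed ~ age + direct_trauma", data=df).fit()
print(np.exp(fit.params))      # odds ratios per covariate
print(np.exp(fit.conf_int()))  # 95% confidence intervals on the OR scale
```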
Abstract:
The number of days spent in acute hospitals (DAH) at the end of life is regarded as an important care quality indicator for cancer patients. We analysed DAH during the 90 days prior to death in patients from four Swiss cantons. Claims data from an insurance provider with about 20% market share, combined with patient record review, identified 2086 patients as dying of cancer. We calculated total DAH per patient. Multivariable generalised linear modelling served to evaluate potential explanatory variables. Mean DAH was 26 days. In the multivariable model, use of complementary and alternative medicine (DAH = 33.9; +8.8 days compared with non-users) and canton of residence (for patients receiving anti-cancer therapy, Zürich DAH = 22.8 versus Basel DAH = 31.4; for other patients, Valais DAH = 22.7 versus Ticino DAH = 33.7) had the strongest influence. Age at death and days spent in other institutions were additional significant predictors. DAH during the last 90 days of life of cancer patients from four Swiss cantons is high compared with most other countries. Several factors influence DAH. The resulting differences are likely to have financial impact, as DAH is a major cost driver in end-of-life care. Whether they are supply- or demand-driven, and whether patients would prefer fewer days in hospital, remains to be established.
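The abstract does not specify the GLM family; for a count outcome such as DAH, a Poisson model is one plausible choice. A minimal sketch with invented data:

```python
# Sketch of a GLM for DAH; Poisson family is an assumption (the abstract does
# not specify the family), and all data below are invented.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "dah":    [26, 34, 22, 31, 23, 33, 28, 18, 35, 24, 30, 20],
    "cam":    [0, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0],   # complementary/alt. medicine
    "age":    [71, 64, 80, 58, 75, 69, 62, 83, 66, 77, 60, 72],
    "canton": ["ZH", "BS", "ZH", "BS", "VS", "TI",
               "ZH", "VS", "TI", "BS", "TI", "VS"],
})
fit = smf.glm("dah ~ cam + age + C(canton)", data=df,
              family=sm.families.Poisson()).fit()
print(fit.summary())  # exp(coef) is the multiplicative effect on mean DAH
```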
Abstract:
Changes in species composition in two 4-ha plots of lowland dipterocarp rainforest at Danum, Sabah, were measured over ten years (1986 to 1996) for trees greater than or equal to 10 cm girth at breast height (gbh). Each plot included a lower-slope to ridge gradient. The period lay between two drought events of moderate intensity, but the forest showed no large lasting responses, suggesting that its species were well adapted to this regime. Mortality and recruitment rates were not unusual in global or regional comparisons. The forest continued to aggrade from its relatively (for Sabah) low basal area in 1986 and, together with the very open upper canopy structure and an abundance of lianas, this suggests a forest in a late stage of recovery from a major disturbance, yet one continually affected by smaller recent setbacks. Mortality and recruitment rates were not related to population size in 1986, but across subplots recruitment was positively correlated with the density and basal area of small trees (10 to <50 cm gbh) forming the dense understorey. Neither rate was related to topography. While species with larger mean gbh had greater relative growth rates (rgr) than smaller ones, subplot mean recruitment rates were correlated with rgr among small trees. Separating understorey species (typically the Euphorbiaceae) from the overstorey (Dipterocarpaceae) showed marked differences in the change in mortality with increasing gbh: in the former it increased, in the latter it decreased. Forest processes are centred on this understorey quasi-stratum. The two replicate plots showed a high correspondence in the mortality, recruitment, population changes and growth rates of small trees for the 49 most abundant species common to both. Overstorey species had higher rgrs than understorey ones, but both showed considerable ranges in mortality and recruitment rates. The supposed trade-off in traits, viz. slower rgr, shade tolerance and lower population turnover in the understorey group versus faster potential growth rate, high light responsiveness and high turnover in the overstorey group, was only partly met, as some understorey species were also very dynamic. The forest at Danum, under such a disturbance-recovery regime, can be viewed as having a dynamic equilibrium in functional and structural terms. A second trade-off, between shade tolerance and drought tolerance, is suggested among the understorey species. A two-storey (or vertical component) model is proposed in which the understorey-overstorey species' ratio of small stems (currently 2:1) is maintained by a major feedback process. The understorey appears to be an important part of this forest, giving resilience against drought and protecting the overstorey saplings in the long term. This view could be valuable for understanding forest responses to climate change, as drought frequency in Borneo is predicted to intensify in the coming decades.
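The abstract does not define its mortality and recruitment rates; a standard annualised (exponential) formulation used in forest-dynamics censuses is sketched below, with hypothetical stem counts for a single subplot:

```python
# Standard annualised (exponential) rates common in forest-dynamics censuses;
# the paper's exact formulation is not given in the abstract, and the counts
# below are hypothetical.
import math

def annual_mortality(n0: int, survivors: int, years: float) -> float:
    """Exponential annual mortality rate, % per year (n0 = initial stems)."""
    return 100 * (math.log(n0) - math.log(survivors)) / years

def annual_recruitment(nt: int, survivors: int, years: float) -> float:
    """Exponential annual recruitment rate, % per year (nt = final stems)."""
    return 100 * (math.log(nt) - math.log(survivors)) / years

# Hypothetical subplot: 500 stems in 1986; 410 survive to 1996; 470 total in 1996.
print(round(annual_mortality(500, 410, 10), 2), "% per year mortality")
print(round(annual_recruitment(470, 410, 10), 2), "% per year recruitment")
```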
Abstract:
Malaria poses a significant public health problem worldwide. The World Health Organization indicates that approximately 40% of the world's population, and almost 85% of the population of the South-East Asian region, is at risk of contracting malaria. India, the most populous country in the region, contributes the highest number of malaria cases and deaths attributed to malaria, and Orissa is the Indian state with the highest number of malaria cases and deaths. A secondary data analysis was carried out to evaluate the effectiveness of the World Bank-assisted Malaria Action Program in the state of Orissa under the health sector reforms of 1995-96. The secondary analysis used the Government of India's National Anti-Malaria Management Information System (NAMMIS) surveillance data and the National Family Health Survey (NFHS-I and NFHS-II) datasets to compare malaria mortality and morbidity in the state between 1992-93 and 1998-99. Results revealed no effect of the intervention, indicating a 2.18-fold increase in malaria mortality and a 1.53-fold increase in malaria morbidity in the state between 1992-93 and 1998-99. The difference in age-adjusted malaria morbidity in the state between 1992-93 and 1998-99 was highly significant (t = 4.29, df = 16, p < 0.0005), whereas the difference in the increase of age-adjusted malaria morbidity between Orissa (with intervention) and Bihar (no intervention) over the same period was non-significant (t = 0.0471, df = 16, p < 0.50). Factors such as underutilization of World Bank funds for the malaria control program, inadequate health care infrastructure, structural adjustment problems, poor management, poor financial management, parasite resistance to anti-malarial drugs, inadequate supply of drugs and staff shortages may have contributed to the failure of the program in the state.
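The reported df = 16 is consistent with comparing two sets of nine stratum-level rates. A sketch of such a t-test (the rates below are invented placeholders, not NAMMIS/NFHS data):

```python
# Sketch of a two-sample t-test on age-stratum morbidity rates; df = 16 fits
# two groups of nine strata. All rates below are invented placeholders.
from scipy import stats

rates_1992_93 = [4.1, 5.0, 6.2, 5.8, 4.9, 5.5, 6.0, 4.4, 5.1]  # per 1000
rates_1998_99 = [6.8, 7.9, 9.5, 8.8, 7.6, 8.4, 9.1, 7.0, 7.7]

t, p = stats.ttest_ind(rates_1998_99, rates_1992_93)
df = len(rates_1992_93) + len(rates_1998_99) - 2
print(f"t = {t:.2f}, df = {df}, p = {p:.5f}")
```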
Abstract:
Introduction. Injury mortality was classically described as having a trimodal distribution, with immediate deaths at the scene, early deaths due to hemorrhage, and late deaths from organ failure. We hypothesized that trauma systems development has improved pre-hospital care, early resuscitation, and critical care, and altered this pattern. Methods. This is a population-based study of all trauma deaths in an urban county with a mature trauma system (n=678, median age 33 years, 81% male, 43% gunshot, 20% motor vehicle crashes). Deaths were classified as immediate (scene), early (in hospital, ≤4 hours from injury), or late (>4 hours post injury). Multinomial regression was used to identify independent predictors of immediate and early vs. late deaths, adjusted for age, gender, race, intention, mechanism, toxicology and cause of death. Results. There were 416 (61%) immediate, 199 (29%) early, and 63 (10%) late deaths. Immediate deaths remained unchanged, and early deaths occurred much earlier (median 52 vs. 120 minutes). However, unlike the classic trimodal distribution, there was no late peak. Intentional injuries, alcohol intoxication, asphyxia, and injuries to the head and chest were independent predictors of immediate deaths. Alcohol intoxication and injuries to the chest were predictors of early deaths, while pelvic fractures and blunt assaults were associated with late deaths. Conclusion. Trauma deaths now have a bimodal distribution. Elimination of the late peak likely reflects advancements in resuscitation and critical care that have reduced organ failure. Further reductions in mortality will likely come from prevention of intentional injuries and injuries associated with alcohol intoxication.
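As a hedged sketch of the multinomial regression described (immediate and early vs. late deaths), with simulated data and illustrative covariates only:

```python
# Hedged sketch of a multinomial model of death timing; simulated data and
# illustrative covariates, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 400
df = pd.DataFrame({
    "age": rng.normal(33, 12, n),
    "alcohol": rng.integers(0, 2, n),
    "chest_injury": rng.integers(0, 2, n),
})
# 0 = immediate, 1 = early, 2 = late (0 is the base category)
df["timing"] = rng.choice([0, 1, 2], size=n, p=[0.6, 0.3, 0.1])

fit = smf.mnlogit("timing ~ age + alcohol + chest_injury", data=df).fit()
print(fit.summary())  # one coefficient set per outcome vs the base category
```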
Abstract:
The relationship between change in myocardial infarction (MI) mortality rate (ICD codes 410, 411) and change in use of percutaneous transluminal coronary angioplasty (PTCA), adjusted for change in hospitalization rates for MI and for change in use of aortocoronary bypass surgery (ACBS), from 1985 through 1990 at private hospitals was examined in the biethnic community of Nueces County, Texas, site of the Corpus Christi Heart Project, a major coronary heart disease (CHD) surveillance program. Age-adjusted rates (per 100,000 persons) were calculated for each of these CHD events for the population aged 25 through 74 years and for each of the four major sex-ethnic groups: Mexican-American and Non-Hispanic White women and men. Over this six-year period, there were 541 MI deaths, 2358 MI hospitalizations, 816 PTCA hospitalizations, and 920 ACBS hospitalizations among Mexican-American and Non-Hispanic White Nueces County residents. Acute MI mortality decreased from 24.7 in the first quarter of 1985 to 12.1 in the fourth quarter of 1990, a 51.2% decrease. All three hospitalization rates increased: the MI hospitalization rate rose from 44.1 to 61.3 (a 38.9% increase), PTCA use from 7.1 to 23.2 (a 228.0% increase), and ACBS use from 18.8 to 29.5 (a 56.6% increase). In linear regression analyses, the change in MI mortality rate was negatively associated with the change in PTCA use (β = −0.266 ± 0.103, p = 0.017) but was not associated with the changes in MI hospitalization rate or in ACBS use. The results of this ecologic research support the idea that the increasing use of PTCA, but not ACBS, has been associated with decreases in MI mortality. The contrast in associations between these two revascularization procedures and MI mortality highlights the need for research aimed at clarifying the proper roles of these procedures in the treatment of patients with CHD. The association between change in PTCA use and change in MI mortality supports the idea that some changes in medical treatment may be partially responsible for trends in CHD mortality. Differences in the use of therapies such as PTCA may be related to differences between geographical sites in CHD rates and trends.
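The ecologic linear regression can be sketched as quarterly MI mortality regressed on quarterly PTCA use; the 24 quarterly series below are simulated to mimic the reported endpoints, not the study's data:

```python
# Sketch of the ecologic regression: quarterly MI mortality rate regressed on
# quarterly PTCA use, 1985Q1-1990Q4. Values are simulated to mimic the
# reported endpoints (mortality 24.7 -> 12.1, PTCA 7.1 -> 23.2).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
ptca = np.linspace(7.1, 23.2, 24) + rng.normal(0, 1.0, 24)
mi_mortality = 26 - 0.27 * ptca + rng.normal(0, 1.5, 24)

fit = sm.OLS(mi_mortality, sm.add_constant(ptca)).fit()
print(f"beta = {fit.params[1]:.3f} +/- {fit.bse[1]:.3f}, "
      f"p = {fit.pvalues[1]:.4f}")
```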
Abstract:
Coronary perfusion with thrombolytic therapy and selective reperfusion by percutaneous transluminal coronary angioplasty (PTCA) were examined in the Corpus Christi Heart Project, a population-based surveillance program for hospitalized acute myocardial infarction (MI) patients in a biethnic community of Mexican-Americans (MAs) and non-Hispanic whites (NHWs). Results were based on 250 (12.4%) patients who received thrombolytic therapy in a cohort of 2011 acute MI cases. Of these, 107 (42.8%) underwent PTCA, with a mean follow-up of 25 months. There were 186 (74.4%) men and 64 (25.6%) women; 148 (59.2%) were NHWs and 86 (34.4%) were MAs. Thrombolysis and PTCA were performed less frequently in women than in men, and less frequently in MAs than in NHWs. According to the coronary reperfusion interventions used, patients were divided into two groups: those who received no PTCA (57.2%) and those who underwent PTCA (42.8%) after thrombolysis. The case-fatality rate was higher in no-PTCA patients than in PTCA patients (7.7% versus 5.6%), as was mortality at one year (16.2% versus 10.5%). Reperfusion was successful in 48.0% of the entire cohort (51.4% in the PTCA group versus 45.6% in the no-PTCA group). Mortality among patients with successful reperfusion was 5.0%, compared with 22.3% in the unsuccessful reperfusion group (p = 0.00016, 95% CI: 1.98-11.6). Cardiac catheterization was performed in 86.4% of thrombolysis patients. Severe stenosis (>75%) was most commonly present in the left anterior descending artery (52.8%) and the right coronary artery (52.8%). The occurrence of adverse in-hospital clinical events was higher in the no-PTCA group than in the PTCA and catheterized patients, with the exception of reperfusion arrhythmias (p = 0.140; Fisher's exact test p = 0.129). Cox regression analysis was used to study the relationship between selected variables and mortality. Apart from successful reperfusion, age group (p = 0.028, 95% CI: 2.1-12.42), site of the index acute MI (p = 0.050) and ejection fraction (p = 0.052) were predictors of long-term survival. The ejection fraction was higher in the PTCA group than in the no-PTCA group (median 78% versus 53%). Assessed by logistic regression analysis, history of high cholesterol (>200 mg/dl) and diabetes mellitus had significant prognostic value (p = 0.0233 and p = 0.0318, respectively) for long-term survival, irrespective of treatment status. In conclusion, the results of this study support the idea that the use of PTCA as a selective intervention following thrombolysis improves survival of patients with acute MI, and its use in this setting appears to be safe. However, we cannot exclude the possibility that some of these results occurred due to the exclusion of high-risk patients from PTCA (selection bias).