17 results for Acute risk
in University of Queensland eSpace - Australia
Abstract:
In patients hospitalised with acute coronary syndromes (ACS) and congestive heart failure (CHF), evidence suggests opportunities for improving in-hospital and after-hospital care, patient self-care, and hospital-community integration. A multidisciplinary quality improvement program was designed and initiated in Brisbane in October 2000, involving 250 clinicians at three teaching hospitals, 1080 general practitioners (GPs) from five Divisions of General Practice, 1594 patients with ACS and 904 patients with CHF. Quality improvement interventions were implemented over 17 months after a 6-month baseline period and included: clinical decision support (clinical practice guidelines, reminders, checklists, clinical pathways); educational interventions (seminars, academic detailing); regular performance feedback; patient self-management strategies; and hospital-community integration (discharge referral summaries; community pharmacist liaison; patient prompts to attend GPs). Using a before-after study design to assess program impact, significantly more program patients compared with historical controls received: ACS: Angiotensin-converting enzyme (ACE) inhibitors and lipid-lowering agents at discharge, aspirin and beta-blockers at 3 months after discharge, inpatient cardiac counselling, and referral to outpatient cardiac rehabilitation. CHF: Assessment for reversible precipitants, use of prophylaxis for deep-venous thrombosis, beta-blockers at discharge, ACE inhibitors at 6 months after discharge, imaging of left ventricular function, and optimal management of blood pressure levels. Risk-adjusted mortality rates at 6 and 12 months decreased, respectively, from 9.8% to 7.4% (P = 0.06) and from 13.4% to 10.1% (P = 0.06) for patients with ACS, and from 22.8% to 15.2% (P < 0.001) and from 32.8% to 22.4% (P = 0.005) for patients with CHF.
Quality improvement programs that feature multifaceted interventions across the continuum of care can change clinical culture, optimise care and improve clinical outcomes.
Abstract:
Background: Metformin is commonly prescribed to treat type 2 diabetes mellitus; however, it is associated with the potentially lethal condition of lactic acidosis. Prescribing guidelines have been developed to minimize the risk of lactic acidosis, although some suggest they are inappropriate and have created confusion amongst prescribers. The aim of this study was to investigate whether metformin dose was influenced by the presence of risk factors for lactic acidosis. Methods: The study was prospective, and retrieved information from patients admitted to hospital who were prescribed metformin at their time of admission. Results: Eighty-three patients were included in the study, 60 of whom had at least one risk factor for lactic acidosis. Of those 60 patients, 78.3% had a dose adjustment, with renal impairment, hepatic impairment, surgery and use of radiological contrast media being the risk factors most likely to result in a dose adjustment. When dose adjustments did occur, metformin was withheld on 88.7% of occasions. Conclusion: Metformin dose was influenced by the presence of risk factors for lactic acidosis, although the adjustment depended on the number and the particular risk factor(s) present.
Abstract:
Aims: The aim of this study was to quantify the relationship between acute alcohol consumption and injury type (nature of injury, body region injured), while adjusting for the effect of known confounders (i.e. demographic and situational variables, usual drinking patterns, substance use and risk-taking behaviour). Methods: A cross-sectional study was conducted between October 2000 and October 2001 of patients aged ≥ 15 years presenting to a Queensland Emergency Department for treatment of an injury sustained in the preceding 24 h. There were three measures of acute alcohol consumption: drinking setting, quantity, and beverage type consumed in the 6 h prior to injury. Two variables were used to quantify injury type: nature of injury (fracture/dislocation, superficial, internal, and CNS injury) and body part injured (head/neck, facial, chest, abdominal, external, and extremities). Both were derived from patient medical records. Results: Five hundred and ninety-three patients were interviewed. Logistic regression analyses indicated that, after controlling for relevant confounding variables, there was no significant association between any of the three measures of acute alcohol consumption and injury type. Conclusions: The effects of acute alcohol consumption are not specific to injury type. Interventions aimed at reducing the incidence of alcohol-related injury should not be targeted at specific injury types.
Abstract:
There is interest in the postulate that cyclosporine A (CsA) contributes to the elevated homocysteine levels seen in organ transplant recipients, as hyperhomocysteinemia is now considered an independent risk factor for cardiovascular disease (CVD) and may partially explain the increased prevalence of CVD in this population. The main purpose of this investigation was to determine the effect of CsA administration on plasma homocysteine. Eighteen female Sprague Dawley rats (4 months old) were randomly assigned to either a treatment or a control group. For 18 days the treatment group received CsA (25 mg/kg/d) while the control group received the same volume of the vehicle. Blood samples were obtained following sacrifice to measure CsA, total homocysteine, and plasma creatinine. There were no significant differences in plasma homocysteine (mean values ± SD: treatment = 4.79 ± 0.63 µmol/L, control = 4.46 ± 0.75 µmol/L; P = .37). Homocysteine was not significantly correlated with final CsA concentrations (r = .17; P = .69). There was a significant difference in plasma creatinine values between the two groups (treatment = 60.44 ± 7.68 µmol/L, control = 46.33 ± 1.66 µmol/L; P < .001). Furthermore, plasma homocysteine and creatinine were positively correlated in the treatment group (r = .73; P < .05) but not in the controls (r = -.10; P = .81). In conclusion, CsA does not influence plasma homocysteine concentrations in rats.
Abstract:
Objectives: To re-examine interhospital variation in 30 day survival after acute myocardial infarction (AMI) 10 years on, to see whether the appointment of new cardiologists and their involvement in emergency care has improved outcome after AMI. Design: Retrospective cohort study. Setting: Acute hospitals in Scotland. Participants: 61 484 patients with a first AMI over two time periods: 1988-1991 and 1998-2001. Main outcome measures: 30 day survival. Results: Between 1988 and 1991, median 30 day survival was 79.2% (interhospital range 72.1-85.1%). The difference between highest and lowest was 13.0 percentage points (age and sex adjusted, 12.1 percentage points). Between 1998 and 2001, median survival rose to 81.6% (and the range decreased to 78.0-85.6%), with a difference of 7.6 (adjusted 8.8) percentage points. Admission hospital was an independent predictor of outcome at 30 days during the two time periods (p < 0.001). Over the period 1988-1991, the odds ratio for death ranged, between hospitals, from 0.71 (95% confidence interval (CI) 0.58 to 0.88) to 1.50 (95% CI 1.19 to 1.89), and for the period 1998-2001 from 0.82 (95% CI 0.60 to 1.13) to 1.46 (95% CI 1.07 to 1.99). The adjusted risk of death was significantly higher than average in nine of 26 hospitals between 1988 and 1991 but in only two hospitals between 1998 and 2001. Conclusions: The average 30 day case fatality rate after admission with an AMI has fallen substantially over the past 10 years in Scotland. Between-hospital variation is also considerably less notable because of better survival in the previously poorly performing hospitals. This suggests that the greater involvement of cardiologists in the management of AMI has paid dividends.
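The per-hospital odds ratios above come from the study's own adjusted models; as a rough illustration of the underlying arithmetic only, an unadjusted odds ratio for death at one hospital versus all others, with a Woolf-type (log-scale) 95% confidence interval, can be sketched as follows. The counts are hypothetical, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table, with a Woolf (log-scale)
    confidence interval.
        a = deaths at the index hospital,  b = survivors there
        c = deaths at all other hospitals, d = survivors elsewhere
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: 120 of 600 admissions died at the index hospital,
# 2000 of 12000 died elsewhere.
or_, lo, hi = odds_ratio_ci(120, 480, 2000, 10000)  # OR = 1.25
```

The odds ratios reported in the abstract are additionally adjusted for age and sex, which requires a logistic regression rather than this raw 2x2 calculation.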
Abstract:
Objectives: Nitric oxide (NO) is critically important in the regulation of vascular tone and the inhibition of platelet aggregation. We have shown previously that patients with acute coronary syndromes (ACS) or stable angina pectoris have impaired platelet responses to NO donors when compared with normal subjects. We tested the hypotheses that platelet hyporesponsiveness to NO is a predictor of (1) cardiovascular readmission and/or death and (2) all-cause mortality in patients with ACS (unstable angina pectoris or non-Q-wave myocardial infarction). Methods and Results: Patients (n = 51) with ACS had evaluation of platelet aggregation within 24 hours of coronary care unit admission using impedance aggregometry. Patients were categorized as having normal (≥ 32% inhibition of ADP-induced aggregation with the NO donor sodium nitroprusside, 10 µmol/L; n = 18) or impaired (< 32% inhibition of ADP-induced aggregation; n = 33) NO responses. We then compared the incidence of cardiovascular readmission and death during a median of 7 years of follow-up in these 2 groups. Using a Cox proportional hazards model adjusting for age, sex, index event, postdischarge medical treatment, revascularization status, left ventricular systolic dysfunction, concurrent disease states, and cardiac risk factors, impaired NO responsiveness was associated with an increased risk of the combination of cardiovascular readmission and/or death (relative risk, 2.7; 95% CI, 1.03 to 7.10; P = 0.041) and of all-cause mortality (relative risk, 6.3; 95% CI, 1.09 to 36.7; P = 0.033). Conclusions: Impaired platelet NO responsiveness is a novel, independent predictor of increased mortality and cardiovascular morbidity in patients with high-risk ACS.
Abstract:
Objective: The purpose of this study was to determine whether injury mechanism among injured patients is differentially distributed as a function of acute alcohol consumption (quantity, type, and drinking setting). Method: A cross-sectional study was conducted between October 2000 and October 2001 in the Gold Coast Hospital Emergency Department, Queensland, Australia. Data were collected quarterly over a 12-month period. Every injured patient who presented to the emergency department during the study period for treatment of an injury sustained less than 24 hours prior to presentation was approached for interview. The final sample comprised 593 injured patients (377 males). Three measures of alcohol consumption in the 6 hours prior to injury were obtained from self-report: quantity, beverage type, and drinking setting. The main outcome measure was mechanism of injury, which was categorized into six groups: road traffic crash (RTC), being hit by or against something, fall, cut/piercing, overdose/poisoning, and miscellaneous. Injury intent was also measured (intentional vs unintentional). Results: After controlling for relevant confounding variables, neither quantity nor type of alcohol was significantly associated with injury mechanism. However, drinking setting (i.e., licensed premises) was significantly associated with increased odds of sustaining an intentional versus unintentional injury (odds ratio [OR] = 2.79, 95% confidence interval [CI] = 1.4-5.6); injury through being hit by/against something versus other injury types (OR = 2.59, 95% CI = 1.4-4.9); and reduced odds of sustaining an injury through RTC versus non-RTC (OR = 0.02, 95% CI = 0.004-0.9), compared with not drinking alcohol prior to injury.
Conclusions: No previous analytical studies have examined the relationship between injury mechanism and acute alcohol consumption (quantity, type, and setting) across all types of injury and all injury severities while controlling for potentially important confounders (demographic and situational confounders, risk-taking behavior, substance use, and usual drinking patterns). These data suggest that among injured patients, mechanism of injury is not differentially distributed as a function of quantity or type of acute alcohol consumption, but may be differentially distributed as a function of drinking setting (i.e., RTC, intentional injury, being hit). Therefore, prevention strategies that focus primarily on the quantity and type of alcohol consumed should be directed generically across injury mechanisms and not limited to campaigns targeting a particular cause of injury.
Abstract:
Background: We compared the cost-effectiveness of pravastatin in a placebo-controlled trial in 5500 younger (31-64 years) and 3514 older (65-74 years) patients with previous acute coronary syndromes. Methods: Hospitalizations and long-term medication within the 6 years of the trial were estimated in all patients. Drug dosage, nursing home, and ambulatory care costs were estimated from substudies. Incremental costs per life saved of pravastatin relative to placebo were estimated from treatment effects and resource use. Results: Over 6 years, pravastatin reduced all-cause mortality by 4.3% in the older patients and by 2.3% in the younger patients. Older patients assigned pravastatin had marginally lower costs of pravastatin and other medication over 6 years (A$4442 vs A$4637), but greater cost offsets (A$2061 vs A$897) from lower rates of hospitalization. The incremental cost per life saved with pravastatin was A$55500 in the older and A$167200 in the younger patients. Assuming no treatment effect beyond the study period, the life expectancy to age 82 years of additional survivors was 9.1 years in the older and 17.3 years in the younger patients. Estimated additional life-years saved from pravastatin therapy were 0.39 years for older and 0.40 years for younger patients. Incremental costs per life-year saved, discounted at 5% per annum, were A$7581 in the older and A$14944 in the younger patients. Conclusions: Pravastatin therapy was more cost-effective among older than younger patients because of their higher baseline risk and greater cost offsets, despite their shorter life expectancy.
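As a minimal sketch of the cost-effectiveness arithmetic this abstract reports, assuming standard end-of-year annual discounting (the numbers in the usage example are hypothetical, not the trial's):

```python
def discounted_life_years(years, rate=0.05):
    """Present value of one life-year accruing at the end of each of
    `years` whole years, discounted at `rate` per annum."""
    return sum(1 / (1 + rate) ** t for t in range(1, int(years) + 1))

def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost divided by
    extra health effect (e.g. discounted life-years gained)."""
    return delta_cost / delta_effect

# Hypothetical: A$2000 extra cost per patient buys 0.35 discounted
# life-years per patient.
cost_per_ly = icer(2000, 0.35)          # ~A$5714 per life-year saved
pv_9_years = discounted_life_years(9)   # ~7.11 discounted life-years
```

The trial's figures (A$7581 per life-year in the older group, A$14944 in the younger) follow the same logic, with the effect side derived from the observed mortality reductions and discounted survival gains.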
Abstract:
Background: Our aim was to calculate the global burden of disease and risk factors for 2001, to examine regional trends from 1990 to 2001, and to provide a starting point for the analysis of the Disease Control Priorities Project (DCPP). Methods: We calculated mortality, incidence, prevalence, and disability-adjusted life years (DALYs) for 136 diseases and injuries, for seven income/geographic country groups. To assess trends, we re-estimated all-cause mortality for 1990 with the same methods as for 2001. We estimated mortality and disease burden attributable to 19 risk factors. Findings: About 56 million people died in 2001. Of these, 10.6 million were children, 99% of whom lived in low- and middle-income countries. More than half of child deaths in 2001 were attributable to acute respiratory infections, measles, diarrhoea, malaria, and HIV/AIDS. The ten leading diseases for global disease burden were perinatal conditions, lower respiratory infections, ischaemic heart disease, cerebrovascular disease, HIV/AIDS, diarrhoeal diseases, unipolar major depression, malaria, chronic obstructive pulmonary disease, and tuberculosis. There was a 20% reduction in global disease burden per head due to communicable, maternal, perinatal, and nutritional conditions between 1990 and 2001. Almost half the disease burden in low- and middle-income countries is now from non-communicable diseases (disease burden per head in Sub-Saharan Africa and the low- and middle-income countries of Europe and Central Asia increased between 1990 and 2001). Undernutrition remains the leading risk factor for health loss. An estimated 45% of global mortality and 36% of global disease burden are attributable to the joint hazardous effects of the 19 risk factors studied. Uncertainty in all-cause mortality estimates ranged from around 1% in high-income countries to 15-20% in Sub-Saharan Africa. Uncertainty was larger for mortality from specific diseases, and for incidence and prevalence of non-fatal outcomes.
Interpretation: Despite uncertainties about mortality and burden of disease estimates, our findings suggest that substantial gains in health have been achieved in most populations, countered by the HIV/AIDS epidemic in Sub-Saharan Africa and setbacks in adult mortality in countries of the former Soviet Union. Our results on major disease, injury, and risk factor causes of loss of health, together with information on the cost-effectiveness of interventions, can assist in accelerating progress towards better health and reducing the persistent differentials in health between poor and rich countries.
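The DALY metric underpinning these estimates combines mortality and morbidity. In its simplest form (undiscounted and without the age-weighting used in some GBD revisions; all numbers below are illustrative, not from the study):

```python
def dalys(deaths, years_lost_per_death, cases, disability_weight, duration_years):
    """Simplified DALY = YLL + YLD.
    YLL (years of life lost) = deaths x standard life expectancy at age of death
    YLD (years lived with disability) = cases x disability weight x duration
    """
    yll = deaths * years_lost_per_death
    yld = cases * disability_weight * duration_years
    return yll + yld

# Hypothetical: 100 deaths each losing 30 years, plus 1000 cases living
# 5 years with a condition weighted 0.2 on the 0-1 disability scale.
burden = dalys(100, 30, 1000, 0.2, 5)  # 3000 YLL + 1000 YLD = 4000 DALYs
```

The published GBD calculations add discounting and uncertainty analysis on top of this core identity, but the YLL + YLD decomposition is the same.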
Abstract:
Transient prenatal vitamin D deficiency produces hyperlocomotion in the adult rat. The aim of this study was to examine the effects of acute restraint on the open-field behaviour of developmentally vitamin D (DVD) deficient and control rats. Rats were conceived and born to DVD-deficient or vitamin D-replete (control) dams and, at 8 weeks of age, were monitored for 30 min in an open field using automated video tracking software. Half of the rats were restrained within a towel for 5 min immediately before the open field test; the remainder received minimal handling beforehand. Repeating previous findings, DVD-deficient animals had enhanced locomotion during the first 10 min of the open field test compared with control rats. By contrast, there were no differences in locomotor activity after acute restraint stress. The time rats spent in the corners and sides of the open field was affected by prenatal diet: DVD rats spent less time in the corners and more time at the sides than control rats across the whole 30 min test. This difference was not seen in rats subjected to acute restraint stress. The time spent in the centre was not altered by prenatal diet or acute restraint. Thus, transient prenatal vitamin D deficiency induces a transient spontaneous hyperlocomotion in adulthood that is modulated by acute restraint stress.
Abstract:
Administration of human recombinant erythropoietin (EPO) at the time of acute ischemic renal injury (IRI) inhibits apoptosis, enhances tubular epithelial regeneration, and promotes renal functional recovery. The present study aimed to determine whether darbepoetin-alfa (DPO) exhibits renoprotection comparable to that afforded by EPO, whether pro- or anti-apoptotic Bcl-2 proteins are involved, and whether delayed administration of EPO or DPO 6 h following IRI ameliorates renal dysfunction. The model of IRI involved bilateral renal artery occlusion for 45 min in rats (N = 4 per group), followed by reperfusion for 1-7 days. Controls were sham-operated. Rats were treated at the time of ischemia or sham operation (T0), or post-treated (6 h after the onset of reperfusion, T6) with EPO (5000 IU/kg), DPO (25 µg/kg), or the appropriate vehicle by intraperitoneal injection. Renal function, structure, and immunohistochemistry for Bcl-2, Bcl-XL, and Bax were analyzed. DPO or EPO at T0 significantly abrogated renal dysfunction in IRI animals (serum creatinine for IRI 0.17 ± 0.05 mmol/l vs DPO-IRI 0.08 ± 0.03 mmol/l vs EPO-IRI 0.04 ± 0.01 mmol/l, P = 0.01). Delayed administration of DPO or EPO (T6) also significantly abrogated subsequent renal dysfunction (serum creatinine for IRI 0.17 ± 0.05 mmol/l vs DPO-IRI 0.06 ± 0.01 mmol/l vs EPO-IRI 0.03 ± 0.03 mmol/l, P = 0.01). There was also significantly decreased tissue injury (apoptosis, P < 0.05), decreased proapoptotic Bax, and increased regenerative capacity, especially in the outer stripe of the outer medulla, with DPO or EPO at T0 or T6. These results reaffirm the potential clinical application of DPO and EPO as novel renoprotective agents for patients at risk of ischemic acute renal failure or after an ischemic renal insult.
Abstract:
Background: Cardiac disease is the principal cause of death in patients with chronic kidney disease (CKD). Ischemia at dobutamine stress echocardiography (DSE) is associated with adverse events in these patients. We assessed the efficacy of combining clinical risk evaluation with DSE. Methods: We allocated 244 patients with CKD (mean age 54 years, 140 men, 169 dialysis-dependent at baseline) into low- and high-risk groups based on two disease-specific scores and the Framingham risk model. All underwent DSE and were further stratified according to DSE results. Patients were followed over 20 ± 14 months for events (death, myocardial infarction, acute coronary syndrome). Results: There were 49 deaths and 32 cardiac events. Using the different clinical scores, the proportion of patients allocated to high risk varied from 34% to 79%, and 39% to 50% of high-risk patients had an abnormal DSE. In the high-risk groups, depending on the clinical score chosen, 25% to 44% of patients with an abnormal DSE had a cardiac event, compared with 8% to 22% of those with a normal DSE. Cardiac events occurred in 2.0%, 3.1%, and 9.7% of the low-risk patients, using the two disease-specific and Framingham scores, respectively, and DSE results did not add to risk evaluation in this subgroup. Independent DSE predictors of cardiac events were a lower resting diastolic blood pressure, angina during the test, and the combination of ischemia with resting left ventricular dysfunction. Conclusion: In CKD patients, high-risk findings by DSE can predict outcome. A stepwise strategy of combining clinical risk scores with DSE for CAD screening in CKD reduces the number of tests required and identifies a high-risk subgroup among whom DSE results more effectively stratify high and low risk.
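The stepwise strategy in the conclusion can be sketched as a simple triage rule. The score threshold and the labels below are hypothetical illustrations, not the study's actual cut-offs:

```python
def screening_pathway(clinical_score, high_risk_threshold, dse_abnormal=None):
    """Step 1: classify by clinical risk score alone; low-risk patients
    skip DSE, since DSE added nothing to risk evaluation in that subgroup.
    Step 2: clinically high-risk patients proceed to DSE, where an
    abnormal result marks the subgroup with the highest event rates.
    """
    if clinical_score < high_risk_threshold:
        return "low risk - no DSE required"
    if dse_abnormal is None:
        return "high risk - refer for DSE"
    return "high risk - abnormal DSE" if dse_abnormal else "high risk - normal DSE"

# Hypothetical threshold of 5 points:
print(screening_pathway(3, 5))                     # low risk - no DSE required
print(screening_pathway(8, 5))                     # high risk - refer for DSE
print(screening_pathway(8, 5, dse_abnormal=True))  # high risk - abnormal DSE
```

Because low-risk patients stop at step 1, the number of echocardiography tests performed falls, which is the resource saving the abstract highlights.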