644 results for ADMISSIONS
Abstract:
Prescription medicine samples (or starter packs) are provided by pharmaceutical manufacturers to prescribing doctors as one component in the suite of marketing products used to convince them to prescribe a particular medicine [1,2]. Samples are generally newer, more expensive treatment options still under patent [3,4]. Safe, effective, judicious and appropriate medicine use (quality use of medicines) [5] could be enhanced by involving community pharmacists in the dispensing of starter packs. Doctors who use samples show a trend towards prescribing more expensive medicines overall [6] and also prescribe more medicines [7]. Cardiovascular health and mental health are Australian National Health Priority Areas [8] and accounted for approximately 30% and 17%, respectively, of government Pharmaceutical Benefits Scheme (PBS) expenditure in 2006 [9]. The PBS is Australia's universal prescription subsidy scheme [9]. Antihypertensives were a major contributor to the estimated 80,000 medicine-related hospital admissions in Australia in 1999 [10], a pattern also reported internationally [11,12]. The aim of this study was to pilot an alternative model for the supply of free samples or starter packs of prescription medicines and to ascertain whether it is a viable model in daily practice.
Abstract:
Background Chronic kidney disease is a global public health problem of increasing prevalence. There are five stages of kidney disease, with Stage 5 indicating end stage kidney disease (ESKD), at which dialysis is required or death will eventually occur. Over the last two decades there have been increasing numbers of people commencing dialysis. Most of this increase has occurred among people aged 65 years and over. In this older population it can be difficult to determine whether dialysis will provide any benefit over non-dialysis management. The poor prognosis for the population over 65 years raises issues around management of ESKD in this population. It is therefore important to review any research undertaken in this area that compares outcomes of the older ESKD population who have commenced dialysis with those who have received non-dialysis management. Objective The primary objective was to assess the effect of dialysis compared with non-dialysis management for the population aged 65 years and over with ESKD. Inclusion criteria Types of participants This review considered studies that included participants who were 65 years and older. These participants needed to have been diagnosed with ESKD for greater than three months and to be receiving either renal replacement therapy (RRT) (hemodialysis [HD] or peritoneal dialysis [PD]) or non-dialysis management. The settings for the studies included the home, self-care centre, satellite centre, hospital, hospice or nursing home. Types of intervention(s)/phenomena of interest This review considered studies where the intervention was RRT (HD or PD) for the participants with ESKD. There was no restriction on frequency of RRT or length of time the participant received RRT. The comparator was participants who were not undergoing RRT.
Types of studies This review considered both experimental and epidemiological study designs, including randomized controlled trials, non-randomized controlled trials, quasi-experimental studies, before and after studies, prospective and retrospective cohort studies, case control studies and analytical cross sectional studies. This review also considered descriptive epidemiological study designs for inclusion, including case series, individual case reports and descriptive cross sectional studies. This review included any of the following primary and secondary outcome measures:
• Primary outcome: survival measures
• Secondary outcomes: functional performance score (e.g. Karnofsky Performance score); symptoms and severity of end stage kidney disease; hospital admissions; health-related quality of life (e.g. KDQOL, SF-36 and HRQOL); comorbidities (e.g. Charlson Comorbidity Index).
Abstract:
Background The purpose of this study was to estimate the incidence of fatal and non-fatal Low Speed Vehicle Run Over (LSVRO) events among children aged 0–15 years in Queensland, Australia, at a population level. Methods Fatal and non-fatal LSVRO events that occurred in children resident in Queensland over eleven calendar years (1999-2009) were identified using ICD codes, text descriptions, word searches and medical notes clarification, obtained from five health-related databases across the continuum of care (pre-hospital to fatality). Data were manually linked. Population data provided by the Australian Bureau of Statistics were used to calculate crude incidence rates for fatal and non-fatal LSVRO events. Results There were 1611 LSVROs between 1999 and 2009 (IR = 16.87/100,000/annum). The incidence of non-fatal events (IR = 16.60/100,000/annum) was 61.5 times higher than that of fatal events (IR = 0.27/100,000/annum). LSVRO events were more common in boys (IR = 20.97/100,000/annum) than girls (IR = 12.55/100,000/annum), and among younger children aged 0–4 years (IR = 21.45/100,000/annum; 39% of all events) than older children (5–9 years: IR = 16.47/100,000/annum; 10–15 years: IR = 13.59/100,000/annum). A total of 896 (56.8%) children were admitted to hospital for 24 hours or more following an LSVRO event (IR = 9.38/100,000/annum). Total LSVROs increased from 1999 (IR = 14.79/100,000) to 2009 (IR = 18.56/100,000), but not significantly. Over the 11-year period there was a slight (non-significant) increase in fatalities (IR = 0.37 to 0.42/100,000/annum), a significant decrease in admissions (IR = 12.39 to 5.36/100,000/annum), and a significant increase in non-admissions (IR = 2.02 to 12.77/100,000/annum). Trends over time differed by age, gender and severity. Conclusion This is the most comprehensive population-based epidemiological study of fatal and non-fatal LSVRO events to date. Results from this study indicate that LSVROs incur a substantial burden.
Further research on the characteristics and risk factors associated with these events is required to adequately inform injury prevention. Prevention strategies are urgently needed, especially for young children aged 0-4 years.
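The crude incidence rates reported above follow from a simple ratio of event counts to person-time at risk. A minimal sketch of that calculation is below; the person-years figure used in the example is a back-calculated, hypothetical value chosen so the result matches the reported 16.87/100,000/annum, not a number taken from the study.

```python
def crude_incidence_rate(events, person_years, per=100_000):
    """Crude incidence rate: events per `per` units of person-time at risk."""
    return events * per / person_years

# Hypothetical illustration: 1611 LSVRO events over roughly 9.55 million
# person-years (the 0-15-year-old population summed across the 11 study
# years) yields approximately 16.87 events per 100,000 per annum.
rate = crude_incidence_rate(1611, 9_550_000)

# The non-fatal to fatal rate ratio reported in the abstract is the
# quotient of the two crude rates (16.60 / 0.27, approximately 61.5).
rate_ratio = 16.60 / 0.27
```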
Abstract:
Purpose The purpose of this qualitative analysis was to examine the experiences of family caregivers supporting a dying person in the home setting. In particular, it explores caregivers' perceptions of receiving palliative care at home when supplied with an emergency medication kit (EMK). Results Most family caregivers described preexisting medication management strategies that were unable to provide timely intervention for symptoms. The EMK was largely viewed as an effective strategy for providing timely symptom control and preventing readmission to inpatient care. Caregivers reported varying levels of confidence in administering the medications. Conclusion The provision of an EMK is an effective strategy for improving symptom control and preventing inpatient admissions of home-dwelling palliative care patients.
Abstract:
Background Delirium is a common underdiagnosed condition in advanced cancer leading to increased distress, morbidity, and mortality. Screening improves detection, but there is no consensus as to the best screening tool to use with patients with advanced cancer. Objective To determine the incidence of delirium in patients with advanced cancer within 72 hours of admission to an acute inpatient hospice using clinical judgement and validated screening tools. Method One hundred consecutive patients with advanced cancer were invited to be screened for delirium within 72 hours of admission to an acute inpatient hospice unit. Two validated tools were used, the Delirium Rating Scale-Revised 98 (DRS-R-98) and the Confusion Assessment Method (CAM) shortened diagnostic algorithm. These results were compared with clinical assessment by review of medical charts. Results Of 100 consecutive admissions, 51 participated and of these 22 (43.1%) screened positive for delirium with CAM and/or DRS-R-98, compared to 15 (29.4%) by clinical assessment. Eleven (21.6%) were identified as hypoactive delirium and 5 (9.8%) as subsyndromal delirium. Conclusion This study confirms that delirium is a common condition in patients with advanced cancer. While there remains a lack of consensus regarding the choice of delirium screening tool, this study supports the CAM as being appropriate. Further research may determine the optimal screening tool for delirium, enabling the development of best practice clinical guidelines for routine medical practice.
Abstract:
Background/Aim: Cardiotoxicity resulting in heart failure is a devastating complication of cancer therapy. It is possible for a patient to survive cancer only to develop heart failure (HF), which can carry a worse prognosis than the cancer itself. The aim of this project was to profile the characteristics of patients at risk of cancer treatment-induced heart failure. Methods: A linked health data analysis of the Queensland Cancer Registry (QCR) from 1996-2009, the Death Registry and hospital administration records for HF and chemotherapy admissions was performed. The index heart failure admission must have occurred after the date of cancer registry entry. Results: A total of 15,987 patients were included in this analysis; 1,062 (6.6%) had a chemotherapy + HF admission (51.4% female) and 14,925 (93.4%) had chemotherapy with no HF admission. Median age of chemotherapy + HF patients was 67 years (IQR 58 to 75) vs. 54 years (IQR 44 to 64) for those with no HF admission. Chemotherapy + HF patients had an increased risk of all-cause mortality (HR 2.79 [95% CI 2.58-3.02], and 1.67 [95% CI 1.54-1.81] after adjusting for age, sex, marital status, country of birth, cancer site and chemotherapy dose). The index HF admission occurred within one year of cancer diagnosis in 47% of HF patients, with 80% of patients having their index admission within 3 years. The number of chemotherapy cycles was not associated with a significant reduction in survival time in chemotherapy + HF patients. Mean survival for heart failure patients was 5.3 years (95% CI 4.99-5.62) vs. 9.57 years (95% CI 9.47-9.68) for patients with no HF admission. Conclusion: All-cause mortality was 67% higher in patients diagnosed with HF following chemotherapy in analysis adjusted for covariates. Methods to improve and better coordinate interdisciplinary care for cancer patients with HF, involving cardiologists and oncologists, are required, including evidence-based guidelines for the comprehensive assessment, monitoring and management of this cohort.
Abstract:
Background and Objectives: Although depression is a commonly occurring mental illness, research concerning strategies for early detection and prophylaxis has not yet focused on the possible utility of measures of Emotional Intelligence (EI) as a potential predictive factor. The current study aimed to investigate the relationship between EI and a clinical diagnosis of depression in a cohort of adults. Methods: Sixty-two patients (59.7% female) with a DSM-IV-TR diagnosis of a major affective disorder and 39 age-matched controls (56.4% female) completed self-report instruments assessing EI and depression in a cross-sectional study. Results: Significant associations were observed between severity of depression and the EI dimensions of Emotional Management (r = -0.56) and Emotional Control (r = -0.62). The sub-group with repeated admissions showed reduced social involvement, more prior institutionalization and an increased incidence of "Schizophrenic Psychosis" and "Abnormal Personalities". Conclusions: Measures of EI may have predictive value in terms of early identification of those at risk of developing depression. The current study points to the potential value of conducting further studies of a prospective nature.
Abstract:
Objective. This study investigated cognitive functioning among older adults with physical debility not attributable to an acute injury or neurological condition who were receiving subacute inpatient physical rehabilitation. Design. A cohort investigation with assessments at admission and discharge. Setting. Three geriatric rehabilitation hospital wards. Participants. Consecutive rehabilitation admissions following acute hospitalization (study criteria excluded orthopaedic, neurological, or amputation admissions). Intervention. Usual rehabilitation care. Measurements. The Functional Independence Measure (FIM) Cognitive and Motor items. Results. A total of 704 (86.5%) participants (mean age = 76.5 years) completed both assessments. Significant improvement occurred in the FIM Cognitive items (z-score range 3.93–8.74) and FIM Cognitive total score (z-score = 9.12), in addition to improvement in FIM Motor performance. A moderate positive correlation existed between change in Motor and Cognitive scores (Spearman's rho = 0.41). Generalized linear modelling indicated that better cognition at admission (coefficient = 0.398) and younger age (coefficient = −0.280) were predictive of improvement in Motor performance. Younger age (coefficient = −0.049) was predictive of improvement in FIM Cognitive score. Conclusions. Improvement in cognitive functioning was observed in addition to motor function improvement among this population. Causal links cannot be drawn without further research.
Abstract:
Introduction Risk factor analyses for nosocomial infections (NIs) are complex. First, due to competing events for NI, the association between risk factors and NI as measured using hazard rates may not coincide with the association using cumulative probability (risk). Second, patients from the same intensive care unit (ICU), who share the same environmental exposure, are likely to be more similar with regard to risk factors predisposing to a NI than patients from different ICUs. We aimed to develop an analytical approach to account for both features and to use it to evaluate associations between patient- and ICU-level characteristics with both rates of NI and competing risks and with the cumulative probability of infection. Methods We considered a multicenter database of 159 intensive care units containing 109,216 admissions (813,739 admission-days) from the Spanish HELICS-ENVIN ICU network. We analyzed the data using two models: an etiologic model (rate based) and a predictive model (risk based). In both models, random effects (shared frailties) were introduced to assess heterogeneity. Death and discharge without NI were treated as competing events for NI. Results There was a large heterogeneity across ICUs in NI hazard rates, which remained after accounting for multilevel risk factors, meaning that there are remaining unobserved ICU-specific factors that influence NI occurrence. Heterogeneity across ICUs in terms of cumulative probability of NI was even more pronounced. Several risk factors had markedly different associations in the rate-based and risk-based models. For some, the associations differed in magnitude. For example, high Acute Physiology and Chronic Health Evaluation II (APACHE II) scores were associated with modest increases in the rate of nosocomial bacteremia, but large increases in the risk.
Others differed in sign, for example respiratory vs cardiovascular diagnostic categories were associated with a reduced rate of nosocomial bacteremia, but an increased risk. Conclusions A combination of competing risks and multilevel models is required to understand direct and indirect risk factors for NI and distinguish patient-level from ICU-level factors.
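The distinction the authors draw between hazard rates and cumulative probability can be made concrete with a nonparametric cumulative incidence calculation: the probability of an NI by a given day accumulates cause-specific events weighted by overall event-free survival, so a competing event (death or discharge without NI) lowers the risk of NI even when the NI hazard is unchanged. The sketch below is a minimal pure-Python illustration of that estimator, not the authors' HELICS-ENVIN analysis code (which additionally includes frailties and covariates).

```python
def cumulative_incidence(times, events, horizon):
    """
    Nonparametric cumulative incidence of event type 1 (e.g. NI) by
    `horizon`, in the presence of a competing event type 2 (e.g. death
    or discharge without NI).
    times:  event/censoring time for each admission
    events: 0 = censored, 1 = event of interest, 2 = competing event
    """
    at_risk = len(times)
    surv = 1.0   # probability of being event-free just before time t
    cif = 0.0    # accumulated probability of a type-1 event
    for t in sorted(set(times)):
        d1 = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        d2 = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 2)
        removed = sum(1 for ti in times if ti == t)
        if t <= horizon and at_risk > 0:
            # chance of surviving to t, then having the event of interest
            cif += surv * d1 / at_risk
            # all-cause survival update uses both event types
            surv *= 1 - (d1 + d2) / at_risk
        at_risk -= removed
    return cif
```

With three admissions where one has an NI on day 1, one is discharged without NI on day 2, and one is censored on day 3, the 3-day cumulative incidence of NI is 1/3; removing the competing discharge would raise it, which is the rate-vs-risk divergence the abstract describes.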
Abstract:
Background Clostridium difficile infection (CDI) possibly extends hospital length of stay (LOS); however, the current evidence does not account for time-dependent bias, i.e., when infection is incorrectly analyzed as a baseline covariate. The aim of this study was to determine whether CDI increases LOS after managing this bias. Methods We examined the estimated extra LOS attributable to CDI using a multistate model. Data from all persons hospitalized >48 hours over 4 years in a tertiary hospital in Australia were analyzed. Persons with health care-associated CDIs were identified. Cox proportional hazards models were applied together with multistate modeling. Results One hundred fifty-eight of the 58,942 admissions examined had CDI. The mean extra LOS attributable to infection was 0.9 days (95% confidence interval: −1.8 to 3.6 days, P = .51) when a multistate model was applied. The hazard of discharge was lower in persons who had CDI (adjusted hazard ratio, 0.42; P < .001) when a Cox proportional hazards model was applied. Conclusion This study is the first to use multistate models to determine the extra LOS attributable to CDI. Results suggest CDI does not significantly contribute to hospital LOS, contradicting findings published elsewhere. Conversely, when methods prone to time-dependent bias were applied to the data, CDI appeared to significantly reduce the hazard of discharge. These findings contribute to discussion on methods used to evaluate LOS and health care-associated infections.
Abstract:
Background Ascites, the most frequent complication of cirrhosis, is associated with poor prognosis and reduced quality of life. Recurrent hospital admissions are common and often unplanned, resulting in increased use of hospital services. Aims To examine use of hospital services by patients with cirrhosis and ascites requiring paracentesis, and to investigate factors associated with early unplanned readmission. Methods A retrospective review of the medical chart and clinical databases was performed for patients who underwent paracentesis between October 2011 and October 2012. Clinical parameters at index admission were compared between patients with and without early unplanned hospital readmissions. Results The 41 patients requiring paracentesis had 127 hospital admissions, 1164 occupied bed days and 733 medical imaging services. Most admissions (80.3%) were for management of ascites, of which 41.2% were unplanned. Of those eligible, 69.7% were readmitted and 42.4% had an early unplanned readmission. Twelve patients died and nine developed spontaneous bacterial peritonitis. Of those eligible for readmission, more patients died (P = 0.008) and/or developed spontaneous bacterial peritonitis (P = 0.027) if they had an early unplanned readmission during the study period. Markers of liver disease, as well as haemoglobin (P = 0.029), haematocrit (P = 0.024) and previous heavy alcohol use (P = 0.021) at index admission, were associated with early unplanned readmission. Conclusion Patients with cirrhosis and ascites comprise a small population who account for substantial use of hospital services. Markers of disease severity may identify patients at increased risk of early readmission. Alternative models of care should be considered to reduce unplanned hospital admissions, healthcare costs and pressure on emergency services.
Abstract:
Objective To examine personal and social demographics, and rehabilitation discharge outcomes, of dysvascular and non-vascular lower limb amputees. Methods In total, 425 lower limb amputation inpatient rehabilitation admissions (335 individuals) from 2005 to 2011 were examined. Admission and discharge descriptive statistics (frequencies, percentages) were calculated and compared by aetiology. Results Participants were male (74%), aged 65 years (s.d. 14), born in Australia (72%), had predominantly dysvascular aetiology (80%) and a median length of stay of 48 days (interquartile range (IQR): 25–76). Following amputation, 56% received prostheses for mobility, 21% (n = 89) changed residence and 28% (n = 116) required community services. Dysvascular amputees were older (mean 67 years, s.d. 12 vs 54 years, s.d. 16; P < 0.001) and recorded lower Functional Independence Measure motor scores at admission (z = 3.61, P < 0.001) and discharge (z = 4.52, P < 0.001). More non-vascular amputees worked before amputation (43% vs 11%; P < 0.001) and were prescribed a prosthesis by discharge (73% vs 52%; P < 0.001), and they had a shorter length of stay (7 days, 95% confidence interval: –3 to 17), although this was not statistically significant. Conclusions Differences exist in social and demographic outcomes between dysvascular and non-vascular lower limb amputees.
Abstract:
There is a growing awareness of the high levels of psychological distress being experienced by law students and the practising profession in Australia. In this context, a Threshold Learning Outcome (TLO) on self-management has been included in the six TLOs recently articulated as minimum learning outcomes for all Australian graduates of the Bachelor of Laws degree (LLB). The TLOs were developed during 2010 as part of the Australian Learning and Teaching Council's (ALTC's) project funded by the Australian Government to articulate 'Learning and Teaching Academic Standards'. The TLOs are the result of a comprehensive national consultation process led by the ALTC's Discipline Scholars: Law, Professors Sally Kift and Mark Israel [1]. The TLOs have been endorsed by the Council of Australian Law Deans (CALD) and have received broad support from members of the judiciary and practising profession, representative bodies of the legal profession, law students and recent graduates, Legal Services Commissioners and the Law Admissions Consultative Committee. At the time of writing, TLOs for the Juris Doctor (JD) are also being developed, utilising the TLOs articulated for the LLB as their starting point but restating the JD requirements as the higher order outcomes expected of graduates of a 'Masters Degree (Extended)', this being the award level designation for the JD now set out in the new Australian Qualifications Framework [2]. As Australian law schools begin embedding the learning, teaching and assessment of the TLOs in their curricula, and seek to assure graduates' achievement of them, guidance on the implementation of the self-management TLO is salient and timely.
Abstract:
Background: This study attempted to develop health risk-based metrics for defining a heatwave in Brisbane, Australia. Methods: A Poisson generalised additive model was used to assess the impact of heatwaves on mortality and emergency hospital admissions (EHAs) in Brisbane. Results: In general, the higher the intensity and the longer the duration of a heatwave, the greater the health impacts. There was no apparent difference in EHA risk between different periods of a warm season. However, there was a greater risk of mortality in the second half of a warm season than in the first half. While the elderly (>75 years) were particularly vulnerable to both the EHA and mortality effects of a heatwave, the risk of EHAs also significantly increased for two other age groups (0-64 years and 65-74 years) during severe heatwaves. Different patterns between cardiorespiratory mortality and EHAs were observed. Based on these findings, we propose the use of a tiered heat warning system based on the health risk of heatwave. Conclusions: Health risk-based metrics are a useful tool for the development of local heatwave definitions. This tool may have significant implications for the assessment of heatwave-related health consequences and the development of heatwave response plans and implementation strategies.
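A tiered warning system of the kind proposed above can be sketched as a classifier over daily temperatures, with the tier rising with both intensity (how far the temperature exceeds a threshold) and duration (consecutive days above it). The thresholds and tier rules below are purely illustrative assumptions, not the Brisbane-specific, health risk-based values derived in the study.

```python
def heatwave_tier(mean_temps, threshold=32.0, severe_threshold=36.0):
    """
    Hypothetical tiered heatwave classifier over a sequence of daily mean
    temperatures (degrees C). Returns the highest tier reached:
    0 = no heatwave, 1 = low, 2 = moderate, 3 = severe.
    Both thresholds are illustrative placeholders.
    """
    run = 0   # current streak of consecutive days at/above threshold
    best = 0  # highest tier observed so far
    for t in mean_temps:
        if t >= threshold:
            run += 1
            tier = 1
            # tier rises with duration (3+ days) or intensity (severe heat)
            if run >= 3 or t >= severe_threshold:
                tier = 2
            # both long duration and severe intensity -> highest tier
            if run >= 3 and t >= severe_threshold:
                tier = 3
            best = max(best, tier)
        else:
            run = 0  # streak broken by a cooler day
    return best
```

In practice, the study's approach would replace these fixed temperature cutoffs with thresholds calibrated against observed mortality and EHA risk.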
Abstract:
In late 2014, the fifth biennial Educate Plus benchmarking study was conducted to track educational development in Australia and New Zealand. The 2014 survey built upon the four previous studies, which began in 2005. All participants were asked questions regarding institutional information, personal information, salary information and advancement office information. Following this, they could choose to complete at least one of the following sections according to their role(s): fundraising, marketing & communications, alumni & community relations, and admissions.