842 results for Central Supply, Hospital
Abstract:
Species of fleshy-fruited Myrtaceae are generally associated with humid environments and their vegetative anatomy is mainly mesophytic. Myrceugenia rufa is an endemic and rare species from arid zones of the coast of central Chile, and there are no anatomical studies of its leaf anatomy and environmental adaptations. Here we describe the leaf micromorphology and anatomy of the species using standard protocols for light and scanning electron microscopy. The leaf anatomy of M. rufa matches that of other Myrtaceae, including the presence of druses, schizogenous secretory ducts and internal phloem. Leaves of M. rufa exhibit a double epidermis, thick cuticle, abundant unicellular hairs, large substomatal chambers covered by trichomes and a dense palisade parenchyma. Leaf characters of M. rufa confirm an anatomical adaptation to xerophytic environments.
Abstract:
Chronic disease accounts for about 80 per cent of the total disease burden in Australia, and its management accounts for 70 per cent of all current health expenditure.1 Effective prevention and management of chronic disease requires a coordinated approach between primary health care, acute care services, and patients.2 However, it is not clear whether improvements in primary healthcare management yield a clear benefit in the cost of care of patients with chronic disease. We recently completed a pilot study in rural Western Australia to ascertain the feasibility of a coordinated general practice-based approach to managing chronic respiratory and cardiovascular conditions, and to determine the direct cost savings to the public insurer through reductions in avoidable hospital admissions. The aim of this correspondence is to share our preliminary findings and encourage debate on how such a project may be scaled up or adapted to other primary healthcare settings.
Abstract:
Background and Purpose Randomized trials have demonstrated reduced morbidity and mortality with stroke unit care; however, the effect on length of stay, and hence the economic benefit, is less well defined. In 2001, a multidisciplinary stroke unit was opened at our institution. We assessed whether a stroke unit reduces length of stay and in-hospital case fatality compared with admission to a general neurology/medical ward. Methods A retrospective study of 2 cohorts in the Foothills Medical Center in Calgary was conducted using administrative databases. We compared a cohort of stroke patients managed on general neurology/medical wards before 2001 with a similar cohort of stroke patients managed on a stroke unit after 2003. Length of stay was dichotomized at 7 days and the Charlson Index was dichotomized for analysis. Multivariable logistic regression was used to compare length of stay and case fatality in the 2 cohorts, adjusted for age, gender, and patient comorbid conditions defined by the Charlson Index. Results Average length of stay for patients on the stroke unit (n=2461) was 15 days vs 19 days for patients managed on general neurology/medical wards (n=1567). The proportion of patients with length of stay >7 days was 53.8% on general neurology/medical wards vs 44.4% on the stroke unit (difference 9.4%; P<0.0001). The adjusted odds of a length of stay >7 days were reduced by 30% (P<0.0001) on the stroke unit compared with general neurology/medical wards. Overall in-hospital case fatality was reduced by 4.5% with stroke unit care. Conclusions We observed a reduced length of stay and reduced in-hospital case fatality on a stroke unit compared with general neurology/medical wards.
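The crude effect implied by the proportions in the abstract above can be recomputed in a few lines. This is only an illustrative sketch from the reported percentages; the study's ~30% reduction comes from a multivariable logistic model adjusted for age, gender and the Charlson Index.

```python
# Crude odds ratio for length of stay > 7 days, stroke unit vs. general
# neurology/medical wards, recomputed from the proportions in the abstract.
# Illustrative only: the published ~30% reduction is model-adjusted.

def odds(p):
    """Odds corresponding to a proportion p (0 < p < 1)."""
    return p / (1.0 - p)

p_ward = 0.538  # proportion with LOS > 7 days on general wards
p_unit = 0.444  # proportion with LOS > 7 days on the stroke unit

crude_or = odds(p_unit) / odds(p_ward)
print(round(crude_or, 2))  # -> 0.69, close to the adjusted ~0.70
```

The crude and adjusted estimates agree closely here, which is consistent with the adjustment variables being similarly distributed across the two cohorts.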
Abstract:
Background Stroke incidence has fallen since 1950. Recent trends suggest that stroke incidence may be stabilizing or increasing. We investigated time trends in stroke occurrence and in-hospital morbidity and mortality in the Calgary Health Region. Methods All patients admitted to hospitals in the Calgary Health Region between 1994 and 2002 with a primary discharge diagnosis code (ICD-9 or ICD-10) of stroke were included. In-hospital strokes were also included. Stroke type, date of admission, age, gender, discharge disposition (died, discharged) and in-hospital complications (pneumonia, pulmonary embolism, deep venous thrombosis) were recorded. Poisson and simple linear regression were used to model time trends of occurrence by stroke type and age-group and to extrapolate future time trends. Results From 1994 to 2002, 11642 stroke events were observed. Of these, 9879 patients (84.8%) were discharged from hospital, 1763 (15.1%) died in hospital, and 591 (5.1%) developed in-hospital complications from pneumonia, pulmonary embolism or deep venous thrombosis. Both in-hospital mortality and complication rates were highest for hemorrhages. Over the period of study, the rate of stroke admission remained stable. However, the total number of stroke admissions to hospital increased significantly (p=0.012) owing to the combination of increases in intracerebral hemorrhage (p=0.021) and ischemic stroke admissions (p=0.011). Sub-arachnoid hemorrhage rates have declined. In-hospital stroke mortality has declined overall owing to a decrease in deaths from ischemic stroke, intracerebral hemorrhage and sub-arachnoid hemorrhage. Conclusions Although age-adjusted stroke occurrence rates were stable from 1994 to 2002, this stability was accompanied by both a sharp increase in the absolute number of stroke admissions and a decline in proportional in-hospital mortality.
Further research is needed into changes in stroke severity over time to understand the causes of declining in-hospital stroke mortality rates.
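The trend test described in the abstract above can be illustrated with a minimal sketch. The study fitted Poisson and simple linear regression models; the annual admission counts below are invented for illustration, not the study's data.

```python
# Minimal sketch of a linear trend test on annual stroke admission counts.
# The counts are hypothetical; the study modelled Calgary Health Region data
# with Poisson and simple linear regression.

def linreg_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1994, 2003))
admissions = [1150, 1180, 1210, 1250, 1290, 1330, 1370, 1420, 1460]  # hypothetical

slope = linreg_slope(years, admissions)
print(round(slope, 1))  # extra admissions per year; positive => rising numbers
```

In practice the significance of the slope (the quoted p=0.012) would come from the regression's standard error or, for counts, from a Poisson model's Wald test.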
Abstract:
Background An important potential clinical benefit of using capnography monitoring during procedural sedation and analgesia (PSA) is that this technology could improve patient safety by reducing serious sedation-related adverse events, such as death or permanent neurological disability, which are caused by inadequate oxygenation. The hypothesis is that earlier identification of respiratory depression using capnography leads to a change in clinical management that prevents hypoxaemia. As inadequate oxygenation/ventilation is the most common reason for injury associated with PSA, reducing episodes of hypoxaemia would indicate that using capnography would be safer than relying on standard monitoring alone. Methods/design The primary objective of this review is to determine whether using capnography during PSA in the hospital setting improves patient safety by reducing the risk of hypoxaemia (defined as an arterial partial pressure of oxygen below 60 mmHg or percentage of haemoglobin that is saturated with oxygen [SpO2] less than 90 %). A secondary objective of this review is to determine whether changes in the clinical management of sedated patients are the mediating factor for any observed impact of capnography monitoring on the rate of hypoxaemia. The potential adverse effect of capnography monitoring that will be examined in this review is the rate of inadequate sedation. Electronic databases will be searched for parallel, crossover and cluster randomised controlled trials comparing the use of capnography with standard monitoring alone during PSA that is administered in the hospital setting. Studies that included patients who received general or regional anaesthesia will be excluded from the review. Non-randomised studies will be excluded. Screening, study selection and data extraction will be performed by two reviewers. The Cochrane risk of bias tool will be used to assign a judgment about the degree of risk. Meta-analyses will be performed if suitable. 
Discussion This review will synthesise the evidence on an important potential clinical benefit of capnography monitoring during PSA within hospital settings. Systematic review registration: PROSPERO CRD42015023740
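If meta-analysis proves suitable, a pooled effect could be computed along these lines. This is a sketch of a fixed-effect Mantel-Haenszel risk ratio with invented event counts; it is not a result of the planned review.

```python
# Fixed-effect Mantel-Haenszel pooled risk ratio across trials comparing
# capnography with standard monitoring alone. All counts are hypothetical.

def mh_risk_ratio(trials):
    """trials: list of (a, n1, c, n2), where a/n1 are hypoxaemia events and
    arm size in the capnography arm, and c/n2 in the standard-monitoring arm."""
    num = sum(a * n2 / (n1 + n2) for a, n1, c, n2 in trials)
    den = sum(c * n1 / (n1 + n2) for a, n1, c, n2 in trials)
    return num / den

trials = [(8, 150, 15, 150), (5, 100, 9, 100)]  # hypothetical event counts
pooled_rr = mh_risk_ratio(trials)
print(round(pooled_rr, 2))  # -> 0.54 for these invented counts
```

A pooled risk ratio below 1 would favour capnography for the hypoxaemia outcome; the actual review would also weigh risk of bias and heterogeneity before pooling.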
Abstract:
The phase relations have been investigated experimentally at 200 and 500 MPa as a function of water activity for one of the least evolved rhyolite compositions (Indian Batt Rhyolite) and a more evolved composition (Cougar Point Tuff XV) from the 12·8-8·1 Ma Bruneau-Jarbidge eruptive center of the Yellowstone hotspot. Particular priority was given to accurate determination of the water content of the quenched glasses using infrared spectroscopic techniques. Comparison of the composition of natural and experimentally synthesized phases confirms that high temperatures (>900°C) and extremely low melt water contents (<1·5 wt % H₂O) are required to reproduce the natural mineral assemblages. In melts containing 0·5-1·5 wt % H₂O, the liquidus phase is clinopyroxene (excluding Fe-Ti oxides, which are strongly dependent on fO₂), and the liquidus temperature of the more evolved Cougar Point Tuff sample (BJR; 940-1000°C) is at least 30°C lower than that of the Indian Batt Rhyolite lava sample (IBR2; 970-1030°C). For the composition BJR, the comparison of the compositions of the natural and experimental glasses indicates a pre-eruptive temperature of at least 900°C. The composition of clinopyroxene and pigeonite pairs can be reproduced only for water contents below 1·5 wt % H₂O at 900°C, or lower water contents if the temperature is higher. For the composition IBR2, a minimum temperature of 920°C is necessary to reproduce the main phases at 200 and 500 MPa. At 200 MPa, the pre-eruptive water content of the melt is constrained in the range 0·7-1·3 wt % at 950°C and 0·3-1·0 wt % at 1000°C. At 500 MPa, the pre-eruptive temperatures are slightly higher (by 30-50°C) for the same ranges of water concentration. The experimental results are used to explore possible proxies to constrain the depth of magma storage. The crystallization sequence of tectosilicates is strongly dependent on pressure between 200 and 500 MPa.
In addition, the normative Qtz-Ab-Or contents of glasses quenched from melts coexisting with quartz, sanidine and plagioclase depend on pressure and melt water content, assuming that the normative Qtz and Ab/Or content of such melts is mainly dependent on pressure and water activity, respectively. The combination of results from the phase equilibria and from the composition of glasses indicates that the depth of magma storage for the IBR2 and BJR compositions may be in the range 300-400 MPa (13 km) and 200-300 MPa (10 km), respectively.
Abstract:
Introduction Vascular access devices (VADs), such as peripheral or central venous catheters, are vital across all medical and surgical specialties. To allow therapy or haemodynamic monitoring, VADs frequently require administration sets (AS) composed of infusion tubing, fluid containers, pressure-monitoring transducers and/or burettes. While VADs are replaced only when necessary, AS are routinely replaced every 3–4 days in the belief that this reduces infectious complications. Strong evidence supports AS use up to 4 days, but there is less evidence for AS use beyond 4 days. AS replacement twice weekly increases hospital costs and workload. Methods and analysis This is a pragmatic, multicentre, randomised controlled trial (RCT) of equivalence design comparing AS replacement at 4 (control) versus 7 (experimental) days. Randomisation is stratified by site and device, centrally allocated and concealed until enrolment. 6554 adult/paediatric patients with a central venous catheter, peripherally inserted central catheter or peripheral arterial catheter will be enrolled over 4 years. The primary outcome is VAD-related bloodstream infection (BSI) and secondary outcomes are VAD colonisation, AS colonisation, all-cause BSI, all-cause mortality, number of AS per patient, VAD time in situ and costs. Relative incidence rates of VAD-BSI per 100 devices and hazard rates per 1000 device days (95% CIs) will summarise the impact of 7-day relative to 4-day AS use and test equivalence. Kaplan-Meier survival curves (with log rank Mantel-Cox test) will compare VAD-BSI over time. Appropriate parametric or non-parametric techniques will be used to compare secondary end points. P values of <0.05 will be considered significant.
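The two summary measures named in the protocol above (incidence per 100 devices and rate per 1000 device-days) can be sketched directly. The event counts and denominators below are invented placeholders, not trial data.

```python
# Sketch of the effect summaries named in the protocol: VAD-BSI incidence
# per 100 devices and rate per 1000 device-days. All numbers are hypothetical.

def per_100_devices(events, devices):
    """Incidence proportion expressed per 100 devices."""
    return 100.0 * events / devices

def per_1000_device_days(events, device_days):
    """Incidence rate expressed per 1000 device-days."""
    return 1000.0 * events / device_days

bsi_events, devices, device_days = 13, 3250, 21500  # hypothetical
incidence = per_100_devices(bsi_events, devices)
rate = per_1000_device_days(bsi_events, device_days)
print(round(incidence, 2), round(rate, 2))
```

Equivalence between arms would then be tested by checking whether the confidence interval of the ratio of such rates falls within a pre-specified equivalence margin.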
Abstract:
While the indirect and direct costs of occupational musculoskeletal disorders (MSD) cause a significant burden on the health system, lower back pain (LBP) accounts for a significant portion of MSD. In Australia, the highest prevalence of MSD exists for health care workers, such as nurses. The digital human model (DHM) Siemens JACK was used to investigate whether hospital bed pushing, a simple task and hazard that is commonly associated with LBP, can be simulated and ergonomically assessed in a virtual environment. It was found that while JACK implements a range of common physical work assessment methods, the simulation of dynamic bed pushing remains a challenge due to the complex interface between the floor and the wheels, which can only be insufficiently modelled.
Abstract:
The aim of this project was to evaluate the cost-effectiveness of hand hygiene interventions in resource-limited hospital settings. Using data from north-east Thailand, the research found that such interventions are likely to be very cost-effective in intensive care unit settings as a result of reduced incidence of methicillin-resistant Staphylococcus aureus bloodstream infection alone. This study also found evidence that the World Health Organization's (WHO) multimodal intervention is effective and that adding goal-setting, reward incentives, or accountability strategies to the WHO intervention could further improve compliance.
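The core calculation behind a cost-effectiveness evaluation like the one above is the incremental cost-effectiveness ratio (ICER). The sketch below uses invented figures, not the Thai data.

```python
# Incremental cost-effectiveness ratio (ICER): extra cost per extra unit of
# health effect from adopting an intervention. All figures are invented.

def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost per unit of health gain (e.g. per infection averted)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# hypothetical: programme costs 120,000 vs 100,000 and averts 60 vs 50 infections
value = icer(120_000, 100_000, 60, 50)
print(value)  # -> 2000.0 per additional infection averted
```

An intervention is then judged "very cost-effective" by comparing this ratio with a willingness-to-pay threshold for the setting in question.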
Abstract:
Supply chain outsourcing has posed problems for conventional labour regulation, which focuses on employers contracting directly with workers, particularly employees. These difficulties have been exacerbated by the traditional trifurcated approach to regulation of pay and conditions, work health and safety and workers’ compensation. This paper analyses the parallel interaction of two legal developments within the Australian textile, clothing and footwear industry. The first is mandatory contractual tracking mechanisms within state and federal labour laws and the second is the duties imposed by the harmonised Work Health and Safety Acts. Their combined effect has created an innovative, fully enforceable and integrated regulatory framework for the textile, clothing and footwear industry and, it is argued, other supply chains in different industry contexts. This paper highlights how regulatory solutions can address adverse issues for workers at the bottom of contractual networks, such as fissured workplaces and capital fragmentation, by enabling regulators to harness the commercial power of business controllers at the apex to ensure compliance throughout the entire chain.
Abstract:
Background Miscommunication in the healthcare sector can be life-threatening. The rising number of migrant patients and foreign-trained staff means that communication errors between a healthcare practitioner and patient when one or both are speaking a second language are increasingly likely. However, there is limited research that addresses this issue systematically. This protocol outlines a hospital-based study examining interactions between healthcare practitioners and their patients who either share or do not share a first language. Of particular interest are the nature and efficacy of communication in language-discordant conversations, and the degree to which risk is communicated. Our aim is to understand language barriers and miscommunication that may occur in healthcare settings between patients and healthcare practitioners, especially where at least one of the speakers is using a second (weaker) language. Methods/Design Eighty individual interactions between patients and practitioners who speak either English or Chinese (Mandarin or Cantonese) as their first language will be video recorded in a range of in- and out-patient departments at three hospitals in the Metro South area of Brisbane, Australia. All participants will complete a language background questionnaire. Patients will also complete a short survey rating the effectiveness of the interaction. Recordings will be transcribed and submitted to both quantitative and qualitative analyses to determine elements of the language used that might be particularly problematic and the extent to which language concordance and discordance impacts on the quality of the patient-practitioner consultation. Discussion Understanding the role that language plays in creating barriers to healthcare is critical for healthcare systems that are experiencing an increasing range of culturally and linguistically diverse populations both amongst patients and practitioners. 
The data resulting from this study will inform policy and practical solutions for communication training, provide an agenda for future research, and extend theory in health communication.
Abstract:
Introduction. The venous drainage system within vertebral bodies (VBs) has been well documented previously in cadaveric specimens. Advances in 3D imaging and image processing now allow for in vivo quantification of larger venous vessels, such as the basivertebral vein. Differences between healthy and scoliotic VB veins can therefore be investigated. Methods. 20 healthy adolescent controls and 21 adolescent idiopathic scoliosis (AIS) patients were recruited (with ethics approval) to undergo 3D MRI, using a 3 Tesla, T1-weighted 3D gradient echo sequence, resulting in 512 slices across the thoraco-lumbar spine, with a voxel size of 0.5x0.5x0.5mm. Using Amira Filament Editor, five transverse slices through the VB were examined simultaneously and the resulting observable vascular network traced. Each VB was assessed, and a vascular network recorded when observable. A local coordinate system was created in the centre of each VB and the vascular networks aligned to this. The length of the vascular network on the left and right sides (with a small central region) of the VB was calculated, and the spatial patterning of the networks assessed level-by-level within each subject. Results. An average of 6 (range 4-10) vascular networks, consistent with descriptions of the basivertebral vein, were identifiable within each subject, most commonly between T10-L1. Differences were seen in the left/right distribution of vessels in the control and AIS subjects. Healthy controls showed a percentage distribution of 29:18:53 across the left:centre:right regions respectively, whereas the AIS subjects had a slightly shifted distribution of 33:25:42. The control group showed consistent spatial patterning of the vascular networks across most levels, but this was not seen in the AIS group. Conclusion. Observation and quantification of the basivertebral vein in vivo is possible using 3D MRI.
The AIS group lacked the spatial pattern repetition seen in the control group and minor differences were seen in the left/right distribution of vessels.
Abstract:
Introduction: Patients with rheumatoid arthritis (RA) have increased risk of cardiovascular (CV) events. We sought to test the hypothesis that due to increased inflammation, CV disease and risk factors are associated with increased risk of future RA development. Methods: The population-based Nord-Trøndelag health surveys (HUNT) were conducted among the entire adult population of Nord-Trøndelag, Norway. All inhabitants 20 years or older were invited, and information was collected through comprehensive questionnaires, a clinical examination, and blood samples. In a cohort design, data from HUNT2 (1995-1997, baseline) and HUNT3 (2006-2008, follow-up) were obtained to study participants with RA (n = 786) or osteoarthritis (n = 3,586) at HUNT3 alone, in comparison with individuals without RA or osteoarthritis at both times (n = 33,567). Results: Female gender, age, smoking, body mass index, and history of previous CV disease were associated with self-reported incident RA (previous CV disease: odds ratio 1.52 (95% confidence interval 1.11-2.07)). The findings regarding previous CV disease were confirmed in sensitivity analyses excluding participants with psoriasis (odds ratio (OR) 1.70 (1.23-2.36)) or restricting the analysis to cases with a hospital diagnosis of RA (OR 1.90 (1.10-3.27)) or carriers of the shared epitope (OR 1.76 (1.13-2.74)). History of previous CV disease was not associated with increased risk of osteoarthritis (OR 1.04 (0.86-1.27)). Conclusion: A history of previous CV disease was associated with increased risk of incident RA but not osteoarthritis.
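The odds ratios quoted above come from adjusted models, but the underlying quantity is the familiar cross-product of a 2x2 table. The counts in this sketch are invented for illustration, not HUNT data.

```python
# Unadjusted odds ratio from a 2x2 table of CV-disease history vs. incident
# RA. The counts are invented; the study's ORs come from adjusted models.

def odds_ratio(a, b, c, d):
    """a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases."""
    return (a * d) / (b * c)

# hypothetical 2x2 counts
or_cv = odds_ratio(60, 500, 726, 9200)
print(round(or_cv, 2))  # -> 1.52 for these invented counts
```

A confidence interval for such an OR is usually obtained on the log scale, with standard error sqrt(1/a + 1/b + 1/c + 1/d).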
Abstract:
Asthma prevalence in children has remained relatively constant in many Western countries, but hospital admissions for younger age groups have increased over time.1 Although the role of outdoor aeroallergens as triggers for asthma exacerbations requiring hospitalization in children and adolescents is complex, there is evidence that increasing concentrations of grass pollen are associated with an increased risk of asthma exacerbations in children.2 Human rhinovirus (HRV) infections are implicated in most of the serious asthma exacerbations in school-age children.3 In previous research, HRV infections and aeroallergen exposure have usually been studied independently. To our knowledge, only 1 study has examined interactions between these 2 factors,4 but lack of power prevented any meaningful interpretation...
Abstract:
Background Falls are the most frequent adverse events reported in hospitals. We examined the effectiveness of individualised falls-prevention education for patients, supported by training and feedback for staff, delivered as a ward-level programme. Methods Eight rehabilitation units in general hospitals in Australia participated in this stepped-wedge, cluster-randomised study, undertaken during a 50 week period. Units were randomly assigned to intervention or control groups by use of computer-generated, random allocation sequences. Patients admitted to the unit during the study with a Mini-Mental State Examination (MMSE) score of more than 23/30 received individualised education, based on principles of health-behaviour change, from a trained health professional, in addition to usual care. We provided staff, who were trained to support the uptake of strategies by patients, with information about patients' goals, feedback about the ward environment, and perceived barriers to engagement in falls-prevention strategies. The coprimary outcome measures were the rate of falls per 1000 patient-days and the proportion of patients who were fallers. All analyses were by intention to treat. This trial is registered with the Australian New Zealand Clinical Trials Registry, number ACTRN12612000877886. Findings Between Jan 13 and Dec 27, 2013, 3606 patients were admitted to the eight units (n=1983 control period; n=1623 intervention period). There were fewer falls (n=196, 7·80/1000 patient-days vs n=380, 13·78/1000 patient-days; adjusted rate ratio 0·60 [robust 95% CI 0·42–0·94], p=0·003), injurious falls (n=66, 2·63/1000 patient-days vs n=131, 4·75/1000 patient-days; 0·65 [robust 95% CI 0·42–0·88], p=0·006), and fallers (n=136 [8·38%] vs n=248 [12·51%]; adjusted odds ratio 0·55 [robust 95% CI 0·38–0·81], p=0·003) in the intervention group compared with the control group.
There was no significant difference in length of stay (intervention median 11 days [IQR 7–19], control 10 days [6–18]). Interpretation Individualised patient education combined with training and feedback for staff, added to usual care, reduced the rates of falls and injurious falls in older patients in hospital rehabilitation units.
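The crude rate ratio implied by the fall rates reported above can be recomputed directly; the published 0·60 is model-adjusted, so the crude figure differs slightly.

```python
# Crude rate ratio of falls per 1000 patient-days, intervention vs. control,
# using the rates reported in the abstract (the published 0.60 is adjusted).

def rate_per_1000(events, patient_days):
    """Event rate expressed per 1000 patient-days."""
    return 1000.0 * events / patient_days

# patient-days back-calculated from the reported counts and rates
pd_intervention = 196 / 7.80 * 1000   # ~25,100 patient-days
pd_control = 380 / 13.78 * 1000       # ~27,600 patient-days

crude_rr = rate_per_1000(196, pd_intervention) / rate_per_1000(380, pd_control)
print(round(crude_rr, 2))  # -> 0.57, near the adjusted 0.60
```

The gap between 0.57 and 0.60 reflects the adjustment and the robust variance used in the stepped-wedge analysis.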