Abstract:
Estimates of maintenance and rehabilitation costs are subject to variation because of uncertainty in the input parameters. This paper presents the results of an analysis to identify the input parameters that affect the variability of predicted road deterioration. Road data from 1688 km of a national highway in the tropical northeast of Queensland, Australia, were used in the analysis. The data were analysed using a probability-based method, the Monte Carlo simulation technique, together with HDM-4’s roughness prediction model. The analysis indicated that, among the input parameters, the variability of pavement strength, rut depth, annual equivalent axle load and initial roughness affected the variability of the predicted roughness. The second part of the paper assesses the variation in cost estimates arising from the variability of these identified critical input parameters.
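For illustration, the probabilistic approach can be sketched as follows: sample the uncertain inputs from assumed distributions, propagate each draw through a roughness progression function, and screen which inputs drive the spread of the output. The distributions, parameter values and the `predict_roughness` stand-in below are hypothetical; the sketch does not reproduce HDM-4's actual model.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo draws

# Hypothetical input distributions -- means/spreads are illustrative only,
# not the values used in the study.
inputs = {
    "initial_roughness_iri": rng.normal(2.5, 0.4, N),   # IRI, m/km
    "pavement_strength_snc": rng.normal(3.0, 0.6, N),   # structural number
    "rut_depth_mm":          rng.normal(6.0, 2.0, N),
    "annual_esa_millions":   rng.lognormal(np.log(0.5), 0.3, N),
}

def predict_roughness(iri0, snc, rut, esa, years=10):
    """Toy stand-in for a roughness progression model (not HDM-4)."""
    return iri0 + years * (0.02 + 0.05 * esa / snc**2 + 0.002 * rut)

iri_future = predict_roughness(
    inputs["initial_roughness_iri"],
    inputs["pavement_strength_snc"],
    inputs["rut_depth_mm"],
    inputs["annual_esa_millions"],
)

print(f"Predicted roughness: mean {iri_future.mean():.2f}, std {iri_future.std():.2f}")

# Crude sensitivity screen: rank-order correlation between each input and the output.
for name, draws in inputs.items():
    rho, _ = spearmanr(draws, iri_future)
    print(f"{name:>22}: Spearman rho = {rho:+.2f}")
```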
Abstract:
This paper compares and reviews the recommendations and contents of the ACI Committee 440 guide for the design and construction of externally bonded FRP systems for strengthening concrete structures and the technical report Externally bonded FRP reinforcement for RC structures (FIB 14), as applied to the use of carbon fiber reinforced polymer (CFRP) composites in strengthening an ageing reinforced concrete headstock. The paper also discusses the background, limitations, flexural and shear strengthening provisions, and other issues relevant to FRP strengthening of a typical reinforced concrete headstock, such as durability, debonding, strengthening limits, fire and environmental conditions. A case study of strengthening a bridge headstock using FRP composites is presented as a worked example to illustrate and compare the differences between these two design guidelines when used in conjunction with the philosophy of the Austroads (1992) bridge design code.
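As a rough illustration of the kind of flexural check both guides address, the sketch below estimates the additional moment capacity from an externally bonded FRP plate using simple section equilibrium. It deliberately omits the strain limits, debonding checks and capacity-reduction factors that distinguish ACI 440 from FIB 14, and every section and material value is assumed.

```python
# Illustrative section equilibrium for an RC beam strengthened with an
# externally bonded FRP plate. All values are assumed; the strain limits,
# debonding checks and reduction factors of ACI 440 and FIB 14 are omitted.

b = 400.0        # section width, mm
d = 550.0        # effective depth to steel, mm
d_f = 600.0      # depth to FRP centroid (soffit), mm
fc = 32.0        # concrete compressive strength, MPa
fy = 500.0       # steel yield strength, MPa
As = 2000.0      # steel area, mm^2
Af = 1.2 * 100.0 # FRP area: plate thickness x width, mm^2
f_fe = 1500.0    # assumed effective FRP stress at ultimate, MPa

# Rectangular stress block: equate compression to tension (steel + FRP).
a = (As * fy + Af * f_fe) / (0.85 * fc * b)
a0 = (As * fy) / (0.85 * fc * b)  # unstrengthened case

Mn_unstrengthened = As * fy * (d - a0 / 2) / 1e6                                 # kNm
Mn_strengthened = (As * fy * (d - a / 2) + Af * f_fe * (d_f - a / 2)) / 1e6      # kNm

print(f"Unstrengthened Mn ~ {Mn_unstrengthened:.0f} kNm")
print(f"Strengthened Mn   ~ {Mn_strengthened:.0f} kNm")
```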
Abstract:
Objective: To evaluate staff perceptions of the working environment, efficiency and clinical safety of a cardiovascular intervention short stay unit (SSU) during its first year of operation. Design: Postal questionnaire. Setting: Cardiac catheterisation laboratory (CCL), coronary care unit (CCU), general cardiology ward (GCW) and short stay unit (SSU) of a tertiary referral hospital in the mid-coastal region of NSW. Subjects: Cardiologists (including visiting medical officers [VMOs]), cardiology fellows, cardiology advanced trainees and nurses. Results: Responses on the working environment of the SSU and the discharge process were statistically significant. A substantial proportion of both nurses and doctors had concerns about patient safety, even though no adverse events were formally recorded in the database. Conclusions: Although the survey participants agreed on the efficiency of the SSU in providing beds to the hospital, they disagreed on aspects that are important to the functioning of the SSU, including the working environment, patient selection and clinical safety. The results highlight potential issues that could be improved or addressed and are relevant to the rollout of SSUs across NSW.
Abstract:
Healthcare-associated methicillin-resistant Staphylococcus aureus (MRSA) infection may cause increased hospital stay or, sometimes, death. Quantifying this effect is complicated because infection is a time-dependent exposure: infection may prolong hospital stay, while longer stays increase the risk of infection. We overcome these problems by using a multinomial longitudinal model to estimate the daily probability of death and discharge. We then extend the basic model to estimate how the effect of MRSA infection varies over time, and to quantify the number of excess ICU days due to infection. We find that infection decreases the relative risk of discharge (relative risk ratio = 0.68, 95% credible interval: 0.54, 0.82), but is only indirectly associated with increased mortality. An infection on the first day of admission resulted in a mean extra stay of 0.3 days (95% CI: 0.1, 0.5) for a patient with an APACHE II score of 10, and 1.2 days (95% CI: 0.5, 2.0) for a patient with an APACHE II score of 30. The decrease in the relative risk of discharge remained fairly constant with the day of MRSA infection, but was slightly stronger closer to the start of infection. These results confirm the importance of MRSA infection in increasing ICU stay, but suggest that previous work may have systematically overestimated the effect size.
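A minimal sketch of this kind of daily-outcome model, fitted with statsmodels' MNLogit on simulated patient-days: each day a patient either remains in the ICU, is discharged, or dies, with APACHE II score and a time-dependent MRSA indicator as covariates. The data, coefficients and column names are invented; the study's actual model and credible intervals are not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days = 5000  # simulated patient-days (toy data, not the study's)

# Covariates: APACHE II score and a time-dependent MRSA-infection indicator.
apache = rng.integers(5, 35, n_days)
mrsa = rng.binomial(1, 0.1, n_days)

# Simulated daily outcome: 0 = remain in ICU, 1 = discharged alive, 2 = died.
# Coefficients below are invented purely to generate plausibly shaped data.
p_disch = np.exp(-1.0 - 0.03 * apache - 0.4 * mrsa)
p_death = np.exp(-4.0 + 0.08 * apache)
denom = 1 + p_disch + p_death
probs = np.column_stack([1 / denom, p_disch / denom, p_death / denom])
outcome = np.array([rng.choice(3, p=p) for p in probs])

X = sm.add_constant(pd.DataFrame({"apache": apache, "mrsa": mrsa}))
fit = sm.MNLogit(outcome, X).fit(disp=False)

params = np.asarray(fit.params)           # shape: (k_exog, n_categories - 1)
mrsa_row = list(X.columns).index("mrsa")  # rows follow the exog column order
rrr_discharge = np.exp(params[mrsa_row, 0])  # column 0: discharge vs remain
print(f"Relative risk ratio of discharge given MRSA ~ {rrr_discharge:.2f}")
```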
Abstract:
In Semester 1, 2007, a Monitoring Student Engagement study, conducted as part of the Enhancing Transition at Queensland University of Technology (ET@QUT) Project and extending earlier work in the Project by Arora (2006), mapped the processes and resources used at that time to identify, monitor and manage first-year students who were at risk of leaving QUT (Shaw, 2007). The study identified a lack of documentation of the processes and resources used and revealed an ad hoc rather than holistic and systematic approach to monitoring student engagement. One of Shaw’s recommendations was “To introduce a centralised case management approach to student engagement” (p. 14). That recommendation provided the genesis for the Student Success Project reported here. The aim of the Student Success Project is to trial, evaluate and ultimately establish holistic and systematic ways of helping students who appear to be at risk of failing or withdrawing from a unit to persist and succeed. Students are profiled as at risk if they are absent from more than two tutorials in a row without contacting their tutor, or if they fail to submit their first assignment. A Project Officer makes personal contact with these students to suggest ways they can get further assistance, depending on their situation.
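The profiling rule can be expressed as a simple check; the record structure and field names below are hypothetical, chosen only to illustrate the two criteria (more than two consecutive missed tutorials without tutor contact, or a missing first assignment).

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StudentRecord:
    """Hypothetical per-unit record; field names are illustrative only."""
    name: str
    tutorial_attendance: List[bool] = field(default_factory=list)  # True = attended
    contacted_tutor: bool = False
    first_assignment_submitted: bool = True

def longest_absence_run(attendance: List[bool]) -> int:
    """Length of the longest run of consecutive missed tutorials."""
    run = longest = 0
    for attended in attendance:
        run = 0 if attended else run + 1
        longest = max(longest, run)
    return longest

def is_at_risk(s: StudentRecord) -> bool:
    """Flag rule: >2 consecutive missed tutorials without tutor contact,
    OR first assignment not submitted."""
    missed_run = longest_absence_run(s.tutorial_attendance)
    return (missed_run > 2 and not s.contacted_tutor) or not s.first_assignment_submitted

# Example: three consecutive absences, no tutor contact -> flagged.
s = StudentRecord("A. Student", [True, False, False, False])
print(is_at_risk(s))  # True
```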
Abstract:
Professional practice guidelines for endoscope reprocessing recommend reprocessing endoscopes between each case and proper storage following reprocessing after the last case of the list. There is limited empirical evidence to support the efficacy of endoscope reprocessing prior to use in the first case of the day; however, internationally, many guidelines continue to recommend this practice. The aim of this study is to estimate a safe shelf life for flexible endoscopes in a high-turnover gastroenterology unit. Materials and methods: In a prospective observational study, all flexible endoscopes in active service during the 3-week study period were microbiologically sampled prior to reprocessing before the first case of the day (n = 200). The main outcome variables were culture status, organism cultured, and shelf life. Results: Among the total number of useable samples (n = 194), the overall contamination rate was 15.5%, with a pathogenic contamination rate of 0.5%. Mean time between the last case one day and reprocessing before the first case on the next day (that is, shelf life) was 37.62 h (SD 36.47). Median shelf life was 18.8 h (range 5.27–165.35 h). The most frequently identified organism was coagulase-negative Staphylococcus, an environmental nonpathogenic organism. Conclusions: When processed according to established guidelines, flexible endoscopes remain free from pathogenic organisms between last-case and next-day first-case use. Significant reductions in the expenditure of time and resources on reprocessing endoscopes have the potential to reduce the restraints experienced by high-turnover endoscopy units and improve service delivery.
Abstract:
Introduction: Some types of antimicrobial-coated central venous catheter (A-CVC) have been shown to be cost-effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, antimicrobial-coated central venous catheters to use. We re-evaluated the cost-effectiveness of all commercially available antimicrobial-coated central venous catheters for prevention of catheter-related bloodstream infection in adult intensive care unit (ICU) patients. Methods: We used a Markov decision model to compare the cost-effectiveness of antimicrobial-coated central venous catheters relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters; silver, platinum and carbon (SPC)-impregnated catheters; and two chlorhexidine and silver sulfadiazine-coated catheters, one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life-year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results: The baseline analysis, with no consideration of uncertainty, indicated that all four types of antimicrobial-coated central venous catheters were cost-saving relative to uncoated catheters. Minocycline and rifampicin-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life-years, and cost savings, AUD $130,289. After considering uncertainty in the current evidence, the minocycline and rifampicin-coated catheters returned the highest incremental net monetary benefit of $948 per catheter, but there was a 62% probability of error in this conclusion. Although the minocycline and rifampicin-coated catheters had the highest net monetary benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions: Current evidence suggests that the cost-effectiveness of using antimicrobial-coated central venous catheters within the ICU is highly uncertain. Policies to prevent catheter-related bloodstream infection amongst ICU patients should consider the cost-effectiveness of competing interventions in the light of this uncertainty. Decision makers would do well to consider the current gaps in knowledge and the complexity of producing good-quality evidence in this area.
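The decision metric reported above, the incremental net monetary benefit, is INMB = λ × ΔQALYs − Δcost, where λ is the willingness-to-pay per QALY. The sketch below evaluates it for a single coated-versus-uncoated comparison with placeholder inputs; it does not reproduce the study's Markov model or its probabilistic sensitivity analysis.

```python
# Incremental net monetary benefit for a coated vs uncoated catheter, per
# 1,000 catheters. All inputs are illustrative placeholders, not the values
# used in the study's Markov model.

wtp = 64_000.0                  # willingness-to-pay per QALY, AUD (assumed)
infections_averted = 15         # per 1,000 catheters (assumed)
cost_per_infection = 10_000.0   # assumed attributable cost of one CR-BSI, AUD
qalys_per_infection = 0.1       # assumed QALY loss per infection
extra_cost_per_catheter = 20.0  # assumed price premium of the coated catheter

delta_cost = 1_000 * extra_cost_per_catheter - infections_averted * cost_per_infection
delta_qalys = infections_averted * qalys_per_infection

inmb = wtp * delta_qalys - delta_cost
print(f"Incremental cost: AUD {delta_cost:,.0f} per 1,000 catheters")
print(f"Incremental QALYs: {delta_qalys:.1f}")
print(f"Incremental net monetary benefit: AUD {inmb:,.0f} "
      f"(AUD {inmb / 1000:,.0f} per catheter)")
```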
Abstract:
The purpose of this proof-of-concept study was to determine the relevance of direct measurements for monitoring the load applied on the osseointegrated fixation of transfemoral amputees during static load-bearing exercises. The objectives were (A) to introduce an apparatus using a three-dimensional load transducer, (B) to present a range of derived information relevant to clinicians, (C) to report on the outcomes of a pilot study and (D) to compare the measurements from the transducer with those from the current method using a weighing scale. One transfemoral amputee fitted with an osseointegrated implant was asked to apply 10 kg, 20 kg, 40 kg and 80 kg on the fixation, using self-monitoring with the weighing scale. The loading was directly measured with a portable kinetic system comprising a six-channel transducer, external interface circuitry and a laptop. As the prescribed load increased from 10 kg to 80 kg, the forces and moments applied on and around the antero-posterior axis increased 4-fold anteriorly and 14-fold medially, respectively. The forces and moments applied on and around the medio-lateral axis increased 9-fold laterally and 16-fold from anterior to posterior, respectively. The long axis of the fixation was overloaded in 17% and underloaded in 83% of the trials, by up to ±10%. This proof-of-concept study presents an apparatus that clinicians can use to improve basic knowledge of osseointegration, to design equipment for load-bearing exercises and to inform rehabilitation programs.
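A simple way to express the over/underload comparison is to convert the prescribed scale reading to a target force and compare it with the force measured along the long axis of the fixation. The measured readings below are invented for illustration; only the ±10% framing comes from the study.

```python
# Compare the prescribed static load (set on a weighing scale, in kg) with the
# force measured along the long axis of the fixation by a 6-channel transducer.
# The measured values below are invented for illustration.

G = 9.81  # gravitational acceleration, m/s^2

def long_axis_load_error(prescribed_kg: float, measured_long_axis_N: float) -> float:
    """Signed percentage error: positive = overload, negative = underload."""
    target_N = prescribed_kg * G
    return 100.0 * (measured_long_axis_N - target_N) / target_N

trials = {10: 92.0, 20: 205.0, 40: 370.0, 80: 801.0}  # kg -> measured N (hypothetical)
for kg, f_long in trials.items():
    err = long_axis_load_error(kg, f_long)
    state = "overloaded" if err > 0 else "underloaded"
    print(f"{kg:>3} kg prescribed: {state} by {abs(err):.1f}%")
```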
Abstract:
Aims: To describe a local data linkage project matching hospital data with the Australian Institute of Health and Welfare (AIHW) National Death Index (NDI) to assess long-term outcomes of intensive care unit patients. Methods: Data were obtained from hospital intensive care and cardiac surgery databases for all patients aged 18 years and over admitted to either of two intensive care units at a tertiary referral hospital between 1 January 1994 and 31 December 2005. Date of death was obtained from the AIHW NDI by probabilistic software matching, in addition to manual checking through hospital databases and other sources. Survival was calculated from the time of ICU admission, with a censoring date of 14 February 2007. For patients with multiple hospital admissions requiring intensive care, only data from the first admission were analysed. Summary and descriptive statistics were used for preliminary data analysis. Kaplan-Meier survival analysis was used to analyse factors determining long-term survival. Results: During the study period, 21 415 unique patients had 22 552 hospital admissions that included an ICU admission; 19 058 surgical procedures were performed, with a total of 20 092 ICU admissions. There were 4936 deaths. Median follow-up was 6.2 years, totalling 134 203 patient-years. The casemix was predominantly cardiac surgery (80%), followed by cardiac medical (6%) and other medical (4%). Unadjusted survival at 1, 5 and 10 years was 97%, 84% and 70%, respectively. One-year survival ranged from 97% for cardiac surgery to 36% for cardiac arrest. An APACHE II score was available for 16 877 patients. In those discharged alive from hospital, the 1-, 5- and 10-year survival varied with discharge location. Conclusions: ICU-based linkage projects are feasible for determining the long-term outcomes of ICU patients.
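A minimal sketch of the survival step on linked data, using the lifelines library: durations run from ICU admission to the NDI-matched date of death, or to the censoring date when no death was matched. The three-row dataset and its column names are invented.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Toy linked dataset: ICU admission date and NDI-matched date of death
# (NaT if no match). Column names and records are assumed for illustration.
CENSOR_DATE = pd.Timestamp("2007-02-14")

df = pd.DataFrame({
    "icu_admission": pd.to_datetime(["1999-03-01", "2001-07-15", "2004-11-30"]),
    "date_of_death": pd.to_datetime(["2003-05-20", pd.NaT, "2005-01-02"]),
})

died = df["date_of_death"].notna()
end = df["date_of_death"].fillna(CENSOR_DATE)
duration_years = (end - df["icu_admission"]).dt.days / 365.25

kmf = KaplanMeierFitter()
kmf.fit(durations=duration_years, event_observed=died)
print(kmf.survival_function_)   # unadjusted survival over follow-up
print(kmf.predict([1, 5, 10]))  # survival at 1, 5 and 10 years
```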
Abstract:
Principal Topic: There is increasing recognition that the organizational configuration of corporate venture units should depend on the types of ventures the unit seeks to develop (Burgelman, 1984; Hill and Birkinshaw, 2008). Distinctions have been made between internal and external as well as exploitative versus explorative ventures (Hill and Birkinshaw, 2008; Narayan et al., 2009; Schildt et al., 2005). Assuming that firms do not want to limit themselves to a single type of venture, but rather employ a portfolio of ventures, the logical consequence is that firms should employ multiple corporate venture units, with each venture unit tailor-made for the type of venture it seeks to develop. Surprisingly, the literature has paid limited attention to the challenges of managing multiple corporate venture units in a single firm. Maintaining multiple venture units within one firm provides easier access to funding for new ideas (Hamel, 1999). It allows the freedom and flexibility to tie the organizational systems (Rice et al., 2000), autonomy (Hill and Rothaermel, 2003), and involvement of management (Day, 1994; Wadwha and Kotha, 2006) to the requirements of the individual ventures. Yet the strategic objectives of a venture may change when uncertainty around the venture is resolved (Burgelman, 1984). For example, firms may decide to spin in external ventures (Chesbrough, 2002) or spin out ventures that prove strategically unimportant (Burgelman, 1984). This suggests that ventures might need to be transferred between venture units, e.g. from a more internally driven corporate venture division to a corporate venture capital unit. Several studies have suggested that ventures require different managerial skills across their phases of development (Desouza et al., 2007; O'Connor and Ayers, 2005; Kazanjian and Drazin, 1990; Westerman et al., 2006). To facilitate effective transfer between venture units and manage the overall venturing process, it is important that firms set up and manage integrative linkages. Integrative linkages provide synergies and coordination between differentiated units (Lawrence and Lorsch, 1967). Prior findings have pointed to the important role of senior management (Westerman et al., 2006; Gilbert, 2006) and a shared organizational vision (Burgers et al., 2009) in coordinating venture units with mainstream businesses. We draw on these literatures to investigate the key question of how to integratively manage multiple venture units. ---------- Methodology/Key Propositions: To answer the research question, we employ a case study approach that provides unique insights into how firms can break up their venturing process. We selected three Fortune 500 companies that employ multiple venturing units, IBM, Royal Dutch/Shell and Nokia, and investigated and compared their approaches. It was important that the case companies differed somewhat in the types of venture units they employed as well as in the way they integrate and coordinate their venture units. The data are based on extensive interviews and a variety of internal and external company documents to triangulate our findings (Eisenhardt, 1989). The key proposition of the article is that firms can best manage their multiple venture units through an ambidextrous design of loosely coupled units.
This design gives venture units sufficient flexibility to employ the organizational configurations that best support the type of venture they seek to develop, while providing sufficient integration to facilitate the smooth transfer of ventures between units. Based on the case findings, we develop a generic framework for a new way of managing the venturing process through multiple corporate venture units. ---------- Results and Implications: One of our main findings is that these firms tend to organize their venture units according to phases in the venture development process. That is, they tend to have venture units aimed at the incubation of venture ideas as well as units aimed more at the commercialization of ventures into a new business unit for the firm or a start-up. The companies in our case studies tended to coordinate venture units through integrative management skills or a coordinative venture unit that spanned multiple phases. We believe this paper makes two significant contributions. First, we extend the prior venturing literature by addressing how firms manage a portfolio of venture units, each achieving different strategic objectives. Second, our framework provides recommendations on how firms should manage such an approach to venturing. This helps to increase the likelihood of success of their venturing programs.
Abstract:
Over the past decade, there has been growth in the delivery of vocational rehabilitation services globally as countries seek to control disability-related expenditure, yet there has been minimal research outside the United States on the competencies required to work in this area. This study reports on research conducted in Australia to determine current job function and knowledge areas in terms of their importance and frequency of use in the provision of vocational rehabilitation. A survey comprising items from the Rehabilitation Skills Inventory-Amended and the International Survey of Disability Management was completed by 149 rehabilitation counselors, and the items were submitted to factor analysis. T-tests and analyses of variance were used to determine differences between importance and frequency scores, and differences in scores based on work setting and professional training. Six factors were identified as important and frequently used: (i) vocational counseling, (ii) professional practice, (iii) personal counseling, (iv) rehabilitation case management, (v) workplace disability case management, and (vi) workplace intervention and program management. Vocational counseling, professional practice and personal counseling were significantly more important and performed more frequently by respondents in vocational rehabilitation settings than by those in compensation settings. These same three factors were rated significantly higher in importance and frequency by those with rehabilitation counselor training than by those with other training. In conclusion, although ‘traditional’ knowledge and skill areas such as vocational counseling, professional practice and personal counseling were identified as central to vocational rehabilitation practice in Australian rehabilitation agencies, mean ratings suggest a growing emphasis on knowledge and skills associated with disability management practice.
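A minimal sketch of the analysis pipeline described above (exploratory factor analysis, a paired t-test on importance versus frequency ratings, and a one-way comparison by work setting), run on invented survey data with scikit-learn and SciPy; item names, group labels and ratings are all placeholders.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from scipy.stats import ttest_rel, f_oneway

rng = np.random.default_rng(1)

# Toy survey data: 149 respondents rating 30 items for importance and
# frequency on a 1-5 scale. All data and column names are invented.
n, k = 149, 30
importance = pd.DataFrame(rng.integers(1, 6, (n, k)),
                          columns=[f"item_{i}" for i in range(k)])
frequency = pd.DataFrame(rng.integers(1, 6, (n, k)), columns=importance.columns)
setting = rng.choice(["vocational_rehab", "compensation"], n)

# Exploratory factor analysis on the importance ratings (6 factors, as in the study).
fa = FactorAnalysis(n_components=6, random_state=0).fit(importance)
loadings = pd.DataFrame(fa.components_.T, index=importance.columns)
print(loadings.round(2).head())

# Paired t-test: importance vs frequency for a single item.
t, p = ttest_rel(importance["item_0"], frequency["item_0"])
print(f"item_0 importance vs frequency: t = {t:.2f}, p = {p:.3f}")

# One-way ANOVA (here just two settings) on importance of a single item.
groups = [importance.loc[setting == s, "item_0"] for s in np.unique(setting)]
F, p = f_oneway(*groups)
print(f"item_0 by work setting: F = {F:.2f}, p = {p:.3f}")
```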