871 results for Webster
Abstract:
Background How accurately do people perceive extreme water speeds and how does their perception affect perceived risk? Prior research has focused on the characteristics of moving water that can reduce human stability or balance. The current research presents the first experiment on people's perceptions of risk and moving water at different speeds and depths. Methods Using a randomized within-person 2 (water depth: 0.45, 0.90 m) × 3 (water speed: 0.4, 0.8, 1.2 m/s) experiment, we immersed 76 people in moving water and asked them to estimate water speed and the risk they felt. Results Multilevel modeling showed that people increasingly overestimated water speeds as actual water speeds increased or as water depth increased. Water speed perceptions mediated the direct positive relationship between actual water speeds and perceptions of risk; the faster the moving water, the greater the perceived risk. Participants' prior experience with rip currents and tropical cyclones moderated the strength of the actual–perceived water speed relationship; consequently, mediation was stronger for people who had experienced no rip currents or fewer storms. Conclusions These findings provide a clearer understanding of water speed and risk perception, which may help communicate the risks associated with anticipated floods and tropical cyclones.
Abstract:
Background How accurately do people perceive extreme wind speeds and how does that perception affect the perceived risk? Prior research on human–wind interaction has focused on comfort levels in urban settings or knock-down thresholds. No systematic experimental research has attempted to assess people's ability to estimate extreme wind speeds and perceptions of their associated risks. Method We exposed 76 people to 10, 20, 30, 40, 50, and 60 mph (4.5, 8.9, 13.4, 17.9, 22.3, and 26.8 m/s) winds in randomized orders and asked them to estimate wind speed and the corresponding risk they felt. Results Multilevel modeling showed that people were accurate at lower wind speeds but overestimated wind speeds at higher levels. Wind speed perceptions mediated the direct relationship between actual wind speeds and perceptions of risk (i.e., the greater the perceived wind speed, the greater the perceived risk). The number of tropical cyclones people had experienced moderated the strength of the actual–perceived wind speed relationship; consequently, mediation was stronger for people who had experienced fewer storms. Conclusion These findings provide a clearer understanding of wind and risk perception, which can aid development of public policy solutions toward communicating the severity and risks associated with natural disasters.
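The mediation logic described in the two abstracts above (actual speed → perceived speed → perceived risk) can be sketched numerically. This is a minimal illustration on simulated data, not the studies' multilevel models: the coefficients (a = 0.9, b = 0.7), the noise levels and the sample are assumptions, and plain OLS stands in for multilevel estimation.

```python
import random

# Hypothetical illustration of the mediation chain described above:
# actual wind speed (X) -> perceived wind speed (M) -> perceived risk (Y).
# All data are simulated; the path coefficients are assumptions, not study values.
random.seed(42)
n = 5000
X = [random.uniform(4.5, 26.8) for _ in range(n)]   # actual speed, m/s
M = [0.9 * x + random.gauss(0, 1.0) for x in X]     # perceived speed (path a = 0.9)
Y = [0.7 * m + random.gauss(0, 1.0) for m in M]     # perceived risk (path b = 0.7)

def mean(v):
    return sum(v) / len(v)

def ols_slope(x, y):
    """OLS slope of y on a single predictor x."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def ols2_slopes(x1, x2, y):
    """OLS slopes of y on two predictors, via the 2x2 normal equations."""
    m1, m2, my = mean(x1), mean(x2), mean(y)
    s11 = sum((u - m1) ** 2 for u in x1)
    s22 = sum((v - m2) ** 2 for v in x2)
    s12 = sum((u - m1) * (v - m2) for u, v in zip(x1, x2))
    s1y = sum((u - m1) * (w - my) for u, w in zip(x1, y))
    s2y = sum((v - m2) * (w - my) for v, w in zip(x2, y))
    det = s11 * s22 - s12 ** 2
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

a = ols_slope(X, M)               # path a: X -> M
c_dir, b = ols2_slopes(X, M, Y)   # direct effect c' and path b: M -> Y given X
indirect = a * b                  # mediated (indirect) effect
print(round(a, 2), round(b, 2), round(indirect, 2))
```

The indirect effect is the product a·b; in the published analyses the equivalent quantity was estimated within a multilevel framework, with prior storm experience moderating the a path.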
Abstract:
This Perspective reflects on the withdrawal of the Liverpool Care Pathway in the UK, and its implications for Australia. Integrated care pathways are documents which outline the essential steps of multidisciplinary care in addressing a specific clinical problem. They can be used to introduce best clinical practice, to ensure that the most appropriate management occurs at the most appropriate time and that it is provided by the most appropriate health professional. By providing clear instructions, decision support and a framework for clinician-patient interactions, care pathways guide the systematic provision of best evidence-based care. The Liverpool Care Pathway (LCP) is an example of an integrated care pathway, designed in the 1990s to guide care for people with cancer who are in their last days of life and are expected to die in hospital. This pathway evolved out of a recognised local need to better support non-specialist palliative care providers in caring for patients dying of cancer within their inpatient units. Historically, despite the large number of people in acute care settings whose treatment intent is palliative, dying patients receiving general hospital acute care tended to lack sufficient attention from senior medical staff and nursing staff. The quality of end-of-life care was considered inadequate; it was therefore felt that much could be learned from the way patients were cared for by palliative care services. The LCP was a strategy developed to improve end-of-life care in cancer patients and was based on the care received by those dying in the palliative care setting.
Abstract:
An in situ X-ray diffraction investigation of goethite-seeded Al(OH)3 precipitation from synthetic Bayer liquor at 343 K has been performed. The presence of iron oxides and oxyhydroxides in the Bayer process has implications for alumina reversion, which causes significant process losses through unwanted gibbsite precipitation, and is also relevant for the nucleation and growth of scale on mild steel process equipment. The gibbsite, bayerite and nordstrandite polymorphs of Al(OH)3 precipitated from the liquor; gibbsite appeared to precipitate first, with subsequent formation of bayerite and nordstrandite. A Rietveld-based approach to quantitative phase analysis was implemented for the determination of absolute phase abundances as a function of time, from which kinetic information for the formation of the Al(OH)3 phases was determined.
Abstract:
The Office of Urban Management recognises that the values which characterise the SEQ region as 'subtropical' are important determinants of form in urban and regional planning. Subtropical values are those qualities on which our regional identity depends. A built environment which responds positively to these values is a critical ingredient for achieving a desirable future for the region. The Centre for Subtropical Design has undertaken this study to identify the particular set of values which characterises SEQ, and to translate these values into design principles that will maintain and reinforce the value set. The principles not only apply to the overall balance between the natural environment and the built environment, but can also be applied by local government authorities to guide local planning schemes and help shape specific built form outcomes.
Abstract:
Our world is literally and figuratively turning to ‘dust’. This work acknowledges decay and renewal and the transitional, cyclical natures of interrelated ecologies. It also suggests advanced levels of degradation potentially beyond reparation. Dust exists both on and beneath the border of our unaided vision. Dust particles are predominantly forms of disintegrating solids that often become the substance or catalyst of future forms. Like many tiny forms, dust is an often unnoticed residue with ‘planet-size consequences’ (Hannah Holmes 2001). The image depicts an ethereal, backlit body, continually circling and morphing, apparently floating, suggesting endless cycles of birth, life and death and inviting differing states of meditation, exploration, stillness and play. This never-ending video work is taken from a large-scale interactive/media artwork created during a six-month research residency in England at the Institute of Contemporary Art London and at Vincent Dance Theatre Sheffield in 2006. It was originally presented on a raised floor screen made of pure white sand at the ICA in London. The project involved developing new interaction, engagement and image-making strategies for media arts practice, drawing on the application of both kinetic and proprioceptive dance/performance knowledges. The work was further informed by ecological network theory, which assesses the systemic implications of private and public actions within bounded systems. The creative methodology was primarily practice-led, which fostered the particular qualities of imagery generated through cross-fertilising embodied knowledge of Dance and Media Arts. This was achieved through extensive workshopping undertaken in theatres, working ‘on the floor’ live, with dancers, props, sound and projection. And eventually of course, all this dust must settle. (Holmes 2001, from dust jacket)
Holmes, H. 2001, The Secret Life of Dust: From the Cosmos to the Kitchen Counter, the Big Consequences of Little Things, p. 3.
Abstract:
Human half-lives of PentaBDE congeners have been estimated from the decline in serum concentrations measured over a 6–12 month period for a population of exchange students moving from North America to Australia. Australian serum PBDE concentrations are typically 5–10 times lower than in North America, and we can therefore hypothesize that if the biological half-life is sufficiently short we would observe a decline in serum concentration with length of residence in Australia. Thirty students were recruited over a period of 3 years, from whom serum was archived every 2 months during their stay in Australia. Australian residents (n=22) were also sampled longitudinally to estimate general population background levels. All serum samples were analyzed by gas chromatography high resolution mass spectrometry. Key findings confirmed that BDE-47 concentrations in the Australians (median 2.3;
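The half-life estimation described above rests on a simple first-order elimination model: C(t) = C0·e^(−kt), so ln C is linear in time and t½ = ln 2 / k. A minimal sketch of that arithmetic, using made-up serum concentrations rather than the study's data:

```python
import math

# Hypothetical serum BDE-47 concentrations (ng/g lipid) sampled every 2 months.
# These numbers are invented for illustration; they are not the study's data.
months = [0, 2, 4, 6, 8, 10, 12]
conc = [12.0, 9.7, 7.9, 6.4, 5.2, 4.2, 3.4]   # roughly exponential decline

# Log-linear least-squares fit: under first-order elimination the slope of
# ln C versus time equals -k.
logc = [math.log(c) for c in conc]
n = len(months)
mt = sum(months) / n
ml = sum(logc) / n
slope = (sum((t - mt) * (l - ml) for t, l in zip(months, logc))
         / sum((t - mt) ** 2 for t in months))
k = -slope                        # elimination rate constant, per month
half_life = math.log(2) / k       # months
print(round(half_life, 1))
```

With a design like the one described (bimonthly archived serum over the stay), each participant's decline can be fitted this way and the slopes summarised across the cohort.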
Abstract:
A three-year research program funded by the Australian Research Council and conducted by the four Learned Academies through the Australian Council of Learned Academies for PMSEIC, through the Office of the Chief Scientist. Securing Australia’s Future delivers research-based evidence and findings to support policy development in areas of importance to Australia’s future.
Abstract:
Background Surgical site infections (SSIs) are wound infections that occur after invasive (surgical) procedures. Preoperative bathing or showering with an antiseptic skin wash product is a well-accepted procedure for reducing skin bacteria (microflora). It is less clear whether reducing skin microflora leads to a lower incidence of surgical site infection. Objectives To review the evidence for preoperative bathing or showering with antiseptics for preventing hospital-acquired (nosocomial) surgical site infections. Search methods For this fifth update we searched the Cochrane Wounds Group Specialised Register (searched 18 December 2014); the Cochrane Central Register of Controlled Trials (The Cochrane Library 2014 Issue 11); Ovid MEDLINE (2012 to December Week 4 2014), Ovid MEDLINE (In-Process & Other Non-Indexed Citations December 18, 2014); Ovid EMBASE (2012 to 2014 Week 51), EBSCO CINAHL (2012 to December 18 2014) and reference lists of articles. Selection criteria Randomised controlled trials comparing any antiseptic preparation used for preoperative full-body bathing or showering with non-antiseptic preparations in people undergoing surgery. Data collection and analysis Two review authors independently assessed studies for selection, risk of bias and extracted data. Study authors were contacted for additional information. Main results We did not identify any new trials for inclusion in this fifth update. Seven trials involving a total of 10,157 participants were included. Four of the included trials had three comparison groups. The antiseptic used in all trials was 4% chlorhexidine gluconate (Hibiscrub/Riohex). Three trials involving 7791 participants compared chlorhexidine with a placebo. Bathing with chlorhexidine compared with placebo did not result in a statistically significant reduction in SSIs; the relative risk (RR) of SSI was 0.91 (95% confidence interval (CI) 0.80 to 1.04).
When only trials of high quality were included in this comparison, the RR of SSI was 0.95 (95% CI 0.82 to 1.10). Three trials of 1443 participants compared bar soap with chlorhexidine; when combined there was no difference in the risk of SSIs (RR 1.02, 95% CI 0.57 to 1.84). Three trials of 1192 patients compared bathing with chlorhexidine with no washing; one large study found a statistically significant difference in favour of bathing with chlorhexidine (RR 0.36, 95% CI 0.17 to 0.79). The smaller studies found no difference between patients who washed with chlorhexidine and those who did not wash preoperatively. Authors' conclusions This review provides no clear evidence of benefit for preoperative showering or bathing with chlorhexidine over other wash products to reduce surgical site infection. Efforts to reduce the incidence of nosocomial surgical site infection should focus on interventions where an effect has been demonstrated.
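The relative risks quoted above follow from standard 2×2-table arithmetic, with the confidence interval built on the log scale. A minimal sketch with hypothetical counts (not the trials' actual data):

```python
import math

# How a relative risk and its 95% CI are computed from a 2x2 table.
# The counts below are invented for illustration only.
a, n1 = 90, 1000    # SSIs / total in the antiseptic (chlorhexidine) group
b, n2 = 100, 1000   # SSIs / total in the placebo group

rr = (a / n1) / (b / n2)                   # relative risk
se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)    # standard error of log(RR)
lo = math.exp(math.log(rr) - 1.96 * se)
hi = math.exp(math.log(rr) + 1.96 * se)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

With these made-up counts the point estimate happens to land near the review's pooled 0.91, but the interval is wider, since a single small table carries less information than the pooled trials.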
Abstract:
Introduction Vascular access devices (VADs), such as peripheral or central venous catheters, are vital across all medical and surgical specialties. To allow therapy or haemodynamic monitoring, VADs frequently require administration sets (AS) composed of infusion tubing, fluid containers, pressure-monitoring transducers and/or burettes. While VADs are replaced only when necessary, AS are routinely replaced every 3–4 days in the belief that this reduces infectious complications. Strong evidence supports AS use up to 4 days, but there is less evidence for AS use beyond 4 days. AS replacement twice weekly increases hospital costs and workload. Methods and analysis This is a pragmatic, multicentre, randomised controlled trial (RCT) of equivalence design comparing AS replacement at 4 (control) versus 7 (experimental) days. Randomisation is stratified by site and device, centrally allocated and concealed until enrolment. 6554 adult/paediatric patients with a central venous catheter, peripherally inserted central catheter or peripheral arterial catheter will be enrolled over 4 years. The primary outcome is VAD-related bloodstream infection (BSI) and secondary outcomes are VAD colonisation, AS colonisation, all-cause BSI, all-cause mortality, number of AS per patient, VAD time in situ and costs. Relative incidence rates of VAD-BSI per 100 devices and hazard rates per 1000 device days (95% CIs) will summarise the impact of 7-day relative to 4-day AS use and test equivalence. Kaplan-Meier survival curves (with log-rank Mantel-Cox test) will compare VAD-BSI over time. Appropriate parametric or non-parametric techniques will be used to compare secondary end points. p values of <0.05 will be considered significant.
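The protocol above summarises VAD-BSI two ways: per 100 devices and per 1000 device-days. The distinction is the denominator (device count versus total dwell time), and the arithmetic is straightforward; a sketch with invented numbers, not trial results:

```python
# Two ways of expressing the same infection count, as named in the
# protocol above. All numbers here are hypothetical illustrations.
infections = 18
devices = 3277          # hypothetical number of devices in one arm
device_days = 22940     # hypothetical total catheter dwell time, in days

per_100_devices = 100 * infections / devices       # cumulative incidence
per_1000_days = 1000 * infections / device_days    # incidence density
print(round(per_100_devices, 2), round(per_1000_days, 2))
```

The per-device figure answers "what fraction of catheters became infected", while the per-1000-device-days figure adjusts for how long catheters stayed in, which matters when comparing 4-day and 7-day replacement schedules.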
Abstract:
Background International standard practice for the correct confirmation of the central venous access device is the chest X-ray. The intracavitary electrocardiogram-based insertion method is radiation-free, and allows real-time placement verification, providing immediate treatment and reduced requirement for post-procedural repositioning. Methods Relevant databases were searched for prospective randomised controlled trials (RCTs) or quasi RCTs that compared the effectiveness of electrocardiogram-guided catheter tip positioning with placement using surface-anatomy-guided insertion plus chest X-ray confirmation. The primary outcome was accurate catheter tip placement. Secondary outcomes included complications, patient satisfaction and costs. Results Five studies involving 729 participants were included. Electrocardiogram-guided insertion was more accurate than surface-anatomy-guided insertion (odds ratio 8.3; 95% confidence interval (CI) 1.38 to 50.07; p=0.02). There was a lack of reporting on complications, patient satisfaction and costs. Conclusion The evidence suggests that intracavitary electrocardiogram-based positioning is superior to surface-anatomy-guided positioning of central venous access devices, leading to significantly more successful placements. This technique could potentially remove the requirement for post-procedural chest X-ray, especially during peripherally inserted central catheter (PICC) line insertion.
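The odds ratio reported above (8.3, with a wide 95% CI of 1.38 to 50.07) is the standard 2×2-table effect measure; the width of the interval reflects small event counts. A minimal sketch of the calculation with hypothetical counts (not the review's pooled data):

```python
import math

# How an odds ratio and its 95% CI are derived from a 2x2 table.
# The counts below are invented for illustration only.
a, b = 96, 4     # ECG-guided arm: accurate / not accurate placements
c, d = 74, 26    # surface-anatomy arm: accurate / not accurate placements

odds_ratio = (a * d) / (b * c)                 # cross-product ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # Woolf SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(round(odds_ratio, 1), round(lo, 2), round(hi, 2))
```

Note how a cell with only a few events (here b = 4) dominates the standard error and stretches the upper bound, which is the same mechanism behind the review's wide interval.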
Abstract:
BACKGROUND After general surgery, the lower limb experiences some of the highest complication rates. However, little is known about contributing factors to surgical site failure in the lower limb dermatological surgery population. OBJECTIVE To determine the incidence of lower limb surgical site failure and to explore the predictors that contribute to surgical site failure. METHODS A prospective observational study design was used to collect data from 73 participants, from July 2010, to March 2012. Incidence was determined as a percentage of surgical site failure from the total population. Predictors were determined by the use of a binary logistic regression model. RESULTS The surgical site failure rate was 53.4%. Split-skin grafting had a higher failure rate than primary closures, 66% versus 26.1%. Predictors of lower limb surgical site failure were identified as increasing age (p = .04) and the presence of postoperative hematoma (p = .01), with all patients who developed surgical site infection experiencing surgical site failure (p = .01). CONCLUSION Findings from this study confirmed that the lower limb is at high risk of surgical site failure. Two predictors of surgical site failure from this cohort were determined. However, to understand this phenomenon and make recommendations to assist and reduce surgical site complications, further research in this field is required.