132 results for Complementary risks


Relevance: 20.00%

Abstract:

BACKGROUND: Standardising handover processes and content, and using context-specific checklists are proposed as solutions to mitigate risks for preventable errors and patient harm associated with clinical handovers. OBJECTIVES: To adapt existing tools to standardise nursing handover from the intensive care unit (ICU) to the cardiac ward and assess patient safety risks before and after pilot implementation. METHODS: A three-stage, pre-post interrupted time-series design was used. Data were collected using naturalistic observations and audio-recording of 40 handovers and focus groups with 11 nurses. In Stage 1, examination of existing practice using observation of 20 handovers and a focus group interview provided baseline data. In Stage 2, existing tools for high-risk handovers were adapted to create tools specific to ICU-to-ward handovers. The adapted tools were introduced to staff using principles from evidence-based frameworks for practice change. In Stage 3, observation of 20 handovers and a focus group with five nurses were used to verify the design of tools to standardise handover by ICU nurses transferring care of cardiac surgical patients to ward nurses. RESULTS: Stage 1 data revealed variable and unsafe ICU-to-ward handover practices: incomplete ward preparation; failure to check patient identity; handover located away from patients; and information gaps. Analyses informed adaptation of process, content and checklist tools to standardise handover in Stage 2. Compared with baseline data, Stage 3 observations revealed that nurses used the tools consistently and that ward readiness to receive patients (10% vs 95%), checking of patient identity (0% vs 100%), delivery of handover at the bedside (25% vs 100%) and communication of complete information (40% vs 100%) all improved. CONCLUSION: Clinician adoption of tools to standardise ICU-to-ward handover of cardiac surgical patients reduced handover variability and patient safety risks.
The study outcomes provide context-specific tools to guide handover processes and delivery of verbal content, a safety checklist, and a risk recognition matrix.
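The paper's actual tools are not reproduced in the abstract. As an illustrative sketch only, with item names inferred from the four outcome measures reported above rather than taken from the published checklist, a pre-transfer safety check of this kind could be modelled as:

```python
# Illustrative only: these four item names are inferred from the outcome
# measures reported in the abstract, not taken from the paper's actual tools.
REQUIRED_ITEMS = (
    "ward_prepared",         # receiving ward ready before transfer
    "identity_checked",      # patient identity verified
    "bedside_handover",      # handover delivered at the patient's bedside
    "complete_information",  # all required content items communicated
)

def handover_safe(completed):
    """Return True only when every required checklist item is completed."""
    return all(item in completed for item in REQUIRED_ITEMS)

def missing_items(completed):
    """List outstanding items, preserving checklist order."""
    return [item for item in REQUIRED_ITEMS if item not in completed]
```

A checklist modelled this way makes the "all items or no transfer" rule explicit, which mirrors the study's finding that consistent tool use removed information gaps.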

Relevance: 20.00%

Abstract:

Responding to an emergency alarm poses a significant risk to firefighters' health and safety, particularly to cardiovascular health, physical and psychological stress, and fatigue. These risks have been largely categorised for salaried firefighters working 'on station'. Less is known about the factors that contribute to these risks for the vast number of non-salaried personnel who serve in retained roles, often deploying from home. The present study investigated the alarm response procedure for Australian metropolitan firefighters, identifying common and divergent sources of risk for salaried and retained staff. There were significant differences in procedure between the two workgroups, which resulted in different risk profiles between groups. Sleep and fatigue, actual response to the alarm stimulus, work-life balance and trauma emerged as sources of risk experienced differently by salaried and retained firefighters. Key findings included reports of fatigue in both groups, but particularly in the case of retained firefighters, who manage primary employment as well as their retained position. This also translated into a poor sense of work-life balance. Both groups reported light sleep, insufficient sleep or fragmented sleep as a result of alarm response. In the case of salaried firefighters, this was associated with being woken on station when other appliances are called. There were risks from physical and psychological responses to the alarm stimulus, and reports of sleep inertia when driving soon after waking. The findings of this study highlight the common and divergent risks for these workgroups, and could be used in the ongoing management of firefighters' health and safety.

Relevance: 20.00%

Abstract:

Schizophrenia risk has often been conceptualized using a model which requires two hits to generate the clinical phenotype: the first an early priming in a genetically predisposed individual, and the second a likely environmental insult. The aim of this paper was to review the literature and reformulate this binary risk-vulnerability model. We sourced the data for this narrative review from the electronic database PUBMED. Our search terms were not limited by language or date of publication. The development of schizophrenia may be driven by genetic vulnerability interacting with multiple vulnerability factors including lowered prenatal vitamin D exposure, viral infections, smoking, intelligence quotient, social cognition, cannabis use, social defeat, nutrition and childhood trauma. It is likely that these genetic risks, environmental risks and vulnerability factors are cumulative and interactive with each other and with critical periods of neurodevelopmental vulnerability. The development of schizophrenia is likely to be more complex and nuanced than the binary two-hit model originally proposed nearly thirty years ago. Risk appears influenced by a more complex process involving genetic risk interfacing with multiple potentially interacting hits and vulnerability factors occurring at key periods of neurodevelopmental activity, which culminate in the expression of disease state. These risks are common across a number of neuropsychiatric and medical disorders, which might inform common preventive and intervention strategies across non-communicable disorders.

Relevance: 20.00%

Abstract:

Background: The behaviour of hospitalized older adults can contribute to falls, a common adverse event during and after hospitalization. Objective: To understand why older adults take risks that may lead to falls in the hospital setting and in the transition period following discharge home. Design: Qualitative research. Setting and participants: Hospital patients from inpatient medical and rehabilitation wards (n = 16), their informal caregivers (n = 8), and health professionals (n = 33) recruited from Southern Health hospital facilities, Victoria, Australia. Main variables studied: Perceived motivations for, and factors contributing to, risk taking that may lead to falls. Main outcome measures: Semi-structured, in-depth interviews and focus groups were used to generate qualitative data. Interviews were conducted both 2 weeks post-hospitalization and 3 months post-hospitalization. Results: Risk taking was classified as: (i) enforced; (ii) voluntary and informed; and (iii) voluntary and misinformed. Five key factors that influence risk-taking behaviour were: (i) the risk compensation ability of the older adult, (ii) willingness to ask for help, (iii) the older adult's desire to test their physical boundaries, (iv) communication failure between and within older adults, informal caregivers and health professionals, and (v) delayed provision of help. Discussion and Conclusion: Tension exists between taking risks as a part of rehabilitation and the effect this has on the likelihood of falling. Health professionals and caregivers played a central role in mitigating unnecessary risk taking, though some older adults appear more likely to take risks than others by virtue of their attitudes.

Relevance: 20.00%

Abstract:

Background: The Global Burden of Diseases, Injuries, and Risk Factors Study 2013 (GBD 2013) is the first of a series of annual updates of the GBD. Risk factor quantification, particularly of modifiable risk factors, can help to identify emerging threats to population health and opportunities for prevention. The GBD 2013 provides a timely opportunity to update the comparative risk assessment with new data for exposure, relative risks, and evidence on the appropriate counterfactual risk distribution. Methods: Attributable deaths, years of life lost, years lived with disability, and disability-adjusted life-years (DALYs) have been estimated for 79 risks or clusters of risks using the GBD 2010 methods. Risk-outcome pairs meeting explicit evidence criteria were assessed for 188 countries for the period 1990-2013 by age and sex using three inputs: risk exposure, relative risks, and the theoretical minimum risk exposure level (TMREL). Risks are organised into a hierarchy with blocks of behavioural, environmental and occupational, and metabolic risks at the first level of the hierarchy. The next level in the hierarchy includes nine clusters of related risks and two individual risks, with more detail provided at levels 3 and 4 of the hierarchy. Compared with GBD 2010, six new risk factors have been added: handwashing practices, occupational exposure to trichloroethylene, childhood wasting, childhood stunting, unsafe sex, and low glomerular filtration rate. For most risks, data for exposure were synthesised with a Bayesian meta-regression method, DisMod-MR 2.0, or spatial-temporal Gaussian process regression. Relative risks were based on meta-regressions of published cohort and intervention studies. Attributable burden for clusters of risks and all risks combined took into account evidence on the mediation of some risks, such as high body-mass index (BMI), through other risks, such as high systolic blood pressure and high cholesterol.
Findings: All risks combined account for 57·2% (95% uncertainty interval [UI] 55·8-58·5) of deaths and 41·6% (40·1-43·0) of DALYs. Risks quantified account for 87·9% (86·5-89·3) of cardiovascular disease DALYs, ranging to a low of 0% for neonatal disorders and neglected tropical diseases and malaria. In terms of global DALYs in 2013, six risks or clusters of risks each caused more than 5% of DALYs: dietary risks accounting for 11·3 million deaths and 241·4 million DALYs, high systolic blood pressure for 10·4 million deaths and 208·1 million DALYs, child and maternal malnutrition for 1·7 million deaths and 176·9 million DALYs, tobacco smoke for 6·1 million deaths and 143·5 million DALYs, air pollution for 5·5 million deaths and 141·5 million DALYs, and high BMI for 4·4 million deaths and 134·0 million DALYs. Risk factor patterns vary across regions and countries and with time. In sub-Saharan Africa, the leading risk factors are child and maternal malnutrition, unsafe sex, and unsafe water, sanitation, and handwashing. In women, in nearly all countries in the Americas, north Africa, and the Middle East, and in many other high-income countries, high BMI is the leading risk factor, with high systolic blood pressure as the leading risk in most of Central and Eastern Europe and south and east Asia. For men, high systolic blood pressure or tobacco use are the leading risks in nearly all high-income countries, in north Africa and the Middle East, Europe, and Asia. For men and women, unsafe sex is the leading risk in a corridor from Kenya to South Africa. Interpretation: Behavioural, environmental and occupational, and metabolic risks can explain half of global mortality and more than one-third of global DALYs, providing many opportunities for prevention. Of the larger risks, the attributable burden of high BMI has increased in the past 23 years.
In view of the prominence of behavioural risk factors, behavioural and social science research on interventions for these risks should be strengthened. Many prevention and primary care policy options are available now to act on key risks.
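The comparative risk assessment described above attributes burden by contrasting observed exposure with the TMREL counterfactual. A minimal sketch of the standard population attributable fraction (PAF) calculation underlying such estimates, not the GBD's actual DisMod-MR 2.0 machinery, with wholly illustrative exposure levels and relative risks:

```python
def attributable_fraction(exposure, counterfactual, relative_risk):
    """Population attributable fraction (PAF) for one risk-outcome pair.

    exposure / counterfactual: prevalence of each exposure level (each sums
    to 1); the counterfactual is the TMREL distribution. relative_risk: RR
    for each exposure level relative to the unexposed level.
    """
    observed = sum(p * rr for p, rr in zip(exposure, relative_risk))
    ideal = sum(p * rr for p, rr in zip(counterfactual, relative_risk))
    return (observed - ideal) / observed

def attributable_burden(paf, total_burden):
    """Deaths or DALYs attributable to the risk factor."""
    return paf * total_burden
```

For example, with 50% unexposed, 30% moderately exposed (RR 1.5) and 20% highly exposed (RR 3.0), and a counterfactual of everyone unexposed, the PAF is 0.55/1.55, roughly 35% of the outcome's burden.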

Relevance: 20.00%

Abstract:

Clinical trial adaptation refers to any adjustment of the trial protocol after the onset of the trial. Such adjustment may take various forms, including a change in the dose of administered medicines, the frequency of administering an intervention, the number of trial participants, or the duration of the trial, to name just some possibilities. The main goal is to make the process of introducing new medical interventions to patients more efficient, either by reducing the cost or the time associated with evaluating their safety and efficacy. The principal challenge, which remains an open research problem, lies in how adaptation should be performed so as to minimize the chance of distorting the outcome of the trial. In this paper we propose a novel method for achieving this. Unlike most of the previously published work, our approach focuses on trial adaptation by sample size adjustment, i.e. by reducing the number of trial participants in a statistically informed manner. We adopt a stratification framework recently proposed for the analysis of trial outcomes in the presence of imperfect blinding, based on the administration of a generic auxiliary questionnaire that allows the participants to express their belief concerning the assigned intervention (treatment or control). We show that these data, together with the primary measured variables, can be used to make the probabilistically optimal choice of the particular sub-group a participant should be removed from if trial size reduction is desired. Extensive experiments on a series of simulated trials are used to illustrate the effectiveness of our method.
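The paper's probabilistically optimal removal rule is not given in the abstract. As a much simplified stand-in, assuming participants are stratified jointly by assigned intervention and questionnaire-expressed belief, one crude heuristic is to remove from the largest stratum so the joint distribution stays balanced:

```python
from collections import Counter

def stratum_to_reduce(assignments, beliefs):
    """Choose the (assignment, belief) stratum to remove one participant from.

    Simplified stand-in for the paper's probabilistically optimal rule, which
    the abstract does not specify: remove from the largest stratum so the
    joint distribution of assigned intervention and expressed belief stays as
    balanced as possible. Ties are broken by the smallest sorted stratum label.
    """
    counts = Counter(zip(assignments, beliefs))
    return max(sorted(counts), key=counts.get)
```

The actual method additionally conditions on the primary measured variables; this sketch only illustrates the belief-based stratification idea.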

Relevance: 20.00%

Abstract:

As a response to calls for making construction activities environmentally conscious, alternatives to mechanical demolition, such as deconstruction, recycling and reuse, have emerged for returning building materials and components to the supply chain. However, deconstruction has remained unexploited within the construction industry owing to barriers and challenges that make demolition contractors shy away from implementing deconstruction in projects. Assessment of these barriers and challenges revealed that deconstruction, like all construction activities, is fraught with various health and safety hazards. This study attempts to identify the role of health and safety risks in impeding the widespread implementation of deconstruction practices in construction projects. Major health and safety risks associated with deconstruction activities are then identified. Findings of the present study are based on results acquired through unstructured interviews with six demolition contractors in South Australia. The study contributes to the body of knowledge by further establishing the deconstruction field and providing a basis for future investigations into barriers to deconstruction. Further, the discussion offers professional implications in the form of guidelines for managing deconstruction projects in a safer and more efficient manner.

Relevance: 20.00%

Abstract:

BACKGROUND: Critically ill patients require regular body position changes to minimize the adverse effects of bed rest, inactivity and immobilization. However, uncertainty surrounds the effectiveness of lateral positioning for improving pulmonary gas exchange, aiding drainage of tracheobronchial secretions and preventing morbidity. In addition, it is unclear whether the perceived risk posed by respiratory and haemodynamic instability upon turning critically ill patients outweighs the respiratory benefits of side-to-side rotation. Thus, lack of certainty may contribute to variation in positioning practice and equivocal patient outcomes. OBJECTIVES: To evaluate effects of the lateral position compared with other body positions on patient outcomes (mortality, morbidity and clinical adverse events) in critically ill adult patients. (Clinical adverse events include hypoxaemia, hypotension, low oxygen delivery and global indicators of impaired tissue oxygenation.) We examined single use of the lateral position (i.e. on the right or left side) and repeat use of the lateral position (i.e. lateral positioning) within a positioning schedule. SEARCH METHODS: We searched the Cochrane Central Register of Controlled Trials (CENTRAL; 2015, Issue 5), MEDLINE (1950 to 23 May 2015), the Cumulative Index to Nursing and Allied Health Literature (CINAHL) (1937 to 23 May 2015), the Allied and Complementary Medicine Database (AMED) (1984 to 23 May 2015), Latin American Caribbean Health Sciences Literature (LILACS) (1901 to 23 May 2015), Web of Science (1945 to 23 May 2015), Index to Theses in Great Britain and Ireland (1950 to 23 May 2015), Trove (2009 to 23 May 2015; previously Australasian Digital Theses Program (1997 to December 2008)) and ProQuest Dissertations and Theses (2009 to 23 May 2015; previously ProQuest Digital Dissertations (1980 to 23 May 2015)). We handsearched the reference lists of potentially relevant reports and two nursing journals.
SELECTION CRITERIA: We included randomized and quasi-randomized trials examining effects of lateral positioning in critically ill adults. We included manual or automated turns but limited eligibility to studies that included duration of body position of 10 minutes or longer. We examined each lateral position versus at least one comparator (opposite lateral position and/or another body position) for single therapy effects, and the lateral positioning schedule (repeated lateral turning) versus other positioning schedules for repetitive therapy effects. DATA COLLECTION AND ANALYSIS: We pre-specified methods to be used for data collection, risk of bias assessment and analysis. Two independent review authors carried out each stage of selection and data extraction and settled differences in opinion by consensus, or by third-party adjudication when disagreements remained unresolved. We planned analysis of pair-wise comparisons under composite time intervals with the aim of considering recommendations based on meta-analyses of studies with low risk of bias. MAIN RESULTS: We included 24 studies of critically ill adults. No study reported mortality as an outcome of interest. Two randomized controlled trials (RCTs) examined lateral positioning for pulmonary morbidity outcomes but provided insufficient information for meta-analysis. A total of 22 randomized trials examined effects of lateral positioning (four parallel-group and 18 cross-over designs) by measuring various continuous data outcomes commonly used to detect adverse cardiopulmonary events within critical care areas. However, parallel-group studies were not comparable, and cross-over studies provided limited data as a result of unit-of-analysis errors. Eight studies provided some data; most of these were single studies with small effects that were imprecise.
We pooled partial pressure of arterial oxygen (PaO2) as a measure to detect hypoxaemia from two small studies of participants with unilateral lung disease (n = 19). The mean difference (MD) between lateral positions (bad lung down versus good lung down) was approximately 50 mmHg (MD -49.26 mmHg, 95% confidence interval (CI) -67.33 to -31.18; P value < 0.00001). Despite a lower mean PaO2 for bad lung down, hypoxaemia (mean PaO2 < 60 mmHg) was not consistently reported. Furthermore, pooled data had methodological shortcomings with unclear risk of bias. We had similar doubts regarding internal validity for other studies included in the review. AUTHORS' CONCLUSIONS: Review authors could provide no clinical practice recommendations based on the findings of included studies. Available research could not eliminate the uncertainty surrounding benefits and/or risks associated with lateral positioning of critically ill adult patients. Research gaps include the effectiveness of lateral positioning compared with semi-recumbent positioning for mechanically ventilated patients, lateral positioning compared with prone positioning for acute respiratory distress syndrome (ARDS) and less frequent changes in body position. We recommend that future research be undertaken to address whether the routine practice of repositioning patients on their side benefits all, some or few critically ill patients.
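The pooled mean difference quoted above is the output of a standard fixed-effect inverse-variance meta-analysis. A minimal sketch of that calculation, using illustrative study inputs rather than the review's actual data:

```python
import math

def pool_mean_difference(mds, ses):
    """Fixed-effect inverse-variance pooling of study mean differences.

    mds: per-study mean differences; ses: their standard errors.
    Returns (pooled MD, lower 95% CI bound, upper 95% CI bound).
    """
    # Each study is weighted by the inverse of its variance, so more
    # precise studies contribute more to the pooled estimate.
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * md for w, md in zip(weights, mds)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
```

With only two small studies, as here, the pooled CI remains wide, which is consistent with the review's inability to make practice recommendations.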

Relevance: 20.00%

Abstract:

Professor Karen Starr, Foundation Chair, School Development and Leadership at Deakin University, recently spoke with Principals about risk management in schools. She argues for more open dialogue in schools to expose the full range of risks they face.

Relevance: 20.00%

Abstract:

Understanding the relationship between community composition and ecosystem function is essential for managing forests with complex disturbance regimes. Studies of animal responses to fire and timber harvesting in forest ecosystems typically focus on a single level of community diversity. Measures of species abundance and diversity at the community level, along with measures of functional diversity that incorporate information on species traits, provide opportunities for complementary insights into biodiversity responses to disturbances. We quantified community and functional responses of a temperate forest lizard community to fire and rotational logging using metrics including species-specific abundance, community abundance, species richness and evenness, as well as trait-based measures of functional diversity. We used non-linear regression models to examine the relationships between reptile data and time since fire and timber harvesting, using sites arrayed along a 30-year post-disturbance chronosequence. We modelled responses separately in two major vegetation types: coastal Banksia woodland and lowland eucalypt forests. Species and community measures offered different insights into the role of fire and logging. Species responses to disturbance differed between disturbance type and vegetation type. Four species exhibited significant population responses to either fire or timber harvesting, while the rest were unaffected by either disturbance. At the community level, species richness and community abundance increased significantly with time since fire in woodland vegetation. In forest vegetation, community abundance decreased with time since fire. Surprisingly, community evenness and functional diversity did not show marked responses to fire or timber harvesting. This is likely a result of trait homogeneity and the asynchrony in species responses to disturbance.
We advocate using multiple measures of community composition - incorporating species-specific information, community metrics and functional traits - to ensure a more holistic understanding of disturbance ecology in forest landscapes.
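The community-level metrics named above (species richness, community abundance, evenness) are standard. A minimal sketch of computing them for one site, assuming Pielou's evenness (the Shannon index scaled by its maximum); the abstract does not specify which evenness index the authors actually used:

```python
import math

def community_metrics(abundances):
    """Species richness, community abundance and Pielou's evenness for a site.

    abundances: individual counts per species. Evenness is the Shannon index
    divided by its maximum, log(richness); returned as 0.0 when fewer than
    two species are present, where it is undefined.
    """
    present = [n for n in abundances if n > 0]
    richness = len(present)
    total = sum(present)
    if richness < 2:
        return richness, total, 0.0
    # Shannon diversity over relative abundances of the species present.
    shannon = -sum((n / total) * math.log(n / total) for n in present)
    return richness, total, shannon / math.log(richness)
```

A perfectly even community scores 1.0 on this index, while a community dominated by one species scores near 0, which is why evenness can stay flat even when richness and abundance respond strongly to disturbance.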