83 results for Bullard, Joanna
Abstract:
Background Procedural sedation and analgesia (PSA) is used to attenuate the pain and distress that may otherwise be experienced during diagnostic and interventional medical or dental procedures. As the risk of adverse events increases with the depth of sedation induced, frequent monitoring of level of consciousness is recommended. Level of consciousness is usually monitored during PSA with clinical observation. Processed electroencephalogram-based depth of anaesthesia (DoA) monitoring devices provide an alternative method to monitor level of consciousness that can be used in addition to clinical observation. However, there is uncertainty as to whether their routine use in PSA would be justified. Rigorous evaluation of the clinical benefits of DoA monitors during PSA, including comprehensive syntheses of the available evidence, is therefore required. One potential clinical benefit of using DoA monitoring during PSA is that the technology could improve patient safety by reducing sedation-related adverse events, such as death or permanent neurological disability. We hypothesise that earlier identification of lapses into deeper than intended levels of sedation using DoA monitoring leads to more effective titration of sedative and analgesic medications, and results in a reduction in the risk of adverse events caused by the consequences of over-sedation, such as hypoxaemia. The primary objective of this review is to determine whether using DoA monitoring during PSA in the hospital setting improves patient safety by reducing the risk of hypoxaemia (defined as an arterial partial pressure of oxygen below 60 mmHg or percentage of haemoglobin saturated with oxygen [SpO2] less than 90%). Other potential clinical benefits of using DoA monitoring devices during sedation will be assessed as secondary outcomes. Methods/design Electronic databases will be systematically searched for randomised controlled trials comparing the use of depth of anaesthesia monitoring devices with clinical observation of level of consciousness during PSA. No language restrictions will be imposed. Screening, study selection and data extraction will be performed by two independent reviewers. Disagreements will be resolved by discussion. Meta-analyses will be performed if suitable. Discussion This review will synthesise the evidence on an important potential clinical benefit of DoA monitoring during PSA within hospital settings.
Abstract:
Objective To identify the prevalence of and risk factors for inadvertent hypothermia after procedures performed with procedural sedation and analgesia in a cardiac catheterisation laboratory. Design Single-centre, prospective observational study. Setting Tertiary care private hospital in Australia. Participants A convenience sample of 399 patients undergoing elective procedures with procedural sedation and analgesia was included. Propofol infusions were used when an anaesthetist was present; otherwise, bolus doses of midazolam, fentanyl or a combination of these medications were used. Interventions None. Measurements and main results Hypothermia was defined as a temperature <36.0°C. Multivariate logistic regression was used to identify risk factors. Hypothermia was present after 23.3% (n=93; 95% confidence interval [CI] 19.2%-27.4%) of 399 procedures. The sedative regimens with the highest prevalence of hypothermia were any regimen that included propofol (n=35; 40.2%; 95% CI 29.9%-50.5%) and fentanyl combined with midazolam (n=23; 20.3%; 95% CI 12.9%-27.7%). The difference in mean temperature from pre- to post-procedure was -0.27°C (standard deviation [SD] 0.45). Receiving propofol (odds ratio [OR] 4.6; 95% CI 2.5-8.6), percutaneous coronary intervention (OR 3.2; 95% CI 1.7-5.9), body mass index <25 (OR 2.5; 95% CI 1.4-4.4) and being hypothermic prior to the procedure (OR 4.9; 95% CI 2.3-10.8) were independent predictors of post-procedural hypothermia. Conclusions A moderate prevalence of hypothermia was observed. The small absolute change in temperature observed may not be clinically important. More research is needed to increase confidence in our estimates of hypothermia in sedated patients and its impact on clinical outcomes.
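To illustrate the kind of analysis behind the reported odds ratios, the sketch below fits a multivariable logistic regression in Python with statsmodels and exponentiates the coefficients to obtain ORs with 95% confidence intervals. It is a minimal sketch on simulated data: the predictor names mirror the abstract, but the values are invented for illustration and are not the study's dataset.

# Minimal illustrative sketch (simulated data, not study data): fit a
# multivariable logistic regression for post-procedural hypothermia and
# convert coefficients to odds ratios (OR = exp(beta)) with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 400  # roughly the study's sample size

# Hypothetical binary predictors, simulated for illustration only.
X = pd.DataFrame({
    "propofol":        rng.integers(0, 2, n),
    "pci":             rng.integers(0, 2, n),
    "bmi_lt25":        rng.integers(0, 2, n),
    "pre_hypothermia": rng.integers(0, 2, n),
})
# Simulated outcome: hypothermia risk rises with each predictor.
logit_p = (-2.0 + 1.5 * X["propofol"] + 1.2 * X["pci"]
           + 0.9 * X["bmi_lt25"] + 1.6 * X["pre_hypothermia"])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=False)

# Exponentiate coefficients and confidence limits to get ORs with 95% CIs.
ci = np.exp(fit.conf_int())
print(pd.DataFrame({"OR": np.exp(fit.params), "2.5%": ci[0], "97.5%": ci[1]}))

An OR above 1 here, as in the study, means the factor is associated with higher odds of post-procedural hypothermia after adjusting for the other predictors.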
Abstract:
Review questions/objective What is the effectiveness of meaningful occupation interventions for people living with dementia in residential aged care facilities? More specifically, the objective is to identify the effectiveness of interventions that engage residents of residential aged care facilities who have dementia in meaningful occupation (activities that have meaning for the individual) on: quality of life, behavioural and psychological symptoms of dementia (such as agitation, aggression, depression, wandering and apathy), mood, function, cognition and sleep. Inclusion criteria Types of participants This review will consider studies that include participants with a confirmed diagnosis of any type of dementia living in residential aged care facilities/long-term care/nursing homes/permanent care. Types of intervention(s)/phenomena of interest This review will consider studies that evaluate non-pharmacological interventions based on occupations or activities considered meaningful to the person with dementia; tailoring of the intervention to meet the person's needs, abilities, interests and/or preferences must be part of the study's methodology. Such interventions may include reminiscence therapy, exercise therapy, music therapy, individualised activity, behavioural interventions, recreational therapy, diversional therapy and psychosocial interventions. Trials of combinations of two or more such interventions will also be considered. Interventions may be compared with usual care, other meaningful occupation interventions, or any other non-pharmacological control or comparator.
Abstract:
A lack of access to primary care services, decreasing numbers of general practitioners (GPs) and free-of-charge visits have been cited as factors contributing to the rising demand on emergency departments. This study aims to investigate the sources of patients' referrals to emergency departments and to track changes in the source of referral over a six-year period in Queensland. Data from the Queensland Emergency Departments Information Systems were analysed based on records from 21 hospitals for the period 2003–04 to 2008–09. The emergency department data were compared with publicly available data on GP services and patient attendance rates. In Queensland, the majority of patients are self-referred, and this proportion grew by 6.6% between 2003–04 and 2008–09 (from 84.4% to 90%). The numbers of referrals made by GPs, hospitals and community services decreased by 29.4%, 40% and 42%, respectively, over the six-year period. The number of full-time workload equivalent GPs per 100,000 people increased by 4.5%, and the number of GP attendances per capita rose by 4% (from 4.25 to 4.42). An examination of changes in the triage category of self-referred patients revealed increases in triage categories 1, 2 and 3 of 60%, 36.2% and 14.4%, respectively. The numbers of self-referred patients in triage categories 4 and 5 decreased by 10.5% and 21.9%, respectively. The results of this analysis reveal that although the number of services provided by GPs increased, the number of referrals decreased and the proportion of self-referred patients presenting to emergency departments rose during the six-year period. In addition, growth in the urgent triage categories (1–3) has been observed, with a decline in the number of non-urgent categories (4–5) among patients who came directly to emergency departments. Understanding the reasons behind this situation is crucial for appropriate demand management. Possible explanations will be sought and presented based on patients' responses to an emergency department users' questionnaire.
Abstract:
Background An important potential clinical benefit of using capnography monitoring during procedural sedation and analgesia (PSA) is that this technology could improve patient safety by reducing serious sedation-related adverse events, such as death or permanent neurological disability, that are caused by inadequate oxygenation. The hypothesis is that earlier identification of respiratory depression using capnography leads to a change in clinical management that prevents hypoxaemia. As inadequate oxygenation/ventilation is the most common cause of injury associated with PSA, reducing episodes of hypoxaemia would indicate that using capnography is safer than relying on standard monitoring alone. Methods/design The primary objective of this review is to determine whether using capnography during PSA in the hospital setting improves patient safety by reducing the risk of hypoxaemia (defined as an arterial partial pressure of oxygen below 60 mmHg or percentage of haemoglobin saturated with oxygen [SpO2] less than 90%). A secondary objective of this review is to determine whether changes in the clinical management of sedated patients are the mediating factor for any observed impact of capnography monitoring on the rate of hypoxaemia. The potential adverse effect of capnography monitoring that will be examined in this review is the rate of inadequate sedation. Electronic databases will be searched for parallel, crossover and cluster randomised controlled trials comparing the use of capnography with standard monitoring alone during PSA administered in the hospital setting. Studies that included patients who received general or regional anaesthesia will be excluded from the review, as will non-randomised studies. Screening, study selection and data extraction will be performed by two reviewers. The Cochrane risk of bias tool will be used to assign a judgement about the degree of risk of bias. Meta-analyses will be performed if suitable. Discussion This review will synthesise the evidence on an important potential clinical benefit of capnography monitoring during PSA within hospital settings. Systematic review registration: PROSPERO CRD42015023740
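Both of the review protocols above state that meta-analyses will be performed if suitable. As a concrete illustration of the underlying mechanics, the sketch below pools log risk ratios from a few trials using generic inverse-variance (fixed-effect) weighting in Python. The trial counts are hypothetical, and nothing here prescribes the synthesis method the reviews will actually use.

# Generic inverse-variance (fixed-effect) pooling of log risk ratios.
# The trial counts below are hypothetical, purely for illustration.
import math

# (events_intervention, n_intervention, events_control, n_control) per trial
trials = [(8, 150, 15, 148), (5, 90, 11, 95), (12, 210, 20, 205)]

weights, weighted_logs = [], []
for a, n1, c, n2 in trials:
    log_rr = math.log((a / n1) / (c / n2))
    var = 1/a - 1/n1 + 1/c - 1/n2   # variance of the log risk ratio
    weights.append(1 / var)
    weighted_logs.append(log_rr / var)

pooled = sum(weighted_logs) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"Pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f}"
      f"-{math.exp(pooled + 1.96 * se):.2f})")

A pooled risk ratio below 1 would favour capnography (fewer hypoxaemia events than with standard monitoring alone), assuming the trials are similar enough to combine.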
Abstract:
Australia is a leading user of collaborative procurement methods, which are used to deliver large and complex infrastructure projects. Project alliances, Early Contractor Involvement (ECI) and partnering are typical examples of collaborative procurement models. To increase procurement effectiveness and value for money (VfM), clients have adopted various learning strategies for new contract development. However, client learning strategies and behaviours have not been systematically analysed before. This paper therefore undertakes a literature review addressing the research question: “How can client learning capabilities be effectively understood?” Drawing on resource-based and dynamic-capability perspectives, the paper proposes that the collaborative learning capability (CLC) of clients drives procurement model evolution. The learning routines underpinning CLC carry out the exploratory, transformative and exploitative learning phases associated with collaborative project delivery. This learning improves operating routines and, ultimately, performance. The conceptualisation of CLC and the three sequential learning phases is used to analyse the evidence in the construction management literature. The main contribution of this study is a theoretical foundation for future empirical studies to unveil effective learning strategies that help clients improve the performance of collaborative projects in the dynamic infrastructure market.
Abstract:
In line with Malaysia's current emerging development pattern, the Malaysian government has enthusiastically promoted a green procurement approach to help make construction projects green. Previous studies have highlighted that the concept of green procurement is still very new to the Malaysian construction industry, which increases the need for further research in this area. This paper addresses the need for guidelines that help stakeholders procure environmentally friendly construction; currently, there are few practical guidelines for procuring green projects. The paper discusses the progress to date of a research project aimed at developing a green procurement framework for construction projects in the Malaysian construction industry. The framework will guide stakeholders in planning the implementation of green procurement for construction projects. Through the literature and expert opinion, the paper identifies green practices within procurement, which form the basis of a survey instrument to be used in a later part of this study. The paper provides useful information for construction researchers and practitioners exploring the green procurement concept for the construction industry in Malaysia.
Abstract:
Objective This study aims to identify the main reasons for which first-time and multiple users seek medical care through Queensland emergency departments (EDs). Methods A cross-sectional survey was conducted at eight public EDs among presenting patients (n = 911). The questions measured the socio-demographic characteristics of patients, their beliefs and attitudes towards ED services, and their perceptions of their health status. Bivariate and binary logistic regression analyses were performed to examine the differences between first-time and multiple users of EDs. Results First-time and multiple users accounted for 55.5% and 44.5% of patients, respectively. Multiple users believed themselves to be sicker, reported poorer health status, and had additional and/or chronic health conditions. Multiple users more strongly believed that their condition required treatment at an ED and perceived their condition as being very serious. Multiple users more often reported weekly household incomes below $600, and half of the multiple users were not working, compared with 35% of first-time users. Multivariate analysis showed that multiple use was significantly associated with the existence of additional health problems, having a chronic condition, lower self-efficacy and a perceived need for ED treatment. Conclusions Patients who sought care multiple times at EDs suffered from additional and chronic conditions more often than first-time users. Their opinion of the ED as the most suitable place to address their current health problem was stronger than that of first-time users. Any proposed demand management strategies need to address these beliefs, together with patients' reasoning, to provide effective and appropriate care within or outside ED services.
Abstract:
Cost estimating has been acknowledged as a crucial component of construction projects. Depending on the available information and project requirements, cost estimates evolve in tandem with the project lifecycle stages: conceptualisation, design development, execution and facility management. The accuracy of cost estimates is crucial to producing project tenders and, eventually, to budget management. Notwithstanding the initial slow pace of its adoption, Building Information Modelling (BIM) has successfully addressed a number of challenges previously characteristic of traditional approaches in the AEC sector, including poor communication, the prevalence of islands of information and frequent rework. It is therefore conceivable that BIM can be leveraged to address specific shortcomings of cost estimation. The impetus for leveraging BIM models for accurate cost estimation is to align budgeted and actual costs. This paper hypothesises that the accuracy of BIM-based estimation, as a more efficient process-mirror of traditional cost estimation methods, can be enhanced by modelling the factors that influence traditional cost estimation. Through literature reviews and preliminary expert interviews, this paper explores the factors that could potentially lead to more accurate cost estimates for construction projects. The findings show numerous factors that affect cost estimates, ranging from project information and its characteristics to the project team, clients, contractual matters and other external influences. This paper makes a particular contribution to the early phase of BIM-based project estimation.
Abstract:
Background Internationally, a considerable body of research exists examining why nurses do not use evidence in practice. Consistently, this research finds that lack of knowledge about research and discomfort with research terminology are among the chief reasons given. Research education is commonly included in undergraduate nursing degree programs, but this does not seem to translate into a strong understanding of research following graduation, or an ability to use it in practice. Aim The objective of this review was to identify the effectiveness of workplace, tertiary-level educational, or other interventions designed to improve or increase post-registration nurses' understanding of research literature and their ability to critically interact with it, with the aim of promoting the use of research evidence in practice, in comparison with no intervention, another intervention, or usual practice. Methods A wide range of databases were searched for quantitative studies of registered nurses receiving educational interventions designed to increase or improve their understanding of research literature in tertiary or workplace settings. Two reviewers working independently critically appraised the relevant papers and extracted the data using Joanna Briggs Institute instruments. Data are presented as a narrative summary, as no meta-analysis was possible. Results Searching identified 4,545 potentially relevant papers; after sifting of titles and abstracts, 96 papers were selected for retrieval. On examination of the full-text versions, 10 of the 96 retrieved papers were found to meet the inclusion criteria. Included studies were of low to moderate quality. Interactive or activity-based learning appears to be effective in improving research knowledge, critical appraisal ability and research self-efficacy. Basing a program on an appropriate theory also appears to be associated with greater effectiveness, particularly for workplace interventions. Linking Evidence to Action The included studies strongly favored interactive interventions and those utilizing theory in their construction. Therefore, these types of interventions should be implemented to improve the effectiveness of research education for nurses as well as their research literacy.
Abstract:
IMPORTANCE Patients with chest pain represent a high health care burden, but it may be possible to identify a patient group with a low short-term risk of adverse cardiac events who are suitable for early discharge. OBJECTIVE To compare the effectiveness of a rapid diagnostic pathway with a standard-care diagnostic pathway for the assessment of patients with possible cardiac chest pain in a usual clinical practice setting. DESIGN, SETTING, AND PARTICIPANTS A single-center, randomized parallel-group trial with blinded outcome assessments was conducted in an academic general and tertiary hospital. Participants included adults with acute chest pain consistent with acute coronary syndrome for whom the attending physician planned further observation and troponin testing. Patient recruitment occurred from October 11, 2010, to July 4, 2012, with a 30-day follow-up. INTERVENTIONS An experimental pathway using an accelerated diagnostic protocol (Thrombolysis in Myocardial Infarction score, 0; electrocardiography; and 0- and 2-hour troponin tests) or a standard-care pathway (troponin test on arrival at hospital, prolonged observation, and a second troponin test 6-12 hours after onset of pain) serving as the control. MAIN OUTCOMES AND MEASURES Discharge from the hospital within 6 hours without a major adverse cardiac event occurring within 30 days. RESULTS Fifty-two of 270 patients in the experimental group were successfully discharged within 6 hours compared with 30 of 272 patients in the control group (19.3% vs 11.0%; odds ratio, 1.92; 95% CI, 1.18-3.13; P = .008). It required 20 hours to discharge the same proportion of patients from the control group as achieved in the experimental group within 6 hours. In the experimental group, 35 additional patients (12.9%) were classified as low risk but admitted to an inpatient ward for cardiac investigation. None of the 35 patients received a diagnosis of acute coronary syndrome after inpatient evaluation. CONCLUSIONS AND RELEVANCE Using the accelerated diagnostic protocol in the experimental pathway almost doubled the proportion of patients with chest pain discharged early. Clinicians could discharge approximately 1 of 5 patients with chest pain to outpatient follow-up monitoring in less than 6 hours. This diagnostic strategy could be easily replicated in other centers because no extra resources are required.
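The headline odds ratio can be reconstructed directly from the counts in the abstract (52 of 270 experimental patients vs 30 of 272 control patients discharged within 6 hours). Below is a minimal Python check; the Wald interval on the log-odds-ratio scale is a standard choice, and assuming the trial used exactly this method is our own simplification.

# Reconstruct the reported odds ratio from the abstract's counts.
# The 95% CI uses the standard Wald interval on the log(OR) scale.
import math

a, b = 52, 270 - 52   # experimental: discharged early / not
c, d = 30, 272 - 30   # control:      discharged early / not

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
# Prints: OR = 1.92, 95% CI 1.18-3.13, matching the published figures.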
Abstract:
The major histocompatibility complex (MHC) on chromosome 6 is associated with susceptibility to more common diseases than any other region of the human genome, including almost all disorders classified as autoimmune. In type 1 diabetes the major genetic susceptibility determinants have been mapped to the MHC class II genes HLA-DQB1 and HLA-DRB1 (refs 1–3), but these genes cannot completely explain the association between type 1 diabetes and the MHC region (refs 4–11). Owing to the region's extreme gene density, the multiplicity of disease-associated alleles, strong associations between alleles, limited genotyping capability, and inadequate statistical approaches and sample sizes, it remains unclear which, and how many, loci within the MHC determine susceptibility. Here, in several large type 1 diabetes data sets, we analyse a combined total of 1,729 polymorphisms, and apply statistical methods—recursive partitioning and regression...
Abstract:
Background The Global Burden of Disease Study 2013 (GBD 2013) aims to bring together all available epidemiological data using a coherent measurement framework, standardised estimation methods, and transparent data sources to enable comparisons of health loss over time and across causes, age–sex groups, and countries. The GBD can be used to generate summary measures such as disability-adjusted life-years (DALYs) and healthy life expectancy (HALE) that make possible comparative assessments of broad epidemiological patterns across countries and time. These summary measures can also be used to quantify the component of variation in epidemiology that is related to sociodemographic development. Methods We used the published GBD 2013 data for age-specific mortality, years of life lost due to premature mortality (YLLs), and years lived with disability (YLDs) to calculate DALYs and HALE for 1990, 1995, 2000, 2005, 2010, and 2013 for 188 countries. We calculated HALE using the Sullivan method; 95% uncertainty intervals (UIs) represent uncertainty in age-specific death rates and YLDs per person for each country, age, sex, and year. We estimated DALYs for 306 causes for each country as the sum of YLLs and YLDs; 95% UIs represent uncertainty in YLL and YLD rates. We quantified patterns of the epidemiological transition with a composite indicator of sociodemographic status, which we constructed from income per person, average years of schooling after age 15 years, and the total fertility rate and mean age of the population. We applied hierarchical regression to DALY rates by cause across countries to decompose variance related to the sociodemographic status variable, country, and time. Findings Worldwide, from 1990 to 2013, life expectancy at birth rose by 6·2 years (95% UI 5·6–6·6), from 65·3 years (65·0–65·6) in 1990 to 71·5 years (71·0–71·9) in 2013, HALE at birth rose by 5·4 years (4·9–5·8), from 56·9 years (54·5–59·1) to 62·3 years (59·7–64·8), total DALYs fell by 3·6% (0·3–7·4), and age-standardised DALY rates per 100 000 people fell by 26·7% (24·6–29·1). For communicable, maternal, neonatal, and nutritional disorders, global DALY numbers, crude rates, and age-standardised rates have all declined between 1990 and 2013, whereas for non-communicable diseases, global DALYs have been increasing, DALY rates have remained nearly constant, and age-standardised DALY rates declined during the same period. From 2005 to 2013, the number of DALYs increased for most specific non-communicable diseases, including cardiovascular diseases and neoplasms, in addition to dengue, food-borne trematodes, and leishmaniasis; DALYs decreased for nearly all other causes. By 2013, the five leading causes of DALYs were ischaemic heart disease, lower respiratory infections, cerebrovascular disease, low back and neck pain, and road injuries. Sociodemographic status explained more than 50% of the variance between countries and over time for diarrhoea, lower respiratory infections, and other common infectious diseases; maternal disorders; neonatal disorders; nutritional deficiencies; other communicable, maternal, neonatal, and nutritional diseases; musculoskeletal disorders; and other non-communicable diseases. However, sociodemographic status explained less than 10% of the variance in DALY rates for cardiovascular diseases; chronic respiratory diseases; cirrhosis; diabetes, urogenital, blood, and endocrine diseases; unintentional injuries; and self-harm and interpersonal violence.
Predictably, increased sociodemographic status was associated with a shift in burden from YLLs to YLDs, driven by declines in YLLs and increases in YLDs from musculoskeletal disorders, neurological disorders, and mental and substance use disorders. In most country-specific estimates, the increase in life expectancy was greater than that in HALE. Leading causes of DALYs are highly variable across countries. Interpretation Global health is improving. Population growth and ageing have driven up numbers of DALYs, but crude rates have remained relatively constant, showing that progress in health does not mean fewer demands on health systems. The notion of an epidemiological transition—in which increasing sociodemographic status brings structured change in disease burden—is useful, but there is tremendous variation in burden of disease that is not associated with sociodemographic status. This further underscores the need for country-specific assessments of DALYs and HALE to appropriately inform health policy decisions and attendant actions.
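Two of the methods sentences above are easy to make concrete: DALYs for a cause are simply the sum of its YLLs and YLDs, and the Sullivan method obtains HALE by weighting life-table person-years at each age by one minus the average YLDs per person. The Python sketch below works through both on a toy abridged life table; every number in it is invented for illustration and is not a GBD estimate.

# Toy illustration (invented numbers, not GBD estimates):
# DALYs = YLLs + YLDs, and HALE at birth via the Sullivan method.
import numpy as np

# DALYs for one cause are the sum of its YLLs and YLDs.
ylls, ylds = 1200.0, 300.0
dalys = ylls + ylds
print(f"DALYs = YLLs + YLDs = {dalys:.0f}")

# Abridged life table: interval widths, death rates m(x), and average
# YLDs per person-year lived in each interval.
width = np.array([5, 10, 15, 15, 15, 15, 25])
mx = np.array([0.004, 0.0005, 0.001, 0.002, 0.006, 0.02, 0.08])
yld_per_person = np.array([0.03, 0.04, 0.06, 0.08, 0.11, 0.15, 0.22])

# Survivorship l(x), assuming a constant death rate within each interval.
qx = 1.0 - np.exp(-mx * width)                      # P(die in interval)
lx = np.concatenate([[1.0], np.cumprod(1.0 - qx)])[:-1]
Lx = (lx - lx * qx / 2.0) * width                   # person-years lived

life_expectancy = Lx.sum() / lx[0]
hale = (Lx * (1.0 - yld_per_person)).sum() / lx[0]  # Sullivan weighting
print(f"Life expectancy at birth ~ {life_expectancy:.1f} years")
print(f"HALE at birth            ~ {hale:.1f} years")

Because the disability weighting only ever removes person-years, HALE always sits below life expectancy, which is why gains in life expectancy can outpace gains in HALE, as the abstract reports.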
Abstract:
Background The Global Burden of Disease, Injuries, and Risk Factor study 2013 (GBD 2013) is the first of a series of annual updates of the GBD. Risk factor quantification, particularly of modifiable risk factors, can help to identify emerging threats to population health and opportunities for prevention. The GBD 2013 provides a timely opportunity to update the comparative risk assessment with new data for exposure, relative risks, and evidence on the appropriate counterfactual risk distribution. Methods Attributable deaths, years of life lost, years lived with disability, and disability-adjusted life-years (DALYs) have been estimated for 79 risks or clusters of risks using the GBD 2010 methods. Risk–outcome pairs meeting explicit evidence criteria were assessed for 188 countries for the period 1990–2013 by age and sex using three inputs: risk exposure, relative risks, and the theoretical minimum risk exposure level (TMREL). Risks are organised into a hierarchy with blocks of behavioural, environmental and occupational, and metabolic risks at the first level of the hierarchy. The next level in the hierarchy includes nine clusters of related risks and two individual risks, with more detail provided at levels 3 and 4 of the hierarchy. Compared with GBD 2010, six new risk factors have been added: handwashing practices, occupational exposure to trichloroethylene, childhood wasting, childhood stunting, unsafe sex, and low glomerular filtration rate. For most risks, data for exposure were synthesised with a Bayesian meta-regression method, DisMod-MR 2.0, or spatial-temporal Gaussian process regression. Relative risks were based on meta-regressions of published cohort and intervention studies. Attributable burden for clusters of risks and all risks combined took into account evidence on the mediation of some risks such as high body-mass index (BMI) through other risks such as high systolic blood pressure and high cholesterol. Findings All risks combined account for 57·2% (95% uncertainty interval [UI] 55·8–58·5) of deaths and 41·6% (40·1–43·0) of DALYs. Risks quantified account for 87·9% (86·5–89·3) of cardiovascular disease DALYs, ranging to a low of 0% for neonatal disorders and neglected tropical diseases and malaria. In terms of global DALYs in 2013, six risks or clusters of risks each caused more than 5% of DALYs: dietary risks accounting for 11·3 million deaths and 241·4 million DALYs, high systolic blood pressure for 10·4 million deaths and 208·1 million DALYs, child and maternal malnutrition for 1·7 million deaths and 176·9 million DALYs, tobacco smoke for 6·1 million deaths and 143·5 million DALYs, air pollution for 5·5 million deaths and 141·5 million DALYs, and high BMI for 4·4 million deaths and 134·0 million DALYs. Risk factor patterns vary across regions and countries and with time. In sub-Saharan Africa, the leading risk factors are child and maternal malnutrition, unsafe sex, and unsafe water, sanitation, and handwashing. In women, in nearly all countries in the Americas, north Africa, and the Middle East, and in many other high-income countries, high BMI is the leading risk factor, with high systolic blood pressure as the leading risk in most of Central and Eastern Europe and south and east Asia. For men, high systolic blood pressure or tobacco use are the leading risks in nearly all high-income countries, in north Africa and the Middle East, Europe, and Asia. For men and women, unsafe sex is the leading risk in a corridor from Kenya to South Africa. 
Interpretation Behavioural, environmental and occupational, and metabolic risks can explain half of global mortality and more than one-third of global DALYs, providing many opportunities for prevention. Of the larger risks, the attributable burden of high BMI has increased in the past 23 years. In view of the prominence of behavioural risk factors, behavioural and social science research on interventions for these risks should be strengthened. Many prevention and primary care policy options are available now to act on key risks.
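At the core of the comparative risk assessment described above is the population attributable fraction (PAF), which contrasts the observed exposure distribution with the counterfactual TMREL distribution. In generic textbook LaTeX form (the exact GBD 2013 specification may differ in detail):

\[
\mathrm{PAF} \;=\; \frac{\displaystyle\int RR(x)\,P(x)\,dx \;-\; \int RR(x)\,P^{*}(x)\,dx}{\displaystyle\int RR(x)\,P(x)\,dx}
\]

where P(x) is the observed distribution of exposure x, P*(x) is the counterfactual (TMREL) distribution, and RR(x) is the relative risk at exposure level x. The attributable burden for a risk-outcome pair is then the PAF multiplied by that outcome's total deaths or DALYs.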
Abstract:
Peer-based interventions have the potential to enhance quality of life and functioning; however, their role specifically within the older population has not been fully investigated. The objective of this review is therefore to locate, appraise and synthesise evidence on the effectiveness of peer-based interventions on changes in health behaviours, specifically in the older population. The specific question to be answered is: “What is the effectiveness of peer-based interventions on health-promoting behaviors in older adults, when compared with non-peer-based interventions?”