716 results for Practice guidelines
Abstract:
Over the past decade, an exciting area of research has emerged that demonstrates strong links between specific nursing care activities and patient outcomes. This body of research has resulted in the identification of a set of "nursing-sensitive outcomes" (NSOs). These NSOs are more meaningfully interpreted when linked to evidence-based best practice guidelines, which provide a structured means of ensuring care is consistent among all health care team members, across geographic locations, and across care settings. Uptake of evidence-based best practices at the point of care has been shown to have a measurable positive impact on processes of care and patient outcomes. The purpose of this paper is to present a systematic, narrative review of the literature regarding the clinical effectiveness of nursing management strategies on stroke patient outcomes sensitive to nursing interventions. Subsequent investigation will explore current applications of nursing-sensitive outcomes to patients with stroke, and identify and validate measurable NSOs within stroke care delivery.
Abstract:
Nursing is fundamental to the care of stroke patients, from the acute setting through rehabilitation and community reintegration. Having well-educated and highly skilled nurses to monitor and care for stroke patients is crucial. Equally important is collaboration among colleagues at a national level to facilitate and disseminate research and best practice guidelines across Canada. The National Stroke Nursing Council aims to fill this role. Stroke nurses from across Canada were invited to a national forum in 2005, hosted by the Canadian Stroke Network. The focus of this forum was to elucidate issues of concern to nurses across the stroke care continuum in relation to a Canadian Stroke Strategy. Subsequent to this forum, a cadre of nurses was selected, after a rigorous screening process, to form the inaugural National Stroke Nursing Council (NSNC). With ongoing support from the Canadian Stroke Network, the mandate of the NSNC is to promote leadership, communication, advocacy, education and nursing research in the field of stroke.
Abstract:
Background Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly around the world. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence suggests 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased cost. Objectives To assess the clinical effects (benefits and harms) of intermittent flushing of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Search Methods The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials. Selection Criteria Randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long-term CVCs in infants and children aged up to 18 years were included. We excluded temporary CVCs and peripherally inserted central catheters (PICC). Data Collection and Analysis Two review authors independently assessed trial inclusion criteria, assessed trial quality and extracted data. Rate ratios were calculated for two outcome measures: occlusion of the CVC and central line-associated blood stream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis. Main Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, all used different protocols for the standard and experimental arms, with different concentrations of heparin and different frequencies of flushes reported. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated blood stream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants). Authors' Conclusions The review found that there was not enough evidence to determine the effects of intermittent flushing of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
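The review reports rate ratios per 1000 catheter days with 95% confidence intervals. As a rough illustration of how such an estimate is conventionally derived for a single study, a minimal Python sketch follows, assuming a simple Poisson model for event counts; the counts and catheter days below are invented for illustration and are not the trial data, and the review's actual pooling method may differ.

```python
import math

def rate_ratio(events_a, days_a, events_b, days_b, z=1.96):
    """Rate ratio between two groups followed for different numbers of
    catheter days, with an approximate 95% CI from the standard
    log-rate-ratio variance 1/events_a + 1/events_b."""
    rr = (events_a / days_a) / (events_b / days_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lower = math.exp(math.log(rr) - z * se_log)
    upper = math.exp(math.log(rr) + z * se_log)
    return rr, lower, upper

# Invented example: 3 occlusions over 4,000 catheter days (normal saline)
# versus 4 occlusions over 4,000 catheter days (heparin).
rr, lower, upper = rate_ratio(3, 4000, 4, 4000)
print(f"rate ratio {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")
```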
Abstract:
In dentistry, basic imaging techniques such as intraoral and panoramic radiography are in most cases the only imaging techniques required for the detection of pathology. Conventional intraoral radiographs provide images with sufficient information for most dental radiographic needs. Panoramic radiography produces a single image of both jaws, giving an excellent overview of oral hard tissues. Regardless of the technique, plain radiography has only a limited capability in the evaluation of three-dimensional (3D) relationships. Technological advances in radiological imaging have moved from two-dimensional (2D) projection radiography towards digital, 3D and interactive imaging applications. This has been achieved first by the use of conventional computed tomography (CT) and more recently by cone beam CT (CBCT). CBCT is a radiographic imaging method that allows accurate 3D imaging of hard tissues. CBCT has been used for dental and maxillofacial imaging for more than ten years and its availability and use are increasing continuously. However, at present, only best practice guidelines are available for its use, and the need for evidence-based guidelines on the use of CBCT in dentistry is widely recognized. We evaluated (i) retrospectively the use of CBCT in a dental practice, (ii) the accuracy and reproducibility of pre-implant linear measurements in CBCT and multislice CT (MSCT) in a cadaver study, (iii) prospectively the clinical reliability of CBCT as a preoperative imaging method for complicated impacted lower third molars, and (iv) the tissue and effective radiation doses and image quality of dental CBCT scanners in comparison with MSCT scanners in a phantom study. Using CBCT, subjective identification of anatomy and pathology relevant in dental practice can be readily achieved, but dental restorations may cause disturbing artefacts. CBCT examination offered additional radiographic information when compared with intraoral and panoramic radiographs. In terms of the accuracy and reliability of linear measurements in the posterior mandible, CBCT is comparable to MSCT. CBCT is a reliable means of determining the location of the inferior alveolar canal and its relationship to the roots of the lower third molar. CBCT scanners provided adequate image quality for dental and maxillofacial imaging while delivering considerably smaller effective doses to the patient than MSCT. The observed variations in patient dose and image quality emphasize the importance of optimizing the imaging parameters in both CBCT and MSCT.
Abstract:
Summer in the Persian Gulf region presents physiological challenges for Australian sheep in the live export supply chain, which arrive from the Australian winter. Many feedlots throughout the Gulf hold very high numbers of animals during June to August to cater for the increased demand for religious festivals. From an animal welfare perspective it is important to understand the feed and water trough allowances, and the amount of pen space, required to cope with exposure to these climatic conditions. This study addresses parameters pertinent to the wellbeing of animals arriving in the Persian Gulf all year round. Three experiments were conducted in a feedlot in the Persian Gulf between March 2010 and February 2012, totalling 44 replicate pens each with 60 or 100 sheep. The applied treatments covered animal densities, feed-bunk lengths and water trough lengths. Weights, carcass attributes and health status were the key recorded variables. Weight change results showed superior performance for animal densities of ≥1.2 m²/head during hot conditions (24-h average temperatures greater than 33 °C, or a diurnal range of around 29–37 °C). However, the space allowance can be decreased to 0.6 m²/head under milder conditions with no demonstrated detrimental effect. A feed-bunk length of ≥5 cm/head is needed, as 2 cm/head showed significantly poorer animal performance. When feeding at 90% ad libitum, 10 cm/head was optimal; however, under a maintenance feeding regime (1 kg/head/day), 5 cm/head was adequate. A minimum water trough allowance of 1 cm/head is required. However, this experiment was conducted during milder conditions, and larger water trough lengths may well be needed in hotter conditions. Carcass weights were determined mainly by weights at feedlot entry and subsequent weight gains, while dressing percentage was not significantly affected by any of the applied treatments. There was no demonstrated effect of any of the treatments on the number of animals that died or were classified as unwell. However, across all the treatments, these animals lost significantly more weight than the healthy animals, so the above recommendations, which are aimed at maintaining weight, should also be applicable for good animal health and welfare. Therefore, best practice guidelines for managing Australian sheep in Persian Gulf feedlots in the hottest months (June–August), which present the greatest environmental and physical challenge, are to allow a feed-bunk length of 5 cm/head on a maintenance-feeding program and 10 cm/head for 90% ad libitum feeding, with a space allowance of ≥1.2 m²/head. Water trough allocation should be at least 1 cm/head, with provision for more in summer, when water intake potentially doubles.
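To make the recommended allowances concrete, here is a small arithmetic sketch converting the abstract's per-head figures into whole-pen requirements. The pen size of 100 sheep matches one of the study's pen sizes, but the calculation itself is purely illustrative.

```python
# Illustrative arithmetic only: totals implied by the abstract's
# recommended summer allowances for a pen of 100 sheep.
HEAD_COUNT = 100
SPACE_M2_PER_HEAD = 1.2        # >= 1.2 m2/head in hot conditions
FEED_BUNK_CM_PER_HEAD = 10     # 10 cm/head at 90% ad libitum feeding
WATER_TROUGH_CM_PER_HEAD = 1   # minimum of 1 cm/head

print(f"pen area:     {HEAD_COUNT * SPACE_M2_PER_HEAD:.0f} m2")
print(f"feed bunk:    {HEAD_COUNT * FEED_BUNK_CM_PER_HEAD / 100:.1f} m")
print(f"water trough: {HEAD_COUNT * WATER_TROUGH_CM_PER_HEAD / 100:.1f} m")
```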
Abstract:
Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin, and evidence suggests normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. Data Sources The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. Review Methods Data were extracted and appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes: occlusion of the CVC and CVC-associated blood stream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, all used different protocols with various concentrations of heparin and frequencies of flushes. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51; two studies, 229 participants; very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37; two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. Further well-designed studies are required to answer this relatively simple but clinically important question. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
Abstract:
Background The Researching Effective Approaches to Cleaning in Hospitals (REACH) study will generate evidence about the effectiveness and cost-effectiveness of a novel cleaning initiative that aims to improve the environmental cleanliness of hospitals. The initiative is an environmental cleaning bundle, with five interdependent, evidence-based components (training, technique, product, audit and communication) implemented with environmental services staff to enhance hospital cleaning practices. Methods/Design The REACH study will use a stepped-wedge randomised controlled design to test the study intervention, an environmental cleaning bundle, in 11 Australian hospitals. All trial hospitals will receive the intervention and act as their own control, with analysis undertaken of the change within each hospital based on data collected in the control and intervention periods. Each site will be randomised to one of the 11 intervention timings, with staggered commencement dates in 2016 and an intervention period of between 20 and 50 weeks. All sites will complete the trial at the same time in 2017. The inclusion criteria allow for a purposive sample of both public and private hospitals that have higher-risk patient populations for healthcare-associated infections (HAIs). The primary outcome (objective one) is the monthly number of Staphylococcus aureus bacteraemias (SABs), Clostridium difficile infections (CDIs) and vancomycin-resistant enterococci (VRE) infections per 10,000 bed days. Secondary outcomes for objective one include the thoroughness of hospital cleaning assessed using fluorescent marker technology, the bioburden of frequent-touch surfaces post cleaning, and changes in staff knowledge and attitudes about environmental cleaning. A cost-effectiveness analysis will determine the second key outcome (objective two): the incremental cost-effectiveness ratio from implementation of the cleaning bundle. The study uses the integrated Promoting Action on Research Implementation in Health Services (iPARIHS) framework to support the tailored implementation of the environmental cleaning bundle in each hospital. Discussion Evidence from the REACH trial will contribute to future policy and practice guidelines about hospital environmental cleaning. It will be used by healthcare leaders and clinicians to inform decision-making and the implementation of best-practice infection prevention strategies to reduce HAIs in hospitals. Trial Registration Australia New Zealand Clinical Trial Registry ACTRN12615000325505
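Objective two of the trial is an incremental cost-effectiveness ratio (ICER). As a generic reminder of how an ICER is computed (the cost and effect figures below are invented placeholders; the abstract does not describe the trial's economic model), a minimal sketch:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of effect (e.g. per infection averted)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Invented figures: the bundle costs $150,000 more than usual cleaning
# and averts 12 more infections over the study period.
print(f"${icer(650_000, 500_000, 30, 18):,.0f} per infection averted")
```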
Abstract:
Background Diabetic foot complications are the leading cause of lower extremity amputation and diabetes-related hospitalisation in Australia. Studies demonstrate significant reductions in amputations and hospitalisation when health professionals implement best practice management. Whilst other nations have surveyed health professionals on specific diabetic foot management, to the best of the authors' knowledge this has not occurred in Australia. The primary aim of this study was to examine Australian podiatrists' diabetic foot management compared with best practice recommendations by the Australian National Health and Medical Research Council. Methods A 36-item Australian Diabetic Foot Management survey, employing seven-point Likert scales (1 = Never; 7 = Always) to measure multiple aspects of best practice diabetic foot management, was developed. The survey was briefly tested for face and content validity. The survey was electronically distributed to Australian podiatrists via professional associations. Demographics including sex, years treating patients with diabetes, employment sector and patient numbers were also collected. Chi-squared and Mann-Whitney U tests were used to test differences between sub-groups. Results Three hundred and eleven podiatrists responded; 222 (71%) were female, 158 (51%) were from the public sector, and the median experience was 11–15 years. Participants reported treating a median of 21–30 diabetes patients each week, including 1–5 with foot ulcers. Overall, participants registered median scores of at least "very often" (>6) in their use of most items covering best practice diabetic foot management. Notable exceptions were: "never" (1 (1 – 3)) using total contact casting, "sometimes" (4 (2 – 5)) performing an ankle brachial index, "sometimes" (4 (1 – 6)) using the University of Texas Wound Classification System, and "sometimes" (4 (3 – 6)) referring to specialist multi-disciplinary foot teams. Public sector podiatrists reported higher use of, or access to, all those items compared with private sector podiatrists (p < 0.01). Conclusions This study provides the first baseline information on Australian podiatrists' adherence to best practice diabetic foot guidelines. It appears podiatrists manage large caseloads of people with diabetes and are generally implementing best practice guideline recommendations, with some notable exceptions. Further studies are required to identify barriers to implementing these recommendations, to ensure all Australians with diabetes have access to best practice care to prevent amputations.
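The sub-group comparisons in this survey rest on Mann-Whitney U tests applied to ordinal Likert scores. A minimal sketch of that kind of test using scipy follows; the response data are fabricated for illustration and are not the survey's results.

```python
# Fabricated Likert responses (1 = Never ... 7 = Always) for a
# public- versus private-sector comparison, as in the survey analysis.
from scipy.stats import mannwhitneyu

public_scores = [6, 7, 5, 6, 7, 6, 5, 7, 6, 6]
private_scores = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4]

stat, p = mannwhitneyu(public_scores, private_scores, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")
```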