8 results for improved outcomes
at Duke University
Abstract:
Background: Although self-management approaches have shown strong evidence of positive outcomes for urinary incontinence prevention and management, few programs have been developed for Korean rural communities. Objectives: This pilot study aimed to develop, implement, and evaluate a urinary incontinence self-management program for community-dwelling women aged 55 and older with urinary incontinence in rural South Korea. Methods: This study used a one-group pre-/post-test design to measure the effects of the intervention using standardized urinary incontinence symptom, knowledge, and attitude measures. Seventeen community-dwelling older women completed weekly 90-min group sessions for 5 weeks. Descriptive statistics and paired t-tests were used to analyze the data. Results: The mean overall interference of urine leakage on daily life (pre-test: M = 5.76 ± 2.68, post-test: M = 2.29 ± 1.93, t = -4.609, p < 0.001) and the sum of International Consultation on Incontinence Questionnaire scores (pre-test: M = 11.59 ± 3.00, post-test: M = 5.29 ± 3.02, t = -5.881, p < 0.001) indicated significant improvement after the intervention. Improvement was also noted in the mean knowledge (pre-test: M = 19.07 ± 3.34, post-test: M = 23.15 ± 2.60, t = 7.550, p < 0.001) and attitude scores (pre-test: M = 2.64 ± 0.19, post-test: M = 3.08 ± 0.41, t = 5.150, p < 0.001). Weekly assignments were completed 82.4% of the time. Participants reported a high level of satisfaction (M = 26.82 ± 1.74, range 22-28) with the group program. Conclusions: Implementation of a urinary incontinence self-management program was accompanied by improved outcomes for older Korean women living in rural communities, who have scarce resources for urinary incontinence management and treatment. Urinary incontinence self-management education approaches have potential for widespread implementation in nursing practice.
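The pre/post comparisons above are paired t-tests on the same 17 participants. A minimal sketch of that analysis, using hypothetical pre- and post-test interference scores (the study's raw data are not reproduced here):

from scipy import stats

# Hypothetical pre- and post-test interference scores for 17 participants
pre  = [7, 5, 9, 4, 6, 8, 3, 6, 5, 7, 4, 8, 6, 5, 7, 9, 4]
post = [3, 2, 4, 1, 2, 3, 1, 2, 2, 3, 1, 4, 2, 2, 3, 4, 1]

# Paired (dependent-samples) t-test; post minus pre gives a negative t when scores fall
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")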
Abstract:
BACKGROUND: The American College of Cardiology guidelines recommend 3 months of anticoagulation after replacement of the aortic valve with a bioprosthesis. However, there remains great variability in current clinical practice and conflicting results from clinical studies. To assist clinical decision making, we pooled the existing evidence to assess whether anticoagulation in the setting of a new bioprosthesis was associated with improved outcomes or a greater risk of bleeding. METHODS AND RESULTS: We searched the PubMed database from its inception until April 2015 to identify original studies (observational studies or clinical trials) that assessed anticoagulation with warfarin in comparison with either aspirin or no antiplatelet or anticoagulant therapy. We included studies if their outcomes included thromboembolism or stroke/transient ischemic attacks and bleeding events. Quality assessment was performed in accordance with the Newcastle-Ottawa Scale, and random-effects analysis was used to pool the data from the available studies. I² testing was done to assess the heterogeneity of the included studies. After screening 170 articles, a total of 13 studies (cases=6431; controls=18210) were included in the final analyses. The use of warfarin was associated with a significantly increased risk of overall bleeding (odds ratio, 1.96; 95% confidence interval, 1.25-3.08; P<0.0001) or bleeding risk at 3 months (odds ratio, 1.92; 95% confidence interval, 1.10-3.34; P<0.0001) compared with aspirin or placebo. With regard to the composite primary outcome (risk of venous thromboembolism, stroke, or transient ischemic attack) at 3 months, no significant difference was seen with warfarin (odds ratio, 1.13; 95% confidence interval, 0.82-1.56; P=0.67). Moreover, anticoagulation was also not shown to improve outcomes at time intervals >3 months (odds ratio, 1.12; 95% confidence interval, 0.80-1.58; P=0.79). CONCLUSIONS: Contrary to the current guidelines, a meta-analysis of previous studies suggests that anticoagulation in the setting of an aortic bioprosthesis significantly increases bleeding risk without a favorable effect on thromboembolic events. Larger, randomized controlled studies should be performed to further guide this clinical practice.
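The pooling described above combines study-level odds ratios under a random-effects model and quantifies heterogeneity with I². A hedged sketch of that computation (DerSimonian-Laird estimator) using placeholder odds ratios and standard errors, not the actual 13 studies:

import numpy as np

# Hypothetical per-study odds ratios and standard errors of log(OR)
log_or = np.log(np.array([1.8, 2.4, 1.2, 2.0, 1.6]))
se     = np.array([0.30, 0.45, 0.25, 0.50, 0.35])

w_fixed = 1 / se**2                                              # inverse-variance weights
q = np.sum(w_fixed * (log_or - np.average(log_or, weights=w_fixed))**2)
df = len(log_or) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)                                    # DerSimonian-Laird tau^2
i2 = max(0.0, (q - df) / q) * 100                                # I^2 heterogeneity (%)

w_rand = 1 / (se**2 + tau2)                                      # random-effects weights
pooled = np.average(log_or, weights=w_rand)
pooled_se = np.sqrt(1 / np.sum(w_rand))
ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
print(f"Pooled OR = {np.exp(pooled):.2f}, 95% CI = {ci[0]:.2f}-{ci[1]:.2f}, I^2 = {i2:.1f}%")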
Abstract:
BACKGROUND: Goal-directed fluid therapy (GDFT) is associated with improved outcomes after surgery. The esophageal Doppler monitor (EDM) is widely used but has several limitations. The NICOM, a completely noninvasive cardiac output monitor (Cheetah Medical), may be appropriate for guiding GDFT. No prospective studies have compared the NICOM and the EDM. We hypothesized that the NICOM is not significantly different from the EDM for monitoring during GDFT. METHODS: One hundred adult patients undergoing elective colorectal surgery participated in this study. Patients in phase I (n = 50) had intraoperative GDFT guided by the EDM while the NICOM was connected, and patients in phase II (n = 50) had intraoperative GDFT guided by the NICOM while the EDM was connected. Each patient's stroke volume was optimized using 250-mL colloid boluses. Agreement between the monitors was assessed, and patient outcomes (postoperative pain, nausea, and return of bowel function), complications (renal, pulmonary, infectious, and wound complications), and length of hospital stay (LOS) were compared. RESULTS: Using a 10% increase in stroke volume after fluid challenge, agreement between monitors was 60% at 5 minutes, 61% at 10 minutes, and 66% at 15 minutes, with no significant systematic disagreement (McNemar P > 0.05) at any time point. The EDM had significantly more missing data than the NICOM. No clinically significant differences were found in total LOS or other outcomes. The mean LOS was 6.56 ± 4.32 days in phase I and 6.07 ± 2.85 days in phase II, and the 95% confidence limits for the difference were -0.96 to +1.95 days (P = 0.5016). CONCLUSIONS: The NICOM performs similarly to the EDM in guiding GDFT, with no clinically significant differences in outcomes, and offers increased ease of use as well as fewer missing data points. The NICOM may be a viable alternative monitor to guide GDFT.
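The agreement analysis above compares paired responder/non-responder calls (a 10% or greater increase in stroke volume) from the two monitors, with McNemar's test checking the discordant cells for systematic disagreement. A small sketch with hypothetical counts for one time point:

from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical 2x2 table of fluid-responsiveness calls for 50 fluid challenges
#                NICOM responder   NICOM non-responder
table = [[18,               9],    # EDM responder
         [11,              12]]    # EDM non-responder

result = mcnemar(table, exact=True)            # tests only the off-diagonal (disagreement) cells
agreement = (table[0][0] + table[1][1]) / 50   # proportion of concordant calls
print(f"agreement = {agreement:.0%}, McNemar p = {result.pvalue:.3f}")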
Abstract:
Minimally invasive microsurgery has resulted in improved outcomes for patients. However, operating through a microscope limits depth perception and fixes the visual perspective, resulting in a steep learning curve to achieve microsurgical proficiency. We introduce a surgical imaging system employing four-dimensional (live volumetric imaging through time) microscope-integrated optical coherence tomography (4D MIOCT) capable of imaging at up to 10 volumes per second to visualize human microsurgery. A custom stereoscopic heads-up display provides real-time interactive volumetric feedback to the surgeon. We report that 4D MIOCT enhanced suturing accuracy and control of instrument positioning in mock surgical trials involving 17 ophthalmic surgeons. Additionally, 4D MIOCT imaging was performed in 48 human eye surgeries and successfully visualized the pathology of interest, in concordance with the preoperative diagnosis, in 93% of retinal surgeries and the surgical site of interest in 100% of anterior segment surgeries. In vivo 4D MIOCT imaging revealed sub-surface pathologic structures and instrument-induced lesions that were invisible through the operating microscope during standard surgical maneuvers. In select cases, 4D MIOCT guidance was necessary to resolve such lesions and prevent post-operative complications. Our novel surgical visualization platform achieves surgeon-interactive 4D visualization of live surgery, which could expand the surgeon's capabilities.
Abstract:
BACKGROUND: Coronary artery bypass grafting (CABG) is often used to treat patients with significant coronary heart disease (CHD). To date, multiple longitudinal and cross-sectional studies have examined the association between depression and CABG outcomes. Although this relationship is well established, the mechanism underlying it remains unclear. The purpose of this study was twofold. First, we compared three markers of autonomic nervous system (ANS) function in four groups of patients: 1) patients with coronary heart disease and depression (CHD/Dep), 2) patients without CHD but with depression (NonCHD/Dep), 3) patients with CHD but without depression (CHD/NonDep), and 4) patients without CHD or depression (NonCHD/NonDep). Second, we investigated the impact of depression and autonomic nervous system activity on CABG outcomes. METHODS: Patients were screened to determine whether they met the study's inclusion and exclusion criteria. ANS function (i.e., heart rate, heart rate variability, and plasma norepinephrine levels) was measured. Chi-square tests and one-way analysis of variance were performed to evaluate group differences across demographic variables, medical variables, and indicators of ANS function. Logistic regression and multiple regression analyses were used to assess the impact of depression and autonomic nervous system activity on CABG outcomes. RESULTS: The results provide some support for the hypothesis that depressed patients with CHD have greater ANS dysregulation than those with only CHD or only depression. Furthermore, independent predictors of in-hospital length of stay and non-routine discharge included having a diagnosis of depression and CHD, elevated heart rate, and low heart rate variability. CONCLUSIONS: The current study presents evidence supporting the hypothesis that ANS dysregulation might be one of the underlying mechanisms linking depression to CABG surgery outcomes. Thus, future studies should focus on developing and testing interventions that target ANS dysregulation, which may lead to improved patient outcomes.
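The outcome analysis described above uses logistic regression for binary endpoints such as non-routine discharge, with group membership and ANS indicators as predictors. A minimal sketch on synthetic stand-in data; the variable names, effect sizes, and data are hypothetical, not the study's:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: one row per patient (hypothetical)
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "depressed_chd": rng.integers(0, 2, n),   # 1 = CHD with depression
    "heart_rate": rng.normal(75, 10, n),
    "hrv": rng.normal(30, 8, n),              # heart-rate variability index
})
logit_p = -6 + 0.8 * df.depressed_chd + 0.08 * df.heart_rate - 0.05 * df.hrv
df["nonroutine_discharge"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Logistic regression of non-routine discharge on depression/CHD status and ANS markers
model = smf.logit("nonroutine_discharge ~ depressed_chd + heart_rate + hrv", data=df).fit()
print(model.summary())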
Abstract:
BACKGROUND: Ipsilateral hindfoot arthrodesis in combination with total ankle replacement (TAR) may diminish functional outcome and prosthesis survivorship compared with isolated TAR. We compared the outcome of isolated TAR to the outcomes of TAR with ipsilateral hindfoot arthrodesis. METHODS: In a consecutive series of 404 primary TARs in 396 patients, 70 patients (17.3%) had a hindfoot fusion before, after, or at the time of TAR; the majority had either an isolated subtalar arthrodesis (n = 43, 62%) or triple arthrodesis (n = 15, 21%). The remaining 334 isolated TARs served as the control group. Mean patient follow-up was 3.2 years (range, 24-72 months). RESULTS: The SF-36 total, AOFAS Hindfoot-Ankle pain subscale, Foot and Ankle Disability Index, and Short Musculoskeletal Function Assessment scores were significantly improved from preoperative measures, with no significant differences between the hindfoot arthrodesis and control groups. The AOFAS Hindfoot-Ankle total, function, and alignment scores were significantly improved for both groups, although the control group demonstrated significantly higher scores on all 3 scales. Furthermore, the control group demonstrated a significantly greater improvement in VAS pain score compared with the hindfoot arthrodesis group. Walking speed, sit-to-stand time, and 4-square step test time were significantly improved for both groups at each postoperative time point; however, the hindfoot arthrodesis group completed these tests significantly more slowly than the control group. There was no significant difference in talar component subsidence between the fusion (2.6 mm) and control (2.0 mm) groups. The failure rate in the hindfoot fusion group (10.0%) was significantly higher than that in the control group (2.4%; p < 0.05). CONCLUSION: To our knowledge, this study represents the first series evaluating the clinical outcome of TARs performed with and without hindfoot fusion using implants available in the United States. At a mean follow-up of 3.2 years, TAR performed with ipsilateral hindfoot arthrodesis resulted in significant improvements in pain and functional outcome; in contrast to prior studies, however, overall outcome was inferior to that of isolated TAR. LEVEL OF EVIDENCE: Level II, prospective comparative series.
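The failure-rate comparison at the end of the Results (10.0% of 70 fusion patients vs. 2.4% of 334 controls) can be reproduced approximately from counts inferred from those percentages (7/70 vs. 8/334). The sketch below uses Fisher's exact test; the paper's actual test may differ:

from scipy.stats import fisher_exact

# Counts inferred from the reported percentages (hypothetical reconstruction)
table = [[7, 70 - 7],     # fusion group: failures, non-failures
         [8, 334 - 8]]    # control group: failures, non-failures

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")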
Abstract:
The HIV epidemic in the United States continues to be a significant public health problem, with approximately 50,000 new infections occurring each year. National public health priorities have shifted in recent years toward targeted HIV prevention efforts among people living with HIV/AIDS (PLWHA) that include increasing engagement in and retention in care, improving HIV treatment adherence, and increasing screening for and treatment of substance use and psychological difficulties. This study evaluated the efficacy of Positive Choices (PC), a brief, care-based, theory-driven, 3-session counseling intervention for newly HIV-diagnosed men who have sex with men (MSM), in the context of current national HIV prevention priorities. The study involved secondary analysis of data from a preliminary efficacy trial of the PC intervention (n=102). Descriptive statistics examined baseline substance use, psychological characteristics and strategies, and care engagement and HIV-related biological outcomes. Generalized Estimating Equations (GEE) examined longitudinal changes in these variables by study condition. Results indicated that PC improved adherence to HIV treatment but increased use of illicit drugs, specifically amyl nitrates and other stimulant drugs; additionally, moderation analyses indicated differences in patterns of change over time in viral load by baseline depression status. Implications of the findings and suggestions for future research are discussed.
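The longitudinal analysis above uses Generalized Estimating Equations to model change over time by study condition with repeated measures clustered within participants. A hedged sketch on synthetic stand-in data; column names, effect sizes, and the exchangeable working correlation are assumptions, not the trial's specification:

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic long-format stand-in data: one row per participant-visit (hypothetical)
rng = np.random.default_rng(1)
n, visits = 102, 3
df = pd.DataFrame({
    "participant_id": np.repeat(np.arange(n), visits),
    "visit": np.tile(np.arange(visits), n),
    "condition": np.repeat(rng.integers(0, 2, n), visits),  # 1 = Positive Choices arm
})
logit_p = -0.5 + 0.4 * df.condition * df.visit
df["adherent"] = (rng.random(len(df)) < 1 / (1 + np.exp(-logit_p))).astype(int)

# GEE with a binary adherence outcome and within-participant correlation
model = smf.gee(
    "adherent ~ condition * visit",            # condition-by-time interaction
    groups="participant_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(model.summary())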
Abstract:
In developing countries, access to modern energy for cooking and heating remains a challenge to lifting households out of poverty. About 2.5 billion people depend on solid fuels such as biomass, wood, charcoal, and animal dung. The use of solid fuels has negative outcomes for health, the environment, and economic development (Universal Energy Access, UNDP). In low-income countries, 1.3 million deaths occur due to indoor smoke or air pollution from burning solid fuels in small, confined, and unventilated kitchens or homes. In addition, pollutants such as black carbon, methane, and ozone, emitted by inefficient fuel combustion, contribute to climate change and air pollution. There are international efforts to promote the use of clean cookstoves in developing countries, but limited evidence on the economic benefits of such distribution programs. This study undertook a systematic economic evaluation of a program that distributed subsidized improved cookstoves to rural households in India. The evaluation examined the effect of different levels of subsidies on the net benefits to the household and to society. This paper answers the question, “Ex post, what are the economic benefits to various stakeholders of a program that distributed subsidized improved cookstoves?” In addressing this question, the evaluation applied empirical data from India to a cost-benefit model to examine how subsidies affect the costs and benefits of the biomass improved cookstove and the electric improved cookstove for different stakeholders.
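The cost-benefit framing above compares what a household pays and gains under different subsidy levels. An illustrative sketch of that kind of calculation, with placeholder figures rather than the paper's empirical estimates:

def npv(annual_benefit, annual_cost, upfront_cost, subsidy, years=5, discount=0.10):
    """Household net present value of adopting the stove, given a subsidy on the purchase price."""
    out_of_pocket = upfront_cost * (1 - subsidy)
    flows = sum((annual_benefit - annual_cost) / (1 + discount) ** t
                for t in range(1, years + 1))
    return flows - out_of_pocket

# Placeholder assumptions: fuel/time savings, maintenance cost, and stove price per year/unit
for subsidy in (0.0, 0.5, 0.75, 1.0):
    value = npv(annual_benefit=40.0, annual_cost=5.0, upfront_cost=60.0, subsidy=subsidy)
    print(f"subsidy {subsidy:>4.0%}: household NPV = {value:6.1f}")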