14 results for Team Effectiveness at Duke University
Abstract:
PURPOSE: The readiness assurance process (RAP) of team-based learning (TBL) is an important element that ensures that students come prepared to learn. However, the RAP can use a significant amount of class time which could otherwise be used for application exercises. The authors administered either the TBL-associated RAP in class or individual readiness assurance tests (iRATs) at home to compare medical student performance and learning preference for physiology content. METHODS: Using a cross-over study design, first-year medical student TBL teams were divided into two groups. One group was administered iRATs and group readiness assurance tests (gRATs) consisting of physiology questions during scheduled class time. The other group was administered the same iRAT questions at home and did not complete a gRAT. To compare the effectiveness of the two administration methods, both groups completed the same 12-question physiology assessment during dedicated class time. Four weeks later, the entire process was repeated, with each group administered the RAP using the opposite method. RESULTS: Performance on the physiology assessment after at-home administration of the iRAT was equivalent to performance after traditional in-class administration of the RAP. In addition, a majority of students preferred the at-home method of administration and reported that the at-home method was more effective in helping them learn course content. CONCLUSION: The at-home administration of the iRAT proved effective. The at-home administration method is a promising alternative to conventional iRATs and gRATs, with the goal of preserving valuable in-class time for TBL application exercises.
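The crossover comparison described above lends itself to a paired analysis, since each student is assessed after both administration methods. The sketch below is illustrative only, assuming per-student scores on the 12-question assessment; all data values are invented and this is not the study's own analysis code.

```python
# Illustrative paired comparison of assessment scores from a two-period
# crossover (in-class RAP vs. at-home iRAT). All values are hypothetical.
import numpy as np
from scipy import stats

in_class = np.array([9, 10, 8, 11, 10, 9, 12, 8, 10, 11])  # scores out of 12
at_home = np.array([10, 9, 8, 11, 11, 9, 12, 9, 10, 10])

t_stat, p_value = stats.ttest_rel(in_class, at_home)
print(f"mean in-class = {in_class.mean():.2f}, mean at-home = {at_home.mean():.2f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```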
Abstract:
BACKGROUND: Many patients with diabetes have poor blood pressure (BP) control. Pharmacological therapy is the cornerstone of effective BP treatment, yet there are high rates of both poor medication adherence and failure to intensify medications. Successful medication management requires an effective partnership between providers who initiate and increase doses of effective medications and patients who adhere to the regimen. METHODS: In this cluster-randomized controlled effectiveness study, primary care teams within sites were randomized either to a program led by a clinical pharmacist trained in motivational interviewing-based behavioral counseling approaches and authorized to make BP medication changes, or to usual care. This study involved the collection of data during a 14-month intervention period in three Department of Veterans Affairs facilities and two Kaiser Permanente Northern California facilities. The clinical pharmacist was supported by clinical information systems that enabled proactive identification of, and outreach to, eligible patients, identified on the basis of poor BP control and either medication refill gaps or a lack of recent medication intensification. The primary outcome is the relative change in systolic blood pressure (SBP) measurements over time. Secondary outcomes are changes in hemoglobin A1c, low-density lipoprotein (LDL) cholesterol, medication adherence determined from pharmacy refill data, and medication intensification rates. DISCUSSION: Integration of the three intervention elements (proactive identification, adherence counseling, and medication intensification) is essential to achieve optimal levels of control for high-risk patients. Testing the effectiveness of this intervention at the team level allows us to study the program as it would typically be implemented within a clinic setting, including how it integrates with other elements of care. TRIAL REGISTRATION: The ClinicalTrials.gov registration number is NCT00495794.
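One way the primary outcome (change in SBP over time, with primary care teams randomized as clusters) could be modeled is a mixed-effects regression with a random intercept per team. The sketch below is a rough illustration under assumed file and column names (sbp, months, arm, team_id); it is not the trial's analysis code.

```python
# Hypothetical sketch: SBP modeled over time by study arm, with random
# intercepts for the randomized primary care team (the cluster).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bp_visits.csv")  # assumed columns: sbp, months, arm, team_id
model = smf.mixedlm("sbp ~ months * arm", data=df, groups=df["team_id"])
result = model.fit()
# The months:arm interaction estimates the between-arm difference in SBP trend.
print(result.summary())
```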
Abstract:
Co-occurrence of HIV and substance abuse is associated with poor outcomes for HIV-related health and substance use. Integration of substance use treatment and medical care holds promise for HIV patients, yet few integrated treatment models have been reported. Most of the reported models lack data on treatment outcomes in diverse settings. This study examined the substance use outcomes of an integrated treatment model for patients with both HIV and substance use at three different clinics. Sites differed by type and degree of integration, with one integrated academic medical center, one co-located academic medical center, and one co-located community health center. Participants (n=286) received integrated substance use and HIV treatment for 12 months and were interviewed at 6-month intervals. We used linear generalized estimating equation regression analysis to examine changes in Addiction Severity Index (ASI) alcohol and drug severity scores. To test whether our treatment was differentially effective across sites, we compared a full model including site by time point interaction terms to a reduced model including only site fixed effects. Alcohol severity scores decreased significantly at 6 and 12 months. Drug severity scores decreased significantly at 12 months. Once baseline severity variation was incorporated into the model, there was no evidence of variation in alcohol or drug score changes by site. Substance use outcomes did not differ by age, gender, income, or race. This integrated treatment model offers an option for treating diverse patients with HIV and substance use in a variety of clinic settings. Studies with control groups are needed to confirm these findings.
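To make the modeling strategy concrete, the sketch below shows a GEE of the kind described, with repeated ASI scores clustered by participant and a full model that adds site-by-time interaction terms to a reduced, site-fixed-effects model. The file and column names and the exchangeable working correlation are assumptions for illustration, not the study's code.

```python
# Hypothetical GEE sketch: ASI alcohol severity over time points, clustered
# by participant; the full model adds site-by-time interaction terms.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("asi_visits.csv")  # assumed columns: asi_alcohol, timepoint, site, subject_id

reduced = smf.gee("asi_alcohol ~ C(timepoint) + C(site)",
                  groups="subject_id", data=df,
                  cov_struct=sm.cov_struct.Exchangeable()).fit()

full = smf.gee("asi_alcohol ~ C(timepoint) * C(site)",
               groups="subject_id", data=df,
               cov_struct=sm.cov_struct.Exchangeable()).fit()

print(reduced.summary())
print(full.summary())  # Wald tests on the interaction terms assess site-specific change
```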
Abstract:
We analyze the cost-effectiveness of electric utility ratepayer-funded programs to promote demand-side management (DSM) and energy efficiency (EE) investments. We specify a model that relates electricity demand to previous EE DSM spending, energy prices, income, weather, and other demand factors. In contrast to previous studies, we allow EE DSM spending to have a potential long-term demand effect and explicitly address possible endogeneity in spending. We find that current-period EE DSM expenditures reduce electricity demand and that this effect persists for a number of years. Our findings suggest that ratepayer-funded DSM expenditures between 1992 and 2006 produced a central estimate of 0.9 percent savings in electricity consumption over that time period and a 1.8 percent savings over all years. These energy savings came at an expected average cost to utilities of roughly 5 cents per kWh saved when future savings are discounted at a 5 percent rate.
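As a rough illustration of how a cost-per-kWh-saved figure of this kind can be computed, the example below discounts a stream of future energy savings at 5 percent and divides program spending by the discounted total. The spending and savings numbers are invented; only the discounting logic reflects the paper's description.

```python
# Illustrative levelized cost of saved energy with a 5% discount rate.
# The program cost and savings stream below are hypothetical.
spending = 1_000_000.0            # program cost in year 0 (USD)
annual_savings_kwh = 4_000_000.0  # kWh saved in each of the next 8 years
discount_rate = 0.05

discounted_kwh = sum(annual_savings_kwh / (1 + discount_rate) ** t for t in range(1, 9))
cost_per_kwh = spending / discounted_kwh
print(f"discounted savings = {discounted_kwh:,.0f} kWh")
print(f"cost per kWh saved = {cost_per_kwh * 100:.1f} cents")
```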
Abstract:
Systematic reviews comparing the effectiveness of strategies to prevent, detect, and treat chronic kidney disease are needed to inform patient care. We engaged stakeholders in the chronic kidney disease community to prioritize topics for future comparative effectiveness research systematic reviews. We developed a preliminary list of suggested topics, and stakeholders refined and ranked topics based on their importance. Among 46 topics identified, stakeholders nominated 18 as 'high' priority. Most pertained to strategies to slow disease progression, including (a) treating proteinuria, (b) improving access to care, (c) treating hypertension, (d) using health information technology, and (e) implementing dietary strategies. Most (15 of 18) topics had been previously studied with two or more randomized controlled trials, indicating the feasibility of rigorous systematic reviews. Chronic kidney disease topics rated by stakeholders as 'high priority' are varied in scope and may lead to high-quality systematic reviews that impact practice and policy.
Abstract:
BACKGROUND: Evidence is lacking to inform providers' and patients' decisions about many common treatment strategies for patients with end stage renal disease (ESRD). METHODS/DESIGN: The DEcIDE Patient Outcomes in ESRD Study is funded by the United States (US) Agency for Healthcare Research and Quality to study the comparative effectiveness of: 1) antihypertensive therapies, 2) early versus later initiation of dialysis, and 3) intravenous iron therapies on clinical outcomes in patients with ESRD. Ongoing studies utilize four existing, nationally representative cohorts of patients with ESRD, including (1) the Choices for Healthy Outcomes in Caring for ESRD study (1041 incident dialysis patients recruited from October 1995 to June 1999 with complete outcome ascertainment through 2009), (2) the Dialysis Clinic Inc cohort (45,124 incident dialysis patients initiating and receiving their care from 2003-2010 with complete outcome ascertainment through 2010), (3) the United States Renal Data System (333,308 incident dialysis patients from 2006-2009 with complete outcome ascertainment through 2010), and (4) the Cleveland Clinic Foundation Chronic Kidney Disease Registry (53,399 patients with chronic kidney disease with outcome ascertainment from 2005 through 2009). We ascertain patient-reported outcomes (i.e., health-related quality of life), morbidity, and mortality using clinical and administrative data and data obtained from national death indices. We use advanced statistical methods (e.g., propensity scoring and marginal structural modeling) to account for potential biases of our study designs. All data are de-identified for analyses. The conduct of studies and dissemination of findings are guided by input from stakeholders in the ESRD community. DISCUSSION: The DEcIDE Patient Outcomes in ESRD Study will provide needed evidence regarding the effectiveness of common treatments employed for dialysis patients. Carefully planned dissemination strategies to the ESRD community will enhance the studies' impact on clinical care and patients' outcomes.
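Of the statistical methods named above, propensity scoring is the most self-contained to sketch. The example below estimates the probability of one hypothetical exposure (early dialysis initiation) from a few baseline covariates and forms inverse-probability-of-treatment weights; the file, column names, and covariate set are assumptions for illustration, and this is not the study's analysis code.

```python
# Hypothetical propensity-score sketch: model the probability of early
# dialysis initiation, then compare mortality with inverse-probability weights.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("esrd_cohort.csv")  # assumed columns: early_start, age, egfr, diabetes, death

ps_model = smf.logit("early_start ~ age + egfr + diabetes", data=df).fit()
df["ps"] = ps_model.predict(df)

# Inverse-probability-of-treatment weights
df["iptw"] = df["early_start"] / df["ps"] + (1 - df["early_start"]) / (1 - df["ps"])

# Weighted mortality by treatment group
weighted_mortality = df.groupby("early_start").apply(
    lambda g: (g["death"] * g["iptw"]).sum() / g["iptw"].sum())
print(weighted_mortality)
```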
Abstract:
This meta-analysis synthesizes research on the effectiveness of intelligent tutoring systems (ITS) for college students. Thirty-five reports were found containing 39 studies assessing the effectiveness of 22 types of ITS in higher education settings. Most frequently studied were AutoTutor, Assessment and Learning in Knowledge Spaces, eXtended Tutor-Expert System, and Web Interface for Statistics Education. Major findings include: (a) overall, ITS had a moderate positive effect on college students' academic learning (g = .32 to g = .37); (b) ITS were less effective than human tutoring, but they outperformed all other instruction methods and learning activities, including traditional classroom instruction, reading printed text or computerized materials, computer-assisted instruction, laboratory or homework assignments, and no-treatment control; (c) ITS effectiveness did not differ significantly across ITS types, subject domains, or the manner or degree of their involvement in instruction and learning; and (d) effectiveness in earlier studies appeared to be significantly greater than that in more recent studies. In addition, there is some evidence suggesting the importance of teachers and pedagogy in ITS-assisted learning.
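For readers unfamiliar with the effect-size metric behind figures such as g = .32 to .37, the sketch below computes Hedges' g for a single hypothetical ITS-versus-control comparison; the group statistics are invented.

```python
# Minimal sketch of Hedges' g: a standardized mean difference with a
# small-sample correction. The group statistics below are hypothetical.
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with Hedges' small-sample correction."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / pooled_sd
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)  # Hedges' J approximation
    return d * correction

# Hypothetical ITS vs. control posttest scores
print(round(hedges_g(78.0, 74.0, 11.0, 12.0, 40, 40), 2))
```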
Abstract:
Clinical trials have demonstrated the efficacy of posaconazole over fluconazole and itraconazole for preventing invasive fungal disease (IFD) in patients with acute myelogenous leukemia (AML) or myelodysplastic syndrome (MDS). However, the effectiveness of posaconazole has not been investigated in the United States in a real-world setting outside the environment of a controlled clinical trial. We performed a single-center, retrospective cohort study of 130 evaluable patients ≥18 years of age admitted to Duke University Hospital between 2004 and 2010 who received either posaconazole or fluconazole as prophylaxis during first induction or first reinduction chemotherapy for AML or MDS. The primary endpoint was possible, probable, or definite breakthrough IFD. Baseline characteristics were well balanced between groups, except that posaconazole recipients received reinduction chemotherapy and cytarabine more frequently. IFD occurred in 17/65 (27.0%) patients in the fluconazole group and in 6/65 (9.2%) in the posaconazole group (P = 0.012). Definite/probable IFDs occurred in 7 (10.8%) and 0 (0%) patients, respectively (P = 0.0013). In multivariate analysis, fluconazole prophylaxis and duration of neutropenia were predictors of IFD. Mortality was similar between groups. This study demonstrates the superior effectiveness of posaconazole over fluconazole as prophylaxis against IFD in AML and MDS patients. This superiority did not translate into a reduction in 100-day all-cause mortality.
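As a simple illustration of the between-group comparison reported above (17/65 vs. 6/65 breakthrough IFD), the snippet below runs a Fisher's exact test on the 2x2 table. The choice of test is an assumption for illustration; the study's own analysis may have used a different method, so the printed p-value need not match the reported one.

```python
# Illustrative 2x2 comparison of breakthrough IFD rates between prophylaxis groups.
from scipy.stats import fisher_exact

table = [[17, 65 - 17],  # fluconazole: IFD, no IFD
         [6, 65 - 6]]    # posaconazole: IFD, no IFD
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.3f}")
```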
Abstract:
BACKGROUND: Diagnostic imaging represents the fastest growing segment of costs in the US health system. This study investigated the cost-effectiveness of alternative diagnostic approaches to meniscus tears of the knee, a highly prevalent condition whose diagnosis traditionally relies on MRI as part of the diagnostic strategy. PURPOSE: To identify the most efficient strategy for the diagnosis of meniscus tears. STUDY DESIGN: Economic and decision analysis; Level of evidence, 1. METHODS: A simple-decision model run as a cost-utility analysis was constructed to assess the value added by MRI in various combinations with patient history and physical examination (H&P). The model examined traumatic and degenerative tears in 2 distinct settings: primary care and orthopaedic sports medicine clinic. Strategies were compared using the incremental cost-effectiveness ratio (ICER). RESULTS: In both practice settings, H&P alone was widely preferred for degenerative meniscus tears. Performing MRI to confirm a positive H&P was preferred for traumatic tears in both practice settings, with a willingness to pay of less than US$50,000 per quality-adjusted life-year. Performing an MRI for all patients was not preferred in any reasonable clinical scenario. The prevalence of a meniscus tear in a clinician's patient population was influential. For traumatic tears, MRI to confirm a positive H&P was preferred when prevalence was less than 46.7%, with H&P alone preferred above that. For degenerative tears, H&P alone was preferred until the prevalence reached 74.2%, after which MRI to confirm a negative H&P was the preferred strategy. In both settings, MRI to confirm a positive physical examination led to a more than 10-fold lower rate of unnecessary surgeries than any other strategy, while MRI to confirm a negative physical examination led to 2.08 and 2.26 times higher rates than H&P alone in primary care and orthopaedic clinics, respectively. CONCLUSION: For all practitioners, H&P is the preferred strategy for a suspected degenerative meniscus tear. MRI to confirm a positive H&P is preferred for traumatic tears for all practitioners. Consideration should be given to implementing alternative diagnostic strategies as well as enhancing provider education in physical examination skills to improve the reliability of H&P as a diagnostic test. CLINICAL RELEVANCE: Alternative diagnostic strategies that do not include the use of MRI may result in decreased health care costs without harm to the patient and could possibly reduce unnecessary procedures.
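The comparison metric named in the methods, the incremental cost-effectiveness ratio (ICER), is simply incremental cost divided by incremental effectiveness. The worked example below uses invented costs and QALYs, not values from the study's decision model.

```python
# Minimal ICER example against a willingness-to-pay threshold. All inputs are hypothetical.
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost per quality-adjusted life-year gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical strategies: H&P alone vs. MRI to confirm a positive H&P
ratio = icer(cost_new=1900.0, cost_old=1400.0, qaly_new=0.84, qaly_old=0.82)
willingness_to_pay = 50_000.0  # USD per QALY, the threshold cited above
print(f"ICER = ${ratio:,.0f} per QALY gained; cost-effective: {ratio < willingness_to_pay}")
```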
Abstract:
BACKGROUND: Despite the high prevalence and global impact of knee osteoarthritis (KOA), current treatments are palliative. No disease-modifying anti-osteoarthritic drug (DMOAD) has been approved. We recently demonstrated significant involvement of uric acid and activation of the innate immune response in osteoarthritis (OA) pathology and progression, suggesting that traditional gout therapy may be beneficial for OA. We therefore assess colchicine, an existing commercially available agent for gout, for a new therapeutic application in KOA. METHODS/DESIGN: COLKOA is a double-blind, placebo-controlled, randomized trial comparing a 16-week treatment with standard daily dose oral colchicine to placebo for KOA. A total of 120 participants with symptomatic KOA will be recruited from a single center in Singapore. The primary end point is a 30% improvement in total Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) score at week 16. Secondary end points include improvement in pain, physical function, and quality of life, and change in serum, urine, and synovial fluid biomarkers of cartilage metabolism and inflammation. A magnetic resonance imaging (MRI) substudy will be conducted in 20 participants to evaluate change in synovitis. Logistic regression will be used to compare changes between groups in an intention-to-treat analysis. DISCUSSION: The COLKOA trial is designed to evaluate whether commercially available colchicine is effective for improving signs and symptoms of KOA and reducing synovial fluid, serum, and urine biomarkers of inflammation and biochemical joint degradation. These biomarkers should provide insights into the underlying mechanism of therapeutic response. This trial will potentially provide data to support a new treatment option for KOA. TRIAL REGISTRATION: The trial has been registered at clinicaltrials.gov as NCT02176460. Date of registration: 26 June 2014.
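A minimal sketch of the planned primary analysis (logistic regression of 30% WOMAC responder status on treatment arm, analyzed by intention to treat) is shown below; the file, column names, and covariate choice are assumptions for illustration, not the trial's statistical code.

```python
# Hypothetical sketch: responder status (>=30% WOMAC improvement) regressed
# on treatment arm, intention-to-treat. Column names are assumed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("colkoa_week16.csv")  # assumed columns: responder (0/1), arm, baseline_womac

model = smf.logit("responder ~ C(arm) + baseline_womac", data=df).fit()
print(model.summary())
print(np.exp(model.params))  # odds ratios; the arm term compares colchicine with placebo
```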
Abstract:
BACKGROUND: In recent decades, low-level laser therapy (LLLT) has been widely used to relieve pain caused by different musculoskeletal disorders. Though widely used, its reported therapeutic outcomes are varied and conflicting. Results similarly conflict regarding its usage in patients with nonspecific chronic low back pain (NSCLBP). This study investigated the efficacy of LLLT for the treatment of NSCLBP by a systematic literature search with meta-analyses on selected studies. METHOD: MEDLINE, EMBASE, ISI Web of Science and the Cochrane Library were systematically searched from January 2000 to November 2014. Included studies were randomized controlled trials (RCTs) written in English that compared LLLT with placebo treatment in NSCLBP patients. The efficacy effect size was estimated by the weighted mean difference (WMD). Standard random-effects meta-analysis was used, and inconsistency was evaluated by the I-squared index (I²). RESULTS: Of 221 studies, seven RCTs (one triple-blind, four double-blind, one single-blind, one not mentioning blinding, totaling 394 patients) met the criteria for inclusion. Based on five studies, the WMD in visual analog scale (VAS) pain outcome score after treatment was significantly lower in the LLLT group compared with placebo (WMD = -13.57 [95% CI = -17.42, -9.72], I² = 0%). No significant treatment effect was identified for disability scores or spinal range of motion outcomes. CONCLUSIONS: Our findings indicate that LLLT is an effective method for relieving pain in NSCLBP patients. However, there is still a lack of evidence supporting its effect on function.
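To make the pooling step concrete, the sketch below implements a standard DerSimonian-Laird random-effects meta-analysis of mean differences together with the I-squared statistic, using invented study estimates; it is not the review's own code.

```python
# Illustrative DerSimonian-Laird random-effects pooling of mean differences (VAS points).
import numpy as np

yi = np.array([-15.0, -12.0, -14.5, -11.0, -13.0])  # per-study mean differences (hypothetical)
vi = np.array([9.0, 16.0, 12.0, 20.0, 10.0])        # per-study variances (hypothetical)

w_fixed = 1 / vi
q = np.sum(w_fixed * (yi - np.sum(w_fixed * yi) / np.sum(w_fixed)) ** 2)
df_q = len(yi) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df_q) / c)       # between-study variance
w_rand = 1 / (vi + tau2)

wmd = np.sum(w_rand * yi) / np.sum(w_rand)
se = np.sqrt(1 / np.sum(w_rand))
i2 = max(0.0, (q - df_q) / q) * 100   # percent of variability due to heterogeneity

print(f"pooled WMD = {wmd:.2f} (95% CI {wmd - 1.96*se:.2f} to {wmd + 1.96*se:.2f}), I^2 = {i2:.0f}%")
```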
Abstract:
BACKGROUND: Road traffic injuries (RTIs) are a growing but neglected global health crisis, requiring effective prevention to promote sustainable safety. Low- and middle-income countries (LMICs) bear a disproportionately high burden, accounting for 90% of the world's road traffic deaths, and RTIs in these countries are escalating due to rapid urbanization and motorization. Although several studies have assessed the effectiveness of a specific intervention, no systematic reviews have been conducted summarizing the effectiveness of RTI prevention initiatives specifically performed in LMIC settings; this study helps fill that gap. METHODS: In accordance with PRISMA guidelines, we searched the electronic databases MEDLINE, EMBASE, Scopus, Web of Science, TRID, Lilacs, Scielo and Global Health. Articles were eligible if they considered RTI prevention in LMICs by evaluating a prevention-related intervention with outcome measures of crash, RTI, or death. In addition, a reference and citation analysis was conducted, as well as a data quality assessment. A qualitative metasummary approach was used for data analysis, and effect sizes were calculated to quantify the magnitude of emerging themes. RESULTS: Of the 8560 articles from the literature search, 18 articles from 11 LMICs fit the eligibility and inclusion criteria. Of these studies, four were from Sub-Saharan Africa, ten from Latin America and the Caribbean, one from the Middle East, and three from Asia. Half of the studies focused specifically on legislation, while the others focused on speed control measures, educational interventions, enforcement, road improvement, community programs, or a multifaceted intervention. CONCLUSION: Legislation was the most common intervention evaluated, with the best outcomes when combined with strong enforcement initiatives or as part of a multifaceted approach. Because speed control is crucial to crash and injury prevention, road improvement interventions in LMIC settings should carefully consider how improvements will affect speed and traffic flow. Further road traffic injury prevention interventions should be performed in LMICs with patient-centered outcomes in order to guide injury prevention in these complex settings.
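In a qualitative metasummary, the effect sizes referred to above are typically frequency effect sizes: the share of included reports in which a given theme appears. The toy calculation below uses the 18 included reports; the legislation count matches the "half of the studies" statement, while the other counts are invented for illustration.

```python
# Illustrative metasummary frequency effect sizes (theme count / total reports).
total_reports = 18
reports_with_theme = {"legislation": 9, "speed control": 4, "enforcement": 3}  # partly hypothetical counts

for theme, n in reports_with_theme.items():
    print(f"{theme}: frequency effect size = {n / total_reports:.0%}")
```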
Abstract:
CONCLUSION: Radiation dose reduction, while preserving image quality, could be easily implemented with this approach. Furthermore, the availability of a dosimetric data archive provides immediate feedback on the implemented optimization strategies. BACKGROUND: JCI standards and European legislation (EURATOM 59/2013) require the implementation of patient radiation protection programs in diagnostic radiology. The aim of this study is to demonstrate the possibility of reducing patients' radiation exposure without decreasing image quality, through a multidisciplinary team (MT) that analyzes dosimetric data from diagnostic examinations. EVALUATION: Data from CT examinations performed with two different scanners (Siemens Definition and GE LightSpeed Ultra) between November and December 2013 are considered. The CT scanners are configured to automatically send images to DoseWatch software, which stores output parameters (e.g., kVp, mAs, pitch) and exposure data (e.g., CTDIvol, DLP, SSDE). Data are analyzed and discussed by an MT composed of medical physicists and radiologists to identify protocols that show critical dosimetric values and to suggest possible improvement actions to be implemented. Furthermore, the large amount of available data allows the diagnostic protocols currently in use to be monitored and different statistical populations to be identified for each of them. DISCUSSION: We identified critical values of average CTDIvol for head and facial bones examinations (61.8 mGy over 151 scans and 61.6 mGy over 72 scans, respectively) performed with the GE LightSpeed CT. Statistical analysis allowed us to identify two distinct populations for the head scan, one of which comprised only 10% of the total number of scans and corresponded to lower exposure values; the MT adopted this protocol as the standard. Moreover, constant monitoring of output parameters allowed us to identify unusual values in facial bones exams, due to changes made during maintenance service, which the team promptly suggested correcting. This resulted in a substantial dose saving in average CTDIvol values of approximately 15% and 50% for head and facial bones exams, respectively. Diagnostic image quality was deemed suitable for clinical use by the radiologists.
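The two dose populations mentioned in the discussion could be surfaced in several ways; one possibility is fitting a two-component Gaussian mixture to the per-scan CTDIvol values, as sketched below on invented data. This is only one plausible approach, not necessarily the analysis the team performed.

```python
# Illustrative detection of two dose populations in head-scan CTDIvol values
# using a two-component Gaussian mixture. All dose values are hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
ctdi_vol = np.concatenate([rng.normal(65, 5, 135),   # main head protocol (hypothetical, mGy)
                           rng.normal(45, 4, 16)])   # lower-dose subgroup (hypothetical, mGy)

gmm = GaussianMixture(n_components=2, random_state=0).fit(ctdi_vol.reshape(-1, 1))
labels = gmm.predict(ctdi_vol.reshape(-1, 1))
for k in range(2):
    print(f"component {k}: mean CTDIvol = {gmm.means_[k, 0]:.1f} mGy, "
          f"n = {np.sum(labels == k)}")
```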