388 results for "Including therapeutic trials"


Relevance: 20.00%

Publisher:

Abstract:

There is now widespread recognition of the importance of mental imagery in a range of clinical disorders (1). This provides the potential for a transdiagnostic route to integrate some aspects of these disorders and their treatment within a common framework. This opinion piece argues that we need to understand why imagery is such a central and recurring feature if we are to advance theories of the origin and maintenance of these disorders. This will aid us in identifying therapeutic techniques that target imagery not simply as a symptom but as a manifestation of an underlying problem. As papers in this issue highlight, imagery is a central feature across many clinical disorders, but has been ascribed varying roles. For example, the involuntary occurrence of traumatic memories is a diagnostic criterion for PTSD (2), and it has been suggested that multisensory imagery of traumatic events normally serves a functional role in allowing the individual to reappraise the situation (3), but that this reappraisal is disabled by extreme affective responses. In contrast to the disabling flashbacks associated with PTSD, depressed adults who experience suicidal ideation often report "flash-forward" imagery related to suicidal acts (4), motivating them to self-harm. Socially anxious individuals who engage in visual imagery about giving a talk in public become more anxious and make more negative predictions about future performance than those who engage in more abstract, semantic processing of the same past event (5). People with Obsessive Compulsive Disorder (OCD) frequently report imagery of past adverse events, and this imagery appears to be associated with symptom severity (6). The content of intrusive imagery has also been related to psychotic symptoms (7), including visual images of the catastrophic fears associated with paranoia and persecution.
Imagery has been argued (8) to play a role in the maintenance of psychosis through negative appraisals of imagined voices, misattribution of sensations to external sources, induction of negative mood states that trigger voices, and maintenance of negative schemas. In addiction and substance dependence, Elaborated Intrusion (EI) theory (9, 10) emphasizes the causal role that imagery plays in substance use by motivating an individual to pursue goals directed toward the pleasurable outcomes associated with substance use...


Emotional intelligence (EI) is defined as "the ability to recognise, understand and manage emotions in ourselves and others" [1]. Initially identified as a concept applied to leadership and management, EI is now recognised as an important skill in a number of areas, including healthcare [2]. Empathy (the ability to see the world through someone else's eyes) is known to play an important role in the therapeutic relationship with patients [3]. As EI has been shown to improve empathy [4], it is clear that developing the EI of student health professionals should benefit patients in the long term. It is not surprising, then, that a number of studies have investigated the role of EI in medical, dental and nursing students; however, there is little reported evidence relating to EI development in pre-registration radiation therapy (RT) students.


BACKGROUND OR CONTEXT
The concept of 'Aboriginal engineering' has had little exposure in conventional engineering education programs, despite more than 40,000 years of active human engagement with the diverse Australian environment. The work reported in this paper began with the premise that Indigenous Student Support Through Indigenous Perspectives Embedded in Engineering Curricula (Goldfinch et al., 2013) would provide a clear and replicable means of encouraging Aboriginal teenagers to consider a career in engineering. Although that remains a key outcome of this OLT project, the direction taken by the research has led to additional insights and perspectives that have wide implications for engineering education more generally. Current texts make only passing reference to the achievements of Aboriginal engineering, and the very absence of such references was a prompt to explore further as our work developed.

PURPOSE OR GOAL
Project goals focused on curriculum-based change, including development of a model for inclusive teaching spaces, and study units employing key features of the model. As work progressed we found we needed to understand more about the principles and practices informing the development of pre-contact Aboriginal engineering strategies for sustaining life and society within the landscape of this often harsh continent. We also found ourselves being asked 'what engineering did Aboriginal cultures have?' Finding that there are no easy-to-access answers, we began researching the question while continuing to engage with specific curriculum trials.

APPROACH
Stakeholders in the project had been identified as engineering educators, potential Aboriginal students, and Aboriginal communities local to the universities involved in the project. We realised early on that at least one more group was involved: all the non-Aboriginal students in engineering classes.
This realisation, coupled with recognition of the need to understand Aboriginal engineering as a set of viable, long-term practices, altered the focus of our efforts. Rather than focusing primarily on finding ways to attract Aboriginal engineering students, the shift has been towards evolving ways of including knowledge about Aboriginal practices and principles in relevant engineering content.

DISCUSSION
This paper introduces the model resulting from the work of this project, explores its potential influence on engineering curriculum development, and reports on implementation strategies. The model is a static representation of a dynamic and cyclic approach to engaging with Aboriginal engineering through contact with local communities, building knowledge about the social beliefs underlying Aboriginal engineering principles and practices. Ways to engage engineering educators, students and the wider community are evolving through the continuing work of the project team and will be reported in more detail in the paper.

RECOMMENDATIONS/IMPLICATIONS/CONCLUSION
While engineering may be considered by some to be agnostic in regard to culture and social issues, the work of this project is drawing attention to the importance of incorporating such issues into curriculum materials at a number of levels of complexity. The paper will introduce and explore the central concepts of the research completed to date, as well as suggesting ways in which engineering educators can extend their knowledge and understanding of Aboriginal engineering principles in the context of their own specialisations.



Periodontal inflammation can inhibit cell differentiation of periodontal ligament cells (PDLCs), resulting in decreased bone/cementum regeneration ability. The Wnt signaling pathway, comprising canonical Wnt/β-catenin signaling and noncanonical Wnt/Ca2+ signaling, plays essential roles in cell proliferation and differentiation during tooth development. However, little is known about whether the noncanonical Wnt/Ca2+ signaling cascade can regulate the cementogenic/osteogenic differentiation capability of PDLCs within an inflammatory environment. Therefore, in this study, human PDLCs (hPDLCs) and their cementogenic differentiation potential were investigated in the presence of cytokines. The data demonstrated that the cytokines interleukin-6 (IL-6) and tumor necrosis factor alpha (TNF-α) both inhibited cell proliferation, relative alkaline phosphatase activity, bone/cementum-related gene/protein expression, and canonical Wnt pathway-related gene/protein expression in hPDLCs. Interestingly, both cytokines upregulated noncanonical Wnt/Ca2+ signaling-related gene and protein expression in hPDLCs. When the Wnt/Ca2+ pathway was blocked by the Ca2+/calmodulin-dependent protein kinase II inhibitor KN93, cementogenesis could be stimulated in hPDLCs even in the presence of IL-6 and TNF-α. Our data indicate that the Wnt/Ca2+ pathway plays an inhibitory role in PDLC cementogenic differentiation in inflammatory microenvironments. Therefore, targeting the Wnt/Ca2+ pathway may provide a novel therapeutic approach to improve periodontal regeneration in periodontal disease.


We included six trials with 2524 participants. Capnography reduced hypoxaemic episodes, relative risk (95% CI) 0.71 (0.56-0.91), but the quality of evidence was poor due to high risks of performance bias and detection bias and substantial statistical heterogeneity. The reduction in hypoxaemic episodes was statistically homogeneous in the subgroup of three trials of 1823 adults sedated for colonoscopy, relative risk (95% CI) 0.59 (0.48-0.73), although the risks of performance and detection biases were high. There was no evidence that capnography affected other outcomes, including assisted ventilation, relative risk (95% CI) 0.58 (0.26-1.27), p = 0.17.
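As a rough illustration (not part of the review itself), the relative risks quoted above are simply ratios of event rates in the two arms; a minimal sketch with hypothetical counts:

```python
def relative_risk(events_exposed, n_exposed, events_control, n_control):
    """Relative risk: event rate in the exposed (e.g. capnography) arm
    divided by the event rate in the control arm."""
    return (events_exposed / n_exposed) / (events_control / n_control)

# Hypothetical counts chosen only to illustrate the direction of effect;
# they are not the trial data behind the 0.71 reported above.
rr = relative_risk(71, 500, 100, 500)  # 0.142 / 0.200 = 0.71
```

A relative risk below 1, with a confidence interval that excludes 1 (such as 0.71, 0.56-0.91), indicates fewer events in the intervention arm.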


Alcohol addiction is a debilitating disorder producing maladaptive changes in the brain, leading drinkers to become more sensitive to stress and anxiety. These changes are key factors contributing to alcohol craving and maintaining a persistent vulnerability to relapse. Serotonin (5-hydroxytryptamine, 5-HT) is a monoamine neurotransmitter widely expressed in the central nervous system, where it plays an important role in the regulation of mood. The serotonin system has been extensively implicated in the regulation of stress and anxiety, as well as in the reinforcing properties of all of the major classes of drugs of abuse, including alcohol. Dysregulation within the 5-HT system has been postulated to underlie the negative mood states associated with alcohol use disorders. This review will describe the serotonergic (5-HTergic) neuroplastic changes observed in animal models throughout the alcohol addiction cycle, from prenatal exposure through exposure in adulthood. The first section will focus on alcohol-induced 5-HTergic neuroadaptations in offspring prenatally exposed to alcohol and the consequences for the regulation of stress/anxiety. The second section will compare alterations in 5-HT signalling induced by acute or chronic alcohol exposure during adulthood and following alcohol withdrawal, highlighting the impact on the regulation of stress/anxiety signalling pathways. The third section will outline 5-HTergic neuroadaptations observed in various genetically selected ethanol-preferring rat lines. Finally, we will discuss pharmacological manipulations of the 5-HTergic system and their effects on ethanol- and anxiety/stress-related behaviours demonstrated in clinical trials, with an emphasis on current and potential treatments.


Since its initial description as a Th2 cytokine antagonistic to interferon-alpha and granulocyte-macrophage colony-stimulating factor, many studies have shown various anti-inflammatory actions of interleukin-10 (IL-10), and its role in infection as a key regulator of innate immunity. Studies have shown that IL-10 induced in response to microorganisms and their products plays a central role in shaping pathogenesis. IL-10 appears to function as both sword and shield in the response to varied groups of microorganisms, in its capacity to mediate protective immunity against some organisms while increasing susceptibility to other infections. The nature of IL-10 as a pleiotropic modulator of host responses to microorganisms is explained, in part, by its potent and varied effects on the different immune effector cells that influence antimicrobial activity. A new understanding of how microorganisms trigger IL-10 responses is emerging, along with recent discoveries of how IL-10 produced during disease might be harnessed for better protective or therapeutic strategies. In this review, we summarize studies from the past 5 years that have reported the induction of IL-10 by different classes of pathogenic microorganisms, including protozoa, nematodes, fungi, viruses and bacteria, and discuss the impact of this induction on the persistence and/or clearance of microorganisms in the host.


This study compared the effects of low-frequency electrical stimulation (LFES; Veinoplus® Sport, Ad Rem Technology, Paris, France), low-frequency electrical stimulation combined with a cooling vest (LFESCR) and active recovery combined with a cooling vest (ACTCR) as recovery strategies on performance (racing time and pacing strategies) and on physiologic and perceptual responses between two simulated sprint kayak races in a hot environment (∼32°C wet-bulb globe temperature). Eight elite male kayakers performed two successive 1000-m kayak time trials (TT1 and TT2), separated by a short-term recovery period including a 30-min application of the respective recovery intervention protocol, in a randomized crossover design. Racing time, power output and stroke rate were recorded for each time trial. Blood lactate concentration, pH, and core, skin and body temperatures were measured before and after both TT1 and TT2 and at mid- and post-recovery intervention. Perceptual ratings of thermal sensation were also collected. LFESCR was associated with a very likely effect on performance restoration compared with the ACTCR (99/0/1%) and LFES conditions (98/0/2%). LFESCR induced a significant decrease in body temperature and thermal sensation at post-recovery intervention, which was not observed in the ACTCR condition. In conclusion, the combination of LFES and wearing a cooling vest (LFESCR) improves performance restoration between two 1000-m kayak time trials performed by elite athletes in the heat.


Objective: To quantify and compare the treatment effect and risk of bias of trials reporting biomarkers or intermediate outcomes (surrogate outcomes) versus trials using final patient-relevant primary outcomes.

Design: Meta-epidemiological study.

Data sources: All randomised clinical trials published in 2005 and 2006 in six high-impact medical journals: Annals of Internal Medicine, BMJ, Journal of the American Medical Association, Lancet, New England Journal of Medicine, and PLoS Medicine.

Study selection: Two independent reviewers selected trials.

Data extraction: Trial characteristics, risk of bias, and outcomes were recorded according to a predefined form. Two reviewers independently checked data extraction. The ratio of odds ratios was used to quantify the degree of difference in treatment effects between the trials using surrogate outcomes and those using patient-relevant outcomes, also adjusted for trial characteristics. A ratio of odds ratios >1.0 implies that trials with surrogate outcomes report larger intervention effects than trials with patient-relevant outcomes.

Results: 84 trials using surrogate outcomes and 101 using patient-relevant outcomes were considered for analyses. Study characteristics of the two groups of trials were well balanced, except for median sample size (371 v 741) and single-centre status (23% v 9%). Their risk of bias did not differ. Primary analysis showed trials reporting surrogate endpoints to have larger treatment effects (odds ratio 0.51, 95% confidence interval 0.42 to 0.60) than trials reporting patient-relevant outcomes (0.76, 0.70 to 0.82), with an unadjusted ratio of odds ratios of 1.47 (1.07 to 2.01) and an adjusted ratio of odds ratios of 1.46 (1.05 to 2.04). This result was consistent across sensitivity and secondary analyses.

Conclusions: Trials reporting surrogate primary outcomes are more likely to report larger treatment effects than trials reporting final patient-relevant primary outcomes. This finding was not explained by differences in the risk of bias or characteristics of the two groups of trials.
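As a rough check on the headline figure, the ratio of odds ratios can be approximated directly from the two pooled odds ratios (the reported 1.47 comes from a meta-analytic model, so this naive ratio only approximates it):

```python
def ratio_of_odds_ratios(or_surrogate, or_patient_relevant):
    # Both ORs here are below 1 (treatment benefit), so a smaller OR means
    # a larger treatment effect; dividing the patient-relevant OR by the
    # surrogate OR therefore gives a value > 1 when surrogate trials
    # report larger effects.
    return or_patient_relevant / or_surrogate

ror = ratio_of_odds_ratios(0.51, 0.76)  # ~1.49, close to the reported 1.47
```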


- Background: Nilotinib and dasatinib are now being considered as alternatives to imatinib for the first-line treatment of chronic myeloid leukaemia (CML).
- Objective: This technology assessment reviews the available evidence for the clinical effectiveness and cost-effectiveness of dasatinib, nilotinib and standard-dose imatinib for the first-line treatment of Philadelphia chromosome-positive CML.
- Data sources: Databases [including MEDLINE (Ovid), EMBASE, Current Controlled Trials, ClinicalTrials.gov, the US Food and Drug Administration website and the European Medicines Agency website] were searched from the search end date of the last technology appraisal report on this topic (October 2002) to September 2011.
- Review methods: A systematic review of clinical effectiveness and cost-effectiveness studies; a review of surrogate relationships with survival; a review and critique of manufacturer submissions; and a model-based economic analysis.
- Results: Two clinical trials (dasatinib vs imatinib and nilotinib vs imatinib) were included in the effectiveness review. Survival was not significantly different for dasatinib or nilotinib compared with imatinib in the 24-month follow-up data available. The rates of complete cytogenetic response (CCyR) and major molecular response (MMR) were higher for patients receiving dasatinib than for those receiving imatinib at 12 months' follow-up (CCyR 83% vs 72%, p < 0.001; MMR 46% vs 28%, p < 0.0001). The rates of CCyR and MMR were higher for patients receiving nilotinib than for those receiving imatinib at 12 months' follow-up (CCyR 80% vs 65%, p < 0.001; MMR 44% vs 22%, p < 0.0001). An indirect comparison analysis showed no difference between dasatinib and nilotinib in CCyR or MMR rates at 12 months' follow-up (CCyR, odds ratio 1.09, 95% CI 0.61 to 1.92; MMR, odds ratio 1.28, 95% CI 0.77 to 2.16). There is observational association evidence from imatinib studies supporting the use of CCyR and MMR at 12 months as surrogates for overall all-cause survival and progression-free survival in patients with CML in chronic phase. In the cost-effectiveness modelling, scenario analyses were provided to reflect the extensive structural uncertainty and different approaches to estimating overall survival. First-line dasatinib is predicted to provide very poor value for money compared with first-line imatinib, with deterministic incremental cost-effectiveness ratios (ICERs) of between £256,000 and £450,000 per quality-adjusted life-year (QALY). Conversely, first-line nilotinib provided favourable ICERs at the willingness-to-pay threshold of £20,000-30,000 per QALY.
- Limitations: Immaturity of empirical trial data relative to life expectancy, forcing reliance on either surrogate relationships or cumulative survival/treatment duration assumptions.
- Conclusions: From the two trials available, dasatinib and nilotinib have a statistically significant advantage over imatinib as measured by MMR or CCyR. Taking into account the treatment pathways for patients with CML, i.e. assuming the use of second-line nilotinib, first-line nilotinib appears to be more cost-effective than first-line imatinib. Dasatinib was not cost-effective compared with imatinib and nilotinib at decision thresholds of £20,000 or £30,000 per QALY. Uncertainty in the cost-effectiveness analysis would be substantially reduced with better and more UK-specific data on the incidence and cost of stem cell transplantation in patients with chronic CML.
- Funding: The Health Technology Assessment Programme of the National Institute for Health Research.
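The ICERs quoted above follow the standard health-economics definition: incremental cost divided by incremental QALYs. A minimal sketch, using hypothetical figures that are not drawn from the assessment's model:

```python
def icer(cost_new, cost_comparator, qaly_new, qaly_comparator):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY
    of the new treatment relative to the comparator."""
    return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

# Hypothetical: a treatment costing £60,000 more while adding 0.2 QALYs
# gives £300,000 per QALY, far above a £20,000-30,000 willingness-to-pay
# threshold and thus poor value for money.
example = icer(160_000, 100_000, 5.2, 5.0)
```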


Objective: To identify the efficacy of short message service (SMS) reminders in improving health care appointment attendance.

Materials and methods: A systematic review was undertaken to identify studies published between 2005 and 2015 that compared the attendance rates of patients receiving SMS reminders with those of patients not receiving a reminder. Each article was examined for information regarding the study design, sample size, population demographics and intervention methods. A meta-analysis was used to calculate a pooled estimate odds ratio.

Results: Twenty-eight studies were included in the review, including 13 (46%) randomized controlled trials. The pooled odds ratio of the randomized controlled trials was 1.62 (1.35 - 1.94). Half of the studies reviewed sent the reminder within 48 hours prior to the appointment time, yet no significant subgroup differences with respect to participant age, SMS timing, rate or type, setting or specialty were detectable.

Discussion: All studies except one with a small sample size demonstrated a positive odds ratio, indicating that SMS reminders were an effective means of improving appointment attendance. There was no significant difference in odds ratio when controlling for when the SMS was sent, the frequency of the reminders or the content of the reminder.

Conclusion: SMS appointment reminders are an effective method of improving appointment attendance in a health care setting, and this effectiveness has improved over the past five years. Further research is required to identify the optimal SMS reminder timing and frequency, specifically in relation to the length of time since the appointment.
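The pooling step described above can be sketched as a fixed-effect inverse-variance meta-analysis on the log-odds scale (an assumption for illustration; the review does not specify its pooling model here), recovering each trial's standard error from its 95% CI:

```python
import math

def pooled_odds_ratio(odds_ratios, conf_intervals):
    """Fixed-effect inverse-variance pooling of odds ratios.

    conf_intervals holds each trial's (lower, upper) 95% CI; the standard
    error of the log-OR is recovered as (ln(hi) - ln(lo)) / (2 * 1.96),
    and trials are weighted by 1 / SE^2 on the log scale.
    """
    total_weight = 0.0
    total_weighted_log_or = 0.0
    for or_value, (lo, hi) in zip(odds_ratios, conf_intervals):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        weight = 1.0 / se ** 2
        total_weight += weight
        total_weighted_log_or += weight * math.log(or_value)
    return math.exp(total_weighted_log_or / total_weight)

# Hypothetical inputs only: two trials with identical estimates pool to
# the same value; these are not the 13 RCTs behind the 1.62 above.
pooled = pooled_odds_ratio([1.62, 1.62], [(1.35, 1.94), (1.35, 1.94)])
```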


Background: People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can damage the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections.

Objectives: To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes, including pain and skin damage.

Search methods: In June 2015 we searched: the Cochrane Wounds Specialised Register; the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE; and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting.

Selection criteria: All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections in patients in any healthcare setting.

Data collection and analysis: We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion and performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate, or otherwise synthesised data descriptively when studies were heterogeneous.
Main results: We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval (2 to 5 days; control). In each study participants were followed up until the CVAD was removed or until discharge from ICU or hospital.

- Confirmed catheter-related bloodstream infection (CRBSI): One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low-quality evidence).
- Suspected catheter-related bloodstream infection: Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between the two groups (RR 0.70, 95% CI 0.23 to 2.10) (low-quality evidence).
- All-cause mortality: Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all-cause mortality. It is unclear whether there is a difference in the risk of death from any cause between the two groups (RR 1.06, 95% CI 0.90 to 1.25) (low-quality evidence).
- Catheter-site infection: Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. It is unclear whether there is a difference in the risk of catheter-site infection between the two groups (RR 1.07, 95% CI 0.71 to 1.63) (low-quality evidence).
- Skin damage: One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low-quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled).
- Pain: Two studies involving 193 participants measured pain. It is unclear if there is a difference between long- and short-interval dressing changes in pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low-quality evidence).

Authors' conclusions: The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.


Aim: To assess the effectiveness of a decision support intervention using a pragmatic single-blind randomized controlled trial.

Background: Worldwide, the proportion of older people (aged 65 years and over) is rising. This population is known to have a higher prevalence of chronic diseases, including chronic kidney disease. The effect of this changing health landscape is seen in the increasing number of older patients (aged ≥65 years) commencing dialysis. Emerging evidence suggests that for some older patients dialysis may provide minimal benefit. In the majority of renal units, non-dialysis management is offered as an alternative to undertaking dialysis. Research regarding the decision-making support required to assist this population in choosing between dialysis and non-dialysis management is limited.

Design: A multisite, single-blind, pragmatic randomized controlled trial is proposed.

Methods: Patients will be recruited from four Queensland public hospitals and randomized into either the control or the intervention group. The decision support intervention is multimodal and includes counselling provided by a trained nurse. The comparator is standard decision-making support. The primary outcomes are decisional regret and decisional conflict. Secondary outcomes are improved knowledge and quality of life. Ethics approval was obtained in November 2014.

Conclusion: This is one of the first randomized controlled trials assessing a decision support intervention in older people with advanced chronic kidney disease. The results may provide guidance for clinicians in future approaches to assisting this population in decision-making, with the aim of reducing decisional regret and decisional conflict.