806 results for delayed multiple baseline across participants


Relevance:

30.00%

Publisher:

Abstract:

Individuals differ widely in how steeply they discount future rewards. The sources of these stable individual differences in delay discounting (DD) are largely unknown. One candidate is the COMT Val158Met polymorphism, known to modulate prefrontal dopamine levels and affect DD. To identify possible neural mechanisms by which this polymorphism may contribute to stable individual DD differences, we measured 73 participants' neural baseline activation using resting electroencephalography (EEG). Such neural baseline activation measures are highly heritable and stable over time, making them an ideal endophenotype candidate for explaining how genes may influence behavior via individual differences in neural function. After EEG recording, participants made a series of incentive-compatible intertemporal choices to determine the steepness of their DD. We found that COMT significantly affected DD and that this effect was mediated by baseline activation level in the left dorsal prefrontal cortex (DPFC): (i) COMT had a significant effect on DD such that the number of Val alleles was positively correlated with steeper DD (a higher number of Val alleles means greater COMT activity and thus lower dopamine levels). (ii) A whole-brain search identified a cluster in left DPFC where baseline activation was correlated with DD; lower activation was associated with steeper DD. (iii) COMT had a significant effect on the baseline activation level in this left DPFC cluster such that a higher number of Val alleles was associated with lower baseline activation. (iv) The effect of COMT on DD was explained by the mediating effect of neural baseline activation in the left DPFC cluster. Our study thus establishes baseline activation level in left DPFC as a salient neural signature, in the form of an endophenotype, that mediates the link between COMT and DD.
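The mediation logic in steps (i)-(iv) can be illustrated with a short computational sketch: estimate the path from predictor to mediator, the path from mediator to outcome controlling for the predictor, and bootstrap the product of the two paths. This is a minimal illustration with simulated stand-in variables (val_alleles, dpfc_baseline, and discounting are hypothetical placeholders), not the study's actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 73
# Simulated stand-ins for the real measurements (hypothetical placeholders).
val_alleles = rng.integers(0, 3, n).astype(float)         # 0, 1, or 2 Val alleles
dpfc_baseline = -0.5 * val_alleles + rng.normal(0, 1, n)  # resting EEG activation
discounting = -0.6 * dpfc_baseline + rng.normal(0, 1, n)  # DD steepness

def indirect_effect(x, m, y):
    # a-path: slope of the mediator on the predictor
    a = np.polyfit(x, m, 1)[0]
    # b-path: slope of the outcome on the mediator, controlling for the predictor
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]
    return a * b

# Nonparametric bootstrap of the indirect (mediated) effect a*b
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(val_alleles[idx], dpfc_baseline[idx], discounting[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% CI for the indirect effect: [{lo:.3f}, {hi:.3f}]")
# A CI excluding zero supports mediation of the COMT -> DD effect by DPFC activation.
```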

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND The main goal of this study was to assess the frequency, clinical correlates, and independent predictors of fatigue in a homogeneous cohort of well-defined glioblastoma patients at baseline, prior to combined radio-chemotherapy. METHODS We prospectively included 65 glioblastoma patients at postsurgical baseline and assessed fatigue, sleepiness, mean bedtimes, mood disturbances, and clinical characteristics such as clinical performance status, presenting symptomatology, details of the neurosurgical procedure, tumor location and diameter, and pharmacological treatment including antiepileptic drugs, antidepressants, and use of corticosteroids. Fatigue and sleepiness were measured with the Fatigue Severity Scale and the Epworth Sleepiness Scale, respectively, and compared with 130 age- and sex-matched healthy controls. RESULTS We observed a significant correlation between fatigue and sleepiness scores in both patients (r = 0.26; P = .04) and controls (r = 0.36; P < .001). Fatigue was more common in glioblastoma patients than in healthy controls (48% vs 11%; P < .001), whereas sleepiness was not (22% vs 19%; P = .43). Female sex was associated with increased fatigue frequency among glioblastoma patients but not among control participants. Multiple linear regression analyses identified depression, left-sided tumor location, and female sex as the strongest correlates of baseline fatigue severity. CONCLUSIONS Our findings indicate that glioblastoma patients are frequently affected by fatigue at baseline, suggesting that factors other than those related to radio- or chemotherapy, particularly depression and tumor localization, have a significant impact.
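A multiple linear regression of the kind reported above can be sketched as follows; the simulated data frame and its column names (depression, left_sided, female, fss) are hypothetical illustrations, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 65
df = pd.DataFrame({
    "depression": rng.normal(10, 4, n),    # e.g., a depression questionnaire score
    "left_sided": rng.integers(0, 2, n),   # 1 = left-sided tumor
    "female": rng.integers(0, 2, n),       # 1 = female
})
# Simulated fatigue severity with effects roughly matching the reported pattern
df["fss"] = (2 + 0.2 * df.depression + 0.8 * df.left_sided
             + 0.6 * df.female + rng.normal(0, 1, n))

# Regress fatigue severity on all three candidate predictors at once
model = smf.ols("fss ~ depression + left_sided + female", data=df).fit()
print(model.summary())  # per-predictor coefficients, t-values, and p-values
```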

Relevance:

30.00%

Publisher:

Abstract:

Purpose The purpose of this study was to investigate multiple indirect Big Five personality influences on professionals’ annual salary while considering relevant mediators: the motivational variables of occupational self-efficacy and career-advancement goals, and the work status variable of contractual work hours. The motivational and work status variables were conceptualized as serial mediators (Big Five → occupational self-efficacy/career-advancement goals → contractual work hours → annual salary). Design/Methodology/Approach We conducted a four-year longitudinal survey study with 432 participants and three points of measurement. We assessed personality prior to the mediators and the mediators prior to annual salary. Findings Results showed that all Big Five personality traits except openness exerted indirect influences on annual salary. Career-advancement goals mediated influences of conscientiousness (+), extraversion (+), and agreeableness (−). Occupational self-efficacy mediated influences of neuroticism (−) and conscientiousness (+). Because the influence of occupational self-efficacy on annual salary was fully mediated by contractual work hours, indirect personality influences via occupational self-efficacy always included contractual work hours in a serial mediation. Implications These findings underline the importance of distal personality traits for career success. They give further insight into direct and indirect relationships between personality, goal content, self-efficacy beliefs, and an individual’s career progress. Originality/Value Previous research predominantly investigated direct Big Five influences on salary and analyzed cross-sectional data. This study is one of the first to investigate multiple indirect Big Five influences on salary in a longitudinal design. The findings support process-oriented theories of personality influences on career outcomes.
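The serial mediation chain described here (trait → occupational self-efficacy → contractual work hours → salary) reduces to a product of three regression paths. A minimal sketch under that assumption, with all variables simulated and hypothetical rather than taken from the study:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 432
# Simulated stand-ins for one trait and the serial mediators
conscientiousness = rng.normal(0, 1, n)
self_efficacy = 0.4 * conscientiousness + rng.normal(0, 1, n)
work_hours = 0.3 * self_efficacy + rng.normal(0, 1, n)
salary = 0.5 * work_hours + rng.normal(0, 1, n)

def slope(y, *predictors):
    # OLS slope of y on the LAST predictor, controlling for the others
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0][-1]

a = slope(self_efficacy, conscientiousness)                      # X -> M1
d = slope(work_hours, conscientiousness, self_efficacy)          # M1 -> M2 | X
b = slope(salary, conscientiousness, self_efficacy, work_hours)  # M2 -> Y | X, M1
print(f"serial indirect effect (a*d*b): {a * d * b:.3f}")
```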

Relevance:

30.00%

Publisher:

Abstract:

The extravasation of CD4(+) effector/memory T cells (TEM cells) across the blood-brain barrier (BBB) is a crucial step in the pathogenesis of experimental autoimmune encephalomyelitis (EAE) and multiple sclerosis (MS). Endothelial ICAM-1 and ICAM-2 are essential for CD4(+) TEM-cell crawling on the BBB prior to diapedesis. Here, we investigated the influence of cell surface levels of endothelial ICAM-1 in determining the cellular route of CD4(+) TEM-cell diapedesis across cytokine-treated primary mouse BBB endothelial cells under physiological flow. Inflammatory conditions inducing high levels of endothelial ICAM-1 promoted rapid initiation of transcellular diapedesis of CD4(+) T cells across the BBB, while intermediate levels of endothelial ICAM-1 favored paracellular CD4(+) T-cell diapedesis. Importantly, the route of T-cell diapedesis across the BBB was independent of loss of BBB barrier properties. Unexpectedly, a low number of CD4(+) TEM cells was found to cross the inflamed BBB in the absence of endothelial ICAM-1 and ICAM-2 via an apparently alternatively regulated transcellular pathway. In vivo, this translated into the development of ameliorated EAE in ICAM-1(null)/ICAM-2(-/-) C57BL/6J mice. Taken together, our study demonstrates that cell surface levels of endothelial ICAM-1, rather than the inflammatory stimulus or BBB integrity, influence the pathway of T-cell diapedesis across the BBB.

Relevance:

30.00%

Publisher:

Abstract:

The Duffy antigen/receptor for chemokines, DARC, belongs to the family of atypical heptahelical chemokine receptors that do not couple to G proteins and therefore fail to transmit conventional intracellular signals. Here we show that during experimental autoimmune encephalomyelitis, an animal model of multiple sclerosis, the expression of DARC is upregulated at the blood-brain barrier. These findings are corroborated by a significantly increased number of subcortical white matter microvessels staining positive for DARC in human multiple sclerosis brains as compared to control tissue. Using an in vitro blood-brain barrier model, we demonstrated that endothelial DARC mediates the abluminal-to-luminal transport of inflammatory chemokines across the blood-brain barrier. An involvement of DARC in experimental autoimmune encephalomyelitis pathogenesis was confirmed by the amelioration of experimental autoimmune encephalomyelitis observed in Darc(-/-) C57BL/6 and SJL mice compared to wild-type control littermates. Experimental autoimmune encephalomyelitis studies in bone marrow chimeric Darc(-/-) and wild-type mice revealed that the increased plasma levels of inflammatory chemokines in experimental autoimmune encephalomyelitis depended on the presence of erythrocyte DARC. However, fully developed experimental autoimmune encephalomyelitis required the expression of endothelial DARC. Taken together, our data show a role for erythrocyte DARC as a chemokine reservoir and indicate that endothelial DARC contributes to the pathogenesis of experimental autoimmune encephalomyelitis by shuttling chemokines across the blood-brain barrier.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Impaired manual dexterity is frequent and disabling in patients with multiple sclerosis (MS), affecting activities of daily living (ADL) and quality of life. OBJECTIVE We aimed to evaluate the effectiveness of a standardized, home-based training program to improve manual dexterity and dexterity-related ADL in MS patients. METHODS This was a randomized, rater-blinded controlled trial. Thirty-nine MS patients reporting impaired manual dexterity and showing a pathological Coin Rotation Task (CRT), Nine-Hole Peg Test (9HPT), or both were randomized 1:1 into two standardized training programs: the dexterity training program and the theraband training program. Patients in both programs trained five days per week over a period of 4 weeks. Primary outcome measures, performed at baseline and after 4 weeks, were the CRT, the 9HPT, and a dexterity-related ADL questionnaire. Secondary outcome measures were the Chedoke Arm and Hand Activity Inventory (CAHAI-8) and the JAMAR test. RESULTS The dexterity training program resulted in significant improvements in almost all outcome measures at study end compared with baseline. The theraband training program resulted in mostly non-significant improvements. CONCLUSION The home-based dexterity training program significantly improved manual dexterity and dexterity-related ADL in moderately disabled MS patients. Trial Registration NCT01507636.

Relevance:

30.00%

Publisher:

Abstract:

Land-use intensification is a key driver of biodiversity change. However, little is known about how it alters relationships between the diversities of different taxonomic groups, which are often correlated due to shared environmental drivers and trophic interactions. Using data from 150 grassland sites, we examined how land-use intensification (increased fertilization, higher livestock densities, and increased mowing frequency) altered correlations between the species richness of 15 plant, invertebrate, and vertebrate taxa. We found that 54% of pairwise correlations between taxonomic groups were significant and positive across all grasslands, while only one was negative. Higher land-use intensity substantially weakened these correlations (a 35% decrease in r and 43% fewer significant pairwise correlations at high intensity), a pattern that may emerge from biodiversity declines and the breakdown of specialized relationships under these conditions. Nevertheless, some groups (Coleoptera, Heteroptera, Hymenoptera, and Orthoptera) were consistently correlated with multidiversity, an aggregate measure of total biodiversity composed of the standardized diversities of multiple taxa, at both high and low land-use intensity. The form of intensification was also important: increased fertilization and mowing frequency typically weakened plant-plant and plant-primary consumer correlations, whereas grazing intensification did not. This may reflect decreased habitat heterogeneity under mowing and fertilization and increased habitat heterogeneity under grazing. While these results urge caution in using certain taxonomic groups to monitor impacts of agricultural management on biodiversity, they also suggest that the diversities of some groups are reasonably robust indicators of total biodiversity across a range of conditions. Read More: http://www.esajournals.org/doi/10.1890/14-1307.1
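The pairwise-correlation analysis summarized above can be sketched computationally: correlate species richness between every pair of taxonomic groups across sites and count the significant positive pairs. The site-by-taxon richness matrix below is simulated, and the shared-driver structure is a hypothetical assumption for illustration, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_sites, n_taxa = 150, 15
shared = rng.normal(0, 1, n_sites)  # a shared environmental driver across taxa
# Simulated richness: shared driver plus taxon-specific noise
richness = shared[:, None] + rng.normal(0, 1, (n_sites, n_taxa))

sig_pos = total = 0
for i in range(n_taxa):
    for j in range(i + 1, n_taxa):
        r, p = stats.pearsonr(richness[:, i], richness[:, j])
        total += 1
        sig_pos += (p < 0.05) and (r > 0)
print(f"{sig_pos}/{total} pairwise correlations significant and positive")
```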

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Hepatitis B virus (HBV) genotypes can influence treatment outcome in HBV-monoinfected and human immunodeficiency virus (HIV)/HBV-coinfected patients. Tenofovir disoproxil fumarate (TDF) plays a pivotal role in antiretroviral therapy (ART) of HIV/HBV-coinfected patients. The influence of HBV genotypes on the response to antiviral drugs, particularly TDF, is poorly understood. METHODS HIV/HBV-coinfected participants with detectable HBV DNA prior to TDF therapy were selected from the Swiss HIV Cohort Study. HBV genotypes were identified and resistance testing was performed prior to antiviral therapy and in patients with a delayed treatment response (>6 months). The efficacy of TDF in suppressing HBV (HBV DNA <20 IU/mL) and the influence of HBV genotypes were determined. RESULTS We identified 143 HIV/HBV-coinfected participants with detectable HBV DNA. The predominant HBV genotypes were A (82 patients, 57%) and D (35 patients, 24%); 20 patients (14%) were infected with multiple genotypes (3% A + D and 11% A + G), and genotypes B, C, and E were each present in two patients (1%). TDF completely suppressed HBV DNA in 131 patients (92%) within 6 months; in 12 patients (8%), HBV DNA suppression was delayed. No HBV resistance mutations to TDF were found in patients with a delayed response, but all were infected with HBV genotype A (among these, 5 patients with genotype A + G), and all had previously been exposed to lamivudine. CONCLUSION In HIV/HBV-coinfected patients, infection with multiple HBV genotypes was more frequent than previously reported. The large majority of patients had an undetectable HBV viral load after six months of TDF-containing ART. In patients without viral suppression, no TDF-related resistance mutations were found. The roles of specific genotypes and prior lamivudine treatment in the delayed response to TDF warrant further investigation.

Relevance:

30.00%

Publisher:

Abstract:

HIV infection is an important risk factor for developing Kaposi sarcoma (KS), but it is unclear whether HIV-positive persons are also at increased risk of co-infection with human herpesvirus 8 (HHV-8), the infectious cause of KS. We systematically searched the literature up to December 2012 and included studies reporting HHV-8 seroprevalence for HIV-positive and HIV-negative persons. We used random-effects meta-analysis to combine odds ratios (ORs) of the association between HIV and HHV-8 seropositivity and conducted random-effects meta-regression to identify sources of heterogeneity. We included 93 studies with 58,357 participants from 32 countries in sub-Saharan Africa, North and South America, Europe, Asia, and Australia. Overall, HIV-positive persons were more likely to be HHV-8 seropositive than HIV-negative persons (OR 1.99, 95% confidence interval [CI] 1.70-2.34), with considerable heterogeneity among studies (I(2) 84%). The association was strongest in men who have sex with men (MSM; OR 3.95, 95% CI 2.92-5.35), patients with hemophilia (OR 3.11, 95% CI 1.19-8.11), and children (OR 2.45, 95% CI 1.58-3.81), but weaker in heterosexuals who engage in low-risk (OR 1.42, 95% CI 1.16-1.74) or high-risk sexual behavior (OR 1.66, 95% CI 1.27-2.17), persons who inject drugs (OR 1.66, 95% CI 1.28-2.14), and pregnant women (OR 1.68, 95% CI 1.15-2.47); p value for interaction <0.001. In conclusion, HIV infection was associated with an increased HHV-8 seroprevalence in all population groups examined. A better understanding of HHV-8 transmission in different age and behavioral groups is needed to develop strategies to prevent HHV-8 transmission.
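The random-effects pooling of odds ratios named above is commonly done with the DerSimonian-Laird estimator; a minimal sketch follows. The example ORs and standard errors are invented for illustration, not the review's data.

```python
import numpy as np

# Log odds ratios and their standard errors from hypothetical studies
log_or = np.log(np.array([2.1, 1.5, 3.2, 1.8, 2.6]))
se = np.array([0.30, 0.25, 0.40, 0.20, 0.35])

# Fixed-effect (inverse-variance) pooled estimate, used to compute Q
w_fixed = 1 / se**2
fixed = np.sum(w_fixed * log_or) / np.sum(w_fixed)

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2
q = np.sum(w_fixed * (log_or - fixed) ** 2)
k = len(log_or)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

# Random-effects weights include the between-study variance
w_rand = 1 / (se**2 + tau2)
pooled = np.sum(w_rand * log_or) / np.sum(w_rand)
se_pooled = np.sqrt(1 / np.sum(w_rand))
i2 = max(0.0, (q - (k - 1)) / q) * 100  # I^2 heterogeneity statistic

print(f"pooled OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-"
      f"{np.exp(pooled + 1.96 * se_pooled):.2f}), I^2 = {i2:.0f}%")
```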

Relevance:

30.00%

Publisher:

Abstract:

In sports games, it is often necessary to perceive a large number of moving objects (e.g., the ball and players). In this context, the role of peripheral vision in processing motion information in the periphery is often discussed, especially when motor responses are required. To test the basal functionality of peripheral vision in such sports-game situations, we chose a Multiple Object Tracking (MOT) task, which requires tracking a certain number of targets amidst distractors. Participants' primary task was to recall four targets (out of 10 rectangular stimuli) after six seconds of quasi-random motion. As a secondary task, a button had to be pressed if a target change occurred (Exp 1: stop vs. form change to a diamond for 0.5 s; Exp 2: stop vs. slowdown for 0.5 s). Eccentricity of the change (5-10° vs. 15-20°) was manipulated, and decision accuracy (recall and button press correct), motor response time, and saccadic reaction time were calculated as dependent variables. Results show that participants indeed used peripheral vision to detect changes, because either no or very late saccades to the changed target were executed in correct trials. Moreover, saccades were executed more often when eccentricities were small. Response accuracies were higher and response times lower in the stop conditions of both experiments, while larger eccentricities led to higher response times in all conditions. In sum, monitoring targets and detecting changes can be accomplished by peripheral vision alone, and a monitoring strategy based on peripheral vision may be the optimal one, as saccades may carry certain costs. Further research is planned to address whether this functionality is also evident in sports tasks.

Relevance:

30.00%

Publisher:

Abstract:

In sports games, it is often necessary to perceive a large number of moving objects (e.g., the ball and players). In this context, the role of peripheral vision in processing motion information in the periphery is often discussed, especially when motor responses are required. To test the capability of using peripheral vision in such sports-game situations, we chose a Multiple-Object-Tracking task, which requires tracking a certain number of targets amidst distractors, to determine the sensitivity of detecting target changes with peripheral vision only. Participants' primary task was to recall four targets (out of 10 rectangular stimuli) after six seconds of quasi-random motion. As a secondary task, a button had to be pressed if a target change occurred (Exp 1: stop vs. form change to a diamond for 0.5 s; Exp 2: stop vs. slowdown for 0.5 s). Eccentricity of the change (5-10° vs. 15-20°) was manipulated; decision accuracy (recall and button press correct), motor response time, and saccadic reaction time (change onset to saccade onset) were calculated; and eye movements were recorded. Results show that participants indeed used peripheral vision to detect changes, because either no or very late saccades to the changed target were executed in correct trials. Moreover, saccades were executed more often when eccentricities were small. Response accuracies were higher and response times lower in the stop conditions of both experiments, while larger eccentricities led to higher response times in all conditions. In sum, monitoring targets and detecting changes can be accomplished by peripheral vision alone, and a monitoring strategy based on peripheral vision may be the optimal one, as saccades may carry certain costs. Further research is planned to address whether this functionality is also evident in sports tasks.
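The saccadic reaction time measure defined here (change onset to saccade onset) is typically derived from gaze recordings with a velocity-threshold detector. The following is a minimal sketch under assumed parameters (500 Hz sampling, a 30°/s saccade criterion, a simulated gaze trace), not the study's actual processing pipeline.

```python
import numpy as np

fs = 500.0                     # assumed eye-tracker sampling rate in Hz
t = np.arange(0, 1.0, 1 / fs)  # 1 s of samples, target change at t = 0
# Simulated horizontal gaze: steady fixation, then a 10-degree saccade at 250 ms
gaze = np.where(t < 0.25, 0.0, 10.0 / (1 + np.exp(-(t - 0.27) * 200)))
gaze += np.random.default_rng(4).normal(0, 0.01, t.size)  # measurement noise

# Sample-to-sample angular velocity in deg/s
velocity = np.abs(np.gradient(gaze, 1 / fs))
threshold = 30.0  # a commonly used velocity criterion for saccade onset, deg/s
above = np.flatnonzero(velocity > threshold)

if above.size:
    srt_ms = t[above[0]] * 1000
    print(f"saccadic reaction time ~ {srt_ms:.0f} ms after change onset")
else:
    print("no saccade detected (change handled by peripheral vision alone)")
```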

Relevance:

30.00%

Publisher:

Abstract:

Evidence-based public health requires knowledge about the successful dissemination of public health measures. This study analyses (a) the changes in worksite tobacco prevention (TP) in the Canton of Zurich, Switzerland, between 2007 and 2009; (b1) the results of a multistep versus a “brochure only” dissemination strategy; (b2) the results of a monothematic versus a comprehensive dissemination strategy, each aiming to get companies to adopt TP measures; and (c) whether worksite TP is associated with health-related outcomes. A longitudinal design with randomized control groups was applied. Data on worksite TP and health-related outcomes were gathered by a written questionnaire (baseline

Relevance:

30.00%

Publisher:

Abstract:

Aims: The aim of this study is to explore the migration (colonization of new areas) and subsequent population expansion (within an area) since 15 ka cal BP of Abies, Fagus, Picea, and Quercus into and through the Alps, solely on the basis of high-quality pollen data. Methods: Chronologies of 101 pollen sequences are improved or created. Data from the area delimited by 45.5-48.1°N and 6-14°E are summarized in three ways: (1) in a selection of pollen-percentage threshold maps (thresholds 0.5%, 1%, 2%, 4%, 8%, 16%, and 32% of land pollen); (2) in graphic summaries of 250-year time slices and geographic segments (lengthwise and transverse in relation to the main axis of the Alps) as pollen-percentage curves, pollen-percentage difference curves, and pollen-percentage threshold ages cal BP graphed against both the lengthwise and the transverse Alpine axes; and (3) in tables showing statistical relationships of either pollen-percentage threshold ages cal BP or pollen expansion durations (i.e., the time elapsed between different pollen-percentage threshold ages cal BP) with latitude, longitude, and elevation; to establish these relationships we used both simple linear regression and multiple linear regression after stepwise-forward selection. Results: The statistical results indicate that (a) the use of pollen-percentage thresholds between 0.5% and 8% yields mostly similar directions of tree migration, so the method is fairly robust, and (b) Abies migrated northward, Fagus southward, Picea westward, and Quercus northward; more detail does not emerge due to an extreme scarcity of high-quality data, especially along the southern foothills of the Alps and in the eastern Alps. This scarcity allows only one immigration route of Abies into the southern Alps to be reconstructed. The speed of population expansion (following arrival) increased for Abies and decreased for Picea during the Holocene; for Fagus it decreased especially during the later Holocene, and for Quercus it increased especially at the start of the Holocene.
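Multiple linear regression after stepwise-forward selection, as named in the Methods, can be sketched as follows: repeatedly add the candidate predictor with the smallest p-value until no remaining candidate is significant. The data frame, column names, and the 0.05 entry criterion are hypothetical placeholders, not the study's dataset or exact procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 101
df = pd.DataFrame({
    "lat": rng.uniform(45.5, 48.1, n),    # latitude, degrees N
    "lon": rng.uniform(6.0, 14.0, n),     # longitude, degrees E
    "elev": rng.uniform(200, 2200, n),    # elevation, m
})
# Simulated pollen-percentage threshold age (cal BP) driven by lat and elev
df["threshold_age"] = 9000 + 400 * df.lat - 0.5 * df.elev + rng.normal(0, 300, n)

selected, remaining = [], ["lat", "lon", "elev"]
while remaining:
    # Fit each candidate on top of the already-selected predictors
    pvals = {}
    for cand in remaining:
        formula = "threshold_age ~ " + " + ".join(selected + [cand])
        pvals[cand] = smf.ols(formula, data=df).fit().pvalues[cand]
    best = min(pvals, key=pvals.get)
    if pvals[best] >= 0.05:  # stop when no candidate improves the model significantly
        break
    selected.append(best)
    remaining.remove(best)

print("selected predictors:", selected)
```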

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Delayed-onset muscle soreness (DOMS) is a common symptom in people participating in exercise, sport, or recreational physical activities. Several remedies have been proposed to prevent and alleviate DOMS. DESIGN AND METHODS A five-arm randomized controlled study was conducted to examine the effects of acupuncture on eccentric exercise-induced DOMS of the biceps brachii muscle. Participants were recruited through convenience sampling of students and the general public. Participants were randomly allocated to needle, laser, sham needle, or sham laser acupuncture, or no intervention. Outcome measures included pressure pain threshold (PPT), pain intensity (visual analog scale), and maximum isometric voluntary force. RESULTS Delayed-onset muscle soreness was induced in 60 participants (22 females; age 23.6 ± 2.8 years, weight 66.1 ± 9.6 kg, height 171.6 ± 7.9 cm). Neither verum nor sham interventions significantly improved outcomes within 72 hours when compared with the no-treatment control (P > 0.05). CONCLUSIONS Acupuncture was not effective in the treatment of DOMS. From a mechanistic point of view, these results have implications for further studies: (1) considering the high-threshold mechanosensitive nociceptors of the muscle, the cutoff for PPT (5 kg/cm²) chosen to avoid bruising might have led to ceiling effects; (2) the traditional acupuncture regimen targeting muscle pain might have been inappropriate, as the DOMS mechanisms seem limited to the muscular unit and its innervation. Therefore, a regionally based regimen including intensified intramuscular needling (dry needling) should be tested in future studies, using a higher cutoff for PPT to avoid ceiling effects.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Knee osteoarthritis is a leading cause of chronic pain, disability, and decreased quality of life. Despite the long-standing use of intra-articular corticosteroids, there is an ongoing debate about their benefits and safety. This is an update of a Cochrane review first published in 2005. OBJECTIVES To determine the benefits and harms of intra-articular corticosteroids compared with sham or no intervention in people with knee osteoarthritis in terms of pain, physical function, quality of life, and safety. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, and EMBASE (from inception to 3 February 2015), checked trial registers, conference proceedings, and reference lists, and contacted authors. SELECTION CRITERIA We included randomised or quasi-randomised controlled trials that compared intra-articular corticosteroids with sham injection or no treatment in people with knee osteoarthritis. We applied no language restrictions. DATA COLLECTION AND ANALYSIS We calculated standardised mean differences (SMDs) and 95% confidence intervals (CIs) for pain, function, quality of life, and joint space narrowing, and risk ratios (RRs) for safety outcomes. We combined trials using an inverse-variance random-effects meta-analysis. MAIN RESULTS We identified 27 trials (13 new studies) with 1767 participants in this update. We graded the quality of the evidence as 'low' for all outcomes because treatment effect estimates were inconsistent, with great variation across trials; pooled estimates were imprecise and did not rule out clinically relevant or irrelevant effects; and most trials had a high or unclear risk of bias. Intra-articular corticosteroids appeared to be more beneficial in pain reduction than control interventions (SMD -0.40, 95% CI -0.58 to -0.22), which corresponds to a difference in pain scores of 1.0 cm on a 10-cm visual analogue scale between corticosteroids and sham injection and translates into a number needed to treat for an additional beneficial outcome (NNTB) of 8 (95% CI 6 to 13). An I(2) statistic of 68% indicated considerable between-trial heterogeneity. A visual inspection of the funnel plot suggested some asymmetry (asymmetry coefficient -1.21, 95% CI -3.58 to 1.17). When stratifying results according to length of follow-up, benefits were moderate at 1 to 2 weeks after end of treatment (SMD -0.48, 95% CI -0.70 to -0.27), small to moderate at 4 to 6 weeks (SMD -0.41, 95% CI -0.61 to -0.21), small at 13 weeks (SMD -0.22, 95% CI -0.44 to 0.00), and there was no evidence of an effect at 26 weeks (SMD -0.07, 95% CI -0.25 to 0.11). An I(2) statistic of ≥63% indicated a moderate to large degree of between-trial heterogeneity up to 13 weeks after end of treatment (P for heterogeneity ≤0.001), and an I(2) of 0% indicated low heterogeneity at 26 weeks (P = 0.43). There was evidence of lower treatment effects in trials that randomised on average at least 50 participants per group (P = 0.05) or at least 100 participants per group (P = 0.013), in trials that used concomitant viscosupplementation (P = 0.08), and in trials that used concomitant joint lavage (P ≤ 0.001). Corticosteroids appeared to be more effective in function improvement than control interventions (SMD -0.33, 95% CI -0.56 to -0.09), which corresponds to a difference in function scores of -0.7 units on the standardised Western Ontario and McMaster Universities Arthritis Index (WOMAC) disability scale ranging from 0 to 10 and translates into an NNTB of 10 (95% CI 7 to 33). An I(2) statistic of 69% indicated a moderate to large degree of between-trial heterogeneity. A visual inspection of the funnel plot suggested asymmetry (asymmetry coefficient -4.07, 95% CI -8.08 to -0.05). When stratifying results according to length of follow-up, benefits were small to moderate at 1 to 2 weeks after end of treatment (SMD -0.43, 95% CI -0.72 to -0.14), small to moderate at 4 to 6 weeks (SMD -0.36, 95% CI -0.63 to -0.09), and there was no evidence of an effect at 13 weeks (SMD -0.13, 95% CI -0.37 to 0.10) or at 26 weeks (SMD 0.06, 95% CI -0.16 to 0.28). An I(2) statistic of ≥62% indicated a moderate to large degree of between-trial heterogeneity up to 13 weeks after end of treatment (P for heterogeneity ≤0.004), and an I(2) of 0% indicated low heterogeneity at 26 weeks (P = 0.52). We found evidence of lower treatment effects in trials that randomised on average at least 50 participants per group (P = 0.023), in unpublished trials (P = 0.023), in trials that used non-intervention controls (P = 0.031), and in trials that used concomitant viscosupplementation (P = 0.06). Participants on corticosteroids were 11% less likely to experience adverse events, but confidence intervals included the null effect (RR 0.89, 95% CI 0.64 to 1.23, I(2) = 0%). Participants on corticosteroids were 67% less likely to withdraw because of adverse events, but confidence intervals were wide and included the null effect (RR 0.33, 95% CI 0.05 to 2.07, I(2) = 0%). Participants on corticosteroids were 27% less likely to experience any serious adverse event, but confidence intervals were wide and included the null effect (RR 0.63, 95% CI 0.15 to 2.67, I(2) = 0%). We found no evidence of an effect of corticosteroids on quality of life compared to control (SMD -0.01, 95% CI -0.30 to 0.28, I(2) = 0%). There was also no evidence of an effect of corticosteroids on joint space narrowing compared to control interventions (SMD -0.02, 95% CI -0.49 to 0.46). AUTHORS' CONCLUSIONS Whether there are clinically important benefits of intra-articular corticosteroids one to six weeks after injection remains unclear in view of the overall quality of the evidence, the considerable heterogeneity between trials, and the evidence of small-study effects. A single trial included in this review described adequate measures to minimise biases and did not find any benefit of intra-articular corticosteroids. In this update of the systematic review and meta-analysis, we found that most of the identified trials comparing intra-articular corticosteroids with sham or non-intervention controls were small and hampered by low methodological quality. An analysis of multiple time points suggested that effects decrease over time, and our analysis provided no evidence that an effect remains six months after a corticosteroid injection.
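The inverse-variance random-effects pooling of SMDs described in this review can be sketched as follows, including the back-conversion of a pooled SMD to the 10-cm VAS via an assumed typical control-group standard deviation. The per-trial SMDs, standard errors, and the 2.5 cm SD are invented for illustration, not the review's data.

```python
import numpy as np

smd = np.array([-0.55, -0.30, -0.70, -0.25, -0.45])  # per-trial SMDs (pain)
se = np.array([0.18, 0.15, 0.25, 0.12, 0.20])        # their standard errors

# Fixed-effect pooled estimate, used to compute Cochran's Q
w = 1 / se**2
fixed = np.sum(w * smd) / np.sum(w)
q = np.sum(w * (smd - fixed) ** 2)
k = len(smd)

# DerSimonian-Laird between-trial variance tau^2, then random-effects weights
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (se**2 + tau2)
pooled = np.sum(w_re * smd) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
lo, hi = pooled - 1.96 * se_re, pooled + 1.96 * se_re
i2 = max(0.0, (q - (k - 1)) / q) * 100  # I^2 heterogeneity statistic

sd_vas = 2.5  # assumed typical SD of pain on a 10-cm VAS
print(f"pooled SMD {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f}), I^2 = {i2:.0f}%")
print(f"~ {pooled * sd_vas:.1f} cm difference on a 10-cm VAS")
```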