998 results for Controlled fusion
Abstract:
Adult neurogenesis is regulated by the neurogenic niche, through mechanisms that remain poorly defined. Here, we investigated whether niche-constituting astrocytes influence the maturation of adult-born hippocampal neurons using two independent transgenic approaches to block vesicular release from astrocytes. In these models, adult-born neurons but not mature neurons showed reduced glutamatergic synaptic input and dendritic spine density, accompanied by lower functional integration and cell survival. By taking advantage of the mosaic expression of transgenes in astrocytes, we found that spine density was reduced exclusively in segments intersecting blocked astrocytes, revealing an extrinsic, local control of spine formation. Defects in NMDA receptor (NMDAR)-mediated synaptic transmission and dendrite maturation were partially restored by exogenous D-serine, whose extracellular level was decreased in transgenic models. Together, these results reveal a critical role for adult astrocytes in local dendritic spine maturation, which is necessary for the NMDAR-dependent functional integration of newborn neurons.
Abstract:
BACKGROUND: Antiretroviral regimens containing tenofovir disoproxil fumarate have been associated with renal toxicity and reduced bone mineral density. Tenofovir alafenamide is a novel tenofovir prodrug that reduces tenofovir plasma concentrations by 90%, thereby decreasing off-target side-effects. We aimed to assess whether efficacy, safety, and tolerability were non-inferior in patients switched to a regimen containing tenofovir alafenamide versus in those remaining on one containing tenofovir disoproxil fumarate. METHODS: In this randomised, actively controlled, multicentre, open-label, non-inferiority trial, we recruited HIV-1-infected adults from Gilead clinical studies at 168 sites in 19 countries. Patients were virologically suppressed (HIV-1 RNA <50 copies per mL) with an estimated glomerular filtration rate of 50 mL per min or greater, and were taking one of four tenofovir disoproxil fumarate-containing regimens for at least 96 weeks before enrolment. With use of a third-party computer-generated sequence, patients were randomly assigned (2:1) to receive a once-a-day single-tablet containing elvitegravir 150 mg, cobicistat 150 mg, emtricitabine 200 mg, and tenofovir alafenamide 10 mg (tenofovir alafenamide group) or to carry on taking one of four previous tenofovir disoproxil fumarate-containing regimens (tenofovir disoproxil fumarate group) for 96 weeks. Randomisation was stratified by previous treatment regimen in blocks of six. Patients and treating physicians were not masked to the assigned study regimen; outcome assessors were masked until database lock. The primary endpoint was the proportion of patients who received at least one dose of study drug who had undetectable viral load (HIV-1 RNA <50 copies per mL) at week 48. The non-inferiority margin was 12%. This study was registered with ClinicalTrials.gov, number NCT01815736. FINDINGS: Between April 12, 2013 and April 3, 2014, we enrolled 1443 patients. 
959 patients were randomly assigned to the tenofovir alafenamide group and 477 to the tenofovir disoproxil fumarate group. Viral suppression at week 48 was noted in 932 (97%) patients assigned to the tenofovir alafenamide group and in 444 (93%) assigned to the tenofovir disoproxil fumarate group (adjusted difference 4·1%, 95% CI 1·6-6·7), with virological failure noted in ten and six patients, respectively. The number of adverse events was similar between the two groups, but study drug-related adverse events were more common in the tenofovir alafenamide group (204 patients [21%] vs 76 [16%]). Hip and spine bone mineral density and glomerular filtration were each significantly improved in patients in the tenofovir alafenamide group compared with those in the tenofovir disoproxil fumarate group. INTERPRETATION: Switching to a tenofovir alafenamide-containing regimen from one containing tenofovir disoproxil fumarate was non-inferior for maintenance of viral suppression and led to improved bone mineral density and renal function. Longer term follow-up is needed to better understand the clinical impact of these changes. FUNDING: Gilead Sciences.
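As a reading aid for the statistics above: the reported difference in suppression rates can be approximated from the raw counts given in the abstract using an unadjusted risk difference with a Wald confidence interval. This is only an illustrative sketch; the published 4·1% (1·6 to 6·7) figure is an adjusted estimate from the trial's own analysis.

```python
import math

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Unadjusted risk difference p1 - p2 with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Suppression counts reported in the abstract:
# 932/959 (tenofovir alafenamide) vs 444/477 (tenofovir disoproxil fumarate).
diff, lo, hi = risk_difference_ci(932, 959, 444, 477)
print(f"difference = {diff:.1%}, 95% CI ({lo:.1%} to {hi:.1%})")

# Non-inferiority (12% margin) holds if the lower CI bound exceeds -12%.
print("non-inferior:", lo > -0.12)
```

With these counts, the unadjusted estimate lands very close to the adjusted one reported in the paper, and the lower bound sits well above the -12% non-inferiority margin.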
Abstract:
BACKGROUND: The diagnosis of Pulmonary Embolism (PE) in the emergency department (ED) is crucial. As emergency physicians fear missing this potential life-threatening condition, PE tends to be over-investigated, exposing patients to unnecessary risks and uncertain benefit in terms of outcome. The Pulmonary Embolism Rule-out Criteria (PERC) is an eight-item block of clinical criteria that can identify patients who can safely be discharged from the ED without further investigation for PE. The endorsement of this rule could markedly reduce the number of irradiative imaging studies, ED length of stay, and rate of adverse events resulting from both diagnostic and therapeutic interventions. Several retrospective and prospective studies have shown the safety and benefits of the PERC rule for PE diagnosis in low-risk patients, but the validity of this rule is still controversial. We hypothesize that in European patients with a low gestalt clinical probability and who are PERC-negative, PE can be safely ruled out and the patient discharged without further testing. METHODS/DESIGN: This is a controlled, cluster randomized trial, in 15 centers in France. Each center will be randomized for the sequence of intervention periods: a 6-month intervention period (PERC-based strategy) followed by a 6-month control period (usual care), or in reverse order, with 2 months of "wash-out" between the 2 periods. Adult patients presenting to the ED with a suspicion of PE and a low pre-test probability estimated by clinical gestalt will be eligible. The primary outcome is the percentage of failure resulting from the diagnostic strategy, defined as diagnosed venous thromboembolic events at 3-month follow-up, among patients for whom PE has been initially ruled out. DISCUSSION: The PERC rule has the potential to decrease the number of irradiative imaging studies in the ED, and is reported to be safe. However, no randomized study has ever validated the safety of PERC.
Furthermore, some studies have challenged the safety of a PERC-based strategy to rule out PE, especially in Europe, where the prevalence of PE diagnosed in the ED is high. The PROPER study should provide high-quality evidence to settle this issue. If it confirms the safety of the PERC rule, physicians will be able to reduce the number of investigations, associated subsequent adverse events, costs, and ED length of stay for patients with a low clinical probability of PE. TRIAL REGISTRATION: NCT02375919.
Abstract:
The fusion of bone marrow (BM) hematopoietic cells with hepatocytes to generate BM-derived hepatocytes (BMDH) is a natural process, which is enhanced in damaged tissues. However, the reprogramming needed to generate BMDH and the identity of the resultant cells are essentially unknown. In a mouse model of chronic liver damage, here we identify a modification in the chromatin structure of the hematopoietic nucleus during BMDH formation, accompanied by the loss of the key hematopoietic transcription factor PU.1/Sfpi1 (SFFV proviral integration 1) and gain of the key hepatic transcriptional regulator HNF-1 homeobox A (HNF-1A/Hnf1a). Through genome-wide expression analysis of laser-captured BMDH, a differential gene expression pattern was detected, and the chromatin changes observed were confirmed at the level of chromatin regulator genes. Similarly, Transforming Growth Factor-β1 (TGF-β1) and neurotransmitter (e.g. Prostaglandin E Receptor 4 [Ptger4]) pathway genes were over-expressed. In summary, in vivo BMDH generation is a process in which the hematopoietic cell nucleus changes its identity and acquires hepatic features. These BMDHs have their own cell identity, characterized by an expression pattern different from that of hematopoietic cells or hepatocytes. The role of these BMDHs in the liver requires further investigation.
Abstract:
BACKGROUND: Patients undergoing emergency gastrointestinal surgery for intra-abdominal infection are at risk of invasive candidiasis (IC) and candidates for preemptive antifungal therapy. METHODS: This exploratory, randomized, double-blind, placebo-controlled trial assessed a preemptive antifungal approach with micafungin (100 mg/d) in intensive care unit patients requiring surgery for intra-abdominal infection. Coprimary efficacy variables were the incidence of IC and the time from baseline to first IC in the full analysis set; an independent data review board confirmed IC. An exploratory biomarker analysis was performed using logistic regression. RESULTS: The full analysis set comprised 124 placebo- and 117 micafungin-treated patients. The incidence of IC was 8.9% for placebo and 11.1% for micafungin (difference, 2.24% [95% confidence interval, -5.52 to 10.20]). There was no difference between the arms in median time to IC. The estimated odds ratio showed that patients with a positive (1,3)-β-d-glucan (βDG) result were 3.66 (95% confidence interval, 1.01-13.29) times more likely to have confirmed IC than those with a negative result. CONCLUSIONS: This study was unable to provide evidence that preemptive administration of an echinocandin was effective in preventing IC in high-risk surgical intensive care unit patients with intra-abdominal infections. This may have been because the drug was administered too late to prevent IC, coupled with an overall low number of IC events. It does provide some support for using βDG to identify patients at high risk of IC. CLINICAL TRIALS REGISTRATION: NCT01122368.
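For readers less familiar with odds ratios like the 3.66 reported above: an odds ratio with a Wald 95% CI can be computed directly from a 2×2 table. The counts below are purely illustrative (the trial's patient-level βDG data are not in the abstract, and the trial itself used logistic regression), so this is only a sketch of how such a figure is derived.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = test-positive with IC,    b = test-positive without IC,
    c = test-negative with IC,    d = test-negative without IC."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts chosen only to illustrate the calculation.
or_, lo, hi = odds_ratio_ci(10, 30, 8, 88)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f} to {hi:.2f})")
```

Note the wide interval that small cell counts produce; the trial's reported CI (1.01-13.29) barely excludes 1, which is why the abstract describes only "some support" for βDG.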
Abstract:
PURPOSE: To evaluate the effect of spironolactone, a mineralocorticoid receptor antagonist, in nonresolving central serous chorioretinopathy. METHODS: This is a prospective, randomized, double-blinded, placebo-controlled crossover study. Sixteen eyes of 16 patients with central serous chorioretinopathy and persistent subretinal fluid (SRF) for at least 3 months were enrolled. Patients were randomized to receive either spironolactone 50 mg or placebo once a day for 30 days, followed by a washout period of 1 week, and then crossed over to either placebo or spironolactone for another 30 days. The primary outcome measure was the change from baseline in SRF thickness at the apex of the serous retinal detachment. Secondary outcomes included subfoveal choroidal thickness and the ETDRS best-corrected visual acuity. RESULTS: The mean duration of central serous chorioretinopathy before enrollment in study eyes was 10 ± 16.9 months. Crossover data analysis showed a statistically significant reduction in SRF in spironolactone-treated eyes as compared with the same eyes under placebo (P = 0.04). Secondary analysis of the first period (Day 0-Day 30) showed a significant reduction in subfoveal choroidal thickness in treated eyes as compared with placebo (P = 0.02). No significant changes were observed in the best-corrected visual acuity. There were no complications related to treatment observed. CONCLUSION: In eyes with persistent SRF due to central serous chorioretinopathy, spironolactone significantly reduced both the SRF and the subfoveal choroidal thickness as compared with placebo.
Abstract:
OBJECTIVE: To test the hypothesis that substituting artificially sweetened beverages (ASB) for sugar-sweetened beverages (SSB) decreases intrahepatocellular lipid concentrations (IHCL) in overweight subjects with high SSB consumption. METHODS: Thirty-one healthy subjects with BMI greater than 25 kg/m(2) and a daily consumption of at least 660 ml SSB were randomized to a 12-week intervention in which they replaced SSBs with ASBs. Their IHCL (magnetic resonance spectroscopy), visceral adipose tissue volume (VAT; magnetic resonance imaging), food intake (2-day food records), and fasting blood concentrations of metabolic markers were measured after a 4-week run-in period and after a 12-week period with ASB or control (CTRL). RESULTS: Twenty-seven subjects completed the study. IHCL was reduced to 74% of the initial values with ASB (N = 14; P < 0.05) but did not change with CTRL. The decrease in IHCL attained with ASB was greater in subjects with IHCL greater than 60 mmol/l than in subjects with low IHCL. ALT decreased significantly with ASB only in subjects with IHCL greater than 60 mmol/l. There was otherwise no significant effect of ASB on body weight, VAT, or metabolic markers. CONCLUSIONS: In subjects with overweight or obesity and a high SSB intake, replacing SSB with ASB decreased intrahepatic fat over a 12-week period.
Abstract:
BACKGROUND: Cognitive deficits have been reported during the early stages of bipolar disorder; however, the role of medication on such deficits remains unclear. The aim of this study was to compare the effects of lithium and quetiapine monotherapy on cognitive performance in people following first episode mania. METHODS: The design was a single-blind, randomised controlled trial on a cohort of 61 participants following first episode mania. Participants received either lithium or quetiapine monotherapy as maintenance treatment over a 12-month follow-up period. The groups were compared on performance outcomes using an extensive cognitive assessment battery conducted at baseline, month 3 and month 12 follow-up time-points. RESULTS: There was a significant interaction between group and time in phonemic fluency at the 3-month and 12-month endpoints, reflecting greater improvements in performance in lithium-treated participants relative to quetiapine-treated participants. After controlling for multiple comparisons, there were no other significant interactions between group and time for other measures of cognition. CONCLUSION: Although the effects of lithium and quetiapine treatment were similar for most cognitive domains, the findings imply that early initiation of lithium treatment may benefit the trajectory of cognition, specifically verbal fluency in young people with bipolar disorder. Given that cognition is a major symptomatic domain of bipolar disorder and has substantive effects on general functioning, the ability to influence the trajectory of cognitive change is of considerable clinical importance.
Abstract:
BACKGROUND AND OBJECTIVES: Hepcidin is the main hormone that regulates iron balance. A fall in hepcidin favours intestinal iron absorption in cases of iron deficiency or enhanced erythropoiesis. Accurate assay of this small peptide promises new diagnostic and therapeutic strategies. Its measurement is progressively being validated, and its clinical value must now be explored in different physiological situations. Here, we evaluate hepcidin levels among premenopausal female donors with iron deficiency without anaemia. MATERIALS AND METHODS: In a preceding study, a 4-week oral iron treatment (80 mg/day) was administered in a randomized controlled trial (n = 145) in cases of iron deficiency without anaemia after a blood donation. We subsequently measured hepcidin at baseline and after 4 weeks of treatment, using mass spectrometry. RESULTS: Iron supplementation had a significant effect on plasma hepcidin compared to the placebo arm at 4 weeks (+0·29 nm [95% CI: 0·18 to 0·40]). There was a significant correlation between hepcidin and ferritin at baseline (R(2) = 0·121, P < 0·001) and after treatment (R(2) = 0·436, P < 0·001). Hepcidin levels at baseline were not predictive of concentration changes for ferritin or haemoglobin. However, hepcidin levels at 4 weeks were significantly higher (0·79 nm [95% CI: 0·53 to 1·05]) among ferritin responders. CONCLUSIONS: This study shows that a 4-week oral iron treatment increased hepcidin blood concentrations in female blood donors with an initial ferritin concentration of less than 30 ng/ml. Hepcidin apparently cannot serve as a predictor of response to iron treatment but might serve as a marker of the iron repletion needed for erythropoiesis.
Abstract:
BACKGROUND: Tinnitus is an often disabling condition for which there is no effective therapy. Current research suggests that tinnitus may develop due to maladaptive plastic changes and altered activity in the auditory and prefrontal cortex. Transcranial direct current stimulation (tDCS) modulates brain activity and has been shown to transiently suppress tinnitus in trials. OBJECTIVE: To investigate the efficacy and safety of tDCS in the treatment of chronic subjective tinnitus. METHODS: In a randomized, parallel, double-blind, sham-controlled study, the efficacy and safety of cathodal tDCS to the auditory cortex with the anode over the prefrontal cortex was investigated in five sessions over five consecutive days. Tinnitus was assessed after the last session on day 5, and at follow-up visits 1 and 3 months post-stimulation, using the Tinnitus Handicap Inventory (THI, primary outcome measure), Subjective Tinnitus Severity Scale, Hospital Anxiety and Depression scale, Visual Analogue Scale, and Clinical Global Impression scale. RESULTS: 42 patients were investigated; 21 received tDCS and 21 sham stimulation. There were no beneficial effects of tDCS on tinnitus as assessed by primary and secondary outcome measures. Effect size assessed with Cohen's d amounted to 0.08 (95% CI: -0.52 to 0.69) at 1 month and 0.18 (95% CI: -0.43 to 0.78) at 3 months for the THI. CONCLUSION: tDCS of the auditory and prefrontal cortices is safe, but does not improve tinnitus. Different tDCS protocols might be beneficial.
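The confidence intervals reported for Cohen's d above can be approximately reproduced from the effect size and the group sizes alone, using the common normal approximation to the standard error of d. This is a sketch of that approximation, not necessarily the authors' exact method.

```python
import math

def cohens_d_ci(d, n1, n2, z=1.96):
    """Approximate 95% CI for Cohen's d given only the point estimate
    and the two group sizes (normal approximation to SE of d)."""
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d - z * se, d + z * se

# THI effect sizes from the abstract, with 21 patients per arm.
print(cohens_d_ci(0.08, 21, 21))  # close to the reported (-0.52, 0.69)
print(cohens_d_ci(0.18, 21, 21))  # close to the reported (-0.43, 0.78)
```

Both intervals straddle zero by a wide margin, which is what supports the conclusion that tDCS did not improve tinnitus here.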
Abstract:
INTRODUCTION: Alcohol use is one of the leading modifiable morbidity and mortality risk factors among young adults. STUDY DESIGN: Two-arm, parallel-group randomized controlled trial with follow-up at 1 and 6 months. SETTING/PARTICIPANTS: Internet-based study in a general population sample of young men with low-risk drinking, recruited between June 2012 and February 2013. INTERVENTION: Internet-based brief alcohol primary prevention intervention (IBI). The IBI aims at preventing an increase in alcohol use: it consists of normative feedback, feedback on consequences, the calorific value of alcohol, computed blood alcohol concentration, and an indication that the reported alcohol use is associated with no or limited health risks. Intervention group participants received the IBI. Control group (CG) participants completed only an assessment. MAIN OUTCOME MEASURES: Alcohol use (number of drinks per week) and binge drinking prevalence. Analyses were conducted in 2014-2015. RESULTS: Of 4365 men invited to participate, 1633 did so; 896 reported low-risk drinking and were randomized (IBI: n = 451; CG: n = 445). At baseline, 1 and 6 months, the mean (SD) number of drinks/week was 2.4 (2.2), 2.3 (2.6), and 2.5 (3.0) for IBI, and 2.4 (2.3), 2.8 (3.7), and 2.7 (3.9) for CG. Binge drinking, absent at baseline, was reported by 14.4% (IBI) and 19.0% (CG) at 1 month and by 13.3% (IBI) and 13.0% (CG) at 6 months. At 1 month, beneficial intervention effects were observed on the number of drinks/week (p = 0.05). No significant differences were observed at 6 months. CONCLUSION: We found short-term protective effects of a primary prevention IBI. TRIAL REGISTRATION: Controlled-Trials.com ISRCTN55991918.
Abstract:
The degree of fusion at the anterior aspect of the sacral vertebrae has been scored in 242 male and female skeletons from the Lisbon documented collection, ranging in age from 16 to 59 years old. Statistical tests indicate a sex difference towards earlier fusion in young females compared with young males, as well as a clear association between degree of fusion and age. Similar results have been found in documented skeletal samples from Coimbra and Sassari, and the recommendations stated by these authors regarding age estimation have been positively tested in the Lisbon collection. Although more research from geographically diverse samples is required, a general picture of the pattern of sacral fusion and its associations with age and sex is emerging. We also provide a practical example of the usefulness of the sacrum in age estimation in a forensic setting, a mass grave from the Spanish Civil War. It is concluded that the scoring of the degree of fusion of the sacral vertebrae, especially of S1-2, can be a simple tool for assigning skeletons to broad age groups, and it should be implemented as another resource for age estimation in the study of human skeletal remains.
Abstract:
Expression of the SS18/SYT-SSX fusion protein is believed to underlie the pathogenesis of synovial sarcoma (SS). Recent evidence suggests that deregulation of the Wnt pathway may play an important role in SS but the mechanisms whereby SS18-SSX might affect Wnt signaling remain to be elucidated. Here, we show that SS18/SSX tightly regulates the elevated expression of the key Wnt target AXIN2 in primary SS. SS18-SSX is shown to interact with TCF/LEF, TLE and HDAC but not β-catenin in vivo and to induce Wnt target gene expression by forming a complex containing promoter-bound TCF/LEF and HDAC but lacking β-catenin. Our observations provide a tumor-specific mechanistic basis for Wnt target gene induction in SS that can occur in the absence of Wnt ligand stimulation.
Abstract:
BACKGROUND: Delirium is an acute cognitive impairment among older hospitalized patients. It can persist until discharge and for months afterwards. Although evidence-based nursing interventions have been proven effective in preventing delirium in acute hospitals, evidence for such interventions among home-dwelling older patients is lacking. The aim was to assess the feasibility and acceptability of a nursing intervention designed to detect and reduce delirium in older adults after discharge from hospital. METHODS: A randomized clinical pilot trial with a before/after design was used. One hundred and three older adults were recruited in a home healthcare service in French-speaking Switzerland and randomized into an experimental group (EG, n = 51) and a control group (CG, n = 52). The CG received usual homecare. The EG received usual homecare plus five additional nursing interventions at 48 and 72 h and at 7, 14 and 21 days after discharge. These interventions were tailored to detecting and reducing delirium and were conducted by a geriatric clinical nurse (GCN). All patients were monitored at the start of the study (M1) and throughout the month for symptoms of delirium (M2). This was documented in patients' records after usual homecare using the Confusion Assessment Method (CAM). At one month (M2), symptoms of delirium were measured using the CAM, cognitive status was measured using the Mini-Mental State Examination (MMSE), and functional status was measured using the Katz and Lawton Index of activities of daily living (ADL/IADL). At the end of the study, participants in the EG and homecare nurses were interviewed about the acceptability of the nursing interventions and the study itself. RESULTS: Feasibility and acceptability indicators showed excellent results. Recruitment, retention, randomization, and other procedures were efficient, although some potential issues were identified.
Participants and nurses considered organizational procedures, data collection, intervention content, the dose-effect of the interventions, and methodology all to be feasible. Duration, patient adherence, and fidelity were judged acceptable. Nurses, participants, and informal caregivers were satisfied with the relevance and safety of the interventions. CONCLUSIONS: Nursing interventions to detect and reduce delirium at home are feasible and acceptable. These results confirm that developing a large-scale randomized controlled trial would be appropriate. TRIAL REGISTRATION: ISRCTN registry no: 16103589 - 19 February 2016.