106 results for "Markovian switching"
Abstract:
The respective production of specific immunoglobulin (Ig)G2a or IgG1 within 5 days of primary immunization with Swiss type mouse mammary tumor virus [MMTV(SW)] or haptenated protein provides a model for the development of T helper 1 (Th1) and Th2 responses. The antibody-producing cells arise from cognate T cell-B cell interaction, revealed by the respective induction of Cgamma2a and Cgamma1 switch transcript production, on the third day after immunization. T cell proliferation and upregulation of mRNA for interferon gamma in response to MMTV(SW) and interleukin 4 in response to haptenated protein also start during this day. It follows that there is minimal delay in these responses between T cell priming and the onset of cognate interaction between T and B cells leading to class switching and exponential growth. The Th1 or Th2 profile is at least partially established at the time of the first cognate T cell interaction with B cells in the T zone. The addition of killed Bordetella pertussis to the hapten-protein induces nonhapten-specific IgG2a and IgG1 plasma cells, whereas the anti-hapten response continues to be IgG1 dominated. This indicates that a Th2 response to hapten-protein can proceed in a node where there is substantial Th1 activity.
Abstract:
Reproductive division of labour is a defining characteristic of eusociality in insect societies. The task of reproduction is performed by the fertile males and queens of the colony, while the non-fertile female worker caste performs all other tasks related to colony upkeep, foraging and nest defence. Division of labour, or polyethism, within the worker caste is organized such that specific tasks are performed by discrete groups of individuals. Ordinarily, workers of one group will not participate in the tasks of other groups, making the groups of workers behaviourally distinct. In some eusocial species, this has led to the evolution of a remarkable diversity of subcaste morphologies within the worker caste, and a division of labour amongst the subcastes. This caste polyethism is best represented in many species of ants, where a smaller-bodied minor subcaste typically performs foraging duties while larger individuals of the major subcaste are tasked with nest defence. Recent work suggests that polyethism in the worker caste is influenced by an evolutionarily conserved, yet diversely regulated, gene called foraging (for), which encodes a cGMP-dependent protein kinase (PKG). Additionally, flexibility in the activity of this enzyme allows workers from one task group to assist the workers of other task groups in times of need during the colony's life.
Stability-dependent behavioural and electro-cortical reorganizations during bimanual switching tasks
Abstract:
This study investigated behavioural and electro-cortical reorganizations accompanying intentional switching between two distinct bimanual coordination tapping modes (In-phase and Anti-phase) that differ in stability when produced at the same movement rate. We expected that switching to a less stable tapping mode (In-to-Anti switching) would lead to larger behavioural perturbations and require more neural resources than switching to a more stable tapping mode (Anti-to-In switching). Behavioural results confirmed that the In-to-Anti switching lasted longer than the Anti-to-In switching. A general increase in attention-related neural activity was found at the moment of switching for both conditions. Additionally, two condition-dependent EEG reorganizations were observed. First, a specific increase in cortico-cortical coherence appeared exclusively during the In-to-Anti switching. This result may reflect a strengthening of inter-regional communication in order to engage the subsequent, less stable, tapping mode. Second, a decrease in motor-related neural activity (increased beta spectral power) was found for the Anti-to-In switching only. The latter effect may reflect the interruption of the previous, less stable, tapping mode. Given that previous results on spontaneous Anti-to-In switching revealed an inverse pattern of EEG reorganization (decreased beta spectral power), the present findings give new insight into the stability-dependent neural correlates of intentional motor switching. © 2010 Elsevier Ireland Ltd. All rights reserved.
Abstract:
It is known that post-movement beta synchronization (PMBS) is involved both in active inhibition and in sensory reafference processes. The aim of this study was to examine the temporal and spatial dynamics of the PMBS involved during a multi-limb coordination task. We investigated post-switching beta synchronization (assigned PMBS) using time-frequency and source estimation analyses. Participants (n = 17) initiated an auditory-paced bimanual tapping. After a 1500 ms preparatory period, an imperative stimulus required participants either to selectively stop the left-hand tapping while maintaining the right-hand unimanual tapping (Switch condition: SWIT) or to continue the bimanual tapping (Continue condition: CONT). PMBS significantly increased in SWIT compared to CONT, with the maximal difference within the right central region in the broad 14-30 Hz band and within the left central region in the restricted 22-26 Hz band. Source estimations localized these effects within the right pre-frontal cortex and the left parietal cortex, respectively. A negative correlation showed that participants with a low percentage of errors in SWIT had a large PMBS amplitude within the right parietal and frontal cortices. This study shows for the first time simultaneous PMBS with distinct functions in different brain regions and frequency ranges. The left parietal PMBS restricted to 22-26 Hz could reflect the sensory reafference of the right-hand tapping disrupted by the switching. In contrast, the right pre-frontal PMBS in the broad 14-30 Hz band likely reflects the active inhibition of the stopped left hand. Finally, correlations between behavioral performance and the magnitude of the PMBS suggest that beta oscillations can be viewed as a marker of successful active inhibition.
Abstract:
Background: Macular edema resulting from central retinal vein occlusion is effectively treated with anti-vascular endothelial growth factor injections. However, some patients need monthly retreatment and still show frequent recurrences. The purpose of this study was to evaluate the visual and anatomic outcomes of refractory macular edema resulting from ischemic central retinal vein occlusion in patients switched from ranibizumab to aflibercept intravitreal injections. Patients and Methods: We describe a retrospective series of patients followed in the Medical Retina Unit of the Jules Gonin Eye Hospital for macular edema due to ischemic central retinal vein occlusion, refractory to monthly retreatment with ranibizumab, and changed to aflibercept. Refractory macular edema was defined as persistence of any fluid at each visit one month after the last injection for at least 6 months. All patients had to have undergone pan-retinal laser scan. Results: Six patients were identified, one of whom had a very short-term follow-up (excluded from statistics). Mean age was 57 ± 12 years. The mean changes in visual acuity and central macular thickness from baseline to switch were + 20.6 ± 20.3 ETDRS letters and - 316.4 ± 276.6 µm, respectively. The additional changes from before to after the switch were + 9.2 ± 9.5 ETDRS letters and - 248.0 ± 248.7 µm, respectively. The injection intervals could often be lengthened after the switch. Conclusions: Intravitreal aflibercept seems to be a promising alternative treatment for macular edema refractory to ranibizumab in ischemic central retinal vein occlusion.
Abstract:
BACKGROUND: Antiretroviral regimens containing tenofovir disoproxil fumarate have been associated with renal toxicity and reduced bone mineral density. Tenofovir alafenamide is a novel tenofovir prodrug that reduces tenofovir plasma concentrations by 90%, thereby decreasing off-target side-effects. We aimed to assess whether efficacy, safety, and tolerability were non-inferior in patients switched to a regimen containing tenofovir alafenamide versus in those remaining on one containing tenofovir disoproxil fumarate. METHODS: In this randomised, actively controlled, multicentre, open-label, non-inferiority trial, we recruited HIV-1-infected adults from Gilead clinical studies at 168 sites in 19 countries. Patients were virologically suppressed (HIV-1 RNA <50 copies per mL) with an estimated glomerular filtration rate of 50 mL per min or greater, and were taking one of four tenofovir disoproxil fumarate-containing regimens for at least 96 weeks before enrolment. With use of a third-party computer-generated sequence, patients were randomly assigned (2:1) to receive a once-a-day single-tablet containing elvitegravir 150 mg, cobicistat 150 mg, emtricitabine 200 mg, and tenofovir alafenamide 10 mg (tenofovir alafenamide group) or to carry on taking one of four previous tenofovir disoproxil fumarate-containing regimens (tenofovir disoproxil fumarate group) for 96 weeks. Randomisation was stratified by previous treatment regimen in blocks of six. Patients and treating physicians were not masked to the assigned study regimen; outcome assessors were masked until database lock. The primary endpoint was the proportion of patients who received at least one dose of study drug who had undetectable viral load (HIV-1 RNA <50 copies per mL) at week 48. The non-inferiority margin was 12%. This study was registered with ClinicalTrials.gov, number NCT01815736. FINDINGS: Between April 12, 2013 and April 3, 2014, we enrolled 1443 patients. 
959 patients were randomly assigned to the tenofovir alafenamide group and 477 to the tenofovir disoproxil fumarate group. Viral suppression at week 48 was noted in 932 (97%) patients assigned to the tenofovir alafenamide group and in 444 (93%) assigned to the tenofovir disoproxil fumarate group (adjusted difference 4·1%, 95% CI 1·6-6·7), with virological failure noted in ten and six patients, respectively. The number of adverse events was similar between the two groups, but study drug-related adverse events were more common in the tenofovir alafenamide group (204 patients [21%] vs 76 [16%]). Hip and spine bone mineral density and glomerular filtration were each significantly improved in patients in the tenofovir alafenamide group compared with those in the tenofovir disoproxil fumarate group. INTERPRETATION: Switching to a tenofovir alafenamide-containing regimen from one containing tenofovir disoproxil fumarate was non-inferior for maintenance of viral suppression and led to improved bone mineral density and renal function. Longer term follow-up is needed to better understand the clinical impact of these changes. FUNDING: Gilead Sciences.
Abstract:
Purpose: Sirolimus (SRL) has been used to replace calcineurin inhibitors (CNI) for various indications including CNI-induced toxicity. The aim of this study was to evaluate the efficacy and safety of switching from CNI to SRL in stable renal transplant recipients (RTR) with low-grade proteinuria (<1 g/24 h). Methods and materials: Between 2001 and 2007, 41 patients (20 females, 21 males; mean age 47 ± 13 years) were switched after a median time post-transplantation of 73.5 months (range 0.2-273.2 months). Indications for the switch were CNI nephrotoxicity (39%), thrombotic micro-angiopathy (14.6%), post-transplantation cancer (24.4%), CNI neurotoxicity (7.4%), or others (14.6%). Mean follow-up after the SRL switch was 23.8 ± 16.3 months. Mean SRL dosage and trough levels were 2.4 ± 1.1 mg/day and 8 ± 2.2 µg/l, respectively. Immunosuppressive regimens were SRL + mycophenolate mofetil (MMF) (31.7%), SRL + MMF + prednisone (36.58%), SRL + prednisone (19.51%), SRL + azathioprine (9.75%), or SRL alone (2.43%). Results: Mean creatinine decreased from 164 to 143 μmol/l (p <0.03), mean estimated glomerular filtration rate (eGFR) increased significantly from 50.13 to 55.01 ml/minute (p <0.00001), and mean systolic and diastolic blood pressure decreased from 138 to 132 mm Hg (p <0.03) and from 83 to 78 mm Hg (p <0.01), but mean proteinuria increased from 0.21 to 0.63 g/24 h (p <0.001). Mean total cholesterol increased from 5.09 to 5.56 mmol/l, though not significantly (p = 0.06). The main complications after the SRL switch were dermatitis (19.5%), urinary tract infections (24.4%), ankle edema (13.3%), and transient oral ulcers (20%). Acute rejection after the switch occurred in 7.3% of patients (n = 3); 2 acute rejections were successfully treated with corticosteroids and 1 did not respond to treatment (not related to the switch). SRL had to be discontinued in 17% of patients (2 nephrotic syndromes, 2 severe edema, 1 acute rejection, 1 thrombotic micro-angiopathy, and 1 fever).
Conclusion: Switching from CNI to SRL in stable RTR was safe and associated with a significant improvement in renal function and blood pressure. Known side-effects of SRL led to drug discontinuation in less than 20% of patients, and the acute rejection rate was 7.3%. This experience underlines the importance of patient selection before switching to SRL, in particular regarding pre-switch proteinuria.
Abstract:
The mechanisms regulating systemic and mucosal IgA responses in the respiratory tract are incompletely understood. Using virus-like particles loaded with single-stranded RNA as a ligand for TLR7, we found that systemic vs mucosal IgA responses in mice were differently regulated. Systemic IgA responses following s.c. immunization were T cell independent and did not require TACI or TGFbeta, whereas mucosal IgA production was dependent on Th cells, TACI, and TGFbeta. Strikingly, both responses required TLR7 signaling, but systemic IgA depended upon TLR7 signaling directly to B cells whereas mucosal IgA required TLR7 signaling to lung dendritic cells and alveolar macrophages. Our data show that IgA switching is controlled differently according to the cell type receiving TLR signals. This knowledge should facilitate the development of IgA-inducing vaccines.
Abstract:
Using event-related potentials (ERPs), we investigated the neural response associated with preparing to switch from one task to another. We used a cued task-switching paradigm in which the interval between the cue and the imperative stimulus was varied. The difference between response time (RT) on trials on which the task switched and trials on which the task repeated (switch cost) decreased as the interval between cue and target (CTI) was increased, demonstrating that subjects used the CTI to prepare for the forthcoming task. However, RTs on repeated-task trials in blocks during which the task could switch (mixed-task blocks) were never as short as RTs during single-task blocks (mixing cost). This replicates previous research. The ERPs in response to the cue were compared across three conditions: single-task trials, switch trials, and repeat trials. ERP topographic differences were found between single-task trials and mixed-task (switch and repeat) trials at approximately 160 and approximately 310 msec after the cue, indicative of changes in the underlying neural generator configuration as a basis for the mixing cost. In contrast, there were no topographic differences evident between switch and repeat trials during the CTI. Rather, the response of statistically indistinguishable generator configurations was stronger at approximately 310 msec on switch than on repeat trials. By separating differences in ERP topography from differences in response strength, these results suggest that a reappraisal of previous research is appropriate.
Sociogenomics of Cooperation and Conflict during Colony Founding in the Fire Ant Solenopsis invicta.
Abstract:
One of the fundamental questions in biology is how cooperative and altruistic behaviors evolved. The majority of studies seeking to identify the genes regulating these behaviors have been performed in systems where behavioral and physiological differences are relatively fixed, such as in the honey bee. During colony founding in the monogyne (one queen per colony) social form of the fire ant Solenopsis invicta, newly-mated queens may start new colonies either individually (haplometrosis) or in groups (pleometrosis). However, only one queen (the "winner") in pleometrotic associations survives and takes the lead of the young colony while the others (the "losers") are executed. Thus, colony founding in fire ants provides an excellent system in which to examine the genes underpinning cooperative behavior and how the social environment shapes the expression of these genes. We developed a new whole genome microarray platform for S. invicta to characterize the gene expression patterns associated with colony founding behavior. First, we compared haplometrotic queens, pleometrotic winners and pleometrotic losers. Second, we manipulated pleometrotic couples in order to switch or maintain the social ranks of the two cofoundresses. Haplometrotic and pleometrotic queens differed in the expression of genes involved in stress response, aging, immunity, reproduction and lipid biosynthesis. Smaller sets of genes were differentially expressed between winners and losers. In the second experiment, switching social rank had a much greater impact on gene expression patterns than the initial/final rank. Expression differences for several candidate genes involved in key biological processes were confirmed using qRT-PCR. Our findings indicate that, in S. invicta, social environment plays a major role in the determination of the patterns of gene expression, while the queen's physiological state is secondary. 
These results highlight the powerful influence of social environment on regulation of the genomic state, physiology and ultimately, social behavior of animals.
Abstract:
BACKGROUND: Superinfection with drug resistant HIV strains could potentially contribute to compromised therapy in patients initially infected with drug-sensitive virus and receiving antiretroviral therapy. To investigate the importance of this potential route to drug resistance, we developed a bioinformatics pipeline to detect superinfection from routinely collected genotyping data, and assessed whether superinfection contributed to increased drug resistance in a large European cohort of viremic, drug treated patients. METHODS: We used sequence data from routine genotypic tests spanning the protease and partial reverse transcriptase regions in the Virolab and EuResist databases that collated data from five European countries. Superinfection was indicated when sequences of a patient failed to cluster together in phylogenetic trees constructed with selected sets of control sequences. A subset of the indicated cases was validated by re-sequencing pol and env regions from the original samples. RESULTS: 4425 patients had at least two sequences in the database, with a total of 13816 distinct sequence entries (of which 86% belonged to subtype B). We identified 107 patients with phylogenetic evidence for superinfection. In 14 of these cases, we analyzed newly amplified sequences from the original samples for validation purposes: only 2 cases were verified as superinfections in the repeated analyses, the other 12 cases turned out to involve sample or sequence misidentification. Resistance to drugs used at the time of strain replacement did not change in these two patients. A third case could not be validated by re-sequencing, but was supported as superinfection by an intermediate sequence with high degenerate base pair count within the time frame of strain switching. Drug resistance increased in this single patient. 
CONCLUSIONS: Routine genotyping data are informative for the detection of HIV superinfection; however, most cases of non-monophyletic clustering in patient phylogenies arise from sample or sequence mix-up rather than from superinfection, which emphasizes the importance of validation. Non-transient superinfection was rare in our mainly treatment experienced cohort, and we found a single case of possible transmitted drug resistance by this route. We therefore conclude that in our large cohort, superinfection with drug resistant HIV did not compromise the efficiency of antiretroviral treatment.
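The pipeline's core criterion, whether all of a patient's sequences cluster together in a phylogenetic tree, amounts to a monophyly test. Below is a minimal, illustrative sketch of that test; the nested-tuple tree encoding, the labels, and the function names are hypothetical and are not the authors' actual implementation, which operated on trees built from pol sequences with control sets.

```python
# Toy monophyly check: leaves are strings, internal nodes are tuples.
# A patient's sequences are "monophyletic" here if some clade's leaf
# set is exactly the patient's sequence set.

def leaves(node):
    """Collect all leaf labels under a node."""
    if isinstance(node, str):
        return {node}
    result = set()
    for child in node:
        result |= leaves(child)
    return result

def is_monophyletic(tree, labels):
    """True if `labels` is exactly the leaf set of some clade."""
    if isinstance(tree, str):
        return {tree} == labels
    if leaves(tree) == labels:
        return True
    return any(is_monophyletic(child, labels) for child in tree)

# Patient P1's two sequences form a clade; P2's do not, which under
# this criterion would flag P2 for possible superinfection (pending
# validation by re-sequencing, as the study stresses).
tree = ((("P1_a", "P1_b"), "P2_a"), ("P2_b", "CTRL_1"))
print(is_monophyletic(tree, {"P1_a", "P1_b"}))  # True
print(is_monophyletic(tree, {"P2_a", "P2_b"}))  # False
```

Note that a real pipeline must also handle rooting, bootstrap support, and the sample mix-ups that the study found to explain most non-monophyletic patterns.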
Abstract:
CONTEXT: New trial data and drug regimens that have become available in the last 2 years warrant an update to guidelines for antiretroviral therapy (ART) in human immunodeficiency virus (HIV)-infected adults in resource-rich settings. OBJECTIVE: To provide current recommendations for the treatment of adult HIV infection with ART and use of laboratory-monitoring tools. Guidelines include when to start therapy and with what drugs, monitoring for response and toxic effects, special considerations in therapy, and managing antiretroviral failure. DATA SOURCES, STUDY SELECTION, AND DATA EXTRACTION: Data that had been published or presented in abstract form at scientific conferences in the past 2 years were systematically searched and reviewed by an International Antiviral Society-USA panel. The panel reviewed available evidence and formed recommendations by full panel consensus. DATA SYNTHESIS: Treatment is recommended for all adults with HIV infection; the strength of the recommendation and the quality of the evidence increase with decreasing CD4 cell count and the presence of certain concurrent conditions. Recommended initial regimens include 2 nucleoside reverse transcriptase inhibitors (tenofovir/emtricitabine or abacavir/lamivudine) plus a nonnucleoside reverse transcriptase inhibitor (efavirenz), a ritonavir-boosted protease inhibitor (atazanavir or darunavir), or an integrase strand transfer inhibitor (raltegravir). Alternatives in each class are recommended for patients with or at risk of certain concurrent conditions. CD4 cell count and HIV-1 RNA level should be monitored, as should engagement in care, ART adherence, HIV drug resistance, and quality-of-care indicators. Reasons for regimen switching include virologic, immunologic, or clinical failure and drug toxicity or intolerance. Confirmed treatment failure should be addressed promptly and multiple factors considered. 
CONCLUSION: New recommendations for HIV patient care include offering ART to all patients regardless of CD4 cell count, changes in therapeutic options, and modifications in the timing and choice of ART in the setting of opportunistic illnesses such as cryptococcal disease and tuberculosis.
Abstract:
BACKGROUND: The efficacy of vedolizumab, an α4β7 integrin antibody, in Crohn's disease is unknown. METHODS: In an integrated study with separate induction and maintenance trials, we assessed intravenous vedolizumab therapy (300 mg) in adults with active Crohn's disease. In the induction trial, 368 patients were randomly assigned to receive vedolizumab or placebo at weeks 0 and 2 (cohort 1), and 747 patients received open-label vedolizumab at weeks 0 and 2 (cohort 2); disease status was assessed at week 6. In the maintenance trial, 461 patients who had had a response to vedolizumab were randomly assigned to receive placebo or vedolizumab every 8 or 4 weeks until week 52. RESULTS: At week 6, a total of 14.5% of the patients in cohort 1 who received vedolizumab and 6.8% who received placebo were in clinical remission (i.e., had a score on the Crohn's Disease Activity Index [CDAI] of ≤150, with scores ranging from 0 to approximately 600 and higher scores indicating greater disease activity) (P=0.02); a total of 31.4% and 25.7% of the patients, respectively, had a CDAI-100 response (≥100-point decrease in the CDAI score) (P=0.23). Among patients in cohorts 1 and 2 who had a response to induction therapy, 39.0% and 36.4% of those assigned to vedolizumab every 8 weeks and every 4 weeks, respectively, were in clinical remission at week 52, as compared with 21.6% assigned to placebo (P<0.001 and P=0.004 for the two vedolizumab groups, respectively, vs. placebo). Antibodies against vedolizumab developed in 4.0% of the patients. Nasopharyngitis occurred more frequently, and headache and abdominal pain less frequently, in patients receiving vedolizumab than in patients receiving placebo. Vedolizumab, as compared with placebo, was associated with a higher rate of serious adverse events (24.4% vs. 15.3%), infections (44.1% vs. 40.2%), and serious infections (5.5% vs. 3.0%). 
CONCLUSIONS: Vedolizumab-treated patients with active Crohn's disease were more likely than patients receiving placebo to have a remission, but not a CDAI-100 response, at week 6; patients with a response to induction therapy who continued to receive vedolizumab (rather than switching to placebo) were more likely to be in remission at week 52. Adverse events were more common with vedolizumab. (Funded by Millennium Pharmaceuticals; GEMINI 2 ClinicalTrials.gov number, NCT00783692.).