20 results for brainstem glioma
Abstract:
BACKGROUND: Mechanical and in particular tactile allodynia is a hallmark of chronic pain in which innocuous touch becomes painful. Previous cholera toxin B (CTB)-based neural tracing experiments and electrophysiology studies had suggested that aberrant axon sprouting from touch sensory afferents into pain-processing laminae after injury is a possible anatomical substrate underlying mechanical allodynia. This hypothesis was later challenged by experiments using intra-axonal labeling of A-fiber neurons, as well as single-neuron labeling of electrophysiologically identified sensory neurons. However, no studies have used genetically labeled neurons to examine this issue, and most studies were performed on spinal rather than trigeminal sensory neurons, which are the relevant neurons for orofacial pain, where allodynia often plays a dominant clinical role. FINDINGS: We recently discovered that parvalbumin::Cre (Pv::Cre) labels two types of Aβ touch neurons in the trigeminal ganglion. Using a Pv::CreER driver and a Cre-dependent reporter mouse, we specifically labeled these Aβ trigeminal touch afferents by timed tamoxifen injection prior to inflammation or infraorbital nerve injury (ION transection). We then examined the peripheral and central projections of labeled axons into the brainstem caudalis nucleus after injury versus controls. We found no evidence for ectopic sprouting of Pv::CreER-labeled trigeminal Aβ axons into the superficial trigeminal nociceptive laminae. Furthermore, there was also no evidence for peripheral sprouting. CONCLUSIONS: CreER-based labeling prior to injury circumvented the issue of phenotypic changes of neurons after injury. Our results suggest that touch allodynia in chronic orofacial pain is unlikely to be caused by ectopic sprouting of Aβ trigeminal afferents.
Abstract:
It is essential to keep track of the movements we make, and one way to do that is to monitor correlates, or corollary discharges, of neuronal movement commands. We hypothesized that a previously identified pathway from brainstem to frontal cortex might carry corollary discharge signals. We found that neuronal activity in this pathway encodes upcoming eye movements and that inactivating the pathway impairs sequential eye movements consistent with loss of corollary discharge without affecting single eye movements. These results identify a pathway in the brain of the primate Macaca mulatta that conveys corollary discharge signals.
Abstract:
INTRODUCTION: Malignant gliomas frequently harbor mutations in the isocitrate dehydrogenase 1 (IDH1) gene. Studies suggest that IDH mutation contributes to tumor pathogenesis through mechanisms that are mediated by the neomorphic metabolite of the mutant IDH1 enzyme, 2-hydroxyglutarate (2-HG). The aim of this work was to synthesize and evaluate radiolabeled compounds that bind to the mutant IDH1 enzyme, with the goal of enabling noninvasive imaging of mutant IDH1 expression in gliomas by positron emission tomography (PET). METHODS: A small library of nonradioactive analogs was designed and synthesized based on the chemical structure of reported butyl-phenyl sulfonamide inhibitors of mutant IDH1. Enzyme inhibition assays were conducted using purified mutant IDH1 enzyme, IDH1-R132H, to determine the IC50 and the maximal inhibitory efficiency of the synthesized compounds. Selected compounds, 1 and 4, were labeled with radioiodine ((125)I) and/or (18)F using bromo- and phenol precursors, respectively. In vivo behavior of the labeled inhibitors was studied by conducting tissue distribution studies with [(125)I]1 in normal mice. Cell uptake studies were conducted using an isogenic astrocytoma cell line carrying a native IDH1-R132H mutation to evaluate the potential uptake of the labeled inhibitors in IDH1-mutated tumor cells. RESULTS: Enzyme inhibition assays showed good inhibitory potency for compounds 1 and 4, which carry an iodine or a fluoroethoxy substituent at the ortho position of the phenyl ring, with IC50 values of 1.7 μM and 2.3 μM, respectively. Compounds 1 and 4 inhibited mutant IDH1 activity and decreased the production of 2-HG in an IDH1-mutated astrocytoma cell line. Radiolabeling of 1 and 4 was achieved with an average radiochemical yield of 56.6 ± 20.1% for [(125)I]1 (n = 4) and 67.5 ± 6.6% for [(18)F]4 (n = 3). [(125)I]1 exhibited favorable biodistribution characteristics in normal mice, with rapid clearance from the blood and elimination via the hepatobiliary system by 4 h after injection. The uptake of [(125)I]1 in tumor cells positive for IDH1-R132H was significantly higher than in isogenic WT-IDH1 controls, with a maximal uptake ratio of 1.67 at 3 h post injection. Co-incubation of the labeled inhibitors with the corresponding nonradioactive analogs, as well as decreasing the normal concentration of FBS (10%) in the incubation media, substantially increased the uptake of the labeled inhibitors in both the IDH1-mutant and WT-IDH1 tumor cell lines, suggesting significant non-specific binding of the synthesized labeled butyl-phenyl sulfonamide inhibitors. CONCLUSIONS: These data demonstrate the feasibility of developing radiolabeled probes for the mutant IDH1 enzyme based on enzyme inhibitors. Further optimization of the labeled inhibitors by modifying the chemical structure to decrease lipophilicity and increase potency may yield compounds with improved characteristics as probes for imaging mutant IDH1 expression in tumors.
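As a rough illustration of the IC50 determination mentioned above, the sketch below fits a four-parameter logistic (Hill) curve to a dose-response series; the concentrations, percent-inhibition values, and Hill slope are hypothetical stand-ins, not data from the study.

```python
# Minimal sketch of an IC50 fit for a mutant-IDH1 inhibition assay.
# The concentrations and percent-inhibition values below are hypothetical;
# only the curve-fitting procedure is illustrated.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc_uM, bottom, top, ic50, slope):
    """Four-parameter logistic (Hill) model of percent inhibition."""
    return bottom + (top - bottom) / (1.0 + (ic50 / conc_uM) ** slope)

# Hypothetical dose-response data (inhibitor concentration in µM vs. % inhibition).
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
inhibition = np.array([5.0, 15.0, 38.0, 62.0, 85.0, 95.0])

# Initial guesses: ~0-100 % range, IC50 near 2 µM, Hill slope of 1;
# nonnegative bounds keep the fit in a physically meaningful region.
params, _ = curve_fit(hill, conc, inhibition,
                      p0=[1.0, 100.0, 2.0, 1.0], bounds=(0.0, np.inf))
print(f"Estimated IC50: {params[2]:.2f} µM")
```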
Abstract:
Integrating information from multiple sources is a crucial function of the brain. Examples of such integration include combining stimuli of different modalities, such as visual and auditory stimuli, combining multiple stimuli of the same modality, such as two auditory stimuli, and integrating stimuli arriving through the sensory organs (i.e., the ears) with stimuli delivered from brain-machine interfaces.
The overall aim of this body of work is to empirically examine stimulus integration in these three domains to inform our broader understanding of how and when the brain combines information from multiple sources.
First, I examine visually guided auditory learning, a problem with implications for the more general question of how the brain determines which lessons to learn (and which not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound-location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a ‘guess and check’ heuristic in which visual feedback obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain’s reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6-degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.
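For concreteness, the reported shift can be expressed as a fraction of the imposed visual-auditory mismatch; the short sketch below illustrates that arithmetic using hypothetical saccade endpoints rather than the actual behavioral data.

```python
# Sketch of the shift computation described above: the change in auditory-only
# saccade endpoints, expressed as a fraction of the 6-degree visual-auditory
# mismatch. The endpoint arrays are hypothetical; only the arithmetic is real.
import numpy as np

mismatch_deg = 6.0  # visual cue displaced 6 degrees from the sound source

# Hypothetical horizontal saccade endpoints (degrees) on auditory-only trials,
# before and after exposure to the mismatched visual feedback.
pre_exposure = np.array([11.8, 12.3, 11.6, 12.1, 12.0])
post_exposure = np.array([13.4, 13.8, 13.1, 13.6, 13.5])

shift_deg = post_exposure.mean() - pre_exposure.mean()
print(f"Shift: {shift_deg:.1f} deg "
      f"({100 * shift_deg / mismatch_deg:.0f}% of the mismatch)")
```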
My next line of research examines how electrical stimulation of the inferior colliculus influences the perception of sounds in a nonhuman primate. The central nucleus of the inferior colliculus is the major ascending relay of auditory information: nearly all auditory signals pass through it before reaching the forebrain. This makes the inferior colliculus an ideal structure for understanding the format of the inputs to the forebrain and, by extension, the processing of auditory scenes that occurs in the brainstem, and therefore an attractive target for understanding stimulus integration in the ascending auditory pathway.
Moreover, understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation. Monkeys performed a frequency-based discrimination task, reporting whether a probe sound was higher or lower in frequency than a reference sound. Stimulation pulses were paired with the probe sound on 50% of trials (0.5-80 µA, 100-300 Hz, n=172 IC locations in 3 rhesus monkeys). Electrical stimulation tended to bias the animals’ judgments in a fashion that was coarsely but significantly correlated with the best frequency of the stimulation site relative to the reference frequency employed in the task. Although the effects of stimulation were quite variable (including impairments in performance and shifts away from the direction predicted by the site’s response properties), the results indicate that stimulation of the IC can evoke percepts correlated with the frequency tuning of the stimulated site. Consistent with the implications of recent human studies, the main avenue for improvement of the auditory midbrain implant suggested by our findings is to increase the number and spatial extent of electrodes, enlarging the region that can be electrically activated and providing a greater range of evoked percepts.
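One plausible way to quantify the stimulation-induced bias described above is to fit logistic psychometric functions to the proportion of 'probe higher' reports on stimulated and non-stimulated trials and compare their midpoints; the sketch below illustrates this analysis on hypothetical response proportions, not the actual monkey data.

```python
# Sketch: quantify a stimulation-induced bias as the horizontal shift between
# two logistic psychometric curves (stimulated vs. non-stimulated trials).
# All response proportions below are hypothetical; only the analysis is shown.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(freq_octaves, midpoint, slope):
    """P('probe higher') as a logistic function of probe - reference (octaves)."""
    return 1.0 / (1.0 + np.exp(-(freq_octaves - midpoint) / slope))

# Probe frequency relative to the reference (octaves) and hypothetical
# proportions of 'higher' reports on control vs. stimulated trials.
delta_f = np.array([-0.5, -0.25, -0.1, 0.0, 0.1, 0.25, 0.5])
p_control = np.array([0.05, 0.20, 0.40, 0.50, 0.62, 0.80, 0.95])
p_stim = np.array([0.10, 0.32, 0.55, 0.66, 0.78, 0.90, 0.97])

(mid_c, _), _ = curve_fit(psychometric, delta_f, p_control, p0=[0.0, 0.1])
(mid_s, _), _ = curve_fit(psychometric, delta_f, p_stim, p0=[0.0, 0.1])

# A negative midpoint shift means stimulation biased judgments toward 'higher',
# as expected if the site's best frequency lies above the reference frequency.
print(f"Stimulation-induced bias: {mid_s - mid_c:+.3f} octaves")
```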
My next line of research employs a frequency-tagging approach to examine the extent to which multiple sound sources are combined (or segregated) in the nonhuman primate inferior colliculus. In the single-sound case, most inferior colliculus neurons respond and entrain to sounds across a very broad region of space, and many are entirely spatially insensitive, so it is unknown how these neurons respond when more than one sound is present. I use multiple amplitude-modulated (AM) stimuli with different modulation frequencies, which the inferior colliculus represents using a spike-timing code. This allows me to measure spike timing in the inferior colliculus to determine which sound source is responsible for neural activity in an auditory scene containing multiple sounds. Using this approach, I find that the same neurons that are tuned to broad regions of space in the single-sound condition become dramatically more selective in the dual-sound condition, preferentially entraining spikes to stimuli from a smaller region of space. I will examine the possibility that there may be a conceptual linkage between this finding and the finding of receptive field shifts in the visual system.
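A common way to implement this kind of frequency tagging is to compute, for each sound's AM 'tag' frequency, the vector strength of the recorded spike train and attribute activity to whichever source the neuron entrains to most strongly; the sketch below illustrates the computation on a simulated spike train (the modulation rates and spike times are assumptions, not experimental values).

```python
# Sketch of a frequency-tagging analysis: vector strength of a spike train at
# each sound's AM 'tag' frequency indicates which source the neuron entrains to.
# The spike train and modulation rates below are simulated, not recorded data.
import numpy as np

def vector_strength(spike_times_s, mod_freq_hz):
    """Vector strength (0-1) of spikes relative to one modulation frequency."""
    phases = 2.0 * np.pi * mod_freq_hz * spike_times_s
    return np.abs(np.mean(np.exp(1j * phases)))

rng = np.random.default_rng(0)
tag_a, tag_b = 20.0, 28.0  # hypothetical AM rates of the two sound sources (Hz)

# Simulate a neuron that entrains to source A: spikes near the peaks of the
# 20 Hz modulation cycle, with some timing jitter, over a 1-second trial.
cycle_peaks = np.arange(0.0, 1.0, 1.0 / tag_a)
spikes = cycle_peaks + rng.normal(0.0, 0.004, size=cycle_peaks.size)

for label, f in [("source A", tag_a), ("source B", tag_b)]:
    print(f"Vector strength at {label} ({f:g} Hz): {vector_strength(spikes, f):.2f}")
```

High vector strength at one tag frequency and low vector strength at the other would indicate that the neuron's spikes are driven predominantly by one of the two sources.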
In chapter 5, I will comment on these findings more generally, compare them to existing theoretical models, and discuss what these results tell us about processing in the central nervous system in a multi-stimulus situation. My results suggest that the brain is flexible in its processing and can adapt its integration schema to fit the available cues and the demands of the task.
Abstract:
The mouth, throat, and face contain numerous muscles that participate in a large variety of orofacial behaviors. The jaw and tongue can move independently, and thus require a high degree of coordination among the muscles that move them to prevent self-injury. However, different orofacial behaviors require distinct patterns of coordination between these muscles. How motor control circuitry coordinates this activity has yet to be determined. Electrophysiological, immunohistochemical, and retrograde tracing studies have attempted to identify populations of premotor neurons that send information directly to orofacial motoneurons, in an effort to identify sources of coordination. Yet these studies have not provided a complete picture of the population of neurons that monosynaptically connect to jaw and tongue motoneurons. Additionally, while many of these studies have suggested that premotor neurons projecting to multiple motor pools may play a role in the coordination of orofacial muscles, no clear functional roles for these neurons in the coordination of natural orofacial movements have been identified.
In this dissertation, I took advantage of the recently developed monosynaptic rabies virus to trace the premotor circuits for the jaw-closing masseter muscle and the tongue-protruding genioglossus muscle in the neonatal mouse, uncovering novel premotor inputs in the brainstem. Furthermore, these studies identified a set of neurons that form boutons onto motoneurons in multiple motor pools, providing a premotor substrate for orofacial coordination. I then combined a retrogradely traveling lentivirus with a split-intein-mediated split-Cre recombinase system to isolate and manipulate a population of neurons that project to both the left and right jaw-closing motor nuclei. I found that these bilaterally projecting neurons also innervate multiple other orofacial motor nuclei, premotor regions, and midbrain regions implicated in motor control. I anatomically and physiologically characterized these neurons and used optogenetic and chemogenetic approaches to assess their role in natural jaw-closing behavior, specifically with reference to bilateral masseter muscle electromyogram (EMG) activity. These studies identified a population of bilaterally projecting neurons in the supratrigeminal nucleus as essential for maintaining an appropriate level of masseter activation during natural chewing behavior in the freely moving mouse. Moreover, these studies uncovered two distinct roles for supratrigeminal bilaterally projecting neurons: bilaterally synchronized activation of the masseter muscles, and active balancing of bilateral masseter muscle tone against an excitatory input. Together, these studies identify neurons that project to multiple motor nuclei as a mechanism by which the brain coordinates orofacial muscles during natural behavior.
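As a rough sketch of how bilaterally synchronized masseter activation might be quantified from the EMG recordings mentioned above, the example below cross-correlates rectified, smoothed left and right masseter EMG envelopes; the signals, sampling rate, and chewing rhythm are simulated assumptions rather than recorded data.

```python
# Sketch: quantify left/right masseter synchrony as the peak of the normalized
# cross-correlation between rectified, smoothed EMG envelopes.
# The signals are simulated; all parameters are illustrative assumptions.
import numpy as np

fs = 1000.0                      # assumed EMG sampling rate (Hz)
t = np.arange(0.0, 5.0, 1.0 / fs)
chew = (np.sin(2.0 * np.pi * 4.0 * t) > 0.6).astype(float)  # ~4 Hz chewing bursts

rng = np.random.default_rng(1)
left = chew * rng.normal(0.0, 1.0, t.size)    # bursting EMG, left masseter
right = chew * rng.normal(0.0, 1.0, t.size)   # synchronized bursts, right masseter

def envelope(emg, win=50):
    """Rectify and smooth with a moving average of `win` samples."""
    return np.convolve(np.abs(emg), np.ones(win) / win, mode="same")

l_env = envelope(left) - envelope(left).mean()
r_env = envelope(right) - envelope(right).mean()
xcorr = np.correlate(l_env, r_env, mode="full")
xcorr /= np.sqrt(np.sum(l_env ** 2) * np.sum(r_env ** 2))
lag = np.argmax(xcorr) - (l_env.size - 1)
print(f"Peak correlation {xcorr.max():.2f} at lag {lag / fs * 1000:.0f} ms")
```

A peak correlation near 1 at a lag near 0 ms would indicate tightly synchronized bilateral bursts; a shifted or reduced peak would indicate asymmetric or desynchronized activation.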