44 results for Primate
Abstract:
Transcranial magnetic stimulation (TMS) is a widely used, noninvasive method for stimulating nervous tissue, yet its mechanisms of action are poorly understood. Here we report new methods for studying the influence of TMS on single neurons in the brain of alert non-human primates. We designed a TMS coil that focuses its effect near the tip of a recording electrode, and recording electronics that enable direct acquisition of neuronal signals at the site of peak stimulus strength, minimally perturbed by stimulation artifact, in awake monkeys (Macaca mulatta). We recorded action potentials within ∼1 ms after 0.4-ms TMS pulses and observed changes in activity that differed significantly between active and sham stimulation. This methodology is compatible with standard equipment in primate laboratories, allowing easy implementation. Application of these tools will facilitate the refinement of next-generation TMS devices, experiments and treatment protocols.
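The recording approach described here hinges on keeping the stimulation artifact from obscuring spikes that occur within about a millisecond of the pulse. The sketch below is not the authors' acquisition pipeline; it uses synthetic data and illustrative parameters (sampling rate, blanking window, threshold rule) only to show one common software-side version of such a measurement: blank a brief window around each pulse, then detect threshold-crossing spikes and report their latencies.

```python
# Sketch with synthetic data (assumed parameters, not the published method):
# blank a short window around each TMS pulse, then detect spikes by threshold
# crossing so action potentials shortly after the pulse can be counted.
import numpy as np

fs = 30_000                                  # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)                # 2 s of simulated recording
rng = np.random.default_rng(0)
trace = rng.normal(0, 5e-6, t.size)          # baseline noise, volts
pulse_times = np.array([0.5, 1.0, 1.5])      # TMS pulse onsets, seconds

# Inject a large artifact at each pulse and a small "spike" ~1 ms later.
for pt in pulse_times:
    i = int(pt * fs)
    trace[i:i + int(0.0004 * fs)] += 5e-3    # 0.4-ms artifact
    j = i + int(0.001 * fs)
    trace[j:j + 30] -= 60e-6                 # illustrative spike waveform

def blank_artifacts(x, pulses, fs, width_s=0.0005):
    """Replace samples inside each artifact window with the local baseline."""
    x = x.copy()
    for pt in pulses:
        i = int(pt * fs)
        x[i:i + int(width_s * fs)] = np.median(x[max(0, i - 300):i])
    return x

clean = blank_artifacts(trace, pulse_times, fs)
sigma = np.median(np.abs(clean)) / 0.6745    # robust noise estimate
threshold = -4 * sigma
# Downward threshold crossings mark candidate spikes.
spike_idx = np.flatnonzero((clean[1:] < threshold) & (clean[:-1] >= threshold))
for pt in pulse_times:
    latencies = spike_idx / fs - pt
    early = latencies[(latencies > 0) & (latencies < 0.002)]
    print(f"pulse at {pt:.1f} s: {early.size} spike(s) within 2 ms")
```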
Abstract:
Small bistratified cells (SBCs) in the primate retina carry a major blue-yellow opponent signal to the brain. We found that SBCs also carry signals from rod photoreceptors, with the same sign as S cone input. SBCs exhibited robust responses under low scotopic conditions. Physiological and anatomical experiments indicated that this rod input arose from the AII amacrine cell-mediated rod pathway. Rod and cone signals were both present in SBCs at mesopic light levels. These findings have three implications. First, more retinal circuits may multiplex rod and cone signals than previously thought, efficiently exploiting the limited number of optic nerve fibers. Second, signals from AII amacrine cells may diverge to most or all of the approximately 20 retinal ganglion cell types in the peripheral primate retina. Third, rod input to SBCs may be the substrate for behavioral biases toward perception of blue at mesopic light levels.
Abstract:
Chronic allograft rejection is a major impediment to long-term transplant success. Humoral immune responses to alloantigens are a growing clinical problem in transplantation, with mounting evidence associating alloantibodies with the development of chronic rejection. Nearly a third of transplant recipients develop de novo antibodies, and no established therapies are effective at preventing or eliminating them, highlighting the need for a nonhuman primate model of antibody-mediated rejection. In this report, we demonstrate that depletion using anti-CD3 immunotoxin (IT) combined with maintenance immunosuppression that included tacrolimus with or without alefacept reliably prolonged renal allograft survival in rhesus monkeys. In these animals, a preferential skewing toward CD4 repopulation and proliferation was observed, particularly with the addition of alefacept. Furthermore, alefacept-treated animals demonstrated increased alloantibody production (100%) and morphologic features of antibody-mediated injury. In vitro, alefacept was found to enhance CD4 effector memory T cell proliferation. In conclusion, alefacept administration after depletion, together with tacrolimus, promotes a CD4+ memory T cell and alloantibody response, with morphologic changes reflecting antibody-mediated allograft injury. Early and consistent de novo alloantibody production with associated histological changes makes this nonhuman primate model an attractive candidate for evaluating targeted therapeutics.
Abstract:
De novo donor-specific antibody (DSA) after organ transplantation promotes antibody-mediated rejection (AMR) and causes late graft loss. Previously, we demonstrated that depletion using anti-CD3 immunotoxin combined with tacrolimus and alefacept (AMR regimen) reliably induced early DSA production with AMR in a nonhuman primate kidney transplant model. In this study, five animals were assigned as positive AMR controls, four received additional belatacept and four received additional anti-CD40 mAb (2C10R4). Notably, production of early de novo DSA was completely attenuated with additional belatacept or 2C10R4 treatment. In accordance with this, while positive controls experienced a decrease in peripheral IgM(+) B cells, the belatacept- and 2C10R4-added groups maintained a predominant population of IgM(+) B cells, potentially indicating decreased isotype switching. Central memory T cells (CD4(+) CD28(+) CD95(+)) as well as PD-1(hi) CD4(+) T cells were decreased in both the belatacept-added and 2C10R4-added groups. In analyzing germinal center (GC) reactions in situ, lymph nodes further revealed a reduction of B cell clonal expansion, GC-follicular helper T (Tfh) cells, and IL-21 production inside GCs with additional belatacept or 2C10R4 treatment. Here we provide evidence that belatacept and 2C10R4 selectively suppress the humoral response by regulating Tfh cells and prevent AMR in this nonhuman primate model.
Abstract:
Successful interaction with the world depends on accurate perception of the timing of external events. Neurons at early stages of the primate visual system represent time-varying stimuli with high precision. However, it is unknown whether this temporal fidelity is maintained in the prefrontal cortex, where changes in neuronal activity generally correlate with changes in perception. One reason to suspect that it is not maintained is that humans experience surprisingly large fluctuations in the perception of time. To investigate the neuronal correlates of time perception, we recorded from neurons in the prefrontal cortex and midbrain of monkeys performing a temporal-discrimination task. Visual time intervals were presented at a timescale relevant to natural behavior (<500 ms). At this brief timescale, neuronal adaptation (time-dependent changes in the size of successive responses) occurs. We found that visual activity fluctuated with timing judgments in the prefrontal cortex but not in comparable midbrain areas. Surprisingly, only response strength, not timing, predicted task performance. Intervals perceived as longer were associated with larger visual responses and shorter intervals with smaller responses, matching the dynamics of adaptation. These results suggest that the magnitude of prefrontal activity may be read out to provide temporal information that contributes to judging the passage of time.
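The abstract's central claim, that response magnitude rather than response timing carries the temporal information, can be illustrated with a toy adaptation model. The sketch below is not the authors' analysis; the time constant, noise level and readout rule are assumptions chosen only to show how a response magnitude that recovers with elapsed time can be inverted into a duration judgment.

```python
# Toy sketch (assumed parameters, synthetic data): the response to the second
# of two stimuli recovers exponentially with the interval between them, so the
# magnitude of that response can be read out as a noisy estimate of elapsed time.
import numpy as np

rng = np.random.default_rng(1)
tau = 0.25          # recovery time constant in seconds (assumed)
r_max = 1.0         # unadapted response magnitude (arbitrary units)

def second_response(interval_s, noise_sd=0.05):
    """Adapted response: smaller after short intervals, larger after long ones."""
    mean = r_max * (1.0 - np.exp(-interval_s / tau))
    return mean + rng.normal(0.0, noise_sd)

def readout_interval(response):
    """Invert the mean adaptation curve to recover an interval estimate."""
    r = np.clip(response, 1e-6, r_max - 1e-6)
    return -tau * np.log(1.0 - r / r_max)

for true_interval in (0.1, 0.2, 0.3, 0.4):          # sub-500-ms intervals
    r = second_response(true_interval)
    print(f"interval {true_interval * 1000:.0f} ms -> response {r:.2f} "
          f"-> judged {readout_interval(r) * 1000:.0f} ms")
```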
Abstract:
Cellular stresses activate the tumor suppressor p53 protein leading to selective binding to DNA response elements (REs) and gene transactivation from a large pool of potential p53 REs (p53REs). To elucidate how p53RE sequences and local chromatin context interact to affect p53 binding and gene transactivation, we mapped genome-wide binding localizations of p53 and H3K4me3 in untreated and doxorubicin (DXR)-treated human lymphoblastoid cells. We examined the relationships among p53 occupancy, gene expression, H3K4me3, chromatin accessibility (DNase I hypersensitivity, DHS), ENCODE chromatin states, p53RE sequence, and evolutionary conservation. We observed that the inducible expression of p53-regulated genes was associated with the steady-state chromatin status of the cell. Most highly inducible p53-regulated genes were suppressed at baseline and marked by repressive histone modifications or displayed CTCF binding. Comparison of p53RE sequences residing in different chromatin contexts demonstrated that weaker p53REs resided in open promoters, while stronger p53REs were located within enhancers and repressed chromatin. p53 occupancy was strongly correlated with similarity of the target DNA sequences to the p53RE consensus, but surprisingly, inversely correlated with pre-existing nucleosome accessibility (DHS) and evolutionary conservation at the p53RE. Occupancy by p53 of REs that overlapped transposable element (TE) repeats was significantly higher (p < 10^-7) and correlated with stronger p53RE sequences (p < 10^-110) relative to non-TE-associated p53REs, particularly for MLT1H, LTR10B, and Mer61 TEs. However, binding at these elements was generally not associated with transactivation of adjacent genes. Occupied p53REs located in L2-like TEs were unique in displaying highly negative PhyloP scores (predicted fast-evolving) and being associated with altered H3K4me3 and DHS levels. These results underscore the systematic interaction between chromatin status and p53RE context in the induced transactivation response. This p53-regulated response appears to have been tuned via evolutionary processes that may have led to repression and/or utilization of p53REs originating from primate-specific transposable elements.
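The statistical backbone of these observations is a set of correlations between p53 occupancy and site-level covariates. The sketch below uses synthetic data and is not the study's pipeline; it only shows how rank correlations of occupancy with motif similarity, accessibility (DHS) and conservation, carrying the signs reported in the abstract, would be computed.

```python
# Sketch with synthetic per-site data (assumed effect sizes, not study results):
# compute rank correlations between p53 occupancy and site-level covariates.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 500
motif_score = rng.normal(0, 1, n)        # similarity to the p53RE consensus
dhs = rng.normal(0, 1, n)                # pre-existing chromatin accessibility
conservation = rng.normal(0, 1, n)       # PhyloP-like conservation score
# Occupancy built to mimic the reported signs of association (illustrative).
occupancy = 1.0 * motif_score - 0.4 * dhs - 0.3 * conservation + rng.normal(0, 1, n)

for name, x in [("motif similarity", motif_score),
                ("DHS (accessibility)", dhs),
                ("conservation", conservation)]:
    rho, p = spearmanr(occupancy, x)
    print(f"occupancy vs {name}: rho = {rho:+.2f}, p = {p:.1e}")
```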
Abstract:
A juvenile cranium of Homunculus patagonicus Ameghino, 1891a from the late Early Miocene of Santa Cruz Province (Argentina) provides the first evidence of developing cranial anatomy for any fossil platyrrhine. The specimen preserves the rostral part of the cranium with deciduous and permanent alveoli and teeth. The dental eruption sequence in the new specimen and a reassessment of eruption patterns in living and fossil platyrrhines suggest that the ancestral platyrrhine pattern of tooth replacement was for the permanent incisors to erupt before M(1), not an accelerated molar eruption (before the incisors) as recently proposed. Two genera and species of Santacrucian monkeys are now generally recognized: H. patagonicus Ameghino, 1891a and Killikaike blakei Tejedor et al., 2006. Taxonomic allocation of Santacrucian monkeys to these species encounters two obstacles: 1) the (now lost) holotype and a recently proposed neotype of H. patagonicus are mandibles from different localities and different geologic members of the Santa Cruz Formation, separated by approximately 0.7 million years, whereas the holotype of K. blakei is a rostral part of a cranium without a mandible; 2) no Santacrucian monkey with associated cranium and mandible has ever been found. Bearing in mind these uncertainties, our examination of the new specimen as well as other cranial specimens of Santacrucian monkeys establishes the overall dental and cranial similarity between the holotype of Killikaike blakei, adult cranial material previously referred to H. patagonicus, and the new juvenile specimen. This leads us to conclude that Killikaike blakei is a junior subjective synonym of H. patagonicus.
Abstract:
The high energetic costs of building and maintaining large brains are thought to constrain encephalization. The 'expensive-tissue hypothesis' (ETH) proposes that primates (especially humans) overcame this constraint through reduction of another metabolically expensive tissue, the gastrointestinal tract. Small guts characterize animals specializing on easily digestible diets. Thus, the hypothesis may be tested via the relationship between brain size and diet quality. Platyrrhine primates present an interesting test case, as they are more variably encephalized than other extant primate clades (excluding Hominoidea). We find a high degree of phylogenetic signal in the data for diet quality, endocranial volume and body size. Controlling for phylogenetic effects, we find no significant correlation between relative diet quality and relative endocranial volume. Thus, diet quality fails to account for differences in platyrrhine encephalization. One taxon in particular, Brachyteles, violates predictions made by ETH in having a large brain and low-quality diet. Dietary reconstructions of stem platyrrhines further indicate that a relatively high-quality diet was probably in place prior to increases in encephalization. Therefore, it is unlikely that a shift in diet quality was a primary constraint release for encephalization in platyrrhines and, by extrapolation, humans.
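The test described here amounts to asking whether relative (size-adjusted) endocranial volume covaries with relative diet quality. The sketch below uses invented values and ordinary least squares as a stand-in; the study itself controlled for phylogeny, which this sketch does not attempt.

```python
# Sketch with invented values (not the study's data or its phylogenetic method):
# regress log endocranial volume and log diet quality on log body mass, then
# correlate the residuals ("relative" brain size vs "relative" diet quality).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n = 20                                                      # hypothetical taxa
log_body_mass = rng.uniform(5.5, 9.5, n)                    # log grams, synthetic
log_ecv = 0.75 * log_body_mass + rng.normal(0, 0.15, n)     # log cm^3, synthetic
diet_quality = rng.uniform(100, 300, n)                     # diet-quality index, synthetic

def residuals(y, x):
    """Residuals of y after an ordinary least-squares fit on x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

rel_ecv = residuals(log_ecv, log_body_mass)
rel_diet = residuals(np.log(diet_quality), log_body_mass)
r, p = pearsonr(rel_ecv, rel_diet)
print(f"relative ECV vs relative diet quality: r = {r:.2f}, p = {p:.2f}")
```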
Abstract:
Saccadic eye movements can be elicited by more than one type of sensory stimulus. This implies substantial transformations of signals originating in different sense organs as they reach a common motor output pathway. In this study, we compared the prevalence and magnitude of auditory- and visually evoked activity in a structure implicated in oculomotor processing, the primate frontal eye fields (FEF). We recorded from 324 single neurons while 2 monkeys performed delayed saccades to visual or auditory targets. We found that 64% of FEF neurons were active on presentation of auditory targets and 87% were active during auditory-guided saccades, compared with 75% and 84% for visual targets and saccades. As saccade onset approached, the average level of population activity in the FEF became indistinguishable on visual and auditory trials. FEF activity was better correlated with the movement vector than with the target location for both modalities. In summary, the large proportion of auditory-responsive neurons in the FEF, the similarity between visual and auditory activity levels at the time of the saccade, and the strong correlation between the activity and the saccade vector suggest that auditory signals are tailored to roughly match the strength of visual signals in the FEF, facilitating access to a common motor output pathway.
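The comparison between the movement vector and the target location is easiest to see when initial eye position varies across trials, so that the two dissociate. The sketch below is a toy simulation rather than the recorded data; the tuning slope and noise are assumptions, and it simply shows how the two correlations would be compared for a neuron whose rate follows the saccade vector.

```python
# Toy simulation (assumed tuning, not the recorded neurons): when initial eye
# position varies, a rate that follows the saccade vector correlates better
# with the movement vector than with the head-centered target location.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(5)
n_trials = 200
target_loc = rng.uniform(-20, 20, n_trials)     # degrees, head-centered target
initial_eye = rng.uniform(-10, 10, n_trials)    # degrees, fixation position
movement_vec = target_loc - initial_eye         # saccade vector needed
rate = 30 + 1.5 * movement_vec + rng.normal(0, 5, n_trials)   # spikes/s, synthetic

r_move, _ = pearsonr(rate, movement_vec)
r_targ, _ = pearsonr(rate, target_loc)
print(f"rate vs movement vector: r = {r_move:.2f}")
print(f"rate vs target location: r = {r_targ:.2f}")
```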
Abstract:
Animals communicating via scent often deposit composite signals that incorporate odorants from multiple sources; however, the function of mixing chemical signals remains understudied. We tested both a 'multiple-messages' and a 'fixative' hypothesis of composite olfactory signalling, which, respectively, posit that mixing scents functions to increase information content or prolong signal longevity. Our subjects, adult male ring-tailed lemurs (Lemur catta), have a complex scent-marking repertoire, involving volatile antebrachial (A) secretions, deposited pure or after being mixed with a squalene-rich paste exuded from brachial (B) glands. Using behavioural bioassays, we examined recipient responses to odorants collected from conspecific strangers. We concurrently presented pure A, pure B and mixed A + B secretions, in fresh or decayed conditions. Lemurs preferentially responded to mixed over pure secretions, their interest increasing and shifting over time, from sniffing and countermarking fresh mixtures, to licking and countermarking decayed mixtures. Substituting synthetic squalene (S), a well-known fixative, for B secretions did not replicate prior results: B secretions, which contain additional chemicals that probably encode salient information, were preferred over pure S. Whereas support for the 'multiple-messages' hypothesis underscores the unique contribution from each of an animal's various secretions, support for the 'fixative' hypothesis highlights the synergistic benefits of composite signals.
Abstract:
As we look around a scene, we perceive it as continuous and stable even though each saccadic eye movement changes the visual input to the retinas. How the brain achieves this perceptual stabilization is unknown, but a major hypothesis is that it relies on presaccadic remapping, a process in which neurons shift their visual sensitivity to a new location in the scene just before each saccade. This hypothesis is difficult to test in vivo because complete, selective inactivation of remapping is currently intractable. We tested it in silico with a hierarchical, sheet-based neural network model of the visual and oculomotor system. The model generated saccadic commands to move a video camera abruptly. Visual input from the camera and internal copies of the saccadic movement commands, or corollary discharge, converged at a map-level simulation of the frontal eye field (FEF), a primate brain area known to receive such inputs. FEF output was combined with eye position signals to yield a suitable coordinate frame for guiding arm movements of a robot. Our operational definition of perceptual stability was "useful stability," quantified as continuously accurate pointing to a visual object despite camera saccades. During training, the emergence of useful stability was correlated tightly with the emergence of presaccadic remapping in the FEF. Remapping depended on corollary discharge but its timing was synchronized to the updating of eye position. When coupled to predictive eye position signals, remapping served to stabilize the target representation for continuously accurate pointing. Graded inactivations of pathways in the model replicated, and helped to interpret, previous in vivo experiments. The results support the hypothesis that visual stability requires presaccadic remapping, provide explanations for the function and timing of remapping, and offer testable hypotheses for in vivo studies. We conclude that remapping allows for seamless coordinate frame transformations and quick actions despite visual afferent lags. With visual remapping in place for behavior, it may be exploited for perceptual continuity.
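A stripped-down version of the remapping computation can make the idea concrete. The sketch below is far simpler than the paper's sheet-based network: a 1-D retinotopic map is shifted by a corollary discharge of the upcoming saccade, and the shifted map, paired with the updated eye position, keeps decoding the same world location (the "useful stability" of the abstract). The map size, tuning width and decoder are assumptions.

```python
# Conceptual sketch (not the paper's model): presaccadic remapping as a shift
# of a 1-D retinotopic map by the corollary discharge of the upcoming saccade.
import numpy as np

n_units = 41                                    # units tiling -20..+20 degrees
prefs = np.linspace(-20, 20, n_units)           # preferred retinal positions

def visual_response(target_retinal_pos, sigma=3.0):
    """Gaussian population activity centered on the target's retinal position."""
    return np.exp(-0.5 * ((prefs - target_retinal_pos) / sigma) ** 2)

def remap(activity, saccade_vector):
    """Shift the map by the corollary discharge of the planned saccade."""
    shift_units = int(round(saccade_vector / (prefs[1] - prefs[0])))
    return np.roll(activity, -shift_units)

def decode_world_position(activity, eye_position):
    """Read out the peak retinal position and add the eye position."""
    retinal = prefs[np.argmax(activity)]
    return retinal + eye_position

eye_before, saccade = 0.0, 10.0                 # degrees
target_world = 5.0
act = visual_response(target_world - eye_before)
print("before saccade:", decode_world_position(act, eye_before))
# Remap with corollary discharge, then pair with the updated eye position:
act_remapped = remap(act, saccade)
print("after remapping:", decode_world_position(act_remapped, eye_before + saccade))
```

Both printed values point to the same world location (5 degrees), which is the operational sense in which remapping plus an updated eye position signal yields continuously accurate pointing across a saccade.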
Abstract:
For primates and other arboreal mammals, adopting suspensory locomotion represents one of the strategies an animal can use to prevent toppling off a thin support during arboreal movement and foraging. While numerous studies have reported the incidence of suspensory locomotion in a broad phylogenetic sample of mammals, little research has explored what mechanical transitions must occur in order for an animal to successfully adopt suspensory locomotion. Additionally, many primate species are capable of adopting a highly specialized form of suspensory locomotion referred to as arm-swinging, but few scenarios have been posited to explain how arm-swinging initially evolved. This study takes a comparative experimental approach to explore the mechanics of below branch quadrupedal locomotion in primates and other mammals to determine whether above and below branch quadrupedal locomotion represent neuromuscular mirrors of each other, and whether the patterns of below branch quadrupedal locomotion are similar across taxa. Also, this study explores whether the flexible coupling between the forelimb and hindlimb observed in primates is a uniquely primate feature, and investigates the possibility that this mechanism could be responsible for the evolution of arm-swinging.
To address these research goals, kinetic, kinematic, and spatiotemporal gait variables were collected from five species of primate (Cebus capucinus, Daubentonia madagascariensis, Lemur catta, Propithecus coquereli, and Varecia variegata) walking quadrupedally above and below branches. Data from these primate species were compared to data collected from three species of non-primate mammals (Choloepus didactylus, Pteropus vampyrus, and Desmodus rotundus) and to three species of arm-swinging primate (Hylobates moloch, Ateles fusciceps, and Pygathrix nemaeus) to determine how varying forms of suspensory locomotion relate to each other and across taxa.
From the data collected in this study it is evident that the specialized gait characteristics present during above branch quadrupedal locomotion in primates are not observed when walking below branches. Instead, gait mechanics closely replicate the characteristic walking patterns of non-primate mammals, with the exception that primates demonstrate an altered limb loading pattern during below branch quadrupedal locomotion, in which the forelimb becomes the primary propulsive and weight-bearing limb; a pattern similar to what is observed during arm-swinging. It is likely that below branch quadrupedal locomotion represents a “mechanical release” from the challenges of moving on top of thin arboreal supports. Additionally, it is possible that arm-swinging evolved from an anatomically generalized arboreal primate that began to forage and locomote below branches. During these suspensory bouts, weight would have been shifted away from the hindlimbs towards the forelimbs, and as the frequency of these bouts increased, reliance on the forelimb as the sole form of weight support would have also increased. This form of functional decoupling may have released the hindlimbs from their weight-bearing role during suspensory locomotion, and eventually arm-swinging would have replaced below branch quadrupedal locomotion as the primary mode of suspensory locomotion observed in some primate species. This study provides the first experimental evidence supporting the hypothetical link between below branch quadrupedal locomotion and arm-swinging in primates.
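The limb-loading shift described above is typically quantified from single-limb force traces, for example as each limb's share of the vertical impulse over a support phase. The sketch below uses stylized synthetic traces and hypothetical peak forces, not the kinetic data collected in this study, to show how such a share would be computed.

```python
# Sketch with synthetic force traces (assumed peaks and timing, not the
# collected data): compare forelimb and hindlimb shares of vertical impulse.
import numpy as np

fs = 1000                                   # force sampling rate, Hz (assumed)
t = np.arange(0, 0.5, 1 / fs)               # one 0.5-s support phase

def half_sine_force(peak_bw):
    """Stylized single-limb vertical force trace, in multiples of body weight."""
    return peak_bw * np.sin(np.pi * t / t[-1])

# Hypothetical peaks: forelimb loaded more heavily below branch than hindlimb.
forelimb = half_sine_force(0.7)
hindlimb = half_sine_force(0.4)

impulse_fore = forelimb.sum() / fs          # approximate time integral
impulse_hind = hindlimb.sum() / fs
share = impulse_fore / (impulse_fore + impulse_hind)
print(f"forelimb share of vertical impulse: {share:.0%}")
```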
Abstract:
Primate species typically differ from other mammals in having bony canals that enclose the branches of the internal carotid artery (ICA) as they pass through the middle ear. The presence and relative size of these canals varies among major primate clades. As a result, differences in the anatomy of the canals for the promontorial and stapedial branches of the ICA have been cited as evidence of either haplorhine or strepsirrhine affinities among otherwise enigmatic early fossil euprimates. Here we use micro X-ray computed tomography to compile the largest quantitative dataset on ICA canal sizes. The data suggest greater variation of the ICA canals within some groups than has been previously appreciated. For example, Lepilemur and Avahi differ from most other lemuriforms in having a larger promontorial canal than stapedial canal. Furthermore, various lemurids are intraspecifically variable in relative canal size, with the promontorial canal being larger than the stapedial canal in some individuals but not others. In species where the promontorial artery supplies the brain with blood, the size of the promontorial canal is significantly correlated with endocranial volume (ECV). Among species with alternate routes of encephalic blood supply, the promontorial canal is highly reduced relative to ECV, and correlated with both ECV and cranium size. Ancestral state reconstructions incorporating data from fossils suggest that the last common ancestor of living primates had promontorial and stapedial canals that were similar to each other in size and large relative to ECV. We conclude that the plesiomorphic condition for crown primates is to have a patent promontorial artery supplying the brain and a patent stapedial artery for various non-encephalic structures. This inferred ancestral condition is exhibited by treeshrews and most early fossil euprimates, while extant primates exhibit reduction in one canal or another. The only early fossils deviating from this plesiomorphic condition are Adapis parisiensis with a reduced promontorial canal, and Rooneyia and Mahgarita with reduced stapedial canals.
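The reported correlation between promontorial canal size and endocranial volume is an allometric scaling relationship, usually fit on log-transformed values. The sketch below uses invented numbers rather than the CT measurements; it only shows how such a log-log fit would be computed and summarized.

```python
# Sketch with invented values (not the CT dataset): log-log regression of
# promontorial canal size against endocranial volume across species.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(11)
n = 25                                                     # hypothetical species
log_ecv = rng.uniform(0.5, 5.5, n)                         # log endocranial volume (cm^3)
log_canal = 0.6 * log_ecv - 2.0 + rng.normal(0, 0.2, n)    # log canal size, synthetic

fit = linregress(log_ecv, log_canal)
print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue ** 2:.2f}, p = {fit.pvalue:.1e}")
```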