3 results for Syllable

at Duke University


Relevance:

10.00%

Publisher:

Abstract:

Perceiving or producing complex vocalizations such as speech and birdsong requires the coordinated activity of neuronal populations, and these activity patterns can vary over space and time. How learned communication signals are represented by populations of sensorimotor neurons essential to vocal perception and production remains poorly understood. Using a combination of two-photon calcium imaging, intracellular electrophysiological recording, and retrograde tracing in anesthetized adult male zebra finches (Taeniopygia guttata), I addressed how the bird's own song and its component syllables are represented by the spatiotemporal activity patterns of two spatially intermingled populations of projection neurons (PNs) in HVC, a sensorimotor area required for song perception and production. These experiments revealed that neighboring PNs can respond at markedly different times to song playback and that different syllables activate spatially intermingled HVC PNs within a small region. Moreover, noise correlation analysis revealed enhanced functional connectivity between PNs that respond most strongly to the same syllable and also provided evidence of a spatial gradient of functional connectivity specific to PNs that project to the song motor nucleus (i.e., HVCRA cells). These findings support a model in which syllabic and temporal features of song are represented by spatially intermingled PNs functionally organized into cell- and syllable-type networks.
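
A minimal sketch of the noise-correlation computation described above, assuming a hypothetical matrix of trial-by-trial response amplitudes; the array shape, variable names, and simulated data are illustrative, not taken from the study:

import numpy as np

def noise_correlations(responses: np.ndarray) -> np.ndarray:
    """Pairwise noise correlations: correlate each neuron's trial-to-trial
    fluctuations after subtracting its mean (signal) response."""
    residuals = responses - responses.mean(axis=0, keepdims=True)
    return np.corrcoef(residuals, rowvar=False)  # shape (n_neurons, n_neurons)

# Simulated example: 20 playback trials of one syllable, 5 neurons.
rng = np.random.default_rng(0)
responses = rng.normal(size=(20, 5))
corr = noise_correlations(responses)
print(corr.shape)  # (5, 5); off-diagonal entries are the pairwise noise correlations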

Relevance:

10.00%

Publisher:

Abstract:

Undergraduates were asked to generate a name for a hypothetical new exemplar of a category. They produced names that had the same number of syllables, the same endings, and the same types of word stems as existing exemplars of that category. In addition, novel exemplars, each consisting of a nonsense-syllable root and a prototypical ending, were accurately assigned to categories. The data demonstrate the abstraction and use of surface properties of words.
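
A minimal sketch of how such surface properties might be quantified, assuming illustrative example words and crude heuristics (a vowel-cluster syllable count and a fixed-length ending match); neither heuristic is the study's actual coding scheme:

import re

def count_syllables(word: str) -> int:
    """Rough syllable count: number of vowel clusters in the word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def shares_surface_form(novel: str, exemplar: str, ending_len: int = 2) -> bool:
    """True if the novel name matches the exemplar's syllable count and ending."""
    return (count_syllables(novel) == count_syllables(exemplar)
            and novel[-ending_len:] == exemplar[-ending_len:])

# "ranana" pairs a nonsense root with the ending of "banana":
print(shares_surface_form("ranana", "banana"))  # True: three vowel clusters each, both end in "na"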

Relevance:

10.00%

Publisher:

Abstract:

Once thought to be predominantly the domain of cortex, multisensory integration has now been found at numerous subcortical locations in the auditory pathway. Prominent ascending and descending connections within the pathway suggest that the system may use non-auditory activity to help filter incoming sounds as they first enter the ear. Active mechanisms in the periphery, particularly the outer hair cells (OHCs) of the cochlea and the middle ear muscles (MEMs), can modulate the sensitivity of other peripheral mechanisms involved in the transduction of sound into the system. Because the OHCs and MEMs are indirectly mechanically coupled to the eardrum, their motion can be recorded as acoustic signals in the ear canal. Here, we use this recording technique in three experiments that demonstrate novel multisensory interactions occurring at the level of the eardrum.

1) In the first experiment, measurements in humans and monkeys performing a saccadic eye movement task to visual targets indicate that the eardrum oscillates in conjunction with eye movements. The amplitude and phase of the eardrum movement, which we dub the Oscillatory Saccadic Eardrum Associated Response (OSEAR), depended on the direction and horizontal amplitude of the saccade and occurred in the absence of any externally delivered sounds.

2) In the second experiment, we use an audiovisual cueing task to demonstrate a dynamic change in pressure levels in the ear when a sound is expected versus when one is not. Specifically, we observe a drop in spectral power and variability from 0.1 to 4 kHz around the time the sound is expected to occur, in contrast to a slight increase in power at both lower and higher frequencies.

3) In the third experiment, we show that seeing a speaker say a syllable that is incongruent with the accompanying audio can alter the response patterns of the auditory periphery, particularly during the most relevant moments in the speech stream. These visually influenced changes may contribute to the altered percept of the speech sound.

Collectively, we presume that these findings represent the combined effect of OHCs and MEMs acting in tandem in response to various non-auditory signals in order to manipulate the receptive properties of the auditory system. These influences may have a profound, and previously unrecognized, impact on how the auditory system processes sounds, from initial sensory transduction all the way to perception and behavior. Moreover, we demonstrate that the entire auditory system is, fundamentally, a multisensory system.
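
A minimal sketch of the band-power measure referenced in the second experiment (mean spectral power between 0.1 and 4 kHz in a short analysis window), assuming a hypothetical ear-canal microphone trace; the sampling rate, window length, and simulated power drop are all assumptions:

import numpy as np
from scipy.signal import welch

def band_power(signal: np.ndarray, fs: float, lo: float = 100.0, hi: float = 4000.0) -> float:
    """Mean power spectral density between lo and hi Hz, via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 1024))
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

# Simulated comparison: a baseline window versus the window around the
# expected sound, where power in the band is reduced.
fs = 48_000
rng = np.random.default_rng(1)
baseline = rng.normal(size=fs // 2)            # 500 ms of simulated noise
expectation = 0.8 * rng.normal(size=fs // 2)   # same noise statistics, lower power
print(band_power(baseline, fs) > band_power(expectation, fs))  # True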