6 results for Musical perception

at Bucknell University Digital Commons - Pennsylvania - USA


Relevance:

40.00%

Publisher:

Abstract:

Neuropsychological studies have suggested that imagery processes may be mediated by neuronal mechanisms similar to those used in perception. To test this hypothesis, and to explore the neural basis for song imagery, 12 normal subjects were scanned using the water bolus method to measure cerebral blood flow (CBF) during the performance of three tasks. In the control condition subjects saw pairs of words on each trial and judged which word was longer. In the perceptual condition subjects also viewed pairs of words, this time drawn from a familiar song; simultaneously they heard the corresponding song, and their task was to judge the change in pitch of the two cued words within the song. In the imagery condition, subjects performed precisely the same judgment as in the perceptual condition, but with no auditory input. Thus, to perform the imagery task correctly an internal auditory representation must be accessed. Paired-image subtraction of the resulting pattern of CBF, together with matched MRI for anatomical localization, revealed that both perceptual and imagery tasks produced similar patterns of CBF changes, as compared to the control condition, in keeping with the hypothesis. More specifically, both perceiving and imagining songs are associated with bilateral neuronal activity in the secondary auditory cortices, suggesting that processes within these regions underlie the phenomenological impression of imagined sounds. Other CBF foci elicited in both tasks include areas in the left and right frontal lobes and in the left parietal lobe, as well as the supplementary motor area. This latter region implicates covert vocalization as one component of musical imagery. Direct comparison of imagery and perceptual tasks revealed CBF increases in the inferior frontal polar cortex and right thalamus. We speculate that this network of regions may be specifically associated with retrieval and/or generation of auditory information from memory.
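As an illustration only (hypothetical arrays and toy numbers, not the authors' PET pipeline), the paired-image subtraction step amounts to averaging per-subject task-minus-control difference images:

```python
import numpy as np

# Minimal sketch of paired-image subtraction: average the per-subject
# task-minus-control CBF difference images. Real PET analyses also involve
# spatial normalization, smoothing, and statistical thresholding.
def paired_subtraction(task_cbf: np.ndarray, control_cbf: np.ndarray) -> np.ndarray:
    """Mean task-minus-control difference image across subjects (axis 0)."""
    return (task_cbf - control_cbf).mean(axis=0)

# Toy data standing in for 12 subjects' imagery and control scans (4x4x4 voxels).
rng = np.random.default_rng(0)
imagery = rng.normal(50.0, 5.0, size=(12, 4, 4, 4))
control = rng.normal(48.0, 5.0, size=(12, 4, 4, 4))
print(paired_subtraction(imagery, control).shape)  # (4, 4, 4)
```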

Relevance:

30.00%

Publisher:

Abstract:

We used fMRI to investigate the neuronal correlates of encoding and recognizing heard and imagined melodies. Ten participants were shown lyrics of familiar verbal tunes; they either heard the tune along with the lyrics, or they had to imagine it. In a subsequent surprise recognition test, they had to identify the titles of tunes that they had heard or imagined earlier. The functional data showed substantial overlap during melody perception and imagery, including secondary auditory areas. During imagery compared with perception, an extended network including pFC, SMA, intraparietal sulcus, and cerebellum showed increased activity, in line with the increased processing demands of imagery. Functional connectivity of anterior right temporal cortex with frontal areas was increased during imagery compared with perception, indicating that these areas form an imagery-related network. Activity in right superior temporal gyrus and pFC was correlated with the subjective rating of imagery vividness. Similar to the encoding phase, the recognition task recruited overlapping areas, including inferior frontal cortex associated with memory retrieval, as well as left middle temporal gyrus. The results present new evidence for the cortical network underlying goal-directed auditory imagery, with a prominent role of the right pFC both for the subjective impression of imagery vividness and for on-line mental monitoring of imagery-related activity in auditory areas.
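The connectivity analysis rests on correlating activity time courses between regions. A minimal sketch of that step, with toy series standing in for the right anterior temporal and frontal regions of interest:

```python
import numpy as np

def roi_connectivity(ts_a: np.ndarray, ts_b: np.ndarray) -> float:
    """Pearson correlation between two ROI time series: one connectivity value."""
    return float(np.corrcoef(ts_a, ts_b)[0, 1])

# Toy time courses standing in for right anterior temporal and frontal ROIs;
# the frontal series is partly coupled to the temporal one.
rng = np.random.default_rng(1)
temporal = rng.normal(size=200)
frontal = 0.6 * temporal + rng.normal(scale=0.8, size=200)
print(f"connectivity r = {roi_connectivity(temporal, frontal):.2f}")
```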

Relevance:

30.00%

Publisher:

Abstract:

We studied the emotional responses by musicians to familiar classical music excerpts both when the music was sounded and when it was imagined. We used continuous response methodology to record response profiles for the dimensions of valence and arousal simultaneously, and then on the single dimension of emotionality. The response profiles were compared using cross-correlation analysis, and an analysis of responses to musical feature turning points, which isolate instances of change in musical features thought to influence valence and arousal responses. We found strong similarity in the use of the emotionality and arousal scales across the stimuli, regardless of condition (imagined or sounded). A majority of participants were able to create emotional response profiles while imagining the music that were similar in timing to the response profiles created while listening to the sounded music. We conclude that similar mechanisms may be involved in the processing of emotion in music when the music is sounded and when it is imagined.
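The cross-correlation comparison of continuous response profiles can be sketched as follows; the profiles, lag range, and sampling rate here are invented for illustration and are not the study's analysis code:

```python
import numpy as np

def lagged_correlation(a: np.ndarray, b: np.ndarray, max_lag: int) -> tuple[int, float]:
    """Best Pearson correlation between two response profiles over a range of lags."""
    best_lag, best_r = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        x = a[lag:] if lag >= 0 else a[:lag]          # overlapping segment of profile a
        y = b[:len(b) - lag] if lag >= 0 else b[-lag:]  # matching segment of profile b
        r = np.corrcoef(x, y)[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# Toy emotionality profiles for heard vs. imagined versions of one excerpt:
# the imagined profile is a slightly delayed, noisier copy of the heard one.
rng = np.random.default_rng(2)
heard = np.cumsum(rng.normal(size=120))
imagined = np.concatenate([heard[:3], heard[:-3]]) + rng.normal(scale=0.5, size=120)
print(lagged_correlation(heard, imagined, max_lag=10))
```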

Relevance:

30.00%

Publisher:

Abstract:

The generality of findings implicating secondary auditory areas in auditory imagery was tested by using a timbre imagery task with fMRI. Another aim was to test whether activity in supplementary motor area (SMA) seen in prior studies might have been related to subvocalization. Participants with moderate musical background were scanned while making similarity judgments about the timbre of heard or imagined musical instrument sounds. The critical control condition was a visual imagery task. The pattern of judgments in perceived and imagined conditions was similar, suggesting that perception and imagery access similar cognitive representations of timbre. As expected, judgments of heard timbres, relative to the visual imagery control, activated primary and secondary auditory areas with some right-sided asymmetry. Timbre imagery also activated secondary auditory areas relative to the visual imagery control, although less strongly, in accord with previous data. Significant overlap was observed in these regions between perceptual and imagery conditions. Because the visual control task resulted in deactivation of auditory areas relative to a silent baseline, we interpret the timbre imagery effect as a reversal of that deactivation. Despite the lack of an obvious subvocalization component to timbre imagery, some activity in SMA was observed, suggesting that SMA may have a more general role in imagery beyond any motor component.
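One simple way to quantify how similar the two conditions' judgment patterns are is to correlate their pairwise similarity matrices; the ratings below are made up, and the sketch shows only the general idea, not the study's actual analysis:

```python
import numpy as np

def judgment_agreement(sim_heard: np.ndarray, sim_imagined: np.ndarray) -> float:
    """Correlate the upper triangles of two pairwise timbre-similarity matrices."""
    iu = np.triu_indices_from(sim_heard, k=1)  # each instrument pair counted once
    return float(np.corrcoef(sim_heard[iu], sim_imagined[iu])[0, 1])

# Made-up similarity ratings (1-7) for six instruments, heard vs. imagined.
rng = np.random.default_rng(3)
base = rng.uniform(1, 7, size=(6, 6))
heard = (base + base.T) / 2                                # symmetric ratings
imagined = heard + rng.normal(scale=0.5, size=(6, 6))
imagined = (imagined + imagined.T) / 2
print(f"agreement r = {judgment_agreement(heard, imagined):.2f}")
```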

Relevance:

30.00%

Publisher:

Abstract:

The two modes most widely used in Western music today convey opposite moods—a distinction that nonmusicians and even young children are able to make. However, the current studies provide evidence that, despite a strong link between mode and affect, mode perception is problematic. Nonmusicians found mode discrimination to be harder than discrimination of other melodic features, and they were not able to accurately classify major and minor melodies with these labels. Although nonmusicians were able to classify major and minor melodies using affective labels, they performed at chance in mode discrimination. Training, in the form of short lessons given to nonmusicians and the natural musical experience of musicians, improved performance, but not to ceiling levels. Tunes with high note density were classified as major, and tunes with low note density as minor, even though these features were actually unrelated in the experimental material. Although these findings provide support for the importance of mode in the perception of emotion, they clearly indicate that these mode perceptions are inaccurate, even in trained individuals, without the assistance of affective labeling.
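The note-density confound amounts to listeners applying a surface heuristic. A small illustration of that heuristic (the density threshold is hypothetical, not taken from the study):

```python
def note_density(onsets_s: list[float], duration_s: float) -> float:
    """Notes per second: the surface cue listeners appeared to rely on."""
    return len(onsets_s) / duration_s

def naive_mode_guess(onsets_s: list[float], duration_s: float,
                     threshold: float = 2.5) -> str:
    """The invalid shortcut suggested by the results: dense -> major, sparse -> minor.
    The threshold here is arbitrary and purely illustrative."""
    return "major" if note_density(onsets_s, duration_s) > threshold else "minor"

# A sparse tune is guessed "minor" regardless of its actual mode.
sparse_onsets = [0.0, 0.8, 1.6, 2.4, 3.2]
print(note_density(sparse_onsets, 4.0), naive_mode_guess(sparse_onsets, 4.0))
```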

Relevance:

30.00%

Publisher:

Abstract:

Four experiments investigated perception of major and minor thirds whose component tones were sounded simultaneously. Effects akin to categorical perception of speech sounds were found. In the first experiment, musicians demonstrated relatively sharp category boundaries in identification, and discrimination peaks near the boundary, for an interval continuum in which the bottom note was always an F and the top note varied from A to A flat in seven equal logarithmic steps. Nonmusicians showed these effects only to a small extent. The musicians showed higher than predicted discrimination performance overall, and reaction time increases at category boundaries. In the second experiment, musicians failed to consistently identify or discriminate thirds which varied in absolute pitch but retained the proper interval ratio. In the last two experiments, using selective adaptation, consistent shifts were found in both identification and discrimination, similar to those found in speech experiments. Manipulations of the adapting and test stimuli showed that the mechanism underlying the effect appears to be centrally mediated and confined to a frequency-specific level. A multistage model of interval perception, in which the first stages deal only with specific pitches, may account for the results.
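The stimulus continuum described above is equal logarithmic spacing between two frequencies. A sketch assuming equal-tempered reference pitches, which the abstract does not specify:

```python
import numpy as np

# Assumed equal-tempered reference pitches; the study's exact frequencies are
# not given in the abstract. The bottom note F4 is fixed, and the top note
# sweeps from A4 (major third) down to A-flat4 (minor third) in seven equal
# logarithmic steps.
F4, A4, A_FLAT4 = 349.23, 440.00, 415.30

def log_continuum(f_start: float, f_end: float, n_steps: int = 7) -> np.ndarray:
    """Frequencies spaced in equal logarithmic (equal musical-interval) steps."""
    return np.geomspace(f_start, f_end, n_steps)

for f in log_continuum(A4, A_FLAT4):
    print(f"{f:7.2f} Hz  ({1200 * np.log2(f / F4):6.1f} cents above F4)")
```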