982 results for Musical perception
Abstract:
Graduate Program in Music - IA
Abstract:
This study examined whether emotions perceived during music listening influence time perception. Musicians and nonmusicians performed listening tasks with excerpts from the Western classical repertoire, each 20 seconds long, followed by tasks in which they matched each excerpt to standard durations ranging from 16 to 24 seconds. The excerpts were representative of one of the emotional categories Happiness, Sadness, Serenity, or Fear/Anger. An analysis of variance showed that, whereas nonmusicians underestimated the duration of at least one excerpt from each emotional category, musicians underestimated all of the sad excerpts, which are associated with low arousal and negative affective valence.
Abstract:
We used fMRI to investigate the neuronal correlates of encoding and recognizing heard and imagined melodies. Ten participants were shown lyrics of familiar verbal tunes; they either heard the tune along with the lyrics, or they had to imagine it. In a subsequent surprise recognition test, they had to identify the titles of tunes that they had heard or imagined earlier. The functional data showed substantial overlap during melody perception and imagery, including secondary auditory areas. During imagery compared with perception, an extended network including pFC, SMA, intraparietal sulcus, and cerebellum showed increased activity, in line with the increased processing demands of imagery. Functional connectivity of anterior right temporal cortex with frontal areas was increased during imagery compared with perception, indicating that these areas form an imagery-related network. Activity in right superior temporal gyrus and pFC was correlated with the subjective rating of imagery vividness. Similar to the encoding phase, the recognition task recruited overlapping areas, including inferior frontal cortex associated with memory retrieval, as well as left middle temporal gyrus. The results present new evidence for the cortical network underlying goal-directed auditory imagery, with a prominent role of the right pFC both for the subjective impression of imagery vividness and for on-line mental monitoring of imagery-related activity in auditory areas.
Abstract:
We studied the emotional responses by musicians to familiar classical music excerpts both when the music was sounded, and when it was imagined. We used continuous response methodology to record response profiles for the dimensions of valence and arousal simultaneously and then on the single dimension of emotionality. The response profiles were compared using cross-correlation analysis, and an analysis of responses to musical feature turning points, which isolate instances of change in musical features thought to influence valence and arousal responses. We found strong similarity between the use of an emotionality arousal scale across the stimuli, regardless of condition (imagined or sounded). A majority of participants were able to create emotional response profiles while imagining the music, which were similar in timing to the response profiles created while listening to the sounded music. We conclude that similar mechanisms may be involved in the processing of emotion in music when the music is sounded and when imagined.
Abstract:
The generality of findings implicating secondary auditory areas in auditory imagery was tested by using a timbre imagery task with fMRI. Another aim was to test whether activity in supplementary motor area (SMA) seen in prior studies might have been related to subvocalization. Participants with moderate musical background were scanned while making similarity judgments about the timbre of heard or imagined musical instrument sounds. The critical control condition was a visual imagery task. The pattern of judgments in perceived and imagined conditions was similar, suggesting that perception and imagery access similar cognitive representations of timbre. As expected, judgments of heard timbres, relative to the visual imagery control, activated primary and secondary auditory areas with some right-sided asymmetry. Timbre imagery also activated secondary auditory areas relative to the visual imagery control, although less strongly, in accord with previous data. Significant overlap was observed in these regions between perceptual and imagery conditions. Because the visual control task resulted in deactivation of auditory areas relative to a silent baseline, we interpret the timbre imagery effect as a reversal of that deactivation. Despite the lack of an obvious subvocalization component to timbre imagery, some activity in SMA was observed, suggesting that SMA may have a more general role in imagery beyond any motor component.
Abstract:
The two modes most widely used in Western music today convey opposite moods—a distinction that nonmusicians and even young children are able to make. However, the current studies provide evidence that, despite a strong link between mode and affect, mode perception is problematic. Nonmusicians found mode discrimination to be harder than discrimination of other melodic features, and they were not able to accurately classify major and minor melodies with these labels. Although nonmusicians were able to classify major and minor melodies using affective labels, they performed at chance in mode discrimination. Training, in the form of short lessons given to nonmusicians and the natural musical experience of musicians, improved performance, but not to ceiling levels. Tunes with high note density were classified as major, and tunes with low note density as minor, even though these features were actually unrelated in the experimental material. Although these findings provide support for the importance of mode in the perception of emotion, they clearly indicate that these mode perceptions are inaccurate, even in trained individuals, without the assistance of affective labeling.
Abstract:
Four experiments investigated perception of major and minor thirds whose component tones were sounded simultaneously. Effects akin to categorical perception of speech sounds were found. In the first experiment, musicians demonstrated relatively sharp category boundaries in identification and peaks near the boundary in discrimination tasks of an interval continuum where the bottom note was always an F and the top note varied from A to A flat in seven equal logarithmic steps. Nonmusicians showed these effects only to a small extent. The musicians showed higher than predicted discrimination performance overall, and reaction time increases at category boundaries. In the second experiment, musicians failed to consistently identify or discriminate thirds which varied in absolute pitch, but retained the proper interval ratio. In the last two experiments, using selective adaptation, consistent shifts were found in both identification and discrimination, similar to those found in speech experiments. Manipulations of adapting and test showed that the mechanism underlying the effect appears to be centrally mediated and confined to a frequency-specific level. A multistage model of interval perception, where the first stages deal only with specific pitches may account for the results.
Abstract:
Speech coding might have an impact on the music perception of cochlear implant users. This questionnaire study compares the musical activities and perception of postlingually deafened cochlear implant users with three different coding strategies (CIS, ACE, SPEAK), using the Munich Music Questionnaire. Overall, the self-reported music perception of CIS, SPEAK, and ACE users did not differ substantially.
Abstract:
Most previous neurophysiological studies evoked emotions by presenting visual stimuli. Models of the emotion circuits in the brain have for the most part ignored emotions arising from musical stimuli. To our knowledge, this is the first emotion brain study which examined the influence of visual and musical stimuli on brain processing. Highly arousing pictures of the International Affective Picture System and classical musical excerpts were chosen to evoke the three basic emotions of happiness, sadness and fear. The emotional stimuli modalities were presented for 70 s either alone or combined (congruent) in a counterbalanced and random order. Electroencephalogram (EEG) Alpha-Power-Density, which is inversely related to neural electrical activity, in 30 scalp electrodes from 24 right-handed healthy female subjects, was recorded. In addition, heart rate (HR), skin conductance responses (SCR), respiration, temperature and psychometrical ratings were collected. Results showed that the experienced quality of the presented emotions was most accurate in the combined conditions, intermediate in the picture conditions and lowest in the sound conditions. Furthermore, both the psychometrical ratings and the physiological involvement measurements (SCR, HR, Respiration) were significantly increased in the combined and sound conditions compared to the picture conditions. Finally, repeated measures ANOVA revealed the largest Alpha-Power-Density for the sound conditions, intermediate for the picture conditions, and lowest for the combined conditions, indicating the strongest activation in the combined conditions in a distributed emotion and arousal network comprising frontal, temporal, parietal and occipital neural structures. Summing up, these findings demonstrate that music can markedly enhance the emotional experience evoked by affective pictures.
Abstract:
Internship report presented to the Escola Superior de Artes Aplicadas of the Instituto Politécnico de Castelo Branco in fulfillment of the requirements for the Master's degree in Music Teaching – Music Theory Training and Ensemble Music (Ensino de Música – Formação Musical e Música de Conjunto).
Abstract:
Drawing from ethnographic, empirical, and historical/cultural perspectives, we examine the extent to which visual aspects of music contribute to the communication that takes place between performers and their listeners. First, we introduce a framework for understanding how media and genres shape aural and visual experiences of music. Second, we present case studies of two performances, and describe the relation between visual and aural aspects of performance. Third, we report empirical evidence that visual aspects of performance reliably influence perceptions of musical structure (pitch related features) and affective interpretations of music. Finally, we trace new and old media trajectories of aural and visual dimensions of music, and highlight how our conceptions, perceptions and appreciation of music are intertwined with technological innovation and media deployment strategies.