12 results for Musical meter and rhythm
at Bucknell University Digital Commons - Pennsylvania - USA
Abstract:
Two experiments plus a pilot investigated the role of melodic structure in short-term memory for musical notation by musicians and nonmusicians. In the pilot experiment, visually similar melodies that had been rated as either "good" or "bad" were presented briefly, followed by a 15-sec retention interval and then recall. Musicians remembered good melodies better than they remembered bad ones; nonmusicians did not distinguish between them. In the second experiment, good, bad, and random melodies were briefly presented, followed by immediate recall. The advantage of musicians over nonmusicians decreased as the melody type progressed from good to bad to random. In the third experiment, musicians and nonmusicians divided the stimulus melodies into groups. For each melody, the consistency of grouping was correlated with memory performance in the first two experiments. Evidence was found for the use of musical groupings by musicians and for the use of a simple visual strategy by nonmusicians. The nature of these musical groupings and how they may be learned are considered. The relation of this work to other studies of comprehension of symbolic diagrams is also discussed.
Abstract:
Neuropsychological studies have suggested that imagery processes may be mediated by neuronal mechanisms similar to those used in perception. To test this hypothesis, and to explore the neural basis for song imagery, 12 normal subjects were scanned using the water bolus method to measure cerebral blood flow (CBF) during the performance of three tasks. In the control condition subjects saw pairs of words on each trial and judged which word was longer. In the perceptual condition subjects also viewed pairs of words, this time drawn from a familiar song; simultaneously they heard the corresponding song, and their task was to judge the change in pitch of the two cued words within the song. In the imagery condition, subjects performed precisely the same judgment as in the perceptual condition, but with no auditory input. Thus, to perform the imagery task correctly, an internal auditory representation must be accessed. Paired-image subtraction of the resulting pattern of CBF, together with matched MRI for anatomical localization, revealed that both perceptual and imagery tasks produced similar patterns of CBF changes, as compared to the control condition, in keeping with the hypothesis. More specifically, both perceiving and imagining songs are associated with bilateral neuronal activity in the secondary auditory cortices, suggesting that processes within these regions underlie the phenomenological impression of imagined sounds. Other CBF foci elicited in both tasks include areas in the left and right frontal lobes and in the left parietal lobe, as well as the supplementary motor area. This latter region implicates covert vocalization as one component of musical imagery. Direct comparison of imagery and perceptual tasks revealed CBF increases in the inferior frontal polar cortex and right thalamus. We speculate that this network of regions may be specifically associated with retrieval and/or generation of auditory information from memory.
Abstract:
Most people intuitively understand what it means to “hear a tune in your head.” Converging evidence now indicates that auditory cortical areas can be recruited even in the absence of sound and that this corresponds to the phenomenological experience of imagining music. We discuss these findings as well as some methodological challenges. We also consider the role of core versus belt areas in musical imagery, the relation between auditory and motor systems during imagery of music performance, and practical implications of this research.
Abstract:
The authors examined the effects of age, musical experience, and characteristics of musical stimuli on a melodic short-term memory task in which participants had to recognize whether a tune was an exact transposition of another tune recently presented. Participants were musicians and nonmusicians between ages 18 and 30 or 60 and 80. In 4 experiments, the authors found that age and experience affected different aspects of the task, with experience becoming more influential when interference was provided during the task. Age and experience interacted only weakly, and neither age nor experience influenced the superiority of tonal over atonal materials. Recognition memory for the sequences did not reflect the same pattern of results as the transposition task. The implications of these results for theories of aging, experience, and music cognition are discussed.
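The transposition task described above rests on a simple structural criterion: a tune is an exact transposition of another when every successive interval is shifted by the same number of semitones, i.e., when the two interval sequences are identical. The following short Python sketch is purely illustrative (the function name and example tunes are not from the study); it states that criterion in code.

# Illustrative sketch: deciding whether one tune is an exact transposition
# of another. Tunes are given as MIDI note numbers; a tune is an exact
# transposition when the sequence of successive intervals is identical.

def is_exact_transposition(original, candidate):
    if len(original) != len(candidate):
        return False
    intervals = lambda tune: [b - a for a, b in zip(tune, tune[1:])]
    return intervals(original) == intervals(candidate)

# Hypothetical example: the first comparison tune is the original shifted up
# 3 semitones (an exact transposition); the second alters one interval.
tune = [60, 62, 64, 65, 67]                                 # C D E F G
print(is_exact_transposition(tune, [63, 65, 67, 68, 70]))   # True
print(is_exact_transposition(tune, [63, 65, 67, 69, 70]))   # False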
Abstract:
Two experiments investigated the organizing principles in memory for familiar songs. It was hypothesized that individuals do not store and remember each song in isolation; rather, there exists a rich system of relationships among tunes that can be revealed through similarity rating studies and memory tasks. One initial assumption was the division of relations among tunes into musical (e.g., tempo, rhythm) and nonmusical similarity. In Experiment I, 20 undergraduates were asked to sort 60 familiar tunes into groups according to both musical and nonmusical criteria. Clustering analyses showed clear patterns of nonmusical similarity but few instances of musical similarity. Experiment II, with 16 subjects, explored the psychological validity of the nonmusical relationships revealed in Experiment I. A speeded verification task showed that songs similar to each other are confused more often than are distantly related songs. A free-recall task showed greater clustering for closely related songs than for distantly related ones. The relationship between these studies and studies of semantic memory is discussed, as is the contribution of musical training and individual knowledge to the organization of the memory system.
Abstract:
Music consists of sound sequences that require integration over time. As we become familiar with music, associations between notes, melodies, and entire symphonic movements become stronger and more complex. These associations can become so tight that, for example, hearing the end of one album track can elicit a robust image of the upcoming track while anticipating it in total silence. Here, we study this predictive “anticipatory imagery” at various stages throughout learning and investigate activity changes in corresponding neural structures using functional magnetic resonance imaging. Anticipatory imagery (in silence) for highly familiar naturalistic music was accompanied by pronounced activity in rostral prefrontal cortex (PFC) and premotor areas. Examining changes in the neural bases of anticipatory imagery during two stages of learning conditional associations between simple melodies, however, demonstrates the importance of fronto-striatal connections, consistent with a role of the basal ganglia in “training” frontal cortex (Pasupathy and Miller, 2005). Another striking change in neural resources during learning was a shift between caudal PFC earlier to rostral PFC later in learning. Our findings regarding musical anticipation and sound sequence learning are highly compatible with studies of motor sequence learning, suggesting common predictive mechanisms in both domains.
Abstract:
In this descriptive study, we examined the influences and experiences motivating students to enter college-level music schools as reported by a population of precollegiate students auditioning for (but not yet accepted to) music education degree programs. As a follow-up to a published pilot study, this research was designed to quantify the various experiences respondents had as part of their precollege school and community programs that related to teaching and music. Results indicate a strong connection between respondents’ primary musical background and future teaching interest. The top three influential experiences were related to high school ensemble membership (band, choir, orchestra), and the individuals most influential in the decision to become a music educator were high school ensemble directors. Respondents from all four primary background groups (band, choir, orchestra, and general or other) rated private lesson teaching as their second strongest future teaching interest, just behind teaching at the high school level in their primary background. Respondents rated parents as moderately influential on their desire to become a music teacher.
Abstract:
The generality of findings implicating secondary auditory areas in auditory imagery was tested by using a timbre imagery task with fMRI. Another aim was to test whether activity in supplementary motor area (SMA) seen in prior studies might have been related to subvocalization. Participants with moderate musical background were scanned while making similarity judgments about the timbre of heard or imagined musical instrument sounds. The critical control condition was a visual imagery task. The pattern of judgments in perceived and imagined conditions was similar, suggesting that perception and imagery access similar cognitive representations of timbre. As expected, judgments of heard timbres, relative to the visual imagery control, activated primary and secondary auditory areas with some right-sided asymmetry. Timbre imagery also activated secondary auditory areas relative to the visual imagery control, although less strongly, in accord with previous data. Significant overlap was observed in these regions between perceptual and imagery conditions. Because the visual control task resulted in deactivation of auditory areas relative to a silent baseline, we interpret the timbre imagery effect as a reversal of that deactivation. Despite the lack of an obvious subvocalization component to timbre imagery, some activity in SMA was observed, suggesting that SMA may have a more general role in imagery beyond any motor component.
Abstract:
We explored the ability of older (60-80 years old) and younger (18-23 years old) musicians and nonmusicians to judge the similarity of transposed melodies varying on rhythm, mode, and/or contour (Experiment 1) and to discriminate among melodies differing only in rhythm, mode, or contour (Experiment 2). Similarity ratings did not vary greatly among groups, with tunes differing only by mode being rated as most similar. In the same/different discrimination task, musicians performed better than nonmusicians, but we found no age differences. We also found that discrimination of major from minor tunes was difficult for everyone, even for musicians. Mode is apparently a subtle dimension in music, despite its deliberate use in composition and despite people's ability to label minor as "sad" and major as "happy."
Abstract:
Two experiments explored the representation of the tonal hierarchy in Western music among older (aged 60 to 80) and younger (aged 15 to 22) musicians and nonmusicians. A probe tone technique was used: 4 notes from the major triad were presented, followed by 1 note chosen from the 12 notes of the chromatic scale. Whereas musicians had a better sense of the tonal hierarchy than nonmusicians, older adults were no worse than younger adults in differentiating the notes according to musical principles. However, older adults were more prone than younger adults to classify the notes by frequency proximity (pitch height) when proximity was made more salient, as were nonmusicians compared with musicians. With notes having ambiguous pitch height, pitch height effects disappeared among older adults but not among nonmusicians. Older adults seem to have internalized tonal structure, but they sometimes fail to inhibit less musically relevant information.
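To make the probe tone procedure concrete, here is a minimal illustrative sketch in Python (the C major example, the octave doubling in the context, and the function name are assumptions, not the study's materials): a four-note major-triad context is followed, on each trial, by one of the 12 chromatic pitch classes as the probe.

# Illustrative sketch of a probe tone trial set (not the study's stimuli):
# a four-note major-triad context (root, third, fifth, octave) followed by
# each of the 12 chromatic pitch classes as a probe.

PITCH_CLASSES = ['C', 'C#', 'D', 'D#', 'E', 'F',
                 'F#', 'G', 'G#', 'A', 'A#', 'B']

def probe_tone_trials(tonic='C'):
    root = PITCH_CLASSES.index(tonic)
    # Major triad context: root, major third (+4), perfect fifth (+7), octave.
    context = [PITCH_CLASSES[(root + step) % 12] for step in (0, 4, 7, 0)]
    return [(context, probe) for probe in PITCH_CLASSES]

for context, probe in probe_tone_trials('C'):
    print(' '.join(context), '->', probe)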
Abstract:
Four experiments investigated perception of major and minor thirds whose component tones were sounded simultaneously. Effects akin to categorical perception of speech sounds were found. In the first experiment, musicians demonstrated relatively sharp category boundaries in identification, and peaks near the boundary in discrimination, for an interval continuum in which the bottom note was always an F and the top note varied from A to A flat in seven equal logarithmic steps. Nonmusicians showed these effects only to a small extent. The musicians showed higher than predicted discrimination performance overall, and reaction time increases at category boundaries. In the second experiment, musicians failed to consistently identify or discriminate thirds that varied in absolute pitch but retained the proper interval ratio. In the last two experiments, using selective adaptation, consistent shifts were found in both identification and discrimination, similar to those found in speech experiments. Manipulations of the adapting and test stimuli showed that the mechanism underlying the effect appears to be centrally mediated and confined to a frequency-specific level. A multistage model of interval perception, in which the first stages deal only with specific pitches, may account for the results.
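The continuum described above (bottom note fixed at F, top note moving from A down to A flat in seven equal logarithmic steps) amounts to a constant frequency ratio between adjacent steps. The Python sketch below is illustrative only; the specific octave (F4) and the equal-tempered Hz values are assumptions, not values reported in the study.

import math

# Illustrative sketch: a seven-step continuum from a major third (F-A) down
# to a minor third (F-Ab) above a fixed F, with equal logarithmic spacing
# (a constant frequency ratio between adjacent steps).

F4, A4, A_FLAT4 = 349.23, 440.00, 415.30   # assumed equal-tempered frequencies in Hz
N_STEPS = 7

ratio = (A_FLAT4 / A4) ** (1 / (N_STEPS - 1))    # constant ratio between steps
for i in range(N_STEPS):
    top = A4 * ratio ** i
    cents = 1200 * math.log2(top / F4)           # interval above F, in cents
    print(f"step {i + 1}: top note {top:6.2f} Hz, interval {cents:5.1f} cents")

Running this prints intervals shrinking in equal steps from roughly 400 cents (major third) to roughly 300 cents (minor third) above F.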
Abstract:
We previously observed that mental manipulation of the pitch level or temporal organization of melodies results in functional activation in the human intraparietal sulcus (IPS), a region also associated with visuospatial transformation and numerical calculation. Two outstanding questions about these musical transformations are whether pitch and time depend on separate or common processing in IPS, and whether IPS recruitment in melodic tasks varies depending upon the degree of transformation required (as it does in mental rotation). In the present study we sought to answer these questions by applying functional magnetic resonance imaging while musicians performed closely matched mental transposition (pitch transformation) and melody reversal (temporal transformation) tasks. A voxel-wise conjunction analysis showed that in individual subjects, both tasks activated overlapping regions in bilateral IPS, suggesting that a common neural substrate subserves both types of mental transformation. Varying the magnitude of mental pitch transposition resulted in variation of IPS BOLD signal in correlation with the musical key-distance of the transposition, but not with the pitch distance, indicating that the cognitive metric relevant for this type of operation is an abstract one, well described by music-theoretic concepts. These findings support a general role for the IPS in systematically transforming auditory stimulus representations in a nonspatial context.
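The contrast the abstract draws between key-distance and pitch distance is easy to state computationally. The Python sketch below is illustrative only (the function names are hypothetical, and the mapping of key-distance to steps on the circle of fifths reflects the general music-theoretic notion, not the study's analysis): it compares the two metrics for a few transpositions.

# Illustrative sketch (not the study's analysis): two candidate metrics for
# the "size" of a transposition, pitch distance (semitones shifted) versus
# key-distance (steps around the circle of fifths between the two keys).

def pitch_distance(semitones):
    return abs(semitones) % 12

def key_distance(semitones):
    # 7 semitones = one perfect fifth, so multiplying by 7 (mod 12) maps a
    # semitone shift onto the circle of fifths; take the shorter way around.
    steps = (semitones * 7) % 12
    return min(steps, 12 - steps)

# A shift of 1 semitone is a small pitch change but a distant key (5 fifths),
# whereas a shift of 7 semitones is a larger pitch change but a near key (1 fifth).
for shift in (1, 2, 5, 7):
    print(f"+{shift} semitones: pitch distance {pitch_distance(shift)}, "
          f"key distance {key_distance(shift)}")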