15 results for MELODIES
at Bucknell University Digital Commons - Pennsylvania - USA
Abstract:
Two fMRI experiments explored the neural substrates of a musical imagery task that required manipulation of the imagined sounds: temporal reversal of a melody. Musicians were presented with the first few notes of a familiar tune (Experiment 1) or its title (Experiment 2), followed by a string of notes that was either an exact or an inexact reversal. The task was to judge whether the second string was correct or not by mentally reversing all its notes, thus requiring both maintenance and manipulation of the represented string. Both experiments showed considerable activation of the superior parietal lobe (intraparietal sulcus) during the reversal process. Ventrolateral and dorsolateral frontal cortices were also activated, consistent with the memory load required during the task. We also found weaker evidence for some activation of right auditory cortex in both studies, congruent with results from previous simpler music imagery tasks. We interpret these results in the context of other mental transformation tasks, such as mental rotation in the visual domain, which are known to recruit the intraparietal sulcus region, and we propose that this region subserves general computations that require transformations of a sensory input. Mental imagery tasks may thus have both task- or modality-specific components and components that supersede any specific code and instead reflect amodal mental manipulation.
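The judgment the musicians had to make reduces to an exact string reversal, which a minimal sketch can make concrete. Pitches here are hypothetical MIDI note numbers, not stimuli from the study:

```python
# Minimal sketch of the exact-reversal judgment in the task described above.
# Pitches are hypothetical MIDI note numbers (60 = middle C).

def is_exact_reversal(cue, probe):
    """True if the probe string is the cue played backwards, note for note."""
    return list(probe) == list(reversed(cue))

cue = [60, 62, 64, 65]            # C, D, E, F
exact = [65, 64, 62, 60]          # F, E, D, C - a correct reversal
inexact = [65, 64, 61, 60]        # one pitch altered - an incorrect reversal

print(is_exact_reversal(cue, exact))    # True
print(is_exact_reversal(cue, inexact))  # False
```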
Abstract:
Composers commonly use major or minor scales to create different moods in music. Nonmusicians show poor discrimination and classification of this musical dimension; however, they can perform these tasks if the decision is phrased as happy vs. sad. We created pairs of melodies identical except for mode; the first major or minor third or sixth was the critical note that distinguished major from minor mode. Musicians and nonmusicians judged each melody as major vs. minor or happy vs. sad. We collected ERP waveforms, triggered to the onset of the critical note. Musicians showed a late positive component (P3) to the critical note only for the minor melodies, and in both tasks. Nonmusicians could adequately classify the melodies as happy or sad but showed little evidence of processing the critical information. Major appears to be the default mode in music, and musicians and nonmusicians apparently process mode differently.
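The ERP measure used here is, at its core, epoch-and-average around the critical-note onset. A schematic sketch follows; the sampling rate, window, and signals are all simulated assumptions, not the study's data:

```python
import numpy as np

# Schematic ERP averaging: epoch the EEG around each critical-note onset
# and average across trials. All numbers are simulated assumptions.
fs = 500                                  # samples per second (assumed)
eeg = np.random.randn(60 * fs)            # one channel of simulated EEG
onsets = np.arange(2 * fs, 58 * fs, fs)   # simulated critical-note onsets (samples)

pre, post = int(0.2 * fs), int(0.8 * fs)  # 200 ms baseline, 800 ms post-onset
epochs = np.stack([eeg[t - pre:t + post] for t in onsets])
erp = epochs.mean(axis=0)                 # averaged waveform; a P3 would appear
                                          # as a positivity roughly 300-600 ms in
```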
When that tune runs through your head: A PET investigation of auditory imagery for familiar melodies
Abstract:
The present study used positron emission tomography (PET) to examine the cerebral activity pattern associated with auditory imagery for familiar tunes. Subjects either imagined the continuation of nonverbal tunes cued by their first few notes, listened to a short sequence of notes as a control task, or listened and then reimagined that short sequence. Subtraction of the activation in the control task from that in the real-tune imagery task revealed primarily right-sided activation in frontal and superior temporal regions, plus supplementary motor area (SMA). Isolating retrieval of the real tunes by subtracting activation in the reimagine task from that in the real-tune imagery task revealed activation primarily in right frontal areas and right superior temporal gyrus. Subtraction of activation in the control condition from that in the reimagine condition, intended to capture imagery of unfamiliar sequences, revealed activation in SMA, plus some left frontal regions. We conclude that areas of right auditory association cortex, together with right and left frontal cortices, are implicated in imagery for familiar tunes, in accord with previous behavioral, lesion and PET data. Retrieval from musical semantic memory is mediated by structures in the right frontal lobe, in contrast to results from previous studies implicating left frontal areas for all semantic retrieval. The SMA seems to be involved specifically in image generation, implicating a motor code in this process.
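The subtraction method behind these contrasts is voxel-wise: control-condition activity is subtracted from imagery-condition activity, and surviving differences are attributed to imagery. A toy sketch with simulated volumes and an arbitrary threshold standing in for proper statistics:

```python
import numpy as np

# Toy voxel-wise subtraction in the spirit of the PET contrasts described
# above (imagery minus control). Volume shape and threshold are assumptions;
# real analyses also normalize, smooth, and test statistically.
shape = (40, 48, 40)
imagery = np.random.rand(*shape)   # stand-in for the imagery-condition volume
control = np.random.rand(*shape)   # stand-in for the control-condition volume

contrast = imagery - control
active = contrast > 0.9            # crude threshold instead of a proper t-test
print(f"{active.sum()} voxels exceed the threshold")
```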
Abstract:
There is a range of tempos within which listeners can identify familiar tunes (around 0.8 to 6.0 notes/s). Faster and slower tunes are difficult to identify. The authors assessed fast and slow melody-identification thresholds for 80 listeners ages 17–79 years with expertise varying from musically untrained to professional. On fast-to-slow (FS) trials the tune started at a very fast tempo and slowed until the listener identified it. Slow-to-fast (SF) trials started slow and accelerated. Tunes either retained their natural rhythms or were stylized isochronous versions. Increased expertise led to better performance for both FS and SF thresholds (r = .45). Performance declined uniformly across the 62-year age range in the FS condition (r = .27). SF performance was unaffected by age. Although early encoding processes may slow with age, expertise has a greater effect. Musical expertise involves perceptual learning with melodies at a wide range of tempos.
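The tempo limits quoted above are easier to grasp as inter-onset intervals (IOI = 1/rate). Illustrative arithmetic only:

```python
# The identifiable tempo range quoted above, re-expressed as inter-onset
# intervals (IOI = 1 / rate). Purely illustrative arithmetic.
for rate in (0.8, 6.0):                 # notes per second
    ioi_ms = 1000.0 / rate              # milliseconds between note onsets
    print(f"{rate} notes/s -> {ioi_ms:.0f} ms per note")
# 0.8 notes/s -> 1250 ms per note
# 6.0 notes/s -> 167 ms per note
```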
Abstract:
We explored the ability of older (60-80 years old) and younger (18-23 years old) musicians and nonmusicians to judge the similarity of transposed melodies varying on rhythm, mode, and/or contour (Experiment 1) and to discriminate among melodies differing only in rhythm, mode, or contour (Experiment 2). Similarity ratings did not vary greatly among groups, with tunes differing only by mode being rated as most similar. In the same/different discrimination task, musicians performed better than nonmusicians, but we found no age differences. We also found that discrimination of major from minor tunes was difficult for everyone, even for musicians. Mode is apparently a subtle dimension in music, despite its deliberate use in composition and despite people's ability to label minor as "sad" and major as "happy."
Abstract:
We investigated how well structural features such as note density or the relative number of changes in the melodic contour could predict success in implicit and explicit memory for unfamiliar melodies. We also analyzed which features are more likely to elicit increasingly confident judgments of "old" in a recognition memory task. An automated analysis program computed structural aspects of melodies, both independent of any context, and also with reference to the other melodies in the test set and the parent corpus of pop music. A few features predicted success in both memory tasks, which points to a shared memory component. However, motivic complexity compared to a large corpus of pop music had different effects on explicit and implicit memory. We also found that just a few features are associated with different rates of "old" judgments, whether the items were old or new. Rarer motives relative to the test set predicted hits, and rarer motives relative to the corpus predicted false alarms. This data-driven analysis provides further support for both shared and separable mechanisms in implicit and explicit memory retrieval, as well as the role of distinctiveness in true and false judgments of familiarity.
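Two of the named features, note density and contour changes, are straightforward to compute. A sketch on a toy melody; the study's automated analysis program may define them differently:

```python
# Sketch of two structural features named above, computed from a toy
# melody given as (MIDI pitch, duration in seconds) pairs.
melody = [(60, 0.5), (64, 0.5), (62, 0.25), (65, 0.25), (64, 0.5)]

total_dur = sum(d for _, d in melody)
note_density = len(melody) / total_dur          # notes per second

pitches = [p for p, _ in melody]
steps = [b - a for a, b in zip(pitches, pitches[1:])]
signs = [(s > 0) - (s < 0) for s in steps]      # +1 up, -1 down, 0 repeat
contour_changes = sum(1 for a, b in zip(signs, signs[1:]) if a != b)

print(note_density, contour_changes)            # 2.5 notes/s, 3 direction changes
```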
Abstract:
We used fMRI to investigate the neuronal correlates of encoding and recognizing heard and imagined melodies. Ten participants were shown lyrics of familiar verbal tunes; they either heard the tune along with the lyrics, or they had to imagine it. In a subsequent surprise recognition test, they had to identify the titles of tunes that they had heard or imagined earlier. The functional data showed substantial overlap during melody perception and imagery, including secondary auditory areas. During imagery compared with perception, an extended network including pFC, SMA, intraparietal sulcus, and cerebellum showed increased activity, in line with the increased processing demands of imagery. Functional connectivity of anterior right temporal cortex with frontal areas was increased during imagery compared with perception, indicating that these areas form an imagery-related network. Activity in right superior temporal gyrus and pFC was correlated with the subjective rating of imagery vividness. Similar to the encoding phase, the recognition task recruited overlapping areas, including inferior frontal cortex associated with memory retrieval, as well as left middle temporal gyrus. The results present new evidence for the cortical network underlying goal-directed auditory imagery, with a prominent role of the right pFC both for the subjective impression of imagery vividness and for on-line mental monitoring of imagery-related activity in auditory areas.
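Functional connectivity, in its simplest form, is the correlation between two regions' time courses. A simulated sketch, not the study's pipeline:

```python
import numpy as np

# Simplest form of functional connectivity: correlation between two regions'
# BOLD time courses. Series are simulated; real pipelines regress out
# confounds first. Purely illustrative.
n_timepoints = 200
temporal = np.random.randn(n_timepoints)                  # anterior temporal ROI
frontal = 0.5 * temporal + np.random.randn(n_timepoints)  # a correlated frontal ROI

r = np.corrcoef(temporal, frontal)[0, 1]
print(f"connectivity r = {r:.2f}")
```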
Abstract:
Eighty-one listeners defined by three age ranges (18–30, 31–59, and over 60 years) and three levels of musical experience performed an immediate recognition task requiring the detection of alterations in melodies. On each trial, a brief melody was presented, followed 5 sec later by a test stimulus that either was identical to the target or had two pitches changed, for a same–different judgment. Each melody pair was presented at 0.6 note/sec, 3.0 notes/sec, or 6.0 notes/sec. Performance was better with familiar melodies than with unfamiliar melodies. Overall performance declined slightly with age and improved substantially with increasing experience, in agreement with earlier results in an identification task. Tempo affected performance on familiar tunes (moderate was best), but not on unfamiliar tunes. We discuss these results in terms of theories of dynamic attending, cognitive slowing, and working memory in aging.
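A "different" test stimulus of the kind described can be sketched as copying the target melody and perturbing two of its pitches; the alteration sizes here are assumptions:

```python
import random

# Sketch of constructing a "different" test stimulus as described above:
# copy the target melody and change two of its pitches. Pitch values and
# the +/- 2 semitone alteration range are assumptions.
def make_different(target, n_changes=2):
    probe = list(target)
    for i in random.sample(range(len(probe)), n_changes):
        probe[i] += random.choice([-2, -1, 1, 2])   # never zero, so always altered
    return probe

target = [60, 62, 64, 65, 67, 65, 64, 62]
print(make_different(target))
```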
Abstract:
We investigated the effect of level-of-processing manipulations on “remember” and “know” responses in episodic melody recognition (Experiments 1 and 2) and how this effect is modulated by item familiarity (Experiment 2). In Experiment 1, participants performed 2 conceptual and 2 perceptual orienting tasks while listening to familiar melodies: judging the mood, continuing the tune, tracing the pitch contour, and counting long notes. The conceptual mood task led to higher d' rates for “remember” but not “know” responses. In Experiment 2, participants either judged the mood or counted long notes of tunes with high and low familiarity. A level-of-processing effect emerged again in participants’ “remember” d' rates regardless of melody familiarity. Results are discussed within the distinctive processing framework.
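The d' rates reported here follow the standard signal detection formula d' = z(H) - z(FA). A sketch with made-up hit and false-alarm rates:

```python
from scipy.stats import norm

# d' from hit and false-alarm rates (standard signal detection formula):
# d' = z(H) - z(FA). The rates below are made-up numbers for illustration.
def d_prime(hit_rate, fa_rate):
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# e.g., "remember" responses after a conceptual vs. a perceptual task
print(d_prime(0.80, 0.20))   # ~1.68
print(d_prime(0.65, 0.20))   # ~1.23
```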
Abstract:
Music consists of sound sequences that require integration over time. As we become familiar with music, associations between notes, melodies, and entire symphonic movements become stronger and more complex. These associations can become so tight that, for example, hearing the end of one album track can elicit a robust image of the upcoming track while anticipating it in total silence. Here, we study this predictive “anticipatory imagery” at various stages throughout learning and investigate activity changes in corresponding neural structures using functional magnetic resonance imaging. Anticipatory imagery (in silence) for highly familiar naturalistic music was accompanied by pronounced activity in rostral prefrontal cortex (PFC) and premotor areas. Examining changes in the neural bases of anticipatory imagery during two stages of learning conditional associations between simple melodies, however, demonstrates the importance of fronto-striatal connections, consistent with a role of the basal ganglia in “training” frontal cortex (Pasupathy and Miller, 2005). Another striking change in neural resources during learning was a shift between caudal PFC earlier to rostral PFC later in learning. Our findings regarding musical anticipation and sound sequence learning are highly compatible with studies of motor sequence learning, suggesting common predictive mechanisms in both domains.
Abstract:
The two modes most widely used in Western music today convey opposite moods—a distinction that nonmusicians and even young children are able to make. However, the current studies provide evidence that, despite a strong link between mode and affect, mode perception is problematic. Nonmusicians found mode discrimination to be harder than discrimination of other melodic features, and they were not able to accurately classify major and minor melodies with these labels. Although nonmusicians were able to classify major and minor melodies using affective labels, they performed at chance in mode discrimination. Training, in the form of short lessons given to nonmusicians and the natural musical experience of musicians, improved performance, but not to ceiling levels. Tunes with high note density were classified as major, and tunes with low note density as minor, even though these features were actually unrelated in the experimental material. Although these findings provide support for the importance of mode in the perception of emotion, they clearly indicate that these mode perceptions are inaccurate, even in trained individuals, without the assistance of affective labeling.
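The feature nonmusicians failed to use is a small interval difference: relative to the tonic, the major third and sixth sit 4 and 9 semitones up, the minor third and sixth 3 and 8. A toy classifier built on that fact (the labels in parentheses reflect the affective framing used in these studies):

```python
# Relative to the tonic, the major scale's third and sixth are 4 and 9
# semitones up; the minor scale's are 3 and 8. A toy mode classifier.
MAJOR_INTERVALS = {4, 9}   # major third, major sixth (semitones above tonic)
MINOR_INTERVALS = {3, 8}   # minor third, minor sixth

def classify_mode(tonic, critical_note):
    interval = (critical_note - tonic) % 12
    if interval in MAJOR_INTERVALS:
        return "major (typically labeled happy)"
    if interval in MINOR_INTERVALS:
        return "minor (typically labeled sad)"
    return "uninformative"

print(classify_mode(60, 64))   # E above C -> major
print(classify_mode(60, 63))   # E-flat above C -> minor
```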
Abstract:
Short, unfamiliar melodies were presented to young and older adults and to Alzheimer's disease (AD) patients in an implicit and an explicit memory task. The explicit task was yes–no recognition, and the implicit task was pleasantness ratings, in which memory was shown by higher ratings for old versus new melodies (the mere exposure effect). Young adults showed retention of the melodies in both tasks. Older adults showed little explicit memory but did show the mere exposure effect. The AD patients showed neither. The authors considered and rejected several artifactual reasons for this null effect in the context of the many studies that have shown implicit memory among AD patients. As the previous studies have almost always used the visual modality for presentation, they speculate that auditory presentation, especially of nonverbal material, may be compromised in AD because of neural degeneration in auditory areas in the temporal lobes.
Abstract:
Two experiments plus a pilot investigated the role of melodic structure in short-term memory for musical notation by musicians and nonmusicians. In the pilot experiment, visually similar melodies that had been rated as either "good" or "bad" were presented briefly, followed by a 15-sec retention interval and then recall. Musicians remembered good melodies better than they remembered bad ones; nonmusicians did not distinguish between them. In the second experiment, good, bad, and random melodies were briefly presented, followed by immediate recall. The advantage of musicians over nonmusicians decreased as the melody type progressed from good to bad to random. In the third experiment, musicians and nonmusicians divided the stimulus melodies into groups. For each melody, the consistency of grouping was correlated with memory performance in the first two experiments. Evidence was found for use of musical groupings by musicians and for use of a simple visual strategy by nonmusicians. The nature of these musical groupings and how they may be learned are considered. The relation of this work to other studies of comprehension of symbolic diagrams is also discussed.
Abstract:
We previously observed that mental manipulation of the pitch level or temporal organization of melodies results in functional activation in the human intraparietal sulcus (IPS), a region also associated with visuospatial transformation and numerical calculation. Two outstanding questions about these musical transformations are whether pitch and time depend on separate or common processing in IPS, and whether IPS recruitment in melodic tasks varies depending upon the degree of transformation required (as it does in mental rotation). In the present study we sought to answer these questions by applying functional magnetic resonance imaging while musicians performed closely matched mental transposition (pitch transformation) and melody reversal (temporal transformation) tasks. A voxel-wise conjunction analysis showed that in individual subjects, both tasks activated overlapping regions in bilateral IPS, suggesting that a common neural substrate subserves both types of mental transformation. Varying the magnitude of mental pitch transposition resulted in variation of IPS BOLD signal in correlation with the musical key-distance of the transposition, but not with the pitch distance, indicating that the cognitive metric relevant for this type of operation is an abstract one, well described by music-theoretic concepts. These findings support a general role for the IPS in systematically transforming auditory stimulus representations in a nonspatial context.
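The key-distance versus pitch-distance contrast can be made concrete: pitch distance counts semitones, while key distance counts steps around the circle of fifths. A sketch:

```python
# Sketch of the two distance metrics contrasted above for a transposition.
# Pitch distance is the semitone offset; key distance is steps around the
# circle of fifths, which is what IPS activity tracked.
def pitch_distance(semitones):
    return abs(semitones)

def key_distance(semitones):
    # A transposition of n semitones moves 7*n steps along the circle of
    # fifths (mod 12); take the shorter way around the circle.
    fifths = (7 * semitones) % 12
    return min(fifths, 12 - fifths)

for n in (5, 6):
    print(f"{n} semitones: pitch distance {pitch_distance(n)}, "
          f"key distance {key_distance(n)}")
# 5 semitones (a fourth, e.g. C -> F) is a near key (1 fifth), while
# 6 semitones (a tritone) is the most distant key (6 fifths), despite
# the two transpositions having nearly the same pitch distance.
```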