12 results for Produtor familiar
at Bucknell University Digital Commons - Pennsylvania - USA
Abstract:
We explored the functional organization of semantic memory for music by comparing priming across familiar songs both within modalities (Experiment 1, tune to tune; Experiment 3, category label to lyrics) and across modalities (Experiment 2, category label to tune; Experiment 4, tune to lyrics). Participants judged whether or not the target tune or lyrics were real (akin to lexical decision tasks). We found significant priming, analogous to linguistic associative-priming effects, in reaction times for related primes as compared to unrelated primes, but primarily for within-modality comparisons. Reaction times to tunes (e.g., "Silent Night") were faster following related tunes ("Deck the Hall") than following unrelated tunes ("God Bless America"). However, a category label (e.g., Christmas) did not prime tunes from within that category. Lyrics were primed by a related category label, but not by a related tune. These results support the conceptual organization of music in semantic memory, but with potentially weaker associations across modalities.
Abstract:
We used fMRI to investigate the neuronal correlates of encoding and recognizing heard and imagined melodies. Ten participants were shown lyrics of familiar verbal tunes; they either heard the tune along with the lyrics, or they had to imagine it. In a subsequent surprise recognition test, they had to identify the titles of tunes that they had heard or imagined earlier. The functional data showed substantial overlap during melody perception and imagery, including secondary auditory areas. During imagery compared with perception, an extended network including pFC, SMA, intraparietal sulcus, and cerebellum showed increased activity, in line with the increased processing demands of imagery. Functional connectivity of anterior right temporal cortex with frontal areas was increased during imagery compared with perception, indicating that these areas form an imagery-related network. Activity in right superior temporal gyrus and pFC was correlated with the subjective rating of imagery vividness. Similar to the encoding phase, the recognition task recruited overlapping areas, including inferior frontal cortex associated with memory retrieval, as well as left middle temporal gyrus. The results present new evidence for the cortical network underlying goal-directed auditory imagery, with a prominent role of the right pFC both for the subjective impression of imagery vividness and for on-line mental monitoring of imagery-related activity in auditory areas.
Abstract:
We investigated the effect of level-of-processing manipulations on “remember” and “know” responses in episodic melody recognition (Experiments 1 and 2) and how this effect is modulated by item familiarity (Experiment 2). In Experiment 1, participants performed 2 conceptual and 2 perceptual orienting tasks while listening to familiar melodies: judging the mood, continuing the tune, tracing the pitch contour, and counting long notes. The conceptual mood task led to higher d' rates for “remember” but not “know” responses. In Experiment 2, participants either judged the mood or counted long notes of tunes with high and low familiarity. A level-of-processing effect emerged again in participants’ “remember” d' rates regardless of melody familiarity. Results are discussed within the distinctive processing framework.
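The d′ values reported in this abstract come from signal detection theory: sensitivity is the difference between the z-transformed hit and false-alarm rates. A minimal sketch with hypothetical rates (not the paper's data):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical "remember" response rates: 80% hits, 20% false alarms
print(round(d_prime(0.80, 0.20), 2))  # 1.68
```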
When that tune runs through your head: A PET investigation of auditory imagery for familiar melodies
Abstract:
The present study used positron emission tomography (PET) to examine the cerebral activity pattern associated with auditory imagery for familiar tunes. Subjects either imagined the continuation of nonverbal tunes cued by their first few notes, listened to a short sequence of notes as a control task, or listened and then reimagined that short sequence. Subtraction of the activation in the control task from that in the real-tune imagery task revealed primarily right-sided activation in frontal and superior temporal regions, plus supplementary motor area (SMA). Isolating retrieval of the real tunes by subtracting activation in the reimagine task from that in the real-tune imagery task revealed activation primarily in right frontal areas and right superior temporal gyrus. Subtraction of activation in the control condition from that in the reimagine condition, intended to capture imagery of unfamiliar sequences, revealed activation in SMA, plus some left frontal regions. We conclude that areas of right auditory association cortex, together with right and left frontal cortices, are implicated in imagery for familiar tunes, in accord with previous behavioral, lesion and PET data. Retrieval from musical semantic memory is mediated by structures in the right frontal lobe, in contrast to results from previous studies implicating left frontal areas for all semantic retrieval. The SMA seems to be involved specifically in image generation, implicating a motor code in this process.
Abstract:
This study investigated the influence of age, familiarity, and level of exposure on the metamemorial skill of prediction accuracy on a future test. Young (17 to 23 years old) and middle-aged adults (35 to 50 years old) were asked to predict their memory for text material. Participants made predictions on a familiar text and an unfamiliar text, at three different levels of exposure to each. The middle-aged adults were superior to the younger adults at predicting performance. This finding indicates that metamemory may increase from youth to middle age. Other findings include superior prediction accuracy for unfamiliar compared to familiar material, a result conflicting with previous findings, and an interaction between level of exposure and familiarity that appears to modify the main effects of those variables.
Abstract:
There is a range of tempos within which listeners can identify familiar tunes (around 0.8 to 6.0 notes/s). Faster and slower tunes are difficult to identify. The authors assessed fast and slow melody-identification thresholds for 80 listeners ages 17–79 years with expertise varying from musically untrained to professional. On fast-to-slow (FS) trials the tune started at a very fast tempo and slowed until the listener identified it. Slow-to-fast (SF) trials started slow and accelerated. Tunes either retained their natural rhythms or were stylized isochronous versions. Increased expertise led to better performance for both FS and SF thresholds (r = .45). Performance declined uniformly across the 62-year age range in the FS condition (r = .27). SF performance was unaffected by age. Although early encoding processes may slow with age, expertise has a greater effect. Musical expertise involves perceptual learning with melodies at a wide range of tempos.
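For reference, the identification window of roughly 0.8 to 6.0 notes/s reported above translates to inter-onset intervals of about 1250 ms down to about 167 ms per note (assuming isochronous notes, as in the stylized versions):

```python
def ioi_ms(notes_per_sec: float) -> float:
    """Inter-onset interval in milliseconds for a given note rate."""
    return 1000.0 / notes_per_sec

print(round(ioi_ms(0.8)))  # 1250 -> slow identification limit
print(round(ioi_ms(6.0)))  # 167  -> fast identification limit
```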
Abstract:
We tested normal young and elderly adults and elderly Alzheimer’s disease (AD) patients on recognition memory for tunes. In Experiment 1, AD patients and age-matched controls received a study list and an old/new recognition test of highly familiar, traditional tunes, followed by a study list and test of novel tunes. The controls performed better than did the AD patients. The controls showed the “mirror effect” of increased hits and reduced false alarms for traditional versus novel tunes, whereas the patients false-alarmed as often to traditional tunes as to novel tunes. Experiment 2 compared young adults and healthy elderly persons using a similar design. Performance was lower in the elderly group, but both younger and older subjects showed the mirror effect. Experiment 3 produced confusion between preexperimental familiarity and intraexperimental familiarity by mixing traditional and novel tunes in the study lists and tests. Here, the subjects in both age groups resembled the patients of Experiment 1 in failing to show the mirror effect. Older subjects again performed more poorly, and they differed qualitatively from younger subjects in setting stricter criteria for more nameable tunes. Distinguishing different sources of global familiarity is a factor in tune recognition, and the data suggest that this type of source monitoring is impaired in AD and involves different strategies in younger and older adults.
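The mirror effect described in this abstract means the "stronger" item class (traditional tunes) yields both more hits and fewer false alarms than the weaker class (novel tunes). A minimal check, with hypothetical rates chosen only to illustrate the two patterns:

```python
def shows_mirror_effect(hits_strong: float, fa_strong: float,
                        hits_weak: float, fa_weak: float) -> bool:
    """True if the stronger class has both a higher hit rate and a
    lower false-alarm rate than the weaker class."""
    return hits_strong > hits_weak and fa_strong < fa_weak

# Hypothetical control-group pattern: traditional vs. novel tunes
print(shows_mirror_effect(0.85, 0.10, 0.70, 0.25))  # True

# Hypothetical patient pattern: false alarms equally high for both
print(shows_mirror_effect(0.75, 0.30, 0.65, 0.30))  # False
```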
Abstract:
Four experiments were conducted to examine the ability of people without "perfect pitch" to retain the absolute pitch of familiar tunes. In Experiment 1, participants imagined given tunes, and then hummed their first notes four times either between or within sessions. The variability of these productions was very low. Experiment 2 used a recognition paradigm, with results similar to those in Experiment 1 for musicians, but with some additional variability shown for unselected subjects. In Experiment 3, subjects rated the suitability of various pitches to start familiar tunes. Previously given preferred notes were rated high, as were notes three or four semitones distant from the preferred notes, but not notes one or two semitones distant. In Experiment 4, subjects mentally transformed the pitches of familiar tunes to the highest and lowest levels possible. These experiments suggest some retention of the absolute pitch of tunes despite a paucity of verbal or visual cues for the pitch.
Abstract:
Investigated the organizing principles in memory for familiar songs in two experiments. It was hypothesized that individuals do not store and remember each song in isolation. Rather, there exists a rich system of relationships among tunes that can be revealed through similarity rating studies and memory tasks. One initial assumption was the division of relations among tunes into musical (e.g., tempo, rhythm) and nonmusical similarity. In Experiment 1, 20 undergraduates were asked to sort 60 familiar tunes into groups according to both musical and nonmusical criteria. Clustering analyses showed clear patterns of nonmusical similarity but few instances of musical similarity. Experiment 2, with 16 subjects, explored the psychological validity of the nonmusical relationships revealed in Experiment 1. A speeded verification task showed that songs similar to each other are confused more often than are distantly related songs. A free-recall task showed greater clustering for closely related songs than for distantly related ones. The relationship between these studies and studies of semantic memory is discussed. Also, the contribution of musical training and individual knowledge to the organization of the memory system is considered.
Abstract:
Two studies investigated the similarity of metronome settings to perceived and imagined familiar songs by subjects unselected for musical ability. In Study 1, mean tempo settings in the two tasks were about 100 beats per minute. Songs with slower perceived tempos tended to be faster in the imagery task and vice versa. In Study 2, subjects set fastest and slowest acceptable tempos for the same set of songs in the imagery mode. These settings were positively correlated with the preferred tempo for the song. Most subjects thought that there were limits on how fast or slow a song could be imagined. These results suggest that tempo is explicitly represented in auditory imagery.
Abstract:
Metamemory is an important skill that allows humans to monitor their own memory abilities; however, little research has concerned what perceptual information influences metamemory judgments. A series of experiments assessed the accuracy of metamemory judgments for music as well as determined if metamemory judgments are affected by ease of processing of musical features. A recognition memory task in conjunction with metamemory judgments (Judgments of Learning, or JOLs) were used to determine actual and predicted memory performance. We found that changing the ease of processing of the volume and timbre of unfamiliar tunes affected metamemory judgments, but not memory performance, for unfamiliar tunes. Manipulating the ease of processing of the timbre and tempo of familiar tunes did not affect metamemory judgments or memory performance although metamemory accuracy on an item-by-item basis was better for familiar tunes as compared to unfamiliar tunes. Thus, metamemory judgments for unfamiliar tunes are more sensitive to ease of processing changes as compared to familiar tunes, suggesting that different types of information are processed in different ways.