43 results for "Visual cues"
Abstract:
Research on future episodic thought has produced compelling theories and results in cognitive psychology, cognitive neuroscience, and clinical psychology. In experiments aimed at integrating these with basic concepts and methods from autobiographical memory research, 76 undergraduates remembered past and imagined future positive and negative events that had or would have a major impact on them. Correlations of the online ratings of visual and auditory imagery, emotion, and other measures demonstrated that individuals used the same processes to the same extent to remember past events and construct future ones. These measures predicted the theoretically important metacognitive judgments of past reliving and future "preliving" in similar ways. On standardized tests of reactions to traumatic events, scores for future negative events were much higher than scores for past negative events and fell in the range that would qualify for a diagnosis of posttraumatic stress disorder (PTSD); the test was replicated (n = 52) to check for order effects. Consistent with earlier work, future events had less sensory vividness, so the imagined symptoms of future events were unlikely to be caused by sensory vividness. To confirm this, in a second experiment 63 undergraduates produced numerous added details between two constructions of the same negative future events; the deficit in rated vividness was removed with no increase in scores on the standardized tests of reactions to traumatic events. Neuroticism predicted individuals' reactions to negative past events but did not predict imagined reactions to future events. This set of novel methods and findings is interpreted in the context of the literatures on episodic future thought, autobiographical memory, PTSD, and classic schema theory.
Abstract:
When recalling autobiographical memories, individuals often experience visual images associated with the event. These images can be constructed from two different perspectives: first person, in which the event is visualized from the viewpoint experienced at encoding, or third person, in which the event is visualized from an external vantage point. Using a novel technique to measure visual perspective, we examined where the external vantage point is situated in third-person images. Individuals in two studies were asked to recall either 10 or 15 events from their lives and to describe the perspectives they experienced. Wide variation in spatial location was observed within third-person perspectives, with the location of these perspectives relating to the event being recalled. Results suggest that remembering from an external viewpoint may be more common than previous studies have demonstrated.
Abstract:
The number of studies examining visual perspective during retrieval has grown in recent years. However, the way in which perspective has been conceptualized differs across studies. Some studies have suggested that perspective is experienced as either a first-person or a third-person perspective, whereas others have suggested that both perspectives can be experienced during a single retrieval attempt. We examined this aspect of perspective across three studies, employing different measurement techniques common in research on perspective. Results suggest that individuals can experience more than one perspective when recalling events. Furthermore, the experience of the two perspectives correlated differentially with ratings of vividness, suggesting that the two perspectives should not be considered in opposition to one another. We also found evidence of a gender effect in the experience of perspective, with females experiencing third-person perspectives more often than males. Future studies should allow for the experience of more than one perspective during retrieval.
Abstract:
We sought to map the time course of autobiographical memory retrieval, including brain regions that mediate phenomenological experiences of reliving and emotional intensity. Participants recalled personal memories to auditory word cues during event-related functional magnetic resonance imaging (fMRI). Participants pressed a button when a memory was accessed, maintained and elaborated the memory, and then gave subjective ratings of emotion and reliving. A novel fMRI approach based on timing differences capitalized on the protracted reconstructive process of autobiographical memory to segregate brain areas contributing to initial access and later elaboration and maintenance of episodic memories. The initial period engaged hippocampal, retrosplenial, and medial and right prefrontal activity, whereas the later period recruited visual, precuneus, and left prefrontal activity. Emotional intensity ratings were correlated with activity in several regions, including the amygdala and the hippocampus during the initial period. Reliving ratings were correlated with activity in visual cortex and ventromedial and inferior prefrontal regions during the later period. Frontopolar cortex was the only brain region sensitive to emotional intensity across both periods. Results were confirmed by time-locked averages of the fMRI signal. The findings indicate dynamic recruitment of emotion-, memory-, and sensory-related brain regions during remembering and their dissociable contributions to phenomenological features of the memories.
Abstract:
Amnesia typically results from trauma to the medial temporal regions that coordinate activation among the disparate areas of cortex representing the information that makes up autobiographical memories. We proposed that amnesia should also result from damage to these cortical areas themselves, particularly those that subserve long-term visual memory [Rubin, D. C., & Greenberg, D. L. (1998). Visual memory-deficit amnesia: A distinct amnesic presentation and etiology. Proceedings of the National Academy of Sciences of the USA, 95, 5413-5416]. We previously found 11 such cases in the literature, all of whom had amnesia. We now present a detailed investigation of one of these patients. M.S. suffers from long-term visual memory loss along with some semantic deficits; he also manifests a severe retrograde amnesia and a moderate anterograde amnesia. The presentation of his amnesia differs from that of the typical medial-temporal or lateral-temporal amnesic; we suggest that his visual deficits may be contributing to his autobiographical amnesia.
Abstract:
We investigated the effects of visual input at encoding and retrieval on the phenomenology of memory. In Experiment 1, participants took part in events with and without blindfolds and were later shown a video of the events. Both blindfolding and later viewing of the video tended to decrease recollection. In Experiment 2, participants were played videos of events involving other people, with and without the visual component. Events listened to without visual input were recalled with less recollection; later adding the visual component increased recollection. In Experiment 3, participants were provided with progressively more information about events they had experienced, either as photographs they had taken of the events or as narrative descriptions of those photographs. In comparison with manipulations at encoding, the addition of more visual or narrative cues at recall had similar but smaller effects on recollection.
Abstract:
We describe a form of amnesia, which we have called visual memory-deficit amnesia, that is caused by damage to areas of the visual system that store visual information. Because it is caused by a deficit in access to stored visual material and not by an impaired ability to encode or retrieve new material, it has the otherwise infrequent properties of a more severe retrograde than anterograde amnesia with no temporal gradient in the retrograde amnesia. Of the 11 cases of long-term visual memory loss found in the literature, all had amnesia extending beyond a loss of visual memory, often including a near total loss of pretraumatic episodic memory. Of the 6 cases in which both the severity of retrograde and anterograde amnesia and the temporal gradient of the retrograde amnesia were noted, 4 had a more severe retrograde amnesia with no temporal gradient and 2 had a less severe retrograde amnesia with a temporal gradient.
Abstract:
A sample of 124 words was used to cue autobiographical memories in 120 adults ranging in age from 20 to 73 years. Individual words reliably cued autobiographical memories of different ages at different speeds. For all age groups, words rated high in imagery produced older memories and faster reaction times.
Abstract:
If and only if each single cue uniquely defines its target, an independence model based on fragment theory can predict the strength of a combined dual cue from the strengths of its single-cue components. If the single cues do not each uniquely define their target, no single monotonic function can predict the strength of the dual cue from its components; rather, what matters is the number of possible targets. The probability of generating a target word was .19 for rhyme cues, .14 for category cues, and .97 for rhyme-plus-category dual cues. Moreover, some pairs of cues had probabilities of producing their targets of .03 when used individually and 1.00 when used together, whereas other pairs had moderate probabilities both individually and together. The results, which are interpreted in terms of multiple constraints limiting the number of responses, show why rhymes, which play a minimal role in laboratory studies of memory, are common in real-world mnemonics.
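The gap the abstract describes is easy to see numerically. Assuming a standard independence baseline (the abstract does not spell out the exact combination rule; the function name below is hypothetical), the predicted dual-cue probability falls far short of the observed .97, which is the point of the "number of possible targets" argument:

```python
def independent_dual_cue(p1: float, p2: float) -> float:
    """Probability that at least one of two statistically independent
    cues produces the target: P = 1 - (1 - p1) * (1 - p2)."""
    return 1.0 - (1.0 - p1) * (1.0 - p2)

# Single-cue probabilities reported in the abstract
p_rhyme, p_category = 0.19, 0.14

predicted = independent_dual_cue(p_rhyme, p_category)
print(f"independence prediction: {predicted:.2f}")  # 0.30
print("observed dual-cue rate:  0.97")
```

The shortfall (.30 predicted vs. .97 observed) is consistent with the cues jointly constraining the response set to very few candidates rather than acting independently.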
Abstract:
In Experiment 1, subjects were presented with either the odors or the names of 15 common objects. In Experiment 2, subjects were presented with either the odors, photographs, or names of 16 common objects. All subjects were asked to describe an autobiographical memory evoked by each cue, to date each memory, and to rate each memory on vividness, pleasantness, and the number of times that the memory had been thought of and talked about prior to the experiment. Compared with memories evoked by photographs or names, memories evoked by odors were reported to be thought of and talked about less often prior to the experiment and were more likely to be reported as never having been thought of or talked about prior to the experiment. No other effects were consistently found, though there was a suggestion that odors might evoke more pleasant and emotional memories than other types of cues. The relation of these results to the folklore concerning olfactory cuing is discussed.
Abstract:
The spiking activity of nearby cortical neurons is correlated on both short and long time scales. Understanding this shared variability in firing patterns is critical for appreciating the representation of sensory stimuli in ensembles of neurons, the coincident influences of neurons on common targets, and the functional implications of microcircuitry. Our knowledge about neuronal correlations, however, derives largely from experiments that used different recording methods, analysis techniques, and cortical regions. Here we studied the structure of neuronal correlation in area V4 of alert macaques using recording and analysis procedures designed to match those used previously in primary visual cortex (V1), the major input to V4. We found that the spatial and temporal properties of correlations in V4 were remarkably similar to those of V1, with two notable differences: correlated variability in V4 was approximately one-third the magnitude of that in V1 and synchrony in V4 was less temporally precise than in V1. In both areas, spontaneous activity (measured during fixation while viewing a blank screen) was approximately twice as correlated as visual-evoked activity. The results provide a foundation for understanding how the structure of neuronal correlation differs among brain regions and stages in cortical processing and suggest that it is likely governed by features of neuronal circuits that are shared across the visual cortex.
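The "correlated variability" compared across V1 and V4 above is commonly quantified as the spike-count (noise) correlation: the Pearson correlation of two simultaneously recorded neurons' spike counts across repeated trials. A minimal synthetic sketch (all numbers invented for illustration, not data from the study):

```python
import numpy as np

# Two model neurons whose trial-by-trial spike counts share a common
# fluctuation (e.g., shared input) plus independent private noise.
rng = np.random.default_rng(0)
n_trials = 2000
shared = rng.normal(0.0, 1.0, n_trials)            # common trial-to-trial drive
counts_a = 10 + 2.0 * shared + rng.normal(0.0, 3.0, n_trials)
counts_b = 12 + 2.0 * shared + rng.normal(0.0, 3.0, n_trials)

# Noise correlation: Pearson correlation of counts across trials
r_sc = np.corrcoef(counts_a, counts_b)[0, 1]
print(f"spike-count correlation: {r_sc:.2f}")
```

In this toy model the correlation comes entirely from the `shared` term; weakening that term for one simulated "area" while keeping private noise fixed reproduces the kind of magnitude difference the abstract reports between V4 and V1.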
Abstract:
Successful interaction with the world depends on accurate perception of the timing of external events. Neurons at early stages of the primate visual system represent time-varying stimuli with high precision. However, it is unknown whether this temporal fidelity is maintained in the prefrontal cortex, where changes in neuronal activity generally correlate with changes in perception. One reason to suspect that it is not maintained is that humans experience surprisingly large fluctuations in the perception of time. To investigate the neuronal correlates of time perception, we recorded from neurons in the prefrontal cortex and midbrain of monkeys performing a temporal-discrimination task. Visual time intervals were presented at a timescale relevant to natural behavior (<500 ms). At this brief timescale, neuronal adaptation (time-dependent changes in the size of successive responses) occurs. We found that visual activity fluctuated with timing judgments in the prefrontal cortex but not in comparable midbrain areas. Surprisingly, only response strength, not timing, predicted task performance. Intervals perceived as longer were associated with larger visual responses and shorter intervals with smaller responses, matching the dynamics of adaptation. These results suggest that the magnitude of prefrontal activity may be read out to provide temporal information that contributes to judging the passage of time.
Abstract:
Our percept of visual stability across saccadic eye movements may be mediated by presaccadic remapping. Just before a saccade, neurons that remap become visually responsive at a future field (FF), which anticipates the saccade vector. Hence, the neurons use corollary discharge of saccades. Many of the neurons also decrease their response at the receptive field (RF). Presaccadic remapping occurs in several brain areas including the frontal eye field (FEF), which receives corollary discharge of saccades in its layer IV from a collicular-thalamic pathway. We studied, at two levels, the microcircuitry of remapping in the FEF. At the laminar level, we compared remapping between layers IV and V. At the cellular level, we compared remapping between different neuron types of layer IV. In the FEF in four monkeys (Macaca mulatta), we identified 27 layer IV neurons with orthodromic stimulation and 57 layer V neurons with antidromic stimulation from the superior colliculus. With the use of established criteria, we classified the layer IV neurons as putative excitatory (n = 11), putative inhibitory (n = 12), or ambiguous (n = 4). We found that just before a saccade, putative excitatory neurons increased their visual response at the RF, putative inhibitory neurons showed no change, and ambiguous neurons increased their visual response at the FF. None of the neurons showed presaccadic visual changes at both RF and FF. In contrast, neurons in layer V showed full remapping (at both the RF and FF). Our data suggest that elemental signals for remapping are distributed across neuron types in early cortical processing and combined in later stages of cortical microcircuitry.
Abstract:
The image on the retina may move because the eyes move, or because something in the visual scene moves. The brain is not fooled by this ambiguity. Even as we make saccades, we are able to detect whether visual objects remain stable or move. Here we test whether this ability to assess visual stability across saccades is present at the single-neuron level in the frontal eye field (FEF), an area that receives both visual input and information about imminent saccades. Our hypothesis was that neurons in the FEF report whether a visual stimulus remains stable or moves as a saccade is made. Monkeys made saccades in the presence of a visual stimulus outside of the receptive field. In some trials, the stimulus remained stable, but in other trials, it moved during the saccade. In every trial, the stimulus occupied the center of the receptive field after the saccade, thus evoking a reafferent visual response. We found that many FEF neurons signaled, in the strength and timing of their reafferent response, whether the stimulus had remained stable or moved. Reafferent responses were tuned for the amount of stimulus translation, and, in accordance with human psychophysics, tuning was better (more prevalent, stronger, and quicker) for stimuli that moved perpendicular, rather than parallel, to the saccade. Tuning was sometimes present as well for nonspatial transaccadic changes (in color, size, or both). Our results indicate that FEF neurons evaluate visual stability during saccades and may be general purpose detectors of transaccadic visual change.