986 results for Auditory-visual Interaction
Abstract:
Background: It is well established that phonological awareness, print knowledge and rapid naming predict later reading difficulties. However, additional auditory, visual and motor difficulties have also been observed in dyslexic children. Here we examine to what extent these difficulties can be used to predict later literacy difficulties. Method: An unselected sample of 267 children at school entry completed a wide battery of tasks associated with dyslexia. Their reading was tested 2, 3 and 4 years later and poor readers were identified (n = 42). Logistic regression and multiple case study approaches were used to examine the predictive validity of the different tasks. Results: As expected, print knowledge, verbal short-term memory, phonological awareness and rapid naming were good predictors of later poor reading. Deficits in visual search and in auditory processing were also present in a large minority of the poor readers. Almost all poor readers showed deficits in at least one area at school entry, but no single deficit characterised the majority of poor readers. Conclusions: The results are in line with Pennington’s (2006) multiple deficits view of dyslexia. They indicate that the causes of poor reading outcome are multiple, interacting and probabilistic, rather than deterministic. Keywords: Dyslexia; educational attainment; longitudinal studies; prediction; phonological processing.
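The logistic-regression approach this abstract describes, predicting poor-reader status from school-entry measures, can be sketched as follows. This is a minimal illustration only: the feature names, coefficients, and data are invented, and the study's actual variables and fitting procedure are not reproduced here.

```python
import numpy as np

# Toy data: 267 children (the sample size reported in the abstract),
# three hypothetical school-entry measures as z-scores:
# phonological awareness, rapid naming, print knowledge.
rng = np.random.default_rng(0)
n = 267
X = rng.normal(size=(n, 3))

# Toy generative assumption: lower scores raise the odds of later poor reading.
logits = -1.8 - 1.2 * X[:, 0] - 0.8 * X[:, 1] - 0.9 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent fit of w, b for P(y=1) = sigmoid(X @ w + b)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probabilities
        w -= lr * (X.T @ (p - y)) / len(y)   # gradient step on weights
        b -= lr * np.mean(p - y)             # gradient step on intercept
    return w, b

w, b = fit_logistic(X, y)
probs = 1 / (1 + np.exp(-(X @ w + b)))  # each child's predicted risk
```

Under this toy setup, the fitted weights come out negative, mirroring the abstract's finding that low school-entry scores predict later poor reading; the probabilistic (rather than deterministic) character of the prediction is visible in the fact that the model outputs risks, not certainties.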
Abstract:
This research aims to study the use of greeting cards, here understood as a literacy practice widely used in the society of the United States. In American culture, these cards become sources of information and memory about people’s cycles of life, their experiences and their bonds of sociability, enabled by the meanings that image and word comprise. The main purpose of this work is to describe how this literacy practice occurs in American society. Theoretically, the research is based on studies of literacy (BARTON; HAMILTON, 1998; BAYNHAM, 1995; HAMILTON, 2000; STREET, 1981, 1984, 1985, 1993, 2003), on the contributions of social semiotics associated with systemic-functional grammar (HALLIDAY; HASAN, 1978, 1985; HALLIDAY, 1994; HALLIDAY; MATTHIESSEN, 2004), and on the grammar of visual design (KRESS; LEITE-GARCIA; VAN LEEUWEN, 1997, 2004, 2006; KRESS; MATTHIESSEN, 2004). Methodologically, it is a study within the qualitative, interpretative paradigm, which adopts ethnographic tools in data generation. From this perspective, it makes use of “looking and asking” techniques (ERICKSON, 1986, p. 119), complemented by the technique of “registering” proposed by Paz (2008). The corpus comprises 104 printed cards provided by users of this cultural artifact, from which we selected 24, and 11 e-cards extracted from the internet, as well as verbalizations obtained through a questionnaire of open questions designed to gather information about these card users’ perceptions and actions with respect to this literacy practice. Data analysis reveals cultural, economic and social aspects of this practice and supports the view that the literacy practice of using printed greeting cards, despite the existence of virtual alternatives, is still very much alive in American society.
The study also shows that card users position themselves and construct identities expressed in the interaction of the verbal and the visual in order to achieve the desired effect. Greeting cards are thus understood not as unintentional artifacts, but as loaded with ideology and power relations, among other aspects constitutive of them.
Abstract:
Accompanied by: A diferença está no saber agir: conheça!: educação inclusiva: dos documentos legais à realidade escolar (The difference lies in knowing how to act: get to know it!: inclusive education: from legal documents to school reality)
Abstract:
This study investigates whether less skilled readers suffer from deficits in echoic memory, which may be responsible for limiting the progress of reading acquisition. Serial recall performance in auditory, visual, and noisy conditions was used to assess echoic memory differences between skilled and less skilled readers. Both groups showed the typical modality effect, demonstrating that each had a functioning echoic memory. Less skilled readers performed worse than skilled readers on noisy serial recall, suggesting that their recall is more vulnerable to interference than that of skilled readers. Nonword repetition performance indicated that all participants had reduced recall as a function of word complexity and word length. No difference between reading groups was found on this task; however, as nonword repetition and the size of the modality effect did not correlate, this task may not be a measure of echoic memory.
Abstract:
Although the effects of cannabis on perception are well documented, little is known about their neural basis or how these may contribute to the formation of psychotic symptoms. We used functional magnetic resonance imaging (fMRI) to assess the effects of Delta-9-tetrahydrocannabinol (THC) and cannabidiol (CBD) during visual and auditory processing in healthy volunteers. In total, 14 healthy volunteers were scanned on three occasions. Identical 10 mg THC, 600 mg CBD, and placebo capsules were allocated in a balanced, double-blinded, pseudo-randomized crossover design. Plasma levels of each substance, physiological parameters, and measures of psychopathology were taken at baseline and at regular intervals following ingestion. During fMRI scanning, volunteers listened passively to spoken words and viewed a radial visual checkerboard in alternating blocks. Administration of THC was associated with increases in anxiety, intoxication, and positive psychotic symptoms, whereas CBD had no significant symptomatic effects. THC decreased activation relative to placebo in bilateral temporal cortices during auditory processing, and increased and decreased activation in different visual areas during visual processing. CBD was associated with activation in right temporal cortex during auditory processing, and when contrasted, THC and CBD had opposite effects in the right posterior superior temporal gyrus, the right-sided homolog to Wernicke's area. Moreover, the attenuation of activation in this area (maximum 61, -15, -2) by THC during auditory processing was correlated with its acute effect on psychotic symptoms. Single doses of THC and CBD differently modulate brain function in areas that process auditory and visual stimuli and relate to induced psychotic symptoms. Neuropsychopharmacology (2011) 36, 1340-1348; doi:10.1038/npp.2011.17; published online 16 March 2011
Abstract:
Magdeburg, University, Faculty of Natural Sciences, dissertation, 2009
Abstract:
The processing of biological motion is a critical, everyday task performed with remarkable efficiency by human sensory systems. Interest in this ability has focused to a large extent on biological motion processing in the visual modality (see, for example, Cutting, J. E., Moore, C., & Morrison, R. (1988). Masking the motions of human gait. Perception and Psychophysics, 44(4), 339-347). In naturalistic settings, however, it is often the case that biological motion is defined by input to more than one sensory modality. For this reason, here in a series of experiments we investigate behavioural correlates of multisensory, in particular audiovisual, integration in the processing of biological motion cues. More specifically, using a new psychophysical paradigm we investigate the effect of suprathreshold auditory motion on perceptions of visually defined biological motion. Unlike data from previous studies investigating audiovisual integration in linear motion processing [Meyer, G. F. & Wuerger, S. M. (2001). Cross-modal integration of auditory and visual motion signals. Neuroreport, 12(11), 2557-2560; Wuerger, S. M., Hofbauer, M., & Meyer, G. F. (2003). The integration of auditory and motion signals at threshold. Perception and Psychophysics, 65(8), 1188-1196; Alais, D. & Burr, D. (2004). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research, 19, 185-194], we report the existence of direction-selective effects: relative to control (stationary) auditory conditions, auditory motion in the same direction as the visually defined biological motion target increased its detectability, whereas auditory motion in the opposite direction had the inverse effect. 
Our data suggest these effects do not arise through general shifts in visuo-spatial attention, but instead are a consequence of motion-sensitive, direction-tuned integration mechanisms that are, if not unique to biological visual motion, at least not common to all types of visual motion. Based on these data and evidence from neurophysiological and neuroimaging studies we discuss the neural mechanisms likely to underlie this effect.
Abstract:
Modern cochlear implantation technologies allow deaf patients to understand auditory speech; however, the implants deliver only a coarse auditory input, and patients must use long-term adaptive processes to achieve coherent percepts. In adults with post-lingual deafness, most speech recovery occurs during the first year after cochlear implantation, but there is a large range of variability in the level of cochlear implant outcomes and in the temporal evolution of recovery. It has been proposed that when profoundly deaf subjects receive a cochlear implant, the visual cross-modal reorganization of the brain is deleterious for auditory speech recovery. We tested this hypothesis in post-lingually deaf adults by analysing whether brain activity shortly after implantation correlated with the level of auditory recovery 6 months later. Based on brain activity induced by a speech-processing task, we found strong positive correlations in areas outside the auditory cortex. The highest positive correlations were found in the occipital cortex, involved in visual processing, as well as in the posterior-temporal cortex, known for audio-visual integration. The other area that positively correlated with auditory speech recovery was localized in the left inferior frontal area, known for speech processing. Our results demonstrate that the functional level of the visual modality is related to the level of auditory recovery. Based on the positive correlation of visual activity with auditory speech recovery, we suggest that the visual modality may facilitate the perception of the word's auditory counterpart in communicative situations. The demonstrated link between visual activity and auditory speech perception indicates that visuoauditory synergy is crucial for cross-modal plasticity and for fostering speech-comprehension recovery in adult cochlear-implanted deaf patients.
Abstract:
The kitten's auditory cortex (including the first and second auditory fields AI and AII) is known to send transient axons to either ipsi- or contralateral visual areas 17 and 18. By the end of the first postnatal month the transitory axons, but not their neurons of origin, are eliminated. Here we investigated where these neurons project after the elimination of the transitory axon. Eighteen kittens received early (postnatal day (pd) 2 - 5) injections of long-lasting retrograde fluorescent tracers in visual areas 17 and 18, and late (pd 35 - 64) injections of other retrograde fluorescent tracers in either hemisphere, mostly in areas known to receive projections from AI and AII in the adult cat. The middle ectosylvian gyrus was analysed for double-labelled neurons in the region corresponding approximately to AI and AII. Late injections in the hemisphere contralateral to the analysed AI and AII, covering all of the known auditory areas as well as some visual and 'association' areas, did not relabel neurons which had had transient projections to either ipsi- or contralateral visual areas 17 - 18. Thus, after eliminating their transient juvenile projections to visual areas 17 and 18, AI and AII neurons do not project to the other hemisphere. In contrast, relabelling was obtained with late injections in several locations in the ipsilateral hemisphere; it was expressed as per cent of the population labelled by the early injections. Few neurons (0 - 2.5%) were relabelled by large injections in the caudal part of the posterior ectosylvian gyrus and the adjacent posterior suprasylvian sulcus (areas DP, P, VP). Multiple injections in the middle ectosylvian gyrus relabelled a considerably larger percentage of neurons (13%). Single small injections in the middle ectosylvian gyrus (areas AI, AII), the caudal part of the anterior ectosylvian gyrus and the rostral part of the posterior ectosylvian gyrus relabelled 3.1 - 7.0% of neurons.
These neurons were generally near (<2.0 mm) the outer border of the late injection sites. Neurons with transient projections to ipsi- or contralateral visual areas 17 and 18 were relabelled in similar proportions by late injections at any given location. Thus, AI or AII neurons which send a transitory axon to ipsi- or contralateral visual areas 17 and 18 are most likely to form short permanent cortical connections. In that respect, they are similar to medial area 17 neurons that form transitory callosal axons and short permanent axons to ipsilateral visual areas 17 and 18.
Abstract:
Vision provides a primary sensory input for food perception. It raises expectations about taste and nutritional value and drives acceptance or rejection. So far, the impact of visual food cues varying in energy content on subsequent taste integration has remained unexplored. Using electrical neuroimaging, we assessed whether high- and low-calorie food cues differentially influence the brain processing and perception of a subsequent neutral electric taste. When viewing high-calorie food images, participants reported the subsequent taste to be more pleasant than when low-calorie food images preceded the identical taste. Moreover, the taste-evoked neural activity was stronger in the bilateral insula and the adjacent frontal operculum (FOP) within 100 ms after taste onset when preceded by high- versus low-calorie cues. A similar pattern evolved in the anterior cingulate (ACC) and medial orbitofrontal cortex (OFC) around 180 ms, as well as in the right insula around 360 ms. The activation differences in the OFC correlated positively with changes in taste pleasantness, a finding in accord with the role of the OFC in the hedonic evaluation of taste. Later activation differences in the right insula likely indicate revaluation of interoceptive taste awareness. Our findings reveal previously unknown mechanisms of cross-modal, visual-gustatory, sensory interactions underlying food evaluation.
Abstract:
Rats, like other crepuscular animals, have excellent auditory capacities and discriminate well between different sounds [Heffner HE, Heffner RS. Hearing in two cricetid rodents: wood rats (Neotoma floridana) and grasshopper mouse (Onychomys leucogaster). J Comp Psychol 1985;99(3):275-88]. However, most of the experimental literature on spatial orientation almost exclusively emphasizes the use of visual landmarks [Cressant A, Muller RU, Poucet B. Failure of centrally placed objects to control the firing fields of hippocampal place cells. J Neurosci 1997;17(7):2531-42; and Goodridge JP, Taube JS. Preferential use of the landmark navigational system by head direction cells in rats. Behav Neurosci 1995;109(1):49-61]. To address the important issue of whether rats are able to achieve a place navigation task relative to auditory beacons, we designed a place learning task in the water maze. We controlled cue availability by conducting the experiment in total darkness. Three auditory cues did not allow place navigation, whereas three visual cues in the same positions did support place navigation. One auditory beacon directly associated with the goal location did not support taxon navigation (a beacon strategy allowing the animal to find the goal simply by swimming toward the cue). Replacing the auditory beacons by a single visual beacon did support taxon navigation. A multimodal configuration of two auditory cues and one visual cue allowed correct place navigation, and deleting either the two auditory cues or the single visual cue disrupted spatial performance. Thus rats can combine information from different sensory modalities to achieve a place navigation task. In particular, auditory cues support place navigation when associated with a visual one.