875 results for auditory cues
Abstract:
In this research, a cross-modal paradigm was chosen to test the hypothesis that affective olfactory and auditory cues paired with neutral visual stimuli bearing no resemblance or logical connection to the affective cues can evoke preference shifts in those stimuli. Neutral visual stimuli of abstract paintings were presented simultaneously with liked and disliked odours and sounds, with neutral-neutral pairings serving as controls. The results confirm previous findings that the affective evaluation of previously neutral visual stimuli shifts in the direction of contingently presented affective auditory stimuli. In addition, this research shows that conditioning also occurs with affective odours that have no logical connection with the pictures.
Abstract:
This paper addresses the crucial problem of wayfinding assistance in Virtual Environments (VEs). A number of navigation aids such as maps, agents, trails and acoustic landmarks are available to support navigation in VEs; however, most of these aids are visually dominated. This work-in-progress describes a sound-based approach that intends to assist the task of 'route decision' during navigation in a VE using music. Furthermore, by using musical sounds it aims to reduce the cognitive load associated with other visually and physically dominated tasks. To achieve these goals, the approach exploits the benefits provided by music to ease and enhance the task of wayfinding, whilst making the user experience in the VE smooth and enjoyable.
Abstract:
Gait disturbances are a common feature of Parkinson’s disease, one of the most severe being freezing of gait. Sensory cueing is a common method used to facilitate stepping in people with Parkinson’s. Recent work has shown that, compared to walking to a metronome, Parkinson’s patients without freezing of gait (nFOG) showed reduced gait variability when imitating recorded sounds of footsteps made on gravel. However, it is not known whether these benefits arise from the continuity of the acoustic information or from its action-relevance. Furthermore, no study has examined whether these benefits extend to Parkinson’s patients with freezing of gait. We prepared four different auditory cues (varying in action-relevance and acoustic continuity) and asked 19 Parkinson’s patients (10 nFOG, 9 with freezing of gait (FOG)) to step in place to each cue. Results showed a superiority of action-relevant cues (regardless of cue continuity) for reducing step coefficient of variation (CV), whereas acoustic continuity was associated with a significant reduction in swing CV. Neither cue continuity nor action-relevance was independently sufficient to increase the time spent stepping before freezing; however, combining both attributes in the same cue did yield significant improvements. This study demonstrates the potential of using action sounds as sensory cues for Parkinson’s patients with freezing of gait. We suggest that the improvements shown might be considered audio-motor ‘priming’ (i.e., listening to the sounds of footsteps engages sensorimotor circuitry relevant to the production of that same action, thus effectively bypassing the defective basal ganglia).
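For context, the step and swing coefficients of variation reported above are conventionally defined as the standard deviation of the relevant gait parameter normalised by its mean; the exact parameter definitions used by the authors are not restated here, so this is only the generic form:

$\mathrm{CV} = \dfrac{\sigma}{\mu} \times 100\%$

where $\sigma$ and $\mu$ are the standard deviation and mean of, for example, step time or swing time across successive stepping cycles.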
Abstract:
Gait disorders are commonly identified in people with Parkinson’s disease (PD). The aim of this study was to investigate the effect of auditory cues and medication on kinematic, kinetic and EMG parameters during different gait phases in people with PD and in healthy elderly. Thirty subjects divided into two groups (Group 1, PD patients off and on medication; Group 2, healthy elderly) participated in this study and were instructed to walk in two experimental conditions: non-cued and cued. Kinematic, kinetic and electromyographic analyses were used to investigate the locomotor pattern. Changes in locomotor pattern (greater muscular activity) with the auditory cue were observed in PD patients. Regarding medication, improvements in locomotor parameters were observed after levodopa intake in association with the auditory cue. These results support the hypothesis that external-cue therapy could be used as a complement to drug therapy to improve the locomotor pattern of PD patients.
Abstract:
One of the fascinating properties of the central nervous system is its ability to learn: the ability to alter its functional properties adaptively as a consequence of the interactions of an animal with the environment. The auditory localization pathway provides an opportunity to observe such adaptive changes and to study the cellular mechanisms that underlie them. The midbrain localization pathway creates a multimodal map of space that represents the nervous system's associations of auditory cues with locations in visual space. Various manipulations of auditory or visual experience, especially during early life, that change the relationship between auditory cues and locations in space lead to adaptive changes in auditory localization behavior and to corresponding changes in the functional and anatomical properties of this pathway. Traces of this early learning persist into adulthood, enabling adults to reacquire patterns of connectivity that were learned initially during the juvenile period.
Abstract:
Thirteen international netballers viewed static images of scenarios taken from netball open play. Two ‘team mates’, each marked by one opponent, could be seen in each image; the two team mate-opponent pairs were located on opposite sides of the vertical meridian, such that a binary response (‘left’ or ‘right’) was required from the participant in order to select a team mate to whom they would pass the ball. For each trial, a spoken word (“left”/“right”) was presented monaurally at the onset of the visual image. Spatially invalid auditory cues (i.e., presented in the ear contralateral to the correct passing option) reduced performance accuracy relative to valid ones. Semantically invalid cues (e.g., a call of “left” when the target was located on the right) increased response times relative to valid ones. However, there were no accompanying changes in visual attention to the team mates and their markers. The effects of auditory cues on covert attentional shifts and decision-making are discussed.
Abstract:
Limited research is available on how well visual cues integrate with auditory cues to improve speech intelligibility in persons with visual impairments, such as cataracts. We investigated whether simulated cataracts interfered with participants’ ability to use visual cues to help disambiguate a spoken message in the presence of spoken background noise. We tested 21 young adults with normal visual acuity and hearing sensitivity. Speech intelligibility was tested under three conditions: auditory only with no visual input, auditory-visual with normal viewing, and auditory-visual with simulated cataracts. Central Institute for the Deaf (CID) Everyday Speech Sentences were spoken by a live talker, mimicking a pre-recorded audio track, in the presence of pre-recorded four-person background babble at a signal-to-noise ratio (SNR) of -13 dB. The talker was masked to the experimental conditions to control for experimenter bias. Relative to the normal vision condition, speech intelligibility was significantly poorer in the simulated cataract condition, t(20) = 4.17, p < .01, Cohen’s d = 1.0. These results suggest that cataracts can interfere with speech perception, which may occur through a reduction in visual cues, less effective audiovisual integration, or a combination of the two effects. These novel findings contribute to our understanding of the association between two common sensory problems in adults: reduced contrast sensitivity associated with cataracts and reduced face-to-face communication in noise.
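A brief note on the reported effect size: for a within-subjects contrast like this one, Cohen’s d is typically computed as the mean intelligibility difference between conditions divided by a standard deviation; the abstract does not state which standardiser was used, so only the generic form is sketched here:

$d = \dfrac{\bar{D}}{s}$

where $\bar{D}$ is the mean difference in intelligibility scores between the normal-vision and simulated-cataract conditions and $s$ is the chosen standardiser (e.g., the standard deviation of the difference scores or the pooled within-condition standard deviation).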
Abstract:
The ability to synchronise actions with environmental events is a fundamental skill supporting a variety of group activities. In such situations, multiple sensory cues are usually available for synchronisation, yet previous studies have suggested that auditory cues dominate those from other modalities. We examine the control of rhythmic action on the basis of auditory and haptic cues and show that performance is sensitive to both sources of information for synchronisation. Participants were required to tap the dominant hand index finger in synchrony with a metronome defined by periodic auditory tones, imposed movements of the non-dominant index finger, or both cues together. Synchronisation was least variable with the bimodal metronome as predicted by a maximum likelihood estimation (MLE) model. However, increases in timing variability of the auditory cue resulted in some departures from the MLE model. Our findings indicate the need for further investigation of the MLE account of the integration of multisensory signals in the temporal control of action.
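The MLE prediction referred to above follows the standard optimal cue-combination account; whether the authors used exactly this parameterisation is not stated, so the following is only the textbook form. If the timing estimates from the auditory and haptic metronomes are independent with variances $\sigma_A^2$ and $\sigma_H^2$, the optimally weighted bimodal estimate has variance

$\sigma_{AH}^2 = \dfrac{\sigma_A^2\,\sigma_H^2}{\sigma_A^2 + \sigma_H^2}$

with cue weights $w_A = \sigma_H^2 / (\sigma_A^2 + \sigma_H^2)$ and $w_H = \sigma_A^2 / (\sigma_A^2 + \sigma_H^2)$. The bimodal variance is therefore predicted to be lower than either unimodal variance, consistent with the reduced synchronisation variability reported for the bimodal metronome.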
Abstract:
Previous research has shown that Parkinson's disease (PD) patients can increase the speed of their movement when catching a moving ball compared to when reaching for a static ball (Majsak et al., 1998). A recent model proposed by Redgrave et al. (2010) explains this phenomenon with regard to the dichotomic organization of motor loops in the basal ganglia circuitry and the role of sensory micro-circuitries in the control of goal-directed actions. According to this model, external visual information that is relevant to the required movement can induce a switch from habitual control of movement toward an externally paced, goal-directed form of guidance, resulting in augmented motor performance (Bienkiewicz et al., 2013). In the current study, we investigated whether continuous acoustic information generated by an object in motion can enhance motor performance in an arm-reaching task in a similar way to that observed in the studies of Majsak et al. (1998, 2008). In addition, we explored whether the kinematic aspects of the movement are regulated in accordance with time-to-arrival information generated by the ball's motion as it reaches the catching zone. A group of 7 idiopathic PD patients (6 male, 1 female) performed a ball-catching task in which the acceleration (and hence ball velocity) was manipulated by adjusting the angle of the ramp. The type of sensory information (visual and/or auditory) specifying the ball's arrival at the catching zone was also manipulated. Our results showed that patients with PD demonstrate improved motor performance when reaching for a ball in motion compared to when it is stationary. We observed that PD patients can adjust their movement kinematics in accordance with the speed of a moving target, even if vision of the target is occluded and patients have to rely solely on auditory information. We demonstrate that the availability of dynamic temporal information is crucial for eliciting motor improvements in PD. Furthermore, these effects appear independent of the sensory modality through which the information is conveyed.
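The time-to-arrival information mentioned above is commonly formalised as a first-order time-to-contact estimate; the abstract does not specify the authors' exact formulation, so this is only the standard approximation:

$\tau \approx \dfrac{x(t)}{\dot{x}(t)}$

where $x(t)$ is the remaining distance of the ball from the catching zone and $\dot{x}(t)$ is its approach speed. Both optical and acoustic variables can, in principle, specify this quantity, which fits the observation that the improvements were independent of the sensory modality conveying the information.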
Abstract:
Double degree. A Work Project presented as part of the requirements for the award of a Master's Degree in Management from the NOVA School of Business and Economics and the Warsaw School of Economics.
Abstract:
Background noise should in theory hinder detection of auditory cues associated with approaching danger. We tested whether foraging chaffinches Fringilla coelebs responded to background noise by increasing vigilance, and examined whether this was explained by predation-risk compensation or by a novel-stimulus hypothesis. The former predicts that only inter-scan interval should be modified in the presence of background noise, not vigilance levels generally. This is because noise hampers auditory cue detection and increases perceived predation risk primarily when in the head-down position, and also because previous tests have shown that only inter-scan interval is correlated with predator detection ability in this system. Chaffinches only modified inter-scan interval, supporting this hypothesis. At the same time, they made significantly fewer pecks when feeding during the background noise treatment, so the increased vigilance led to a reduction in intake rate, suggesting that compensating for the increased predation risk could indirectly lead to a fitness cost. Finally, the novel-stimulus hypothesis predicts that chaffinches should habituate to the noise, which did not occur within a trial or over 5 subsequent trials. We conclude that auditory cues may be an important component of the trade-off between vigilance and feeding, and discuss possible implications for anti-predation theory and ecological processes.
Abstract:
As the fidelity of virtual environments (VEs) continues to increase, the possibility of using them as training platforms is becoming increasingly realistic for a variety of application domains, including military and emergency personnel training. In the past, there was much debate on whether the acquisition and subsequent transfer of spatial knowledge from VEs to the real world is possible, or whether the differences in medium during training would essentially be an obstacle to truly learning geometric space. In this paper, the authors present various cognitive and environmental factors that not only contribute to this process but also interact with each other to a certain degree, leading to a variable exposure-time requirement in order for the process of spatial knowledge acquisition (SKA) to occur. The cognitive factors the authors discuss include a variety of individual user differences, such as knowledge and experience; cognitive gender differences; aptitude and spatial orientation skill; and cognitive styles. Environmental factors discussed include size, spatial layout complexity and landmark distribution. It may seem obvious that, since every individual's brain is unique (not only through experience but also through genetic predisposition), a one-size-fits-all approach to training would be illogical. Furthermore, considering that various cognitive differences may emerge further when a certain stimulus is present (e.g. a complex environmental space), it makes even more sense to understand how these factors can impact spatial memory and to try to adapt the training session by providing visual/auditory cues as well as by changing the exposure-time requirements for each individual. The impact of this research domain is important to VE training in general; however, within service and military domains, guaranteeing appropriate spatial training is critical in order to ensure that disorientation does not occur in a life-or-death scenario.
Abstract:
Predators directly and indirectly affect the density and behavior of prey, and these effects may cascade down to lower trophic levels. In this study, we tested the effects of predator calls (playbacks of bird vocalizations: Tyto alba, Speotyto cunicularia, and Vanellus chilensis), predator visual stimuli (stuffed birds), and interactions of visual and auditory cues on the behavior of frugivorous phyllostomid bats in the field. In addition, we tested whether the effects of predation risk cascade down to other trophic levels by measuring rates of seed dispersal of the tree Muntingia calabura. Using video recording, we found that bats significantly decreased their foraging frequency on trees when a visual cue of T. alba was present. However, no other stimuli from the potential predatory birds, including the vocalization of T. alba, affected bat foraging frequency. Bat behavior changed for about 7 min, after which their frequency of activity gradually increased. Consequently, the presence of T. alba decreased the rate of seed removal by up to ten times. These results indicate that risk sensitivity of frugivorous phyllostomid bats depends on predator identity and presence. Among the predators used in this study, only T. alba is an effective bat predator in the Neotropics. Sound stimuli of T. alba seem not to be a cue of predation risk, possibly because their vocalizations are used only for intraspecific communication. This study emphasizes the importance of evaluating different predator stimuli on the behavior of vertebrates, as well as the effects of these stimuli on trait-mediated trophic cascades.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)