28 results for tactile cartography
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Tactile discrimination performance depends on the receptive field (RF) size of somatosensory cortical (SI) neurons. Psychophysical masking effects can reveal the RF of an idealized "virtual" somatosensory neuron. Previous studies show that top-down factors strongly affect tactile discrimination performance. Here, we show that non-informative vision of the touched body part influences tactile discrimination by modulating tactile RFs. Ten subjects performed spatial discrimination between touch locations on the forearm. Performance was improved when subjects saw their forearm compared to viewing a neutral object in the same location. The extent of visual information was relevant, since restricted view of the forearm did not have this enhancing effect. Vibrotactile maskers were placed symmetrically on either side of the tactile target locations, at two different distances. Overall, masking significantly impaired discrimination performance, but the spatial gradient of masking depended on what subjects viewed. Viewing the body reduced the effect of distant maskers, but enhanced the effect of close maskers, as compared to viewing a neutral object. We propose that viewing the body improves functional touch by sharpening tactile RFs in an early somatosensory map. Top-down modulation of lateral inhibition could underlie these effects.
Abstract:
Background: Autism spectrum conditions have a strong genetic component. Atypical sensory sensitivities are one of the core but neglected features of autism spectrum conditions. GABRB3 is a well-characterised candidate gene for autism spectrum conditions. In mice, heterozygous Gabrb3 deletion is associated with increased tactile sensitivity. However, no study has examined if tactile sensitivity is associated with GABRB3 genetic variation in humans. To test this, we conducted two pilot genetic association studies in the general population, analysing two phenotypic measures of tactile sensitivity (a parent-report and a behavioural measure) for association with 43 SNPs in GABRB3. Findings: Across both tactile sensitivity measures, three SNPs (rs11636966, rs8023959 and rs2162241) were nominally associated with both phenotypes, providing a measure of internal validation. Parent-report scores were nominally associated with six SNPs (P <0.05). Behaviourally measured tactile sensitivity was nominally associated with 10 SNPs (three after Bonferroni correction). Conclusions: This is the first human study to show an association between GABRB3 variation and tactile sensitivity. This provides support for the evidence from animal models implicating the role of GABRB3 variation in the atypical sensory sensitivity in autism spectrum conditions. Future research is underway to directly test this association in cases of autism spectrum conditions.
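A rough illustration of the multiple-testing step described above, as a minimal Python sketch: a Bonferroni correction across the 43 GABRB3 SNPs tested. The SNP identifiers come from the abstract, but the P-values are invented placeholders rather than the study's results.

```python
# Hypothetical P-values for illustration only; the real association statistics
# are not reproduced here.
n_snps = 43                      # number of GABRB3 SNPs tested in the study
alpha = 0.05                     # nominal significance threshold

p_values = {                     # placeholder values, NOT the reported results
    "rs11636966": 0.0009,
    "rs8023959": 0.004,
    "rs2162241": 0.03,
}

bonferroni_alpha = alpha / n_snps    # ~0.00116 for 43 tests

for snp, p in p_values.items():
    print(f"{snp}: P={p:.4f}  nominal={p < alpha}  "
          f"survives Bonferroni={p < bonferroni_alpha}")
```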
Abstract:
Multisensory integration involves bottom-up as well as top-down processes. We investigated the influences of top-down control on the neural responses to multisensory stimulation using EEG recording and time-frequency analyses. Participants were stimulated at the index finger or thumb of the left hand, using tactile vibrators mounted on a foam cube. Simultaneously they received a visual distractor from a light-emitting diode adjacent to the active vibrator (spatially congruent trial) or adjacent to the inactive vibrator (spatially incongruent trial). The task was to respond to the elevation of the tactile stimulus (upper or lower), while ignoring the simultaneous visual distractor. To manipulate top-down control over this multisensory stimulation, the proportion of spatially congruent (vs. incongruent) trials was changed across blocks. Our results reveal that the behavioral cost of responding to incongruent rather than congruent trials (i.e., the crossmodal congruency effect) was modulated by the proportion of congruent trials. Most importantly, the EEG gamma band response and the gamma-theta coupling were also affected by this modulation of top-down control, whereas the late theta band response related to the congruency effect was not. These findings suggest that the gamma band response is more than a marker of multisensory binding, being also sensitive to the correspondence between expected and actual multisensory stimulation. By contrast, the theta band response was affected by congruency but appears to be largely immune to stimulation expectancy.
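The abstract does not specify how gamma-theta coupling was computed; one common approach is phase-amplitude coupling measured by the mean vector length. The Python sketch below demonstrates that measure on a synthetic signal; the sampling rate, filter bands and data are assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                                  # sampling rate (Hz), assumed
t = np.arange(0, 5, 1 / fs)
# Synthetic EEG-like trace: gamma bursts whose amplitude follows theta phase
theta = np.sin(2 * np.pi * 6 * t)
gamma = (1 + theta) * 0.3 * np.sin(2 * np.pi * 40 * t)
eeg = theta + gamma + 0.1 * np.random.randn(t.size)

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

theta_phase = np.angle(hilbert(bandpass(eeg, 4, 8, fs)))    # theta phase
gamma_amp = np.abs(hilbert(bandpass(eeg, 30, 48, fs)))      # gamma amplitude envelope

# Mean vector length: magnitude of the amplitude-weighted phase vector
mvl = np.abs(np.mean(gamma_amp * np.exp(1j * theta_phase)))
print(f"gamma-theta coupling (mean vector length): {mvl:.3f}")
```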
Abstract:
This paper details an investigation into sensory substitution by means of direct electrical stimulation of the tongue for the purpose of information input to the human brain. In particular, a device has been constructed and a series of trials have been performed in order to demonstrate the efficacy and performance of an electro-tactile array mounted onto the tongue surface for the purpose of sensory augmentation. Tests have shown that by using a low resolution array a computer-human feedback loop can be successfully implemented by humans in order to complete tasks such as object tracking, surface shape identification and shape recognition with no training or prior experience with the device. Comparisons of this technique have been made with visual alternatives and these show that the tongue based tactile array can match such methods in convenience and accuracy in performing simple tasks.
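The abstract leaves the array geometry and coding scheme unspecified, so the sketch below only illustrates the general idea of driving a low-resolution electro-tactile grid from a sensor frame: block-average the frame onto the grid and quantize to a small number of stimulation levels. The grid size, level count and input frame are assumptions.

```python
import numpy as np

ROWS, COLS = 4, 4          # assumed low-resolution electrode grid
LEVELS = 8                 # assumed number of distinct stimulation intensities

def frame_to_electrode_pattern(frame: np.ndarray) -> np.ndarray:
    """Downsample a grayscale frame (H x W, values 0..1) onto the electrode
    grid by block-averaging, then quantize to discrete stimulation levels."""
    h, w = frame.shape
    pattern = np.zeros((ROWS, COLS))
    for r in range(ROWS):
        for c in range(COLS):
            block = frame[r * h // ROWS:(r + 1) * h // ROWS,
                          c * w // COLS:(c + 1) * w // COLS]
            pattern[r, c] = block.mean()
    return np.round(pattern * (LEVELS - 1)).astype(int)

# Example: a bright target in the upper-left of a 64x64 sensor frame
frame = np.zeros((64, 64))
frame[4:20, 4:20] = 1.0
print(frame_to_electrode_pattern(frame))
```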
Abstract:
As part of its Data User Element programme, the European Space Agency funded the GlobMODEL project, which aimed at investigating the scientific, technical, and organizational issues associated with the use and exploitation of remotely sensed observations, particularly from new sounders. A pilot study was performed as a "demonstrator" of the GlobMODEL idea, based on the use of new data, with a strong European heritage, not yet assimilated operationally. Two parallel assimilation experiments were performed, using either total column ozone or ozone profiles retrieved at the Royal Netherlands Meteorological Institute (KNMI) from the Ozone Monitoring Instrument (OMI). In both cases, the impact of assimilating OMI data in addition to the total ozone columns from the SCanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY) on the European Centre for Medium-Range Weather Forecasts (ECMWF) ozone analyses was assessed by means of independent measurements. We found that the impact of OMI total columns is mainly limited to the region between 20 and 80 hPa, and is particularly important at high latitudes in the Southern hemisphere, where stratospheric ozone transport and chemical depletion are generally difficult to model with accuracy. Furthermore, the assimilation experiments carried out in this work suggest that OMI DOAS (Differential Optical Absorption Spectroscopy) total ozone columns are on average larger than SCIAMACHY total columns by up to 3 DU, while OMI total columns derived from OMI ozone profiles are on average about 8 DU larger than SCIAMACHY total columns. At the same time, the demonstrator brought to light a number of issues related to the assimilation of atmospheric composition profiles, such as the shortcomings arising when the vertical resolution of the instrument is not properly accounted for in the assimilation. The GlobMODEL demonstrator accelerated the scientific and operational utilization of new observations, and its results prompted ECMWF to start the operational assimilation of OMI total column ozone data.
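On the vertical-resolution issue noted above, a standard remedy in retrieval theory (not necessarily the exact procedure used in this work) is to smooth the model profile with the retrieval's averaging kernel before comparing it with the retrieved profile. The Python sketch below shows that operation on synthetic numbers; the averaging kernel, a priori and model profiles are placeholders, not OMI or SCIAMACHY data.

```python
import numpy as np

n_levels = 10
x_model = np.linspace(1.0, 6.0, n_levels)      # model ozone profile (arbitrary units)
x_apriori = np.full(n_levels, 3.0)             # retrieval a priori profile (assumed)

# Toy averaging kernel: broad smoothing, rows summing to < 1 (limited sensitivity)
A = np.eye(n_levels) * 0.5
for i in range(n_levels - 1):
    A[i, i + 1] = A[i + 1, i] = 0.2

# Model profile as "seen" by the instrument, i.e. smoothed by its vertical resolution
x_smoothed = x_apriori + A @ (x_model - x_apriori)
print(np.round(x_smoothed, 2))
```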
Abstract:
Morphological, physical and chemical studies were carried out on soils of Mount Bambouto, a volcanic mountain of the West Cameroon highlands. These studies show that the soils of this region can be divided into seven groups according to US Soil Taxonomy [Soil taxonomy: a basic system of soil classification for making and interpreting soil surveys. USDA Agriculture Handbook 436. Washington, DC: US Government Printing Office, 1975, 754 pp.]: lithic dystrandept soils, typical dystrandept soils, oxic dystrandept soils, typical haplohumox soils, typical kandiudox soils, tropopsamment soils and umbraquox soils. A soil map of this region at a scale of 1:50,000 has been drawn up, using the seven soil groups above as soil cartography units. These soils are organised into three main categories: soils with andic characteristics in the upper region of the mountain (lithic dystrandept soils, typical dystrandept soils and oxic dystrandept soils); ferrallitic soils in the lower part of the mountain (typical haplohumox soils and typical kandiudox soils); and imperfectly developed soils (tropopsamment soils and umbraquox soils).
Abstract:
Floral nectar spurs are widely considered to influence pollinator behaviour in orchids. Spurs of 21 orchid species selected from within four molecularly circumscribed clades of subtribe Orchidinae (based on Platanthera s.l., Gymnadenia-Dactylorhiza s.l., Anacamptis s.l., Orchis s.s.) were examined under light and scanning electron microscopes in order to estimate correlations between nectar production (categorized as absent, trace, reservoir), interior epidermal papillae (categorized as absent, short, medium, long) and epidermal cell striations (categorized as apparently absent, weak, moderate, strong). Closely related congeneric species scored similarly, but more divergent species showed less evidence of phylogenetic constraints. Nectar secretion was negatively correlated with striations and positively correlated with papillae, which were especially frequent and large in species producing substantial reservoirs of nectar. We speculate that the primary function of the papillae is conserving energy through nectar resorption, and explain the presence of large papillae in a minority of deceit-pollinated species by arguing that the papillae improve pollination because they meet the tactile expectations of pollinating insects. In contrast, the prominence of striations may be a 'spandrel', simply reflecting the thickness of the overlying cuticle. Developmentally, the spur is an invagination of the labellum; it is primarily vascularized by a single 'U'-shaped primary strand, with smaller strands present in some species. Several suggestions are made for developing further, more targeted research programmes. (C) 2009 The Linnean Society of London, Botanical Journal of the Linnean Society, 2009, 160, 369-387.
Abstract:
Intelligent buildings should provide a multi-sensory experience so that visual, aural, tactile, olfactory and gustatory senses are stimulated appropriately. A lack of environmental stimuli produces a boring and unsatisfying environment. It is now known that the environment affects people at deeper levels than, say, health and safety, and consequently it can modify moods and work performance. A holistic approach is proposed which recognizes that the physical environment together with social, organizational and personal factors can enhance the productivity of occupants. This approach provides a footprint for the design of healthier and more sustainable workplaces.
Abstract:
The arrival of a student who is Blind in the School of Systems Engineering at the University of Reading has made it an interesting and challenging year for all. Visually impaired students have already graduated from other Schools of the University and the School of Systems Engineering has seen three students with visual impairment graduate recently with good degrees. These students could access materials - and do assessments - essentially by means of enlargement and judicious choice of options. The new student had previously been supported by a specialist college. She is a proficient typist and also a user of both Braille and JAWS screen reader, and she is doing a joint course in Cybernetics and Computer Science. The course requires mathematics which itself includes graphs, and also many diagrams including numerous circuit diagrams. The University bought proven equipment such as a scanner to process books into speech or Braille, and screen reading software as well as a specialist machine for producing tactile diagrams for educational use. Clearly it is also important that the student can access assessments and examinations and present answers for marking or feedback (by sighted staff). So the School also used innovative in-house tactile methods to represent diagrams. This paper discusses the success or otherwise of various modifications of course delivery and the way forward for the next three years.
Abstract:
Researchers in the rehabilitation engineering community have been designing and developing a variety of passive/active devices to help persons with limited upper extremity function to perform essential daily manipulations. Devices range from low-end tools such as head/mouth sticks to sophisticated robots using vision and speech input. While almost all of the high-end equipment developed to date relies on visual feedback alone to guide the user, providing no tactile or proprioceptive cues, the "low-tech" head/mouth sticks deliver better "feel" because of the inherent force feedback through physical contact with the user's body. However, the disadvantage of a conventional head/mouth stick is that it can only function in a limited workspace and its performance is limited by the user's strength. It therefore seems reasonable to attempt to develop a system that exploits the advantages of the two approaches: the power and flexibility of robotic systems with the sensory feedback of a headstick. The system presented in this paper reflects the design philosophy stated above. This system contains a pair of master-slave robots, with the master being operated by the user's head and the slave acting as a telestick. Described in this paper are the design, control strategies, implementation and performance evaluation of the head-controlled force-reflecting telestick system.
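As a toy illustration of force reflection (the paper's actual control strategies are not reproduced here), the Python sketch below simulates a one-degree-of-freedom bilateral loop: the slave tracks the head-driven master position, and contact forces sensed at the slave are scaled back to the user. The gains, the unit slave mass and the stiff 'wall' contact model are all assumed values.

```python
import numpy as np

dt = 0.001                              # simulation step (s)
Kp, Kd = 400.0, 20.0                    # slave position-tracking gains (assumed)
K_reflect = 0.8                         # fraction of contact force fed back to the user
wall_pos, wall_k = 0.05, 2000.0         # stiff surface the telestick can press against

x_s, v_s = 0.0, 0.0                     # slave position and velocity (unit mass assumed)
f_peak = 0.0
for step in range(2000):                # simulate 2 seconds
    t = step * dt
    x_m = 0.1 * np.sin(2 * np.pi * 0.5 * t)        # head-commanded master motion (assumed)
    f_contact = wall_k * max(0.0, x_s - wall_pos)   # environment force on the slave
    a_s = Kp * (x_m - x_s) - Kd * v_s - f_contact   # slave servo acceleration
    v_s += a_s * dt
    x_s += v_s * dt
    f_peak = max(f_peak, K_reflect * f_contact)     # force reflected onto the user's head

print(f"final slave position: {x_s:.3f} m, peak reflected force: {f_peak:.1f} N")
```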
Abstract:
People with disabilities such as quadriplegia can use mouth-sticks and head-sticks as extension devices to perform desired manipulations. These extensions provide extended proprioception, which allows users to directly feel forces and other perceptual cues, such as texture, present at the tip of the mouth-stick. Such devices are effective for two principal reasons: because of their close contact with the user's tactile and proprioceptive sensing abilities, and because they tend to be lightweight and very stiff, and can thus convey tactile and kinesthetic information with high bandwidth. Unfortunately, traditional mouth-sticks and head-sticks are limited in workspace and in the mechanical power that can be transferred because of user mobility and strength limitations. We describe an alternative implementation of the head-stick device using the idea of a virtual head-stick: a head-controlled bilateral force-reflecting telerobot. In this system the end-effector of the slave robot moves as if it were at the tip of an imaginary extension of the user's head. The design goal is for the system to have the same intuitive operation and extended proprioception as a regular mouth-stick effector, but with augmentation of workspace volume and mechanical power. The input is through a specially modified six-DOF master robot (a PerForce™ hand-controller) whose joints can be back-driven to apply forces at the user's head. The manipulation tasks in the environment are performed by a six-degree-of-freedom slave robot (the Zebra-ZERO™) with a built-in force sensor. We describe the prototype hardware/software implementation of the system, control system design, safety/disability issues, and initial evaluation tasks.
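A minimal sketch of the virtual head-stick mapping, under assumed geometry and scaling rather than the paper's actual kinematics: the slave end-effector target is the pose of an imaginary stick tip rigidly attached to the head frame, optionally scaled into the slave workspace.

```python
import numpy as np

def head_pose(yaw: float, pitch: float, position: np.ndarray) -> np.ndarray:
    """Homogeneous transform of the head frame in the world frame (Rz(yaw) @ Ry(pitch))."""
    cy, sy, cp, sp = np.cos(yaw), np.sin(yaw), np.cos(pitch), np.sin(pitch)
    R = np.array([[cy * cp, -sy, cy * sp],
                  [sy * cp,  cy, sy * sp],
                  [-sp,     0.0,      cp]])
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, position
    return T

STICK = np.eye(4)
STICK[:3, 3] = [0.4, 0.0, 0.0]         # assumed 0.4 m virtual stick along the head x-axis
SCALE = 2.0                            # assumed workspace scaling from master to slave

T_head = head_pose(yaw=0.2, pitch=-0.1, position=np.array([0.0, 0.0, 1.2]))
T_tip = T_head @ STICK                 # pose of the imaginary stick tip
slave_target = T_tip.copy()
slave_target[:3, 3] *= SCALE           # scaled into the slave workspace
print(np.round(slave_target[:3, 3], 3))  # commanded slave end-effector position
```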
Abstract:
We studied how the integration of seen and felt tactile stimulation modulates somatosensory processing, and investigated whether visuotactile integration depends on temporal contiguity of stimulation, and its coherence with a pre-existing body representation. During training, participants viewed a rubber hand or a rubber object that was tapped either synchronously with stimulation of their own hand, or in an uncorrelated fashion. In a subsequent test phase, somatosensory event-related potentials (ERPs) were recorded to tactile stimulation of the left or right hand, to assess how tactile processing was affected by previous visuotactile experience during training. An enhanced somatosensory N140 component was elicited after synchronous, compared with uncorrelated, visuotactile training, irrespective of whether participants viewed a rubber hand or rubber object. This early effect of visuotactile integration on somatosensory processing is interpreted as a candidate electrophysiological correlate of the rubber hand illusion that is determined by temporal contiguity, but not by pre-existing body representations. ERP modulations were observed beyond 200 msec post-stimulus, suggesting an attentional bias induced by visuotactile training. These late modulations were absent when the stimulation of a rubber hand and the participant's own hand was uncorrelated during training, suggesting that pre-existing body representations may affect later stages of tactile processing.
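For readers unfamiliar with the ERP measure, the sketch below shows, on synthetic data, how stimulus-locked epochs are averaged and how a mean amplitude could be read out in an N140 window (roughly 120-170 msec). The sampling rate, trial count and waveform are invented for illustration and do not reproduce the authors' recordings.

```python
import numpy as np

fs = 500                                  # sampling rate (Hz), assumed
t = np.arange(-0.1, 0.4, 1 / fs)          # epoch from -100 to +400 ms

rng = np.random.default_rng(0)
n_trials = 80
# Synthetic epochs: a negative deflection near 140 ms buried in noise
n140 = -2.0 * np.exp(-((t - 0.14) ** 2) / (2 * 0.02 ** 2))
epochs = n140 + rng.normal(0, 5.0, size=(n_trials, t.size))

erp = epochs.mean(axis=0)                 # average across trials (the ERP)
window = (t >= 0.12) & (t <= 0.17)        # assumed N140 measurement window
print(f"mean N140-window amplitude: {erp[window].mean():.2f} microvolts")
```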
Abstract:
The brain keeps track of the changing positions of body parts in space using a spatial body schema. When subjects localise a tactile stimulus on the skin, they might either use a somatotopic body map, or use a body schema to identify the location of the stimulation in external space. Healthy subjects were touched on the fingertips, with the hands in one of two postures: either the right hand was vertically above the left, or the fingers of both hands were interwoven. Subjects made speeded verbal responses to identify either the finger or the hand that was touched. Interweaving the fingers significantly impaired hand identification across several experiments, but had no effect on finger identification. Our results suggest that identification of fingers occurs in a somatotopic representation or finger schema. Identification of hands uses a general body schema, and is influenced by external spatial location. This dissociation implies that touches on the finger can only be identified with a particular hand after a process of assigning fingers to hands. This assignment is based on external spatial location. Our results suggest a role of the body schema in the identification of structural body parts from touch.
Abstract:
Perception of our own bodies is based on integration of visual and tactile inputs, notably by neurons in the brain’s parietal lobes. Here we report a behavioural consequence of this integration process. Simply viewing the arm can speed up reactions to an invisible tactile stimulus on the arm. We observed this visual enhancement effect only when a tactile task required spatial computation within a topographic map of the body surface and the judgements made were close to the limits of performance. This effect of viewing the body surface was absent or reversed in tasks that either did not require a spatial computation or in which judgements were well above performance limits. We consider possible mechanisms by which vision may influence tactile processing.
Abstract:
Research in the last four decades has brought a considerable advance in our understanding of how the brain synthesizes information arising from different sensory modalities. Indeed, many cortical and subcortical areas, beyond those traditionally considered to be ‘associative,’ have been shown to be involved in multisensory interaction and integration (Ghazanfar and Schroeder 2006). Visuo-tactile interaction is of particular interest, because of the prominent role played by vision in guiding our actions and anticipating their tactile consequences in everyday life. In this chapter, we focus on the functional role that visuo-tactile processing may play in driving two types of body-object interactions: avoidance and approach. We will first review some basic features of visuo-tactile interactions, as revealed by electrophysiological studies in monkeys. These will prove to be relevant for interpreting the subsequent evidence arising from human studies. A crucial point that will be stressed is that these visuo-tactile mechanisms have not only sensory, but also motor-related activity that qualifies them as multisensory-motor interfaces. Evidence will then be presented for the existence of functionally homologous processing in the human brain, both from neuropsychological research in brain-damaged patients and in healthy participants. The final part of the chapter will focus on some recent studies in humans showing that the human motor system is provided with a multisensory interface that allows for continuous monitoring of the space near the body (i.e., peripersonal space). We further demonstrate that multisensory processing can be modulated on-line as a consequence of interacting with objects. This indicates that, far from being passive, the monitoring of peripersonal space is an active process subserving actions between our body and objects located in the space around us.