930 results for Gaze Angle
Abstract:
A desktop tool for replay and analysis of gaze-enhanced multiparty virtual collaborative sessions is described. We linked three CAVE(TM)-like environments, creating a multiparty collaborative virtual space in which avatars are animated with 3D gaze as well as head and hand motions in real time. Log files are recorded for subsequent playback and analysis using the proposed software tool. During replay the user can rotate the viewpoint and navigate the simulated 3D scene. The playback mechanism relies on multiple distributed log files captured at every site. This structure enables an observer to experience the latencies of movement and information transfer at every site, which is important for conversation analysis. Playback uses an event-replay algorithm, modified to allow fast traversal of the scene by selective rendering of nodes and to simulate fast random access. The tool's analysis module can show each participant's 3D gaze points and the areas where gaze has been concentrated.
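To make the replay mechanism concrete, here is a minimal sketch in Python of event replay over merged per-site logs. It assumes log entries are (timestamp, site, event) tuples; all names are hypothetical, since the tool's real log format and API are not given in the abstract.

```python
import heapq

# Minimal sketch of event replay across distributed per-site logs.
# Assumes each log is a time-ordered list of (timestamp, site_id, event)
# tuples; merging them preserves per-site latencies, as described above.

def replay(logs, render, seek_to=0.0):
    """Merge per-site logs by timestamp and replay events from seek_to.

    logs    -- list of per-site event lists, each sorted by timestamp
    render  -- callback invoked for every event at or after seek_to
    seek_to -- simulated fast random access: earlier events are skipped
               here (a real tool would apply them to scene state silently)
    """
    merged = heapq.merge(*logs)      # global time order, sites interleaved
    for timestamp, site_id, event in merged:
        if timestamp < seek_to:
            continue                 # fast traversal: no rendering
        render(timestamp, site_id, event)

# Example: replay two sites' logs from t = 1.0 onward.
site_a = [(0.5, "A", "head_move"), (1.2, "A", "gaze_shift")]
site_b = [(0.9, "B", "hand_move"), (1.5, "B", "gaze_shift")]
replay([site_a, site_b], render=lambda t, s, e: print(t, s, e), seek_to=1.0)
```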
Abstract:
Participants' eye gaze is generally not captured or represented in immersive collaborative virtual environment (ICVE) systems. We present EyeCVE, which uses mobile eye-trackers to drive the gaze of each participant's virtual avatar, thus supporting remote mutual eye contact and awareness of others' gaze in a perceptually unfragmented shared virtual workspace. We detail trials in which participants took part in three-way conferences between remote CAVE(TM) systems linked via EyeCVE. Eye-tracking data was recorded and used to evaluate interaction, confirming the system's support for the use of gaze as a communicational and management resource in multiparty conversational scenarios. We point toward subsequent investigation of eye-tracking in ICVEs for enhanced remote social interaction and analysis.
Abstract:
Eye gaze is an important conversational resource that until now could only be supported across a distance if people were rooted to the spot. We introduce EyeCVE, the world's first telepresence system that allows people in different physical locations not only to see what the others are doing but also to follow their eyes, even when walking about. Projected into each space are avatar representations of the remote participants, which reproduce not only body, head, and hand movements, but also those of the eyes. Spatial and temporal alignment of the remote spaces allows the focus of gaze, as well as activity and gesture, to be used as a resource for non-verbal communication. The temporal challenge met was to reproduce eye movements quickly enough and often enough for their focus to be interpreted during multi-way interaction, alongside other verbal and non-verbal communication. The spatial challenge met was to maintain communicational eye gaze while allowing free movement of participants within a virtually shared common frame of reference. This paper reports on the technical, and especially the temporal, characteristics of the system.
Abstract:
Our eyes are input sensors which provide our brains with streams of visual data. They have evolved to be extremely efficient, constantly darting to and fro to rapidly build up a picture of the salient entities in a viewed scene. These actions are almost subconscious. However, they can provide telling signs of how the brain is decoding the visuals, and can indicate emotional responses before the viewer becomes aware of them. In this paper we discuss a method of tracking a user's eye movements and using these to calculate their gaze within an immersive virtual environment. We investigate how these gaze patterns can be captured and used to identify viewed virtual objects, and discuss how this can be used as a natural method of interacting with the virtual environment. We describe a flexible tool that has been developed to achieve this, and detail initial validating applications that prove the concept.
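As an illustration of the object-identification step, here is a minimal sketch assuming objects are approximated by bounding spheres and that a calibrated gaze ray (origin and direction) is already available from the tracker. The names are illustrative, not the tool's actual interface.

```python
import numpy as np

# Minimal sketch: find which virtual object a gaze ray is pointing at,
# using bounding spheres as a stand-in for full mesh intersection.

def viewed_object(origin, direction, objects):
    """Return the nearest object whose bounding sphere the gaze ray hits.

    objects -- list of (name, center, radius) tuples
    """
    direction = direction / np.linalg.norm(direction)
    best_name, best_t = None, np.inf
    for name, center, radius in objects:
        oc = np.asarray(center, dtype=float) - origin
        t = np.dot(oc, direction)            # closest approach along the ray
        if t < 0:
            continue                         # object is behind the viewer
        miss = np.linalg.norm(oc - t * direction)
        if miss <= radius and t < best_t:    # ray passes within the sphere
            best_name, best_t = name, t
    return best_name

scene = [("teapot", (0, 0, 5), 0.5), ("lamp", (2, 1, 8), 0.7)]
print(viewed_object(np.zeros(3), np.array([0.0, 0.0, 1.0]), scene))  # teapot
```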
Abstract:
In collaborative situations, eye gaze is a critical element of behavior which supports and fulfills many activities and roles. In current computer-supported collaboration systems, eye gaze is poorly supported. Even in a state-of-the-art video conferencing system such as the Access Grid, although one can see the face of the user, much of the communicative power of eye gaze is lost. This article gives an overview of some preliminary work that looks towards integrating eye gaze into an immersive collaborative virtual environment and assessing the impact that this would have on interaction between the users of such a system. Three experiments were conducted to assess the efficacy of eye gaze within immersive virtual environments. In each experiment, subjects observed on a large screen the eye-gaze behavior of an avatar. The eye-gaze behavior of that avatar had previously been recorded from a user with the use of a head-mounted eye tracker. The first experiment assessed the difference between users' abilities to judge what objects an avatar is looking at with only head gaze displayed and with both eye- and head-gaze data displayed. The results from the experiment show that eye gaze is of vital importance to subjects' ability to correctly identify what a person is looking at in an immersive virtual environment. The second experiment examined whether a monocular or binocular eye-tracker would be required. This was examined by testing subjects' ability to identify where an avatar was looking from eye direction alone, or from eye direction combined with convergence. This experiment showed that convergence had a significant impact on the subjects' ability to identify where the avatar was looking. The final experiment looked at the effects of stereo and mono viewing of the scene, with the subjects again asked to identify where the avatar was looking. This experiment showed that there was no difference in the subjects' ability to detect where the avatar was gazing. This is followed by a description of how the eye-tracking system has been integrated into an immersive collaborative virtual environment and some preliminary results from the use of such a system.
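The convergence idea in the second experiment can be illustrated with a short sketch: the 3D gaze point estimated as the point of closest approach between the two eye rays. This is the standard closest-point construction, assumed here rather than taken from the article; the eye positions and directions are illustrative.

```python
import numpy as np

# Minimal sketch of binocular convergence: estimate the 3D gaze point
# as the midpoint of the shortest segment between the two eye rays.

def convergence_point(p_l, d_l, p_r, d_r):
    """Midpoint of the shortest segment between rays p + t*d."""
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w = p_l - p_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b                 # approaches 0 for parallel rays
    t_l = (b * e - c * d) / denom
    t_r = (a * e - b * d) / denom
    return 0.5 * ((p_l + t_l * d_l) + (p_r + t_r * d_r))

left_eye  = np.array([-0.03, 0.0, 0.0])   # ~6 cm interocular distance
right_eye = np.array([ 0.03, 0.0, 0.0])
target    = np.array([ 0.2, 0.1, 1.0])    # fixated point, 1 m away
print(convergence_point(left_eye, target - left_eye,
                        right_eye, target - right_eye))  # ~= target
```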
Abstract:
For efficient collaboration between participants, eye gaze is seen as critical for interaction. Video conferencing either does not attempt to support eye gaze (e.g. AccessGrid) or only approximates it in round-table conditions (e.g. life-size telepresence). Immersive collaborative virtual environments represent remote participants through avatars that follow their tracked movements. By additionally tracking people's eyes and representing their movement on their avatars, the line of gaze can be faithfully reproduced rather than approximated. This paper presents the results of initial work that tested whether the focus of gaze could be more accurately gauged if tracked eye movement was added to that of the head of an avatar observed in an immersive VE. An experiment was conducted to assess the difference between users' abilities to judge what objects an avatar is looking at with only head movements displayed, the eyes remaining static, and with both eye-gaze and head-movement information displayed. The results from the experiment show that eye gaze is of vital importance to subjects' ability to correctly identify what a person is looking at in an immersive virtual environment. This is followed by a description of the work now being undertaken following the positive results from the experiment. We discuss the integration of an eye tracker more suitable for immersive mobile use, and the software and techniques that were developed to integrate the user's real-world eye movements into calibrated eye gaze in an immersive virtual world. This is to be used in the creation of an immersive collaborative virtual environment supporting eye gaze and in its ongoing experiments. Copyright (C) 2009 John Wiley & Sons, Ltd.
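For flavour, here is a minimal sketch of one common eye-tracker calibration approach: a biquadratic least-squares map from pupil coordinates to gaze angles, fitted over known fixation targets. This is an assumption about the class of technique involved, not the paper's documented method, and all values are synthetic.

```python
import numpy as np

# Minimal sketch of polynomial gaze calibration: fit a biquadratic map
# from 2D pupil coordinates to (horizontal, vertical) gaze angles using
# fixations on a known target grid, then apply it to new samples.

def features(pupil_xy):
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_calibration(pupil_xy, gaze_angles):
    """Least-squares fit: gaze_angles ~= features(pupil_xy) @ B."""
    B, *_ = np.linalg.lstsq(features(pupil_xy), gaze_angles, rcond=None)
    return B

def apply_calibration(B, pupil_xy):
    return features(pupil_xy) @ B

# Calibration: user fixates a 3x3 grid of targets with known gaze angles
# (here generated from a simple linear ground truth for the demo).
grid = [0.1, 0.5, 0.9]
pupil = np.array([[x, y] for y in grid for x in grid])
angles = np.array([[50 * x - 25, 12.5 - 25 * y] for y in grid for x in grid])
B = fit_calibration(pupil, angles)
print(apply_calibration(B, np.array([[0.5, 0.5]])))  # ~= [0, 0], straight ahead
```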
Abstract:
The capillary flow alignment of the thermotropic liquid crystal 4-n-octyl-4′-cyanobiphenyl in the nematic and smectic phases is investigated using time-resolved synchrotron small-angle x-ray scattering. Samples were cooled from the isotropic phase to erase prior orientation. Upon cooling through the nematic phase under Poiseuille flow in a circular capillary, a transition from the alignment of mesogens along the flow direction to the alignment of layers along the flow direction (mesogens perpendicular to flow) appears to occur continuously at the cooling rate applied. The transition is centered on a temperature at which the Leslie viscosity coefficient α3 changes sign. The configuration with layers aligned along the flow direction is also observed in the smectic phase. The transition in the nematic phase on cooling has previously been ascribed to an aligning-nonaligning or tumbling transition. At high flow rates there is evidence for tumbling around an average alignment of layers along the flow direction. At lower flow rates this orientation is more clearly defined. The layer alignment is ascribed to surface-induced ordering propagating into the bulk of the capillary, an observation supported by the parallel alignment of layers observed for a static sample at low temperatures in the nematic phase.
Abstract:
Recent empirical studies have shown that multi-angle spectral data can be useful for predicting canopy height, but the physical reason for this correlation was not understood. We follow the concept of canopy spectral invariants, specifically escape probability, to gain insight into the observed correlation. Airborne Multi-Angle Imaging Spectrometer (AirMISR) and airborne Laser Vegetation Imaging Sensor (LVIS) data acquired during a NASA Terrestrial Ecology Program aircraft campaign underlie our analysis. Two multivariate linear regression models were developed to estimate LVIS height measures from 28 AirMISR multi-angle spectral reflectances and from the spectrally invariant escape probability at 7 AirMISR view angles. Both models achieved nearly the same accuracy, suggesting that canopy spectral invariant theory can explain the observed correlation. We hypothesize that the escape probability is sensitive to the aspect ratio (crown diameter to crown height). The multi-angle spectral data alone therefore may not provide enough information to retrieve canopy height globally.
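A minimal sketch of the regression form both models share: ordinary least squares mapping per-pixel multi-angle predictors to a lidar canopy-height measure. The data below are random stand-ins, not AirMISR or LVIS values; the study used 28 spectral reflectances for one model and escape probability at 7 view angles for the other.

```python
import numpy as np

# Minimal sketch of a multivariate linear regression of the kind used to
# estimate LVIS height measures from multi-angle predictors.

rng = np.random.default_rng(0)
n_pixels = 500
X = rng.random((n_pixels, 7))     # stand-in: escape probability per view angle
beta_true = rng.normal(size=7)
y = X @ beta_true + rng.normal(scale=0.1, size=n_pixels)  # synthetic "height"

A = np.column_stack([np.ones(n_pixels), X])   # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}")          # fit quality on the synthetic data
```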
Abstract:
Electrospinning is a technique employed to produce nanoscale to microscale fibres by applying a high voltage to a spinneret containing a polymer solution. Here we examine how small-angle neutron scattering data can be modelled to analyse the polymer chain conformation. We prepared 1:1 blends of deuterated and hydrogenated atactic-polystyrene fibres from solutions in N,N-dimethylformamide and methyl ethyl ketone. The fibres themselves often contain pores or voids within the internal structure on length scales that can interfere with scattering experiments. A model has been developed to fit the scattering data and obtain values for the radius of gyration of the polymer molecules within the fibres, which includes the scattering from the voids. Using this model we find that the radius of gyration is 20% larger than in the bulk state and that the chains are slightly extended parallel to the fibre axis.
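As an illustration of this kind of analysis, here is a minimal sketch fitting the standard Debye coil form factor to synthetic scattering data to extract a radius of gyration. The paper's actual model additionally includes a void-scattering term, which is omitted here.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch: recover Rg by fitting the Debye form factor
# I(q) = scale * 2*(exp(-x) + x - 1)/x^2, with x = (q*Rg)^2,
# to synthetic small-angle scattering data.

def debye(q, rg, scale):
    x = (q * rg) ** 2
    return scale * 2 * (np.exp(-x) + x - 1) / x**2

q = np.linspace(0.005, 0.1, 60)             # 1/Angstrom
rg_true = 120.0                              # Angstrom, synthetic value
I = debye(q, rg_true, 1.0) + np.random.default_rng(1).normal(0, 1e-3, q.size)

(rg_fit, scale_fit), _ = curve_fit(debye, q, I, p0=[100.0, 1.0])
print(f"fitted Rg = {rg_fit:.1f} A")         # ~= 120
```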
Abstract:
The objective of a Visual Telepresence System is to provide the operator with a high-fidelity image from a remote stereo camera pair linked to a pan/tilt device, such that the operator may reorient the camera position by head movement. Systems such as these, which utilise virtual-reality-style helmet-mounted displays, have a number of limitations. The geometry of the camera positions and of the displays is generally fixed, and is most suitable only for viewing elements of a scene at a particular distance. To address such limitations, a prototype system has been developed in which the geometry of the displays and cameras is dynamically controlled by the eye movement of the operator. This paper explores why it is necessary to actively adjust the display system as well as the cameras, and justifies the use of mechanical adjustment of the displays as an alternative to adjustment by electronic or image-processing methods. The electronic and mechanical design is described, including the optical arrangements and control algorithms. The performance and accuracy of the system are assessed with respect to eye movement.
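The underlying geometry can be illustrated briefly: the convergence (vergence) angle a stereo camera pair with baseline b needs in order to fixate a point at distance d, which shows why a fixed geometry suits only one viewing distance. The values below are illustrative, not taken from the paper.

```python
import math

# Minimal sketch: total convergence angle between the two optical axes
# of a stereo pair fixating a point at distance d (baseline b).

def vergence_deg(baseline_m, distance_m):
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

for d in (0.5, 1.0, 5.0):
    print(f"d = {d} m -> {vergence_deg(0.065, d):.2f} deg")
# The angle changes sharply with distance, motivating dynamic adjustment.
```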
Abstract:
BACKGROUND: Humans from an early age look longer at preferred stimuli and also typically look longer at facial expressions of emotion, particularly happy faces. Atypical gaze patterns towards social stimuli are common in Autism Spectrum Conditions (ASC). However, it is unknown whether gaze fixation patterns have any genetic basis. In this study, we tested whether variations in the cannabinoid receptor 1 (CNR1) gene are associated with gaze duration towards happy faces. This gene was selected because CNR1 is a key component of the endocannabinoid system, which is involved in processing reward, and in our previous fMRI study we found that variations in CNR1 modulate the striatal response to happy (but not disgust) faces. The striatum is involved in guiding gaze to rewarding aspects of a visual scene. We aimed to validate and extend this result in another sample using a different technique (gaze tracking). METHODS: 30 volunteers (13 males, 17 females) from the general population observed dynamic emotion expressions on a screen while their eye movements were recorded. They were genotyped for the same four SNPs in the CNR1 gene tested in our earlier fMRI study. RESULTS: Two SNPs (rs806377 and rs806380) were associated with differential gaze duration for happy (but not disgust) faces. Importantly, the allelic groups associated with a greater striatal response to happy faces in the fMRI study were associated with longer gaze duration for happy faces. CONCLUSIONS: These results suggest that CNR1 variations modulate the striatal function that underlies the perception of signals of social reward, such as happy faces. This suggests that CNR1 is a key element in the molecular architecture of the perception of certain basic emotions. This may have implications for understanding neurodevelopmental conditions marked by atypical eye contact and facial emotion processing, such as ASC.
Abstract:
A proof of the methane tetrahedral bond angle can be obtained by using spherical polar coordinates to calculate the Cartesian coordinates of the hydrogen atoms, then taking the dot product of the bond vectors.
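A worked version of the dot-product step, using the standard placement of the hydrogens at alternate corners of a cube centred on the carbon (an assumption for brevity; the article's own construction uses spherical polar coordinates):

```latex
% Two C--H bond vectors chosen along alternate corners of a cube
% centered on the carbon atom.
\[
\mathbf{v}_1 = (1,1,1), \qquad \mathbf{v}_2 = (1,-1,-1)
\]
\[
\cos\theta
  = \frac{\mathbf{v}_1 \cdot \mathbf{v}_2}{\lvert\mathbf{v}_1\rvert\,\lvert\mathbf{v}_2\rvert}
  = \frac{1 - 1 - 1}{\sqrt{3}\,\sqrt{3}}
  = -\frac{1}{3}
\]
\[
\theta = \arccos\!\left(-\tfrac{1}{3}\right) \approx 109.47^\circ
\]
```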
Abstract:
The rotational symmetry of a methane molecule can be used to great advantage to calculate the bond angle. The problem is worked out in this article.
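A sketch of a symmetry argument of this kind (not necessarily the article's exact derivation): by tetrahedral symmetry the four unit bond vectors sum to zero, and every pair makes the same angle, which fixes that angle.

```latex
% The four unit C--H bond vectors sum to the zero vector; expanding the
% squared norm of the sum gives 4 diagonal terms and 12 cross terms.
\[
\Bigl\lvert \sum_{i=1}^{4} \hat{\mathbf{v}}_i \Bigr\rvert^2
  = \sum_{i=1}^{4} \lvert\hat{\mathbf{v}}_i\rvert^2
  + \sum_{i \ne j} \hat{\mathbf{v}}_i \cdot \hat{\mathbf{v}}_j
  = 4 + 12\cos\theta = 0
\]
\[
\cos\theta = -\tfrac{1}{3}
  \quad\Longrightarrow\quad
  \theta \approx 109.47^\circ
\]
```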