877 results for eye-movements
Abstract:
ESTEVES, A. M., M. T. DE MELLO, M. PRADELLA-HALLINAN, and S. TUFIK. Effect of Acute and Chronic Physical Exercise on Patients with Periodic Leg Movements. Med. Sci. Sports Exerc., Vol. 41, No. 1, pp. 237-242, 2009. Purpose: Nonpharmacological interventions may lead to an improvement in sleep quality. The objective of our study was to evaluate the effects of acute intensive exercise and chronic exercise on sleep patterns in patients with periodic leg movements (PLM). Methods: The study involved acute and chronic exercise. The acute intensive exercise group consisted of 22 volunteers who underwent a maximum effort test and a polysomnography (PSG) on the same night. The chronic exercise group included 11 patients who performed 72 physical training sessions and underwent three PSG studies, on the nights of sessions 1, 36, and 72. Blood samples were collected from both the acute and chronic groups for beta-endorphin measurement. Results: Our results showed that both forms of physical exercise lowered PLM levels. The acute physical exercise increased sleep efficiency and rapid eye movement (REM) sleep and reduced wake after sleep onset, whereas the chronic physical exercise increased sleep efficiency and REM sleep and reduced sleep latency. We also found a significant negative correlation between beta-endorphin release after acute intensive exercise and PLM levels (r = -0.63). Conclusion: Physical exercise may improve sleep patterns and reduce PLM levels. The correlation between beta-endorphin release after acute intensive exercise and PLM levels might be associated with the impact physical exercise has on the opioidergic system. We suggest that physical exercise may be a useful nonpharmacological treatment for PLM.
Abstract:
One of the main challenges for developers of new human-computer interfaces is to provide a more natural way of interacting with computer systems that avoids excessive hand and finger movements. This also offers a valuable alternative communication pathway to people with motor disabilities. This paper describes the construction of a low-cost eye tracker using a fixed-head setup: a webcam, a laptop and an infrared lighting source were used together with a simple frame to fix the user's head. Furthermore, detailed information is given on the various image processing techniques used to extract the centre of the pupil, and different methods to calculate the point of gaze are discussed. An overall accuracy of 1.5 degrees was obtained while keeping the hardware cost of the device below 100 euros.
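The pupil-centre extraction step mentioned above is not spelled out in the abstract. As a hedged illustration of one common approach for a dark-pupil infrared setup, the pupil can be segmented by intensity thresholding and located as the centroid of the resulting mask; the function name and threshold value below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def pupil_center(gray, threshold=50):
    """Estimate the pupil centre as the centroid of dark pixels.

    gray: 2-D uint8 array (grayscale eye image).
    threshold: pixels darker than this are treated as pupil
               (an assumed value; a real tracker would tune it).
    Returns (row, col) of the centroid, or None if nothing is dark enough.
    """
    mask = gray < threshold           # binary pupil mask
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)     # coordinates of pupil pixels
    return float(rows.mean()), float(cols.mean())  # centroid of the mask

# Synthetic test image: bright background with a dark "pupil" blob
img = np.full((100, 100), 200, dtype=np.uint8)
img[40:60, 30:50] = 10                # dark square centred at (49.5, 39.5)
print(pupil_center(img))              # prints (49.5, 39.5)
```

A production tracker would add noise filtering and ellipse fitting on top of this, but the threshold-and-centroid core is the usual starting point for low-cost dark-pupil systems.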
Abstract:
A robot-mounted camera is useful in many machine vision tasks as it allows control over view direction and position. In this paper we report a technique for calibrating both the robot and the camera using only a single corresponding point. All existing head-eye calibration systems we have encountered rely on pre-calibrated robots, pre-calibrated cameras, special calibration objects or combinations of these. Our method avoids large-scale non-linear optimizations by recovering the parameters in small dependent groups. This is done by performing a series of planned but initially uncalibrated robot movements. Many of the kinematic parameters are obtained using only camera views in which the calibration feature is at, or near, the image center, thus avoiding errors that could be introduced by lens distortion. The calibration is shown to be both stable and accurate. The robotic system we use consists of a camera with pan-tilt capability mounted on a Cartesian robot, providing a total of 5 degrees of freedom.
Abstract:
Purpose: To investigate the dynamics of eyelid movements in newborn infants and preschool-age children. Methods: Fifty newborn infants and 200 preschool children aged 4-6 years were examined. Images of each child, with his or her eyes in the primary position looking at an object placed at the child's height, were recorded with a digital video camera for 3 minutes. Complete and incomplete blink rates and opening, closing and complete blink times were calculated. Results: Newborn infants presented a lower number of incomplete movements than preschool children. The complete blink rate was lower in newborn infants (6.2 blinks/minute) than in preschool children (8.0 blinks/minute). Eyelid closing, opening and complete blink times were longer in newborn infants than in preschool children at all observation times. Conclusions: Newborn infants had a different pattern of eyelid movement compared with preschool children. Specific characteristics found particularly in this group of children, such as immaturity of the neural system and a more resistant tear film, may partly explain these findings.
Abstract:
The aim of this study was to determine the role of head, eye and arm movements during the execution of a table tennis forehand stroke. Three-dimensional kinematic analysis of line-of-gaze, arm and ball was used to describe visual and motor behaviour. Skilled and less skilled participants returned the ball to cued right or left target areas under three levels of temporal constraint: pre-, early- and late-cue conditions. In the pre- and early-cue conditions, both high and low skill participants tracked the ball early in flight and kept gaze stable on a location in advance of the ball before ball-bat contact. Skilled participants demonstrated an earlier onset of ball tracking and recorded higher performance accuracy than less skilled counterparts. The manipulation of cue condition showed the limits of adaptation to maintain accuracy on the target. Participants were able to accommodate the constraints imposed by the early-cue condition by using a shorter quiet eye duration, earlier quiet eye offset and reduced arm velocity at contact. In the late-cue condition, modifications to gaze, head and arm movements were not sufficient to preserve accuracy. The findings highlight the functional coupling between perception and action during time-constrained, goal-directed actions.
Abstract:
Speech is often a multimodal process, presented audiovisually through a talking face. One area of speech perception influenced by visual speech is speech segmentation, or the process of breaking a stream of speech into individual words. Mitchel and Weiss (2013) demonstrated that a talking face contains specific cues to word boundaries and that subjects can correctly segment a speech stream when given a silent video of a speaker. The current study expanded upon these results, using an eye tracker to identify highly attended facial features of the audiovisual display used in Mitchel and Weiss (2013). In Experiment 1, subjects were found to spend the most time watching the eyes and mouth, with a trend suggesting that the mouth was viewed more than the eyes. Although subjects displayed significant learning of word boundaries, performance was not correlated with gaze duration on any individual feature, nor was performance correlated with a behavioral measure of autistic-like traits. However, trends suggested that as autistic-like traits increased, gaze duration on the mouth increased and gaze duration on the eyes decreased, similar to significant trends seen in autistic populations (Boraston & Blakemore, 2007). In Experiment 2, the same video was modified so that a black bar covered the eyes or mouth. Both videos elicited learning of word boundaries that was equivalent to that seen in the first experiment. Again, no correlations were found between segmentation performance and SRS scores in either condition. These results, taken with those of Experiment 1, suggest that neither the eyes nor the mouth are critical to speech segmentation and that perhaps more global head movements indicate word boundaries (see Graf, Cosatto, Strom, & Huang, 2002). Future work will elucidate the contribution of individual features relative to global head movements, as well as extend these results to additional types of speech tasks.
Abstract:
Eye-movement abnormalities in schizophrenia are a well-established phenomenon that has been observed in many studies. In such studies, visual targets are usually presented in the center of the visual field, and the subject's head remains fixed. However, in everyday life, targets may also appear in the periphery. This study is among the first to investigate eye and head movements in schizophrenia by presenting targets in the periphery of the visual field.
Abstract:
Coordinated eye and head movements simultaneously occur to scan the visual world for relevant targets. However, measuring both eye and head movements in experiments allowing natural head movements may be challenging. This paper provides an approach to study eye-head coordination: First, we demonstrate the capabilities and limits of the eye-head tracking system used, and compare it to other technologies. Second, a behavioral task is introduced to invoke eye-head coordination. Third, a method is introduced to reconstruct signal loss in video-based oculography caused by cornea reflection artifacts in order to extend the tracking range. Finally, parameters of eye-head coordination are identified using EHCA (eye-head coordination analyzer), a MATLAB software package developed to analyze eye-head shifts. To demonstrate the capabilities of the approach, a study with 11 healthy subjects was performed to investigate motion behavior. The approach presented here is discussed as an instrument to explore eye-head coordination, which may lead to further insights into attentional and motor symptoms of certain neurological or psychiatric diseases, e.g., schizophrenia.
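The abstract does not detail how signal loss from cornea-reflection artifacts is reconstructed. As a hedged illustration of the general idea, short dropouts in a gaze trace can be bridged by linear interpolation while longer gaps are left as missing; `fill_gaps` and `max_gap` below are hypothetical names and thresholds, not the paper's actual method:

```python
import numpy as np

def fill_gaps(signal, max_gap=5):
    """Linearly interpolate short runs of NaN in a 1-D gaze trace.

    signal: 1-D float array with NaN marking samples lost to artifacts.
    max_gap: longest run of missing samples to reconstruct (assumed value);
             longer gaps are left as NaN and treated as true signal loss.
    """
    x = np.asarray(signal, dtype=float)
    isnan = np.isnan(x)
    if not isnan.any():
        return x.copy()
    idx = np.arange(len(x))
    filled = x.copy()
    # Interpolate every missing sample from the valid neighbours
    filled[isnan] = np.interp(idx[isnan], idx[~isnan], x[~isnan])
    # Revert runs of missing samples that are longer than max_gap
    run_start = None
    for i, missing in enumerate(list(isnan) + [False]):
        if missing and run_start is None:
            run_start = i
        elif not missing and run_start is not None:
            if i - run_start > max_gap:
                filled[run_start:i] = np.nan
            run_start = None
    return filled

trace = [0.0, 1.0, float('nan'), float('nan'), 4.0]
print(fill_gaps(trace))   # the two missing samples become 2.0 and 3.0
```

Real reconstruction methods may use both eyes, model-based smoothing, or the head signal; the sketch only shows why distinguishing short artifacts from genuine dropouts matters when extending the tracking range.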
Abstract:
Cataract is a known condition leading to opacification of the eye lens, causing partial or total blindness. Mutations are known to cause autosomal dominant or recessive inherited forms of cataracts in humans, mice, rats, guinea pigs and dogs. The use of large animal models instead of mice for the study of this condition has been discussed due to the small size of rodent lenses. Four juvenile-onset cases of bilateral incomplete immature nuclear cataract were recently observed in Romagnola cattle. Pedigree analysis suggested a monogenic autosomal recessive inheritance. In addition to the cataract, one of the cases displayed abnormal head movements. Genome-wide association and homozygosity mapping and subsequent whole genome sequencing of a single case identified two perfectly associated sequence variants in a critical interval of 7.2 Mb on cattle chromosome 28: a missense point mutation located in an uncharacterized locus and an 855 bp deletion across the exon 19/intron 19 border of the bovine nidogen 1 (NID1) gene (c.3579_3604+829del). RT-PCR showed that NID1 is expressed in bovine lenses, while the transcript of the second locus was absent. The NID1 deletion leads to the skipping of exon 19 during transcription and is therefore predicted to cause a frameshift and premature stop codon (p.1164fs27X). The truncated protein lacks a C-terminal domain essential for binding with matrix assembly complexes. Nidogen 1-deficient mice show neurological abnormalities and highly irregular crystalline lens alterations. This study adds NID1 to the list of candidate genes for inherited cataract in humans, is the first report of a naturally occurring mutation leading to non-syndromic cataract in cattle, and provides a potential large animal model for human cataract.
Abstract:
BACKGROUND: Crossing a street can be a very difficult task for older pedestrians. With increased age and potential cognitive decline, older people base the decision to cross a street primarily on vehicles' distance, not on their speed. Furthermore, older pedestrians tend to overestimate their own walking speed and may fail to adapt it to the traffic conditions. Pedestrians' behavior is often tested using virtual reality, which has the advantage of being safe and cost-effective and of allowing standardized test conditions. METHODS: This paper describes an observational study with older and younger adults. Street crossing behavior was investigated in 18 healthy younger and 18 older subjects using a virtual reality setting. The aim of the study was to measure behavioral data (such as eye and head movements) and to assess how the two age groups differ in terms of number of safe street crossings, virtual crashes, and missed street crossing opportunities. Street crossing behavior and eye and head movements in older and younger subjects were compared with non-parametric tests. RESULTS: The results showed that younger pedestrians behaved in a more secure manner while crossing a street than older people. The eye and head movement analysis revealed that older people looked more at the ground and less at the other side of the street they intended to cross. CONCLUSIONS: The less secure street crossing behavior found in older pedestrians could be explained by their reduced cognitive and visual abilities, which, in turn, result in difficulties in the decision-making process, especially under time pressure. For both groups, decisions to cross a street were based on the distance of the oncoming cars rather than their speed. Older pedestrians look more at their feet, probably because they need more time to plan precise stepping movements, and, in turn, pay less attention to the traffic. These findings might help to establish guidelines for improving senior pedestrians' safety, in terms of speed limits, road design, and combined physical-cognitive training.
Abstract:
Abstract of the poster presented at the 6th EOS Topical Meeting on Visual and Physiological Optics (EMVPO 2012), Dublin, 20-22 August 2012.
Abstract:
Poster presented at the 6th EOS Topical Meeting on Visual and Physiological Optics (EMVPO 2012), Dublin, 20-22 August 2012.
Abstract:
This research pursued the conceptualization and real-time verification of a system that allows a computer user to control the cursor of a computer interface without using his/her hands. The target user groups for this system are individuals who are unable to use their hands due to spinal dysfunction or other afflictions, and individuals who must use their hands for higher priority tasks while still requiring interaction with a computer. The system receives two forms of input from the user: Electromyogram (EMG) signals from muscles in the face and point-of-gaze coordinates produced by an Eye Gaze Tracking (EGT) system. In order to produce reliable cursor control from the two forms of user input, the development of this EMG/EGT system addressed three key requirements: an algorithm was created to accurately translate EMG signals due to facial movements into cursor actions, a separate algorithm was created that recognized an eye gaze fixation and provided an estimate of the associated eye gaze position, and an information fusion protocol was devised to efficiently integrate the outputs of these algorithms. Experiments were conducted to compare the performance of EMG/EGT cursor control to EGT-only control and mouse control. These experiments took the form of two different types of point-and-click trials. The data produced by these experiments were evaluated using statistical analysis, Fitts' Law analysis and target re-entry (TRE) analysis. The experimental results revealed that though EMG/EGT control was slower than EGT-only and mouse control, it provided effective hands-free control of the cursor without a spatial accuracy limitation, and it also facilitated a reliable click operation. This combination of qualities is not possessed by either EGT-only or mouse control, making EMG/EGT cursor control a unique and practical alternative for a user's cursor control needs.
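The abstract mentions an algorithm that recognizes an eye-gaze fixation and estimates its position but does not specify it. A standard technique for this is dispersion-threshold identification (I-DT), sketched below under the assumption of uniformly sampled gaze coordinates; the function name and threshold values are illustrative, not the dissertation's implementation:

```python
def detect_fixations(points, max_dispersion=25.0, min_samples=6):
    """Dispersion-threshold (I-DT) fixation detection.

    points: list of (x, y) gaze samples, assumed uniformly sampled.
    max_dispersion: (max_x - min_x) + (max_y - min_y) limit in pixels.
    min_samples: minimum window length for a fixation.
    Returns a list of (centroid_x, centroid_y, start_index, end_index).
    """
    def dispersion(win):
        xs = [p[0] for p in win]
        ys = [p[1] for p in win]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations, i = [], 0
    while i + min_samples <= len(points):
        j = i + min_samples
        if dispersion(points[i:j]) > max_dispersion:
            i += 1                      # window too spread out: slide on
            continue
        while j < len(points) and dispersion(points[i:j + 1]) <= max_dispersion:
            j += 1                      # grow window while it stays compact
        win = points[i:j]
        cx = sum(p[0] for p in win) / len(win)
        cy = sum(p[1] for p in win) / len(win)
        fixations.append((cx, cy, i, j - 1))
        i = j
    return fixations

# Two steady gaze positions separated by a saccade yield two fixations
gaze = [(100, 100)] * 10 + [(400, 300)] * 10
print(detect_fixations(gaze))
```

The fixation centroid produced by such a detector is exactly the kind of position estimate the fusion protocol would combine with the EMG-derived click events.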
Abstract:
An unfavorable denture-bearing area could compromise denture retention and stability, limit mastication, and possibly alter masticatory motion. The purpose of this study was to evaluate the masticatory movements of denture wearers with normal and resorbed denture-bearing areas. Completely edentulous participants who received new complete dentures were selected and divided into 2 groups (n=15) according to the condition of their denture-bearing areas as classified by the Kapur method: a normal group (control) (mean age, 65.9 ± 7.8 years) and a resorbed group (mean age, 70.2 ± 7.6 years). Masticatory motion was recorded and analyzed with a kinesiographic device. The patients masticated peanuts and Optocal. The masticatory movements evaluated were the durations of opening, closing, and occlusion; duration of the masticatory cycle; maximum velocities and angles of opening and closing; total masticatory area; and amplitudes of the masticatory cycle. The data were analyzed by 2-way ANOVA and the Tukey honestly significant difference post hoc test (α=.05). The group with a resorbed denture-bearing area had a smaller total masticatory area in the frontal plane and shorter horizontal masticatory amplitude than the group with normal denture-bearing area (P<.05). Denture wearers with resorbed denture-bearing areas showed reduced jaw motion during mastication.
Abstract:
Patients with myofascial pain experience impaired mastication, which might also interfere with their sleep quality. The purpose of this study was to evaluate the jaw motion and sleep quality of patients with myofascial pain and the impact of a stabilization device therapy on both parameters. Fifty women diagnosed with myofascial pain by the Research Diagnostic Criteria were enrolled. Pain levels (visual analog scale), jaw movements (kinesiography), and sleep quality (Epworth Sleepiness Scale; Pittsburgh Sleep Quality Index) were evaluated before (control) and after stabilization device use. Range of motion (maximum opening, right and left excursions, and protrusion) and masticatory movements during Optosil mastication (opening, closing, and total cycle time; opening and closing angles; and maximum velocity) also were evaluated. Repeated-measures analysis of variance in a generalized linear mixed models procedure was used for statistical analysis (α=.05). At baseline, participants with myofascial pain showed a reduced range of jaw motion and poorer sleep quality. Treatment with a stabilization device reduced pain (P<.001) and increased both mouth opening (P<.001) and anteroposterior movement (P=.01). Also, after treatment, the maximum opening (P<.001) and closing (P=.04) velocities during mastication increased, and improvements in sleep scores for the Pittsburgh Sleep Quality Index (P<.001) and Epworth Sleepiness Scale (P=.04) were found. Myofascial pain impairs jaw motion and quality of sleep; the reduction of pain after the use of a stabilization device improves the range of motion and sleep parameters.