904 results for Motion-based driving simulator
Abstract:
This study evaluates the influence of different cartographic representations used in in-car navigation systems on visual demand, subjective preference, and navigational error. It takes into account the type and complexity of the representation, manoeuvring complexity, road layout, and driver gender. A group of 28 drivers (14 male and 14 female) participated in this experiment, which was performed in a low-cost driving simulator. The tests were performed on a limited number of instances for each type of representation; their purpose was to carry out a preliminary assessment and suggest avenues for further studies. Data collected for the visual demand study were analyzed using non-parametric statistical analyses. Results confirmed previous research showing that different levels of design complexity significantly influence visual demand. Non-grid-like road networks, for example, significantly influence visual demand and navigational error. An analysis of simple maneuvers on a grid-like road network showed that static and blinking arrows did not present significant differences; among the representations analyzed for visual demand, both arrow types were equally efficient. From a gender perspective, women seemed to look at the display more than men, but this factor was not significant. With respect to subjective preferences, drivers prefer representations with mimetic landmarks when they perform straight-ahead tasks. For maneuvering tasks, landmarks in a perspective model created higher visual demands.
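A minimal sketch, in Python, of the kind of non-parametric comparison described above (an omnibus Kruskal-Wallis test with a pairwise Mann-Whitney follow-up on glance times); the data, group labels and distributions are invented for illustration and are not the study's measurements.

```python
# Hypothetical sketch: non-parametric comparison of visual demand
# (total glance time per manoeuvre) across map representations.
# All data below are simulated, not from the experiment.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated total glance times (s) for three hypothetical representations
static_arrow   = rng.gamma(shape=4.0, scale=0.5, size=28)
blinking_arrow = rng.gamma(shape=4.0, scale=0.5, size=28)
perspective_3d = rng.gamma(shape=6.0, scale=0.5, size=28)

# Omnibus Kruskal-Wallis test across the three representations
h, p = stats.kruskal(static_arrow, blinking_arrow, perspective_3d)
print(f"Kruskal-Wallis: H={h:.2f}, p={p:.4f}")

# Pairwise follow-up (Mann-Whitney U), e.g. static vs. blinking arrow
u, p_pair = stats.mannwhitneyu(static_arrow, blinking_arrow, alternative="two-sided")
print(f"Static vs. blinking arrow: U={u:.1f}, p={p_pair:.4f}")
```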
Abstract:
The development of the digital electronics market is founded on the continuous reduction of transistor size, which reduces the area, power and cost of integrated circuits while increasing their computational performance. This trend, known as technology scaling, is approaching nanometre dimensions. The lithographic process in the manufacturing stage becomes more uncertain as transistor sizes shrink, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of the threshold and supply voltages, increasing the power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects can no longer be addressed at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic- and system-level techniques able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) new analysis algorithms able to predict the system's thermal behaviour and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on tunable parameters such as supply voltage or body bias; ii) error-detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research, I investigated the need for thermal management in future embedded low-power Network-on-Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that a careful layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates integrating thermal analysis into the first design stages of embedded NoC design.
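As a rough illustration of the kind of thermal behaviour such an analysis library has to capture, the sketch below integrates a single lumped thermal RC node driven by a power trace; the model, parameter values and power profile are illustrative assumptions, not the library described in the thesis.

```python
# Minimal sketch of a lumped thermal RC model of the kind a NoC thermal
# analysis library might use; parameter values are illustrative only.
import numpy as np

def simulate_temperature(power, r_th=2.0, c_th=0.05, t_amb=45.0, dt=1e-3):
    """Forward-Euler integration of C*dT/dt = P - (T - T_amb)/R."""
    temps = np.empty(len(power))
    t = t_amb
    for i, p in enumerate(power):
        t += dt * (p - (t - t_amb) / r_th) / c_th
        temps[i] = t
    return temps

# Hypothetical power trace for one NoC tile: idle phases around a burst (W)
power = np.concatenate([np.full(2000, 0.5), np.full(2000, 3.0), np.full(2000, 0.5)])
temps = simulate_temperature(power)
print(f"Peak temperature: {temps.max():.1f} °C, final: {temps[-1]:.1f} °C")
```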
Later on, I focused my research on the development of a statistical process-variation analysis tool able to address both random and systematic variations. The tool was used to analyse the impact of self-timed asynchronous logic stages in an embedded microprocessor. The results confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we found that low-swing links have superior robustness to systematic process variation and respond well to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work demonstrates the advantage of integrating a statistical process-variation analysis tool into the first stages of the design flow.
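A hedged sketch of a statistical process-variation analysis of the general kind described here: threshold-voltage shifts are split into a systematic (per-die) and a random (per-gate) component and propagated through an alpha-power delay model by Monte Carlo sampling. All parameter values are illustrative assumptions, not results from the thesis.

```python
# Monte Carlo process-variation sketch: per-gate (random) and die-level
# (systematic) threshold-voltage shifts propagated through a simple
# alpha-power delay model. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
VDD, VTH0, ALPHA = 1.0, 0.35, 1.3
N_GATES, N_DIES = 20, 10_000          # gates per critical path, Monte Carlo dies

def gate_delay(vth, k=1.0):
    # Alpha-power law: delay grows as the gate overdrive (VDD - Vth) shrinks
    return k * VDD / (VDD - vth) ** ALPHA

sys_shift = rng.normal(0.0, 0.015, size=(N_DIES, 1))        # systematic, shared per die
rnd_shift = rng.normal(0.0, 0.020, size=(N_DIES, N_GATES))  # random, per gate
path_delay = gate_delay(VTH0 + sys_shift + rnd_shift).sum(axis=1)

nominal = N_GATES * gate_delay(VTH0)
print(f"mean/nominal = {path_delay.mean()/nominal:.3f}, "
      f"3-sigma spread = {3*path_delay.std()/nominal:.1%}")
```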
Abstract:
The dissertation titled "Driver Safety in Far-side and Far-oblique Crashes" presents a novel approach to assessing vehicle cockpit safety by integrating Human Factors and Applied Mechanics. The methodology is aimed at improving safety in compact mobile workspaces such as patrol vehicle cockpits. A statistical analysis of the state of Michigan's traffic crash data, performed to assess the contributing factors that affect the risk of severe driver injuries, showed that the risk was greater for unrestrained drivers (OR=3.38, p<0.0001) and for unbelted drivers in frontal and far-side crashes (OR=8.0 and 23.0 respectively, p<0.005). The statistics also showed that near-side and far-side crashes pose a similar threat to driver injury severity. A Human Factors survey was conducted to assess various Human-Machine/Human-Computer Interaction aspects of patrol vehicle cockpits. Results showed that tasks requiring manual operation, especially laptop use, demand more attention and potentially cause more distraction. A vehicle survey conducted to evaluate ergonomics-related issues revealed that some of the equipment was located in airbag deployment zones. In addition, experiments were conducted to assess the effects on driver distraction of changing the position of in-car accessories. A driving simulator study was conducted to mimic HMI/HCI in a patrol vehicle cockpit (20 subjects, average driving experience = 5.35 years, s.d. = 1.8). The mounting locations of manual tasks did not produce a significant change in response times. Visual displays resulted in response times of less than 1.5 s. The manual task was equally distracting regardless of mounting position (average response time was 15 s). Average speeds and lane deviations did not show any significant effects. Data from 13 full-scale sled tests conducted to simulate far-side impacts at 70 PDOF and 40 PDOF were used to analyse head injuries and HIC/AIS values. Accelerations generated by the vehicle deceleration alone were high enough to cause AIS 3 - AIS 6 injuries. Pretensioners could mitigate injuries only in 40 PDOF (oblique) impacts and were ineffective in 70 PDOF impacts. Seat belts were ineffective in protecting the driver's head from injuries. The head would come into contact with the laptop during a far-oblique (40 PDOF) crash and with the far-side door in an angle-type (70 PDOF) crash. Finite element analysis of the head-laptop impact interaction showed that the contact velocity was the most crucial factor in causing a severe (and potentially fatal) head injury. Results indicate that no equipment should be mounted within the driver trajectory envelopes. Only a very narrow band of space is left in patrol vehicles where manual-task equipment can be installed both safely and ergonomically. In the event of a contact, the stiffness and damping properties of the material play a very significant role in determining the injury outcome. Future work may address improving the interior materials' properties to better absorb and dissipate the kinetic energy of the head. The design of seat belts and pretensioners may also be seen as an essential aspect for further improvement.
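For readers unfamiliar with how head accelerations map onto the HIC values mentioned above, the sketch below computes HIC15 from a synthetic acceleration pulse using the standard definition; the pulse shape and sampling rate are invented, not sled-test data.

```python
# Illustrative Head Injury Criterion (HIC) computation:
# HIC = max over windows [t1, t2] of (t2-t1) * (mean acceleration in g)^2.5
import numpy as np

def hic(accel_g, dt, max_window=0.015):
    """HIC over windows up to max_window seconds (HIC15 by default)."""
    n = len(accel_g)
    cum = np.concatenate(([0.0], np.cumsum(accel_g) * dt))  # running integral of a(t)
    best = 0.0
    max_steps = int(max_window / dt)
    for i in range(n):
        for j in range(i + 1, min(i + max_steps, n) + 1):
            t = (j - i) * dt
            avg = (cum[j] - cum[i]) / t
            best = max(best, t * avg ** 2.5)
    return best

dt = 1e-4                                          # 10 kHz sampling
t = np.arange(0, 0.1, dt)
pulse = 80.0 * np.exp(-((t - 0.05) / 0.01) ** 2)   # synthetic ~80 g pulse
print(f"HIC15 = {hic(pulse, dt):.0f}")
```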
Abstract:
Human behavior is a major factor modulating the consequences of road tunnel accidents. We investigated the effect of information and instruction on drivers' behavior as well as the usability of virtual environments to simulate such emergency situations. Tunnel safety knowledge of the general population was assessed using an online questionnaire, and tunnel safety behavior was investigated in a virtual reality experiment. Forty-four participants completed three drives through a virtual road tunnel and were confronted with a traffic jam, no event, and an accident blocking the road. Participants were randomly assigned to a control group (no intervention), an informed group who read a brochure containing safety information prior to the tunnel drives, or an informed and instructed group who read the same brochure and received additional instructions during the emergency situation. Informed participants showed better and quicker safety behavior than the control group. Self-reports of anxiety were assessed three times during each drive. Anxiety was elevated during and after the emergency situation. The findings demonstrate problematic safety behavior in the control group and that knowledge of safety information fosters adequate behavior in tunnel emergencies. Enhanced anxiety ratings during the emergency situation indicate external validity of the virtual environment.
Abstract:
The Pacific plate has undergone a substantial northward displacement during the late Mesozoic and the Cainozoic. Here we give additional documentation for such motion based on palaeomagnetic measurements of a sequence of sedimentary and basalt samples collected from middle Oligocene to Aptian sections of Deep Sea Drilling Project (DSDP) site 289 (Andrews, 1975; 00° 29.92'S, 158° 30.69'E) drilled on the Ontong Java Plateau.
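As a worked illustration of how palaeomagnetic inclinations translate into the latitudinal motion discussed here, the sketch below applies the geocentric axial dipole relation tan(I) = 2 tan(λ); the example inclinations are arbitrary, not measurements from DSDP site 289.

```python
# Geocentric axial dipole relation: tan(inclination) = 2 * tan(palaeolatitude)
import math

def palaeolatitude(inclination_deg):
    return math.degrees(math.atan(math.tan(math.radians(inclination_deg)) / 2.0))

for inc in (-40.0, -20.0, 0.0):   # illustrative inclinations only
    print(f"inclination {inc:6.1f}° -> palaeolatitude {palaeolatitude(inc):6.1f}°")
```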
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Skill and risk taking are argued to be independent and to require different remedial programs. However, it is possible to contend that skill-based training could be associated with an increase, a decrease, or no change in risk-taking behavior. In 3 experiments, the authors examined the influence of a skill-based training program (hazard perception) on the risk-taking behavior of car drivers (using video-based driving simulations). Experiment 1 demonstrated a decrease in risk taking for novice drivers. In Experiment 2, the authors examined the possibilities that the skills training might operate through either a nonspecific reduction in risk taking or a specific improvement in hazard perception. Evidence supported the latter. These findings were replicated in a more ecological context in Experiment 3, which compared advanced and nonadvanced police drivers.
Abstract:
Objectives: In this paper, we present a unified electrodynamic heart model that permits simulation of the body surface potentials generated by the heart in motion. The inclusion of motion in the heart model significantly improves the accuracy of the simulated body surface potentials and therefore also of the 12-lead ECG. Methods: The key step is to construct an electromechanical heart model. The cardiac excitation propagation is simulated by an electrical heart model, and the resulting cardiac active forces are used to calculate the ventricular wall motion with a mechanical model. The changes in the relative positions of the source and field points during systole and diastole can then be obtained and used to calculate the body surface ECG based on the electrical heart-torso model. Results: An electromechanical biventricular heart model is constructed and a standard 12-lead ECG is simulated. Compared with an ECG simulated with the static electrical heart model, the ECG simulated with the dynamic heart model agrees more closely with a clinically recorded ECG, especially for the ST segment and T wave of the V1-V6 leads. For the simulation of slight myocardial ischemia, ST-segment and T-wave changes can be observed in the ECG simulated with the dynamic heart model, while the ST segment and T wave of the ECG simulated with the static heart model are almost unchanged compared with a normal ECG. Conclusions: This study confirms the importance of the mechanical factor in ECG simulation. The dynamic heart model can provide more accurate ECG simulation, especially for myocardial ischemia or infarction, since the main ECG changes occur at the ST segment and T wave, which correspond to the cardiac systole and diastole phases.
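To illustrate why the source-field geometry matters, the sketch below evaluates the potential of a single current dipole in an unbounded homogeneous conductor and shows how displacing the source (as the moving ventricular wall does) changes the computed surface potential; the conductor model, conductivity and dipole values are simplifying assumptions, not the paper's heart-torso model.

```python
# Potential of a current dipole in an unbounded homogeneous conductor:
# phi = p . (r - r0) / (4 * pi * sigma * |r - r0|^3)
# Moving the dipole position r0 (wall motion) changes the surface potential.
import numpy as np

SIGMA = 0.2  # tissue conductivity, S/m (illustrative)

def dipole_potential(r_obs, r_src, p):
    d = np.asarray(r_obs, float) - np.asarray(r_src, float)
    return np.dot(p, d) / (4.0 * np.pi * SIGMA * np.linalg.norm(d) ** 3)

electrode = np.array([0.10, 0.0, 0.0])          # ~10 cm from the heart, m
p = np.array([1e-6, 0.0, 0.0])                  # dipole moment, A*m (illustrative)

phi_static = dipole_potential(electrode, [0.0, 0.0, 0.0], p)
phi_moved  = dipole_potential(electrode, [0.01, 0.0, 0.0], p)   # wall displaced 1 cm
print(f"static: {phi_static*1e3:.3f} mV, moved: {phi_moved*1e3:.3f} mV")
```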
Abstract:
Computer modelling promises to be an important tool for analysing and predicting interactions between trees within mixed-species forest plantations. This study explored the use of an individual-based mechanistic model as a predictive tool for designing mixed-species plantations of Australian tropical trees. The 'spatially explicit individual-based forest simulator' (SeXI-FS) modelling system was used to describe the spatial interaction of individual tree crowns within a binary mixed-species experiment. The three-dimensional model was developed and verified with field data from three forest tree species grown in tropical Australia. The model predicted the interactions within monocultures and binary mixtures of Flindersia brayleyana, Eucalyptus pellita and Elaeocarpus grandis, accounting for an average of 42% of the growth variation exhibited by species in different treatments. The model requires only structural dimensions and shade tolerance as species parameters. By modelling interactions in existing tree mixtures, the model predicted both increases and reductions in the growth of mixtures (up to +/- 50% of stem volume at 7 years) compared to monocultures. This modelling approach may be useful for designing mixed tree plantations. (c) 2006 Published by Elsevier B.V.
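A hypothetical sketch of the general flavour of such individual-based spatial interaction: each tree's relative growth is reduced by a distance- and size-weighted competition index, moderated by shade tolerance. The growth function, parameters and tree positions are invented for illustration and are not the SeXI-FS formulation.

```python
# Toy individual-based competition model (illustrative, not SeXI-FS).
import math

trees = [  # (x m, y m, crown radius m, shade tolerance 0..1) - invented values
    (0.0, 0.0, 2.0, 0.6),   # e.g. Flindersia brayleyana
    (3.0, 0.0, 2.5, 0.3),   # e.g. Eucalyptus pellita
    (1.5, 2.5, 1.5, 0.8),   # e.g. Elaeocarpus grandis
]

def competition_index(i):
    xi, yi, ri, _ = trees[i]
    ci = 0.0
    for j, (xj, yj, rj, _) in enumerate(trees):
        if j == i:
            continue
        dist = math.hypot(xj - xi, yj - yi)
        ci += rj / (ri * (1.0 + dist))   # bigger, closer neighbours compete more
    return ci

for i, (_, _, _, tol) in enumerate(trees):
    growth = 1.0 / (1.0 + (1.0 - tol) * competition_index(i))  # relative growth 0..1
    print(f"tree {i}: CI={competition_index(i):.2f}, relative growth={growth:.2f}")
```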
Abstract:
This paper reports on a current research project in which virtual reality simulators are being investigated as a means of simulating hazardous rail work conditions in order to allow train drivers to practice decision-making under stress. When working under high-stress conditions, train drivers need to move beyond procedural responses to responses activated through their own problem-solving and decision-making skills. This study focuses on the use of stress inoculation training, which aims to build drivers' confidence in the use of new decision-making skills by repeatedly requiring them to respond to hazardous driving conditions. In particular, the study makes use of a train cab driving simulator to reproduce potentially stress-inducing real-world scenarios. Initial pilot research has been undertaken in which drivers experienced the training simulation and subsequently completed surveys on the level of immersion experienced. Concurrently, drivers also participated in a velocity perception experiment designed to objectively measure the fidelity of the virtual training environment. Baseline data, against which post-training decision-making skills will be measured, are being gathered via cognitive task analysis designed to identify primary decision requirements for specific rail events. While considerable effort has been invested in improving virtual reality technology, little is known about how best to use this technology for training personnel to respond to workplace conditions in the rail industry. To enable the best use of simulators for training in the rail context, the project aims to identify those factors within virtual reality that support the required learning outcomes and to use this information to design training simulations that reliably and safely train staff in the required workplace accident response skills.
Abstract:
The main objective of this study is to identify opportunities for improving the physical interfaces of automobile central consoles, and their ergonomics and usability as a means of performing tasks inside the vehicle, in order to help improve the driving experience and driving safety. The study investigates some of the physical interfaces of central consoles, measuring their performance with respect to driver distraction and ease of use in dual-task situations. To this end, a driving simulator was adapted and a battery of dual-task tests was carried out in order to obtain driving telemetry data and performance data for each central console interface. The data obtained on the vehicle's trajectory, its comparison with reference trajectories, and the average speed in the driving sectors were compared with gaze-deviation data, which in turn were compared with and related to the drivers' self-perception data obtained through the NASA (National Aeronautics and Space Administration) Raw Task Load Index (NASA RTLX) subjective self-assessment tests. It is expected that this analysis will yield conclusions pointing to opportunities for improving central console interfaces, whether through combining or splitting systems, opening the way to the development of new alternative solutions, or even the creation of a best-practice guide for the future design and development of automobile central console interfaces.
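A small sketch, assuming the usual NASA Raw TLX scoring in which the raw workload score is the unweighted mean of the six subscale ratings (0-100); the ratings and task labels below are invented, not data from this study.

```python
# NASA Raw TLX (RTLX): unweighted mean of the six subscale ratings.
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def rtlx(ratings):
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

# Hypothetical ratings for two central-console tasks performed while driving
climate_task = {"mental": 35, "physical": 20, "temporal": 30,
                "performance": 25, "effort": 30, "frustration": 15}
nav_entry    = {"mental": 70, "physical": 35, "temporal": 60,
                "performance": 55, "effort": 65, "frustration": 50}

print(f"climate control RTLX: {rtlx(climate_task):.1f}")
print(f"navigation entry RTLX: {rtlx(nav_entry):.1f}")
```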
Abstract:
PURPOSE: To examine the effect of uncorrected astigmatism in older adults. SETTING: University Vision Clinic. METHOD: Twenty-one healthy presbyopes, aged 58.9±2.8 years, had astigmatism of 0.0 to -4.0 DC × 90° and -3.0 DC of cylinder at 90°, 180° and 45° induced with spectacle lenses, with the mean spherical equivalent compensated to plano, in random order. Visual acuity was assessed binocularly using a computerised test chart at 95%, 50% and 10% contrast. Near acuity and reading speed were measured using standardised reading texts. Light scatter was quantified with the C-Quant and driving reaction times with a computer simulator. Finally, the visual clarity of a mobile phone and a computer screen was subjectively rated. RESULTS: Distance visual acuity decreased with increasing uncorrected astigmatic power (F=174.50, p<0.001) and was reduced at lower contrasts (F=170.77, p<0.001). Near visual acuity and reading speed also decreased with increasing uncorrected astigmatic power (p<0.001). Light scatter was not significantly affected by uncorrected astigmatism (p>0.05), but the reliability and variability of the measurements decreased with increasing uncorrected astigmatic power (p<0.05). Driving simulator performance was also unaffected by uncorrected astigmatism (p>0.05), but the subjective rating of clarity decreased with increasing uncorrected astigmatic power (p<0.001). Uncorrected astigmatism at 45° or 180° orientation resulted in worse distance and near visual acuity, and worse subjectively rated clarity, than at 90° orientation (p<0.05). CONCLUSION: Uncorrected astigmatism, even as low as 1.0 DC, places a significant burden on a patient's vision. If left uncorrected, this could significantly affect their independence, quality of life and wellbeing.
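A short worked example of the spherical-equivalent compensation mentioned in the method: for an induced cylinder C, a compensating sphere S = -C/2 keeps the mean spherical equivalent (SE = S + C/2) at plano. The cylinder powers follow the abstract; the pairing into lens combinations is illustrative.

```python
# Spherical-equivalent compensation: choose sphere so that SE = S + C/2 = 0.
def compensating_sphere(cyl):
    return -cyl / 2.0

for cyl in (0.0, -1.0, -2.0, -3.0, -4.0):   # induced cylinder powers (DC)
    sph = compensating_sphere(cyl)
    se = sph + cyl / 2.0
    print(f"cyl {cyl:+.2f} DC -> sphere {sph:+.2f} DS, spherical equivalent {se:+.2f} D")
```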
Abstract:
Into the Bends of Time is a 40-minute work in seven movements for a large chamber orchestra with electronics, utilizing real-time computer-assisted processing of music performed by live musicians. The piece explores various combinations of interactive relationships between players and electronics, ranging from relatively basic processing effects to musical gestures achieved through stages of computer analysis, in which resulting sounds are crafted according to parameters of the incoming musical material. Additionally, some elements of interaction are multi-dimensional, in that they rely on the participation of two or more performers fulfilling distinct roles in the interactive process with the computer in order to generate musical material. Through processes of controlled randomness, several electronic effects induce elements of chance into their realization so that no two performances of this work are exactly alike. The piece gets its name from the notion that real-time computer-assisted processing, in which sound pressure waves are transduced into electrical energy, converted to digital data, artfully modified, converted back into electrical energy and transduced into sound waves, represents a “bending” of time.
The Bill Evans Trio featuring bassist Scott LaFaro and drummer Paul Motian is widely regarded as one of the most important and influential piano trios in the history of jazz, lauded for its unparalleled level of group interaction. Most analyses of Bill Evans’ recordings, however, focus on his playing alone and fail to take group interaction into account. This paper examines one performance in particular, of Victor Young’s “My Foolish Heart” as recorded in a live performance by the Bill Evans Trio in 1961. In Part One, I discuss Steve Larson’s theory of musical forces (expanded by Robert S. Hatten) and its applicability to jazz performance. I examine other recordings of ballads by this same trio in order to draw observations about normative ballad performance practice. I discuss meter and phrase structure and show how the relationship between the two is fixed in a formal structure of repeated choruses. I then develop a model of perpetual motion based on the musical forces inherent in this structure. In Part Two, I offer a full transcription and close analysis of “My Foolish Heart,” showing how elements of group interaction work with and against the musical forces inherent in the model of perpetual motion to achieve an unconventional, dynamic use of double-time. I explore the concept of a unified agential persona and discuss its role in imparting the song’s inherent rhetorical tension to the instrumental musical discourse.