972 results for Freeway Travel Time
Abstract:
As the number of solutions to the Einstein equations with realistic matter sources that admit closed timelike curves (CTCs) has grown drastically, some authors [10] have called for a physical interpretation of these seemingly exotic curves, which could possibly allow for causality violations. A first step in drafting a physical interpretation is to understand how CTCs are created, because recent work [16] has suggested that, to follow a CTC, observers must counter-rotate with the rotating matter, contrary to the currently accepted explanation that CTCs are created by inertial frame dragging. The exact link between inertial frame dragging and CTCs is investigated by simulating particle geodesics and the precession of gyroscopes along CTCs and backward-in-time-oriented circular orbits in the van Stockum metric, which is known to have CTCs that could be traversable, so that the van Stockum cylinder could be exploited as a time machine. This study of gyroscope precession in the van Stockum metric supports the theory that CTCs are produced by inertial frame dragging due to rotating spacetime metrics.
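As a rough illustration (not part of the abstract above), the CTC region of the van Stockum interior can be located numerically. Assuming the standard interior line element ds² = H(dr² + dz²) + r²dφ² − (dt + a r² dφ)², the azimuthal metric component is g_φφ = r²(1 − a²r²), and circles of constant t, r, z become closed timelike curves where g_φφ < 0, i.e. for r > 1/a; the parameter value below is an arbitrary choice for demonstration:

```python
import numpy as np

def g_phiphi(r, a):
    """Azimuthal metric component of the van Stockum interior,
    g_phiphi = r^2 (1 - a^2 r^2), in units where c = 1."""
    return r**2 * (1.0 - (a * r)**2)

def ctc_radius(a):
    """Radius beyond which constant-(t, r, z) circles are timelike
    (g_phiphi < 0), i.e. where closed timelike curves appear."""
    return 1.0 / a

a = 0.5  # angular-velocity parameter of the rotating dust (assumed value)
r = np.linspace(0.1, 4.0, 40)
ctc_region = g_phiphi(r, a) < 0  # boolean mask marking the CTC region

print("critical radius 1/a =", ctc_radius(a))
print("smallest sampled radius with CTCs:", r[ctc_region][0])
```

The sign change of g_φφ at r = 1/a is what makes the azimuthal direction timelike there, so a closed circle in φ becomes a closed timelike curve.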
Abstract:
Recent research showed that past events are associated with the back and left side of space, whereas future events are associated with the front and right side. These spatial-temporal associations have an impact on our sensorimotor system: thinking about one's past and future leads to subtle body sways in the sagittal dimension of space (Miles, Nind, & Macrae, 2010). In this study we investigated whether mental time travel leads to sensorimotor correlates in the horizontal dimension of space. Participants were asked to mentally displace themselves into the past or future while their spontaneous eye movements on a blank screen were measured. Eye gaze was directed more rightward and upward when thinking about the future than when thinking about the past. Our results provide further insight into the spatial nature of temporal thoughts, and show that not only body movements but also eye movements follow a (diagonal) "time line" during mental time travel.
Abstract:
Mechanisms that produce behavior which increases future survival chances provide an adaptive advantage. The flexibility of human behavior is at least partly the result of one such mechanism: our ability to travel mentally in time and entertain potential future scenarios. We can study mental time travel in children using language. Current results suggest that key developments occur between the ages of three and five. However, linguistic performance can be misleading because language itself is still developing. We therefore advocate the use of methodologies that focus on future-oriented action. Mental time travel required profound changes in humans' motivational system, so that current behavior could be directed to secure not just present but individually anticipated future needs. Such behavior should be distinguishable from behavior based on current drives or on other mechanisms. We propose an experimental paradigm that provides subjects with an opportunity to act now to satisfy a need not currently experienced. This approach may be used to assess mental time travel in nonhuman animals. We conclude by describing a preliminary study employing an adaptation of this paradigm for children. (c) 2005 Elsevier Inc. All rights reserved.
Abstract:
Using the concept of time travel as a contextual and narrative tool, the author explores themes of love, loss and growth after trauma. Reflections relate primarily to the experience of conducting the qualitative research method of autoethnography. Opening with consideration of existing work (Yoga and Loss: An Autoethnographical Exploration of Grief, Mind, and Body), discussion moves on to academic thought on mental time travel, and personal transformation, culminating in the construction of a new memory combining past, present, and future.
Abstract:
This paper discusses a multi-layer feedforward (MLF) neural network incident detection model that was developed and evaluated using field data. In contrast to published neural network incident detection models, which relied on simulated or limited field data for model development and testing, the model described in this paper was trained and tested on a real-world data set of 100 incidents. The model uses speed, flow, and occupancy data measured at dual stations, averaged across all lanes, and only from time interval t. The off-line performance of the model is reported under both incident and non-incident conditions. The incident detection performance of the model is reported based on a validation-test data set of 40 incidents that were independent of the 60 incidents used for training. The false alarm rates of the model are evaluated based on non-incident data collected from a freeway section that was videotaped for a period of 33 days. A comparative evaluation between the neural network model and the incident detection model in operation on Melbourne's freeways is also presented. The results of the comparative performance evaluation clearly demonstrate the substantial improvement in incident detection performance obtained by the neural network model. The paper also presents additional results that demonstrate how improvements in model performance can be achieved using variable decision thresholds. Finally, the model's fault tolerance under conditions of corrupt or missing data is investigated, and the impact of loop detector failure/malfunction on the performance of the trained model is evaluated and discussed. The results presented in this paper provide a comprehensive evaluation of the developed model and confirm that neural network models can provide fast and reliable incident detection on freeways. (C) 1997 Elsevier Science Ltd. All rights reserved.
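To make the architecture described in this abstract concrete, here is a minimal sketch (not the authors' implementation) of a multi-layer feedforward classifier over the feature vector the abstract describes: speed, flow, and occupancy, lane-averaged at the upstream and downstream (dual) stations for a single interval t, with a variable decision threshold. The layer sizes, weights, and threshold values are illustrative assumptions; in the paper the weights come from training on 60 real incidents.

```python
import numpy as np

rng = np.random.default_rng(0)

# Input: speed, flow, occupancy at the upstream and downstream stations,
# lane-averaged, for one time interval t (6 features total).
N_IN, N_HIDDEN = 6, 8  # illustrative sizes; the paper's are not given here

# Illustrative random weights; a trained model would learn these from data.
W1 = rng.normal(scale=0.5, size=(N_HIDDEN, N_IN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.5, size=N_HIDDEN)
b2 = 0.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def incident_score(x):
    """Forward pass of the MLF network: one hidden layer, sigmoid output in (0, 1)."""
    h = np.tanh(W1 @ x + b1)
    return sigmoid(W2 @ h + b2)

def detect(x, threshold=0.5):
    """Flag an incident when the score exceeds a decision threshold.
    Raising the threshold trades detection rate for a lower false-alarm rate,
    which is the 'variable decision threshold' idea in the abstract."""
    return incident_score(x) > threshold

# Made-up measurements: [speed, flow, occupancy] upstream then downstream.
x = np.array([45.0, 1200.0, 0.18, 80.0, 1500.0, 0.09])
print(incident_score(x), detect(x, threshold=0.5))
```

Because the output unit is a sigmoid, the score always lies strictly between 0 and 1, so sweeping the threshold over (0, 1) traces out the detection-rate / false-alarm trade-off the paper evaluates.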