888 results for Ferreira, J. Alfredo
Abstract:
Railway level crossings are amongst the most complex of road safety control systems, due to the conflicts between road vehicles and rail infrastructure, trains and train operations. Driver behaviour at railway crossings is the major collision factor. The main objective of the present paper was to evaluate the existing conventional warning devices in relation to driver behaviour. The common conventional warning devices in Australia are a stop sign (passive), flashing lights and a half boom-barrier with flashing lights (active). The data were collected using two approaches, namely: field video recordings at selected sites and a driving simulator in a laboratory. This paper describes and compares the driver response results from both the field survey and the driving simulator. The conclusion drawn is that different types of warning systems resulted in varying driver responses at crossings. The results showed that on average driver responses to passive crossings were poor when compared to active ones. The field results were consistent with the simulator results for the existing conventional warning devices and hence they may be used to calibrate the simulator for further evaluation of alternative warning systems.
Abstract:
Electricity has been the major source of power in most railway systems. Reliable, efficient and safe power distribution to the trains is vitally important to the overall quality of railway service. Like any large-scale engineering system, the design, operation and planning of traction power systems rely heavily on computer simulation. This paper reviews the major features of modelling and the general practices for traction power system simulation, and introduces the development of the latest simulation approach, with discussions of simulation results and practical applications. Remarks will also be given on future challenges in traction power system simulation.
Abstract:
Process mining techniques are able to extract knowledge from event logs commonly available in today’s information systems. These techniques provide new means to discover, monitor, and improve processes in a variety of application domains. There are two main drivers for the growing interest in process mining. On the one hand, more and more events are being recorded, thus, providing detailed information about the history of processes. On the other hand, there is a need to improve and support business processes in competitive and rapidly changing environments. This manifesto is created by the IEEE Task Force on Process Mining and aims to promote the topic of process mining. Moreover, by defining a set of guiding principles and listing important challenges, this manifesto hopes to serve as a guide for software developers, scientists, consultants, business managers, and end-users. The goal is to increase the maturity of process mining as a new tool to improve the (re)design, control, and support of operational business processes.
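The manifesto itself is tool-agnostic, but the discovery task it describes can be illustrated with a toy sketch: mining a directly-follows graph from a hypothetical event log (the activity names below are invented for illustration, not taken from the manifesto).

```python
from collections import Counter

def discover_dfg(event_log):
    """Count directly-follows relations between activities across traces."""
    dfg = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Toy event log: each trace is the ordered list of activities of one case.
log = [
    ["register", "check", "approve", "notify"],
    ["register", "check", "reject", "notify"],
    ["register", "check", "approve", "notify"],
]

dfg = discover_dfg(log)
print(dfg[("register", "check")])  # 3: every case starts register -> check
print(dfg[("check", "approve")])   # 2: two of the three cases were approved
```

A directly-follows graph like this is typically the first artefact a discovery algorithm builds before filtering noise and inferring control-flow constructs.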
Abstract:
Participation in networks, both as a concept and process, is widely supported in environmental education as a democratic and equitable pathway to individual and social change for sustainability. However, the processes of participation in networks are rarely problematized. Rather, it is assumed that we inherently know how to participate in networks. This assumption means that participation is seldom questioned. Underlying support for participation in networks is a belief that it allows individuals to connect in new and meaningful ways, that individuals can engage in making decisions and in bringing about change in arenas that affect them, and that they will be engaging in new, non-hierarchical and equitable relationships. In this paper we problematize participation in networks. As an example we use research into a decentralized network (described as such in its own literature), the Queensland Environmentally Sustainable Schools Initiative Alliance in Australia, to argue that while network participants were engaged and committed to participation in this network, 'old' forms of top-down engagement and relationships needed to be unlearnt. This paper thus proposes that for participation in decentralized networks to be meaningful, new learning about how to participate needs to occur.
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability in displaying and manipulating information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains amongst others. Currently, the burden of validating both the interactive functionality and visual consistency of a virtual environment content is entirely carried out by developers and play-testers. While considerable research has been conducted in assisting the design of virtual world content and mechanics, to date, only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed in order to enable reasoning about the color and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; we used such a mapping to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness too, can be quantified for debugging purposes. Since computer games heavily rely on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. 
Experiments were conducted on a game engine and other virtual worlds prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
Abstract:
Monitoring the natural environment is increasingly important as habitat degradation and climate change reduce the world’s biodiversity. We have developed software tools and applications to assist ecologists with the collection and analysis of acoustic data at large spatial and temporal scales. One of our key objectives is automated animal call recognition, and our approach has three novel attributes. First, we work with raw environmental audio, contaminated by noise and artefacts and containing calls that vary greatly in volume depending on the animal’s proximity to the microphone. Second, initial experimentation suggested that no single recognizer could deal with the enormous variety of calls. Therefore, we developed a toolbox of generic recognizers to extract invariant features for each call type. Third, many species are cryptic and offer little data with which to train a recognizer. Many popular machine learning methods require large volumes of training and validation data and considerable time and expertise to prepare. Consequently, we adopt bootstrap techniques that can be initiated with little data and refined subsequently. In this paper, we describe our recognition tools and present results for real ecological problems.
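The bootstrap idea can be sketched as follows: start from a single labelled example, scan unlabelled recordings, and fold confident detections back into the template pool. Plain template cross-correlation here stands in for the paper's actual recognizers, and the threshold and synthetic signals are invented for illustration.

```python
import numpy as np

def xcorr_score(signal, template):
    """Peak normalized cross-correlation of a template against a signal."""
    t = (template - template.mean()) / (template.std() + 1e-9)
    best = 0.0
    for i in range(len(signal) - len(t) + 1):
        w = signal[i:i + len(t)]
        w = (w - w.mean()) / (w.std() + 1e-9)
        best = max(best, float(np.dot(w, t)) / len(t))
    return best

def bootstrap_recognise(recordings, seed_template, threshold=0.8, rounds=2):
    """Iteratively add confident detections to the template pool."""
    templates = [seed_template]
    for _ in range(rounds):
        hits = [r for r in recordings
                if max(xcorr_score(r, t) for t in templates) >= threshold]
        templates = [seed_template] + hits
    return templates

# Invented data: one clip containing the seed call, one clip of pure noise.
seed_call = np.sin(np.linspace(0.0, 20.0, 100))
with_call = np.concatenate([np.zeros(50), seed_call, np.zeros(50)])
noise_only = np.random.default_rng(0).normal(size=200)
templates = bootstrap_recognise([with_call, noise_only], seed_call)
```

Only the clip containing the call passes the correlation threshold, so the template pool grows from one to two examples without any extra labelling effort.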
Abstract:
The increasingly widespread use of large-scale 3D virtual environments has translated into an increasing effort required from designers, developers and testers. While considerable research has been conducted into assisting the design of virtual world content and mechanics, to date, only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. In the work presented in this paper, two novel neural network-based approaches are presented to predict the correct visualization of 3D content. Multilayer perceptrons and self-organizing maps are trained to learn the normal geometric and color appearance of objects from validated frames and then used to detect novel or anomalous renderings in new images. Our approach is general, for the appearance of the object is learned rather than explicitly represented. Experiments were conducted on a game engine to determine the applicability and effectiveness of our algorithms. The results show that the neural network technology can be effectively used to address the problem of automatic and reliable visual testing of 3D virtual environments.
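As a rough sketch of the self-organizing-map side of such an approach (the grid size, training schedule, and mean-colour features below are illustrative assumptions, not the paper's configuration), a small SOM can learn the normal colour appearance of validated frames, after which a large quantization error flags an anomalous rendering:

```python
import numpy as np

def train_som(data, grid=4, epochs=500, lr=0.5, sigma=1.5, seed=0):
    """Train a small self-organizing map on feature vectors."""
    rng = np.random.default_rng(seed)
    weights = rng.random((grid * grid, data.shape[1]))
    coords = np.array([[i, j] for i in range(grid) for j in range(grid)])
    for epoch in range(epochs):
        frac = epoch / epochs                      # decay schedule
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * (sigma * (1 - frac) + 0.1) ** 2))
        weights += (lr * (1 - frac) + 0.01) * h[:, None] * (x - weights)
    return weights

def quantization_error(weights, x):
    """Distance from a feature vector to its best-matching unit."""
    return float(np.sqrt(((weights - x) ** 2).sum(axis=1)).min())

# Hypothetical "validated" mean-colour features clustering near mid-grey.
rng = np.random.default_rng(1)
normal_frames = 0.5 + 0.05 * rng.standard_normal((500, 3))
som = train_som(normal_frames)
ok = quantization_error(som, np.array([0.5, 0.5, 0.5]))   # expected frame
bad = quantization_error(som, np.array([1.0, 0.0, 1.0]))  # magenta glitch
```

Because the map only ever sees validated frames, appearance is learned rather than explicitly represented, which is the property the paper highlights.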
Abstract:
In microscopic traffic simulators, the interaction between vehicles is considered. The dynamics of the system then becomes an emergent property of the interaction between its components. Such interactions include lane-changing, car-following behaviours and intersection management. Although, in some cases, such simulators produce realistic predictions, they do not allow for an important aspect of the dynamics, namely the driver-vehicle interaction. This paper introduces a physically sound vehicle-driver model for realistic microscopic simulation. By building a nanoscopic traffic simulation model that uses steering angle and throttle position as parameters, the model aims to overcome the unrealistic acceleration and deceleration values found in various microscopic simulation tools. A physics engine calculates the driving force of the vehicle, and the preliminary results presented here show that, through a realistic driver-vehicle-environment simulator, it becomes possible to model realistic driver and vehicle behaviours in a traffic simulation.
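A minimal sketch of a vehicle model driven by throttle position and steering angle, in the spirit described above (the kinematic bicycle form, mass, force limit, and drag coefficient below are illustrative assumptions, not the paper's physics engine):

```python
import math

def step_vehicle(state, throttle, steering, dt=0.1,
                 mass=1200.0, max_force=4000.0, drag=0.4, wheelbase=2.7):
    """Advance a simple bicycle-model vehicle by one time step.

    state: (x, y, heading, speed); throttle in [0, 1]; steering in radians.
    """
    x, y, heading, v = state
    force = throttle * max_force - drag * v * v   # drive minus aerodynamic drag
    v = max(0.0, v + (force / mass) * dt)         # acceleration bounded by physics
    heading += (v / wheelbase) * math.tan(steering) * dt
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    return (x, y, heading, v)

state = (0.0, 0.0, 0.0, 0.0)
for _ in range(100):                   # 10 s at half throttle, straight ahead
    state = step_vehicle(state, throttle=0.5, steering=0.0)
```

Because acceleration emerges from force and mass rather than being prescribed directly, the model cannot produce the implausible acceleration values the paper criticises in conventional microscopic tools.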
Abstract:
Safety at Railway Level Crossings (RLXs) is an important issue within the Australian transport system. Crashes at RLXs involving road vehicles in Australia are estimated to cost $10 million each year. Such crashes are mainly due to human factors; unintentional errors contribute to 46% of all fatal collisions and are far more common than deliberate violations. This suggests that innovative interventions targeting drivers are particularly promising for improving RLX safety. In recent years there has been rapid development of a variety of affordable technologies which can be used to increase drivers' risk awareness around crossings. To date, no research has evaluated the potential effects of such technologies at RLXs in terms of safety, traffic and acceptance of the technology. Integrating driving and traffic simulations is a safe and affordable approach for evaluating these effects. This methodology will be implemented in a driving simulator, where we recreated realistic driving scenarios with typical road environments and realistic traffic. This paper presents a methodology for comprehensively evaluating the potential benefits and negative effects of such interventions: it assesses driver awareness at RLXs, along with driver distraction and workload when using the technology. Subjective assessment of the perceived usefulness and ease of use of the technology is obtained from standard questionnaires. Driving simulation will provide a model of driving behaviour at RLXs which will be used to estimate the effects of such new technology on a road network featuring RLXs for different market penetrations using a traffic simulation. This methodology can assist in evaluating future safety interventions at RLXs.
Abstract:
Transit oriented developments (TODs) are master-planned communities constructed to reduce dependence on the private car and promote modes of transport such as public transport, walking and cycling, which are presumed by many transport professionals to be more sustainable. This paper tests the assumption that TOD is a more sustainable form of development than traditional development, with respect to travel demand, by conducting travel surveys for a case study TOD and comparing the travel characteristics of its residents with those of residents of Brisbane, Australia who live in non-TOD suburbs. A household comparison showed that Kelvin Grove Urban Village (KGUV) households had slightly smaller household sizes and lower vehicle and bicycle ownership than households in the Brisbane Statistical Division (BSD) and Brisbane's inner north and inner south suburbs. The comparison of average trip characteristics showed that, on average, KGUV residents undertook fewer trips on the given travel day (2.6 trips/person) than residents of the BSD (3.1 trips/person), Brisbane Inner North Suburbs (BINS) (3.6 trips/person) and Brisbane Inner South Suburbs (BISS) (3.5 trips/person). The mode share comparison indicated that KGUV residents used more public transport and made more walk-only trips than BSD, BINS and BISS residents. Overall, 72.4 percent of KGUV residents used a sustainable mode of transport for their travel on a typical weekday, whereas only 17.4 percent, 22.2 percent and 24.4 percent of BSD, BINS and BISS residents, respectively, did so. The trip length comparison showed that KGUV residents had smaller average trip lengths than their counterparts. KGUV and BINS residents used the car for travelling farther and used public transport for accessing destinations located closer to their homes; BSD and BISS residents exhibited the opposite trend.
These results support the claims of many transport professionals that TODs are more transport efficient and therefore more sustainable in this respect.
Abstract:
TOD: a fully planned, mixed-use development equipped with good-quality transit service and infrastructure for walking and cycling.
Hypothesis: TOD will help to reduce urban transport congestion.
Method: comparison of a TOD with non-TOD urban environments; residents' trip characteristics.
Abstract:
The deployment of new emerging technologies, such as cooperative systems, allows the traffic community to foresee relevant improvements in terms of traffic safety and efficiency. Vehicles are able to communicate the local traffic state in real time, which could result in an automatic, and therefore better, reaction to the mechanism of traffic jam formation. An upstream single-hop radio broadcast network can improve the perception of each cooperative driver within radio range and hence traffic stability. The impact of a cooperative law on the onset of traffic congestion is investigated, analytically and through simulation. NGSIM field data is used to calibrate the Optimal Velocity with Relative Velocity (OVRV) car-following model, and the MOBIL lane-changing model is implemented. Assuming that congestion can be triggered either by a perturbation in the instability domain or by a critical lane-changing behavior, the calibrated car-following behavior is used to assess the impact of a microscopic cooperative law on abnormal lane-changing behavior. The cooperative law helps reduce and delay traffic congestion as it increases traffic flow stability.
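The OVRV model named above relaxes a follower's speed toward an optimal velocity that depends on the spacing gap, plus a term proportional to the relative speed. A minimal sketch follows (the gains and the optimal-velocity function are illustrative choices, not the NGSIM-calibrated values):

```python
import math

def optimal_velocity(gap, v_max=30.0, d=25.0):
    """Illustrative optimal-velocity function of the spacing gap (m)."""
    return v_max * (math.tanh(gap / d - 2.0) + math.tanh(2.0)) / (1.0 + math.tanh(2.0))

def ovrv_accel(gap, v, dv, k1=0.6, k2=0.9):
    """OVRV acceleration: relax toward V(gap) plus a relative-speed term."""
    return k1 * (optimal_velocity(gap) - v) + k2 * dv

# Follower starting from rest, 100 m behind a leader cruising at 20 m/s.
dt, leader_v = 0.1, 20.0
x_l, x_f, v_f = 100.0, 0.0, 0.0
for _ in range(3000):                        # 300 s of simulation
    gap = x_l - x_f
    a = ovrv_accel(gap, v_f, leader_v - v_f)
    v_f = max(0.0, v_f + a * dt)
    x_l += leader_v * dt
    x_f += v_f * dt
gap = x_l - x_f
```

The relative-speed gain k2 is what damps the oscillations that a pure optimal-velocity model would exhibit; the follower settles at the leader's speed at the equilibrium gap where V(gap) equals that speed.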
Abstract:
Polysulphone (PS) dosimetry has been a widely used technique for more than 30 years to quantify the erythemally effective UV dose received by anatomic sites (personal exposure). The calibration of PS dosimeters is an important issue, as their spectral response differs from the erythemal action spectrum. Calibration is performed by exposing a set of PS dosimeters on a horizontal plane and measuring the UV doses they receive using calibrated spectroradiometers or radiometers. In this study, data collected during PS field campaigns (from 2004 to 2006), using horizontal and differently inclined dosimeters, were analyzed to provide some considerations on transferring the horizontal calibration to differently inclined dosimeters, as anatomic sites usually are. The roles of sky conditions, of the angle of incidence between the sun and the normal to the slope, and of the type of surrounding surface were investigated. It was concluded that PS horizontal calibrations apply to differently inclined dosimeters for incidence angles up to approximately 70 degrees and for surrounding surfaces excluding those with high albedo. Caution should be used in applying horizontal calibrations in cases of high incidence angles and/or high-albedo surfaces.
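A horizontal calibration of the kind described maps the measured change in dosimeter absorbance onto the reference dose. As a purely hypothetical sketch (the dose-response pairs and the cubic functional form below are invented for illustration, not the campaign data):

```python
import numpy as np

# Hypothetical calibration pairs: change in absorbance measured on
# horizontally exposed PS dosimeters vs erythemal dose (J/m^2) from a
# co-located calibrated radiometer.
dA = np.array([0.02, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
dose = np.array([30.0, 90.0, 220.0, 390.0, 610.0, 900.0, 1260.0])

# Fit a third-order polynomial as the calibration curve (an assumed
# functional form, chosen here only to make the sketch concrete).
coeffs = np.polyfit(dA, dose, 3)

# Estimated dose for a field dosimeter with an absorbance change of 0.12.
estimate = float(np.polyval(coeffs, 0.12))
```

The study's conclusion is then a statement about when this horizontal curve may be reused for inclined dosimeters: up to roughly 70 degrees of incidence and away from high-albedo surroundings.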
Abstract:
Assessing and prioritising cost-effective strategies to mitigate the impacts of traffic incidents and accidents on non-recurrent congestion on major roads represents a significant challenge for road network managers. This research examines the influence of numerous factors associated with incidents of various types on their duration. It presents a comprehensive traffic incident data mining and analysis by developing an incident duration model based on twelve months of incident data obtained from the Australian freeway network. Parametric accelerated failure time (AFT) survival models of incident duration were developed, including log-logistic, lognormal, and Weibull models, considering both fixed and random parameters, as well as a Weibull model with gamma heterogeneity. The Weibull AFT models with random parameters were appropriate for modelling incident duration arising from crashes and hazards. A Weibull model with gamma heterogeneity was most suitable for modelling incident duration of stationary vehicles. Significant variables affecting incident duration include characteristics of the incidents (severity, type, towing requirements, etc.), as well as the location, time of day, and traffic characteristics of the incident. Moreover, the findings reveal no significant effects of infrastructure and weather on incident duration. A significant and unique contribution of this paper is that the durations of each type of incident are uniquely different and respond to different factors. The results of this study are useful for traffic incident management agencies to implement strategies to reduce incident duration, leading to reduced congestion, secondary incidents, and the associated human and economic losses.
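In a Weibull AFT specification, covariates act multiplicatively on the time scale, so fitted coefficients translate directly into median durations. A sketch of that relationship follows (the shape, baseline scale, and coefficient values below are invented for illustration, not the fitted estimates):

```python
import math

def weibull_aft_median(shape, baseline_scale, coeffs, covariates):
    """Median duration under a Weibull AFT model.

    Covariates scale time multiplicatively:
        scale(x) = baseline_scale * exp(sum(beta_i * x_i))
    and the Weibull median is scale(x) * ln(2)^(1/shape).
    """
    scale = baseline_scale * math.exp(
        sum(b * x for b, x in zip(coeffs, covariates)))
    return scale * math.log(2.0) ** (1.0 / shape)

# Hypothetical coefficients for three binary covariates:
# [severe injury, towing required, peak hour].
beta = [0.45, 0.30, -0.10]

# Median durations (minutes) for a baseline crash vs a severe crash
# that needs towing, with an assumed shape of 1.4 and baseline scale 25.
minor = weibull_aft_median(1.4, 25.0, beta, [0, 0, 0])
severe_tow = weibull_aft_median(1.4, 25.0, beta, [1, 1, 0])
```

Positive coefficients stretch the whole duration distribution (here the severe, towed incident has roughly double the median duration), which is how agencies can read such models when prioritising response strategies.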