Abstract:
Product Ecosystem theory is an emerging theory which holds that disruptive, “game-changing” innovation is only possible when the entire ecosystem is considered. When environmental variables change faster than products or services can adapt, disruptive innovation is required to keep pace. This has many parallels with natural ecosystems, where species that cannot keep up with changes to the environment will struggle or become extinct. In this case the environment is the city, the environmental pressures are pollution and congestion, the product is the car, and the product ecosystem comprises roads, bridges, traffic lights, legislation, refuelling facilities and so on. Each of these components is the responsibility of a different organisation, so any change that affects the whole ecosystem requires a transdisciplinary approach. As a simple example, cars that communicate wirelessly with traffic lights are only of value if wireless-enabled traffic lights exist, and vice versa. Cars that drive themselves are technically possible, but legislation in most places does not allow their use. According to innovation theory, incremental innovation tends to chase ever-diminishing returns and becomes increasingly unable to tackle the “big issues.” Eventually “game-changing” disruptive innovation comes along and solves the “big issues” and/or provides new opportunities. Seen through this lens, the environmental pressures of urban traffic congestion and pollution are the “big issues.” It can be argued that the design of cars and the other components of the product ecosystem follows an incremental innovation approach; that is why the “big issues” remain unresolved. This paper explores the problems of pollution and congestion in urban environments from a Product Ecosystem perspective, and from this a strategy is proposed for a transdisciplinary approach to developing and implementing solutions.
Abstract:
This paper reports profiling information for speeding offenders and is part of a larger project that assessed the deterrent effects of increased speeding penalties in Queensland, Australia, using a total of 84,456 speeding offences. The speeding offenders were classified into three groups based on the extent and severity of an index offence: once-only low-range offenders; repeat high-range offenders; and other offenders. The three groups were then compared in terms of personal characteristics, traffic offences, crash history and criminal history. Results revealed a number of significant differences between repeat high-range offenders and those in the other two offender groups. Repeat high-range speeding offenders were more likely than drivers in the other two groups to be male, to be younger, to hold a provisional and a motorcycle licence, to have committed a range of previous traffic offences, to have a significantly greater likelihood of crash involvement, and to have been involved in multiple-vehicle crashes. Additionally, when a subset of offenders’ criminal histories was examined, results revealed that repeat high-range speeding offenders were also more likely to have committed a previous criminal offence than once-only low-range and other offenders, and that 55.2% of the repeat high-range offenders had a criminal history. They were also significantly more likely to have committed drug offences and offences against order than the once-only low-range speeding offenders, and significantly more likely to have committed regulation offences than those in the other offenders group. Overall, the results indicate that speeding offenders are not a homogeneous group and that, therefore, more tailored and innovative sanctions should be considered and evaluated for high-range recidivist speeders, because they are a high-risk road user group.
Abstract:
This thesis presents an association rule mining approach, association hierarchy mining (AHM). Unlike traditional two-step bottom-up rule mining, AHM adopts a one-step top-down mining strategy to improve the efficiency and effectiveness of mining association rules from datasets. The thesis also presents a novel approach to evaluating the quality of the knowledge discovered by AHM, which focuses on the information difference between the discovered knowledge and the original datasets. Experiments performed on a real application, characterising network traffic behaviour, show that AHM achieves encouraging performance.
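The abstract does not spell out AHM’s one-step top-down strategy, but the rule-quality measures that association rule mining rests on are standard. A minimal sketch of support and confidence only (not the AHM algorithm itself), with hypothetical network-flow records reduced to categorical items as in the traffic-characterisation experiment:

```python
def support(itemset, transactions):
    """Fraction of transactions that contain every item in itemset."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Estimated P(consequent | antecedent), the usual rule-quality measure."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

# Hypothetical network-flow records, each reduced to a set of categorical items.
transactions = [
    {"proto=tcp", "port=80", "state=established"},
    {"proto=tcp", "port=443", "state=established"},
    {"proto=udp", "port=53", "state=stateless"},
    {"proto=tcp", "port=80", "state=established"},
]

# Rule "proto=tcp -> state=established" holds in every TCP flow here.
rule_conf = confidence({"proto=tcp"}, {"state=established"}, transactions)
```

A bottom-up miner enumerates frequent itemsets before forming rules; AHM’s contribution, per the abstract, is to avoid that two-step detour.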
Abstract:
In this article an alternative sensitivity analysis is proposed for train schedules. It characterises a schedule’s robustness, or lack thereof, and provides unique performance profiles for different sources and values of delay. An approach like this is necessary because train schedules are only a prediction of what will actually happen: when deviations and other delays occur, they can perform poorly with respect to a variety of performance metrics, if indeed they can be implemented at all as originally intended. The information provided by this analytical approach is beneficial because it can be used as part of a proactive scheduling approach to alter a schedule in advance, or to identify suitable courses of action for specific “bad behaviour”. Furthermore, this information may be used to quantify the cost of delay. The effects of sectional running time (SRT) deviations and additional dwell time in particular were quantified for three railway schedule performance measures. The key features of this approach were demonstrated in a case study.
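As a rough illustration of the kind of perturbation analysis described (a minimal sketch with hypothetical section and dwell times, not the article’s model): inject a single SRT deviation into a sequential schedule and read off the change in arrival times; sweeping the deviation’s location and magnitude then yields a delay-response profile of the sort the article calls a performance profile.

```python
def propagate(srt, dwell, deviation_at, deviation):
    """Forward-propagate one sectional running time (SRT) deviation through a
    sequential single-train schedule with no recovery margin: each arrival is
    the previous arrival plus actual running time plus dwell."""
    arrival = 0.0
    arrivals = []
    for i, (run, stop) in enumerate(zip(srt, dwell)):
        run_actual = run + (deviation if i == deviation_at else 0.0)
        arrival += run_actual + stop
        arrivals.append(arrival)
    return arrivals

# Hypothetical 4-section schedule: planned SRTs and dwell times in minutes.
srt = [10, 12, 8, 15]
dwell = [2, 2, 2, 0]

planned = propagate(srt, dwell, deviation_at=-1, deviation=0.0)
delayed = propagate(srt, dwell, deviation_at=1, deviation=3.0)

# With no slack anywhere, the full 3-minute deviation reaches the terminus.
terminal_delay = delayed[-1] - planned[-1]
```

Adding recovery margins to later sections would let the schedule absorb part of the deviation, which is exactly the robustness property such an analysis measures.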
Abstract:
Increasing train speeds is conceptually a simple and straightforward way to expand railway capacity, particularly in comparison to more extensive and elaborate alternatives. In this article an analytical capacity model is investigated as a means of performing a sensitivity analysis of train speeds. The results of this sensitivity analysis can help improve the operation of the railway system and help it cope with additional demand in the future. To test the approach, a case study of the Rah Ahane Iran (RAI) national railway network was selected. The absolute capacity levels for this railway network have been determined, and the analysis shows that increasing train speeds may not be cost-effective in all circumstances.
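A minimal sketch of why a speed increase can yield diminishing capacity returns (hypothetical numbers and a deliberately simple headway model, not the article’s analytical model): if the minimum headway on a section is traversal time plus a fixed operational buffer, only the traversal component shrinks as speed rises, so capacity grows less than proportionally.

```python
def section_capacity(length_km, speed_kmh, buffer_min, window_min=1440):
    """Trains per day through one section: the operating window divided by
    the minimum headway (traversal time plus a fixed buffer)."""
    traversal_min = 60.0 * length_km / speed_kmh
    headway_min = traversal_min + buffer_min
    return window_min / headway_min

# Hypothetical 40 km section with a 10-minute buffer.
base = section_capacity(40, 80, 10)     # 30 min traversal + 10 min buffer
faster = section_capacity(40, 100, 10)  # 24 min traversal + 10 min buffer

# A 25% speed increase buys well under 25% more capacity,
# because the fixed buffer does not shrink with speed.
gain = faster / base - 1.0
```

This is the shape of result the article reports: the capacity gain per unit of speed increase tapers off, so raising speeds may not justify its cost in all circumstances.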
Abstract:
A systematic literature review and a comprehensive meta-analysis combining the findings of existing studies were conducted in this thesis to analyse the impact of traffic characteristics on crash occurrence. Sensitivity analyses were conducted to investigate study quality, publication bias and outlier bias, and the time intervals used to measure traffic characteristics were also considered. Based on this comprehensive and systematic review, and the results of the subsequent meta-analysis, major issues in study design, traffic and crash data, and model development and evaluation are discussed.
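As an illustration of the pooling step such a meta-analysis rests on (a minimal fixed-effect inverse-variance sketch with made-up study effects, not the thesis’s data or model):

```python
import math

def pooled_effect(effects, std_errs):
    """Fixed-effect inverse-variance meta-analysis: weight each study's
    effect estimate by 1/SE^2, then pool."""
    weights = [1.0 / se ** 2 for se in std_errs]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical effects of a traffic characteristic on crash occurrence
# (e.g. log odds) from three studies, with their standard errors.
effects = [0.40, 0.25, 0.55]
ses = [0.10, 0.20, 0.15]

est, se = pooled_effect(effects, ses)  # precise studies dominate the pool
```

Re-pooling with each study dropped in turn (leave-one-out) is one simple way to probe the outlier and quality sensitivities the thesis examines.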
Abstract:
In current bridge management systems (BMSs), load and speed restrictions are applied to unhealthy bridges to keep the structure safe and serviceable for as long as possible. The question, however, is whether applying these restrictions will always decrease the internal forces in the critical components of the bridge and enhance its safety. To find the answer, this paper, for the first time in the literature, looks into the design aspects by studying the changes in the demand-to-capacity ratios of the critical components of a bridge under train loads. For this purpose, a structural model of a simply supported bridge, whose dynamic behaviour is similar to that of a group of real railway bridges, is developed. Demand-to-capacity ratios of the critical components of the bridge are calculated to identify their sensitivity to increases in speed and in the magnitude of the live load. The outcomes of this study are significant because they show that, contrary to what is expected, applying a speed restriction may increase the demand-to-capacity ratio of some components and make the bridge unsafe for carrying live load. Suggestions are made to solve this problem.
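The counterintuitive result, that slowing trains down can raise component demand, is usually tied to resonance: a train of regularly spaced axles excites a bridge most strongly near discrete speeds. A toy illustration of that mechanism (every number, and the Lorentzian peak shape, is hypothetical and contrived to show the effect; this is not the paper’s structural model):

```python
def dynamic_amplification(speed_kmh, axle_spacing_m=18.0, freq_hz=4.0,
                          peak=0.6, width_kmh=8.0, modes=4):
    """Toy dynamic amplification factor: unity plus Lorentzian peaks placed
    at the classical resonance speeds v_i = 3.6 * spacing * frequency / i
    (km/h), where a regular axle sequence drives the first bending mode."""
    daf = 1.0
    for i in range(1, modes + 1):
        v_res = 3.6 * axle_spacing_m * freq_hz / i  # 259.2, 129.6, 86.4, 64.8
        daf += peak / (1.0 + ((speed_kmh - v_res) / width_kmh) ** 2)
    return daf

def demand_capacity(speed_kmh, static_demand=0.65, capacity=1.0):
    """Demand-to-capacity ratio: static demand scaled by the speed-dependent
    amplification (all values hypothetical)."""
    return static_demand * dynamic_amplification(speed_kmh) / capacity

unrestricted = demand_capacity(100.0)  # away from any resonance peak
restricted = demand_capacity(65.0)     # lands near the peak at 64.8 km/h
```

Because the restricted speed sits near a resonance peak, the demand-to-capacity ratio rises above that of the faster, unrestricted case, which mirrors the paper’s warning that a blanket speed restriction can make an unhealthy bridge less safe rather than more.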