487 results for Traffic engineering computing
Abstract:
Many infrastructure and utility systems, such as electricity and telecommunications in Europe and North America, used to be operated as monopolies, if not state-owned enterprises. They have since been broken up into groups of smaller companies managed by different stakeholders. Railways are no exception. Since the early 1980s, there have been reforms restructuring the national railways in different parts of the world, and refinements continue to be made to improve the utilisation of railway resources and the quality of service. There has been growing interest in the industry in understanding the impact of these reforms on operational efficiency and constraints. A number of post-evaluations have been conducted by analysing the performance of the stakeholders in terms of their profits (Crompton and Jupe 2003), quality of train service (Shaw 2001) and engineering operations (Watson 2001). Results from these studies are valuable for future improvement of the system, followed by a new cycle of post-evaluations. However, direct implementation of such changes is often costly and the consequences can take years to surface. With the advance of fast computing technologies, computer simulation is a cost-effective means of evaluating a hypothetical change to a system prior to actual implementation. For example, simulation suites have been developed to study a variety of traffic control strategies according to sophisticated models of train dynamics, traction and power systems (Goodman, Siu and Ho 1998, Ho and Yeung 2001). Unfortunately, in the restructured railway environment, it is by no means easy to model the complex behaviour of the stakeholders and the interactions between them. The multi-agent system (MAS) is a recently developed modelling technique that may assist the railway industry in simulating the restructured railway system.
In a MAS, a real-world entity is modelled as a software agent that is autonomous, reactive to changes, and capable of proactive actions and social communication. MAS has been applied to supply-chain management processes (García-Flores, Wang and Goltz 2000, Jennings et al. 2000a, b) and e-commerce activities (Au, Ngai and Parameswaran 2003, Liu and You 2003), in which the objectives and behaviour of buyers and sellers are captured by software agents. It is therefore worthwhile to investigate the feasibility of applying agent modelling to railways and the extent to which it might help in developing better resource management strategies. This paper sets out to examine the benefits of using MAS to model the resource management process in railways. Section 2 first describes the business environment after the railway reforms. The problems emerging from the restructuring process are then identified in section 3. Section 4 describes the realisation of a MAS for railway resource management under the restructured scheme and the feasibility studies expected from the model.
Abstract:
Popular wireless network standards, such as IEEE 802.11/15/16, are increasingly adopted in real-time control systems. However, they were not designed for real-time applications, so the performance of such wireless networks needs to be carefully evaluated before the systems are implemented and deployed. While efforts have been made to model general wireless networks with completely random traffic generation, there is a lack of theoretical investigation into the modelling of wireless networks with periodic real-time traffic. Focusing on the distributed coordination function (DCF) of the widely used IEEE 802.11 standard in soft real-time control applications, this paper develops an analytical Markov model to quantitatively evaluate the network quality-of-service (QoS) performance in periodic real-time traffic environments. The performance indices evaluated include throughput capacity, transmission delay and packet loss ratio, which are crucial for real-time QoS guarantees in real-time control applications. They are derived under the critical real-time traffic condition, which is formally defined in this paper to characterize the marginal satisfaction of real-time performance constraints.
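The DCF backoff model itself involves detailed protocol parameters, but the core computational step of any such Markov analysis is solving the chain for its stationary distribution, from which throughput and delay follow. A minimal sketch of that step, using a purely illustrative three-state chain rather than the paper's actual model:

```python
# Sketch: stationary distribution of a discrete-time Markov chain via
# power iteration. The 3-state chain below (idle -> backoff -> transmit)
# is illustrative only, not the paper's 802.11 DCF model.

def stationary(P, iters=1000):
    """Iterate pi <- pi @ P until convergence (pi is a row vector)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Toy transition matrix; each row sums to 1.
P = [[0.5, 0.5, 0.0],
     [0.0, 0.5, 0.5],
     [1.0, 0.0, 0.0]]

pi = stationary(P)
# Once pi is known, QoS indices follow from the fraction of time spent
# in the "transmit" state and the expected cycle length.
```
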
Abstract:
A high-performance, low-complexity rate-based flow control algorithm that can avoid congestion and achieve fairness is important to the ATM available bit rate service. The explicit rate allocation algorithm proposed by Kalampoukas et al. is designed to achieve max–min fairness in ATM networks. It has several attractive features, such as a fixed computational complexity of O(1) and guaranteed convergence to max–min fairness. In this paper, certain drawbacks of the algorithm, such as the severe overload of an outgoing link during the transient period and the non-conforming use of the current cell rate field in a resource management cell, are identified and analysed, and a new algorithm which overcomes these drawbacks is proposed. The proposed algorithm also simplifies the rate computation. Compared with Kalampoukas's algorithm, it has better performance in terms of congestion avoidance and smoothness of rate allocation.
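The max–min fairness objective both algorithms target can be illustrated with the classic progressive-filling computation: repeatedly offer every unsatisfied flow an equal share of the remaining capacity, lock in flows whose demand is below that share, and split what is left equally among the rest. This is a generic sketch of the fairness criterion, not the O(1)-per-cell switch algorithm itself:

```python
def max_min_fair(capacity, demands):
    """Progressive filling toward a max-min fair allocation.
    demands: mapping flow -> requested rate; returns flow -> allocation."""
    alloc = {}
    remaining = capacity
    active = dict(demands)
    while active:
        share = remaining / len(active)
        # Flows demanding no more than the equal share are fully satisfied.
        satisfied = {f: d for f, d in active.items() if d <= share}
        if not satisfied:
            for f in active:
                alloc[f] = share    # remaining (bottlenecked) flows split equally
            return alloc
        for f, d in satisfied.items():
            alloc[f] = d
            remaining -= d
            del active[f]
    return alloc
```

For example, three flows demanding 2, 4 and 10 on a link of capacity 10 receive 2, 4 and 4: no flow can gain rate without taking it from a flow with an equal or smaller allocation.
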
Abstract:
Understanding the impacts of traffic and climate change on water quality helps decision makers to develop better policies and plans for dealing with unsustainable urban and transport development. This chapter presents detailed methodologies developed for sample collection and testing for heavy metals and total petroleum hydrocarbons, as part of a research study to investigate the impacts of climate change and changes to urban traffic characteristics on pollutant build-up and wash-off from urban road surfaces. Cadmium, chromium, nickel, copper, lead, iron, aluminium, manganese and zinc were the target heavy metals, and selected gasoline and diesel range organics were the target total petroleum hydrocarbons for this study. The study sites were selected to encompass the urban traffic characteristics of the Gold Coast region, Australia. An improved sample collection method referred to as ‘the wet and dry vacuum system’ for the pollutant build-up, and an effective wash-off plan to incorporate predicted changes to rainfall characteristics due to climate change, were implemented. The novel approach to sample collection for pollutant build-up helped to maintain the integrity of collection efficiency. The wash-off plan helped to incorporate the predicted impacts of climate change in the Gold Coast region. The robust experimental methods developed will help in field sample collection and chemical testing of different stormwater pollutants in build-up and wash-off.
Abstract:
This paper presents a Genetic Algorithm (GA) approach to resolving traffic conflicts at a railway junction. The formulation of the problem for the application of GA is discussed, and three neighborhood structures are proposed for generation evolution. The performance of the GA is evaluated by computer simulation. This study paves the way for more applications of artificial intelligence techniques in a rather conservative industry.
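The generation-evolution loop behind such a GA can be sketched generically. The paper's chromosomes encode junction-specific decisions (train orders and timings); the bit-string encoding and toy fitness below are placeholders for that problem-specific detail, standing in only to show the selection/crossover/mutation cycle:

```python
import random

def evolve(fitness, n_bits=16, pop_size=20, gens=60, seed=1):
    """Minimal elitist GA: keep the top half each generation and refill
    the population with one-point-crossover + single-bit-mutation children."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # selection (elitism)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_bits)] ^= 1       # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve(sum)   # toy fitness: maximise the number of 1 bits
```
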
Abstract:
This paper introduces an event-based traffic model for railway systems adopting fixed-block signalling schemes. In this model, the events of trains' arrival at and departure from signalling blocks constitute the states of the traffic flow. A state transition is equivalent to the progress of the trains by one signalling block and it is realised by referring to past and present states, as well as a number of pre-calculated look-up tables of run-times in the signalling block under various signalling conditions. Simulation results are compared with those from a time-based multi-train simulator to study the improvement of processing time and accuracy.
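The event-based idea can be sketched as an event queue in which each transition advances a train by one signalling block, with the block run-time read from a pre-calculated look-up table rather than integrated step by step. The table values and signal aspects below are illustrative, not the paper's data, and a real model would derive each aspect from the positions of the other trains:

```python
import heapq

# run_time[(block, signal_aspect)] -> seconds to traverse that block,
# pre-calculated offline for each signalling condition (illustrative values).
run_time = {(0, "green"): 60, (1, "green"): 80, (1, "yellow"): 110,
            (2, "green"): 70}

def simulate(n_blocks=3, start=0.0):
    """Advance a single train block by block; each popped event is an
    arrival at the named block."""
    events = [(start, 0)]               # (time, block being entered)
    log = []
    while events:
        t, block = heapq.heappop(events)
        log.append((t, block))
        if block < n_blocks:
            aspect = "green"            # placeholder: fixed clear aspect
            dt = run_time.get((block, aspect))
            if dt is not None:
                heapq.heappush(events, (t + dt, block + 1))
    return log
```
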
Abstract:
Railways in Hong Kong have been one of the few success stories among major metropolitan cities around the world, not only for their profitable operation but also for their efficiency in dealing with astonishingly high traffic demands every day. While railway operations require a chain of delicate systems working in harmony at all times, numerous engineering problems arise and jeopardise the quality of service. Research into these railway engineering problems is therefore essential. This paper highlights railway research work in Hong Kong and discusses its relevance to Mainland China.
Abstract:
Traffic control at a road junction by a complex fuzzy logic controller is investigated. A more complex junction means that more input variables must be taken into account, which increases the number of fuzzy rules in the system. A hierarchical fuzzy logic controller is introduced to reduce the number of rules. Moreover, the increased complexity of the controller makes formulation of the fuzzy rules difficult, so a genetic-algorithm-based off-line learning algorithm is employed to generate them. The learning algorithm uses constant flow-rates as training sets, and the system is tested with both constant and time-varying flow-rates. Simulation results show that the proposed controller produces lower average delay than a fixed-time controller under various traffic conditions.
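The rule explosion that the hierarchy addresses is easy to quantify: a flat controller with n inputs and m fuzzy sets per input needs one rule per combination, m**n, while a chain of two-input sub-controllers needs only (n-1) small m**2 rule tables. This arithmetic sketch assumes that particular chain decomposition; the paper's exact hierarchy may differ:

```python
def flat_rules(n_inputs, m_sets):
    """Flat controller: one rule per combination of input fuzzy sets."""
    return m_sets ** n_inputs

def hierarchical_rules(n_inputs, m_sets):
    """Chain of two-input sub-controllers: (n-1) layers of m^2 rules each."""
    return (n_inputs - 1) * m_sets ** 2

# A junction with 6 inputs and 3 fuzzy sets per input:
# flat_rules(6, 3) gives 729 rules, hierarchical_rules(6, 3) gives 45.
```
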
Abstract:
Short-term traffic flow data are characterized by rapid and dramatic fluctuations, reflecting the frequent congestion in the lane and showing strongly nonlinear features. Traffic state estimation based on data gained by electronic sensors is critical for intelligent traffic management and control. In this paper, a solution to freeway traffic estimation in Beijing is proposed using a particle filter based on a macroscopic traffic flow model, which estimates both traffic density and speed. The particle filter is a nonlinear estimation method with clear advantages for traffic flow prediction. However, as the sampling period increases, the traffic state curve becomes much more volatile, which affects prediction accuracy and makes forecasting more difficult. In this paper, the particle filter model is applied to estimate the short-term traffic flow. A numerical study is conducted on Beijing freeway data with a sampling period of 2 min. The relatively high accuracy of the results indicates the superiority of the proposed model.
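The estimator class in question, a bootstrap particle filter, can be sketched on a one-dimensional toy state. The actual study propagates a full macroscopic flow model (density and speed per freeway segment); the identity dynamics and Gaussian noise levels here are illustrative assumptions only:

```python
import math
import random

def particle_filter(observations, n=500, q=1.0, r=2.0, seed=0):
    """Bootstrap particle filter for a scalar state.
    q: process-noise std (assumed), r: measurement-noise std (assumed)."""
    rng = random.Random(seed)
    particles = [rng.gauss(observations[0], r) for _ in range(n)]
    estimates = []
    for z in observations:
        # 1. Predict: push each particle through the dynamics plus noise.
        particles = [x + rng.gauss(0.0, q) for x in particles]
        # 2. Weight by the Gaussian measurement likelihood of observation z.
        w = [math.exp(-((z - x) ** 2) / (2.0 * r * r)) for x in particles]
        s = sum(w) or 1.0
        w = [wi / s for wi in w]
        # State estimate: weighted mean of the particle cloud.
        estimates.append(sum(wi * x for wi, x in zip(w, particles)))
        # 3. Resample particles in proportion to their weights.
        particles = rng.choices(particles, weights=w, k=n)
    return estimates
```
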
Abstract:
Computer simulation is a versatile and commonly used tool for the design and evaluation of systems with different degrees of complexity. Power distribution systems and electric railway networks are areas in which computer simulations are heavily applied. A dominant factor in evaluating the performance of a software simulator is its processing time, especially in the case of real-time simulation. Parallel processing provides a viable means of reducing the computing time and is therefore suitable for building real-time simulators. In this paper, we present different issues related to solving the power distribution system with parallel computing based on a multiple-CPU server, concentrating in particular on the speedup performance of such an approach.
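Speedup on such multi-CPU servers is commonly bounded by Amdahl's law: if only a fraction of the solver parallelises, the serial remainder caps the gain. A worked sketch of the formula (generic, not the paper's measured figures):

```python
def amdahl_speedup(parallel_fraction, n_cpus):
    """Amdahl's law: speedup = 1 / ((1 - p) + p / N)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cpus)

# Even with 90% of the solution parallelised, 8 CPUs yield only about
# 4.7x, and no CPU count can exceed the 1 / 0.1 = 10x asymptote.
```
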
Abstract:
Parallel computing is currently used in many engineering problems. However, because of limitations in curriculum design, it is not always possible to offer students specific formal teaching in this topic; furthermore, parallel machines are still too expensive for many institutions. The latest microprocessors, such as Intel’s Pentium III and IV, embody single-instruction multiple-data (SIMD) parallel features, which makes them a viable platform for introducing parallel computing concepts to students. Final-year projects utilizing SSE (streaming SIMD extensions) features have been initiated, and it has been observed that students can easily learn parallel programming concepts after going through some programming exercises. They can now experiment with parallel algorithms on their own PCs at home.
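The SSE idea, one instruction applied to four packed single-precision floats at once, can be previewed in any language by restructuring a scalar loop into 4-wide chunks. This is a conceptual teaching sketch only; real SSE code uses C intrinsics such as `_mm_add_ps` operating on 128-bit registers:

```python
def add_scalar(a, b):
    """One addition per loop iteration (ordinary scalar code)."""
    return [x + y for x, y in zip(a, b)]

def add_4wide(a, b):
    """Process 4 lanes per conceptual 'instruction', mimicking an SSE
    register that holds four packed single-precision floats."""
    out = []
    for i in range(0, len(a), 4):
        lane_a, lane_b = a[i:i + 4], b[i:i + 4]            # load two packs
        out.extend(x + y for x, y in zip(lane_a, lane_b))  # one packed add
    return out
```

The point for students is that the 4-wide version performs the same work in a quarter of the loop iterations, which is exactly the saving the hardware realises.
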
Abstract:
An investigation into the effects of changes in urban traffic characteristics due to rapid urbanisation and the predicted changes in rainfall characteristics due to climate change on the build-up and wash-off of heavy metals was carried out in Gold Coast, Australia. The study sites encompassed three different urban land uses. Nine heavy metals commonly associated with traffic emissions were selected. The results were interpreted using multivariate data analysis and decision making tools, such as principal component analysis (PCA), fuzzy clustering (FC), PROMETHEE and GAIA. Initial analyses established high, low and moderate traffic scenarios as well as low, low to moderate, moderate, high and extreme rainfall scenarios for build-up and wash-off investigations. GAIA analyses established that moderate to high traffic scenarios could affect the build-up while moderate to high rainfall scenarios could affect the wash-off of heavy metals under changed conditions. However, in wash-off, metal concentrations in the 1–75 µm fraction were found to be independent of the changes to rainfall characteristics. In build-up, high traffic activities in commercial and industrial areas influenced the accumulation of heavy metal concentrations in the 75 to >300 µm particulate size range, whereas metal concentrations in the finer <1–75 µm range were not affected. As practical implications, solids <1 µm and organic matter from 1 to >300 µm can be targeted for removal of Ni, Cu, Pb, Cd, Cr and Zn from build-up, whilst organic matter from <1 to >300 µm can be targeted for removal of Cd, Cr, Pb and Ni from wash-off. Cu and Zn need to be removed as free ions from most fractions in wash-off.
Abstract:
A combined specular reflection and diffusion model using the radiosity technique was developed to calculate road traffic noise levels on residential balconies. The model can handle numerous geometrical configurations for a single balcony situated in the centre of a street canyon; the width, length and height of the balcony and the street can be altered. The model was used to perform calculations for three different balcony geometries and acoustic absorption characteristics, and the calculated results are presented in this paper.
Abstract:
Web applications such as blogs, wikis, video and photo sharing sites, and social networking systems have been termed ‘Web 2.0’ to highlight an arguably more open, collaborative, personalisable, and therefore more participatory internet experience than what had previously been possible. Giving rise to a culture of participation, an increasing number of these social applications are now available on mobile phones, where they take advantage of device-specific features such as sensors, location and context awareness. This international volume of book chapters contributes towards exploring and better understanding the opportunities and challenges provided by the tools, interfaces, methods and practices of social and mobile technology that enable participation and engagement. It brings together an international group of academics and practitioners from a diverse range of disciplines, such as computing and engineering, social sciences, digital media and human-computer interaction, to critically examine a range of applications of social and mobile technology, such as social networking, mobile interaction, wikis, Twitter, blogging, virtual worlds, shared displays and urban screens, and their capacity to foster community activism, civic engagement and cultural citizenship.
Abstract:
In the study of traffic safety, expected crash frequencies across sites are generally estimated via the negative binomial model, assuming time invariant safety. Since the time invariant safety assumption may be invalid, Hauer (1997) proposed a modified empirical Bayes (EB) method. Despite the modification, no attempts have been made to examine the generalisable form of the marginal distribution resulting from the modified EB framework. Because the hyper-parameters needed to apply the modified EB method are not readily available, an assessment is lacking on how accurately the modified EB method estimates safety in the presence of the time variant safety and regression-to-the-mean (RTM) effects. This study derives the closed form marginal distribution, and reveals that the marginal distribution in the modified EB method is equivalent to the negative multinomial (NM) distribution, which is essentially the same as the likelihood function used in the random effects Poisson model. As a result, this study shows that the gamma posterior distribution from the multivariate Poisson-gamma mixture can be estimated using the NM model or the random effects Poisson model. This study also shows that the estimation errors from the modified EB method are systematically smaller than those from the comparison group method by simultaneously accounting for the RTM and time variant safety effects. Hence, the modified EB method via the NM model is a generalisable method for estimating safety in the presence of the time variant safety and the RTM effects.
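The EB machinery under discussion can be illustrated with the standard Poisson–gamma update, in which the safety estimate is a weighted average of the model prediction and the observed crash count. This is a textbook sketch of the time-invariant case, not Hauer's modified, time-variant version analysed in the paper:

```python
def eb_estimate(mu, phi, x):
    """Poisson-gamma empirical Bayes estimate of expected crash frequency.
    mu:  model-predicted mean (e.g. from a negative binomial model)
    phi: gamma shape / overdispersion parameter
    x:   observed crash count
    Posterior mean = w*mu + (1-w)*x, with weight w = phi / (phi + mu);
    large phi (little overdispersion) pulls the estimate toward mu."""
    w = phi / (phi + mu)
    return w * mu + (1.0 - w) * x

# Example: model predicts 4 crashes, phi = 2, site recorded 10.
# w = 2/6, so the EB estimate is (1/3)*4 + (2/3)*10 = 8 crashes:
# shrunk toward the model mean, correcting for regression to the mean.
```
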