989 results for Traffic capacity


Relevance:

30.00%

Publisher:

Abstract:

As defined in the ATM 2000+ Strategy (Eurocontrol, 2001), the mission of the Air Traffic Management (ATM) System is: "For all the phases of a flight, the ATM system should facilitate a safe, efficient, and expeditious traffic flow, through the provision of adaptable ATM services that can be dimensioned in relation to the requirements of all the users and areas of the European airspace. The ATM services should comply with the demand, be compatible, operate under uniform principles, respect the environment and satisfy the national security requirements." The objective of this paper is to present a methodology designed to evaluate the status of the ATM system in terms of the relationship between offered capacity and traffic demand, identifying weak areas and proposing solutions. The first part of the methodology characterizes and evaluates the current system, while the second part proposes an approach to analyze its possible development limit. As part of the work, general criteria are established to frame the analysis and diagnostic methodology: the use of Air Traffic Control (ATC) sectors as the unit of analysis; the presence of network effects; a tactical focus; the relative character of the analysis; objectivity; and a high-level assessment that allows assumptions about the human and Communications, Navigation and Surveillance (CNS) elements, which are considered the typical resources of high-density air traffic. The methodology proceeds through the following steps: definition of indicators and metrics, such as the nominal criticality or the nominal efficiency of a sector; scenario characterization, in which the necessary data are collected; network effects analysis, to study the relations among the constitutive elements of the ATC system; diagnosis by means of the "System Status Diagram"; analytical study of the ATC system development limit; and, finally, formulation of conclusions and proposals for improvement.
This methodology was employed by Aena (the Spanish airport manager and air navigation service provider) and INECO (a Spanish transport engineering company) in the analysis of the Spanish ATM system within the framework of the Spanish airspace capacity sustainability programme, although it could be applied elsewhere.
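The abstract names a "nominal criticality" indicator per ATC sector without defining it. A minimal sketch, assuming one plausible reading of that indicator as the ratio of peak demand to declared sector capacity; the concrete definition, the classification bands, and the sector names and figures below are illustrative assumptions, not the paper's:

```python
# Hedged sketch of a "nominal criticality" indicator for ATC sectors.
# ASSUMPTIONS: the demand/capacity-ratio definition, the bands, and the
# example sector data are all invented for illustration.

def nominal_criticality(peak_demand: int, declared_capacity: int) -> float:
    """Peak-hour demand over declared sector capacity (flights/hour)."""
    if declared_capacity <= 0:
        raise ValueError("declared capacity must be positive")
    return peak_demand / declared_capacity

def classify_sector(criticality: float) -> str:
    """Illustrative bands for a 'System Status Diagram'-style diagnostic."""
    if criticality < 0.8:
        return "under-loaded"
    if criticality <= 1.0:
        return "balanced"
    return "critical"

# made-up example sectors: (peak demand, declared capacity)
sectors = {"SECTOR_A": (52, 46), "SECTOR_B": (38, 44)}
for name, (demand, capacity) in sectors.items():
    c = nominal_criticality(demand, capacity)
    print(f"{name}: criticality={c:.2f} -> {classify_sector(c)}")
```

A diagnostic diagram could then plot each sector's criticality against its efficiency to locate the weak areas the methodology targets.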

Relevance:

30.00%

Publisher:

Abstract:

Federal Highway Administration, Office of Research, Washington, D.C.

Relevance:

30.00%

Publisher:

Abstract:

Texas State Department of Highways and Public Transportation, Transportation Planning Division, Austin

Relevance:

30.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance:

30.00%

Publisher:

Abstract:

In studies of complex heterogeneous networks, particularly of the Internet, significant attention has been paid to analyzing network failures caused by hardware faults or overload, where the network reaction was modeled as rerouting of traffic away from failed or congested elements. Here we model another type of network reaction to congestion: a sharp reduction of the input traffic rate through congested routes, which occurs on much shorter time scales. We consider the onset of congestion in the Internet, where local mismatch between demand and capacity results in traffic losses, and show that it can be described as a phase transition characterized by strong non-Gaussian loss fluctuations at a mesoscopic time scale. The fluctuations, caused by noise in the input traffic, are exacerbated by the heterogeneous nature of the network, manifested in a scale-free load distribution. They result in the network strongly overreacting to the first signs of congestion by significantly reducing input traffic along communication paths where congestion is utterly negligible. © Copyright EPLA, 2012.
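The mechanism described, noisy input traffic on links with a scale-free load distribution producing bursty losses wherever instantaneous load exceeds capacity, can be illustrated with a toy simulation. Every parameter below (Pareto exponent, noise level, capacity margin, network size) is an illustrative assumption, not the paper's calibrated model:

```python
# Hedged toy model of congestion onset: links carry power-law mean loads,
# input noise perturbs them, and losses occur where load exceeds capacity.
# ASSUMPTIONS: all parameter choices are illustrative.
import random

random.seed(7)

def pareto_load(alpha: float = 2.0) -> float:
    """Sample a mean link load from a power-law (Pareto) distribution."""
    return random.paretovariate(alpha)

def step_losses(mean_loads, capacity_margin=1.2, noise=0.3):
    """Total traffic lost in one time step across all links."""
    lost = 0.0
    for load in mean_loads:
        instantaneous = load + random.gauss(0.0, noise * load)
        capacity = capacity_margin * load  # capacity provisioned per link
        lost += max(0.0, instantaneous - capacity)
    return lost

loads = [pareto_load() for _ in range(500)]
losses = [step_losses(loads) for _ in range(200)]
print(f"mean loss {sum(losses) / len(losses):.2f}, "
      f"peak loss {max(losses):.2f}")
```

Because the heaviest links dominate both the load and its noise, the peak loss in such a run sits well above the mean, mirroring the strong loss fluctuations the paper describes.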

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with long-term (20+ years) forecasting of broadband traffic in next-generation networks. Such a long-term approach requires going beyond extrapolations of past traffic data, while facing both high uncertainty in predicting future developments and the fact that, in 20 years, current network technologies and architectures will be obsolete. Thus, "order of magnitude" upper bounds on upstream and downstream traffic are deemed good enough to facilitate such long-term forecasting. These bounds can be obtained by evaluating the limits of human sight and assuming that these limits will be reached by future services or, alternatively, by considering the content transferred by bandwidth-demanding applications such as those using embedded interactive 3D video streaming. The traffic upper bounds are a good indication of the peak values and, consequently, also of future network capacity demands. Furthermore, the main drivers of traffic growth, including multimedia as well as non-multimedia applications, are identified. New disruptive applications and services are explored that can make good use of the large bandwidth provided by next-generation networks. The results can be used to identify monetization opportunities for future services and to map potential revenues for network operators. © 2014 The Author(s).
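A back-of-envelope bound in the spirit of the "limits of human sight" argument can be worked through as resolution × frame rate × colour depth, divided by a compression factor. All four figures below are rough illustrative assumptions, not the paper's values:

```python
# Hedged order-of-magnitude bound on per-viewer video traffic.
# ASSUMPTIONS: every constant below is a rough illustrative figure.

pixels_per_eye = 500e6      # often-quoted rough "resolution" of the eye
frames_per_second = 120     # beyond typical flicker-fusion rates
bits_per_pixel = 24         # 8-bit RGB
compression_factor = 100    # modern codecs reach roughly 100:1

raw_bps = pixels_per_eye * frames_per_second * bits_per_pixel
bound_bps = raw_bps / compression_factor
print(f"raw: {raw_bps / 1e12:.2f} Tbit/s, "
      f"compressed bound: {bound_bps / 1e9:.1f} Gbit/s")
```

With these figures the raw bound is 1.44 Tbit/s and the compressed bound 14.4 Gbit/s per viewer; the point of the exercise is the order of magnitude, which is insensitive to modest changes in any single assumption.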

Relevance:

30.00%

Publisher:

Abstract:

In studies of complex heterogeneous networks, particularly of the Internet, significant attention has been paid to analysing network failures caused by hardware faults or overload. There, the network reaction was modelled as rerouting of traffic away from failed or congested elements. Here we model the network reaction to congestion on much shorter time scales, when the input traffic rate through congested routes is reduced. As an example we consider the Internet, where local mismatch between demand and capacity results in traffic losses. We describe the onset of congestion as a phase transition characterised by strong, albeit relatively short-lived, fluctuations of losses, caused by noise in the input traffic and exacerbated by the heterogeneous nature of the network, manifested in a power-law load distribution. The fluctuations may result in the network strongly overreacting to the first signs of congestion by significantly reducing input traffic along communication paths where congestion is utterly negligible. © 2013 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

Next-generation networks are likely to be non-uniform in all their aspects, including the number of lightpaths carried per link, the number of wavelengths per link, the number of fibres per link, the asymmetry of the links, and the traffic flows. Routing and wavelength allocation models generally assume that the optical network is uniform and that the number of wavelengths per link is constant. In practice, however, some nodes and links carry heavy traffic, and additional wavelengths are needed on those links. We study a wavelength-routed optical network based on the UK JANET topology in which traffic demands between nodes are assumed to be non-uniform. We investigate how network capacity can be increased by locating congested links and suggesting cost-effective upgrades. Different traffic demand patterns, hop distances, numbers of wavelengths per link, and routing algorithms are considered. Numerical results show that a 95% increase in network capacity is possible by overlaying fibre on just 5% of the existing links. We conclude that non-uniform traffic allocation can help to localize traffic on nodes and links deep in the network core, where provisioning additional resources can efficiently and cost-effectively increase network capacity. © 2013 IEEE.
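The upgrade strategy described, locate the most congested links and overlay fibre on a small fraction of them, can be sketched as a simple greedy selection. The link names, loads, and wavelength counts below are made-up illustrative data, not the JANET figures from the study:

```python
# Hedged sketch of the "upgrade the few most congested links" strategy.
# ASSUMPTIONS: link data and wavelength counts are invented examples.

def pick_upgrades(link_load, wavelengths_per_link, budget_fraction=0.05):
    """Return the most utilised links, up to a fraction of all links."""
    utilisation = {link: load / wavelengths_per_link
                   for link, load in link_load.items()}
    ranked = sorted(utilisation, key=utilisation.get, reverse=True)
    n = max(1, int(len(ranked) * budget_fraction))
    return ranked[:n]

# invented per-link lightpath counts on a 20-link toy network
link_load = {f"link{i}": load for i, load in enumerate(
    [38, 12, 9, 35, 11, 8, 14, 10, 37, 13,
     9, 12, 11, 10, 8, 9, 12, 11, 10, 36])}
print(pick_upgrades(link_load, wavelengths_per_link=40))
```

In this toy data the load is concentrated on a handful of core links, which is exactly the situation in which upgrading 5% of the links can release a disproportionate share of capacity.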

Relevance:

30.00%

Publisher:

Abstract:

Choosing between Light Rail Transit (LRT) and Bus Rapid Transit (BRT) systems is often controversial and is not an easy task for transportation planners contemplating an upgrade of their public transportation services. These two transit systems provide comparable services for medium-sized cities, from the suburban neighborhood to the Central Business District (CBD), and utilize similar right-of-way (ROW) categories. This research is aimed at developing a method to assist transportation planners and decision makers in determining the more feasible of the two systems. Cost estimation is a major factor when evaluating a transit system. Typically, LRT is more expensive to build and implement than BRT but has significantly lower operating and maintenance (O&M) costs. This dissertation examines the factors impacting capacity and costs and develops capacity-based cost models for the LRT and BRT systems. Various ROW categories and alignment configurations are also considered in the developed cost models. Kikuchi's fleet size model (1985) and a cost allocation method are used to develop the cost models that estimate capacity and costs. The comparison between LRT and BRT is complicated by the many possible transportation planning and operation scenarios. Therefore, a user-friendly computer interface integrating the capacity-based cost models, the LRT and BRT Cost Estimator (LBCostor), was developed in Microsoft Visual Basic to facilitate the process and guide users through the comparison. The cost models and LBCostor can be used to analyze transit volumes, alignments, ROW configurations, numbers of stops and stations, headways, vehicle sizes, and traffic signal timing at intersections. Planners can make the necessary changes and adjustments depending on their operating practices.
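A minimal sketch of a capacity-based cost comparison in the spirit of the abstract. The headway/fleet-size relation below is the classic round-trip formula (not necessarily Kikuchi's exact 1985 model), and every unit cost and capacity figure is an invented placeholder, not the dissertation's calibrated values:

```python
# Hedged sketch of a capacity-based LRT vs BRT cost comparison.
# ASSUMPTIONS: vehicle capacities, costs, demand, and round-trip time
# are illustrative placeholders.
import math

def required_headway_min(peak_demand_pphpd, vehicle_capacity):
    """Largest headway (minutes) that still serves the peak demand."""
    vehicles_per_hour = peak_demand_pphpd / vehicle_capacity
    return 60.0 / vehicles_per_hour

def fleet_size(round_trip_min, headway_min):
    """Vehicles needed to maintain the headway over a full round trip."""
    return math.ceil(round_trip_min / headway_min)

def annual_cost(fleet, capital_per_vehicle, om_per_vehicle):
    """Annualised capital plus O&M cost for the fleet."""
    return fleet * (capital_per_vehicle + om_per_vehicle)

for mode, cap, capital, om in [("LRT", 450, 0.45e6, 0.30e6),
                               ("BRT", 120, 0.10e6, 0.45e6)]:
    h = required_headway_min(peak_demand_pphpd=6000, vehicle_capacity=cap)
    f = fleet_size(round_trip_min=60, headway_min=h)
    print(f"{mode}: headway={h:.1f} min, fleet={f}, "
          f"annual cost=${annual_cost(f, capital, om) / 1e6:.1f}M")
```

The sketch shows the structural trade-off the dissertation models: the higher-capacity LRT vehicle needs a far smaller fleet at the same demand, which is where its O&M advantage comes from.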

Relevance:

30.00%

Publisher:

Abstract:

Optimization of adaptive traffic signal timing is one of the most complex problems in traffic control systems. This dissertation presents a new method that applies a parallel genetic algorithm (PGA) to optimize adaptive traffic signal control in the presence of transit signal priority (TSP). The method can optimize the phase plan, cycle length, and green splits at isolated intersections with consideration for the performance of both transit and general vehicles. Unlike the simple genetic algorithm (GA), the PGA can provide the better and faster solutions needed for real-time optimization of adaptive traffic signal control. An important component of the proposed method is a microscopic delay estimation model designed specifically to optimize adaptive traffic signals with TSP. Macroscopic delay models, such as the Highway Capacity Manual (HCM) delay model, are unable to accurately consider the effects of phase combination and phase sequence in delay calculations. In addition, because the number of phases and the phase sequence of an adaptive traffic signal may vary from cycle to cycle, the phase splits cannot be optimized when the phase sequence is also a decision variable. A "flex-phase" concept was introduced in the proposed microscopic delay estimation model to overcome these limitations. The performance of the PGA was first evaluated against the simple GA. The results show that the PGA achieved both faster convergence and lower delay under both under-saturated and over-saturated traffic conditions. A VISSIM simulation testbed was then developed to evaluate the performance of the proposed PGA-based adaptive traffic signal control with TSP. The simulation results show that the PGA-based optimizer for adaptive TSP outperformed fully actuated NEMA control in all test cases.
The results also show that the PGA-based optimizer was able to produce TSP timing plans that benefit transit vehicles while minimizing the impact of TSP on general vehicles. The VISSIM testbed developed in this research provides a powerful tool for designing and evaluating different TSP strategies under both actuated and adaptive signal control.
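A minimal, single-process sketch of the GA component for green splits only. The dissertation uses a parallel GA with a microscopic delay model; here a plain GA and a toy uniform-delay function (Webster's first term) stand in for both, and all flows, cycle length, and GA parameters are illustrative assumptions:

```python
# Hedged sketch: a simple GA tuning two-phase green splits against a toy
# delay function. ASSUMPTIONS: flows, saturation flow, cycle length, and
# GA settings are illustrative; the real method is a parallel GA with a
# microscopic delay model and TSP.
import random

random.seed(1)
CYCLE = 90.0          # fixed cycle length (s)
FLOWS = [900, 400]    # approach flows (veh/h), one per phase
SAT = 1800            # saturation flow (veh/h)

def delay(splits):
    """Toy per-phase uniform delay (Webster's first term), summed."""
    total = 0.0
    for g, q in zip(splits, FLOWS):
        lam = g / CYCLE                       # effective green ratio
        x = min(q / (lam * SAT), 0.99)        # degree of saturation
        total += CYCLE * (1 - lam) ** 2 / (2 * (1 - lam * x))
    return total

def random_splits():
    g1 = random.uniform(20, CYCLE - 20)       # keep minimum greens
    return (g1, CYCLE - g1)

def evolve(pop_size=30, generations=40):
    pop = [random_splits() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=delay)
        parents = pop[: pop_size // 2]        # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            g1 = (a[0] + b[0]) / 2 + random.gauss(0, 2)  # crossover+mutation
            g1 = min(max(g1, 20), CYCLE - 20)
            children.append((g1, CYCLE - g1))
        pop = parents + children
    return min(pop, key=delay)

best = evolve()
print(f"best splits: {best[0]:.1f}s / {best[1]:.1f}s, delay={delay(best):.1f}")
```

As expected, the GA shifts green time toward the heavier approach and beats the equal-split plan; a parallel GA distributes exactly this evaluate-select-recombine loop across processors to meet real-time deadlines.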

Relevance:

30.00%

Publisher:

Abstract:

Rapid population increase and booming economic growth have caused a significant escalation in car ownership in many cities, leading to multiple traffic problems on congested roadways. The increase in automobiles is generating significant congestion and pollution, and it has become necessary to find a solution to the ever-worsening traffic problems in our cities. Building more roadways is nearly impossible due to right-of-way limitations in urban areas. Studies have shown that guideway transit can provide effective transportation and stimulate land development, and Medium-Capacity Guideway Transit (MCGT) is one alternative for addressing this problem. The objectives of this research were to better understand the characteristics of MCGT systems, to investigate existing MCGT systems around the world and determine the main factors behind the planning of successful systems, and to develop an MCGT planning guide. The factors utilized in this study were determined and analyzed using Excel, and an MCGT Planning Guide was developed using Microsoft Visual Basic. An MCGT system was defined as a transit system whose capacity can reach up to 20,000 passengers per hour per direction (pphpd). The results show that Light Rail Transit (LRT) is favored when peak-hour demand is less than 13,000 pphpd, while an Automated People Mover (APM) is favored when peak-hour demand exceeds 18,000 pphpd. APM systems could save up to three times the waiting-time cost of LRT. If comfort and convenience are important, an APM makes sense; however, if cost is the critical factor, LRT makes more sense because it offers reasonable service at a reasonable price. When travel time and safety (accident/crash) costs were included in the life-cycle "total" costs, the capital cost advantage of LRT disappeared and APM became very competitive.
The results also include a range of cost-performance criteria for MCGT systems that help planners, engineers, and decision makers select the most feasible system for their respective areas.
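The demand thresholds reported above (LRT favored below 13,000 pphpd, APM above 18,000) translate directly into a small decision rule. The handling of the 13,000–18,000 overlap zone and the cost/comfort tie-break wording are illustrative assumptions drawn loosely from the abstract:

```python
# Minimal decision rule encoding the reported MCGT demand thresholds.
# ASSUMPTION: the tie-break in the 13,000-18,000 pphpd overlap zone is an
# illustrative reading of the abstract's cost-vs-comfort discussion.

def recommend_mode(peak_demand_pphpd: int, cost_critical: bool = True) -> str:
    if peak_demand_pphpd < 13_000:
        return "LRT"
    if peak_demand_pphpd > 18_000:
        return "APM"
    # overlap zone: cost favours LRT, comfort/waiting time favours APM
    return "LRT" if cost_critical else "APM"

print(recommend_mode(9_000))                          # LRT
print(recommend_mode(20_000))                         # APM
print(recommend_mode(15_000, cost_critical=False))    # APM
```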

Relevance:

30.00%

Publisher:

Abstract:

Traffic incidents are a major source of traffic congestion on freeways. Freeway traffic diversion using pre-planned alternate routes has been used as a strategy to reduce traffic delays due to major incidents. However, it is not always beneficial to divert traffic when an incident occurs: route diversion may adversely impact traffic on the alternate routes and may not result in an overall benefit. This dissertation research applies Artificial Neural Network (ANN) and Support Vector Regression (SVR) techniques to predict the percent delay reduction from route diversion, to help determine whether traffic should be diverted under given conditions. The DYNASMART-P mesoscopic traffic simulation model was applied to generate the simulated data used to develop the ANN and SVR models. A sample network that comes with the DYNASMART-P package was used as the base simulation network. Combinations of different levels of incident duration, capacity loss, percent of drivers diverted, VMS (variable message sign) messaging duration, and network congestion were simulated to represent different incident scenarios. The resulting percent delay reduction, average speed, and queue length from each scenario were extracted from the simulation output. The ANN and SVR models were then calibrated for percent delay reduction as a function of all of the simulated input and output variables. The results show that both calibrated models, when applied to the same location used to generate the calibration data, were able to predict delay reduction with relatively high accuracy in terms of mean square error (MSE) and regression correlation. The performance of the ANN model was superior to that of the SVR model; likewise, when the models were applied to a new location, only the ANN model produced comparatively good delay reduction predictions under high network congestion.
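A dependency-free sketch of the ANN side of such a study: a tiny one-hidden-layer network trained by stochastic gradient descent to map incident features (duration, capacity loss, share diverted) to a delay-reduction target. The synthetic data, the toy target function, and the network size are illustrative assumptions; the dissertation trained on DYNASMART-P simulation output:

```python
# Hedged sketch: a tiny MLP fit to synthetic incident scenarios.
# ASSUMPTIONS: the data generator, target formula (scaled to ~[0, 1]),
# and network/training settings are all illustrative.
import math
import random

random.seed(3)

def synth_sample():
    """One synthetic incident scenario (stand-in for simulation output)."""
    dur = random.uniform(0.5, 3.0)        # incident duration (h)
    cap = random.uniform(0.1, 0.8)        # fraction of capacity lost
    div = random.uniform(0.0, 0.5)        # fraction of drivers diverted
    pct = 40 * div * cap + 5 * dur * cap  # toy percent delay reduction
    return [dur, cap, div], pct / 30.0    # scale target to roughly [0, 1]

HID = 6
w1 = [[random.gauss(0, 0.5) for _ in range(3)] for _ in range(HID)]
b1 = [0.0] * HID
w2 = [random.gauss(0, 0.5) for _ in range(HID)]
b2 = 0.0

def forward(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(w1, b1)]
    return h, sum(w * hi for w, hi in zip(w2, h)) + b2

data = [synth_sample() for _ in range(300)]
lr = 0.05
for _ in range(300):                      # plain SGD, squared-error loss
    for x, t in data:
        h, y = forward(x)
        err = y - t
        for j in range(HID):
            grad_h = err * w2[j] * (1 - h[j] ** 2)  # backprop through tanh
            w2[j] -= lr * err * h[j]
            for i in range(3):
                w1[j][i] -= lr * grad_h * x[i]
            b1[j] -= lr * grad_h
        b2 -= lr * err

mse = sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)
print(f"training MSE (scaled units): {mse:.4f}")
```

An SVR baseline would be trained on the same feature table; the study's comparison then hinges on which model transfers better to congestion levels and locations outside the calibration data.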

Relevance:

30.00%

Publisher:

Abstract:

The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method runs several iterations of all-or-nothing capacity-restrained assignment, adjusting travel times after each iteration to reflect the delays encountered in it. The iterative link time adjustment is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes while the input capacities are given as hourly volumes, the hourly capacities must be converted to daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by the peak-to-daily ratio, referred to in FSUTMS as CONFAC, which is computed as the highest hourly volume of a day divided by the corresponding total daily volume. While several studies have indicated that CONFAC is a decreasing function of the level of congestion, the current version of FSUTMS uses a constant value for each facility type. This ignores the different congestion levels of individual roadways and is believed to be one source of traffic assignment error. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment. The assignment results based on constant and variable CONFACs were then compared against ground counts for three selected networks. The accuracy of the two assignments was not significantly different; the hypothesized improvement from the variable CONFAC model was not empirically evident.
It was recognized that many other factors beyond the scope and control of this study could contribute to this finding, and it was recommended that further studies focus on the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
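The conversion and volume-delay steps described above can be worked through directly. The BPR function below uses its standard coefficients (0.15 and 4); the CONFAC value, volumes, and free-flow time are illustrative, not Florida-calibrated data:

```python
# Worked sketch of the CONFAC conversion and the BPR volume-delay step.
# ASSUMPTIONS: CONFAC = 0.09, the volumes, and the free-flow time are
# illustrative example numbers.

def daily_capacity(hourly_capacity: float, confac: float) -> float:
    """C_daily = C_hourly / CONFAC, where
    CONFAC = peak hourly volume / total daily volume."""
    return hourly_capacity / confac

def bpr_travel_time(free_flow_time, volume, capacity, alpha=0.15, beta=4):
    """Standard BPR volume-delay function:
    t = t0 * (1 + alpha * (v/c)**beta)."""
    return free_flow_time * (1 + alpha * (volume / capacity) ** beta)

c_daily = daily_capacity(hourly_capacity=2000, confac=0.09)
t = bpr_travel_time(free_flow_time=10.0, volume=20000, capacity=c_daily)
print(f"daily capacity: {c_daily:.0f} veh/day, congested time: {t:.2f} min")
```

The calibrated approach replaces the constant 0.09 with a function of a congestion measure, so each link's daily capacity, and hence its v/c ratio in the BPR function, reflects its own congestion level at every equilibrium iteration.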

Relevance:

30.00%

Publisher:

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate a lane or lanes adjacent to a freeway to provide congestion-free trips to eligible users, such as transit vehicles or toll payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common, with managed lane demand usually estimated at the assignment step. The key to reliably estimating the demand is therefore the use of effective assignment modeling processes. Managed lanes are particularly effective when the road is functioning at near capacity, so capturing variations in demand and in network attributes and performance is crucial to their modeling, monitoring, and operation. As a result, traditional modeling approaches, such as the static traffic assignment used in demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when modeling managed lanes in congested environments. In addition, the study develops processes to support the effective use of DTA to model managed lane operations. Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. Because these components interact with each other, an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment was developed to replicate real-world traffic conditions.
With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in the different stages of modeling and calibrating managed lanes. Extensive and careful processing of demand, traffic, and toll data, together with proper definition of performance measures, results in a calibrated and stable model that closely replicates real-world congestion patterns and responds reasonably to perturbations in network and demand properties.
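The iterative flavour of such a calibration framework can be sketched as a loop that repeatedly runs an assignment and scales demand toward observed link counts until they agree. The real framework couples static demand estimation with DTA; here a toy linear "model" stands in for the assignment, and every number is illustrative:

```python
# Hedged sketch of an iterative demand-calibration loop.
# ASSUMPTIONS: the toy assignment (a fixed 0.6 link share), the counts,
# and the damping exponent are illustrative placeholders for a real
# DTA-based framework.

def assign(demand):
    """Toy assignment: each unit of demand puts 0.6 units on the counted link."""
    return 0.6 * demand

def calibrate(observed_count, demand=10_000.0, tol=0.01, max_iter=50):
    """Scale demand until modelled and observed counts agree within tol."""
    for iteration in range(1, max_iter + 1):
        modelled = assign(demand)
        ratio = observed_count / modelled
        if abs(ratio - 1.0) < tol:
            return demand, iteration
        demand *= ratio ** 0.5            # damped adjustment for stability
    return demand, max_iter

demand, iters = calibrate(observed_count=7_200)
print(f"calibrated demand: {demand:.0f} after {iters} iterations")
```

With a real DTA in place of `assign`, each iteration is an expensive simulation run and the link response is nonlinear, which is why the damping and a well-defined convergence measure matter in practice.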