995 results for Traffic volume.
Abstract:
National Highway Traffic Safety Administration, Office of Vehicle Safety Compliance, Washington, D.C.
Abstract:
Final report; December 1977.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Mode of access: Internet.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done only for the major roads, leaving most local roads without any AADT information. However, AADTs are needed for local roads for many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT.

This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method is a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses the tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level trip distribution gravity model. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system with extensive spatial data processing in ArcGIS.

To evaluate the performance of the new method, data from several study areas in Broward County, Florida, were used. The estimated AADTs were compared with those from two existing methods, using actual traffic counts as the ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000. Accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize this impact.
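The distribution and loading steps described in this abstract can be sketched conceptually as follows; the exponential impedance function, the NetworkX shortest-path routine, and all names are illustrative assumptions, not the dissertation's actual Cube/ArcGIS implementation.

```python
# Illustrative sketch: gravity-model distribution of parcel-generated trips to
# count sites, followed by all-or-nothing loading onto a road network.
import math
import networkx as nx


def gravity_distribution(productions, attractions, cost, beta=0.1):
    """Distribute parcel productions to count-site attractions using an
    assumed exponential impedance function f(c) = exp(-beta * c)."""
    trips = {}
    for i, p_i in productions.items():
        denom = sum(attractions[j] * math.exp(-beta * cost[i][j]) for j in attractions)
        for j, a_j in attractions.items():
            trips[(i, j)] = p_i * a_j * math.exp(-beta * cost[i][j]) / denom
    return trips


def all_or_nothing(graph, trips):
    """Load each O-D trip volume onto the single shortest path by travel time;
    the accumulated link volumes approximate the estimated AADTs."""
    volumes = {edge: 0.0 for edge in graph.edges}
    for (origin, dest), t in trips.items():
        path = nx.shortest_path(graph, origin, dest, weight="time")
        for u, v in zip(path, path[1:]):
            volumes[(u, v)] += t
    return volumes
```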
Abstract:
The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method involves running several iterations of all-or-nothing capacity-restraint assignment with an adjustment of travel time to reflect the delays encountered in the associated iteration. The iterative link time adjustment process is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes and the input capacities are given in hourly volumes, it is necessary to convert the hourly capacities to their daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by a factor called the peak-to-daily ratio, referred to as CONFAC in FSUTMS. The ratio is computed as the highest hourly volume of a day divided by the corresponding total daily volume.

While several studies have indicated that CONFAC is a decreasing function of the level of congestion, a constant value is used for each facility type in the current version of FSUTMS. This ignores the different congestion level associated with each roadway and is believed to be one of the culprits of traffic assignment errors. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment method.

The assignment results based on constant and variable CONFACs were then compared against the ground counts for three selected networks. It was found that the accuracy of the two assignments was not significantly different and that the hypothesized improvement in assignment results from the variable CONFAC model was not empirically evident. It was recognized that many other factors beyond the scope and control of this study could contribute to this finding. It was recommended that further studies focus on the use of the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
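As an illustration of the conversion described above, a minimal sketch of the BPR volume-delay computation with an hourly-to-daily capacity conversion via CONFAC is given below; the BPR parameters and the variable-CONFAC functional form are placeholders, not FSUTMS' calibrated values.

```python
# BPR volume-delay with daily capacity obtained by dividing hourly capacity
# by CONFAC (the peak-hour to daily-volume ratio). Parameter values are the
# common BPR defaults, used here only for illustration.
def bpr_travel_time(free_flow_time, daily_volume, hourly_capacity,
                    confac, alpha=0.15, beta=4.0):
    daily_capacity = hourly_capacity / confac        # daily equivalent capacity
    vc_ratio = daily_volume / daily_capacity
    return free_flow_time * (1.0 + alpha * vc_ratio ** beta)


def variable_confac(vc_ratio, a=0.10, b=0.02, floor=0.06):
    """Placeholder decreasing function of congestion, standing in for the
    calibrated variable-CONFAC relationship described in the abstract."""
    return max(a - b * vc_ratio, floor)


# Example: constant CONFAC per facility type vs. a congestion-dependent value.
t_constant = bpr_travel_time(10.0, 45000, 1800, confac=0.095)
t_variable = bpr_travel_time(10.0, 45000, 1800, confac=variable_confac(1.2))
```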
Abstract:
Variable Speed Limit (VSL) strategies identify and disseminate dynamic speed limits that are determined to be appropriate based on prevailing traffic, road surface, and weather conditions. This dissertation develops and evaluates a shockwave-based VSL system that uses a heuristic switching logic-based controller with specified thresholds of prevailing traffic flow conditions. The system aims to improve operations and mobility at critical bottlenecks. Before traffic breakdown occurs, the proposed VSL's goal is to prevent or postpone breakdown by decreasing the inflow and achieving a uniform distribution of speed and flow. After breakdown occurs, the VSL system aims to dampen traffic congestion by reducing the inflow to the congested area and increasing the bottleneck capacity by deactivating the VSL at the head of the congested area. The shockwave-based VSL system pushes the VSL location upstream as the congested area propagates upstream.

In addition to testing the system using infrastructure detector-based data, this dissertation investigates the use of Connected Vehicle trajectory data as input to the shockwave-based VSL system. Since field Connected Vehicle data are not available, as part of this research Vehicle-to-Infrastructure communication is modeled in the microscopic simulation to obtain individual vehicle trajectories. In this system, a wavelet transform is used to analyze aggregated individual vehicle speed data to determine the locations of congestion.

The currently recommended calibration procedures for simulation models are generally based on capacity, volume, and system-performance values and do not specifically examine traffic breakdown characteristics. However, since the proposed VSL strategies are countermeasures to the impacts of breakdown conditions, considering breakdown characteristics in the calibration procedure is important for a reliable assessment. Several enhancements were proposed in this study to account for breakdown characteristics at bottleneck locations in the calibration process. The performance of the shockwave-based VSL is then compared to VSL systems with different fixed VSL message sign locations using the calibrated microscopic model. The results show that the shockwave-based VSL outperforms fixed-location VSL systems and can considerably decrease the maximum back of queue and the duration of breakdown while increasing the average speed during breakdown.
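A minimal sketch of the kind of heuristic switching logic described in this abstract is shown below; the thresholds, speed-limit steps, and sign-selection rule are illustrative assumptions, not the dissertation's calibrated control parameters.

```python
# Illustrative threshold-based VSL switching logic: limits step down as detector
# measurements cross assumed breakdown thresholds, and the active sign location
# moves upstream as the back of the queue (congestion tail) propagates upstream.
def select_speed_limit(measured_speed_mph, occupancy_pct,
                       breakdown_speed=45, recovery_speed=55, high_occupancy=25):
    """Return an advisory speed limit for one VSL sign (placeholder values)."""
    if measured_speed_mph < breakdown_speed or occupancy_pct > high_occupancy:
        return 40          # restrict inflow to the congested area
    if measured_speed_mph > recovery_speed:
        return 65          # conditions recovered: post the normal limit
    return 55              # transitional limit near breakdown conditions


def active_vsl_sign(congestion_tail_milepost, sign_mileposts):
    """Pick the sign nearest to, and upstream of, the back of the queue so the
    control location shifts upstream with the shockwave (milepost convention
    assumed: smaller milepost = farther upstream)."""
    upstream = [m for m in sign_mileposts if m <= congestion_tail_milepost]
    return max(upstream) if upstream else min(sign_mileposts)
```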
Abstract:
In order to determine the adequacy with which safety problems on low-volume rural roadways were addressed by the four states of Federal Region VII (Iowa, Kansas, Missouri, and Nebraska), a review was made of the states' safety policies. After reviewing literature dealing with the identification of hazardous locations, evaluation methodologies, and system-wide safety improvements, a survey of the states' safety policies was conducted. An official from each state was questioned about the various aspects and procedures dealing with safety improvements. After analyzing and comparing the remarkably diverse policies, recommendations were made in the form of a model safety program. This program included special modifications that would help remediate hazards on low-volume rural roadways. Especially encouraged is a system-wide approach to improvement which would cover all parts of the highway system, not just urban and high-volume roadways.
Abstract:
Building on previous research, the goal of this project was to identify significant influencing factors for the Iowa Department of Transportation (DOT) to consider in future updates of its Instructional Memorandum (I.M.) 3.213, which provides guidelines for determining the need for traffic barriers (guardrail and bridge rail) at secondary roadway bridges—specifically, factors that might be significant for the bridge rail rating system component of I.M. 3.213. A literature review was conducted of policies and guidelines in other states and, specifically, of studies related to traffic barrier safety countermeasures at bridges in several states. In addition, a safety impact study was conducted to evaluate possible non-driver-related behavior characteristics of crashes on secondary road structures in Iowa using road data, structure data, and crash data from 2004 to 2013. Statistical models (negative binomial regression) were used to determine which factors were significant in terms of crash volume and crash severity. The study found that crashes are somewhat more frequent on or at bridges possessing certain characteristics—traffic volume greater than 400 vehicles per day (vpd) (paved) or greater than 50 vpd (unpaved), bridge length greater than 150 ft (paved) or greater than 35 ft (unpaved), bridge width narrower than its approach (paved) or narrower than 20 ft (unpaved), and bridges older than 25 years (both paved and unpaved). No specific roadway or bridge characteristic was found to contribute to more serious crashes. The study also confirmed previous research findings that crashes with bridges on secondary roads are rare, low-severity events. Although the findings of the study support the need for appropriate use of bridge rails, it concludes that prescriptive guidelines for bridge rail use on secondary roads may not be necessary, given the limited crash expectancy and lack of differences in crash expectancy among the various combinations of explanatory characteristics.
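A hedged sketch of the type of negative binomial crash-frequency model described above, fit with statsmodels, is shown below; the file name and predictor columns are hypothetical stand-ins for the Iowa road, structure, and crash variables.

```python
# Negative binomial regression of crash counts on assumed roadway/bridge
# characteristics, as a stand-in for the statistical models used in the study.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("bridge_crashes.csv")          # hypothetical input file
X = sm.add_constant(df[["aadt", "bridge_length", "relative_width", "bridge_age"]])
y = df["crash_count"]                           # crashes per structure, 2004-2013

nb_model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(nb_model.summary())                       # coefficient p-values flag significant factors
```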
Abstract:
Image processing offers unparalleled potential for traffic monitoring and control. For many years engineers have attempted to perfect the art of automatic data abstraction from sequences of video images. This paper outlines a research project undertaken by the authors at Napier University in the field of image processing for automatic traffic analysis. A software-based system implementing TRIP algorithms to count cars and measure vehicle speed has been developed by members of the Transport Engineering Research Unit (TERU) at the University. The TRIP algorithm has been ported and evaluated on an IBM PC platform with a view to hardware implementation of the pre-processing routines required for vehicle detection. Results show that a software-based traffic counting system is realisable for single-window processing. Due to the high volume of data that must be processed for full frames or multiple lanes, real-time operation is limited, and dedicated hardware therefore needs to be designed. The paper outlines a hardware design for implementation of inter-frame and background differencing, background updating and shadow removal techniques. Preliminary results showing the processing time and counting accuracy for the routines implemented in software are presented, and a real-time hardware pre-processing architecture is described.
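The software pre-processing steps named above (background differencing with background updating) can be sketched roughly as follows using OpenCV; this is purely illustrative and is not the original TRIP implementation, which targets dedicated hardware.

```python
# Background differencing with a slow running-average background update,
# producing a foreground mask that a detection window could count vehicles from.
import cv2
import numpy as np

cap = cv2.VideoCapture("traffic.avi")                            # hypothetical video source
ok, frame = cap.read()
background = np.float32(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)) # initial background estimate

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, cv2.convertScaleAbs(background))    # background differencing
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)    # foreground (vehicle) mask
    cv2.accumulateWeighted(gray, background, 0.05)               # slow background update
    # 'mask' would then be analysed within the detection window to count vehicles
```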
Abstract:
The mechanics-based analysis framework predicts top-down fatigue cracking initiation time in asphalt concrete pavements by utilising fracture mechanics and mixture morphology-based properties. To reduce the level of complexity involved, traffic data were characterised and incorporated into the framework using the equivalent single axle load (ESAL) approach. There is a concern that this kind of simplistic traffic characterisation might result in erroneous performance predictions and pavement structural designs. This paper integrates axle load spectra and other traffic characterisation parameters into the mechanics-based analysis framework and studies the impact these parameters have on predicted fatigue cracking performance. The traffic characterisation inputs studied are traffic growth rate, axle load spectra, lateral wheel wander and volume adjustment factors. For this purpose, a traffic integration approach that incorporates Monte Carlo simulation and representative traffic characterisation inputs was developed. The significance of these traffic characterisation parameters was established by evaluating a number of field pavement sections. The results show that all the traffic characterisation parameters except truck wheel wander have a significant influence on predicted top-down fatigue cracking performance.
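The Monte Carlo traffic-integration idea can be sketched in simplified form as follows; the axle load bins, frequencies, and the fourth-power damage relationship are placeholder assumptions rather than the paper's calibrated inputs.

```python
# Simplified Monte Carlo sampling from an axle load spectrum instead of a single
# ESAL value, feeding a placeholder load-to-damage relationship.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical single-axle load spectrum: bin midpoints (kN) and relative frequencies.
load_bins = np.array([40, 60, 80, 100, 120, 140])
frequency = np.array([0.30, 0.28, 0.20, 0.12, 0.07, 0.03])

n_axles = 100_000                                    # axle passes simulated for one period
loads = rng.choice(load_bins, size=n_axles, p=frequency)

# Placeholder fourth-power damage law, illustrating how sampled loads would feed
# the fatigue-damage accumulation in the analysis framework.
reference_load = 80.0
relative_damage = np.sum((loads / reference_load) ** 4)
print(f"Accumulated relative damage: {relative_damage:.1f}")
```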
Abstract:
This study tested whether myocardial extracellular volume (ECV) is increased in patients with hypertension and atrial fibrillation (AF) undergoing pulmonary vein isolation and whether there is an association between ECV and post-procedural recurrence of AF. Hypertension is associated with myocardial fibrosis, an increase in ECV, and AF. Data linking these findings are limited. T1 measurements pre-contrast and post-contrast in a cardiac magnetic resonance (CMR) study provide a method for quantification of ECV. Consecutive patients with hypertension and recurrent AF referred for pulmonary vein isolation underwent a contrast CMR study with measurement of ECV and were followed up prospectively for a median of 18 months. The endpoint of interest was late recurrence of AF. Patients had elevated left ventricular (LV) volumes, LV mass, left atrial volumes, and increased ECV (patients with AF, 0.34 ± 0.03; healthy control patients, 0.29 ± 0.03; p < 0.001). There were positive associations between ECV and left atrial volume (r = 0.46, p < 0.01) and LV mass and a negative association between ECV and diastolic function (early mitral annular relaxation [E'], r = -0.55, p < 0.001). In the best overall multivariable model, ECV was the strongest predictor of the primary outcome of recurrent AF (hazard ratio: 1.29; 95% confidence interval: 1.15 to 1.44; p < 0.0001) and the secondary composite outcome of recurrent AF, heart failure admission, and death (hazard ratio: 1.35; 95% confidence interval: 1.21 to 1.51; p < 0.0001). Each 10% increase in ECV was associated with a 29% increased risk of recurrent AF. In patients with AF and hypertension, expansion of ECV is associated with diastolic function and left atrial remodeling and is a strong independent predictor of recurrent AF post-pulmonary vein isolation.
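For context, ECV is conventionally computed from the pre- and post-contrast T1 values of myocardium and blood together with the hematocrit; a standard formulation (assumed here, not quoted from the study) is:

```latex
% Standard ECV formulation from pre-/post-contrast T1 and hematocrit (Hct).
\[
\mathrm{ECV} \;=\; (1 - \mathrm{Hct}) \times
\frac{\dfrac{1}{T1_{\text{myo,post}}} - \dfrac{1}{T1_{\text{myo,pre}}}}
     {\dfrac{1}{T1_{\text{blood,post}}} - \dfrac{1}{T1_{\text{blood,pre}}}}
\]
```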
Abstract:
Current data indicate that the size of high-density lipoprotein (HDL) may be considered an important marker for cardiovascular disease risk. We established reference values of mean HDL size and volume in an asymptomatic representative Brazilian population sample (n=590) and their associations with metabolic parameters by gender. Size and volume were determined in HDL isolated from plasma by polyethyleneglycol precipitation of apoB-containing lipoproteins and measured using the dynamic light scattering (DLS) technique. Although the gender and age distributions agreed with other studies, the mean HDL size reference value was slightly lower than in some other populations. Both HDL size and volume were influenced by gender and varied according to age. HDL size was associated with age and HDL-C (total population); non-white ethnicity and CETP inversely (females); HDL-C and PLTP mass (males). On the other hand, HDL volume was determined only by HDL-C (total population and in both genders) and by PLTP mass (males). The reference values for mean HDL size and volume using the DLS technique were established in an asymptomatic and representative Brazilian population sample, as well as their related metabolic factors. HDL-C was a major determinant of HDL size and volume, which were differently modulated in females and in males.