Abstract:
Traffic-related emissions are recognised as one of the main sources of air pollutants. In the research study discussed in this paper, the variability of atmospheric total suspended particulate matter (TSP), polycyclic aromatic hydrocarbon (PAH) and heavy metal (HM) concentrations with traffic and land use characteristics during weekdays and weekends was investigated. Data required for the study were collected from a range of sampling sites to ensure a wide mix of traffic and land use characteristics. The analysis confirmed that zinc has the highest concentration in the atmospheric phase during weekends as well as weekdays. Although the use of leaded gasoline was discontinued a decade ago, lead was the second most commonly detected heavy metal; this is attributed to the association of previously generated lead with roadside soil and its re-suspension into the atmosphere. Soil-related particles are the primary source of atmospheric TSP and manganese. The analysis further revealed that traffic sources dominate gas-phase PAHs during weekdays, while land-use-related sources become important contributors to atmospheric PAHs during weekends, when traffic sources are at their minimum.
Abstract:
Traffic management, which controls traffic flow and physical distribution, will be important for further noise reduction in the future. To apply traffic management measures effectively, a model for predicting traffic flow across the citywide road network is necessary. For this purpose, the existing model AVENUE was used as a macroscopic traffic-flow prediction model. The traffic-flow model was integrated with a sound power model for road vehicles to establish a new road traffic noise prediction model, with which a noise map of the entire city can be produced. In this study, the change in traffic flow on the road network after the construction of new roads was first estimated, and the resulting change in road traffic noise was predicted. The results show that the prediction model can estimate how traffic management changes the noise map. In addition, the macroscopic traffic-flow model was combined with our conventional microscopic traffic-flow model, expanding the coverage of the noise prediction model.
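At the core of building a noise map from per-segment emission levels is energetic (not arithmetic) addition of decibel contributions at each receiver point. A minimal sketch of that step, independent of AVENUE's actual implementation:

```python
import math

def combine_levels(levels_db):
    """Energetically sum sound pressure levels (in dB) from several sources
    reaching the same receiver: L_total = 10*log10(sum 10^(L_i/10))."""
    return 10.0 * math.log10(sum(10.0 ** (lvl / 10.0) for lvl in levels_db))

# Two equal 60 dB road segments raise the receiver level by ~3 dB, not 60 dB.
total_db = combine_levels([60.0, 60.0])
```

Doubling identical sources always adds 10·log10(2) ≈ 3 dB, which is why a new road's noise impact depends strongly on the levels already present.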
Abstract:
This paper presents a behavioral car-following model based on empirical trajectory data that is able to reproduce the spontaneous formation and ensuing propagation of stop-and-go waves in congested traffic. By analyzing individual drivers’ car-following behavior throughout oscillation cycles, it is found that this behavior is consistent across drivers and can be captured by a simple model. The statistical analysis of the model’s parameters reveals a strong correlation between driver behavior before and during an oscillation, a correlation that should not be ignored if one is interested in microscopic output. If macroscopic outputs are of interest, simulation results indicate that an existing model with fewer parameters can be used instead. This is shown for traffic oscillations caused by rubbernecking as observed in the US 101 NGSIM dataset. The same experiment is used to establish the relationship between rubbernecking behavior and the period of the oscillations.
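The abstract does not name its low-parameter model; Newell's simplified car-following rule is a well-known model of that kind and serves here only as an illustration of how a leader's slowdown propagates to a follower with a time lag tau and space gap d (both values below are arbitrary):

```python
def newell_follower(leader_pos, dt, tau, d):
    """Newell's simplified car-following rule: the follower's trajectory
    replicates the leader's, shifted back by a time lag tau (s) and a
    space gap d (m). Positions are sampled every dt seconds."""
    shift = int(round(tau / dt))
    return [leader_pos[max(i - shift, 0)] - d for i in range(len(leader_pos))]

dt, tau, d = 0.5, 1.0, 7.0
leader = [0.0]
for step in range(40):
    v = 15.0 if step < 20 else 5.0   # leader brakes from 15 to 5 m/s mid-run
    leader.append(leader[-1] + v * dt)
follower = newell_follower(leader, dt, tau, d)
# The follower's speed drop occurs exactly tau seconds after the leader's,
# which is the mechanism by which a deceleration wave travels upstream.
```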
Abstract:
Wavelet transforms (WTs) are a powerful tool for extracting localized variations in non-stationary signals, and applications in traffic engineering have been introduced; however, they lack some important theoretical fundamentals. In particular, there is little guidance on selecting an appropriate WT across potential transport applications. The research described in this paper contributes to the literature by first describing a numerical experiment that demonstrates the shortcomings of commonly used data-processing techniques in traffic engineering (i.e., averaging, moving averaging, second-order differencing, oblique cumulative curves, and the short-time Fourier transform). It then mathematically describes the WT’s ability to detect singularities in traffic data. Next, the selection of a suitable WT for a particular research topic in traffic engineering is discussed in detail by objectively and quantitatively comparing candidate wavelets’ performance in a numerical experiment. Finally, case studies using both loop detector data and vehicle trajectories show that the choice of wavelet depends largely on the specific research topic, and that the Mexican hat wavelet generally performs satisfactorily in detecting singularities in traffic and vehicular data.
Abstract:
Many governments throughout the world rely heavily on traffic law enforcement programs to modify driver behaviour and enhance road safety. There are two related functions of traffic law enforcement, apprehension and deterrence, and these are achieved through three processes: the establishment of traffic laws, the policing of those laws, and the application of penalties and sanctions to offenders. Traffic policing programs can vary by visibility (overt or covert) and deployment methods (scheduled and non-scheduled), while sanctions can serve to constrain, deter or reform offending behaviour. This chapter will review the effectiveness of traffic law enforcement strategies from the perspective of a range of high-risk, illegal driving behaviours including drink/drug driving, speeding, seat belt use and red light running. Additionally, this chapter discusses how traffic police are increasingly using technology to enforce traffic laws and thus reduce crashes. The chapter concludes that effective traffic policing involves a range of both overt and covert operations and includes a mix of automatic and more traditional manual enforcement methods. It is important to increase both the perceived and actual risk of detection by ensuring that traffic law enforcement operations are sufficiently intensive, unpredictable in nature and conducted as widely as possible across the road network. A key means of maintaining the unpredictability of operations is through the random deployment of enforcement and/or the random checking of drivers. The impact of traffic enforcement is also heightened when it is supported by public education campaigns. In the future, technological improvements will allow the use of more innovative enforcement strategies. Finally, further research is needed to continue the development of traffic policing approaches and address emerging road safety issues.
Abstract:
Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper: by analyzing critical vessel interactions, it proactively measures collision risk in port waters. The proposed method is illustrated through a quantitative measurement of collision risks in the Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. The method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.
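The abstract does not define its conflict measure, but a standard building block for identifying critical vessel interactions is the closest point of approach (CPA) between two constant-velocity tracks; small distance at CPA with a small positive time to CPA flags a conflict. A sketch under those assumptions:

```python
import math

def cpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Time to and distance at the closest point of approach between two
    vessels on constant-velocity tracks (positions in nm, speeds in knots).
    Returns (tcpa_hours, dcpa_nm); tcpa <= 0 means the CPA is already past."""
    rx, ry = tgt_pos[0] - own_pos[0], tgt_pos[1] - own_pos[1]
    vx, vy = tgt_vel[0] - own_vel[0], tgt_vel[1] - own_vel[1]
    v2 = vx * vx + vy * vy
    tcpa = 0.0 if v2 == 0 else -(rx * vx + ry * vy) / v2
    dcpa = math.hypot(rx + vx * tcpa, ry + vy * tcpa)
    return tcpa, dcpa

# Head-on encounter: target 10 nm due east, 20 kn combined closing speed.
tcpa, dcpa = cpa((0.0, 0.0), (10.0, 0.0), (10.0, 0.0), (-10.0, 0.0))
# tcpa = 0.5 h and dcpa = 0 nm: a critical interaction by any threshold.
```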
Abstract:
Navigational collisions are one of the major safety concerns for many seaports. Despite the extent of recent work on collision risk analysis in port waters, little is known about the factors influencing the risk. This paper develops a technique for modeling collision risks in port waterways in order to examine the associations between the risks and the geometric, traffic, and regulatory control characteristics of waterways. A binomial logistic model, which accounts for the correlations in the risks of a particular fairway at different time periods, is derived from traffic conflicts and calibrated for the Singapore port fairways. Estimation results show that fairways attached to a shoreline, a traffic intersection or an international fairway carry higher risks, whereas those attached to confined water or a local fairway carry lower risks. Higher risks are also found in fairways featuring a greater degree of bend, lower water depth, higher numbers of cardinal and isolated danger marks, higher density of moving ships and lower operating speeds. Risks are also higher during night-time conditions.
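In a binomial logistic model of this kind, each coefficient multiplies the odds of a conflict by exp(beta), which is how statements like "higher risks at night" are quantified. The coefficients and covariates below are purely illustrative, not the paper's estimates:

```python
import math

def logistic(eta):
    """Inverse logit: maps a linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-eta))

# Hypothetical coefficients for a fairway-period conflict model:
# intercept, degree of bend (rad), water depth (m), night indicator.
beta = {"intercept": -2.0, "bend": 0.8, "depth": -0.05, "night": 0.6}

def conflict_prob(bend, depth, night):
    eta = (beta["intercept"] + beta["bend"] * bend
           + beta["depth"] * depth + beta["night"] * night)
    return logistic(eta)

day = conflict_prob(bend=0.5, depth=20.0, night=0)
night = conflict_prob(bend=0.5, depth=20.0, night=1)
# The night indicator multiplies the odds by exp(0.6) ~ 1.82 regardless of
# the other covariates -- the usual odds-ratio reading of a logit coefficient.
```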
Abstract:
This paper investigates the relationship between traffic conditions and crash occurrence likelihood (COL) using the I-880 data. To remedy the data limitations and methodological shortcomings of previous studies, a multiresolution data processing method is proposed and implemented, upon which binary logistic models are developed. The major findings are: 1) traffic conditions have significant impacts on COL at the study site; specifically, COL in congested (transitioning) traffic flow is about 6 (1.6) times that in free-flow conditions; 2) speed variance alone is not sufficient to capture the impact of traffic dynamics on COL; a traffic chaos indicator that integrates speed, speed variance and flow is proposed and shows promising performance; 3) models based on aggregated data should be interpreted with caution. In general, conclusions obtained from such models should not be generalized to individual vehicles (drivers) without further evidence from high-resolution data, and it is dubious to either claim or refute that "speed kills" on the basis of aggregated data.
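The caution about aggregated data is easy to demonstrate: averaging fine-resolution speeds into coarse intervals can erase the very speed variance a crash-risk model depends on. The numbers below are invented for illustration, not taken from the I-880 data:

```python
import statistics

def aggregate(series, k):
    """Average consecutive blocks of k samples (e.g. 30-s speeds -> 5-min means)."""
    return [sum(series[i:i + k]) / k for i in range(0, len(series) - k + 1, k)]

# Hypothetical 30-s speeds oscillating between free flow and stop-and-go (km/h):
speeds = [100, 30, 95, 25, 105, 35, 90, 20, 100, 30] * 3
var_fine = statistics.pvariance(speeds)
var_coarse = statistics.pvariance(aggregate(speeds, 10))
# Every 5-min mean lands near 63 km/h: the oscillation that a chaos-type
# indicator would flag is invisible at the coarse resolution.
```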
Abstract:
Most crash severity studies have ignored severity correlations between driver-vehicle units involved in the same crash. Models that do not account for these within-crash correlations produce biased estimates of factor effects. This study developed a Bayesian hierarchical binomial logistic model to identify the significant factors affecting the severity of driver injury and vehicle damage in traffic crashes at signalized intersections. Crash data from Singapore were employed to calibrate the model. Model fit assessment and comparison using the intra-class correlation coefficient (ICC) and the deviance information criterion (DIC) confirmed the suitability of introducing crash-level random effects. Crashes occurring at peak times, in good street-lighting conditions, or involving pedestrian injuries are associated with lower severity, while those occurring at night, at T/Y-type intersections, on the right-most lane, or at intersections equipped with red light cameras have greater odds of being severe. Moreover, heavy vehicles offer better resistance to severe crash outcomes, while crashes involving two-wheeled vehicles, young or aged drivers, or an offending party are more likely to result in severe injuries.
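For a random-intercept logistic model, the ICC mentioned above is commonly computed on the latent scale, where the standard logistic residual has variance pi^2/3. A sketch with a hypothetical crash-level random-effect variance (not the study's estimate):

```python
import math

def icc_logistic(sigma_u2):
    """Intra-class correlation for a random-intercept binomial logistic model:
    share of latent-response variance attributable to the crash-level effect,
    with pi^2/3 as the level-1 (standard logistic) residual variance."""
    return sigma_u2 / (sigma_u2 + math.pi ** 2 / 3)

# A hypothetical random-effect variance of 1.0 implies that ~23% of the latent
# severity variation is shared by driver-vehicle units within the same crash,
# which would justify keeping the crash-level random effect in the model.
icc = icc_logistic(1.0)
```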
Abstract:
The Poisson distribution has often been used for count data such as accident counts, and the negative binomial (NB) distribution has been adopted to handle the over-dispersion problem. However, Poisson and NB distributions cannot account for unobserved heterogeneity arising from the spatial and temporal effects in accident data; random effect models have been developed to overcome this problem. Another challenge for existing traffic accident prediction models is the excess of zero-accident observations in some datasets. Although the zero-inflated Poisson (ZIP) model can handle the dual-state system in accident data with excess zeros, it does not accommodate the within-location and between-location correlation heterogeneities that motivate random effect models in the first place. This paper proposes an effective way of fitting a ZIP model with location-specific random effects, and recommends Bayesian analysis for model calibration and assessment.
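The ZIP dual-state structure is compact to state: with probability p0 an observation comes from an always-zero state, otherwise from a Poisson. A minimal sketch of the probability mass function (in the paper's extension, the location-specific random effect would enter through the Poisson mean, which is omitted here):

```python
import math

def zip_pmf(y, lam, p0):
    """P(Y = y) under a zero-inflated Poisson: an always-zero state with
    probability p0, otherwise Poisson(lam) counts."""
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    return p0 * (1 if y == 0 else 0) + (1 - p0) * poisson

# With lam = 2 a plain Poisson gives P(0) ~ 0.135; mixing in a 27% zero state
# lifts P(0) to ~0.37 -- the "excess zeros" the abstract refers to.
p_zero = zip_pmf(0, 2.0, 0.27)
```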
Abstract:
Navigational collisions are one of the major safety concerns in many seaports, and addressing them requires a comprehensive, structured method of collision risk management. Traditionally, management of port-water collision risks has relied on historical collision data. However, this collision-data-based approach suffers from several shortcomings: the randomness and rarity of collisions yield too few samples for sound statistical analysis, the data are insufficient for explaining collision causation, and the approach to safety is reactive. A promising alternative that overcomes these shortcomings is the navigational traffic conflict technique, which uses traffic conflicts in place of collision data. This paper proposes a collision risk management method based on the principles of this technique. The method allows safety analysts to diagnose safety deficiencies proactively and consequently has great potential for managing collision risks in a fast, reliable and efficient manner.
Abstract:
Particles emitted by vehicles are known to cause detrimental health effects, with their size and oxidative potential among the main factors responsible. Therefore, understanding the relationship between traffic composition and both the physical characteristics and oxidative potential of particles is critical. To contribute to the limited knowledge base in this area, we investigated this relationship in a 4.5 km road tunnel in Brisbane, Australia. On-road concentrations of ultrafine particles (<100 nm, UFPs), fine particles (PM2.5), CO, CO2 and particle-associated reactive oxygen species (ROS) were measured using vehicle-based mobile sampling. UFPs were measured using a condensation particle counter and PM2.5 with a DustTrak aerosol photometer. A new profluorescent nitroxide probe, BPEAnit, was used to determine ROS levels. Comparative measurements were also performed on an above-ground road to assess the role of emission dilution on the parameters measured. The profile of UFP and PM2.5 concentrations with distance through the tunnel was determined, and demonstrated relationships with both road gradient and tunnel ventilation. ROS levels in the tunnel were high compared to an open road with similar traffic characteristics, which was attributed to the substantial difference in estimated emission dilution ratios on the two roadways. Principal component analysis (PCA) revealed that the levels of pollutants and ROS were generally better correlated with total traffic count than with traffic composition (i.e. diesel- and gasoline-powered vehicles). A possible reason for the lack of correlation with HDVs, which have previously been shown to be strongly associated with UFPs in particular, was the low absolute number encountered during the sampling, which may have made their contribution to in-tunnel pollution largely indistinguishable from that of the total vehicle volume.
For ROS, the stronger association observed with HDVs and gasoline vehicles combined (total traffic count) than with either considered individually may signal a role for the interaction of their emissions as a determinant of on-road ROS in this pilot study. If further validated, this interaction should not be overlooked in studies of on- or near-road particle exposure and its potential health effects.
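The pattern described, pollutant levels tracking total traffic count better than any single vehicle class, can be mimicked with plain correlations on synthetic data. All numbers below are invented for illustration and have no connection to the study's measurements:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

random.seed(1)
# Synthetic tunnel runs: HDVs are a small, noisy fraction of the total count,
# while UFP concentration closely tracks the total count.
total = [random.randint(200, 400) for _ in range(30)]
hdv = [max(0, int(0.03 * t + random.gauss(0, 4))) for t in total]
ufp = [50.0 * t + random.gauss(0, 500) for t in total]
r_total, r_hdv = pearson(ufp, total), pearson(ufp, hdv)
# When a vehicle class is rare and noisy, its own correlation with pollution
# is weak even though it contributes to the total-count signal.
```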
Abstract:
Decision tables and decision rules play an important role in rough-set-based data analysis, compressing databases into granules and describing the associations between granules. Granule mining has been proposed to interpret decision rules in terms of association rules and a multi-tier structure. In this paper, we further extend granule mining to describe the relationships between granules not only by traditional support and confidence, but also by diversity and condition diversity. Diversity measures how diversely a granule is associated with other granules, providing a novel kind of knowledge about databases. Experiments were conducted to test the proposed concepts in describing the characteristics of a real network traffic data collection. The results show that the proposed concepts are promising.
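Under one plausible reading of these concepts (the paper's formal definitions may differ), support and confidence are the usual association-rule measures over granules, and diversity counts how many distinct decision granules a condition granule associates with. A toy decision table over hypothetical network-traffic attributes:

```python
from collections import Counter, defaultdict

# Toy decision table: condition granule = (protocol, port_class),
# decision granule = traffic label. Attribute values are invented.
rows = [
    ("tcp", "low", "normal"), ("tcp", "low", "normal"),
    ("tcp", "high", "attack"), ("udp", "high", "attack"),
    ("udp", "high", "normal"), ("tcp", "low", "attack"),
]

support = Counter((p, c) for p, c, _ in rows)     # granule frequencies
links = defaultdict(Counter)                      # condition -> decision counts
for p, c, d in rows:
    links[(p, c)][d] += 1

def confidence(cond, dec):
    """Fraction of a condition granule's rows carrying a given decision."""
    return links[cond][dec] / support[cond]

def diversity(cond):
    """Number of distinct decision granules the condition associates with
    (one plausible reading of the diversity measure, assumed here)."""
    return len(links[cond])

# ("tcp", "low") occurs in 3 of 6 rows, implies "normal" with confidence 2/3,
# and associates with 2 distinct decision granules.
```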