Abstract:
Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated for the quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.
Abstract:
Navigational collisions are one of the major safety concerns for many seaports. Despite the extent of work recently done on collision risk analysis in port waters, little is known about the factors influencing the risk. This paper develops a technique for modeling collision risks in port waterways in order to examine the associations between the risks and the geometric, traffic, and regulatory control characteristics of waterways. A binomial logistic model, which accounts for the correlations in the risks of a particular fairway at different time periods, is derived from traffic conflicts and calibrated for the Singapore port fairways. Estimation results show that fairways attached to shorelines, traffic intersections and international fairways carry higher risks, whereas those attached to confined waters and local fairways carry lower risks. Higher risks are also found in fairways featuring a higher degree of bend, lower water depth, higher numbers of cardinal and isolated danger marks, higher density of moving ships and lower operating speeds. Risks are also found to be higher under night-time conditions.
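As a concrete illustration of the kind of grouped binomial logistic model described above, here is a minimal Python sketch using statsmodels. The data and variable names are hypothetical, and the sketch omits the paper's handling of correlation between time periods within a fairway (which would require, e.g., a GEE or random-effects extension).

```python
# Illustrative grouped (binomial) logistic model: conflicts out of total
# interactions per fairway-period, regressed on fairway characteristics.
# Data and covariates are made up for illustration only.
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "conflicts":     [3, 7, 1, 9, 4, 6, 2, 8],
    "interactions":  [120, 150, 90, 200, 110, 160, 100, 180],
    "bend_degree":   [10.0, 35.0, 5.0, 40.0, 20.0, 30.0, 8.0, 38.0],
    "water_depth_m": [18.0, 12.0, 22.0, 10.0, 15.0, 11.0, 20.0, 9.0],
    "ship_density":  [0.8, 1.5, 0.4, 1.9, 1.1, 1.6, 0.5, 1.8],
    "night":         [0, 1, 0, 1, 0, 1, 0, 1],
})

X = sm.add_constant(df[["bend_degree", "water_depth_m", "ship_density", "night"]])
# statsmodels accepts a two-column (successes, failures) response for Binomial.
y = pd.concat([df["conflicts"], df["interactions"] - df["conflicts"]], axis=1)

fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(fit.summary())
```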
Abstract:
This paper investigates the relationship between traffic conditions and crash occurrence likelihood (COL) using the I-880 data. To remedy the data limitations and methodological shortcomings of previous studies, a multiresolution data processing method is proposed and implemented, upon which binary logistic models are developed. The major findings of this paper are: 1) traffic conditions have significant impacts on COL at the study site; specifically, COL in a congested (transitioning) traffic flow is about 6 (1.6) times that in a free-flow condition; 2) speed variance alone is not sufficient to capture the impact of traffic dynamics on COL; a traffic chaos indicator that integrates speed, speed variance, and flow is proposed and shows promising performance; 3) models based on aggregated data should be interpreted with caution. Generally, conclusions obtained from such models should not be generalized to individual vehicles (drivers) without further evidence from high-resolution data, and it is dubious to either claim or disclaim that "speed kills" on the basis of aggregated data.
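To make the modeling idea concrete, the following is a hedged sketch of a binary logistic COL model in Python. The abstract does not specify the functional form of the "traffic chaos" indicator, so the composite below (speed variance scaled by flow over mean speed) is a hypothetical stand-in, and all data are synthetic.

```python
# Sketch: binary logistic model of crash occurrence with a hypothetical
# composite indicator built from speed, speed variance, and flow.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
speed = rng.uniform(20, 70, n)       # mean speed (mph) at a station
speed_var = rng.uniform(1, 80, n)    # speed variance
flow = rng.uniform(500, 2000, n)     # flow (veh/h)

# Hypothetical "chaos"-style composite: high variance and flow at low speed.
chaos = speed_var * flow / speed
crash = rng.binomial(1, 1 / (1 + np.exp(-(0.001 * chaos - 3))))  # toy labels

X = sm.add_constant(np.column_stack([speed, speed_var, chaos]))
fit = sm.Logit(crash, X).fit(disp=False)
print(fit.params)
```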
Abstract:
Most crash severity studies have ignored severity correlations between driver-vehicle units involved in the same crash. Models that do not account for these within-crash correlations produce biased estimates of factor effects. This study developed a Bayesian hierarchical binomial logistic model to identify the significant factors affecting the severity of driver injury and vehicle damage in traffic crashes at signalized intersections. Crash data from Singapore were employed to calibrate the model. Model fitness assessment and comparison using the Intra-class Correlation Coefficient (ICC) and the Deviance Information Criterion (DIC) confirmed the suitability of introducing crash-level random effects. Crashes occurring at peak times, in good street lighting conditions, or involving pedestrian injuries are associated with lower severity, while those at night, at T/Y-type intersections, on the right-most lane, or at intersections equipped with red light cameras have larger odds of being severe. Moreover, heavy vehicles show better resistance to severe crashes, while crashes involving two-wheeled vehicles, young or aged drivers, or an offending party are more likely to result in severe injuries.
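A minimal sketch of a hierarchical logistic model with a crash-level random intercept, in the spirit of the model described above, might look as follows in PyMC. Priors, data, and dimensions are illustrative only.

```python
# Sketch: Bayesian hierarchical binomial logistic model with a crash-level
# random intercept capturing within-crash correlation (synthetic data).
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_crashes, units_per_crash = 100, 2
crash_idx = np.repeat(np.arange(n_crashes), units_per_crash)
x = rng.normal(size=n_crashes * units_per_crash)   # a stand-in covariate
true_u = rng.normal(0, 0.8, n_crashes)[crash_idx]
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * x + true_u))))

with pm.Model():
    beta0 = pm.Normal("beta0", 0, 10)
    beta1 = pm.Normal("beta1", 0, 10)
    sigma_u = pm.HalfNormal("sigma_u", 1.0)
    u = pm.Normal("u", 0, sigma_u, shape=n_crashes)  # crash-level effects
    eta = beta0 + beta1 * x + u[crash_idx]
    pm.Bernoulli("severe", logit_p=eta, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2)
```

Under the latent-variable formulation of the logit link, the ICC used for model assessment can be approximated as sigma_u^2 / (sigma_u^2 + pi^2/3).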
Abstract:
The Poisson distribution has often been used for count data such as accident counts. The Negative Binomial (NB) distribution has been adopted for count data to address the over-dispersion problem. However, Poisson and NB distributions are incapable of accounting for unobserved heterogeneities arising from the spatial and temporal effects in accident data. To overcome this problem, random effect models have been developed. Another challenge with existing traffic accident prediction models is the presence of excess zero observations in some accident data. Although the Zero-Inflated Poisson (ZIP) model is capable of handling the dual-state system in accident data with excess zeros, it does not accommodate the within-location and between-location correlation heterogeneities that are the basic motivations for random effect models. This paper proposes an effective way of fitting a ZIP model with location-specific random effects, and Bayesian analysis is recommended for model calibration and assessment.
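A hedged sketch of the proposed setup, a zero-inflated Poisson model with location-specific random effects calibrated by Bayesian sampling, is given below in PyMC. The structure and priors are illustrative, not the paper's exact specification.

```python
# Sketch: ZIP model with a location-specific random intercept in the
# Poisson mean, calibrated by Bayesian sampling (synthetic data).
import numpy as np
import pymc as pm

rng = np.random.default_rng(2)
n_sites, years = 50, 5
site = np.repeat(np.arange(n_sites), years)
lam = np.exp(0.7 + rng.normal(0, 0.5, n_sites)[site])
y = rng.poisson(lam) * rng.binomial(1, 0.8, size=site.size)  # excess zeros

with pm.Model():
    beta0 = pm.Normal("beta0", 0, 10)
    sigma = pm.HalfNormal("sigma", 1.0)
    u = pm.Normal("u", 0, sigma, shape=n_sites)  # location random effects
    psi = pm.Beta("psi", 1, 1)                   # prob. of the Poisson state
    mu = pm.math.exp(beta0 + u[site])
    pm.ZeroInflatedPoisson("accidents", psi=psi, mu=mu, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2)
```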
Abstract:
Navigational collisions are one of the major safety concerns in many seaports. To address this safety concern, a comprehensive and structured method of collision risk management is necessary. Traditionally, management of port water collision risks has relied on historical collision data. However, this collision-data-based approach is hampered by several shortcomings: the randomness and rarity of collision occurrence yield too few samples for sound statistical analysis, collision data are insufficient for explaining collision causation, and the approach to safety is reactive. A promising alternative that overcomes these shortcomings is the navigational traffic conflict technique, which uses traffic conflicts in place of collision data. This paper proposes a collision risk management method utilizing the principles of this technique. The method allows safety analysts to diagnose safety deficiencies proactively and consequently has great potential for managing collision risks in a fast, reliable and efficient manner.
Abstract:
The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered by multivariable data mining techniques, which fall into two primary categories: machine learning strategies and statistics-based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n≪p constraint and, as such, require pre-treatment to reduce dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This problem might be solved using a statistical model-based approach in which not only is the importance of each individual protein explicit, but the proteins are also combined into a readily interpretable classification rule without relying on a black-box approach. Here we apply the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
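The comparison described above can be sketched with scikit-learn on a synthetic n≪p dataset: PCA- or PLS-based dimension reduction feeding a simple classifier, versus an SVM on the raw features. Component counts and dataset parameters are arbitrary choices for illustration.

```python
# Sketch: dimension reduction (PCA, PLS) plus classification vs. SVM
# on a synthetic high-dimensional, small-sample dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=1000,
                           n_informative=20, random_state=0)  # n << p

pca_lda = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
svm = SVC(kernel="linear")

print("PCA+LDA cv accuracy:", cross_val_score(pca_lda, X, y, cv=5).mean())
print("SVM     cv accuracy:", cross_val_score(svm, X, y, cv=5).mean())

# PLS-DA: regress a 0/1-coded response, then threshold the prediction
# (in-sample here for brevity; cross-validate in practice).
pls = PLSRegression(n_components=5).fit(X, y)
print("PLS-DA accuracy:", ((pls.predict(X).ravel() > 0.5) == y).mean())
```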
Abstract:
Particles emitted by vehicles are known to cause detrimental health effects, with their size and oxidative potential among the main factors responsible. Therefore, understanding the relationship between traffic composition and both the physical characteristics and oxidative potential of particles is critical. To contribute to the limited knowledge base in this area, we investigated this relationship in a 4.5 km road tunnel in Brisbane, Australia. On-road concentrations of ultrafine particles (<100 nm, UFPs), fine particles (PM2.5), CO, CO2 and particle-associated reactive oxygen species (ROS) were measured using vehicle-based mobile sampling. UFPs were measured using a condensation particle counter and PM2.5 with a DustTrak aerosol photometer. A new profluorescent nitroxide probe, BPEAnit, was used to determine ROS levels. Comparative measurements were also performed on an above-ground road to assess the role of emission dilution in the parameters measured. The profile of UFP and PM2.5 concentrations with distance through the tunnel was determined and showed relationships with both road gradient and tunnel ventilation. ROS levels in the tunnel were found to be high compared to an open road with similar traffic characteristics, which was attributed to the substantial difference in estimated emission dilution ratios on the two roadways. Principal component analysis (PCA) revealed that the levels of pollutants and ROS were generally better correlated with total traffic count than with traffic composition (i.e. diesel- and gasoline-powered vehicles). A possible reason for the lack of correlation with HDVs, which have previously been shown to be strongly associated with UFPs in particular, was the low absolute numbers encountered during sampling, which may have made their contribution to in-tunnel pollution largely indistinguishable from that of the total vehicle volume. For ROS, the stronger association observed with HDVs and gasoline vehicles combined (total traffic count) than with either considered individually may signal a role for the interaction of their emissions as a determinant of on-road ROS in this pilot study. If further validated, this should not be overlooked in studies of on- or near-road particle exposure and its potential health effects.
Abstract:
Decision tables and decision rules play an important role in rough set based data analysis, which compresses databases into granules and describes the associations between granules. Granule mining was previously proposed to interpret decision rules in terms of association rules and a multi-tier structure. In this paper, we further extend granule mining to describe the relationships between granules not only by the traditional support and confidence, but by diversity and condition diversity as well. Diversity measures how diverse a granule is in its associations with other granules, and it provides a novel kind of knowledge in databases. Experiments are conducted to test the proposed concepts in describing the characteristics of a real network traffic data collection. The results show that the proposed concepts are promising.
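Since the abstract does not give formal definitions, the sketch below illustrates one plausible reading: support and confidence computed over (condition, decision) granule pairs, with "diversity" taken as the fraction of distinct decision granules a condition granule co-occurs with. Both the measure and the toy records are hypothetical.

```python
# Sketch: support, confidence, and a hypothetical diversity measure
# over condition/decision granules from toy network-traffic records.
from collections import Counter, defaultdict

records = [("tcp-small", "normal"), ("tcp-small", "normal"),
           ("tcp-small", "scan"),   ("udp-large", "flood"),
           ("udp-large", "flood"),  ("icmp",      "scan")]

n = len(records)
cond_counts = Counter(c for c, _ in records)
pair_counts = Counter(records)
assoc = defaultdict(set)          # decision granules seen with each condition
for c, d in records:
    assoc[c].add(d)
decisions = {d for _, d in records}

for (c, d), k in sorted(pair_counts.items()):
    support = k / n
    confidence = k / cond_counts[c]
    diversity = len(assoc[c]) / len(decisions)   # hypothetical definition
    print(f"{c} -> {d}: sup={support:.2f} conf={confidence:.2f} div={diversity:.2f}")
```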
Abstract:
Traffic safety studies demand more than current micro-simulation models can provide, as these models presume that all drivers exhibit safe behavior. Every microscopic traffic simulation model includes a car-following model. This paper highlights the limitations of the Gipps car-following model's ability to emulate driver behavior for safety study purposes. A safety-adapted car-following model based on the Gipps model is proposed to simulate unsafe vehicle movements, with safety indicators below critical thresholds. The modifications are based on observations of driver behavior in real data and on psychophysical notions. NGSIM vehicle trajectory data are used to evaluate the new model, with short following headways and Time To Collision employed to assess critical safety events within the traffic flow. Risky events extracted from the available NGSIM data serve as the benchmark for evaluating the modified model. The simulation tests illustrate that the proposed model predicts these safety metrics better than the generic Gipps model. The outcome of this paper can potentially facilitate assessing and predicting traffic safety using microscopic simulation.
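For context, below is a small Python sketch of the standard Gipps car-following update together with a time-to-collision (TTC) check of the kind used to flag critical events. Parameter values are illustrative defaults, and the paper's safety-adapted modifications are not reproduced here.

```python
# Sketch: Gipps car-following speed update plus a TTC surrogate-safety check.
import math

def gipps_speed(v, v_lead, gap, tau=0.67, a=1.7, b=-3.4, b_hat=-3.0,
                v_des=25.0, s=6.5):
    """Next-step speed (m/s): min of acceleration and safe-braking branches.
    gap = leader position minus follower position; s = effective leader size."""
    v_acc = v + 2.5 * a * tau * (1 - v / v_des) * math.sqrt(0.025 + v / v_des)
    disc = (b * tau) ** 2 - b * (2 * (gap - s) - v * tau - v_lead ** 2 / b_hat)
    v_brk = b * tau + math.sqrt(max(disc, 0.0))
    return max(0.0, min(v_acc, v_brk))

def ttc(gap, v, v_lead):
    """Time to collision; infinite when the follower is not closing in."""
    closing = v - v_lead
    return gap / closing if closing > 0 else float("inf")

v_next = gipps_speed(v=20.0, v_lead=15.0, gap=30.0)
print(v_next, ttc(30.0, v_next, 15.0))  # flag event if TTC < e.g. 1.5 s
```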
Abstract:
The purpose of traffic law enforcement is to encourage compliant driver behaviour: the threat of an undesirable sanction encourages drivers to comply with traffic laws. However, not all traffic law violations are considered equal. For example, while drink driving is generally seen as socially unacceptable, behaviours such as speeding are arguably less so, and speed enforcement is often portrayed in the popular media as a means of "revenue raising". The perceived legitimacy of traffic law enforcement has received limited research attention to date. Perceived legitimacy may influence (or be influenced by) attitudes toward illegal driving behaviours, and both factors are likely to influence on-road driving behaviour. Using self-reported data from a large sample of drivers, this study explored attitudes toward a number of illegal driving behaviours and the enforcement approaches that typically target them. The results can be used to inform further research in this area, as well as the content of public education and advertising campaigns designed to influence attitudes toward illegal driving behaviours and the perceived legitimacy of traffic law enforcement.
Abstract:
Most unsignalised intersection capacity calculation procedures are based on gap acceptance models. The accuracy of critical gap estimation affects the accuracy of capacity and delay estimation. Several methods have been published to estimate drivers' sample mean critical gap, with the Maximum Likelihood Estimation (MLE) technique regarded as the most accurate. This study assesses three novel methods, the Average Central Gap (ACG) method, the Strength Weighted Central Gap (SWCG) method, and the Mode Central Gap (MCG) method, against MLE for their fidelity in rendering true sample mean critical gaps. A Monte Carlo event-based simulation model was used to draw the maximum rejected gap and the accepted gap for each of a sample of 300 drivers across 32 simulation runs. The simulation mean critical gap was varied between 3 s and 8 s, while the offered gap rate was varied between 0.05 veh/s and 0.55 veh/s. This study affirms that MLE provides a close to perfect fit to simulation mean critical gaps across a broad range of conditions. The MCG method also provides an almost perfect fit and is computationally simpler and more efficient than MLE. The SWCG method performs robustly under high flows but poorly under low to moderate flows. Further research is recommended using field traffic data, under a variety of minor-stream and major-stream flow conditions and for a variety of minor-stream movement types, to compare critical gap estimates from MLE and MCG. Should the MCG method prove as robust as MLE, serious consideration should be given to its adoption for estimating critical gap parameters in guidelines.
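The MLE benchmark above is commonly formulated by assuming log-normally distributed critical gaps and maximizing the probability that each driver's critical gap lies between their maximum rejected gap and their accepted gap. A minimal sketch under those assumptions, with synthetic gap data, follows.

```python
# Sketch: MLE of the mean critical gap from each driver's maximum rejected
# gap (r) and accepted gap (a), assuming log-normal critical gaps.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(3)
tc = rng.lognormal(np.log(4.0), 0.25, 300)     # true critical gaps (s)
r = tc - rng.uniform(0.1, 1.0, 300)            # max rejected gap < tc
a = tc + rng.uniform(0.1, 1.5, 300)            # accepted gap > tc

def neg_log_lik(theta):
    mu, sigma = theta[0], abs(theta[1])        # keep sigma positive
    F = lambda t: lognorm.cdf(t, s=sigma, scale=np.exp(mu))
    # Each driver contributes P(r_i < critical gap <= a_i).
    return -np.sum(np.log(np.clip(F(a) - F(r), 1e-12, None)))

res = minimize(neg_log_lik, x0=[np.log(4.0), 0.3], method="Nelder-Mead")
mu, sigma = res.x[0], abs(res.x[1])
print("estimated mean critical gap (s):", np.exp(mu + sigma**2 / 2))
```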