993 results for Vehicle maintenance.
Abstract:
This paper presents a case study of a design for a complete micro air vehicle thruster. Fixed-pitch small-scale rotors, brushless motors, lithium-polymer cells, and embedded control are combined to produce a mechanically simple, high-performance thruster with potentially high reliability. The custom rotor design requires a balance between manufacturing simplicity and rigidity of a blade versus its aerodynamic performance. An iterative steady-state aeroelastic simulator is used for holistic blade design. The aerodynamic load disturbances of the rotor-motor system in normal conditions are experimentally characterized. The motors require fast dynamic response for authoritative vehicle flight control. We detail a dynamic compensator that achieves satisfactory closed-loop response time. The experimental rotor-motor plant displayed satisfactory thrust performance and dynamic response.
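As a rough illustration of the dynamic compensation the abstract describes, the sketch below wraps a PI compensator around a first-order rotor-motor lag and compares open-loop and closed-loop settling. The time constant (tau = 0.08 s) and the gains are illustrative assumptions, not the paper's values.

```python
# Step response of a first-order rotor-motor lag, open loop vs. PI control.
# Plant time constant and PI gains are assumed for illustration only.

def simulate(kp=0.0, ki=0.0, tau=0.08, dt=0.001, t_end=0.5):
    """Unit-step response of a first-order plant, open loop or under PI control."""
    y, integ, out = 0.0, 0.0, []
    for _ in range(int(t_end / dt)):
        ref = 1.0
        if kp == 0.0 and ki == 0.0:
            u = ref                      # open loop: apply the setpoint directly
        else:
            err = ref - y
            integ += err * dt
            u = kp * err + ki * integ    # PI compensator
        y += dt * (u - y) / tau          # first-order rotor-motor lag
        out.append(y)
    return out

def settle_time(y, dt=0.001, tol=0.05):
    """Time after which the response stays within +/-tol of the setpoint."""
    for k in range(len(y) - 1, -1, -1):
        if abs(y[k] - 1.0) > tol:
            return (k + 1) * dt
    return 0.0

open_loop = simulate()
closed_loop = simulate(kp=8.0, ki=40.0)
```

With these assumed gains the closed loop settles noticeably faster than the bare plant, which is the kind of improvement in response time the paper's compensator targets.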
Abstract:
The Lane Change Test (LCT) is one of the growing number of methods developed to quantify driving performance degradation brought about by the use of in-vehicle devices. Beyond its validity and reliability, for such a test to be of practical use, it must also be sensitive to the varied demands of individual tasks. The current study evaluated the ability of several recent LCT lateral control and event detection parameters to discriminate between visual-manual and cognitive surrogate In-Vehicle Information System tasks with different levels of demand. Twenty-seven participants (mean age 24.4 years) completed a PC version of the LCT while performing visual search and math problem solving tasks. A number of the lateral control metrics were found to be sensitive to task differences, but the event detection metrics were less able to discriminate between tasks. The mean deviation and lane excursion measures were able to distinguish between the visual and cognitive tasks, but were less sensitive to the different levels of task demand. The other LCT metrics examined were less sensitive to task differences. A major factor influencing the sensitivity of at least some of the LCT metrics could be the type of lane change instructions given to participants. The provision of clear and explicit lane change instructions and further refinement of its metrics will be essential for increasing the utility of the LCT as an evaluation tool.
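Two of the lateral-control measures named in the abstract can be sketched concretely: mean deviation from a normative lane-change path, and lane excursion counts. The sampled paths and lane geometry below are fabricated for illustration.

```python
# Sketch of two LCT lateral-control metrics on fabricated path samples.

def mean_deviation(driven, reference):
    """Mean absolute lateral gap (m) between two equally sampled paths."""
    assert len(driven) == len(reference)
    return sum(abs(d - r) for d, r in zip(driven, reference)) / len(driven)

def lane_excursions(driven, lane_centre, half_width=1.75):
    """Count samples where the vehicle is outside the intended lane."""
    return sum(1 for d in driven if abs(d - lane_centre) > half_width)

# Normative path: hold lane 1 (y = 0 m), then change to lane 2 (y = 3.5 m).
reference = [0.0] * 50 + [3.5] * 50
# A drive under secondary-task load: late, offset lane change.
driven = [0.2] * 60 + [3.3] * 40
```

A late lane change inflates mean deviation, which is why this metric is sensitive to task demand even when the final lane position is correct.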
Abstract:
The Georgia Institute of Technology is currently performing research that will result in the development and deployment of three instrumentation packages that allow for automated capture of personal travel-related data for a given time period (up to 10 days). These three packages include: a handheld electronic travel diary (ETD) with Global Positioning System (GPS) capabilities to capture trip information for all modes of travel; a comprehensive electronic travel monitoring system (CETMS), which includes an ETD, a rugged laptop computer, a GPS receiver and antenna, and an onboard engine monitoring system, to capture all trip and vehicle information; and a passive GPS receiver, antenna, and data logger to capture vehicle trips only.
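Passive GPS loggers of the kind described record position and speed continuously, so trips must be recovered in post-processing. A common approach, sketched below, splits the trace into trips wherever the vehicle dwells below a speed threshold for longer than a dwell time. The thresholds and record format are assumptions, not Georgia Tech's actual processing rules.

```python
# Dwell-time trip segmentation of a passive GPS speed trace (illustrative).

def split_trips(records, dwell_s=120, stop_speed=1.0):
    """records: (timestamp_s, speed_m_per_s) tuples in time order.
    Returns a list of trips, each a list of moving records."""
    trips, current, stop_start = [], [], None
    for t, v in records:
        if v < stop_speed:
            if stop_start is None:
                stop_start = t               # a stop begins
            elif t - stop_start >= dwell_s and current:
                trips.append(current)        # dwell long enough: close the trip
                current = []
        else:
            stop_start = None
            current.append((t, v))
    if current:
        trips.append(current)
    return trips

# 100 s of driving, a 200 s stop, then 100 s of driving: two trips.
records = ([(t, 10.0) for t in range(0, 100, 10)]
           + [(t, 0.0) for t in range(100, 300, 10)]
           + [(t, 10.0) for t in range(300, 400, 10)])
```

The dwell threshold trades off splitting one trip at a long traffic signal against merging two genuinely separate trips.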
Abstract:
This paper presents a critical review of past research in the work-related driving field in light vehicle fleets (e.g., vehicles < 4.5 tonnes) and an intervention framework that provides future direction for practitioners and researchers. Although work-related driving crashes have become the most common cause of death, injury, and absence from work in Australia and overseas, very limited research has progressed in establishing effective strategies to improve safety outcomes. In particular, the majority of past research has been data-driven, and therefore, limited attention has been given to theoretical development in establishing the behavioural mechanism underlying driving behaviour. As such, this paper argues that to move forward in the field of work-related driving safety, practitioners and researchers need to gain a better understanding of the individual and organisational factors influencing safety through adopting relevant theoretical frameworks, which in turn will inform the development of specifically targeted theory-driven interventions. This paper presents an intervention framework that is based on relevant theoretical frameworks and sound methodological design, incorporating interventions that can be directed at the appropriate level, individual and driving target group.
Abstract:
Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts—variation over and above that accounted for by the Poisson density. The extra-variation – or dispersion – is theorized to capture unaccounted for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models—tantamount to assuming that unaccounted for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption, and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord with the exploration of additional dispersion functions and the use of an independent data set, presenting an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics.
The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e., the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted for variation in crashes across sites.
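The core finding above can be illustrated with a toy simulation: counts that are pure Poisson within each flow group look over-dispersed when the flow covariate is omitted from the mean, and approximately Poisson once it is included. Flows, site counts, and the stdlib-only Poisson sampler below are fabricated for illustration and are not the study's data or methods.

```python
import math
import random

# Apparent over-dispersion from an omitted covariate (illustrative).
random.seed(2)

def poisson(lam):
    """Knuth's Poisson sampler; adequate for the small means used here."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def var_to_mean(xs):
    """Variance-to-mean ratio; approximately 1.0 for a Poisson sample."""
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return v / m

low = [poisson(2.0) for _ in range(4000)]    # low-flow sites
high = [poisson(10.0) for _ in range(4000)]  # high-flow sites

pooled = var_to_mean(low + high)                       # flow omitted from the mean
by_group = (var_to_mean(low) + var_to_mean(high)) / 2  # flow in the mean
```

Pooling the two groups inflates the variance-to-mean ratio well above 1, mimicking "extra-variation" that vanishes once the mean structure is well specified.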
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states—perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data.
A simulation experiment is then conducted to demonstrate how crash data give rise to “excess” zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed—and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
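A toy version of that simulation argument can be written in a few lines: heterogeneous low-exposure sites produce more zeros than a single pooled-mean Poisson predicts, with no dual-state process involved. The site means and sample sizes below are fabricated and do not reproduce the paper's experiment.

```python
import math
import random

# "Excess" zeros from low exposure and heterogeneity (illustrative).
random.seed(7)

def poisson(lam):
    """Knuth's Poisson sampler (stdlib only)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Half the sites have very low exposure (mean 0.2), half moderate (mean 2.0).
counts = [poisson(0.2) for _ in range(5000)] + [poisson(2.0) for _ in range(5000)]

zero_frac = sum(c == 0 for c in counts) / len(counts)
grand_mean = sum(counts) / len(counts)
predicted_zeros = math.exp(-grand_mean)   # zeros a single pooled Poisson predicts
```

The observed zero fraction exceeds the pooled-Poisson prediction, which is exactly the pattern a ZIP/ZINB model would "explain" even though every site here is an ordinary Poisson process.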
Abstract:
Now in its sixth edition, the Traffic Engineering Handbook continues to be a must-have publication in the transportation industry, as it has been for the past 60 years. The new edition provides updated information for people entering the practice and for those already practicing. The handbook is a convenient desk reference, as well as an all-in-one source of principles and proven techniques in traffic engineering. Most chapters are presented in a new format, which divides the chapters into four areas: basics, current practice, emerging trends, and information sources. Chapter topics include road users, vehicle characteristics, statistics, planning for operations, communications, safety, regulations, traffic calming, access management, geometrics, signs and markings, signals, parking, traffic demand, maintenance and studies. In addition, as the focus in transportation has shifted from project based to operations based, two new chapters have been added: "Planning for Operations" and "Managing Traffic Demand to Address Congestion: Providing Travelers with Choices." The Traffic Engineering Handbook continues to be one of the primary reference sources for study to become a certified Professional Traffic Operations Engineer™. Chapters are authored by notable and experienced authors, and reviewed and edited by a distinguished panel of traffic engineering experts.
Abstract:
On-board mass (OBM) monitoring devices on heavy vehicles (HVs) have been tested in a national programme jointly by Transport Certification Australia Limited and the National Transport Commission. The tests covered, amongst other parameters, accuracy and tamper-evidence; the latter was assessed by deliberately tampering with the signals from OBM primary transducers during the tests. The OBM feasibility team is analysing dynamic data recorded at the primary transducers of OBM systems to determine if it can be used to detect tamper events. The tamper-evidence of current OBM systems needs to be established if jurisdictions are to have confidence in specifying OBM for HVs as part of regulatory schemes. An algorithm has been developed to detect tamper events, and the results of its application are detailed here.
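The abstract does not describe the detection algorithm itself, but one plausible tamper check on OBM data can be sketched: flag readings where the reported mass changes faster than any physically plausible loading event. The threshold, rule, and signal below are illustrative assumptions, not the actual TCA/NTC algorithm.

```python
# Hypothetical step-change tamper check on an OBM mass signal.

def tamper_events(mass_kg, max_step_kg=500.0):
    """Indices where mass jumps implausibly between successive readings."""
    return [i for i in range(1, len(mass_kg))
            if abs(mass_kg[i] - mass_kg[i - 1]) > max_step_kg]

# Steady 10 t of load, then the transducer signal is forced down to 6 t.
signal = [10000.0] * 20 + [6000.0] * 20
```

A real detector would also have to separate tampering from legitimate loading and unloading events, e.g. by cross-checking vehicle motion or axle-group consistency.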
Abstract:
In Australia, road crash trauma costs the nation A$15 billion annually, whilst the US estimates an economic impact of around US$230 billion on its network. The worldwide economic cost of road crashes is estimated to be around US$518 billion each year. Road accidents occur due to a number of factors including driver behaviour, geometric alignment, vehicle characteristics, environmental impacts, and the type and condition of the road surfacing. Skid resistance is considered one of the most important road surface characteristics because it has a direct effect on traffic safety. In 2005, Austroads (the Association of Australian and New Zealand Road Transport and Traffic Authorities) published a guideline for the management of skid resistance, and the Queensland Department of Main Roads (QDMR) developed a skid resistance management plan (SRMP). The current QDMR strategy is based on a rational analytical methodology supported by field inspection with related asset management decision tools. The Austroads guideline and QDMR's skid resistance management plan have prompted QDMR to review its skid resistance management practice. As a result, a joint research project involving QDMR, Queensland University of Technology (QUT) and the Cooperative Research Centre for Integrated Engineering Asset Management (CRC CIEAM) was formed. The research project aims to investigate whether there is a significant relationship between road crashes and skid resistance on Queensland's road networks. If there is, the current skid resistance management practice of QDMR will be reviewed and appropriate skid resistance investigatory levels will be recommended. This paper presents analysis results in assessing the relationship between wet crashes and skid resistance on Queensland roads. Attributes considered in the analysis include surface types, annual average daily traffic (AADT), speed and seal age.
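A minimal sketch of the kind of analysis described is to band road segments by skid resistance and compare wet-crash rates per band. The data values and field layout below are fabricated for illustration; real exposure would typically be traffic volume times segment length.

```python
# Wet-crash rate per skid-resistance band (fabricated example data).

def wet_crash_rate_by_band(segments, band_width=0.1):
    """segments: (skid_resistance, wet_crashes, exposure) tuples.
    Returns {band_lower_bound: wet crashes per unit exposure}."""
    crashes, exposure = {}, {}
    for sr, n, exp_ in segments:
        band = round(int(sr / band_width) * band_width, 2)
        crashes[band] = crashes.get(band, 0) + n
        exposure[band] = exposure.get(band, 0.0) + exp_
    return {b: crashes[b] / exposure[b] for b in crashes}

segments = [(0.35, 8, 2.0), (0.38, 6, 2.0),   # low skid resistance
            (0.55, 2, 2.0), (0.58, 2, 2.0)]   # high skid resistance
rates = wet_crash_rate_by_band(segments)
```

If wet-crash rates rise consistently in the low bands, the band boundary where the rise begins is a natural candidate for an investigatory level.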
Abstract:
This paper demonstrates the application of the reliability-centred maintenance (RCM) process to analyse and develop preventive maintenance tasks for electric multiple units (EMUs) in the East Rail of the Kowloon-Canton Railway Corporation (KCRC). Two systems, the 25 kV electrical power supply and the air-conditioning system of the EMU, were chosen for the study. The RCM approach to the two systems is delineated step by step in the paper. The study confirms the feasibility and effectiveness of RCM applications in the maintenance of electric trains.
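The step-by-step RCM process culminates in task selection per failure mode, which can be sketched as a small decision function. This follows the generic RCM decision diagram in outline only; the categories and policy names are simplifications, and the KCRC analysis in the paper is far more detailed.

```python
# Simplified RCM task-selection logic for one failure mode (generic outline).

def select_task(hidden, safety_or_environmental, condition_monitorable):
    """Return a default maintenance policy for one failure mode."""
    if condition_monitorable:
        return "on-condition task"           # predictive / condition-based
    if hidden:
        return "failure-finding task"        # periodic functional test
    if safety_or_environmental:
        return "scheduled restoration, discard, or redesign"
    return "run to failure"                  # economic consequences only
```

In practice each branch is itself a series of feasibility and cost-effectiveness questions, applied to every failure mode identified in the FMEA step.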