409 results for Vehicle Status.
Abstract:
This paper presents a case study of the design of a complete micro air vehicle thruster. Fixed-pitch small-scale rotors, brushless motors, lithium-polymer cells, and embedded control are combined to produce a mechanically simple, high-performance thruster with potentially high reliability. The custom rotor design requires balancing a blade's manufacturing simplicity and rigidity against its aerodynamic performance. An iterative steady-state aeroelastic simulator is used for holistic blade design. The aerodynamic load disturbances of the rotor-motor system under normal conditions are experimentally characterized. The motors require fast dynamic response for authoritative vehicle flight control; we detail a dynamic compensator that achieves a satisfactory closed-loop response time. The experimental rotor-motor plant displayed satisfactory thrust performance and dynamic response.
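Below is a minimal, purely illustrative sketch of the kind of compensated rotor-motor speed loop this abstract alludes to: a first-order plant under PI control compared with an open-loop command. The time constant, gains, and saturation limit are assumptions for illustration, not values from the paper.

```python
# Illustrative only: a first-order rotor-motor speed model with a PI
# compensator, showing how closed-loop control can shorten the thrust
# response. The time constant, gains, and saturation are hypothetical.
import numpy as np

tau = 0.15           # assumed open-loop motor/rotor time constant [s]
dt = 0.001           # simulation step [s]
kp, ki = 6.0, 40.0   # hypothetical PI gains

def simulate(closed_loop, t_end=1.0, setpoint=1.0):
    n = int(t_end / dt)
    w, integ = 0.0, 0.0          # normalized rotor speed, integrator state
    out = np.empty(n)
    for k in range(n):
        if closed_loop:
            err = setpoint - w
            integ += err * dt
            u = kp * err + ki * integ
        else:
            u = setpoint          # open loop: command the final value
        u = min(max(u, 0.0), 5.0) # crude ESC saturation
        w += dt * (u - w) / tau   # first-order plant dynamics
        out[k] = w
    return out

for name, resp in (("open-loop", simulate(False)), ("closed-loop", simulate(True))):
    t90 = np.argmax(resp >= 0.9) * dt            # time to reach 90% of setpoint
    print(f"{name} 90% rise time: {t90 * 1000:.0f} ms")
```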
Abstract:
A hip fracture causes permanent changes to the lifestyle of older people. Two important indicators of post-operative mortality for this group are the time from fracture to surgery and pre-operative health status, yet no research has investigated the relationship between time to surgery and health status. The researchers aimed to establish the health status risks for patients aged over 65 years with a non-pathological hip fracture, to guide nursing care interventions. A prospective cohort design was used to investigate relationships between time to surgery and measures on pre-operative health status indicators including skin integrity risk, vigor, mental state, bowel function, and continence. Twenty-nine patients with a mean age of 81.93 years (SD 9.49) were recruited. The mean time from the Time 1 assessment to surgery was 52.72 hours (SD 58.35), with a range of 1 to 219 hours. At Time 2, the mean scores for vigor and skin integrity risk were significantly higher, indicating poorer health status. A change in health status occurred, but, possibly because of the small sample size, it was difficult to relate this result to time to surgery. However, the results inform pre-operative care for this group.
Abstract:
The Lane Change Test (LCT) is one of a growing number of methods developed to quantify the driving performance degradation brought about by the use of in-vehicle devices. Beyond its validity and reliability, for such a test to be of practical use, it must also be sensitive to the varied demands of individual tasks. The current study evaluated the ability of several recent LCT lateral control and event detection parameters to discriminate between visual-manual and cognitive surrogate In-Vehicle Information System tasks with different levels of demand. Twenty-seven participants (mean age 24.4 years) completed a PC version of the LCT while performing visual search and math problem solving tasks. A number of the lateral control metrics were found to be sensitive to task differences, but the event detection metrics were less able to discriminate between tasks. The mean deviation and lane excursion measures were able to distinguish between the visual and cognitive tasks, but were less sensitive to the different levels of task demand. The other LCT metrics examined were less sensitive to task differences. A major factor influencing the sensitivity of at least some of the LCT metrics could be the type of lane change instructions given to participants. The provision of clear and explicit lane change instructions, together with further refinement of the test's metrics, will be essential for increasing the utility of the LCT as an evaluation tool.
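As a hedged illustration of the lateral-control measures compared in this abstract, the sketch below computes a mean-deviation score and a simple lane-excursion count from a driven path versus a normative path. The normative path, sampling, and excursion threshold are invented for the example and are not the exact LCT definitions.

```python
# Illustrative only: a mean-deviation score and lane-excursion count computed
# from a driven path versus a normative path. The normative path and the
# excursion threshold are invented, not the exact LCT definitions.
import numpy as np

def mean_deviation(driven_y, normative_y):
    """Mean absolute lateral deviation between driven and normative paths [m]."""
    return float(np.mean(np.abs(np.asarray(driven_y) - np.asarray(normative_y))))

def excursion_samples(driven_y, normative_y, threshold=1.75):
    """Count samples deviating from the intended path by more than a half lane."""
    return int(np.sum(np.abs(np.asarray(driven_y) - np.asarray(normative_y)) > threshold))

# Toy example: a late, sluggish lane change relative to the normative path
x = np.linspace(0, 100, 200)                   # longitudinal position [m]
normative = np.where(x < 40, 0.0, 3.5)         # ideal change at the sign [m]
driven = 3.5 / (1 + np.exp(-(x - 55) / 5))     # delayed, smoothed change [m]
print("mean deviation [m]:", round(mean_deviation(driven, normative), 2))
print("excursion samples:", excursion_samples(driven, normative))
```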
Abstract:
The Georgia Institute of Technology is currently performing research that will result in the development and deployment of three instrumentation packages that allow for automated capture of personal travel-related data for a given time period (up to 10 days). The three packages are: a handheld electronic travel diary (ETD) with Global Positioning System (GPS) capabilities to capture trip information for all modes of travel; a comprehensive electronic travel monitoring system (CETMS), which includes an ETD, a rugged laptop computer, a GPS receiver and antenna, and an onboard engine monitoring system, to capture all trip and vehicle information; and a passive GPS receiver, antenna, and data logger to capture vehicle trips only.
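As a rough, hypothetical sketch of what processing a passive GPS log into vehicle trips can look like, the code below segments a stream of fixes into trips using a dwell-time rule. The record format, speed threshold, and 120-second dwell value are assumptions; they do not describe the Georgia Tech instrumentation packages.

```python
# Hypothetical illustration only: segmenting a passive GPS log into vehicle
# trips with a dwell-time rule (a new trip starts after the vehicle has been
# stationary longer than a threshold). The record format, 0.5 m/s stop speed,
# and 120 s dwell value are assumptions, not the packages' actual logic.
from dataclasses import dataclass
from typing import List

@dataclass
class GpsFix:
    t: float      # timestamp [s]
    lat: float
    lon: float
    speed: float  # [m/s]

def segment_trips(fixes: List[GpsFix], stop_speed=0.5, dwell_s=120.0):
    trips, current, stop_start = [], [], None
    for fix in fixes:
        if fix.speed < stop_speed:
            stop_start = fix.t if stop_start is None else stop_start
            if current and fix.t - stop_start > dwell_s:
                trips.append(current)   # close the trip at a long stop
                current = []
        else:
            stop_start = None
            current.append(fix)
    if current:
        trips.append(current)
    return trips
```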
Abstract:
This paper presents a critical review of past research in the work-related driving field in light vehicle fleets (e.g., vehicles < 4.5 tonnes) and an intervention framework that provides future direction for practitioners and researchers. Although work-related driving crashes have become the most common cause of death, injury, and absence from work in Australia and overseas, little research has established effective strategies to improve safety outcomes. In particular, the majority of past research has been data-driven, and therefore limited attention has been given to theoretical development in establishing the behavioural mechanisms underlying driving behaviour. As such, this paper argues that to move forward in the field of work-related driving safety, practitioners and researchers need to gain a better understanding of the individual and organisational factors influencing safety by adopting relevant theoretical frameworks, which in turn will inform the development of specifically targeted, theory-driven interventions. The paper presents an intervention framework that is based on relevant theoretical frameworks and sound methodological design, incorporating interventions that can be directed at the appropriate level, individual, and driving target group.
Abstract:
Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade many crash models have accounted for extra-variation in crash counts—variation over and above that accounted for by the Poisson density. The extra-variation – or dispersion – is theorized to capture unaccounted for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models—tantamount to assuming that unaccounted for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption, and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord by exploring additional dispersion functions and using an independent data set, presenting an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov Chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics. The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences on expected crash counts are likely to be different from the factors that might help to explain unaccounted for variation in crashes across sites.
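For readers unfamiliar with the dispersion parameter discussed in this abstract, the short sketch below simulates negative binomial (Poisson-gamma) crash counts and checks the variance relationship Var(y) = mu + alpha*mu^2 that the fixed- versus covariate-dependent dispersion debate concerns. All numbers are illustrative; they are not the Georgia intersection models.

```python
# Illustrative only: negative binomial (Poisson-gamma) crash counts at a
# hypothetical site, showing the variance relationship Var(y) = mu + alpha*mu^2
# behind the dispersion-parameter discussion. Values are not from the paper.
import numpy as np

rng = np.random.default_rng(1)

def nb_counts(mu, alpha, size):
    """Draw negative binomial counts via the Poisson-gamma mixture."""
    g = rng.gamma(shape=1.0 / alpha, scale=alpha, size=size)  # mean 1, var alpha
    return rng.poisson(mu * g)

mu = 3.0                        # assumed expected crashes per period
for alpha in (0.1, 0.5, 1.0):   # increasing extra-Poisson dispersion
    y = nb_counts(mu, alpha, size=50_000)
    print(f"alpha={alpha:3.1f}  empirical var={y.var():5.2f}  "
          f"mu + alpha*mu^2={mu + alpha * mu**2:5.2f}  (Poisson variance: {mu})")

# A covariate-dependent dispersion function, as explored in the study, would
# replace the constant alpha with something like alpha_i = exp(g0 + g1 * x_i).
```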
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states—perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed—and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
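In the spirit of the simulation experiment this abstract describes (though not a reproduction of it), the sketch below generates crash counts from ordinary heterogeneous Poisson processes with no "safe" state and shows how short observation periods produce a preponderance of zeros, and more zeros than a single fitted Poisson predicts. All parameter values are invented for illustration.

```python
# Illustrative only: counts generated from ordinary Poisson trials at sites
# with heterogeneous crash rates (no "perfectly safe" state). Short
# observation windows yield a preponderance of zeros, and more zeros than a
# single fitted Poisson predicts. Parameter values are invented.
import numpy as np

rng = np.random.default_rng(7)
n_sites = 10_000
site_rate = rng.lognormal(mean=-1.0, sigma=1.0, size=n_sites)  # crashes/year

for years in (0.25, 1.0, 5.0):                   # time scale of aggregation
    y = rng.poisson(site_rate * years)
    fitted_mean = y.mean()                       # single-mean Poisson "fit"
    predicted_zeros = np.exp(-fitted_mean)       # P(y=0) under that fit
    observed_zeros = np.mean(y == 0)
    print(f"{years:4.2f} yr: observed P(0) = {observed_zeros:.2f}, "
          f"Poisson-predicted P(0) = {predicted_zeros:.2f}")
```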
Abstract:
On-board mass (OBM) monitoring devices on heavy vehicles (HVs) have been tested in a national programme run jointly by Transport Certification Australia Limited and the National Transport Commission. The tests assessed, amongst other parameters, accuracy and tamper-evidence, the latter by deliberately tampering with the signals from OBM primary transducers during the tests. The OBM feasibility team is analysing dynamic data recorded at the primary transducers of OBM systems to determine whether it can be used to detect tamper events. The tamper-evidence of current OBM systems needs to be established if jurisdictions are to have confidence in specifying OBM for HVs as part of regulatory schemes. An algorithm has been developed to detect tamper events, and the results of its application are detailed here.
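The abstract does not describe the tamper-detection algorithm itself, so the following is only a hypothetical illustration of one simple heuristic on OBM transducer data: flagging windows whose dynamic content collapses, as it might if a live load-cell signal were replaced by a fixed value. The sampling rate, window length, and threshold are assumptions.

```python
# Hypothetical illustration only (the paper does not describe its algorithm):
# flag windows of an on-board mass signal whose variation is suspiciously low,
# as it might be if a live load-cell signal were replaced by a fixed value.
# Sampling rate, window length, and threshold are assumptions.
import numpy as np

def flag_static_windows(signal, fs=50.0, window_s=10.0, min_std=0.02):
    """Return start times [s] of windows with suspiciously little variation."""
    n = int(fs * window_s)
    flags = []
    for start in range(0, len(signal) - n + 1, n):
        if np.std(signal[start:start + n]) < min_std:
            flags.append(start / fs)
    return flags

# Toy trace: 60 s of normal dynamic loading followed by 30 s of a frozen value
rng = np.random.default_rng(0)
normal = 10.0 + 0.3 * rng.standard_normal(int(50 * 60))  # tonnes, with dynamics
frozen = np.full(int(50 * 30), 10.0)                     # tampered segment
trace = np.concatenate([normal, frozen])
print("suspect windows start at [s]:", flag_static_windows(trace))
```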