292 results for Portmanteau test statistics
Abstract:
‘Hooning’ constitutes a set of illegal and high-risk vehicle-related activities typically performed by males aged 17-25, a group that is over-represented in road trauma statistics. This study used an online survey of 422 participants to test the efficacy of the Five Factor Model of Personality in predicting ‘loss of traction’ (LOT) hooning behaviour. Drivers who engaged in LOT behaviour scored significantly lower on the factor of Agreeableness than those who did not. Regression analyses indicated that the Five Factor Model of Personality was a significant predictor of LOT behaviour over and above sex and age, although Agreeableness was the only significant personality factor in the model. The findings may be used to better understand those drivers likely to engage in LOT behaviours. Road safety advertising and educational campaigns can target less socially agreeable drivers and aim to encourage more agreeable attitudes to driving, particularly for younger male drivers.
Abstract:
The Lane Change Test (LCT) is one of the growing number of methods developed to quantify driving performance degradation brought about by the use of in-vehicle devices. Beyond its validity and reliability, for such a test to be of practical use, it must also be sensitive to the varied demands of individual tasks. The current study evaluated the ability of several recent LCT lateral control and event detection parameters to discriminate between visual-manual and cognitive surrogate In-Vehicle Information System tasks with different levels of demand. Twenty-seven participants (mean age 24.4 years) completed a PC version of the LCT while performing visual search and math problem solving tasks. A number of the lateral control metrics were found to be sensitive to task differences, but the event detection metrics were less able to discriminate between tasks. The mean deviation and lane excursion measures were able to distinguish between the visual and cognitive tasks, but were less sensitive to the different levels of task demand. The other LCT metrics examined were less sensitive to task differences. A major factor influencing the sensitivity of at least some of the LCT metrics could be the type of lane change instructions given to participants. The provision of clear and explicit lane change instructions, together with further refinement of the LCT's metrics, will be essential for increasing its utility as an evaluation tool.
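The mean deviation and lane excursion measures mentioned above are lateral-control metrics computed from the sampled lateral position of the vehicle. A minimal sketch of how such metrics might be computed follows; the function names, the normative-path representation, and the lane half-width are illustrative assumptions, not the study's (or the standard's) reference definitions.

```python
import numpy as np

def mean_deviation(driven_lat, normative_lat):
    """Mean absolute lateral deviation between the driven path and a
    normative (ideal) lane-change path, sampled at the same longitudinal
    positions."""
    driven_lat = np.asarray(driven_lat, dtype=float)
    normative_lat = np.asarray(normative_lat, dtype=float)
    return float(np.mean(np.abs(driven_lat - normative_lat)))

def lane_excursion_count(driven_lat, lane_centres, half_lane_width=1.75):
    """Count samples where the vehicle strays beyond the boundary of the
    lane it should occupy (half_lane_width in metres is an assumed value,
    not taken from the study)."""
    offset = np.abs(np.asarray(driven_lat) - np.asarray(lane_centres))
    return int(np.sum(offset > half_lane_width))

# Toy usage with simulated samples.
rng = np.random.default_rng(0)
normative = np.zeros(500)                       # ideal path: centre of the current lane
driven = normative + rng.normal(0, 0.4, size=500)
print(mean_deviation(driven, normative))
print(lane_excursion_count(driven, normative))
```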
Abstract:
Most statistical methods use hypothesis testing. Analysis of variance, regression, discrete choice models, contingency tables, and other analysis methods commonly used in transportation research share hypothesis testing as the means of making inferences about the population of interest. Despite the fact that hypothesis testing has been a cornerstone of empirical research for many years, various aspects of hypothesis tests are commonly applied incorrectly, misinterpreted, and ignored by novices and expert researchers alike. At first glance, hypothesis testing appears straightforward: develop the null and alternative hypotheses, compute the test statistic to compare to a standard distribution, estimate the probability of rejecting the null hypothesis, and then make claims about the importance of the finding. This is an oversimplification of the process of hypothesis testing. Hypothesis testing as applied in empirical research is examined here. The reader is assumed to have a basic knowledge of the role of hypothesis testing in various statistical methods. Through the use of an example, the mechanics of hypothesis testing are first reviewed. Then, five precautions surrounding the use and interpretation of hypothesis tests are developed; examples of each are provided to demonstrate how errors are made, and solutions are identified so similar errors can be avoided. Remedies are provided for common errors, and conclusions are drawn on how to use the results of this paper to improve the conduct of empirical research in transportation.
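As an illustration of the workflow the paper reviews (not the paper's own example), a minimal sketch of a two-sample hypothesis test on simulated travel-time data follows, with the usual caution that a small p-value quantifies evidence against the null hypothesis rather than the practical importance of the finding.

```python
# A minimal sketch of the standard hypothesis-testing workflow on simulated
# data (illustrative only; not the example used in the paper).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# H0: mean travel time is equal at two intersections; H1: it differs.
site_a = rng.normal(52.0, 8.0, size=40)   # simulated travel times (s)
site_b = rng.normal(48.0, 8.0, size=40)

# Test statistic and p-value (Welch's t-test, unequal variances).
t_stat, p_value = stats.ttest_ind(site_a, site_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

alpha = 0.05
if p_value < alpha:
    print("Reject H0 at the 5% level.")
else:
    print("Fail to reject H0 at the 5% level.")

# Note: a small p-value is not a measure of effect size or practical
# importance - report the estimated difference and its uncertainty too.
print(f"estimated mean difference: {site_a.mean() - site_b.mean():.1f} s")
```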
Abstract:
Purpose - Building project management (BPM) requires effective coordination and collaboration between multiple project team organisations, which can be achieved by real-time information flow between all participants. In the present scenario, this can be achieved by the use of information and communication technologies (ICT). The purpose of this paper is to present part of a research project conducted to study the causal relationships between factors affecting ICT adoption for BPM by small and medium enterprises. Design/methodology/approach - This paper discusses the structural equation modelling (SEM) analysis conducted to test the causal relationships between quantitative factors. Data for the quantitative analysis were gathered through a questionnaire survey conducted in the Indian construction industry. Findings - The SEM analysis results demonstrate that an increased and more mature use of ICT for general administration within the organisation would lead to: an improved ICT infrastructure within the organisation; the development of electronic databases; and a staff that is confident in using information technology (IT) tools. In such a scenario, staff would use advanced software and IT technologies for project management (PM) processes, leading to an increased adoption of ICT for PM processes. ICT adoption for general administration would, in turn, be enhanced if the organisation interacts more with geographically separated agencies and if senior management perceives that significant benefits would accrue from the adoption of ICT. All the factors are inter-related, and their effect cannot be maximized in isolation. Originality/value - The results provide direction to building project management organisations for strategically adopting the effective use of ICT within their organisations and for BPM in general.
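For context on the kind of analysis described, here is a minimal sketch of specifying and estimating causal paths between hypothesised factors in Python with the semopy package (lavaan-style model syntax). The variable names and simulated survey scores are illustrative assumptions; they are not the constructs, data, or tools used in the paper.

```python
# A minimal sketch of path analysis with SEM software; simulated data only.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(11)
n = 150
external_interaction = rng.normal(0, 1, n)
management_support = rng.normal(0, 1, n)
ict_general_admin = 0.5 * external_interaction + 0.4 * management_support + rng.normal(0, 1, n)
ict_infrastructure = 0.6 * ict_general_admin + rng.normal(0, 1, n)
staff_it_confidence = 0.5 * ict_general_admin + rng.normal(0, 1, n)
ict_pm_adoption = 0.4 * ict_infrastructure + 0.3 * staff_it_confidence + rng.normal(0, 1, n)

survey = pd.DataFrame({
    "external_interaction": external_interaction,
    "management_support": management_support,
    "ict_general_admin": ict_general_admin,
    "ict_infrastructure": ict_infrastructure,
    "staff_it_confidence": staff_it_confidence,
    "ict_pm_adoption": ict_pm_adoption,
})

# Hypothesised causal paths, written in lavaan-style regression syntax.
model_desc = """
ict_general_admin ~ external_interaction + management_support
ict_infrastructure ~ ict_general_admin
staff_it_confidence ~ ict_general_admin
ict_pm_adoption ~ ict_infrastructure + staff_it_confidence
"""

model = semopy.Model(model_desc)
model.fit(survey)
print(model.inspect())   # path coefficients, standard errors, p-values
```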
Abstract:
Now in its sixth edition, the Traffic Engineering Handbook continues to be a must-have publication in the transportation industry, as it has been for the past 60 years. The new edition provides updated information for people entering the practice and for those already practicing. The handbook is a convenient desk reference, as well as an all-in-one source of principles and proven techniques in traffic engineering. Most chapters are presented in a new format, which divides the chapters into four areas: basics, current practice, emerging trends and information sources. Chapter topics include road users, vehicle characteristics, statistics, planning for operations, communications, safety, regulations, traffic calming, access management, geometrics, signs and markings, signals, parking, traffic demand, maintenance and studies. In addition, as the focus in transportation has shifted from project-based to operations-based, two new chapters have been added: "Planning for Operations" and "Managing Traffic Demand to Address Congestion: Providing Travelers with Choices." The Traffic Engineering Handbook continues to be one of the primary reference sources for study to become a certified Professional Traffic Operations Engineer™. Chapters are authored by notable and experienced authors, and reviewed and edited by a distinguished panel of traffic engineering experts.
Abstract:
The design of driven pile foundations involves an iterative process requiring an initial estimate of the refusal level to determine the depth of boreholes for subsequent analyses. Current procedures for determining borehole depths incorporate parameters typically unknown at the investigation stage. Thus, a quantifiable procedure more applicable at this preliminary stage would provide greater confidence in estimating the founding level of driven piles. This paper examines the effectiveness of the Standard Penetration Test (SPT) in directly estimating driven pile refusal levels. A number of significant correlations were obtained between SPT information and pile penetration records, demonstrating the potential application of the SPT. Results indicated pile penetration was generally best described as a function of both the pile toe and cumulative shaft SPT values. The influence of the toe SPT increased when piles penetrated rock. A refusal criterion was established from the results to guide both the estimation of borehole depths and likely pile lengths during the design stage.
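By way of illustration (not the paper's actual models or data), a minimal sketch of correlating pile penetration records with toe and cumulative shaft SPT values, and fitting a simple joint least-squares relationship, might look like this; the simulated records and coefficients are assumptions.

```python
# A minimal sketch: correlation and least-squares fit of pile penetration
# against toe and cumulative shaft SPT values, on simulated records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_piles = 40
toe_spt = rng.uniform(10, 60, size=n_piles)        # SPT N near the pile toe
shaft_spt = rng.uniform(100, 600, size=n_piles)    # cumulative shaft SPT N
# Simulated penetration: deeper where toe and shaft resistance are lower.
penetration = 25 - 0.15 * toe_spt - 0.02 * shaft_spt + rng.normal(0, 1.5, n_piles)

# Strength of the individual correlations.
r_toe, p_toe = stats.pearsonr(toe_spt, penetration)
r_shaft, p_shaft = stats.pearsonr(shaft_spt, penetration)
print(f"toe SPT:   r = {r_toe:.2f} (p = {p_toe:.3f})")
print(f"shaft SPT: r = {r_shaft:.2f} (p = {p_shaft:.3f})")

# Penetration described as a joint function of toe and cumulative shaft SPT.
X = np.column_stack([np.ones(n_piles), toe_spt, shaft_spt])
coeffs, *_ = np.linalg.lstsq(X, penetration, rcond=None)
print("intercept, toe coefficient, shaft coefficient:", np.round(coeffs, 3))
```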
Abstract:
There is currently a migration trend from traditional electrical supervisory control and data acquisition (SCADA) systems towards a smart grid based approach to critical infrastructure management. This project provides an evaluation of existing and proposed implementations for both traditional electrical SCADA and smart grid based architectures, and proposes a set of reference requirements that test bed implementations should satisfy. A high-level design for smart grid test beds is proposed and an initial implementation performed, based on the proposed design, using open source and freely available software tools. The project examines the move towards smart grid based critical infrastructure management and illustrates the increased security requirements. The implemented test bed provides a basic framework for testing network requirements in a smart grid environment, as well as a platform for further research and development, particularly to develop, implement and test network security capabilities such as intrusion detection and network forensics. The project proposes and develops an architecture for emulating some smart grid functionality. The Common Open Research Emulator (CORE) platform was used to emulate the communication network of the smart grid; specifically, CORE was used to virtualise and emulate the TCP/IP networking stack. This is intended to be used for further evaluation and analysis, for example the analysis of application protocol messages. As a proof of concept, software libraries were designed, developed and documented to enable and support the design and development of further emulated smart grid components, such as reclosers, switches and smart meters. As part of the testing and evaluation, a Modbus based smart meter emulator was developed to provide the basic functionality of a smart meter. Further code was developed to send Modbus request messages to the emulated smart meter and receive Modbus responses from it. Although the functionality of the emulated components was limited, it provides a starting point for further research and development, and the design is extensible to enable the design and implementation of additional SCADA protocols. The project also defines evaluation criteria for the implemented test bed, and experiments are designed to evaluate the test bed according to the defined criteria. The results of the experiments are collated and presented, and conclusions drawn from the results to facilitate discussion of the test bed implementation. The discussion also presents possible future work.
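As an illustration of the kind of Modbus request/response exchange described (not the project's actual test code), a minimal sketch of building and sending a Modbus TCP "read holding registers" request to an emulated smart meter follows; the host, port, unit ID and register addresses are assumptions, and only the standard library is used.

```python
# A minimal sketch of a Modbus TCP client frame, built by hand from the
# MBAP header plus a "read holding registers" (0x03) PDU.
import socket
import struct

def read_holding_registers(host, port=502, unit_id=1, start_addr=0, count=2):
    # MBAP header: transaction id, protocol id (0), remaining length, unit id.
    transaction_id = 1
    pdu = struct.pack(">BHH", 0x03, start_addr, count)     # function 0x03
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(mbap + pdu)
        response = sock.recv(256)
    # Response: 7-byte MBAP, then function code, byte count, register values.
    byte_count = response[8]
    values = struct.unpack(">" + "H" * (byte_count // 2),
                           response[9:9 + byte_count])
    return values

if __name__ == "__main__":
    # Assumes an emulated meter is listening locally on the default Modbus port.
    print(read_holding_registers("127.0.0.1", 502, unit_id=1, start_addr=0, count=2))
```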
Abstract:
We estimate the parameters of a stochastic process model for a macroparasite population within a host using approximate Bayesian computation (ABC). The immunity of the host is an unobserved model variable and only mature macroparasites at sacrifice of the host are counted. With very limited data, process rates are inferred reasonably precisely. Modeling involves a three-variable Markov process for which the observed data likelihood is computationally intractable. ABC methods are particularly useful when the likelihood is analytically or computationally intractable. The ABC algorithm we present is based on sequential Monte Carlo, is adaptive in nature, and overcomes some drawbacks of previous approaches to ABC. The algorithm is validated on a test example involving simulated data from an autologistic model before being used to infer parameters of the Markov process model for experimental data. The fitted model explains the observed extra-binomial variation in terms of a zero-one immunity variable, which has a short-lived presence in the host.
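For readers unfamiliar with ABC, a minimal sketch of plain rejection ABC on a toy Poisson problem follows. The paper's algorithm is an adaptive sequential Monte Carlo variant and is considerably more involved; the model, prior, summary statistic and tolerance below are purely illustrative assumptions.

```python
# A minimal sketch of rejection ABC: simulate from the model, keep parameter
# draws whose simulated summary statistic lies close to the observed one.
# The likelihood itself is never evaluated.
import numpy as np

rng = np.random.default_rng(42)

# "Observed" data: counts from a process with an unknown rate parameter.
true_rate = 3.0
observed = rng.poisson(true_rate, size=50)
obs_summary = observed.mean()                 # summary statistic

def simulate(rate, size=50):
    """Forward-simulate the model."""
    return rng.poisson(rate, size=size)

accepted = []
tolerance = 0.2
for _ in range(50_000):
    rate = rng.uniform(0.0, 10.0)             # draw from the prior
    if abs(simulate(rate).mean() - obs_summary) <= tolerance:
        accepted.append(rate)

posterior = np.array(accepted)
print(f"accepted {posterior.size} draws; posterior mean {posterior.mean():.2f}, "
      f"95% interval ({np.percentile(posterior, 2.5):.2f}, "
      f"{np.percentile(posterior, 97.5):.2f})")
```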
Abstract:
OBJECTIVE: The accurate quantification of human diabetic neuropathy is important to define at-risk patients, anticipate deterioration, and assess new therapies.
RESEARCH DESIGN AND METHODS: A total of 101 diabetic patients and 17 age-matched control subjects underwent neurological evaluation, neurophysiology tests, quantitative sensory testing, and evaluation of corneal sensation and corneal nerve morphology using corneal confocal microscopy (CCM).
RESULTS: Corneal sensation decreased significantly (P = 0.0001) with increasing neuropathic severity and correlated with the neuropathy disability score (NDS) (r = 0.441, P < 0.0001). Corneal nerve fiber density (NFD) (P < 0.0001), nerve fiber length (NFL) (P < 0.0001), and nerve branch density (NBD) (P < 0.0001) decreased significantly with increasing neuropathic severity and correlated with NDS (NFD r = −0.475, P < 0.0001; NBD r = −0.511, P < 0.0001; and NFL r = −0.581, P < 0.0001). NBD and NFL demonstrated a significant and progressive reduction with worsening heat pain thresholds (P = 0.01). Receiver operating characteristic curve analysis for the diagnosis of neuropathy (NDS >3) defined an NFD of <27.8/mm2 with a sensitivity of 0.82 (95% CI 0.68–0.92) and specificity of 0.52 (0.40–0.64), and for detecting patients at risk of foot ulceration (NDS >6) defined an NFD cutoff of <20.8/mm2 with a sensitivity of 0.71 (0.42–0.92) and specificity of 0.64 (0.54–0.74).
CONCLUSIONS: CCM is a noninvasive clinical technique that may be used to detect early nerve damage and stratify diabetic patients with increasing neuropathic severity. Established diabetic neuropathy leads to pain and foot ulceration. Detecting neuropathy early may allow intervention with treatments to slow or reverse this condition (1). Recent studies suggested that small unmyelinated C-fibers are damaged early in diabetic neuropathy (2–4) but can only be detected using invasive procedures such as sural nerve biopsy (4,5) or skin-punch biopsy (6–8). Our studies have shown that corneal confocal microscopy (CCM) can identify early small nerve fiber damage and accurately quantify the severity of diabetic neuropathy (9–11). We have also shown that CCM relates to intraepidermal nerve fiber loss (12) and a reduction in corneal sensitivity (13) and detects early nerve fiber regeneration after pancreas transplantation (14). Recently we have also shown that CCM detects nerve fiber damage in patients with Fabry disease (15) and idiopathic small fiber neuropathy (16) when results of electrophysiology tests and quantitative sensory testing (QST) are normal. In this study we assessed corneal sensitivity and corneal nerve morphology using CCM in diabetic patients stratified for the severity of diabetic neuropathy using neurological evaluation, electrophysiology tests, and QST. This enabled us to compare CCM and corneal esthesiometry with established tests of diabetic neuropathy and define their sensitivity and specificity to detect diabetic patients with early neuropathy and those at risk of foot ulceration.
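To make the cutoff analysis concrete, a minimal sketch follows of computing sensitivity and specificity for an NFD threshold and sweeping thresholds to trace a simple ROC curve. The simulated NFD values are assumptions and do not reproduce the study's data or its confidence intervals.

```python
# A minimal sketch of sensitivity/specificity at a diagnostic cutoff and a
# hand-rolled ROC sweep, on simulated corneal nerve fibre density (NFD) data.
import numpy as np

rng = np.random.default_rng(1)

# Simulated NFD measurements (fibres/mm^2): lower values with neuropathy.
nfd_neuropathy = rng.normal(22, 8, size=60)      # patients with NDS > 3
nfd_no_neuropathy = rng.normal(32, 8, size=60)   # patients with NDS <= 3

def sens_spec_at_cutoff(positives, negatives, cutoff):
    """Test is 'positive' when NFD falls below the cutoff."""
    sensitivity = np.mean(positives < cutoff)    # true positives / all diseased
    specificity = np.mean(negatives >= cutoff)   # true negatives / all healthy
    return sensitivity, specificity

sens, spec = sens_spec_at_cutoff(nfd_neuropathy, nfd_no_neuropathy, cutoff=27.8)
print(f"cutoff 27.8/mm^2: sensitivity {sens:.2f}, specificity {spec:.2f}")

# Sweep cutoffs to trace an ROC curve and approximate its area by trapezoids.
cutoffs = np.linspace(0, 60, 601)
curve = np.array([sens_spec_at_cutoff(nfd_neuropathy, nfd_no_neuropathy, c)
                  for c in cutoffs])
tpr, fpr = curve[:, 0], 1 - curve[:, 1]          # fpr increases with the cutoff
auc = float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))
print(f"approximate AUC: {auc:.2f}")
```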
Abstract:
While in many travel situations consumers have an almost limitless range of destinations to choose from, their actual decision set will usually only comprise between two and six destinations. One of the greatest challenges facing destination marketers is positioning their destination, against the myriad of competing places that offer similar features, into consumer decision sets. Since positioning requires a narrow focus, marketing communications must present a succinct and meaningful proposition, the selection of which is often problematic for destination marketing organisations (DMOs), which deal with a diverse and often eclectic range of attributes in addition to numerous self-interested and demanding stakeholders. This paper reports the application of two qualitative techniques used to explore the range of cognitive attributes, consequences and personal values that represent potential positioning opportunities in the context of short break holidays. The Repertory Test is an effective technique for understanding the salient attributes used by a traveller to differentiate destinations, while Laddering Analysis enables the researcher to explore the smaller set of personal values guiding such decision making. A key finding of the research was that while individuals might vary in their repertoire of salient attributes, there was a commonality of shared consequences and values.
Abstract:
Process modeling is an emergent area of Information Systems research that is characterized through an abundance of conceptual work with little empirical research. To fill this gap, this paper reports on the development and validation of an instrument to measure user acceptance of process modeling grammars. We advance an extended model for a multi-stage measurement instrument development procedure, which incorporates feedback from both expert and user panels. We identify two main contributions: First, we provide a validated measurement instrument for the study of user acceptance of process modeling grammars, which can be used to assist in further empirical studies that investigate phenomena associated with the business process modeling domain. Second, in doing so, we describe in detail a procedural model for developing measurement instruments that ensures high levels of reliability and validity, which may assist fellow scholars in executing their empirical research.
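The paper does not detail its reliability computations, but one common reliability check for a multi-item measurement instrument is Cronbach's alpha. A minimal sketch over simulated item scores follows; the item matrix, scale and construct are assumptions, not the paper's instrument or data.

```python
# A minimal sketch of Cronbach's alpha as an internal-consistency check for
# a multi-item scale, computed on simulated Likert responses.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Simulated 7-point Likert responses for a 5-item construct.
rng = np.random.default_rng(7)
latent = rng.normal(0, 1, size=(200, 1))          # shared underlying construct
noise = rng.normal(0, 0.8, size=(200, 5))         # item-specific noise
responses = np.clip(np.round(4 + latent + noise), 1, 7)
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```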
Abstract:
It is important to understand how student performance is affected when higher education is delivered via new technologies. Podcasting is a relatively recent technology that is gaining widespread use across the world. We present the results of a quasi-experimental research project which finds that when podcasts are used as a revision tool, student performance in Accounting improves. We highlight that aligning podcast use with pedagogical design is important, and discuss constraints on and barriers to the use of podcasting in higher education.
Abstract:
The development and use of a virtual assessment tool for a signal processing unit is described. It allows students to take a test from anywhere using a web browser to connect to the university server that hosts the test. While student responses are of the multiple-choice type, students have to work out problems to arrive at the answer to be entered. CGI programming is used to verify student identification information and record their scores, as well as provide immediate feedback after the test is complete. The tool has been used at QUT for the past three years, and student feedback is discussed. The virtual assessment tool is an efficient alternative to marking written assignment reports, which can often take a lecturer or tutor more hours than actual lecture hall contact. It is especially attractive for the very large first- and second-year classes that are now the norm at many universities.
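As a rough illustration of the CGI pattern described (the original tool's implementation language and details are not given), a minimal sketch of a CGI script that verifies a student ID and records a score might look like this; the field names, ID list and results file are assumptions.

```python
#!/usr/bin/env python3
# A minimal sketch of a CGI endpoint: read form data from the CGI
# environment, check the student ID, record the score, and return
# immediate feedback as HTML. Illustrative only.
import os
import sys
import urllib.parse

VALID_IDS = {"n1234567", "n7654321"}       # hypothetical enrolled student IDs
RESULTS_FILE = "/tmp/test_scores.csv"      # hypothetical score log

def read_form():
    """Read URL-encoded form data the way a CGI script receives it."""
    if os.environ.get("REQUEST_METHOD", "GET") == "POST":
        length = int(os.environ.get("CONTENT_LENGTH", 0) or 0)
        raw = sys.stdin.read(length)
    else:
        raw = os.environ.get("QUERY_STRING", "")
    return {k: v[0] for k, v in urllib.parse.parse_qs(raw).items()}

form = read_form()
student_id = form.get("student_id", "").strip().lower()
score = form.get("score", "")

print("Content-Type: text/html\n")         # CGI response header, then body
if student_id not in VALID_IDS:
    print("<p>Unknown student ID - please check your details.</p>")
else:
    with open(RESULTS_FILE, "a") as fh:
        fh.write(f"{student_id},{score}\n")
    print(f"<p>Thanks {student_id}, your score of {score} has been recorded.</p>")
```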
Abstract:
Increasingly, large amounts of public and private money are being invested in education and, as a result, schools are becoming more accountable to stakeholders for this financial input. In terms of the curriculum, governments worldwide are frequently tying school funding to students' and schools' academic performances, which are monitored through high-stakes testing programs. To accommodate the resultant pressures from these testing initiatives, many principals are re-focussing their school's curriculum on the testing requirements. Such a re-focussing, which was examined critically in this thesis, constituted an externally facilitated rapid approach to curriculum change. In line with previously enacted change theories and recommendations from these, curriculum change in schools has tended to be a fairly slow, considered, collaborative process that is facilitated internally by a deputy-principal (curriculum). However, theoretically based research has shown that such a process has often proved to be difficult and very rarely successful. The present study reports and theorises the experiences of an externally facilitated process that emerged from a practitioner model of change. This case study of the development of the controlled rapid approach to curriculum change began by establishing the reasons three principals initiated curriculum change and why they then engaged an outsider to facilitate the process. It also examined this particular change process from the perspectives of the research participants. The investigation led to the revision of the practitioner model as used in the three schools and challenged current thinking about the process of school curriculum change. The thesis aims to offer principals and the wider education community an alternative model for consideration when undertaking curriculum change. Finally, the thesis warns that, in the longer term, the application of the study's revised model (the Controlled Rapid Approach to Curriculum Change [CRACC] Model) may have less than desirable educational consequences.