735 results for Bayesian framework
Abstract:
In recent years there has been considerable discussion of the challenges facing the future of library and information science (LIS) education in Australia. This paper outlines a twelve-month project funded by the Australian Learning and Teaching Council and undertaken by eleven institutions representing university and vocational LIS education in Australia. The project established a Framework for the Education of the Information Professions in Australia, which provides a set of strategic recommendations to inform future directions of Australian LIS education. This national project represented a bold move within Australian LIS education and provided a unique opportunity for LIS educators across Australia to unite in order to ‘future-proof’ education for future generations of LIS professionals.
Abstract:
While the engagement, success and retention of first year students are ongoing issues in higher education, they are currently of considerable and increasing importance as the pressures on teaching and learning from the new standards framework and performance funding intensify. This Nuts & Bolts presentation introduces the concept of a maturity model and its application to assessing the capability of higher education institutions to address student engagement, success and retention. Participants will be provided with (a) a concise description of the concept and features of a maturity model; and (b) the opportunity to explore the potential application of maturity models (i) to the management of student engagement and retention programs and strategies within an institution and (ii) to the improvement of these features by benchmarking across the sector.
Abstract:
Information and communication technology (ICT) systems are almost ubiquitous in the modern world. It is hard to identify any industry, or for that matter any part of society, that is not in some way dependent on these systems and their continued secure operation. The security of information infrastructures, at both the organisational and societal level, is therefore of critical importance. Information security risk assessment is an essential part of ensuring that these systems are appropriately protected and positioned to deal with a rapidly changing threat environment. The complexity of these systems and their interdependencies, however, introduces a similar complexity to the information security risk assessment task. This complexity suggests that information security risk assessment cannot optimally be undertaken manually. Information security risk assessment for individual components of the information infrastructure can be aided by a software tool, a type of simulation, which concentrates on modelling failure rather than normal operation. Avoiding the modelling of the operational system again reduces the complexity of the assessment task. Such a tool provides the opportunity to reuse information in many different ways by developing a repository of relevant information to aid both risk assessment and management, and governance and compliance activities. Widespread use of such a tool allows the risk models developed for individual information infrastructure components to be connected, in order to develop a model of information security exposures across the entire information infrastructure. In this thesis, conceptual and practical aspects of risk and its underlying epistemology are analysed to produce a model suitable for application to information security risk assessment. Based on this work, prototype software has been developed to explore these concepts for information security risk assessment. Initial work has been carried out to investigate the use of this software for information security compliance and governance activities. Finally, an initial concept for extending this approach across an information infrastructure is presented.
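The reusable risk repository described above lends itself to a simple illustration. The following is a minimal sketch assuming a plain likelihood-times-impact exposure model; all names and figures are hypothetical and are not taken from the thesis prototype.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One reusable piece of risk information for an infrastructure component."""
    component: str     # information infrastructure component
    scenario: str      # failure scenario being modelled (not normal operation)
    likelihood: float  # estimated probability of the scenario occurring
    impact: float      # estimated loss if the scenario occurs

    def exposure(self) -> float:
        # Expected loss contribution of this single failure scenario.
        return self.likelihood * self.impact

# Hypothetical repository entries, reusable for risk assessment as well as
# governance and compliance reporting.
repository = [
    RiskEntry("web-gateway", "certificate expiry", 0.10, 50_000),
    RiskEntry("web-gateway", "denial-of-service outage", 0.05, 120_000),
    RiskEntry("hr-database", "credential theft", 0.02, 400_000),
]

# Aggregate per-component exposure, then rank components by expected loss.
totals: dict[str, float] = {}
for entry in repository:
    totals[entry.component] = totals.get(entry.component, 0.0) + entry.exposure()

for component, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{component}: expected loss {total:,.0f}")
```

Connecting such per-component models into an infrastructure-wide exposure model, as the thesis proposes, would amount to aggregating and linking entries of this kind across organisations.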
Abstract:
Most crash severity studies have ignored severity correlations between driver-vehicle units involved in the same crash. Models that do not account for these within-crash correlations produce biased estimates of factor effects. This study developed a Bayesian hierarchical binomial logistic model to identify the significant factors affecting the severity of driver injury and vehicle damage in traffic crashes at signalized intersections. Crash data from Singapore were employed to calibrate the model. Model fitness assessment and comparison using the Intra-class Correlation Coefficient (ICC) and the Deviance Information Criterion (DIC) confirmed the suitability of introducing crash-level random effects. Crashes occurring at peak times, under good street lighting, or involving pedestrian injuries are associated with lower severity, while those occurring at night, at T/Y-type intersections, in the right-most lane, or at intersections equipped with red light cameras have greater odds of being severe. Moreover, heavy vehicles offer better protection against severe crashes, while crashes involving two-wheeled vehicles, young or aged drivers, or an offending party are more likely to result in severe injuries.
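For concreteness, one plausible form of the crash-level random-effects logistic model described above is written out below; the notation is assumed rather than quoted from the paper. Unit i in crash j receives a binary severity outcome, a crash-level random effect captures the within-crash correlation, and the ICC measures the share of latent variance at the crash level (using the standard logistic latent error variance of pi^2/3).

```latex
\begin{align}
  y_{ij} &\sim \mathrm{Bernoulli}(p_{ij}) \\
  \operatorname{logit}(p_{ij}) &= \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_j,
    \qquad u_j \sim \mathcal{N}(0, \sigma_u^2) \\
  \mathrm{ICC} &= \frac{\sigma_u^2}{\sigma_u^2 + \pi^2/3}
\end{align}
```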
Resumo:
Motorcycles are overrepresented in road traffic crashes and particularly vulnerable at signalized intersections. The objective of this study is to identify causal factors affecting the motorcycle crashes at both four-legged and T signalized intersections. Treating the data in time-series cross-section panels, this study explores different Hierarchical Poisson models and found that the model allowing autoregressive lag 1 dependent specification in the error term is the most suitable. Results show that the number of lanes at the four-legged signalized intersections significantly increases motorcycle crashes largely because of the higher exposure resulting from higher motorcycle accumulation at the stop line. Furthermore, the presence of a wide median and an uncontrolled left-turn lane at major roadways of four-legged intersections exacerbate this potential hazard. For T signalized intersections, the presence of exclusive right-turn lane at both major and minor roadways and an uncontrolled left-turn lane at major roadways of T intersections increases motorcycle crashes. Motorcycle crashes increase on high-speed roadways because they are more vulnerable and less likely to react in time during conflicts. The presence of red light cameras reduces motorcycle crashes significantly for both four-legged and T intersections. With the red-light camera, motorcycles are less exposed to conflicts because it is observed that they are more disciplined in queuing at the stop line and less likely to jump start at the start of green.
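A standard panel Poisson specification with AR(1) serially correlated errors, consistent with (though not quoted from) the model class described above, is:

```latex
\begin{align}
  y_{it} &\sim \mathrm{Poisson}(\lambda_{it}) \\
  \ln \lambda_{it} &= \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + \varepsilon_{it} \\
  \varepsilon_{it} &= \rho\,\varepsilon_{i,t-1} + \nu_{it},
    \qquad \nu_{it} \sim \mathcal{N}(0, \sigma_\nu^2)
\end{align}
```

Here y_it is the motorcycle crash count at intersection i in period t, x_it collects the geometric and control covariates, and rho is the lag-1 autocorrelation in the error term.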
Abstract:
This study proposes a full Bayes (FB) hierarchical modeling approach to traffic crash hotspot identification. The FB approach is able to account for all uncertainties associated with crash risk and various risk factors by estimating a posterior distribution of site safety, on which various ranking criteria can be based. Moreover, through hierarchical model specification, the FB approach can flexibly accommodate various heterogeneities in crash occurrence due to spatiotemporal effects on traffic safety. Using Singapore intersection crash data (1997-2006), an empirical evaluation was conducted to compare the proposed FB approach with state-of-the-art approaches. Results show that the Bayesian hierarchical models accommodating site-specific effects and serial correlation have better goodness-of-fit than non-hierarchical models. Furthermore, all model-based approaches perform significantly better in safety ranking than the naive approach using raw crash counts. The FB hierarchical models were also found to significantly outperform the standard empirical Bayes (EB) approach in correctly identifying hotspots.
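Once posterior samples of each site's expected crash frequency are available, ranking is straightforward. The sketch below uses synthetic draws, not the Singapore data, and ranks sites by the posterior probability of exceeding the network-wide average, one common FB ranking criterion.

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws, n_sites = 4000, 5
# Stand-in for MCMC draws of site-level expected crash frequency
# (shape: n_draws x n_sites); synthetic values for illustration only.
posterior = rng.gamma(shape=[[2.0, 3.5, 1.2, 5.0, 2.8]], scale=1.0,
                      size=(n_draws, n_sites))

post_mean = posterior.mean(axis=0)
network_avg = posterior.mean()                     # overall posterior mean
p_exceed = (posterior > network_avg).mean(axis=0)  # P(site risk > average)

# Rank sites by exceedance probability, highest first.
for site in np.argsort(-p_exceed):
    print(f"site {site}: mean={post_mean[site]:.2f}, "
          f"P(above network average)={p_exceed[site]:.2f}")
```

Because the full posterior is retained, the same draws support other criteria (e.g. ranking by posterior mean or by expected rank) without refitting the model.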
Abstract:
Data mining techniques extract repeated and useful patterns from a large data set, which are in turn utilized to predict the outcome of future events. The main purpose of the research presented in this paper is to investigate data mining strategies and develop an efficient framework for multi-attribute project information analysis to predict the performance of construction projects. The research team first reviewed existing data mining algorithms, applied them to systematically analyze a large project data set collected through a survey, and finally proposed a data-mining-based decision support framework for project performance prediction. To evaluate the potential of the framework, a case study was conducted using data collected from 139 capital projects, analyzing the relationship between the use of information technology and project cost performance. The results showed that the proposed framework has the potential to promote fast, easy-to-use, interpretable, and accurate project data analysis.
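As a hedged illustration of the kind of analysis such a framework supports, the sketch below trains an interpretable classifier on invented project attributes; the features, data, and model choice are illustrative only, not the survey data set or the paper's specific algorithms.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n_projects = 139
# Hypothetical project attributes, e.g. degree of IT use, size, schedule.
X = rng.random((n_projects, 4))
# Hypothetical label: 1 = good cost performance, 0 = cost overrun.
y = (X[:, 0] + 0.2 * rng.standard_normal(n_projects) > 0.5).astype(int)

# A shallow tree keeps the model interpretable, matching the framework's
# goal of fast, easy-to-use, and explainable project data analysis.
model = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```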
Abstract:
Flexibility is a key driver of any successful design, particularly in highly unpredictable environments such as airport terminals. The ever-growing aviation industry requires airport terminals to be planned and constructed in a way that allows flexibility for future design, alteration and redevelopment. The concept of flexibility in terminal design is a relatively new initiative, and existing rules and guidelines are not adequate to assist designers. A shift towards a flexible design concept would allow terminal buildings to be designed to accommodate future changes and to make passengers’ journeys as simple, timely and hassle-free as possible. Currently available research indicates that a theoretical framework for a flexible design approach to airport terminals would facilitate the future design process. The generic principles of flexibility are investigated in the current research in order to incorporate flexible design approaches into the process of airport terminal design. A conceptual framework is proposed herein, which is expected to bring flexibility to current passenger terminal facilities within their corresponding locations as well as to future design and expansion.
Abstract:
Key establishment is a crucial primitive for building secure channels in a multi-party setting. Without quantum mechanics, key establishment can only be done under the assumption that some computational problem is hard. Since digital communication can be easily eavesdropped upon and recorded, it is important to consider the secrecy of information in anticipation of future algorithmic and computational discoveries which could break the secrecy of past keys, violating the secrecy of the confidential channel. Quantum key distribution (QKD) can be used to generate secret keys that are secure against any future algorithmic or computational improvements. QKD protocols still require authentication of classical communication, although existing security proofs of QKD typically assume idealized authentication. It is generally considered folklore that QKD, when used with computationally secure authentication, is still secure against an unbounded adversary, provided the adversary did not break the authentication during the run of the protocol. We describe a security model for quantum key distribution that extends classical authenticated key exchange (AKE) security models. Using our model, we characterize the long-term security of the BB84 QKD protocol with computationally secure authentication against an eventually unbounded adversary. By basing our model on traditional AKE models, we can more readily compare the relative merits of various forms of QKD and existing classical AKE protocols. This comparison illustrates in which types of adversarial environments different quantum and classical key agreement protocols can be secure.
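To make concrete where authentication of the classical channel enters, here is a toy BB84 sifting simulation (ideal channel, no eavesdropper); it is a generic textbook sketch, not the security model constructed in the work above.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 32
alice_bits = rng.integers(0, 2, n)   # Alice's raw key bits
alice_bases = rng.integers(0, 2, n)  # 0 = rectilinear, 1 = diagonal
bob_bases = rng.integers(0, 2, n)    # Bob chooses bases independently

# When bases match, Bob recovers Alice's bit; otherwise his measurement
# outcome is uniformly random.
match = alice_bases == bob_bases
bob_bits = np.where(match, alice_bits, rng.integers(0, 2, n))

# Basis reconciliation happens over the classical channel: it is this
# exchange that must be authenticated (computationally or otherwise),
# since tampering here enables a man-in-the-middle attack.
sifted_key = alice_bits[match]
assert np.array_equal(sifted_key, bob_bits[match])
print(f"kept {match.sum()} of {n} bits:", "".join(map(str, sifted_key)))
```

On average half the positions survive sifting; error estimation and privacy amplification, also run over the authenticated classical channel, would follow in a full protocol.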
Abstract:
Positive user experience (UX) has become a key factor in designing interactive products. It acts as a differentiator that can determine a product’s success in a mature market. However, current UX frameworks and methods do not fully support the early stages of product design and development, during which the assessment of UX is challenging because no actual user-product interaction can be tested. This qualitative study investigated anticipated user experience (AUX) to address this problem. Using the co-discovery method, participants were asked to imagine a desired product, anticipate experiences with it, and discuss their views with another participant. Fourteen sub-categories emerged from the data, and the relationships among them were defined through co-occurrence analysis. These data formed the basis of the AUX framework, which consists of two networks elucidating 1) how users imagine a desired product and 2) how they anticipate positive experiences with that product. The AUX framework reveals important factors in the process of imagining future products and experiences, including the way in which these factors interrelate. Focusing on and exploring each component of the two networks will allow designers to obtain a deeper understanding of the required pragmatic and hedonic qualities of a product, its intended uses, user characteristics, potential contexts of experience, and the anticipated emotions embedded within the experience. This understanding, in turn, will help designers to better foresee users’ underlying needs and to focus on the most important aspects of positive experience. The use of the AUX framework in the early stages of product development will therefore contribute to designing for pleasurable UX.