863 results for Background Factors
Abstract:
Road curves are an important feature of road infrastructure, and many serious crashes occur on them. In Queensland, the number of fatalities on curves is twice that on straight roads, so there is a need to reduce drivers' exposure to crash risk on road curves. Road crash numbers in Australia and across the Organisation for Economic Co-operation and Development (OECD) have plateaued over the last five years (2004 to 2008), and the road safety community is urgently seeking innovative interventions to reduce the number of crashes. However, designing an innovative and effective intervention may prove difficult, as it relies on providing a theoretical foundation, coherence, understanding, and structure to both the design and the validation of the efficiency of the new intervention. Researchers from multiple disciplines have developed various models to determine the contributing factors for crashes on road curves with a view to reducing the crash rate. However, most existing methods are based on statistical analysis of contributing factors described in government crash reports. To further explore the contributing factors related to crashes on road curves, this thesis designs a novel method to analyse and validate them. The use of crash claim reports from an insurance company is proposed for analysis using data mining techniques; to the best of our knowledge, this is the first attempt to use data mining techniques to analyse crashes on road curves. Because the reports consist of thousands of textual descriptions, text mining is employed to identify the contributing factors. Beyond identifying contributing factors, few studies to date have investigated the relationships between these factors, especially for crashes on road curves. Thus, this study proposes the use of rough set analysis to determine these relationships, and the results of this analysis are used to assess the effect of the contributing factors on crash severity. The findings obtained through the data mining techniques presented in this thesis are consistent with previously identified contributing factors. Furthermore, this thesis identifies new contributing factors and the relationships between them. A significant pattern related to crash severity is the time of day: severe road crashes occur more frequently in the evening or at night. Tree collision is another common pattern: crashes that occur in the morning and involve hitting a tree are likely to have a higher crash severity. Another factor that influences crash severity is the age of the driver; most age groups face a high crash severity, except drivers between 60 and 100 years old, who have the lowest. The significant relationship identified between contributing factors involves the time of the crash, the year of manufacture of the vehicle, the age of the driver and hitting a tree. Having identified new contributing factors and relationships, a validation process was carried out using a traffic simulator to determine their accuracy, and it indicates that the results are accurate. This demonstrates that data mining techniques are a powerful tool in road safety research and can be usefully applied within the Intelligent Transport System (ITS) domain. The research presented in this thesis provides an insight into the complexity of crashes on road curves.
The findings of this research have important implications for both practitioners and academics. For road safety practitioners, the results illustrate practical benefits for the design of interventions for road curves that will potentially help decrease related injuries and fatalities. For academics, this research opens up a new research methodology for assessing crash severity related to road crashes on curves.
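As a loose illustration of the rough set analysis step mentioned above (a minimal sketch with invented attribute names and records, not the thesis's actual data or code), decision rules can be read off by partitioning crash records into indiscernibility classes over a set of condition attributes and checking whether each class maps to a single severity outcome:

```python
# Hedged sketch: rough-set-style rule extraction on hypothetical crash records.
# Attribute names and values are illustrative assumptions only.
from collections import defaultdict

records = [
    {"time_of_day": "night", "hit_tree": True,  "severity": "high"},
    {"time_of_day": "night", "hit_tree": False, "severity": "high"},
    {"time_of_day": "day",   "hit_tree": True,  "severity": "high"},
    {"time_of_day": "day",   "hit_tree": False, "severity": "low"},
    {"time_of_day": "day",   "hit_tree": False, "severity": "low"},
]

condition_attrs = ("time_of_day", "hit_tree")

# Partition records into equivalence (indiscernibility) classes on the condition attributes.
classes = defaultdict(list)
for rec in records:
    key = tuple(rec[a] for a in condition_attrs)
    classes[key].append(rec["severity"])

# A class whose members share one severity value yields a certain decision rule;
# mixed classes indicate the condition attributes do not fully determine severity.
for key, outcomes in classes.items():
    conditions = dict(zip(condition_attrs, key))
    if len(set(outcomes)) == 1:
        print(f"IF {conditions} THEN severity={outcomes[0]}")
    else:
        print(f"{conditions} is inconsistent: {set(outcomes)}")
```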
Abstract:
The case study of Lusoponte illustrates the concession awarded by the Portuguese Government to finance, design, build and operate two bridges over the Tagus in Lisbon, Portugal. It includes an overview of the project's background and an analysis of the main risk categories, stating both the actual risks encountered and the mitigation measures adopted. Throughout the project, great attention was given to whole-life-cycle costs and to gains in efficiency and cost control. Among the lessons that can be learned by both the public and private sectors is that a complete risk management analysis must include not only the technical factors but also a realistic assessment of environmental and social risks. These were the risks that were somewhat overlooked and that caused the main problems for the project's development.
Abstract:
Adiabatic compression testing of components in gaseous oxygen is a test method that is utilized worldwide and is commonly required to qualify a component for ignition tolerance under its intended service. This testing is required by many industry standards organizations and government agencies; however, a thorough evaluation of the test parameters and test system influences on the thermal energy produced during the test has not yet been performed. This paper presents a background for adiabatic compression testing and discusses an approach to estimating potential differences in the thermal profiles produced by different test laboratories. A “Thermal Profile Test Fixture” (TPTF) is described that is capable of measuring and characterizing the thermal energy for a typical pressure shock by any test system. The test systems at Wendell Hull & Associates, Inc. (WHA) in the USA and at the BAM Federal Institute for Materials Research and Testing in Germany are compared in this manner and some of the data obtained are presented. The paper also introduces a new way of comparing the test method to idealized processes to perform system-by-system comparisons. Thus, the paper introduces an “Idealized Severity Index” (ISI) of the thermal energy to characterize a rapid pressure surge. From the TPTF data a “Test Severity Index” (TSI) can also be calculated so that the thermal energies developed by different test systems can be compared to each other and to the ISI for the equivalent isentropic process. Finally, a “Service Severity Index” (SSI) is introduced to characterize the thermal energy of actual service conditions. This paper is the second in a series of publications planned on the subject of adiabatic compression testing.
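For context on the idealized comparison underlying the ISI, the final gas temperature reached by a reversible adiabatic (isentropic) compression is given by the standard textbook relation below; this is a general thermodynamic identity, not a formula quoted from the paper:

```latex
% Final temperature of an ideal gas after isentropic compression
T_{\mathrm{final}} = T_{\mathrm{initial}}
  \left(\frac{P_{\mathrm{final}}}{P_{\mathrm{initial}}}\right)^{\frac{\gamma - 1}{\gamma}},
\qquad \gamma = \frac{c_p}{c_v} \;(\approx 1.4 \text{ for oxygen})
```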
Abstract:
We evaluated the sustainability of an intervention to reduce women’s cardiovascular risk factors, determined the influence of self-efficacy, and described women’s current health. We used a mixed-method approach that utilized forced-choice and open-ended questionnaire items about health status, habits, and self-efficacy. Sixty women, average age 61, returned questionnaires. Women in the original intervention group continued health behaviors intended to reduce cardiovascular disease (CVD) at a higher rate than the control group, supporting the feasibility of a targeted intervention built around women’s individual goals. The role of self-efficacy in behavior change remains unclear. Self-reported health was also higher in the original intervention group.
Abstract:
Objective: To evaluate the influence of contextual and policy factors on nurses’ judgment about medication administration practice.---------- Design: A questionnaire survey of responses to a number of factorial vignettes in June 2004. These vignettes considered a combination of seven contextual and policy factors that were thought to influence nurses’ judgments relating to medication administration.---------- Participants: 185 (67% of eligible) clinical paediatric nursing staff returned completed questionnaires.---------- Setting: A tertiary paediatric hospital in Brisbane, Australia.---------- Results: Double checking the patient, double checking the drug and checking the legality of the prescription were the three strongest predictors of nurses’ actions regarding medication administration.---------- Conclusions: Policy factors, and not contextual factors, drive nurses’ judgment in response to hypothetical scenarios.
Abstract:
This paper analyzes the common factor structure of US, German, and Japanese Government bond returns. Unlike previous studies, we formally take into account the presence of country-specific factors when estimating common factors. We show that the classical approach of running a principal component analysis on a multi-country dataset of bond returns captures both local and common influences and therefore tends to pick too many factors. We conclude that US bond returns share only one common factor with German and Japanese bond returns. This single common factor is associated most notably with changes in the level of domestic term structures. We show that accounting for country-specific factors improves the performance of domestic and international hedging strategies.
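The effect described above can be illustrated with a small, self-contained simulation (a hedged sketch using synthetic data, not the paper's estimation procedure): when country returns are driven by one common factor plus country-specific factors, a pooled principal component analysis spreads explanatory power across several components and so overstates the number of "common" factors.

```python
# Hedged sketch: pooled PCA on simulated multi-country bond returns,
# illustrating how local factors inflate the apparent number of common factors.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_obs = 500

# One common (global level) factor plus an independent local factor per country.
common = rng.normal(size=n_obs)
local = {c: rng.normal(size=n_obs) for c in ("US", "DE", "JP")}
returns = np.column_stack([
    0.8 * common + 0.6 * local[c] + 0.1 * rng.normal(size=n_obs)
    for c in ("US", "DE", "JP")
])

pca = PCA().fit(returns)
print("Explained variance ratios:", pca.explained_variance_ratio_.round(2))
# Several components carry non-trivial variance, so a naive reading of the pooled
# PCA suggests more "common" factors than the single global one built into the data.
```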
Abstract:
Automatic recognition of people is an active field of research with important forensic and security applications. In these applications, it is not always possible for the subject to be in close proximity to the system. Voice represents a human behavioural trait which can be used to recognise people in such situations. Automatic Speaker Verification (ASV) is the process of verifying a person's identity through the analysis of their speech, and enables recognition of a subject at a distance over a telephone channel, wired or wireless. A significant amount of research has focussed on the application of Gaussian mixture model (GMM) techniques to speaker verification systems, providing state-of-the-art performance. GMMs are a type of generative classifier trained to model the probability distribution of the features used to represent a speaker. Recently introduced to the field of ASV research is the support vector machine (SVM). An SVM is a discriminative classifier requiring examples from both positive and negative classes to train a speaker model. The SVM is based on margin maximisation, whereby a hyperplane attempts to separate classes in a high-dimensional space. SVMs applied to the task of speaker verification have shown high potential, particularly when used to complement current GMM-based techniques in hybrid systems. This work aims to improve the performance of ASV systems using novel and innovative SVM-based techniques. Research was divided into three main themes: session variability compensation for SVMs; unsupervised model adaptation; and impostor dataset selection. The first theme investigated the differences between the GMM and SVM domains for the modelling of session variability, an aspect crucial for robust speaker verification. Techniques developed to improve the robustness of GMM-based classification were shown to bring about similar benefits to discriminative SVM classification through their integration in the hybrid GMM mean supervector SVM classifier. Further, the domains for the modelling of session variation were contrasted and found to share a number of common factors; however, the SVM domain consistently provided marginally better session variation compensation. Minimal complementary information was found between the techniques due to the similarities in how they achieved their objectives. The second theme saw the proposal of a novel model for the purpose of session variation compensation in ASV systems. Continuous progressive model adaptation attempts to improve speaker models by retraining them after exploiting all encountered test utterances during normal use of the system. The introduction of the weight-based factor analysis model provided significant performance improvements of over 60% in an unsupervised scenario. SVM-based classification was then integrated into the progressive system, providing further benefits in performance over the GMM counterpart. Analysis demonstrated that SVMs also hold several characteristics beneficial to the task of unsupervised model adaptation, prompting further research in the area. In pursuing the final theme, an innovative background dataset selection technique was developed. This technique selects the most appropriate subset of examples from a large and diverse set of candidate impostor observations for use as the SVM background by exploiting the SVM training process. This selection was performed on a per-observation basis so as to overcome the shortcomings of the traditional heuristic-based approach to dataset selection.
Results demonstrate that the approach provides performance improvements over both the use of the complete candidate dataset and the best heuristically selected dataset, while being only a fraction of the size. The refined dataset was also shown to generalise well to unseen corpora and to be highly applicable to the selection of impostor cohorts required in alternative techniques for speaker verification.
Abstract:
Background: Apart from promoting physical recovery and assisting in activities of daily living, a major challenge in stroke rehabilitation is to minimize psychosocial morbidity and to promote the reintegration of stroke survivors into their family and community. The identification of key factors influencing long-term outcome is essential in developing more effective rehabilitation measures for reducing stroke-related morbidity. The aim of this study was to test a theoretical model of predictors of participation restriction which included the direct and indirect effects between psychosocial outcomes, physical outcome, and socio-demographic variables at 12 months after stroke.---------- Methods: Data were collected from 188 stroke survivors at 12 months following their discharge from one of two rehabilitation hospitals in Hong Kong. The settings included patients' homes and residential care facilities. Path analysis was used to test a hypothesized model of participation restriction at 12 months.---------- Results: The path coefficients show functional ability having the largest direct effect on participation restriction (β = 0.51). The results also show that more depressive symptoms (β = -0.27), low state self-esteem (β = 0.20), female gender (β = 0.13), older age (β = -0.11) and living in a residential care facility (β = -0.12) have a direct effect on participation restriction. The explanatory variables accounted for 71% of the variance in participation restriction at 12 months.---------- Conclusion: Identification of stroke survivors at risk of high levels of participation restriction, depressive symptoms and low self-esteem will assist health professionals to devise appropriate rehabilitation interventions that target improving both physical and psychosocial functioning.
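Restated schematically (a paraphrase of the reported direct effects in standard path-analysis form, using the abstract's standardized coefficients and assuming the study's variable coding), the structural equation for participation restriction (PR) reads:

```latex
% Schematic structural equation implied by the reported direct effects
% (standardized coefficients as listed in the abstract; variable coding per the study)
\mathrm{PR} = 0.51\,\mathrm{FunctionalAbility} - 0.27\,\mathrm{DepressiveSymptoms}
  + 0.20\,\mathrm{StateSelfEsteem} + 0.13\,\mathrm{FemaleGender}
  - 0.11\,\mathrm{Age} - 0.12\,\mathrm{ResidentialCare} + \varepsilon,
\qquad R^2 = 0.71
```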
Abstract:
Aim: This paper is a report of a study to explore the phenomenon of resilience in the lives of adult patients of mental health services who have experienced mental illness. ---------- Background: Mental illness is a major health concern worldwide, and the majority of those experiencing it will continue to battle with relapses throughout their lives. However, in many instances people go on to overcome their illness to lead productive and socially engaged lives. Contemporary mental health nursing practice primarily focuses on symptom reduction, and working with resilience has not generally been a consideration. ---------- Method: A descriptive phenomenological study was carried out in 2006. One participant was recruited through advertisements in community newspapers and newsletters, and the others using the snowballing method. Information was gathered through in-depth individual interviews which were tape-recorded and subsequently transcribed. Colaizzi's original seven-step approach was used for data analysis, with the inclusion of two additional steps. ---------- Findings: The following themes were identified: Universality, Acceptance, Naming and knowing, Faith, Hope, Being the fool and Striking a balance, Having meaning and meaningful relationships, and 'Just doing it'. The conceptualization identified as encapsulating the themes was 'Viewing life from the ridge with eyes wide open', which involved knowing the risks and dangers ahead and making a decision for life amid ever-present hardships. ---------- Conclusion: Knowledge about resilience should be included in the theoretical and practical education of nursing students and experienced nurses. Early intervention, based on resilience factors identified through screening processes, is needed for people with mental illness.
Abstract:
Process models are used by information professionals to convey semantics about the business operations in a real world domain intended to be supported by an information system. The understandability of these models is vital to them actually being used. After all, what is not understood cannot be acted upon. Yet until now, understandability has primarily been defined as an intrinsic quality of the models themselves. Moreover, those studies that looked at understandability from a user perspective have mainly conceptualized users through rather arbitrary sets of variables. In this paper we advance an integrative framework to understand the role of the user in the process of understanding process models. Building on cognitive psychology, goal-setting theory and multimedia learning theory, we identify three stages of learning required to realize model understanding, these being Presage, Process, and Product. We define eight relevant user characteristics in the Presage stage of learning, three knowledge construction variables in the Process stage and three potential learning outcomes in the Product stage. To illustrate the benefits of the framework, we review existing process modeling work to identify where our framework can complement and extend existing studies.
Abstract:
The recently proposed data-driven background dataset refinement technique provides a means of selecting an informative background for support vector machine (SVM)-based speaker verification systems. This paper investigates the characteristics of the impostor examples in such highly informative background datasets. Data-driven dataset refinement individually evaluates the suitability of candidate impostor examples for the SVM background prior to selecting the highest-ranking examples as a refined background dataset. Further, the characteristics of the refined dataset were analysed to investigate the desired traits of an informative SVM background. The most informative examples of the refined dataset were found to consist of large amounts of active speech and distinctive language characteristics. The data-driven refinement technique was shown to filter the set of candidate impostor examples to produce a more disperse representation of the impostor population in the SVM kernel space, thereby reducing the number of redundant and less informative examples in the background dataset. Furthermore, data-driven refinement was shown to provide performance gains when applied to the difficult task of refining a small candidate dataset that was mismatched to the evaluation conditions.
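As a rough illustration of per-example background ranking (a hedged sketch, not the paper's refinement algorithm; the feature representation and ranking criterion here are assumptions), one can score each candidate impostor example by its influence on a trained SVM, for instance via the magnitude of its dual coefficient, and retain only the highest-ranking examples:

```python
# Hedged sketch: rank candidate impostor (background) examples by how strongly
# they shape an SVM decision boundary, then keep a refined subset.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Hypothetical fixed-length speaker representations (e.g. supervector-like features).
target = rng.normal(loc=1.0, size=(20, 50))        # positive class: target speaker
candidates = rng.normal(loc=0.0, size=(200, 50))   # candidate impostor background

X = np.vstack([target, candidates])
y = np.array([1] * len(target) + [0] * len(candidates))

svm = SVC(kernel="linear").fit(X, y)

# Score each candidate by the magnitude of its dual coefficient
# (zero if it is not a support vector of the trained model).
scores = np.zeros(len(candidates))
for sv_idx, alpha in zip(svm.support_, np.abs(svm.dual_coef_[0])):
    if sv_idx >= len(target):                      # consider only background examples
        scores[sv_idx - len(target)] = alpha

refined = np.argsort(scores)[::-1][:50]            # keep the 50 most informative examples
print("Indices of refined background examples:", refined[:10])
```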
Abstract:
We investigate whether the two zero-cost portfolios, SMB and HML, have the ability to predict economic growth for the markets investigated in this paper. Our findings show that there are only a limited number of cases in which the coefficients are positive, and significance is achieved in an even more limited number of cases. Our results are in stark contrast to Liew and Vassalou (2000), who find coefficients that are generally positive and of a similar magnitude. We go a step further and also employ the methodology of Lakonishok, Shleifer and Vishny (1994), and once again fail to support the risk-based hypothesis of Liew and Vassalou (2000). In sum, we argue that the search for a robust economic explanation for the firm size and book-to-market equity effects needs sustained effort, as these two zero-cost portfolios do not represent economically relevant risk.
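For reference, the predictive regressions in this strand of the literature, following Liew and Vassalou (2000), typically take the general form below, regressing future economic growth on past factor-portfolio returns (this is the generic specification, not necessarily the exact one estimated in the paper):

```latex
% Generic predictive regression of economic growth on lagged factor returns
\mathrm{GDPgrowth}_{t,\,t+1} = a + b\,\mathrm{MKT}_{t-1,\,t}
  + c\,\mathrm{SMB}_{t-1,\,t} + d\,\mathrm{HML}_{t-1,\,t} + \varepsilon_{t,\,t+1}
```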
Abstract:
E-commerce technologies such as websites, email and web browsers enable access to large amounts of information, facilitate communication and provide niche companies with an effective mechanism for competing with larger organisations world-wide. However, recent literature has shown that Australian SMEs have been slow in the uptake of these technologies. The aim of this research was to determine which factors were important in influencing small firms' decision making in respect of information technology and e-commerce adoption. Findings indicate that, in general, the more a firm was concerned about its competitive position, the more likely it was to develop a web site. Moreover, the 'Industry and Skill Demands' dimension suggested that a firm was more inclined to develop a web site as the formal education of the owner/manager increased, particularly where the firm was in the transport and storage or communication services industries and regarded the cost of IT adoption as, in effect, an investment.
Abstract:
The adoption of Internet technologies by the small business sector (SMEs) is important to their on-going survival. Yet, given the opportunities and benefits that Internet technologies can provide, it has been shown that Australian small businesses are relatively slow in adopting them. This paper develops a model from recent literature on the facilitators of and inhibitors to the adoption of Internet technologies by small business. A cross-case analysis of findings from three case studies is presented. Findings indicate that perceived lack of business benefit, mistrust of the IT industry and lack of understanding of Internet technologies are major inhibitors to Internet adoption by small business.