Abstract:
Young people aged 17–24 years are at high risk of being killed in road crashes around the world. Road safety interventions consider some influences upon young driver behaviour; for example, imposing passenger restrictions on young novice drivers indirectly minimises the potential negative social influences of peers as passengers. To change young driver risky behaviour, the multitude of psychosocial influences upon its initiation and maintenance must be identified. A study questionnaire was developed to investigate the relationships between risky driving and Akers’ social learning theory, social identity theory, and thrill-seeking variables. The questionnaire was completed by 165 participants (105 women, 60 men) residing in south-east Queensland, Australia. The sociodemographic variables of age, gender, and exposure explained 19% of the variance in self-reported risky driving behaviour, whilst Akers’ social learning variables explained an additional 42%. Thrill-seeking and social identity variables did not explain any significant additional variance. Significant predictors of risky driving included imitation of the driving behaviours of, and anticipated rewards and punishments administered by, parents and peers. Road safety policy that directly considers and incorporates these factors in the design, implementation, and enforcement of young driver road safety interventions should prove more efficacious than current approaches.
Abstract:
Sewage and its microbiology, treatment and disposal are important to the topic of Antarctic wildlife health because disposal of untreated sewage effluent into the Antarctic marine environment is both allowed and commonplace. Human sewage contains enteric bacteria as normal flora, and has the potential to contain parasites, bacteria and viruses which may prove pathogenic to Antarctic wildlife. Treatment can reduce levels of micro-organisms in sewage effluent, but is not a requirement of the Environmental Protocol to the Antarctic Treaty (the Madrid Protocol). In contrast, the deliberate release of non-native organisms for any other reason is prohibited. Hence, disposal of sewage effluent to the marine environment is the only activity routinely undertaken in Antarctica in the knowledge that it will likely result in the release of large numbers of potentially non-native species. When the Madrid Protocol was negotiated, the decision to allow release of untreated sewage effluent was considered the only pragmatic option, as a prohibition would have been costly and may not have been achievable by many Antarctic operators. In addition, at that time the potential for transmission of pathogens from sewage to wildlife was not emphasised as a significant risk. Since then, the transmission of disease-causing agents between species has become more widely recognised, and it is now timely to consider the risks of continued discharge of sewage effluent in Antarctica and whether there are practical alternatives.
Abstract:
As part of a Doctor of Business Administration degree programme jointly run by Curtin University, Perth, Australia and Lingnan University, Hong Kong, a research thesis relating organizational effectiveness to the organizational culture of Hong Kong construction firms involved in public housing is being undertaken. Organizational effectiveness is measured by the Housing Department (HD) Performance Assessment Scoring System (PASS) and organizational culture traits and strengths have been measured by using the Denison Organizational Culture Survey (OCS), developed by Daniel Denison and William S. Neale and based on 16 years of research involving over 1,000 organizations. The PASS scores of building contractors are compared with the OCS scores to determine if there is any significant correlation between highly effective companies and particular organizational strengths and traits. Profiles are then drawn using the Denison Model and can be compared against ‘norms’ for the industry sector on which the survey has been carried out. The next stage of the work is to present the results of the survey to individual companies, conduct focus group interviews to test the results, discover more detail on that company’s culture and discuss possible actions based on the results. It is in this latter stage that certain value management techniques may well prove very useful.
Abstract:
In most of the work done in developing association rule mining, the primary focus has been on the efficiency of the approach; the quality of the derived rules has received less emphasis. Often a huge number of rules can be derived from a dataset, but many of them are redundant to other rules and thus useless in practice. The extremely large number of rules makes it difficult for end users to comprehend, and therefore effectively use, the discovered rules, which significantly reduces the effectiveness of rule mining algorithms. If the extracted knowledge cannot be effectively used in solving real-world problems, the effort of extracting it is worth little. This is a serious problem that has not yet been solved satisfactorily. In this paper, we propose a concise representation called the Reliable Approximate basis for representing non-redundant approximate association rules. We prove that redundancy elimination based on the proposed basis does not reduce the belief in the extracted rules. We also prove that all approximate association rules can be deduced from the Reliable Approximate basis. Therefore the basis is a lossless representation of approximate association rules.
Abstract:
Matrix function approximation is a current focus of worldwide interest and finds application in a variety of areas of applied mathematics and statistics. In this thesis we focus on the approximation of A^(-α/2)b, where A ∈ ℝ^(n×n) is a large, sparse symmetric positive definite matrix and b ∈ ℝ^n is a vector. In particular, we focus on matrix function techniques for sampling from Gaussian Markov random fields in applied statistics and for the solution of fractional-in-space partial differential equations. Gaussian Markov random fields (GMRFs) are multivariate normal random variables characterised by a sparse precision (inverse covariance) matrix. GMRFs are popular models in computational spatial statistics as the sparse structure can be exploited, typically through the use of the sparse Cholesky decomposition, to construct fast sampling methods. It is well known, however, that for sufficiently large problems, iterative methods for solving linear systems outperform direct methods. Fractional-in-space partial differential equations arise in models of processes undergoing anomalous diffusion. Unfortunately, as the fractional Laplacian is a non-local operator, numerical methods based on the direct discretisation of these equations typically require the solution of dense linear systems, which is impractical for fine discretisations. In this thesis, novel applications of Krylov subspace approximations to matrix functions are investigated for both of these problems. Matrix functions arise when sampling from a GMRF by noting that the Cholesky decomposition A = LL^T is, essentially, a 'square root' of the precision matrix A. Therefore, we can replace the usual sampling method, which forms x = L^(-T)z, with x = A^(-1/2)z, where z is a vector of independent and identically distributed standard normal random variables.
Similarly, the matrix transfer technique can be used to build solutions to the fractional Poisson equation of the form ϕ_n = A^(-α/2)b, where A is the finite difference approximation to the Laplacian. Hence both applications require the approximation of f(A)b, where f(t) = t^(-α/2) and A is sparse. In this thesis we compare the Lanczos approximation, the shift-and-invert Lanczos approximation, the extended Krylov subspace method, rational approximations and the restarted Lanczos approximation for matrix functions of this form. A number of novel results are presented. Firstly, we prove the convergence of the matrix transfer technique for the solution of the fractional Poisson equation and give conditions under which the finite difference discretisation can be replaced by other methods for discretising the Laplacian. We then investigate a number of methods for approximating matrix functions of the form A^(-α/2)b and investigate stopping criteria for these methods. In particular, we derive a new method for restarting the Lanczos approximation to f(A)b. We then apply these techniques to the problem of sampling from a GMRF and construct a full suite of methods for sampling conditioned on linear constraints and for approximating the likelihood. Finally, we consider the problem of sampling from a generalised Matérn random field, which combines our techniques for solving fractional-in-space partial differential equations with our method for sampling from GMRFs.
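The core computation described here, approximating f(A)b with f(t) = t^(-α/2) via a Krylov subspace, can be sketched with the plain Lanczos approximation. The sketch below is illustrative only and is not the thesis's code: it builds an m-step Lanczos basis with full reorthogonalisation and evaluates f on the small tridiagonal matrix T_m, so that f(A)b ≈ ||b|| V_m f(T_m) e_1; the 1-D precision matrix and all parameter values are assumptions made up for the example.

```python
import numpy as np

def lanczos_fa_b(A, b, m, f):
    """Approximate f(A) @ b for symmetric A with an m-step Lanczos
    process: f(A) b ~= ||b|| * V_m f(T_m) e_1."""
    n = len(b)
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(max(m - 1, 0))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w = w - alpha[j] * V[:, j]
        if j > 0:
            w = w - beta[j - 1] * V[:, j - 1]
        # full reorthogonalisation keeps the basis numerically orthogonal
        w = w - V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    # evaluate f on the small tridiagonal matrix via its eigendecomposition
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals, evecs = np.linalg.eigh(T)
    fT_e1 = evecs @ (f(evals) * evecs[0, :])   # f(T_m) e_1
    return np.linalg.norm(b) * (V @ fT_e1)

# Example: an approximate GMRF sample x = A^(-1/2) z, where A is a small
# 1-D precision matrix (second differences plus a diagonal shift).
n = 50
A = 2.5 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
z = np.random.default_rng(0).standard_normal(n)
x = lanczos_fa_b(A, z, 40, lambda t: t ** -0.5)
```

For large sparse A only matrix-vector products with A are needed, which is the point of the Krylov approach; on this small example the result agrees closely with the dense reference computed from a full eigendecomposition.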
Abstract:
Association rule mining has made many advances in the area of knowledge discovery. However, the quality of the discovered association rules is a big concern and has drawn increasing attention recently. One problem with the quality of the discovered association rules is the huge size of the extracted rule set. Often a huge number of rules can be extracted from a dataset, but many of them are redundant to other rules and thus useless in practice. Mining non-redundant rules is a promising approach to solving this problem. In this paper, we first propose a definition of redundancy; we then propose a concise representation, called the Reliable basis, for representing non-redundant association rules, covering both exact and approximate rules. An important contribution of this paper is that we propose to use the certainty factor as the criterion for measuring the strength of the discovered association rules. With this criterion, we can determine the boundary between redundancy and non-redundancy, ensuring that as many redundant rules as possible are eliminated without reducing the inference capacity of, or the belief in, the remaining extracted non-redundant rules. We prove that redundancy elimination based on the proposed Reliable basis does not reduce the belief in the extracted rules. We also prove that all association rules can be deduced from the Reliable basis. Therefore the Reliable basis is a lossless representation of association rules. Experimental results show that the proposed Reliable basis can significantly reduce the number of extracted rules.
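As an illustration of the certainty factor criterion mentioned above, the sketch below computes support, confidence, and a certainty factor for a toy transaction set. The CF formulation used (positive CF = (conf − supp(B))/(1 − supp(B)), negative CF = (conf − supp(B))/supp(B)) is a standard one from the association rule literature and is assumed here; it is not taken from this paper, and the transaction data are invented.

```python
# Toy transaction database (items are illustrative).
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
    {"bread", "milk"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    return support(antecedent | consequent) / support(antecedent)

def certainty_factor(antecedent, consequent):
    """CF > 0: the rule raises belief in the consequent beyond its base
    frequency; CF < 0: it lowers it; CF = 0: no information."""
    conf = confidence(antecedent, consequent)
    base = support(consequent)
    if conf > base:
        return (conf - base) / (1.0 - base)
    if conf < base:
        return (conf - base) / base
    return 0.0
```

Here {butter} → {bread} has CF 1.0 (bread appears in every transaction containing butter), while {bread} → {milk} has a slightly negative CF despite 75% confidence, because milk's base frequency is 80%; this is the kind of discrimination a confidence threshold alone misses.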
Abstract:
In this paper, a Riesz fractional diffusion equation with a nonlinear source term (RFDE-NST) is considered. This equation is commonly used to model the growth and spreading of biological species. Based on the equivalence of the Riemann-Liouville (R-L) and Grünwald-Letnikov (GL) fractional derivative definitions, an implicit finite difference approximation (IFDA) for the RFDE-NST is derived. We prove that the IFDA is unconditionally stable and convergent. In order to evaluate the efficiency of the IFDA, a comparison with a fractional method of lines (FMOL) is presented. Finally, two numerical examples are given to show that the numerical results are in good agreement with our theoretical analysis.
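The Grünwald-Letnikov connection mentioned above can be made concrete: the GL weights g_k = (-1)^k C(α, k) satisfy the recurrence g_0 = 1, g_k = g_{k-1}(1 − (α+1)/k), and a Riesz fractional derivative can be discretised by combining left- and right-sided shifted GL sums scaled by −1/(2 cos(πα/2) h^α). The sketch below is a generic shifted-GL construction with a semi-implicit time step for illustration, not the specific IFDA of this paper; all grid and parameter values are assumptions. For α = 2 the matrix recovers the standard second-difference Laplacian stencil.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov weights g_k = (-1)^k * C(alpha, k), via the
    recurrence g_0 = 1, g_k = g_{k-1} * (1 - (alpha + 1) / k)."""
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)
    return g

def riesz_matrix(alpha, n, h):
    """Dense shifted-GL discretisation of the Riesz fractional derivative
    of order alpha (1 < alpha <= 2) on n interior grid points of spacing h."""
    g = gl_weights(alpha, n + 1)
    c = -1.0 / (2.0 * np.cos(np.pi * alpha / 2.0) * h ** alpha)
    B = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if j <= i + 1:
                B[i, j] += g[i - j + 1]   # left-sided shifted GL sum
            if j >= i - 1:
                B[i, j] += g[j - i + 1]   # right-sided shifted GL sum
    return c * B

def semi_implicit_step(u, B, dt, source):
    """One step of (I - dt*B) u_new = u + dt*source(u): the fractional
    operator is treated implicitly, the nonlinear source explicitly."""
    n = len(u)
    return np.linalg.solve(np.eye(n) - dt * B, u + dt * source(u))

# Example: one step of a Fisher-type equation u_t = Riesz^alpha u + u(1 - u).
n, h, dt, alpha = 64, 1.0 / 65, 1e-4, 1.8
B = riesz_matrix(alpha, n, h)
u = np.sin(np.pi * np.arange(1, n + 1) * h)
u_next = semi_implicit_step(u, B, dt, lambda v: v * (1.0 - v))
```

Treating the source explicitly keeps each step a single linear solve; a fully implicit treatment of the nonlinearity, as in unconditional stability proofs, would require a nonlinear solve per step.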
Abstract:
Road curves are an important feature of road infrastructure, and many serious crashes occur on them. In Queensland, the number of fatalities on curves is twice that on straight roads. There is therefore a need to reduce drivers’ exposure to crash risk on road curves. Road crash numbers in Australia and across the Organisation for Economic Co-operation and Development (OECD) plateaued in the five years from 2004 to 2008, and the road safety community is urgently seeking innovative interventions to reduce the number of crashes. However, designing an innovative and effective intervention may prove difficult, as it relies on providing a theoretical foundation, coherence, understanding, and structure to both the design and the validation of the effectiveness of the new intervention. Researchers from multiple disciplines have developed various models to determine the contributing factors for crashes on road curves with a view to reducing the crash rate. However, most existing methods are based on statistical analysis of contributing factors described in government crash reports. In order to further explore the contributing factors related to crashes on road curves, this thesis designs a novel method to analyse and validate them. The use of crash claim reports from an insurance company is proposed for analysis using data mining techniques. To the best of our knowledge, this is the first attempt to use data mining techniques to analyse crashes on road curves. Text mining is employed because the reports consist of thousands of textual descriptions, from which the contributing factors can be identified. Beyond identifying the contributing factors, few studies to date have investigated the relationships between these factors, especially for crashes on road curves. This study therefore proposes the use of the rough set analysis technique to determine these relationships.
The results from this analysis are used to assess the effect of these contributing factors on crash severity. The findings obtained through the data mining techniques presented in this thesis are consistent with previously identified contributing factors. Furthermore, this thesis identifies new contributing factors and the relationships between them. A significant pattern related to crash severity is the time of day: severe road crashes occur more frequently in the evening or at night. Tree collision is another common pattern: crashes that occur in the morning and involve hitting a tree are likely to have a higher severity. Another factor that influences crash severity is the age of the driver. Most age groups face a high crash severity, except for drivers between 60 and 100 years old, who have the lowest. The significant relationship identified between contributing factors involves the time of the crash, the year of manufacture of the vehicle, the age of the driver, and hitting a tree. Having identified new contributing factors and relationships, a validation process was carried out using a traffic simulator to determine their accuracy. The validation process indicates that the results are accurate. This demonstrates that data mining techniques are a powerful tool in road safety research and can be usefully applied within the Intelligent Transport System (ITS) domain. The research presented in this thesis provides an insight into the complexity of crashes on road curves. The findings have important implications for both practitioners and academics. For road safety practitioners, the results illustrate practical benefits for the design of interventions for road curves that will potentially help decrease related injuries and fatalities. For academics, this research opens up a new research methodology for assessing the severity of crashes on road curves.
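The rough set analysis referred to in this abstract rests on indiscernibility classes and lower/upper approximations. The sketch below shows the mechanics on a tiny invented decision table (the attribute names and records are purely illustrative, not data from the thesis): records indistinguishable on the chosen attributes form classes, and a decision concept is bracketed by the classes fully contained in it (lower approximation) and those merely overlapping it (upper approximation).

```python
# Toy decision table: (condition attributes, decision). Attribute names
# and values are invented for illustration.
records = [
    ({"time": "night", "tree": True},  "severe"),   # 0
    ({"time": "night", "tree": False}, "severe"),   # 1
    ({"time": "day",   "tree": True},  "severe"),   # 2
    ({"time": "day",   "tree": True},  "minor"),    # 3
    ({"time": "day",   "tree": False}, "minor"),    # 4
]

def indiscernibility_classes(attrs):
    """Group record indices that are indistinguishable on `attrs`."""
    classes = {}
    for i, (a, _) in enumerate(records):
        key = tuple(a[x] for x in attrs)
        classes.setdefault(key, set()).add(i)
    return list(classes.values())

def approximations(attrs, decision):
    """Rough-set lower/upper approximation of the concept
    {records with this decision}, relative to the attribute subset."""
    target = {i for i, (_, d) in enumerate(records) if d == decision}
    lower, upper = set(), set()
    for cls in indiscernibility_classes(attrs):
        if cls <= target:      # certainly in the concept
            lower |= cls
        if cls & target:       # possibly in the concept
            upper |= cls
    return lower, upper

lower, upper = approximations(("time", "tree"), "severe")
```

Records 2 and 3 agree on both attributes yet differ in severity, so they fall in the boundary region (upper minus lower); shrinking that region by adding attributes is how rough set analysis surfaces which factor combinations actually determine severity.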
Abstract:
Principal Topic: There is increasing recognition that the organizational configurations of corporate venture units should depend on the types of ventures the unit seeks to develop (Burgelman, 1984; Hill and Birkinshaw, 2008). Distinctions have been made between internal and external as well as exploitative versus explorative ventures (Hill and Birkinshaw, 2008; Narayan et al., 2009; Schildt et al., 2005). Assuming that firms do not want to limit themselves to a single type of venture, but rather employ a portfolio of ventures, the logical consequence is that firms should employ multiple corporate venture units, each tailor-made for the type of venture it seeks to develop. Surprisingly, the literature pays limited attention to the challenges of managing multiple corporate venture units in a single firm. Maintaining multiple venture units within one firm provides easier access to funding for new ideas (Hamel, 1999). It allows the freedom and flexibility to tie the organizational systems (Rice et al., 2000), autonomy (Hill and Rothaermel, 2003), and involvement of management (Day, 1994; Wadhwa and Kotha, 2006) to the requirements of the individual ventures. Yet the strategic objectives of a venture may change as uncertainty around the venture is resolved (Burgelman, 1984). For example, firms may decide to spin in external ventures (Chesbrough, 2002) or spin out ventures that prove strategically unimportant (Burgelman, 1984). This suggests that ventures might need to be transferred between venture units, e.g. from a more internally driven corporate venture division to a corporate venture capital unit. Several studies have suggested that ventures require different managerial skills across their phases of development (Desouza et al., 2007; O'Connor and Ayers, 2005; Kazanjian and Drazin, 1990; Westerman et al., 2006).
To facilitate effective transfer between venture units and manage the overall venturing process, it is important that firms set up and manage integrative linkages. Integrative linkages provide synergies and coordination between differentiated units (Lawrence and Lorsch, 1967). Prior findings point to the important role of senior management (Westerman et al., 2006; Gilbert, 2006) and a shared organizational vision (Burgers et al., 2009) in coordinating venture units with mainstream businesses. We draw on these literatures to investigate the key question of how to integratively manage multiple venture units. ---------- Methodology/Key Propositions: In order to answer the research question, we employ a case study approach that provides unique insights into how firms can break up their venturing process. We selected three Fortune 500 companies that employ multiple venturing units, IBM, Royal Dutch/Shell and Nokia, and investigated and compared their approaches. It was important that the case companies differed somewhat in the type of venture units they employed as well as in the way they integrate and coordinate their venture units. The data are based on extensive interviews and a variety of internal and external company documents to triangulate our findings (Eisenhardt, 1989). The key proposition of the article is that firms can best manage their multiple venture units through an ambidextrous design of loosely coupled units. This provides venture units with sufficient flexibility to employ organizational configurations that best support the type of venture they seek to develop, as well as sufficient integration to facilitate smooth transfer of ventures between venture units. Based on the case findings, we develop a generic framework for a new way of managing the venturing process through multiple corporate venture units.
---------- Results and Implications: One of our main findings is that these firms tend to organize their venture units according to phases in the venture development process. That is, they tend to have venture units aimed at incubation of venture ideas as well as units aimed more at the commercialization of ventures into a new business unit for the firm or a start-up. The companies in our case studies tended to coordinate venture units through integrative management skills or a coordinative venture unit that spanned multiple phases. We believe this paper makes two significant contributions. First, we extend prior venturing literature by addressing how firms manage a portfolio of venture units, each achieving different strategic objectives. Second, our framework provides recommendations on how firms should manage such an approach towards venturing. This helps to increase the likelihood of success of their venturing programs.
Abstract:
In this work, we investigate an alternative bootstrap approach, based on a result of Ramsey [F.L. Ramsey, Characterization of the partial autocorrelation function, Ann. Statist. 2 (1974), pp. 1296-1301] and on the Durbin-Levinson algorithm, to obtain a surrogate series from linear Gaussian processes with long-range dependence. We compare this bootstrap method with other existing procedures in a wide Monte Carlo experiment by estimating, parametrically and semi-parametrically, the memory parameter d. We consider Gaussian and non-Gaussian processes to demonstrate the robustness of the method to deviations from normality. The approach is also useful for estimating confidence intervals for the memory parameter d, improving the coverage level of the interval.
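The Durbin-Levinson recursion underpinning the proposed bootstrap can be sketched as follows: from an autocovariance sequence it yields the best-linear-predictor coefficients and innovation variances, which allow a Gaussian path to be drawn one conditional distribution at a time. This is a generic illustration (here with an AR(1) autocovariance, where the partial autocorrelations vanish beyond lag 1), not the authors' implementation of the long-memory surrogate scheme.

```python
import numpy as np

def durbin_levinson(gamma):
    """Durbin-Levinson recursion: from autocovariances gamma[0..n-1],
    return predictor coefficients phi[t] (length t) and innovation
    variances v[t] for t = 0..n-1; phi[t][-1] is the lag-t partial
    autocorrelation."""
    n = len(gamma)
    phi = [np.array([])]
    v = [gamma[0]]
    for t in range(1, n):
        prev = phi[-1]
        num = gamma[t] - sum(prev[j] * gamma[t - 1 - j] for j in range(t - 1))
        k = num / v[-1]                        # partial autocorrelation
        new = np.empty(t)
        new[:t - 1] = prev - k * prev[::-1]
        new[t - 1] = k
        phi.append(new)
        v.append(v[-1] * (1.0 - k * k))
    return phi, v

def simulate_from_acvf(gamma, rng):
    """Draw one Gaussian path with the given autocovariance sequence,
    sampling each value from its conditional distribution given the past."""
    phi, v = durbin_levinson(gamma)
    n = len(gamma)
    x = np.empty(n)
    x[0] = np.sqrt(v[0]) * rng.standard_normal()
    for t in range(1, n):
        mean = sum(phi[t][j] * x[t - 1 - j] for j in range(t))
        x[t] = mean + np.sqrt(v[t]) * rng.standard_normal()
    return x

# Example: AR(1) autocovariances gamma(h) = rho^h / (1 - rho^2).
rho = 0.6
gamma = rho ** np.arange(30) / (1.0 - rho ** 2)
phi, v = durbin_levinson(gamma)
path = simulate_from_acvf(gamma, np.random.default_rng(2))
```

For a long-memory process one would feed in the autocovariances implied by the memory parameter d (or, following Ramsey's characterisation, work from the partial autocorrelations directly); the O(n²) recursion avoids forming and factorising the full covariance matrix.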
Abstract:
Lifecycle funds offered by retirement plan providers allocate aggressively to risky asset classes when the employee participants are young, gradually switching to more conservative asset classes as they grow older and approach retirement. This approach focuses on maximizing growth of the accumulation fund in the initial years and preserving its value in the later years. The authors simulate terminal wealth outcomes based on conventional lifecycle asset allocation rules as well as on contrarian strategies that reverse the direction of asset switching. The evidence suggests that the growth in portfolio size over time significantly impacts the asset allocation decision. Due to the portfolio size effect that is observed by the authors, the terminal value of accumulation in retirement accounts is influenced more by the asset allocation strategy adopted in later years relative to that adopted in early years. By mechanistically switching to conservative assets in the later years of a plan, lifecycle strategies sacrifice significant growth opportunity and prove counterproductive to the participant's wealth accumulation objective. The authors conclude that this sacrifice does not seem to be compensated adequately in terms of reducing the risk of potentially adverse outcomes.
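The portfolio size effect described above is easy to reproduce in a toy Monte Carlo: because later-year returns act on a larger accumulated balance, reversing the glide path raises mean terminal wealth. All parameters below (return moments, horizon, contribution, simulation count) are invented for illustration and are not the authors' calibration.

```python
import numpy as np

def terminal_wealth(equity_path, n_sims, rng, contribution=1.0,
                    eq_mu=0.08, eq_sigma=0.17, bd_mu=0.04, bd_sigma=0.05):
    """Simulate terminal wealth of an annual-contribution plan whose
    equity weight in year t is equity_path[t] (illustrative parameters)."""
    wealth = np.zeros(n_sims)
    for w in equity_path:
        wealth += contribution                 # contribute at year start
        ret = (w * rng.normal(eq_mu, eq_sigma, n_sims)
               + (1.0 - w) * rng.normal(bd_mu, bd_sigma, n_sims))
        wealth *= 1.0 + ret
    return wealth

years = 40
lifecycle = np.linspace(1.0, 0.0, years)   # aggressive early, conservative late
contrarian = lifecycle[::-1]               # reverse the switching direction
rng = np.random.default_rng(1)
lw = terminal_wealth(lifecycle, 20000, rng)
cw = terminal_wealth(contrarian, 20000, rng)
```

Both strategies hold equities for the same number of portfolio-years, but the contrarian one does so when the balance is largest, so its simulated mean terminal wealth comes out higher, which is the size effect in miniature.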
Abstract:
This paper details a systematic literature review identifying problems in extant research relating to teachers’ attitudes towards reporting child sexual abuse, and offers a model for new attitude scale development and testing. Scale development comprised a five-phase process grounded in contemporary attitude theories including: a) developing the initial item pool; b) conducting a panel review; c) refining the scale via an expert focus group; d) building content validity through cognitive interviews; e) assessing internal consistency via field testing. The resulting 21-item scale displayed construct validity in preliminary testing. The scale may prove useful as a research tool, given the theoretical supposition that attitudes may be changed with time, context, experience, and education. Further investigation with a larger sample is warranted.
Abstract:
During wound repair, the balance between matrix metalloproteinases (MMPs) and their natural inhibitors (the TIMPs) is crucial for normal extracellular matrix turnover. However, the over-expression of several MMPs, including MMP-1, -2, -3, -8, -9 and -10, combined with abnormally high levels of activation or low expression of TIMPs, may contribute to excessive degradation of connective tissue and the formation of chronic ulcers. Many groups are exploring strategies for promoting wound healing that involve delivery of growth factors, cells, ECM components and small molecules. Our approach for improving the balance of MMPs is not to add anything more to the wound, but instead to neutralise the over-expressed MMPs using inhibitors tethered to a bandage-like hydrogel. Our designed synthetic pseudopeptide inhibitors have been demonstrated in vitro to inhibit MMP activity in standard solutions. These inhibitors have also been tethered to polyethylene glycol hydrogels using a facile reaction between the linker unit on the inhibitor and the gel. After tethering, the inhibition of MMPs diminishes to some extent, and we postulate that this arises from poor diffusion of the MMPs into the gels. When the tethered inhibitors were tested against chronic wound fluid obtained from patients, we observed over 40% inhibition of proteolytic activity, suggesting our approach may prove useful in rebalancing MMPs within chronic wounds.
Abstract:
In the past, high-order series expansion techniques have been used to study the nonlinear equations that govern the form of periodic Stokes waves moving steadily on the surface of an inviscid fluid. In the present study, two such series solutions are recomputed using exact arithmetic, eliminating any loss of accuracy due to accumulation of round-off error and allowing a much greater number of terms to be found with confidence. It is shown that the higher-order behaviour of series generated by the solution casts doubt over arguments that rely on estimating the series’ radius of convergence. Further, the exact nature of the series is used to shed light on the unusual convergence of higher-order Padé approximants near the highest wave. Finally, it is concluded that, provided exact values are used in the series, these Padé approximants prove very effective in successfully predicting three turning points in both the dispersion relation and the total energy.
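The combination of exact arithmetic and Padé approximants can be illustrated in miniature: with series coefficients held as exact rationals, the [L/M] Padé denominator follows from the standard linear system Σ_{j=0}^{M} b_j c_{L+k−j} = 0 (k = 1..M, b_0 = 1) with no round-off at any stage. The sketch below applies this to the exponential series, not to the Stokes-wave series of the paper.

```python
from fractions import Fraction

def solve_exact(A, rhs):
    """Gauss-Jordan elimination over Fractions: exact, no round-off."""
    n = len(rhs)
    aug = [row[:] + [r] for row, r in zip(A, rhs)]
    for col in range(n):
        piv = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(n):
            if r != col and aug[r][col] != 0:
                f = aug[r][col] / aug[col][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [aug[i][n] / aug[i][i] for i in range(n)]

def pade(c, L, M):
    """[L/M] Padé approximant of a series with exact coefficients c[0..L+M].

    Denominator b (with b_0 = 1) solves sum_{j=0}^{M} b_j c_{L+k-j} = 0
    for k = 1..M; numerator a_i = sum_{j=0}^{min(i,M)} b_j c_{i-j}."""
    A = [[c[L + k - j] if 0 <= L + k - j < len(c) else Fraction(0)
          for j in range(1, M + 1)] for k in range(1, M + 1)]
    rhs = [-c[L + k] for k in range(1, M + 1)]
    b = [Fraction(1)] + solve_exact(A, rhs)
    a = [sum(b[j] * c[i - j] for j in range(min(i, M) + 1))
         for i in range(L + 1)]
    return a, b

# Example: [2/2] Padé approximant of exp(x) from its exact Maclaurin series.
fact = [1]
for k in range(1, 6):
    fact.append(fact[-1] * k)
c = [Fraction(1, f) for f in fact]          # 1/k! as exact rationals
num, den = pade(c, 2, 2)
```

The result is the classical (1 + x/2 + x²/12)/(1 − x/2 + x²/12); with floating-point coefficients the same Toeplitz system becomes increasingly ill-conditioned as L and M grow, which is exactly why exact arithmetic matters for high-order approximants.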
Abstract:
Successful wound repair and normal turnover of the extracellular matrix rely on a balance between matrix metalloproteinases (MMPs) and their natural inhibitors (the TIMPs). When over-expression of MMPs and abnormally high levels of activation or low expression of TIMPs are encountered, excessive degradation of connective tissue and the formation of chronic ulcers can occur. One strategy to rebalance MMPs and TIMPs is to use inhibitors. We have designed a synthetic pseudopeptide inhibitor with an amine linker group, based on a known high-affinity peptidomimetic MMP inhibitor, and have demonstrated inhibition of MMP-1, -2, -3 and -9 activity in standard solutions. The inhibitor was also tethered to a polyethylene glycol hydrogel using a facile reaction between the linker unit on the inhibitor and the hydrogel precursors. After tethering, we observed inhibition of the MMPs, although there was an increase in the IC50 values, which we attribute to poor diffusion of the MMPs into the hydrogels, reduced activity of the tethered inhibitor, or incomplete incorporation of the inhibitor into the hydrogels. When the tethered inhibitors were tested against chronic wound fluid, we observed significant inhibition of proteolytic activity, suggesting our approach may prove useful in rebalancing MMPs within chronic wounds.