Abstract:
With the emergence of patient-centered care, consumers are becoming more effective managers of their care—in other words, “effective consumers.” To support patients in becoming effective consumers, a number of strategies to translate knowledge to action (KTA) have been used with varying success. A KTA framework can help researchers and implementers frame, plan, and evaluate knowledge translation activities, and can potentially lead to more successful activities. This article briefly describes the KTA framework and its use by a team based at the University of Ottawa to translate evidence-based knowledge to consumers. Using the framework, tailored consumer summaries, decision aids, and a scale to measure consumer effectiveness were created in collaboration with consumers. Strategies to translate the products into action were then selected and implemented. Evaluation of the knowledge tools and products indicates that they are useful to consumers. Research is currently under way to monitor the use of these products, and future research is planned to evaluate the effect of using the knowledge on health outcomes. The KTA framework provides a useful and valuable approach to knowledge translation.
Abstract:
As the popularity of video as an information medium rises, the amount of video content that we produce and archive keeps growing. This creates a demand for shorter representations of videos to assist the task of video retrieval. The traditional solution is to let humans watch these videos and write textual summaries based on what they saw. This summarisation process, however, is time-consuming, and much of the useful audio-visual information contained in the original video can be lost. Video summarisation aims to turn a full-length video into a more concise version that preserves as much information as possible. The core problem is to balance the trade-off between how concise and how representative a summary is, while also addressing the usability concerns of a video summarisation scheme. To solve these problems, this research aims to create an automatic video summarisation framework that combines and improves on existing video summarisation techniques, with a focus on practicality and user satisfaction. We also investigate the need for different summarisation strategies for different kinds of videos, for example news, sports, or TV series. Finally, we develop a video summarisation system based on the framework, which is validated by subjective and objective evaluation. The evaluation results show that the proposed framework is effective for creating video skims, producing high user satisfaction rates with reasonably low computing requirements. We also demonstrate that the techniques presented in this research can be used to visualise video summaries in the form of web pages showing various useful information, both from the video itself and from external sources.
Abstract:
Multivariate volatility forecasts are an important input in many financial applications, in particular portfolio optimisation problems. Given the number of models available and the range of loss functions to discriminate between them, selecting the optimal forecasting model is clearly challenging. The aim of this thesis is to thoroughly investigate how effective commonly used statistical (MSE and QLIKE) and economic (portfolio variance and portfolio utility) loss functions are at discriminating between competing multivariate volatility forecasts. An analytical investigation of the loss functions is performed to determine whether they identify the correct forecast as the best forecast. This is followed by an extensive simulation study that examines the ability of the loss functions to consistently rank forecasts, and their statistical power within tests of predictive ability. For the tests of predictive ability, the model confidence set (MCS) approach of Hansen, Lunde and Nason (2003, 2011) is employed. In addition, an empirical study investigates whether the simulation findings hold in a realistic setting. In light of these earlier studies, a major empirical study seeks to identify the set of superior multivariate volatility forecasting models from 43 models that use either daily squared returns or realised volatility to generate forecasts. This study also assesses how the choice of volatility proxy affects the ability of the statistical loss functions to discriminate between forecasts. Analysis of the loss functions shows that QLIKE, MSE and portfolio variance can discriminate between multivariate volatility forecasts, while portfolio utility cannot. An examination of the effective loss functions shows that they can all identify the correct forecast at a point in time; however, their ability to discriminate between competing forecasts varies.
That is, QLIKE is identified as the most effective loss function, followed by portfolio variance which is then followed by MSE. The major empirical analysis reports that the optimal set of multivariate volatility forecasting models includes forecasts generated from daily squared returns and realised volatility. Furthermore, it finds that the volatility proxy affects the statistical loss functions’ ability to discriminate between forecasts in tests of predictive ability. These findings deepen our understanding of how to choose between competing multivariate volatility forecasts.
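As a rough sketch of the two statistical loss functions compared above (a minimal illustration, not the thesis's code; the covariance matrices are invented for the example), the Frobenius-norm MSE and the standard multivariate QLIKE loss, log|H| + tr(H⁻¹Σ), can be written as:

```python
import numpy as np

def mse_loss(sigma, h):
    """Frobenius-norm MSE between the volatility proxy sigma and forecast h."""
    return float(np.sum((sigma - h) ** 2))

def qlike_loss(sigma, h):
    """Multivariate QLIKE: log|H| + tr(H^{-1} Sigma); minimised when H = Sigma."""
    _, logdet = np.linalg.slogdet(h)
    return float(logdet + np.trace(np.linalg.solve(h, sigma)))

# Toy 2-asset example: the proxy itself versus a deliberately wrong forecast
sigma = np.array([[1.0, 0.3],
                  [0.3, 1.0]])
wrong = 2.0 * np.eye(2)
```

Both losses rank the correct forecast ahead of the wrong one here; QLIKE's asymmetric penalty for under-predicted variance is often cited as a reason it discriminates between forecasts more sharply than MSE.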
Abstract:
Individuals, community organisations and industry have always been involved to varying degrees in efforts to address the Queensland road toll. Traditionally, road crash prevention efforts have been led by state and local government organisations. While community and industry groups have sometimes become involved (e.g. the Driver Reviver campaign), their efforts have largely been uncoordinated and under-resourced. A common strength of these initiatives lies in the energy, enthusiasm and persistence of community-based efforts. Conversely, a weakness has sometimes been the lack of knowledge, awareness or prioritisation of evidence-based interventions, or the capacity to build on collaborative efforts. In 2000, the Queensland University of Technology’s Centre for Accident Research and Road Safety – Queensland (CARRS-Q) identified this issue as an opportunity to bridge practice and research and began acknowledging a selection of these initiatives, in partnership with the RACQ, through the Queensland Road Safety Awards program. After nine years it became apparent there was a need to strengthen this connection, and the Centre established a Community Engagement Workshop in 2009 as part of the overall Awards program. With the aim of providing community participants with opportunities to see, hear and discuss the experiences of others, this event was further developed in 2010, and with the collaboration of the Queensland Department of Transport and Main Roads, the RACQ, the Queensland Police Service and Leighton Contractors Pty Ltd, a stand-alone Queensland Road Safety Awards Community Engagement Workshop was held that year. Each collaborating organisation recognised a need to mobilise the community through effective information and knowledge sharing, and recognised that learning and discussion can drive lasting behaviour change and action in this often emotive, yet not always evidence-based, area.
This free event featured a number of speakers representing successful projects from around Australia and overseas. Attendees were encouraged to interact with the speakers, to ask questions and, most importantly, to build connections with other attendees, creating a ‘community road safety army’ working throughout Australia on projects underpinned by evaluated research. The workshop facilitated the integration of research, policy and grass-roots action, enhancing the success of community road safety initiatives. For the collaboration partners, the event enabled them to transfer their knowledge through an engaged, more personal communication process. An analysis of the success factors for this event identified openness to community groups and individuals, relevance of content to local initiatives, generous support through the provision of online materials, and ongoing communication with key staff members as critical. This supports the view that the university can directly provide both the leadership and the research needed for effective and credible community-based initiatives to address injury and death on the roads.
Abstract:
Evasive change-of-direction manoeuvres (agility skills) are a fundamental ability in rugby union. In this study, we explored the attributes of agility skill execution as they relate to effective attacking strategies in rugby union. Seven Super 14 games were coded using variables that assessed team patterns and individual movement characteristics during attacking ball carries. The results indicated that tackle-breaks are a key determinant of try-scoring ability and team success in rugby union. The ability of the attacking ball carrier to receive the ball at high speed with at least two body lengths from the defence line against an isolated defender promoted tackle-breaks. Furthermore, the execution of a side-step evasive manoeuvre at a change of direction angle of 20–60° and a distance of one to two body lengths from the defence, and then straightening the running line following the initial direction change at an angle of 20–60°, was associated with tackle-breaks. This study provides critical insight regarding the attributes of agility skill execution that are associated with effective ball carries in rugby union.
Abstract:
Purpose: This paper provides a selective annotated bibliography summarising journal articles that have applied either the theory of reasoned action or the theory of planned behaviour to circumstances relevant to business activities. Design/methodology/approach: Searches were conducted on the EBSCO Host and ProQuest databases to identify papers that had used either the theory of reasoned action or the theory of planned behaviour in their methodology. The bibliography is separated into three categories: financial decision making, strategic decision making, and professional decision making. Implications: The information presented in this paper is intended to assist and facilitate further research by broadening awareness of the literature and providing examples of how the theories have been applied in prior research.
Abstract:
This study investigated how the interpretation of mathematical problems by Year 7 students impacted on their ability to demonstrate what they can do in NAPLAN numeracy testing. In the study, mathematics is viewed as a culturally and socially determined system of signs and signifiers that establish the meaning, origins and importance of mathematics. The study hypothesises that students are unable to succeed in NAPLAN numeracy tests because they cannot interpret the questions, even though they may be able to perform the necessary calculations. To investigate this, the study applied contemporary theories of literacy to the context of mathematical problem solving. A case study design with multiple methods was used. The study used a correlation design to explore the connections between NAPLAN literacy and numeracy outcomes of 198 Year 7 students in a Queensland school. Additionally, qualitative methods provided a rich description of the effect of the various forms of NAPLAN numeracy questions on the success of ten Year 7 students in the same school. The study argues that there is a quantitative link between reading and numeracy. It illustrates that interpretation (literacy) errors are the most common error type in the selected NAPLAN questions, made by students of all abilities. In contrast, conceptual (mathematical) errors are less frequent amongst more capable students. This has important implications in preparing students for NAPLAN numeracy tests. The study concluded by recommending that increased focus on the literacies of mathematics would be effective in improving NAPLAN results.
Abstract:
This study examines a dialogue process managers can use to explore community attitudes. The objectives of the research are threefold: first, to develop a dialogue process that engages community audiences on climate mitigation strategies; second, to understand participants’ perspectives and potential reactions, in particular to underground storage of CO2, and to determine the strategies that most effectively engage people in dialogue so that the climate change debate can move forward; and finally, to develop a dialogue process that managers can use on other politically sensitive topics. Knowledge of the dynamics of psychosocial relationships and communication between stakeholders contributed to an increased understanding of the issues. The key findings of this study indicate that the public can be engaged in dialogue on the issue of CO2 capture and storage and low emission technologies without engendering adverse reactions. The dialogue process is critical to participants’ engagement and led to behaviour change in energy use.
Abstract:
This study explores the relationship between new venture team composition and new venture persistence and performance over time. We examine the team characteristics of a 5-year panel study of 202 new venture teams and new venture performance. Our study makes two contributions. First, we extend earlier research concerning homophily theories of the prevalence of homogeneous teams. Using structural event analysis we demonstrate that team members’ start-up experience is important in this context. Second, we attempt to reconcile conflicting evidence concerning the influence of team homogeneity on performance by considering the element of time. We hypothesize that higher team homogeneity is positively related to short term outcomes, but is less effective in the longer term. Our results confirm a difference over time. We find that more homogeneous teams are less likely to be higher performing in the long term. However, we find no relationship between team homogeneity and short-term performance outcomes.
Abstract:
This thesis explores the business environment for self-publishing musicians at the end of the 20th century and the start of the 21st century from theoretical and empirical standpoints. The exploration begins by asking three research questions: what are the factors affecting the sustainability of an Independent music business; how many of those factors can be directly influenced by an Independent musician in the day-to-day operations of their musical enterprise; and how can those factors best be manipulated to maximise the benefit generated from digital music assets? It answers these questions by considering the nature of value in the music business in light of theories of political economy, then through quantitative and qualitative examinations of the nature of participation in the music business, and finally through auto-ethnographic approaches to the application of two technologically enabled tools available to Independent musicians. By analysing the results of five different examinations of the topic, it answers each research question with reference to four sets of recurring issues that affect the operations of a 21st century music business: the musicians’ personal characteristics; their ability to address their business’s informational needs; their ability to manage the relationships upon which their business depends; and their ability to resolve the remaining technological problems that confront them. It discusses ways in which Independent self-publishing musicians can and cannot deal with these four issues on a day-to-day basis and highlights aspects for which technological solutions do not exist, as well as ways in which technology is not as effective as has been claimed. It then presents a self-critique and proposes some directions for further study before concluding by suggesting some common features of 21st century Independent music businesses. This thesis makes three contributions to knowledge.
First, it provides a new understanding of the sources of musical value, shows how this explains changes in the music industries over the past 30 years, and provides a framework for predicting future developments in those industries. Second, it shows how the technological discontinuity that has occurred around the start of the 21st century has and has not affected the production and distribution of digital cultural artefacts and thus the attitudes, approaches, and business prospects of Independent musicians. Third, it argues for new understandings of two methods by which self-publishing musicians can grow a business using production methods that are only beginning to be more broadly understood: home studio recording and fan-sourced production. Developed from the perspective of working musicians themselves, this thesis identifies four sets of issues that determine the probable success of musicians’ efforts to adopt new technologies to capture the value of the musicians’ creativity and thereby foster growth that will sustain an Independent music business in the 21st century.
Abstract:
Intuitively, any ‘bag of words’ approach in IR should benefit from taking term dependencies into account. Unfortunately, for years the results of exploiting such dependencies have been mixed or inconclusive. To improve the situation, this paper shows how the natural language properties of the target documents can be used to transform and enrich the term dependencies into more useful statistics. This is done in three steps. First, the term co-occurrence statistics of queries and documents are each represented by a Markov chain. The paper proves that such a chain is ergodic, and therefore its asymptotic behavior is unique, stationary, and independent of the initial state. Next, the stationary distribution is taken to model queries and documents, rather than their initial distributions. Finally, ranking is achieved following the customary language modeling paradigm. The main contribution of this paper is to argue why the asymptotic behavior of the document model is a better representation than the document’s initial distribution alone. A secondary contribution is to investigate the practical application of this representation as queries become increasingly verbose. In the experiments (based on Lemur’s search engine substrate) the default query model was replaced by the stable distribution of the query. Modeling the query this way already resulted in significant improvements over a standard language model baseline. The results were on a par with or better than more sophisticated algorithms that use fine-tuned parameters or extensive training. Moreover, the more verbose the query, the more effective the approach appears to become.
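The three-step pipeline in the abstract can be sketched in a few lines; the vocabulary and co-occurrence counts below are invented purely for illustration, and the stationary distribution is obtained by simple power iteration rather than by whatever estimator the paper itself uses:

```python
import numpy as np

terms = ["model", "language", "query"]  # toy vocabulary

# Hypothetical term co-occurrence counts (not from the paper)
C = np.array([[4.0, 2.0, 1.0],
              [2.0, 3.0, 2.0],
              [1.0, 2.0, 5.0]])

# Row-normalise the counts into a Markov transition matrix.
# All entries are positive, so the chain is ergodic and has a
# unique stationary distribution independent of the start state.
P = C / C.sum(axis=1, keepdims=True)

# Power iteration: pi <- pi P converges to the stationary distribution.
pi = np.full(len(terms), 1.0 / len(terms))
for _ in range(200):
    pi = pi @ P

print(dict(zip(terms, np.round(pi, 3))))
```

The resulting `pi` would then stand in for the raw (initial) term distribution in the usual query-likelihood ranking.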