770 results for MERIT
Abstract:
Forecasting volatility has received a great deal of research attention, with the relative performances of econometric model-based and option implied volatility forecasts often being considered. While many studies find that implied volatility is the preferred approach, a number of issues remain unresolved, including the relative merit of combining forecasts and whether the relative performances of various forecasts are statistically different. By utilising recent econometric advances, this paper considers whether combination forecasts of S&P 500 volatility are statistically superior to a wide range of model-based forecasts and implied volatility. It is found that a combination of model-based forecasts is the dominant approach, indicating that implied volatility cannot simply be viewed as a combination of various model-based forecasts. Therefore, while often viewed as a superior volatility forecast, implied volatility is in fact an inferior forecast of S&P 500 volatility relative to model-based forecasts.
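The combination-forecast idea can be illustrated with a minimal sketch: equal-weight averaging versus regression (Granger-Ramanathan style) weights fitted by least squares. The function names and the synthetic evaluation are illustrative assumptions, not the paper's actual methodology:

```python
import numpy as np

def combine_forecasts(forecasts, realized):
    """Combine competing volatility forecasts.

    forecasts : (T, k) array with one column per model-based forecast.
    realized  : (T,) array of realized-volatility proxies.
    Returns (equal-weight combination, regression-weight combination).
    """
    # Simple equal-weight pooling of the k forecasts.
    equal_weight = forecasts.mean(axis=1)
    # Granger-Ramanathan style: regress realized volatility on the
    # forecasts (plus an intercept) and use the fitted values.
    X = np.column_stack([np.ones(len(realized)), forecasts])
    beta, *_ = np.linalg.lstsq(X, realized, rcond=None)
    return equal_weight, X @ beta

def mse(forecast, realized):
    """Mean squared forecast error."""
    return float(np.mean((forecast - realized) ** 2))
```

In-sample, the regression combination can do no worse than any single forecast it nests; whether a combination remains superior out-of-sample is exactly what statistical comparison tests of the kind used in the paper address.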
Abstract:
In teaching introductory economics there has been a tendency to put a lot of emphasis on imparting abstract models and technical skills to students (Stilwell, 2005; Voss, Blais, Greens, & Ahwesh, 1986). This model-building approach has the merit of preparing the grounding for students to pursue further studies in economics. However, in a business degree with only a small proportion of students majoring in economics, such an approach tends to alienate the majority of students transitioning from high school into university. Surveys in Europe and Australia found that students complained about the lack of relevance of economics courses to the real world and the over-reliance on abstract mathematical modelling (Kirman, 2001; Lewis & Norris, 1997; Siegfried & Round, 2000). BSB112 Economics 1 is one of the eight faculty core units in the Faculty of Business at QUT, with over 1000 students in each semester. In Semester 1, 2008, a new approach to teaching this unit was designed, aiming to achieve three inter-related objectives: (1) to provide business students with a first insight into economic thinking and language, (2) to integrate economic analysis with current Australian social, environmental and political issues, and (3) to cater for students with a wide range of academic needs. Strategies used to achieve these objectives included writing a new text which departs from traditional economics textbooks in important ways, integrating students' cultures in teaching and learning activities, and devising a new assessment format to encourage the development of research skills and applications rather than the reproduction of factual knowledge. This paper documents the strategies used in this teaching innovation, presents quantitative and qualitative evidence to evaluate this new approach, and suggests ways of further improvement.
Abstract:
Bioelectrical impedance analysis (BIA) is a method of body composition analysis first investigated in 1962 which has recently received much attention by a number of research groups. The reasons for this recent interest are its advantages (viz: inexpensive, non-invasive and portable) and also the increasing interest in the diagnostic value of body composition analysis. The concept utilised by BIA to predict body water volumes is the proportional relationship for a simple cylindrical conductor (volume ∝ length²/resistance), which allows the volume to be predicted from the measured resistance and length. Most of the research to date has measured the body's resistance to the passage of a 50 kHz AC current to predict total body water (TBW). Several research groups have investigated the application of AC currents at lower frequencies (e.g. 5 kHz) to predict extracellular water (ECW). However, all research to date using BIA to predict body water volumes has used the impedance measured at a discrete frequency or frequencies. This thesis investigates the variation of impedance and phase of biological systems over a range of frequencies and describes the development of a swept frequency bioimpedance meter which measures impedance and phase at 496 frequencies ranging from 4 kHz to 1 MHz. The impedance of any biological system varies with the frequency of the applied current. The graph of reactance vs resistance yields a circular arc, with the resistance decreasing with increasing frequency and the reactance increasing from zero to a maximum then decreasing to zero. Computer programs were written to analyse the measured impedance spectrum and determine the impedance, Zc, at the characteristic frequency (the frequency at which the reactance is a maximum). The fitted locus of the measured data was extrapolated to determine the resistance, Ro, at zero frequency; a value that cannot be measured directly using surface electrodes.
The explanation of the theoretical basis for selecting these impedance values (Zc and Ro) to predict TBW and ECW is presented. Studies were conducted on a group of normal healthy animals (n=42) in which TBW and ECW were determined by the gold standard of isotope dilution. The prediction quotients L²/Zc and L²/Ro (L=length) yielded standard errors of 4.2% and 3.2% respectively, and were found to be significantly better than previously reported, empirically determined prediction quotients derived from measurements at a single frequency. The prediction equations established in this group of normal healthy animals were applied to a group of animals with abnormally low fluid levels (n=20), and also to a group with an abnormal balance of extracellular to intracellular fluids (n=20). In both cases the equations using L²/Zc and L²/Ro accurately and precisely predicted TBW and ECW. This demonstrated that the technique developed using multiple frequency bioelectrical impedance analysis (MFBIA) can accurately predict both TBW and ECW in both normal and abnormal animals (with standard errors of the estimate of 6% and 3% for TBW and ECW respectively). Isotope dilution techniques were used to determine TBW and ECW in a group of 60 healthy human subjects (male and female, aged between 18 and 45). Whole body impedance measurements were recorded on each subject using the MFBIA technique and the correlations between body water volumes (TBW and ECW) and height²/impedance (for all measured frequencies) were compared. The prediction quotients H²/Zc and H²/Ro (H=height) again yielded the highest correlation with TBW and ECW respectively, with corresponding standard errors of 5.2% and 10%. The values of the correlation coefficients obtained in this study were very similar to those recently reported by others.
It was also observed that in healthy human subjects the impedance measured at virtually any frequency yielded correlations not significantly different from those obtained from the MFBIA quotients. This phenomenon has been reported by other research groups and emphasises the need to validate the technique by investigating its application in one or more groups with abnormalities in fluid levels. The clinical application of MFBIA was trialled and its capability of detecting lymphoedema (an excess of extracellular fluid) was investigated. The MFBIA technique was demonstrated to be significantly more sensitive (P<.05) in detecting lymphoedema than the current technique of circumferential measurements. MFBIA was also shown to provide valuable information describing the changes in the quantity of muscle mass of the patient during the course of the treatment. The determination of body composition (viz TBW and ECW) by MFBIA has been shown to be a significant improvement on previous bioelectrical impedance techniques. The merit of the MFBIA technique is evidenced in its accurate, precise and valid application in animal groups with a wide variation in body fluid volumes and balances. The multiple frequency bioelectrical impedance analysis technique developed in this study provides accurate and precise estimates of body composition (viz TBW and ECW) regardless of the individual's state of health.
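The arc-fitting step described above (fitting the circular reactance-vs-resistance locus and extrapolating Ro, which cannot be measured with surface electrodes) can be sketched with an algebraic circle fit. The Kasa least-squares fit below is an illustrative stand-in, not the thesis's actual fitting procedure:

```python
import numpy as np

def cole_fit(resistance, reactance):
    """Fit a circle to the measured impedance locus (reactance vs
    resistance) and extrapolate the quantities of interest: Ro (zero
    frequency), Rinf (infinite frequency) and the impedance magnitude
    Zc at the top of the arc (the reactance maximum).

    Uses the Kasa algebraic least-squares circle fit:
    x^2 + y^2 = 2a*x + 2b*y + c, centre (a, b), r^2 = c + a^2 + b^2.
    """
    x = np.asarray(resistance, dtype=float)
    y = np.asarray(reactance, dtype=float)
    A = np.column_stack([x, y, np.ones_like(x)])
    (a2, b2, c), *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
    a, b = a2 / 2.0, b2 / 2.0
    r = np.sqrt(c + a**2 + b**2)
    # The fitted arc crosses the resistance axis (reactance = 0) twice;
    # the larger crossing is the zero-frequency extrapolation Ro.
    half_chord = np.sqrt(max(r**2 - b**2, 0.0))
    r_zero = a + half_chord   # Ro: extrapolated zero-frequency resistance
    r_inf = a - half_chord    # Rinf: infinite-frequency resistance
    z_c = np.hypot(a, b + r)  # |Z| at the point of maximum reactance
    return r_zero, r_inf, z_c
```

With Ro and Zc in hand, the prediction quotients L²/Ro and L²/Zc from the abstract follow directly.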
Abstract:
Although rarely referred to in litigation in the years that have followed the Ipp Review Report, there may well be some merit in more frequent judicial reference to the NHMRC guidelines for medical practitioners on providing information to patients 2004.
Abstract:
Mandatory data breach notification has become a matter of increasing concern for law reformers. In Australia, this issue was recently addressed as part of a comprehensive review of privacy law conducted by the Australian Law Reform Commission (ALRC) which recommended a uniform national regime for protecting personal information applicable to both the public and private sectors. As in all federal systems, the distribution of powers between central and state governments poses problems for national consistency. In the authors’ view, a uniform approach to mandatory data breach notification has greater merit than a ‘jurisdiction specific’ approach epitomized by US state-based laws. The US response has given rise to unnecessary overlaps and inefficiencies as demonstrated by a review of different notification triggers and encryption safe harbors. Reviewing the US response, the authors conclude that a uniform approach to data breach notification is inherently more efficient.
Abstract:
For the first time in human history, large volumes of spoken audio are being broadcast, made available on the internet, archived, and monitored for surveillance every day. New technologies are urgently required to unlock these vast and powerful stores of information. Spoken Term Detection (STD) systems provide access to speech collections by detecting individual occurrences of specified search terms. The aim of this work is to develop improved STD solutions based on phonetic indexing. In particular, this work aims to develop phonetic STD systems for applications that require open-vocabulary search, fast indexing and search speeds, and accurate term detection. Within this scope, novel contributions are made within two research themes: firstly, accommodating phone recognition errors and, secondly, modelling uncertainty with probabilistic scores. A state-of-the-art Dynamic Match Lattice Spotting (DMLS) system is used to address the problem of accommodating phone recognition errors with approximate phone sequence matching. Extensive experimentation on the use of DMLS is carried out and a number of novel enhancements are developed that provide for faster indexing, faster search, and improved accuracy. Firstly, a novel comparison of methods for deriving a phone error cost model is presented to improve STD accuracy, resulting in up to a 33% improvement in the Figure of Merit. A method is also presented for drastically increasing the speed of DMLS search by at least an order of magnitude with no loss in search accuracy. An investigation is then presented of the effects of increasing indexing speed for DMLS, by using simpler modelling during phone decoding, with results highlighting the trade-off between indexing speed, search speed and search accuracy. The Figure of Merit is further improved by up to 25% using a novel proposal to utilise word-level language modelling during DMLS indexing.
Analysis shows that this use of language modelling can, however, be unhelpful or even disadvantageous for terms with a very low language model probability. The DMLS approach to STD involves generating an index of phone sequences using phone recognition. An alternative approach to phonetic STD is also investigated that instead indexes probabilistic acoustic scores in the form of a posterior-feature matrix. A state-of-the-art system is described and its use for STD is explored through several experiments on spontaneous conversational telephone speech. A novel technique and framework is proposed for discriminatively training such a system to directly maximise the Figure of Merit. This results in a 13% improvement in the Figure of Merit on held-out data. The framework is also found to be particularly useful for index compression in conjunction with the proposed optimisation technique, providing for a substantial index compression factor in addition to an overall gain in the Figure of Merit. These contributions significantly advance the state-of-the-art in phonetic STD, by improving the utility of such systems in a wide range of applications.
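The approximate phone sequence matching on which DMLS rests can be illustrated with a weighted edit distance. The sketch below is a generic minimal version: the function name, the flat insertion/deletion costs and the `sub_cost` callback are illustrative assumptions, not the thesis's actual cost model:

```python
def match_cost(target, candidate, sub_cost, ins_cost=1.0, del_cost=1.0):
    """Weighted edit distance between a target phone sequence and a
    candidate phone sequence drawn from the index.

    sub_cost(a, b) returns the cost of confusing phone a with phone b
    (0 when a == b); insertions and deletions get flat costs here,
    which is a simplification.
    """
    n, m = len(target), len(candidate)
    # d[i][j]: cheapest alignment of target[:i] with candidate[:j].
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = d[i - 1][0] + del_cost
    for j in range(1, m + 1):
        d[0][j] = d[0][j - 1] + ins_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(
                d[i - 1][j] + del_cost,   # phone missed by the recogniser
                d[i][j - 1] + ins_cost,   # spurious phone inserted
                d[i - 1][j - 1] + sub_cost(target[i - 1], candidate[j - 1]),
            )
    return d[n][m]
```

A term is then declared a putative hit when this cost falls below a threshold; in the work above, the substitution costs come from a trained phone error cost model rather than a flat penalty.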
Abstract:
Data collection using Autonomous Underwater Vehicles (AUVs) is increasing in importance within the oceanographic research community. Contrary to traditional moored or static platforms, mobile sensors require intelligent planning strategies to manoeuvre through the ocean. However, the ability to navigate to high-value locations and collect data with specific scientific merit is worth the planning efforts. In this study, we examine the use of ocean model predictions to determine the locations to be visited by an AUV, and aid in planning the trajectory that the vehicle executes during the sampling mission. The objectives are: a) to provide near-real time, in situ measurements to a large-scale ocean model to increase the skill of future predictions, and b) to utilize ocean model predictions as a component in an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. We present an algorithm designed to generate paths for AUVs to track a dynamically evolving ocean feature utilizing ocean model predictions. This builds on previous work in this area by incorporating the predicted current velocities into the path planning to assist in solving the 3-D motion planning problem of steering an AUV between two selected locations. We present simulation results for tracking a fresh water plume by use of our algorithm. Additionally, we present experimental results from field trials that test the skill of the model used as well as the incorporation of the model predictions into an AUV trajectory planner. These results indicate a modest, but measurable, improvement in surfacing error when the model predictions are incorporated into the planner.
Abstract:
Stereo vision is a method of depth perception, in which depth information is inferred from two (or more) images of a scene, taken from different perspectives. Applications of stereo vision include aerial photogrammetry, autonomous vehicle guidance, robotics, industrial automation and stereomicroscopy. A key issue in stereo vision is that of image matching, or identifying corresponding points in a stereo pair. The difference in the positions of corresponding points in image coordinates is termed the parallax or disparity. When the orientation of the two cameras is known, corresponding points may be projected back to find the location of the original object point in world coordinates. Matching techniques are typically categorised according to the nature of the matching primitives they use and the matching strategy they employ. This report provides a detailed taxonomy of image matching techniques, including area based, transform based, feature based, phase based, hybrid, relaxation based, dynamic programming and object space methods. A number of area based matching metrics as well as the rank and census transforms were implemented, in order to investigate their suitability for a real-time stereo sensor for mining automation applications. The requirements of this sensor were speed, robustness, and the ability to produce a dense depth map. The Sum of Absolute Differences matching metric was the least computationally expensive; however, this metric was the most sensitive to radiometric distortion. Metrics such as the Zero Mean Sum of Absolute Differences and Normalised Cross Correlation were the most robust to this type of distortion but introduced additional computational complexity. The rank and census transforms were found to be robust to radiometric distortion, in addition to having low computational complexity. They are therefore prime candidates for a matching algorithm for a stereo sensor for real-time mining applications. 
A number of issues came to light during this investigation which may merit further work. These include devising a means to evaluate and compare disparity results of different matching algorithms, and finding a method of assigning a level of confidence to a match. Another issue of interest is the possibility of statistically combining the results of different matching algorithms, in order to improve robustness.
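The census transform and winner-take-all matching discussed above can be sketched as follows. This is a naive, non-real-time illustration: the 3x3 window, wraparound border handling and the unsmoothed argmin disparity search are simplifying assumptions, not the report's implementation:

```python
import numpy as np

def census_transform(img, window=3):
    """Census transform: encode each pixel as a bit string recording
    whether each neighbour in the window is darker than the centre.
    Borders are handled by wraparound (np.roll), a simplification."""
    r = window // 2
    out = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out = (out << np.uint64(1)) | (neighbour < img).astype(np.uint64)
    return out

def hamming(a, b):
    """Element-wise Hamming distance between two arrays of census codes."""
    v = np.bitwise_xor(a, b)
    count = np.zeros_like(v)
    while np.any(v):
        count += v & np.uint64(1)
        v >>= np.uint64(1)
    return count

def disparity_map(left, right, max_disp, window=3):
    """Winner-take-all disparity search: for each left-image pixel,
    pick the disparity whose census codes match best."""
    cl = census_transform(left, window)
    cr = census_transform(right, window)
    h, w = left.shape
    costs = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        costs[d, :, d:] = hamming(cl[:, d:], cr[:, :w - d])
    return np.argmin(costs, axis=0)
```

Because the census transform depends only on the ordering of intensities within the window, this matching cost inherits the robustness to radiometric distortion noted in the report.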
Abstract:
The merits of a research project are commonly framed in terms of perceived benefits with respect to knowledge production, wellbeing, the social good, and so on. Such measures can, however, be at odds with certain types of creative practice, which may be perceived as frivolous, unsettling, or shocking. Moreover, creative practice research methodologies commonly eschew more traditional research conventions. In exploring these tensions, this live performance event (including a DVD component) adapted key dramatic principles developed in Geoffrey Robertson's groundbreaking Hypotheticals. The event was presented for an audience of staff and students at QUT's Creative Industries Faculty in July 2010. It confirmed Dr Angela Romano's contention that: “Part of the ethical clearance process for practice-led researchers will be to find a language to explain the methodology, significance, merit and integrity of their research to people outside their field of practice.” (Angela Romano, QUT Creative Industries)
Abstract:
The emergence of ePortfolios is relatively recent in the university sector as a way to engage students in their learning and assessment, and to produce records of their accomplishments. An ePortfolio is an online tool that students can utilise to record, catalogue, retrieve and present reflections and artefacts that support and demonstrate the development of graduate students’ capabilities and professional standards across university courses. The ePortfolio is therefore considered as both process and product. Although ePortfolios show promise as a useful tool and their uptake has grown, they are not yet a mainstream higher education technology. To date, the emphasis has been on investigating their potential to support the multiple purposes of learning, assessment and employability, but less is known about whether and how students engage with ePortfolios in the university setting. This thesis investigates student engagement with an ePortfolio in one university. As the educational designer for the ePortfolio project at the University, I was uniquely positioned as a researching professional to undertake an inquiry into whether students were engaging with the ePortfolio. The participants in this study were a cohort (defined by enrolment in a unit of study) of second and third year education students (n=105) enrolled in a four year Bachelor of Education degree. The students were introduced to the ePortfolio in an introductory lecture and a hands-on workshop in a computer laboratory. They were subsequently required to complete a compulsory assessment task – a critical reflection - using the ePortfolio. Following that, engagement with the ePortfolio was voluntary. A single case study approach arising from an interpretivist paradigm directed the methodological approach and research design for this study. 
The study investigated the participants’ own accounts of their experiences with the ePortfolio, including how and when they engaged with the ePortfolio and the factors that impacted on their engagement. Data collection methods consisted of an attitude survey, student interviews, document collection, a researcher reflective journal and researcher observations. The findings of the study show that, while the students were encouraged to use the ePortfolio as a learning and employability tool, most students ultimately chose to disengage after completing the assessment task. Only six of the forty-five students (13%) who completed the research survey had used the ePortfolio in a sustained manner. The data obtained from the students during this research has provided insight into reasons why they disengaged from the ePortfolio. The findings add to the understandings and descriptions of student engagement with technology, and more broadly, advance the understanding of ePortfolios. These findings also contribute to the interdisciplinary field of technology implementation. There are three key outcomes from this study: a model of student engagement with technology, a set of criteria for the design of an ePortfolio, and a set of recommendations for effective practice for those implementing ePortfolios. The first, the Model of Student Engagement with Technology (MSET) (Version 2), explored student engagement with technology by highlighting key engagement decision points for students. The model was initially conceptualised by building on the work of previous research (Version 1); however, following data analysis a new model emerged, MSET (Version 2).
The engagement decision points were identified as:
• Prior Knowledge and Experience, leading to imagined usefulness and imagined ease of use;
• Initial Supported Engagement, leading to supported experience of usefulness and supported ease of use;
• Initial Independent Engagement, leading to actual experience of independent usefulness and actual ease of use; and
• Ongoing Independent Engagement, leading to ongoing experience of usefulness and ongoing ease of use.
The Model of Student Engagement with Technology (MSET) goes beyond numerical figures of usage to demonstrate student engagement with an ePortfolio. The explanatory power of the model is based on the identification of the types of decisions that students make and when they make them during the engagement process. This model presents a greater depth of understanding of student engagement than was previously available and has implications for the direction and timing of future implementation, and academic and student development activities. The second key outcome from this study is a set of criteria for the re-conceptualisation of the University ePortfolio. The knowledge gained from this research has resulted in a new set of design criteria that focus on the student actions of writing reflections and adding artefacts. The process of using the ePortfolio is reconceptualised in terms of privileging student learning over administrative compliance. The focus of the ePortfolio is that the writing of critical reflections is the key function, not the selection of capabilities. The third key outcome from this research consists of five recommendations for university practice that have arisen from this study.
They are: that sustainable implementation is more often achieved through small steps building on one another; that a clear definition of the purpose of an ePortfolio is crucial for students and staff; that ePortfolio pedagogy, not the technology, should be the driving force; that the merit of the ePortfolio is fostered in students and staff; and finally, that supporting delayed task performance is crucial. Students do not adopt an ePortfolio just because it is provided. While students must accept responsibility for their own engagement with the ePortfolio, the institution has to accept responsibility for providing the environment, and technical and pedagogical support to foster engagement. Ultimately, an ePortfolio should be considered as a joint venture between student and institution where strong returns on investment can be realised by both. It is acknowledged that the current implementation strategies for the ePortfolio are just the beginning of a much longer process. The real rewards for students, academics and the university lie in the future.
Abstract:
One of the impediments to large-scale use of wind generation within power systems is its variable and uncertain real-time availability. Due to the low marginal cost of wind power, its output will change the merit order of power markets and influence the Locational Marginal Price (LMP). At large scales of wind penetration, LMP calculation cannot ignore the essentially variable and uncertain nature of wind power. This paper proposes an algorithm to estimate LMP. The estimation result of a conventional Monte Carlo simulation is taken as a benchmark to examine accuracy. A case study is conducted on a simplified SE Australian power system, and the simulation results show the feasibility of the proposed method.
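How zero-marginal-cost wind shifts the merit order, and hence the clearing price, can be illustrated with a minimal single-node dispatch sketch. The generator data, the lossless unconstrained market and the plain Monte Carlo averaging are illustrative assumptions; an actual LMP calculation with network constraints is considerably more involved:

```python
def clearing_price(demand_mw, wind_mw, generators):
    """Merit-order dispatch at a single node, lossless and unconstrained.

    generators: iterable of (capacity_mw, marginal_cost) tuples.
    Wind bids in at zero marginal cost, so it is dispatched first and
    shifts every conventional unit down the merit order.
    """
    residual = max(demand_mw - wind_mw, 0.0)
    if residual == 0.0:
        return 0.0  # wind alone covers demand
    for capacity, cost in sorted(generators, key=lambda g: g[1]):
        residual -= capacity
        if residual <= 0.0:
            return cost  # this unit is marginal and sets the price
    raise ValueError("demand exceeds total installed capacity")

def expected_price(demand_mw, wind_samples, generators):
    """Plain Monte Carlo estimate of the expected clearing price under
    uncertain wind availability: average over sampled wind outputs."""
    prices = [clearing_price(demand_mw, w, generators) for w in wind_samples]
    return sum(prices) / len(prices)
```

Averaging over sampled wind outputs in this way is loosely analogous to the conventional Monte Carlo benchmark the paper compares against.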
Abstract:
This article sets the context for this special themed issue on the 'Korean digital wave' by considering the symbiotic relationship between digital technologies, their techniques and practices, their uses and the affordances they provide, and Korea's 'compressed modernity' and swift industrialisation. It underscores the importance of interrogating a range of groundbreaking developments and innovations within Korea's digital mediascapes, and its creative and cultural industries, in order to gain a complex understanding of one of Australia's most significant export markets and trading partners. Given the financial and political commitment in Australia to a high-speed broadband network that aims to stimulate economic and cultural activity, recent technological developments in Korea, and the double-edged role played by government policy in shaping the 'Korean digital wave', merit close attention from media and communications scholars.
Abstract:
This thesis is an ethical and empirical exploration of the late discovery of genetic origins in two contexts, adoption and sperm donor-assisted conception. This exploration has two interlinked strands of concern. The first is the identification of ‘late discovery’ as a significant issue of concern, deserving of recognition and acknowledgment. The second concerns the ethical implications of late discovery experiences for the welfare of the child. The apparently simple act of recognition of a phenomenon is a precondition to any analysis and critique of it. This is especially important when the phenomenon arises out of social practices that arouse significant debate in ethical and legal contexts. As the new reproductive technologies and some adoption practices remain highly contested, an ethical exploration of this long neglected experience has the potential to offer new insights and perspectives in a range of contexts. It provides an opportunity to revisit developmental debate on the relative merit or otherwise of biological versus social influences, from the perspective of those who have lived this dichotomy in practice. Their experiences are the human face of the effects arising from decisions taken by others to intentionally separate their biological and social worlds, an action which has then been compounded by family and institutional secrecy from birth. This has been accompanied by a failure to ensure that normative standards and values are upheld for them. Following discovery, these factors can be exacerbated by a lack of recognition and acknowledgement of their concerns by family, friends, community and institutions. Late discovery experiences offer valuable insights to inform discussions on the ethical meanings of child welfare, best interests, parental responsibility, duty of care and child identity rights in this and other contexts.
They can strengthen understandings of what factors are necessary for a child to be able to live a reasonably happy or worthwhile life.
Abstract:
This article describes a follow-up study of 232 individuals who underwent psychiatric assessment by a Criminal Justice Mental Health Team (CJMHT) in 2001/2002, and also draws upon in-depth interviews conducted with 26 of the cohort. At assessment many people are identified with substance misuse problems, as homeless and with a history of psychiatric contact but in the main their problems are of insufficient severity to merit diversion to psychiatric hospital. The study mapped service contact, housing and offending in the 12 months following assessment and compared this to the 12 months prior to assessment, and found increased levels of service contact but also increased levels of offending and no decrease in homelessness. Thus assessment by the CJMHT brought few discernible advantages for the majority of clients. This was also the perception of the 26 clients who were interviewed. Their own perceptions of their lifestyle and the support that they deemed most valuable are described to identify means of enhancing the efficacy of court assessment.
Abstract:
This paper presents an evaluation of an instrument to measure teachers’ attitudes towards reporting child sexual abuse and discusses the instrument’s merit for research into reporting practice. Based on responses from 444 Australian teachers, the Teachers’ Reporting Attitude Scale for Child Sexual Abuse (TRAS-CSA) was evaluated using exploratory factor analysis. The scale isolated three dimensions: commitment to the reporting role; confidence in the system’s response to reports; and concerns about reporting. These three factors accounted for 37.5% of the variance in the 14-item measure. Alpha coefficients for the subscales were 0.769 (commitment), 0.617 (confidence), and 0.661 (concerns). The findings provide insights into the complexity of studying teachers’ attitudes towards reporting of child sexual abuse, and have implications for future research.
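The alpha coefficients reported for the subscales follow the standard Cronbach formula; a minimal sketch (the respondents-by-items matrix layout is an assumption about how the survey data would be arranged) is:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k / (k - 1) * (1 - sum(item variances) / variance(total score)).
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)
```

Values near 1 indicate high internal consistency; subscale values in the 0.6-0.8 range, as reported above, are commonly read as moderate-to-acceptable reliability.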