212 results for GRADIENT TEST


Relevance:

20.00%

Publisher:

Abstract:

While in many travel situations consumers have an almost limitless range of destinations to choose from, their actual decision set will usually only comprise between two and six destinations. One of the greatest challenges facing destination marketers is positioning their destination, against the myriad of competing places that offer similar features, into consumer decision sets. Since positioning requires a narrow focus, marketing communications must present a succinct and meaningful proposition, the selection of which is often problematic for destination marketing organisations (DMOs), which deal with a diverse and often eclectic range of attributes in addition to numerous self-interested and demanding stakeholders. This paper reports the application of two qualitative techniques used to explore the range of cognitive attributes, consequences and personal values that represent potential positioning opportunities in the context of short break holidays. The Repertory Test is an effective technique for understanding the salient attributes used by a traveller to differentiate destinations, while Laddering Analysis enables the researcher to explore the smaller set of personal values guiding such decision making. A key finding of the research was that while individuals might vary in their repertoire of salient attributes, there was a commonality of shared consequences and values.

Relevance:

20.00%

Publisher:

Abstract:

Process modeling is an emergent area of Information Systems research that is characterized by an abundance of conceptual work and little empirical research. To fill this gap, this paper reports on the development and validation of an instrument to measure user acceptance of process modeling grammars. We advance an extended model for a multi-stage measurement instrument development procedure, which incorporates feedback from both expert and user panels. We identify two main contributions. First, we provide a validated measurement instrument for the study of user acceptance of process modeling grammars, which can be used to assist further empirical studies that investigate phenomena associated with the business process modeling domain. Second, in doing so, we describe in detail a procedural model for developing measurement instruments that ensures high levels of reliability and validity, which may assist fellow scholars in executing their empirical research.

Relevance:

20.00%

Publisher:

Abstract:

It is important to understand how student performance is affected when higher education is delivered via new technology. Podcasting is a relatively recent technology gaining widespread use across the world. We present the results of a quasi-experimental research project which finds that when podcasts are used as a revision tool, student performance in Accounting improves. We highlight that aligning podcast use with pedagogical design is important and discuss constraints on and barriers to the use of podcasting in higher education.

Relevance:

20.00%

Publisher:

Abstract:

The development and use of a virtual assessment tool for a signal processing unit is described. It allows students to take a test from anywhere, using a web browser to connect to the university server that hosts the test. While student responses are of the multiple-choice type, students have to work out problems to arrive at the answer to be entered. CGI programming is used to verify student identification information and record scores, as well as to provide immediate feedback once the test is complete. The tool has been used at QUT for the past three years, and student feedback is discussed. The virtual assessment tool is an efficient alternative to marking written assignment reports, which can often take a lecturer or tutor more hours than actual lecture hall contact. It is especially attractive for the very large first- and second-year classes that are now the norm at many universities.
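
The abstract does not give implementation details beyond the use of CGI scripts on the university server. The following is a minimal Python sketch of the server-side logic it describes (identity check, marking of multiple-choice responses, score logging, immediate feedback); the answer key, student ID list, CSV log and function names are illustrative placeholders, not the actual tool.

```python
import csv
from datetime import datetime, timezone

# Illustrative answer key and enrolment list; the real tool's data layout is not described.
ANSWER_KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}
ENROLLED_IDS = {"n1234567", "n7654321"}

def grade_submission(student_id: str, responses: dict) -> dict:
    """Verify the student ID, mark multiple-choice responses and build feedback."""
    if student_id not in ENROLLED_IDS:
        return {"ok": False, "message": "Unknown student identification."}
    score = sum(1 for q, key in ANSWER_KEY.items() if responses.get(q) == key)
    feedback = {q: ("correct" if responses.get(q) == key else f"incorrect (expected {key})")
                for q, key in ANSWER_KEY.items()}
    return {"ok": True, "score": score, "total": len(ANSWER_KEY), "feedback": feedback}

def record_score(student_id: str, result: dict, path: str = "scores.csv") -> None:
    """Append the score to a CSV log, analogous to the server-side record keeping."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now(timezone.utc).isoformat(),
                                student_id, result["score"], result["total"]])

if __name__ == "__main__":
    result = grade_submission("n1234567", {"Q1": "B", "Q2": "C", "Q3": "A"})
    if result["ok"]:
        record_score("n1234567", result)
    print(result)
```

In a CGI deployment the same two functions would simply be called from the request handler, with the responses parsed from the submitted form and the feedback rendered back to the browser.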

Relevance:

20.00%

Publisher:

Abstract:

Increasingly, large amounts of public and private money are being invested in education and, as a result, schools are becoming more accountable to stakeholders for this financial input. In terms of the curriculum, governments worldwide are frequently tying school funding to students' and schools' academic performances, which are monitored through high-stakes testing programs. To accommodate the resultant pressures from these testing initiatives, many principals are re-focussing their school's curriculum on the testing requirements. Such a re-focussing, which was examined critically in this thesis, constituted an externally facilitated rapid approach to curriculum change. In line with previously enacted change theories and recommendations from these, curriculum change in schools has tended to be a fairly slow, considered, collaborative process that is facilitated internally by a deputy-principal (curriculum). However, theoretically based research has shown that such a process has often proved to be difficult and very rarely successful. The present study reports and theorises the experiences of an externally facilitated process that emerged from a practitioner model of change. This case study of the development of the controlled rapid approach to curriculum change began by establishing the reasons three principals initiated curriculum change and why they then engaged an outsider to facilitate the process. It also examined this particular change process from the perspectives of the research participants. The investigation led to the revision of the practitioner model as used in the three schools and challenged current thinking about the process of school curriculum change. The thesis aims to offer principals and the wider education community an alternative model for consideration when undertaking curriculum change. Finally, the thesis warns that, in the longer term, the application of the study's revised model (the Controlled Rapid Approach to Curriculum Change [CRACC] Model) may have less than desirable educational consequences.

Relevance:

20.00%

Publisher:

Abstract:

Burnout has been identified as a significant factor in HIV/AIDS volunteering. It has been associated with depression, anxiety and the loss of volunteers from the health care delivery system. The aim of this study was to test the independence of the health and motivational processes hypothesized within the Job Demands-Resources model of burnout in HIV/AIDS volunteers. Participants were 307 HIV/AIDS volunteers from state AIDS Councils throughout Australia who completed self-report measures pertaining to role ambiguity and role conflict, social support, burnout, intrinsic and organizational satisfaction, and depression. Findings suggested that the independence of the dual processes hypothesized by the model was only partially supported. The findings provide a model of burnout that offers a framework for interventions at both the individual and organizational levels, which could contribute to the prevention of burnout, depression, and job dissatisfaction in HIV/AIDS volunteers.
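
In the Job Demands-Resources framework, the health-impairment path typically runs from demands (here role ambiguity and role conflict) through burnout to strain outcomes such as depression, while the motivational path runs from resources (here social support) to outcomes such as satisfaction. As a rough illustration of what "independence" of the two paths means, the sketch below compares within-path and cross-path associations using simple standardised slopes; the composite scores and synthetic placeholder data are assumptions for demonstration only and the study itself will have used its validated scales and a more complete modelling approach.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 307  # sample size reported in the abstract

# Placeholder scores standing in for the self-report scales; purely synthetic.
demands = rng.normal(size=n)                         # role ambiguity / role conflict
resources = rng.normal(size=n)                       # social support
burnout = 0.6 * demands + rng.normal(scale=0.8, size=n)
outcome_health = 0.5 * burnout + rng.normal(scale=0.9, size=n)        # depression
outcome_motivation = 0.5 * resources + rng.normal(scale=0.9, size=n)  # satisfaction

def beta(y, x):
    """Standardised simple-regression slope (equal to the Pearson correlation)."""
    return np.corrcoef(x, y)[0, 1]

# Health-impairment path: demands -> burnout -> depression
print("demands -> burnout:       ", round(beta(burnout, demands), 2))
print("burnout -> depression:    ", round(beta(outcome_health, burnout), 2))

# Motivational path: resources -> satisfaction
print("resources -> satisfaction:", round(beta(outcome_motivation, resources), 2))

# Cross-paths; values near zero would support the independence of the two processes.
print("demands -> satisfaction:  ", round(beta(outcome_motivation, demands), 2))
print("resources -> depression:  ", round(beta(outcome_health, resources), 2))
```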

Relevance:

20.00%

Publisher:

Abstract:

IEC 61850 Process Bus technology has the potential to improve the cost, performance and reliability of substation design. Substantial costs associated with copper wiring (design, documentation, construction, commissioning and troubleshooting) can be reduced with the application of digital Process Bus technology, especially technology based upon international standards. An IEC 61850-9-2 based sampled value Process Bus is an enabling technology for the application of Non-Conventional Instrument Transformers (NCITs). Retaining the output of the NCIT in its native digital form, rather than converting it to an analogue output, allows for improved transient performance, dynamic range, safety and reliability, and reduced cost. In this paper we report on a pilot installation using NCITs communicating across a switched Ethernet network using the UCAIug Implementation Guideline for IEC 61850-9-2 (9-2 Light Edition, or 9-2LE). This system was commissioned in a 275 kV Line Reactor bay at Powerlink Queensland's Braemar substation in 2009, with sampled value protection IEDs 'shadowing' the existing protection system. The results of commissioning tests and twelve months of service experience using a Fibre Optic Current Transformer (FOCT) from Smart Digital Optics (SDO) are presented, including the response of the system to fault conditions. A number of remaining issues to be resolved to enable wide-scale deployment of NCITs and IEC 61850-9-2 Process Bus technology are also discussed.
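
As an illustration of the kind of protection logic that a sampled value current stream supports, the sketch below computes per-cycle RMS current at the 9-2LE protection rate of 80 samples per nominal power cycle and applies a simple overcurrent check. This is a simplified, hedged example only, not the logic of the shadow IEDs used in the pilot; the waveform, pickup threshold and function names are synthetic assumptions.

```python
import math
from collections import deque
from typing import Iterable, Iterator

SAMPLES_PER_CYCLE = 80  # 9-2LE protection rate: 80 samples per nominal power cycle

def rms_per_cycle(samples: Iterable[float]) -> Iterator[float]:
    """Yield one RMS value per power cycle from a stream of instantaneous currents."""
    window: deque[float] = deque(maxlen=SAMPLES_PER_CYCLE)
    for k, value in enumerate(samples, start=1):
        window.append(value)
        if k % SAMPLES_PER_CYCLE == 0:
            yield math.sqrt(sum(v * v for v in window) / SAMPLES_PER_CYCLE)

def overcurrent_pickup(rms_values: Iterable[float], pickup_amps: float) -> bool:
    """Simple check: does any cycle's RMS current exceed the pickup threshold?"""
    return any(i_rms > pickup_amps for i_rms in rms_values)

if __name__ == "__main__":
    # Synthetic 50 Hz waveform: 1 kA RMS load stepping to 10 kA RMS to mimic a fault.
    f, fs = 50.0, 50.0 * SAMPLES_PER_CYCLE
    stream = [(1000 if t < 0.1 else 10000) * math.sqrt(2) * math.sin(2 * math.pi * f * t)
              for t in (n / fs for n in range(int(0.2 * fs)))]
    print(overcurrent_pickup(rms_per_cycle(stream), pickup_amps=5000))  # True
```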

Relevance:

20.00%

Publisher:

Abstract:

Proposed transmission smart grids will use a digital platform for the automation of substations operating at voltage levels of 110 kV and above. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850-8-1 and IEC 61850-9-2 provide an inter-operable solution to support multi-vendor digital process bus solutions, allowing for the removal of potentially lethal voltages and damaging currents from substation control rooms, a reduction in the amount of cabling required in substations, and the adoption of non-conventional instrument transformers (NCITs). IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value (SV) digital process connections for smart substation automation. This paper describes a specific test and evaluation system that uses real-time simulation, protection relays, PTPv2 time clocks and artificial network impairment to investigate technical impediments to the adoption of SV process bus systems by transmission utilities. Knowing the limits of a digital process bus, especially when sampled values and NCITs are included, will enable utilities to make informed decisions regarding the adoption of this technology.
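
The evaluation idea, probing the limits of a sampled value process bus under artificial network impairment, can be sketched very simply. The code below generates synthetic per-sample network latencies (a base switch/link delay plus random jitter) for one second of 9-2LE traffic at 50 Hz and counts samples arriving outside an illustrative 3 ms delay budget; the delay values, budget and function names are assumptions for demonstration, not measurements from the real-time simulator or impairment hardware described in the paper.

```python
import random
import statistics

SAMPLE_INTERVAL_US = 250.0  # 80 samples/cycle at 50 Hz -> one sample every 250 microseconds

def impaired_latencies(n_samples: int, base_delay_us: float,
                       jitter_us: float, seed: int = 1) -> list[float]:
    """Synthetic network latencies: a fixed switch/link delay plus random jitter."""
    rng = random.Random(seed)
    return [base_delay_us + rng.uniform(0.0, jitter_us) for _ in range(n_samples)]

def report(latencies: list[float], budget_us: float) -> None:
    """Summarise delay statistics and count samples arriving outside the budget."""
    late = sum(1 for d in latencies if d > budget_us)
    print(f"mean {statistics.mean(latencies):7.1f} us, "
          f"max {max(latencies):7.1f} us, "
          f"late {late}/{len(latencies)}")

if __name__ == "__main__":
    # One second of traffic (4000 samples) under light and heavy impairment.
    for base, jitter in [(100.0, 50.0), (2500.0, 1500.0)]:
        report(impaired_latencies(4000, base, jitter), budget_us=3000.0)
```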

Relevance:

20.00%

Publisher:

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome, one of the most commonly reported eye health problems, is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film have shown certain limitations. The traditional invasive methods for assessing tear film quality, which are used by most clinicians, have been criticized for a lack of reliability and/or repeatability, while the range of non-invasive methods that has been investigated also presents limitations. Hence no "gold standard" test is currently available to assess tear film integrity. Improving techniques for assessing tear film quality is therefore of clinical significance and is the main motivation for the work described in this thesis.

In this study, changes in tear film surface quality (TFSQ) were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea and its reflection from the ocular surface is imaged on a charge-coupled device (CCD). The light is reflected from the outermost layer of the cornea, the tear film; hence, when the tear film is smooth the reflected image presents a well-structured pattern, whereas when the tear film surface presents irregularities the pattern also becomes irregular, owing to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for evaluating all the dynamic phases of the tear film; the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing tear film dynamics.

A set of novel routines was purposely developed to quantify changes in the reflected pattern and to extract a time-series estimate of TFSQ from the video recording. The routines extract from each frame of the video recording a maximized area of analysis, within which a metric of TFSQ is calculated. Initially, two metrics based on Gabor-filter and Gaussian-gradient techniques were used to quantify the consistency of the pattern's local orientation as a measure of TFSQ. These metrics helped to demonstrate the applicability of HSV to tear film assessment and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval during contact lens wear, and to clearly show a difference between bare-eye and contact-lens-wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on TFSQ.

Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated, and receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural and suppressed blinking conditions, closely followed by HSV; the DWS did not perform as well as LSI or HSV.

The main limitation of the HSV technique identified during this clinical study was a lack of sensitivity for quantifying the build-up/formation phase of the tear film cycle. For that reason an additional metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-ring pattern into an image of quasi-straight lines from which a block-statistics value is extracted. This metric showed better sensitivity under low pattern disturbance and improved the performance of the ROC curves.

Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully comprehend the HSV measurement and the instrument's potential limitations; of special interest was the assessment of the instrument's sensitivity to subtle topographic changes. The theoretical simulations helped to provide some understanding of tear film dynamics; for instance, the model extracted for the build-up phase offered some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model such time series and to extract the key clinical parameters (i.e., timing), but those techniques do not simultaneously consider the underlying physiological mechanism and the parameter-extraction methods. A set of guidelines is proposed to meet both criteria, with special attention given to a commonly used fit, the polynomial function, and to selecting the appropriate model order so that the true derivative of the signal is accurately represented.

The work described in this thesis has shown the potential of high-speed videokeratoscopy for assessing tear film surface quality. A set of novel image and signal processing techniques has been proposed for tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance in a broad range of conditions (i.e., contact lens wearers, normal and dry eye subjects), and this technique could therefore become a useful clinical tool for assessing tear film surface quality in the future.
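
One of the metrics above quantifies the consistency of the Placido pattern's local orientation using Gaussian gradient-based processing. The sketch below shows a generic structure-tensor coherence measure of that kind: gradients are computed, the tensor components are smoothed over a Gaussian neighbourhood, and the normalised eigenvalue difference is averaged over the region. The smoothing scale, region-of-analysis handling and synthetic ring pattern are illustrative assumptions, not the routines developed in the thesis.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def orientation_coherence(image: np.ndarray, sigma: float = 3.0) -> float:
    """Mean local-orientation coherence of an image region (1 = strongly ordered
    fringes, 0 = no dominant orientation), from a smoothed structure tensor."""
    img = image.astype(float)
    gy, gx = np.gradient(img)
    # Structure-tensor components, smoothed over a Gaussian neighbourhood.
    jxx = gaussian_filter(gx * gx, sigma)
    jyy = gaussian_filter(gy * gy, sigma)
    jxy = gaussian_filter(gx * gy, sigma)
    eps = 1e-12
    coherence = np.sqrt((jxx - jyy) ** 2 + 4.0 * jxy ** 2) / (jxx + jyy + eps)
    return float(coherence.mean())

if __name__ == "__main__":
    # Synthetic check: a regular concentric-ring pattern vs. the same pattern plus noise.
    y, x = np.mgrid[-128:128, -128:128]
    rings = np.sin(np.hypot(x, y) / 4.0)
    noisy = rings + 1.5 * np.random.default_rng(0).normal(size=rings.shape)
    print(f"rings: {orientation_coherence(rings):.2f}  noisy: {orientation_coherence(noisy):.2f}")
```

A degraded tear film, which scrambles the reflected ring pattern, would lower such a coherence value in the same way the added noise does in this toy example.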

Relevance:

20.00%

Publisher:

Abstract:

Paired speaking tests are now commonly used in both high-stakes testing and classroom assessment contexts. The co-construction of discourse by candidates is regarded as a strength of paired speaking tests, as candidates have the opportunity to display a wider range of interactional competencies, including turn taking, initiating topics and engaging in extended discourse with a partner, rather than an examiner. However, the impact of the interlocutor in such jointly negotiated discourse and the implications for assessing interactional competence are areas of concern. This article reports on the features of interactional competence that were salient to four trained raters of 12 paired speaking tests through the analysis of rater notes, stimulated verbal recalls and rater discussions. Findings enabled the identification of features of the performance noted by raters when awarding scores for interactional competence, and the particular features associated with higher and lower scores. A number of these features were seen by the raters as mutual achievements, which raises the issue of the extent to which it is possible to assess individual contributions to the co-constructed performance. The findings have implications for defining the construct of interactional competence in paired speaking tests and operationalising this in rating scales.

Relevance:

20.00%

Publisher:

Abstract:

Background: People with cardiac disease and type 2 diabetes have higher hospital readmission rates (22%) compared to those without diabetes (6%). Self-management is an effective approach to achieving better health outcomes; however, there is a lack of specifically designed programs for patients with these dual conditions. This project aims to extend the development and pilot testing of a Cardiac-Diabetes Self-Management Program incorporating user-friendly technologies and the preparation of lay personnel to provide follow-up support.

Methods/Design: A randomised controlled trial will be used to explore the feasibility and acceptability of the Cardiac-Diabetes Self-Management Program incorporating DVD case studies and trained peers who provide follow-up support by telephone and text-messaging. A total of 30 cardiac patients with type 2 diabetes will be randomised either to the usual care group or to the intervention group. Participants in the intervention group will receive the Cardiac-Diabetes Self-Management Program in addition to their usual care. The intervention consists of three face-to-face sessions as well as telephone and text-messaging follow-up. The face-to-face sessions will be provided by a trained Research Nurse, commencing in the Coronary Care Unit, and will be continued after discharge by trained peers. Peers will follow up patients for up to one month after discharge using text messages and telephone support. Data collection will be conducted at baseline (Time 1) and at one month (Time 2). The primary outcomes include self-efficacy, self-care behaviour and knowledge, measured by well-established, reliable tools.

Discussion: This paper presents the study protocol of a randomised controlled trial to pilot evaluate a Cardiac-Diabetes Self-Management Program and the feasibility of incorporating peers in the follow-up. Results of this study will provide direction for using such a mode of delivery for self-management programs for patients with both a cardiac condition and diabetes, and will provide valuable information for refinement of the intervention program.

Relevance:

20.00%

Publisher:

Abstract:

This paper will focus on the development of an interactive test engine using Rasch analysis of item responses for question selection and reporting of results. The Rasch analysis is used to determine student ability and question difficulty. This model is widely used in the preparation of paper-based tests and has been the subject of particular use and development at the Australian Council for Educational Research (ACER). This paper presents an overview of an interactive implementation of the Rasch analysis model in HyperCard, where student ability estimates are generated 'on the fly' and question difficulty values are updated from time to time. The student ability estimates are used to determine question selection and are the basis of the scoring and reporting schemes.
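
Under the Rasch (one-parameter logistic) model, the probability of a correct response is P = 1 / (1 + exp(-(theta - b))), where theta is the student's ability and b the item difficulty. The sketch below is a hedged illustration of the general idea rather than the HyperCard engine described here: it re-estimates theta by Newton-Raphson after each response and selects the next question whose difficulty is closest to the current estimate. The item identifiers and difficulty values are invented.

```python
import math

def p_correct(theta: float, b: float) -> float:
    """Rasch model: probability of a correct response for ability theta, difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(difficulties: list[float], responses: list[int],
                     theta: float = 0.0, iterations: int = 20) -> float:
    """Maximum-likelihood ability estimate via Newton-Raphson over the answered items."""
    for _ in range(iterations):
        probs = [p_correct(theta, b) for b in difficulties]
        gradient = sum(r - p for r, p in zip(responses, probs))
        curvature = -sum(p * (1.0 - p) for p in probs)
        if abs(curvature) < 1e-9:
            break
        theta -= gradient / curvature
    return theta

def next_item(theta: float, item_bank: dict[str, float], asked: set[str]) -> str:
    """Pick the unanswered item whose difficulty is closest to the current ability."""
    return min((q for q in item_bank if q not in asked),
               key=lambda q: abs(item_bank[q] - theta))

if __name__ == "__main__":
    bank = {"Q1": -1.0, "Q2": -0.3, "Q3": 0.2, "Q4": 0.9, "Q5": 1.6}
    answered, difficulties, responses = ["Q3", "Q4"], [0.2, 0.9], [1, 0]
    theta = estimate_ability(difficulties, responses)
    print(round(theta, 2), next_item(theta, bank, set(answered)))
```

A production engine would also need a rule for response patterns that are all correct or all incorrect, for which the maximum-likelihood ability estimate is unbounded, and a periodic recalibration step for the item difficulties, as the abstract notes.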