810 results for technology support
Abstract:
The educational advantage of students working cooperatively in teams has been acknowledged in the higher education sector as being valuable in the world of work and other post-university experiences.
Abstract:
In Semester 1, 2007, a Monitoring Student Engagement study, conducted as part of the Enhancing Transition at Queensland University of Technology (ET@QUT) Project and extending earlier work in the Project by Arora (2006), aimed to map the processes and resources used at that time to identify, monitor and manage first-year students who were at risk of leaving QUT (Shaw, 2007). The study identified a lack of documentation of the processes and resources used and revealed an ad hoc rather than holistic and systematic approach to monitoring student engagement. One of Shaw’s recommendations was: “To introduce a centralised case management approach to student engagement” (p. 14). That recommendation provided the genesis for the Student Success Project reported on here. The aim of the Student Success Project is to trial, evaluate and ultimately establish holistic and systematic ways of helping students who appear to be at risk of failing or withdrawing from a unit to persist and succeed. Students are profiled as being at risk if they are absent from more than two tutorials in a row without contacting their tutor or if they fail to submit their first assignment. A Project Officer makes personal contact with these students to suggest ways they can get further assistance depending on their situation.
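The profiling rule described above is simple enough to express directly. The following is a minimal, illustrative sketch (not part of the reported project) of how such an at-risk flag might be computed; the record fields, threshold name and sample data are assumptions introduced purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical student record used only for this illustration.
@dataclass
class StudentRecord:
    consecutive_tutorials_missed: int   # missed in a row without contacting the tutor
    first_assignment_submitted: bool

def is_at_risk(record: StudentRecord, max_missed: int = 2) -> bool:
    """Flag a student as at risk using the two criteria from the abstract:
    more than `max_missed` consecutive tutorials missed without contact,
    or no submission of the first assignment."""
    missed_too_many = record.consecutive_tutorials_missed > max_missed
    no_first_assignment = not record.first_assignment_submitted
    return missed_too_many or no_first_assignment

# Example: flagged because the first assignment was not submitted.
print(is_at_risk(StudentRecord(consecutive_tutorials_missed=1,
                               first_assignment_submitted=False)))  # True
```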
Abstract:
Communities of practice (CoPs) may be defined as groups of people who are mutually bound by what they do together (Wenger, 1998, p. 2); that is, they “form to share what they know, to learn from one another regarding some aspects of their work and to provide a social context for that work” (Nickols, 2000, para. 1). They are “emergent” in that their shape and membership emerge in the process of activity (Lees, 2005, p. 7). People in CoPs share their knowledge and experiences freely with the purpose of finding inventive ways to approach new problems (Wenger & Snyder, 2000, p. 2). They can be seen as “shared histories of learning” (Wenger, 1998, p. 86). For some time, QUT staff have been involved in a number of initiatives aimed at sharing ideas and resources for teaching first year students, such as the Coordinators of Large First Year Units Working Party. To harness these initiatives and maximise their influence, the leaders of the Transitions In Project (TIP) decided to form a CoP around the design, assessment and management of large first year units.
Abstract:
This paper first describes a new three-year, longitudinal project that is implementing engineering education in three middle schools in Australia (grade levels 7-9). This important domain is untapped in Australia. Hence, as a starting point, we conducted a context analysis to help situate engineering education in a school system. We report on this analysis with respect to findings from one of two literature-based surveys that gathered middle-school student responses in mathematics (n=172) and science (n=166) towards understanding their dispositions for engineering education. ANOVA indicated gender differences for 3 out of 23 items in both mathematics and science. In addition, the majority of students agreed or strongly agreed with 17 of the 23 survey items; however, there were some differences between mathematics and science. We conclude the paper with some recommendations for establishing engineering education in schools, including the development of partnerships among engineering and education faculties, school systems, and industry to develop contemporary engineering resources to support school-level mathematics, science, and technology.
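As a hedged illustration of the kind of item-level analysis reported above (not the authors' actual code or data), a one-way ANOVA comparing responses by gender could be run per survey item as sketched below; the file name and column names are assumptions introduced for illustration.

```python
# Illustrative sketch: per-item one-way ANOVA by gender on Likert-style survey data.
# Assumes a CSV (survey.csv) with a 'gender' column and item columns 'item_01'..'item_23';
# these names are hypothetical, not taken from the study.
import pandas as pd
from scipy import stats

df = pd.read_csv("survey.csv")
item_columns = [c for c in df.columns if c.startswith("item_")]

for item in item_columns:
    # Split the item's scores into one group per gender category.
    groups = [scores.dropna() for _, scores in df.groupby("gender")[item]]
    f_stat, p_value = stats.f_oneway(*groups)
    flag = "difference" if p_value < 0.05 else "no difference"
    print(f"{item}: F={f_stat:.2f}, p={p_value:.3f} -> {flag} by gender")
```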
Abstract:
The use of Information and Communication Technologies (ICT) in education is a topic of much discussion across all sectors of education, with educators and educational researchers continually looking for innovative ways of using these technologies to support and enhance student outcomes. Malaysia is no exception: as the Ministry of Education (MOE) Malaysia strives to meet its government’s Vision 2020, educational reform across all educational sectors has become imperative. ICT will play an integral role in the educational reform process, and teacher education programs are no exception. ICT and capacity building will play an important role in the re-conceptualisation of teacher education programs. This paper reports on how a collaborative capacity building project between two Malaysian teacher education Institutes and an Australian university has given lecturers and pre-service teachers an opportunity to redefine their use of ICT in their prospective teaching areas of science, mathematics, and design and technology. It also highlights the positive capacity building programs that occurred between Australian university lecturers and Malaysian Institute lecturers and how this contributed to the effective integration and use of ICT.
Abstract:
How and why visualisations support learning was the subject of this qualitative instrumental collective case study. Five computer programming languages (PHP, Visual Basic, Alice, GameMaker, and RoboLab) supporting differing degrees of visualisation were used as cases to explore the effectiveness of software visualisation in developing fundamental computer programming concepts (sequence, iteration, selection, and modularity). Cognitive theories of visual and auditory processing, cognitive load, and mental models provided a framework in which student cognitive development was tracked and measured, with thirty-one 15-17 year old students, drawn from a Queensland metropolitan secondary private girls’ school, as active participants in the research. Seventeen findings in three sections increase our understanding of the effects of visualisation on the learning process. The study extended the use of mental model theory to track the learning process, and demonstrated the application of student research based metacognitive analysis of individual and peer cognitive development, both as a means to support research and as an approach to teaching. The findings also put forward an explanation for failures in previous software visualisation studies; in particular, the study demonstrated that for the cases examined, where complex concepts are being developed, the mixing of auditory (or text) and visual elements can result in excessive cognitive load and impede learning. This finding provides a framework for selecting the most appropriate instructional programming language based on the cognitive complexity of the concepts under study.
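For readers unfamiliar with the four fundamental concepts named above, the short sketch below illustrates sequence, selection, iteration and modularity; it is written in Python rather than in any of the five languages studied, and is purely illustrative rather than drawn from the study materials.

```python
# Illustrative only: the four fundamental programming concepts named in the abstract,
# shown in Python rather than in the five languages used in the study.

def classify(score):            # modularity: behaviour packaged as a reusable function
    if score >= 50:             # selection: the path taken depends on a condition
        return "pass"
    return "fail"

scores = [35, 72, 50, 49]       # sequence: statements execute one after another
results = []
for s in scores:                # iteration: the loop body repeats for each score
    results.append(classify(s))

print(results)                  # ['fail', 'pass', 'pass', 'fail']
```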
Abstract:
This special issue aims to provide up-to-date knowledge and the latest scientific concepts and technological developments in the processing, characterization, testing, mechanics, modeling and applications of a broad range of advanced materials. The many contributors, from Denmark, Germany, UK, Iran, Saudi Arabia, Malaysia, Japan, the People’s Republic of China, Singapore, Taiwan, USA, New Zealand and Australia, present a wide range of topics including: nanomaterials, thin films and coatings, metals and alloys, composite materials, materials processing and characterization, biomaterials and biomechanics, and computational materials science and simulation. The work will therefore be of great interest to a broad spectrum of researchers and technologists.
Abstract:
This document is a collection of ‘cases’ adapted from interviews with supervisors of higher degree research students from the technology disciplines. The supervisors come from a wide range of sub-disciplines and represent many levels of experience. We follow in this document Hammond and Ryland’s (2009) suggested ranking of supervision experience:
• No completions – no experience or new supervisors, with no doctoral completions as principal supervisor
• Experienced – 1 to 5 doctoral completions as principal supervisor
• Very experienced – over 6 doctoral completions as principal supervisor
The cases focus attention on thinking about supervision as a teaching and learning practice, a dimension of higher degree research supervision that is increasingly being recognized as important. They are offered as prompts for individuals and groups of supervisors in thinking about their supervision as a teaching and learning practice.
Abstract:
Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems to support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations with multiple frequencies, including modernised GPS, Galileo, GLONASS, and Compass, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated network and single-base RTK systems, together with multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• Multiple GNSS constellations and multiple frequencies
• Large scale, wide area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of the positioning processes shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK)
Based on these four challenges, there are two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transferring capability. This research explores new approaches to address these future NRTK challenges and requirements using the Grid Computing facility, in particular for large data processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed in this research: a layered framework consisting of 1) a client layer in the form of a Grid portal, 2) a service layer, and 3) an execution layer. The user’s request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and also on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed and the results have preliminarily demonstrated the concepts and functionality of the new NRTK framework based on Grid Computing, whilst some aspects of the system's performance are yet to be improved in future work.
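The Ntrip data-download step mentioned above follows a simple HTTP-like request/response exchange with a caster. The sketch below is a minimal, illustrative Ntrip (v1-style) client in Python; the caster host, port, mountpoint and credentials are placeholders, and the project adopted the open-source Ntrip software rather than code like this.

```python
# Minimal illustrative Ntrip v1-style client: request a mountpoint from a caster
# and read the raw RTCM byte stream. Host, port, mountpoint and credentials are
# placeholders; this is not the software used in the project.
import base64
import socket

CASTER_HOST = "ntrip.example.org"    # hypothetical caster
CASTER_PORT = 2101                   # conventional Ntrip port
MOUNTPOINT = "MOUNT1"                # hypothetical mountpoint (reference station)
USERNAME, PASSWORD = "user", "pass"  # placeholder credentials

credentials = base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()
request = (
    f"GET /{MOUNTPOINT} HTTP/1.0\r\n"
    f"User-Agent: NTRIP SimpleClient/0.1\r\n"
    f"Authorization: Basic {credentials}\r\n"
    "\r\n"
)

with socket.create_connection((CASTER_HOST, CASTER_PORT), timeout=10) as sock:
    sock.sendall(request.encode())
    header = sock.recv(1024)          # expect something like b"ICY 200 OK\r\n"
    if b"200" not in header:
        raise RuntimeError(f"Caster refused request: {header!r}")
    rtcm_chunk = sock.recv(4096)      # raw RTCM bytes, ready for RTK processing
    print(f"Received {len(rtcm_chunk)} bytes of RTCM data")
```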
Abstract:
In recent years, practitioners and researchers alike have turned their attention to knowledge management (KM) in order to increase organisational performance (OP). As a result, many different approaches and strategies have been investigated and suggested for how knowledge should be managed to make organisations more effective and efficient. However, most research has been undertaken in the for-profit sector, with only a few studies focusing on the benefits nonprofit organisations might gain by managing knowledge. This study broadly investigates the impact of knowledge management on the organisational performance of nonprofit organisations. Organisational performance can be evaluated through either financial or non-financial measurements. In order to evaluate knowledge management and organisational performance, non-financial measurements are argued to be more suitable given that knowledge is an intangible asset which often cannot be expressed through financial indicators. Non-financial measurement concepts of performance such as the balanced scorecard or the concept of Intellectual Capital (IC) are well accepted and used within the for-profit and nonprofit sectors to evaluate organisational performance. This study utilised the concept of IC as the method to evaluate KM and OP in the context of nonprofit organisations due to the close link between KM and IC: indeed, KM is concerned with managing the KM processes of creating, storing, sharing and applying knowledge, and the organisational KM infrastructure, such as organisational culture or organisational structure, that supports these processes. IC, on the other hand, measures the knowledge stocks at different ontological levels: at the individual level (human capital), at the group level (relational capital) and at the organisational level (structural capital). In other words, IC measures the value of the knowledge which has been managed through KM. As KM encompasses the different KM processes and the KM infrastructure facilitating these processes, previous research has investigated the relationship between KM infrastructure and KM processes. Organisational culture, organisational structure and the level of IT support have been identified as the main factors of the KM infrastructure influencing the KM processes of creating, storing, sharing and applying knowledge. Other research has focused on the link between KM and OP or organisational effectiveness. Based on existing literature, a theoretical model was developed to enable the investigation of the relation between KM (encompassing KM infrastructure and KM processes) and IC. The model assumes an association between KM infrastructure and KM processes, as well as an association between KM processes and the various levels of IC (human capital, structural capital and relational capital). As a result, five research questions (RQ) with respect to the various factors of the KM infrastructure, as well as with respect to the relationship between KM infrastructure and IC, were raised and included in the research model:
• RQ 1: Do nonprofit organisations which have a Hierarchy culture have stronger IT support than nonprofit organisations which have an Adhocracy culture?
• RQ 2: Do nonprofit organisations which have a centralised organisational structure have stronger IT support than nonprofit organisations which have a decentralised organisational structure?
• RQ 3: Do nonprofit organisations which have stronger IT support have a higher value of Human Capital than nonprofit organisations which have less strong IT support?
• RQ 4: Do nonprofit organisations which have stronger IT support have a higher value of Structural Capital than nonprofit organisations which have less strong IT support?
• RQ 5: Do nonprofit organisations which have stronger IT support have a higher value of Relational Capital than nonprofit organisations which have less strong IT support?
In order to investigate the research questions, measurements for IC were developed which were linked to the main KM processes. The final KM/IC model contained four items for evaluating human capital, five items for evaluating structural capital and four items for evaluating relational capital. The research questions were investigated through empirical research using a case study approach with a focus on two nonprofit organisations providing trade promotion services through local offices worldwide. Data for the investigation of the assumptions were collected via qualitative as well as quantitative research methods. The qualitative study included interviews with representatives of the two participating organisations as well as in-depth document research. The purpose of the qualitative study was to investigate the factors of the KM infrastructure (organisational culture, organisational structure, IT support) of the organisations and how these factors were related to each other. The quantitative study, on the other hand, was carried out through an online survey amongst staff of the various local offices. The purpose of the quantitative study was to investigate what impact the level of IT support, as the main instrument of the KM infrastructure, had on IC. Overall, several key themes were found as a result of the study:
• Knowledge Management and Intellectual Capital were complementary to each other, and this should be expressed through measurements of IC based on KM processes.
• The various factors of the KM infrastructure (organisational culture, organisational structure and level of IT support) are interdependent.
• IT was a primary instrument through which the different KM processes (creating, storing, sharing and applying knowledge) were performed.
• A high level of IT support was evident where participants reported higher levels of IC (human capital, structural capital and relational capital).
The study supported previous research in the field of KM and replicated the findings from other case studies in this area. The study also contributed to theory by placing KM research within the nonprofit context and analysing the linkage between KM and IC. From the managerial perspective, the findings gave clear indications that would allow interested parties, such as nonprofit managers or consultants, to understand more about the implications of KM for OP and to use this knowledge to implement efficient and effective KM strategies within their organisations.
Abstract:
Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector, there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and beg the question of whether it is possible at some higher level to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive. This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multidimensional and multi-level, with this hierarchical approach to SQ measurement better reflecting human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision.
Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension to ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g., websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions; however, the candidate acknowledges that more work has to be done to better define the sub-dimensions. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively. The latter statement suggests that customers evaluate each primary dimension (or each higher level of SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach would depend on the objective(s) of the study. Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overheads in the process. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it has the ability to identify areas that need improvement.
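As a hedged illustration of the two measurement approaches contrasted above (not an instrument from the candidate's study), the sketch below computes a simple disconfirmation score (perception minus expectation) alongside a perceptions-only score for a set of hypothetical SQ items; the item names and ratings are invented for illustration only.

```python
# Illustrative only: contrast the disconfirmation approach (perception - expectation)
# with a perceptions-only score, using invented 7-point ratings for a few SQ items.
expectations = {"reliability": 6.5, "assurance": 6.0, "tangibles": 5.0,
                "empathy": 6.0, "responsiveness": 6.5}
perceptions  = {"reliability": 5.8, "assurance": 6.2, "tangibles": 5.5,
                "empathy": 5.0, "responsiveness": 6.0}

# Disconfirmation: negative gaps highlight where the service falls short of expectations.
gaps = {item: perceptions[item] - expectations[item] for item in expectations}

# Perceptions-only: a straightforward overall evaluation of perceived quality.
overall_perceived = sum(perceptions.values()) / len(perceptions)

for item, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{item:15s} gap = {gap:+.1f}")
print(f"Overall perceptions-only score = {overall_perceived:.2f}")
```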
Abstract:
Vocational education and training for the library and information services (LIS) sector in Australia offers students the career pathway to become library technicians. Library technicians play a valuable role in drawing on sound practical knowledge and skills to support the delivery of library and information services that meet client needs. Over the past forty years, the Australian Library and Information Association (ALIA) has monitored the quality of library technician courses. Since 2005, ALIA has run national professional development days for library technician educators with the goal of establishing an alternative model for course recognition focusing on the process of peer review to benchmark good practice and stimulate continuous improvement in library technician education. This initial developmental work has culminated in 2009 with site visits to all library technician courses in Australia. The paper presents a whole-of-industry case study to critically review the work undertaken to date.
Abstract:
Poor student engagement and high failure rates in first year units were addressed at the Queensland University of Technology (QUT) with a course restructure involving a fresh approach to introducing programming. Students’ first taste of programming in the new course focused less on the language and syntax, and more on problem solving and design, and on the role of programming in relation to other technologies they are likely to encounter in their studies. In effect, several technologies that have historically been compartmentalised and taught in isolation have been brought together as a breadth-first introduction to IT. Incorporating databases and Web development technologies into what used to be a purely programming unit gave students a very short introduction to each technology, with programming acting as the glue between them. As a result, students not only had a clearer understanding of the application of programming in the real world, but were also able to determine their preference or otherwise for each of the technologies introduced, which will help them when the time comes to choose a course major. Students engaged well in an intensely collaborative learning environment for this unit, which was designed both to support the needs of students and to meet industry expectations. Attrition from the unit was low, computer laboratory practical attendance rates remained high throughout semester for the first time, and the failure rate fell to a single-figure percentage.
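As a purely illustrative sketch of the "programming as glue" idea described above (not material from the QUT unit), the following Python snippet queries a small database and renders the result as a fragment of a web page; the table, sample rows and HTML structure are invented for illustration.

```python
# Illustrative only: programming as the "glue" between a database and the Web.
# The table, sample rows and HTML are invented; this is not material from the unit.
import sqlite3

# Database side: create an in-memory table and add a few sample rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE units (code TEXT, title TEXT)")
conn.executemany("INSERT INTO units VALUES (?, ?)",
                 [("ITB001", "Introduction to IT"), ("ITB002", "Databases")])

# Programming side: query the data and transform it.
rows = conn.execute("SELECT code, title FROM units ORDER BY code").fetchall()

# Web side: render the query result as a simple HTML list.
items = "\n".join(f"  <li>{code}: {title}</li>" for code, title in rows)
html = f"<ul>\n{items}\n</ul>"
print(html)
```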
Abstract:
An emergent form of political economy, facilitated by information and communication technologies (ICTs), is widely propagated as the apotheosis of unmitigated social, economic, and technological progress. Meanwhile, throughout the world, social degradation and economic inequality are increasing logarithmically. Valued categories of thought are, axiomatically, the basic commodities of the “knowledge economy”. Language is its means of exchange. This paper proposes a sociolinguistic method with which to critically engage the hyperbole of the “Information Age”. The method is grounded in a systemic social theory that synthesises aspects of autopoiesis and Marxist political economy. A trade policy statement is analysed to exemplify the sociolinguistically created aberrations that are today most often construed as social and political determinants.