765 results for Technology-based self-service
Abstract:
One of the key issues facing public asset owners is the decision of whether to refurbish aged built assets. This decision requires an assessment of the “remaining service life” of the key components in a building. The remaining service life depends significantly on the existing condition of the asset and on future degradation patterns, considering both durability and functional obsolescence. Recently developed methods for residual service life modelling require sophisticated data that are not readily available. Most of the data available are in the form of reports prepared prior to undertaking major repairs or in the form of sessional audit reports. Valuable information from these sources can serve as benchmarks for estimating the reference service life. The authors have acquired such information from a public asset building in Melbourne. Using this information, the residual service life of a case study building façade has been estimated in this paper based on state-of-the-art approaches. These estimates have been evaluated against expert opinion. Though the results are encouraging, it is clear that the state-of-the-art methodologies can only provide meaningful estimates when data of the required level and quality are available. This investigation resulted in the development of a new framework for maintenance that integrates condition assessment procedures and the factors influencing residual service life.
Abstract:
The endeavour to obtain estimates of the durability of components, for use in life-cycle assessment or costing and in infrastructure and maintenance planning systems, is a large one. The factor method and the reference service life concept provide a very valuable structure, but do not resolve the central dilemma: the need to derive an extensive database of service lives. Traditional methods of estimating service life, such as dose functions or degradation models, can play a role in developing this database; however, the scale of the problem clearly indicates that individual dose functions cannot be derived for each component in each different local and geographic setting. Thus, a wider range of techniques is required in order to derive reference service lives. This paper outlines the approaches being taken in the Cooperative Research Centre for Construction Innovation project to predict reference service life. Approaches include the development of fundamental degradation and microclimate models, the development of a situation-based reasoning ‘engine’ to vary the ‘estimator’ of service life, and the development of a database of expert performance (Delphi study). These methods should be viewed as complementary rather than as discrete alternatives. As discussed in the paper, the situation-based reasoning approach in fact has the possibility of encompassing all the other methods.
Abstract:
Over the last decade in Australia, the role of the teacher has changed. Teachers are now expected to model and foster in their students a wide range of skills such as critical thinking, self-regulated learning, knowledge of self and others, and lifelong learning. These changes are having a significant impact on the design of pre-service teacher education programmes, with university educators re-evaluating the teacher training curriculum and its embedded pedagogical processes in order to consider how they might develop these skills in pre-service teachers. One approach is to consider the processes and practices inherent in philosophical inquiry. This paper reports on three participants’ reflections on a 12-week philosophy programme that was conducted for teacher educators at Queensland University of Technology (QUT) in 2008. The programme was facilitated by teachers from Buranda State School who have been teaching philosophy in their P-7 school for more than ten years. This paper provides insight into the teacher educators’ reflections on the philosophy programme and the associated changes and challenges of implementing such a programme in pre-service teacher education degrees.
Abstract:
The study will cross-fertilise Information Systems (IS) and Services Marketing ideas through reconceptualising the information system as a service (ISaaS). The study addresses known limitations of arguably the two most significant dependent variables in these disciplines: Information System Success (or IS-Impact) and Service Quality. Planned efforts to synthesise analogous conceptions across these disciplines are expected to force a deeper theoretical understanding of the broad notions of success, quality, value and satisfaction and their interrelations. The aims of this research are to: (1) yield a conceptually superior and more extensively validated IS success measurement model, and (2) develop and operationalise a more rigorously validated Service Quality measurement model, while extending the ‘service’ notion to ‘operational computer-based information systems in organisations’. In the development of the new models, the study will address contemporary validation issues.
Abstract:
Evidence-based Practice (EBP) has recently emerged as a topic of discussion amongst professionals within the library and information services (LIS) industry. Simply stated, EBP is the process of using formal research skills and methods to assist in decision making and in establishing best practice. The emerging interest in EBP within the library context serves to remind the library profession that research skills and methods can help ensure that the library industry remains current and relevant in changing times. The LIS sector faces ongoing challenges in terms of the expectation that financial and human resources will be managed efficiently, particularly if library budgets are reduced and accountability to the principal stakeholders is increased. Library managers are charged with the responsibility to deliver relevant and cost-effective services, in an environment characterised by rapidly changing models of information provision, information access and user behaviours. Consequently, they are called upon not only to justify the services they provide, or plan to introduce, but also to measure the effectiveness of these services and to evaluate the impact on the communities they serve. The imperative for innovation in and enhancements to library practice is accompanied by the need for a strong understanding of the processes of review, measurement, assessment and evaluation. In 2001 the Centre for Information Research was commissioned by the Chartered Institute of Library and Information Professionals (CILIP) in the UK to conduct an examination into the research landscape for library and information science. The examination concluded that research is “important for the LIS [library and information science] domain in a number of ways” (McNicol & Nankivell, 2001, p.77).
At the professional level, research can inform practice, assist in the future planning of the profession, raise the profile of the discipline, and indeed the reputation and standing of the library and information service itself. At the personal level, research can “broaden horizons and offer individuals development opportunities” (McNicol & Nankivell, 2001, p.77). The study recommended that “research should be promoted as a valuable professional activity for practitioners to engage in” (McNicol & Nankivell, 2001, p.82). This chapter will consider the role of EBP within the library profession. A brief review of key literature in the area is provided. The review considers issues of definition and terminology, highlights the importance of research in professional practice and outlines the research approaches that underpin EBP. The chapter concludes with a consideration of the specific application of EBP within the dynamic and evolving field of information literacy (IL).
Abstract:
Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems to support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations, including modernised GPS, Galileo, GLONASS, and Compass, with multiple frequencies, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated operating NRTK and single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of the positioning processes shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous users’ requests (reverse RTK)
These four challenges give rise to two major requirements for NRTK data processing: expandable computing power and scalable data sharing/transferring capability. This research explores new approaches to address these future NRTK challenges and requirements using the Grid Computing facility, in particular for large data processing burdens and complex computation algorithms.
A Grid Computing based NRTK framework is proposed in this research. It is a layered framework consisting of: 1) a client layer in the form of a Grid portal; 2) a service layer; 3) an execution layer. The user’s request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open-source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results have preliminarily demonstrated the concepts and functionality of the new Grid Computing based NRTK framework, whilst some aspects of the system’s performance are yet to be improved in future work.
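The portal-to-node flow described above can be sketched minimally: the service layer receives per-station RTCM processing jobs and dispatches them to execution-layer Grid nodes. This is an illustrative sketch only; the round-robin policy, station names and node names are assumptions, not the actual QUT/Grid Australia scheduler.

```python
from itertools import cycle

def schedule_rtk_jobs(stations, nodes):
    """Assign per-station RTK processing jobs to Grid execution nodes.

    A simple round-robin policy stands in for the framework's service-layer
    scheduler: each reference station's data-processing job is mapped to the
    next execution-layer node in turn.
    """
    assignment = {}
    node_cycle = cycle(nodes)  # endlessly repeat the node list
    for station in stations:
        assignment[station] = next(node_cycle)
    return assignment
```

For example, three station jobs spread over two nodes alternate between them, so the third job returns to the first node.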
Abstract:
To navigate successfully in a previously unexplored environment, a mobile robot must be able to estimate the spatial relationships of the objects of interest accurately. A Simultaneous Localization and Mapping (SLAM) system employs its sensors to incrementally build a map of its surroundings and to localize itself in the map simultaneously. The aim of this research project is to develop a SLAM system suitable for self-propelled household lawnmowers. The proposed bearing-only SLAM system requires only an omnidirectional camera and some inexpensive landmarks. The main advantage of an omnidirectional camera is the panoramic view of all the landmarks in the scene. Placing landmarks in a lawn field to define the working domain is much easier and more flexible than installing the perimeter wire required by existing autonomous lawnmowers. The common approach of existing bearing-only SLAM methods relies on a motion model for predicting the robot’s pose and a sensor model for updating the pose. In the motion model, the error in the estimates of object positions accumulates, due mainly to wheel slippage. Quantifying the uncertainty of object positions accurately is a fundamental requirement. In bearing-only SLAM, the Probability Density Function (PDF) of a landmark position should be uniform along the observed bearing. Existing methods that approximate the PDF with a Gaussian estimation do not satisfy this uniformity requirement. This thesis introduces both geometric and probabilistic methods to address the above problems. The main novel contributions of this thesis are:
1. A bearing-only SLAM method not requiring odometry. The proposed method relies solely on the sensor model (landmark bearings only) without relying on the motion model (odometry). The uncertainty of the estimated landmark positions depends on the vision error only, instead of the combination of both odometry and vision errors.
2. The transformation of the spatial uncertainty of objects. This thesis introduces a novel method for translating the spatial uncertainty of objects estimated in a moving frame attached to the robot into the global frame attached to the static landmarks in the environment.
3. The characterization of an improved PDF for representing landmark position in bearing-only SLAM. The proposed PDF is expressed in polar coordinates, and the marginal probability on range is constrained to be uniform. Compared to a PDF estimated from a mixture of Gaussians, the PDF developed here has far fewer parameters and can easily be adopted in a probabilistic framework, such as a particle filtering system.
The main advantages of our proposed bearing-only SLAM system are its lower production cost and flexibility of use. The proposed system can also be adopted in other domestic robots, such as vacuum cleaners or robotic toys, when the terrain is essentially 2D.
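A PDF of this shape (uniform marginal on range, Gaussian vision noise on bearing) lends itself to direct sampling in a particle filter. The following is a minimal illustrative sketch, not the thesis implementation; the noise parameter and range bounds are assumed values.

```python
import math
import random

def sample_landmark_hypotheses(bearing, sigma_bearing, r_min, r_max, n=1000):
    """Sample landmark position hypotheses in polar coordinates.

    Range is drawn uniformly on [r_min, r_max] (the uniform marginal on
    range); the bearing is perturbed by zero-mean Gaussian vision noise.
    Samples are returned in Cartesian coordinates in the robot frame.
    """
    samples = []
    for _ in range(n):
        r = random.uniform(r_min, r_max)              # uniform marginal on range
        theta = random.gauss(bearing, sigma_bearing)  # Gaussian bearing noise
        samples.append((r * math.cos(theta), r * math.sin(theta)))
    return samples
```

Because only the bearing carries sensor noise, every sampled hypothesis lies at an exact range between the chosen bounds, spread along the observed bearing direction.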
Abstract:
Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector, there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multidimensional and multi-level, this hierarchical approach to SQ measurement better reflecting human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision. Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension to ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define the sub-dimensions. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively. That is, customers evaluate each primary dimension (or each higher level of SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach would depend on the objective(s) of the study.
Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overheads in the process. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it has the ability to identify areas that need improvement.
Abstract:
Do commencing students possess the level of information literacy (IL) knowledge and skills they need to succeed at university? What impact does embedding IL within the engineering and design curriculum have? This paper reports on the self-perception versus the reality of IL knowledge and skills, across a large cohort of first year built environment and engineering students. Acting on the findings of this evaluation, the authors (a team of academic librarians) developed an intensive IL skills program which was integrated into a faculty wide unit. Perceptions, knowledge and skills were re-evaluated at the end of the semester to determine if embedded IL education made a difference. Findings reveal that both the perception and reality of IL skills were significantly and measurably improved.
Abstract:
With service interaction modelling, it is customary to distinguish between two types of models: choreographies and orchestrations. A choreography describes interactions within a collection of services from a global perspective, where no service plays a privileged role. Instead, services interact in a peer-to-peer manner. In contrast, an orchestration describes the interactions between one particular service, the orchestrator, and a number of partner services. The main proposition of this work is an approach to bridge these two modelling viewpoints by synthesising orchestrators from choreographies. To start with, choreographies are defined using a simple behaviour description language based on communicating finite state machines. From such a model, orchestrators are initially synthesised in the form of state machines. It turns out that state machines are not suitable for orchestration modelling, because orchestrators generally need to engage in concurrent interactions. To address this issue, a technique is proposed to transform state machines into process models in the Business Process Modelling Notation (BPMN). Orchestrations represented in BPMN can then be augmented with additional business logic to achieve value-adding mediation. In addition, techniques exist for refining BPMN models into executable process definitions. The transformation from state machines to BPMN relies on Petri nets as an intermediary representation and leverages techniques from theory of regions to identify concurrency in the initial Petri net. Once concurrency has been identified, the resulting Petri net is transformed into a BPMN model. The original contributions of this work are: an algorithm to synthesise orchestrators from choreographies and a rules-based transformation from Petri nets into BPMN.
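The core mediation idea can be illustrated with a deliberately simplified sketch: every choreography interaction (sender, receiver, message) between two partner services becomes a receive/send pair for the orchestrator. This linear-trace version is illustrative only; the actual synthesis produces state machines and then BPMN models via Petri nets and the theory of regions, and the service names below are hypothetical.

```python
def orchestrator_trace(choreography, orchestrator="Mediator"):
    """Derive the orchestrator's interaction sequence from a choreography.

    Each choreography step (sender, receiver, message) between two partner
    services is mediated: the orchestrator first receives the message from
    the sender, then forwards it to the receiver. Concurrency and branching
    are deliberately ignored in this sketch.
    """
    trace = []
    for sender, receiver, msg in choreography:
        trace.append(("receive", sender, msg))  # orchestrator receives from sender
        trace.append(("send", receiver, msg))   # orchestrator forwards to receiver
    return trace
```

In the full approach, it is exactly the need to interleave such receive/send pairs concurrently that makes plain state machines inadequate and motivates the translation to BPMN via Petri nets.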
Abstract:
We argue that web service discovery technology should help the user navigate a complex problem space by providing suggestions for services which they may not be able to formulate themselves, because they lack the epistemic resources to do so. Free text documents in service environments provide an untapped source of information for augmenting the epistemic state of the user and hence their ability to search effectively for services. A quantitative approach to semantic knowledge representation is adopted in the form of semantic space models computed from these free text documents. Knowledge of the user’s agenda is promoted by associational inferences computed from the semantic space. The inferences are suggestive and aim to promote human abductive reasoning to guide the user from fuzzy search goals into a better understanding of the problem space surrounding the given agenda. Experimental results are discussed based on a complex and realistic planning activity.
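A minimal sketch of the idea, assuming a simple co-occurrence-based semantic space (the paper's model may be more sophisticated, e.g. HAL- or LSA-style): build term vectors from word co-occurrence in the free text documents, then suggest associated terms by cosine similarity. All document text below is hypothetical.

```python
import math
from collections import Counter

def build_semantic_space(documents, window=5):
    """Build a word-by-word co-occurrence semantic space from free text."""
    space = {}
    for doc in documents:
        words = doc.lower().split()
        for i, w in enumerate(words):
            for u in words[max(0, i - window):i + window + 1]:
                if u != w:  # count neighbours within the window, excluding the word itself
                    space.setdefault(w, Counter())[u] += 1
    return space

def associate(space, term, k=3):
    """Suggest terms associated with `term` by cosine similarity of vectors."""
    def cos(a, b):
        num = sum(a[t] * b.get(t, 0) for t in a)
        den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
        return num / den if den else 0.0
    target = space.get(term, Counter())
    scored = [(cos(target, vec), w) for w, vec in space.items() if w != term]
    return [w for s, w in sorted(scored, reverse=True)[:k] if s > 0]
```

The associational suggestions are, as in the paper's framing, only prompts for the user's own abductive reasoning, not definitive answers.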
Abstract:
Services in the form of business services or IT-enabled (Web) Services have become a corporate asset of high interest in striving towards the agile organisation. However, while the design and management of a single service is widely studied and well understood, little is known about how a set of services can be managed. This gap motivated this paper, in which we explore the concept of Service Portfolio Management. In particular, we propose a Service Portfolio Management Framework that explicates service portfolio goals, tasks, governance issues, methods and enablers. The Service Portfolio Management Framework is based upon a thorough analysis and consolidation of existing, well-established portfolio management approaches. From an academic point of view, the Service Portfolio Management Framework can be positioned as an extension of portfolio management conceptualisations in the area of service management. Based on the framework, possible directions for future research are provided. From a practical point of view, the Service Portfolio Management Framework provides an organisation with a novel approach to managing its emerging service portfolios.
Abstract:
This position paper examines the development of a dedicated service aggregator role in business networks. We predict that these intermediaries will soon emerge in service ecosystems and add value through the application of dedicated domain knowledge in the process of creating new, innovative services or service bundles based on the aggregation, composition, integration or orchestration of existing services procured from different service providers in the service ecosystem. We discuss general foundations of service aggregators and present Fourth-Party Logistics Providers as a real-world example of emerging business service aggregators. We also point out a demand for future research, e.g. into governance models, risk management tools, service portfolio management approaches and service bundling techniques, to be able to better understand core determinants of competitiveness and success of service aggregators.
Abstract:
Appropriate behaviour toward customers often requires employees to suppress some genuine emotions and/or express other emotions, genuine or contrived. Managing one's emotions in this way gives rise to emotional exhaustion. This can have consequences for psychological ill health, in the form of workplace strain, and ultimately for employees' desire to leave. This study examines the relationships between emotional management, emotional exhaustion and turnover intentions amongst diversional therapy professionals. We find that some forms of emotional management have a significant impact on emotional exhaustion and that this predicts workplace strain. Furthermore, the deleterious effects of emotional exhaustion are mitigated somewhat for employees who have strong beliefs in their ability to provide good service, compared to employees with lower self-efficacy beliefs.
Abstract:
The inquiry documented in this thesis is located at the nexus of technological innovation and traditional schooling. As we enter the second decade of a new century, few would argue against the increasingly urgent need to integrate digital literacies with traditional academic knowledge. Yet, despite substantial investments from governments and businesses, the adoption and diffusion of contemporary digital tools in formal schooling remain sluggish. To date, research on technology adoption in schools tends to take a deficit perspective of schools and teachers, with the lack of resources and teacher ‘technophobia’ most commonly cited as barriers to digital uptake. Corresponding interventions that focus on increasing funding and upskilling teachers, however, have made little difference to adoption trends in the last decade. Empirical evidence that explicates the cultural and pedagogical complexities of innovation diffusion within long-established conventions of mainstream schooling, particularly from the standpoint of students, is wanting. To address this knowledge gap, this thesis inquires into how students evaluate and account for the constraints and affordances of contemporary digital tools when they engage with them as part of their conventional schooling. It documents the attempted integration of a student-led Web 2.0 learning initiative, known as the Student Media Centre (SMC), into the schooling practices of a long-established, high-performing independent senior boys’ school in urban Australia. The study employed an ‘explanatory’ two-phase research design (Creswell, 2003) that combined complementary quantitative and qualitative methods to achieve both breadth of measurement and richness of characterisation. In the initial quantitative phase, a self-reported questionnaire was administered to the senior school student population to determine adoption trends and predictors of SMC usage (N=481). 
Measurement constructs included individual learning dispositions (learning and performance goals, cognitive playfulness and personal innovativeness), as well as social and technological variables (peer support, perceived usefulness and ease of use). Incremental predictive models of SMC usage were conducted using Classification and Regression Tree (CART) modelling: (i) individual-level predictors, (ii) individual and social predictors, and (iii) individual, social and technological predictors. Peer support emerged as the best predictor of SMC usage. Other salient predictors included perceived ease of use and usefulness, cognitive playfulness and learning goals. On the whole, an overwhelming proportion of students reported low usage levels, low perceived usefulness and a lack of peer support for engaging with the digital learning initiative. The small minority of frequent users reported having high levels of peer support and robust learning goal orientations, rather than being predominantly driven by performance goals. These findings indicate that tensions around social validation, digital learning and academic performance pressures influence students’ engagement with the Web 2.0 learning initiative. The qualitative phase that followed provided insights into these tensions by shifting the analytic focus from individual attitudes and behaviours to the shared social and cultural reasoning practices that explain students’ engagement with the innovation. Six in-depth focus groups, comprising 60 students with different levels of SMC usage, were conducted, audio-recorded and transcribed. Textual data were analysed using Membership Categorisation Analysis. Students’ accounts converged around a key proposition: the Web 2.0 learning initiative was useful-in-principle but useless-in-practice.
While students endorsed the usefulness of the SMC for enhancing multimodal engagement, extending peer-to-peer networks and acquiring real-world skills, they also called attention to a number of constraints that frustrated the realisation of these design affordances in practice. These constraints were cast in terms of three binary formulations of social and cultural imperatives at play within the school: (i) ‘cool/uncool’, (ii) ‘dominant staff/compliant student’, and (iii) ‘digital learning/academic performance’. The first formulation foregrounds the social stigma of the SMC among peers and its resultant lack of positive network benefits. The second relates to students’ perception of the school culture as authoritarian and punitive, with adverse effects on the very student agency required to drive the innovation. The third points to academic performance pressures in a crowded curriculum with tight timelines. Taken together, findings from both phases of the study provide the following key insights. First, students endorsed the learning affordances of contemporary digital tools such as the SMC for enhancing their current schooling practices. For the majority of students, however, these learning affordances were overshadowed by the performative demands of schooling, both social and academic. The student participants saw engagement with the SMC in school as distinct from, even oppositional to, the conventional social and academic performance indicators of schooling, namely (i) being ‘cool’ (or at least ‘not uncool’), (ii) being sufficiently ‘compliant’, and (iii) achieving good academic grades. Their reasoned response, therefore, was simply to resist engagement with the digital learning innovation. Second, a small minority of students seemed dispositionally inclined to negotiate the learning affordances and performance constraints of digital learning and traditional schooling more effectively than others.
These students were able to engage more frequently and meaningfully with the SMC in school. Their ability to adapt and traverse seemingly incommensurate social and institutional identities and norms is theorised as cultural agility – a dispositional construct that comprises personal innovativeness, cognitive playfulness and learning goals orientation. The logic, then, is ‘both and’ rather than ‘either or’ for these individuals, who have the capacity to accommodate both learning and performance in school, whether in terms of digital engagement and academic excellence, or successful brokerage across multiple social identities and institutional affiliations within the school. In sum, this study takes us beyond the familiar terrain of deficit discourses that tend to blame institutional conservatism, lack of resourcing and teacher resistance for the low uptake of digital technologies in schools. It does so by providing an empirical base for the development of a ‘third way’ of theorising technological and pedagogical innovation in schools, one which is more informed by students as critical stakeholders and thus more relevant to the lived culture within the school and its complex relationship to students’ lives outside of school. It is in this relationship that we find an explanation for how these individuals can, at one and the same time, be digital kids and analogue students.
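The CART modelling used in the study's quantitative phase repeatedly selects, at each tree node, the feature and threshold that minimise weighted Gini impurity. A minimal pure-Python illustration of that split-selection step (a sketch of the general CART technique, not the study's analysis code or data):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(X, y):
    """Find the (feature, threshold) split minimising weighted Gini impurity,
    as CART does at each tree node. X is a list of feature rows, y the labels."""
    best = (None, None, float("inf"))
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best[2]:
                best = (f, t, score)
    return best
```

Applied recursively to each resulting partition, this greedy split search grows the tree; the incremental models in the study simply widen the candidate feature set from individual, to individual plus social, to all predictors.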