814 results for Service-Based Architecture
Abstract:
The creative industries idea is better than even its original perpetrators might have imagined, judging from the original mapping documents. By throwing the heavy duty copyright industries into the same basket as public service broadcasting, the arts and a lot of not-for-profit activity (public goods) and commercial but non-copyright-based sectors (architecture, design, increasingly software), it really messed with the minds of economic and cultural traditionalists. And, perhaps unwittingly, it prepared the way for understanding the dynamics of contemporary cultural ‘prosumption’ or ‘playbour’ in an increasingly networked social and economic space.
Abstract:
Synthetic polymers have attracted much attention in tissue engineering due to their ability to modulate biomechanical properties. This study investigated the feasibility of processing poly(ε-caprolactone) (PCL) homopolymer, PCL-poly(ethylene glycol) (PEG) diblock, and PCL-PEG-PCL triblock copolymers into three-dimensional porous scaffolds. Properties of the various polymers were investigated by dynamic thermal analysis. The scaffolds were manufactured using the desktop robot-based rapid prototyping technique. Gross morphology and internal three-dimensional structure of scaffolds were identified by scanning electron microscopy and micro-computed tomography, which showed excellent fusion at the filament junctions, high uniformity, and complete interconnectivity of pore networks. The influences of process parameters on scaffolds' morphological and mechanical characteristics were studied. Data confirmed that the process parameters directly influenced the pore size, porosity, and, consequently, the mechanical properties of the scaffolds. The in vitro cell culture study was performed to investigate the influence of polymer nature and scaffold architecture on the adhesion of the cells onto the scaffolds using rabbit smooth muscle cells. Light, scanning electron, and confocal laser microscopy showed cell adhesion, proliferation, and extracellular matrix formation on the surface as well as inside the structure of both scaffold groups. The completely interconnected and highly regular honeycomb-like pore morphology supported bridging of the pores via cell-to-cell contact as well as production of extracellular matrix at later time points. The results indicated that the incorporation of hydrophilic PEG into hydrophobic PCL enhanced the overall hydrophilicity and cell culture performance of PCL-PEG copolymer. However, the scaffold architecture did not significantly influence the cell culture performance in this study.
Abstract:
Evidence-based Practice (EBP) has recently emerged as a topic of discussion amongst professionals within the library and information services (LIS) industry. Simply stated, EBP is the process of using formal research skills and methods to assist in decision making and establishing best practice. The emerging interest in EBP within the library context serves to remind the library profession that research skills and methods can help ensure that the library industry remains current and relevant in changing times. The LIS sector faces ongoing challenges in terms of the expectation that financial and human resources will be managed efficiently, particularly if library budgets are reduced and accountability to the principal stakeholders is increased. Library managers are charged with the responsibility to deliver relevant and cost effective services, in an environment characterised by rapidly changing models of information provision, information access and user behaviours. Consequently they are called upon not only to justify the services they provide, or plan to introduce, but also to measure the effectiveness of these services and to evaluate the impact on the communities they serve. The imperative for innovation in and enhancements to library practice is accompanied by the need for a strong understanding of the processes of review, measurement, assessment and evaluation. In 2001 the Centre for Information Research was commissioned by the Chartered Institute of Library and Information Professionals (CILIP) in the UK to conduct an examination into the research landscape for library and information science. The examination concluded that research is “important for the LIS [library and information science] domain in a number of ways” (McNicol & Nankivell, 2001, p.77). 
At the professional level, research can inform practice, assist in the future planning of the profession, raise the profile of the discipline, and indeed the reputation and standing of the library and information service itself. At the personal level, research can “broaden horizons and offer individuals development opportunities” (McNicol & Nankivell, 2001, p.77). The study recommended that “research should be promoted as a valuable professional activity for practitioners to engage in” (McNicol & Nankivell, 2001, p.82). This chapter will consider the role of EBP within the library profession. A brief review of key literature in the area is provided. The review considers issues of definition and terminology, highlights the importance of research in professional practice and outlines the research approaches that underpin EBP. The chapter concludes with a consideration of the specific application of EBP within the dynamic and evolving field of information literacy (IL).
Abstract:
Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states, and over 1000 single-base RTK systems to support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future generation GNSS constellations, including modernised GPS, Galileo, GLONASS, and Compass, with multiple frequencies, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of various isolated network and single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of positioning processing shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK)
These four challenges give rise to two major requirements for NRTK data processing: expandable computing power and scalable data-sharing/transfer capability. This research explores new approaches to address these future NRTK challenges and requirements using the Grid Computing facility, in particular for large data processing burdens and complex computation algorithms.
A Grid Computing based NRTK framework is proposed in this research: a layered framework consisting of 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. The user’s request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework was performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open-source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results have preliminarily demonstrated the concepts and functionality of the new Grid Computing based NRTK framework, whilst some aspects of the system's performance remain to be improved in future work.
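The layered flow described above (a portal request entering the service layer and being scheduled onto grid nodes in the execution layer) can be sketched in miniature; all node names and the round-robin scheduling policy below are illustrative assumptions, not details of the QUT trial:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical grid nodes standing in for the execution layer; the trial
# used a five-node Grid environment at QUT and on Grid Australia.
GRID_NODES = ["qut-node-1", "qut-node-2", "qut-node-3", "ga-node-1", "ga-node-2"]

def rtk_job(node, station, rtcm_frame):
    """Stand-in for the simplified RTK computation run on one grid node.
    A real job would decode the RTCM frame and resolve carrier-phase
    ambiguities; here we only tag the frame with its worker node."""
    return (node, station, len(rtcm_frame))

def schedule(requests):
    """Service layer: distribute user requests round-robin over grid nodes."""
    with ThreadPoolExecutor(max_workers=len(GRID_NODES)) as pool:
        futures = [pool.submit(rtk_job, GRID_NODES[i % len(GRID_NODES)], st, data)
                   for i, (st, data) in enumerate(requests)]
        return [f.result() for f in futures]

# Client layer (Grid portal) submits RTCM frames from reference stations.
results = schedule([("CORS-A", b"\xd3\x00\x13"), ("CORS-B", b"\xd3\x00\x21")])
print(results[0][0])  # first request lands on qut-node-1
```

The round-robin policy is the simplest possible scheduler; a production framework would schedule on node load and data locality instead.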
Abstract:
Since the 1980s, industries and researchers have sought to better understand the quality of services, owing to its rising importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model, SERVQUAL, which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, fails to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives of SQ as multidimensional and multi-level, a hierarchical approach to SQ measurement that better reflects how humans form perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision. Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension as ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define them. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively; that is, customers evaluate each primary dimension (or each higher-level SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach depends on the objective(s) of the study.
Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overhead. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it can identify the areas that need improvement.
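The contrast between the two measurement approaches can be made concrete with a small numeric sketch; the 7-point ratings below are invented for illustration and are not data from the study:

```python
# Hypothetical 7-point ratings for the five SERVQUAL dimensions.
expectations = {"reliability": 6.5, "assurance": 6.0, "tangibles": 5.0,
                "empathy": 6.0, "responsiveness": 6.5}
perceptions  = {"reliability": 5.5, "assurance": 6.0, "tangibles": 5.5,
                "empathy": 4.5, "responsiveness": 5.5}

# Perceptions-only: overall SQ is simply the mean perception rating.
perceptions_only = sum(perceptions.values()) / len(perceptions)

# Disconfirmation: per-dimension gap = perception - expectation;
# negative gaps flag the dimensions that need improvement.
gaps = {d: perceptions[d] - expectations[d] for d in expectations}
shortfalls = sorted((d for d in gaps if gaps[d] < 0), key=lambda d: gaps[d])

print(round(perceptions_only, 2))  # 5.4
print(shortfalls[0])               # empathy has the largest shortfall
```

The sketch shows why the choice depends on the objective: the single perceptions-only score is easy to administer and compare, while the gap scores localise where the service falls short of expectations.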
Abstract:
This paper highlights challenges in implementing mental health policy at a service delivery level. It describes an attempt to foster greater application of recovery-orientated principles and practices within mental health services. Notwithstanding a highly supportive policy environment, strong support from service administrators, and an enthusiastic staff response to training, application of the training and support tools was weaker than anticipated. This paper evaluates the dissemination trial against key elements to promote sustained adoption of innovations. Organisational and procedural changes are required before mental health policies are systematically implemented in practice.
Abstract:
A continuing challenge for pre-service teacher education is the transfer of learning between the university-based and the practical school-based components of training. It is not clear how easily pre-service teachers can transfer university learnings into ‘in school’ practice. Similarly, it is not clear how easily knowledge learned in the school context can be disembedded from that particular context and understood more generally by the pre-service teacher. This paper examines the effect of a community of practice formed specifically to explore learning transfer via collaboration and professional enquiry, in ‘real time’, across the globe. Activity Theory (Engestrom, 1999) provided the theoretical framework through which the cognitive, physical and social processes involved could be understood. For the study, three activity systems formed the community of practice network. The first activity system involved pre-service teachers at a large university in Queensland, Australia. The second activity system was introduced by the pre-service teachers and involved Year 12 students and teachers at a private secondary school, also in Queensland, Australia. The third activity system involved university staff engineers at a large university in Pennsylvania, USA. The common object among the three activity systems was to explore the principles and applications of nanotechnology. The participants in the two Queensland activity systems controlled laboratory equipment (a high-powered Atomic Force Microscope – CPII) in Pennsylvania, USA, with the aim of investigating surface topography and the properties of nanoparticles. The pre-service teachers were to develop their remote ‘real time’ experience into school classroom tasks, implement these tasks, and later report their findings to other pre-service teachers in the university activity system. As an extension to the project, the pre-service teachers were invited to co-author papers relating to the project.
Data were collected from (a) reflective journals; (b) participant field notes – a pre-service teacher initiative; (c) surveys – a pre-service teacher initiative; (d) lesson reflections and digital recordings – a pre-service teacher initiative; and (e) interviews with participants. The findings are reported in terms of three major themes: boundary crossing, the philosophy of teaching, and professional relationships. The findings have implications for teacher education. The researchers suggest that deliberate planning for networking between activity systems may well be a solution to the apparent theory/practice gap; proximity of activity systems need not be a hindrance.
Abstract:
With service interaction modelling, it is customary to distinguish between two types of models: choreographies and orchestrations. A choreography describes interactions within a collection of services from a global perspective, where no service plays a privileged role; instead, services interact in a peer-to-peer manner. In contrast, an orchestration describes the interactions between one particular service, the orchestrator, and a number of partner services. The main proposition of this work is an approach to bridge these two modelling viewpoints by synthesising orchestrators from choreographies. To start with, choreographies are defined using a simple behaviour description language based on communicating finite state machines. From such a model, orchestrators are initially synthesised in the form of state machines. It turns out that state machines are not suitable for orchestration modelling, because orchestrators generally need to engage in concurrent interactions. To address this issue, a technique is proposed to transform state machines into process models in the Business Process Modelling Notation (BPMN). Orchestrations represented in BPMN can then be augmented with additional business logic to achieve value-adding mediation. In addition, techniques exist for refining BPMN models into executable process definitions. The transformation from state machines to BPMN relies on Petri nets as an intermediary representation and leverages techniques from the theory of regions to identify concurrency in the initial Petri net. Once concurrency has been identified, the resulting Petri net is transformed into a BPMN model. The original contributions of this work are: an algorithm to synthesise orchestrators from choreographies, and a rule-based transformation from Petri nets into BPMN.
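The synthesis idea can be illustrated in miniature for a purely sequential choreography. This sketch is not the work's algorithm (which operates on communicating finite state machines and handles concurrency via Petri nets and the theory of regions); it only shows the mediation principle: each global interaction sender -> receiver : m becomes a receive from the sender followed by a forwarding send to the receiver in the orchestrator's state machine.

```python
# A toy choreography as a list of interactions (sender, receiver, message);
# role and message names are illustrative.
choreography = [
    ("Customer", "Supplier", "Order"),
    ("Supplier", "Customer", "Invoice"),
    ("Customer", "Supplier", "Payment"),
]

def synthesise_orchestrator(interactions):
    """Derive an orchestrator state machine that mediates every interaction:
    each interaction a -> b : m becomes a receive 'a?m' followed by a
    forwarding send 'b!m', so the orchestrator sits between all peers."""
    states, transitions = [0], []
    for sender, receiver, msg in interactions:
        s = states[-1]
        transitions.append((s, f"{sender}?{msg}", s + 1))       # receive m
        transitions.append((s + 1, f"{receiver}!{msg}", s + 2))  # forward m
        states.extend([s + 1, s + 2])
    return states, transitions

states, transitions = synthesise_orchestrator(choreography)
print(len(transitions))  # 6: one receive and one send per interaction
```

A sequential machine like this is exactly what becomes inadequate once interactions may interleave, which is why the work moves to Petri nets and BPMN to express concurrency.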
Abstract:
We argue that web service discovery technology should help users navigate a complex problem space by suggesting services which they may not be able to formulate themselves, because they lack the epistemic resources to do so. Free-text documents in service environments provide an untapped source of information for augmenting the epistemic state of users and hence their ability to search effectively for services. A quantitative approach to semantic knowledge representation is adopted in the form of semantic space models computed from these free-text documents. Knowledge of the user’s agenda is promoted by associational inferences computed from the semantic space. The inferences are suggestive and aim to promote human abductive reasoning, guiding the user from fuzzy search goals towards a better understanding of the problem space surrounding the given agenda. Experimental results are discussed based on a complex and realistic planning activity.
Abstract:
Community service learning (CSL) is the integration of experiential learning and community service into coursework such that community needs are met and students gain both professional skills and a sense of civic responsibility. A critical component is student reflection. This paper provides an example of the application of community service learning within an undergraduate health unit at the Queensland University of Technology. Based on survey data from 36 program participants, it demonstrates the impact of CSL on student outcomes. Results show that students benefited by developing autonomy through real-world experiences, by increasing their self-assurance and achieving personal growth, by gaining new insights into the operations of community service organisations, and by moving towards becoming responsible citizens. Students expect their CSL experience to have a long-lasting impact on their lives, with two-thirds of participants noting that they would like to continue volunteering as part of their future development.
Abstract:
Modern enterprise knowledge management systems typically require distributed approaches and the integration of numerous heterogeneous sources of information. A powerful foundation for these tasks can be Topic Maps, which not only provide a semantic-net-like means of knowledge representation and the possibility of using ontologies to model knowledge structures, but also offer concepts for linking these knowledge structures with unstructured data stored in files, external documents, etc. In this paper, we present the architecture and prototypical implementation of a Topic Map application infrastructure, the ‘Topic Grid’, which enables transparent, node-spanning access to different Topic Maps distributed in a network.
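A toy sketch of the node-spanning access idea; all names are hypothetical, and the actual Topic Grid is a distributed service infrastructure rather than in-memory dictionaries:

```python
# Two topic map fragments held on different network nodes, each mapping
# topics to their occurrences (links into unstructured external documents).
node_a = {
    "knowledge-management": ["handbook.pdf"],
    "ontologies": ["owl-primer.html"],
}
node_b = {
    "ontologies": ["modelling-notes.txt"],
    "topic-maps": ["iso13250-summary.doc"],
}

class TopicGrid:
    """Facade that merges lookups across all registered nodes, so a client
    sees one virtual topic map regardless of where the fragments live."""
    def __init__(self, nodes):
        self.nodes = nodes

    def occurrences(self, topic):
        found = []
        for node in self.nodes:
            found.extend(node.get(topic, []))
        return found

grid = TopicGrid([node_a, node_b])
print(grid.occurrences("ontologies"))  # merged from both nodes
```

The point of the facade is transparency: the client asks for a topic once and does not need to know which node, or how many nodes, hold fragments for it.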
Abstract:
Services in the form of business services or IT-enabled (Web) Services have become a corporate asset of high interest in striving towards the agile organisation. However, while the design and management of a single service are widely studied and well understood, little is known about how a set of services can be managed. This gap motivated this paper, in which we explore the concept of Service Portfolio Management. In particular, we propose a Service Portfolio Management Framework that explicates service portfolio goals, tasks, governance issues, methods and enablers. The Service Portfolio Management Framework is based upon a thorough analysis and consolidation of existing, well-established portfolio management approaches. From an academic point of view, the Service Portfolio Management Framework can be positioned as an extension of portfolio management conceptualisations in the area of service management. Based on the framework, possible directions for future research are provided. From a practical point of view, the Service Portfolio Management Framework provides an organisation with a novel approach to managing its emerging service portfolios.
Abstract:
This position paper examines the development of a dedicated service aggregator role in business networks. We predict that these intermediaries will soon emerge in service ecosystems and add value by applying dedicated domain knowledge to the creation of new, innovative services or service bundles based on the aggregation, composition, integration or orchestration of existing services procured from different service providers in the service ecosystem. We discuss the general foundations of service aggregators and present Fourth-Party Logistics Providers as a real-world example of emerging business service aggregators. We also identify a need for future research, e.g. into governance models, risk management tools, service portfolio management approaches and service bundling techniques, to better understand the core determinants of the competitiveness and success of service aggregators.
Abstract:
Cultural objects are increasingly generated and stored in digital form, yet effective methods for their indexing and retrieval remain an important area of research. The main problem arises from the disconnect between the content-based indexing approach used by computer scientists and the description-based approach used by information scientists. There is also a lack of representational schemes that allow the alignment of semantics and context with keywords and low-level features that can be automatically extracted from the content of these cultural objects. This paper presents an integrated approach to address these problems, taking advantage of both computer science and information science approaches. We first discuss the requirements from a number of perspectives: users, content providers, content managers and technical systems. We then present an overview of our system architecture and describe the techniques which underlie the major components of the system, including automatic object category detection; user-driven tagging; metadata transform and augmentation; and an expression language for digital cultural objects. In addition, we discuss our experience in testing and evaluating some existing collections, analyse the difficulties encountered and propose ways to address them.
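One way to picture the interplay of automatic category detection, user-driven tagging and metadata augmentation is the following toy sketch; the keyword sets, field names and matching rule are invented for illustration and are not the paper's techniques:

```python
# A stand-in for automatic object category detection: match low-level
# terms extracted from an object's content against per-category keywords.
CATEGORY_KEYWORDS = {
    "painting": {"canvas", "oil", "brushwork"},
    "sculpture": {"bronze", "marble", "carved"},
}

def detect_category(extracted_terms):
    """Pick the category whose keyword set best overlaps the terms
    extracted from the object's content; fall back to 'unknown'."""
    scores = {c: len(kw & extracted_terms) for c, kw in CATEGORY_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

def augment(record, extracted_terms, user_tags):
    """Metadata transform and augmentation: merge the curated record,
    the detected category, and user-driven tags into one searchable record."""
    return {**record,
            "category": detect_category(extracted_terms),
            "tags": sorted(set(user_tags))}

record = {"title": "Untitled", "creator": "Anonymous"}
augmented = augment(record, {"oil", "canvas"}, ["impressionism", "landscape"])
print(augmented["category"])  # painting
```

The augmented record illustrates the paper's central aim: content-derived features (the detected category) and human descriptions (curated fields and tags) end up in one representation that either indexing tradition can query.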