833 results for Web self-service


Relevance:

100.00%

Publisher:

Abstract:

Self-service technology is affecting the service encounter. The potential reduction in personal contact through self-service technology may affect assessments of consumer satisfaction and commitment, making it necessary to investigate self-service technology usage, particularly the long-term impact on consumers' relationships with service organisations. Thus, this paper presents a framework for investigating the impact of self-service technology on consumer satisfaction and on a multi-dimensional measure of consumer commitment. Illustrative quotes from exploratory in-depth interviews support the framework and lead to a set of propositions. Future research directions for testing the framework are also discussed, and potential implications of this research are outlined.

Relevance:

100.00%

Publisher:

Abstract:

Advances in technology coupled with increasing labour costs have caused service firms to explore self-service delivery options. Although some studies have focused on self-service and the use of technology in service delivery, few have explored the role of service quality in consumer evaluation of technology-based self-service options. By integrating and extending the self-service quality framework, the service evaluation model and the Technology Acceptance Model, the authors address this emerging issue by empirically testing a comprehensive model that captures the antecedents and consequences of perceived service quality to predict continued customer interaction in the technology-based self-service context of Internet banking. Important service evaluation constructs such as perceived risk, perceived value and perceived satisfaction are modelled in this framework. The results show that perceived control has the strongest influence on service quality evaluations. Perceived speed of delivery, reliability and enjoyment also have a significant impact on service quality perceptions. The study also found that, even though perceived service quality, perceived risk and satisfaction are important predictors of continued interaction, perceived customer value plays a pivotal role in influencing continued interaction.

Relevance:

100.00%

Publisher:

Abstract:

This paper expands research into self-service technology in the service encounter. Self-service technology is where customers deliver a service themselves using some form of technological interface. A great deal is still unknown about self-service technology, in particular its impact on consumer satisfaction and consumer commitment. With that in mind, this empirical study explores the relative impact of self-service technology on consumer satisfaction and on a multidimensional measure of consumer commitment comprising affective commitment, temporal commitment and instrumental commitment. The results reveal that, in a hotel context, personal service remains very important for assessments of satisfaction and of affective and temporal commitment. What is particularly interesting is that self-service technology, while affecting these constructs, also affects instrumental commitment. This suggests that positive evaluations of self-service technology may tie consumers into relationships with hotels. A discussion and implications for managers are provided on these and other results, and the paper concludes with directions for further research.

Relevance:

100.00%

Publisher:

Abstract:

An interoperable web processing service (WPS) for the automatic interpolation of environmental data has been developed within the framework of the INTAMAP project. In order to assess the performance of the interpolation method implemented, a validation WPS has also been developed. This validation WPS can be used to perform leave-one-out and K-fold cross-validation: a full dataset is submitted and a range of validation statistics and diagnostic plots (e.g. histograms, variogram of residuals, mean errors) is received in return. This paper presents the architecture of the validation WPS, and a case study briefly illustrates its use in practice. We conclude with a discussion of the current limitations of the system and proposals for further developments.
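
The cross-validation workflow described above can be reproduced in miniature outside the WPS. The sketch below is an illustration only: it assumes a simple inverse-distance-weighting interpolator as a stand-in for the geostatistical methods INTAMAP actually exposes, and computes the same kind of validation statistics (mean error, RMSE, residuals) via leave-one-out and K-fold cross-validation.

```python
import numpy as np

def idw_predict(xy_known, z_known, xy_query, power=2.0):
    """Inverse-distance-weighted prediction at query points (stand-in interpolator)."""
    preds = []
    for q in xy_query:
        d = np.linalg.norm(xy_known - q, axis=1)
        if np.any(d == 0):                      # co-located point: return its value
            preds.append(z_known[d == 0][0])
            continue
        w = 1.0 / d**power
        preds.append(np.sum(w * z_known) / np.sum(w))
    return np.array(preds)

def k_fold_cv(xy, z, k=10, seed=0):
    """K-fold cross-validation; k == len(z) gives leave-one-out."""
    rng = np.random.default_rng(seed)
    fold_of = rng.permutation(len(z)) % k       # assign each point to a fold
    residuals = np.empty(len(z))
    for fold in range(k):
        test = fold_of == fold
        residuals[test] = z[test] - idw_predict(xy[~test], z[~test], xy[test])
    return {"mean_error": residuals.mean(),
            "rmse": np.sqrt(np.mean(residuals**2)),
            "residuals": residuals}

# Synthetic example: 200 points of a smooth field plus noise
xy = np.random.default_rng(1).uniform(0, 100, size=(200, 2))
z = np.sin(xy[:, 0] / 10.0) + 0.1 * np.random.default_rng(2).normal(size=200)
cv10 = k_fold_cv(xy, z, k=10)
loo = k_fold_cv(xy, z, k=len(z))                # leave-one-out
print(cv10["mean_error"], cv10["rmse"], loo["rmse"])
```

The residuals returned here are the kind of quantities the validation WPS turns into histograms and residual variograms.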

Relevance:

100.00%

Publisher:

Abstract:

The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBA) that is benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond simple data sharing to encourage the publishing and combination of models, an approach which can ease the handling of complex multi-disciplinary questions. It is the purpose of this paper to illustrate these concepts by presenting eHabitat, a basic Web Processing Service (WPS) for computing the likelihood of finding ecosystems with properties equal to those specified by a user. When chained with other services providing data on climate change, eHabitat can be used for ecological forecasting and becomes a useful tool for decision-makers assessing different strategies when selecting new areas to protect. eHabitat can use virtually any kind of thematic data considered useful for defining ecosystems and their future persistence under different climatic or development scenarios. The paper presents the architecture and illustrates the concepts through case studies which forecast the impact of climate change on protected areas or on the ecological niche of an African bird.
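
A common way to score how close a pixel's environmental properties are to those of a reference ecosystem is the Mahalanobis distance, converted to a probability via the chi-squared distribution. The sketch below illustrates that general idea only and is not taken from the eHabitat code base; the variable set, the multivariate-normal assumption and the function name are all illustrative.

```python
import numpy as np
from scipy import stats

def habitat_similarity(reference, candidates):
    """Likelihood that candidate pixels share the properties of the reference ecosystem.

    reference  : (n_pixels, n_variables) environmental variables sampled inside
                 the reference area (e.g. a protected area).
    candidates : (m_pixels, n_variables) values for the pixels to score.
    Returns values in [0, 1]; 1 means indistinguishable from the reference.
    """
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(reference, rowvar=False))
    diff = candidates - mu
    d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared Mahalanobis distance
    # Under a multivariate-normal assumption, d2 ~ chi2 with n_variables degrees of freedom.
    return 1.0 - stats.chi2.cdf(d2, df=reference.shape[1])

# Toy example with three hypothetical variables (temperature, rainfall, tree cover)
rng = np.random.default_rng(0)
ref = rng.normal([15.0, 800.0, 0.6], [2.0, 100.0, 0.1], size=(500, 3))
cand = np.array([[15.5, 820.0, 0.62],     # close to the reference conditions
                 [25.0, 200.0, 0.10]])    # very different conditions
print(habitat_similarity(ref, cand))
```

Chaining this with a climate service then amounts to re-running the scoring with the candidate variables replaced by their projected future values.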

Relevance:

100.00%

Publisher:

Abstract:

An increasingly interconnected environment that facilitates data sharing, the development of ever more responsive tools, and the use of increasingly targeted and effective algorithms for selecting the right information are some of the key factors that have enabled, and still enable, the growth, management, reuse and dissemination of the knowledge assets available to organisations. The continuous increase in computing resources has led organisations to rethink the role played by Business Intelligence, enriching it with new tools and procedures and creating new professional roles. The aim of this work is to provide an overview of Business Intelligence, its origins, and its relevance and usefulness in the business context. The first chapter deals with the discipline of Business Intelligence, covering in particular its definition, its history and its difference from Business Analytics; it then describes information systems and their components, and concludes with the architecture of a BI solution. The second chapter gives an overview of the Business Intelligence software available on the market and then presents Microsoft Power BI, focusing on its functionality and characteristics. The third chapter covers the project carried out during the internship: the implementation of new functionality and analyses on a BI application developed by the host company.

Relevance:

100.00%

Publisher:

Abstract:

The Future Internet is expected to be composed of a mesh of interoperable Web services accessed from all over the Web. This approach has not yet caught on, since global user-service interaction is still an open issue. Successful composite applications rely on heavyweight service orchestration technologies that raise the bar far above end-user skills. The weakness lies in the abstraction of the underlying service front-end architecture rather than in the infrastructure technologies themselves. In our opinion, the best approach is to offer end-to-end composition from user interface to service invocation, as well as an understandable abstraction of both building blocks and a visual composition technique. In this paper we formalize our vision of the next-generation front-end Web technology that will enable integrated access to services, contents and things in the Future Internet. We present a novel reference architecture designed to empower non-technical end users to create and share their own self-service composite applications. A tool implementing this architecture has been developed as part of the European FP7 FAST Project and the EzWeb Project, allowing us to validate the rationale behind our approach.

Relevance:

100.00%

Publisher:

Abstract:

Enabling real end-user development is the next logical stage in the evolution of Internet-wide service-based applications. Successful composite applications rely on heavyweight service orchestration technologies that raise the bar far above end-user skills. This weakness can be attributed to the fact that the composition model does not satisfy end-user needs, rather than to the actual infrastructure technologies. In our opinion, the best way to overcome this weakness is to offer end-to-end composition from the user interface to service invocation, plus an understandable abstraction of building blocks and a visual composition technique empowering end users to develop their own applications. In this paper, we present a visual framework for end users, called FAST, which fulfils this objective. FAST implements a novel composition model designed to empower non-programmer end users to create and share their own self-service composite applications in a fully visual fashion. The development environment implementing this model was built as part of the European FP7 FAST Project and was used to validate the rationale behind our approach.
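
The composition model itself is visual, but its essence, typed building blocks wired into a dataflow that runs from user-interface events down to service invocations, can be sketched in a few lines. The block names, the fact-based wiring and the mocked "service" below are illustrative only and are not the FAST vocabulary.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class BuildingBlock:
    """A block exposes named inputs/outputs and wraps some behaviour (UI or service)."""
    name: str
    run: Callable[[dict], dict]                  # maps available facts to new facts
    inputs: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)

def compose(blocks: List[BuildingBlock], initial_facts: Dict[str, object]) -> Dict[str, object]:
    """Naive forward-chaining composition: run a block once all its inputs are available."""
    facts = dict(initial_facts)
    pending = list(blocks)
    while pending:
        ready = [b for b in pending if all(i in facts for i in b.inputs)]
        if not ready:
            raise RuntimeError("composition cannot proceed: unsatisfied inputs")
        for block in ready:
            facts.update(block.run(facts))
            pending.remove(block)
    return facts

# Hypothetical self-service application: a form feeding a (mocked) service call and a list view
search_form = BuildingBlock("search_form", lambda f: {"query": "hotels in Vienna"},
                            outputs=["query"])
booking_api = BuildingBlock("booking_api", lambda f: {"results": [f["query"].title()]},
                            inputs=["query"], outputs=["results"])
result_list = BuildingBlock("result_list", lambda f: {"rendered": ", ".join(f["results"])},
                            inputs=["results"], outputs=["rendered"])

print(compose([result_list, booking_api, search_form], {})["rendered"])
```

The design point is that end users only connect matching inputs and outputs; the engine works out a valid execution order.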

Relevance:

100.00%

Publisher:

Abstract:

INTAMAP is a web processing service for the automatic interpolation of measured point data. The requirements were (i) to use open standards for spatial data, such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) to use a suitable environment for statistical modelling and computation, and (iii) to produce an open source solution. The system couples the 52°North web processing service, accepting data in the form of an Observations and Measurements (O&M) document, with a computing back-end realized in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a new markup language for encoding uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropies and extreme values. In the light of the INTAMAP experience, we discuss the lessons learnt.
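
A WPS of this kind is driven by POSTing an XML Execute request to the service endpoint. The fragment below, using a hypothetical endpoint URL, process identifier and output name, only illustrates the general OGC WPS 1.0.0 request pattern; the actual INTAMAP process names and input structure are given by its GetCapabilities and DescribeProcess responses.

```python
import requests

WPS_URL = "http://example.org/intamap/wps"   # hypothetical endpoint

execute_request = """<?xml version="1.0" encoding="UTF-8"?>
<wps:Execute service="WPS" version="1.0.0"
    xmlns:wps="http://www.opengis.net/wps/1.0.0"
    xmlns:ows="http://www.opengis.net/ows/1.1">
  <ows:Identifier>interpolation</ows:Identifier>
  <wps:DataInputs>
    <wps:Input>
      <ows:Identifier>observations</ows:Identifier>
      <wps:Data>
        <wps:ComplexData mimeType="text/xml">
          <!-- O&M observation collection with the measured point data goes here -->
        </wps:ComplexData>
      </wps:Data>
    </wps:Input>
  </wps:DataInputs>
  <wps:ResponseForm>
    <wps:RawDataOutput mimeType="text/xml">
      <ows:Identifier>predictions</ows:Identifier>
    </wps:RawDataOutput>
  </wps:ResponseForm>
</wps:Execute>"""

response = requests.post(WPS_URL, data=execute_request.encode("utf-8"),
                         headers={"Content-Type": "text/xml"})
response.raise_for_status()
print(response.text[:500])   # carries the predictions and their UncertML-encoded uncertainty
```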

Relevance:

100.00%

Publisher:

Abstract:

Interpolated data are an important part of environmental information exchange, as many variables can only be measured at discrete sampling locations. Spatial interpolation is a complex operation that has traditionally required expert treatment, making automation a serious challenge. This paper presents a few lessons learnt from INTAMAP, a project that is developing an interoperable web processing service (WPS) for the automatic interpolation of environmental data using advanced geostatistics, adopting a Service Oriented Architecture (SOA). The "rainbow box" approach we followed provides access to the functionality at a whole range of different levels. We show here how the integration of open standards, open source and powerful statistical processing capabilities allows us to automate a complex process while offering users a level of access and control that best suits their requirements. This facilitates benchmarking exercises as well as the regular reporting of environmental information without requiring remote users to have specialized skills in geostatistics.
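
One way to picture the "whole range of different levels" of access is a single interpolation entry point whose expert parameters are optional: anything omitted is estimated automatically, anything supplied overrides the automation. The sketch below illustrates that design idea with plain ordinary kriging and a crude automatic variogram choice; INTAMAP's real automatic fitting is considerably more sophisticated, and the function and parameter names here are not its API.

```python
from typing import Optional
import numpy as np

def _gamma(h, v):
    """Exponential semivariogram."""
    return np.where(h > 0, v["nugget"] + v["sill"] * (1.0 - np.exp(-h / v["range"])), 0.0)

def _auto_variogram(obs_xy, obs_z):
    """Crude automatic stand-in: sill from the sample variance, range from the extent."""
    extent = np.ptp(obs_xy, axis=0).mean()
    return {"sill": float(np.var(obs_z)), "range": extent / 3.0, "nugget": 0.0}

def interpolate(obs_xy, obs_z, pred_xy, variogram: Optional[dict] = None):
    """Automatic-by-default ordinary kriging facade; experts may pass their own variogram."""
    if variogram is None:                        # fully automatic "black box" use
        variogram = _auto_variogram(obs_xy, obs_z)
    n = len(obs_z)
    d_obs = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = _gamma(d_obs, variogram)         # ordinary kriging system
    A[n, n] = 0.0
    preds = []
    for p in pred_xy:
        b = np.append(_gamma(np.linalg.norm(obs_xy - p, axis=1), variogram), 1.0)
        preds.append(np.linalg.solve(A, b)[:n] @ obs_z)
    return np.array(preds)

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, (80, 2))
z = np.sin(xy[:, 0] / 15.0) + 0.05 * rng.normal(size=80)
grid = np.array([[20.0, 50.0], [75.0, 10.0]])
print(interpolate(xy, z, grid))                                              # fully automatic
print(interpolate(xy, z, grid, variogram={"sill": 0.5, "range": 30.0, "nugget": 0.01}))
```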

Relevance:

100.00%

Publisher:

Abstract:

INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. The requirements were (i) to use open standards for spatial data, such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) to use a suitable environment for statistical modelling and computation, and (iii) to produce an integrated, open source solution. The system couples an open-source Web Processing Service (developed by 52°North), accepting data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard), with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.
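
To give a feel for what an UncertML encoding of the interpolation error looks like, the snippet below serialises per-location prediction means and variances as a normal distribution. The namespace and element names are indicative only, an assumption of this sketch; the authoritative vocabulary is the UncertML schema itself.

```python
import xml.etree.ElementTree as ET

UN = "http://www.uncertml.org/2.0"   # indicative namespace; see the UncertML schema
ET.register_namespace("un", UN)

def normal_distribution_xml(means, variances):
    """Encode per-location interpolation errors as a normal distribution."""
    dist = ET.Element(f"{{{UN}}}NormalDistribution")
    ET.SubElement(dist, f"{{{UN}}}mean").text = " ".join(f"{m:.4f}" for m in means)
    ET.SubElement(dist, f"{{{UN}}}variance").text = " ".join(f"{v:.4f}" for v in variances)
    return ET.tostring(dist, encoding="unicode")

# Kriging mean and variance at three prediction locations (dummy numbers)
print(normal_distribution_xml([2.31, 2.58, 1.97], [0.12, 0.34, 0.08]))
```

Returning a distribution rather than a single value is what lets downstream services propagate the interpolation uncertainty instead of discarding it.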

Relevance:

100.00%

Publisher:

Abstract:

Models are central tools for modern scientists and decision makers, and there are many existing frameworks to support their creation, execution and composition. Many frameworks are based on proprietary interfaces and do not lend themselves to the integration of models from diverse disciplines. Web-based systems, or systems based on web services, such as Taverna and Kepler, allow the composition of models based on standard web service technologies. At the same time, the Open Geospatial Consortium has been developing its own service stack, which includes the Web Processing Service, designed to facilitate the execution of geospatial processing, including complex environmental models. The current Open Geospatial Consortium service stack employs Extensible Markup Language as a default data exchange standard, and widely used encodings such as JavaScript Object Notation can often only be used when incorporated within Extensible Markup Language. Similarly, the Web Processing Service standard has not successfully engaged with the well-supported technologies of Simple Object Access Protocol and Web Services Description Language. In this paper we propose a pure Simple Object Access Protocol/Web Services Description Language processing service which addresses some of the issues with the Web Processing Service specification and brings us closer to achieving a degree of interoperability between geospatial models, and thus realising the vision of a useful 'model web'.
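
The interoperability argument hinges on standard SOAP tooling being able to drive a processing service directly from its WSDL. The sketch below uses the zeep SOAP client against a hypothetical WSDL URL, operation and parameters; it shows the intended usage pattern rather than an existing service.

```python
# pip install zeep
from zeep import Client

# Hypothetical WSDL for a SOAP/WSDL-based geospatial processing service
WSDL_URL = "http://example.org/model-web/erosion-model?wsdl"

client = Client(WSDL_URL)

# Because the interface is described in WSDL, generic SOAP tooling (and workflow
# engines such as Taverna) can discover the operations and their typed parameters.
for service in client.wsdl.services.values():
    for port in service.ports.values():
        print(service.name, port.name)

# Invoke a (hypothetical) model operation with typed inputs
result = client.service.RunModel(
    rainfall_mm=42.0,
    land_cover="grassland",
    region_wkt="POLYGON((0 0, 0 1, 1 1, 1 0, 0 0))",
)
print(result)
```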

Relevance:

90.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies