925 results for 700103 Information processing services
Abstract:
In economics of information theory, credence products are those whose quality is difficult or impossible for consumers to assess, even after they have consumed the product (Darby & Karni, 1973). This dissertation is focused on the content, consumer perception, and power of online reviews for credence services. Economics of information theory has long assumed, without empirical confirmation, that consumers will discount the credibility of claims about credence quality attributes. The same theories predict that because credence services are by definition obscure to the consumer, reviews of credence services are incapable of signaling quality. Our research aims to question these assumptions. In the first essay we examine how the content and structure of online reviews of credence services systematically differ from the content and structure of reviews of experience services and how consumers judge these differences. We have found that online reviews of credence services have either less important or less credible content than reviews of experience services and that consumers do discount the credibility of credence claims. However, while consumers rationally discount the credibility of simple credence claims in a review, more complex argument structure and the inclusion of evidence attenuate this effect. In the second essay we ask, “Can online reviews predict the worst doctors?” We examine the power of online reviews to detect low quality, as measured by state medical board sanctions. We find that online reviews are somewhat predictive of a doctor’s suitability to practice medicine; however, not all the data are useful. Numerical or star ratings provide the strongest quality signal; user-submitted text provides some signal but is subsumed almost completely by ratings. Of the ratings variables in our dataset, we find that punctuality, rather than knowledge, is the strongest predictor of medical board sanctions. These results challenge the definition of credence products, which is a long-standing construct in economics of information theory. Our results also have implications for online review users, review platforms, and for the use of predictive modeling in the context of information systems research.
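As a hedged illustration of the second essay's predictive-modeling setup, the sketch below fits a logistic regression that predicts sanction status from review ratings. The feature set (punctuality, knowledge, overall stars), the synthetic data, and the effect sizes are all hypothetical; this is not the dissertation's actual model or dataset.

```python
# A minimal sketch (not the dissertation's model) of predicting medical board
# sanctions from review ratings; features and data are synthetic assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
# Synthetic 1-5 star ratings per doctor; sanctions are rare and, echoing the
# abstract's finding, tied more strongly to punctuality than to knowledge.
punctuality = rng.integers(1, 6, n)
knowledge = rng.integers(1, 6, n)
overall = rng.integers(1, 6, n)
logit = -2.0 - 0.8 * (punctuality - 3) - 0.2 * (knowledge - 3)
sanctioned = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([punctuality, knowledge, overall])
X_tr, X_te, y_tr, y_te = train_test_split(X, sanctioned, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
print("coefficients (punctuality, knowledge, overall):", model.coef_[0])
```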
Abstract:
This book constitutes the revised selected papers from the 10th Global Sourcing Workshop held in Val d’Isère, France, in February 2016. The 11 papers presented in this volume were carefully reviewed and selected from 47 submissions. The book offers a review of the key topics in outsourcing and offshoring of information technology and business services, with practical frameworks that serve as a toolkit for students and managers. The range of topics covered is wide and diverse, but predominantly focused on how to achieve success in shared services and outsourcing. More specifically, the book examines outsourcing decisions and management practices, giving specific attention to shared services, which have become one of the dominant sourcing models. The topics discussed combine theoretical and practical insights regarding challenges that industry leaders, policy makers, and professionals face or should be concerned with. Case studies from various organizations, industries, and countries such as the UK, Italy, the Netherlands, Canada, Australia, and Denmark complete the book.
Abstract:
The parallel mutation-selection evolutionary dynamics, in which mutation and replication are independent events, is solved exactly in the case where the Malthusian fitnesses associated with the genomes are described by the random energy model (REM) and by a ferromagnetic version of the REM. The solution method maps the evolutionary dynamics onto a quantum Ising chain in a transverse field and uses the Suzuki-Trotter formalism to calculate the transition probabilities between configurations at different times. We find that in the case of the REM landscape the dynamics can exhibit three distinct regimes: pure diffusion or stasis for short times, depending on the fitness of the initial configuration, and a spin-glass regime for long times. The transition between these dynamical regimes is marked by discontinuities in the mean fitness as well as in the overlap with the initial reference sequence. The relaxation to equilibrium follows an inverse-time decay. In the ferromagnetic REM we find, in addition to these three regimes, a ferromagnetic regime in which the overlap and the mean fitness are frozen. In this case, the system relaxes to equilibrium in a finite time. The relevance of our results to information-processing aspects of evolution is discussed.
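The abstract's solution is analytic, via the quantum Ising mapping; as a hedged numerical companion, the sketch below integrates the same parallel (decoupled) mutation-selection dynamics directly on a small REM landscape by exponentiating the generator, tracking mean fitness and the overlap with the initial sequence. Chain length, mutation rate, and the fitness scale are illustrative assumptions, not the paper's parameters.

```python
# Numerical sketch of parallel mutation-selection dynamics on a REM landscape:
# Malthusian fitnesses are i.i.d. Gaussians, mutation flips single bits at
# rate mu, and the linear dynamics dp/dt = (F + mu*M) p is solved by matrix
# exponentiation for a short chain. All parameters are illustrative.
import itertools
import numpy as np
from scipy.linalg import expm

L, mu = 8, 0.1
rng = np.random.default_rng(1)
configs = np.array(list(itertools.product([0, 1], repeat=L)))
N = len(configs)
fitness = rng.normal(0.0, np.sqrt(L), N)          # REM: i.i.d. Gaussian fitnesses

# Mutation generator: rate mu to each single-bit-flip neighbour.
M = np.zeros((N, N))
for i in range(N):
    for k in range(L):
        j = i ^ (1 << k)                          # neighbour with bit k flipped
        M[j, i] = 1.0
M -= np.diag(M.sum(axis=0))                       # columns sum to zero

A = np.diag(fitness) + mu * M                     # parallel: selection + mutation add
p0 = np.zeros(N); p0[0] = 1.0                     # start on the all-zeros genome

for t in [0.5, 2.0, 10.0]:
    p = expm(A * t) @ p0
    p /= p.sum()                                  # normalise the unbounded growth away
    overlap = p @ (1 - 2 * configs.mean(axis=1))  # overlap with the reference sequence
    print(f"t={t:5.1f}  mean fitness={p @ fitness:+.3f}  overlap={overlap:+.3f}")
```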
Abstract:
The Wyner-Ziv video coding (WZVC) rate-distortion performance is highly dependent on the quality of the side information, an estimate of the original frame created at the decoder. This paper characterizes the WZVC efficiency when motion-compensated frame interpolation (MCFI) techniques are used to generate the side information, a difficult problem in WZVC especially because the decoder has available only some decoded reference frames. The proposed WZVC compression efficiency rate model relates the power spectral density of the estimation error to the accuracy of the MCFI motion field. Some interesting conclusions may then be drawn regarding the impact of the motion field's smoothness, and of its correlation with the true motion trajectories, on the compression performance.
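As a hedged sketch of the MCFI step the abstract builds on, the code below estimates block motion between two decoded reference frames by full search and halves the resulting vectors to synthesise the missing middle frame as side information. Block size, search range, and the toy frames are assumptions for illustration; the paper's rate model itself is analytic and is not reproduced here.

```python
# Minimal block-based motion-compensated frame interpolation (MCFI) sketch:
# motion is estimated between two reference frames and halved to synthesise
# the missing middle frame, the decoder-side side information in WZVC.
import numpy as np

def mcfi(ref0, ref2, block=8, search=4):
    h, w = ref0.shape
    side_info = np.zeros_like(ref0)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            target = ref2[y:y+block, x:x+block]
            best, best_dy, best_dx = np.inf, 0, 0
            # Full search for this block's motion vector from ref2 back to ref0.
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        cand = ref0[yy:yy+block, xx:xx+block]
                        sad = np.abs(cand.astype(int) - target.astype(int)).sum()
                        if sad < best:
                            best, best_dy, best_dx = sad, dy, dx
            # Halve the vector: the middle frame sits midway along the trajectory.
            yy, xx = y + best_dy // 2, x + best_dx // 2
            side_info[y:y+block, x:x+block] = ref0[yy:yy+block, xx:xx+block]
    return side_info

# Toy usage: a bright square translating 4 pixels between the reference frames.
f0 = np.zeros((32, 32), np.uint8); f0[8:16, 8:16] = 255
f2 = np.zeros((32, 32), np.uint8); f2[8:16, 12:20] = 255
print("estimated middle-frame non-zero columns:", np.nonzero(mcfi(f0, f2).sum(axis=0))[0])
```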
Abstract:
7th Mediterranean Conference on Information Systems (MCIS 2012), Guimaraes, Portugal, September 8-10, 2012, Proceedings. Series: Lecture Notes in Business Information Processing, Vol. 129.
Abstract:
In Distributed Computer-Controlled Systems (DCCS), both real-time and reliability requirements are of major concern. Architectures for DCCS must be designed considering the integration of processing nodes and the underlying communication infrastructure, and such integration must be provided by appropriate software support services. In this paper, an architecture for DCCS is presented, its structure is outlined, and the services provided by the support software are described. These services are intended to guarantee the real-time and reliability requirements imposed by current and future systems.
Abstract:
In heterogeneous environments, the diversity of resources among devices may affect their ability to perform services under specific QoS constraints and may drive peers to group themselves into a coalition for cooperative service execution. The dynamic selection of peers should be influenced by the user's QoS requirements as well as by local computation availability, tailoring the provided service to the user's specific needs. However, complex dynamic real-time scenarios may rule out computing optimal service configurations before execution. An iterative refinement approach with the ability to trade off deliberation time for the quality of the solution is proposed. We stress the importance of quickly finding a good initial solution and propose heuristic evaluation functions that optimise the rate at which the quality of the current solution improves as the algorithms are given more time to run.
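A minimal sketch of the anytime iterative-refinement idea described above, under toy assumptions: a utility matrix stands in for per-peer QoS levels, a greedy pass supplies a good initial solution quickly, and local swap moves consume whatever deliberation time remains. None of the names or parameters come from the paper.

```python
# Anytime iterative refinement: fast greedy initialisation, then budgeted
# local search that always keeps the best assignment found so far.
import random
import time

def anytime_assignment(utility, budget_s=0.05, seed=0):
    rng = random.Random(seed)
    n = len(utility)                              # square toy instance: n tasks, n peers
    # Good initial solution fast: tasks claim their best still-free peer.
    free, assign = set(range(n)), [0] * n
    for t in range(n):
        assign[t] = max(free, key=lambda p: utility[t][p])
        free.discard(assign[t])
    score = sum(utility[t][assign[t]] for t in range(n))
    deadline = time.monotonic() + budget_s
    # Refinement: random pairwise swaps, accepted only when they improve quality.
    while time.monotonic() < deadline:
        a, b = rng.randrange(n), rng.randrange(n)
        delta = (utility[a][assign[b]] + utility[b][assign[a]]
                 - utility[a][assign[a]] - utility[b][assign[b]])
        if delta > 0:
            assign[a], assign[b] = assign[b], assign[a]
            score += delta
    return assign, score

rng = random.Random(42)
U = [[rng.random() for _ in range(20)] for _ in range(20)]
print("utility after budgeted refinement:", anytime_assignment(U)[1])
```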
Abstract:
In today's increasingly competitive and demanding business environment, a company's ability to reach and improve the satisfaction levels demanded by its customers is a key success factor. To identify the improvements to implement, companies must be able to monitor and control all of their activities and processes. Tracking activities delegated to external companies, such as the transport of goods, becomes difficult when the providers of these services lack supporting tools that make the necessary information available. The need to overcome this difficulty in collecting information during the distribution of an order at Caetano Parts, a reseller of automotive spare parts, led to the development of a tool that tracks an order through all of its stages, allowing the operations manager to follow the order's status from the moment it is placed, through its processing inside the facilities, until its delivery to the customer. The developed system consists of two components: the front-end and the back-end. The front-end comprises a web application and an Android application for mobile devices. The web application provides database management, order-status tracking, and analysis of operations. The Android application is made available to the companies responsible for transporting the orders and enables online updating of information about the delivery process. The back-end comprises the information storage and processing unit; it is hosted on an Internet-connected server and exposes an interface to the mobile service in the form of a web service. The design, development, and description of the tool's functionality are covered throughout this work. Tests carried out during development validated the tool's correct operation, and it is ready for a pilot test.
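As a hedged sketch of the kind of web-service interface such a back-end might expose to the Android client, the snippet below uses Flask (an assumption; the thesis does not name a framework) with an in-memory store: couriers post delivery-status events and the operations manager reads an order's tracking history.

```python
# Hypothetical order-tracking web service: endpoints and storage are
# illustrative assumptions, not the thesis's actual implementation.
from flask import Flask, request, jsonify

app = Flask(__name__)
orders = {}                      # order_id -> list of status events (toy store)

@app.post("/orders/<order_id>/status")
def update_status(order_id):
    event = request.get_json()   # e.g. {"status": "delivered", "ts": "..."}
    orders.setdefault(order_id, []).append(event)
    return jsonify(ok=True), 201

@app.get("/orders/<order_id>")
def track(order_id):
    return jsonify(history=orders.get(order_id, []))

if __name__ == "__main__":
    app.run()                    # the mobile client would call these endpoints
```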
Abstract:
International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP 2015), 7-9 April 2015, Singapore.
Abstract:
Doctoral thesis in Information Systems and Technologies.
Abstract:
Extensible Markup Language (XML) is a generic computing language that provides an outstanding case study in the commodification of service standards. The development of this language in the late 1990s marked a shift in computer science, as its extensibility makes it possible to store and share any kind of data; many office software suites rely on it. The chapter highlights how the largest multinational firms pay special attention to gaining a recognised international standard for such a major technological innovation. It argues that standardisation processes affect market structures and can lead to market capture. By examining how a strategic use of standardisation arenas can generate profits, it shows that Microsoft succeeded in making its own technical solution a recognised ISO standard in 2008, even though the same arena had already adopted, two years earlier, the open source standard set by IBM and Sun Microsystems. Yet XML standardisation also helped to establish a distinct model of information technology services at the expense of Microsoft's monopoly on proprietary software.
Abstract:
Forensic science is generally defined as the application of science to address questions related to the law. Too often, this view restricts the contribution of science to a single process that ultimately aims at bringing individuals to court while minimising the risk of miscarriages of justice. In order to go beyond this paradigm, we propose to refocus attention on traces themselves, as remnants of a criminal activity, and on their information content. We postulate that traces contribute effectively to a wide variety of other informational processes that support decision making in many situations. In particular, they inform actors in new policing strategies that place the treatment of information and intelligence at the centre of their systems. This contribution of forensic science to these security-oriented models is still not well identified and captured. In order to create the best conditions for the development of forensic intelligence, we suggest a framework that connects forensic science to intelligence-led policing (part I). Crime scene attendance and processing can be envisaged within this view. This approach gives indications about how to structure the knowledge used by crime scene examiners in their actual practice (part II).
Abstract:
VALOSADE (Value Added Logistics in Supply and Demand Chains) is a research project of Anita Lukka's VALORE (Value Added Logistics Research) research team at Lappeenranta University of Technology. VALOSADE is included in the ELO (Ebusiness logistics) technology programme of Tekes (the Finnish Technology Agency). SMILE (SME-sector, Internet applications and Logistical Efficiency) is one of the four subprojects of VALOSADE. SMILE research focuses on a case network composed of small and medium-sized mechanical maintenance service providers and global wood-processing customers. The basic principle of the SMILE study is communication and ebusiness in the supply and demand network. This first phase of the research concentrates on creating the background for the SMILE study and for the ebusiness solutions of the maintenance case network. The focus is on general trends of ebusiness in the supply chains and networks of different industries; the total ebusiness system architecture of company networks; the ebusiness strategy of a company network; the information value chain; the different factors that influence the ebusiness solution of a company network; and the correlation between ebusiness and competitive advantage. Literature, interviews, and benchmarking were used as research methods in this qualitative case study. Networks and end-to-end supply chains are organizational structures that can add value for the end customer. Information is one of the key factors in these decentralized structures. Because of the decentralization of business, information is produced and used in different companies and in different information systems. Information refinement services are needed to manage information flows in company networks between different systems. Furthermore, some new solutions, such as network information systems, are utilised in optimising network performance and in standardizing common network processes. Some cases have, however, indicated that utilization of ebusiness in a decentralized business model is not always a necessity; the value added by ICT must be defined case-specifically. In the theory part of the report, different ebusiness and architecture models are introduced. These models are compared to the empirical case data in the research results. The biggest difference between the theory and the empirical data is that the models are mainly developed for large-scale companies, not for SMEs, because implemented network ebusiness solutions have mainly been centred on large companies. Genuine SME-centred network ebusiness models are quite rare, and studies in that area have been few in number. Business relationships between customers and their SME suppliers nowadays concentrate more on collaborative tactical and strategic initiatives, in addition to transaction-based operational initiatives. However, ebusiness systems are still mainly based on the exchange of operational transactional data. Collaborative ebusiness solutions are in the planning or pilot phase in most of the case companies. Furthermore, many ebusiness solutions today involve only two participants, while network-wide and end-to-end supply chain transparency and information systems are quite rare. Transaction volumes, data formats, the types of exchanged information, information criticality, the type and duration of the business relationship, the partners' internal information systems, and processes and operating models (e.g. different ordering models) differ among network companies; furthermore, companies are at different stages of networking and ebusiness readiness. Because of these factors, different customer-supplier combinations in the network must utilise entirely different ebusiness architectures, technologies, systems, and standards.
Abstract:
This thesis supplements the systematic approach to competitive intelligence and competitor analysis by introducing an information-processing perspective on the management of the competitive environment and the competitors within it. The cognitive questions connected to the intelligence process, as well as the means that organizational actors use in sharing information, are discussed. The ultimate aim has been to deepen knowledge of the different intraorganizational processes that a corporate organization uses to manage and exploit the vast amount of competitor information it receives from the environment. Competitor information and competitive knowledge management is examined as a process in which organizational actors identify and perceive the competitive environment by using cognitive simplification, make interpretations that result in learning, and finally utilize competitor information and competitive knowledge in their work processes. The sharing of competitive information and competitive knowledge is facilitated by intraorganizational networks that evolve as a means of developing a shared, organizational-level knowledge structure and of ensuring that the right information is in the right place at the right time. This thesis approaches competitor information and competitive knowledge management both theoretically and empirically. Based on a conceptual framework developed by theoretical elaboration, further understanding of the studied phenomena is sought through an empirical study, carried out in a multinationally operating forest industry company. The thesis makes some preliminary suggestions for improving the competitive intelligence process. It is concluded that managing competitor information and competitive knowledge is not simply a question of managing information flow or improving the sophistication of competitor analysis; the crucial question is rather how to improve the cognitive capabilities connected to identifying and interpreting the competitive environment and how to increase learning. It is claimed that competitive intelligence cannot be treated as an organizational function or assigned solely to a specialized intelligence unit.