32 results for "Analisi settore mercato software information technology"
Abstract:
Information technology has increased both the speed and medium of communication between nations. It has brought the world closer, but it has also created new challenges for translation — how we think about it, how we carry it out and how we teach it. Translation and Information Technology has brought together experts in computational linguistics, machine translation, translation education, and translation studies to discuss how these new technologies work, the effect of electronic tools, such as the internet, bilingual corpora, and computer software, on translator education and the practice of translation, as well as the conceptual gaps raised by the interface of human and machine.
Abstract:
Commerce is essentially the exchange of goods and services in various forms between sellers and buyers, together with associated financial transactions. Electronic Commerce (EC) is the process of conducting commerce through electronic means, including any electronic commercial activity supported by IT (information technology) (Adam and Yesha, 1996; Kambil, 1997; Yen, 1998). In this sense, EC is not totally new. Industries have used various EC platforms such as advertising on TV and ordering by telephone or fax. Internet Commerce (IC), or Web Commerce, is a specific type of EC (Maddox, 1998; Minoli D. and Minoli E., 1997). While some traditional EC platforms such as TV and telephone have been used to build “TV-gambling” and “telephone-betting” systems for conducting lottery business, Internet Lottery Commerce (ILC) has been assessed as the most promising type of EC in the foreseeable future. There are many social and moral issues relating to the conduct of lottery business on-line; however, this chapter does not debate these but deals only with business and technology issues. The purpose of this chapter is to provide a structured guide for senior executives and strategic planners who are planning on, or interested in, ILC deployment and operation. The guide consists of several stages: (1) an explanation of the industry segment’s traits, value chain, and current status; (2) an analysis of the competition and business issues in the Internet era and an evaluation of the strategic resources; (3) a planning framework that addresses major infrastructure issues; and (4) recommendations comprising the construction of an ILC model, suggested principles, and an approach to strategic deployment. The chapter demonstrates the case for applying the proposed guideline within the lottery business. Faced with a quickly changing technological context, it pays special attention to constructing a conceptual framework that addresses the key components of an ILC model. ILC fulfils the major activities in a lottery commerce value chain: advertising, selling and delivering products, collecting payments for tickets, and paying prizes. Although the guideline has been devised for lottery businesses, it can be applied to many other industry sectors.
Abstract:
To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, companies need to equip themselves with intelligent tools that enable managers to make better tactical decisions. However, the implementation of an intelligent system is always a challenge in Small- and Medium-sized Enterprises (SMEs). Therefore, a new and simple approach with a 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, such as managers and engineers, in making the right decisions for continual process improvement. Copyright © 2006 Inderscience Enterprises Ltd.
Abstract:
It has been suggested that, in order to maintain its relevance, critical research must develop a strong emphasis on empirical work rather than the conceptual emphasis that has typically characterized critical scholarship in management. A critical project of this nature is applicable in the information systems (IS) arena, which has a growing tradition of qualitative inquiry. Despite its relativist ontology, actor–network theory places a strong emphasis on empirical inquiry and this paper argues that actor–network theory, with its careful tracing and recording of heterogeneous networks, is well suited to the generation of detailed and contextual empirical knowledge about IS. The intention in this paper is to explore the relevance of IS research informed by actor–network theory in the pursuit of a broader critical research project as defined in earlier work.
Abstract:
Using current software engineering technology, the robustness required for safety-critical software is not assurable. However, different approaches are possible which can help to assure software robustness to some extent. To achieve highly reliable software, methods should be adopted which avoid introducing faults (fault avoidance); testing should then be carried out to identify any faults which persist (error removal); finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). The verification of correctness in the system design specification and the performance analysis of the model are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysing the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language in which interprocess interaction takes place by communication. This may lead to deadlock due to communication failure. Proper systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems. Where Occam is used as a design language, state space methods, such as Petri nets, can be used in analysis and simulation to determine the dynamic behaviour of the software, and to identify structures which may be prone to deadlock so that they may be eliminated from the design before the program is ever run. This design software tool consists of two parts. One takes an input program and translates it into a mathematical model (a Petri net), which is used for modelling and analysis of the concurrent software. The second part is the Petri-net simulator, which takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies 'deadlock potential' which the user can explore further. Finally, the software tool has been applied to a number of Occam programs. Two examples are given to show how the tool works in the early design phase for fault prevention before the program is ever run.
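The deadlock check described above hinges on exploring the reachability tree of the Petri net derived from the Occam program: any reachable marking in which no transition is enabled is a 'deadlock potential' for the user to examine. The sketch below illustrates that idea in Python; the net encoding, the breadth-first exploration and the two-process example are assumptions made for illustration, not the tool described in the thesis.

```python
from collections import deque

# A Petri net is given here as a list of transitions (pre, post), each a
# multiset over places; a marking maps places to token counts.

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] = m.get(p, 0) - n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n}

def reachability(initial, transitions):
    """Breadth-first construction of the reachable markings; also collects
    markings with no enabled transition (deadlock candidates)."""
    seen = {frozenset(initial.items())}
    queue = deque([initial])
    deadlocks = []
    while queue:
        m = queue.popleft()
        successors = [fire(m, pre, post) for pre, post in transitions if enabled(m, pre)]
        if not successors:
            deadlocks.append(m)
        for s in successors:
            key = frozenset(s.items())
            if key not in seen:
                seen.add(key)
                queue.append(s)
    return seen, deadlocks

# Two Occam-style processes that each wait for the other's message:
# a classic communication cycle that deadlocks immediately.
transitions = [
    ({"P_wait_Q": 1, "Q_sent": 1}, {"P_done": 1}),   # P receives from Q
    ({"Q_wait_P": 1, "P_sent": 1}, {"Q_done": 1}),   # Q receives from P
]
initial = {"P_wait_Q": 1, "Q_wait_P": 1}             # neither process has sent yet
_, deadlocks = reachability(initial, transitions)
print("deadlock candidates:", deadlocks)             # the initial marking is stuck
```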
Abstract:
This chapter provides the theoretical foundation and background on the data envelopment analysis (DEA) method. We first introduce the basic DEA models. The balance of the chapter focuses on evidence showing that DEA has been extensively applied for measuring the efficiency and productivity of services, including financial services (banking, insurance, securities, and fund management), professional services, health services, education services, environmental and public services, energy services, logistics, tourism, information technology, telecommunications, transport, distribution, audio-visual, media, entertainment, cultural and other business services. Finally, we provide information on the use of the Performance Improvement Management Software (PIM-DEA). A free, limited version of this software and the downloading procedure are also included in this chapter.
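As a rough illustration of the basic input-oriented CCR model that such a chapter typically introduces, the following sketch solves the DEA envelopment linear programme for each decision-making unit with SciPy. The toy input/output data and the variable layout are assumptions made for the example, not material from the chapter.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: inputs, shape (m_inputs, n_dmus); Y: outputs, shape (s_outputs, n_dmus).
    Solves: min theta  s.t.  X @ lam <= theta * X[:, o],  Y @ lam >= Y[:, o],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # decision vector [theta, lam_1..lam_n]; minimise theta
    A_in = np.c_[-X[:, [o]], X]                 # X lam - theta * x_o <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]         # -Y lam <= -y_o   (i.e. Y lam >= y_o)
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Toy data: 2 inputs, 1 output, 4 hypothetical branches.
X = np.array([[20.0, 30.0, 40.0, 20.0],
              [150.0, 200.0, 100.0, 200.0]])
Y = np.array([[100.0, 120.0, 130.0, 90.0]])
for o in range(X.shape[1]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```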
Abstract:
This research studies the use of strategic information technology to improve organisational effectiveness. It analyses different academic approaches that explain the nature of information systems and the need organisations feel to develop strategic information systems planning processes in order to improve organisational effectiveness. It chooses Managerial Cybernetics as the theoretical foundation supporting the development of a "Strategic Information Systems Planning" framework, and uses it to support the analysis of a documented account of the process lived through by the Colombian President's Office in 1990-1992. It argues that analysing the situation through this new framework may shed light on some previously unclear situations that have not yet been properly explained through other approaches to strategic information systems planning. The documented account explains the organisational context and strategic postures of the Colombian President's Office and the Colombian public sector at that time, as well as some of the strategic information systems that were defined and developed. In particular, it analyses a system developed jointly by the President's Office and the National Planning Department for measuring the results of the main national development programmes. It then reviews these situations in the light of the new framework and presents the main findings of the exercise. Finally, it reflects on the research exercise as a whole, the perceived usefulness of the chosen frameworks and tools in illuminating the real situations analysed, and some open research paths for future researchers interested in the issue.
Abstract:
Initially this thesis examines the various mechanisms by which technology is acquired within anodizing plants. In so doing, the history of the evolution of anodizing technology is recorded, with particular reference to the growth of major markets and to the contribution of the marketing efforts of the aluminium industry. The business economics of various types of anodizing plants are analyzed. Consideration is also given to the impact of developments in anodizing technology on production economics and market growth. The economic costs associated with work rejected for process defects are considered. Recent changes in the industry have created conditions whereby information technology has a potentially important role to play in retaining existing knowledge. One such contribution is exemplified by the expert system which has been developed for the identification of anodizing process defects. Instead of using a "rule-based" expert system, a commercial neural networks program has been adapted for the task. The advantage of neural networks over "rule-based" systems is that they are better suited to production problems, since the actual conditions prevailing when the defect was produced are often not known with certainty. In using the expert system, the user first identifies the process stage at which the defect probably occurred and is then directed to a file enabling the actual defect to be identified. After making this identification, the user can consult a database which gives a more detailed description of the defect, advises on remedial action and provides a bibliography of papers relating to the defect. The database uses a proprietary hypertext program, which also provides rapid cross-referencing to similar types of defect. Additionally, a graphics file can be accessed which (where appropriate) will display a graphic of the defect on screen. A total of 117 defects are included, together with 221 literature references, supplemented by 48 cross-reference hyperlinks. The main text of the thesis contains 179 literature references. (DX186565)
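The stage-then-defect consultation described above is essentially a small knowledge base keyed by process stage, with each defect record carrying a description, remedial advice, references and cross-links. A minimal sketch of such a structure in Python follows; the stage names, defect entries and field names are invented for illustration and are not taken from the thesis.

```python
from dataclasses import dataclass, field

@dataclass
class DefectRecord:
    description: str
    remedy: str
    references: list = field(default_factory=list)   # bibliography entries
    see_also: list = field(default_factory=list)     # hypertext-style cross-links

# Hypothetical knowledge base: process stage -> defect name -> record.
KNOWLEDGE_BASE = {
    "pretreatment": {
        "etch staining": DefectRecord(
            description="Dark streaks remaining after caustic etching.",
            remedy="Check etch bath composition and rinse times.",
            references=["Example reference (hypothetical)"],
            see_also=["anodizing:burning"],
        ),
    },
    "anodizing": {
        "burning": DefectRecord(
            description="Localised powdery film caused by excessive current density.",
            remedy="Reduce current density; improve electrolyte agitation.",
            see_also=["pretreatment:etch staining"],
        ),
    },
}

def consult(stage, defect):
    """Mimic the two-step lookup: process stage first, then the specific defect."""
    record = KNOWLEDGE_BASE[stage][defect]
    print(f"{stage} / {defect}: {record.description}")
    print("  Remedy:", record.remedy)
    for link in record.see_also:
        print("  See also:", link)

consult("anodizing", "burning")
```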
Abstract:
The existing literature has given little consideration to the social values of information technology in general, or of wireless technology in particular. The purpose of this paper is thus to shed new light on this issue. Based on an interpretive case study, we examine two healthcare organisations and discover that social values are often manifested beyond, as well as within, organisations. A matrix of social values in relation to technology changes and their interactions with various stakeholders is further discussed. The matrix helps in understanding how various social values emerge from and revolve around organisations’ strategic management of information technology. The implications of the findings about social values are discussed and future research directions are suggested.
Abstract:
Purpose – This Editorial Viewpoint explores the practical developments in manufacturing technology management (MTM) over the last 22 years and links these to some of the subject trends of previous articles in JMTM. Design/methodology/approach – Themes and relevant articles have been identified using the Emerald advanced search facility and linked with developments in hard technologies, information technology and production organisation. Findings – There are numerous examples where trends in the real world of MTM are reflected in changes to the orientation of JMTM articles, but there are still many articles following the well-worn paths of previous academic research. Research limitations/implications – Evidence for the findings comes only from a small sample of articles identified in one journal. Practical implications – Over time, practitioners can find useful connections between published research and their own emerging areas of concern. Originality/value – The paper is based on original bibliographic research, supplemented by extensive editorial and practical experience.
Abstract:
This book contains 11 carefully revised and selected papers from the 5th Workshop on Global Sourcing, held in Courchevel, France, March 14-17, 2011. They have been gleaned from a vast empirical base brought together by leading researchers in information systems, strategic management, and operations. This volume is intended for use by students, academics, and practitioners interested in the outsourcing and offshoring of information technology and business processes. It offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit for students and managers. The topics discussed combine theoretical and practical insights, and they are extensively illustrated by case studies from client and vendor organizations. Last but not least, the book examines current and future trends in outsourcing and offshoring, paying particular attention to how innovation can be realized in global or outsourced software development environments.
Abstract:
This edited book is intended for use by students, academics and practitioners who are interested in the outsourcing and offshoring of information technology and business processes. The book offers a review of the key topics in outsourcing and offshoring, populated with practical frameworks that serve as a tool kit for students and managers. The range of topics covered in this book is wide and diverse. Various governance and coordination mechanisms for managing outsourcing relationships are discussed in great depth, and the decision-making processes and considerations regarding sourcing arrangements, including multi-sourcing and cloud services, are examined. Vendors’ capabilities for managing global software development are studied in depth. Clients’ capabilities and issues related to compliance and culture are also discussed in association with various sourcing models. The topics discussed in this book combine theoretical and practical insights regarding the challenges that both clients and vendors face. Case studies from client and vendor organizations are used extensively throughout the book. Last but not least, the book examines current and future trends in outsourcing and offshoring, paying particular attention to the centrality of innovation in sourcing arrangements and to how innovation can be realized in outsourcing. The book is based on a vast empirical base brought together through years of extensive research by leading researchers in information systems, strategic management and operations.
Abstract:
As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability [1, 2]. For instance, delivery is faster, responses are received more quickly, and data collection can be automated or accelerated [1-3]. Online questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns, making such patterns virtually invisible to respondents. Like many new technologies, however, online questionnaires face criticism despite their advantages. Typically, such criticisms focus on the vulnerability of online questionnaires to the four standard survey error types: namely, coverage, non-response, sampling, and measurement errors. Although, like all survey errors, coverage error (“the result of not allowing all members of the survey population to have an equal or nonzero chance of being sampled for participation in a survey” [2, pg. 9]) also affects traditional survey methods, it is currently exacerbated in online questionnaires as a result of the digital divide. That said, many developed countries have reported substantial increases in computer and internet access and/or are targeting this as part of their immediate infrastructural development [4, 5]. These trends indicate that familiarity with information technologies is increasing and suggest that coverage error will rapidly diminish to an acceptable level (for the developed world at least) in the near future, and in so doing positively reinforce the advantages of online questionnaire delivery. The second error type – non-response error – occurs when individuals fail to respond to the invitation to participate in a survey or abandon a questionnaire before it is completed. Given today’s societal trend towards self-administration [2], the former is inevitable, irrespective of delivery mechanism. Conversely, non-response as a consequence of questionnaire abandonment can be addressed relatively easily. Unlike traditional questionnaires, the delivery mechanism for online questionnaires makes estimation of questionnaire length and the time required for completion difficult, thus increasing the likelihood of abandonment. By incorporating a range of features into the design of an online questionnaire, it is possible to facilitate such estimation – and indeed to provide respondents with context-sensitive assistance during the response process – and thereby reduce abandonment while eliciting feelings of accomplishment [6]. For online questionnaires, sampling error (“the result of attempting to survey only some, and not all, of the units in the survey population” [2, pg. 9]) can arise when all but a small portion of the anticipated respondent set is alienated (and so fails to respond) as a result of, for example, disregard for varying connection speeds, bandwidth limitations, browser configurations, monitors, hardware, and user requirements during the questionnaire design process. Similarly, measurement errors (“the result of poor question wording or questions being presented in such a way that inaccurate or uninterpretable answers are obtained” [2, pg. 11]) will lead to respondents becoming confused and frustrated.
Sampling, measurement, and non-response errors are likely to occur when an online questionnaire is poorly designed. Individuals will answer questions incorrectly, abandon questionnaires, and may ultimately refuse to participate in future surveys; thus, the benefit of online questionnaire delivery will not be fully realized. To prevent errors of this kind, and their consequences, it is extremely important that practical, comprehensive guidelines exist for the design of online questionnaires. Many design guidelines exist for paper-based questionnaires (e.g. [7-14]); the same is not true for the design of online questionnaires [2, 15, 16]. The research presented in this paper is a first attempt to address this discrepancy. Section 2 describes the derivation of a comprehensive set of guidelines for the design of online questionnaires and briefly (given space restrictions) outlines the essence of the guidelines themselves. Although online questionnaires reduce traditional delivery costs (e.g. paper, mail-out, and data entry), set-up costs can be high given the need either to adopt and acquire training in questionnaire development software or to secure the services of a web developer. Neither approach, however, guarantees a good questionnaire (often because the person designing the questionnaire lacks relevant knowledge of questionnaire design). Drawing on existing software evaluation techniques [17, 18], we assessed the extent to which current questionnaire development applications support our guidelines; Section 3 describes the framework used for the evaluation, and Section 4 discusses our findings. Finally, Section 5 concludes with a discussion of further work.
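The guideline-support assessment mentioned above amounts to scoring each questionnaire development application against a checklist of design guidelines. A minimal, hypothetical illustration in Python is shown below; the guideline names, tools and scoring scale are invented for the example and do not come from the paper.

```python
# Hypothetical checklist scoring: 0 = no support, 1 = partial, 2 = full.
GUIDELINES = ["progress indicator", "input validation", "skip logic", "context-sensitive help"]

scores = {
    "Tool A": {"progress indicator": 2, "input validation": 1, "skip logic": 2, "context-sensitive help": 0},
    "Tool B": {"progress indicator": 1, "input validation": 2, "skip logic": 0, "context-sensitive help": 1},
}

def coverage(tool_scores, guidelines, full=2):
    """Fraction of the maximum possible guideline support a tool achieves."""
    return sum(tool_scores.get(g, 0) for g in guidelines) / (full * len(guidelines))

for tool, s in scores.items():
    print(f"{tool}: {coverage(s, GUIDELINES):.0%} guideline coverage")
```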
Abstract:
Due to the huge popularity of portable terminals based on wireless LANs and the increasing demand for multimedia services from these terminals, earlier structures and protocols are insufficient to cover the requirements of emerging networks and communications. Most research in this field is tailored to finding more efficient ways to optimize the quality of wireless LANs with regard to the requirements of multimedia services. Our work investigates the effects of modulation modes at the physical layer, retry limits at the MAC layer and packet sizes at the application layer on the quality of media packet transmission. The interrelation among these parameters, with a view to extracting a cross-layer idea, is discussed as well. We show how these parameters from different layers jointly contribute to the performance of service delivery by the network. The results obtained could form a basis for suggesting independent optimization in each layer (an adaptive approach) or optimization of a set of parameters from different layers (a cross-layer approach). Our simulation model is implemented in the NS-2 simulator. Throughput and delay (latency) of packet transmission are the quantities assessed. © 2010 IEEE.
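A simplified analytical view of the cross-layer interaction studied above: the modulation mode fixes the data rate and bit-error rate, the retry limit bounds retransmissions, and the packet size sets the per-attempt exposure to errors, so the three together determine delivery probability, goodput and delay. The Python sketch below is an illustrative approximation only; the rates, error probabilities and formulas are assumptions for the example, not the paper's NS-2 results.

```python
import itertools

# Invented placeholder PHY parameters: name -> (data rate in Mbit/s, bit error rate).
MODULATION = {
    "BPSK":  (1.0, 1e-5),
    "QPSK":  (2.0, 5e-5),
    "16QAM": (4.0, 2e-4),
}

def evaluate(mod, retry_limit, packet_bytes):
    """Delivery probability, goodput and delay for one parameter combination."""
    rate_mbps, ber = MODULATION[mod]
    bits = packet_bytes * 8
    p_ok = (1 - ber) ** bits                        # per-attempt success probability
    p_fail_all = (1 - p_ok) ** (retry_limit + 1)    # dropped after all attempts
    # Expected number of attempts actually made (truncated geometric distribution).
    attempts = sum((k + 1) * p_ok * (1 - p_ok) ** k for k in range(retry_limit + 1))
    attempts += (retry_limit + 1) * p_fail_all
    tx_time_ms = bits / (rate_mbps * 1e3)           # per-attempt airtime in ms
    delay_ms = attempts * tx_time_ms
    goodput_mbps = (1 - p_fail_all) * bits / (delay_ms * 1e3)
    return 1 - p_fail_all, goodput_mbps, delay_ms

for mod, retries, size in itertools.product(MODULATION, [2, 4], [256, 1024]):
    d, g, t = evaluate(mod, retries, size)
    print(f"{mod:>6} retries={retries} size={size:>4}B  delivery={d:.3f}  goodput={g:.2f} Mbit/s  delay={t:.2f} ms")
```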
Abstract:
In this work, we present an adaptive unequal loss protection (ULP) scheme for H.264/AVC video transmission over lossy networks. This scheme combines erasure coding, H.264/AVC error resilience techniques and importance measures in video coding. The unequal importance of the video packets is identified at the group of pictures (GOP) and H.264/AVC data partitioning levels. The presented method can adaptively assign unequal amounts of forward error correction (FEC) parity across the video packets according to the network conditions, such as the available network bandwidth, packet loss rate and average packet burst loss length. A near-optimal algorithm is developed to perform the FEC assignment. The simulation results show that our scheme can effectively utilize network resources such as bandwidth, while improving the quality of the video transmission. In addition, the proposed ULP strategy ensures graceful degradation of the received video quality as the packet loss rate increases. © 2010 IEEE.
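One way to picture unequal FEC assignment is as distributing a fixed parity budget across importance classes (e.g. packets belonging to I-, P- and B-frames) so that the most important data receives the strongest protection for the current loss rate. The sketch below uses a simple greedy allocation under an independent-loss model; it is an illustrative stand-in, not the near-optimal algorithm or the burst-loss model of the paper, and all numbers are invented.

```python
from math import comb

def block_success(k, r, p):
    """Probability an erasure-coded block (k data, r parity packets) is recoverable
    when losses are independent with rate p: at most r losses out of k + r packets."""
    n = k + r
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(r + 1))

def allocate_parity(classes, budget, p):
    """Greedy unequal FEC allocation: repeatedly give one parity packet to the class
    with the largest marginal gain in importance-weighted recoverability.
    classes: list of (importance_weight, k_data_packets)."""
    r = [0] * len(classes)
    for _ in range(budget):
        gains = [
            w * (block_success(k, r[i] + 1, p) - block_success(k, r[i], p))
            for i, (w, k) in enumerate(classes)
        ]
        best = max(range(len(classes)), key=lambda i: gains[i])
        r[best] += 1
    return r

# Hypothetical GOP partitioning: I-frame data most important, then P, then B.
classes = [(1.0, 8), (0.6, 12), (0.3, 16)]   # (importance weight, data packets per block)
print(allocate_parity(classes, budget=10, p=0.05))
```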