773 results for Analisi settore mercato software information technology
Validation of a light-weight approach to knowledge-based re-engineering by a COBOL-to-Java converter
Abstract:
Information technology has increased both the speed and medium of communication between nations. It has brought the world closer, but it has also created new challenges for translation — how we think about it, how we carry it out and how we teach it. Translation and Information Technology has brought together experts in computational linguistics, machine translation, translation education, and translation studies to discuss how these new technologies work, the effect of electronic tools, such as the internet, bilingual corpora, and computer software, on translator education and the practice of translation, as well as the conceptual gaps raised by the interface of human and machine.
Abstract:
Commerce is essentially the exchange of goods and services in various forms between sellers and buyers, together with associated financial transactions. Electronic Commerce (EC) is the process of conducting commerce through electronic means, including any electronic commercial activity supported by IT (information technology) (Adam and Yesha, 1996; Kambil, 1997; Yen, 1998). In this sense, EC is not totally new. Industries have used various EC platforms such as advertising on TV and ordering by telephone or fax. Internet Commerce (IC), or Web Commerce, is a specific type of EC (Maddox, 1998; Minoli D. and Minoli E., 1997). While some traditional EC platforms such as TV and telephone have been used to build "TV-gambling" and "telephone-betting" systems for conducting lottery business, Internet Lottery Commerce (ILC) has been assessed as the most promising type of EC in the foreseeable future. There are many social and moral issues relating to the conduct of lottery business on-line. However, this chapter does not debate these but deals only with business and technology issues. The purpose of this chapter is to provide a structured guide for senior executives and strategic planners who are planning, or interested in, ILC deployment and operation. The guide consists of several stages: (1) an explanation of the industry segment's traits, value chain, and current status; (2) an analysis of the competition and business issues in the Internet era and an evaluation of the strategic resources; (3) a planning framework that addresses major infrastructure issues; and (4) recommendations comprising the construction of an ILC model, suggested principles, and an approach to strategic deployment. The chapter demonstrates the case for applying the proposed guideline within the lottery business. Faced with a quickly changing technological context, it pays special attention to constructing a conceptual framework that addresses the key components of an ILC model.
ILC carries out the major activities in a lottery commerce value chain: advertising, selling and delivering products, collecting payments for tickets, and paying prizes. Although the guideline has been devised for lottery businesses, it can be applied to many other industry sectors.
Abstract:
To meet the changing needs of customers and to survive in an increasingly globalised and competitive environment, companies need to equip themselves with intelligent tools that enable managers to make better tactical decisions. However, the implementation of an intelligent system is always a challenge in Small- and Medium-sized Enterprises (SMEs). Therefore, a new and simple approach with a 'process rethinking' ability is proposed to generate ongoing process improvements over time. In this paper, a roadmap for the development of an agent-based information system is described. A case example is also provided to show how the system can assist non-specialists, such as managers and engineers, in making the right decisions for continual process improvement. Copyright © 2006 Inderscience Enterprises Ltd.
Abstract:
It has been suggested that, in order to maintain its relevance, critical research must develop a strong emphasis on empirical work rather than the conceptual emphasis that has typically characterized critical scholarship in management. A critical project of this nature is applicable in the information systems (IS) arena, which has a growing tradition of qualitative inquiry. Despite its relativist ontology, actor–network theory places a strong emphasis on empirical inquiry and this paper argues that actor–network theory, with its careful tracing and recording of heterogeneous networks, is well suited to the generation of detailed and contextual empirical knowledge about IS. The intention in this paper is to explore the relevance of IS research informed by actor–network theory in the pursuit of a broader critical research project as defined in earlier work.
Abstract:
Using current software engineering technology, the robustness required for safety-critical software is not assurable. However, different approaches are possible which can help to assure software robustness to some extent. To achieve high-reliability software, methods should be adopted which avoid introducing faults (fault avoidance); then testing should be carried out to identify any faults which persist (error removal). Finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). The verification of correctness in the system design specification and the performance analysis of the model are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysis of the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language where interprocess interaction takes place through communication. This may lead to deadlock due to communication failure. Proper systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies, such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems.
Where Occam is used as a design language, state-space methods, such as Petri nets, can be used in analysis and simulation to determine the dynamic behaviour of the software, and to identify structures which may be prone to deadlock so that they may be eliminated from the design before the program is ever run. This design software tool consists of two parts. One takes an input program and translates it into a mathematical model (a Petri net), which is used for modelling and analysis of the concurrent software. The second part is the Petri-net simulator, which takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies 'deadlock potential' which the user can explore further. Finally, the software tool has been applied to a number of Occam programs. Two examples were taken to show how the tool works in the early design phase for fault prevention before the program is ever run.
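The reachability analysis described above can be sketched briefly: all markings reachable from the initial marking of a Petri net are enumerated, and any reachable marking in which no transition is enabled is reported as a deadlock. The Python sketch below uses an illustrative tuple encoding of markings and transitions; it is a minimal illustration of the technique, not the thesis tool itself.

```python
# Minimal sketch of Petri-net reachability analysis for deadlock detection.
# A marking is a tuple of token counts per place; a transition is a pair
# (consume, produce) of per-place token counts.  Encoding is illustrative.
from collections import deque

def reachability_deadlocks(initial, transitions):
    """Breadth-first exploration of the reachability set; returns every
    reachable marking in which no transition is enabled (a deadlock)."""
    seen = {initial}
    queue = deque([initial])
    deadlocks = []
    while queue:
        m = queue.popleft()
        # A transition is enabled if every place holds enough tokens.
        enabled = [
            (c, p) for c, p in transitions
            if all(m[i] >= c[i] for i in range(len(m)))
        ]
        if not enabled:
            deadlocks.append(m)  # nothing can fire from here: deadlock
            continue
        for c, p in enabled:
            # Fire the transition: remove consumed tokens, add produced ones.
            m2 = tuple(m[i] - c[i] + p[i] for i in range(len(m)))
            if m2 not in seen:
                seen.add(m2)
                queue.append(m2)
    return deadlocks

# A toy net that drains two tokens from place 0 into place 1; once both
# arrive, no transition is enabled, so the marking (0, 2) is a deadlock.
deadlocks = reachability_deadlocks((2, 0), [((1, 0), (0, 1))])
```

An exhaustive search like this is only feasible for bounded nets with modest state spaces, which is why the thesis builds the reachability tree as an early design-phase check rather than a runtime mechanism.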
Abstract:
This chapter provides the theoretical foundation and background on the data envelopment analysis (DEA) method. We first introduce the basic DEA models. The balance of the chapter focuses on evidence that DEA has been extensively applied for measuring the efficiency and productivity of services, including financial services (banking, insurance, securities, and fund management), professional services, health services, education services, environmental and public services, energy services, logistics, tourism, information technology, telecommunications, transport, distribution, audio-visual, media, entertainment, cultural, and other business services. Finally, we provide information on the use of the Performance Improvement Management Software (PIM-DEA). A free limited version of this software and the downloading procedure are also included in this chapter.
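As a concrete illustration of the basic DEA models the chapter introduces, the input-oriented CCR model in multiplier form can be solved as a small linear programme: maximise the weighted outputs of the unit under evaluation, subject to its weighted inputs equalling one and no unit scoring above one. The sketch below assumes SciPy is available; the function name and the two-DMU data are illustrative only.

```python
# Minimal sketch of the input-oriented CCR DEA model (multiplier form).
# For DMU o: max u.y_o  s.t.  v.x_o = 1,  u.y_j - v.x_j <= 0 for all j,
# u, v >= 0.  Solved here with scipy's linprog (HiGHS backend).
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency score of DMU `o`. X: (n, m) inputs, Y: (n, s) outputs."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (s of them), then input weights v.
    c = np.concatenate([-Y[o], np.zeros(m)])            # maximise u . y_o
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    b_eq = [1.0]                                        # v . x_o = 1
    A_ub = np.hstack([Y, -X])                           # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return -res.fun

# Illustrative data: two DMUs with one input and one output each.
X = np.array([[1.0], [2.0]])
Y = np.array([[1.0], [1.0]])
eff0 = ccr_efficiency(X, Y, 0)   # close to 1.0 (DMU 0 is efficient)
eff1 = ccr_efficiency(X, Y, 1)   # close to 0.5 (twice the input, same output)
```

A score of 1 marks a unit on the efficient frontier; lower scores indicate how far a unit falls short of its best-practice peers, which is the measurement idea applied across the service sectors listed above.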
Abstract:
This research studies the use of strategic information technology for improving organisational effectiveness. It analyses different academic approaches explaining the nature of information systems and the need organisations feel to develop strategic information systems planning processes to improve organisational effectiveness. It chooses Managerial Cybernetics as the theoretical foundation supporting the development of a "Strategic Information Systems Planning" framework, and uses it to support the analysis of a documented account of the process experienced by the Colombian President's Office in 1990-1992. It argues that analysing the situation through this new framework may shed light on some situations that were previously unclear and not properly explained by other approaches to strategic information systems planning. The documented history explains the organisational context and strategic postures of the Colombian President's Office and the Colombian public sector at that time, as well as some of the strategic information systems defined and developed. In particular, it analyses a system developed jointly by the President's Office and the National Planning Department for measuring the results of the main national development programmes. It then reviews these situations in the light of the new framework and presents the main findings of the exercise. Finally, it analyses the whole research exercise, the perceived usefulness of the chosen frameworks and tools in clarifying the real situations analysed, and some open research paths for future researchers interested in the issue.
Abstract:
Initially this thesis examines the various mechanisms by which technology is acquired within anodizing plants. In so doing the history of the evolution of anodizing technology is recorded, with particular reference to the growth of major markets and to the contribution of the marketing efforts of the aluminium industry. The business economics of various types of anodizing plants are analyzed. Consideration is also given to the impact of developments in anodizing technology on production economics and market growth. The economic costs associated with work rejected for process defects are considered. Recent changes in the industry have created conditions whereby information technology has a potentially important role to play in retaining existing knowledge. One such contribution is exemplified by the expert system which has been developed for the identification of anodizing process defects. Instead of using a "rule-based" expert system, a commercial neural-networks program has been adapted for the task. The advantage of neural networks over "rule-based" systems is that they are better suited to production problems, since the actual conditions prevailing when the defect was produced are often not known with certainty. In using the expert system, the user first identifies the process stage at which the defect probably occurred and is then directed to a file enabling the actual defects to be identified. After making this identification, the user can consult a database which gives a more detailed description of the defect, advises on remedial action and provides a bibliography of papers relating to the defect. The database uses a proprietary hypertext program, which also provides rapid cross-referencing to similar types of defect. Additionally, a graphics file can be accessed which (where appropriate) will display a graphic of the defect on screen.
A total of 117 defects are included, together with 221 literature references, supplemented by 48 cross-reference hyperlinks. The main text of the thesis contains 179 literature references. (DX186565)
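The two-step lookup described above (process stage first, then the defect within it, with cross-references and remedial advice attached to each record) can be sketched as a nested mapping. The stages, defect names, descriptions, and remedies below are purely illustrative, not taken from the thesis.

```python
# Illustrative sketch of the defect-identification lookup flow:
# stage -> defect -> record with description, remedy and cross-references.
# All names and advice here are hypothetical examples, not thesis content.
DEFECTS = {
    "pretreatment": {
        "etch staining": {
            "description": "Dark patches remaining after alkaline etching.",
            "remedy": "Check etch bath composition and rinse times.",
            "see_also": ["water staining"],
        },
    },
    "anodizing": {
        "burning": {
            "description": "Localised powdery film caused by excess current.",
            "remedy": "Reduce current density and improve agitation.",
            "see_also": [],
        },
    },
}

def lookup(stage, name):
    """Mirror the two-step flow: pick the process stage, then the defect.
    Returns the defect record, or None if stage/defect is unknown."""
    return DEFECTS.get(stage, {}).get(name)

record = lookup("anodizing", "burning")
```

The `see_also` lists stand in for the hypertext cross-references the thesis describes; in the real system these links, the bibliography, and the defect graphics are held in the proprietary hypertext database rather than in code.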