989 results for legacy system


Relevance:

100.00%

Publisher:

Abstract:

When reengineering legacy systems, it is crucial to assess whether the legacy behavior has been preserved or how it changed due to the reengineering effort. Ideally, if a legacy system is covered by tests, running the tests on the new version can identify potential differences or discrepancies. However, writing tests for an unknown and large system is difficult due to the lack of internal knowledge. It is especially difficult to bring the system to an appropriate state. Our solution is based on the acknowledgment that one of the few trustworthy pieces of information available when approaching a legacy system is the running system itself. Our approach reifies the execution traces and uses logic programming to express tests on them. Thereby it eliminates the need to programmatically bring the system into a particular state, and hands the test writer a high-level abstraction mechanism to query the trace. The resulting system, called TESTLOG, was used on several real-world case studies to validate our claims.
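As a rough illustration of the idea, the following minimal Python sketch (hypothetical; not TESTLOG's actual logic-programming interface) reifies trace events as plain data records and expresses a behavioural check as a declarative query over them, rather than by driving the system into a particular state.

```python
# Assumed reified trace: one record per message send captured at run time.
# The event fields and the example property are illustrative assumptions.
trace = [
    {"receiver": "Account#42", "selector": "deposit", "args": (100,), "result": 100},
    {"receiver": "Account#42", "selector": "withdraw", "args": (30,), "result": 70},
    {"receiver": "Account#42", "selector": "balance", "args": (), "result": 70},
]

def events(trace, **pattern):
    """Return trace events matching every key/value in the pattern (a tiny 'query')."""
    return [e for e in trace if all(e.get(k) == v for k, v in pattern.items())]

# "Test": after each withdraw, the next balance query must report the withdraw's result.
withdraws = events(trace, selector="withdraw")
balances = events(trace, selector="balance")
assert all(w["result"] == b["result"] for w, b in zip(withdraws, balances))
```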

Relevance:

70.00%

Publisher:

Abstract:

Recent disasters have shown that having clearly defined preventive procedures and decisions is a critical component that minimizes evacuation hazards and ensures a rapid and successful evolution of evacuation plans. In this context, we present SASEP, our Situation-Aware System for enhancing Evacuation Plans, which allows end users to create business rules that technically support the specific events, conditions and actions related to evacuation plans. An experimental validation was carried out in which 32 people faced a simulated emergency situation, 16 of them using SASEP and the other 16 using a legacy system based on static signs. From the results obtained, we compare both techniques and discuss in which situations SASEP offers a better evacuation route option, confirming that it is highly valuable when there is a threat on the evacuation route. In addition, a study of user satisfaction with both systems is presented, showing in which cases the systems are assessed as satisfactory, relevant and not frustrating.
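The event-condition-action style of business rule the abstract refers to can be sketched as follows; this is a hypothetical Python example whose names, fields and rule are illustrative assumptions, not SASEP's actual API.

```python
from dataclasses import dataclass

@dataclass
class Situation:
    event: str            # e.g. "threat_detected"
    blocked_route: str    # route affected by the event
    current_route: str    # route currently assigned to the evacuee

def evaluate_rule(situation, alternative_routes):
    """Event-condition-action rule: redirect only if the assigned route is affected."""
    if situation.event == "threat_detected":                         # event
        if situation.current_route == situation.blocked_route:       # condition
            return alternative_routes.get(situation.blocked_route)   # action
    return situation.current_route

routes = {"stairwell_A": "stairwell_B"}
s = Situation("threat_detected", blocked_route="stairwell_A", current_route="stairwell_A")
print(evaluate_rule(s, routes))   # -> "stairwell_B"
```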

Relevance:

70.00%

Publisher:

Abstract:

This thesis presents an experimental investigation of different effects and techniques that can be used to upgrade legacy WDM communication systems. The main issue in upgrading legacy systems is that the fundamental setup, including component settings such as EDFA gains, should not be altered; thus the improvement must be carried out at the network terminals. A general introduction to optical fibre communications is given at the beginning, including optical communication components and system impairments. Experimental techniques for performing laboratory optical transmission experiments are presented before the experimental work of this thesis. These techniques include optical transmitter and receiver designs as well as the design and operation of the recirculating loop.

The main experimental work comprises three studies. The first involves the development of line monitoring equipment that can be reliably used to monitor the performance of optically amplified long-haul undersea systems. This equipment can instantly locate faults along the legacy communication link, which in turn enables rapid repairs and hence an upgrade of the legacy system. The second study investigates the effect of changing the number of transmitted 1s and 0s on the performance of a WDM system. This effect can, in practice, be seen in some coding schemes, e.g. the forward-error correction (FEC) technique, where the proportions of 1s and 0s are changed at the transmitter by adding extra bits to the original bit sequence. The final study presents transmission results after all-optical format conversion from NRZ to CSRZ and from RZ to CSRZ using a semiconductor optical amplifier in a nonlinear optical loop mirror (SOA-NOLM). This study is mainly motivated by the fact that all-optical processing, including format conversion, has become attractive for future data networks that are proposed to be all-optical. The feasibility of the SOA-NOLM device for converting single-channel and WDM signals is described. The optical conversion bandwidth and its limitations for WDM conversion are also investigated.

All studies in this thesis employ 10 Gbit/s single-channel or WDM signals transmitted over a dispersion-managed fibre span in the recirculating loop. The fibre span is composed of single-mode fibres (SMF) whose losses and dispersion are compensated using erbium-doped fibre amplifiers (EDFAs) and dispersion compensating fibres (DCFs), respectively. Different configurations of the fibre span are presented in different parts.
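The premise of the second study, that overhead bits added at the transmitter shift the proportion of 1s and 0s actually sent, is simple arithmetic. The tiny calculation below is purely illustrative (not taken from the thesis), and the all-ones overhead is an assumption chosen only to make the shift visible.

```python
import random

random.seed(1)
payload = [random.randint(0, 1) for _ in range(1000)]   # roughly balanced data bits
overhead = [1] * 70                                      # assumed extra bits added at the transmitter
frame = payload + overhead

def mark_ratio(bits):
    """Fraction of 1s in the transmitted sequence."""
    return sum(bits) / len(bits)

print(f"payload mark ratio: {mark_ratio(payload):.3f}")
print(f"framed  mark ratio: {mark_ratio(frame):.3f}")
```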

Relevance:

60.00%

Publisher:

Abstract:

Adopting a model of job enrichment, we report on a longitudinal case study investigating the perceived impact of an Enterprise Resource Planning (ERP) system on users' job design characteristics. Our results indicated that, in the context of an ERP geared towards centralisation and standardisation, the extent to which users perceived an increase or decrease in job enrichment was associated with aspects such as formal authority and the nature of their work role. Experienced operational employees proficient in the original legacy system perceived ERP system protocols to constrain their actions, limit training and increase dependence on others in the workflow. Conversely, managerial users reported a number of benefits relating to report availability, improved organisational transparency and increased overall job enrichment. These results supported our argument concerning the relationship between ERPs with a standardisation intent and positive job enrichment outcomes for managerial users and negative job-related outcomes for operational users.

Relevance:

60.00%

Publisher:

Abstract:

To meet users' need to customize the definition of a software system's internal business processes, a transformation platform for legacy software systems, JSPMP, was designed for the popular J2EE platform on the basis of workflow technology. JSPMP uses the process-definition and task-management capabilities of a workflow engine to drive the execution of the system's business processes, giving the software system dynamic customizability in the definition of its business processes. Practice has shown that, with this approach, developers can rapidly transform legacy systems, satisfy users' diverse process-customization requirements, and greatly reduce transformation and customization costs.
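A minimal sketch of the underlying idea, assuming a generic workflow engine rather than JSPMP's actual classes: the business process is a data-level definition interpreted by the engine, so end users can re-customize the flow without changing the wrapped legacy code.

```python
# Assumed process definition supplied by the end user (just an ordered list of steps).
process_definition = ["submit_request", "manager_approval", "archive"]

# Legacy functions wrapped as workflow tasks (illustrative stand-ins).
def submit_request(ctx):   ctx["status"] = "submitted"
def manager_approval(ctx): ctx["status"] = "approved"
def archive(ctx):          ctx["status"] = "archived"

TASKS = {f.__name__: f for f in (submit_request, manager_approval, archive)}

def run_process(definition, ctx):
    """The engine drives execution purely from the process definition."""
    for step in definition:
        TASKS[step](ctx)
    return ctx

print(run_process(process_definition, {"request": "travel expense"}))
```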

Relevance:

60.00%

Publisher:

Abstract:

Migrating legacy systems with web services is an effective and economical way of reusing legacy software in an SOA environment. In this paper, we present an approach for migrating a three-tier object-oriented legacy system to an SOA environment. The key issue in the approach is service identification from large numbers of classes. We propose a bottom-up method that models the system with UML and then identifies services from the UML models. This approach can serve as a reference for an automated migration process.
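One way to picture the bottom-up identification step is sketched below. This is a hypothetical Python example: the class names, the dependency data and the connected-components heuristic are illustrative assumptions, not the paper's method.

```python
from collections import defaultdict

# Assumed class-dependency edges recovered from a reverse-engineered UML model.
dependencies = {
    "OrderController": {"OrderService", "OrderDAO"},
    "OrderService":    {"OrderDAO"},
    "OrderDAO":        set(),
    "CustomerView":    {"CustomerService"},
    "CustomerService": {"CustomerDAO"},
    "CustomerDAO":     set(),
}

def candidate_services(deps):
    """Group tightly coupled classes into candidate services via connected components."""
    adj = defaultdict(set)
    for cls, targets in deps.items():
        adj[cls]                      # ensure isolated classes appear
        for t in targets:
            adj[cls].add(t)
            adj[t].add(cls)
    seen, services = set(), []
    for cls in adj:
        if cls in seen:
            continue
        stack, group = [cls], set()
        while stack:
            c = stack.pop()
            if c in group:
                continue
            group.add(c)
            stack.extend(adj[c] - group)
        seen |= group
        services.append(sorted(group))
    return services

print(candidate_services(dependencies))
# e.g. [['OrderController', 'OrderDAO', 'OrderService'],
#       ['CustomerDAO', 'CustomerService', 'CustomerView']]
```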

Relevance:

60.00%

Publisher:

Abstract:

Sharing of information with those in need of it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that it is imperative to have well-organized schemes for retrieval and also for discovery. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron.

The investigations are focused on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the insights required. This is manifested in the Election Counting and Reporting Software (ECRS) system, a distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports. Most distributed systems of the nature of ECRS possess a "fragile architecture" that makes them liable to collapse when minor faults occur. This is resolved with the help of the proposed penta-tier architecture, which places five different technologies at different tiers. The results of the experiments conducted and their analysis show that such an architecture helps keep the different components of the software insulated from internal or external faults. The architecture thus evolved needed a mechanism to support information processing and discovery, which necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary.

The other empirical study was to find out which of the two prominent markup languages, namely HTML and XML, is better suited for the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken. The result was in favor of XML.

The concepts of the infotron and the infotron dictionary were applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and results in brewing the information required to satisfy the need of the information discoverer by utilizing the documents available at its disposal (the information space). The various components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interact with the multiple infotron dictionaries maintained in the system.

In order to demonstrate the working of the IDS and to discover information without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced with the augmentation of IDS, leading to an information discovery service. IDLIS demonstrates IDS in action and proves that any legacy system can be augmented with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
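A minimal sketch of how an infotron dictionary might guide discovery, using hypothetical Python data structures (the dictionary entries, documents and scoring are illustrative assumptions, not the thesis's implementation): an infotron supplied as a clue is expanded via the dictionary into related terms, which are then matched against the documents in the information space.

```python
# Assumed infotron dictionary: maps an infotron to terms that elaborate it.
INFOTRON_DICTIONARY = {
    "constituency": {"ward", "polling booth", "returning officer"},
    "tally":        {"votes counted", "round", "lead"},
}

DOCUMENTS = {
    "doc1": "Round 3 tally: votes counted per ward, current lead 1,204.",
    "doc2": "Library circulation report for March.",
}

def discover(infotron, documents=DOCUMENTS, dictionary=INFOTRON_DICTIONARY):
    """Return documents ranked by how many expanded terms they mention."""
    terms = {infotron} | dictionary.get(infotron, set())
    scores = {
        doc_id: sum(term in text.lower() for term in terms)
        for doc_id, text in documents.items()
    }
    return sorted(((s, d) for d, s in scores.items() if s > 0), reverse=True)

print(discover("tally"))   # doc1 matches several expanded terms
```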

Relevance:

60.00%

Publisher:

Abstract:

Semantic Analysis is a business analysis method designed to capture system requirements. While these requirements may be represented as text, the method also advocates the use of Ontology Charts to formally denote the system's required roles, relationships and forms of communication. Following model-driven engineering techniques, Ontology Charts can be transformed into temporal database schemas, class diagrams and component diagrams, which can then be used to produce software systems. A nice property of these transformations is that the resulting design models lend themselves to complicated extensions that do not require changes to the models themselves. For example, resulting databases can be extended with new types of data without the need to modify the database schema of the legacy system. Semantic Analysis is not widely used in software engineering, so there is a lack of experts in the field and no design patterns are available. This makes it difficult for analysts to pass organizational knowledge to the engineers. This study describes an implementation that is readily usable by engineers, including an automated technique that can produce a prototype from an Ontology Chart. The use of such tools should enable developers to make use of Semantic Analysis with minimal expertise in ontologies and MDA.
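The chart-to-schema transformation can be pictured with a small, hypothetical sketch; the chart encoding and the generated DDL below are illustrative assumptions, not the paper's tooling. Each affordance becomes a temporal table whose validity is bounded by start and finish times.

```python
# Tiny, simplified "ontology chart": each affordance lists its antecedents.
ontology_chart = {
    "person": (),
    "school": (),
    "enrols": ("person", "school"),
}

def to_temporal_schema(chart):
    """Emit one table per affordance, with start/finish columns for its validity period."""
    ddl = []
    for affordance, antecedents in chart.items():
        cols = [f"{a}_id INTEGER NOT NULL" for a in antecedents]
        cols += ["start_time TIMESTAMP NOT NULL", "finish_time TIMESTAMP"]
        ddl.append(
            f"CREATE TABLE {affordance} (\n  id INTEGER PRIMARY KEY,\n  "
            + ",\n  ".join(cols) + "\n);"
        )
    return "\n".join(ddl)

print(to_temporal_schema(ontology_chart))
```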

Relevance:

60.00%

Publisher:

Abstract:

This dissertation describes the development of a project, carried out over a span of more than two years within the scope of the Arrowhead Framework, which bears my personal contribution in several sections. The final part of the project took place during a visiting period at the University of Luleå. The Arrowhead Project is a European project, belonging to the ARTEMIS association, which aims to foster new technologies and unify access to them within a single framework. Such technologies include the Internet of Things phenomenon, Smart Houses, Electrical Mobility and renewable energy production. An application is considered compliant with the framework when it respects the Service-Oriented Architecture paradigm and is able to interact with a set of defined components called the Arrowhead Core Services. My personal contribution to this project consists of the development of several user-friendly APIs, published in the project's main repository, and the integration of a legacy system within the Arrowhead Framework. The implementation of this legacy system was initiated by me in 2012 and, after many improvements carried out by several developers at UniBO, it has been significantly modified again this year in order to achieve compatibility. The system consists of a simulation of an urban scenario in which a number of electric vehicles travel along their specified routes. The vehicles consume their batteries and thus need to recharge at charging stations. Because the recharge process is long, the vehicles must use a reservation mechanism to be able to recharge while avoiding waiting lines. The integration with the above-mentioned framework consists of publishing the services that the system provides to end users through the instantiation of several Arrowhead Service Producers, together with a demo Arrowhead-compliant client application able to consume such services.
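The reservation mechanism can be sketched in plain Python; this is a hypothetical illustration that does not use the Arrowhead Framework's actual service interfaces. A charging station accepts a booking only if the requested slot does not overlap an existing reservation.

```python
from datetime import datetime, timedelta

class ChargingStation:
    def __init__(self, station_id):
        self.station_id = station_id
        self.reservations = []          # list of (start, end, vehicle_id)

    def reserve(self, vehicle_id, start, duration):
        """Book a slot unless it overlaps an existing reservation."""
        end = start + duration
        for s, e, _ in self.reservations:
            if start < e and s < end:   # overlap check
                return False
        self.reservations.append((start, end, vehicle_id))
        return True

station = ChargingStation("cs-01")
now = datetime.now()
print(station.reserve("ev-7", now, timedelta(minutes=45)))                           # True
print(station.reserve("ev-9", now + timedelta(minutes=10), timedelta(minutes=30)))   # False, slot taken
```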

Relevance:

60.00%

Publisher:

Abstract:

Dissertation (master's)—Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2015.

Relevance:

60.00%

Publisher:

Abstract:

This panel presentation provided several use cases detailing the complexity of large-scale digital library system (DLS) migration from the perspective of three university libraries and a statewide academic library services consortium. Each presenter described the methodology developed at the beginning of their migration process, the unique challenges that arose along the way, how issues were managed, and the outcomes of their work. Florida Atlantic University, Florida International University, and the University of Central Florida are members of the state's academic library services consortium, the Florida Virtual Campus (FLVC). In 2011, the Digital Services Committee members began exploring alternatives to DigiTool, their shared FLVC-hosted DLS. After completing a review of functional requirements and existing systems, the universities and FLVC began implementing their chosen platforms. Migrations began in 2013 with limited sets of materials. As functionalities were enhanced to support additional categories of materials from the legacy system, migration paths were created for the remaining materials. Some of the challenges experienced with the institutional and statewide collaborative legacy collections were due to gradual changes in standards, technology, policies, and personnel, which manifested in the quality of the original digital files and metadata as well as in collection and record structures. Additionally, the complexities of multiple institutions collaborating and compromising throughout the migration process, together with the move from a consortial support structure with a vendor solution to open source systems (both locally and consortially supported), presented their own unique challenges. Following the presentation, the speakers discussed commonalities in their migration experiences, including learning opportunities for future migrations.