600 results for CNPQ::CIENCIAS EXATAS E DA TERRA::CIENCIA DA COMPUTACAO::METODOLOGIA E TECNICAS DA COMPUTACAO::ENGENHARIA DE SOFTWARE
Abstract:
Nonogram is a logical puzzle whose associated decision problem is NP-complete. It has applications in pattern recognition and data compression, among other areas. The puzzle consists of determining an assignment of colors to pixels distributed in an N × M matrix that satisfies row and column constraints. A Nonogram is encoded by a vector whose elements specify the number of pixels in each row and column of a figure without specifying their coordinates. This work presents exact and heuristic approaches to solve Nonograms. Depth-first search was one of the chosen exact approaches because it is a typical example of a brute-force search algorithm that is easy to implement. Another exact approach implemented was based on the Las Vegas algorithm, with the aim of investigating whether the randomness introduced by the Las Vegas-based algorithm would be an advantage over depth-first search. The Nonogram is also transformed into a Constraint Satisfaction Problem. Three heuristic approaches are proposed: a Tabu Search and two memetic algorithms. A new way to compute the objective function is also proposed. The approaches are applied to 234 instances, with sizes ranging from 5 x 5 to 100 x 100, including both logical and random Nonograms.
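As an illustration (a minimal sketch, not taken from the dissertation), the basic row/column constraint can be checked by comparing the runs of filled pixels in a line against its clue:

    import java.util.ArrayList;
    import java.util.List;

    public class NonogramRow {
        // Returns true if the filled cells of the row form exactly the runs
        // listed in the clue (e.g. clue {2,1} matches ..##.#).
        static boolean satisfiesClue(boolean[] row, int[] clue) {
            List<Integer> runs = new ArrayList<>();
            int current = 0;
            for (boolean cell : row) {
                if (cell) {
                    current++;
                } else if (current > 0) {
                    runs.add(current);
                    current = 0;
                }
            }
            if (current > 0) runs.add(current);
            if (runs.size() != clue.length) return false;
            for (int i = 0; i < clue.length; i++) {
                if (runs.get(i) != clue[i]) return false;
            }
            return true;
        }
    }

A depth-first solver can use a check like this to prune partial assignments as soon as a completed row or column violates its clue.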
Abstract:
The widespread growth in the use of smart cards (by banks, transport services, cell phones, etc.) has brought an important fact that must be addressed: the need for tools that can be used to verify such cards, so as to guarantee the correctness of their software. As the vast majority of cards being developed nowadays use the JavaCard technology as their software layer, the use of the Java Modeling Language (JML) to specify their programs appears as a natural solution. JML is a formal language tailored to Java. It was inspired by methodologies from Larch and Eiffel and has been widely adopted as the de facto language for the specification of Java programs. Various tools that make use of JML have already been developed, covering a wide range of functionalities, such as runtime and static checking. However, the tools available so far for static checking are not fully automated, and those that are do not offer an adequate level of soundness and completeness. Our objective is to contribute a set of techniques that can be used to accomplish fully automated and trustworthy verification of JavaCard applets. In this work we present the first steps in this direction. Using a software platform comprising Krakatoa, Why and haRVey, we developed a set of techniques to reduce the size of the theory necessary to verify the specifications. These techniques have yielded very good results, with gains of almost 100% in all tested cases, and have proved valuable not only in this setting but in most real-world problems related to automatic verification.
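For illustration only (the hypothetical Purse class below is not taken from the dissertation), a JML specification annotates a Java method with pre- and postconditions that static checkers can attempt to verify:

    public class Purse {
        private int balance;

        //@ requires amount > 0 && balance >= amount;
        //@ ensures balance == \old(balance) - amount;
        public void debit(int amount) {
            balance = balance - amount;
        }
    }

A verification platform such as Krakatoa/Why translates annotated methods like this into proof obligations, which can then be discharged by automated provers.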
Abstract:
Nowadays, the way companies deal with their information assets is a key factor not only for their success but also for their survival in the global market. The number of information security incidents has grown over the last years. The establishment of information security policies that seek to keep the security requirements of assets at the desired levels is a major priority for companies. This dissertation proposes a unified process for the elaboration, maintenance and development of information security policies, the Processo Unificado para Políticas de Segurança da Informação - PUPSI. The elaboration of this proposal started with the construction of a body of knowledge based on official documents and standards about security policies and information security published in the last two decades. Based on the examined documents, the model defines the security policies that need to be established in the organization, their workflow, and the hierarchy among them. The entities participating in the process are also modeled. Since the problem addressed by the model is complex, involving all the security policies that a company must have, PUPSI adopts an iterative and incremental approach. This approach was obtained by instantiating the RUP (Rational Unified Process), an object-oriented software development process from Rational Software (IBM group) that incorporates best practices known by the market. From RUP, PUPSI inherits a process structure that offers functionality, ease of dissemination and comprehension, and agility in process adjustment, as well as the capacity to adapt to technological and structural changes in the market and in the company.
Abstract:
Nowadays, due to the security vulnerabilities of distributed systems, mechanisms are needed to guarantee the security requirements of distributed object communications. Middleware platforms (component integration platforms) provide security functions that typically offer services for auditing, message protection, authentication, and access control. In order to support these functions, middleware platforms use digital certificates that are provided and managed by external entities. However, most middleware platforms do not define requirements for obtaining, maintaining, validating, and delegating digital certificates. In addition, most digital certification systems use X.509 certificates, which are complex and have a large number of attributes. In order to address these problems, this work proposes a generic digital certification service for middleware platforms. This service provides flexibility through the joint use of public-key certificates, which implement the authentication function, and attribute certificates, which implement the authorization function. It also supports delegation. Certificate-based access control is transparent to objects. The proposed service defines the digital certificate format, the storage and retrieval system, certificate validation, and support for delegation. In order to validate the proposed architecture, this work presents an implementation of the digital certification service for the CORBA middleware platform and a case study that illustrates the service functionalities.
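As a rough illustration (the interfaces and names below are hypothetical and not taken from the dissertation or from CORBA), the separation between authentication via public-key certificates and authorization via attribute certificates could look like this:

    import java.security.cert.X509Certificate;
    import java.util.Set;

    // Hypothetical interfaces sketching the split between authentication
    // (public-key certificate) and authorization (attribute certificate).
    interface AttributeCertificate {
        String holder();              // identity bound by the public-key certificate
        Set<String> grantedRoles();   // authorization attributes
        boolean isDelegable();        // whether the holder may delegate these roles
    }

    interface CertificationService {
        boolean authenticate(X509Certificate identity);
        boolean authorize(AttributeCertificate attributes, String requiredRole);
        AttributeCertificate delegate(AttributeCertificate source, String newHolder);
    }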
Abstract:
The development of software systems with domain-specific languages has become increasingly common. Domain-specific languages (DSLs) increase domain expressiveness, raising the abstraction level and facilitating the generation of models or low-level source code, thus increasing the productivity of systems development. Consequently, methods for the development of software product lines and software system families have also proposed the adoption of domain-specific languages. Recent studies have investigated the limitations of feature model expressiveness and proposed the use of DSLs as a complement to or substitute for feature models. However, in complex projects, a single DSL is often insufficient to represent the different views and perspectives of development, making it necessary to work with multiple DSLs. In order to address the new challenges in this context, such as managing consistency between DSLs and the need for methods and tools that support development with multiple DSLs, several approaches for the development of generative approaches have been proposed over the past years. However, none of them considers issues related to the composition of DSLs. Thus, in order to address this problem, the main objectives of this dissertation are: (i) to investigate the integrated use of feature models and DSLs during the domain and application engineering phases of the development of generative approaches; (ii) to propose a method for the development of generative approaches with DSL composition; and (iii) to investigate and evaluate the use of modern model-driven engineering technologies to implement strategies for integrating feature models and DSL composition.
Abstract:
The importance of non-functional requirements for computer systems is increasing. Satisfying these requirements demands special attention to the software architecture, since an unsuitable architecture introduces additional complexity on top of the intrinsic complexity of the system. Some studies have shown that, although requirements engineering and software architecture activities act on different aspects of development, they must be performed iteratively and in an intertwined way to produce satisfactory software systems. The STREAM process presents a systematic approach to reduce the gap between requirements and architecture development, emphasizing the functional requirements but using the non-functional requirements in an ad hoc way. However, non-functional requirements typically influence the system as a whole. STREAM uses architectural patterns to refine the software architecture, and these patterns are chosen using non-functional requirements in an ad hoc way. This master's thesis presents a process that improves STREAM by making the choice of architectural patterns systematic, using non-functional requirements to guide the refinement of the software architecture.
Abstract:
Problems related to the scattering and tangling phenomena, such as the difficulty of maintaining a system, are increasingly frequent. One way to address these problems is the identification of crosscutting concerns. To maximize its benefits, this identification must be performed from the early stages of the development process, but some works have reported that this is not done in most cases, making system development susceptible to errors and prone to later refactoring. This situation directly affects the quality and cost of the system. PL-AOVgraph is a goal-oriented requirements modeling language that supports the representation of relationships among requirements and provides separation of crosscutting concerns through the representation of crosscutting relationships. This work therefore presents a semi-automatic method for the identification of crosscutting concerns in requirements specifications written in PL-AOVgraph. An adjacency matrix is used to identify the contribution relationships among the elements. The identification of crosscutting concerns is based on a fan-out analysis of the contribution relationships derived from the adjacency matrix. Once identified, the corresponding crosscutting relationships are created. The method is implemented as a new module of the ReqSys-MDD tool.
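A minimal sketch of the kind of fan-out analysis described above (the boolean matrix representation and the fixed threshold are assumptions, not details from the dissertation):

    public class FanOutAnalysis {
        // contrib[i][j] == true means element i contributes to element j.
        // An element whose fan-out exceeds the threshold is flagged as a
        // candidate crosscutting concern.
        static boolean[] candidateCrosscutting(boolean[][] contrib, int threshold) {
            boolean[] candidates = new boolean[contrib.length];
            for (int i = 0; i < contrib.length; i++) {
                int fanOut = 0;
                for (int j = 0; j < contrib[i].length; j++) {
                    if (contrib[i][j]) fanOut++;
                }
                candidates[i] = fanOut > threshold;
            }
            return candidates;
        }
    }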
Abstract:
A self-adaptive software system is able to change its structure and/or behavior at runtime in response to changes in its requirements, environment, or components. One way to achieve self-adaptation is to use sequences of actions (known as adaptation plans), which are typically defined at design time. This is the approach adopted by Cosmos, a framework to support the configuration and management of resources in distributed environments. In order to deal with the variability inherent in self-adaptive systems, such as the appearance of new components that enable configurations not envisioned at development time, this dissertation aims to give Cosmos the capability of generating adaptation plans at runtime. To this end, it was necessary to reengineer the Cosmos framework so that it could be integrated with a mechanism for the dynamic generation of adaptation plans, and our work has focused on this reengineering. Among the changes made to Cosmos, we highlight the metamodel used to represent components and applications, which was redefined based on an architecture description language. These changes were propagated to the implementation of a new Cosmos prototype, which was then used to develop a case study application as a proof of concept. Another effort was to make Cosmos more attractive by integrating it with another platform, in this case the OSGi platform, which is well known and accepted by industry.
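For illustration (hypothetical types, not the Cosmos API), an adaptation plan can be seen as an ordered sequence of architectural actions executed at runtime:

    import java.util.List;

    // Hypothetical types illustrating an adaptation plan as an ordered
    // sequence of architectural actions applied at runtime.
    interface AdaptationAction {
        void apply() throws Exception;   // e.g. stop, replace, or rebind a component
    }

    class AdaptationPlan {
        private final List<AdaptationAction> actions;

        AdaptationPlan(List<AdaptationAction> actions) {
            this.actions = actions;
        }

        // Executes the actions in order; a real framework would also
        // handle rollback when an action fails.
        void execute() throws Exception {
            for (AdaptationAction action : actions) {
                action.apply();
            }
        }
    }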
Abstract:
This work consists of the study of two important problems arising from the operations of the petroleum and natural gas industries. The first problem, the pipe dimensioning problem on constrained gas distribution networks, consists of finding the least-cost combination of diameters, from a discrete set of commercially available ones, for the pipes of a given gas network, such that minimum pressure requirements at each demand node and upstream pipe conditions are respected. In turn, the second problem, the piston pump unit routing problem, comes from the need to define the routes of piston pump units visiting a number of non-emergent wells in onshore fields, i.e., wells that do not have enough pressure to make the oil emerge to the surface. The periodic version of this problem takes into account the wells' refilling equation to provide more accurate long-term planning. Besides the mathematical formulation of both problems, an exact algorithm and a tabu search were developed for the solution of the first problem, and a theoretical bound and a ProtoGene transgenetic algorithm were developed for the solution of the second. The main concepts of the metaheuristics are presented along with the details of their application to the cited problems. The results obtained for both applications are promising when compared to theoretical bounds and alternative solutions, with respect to both solution quality and running time.
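The sketch below shows a generic tabu search skeleton in Java (it is not the dissertation's implementation; the neighborhood, cost function and tabu attribute are left abstract):

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.List;

    // Generic tabu search skeleton: S is the solution type; the concrete
    // problem supplies the neighborhood, the cost and a move attribute.
    abstract class TabuSearch<S> {
        abstract List<S> neighbors(S solution);
        abstract double cost(S solution);
        abstract Object moveKey(S solution);   // attribute stored in the tabu list

        S solve(S initial, int iterations, int tabuTenure) {
            S best = initial, current = initial;
            Deque<Object> tabuList = new ArrayDeque<>();
            for (int it = 0; it < iterations; it++) {
                S bestNeighbor = null;
                for (S candidate : neighbors(current)) {
                    boolean tabu = tabuList.contains(moveKey(candidate));
                    boolean aspiration = cost(candidate) < cost(best);
                    if ((!tabu || aspiration)
                            && (bestNeighbor == null || cost(candidate) < cost(bestNeighbor))) {
                        bestNeighbor = candidate;
                    }
                }
                if (bestNeighbor == null) break;        // all moves are tabu
                current = bestNeighbor;
                tabuList.addLast(moveKey(current));
                if (tabuList.size() > tabuTenure) tabuList.removeFirst();
                if (cost(current) < cost(best)) best = current;
            }
            return best;
        }
    }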
Abstract:
There is growing interest in the Computer Science education community in including testing concepts in introductory programming courses. Aiming to contribute to this issue, we introduce POPT, a Problem-Oriented Programming and Testing approach for introductory programming courses. POPT's main goal is to improve the traditional method of teaching introductory programming, which concentrates mainly on implementation and neglects testing. POPT extends the POP (Problem-Oriented Programming) methodology proposed in the PhD thesis of Andrea Mendonça (UFCG). In both methodologies, POPT and POP, students' skills in dealing with ill-defined problems must be developed from the first programming courses. In POPT, however, students are stimulated to clarify ill-defined problem specifications, guided by the definition of test cases (in a table-like manner). This work presents POPT and TestBoot, a tool developed to support the methodology. In order to evaluate the approach, a case study and a controlled experiment (which adopted a Latin Square design) were performed in an introductory programming course of the Computer Science and Software Engineering degree programs at the Federal University of Rio Grande do Norte, Brazil. The results have shown that, when compared to a Blind Testing approach, POPT stimulates the implementation of programs of better external quality: the first program version submitted by POPT students passed twice as many (professor-defined) test cases as those of non-POPT students. Moreover, POPT students submitted fewer program versions and spent more time before submitting the first version to the automatic evaluation system, which leads us to believe that POPT students are stimulated to think more carefully about the solution they are implementing. The controlled experiment confirmed the influence of the proposed methodology on the quality of the code developed by POPT students.
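For illustration (a hypothetical max-of-three exercise, not an example from the dissertation), each row of a students' test-case table (inputs and expected output) maps naturally to an executable JUnit test:

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import org.junit.jupiter.api.Test;

    // Hypothetical exercise: each row of the test-case table becomes one test.
    class Max3Test {
        static int max3(int a, int b, int c) {            // solution under test
            return Math.max(a, Math.max(b, c));
        }

        @Test
        void allDistinct()    { assertEquals(9, max3(2, 9, 4)); }

        @Test
        void withDuplicates() { assertEquals(7, max3(7, 7, 1)); }

        @Test
        void allNegative()    { assertEquals(-1, max3(-5, -1, -9)); }
    }

Writing the table (and hence the tests) before coding is what pushes students to resolve ambiguities in an ill-defined problem statement.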
Abstract:
One way to deal with the high complexity of current software systems is through self-adaptive systems. A self-adaptive system must be able to monitor itself and its environment, analyze the monitored data to determine the need for adaptation, decide how the adaptation will be performed, and finally make the necessary adjustments. One way to perform the adaptation of a system is to generate, at runtime, the process that will carry out the adaptation. One advantage of this approach is the possibility of taking into account features that can only be evaluated at runtime, such as the emergence of new components that allow new architectural arrangements not foreseen at design time. The main objective of this work is to use a framework for the dynamic generation of processes to generate architectural adaptation plans in an OSGi environment. Our main interest is to evaluate how this framework for the dynamic generation of processes behaves in new environments.
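A minimal sketch of the monitor-analyze-decide-execute loop described above (the interface names are hypothetical and the plan generation step is left abstract):

    // Hypothetical control-loop types: monitor, analyze, plan (generated at
    // runtime), execute.
    interface Monitor<D> { D collect(); }
    interface Analyzer<D> { boolean adaptationNeeded(D data); }
    interface Planner<D, P> { P generatePlan(D data); }   // plan built at runtime
    interface PlanExecutor<P> { void execute(P plan); }

    class SelfAdaptationLoop<D, P> {
        private final Monitor<D> monitor;
        private final Analyzer<D> analyzer;
        private final Planner<D, P> planner;
        private final PlanExecutor<P> executor;

        SelfAdaptationLoop(Monitor<D> m, Analyzer<D> a, Planner<D, P> p, PlanExecutor<P> e) {
            this.monitor = m; this.analyzer = a; this.planner = p; this.executor = e;
        }

        void iterate() {
            D data = monitor.collect();
            if (analyzer.adaptationNeeded(data)) {
                executor.execute(planner.generatePlan(data));
            }
        }
    }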
Abstract:
Multimedia systems must incorporate middleware concepts in order to abstract hardware and operating system issues. Applications in those systems may be executed on different kinds of platforms, and their components need to communicate with each other. In this context, specific communication mechanisms for the transmission of information flows need to be defined. This work presents an interconnection component model for distributed multimedia environments, along with its implementation details. The model offers specific communication mechanisms for the transmission of information flows between software components, taking into account the requirements of the Cosmos framework in order to support dynamic component reconfiguration.
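As a rough sketch (hypothetical port and connector types, not the model defined in this work), an interconnection component can be pictured as a connector that transports a flow from an output port to an input port and can be disconnected for dynamic reconfiguration:

    // Hypothetical interfaces sketching a connector between two components.
    interface OutputPort { byte[] read(); }           // produces frames of the flow
    interface InputPort  { void write(byte[] frame); }

    class FlowConnector implements Runnable {
        private final OutputPort source;
        private final InputPort sink;
        private volatile boolean running = true;

        FlowConnector(OutputPort source, InputPort sink) {
            this.source = source;
            this.sink = sink;
        }

        public void run() {                  // pushes the flow until disconnected
            while (running) {
                sink.write(source.read());
            }
        }

        void disconnect() { running = false; }   // allows dynamic reconfiguration
    }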
Abstract:
The work proposed by Cleverton Hentz (2010) presented an approach to defining tests from the formal description of a program's input. Since some programs, such as compilers, may have their inputs formalized through grammars, it is common to use context-free grammars to specify the set of their valid inputs. In the original work, the author developed a tool (LGen) that automatically generates tests for compilers. In the present work, we identify types of problems in various areas that are described using grammars, for example the specification of software configurations, which are potential situations for the use of LGen. In addition, we conducted case studies with grammars from different domains, and from these studies it was possible to evaluate the behavior and performance of LGen during sentence generation, considering aspects such as execution time, number of generated sentences, and satisfaction of the coverage criteria available in LGen.
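An illustrative sketch of sentence generation from a context-free grammar (this is not LGen's algorithm; the tiny expression grammar and the random expansion strategy are assumptions for the example):

    import java.util.*;

    // Derive a sentence from a context-free grammar by repeatedly expanding
    // nonterminals with randomly chosen productions.
    class GrammarSentences {
        // Productions: nonterminal -> list of alternative right-hand sides.
        static final Map<String, List<String[]>> RULES = Map.of(
            "EXPR", List.of(new String[]{"NUM"}, new String[]{"EXPR", "+", "NUM"}),
            "NUM",  List.of(new String[]{"0"}, new String[]{"1"})
        );

        static String generate(String symbol, Random random) {
            List<String[]> alternatives = RULES.get(symbol);
            if (alternatives == null) return symbol;               // terminal symbol
            String[] rhs = alternatives.get(random.nextInt(alternatives.size()));
            StringBuilder out = new StringBuilder();
            for (String s : rhs) out.append(generate(s, random));
            return out.toString();
        }

        public static void main(String[] args) {
            System.out.println(generate("EXPR", new Random()));    // e.g. "1+0+1"
        }
    }

A test-generation tool would additionally steer the choice of productions to satisfy coverage criteria (e.g. covering every production at least once) instead of choosing them at random.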
Abstract:
Machine learning techniques are applied in classification tasks to acquire knowledge from a set of data or information. Some learning methods proposed in the literature are based on semi-supervised learning, in which a small percentage of labeled data (supervised learning) is combined with a quantity of unlabeled examples (unsupervised learning) during the training phase, thus reducing the need for a large quantity of labeled instances when only a small labeled dataset is available for training. A common problem in semi-supervised learning is the random selection of instances, since most works use a random selection technique, which can have a negative impact. Most machine learning methods address single-label problems, in other words, problems in which each data instance is associated with a single class; however, in many domains there is a need to classify data into more than one class, which is called multi-label classification. This work presents an experimental analysis of the results obtained using semi-supervised learning in multi-label classification problems, using a reliability parameter as an aid in the classification of the data. Thus, the combined use of semi-supervised learning techniques and multi-label classification methods was essential to obtain the results.
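A minimal self-training sketch (not the dissertation's method; the Classifier interface and the reliability threshold are assumptions) illustrating how a reliability parameter can guide which unlabeled instances are added to the training set:

    import java.util.ArrayList;
    import java.util.List;

    // Unlabeled instances whose prediction reliability exceeds a threshold
    // are moved into the labeled set and the classifier is retrained.
    interface Classifier<X, Y> {
        void train(List<X> inputs, List<Y> labels);
        Y predict(X input);
        double reliability(X input);     // confidence of the prediction, in [0, 1]
    }

    class SelfTraining<X, Y> {
        void run(Classifier<X, Y> model, List<X> labeledX, List<Y> labeledY,
                 List<X> unlabeled, double threshold, int rounds) {
            for (int r = 0; r < rounds && !unlabeled.isEmpty(); r++) {
                model.train(labeledX, labeledY);
                List<X> stillUnlabeled = new ArrayList<>();
                for (X x : unlabeled) {
                    if (model.reliability(x) >= threshold) {   // reliable prediction
                        labeledX.add(x);
                        labeledY.add(model.predict(x));
                    } else {
                        stillUnlabeled.add(x);
                    }
                }
                unlabeled = stillUnlabeled;
            }
        }
    }

In a multi-label setting, the label type Y would be a set of classes rather than a single class, and the reliability could be computed per label or aggregated over the predicted label set.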