998 results for Abordagem geométrica


Relevância: 20.00%

Resumo:

This work proposes a model-based approach for pointcut management in the presence of evolution in aspect-oriented systems. The proposed approach, called conceptual views based pointcuts, is motivated by the shortcomings of traditional pointcut definition approaches, which generally refer directly to the software structure and/or behavior, thereby creating a strong coupling between the pointcut definitions and the base code. This coupling causes what is known as the pointcut fragility problem and hinders the evolution of aspect-oriented systems: whenever the software changes or evolves, all the pointcuts of each aspect must be reviewed to ensure that they remain valid after the change. Our approach focuses on defining pointcuts over a conceptual model, which describes the system's structure at a more abstract level. The conceptual model consists of classifications (called conceptual views) of business model elements based on common characteristics, together with relationships between these views. Pointcut definitions are thus created against the conceptual model rather than referencing the base model directly. Moreover, the conceptual model contains a set of relationships that allows automatic verification of whether its classifications remain valid after a software change. Development with the conceptual views based pointcuts approach is supported by a conceptual framework called CrossMDA2 and by an MDA-based development process, both also proposed in this work. As a proof of concept, we present two versions of a case study, setting up an evolution scenario that shows how conceptual views based pointcuts help detect and minimize pointcut fragility. The proposal is evaluated with the Goal/Question/Metric (GQM) technique, together with metrics that analyze the efficiency of pointcut definition.
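A minimal sketch, in @AspectJ annotation style with hypothetical class and annotation names, of the coupling the abstract describes: the first pointcut refers directly to the base code structure and silently breaks if OrderService.save is renamed, while the second quantifies over an abstract classification, here approximated by a marker annotation standing in for a conceptual view (the actual approach uses a conceptual model and CrossMDA2, not annotations).

    import org.aspectj.lang.annotation.Aspect;
    import org.aspectj.lang.annotation.Before;
    import org.aspectj.lang.annotation.Pointcut;

    @Aspect
    public class PersistenceLoggingAspect {

        // Fragile pointcut: tied to concrete names in the base code.
        // Renaming OrderService.save() invalidates it without any warning.
        @Pointcut("execution(* com.example.app.OrderService.save(..))")
        public void structuralPointcut() {}

        // Conceptual-view style: the pointcut refers to an abstract
        // classification ("persistence operations") that a conceptual model
        // would map to concrete elements of the base code.
        @Pointcut("execution(@com.example.views.PersistenceOperation * *(..))")
        public void conceptualPointcut() {}

        @Before("conceptualPointcut()")
        public void logPersistence() {
            System.out.println("Persistence operation about to execute");
        }
    }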

Relevância: 20.00%

Resumo:

Research on Wireless Sensor Networks (WSN) has evolved, with potential applications in several domains. However, building WSN applications is hampered by the need to program against the low-level abstractions provided by sensor operating systems and by the need for specific knowledge of each application domain and each sensor platform. We propose an MDA approach to develop WSN applications. This approach allows domain experts to contribute directly to application development without requiring low-level knowledge of WSN platforms and, at the same time, allows network experts to program WSN nodes to meet application requirements without specific knowledge of the application domain. Our approach also promotes the reuse of the developed software artifacts, allowing an application model to be reused across different sensor platforms and a platform model to be reused for different applications.
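A rough Java sketch of the separation the abstract advocates (names are illustrative, not taken from the authors' tooling): the application model states what to sense and how often, independently of any sensor OS, the platform model supplies node-level operations, and a generated class binds the two, the role a model transformation would play.

    // Platform-independent application model: what the domain expert specifies.
    interface SensingApplication {
        String phenomenon();      // e.g. "temperature"
        int samplingPeriodMs();   // how often to sample
    }

    // Platform model: what the network expert provides for a concrete sensor OS.
    interface SensorPlatform {
        double readSensor(String phenomenon);
        void send(byte[] payload);
        void sleep(int millis) throws InterruptedException;
    }

    // The transformation target: application logic bound to one platform.
    // In an MDA setting this class would be generated, not hand-written.
    final class GeneratedNodeProgram {
        private final SensingApplication app;
        private final SensorPlatform platform;

        GeneratedNodeProgram(SensingApplication app, SensorPlatform platform) {
            this.app = app;
            this.platform = platform;
        }

        void run() throws InterruptedException {
            while (true) {
                double value = platform.readSensor(app.phenomenon());
                platform.send(Double.toString(value).getBytes());
                platform.sleep(app.samplingPeriodMs());
            }
        }
    }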

Relevância: 20.00%

Resumo:

This dissertation presents a model-driven and integrated approach to variability management, customization and execution of software processes. Our approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering supports the specification of software processes and their transformation into workflow specifications. Software product line techniques allow the automatic variability management of process elements and fragments. Additionally, in our approach, workflow technologies enable process execution in workflow engines. In order to evaluate the feasibility of the approach, we have implemented it using existing model-driven engineering technologies. The software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes has been implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are adopted to transform EPF processes into jPDL workflow language specifications, enabling the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
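A simplified Java sketch of the variability resolution step, under invented activity and feature names (the actual approach works on EPF process models through an extended product derivation tool): optional process activities are kept or dropped according to a feature selection, yielding the customized process that would then be transformed to jPDL.

    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;
    import java.util.stream.Collectors;

    final class ProcessLineDerivation {

        // Candidate activities of the process line, each guarded by a feature
        // (null means mandatory). Names are illustrative, not OpenUP content.
        private static final Map<String, String> ACTIVITIES = new LinkedHashMap<>();
        static {
            ACTIVITIES.put("Plan Iteration", null);
            ACTIVITIES.put("Assess Risks", "risk-management");
            ACTIVITIES.put("Prepare Formal Review", "formal-reviews");
            ACTIVITIES.put("Close Iteration", null);
        }

        // Variability resolution: keep mandatory activities plus those whose
        // guarding feature was selected for this particular project.
        static List<String> derive(Set<String> selectedFeatures) {
            return ACTIVITIES.entrySet().stream()
                    .filter(e -> e.getValue() == null || selectedFeatures.contains(e.getValue()))
                    .map(Map.Entry::getKey)
                    .collect(Collectors.toList());
        }

        public static void main(String[] args) {
            System.out.println(derive(Set.of("risk-management")));
            // -> [Plan Iteration, Assess Risks, Close Iteration]
        }
    }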

Relevância: 20.00%

Resumo:

Through the adoption of the software product line (SPL) approach, several benefits are achieved when compared to conventional development processes based on creating a single software system at a time. The process of developing an SPL differs from traditional software construction, since it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental and aims to detect defects in the artifacts produced during SPL development. However, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proven limited and provide only general guidelines. In addition, there is a lack of tools to support the variability management and customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration and system levels in domain and application engineering; and (iii) tool support for automating the variability management and customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
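A minimal JUnit 5 sketch, with invented product and feature names, of the kind of reuse such an approach targets: a test written once in domain engineering is gated and customized by the product configuration, so application engineering binds variabilities instead of rewriting the test.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assumptions.assumeTrue;

    import java.util.Set;
    import org.junit.jupiter.api.Test;

    class ShoppingCartProductTest {

        // Feature selection of the product under test; in an SPL setting this
        // would come from the product configuration, not be hard-coded here.
        private static final Set<String> FEATURES = Set.of("cart", "discount");

        private double total(double... prices) {
            double sum = 0;
            for (double p : prices) sum += p;
            return FEATURES.contains("discount") ? sum * 0.9 : sum;
        }

        // Domain-engineering test: written once, reused by every product that
        // selects "cart"; the expected value is customized by the configuration.
        @Test
        void cartTotalIncludesEveryItem() {
            assumeTrue(FEATURES.contains("cart"));
            double expected = FEATURES.contains("discount") ? 27.0 : 30.0;
            assertEquals(expected, total(10.0, 20.0), 0.001);
        }

        // Variant-specific test: only executed when "discount" is selected.
        @Test
        void discountReducesTheTotalByTenPercent() {
            assumeTrue(FEATURES.contains("discount"));
            assertEquals(27.0, total(10.0, 20.0), 0.001);
        }
    }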

Relevância: 20.00%

Resumo:

Multi-agent system designers need to determine the quality of their systems in the earliest phases of the development process. The architectures of the agents are part of the design of these systems and therefore also need to have their quality evaluated. Motivated by the important role that emotions play in our daily lives, researchers on embodied agents have aimed to create agents capable of affective and natural interaction with users that produces beneficial or desirable results. To this end, several studies proposing agent architectures with emotions have appeared without corresponding methods for assessing those architectures. The objective of this study is to propose a methodology for evaluating emotional agent architectures, which assesses quality attributes of the architectural design and, through human-computer interaction evaluation, the effects on the subjective experience of users of applications that implement the architecture. The methodology is based on a model of well-defined metrics. In assessing the quality of the architectural design, the attributes evaluated are extensibility, modularity and complexity. In assessing the effects on users' subjective experience, which involves implementing the architecture in an application (we suggest the domain of computer games), the metrics are enjoyment, felt support, warmth, caring, trust, cooperation, intelligence, interestingness, naturalness of emotional reactions, believability, reduction of frustration and likeability, as well as average time and average number of attempts. We applied this approach and evaluated five emotional agent architectures: BDIE, DETT, Camurra-Coglio, EBDI and Emotional-BDI. Two of the architectures, BDIE and EBDI, were implemented in a version of the game Minesweeper and evaluated for human-computer interaction. In the results, DETT stood out with the best architectural design. Users who played the version of the game with emotional agents performed better than those who played without agents. In the assessment of the users' subjective experience, the differences between the architectures were insignificant.

Relevância: 20.00%

Resumo:

In everyday life we constantly perform common actions. Two of these actions are frequent and of great importance: classifying (sorting into classes) and making decisions. When we face problems with a relatively high degree of complexity, we tend to seek other opinions, usually from people who have some knowledge of the problem domain or, whenever possible, are experts in it, in order to help us in the decision-making process. In both the classification process and the decision-making process, we are guided by the characteristics involved in the specific problem. The characterization of a set of objects is part of the decision-making process in general. In machine learning, this classification happens through a learning algorithm, and the characterization is applied to databases. Classification algorithms can be employed individually or in machine committees. Choosing the best methods for building a committee is a very arduous task. This work investigates meta-learning techniques for selecting the best configuration parameters of homogeneous committees for various classification problems. These parameters are: the base classifier, the architecture and the size of this architecture. We investigated nine candidate inducers for the base classifier, two architecture generation methods and nine architecture sizes. Dimensionality reduction techniques were applied to the meta-databases in search of improvement. Five classification methods are investigated as meta-learners in the process of choosing the best parameters of a homogeneous committee.
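A short Weka-based sketch of the kind of configuration such a meta-learner recommends (the dataset path and the concrete values are placeholders): a homogeneous committee fixed by three parameters, the base classifier (here J48), the generation method (here Bagging) and the committee size.

    import java.util.Random;
    import weka.classifiers.Evaluation;
    import weka.classifiers.meta.Bagging;
    import weka.classifiers.trees.J48;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class HomogeneousCommittee {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("dataset.arff");   // placeholder path
            data.setClassIndex(data.numAttributes() - 1);

            // The three parameters a meta-learner would choose:
            Bagging committee = new Bagging();   // architecture generation method
            committee.setClassifier(new J48());  // base classifier (inducer)
            committee.setNumIterations(25);      // committee size

            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(committee, data, 10, new Random(1));
            System.out.printf("Accuracy: %.2f%%%n", eval.pctCorrect());
        }
    }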

Relevância: 20.00%

Resumo:

Removing inconsistencies from a project is less costly when done in the early stages of its conception. The use of formal methods improves the understanding of systems and offers several techniques, such as formal specification and verification, to identify these inconsistencies in the early stages of a project. However, the transformation from a formal specification into a programming language is a non-trivial task; when done manually, it is prone to the insertion of errors. The use of tools that support this step can bring great benefits to the final product to be developed. This work proposes the extension of a tool whose focus is the automatic translation of CSPm specifications into Handel-C. CSP is a formal description language suitable for working with concurrent systems. Handel-C is a programming language whose result can be compiled directly to FPGAs. The extension consists of increasing the number of CSPm operators accepted by the tool, allowing the user to define local processes, rename channels and use Boolean guards on external choices. In addition, we also propose the implementation of a communication protocol that eliminates some restrictions on the parallel composition of processes in the translation to Handel-C, allowing communication between multiple processes to be mapped consistently and to occur only when authorized.

Relevância: 20.00%

Resumo:

Removing inconsistencies in a project is a less expensive activity when done in the early steps of design. The use of formal methods improves the understanding of systems and offers various techniques, such as formal specification and verification, to identify these problems in the initial stages of a project. However, the transformation from a formal specification into a programming language is a non-trivial and error-prone task, especially when done manually. Tool support at this stage can bring great benefits to the final product to be developed. This work proposes the extension of a tool whose focus is the automatic translation of specifications written in CSPm into Handel-C. CSP is a formal description language suitable for concurrent systems, and CSPm is the notation used for tool support. Handel-C is a programming language whose result can be compiled directly to FPGAs. Our extension increases the number of CSPm operators accepted by the tool, allowing the user to define local processes, to rename channels in a process and to use Boolean guards on external choices. In addition, we also propose the implementation of a communication protocol that eliminates some restrictions on the parallel composition of processes in the translation into Handel-C, allowing communication on the same channel among multiple processes to be mapped consistently and ensuring that improper communications, i.e., communications not allowed in the system specification, do not occur in the generated code.
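A rough Java analogue, not the authors' Handel-C protocol, of the CSP-style rendezvous that such a translation must preserve when several processes share one channel: a value is transferred only when a writer and a reader are both ready.

    import java.util.concurrent.SynchronousQueue;

    // Illustration only: SynchronousQueue gives rendezvous semantics between
    // whichever writer and reader happen to synchronize, even when several
    // threads share the same channel.
    public class RendezvousChannelDemo {
        public static void main(String[] args) {
            SynchronousQueue<Integer> channel = new SynchronousQueue<>();

            Runnable writer = () -> {
                try {
                    channel.put(42);        // blocks until some reader takes the value
                    System.out.println(Thread.currentThread().getName() + " wrote 42");
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            };
            Runnable reader = () -> {
                try {
                    int v = channel.take(); // blocks until some writer offers a value
                    System.out.println(Thread.currentThread().getName() + " read " + v);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            };

            new Thread(writer, "writer-1").start();
            new Thread(writer, "writer-2").start();
            new Thread(reader, "reader-1").start();
            new Thread(reader, "reader-2").start();
        }
    }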

Relevância: 20.00%

Resumo:

Checking the conformity between the implementation and the design rules of a system is an important activity to ensure that no degradation occurs between the architectural patterns defined for the system and what is actually implemented in the source code. Especially for systems that require a high level of reliability, it is important to define specific design rules for exceptional behavior. Such rules describe how exceptions should flow through the system by defining which elements are responsible for catching exceptions thrown by other system elements. However, current approaches to automatically checking design rules do not provide suitable mechanisms to define and verify design rules related to the exception handling policy of applications. This work proposes a practical approach to preserve the exceptional behavior of an application or family of applications, based on the definition and automatic runtime checking of design rules for exception handling in systems developed in Java or AspectJ. To support this approach, a tool called VITTAE (Verification and Information Tool to Analyze Exceptions) was developed in the context of this work; it extends the JUnit framework and automates the testing of exceptional design rules. We conducted a case study with the primary objective of evaluating the effectiveness of the proposed approach on a software product line. In addition, an experiment was conducted to compare the proposed approach with an approach based on a tool called JUnitE, which also proposes testing exception handling code using JUnit tests. The results showed how exception handling design rules evolve along different versions of a system and that VITTAE can aid in the detection of defects in exception handling code.
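A small JUnit 5 sketch, with hypothetical classes and a hypothetical rule (VITTAE's actual rule notation is not reproduced here), of the kind of exceptional design rule the approach checks: exceptions raised in a lower layer may reach callers only wrapped in the type prescribed for the layer above.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assertions.assertThrows;

    import org.junit.jupiter.api.Test;

    class ExceptionFlowRuleTest {

        // Hypothetical layers used only to illustrate the rule.
        static class DaoException extends RuntimeException {}
        static class ServiceException extends RuntimeException {
            ServiceException(Throwable cause) { super(cause); }
        }

        static class CustomerDao {
            String load() { throw new DaoException(); }   // low-level failure
        }

        static class CustomerService {
            private final CustomerDao dao = new CustomerDao();
            String find() {
                try {
                    return dao.load();
                } catch (DaoException e) {
                    // Design rule: DAO exceptions never escape the service layer raw.
                    throw new ServiceException(e);
                }
            }
        }

        // Test encoding the rule: callers of the service must observe only
        // ServiceException, with the original DaoException preserved as cause.
        @Test
        void daoExceptionsAreWrappedByTheServiceLayer() {
            ServiceException ex =
                    assertThrows(ServiceException.class, () -> new CustomerService().find());
            assertEquals(DaoException.class, ex.getCause().getClass());
        }
    }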

Relevância: 20.00%

Resumo:

This work presents an algorithmic study of the Multicast Packing Problem under a multiobjective approach. The first step was an extensive review of the problem, which served as the reference point for the definition of the multiobjective mathematical model. Then, the instances used in the experimentation process were defined; these instances were created based on the main characteristics found in the literature. Once both the mathematical model and the instances were defined, several algorithms were created, based on classical approaches to multiobjective optimization: NSGA-II (3 versions) and SPEA2 (3 versions). In addition, GRASP procedures were adapted to work with multiple objectives, and two versions were created. These algorithms employ three recombination operators (C1, C2 and C3), two solution construction operators, a mutation operator and a local search procedure. Finally, a long experimentation process was performed in three stages: the first consisted of adjusting the parameters; the second identified the best version of each algorithm; and in the last, the best versions were compared in order to identify the best algorithm overall. The algorithms were evaluated using the Hypervolume and Multiplicative Epsilon quality indicators.
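A brief Java sketch of the Pareto dominance test that NSGA-II, SPEA2 and a multiobjective GRASP all rely on when comparing solutions; minimization of both objectives is assumed, and the objective names in the example are illustrative (e.g. multicast tree cost and maximum link congestion).

    public final class ParetoDominance {

        // Returns true if objective vector a dominates b: a is no worse in every
        // objective and strictly better in at least one (all objectives minimized).
        static boolean dominates(double[] a, double[] b) {
            boolean strictlyBetterSomewhere = false;
            for (int i = 0; i < a.length; i++) {
                if (a[i] > b[i]) return false;
                if (a[i] < b[i]) strictlyBetterSomewhere = true;
            }
            return strictlyBetterSomewhere;
        }

        public static void main(String[] args) {
            double[] s1 = {120.0, 0.35};  // e.g. {tree cost, max congestion}
            double[] s2 = {150.0, 0.35};
            System.out.println(dominates(s1, s2)); // true: cheaper, equal congestion
            System.out.println(dominates(s2, s1)); // false
        }
    }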

Relevância: 20.00%

Resumo:

When crosscutting concern identification is performed from the beginning of development, in the activities of requirements engineering, there are many gains in terms of quality, cost and efficiency throughout the software development lifecycle. This early identification supports the evolution of requirements, detects possible flaws in the requirements specification, improves traceability among requirements, provides better software modularity and prevents possible rework. However, despite these advantages, the identification of crosscutting concerns in requirements engineering faces several difficulties, such as the lack of systematization and of tools that support it. Furthermore, it is difficult to justify why some concerns are identified as crosscutting and others are not, since this identification is most often made without any methodology that systematizes and grounds it. In this context, this work proposes an approach based on Grounded Theory, called GT4CCI, for systematizing and grounding the process of identifying crosscutting concerns in the requirements document, in the initial stages of the software development process. Grounded Theory is a renowned methodology for the qualitative analysis of data. Through the use of GT4CCI it is possible to better understand, track and document concerns, adding gains in terms of quality, reliability and modularity throughout the software lifecycle.

Relevância: 20.00%

Resumo:

There is growing interest in the Computer Science education community in including testing concepts in introductory programming courses. Aiming to contribute to this issue, we introduce POPT, a Problem-Oriented Programming and Testing approach for introductory programming courses. POPT's main goal is to improve the traditional method of teaching introductory programming, which concentrates mainly on implementation and neglects testing. POPT extends the POP (Problem Oriented Programming) methodology proposed in the PhD thesis of Andrea Mendonça (UFCG). In both methodologies, POPT and POP, students' skills in dealing with ill-defined problems must be developed from the first programming courses. In POPT, however, students are stimulated to clarify ill-defined problem specifications, guided by the definition of test cases (in a table-like manner). This work presents POPT and TestBoot, a tool developed to support the methodology. In order to evaluate the approach, a case study and a controlled experiment (which adopted the Latin Square design) were performed in an introductory programming course of the Computer Science and Software Engineering undergraduate programs at the Federal University of Rio Grande do Norte, Brazil. The study results show that, when compared to a blind testing approach, POPT stimulates the implementation of programs of better external quality: the first program version submitted by POPT students passed twice the number of (professor-defined) test cases when compared to non-POPT students. Moreover, POPT students submitted fewer program versions and spent more time before submitting the first version to the automatic evaluation system, which leads us to think that POPT students are stimulated to think more carefully about the solution they are implementing. The controlled experiment confirmed the influence of the proposed methodology on the quality of the code developed by POPT students.
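A minimal JUnit 5 parameterized-test sketch of the table-like test definition POPT encourages (the problem statement and values are invented for illustration): the student writes the input/expected-output table while clarifying the ill-defined statement, before writing the implementation.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.params.ParameterizedTest;
    import org.junit.jupiter.params.provider.CsvSource;

    class BankFeeTest {

        // Table of test cases written while clarifying an ill-defined statement
        // such as "charge a fee on withdrawals": each row fixes one interpretation.
        @ParameterizedTest
        @CsvSource({
                "100.00, 0.00",   // small withdrawal: no fee
                "500.00, 0.00",   // boundary: still no fee
                "500.01, 5.00",   // above the limit: flat fee
                "1000.00, 5.00"
        })
        void feeMatchesTheAgreedRules(double amount, double expectedFee) {
            assertEquals(expectedFee, fee(amount), 0.001);
        }

        // Implementation written only after the table above was agreed upon.
        private double fee(double amount) {
            return amount > 500.00 ? 5.00 : 0.00;
        }
    }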

Relevância: 20.00%

Resumo:

This work presents a hybrid transgenetic algorithm for solving a Natural Gas Distribution Network Configuration Problem. Configuring such networks requires defining a layout along which the pipes must be laid in order to serve the customers. This work studies a way of connecting the customers in a network with a tree-shaped architecture. The objective is to minimize the cost of building the network, even if this means that some customers who do not yield profit are left unserved. This problem can be formulated computationally as the Prize Collecting Steiner Tree Problem, a combinatorial optimization problem in the NP-hard class. This work presents a heuristic algorithm for solving the problem. The approach used is called Transgenetic Algorithms, which fall into the category of evolutionary algorithms. A primal-dual algorithm is used to generate initial solutions, and path-relinking is used as an intensification procedure.
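A short Java sketch of the prize-collecting objective being minimized (the data in the example is illustrative): the construction cost of the chosen pipe segments plus the penalties, i.e. the forgone prizes, of the customers left outside the tree.

    import java.util.List;
    import java.util.Map;
    import java.util.Set;

    public final class PrizeCollectingObjective {

        record Edge(String from, String to, double cost) {}

        // Objective of the Prize-Collecting Steiner Tree Problem: pipe construction
        // cost plus the prizes of customers not reached by the tree.
        static double evaluate(List<Edge> tree, Map<String, Double> prizes, Set<String> served) {
            double edgeCost = tree.stream().mapToDouble(Edge::cost).sum();
            double penalty = prizes.entrySet().stream()
                    .filter(e -> !served.contains(e.getKey()))
                    .mapToDouble(Map.Entry::getValue)
                    .sum();
            return edgeCost + penalty;
        }

        public static void main(String[] args) {
            List<Edge> tree = List.of(new Edge("source", "A", 40.0), new Edge("A", "B", 25.0));
            Map<String, Double> prizes = Map.of("A", 30.0, "B", 20.0, "C", 15.0);
            Set<String> served = Set.of("A", "B");
            System.out.println(evaluate(tree, prizes, served)); // 40 + 25 + 15 = 80.0
        }
    }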

Relevância: 20.00%

Resumo:

This text is an excerpt from doctoral research carried out in the graduate program in education at UFRN. Its objective is to discuss sport from the dimension of lived experience, seeking to understand the meanings attributed to sports practice and to the athlete's aesthetic experience as education. It adopts a methodological approach centered on phenomenology as a field of reflection on and interpretation of lived experience. This understanding points to paths for a reflection on sport centered on the athlete's body, whose theoretical debates and interventions in Physical Education can go beyond any deterministic conception, toward a mobile, sensitive education inscribed in the body in movement.