998 results for Software - Development
Abstract:
A Software Product Line (SPL) is a software development paradigm whose main focus is to identify common features and variability among applications in a specific domain. An SPL is designed to satisfy the requirements of all products in its product family. These requirements, and the SPL itself, may change over time due to several factors, such as the evolution of product requirements, of the market, of the SPL process, and of the technologies used to develop the products. To handle these changes, the SPL must be modified and evolve so that it does not become obsolete and can adapt to new requirements. Change impact analysis is the activity of understanding and identifying the consequences that such changes cause on the SPL. Impact analysis on an SPL may be supported by traceability relationships, which link artifacts created during all phases of software development. Despite existing solutions for traceability-based change impact analysis in conventional software, there is a lack of such solutions for SPLs, since existing approaches do not include estimates specific to SPL artifacts. Thus, this work proposes a change impact analysis process and a tool for assessing change impact through the traceability of artifacts in SPLs. For this purpose, we specified a change impact analysis process that considers the artifacts produced during SPL development. We also implemented a tool that estimates and identifies the SPL artifacts and products affected by changes in other products, changes in classes, changes in features, changes between SPL releases, and artifacts related to changes in core assets and variability. Finally, the results were evaluated through metrics.
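A minimal sketch of the traceability-based impact analysis idea: artifacts are nodes in a directed graph, traceability links are edges, and the impact set of a change is everything reachable from the changed artifact. The artifact names and links below are hypothetical, not the proposed tool's data.

```python
from collections import deque

# Hypothetical traceability links: feature -> classes -> tests/products.
trace_links = {
    "feature:MediaCapture": ["class:Camera", "class:Recorder"],
    "class:Camera": ["test:CameraTest", "product:PhotoApp"],
    "class:Recorder": ["product:VideoApp"],
}

def impact_set(changed, links):
    """Return all artifacts transitively reachable via traceability links."""
    impacted, queue = set(), deque([changed])
    while queue:
        for dependent in links.get(queue.popleft(), []):
            if dependent not in impacted:
                impacted.add(dependent)
                queue.append(dependent)
    return impacted

print(sorted(impact_set("feature:MediaCapture", trace_links)))
# ['class:Camera', 'class:Recorder', 'product:PhotoApp',
#  'product:VideoApp', 'test:CameraTest']
```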
Abstract:
Through the adoption of the software product line (SPL) approach, several benefits are achieved when compared to conventional development processes based on creating a single software system at a time. The process of developing an SPL differs from traditional software construction, since it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental and aims to detect defects in the artifacts produced during SPL development. However, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proved limited and provide only general guidelines. In addition, there is a lack of tools to support variability management and the customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration, and system levels in domain and application engineering; and (iii) tool support for automating variability management and the customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
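A minimal sketch of what automated test-case customization can look like in application engineering: each test declares the feature it exercises, and deriving a product's suite reduces to selecting the tests whose features are in the product configuration. The feature names, the stub under test, and the pytest marker pattern are illustrative assumptions, not the dissertation's tooling.

```python
import pytest

PRODUCT_FEATURES = {"media_capture", "photo_gallery"}  # one derived product

def requires_feature(name):
    """Skip a test unless the feature it exercises is in the product."""
    return pytest.mark.skipif(
        name not in PRODUCT_FEATURES,
        reason=f"feature '{name}' not selected in this product",
    )

def capture_photo():  # stub standing in for domain-engineering code
    return "photo.jpg"

@requires_feature("media_capture")
def test_capture_creates_file():
    assert capture_photo() == "photo.jpg"

@requires_feature("video_recording")  # feature absent: test is skipped
def test_recording_has_audio_track():
    raise AssertionError("never runs in this product")
```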
Abstract:
Tracing between the models of the requirements and architecture activities is a strategy that aims to prevent loss of information, reducing the gap between these two initial activities of the software life cycle. In the context of Software Product Lines (SPLs), it is important to have this support, which allows the correspondence between these two activities together with variability management. To address this issue, this work presents a bidirectional mapping process, defining transformation rules between elements of a goal-oriented requirements model (described in PL-AOVgraph) and elements of an architectural description (defined in PL-AspectualACME). These mapping rules are evaluated using a case study: the GingaForAll SPL. To automate this transformation, we developed the MaRiPLA tool (Mapping Requirements to Product Line Architecture) using MDD (Model-driven Development) techniques, including the Atlas Transformation Language (ATL) with Ecore metamodel specifications, Xtext, a DSL definition framework, and Acceleo, a code generation tool, in the Eclipse environment. Finally, the generated models are evaluated with respect to quality attributes such as variability, derivability, reusability, correctness, traceability, completeness, evolvability, and maintainability, extracted from the CAFÉ Quality Model.
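A minimal sketch of a mapping rule of this kind, reduced to plain Python: each requirements element is rewritten into an architectural element while its variability annotation is preserved. The element kinds and the goal-to-component correspondence are hypothetical; the actual MaRiPLA rules are written in ATL over Ecore metamodels.

```python
# Hypothetical source model: goal-oriented requirements elements.
requirements_model = [
    {"kind": "goal", "name": "ManageMedia", "variability": "mandatory"},
    {"kind": "task", "name": "CapturePhoto", "variability": "optional"},
]

def to_architecture(element):
    """Map one requirements element to an architectural element."""
    kind_map = {"goal": "Component", "task": "Connector"}  # assumed rule
    return {
        "kind": kind_map[element["kind"]],
        "name": element["name"],
        "variability": element["variability"],  # variability is preserved
    }

architecture_model = [to_architecture(e) for e in requirements_model]
print(architecture_model)
```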
Abstract:
The development of smart card applications requires a high level of reliability, and formal methods provide means for this reliability to be achieved. The BSmart method and tool support the development of smart card applications with the B method, generating Java Card code from B specifications. For development with BSmart to be effectively rigorous without overloading the user, it is important to have a library of reusable components built in B; the goal of KitSmart is to provide this support. The first study of the composition of this library was an undergraduate project at the Universidade Federal do Rio Grande do Norte, carried out by Thiago Dutra in 2006. That first version of the kit resulted in a B specification of the Java Card primitive types byte, short, and boolean, and in the creation of reusable components for application development. This work improves KitSmart by adding a B specification of the Java Card API and a guide for the creation of new components. The Java Card API in B, besides being available for application development, is also useful as documentation of each API class. The reusable components are modules for manipulating specific structures, such as date and time, which are not available in B or Java Card. These components for Java Card are generated from formally verified B specifications. The guide contains a quick reference on how to specify some structures and on how some situations were adapted from object orientation to the B method. This work was evaluated through a case study carried out with the BSmart tool, which makes use of the KitSmart library. In this case study, it is possible to see the contribution of the components to a B specification. The kit should be useful for B method users and Java Card application developers.
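A minimal sketch of the contract style behind formally specified primitive types: every operation carries explicit preconditions keeping values within the 16-bit Java Card short range, as a B machine would state them. This Python stand-in only illustrates the idea; it is not KitSmart's B specification.

```python
SHORT_MIN, SHORT_MAX = -(2**15), 2**15 - 1  # Java Card 'short' range

def short_add(a: int, b: int) -> int:
    """Addition defined only when operands and result fit in a short."""
    assert SHORT_MIN <= a <= SHORT_MAX, "precondition: a is a short"
    assert SHORT_MIN <= b <= SHORT_MAX, "precondition: b is a short"
    result = a + b
    assert SHORT_MIN <= result <= SHORT_MAX, "precondition: no overflow"
    return result

print(short_add(100, 200))  # 300
# short_add(30000, 3000) would violate the no-overflow precondition
```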
Abstract:
The Software Product Line (SPL) approach has become very promising, since it allows the production of customized systems on a large scale through product families. The feature model is widely used to model these families; however, it is a model with a low level of detail and may not be sufficient to guide the SPL development team. Thus, it is recommended to complement the feature model with other models that represent the system from other perspectives. The PL-AOVgraph goal model can assume this complementary role, since it is a language oriented to the SPL context that allows detailed requirements modeling and the identification of crosscutting concerns that may arise as a result of variability. In order to introduce PL-AOVgraph into SPL development, this work proposes a bidirectional mapping between PL-AOVgraph and the feature model, automated by the ReqSys-MDD tool. This tool uses the Model-Driven Development (MDD) approach, which allows the construction of systems from high-level models through successive transformations. This enables the integration of ReqSys-MDD with other MDD tools that use its output models as input to further transformations, making it possible to keep the models involved consistent and to avoid loss of information in the transitions between development stages.
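A minimal sketch of the feature-model side of such a mapping: features form a tree with mandatory and optional nodes, and a product configuration is valid only if it respects parent and mandatory constraints. A mapping to or from PL-AOVgraph must preserve exactly this structure. The feature names are invented for illustration.

```python
# Hypothetical feature tree: name -> (parent, mandatory?).
FEATURE_MODEL = {
    "MediaApp": (None, True),
    "Capture":  ("MediaApp", True),
    "Video":    ("Capture", False),
    "Gallery":  ("MediaApp", False),
}

def is_valid(config):
    """Check parent and mandatory-child constraints of a configuration."""
    for name, (parent, mandatory) in FEATURE_MODEL.items():
        if name in config and parent is not None and parent not in config:
            return False  # feature selected without its parent
        if mandatory and (parent is None or parent in config) \
                and name not in config:
            return False  # missing mandatory feature
    return True

print(is_valid({"MediaApp", "Capture"}))  # True
print(is_valid({"Video"}))                # False: parent missing
```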
Abstract:
Software Product Lines (SPL) is a software engineering approach to developing families of software systems that share common features and differ in other features according to the requested software systems. Adopting the SPL approach can bring several benefits, such as cost reduction, product quality, productivity, and reduced time to market. On the other hand, the SPL approach brings new challenges to software evolution that must be considered. Recent research has explored and proposed automated approaches based on code analysis and traceability techniques for change impact analysis in the context of SPL development. These approaches have limitations, such as the customization of the analysis functionality to address different strategies for change impact analysis and the change impact analysis of fine-grained variability. This dissertation proposes a change impact analysis tool for SPL development, called Squid Impact Analyzer. The tool implements change impact analysis based on information from variability modeling, the mapping of variability to code assets, and the existing dependency relationships between code assets. The tool is assessed through an experiment that compares its change impact analysis results with real changes applied across several evolution releases of an SPL for media management on mobile devices.
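A minimal sketch of one analysis the abstract describes, built on the mapping of variability to code assets: given a changed asset, find the features mapped to it and, from those, the affected products. The mapping data below is hypothetical, not Squid Impact Analyzer's model.

```python
feature_to_assets = {
    "PhotoCapture": {"Camera.java", "ImageStore.java"},
    "VideoCapture": {"Camera.java", "Recorder.java"},
    "Gallery":      {"ImageStore.java", "GalleryView.java"},
}
product_to_features = {
    "LiteApp": {"PhotoCapture"},
    "FullApp": {"PhotoCapture", "VideoCapture", "Gallery"},
}

def impacted_by(changed_asset):
    """Features mapped to the changed asset, and products using them."""
    features = {f for f, assets in feature_to_assets.items()
                if changed_asset in assets}
    products = {p for p, fs in product_to_features.items() if fs & features}
    return features, products

print(impacted_by("Camera.java"))
# ({'PhotoCapture', 'VideoCapture'}, {'LiteApp', 'FullApp'})
```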
Abstract:
Research collaboration is one of the central tasks of academia. Nowadays, many researchers use modern means of exchanging digital files, both asynchronous tools and more sophisticated synchronous ones. At the same time, the number of papers being produced keeps growing; they are more complex and diverse, and they accumulate in a disorganized way, leaving the researcher with the difficult task of organizing them so as to extract their best content. This happens because a subarea of Software Engineering (SE) is still largely underused: Experimental Software Engineering (ESE). Among the types of studies that ESE offers, systematic reviews stand out as a very robust solution, with which a researcher can identify the existing knowledge in an area and properly plan their research, avoiding the repetition of mistakes already made by other researchers in the past. However, these two approaches, the virtual collaboration of researchers and the use of systematic reviews, have problems: in the first, collaborative systems are usually difficult to configure and use; in the second, despite the robustness of the systematic review methodology, a rigorous literature review is still necessary to achieve a satisfactory result. Thus, aiming to unite these two approaches, this work proposes a way to produce systematic reviews in an organized manner and with the possibility of interaction between users, through the development of an interactive system in which systematic reviews can be produced by users in collaboration with others and also evaluated under the guidance of a professional in the area, making their content more consistent and of better quality. The system has no access levels, that is, anyone can register and use its resources, whether in academia or in industry.
Abstract:
The development of software systems with domain-specific languages has become increasingly common. Domain-specific languages (DSLs) increase domain expressiveness, raising the abstraction level by facilitating the generation of models or low-level source code, and thus increasing the productivity of systems development. Consequently, methods for the development of software product lines and software system families have also proposed the adoption of domain-specific languages. Recent studies have investigated the limitations of feature model expressiveness and proposed the use of DSLs as a complement to or substitute for feature models. However, in complex projects, a single DSL is often insufficient to represent the different views and perspectives of development, and it becomes necessary to work with multiple DSLs. In order to address the new challenges of this context, such as managing consistency between DSLs and the need for methods and tools that support development with multiple DSLs, several generative approaches have been proposed over the past years. However, none of them considers matters relating to the composition of DSLs. Thus, to address this problem, the main objectives of this dissertation are: (i) to investigate the integrated use of feature models and DSLs during the domain and application engineering of generative approaches; (ii) to propose a method for the development of generative approaches with DSL composition; and (iii) to investigate and evaluate the use of modern model-driven engineering technology to implement strategies for integrating feature models with DSL composition.
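A minimal sketch of one integration strategy between a feature model and composed DSLs: each selected feature contributes a fragment of a hypothetical textual DSL, and generation concatenates the fragments of the chosen configuration into one specification. The fragment syntax and feature names are invented for illustration.

```python
DSL_FRAGMENTS = {  # hypothetical feature -> DSL fragment mapping
    "Persistence": "store { backend: sql }",
    "Reporting":   "report { format: pdf }",
    "Audit":       "audit { level: full }",
}

def compose_dsl(selected_features):
    """Compose one DSL document from the fragments of selected features."""
    missing = selected_features - DSL_FRAGMENTS.keys()
    if missing:
        raise ValueError(f"no DSL fragment for features: {missing}")
    return "\n".join(DSL_FRAGMENTS[f] for f in sorted(selected_features))

print(compose_dsl({"Persistence", "Reporting"}))
```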
Abstract:
Component-based development revolutionized the software development process, facilitating maintenance and providing more reliability and reuse. Nevertheless, even with all the advantages of developing with components, their composition remains an important concern. Verification through informal tests is not enough to achieve safe composition, because such tests are not based on formal semantic models with which we can precisely describe a system's behaviour. In this context, formal methods provide ways to accurately specify systems through mathematical notation, providing, among other benefits, more safety. The formal method CSP enables the specification of concurrent systems and the verification of properties intrinsic to them, as well as refinement between different models. Some approaches apply constraints expressed in CSP to check the behaviour of compositions of components, helping to verify those components in advance. Hence, considering that the software market increasingly requires automation, reducing work and providing agility in business, this work presents a tool that automates the verification of composition between components, keeping all the complexity of the formal language hidden from users. Thus, through a simple interface, the BST (BRIC-Tool-Suport) tool helps to create and compose components, predicting, in advance, undesirable behaviours in the system, such as deadlocks.
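A minimal sketch of the kind of check such a tool automates: exhaustively explore the state space of a composed system and report reachable states with no outgoing transitions, i.e. deadlocks. The hand-written transition system below (two processes acquiring two locks in opposite order) is illustrative Python, not CSP checked by a refinement checker.

```python
# State = (process1, process2); each entry lists (event, successor) pairs.
transitions = {
    ("idle", "idle"): [("p1.acqA", ("hasA", "idle")),
                       ("p2.acqB", ("idle", "hasB"))],
    ("hasA", "idle"): [("p1.acqB", ("both", "idle")),
                       ("p2.acqB", ("hasA", "hasB"))],
    ("idle", "hasB"): [("p2.acqA", ("idle", "both")),
                       ("p1.acqA", ("hasA", "hasB"))],
    ("both", "idle"): [("p1.rel", ("idle", "idle"))],
    ("idle", "both"): [("p2.rel", ("idle", "idle"))],
    # ("hasA", "hasB") has no outgoing transitions: a deadlock.
}

def find_deadlocks(initial):
    """Depth-first search for reachable states with no successors."""
    seen, stack, deadlocks = set(), [initial], []
    while stack:
        state = stack.pop()
        if state in seen:
            continue
        seen.add(state)
        successors = transitions.get(state, [])
        if not successors:
            deadlocks.append(state)
        stack.extend(target for _, target in successors)
    return deadlocks

print(find_deadlocks(("idle", "idle")))  # [('hasA', 'hasB')]
```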
Abstract:
INTRODUCTION: The comet assay, or single-cell gel electrophoresis technique, is widely used to evaluate DNA damage and repair in individual cells. The material can be stained with fluorescence techniques or with silver salt. The latter has technical advantages, such as the type of microscope required and the possibility of storing the slides. Comets can be analyzed visually, but this has the disadvantage of subjective results, which can be minimized by automated digital analysis. OBJECTIVES: To develop and validate a method for the digital analysis of silver-stained comets. METHODS: Fifty comets were photographed in a standardized manner and printed on paper. Besides being measured manually, these images were classified into five categories by three raters, before and after automatic preprocessing with the ImageJ 1.38x software. The estimates produced by the raters were compared for correlation and reproducibility. Next, algorithms for the digital analysis of the measurements were developed, based on median and minimum statistical filters. The values obtained were compared with those estimated manually and visually after preprocessing. RESULTS: Manual measurements of the preprocessed images showed higher intraclass correlation than those of the original images. The automated parameters showed high correlation with the preprocessed manual measurements, suggesting that this system increases the objectivity of the analysis and can be used to estimate comet parameters. CONCLUSION: The proposed digital analysis for the silver-stained comet assay proved feasible and more reproducible than visual analysis.
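A minimal sketch of the preprocessing idea: a median filter suppresses speckle noise and a minimum filter emphasizes the dark, silver-stained comet pixels before measurement. The filter sizes, threshold, and random stand-in image are assumptions; the study's actual ImageJ pipeline is not reproduced here.

```python
import numpy as np
from scipy.ndimage import median_filter, minimum_filter

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)  # stand-in image

denoised = median_filter(image, size=3)        # remove isolated noisy pixels
emphasized = minimum_filter(denoised, size=3)  # grow dark (stained) regions

# One simple automated measurement: area of pixels darker than a threshold.
comet_area = int((emphasized < 100).sum())
print(comet_area)
```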
Abstract:
Smart sensors are devices that differ from ordinary sensors by being able to process the data they monitor. They are typically composed of a power source, transducers (sensors and actuators), memory, a processor, and a transceiver. According to the IEEE 1451 standard, a smart sensor can be divided into TIM and NCAP modules, which must communicate through a standardized interface called TII. The NCAP module is the part of the smart sensor that contains the processor and is therefore responsible for the sensor's intelligence. There are several approaches for developing this module, notably those based on low-cost microcontrollers and/or FPGAs. This work addresses the development of a hardware/software architecture for an NCAP module according to the IEEE 1451.1 standard. The hardware infrastructure is composed of an RS-232 interface driver, 512 kB of RAM, a TII interface, the NIOS II embedded processor, and a simulator of the TIM module. The SOPC Builder automatic integration tool is used to integrate the hardware components. The software infrastructure comprises the IEEE 1451.1 standard and the NCAP-specific application, which simulates the monitoring of pressure and temperature in oil wells in order to detect leaks. The proposed module is embedded in an FPGA and prototyped on the Altera DE2 board, which contains the Cyclone II EP2C35F672C6 FPGA. The NIOS II embedded processor supports the NCAP software infrastructure, which is developed in the C language and based on the IEEE 1451.1 standard. The behaviour of the hardware infrastructure is described in VHDL.
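A minimal sketch of the application-level logic such an NCAP could run: poll pressure readings coming from the TIM and raise an alarm when the pressure drops abnormally fast between samples. The threshold and readings are hypothetical; the real application follows the IEEE 1451.1 object model and is written in C.

```python
PRESSURE_DROP_LIMIT = 5.0  # bar per sample considered abnormal (assumed)

def detect_leak(pressure_samples):
    """Yield each sample index at which an abnormal pressure drop occurs."""
    for i in range(1, len(pressure_samples)):
        if pressure_samples[i - 1] - pressure_samples[i] > PRESSURE_DROP_LIMIT:
            yield i

readings = [152.0, 151.8, 151.7, 144.9, 144.8]  # simulated TIM output
print(list(detect_leak(readings)))              # [3]: alarm at sample 3
```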
Abstract:
Geological modeling allows the simulation, at the laboratory scale, of the geometric and kinematic evolution of geological structures. Knowledge of these structures becomes more important when we consider their role in creating traps or conduits for oil and water. In the present work we simulated the formation of folds and faults in an extensional environment through physical and numerical modeling, using a sandbox apparatus and the MOVE2010 software. The physical modeling of structures developed in the hangingwall of a listric fault showed the formation of active and inactive axial zones. In agreement with the literature, we verified the formation of a rollover between these two axial zones. The crestal collapse of the anticline formed grabens, limited by secondary faults perpendicular to the extension, with a curvilinear aspect. Adjacent to these faults we recorded the formation of transversal folds, parallel to the extension, characterized by a syncline in the fault hangingwall. We also observed drag folds near the fault surfaces, parallel to the fault surface, with an anticline in the footwall and a syncline in the hangingwall. To observe the influence of geometric variations (dip and width) in the flat of a flat-ramp fault, we ran two experimental series, the first with the flat varying in dip and width, and the second keeping the flat horizontal while varying its width. These experiments developed secondary faults, perpendicular to the extension, grouped in three sets: i) antithetic faults with a curvilinear geometry and synthetic faults with a more rectilinear geometry, both nucleated at the base of the sedimentary pile; the normal antithetic faults can rotate during the extension, presenting a pseudo-inverse kinematics; ii) faults nucleated at the top of the sedimentary pile, which propagate through the coalescence of segments, sometimes originating relay ramps; iii) reverse faults, nucleated at the flat-ramp interface. Comparing the two models, we verified that the dip of the flat favours a differentiated nucleation of faults at the two extremities of the master fault. These two flat-ramp models also generated an anticline-syncline pair, drag folds, and transversal folds. The anticline formed above the flat, sub-parallel to the master fault plane, while the syncline formed in areas more distal from the fault. The geometric variation of these two folds allows three structural domains to be defined. Using the physical experiments as a template, we also ran numerical modeling experiments with flat-ramp faults presenting variation in the flat. Secondary antithetic, synthetic, and reverse faults were generated in both models. The numerical modeling formed two folds: an anticline above the flat and a syncline farther from the master fault. The geometric variation of these two folds allowed the definition of three structural domains parallel to the extension. These data reinforce the physical models. Comparisons between natural data from a flat-ramp fault in the Potiguar basin and the data from the physical and numerical simulations showed that, in both cases, variation in the geometry of the flat produces variation in the hangingwall geometry.
Abstract:
Defining requirements for the software systems that will support a business is not an easy task, given the dynamics of process change. Requirements elicitation has been done empirically, without the support of systematic methods that guarantee development based on the real goals of the business. Software engineering lacks methods that make the business modeling and requirements elicitation stages of a system more orderly and methodical. This paper presents a software development methodology that results from incorporating proposed activities for business modeling and requirements elicitation, based on a business modeling architecture. These activities make software development more systematic and better aligned with the organization's goals, and they can be incorporated into any development methodology based on the UP (Unified Process).
Abstract:
The objectives of this study were to develop a computational system that performs the sizing and evolution of cattle herds, and to create a tool that allows the user to run simulations of a beef and/or dairy production system. The CA Clipper language was used. The routines were developed conversationally, with access to the various programs through self-explanatory menus. The developed system can help technicians and ranchers size and project the evolution of a cattle herd with precision and considerable speed; it allows the user to run numerous simulations; and it constitutes an important decision-support tool.
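A minimal sketch of the kind of yearly projection such a system computes: promote age classes, apply mortality, and add the year's female calf crop. All rates and class names are assumptions for illustration; the original system was written in CA Clipper, not Python.

```python
BIRTH_RATE = 0.80  # calves per cow per year (assumed)
MORTALITY = {"calves": 0.06, "heifers": 0.03, "cows": 0.02}  # assumed rates

def evolve(herd):
    """One-year projection of a simplified all-female herd."""
    survivors = {k: n * (1 - MORTALITY[k]) for k, n in herd.items()}
    return {
        "calves":  survivors["cows"] * BIRTH_RATE / 2,  # half are female
        "heifers": survivors["calves"],                 # age-class promotion
        "cows":    survivors["heifers"] + survivors["cows"],
    }

herd = {"calves": 40.0, "heifers": 35.0, "cows": 100.0}
for _ in range(3):
    herd = evolve(herd)
print({k: round(v, 1) for k, v in herd.items()})
```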