947 results for Factory of software
Abstract:
Nowadays, the importance of using software processes is well consolidated and is considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures and scales is a recurrent challenge in the software industry. It involves adapting software process models to the reality of their projects. In addition, it must also promote the reuse of past experiences in the definition and development of software processes for new projects. The adequate management and execution of software processes can bring better quality and productivity to the produced software systems. This work aimed to explore the use and adaptation of consolidated software product line techniques to support the management of the variabilities of software process families. In order to achieve this aim: (i) a systematic literature review is conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines is proposed and developed; and finally (iii) empirical studies and a controlled experiment assess and compare the proposed annotative approach against a compositional one. One study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. Another study, a comparative quantitative study, considered internal attributes of the specification of software process lines, such as modularity, size and complexity. Finally, the last study, a controlled experiment, evaluated the effort to use and the understandability of the investigated approaches when modeling and evolving specifications of software process lines. The studies provide evidence of several benefits of the annotative approach, and of the potential of integrating it with the compositional approach, to assist the variability management of software process lines.
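To give a flavor of the annotative style of variability management investigated above, the sketch below is a hypothetical Python illustration (not the dissertation's actual tooling): process elements are tagged with feature annotations and a concrete process is derived by filtering against a chosen feature configuration. Element and feature names are invented for the example.

    # Minimal sketch of annotation-based variability resolution for a
    # software process line. Element names and features are illustrative.

    from dataclasses import dataclass, field

    @dataclass
    class ProcessElement:
        name: str
        # Features that must be selected for this element to remain in the
        # derived process; an empty set means the element is mandatory.
        required_features: set = field(default_factory=set)

    def derive_process(elements, selected_features):
        """Keep only elements whose annotations are satisfied by the configuration."""
        return [e for e in elements if e.required_features <= selected_features]

    process_line = [
        ProcessElement("Plan Iteration"),
        ProcessElement("Write Acceptance Tests", {"agile"}),
        ProcessElement("Formal Inspection", {"high_ceremony"}),
    ]

    if __name__ == "__main__":
        derived = derive_process(process_line, {"agile"})
        print([e.name for e in derived])  # ['Plan Iteration', 'Write Acceptance Tests']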
Abstract:
Software Product Line (SPL) is a software development paradigm whose main focus is to identify common features and variability among applications in a specific domain. An SPL is designed to meet the requirements of all products in its product family. These requirements, and the SPL itself, may change over time due to several factors, such as the evolution of product requirements, of the market, of the SPL process, and of the technologies used to develop the products. To handle these changes, the SPL should be modified and evolved so that it does not become obsolete and adapts itself to new requirements. Change impact analysis is the activity of understanding and identifying what consequences these changes have on the SPL. Impact analysis on an SPL may be supported by traceability relationships, which identify the relationships between artifacts created during all phases of software development. Despite the existing solutions for traceability-based change impact analysis for software, there is a lack of solutions for assessing change impact through traceability in SPLs, since existing solutions do not include estimates specific to SPL artifacts. Thus, this paper proposes a change impact analysis process and a tool for assessing change impact through the traceability of artifacts in SPLs. For this purpose, we specified a change impact analysis process that considers the artifacts produced during SPL development. We have also implemented a tool that estimates and identifies the SPL artifacts and products affected by changes in other products, changes in classes, changes in features, changes between SPL releases, and artifacts related to changes in core assets and variability. Finally, the results were evaluated through metrics.
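The core idea of traceability-based change impact analysis described above can be seen as a reachability computation over a graph of trace links. The Python sketch below is a hypothetical illustration, not the proposed tool; the artifact names and link structure are assumptions made for the example.

    # Sketch of change impact analysis over traceability links.
    # Artifact names and link structure are illustrative assumptions.

    from collections import deque

    # Directed traceability links: artifact -> artifacts that depend on it.
    trace_links = {
        "feature:Payment": ["class:PaymentService", "test:PaymentTest"],
        "class:PaymentService": ["product:StoreBasic", "product:StorePremium"],
        "test:PaymentTest": [],
    }

    def impacted_artifacts(changed, links):
        """Return every artifact reachable from the changed one via trace links."""
        impacted, queue = set(), deque([changed])
        while queue:
            current = queue.popleft()
            for dependent in links.get(current, []):
                if dependent not in impacted:
                    impacted.add(dependent)
                    queue.append(dependent)
        return impacted

    if __name__ == "__main__":
        print(sorted(impacted_artifacts("feature:Payment", trace_links)))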
Abstract:
This dissertation presents a model-driven and integrated approach to the variability management, customization and execution of software processes. Our approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering provides support for the specification of software processes and their transformation into workflow specifications. Software product line techniques allow the automatic variability management of process elements and fragments. Additionally, in our approach, workflow technologies enable process execution in workflow engines. In order to evaluate the feasibility of the approach, we have implemented it using existing model-driven engineering technologies. The software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes has been implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are adopted to transform EPF processes into jPDL workflow language specifications, enabling the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
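To illustrate the process-to-workflow transformation step described above (which the work itself performs with ATL and Acceleo over EPF models), the hypothetical Python sketch below maps an ordered list of process tasks onto a simple workflow XML skeleton. The element names are simplified placeholders and do not reproduce the exact jPDL schema.

    # Sketch: turning an ordered list of process tasks into a workflow
    # definition. The XML element names below are simplified placeholders,
    # not the exact jPDL schema used in the work.

    import xml.etree.ElementTree as ET

    def process_to_workflow(process_name, tasks):
        root = ET.Element("process-definition", name=process_name)
        previous = ET.SubElement(root, "start-state", name="start")
        for task in tasks:
            node = ET.SubElement(root, "task-node", name=task)
            ET.SubElement(previous, "transition", to=task)
            previous = node
        ET.SubElement(root, "end-state", name="end")
        ET.SubElement(previous, "transition", to="end")
        return ET.tostring(root, encoding="unicode")

    if __name__ == "__main__":
        print(process_to_workflow("ProjectManagement",
                                  ["Plan Iteration", "Monitor Status", "Assess Results"]))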
Abstract:
Through the adoption of the software product line (SPL) approach, several benefits are achieved when compared to conventional development processes that are based on creating a single software system at a time. The process of developing an SPL differs from traditional software construction, since it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental and aims to detect defects in the artifacts produced during SPL development. However, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proven limited and provide only general guidelines. In addition, there is also a lack of tools to support the variability management and customization of automated test cases for SPLs. In this context, this dissertation has the goal of proposing a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration and system levels in domain and application engineering; and (iii) tool support for automating the variability management and customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results of this work show that the proposed approach can help developers to deal with the challenges imposed by the characteristics of SPLs during the testing process.
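As a rough illustration of customizing automated test cases per derived product, the hypothetical Python sketch below (not the approach's actual tooling) guards variant-specific tests with the feature configuration of the product under test; the feature names and configuration source are assumptions.

    # Sketch of feature-driven customization of automated test cases for a
    # derived SPL product. Feature names and the configuration source are
    # illustrative assumptions.

    import unittest

    # Configuration of the product derived in application engineering.
    SELECTED_FEATURES = {"catalog", "payment"}

    def requires_feature(feature):
        """Skip the test when the product configuration lacks the feature."""
        return unittest.skipUnless(feature in SELECTED_FEATURES,
                                   f"feature '{feature}' not selected")

    class ProductTests(unittest.TestCase):
        def test_catalog_listing(self):       # common (core-asset) test
            self.assertIn("catalog", SELECTED_FEATURES)

        @requires_feature("payment")
        def test_payment_checkout(self):      # variant-specific test
            self.assertIn("payment", SELECTED_FEATURES)

        @requires_feature("wishlist")
        def test_wishlist(self):              # skipped for this product
            self.assertIn("wishlist", SELECTED_FEATURES)

    if __name__ == "__main__":
        unittest.main()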
Abstract:
Tracing between the models of the requirements and architecture activities is a strategy that aims to prevent loss of information, reducing the gap between these two initial activities of the software life cycle. In the context of Software Product Lines (SPL), it is important to have this support, which establishes the correspondence between these two activities while managing variability. In order to address this issue, this paper presents a bidirectional mapping process, defining transformation rules between elements of a goal-oriented requirements model (described in PL-AOVgraph) and elements of an architectural description (defined in PL-AspectualACME). These mapping rules are evaluated through a case study: the GingaForAll SPL. To automate the transformation, we developed the MaRiPLA tool (Mapping Requirements to Product Line Architecture), using Model-driven Development (MDD) techniques, including the Atlas Transformation Language (ATL) with Ecore metamodel specifications, together with Xtext, a DSL definition framework, and Acceleo, a code generation tool, in the Eclipse environment. Finally, the generated models are evaluated based on quality attributes such as variability, derivability, reusability, correctness, traceability, completeness, evolvability and maintainability, extracted from the CAFÉ Quality Model.
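In the work itself the mapping rules are written in ATL; purely as a hypothetical illustration of the idea, the Python sketch below translates goal-model elements into architectural counterparts by element kind. The element kinds and the mapping itself are invented for the example and do not reproduce the actual PL-AOVgraph to PL-AspectualACME rules.

    # Sketch of a requirements-to-architecture mapping rule: each kind of
    # goal-model element is translated to an architectural counterpart.
    # The element kinds and mapping are illustrative assumptions.

    MAPPING_RULES = {
        "goal": "component",
        "task": "connector",
        "variation_point": "optional_component",
    }

    def map_requirements_to_architecture(goal_model):
        """goal_model: list of (kind, name) pairs from the requirements model."""
        architecture = []
        for kind, name in goal_model:
            target_kind = MAPPING_RULES.get(kind)
            if target_kind is not None:
                architecture.append((target_kind, name))
        return architecture

    if __name__ == "__main__":
        model = [("goal", "ExhibitMedia"), ("task", "DecodeStream"),
                 ("variation_point", "ClosedCaptions")]
        print(map_requirements_to_architecture(model))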
Abstract:
A great challenge of Component-Based Development is the creation of mechanisms that facilitate finding reusable assets that fulfill the requirements of a particular system under development. In this sense, some component repositories have been proposed in order to address such a need. However, repositories need to represent the asset characteristics that consumers take into account when choosing the most adequate assets for their needs. In this context, the literature presents some models proposed to describe asset characteristics, such as identification, classification, non-functional requirements, usage and deployment information, and component interfaces. Nevertheless, the set of characteristics represented by those models is insufficient to describe information used before, during and after asset acquisition. This information refers to negotiation, certification, change history, the adopted development process, events, exceptions, and so on. In order to overcome this gap, this work proposes an XML-based model to represent several characteristics, of different asset types, that may be employed in component-based development. Besides representing metadata used by consumers, useful for asset discovery, acquisition and usage, this model, called X-ARM, also focuses on supporting the activities of asset developers. Since the proposed model represents an expressive amount of information, this work also presents a tool called X-Packager, developed with the goal of helping describe assets with X-ARM.
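Since X-ARM is XML-based, an asset description covering identification, classification and negotiation metadata might be assembled as in the hypothetical Python sketch below. The element and attribute names are invented for illustration and do not reproduce the real X-ARM schema.

    # Hypothetical sketch of building an XML asset descriptor with
    # identification, classification and negotiation metadata. Element and
    # attribute names are invented; they are not the actual X-ARM schema.

    import xml.etree.ElementTree as ET

    def build_asset_descriptor(name, version, domain, license_terms):
        asset = ET.Element("asset", name=name, version=version)
        ET.SubElement(asset, "classification", domain=domain)
        negotiation = ET.SubElement(asset, "negotiation")
        ET.SubElement(negotiation, "license").text = license_terms
        history = ET.SubElement(asset, "change-history")
        ET.SubElement(history, "entry", version=version).text = "initial release"
        return ET.tostring(asset, encoding="unicode")

    if __name__ == "__main__":
        print(build_asset_descriptor("PaymentComponent", "1.0",
                                     "e-commerce", "royalty-free"))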
Abstract:
The academic community and the software industry have shown, in recent years, substantial interest in approaches and technologies related to the area of model-driven development (MDD). At the same time, industry continues its relentless pursuit of technologies that raise productivity and quality in the development of software products. This work aims to explore these two statements through an experiment carried out using MDD technology and the evaluation of its use in solving a real problem in the security context of enterprise systems. By building and using a tool, a visual DSL named CALV3, inspired by the software factory approach (a synergy between software product lines, domain-specific languages and MDD), we evaluate the gains in abstraction and productivity through a systematic case study conducted with a development team. The results and lessons learned from the evaluation of this tool within industry are the main contributions of this work.
Abstract:
Mining Software Repositories (MSR) is a research area that analyzes software repositories in order to derive relevant information for the research and practice of software engineering. The main goal of repository mining is to turn the static information stored in repositories (e.g. a code repository or change request system) into valuable information that supports decision making in software projects. On the other hand, another research area, called Process Mining (PM), aims to discover the characteristics of the underlying processes of business organizations, supporting process improvement and documentation. Recent works have performed several analyses through MSR and PM techniques: (i) to investigate the evolution of software projects; (ii) to understand the real underlying process of a project; and (iii) to create defect prediction models. However, few research works have focused on analyzing the contributions of software developers by means of MSR and PM techniques. In this context, this dissertation presents two empirical studies that assess the contribution of software developers to an open-source project and to a commercial project using those techniques. The contributions of developers are assessed from three different perspectives: (i) buggy commits; (ii) the size of commits; and (iii) the most important bugs. For the open-source project, 12,827 commits and 8,410 bugs were analyzed, while 4,663 commits and 1,898 bugs were analyzed for the commercial project. Our results indicate that, for the open-source project, the developers classified as core developers contributed more buggy commits (although they also contributed the majority of commits), more code to the project (commit size) and more solved important bugs, while for the commercial project the results could not indicate statistically significant differences between developer groups.
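A minimal sketch of the "buggy commit" perspective mentioned above: commits are linked to reported bugs by scanning commit messages for issue identifiers and the counts are aggregated per developer. This is a hypothetical Python illustration, not the dissertation's analysis scripts; the data format and identifier pattern are assumptions.

    # Sketch: classify commits as bug-fixing by matching issue IDs in the
    # commit message, then aggregate per author. The input format is an
    # illustrative assumption (e.g. parsed from 'git log' output).

    import re
    from collections import Counter

    BUG_ID = re.compile(r"(?:#|BUG-)(\d+)", re.IGNORECASE)

    commits = [
        {"author": "alice", "message": "Fix NPE in parser (closes #8410)"},
        {"author": "bob",   "message": "Add export feature"},
        {"author": "alice", "message": "BUG-123: correct rounding error"},
    ]

    def buggy_commits_per_author(commit_list):
        counts = Counter()
        for commit in commit_list:
            if BUG_ID.search(commit["message"]):
                counts[commit["author"]] += 1
        return counts

    if __name__ == "__main__":
        print(buggy_commits_per_author(commits))  # Counter({'alice': 2})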
Abstract:
In this study, we sought to address the weaknesses faced by most students when studying the trigonometric functions sine and cosine. For this, we proposed the use of the GeoGebra software to carry out a sequence of activities on the content covered. The research followed a qualitative approach based on observations of the activities performed by 2nd-year high school students at IFRN - Campus Caicó. The activities made it possible to identify some difficulties encountered by the students, as well as the interaction between them during the tasks. The results were satisfactory, since they indicate that the use of the software contributed to a better understanding of the mathematical concepts studied.
Abstract:
It can be argued that technological evolution (the development of new measurement instruments such as software, satellites and computers, as well as the falling cost of storage media) allows organizations to produce and acquire large amounts of data in a short period of time. Due to this volume of data, research organizations become potentially vulnerable to the impacts of the information explosion. A solution adopted by some organizations is the use of information system tools to support the documentation, retrieval and analysis of data. In the scientific context, these tools are developed to store different metadata standards (data about data). In the development process of these tools, the adoption of standards such as the Unified Modeling Language (UML) stands out, whose diagrams support the modeling of different aspects of the software. The goal of this study is to present an information system tool that supports the documentation of organizations' data by means of metadata, and to highlight the software modeling process using UML. We address the Digital Geospatial Metadata Standard, widely used in data cataloguing by scientific organizations around the world, and the dynamic and static UML diagrams, such as use case, sequence and class diagrams. The development of information system tools can be a way to promote the organization and dissemination of scientific data. However, the modeling process requires special attention to the development of interfaces that will encourage the use of the information system tools.
Abstract:
The objective of the present article is to identify and discuss the possibilities of using qualitative data analysis software within the framework of procedures proposed by SDI (socio-discursive interactionism), emphasizing freely distributed software or free versions of commercial software. A literature review of software for qualitative data analysis in the social sciences and humanities, focusing on language studies, is presented. Some tools, such as Weft QDA, MLCT, Yoshikoder and Tropes, are examined with their respective features and functions. The software called Tropes is examined in more detail because of its particular relation to language and semantic analysis, as well as its embedded classification of linguistic elements such as types of verbs, adjectives, modalizations, etc. Although trying to completely automate an SDI-based analysis is not feasible, the programs appear to be powerful helpers in analyzing specific questions. Still, it seems important to be familiar with the software options and to use different applications in order to obtain a more diversified view of the data. It is up to the researcher to be critical of the analysis provided by the machine.
Abstract:
An automatic procedure with a high current-density anodic electrodissolution unit (HDAE) is proposed for the determination of aluminium, copper and zinc in non-ferrous alloys by flame atomic absorption spectrometry, based on direct solid analysis. It consists of solenoid valve-based commutation in a flow-injection system for on-line sample electro-dissolution and calibration with one multi-element standard, an electrolytic cell equipped with two electrodes (a silver needle acts as cathode, and the sample as anode), and an intelligent unit. The latter is assembled in a PC-compatible microcomputer for instrument control and for data acquisition and processing. General management of the process is achieved by means of software written in Pascal. Electrolyte compositions, flow rates, commutation times, applied current and electrolysis time were investigated. A 0.5 mol l(-1) HNO3 solution was selected as electrolyte and 300 A/cm(2) as the continuous current pulse. The performance of the proposed system was evaluated by analysing aluminium in Al-alloy samples, and copper/zinc in brass and bronze samples, respectively. The system handles about 50 samples per hour. Results are precise (R.S.D. < 2%) and in agreement with those obtained by ICP-AES and spectrophotometry at the 95% confidence level.
Abstract:
To date, different navigation techniques for mobile robots have been developed. However, experimenting with these techniques is not a trivial task, because it is usually not possible to reuse the developed control software due to system incompatibilities. This paper proposes a software platform that provides means for creating reusable software modules through the standardization of the software interfaces that represent the various robot modules.
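As a hypothetical illustration of the interface standardization idea described above (not the platform's real API), the Python sketch below defines a common interface that different sensor modules implement, so that navigation code written against the interface can be reused across robots. All names are invented for the example.

    # Sketch: a standardized module interface so control software can be
    # reused across robots. Class and method names are illustrative.

    from abc import ABC, abstractmethod

    class RangeSensor(ABC):
        """Common interface every range-sensor module must implement."""

        @abstractmethod
        def read_distances(self) -> list:
            """Return distances (in meters) to the nearest obstacles."""

    class SimulatedLaser(RangeSensor):
        def read_distances(self) -> list:
            return [1.2, 0.8, 2.5]   # canned values standing in for real hardware

    def obstacle_ahead(sensor: RangeSensor, threshold: float = 1.0) -> bool:
        """Reusable navigation check that depends only on the interface."""
        return min(sensor.read_distances()) < threshold

    if __name__ == "__main__":
        print(obstacle_ahead(SimulatedLaser()))  # True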