65 results for Experimental Software Engineering. Interactivity. Virtual Collaboration. Systematic Review


Relevance:

100.00%

Publisher:

Abstract:

Nonionic surfactants are composed of substances whose molecules do not ionize in solution. The solubility of these surfactants in water is due to the presence of functional groups with a strong affinity for water. When these surfactants are heated, two liquid phases form, evidenced by the phenomenon of turbidity (the cloud point). This study aimed to determine experimentally the cloud-point temperature of nonylphenol-polyethoxylated surfactants and subsequently to perform a thermodynamic modeling, considering the Flory-Huggins model and an empirical solid-liquid equilibrium (SLE) model. The cloud point was determined by the visual method (Inoue et al., 2008). The experimental methodology consisted of preparing synthetic solutions of 0.25%, 0.5%, 1%, 2%, 3%, 4%, 5%, 6%, 7%, 8%, 9%, 10%, 12.5%, 15%, 17% and 20% surfactant by weight. The nonionic surfactants were selected according to their degree of ethoxylation (9.5, 10, 11, 12 and 13). During the experiments the solutions were homogenized and the bath temperature was gradually increased while the turbidity of the solution was checked visually (Inoue et al., 2003). These cloud-point temperature data were used to feed the evaluated models and to obtain thermodynamic parameters for the nonylphenol-polyethoxylated surfactant systems. The fitted models can then be used in phase-separation processes, facilitating the extraction of organic solvents, and thus serve as quantitative and qualitative parameters. It was observed that the solid-liquid equilibrium (SLE) model best represented the experimental data.
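The empirical SLE-type relation can be sketched as a simple linear fit: writing it as ln x = A + B/T, the parameters follow from ordinary least squares on ln x versus 1/T. The data points below are illustrative placeholders, not the study's measurements (a minimal sketch in Python):

```python
import math

# Hypothetical cloud-point data: surfactant weight fraction x vs cloud temperature T (K).
# Empirical SLE-type relation used as a sketch: ln x = A + B / T, so a linear
# least-squares fit of ln x against 1/T yields the parameters A and B.
data = [(0.0025, 330.0), (0.01, 335.0), (0.05, 341.0), (0.10, 344.0), (0.20, 347.0)]

xs = [1.0 / T for _, T in data]      # independent variable: 1/T
ys = [math.log(x) for x, _ in data]  # dependent variable: ln x

n = len(data)
sx, sy = sum(xs), sum(ys)
sxx = sum(v * v for v in xs)
sxy = sum(a * b for a, b in zip(xs, ys))
B = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope (negative: T rises with x)
A = (sy - B * sx) / n                          # intercept

def cloud_temperature(x):
    """Invert ln x = A + B/T to predict the cloud point for composition x."""
    return B / (math.log(x) - A)
```

Within the fitted composition range, the inverted relation interpolates the cloud-point curve; extrapolation outside it is not justified by this sketch.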

Relevance:

100.00%

Publisher:

Abstract:

Through the adoption of the software product line (SPL) approach, several benefits are achieved when compared to conventional development processes based on creating a single software system at a time. The process of developing an SPL differs from traditional software construction, since it has two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental and aims to detect defects in the artifacts produced during SPL development. However, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have shown themselves limited and provide only general guidelines. In addition, there is a lack of tools to support variability management and the customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration and system levels in domain and application engineering; and (iii) tool support for automating variability management and the customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results of this work show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
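The idea of customizing reusable test cases per derived product can be illustrated with a toy sketch; the test names and feature names below are hypothetical, not taken from the dissertation's tooling:

```python
# Hypothetical domain-engineering test assets: each reusable test case is tagged
# with the SPL features it exercises; an empty set marks a test common to all products.
DOMAIN_TESTS = {
    "test_login":            {"authentication"},
    "test_password_reset":   {"authentication", "email"},
    "test_report_export":    {"reporting"},
    "test_basic_navigation": set(),
}

def derive_test_suite(product_features):
    """Application engineering step: select the test cases whose required
    features are all present in the derived product's configuration."""
    return sorted(name for name, required in DOMAIN_TESTS.items()
                  if required <= set(product_features))

basic_product = derive_test_suite({"authentication"})
full_product = derive_test_suite({"authentication", "email", "reporting"})
```

A real SPL testing tool would also resolve the feature model and bind variation points inside the test code; the sketch only shows the selection step.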

Relevance:

100.00%

Publisher:

Abstract:

Software Product Lines (SPL) is a software engineering approach to developing families of software systems that share common features and differ in other features according to the requested software systems. The adoption of the SPL approach can bring several benefits, such as cost reduction, product quality, productivity and shorter time to market. On the other hand, the SPL approach brings new challenges to software evolution that must be considered. Recent research has explored and proposed automated approaches based on code analysis and traceability techniques for change impact analysis in the context of SPL development. These approaches have limitations, such as the customization of the analysis functionalities to address different strategies for change impact analysis, and the change impact analysis of fine-grained variability. This dissertation proposes a change impact analysis tool for SPL development, called Squid Impact Analyzer. The tool supports change impact analysis based on information from variability modeling, the mapping of variability to code assets, and the existing dependency relationships between code assets. The tool is assessed through an experiment that compares its change impact analysis results with the real changes applied over several evolution releases of an SPL for media management on mobile devices.
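The core of dependency-based change impact analysis (independent of the Squid Impact Analyzer internals, which are not reproduced here) can be sketched as a reverse reachability search over the asset dependency graph; all file names below are hypothetical:

```python
from collections import deque

# Hypothetical dependency edges between code assets: "a depends on b".
depends_on = {
    "Screen.java": ["Player.java"],
    "Menu.java": ["Screen.java", "FavList.java"],
}

def impact_set(changed_asset):
    """Assets transitively depending on the changed asset (reverse BFS):
    everything reachable against the dependency edges may be impacted."""
    reverse = {}
    for src, targets in depends_on.items():
        for t in targets:
            reverse.setdefault(t, []).append(src)
    seen, queue = set(), deque([changed_asset])
    while queue:
        for dependant in reverse.get(queue.popleft(), []):
            if dependant not in seen:
                seen.add(dependant)
                queue.append(dependant)
    return seen
```

Crossing this impact set with the feature-to-asset mapping then tells which products of the line a change may affect.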

Relevance:

100.00%

Publisher:

Abstract:

Mining Software Repositories (MSR) is a research area that analyses software repositories in order to derive relevant information for the research and practice of software engineering. The main goal of repository mining is to turn the static information stored in repositories (e.g. a code repository or a change request system) into valuable information that supports decision making in software projects. Another research area, Process Mining (PM), aims to discover the characteristics of the underlying processes of business organizations, supporting process improvement and documentation. Recent works have carried out several analyses with MSR and PM techniques: (i) to investigate the evolution of software projects; (ii) to understand the real underlying process of a project; and (iii) to create defect prediction models. However, few works have focused on analyzing the contributions of software developers by means of MSR and PM techniques. In this context, this dissertation presents two empirical studies that assess the contributions of software developers to an open-source and a commercial project using those techniques. The contributions of developers are assessed from three different perspectives: (i) buggy commits; (ii) commit size; and (iii) the most important bugs. For the open-source project 12,827 commits and 8,410 bugs were analyzed, while 4,663 commits and 1,898 bugs were analyzed for the commercial project. Our results indicate that, for the open-source project, the developers classified as core developers contributed more buggy commits (although they also contributed the majority of commits), more code to the project (commit size) and more solved important bugs, while for the commercial project the results could not indicate statistically significant differences between the developer groups.
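A minimal illustration of two of the contribution perspectives above, buggy commits and commit size; the commit records and the bug-introducing flag (as an SZZ-style analysis might produce) are hypothetical:

```python
# Hypothetical commit records; "bug_introducing" stands for a flag that an
# SZZ-style analysis would attach to commits later linked to bug fixes.
commits = [
    {"author": "alice", "lines_changed": 120, "bug_introducing": False},
    {"author": "alice", "lines_changed": 30,  "bug_introducing": True},
    {"author": "bob",   "lines_changed": 10,  "bug_introducing": False},
]

def contribution_profile(commit_log):
    """Aggregate per-developer indicators: commit count, total commit size,
    and number of buggy (bug-introducing) commits."""
    profile = {}
    for c in commit_log:
        p = profile.setdefault(c["author"], {"commits": 0, "lines": 0, "buggy": 0})
        p["commits"] += 1
        p["lines"] += c["lines_changed"]
        p["buggy"] += int(c["bug_introducing"])
    return profile
```

Comparing these per-group aggregates (core vs peripheral developers) is where the statistical tests mentioned in the study would come in.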

Relevance:

100.00%

Publisher:

Abstract:

This work presents a design method proposed to specify and build software components, from the functional software model down to the assembly-code level, in a rigorous fashion. The method is based on the B method and was developed with the support and interest of British Petroleum (BP). One goal of this methodology is to contribute to solving an important problem, known as the Verifying Compiler. In addition, this work describes a formal model of the Z80 microcontroller and of a real system from the petroleum area. To achieve this goal, the formal model of the Z80 was developed and documented, as it is a key component for verification down to the assembly level. In order to evaluate the methodology, it was applied to a petroleum production test system, which is presented in this work. Part of the technique is performed manually; however, most of these activities can be automated by a specific compiler. To build such a compiler, the formal modelling of the microcontroller and of the production test system should provide relevant knowledge and experience for the design of a new compiler. In summary, this work aims to improve the viability of one of the most stringent approaches to formal verification: speeding up the verification process, reducing design time and increasing the quality and reliability of the final software product. All these qualities are very important for systems that involve serious risks or require high confidence, which is very common in the petroleum industry.

Relevance:

100.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico - CNPq

Relevance:

100.00%

Publisher:

Abstract:

Ubiquitous computing is a paradigm in which devices with processing and communication capabilities are embedded in everyday elements of our lives (houses, cars, cameras, phones, schools, museums, etc.), providing services with a high degree of mobility and transparency. The development of ubiquitous systems is a complex task, since it involves several areas of computing, such as Software Engineering, Artificial Intelligence and Distributed Systems. This task becomes even more complex due to the absence of a reference architecture to guide the development of such systems. Reference architectures have been used to provide a common base and guidelines for building software architectures for different classes of systems. In turn, architecture description languages (ADLs) provide a syntax for the structural representation of architectural elements, their constraints and interactions, making it possible to express the architectural model of systems. Currently there are no ADLs in the literature based on reference architectures for the ubiquitous computing domain. In order to enable the architectural modeling of ubiquitous applications, the main goal of this work is to specify UbiACME, an architecture description language for ubiquitous applications, and to provide the UbiACME Studio tool, which allows software architects to build models using UbiACME. To this end, we first performed a systematic review to investigate, in the literature related to ubiquitous systems, the elements common to these systems that should be considered in the design of UbiACME. In addition, based on the systematic review, we defined a reference architecture for ubiquitous systems, RA-Ubi, which is the basis for the definition of the elements required for architectural modeling and therefore provides input for the definition of the UbiACME elements.
Finally, to validate the language and the tool, we present a controlled experiment in which architects model a ubiquitous application using UbiACME Studio and compare it with the modeling of the same application in SysML.

Relevance:

100.00%

Publisher:

Abstract:

Flow measurement through the prediction of differential pressure is widely used in day-to-day industry, mainly because it can be applied to various types of fluids, such as gas and liquid flows of distinct viscosities, and even flows of fluids with particles in suspension. The suitability of this equipment for measuring mass flow in two-phase flow is of paramount importance for technological development and for the reliability of the results. When it comes to two-phase flow, the relationship between the fluids and their interactions is of paramount importance in predicting the flow. In this work, we propose the use of a concentric orifice plate in small-diameter pipes, of the order of 25.4 mm, through which a two-phase water-air flow passes. Single-phase flow measurement was based on NBR 5167-1, using the Stolz equation to determine the discharge coefficient. For the two-phase flow, two correlations widely used to predict mass flow were applied: the model of Zhang (1992) and the model of Chisholm (1967), for the homogeneous flow model. It was observed that the Zhang model represents the mass flow of the two-phase flow more realistically, since the Chisholm model extrapolates the parameters for the pressure P2 downstream of the orifice plate and the rated discharge coefficient. Using the pressure drop P1-P2 together with the discharge coefficient led to a better convergence of the values obtained for the two-phase air-water stream.
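For the single-phase case, the NBR/ISO 5167 orifice-plate relation can be written directly; the discharge coefficient, bore and pressure values below are illustrative assumptions, not data from the study:

```python
import math

def orifice_mass_flow(C, beta, d, dp, rho, eps=1.0):
    """Single-phase mass flow (kg/s) through an orifice plate (NBR/ISO 5167):
        qm = C / sqrt(1 - beta^4) * eps * (pi/4) * d^2 * sqrt(2 * dp * rho)
    C    discharge coefficient (e.g. from the Stolz equation)
    beta diameter ratio d/D
    d    orifice bore diameter (m)
    dp   differential pressure P1 - P2 (Pa)
    rho  upstream fluid density (kg/m^3)
    eps  expansibility factor (1.0 for incompressible flow, i.e. water)
    """
    return (C / math.sqrt(1.0 - beta ** 4) * eps
            * (math.pi / 4.0) * d ** 2 * math.sqrt(2.0 * dp * rho))

# Illustrative values (assumed): a 12.7 mm bore in the 25.4 mm pipe
# (beta = 0.5), water at 998 kg/m^3, 10 kPa differential pressure.
qm = orifice_mass_flow(C=0.61, beta=0.5, d=0.0127, dp=10e3, rho=998.0)
```

Two-phase correlations such as Zhang's or Chisholm's then correct this single-phase reading for the presence of the gas phase.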

Relevance:

100.00%

Publisher:

Abstract:

This work presents an experimental analysis of the fuel consumption of a flex-fuel vehicle operating with different gasoline-ethanol blends in urban traffic, yielding results more consistent with the driver's reality. Most owners are unaware of the possibility of blending the fuels at filling time, which would allow choosing the most economically viable gasoline/ethanol blend, resulting in lower costs and possibly lower pollutant emission rates. There is a widespread rule of thumb that filling with ethanol is only viable if its price is at most 70% of that of regular gasoline. However, vehicles with this technology can operate with any blend percentage in the fuel tank; today many owners do not use this feature effectively, either because they are unaware of the possibility of blending or because there is no deeper study of the optimal blend percentage that provides higher performance at a lower cost than proposed by the manufacturers.
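The break-even reasoning behind the popular 70% threshold can be made explicit by comparing cost per kilometre on each fuel; the km/l figures and prices below are illustrative assumptions, not measurements from the study:

```python
# The rule of thumb follows from ethanol's lower energy content: filling with
# ethanol pays off when its price per litre, relative to gasoline's, is below
# the ratio of the distances the vehicle covers per litre on each fuel.

def ethanol_is_viable(price_ethanol, price_gasoline, kml_ethanol, kml_gasoline):
    """Compare cost per kilometre on each fuel."""
    cost_per_km_eth = price_ethanol / kml_ethanol
    cost_per_km_gas = price_gasoline / kml_gasoline
    return cost_per_km_eth < cost_per_km_gas

# With 10 km/l on ethanol vs 14 km/l on gasoline (assumed figures), the
# break-even price ratio is 10/14 ~= 0.71 -- the origin of the 70% threshold.
viable = ethanol_is_viable(price_ethanol=3.50, price_gasoline=5.20,
                           kml_ethanol=10.0, kml_gasoline=14.0)
```

For an intermediate blend, the same comparison applies with the blend's price and measured km/l, which is what the experimental analysis sets out to provide.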

Relevance:

100.00%

Publisher:

Abstract:

Soft skills and teamwork practices have been identified as the main deficiencies of recent graduates of computing courses. This issue led to a qualitative study aimed at investigating the challenges faced by professors of those courses in conducting, monitoring and assessing collaborative software development projects. Different challenges were reported by the teachers, including difficulties in assessing students at both the collective and individual levels. In this context, a quantitative study was conducted with the aim of mapping the soft skills of students to a set of indicators that can be extracted from software repositories using data mining techniques. These indicators are aimed at measuring soft skills such as teamwork, leadership, problem solving and the pace of communication. A peer assessment approach was then applied in a collaborative software development course of the software engineering major at the Federal University of Rio Grande do Norte (UFRN). This research presents a correlation study between the students' soft skills scores and indicators based on mining software repositories. This study contributes: (i) by presenting the professors' perception of the difficulties and opportunities for improving management and monitoring practices in collaborative software development projects; (ii) by investigating relationships between soft skills and the activities performed by students, using software repositories; (iii) by encouraging the development of soft skills and the use of software repositories among software engineering students; and (iv) by contributing to the state of the art of three important areas of software engineering, namely software engineering education, educational data mining and human aspects of software engineering.
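The correlation study described above can be sketched with a plain Pearson coefficient between a peer-assessed score and one repository indicator; all scores and counts below are hypothetical:

```python
import math

# Hypothetical data: peer-assessed teamwork scores per student, and a
# repository indicator (commits per student) mined from the course projects.
teamwork_scores = [3.0, 4.5, 2.0, 5.0, 3.5]
commit_counts = [12, 30, 5, 41, 18]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(teamwork_scores, commit_counts)
```

With real, non-normal classroom data a rank-based coefficient (Spearman) would usually be preferred; the Pearson version keeps the sketch short.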

Relevance:

100.00%

Publisher:

Abstract:

Background: Inspiratory muscle training (IMT) has been considered an option for reversing or preventing decreases in respiratory muscle strength; however, little is known about the adaptations of these muscles arising from training with a load. Objectives: To investigate the effect of IMT on diaphragmatic muscle strength and on the neural and structural adjustment of the diaphragm in sedentary young people; to compare the effects of low-intensity IMT with moderate-intensity IMT on the thickness, mobility and electrical activity of the diaphragm and on inspiratory muscle strength; and to establish a protocol for conducting a systematic review to evaluate the effects of respiratory muscle training in children and adults with neuromuscular diseases. Materials and Methods: A randomized, double-blind, parallel-group, controlled trial with a sample of 28 healthy, sedentary young people of both sexes, divided into two groups: 14 in the low-load training group (G10%) and 14 in the moderate-load training group (G55%). The volunteers performed a home IMT protocol with the POWERbreathe® device for 9 weeks. The G55% group trained with 55% of maximal inspiratory pressure (MIP) and the G10% group used a load of 10% of MIP. Training was conducted in sessions of 30 repetitions, twice a day, six days per week. MIP was evaluated and the load adjusted every two weeks. The volunteers underwent ultrasound, surface electromyography, spirometry and manometry before and after IMT. Data were analyzed with SPSS 20.0. Student's t-test for paired samples was used to compare diaphragmatic thickness, MIP and MEP before and after the IMT protocol, and the Wilcoxon test to compare the RMS (root mean square) and median frequency (MedF) values, also before and after the training protocol. Student's t-test for independent samples was then used to compare diaphragm mobility and thickness, MIP and MEP between the two groups, and the Mann-Whitney test to compare the RMS and MedF values between the two groups.
In parallel with the experimental study, we developed a protocol, with support from the Cochrane Collaboration, on IMT in people with neuromuscular diseases. Results: Both groups showed increased inspiratory muscle strength (P < 0.05); expiratory strength increased in G10% (P = 0.009); RMS and relaxed-muscle thickness increased in G55% (P = 0.005; P = 0.026); and there was no change in MedF (P > 0.05). The comparison between the two groups showed a difference in RMS (P = 0.04) and no difference in diaphragm thickness, diaphragm mobility or respiratory muscle strength. Conclusions: Increased neural activity and diaphragmatic structure, with a consequent increase in respiratory muscle strength, were identified after IMT with a moderate load. IMT with a load of 10% of MIP cannot be considered a placebo dose, since it increases inspiratory muscle strength, and IMT at moderate intensity is able to enhance the recruitment of diaphragm muscle fibers and promote their hypertrophy. The protocol for carrying out the systematic review was published in The Cochrane Library.

Relevance:

100.00%

Publisher:

Abstract:

VALENTIM, R. A. M.; SOUZA NETO, Plácido Antônio de. O impacto da utilização de design patterns nas métricas e estimativas de projetos de software: a utilização de padrões tem alguma influência nas estimativas? Revista da FARN, Natal, v. 4, p. 63-74, 2006.

Relevance:

100.00%

Publisher:

Abstract:

Smart card applications represent a growing market. Usually this kind of application manipulates and stores critical information that requires some level of security, such as financial or confidential data. The quality and trustworthiness of smart card software can be improved through a rigorous development process that embraces formal software engineering techniques. In this work we propose the BSmart method, a specialization of the B formal method dedicated to the development of Java Card smart card applications. The method describes how a Java Card application can be generated through a B refinement process from its formal abstract specification. The development is supported by a set of tools that automate the generation of some required refinements and the translation to the Java Card client (host) and server (applet) applications. With respect to verification, the method's development process was formalized and verified in the B method, using the Atelier B tool [Cle12a]. We emphasize that the Java Card application is translated from the last refinement stage, named implementation. This translation process was specified in ASF+SDF [BKV08], describing the grammar of both languages (SDF) and the code transformations through rewrite rules (ASF). This specification was an important support during the development of the translator and contributes to the tool's documentation. We also emphasize the KitSmart library [Dut06, San12], an essential component of BSmart, containing models of all 93 classes/interfaces of the Java Card API 2.2.2, of Java/Java Card data types, and machines that can be useful to the specifier but are not part of the standard Java Card library. In order to validate the method, its tool support and KitSmart, we developed an electronic passport application following the BSmart method.
We believe that the results achieved in this work contribute to Java Card development, allowing the generation of complete (client and server components) Java Card applications that are less subject to errors.