876 results for software quality assurance


Relevance:

30.00%

Publisher:

Abstract:

Among the potentially polluting economic activities that compromise groundwater quality are gas stations. The city of Natal has about 120 gas stations, of which only a small number hold an environmental license to operate. Non-compliant stations were notified by the Public Prosecutor's Office (Ministério Público) of Rio Grande do Norte to carry out environmental adaptations, among them the investigation of environmental liabilities. The preliminary and confirmatory stages of this investigation consisted of soil gas surveys combined with confirmatory chemical analyses of BTEX, PAH, and TPH. To properly evaluate and interpret the results obtained in the field, a three-dimensional representation of them became necessary. CAD software was used to model the equipment installed at a retail fuel service station in Natal, as well as the plumes of contamination by volatile organic compounds. With this tool, it was concluded that the contamination does not come from the current underground fuel storage system, but reflects the site's history, in which leaking gasoline and diesel tanks had been removed.

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, the importance of using software processes is well consolidated and considered fundamental to the success of software development projects. Large and medium-sized software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures, and scales is a recurrent challenge in the software industry. It involves adapting software process models to the reality of each project, and it must also promote the reuse of past experience in the definition and development of software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the software systems produced. This work explored the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review was conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines was proposed and developed; and finally (iii) empirical studies and a controlled experiment assessed and compared the proposed annotative approach against a compositional one. One study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. Another, a comparative quantitative study, considered internal attributes of the specification of software process lines, such as modularity, size, and complexity. Finally, a controlled experiment evaluated the effort to use, and the understandability of, the investigated approaches when modeling and evolving specifications of software process lines. The studies provide evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, to assist the variability management of software process lines.
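The annotative approach under study attaches variability information directly to process elements. As a minimal sketch (the activity names and annotation scheme here are illustrative, not the dissertation's actual notation), deriving a process from a process line can be pictured as filtering activities by enabled features:

```python
# Minimal sketch of annotative variability resolution for a software
# process line. Activities are annotated with the features that must be
# selected for them to appear in a derived process.

def resolve(process_line, enabled_features):
    """Keep only the activities whose feature annotations are all enabled."""
    return [
        activity
        for activity, required in process_line
        if required <= enabled_features  # set inclusion: every annotation is on
    ]

# A toy process line: (activity, required feature annotations).
process_line = [
    ("elicit requirements", set()),          # mandatory: no annotation
    ("formal specification", {"critical"}),  # only in safety-critical projects
    ("code review",          {"quality"}),
    ("model checking",       {"critical", "quality"}),
]

derived = resolve(process_line, {"quality"})
print(derived)  # ['elicit requirements', 'code review']
```

A compositional approach would instead store each variant as a separate module and merge the selected ones, which is the trade-off the studies above compare.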

Relevance:

30.00%

Publisher:

Abstract:

Traceability between models from the requirements and architecture activities is a strategy that aims to prevent loss of information, reducing the gap between these two initial activities of the software life cycle. In the context of Software Product Lines (SPL), it is important to have support that establishes the correspondence between these two activities while managing variability. To address this issue, this work presents a bidirectional mapping process, defining transformation rules between elements of a goal-oriented requirements model (described in PL-AOVgraph) and elements of an architectural description (defined in PL-AspectualACME). These mapping rules are evaluated using a case study: the GingaForAll SPL. To automate the transformation, the MaRiPLA tool (Mapping Requirements to Product Line Architecture) was developed using Model-Driven Development (MDD) techniques, including the Atlas Transformation Language (ATL) with Ecore metamodel specifications, together with Xtext, a DSL definition framework, and Acceleo, a code generation tool, in the Eclipse environment. Finally, the generated models are evaluated against quality attributes such as variability, derivability, reusability, correctness, traceability, completeness, evolvability, and maintainability, extracted from the CAFÉ Quality Model.
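A transformation rule in such a mapping pairs a requirements element type with the architectural element it produces. The sketch below mimics the idea in plain Python; the element kinds and the goal-to-component pairing are assumptions for illustration only, since the real rules are written in ATL over Ecore metamodels:

```python
# Illustrative rule-based model-to-model mapping, in the spirit of a
# requirements-to-architecture transformation. The rule table below is
# hypothetical, not the actual PL-AOVgraph -> PL-AspectualACME rules.

RULES = {
    "goal":     lambda e: {"kind": "component", "name": e["name"]},
    "softgoal": lambda e: {"kind": "component", "name": e["name"]},
    "task":     lambda e: {"kind": "connector", "name": e["name"]},
}

def transform(requirements_model):
    """Apply the rule registered for each element's kind."""
    return [RULES[e["kind"]](e) for e in requirements_model if e["kind"] in RULES]

reqs = [{"kind": "goal", "name": "RecordMedia"},
        {"kind": "task", "name": "CapturePhoto"}]
print(transform(reqs))
# [{'kind': 'component', 'name': 'RecordMedia'},
#  {'kind': 'connector', 'name': 'CapturePhoto'}]
```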

Relevance:

30.00%

Publisher:

Abstract:

Software Product Lines (SPL) is a software engineering approach to developing families of software systems that share common features and differ in other features according to the products required. Adopting the SPL approach can bring several benefits, such as cost reduction and improved product quality, productivity, and time to market. On the other hand, the SPL approach raises new challenges for software evolution that must be considered. Recent research has explored and proposed automated approaches, based on code analysis and traceability techniques, for change impact analysis in the context of SPL development. These approaches still have limitations, such as the difficulty of customizing the analysis to address different change impact analysis strategies, and of analyzing the impact of changes on fine-grained variability. This dissertation proposes a change impact analysis tool for SPL development, called Squid Impact Analyzer. The tool performs change impact analysis based on information from variability modeling, the mapping of variabilities to code assets, and the dependency relationships between code assets. The tool is assessed in an experiment that compares its change impact analysis results with the real changes applied across several evolution releases of an SPL for media management on mobile devices.
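Dependency-based impact analysis of the kind described above can be sketched as a transitive traversal of a dependents graph. The asset names below are hypothetical, and the real tool also combines variability-to-asset mappings:

```python
# Sketch of change impact analysis over code-asset dependencies:
# starting from the changed assets, collect everything that
# (transitively) depends on them.

from collections import deque

def impacted(dependents, changed):
    """Transitively collect every asset affected by a changed asset.

    `dependents` maps an asset to the assets that directly depend on it.
    """
    seen = set(changed)
    queue = deque(changed)
    while queue:
        asset = queue.popleft()
        for dep in dependents.get(asset, ()):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

# Toy SPL code base: Photo and Music depend on MediaCore,
# and MediaList depends on Photo.
dependents = {
    "MediaCore": ["Photo", "Music"],
    "Photo": ["MediaList"],
}
print(sorted(impacted(dependents, {"MediaCore"})))
# ['MediaCore', 'MediaList', 'Music', 'Photo']
```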

Relevance:

30.00%

Publisher:

Abstract:

The main goal of Regression Testing (RT) is to reuse the test suite of the latest version of a software system in its current version, in order to maximize the value of the tests already developed and to ensure that old features continue to work after new changes. Even with reuse, it is common that not all tests need to be executed again. For that reason, Regression Test Selection (RTS) techniques are encouraged; they aim to select, from the full suite, only the tests that reveal faults, which reduces costs and makes RTS an attractive practice for testing teams. Several recent studies evaluate the quality of the selections performed by RTS techniques, identifying which technique presents the best results as measured by metrics such as inclusion and precision. Ideally, RTS techniques would find, in the System Under Test (SUT), the tests that reveal faults. Because that problem has no feasible general solution, they instead look for tests that reveal changes, where faults may occur. These changes, however, may modify the execution flow of the program itself, so that some tests no longer exercise the same code. In this context, this dissertation investigates whether changes performed in a SUT affect the quality of the test selection performed by an RTS technique and, if so, which characteristics of the changes cause errors, leading the technique to wrongly include or exclude tests. For this purpose, a tool was developed in Java to automate the measurement of the inclusion and precision averages achieved by an RTS technique for a given kind of change. To validate this tool, an empirical study evaluated Pythia, an RTS technique based on textual differencing, on a large web information system, analyzing the types of maintenance tasks performed to evolve the SUT.
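The inclusion and precision metrics mentioned above have simple set definitions (following Rothermel and Harrold's evaluation framework): inclusion is the share of modification-revealing tests that the technique selected, and precision is the share of selected tests that are actually modification-revealing. A minimal sketch with invented test names:

```python
# Inclusion and precision of a regression test selection, computed from
# the selected test set and the set of modification-revealing tests.

def inclusion(selected, revealing):
    """Share of modification-revealing tests that were selected."""
    return len(selected & revealing) / len(revealing) if revealing else 1.0

def precision(selected, revealing):
    """Share of selected tests that actually reveal a modification."""
    return len(selected & revealing) / len(selected) if selected else 1.0

selected = {"t1", "t2", "t3"}
revealing = {"t2", "t3", "t4"}
print(inclusion(selected, revealing))  # 2/3: t4 was wrongly excluded
print(precision(selected, revealing))  # 2/3: t1 was wrongly included
```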

Relevance:

30.00%

Publisher:

Abstract:

The current Internet has been suffering from several problems in terms of scalability, performance, mobility, etc., due to the steep increase in the number of users and the emergence of new services with new demands, thus fostering the so-called Future Internet. New proposals on content-oriented networks, such as the Entity Title Architecture (ETArch), provide new services for this kind of scenario, implemented over the software-defined networking paradigm. However, ETArch's transport model is equivalent to the best-effort model of the current Internet, which limits the reliability of its communications. In this work, ETArch is redesigned following the resource over-provisioning paradigm to achieve advanced resource allocation integrated with OpenFlow. The resulting framework, SMART (Support of Mobile Sessions with High Transport Network Resource Demand), allows the network to semantically define the qualitative requirements of each session in order to manage Quality of Service control, aiming to maintain the best possible Quality of Experience. The evaluation of the data and control planes took place on the testbed island of the OFELIA project, showing support for mobile multimedia applications with high transport resource demand, with QoS and QoE guaranteed through a restricted signalling scheme in comparison with legacy ETArch.

Relevance:

30.00%

Publisher:

Abstract:

This work presents a design method proposed to build software components, from the functional software model down to the assembly code level, in a rigorous fashion. The method is based on the B method, which was developed with the support and interest of British Petroleum (BP). One goal of this methodology is to contribute to solving an important problem known as the Verifying Compiler. In addition, this work describes a formal model of the Z80 microcontroller and of a real system from the petroleum field. To achieve this goal, the formal model of the Z80 was developed and documented, as it is a key component for verification down to the assembly level. In order to exercise and improve the methodology, it was applied to a petroleum production test system, which is presented in this work. Part of the technique is performed manually; however, most of these activities can be automated by a specific compiler. To build such a compiler, the formal modelling of the microcontroller and of the production test system should provide relevant knowledge and experience for its design. In summary, this work should improve the viability of one of the most stringent criteria for formal verification: speeding up the verification process, reducing design time, and increasing the quality and reliability of the final software product. All of these qualities are very important for systems that involve serious risks or require high confidence, which is very common in the petroleum industry.
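A formal model of a processor pins down the register state and the effect of each instruction. As a flavor of that, here is a tiny executable sketch covering two Z80 instructions; it is an illustration in Python, far simpler than a B specification, though the opcodes (0x3C for INC A, 0xC3 for JP nn) follow the real Z80 instruction set:

```python
# Executable micro-model of a two-instruction Z80 subset:
# register state is a dict, and step() applies one instruction at PC.

def step(state, memory):
    """Execute the instruction at PC, mutating `state` in place."""
    opcode = memory[state["pc"]]
    if opcode == 0x3C:                        # INC A
        state["a"] = (state["a"] + 1) & 0xFF  # 8-bit wrap-around
        state["pc"] += 1
    elif opcode == 0xC3:                      # JP nn (little-endian operand)
        state["pc"] = memory[state["pc"] + 1] | (memory[state["pc"] + 2] << 8)
    else:
        raise NotImplementedError(hex(opcode))

state = {"a": 0xFF, "pc": 0}
memory = [0x3C, 0xC3, 0x00, 0x00]  # INC A; JP 0x0000
step(state, memory)  # A wraps from 0xFF to 0x00, PC advances to 1
step(state, memory)  # JP jumps back to address 0
print(state)  # {'a': 0, 'pc': 0}
```

A B specification of the same machine would state these transitions as operations with preconditions, letting refinement proofs connect them to generated assembly.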

Relevance:

30.00%

Publisher:

Abstract:

This study aimed to evaluate the effects of the levels and sources of trace minerals supplemented in diets fed to semi-heavy layers in their second laying cycle on the quality of eggs stored for 14 days at different temperatures. The experimental diets consisted of the inclusion of inorganic trace minerals (T1, control: 100% ITM) and five supplementation levels of organic trace minerals (carbo-amino-phospho chelates) (110, 100, 90, 80, and 70% OTM). Trace mineral inclusion levels (mg/kg feed) were: T1, control, 100% ITM: Zn (54), Fe (54), Mn (72), Cu (10), I (0.61), Se (0.3); T2, 110% OTM: Zn (59.4), Fe (59.4), Mn (79.2), Cu (11.88), I (1.21), Se (0.59); T3, 100% OTM: Zn (54), Fe (54), Mn (72), Cu (10.8), I (1.10), Se (0.54); T4, 90% OTM: Zn (48.6), Fe (48.6), Mn (64.8), Cu (9.72), I (0.99), Se (0.49); T5, 80% OTM: Zn (43.2), Fe (43.2), Mn (57.6), Cu (8.64), I (0.88), Se (0.43); T6, 70% OTM: Zn (37.8), Fe (37.8), Mn (50.4), Cu (7.56), I (0.77), Se (0.38). A completely randomized experimental design in a split-plot arrangement, with 60 treatments of four replicates each, was applied. The combination of the six diets and the storage temperature (room temperature or refrigeration) was randomized in the plots, whereas the sub-plots consisted of the storage times (0, 3, 7, 10, and 14 days). Data were submitted to analysis of variance of a split-plot-in-time model using the SAS (2000) software package at the 5% probability level. It was concluded that the 70% OTM supplementation can be used with no damage to egg quality, independently of storage temperature or time. The quality of refrigerated eggs stored for up to 14 days is better than that of eggs stored at room temperature.
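The OTM levels in the diets above are direct percentage scalings of the control amounts; a quick arithmetic check on the zinc column reproduces the reported values:

```python
# Verify that the T2-T6 zinc levels are percentage scalings of the
# 100% baseline (54 mg/kg feed), as the treatment labels state.

base_zn = 54.0  # mg/kg feed, zinc in the 100% control diet
levels = [110, 100, 90, 80, 70]  # % OTM supplementation, T2..T6
zn_otm = [round(base_zn * pct / 100, 1) for pct in levels]
print(zn_otm)  # [59.4, 54.0, 48.6, 43.2, 37.8] -- matches the table
```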

Relevance:

30.00%

Publisher:

Abstract:

An experiment was carried out to establish mean bone quality values for the tibiae and femora of ostriches and to evaluate these bones. The right leg bones of 10 male and 10 female African Black ostriches were evaluated. Birds were radiographed immediately after slaughter (during bleeding) with the aid of a portable X-ray apparatus. The radiographs were scanned, and mean bone mineral density was obtained using software. Bone strength, Seedor index, and dry matter percentage were evaluated and correlated with weight gain during the finishing period (3-13 months of age). Mean values of the evaluated bone quality traits, not previously found in the literature, were established. There were no significant differences between males and females in performance or bone quality parameters. It was concluded that male and female ostriches present similar performance and bone quality at slaughter age.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

The objective of this research was to evaluate the average daily gain (ADG), carcass traits, meat tenderness, and the profitability of feedlot cattle fed different oilseeds and vitamin E. A total of 40 Red Norte young bulls with an initial average body weight of 339±15 kg were used. The experimental design was completely randomized in a 2 × 2 factorial arrangement. The experiment lasted 84 days, and the experimental diets contained soybeans or cottonseed as lipid sources, associated or not with daily supplementation of 2,500 IU of vitamin E per animal. The concentrate:roughage ratio was 60:40. The diets had the same amounts of nitrogen (13% CP) and ether extract (6.5%). Data were analyzed with the statistical software SAS 9.1. Neither vitamin supplementation nor lipid source affected ADG, and there was no interaction between lipid source and vitamin supplementation for the variables studied. The inclusion of cottonseed reduced carcass yield. There was no effect of diet on hot and cold carcass weights or prime cuts. The inclusion of cottonseed reduced backfat thickness, and no effect of the experimental diets on rib-eye area was observed. There was no effect of lipid source or vitamin supplementation on meat tenderness, which was, however, affected by ageing time. Diets with soybeans had a higher cost per animal, and the use of soybeans reduced the gross margin (R$ 59.17 and R$ 60.51 for the soybean diets with and without supplemental vitamin, respectively, vs. R$ 176.42 and R$ 131.79 for the cottonseed diets). The use of cottonseed thus improves the profitability of feedlot fattening, in spite of negatively affecting some carcass characteristics.

Relevance:

30.00%

Publisher:

Abstract:

This work presents a software package developed to process solar radiation data. It can be used in meteorological and climatic stations, and also to support solar radiation measurements in research on solar energy availability, allowing data quality control, statistical calculations, and validation of models, as well as easy interchange of data. (C) 1999 Elsevier B.V. Ltd. All rights reserved.
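One common form of quality control on solar radiation data is a physical-limits test on irradiance samples. The rule below is a generic sketch with an assumed flagging function, not the actual checks implemented by the software described above:

```python
# Flag solar radiation samples outside physically possible limits:
# irradiance cannot be negative, and (before geometry corrections)
# cannot exceed the solar constant of roughly 1367 W/m2.

SOLAR_CONSTANT = 1367.0  # W/m2, top-of-atmosphere irradiance

def flag_suspect(samples, upper=SOLAR_CONSTANT):
    """Return the indices of samples outside the [0, upper] range."""
    return [i for i, value in enumerate(samples) if not (0.0 <= value <= upper)]

readings = [0.0, 412.5, 980.2, -3.1, 2100.0]  # W/m2, illustrative values
print(flag_suspect(readings))  # [3, 4]: a negative spike and an impossible peak
```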

Relevance:

30.00%

Publisher:

Abstract:

This paper presents GIS_EM (Geographic Information System for Environmental Monitoring), a program devised to manage geographic information for monitoring soil, surface water, and groundwater, developed for use in the Health, Safety, and Environment Division of the Paulinia Refinery. The program enables the registration and management of alphanumeric information on specific themes, such as drilling performed for sample collection and for the installation of monitoring wells, geophysical and other tests, and the results of chemical analyses of soil, surface water, and groundwater, as well as reference values providing orientation on soil and water quality, such as the EPA values, the Dutch List, etc. These themes are managed by means of alphanumeric search tools with specific filters and, in the case of spatial search, through the selection of spatial elements (themes) in map view. Documents existing in digital form, such as reports, photos, and maps, may be registered and managed in the network environment. Since the system centralizes the information generated by environmental investigations, it expedites access to and search of documents produced and stored in the network environment, minimizing search time and the need to file printed documents. This is an abstract of a paper presented at the AIChE Annual Meeting and Fall Showcase (Cincinnati, OH, 10/30/2005-11/4/2005).
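The kind of query such a system supports can be sketched as comparing analysis results against a table of orientation values and reporting exceedances. The analyte names and limits below are hypothetical, not the program's real schema or the actual EPA/Dutch List values:

```python
# Compare a groundwater sample against reference (orientation) values
# and report which analytes exceed their limits.

reference = {"benzene": 5.0, "toluene": 700.0}  # ug/L, hypothetical limits

def exceedances(sample):
    """Return {analyte: (measured, limit)} for every limit exceeded."""
    return {
        analyte: (value, reference[analyte])
        for analyte, value in sample.items()
        if analyte in reference and value > reference[analyte]
    }

well_42 = {"benzene": 12.3, "toluene": 150.0, "xylene": 80.0}
print(exceedances(well_42))  # benzene exceeds its limit; toluene does not
```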

Relevance:

30.00%

Publisher:

Abstract:

This article presents a software architecture for a web-based system to aid project management, conceptually founded on the guidelines of the Project Management Body of Knowledge (PMBoK) and on ISO/IEC 9126, as well as on the results of an empirical study carried out in Brazil. Based on these guidelines, the study focused on two different points of view on project management: that of those who develop software systems to aid management, and that of those who use such systems. The designed software architecture is capable of guiding the incremental development of a quality system that satisfies today's market needs, principally those of small and medium-sized enterprises.

Relevance:

30.00%

Publisher:

Abstract:

This work aims to implement an intelligent computational tool to identify non-technical losses and to select their most relevant features, using information from a database with the profiles of industrial consumers of a power company. The solution to this problem is neither trivial nor of merely regional interest: minimizing non-technical losses helps guarantee investments in product quality and in the maintenance of power systems, in the competitive environment introduced after the privatization period on the national scene. This work uses the WEKA software for the proposed objective, comparing various classification techniques and optimization through intelligent algorithms; in this way, it becomes possible to automate applications on Smart Grids. © 2012 IEEE.
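Feature selection of the kind WEKA offers (for instance, its information-gain attribute evaluator) ranks attributes by how much they reduce class entropy. The pure-Python sketch below computes information gain for a toy consumer table; the attributes and labels are invented for illustration:

```python
# Rank features by information gain: the reduction in label entropy
# obtained by splitting the data on one feature.

from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, feature_index):
    """Entropy reduction from splitting the rows on one feature."""
    total = entropy(labels)
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[feature_index], []).append(label)
    remainder = sum(len(sub) / len(labels) * entropy(sub)
                    for sub in by_value.values())
    return total - remainder

# feature 0: contract demand band; feature 1: meter age band (both invented)
rows = [("high", "old"), ("high", "new"), ("low", "old"), ("low", "new")]
labels = ["fraud", "fraud", "ok", "ok"]
print(info_gain(rows, labels, 0))  # 1.0: demand band separates the classes
print(info_gain(rows, labels, 1))  # 0.0: meter age is uninformative
```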