28 results for "software product line"
Abstract:
The traceability between models of the requirements and architecture activities is a strategy that aims to prevent loss of information, reducing the gap between these two initial activities of the software life cycle. In the context of Software Product Lines (SPL), it is important to have this support, which allows the correspondence between these two activities while managing variability. In order to address this issue, this paper presents a bidirectional mapping process, defining transformation rules between elements of a goal-oriented requirements model (described in PL-AOVgraph) and elements of an architectural description (defined in PL-AspectualACME). These mapping rules are evaluated through a case study: the GingaForAll SPL. To automate this transformation, we developed the MaRiPLA tool (Mapping Requirements to Product Line Architecture) using Model-Driven Development (MDD) techniques, including the Atlas Transformation Language (ATL) with Ecore metamodels specified jointly with Xtext, a DSL definition framework, and Acceleo, a code generation tool, in the Eclipse environment. Finally, the generated models are evaluated with respect to quality attributes such as variability, derivability, reusability, correctness, traceability, completeness, evolvability and maintainability, extracted from the CAFÉ Quality Model.
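As a rough illustration only (the actual PL-AOVgraph and PL-AspectualACME metamodels and ATL rules are not reproduced here), a minimal Python sketch of a bidirectional mapping rule that preserves variability annotations might look like the following; all element and attribute names are assumptions.

```python
# Hypothetical sketch of a requirements-to-architecture mapping rule in the
# spirit of the ATL rules described above (element names are illustrative,
# not the actual PL-AOVgraph / PL-AspectualACME metamodels).
from dataclasses import dataclass, field

@dataclass
class Goal:                      # PL-AOVgraph-like requirements element (assumed)
    name: str
    variability: str             # e.g. "mandatory", "optional", "alternative"
    subgoals: list = field(default_factory=list)

@dataclass
class Component:                 # PL-AspectualACME-like architectural element (assumed)
    name: str
    variability: str
    ports: list = field(default_factory=list)

def goal_to_component(goal: Goal) -> Component:
    """Forward mapping rule: each goal becomes a component, keeping the
    variability annotation so feature management is preserved."""
    return Component(name=goal.name,
                     variability=goal.variability,
                     ports=[f"{sub.name}_port" for sub in goal.subgoals])

def component_to_goal(comp: Component) -> Goal:
    """Backward mapping rule, giving the bidirectional traceability the
    abstract mentions (again, a simplification of the real rules)."""
    return Goal(name=comp.name, variability=comp.variability)

if __name__ == "__main__":
    g = Goal("MediaPlayback", "optional", [Goal("DecodeVideo", "mandatory")])
    c = goal_to_component(g)
    print(c.name, c.variability, c.ports)   # MediaPlayback optional ['DecodeVideo_port']
```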
Abstract:
This work presents a design method proposed to design and build software components, from the software functional model down to the assembly code level, in a rigorous fashion. The method is based on the B method, which was developed with the support and interest of British Petroleum (BP). One goal of this methodology is to contribute to solving an important problem, known as the Verifying Compiler. In addition, this work describes a formal model of the Z80 microcontroller and of a real system from the petroleum area. To achieve this goal, the formal model of the Z80 was developed and documented, as it is a key component for verification down to the assembly level. In order to improve the methodology, it was applied to a petroleum production test system, which is presented in this work. Part of the technique is performed manually; however, most of these activities can be automated by a specific compiler. To build such a compiler, the formal modelling of the microcontroller and of the production test system should provide relevant knowledge and experience for the design of a new compiler. In summary, this work should improve the viability of one of the most stringent criteria for formal verification: speeding up the verification process, reducing design time and increasing the quality and reliability of the final software product. All these qualities are very important for systems that involve serious risks or require high confidence, which is very common in the petroleum industry.
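To give a concrete flavor of what a formal microcontroller model must capture for verification down to assembly, here is a deliberately simplified, executable sketch of one Z80 instruction's semantics; it is not the B model developed in the thesis, and only the zero and carry flags are modelled.

```python
# Highly simplified sketch of the state change and flag effects of ADD A, B
# on the Z80 accumulator. The thesis uses the B method; this fragment only
# illustrates the kind of semantics such a model pins down.
from dataclasses import dataclass

@dataclass
class Z80State:
    a: int = 0            # accumulator
    b: int = 0            # register B
    flag_z: bool = False  # zero flag
    flag_c: bool = False  # carry flag

def add_a_b(s: Z80State) -> Z80State:
    """Semantics of ADD A, B on an 8-bit accumulator (other flags omitted)."""
    total = s.a + s.b
    return Z80State(a=total & 0xFF,
                    b=s.b,
                    flag_z=(total & 0xFF) == 0,
                    flag_c=total > 0xFF)

assert add_a_b(Z80State(a=0xF0, b=0x20)).flag_c  # 0xF0 + 0x20 overflows 8 bits
```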
Abstract:
This work aims to develop user interfaces for a smartphone application intended to contribute to the efficiency of the activities of physiotherapy professionals and researchers by supporting the clinical monitoring of pain in the treatment of fibromyalgia patients. Following a User-Centered Design (UCD) approach, interviews and a contextual inquiry were carried out for the initial identification of the users' problems and needs. It was found that the monitoring and follow-up of fibromyalgia treatment sessions are traditionally carried out by handling paper forms and records (registering the patient's health conditions) and printed pain rating scales (presented to the patient to indicate the perceived pain at each predetermined body point). The procedures involved in these activities hinder the management of treatment performance, which, according to reports, compromises patients' adherence to and attendance at the sessions. Based on the observation and survey of these professionals' needs in their activities, a smartphone application was proposed with the intention of minimizing the problems caused by the use of conventional tools and of providing quick information about the collected data. Then, following the UCD approach, a conceptual model was developed during the solution design stage, which guided the creation of the prototypes. The evaluation of the prototype interfaces was carried out with user involvement through the cooperative evaluation technique. Its results led to the refinement of the interfaces and to a new interface design proposal as a high-fidelity prototype built for the Android environment. This work is therefore part of the development process of a customized software product focused on the design and evaluation of user interfaces. Through the applied methodology, evidence was observed suggesting that the proposed interfaces constitute a facilitating resource capable of contributing to the efficiency of the activities involved in monitoring the treatment of fibromyalgia patients.
Abstract:
Mortar is a type of adhesive product used on a large scale in construction, owing to its variety and ease of application. Although it is an industrialized product backed by technology in its production, pathologies occur very frequently, causing recurring damage and losses in the construction industry. Faced with this market situation, technical and scientific studies of the effects of the addition of diatomite on the rheological and mechanical behavior of adhesive mortars are needed. This work proposes the use of diatomite as a mineral additive in adhesive mortar formulations, partially replacing cellulose-based additives. The choice of this mineral is based on physical, chemical and rheological properties that justify its use in this product line; it is also a raw material abundant in our region and can thus contribute positively to minimizing the direct costs of cellulose-based additives. The industrial adhesive mortar used for comparison was of type AC1. The adhesive mortar formulations with diatomite kept constant the amounts of sand and cement and the water/cement (w/c) ratio; that is, formulations were developed with 10, 20, 30 and 40% of diatomite substituting part of the cellulose-based additives. These mortars were subjected to tests that define and evaluate the rheological and mechanical behavior of this type of mortar. The results show the best performance for the type AC1 adhesive mortar with partial replacement of 30% of the cellulose-based additive by diatomite.
Abstract:
This work proposes a systematic approach to the model-driven and aspect-oriented management of variability, using mechanisms from Aspect-Oriented Software Development (AOSD) and Model-Driven Development (MDD). The main goal of the approach, named CrossMDA-SPL, is to improve the management, modularization and separation of the variability of SPL architectures at a high level of abstraction (models), in the design and implementation phases of Software Product Line (SPL) development, exploiting the synergy between AOSD and MDD. The CrossMDA-SPL approach defines some base artifacts to advance the clear separation between mandatory and optional features in the SPL architecture. The artifacts are represented by two models: (i) the core model (base domain), responsible for specifying the features common to all members of the SPL; and (ii) the variability model, responsible for representing the variable features of the SPL. In addition, the CrossMDA-SPL approach comprises: (i) guidelines for modeling and representing variability; (ii) the CrossMDA-SPL services and process; and (iii) models of the SPL architecture or of an SPL product instance. The guidelines use the advantages of AOSD and MDD to promote better modularization of the variable features of the SPL architecture during the creation of the core and variability models. The services and sub-processes are responsible for automatically combining, through a transformation process, the core and variability models, and for generating new models that represent the implementation of the SPL architecture or an SPL instance model. Mechanisms are thus provided for the effective modularization of variability in SPL architectures at the model level. The concepts are described and assessed through a case study of an SPL for electronic transport ticket management systems.
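As a hedged illustration of the core/variability model combination described above (not the actual CrossMDA-SPL transformation), the following Python sketch derives one product by weaving selected variants into the core; the feature and element names are invented for the electronic-ticketing domain.

```python
# Illustrative sketch of combining a core model with a variability model to
# derive one product: core elements are always present, and the selected
# variants contribute their own elements.
core_model = {"TicketValidation", "FareCalculation"}          # mandatory features (assumed names)
variability_model = {"NFCPayment": {"NFCReader"},             # optional feature -> elements it adds
                     "SmartCardPayment": {"CardReader"}}

def derive_product(selected_variants: set) -> set:
    """Union of the core with the architectural elements contributed by the
    chosen variants, mimicking the model-to-model merge step."""
    product = set(core_model)
    for variant in selected_variants:
        product |= variability_model[variant]
    return product

print(derive_product({"NFCPayment"}))
# {'TicketValidation', 'FareCalculation', 'NFCReader'} (set order may vary)
```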
Abstract:
Multi-cloud applications are composed of services offered by multiple cloud platforms, where the user/developer has full knowledge of the use of such platforms. The use of multiple cloud platforms avoids the following problems: (i) vendor lock-in, i.e., the application's dependency on a certain cloud platform, which is harmful in case of degradation or failure of platform services, or even of price increases for service usage; (ii) degradation or failure of the application due to fluctuations in the quality of service (QoS) provided by some cloud platform, or even due to the failure of a service. In a multi-cloud scenario it is possible to replace a failed service, or one with QoS problems, by an equivalent service from another cloud platform. For an application to adopt the multi-cloud perspective, it is necessary to create mechanisms able to select which cloud services/platforms should be used in accordance with the requirements defined by the programmer/user. In this context, the major challenges in developing such applications include: (i) choosing which underlying services and cloud computing platforms should be used, based on the user requirements defined in terms of functionality and quality; (ii) continually monitoring dynamic information (such as response time, availability and price) related to cloud services, in addition to the wide variety of services; and (iii) adapting the application if QoS violations affect user-defined requirements. This PhD thesis proposes an approach for the dynamic adaptation of multi-cloud applications, to be applied when a service becomes unavailable or when the requirements set by the user/developer indicate that another available multi-cloud configuration meets them more efficiently. The work proposes a strategy composed of two phases. The first phase consists of modeling the application, exploring the capacity to represent commonalities and variability proposed in the context of the Software Product Line (SPL) paradigm. In this phase an extended feature model is used to specify the cloud service configuration to be used by the application (commonalities) and the different possible providers for each service (variability). Furthermore, the non-functional requirements associated with cloud services are specified as properties in this model, describing dynamic information about these services. The second phase consists of an autonomic process based on the MAPE-K control loop, which is responsible for optimally selecting a multi-cloud configuration that meets the established requirements and for performing the adaptation. The proposed adaptation strategy is independent of the programming technique used to perform the adaptation. In this work we implement the adaptation strategy using several programming techniques, such as aspect-oriented programming, context-oriented programming, and component- and service-oriented programming. Based on the proposed steps, we sought to assess: (i) whether the modeling process and the specification of non-functional requirements can ensure effective monitoring of user satisfaction; (ii) whether the optimal selection process presents significant gains compared to a sequential approach; and (iii) which techniques have the best trade-off when comparing development/modularity effort and performance.
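A minimal sketch of the selection step of the MAPE-K loop described above, assuming invented provider names, prices and availabilities: every multi-cloud configuration (one provider per service) is evaluated, configurations violating the user requirement are discarded, and the cheapest remaining one is chosen.

```python
# Hedged sketch of selecting a multi-cloud configuration from monitored
# properties. Provider names and numbers are illustrative only.
from itertools import product

candidates = {
    "storage":  [{"provider": "cloudA", "price": 5, "availability": 0.999},
                 {"provider": "cloudB", "price": 3, "availability": 0.990}],
    "database": [{"provider": "cloudA", "price": 8, "availability": 0.999},
                 {"provider": "cloudC", "price": 6, "availability": 0.995}],
}
requirements = {"availability": 0.995}   # user-defined non-functional requirement

def select_configuration():
    """Evaluate every configuration (one provider per service), discard those
    violating the requirement, and keep the cheapest feasible one."""
    best, best_price = None, float("inf")
    for combo in product(*candidates.values()):
        if all(c["availability"] >= requirements["availability"] for c in combo):
            price = sum(c["price"] for c in combo)
            if price < best_price:
                best, best_price = combo, price
    return {svc: c["provider"] for svc, c in zip(candidates, best)}, best_price

print(select_configuration())   # ({'storage': 'cloudA', 'database': 'cloudC'}, 11)
```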
Abstract:
Software product line engineering brings advantages over traditional software development regarding the mass customization of system components. However, there are scenarios in which maintaining separate clones of a software system seems to be an easier and more flexible way to manage their variabilities than a software product line. This dissertation qualitatively evaluates an approach that aims to support the reconciliation of functionalities between cloned systems. The analyzed approach is based on mining data about the issues and source code of cloned web systems that evolved independently. The next step is to process the merge conflicts collected by the approach, but not indicated by traditional version control systems, in order to identify potential integration problems between the cloned software systems. The results of the study show the feasibility of the approach to perform a systematic characterization and analysis of merge conflicts for large-scale web-based systems.
Abstract:
In hydrocarbon exploration activities, the great enigma is the location of the deposits. Great efforts are undertaken to better identify and locate them and, at the same time, to improve the cost-effectiveness of oil extraction. Seismic methods are the most widely used because they are indirect, i.e., they probe the subsurface layers without invading them. A seismogram is the representation of the Earth's interior and its structures through a conveniently arranged disposition of the data obtained by seismic reflection. A major problem in this representation is the intensity and variety of the noise present in the seismogram, such as the surface-related noise that contaminates the relevant signals and may mask the desired information brought by waves scattered in deeper regions of the geological layers. A tool was developed to suppress this noise, based on the 1D and 2D wavelet transforms. The program, written in Java, separates seismic images according to the directions (horizontal, vertical, mixed or local) and the wavelength bands that form these images, using Daubechies wavelets, Auto-resolution and the tensor product of wavelet bases. In addition, an option was developed to process a single image using the tensor product of two-dimensional wavelets, or the tensor product of one wavelet by identities. In the latter case, we have the wavelet decomposition of a two-dimensional signal in a single direction. This decomposition makes it possible to stretch the two-dimensional wavelets along a certain direction, correcting scale effects by applying Auto-resolutions. In other words, the treatment of a seismic image was improved by using 1D and 2D wavelets at different Auto-resolution stages. Improvements were also implemented in the display of the images associated with the decompositions at each Auto-resolution stage, facilitating the choice of images containing the signals of interest for noise-free image reconstruction. The program was tested with real data and the results were good.
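As a rough sketch of the directional filtering idea (using the PyWavelets library rather than the Java tool described in the abstract), a seismic image can be decomposed with a 2-D Daubechies wavelet, one detail band zeroed at every resolution level, and the image reconstructed; which band corresponds to the unwanted noise depends on the data, and the horizontal-detail band is zeroed here purely as an example.

```python
# Minimal directional-suppression sketch with PyWavelets (pywt).
import numpy as np
import pywt

def suppress_direction(image: np.ndarray, levels: int = 3) -> np.ndarray:
    coeffs = pywt.wavedec2(image, "db4", level=levels)   # [cA, (cH, cV, cD), ...]
    filtered = [coeffs[0]]
    for cH, cV, cD in coeffs[1:]:
        filtered.append((np.zeros_like(cH), cV, cD))      # drop horizontal details
    return pywt.waverec2(filtered, "db4")

seismogram = np.random.randn(256, 512)   # placeholder for a real seismic section
denoised = suppress_direction(seismogram)
print(denoised.shape)
```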
Abstract:
Nowadays, when market competition demands products of better quality and a constant search for cost savings and better use of raw materials, the pursuit of more efficient control strategies becomes vital. In Natural Gas Processing Units (NGPUs), as in most chemical processes, quality control is accomplished through the composition of the products. However, chemical composition analysis takes a long measurement time, even when performed by instruments such as gas chromatographs. This fact hinders the development of control strategies that could provide a better process yield. Natural gas processing is one of the most important activities in the petroleum industry. The main economic product of an NGPU is liquefied petroleum gas (LPG). LPG is ideally composed of propane and butane; in practice, however, its composition contains contaminants such as ethane and pentane. In this work an inferential system using neural networks is proposed to estimate the ethane and pentane mole fractions in the LPG and the propane mole fraction in the residual gas. The goal is to provide the values of these estimated variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques in order to monitor the LPG quality and to reduce the propane loss in the process. To develop this work, an NGPU was simulated in the HYSYS® software, composed of two distillation columns: a deethanizer and a debutanizer. The inference is performed from the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis is used to decrease the number of network inputs, thus forming a hybrid inferential system. This work also proposes a simple strategy to correct the inferential system in real time, based on the measurements of the chromatographs that may exist in the process under study.
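A hedged sketch of such a hybrid inferential system, written with scikit-learn and synthetic data (the thesis's actual network topology, inputs and training data are not reproduced): PCA reduces the PID-loop variables before a single multilayer network estimates the three compositions at once.

```python
# Illustrative soft sensor: PCA for input reduction + one multilayer network
# with three outputs (ethane and pentane in LPG, propane in residual gas).
# Data shapes and hyperparameters are assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))   # 20 process variables from the column PIDs (placeholder)
Y = rng.normal(size=(500, 3))    # three composition targets (placeholder)

soft_sensor = make_pipeline(
    StandardScaler(),
    PCA(n_components=6),                  # reduce network inputs, as in the hybrid scheme
    MLPRegressor(hidden_layer_sizes=(15,), max_iter=2000, random_state=0),
)
soft_sensor.fit(X, Y)
print(soft_sensor.predict(X[:1]))         # one estimate per minute in the real system
```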
Abstract:
The aim of the present study was to extract vegetable oil from brown linseed (Linum usitatissimum L.), determine its fatty acid levels and the antioxidant capacity of the extracted oil, and perform a rapid economic assessment of the SFE process in the manufacture of the oil. The experiments were conducted in a bench-scale extractor capable of operating with carbon dioxide and co-solvents, following a 2³ factorial design with a center point in triplicate, having process yield as the response variable and pressure, temperature and percentage of co-solvent as independent variables. The yield (mass of extracted oil/mass of raw material used) ranged from 2.2% to 28.8%, with the best results obtained at 250 bar and 50 °C, using 5% (v/v) ethanol as co-solvent. The influence of the variables on extraction kinetics and on the composition of the linseed oil obtained was investigated. The extraction kinetic curves were fitted to different mathematical models available in the literature. The Martínez et al. (2003) model and the Simple Single Plate (SSP) model discussed by Gaspar et al. (2003) represented the experimental data with the lowest mean square errors (MSE). A manufacturing cost of US$ 17.85/kg of oil was estimated for the production of linseed oil using the TECANALYSIS software and the Rosa and Meireles (2005) method. To establish comparisons with SFE, conventional extraction tests were conducted with a Soxhlet device using petroleum ether; these tests obtained mean yields of 35.2% for an extraction time of 5 h. All the oil samples were esterified and characterized in terms of their fatty acid (FA) composition using gas chromatography. The main fatty acids detected were palmitic (C16:0), stearic (C18:0), oleic (C18:1), linoleic (C18:2n-6) and α-linolenic (C18:3n-3). The FA contents obtained with Soxhlet differed from those obtained with SFE, with higher percentages of saturated and monounsaturated FA for the Soxhlet technique using petroleum ether. With respect to the α-linolenic content (the main component of linseed oil) in the samples, SFE performed better than Soxhlet extraction, yielding percentages between 51.18% and 52.71%, whereas with Soxhlet extraction it was 47.84%. The antioxidant activity of the oil was assessed in the β-carotene/linoleic acid system. The percentage of inhibition of the oxidative process reached 22.11% for the SFE oil, but only 6.09% for commercial (cold-pressed) oil, suggesting that the SFE technique better preserves the phenolic compounds present in the seed, which are likely responsible for the antioxidant nature of the oil. In vitro tests with the sample displaying the best antioxidant response were conducted on rat liver homogenate to investigate the inhibition of spontaneous lipid peroxidation, or auto-oxidation, of biological tissue. Linseed oil proved to be more efficient than fish oil (used as standard) in decreasing lipid peroxidation in the liver tissue of Wistar rats, yielding results similar to those obtained with BHT (a synthetic antioxidant). This inhibitory capacity may be explained by the presence of phenolic compounds with antioxidant activity in the linseed oil. The results obtained indicate the need for more detailed studies, given the importance of linseed oil as one of the greatest sources of ω-3 among vegetable oils.
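For reference, the 2³ factorial plan with a center point in triplicate mentioned above can be enumerated as in the sketch below; the coded levels are standard, while the mapping to physical values (pressure, temperature, co-solvent fraction) is only implied by the abstract.

```python
# Enumerating a 2^3 factorial design plus three center-point replicates.
from itertools import product

factors = ["pressure", "temperature", "cosolvent"]
runs = [dict(zip(factors, levels)) for levels in product((-1, +1), repeat=3)]
runs += [dict(zip(factors, (0, 0, 0))) for _ in range(3)]   # center point in triplicate

for i, run in enumerate(runs, 1):
    print(i, run)   # 8 factorial runs + 3 center-point replicates = 11 runs
```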
Abstract:
Model-driven strategies have been used to facilitate product customization in the Software Product Line (SPL) context and to generate the source code of the derived products through variability management. Most of these strategies use a UML (Unified Modeling Language)-based model specification. Despite its wide application, UML-based model specification has some limitations, such as the fact that it is essentially graphical, presents deficiencies regarding the precise description of the semantic representation of the system architecture, and produces large models, thus hampering the visualization and comprehension of the system elements. In contrast, architecture description languages (ADLs) provide graphical and textual support for the structural representation of architectural elements, their constraints and interactions. This thesis introduces ArchSPL-MDD, a model-driven strategy in which models are specified and configured using the LightPL-ACME ADL. This strategy is associated with a generic process with systematic activities that enable customized source code to be generated automatically from the product model. The ArchSPL-MDD strategy integrates aspect-oriented software development (AOSD), model-driven development (MDD) and SPL, thus enabling the explicit modeling as well as the modularization of variabilities and crosscutting concerns. The process is instantiated by the ArchSPL-MDD tool, which supports the specification of domain models (the focus of the development) in LightPL-ACME. ArchSPL-MDD uses the Ginga Digital TV middleware as a case study. In order to evaluate the efficiency, applicability, expressiveness and complexity of the ArchSPL-MDD strategy, a controlled experiment was carried out to evaluate and compare the ArchSPL-MDD tool with the GingaForAll tool, which instantiates the process that is part of the GingaForAll UML-based strategy. Both tools were used to configure the products of the Ginga SPL and to generate the product source code.
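As an illustration of the final code-generation step (the real strategy uses LightPL-ACME models and Acceleo templates, neither of which is shown here), the following Python sketch mimics template-based generation from a configured product model, with made-up component names.

```python
# Template expansion from a configured product model (illustrative only).
from string import Template

component_template = Template(
    "public class $name {\n"
    "    // generated for variant: $variant\n"
    "}\n"
)

product_model = [                         # configured product (assumed elements)
    {"name": "Tuner",    "variant": "mandatory"},
    {"name": "GingaNCL", "variant": "optional, selected"},
]

for element in product_model:
    print(component_template.substitute(element))
```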
Abstract:
In production lines, the entire process is subject to unexpected events that may compromise production quality and thus mean losses to the manufacturer. Identifying such causes and removing them is the task of process management. The on-line control system consists of the periodic inspection of produced items. Once an inspected item is qualified as non-conforming, it is assumed that a change in the conforming fraction of the items has occurred, and the process is stopped for adjustment. This work is an extension of Quinino & Ho (2010), and its main objective is to monitor a process through on-line quality control based on the number of non-conformities in the inspected item. The decision strategy used to verify whether the process is under control is directly associated with the limits of the control chart for the number of non-conformities of the process. A policy of preventive adjustments is incorporated in order to increase the conforming fraction of the process. With the help of the R software, a sensitivity analysis of the proposed model is carried out, showing in which situations it is most worthwhile to execute the preventive adjustment.
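A minimal sketch of the decision rule implied above, assuming a classical 3-sigma control chart for the number of non-conformities (a c-chart); the inspection counts are made up, and the thesis's actual limits and adjustment policy may differ.

```python
# c-chart sketch: flag inspections whose non-conformity count falls outside
# the control limits, signalling that the process should be stopped.
import math

counts = [2, 1, 3, 0, 2, 4, 1, 2, 9, 1]          # non-conformities per inspected item (made up)
c_bar = sum(counts) / len(counts)
ucl = c_bar + 3 * math.sqrt(c_bar)                # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))      # lower control limit (floored at zero)

for i, c in enumerate(counts, 1):
    if not (lcl <= c <= ucl):
        print(f"inspection {i}: {c} non-conformities -> stop the process for adjustment")
```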