953 results for Reverse engineering processes
Abstract:
Detecting bugs as early as possible plays an important role in ensuring software quality before shipping. We argue that mining previous bug fixes can yield useful knowledge about why bugs happen and how they are fixed. In this paper, we mine the change history of 717 open source projects to extract bug-fix patterns. We also manually inspect many of the bugs we found to gain insight into the contexts and reasons behind them. For instance, we found that missing null checks and missing initializations are highly recurrent, and we believe they can be detected and fixed automatically.
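The kind of pattern mining described here can be illustrated with a toy script. Below is a minimal sketch, assuming fix commits are available as unified-diff text, that counts added null checks; the regex and the example diff are illustrative stand-ins, not the paper's actual tooling.

```python
import re

# Heuristic: a fix that adds a line like "if (x == null)" or "if (x != null)"
# is counted as a "missing null check" repair. Purely illustrative.
NULL_CHECK = re.compile(r"^\+\s*if\s*\(\s*\w+\s*[!=]=\s*null\s*\)")

def count_null_check_fixes(diff_text: str) -> int:
    """Count added null-check lines in a unified diff of a bug fix."""
    return sum(1 for line in diff_text.splitlines() if NULL_CHECK.match(line))

example_diff = """\
+ if (user == null) {
+     return;
+ }
  user.save();
"""
print(count_null_check_fixes(example_diff))  # -> 1
```

Run over the diffs of many fix commits, counters like this one give a rough frequency ranking of recurring bug-fix patterns.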
Abstract:
While coronary atherosclerosis is a leading cause of mortality, evaluation of coronary lesions was previously limited to either indirect angiographic assessment of the lumen silhouette or post-mortem investigations. Intracoronary (IC) imaging modalities have been developed that allow for visualization and characterization of coronary atheroma in living patients. Used alone or in combination, these modalities have enhanced our understanding of pathobiological mechanisms of atherosclerosis, identified factors responsible for disease progression, and documented the ability of various medications to reverse the processes of plaque growth and destabilization. These methodologies have established a link between in vivo plaque characteristics and subsequent coronary events, thereby improving individual risk stratification, paving the way for risk-tailored systemic therapies and raising the possibility of pre-emptive interventions. Moreover, IC imaging is increasingly used during coronary interventions to support therapeutic decision-making in angiographically inconclusive disease, guide and optimize procedural results in selected lesion and patient subsets, and unravel mechanisms underlying stent failure. This review aims to summarize current evidence regarding the role of IC imaging for diagnosis and risk stratification of coronary atherosclerosis, and to describe its clinical role in guiding percutaneous coronary interventions. Future perspectives for in-depth plaque characterization using novel techniques and multimodality imaging approaches are also discussed.
Abstract:
Software testing is a key aspect of software reliability and quality assurance in a context where software development constantly has to overcome mammoth challenges in a continuously changing environment. One characteristic of software testing is that it has a large intellectual capital component and can thus benefit from the experience gained in past projects. Software testing can therefore potentially benefit from solutions provided by the knowledge management discipline. There are in fact a number of proposals concerning effective knowledge management for several software engineering processes. Objective: We defend the use of a lessons-learned system for software testing. The reason is that such a system is an effective knowledge management resource that enables testers and managers to take advantage of the experience locked away in the brains of the testers. To do this, the experience has to be gathered, disseminated and reused. Method: After analyzing the proposals for managing software testing experience, significant weaknesses were detected in current systems of this type. The architectural model proposed here for lessons-learned systems is designed to avoid these weaknesses. This model (i) defines the structure of the software testing lessons learned; (ii) sets up procedures for managing the lessons learned; and (iii) supports the design of software tools to manage the lessons learned. Results: A different approach, based on managing the lessons learned that software testing engineers gather from everyday experience, with two basic goals: usefulness and applicability. Conclusion: The architectural model proposed here lays the groundwork for overcoming the obstacles to sharing and reusing experience gained in software testing and test management. As such, it provides guidance for developing software testing lessons-learned systems.
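As a rough illustration of the gather/disseminate/reuse cycle, here is a minimal sketch of a lessons-learned record and store; the fields and the tag-based lookup are assumptions chosen for illustration, not the architectural model the paper proposes.

```python
from dataclasses import dataclass, field

@dataclass
class Lesson:
    """One software-testing lesson learned (illustrative fields)."""
    title: str
    context: str          # project/test situation where it was gathered
    recommendation: str   # what to do (or avoid) next time
    tags: set[str] = field(default_factory=set)

class LessonStore:
    def __init__(self):
        self._lessons: list[Lesson] = []

    def gather(self, lesson: Lesson) -> None:
        """Capture a new lesson from everyday testing experience."""
        self._lessons.append(lesson)

    def reuse(self, *tags: str) -> list[Lesson]:
        """Disseminate: return lessons matching any requested tag."""
        wanted = set(tags)
        return [les for les in self._lessons if les.tags & wanted]

store = LessonStore()
store.gather(Lesson("Flaky UI tests", "nightly regression suite",
                    "Retry once, then quarantine and file a ticket",
                    {"ui", "flaky"}))
print([les.title for les in store.reuse("flaky")])
```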
Abstract:
Owing to advances in information technology in general, and databases in particular, data storage devices are becoming cheaper and data processing speeds are increasing. As a result, organizations tend to store large volumes of data holding great potential information. Decision Support Systems (DSS) try to use the stored data to obtain information that is valuable to organizations. In this paper, we use both data models and use cases to represent the functionality of data processing in DSS, following software engineering processes. We propose a methodology for developing DSS in the Analysis phase, with respect to data processing modeling. As a starting point, we have used a data model adapted to the semantics of multidimensional databases, or data warehouses (DW). We have also taken an algorithm that provides all the possible ways to automatically cross-check multidimensional model data. Using the above, we propose use case diagrams and descriptions, which can be considered patterns representing DSS functionality with regard to processing the DW data on which the DSS is based. We highlight the reusability and automation benefits that can be achieved, and we believe this study can serve as a guide for the development of DSS.
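The "all possible ways to cross-check multidimensional data" idea can be sketched by enumerating roll-ups over every subset of dimensions; the fact table and dimension names below are invented for illustration.

```python
from itertools import combinations
from collections import defaultdict

# Toy fact table: (store, product, month) -> units sold (invented data).
facts = [
    ("madrid", "widget", "jan", 10),
    ("madrid", "gadget", "jan", 4),
    ("leeds",  "widget", "feb", 7),
]
dims = ("store", "product", "month")

def rollup(keep):
    """Aggregate the measure over every dimension not in `keep`."""
    idx = [dims.index(d) for d in keep]
    totals = defaultdict(int)
    for *coords, units in facts:
        totals[tuple(coords[i] for i in idx)] += units
    return dict(totals)

# Every subset of dimensions = one way of crossing the multidimensional data.
for r in range(len(dims) + 1):
    for keep in combinations(dims, r):
        print(keep, rollup(keep))
```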
Abstract:
Linked Data is the key paradigm of the Semantic Web, a new generation of the World Wide Web that promises to bring meaning (semantics) to data. A large number of both public and private organizations have published their own data following the Linked Data principles, or have done so with data from other organizations. In this context, since the generation and publication of Linked Data are intensive engineering processes that require close attention in order to achieve high quality, and since experience has shown that existing general guidelines are not always sufficient for every domain, this paper presents a set of guidelines for generating and publishing Linked Data in the context of energy consumption in buildings (one aspect of Building Information Models). These guidelines offer a comprehensive description of the tasks to perform, including a list of steps, tools that help in achieving each task, alternative ways of performing each task, and best practices and recommendations. Furthermore, this paper presents a complete example of the generation and publication of Linked Data about energy consumption in buildings, following the presented guidelines, in which the energy consumption data of council sites (e.g., buildings and lights) within the Leeds City Council jurisdiction have been generated and published as Linked Data.
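A minimal sketch of the generation step using the rdflib library; the namespace URI, property names, and meter reading below are placeholders, not the actual vocabulary used for the Leeds City Council data.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

# Placeholder vocabulary; real guidelines would reuse standard ontologies.
EX = Namespace("http://example.org/energy#")

g = Graph()
g.bind("ex", EX)

site = EX["council-site-42"]
g.add((site, RDF.type, EX.CouncilSite))
g.add((site, EX.energyConsumptionKWh, Literal(1250.5, datatype=XSD.decimal)))
g.add((site, EX.observationMonth, Literal("2015-06", datatype=XSD.gYearMonth)))

# Serialize the generated triples as Turtle, ready for publication.
print(g.serialize(format="turtle"))
```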
Abstract:
This paper introduces a novel MILP approach for the design of distillation column sequences for zeotropic mixtures that explicitly includes everything from conventional to fully thermally coupled sequences, as well as divided wall columns with a single wall. The model is based on two superstructure levels. At the upper level, a superstructure that includes all the basic sequences of separation tasks is postulated. The lower level is an extended tree that explicitly includes the different thermal states and compositions of the feed to a given separation task. In that way, it is possible to optimize a priori all the possible separation tasks involved in the superstructure. A set of logical relationships connects the feasible sequences with the optimized tasks in the extended tree, resulting in an MILP that selects the optimal sequence. The performance of the model in terms of robustness and computational time is illustrated with several examples.
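The selection logic (choosing a feasible sequence implies paying for its pre-optimized tasks) can be sketched as a tiny MILP with the PuLP library; the sequences, tasks, and task costs are invented numbers standing in for the extended-tree results.

```python
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

# Pre-optimized cost of each separation task (stand-ins for the extended tree).
task_cost = {"AB/C": 90, "A/B": 60, "A/BC": 80, "B/C": 55}
# Each candidate sequence requires a set of tasks.
sequences = {"direct": ["A/BC", "B/C"], "indirect": ["AB/C", "A/B"]}

prob = LpProblem("column_sequence", LpMinimize)
use_seq = {s: LpVariable(f"seq_{s}", cat=LpBinary) for s in sequences}
use_task = {t: LpVariable(f"task_{t.replace('/', '_')}", cat=LpBinary)
            for t in task_cost}

prob += lpSum(task_cost[t] * use_task[t] for t in task_cost)  # total cost
prob += lpSum(use_seq.values()) == 1                          # pick one sequence
for s, tasks in sequences.items():
    for t in tasks:
        prob += use_task[t] >= use_seq[s]  # logic: sequence implies its tasks

prob.solve()
print([s for s, v in use_seq.items() if v.value() == 1])  # -> ['direct']
```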
Abstract:
The optimal integration of heat and work may significantly reduce the energy demand and, consequently, the process cost. This paper introduces a new mathematical model for the simultaneous synthesis of heat exchanger networks (HENs) in which the pressure levels of the process streams can be adjusted to enhance heat integration. A superstructure is proposed for the HEN design with pressure recovery, developed via generalized disjunctive programming (GDP) and a mixed-integer nonlinear programming (MINLP) formulation. The process conditions (stream temperatures and pressures) are optimized simultaneously. Furthermore, the approach allows for coupling of the turbines and compressors and selection of the turbines and valves to minimize the total annualized cost, which consists of the operational and capital expenses. The model is tested for its applicability in three case studies, including a cryogenic application. The results indicate that the energy integration reduces the quantity of utilities required, thus decreasing the overall cost.
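A minimal Pyomo GDP sketch of one such pressure-recovery decision (expand a stream through a work-recovering turbine, or through a valve); the cost and work figures are arbitrary placeholders, and a MILP solver such as CBC is assumed to be installed.

```python
from pyomo.environ import (ConcreteModel, Constraint, Objective, SolverFactory,
                           TransformationFactory, Var, minimize)
from pyomo.gdp import Disjunct, Disjunction

m = ConcreteModel()
m.work = Var(bounds=(0, 100))      # recovered work, kW (placeholder bounds)
m.cost = Var(bounds=(-100, 100))   # annualized cost, arbitrary units

m.turbine = Disjunct()
m.turbine.rec = Constraint(expr=m.work == 80)               # recovers 80 kW
m.turbine.c = Constraint(expr=m.cost == 20 - 0.4 * m.work)  # capital minus power credit

m.valve = Disjunct()
m.valve.rec = Constraint(expr=m.work == 0)                  # no recovery
m.valve.c = Constraint(expr=m.cost == 5)                    # cheap but wasteful

m.choice = Disjunction(expr=[m.turbine, m.valve])
m.obj = Objective(expr=m.cost, sense=minimize)

TransformationFactory("gdp.bigm").apply_to(m)  # reformulate disjunction as MILP
SolverFactory("cbc").solve(m)
print(m.work.value, m.cost.value)              # -> turbine chosen: 80 kW
```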
Abstract:
The optimal integration of work, and its interaction with heat, can represent large energy savings in industrial plants. This paper introduces a new optimization model for the simultaneous synthesis of work exchange networks (WENs), with heat integration for the optimal pressure recovery of gaseous process streams. The proposed approach to WEN synthesis is analogous to the well-known problem of heat exchanger network (HEN) synthesis. Thus, there is work exchange between high-pressure (HP) and low-pressure (LP) streams, achieved by pressure manipulation equipment running on common axes. The model allows the use of several single-shaft turbine-compressor (SSTC) units, as well as stand-alone compressors, turbines and valves. Helper motors and generators are used to respond to any energy demand or excess. Moreover, between the WEN stages the streams are sent to the HEN to promote thermal recovery, aiming to enhance the work integration. A multi-stage superstructure is proposed to represent the process. The WEN superstructure is optimized as a mixed-integer nonlinear programming (MINLP) formulation and solved with the GAMS software, with the goal of minimizing the total annualized cost. Three examples are presented to verify the accuracy of the proposed method. In all case studies, the heat integration between WEN stages is essential to improve the pressure recovery and to reduce the total costs involved in the process.
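The shaft balance behind an SSTC unit is simple to state: expansion work from HP streams drives the compression of LP streams, and a helper motor or generator covers any mismatch. A toy sketch with invented duties:

```python
# Toy SSTC balance: expansion work from HP streams drives compression of LP
# streams on a common shaft; a helper motor/generator covers the mismatch.
expansion_work = {"HP1": 120.0, "HP2": 60.0}     # kW available (invented)
compression_work = {"LP1": 90.0, "LP2": 70.0}    # kW required (invented)

recovered = min(sum(expansion_work.values()), sum(compression_work.values()))
mismatch = sum(expansion_work.values()) - sum(compression_work.values())
role = "generator absorbs" if mismatch > 0 else "helper motor supplies"
print(f"work exchanged: {recovered} kW; {role} {abs(mismatch)} kW")
```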
Abstract:
The optimization of chemical processes where the flowsheet topology is not kept fixed is a challenging discrete-continuous optimization problem. Usually, this task has been performed through equation-based models. That approach presents several problems, such as tedious and complicated estimation of component properties or the handling of huge problems (with thousands of equations and variables). We propose a GDP approach, coupled with a flowsheet program, as an alternative to MINLP models. The novelty of this approach lies in using a commercial modular process simulator in which the superstructure is drawn directly on the graphical user interface of the simulator. This methodology takes advantage of modular process simulators (specially tailored numerical methods, reliability, and robustness) and of the flexibility of the GDP formulation for modeling and solution. The proposed optimization tool is successfully applied to the synthesis of a methanol plant where different alternatives are available for the streams, equipment and process conditions.
Abstract:
In this paper, we propose a novel algorithm for the rigorous design of distillation columns that integrates a process simulator in a generalized disjunctive programming formulation. The optimal distillation column, or column sequence, is obtained by selecting, for each column section, among a set of column sections with different numbers of theoretical trays. The selection of thermodynamic models, property estimation, etc., all remain in the simulation environment. All the numerical issues related to the convergence of distillation columns (or column sections) are likewise handled in the simulation environment. The model is formulated as a Generalized Disjunctive Programming (GDP) problem and solved using the logic-based outer approximation algorithm without MINLP reformulation. Examples ranging from a single column to a thermally coupled sequence and extractive distillation show the performance of the new algorithm.
Abstract:
This paper presents an alternative model for minimizing the energy consumption of non-isothermal systems with variable inlet and outlet temperatures. The model is based on an implicit temperature ordering and the “transshipment model” proposed by Papoulias and Grossmann (1983). It is supplemented with a set of logical relationships relating the relative position of the inlet temperatures of the process streams to the dynamic temperature intervals. In the extreme situation of fixed inlet and outlet temperatures, the model reduces to the “transshipment model”. Several examples with fixed and variable temperatures are presented to illustrate the model's performance.
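For fixed temperatures, the transshipment model reduces to the classic heat cascade: heat flows from hotter intervals to colder ones, and the minimum hot and cold utilities follow from a small LP. A minimal PuLP sketch with invented interval surpluses:

```python
from pulp import LpMinimize, LpProblem, LpVariable

# Net heat surplus (hot minus cold duty) in each temperature interval, kW.
# Toy numbers for illustration only.
delta_h = [30.0, -50.0, 25.0, -10.0]
K = len(delta_h)

prob = LpProblem("heat_cascade", LpMinimize)
q_hot = LpVariable("q_hot", lowBound=0)                   # hot utility
r = [LpVariable(f"r_{k}", lowBound=0) for k in range(K)]  # interval residuals

for k in range(K):
    above = q_hot if k == 0 else r[k - 1]
    prob += r[k] == above + delta_h[k]  # heat cascades downward only
q_cold = r[K - 1]                       # cold utility = residual below last interval
prob += q_hot + q_cold                  # minimize total utility consumption

prob.solve()
print(q_hot.value(), q_cold.value())    # -> 20.0 15.0 for this toy data
```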
Abstract:
Mathematical programming can be used for the optimal design of shell-and-tube heat exchangers (STHEs). This paper proposes a mixed-integer nonlinear programming (MINLP) model for the design of STHEs that rigorously follows the standards of the Tubular Exchanger Manufacturers Association (TEMA). The Bell–Delaware method is used for the shell-side calculations. This approach produces a large, non-convex model that cannot be solved to global optimality with current state-of-the-art solvers. Nevertheless, we propose a sequential optimization approach over partial objective targets, dividing the problem into sets of related equations that are easier to solve. For each of these sub-problems, a heuristic objective function is selected based on the physical behavior of the problem. The global optimal solution of the original problem cannot be ensured even when each of the sub-problems is solved to global optimality, but at least a very good solution is always guaranteed. Three cases extracted from the literature were studied. The results show that, in all cases, the values obtained using the proposed MINLP model with multiple objective functions improved on those reported in the literature.
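The sequential strategy (solve an easier sub-problem with a heuristic partial objective, fix its result, then solve the next) can be caricatured in a few lines; the surrogate cost formulas below are arbitrary placeholders, not TEMA rules or Bell–Delaware correlations.

```python
# Stage 1: choose a shell diameter minimizing an area surrogate.
# Stage 2: with the diameter fixed, choose baffle spacing minimizing a
# pressure-drop surrogate. Both formulas are made-up placeholders.
shell_diameters = [0.305, 0.387, 0.489, 0.591]   # m, discrete candidate sizes
baffle_spacings = [0.2, 0.3, 0.4, 0.5]           # m

def area_surrogate(d):
    return 120.0 / d          # placeholder heuristic objective for stage 1

def dp_surrogate(d, b):
    return 5.0 / (d * b)      # placeholder heuristic objective for stage 2

d_star = min(shell_diameters, key=area_surrogate)                     # stage 1
b_star = min(baffle_spacings, key=lambda b: dp_surrogate(d_star, b))  # stage 2
print(d_star, b_star)
```

Each stage is easy to solve on its own; the price, as the abstract notes, is that global optimality of the combined design is no longer guaranteed.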
Abstract:
In this work, we analyze the effect of demand uncertainty on the multi-objective optimization of chemical supply chains (SCs), considering simultaneously their economic and environmental performance. To this end, we present a stochastic multi-scenario mixed-integer linear program (MILP) with the unique feature of explicitly incorporating demand uncertainty via scenarios with given probabilities of occurrence. The environmental performance is quantified following life cycle assessment (LCA) principles, which are represented in the model formulation through standard algebraic equations. The capabilities of our approach are illustrated through a case study. We show that the stochastic solution improves the economic performance of the SC in comparison with the deterministic one at any level of environmental impact.
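A toy two-stage version of such a scenario model in PuLP: capacity is the here-and-now design decision, sales adjust per scenario, and all prices and demands are invented numbers.

```python
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum

# scenario: (probability, demand) -- invented data
scenarios = {"low": (0.3, 80.0), "base": (0.5, 100.0), "high": (0.2, 130.0)}

prob = LpProblem("stochastic_sc", LpMinimize)
build = LpVariable("build_plant", cat=LpBinary)          # first stage: build or not
cap = LpVariable("capacity", lowBound=0)                 # first stage: size
sell = {s: LpVariable(f"sell_{s}", lowBound=0) for s in scenarios}

prob += cap <= 200 * build                               # no plant, no capacity
for s, (p, demand) in scenarios.items():
    prob += sell[s] <= demand                            # second-stage recourse
    prob += sell[s] <= cap

# Minimize fixed + variable capital minus expected revenue (unit price 120).
prob += 500 * build + 50 * cap - lpSum(
    p * 120 * sell[s] for s, (p, _) in scenarios.items())

prob.solve()
print(build.value(), cap.value(), {s: v.value() for s, v in sell.items()})
```

The first-stage decisions are shared by all scenarios, which is exactly what makes the stochastic solution differ from solving each deterministic scenario separately.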
Abstract:
We present a derivative-free optimization algorithm coupled with a chemical process simulator for the optimal design of individual and complex distillation processes using a rigorous tray-by-tray model. The proposed approach serves as an alternative to the various models based on nonlinear programming (NLP) or mixed-integer nonlinear programming (MINLP). It combines the advantages of a commercial process simulator (Aspen Hysys), including numerical methods especially suited to the convergence of distillation columns, with the benefits of the particle swarm optimization (PSO) metaheuristic, which does not require gradient information and has the ability to escape from local optima. Our method inherits the superstructure developed in Yeomans, H.; Grossmann, I. E. Optimal design of complex distillation columns using rigorous tray-by-tray disjunctive programming models. Ind. Eng. Chem. Res. 2000, 39 (11), 4326–4335, in which the nonexisting trays are treated as simple bypasses of liquid and vapor flows. The implemented tool provides the optimal configuration of distillation column systems, involving both continuous and discrete variables, through minimization of the total annual cost (TAC). The robustness and flexibility of the method are demonstrated through the successful design and synthesis of three distillation systems of increasing complexity.
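The optimization side of this coupling is easy to sketch; below is a compact PSO in NumPy with a cheap analytic stand-in for the simulator evaluation (in the real tool, the objective would call Aspen Hysys and return the TAC).

```python
import numpy as np

def pso(obj, lo, hi, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize obj over the box [lo, hi] with particle swarm optimization."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    x = rng.uniform(lo, hi, (n, lo.size))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.apply_along_axis(obj, 1, x)
    gbest = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)           # keep particles inside bounds
        val = np.apply_along_axis(obj, 1, x)
        improved = val < pval
        pbest[improved], pval[improved] = x[improved], val[improved]
        gbest = pbest[pval.argmin()].copy()
    return gbest, pval.min()

# Stand-in for a simulator call: (reflux ratio, feed stage) -> pseudo-TAC.
def pseudo_tac(z):
    return (z[0] - 1.8) ** 2 + 0.01 * (z[1] - 12) ** 2

best, cost = pso(pseudo_tac, lo=[1.0, 5.0], hi=[5.0, 40.0])
print(best, cost)
```

Because PSO only needs objective values, the simulator can stay a black box, which is the point of coupling a metaheuristic to rigorous tray-by-tray models.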
Abstract:
This multidisciplinary study concerns the optimal design of processes with a view to both maximizing profit and minimizing environmental impacts. This can be achieved by combining traditional chemical process design methods, measurements of environmental impacts and advanced mathematical optimization techniques. More to the point, this paper presents a hybrid simulation-multiobjective optimization approach that simultaneously optimizes the production cost and minimizes the associated environmental impacts of isobutane alkylation. This approach has also made it possible to obtain the flowsheet configurations and process variables needed to manufacture isooctane in a way that satisfies the above-stated double aim. The problem is formulated as a Generalized Disjunctive Programming problem and solved using state-of-the-art logic-based algorithms. It is shown, starting from existing alternatives for the process, that it is possible to systematically generate a superstructure that includes alternatives not previously considered. The optimal solution, in the form of a Pareto curve, includes different structural alternatives from which the most suitable design can be selected. To evaluate the environmental impact, Life Cycle Assessment based on two different indicators is employed: Eco-indicator 99 and Global Warming Potential.
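Once the cost and impact of each candidate design are known, the Pareto curve mentioned above is just the non-dominated subset of the candidates; a minimal sketch (the candidate points are invented):

```python
def pareto_front(points):
    """Keep designs not dominated in (cost, impact); both objectives minimized."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

# (production cost, Eco-indicator 99 score) per candidate flowsheet -- invented
candidates = [(100, 9.0), (120, 6.5), (110, 8.0), (130, 6.4), (125, 7.5)]
print(sorted(pareto_front(candidates)))
# -> [(100, 9.0), (110, 8.0), (120, 6.5), (130, 6.4)]
```

The decision-maker then picks one point on this front, trading cost against environmental impact.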