995 results for Optimization software
Abstract:
We propose a new approach and related indicators for globally distributed software support and development, based on a three-year process improvement project in a globally distributed engineering company. The company develops, delivers and supports a complex software system with tailored hardware components and unique end-customer installations. By applying domain knowledge from operations management on lead time reduction and its multiple benefits to process performance, the workflows of the globally distributed software development and multitier support processes were measured and monitored throughout the company. The results show that global end-to-end process visibility and centrally managed reporting at all levels of the organization catalyzed a change process toward significantly better performance. With the new performance indicators, based on lead times and their variation combined with fixed control procedures, the case company was able to report faster bug-fixing cycle times, improved response times and generally better customer satisfaction in its global operations. In all, lead times to implement new features and to respond to customer issues and requests were reduced by 50%.
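A minimal sketch of the kind of lead-time indicator described above, assuming hypothetical ticket data and a classic control-chart procedure (the abstract does not publish the company's exact formulas):

```python
from statistics import mean, stdev

def indicators(lead_times_days, sigma=3.0):
    """Mean lead time, its variation, and fixed control limits.

    lead_times_days: lead times of closed items, in days.
    The 3-sigma control limits are an illustrative assumption;
    the study's actual control procedure is not published.
    """
    mu, sd = mean(lead_times_days), stdev(lead_times_days)
    return {
        "mean": mu,
        "stdev": sd,
        "ucl": mu + sigma * sd,             # upper control limit
        "lcl": max(0.0, mu - sigma * sd),   # lead times cannot be negative
    }

# Example: bug-fixing cycle times (days) for one release, hypothetical.
print(indicators([12, 7, 30, 9, 15, 11, 22]))
```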
Abstract:
An alternative to the Pareto-dominance relation is proposed. The new relation is based on ranking a set of solutions according to each separate objective and on an aggregation function that calculates a scalar fitness value for each solution. The relation is called ranking-dominance, and it tries to tackle the curse of dimensionality commonly observed in evolutionary multi-objective optimization. Ranking-dominance can be used to sort a set of solutions even for a large number of objectives, when the Pareto-dominance relation can no longer distinguish solutions from one another. This permits search to advance even with a large number of objectives. It is also shown that ranking-dominance does not violate Pareto-dominance. Results indicate that selection based on ranking-dominance is able to advance search towards the Pareto-front in some cases where selection based on Pareto-dominance stagnates. However, in some cases it is also possible that search does not proceed in the direction of the Pareto-front, because the ranking-dominance relation permits deterioration of individual objectives. Results also show that when the number of objectives increases, selection based on just Pareto-dominance without diversity maintenance is able to advance search better than with diversity maintenance. Diversity maintenance therefore appears to aggravate the curse of dimensionality.
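A minimal sketch of the ranking-dominance idea, assuming minimization and rank-sum aggregation (the paper also considers other aggregation functions):

```python
import numpy as np

def ranking_fitness(objectives):
    """Scalar fitness from per-objective ranks (minimization assumed).

    objectives: (n_solutions, n_objectives) array. Each column is
    ranked independently (best value gets rank 0) and the ranks are
    summed into one scalar per solution.
    """
    # argsort of argsort yields each entry's rank within its column
    ranks = np.argsort(np.argsort(objectives, axis=0), axis=0)
    return ranks.sum(axis=1)

def ranking_dominates(fi, fj):
    """Solution i ranking-dominates j if its aggregated rank is lower."""
    return fi < fj

# Three solutions, four objectives: ranking-dominance can still order
# them even though all three are mutually Pareto-nondominated.
F = np.array([[1.0, 4.0, 2.0, 3.0],
              [2.0, 1.0, 3.0, 4.0],
              [3.0, 2.0, 1.0, 1.0]])
print(ranking_fitness(F))  # [4, 5, 3] -> solution 2 ranks best
```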
Abstract:
Small centrifugal compressors are more and more widely used in many industrial systems because of their higher efficiency and better off-design performance compared to piston and scroll compressors, as well as their higher work coefficient per stage than axial compressors. Higher efficiency is always the aim of the compressor designer. In the present work, the influence of four parts of a small centrifugal compressor that compresses a heavy-molecular-weight real gas has been investigated in order to achieve higher efficiency. Two parts concern the impeller: the tip clearance and the circumferential position of the splitter blade. The other two concern the diffuser: the pinch shape and the vane shape. Computational fluid dynamics is applied in this study. The Reynolds-averaged Navier-Stokes flow solver Finflo is used with a quasi-steady approach, and Chien's k-epsilon turbulence model is used to model the turbulence. A new practical real gas model is presented in this study; it is easy to generate, its accuracy is controllable, and it is fairly fast. The numerical results and measurements show good agreement. The influence of tip clearance on the performance of a small compressor is obvious. The pressure ratio and efficiency decrease as the tip clearance is increased, while the total enthalpy rise remains almost constant. The decrease in pressure ratio and efficiency is larger at higher mass flow rates and smaller at lower mass flow rates. The flow angles at the inlet and outlet of the impeller increase as the tip clearance is increased. The detailed flow field shows that leaking flow is the main reason for the performance drop. The secondary flow region becomes larger as the tip clearance is increased, the area of the main flow is compressed, and the flow uniformity is therefore decreased. A detailed study shows that the leakage flow rate is higher near the exit of the impeller than near its inlet. Based on this phenomenon, a new partially shrouded impeller is used, shrouded near the impeller exit. The results show that the flow field near the exit of the impeller is greatly changed by the partially shrouded impeller, and better performance is achieved than with the unshrouded impeller. The loading distribution on the impeller blade and the flow fields in the impeller are changed by moving the splitter of the impeller in the circumferential direction. Moving the splitter slightly towards the suction side of the long blade can improve the performance of the compressor. The total enthalpy rise is reduced if only the leading edge of the splitter is moved to the suction side of the long blade. The performance of the compressor is decreased if the blade is bent away from the radial direction at the leading edge of the splitter. The total pressure rise and the enthalpy rise of the compressor are increased if a pinch is used at the diffuser inlet. Among the five different pinch shape configurations, the efficiency of a straight-line pinch is the highest at the design and lower mass flow rates, while the efficiency of a concave pinch is the highest at higher mass flow rates. The sharp corner of the pinch is the main reason for the decrease in efficiency and should be avoided. The spanwise variation of the flow angles entering the diffuser is decreased if a pinch is applied. A three-dimensional low-solidity twisted vaned diffuser is designed to match the flow angles entering the diffuser.
The numerical results show that the pressure recovery in the twisted diffuser is higher than in a conventional low-solidity vaned diffuser, which also leads to a higher efficiency of the twisted diffuser. Investigation of the detailed flow fields shows that at lower mass flow rates separation occurs later in the twisted diffuser than in the conventional low-solidity vaned diffuser, which suggests a wider flow range for the twisted diffuser.
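For reference, the performance quantities compared throughout this abstract are conventionally defined as follows (standard textbook definitions, not specific to this thesis):

```latex
\[
\pi = \frac{p_{02}}{p_{01}}, \qquad
\eta_s = \frac{\Delta h_s}{\Delta h} = \frac{h_{02s}-h_{01}}{h_{02}-h_{01}},
\]
where $p_{01}, p_{02}$ are the inlet and outlet total pressures,
$\Delta h = h_{02}-h_{01}$ is the actual total enthalpy rise, and
$\Delta h_s$ is the isentropic enthalpy rise to the same outlet pressure.
```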
Abstract:
This thesis investigates factors that affect software testing practice. The thesis consists of empirical studies in which the affecting factors were analyzed and interpreted using quantitative and qualitative methods. First, the Delphi method was used to specify the scope of the thesis. Secondly, for the quantitative analysis, 40 industry experts from 30 organizational units (OUs) were interviewed. The survey method was used to explore factors that affect software testing practice, and conclusions were derived using correlation and regression analysis. Thirdly, from these 30 OUs, five were selected for an in-depth case study. The data was collected through 41 semi-structured interviews. The affecting factors and their relationships were interpreted with qualitative analysis using grounded theory as the research method. The practice of software testing was analyzed from the process improvement and knowledge management viewpoints. The qualitative and quantitative results were triangulated to increase the validity of the thesis. The results suggest that testing ought to be adjusted according to the business orientation of the OU: the business orientation affects the testing organization and the knowledge management strategy, and the business orientation and the knowledge management strategy affect outsourcing. As a special case, the complex relationship between testing schedules and knowledge transfer is discussed. The results of this thesis can be used to improve testing processes and knowledge management in software testing.
Abstract:
Direct torque control (DTC) has become an accepted vector control method beside current vector control. DTC was first applied to asynchronous machines, and has later been applied also to synchronous machines. This thesis analyses the application of DTC to permanent magnet synchronous machines (PMSM). In order to take full advantage of DTC, the PMSM has to be properly dimensioned; therefore the effect of the motor parameters is analysed taking the control principle into account. Based on the analysis, a parameter selection procedure is presented. The analysis and the selection procedure utilize nonlinear optimization methods. The key element of a direct torque controlled drive is the estimation of the stator flux linkage. Different estimation methods - a combination of current and voltage models and improved integration methods - are analysed. The effect of an incorrectly measured rotor angle in the current model is analysed, and an error detection and compensation method is presented. The dynamic performance of a previously presented sensorless flux estimation method is improved by enhancing the dynamic performance of the low-pass filter used and by adapting the correction of the flux linkage to torque changes. A method for the estimation of the initial angle of the rotor is presented. The method is based on measuring the inductance of the machine in several directions and fitting the measurements to a model. The model is nonlinear with respect to the rotor angle, and therefore a nonlinear least squares optimization method is needed in the procedure. A commonly used current vector control scheme is minimum current control. In DTC, the stator flux linkage reference is usually kept constant, so achieving the minimum current requires control of the reference. An on-line method that minimizes the current by controlling the stator flux linkage reference is presented, and the control of the reference above the base speed is also considered. A new flux linkage estimator is introduced for the estimation of the parameters of the machine model. In order to utilize the flux linkage estimates in off-line parameter estimation, the integration methods are improved, and an adaptive correction is used in the same way as in the estimation of the controller's stator flux linkage. The presented parameter estimation methods are then used in a self-commissioning scheme. The proposed methods are tested with a laboratory drive, which consists of commercial inverter hardware with modified software and several prototype PMSMs.
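A minimal sketch of the initial-angle estimation step, assuming a standard two-term saliency model for the measured inductance (the thesis's actual machine model is not reproduced in the abstract):

```python
import numpy as np
from scipy.optimize import least_squares

def fit_initial_angle(theta_meas, L_meas):
    """Estimate the rotor angle from inductances measured in several
    directions, via nonlinear least squares.

    Assumed model (standard for salient PM machines, hypothetical here):
        L(theta) = L0 - L2 * cos(2 * (theta - theta_r))
    """
    def residuals(p):
        L0, L2, theta_r = p
        return L0 - L2 * np.cos(2.0 * (theta_meas - theta_r)) - L_meas

    p0 = [np.mean(L_meas), np.ptp(L_meas) / 2.0, 0.0]  # rough initial guess
    sol = least_squares(residuals, p0)
    return sol.x[2]  # theta_r; note that a cos(2*theta) model leaves a
                     # 180-degree ambiguity that needs an extra test
```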
Abstract:
Software engineering is criticized as not being engineering or a 'well-developed' science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes; in this thesis only effort estimation in software projects for project management purposes is discussed. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis, but avoids the problems and pitfalls found in that method, and it is relatively easy to use and learn. Effort estimation accuracy has significantly improved after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement, for which the author of this thesis has developed a three-level solution. All currently used size metrics are static in nature, but the proposed metric is dynamic: it makes use of the increased understanding of the nature of the work as specification and design work proceeds, and thus 'grows up' along with the software project. Developing an effort estimation model is not possible without gathering and analyzing history data. However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process; without a proper framework, the estimation capability of an organization declines, as it requires effort even to maintain an achieved level of estimation accuracy. Estimation results in several successive releases are analyzed, and it is clearly seen that the new estimation model works and the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity; an example is shown to shed more light on the calibration and the model itself, together with remarks about the sensitivity of the model. Finally, an example of usage is shown.
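The abstract does not publish the model's formulas, but the generic flow of a function-point-style estimate and a common accuracy analysis can be sketched as follows, with all names and numbers hypothetical:

```python
def estimate_effort(size_units, hours_per_unit):
    """Effort (person-hours) from a size estimate and a historical
    productivity rate - the generic FPA-style idea, not the thesis's
    actual hierarchical model."""
    return size_units * hours_per_unit

def mmre(actuals, estimates):
    """Mean magnitude of relative error, a common way to analyze
    estimation accuracy over successive releases."""
    return sum(abs(a - e) / a for a, e in zip(actuals, estimates)) / len(actuals)

# Hierarchical size measurement: the size estimate is refined as
# specification and design proceed (hypothetical numbers).
sizes_by_level = {"requirements": 120, "specification": 102, "design": 96}
rate = 7.5  # person-hours per size unit, from history data
for level, size in sizes_by_level.items():
    print(level, estimate_effort(size, rate))

print("MMRE:", mmre(actuals=[900, 760, 810], estimates=[870, 800, 795]))
```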
Abstract:
Quality management has become a strategic issue for organisations and is very valuable for producing quality software. However, quality management systems (QMS) are not easy to implement and maintain. The authors' experience shows the benefits of developing a QMS by first formalising it using semantic web ontologies and then putting them into practice through a semantic wiki. The QMS ontology that has been developed captures the core concepts of a traditional QMS and combines them with concepts coming from the MPIu'a development process model, which is geared towards obtaining usable and accessible software products. The ontology semantics is then directly put into play by a semantics-aware tool, the Semantic MediaWiki. The developed QMS tool has been in use for two years by the GRIHO research group, where it has managed almost 50 software development projects while taking quality management issues into account. It has also been externally audited by a quality certification organisation. Its users are very satisfied with their daily work with the tool, which manages all the documents created during project development and also allows them to collaborate, thanks to the wiki features.
Abstract:
Background: The design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models of the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization. Results: Based on the GMA canonical representation, we have developed in previous work a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization can be performed on the recast GMA model. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models, which extend the power-law formalism to deal with saturation and cooperativity. Conclusions: Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps overcome some of the numerical difficulties that arise during the global optimization task.
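As a one-line illustration of the recasting idea on the simplest saturable rate law (a Michaelis-Menten term; the paper treats the more general SC form), an auxiliary variable turns the rational expression into a product of power-laws:

```latex
\[
v = \frac{V_{\max}\, S}{K_M + S}
\;\xrightarrow{\;Z := K_M + S\;}\;
v = V_{\max}\, S\, Z^{-1},
\qquad \dot{Z} = \dot{S}, \quad Z(0) = K_M + S(0),
\]
which is now in the GMA power-law form $v = \gamma \prod_j X_j^{f_j}$
with $\gamma = V_{\max}$, $f_S = 1$ and $f_Z = -1$.
```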
Abstract:
Optimization models in metabolic engineering and systems biology typically focus on optimizing a single criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimizing for maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analyses using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail to capture the intrinsic non-linear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon-constraint method that reduces the computational burden of generating a set of Pareto-optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and to narrow down their number before they are tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes ethanol production in the fermentation of Saccharomyces cerevisiae.
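A minimal sketch of the basic epsilon-constraint scheme and a Pareto filter on a toy bi-objective problem (a stand-in for the paper's GMA ethanol model; the paper's heuristics for reducing the computational burden are not reproduced):

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Toy bi-objective problem: minimize f1(x) = x and f2(x) = (x - 2)^2.
f1 = lambda x: x[0]
f2 = lambda x: (x[0] - 2.0) ** 2

def epsilon_constraint_front(epsilons):
    """Minimize f1 subject to f2(x) <= eps for a sweep of eps values,
    yielding one Pareto-optimal candidate per epsilon level."""
    front = []
    for eps in epsilons:
        con = NonlinearConstraint(f2, -np.inf, eps)
        res = minimize(f1, x0=[2.0], bounds=[(0.0, 4.0)], constraints=[con])
        if res.success:
            front.append((f1(res.x), f2(res.x)))
    return front

def pareto_filter(points):
    """Keep only nondominated points (minimization in all coordinates)."""
    pts = np.array(points)
    return [tuple(p) for p in pts
            if not np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))]

candidates = epsilon_constraint_front(np.linspace(0.0, 4.0, 9))
print(pareto_filter(candidates))
```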
Abstract:
Software projects have proved troublesome to implement, and as the size of software keeps increasing it is more and more important to follow up on projects. The proportion of successful software projects is still quite low in spite of research and the development of project control methodologies. The success and failure factors of projects are known, as are the project risks, but projects nevertheless still have problems with keeping to the schedule and the budget and with achieving the defined functionality and adequate quality. The purpose of this thesis was to find out what deviations currently occur in projects, what causes them, and what is measured in projects. Project deviation was also defined from the viewpoint of the literature and of field experts. The analysis was made using a qualitative research approach. It was found that software projects still have deviations in schedule, budget, quality, requirements, documentation, effort, and resources; in addition, changes in requirements were identified. It was also found that, for example, schedule deviations can be mitigated by reducing the size of a task and by adding measurements.
Abstract:
Agile software development methods attempt to answer the software development industry's need for lighter-weight, more agile processes that offer the possibility to react to changes during the software development process. The objective of this thesis is to analyze and experiment with the possibility of using agile methods or practices also in small software projects, even in projects with only one developer. In the practical part of the thesis, a small software project was executed with some of the agile methods and practices that the theoretical part of the thesis found applicable to such a project. In the project, a Bluetooth proxy application that runs on the S60 smartphone platform and on a PC was developed further to contain some new features. As a result, it was found that certain agile practices can be useful even in very small projects. The selection of suitable practices depends on the project and on the size of the project team.
Abstract:
This final report embeds the research carried out in order to build a web application that makes it possible to record the processes involved in milk production in the Cayambe canton of the Pichincha province in Ecuador.
Abstract:
In this research work we searched for open source libraries that support graph drawing and visualisation and can run in a browser. These libraries were subsequently evaluated to find out which one is best suited for this task. The result was that d3.js is the library with the greatest functionality, flexibility and customisability. Afterwards we developed an open source software tool that includes d3.js and is written in JavaScript so that it can run browser-based.
Abstract:
The purpose of this project is to develop a collaborative space where knowledge can be shared and managed.