993 results for Computational experiment


Relevance: 60.00%

Abstract:

The branch-and-cut algorithm is one of the most efficient exact approaches to solving mixed integer programs. It combines the advantages of a pure branch-and-bound approach with a cutting-plane scheme: at each node of the search tree, the algorithm solves a linear programming relaxation of the problem and tightens it with cuts, i.e. by the inclusion of valid inequalities. Selecting the strongest cuts is crucial to their effective use in a branch-and-cut algorithm. In this thesis, we focus on the derivation and use of cutting planes to solve general mixed integer problems, in particular inventory problems combined with other problems such as distribution, supplier selection and vehicle routing. To achieve this goal, we first consider substructures (relaxations) of such problems obtained by a coherent loss of information. The polyhedral structure of these simpler mixed integer sets is studied to derive strong valid inequalities, which are then included in cutting-plane algorithms to solve the general mixed integer problems. We study three mixed integer sets in this dissertation. The first two arise as subproblems of the lot-sizing with supplier selection, network design and vendor-managed inventory routing problems; they are variants of the well-known single-node fixed-charge network set in which a binary or integer variable is associated with the node. The third set occurs as a subproblem of mixed integer sets in which incompatibility between binary variables is considered. We generate families of valid inequalities for these sets, identify classes of facet-defining inequalities, and discuss the separation problems associated with the inequalities. Cutting-plane frameworks are then implemented to solve some mixed integer programs, and preliminary computational experiments are reported.
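
To make the cut-generation loop concrete, here is a minimal sketch, assuming a toy 0-1 knapsack relaxation with cover inequalities as the family of valid cuts and brute-force enumeration as the separation routine; the instance data and tolerance are invented, and the thesis derives its inequalities for fixed-charge network sets rather than knapsacks.

```python
# Minimal cutting-plane sketch: solve the LP relaxation, separate a violated
# valid inequality, add it as a cut, and repeat.  Toy data, not thesis code.
import itertools
import numpy as np
from scipy.optimize import linprog

values = np.array([10.0, 9.0, 8.0])     # toy objective coefficients
weights = np.array([3.0, 3.0, 3.0])     # toy knapsack weights
capacity = 5.0

def separate_cover(x):
    """Return a violated cover inequality sum_{i in C} x_i <= |C|-1, or None."""
    n = len(x)
    for r in range(2, n + 1):
        for C in itertools.combinations(range(n), r):
            if weights[list(C)].sum() > capacity:        # C is a cover
                if sum(x[i] for i in C) > len(C) - 1 + 1e-6:
                    return C
    return None

A_ub, b_ub = [weights.tolist()], [capacity]
while True:
    res = linprog(-values, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * 3)
    x = res.x
    print("LP bound:", -res.fun, "solution:", np.round(x, 3))
    C = separate_cover(x)
    if C is None:
        break                                            # no violated cut found
    A_ub.append([1.0 if i in C else 0.0 for i in range(3)])
    b_ub.append(len(C) - 1)                              # add the cut
```

When separation fails on a still-fractional point, a full branch-and-cut solver would branch on a fractional variable; the sketch simply stops.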

Relevance: 60.00%

Abstract:

The assessment of chess players is an increasingly attractive opportunity and an unfortunate necessity. The chess community needs to limit potential reputational damage by inhibiting cheating and unjustified accusations of cheating: there has been a recent rise in both. A number of counter-intuitive discoveries have been made by benchmarking the intrinsic merit of players’ moves; these call for further investigation. Is Capablanca actually, objectively the most accurate World Champion? Has Elo rating inflation not taken place? Stimulated by FIDE/ACP, we revisit the fundamentals of the subject to advance a framework suitable for improved standards of computational experiment and more precise results. Other domains look to chess as the demonstrator of good practice, including the rating of professionals making high-value decisions under pressure, personnel evaluation by Multichoice Assessment and the organization of crowd-sourcing in citizen science projects. The ‘3P’ themes of performance, prediction and profiling pervade all these domains.

Relevance: 60.00%

Abstract:

The problem of scheduling a parallel program, represented by a weighted directed acyclic graph (DAG), onto a set of homogeneous processors so as to minimize the completion time of the program has been extensively studied as an academic optimization problem, arising when the execution time of a parallel algorithm on a parallel computer is optimized. In this paper, we propose an application of Ant Colony Optimization (ACO) to the multiprocessor scheduling problem (MPSP). In the MPSP, no preemption is allowed and each operation demands a setup time on the machines; the problem seeks a schedule that minimizes the total completion time. Since exact solution methods are not feasible for most instances of such problems, we rely on heuristics to find solutions. In this novel heuristic search approach to multiprocessor scheduling, based on the ACO algorithm, a collection of agents cooperates to explore the search space effectively. A computational experiment is conducted on a suite of benchmark applications. Comparing the results obtained by our algorithm with those of a previous heuristic algorithm shows that the ACO algorithm exhibits competitive performance with a small error ratio.
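
A minimal sketch of this kind of ACO scheme, assuming a toy five-task DAG, pheromone indexed by (list position, task), and invented parameter values; the setup times the paper models are omitted here.

```python
import random

# toy DAG: task -> successors, plus processing times (illustrative data only)
succ = {0: [2, 3], 1: [3], 2: [4], 3: [4], 4: []}
dur = {0: 2, 1: 3, 2: 2, 3: 4, 4: 1}
n_tasks, n_procs = len(dur), 2
preds = {t: [u for u in succ if t in succ[u]] for t in dur}

# pheromone on (position in the priority list, task)
tau = [[1.0] * n_tasks for _ in range(n_tasks)]

def construct():
    """One ant builds a precedence-feasible priority list, biased by pheromone."""
    done, order = set(), []
    while len(order) < n_tasks:
        ready = [t for t in dur
                 if t not in done and all(p in done for p in preds[t])]
        t = random.choices(ready, [tau[len(order)][t] for t in ready])[0]
        order.append(t); done.add(t)
    return order

def makespan(order):
    """List-schedule the priority list on the earliest-free processor."""
    free, finish = [0.0] * n_procs, {}
    for t in order:
        ready_at = max((finish[p] for p in preds[t]), default=0.0)
        k = min(range(n_procs), key=lambda i: max(free[i], ready_at))
        finish[t] = max(free[k], ready_at) + dur[t]
        free[k] = finish[t]
    return max(finish.values())

best, best_ms = None, float("inf")
for _ in range(100):                       # colony iterations
    for _ in range(10):                    # ants per iteration
        order = construct()
        ms = makespan(order)
        if ms < best_ms:
            best, best_ms = order, ms
    for row in tau:                        # evaporation
        for j in range(n_tasks):
            row[j] *= 0.9
    for pos, t in enumerate(best):         # deposit along the best list found
        tau[pos][t] += 1.0 / best_ms
print("best makespan:", best_ms, "priority list:", best)
```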

Relevance: 60.00%

Abstract:

In this work, a version of the Aiyagari (1994) model with liquidity shocks is developed. The model has Huggett (1993) and Aiyagari (1994) as particular cases, but this generalization allows two distinct assets in the economy, one liquid and one illiquid. Using two different assets implies two returns affecting market clearing, so the computational strategy used by Aiyagari and Huggett does not work; consequently, Scarf's triangulation replaces that algorithm. The computational experiment shows that the equilibrium return on the liquid asset is lower than the return on the illiquid one. Moreover, poor households hold relatively more of the liquid asset, an inequality that does not appear in Aiyagari's model.
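
For intuition on the two-return market-clearing step, here is a minimal sketch using a generic root finder on a made-up two-asset excess-demand function; the thesis uses Scarf's triangulation precisely because equilibrium computation cannot in general rely on a smooth, well-behaved excess demand like this placeholder.

```python
# Clearing two asset markets at once: find (r_liquid, r_illiquid) such that
# excess demand is zero in both.  The linear excess demand is invented.
import numpy as np
from scipy.optimize import root

def excess_demand(r):
    """Toy stand-in: aggregate demand minus supply for (liquid, illiquid)."""
    r_liq, r_illiq = r
    liquid = 1.0 - 40.0 * r_liq + 5.0 * r_illiq      # illustrative only
    illiquid = 2.0 + 5.0 * r_liq - 30.0 * r_illiq
    return [liquid, illiquid]

sol = root(excess_demand, x0=[0.02, 0.04])
print("clearing returns (liquid, illiquid):", np.round(sol.x, 4))
```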

Relevance: 60.00%

Abstract:

The Scientific Algorithms are a new metaheuristic inspired by the scientific research process. The new method introduces the idea of a theme to search the solution space of hard problems. The inspiration for this class of algorithms comes from the act of researching, which comprises thinking, knowledge sharing and disclosing new ideas. The ideas of the new method are illustrated on the Traveling Salesman Problem. A computational experiment applies the proposed approach to a new variant of the Traveling Salesman Problem, the Car Renter Salesman Problem, and the results are compared to state-of-the-art algorithms for the latter problem.

Relevance: 60.00%

Abstract:

Aggregation/disaggregation is used to reduce the analysis of a large generalized transportation problem to a smaller one. Bounds on the actual difference between the aggregated objective and the original optimal value are used to quantify the error due to aggregation and to estimate the quality of the aggregation. The bounds can be calculated either before optimization of the aggregated problem (a priori) or after (a posteriori). Both types of bounds are derived and numerically compared. A computational experiment was designed to (a) study the correlation between the bounds and the actual error and (b) quantify the difference between the error bounds and the actual error. The experiment shows a significant correlation between some a priori bounds, the a posteriori bounds and the actual error. These preliminary results indicate that calculating the a priori error bound is a useful strategy for selecting the appropriate aggregation level, since the a priori bound varies in the same way that the actual error does. After the aggregated problem has been selected and optimized, the a posteriori bound provides a good quantitative measure of the error due to aggregation.
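
A minimal sketch of one classical a posteriori construction on a toy transportation instance (the paper's specific bounds are not reproduced here): taking cluster-minimum costs makes the aggregated optimum a lower bound on the original optimum, and proportionally disaggregating the aggregated solution yields a feasible original solution, hence an upper bound; their gap bounds the aggregation error. All data are invented.

```python
import numpy as np
from scipy.optimize import linprog

c = np.array([[4.0, 5.0, 6.0, 8.0],
              [6.0, 4.0, 3.0, 5.0]])          # unit costs, 2 sources x 4 sinks
s = np.array([30.0, 40.0])                    # supplies
d = np.array([10.0, 20.0, 15.0, 25.0])        # demands (balanced: 70 = 70)
clusters = [[0, 1], [2, 3]]                   # aggregation of demand nodes

def solve_transport(c, s, d):
    m, n = c.shape
    A_eq, b_eq = [], []
    for i in range(m):                        # supply rows
        row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1
        A_eq.append(row); b_eq.append(s[i])
    for j in range(n):                        # demand columns
        row = np.zeros(m * n); row[j::n] = 1
        A_eq.append(row); b_eq.append(d[j])
    res = linprog(c.ravel(), A_eq=A_eq, b_eq=b_eq)
    return res.fun, res.x.reshape(m, n)

# aggregated data: cluster-minimum costs -> aggregated optimum <= v_orig
c_agg = np.array([[c[i, K].min() for K in clusters] for i in range(2)])
d_agg = np.array([d[K].sum() for K in clusters])
v_agg, X = solve_transport(c_agg, s, d_agg)

# proportional disaggregation -> feasible, hence an upper bound on v_orig
x_feas = np.zeros_like(c)
for k, K in enumerate(clusters):
    for j in K:
        x_feas[:, j] = X[:, k] * d[j] / d_agg[k]
v_upper = (c * x_feas).sum()
print(f"a posteriori bound on the aggregation error: {v_upper - v_agg:.2f}")
```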

Relevance: 60.00%

Abstract:

In the last few years, crop rotation has gained attention due to its economic, environmental and social importance, and it can be highly beneficial for farmers. This paper presents a mathematical model for the Crop Rotation Problem (CRP), adapted from the literature on this highly complex combinatorial problem. The CRP seeks a vegetable planting program that takes into account green-fertilization restrictions, the set-aside period, planting restrictions for neighboring lots and for crop sequencing, and demand constraints, while at the same time maximizing the profitability of the planted area. The main aim of this study is to develop a genetic algorithm and test it in a real context. The genetic algorithm involves a constructive heuristic to build the initial population and the operators of crossover, mutation, migration and elitism. The computational experiment was performed on a medium-sized real planting area with 16 lots, considering 29 crops from 10 different botanical families and a two-year planting rotation. Results showed that the algorithm found feasible solutions in reasonable computational time, proving its efficacy for this practical application.
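
A minimal sketch of a genetic algorithm with the ingredients the abstract lists (heuristically seeded initial population, crossover, mutation, elitism and a stand-in "migration" step); the bit-string encoding and the fitness function are placeholders, not the paper's CRP model.

```python
import random

GENES, POP, GENS, ELITE = 32, 40, 200, 2

def fitness(ind):                      # placeholder objective
    return sum(ind)

def heuristic_individual():            # stand-in "constructive heuristic"
    return [1 if random.random() < 0.7 else 0 for _ in range(GENES)]

def crossover(a, b):                   # one-point crossover
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.02):            # bit-flip mutation
    return [1 - g if random.random() < rate else g for g in ind]

pop = [heuristic_individual() for _ in range(POP)]
for gen in range(GENS):
    pop.sort(key=fitness, reverse=True)
    nxt = pop[:ELITE]                  # elitism: keep the best unchanged
    if gen % 50 == 49:                 # stand-in "migration": in island models
        nxt += [heuristic_individual() # individuals move between populations;
                for _ in range(ELITE)] # here fresh individuals are injected
    while len(nxt) < POP:
        a, b = random.sample(pop[:POP // 2], 2)   # selection from better half
        nxt.append(mutate(crossover(a, b)))
    pop = nxt
print("best fitness:", max(map(fitness, pop)))
```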

Relevance: 60.00%

Abstract:

We consider a one-dimensional cutting stock problem in which the material not used in the cutting patterns, if large enough, is kept for use in the future. Moreover, it is assumed that leftovers should not remain in stock for a long time; hence, such leftovers have priority in use over standard objects (objects bought by the industry) in stock. A heuristic procedure is proposed for this problem, and its performance is analyzed by solving randomly generated dynamic instances in which successive problems are solved over a time horizon. In each period, new demands arise and a new problem is solved on the basis of information about the stock from previous periods (remaining standard objects) and the usable leftovers generated during those periods. The computational experiments show that the solutions produced by the proposed heuristic are better than those obtained by other heuristics from the literature.
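
A minimal sketch of the priority-in-use idea, assuming a first-fit-decreasing cutting rule and invented lengths; the paper's heuristic is more elaborate, and this only illustrates the leftover bookkeeping across periods.

```python
STANDARD_LEN, MIN_KEEP = 100, 15       # illustrative sizes

def cut_period(demands, leftovers):
    """demands: item lengths to cut; leftovers: lengths kept from earlier periods."""
    open_objs = sorted(leftovers)                  # leftovers available this period
    residues, used_standard = [], 0
    for item in sorted(demands, reverse=True):     # first-fit decreasing
        for k, cap in enumerate(residues):
            if cap >= item:                        # fits an already-open object
                residues[k] -= item
                break
        else:
            fit = next((k for k, L in enumerate(open_objs) if L >= item), None)
            if fit is not None:                    # priority-in-use: open a leftover
                residues.append(open_objs.pop(fit) - item)
            else:                                  # otherwise use a standard object
                residues.append(STANDARD_LEN - item)
                used_standard += 1
    # large residues (and unopened leftovers) carry over to the next period
    kept = sorted(open_objs + [r for r in residues if r >= MIN_KEEP])
    return used_standard, kept

used, keep = cut_period([40, 35, 30, 25, 20, 60], leftovers=[55, 30])
print("standard objects used:", used, "leftovers kept:", keep)
```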

Relevance: 60.00%

Abstract:

The capacitated redistricting problem (CRP) aims to redefine, under a given criterion, an initial set of districts of an urban area represented by a geographic network. Each node in the network has different types of demands and each district has a limited capacity. Real-world applications consider more than one criterion in the design of the districts, leading to a multicriteria CRP (MCRP); examples are found in political districting, sales design, street sweeping, garbage collection and mail delivery. This work addresses the MCRP applied to power-meter reading, where two criteria are considered: compactness and homogeneity of districts. The proposed solution framework is based on a greedy randomized adaptive search procedure and multicriteria scalarization techniques to approximate the Pareto frontier. The computational experiments show the effectiveness of the method on a set of randomly generated networks and on a real-world network extracted from the city of São Paulo.
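
A minimal sketch of the scalarization loop, assuming a random stand-in for one GRASP construction and both criteria minimised; only the weight sweep and the nondominance filtering reflect the framework described above.

```python
import random

def greedy_randomized_solution():
    """Stand-in for one GRASP construction: returns a
    (compactness, homogeneity) pair, both to be minimised."""
    return random.uniform(0, 1), random.uniform(0, 1)

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and a != b

front = []
for w in [i / 10 for i in range(11)]:          # weight sweep over the simplex
    best, best_val = None, float("inf")
    for _ in range(50):                        # GRASP-like restarts
        f = greedy_randomized_solution()
        val = w * f[0] + (1 - w) * f[1]        # weighted-sum scalarization
        if val < best_val:
            best, best_val = f, val
    front = [p for p in front if not dominates(best, p)]
    if not any(dominates(p, best) or p == best for p in front):
        front.append(best)                     # keep the nondominated points
print(sorted(front))
```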

Relevance: 60.00%

Abstract:

Short-run forecasting of electricity prices has become necessary for power generation unit scheduling, since it is the basis of every profit maximization strategy. In this article, a new and straightforward method to compute accurate forecasts for electricity prices using mixed models is proposed. The main idea is to develop an efficient tool for one-step-ahead forecasting that combines several prediction methods whose forecasting performance has been checked and compared over a span of several years. As a further novelty, the 24 hourly time series are modelled separately, instead of the complete price series, which takes advantage of the homogeneity of these 24 series. The purpose of this paper is to select the model that leads to the smallest prediction errors and to obtain the appropriate length of history to use for forecasting. These results have been obtained by means of a computational experiment. A mixed model that combines the advantages of the two new models discussed is proposed. Some numerical results for the Spanish market are shown, but the methodology can be applied to other electricity markets as well.
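
A minimal sketch of the "model each hour separately" idea, assuming synthetic hourly prices and a simple AR(7) per hourly series; the paper's mixed models are richer than this.

```python
# Reshape an hourly price history into 24 daily series, fit one AR model
# per hour of the day, and produce the next day's 24 one-step forecasts.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
days, hours = 400, 24
prices = (50 + 10 * np.sin(2 * np.pi * np.arange(days * hours) / 24)
          + rng.normal(0, 2, days * hours)).reshape(days, hours)

forecast = np.empty(hours)
for h in range(hours):                 # one model per hour of the day
    series = prices[:, h]
    fit = AutoReg(series, lags=7).fit()
    forecast[h] = fit.predict(start=len(series), end=len(series))[0]
print(np.round(forecast, 2))
```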

Relevance: 60.00%

Abstract:

Electricity price forecasting is an interesting problem for all the agents involved in electricity market operation. For instance, every profit maximisation strategy is based on the computation of accurate one-day-ahead forecasts, which is why electricity price forecasting has been a growing field of research in recent years. In addition, the increasing concern about environmental issues has led to a high penetration of renewable energies, particularly wind. In some European countries such as Spain, Germany and Denmark, renewable energy is having a deep impact on the local power markets. In this paper, we propose an optimal model from the perspective of forecasting accuracy; it consists of a combination of several univariate and multivariate time series methods that account for the amount of energy produced with clean energies, particularly wind and hydro, which are the most relevant renewable energy sources in the Iberian Market. This market is used to illustrate the proposed methodology, as it is one of the markets in which wind power production is most relevant as a percentage of total demand, but our method can of course be applied to any other liberalised power market. As far as our contribution is concerned, first, the methodology proposed by García-Martos et al. (2007, 2012) is generalised in two ways: we allow the incorporation of wind power production and hydro reservoirs, and we do not impose the restriction of using the same model for all 24 hours. A computational experiment and a Design of Experiments (DOE) are performed for this purpose. Then, for those hours in which two or more models show no statistically significant differences in forecasting accuracy, a combination of forecasts is proposed by weighting the best models (according to the DOE) and minimising the Mean Absolute Percentage Error (MAPE), the most popular accuracy metric for comparing electricity price forecasting models. We construct the combination of forecasts by solving several nonlinear optimisation problems that compute the optimal weights for the combination. The results are obtained by a large computational experiment that entails calculating out-of-sample forecasts for every hour of every day in the period from January 2007 to December 2009. In addition, to reinforce the value of our methodology, we compare our results with those reported in recent published works in the field; this comparison shows the superiority of our methodology in terms of forecasting accuracy.
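
A minimal sketch of the weight-finding step, assuming synthetic forecasts from three hypothetical models: the weights are constrained to the unit simplex and chosen to minimise MAPE = (100/N) Σ |(y_t − ŷ_t)/y_t| with a generic nonlinear solver.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
actual = 50 + rng.normal(0, 5, 200)                  # synthetic prices
preds = np.stack([actual + rng.normal(b, 3, 200)     # three biased "models"
                  for b in (2.0, -1.0, 0.5)])

def mape(w):
    combo = w @ preds                                # weighted combination
    return np.mean(np.abs((actual - combo) / actual)) * 100

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1},)   # weights sum to 1
res = minimize(mape, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
               constraints=cons)
print("weights:", np.round(res.x, 3), "MAPE:", round(res.fun, 3))
```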

Relevance: 60.00%

Abstract:

In this work, an electricity price forecasting model is developed. The performance of the proposed approach is improved by considering renewable energies (wind power and hydro generation) as explanatory variables. Additionally, the resulting forecasts are obtained as an optimal combination of a set of several univariate and multivariate time series models. The large computational experiment carried out, using out-of-sample forecasts for every hour and day, allows statistically sound conclusions to be drawn.

Relevance: 60.00%

Abstract:

The classical problem of thermal explosion is modified so that the chemically active gas is not at rest but is flowing in a long cylindrical pipe. Up to a certain section the heat-conducting walls of the pipe are held at low temperature so that the reaction rate is small and there is no heat release; at that section the ambient temperature is increased and an exothermic reaction begins. The question is whether a slow reaction regime will be established or a thermal explosion will occur. The mathematical formulation of the problem is presented. It is shown that when the pipe radius is larger than a critical value, the solution of the new problem exists only up to a certain distance along the axis. The critical radius is determined by conditions in a problem with a uniform axial temperature. The loss of existence is interpreted as a thermal explosion; the critical distance is the safe reactor’s length. Both laminar and developed turbulent flow regimes are considered. In a computational experiment the loss of the existence appears as a divergence of a numerical procedure; numerical calculations reveal asymptotic scaling laws with simple powers for the critical distance.
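
A minimal sketch of the kind of computational experiment described, assuming the Frank-Kamenetskii approximation for the source term, plug flow (the paper treats laminar and turbulent velocity profiles) and invented parameter values: the dimensionless temperature is marched along the axis, and runaway growth is read as the thermal explosion, the distance reached being the critical length.

```python
# March d(theta)/dz = (1/r) d/dr (r d(theta)/dr) + delta * exp(theta)
# down the pipe; divergence of theta marks the thermal explosion.
import numpy as np

nr, R = 50, 1.0
dr = R / nr
r = np.linspace(0.0, R, nr + 1)
delta = 2.5                 # Frank-Kamenetskii parameter (supercritical choice)
dz = 1e-4                   # marching step along the pipe axis
theta = np.zeros(nr + 1)    # dimensionless excess temperature at the inlet

z = 0.0
while z < 10.0:
    lap = np.zeros_like(theta)
    # radial Laplacian (1/r) d/dr (r d(theta)/dr), interior stencil
    lap[1:-1] = ((theta[2:] - 2 * theta[1:-1] + theta[:-2]) / dr**2
                 + (theta[2:] - theta[:-2]) / (2 * dr * r[1:-1]))
    lap[0] = 4 * (theta[1] - theta[0]) / dr**2     # symmetry at the axis
    theta = theta + dz * (lap + delta * np.exp(theta))
    theta[-1] = 0.0                                # wall held at ambient temp
    z += dz
    if theta.max() > 10.0:                         # numerical blow-up
        print(f"thermal explosion: runaway at z ~ {z:.3f}")
        break
else:
    print("slow reaction regime established along the whole pipe")
```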

Relevance: 60.00%

Abstract:

Multi-objective problems may have many optimal solutions, which together form the Pareto optimal set. A class of heuristic algorithms for these problems, here called optimizers, produces approximations of this optimal set. The approximation set kept by the optimizer may be limited or unlimited. The benefit of using an unlimited archive is the guarantee that all nondominated solutions generated during the process are saved; however, due to the large number of solutions that can be generated, keeping such an archive and frequently comparing new solutions with the stored ones may demand a high computational cost. The alternative is to use a limited archive, which raises the problem of having to discard nondominated solutions when the archive is full. Some techniques have been proposed to handle this problem, but investigations show that none of them can reliably prevent the deterioration of the archive. This work investigates a technique to be used together with the ideas previously proposed in the literature for dealing with limited archives. The technique consists of keeping discarded solutions in a secondary archive and periodically recycling them, bringing them back into the optimization; three recycling methods are presented. To verify whether these ideas can improve the archive content during the optimization, they were implemented together with other techniques from the literature. A computational experiment with the NSGA-II, SPEA2, PAES, MOEA/D and NSGA-III algorithms, applied to many classes of problems, is presented. The potential and the difficulties of the proposed techniques are evaluated based on statistical tests.
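
A minimal sketch of the secondary-archive mechanism, assuming random candidate solutions, two minimised objectives and an invented discard rule; only the bounded archive, the recycle bin and the periodic re-injection mirror the technique described above.

```python
import random

CAPACITY = 20

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and a != b

archive, recycle_bin = [], []

def insert(sol):
    """Try to add `sol` to the bounded nondominated archive."""
    global archive
    if any(dominates(p, sol) for p in archive):
        return
    removed = [p for p in archive if dominates(sol, p)]
    archive = [p for p in archive if not dominates(sol, p)]
    recycle_bin.extend(removed)                # keep what gets displaced
    archive.append(sol)
    if len(archive) > CAPACITY:                # full: discard, but remember
        victim = max(archive, key=lambda p: p[0])   # stand-in discard rule
        archive.remove(victim)
        recycle_bin.append(victim)

for it in range(1000):
    insert((random.random(), random.random()))
    if it % 100 == 99 and recycle_bin:         # periodic recycling step
        for sol in random.sample(recycle_bin, min(5, len(recycle_bin))):
            insert(sol)                        # bring solutions back
print(len(archive), "archived,", len(recycle_bin), "in the recycle bin")
```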