895 results for 230117 Operations Research
Abstract:
Bound-constrained minimization is a subject of active research. To assess the performance of existing solvers, numerical evaluations and comparisons are carried out. The present work highlights arbitrary decisions that may have a crucial effect on the conclusions of numerical experiments. As a result, a detailed evaluation based on performance profiles is applied to the comparison of bound-constrained minimization solvers. Extensive numerical results are presented and analyzed.
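Performance profiles in the Dolan-Moré sense are straightforward to compute; below is a minimal sketch, assuming a `times` array of per-problem solver costs with `np.inf` marking failures (the data shown are illustrative, not from the paper):

```python
import numpy as np

def performance_profile(times, taus):
    # times: (n_problems, n_solvers) costs, np.inf marks a failure.
    # Returns rho[k, s]: fraction of problems that solver s handles within
    # a factor taus[k] of the best solver on each problem.
    best = times.min(axis=1, keepdims=True)   # best cost per problem
    ratios = times / best                     # r_{p,s} = t_{p,s} / min_s t_{p,s}
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])

# Example: 3 problems, 2 solvers; solver 2 fails on the second problem.
times = np.array([[1.0, 2.0], [3.0, np.inf], [2.0, 2.0]])
print(performance_profile(times, taus=np.array([1.0, 2.0, 4.0])))
```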
Abstract:
In this study, a dynamic programming approach to the unconstrained two-dimensional non-guillotine cutting problem is presented. The method extends the recently introduced recursive partitioning approach for the manufacturer's pallet loading problem. The approach involves two phases and uses bounds based on unconstrained two-staged and non-staged guillotine cutting. The method is able to find the optimal cutting pattern for a large number of problem instances of moderate size known in the literature, and no counterexample was found for which the approach fails to reach a known optimal solution. For instances whose required computer runtime is excessive, the approach is combined with simple heuristics to reduce its running time. Detailed numerical experiments show the reliability of the method. Journal of the Operational Research Society (2012) 63, 183-200. doi: 10.1057/jors.2011.6. Published online 17 August 2011.
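The guillotine bounds mentioned above come from a classical dynamic program; the following memoized sketch computes the unconstrained (non-staged) guillotine value of a w-by-h plate, with an illustrative piece list that is not data from the paper:

```python
from functools import lru_cache

# Illustrative piece list (width, height, value); not data from the paper.
pieces = [(2, 3, 7), (3, 2, 7), (1, 4, 3)]

@lru_cache(maxsize=None)
def guillotine_value(w, h):
    # Best value from a w x h plate using guillotine cuts only: either keep
    # the plate for the most valuable single piece that fits, or try every
    # vertical and horizontal first cut and recurse on the two halves.
    best = max([v for pw, ph, v in pieces if pw <= w and ph <= h], default=0)
    for x in range(1, w // 2 + 1):      # vertical cuts (x <= w - x by symmetry)
        best = max(best, guillotine_value(x, h) + guillotine_value(w - x, h))
    for y in range(1, h // 2 + 1):      # horizontal cuts
        best = max(best, guillotine_value(w, y) + guillotine_value(w, h - y))
    return best

print(guillotine_value(5, 5))   # optimal unconstrained guillotine value
```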
Abstract:
In this paper, we address the problem of defining the product mix in order to maximise a system's throughput. This problem is well known to be NP-complete; therefore, most contributions on the topic focus on developing heuristics that are able to obtain good solutions in a short CPU time. In particular, constructive heuristics are available for the problem, such as those by Fredendall and Lea and by Aryanezhad and Komijan. We propose a new constructive heuristic based on the Theory of Constraints and the Knapsack Problem. The computational results indicate that the proposed heuristic yields better results than the existing heuristics.
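The knapsack-flavoured core of such heuristics is to rank products by throughput per minute of the bottleneck resource and fill its capacity greedily. A hedged sketch follows; the field names and data are illustrative assumptions, not the authors' exact procedure:

```python
def product_mix(products, capacity):
    # products: dicts with 'name', 'demand' (max units), 'margin'
    # (throughput per unit) and 'time' (minutes per unit on the bottleneck).
    # capacity: available minutes on the bottleneck resource.
    mix, left = {}, capacity
    # TOC priority rule: rank by throughput per bottleneck minute.
    for p in sorted(products, key=lambda p: p['margin'] / p['time'], reverse=True):
        units = min(p['demand'], int(left // p['time']))
        if units > 0:
            mix[p['name']] = units
            left -= units * p['time']
    return mix

products = [
    {'name': 'P', 'demand': 100, 'margin': 45.0, 'time': 15.0},
    {'name': 'Q', 'demand': 50,  'margin': 60.0, 'time': 30.0},
]
print(product_mix(products, capacity=2400))   # {'P': 100, 'Q': 30}
```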
Abstract:
At each outer iteration of standard Augmented Lagrangian methods, one tries to solve a box-constrained optimization subproblem to some prescribed tolerance. In the continuous world, using exact arithmetic, this subproblem is always solvable; therefore, the possibility of finishing the subproblem resolution without satisfying the theoretical stopping conditions is not contemplated in usual convergence theories. However, in practice one may not be able to solve the subproblem to the required precision, for several reasons. One of them is that an excessively large penalty parameter may impair the performance of the box-constrained optimization solver. In this paper, a practical strategy for decreasing the penalty parameter in such situations is proposed. More generally, the different decisions that may be taken when, in practice, one is not able to solve the Augmented Lagrangian subproblem are discussed. As a result, an improved Augmented Lagrangian method is presented, which takes numerical difficulties into account in a satisfactory way while preserving a suitable convergence theory. Numerical experiments involving all the test problems of the CUTEr collection are presented.
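A minimal sketch of the outer loop with the penalty-decrease safeguard described above (equality constraints only; `solve_subproblem` is an assumed user-supplied box-constrained solver that reports whether it met its tolerance, not the paper's exact algorithm):

```python
import numpy as np

def augmented_lagrangian(h, solve_subproblem, x0, lam0, rho=10.0,
                         tau=0.5, gamma=10.0, shrink=0.5, tol=1e-8, max_outer=50):
    # h: equality constraints (vector-valued callable);
    # solve_subproblem(x, lam, rho) -> (x_new, success_flag).
    x, lam = x0, lam0
    feas_prev = np.inf
    for _ in range(max_outer):
        x_new, ok = solve_subproblem(x, lam, rho)
        if not ok:
            rho *= shrink            # safeguard: a huge rho impaired the solver
            continue
        x = x_new
        feas = np.linalg.norm(h(x), np.inf)
        if feas <= tol:
            return x, lam
        lam = lam + rho * h(x)       # first-order multiplier update
        if feas > tau * feas_prev:   # insufficient feasibility progress
            rho *= gamma
        feas_prev = feas
    return x, lam
```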
Abstract:
Companies are currently choosing to integrate logics and systems to achieve better solutions, including efforts to combine the logic of the material requirements planning (MRP) system with lean production systems. The purpose of this article was to design an MRP as part of the implementation of an enterprise resource planning (ERP) system in a company that produces agricultural implements and has used the lean production system since 1998. This proposal is based on innovation theory, network theory, lean production systems, ERP systems and hybrid production systems, which combine components of MRP systems with concepts of lean production systems. The analytical approach of innovation networks enables verification of the links and relationships among companies and among departments of the same corporation. The analysis begins with the MRP implementation project carried out in a Brazilian metallurgical company and follows the operationalisation of the MRP project through to the stabilisation of production. The main point is that the MRP system should support the company's operations by providing the agility to respond in time to demand fluctuations, facilitating the creation process and the control of branch offices in other countries that use components produced at the headquarters, hence ensuring more accurate estimates of stockpiles. Consequently, the article presents the enterprise knowledge development organisational modelling methodology in order to represent further models (goals, actors and resources, business rules, business processes and concepts) that should be included in this MRP implementation process for the new configuration of the production system.
Abstract:
The aim of solving the Optimal Power Flow problem is to determine the optimal state of an electric power transmission system, that is, the voltage magnitudes and phase angles and the tap ratios of the transformers that optimize the performance of a given system while satisfying its physical and operating constraints. The Optimal Power Flow problem is modeled as a large-scale mixed-discrete nonlinear programming problem. This paper proposes a penalty-function method for handling the discrete variables of the problem. Due to the inclusion of the penalty function into the objective function, a sequence of nonlinear programming problems with only continuous variables is obtained, and the solutions of these problems converge to a solution of the mixed problem. The resulting nonlinear programming problems are solved by a Primal-Dual Logarithmic-Barrier Method. Numerical tests using the IEEE 14, 30, 118 and 300-bus test systems indicate that the method is efficient. (C) 2012 Elsevier B.V. All rights reserved.
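The abstract does not spell out the penalty function; one common device for pulling a relaxed variable onto a discrete grid is a sinusoidal penalty, sketched below with illustrative tap limits and step size (an assumption, not necessarily the authors' function):

```python
import numpy as np

def discrete_penalty(taps, lo=0.9, step=0.0125, weight=1.0):
    # Smooth penalty that vanishes exactly on the admissible grid
    # {lo, lo + step, lo + 2*step, ...} and is positive elsewhere, so adding
    # it to the OPF objective drives each tap ratio toward a discrete value.
    # Solving a sequence of continuous problems with growing `weight`
    # mimics the continuation scheme described in the abstract.
    return weight * np.sum(np.sin(np.pi * (taps - lo) / step) ** 2)

taps = np.array([0.9625, 0.9871])   # first tap on-grid, second off-grid
print(discrete_penalty(taps))       # only the off-grid tap contributes
```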
Abstract:
Existing studies of on-line process control are concerned with economic aspects, and the parameters of the processes are optimized with respect to the average cost per item produced. However, an equally important dimension is the adoption of an efficient maintenance policy. In most cases, only the frequency of the corrective adjustment is evaluated, because it is assumed that the equipment becomes "as good as new" after corrective maintenance. For this condition to be met, a sophisticated and detailed corrective adjustment system needs to be employed. The aim of this paper is to propose an integrated economic model incorporating two dimensions: on-line process control and a corrective maintenance program. Both are optimized jointly by minimizing the average cost per item. Adjustments are based on the location of the measurement of a quality characteristic of interest within three decision zones. Numerical examples illustrate the proposal. (c) 2012 Elsevier B.V. All rights reserved.
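The abstract does not define the zones; a plausible reading of a three-zone adjustment rule, with purely illustrative thresholds and action names, is:

```python
def decide(x, inner, outer):
    # x: measured deviation of the quality characteristic from its target.
    # Thresholds inner < outer are illustrative; in the paper such limits
    # would be chosen to minimize the average cost per item.
    if abs(x) <= inner:
        return "continue"              # central zone: process deemed in control
    if abs(x) <= outer:
        return "adjust"                # intermediate zone: on-line adjustment
    return "corrective_maintenance"    # outer zone: stop and repair
```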
Abstract:
In this paper, we carry out robust modeling and influence diagnostics in Birnbaum-Saunders (BS) regression models. Specifically, we present some aspects related to BS and log-BS distributions and their generalizations from the Student-t distribution, and we develop BS-t regression models, including maximum likelihood estimation based on the EM algorithm and diagnostic tools. In addition, we apply the obtained results to real insurance data, which illustrates the usefulness of the proposed model. Copyright (c) 2011 John Wiley & Sons, Ltd.
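As background, the classical BS density underlying these models is f(t) = phi(a_t)(t + beta) / (2 alpha sqrt(beta) t^{3/2}) with a_t = (sqrt(t/beta) - sqrt(beta/t)) / alpha, where phi is the standard normal density (the BS-t models replace this normal kernel by a Student-t one). A minimal maximum likelihood fit of the plain BS model, not the paper's EM machinery, might look like:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def bs_logpdf(t, alpha, beta):
    # Log-density of the Birnbaum-Saunders distribution.
    a = (np.sqrt(t / beta) - np.sqrt(beta / t)) / alpha
    return norm.logpdf(a) + np.log(t + beta) - np.log(2 * alpha * np.sqrt(beta) * t ** 1.5)

def fit_bs(t):
    # ML estimates of (alpha, beta); log-parameters keep both positive.
    nll = lambda p: -np.sum(bs_logpdf(t, np.exp(p[0]), np.exp(p[1])))
    res = minimize(nll, x0=np.zeros(2), method="Nelder-Mead")
    return np.exp(res.x)

# Simulated check via T = beta * (alpha*Z/2 + sqrt((alpha*Z/2)**2 + 1))**2.
rng = np.random.default_rng(0)
z = rng.normal(size=500)
t = 1.5 * (0.4 * z / 2 + np.sqrt((0.4 * z / 2) ** 2 + 1)) ** 2
print(fit_bs(t))   # should be near (0.4, 1.5)
```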
Abstract:
We introduce a new Integer Linear Programming (ILP) approach for solving Integer Programming (IP) problems with bilinear objectives and linear constraints. The approach relies on a series of ILP approximations of the bilinear IP. We compare this approach with standard linearization techniques on random instances and on a set of real-world product bundling problems. (C) 2011 Elsevier B.V. All rights reserved.
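The "standard linearization" baseline replaces each binary product x_i * y_j by an auxiliary variable z_ij with three linking constraints. A small hedged sketch with PuLP and made-up coefficients (a 2x2 instance, not one of the paper's):

```python
from pulp import LpProblem, LpVariable, LpMaximize, LpBinary, lpSum

c = {(0, 0): 3, (0, 1): -1, (1, 0): 2, (1, 1): 4}   # bilinear coefficients
prob = LpProblem("bilinear_linearized", LpMaximize)
x = [LpVariable(f"x{i}", cat=LpBinary) for i in range(2)]
y = [LpVariable(f"y{j}", cat=LpBinary) for j in range(2)]
z = {(i, j): LpVariable(f"z{i}{j}", cat=LpBinary) for i in range(2) for j in range(2)}

prob += lpSum(c[ij] * z[ij] for ij in z)   # linearized bilinear objective
for (i, j), zij in z.items():
    prob += zij <= x[i]                    # z = 1 forces x = 1
    prob += zij <= y[j]                    # z = 1 forces y = 1
    prob += zij >= x[i] + y[j] - 1         # x = y = 1 forces z = 1
prob += lpSum(x) + lpSum(y) <= 3           # an illustrative linear constraint

prob.solve()
print({v.name: v.value() for v in prob.variables()})
```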
The boundedness of penalty parameters in an augmented Lagrangian method with constrained subproblems
Abstract:
Augmented Lagrangian methods are effective tools for solving large-scale nonlinear programming problems. At each outer iteration, a minimization subproblem with simple constraints, whose objective function depends on the updated Lagrange multipliers and penalty parameters, is approximately solved. When the penalty parameter becomes very large, solving the subproblem becomes difficult; therefore, the effectiveness of this approach is associated with the boundedness of the penalty parameters. In this paper, it is proved that the penalty parameters are bounded under assumptions more natural than those employed in previous works. To prove the new boundedness result, the original algorithm is slightly modified. Numerical consequences of the modifications are discussed and computational experiments are presented.
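In Algencan-style implementations, the penalty parameter is increased only when a joint infeasibility/complementarity measure fails to improve; a sketch of that test (notation assumed from the literature, not taken from this paper) is:

```python
import numpy as np

def infeasibility_measure(h_val, g_val, mu, rho):
    # sigma = max(||h(x)||_inf, ||max(g(x), -mu/rho)||_inf): the usual joint
    # measure of infeasibility and complementarity for h(x) = 0, g(x) <= 0.
    return max(np.linalg.norm(h_val, np.inf),
               np.linalg.norm(np.maximum(g_val, -mu / rho), np.inf))

def update_penalty(rho, sigma, sigma_prev, tau=0.5, gamma=10.0):
    # Keep rho when sigma improved by the factor tau; increase it otherwise.
    # Boundedness results of the kind proved in the paper assert that, under
    # suitable assumptions, the increase branch eventually stops firing.
    return rho if sigma <= tau * sigma_prev else gamma * rho
```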
Abstract:
A chaotic encryption algorithm is proposed based on "Life-like" cellular automata (CA), which act as a pseudo-random number generator (PRNG). The paper's main focus is the application of chaos theory to cryptography, so CA were explored in search of this chaotic behaviour. Accordingly, the manuscript concentrates on tests such as the Lyapunov exponent, entropy and Hamming distance to measure chaos in CA, as well as statistical analyses such as the DIEHARD and ENT suites. Our results achieved higher randomness quality than other ciphers in the literature. These results reinforce the supposition of a strong relationship between chaos and randomness quality. Thus, the chaotic behaviour of CA is a good reason for employing them in cryptography, in addition to their simplicity, low implementation cost and respectable encryption power. (C) 2012 Elsevier Ltd. All rights reserved.
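A Life-like rule is defined entirely by its birth/survive neighbour counts (Conway's Life is B3/S23). A minimal sketch of such a CA driving a bit stream follows; the rule, grid size and single-cell extraction are illustrative assumptions, not the paper's cipher:

```python
import numpy as np

def step(grid, birth={3}, survive={2, 3}):
    # One synchronous update of a Life-like CA (default B3/S23) on a torus.
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    return np.where(grid == 1,
                    np.isin(n, list(survive)),
                    np.isin(n, list(birth))).astype(np.uint8)

def prng_bits(seed_grid, n_bits, cell=(0, 0)):
    # Sample one cell per generation to build a keystream-like bit sequence.
    g, bits = seed_grid.copy(), []
    for _ in range(n_bits):
        g = step(g)
        bits.append(int(g[cell]))
    return bits

rng = np.random.default_rng(42)
bits = prng_bits(rng.integers(0, 2, (64, 64), dtype=np.uint8), n_bits=128)
print(bits[:16])
```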
Abstract:
Texture image analysis is an important field of investigation that has attracted attention from the computer vision community in recent decades. In this paper, a novel approach for texture image analysis is proposed that combines graph theory and partially self-avoiding deterministic walks. From the image, we build a regular graph in which each vertex represents a pixel and is connected to neighboring pixels (pixels whose spatial distance is less than a given radius). Transformations on the regular graph are applied to emphasize different image features. To characterize the transformed graphs, partially self-avoiding deterministic walks are performed to compose the feature vector. Experimental results on three databases indicate that the proposed method significantly improves the correct classification rate compared to the state of the art, e.g. from 89.37% (original tourist walk) to 94.32% on the Brodatz database, from 84.86% (Gabor filter) to 85.07% on the Vistex database and from 92.60% (original tourist walk) to 98.00% on the plant leaves database. In view of these results, the method is expected to perform well in other applications, such as texture synthesis and texture segmentation. (C) 2012 Elsevier Ltd. All rights reserved.
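A partially self-avoiding deterministic walk ("tourist walk") moves from the current vertex to its nearest neighbour not visited in the last mu steps, eventually falling into a cycle; the transient length and cycle period are the quantities texture descriptors typically histogram. A hedged sketch over a generic distance matrix (the paper's graph transformations are not reproduced here):

```python
import numpy as np

def tourist_walk(dist, start, mu):
    # dist: (n, n) symmetric matrix of pairwise distances between vertices.
    # Returns (transient_length, attractor_period) of the deterministic walk.
    n = dist.shape[0]
    walk, seen = [start], {}
    while True:
        state = tuple(walk[-(mu + 1):])      # enough history to fix the future
        if state in seen:                    # repeated state: on the attractor
            t = seen[state]
            return t, len(walk) - 1 - t      # transient length, cycle period
        seen[state] = len(walk) - 1
        current = walk[-1]
        recent = set(walk[-mu:]) if mu > 0 else set()
        options = [j for j in range(n) if j != current and j not in recent]
        if not options:                      # trapped vertex: degenerate attractor
            return len(walk) - 1, 0
        walk.append(min(options, key=lambda j: dist[current, j]))

# Example on a random point set (illustrative, not image data).
rng = np.random.default_rng(0)
pts = rng.random((30, 2))
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
print(tourist_walk(d, start=0, mu=2))
```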
Abstract:
Fraud is a global problem that has demanded increasing attention due to the accelerated expansion of modern technology and communication. When statistical techniques are used to detect fraud, a critical factor is whether the fraud detection model is accurate enough to classify a case correctly as fraudulent or legitimate. In this context, the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining predictions from models fitted to several replicated datasets and then to combine them into a single predictive classification in order to improve accuracy. In this paper, we present a pioneering study of the performance of discrete and continuous k-dependence probabilistic networks within the context of bagging predictors for classification. Through a large simulation study and various real datasets, we found that the probabilistic networks are a strong modeling option, with high predictive capacity that increases markedly under the bagging procedure when compared to traditional techniques. (C) 2012 Elsevier Ltd. All rights reserved.
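The bagging procedure itself is library-standard; a minimal scikit-learn sketch on simulated stand-in data follows (scikit-learn ships no k-dependence Bayesian network, so Gaussian naive Bayes stands in for the paper's k-DB base learner):

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# Simulated transaction features X and fraud labels y (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# Bagging: each of 50 base learners is fitted to a bootstrap replicate of
# the data, and their predictions are aggregated into one classification.
bagged = BaggingClassifier(estimator=GaussianNB(), n_estimators=50,
                           bootstrap=True, random_state=0)
print(cross_val_score(bagged, X, y, cv=5, scoring="roc_auc").mean())
```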
Abstract:
In multi-label classification, examples can be associated with multiple labels simultaneously. The task of learning from multi-label data can be addressed by methods that transform the multi-label classification problem into several single-label classification problems. The binary relevance approach is one of these methods: the multi-label learning task is decomposed into several independent binary classification problems, one for each label in the label set, and the final labels for each example are determined by aggregating the predictions of all binary classifiers. However, this approach fails to consider any dependency among the labels. Aiming to accurately predict label combinations, in this paper we propose a simple approach that enables the binary classifiers to discover existing label dependencies by themselves. An experimental study using decision trees, a kernel method and Naive Bayes as base learning techniques shows the potential of the proposed approach to improve multi-label classification performance.
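One way to let binary classifiers discover label dependencies, in the spirit described above, is to feed each label's classifier the remaining labels as extra features (true labels at training time, stage-one estimates at prediction time). A hedged scikit-learn sketch, not necessarily the authors' exact scheme:

```python
import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier

def fit_br_plus(X, Y, base=DecisionTreeClassifier()):
    # Y: (n_samples, n_labels) binary label matrix.
    # Stage 1: plain binary relevance, one classifier per label.
    first = [clone(base).fit(X, Y[:, j]) for j in range(Y.shape[1])]
    # Stage 2: each label's classifier also sees the *other* true labels.
    second = []
    for j in range(Y.shape[1]):
        others = np.delete(Y, j, axis=1)
        second.append(clone(base).fit(np.hstack([X, others]), Y[:, j]))
    return first, second

def predict_br_plus(models, X):
    first, second = models
    Yhat = np.column_stack([m.predict(X) for m in first])   # stage-1 estimates
    preds = []
    for j, m in enumerate(second):
        others = np.delete(Yhat, j, axis=1)                 # estimated labels
        preds.append(m.predict(np.hstack([X, others])))
    return np.column_stack(preds)
```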
Abstract:
Statistical methods have been widely employed to assess the capabilities of credit scoring classification models in order to reduce the risk of wrong decisions when granting credit facilities to clients. The predictive quality of a classification model can be evaluated based on measures such as sensitivity, specificity, predictive values, accuracy, correlation coefficients and information-theoretical measures such as relative entropy and mutual information. In this paper we analyze the performance of a naive logistic regression model (Hosmer & Lemeshow, 1989) and a logistic regression with state-dependent sample selection model (Cramer, 2004) applied to simulated data. As a case study, the methodology is also illustrated on a data set extracted from a Brazilian bank portfolio. Our simulation results revealed no statistically significant difference in predictive capacity between the naive logistic regression models and the logistic regression with state-dependent sample selection models. However, there is a strong difference between the distributions of the estimated default probabilities from the two modeling techniques, with the naive logistic regression models always underestimating such probabilities, particularly in the presence of balanced samples. (C) 2012 Elsevier Ltd. All rights reserved.
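The naive baseline of this comparison is an ordinary logistic fit; a simulated sketch is shown below (Cramer's state-dependent sample selection model has no off-the-shelf scikit-learn implementation, so only the naive side appears, on made-up portfolio data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated applicant features and defaults (1 = default); illustrative only.
rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 4))
p_true = 1 / (1 + np.exp(-(-2.0 + X @ np.array([0.8, -0.5, 0.3, 0.0]))))
y = rng.binomial(1, p_true)

naive = LogisticRegression().fit(X, y)
p_hat = naive.predict_proba(X)[:, 1]
# The underestimation reported in the paper would show up as p_hat lying
# systematically below p_true, especially after balancing the sample.
print(p_hat.mean(), p_true.mean())
```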