912 results for cutting stock problem with setups
Resumo:
The network design problem has been widely studied in operations research, both for its structural properties and for its applications in many fields such as transportation, communications, and logistics. In this thesis we focus on solving the uncapacitated fixed-charge network design problem: satisfying the demands of all commodities while minimizing the sum of the commodity transportation costs and the fixed design costs of the network. The problem is usually modelled as an integer linear program that also includes continuous variables. To solve it, we applied an exact Branch-and-Bound method based on a linear relaxation of the problem with a stopping criterion, exploiting column generation and cut generation. We tested the resulting Branch-and-Price-and-Cut method on 156 instances divided into five groups of different sizes, and compared it to Cplex, one of the best mathematical optimization solvers, as well as to a Branch-and-Cut method. Our method is competitive and performs best on large instances with many commodities.
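As a hedged sketch (the notation below is illustrative, not taken from the thesis), the uncapacitated fixed-charge network design problem described above is commonly written as a mixed-integer program:

```latex
% Illustrative model: graph G=(N,A), commodities k in K with demand d^k,
% fixed design cost f_{ij} per arc, transport cost c_{ij}^k per unit of flow;
% b_i^k is d^k at the commodity's origin, -d^k at its destination, 0 elsewhere.
\begin{align*}
\min\ & \sum_{(i,j)\in A} f_{ij}\, y_{ij}
        + \sum_{k\in K} \sum_{(i,j)\in A} c_{ij}^{k}\, x_{ij}^{k} \\
\text{s.t.}\ & \sum_{j:(i,j)\in A} x_{ij}^{k}
        - \sum_{j:(j,i)\in A} x_{ji}^{k} = b_i^{k}
        && \forall i \in N,\ \forall k \in K \\
 & x_{ij}^{k} \le d^{k}\, y_{ij} && \forall (i,j) \in A,\ \forall k \in K \\
 & x_{ij}^{k} \ge 0, \quad y_{ij} \in \{0,1\}
\end{align*}
```

The coupling constraints $x_{ij}^{k} \le d^{k} y_{ij}$ force the fixed cost $f_{ij}$ to be paid on every arc that carries flow; their linear relaxation is what the Branch-and-Price-and-Cut scheme strengthens with column and cut generation.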
Optimal Methodology for Synchronized Scheduling of Parallel Station Assembly with Air Transportation
Resumo:
We present an optimal methodology for synchronized scheduling of production assembly with air transportation to achieve accurate delivery at minimized cost in a consumer electronics supply chain (CESC). The problem was motivated by a major PC manufacturer in the consumer electronics industry, which must schedule deliveries to meet customer needs in different parts of South East Asia. The overall problem is decomposed into two sub-problems: an air transportation allocation problem and an assembly scheduling problem. The air transportation allocation problem is formulated as a linear programming problem with earliness and tardiness penalties for job orders. In the assembly scheduling problem, the job orders must be sequenced on the assembly stations so as to minimize their waiting times before they are shipped by flights to their destinations. The second sub-problem is therefore modelled as a scheduling problem with earliness penalties, which are assumed to be independent of the job orders.
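The earliness-tardiness objective described above can be sketched as follows. This is a hedged illustration, not the paper's model: the names (`et_penalty`, `alpha`, `beta`) and the penalty weights are assumptions for the sake of the example.

```python
# Illustrative earliness-tardiness cost of matching job-order completion
# times to flight departures (alpha, beta are made-up penalty rates).

def et_penalty(completion, departure, alpha=1.0, beta=3.0):
    """Penalty for one job order.

    completion: time the job order finishes assembly
    departure:  departure time of its assigned flight
    alpha:      per-unit earliness (waiting) cost
    beta:       per-unit tardiness (missed-delivery) cost
    """
    if completion <= departure:
        return alpha * (departure - completion)  # early: waiting cost
    return beta * (completion - departure)       # late: tardiness cost

def total_penalty(jobs):
    """Sum of penalties over (completion_time, flight_departure) pairs."""
    return sum(et_penalty(c, d) for c, d in jobs)
```

For instance, a job finishing at time 5 for a flight at time 8 waits 3 units, while one finishing at time 10 for the same flight is 2 units late and pays the higher tardiness rate.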
Resumo:
The Kelvin-Helmholtz (KH) problem, with zero stratification, is examined as a limiting case of the Rayleigh model of a single shear layer whose width tends to zero. The transition of the Rayleigh modal dispersion relation to the KH one, as well as the disappearance of the supermodal transient growth in the KH limit, are both rationalized from the counterpropagating Rossby wave perspective.
Resumo:
4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis, the model error formulation and the state estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised `inner-loop' objective function which, upon convergence, updates the solution of the non-linear `outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, while also indicating the difficulty one may encounter while iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure for the sensitivity of the problem to input data. The condition number can also indicate the rate of convergence and solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process minimising both wc4DVAR objective functions to the internal assimilation parameters composing the problem. We gain insight into these sensitivities by bounding the condition number of the Hessians of both objective functions. We also precondition the model error objective function and show improved convergence. Using the bounds, we show that both formulations' sensitivities are related to the error variance balance, the assimilation window length and the correlation length-scales.
We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
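The condition number that drives the analysis above is the ratio of the Hessian's extreme eigenvalues, kappa = lambda_max / lambda_min. As a hedged, self-contained illustration (not the thesis's experiments), here it is for a symmetric positive-definite 2x2 Hessian using the closed-form eigenvalues:

```python
import math

# Condition number of the symmetric 2x2 matrix [[a, b], [b, c]]:
# a large value signals a badly-conditioned, slow-to-minimise objective.

def condition_number_2x2(a, b, c):
    mean = (a + c) / 2.0
    # half-spread of the two eigenvalues around their mean
    radius = math.sqrt(((a - c) / 2.0) ** 2 + b ** 2)
    lam_max, lam_min = mean + radius, mean - radius
    if lam_min <= 0:
        raise ValueError("Hessian is not positive definite")
    return lam_max / lam_min
```

For example, the Hessian [[2, 1], [1, 2]] has eigenvalues 3 and 1, hence condition number 3; gradient methods converge more slowly as this ratio grows, which is why preconditioning the model error formulation pays off.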
Resumo:
We consider the two-level network design problem with intermediate facilities. This problem consists of designing a minimum cost network respecting some requirements, usually described in terms of the network topology or in terms of a desired flow of commodities between source and destination vertices. Each selected link must receive one of two types of edge facilities, and the connection of different edge facilities requires a costly and capacitated vertex facility. We propose a hybrid decomposition approach that heuristically obtains tentative solutions for the number and location of vertex facilities and uses these solutions to limit the computational burden of a branch-and-cut algorithm. We test our method on instances of the power system secondary distribution network design problem. The results show that the method is efficient both in terms of solution quality and computational times. (C) 2010 Elsevier Ltd. All rights reserved.
Resumo:
We consider a Moyal plane and propose to make the noncommutativity parameter Theta(mu nu) bifermionic, i.e. composed of two fermionic (Grassmann-odd) parameters. The Moyal product then contains a finite number of derivatives, which avoids the difficulties of the standard approach. As an example, we construct a two-dimensional noncommutative field theory model based on the Moyal product with a bifermionic parameter and show that it has a locally conserved energy-momentum tensor. The model poses no problem for canonical quantization and appears to be renormalizable.
Resumo:
This paper presents the formulation of a combinatorial optimization problem with the following characteristics: (i) the search space is the power set of a finite set structured as a Boolean lattice; (ii) the cost function forms a U-shaped curve when applied to any lattice chain. This formulation applies to feature selection in the context of pattern recognition. The known approaches to this problem are branch-and-bound algorithms and heuristics that explore the search space only partially. Branch-and-bound algorithms are equivalent to the full search, while heuristics are not. This paper presents a branch-and-bound algorithm that differs from the known ones by exploiting the lattice structure and the U-shaped chain curves of the search space. The main contribution of this paper is the architecture of this algorithm, which is based on the representation and exploration of the search space through new lattice properties proven here. Several experiments with well-known public data indicate the superiority of the proposed method over sequential floating forward selection (SFFS), a popular heuristic that gives good results in very short computational time. In all experiments, the proposed method achieved better or equal results in similar or even shorter computational time. (C) 2009 Elsevier Ltd. All rights reserved.
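The pruning power of the U-shaped chain property can be sketched as follows. This is a hedged illustration of the idea only, not the paper's algorithm: `best_on_chain`, the chain encoding, and the example cost are assumptions made for the example.

```python
# If the cost function is U-shaped along every chain of the Boolean lattice,
# then as soon as the cost rises while walking up a chain, no later element
# of that chain can improve on the best seen, so the rest can be pruned.

def best_on_chain(chain, cost):
    """Minimum of a U-shaped cost over a chain of feature subsets.

    chain: subsets ordered by inclusion, e.g. [set(), {0}, {0,1}, ...]
    cost:  real-valued function, U-shaped along the chain
    """
    best_elem, best_cost = chain[0], cost(chain[0])
    for elem in chain[1:]:
        c = cost(elem)
        if c > best_cost:
            break  # U-shape: cost can only keep rising from here on
        best_elem, best_cost = elem, c
    return best_elem, best_cost
```

On the chain {}, {0}, {0,1}, {0,1,2} with the U-shaped cost (|S| - 2)^2, the walk stops right after the minimum at {0,1}, never evaluating the tail; the full branch-and-bound applies this kind of cut across the whole lattice rather than a single chain.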
Resumo:
Prediction of random effects is an important problem with expanding applications. In the simplest context, the problem corresponds to prediction of the latent value (the mean) of a realized cluster selected via two-stage sampling. Recently, Stanek and Singer [Predicting random effects from finite population clustered samples with response error. J. Amer. Statist. Assoc. 99, 119-130] developed best linear unbiased predictors (BLUP) under a finite population mixed model that outperform BLUPs from mixed models and superpopulation models. Their setup, however, does not allow for unequally sized clusters. To overcome this drawback, we consider an expanded finite population mixed model based on a larger set of random variables that span a higher dimensional space than those typically applied to such problems. We show that BLUPs for linear combinations of the realized cluster means derived under such a model have considerably smaller mean squared error (MSE) than those obtained from mixed models, superpopulation models, and finite population mixed models. We motivate our general approach by an example developed for two-stage cluster sampling and show that it faithfully captures the stochastic aspects of sampling in the problem. We also consider simulation studies to illustrate the increased accuracy of the BLUP obtained under the expanded finite population mixed model. (C) 2007 Elsevier B.V. All rights reserved.
Resumo:
In this paper we give a proof of the existence of an orthogonal geodesic chord on a Riemannian manifold homeomorphic to a closed disk and with concave boundary. This kind of study is motivated by the link (proved in Giambo et al. (2005) [8]) of the multiplicity problem with the famous Seifert conjecture (formulated in Seifert (1948) [1]) about multiple brake orbits for a class of Hamiltonian systems at a fixed energy level. (C) 2010 Elsevier Ltd. All rights reserved.
Resumo:
In this paper we show the existence of three new families of stacked spatial central configurations for the six-body problem with the following properties: four bodies are at the vertices of a regular tetrahedron and the other two bodies are on a line connecting one vertex of the tetrahedron with the center of the opposite face. (c) 2009 Elsevier B.V. All rights reserved.
Resumo:
We develop portfolio choice theory taking into consideration the first p moments of the underlying assets distribution. A rigorous characterization of the opportunity set and of the efficient portfolios frontier is given, as well as of the solutions to the problem with a general utility function and short sales allowed. The extension of classical mean-variance properties, like two-fund separation, is also investigated. A general CAPM is derived, based on the theoretical foundations built, and its empirical consequences and testing are discussed.
Resumo:
The Anglo-Saxon ownership structure and its classic agency problem, with conflicts between managers and shareholders under dispersed ownership, is the exception rather than the rule, despite being predominant in the literature. Brazil, unlike the United States and England, has a concentrated ownership structure with a strong presence of controlling shareholders. In this case, the conflict observed is not between managers and shareholders (agent vs. principal conflict) but between controlling and minority shareholders (principal vs. principal conflict). In the Brazilian capital market there are two classes of shares, common shares (with voting rights) and preferred shares (without voting rights), which violates the one-share-one-vote rule in force in many countries, such as the United States. As a result, in many cases great voting power is combined with little commitment of the shareholder's own capital to the firm. Given this, the present study aimed to estimate the magnitude of the voting rights, cash-flow rights, and excess voting power of the controlling shareholders (of common shares) of the companies listed in the Bovespa index (Ibovespa) for 2009 and 2010 (the theoretical portfolio of the third four-month period of each year), separating them by industry and by type of controlling shareholder. A sample of 121 companies was analysed, with the methodology classified by purpose (descriptive and explanatory) and by means (bibliographic and documentary). Data were collected from the Economática system and from the CVM's IAN filings.
The research corroborated the hypotheses in the existing literature that the capital structure of Brazilian listed companies is concentrated, especially the voting capital (mean of 51.95% and median of 51.20% in 2009, and 47.16% and 51.70% in 2010), with a considerable gap in many cases between the voting power and the cash-flow power of the controlling shareholders (mean of 1.10 and median of 1.24 in 2009, and 1.07 and 0.98 in 2010). This also confirms that the principal vs. principal conflict is the predominant one in Brazil.
Resumo:
We extend the static portfolio choice problem with a small background risk to the case of small, partially correlated background risks. We show that, under the assumptions of the theories in which risk substitution appears, except for the independence of background risk, it is perfectly rational for the individual to increase his optimal exposure to portfolio risk when risks are partially negatively correlated. We then test empirically the hypothesis of risk substitutability using INSEE data on French households. We find that households respond to an increase in future earnings uncertainty by increasing their stockholdings. This conclusion contradicts results obtained in other countries. In light of these results, our model provides an explanation for the lack of empirical consensus in cross-country tests of risk substitution theory that encompasses and criticises all of them.
Resumo:
Who was the cowboy in Washington? What is the land of sushi? Most people have answers to these questions readily available, yet modern search engines, arguably the epitome of technology for finding answers to most questions, are completely unable to produce them. It seems that people need only a few information items to rapidly converge on a seemingly 'obvious' solution. We will study approaches to this problem, with two additional hard demands that constrain the space of possible theories: the sought model must be both psychologically and neuroscientifically plausible. Building on top of the mathematical model of memory called Sparse Distributed Memory, we will see how some well-known methods in cryptography can point toward a promising, comprehensive solution that preserves four crucial properties of human psychology.
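A minimal sketch of the Sparse Distributed Memory model mentioned above may help. This is a hedged toy version of Kanerva's scheme, not the thesis's model: the sizes (`N_BITS`, `N_LOCATIONS`, `RADIUS`) are illustrative assumptions. Data written at a binary address is spread over every hard location within a Hamming radius; a read sums the counters of the activated locations and takes a per-bit majority vote.

```python
import random

N_BITS, N_LOCATIONS, RADIUS = 32, 300, 16  # toy sizes, chosen arbitrarily

random.seed(42)
# fixed random hard locations, each with a bank of bit counters
hard_addresses = [[random.randint(0, 1) for _ in range(N_BITS)]
                  for _ in range(N_LOCATIONS)]
counters = [[0] * N_BITS for _ in range(N_LOCATIONS)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def write(address, data):
    for loc, ctr in zip(hard_addresses, counters):
        if hamming(address, loc) <= RADIUS:        # location is activated
            for i, bit in enumerate(data):
                ctr[i] += 1 if bit else -1         # bipolar counter update

def read(address):
    sums = [0] * N_BITS
    for loc, ctr in zip(hard_addresses, counters):
        if hamming(address, loc) <= RADIUS:
            for i in range(N_BITS):
                sums[i] += ctr[i]
    return [1 if s > 0 else 0 for s in sums]       # majority vote per bit
```

Writing a pattern at its own address and reading back, even from a moderately noisy address, typically recovers the stored pattern, which is the content-addressable, error-tolerant behaviour the text appeals to.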
Resumo:
Lubricants available on the market are of mineral or synthetic origin and are harmful to humans and the environment, mainly because of their improper disposal. Industries are therefore seeking to develop products with less environmental impact; to reduce operator exposure in particular, cutting fluids became oil-in-water or water-in-oil emulsions. The emulsion, however, was not considered the best answer to the environmental question, so the search for biodegradable, non-toxic lubricants continues, and vegetable oils are again being considered as a base for lubricant production. The main problem with these oils is their oxidative instability, which is intensified when working at high temperatures. The transesterification process reduces oxidation but changes some physical and chemical properties. Soybean oil was therefore subjected, after transesterification, to tests of density, dynamic viscosity, kinematic viscosity (calculated from the two preceding parameters), flash point, and acidity. Besides the physico-chemical tests, the soybean oil underwent a dynamic test in a tribometer adapted from a bench vise, in which the induced wear was adhesive, and it was finally used as a cutting fluid in a turning process on two different materials, 1045 steel and cast iron. In this last test the results fell short of the mineral cutting fluid against which it was compared in all tests; in some of the other experiments the results were satisfactory and in others they were not. Chemical additives may thus be added to the analysed oil to try to balance all parameters and so formulate a non-toxic biolubricant for the machining processes of the metalworking industry.
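The kinematic viscosity mentioned above is obtained from the two measured parameters as dynamic viscosity divided by density, nu = mu / rho. A hedged illustration with made-up values (not the thesis's measurements):

```python
# Kinematic viscosity from dynamic viscosity and density: nu = mu / rho.
# The sample values below are invented for illustration only.

def kinematic_viscosity(mu_pa_s, rho_kg_m3):
    """Return nu in m^2/s, given mu in Pa.s and rho in kg/m^3."""
    return mu_pa_s / rho_kg_m3
```

For example, an oil with a dynamic viscosity of 0.05 Pa.s and a density of 1000 kg/m^3 would have a kinematic viscosity of 5e-05 m^2/s (50 cSt).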