981 results for Cutting stock problem


Relevance:

100.00%

Publisher:

Abstract:

The focus of this thesis is to contribute to the development of new, exact solution approaches to different combinatorial optimization problems. In particular, we derive dedicated algorithms for a special class of Traveling Tournament Problems (TTPs), the Dial-A-Ride Problem (DARP), and the Vehicle Routing Problem with Time Windows and Temporal Synchronized Pickup and Delivery (VRPTWTSPD). Furthermore, we extend the concept of using dual-optimal inequalities for stabilized Column Generation (CG) and detail its application to improved CG algorithms for the cutting stock problem, the bin packing problem, the vertex coloring problem, and the bin packing problem with conflicts. In all approaches, we make use of some knowledge about the structure of the problem at hand to individualize and enhance existing algorithms. Specifically, we utilize knowledge about the input data (TTP), problem-specific constraints (DARP and VRPTWTSPD), and the dual solution space (stabilized CG). Extensive computational results proving the usefulness of the proposed methods are reported.

Relevance:

100.00%

Publisher:

Abstract:

Case studies in copper-alloy rolling mill companies showed that existing planning systems suffer from numerous shortcomings. Where computerised systems are in use, these tend simply to emulate older manual systems and still rely heavily on modification by experienced planners on the shop floor. As the size and number of orders increase, the task of process planners, while seeking to optimise the manufacturing objectives and keep within the production constraints, becomes extremely complicated because of the number of options for mixing or splitting the orders into batches. This thesis develops a modular approach to computerisation of the production management and planning functions. The full functional specification of each module is discussed, together with practical problems associated with their phased implementation. By adapting the Distributed Bill of Material concept from Material Requirements Planning (MRP) philosophy, the production routes generated by the planning system are broken down to identify the rolling stages required. Then, to optimise the use of material at each rolling stage, the system generates an optimal cutting pattern using a new algorithm that produces practical solutions to the cutting stock problem. It is shown that the proposed system can be accommodated on a microcomputer, which brings it within the reach of typical companies in the copper-alloy rolling industry, where profit margins are traditionally low and the cost of widespread use of mainframe computers would be prohibitive.
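The batching of orders into cutting patterns at each rolling stage is an instance of the one-dimensional cutting stock problem. As a minimal illustration of how such patterns can be generated — a generic first-fit-decreasing heuristic, not the new algorithm developed in the thesis — consider:

```python
def first_fit_decreasing(widths, stock_width):
    """Assign ordered widths to stock pieces; returns one cutting
    pattern (list of widths) per stock piece used."""
    patterns = []    # cut widths assigned to each stock piece
    remaining = []   # leftover width in each open stock piece
    for w in sorted(widths, reverse=True):
        for i, r in enumerate(remaining):
            if w <= r:               # first open piece with room
                patterns[i].append(w)
                remaining[i] -= w
                break
        else:                        # no piece fits: open a new one
            patterns.append([w])
            remaining.append(stock_width - w)
    return patterns
```

For example, orders of widths 5, 5, 4, 3, 3 cut from stock of width 10 are packed into two pieces with zero waste. Exact methods or the thesis's algorithm would be needed to certify optimality in general.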

Relevance:

100.00%

Publisher:

Abstract:

Cutting and packing problems are a family of combinatorial optimization problems that have been widely studied in numerous areas of industry and research, owing to their relevance in an enormous variety of real-world applications. They arise in many production industries where an available material or space must be subdivided into smaller parts. A wide variety of methods exist for solving this type of optimization problem. When proposing a solution method for an optimization problem, it is advisable to take into account the approach and the requirements relating to the problem and its solution. Exact approaches find the optimal solution, but they are only viable for very small problem instances. Heuristics exploit problem-specific knowledge to obtain high-quality solutions without excessive computational effort. Metaheuristics go a step further, as they are able to solve a very general class of computational problems. Finally, hyperheuristics attempt to automate, usually by incorporating learning techniques, the process of selecting, combining, generating, or adapting simpler heuristics to solve optimization problems efficiently. To get the most out of these methods, one needs to know, in addition to the type of optimization (single- or multi-objective) and the size of the problem, the computational resources available, since the use of parallel machines and implementations can considerably reduce the time needed to obtain a solution. In real-world industrial applications of cutting and packing problems, the difference between using a quickly obtained solution and using more sophisticated proposals to find the optimal solution can determine the survival of a company.
However, developing more sophisticated and effective proposals usually involves substantial computational effort, which in real applications can slow down the production process. Designing proposals that are both effective and efficient is therefore fundamental. For this reason, the main objective of this work is the design and implementation of effective and efficient methods for solving various cutting and packing problems. Moreover, if these methods are defined as schemes that are as general as possible, they can be applied to different cutting and packing problems without requiring many changes to adapt them to each one. Thus, taking into account the broad range of methodologies for solving optimization problems and the techniques available to increase their efficiency, several methods have been designed and implemented to solve a number of cutting and packing problems, seeking to improve on existing proposals in the literature. The problems addressed are: the Two-Dimensional Cutting Stock Problem, the Two-Dimensional Strip Packing Problem, and the Container Loading Problem. For each of these problems, a broad and thorough literature review was carried out, and the chosen variants were solved by applying different solution methods: single-objective exact methods and their parallelizations, and multi-objective approximate methods and their parallelizations. The single-objective exact methods were based on tree search techniques. As multi-objective approximate methods, multi-objective metaheuristics (MOEAs) were selected. In addition, to represent the individuals used by these methods, direct encodings based on postfix notation were employed, as well as encodings that use placement heuristics and hyperheuristics.
Some of these methodologies were improved using parallel schemes built with the OpenMP and MPI programming tools.
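As an illustration of the kind of placement heuristic such an encoding can invoke for the Two-Dimensional Strip Packing Problem, here is a minimal shelf next-fit sketch — a textbook baseline, not one of the methods developed in the thesis:

```python
def shelf_next_fit(rects, strip_width):
    """Pack (w, h) rectangles into a strip of fixed width and open-ended
    height. Rectangles are sorted by decreasing height and placed left to
    right on horizontal shelves; assumes each rectangle fits the strip width."""
    placements = []   # (x, y, w, h) for each packed rectangle
    shelf_y = 0       # y-coordinate of the bottom of the current shelf
    shelf_h = 0       # height of the current shelf
    x = 0             # next free x-position on the current shelf
    for w, h in sorted(rects, key=lambda r: -r[1]):
        if x + w > strip_width:   # no room left: open a new shelf
            shelf_y += shelf_h
            shelf_h = 0
            x = 0
        placements.append((x, shelf_y, w, h))
        x += w
        shelf_h = max(shelf_h, h)
    return placements, shelf_y + shelf_h   # layout and total strip height
```

In an MOEA, an individual's genotype would typically determine the packing order or heuristic choice, with the resulting strip height feeding the fitness evaluation.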

Relevance:

90.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

90.00%

Publisher:

Abstract:

This thesis investigates decomposition and reformulation approaches for solving Integer Linear Programming problems. This method is often very successful computationally, producing high-quality solutions for well-structured combinatorial optimization problems such as vehicle routing, cutting stock, p-median, and generalized assignment. However, until now the method has always been tailored to the specific problem under investigation. The principal innovation of this thesis is a new framework able to apply this concept to a generic MIP problem. The new approach is thus capable of auto-decomposition and auto-reformulation of the input problem, applicable as a black-box solution algorithm that works as a complement and alternative to standard solution techniques. The idea of decomposing and reformulating (usually called Dantzig-Wolfe Decomposition, DWD, in the literature) is, given a MIP, to convexify one or more subsets of constraints (the slaves) and to work on the partially convexified polyhedron(s) obtained. For a given MIP, several decompositions can be defined depending on which sets of constraints we choose to convexify. In this thesis we mainly reformulate MIPs using two sets of variables: the original variables and the extended variables (representing the exponentially many extreme points). The master constraints consist of the original constraints not included in any slave, plus the convexity constraint(s) and the linking constraints (ensuring that each original variable can be viewed as a linear combination of extreme points of the slaves). The solution procedure consists of iteratively solving the reformulated MIP (the master) and checking (pricing) whether a variable with negative reduced cost exists; if so, it is added to the master, which is solved again (column generation), otherwise the procedure stops.
The advantage of using DWD is that the reformulated relaxation gives bounds stronger than the original LP relaxation; in addition, it can be incorporated in a branch-and-bound scheme (Branch-and-Price) in order to solve the problem to optimality. If the computational time for the pricing problem is reasonable, this leads in practice to a substantial speed-up in solution time, especially when the convex hull of the slaves is easy to compute, usually because of its special structure.
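For the cutting stock problem, the pricing step described above is an unbounded knapsack over the dual values of the master's demand constraints (the classical Gilmore-Gomory scheme). A minimal sketch of that pricing subproblem, assuming a unit cost per stock roll so that the new column's reduced cost is one minus the pattern's total dual value (the dual values in the example are hypothetical):

```python
def price_pattern(lengths, duals, stock_len):
    """Gilmore-Gomory pricing: find the cutting pattern that maximises
    total dual value, via an unbounded knapsack DP over the stock length."""
    best = [0.0] * (stock_len + 1)   # best dual value using capacity c
    take = [-1] * (stock_len + 1)    # piece added at capacity c (-1: none)
    for c in range(1, stock_len + 1):
        best[c] = best[c - 1]
        for i, L in enumerate(lengths):
            if L <= c and best[c - L] + duals[i] > best[c]:
                best[c] = best[c - L] + duals[i]
                take[c] = i
    # Reconstruct the pattern (piece multiplicities) from the DP table.
    pattern = [0] * len(lengths)
    c = stock_len
    while c > 0:
        if take[c] == -1:
            c -= 1
        else:
            pattern[take[c]] += 1
            c -= lengths[take[c]]
    # With unit roll cost, the column's reduced cost is:
    return pattern, 1.0 - best[stock_len]
```

A negative reduced cost means the pattern enters the restricted master; when pricing returns only non-negative reduced costs, the column generation loop stops.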

Relevance:

80.00%

Publisher:

Abstract:

The dynamic prediction of complex reservoir development is an important research topic in the dynamic analysis of oil and gas development. As development proceeds, the permeabilities and porosities of reservoirs, and the permeability of a block reservoir at its boundaries, change dynamically. How to track these dynamic changes and determine the boundary permeability of a block reservoir is an important practical problem. The key to the dynamic prediction of complex reservoir development is the inversion of permeability and porosity. To realize this inversion, fast forward and inverse methods for 3-dimensional reservoir simulation must first be developed. Although inversion has been widely applied in exploration and logging, it has not yet been applied to 3-dimensional reservoir simulation; the study of fast forward and inverse methods for 3-dimensional reservoir simulation is therefore a cutting-edge problem of significant practical value. In this dissertation, 2-dimensional and 3-dimensional fluid-flow equations in porous media are discretized by finite differences, yielding finite difference equations that satisfy the inner boundary conditions through Peaceman's equations; a successive over-relaxation (SOR) iteration for the 3-dimensional flow equations is given, together with a dimensional analysis. Several commonly used equation-solving methods are compared, analyzing their convergence and convergence rates. The 2-dimensional alternating direction implicit procedure is extended to an SOR iteration of the alternating direction implicit procedure for the 3-dimensional flow equations in porous media, which offers fast computation, small memory requirements, good adaptability to heterogeneous media, and a fast convergence rate.
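The successive over-relaxation iteration mentioned above can be illustrated on the simplest case, a 2D steady-state pressure (Laplace) equation with fixed boundary values. This toy sketch omits Peaceman's well terms, transients, and heterogeneous permeability:

```python
def sor_laplace(p, omega=1.7, tol=1e-6, max_iter=10000):
    """Successive over-relaxation for the 2D Laplace (steady pressure)
    equation on a rectangular grid. Boundary cells of p hold fixed
    (Dirichlet) values; interior cells are updated in place."""
    ny, nx = len(p), len(p[0])
    for it in range(max_iter):
        max_change = 0.0
        for j in range(1, ny - 1):
            for i in range(1, nx - 1):
                # Gauss-Seidel average of the four neighbours...
                gs = 0.25 * (p[j - 1][i] + p[j + 1][i]
                             + p[j][i - 1] + p[j][i + 1])
                # ...over-relaxed by the factor omega (1 < omega < 2).
                change = omega * (gs - p[j][i])
                p[j][i] += change
                max_change = max(max_change, abs(change))
        if max_change < tol:
            return it + 1   # number of sweeps needed to converge
    return max_iter
```

With boundary values varying linearly in x, the interior converges to the same linear pressure profile, which is a quick correctness check for the sweep.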
The geological model of a channel-sand reservoir was generated using stochastic simulation, with channel cross sections of parabolic shape. This method honours the hard data well and is highly suitable for geological modeling of reservoirs containing complex boundary surfaces. To verify the reliability of the method, theoretical and numerical solutions were compared on a simplified model of the 3-dimensional flow equations in porous media; the only difference between the two pressure curves is that the numerical solution is slightly lower than the theoretical one at the wellbore, which shows that solving the flow equations in porous media by finite differences is reliable. As numerical examples, pressure distributions were computed for single-well and multi-well 3-dimensional heterogeneous reservoirs; the results show clear differences in the pressure distributions when the permeabilities differ by more than one order of magnitude, and no clear differences otherwise. As an application, the pressure distribution of the channel-sand reservoir was computed, indicating that the spatial distribution of pressure depends strongly on the direction of permeability and is sensitive to the spatial distribution of permeability. In this dissertation, Peaceman's equations are also modified to treat vertical-well and horizontal-well problems simultaneously. A 3D layered reservoir containing both vertical and horizontal wells was calculated iteratively, and for a channel-sand reservoir with both well types a 3D transient heterogeneous flow equation was discretized; as an example, the spatial pressure distribution was calculated by iteration. The results agree with physical expectations, showing that the modification of Peaceman's equation is correct.
The problem is thus solved in a space containing both vertical and horizontal wells. The dissertation also studies nonuniform-grid upscaling methods: a permeability integration method, a 2D flow-rate method, and a 3D flow-rate method. All greatly increase computing speed, but the 3D flow-rate upscaling method is both faster and more accurate than the 2D flow-rate method, and the upscaled solutions closely approximate those obtained on fine grid blocks. Four fast adaptive nonuniform-grid upscaling methods for the 3D flow equations in porous media are put forward and applied to 3D heterogeneous and channel-sand reservoirs. The results show that the adaptive upscaled solutions closely approximate the fine-grid solutions in regions where the permeability or porosity is anomalous, and the coarse-grid solutions elsewhere, while the adaptive upscaling method is about 100 times faster than the fine-grid method. Formulas for the sensitivity coefficients are derived from initial-boundary-value problems of the flow equations in porous media using Green's reciprocity principle. The sensitivity coefficients of wellbore pressure with respect to permeability are given by Peaceman's equation, calculated by a numerical method for the 3D transient anisotropic flow equation in porous media, and verified against the direct method; the results are in excellent agreement, demonstrating the feasibility of the method.
Calculated examples are also given for a 3D reservoir, a channel-sand reservoir, and a 3D multi-well reservoir. The numerical results indicate that around the wellbore the sensitivity coefficients of both permeability and porosity are large, but the porosity sensitivities are much smaller than the permeability sensitivities, so the permeability sensitivities contribute far more to the inversion of reservoir parameters. Because computing the sensitivity coefficients requires calling the reservoir simulation program twice per iteration, inversion of reservoir parameters must be supported by a fast forward method. Using the sensitivity coefficients of permeability and porosity, conditioned on observed valley erosion thickness in wells (hard data), inversions of the permeabilities and porosities of a homogeneous reservoir, a reservoir homogeneous only along a certain direction, and a block reservoir are implemented by the Gauss-Newton method and the conjugate gradient method. The inverted values closely match the true permeability and porosity data, and the conjugate gradient method converges much faster than the Gauss-Newton method.
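The Gauss-Newton inversion described above pairs a forward model with its sensitivity coefficients. A one-parameter toy sketch, using a generic exponential-decay model rather than the dissertation's reservoir simulator, shows the structure of the update:

```python
from math import exp

def gauss_newton_1d(model, dmodel_dk, ts, ys, k0, iters=20):
    """One-parameter Gauss-Newton: minimise sum_t (model(t, k) - y_t)^2.
    dmodel_dk plays the role of the sensitivity coefficients."""
    k = k0
    for _ in range(iters):
        r = [model(t, k) - y for t, y in zip(ts, ys)]   # residuals
        j = [dmodel_dk(t, k) for t in ts]               # sensitivities dr/dk
        den = sum(ji * ji for ji in j)
        if den == 0.0:
            break
        k -= sum(ji * ri for ji, ri in zip(j, r)) / den  # normal-equation step
    return k

# Recover the decay rate of y = exp(-k t) from noise-free observations.
ts = [0.5, 1.0, 2.0, 3.0]
ys = [exp(-0.5 * t) for t in ts]
k_hat = gauss_newton_1d(lambda t, k: exp(-k * t),
                        lambda t, k: -t * exp(-k * t),
                        ts, ys, k0=1.0)
```

In the multi-parameter reservoir setting, the scalar division becomes a linear solve with the Jacobian of all sensitivity coefficients, which is exactly why a fast forward simulation is essential.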

Relevance:

80.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

40.00%

Publisher:

Abstract:

This paper investigates properties of integer programming models for a class of production planning problems. The models are developed within a decision support system to advise a sales team of the products on which to focus their efforts in gaining new orders in the short term. The products generally require processing on several manufacturing cells and involve precedence relationships. The cells are already (partially) committed to products for stock and to satisfying existing orders, and therefore only the residual capacities of each cell in each time period of the planning horizon are considered. The determination of production recommendations to the sales team that make use of residual capacities is a nontrivial optimization problem. Solving such models is computationally demanding, and techniques for speeding up solution times are highly desirable. An integer programming model is developed, and various preprocessing techniques are investigated and evaluated. In addition, a number of cutting plane approaches have been applied. The performance of these approaches, which are both general and application-specific, is examined.
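One classical cutting plane family for models with knapsack-type capacity constraints over binary variables is the cover inequality. The paper does not specify which cuts it applies, so the following is only a hedged sketch of textbook greedy cover-cut separation:

```python
def cover_inequality(coeffs, rhs, lp_point):
    """Greedy separation of a cover cut for sum(a_i * x_i) <= rhs, x binary.
    A cover C has sum of a_i over C exceeding rhs, which implies the valid
    inequality sum_{i in C} x_i <= |C| - 1. Returns a violated cover or None."""
    # Prefer items whose fractional LP value is closest to 1.
    order = sorted(range(len(coeffs)), key=lambda i: 1 - lp_point[i])
    cover, weight = [], 0
    for i in order:
        cover.append(i)
        weight += coeffs[i]
        if weight > rhs:
            break
    if weight <= rhs:
        return None                       # the items cannot form a cover
    if sum(lp_point[i] for i in cover) > len(cover) - 1 + 1e-9:
        return sorted(cover)              # cut violated by the LP point
    return None
```

For the constraint 3x0 + 4x1 + 5x2 <= 8 and the fractional point (1, 1, 0.2), the cover {0, 1, 2} yields the cut x0 + x1 + x2 <= 2, which the point violates.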

Relevance:

40.00%

Publisher:

Abstract:

In this study, a dynamic programming approach to the unconstrained two-dimensional non-guillotine cutting problem is presented. The method extends the recently introduced recursive partitioning approach for the manufacturer's pallet loading problem. The approach involves two phases and uses bounds based on unconstrained two-staged and non-staged guillotine cutting. The method is able to find the optimal cutting pattern of a large number of problem instances of moderate size known in the literature, and no counterexample was found for which the approach fails to find a known optimal solution. For instances where the required computer runtime is excessive, the approach is combined with simple heuristics to reduce its running time. Detailed numerical experiments show the reliability of the method. Journal of the Operational Research Society (2012) 63, 183-200. doi: 10.1057/jors.2011.6. Published online 17 August 2011.
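The guillotine bounds mentioned above admit a simple dynamic program: the best value of a rectangle is either a single piece (with the remainder trimmed by further cuts) or the sum over the two sub-rectangles created by a vertical or horizontal edge-to-edge cut. A minimal sketch of that non-staged guillotine recursion (not the paper's non-guillotine method itself):

```python
from functools import lru_cache

def guillotine_value(W, H, pieces):
    """Maximum total value cuttable from a W x H plate using only
    edge-to-edge (guillotine) cuts; pieces are (w, h, value) with
    unlimited copies and no rotation."""
    @lru_cache(maxsize=None)
    def f(w, h):
        # Base case: one piece occupies this rectangle, rest is waste;
        # trimming of waste is covered by the cut recursion below.
        best = max((v for pw, ph, v in pieces if pw <= w and ph <= h),
                   default=0)
        for x in range(1, w // 2 + 1):        # vertical guillotine cuts
            best = max(best, f(x, h) + f(w - x, h))
        for y in range(1, h // 2 + 1):        # horizontal guillotine cuts
            best = max(best, f(w, y) + f(w, h - y))
        return best
    return f(W, H)
```

Because every non-guillotine pattern's value is bounded by what unrestricted combinations could achieve, such guillotine values serve as upper bounds inside the two-phase scheme; real implementations restrict cut positions to "discretization points" for speed.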

Relevance:

40.00%

Publisher:

Abstract:

We present a new cake-cutting procedure which guarantees everybody a proportional share according to his own valuation.
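The paper's new procedure is not reproduced in this abstract; the classical last-diminisher procedure is the standard baseline that already guarantees proportionality. A sketch, modelling the cake as the interval [0, 1] with one valuation function per player (bisection stands in for each player's exact "cut at my fair share" query):

```python
def last_diminisher(values, lo=0.0, hi=1.0):
    """Classical last-diminisher procedure on the interval cake [lo, hi].
    values[i](a, b) gives player i's value of the piece [a, b]; each round,
    the last player to trim the current piece takes it."""
    players = list(range(len(values)))
    allocation = {}
    while len(players) > 1:
        n = len(players)
        mark, holder = hi, players[0]
        for i in players:
            share = values[i](lo, hi) / n        # fair share of what remains
            if values[i](lo, mark) > share + 1e-12:
                a, b = lo, mark                  # trim the piece down to share
                for _ in range(60):              # bisect for the cut point
                    m = (a + b) / 2
                    if values[i](lo, m) < share:
                        a = m
                    else:
                        b = m
                mark, holder = b, i
        allocation[holder] = (lo, mark)          # last diminisher takes it
        lo = mark
        players.remove(holder)
    allocation[players[0]] = (lo, hi)            # last player gets the rest
    return allocation
```

Each holder receives exactly a 1/n share of the remaining cake by their own measure, and every other player values the removed piece at no more than that, which yields proportionality by induction.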

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the relationship between the volatility implied in option prices and the subsequently realized volatility, using the S&P/ASX 200 index options (XJO) traded on the Australian Stock Exchange (ASX) over a five-year period. Unlike stock index options such as the S&P 100 index options in the US market, the S&P/ASX 200 index options are traded infrequently and in low volumes, and have a long maturity cycle, so an errors-in-variables problem in the measurement of implied volatility is more likely to exist. After accounting for this problem by the instrumental variable method, it is found that both call and put implied volatilities are superior to historical volatility in forecasting future realized volatility. Moreover, implied call volatility is nearly an unbiased forecast of future volatility.
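The unbiasedness test behind such results regresses realized volatility on implied volatility and checks for a zero intercept and unit slope. A minimal sketch of the two ingredients, using ordinary least squares only (the paper's instrumental-variable correction is not shown):

```python
from math import sqrt

def realized_vol(returns, periods_per_year=252):
    """Annualised realised volatility from a list of period returns
    (sample standard deviation scaled by the sampling frequency)."""
    m = sum(returns) / len(returns)
    var = sum((r - m) ** 2 for r in returns) / (len(returns) - 1)
    return sqrt(var * periods_per_year)

def ols(x, y):
    """Intercept and slope of y = a + b * x by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b
```

With x as implied volatility and y as subsequently realized volatility, estimates near a = 0 and b = 1 support unbiasedness; when implied volatility is measured with error, the OLS slope is biased toward zero, which is the motivation for the instrumental-variable correction.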

Relevance:

30.00%

Publisher:

Abstract:

Propagation of subtropical eucalypts is often limited by low production of rooted cuttings in winter. This study tested whether changing the temperature of Corymbia citriodora and Eucalyptus dunnii stock plants from 28/23 °C (day/night) to 18/13 °C, 23/18 °C or 33/28 °C affected the production of cuttings by stock plants, the concentrations of Ca and other nutrients in cuttings, and the subsequent percentages of cuttings that formed roots. Optimal temperatures for shoot production were 33/28 °C and 28/23 °C, with lower temperatures reducing the number of harvested cuttings. Stock plant temperature regulated production of rooted cuttings, firstly by controlling shoot production and, secondly, by affecting the ensuing rooting percentage. Shoot production was the primary factor regulating rooted cutting production by C. citriodora, but both shoot production and root production were key determinants of rooted cutting production in E. dunnii. Effects of lower stock plant temperatures on rooting were not the result of reduced Ca concentration, but consistent relationships were found between adventitious root formation and B concentration. Average rooting percentages were low (1-15% for C. citriodora and 2-22% for E. dunnii) but rooted cutting production per stock plant (e.g. 25 for C. citriodora and 52 for E. dunnii over 14 weeks at 33/28 °C) was sufficient to establish clonal field tests for plantation forestry.

Relevance:

30.00%

Publisher:

Abstract:

Common coral trout Plectropomus leopardus is an iconic fish of the Great Barrier Reef (GBR) and is the most important fish for the commercial fishery there. Most of the catch is exported live to Asia. This stock assessment was undertaken in response to falls in catch sizes and catch rates in recent years, in order to gauge the status of the stock. It is the first stock assessment ever conducted of coral trout on the GBR, and brings together a multitude of different data sources for the first time. The GBR is very large and was divided into a regional structure based on the Bioregions defined by expert committees appointed by the Great Barrier Reef Marine Park Authority (GBRMPA) as part of the 2004 rezoning of the GBR. The regional structure consists of six Regions, from the Far Northern Region in the north to the Swains and Capricorn–Bunker Regions in the south. Regions also closely follow the boundaries between Bioregions. Two of the northern Regions are split into Subregions on the basis of potential changes in fishing intensity between the Subregions; there are nine Subregions altogether, which include four Regions that are not split. Bioregions are split into Subbioregions along the Subregion boundaries. Finally, each Subbioregion is split into a “blue” population which is open to fishing and a “green” population which is closed to fishing. The fishery is unusual in that catch rates as an indicator of abundance of coral trout are heavily influenced by tropical cyclones. After a major cyclone, catch rates fall for two to three years, and rebound after that. This effect is well correlated with the times of occurrence of cyclones, and usually occurs in the same month that the cyclone strikes. However, statistical analyses correlating catch rates with cyclone wind energy did not provide significantly different catch rate trends. Alternative indicators of cyclone strength may explain more of the catch rate decline, and future work should investigate this. 
Another feature of catch rates is the phenomenon of social learning in coral trout populations, whereby when a population of coral trout is fished, individuals quickly learn not to take bait. Then the catch rate falls sharply even when the population size is still high. The social learning may take place by fish directly observing their fellows being hooked, or perhaps heeding a chemo-sensory cue emitted by fish that are hooked. As part of the assessment, analysis of data from replenishment closures of Boult Reef in the Capricorn–Bunker Region (closed 1983–86) and Bramble Reef in the Townsville Subregion (closed 1992–95) estimated a strong social learning effect. A major data source for the stock assessment was the large collection of underwater visual survey (UVS) data collected by divers who counted the coral trout that they sighted. This allowed estimation of the density of coral trout in the different Bioregions (expressed as a number of fish per hectare). Combined with mapping data of all the 3000 or so reefs making up the GBR, the UVS results provided direct estimates of the population size in each Subbioregion. A regional population dynamic model was developed to account for the intricacies of coral trout population dynamics and catch rates. Because the statistical analysis of catch rates did not attribute much of the decline to tropical cyclones, (and thereby implied “real” declines in biomass), and because in contrast the UVS data indicate relatively stable population sizes, model outputs were unduly influenced by the unlikely hypothesis that falling catch rates are real. The alternative hypothesis that UVS data are closer to the mark and declining catch rates are an artefact of spurious (e.g., cyclone impact) effects is much more probable. Judging by the population size estimates provided by the UVS data, there is no biological problem with the status of coral trout stocks. 
The estimate of the total number of Plectropomus leopardus in blue zones on the GBR in the mid-1980s (the time of the major UVS series) was 5.34 million legal-sized fish, or about 8400 t exploitable biomass, with an additional 3350 t in green zones (using the current zoning, which was introduced on 1 July 2004). For the offshore regions favoured by commercial fishers, the figure was about 4.90 million legal-sized fish in blue zones, or about 7700 t exploitable biomass. There is, however, an economic problem, as indicated by relatively low catch rates and anecdotal information provided by commercial fishers. The costs of fishing the GBR by hook and line (the only method compatible with the GBR's high conservation status) are high, and commercial fishers are unable to operate profitably when catch rates are depressed (e.g., after a tropical cyclone). The economic problem is compounded by the effect of social learning in coral trout, whereby catch rates fall rapidly if fishers keep returning to the same fishing locations. In response, commercial fishers tend to spread out over the GBR, including the Far Northern and Swains Regions, which are far from port and incur higher travel costs. The economic problem provides some logic for a reduction in the TACC. Such a reduction during good times, such as when the fishery is rebounding after a major tropical cyclone, could provide a net benefit to the fishery, as it would provide a margin of stock safety and make the fishery more economically robust by providing higher catch rates during subsequent periods of depressed catches. During hard times when catch rates are low (e.g., shortly after a major tropical cyclone), a change to the TACC would have little effect, as even a reduced TACC would not come close to being filled. Quota adjustments based on catch rates should take account of long-term trends in order to mitigate variability and cyclone effects in the data.