17 results for "Multi-Choice mixed integer goal programming"

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 100.00%

Abstract:

We introduce a new Integer Linear Programming (ILP) approach for solving Integer Programming (IP) problems with bilinear objectives and linear constraints. The approach relies on a series of ILP approximations of the bilinear IP. We compare this approach with standard linearization techniques on random instances and on a set of real-world product bundling problems.
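
The standard linearization techniques used as a baseline typically replace each bilinear product of binary variables with an auxiliary variable and three linear constraints. A minimal sketch of that device, assuming the pulp package (with its bundled CBC solver) is available; the tiny bundling-style instance is invented purely for illustration:

```python
# Minimal sketch: linearizing a bilinear objective over binary variables.
# Hypothetical toy instance; pulp and its bundled CBC solver are assumed.
import pulp

items = ["A", "B", "C"]
cost = {"A": 1.0, "B": 1.5, "C": 1.0}                      # cost of picking an item
profit = {("A", "B"): 4.0, ("A", "C"): 2.5, ("B", "C"): 3.0}  # pairwise synergies

prob = pulp.LpProblem("bilinear_toy", pulp.LpMaximize)
x = pulp.LpVariable.dicts("pick", items, cat="Binary")
# Auxiliary variable z[i,j] stands for the product x[i]*x[j].
z = pulp.LpVariable.dicts("pair", list(profit), lowBound=0, upBound=1)

# Objective: pairwise synergies (bilinear, now linear in z) minus item costs.
prob += pulp.lpSum(profit[p] * z[p] for p in profit) - \
        pulp.lpSum(cost[i] * x[i] for i in items)

# Standard linearization of z = x_i * x_j for binaries:
#   z <= x_i,  z <= x_j,  z >= x_i + x_j - 1
for (i, j) in profit:
    prob += z[(i, j)] <= x[i]
    prob += z[(i, j)] <= x[j]
    prob += z[(i, j)] >= x[i] + x[j] - 1

prob += pulp.lpSum(x[i] for i in items) <= 2               # a simple linear constraint

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({i: int(x[i].value()) for i in items}, pulp.value(prob.objective))
```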

Relevance: 100.00%

Abstract:

Setup operations are significant in some production environments, and production plans must account for features such as the conservation of setup state across periods through setup carryover and crossover. Modelling setup crossover allows more flexible decisions and is essential for problems with long setup times. This paper proposes two models for the capacitated lot-sizing problem with backlogging and setup carryover and crossover. The first is in line with other models from the literature, whereas the second uses a disaggregated setup variable that tracks the starting and completion times of the setup operation. This innovative approach permits a more compact formulation. Computational results show that the proposed models outperform other state-of-the-art formulations.
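
For reference, the basic capacitated lot-sizing problem with backlogging, before the setup carryover and crossover features the paper adds, can be written as a small MIP. A minimal single-item sketch, assuming the pulp package; the four-period data are invented:

```python
# Minimal single-item CLSP with backlogging (no setup carryover/crossover).
# Data are illustrative; pulp with its bundled CBC solver is assumed.
import pulp

T = range(4)
demand = [40, 60, 30, 80]
cap, setup_time = 90, 10                               # capacity and setup time per period
setup_cost, hold_cost, back_cost = 100.0, 1.0, 5.0

m = pulp.LpProblem("clsp_backlog", pulp.LpMinimize)
x = pulp.LpVariable.dicts("prod", T, lowBound=0)       # production quantity
I = pulp.LpVariable.dicts("inv", T, lowBound=0)        # inventory at end of t
B = pulp.LpVariable.dicts("back", T, lowBound=0)       # backlog at end of t
y = pulp.LpVariable.dicts("setup", T, cat="Binary")    # setup indicator

m += pulp.lpSum(setup_cost * y[t] + hold_cost * I[t] + back_cost * B[t] for t in T)

for t in T:
    prev_I = I[t - 1] if t > 0 else 0
    prev_B = B[t - 1] if t > 0 else 0
    # Inventory/backlog balance: carry-in + production - carry-out = demand
    m += prev_I - prev_B + x[t] - I[t] + B[t] == demand[t]
    # Production only in setup periods; the setup itself consumes capacity
    m += x[t] + setup_time * y[t] <= cap
    m += x[t] <= cap * y[t]
m += B[T[-1]] == 0                                     # all demand met by horizon end

m.solve(pulp.PULP_CBC_CMD(msg=False))
print([(x[t].value(), int(y[t].value())) for t in T], pulp.value(m.objective))
```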

Relevance: 100.00%

Abstract:

This article describes a real-world production planning and scheduling problem occurring at an integrated pulp and paper (P&P) mill which manufactures paper for cardboard out of produced pulp. During the cooking of wood chips in the digester, two by-products are produced: the pulp itself (virgin fibers) and the waste stream known as black liquor. The former is mixed with recycled fibers and processed in a paper machine; here, owing to significant sequence-dependent setups in paper-type changeovers, lot sizing and sequencing have to be decided simultaneously in order to use capacity efficiently. The latter is converted into electrical energy using a set of evaporators, recovery boilers and counter-pressure turbines. The planning challenge is to synchronize the material flow as it moves through the pulp mill, the paper mill and the energy plant, maximizing the fulfilment of customer demand (backlogging is allowed) and minimizing operating costs. Because P&P production is capital intensive, the output of the digester must be maximized. As the production bottleneck is not fixed, we propose a new model that, for the first time, integrates the critical production units of the pulp mill, the paper mill and the energy plant. Simple stochastic mixed integer programming based local search heuristics are developed to obtain good feasible solutions for the problem, and the benefits of integrating the three stages are discussed. The proposed approaches are tested on real-world data. Our work may help P&P companies increase their competitiveness and responsiveness in dealing with demand fluctuations.

Relevance: 100.00%

Abstract:

The integrated production scheduling and lot-sizing problem in a flow shop environment consists of establishing production lot sizes and allocating machines to process them within a planning horizon in a production line with machines arranged in series. The problem considers that demands must be met without backlogging, the capacity of the machines must be respected, and machine setups are sequence-dependent and preserved between periods of the planning horizon. The objective is to determine a production schedule to minimise the setup, production and inventory costs. A mathematical model from the literature is presented, as well as procedures for obtaining feasible solutions. However, some of the procedures have difficulty in obtaining feasible solutions for large-sized problem instances. In addition, we address the problem using different versions of the Asynchronous Team (A-Team) approach. The procedures were compared with literature heuristics based on Mixed Integer Programming. The proposed A-Team procedures outperformed the literature heuristics, especially for large instances. The developed methodologies and the results obtained are presented.
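
The A-Team idea, in broad strokes, is a set of autonomous agents (constructors, improvers, destroyers) that cooperate only through a shared memory of candidate solutions. A highly simplified, sequential sketch on a toy permutation flow shop makespan objective; the instance, the agent rules and the memory policy are all invented for illustration and do not reproduce the paper's procedures:

```python
# Toy A-Team skeleton: agents cooperate only via a shared solution memory.
# Invented permutation flow shop instance; a real A-Team runs agents asynchronously.
import random

random.seed(1)
n_jobs, n_machines = 8, 3
p = [[random.randint(1, 9) for _ in range(n_machines)] for _ in range(n_jobs)]

def makespan(order):
    """Completion time of the last job on the last machine."""
    c = [0] * n_machines
    for j in order:
        c[0] += p[j][0]
        for m in range(1, n_machines):
            c[m] = max(c[m], c[m - 1]) + p[j][m]
    return c[-1]

memory = []  # shared memory of (cost, solution) pairs

def constructor():
    """Adds a random solution to the shared memory."""
    s = list(range(n_jobs))
    random.shuffle(s)
    memory.append((makespan(s), s))

def improver():
    """Picks a solution from memory and keeps an improving random swap."""
    cost, s = random.choice(memory)
    i, j = random.sample(range(n_jobs), 2)
    t = s[:]
    t[i], t[j] = t[j], t[i]
    if makespan(t) < cost:
        memory.append((makespan(t), t))

def destroyer():
    """Keeps the memory small by dropping the worst solutions."""
    memory.sort(key=lambda e: e[0])
    del memory[10:]

for _ in range(5):
    constructor()
for _ in range(500):
    random.choice([constructor, improver, improver, destroyer])()
destroyer()
print("best makespan:", memory[0][0], "order:", memory[0][1])
```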

Relevance: 100.00%

Abstract:

We deal with the optimization of the production of branched sheet metal products. New forming techniques for sheet metal give rise to a wide variety of possible profiles and possible ways of production. In particular, we show how the problem of producing a given profile geometry can be modeled as a discrete optimization problem. We provide a theoretical analysis of the model in order to improve its solution time. In this context we give the complete convex hull description of some substructures of the underlying polyhedron. Moreover, we introduce a new class of facet-defining inequalities that represent connectivity constraints for the profile and show how these inequalities can be separated in polynomial time. Finally, we present numerical results for various test instances, both real-world and academic examples.

Relevance: 100.00%

Abstract:

The LA-MC-ICP-MS method applied to U-Pb in situ dating is still rapidly evolving due to improvements in both lasers and ICP-MS instruments. To test the validity and reproducibility of the method, five different zircon samples, including the standard Temora-2, ranging in age between 2.2 Ga and 246 Ma, were dated using both LA-MC-ICP-MS and SHRIMP. The selected zircons were first dated by SHRIMP and, after gentle polishing, the laser spot was driven to the same site, or onto the same zircon phase, with a 213 nm laser microprobe coupled to a multi-collector mixed system. The data were collected with a routine spot size of 25 μm and, in some cases, of 15 or 40 μm. A careful cross-calibration using a diluted U-Th-Pb solution to calculate the Faraday-reading-to-counting-rate conversion factors, and the highly suitable GJ-1 standard zircon for external calibration, were of paramount importance for obtaining reliable results. All age results were concordant within the experimental errors. The age errors assigned by the LA-MC-ICP-MS technique were, in most cases, larger than those obtained by SHRIMP, but when high-resolution stratigraphy is not required, the laser technique has certain advantages.

Relevance: 100.00%

Abstract:

In this paper, a general scheme for generating extra cuts during the execution of a Benders decomposition algorithm is presented. These cuts are based on feasible and infeasible master problem solutions generated by means of a heuristic. This article includes general guidelines and a case study with a fixed charge network design problem. Computational tests with instances of this problem show the efficiency of the strategy. The most important aspect of the proposed ideas is their generality, which allows them to be used in virtually any Benders decomposition implementation.
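
The core of the proposal is that, besides the cut obtained at the master's optimal solution, cuts are also generated at additional master solutions produced by a heuristic. A toy sketch of such a loop on a small facility-location-style fixed-charge instance, assuming the pulp package; the instance, the one-flip heuristic and the choice of solving the subproblem on its dual side are illustrative simplifications, not the paper's scheme:

```python
# Toy Benders decomposition with extra cuts from heuristically perturbed
# master solutions, on a small uncapacitated facility-location instance.
# All data are invented; pulp with its bundled CBC solver is assumed.
import random
import pulp

random.seed(0)
facilities, customers = range(3), range(5)
f = [30.0, 25.0, 40.0]                                              # fixed opening costs
c = [[random.randint(2, 20) for j in facilities] for i in customers]  # service costs

def subproblem_cut(y_hat):
    """Solve the dual of the assignment subproblem for fixed y_hat and
    return (subproblem value, cut constant, cut coefficients)."""
    d = pulp.LpProblem("dual_sub", pulp.LpMaximize)
    u = pulp.LpVariable.dicts("u", customers)                       # free duals
    v = pulp.LpVariable.dicts("v", [(i, j) for i in customers for j in facilities],
                              lowBound=0)
    d += pulp.lpSum(u[i] for i in customers) - \
         pulp.lpSum(y_hat[j] * v[(i, j)] for i in customers for j in facilities)
    for i in customers:
        for j in facilities:
            d += u[i] - v[(i, j)] <= c[i][j]
    d.solve(pulp.PULP_CBC_CMD(msg=False))
    const = sum(u[i].value() for i in customers)
    coeff = [sum(v[(i, j)].value() for i in customers) for j in facilities]
    return pulp.value(d.objective), const, coeff

master = pulp.LpProblem("master", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", facilities, cat="Binary")
eta = pulp.LpVariable("eta", lowBound=0)
master += pulp.lpSum(f[j] * y[j] for j in facilities) + eta
master += pulp.lpSum(y[j] for j in facilities) >= 1      # keeps the subproblem feasible

best_ub = float("inf")
for it in range(20):
    master.solve(pulp.PULP_CBC_CMD(msg=False))
    lb = pulp.value(master.objective)
    y_hat = [round(y[j].value()) for j in facilities]
    # Heuristic extra cut: also cut at a neighbour with one more open facility
    closed = [j for j in facilities if y_hat[j] == 0]
    trial = y_hat[:]
    if closed:
        trial[random.choice(closed)] = 1
    for cand in (y_hat, trial):
        val, const, coeff = subproblem_cut(cand)
        best_ub = min(best_ub, sum(f[j] * cand[j] for j in facilities) + val)
        master += eta >= const - pulp.lpSum(coeff[j] * y[j] for j in facilities)
    if best_ub - lb < 1e-6:
        break
print("optimal cost ~", best_ub)
```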

Relevance: 100.00%

Abstract:

Objectives: In recent years it has become clear that metal devices for biomedical applications present, in some cases, disadvantages that suggest absorbable materials (natural or synthetic) as an alternative of choice. Here, our goal was to evaluate the biological response to a xenogenic pin, derived from bovine cortical bone, implanted intraosseously in the femur of rats. Material and methods: At 10, 14, 30 and 60 days after implantation, the animals (n = 5 per period) were killed and the femurs carefully collected and dissected for histological processing. To assess the level of osteoclastogenesis at 60 days, we performed immunohistochemistry using an antibody against RANKL. Results: Interestingly, neutrophils and leukocytes were observed only at the beginning (10 days). Clear evidence of pin degradation by host cells appeared at 14 days and was most intense at 60 days, when we detected the largest number of giant multinucleated cells, very similar to osteoclasts, in contact with the implanted pin. To confirm osteoclastogenesis at 60 days, we evaluated RANKL expression, which was positive for these resident multinucleated cells, while new bone deposition was verified surrounding the pins in all evaluated periods. Conclusions: Altogether, our results show that pins made from fully processed bovine bone are biocompatible and absorbable, allowing new bone formation, and constitute a promising device for biomedical applications.

Relevance: 100.00%

Abstract:

In this paper, we propose three novel mathematical models for the two-stage lot-sizing and scheduling problems present in many process industries. The problem combines a continuous or quasi-continuous production process upstream with discrete manufacturing downstream, and the two stages must be synchronized. Different time-scale representations are discussed. The first formulation uses a discrete-time representation, the second is a hybrid continuous-discrete model, and the last is based on a continuous-time representation. Computational tests with a state-of-the-art MIP solver show that the discrete-time representation provides better feasible solutions in short running times. On the other hand, the hybrid model achieves better solutions for longer computational times and was able to prove optimality more often. The continuous-time model is the most flexible of the three for incorporating additional operational requirements, at the cost of the worst computational performance. Journal of the Operational Research Society (2012) 63, 1613-1630. doi:10.1057/jors.2011.159, published online 7 March 2012.

Relevance: 100.00%

Abstract:

The aim of solving the Optimal Power Flow problem is to determine the optimal state of an electric power transmission system, that is, the voltage magnitudes, phase angles and transformer tap ratios that optimize the performance of a given system while satisfying its physical and operating constraints. The Optimal Power Flow problem is modeled as a large-scale mixed-discrete nonlinear programming problem. This paper proposes a method for handling the discrete variables of the Optimal Power Flow problem based on a penalty function. Owing to the inclusion of the penalty function in the objective function, a sequence of nonlinear programming problems with only continuous variables is obtained, and the solutions of these problems converge to a solution of the mixed-discrete problem. The resulting nonlinear programming problems are solved by a Primal-Dual Logarithmic-Barrier Method. Numerical tests using the IEEE 14-, 30-, 118- and 300-bus test systems indicate that the method is efficient.
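
The general idea of penalizing discreteness is to add to the objective a term that vanishes only at the allowed discrete values (for example, transformer tap steps) and to solve a sequence of purely continuous problems with an increasing penalty weight. A minimal sketch, assuming scipy is available; a toy smooth objective stands in for the OPF model, the sinusoidal penalty is one common choice rather than necessarily the paper's exact function, and the continuous solves use BFGS instead of a primal-dual logarithmic-barrier method:

```python
# Minimal sketch of handling a discrete variable with a penalty function:
# a sequence of continuous problems is solved with an increasing penalty weight.
# Toy objective; real OPF variables, constraints and tap steps are not modeled.
import numpy as np
from scipy.optimize import minimize

tap_step = 0.0125                        # allowed discrete grid for variable x[1]

def penalty(t):
    # Smooth term that is zero exactly on multiples of tap_step
    return np.sin(np.pi * t / tap_step) ** 2

def objective(x, weight):
    smooth = (x[0] - 1.2) ** 2 + (x[1] - 1.03) ** 2 + 0.5 * x[0] * x[1]
    return smooth + weight * penalty(x[1])

x0 = np.array([1.0, 1.0])
for weight in [0.0, 0.1, 1.0, 10.0, 100.0]:
    res = minimize(objective, x0, args=(weight,), method="BFGS")
    x0 = res.x                           # warm start the next continuous solve
    print(f"weight={weight:7.1f}  x={res.x.round(4)}  f={res.fun:.4f}")

# As the weight grows, x[1] is driven toward the nearest allowed tap position.
print("rounded tap:", round(x0[1] / tap_step) * tap_step)
```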

Relevance: 100.00%

Abstract:

Network virtualization is a promising technique for building the Internet of the future, since it enables the low-cost introduction of new features into network elements. An open issue in such virtualization is how to map virtual network elements efficiently onto those of the existing physical network, also called the substrate network. This mapping is an NP-hard problem, and existing solutions ignore various real network characteristics in order to solve it in a reasonable time frame. This paper introduces new algorithms for this problem based on 0–1 integer linear programming, built on a set of network parameters not taken into account by previous proposals. Approximate algorithms proposed here allow the mapping of virtual networks onto large network substrates. Simulation experiments give evidence of the efficiency of the proposed algorithms.
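
As a point of reference for the 0-1 ILP view, the node-mapping part of virtual network embedding can be stated as an assignment problem with capacity constraints: a binary variable decides whether a virtual node is placed on a substrate node. A much-reduced sketch with the pulp package, covering node mapping only with invented CPU demands and capacities; link mapping and the additional parameters considered in the paper are omitted:

```python
# Much-reduced 0-1 ILP sketch of virtual-to-substrate node mapping.
# Invented data; link mapping and further network parameters are omitted.
import pulp

virtual = ["v0", "v1", "v2"]
substrate = ["s0", "s1", "s2", "s3"]
cpu_demand = {"v0": 4, "v1": 2, "v2": 3}
cpu_capacity = {"s0": 5, "s1": 4, "s2": 6, "s3": 3}
# Deterministic placement costs, purely illustrative
cost = {(v, s): (i + 2) * (j + 3) % 7 + 1
        for i, v in enumerate(virtual) for j, s in enumerate(substrate)}

prob = pulp.LpProblem("vn_node_mapping", pulp.LpMinimize)
m = pulp.LpVariable.dicts("map", [(v, s) for v in virtual for s in substrate],
                          cat="Binary")

prob += pulp.lpSum(cost[(v, s)] * m[(v, s)] for v in virtual for s in substrate)

for v in virtual:
    # Each virtual node is placed on exactly one substrate node
    prob += pulp.lpSum(m[(v, s)] for s in substrate) == 1
for s in substrate:
    # Placements must respect the substrate node's CPU capacity
    prob += pulp.lpSum(cpu_demand[v] * m[(v, s)] for v in virtual) <= cpu_capacity[s]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for v in virtual:
    for s in substrate:
        if m[(v, s)].value() == 1:
            print(v, "->", s)
```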

Relevance: 30.00%

Abstract:

Sugarcane-breeding programs take at least 12 years to develop new commercial cultivars. Molecular markers offer a possibility to study the genetic architecture of quantitative traits in sugarcane, and they may be used in marker-assisted selection to speed up artificial selection. Although the performance of sugarcane progenies in breeding programs is commonly evaluated across a range of locations and harvest years, many QTL detection methods ignore two- and three-way interactions between QTL, harvest, and location. In this work, a strategy for QTL detection in multi-harvest-location trial data, based on interval mapping and mixed models, is proposed and applied to map QTL effects in a segregating progeny from a biparental cross of pre-commercial Brazilian cultivars, evaluated at two locations and three consecutive harvest years for cane yield (tonnes per hectare), sugar yield (tonnes per hectare), fiber percent, and sucrose content. In the mixed model, we included appropriate (co)variance structures for modeling heterogeneity and correlation of genetic effects and non-genetic residual effects. Forty-six QTLs were found: 13 for cane yield, 14 for sugar yield, 11 for fiber percent, and 8 for sucrose content. In addition, QTL by harvest, QTL by location, and QTL by harvest by location interaction effects were significant for all evaluated traits (30 QTLs showed some interaction and 16 showed none). Our results contribute to a better understanding of the genetic architecture of complex traits related to biomass production and sucrose content in sugarcane.

Relevance: 30.00%

Abstract:

We present the results of airborne measurements of carbon monoxide (CO) and aerosol particle number concentration (CN) made during the Balanço Atmosférico Regional de Carbono na Amazônia (BARCA) program. The primary goal of BARCA is to address the question of basin-scale sources and sinks of CO2 and other atmospheric carbon species, a central issue of the Large-scale Biosphere-Atmosphere (LBA) program. The experiment consisted of two aircraft campaigns during November-December 2008 (BARCA-A) and May-June 2009 (BARCA-B), which covered the altitude range from the surface up to about 4500 m and spanned most of the Amazon Basin. Based on meteorological analysis and measurements of the tracer SF6, we found that airmasses over the Amazon Basin during the late dry season (BARCA-A, November 2008) originated predominantly from the Southern Hemisphere, while during the late wet season (BARCA-B, May 2009) low-level airmasses were dominated by northern-hemispheric inflow and mid-tropospheric airmasses were of mixed origin. In BARCA-A we found a strong influence of biomass burning emissions on the composition of the atmosphere over much of the Amazon Basin, with CO enhancements up to 300 ppb and CN concentrations approaching 10,000 cm⁻³; the highest values were in the southern part of the Basin at altitudes of 1-3 km. The ΔCN/ΔCO ratios were diagnostic for biomass burning emissions and were lower in aged than in fresh smoke. Fresh emissions indicated CO/CO2 and CN/CO emission ratios in good agreement with previous work, but our results also highlight the need to consider the residual smoldering combustion that takes place after the active flaming phase of deforestation fires. During the late wet season, in contrast, there was little evidence for a significant presence of biomass smoke. Low CN concentrations (300-500 cm⁻³) prevailed basinwide, and CO mixing ratios were enhanced by only ~10 ppb above the mixing line between Northern and Southern Hemisphere air. There was no detectable trend in CO with distance from the coast, but there was a small enhancement of CO in the boundary layer, suggesting diffuse biogenic sources from photochemical degradation of biogenic volatile organic compounds or direct biological emission. Simulations of CO distributions during BARCA-A using a range of models yielded general agreement in spatial distribution and confirm the important contribution from biomass burning emissions, but the models show some systematic quantitative differences compared with observed CO concentrations. These mismatches appear to be related to the accuracy of the global background fields, the role of vertical transport and biomass smoke injection height, the choice of model resolution, and the reliability and temporal resolution of the emissions database.

Relevance: 30.00%

Abstract:

Little is known about the situational contexts in which individuals consume processed sources of dietary sugars. This study aimed to describe the situational contexts associated with the consumption of sweetened food and drink products in a Catholic Middle Eastern Canadian community. A two-stage exploratory sequential mixed-method design was employed with a rationale of triangulation. In stage 1 (n = 62), items and themes describing the situational contexts of sweetened food and drink product consumption were identified from semi-structured interviews and were used to develop the content for the Situational Context Instrument for Sweetened Product Consumption (SCISPC). Face validity, readability and cultural relevance of the instrument were assessed. In stage 2 (n = 192), a cross-sectional study was conducted and exploratory factor analysis was used to examine the structure of themes that emerged from the qualitative analysis as a means of furthering construct validation. The SCISPC's reliability and its predictive validity for the daily consumption of sweetened products were also assessed. In stage 1, six themes and 40 items describing the situational contexts of sweetened product consumption emerged from the qualitative analysis and were used to construct the first draft of the SCISPC. In stage 2, factor analysis enabled the clarification and/or expansion of the instrument's initial thematic structure. The revised SCISPC has seven factors and 31 items describing the situational contexts of sweetened product consumption. Initial validation indicated that the instrument has excellent internal consistency and adequate test-retest reliability. Two factors of the SCISPC (Snacking and Energy demands) had predictive validity for the daily consumption of total sugar from sweetened products, while the other factors (Socialization, Indulgence, Constraints, Visual Stimuli and Emotional needs) were associated instead with occasional consumption of these products.

Relevance: 30.00%

Abstract:

This paper compares the effectiveness of the Tsallis entropy with the classic Boltzmann-Gibbs-Shannon entropy for general pattern recognition, and proposes a multi-q approach to improve pattern analysis using entropy. A series of experiments was carried out for the problem of classifying image patterns. Given a dataset of 40 pattern classes, the goal of our image case study is to assess how well the different entropies can be used to determine the class of a newly given image sample. Our experiments show that the Tsallis entropy using the proposed multi-q approach has great advantages over the Boltzmann-Gibbs-Shannon entropy for pattern classification, boosting image recognition rates by a factor of 3. We discuss the reasons behind this success, shedding light on the usefulness of the Tsallis entropy and the multi-q approach.
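
The Tsallis entropy of a normalized histogram p is S_q = (1 - Σ_i p_i^q)/(q - 1), which recovers the Boltzmann-Gibbs-Shannon entropy as q → 1. The multi-q idea amounts to evaluating S_q over a set of q values and using the resulting vector as a feature descriptor. A minimal sketch with NumPy; the gray-level histogram source, the q grid and the way the descriptor would feed a classifier are illustrative choices, not the paper's exact setup:

```python
# Minimal sketch of a multi-q Tsallis-entropy feature vector for an image.
# Histogram source, q grid, and downstream classifier usage are illustrative.
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q of a probability vector p (Shannon entropy as q -> 1)."""
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))          # Boltzmann-Gibbs-Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def multi_q_descriptor(image, qs):
    """Gray-level histogram of the image turned into a vector of S_q values."""
    hist, _ = np.histogram(image, bins=64, range=(0, 256))
    p = hist / hist.sum()
    return np.array([tsallis_entropy(p, q) for q in qs])

rng = np.random.default_rng(0)
texture_a = rng.integers(0, 128, size=(32, 32))        # toy "pattern class" images
texture_b = rng.integers(64, 256, size=(32, 32))
qs = np.linspace(0.1, 3.0, 30)                         # the multi-q grid

fa, fb = multi_q_descriptor(texture_a, qs), multi_q_descriptor(texture_b, qs)
print("descriptor length:", fa.size)
print("L2 distance between the two class descriptors:", np.linalg.norm(fa - fb))
```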