26 results for Production lot-scheduling models

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 100.00%

Abstract:

The integrated production scheduling and lot-sizing problem in a flow shop environment consists of establishing production lot sizes and allocating machines to process them within a planning horizon in a production line with machines arranged in series. The problem considers that demands must be met without backlogging, the capacity of the machines must be respected, and machine setups are sequence-dependent and preserved between periods of the planning horizon. The objective is to determine a production schedule to minimise the setup, production and inventory costs. A mathematical model from the literature is presented, as well as procedures for obtaining feasible solutions. However, some of the procedures have difficulty in obtaining feasible solutions for large-sized problem instances. In addition, we address the problem using different versions of the Asynchronous Team (A-Team) approach. The procedures were compared with literature heuristics based on Mixed Integer Programming. The proposed A-Team procedures outperformed the literature heuristics, especially for large instances. The developed methodologies and the results obtained are presented.
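A minimal sketch of the A-Team idea referenced above: autonomous agents (a constructor, an improver and an implicit destroyer) act asynchronously on a shared memory of candidate solutions. The toy single-machine lot-sizing data, cost function and agents below are hypothetical Python illustrations, far simpler than the sequence-dependent flow-shop formulation addressed in the paper.

import random

# Toy A-Team: agents share a memory (pool) of candidate production plans.
random.seed(0)

T, demand, capacity = 4, [40, 70, 30, 60], 100
setup_cost, holding_cost = 50.0, 1.0

def cost(plan):
    """Toy objective: setup + inventory cost; heavy penalty for infeasibility."""
    inv, total = 0, 0.0
    for t in range(T):
        total += setup_cost * (plan[t] > 0)
        inv += plan[t] - demand[t]
        total += holding_cost * max(inv, 0)
        if inv < 0 or plan[t] > capacity:      # backlog or overload -> penalise
            total += 1e4
    return total

def constructor():
    """Agent that builds a lot-for-lot starting solution."""
    return list(demand)

def improver(plan):
    """Agent that tries to merge a lot into the previous period."""
    t = random.randrange(1, T)
    new = plan[:]
    new[t - 1] += new[t]
    new[t] = 0
    return new

memory = [constructor() for _ in range(5)]      # shared solution pool
for _ in range(200):                            # agents act asynchronously on the pool
    base = random.choice(memory)
    memory.append(improver(base))
    memory.sort(key=cost)                       # destroyer: keep only the best solutions
    memory = memory[:5]

print("best toy plan:", memory[0], "cost:", cost(memory[0]))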

Relevance: 60.00%

Abstract:

In this paper, we propose three novel mathematical models for the two-stage lot-sizing and scheduling problems present in many process industries. The problem combines a continuous or quasi-continuous production process upstream with a discrete manufacturing process downstream, and the two stages must be synchronized. Different time-scale representations are discussed. The first formulation uses a discrete-time representation; the second is a hybrid continuous-discrete model; and the last is based on a continuous-time representation. Computational tests with a state-of-the-art MIP solver show that the discrete-time representation provides better feasible solutions in short running times. On the other hand, the hybrid model achieves better solutions for longer computational times and was able to prove optimality more often. The continuous-time model is the most flexible of the three for incorporating additional operational requirements, at the cost of the worst computational performance. Journal of the Operational Research Society (2012) 63, 1613-1630. doi:10.1057/jors.2011.159. Published online 7 March 2012.
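For readers unfamiliar with the distinction, the skeleton below shows what a discrete-time lot-sizing representation looks like in code: the horizon is cut into periods and setup, production and inventory variables are indexed by period. It is a single-stage toy written with the open-source PuLP package (an assumption), not the two-stage synchronized formulations proposed in the paper.

from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

periods = range(4)
demand = [30, 50, 20, 60]
capacity, setup_cost, hold_cost = 90, 100.0, 1.0

m = LpProblem("discrete_time_lot_sizing", LpMinimize)
x = [LpVariable(f"prod_{t}", lowBound=0) for t in periods]      # production quantity
inv = [LpVariable(f"inv_{t}", lowBound=0) for t in periods]     # end-of-period inventory
y = [LpVariable(f"setup_{t}", cat=LpBinary) for t in periods]   # setup indicator

m += lpSum(setup_cost * y[t] + hold_cost * inv[t] for t in periods)   # objective
for t in periods:
    prev = inv[t - 1] if t > 0 else 0
    m += prev + x[t] - demand[t] == inv[t]   # period-by-period inventory balance
    m += x[t] <= capacity * y[t]             # production allowed only if a setup is performed

m.solve()
print("lots:", [v.value() for v in x])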

Relevance: 50.00%

Abstract:

Setup operations are significant in some production environments, and it is mandatory that production plans consider features such as the conservation of the setup state across periods through setup carryover and crossover. Modelling setup crossover allows more flexible decisions and is essential for problems with long setup times. This paper proposes two models for the capacitated lot-sizing problem with backlogging and setup carryover and crossover. The first is in line with other models from the literature, whereas the second uses a disaggregated setup variable that tracks the starting and completion times of the setup operation. This innovative approach permits a more compact formulation. Computational results show that the proposed models outperform other state-of-the-art formulations.
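As a schematic reference point (generic notation, not the paper's exact formulation), the core balance constraint of a capacitated lot-sizing model with backlogging, together with one common way of letting a setup carried over from the previous period authorise production, can be written as:

\begin{align*}
  I_{j,t} - B_{j,t} &= I_{j,t-1} - B_{j,t-1} + x_{j,t} - d_{j,t}, \\
  x_{j,t} &\le M_{j,t}\left(y_{j,t} + w_{j,t}\right), \qquad w_{j,t} \le y_{j,t-1},
\end{align*}

where $I$, $B$, $x$ and $d$ are the inventory, backlog, production and demand of item $j$ in period $t$, $y_{j,t}$ is the setup variable, $w_{j,t}$ indicates a setup carried over from period $t-1$, and $M_{j,t}$ is an upper bound on production.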

Relevance: 40.00%

Abstract:

This article describes a real-world production planning and scheduling problem occurring at an integrated pulp and paper (P&P) mill which manufactures paper for cardboard from its own pulp. During the cooking of wood chips in the digester, two by-products are produced: the pulp itself (virgin fibers) and a waste stream known as black liquor. The former is mixed with recycled fibers and processed in a paper machine; here, because of significant sequence-dependent setups in paper-type changeovers, lot sizing and sequencing must be decided simultaneously in order to use capacity efficiently. The latter is converted into electrical energy using a set of evaporators, recovery boilers and counter-pressure turbines. The planning challenge is to synchronize the material flow as it moves through the pulp mill, the paper mill and the energy plant, maximizing the fulfillment of customer demand (backlogging is allowed) and minimizing operating costs. Because P&P production is capital intensive, the output of the digester must be maximized. As the production bottleneck is not fixed, to tackle this problem we propose a new model that, for the first time, integrates the critical production units of the pulp mill, the paper mill and the energy plant. Simple stochastic mixed-integer-programming-based local search heuristics are developed to obtain good feasible solutions for the problem. The benefits of integrating the three stages are discussed, and the proposed approaches are tested on real-world data. Our work may help P&P companies to increase their competitiveness and responsiveness in dealing with demand fluctuations. (C) 2012 Elsevier Ltd. All rights reserved.

Relevance: 40.00%

Abstract:

During the last three decades, several predictive models have been developed to estimate the somatic production of macroinvertebrates. Although these models have been evaluated for their ability to assess the production of macrobenthos in different marine ecosystems, they have not been applied specifically to sandy beach macrofauna and may not be directly applicable to this transitional environment. Hence, in this study, a broad literature review of sandy beach macrofauna production was conducted, and estimates obtained with cohort-based and size-based methods were collected. The performance of nine models in estimating the production of individual populations from the sandy beach environment was assessed, for all taxonomic groups combined and for individual groups separately, by comparing the production predicted by the models with the estimates obtained from the literature (observed production). Most of the models overestimated population production relative to the observed estimates, both for all populations combined and for specific taxonomic groups. However, estimates from two models developed by Cusson and Bourget provided the best fits to the measured production and thus represent the best alternatives to the cohort-based and size-based methods in this habitat. The consistent performance of one of these Cusson and Bourget models, which was developed for the macrobenthos of sandy substrate habitats (C&B-SS), shows that the performance of a model does not depend on whether it was developed for a specific taxonomic group. Moreover, since some widely used models (e.g., the Robertson model) show very different responses when applied to the macrofauna of different marine environments (e.g., sandy beaches and estuaries), prior evaluation of these models is essential.
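A small sketch (with invented numbers) of the kind of comparison described: model-predicted production is compared with observed production, and the mean log-ratio indicates whether a model over- or underestimates on average. This illustrates the evaluation idea only and is not the statistical treatment used in the study.

import math

observed  = [12.0, 3.5, 40.0, 7.2]    # hypothetical observed production values
predicted = [15.1, 4.9, 61.0, 6.8]    # hypothetical model predictions for the same populations

log_ratios = [math.log10(p / o) for p, o in zip(predicted, observed)]
bias = sum(log_ratios) / len(log_ratios)    # > 0 means the model overestimates on average

print(f"mean log10(predicted/observed) = {bias:.3f}")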

Relevance: 30.00%

Abstract:

We investigated the influence of nutrient-rich oceanic waters, in comparison to the estuarine outflow from Santos Bay (SE Brazil), on copepod abundance and production on the adjacent inner shelf. Zooplankton samples were collected with a Multinet in spring 2005 and in summer 2006. Copepod biomass was derived from length-weight regressions, and growth rates were estimated from empirical models. Altogether, 58 copepod taxa were identified. The highest abundances were due to small-sized organisms, including nauplii, oncaeids and copepodids of paracalanids and clausocalanids. Biomass and secondary production mirrored copepod abundance, with Temora copepodids accompanying the above-mentioned taxa as major contributors. The contribution of naupliar biomass and production was low (2.2 and 3.8% of the total, respectively). The influence of the Santos Bay outflow was observed only in spring, when Coastal Water (CW) dominated at the study site, whereas in summer the inner shelf was occupied by CW in the surface layer and the oceanic South Atlantic Central Water (SACW) in the bottom layer. The SACW intrusion had a greater influence on the increase in copepod production than the Santos Bay plume. The distribution and dynamics of the oceanic water masses appeared to be the most important influence on copepod diversity and production at this subtropical site.
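The chain from measurements to production mentioned above (biomass from length-weight regressions, production as biomass times a weight-specific growth rate from an empirical model) can be sketched as follows; the regression coefficients, abundance and growth rate are hypothetical placeholders, not values from this study.

# Biomass from a length-weight regression and production as biomass x growth rate.
# All coefficients and input values below are hypothetical placeholders.
a, b = 1.0e-8, 2.7            # hypothetical length-weight coefficients: W (ug C) = a * L(um)^b
abundance = 2500.0            # individuals per cubic metre
length_um = 850.0             # typical prosome length in micrometres
growth_per_day = 0.25         # weight-specific growth rate g (per day) from an empirical model

ind_weight = a * length_um ** b          # ug C per individual
biomass = abundance * ind_weight         # ug C per cubic metre
production = biomass * growth_per_day    # ug C per cubic metre per day

print(f"biomass = {biomass:.1f} ug C m^-3, production = {production:.1f} ug C m^-3 d^-1")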

Relevance: 30.00%

Abstract:

We calculate the relic abundance of mixed axion/neutralino cold dark matter which arises in R-parity-conserving supersymmetric (SUSY) models wherein the strong CP problem is solved by the Peccei-Quinn (PQ) mechanism with a concomitant axion/saxion/axino supermultiplet. By numerically solving the coupled Boltzmann equations, we include the combined effects of (1) thermal axino production with cascade decays to a neutralino LSP, (2) thermal saxion production and production via coherent oscillations, along with cascade decays and entropy injection, (3) thermal neutralino production and re-annihilation after both axino and saxion decays, (4) gravitino production and decay, and (5) axion production both thermally and via oscillations. For SUSY models with too high a standard neutralino thermal abundance, we find that the combined effect of the SUSY PQ particles is not enough to lower the neutralino abundance down to its measured value while at the same time respecting bounds on late-decaying neutral particles from BBN. However, models with a standard neutralino underabundance can now be allowed with either neutralino or axion domination of dark matter, and furthermore, these models can allow the PQ breaking scale f_a to be pushed up into the 10^14-10^15 GeV range, which is where it is typically expected to be in string theory models.
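As an illustration of the numerical approach (not the paper's full five-component coupled system), the standard single-species freeze-out Boltzmann equation dY/dx = -(lam/x^2)(Y^2 - Y_eq^2) can be integrated as below; lam and the normalisation of Y_eq are arbitrary illustrative choices.

import numpy as np
from scipy.integrate import solve_ivp

# Toy single-species freeze-out; all parameters are illustrative, not from the paper.
lam = 1.0e7                                   # effective (dimensionless) annihilation strength

def yeq(x):
    # schematic non-relativistic equilibrium yield ~ x^{3/2} exp(-x)
    return 0.145 * x**1.5 * np.exp(-x)

def rhs(x, y):
    return [-(lam / x**2) * (y[0]**2 - yeq(x)**2)]

sol = solve_ivp(rhs, (1.0, 1000.0), [yeq(1.0)], method="LSODA", rtol=1e-8, atol=1e-15)
print("frozen-out yield Y(x=1000) ~", sol.y[0, -1])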

Relevance: 30.00%

Abstract:

The objective of this work was to develop and validate linear regression models to estimate the production of dry matter by Tanzania grass (Megathyrsus maximus, cultivar Tanzania) as a function of agrometeorological variables. For this purpose, data on the growth of this forage grass from 2000 to 2005, under dry-field conditions in São Carlos, SP, Brazil, were correlated to the following climatic parameters: minimum and mean temperatures, degree-days, and potential and actual evapotranspiration. Simple linear regressions were performed between agrometeorological variables (independent) and the dry matter accumulation rate (dependent). The estimates were validated with independent data obtained in São Carlos and Piracicaba, SP, Brazil. The best statistical results in the development and validation of the models were obtained with the agrometeorological parameters that consider thermal and water availability effects together, such as actual evapotranspiration, accumulation of degree-days corrected by water availability, and the climatic growth index, based on average temperature, solar radiation, and water availability. These variables can be used in simulations and models to predict the production of Tanzania grass.
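A minimal sketch of the development/validation procedure described: fit a simple linear regression between an agrometeorological variable and the dry matter accumulation rate on one data set, then check it against independent data. The numbers are invented for illustration and are not the São Carlos or Piracicaba data.

import numpy as np

x_dev = np.array([80.0, 120.0, 150.0, 200.0, 240.0])   # independent variable (e.g. corrected degree-days)
y_dev = np.array([25.0, 41.0, 50.0, 68.0, 80.0])       # dry matter accumulation rate, development set

slope, intercept = np.polyfit(x_dev, y_dev, deg=1)      # simple linear regression

# Validation against an independent data set
x_val = np.array([100.0, 180.0, 220.0])
y_val = np.array([33.0, 60.0, 74.0])
y_hat = slope * x_val + intercept
rmse = float(np.sqrt(np.mean((y_hat - y_val) ** 2)))
print(f"y = {slope:.3f} x + {intercept:.2f}, validation RMSE = {rmse:.2f}")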

Relevance: 30.00%

Abstract:

According to recent research carried out in the foundry sector, one of the most important concerns of these industries is to improve their production planning. A foundry production plan involves two dependent stages: (1) determining the alloys to be melted and (2) determining the lots that will be produced. The purpose of this study is to draw up minimum-cost production plans for the lot-sizing problem faced by small foundries. As suggested in the literature, the proposed heuristic addresses the problem stages hierarchically: first the alloys are determined and, subsequently, the items produced from them. A knapsack problem is proposed as a tool to determine the items to be produced from each furnace load. Moreover, a genetic algorithm is proposed to explore possible sets of alloys and to determine the production plan for a small foundry. Our method attempts to overcome the difficulties that the method proposed in the literature has in finding good production plans. The computational experiments show that the proposed methods give better results than those from the literature. Furthermore, the proposed methods do not require commercial software, which is favorable for small foundries. (C) 2010 Elsevier Ltd. All rights reserved.
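A sketch of the knapsack subproblem mentioned above: for a single furnace load of a chosen alloy, select the items to cast so as to maximise the value of the load without exceeding the furnace capacity. The item weights and values are hypothetical; in the paper this subproblem is embedded in a genetic algorithm that chooses the alloy melted in each period.

def knapsack(capacity, weights, values):
    """0/1 knapsack by dynamic programming; returns best value and chosen item indices."""
    n = len(weights)
    best = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity + 1):
            best[i][c] = best[i - 1][c]
            if weights[i - 1] <= c:
                cand = best[i - 1][c - weights[i - 1]] + values[i - 1]
                if cand > best[i][c]:
                    best[i][c] = cand
    chosen, c = [], capacity
    for i in range(n, 0, -1):           # backtrack to recover the chosen items
        if best[i][c] != best[i - 1][c]:
            chosen.append(i - 1)
            c -= weights[i - 1]
    return best[n][capacity], sorted(chosen)

# Hypothetical furnace load of 600 kg and four candidate items (weight kg, value)
value, items = knapsack(capacity=600, weights=[210, 150, 330, 90], values=[9, 7, 12, 4])
print("best value:", value, "items:", items)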

Relevance: 30.00%

Abstract:

Background: Heavy-flavor production in p+p collisions is a good test of perturbative-quantum-chromodynamics (pQCD) calculations. Modification of heavy-flavor production in heavy-ion collisions relative to binary-collision scaling from p+p results, quantified with the nuclear-modification factor R_AA, provides information on both cold- and hot-nuclear-matter effects. Midrapidity heavy-flavor R_AA measurements at the Relativistic Heavy Ion Collider have challenged parton-energy-loss models and resulted in upper limits on the viscosity-to-entropy ratio that are near the quantum lower bound. Such measurements have not been made in the forward-rapidity region. Purpose: Determine transverse-momentum (p_T) spectra and the corresponding R_AA for muons from heavy-flavor meson decay in p+p and Cu+Cu collisions at √s_NN = 200 GeV and y = 1.65. Method: Results are obtained using the semileptonic decay of heavy-flavor mesons into negative muons. The PHENIX muon-arm spectrometers measure the p_T spectra of inclusive muon candidates. Backgrounds, primarily due to light hadrons, are determined with a Monte Carlo calculation using a set of input hadron distributions tuned to match measured hadron distributions in the same detector and statistically subtracted. Results: The charm-production cross section in p+p collisions at √s = 200 GeV, integrated over p_T and in the rapidity range 1.4 < y < 1.9, is found to be dσ_cc̄/dy = 0.139 ± 0.029 (stat) +0.051/−0.058 (syst) mb. This result is consistent with a perturbative fixed-order-plus-next-to-leading-log calculation within scale uncertainties and is also consistent with expectations based on the corresponding midrapidity charm-production cross section measured by PHENIX. The R_AA for heavy-flavor muons in Cu+Cu collisions is measured in three centrality bins for 1 < p_T < 4 GeV/c. Suppression relative to binary-collision scaling (R_AA < 1) increases with centrality. Conclusions: Within experimental and theoretical uncertainties, the measured charm yield in p+p collisions is consistent with state-of-the-art pQCD calculations. Suppression in central Cu+Cu collisions suggests the presence of significant cold-nuclear-matter effects and final-state energy loss.
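The nuclear-modification factor used above is R_AA = (dN_AA/dp_T) / (⟨N_coll⟩ · dN_pp/dp_T); a small sketch of the computation, with invented yields and ⟨N_coll⟩ rather than PHENIX data, is:

import numpy as np

pt_bins = np.array([1.5, 2.5, 3.5])            # GeV/c bin centres
yield_pp = np.array([4.0e-4, 6.0e-5, 1.2e-5])  # per-event yield in p+p (invented)
yield_aa = np.array([3.0e-3, 3.5e-4, 5.0e-5])  # per-event yield in Cu+Cu, one centrality bin (invented)
n_coll = 15.0                                   # assumed mean number of binary collisions

r_aa = yield_aa / (n_coll * yield_pp)
for pt, r in zip(pt_bins, r_aa):
    print(f"p_T = {pt:.1f} GeV/c : R_AA = {r:.2f}")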

Relevance: 30.00%

Abstract:

Kaurenoic acid [ent-kaur-16-en-19-oic acid (1)] is a diterpene present in several plants, including Sphagneticola trilobata. The only documented evidence for its antinociceptive effect is that it inhibits the writhing response induced by acetic acid in mice. Therefore, the analgesic effect of 1 and its mechanisms were investigated further in different models of pain in mice. Intraperitoneal and oral treatment with 1 dose-dependently inhibited inflammatory nociception induced by acetic acid. Oral treatment with 1 also inhibited overt nociception-like behavior induced by phenyl-p-benzoquinone and complete Freund's adjuvant (CFA), as well as both phases of the formalin test. Compound 1 also inhibited acute carrageenin- and PGE2-induced and chronic CFA-induced inflammatory mechanical hyperalgesia. Mechanistically, 1 inhibited the production of the hyperalgesic cytokines TNF-alpha and IL-1 beta. Furthermore, the analgesic effect of 1 was inhibited by L-NAME, ODQ, KT5823, and glybenclamide treatment, demonstrating that this activity also depends on activation of the NO-cyclic GMP-protein kinase G-ATP-sensitive potassium channel signaling pathway. These results demonstrate that 1 exhibits a consistent analgesic effect and that its mechanisms involve the inhibition of cytokine production and activation of the NO-cyclic GMP-protein kinase G-ATP-sensitive potassium channel signaling pathway.

Relevance: 30.00%

Abstract:

Sugarcane bagasse was characterized as a feedstock for the production of ethanol using hydrothermal pretreatment. Reaction temperature and time were varied between 160 and 200 °C and 5-20 min, respectively, using a response surface experimental design. The liquid fraction was analyzed for soluble carbohydrates and furan aldehydes. The solid fraction was analyzed for structural carbohydrates and Klason lignin. Pretreatment conditions were evaluated based on enzymatic extraction of glucose and xylose and on conversion to ethanol using a simultaneous saccharification and fermentation (SSF) scheme. SSF experiments were conducted with the washed pretreated biomass. The severity of the pretreatment should be sufficient to drive enzymatic digestion and ethanol yields; however, sugar losses, and especially sugar conversion into furans, need to be minimized. As expected, furfural production increased with pretreatment severity, and specifically with xylose release. However, provided that the severity was kept below a general severity factor of 4.0, furfural production remained below an inhibitory concentration and carbohydrate contents were preserved in the pretreated whole hydrolysate. There were significant interactions between time and temperature for all the responses except cellulose digestion. The models were highly predictive for cellulose digestibility (R² = 0.8861) and for ethanol production (R² = 0.9581), but less so for xylose extraction. Both cellulose digestion and ethanol production increased with severity; however, the high levels of furfural generated under more severe pretreatment conditions favor lower-severity pretreatments. The optimal pretreatment condition, giving the highest ethanol conversion yield while minimizing furfural production, was judged to be 190 °C and 17.2 min. The whole hydrolysate was also converted to ethanol using SSF. To reduce the concentration of inhibitors, the liquid fraction was conditioned prior to fermentation by removing inhibitory chemicals using the fungus Coniochaeta ligniaria.
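The combined severity factor commonly used for hydrothermal pretreatment is log R0 = log10[t · exp((T − 100)/14.75)], with t in minutes and T in °C; whether the paper uses exactly this definition is an assumption here. A quick check of the conditions quoted above:

import math

def log_severity(temp_c: float, time_min: float) -> float:
    # Combined severity factor log R0 (assumed Overend-Chornet form)
    return math.log10(time_min * math.exp((temp_c - 100.0) / 14.75))

print(f"190 C, 17.2 min -> log R0 = {log_severity(190.0, 17.2):.2f}")
print(f"200 C, 20.0 min -> log R0 = {log_severity(200.0, 20.0):.2f}")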

Relevance: 30.00%

Abstract:

We report on charmonium measurements [J/ψ(1S), ψ′(2S), and χ_c(1P)] in p+p collisions at √s = 200 GeV. We find that the fractions of J/ψ coming from the feed-down decays of ψ′ and χ_c in the midrapidity region (|y| < 0.35) are 9.6 ± 2.4% and 32 ± 9%, respectively. We also present the p_T and rapidity dependencies of the J/ψ yield measured via dielectron decay at midrapidity (|y| < 0.35) and via dimuon decay at forward rapidity (1.2 < |y| < 2.2). The statistical precision greatly exceeds that reported in our previous publication [Phys. Rev. Lett. 98, 232002 (2007)]. The new results are compared with other experiments and discussed in the context of current charmonium production models.

Relevance: 30.00%

Abstract:

Companies are currently choosing to integrate logics and systems to achieve better solutions; among these combinations, companies are striving to join the logic of the material requirements planning (MRP) system with lean production systems. The purpose of this article is to design an MRP as part of the implementation of an enterprise resource planning (ERP) system in a company that produces agricultural implements and has used the lean production system since 1998. The proposal is based on innovation theory, network theory, lean production systems, ERP systems and hybrid production systems, which use both MRP components and concepts of lean production systems. The analytical approach of innovation networks enables verification of the links and relationships among the companies and departments of the same corporation. The analysis begins with the MRP implementation project carried out in a Brazilian metallurgical company and follows the operationalisation of the MRP project through to the stabilisation of production. The main point is that the MRP system should support the company's operations by providing the agility to respond in time to demand fluctuations, facilitating the creation process and controlling the branch offices in other countries that use components produced at the headquarters, hence ensuring more accurate inventory estimates. Consequently, the article presents the enterprise knowledge development (EKD) organisational modelling methodology in order to represent further models (goals, actors and resources, business rules, business processes and concepts) that should be included in this MRP implementation process for the new configuration of the production system.
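A sketch of the core MRP logic the article builds on: gross requirements are netted against projected on-hand inventory, and the resulting net requirements are offset by the lead time to give planned order releases (lot-for-lot, ignoring scheduled receipts and safety stock). The data are illustrative only.

def mrp_record(gross_req, on_hand, lead_time):
    """Net gross requirements against stock and offset planned orders by the lead time."""
    periods = len(gross_req)
    net_req = [0] * periods
    planned_release = [0] * periods
    stock = on_hand
    for t in range(periods):
        if stock >= gross_req[t]:
            stock -= gross_req[t]
        else:
            net_req[t] = gross_req[t] - stock
            stock = 0
            release_period = t - lead_time
            if release_period >= 0:
                planned_release[release_period] = net_req[t]   # lot-for-lot ordering
    return net_req, planned_release

net, releases = mrp_record(gross_req=[20, 0, 45, 10, 30], on_hand=50, lead_time=1)
print("net requirements: ", net)
print("planned releases: ", releases)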

Relevance: 30.00%

Abstract:

In SUSY models with heavy squarks and gaugino mass unification, the gaugino pair production reaction pp → W̃1± Z̃2 dominates gluino pair production for m_g̃ ≳ 1 TeV at the LHC with √s = 14 TeV (LHC14). For this mass range, the two-body decays W̃1 → W Z̃1 and Z̃2 → h Z̃1 are expected to dominate the chargino and neutralino branching fractions. By searching for ℓbb̄ + E_T^miss events from W̃1± Z̃2 production, we show that LHC14 with 100 fb^-1 of integrated luminosity becomes sensitive to chargino masses in the range m_W̃1 ~ 450-550 GeV, corresponding to m_g̃ ~ 1.5-2 TeV in models with gaugino mass unification. For 1000 fb^-1, LHC14 is sensitive to the Wh channel for m_W̃1 ~ 300-800 GeV, corresponding to m_g̃ ~ 1-2.8 TeV, which is comparable to the reach for gluino pair production followed by cascade decays. The Wh + E_T^miss search channel opens up a new complementary avenue for SUSY searches at the LHC and serves to point to SUSY as the origin of any new physics discovered via multijet and multilepton + E_T^miss channels.