144 results for CUTTING STOCK PROBLEM
Abstract:
We propose a stylized model of a problem-solving organization whose internal communication structure is given by a fixed network. Problems arrive randomly anywhere in this network and must find their way to their respective specialized solvers by relying on local information alone. The organization handles multiple problems simultaneously. For this reason, the process may be subject to congestion. We provide a characterization of the threshold of collapse of the network and of the stock of floating problems (or average delay) that prevails below that threshold. We build upon this characterization to address a design problem: the determination of what kind of network architecture optimizes performance for any given problem arrival rate. We conclude that, for low arrival rates, the optimal network is very polarized (i.e. star-like or centralized), whereas it is largely homogeneous (or decentralized) for high arrival rates. We also show that, if an auxiliary assumption holds, the transition between these two opposite structures is sharp and they are the only ones to ever qualify as optimal.
Abstract:
We set up a dynamic model of firm investment in which liquidity constraints enter explicitly into the firm's maximization problem. The optimal policy rules are incorporated into a maximum likelihood procedure which estimates the structural parameters of the model. Investment is positively related to the firm's internal financial position when the firm is relatively poor. This relationship disappears for wealthy firms, which can reach their desired level of investment. Borrowing is an increasing function of financial position for poor firms. This relationship is reversed as a firm's financial position improves, and large firms hold little debt. Liquidity-constrained firms may have unused credit lines and the capacity to invest further if they desire. However, the fear that liquidity constraints will become binding in the future induces them to invest only when internal resources increase. We estimate the structural parameters of the model and use them to quantify the importance of liquidity constraints on firms' investment. We find that liquidity constraints matter significantly for the investment decisions of firms. If firms can finance investment by issuing fresh equity, rather than with internal funds or debt, the average capital stock is almost 35% higher over a period of 20 years. Transitory shocks to internal funds have a sustained effect on the capital stock. This effect lasts for several periods and is more persistent for small firms than for large firms. A 10% negative shock to firm fundamentals reduces the capital stock of firms which face liquidity constraints by almost 8% over a period, as opposed to only 3.5% for firms which do not face these constraints.
Abstract:
This paper presents a simple Optimised Search Heuristic for the Job Shop Scheduling problem that combines a GRASP heuristic with a branch-and-bound algorithm. The proposed method is compared with similar approaches and leads to better results in terms of solution quality and computing times.
Abstract:
We present new metaheuristics for solving real crew scheduling problems in a public transportation bus company. Since the crews of these companies are drivers, we will designate the problem by the bus-driver scheduling problem. Crew scheduling problems are well known, and several mathematical programming based techniques have been proposed to solve them, in particular using the set-covering formulation. However, in practice, there exists the need for improvement in terms of computational efficiency and capacity of solving large-scale instances. Moreover, the real bus-driver scheduling problems that we consider can present variant aspects of the set covering, as for example a different objective function, implying that alternative solution methods have to be developed. We propose metaheuristics based on the following approaches: GRASP (greedy randomized adaptive search procedure), tabu search and genetic algorithms. These metaheuristics also present some innovative features based on the structure of the crew scheduling problem, which guide the search efficiently and enable them to find good solutions. Some of these new features can also be applied in the development of heuristics for other combinatorial optimization problems. A summary of computational results with real-data problems is presented.
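The GRASP approach named in this abstract can be illustrated on a generic set-covering instance. The sketch below is not the authors' method, only a minimal illustration of the construct-then-improve loop (randomized greedy construction restricted to a candidate list, followed by redundancy removal); all names, parameters and the alpha threshold are our own assumptions.

```python
import random

def grasp_set_cover(universe, subsets, costs, iters=50, alpha=0.3, seed=0):
    """Illustrative GRASP for set covering (not the paper's algorithm)."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        uncovered = set(universe)
        chosen = []
        while uncovered:
            # Greedy score: cost per newly covered element.
            cand = [(costs[i] / len(uncovered & s), i)
                    for i, s in enumerate(subsets) if uncovered & s]
            cand.sort()
            lo, hi = cand[0][0], cand[-1][0]
            # Restricted candidate list: near-greedy choices, picked at random.
            rcl = [i for score, i in cand if score <= lo + alpha * (hi - lo)]
            pick = rng.choice(rcl)
            chosen.append(pick)
            uncovered -= subsets[pick]
        # Local search: drop redundant subsets, most expensive first.
        for i in sorted(chosen, key=lambda i: -costs[i]):
            rest = set().union(*(subsets[j] for j in chosen if j != i))
            if set(universe) <= rest:
                chosen.remove(i)
        cost = sum(costs[i] for i in chosen)
        if cost < best_cost:
            best, best_cost = chosen, cost
    return best, best_cost
```

Problem-specific features such as those the authors describe would typically enter in the scoring function and in the local search neighborhood.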
Abstract:
We study the effects of nominal debt on the optimal sequential choice of monetary policy. When the stock of debt is nominal, the incentive to generate unanticipated inflation increases the cost of the outstanding debt even if no unanticipated inflation episodes occur in equilibrium. Without full commitment, the optimal sequential policy is to deplete the outstanding stock of debt progressively until these extra costs disappear. Nominal debt is therefore a burden on monetary policy, not only because it must be serviced, but also because it creates a time inconsistency problem that distorts interest rates. The introduction of alternative forms of taxation may lessen this burden, if there is enough commitment to fiscal policy. If there is full commitment to an optimal fiscal policy, then the resulting monetary policy is the Friedman rule of zero nominal interest rates.
Abstract:
The paper argues that the market significantly overvalues firms with severely underfunded pension plans. These companies earn lower stock returns than firms with healthier pension plans for at least five years after the first emergence of the underfunding. The low returns are not explained by risk, price momentum, earnings momentum, or accruals. Further, the evidence suggests that investors do not anticipate the impact of the pension liability on future earnings, and they are surprised when the negative implications of underfunding ultimately materialize. Finally, underfunded firms have poor operating performance, and they earn low returns, although they are value companies.
Abstract:
This paper studies the equilibrating process of several implementation mechanisms using naive adaptive dynamics. We show that the dynamics converge and are stable for the canonical mechanism of implementation in Nash equilibrium. In this way we cast some doubt on the criticism of ``complexity'' commonly used against this mechanism. For mechanisms that use more refined equilibrium concepts, the dynamics converge but are not stable. Some papers in the literature on implementation with refined equilibrium concepts have claimed that the mechanisms they propose are ``simple'' and implement ``everything'' (in contrast with the canonical mechanism). The fact that some of these ``simple'' mechanisms have unstable equilibria suggests that these statements should be interpreted with some caution.
Abstract:
We combine existing balance sheet and stock market data with two new datasets to study whether, how much, and why bank lending to firms matters for the transmission of monetary policy. The first new dataset enables us to quantify the bank dependence of firms precisely, as the ratio of bank debt to total assets. We show that a two standard deviation increase in the bank dependence of a firm makes its stock price about 25% more responsive to monetary policy shocks. We explore the channels through which this effect occurs, and find that the stock prices of bank-dependent firms that borrow from financially weaker banks display a stronger sensitivity to monetary policy shocks. This finding is consistent with the bank lending channel, a theory according to which the strength of bank balance sheets matters for monetary policy transmission. We construct a new database of hedging activities and show that the stock prices of bank-dependent firms that hedge against interest rate risk display a lower sensitivity to monetary policy shocks. This finding is consistent with an interest rate pass-through channel that operates via the direct transmission of policy rates to lending rates associated with the widespread use of floating rates in bank loans and credit line agreements.
Abstract:
The speed and width of front solutions to reaction-dispersal models are analyzed both analytically and numerically. We perform our analysis for Laplace and Gaussian distribution kernels, both for delayed and nondelayed models. The results are discussed in terms of the characteristic parameters of the models.
Abstract:
Companies have always sought to optimize their resources as much as possible and to be more efficient in carrying out the tasks entrusted to them. For this reason, companies constantly carry out studies and assessments of how to improve day by day. This is no different at the company Serralleria i Alumini Vilaró (S.A.V), which studies daily how to optimize its processes and sometimes introduces new ones in order to expand its range of services. On the production side, the company manufactures metal parts, whether the process involves only cutting and machining, bending, welding, stainless steel finishes, painting or even packaging; on the technical office side, it also offers product development services according to customer specifications and re-engineering of any product, analyzing the part to be improved. The company has now identified a shortcoming that it believes could be solved: it has several cutting machines, among them a laser cutting machine, and the main problem is that loading the sheets from the storage tray onto the machine bed is done either manually or with a gripper suspended from the overhead crane, depending on the weight of the sheet to be transported. The main objective of this work is to design a machine that automates the process of transporting the metal sheet from the storage tray, placed on a mobile table, to the bed of the cutting machine. The design we intend to produce is complete, starting with the structural design of the machine and its corresponding calculations, the movements we want to achieve, the choice of components (motors, sensors, ...), the preparation of a budget to allow a cost estimate, and finally the development of the control program for the whole machine plus the interaction with the machine through a touch screen.
That is, we intend to carry out a project that could actually be manufactured, using all the information contained within it.
Abstract:
We present some results attained with different algorithms for the Fm|block|Cmax problem using as experimental data the well-known Taillard instances.
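For readers unfamiliar with the Fm|block|Cmax notation: it denotes a permutation flow shop with m machines and blocking, where a job that has finished on one machine cannot leave it until the next machine is free, and the objective is to minimize the makespan. The sketch below is a minimal illustration of the standard makespan evaluation for a given job permutation, written by us; it is not tied to the algorithms compared in the paper.

```python
def blocking_makespan(p, seq):
    """Makespan of a permutation `seq` in a flow shop with blocking.

    p[j][i] is the processing time of job j on machine i.
    With blocking, a job stays on machine i until machine i+1 is free.
    """
    n_machines = len(p[0])
    depart_prev = [0.0] * n_machines  # departure times of the previous job
    for j in seq:
        depart = [0.0] * n_machines
        start = depart_prev[0]  # machine 0 is free once the previous job left it
        for i in range(n_machines):
            finish = start + p[j][i]
            if i < n_machines - 1:
                # Blocked on machine i until the next machine is released.
                depart[i] = max(finish, depart_prev[i + 1])
            else:
                depart[i] = finish  # no blocking after the last machine
            start = depart[i]     # the job starts machine i+1 when it departs i
        depart_prev = depart
    return depart_prev[-1]
```

Heuristics for this problem typically evaluate many permutations with a recurrence of this kind and keep the one with the smallest makespan.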
Abstract:
In this paper we deal with the identification of dependencies between time series of equity returns. Marginal distribution functions are assumed to be known, and a bivariate chi-square test of fit is applied in a fully parametric copula approach. Several families of copulas are fitted and compared with Spanish stock market data. The results show that the t-copula generally outperforms other dependence structures, and highlight the difficulty in adjusting a significant number of bivariate data series.
Abstract:
A retarded backward equation for a non-Markovian process induced by dichotomous noise (the random telegraphic signal) is deduced. The mean-first-passage time of this process is exactly obtained. The Gaussian white noise and the white shot noise limits are studied. Explicit physical results in first approximation are evaluated.
Abstract:
We have performed a detailed study of the zenith angle dependence of the regeneration factor and distributions of events at SNO and SK for different solutions of the solar neutrino problem. In particular, we discuss the oscillatory behavior and the synchronization effect in the distribution for the LMA solution, the parametric peak for the LOW solution, etc. A physical interpretation of the effects is given. We suggest a new binning of events which emphasizes the distinctive features of the zenith angle distributions for the different solutions. We also find the correlations between the integrated day-night asymmetry and the rates of events in different zenith angle bins. The study of these correlations strengthens the identification power of the analysis.