967 results for Operations management

Relevance:

60.00%

Publisher:

Abstract:

We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers preemptively select customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
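
As a concrete illustration of the no-feedback special case mentioned above, the short sketch below computes $c \mu$ priority indices and orders the customer classes accordingly. It is only a minimal sketch of the index rule, not the paper's analysis, and the class data are invented for illustration.

    # Minimal sketch of the classical c-mu priority rule (no-feedback case):
    # each of the m servers preemptively serves a waiting customer from the
    # class with the largest index c_k * mu_k. Class data are illustrative.
    def cmu_priority_order(holding_costs, service_rates):
        """Return class labels sorted by decreasing c*mu index."""
        index = {k: c * mu for k, (c, mu)
                 in enumerate(zip(holding_costs, service_rates))}
        return sorted(index, key=index.get, reverse=True)

    c = [4.0, 1.0, 2.5]      # holding cost per customer per unit time
    mu = [0.5, 3.0, 1.0]     # exponential service rates
    print(cmu_priority_order(c, mu))   # -> [1, 2, 0]; class 1 has top priority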

Relevance:

60.00%

Publisher:

Abstract:

This paper discusses the role of deterministic components in the DGP and in the auxiliary regression model which underlies the implementation of the Fractional Dickey-Fuller (FDF) test for I(1) against I(d) processes with $d \in [0, 1)$. This is an important test in many economic applications because I(d) processes with $d < 1$ are mean-reverting although, when $0.5 \le d < 1$, like I(1) processes, they are nonstationary. We show how simple the implementation of the FDF test is in these situations, and argue that it has better properties than LM tests. A simple testing strategy entailing only asymptotically normally distributed tests is also proposed. Finally, an empirical application is provided where the FDF test allowing for deterministic components is used to test for long memory in the per capita GDP of several OECD countries, an issue that has important consequences for discriminating between growth theories, and on which there is some controversy.
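
A rough sketch of the mechanics behind such a test, without deterministic components, is given below: the first difference of the series is regressed on the $d$-th fractional difference of its lagged level and the $t$-ratio of the slope is examined. The data, the choice $d = 0.7$ and the bare-bones regression are illustrative assumptions, not the paper's exact procedure.

    # Hedged sketch of an FDF-type regression: (1-L) y_t on (1-L)^d y_{t-1}.
    # Everything here (data, d = 0.7, no deterministic terms) is illustrative.
    import numpy as np

    def frac_diff(y, d):
        """Truncated (1 - L)^d filter via the binomial recursion."""
        n = len(y)
        pi = np.zeros(n)
        pi[0] = 1.0
        for j in range(1, n):
            pi[j] = pi[j - 1] * (j - 1 - d) / j
        return np.array([pi[:t + 1][::-1] @ y[:t + 1] for t in range(n)])

    def fdf_t_ratio(y, d):
        """t-ratio of phi in: (1-L) y_t = phi * (1-L)^d y_{t-1} + error."""
        dy = np.diff(y)                  # dependent variable
        x = frac_diff(y, d)[:-1]         # fractionally differenced lagged level
        phi = (x @ dy) / (x @ x)
        resid = dy - phi * x
        se = np.sqrt((resid @ resid) / (len(dy) - 1) / (x @ x))
        return phi / se

    rng = np.random.default_rng(0)
    y = np.cumsum(rng.standard_normal(300))   # a random walk: the I(1) null
    print(fdf_t_ratio(y, d=0.7))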

Relevance:

60.00%

Publisher:

Abstract:

In this work we discuss some ideas and opinions related to teaching Metaheuristics in Business Schools. The main purpose of the work is to initiate a discussion and collaboration on this topic, with the final objective of improving the teaching and publicity of the area. The main topics to be discussed are the environment and focus of this teaching. We also present a SWOT analysis, which leads us to the conclusion that the area of Metaheuristics can only gain from the presentation and discussion of metaheuristics and related topics in Business Schools, since they constitute excellent Decision Support tools for potential future users.

Relevance:

60.00%

Publisher:

Abstract:

Public transportation is gaining importance every year, basically due to population growth, environmental policies, and route and street congestion. To enable efficient management of all the resources related to public transportation, several techniques from different areas are being applied, and several projects in Transportation Planning Systems are being developed in different countries. In this work, we present the GIST Planning Transportation Systems, a Portuguese project involving two universities and six public transportation companies. We describe in detail one of the most relevant modules of this project, the crew-scheduling module. The crew-scheduling module is based on the application of metaheuristics, in particular GRASP, tabu search and genetic algorithms, to solve the bus-driver-scheduling problem. The metaheuristics have been successfully incorporated into the GIST Planning Transportation Systems and are currently used by several companies.
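
The GRASP component can be pictured as a greedy randomized construction over a set-covering view of the problem, with duties covering trips. The sketch below only illustrates that idea on invented duty data; it is not the GIST crew-scheduling module itself.

    # Illustrative GRASP-style construction for a set-covering view of driver
    # scheduling: repeatedly pick a duty from a restricted candidate list (RCL)
    # until all trips are covered. Duty pool, costs and alpha are assumptions.
    import random

    def grasp_construct(trips, duties, costs, alpha=0.3, seed=0):
        rng = random.Random(seed)
        uncovered, solution = set(trips), []
        while uncovered:
            # Cost per newly covered trip for every still-useful duty.
            score = {d: costs[d] / len(duties[d] & uncovered)
                     for d in duties
                     if d not in solution and duties[d] & uncovered}
            best, worst = min(score.values()), max(score.values())
            rcl = [d for d, s in score.items()
                   if s <= best + alpha * (worst - best)]
            pick = rng.choice(rcl)
            solution.append(pick)
            uncovered -= duties[pick]
        return solution

    trips = range(6)
    duties = {"A": {0, 1, 2}, "B": {2, 3}, "C": {3, 4, 5}, "D": {0, 5}}
    costs = {"A": 3.0, "B": 1.5, "C": 3.0, "D": 2.5}
    print(grasp_construct(trips, duties, costs))

A full GRASP would repeat this construction many times with different random seeds and improve each solution by local search, keeping the best schedule found.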

Relevance:

60.00%

Publisher:

Abstract:

Highly competitive environments are leading companies to implement Supply Chain Management (SCM) to improve performance and gain a competitive advantage. SCM involves integration, co-ordination and collaboration across organisations and throughout the supply chain. It means that SCM requires internal (intraorganisational) and external (interorganisational) integration. This paper examines the Logistics-Production and Logistics-Marketing interfaces and their relation with the external integration process. The study also investigates the causal impact of these internal and external relationships on the company's logistical service performance. To analyse this, an empirical study was conducted in the Spanish Fast Moving Consumer Goods (FMCG) sector.

Relevance:

60.00%

Publisher:

Abstract:

This paper proposes an explanation as to why some mergers fail, based on the interaction between the pre- and post-merger processes. We argue that failure may stem from informational asymmetries arising from the pre-merger period, and problems of cooperation and coordination within recently merged firms. We show that a partner may optimally agree to merge and abstain from putting forth any post-merger effort, counting on the other partner to make the necessary efforts. If both follow the same course of action, the merger goes ahead but fails. Our unique equilibrium allows us to make predictions on which mergers are more likely to fail.

Relevance:

60.00%

Publisher:

Abstract:

The Maximum Capture problem (MAXCAP) is a decision model that addresses the issue of location in a competitive environment. This paper presents a new approach to determine which store attributes (other than distance) should be included in the new Market Capture Models and how they ought to be reflected using the Multiplicative Competitive Interaction model. The methodology involves the design and development of a survey, and the application of factor analysis and ordinary least squares. The methodology has been applied to the supermarket sector in two different scenarios: Milton Keynes (Great Britain) and Barcelona (Spain).
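
For reference, the Multiplicative Competitive Interaction form turns store attributes and distance into capture probabilities: each store's attraction is the product of its attributes raised to estimated weights, times a distance-decay term, normalized over all competing stores. The attribute names, weights and distances in this small sketch are invented for illustration.

    # Illustrative MCI computation: a store's capture probability is its
    # attraction divided by the total attraction of all competing stores.
    def mci_shares(attributes, betas, distances, lam):
        attraction = []
        for dist, attrs in zip(distances, attributes):
            a = dist ** (-lam)
            for value, beta in zip(attrs, betas):
                a *= value ** beta
            attraction.append(a)
        total = sum(attraction)
        return [a / total for a in attraction]

    # Two stores, two attributes (say, floor space and a parking score).
    stores = [[1200.0, 3.0], [800.0, 5.0]]
    print(mci_shares(stores, betas=[0.6, 0.4], distances=[2.0, 3.5], lam=1.8))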

Relevance:

60.00%

Publisher:

Abstract:

We present a polyhedral framework for establishing general structural properties on optimal solutions of stochastic scheduling problems, where multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's) taking as input the linear objective coefficients, which (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.

Relevance:

60.00%

Publisher:

Abstract:

In this paper we present an algorithm to assign proctors to exams. This NP-hard problem is related to the generalized assignment problem with multiple objectives. The problem consists of assigning teaching assistants to proctor final exams at a university. We formulate this problem as a multiobjective integer program (IP) with a preference function and a workload-fairness function. We then also consider a weighted objective that combines both functions. We develop a scatter search procedure and compare its outcome with solutions found by solving the IP model with CPLEX 6.5. Our test problems are real instances from a university in Spain.
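
A stripped-down version of such an assignment model can be written as a small integer program. The sketch below keeps only a preference objective and a one-exam workload cap (the fairness function is omitted), uses invented data, and assumes the open-source pulp package is available; it is not the authors' formulation.

    # Toy proctor-assignment IP (illustrative only): maximize total preference,
    # meet each exam's proctor requirement, cap each TA's workload at one exam.
    import pulp

    tas = ["ta1", "ta2", "ta3"]
    exams = ["E1", "E2"]
    need = {"E1": 2, "E2": 1}                      # proctors required per exam
    pref = {("ta1", "E1"): 3, ("ta1", "E2"): 1,
            ("ta2", "E1"): 2, ("ta2", "E2"): 3,
            ("ta3", "E1"): 1, ("ta3", "E2"): 2}    # TA preference scores

    prob = pulp.LpProblem("proctor_assignment", pulp.LpMaximize)
    x = pulp.LpVariable.dicts("x", pref.keys(), cat="Binary")
    prob += pulp.lpSum(pref[k] * x[k] for k in pref)
    for e in exams:
        prob += pulp.lpSum(x[t, e] for t in tas) == need[e]
    for t in tas:
        prob += pulp.lpSum(x[t, e] for e in exams) <= 1
    prob.solve()
    print([k for k in pref if x[k].value() > 0.5])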

Relevance:

60.00%

Publisher:

Abstract:

The well-known lack of power of unit root tests has often been attributed to the short length of macroeconomic variables and also to DGPs that depart from the I(1)-I(0) alternatives. This paper shows that by using long spans of annual real GNP and GNP per capita (133 years), high power can be achieved, leading to the rejection of both the unit root and the trend-stationary hypothesis. This suggests that possibly neither model provides a good characterization of these data. Next, more flexible representations are considered, namely, processes containing structural breaks (SB) and fractional orders of integration (FI). Economic justification for the presence of these features in GNP is provided. It is shown that the latter models (FI and SB) are in general preferred to the ARIMA (I(1) or I(0)) ones. As a novelty in this literature, new techniques are applied to discriminate between FI and SB models. It turns out that the FI specification is preferred, implying that GNP and GNP per capita are non-stationary, highly persistent but mean-reverting series. Finally, it is shown that the results are robust when breaks in the deterministic component are allowed for in the FI model. Some macroeconomic implications of these findings are also discussed.

Relevance:

60.00%

Publisher:

Abstract:

The statistical properties of inflation and, in particular, its degree of persistence and stability over time are a subject of intense debate, and no consensus has been reached yet. The goal of this paper is to analyze this controversy using a general approach, with the aim of providing a plausible explanation for the existing contradictory results. We consider the inflation rates of 21 OECD countries, which are modelled as fractionally integrated (FI) processes. First, we show analytically that FI can appear in inflation rates after aggregating individual prices from firms that face different costs of adjusting their prices. Then, we provide robust empirical evidence supporting the FI hypothesis using both classical and Bayesian techniques. Next, we estimate impulse response functions and other scalar measures of persistence, achieving an accurate picture of this property and its variation across countries. It is shown that the application of some popular tools for measuring persistence, such as the sum of the AR coefficients, could lead to erroneous conclusions if fractional integration is present. Finally, we explore the existence of changes in inflation inertia using a novel approach. We conclude that the persistence of inflation is very high (although non-permanent) in most post-industrial countries and that it has remained basically unchanged over the last four decades.
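
To make the persistence point concrete, the small calculation below (illustrative, not taken from the paper) prints impulse responses of a pure ARFIMA(0, d, 0) process: for 0 < d < 1 the responses decay only hyperbolically, so shocks are long-lived yet eventually vanish, whereas for d = 1 they never die out.

    # Impulse responses of a pure fractionally integrated process (1-L)^{-d}:
    # psi_0 = 1 and psi_j = psi_{j-1} * (j - 1 + d) / j. The d values are
    # illustrative, chosen only to contrast mean reversion with a unit root.
    def fi_impulse_responses(d, horizon):
        psi = [1.0]
        for j in range(1, horizon + 1):
            psi.append(psi[-1] * (j - 1 + d) / j)
        return psi

    for d in (0.4, 0.8, 1.0):
        resp = fi_impulse_responses(d, horizon=40)
        print(f"d={d}: psi_10={resp[10]:.3f}  psi_40={resp[40]:.3f}")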

Relevance:

60.00%

Publisher:

Abstract:

Previous covering models for emergency services consider all calls to be of the same importance and impose the same waiting time constraints independently of the service's priority. This type of constraint is clearly inappropriate in many contexts. For example, in urban medical emergency services, calls that involve danger to human life deserve higher priority than calls for more routine incidents. A realistic model in such a context should allow prioritizing the calls for service. In this paper, a covering model which considers different priority levels is formulated and solved. The model inherits its formulation from previous research on Maximum Coverage Models and incorporates results from Queuing Theory, in particular Priority Queuing. The additional complexity incorporated in the model justifies the use of a heuristic procedure.
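
One standard Priority Queuing result that such a covering model can draw on is Cobham's waiting-time formula for a non-preemptive M/G/1 priority queue, sketched below with invented arrival and service figures (class 1 is the highest priority). This is only background illustration, not the formulation proposed in the paper.

    # Cobham's formula for non-preemptive M/G/1 priority waits (illustration):
    # W_k = W0 / ((1 - s_{k-1}) * (1 - s_k)), s_k = sum_{i<=k} lam_i * E[S_i],
    # W0 = sum_i lam_i * E[S_i^2] / 2. Classes listed from highest priority.
    def priority_waits(lam, mean_s, mean_s2):
        w0 = sum(l * m2 for l, m2 in zip(lam, mean_s2)) / 2.0
        waits, sigma_prev = [], 0.0
        for l, m in zip(lam, mean_s):
            sigma = sigma_prev + l * m
            waits.append(w0 / ((1 - sigma_prev) * (1 - sigma)))
            sigma_prev = sigma
        return waits

    # Life-threatening vs routine calls; exponential service (E[S^2] = 2 E[S]^2).
    lam = [2.0, 6.0]                     # calls per hour
    mean_s = [0.08, 0.10]                # mean service times in hours
    mean_s2 = [2 * s * s for s in mean_s]
    print(priority_waits(lam, mean_s, mean_s2))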