936 results for Management Science and Operations Research


Relevance: 100.00%
Publisher:
Abstract:

We consider the problem of train planning or scheduling for large, busy, complex train stations, which are common in Europe and elsewhere, though not in North America. We develop the constraints and objectives for this problem, but these are too computationally complex to solve by standard combinatorial search or integer programming methods. The problem is also somewhat political in nature: it has no single clear objective function, because it involves multiple train operators with conflicting interests. We therefore develop scheduling heuristics analogous to those successfully adopted by train planners using 'manual' methods. We tested the model and algorithms by applying them to a typical large station that exhibits most of the complexities found in practice. The results compare well with those found by traditional methods, and take account of cost and preference trade-offs not handled by those methods. With successive refinements, the algorithm eventually took only a few seconds to run, the time depending on the version of the algorithm and the scheduling problem. The scheduling models and algorithms developed and tested here can be used on their own, or as key components of a more general system for scheduling trains over a rail line or network.

Train scheduling for a busy station includes ensuring that there are no conflicts between several hundred trains per day moving in and out of the station on intersecting paths from multiple in-lines and out-lines to multiple platforms, while ensuring that each train is allowed at least its minimum required headways, dwell time, turnaround time and trip time. This has to be done while minimizing the costs of deviations from desired times, platforms or lines, allowing for conflicts due to through-platforms, dead-end platforms and multiple sub-platforms, and for possible constraints due to infrastructure, safety or business policy.
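The core feasibility check behind such station scheduling can be sketched as a pairwise platform-occupation test. The trains, times and three-minute headway below are hypothetical, and a realistic model would also check the intersecting in-line and out-line routes, not just the platforms:

```python
from dataclasses import dataclass

@dataclass
class TrainMove:
    train: str
    platform: int
    arrival: int    # minutes after midnight
    departure: int  # arrival plus dwell time

def headway_conflicts(moves, min_headway=3):
    """Return pairs of trains that violate the minimum headway on a shared
    platform (simplified rule: occupation intervals, padded by the headway,
    must not overlap)."""
    conflicts = []
    for i, a in enumerate(moves):
        for b in moves[i + 1:]:
            if a.platform != b.platform:
                continue
            if a.arrival < b.departure + min_headway and b.arrival < a.departure + min_headway:
                conflicts.append((a.train, b.train))
    return conflicts

moves = [
    TrainMove("IC101", platform=1, arrival=600, departure=605),
    TrainMove("RE202", platform=1, arrival=607, departure=612),  # only 2 min after IC101 leaves
    TrainMove("RE303", platform=2, arrival=606, departure=611),
]
print(headway_conflicts(moves))  # [('IC101', 'RE202')]
```

A scheduling heuristic would call a check like this repeatedly while shifting times or platforms to resolve the reported conflicts at least cost.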

Relevance: 100.00%
Publisher:
Abstract:

Exam timetabling is one of the most important administrative activities in academic institutions. In this paper we present a critical discussion of the research on exam timetabling over roughly the last decade. These ten years have seen an increased level of attention to this important topic, with a range of significant contributions to the scientific literature in terms of both theoretical and practical aspects. The main aim of this survey is to highlight the new trends and key research achievements of the last decade. We also aim to outline a range of relevant and important research issues and challenges that this body of work has generated.

We first define the problem and review previous survey papers. Algorithmic approaches are then classified and discussed. These include early techniques (e.g. graph heuristics) and state-of-the-art approaches including meta-heuristics, constraint-based methods, multi-criteria techniques, hybridisations, and recent new trends concerning neighbourhood structures, which are motivated by raising the generality of the approaches. Summarising tables are presented to provide an overall view of these techniques. We discuss issues concerning decomposition techniques, system tools and languages, models and complexity. We also present and discuss some important issues which have come to light concerning the public benchmark exam timetabling data. Different versions of problem datasets with the same name have been circulating in the scientific community over the last ten years, which has generated a significant amount of confusion. We clarify the situation and present a renaming of the widely studied datasets to avoid future confusion. We also highlight which research papers have dealt with which dataset. Finally, we draw upon our discussion of the literature to present a (non-exhaustive) range of potential future research directions and open issues in exam timetabling research.
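As a concrete illustration of the early graph-heuristic techniques such surveys cover, here is a minimal largest-degree-first colouring sketch; the exam names and clash pairs are hypothetical:

```python
def largest_degree_timetable(exams, conflicts):
    """Greedy graph-colouring heuristic: order exams by conflict degree
    (largest first) and give each the earliest timeslot that clashes with
    no already-scheduled conflicting exam."""
    degree = {e: sum(1 for c in conflicts if e in c) for e in exams}
    slot = {}
    for exam in sorted(exams, key=lambda e: -degree[e]):
        taken = {slot[o] for a, b in conflicts if exam in (a, b)
                 for o in (a, b) if o != exam and o in slot}
        t = 0
        while t in taken:
            t += 1
        slot[exam] = t
    return slot

# Hypothetical instance: an edge joins two exams sharing at least one student.
exams = ["maths", "physics", "history", "art"]
conflicts = [("maths", "physics"), ("maths", "history"), ("physics", "history")]
tt = largest_degree_timetable(exams, conflicts)
print(tt)  # {'maths': 0, 'physics': 1, 'history': 2, 'art': 0}
```

The mutually clashing triangle of exams needs three slots, while the clash-free exam reuses slot 0; real heuristics of this family refine the ordering rule (saturation degree, weighted degree, and so on).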

Relevance: 100.00%
Publisher:
Abstract:

In this paper, we present a random iterative graph-based hyper-heuristic that produces a collection of heuristic sequences to construct solutions of different quality. These heuristic sequences can be seen as dynamic hybridisations of different graph colouring heuristics that construct solutions step by step. Based on these sequences, we statistically analyse the way in which graph colouring heuristics are automatically hybridised. This, to our knowledge, represents a new direction in hyper-heuristic research. It is observed that spending the search effort on hybridising Largest Weighted Degree with Saturation Degree at the early stage of solution construction tends to generate high-quality solutions. Based on these observations, an iterative hybrid approach is developed to adaptively hybridise these two graph colouring heuristics at different stages of solution construction. The overall aim here is to automate the heuristic design process, which draws upon an emerging research theme on developing computer methods to design and adapt heuristics automatically. Experimental results on benchmark exam timetabling and graph colouring problems demonstrate the effectiveness and generality of this adaptive hybrid approach compared with previous methods for automatically generating and adapting heuristics. Indeed, we also show that the approach is competitive with state-of-the-art human-produced methods.
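A minimal sketch of the heuristic-sequence idea, assuming a toy instance and uniform clash weights; `hybrid_construct` and its 'L'/'S' encoding are illustrative inventions, not the paper's implementation:

```python
def hybrid_construct(exams, conflicts, weights, sequence):
    """Build a timetable step by step; at step i the character
    sequence[i % len(sequence)] selects which heuristic picks the next exam:
    'L' = Largest Weighted Degree, 'S' = Saturation Degree.
    Each picked exam then gets the earliest clash-free slot."""
    neighbours = {e: set() for e in exams}
    for a, b in conflicts:
        neighbours[a].add(b)
        neighbours[b].add(a)
    slot, unscheduled = {}, set(exams)
    for step in range(len(exams)):
        if sequence[step % len(sequence)] == "L":
            # Largest Weighted Degree: most total clash weight first
            pick = max(unscheduled,
                       key=lambda e: sum(weights.get(frozenset((e, n)), 1)
                                         for n in neighbours[e]))
        else:
            # Saturation Degree variant: most distinct neighbouring slots first
            pick = max(unscheduled,
                       key=lambda e: len({slot[n] for n in neighbours[e] if n in slot}))
        taken = {slot[n] for n in neighbours[pick] if n in slot}
        t = 0
        while t in taken:
            t += 1
        slot[pick] = t
        unscheduled.remove(pick)
    return slot

exams = ["a", "b", "c", "d", "e"]
clashes = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
tt = hybrid_construct(exams, clashes, weights={}, sequence="LLSSS")
```

The sequence string is the object a hyper-heuristic would search over; the observation in the abstract corresponds to sequences that front-load 'L' choices.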

Relevance: 100.00%
Publisher:
Abstract:

This paper describes the application of multivariate regression techniques to the Tennessee Eastman benchmark process for modelling and fault detection. Two methods are applied: linear partial least squares (PLS), and a nonlinear variant of this procedure using a radial basis function (RBF) inner relation. The performance of the RBF networks is enhanced through the use of a recently developed training algorithm, detailed in this paper, which uses quasi-Newton optimization to ensure an efficient and parsimonious network. The PLS and PLS/RBF methods are then used to create on-line inferential models of delayed process measurements. As these measurements relate to the final product composition, the models suggest that on-line statistical quality control analysis should be possible for this plant. The generation of 'soft sensors' for these measurements has the further effect of introducing a redundant element into the system, redundancy which can then be used to generate a fault detection and isolation scheme for these sensors. This is achieved by arranging the sensors and models in a manner comparable to the dedicated estimator scheme of Clark et al. (1975, IEEE Trans. Aerosp. Electron. Syst., AES-11, 465-473). The effectiveness of this scheme is demonstrated on a series of simulated sensor and process faults, with full detection and isolation shown to be possible for sensor malfunctions, and detection feasible in the case of process faults. Suggestions for enhancing the diagnostic capacity in the latter case are covered towards the end of the paper.
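A one-component PLS model can be sketched in a few lines. This toy version (synthetic data, single latent variable) only illustrates the projection-and-inner-regression structure; it is not the multi-component PLS or the RBF inner relation used in the paper:

```python
import math
import random

def pls1(X, y):
    """One-component PLS sketch: weight vector w = X'y (normalised),
    score t = Xw, and inner-relation coefficient b = t'y / t't."""
    n, p = len(X), len(X[0])
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]   # centre X
    yc = [v - ym for v in y]                                 # centre y
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = math.sqrt(sum(v * v for v in w))
    w = [v / norm for v in w]
    t = [sum(xr[j] * w[j] for j in range(p)) for xr in Xc]   # scores
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    return w, b, xm, ym

# Hypothetical data: y depends linearly on two of three measured variables.
random.seed(1)
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(200)]
y = [2 * r[0] - r[2] + random.gauss(0, 0.1) for r in X]
w, b, xm, ym = pls1(X, y)
pred = [b * sum((r[j] - xm[j]) * w[j] for j in range(3)) + ym for r in X]
# pred tracks y closely for this nearly linear example
```

An inferential 'soft sensor' of the kind described in the abstract is exactly such a model evaluated on-line, with the prediction residual monitored for fault detection.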

Relevance: 100.00%
Publisher:
Abstract:

Universities planning the provision of space for their teaching requirements need to do so in a fashion that reduces capital and maintenance costs whilst still providing a high quality of service. Space plans should aim to provide sufficient capacity without incurring excessive costs due to over-capacity. A simple measure used to estimate over-provision is utilisation: essentially, the fraction of seats that are used in practice, or the ratio of demand to supply. However, studies usually find that utilisation is low, often only 20–40%, which is suggestive of significant over-capacity.

Our previous work has provided methods to improve such space planning. These methods identify a critical level of utilisation: the highest level that can be achieved whilst still reliably satisfying the demand for places to allocate teaching events. In this paper, we extend this body of work to incorporate the notions of event-types and space-types. Teaching events have multiple ‘event-types’, such as lecture, tutorial and workshop, and there are generally corresponding space-types. Matching the type of an event to a room of a corresponding space-type is generally desirable. Realistically, however, allocation happens in a mixed space-type environment, where teaching events of a given type may be allocated to rooms of another space-type; e.g., tutorials may borrow lecture theatres or workshop rooms.

We propose a model and methodology to quantify the effects of space-type mixing, and establish methods to search for better space-type profiles, where the term “space-type profile” refers to the relative numbers of each type of space. We give evidence that these methods have the potential to improve utilisation levels. The contribution of this paper is therefore twofold. Firstly, we present informative studies of the effects of space-type mixing on utilisation and on critical utilisations. Secondly, we present straightforward though novel methods for determining better space-type profiles, and give an example in which the resulting profiles are indeed significantly improved.
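The utilisation measure and a feasibility check of the kind underlying critical utilisation can be sketched as follows; the room profile, class sizes and best-fit-decreasing allocation rule are hypothetical simplifications:

```python
def utilisation(event_sizes, room_capacities, slots):
    """Demand over supply in seat-slots (all events one slot long)."""
    return sum(event_sizes) / (slots * sum(room_capacities))

def allocatable(event_sizes, room_capacities, slots):
    """Best-fit-decreasing feasibility check: every room is usable in every
    slot, each room-slot holds one event, and an event fits wherever the
    capacity is at least its size."""
    free = sorted(room_capacities * slots)   # one capacity per room-slot
    for size in sorted(event_sizes, reverse=True):
        i = next((k for k, cap in enumerate(free) if cap >= size), None)
        if i is None:
            return False
        free.pop(i)
    return True

rooms = [30, 60, 100]          # hypothetical space-type profile
events = [90, 55, 25, 25, 20]  # class sizes to allocate over 2 slots
print(utilisation(events, rooms, 2))  # 215/380 ≈ 0.566
print(allocatable(events, rooms, 2))  # True
```

Pushing `events` up until `allocatable` first fails locates the critical utilisation for a given profile; comparing that critical level across different `rooms` profiles is the kind of search the paper proposes.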

Relevance: 100.00%
Publisher:
Abstract:

This paper studies the dynamic pricing problem of selling a fixed stock of perishable items over a finite horizon, where the decision maker lacks the historical data needed to estimate the distribution of uncertain demand but has imprecise information about the quantity demanded. We model this uncertainty using fuzzy variables. The dynamic pricing problem is formulated, on the basis of credibility theory, as three fuzzy programming models: a fuzzy expected-revenue maximization model, an α-optimistic revenue maximization model, and a credibility maximization model. Fuzzy simulations for functions with fuzzy parameters are developed and embedded in a genetic algorithm, yielding a hybrid intelligent algorithm that solves all three models. Finally, a real-world example is presented to highlight the effectiveness of the developed models and algorithm.
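For a triangular fuzzy demand, the credibilistic expected value and the α-optimistic value have closed forms (standard results of credibility theory), which can stand in for the paper's fuzzy simulations in a small sketch; the demand figures and price below are hypothetical:

```python
def tri_expected(a, b, c):
    """Credibilistic expected value of a triangular fuzzy variable (a, b, c)."""
    return (a + 2 * b + c) / 4

def tri_optimistic(a, b, c, alpha):
    """alpha-optimistic value: the largest r with Cr{xi >= r} >= alpha,
    from the piecewise credibility of a triangular fuzzy variable."""
    if alpha <= 0.5:
        return c - 2 * alpha * (c - b)
    return a + 2 * (1 - alpha) * (b - a)

# Demand is "about 100 units", modelled as the triangle (80, 100, 130);
# at a unit price of 5, revenue is the fuzzy variable 5 * demand.
price = 5
print(price * tri_expected(80, 100, 130))         # 512.5
print(price * tri_optimistic(80, 100, 130, 0.9))  # conservative revenue level
```

The fuzzy expected-revenue and α-optimistic models in the abstract maximize exactly such quantities over pricing policies; the genetic algorithm is needed because realistic revenue functions of fuzzy demand have no closed form.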

Relevance: 100.00%
Publisher:
Abstract:

The present paper examines the role of organisational learning and transaction cost economics in strategic outsourcing decisions. Interorganisational learning is critical to competitive success, and organisations often learn more effectively by collaborating with other organisations. However, learning processes may also complicate the formation of interorganisational partnerships, which may increase transaction costs. Based on the literature, the authors develop refutable implications for outsourcing supply chain logistics, and a sample of 121 firms in the supply chain logistics industry is used to test the hypotheses. The results show that trust and transaction costs are significant and substantial drivers of the strategic outsourcing of supply chain logistics (a strategic flexibility action), whereas learning intent and knowledge acquisition have no significant influence on the decision to outsource. The paper concludes with a discussion of the different and often conflicting implications for managing interorganisational learning processes.

Relevance: 100.00%
Publisher:
Abstract:

A Monte Carlo simulation-based model has been constructed to assess a public health scheme involving mobile volunteer cardiac first-responders. The scheme aims to improve survival of sudden cardiac arrest (SCA) patients by reducing the time until administration of life-saving defibrillation treatment, with volunteers being paged to respond to possible SCA incidents alongside the Emergency Medical Services. The need for a model, for example to assess the impact of the scheme in different geographical regions, became apparent upon collection of observational trial data, given the stochastic and spatial complexities the data exhibited. The simulation model was validated and then used to assess the scheme's benefits in a rural region that was not part of the original trial. These illustrative results suggest that the scheme may not be the most efficient use of National Health Service resources in this geographical region, demonstrating the importance and usefulness of simulation modelling in aiding decision making.
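The structure of such a simulation can be sketched as follows; the survival curve, response-time distributions and 60% volunteer-availability rate are illustrative assumptions, not trial estimates:

```python
import random

def survival_prob(delay_min):
    """Hypothetical survival curve: survival falls with each minute of
    delay to defibrillation, floored at zero (illustrative, not clinical)."""
    return max(0.0, 0.7 - 0.1 * delay_min)

def simulate(n, seed=42):
    """Compare mean survival with and without a volunteer responder who is
    available for 60% of incidents and races the ambulance to the scene."""
    rng = random.Random(seed)
    with_vol, without = [], []
    for _ in range(n):
        ambulance = rng.uniform(4, 12)  # minutes to scene
        volunteer = rng.uniform(1, 10) if rng.random() < 0.6 else float("inf")
        without.append(survival_prob(ambulance))
        with_vol.append(survival_prob(min(ambulance, volunteer)))
    return sum(with_vol) / n, sum(without) / n

print(simulate(10_000))
```

A region-specific assessment would replace the uniform response times with spatially derived distributions for that region, which is exactly why the abstract stresses the model's stochastic and spatial complexities.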

Relevance: 100.00%
Publisher:
Abstract:

In durable goods markets, many brand-name manufacturers, including IBM, HP, Epson, and Lenovo, have adopted dual-channel supply chains to market their products. There is scant literature, however, addressing product durability and its impact on players' optimal strategies in a dual-channel supply chain. To fill this void, we consider a two-period dual-channel model in which a manufacturer sells a durable product both directly through a manufacturer-owned e-channel and through an independent dealer who adopts a mix of selling and leasing to consumers. Our results show that the manufacturer begins encroaching into the market in Period 1, while the dealer starts withdrawing from the retail channel in Period 2. Moreover, as the direct selling cost decreases, the equilibrium quantities and wholesale prices become quite angular and often nonmonotonic. Among other results, we find that both the dealer and the supply chain may benefit from the manufacturer's encroachment. Our results also indicate that both the market structure and the nature of competition have an important impact on the dealer's optimal choice between leasing and selling.
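The flavour of the channel interaction can be illustrated with a deliberately simplified one-period quantity game solved by iterated best responses on a grid; the linear demand, the costs and the `equilibrium` routine are hypothetical and omit the paper's two-period and leasing features:

```python
def equilibrium(a=100.0, cost_direct=30.0, wholesale=20.0, cost_retail=10.0, steps=400):
    """Iterated best responses in a toy one-period dual-channel quantity game.
    Inverse demand p = a - qd - qr; the manufacturer chooses its direct
    quantity qd, the dealer chooses the retail quantity qr."""
    grid = [a * i / steps for i in range(steps + 1)]
    qd = qr = 0.0
    for _ in range(60):
        # Each player maximizes its own margin times quantity on the grid,
        # holding the other player's quantity fixed.
        qd = max(grid, key=lambda q: (a - q - qr - cost_direct) * q)
        qr = max(grid, key=lambda q: (a - qd - q - wholesale - cost_retail) * q)
    return qd, qr

print(equilibrium())  # both quantities settle near the Cournot value 70/3
```

With these symmetric effective costs (30 per unit on each side), the fixed point approximates the standard Cournot outcome; lowering `cost_direct` shifts quantity toward the direct channel, the comparative static the abstract calls manufacturer encroachment.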

Relevance: 100.00%
Publisher:
Abstract:

Keeping a record of operator experience remains a challenge for operations management and a major source of inefficiency in information management. The objective is to develop a framework that enables an explicit representation of experience based on information use. A purposive sampling method is used to select four small and medium-sized enterprises as case studies; the unit of analysis is the production process in the machine shop. Data are collected by structured interview, observation and documentation, and a comparative case analysis is applied. The findings suggest that experience is an accumulation of tacit information feedback, which can be made explicit in an information-use interoperability matrix. The matrix is conditioned upon an information-use typology, which is strategically important for waste reduction. The limitations include the difficulty of preserving participant anonymity where the organisation nominates the participant. Areas for further research include applying these concepts to knowledge management and shop-floor resource management.