11 results for Routing problems

in Dalarna University College Electronic Archive


Relevance:

60.00%

Publisher:

Abstract:

This paper elaborates the routing of a cable cycle through the available routes in a building in order to link a set of devices in the most reasonable way. Despite its similarities to other NP-hard routing problems, the goal is not only to minimize the cost (the length of the cycle) but also to increase the reliability of the path (in case of a cable cut), which is assessed by a risk factor. Since there is often a trade-off between the risk and length factors, a criterion for ranking candidates and deciding on the most reasonable solution is defined. A set of techniques is proposed to perform an efficient and exact search among candidates. A novel graph is introduced to reduce the search space and navigate the search toward feasible and desirable solutions. Moreover, an admissible heuristic length estimation helps in the early detection of partial cycles that lead to unreasonable solutions. The results show that the method provides solutions that are both technically and financially reasonable. Furthermore, the proposed techniques prove very efficient in reducing the computational time of the search to a reasonable amount.
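The abstract does not spell out the ranking criterion; as a minimal illustrative sketch (with a hypothetical weighted-sum score and made-up candidate data, not the paper's actual method), ranking candidate cycles by a combined length/risk criterion could look like this:

```python
from dataclasses import dataclass

@dataclass
class CandidateCycle:
    """A candidate cable cycle: total length (cost) and an assessed risk factor in [0, 1]."""
    name: str
    length: float   # e.g. metres of cable
    risk: float     # assessed risk of losing connectivity after a single cable cut

def rank_candidates(candidates, risk_weight=0.5):
    """Rank candidates by a weighted sum of normalized length and risk.

    The weighted sum is a hypothetical stand-in for the paper's ranking
    criterion, which trades off cycle length against reliability.
    """
    max_len = max(c.length for c in candidates)
    score = lambda c: (1 - risk_weight) * (c.length / max_len) + risk_weight * c.risk
    return sorted(candidates, key=score)

if __name__ == "__main__":
    cycles = [
        CandidateCycle("A", length=120.0, risk=0.30),
        CandidateCycle("B", length=95.0,  risk=0.55),
        CandidateCycle("C", length=140.0, risk=0.10),
    ]
    for c in rank_candidates(cycles, risk_weight=0.6):
        print(c.name, c.length, c.risk)
```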

Relevance:

30.00%

Publisher:

Abstract:

This thesis work concentrates on a very interesting problem, the Vehicle Routing Problem (VRP). In this problem, customers or cities have to be visited and packages have to be transported to each of them, starting from a base point on the map. The goal is to solve the transportation problem: to deliver the packages on time, to deliver enough packages to each customer, to use the available resources, and, of course, to be as effective as possible.

Although this problem seems very easy to solve for a small number of cities or customers, it is not. The algorithm has to deal with several constraints, for example opening hours, package delivery times, truck capacities, etc. This makes the problem a so-called Multi-Constraint Optimization Problem (MCOP). What is more, the problem is intractable with the amount of computational power available to most of us. As the number of customers grows, the amount of computation grows exponentially fast, because all constraints have to be checked for each customer, and it should not be forgotten that the goal is to find a solution that is good enough before the time allotted for the calculation runs out. The first chapter introduces the problem from its basis, the Traveling Salesman Problem, and uses some theoretical and mathematical background to show why the problem is so hard to optimize and why, even though no best algorithm is known for a huge number of customers, it is still worth dealing with. Just think about a huge transportation company with tens of thousands of trucks and millions of customers: how much money could be saved if we knew the optimal path for all our packages?

Although no best algorithm is known for this kind of optimization problem, the second and third chapters try to give an acceptable solution by describing two algorithms: the Genetic Algorithm and Simulated Annealing. Both are inspired by processes in nature and materials science. These algorithms will hardly ever be able to find the best solution to the problem, but in many cases they can give a very good solution within acceptable calculation time. In these chapters the Genetic Algorithm and Simulated Annealing are described in detail, from their basis in the "real world" through their terminology to their basic implementation. The work puts a stress on the limits of these algorithms, their advantages and disadvantages, and also compares them to each other.

Finally, after these theories are presented, a simulation is executed in an artificial VRP environment with both Simulated Annealing and the Genetic Algorithm. They both solve the same problem in the same environment and are compared to each other. The environment and the implementation are also described, as well as the test results obtained. Lastly, possible improvements of these algorithms are discussed, and the work tries to answer the "big" question, "Which algorithm is better?", if this question even exists.
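As a minimal sketch of the simulated-annealing idea described in the thesis (a toy stand-alone example, not the thesis's implementation), here it is applied to a plain travelling-salesman tour, ignoring VRP side constraints such as capacities, opening hours and delivery time windows:

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour over a symmetric distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing_tsp(dist, n_iter=20000, t_start=100.0, t_end=0.01, seed=0):
    """Toy simulated annealing for TSP: random segment reversals, geometric cooling."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    best, best_len = tour[:], tour_length(tour, dist)
    cur_len = best_len
    cooling = (t_end / t_start) ** (1.0 / n_iter)
    t = t_start
    for _ in range(n_iter):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt style reversal
        cand_len = tour_length(cand, dist)
        # Accept improvements always, worse moves with Boltzmann probability
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / t):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        t *= cooling
    return best, best_len

if __name__ == "__main__":
    pts = [(random.random(), random.random()) for _ in range(30)]  # made-up "cities"
    d = [[math.dist(a, b) for b in pts] for a in pts]
    tour, length = simulated_annealing_tsp(d)
    print("best tour length:", round(length, 3))
```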

Relevance:

20.00%

Publisher:

Relevance:

20.00%

Publisher:

Abstract:

The potential changes to the territory of the Russian Arctic open up unique possibilities for the development of tourism. More favourable transport opportunities along the Northern Sea Route (NSR) create opportunities for tourism development based on the utilisation of the extensive areas of sea shores and river basins. A major challenge for the Russian Arctic sea and river ports is their strong cargo transport orientation, which originated in the natural resource extraction industries. A careful assessment of the prospects of current and future tourism development is presented here, based on the development of regions located along the shores of the Arctic Ocean (including Murmansk and Arkhangelsk oblast, Nenets Autonomous okrug (AO), Yamal-Nenets AO, Taymyr AO, the Republic of Sakha, and Chykotsky AO). An evaluation of the present development of tourism in maritime cities suggests that a considerable qualitative and quantitative increase in tourism activities organised by domestic tourism firms is virtually impossible. Several factors contribute to this. The previously established Soviet system of state support for investments in port facilities and the sea fleet was not effectively replaced by the creation of new structures. The necessary investments for reconstruction could be contributed by the federal government, but the priorities are not set towards increased passenger transportation. Bearing in mind the increased environmental pressures in this highly sensitive area, it is especially vital to establish a well-functioning monitoring and rescue system in a situation of ever-increasing risks, which come not only from the increased transports along the NSR but also from the exploitation of the offshore oil and gas reserves in the Arctic seas. The capacity and knowledge established in the Nordic countries (Norway, Finland) concerning cruise tourism should not be underestimated, and the already functioning cooperation in the Barents Region should expand towards this particular segment of the tourism industry. The current stage of economic development in Russia makes it clear that tourism development is not able to compete with the much-needed increase in cargo transportation, which means that Russia's fleet is going to be utilised by other industries. However, opening up this area to both local and international visitors could contribute to the economic prosperity of these remote areas and, if carefully managed, could sustain the already existing maritime cities along the shores of the Arctic Ocean.

Relevance:

20.00%

Publisher:

Abstract:

Quadratic assignment problems (QAPs) are commonly solved by heuristic methods, where the optimum is sought iteratively. Heuristics are known to provide good solutions, but the quality of those solutions, i.e., the confidence interval of the solution, is unknown. This paper uses statistical optimum estimation techniques (SOETs) to assess the quality of genetic algorithm solutions for QAPs. We examine the functioning of different SOETs regarding bias, coverage rate and interval length, and then compare the SOET lower bound with deterministic ones. The commonly used deterministic bounds are confined to only a few algorithms. We show that the Jackknife estimators perform better than the Weibull estimators, and that when the number of heuristic solutions is as large as 100, higher-order JK estimators perform better than lower-order ones. Compared with the deterministic bounds, the SOET lower bound performs significantly better than most deterministic lower bounds and is comparable with the best deterministic ones.
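For illustration, one common family of order-statistics jackknife point estimators of the unknown minimum builds the estimate from the k+1 smallest objective values in a sample of independent heuristic solutions. The exact estimators and confidence bounds used in the paper may differ, so the sketch below (with made-up solution values) is only indicative:

```python
from math import comb

def jackknife_min_estimate(values, order=1):
    """Point estimate of the unknown minimum from a sample of heuristic
    solution values, using an order-statistics jackknife of given order.

    A common form in the statistical-optimum-estimation literature is
        theta_k = sum_{i=1}^{k+1} (-1)^(i-1) * C(k+1, i) * y_(i),
    where y_(1) <= y_(2) <= ... are the sorted sample values; for order 1
    this reduces to 2*y_(1) - y_(2).  The estimators examined in the paper
    may differ in detail.
    """
    y = sorted(values)
    k = order
    if len(y) < k + 1:
        raise ValueError("need at least order + 1 solution values")
    return sum((-1) ** (i - 1) * comb(k + 1, i) * y[i - 1] for i in range(1, k + 2))

if __name__ == "__main__":
    # Made-up objective values from independent restarts of a heuristic.
    sample = [1032, 1017, 1048, 1011, 1025, 1040, 1019, 1013, 1036, 1022]
    for k in (1, 2, 3):
        print(f"order-{k} jackknife estimate:", jackknife_min_estimate(sample, order=k))
```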

Relevance:

20.00%

Publisher:

Abstract:

Solutions to combinatorial optimization problems, such as problems of locating facilities, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively and a criterion is needed to decide when the procedure (almost) attains it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds as a tool for deciding upon stopping and for evaluating the quality of the solution. In this paper we examine the functioning of statistical bounds obtained from four different estimators by using simulated annealing on p-median test problems taken from Beasley's OR-library. We find the Weibull estimator and the 2nd-order Jackknife estimator preferable, and the required sample size to be about 10, much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of heuristic solutions of high quality, and we give a simple statistic useful for checking the quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centers of the Swedish Post in one Swedish region.
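The Weibull approach rests on treating the objective values of independent heuristic runs as draws from a distribution whose left endpoint is the unknown minimum. The sketch below only shows a three-parameter Weibull fit with SciPy on made-up solution values; the paper's estimator and the construction of its confidence bound may differ:

```python
# Rough illustration of the Weibull idea behind statistical bounds: the fitted
# location parameter serves as a point estimate of the unknown optimum.
import numpy as np
from scipy.stats import weibull_min

# Hypothetical objective values from 10 independent simulated-annealing runs
# on a p-median instance (made-up numbers).
solutions = np.array([5120.4, 5098.7, 5133.0, 5101.2, 5087.9,
                      5110.5, 5125.8, 5093.3, 5140.1, 5105.6])

# Fit shape, location and scale by maximum likelihood.
shape, loc, scale = weibull_min.fit(solutions)
print("estimated minimum (Weibull location):", round(loc, 1))
print("best heuristic solution found:       ", solutions.min())
```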

Relevance:

20.00%

Publisher:

Abstract:

Combinatorial optimization problems are one of the most important types of problems in operational research. Heuristic and metaheuristic algorithms are widely applied to find a good solution. However, a common problem is that these algorithms do not guarantee that the solution coincides with the optimum and, hence, many solutions to real-world OR problems are afflicted with uncertainty about the quality of the solution. The main aim of this thesis is to investigate the usability of statistical bounds for evaluating the quality of heuristic solutions to large combinatorial problems. The contributions of this thesis are both methodological and empirical. From a methodological point of view, the usefulness of statistical bounds on p-median problems is thoroughly investigated. The statistical bounds perform well in providing an informative quality assessment under appropriate parameter settings, and they outperform the commonly used Lagrangian bounds. The statistical bounds are also shown to be comparable with the deterministic bounds in quadratic assignment problems. As to the empirical research, environmental pollution has become a worldwide problem, and transportation can cause a great amount of pollution. A new method for calculating and comparing the CO2 emissions of online and brick-and-mortar retailing is proposed, leading to the conclusion that online retailing has significantly lower CO2 emissions. Another problem is that the Swedish regional division is under revision, and the effect of borders on public service accessibility concerns both residents and politicians. The analysis shows that borders hinder the optimal location of public services, and consequently the highest achievable economic and social utility may not be attained.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a new method for solving large-scale p-median problem instances based on real data. We compare different approaches in terms of runtime, memory footprint and quality of the solutions obtained. In order to test the different methods on real data, we introduce a new benchmark for the p-median problem based on real Swedish data. Because of the size of the problem addressed, up to 1938 candidate nodes, a number of algorithms, both exact and heuristic, are considered. We also propose an improved hybrid version of a genetic algorithm, called impGA. Experiments show that impGA behaves as well as other methods on the standard set of medium-size problems taken from Beasley's benchmark, but produces comparatively good results in terms of quality, runtime and memory footprint on our specific benchmark based on real Swedish data.
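For reference, the p-median objective that such benchmarks evaluate assigns each demand point to its nearest chosen facility and sums the (optionally weighted) distances. A minimal sketch with made-up coordinates, not tied to the Swedish benchmark or to impGA:

```python
import numpy as np

def p_median_cost(dist, medians, weights=None):
    """Objective of a p-median solution: each demand point is served by its
    nearest chosen median, and the (optionally weighted) distances are summed.

    dist     : (n_demand, n_candidate) distance matrix
    medians  : indices of the p chosen candidate nodes
    weights  : optional demand weights per demand point
    """
    d = np.asarray(dist)[:, list(medians)].min(axis=1)
    if weights is not None:
        d = d * np.asarray(weights)
    return d.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demand = rng.random((200, 2))        # made-up demand points
    candidates = rng.random((50, 2))     # made-up candidate facility sites
    dist = np.linalg.norm(demand[:, None, :] - candidates[None, :, :], axis=2)
    chosen = [3, 17, 29, 41, 8]          # any p = 5 candidate indices
    print("p-median cost:", round(p_median_cost(dist, chosen), 2))
```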

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: A wide range of health problems has been reported in elderly post-stroke patients. AIM: The aim of this study was to analyse the prevalence and timing of health problems identified through patient interviews and scrutiny of primary health care and municipal elderly health care records during the first post-stroke year. METHODS: A total of 390 consecutive patients, ≥65 years, discharged alive from hospital after a stroke event, were followed for 1 year post-admission. Information on the health care situation during the first post-stroke year was obtained from primary health care and municipal elderly health care records and through interviews with the stroke survivors at 1 week after discharge and at 3 and 12 months after hospital admission. RESULTS: More than 90% had some health problem at some time during the year, while based on patient record data only 4-8% had problems during a given week. The prevalence of interview-based health problems was generally higher than the record-based prevalence, and the ranking order was moderately different. The most frequently interview-reported problems were associated with perception, activity, and tiredness, while the most common record-based findings indicated problems with pain, bladder and bowel function, and breathing and circulation. There was co-occurrence between some problems, such as those relating to cognition, activity, and tiredness. CONCLUSIONS: Almost all patients had a health problem during the year, but few occurred in a given week. Cognitive and communication problems were more common in interview data than in record data. Co-occurrence may be used to identify subtle health problems.