11 results for Completion problems
in Dalarna University College Electronic Archive
Abstract:
The problem of scheduling a parallel program, represented by a weighted directed acyclic graph (DAG), onto a set of homogeneous processors so as to minimize the completion time of the program has been extensively studied as an academic optimization problem; it arises when optimizing the execution time of parallel algorithms on parallel computers. In this paper, we propose an application of Ant Colony Optimization (ACO) to a multiprocessor scheduling problem (MPSP). In the MPSP, no preemption is allowed and each operation demands a setup time on the machines. The problem seeks a schedule that minimizes the total completion time. We therefore rely on heuristics to find solutions, since exact solution methods are not feasible for most problems of this kind. In this novel heuristic search approach to multiprocessor scheduling, based on the ACO algorithm, a collection of agents cooperates to explore the search space effectively. A computational experiment is conducted on a suite of benchmark applications. Comparing the results obtained by our algorithm with those of previous heuristic algorithms shows that the ACO algorithm exhibits competitive performance with a small error ratio.
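The abstract gives no algorithmic detail, but the core ACO idea it invokes (a colony of agents building schedules guided by shared pheromone trails) can be sketched roughly as follows. This is an illustrative assumption, not the paper's method: it assigns independent tasks to homogeneous machines to minimize makespan, and omits the DAG precedence constraints and setup times the paper treats; all names and parameters are made up for the example.

```python
import random

def aco_schedule(times, n_machines, n_ants=20, n_iters=100,
                 evap=0.1, seed=0):
    """Toy ACO: assign tasks (given processing times) to machines via a
    pheromone-guided construction, minimizing makespan. Precedence
    constraints and setup times from the paper are omitted here."""
    rng = random.Random(seed)
    n = len(times)
    # tau[t][m]: learned desirability of placing task t on machine m
    tau = [[1.0] * n_machines for _ in range(n)]
    best_assign, best_mk = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            load = [0.0] * n_machines
            assign = []
            # place longest tasks first, as in LPT-style constructions
            for t in sorted(range(n), key=lambda i: -times[i]):
                # choose a machine with probability proportional to
                # pheromone divided by current load (roulette wheel)
                w = [tau[t][m] / (1.0 + load[m]) for m in range(n_machines)]
                r = rng.uniform(0, sum(w))
                acc, m = 0.0, 0
                for m, wm in enumerate(w):
                    acc += wm
                    if acc >= r:
                        break
                assign.append((t, m))
                load[m] += times[t]
            mk = max(load)
            if mk < best_mk:
                best_mk, best_assign = mk, assign
        # evaporate all trails, then reinforce the best-so-far schedule
        for t in range(n):
            for m in range(n_machines):
                tau[t][m] *= (1.0 - evap)
        for t, m in best_assign:
            tau[t][m] += 1.0 / best_mk
    return best_assign, best_mk
```

For six tasks with times [3, 5, 2, 7, 4, 6] on two machines, the colony quickly converges to the optimal makespan of 14 (e.g. {7, 4, 3} versus {6, 5, 2}).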
Abstract:
The potential changes to the territory of the Russian Arctic open up unique possibilities for the development of tourism. More favourable transport opportunities along the Northern Sea Route (NSR) create opportunities for tourism development based on the utilisation of the extensive areas of sea shores and river basins. A major challenge for the Russian Arctic sea and river ports is their strong cargo transport orientation, which originates in the natural resource extraction industries. A careful assessment of the prospects of current and future tourism development is presented here based on the development of regions located along the shores of the Arctic Ocean (including Murmansk and Arkhangelsk oblast, Nenets Autonomous okrug (AO), Yamal-Nenets AO, Taymyr AO, the Republic of Sakha, and Chykotsky AO). An evaluation of the present development of tourism in maritime cities suggests that a considerable qualitative and quantitative increase of tourism activities organised by domestic tourism firms is virtually impossible. Several factors contribute to this. The previously established Soviet system of state support for investments into port facilities and the sea fleet was not effectively replaced by the creation of new structures. The necessary investments for reconstruction could be contributed by the federal government, but its priorities are not set towards increased passenger transportation. Bearing in mind the increased environmental pressures in this highly sensitive area, it is especially vital to establish a well-functioning monitoring and rescue system in a situation of ever-increasing risks, which come not only from increased transport along the NSR but also from the exploitation of offshore oil and gas reserves in the Arctic seas.
The capacity and knowledge established in the Nordic countries (Norway, Finland) concerning cruise tourism should not be underestimated, and the already functioning cooperation in the Barents Region should expand towards this particular segment of the tourism industry. The current stage of economic development in Russia makes it clear that tourism development is not able to compete with the much-needed increase in cargo transportation, which means that Russia’s fleet is going to be utilised by other industries. However, opening up this area to both local and international visitors could contribute to the economic prosperity of these remote areas and, if carefully managed, could sustain the already existing maritime cities along the shores of the Arctic Ocean.
Abstract:
Quadratic assignment problems (QAPs) are commonly solved by heuristic methods, where the optimum is sought iteratively. Heuristics are known to provide good solutions, but the quality of those solutions, i.e., the confidence interval of the solution, is unknown. This paper uses statistical optimum estimation techniques (SOETs) to assess the quality of genetic algorithm solutions for QAPs. We examine the functioning of different SOETs regarding bias, coverage rate and length of interval, and then compare the SOET lower bound with deterministic ones. The commonly used deterministic bounds are confined to only a few algorithms. We show that the Jackknife estimators perform better than Weibull estimators, and that when the number of heuristic solutions is as large as 100, higher-order Jackknife estimators perform better than lower-order ones. Compared with the deterministic bounds, the SOET lower bound performs significantly better than most deterministic lower bounds and is comparable with the best deterministic ones.
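As a rough illustration of the order-statistic estimators discussed above: one common form of the m-th order jackknife point estimate of a distribution's minimum, found in the statistical optimum estimation literature, is JK_m = Σ_{i=1}^{m+1} (-1)^(i+1) C(m+1, i) x_(i), where x_(1) ≤ x_(2) ≤ … are the sorted heuristic objective values. The sketch below assumes this form; the paper's exact estimators and interval construction may differ.

```python
from math import comb

def jackknife_min(values, order=1):
    """m-th order jackknife point estimate of the minimum of a
    distribution, computed from a sample of heuristic objective values.
    Order-statistic form: JK_m = sum_{i=1}^{m+1}
    (-1)^(i+1) * C(m+1, i) * x_(i), with x_(1) <= x_(2) <= ...
    """
    xs = sorted(values)
    if len(xs) < order + 1:
        raise ValueError("need at least order + 1 observations")
    return sum((-1) ** (i + 1) * comb(order + 1, i) * xs[i - 1]
               for i in range(1, order + 2))
```

For the sample [10, 12, 15, 11], the first-order estimate is 2·10 − 11 = 9 and the second-order estimate is 3·10 − 3·11 + 12 = 9 — both lie below the sample minimum, as a lower-bound-style estimate of the optimum should.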
Abstract:
Solutions to combinatorial optimization problems, such as problems of locating facilities, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively, and a criterion is needed to decide when the procedure (almost) attains it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds as a tool for deciding upon stopping and for evaluating the quality of the solution. In this paper we examine the functioning of statistical bounds obtained from four different estimators, using simulated annealing on p-median test problems taken from Beasley’s OR-library. We find the Weibull estimator and the 2nd-order Jackknife estimator preferable, and the required sample size to be about 10, much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of heuristic solutions of high quality, and we give a simple statistic useful for checking that quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centers of the Swedish Post in one Swedish region.
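A minimal sketch of simulated annealing on a p-median instance, of the kind whose repeated runs would supply the sample of heuristic solutions from which such statistical bounds are estimated. The neighbourhood move (swap one chosen facility for an unchosen one) and the cooling schedule are illustrative assumptions, not the paper's settings.

```python
import math
import random

def pmedian_sa(dist, p, n_iters=20000, t0=1.0, cooling=0.9995, seed=0):
    """Simulated annealing for the p-median problem: choose p facility
    nodes minimizing the total distance from every demand node to its
    closest chosen facility. `dist` is a full n x n distance matrix."""
    rng = random.Random(seed)
    n = len(dist)

    def cost(medians):
        return sum(min(dist[i][j] for j in medians) for i in range(n))

    current = rng.sample(range(n), p)
    cur_cost = cost(current)
    best, best_cost = list(current), cur_cost
    t = t0
    for _ in range(n_iters):
        # neighbour: swap one chosen facility for a non-chosen node
        new = list(current)
        new[rng.randrange(p)] = rng.choice(
            [j for j in range(n) if j not in current])
        new_cost = cost(new)
        # accept improvements always; accept worsening moves with the
        # usual Metropolis probability exp(-delta / temperature)
        if (new_cost < cur_cost
                or rng.random() < math.exp((cur_cost - new_cost) / t)):
            current, cur_cost = new, new_cost
            if cur_cost < best_cost:
                best, best_cost = list(current), cur_cost
        t *= cooling
    return sorted(best), best_cost
```

On a toy line instance with nodes at coordinates [0, 1, 2, 10, 11, 12] and p = 2, the annealer recovers the optimal medians {1, 11} with cost 4.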
Abstract:
Combinatorial optimization problems are one of the most important types of problems in operational research. Heuristic and metaheuristic algorithms are widely applied to find a good solution. However, a common problem is that these algorithms do not guarantee that the solution coincides with the optimum and, hence, many solutions to real-world OR problems are afflicted with uncertainty about the quality of the solution. The main aim of this thesis is to investigate the usability of statistical bounds for evaluating the quality of heuristic solutions to large combinatorial problems. The contributions of this thesis are both methodological and empirical. From a methodological point of view, the usefulness of statistical bounds on p-median problems is thoroughly investigated. The statistical bounds perform well in providing informative quality assessment under appropriate parameter settings, and they outperform the commonly used Lagrangian bounds. The statistical bounds are also shown to be comparable with the deterministic bounds in quadratic assignment problems. As to empirical research, environmental pollution has become a worldwide problem, and transportation can cause a great amount of pollution. A new method for calculating and comparing the CO2 emissions of online and brick-and-mortar retailing is proposed; it leads to the conclusion that online retailing has significantly lower CO2 emissions. Another problem is that the Swedish regional division is under revision, and the effect of borders on public service accessibility concerns both residents and politicians. The analysis shows that borders hinder the optimal location of public services and, consequently, that the highest achievable economic and social utility may not be attained.
Abstract:
In this paper, we propose a new method for solving large-scale p-median problem instances based on real data. We compare different approaches in terms of runtime, memory footprint and quality of the solutions obtained. In order to test the different methods on real data, we introduce a new benchmark for the p-median problem based on real Swedish data. Because of the size of the problem addressed, up to 1938 candidate nodes, a number of algorithms, both exact and heuristic, are considered. We also propose an improved hybrid version of a genetic algorithm called impGA. Experiments show that impGA performs as well as other methods on the standard set of medium-size problems taken from Beasley’s benchmark, and produces comparatively good results in terms of quality, runtime and memory footprint on our specific benchmark based on real Swedish data.
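The hybrid impGA itself is not described in the abstract. As a hedged illustration of the baseline it improves upon, a plain genetic algorithm for the p-median problem, with set-union crossover and a swap mutation, might look like the sketch below; all operators and parameters are assumptions for the example, not impGA's.

```python
import random

def pmedian_ga(dist, p, pop_size=30, generations=200, mut_rate=0.2, seed=0):
    """Plain GA for the p-median problem. An individual is a list of p
    facility indices; fitness is the total demand-to-nearest-facility
    distance. This is a generic GA, not the impGA hybrid."""
    rng = random.Random(seed)
    n = len(dist)

    def cost(medians):
        return sum(min(dist[i][j] for j in medians) for i in range(n))

    pop = [rng.sample(range(n), p) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[:pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            # crossover: draw p facilities from the union of two parents
            child = rng.sample(list(set(a) | set(b)), p)
            if rng.random() < mut_rate:
                # mutation: swap one facility for a random non-member
                i = rng.randrange(p)
                child[i] = rng.choice(
                    [j for j in range(n) if j not in child])
            children.append(child)
        pop = elite + children
    winner = min(pop, key=cost)
    return sorted(winner), cost(winner)
```

On the same toy line instance used above (nodes at [0, 1, 2, 10, 11, 12], p = 2) the GA also finds the optimal cost of 4; impGA's reported advantage lies in larger, real-data instances where such a plain GA degrades.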
Abstract:
In a global economy, manufacturers mainly compete on the cost efficiency of production, as the prices of raw materials are similar worldwide. Heavy industry has two big issues to deal with. On the one hand, there is a great deal of data that needs to be analyzed effectively; on the other hand, making big improvements via investments in corporate structure or new machinery is neither economically nor physically viable. Machine learning offers a promising way for manufacturers to address both of these problems, as they are in an excellent position to employ learning techniques on their massive resource of historical production data. However, choosing a modelling strategy in this setting is far from trivial, and this is the objective of this article. The article investigates characteristics of the most popular classifiers used in industry today. Support Vector Machines, Multilayer Perceptrons, Decision Trees, Random Forests, and the meta-algorithms Bagging and Boosting are the main learners investigated in this work. Lessons from real-world implementations of these learners are also provided, together with guidance on when different learners are expected to perform well. The importance of feature selection, and relevant selection methods in an industrial setting, are further investigated. Performance metrics are also discussed for the sake of completeness.
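As a hedged illustration of one of the meta-algorithms named above, the sketch below implements bagging over decision stumps in plain Python: each base learner is trained on a bootstrap resample of the data, and predictions are combined by majority vote. The stump learner and all parameters are simplified assumptions for the example, not the article's experimental setup.

```python
import random

def stump_fit(X, y):
    """Fit a one-split decision stump by exhaustively minimizing the
    misclassification count over (feature, threshold, leaf labels)."""
    best = None
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            for left, right in ((0, 1), (1, 0)):
                preds = [left if row[f] <= t else right for row in X]
                err = sum(p != yi for p, yi in zip(preds, y))
                if best is None or err < best[0]:
                    best = (err, f, t, left, right)
    return best[1:]

def stump_predict(model, row):
    f, t, left, right = model
    return left if row[f] <= t else right

def bagging_fit(X, y, n_models=15, seed=0):
    """Bagging: train each stump on a bootstrap resample of (X, y)."""
    rng = random.Random(seed)
    n = len(X)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]
        models.append(stump_fit([X[i] for i in idx], [y[i] for i in idx]))
    return models

def bagging_predict(models, row):
    """Combine the ensemble's predictions by majority vote."""
    votes = [stump_predict(m, row) for m in models]
    return max(set(votes), key=votes.count)
```

The resampling step is what distinguishes bagging from simply retraining the same learner: each stump sees a slightly different dataset, so the vote averages out the variance of the individual learners, which is why bagging helps most with unstable base learners such as deep decision trees.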
Abstract:
BACKGROUND: A wide range of health problems has been reported in elderly post-stroke patients. AIM: The aim of this study was to analyse the prevalence and timing of health problems identified by patient interviews and scrutiny of primary health care and municipality elderly health care records during the first post-stroke year. METHODS: A total of 390 consecutive patients, ≥65 years, discharged alive from hospital after a stroke event, were followed for 1 year post-admission. Information on the health care situation during the first post-stroke year was obtained from primary health care and municipal elderly health care records and through interviews with the stroke survivors, at 1 week after discharge, and 3 and 12 months after hospital admission. RESULTS: More than 90% had some health problem at some time during the year, while based on patient record data only 4-8% had problems during a given week. The prevalence of interview-based health problems was generally higher than record-based prevalence, and the ranking order was moderately different. The most frequently interview-reported problems were associated with perception, activity, and tiredness, while the most common record-based findings indicated pain, bladder and bowel function, and breathing and circulation problems. There was co-occurrence between some problems, such as those relating to cognition, activity, and tiredness. CONCLUSIONS: Almost all patients had a health problem during the year, but few occurred in a given week. Cognitive and communication problems were more common in interview data than record data. Co-occurrence may be used to identify subtle health problems.