901 results for Multi-Criteria Problems
Abstract:
This work addresses the logistics involved in disaster response operations, focusing on the last-mile delivery of supplies intended to help victims. Its purpose is to investigate the objectives relevant to planning the transportation of cargo and to find a methodology for defining a strategy that supports decision making in the field. To that end, the objectives adopted in Operations Research models for the task at hand are first identified through content analysis of the relevant publications. Then, the Value-Focused Thinking approach is used to structure the problem. Finally, the Simple Multi-Attribute Rating Technique Exploiting Ranks (SMARTER) method is employed to build a Multi-Criteria Decision Analysis (MCDA) model, in consultation with an experienced humanitarian professional and drawing on the literature analysis previously carried out. In this process, six decision-making alternatives consistent with the values of the humanitarian community are developed and evaluated. The results show that there is a mismatch between the performance criteria identified in the existing publications and the objectives pursued by the actual Decision Maker (DM). According to the model built, meeting priorities and maintaining the sustainability of the operation are the objectives that should be taken into account when planning post-disaster cargo delivery, whereas the cost and the equity of the distribution should not be considered. It is concluded that the adopted method is useful for defining these criteria and also for developing strategies that result in better aid distributions in the eyes of the DM. This work thus contributes to the field of Humanitarian Logistics through the investigation of objectives, and to the field of MCDA through the formalization of the process of developing alternatives, besides adding one more possible application to the repertoire of the SMARTER method.
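A central ingredient of SMARTER is that criterion weights are not elicited numerically but derived from the decision maker's ranking of objectives via rank-order centroid (ROC) weights. Below is a minimal sketch of that step; the objective names and the alternative's scores are illustrative placeholders, not the thesis's actual model.

```python
# Sketch: rank-order centroid (ROC) weights as used by SMARTER, applied
# to a hypothetical ranking of humanitarian objectives (illustrative only).

def roc_weights(k: int) -> list[float]:
    """ROC weights for k criteria ranked 1..k: w_i = (1/k) * sum_{j=i}^{k} 1/j."""
    return [sum(1.0 / j for j in range(i, k + 1)) / k for i in range(1, k + 1)]

# Criteria ranked by importance (first = most important) -- hypothetical:
criteria = ["priority fulfilment", "operation sustainability", "delivery time"]
weights = roc_weights(len(criteria))  # ~[0.611, 0.278, 0.111]

# Additive value of one alternative, given its 0-1 single-attribute scores.
scores = {"priority fulfilment": 0.8, "operation sustainability": 0.6,
          "delivery time": 0.4}
value = sum(w * scores[c] for w, c in zip(weights, criteria))
print({c: round(w, 3) for c, w in zip(criteria, weights)}, round(value, 3))
```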
Abstract:
Conventional feed-forward neural networks have used the sum-of-squares cost function for training. A new cost function is presented here with a description-length interpretation based on Rissanen's Minimum Description Length principle. It is a heuristic with a rough interpretation as the number of data points fit by the model. Rather than seeking optimal descriptions, the cost function forms minimum descriptions in a naive way for computational convenience; it is therefore called the Naive Description Length cost function. Finding minimum-description models is shown to be closely related to identifying clusters in the data. As a consequence, the minimum of this cost function approximates the most probable mode of the data, whereas the sum-of-squares cost function approximates the mean. The new cost function is shown to provide information about the structure of the data, obtained by inspecting the dependence of the error on the amount of regularisation. This structure provides a method of selecting regularisation parameters as an alternative or supplement to Bayesian methods. The new cost function is tested on a number of multi-valued problems, such as a simple inverse kinematics problem, as well as on a number of classification and regression problems. The mode-seeking property of this cost function is shown to improve prediction in time-series problems. Description-length principles are used in a similar fashion to derive a regulariser that controls network complexity.
Abstract:
To solve multi-objective problems, multiple reward signals are often scalarized into a single value and further processed using established single-objective problem-solving techniques. While the field of multi-objective optimization has made many advances in applying scalarization techniques to obtain good solution trade-offs, the utility of applying these techniques in the multi-objective multi-agent learning domain has not yet been thoroughly investigated. Agents learn the value of their decisions by linearly scalarizing their reward signals at the local level, while acceptable system-wide behaviour results. However, the non-linear relationship between the weighting parameters of the scalarization function and the learned policy makes the discovery of system-wide trade-offs time-consuming. Our first contribution is a thorough analysis of well-known scalarization schemes within the multi-objective multi-agent reinforcement learning setup. The analysed approaches intelligently explore the weight space in order to find a wider range of system trade-offs. In our second contribution, we propose a novel adaptive weight algorithm which interacts with the underlying local multi-objective solvers and allows for a better coverage of the Pareto front. Our third contribution is the experimental validation of our approach by learning bi-objective policies in self-organising smart camera networks. We note that our algorithm (i) explores the objective space faster on many problem instances, (ii) obtains solutions that exhibit a larger hypervolume, and (iii) acquires a greater spread in the objective space.
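As a rough illustration of the local scalarization step described above, here is a minimal sketch; the reward vector, weights, and objective names are made-up placeholders, not the paper's camera-network experiments.

```python
# Sketch: linear scalarization of a multi-objective reward signal.
import numpy as np

def scalarize(rewards: np.ndarray, weights: np.ndarray) -> float:
    """Collapse a reward vector into one scalar learning signal."""
    assert np.isclose(weights.sum(), 1.0) and (weights >= 0).all()
    return float(weights @ rewards)

# A hypothetical bi-objective reward, e.g. (tracking quality, energy saved):
r = np.array([0.7, 0.2])

# Different weight vectors induce different learned policies; because the
# weight-to-trade-off mapping is non-linear, an adaptive weight schedule
# can cover the Pareto front better than a uniform sweep.
for w0 in (0.1, 0.5, 0.9):
    w = np.array([w0, 1.0 - w0])
    print(w, scalarize(r, w))
```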
Abstract:
The paper is dedicated to questions of modelling and substantiating super-resolution measuring-computing systems within the conception "device + PC = new possibilities". The authors have developed a new mathematical method for solving multi-criteria optimization problems. The method is based on a physico-mathematical formalism for the reduction of fuzzy, distorted measurements. It is shown that a decisive role is played by the mathematical properties of the physical models of the measured object, the surroundings, and the measuring components of the measuring-computing systems and their interaction, as well as by the developed mathematical method of processing and interpreting the measurements.
Abstract:
Measuring and comparing the sustainability of alternatives, scenarios, technologies, etc. is, by definition, a multidimensional problem: when choosing an alternative course of action, decision makers must simultaneously consider environmental, economic and social aspects. Such decisions can be aided by multi-criteria decision analysis (MCDA). This paper investigates the applicability of seven MCDA methodologies under participatory conditions: MAU, the Analytic Hierarchy Process (AHP), ELECTRE, PROMETHEE, REGIME, NAIADE, and the "ideal and reference point" approaches. It is based on a series of reports reviewing more than 30 real-world case studies of participatory MCDA published in recent years. The review shows that there is no "best" choice among MCDA techniques: no single method dominates the others, and each can be used with different success under different conditions. Nevertheless, by combining these methodologies, procedures can be constructed that better exploit the complementary benefits of the different techniques.
Abstract:
There is growing popularity in the use of composite indices and rankings for cross-organizational benchmarking. However, little attention has been paid to alternative methods and procedures for computing these indices and to how the choice of method may affect the resulting indices and rankings. This dissertation developed an approach for assessing composite indices and rankings based on the integration of a number of methods for aggregation, data transformation and attribute weighting involved in their computation. The integrated model is based on simulating composite indices using methods and procedures proposed in the areas of multi-criteria decision making (MCDM) and knowledge discovery in databases (KDD). The approach was automated through an IT artifact that was designed, developed and evaluated following the framework and guidelines of the design science paradigm of information systems research. This artifact dynamically generates multiple versions of indices and rankings by considering different methodological scenarios according to user-specified parameters. The computerized implementation was done in Visual Basic for Excel 2007. Using different performance measures, the artifact produces a number of Excel outputs for the comparison and assessment of the indices and rankings. To evaluate the efficacy of the artifact and its underlying approach, a full empirical analysis was conducted using the World Bank's Doing Business database for the year 2010, which includes ten sub-indices (each corresponding to a different area of the business environment and regulation) for 183 countries. The output results, obtained using 115 methodological scenarios for the assessment of this index and its ten sub-indices, indicated that the variability of the component indicators considered in each case influenced the sensitivity of the rankings to the methodological choices. Overall, the results of our multi-method assessment were consistent with the World Bank rankings, except in cases involving cost indicators measured in per capita income, which yielded more sensitive results. Low-income countries exhibited more sensitivity in their rankings, and less agreement between the benchmark rankings and our multi-method rankings, than higher-income country groups.
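The core of such a multi-method assessment is recomputing the same index under different normalization and aggregation choices and comparing the resulting ranks. A minimal sketch follows, with invented indicator data standing in for the Doing Business sub-indices; the method names are generic, not the dissertation's exact scenario set.

```python
# Sketch: composite-index rankings under different methodological scenarios
# (normalization x aggregation). Data are random placeholders.
import numpy as np

def minmax(X):   # rescale each indicator (column) to [0, 1]
    return (X - X.min(0)) / (X.max(0) - X.min(0))

def zscore(X):   # standardize each indicator
    return (X - X.mean(0)) / X.std(0)

def additive(X, w):   # weighted arithmetic aggregation
    return X @ w

def geometric(X, w):  # weighted geometric aggregation (non-negative inputs)
    return np.exp(np.log(X + 1e-9) @ w)

rng = np.random.default_rng(0)
X = rng.random((8, 3))          # 8 "countries", 3 sub-indices
w = np.array([0.5, 0.3, 0.2])   # illustrative weights

scenarios = {
    "minmax+additive": additive(minmax(X), w),
    "zscore+additive": additive(zscore(X), w),
    "minmax+geometric": geometric(minmax(X), w),
}

# Ranks per scenario (0 = best); divergence across scenarios is exactly
# the sensitivity such an assessment is after.
for name, scores in scenarios.items():
    print(name, np.argsort(np.argsort(-scores)))
```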
Abstract:
Large-scale disasters, such as the one caused by Typhoon Haiyan, which devastated portions of the Philippines in 2013, or the catastrophic 2010 Haiti earthquake, which caused major damage in Port-au-Prince and other settlements in the region, have massive and lasting effects on populations. Nowadays, disasters can be considered a consequence of inappropriately managed risk. These risks are the product of hazards and vulnerability, the latter referring to the extent to which a community can be affected by the impact of a hazard. Developing countries, due to their greater vulnerability, therefore suffer the highest costs when a disaster occurs. Disaster relief is a challenge for politics, economies, and societies worldwide. Humanitarian organizations face multiple decision problems when responding to disasters. In particular, once a disaster strikes, the distribution of humanitarian aid to the affected population is one of the most fundamental operations in what is called humanitarian logistics. This term is defined as the process of planning, implementing and controlling the efficient, cost-effective flow and storage of goods and materials as well as related information, from the point of origin to the point of consumption, for the purpose of meeting the end beneficiaries' requirements and alleviating the suffering of vulnerable people [Humanitarian Logistics Conference, 2004 (Fritz Institute)]. During the last decade there has been increasing interest in the OR/MS community in studying this topic, pointing out the similarities and differences between humanitarian and business logistics, and developing models suited to the special characteristics of these problems. Several authors have pointed out that traditional logistic objectives, such as minimizing operating cost, are not the most relevant goals in humanitarian operations. Other factors, such as the time of operation or the design of safe and equitable distribution plans, come to the fore, and new models and algorithms are needed to cope with these special features. Up to six attributes related to the distribution plan are considered in our multi-criteria approach. Even though there are usually simple ways to measure the cost of an operation, the evaluation of other attributes, such as security or equity, is not easy. As a result, several attribute measures are proposed and developed, focusing on different aspects of the solutions. Furthermore, when metaheuristic solution methods are used, considering non-linear objective functions does not increase the complexity of the algorithms significantly, and thus more accurate measures can be utilized...
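To make the difficulty of measuring equity concrete, here is a minimal sketch of one possible equity attribute: the ratio of the worst-served to the best-served fill rate across demand nodes. Both the measure and the data are hypothetical illustrations; the thesis proposes and compares several such attribute measures of its own.

```python
# Sketch: a hypothetical equity measure for an aid-distribution plan.
def equity_min_max_ratio(delivered: dict[str, float],
                         demand: dict[str, float]) -> float:
    """1.0 = perfectly equal fill rates; 0.0 = some node got nothing."""
    fill = [delivered[n] / demand[n] for n in demand]
    return min(fill) / max(fill) if max(fill) > 0 else 0.0

demand = {"town_A": 100.0, "town_B": 60.0, "town_C": 40.0}
delivered = {"town_A": 80.0, "town_B": 30.0, "town_C": 40.0}
print(equity_min_max_ratio(delivered, demand))  # 0.5: town_B is underserved
```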
Abstract:
Artisanal mining is a global phenomenon that poses threats to environmental health and safety. Ambiguities in the manner in which ore-processing facilities operate hinder the mining capacity of artisanal miners in Ghana. These problems are reviewed on the basis of current socio-economic, health and safety, and environmental conditions, and of the use of rudimentary technologies that limit fair-trade deals for miners. This research used an established, data-driven, geographic information system (GIS)-based spatial analysis approach to locate a centralized processing facility within the Wassa Amenfi-Prestea Mining Area (WAPMA) in the Western Region of Ghana. A spatial analysis technique utilizing ModelBuilder within the ArcGIS geoprocessing environment systematically and simultaneously analyzes a geographical dataset of selected criteria through suitability modeling. The spatial overlay analysis methodology and the multi-criteria decision analysis approach were selected to identify the most preferred locations for siting a processing facility. For an optimal site selection, seven major criteria were considered: proximity to settlements, water resources, artisanal mining sites, roads, railways, tectonic zones, and slope. Site characterization and environmental considerations incorporated identified constraints, such as proximity to large-scale mines, forest reserves and state lands. The analysis was limited to criteria selected as relevant to the area under investigation. Saaty's analytic hierarchy process was utilized to derive relative importance weights for the criteria, and a weighted linear combination technique was then applied to combine the factors and determine the degree of potential site suitability. The final map output indicates the potential sites identified for establishing a facility centre. The results provide intuitive areas suitable for consideration.
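The two computational steps named above, AHP weight derivation and weighted linear combination (WLC), can be sketched compactly. The sketch below uses the geometric-mean approximation of AHP's principal-eigenvector weights; the 3x3 comparison matrix and the candidate-cell scores are invented for illustration (the study used seven criteria over raster layers of the WAPMA region).

```python
# Sketch: AHP criterion weights (geometric-mean approximation) + WLC.
import numpy as np

# Saaty-scale pairwise comparisons for three hypothetical criteria:
# roads, water, settlements.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

gm = A.prod(axis=1) ** (1.0 / A.shape[0])  # row geometric means
w = gm / gm.sum()                          # normalized criterion weights

# Suitability scores (0-1) of two candidate cells on each criterion.
cells = np.array([[0.9, 0.4, 0.7],
                  [0.5, 0.8, 0.6]])
suitability = cells @ w                    # WLC suitability per cell
print(np.round(w, 3), np.round(suitability, 3))
```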
Abstract:
This thesis deals with the control of stock in an inventory, focusing on inventory placement. The purpose of this thesis is to reduce the transport distance within the main stock house during picking. This is achieved by restructuring the inventory placement according to how frequently items are picked and how much they weigh. The literature and the data collected from the company's business system laid the foundation for the thesis; interviews and observations also contributed to the data collection. To fulfil the aim and produce useful results, two research questions were developed: which attributes should determine the position of items in the stock house, and how can a more effective inventory structure be obtained? The authors have jointly produced a set of suggestions for future inventory placement in terms of picking frequency and weight. Initially, a situation analysis was conducted to identify known problems with the inventory placement and storage systems. The problem identified was that the inventory placement takes no account of picking frequency: the most frequently picked items were spread throughout the whole stock house. To determine the most frequently picked items, an ABC analysis was conducted. To take the additional criterion, weight, into account, a multi-criteria analysis was performed in combination with the ABC analysis. The results of the combined analysis provided the basis for drawing up concepts for future inventory placement. The proposal includes optimized placements, in different zones, of the most frequently picked items, with weight as an additional criterion.
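A minimal sketch of the combined analysis described above follows: ABC classes assigned by cumulative share of picks, with a weight rule layered on top. The items, thresholds, and zone names are invented for illustration, not the company's data.

```python
# Sketch: ABC classification by picking frequency plus a weight criterion.
from dataclasses import dataclass

@dataclass
class Item:
    sku: str
    picks_per_year: int
    weight_kg: float

items = [Item("A1", 1200, 3.0), Item("B7", 300, 18.0),
         Item("C3", 40, 1.2), Item("D9", 900, 25.0)]

# ABC classes by cumulative share of picks: A ~ top 80%, B ~ next 15%.
items.sort(key=lambda it: it.picks_per_year, reverse=True)
total = sum(it.picks_per_year for it in items)
cum = 0.0
for it in items:
    cum += it.picks_per_year / total
    abc = "A" if cum <= 0.80 else ("B" if cum <= 0.95 else "C")
    # Weight as the second criterion: heavy items get ground-level slots.
    zone = "ground" if it.weight_kg > 15 else {"A": "front", "B": "middle",
                                               "C": "back"}[abc]
    print(it.sku, abc, zone)
```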
Abstract:
As the complexity of parallel applications increases, the performance limitations resulting from computational load imbalance become dominant. Mapping the problem space to the processors of a parallel machine in a manner that balances the workload of each processor will typically reduce the run-time. In many cases the computation time required for a given calculation cannot be predetermined, even at run-time, and so a static partition of the problem yields poor performance. For problems in which the computational load across the discretisation is dynamic and inhomogeneous, for example multi-physics problems involving fluid and solid mechanics with phase changes, the workload of a static subdomain will change over the course of a computation and cannot be estimated beforehand. For such applications the mapping of loads to processes must change dynamically at run-time in order to maintain reasonable efficiency. The issues of dynamic load balancing are examined in the context of PHYSICA, a three-dimensional unstructured-mesh multi-physics continuum mechanics computational modelling code.
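As a generic illustration of the kind of run-time rebalancing motivated above, not the PHYSICA implementation, here is a minimal sketch in which measured (rather than predicted) per-cell costs drive greedy migration from the most to the least loaded processor.

```python
# Sketch: greedy dynamic load rebalancing driven by measured cell costs.
def rebalance(partitions: dict[int, list[float]]) -> dict[int, list[float]]:
    """Migrate the cheapest cell from the most to the least loaded
    processor while doing so lowers the peak load."""
    parts = {p: list(cells) for p, cells in partitions.items()}
    while True:
        load = {p: sum(c) for p, c in parts.items()}
        hi = max(load, key=load.get)   # most loaded processor
        lo = min(load, key=load.get)   # least loaded processor
        cell = min(parts[hi])          # cheapest cell to migrate
        # Stop when moving it would no longer reduce the peak load.
        if max(load[hi] - cell, load[lo] + cell) >= load[hi]:
            return parts
        parts[hi].remove(cell)
        parts[lo].append(cell)

# Measured run-time cost per mesh cell on each of three processors:
print(rebalance({0: [5.0, 4.0, 3.0], 1: [1.0], 2: [2.0, 1.5]}))
```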
Abstract:
Currently, decision analysis in production processes involves a level of detail in which the problem is subdivided so that it can be analyzed from different, conflicting points of view. Multi-criteria analysis has become an important tool that supports assertive decisions related to the production process, and it has been incorporated into various areas of production engineering through the application of multi-criteria methods to problems of the productive sector. This research presents a statistical study on the use of multi-criteria methods in the areas of Production Engineering, in which 935 papers were filtered from 20,663 publications in scientific journals, taking publication quality into account via the impact factor published by the JCR between 2010 and 2015. Descriptive statistics are used to summarize the volume of applications of the methods. Relevant results were found regarding the number of advanced methods being applied and the areas of Production Engineering in which they are applied. This information may support researchers preparing a multi-criteria application, making it possible to check on which problems, and how often, other authors have used multi-criteria methods.
Abstract:
Phylogenetic inference consists in the search for an evolutionary tree that best explains the genealogical relationships of a set of species. Phylogenetic analysis has a large number of applications in areas such as biology, ecology and paleontology. Several criteria have been defined for inferring phylogenies, among them maximum parsimony and maximum likelihood. The first tries to find the phylogenetic tree that minimizes the number of evolutionary steps needed to describe the evolutionary history of the species, while the second tries to find the tree that has the highest probability of producing the observed data according to an evolutionary model. The search for a phylogenetic tree can be formulated as a multi-objective optimization problem, which aims to find trees that satisfy both the parsimony and the likelihood criteria simultaneously, and as far as possible. Because these criteria conflict, there will not be a single optimal solution (a single tree) but a set of compromise solutions, called the Pareto-optimal set. To find these solutions, evolutionary algorithms are nowadays used with success. These algorithms are a family of non-exact techniques inspired by the process of natural selection, and they usually find high-quality solutions to hard optimization problems. They work by manipulating a set of trial solutions (trees, in the case of phylogeny) using operators: some exchange information between solutions, simulating DNA crossover, while others apply random modifications, simulating mutation. The result is an approximation to the Pareto-optimal set, which can be shown in a graph so that the domain expert (the biologist, in the case of inference) can choose the compromise solution of greatest interest. For multi-objective optimization applied to phylogenetic inference there is an open-source software tool, called MO-Phylogenetics, designed to solve inference problems with classic and state-of-the-art evolutionary algorithms. REFERENCES [1] C.A. Coello Coello, G.B. Lamont, D.A. van Veldhuizen. Evolutionary Algorithms for Solving Multi-Objective Problems. Springer, August 2007. [2] C. Zambrano-Vega, A.J. Nebro, J.F. Aldana-Montes. MO-Phylogenetics: a phylogenetic inference software tool with multi-objective evolutionary metaheuristics. Methods in Ecology and Evolution. In press, February 2016.
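The Pareto-optimality notion used above reduces to a simple dominance test over the two criteria. Here is a minimal sketch; the trees and their scores are invented placeholders (tools such as MO-Phylogenetics compute them from real sequence data).

```python
# Sketch: Pareto-optimal trees under (parsimony: minimize, log-likelihood:
# maximize). Scores are illustrative only.
def dominates(a, b):
    """a dominates b if it is no worse on both criteria and strictly
    better on at least one. a, b = (parsimony, log_likelihood)."""
    return (a[0] <= b[0] and a[1] >= b[1]) and (a[0] < b[0] or a[1] > b[1])

trees = {"T1": (120, -4500.0), "T2": (118, -4510.0),
         "T3": (125, -4490.0), "T4": (121, -4505.0)}

pareto = [t for t, s in trees.items()
          if not any(dominates(s2, s) for t2, s2 in trees.items() if t2 != t)]
print(pareto)  # the non-dominated (compromise) trees: T1, T2, T3
```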
Abstract:
Multi-Criteria Evaluation (MCE) integrates the different dimensions of a reality into a single analytical framework. It is used here to provide an approach to the management of water resources in the cantons of Barva, Santa Bárbara and San Rafael de Heredia, with the objective of generating adequate local water policies. This methodological structure offers great transparency as a decision-making tool: it clearly identifies the different actors involved and describes the water-management problems in the area, while also delimiting the social conflicts and showing different possibilities for their solution through compromise and dialogue among the parties. From this dialogue concrete solutions emerge, structured as local water policies such as water-management plans, public and private investment, institutional coordination, and institutional/legal reform. The area presents a conflictive atmosphere around water management and hence around the structuring of local water policies. This 'sectoral' conflict (that is, by canton) is superimposed on an extraordinary 'territorial' conflict. The scarcity of, and competition for, water is rooted in growing demands that are the expression of a process of rapid and disorderly urban and tourist development.
Abstract:
The study of Quality of Life (QoL) has been conducted at various scales throughout the years, with a focus on assessing the overall quality of living among citizens. The main focus of these studies has been on economic factors, with the purpose of creating a Quality of Life Index (QLI). When the focus is narrowed to the environment and to factors like Urban Green Spaces (UGS) and air quality, the question becomes how well each alternative meets these criteria. With the benefits of UGS and a healthy environment in focus, a new Environmental Quality of Life Index (EQLI) is proposed by combining Multi-Criteria Analysis (MCA) and Geographical Information Systems (GIS). Applying MCA to complex environmental problems and integrating it with GIS is a challenging but rewarding task that has proven to be an efficient approach among environmental scientists. Background information is given on three MCA methods: the Analytical Hierarchy Process (AHP), Regime Analysis and PROMETHEE. A survey based on a previous study of the status of UGS within European cities was sent to 18 municipalities in the study area. The survey evaluates the current status of UGS as well as the planning and management of UGS within the municipalities, in order to obtain criterion material for the selected MCA method. The current situation of UGS is assessed using GIS software, and change detection over a 10-year period is performed using the NDVI for comparison against one of the MCA criteria. As an additional criterion, nitrogen dioxide levels were interpolated with ordinary kriging and the results transformed into indicator values. The final outcome is an EQLI map indicating environmentally attractive municipalities, ranked against the predefined MCA criteria using PROMETHEE I pairwise comparison and PROMETHEE II complete ranking of alternatives. The proposed methodology is applied to Lisbon's Metropolitan Area, Portugal.
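The PROMETHEE II ranking step used above can be sketched compactly with a linear preference function. The municipalities, criteria, weights, and thresholds below are placeholders for illustration, not the study's Lisbon data.

```python
# Sketch: PROMETHEE II net-flow ranking with a linear preference function.
import numpy as np

X = np.array([[0.8, 0.3, 40.0],   # rows: municipalities (alternatives)
              [0.6, 0.7, 25.0],   # cols: UGS share, NDVI change, NO2 level
              [0.4, 0.5, 30.0]])
w = np.array([0.4, 0.3, 0.3])             # criterion weights (sum to 1)
maximize = np.array([True, True, False])  # NO2 should be minimized
p = np.array([0.5, 0.5, 20.0])            # preference thresholds

n = len(X)
Pi = np.zeros((n, n))                     # pairwise preference degrees
for a in range(n):
    for b in range(n):
        if a != b:
            d = np.where(maximize, X[a] - X[b], X[b] - X[a])
            Pi[a, b] = w @ np.clip(d / p, 0.0, 1.0)

phi_plus = Pi.sum(axis=1) / (n - 1)   # how strongly a outranks the rest
phi_minus = Pi.sum(axis=0) / (n - 1)  # how strongly a is outranked
net_flow = phi_plus - phi_minus       # PROMETHEE II net flow
print(np.argsort(-net_flow))          # complete ranking, best first
```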
Abstract:
Technologies for Big Data and Data Science are receiving increasing research interest nowadays. This paper introduces the prototype architecture of a tool aimed at solving Big Data optimization problems. Our tool combines the jMetal framework for multi-objective optimization with Apache Spark, a technology that is gaining momentum. In particular, we make use of the streaming facilities of Spark to feed an optimization problem with data from different sources. We demonstrate the use of our tool by solving a dynamic bi-objective instance of the Traveling Salesman Problem (TSP) based on near real-time traffic data from New York City, which is updated several times per minute. Our experiment shows that jMetal and Spark can be integrated to provide a software platform for dealing with dynamic multi-objective optimization problems.
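To illustrate what "dynamic bi-objective TSP" means in this setting, here is a minimal, generic Python sketch of the problem structure: one objective per cost matrix (distance and travel time), with the time matrix mutable so that streamed traffic updates change the fitness landscape at run time. This is an illustration under assumed data, not the paper's jMetal (Java) and Spark implementation.

```python
# Sketch: a dynamic bi-objective TSP whose time matrix is updated by
# streamed traffic observations. Distances and updates are invented.
import numpy as np

class DynamicBiObjectiveTSP:
    def __init__(self, dist: np.ndarray, time: np.ndarray):
        self.dist, self.time = dist, time.copy()

    def update_traffic(self, i: int, j: int, minutes: float):
        """Apply one streamed traffic observation (symmetric edge)."""
        self.time[i, j] = self.time[j, i] = minutes

    def evaluate(self, tour: list[int]) -> tuple[float, float]:
        """Return (total distance, total travel time) of a closed tour."""
        edges = list(zip(tour, tour[1:] + tour[:1]))
        return (sum(self.dist[i, j] for i, j in edges),
                sum(self.time[i, j] for i, j in edges))

rng = np.random.default_rng(1)
D = rng.uniform(1, 10, (4, 4)); D = (D + D.T) / 2; np.fill_diagonal(D, 0)
prob = DynamicBiObjectiveTSP(D, D * 2.0)   # time initially ~ distance
tour = [0, 2, 1, 3]
print(prob.evaluate(tour))
prob.update_traffic(0, 2, 45.0)            # congestion arrives on edge (0, 2)
print(prob.evaluate(tour))                 # time objective has changed
```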