892 results for Constructive heuristics
Abstract:
Solutions to combinatorial optimization problems, such as facility location problems, frequently rely on heuristics to minimize the objective function. The optimum is sought iteratively and a criterion is needed to decide when the procedure (almost) attains it. Pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small, almost dormant, branch of the literature suggests using statistical principles to estimate the minimum and its bounds as a tool for deciding when to stop and for evaluating the quality of the solution. In this paper we examine how statistical bounds obtained from four different estimators perform when simulated annealing is applied to p-median test problems taken from Beasley’s OR-library. We find the Weibull estimator and the 2nd order Jackknife estimator preferable, and the required sample size to be about 10, which is much less than the current recommendation. However, reliable statistical bounds are found to depend critically on a sample of heuristic solutions of high quality, and we give a simple statistic useful for checking that quality. We end the paper with an illustration of using statistical bounds in a problem of locating some 70 distribution centers of the Swedish Post in one Swedish region.
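The Weibull idea mentioned in this abstract can be illustrated with a small sketch. Assuming a sample of objective values from repeated heuristic runs, one classical variant estimates the Weibull location parameter, i.e. the unknown minimum, from three order statistics; this is an illustrative assumption about the estimator's form, not necessarily the fitting procedure used in the paper.

```python
import numpy as np

def weibull_min_estimate(values):
    """Estimate the unknown global minimum from a sample of heuristic
    solution values, using the classical three-order-statistic estimator
    of the Weibull location parameter (an illustrative variant; the
    paper's exact fitting procedure may differ)."""
    x = np.sort(np.asarray(values, dtype=float))
    x1, x2, xn = x[0], x[1], x[-1]          # smallest, second smallest, largest
    denom = x1 + xn - 2.0 * x2
    if np.isclose(denom, 0.0):
        return x1                           # degenerate sample: fall back to the best value found
    a_hat = (x1 * xn - x2 ** 2) / denom     # estimated location parameter (lower endpoint)
    return min(a_hat, x1)                   # the minimum cannot exceed the best observed value

# hypothetical objective values from ~10 heuristic restarts, as the abstract suggests
print(weibull_min_estimate([5846.0, 5853.2, 5861.7, 5870.4, 5849.9,
                            5855.0, 5866.1, 5872.8, 5858.3, 5851.6]))
```

The point estimate together with the best observed value then gives a rough interval for the unknown optimum.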
Abstract:
Solutions to combinatorial optimization problems, such as the p-median problem of locating facilities, frequently rely on heuristics to minimize the objective function. The minimum is sought iteratively and a criterion is needed to decide when the procedure (almost) attains it. However, pre-setting the number of iterations dominates in OR applications, which implies that the quality of the solution cannot be ascertained. A small branch of the literature suggests using statistical principles to estimate the minimum and to use the estimate either for stopping or for evaluating the quality of the solution. In this paper we take test problems from Beasley's OR-library and apply simulated annealing to these p-median problems. We do this in order to compare suggested methods of minimum estimation and, eventually, to provide a recommendation for practitioners. The paper ends with an illustration: a problem of locating some 70 distribution centers of the Swedish Post in a region.
Abstract:
Solutions to combinatorial optimization problems frequently rely on heuristics to minimize an objective function. The optimum is sought iteratively, and pre-setting the number of iterations dominates in operations research applications, which implies that the quality of the solution cannot be ascertained. Deterministic bounds offer a means of ascertaining the quality, but such bounds are available for only a limited number of heuristics and the length of the interval may be difficult to control in an application. A small, almost dormant, branch of the literature suggests using statistical principles to derive statistical bounds for the optimum. We discuss alternative approaches to deriving statistical bounds. We also assess their performance by testing them on 40 p-median test problems on facility location, taken from Beasley’s OR-library, for which the optimum is known. We consider three popular heuristics for solving such location problems: simulated annealing, vertex substitution, and Lagrangian relaxation, of which only the last offers deterministic bounds. Moreover, we illustrate statistical bounds in the location of 71 regional delivery points of the Swedish Post. We find statistical bounds reliable and much more efficient than deterministic bounds, provided that the heuristic solutions are sampled close to the optimum. Statistical bounds are also found computationally affordable.
Abstract:
The p-median problem is often used to locate p service centers by minimizing their distances to a geographically distributed demand (n). The optimal locations are sensitive to geographical context, such as the road network and the demand points, especially when these are asymmetrically distributed in the plane. Most studies focus on evaluating the performance of the p-median model when p and n vary. To our knowledge, how the solutions behave when the road network is altered is not a well-studied problem, especially in a real-world context. The aim of this study is to analyze how the optimal location solutions of the p-median model vary when the density of the road network is altered. The investigation is conducted by means of a case study in Dalecarlia, a region in Sweden with an asymmetrically distributed population (15,000 weighted demand points). To locate 5 to 50 service centers we use the national transport administration's official road network (NVDB), which consists of 1.5 million nodes. To find the optimal locations we start with 500 candidate nodes in the network and increase the number of candidate nodes in steps up to 67,000. To find the optimal solution we use a simulated annealing algorithm with adaptive tuning of the temperature. The results show that there is limited improvement in the optimal solutions when the number of nodes in the road network increases and p is low. When p is high, the improvements are larger. The results also show that the choice of the best network depends on p: the larger p is, the denser the network needs to be.
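As a rough illustration of the kind of solver described in this abstract, the sketch below applies plain simulated annealing to a p-median instance given as a square distance matrix over candidate nodes. The geometric cooling schedule and all parameter values are illustrative assumptions; the study itself uses adaptive tuning of the temperature, which is not reproduced here.

```python
import math
import random

def p_median_cost(medians, dist):
    """Total distance from every demand node to its nearest chosen median."""
    return sum(min(dist[i][m] for m in medians) for i in range(len(dist)))

def simulated_annealing_p_median(dist, p, n_iter=10000, t0=1.0, cooling=0.999, seed=0):
    """Minimal simulated-annealing sketch for the p-median problem.
    dist is assumed to be a square matrix where demand nodes and
    candidate nodes coincide (a simplification of the case study)."""
    rng = random.Random(seed)
    nodes = list(range(len(dist)))
    current = rng.sample(nodes, p)
    cost = p_median_cost(current, dist)
    best, best_cost = list(current), cost
    t = t0
    for _ in range(n_iter):
        # neighbour move: swap one chosen median for an unused candidate node
        out_idx = rng.randrange(p)
        candidate = rng.choice([v for v in nodes if v not in current])
        neighbour = list(current)
        neighbour[out_idx] = candidate
        new_cost = p_median_cost(neighbour, dist)
        # accept improvements always, deteriorations with a temperature-dependent probability
        if new_cost < cost or rng.random() < math.exp((cost - new_cost) / max(t, 1e-12)):
            current, cost = neighbour, new_cost
            if cost < best_cost:
                best, best_cost = list(current), cost
        t *= cooling
    return best, best_cost
```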
Abstract:
Corporal punishment is a worldwide problem. The purpose of this thesis is to promote a constructive discussion about the problem and connect it to children's rights. This makes it possible to start a discussion about suggestions and measures to reduce the problem. The theory is that corporal punishment is used as a disciplinary method to change behavior. Children's rights are regulated by conventions and national laws. The method is an analysis, with interpretations and commentary, of research material from South Africa and Sweden. The conclusion is that those who are positive towards corporal punishment think it is an efficient working method and that it is about children's safety. Those who are negative have experienced that alternative methods work. A suggestion is to involve children in the work with children's rights and to educate them in human and children's rights with a focus on obligations and responsibility.
Abstract:
Optimal location on the transport infrastructure is a key requirement for many decision-making processes. Most studies have focused on evaluating the performance of optimally locating p facilities by minimizing their distances to a geographically distributed demand (n) when p and n vary. The optimal locations are also sensitive to geographical context such as the road network, especially when demand is asymmetrically distributed in the plane. The influence of varying road network density is, however, not a well-studied problem, especially in a real-world context. This paper aims to investigate how the density level of the road network affects finding the optimal locations by solving a specific case of the p-median location problem. A denser network is found to be needed when a larger number of facilities are to be located. The best solution is not always obtained in the most detailed network but at a middle density level. The solutions improve insignificantly, or not at all, as the density exceeds 12,000 nodes; some solutions even deteriorate. The hierarchy of network densities can be used according to the location and transportation purpose and increases the efficiency of heuristic methods. The method in this study can be applied to other location-allocation problems in transportation analysis where the road network density can be differentiated.
Abstract:
Good data quality with high complexity is often seen as important. Intuition says that the higher the accuracy and complexity of the data, the better the analytic solutions become, provided the increasing computing time can be handled. However, for most practical computational problems, high-complexity data means that computation times become too long or that the heuristics used to solve the problem have difficulty reaching good solutions. This is stressed further when the size of the combinatorial problem increases. Consequently, we often need simplified data to deal with complex combinatorial problems. In this study we address the question of how the complexity and accuracy of a network affect the quality of heuristic solutions for different sizes of the combinatorial problem. We evaluate this question by applying the commonly used p-median model, which finds optimal locations in a network of p supply points that serve n demand points. We vary both the accuracy (the number of nodes) of the network and the size of the combinatorial problem (p). The investigation is conducted by means of a case study in Dalecarlia, a region in Sweden with an asymmetrically distributed population (15,000 weighted demand points). To locate 5 to 50 supply points we use the national transport administration's official road network (NVDB), which consists of 1.5 million nodes. To find the optimal locations we start with 500 candidate nodes in the network and increase the number of candidate nodes in steps up to 67,000 (aggregated from the 1.5 million nodes). To find the optimal solution we use a simulated annealing algorithm with adaptive tuning of the temperature. The results show that there is limited improvement in the optimal solutions when the accuracy of the road network increases and the combinatorial problem is simple (low p). When the combinatorial problem is complex (large p), the improvements from increasing the accuracy of the road network are much larger. The results also show that the choice of the best accuracy of the network depends on the complexity of the combinatorial problem (varying p).
Abstract:
In this paper we describe our system for automatically extracting "correct" programs from proofs using a development of the Curry-Howard process. Although program extraction has been developed by many authors, our system has a number of novel features designed to make it very easy to use and as close as possible to ordinary mathematical terminology and practice. These features include 1. the use of Henkin's technique to reduce higher-order logic to many-sorted (first-order) logic; 2. the free use of new rules for induction subject to certain conditions; 3. the extensive use of previously programmed (total, recursive) functions; 4. the use of templates to make the reasoning much closer to normal mathematical proofs; and 5. a conceptual distinction between the computational type theory (for representing programs) and the logical type theory (for reasoning about programs). As an example of our system we give a constructive proof of the well-known theorem that every graph of even parity, which is non-trivial in the sense that it does not consist of isolated vertices, has a cycle. Given such a graph as input, the extracted program produces a cycle as promised.
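To give a feel for the computational content of the theorem cited in this abstract, the following plain Python sketch finds a cycle in a finite simple graph in which every non-isolated vertex has even (hence at least 2) degree, by walking without reusing the arrival edge until a vertex repeats. It is only an illustration of the constructive argument, not the program extracted by the authors' system.

```python
def find_cycle(adj):
    """Return one simple cycle (as a list of vertices, first == last) in a
    finite simple graph whose non-isolated vertices all have even degree.
    adj maps each vertex to a list of its neighbours."""
    start = next((v for v, nbrs in adj.items() if nbrs), None)
    if start is None:
        raise ValueError("graph consists of isolated vertices only")
    path, index = [start], {start: 0}
    prev, v = None, start
    while True:
        # leave v by an edge other than the one we arrived on; this is
        # always possible because deg(v) is even and at least 2
        nxt = adj[v][0] if prev is None else next(u for u in adj[v] if u != prev)
        if nxt in index:                     # a vertex repeats: the walk closes a cycle
            return path[index[nxt]:] + [nxt]
        index[nxt] = len(path)
        path.append(nxt)
        prev, v = v, nxt

# hypothetical usage on a triangle
print(find_cycle({1: [2, 3], 2: [1, 3], 3: [1, 2]}))   # -> [1, 2, 3, 1]
```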
Abstract:
In this paper we describe a new protocol that we call the Curry-Howard protocol between a theory and the programs extracted from it. This protocol leads to the expansion of the theory and the production of more powerful programs. The methodology we use for automatically extracting “correct” programs from proofs is a development of the well-known Curry-Howard process. Program extraction has been developed by many authors, but our presentation is ultimately aimed at a practical, usable system and has a number of novel features. These include 1. a very simple and natural mimicking of ordinary mathematical practice and likewise the use of established computer programs when we obtain programs from formal proofs, and 2. a conceptual distinction between programs on the one hand, and proofs of theorems that yield programs on the other. An implementation of our methodology is the Fred system. As an example of our protocol we describe a constructive proof of the well-known theorem that every graph of even parity can be decomposed into a list of disjoint cycles. Given such a graph as input, the extracted program produces a list of the (non-trivial) disjoint cycles as promised.
Abstract:
The Object Management Group's Meta-Object Facility (MOF) is a semiformal approach to writing models and metamodels (models of models). The MOF was developed to enable systematic model/metamodel interchange and integration. The approach is problematic unless metamodels are correctly specified: an error in a metamodel specification will propagate through instantiating models and final model implementations. An important open question is how to develop provably correct metamodels. This paper outlines a solution to the question, in which the MOF metamodelling approach is formalized within constructive type theory.
Abstract:
The Rational Agent model has been a foundational basis for theoretical models in Economics, Management Science, Artificial Intelligence and Game Theory, mainly through the "maximization under constraints" principle, e.g. the Expected Utility models, among them Savage's Subjective Expected Utility (SEU) Theory, which remains the most influential over the theoretical models we see nowadays, even though many other developments have been made, including in the field of non-expected utility theories. Starting from the "full rationality" assumption, moving to the less idealistic "bounded rationality" of Simon, or to the classical anomaly studies such as the "heuristics and biases" analysis by Kahneman and Tversky, Prospect Theory (also by Kahneman and Tversky), or Thaler's Anomalies, and many others, what we can see now is that the Rational Agent model is an example of "management by exceptions": each newly presented anomaly requires, in sequence, its own "problem solving" development. This work is a theoretical essay which tries to establish: 1) the rational model as a "set of exceptions"; 2) the unfeasibility of the current situation, since once an anomaly is identified its specific solution must be developed, and since the number of anomalies increases every year it becomes very difficult to manage the rational model; 3) that behaviors judged "irrational" or deviant by the Rational Model are truly not; 4) that this is the right moment for a theory to emerge that includes the mental processes used in decision making; and 5) the presentation of an alternative model, based on analyses from cognitive and experimental psychology, such as conscious and unconscious processes, cognition, intuition, analogy-making, abstract roles, and others. Finally, we present conclusions and future research, which call for deeper studies of this work's themes, for mathematical modelling, and for studies of a possible integration between rational analysis and cognitive models.
Abstract:
We address specific problems to be considered once the Protocolo de Fortaleza becomes a fully recognised agreement. Elimination of anti-dumping measures a la WTO, competition in services and the interrelationship between regulatory agencies and competition offices, issues regarding the concept of relevant market and, in a broader view, harmonisation of rules, criteria, institutions and regional competitive environments are discussed. The European experience, if properly adapted, can be of value. In this context, two important principles are singled out: acknowledgement of an acquis Mercosul, which allows a realistic and constructive perspective when facing the integration challenges, and the wise use of subsidiarity, for faster development with lighter central institutions. Dispute settlement in the competition framework is not tackled, though in this case we favour the creation of a supranational organism. All these points naturally do not encompass everything required for the full implementation of a competition policy in Mercosul.
Abstract:
This thesis provides three original contributions to the field of Decision Sciences. The first contribution explores the field of heuristics and biases. New variations of the Cognitive Reflection Test (CRT, a test to measure "the ability or disposition to resist reporting the response that first comes to mind") are provided. The original CRT (S. Frederick [2005] Journal of Economic Perspectives, v. 19:4, pp. 24-42) has items in which the response is immediate, and erroneous. It is shown that by merely varying the numerical parameters of the problems, large deviations in response are found. Not only are the final results affected by the proposed variations, but so is processing fluency. It seems that the magnitude of the numbers serves as a cue to activate system-2 type reasoning. The second contribution explores Managerial Algorithmics Theory (M. Moldoveanu [2009] Strategic Management Journal, v. 30, pp. 737-763), an ambitious research program stating that managers display cognitive choices with a "preference towards solving problems of low computational complexity". An empirical test of this hypothesis is conducted, with results showing that the premise is not supported. A number of problems are designed with the intent of testing the predictions of managerial algorithmics against the predictions of cognitive psychology. The results demonstrate (once again) that framing effects profoundly affect choice, and (an original insight) that managers are unable to distinguish computational complexity problem classes. The third contribution explores a new approach to a computationally complex problem in marketing: the shelf space allocation problem (M-H Yang [2001] European Journal of Operational Research, v. 131, pp. 107-118). A new representation for a genetic algorithm is developed, and computational experiments demonstrate its feasibility as a practical solution method. These studies lie at the interface of psychology and economics (with bounded rationality and the heuristics and biases programme), of psychology, strategy, and computational complexity, and of heuristics for computationally hard problems in management science.
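The third contribution can be pictured with a minimal genetic-algorithm sketch for a shelf-space allocation problem. The chromosome encoding (one shelf index per product, with -1 meaning the product is not displayed), the capacity model, the operators, and all parameter values below are illustrative assumptions, not the representation developed in the thesis.

```python
import random

def shelf_ga(profit, n_shelves, capacity, pop_size=40, generations=200, seed=0):
    """Toy genetic algorithm: assign each product to a shelf (or to none)
    so as to maximise the summed profit of products on shelves that are
    not over capacity. Assumes at least two products."""
    rng = random.Random(seed)
    n_products = len(profit)

    def fitness(chrom):
        load = [0] * n_shelves
        for shelf in chrom:
            if shelf >= 0:
                load[shelf] += 1
        # profit counts only for products on shelves within capacity
        return sum(profit[i] for i, s in enumerate(chrom)
                   if s >= 0 and load[s] <= capacity)

    def random_chrom():
        return [rng.randrange(-1, n_shelves) for _ in range(n_products)]

    def crossover(a, b):
        cut = rng.randrange(1, n_products)          # one-point crossover
        return a[:cut] + b[cut:]

    def mutate(chrom):
        i = rng.randrange(n_products)               # reassign one random product
        chrom[i] = rng.randrange(-1, n_shelves)
        return chrom

    pop = [random_chrom() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # truncation selection
        children = [mutate(crossover(rng.choice(parents), rng.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)
```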
Abstract:
Human decisions were the central concern of Herbert Simon in his vast academic production, through which he spread his bounded-rationality approach throughout economics. Recognition of the complex environment and of the cognitive limits of human beings led him to propose mechanisms for facilitating the decision process. Among them, he highlighted as the most important the use of heuristics, rules that simplify decision making. Around this idea a new and promising path for the study of human decisions in economics has developed, and numerous works have addressed the subject. More recently the theme points to the work of Daniel Kahneman and Amos Tversky, who analyzed behaviors that are anomalous with respect to more traditional decision theory because of the use of heuristics. This approach, called heuristics and biases, has gained considerable space in academia and has been used to analyze many empirical events in management, law, economics and medicine. This thesis is structured in three articles. The first article deals with the use of heuristics in analyzing the behavior of the economic agent, building on the contributions of Simon, Kahneman and Tversky. The presentation of criticisms of the two proposals sheds light on the debate around the possible relation between them. Based on an analysis of the literature, this work proposes a complementarity that is promising for economics, with the construction of a behavioral theory built around heuristics. In the second article, the contributions of Herbert Simon, Daniel Kahneman and Amos Tversky are used to analyze consumer behavior. Through an agent-based simulation model, five heuristics representing different rules used by the consumer in the purchase decision are compared: lowest price among 3, 4 and 5 searched alternatives; Take-The-Best (TTB), proposed by Gigerenzer and Goldstein; and Time-Is-Money (TIM). The results depart from maximization but can be interpreted as efficient given the reduction in search effort and the price obtained. Two heuristics show great efficiency: lowest price among 3 alternatives and TTB. Including an increasing search cost in the analysis makes the TIM results very efficient and draws attention to the relevance of the definition of cost when evaluating the efficiency of a heuristic. The third article discusses a behavior-adaptation mechanism aimed at improving the outcome obtained with the decision. Through agent-based simulation, consumers of homogeneous goods who use heuristics to decide their purchases are modeled. A heuristic is developed, the adaptive Take-The-Best (TTBA), which incorporates a mechanism proposed by Simon whereby adaptation, as a reaction to recent performance, can change the aspiration level regarding future results and thereby define the extent of the search for alternatives. The results achieved with TTBA are compared to three other heuristics: random search, lowest of 3 alternatives, and Take-The-Best (TTB). The simulation showed that lowest of 3 continues to obtain good results and that incorporating the adaptation mechanism into TTB makes TTBA efficient.
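The Take-The-Best heuristic compared in the second article can be sketched in a few lines: cues are examined in order of validity, and the first cue that discriminates between two alternatives decides the choice. The cue representation below is an illustrative assumption, not the thesis's simulation model.

```python
def take_the_best(alt_a, alt_b, cues):
    """Choose between two alternatives with Take-The-Best
    (Gigerenzer & Goldstein): go through the binary cues in order of
    validity and pick the alternative favoured by the first cue that
    discriminates; return None (i.e. guess) if no cue does."""
    for cue in cues:                 # cues assumed pre-sorted by decreasing validity
        va, vb = cue(alt_a), cue(alt_b)
        if va != vb:
            return alt_a if va > vb else alt_b
    return None

# hypothetical usage: shops described as dicts, cues as boolean look-ups
shop_x = {"advertises": True, "is_chain": False}
shop_y = {"advertises": True, "is_chain": True}
cues = [lambda s: s["advertises"], lambda s: s["is_chain"]]
print(take_the_best(shop_x, shop_y, cues))   # -> shop_y (second cue discriminates)
```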
Abstract:
The research presented here aimed to identify which biases may influence strategic decision makers in Brazilian organizations located in the state of Rio de Janeiro. The work was based on a questionnaire adapted from Bazerman (2004). Drawing on studies of cognition, this author presents the heuristics and their respective biases that are the objects of this dissertation. The types of research used were bibliographic and field research. The field research was carried out with presidents and executive directors from the corporate environment. In their roles, they are responsible for the strategic decisions of Brazilian companies located in Brazil. The research revealed that the biases presented by Bazerman (2004) were identified in the executives interviewed.