530 results for Judgmental heuristics
Abstract:
To handle natural disasters, emergency areas are often designated across the territory, close to populated centres. Rescue services are located in these areas and respond with resources and materials for population relief. A method for automatically positioning these centres in case of a flood or an earthquake is presented. The positioning procedure consists of two distinct parts, developed by the research group of Prof Michael G. H. Bell at Imperial College, London, and refined and applied to real cases at the University of Bologna under the coordination of Prof Ezio Todini. Certain requirements need to be observed, such as the maximum number of rescue points and the number of people involved. Initially, the candidate points are chosen from those proposed by the local civil protection services. We then calculate all possible routes from each candidate rescue point to all other points, generally using the concept of the "hyperpath", namely a set of paths each of which may be optimal. The attributes of the road network are of fundamental importance, both for calculating the ideal distance and for possible delays due to the event, measured in travel-time units. In a second phase, these distances are used to decide the optimal rescue point positions using heuristics. This second part works by "elimination": at the start, all candidate points are considered rescue centres. In every iteration we remove one point and calculate the impact its removal creates; in each case we delete the point that creates the least impact, until we reach the number of rescue centres we wish to keep.
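A minimal sketch of the elimination procedure described above, assuming a precomputed matrix of travel times between candidate rescue points and the demand points they serve; the impact measure (population-weighted travel time to the nearest open centre) and all names are illustrative, not the exact formulation used by the authors.

```python
# Illustrative sketch of the "elimination" (drop) heuristic: start with all
# candidate points open and repeatedly close the one whose removal increases
# the total impact the least, until k centres remain.
# travel_time[c][d] is the (possibly event-degraded) travel time from
# candidate c to demand point d; population[d] weights each demand point.

def total_impact(open_centres, demand_points, travel_time, population):
    """Population-weighted travel time to the nearest open centre."""
    return sum(
        population[d] * min(travel_time[c][d] for c in open_centres)
        for d in demand_points
    )

def drop_heuristic(candidates, demand_points, travel_time, population, k):
    open_centres = set(candidates)
    while len(open_centres) > k:
        victim, best = None, None
        for c in open_centres:
            impact = total_impact(open_centres - {c}, demand_points,
                                  travel_time, population)
            if best is None or impact < best:
                victim, best = c, impact
        open_centres.remove(victim)
    return open_centres
```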
Abstract:
The aim of this thesis was to investigate the respective contributions of prior information and sensorimotor constraints to action understanding, and to estimate their consequences for the evolution of human social learning. Even though a huge amount of literature is dedicated to the study of action understanding and its role in social learning, these issues are still largely debated. Here, I critically describe two main perspectives. The first perspective interprets faithful social learning as the outcome of a fine-grained representation of others' actions and intentions that requires sophisticated socio-cognitive skills. In contrast, the second perspective highlights the role of simpler decision heuristics, the recruitment of which is determined by individual and ecological constraints. The present thesis aims to show, through four experimental works, that these two contributions are not mutually exclusive. A first study investigates the role of the inferior frontal cortex (IFC), the anterior intraparietal area (AIP) and the primary somatosensory cortex (S1) in the recognition of other people's actions, using a transcranial magnetic stimulation adaptation paradigm (TMSA). The second work studies whether, and how, higher-order and lower-order prior information (acquired from the probabilistic sampling of past events vs. derived from an estimation of the biomechanical constraints of observed actions) interact during the prediction of other people's intentions. Using a single-pulse TMS procedure, the third study investigates whether the interaction between these two classes of priors modulates motor system activity. The fourth study tests the extent to which behavioral and ecological constraints influence the emergence of faithful social learning strategies at the population level. The collected data help to elucidate how higher-order and lower-order prior expectations interact during action prediction, and clarify the neural mechanisms underlying this interaction. Finally, these works open promising perspectives for a better understanding of social learning, with possible extensions to animal models.
Abstract:
The Capacitated Location-Routing Problem (CLRP) is an NP-hard problem, since it generalizes two well-known NP-hard problems: the Capacitated Facility Location Problem (CFLP) and the Capacitated Vehicle Routing Problem (CVRP). The Multi-Depot Vehicle Routing Problem (MDVRP) is also known to be NP-hard, since it is a generalization of the well-known Vehicle Routing Problem (VRP), which arises with a single depot. This thesis presents heuristic algorithms based on the well-known granular search idea introduced by Toth and Vigo (2003) to solve the CLRP and the MDVRP. Extensive computational experiments on benchmark instances for both problems have been performed to determine the effectiveness of the proposed algorithms. This work is organized as follows: Chapter 1 gives a detailed overview and a methodological review of the literature on the Capacitated Location-Routing Problem (CLRP) and the Multi-Depot Vehicle Routing Problem (MDVRP). Chapter 2 describes a two-phase hybrid heuristic algorithm to solve the CLRP. Chapter 3 presents a computational comparison of heuristic algorithms for the CLRP. Chapter 4 presents a hybrid granular tabu search approach for solving the MDVRP.
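The granular search idea referred to above restricts local-search moves to a sparse subset of promising arcs. A minimal sketch under simplified assumptions: the granular arc set keeps arcs whose cost is below a threshold proportional to the average arc cost, plus all arcs incident to a depot; the threshold rule, parameter beta and all names are illustrative rather than the exact scheme of Toth and Vigo (2003).

```python
# Illustrative sketch of a granular arc set: local-search neighbourhoods are
# built only from "short" arcs (cost below a granularity threshold) and arcs
# incident to depots, which drastically reduces the moves examined per iteration.

def granular_arcs(cost, depots, beta=1.2):
    """cost: dict of dicts with cost[i][j]; depots: set of depot nodes."""
    nodes = list(cost)
    pairs = [(i, j) for i in nodes for j in nodes
             if i != j and i not in depots and j not in depots]
    threshold = beta * sum(cost[i][j] for i, j in pairs) / len(pairs)
    return {(i, j) for i in nodes for j in nodes if i != j and
            (cost[i][j] <= threshold or i in depots or j in depots)}
```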
Abstract:
Economic decisions, like everyday decisions, depend on the activity of brain regions that contribute to controlling the individual steps of a decision. The activation and deactivation of these brain regions can be visualized with modern imaging techniques such as functional magnetic resonance imaging (fMRI). This publication gives an overview of the interdisciplinary scientific field of "neuroeconomics", a young research area within the neurosciences. The overview focuses on seven main aspects of economic and financial decisions: 1. How are economic parameters such as the value and utility of a reward, gain or loss, risk and uncertainty represented in specific brain regions? 2. In what specific way do anatomically defined areas of the brain contribute to the decision process? 3. How are decision processes disturbed by lesions of decision-relevant brain areas? 4. How are the brain regions involved in decision making interconnected, so that their interaction brings the decision about? 5. To what extent is the decision process determined by personality traits, by genetic variation of neuronal functions, and by physiological regulation, for example through hormones? 6. How does the decision process depend on the social and cultural environment of the decision maker? 7. How are heuristics or intuitions used when information about the decision options is incomplete, and to what extent can decisions be influenced by biases? The central part of this publication provides a summarizing review of the results of neuroeconomic studies that use the fMRI technique (up to June 2010).
Abstract:
When designing metaheuristic optimization methods, there is a trade-off between application range and effectiveness. For large real-world instances of combinatorial optimization problems, out-of-the-box metaheuristics often fail, and optimization methods need to be adapted to the problem at hand. Knowledge about the structure of high-quality solutions can be exploited by introducing a so-called bias into one of the components of the metaheuristic used. These problem-specific adaptations make it possible to increase search performance. This thesis analyzes the characteristics of high-quality solutions for three constrained spanning tree problems: the optimal communication spanning tree problem, the quadratic minimum spanning tree problem and the bounded diameter minimum spanning tree problem. Several relevant tree properties that should be examined when analyzing a constrained spanning tree problem are identified. Based on the insights gained into the structure of high-quality solutions, efficient and robust solution approaches are designed for each of the three problems. Experimental studies analyze the performance of the developed approaches compared with the current state of the art.
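As an illustration of introducing a bias into a metaheuristic component, the sketch below builds random spanning trees while favouring low-weight edges, reflecting the common observation that high-quality trees for such problems are composed largely of short edges. The sampling rule and parameter gamma are assumptions for the example, not the operators developed in the thesis.

```python
# Illustrative biased construction: edges are sampled with probability
# proportional to weight**(-gamma), so short edges are strongly preferred,
# and an edge is accepted only if it joins two components (Kruskal-style
# union-find).

import random

def biased_spanning_tree(nodes, edges, weight, gamma=2.0):
    parent = {v: v for v in nodes}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    pool = list(edges)
    tree = []
    while pool and len(tree) < len(nodes) - 1:
        probs = [weight[e] ** (-gamma) for e in pool]
        e = random.choices(pool, weights=probs, k=1)[0]
        pool.remove(e)
        ru, rv = find(e[0]), find(e[1])
        if ru != rv:
            parent[ru] = rv
            tree.append(e)
    return tree
```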
Abstract:
In the past few decades, integrated circuits have become a major part of everyday life. Every circuit that is created needs to be tested for faults so that faulty circuits are not sent to end users. The creation of these tests is time-consuming, costly and difficult to perform on larger circuits. This research presents a novel method for fault detection and test pattern reduction in integrated circuitry under test. By leveraging the FPGA's reconfigurability and parallel processing capabilities, a speed-up in fault detection can be achieved over previous computer simulation techniques. This work presents the following contributions to the field of stuck-at fault detection: We present a new method for inserting faults into a circuit netlist. Given any circuit netlist, our tool can insert multiplexers at the correct internal nodes to aid fault emulation on reconfigurable hardware. We present a parallel method of fault emulation. The benefit of the FPGA is not only its ability to implement any circuit, but also its ability to process data in parallel. This research utilizes this to create a more efficient emulation method that implements numerous copies of the same circuit in the FPGA. We also present a new method for identifying the most efficient faults. Most methods for determining the minimum number of inputs that cover the most faults require sophisticated software programs that use heuristics. By utilizing hardware, this research is able to process data faster and use a simpler method for minimizing the number of inputs.
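A small sketch of the multiplexer-insertion step described above, assuming a simple list-of-gates netlist representation; the data structure and the fault-enable and fault-value signal names are assumptions for illustration.

```python
# Illustrative sketch: splice a 2:1 multiplexer onto an internal net so that,
# when the fault-enable signal is asserted, every consumer of the net sees the
# injected constant instead of the original value (emulating a stuck-at-0 or
# stuck-at-1 fault on that net).

def insert_fault_mux(netlist, net, fault_id):
    """netlist: list of gates like {'type': 'AND', 'inputs': [...], 'output': 'n5'}."""
    mux_out = f"{net}_faulty"
    select = f"fault_en_{fault_id}"   # 1 -> inject the fault value
    value = f"fault_val_{fault_id}"   # stuck-at value (0 or 1)
    for gate in netlist:
        gate['inputs'] = [mux_out if i == net else i for i in gate['inputs']]
    netlist.append({'type': 'MUX2',
                    'inputs': [net, value, select],
                    'output': mux_out})
    return select, value
```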
Abstract:
Software repositories have been getting a lot of attention from researchers in recent years. In order to analyze software repositories, it is necessary to first extract raw data from the version control and problem tracking systems. This poses two challenges: (1) extraction requires a non-trivial effort, and (2) the results depend on the heuristics used during extraction. These challenges burden researchers who are new to the community and make it difficult to benchmark software repository mining, since it is almost impossible to reproduce experiments done by another team. In this paper we present the TA-RE corpus. TA-RE collects extracted data from software repositories in order to build a collection of projects that will simplify the extraction process. Additionally, the collection can be used for benchmarking. As a first step we propose an exchange language capable of making sharing and reusing data as simple as possible.
Abstract:
Self-stabilization is a property of a distributed system such that, regardless of the legitimacy of its current state, the system behavior eventually reaches a legitimate state and remains legitimate thereafter. The elegance of self-stabilization stems from the fact that it distinguishes distributed systems by a strong fault tolerance property against arbitrary state perturbations. The difficulty of designing and reasoning about self-stabilization has been witnessed by many researchers; most of the existing techniques for the verification and design of self-stabilization are either brute-force or adopt manual approaches not amenable to automation. In this dissertation, we first investigate the possibility of automatically designing self-stabilization through global state space exploration. In particular, we develop a set of heuristics for automating the addition of recovery actions to distributed protocols on various network topologies. Our heuristics exploit both the computational power of a single workstation and the parallelism available on computer clusters. We obtain existing and new stabilizing solutions for classical protocols such as maximal matching, ring coloring, mutual exclusion, leader election and agreement. Second, we consider a foundation for local reasoning about self-stabilization; i.e., we study the global behavior of the distributed system by exploring the state space of just one of its components. It turns out that local reasoning about deadlocks and livelocks is possible for an interesting class of protocols whose proof of stabilization is otherwise complex. In particular, we provide necessary and sufficient conditions, verifiable in the local state space of every process, for global deadlock- and livelock-freedom of protocols on ring topologies. Local reasoning potentially circumvents two fundamental problems that complicate the automated design and verification of distributed protocols: (1) state explosion and (2) partial state information. Moreover, local proofs of convergence are independent of the number of processes in the network, thereby enabling our assertions about deadlocks and livelocks to apply to rings of arbitrary size without worrying about state explosion.
Abstract:
Undergraduate education has a historical tradition of preparing students to meet the problem-solving challenges they will encounter in work, civic, and personal contexts. This thesis research was conducted to study the role of rhetoric in engineering problem solving and decision making and to pose pedagogical strategies for preparing undergraduate students for workplace problem solving. Exploratory interviews with engineering managers as well as the heuristic analyses of engineering A3 project planning reports suggest that Aristotelian rhetorical principles are critical to the engineer's success: Engineers must ascertain the rhetorical situation surrounding engineering problems; apply and adapt invention heuristics to conduct inquiry; draw from their investigation to find innovative solutions; and influence decision making by navigating workplace decision-making systems and audiences using rhetorically constructed discourse. To prepare undergraduates for workplace problem solving, university educators are challenged to help undergraduates understand the exigence and realize the kairotic potential inherent in rhetorical problem solving. This thesis offers pedagogical strategies that focus on mentoring learning communities in problem-posing experiences that are situated in many disciplinary, work, and civic contexts. Undergraduates build a flexible rhetorical technê for problem solving as they navigate the nuances of relevant problem-solving systems through the lens of rhetorical practice.
Abstract:
Planning in realistic domains typically involves reasoning under uncertainty, operating under time and resource constraints, and finding the optimal subset of goals to work on. Creating optimal plans that consider all of these features is a computationally complex, challenging problem. This dissertation develops an AO* search based planner named CPOAO* (Concurrent, Probabilistic, Over-subscription AO*) which incorporates durative actions, time and resource constraints, concurrent execution, over-subscribed goals, and probabilistic actions. To handle concurrent actions, action combinations rather than individual actions are taken as plan steps. Plan optimization is explored by adding two novel aspects to plans. First, parallel steps that serve the same goal are used to increase the plan’s probability of success. Traditionally, only parallel steps that serve different goals are used to reduce plan execution time. Second, actions that are executing but are no longer useful can be terminated to save resources and time. Conventional planners assume that all actions that were started will be carried out to completion. To reduce the size of the search space, several domain independent heuristic functions and pruning techniques were developed. The key ideas are to exploit dominance relations for candidate action sets and to develop relaxed planning graphs to estimate the expected rewards of states. This thesis contributes (1) an AO* based planner to generate parallel plans, (2) domain independent heuristics to increase planner efficiency, and (3) the ability to execute redundant actions and to terminate useless actions to increase plan efficiency.
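One of the pruning ideas mentioned above, dominance between candidate action sets, can be illustrated with a simple filter: a combination is discarded if some other candidate offers at least the same expected reward for no more resource cost and no longer duration. The attributes and the dominance test are assumptions for the example, not the planner's exact criteria.

```python
# Illustrative dominance pruning over candidate action combinations.
# Each combination is a dict with 'reward', 'cost' and 'duration'.

def prune_dominated(combinations):
    kept = []
    for a in combinations:
        dominated = any(
            b is not a
            and b['reward'] >= a['reward']
            and b['cost'] <= a['cost']
            and b['duration'] <= a['duration']
            and (b['reward'] > a['reward'] or b['cost'] < a['cost']
                 or b['duration'] < a['duration'])
            for b in combinations
        )
        if not dominated:
            kept.append(a)
    return kept
```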
Abstract:
Back-in-time debuggers are extremely useful tools for identifying the causes of bugs, as they allow us to inspect the past states of objects no longer present in the current execution stack. Unfortunately, the "omniscient" approaches that try to remember all previous states are impractical because they either consume too much space or are far too slow. Several approaches rely on heuristics to limit these penalties, but they ultimately end up throwing away too much relevant information. In this paper we propose a practical approach to back-in-time debugging that attempts to keep track of only the relevant past data. In contrast to other approaches, we keep object history information together with the regular objects in the application memory. Although seemingly counter-intuitive, this approach has the effect that past data that is not reachable from current application objects (and hence no longer relevant) is automatically garbage collected. In this paper we describe the technical details of our approach, and we present benchmarks that demonstrate that memory consumption stays within practical bounds. Furthermore, since our approach works at the virtual machine level, the performance penalty is significantly lower than with other approaches.
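The key mechanism, storing an object's history next to the object itself so that unreachable history is reclaimed by the ordinary garbage collector, can be sketched as follows. The class and attribute names are assumptions; the paper's approach is implemented at the virtual machine level rather than in application code.

```python
# Illustrative sketch: every attribute write records the previous value on the
# object itself.  When the object becomes unreachable, its history becomes
# unreachable too and is garbage collected along with it.

class HistoryRecording:
    def __init__(self):
        object.__setattr__(self, '_history', [])

    def __setattr__(self, name, value):
        old = getattr(self, name, None)      # previous value, if any
        self._history.append((name, old))    # keep it with the object
        object.__setattr__(self, name, value)

    def past_values(self, name):
        return [v for (n, v) in self._history if n == name]
```

For instance, after p.x = 1 followed by p.x = 2, p.past_values('x') returns [None, 1]; once p itself is collected, its recorded history disappears with it.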
Abstract:
This paper presents the application of the Ant Colony System (ACS) algorithm to the sequencing of cross-transfer cars in a warehouse. We extend the basic Ant Colony Optimization (ACO) algorithm to minimize the processing time of a set of transport orders for the cross-transfer cars. Compared with a greedy algorithm, the ACO algorithm is competitive and fast. In many warehouse management systems, transport orders are executed according to the FIFO (first-in, first-out) principle. In this paper, the ACO algorithm is used instead to construct an optimal sequence of transport orders.
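A minimal sketch of one ant's sequence-construction step, in which the next transport order is chosen with probability proportional to the pheromone level raised to alpha times the inverse transition time raised to beta; the parameters, data structures and names are illustrative and not the exact ACS variant used in the paper.

```python
# Illustrative ACO construction of an order sequence: starting from a random
# order, the next order is sampled with probability proportional to
# pheromone**alpha * (1 / transition_time)**beta.

import random

def build_sequence(orders, transition_time, pheromone, alpha=1.0, beta=2.0):
    """transition_time and pheromone map (from_order, to_order) pairs to floats."""
    sequence = [random.choice(orders)]
    remaining = [o for o in orders if o != sequence[0]]
    while remaining:
        current = sequence[-1]
        weights = [(pheromone[(current, o)] ** alpha) *
                   ((1.0 / transition_time[(current, o)]) ** beta)
                   for o in remaining]
        nxt = random.choices(remaining, weights=weights, k=1)[0]
        sequence.append(nxt)
        remaining.remove(nxt)
    return sequence
```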
Abstract:
Cost-efficient operation while satisfying the performance and availability guarantees in Service Level Agreements (SLAs) is a challenge for Cloud Computing, as these are potentially conflicting objectives. We present a framework for SLA management based on multi-objective optimization. The framework features a forecasting model for determining the best virtual-machine-to-host allocation given the need to minimize SLA violations, energy consumption and resource waste. A comprehensive SLA management solution is proposed that uses event processing for monitoring and enables dynamic provisioning of virtual machines onto the physical infrastructure. We validated our implementation against several standard heuristics and were able to show that our approach performs significantly better.
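As a simplified illustration of the allocation decision, the sketch below scores each candidate host by a weighted penalty combining predicted SLA-violation risk, added energy and resource waste, and places the virtual machine on the lowest-penalty host with sufficient capacity. The weights, attribute names and scoring terms are assumptions for the example, not the paper's forecasting model.

```python
# Illustrative VM-to-host scoring heuristic balancing three objectives.

def choose_host(vm, hosts, w_sla=0.5, w_energy=0.3, w_waste=0.2):
    """vm: {'cpu', 'mem'}; hosts: dicts with capacity and load estimates."""
    best, best_score = None, None
    for h in hosts:
        if h['free_cpu'] < vm['cpu'] or h['free_mem'] < vm['mem']:
            continue  # not enough spare capacity
        sla_risk = h['predicted_load'] + vm['cpu'] / h['total_cpu']
        energy = vm['cpu'] * h['watts_per_cpu']
        waste = (h['free_cpu'] - vm['cpu']) + (h['free_mem'] - vm['mem'])
        score = w_sla * sla_risk + w_energy * energy + w_waste * waste
        if best_score is None or score < best_score:
            best, best_score = h, score
    return best
```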
Abstract:
In order to analyze software systems, it is necessary to model them. Static software models are commonly imported by parsing source code and related data. Unfortunately, building custom parsers for most programming languages is a non-trivial endeavour. This poses a major bottleneck for analyzing software systems programmed in languages for which importers do not already exist. Luckily, initial software models do not require detailed parsers, so it is possible to start analysis with a coarse-grained importer, which is then gradually refined. In this paper we propose an approach to "agile modeling" that exploits island grammars to extract initial coarse-grained models, parser combinators to enable gradual refinement of model importers, and various heuristics to recognize language structure, keywords and other language artifacts.
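A very small sketch of the island idea: only class and method declarations (the islands) are recognized and everything else is skipped as water, yielding a coarse-grained model that can later be refined. For brevity the sketch uses plain regular expressions on a Java-like syntax instead of parser combinators; the patterns and names are assumptions for the example.

```python
# Illustrative island extraction: recognize class and method headers, ignore
# all other text, and build a coarse class/method model.

import re

CLASS_RE = re.compile(r'\bclass\s+(\w+)')
METHOD_RE = re.compile(r'\b(\w+)\s*\([^)]*\)\s*\{')
KEYWORDS = {'if', 'for', 'while', 'switch', 'catch'}

def extract_coarse_model(source):
    model, current = [], None
    for line in source.splitlines():
        m = CLASS_RE.search(line)
        if m:
            current = {'name': m.group(1), 'methods': []}
            model.append(current)
            continue
        m = METHOD_RE.search(line)
        if m and current is not None and m.group(1) not in KEYWORDS:
            current['methods'].append(m.group(1))
    return model
```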
Abstract:
INFLUENCE OF ANCHORING ON MISCARRIAGE RISK PERCEPTION ASSOCIATED WITH AMNIOCENTESIS
Publication No. ___________
Regina Nuccio, BS
Supervisory Professor: Claire N. Singletary, MS, CGC
Amniocentesis is the most common invasive procedure performed during pregnancy (Eddleman et al., 2006). One important factor that women consider when making a decision about amniocentesis is the risk of miscarriage associated with the procedure. People use heuristics such as anchoring, the use of a prior belief regarding the magnitude of a risk as a frame of reference for synthesizing new information, to better understand the risks they encounter in their lives. This study aimed to determine women's perception of the miscarriage risk associated with amniocentesis before and after a genetic counseling session, and to determine which factors are most likely to anchor that perception. Most women perceived the risk as low or average pre-counseling and were likely to indicate the numeric risk of amniocentesis as <1%. A higher percentage of patients correctly identified the numeric risk as <1% post-counseling than pre-counseling. However, for the majority of patients (60%) the feeling about the risk did not change after the genetic counseling session, regardless of how they perceived the risk before discussing amniocentesis with a genetic counselor. Those whose risk perception did change after discussing amniocentesis with a genetic counselor showed a decreased risk perception (p<0.0001). Of the multitude of factors studied, only two showed significance: having a friend or relative with a personal or family history of a genetic disorder was associated with a lower risk perception (p=0.001), and already having a child was associated with a lower risk perception (p=0.038). The lack of significant factors may reflect the uniqueness of each patient's heuristic framework and reinforces the importance of genetic counseling to elucidate individual concerns.