13 results for HEURISTICS
in CentAUR: Central Archive University of Reading - UK
Abstract:
A new heuristic for the Steiner Minimal Tree problem is presented here. The method described is based on the detection of particular sets of nodes in networks, the “Hot Spot” sets, which are used to obtain better approximations of the optimal solutions. An algorithm is also proposed which is capable of improving the solutions obtained by classical heuristics, by means of a stirring process of the nodes in solution trees. Classical heuristics and an enumerative method are used as comparison terms in the experimental analysis, which demonstrates the effectiveness of the heuristic discussed in this paper.
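For context, the sketch below is not the hot-spot method itself (the abstract does not specify it in enough detail), but the kind of classical heuristic used as a comparison term: a shortest-path construction that grows a tree by repeatedly attaching the cheapest remaining terminal. The dict-of-dicts graph representation is an assumption for illustration.

    import heapq

    def dijkstra(graph, source):
        # graph: node -> {neighbour: edge_weight}; returns distances and predecessors
        dist, prev, heap = {source: 0}, {}, [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue
            for v, w in graph[u].items():
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        return dist, prev

    def shortest_path_steiner(graph, terminals):
        # Grow a tree from the first terminal; at each step connect the terminal
        # that is cheapest to reach from any node already in the tree.
        terminals = list(terminals)
        tree_nodes, tree_edges = {terminals[0]}, set()
        remaining = set(terminals[1:])
        while remaining:
            best = None  # (cost, terminal, attach_node, predecessor_map)
            for t in remaining:
                dist, prev = dijkstra(graph, t)
                attach = min(tree_nodes, key=lambda n: dist.get(n, float("inf")))
                cost = dist.get(attach, float("inf"))
                if best is None or cost < best[0]:
                    best = (cost, t, attach, prev)
            _, t, attach, prev = best
            path = [attach]                      # walk back from the tree to t
            while path[-1] != t:
                path.append(prev[path[-1]])
            tree_nodes.update(path)
            tree_edges.update(zip(path, path[1:]))
            remaining -= tree_nodes              # terminals absorbed along the path
        return tree_edges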
Abstract:
Recognition as a cue to judgment in a novel, multi-option domain (the Sunday Times Rich List) is explored. As in previous studies, participants were found to make use of name recognition as a cue to the presumed wealth of individuals. Recognized names were judged, at above-chance levels, to be the richest of the set presented. This effect persisted in situations in which more than one name was recognized; recognition was used as an inclusion criterion for the sub-set of names to be considered the richest of the set presented. However, when the question was reversed and a “poorest” judgment was required, use of recognition as an exclusion criterion was observed only when a single name was recognized. Reaction times also distinguish the “richest” and “poorest” questions: judgments took longest in the “richest” condition when none of the options was recognized, and longest in the “poorest” condition when all of the names presented were recognized. Implications for decision-making using simple heuristics are discussed.
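As a loose illustration of the decision rule described here, the following sketch treats recognition as an inclusion criterion for “richest” judgments, and as an exclusion criterion for “poorest” judgments only when a single name is recognised. It is a deliberately simplified, hypothetical model of the reported behaviour, not the authors' analysis.

    import random

    def judge(options, recognised, question="richest"):
        # Simplified decision rule: `recognised` is the subset of `options`
        # the participant recognises (hypothetical model of the reported data).
        known = [name for name in options if name in recognised]
        unknown = [name for name in options if name not in recognised]
        if question == "richest":
            # Recognition as an inclusion criterion: choose among recognised names.
            pool = known or options
        else:
            # "poorest": recognition works as an exclusion criterion only when
            # exactly one name is recognised; otherwise fall back to guessing.
            pool = unknown if len(known) == 1 else options
        return random.choice(pool)

    # Example: with one recognised name, it is picked as "richest" and
    # excluded from the "poorest" guess.
    names = ["A. Unknown", "B. Famous", "C. Unknown"]
    print(judge(names, {"B. Famous"}, "richest"))
    print(judge(names, {"B. Famous"}, "poorest"))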
Abstract:
“Fast & frugal” heuristics represent an appealing way of implementing bounded rationality and decision-making under pressure. The recognition heuristic is the simplest and most fundamental of these heuristics. Simulation and experimental studies have shown that this ignorance-driven inference can prove superior to knowledge-based inference (Borges, Goldstein, Ortmann & Gigerenzer, 1999; Goldstein & Gigerenzer, 2002) and have shown how the heuristic could develop from ACT-R’s forgetting function (Schooler & Hertwig, 2005). Mathematical analyses also demonstrate that, under certain conditions, a “less-is-more effect” will always occur (Goldstein & Gigerenzer, 2002). The further analyses presented in this paper show, however, that these conditions may constitute a special case and that the less-is-more effect in decision-making is subject to the moderating influence of the number of options to be considered and the framing of the question.
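The less-is-more analysis referred to here rests on the two-alternative accuracy formula of Goldstein and Gigerenzer (2002), in which expected accuracy depends on the recognition validity alpha, the knowledge validity beta, and the number n of N items recognised. A minimal sketch, with purely illustrative parameter values:

    def expected_accuracy(n, N, alpha, beta):
        # Proportion correct in two-alternative comparisons when n of N items
        # are recognised; alpha = recognition validity, beta = knowledge validity.
        pairs = N * (N - 1)
        p_one = 2 * n * (N - n) / pairs          # exactly one item recognised
        p_none = (N - n) * (N - n - 1) / pairs   # neither recognised: guess
        p_both = n * (n - 1) / pairs             # both recognised: use knowledge
        return p_one * alpha + p_none * 0.5 + p_both * beta

    # Illustrative values: when alpha > beta the curve peaks before n = N,
    # i.e. recognising fewer items yields higher expected accuracy.
    N, alpha, beta = 100, 0.8, 0.6
    curve = [expected_accuracy(n, N, alpha, beta) for n in range(N + 1)]
    print(max(range(N + 1), key=curve.__getitem__))   # peak is at some n < N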
Abstract:
The Web's link structure (termed the Web Graph) is a richly connected set of Web pages. Current applications use this graph for indexing and information retrieval purposes. In contrast, the relationship between the Web Graph and application is reversed here by letting the structure of the Web Graph influence the behaviour of an application. Presents a novel Web crawling agent, AlienBot, whose output is orthogonally coupled to the enemy generation strategy of a computer game. The Web Graph guides AlienBot, causing it to generate a stochastic process. Shows the effectiveness of such unorthodox coupling for both the playability of the game and the heuristics of the Web crawler. In addition, presents the results of the sample of Web pages collected by the crawling process. In particular, shows: how AlienBot was able to identify the power law inherent in the link structure of the Web; that 61.74 per cent of Web pages use some form of scripting technology; that the size of the Web can be estimated at just over 5.2 billion pages; and that less than 7 per cent of Web pages fully comply with some variant of (X)HTML.
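A minimal sketch of the kind of coupling described, assuming a toy breadth-first crawler, a placeholder seed URL, and a purely illustrative enemy-generation rule; AlienBot's actual crawling strategy and the game's generation logic are not reproduced here.

    import re
    import urllib.request
    from collections import deque

    HREF = re.compile(rb'href="(https?://[^"#]+)"', re.IGNORECASE)

    def crawl(seed, max_pages=20):
        # Tiny breadth-first crawler (a stand-in for AlienBot): yields each
        # visited URL together with its out-degree (number of outgoing links).
        seen, queue, visited = {seed}, deque([seed]), 0
        while queue and visited < max_pages:
            url = queue.popleft()
            try:
                html = urllib.request.urlopen(url, timeout=5).read()
            except Exception:
                continue
            visited += 1
            links = [m.decode(errors="ignore") for m in HREF.findall(html)]
            yield url, len(links)
            for link in links:
                if link not in seen:
                    seen.add(link)
                    queue.append(link)

    def spawn_enemies(out_degree, scale=10):
        # Illustrative coupling: pages with richer link structure spawn larger
        # enemy waves in the game.
        return ["alien-%d" % i for i in range(1 + min(out_degree // scale, 9))]

    for url, degree in crawl("https://example.com"):
        wave = spawn_enemies(degree)
        print(url, degree, len(wave))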
Abstract:
The foundation construction process is a key determinant of successful construction engineering. Among the many deep excavation methods, the diaphragm wall construction method is used more frequently in Taiwan than anywhere else in the world. The traditional approach to sequencing diaphragm wall unit construction activities is to establish each phase of the sequence by heuristics. However, this brings the final phase of the works into conflict with unit construction and affects the planned construction time. To avoid this situation, this study applies management science to diaphragm wall unit construction and formulates the sequencing task as a multi-objective combinatorial optimization problem. Because the mathematical model is multi-objective and combinatorially explosive (the problem is NP-complete), a 2-type Self-Learning Neural Network (SLNN) is proposed to solve the sequencing problem for N = 12, 24, and 36 diaphragm wall units. To assess the reliability of the results, the SLNN is compared with a random search method. The tests show that the SLNN is superior to random search in both solution quality and solving efficiency.
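The 2-type SLNN itself is not described in enough detail here to reproduce, but the random-search baseline it is compared against can be sketched as follows; the cost function below is a hypothetical scalarisation for illustration, not the study's actual multi-objective model.

    import random

    def random_search(cost, n_units, iterations=10000, seed=0):
        # Sample random permutations of the N diaphragm wall units and keep
        # the sequence with the lowest cost; `cost` maps a tuple of unit
        # indices to a scalar (a scalarisation of the multiple objectives).
        rng = random.Random(seed)
        best_seq, best_cost = None, float("inf")
        for _ in range(iterations):
            seq = list(range(n_units))
            rng.shuffle(seq)
            c = cost(tuple(seq))
            if c < best_cost:
                best_seq, best_cost = seq, c
        return best_seq, best_cost

    def example_cost(seq):
        # Hypothetical two-term cost: penalise physically adjacent units being
        # excavated back to back, plus a drift term; not the study's actual model.
        adjacency = sum(1 for a, b in zip(seq, seq[1:]) if abs(a - b) == 1)
        drift = sum(abs(pos - unit) for pos, unit in enumerate(seq))
        return 5 * adjacency + drift

    print(random_search(example_cost, n_units=12)[1])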
Abstract:
The current state of the art in the planning and coordination of autonomous vehicles is based upon the presence of speed lanes. In a traffic scenario with a large diversity between vehicles, the removal of speed lanes can generate a significantly higher traffic bandwidth. Vehicle navigation in such unorganized traffic is considered. An evolutionary trajectory planning technique has the advantage of making driving efficient and safe; however, it also has to overcome the hurdle of computational cost. In this paper, we propose a real-time genetic algorithm with Bezier curves for trajectory planning. The main contribution is the integration of vehicle-following and overtaking behaviour for general traffic as heuristics for the coordination between vehicles. The resultant coordination strategy is fast and near-optimal. As the vehicles move, uncertainties may arise; these are constantly adapted to, and may even lead to the cancellation of an overtaking procedure or the initiation of one. Higher-level planning is performed by Dijkstra's algorithm, which indicates the route to be followed by the vehicle in a road network. Re-planning is carried out when a road blockage or obstacle is detected. Experimental results confirm the success of the algorithm subject to optimal high- and low-level planning, re-planning and overtaking.
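As a sketch of the trajectory representation described, assuming a cubic Bezier curve between the current pose and a target point, hypothetical control-point bounds, and an illustrative obstacle-clearance fitness; the paper's full genetic algorithm and coordination heuristics are not reproduced.

    import random

    def bezier(p0, p1, p2, p3, t):
        # Point on a cubic Bezier curve at parameter t in [0, 1].
        u = 1 - t
        return tuple(u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
                     for a, b, c, d in zip(p0, p1, p2, p3))

    def sample_trajectory(p0, p3, bounds, n_points=20, rng=random):
        # One candidate (a GA individual): two random control points define a
        # Bezier trajectory from the current pose p0 to the target p3.
        (xlo, xhi), (ylo, yhi) = bounds
        p1 = (rng.uniform(xlo, xhi), rng.uniform(ylo, yhi))
        p2 = (rng.uniform(xlo, xhi), rng.uniform(ylo, yhi))
        path = [bezier(p0, p1, p2, p3, i / (n_points - 1)) for i in range(n_points)]
        return (p1, p2), path

    def fitness(path, obstacles, clearance=1.0):
        # Illustrative objective: path length plus a penalty for points that
        # pass within `clearance` of circular obstacles given as (x, y, radius).
        length = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                     for (x1, y1), (x2, y2) in zip(path, path[1:]))
        penalty = sum(1.0 for (x, y) in path for (ox, oy, r) in obstacles
                      if ((x - ox) ** 2 + (y - oy) ** 2) ** 0.5 < r + clearance)
        return length + 50.0 * penalty

    # Example: pick the best of a handful of random candidates.
    candidates = [sample_trajectory((0, 0), (50, 10), ((0, 50), (-10, 20)))
                  for _ in range(30)]
    best = min(candidates, key=lambda c: fitness(c[1], obstacles=[(25, 5, 3)]))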
Abstract:
Sophisticated, intentional decision-making is a hallmark of mature, self-aware behaviour. Although neural, psychological, interpersonal, and socioeconomic elements that contribute to such adaptive, foresighted behaviour mature and/or change throughout the life-span, here we concentrate on relevant maturational processes that take place during adolescence, a period of disproportionate developmental opportunity and risk. A brief, eclectic overview is presented of recent evidence, new challenges, and current thinking on the fundamental mechanisms that mature throughout adolescence to support adaptive, self-controlled decision-making. This is followed by a proposal for the putative contribution of frontostriatal mechanisms to the moment-to-moment assembly of evaluative heuristics that mediate increased decision-making sophistication, promoting the maturation of self-regulated behaviour through adolescence and young adulthood.
Abstract:
Managers face hard choices between process and outcome systems of accountability in evaluating employees, but little is known about how managers resolve them. Building on the premise that political ideologies serve as uncertainty-reducing heuristics, two studies of working managers show that: (1) conservatives prefer outcome accountability and liberals prefer process accountability in an unspecified policy domain; (2) this split becomes more pronounced in a controversial domain (public schools) in which the foreground value is educational efficiency but reverses direction in a controversial domain (affirmative action) in which the foreground value is demographic equality; (3) managers who discover employees have subverted their preferred system favor tinkering over switching to an alternative system; (4) but bipartisan consensus arises when managers have clear evidence about employee trustworthiness and the tightness of the causal links between employee effort and success. These findings shed light on ideological and contextual factors that shape preferences for accountability systems.
Abstract:
Bloom filters are a data structure for storing data in a compressed form. They offer excellent space and time efficiency at the cost of some loss of accuracy (so-called lossy compression). This work presents a yes-no Bloom filter, a data structure consisting of two parts: the yes-filter, which is a standard Bloom filter, and the no-filter, which is another Bloom filter whose purpose is to represent those objects that were recognised incorrectly by the yes-filter (that is, to recognise the false positives of the yes-filter). By querying the no-filter after an object has been recognised by the yes-filter, we get a chance of rejecting it, which improves the accuracy of data recognition in comparison with a standard Bloom filter of the same total length. A further increase in accuracy is possible if one chooses the objects to include in the no-filter so that it recognises as many false positives as possible but no true positives, thus producing the most accurate yes-no Bloom filter among all yes-no Bloom filters. This paper studies how optimization techniques can be used to maximize the number of false positives recognised by the no-filter, under the constraint that it recognise no true positives. To achieve this aim, an Integer Linear Program (ILP) is proposed for the optimal selection of false positives. In practice the problem size is normally large, making the optimal solution intractable. Exploiting the similarity of the ILP to the Multidimensional Knapsack Problem, an Approximate Dynamic Programming (ADP) model is developed that uses a reduced ILP for the value function approximation. Numerical results show that the ADP model performs best in comparison with a number of heuristics as well as the CPLEX built-in branch-and-bound solver, and it is therefore recommended for use in yes-no Bloom filters. In the wider context of the study of lossy compression algorithms, our research is an example of how the arsenal of optimization methods can be applied to improving the accuracy of compressed data.
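A minimal sketch of the yes-no Bloom filter query logic described above, assuming simple double-hashing hash functions; the ILP/ADP selection of which false positives to place in the no-filter is not reproduced, and the no-filter here is simply filled with whatever false positives are supplied.

    import hashlib

    class BloomFilter:
        # Plain Bloom filter; k hash positions derived by double hashing.
        def __init__(self, m_bits, k_hashes):
            self.m, self.k = m_bits, k_hashes
            self.bits = bytearray((m_bits + 7) // 8)

        def _positions(self, item):
            digest = hashlib.sha256(item.encode()).digest()
            h1 = int.from_bytes(digest[:8], "big")
            h2 = int.from_bytes(digest[8:16], "big") | 1
            return [(h1 + i * h2) % self.m for i in range(self.k)]

        def add(self, item):
            for p in self._positions(item):
                self.bits[p // 8] |= 1 << (p % 8)

        def __contains__(self, item):
            return all(self.bits[p // 8] & (1 << (p % 8))
                       for p in self._positions(item))

    class YesNoBloomFilter:
        # Answers "yes" only if the yes-filter accepts the item and the
        # no-filter (built from known false positives) does not.
        def __init__(self, members, known_false_positives, m_yes, m_no, k=4):
            members = set(members)
            self.yes, self.no = BloomFilter(m_yes, k), BloomFilter(m_no, k)
            for item in members:
                self.yes.add(item)
            for item in known_false_positives:
                if item not in members and item in self.yes:
                    self.no.add(item)   # never store a true positive here

        def __contains__(self, item):
            return item in self.yes and item not in self.no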
Abstract:
This paper argues that the intellectual contribution of Alan Rugman reflects his distinctive research methodology. Alan Rugman trained as an economist, and relied heavily on economic principles throughout his work. He believed that one good theory was sufficient for IB studies, and that theory, he maintained, was internalisation theory. He rejected theoretical pluralism, and believed that IB suffered from a surfeit of theories. Alan was a positivist. The test of a good theory was that it led to clear predictions which were corroborated by empirical evidence. Many IB theories, Alan believed, were weak; their proliferation sowed confusion and they needed to be refuted. Alan’s interpretation of internalisation was, however, unconventional in some respects. He played down the trade-offs presented in Coase’s original work, and substituted heuristics in their place. Instead of analysing internalisation as a context-specific choice between alternative contractual arrangements, he presented it as a strategic imperative for firms possessing strong knowledge advantages. His heuristics did not apply to every possible case, but in Alan’s view they applied in the great majority of cases and were therefore a basis for management action.