530 results for Judgmental heuristics
Abstract:
This paper investigates improved action selection for online policy learning in robotic scenarios using reinforcement learning (RL) algorithms. Since finding control policies with any RL algorithm can be very time consuming, we propose combining RL algorithms with heuristic functions that select promising actions during the learning process. With this aim, we investigate the use of heuristics to increase the rate of convergence of RL algorithms and contribute a new learning algorithm, Heuristically Accelerated Q-learning (HAQL), which incorporates heuristics for action selection into the Q-learning algorithm. Experimental results on robot navigation show that even very simple heuristic functions yield significant improvement in the learning rate.
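The general idea of heuristically accelerated action selection can be sketched as follows: the heuristic biases which action is exploited, while the Q-learning update itself is unchanged. This is a minimal illustration; the names (`Q`, `H`, `xi`) and parameter values are assumptions, not the paper's exact formulation.

```python
import random

def haql_select_action(Q, H, state, actions, xi=1.0, epsilon=0.1):
    """Epsilon-greedy selection in which a heuristic H biases exploitation.

    Q[state][a]: learned action values; H[state][a]: heuristic bonus
    (e.g. favouring actions that point toward the goal); xi weighs the
    heuristic's influence on the choice.
    """
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[state][a] + xi * H[state][a])

def q_learning_update(Q, state, action, reward, next_state, actions,
                      alpha=0.5, gamma=0.9):
    """Standard Q-learning update; the heuristic affects only action choice,
    so the learned values still converge as in plain Q-learning."""
    best_next = max(Q[next_state][a] for a in actions)
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])
```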
Abstract:
This work deals with the problem of minimizing the waste of space that occurs in the rotational placement of a set of irregular two-dimensional polygons inside a two-dimensional container. The problem is approached with a heuristic based on simulated annealing. Traditional "external penalization" techniques are avoided through the application of the no-fit polygon, which determines the collision-free area for each polygon before its placement. The simulated annealing controls the rotation applied, the placement, and the sequence of placement of the polygons. For each unplaced polygon, a limited-depth binary search is performed to find a scale factor that, when applied to the polygon, would allow it to fit in the container. A crystallization heuristic is proposed to increase the number of accepted solutions. The bottom-left and larger-first deterministic heuristics were also studied. The proposed process is suited to non-convex polygons and containers, and the containers may have holes. (C) 2009 Elsevier Ltd. All rights reserved.
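The limited-depth binary search for a feasible scale factor might look like this sketch, where `fits` stands for a hypothetical collision-free placement test (an assumption, not the paper's API):

```python
def fit_scale(fits, depth=10):
    """Limited-depth binary search for the largest scale factor in (0, 1]
    that lets a polygon fit in the container.

    fits(s): returns True when the polygon scaled by s can be placed
    collision-free; depth bounds the number of bisection steps."""
    if fits(1.0):
        return 1.0  # the polygon already fits unscaled
    lo, hi = 0.0, 1.0
    for _ in range(depth):
        mid = (lo + hi) / 2
        if fits(mid):
            lo = mid  # feasible: try a larger scale
        else:
            hi = mid  # infeasible: shrink further
    return lo
```

The returned scale is a conservative (feasible) value within `2**-depth` of the true threshold.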
Abstract:
This work deals with the problem of minimizing the waste of space that occurs in the rotational placement of a set of irregular two-dimensional items inside a two-dimensional container. The problem is approached with a heuristic based on Simulated Annealing (SA) with an adaptive neighborhood. The objective function is evaluated in a constructive approach, in which the items are placed sequentially. The placement is governed by three types of parameters: the sequence of placement, the rotation angle, and the translation. The rotation and translation applied to each polygon are cyclic continuous parameters, while the sequence of placement defines a combinatorial problem, so both cyclic continuous and discrete parameters must be controlled. The approaches described in the literature deal with only one type of parameter (sequence of placement or translation). In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate.
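In generic form, an adaptive neighborhood tracks each continuous parameter's acceptance rate and rescales its step size accordingly, alongside the usual Metropolis acceptance rule. The target rate and scaling factor below are illustrative choices, not the paper's values.

```python
import math
import random

def metropolis_accept(delta, temperature):
    """Standard SA acceptance: always keep improvements (delta <= 0) and
    keep worsening moves with probability exp(-delta / T)."""
    return delta <= 0 or random.random() < math.exp(-delta / temperature)

def adapt_step(step, acceptance_rate, target=0.4, factor=1.2):
    """Adaptive neighborhood for one continuous parameter: widen the step
    when its moves are accepted too often (search is too timid), shrink it
    when they are rarely accepted (search is too aggressive)."""
    return step * factor if acceptance_rate > target else step / factor
```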
Abstract:
This paper addresses the minimization of the mean absolute deviation from a common due date in a two-machine flowshop scheduling problem. We present heuristics that use an algorithm, based on proposed properties, which obtains an optimal schedule for a given job sequence. A new set of benchmark problems is presented for the purpose of evaluating the heuristics. Computational experiments show that the developed heuristics outperform results found in the literature for problems of up to 500 jobs. (C) 2007 Elsevier Ltd. All rights reserved.
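For a given job sequence, machine-2 completion times and the mean absolute deviation from the common due date can be evaluated directly. This sketch assumes per-job processing-time lists `p1` and `p2` (illustrative names, not the paper's notation).

```python
def flowshop_completions(seq, p1, p2):
    """Completion times on machine 2 for permutation `seq` in a
    two-machine flowshop: a job starts on machine 2 only after it finishes
    on machine 1 and machine 2 becomes free."""
    t1 = t2 = 0
    completions = []
    for j in seq:
        t1 += p1[j]                 # job j leaves machine 1
        t2 = max(t1, t2) + p2[j]    # job j finishes on machine 2
        completions.append(t2)
    return completions

def mean_abs_deviation(completions, due_date):
    """Mean absolute deviation of completion times from a common due date."""
    return sum(abs(c - due_date) for c in completions) / len(completions)
```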
Abstract:
In this paper, we consider a real-life heterogeneous-fleet vehicle routing problem with time windows and split deliveries that arises in a major Brazilian retail group. A single depot serves 519 stores of the group distributed across 11 Brazilian states. To find good solutions to this problem, we propose heuristics as initial solutions and a scatter search (SS) approach. The produced solutions are then compared with the routes actually covered by the company. Our results show that the total distribution cost can be reduced significantly when such methods are used. Experimental testing with benchmark instances is used to assess the merit of the proposed procedure. (C) 2008 Published by Elsevier B.V.
Abstract:
Scheduling parallel and distributed applications efficiently onto grid environments is a difficult task and a great variety of scheduling heuristics has been developed aiming to address this issue. A successful grid resource allocation depends, among other things, on the quality of the available information about software artifacts and grid resources. In this article, we propose a semantic approach to integrate selection of equivalent resources and selection of equivalent software artifacts to improve the scheduling of resources suitable for a given set of application execution requirements. We also describe a prototype implementation of our approach based on the Integrade grid middleware and experimental results that illustrate its benefits. Copyright (C) 2009 John Wiley & Sons, Ltd.
Abstract:
In this work, a broad analysis of local search multiuser detection (LS-MUD) for direct sequence/code division multiple access (DS/CDMA) systems under multipath channels is carried out, considering the performance-complexity trade-off. The robustness of the LS-MUD to variations in loading, E(b)/N(0), the near-far effect, the number of fingers of the Rake receiver, and errors in the channel coefficient estimates is verified. A comparative analysis of the bit error rate (BER) versus complexity trade-off is carried out among LS, genetic algorithm (GA), and particle swarm optimization (PSO) detectors. Based on the deterministic behavior of the LS algorithm, simplifications of the cost function calculation are also proposed, yielding more efficient algorithms (simplified and combined LS-MUD versions) and creating new perspectives for MUD implementation. Computational complexity is expressed in terms of the number of operations required to converge. Our conclusions indicate that the simplified LS (s-LS) method is always more efficient, independent of the system conditions, achieving better performance with lower complexity than the other heuristic detectors. In addition, the deterministic strategy and the absence of input parameters make the s-LS algorithm the most appropriate for the MUD problem. (C) 2008 Elsevier GmbH. All rights reserved.
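As a generic stand-in for the LS detector's search step, consider a deterministic 1-opt local search over ±1 bit vectors: flip any single bit that lowers the cost and repeat until no flip improves. The `cost` callable here is an assumed placeholder for the MUD objective, not the paper's actual likelihood function.

```python
def local_search(cost, bits):
    """Deterministic 1-opt local search over a +/-1 bit vector.

    Flips one bit at a time, keeping any flip that strictly lowers
    cost(bits), and stops at a local minimum (no single flip improves)."""
    current = cost(bits)
    improved = True
    while improved:
        improved = False
        for i in range(len(bits)):
            bits[i] = -bits[i]          # tentative flip
            candidate = cost(bits)
            if candidate < current:
                current, improved = candidate, True
            else:
                bits[i] = -bits[i]      # revert: the flip did not help
    return bits, current
```

Because the search is deterministic and parameter-free, repeated runs from the same start always return the same detector output, which is the property the abstract highlights for s-LS.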
Abstract:
This paper addresses the single-machine scheduling problem with a common due date, aiming to minimize earliness and tardiness penalties. Owing to its complexity, most previous studies in the literature deal with this problem using heuristic and metaheuristic approaches. As a contribution to the study of this problem, a branch-and-bound algorithm is proposed. Lower bounds and pruning rules that exploit properties of the problem are introduced. The proposed approach is examined through a comparative computational study with 280 problems involving different due date scenarios. In addition, the values of optimal solutions for small problems from a known benchmark are provided.
Abstract:
We tested the effects of four data characteristics on the results of reserve selection algorithms. The data characteristics were nestedness of features (land types in this case), rarity of features, size variation of sites (potential reserves) and size of data sets (numbers of sites and features). We manipulated data sets to produce three levels, with replication, of each of these data characteristics while holding the other three characteristics constant. We then used an optimizing algorithm and three heuristic algorithms to select sites to solve several reservation problems. We measured efficiency as the number or total area of selected sites, indicating the relative cost of a reserve system. Higher nestedness increased the efficiency of all algorithms (reduced the total cost of new reserves). Higher rarity reduced the efficiency of all algorithms (increased the total cost of new reserves). More variation in site size increased the efficiency of all algorithms expressed in terms of total area of selected sites. We measured the suboptimality of heuristic algorithms as the percentage increase of their results over optimal (minimum possible) results. Suboptimality is a measure of the reliability of heuristics as indicative costing analyses. Higher rarity reduced the suboptimality of heuristics (increased their reliability) and there is some evidence that more size variation did the same for the total area of selected sites. We discuss the implications of these results for the use of reserve selection algorithms as indicative and real-world planning tools.
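A typical heuristic of the kind compared here is greedy set covering: repeatedly select the site that represents the most still-unrepresented features. A minimal sketch, with site and feature names purely illustrative:

```python
def greedy_reserve(sites, features):
    """Greedy reserve-selection heuristic.

    sites: dict mapping site id -> set of features the site contains.
    Repeatedly picks the site covering the most uncovered features until
    all features are represented (or no site adds coverage)."""
    uncovered = set(features)
    chosen = []
    while uncovered:
        best = max(sites, key=lambda s: len(sites[s] & uncovered))
        if not sites[best] & uncovered:
            break  # remaining features occur in no site
        chosen.append(best)
        uncovered -= sites[best]
    return chosen
```

Comparing the total number (or area) of sites chosen this way against the optimal solution gives the suboptimality measure the abstract uses.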
Abstract:
This article deals with the efficiency of fractional integration parameter estimators. The study was based on Monte Carlo experiments involving simulated stochastic processes with integration orders in the open interval (-1, 1). The evaluated estimation methods were classified into two groups: heuristic and semiparametric/maximum likelihood (ML). The study revealed that the comparative efficiency of the estimators, measured by the lower mean squared error, depends on the stationary/non-stationary and persistent/anti-persistent conditions of the series. The ML estimator was shown to be superior for stationary persistent processes; the wavelet spectrum-based estimators were better for non-stationary mean-reverting and invertible anti-persistent processes; and the weighted periodogram-based estimator was superior for non-invertible anti-persistent processes.
Abstract:
This article presents a proposal for a systemic model composed of the micro and small enterprises (MSEs) of the Ribeirão Preto region and the agents that influence their environment. The proposed model was based on Stafford Beer's (Diagnosing the system for organizations. Chichester, Wiley, 1985) Viable System Model (VSM) and on Werner Ulrich's (1983) Critical Systems Heuristics (CSH). The VSM is a model for diagnosing the structure of an organization and its information flows through the application of cybernetics concepts (Narvarte, In El Modelo del Sistema Viable-MSV: experiencias de su aplicación en Chile. Proyecto Cerebro Colectivo del IAS, Santiago, 2001). CSH, on the other hand, focuses on the context of the social group, applying the systemic vision as a counterpoint to the organizational management view taken by the VSM. MSEs of Ribeirão Preto and Sertãozinho were analyzed as organizations inserted in systems that relate to and integrate with other systems involving public administration, representative entities, and promotion agencies. The research asks: what are the bonds of interaction among the subsystems in this process, and who are the agents involved? The systemic approach not only diagnosed a social group, formed by the MSEs of Ribeirão Preto and Sertãozinho, public authorities, and support entities, but also delineated answers aimed at clarifying obscure questions and supporting the formulation of efficient actions for the development of this system.
Abstract:
Models of population dynamics are commonly used to predict risks in ecology, particularly risks of population decline. There is often considerable uncertainty associated with these predictions. However, alternatives to predictions based on population models have not been assessed. We used simulation models of hypothetical species to generate the kinds of data that might typically be available to ecologists and then invited other researchers to predict risks of population declines using these data. The accuracy of the predictions was assessed by comparison with the forecasts of the original model. The researchers used either population models or subjective judgement to make their predictions. Predictions made using models were only slightly more accurate than subjective judgements of risk. However, predictions using models tended to be unbiased, while subjective judgements were biased towards over-estimation. Psychology literature suggests that the bias of subjective judgements is likely to vary somewhat unpredictably among people, depending on their stake in the outcome. This will make subjective predictions more uncertain and less transparent than those based on models. (C) 2004 Elsevier SAS. All rights reserved.
Abstract:
We consider algorithms for computing the Smith normal form of integer matrices. A variety of strategies have been proposed, primarily aimed at avoiding the major obstacle in such computations: explosive growth in the size of intermediate entries. We present a new algorithm with excellent performance. We investigate the complexity of such computations, indicating relationships with NP-complete problems. We also describe new heuristics which perform well in practice. We present experimental evidence showing our algorithm outperforming previous methods. (C) 1997 Academic Press Limited.
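For intuition, the Smith normal form's invariant factors can be computed from determinantal divisors: d_k is the gcd of all k×k minors, and s_k = d_k / d_{k-1}. The brute-force sketch below is only viable for tiny matrices and is not the paper's algorithm; it also shows why practical methods must work differently, since the minors themselves exhibit the entry growth the abstract mentions.

```python
from itertools import combinations
from math import gcd

def minor_det(M, rows, cols):
    """Determinant of the submatrix of M on the given row/column indices
    (Laplace expansion along the first selected row; fine for small k)."""
    if len(rows) == 1:
        return M[rows[0]][cols[0]]
    det = 0
    for j, c in enumerate(cols):
        sub = minor_det(M, rows[1:], cols[:j] + cols[j + 1:])
        det += (-1) ** j * M[rows[0]][c] * sub
    return det

def smith_diagonal(M):
    """Invariant factors of an integer matrix via determinantal divisors:
    d_k = gcd of all k x k minors, s_k = d_k / d_{k-1} (stopping when all
    k x k minors vanish). Exponential in k: a demonstration only."""
    m, n = len(M), len(M[0])
    diag, d_prev = [], 1
    for k in range(1, min(m, n) + 1):
        d_k = 0
        for rows in combinations(range(m), k):
            for cols in combinations(range(n), k):
                d_k = gcd(d_k, minor_det(M, list(rows), list(cols)))
        if d_k == 0:
            break
        diag.append(d_k // d_prev)
        d_prev = d_k
    return diag
```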
Abstract:
Our interest lies in applying the principles of critical systems thinking to human activity systems in developing countries in situations where issues of natural resource sustainability constrain the feasible set of long-term strategies. The concept of sustainable development provides an expanded domain for critical systems thinking. The fundamental values underpinning sustainable development are that both intragenerational and intergenerational equity are important. As a consequence, key stakeholders are often excluded from power-sharing within current social systems. Addressing these issues requires renewed focus on emancipatory commitment and methodologies. To date, Ulrich's critical systems heuristics is the only critical systems methodology that offers practicable tools for emancipation. A case study analysis in Tigray, northern Ethiopia, provides insights in relation to the application of critical system heuristics to issues of sustainable development and highlights the need to extend the use of critical systems heuristics beyond the design and monitoring of structured interventions.
Abstract:
The problem of designing spatially cohesive nature reserve systems that meet biodiversity objectives is formulated as a nonlinear integer programming problem. The multiobjective function minimises a combination of boundary length, area and failed representation of the biological attributes we are trying to conserve. The task is to reserve a subset of sites that best meet this objective. We use data on the distribution of habitats in the Northern Territory, Australia, to show how simulated annealing and a greedy heuristic algorithm can be used to generate good solutions to such large reserve design problems, and to compare the effectiveness of these methods.
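Such a multiobjective function can be sketched as a weighted sum of total area, exposed boundary length, and representation shortfall. All names and weights below are illustrative assumptions about the general form, not the paper's exact coefficients.

```python
def reserve_cost(selected, area, boundary, targets, coverage,
                 w_boundary=1.0, w_shortfall=100.0):
    """Illustrative combined reserve-design objective.

    selected: set of chosen site ids.
    area[s]: area of site s.
    boundary[s]: dict of neighbour id -> shared edge length.
    targets[f]: required amount of biological attribute f.
    coverage[s]: dict of attribute -> amount held in site s.
    """
    total_area = sum(area[s] for s in selected)
    # Exposed boundary: edges between a selected site and an unselected one.
    exposed = sum(length for s in selected
                  for t, length in boundary[s].items() if t not in selected)
    held = {f: sum(coverage[s].get(f, 0) for s in selected) for f in targets}
    shortfall = sum(max(0, targets[f] - held[f]) for f in targets)
    return total_area + w_boundary * exposed + w_shortfall * shortfall
```

Minimizing this with simulated annealing (adding/removing one site per move) or with a greedy heuristic gives the two methods the abstract compares.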