970 results for decision directed algorithm
Abstract:
We introduce a novel way of measuring the entropy of a set of values undergoing changes. Such a measure becomes useful when analyzing the temporal development of an algorithm designed to numerically update a collection of values such as artificial neural network weights undergoing adjustments during learning. We measure the entropy as a function of the phase-space of the values, i.e. their magnitude and velocity of change, using a method based on the abstract measure of entropy introduced by the philosopher Rudolf Carnap. By constructing a time-dynamic two-dimensional Voronoi diagram using Voronoi cell generators with coordinates of value- and value-velocity (change of magnitude), the entropy becomes a function of the cell areas. We term this measure teleonomic entropy since it can be used to describe changes in any end-directed (teleonomic) system. The usefulness of the method is illustrated when comparing the different approaches of two search algorithms, a learning artificial neural network and a population of discovering agents. (C) 2004 Elsevier Inc. All rights reserved.
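As a rough illustration of the idea, the sketch below computes an entropy from the areas of the bounded cells of a two-dimensional (value, value-velocity) Voronoi diagram. Restricting the sum to bounded cells and using a Shannon-style entropy over normalised areas are simplifying assumptions for the sketch, not the paper's exact Carnap-based construction.

```python
# Minimal sketch: entropy over the areas of bounded Voronoi cells in the
# (value, value-velocity) phase space. Names are illustrative.
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def teleonomic_entropy(values, velocities):
    """Entropy of (value, value-velocity) generator points, computed from
    the areas of their bounded Voronoi cells."""
    points = np.column_stack([values, velocities])
    vor = Voronoi(points)
    areas = []
    for region_idx in vor.point_region:
        region = vor.regions[region_idx]
        if -1 in region or len(region) < 3:
            continue  # skip unbounded cells at the edge of the diagram
        areas.append(ConvexHull(vor.vertices[region]).volume)  # 2-D "volume" = area
    p = np.asarray(areas) / np.sum(areas)  # normalise areas to probabilities
    return -np.sum(p * np.log(p))

# Example: entropy of 50 network weights and their most recent update step.
rng = np.random.default_rng(0)
w = rng.normal(size=50)
dw = rng.normal(scale=0.1, size=50)
print(teleonomic_entropy(w, dw))
```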
Abstract:
Power systems are large-scale nonlinear systems of high complexity. Various optimization techniques and expert systems have been used in power system planning. However, there are always factors that cannot be quantified, modeled, or even expressed by expert systems. Moreover, such planning problems are often large-scale optimization problems; although computational algorithms capable of handling high-dimensional problems can be used, the computational costs remain very high. To address these problems, this paper investigates the efficiency and effectiveness of combining mathematical algorithms with human intelligence. It has been observed that humans can join the decision-making process through cognitive feedback. Based on cognitive feedback and the genetic algorithm, a new algorithm called the cognitive genetic algorithm is presented. This algorithm can clarify and extract human cognition. As an important application of the cognitive genetic algorithm, a practical decision method for power distribution system planning is proposed. Using this decision method, optimal results that satisfy human expertise can be obtained while the limitations of human experts are minimized.
Abstract:
Refraction simulators used for undergraduate training at Aston University did not realistically reflect variations in the relationship between vision and ametropia. This was because they used an algorithm, taken from the research literature, that strictly only applied to myopes or older hyperopes and did not factor in age and pupil diameter. The aim of this study was to generate new algorithms that overcame these limitations. Clinical data were collected from the healthy right eyes of 873 white subjects aged between 20 and 70 years. Vision and refractive error were recorded along with age and pupil diameter. Re-examination of 34 subjects enabled the calculation of coefficients of repeatability. The study population was slightly biased towards females and included many contact lens wearers. Sex and contact lens wear were, therefore, recorded in order to determine whether these might influence the findings. In addition, iris colour and cylinder axis orientation were recorded as these might also be influential. A novel Blur Sensitivity Ratio (BSR) was derived by dividing vision (expressed as minimum angle of resolution) by refractive error (expressed as a scalar vector, U). Alteration of the scalar vector, to account for additional vision reduction due to oblique cylinder axes, was not found to be useful. Decision tree analysis showed that sex, contact lens wear, iris colour and cylinder axis orientation did not influence the BSR. The following algorithms arose from two stepwise multiple linear regressions:

BSR (myopes) = 1.13 + (0.24 x pupil diameter) + (0.14 x U)
BSR (hyperopes) = (0.11 x pupil diameter) + (0.03 x age) - 0.22

These algorithms together accounted for 84% of the observed variance. They showed that pupil diameter influenced vision in both forms of ametropia. They also showed the age-related decline in the ability to accommodate in order to overcome reduced vision in hyperopia.
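As a worked example, the two regressions above can be applied directly. The sketch below uses illustrative function names; the final step reconstructs vision as MAR = BSR × U, which follows from the stated definition of the BSR.

```python
# Worked example of the two BSR regression algorithms quoted above.
def bsr_myope(pupil_diameter_mm, U):
    return 1.13 + 0.24 * pupil_diameter_mm + 0.14 * U

def bsr_hyperope(pupil_diameter_mm, age_years):
    return 0.11 * pupil_diameter_mm + 0.03 * age_years - 0.22

# Predicted vision (minimum angle of resolution) for a myope with a
# 4 mm pupil and a 1.50 D scalar-vector refractive error.
U = 1.50
bsr = bsr_myope(4.0, U)
print(f"BSR = {bsr:.2f}, predicted MAR = {bsr * U:.2f}")
```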
Abstract:
Integrated supplier selection and order allocation is an important decision for both designing and operating supply chains. This decision is often influenced by the concerned stakeholders, suppliers, plant operators and customers in different tiers. As firms continue to seek competitive advantage through supply chain design and operations, they aim to create optimized supply chains. This calls for, on the one hand, consideration of multiple conflicting criteria and, on the other, consideration of the uncertainties of demand and supply. Although there are studies on supplier selection that separately employ advanced mathematical models for a stochastic approach, multiple-criteria decision-making techniques and multiple stakeholder requirements, to the authors' knowledge there is no work that integrates these three aspects in a common framework. This paper proposes an integrated method for dealing with such problems using a combined Analytic Hierarchy Process-Quality Function Deployment (AHP-QFD) and chance-constrained optimization algorithm approach that selects appropriate suppliers and allocates orders optimally between them. The effectiveness of the proposed decision support system has been demonstrated through application and validation in the bioenergy industry.
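As a small illustration of the AHP ingredient of the combined approach, the sketch below derives criteria weights from a pairwise comparison matrix via the principal eigenvector; the 3×3 matrix and the criteria names are invented for the example, not taken from the paper.

```python
# AHP priority vector: normalised principal eigenvector of a reciprocal
# pairwise comparison matrix.
import numpy as np

def ahp_weights(pairwise):
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, principal].real)
    return w / w.sum()

# Illustrative judgements: cost is 3x as important as quality and 5x as
# important as delivery; quality is 2x as important as delivery.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_weights(A))  # weights over {cost, quality, delivery}
```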
Abstract:
Transition P systems are a parallel and distributed computational model based on the notion of the cellular membrane structure. Each membrane determines a region that encloses a multiset of objects and evolution rules. Transition P systems evolve through transitions between two consecutive configurations, determined by the membrane structure and the multisets present inside the membranes. Transitions between two consecutive configurations are obtained by an exhaustive, non-deterministic and parallel application of the subset of active evolution rules inside each membrane of the P system. However, establishing the subset of active evolution rules first requires computing the useful and applicable rules. Hence, computing the subset of applicable evolution rules is critical for the efficiency of the whole evolution process, because it is performed in parallel inside each membrane in every evolution step. The work presented here shows the advantages of incorporating decision trees into the evolution rules applicability algorithm. To this end, we present the formalizations needed to treat applicability as a classification problem, the method for automatically generating the required decision tree, and the new applicability algorithm based on it.
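A minimal sketch of the classification view proposed here, assuming a membrane's state is encoded as object counts and rule applicability is the class label; both encodings and the training data are illustrative, not the paper's formalization.

```python
# Rule applicability as classification: learn a decision tree that predicts
# whether a rule can fire from the membrane's multiset.
from sklearn.tree import DecisionTreeClassifier

# Features: counts of objects (a, b, c) in a membrane's multiset.
# Label: whether an illustrative rule "a b -> c" is applicable
# (it needs at least one a and one b).
X = [[0, 0, 0], [1, 0, 2], [1, 1, 0], [2, 3, 1], [0, 5, 2], [4, 1, 3]]
y = [0, 0, 1, 1, 0, 1]

tree = DecisionTreeClassifier().fit(X, y)
print(tree.predict([[1, 2, 0]]))  # expected [1]: the rule is applicable
```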
Abstract:
This paper considers the problem of concept generalization in decision-making systems in which such features of real-world databases as large size, incompleteness and inconsistency of the stored information are taken into account. Methods from rough set theory (such as lower and upper approximations, positive regions and reducts) are used to solve this problem. A new discretization algorithm for continuous attributes is proposed. It substantially improves the overall performance of generalization algorithms and can be applied to the processing of real-valued attributes in large data tables. A search algorithm for significant attributes, combined with the discretization stage, is also developed. It avoids splitting the continuous domains of insignificant attributes into intervals.
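For context, the rough-set notions the paper builds on can be sketched in a few lines; the toy objects and attributes below are illustrative.

```python
# Lower and upper approximations of a concept X under the indiscernibility
# relation induced by a set of attributes.
def approximations(universe, attrs, X):
    """Partition `universe` into indiscernibility classes by the tuple of
    `attrs` values, then collect lower/upper approximations of set X."""
    classes = {}
    for obj in universe:
        key = tuple(obj[a] for a in attrs)
        classes.setdefault(key, set()).add(obj["id"])
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= X:
            lower |= cls   # class entirely inside X: certainly in the concept
        if cls & X:
            upper |= cls   # class overlapping X: possibly in the concept
    return lower, upper

objs = [
    {"id": 1, "temp": "high", "wind": "yes"},
    {"id": 2, "temp": "high", "wind": "yes"},
    {"id": 3, "temp": "low",  "wind": "no"},
]
print(approximations(objs, ["temp", "wind"], X={1, 3}))
# lower = {3}, upper = {1, 2, 3}: objects 1 and 2 are indiscernible
# but disagree on membership in X.
```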
Abstract:
Vendor-managed inventory (VMI) is a widely used collaborative inventory management policy in which the manufacturer manages the inventory of retailers and takes responsibility for decisions related to the timing and extent of inventory replenishment. VMI partnerships help organisations to reduce demand variability, inventory holding and distribution costs. This study provides empirical evidence that significant economic benefits can be achieved with the use of a genetic algorithm (GA)-based decision support system (DSS) in a VMI supply chain. A two-stage serial supply chain, in which retailers and their supplier operate VMI in an uncertain demand environment, is studied. Performance was measured in terms of cost, profit, stockouts and service levels. The results generated by the GA-based model were compared with traditional alternatives. The study found that the GA-based approach outperformed traditional methods and that its use can be economically justified in small- and medium-sized enterprises (SMEs).
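A hedged sketch of how a GA-based DSS of this kind might search replenishment parameters; the chromosome (an order-up-to level per retailer) and the toy holding/stockout cost model are assumptions for the sketch, not the study's model.

```python
# Toy GA searching order-up-to levels under uncertain demand.
import random

random.seed(1)
DEMAND_MEAN, HOLD, STOCKOUT = 20, 1.0, 9.0

def cost(levels, trials=200):
    """Simulated expected holding + stockout cost for order-up-to levels."""
    total = 0.0
    for _ in range(trials):
        for s in levels:
            d = random.gauss(DEMAND_MEAN, 5)
            total += HOLD * max(s - d, 0) + STOCKOUT * max(d - s, 0)
    return total / trials

def evolve(n_retailers=3, pop_size=30, gens=40):
    pop = [[random.uniform(10, 40) for _ in range(n_retailers)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # blend crossover plus Gaussian mutation
            children.append([(x + y) / 2 + random.gauss(0, 1)
                             for x, y in zip(a, b)])
        pop = parents + children
    return min(pop, key=cost)

print(evolve())  # near-optimal order-up-to levels under the toy cost model
```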
Abstract:
In recent years, rough-set computational issues concerning reducts of decision tables have attracted the attention of many researchers. In this paper, we present the time complexity of an algorithm computing reducts of decision tables by a relational database approach. Let DS = (U, C ∪ {d}) be a consistent decision table; we say that A ⊆ C is a relative reduct of DS if A contains a reduct of DS. Let s =
Abstract:
Three-dimensional (3-D) imaging is vital in computer-assisted surgical planning, including minimally invasive surgery, targeted drug delivery, and tumor resection. Selective Internal Radiation Therapy (SIRT) is a liver-directed radiation therapy for the treatment of liver cancer. Accurate calculation of anatomical liver and tumor volumes is essential for determining the tumor-to-normal-liver ratio and for calculating the dose of Y-90 microspheres that will result in a high concentration of radiation in the tumor region compared with nearby healthy tissue. Present manual techniques for segmentation of the liver from Computed Tomography (CT) tend to be tedious and greatly dependent on the skill of the technician/doctor performing the task.

This dissertation presents the development and implementation of a fully integrated algorithm for 3-D liver and tumor segmentation from tri-phase CT that yields highly accurate estimations of the respective volumes of the liver and tumor(s). The algorithm as designed requires minimal human intervention without compromising the accuracy of the segmentation results. Embedded within this algorithm is an effective method for extracting the blood vessels that feed the tumor(s) in order to plan the appropriate treatment effectively.

Segmentation of the liver achieved an accuracy in excess of 95% in estimating liver volumes in 20 datasets in comparison with the manual gold-standard volumes. In a similar comparison, tumor segmentation exhibited an accuracy of 86% in estimating tumor volume(s). Qualitative results of the blood vessel segmentation algorithm demonstrated its effectiveness in extracting and rendering the vasculature structure of the liver. Results of the parallel computing process, using a single workstation, showed a 78% gain. Statistical analysis carried out to determine whether manual initialization has any impact on accuracy showed that the results are independent of user initialization.

The dissertation thus provides a complete 3-D solution for liver cancer treatment planning, with the opportunity to extract, visualize and quantify the statistics needed for liver cancer treatment. Since SIRT requires highly accurate calculation of the liver and tumor volumes, this new method provides the effective and computationally efficient process that such challenging clinical requirements demand.
Abstract:
Supply chain operations directly affect service levels. Decisions on the modification of facilities are generally based on overall cost, leaving out the efficiency of each unit. By decomposing the supply chain superstructure, efficiency analysis of the facilities (warehouses or distribution centers) that serve customers can be easily implemented. With the proposed algorithm, the selection of a facility is based on service-level maximization and not just cost minimization, as the analysis filters all feasible solutions using the Data Envelopment Analysis (DEA) technique. Through multiple iterations, solutions are filtered via DEA and only the efficient ones are selected, leading to cost minimization. In this work, the problem of optimal supply chain network design is addressed with a DEA-based algorithm. A Branch and Efficiency (B&E) algorithm is deployed for the solution of this problem. Under this DEA approach, each solution (a potentially installed warehouse, plant, etc.) is treated as a Decision Making Unit and is thus characterized by inputs and outputs. Through additional constraints named "efficiency cuts", the algorithm selects only efficient solutions, providing better objective function values. The applicability of the proposed algorithm is demonstrated through illustrative examples.
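The DEA test at the heart of such efficiency cuts can be sketched as an input-oriented CCR model solved as a linear programme for one candidate facility (DMU); the two-input/one-output data below is illustrative.

```python
# Input-oriented CCR efficiency score for one DMU via linear programming.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Efficiency of DMU j0; X is (inputs x DMUs), Y is (outputs x DMUs).
    Decision variables are [theta, lambda_1, ..., lambda_n]; minimise theta."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.r_[1.0, np.zeros(n)]
    # sum_j lambda_j * x_ij <= theta * x_i,j0   (input contraction)
    A_in = np.c_[-X[:, [j0]], X]
    # sum_j lambda_j * y_rj >= y_r,j0           (output maintenance)
    A_out = np.c_[np.zeros((s, 1)), -Y]
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun  # theta = 1 means efficient; < 1 means dominated

X = np.array([[4.0, 2.0, 3.0],   # e.g. operating cost per warehouse
              [3.0, 1.0, 3.0]])  # e.g. staff per warehouse
Y = np.array([[1.0, 1.0, 1.0]])  # e.g. demand served (normalised)
print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])
```

Only solutions scoring 1 would survive an efficiency cut; the dominated warehouses are pruned from the branch-and-bound tree.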
Abstract:
This paper deals with a very important issue in any knowledge engineering discipline: the accurate representation and modelling of real-life data and its processing by human experts. The work is applied to the GRiST Mental Health Risk Screening Tool for assessing risks associated with mental-health problems. The complexity of risk data and the wide variations in clinicians' expert opinions make it difficult to elicit representations of uncertainty that are an accurate and meaningful consensus. Doing so requires integrating each expert's estimation of a continuous distribution of uncertainty across a range of values. This paper describes an algorithm that generates a consensual distribution at the same time as measuring the consistency of inputs. Hence it provides a measure of confidence in each data item's risk contribution at the input stage and can help indicate the quality of subsequent risk predictions. © 2010 IEEE.
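The abstract does not spell out the algorithm, so the sketch below only illustrates the general idea of pooling expert uncertainty distributions into a consensus while scoring their agreement; the mean pooling rule and the total-variation consistency measure are assumptions, not the paper's method.

```python
# Pool expert uncertainty distributions and score their consistency.
import numpy as np

grid = np.linspace(0, 1, 101)                  # range of risk values
experts = np.array([
    np.exp(-((grid - mu) ** 2) / 0.02) for mu in (0.3, 0.35, 0.7)
])
experts /= experts.sum(axis=1, keepdims=True)  # normalise each distribution

consensus = experts.mean(axis=0)
# Consistency: 1 minus the mean total-variation distance to the consensus.
tv = 0.5 * np.abs(experts - consensus).sum(axis=1).mean()
print(f"consistency = {1 - tv:.2f}")           # low when an expert is an outlier
```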
Abstract:
BACKGROUND: Less than 1% of severely obese US adults undergo bariatric surgery annually. It is critical to understand the factors that contribute to its utilization. OBJECTIVES: To understand how primary care physicians (PCPs) make decisions regarding severe obesity treatment and bariatric surgery referral. SETTING: Focus groups with PCPs practicing in small, medium, and large cities in Wisconsin. METHODS: PCPs were asked to discuss prioritization of treatment for a severely obese patient with multiple co-morbidities and considerations regarding bariatric surgery referral. Focus group sessions were analyzed by using a directed approach to content analysis. A taxonomy of consensus codes was developed. Code summaries were created and representative quotes identified. RESULTS: Sixteen PCPs participated in 3 focus groups. Four treatment prioritization approaches were identified: (1) treat the disease that is easiest to address; (2) treat the disease that is perceived as the most dangerous; (3) let the patient set the agenda; and (4) address obesity first because it is the common denominator underlying other co-morbid conditions. Only the latter approach placed emphasis on obesity treatment. Five factors made PCPs hesitate to refer patients for bariatric surgery: (1) wanting to "do no harm"; (2) questioning the long-term effectiveness of bariatric surgery; (3) limited knowledge about bariatric surgery; (4) not wanting to recommend bariatric surgery too early; and (5) not knowing if insurance would cover bariatric surgery. CONCLUSION: Decision making by PCPs for severely obese patients seems to underprioritize obesity treatment and overestimate bariatric surgery risks. This could be addressed with PCP education and improvements in communication between PCPs and bariatric surgeons.
Abstract:
Planning is an essential process in teams of multiple agents pursuing a common goal. When the effects of actions undertaken by agents are uncertain, evaluating the potential risk of such actions alongside their utility may lead to more rational planning decisions. This challenge has recently been tackled in single-agent settings, yet domains with multiple agents holding diverse attitudes towards risk still require comprehensive decision-making mechanisms that balance the utility and risk of actions. In this work, we propose a novel collaborative multi-agent planning framework that integrates (i) a team-level online planner under uncertainty that extends the classical UCT approximate algorithm, and (ii) a preference modeling and multicriteria group decision-making approach that allows agents to find accepted and rational solutions to planning problems, predicated on the attitude each agent adopts towards risk. In risk-pervaded scenarios, the proposed framework can reduce the cost of reaching the common goal and increase effectiveness by appropriately balancing the risk and utility of actions when making collective decisions.
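One way such a risk/utility balance can enter UCT is through the selection rule; the sketch below augments UCB1 with a variance-based risk term weighted by an agent's risk attitude `lam`. Both the risk estimate and the weighting are illustrative choices, not the paper's formulation.

```python
# UCB1 selection with a risk penalty: utility minus lam * variance plus
# the usual exploration bonus.
import math

def risk_aware_ucb(children, total_visits, c=1.4, lam=0.5):
    """Pick the child maximising utility minus risk plus exploration bonus."""
    def score(ch):
        mean = ch["value"] / ch["visits"]
        var = ch["sq_value"] / ch["visits"] - mean ** 2  # return variance as risk
        return mean - lam * var + c * math.sqrt(
            math.log(total_visits) / ch["visits"])
    return max(children, key=score)

children = [
    {"name": "safe",  "visits": 40, "value": 20.0, "sq_value": 12.0},
    {"name": "risky", "visits": 40, "value": 24.0, "sq_value": 40.0},
]
# The risky action has higher mean return but far higher variance, so a
# risk-averse agent (lam = 0.5) picks the safe one.
print(risk_aware_ucb(children, total_visits=80)["name"])  # -> "safe"
```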
Abstract:
In recent years, we have seen a significant number of new robotic systems in science, industry, and everyday life. To reduce the complexity of these systems, industry constructs robots designated for the execution of a specific task, such as vacuum cleaning, autonomous driving, observation, or transportation. As a result, such robotic systems need to combine their capabilities to accomplish complex tasks that exceed the abilities of individual robots. However, to achieve emergent cooperative behavior, multi-robot systems require a decision process that copes with the communication challenges of the application domain. This work investigates a distributed multi-robot decision process that addresses unreliable and transient communication. The process is composed of five steps, which we embedded into the ALICA multi-agent coordination language, guided by the PROViDE negotiation middleware. The first step encompasses the specification of the decision problem, which is an integral part of the ALICA implementation. In our decision process, we describe multi-robot problems by continuous nonlinear constraint satisfaction problems. The second step addresses the calculation of solution proposals for this problem specification. Here, we propose an efficient solution algorithm that integrates incomplete local search and interval propagation techniques into a satisfiability solver, forming a satisfiability modulo theories (SMT) solver. In the third step, the PROViDE middleware replicates the solution proposals among the robots. This replication process is parameterized with a distribution method, which determines the consistency properties of the proposals. The fourth step investigates conflict resolution: an acceptance method ensures that each robot supports one of the replicated proposals. As the conflict resolution is integrated into the replication process, a sound selection of the distribution and acceptance methods leads to eventual convergence of the robot proposals. To avoid the execution of conflicting proposals, the last step comprises a decision method, which selects a proposal for implementation in case the conflict resolution fails. Our evaluation shows that the use of incomplete solution techniques in the constraint satisfaction solver outperforms the runtime of other state-of-the-art approaches on many typical robotic problems. We further show, through experimental setups and practical application in the RoboCup environment, that our decision process is suitable for making quick decisions in the presence of packet loss and delay. Moreover, PROViDE requires less memory and bandwidth than other state-of-the-art middleware approaches.
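The interval-propagation ingredient of such a solver can be illustrated in isolation: the sketch below contracts two variable domains against a single constraint x + y = c until a fixed point is reached. The constraint and domains are illustrative, not drawn from the dissertation.

```python
# Interval propagation: contract (lo, hi) domains under x + y = c.
def contract_sum(x, y, c):
    """Contract intervals x, y (as (lo, hi) pairs) under x + y = c."""
    x = (max(x[0], c - y[1]), min(x[1], c - y[0]))
    y = (max(y[0], c - x[1]), min(y[1], c - x[0]))
    return x, y

x, y = (0.0, 8.0), (5.0, 20.0)
for _ in range(3):                     # iterate to a fixed point
    x, y = contract_sum(x, y, 10.0)
print(x, y)  # x narrowed to (0.0, 5.0), y narrowed to (5.0, 10.0)
```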
Abstract:
Over the last few years, more and more heuristic decision-making techniques have been inspired by nature, e.g. evolutionary algorithms, ant colony optimisation and simulated annealing. More recently, a novel computational intelligence technique inspired by immunology has emerged, called Artificial Immune Systems (AIS). This immune-system-inspired technique has already proved useful in solving some computational problems. In this keynote, we will very briefly describe the immune system metaphors that are relevant to AIS. We will then give some illustrative real-world problems suitable for AIS use and show a step-by-step algorithm walkthrough. A comparison of AIS with other well-known algorithms and areas for future work will round this keynote off. It should be noted that, as AIS is still a young and evolving field, there is not yet a fixed algorithm template, and hence actual implementations might differ somewhat from the examples given here.
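Since, as the keynote notes, there is no fixed AIS algorithm template, the sketch below shows one classic instance, negative selection: detectors are generated so that none matches a "self" sample, and anything a detector later matches is flagged as non-self. All parameters and the matching rule are illustrative.

```python
# Negative selection: censor detectors that match "self", then use the
# survivors as anomaly detectors.
import random

random.seed(0)

def matches(detector, sample, radius=0.15):
    return all(abs(d - s) < radius for d, s in zip(detector, sample))

# "Self" samples live in the lower-left corner of the unit square.
self_set = [(random.uniform(0.0, 0.5), random.uniform(0.0, 0.5))
            for _ in range(50)]

detectors = []
while len(detectors) < 30:
    cand = (random.random(), random.random())
    if not any(matches(cand, s) for s in self_set):  # censor self-matchers
        detectors.append(cand)

probe = (0.9, 0.9)   # clearly outside the self region
print("anomaly!" if any(matches(d, probe) for d in detectors) else "normal")
```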