25 results for Rough fuzzy controller


Relevance: 20.00%

Abstract:

Type 1 diabetic patients depend on external insulin delivery to keep their blood glucose within near-normal ranges. In this work, two robust closed-loop controllers for blood glucose regulation are developed to prevent life-threatening hypoglycemia as well as to avoid extended hyperglycemia. The proposed controllers are designed using the sliding mode control technique within a Smith predictor structure. To improve meal disturbance rejection, a simple feedforward controller is added to inject the meal-time insulin bolus. Simulation scenarios were used to test the controllers and showed their ability to maintain glucose levels within safe limits in the presence of errors in measurements, modeling, and meal estimation.
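
The abstract gives no implementation details; the following is a minimal Python sketch of a generic sliding-mode control law combined with a feedforward meal bolus. All names and parameter values (target, lambda_s, k_sm, basal_rate, carb_ratio) are illustrative assumptions, and the Smith predictor compensation is omitted.

# Minimal sketch: generic sliding-mode insulin controller with meal feedforward.
# All parameters are illustrative assumptions, not values from the paper.

def sliding_mode_insulin(glucose, glucose_prev, dt,
                         target=100.0, lambda_s=0.05,
                         basal_rate=1.0, k_sm=0.5):
    """Return an insulin infusion rate (U/h) from a simple sliding surface."""
    error = glucose - target                   # mg/dL above the target
    d_error = (glucose - glucose_prev) / dt    # rate of change of glucose
    s = error + lambda_s * d_error             # sliding surface
    u = basal_rate + k_sm * (1 if s > 0 else -1 if s < 0 else 0)
    return max(u, 0.0)                         # insulin delivery cannot be negative

def meal_bolus(carbs_grams, carb_ratio=10.0):
    """Feedforward meal-time bolus: grams of carbohydrate per unit of insulin."""
    return carbs_grams / carb_ratio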

Relevance: 20.00%

Abstract:

This paper presents a control strategy for blood glucose (BG) level regulation in type 1 diabetic patients. To design the controller, a model-based predictive control scheme has been applied to a newly developed diabetic patient model. The controller is provided with a feedforward loop to improve meal compensation, a gain-scheduling scheme to account for different BG levels, and an asymmetric cost function to reduce hypoglycemic risk. A simulation environment that has been approved for testing artificial pancreas control algorithms has been used to test the controller. The simulation results show good controller performance in fasting conditions and meal disturbance rejection, as well as robustness against model-patient mismatch and errors in meal estimation.
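
The asymmetric cost function is the key risk-reducing ingredient. Below is a minimal Python sketch of one plausible form, where deviations below the target (hypoglycemia risk) are weighted more heavily than deviations above it; the weights and the target value are illustrative assumptions, not the paper's values.

# Minimal sketch of an asymmetric stage cost for an MPC objective.
# w_hypo > w_hyper so that going below the target is penalized more strongly.

def asymmetric_cost(predicted_bg, target=110.0, w_hypo=10.0, w_hyper=1.0):
    """Quadratic cost with different weights below and above the BG target."""
    error = predicted_bg - target
    weight = w_hypo if error < 0 else w_hyper
    return weight * error ** 2

# In an MPC setting the controller would minimize the sum of this cost over
# the prediction horizon plus a penalty on insulin moves.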

Relevance: 20.00%

Abstract:

Vagueness and high-dimensional data are common features of current data sets. This paper presents an approach to identifying conceptual structures in fuzzy three-dimensional data sets in order to obtain a conceptual hierarchy. We propose a fuzzy extension of the Galois connection that allows us to prove an isomorphism theorem between fuzzy set closures, which is the basis for generating lattice-ordered sets.
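
For reference, the two-dimensional fuzzy Galois connection that this kind of construction generalizes can be written with the derivation operators below; the residuated implication and the fuzzy incidence relation I are assumed notation, not taken from the paper.

% Fuzzy derivation operators for a fuzzy context I : G x M -> [0,1],
% where \rightarrow is a residuated implication (e.g. the Goedel implication).
A^{\uparrow}(m) = \bigwedge_{g \in G} \bigl( A(g) \rightarrow I(g,m) \bigr),
\qquad
B^{\downarrow}(g) = \bigwedge_{m \in M} \bigl( B(m) \rightarrow I(g,m) \bigr)
% The pair (\uparrow, \downarrow) forms an antitone Galois connection, and
% A \mapsto A^{\uparrow\downarrow} is a closure operator on fuzzy sets of objects.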

Relevance: 20.00%

Abstract:

This work focuses on the prediction of the two main nitrogenous variables that describe water quality at the effluent of a wastewater treatment plant. We have developed two kinds of neural network architectures, based on considering either a single output or, on the other hand, the usual five effluent variables that define water quality: suspended solids, biochemical organic matter, chemical organic matter, total nitrogen, and total Kjeldahl nitrogen. Two learning techniques, based on a classical adaptive gradient and on a Kalman filter, have been implemented. In order to improve generalization and performance, we have selected variables by means of genetic algorithms and fuzzy systems. The training, testing, and validation sets show that the final networks are able to learn the simulated available data well enough, especially for total nitrogen.
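
The paper trains its networks with an adaptive gradient and a Kalman filter; as a rough stand-in, the sketch below only contrasts the single-output and five-output architectures using scikit-learn's MLPRegressor. Feature and target names, layer sizes, and the placeholder data are assumptions for illustration.

# Sketch: single-output vs. multi-output effluent prediction networks.
# MLPRegressor is used here only as a stand-in for the adaptive-gradient /
# Kalman-filter training described in the paper; data are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

X = np.random.rand(500, 8)   # placeholder influent/process features
Y = np.random.rand(500, 5)   # [SS, BOD, COD, total N, Kjeldahl N]

# Architecture 1: one network per target variable (single output).
net_total_n = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000)
net_total_n.fit(X, Y[:, 3])  # predict total nitrogen only

# Architecture 2: one network predicting all five effluent variables.
net_all = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000)
net_all.fit(X, Y)            # multi-output regression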

Relevance: 20.00%

Abstract:

Many classification systems rely on clustering techniques in which a collection of training examples is provided as an input, and a number of clusters c1, ..., cm modelling some concept C results as an output, such that every cluster ci is labelled as positive or negative. Given a new, unlabelled instance e_new, the above classification is used to determine to which particular cluster ci this new instance belongs. In such a setting clusters can overlap, and a new unlabelled instance can be assigned to more than one cluster with conflicting labels. In the literature, such a case is usually solved non-deterministically by making a random choice. This paper presents a novel, hybrid approach to solve this situation by combining a neural network for classification with a defeasible argumentation framework that models preference criteria for performing clustering.
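
The paper resolves conflicting cluster labels with a defeasible argumentation framework; the Python sketch below only illustrates the setting, with a plain numeric preference score standing in for argumentation. The cluster centres, radii, and preference values are illustrative assumptions.

# Sketch: a new instance may fall inside several overlapping clusters with
# conflicting labels; instead of a random choice, a preference criterion
# (here a plain score, standing in for defeasible argumentation) decides.
import numpy as np

clusters = [
    {"centre": np.array([0.0, 0.0]), "radius": 1.5, "label": "positive", "preference": 0.8},
    {"centre": np.array([1.0, 0.5]), "radius": 1.5, "label": "negative", "preference": 0.6},
]

def classify(e_new):
    hits = [c for c in clusters
            if np.linalg.norm(e_new - c["centre"]) <= c["radius"]]
    if not hits:
        return None                  # instance lies outside every cluster
    labels = {c["label"] for c in hits}
    if len(labels) == 1:
        return labels.pop()          # no conflict
    # Conflicting labels: pick the most preferred cluster instead of a random one.
    return max(hits, key=lambda c: c["preference"])["label"]

print(classify(np.array([0.6, 0.3])))   # inside both clusters -> "positive"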

Relevance: 20.00%

Abstract:

PLFC is a first-order possibilistic logic dealing with fuzzy constants and fuzzily restricted quantifiers. The refutation proof method in PLFC is mainly based on a generalized resolution rule which allows an implicit graded unification among fuzzy constants. However, unification for precise object constants is classical. In order to use PLFC for similarity-based reasoning, in this paper we extend a Horn-rule sublogic of PLFC with similarity-based unification of object constants. The Horn-rule sublogic of PLFC we consider deals only with disjunctive fuzzy constants and is equipped with a simple and efficient version of the PLFC proof method. At the semantic level, it is extended by equipping each sort with a fuzzy similarity relation, and at the syntactic level, by fuzzily “enlarging” each non-fuzzy object constant in the antecedent of a Horn rule by means of a fuzzy similarity relation.
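
A minimal Python sketch of the idea of similarity-based, graded unification of precise object constants: two constants unify to the degree given by a fuzzy similarity relation on their sort. The relation and the constants are made-up examples, not taken from the paper.

# Sketch: graded unification of object constants via a fuzzy similarity
# relation on their sort (reflexive and symmetric; values are illustrative).
similarity = {
    ("low", "low"): 1.0,
    ("low", "medium"): 0.6,
    ("medium", "high"): 0.5,
    ("low", "high"): 0.1,
}

def sim(a, b):
    if a == b:
        return 1.0
    return similarity.get((a, b), similarity.get((b, a), 0.0))

def unify(constant_in_query, constant_in_rule):
    """Classical unification returns 0/1; here it returns a degree in [0,1]."""
    return sim(constant_in_query, constant_in_rule)

print(unify("low", "medium"))   # 0.6: the rule antecedent is fuzzily "enlarged"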

Relevance: 20.00%

Abstract:

Possibilistic Defeasible Logic Programming (P-DeLP) is a logic programming language which combines features from argumentation theory and logic programming, incorporating the treatment of possibilistic uncertainty at the object-language level. In spite of its expressive power, an important limitation in P-DeLP is that imprecise, fuzzy information cannot be expressed in the object language. One interesting alternative for solving this limitation is the use of PGL+, a possibilistic logic over Gödel logic extended with fuzzy constants. Fuzzy constants in PGL+ allow expressing disjunctive information about the unknown value of a variable, in the sense of a magnitude, modelled as a (unary) predicate. The aim of this article is twofold: firstly, we formalize DePGL+, a possibilistic defeasible logic programming language that extends P-DeLP through the use of PGL+ in order to incorporate fuzzy constants and a fuzzy unification mechanism for them. Secondly, we propose a way to handle conflicting arguments in the context of the extended framework.

Relevance: 20.00%

Abstract:

The time required to image large samples is an important limiting factor in SPM-based systems. In multiprobe setups, especially when working with biological samples, this drawback can make it impossible to conduct certain experiments. In this work, we present a feedforward controller based on bang-bang and adaptive controls. The controls exploit the difference between the maximum speeds that can be used for imaging depending on the flatness of the sample zone. Topographic images of Escherichia coli bacteria samples were acquired using the implemented controllers. Results show that moving faster over the flat zones, rather than using a constant scanning speed for the whole image, speeds up the imaging process of large samples by up to a factor of 4.
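
A minimal Python sketch of the bang-bang part of the idea: choose between a slow and a fast scanning speed depending on a local flatness measure of the topography. The flatness threshold and the speed values are illustrative assumptions, not values from the paper.

# Sketch: bang-bang selection of scanning speed from local sample flatness.
# Threshold and speeds are illustrative, not values from the paper.
import numpy as np

def choose_scan_speed(height_profile, flatness_threshold_nm=5.0,
                      slow_speed=1.0, fast_speed=4.0):
    """Pick the fast speed when the local height range is small (flat zone)."""
    height_range = np.ptp(height_profile)   # max - min over the local window
    return fast_speed if height_range < flatness_threshold_nm else slow_speed

print(choose_scan_speed(np.array([0.5, 0.8, 1.1])))     # flat zone  -> 4.0
print(choose_scan_speed(np.array([0.5, 12.0, 30.0])))   # rough zone -> 1.0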

Relevance: 20.00%

Abstract:

Creation of a system consisting of a genetic algorithm that makes it possible to automatically design the data of the linguistic values of a fuzzy controller for a differential-drive robot. The data to be obtained must give the robot the ability to reach a destination while avoiding the obstacles it encounters along the way.
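
A minimal Python sketch of the genetic-algorithm layer: each chromosome encodes parameters of the fuzzy controller's linguistic values (membership functions), and fitness would come from a navigation simulation. The chromosome length, the genetic operators, and the simulate_navigation stub are illustrative assumptions.

# Sketch: GA that tunes the membership-function parameters (linguistic values)
# of a fuzzy controller for a differential-drive robot. The fitness function
# is a stub; in the real system it would run a navigation simulation that
# rewards reaching the goal and penalizes collisions with obstacles.
import random

N_PARAMS = 12            # e.g. breakpoints of triangular membership functions
POP_SIZE, GENERATIONS, MUTATION_RATE = 30, 50, 0.1

def simulate_navigation(params):
    """Stub fitness: placeholder objective standing in for a robot simulation."""
    return -sum((p - 0.5) ** 2 for p in params)

def mutate(ind):
    return [min(1.0, max(0.0, p + random.gauss(0, 0.1)))
            if random.random() < MUTATION_RATE else p for p in ind]

def crossover(a, b):
    cut = random.randrange(1, N_PARAMS)
    return a[:cut] + b[cut:]

population = [[random.random() for _ in range(N_PARAMS)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=simulate_navigation, reverse=True)
    parents = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=simulate_navigation)   # best linguistic-value setting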

Relevance: 20.00%

Abstract:

One main assumption in the theory of rough sets applied to information tables is that elements exhibiting the same information are indiscernible (similar) and form blocks that can be understood as elementary granules of knowledge about the universe. We propose a variant of this concept by defining a measure of similarity between the elements of the universe, so that two objects can be considered indiscernible even though they do not share all attribute values, because the knowledge is partial or uncertain. The set of similarities defines the matrix of a fuzzy relation satisfying reflexivity and symmetry but not transitivity, so a partition of the universe is not obtained. This problem can be solved by calculating the transitive closure of the relation, which ensures a partition for each level in the unit interval [0,1]. This procedure allows the theory of rough sets to be generalized according to the minimum level of similarity accepted. This point of view increases the rough character of the data because it enlarges the sets of indiscernible objects. Finally, we apply our results to a synthetic (non-real) application in order to highlight the differences and improvements between this methodology and the classical one.
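
A minimal Python sketch of the closure-and-cut step described above: compute the max-min transitive closure of a reflexive, symmetric fuzzy similarity matrix, then take an alpha-cut to obtain the partition (indiscernibility classes) at a chosen similarity level. The example matrix and the level alpha are illustrative assumptions.

# Sketch: max-min transitive closure of a fuzzy similarity matrix and the
# partition induced by an alpha-cut. The matrix values are illustrative.
import numpy as np

def max_min_composition(r, s):
    n = r.shape[0]
    return np.array([[np.max(np.minimum(r[i, :], s[:, j]))
                      for j in range(n)] for i in range(n)])

def transitive_closure(r):
    """Iterate R := max(R, R o R) until a fixpoint (a fuzzy equivalence relation)."""
    while True:
        r_next = np.maximum(r, max_min_composition(r, r))
        if np.allclose(r_next, r):
            return r_next
        r = r_next

def alpha_cut_partition(closure, alpha):
    """Group objects whose closure similarity is at least alpha."""
    n = closure.shape[0]
    classes, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        block = {j for j in range(n) if closure[i, j] >= alpha}
        classes.append(sorted(block))
        assigned |= block
    return classes

R = np.array([[1.0, 0.8, 0.3],
              [0.8, 1.0, 0.4],
              [0.3, 0.4, 1.0]])
print(alpha_cut_partition(transitive_closure(R), alpha=0.5))   # [[0, 1], [2]]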