890 results for rough sets
Abstract:
* The research is supported in part by the INTAS 04-77-7173 project, http://www.intas.be
Abstract:
Issues relating to the definition of fuzzy sets are considered, including an analogue of the separation axiom, a statistical interpretation, and the representation of membership functions by conditional probabilities.
Abstract:
For inference purposes in both classical and fuzzy logic, the information itself should not be contradictory, nor should any of the items of available information contradict each other. To avoid these troubles in fuzzy logic, a study of contradiction was initiated by Trillas et al. in [5] and [6]. They introduced the concepts of a self-contradictory fuzzy set and of contradiction between two fuzzy sets. Moreover, the need to study not only contradiction but also its degree is pointed out in [1] and [2], where some measures for this purpose are suggested. Nevertheless, contradiction could be measured in other ways. This paper focuses on the study of contradiction between two fuzzy sets, approaching the problem from a geometrical point of view that allows us to find new ways to measure the degree of contradiction. To do this, the two fuzzy sets are interpreted as a subset of the unit square, and the so-called contradiction region is determined. In particular, we tackle the case in which both sets represent a curve in [0,1]^2. This new geometrical approach allows us to obtain different functions that measure contradiction through distances. Moreover, some properties of these contradiction measure functions are established and, in some particular cases, the relations among these different functions are obtained.
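As an illustration of the geometric viewpoint (a minimal sketch, not the measures proposed in the paper): with the standard negation N(a) = 1 - a, two fuzzy sets mu and nu are contradictory when every pair (mu(x), nu(x)) lies in the contradiction region a + b <= 1 of the unit square, and a degree of contradiction can be built from Euclidean distances to the boundary line a + b = 1. The function below is a hypothetical example of such a distance-based measure.

```python
# A minimal sketch (not the paper's construction): contradiction w.r.t. the
# standard negation N(a) = 1 - a, where two fuzzy sets mu, nu are contradictory
# when every pair (mu(x), nu(x)) lies in the region a + b <= 1 of [0,1]^2.

import numpy as np

def contradiction_degree(mu, nu):
    """Illustrative distance-based degree in [0, 1].

    mu, nu: arrays of membership values over a finite universe.
    Returns 0.0 as soon as some pair leaves the contradiction region, and
    1.0 only when every pair sits at its deepest corner (0, 0).
    """
    a, b = np.asarray(mu), np.asarray(nu)
    # Signed Euclidean distance of each pair to the line a + b = 1
    # (negative inside the contradiction region a + b <= 1).
    signed = (a + b - 1.0) / np.sqrt(2.0)
    if np.any(signed > 0):          # some pair violates the region: no contradiction
        return 0.0
    # Scale the worst-case (closest-to-boundary) depth to [0, 1];
    # the deepest possible point (0, 0) lies at distance 1/sqrt(2).
    return float(np.min(-signed) * np.sqrt(2.0))

# Example: nu stays well below the complement of mu -> positive degree.
print(contradiction_degree([0.1, 0.2, 0.3], [0.2, 0.1, 0.2]))  # -> 0.5
```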
Abstract:
In this article the authors analyse questions of subjective uncertainty and inexactness that arise when expert information is used, together with other questions connected with the uncertainty of expert information, modelled by fuzzy sets with rough membership functions. Information is given on the problem of integrating individual expert marks and on the connection between the "degree of inexactness" of the total marks and the sensitivity of the measurement scale. A number of situations connected with the distribution of membership function values and their orientation toward a concrete decision-making task are also analysed.
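For reference, a rough membership function in Pawlak's sense assigns to each object the fraction of its indiscernibility class that falls inside the concept; the sketch below illustrates that standard definition (the article's own construction for expert marks is not reproduced here).

```python
# A minimal sketch of Pawlak's rough membership function: the membership of x
# in concept A is |[x]_B intersect A| / |[x]_B|, where [x]_B is the class of
# objects indiscernible from x on the attributes B.

from collections import defaultdict

def rough_membership(universe, attrs, concept):
    """universe: list of objects; attrs: dict object -> tuple of attribute
    values; concept: set of objects. Returns dict object -> membership."""
    classes = defaultdict(list)
    for x in universe:
        classes[attrs[x]].append(x)   # group B-indiscernible objects
    mu = {}
    for x in universe:
        cls = classes[attrs[x]]
        mu[x] = sum(1 for y in cls if y in concept) / len(cls)
    return mu

# Example: objects 1 and 2 are indiscernible; only object 1 is in the concept.
attrs = {1: ('a',), 2: ('a',), 3: ('b',)}
print(rough_membership([1, 2, 3], attrs, concept={1, 3}))
# -> {1: 0.5, 2: 0.5, 3: 1.0}
```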
Abstract:
2000 Mathematics Subject Classification: Primary 42B20; Secondary 42B15, 42B25
Abstract:
Theodore Motzkin proved, in 1936, that any polyhedral convex set can be expressed as the (Minkowski) sum of a polytope and a polyhedral convex cone. We provide several characterizations of the larger class of Motzkin decomposable closed convex sets in finite-dimensional Euclidean spaces, i.e., those that are the sum of a compact convex set and a closed convex cone. These characterizations involve different types of representations of closed convex sets, via support functions, dual cones and linear systems, whose relationships are also analyzed. We also discuss how to obtain information about a given closed convex set F, and about the parametric linear optimization problem with feasible set F, from each of its different representations, including the Motzkin decomposition. Another result establishes that a closed convex set is Motzkin decomposable if and only if the set of extreme points of its intersection with the linear subspace orthogonal to its lineality space is bounded. We characterize the class of extended functions whose epigraphs are Motzkin decomposable sets, showing, in particular, that these functions attain their global minima when they are bounded from below. A calculus of Motzkin decomposable sets and functions is provided.
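A concrete illustration of such a decomposition (our example, not taken from the paper): the unbounded polyhedron below splits into a polytope plus a closed convex cone, exactly as in Motzkin's theorem.

```latex
\[
F=\{(x,y)\in\mathbb{R}^2 : x\ge 0,\ y\ge 0,\ x+y\ge 1\}
 =\underbrace{\operatorname{conv}\{(1,0),(0,1)\}}_{\text{polytope}}
 +\underbrace{\operatorname{cone}\{(1,0),(0,1)\}}_{\text{closed convex cone}}.
\]
```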
Abstract:
2000 Mathematics Subject Classification: 90C26, 90C20, 49J52, 47H05, 47J20.
Abstract:
ACM Computing Classification System (1998): G.2.1.
Abstract:
Zdravko D. Slavov - This work considers Pareto solutions in continuous multicriteria optimization. The role of certain assumptions that influence the characteristics of the Pareto sets is discussed. The author attempts to remove the assumptions of concavity of the objective functions and convexity of the feasible domain, which are commonly used in multicriteria optimization. The results are based on the construction of a retraction from the feasible domain onto the Pareto-optimal set.
Abstract:
Simeon T. Stefanov, Velika I. Dragieva - This work studies the evolution of systems of sets on the n-dimensional Euclidean sphere S^n. A connection between such systems and the homotopy groups of spheres is established. Some combinatorial applications to polytopes are obtained.
Abstract:
2000 Mathematics Subject Classification: 60F05, 60B10.
Abstract:
Big data comes in various ways, types, shapes, forms and sizes. Indeed, almost all areas of science, technology, medicine, public health, economics, business, linguistics and social science are bombarded by ever increasing flows of data begging to be analyzed efficiently and effectively. In this paper, we propose a rough idea of a possible taxonomy of big data, along with some of the most commonly used tools for handling each particular category of bigness. The dimensionality p of the input space and the sample size n are usually the main ingredients in the characterization of data bigness. The specific statistical machine learning technique used to handle a particular big data set will depend on which category it falls in within the bigness taxonomy. Large p small n data sets, for instance, require a different set of tools from the large n small p variety. Among other tools, we discuss Preprocessing, Standardization, Imputation, Projection, Regularization, Penalization, Compression, Reduction, Selection, Kernelization, Hybridization, Parallelization, Aggregation, Randomization, Replication and Sequentialization. Indeed, it is important to emphasize right away that the so-called no free lunch theorem applies here, in the sense that there is no universally superior method that outperforms all other methods on all categories of bigness. It is also important to stress that simplicity, in the sense of Ockham's razor and its non-plurality principle of parsimony, tends to reign supreme when it comes to massive data. We conclude with a comparison of the predictive performance of some of the most commonly used methods on a few data sets.
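As a small illustration of penalization in the large p small n regime (a minimal numpy sketch, not from the paper): ordinary least squares is ill-posed when p > n, while a ridge penalty makes the normal equations invertible.

```python
# Ridge regression with far more features than samples: the penalty
# lam * ||w||^2 turns the singular normal equations into a well-posed system.

import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 500                      # large p, small n
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[:3] = [2.0, -1.0, 0.5]       # only a few informative features
y = X @ w_true + 0.1 * rng.standard_normal(n)

lam = 1.0
# Ridge solution: w = (X^T X + lam I)^{-1} X^T y  (invertible for any lam > 0)
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
print("estimated leading coefficients:", np.round(w_hat[:3], 2))
```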
Abstract:
In recent years, rough set approaches to computational issues concerning reducts of decision tables have attracted the attention of many researchers. In this paper, we present the time complexity of an algorithm for computing reducts of decision tables by a relational database approach. Let DS = (U, C ∪ {d}) be a consistent decision table; we say that A ⊆ C is a relative reduct of DS if A contains a reduct of DS. Let s =
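For concreteness, one standard way to compute a single reduct of a consistent decision table is greedy backward elimination; the sketch below illustrates that generic technique, not the relational database algorithm analyzed in the paper.

```python
# Compute one reduct of a consistent decision table DS = (U, C u {d}) by
# greedy backward elimination: drop an attribute whenever the remaining
# attributes still determine the decision d.

def determines(rows, attrs, d):
    """True if objects agreeing on attrs always agree on decision d."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        if seen.setdefault(key, row[d]) != row[d]:
            return False
    return True

def one_reduct(rows, C, d):
    A = list(C)
    for a in list(A):
        rest = [b for b in A if b != a]
        if determines(rows, rest, d):
            A = rest                 # a is dispensable, drop it
    return A                         # no single attribute can now be removed

# Example table with condition attributes C = [0, 1, 2] and decision d = 3.
rows = [(0, 0, 1, 'no'), (0, 1, 1, 'yes'), (1, 0, 0, 'no'), (1, 1, 0, 'yes')]
print(one_reduct(rows, C=[0, 1, 2], d=3))   # -> [1]: attribute 1 alone decides
```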
Abstract:
We consider von Neumann–Morgenstern stable sets in assignment games with one seller and many buyers. We prove that a set of imputations is a stable set if and only if it is the graph of a certain type of continuous and monotone function. This characterization enables us to interpret the standards of behavior encompassed by the various stable sets as possible outcomes of well-known auction procedures when groups of buyers may form bidder rings. We also show that the union of all stable sets can be described as the union of convex polytopes all of whose vertices are marginal contribution payoff vectors. Consequently, each stable set is contained in the Weber set. The Shapley value, however, typically falls outside the union of all stable sets.
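To make the objects concrete (an illustrative sketch under our own toy model, not the paper's construction): in a one-seller assignment game, a coalition earns the highest valuation among its buyers provided the seller is present, and each ordering of the players yields a marginal contribution payoff vector. The Weber set is the convex hull of these vectors, and the Shapley value is their componentwise average.

```python
# Marginal contribution payoff vectors of a toy one-seller assignment game.

from itertools import permutations

def make_v(valuations):
    """Characteristic function: a coalition earns the best valuation among
    its buyers, but only if the seller 's' is present."""
    def v(S):
        buyers = [p for p in S if p != 's']
        return max((valuations[b] for b in buyers), default=0) if 's' in S else 0
    return v

def marginal_vectors(players, v):
    vectors = []
    for order in permutations(players):
        S, m = set(), {}
        for p in order:
            before = v(S)
            S.add(p)
            m[p] = v(S) - before     # p's marginal contribution in this order
        vectors.append(m)
    return vectors

players = ['s', 'b1', 'b2']
v = make_v({'b1': 3, 'b2': 5})
for m in marginal_vectors(players, v):
    print(m)
# Averaging the six vectors componentwise gives the Shapley value.
```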
Abstract:
The purpose of this research was to demonstrate the applicability of reduced-size STR (Miniplex) primer sets to challenging samples and to provide the forensic community with new information regarding the analysis of degraded and inhibited DNA. The Miniplex primer sets were validated in accordance with guidelines set forth by the Scientific Working Group on DNA Analysis Methods (SWGDAM) in order to demonstrate the scientific validity of the kits. The Miniplex sets were also used in the analysis of DNA extracted from human skeletal remains and telogen hair. In addition, a method for evaluating the mechanism of PCR inhibition was developed using qPCR. The Miniplexes were demonstrated to be a robust and sensitive tool for the analysis of DNA with as little as 100 pg of template DNA. They also proved to be better than commercial kits in the analysis of DNA from human skeletal remains, with 64% of the samples tested producing full profiles, compared to 16% for a commercial kit. The Miniplexes also produced amplification of nuclear DNA from human telogen hairs, with partial profiles obtained from as little as 60 pg of template DNA. These data suggest that smaller PCR amplicons may provide a useful alternative to mitochondrial DNA for the forensic analysis of degraded DNA from human skeletal remains, telogen hairs, and other challenging samples. In the evaluation of inhibition by qPCR, the effects of amplicon length and primer melting temperature were evaluated in order to determine the binding mechanisms of different PCR inhibitors. Several mechanisms were indicated by the inhibitors tested, including binding of the polymerase, binding to the DNA, and effects on the processivity of the polymerase during primer extension. The qPCR data illustrated a method by which the type of inhibitor could be inferred in forensic samples, and some methods of reducing inhibition for specific inhibitors were demonstrated. An understanding of the mechanisms of the inhibitors found in forensic samples will allow analysts to select the proper methods for inhibition removal, or the type of analysis that can be performed, and will increase the information that can be obtained from inhibited samples.