927 results for Random finite set theory
Resumo:
This doctoral thesis focuses on studies of fuzzy matroids and related topics. Since the publication of L. A. Zadeh's classical paper on fuzzy sets in 1965, the theory of fuzzy mathematics has gained increasing recognition from researchers in a wide range of scientific fields. Among the various branches of pure and applied mathematics, convexity was one of the areas to which the notion of a fuzzy set was applied. Many researchers have worked on extending the notion of abstract convexity to the broader fuzzy setting, and as a result a number of concepts have been formulated and explored. However, many concepts are yet to be fuzzified. The main objective of this thesis was to extend some basic concepts and results of convexity theory to the fuzzy setting. Concepts such as matroids and independence structures, and classical convexity invariants such as the Helly number, Carathéodory number, Radon number, and exchange number, form an important area of study in crisp convexity theory. In this thesis, we try to generalize some of these concepts to the fuzzy setting. Finally, we have defined different types of fuzzy matroids derived from vector spaces and discussed some of their properties.
Resumo:
This paper highlights the prediction of learning disabilities (LD) in school-age children using rough set theory (RST), with an emphasis on the application of data mining. In rough sets, data analysis starts from a data table called an information system, which contains data about objects of interest characterized in terms of attributes. Here these attributes consist of the properties of learning disabilities. By finding the relationships between these attributes, the redundant attributes can be eliminated and the core attributes determined. Rule mining is also performed in rough sets using the LEM1 algorithm. The prediction of LD is carried out accurately using Rosetta, the rough set toolkit for the analysis of data. The result obtained from this study is compared with the output of a similar study we conducted using a Support Vector Machine (SVM) with the Sequential Minimal Optimisation (SMO) algorithm. It is found that, using the concepts of reduct and global covering, we can easily predict learning disabilities in children.
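The reduct idea mentioned above can be sketched in a few lines: attributes are redundant when dropping them still leaves every indiscernibility class pure with respect to the decision. The toy table and attribute names below are invented for illustration; this is not the study's LD data, and LEM1 rule mining is not reproduced.

```python
def partition(rows, attrs):
    """Group row indices into indiscernibility classes over the given attributes."""
    classes = {}
    for i, row in enumerate(rows):
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, []).append(i)
    return list(classes.values())

def preserves_decision(rows, attrs, decision):
    """True if every indiscernibility class is pure w.r.t. the decision attribute."""
    return all(
        len({rows[i][decision] for i in cls}) == 1
        for cls in partition(rows, attrs)
    )

def reduct(rows, attrs, decision):
    """Greedily drop attributes whose removal keeps the decision consistent."""
    kept = list(attrs)
    for a in attrs:
        trial = [x for x in kept if x != a]
        if trial and preserves_decision(rows, trial, decision):
            kept = trial
    return kept

# Hypothetical decision table: condition attributes and an LD decision.
rows = [
    {"reading": "low",  "memory": "low",  "attention": "low",  "LD": "yes"},
    {"reading": "low",  "memory": "high", "attention": "low",  "LD": "yes"},
    {"reading": "high", "memory": "high", "attention": "high", "LD": "no"},
    {"reading": "high", "memory": "low",  "attention": "high", "LD": "no"},
]
print(reduct(rows, ["reading", "memory", "attention"], "LD"))  # → ['attention']
```

In this toy table a single attribute already determines the decision, which is exactly the kind of reduction the abstract describes.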
Resumo:
Ontic is an interactive system for developing and verifying mathematics. Ontic's verification mechanism is capable of automatically finding and applying information from a library containing hundreds of mathematical facts. Starting with only the axioms of Zermelo-Fraenkel set theory, the Ontic system has been used to build a database of definitions and lemmas leading to a proof of the Stone representation theorem for Boolean lattices. The Ontic system has been used to explore issues in knowledge representation, automated deduction, and the automatic use of large databases.
Resumo:
The computation of a piecewise smooth function that approximates a finite set of data points may be decomposed into two decoupled tasks: first, the computation of the locally smooth models, and hence the segmentation of the data into classes consisting of the sets of points best approximated by each model; and second, the computation of the normalized discriminant functions for each induced class. The approximating function may then be computed as the optimal estimator with respect to this measure field. We give an efficient procedure for carrying out both computations, and for determining the optimal number of components.
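The two decoupled tasks above can be sketched as an alternation: fit local models to the current segmentation, then reassign each point to the model that approximates it best. This Lloyd-style loop with linear local models is an assumption for illustration, not the paper's procedure; the names `fit_line` and `segment` are invented.

```python
import random

def fit_line(pts):
    """Least-squares fit of y = a*x + b to a list of (x, y) points."""
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    denom = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / denom if denom else 0.0
    b = (sy - a * sx) / n
    return a, b

def segment(pts, k, iters=20, seed=0):
    """Alternate between refitting k local linear models and reassigning
    each point to the model with the smallest squared residual."""
    rng = random.Random(seed)
    labels = [rng.randrange(k) for _ in pts]
    for _ in range(iters):
        models = []
        for j in range(k):
            cluster = [p for p, l in zip(pts, labels) if l == j]
            models.append(fit_line(cluster) if len(cluster) >= 2 else (0.0, 0.0))
        labels = [
            min(range(k), key=lambda j: (models[j][0] * x + models[j][1] - y) ** 2)
            for x, y in pts
        ]
    return models, labels
```

The paper's second task, computing normalized discriminant functions per class, would replace the hard `min` assignment with soft memberships; the sketch keeps only the segmentation step.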
Resumo:
We consider the problem of a society whose members must choose from a finite set of alternatives. After learning the chosen alternative, members may reconsider their membership in the society by either staying or exiting. In turn, as a consequence of the exit of some of its members, other members might then find it undesirable to belong to the society as well. We analyze the voting behavior of members who take into account the effect of their votes not only on the chosen alternative, but also on the final composition of the society.
Resumo:
The molecular basis of immune recognition and response lies in the presentation of antigenic peptides. Set theory and experimental data were used to produce a mathematical characterization of the central binding region of the peptide by defining 8 rules associated with HLA class II binding. These rules were applied to 4 promiscuous peptides; to 25 natural peptide sequences from the central region, of which 13 bound while the rest did not; and to 19 synthetic peptides, with the aim of differentiating the peptides. With one exception, all binding and non-binding peptides were correctly characterized. This methodology may be useful for selecting key peptides in vaccine development.
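A rule set like the one described can be applied as a conjunction of predicates over the peptide core. The two rules below are invented placeholders to show the shape of such a classifier; the paper's 8 HLA class II binding rules are not reproduced here.

```python
# Hydrophobic residues in one-letter amino acid code.
HYDROPHOBIC = set("AVLIMFWY")

def rule_anchor_p1(core):
    """Hypothetical rule: position 1 of the 9-mer core is hydrophobic."""
    return core[0] in HYDROPHOBIC

def rule_no_proline_center(core):
    """Hypothetical rule: no proline in the central positions 4-6."""
    return "P" not in core[3:6]

RULES = [rule_anchor_p1, rule_no_proline_center]

def binds(core):
    """Classify a 9-residue core as a binder iff every rule holds."""
    return all(rule(core) for rule in RULES)

print(binds("FVKQNAAAL"))  # True: hydrophobic P1, no central proline
print(binds("GVKPNAAAL"))  # False: fails both placeholder rules
```

The real rules would encode the experimentally derived position-specific constraints; the point of the sketch is only that set membership tests compose naturally into a binding/non-binding characterization.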
Resumo:
This paper describes a human management model as conceived in organizations that carry out strategic direction of staff, based on a critical look at traditional management and some of its notions, such as the classical perspective of strategic direction and human resources management. The privileged theoretical framework is the epistemological ground of organizational theory and some of its sociological resources. In addition to the documentary review and the proposals of consulting experts, a set of diagrams built on the basic logic of set theory, designed from the analysis of several Colombian organizations, is presented. The main finding is that despite the efforts of executives, consultants, and scholars to build management models different from functionalist ones, the way these models have been rethought to make them more strategic has made them even more functionalist than the traditional approach. Strategic human management reproduces, with enormous power, the ideology of the macroeconomic model.
Resumo:
This article briefly presents the origin of the mathematization of economics and the field of mathematical economics. An initial historical approach divides the field into a first, so-called marginalist period; a second in which set theory and linear models are used; and finally a period that integrates the two. The evolution of General Equilibrium Theory is then analyzed, from Quesnay through Walras and later developments to its culmination in the work of Arrow, Debreu, and their contemporaries. Finally, the influence of mathematics, especially dynamic optimization, on macroeconomic theory and other areas of economics is described.
Resumo:
The control and prediction of wastewater treatment plants pursue an important goal: to avoid upsetting the environmental balance by always keeping the system in stable operating conditions. It is known that qualitative information — coming from microscopic examinations and subjective remarks — has a deep influence on the activated sludge process, in particular on the total amount of effluent suspended solids, one of the measures of overall plant performance. The search for an input–output model of this variable and the prediction of sudden increases (bulking episodes) are thus a central concern to ensure the fulfillment of current discharge limitations. Unfortunately, the strong interrelation between variables, their heterogeneity, and the very high amount of missing information make the use of traditional techniques difficult, or even impossible. Through the combined use of several methods — mainly rough set theory and artificial neural networks — reasonable prediction models are found, which also serve to show the differing importance of the variables and provide insight into the process dynamics.
Resumo:
There are various situations in which it is natural to ask whether a given collection of k functions, ρ_j(r_1, …, r_j), j = 1, …, k, defined on a set X, are the first k correlation functions of a point process on X. Here we describe some necessary and sufficient conditions on the ρ_j's for this to be true. Our primary examples are X = ℝ^d, X = ℤ^d, and X an arbitrary finite set. In particular, we extend a result by Ambartzumian and Sukiasian showing realizability at sufficiently small densities ρ_1(r). Typically, if any realizing process exists there will be many (even an uncountable number); in this case we prove, when X is a finite set, the existence of a realizing Gibbs measure with k-body potentials which maximizes the entropy among all realizing measures. We also investigate in detail a simple example in which a uniform density ρ and a translation-invariant ρ_2 are specified on ℤ; there is a gap between our best upper bound on possible values of ρ and the largest ρ for which realizability can be established.
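On a finite set X the correlation functions have an elementary description that makes the simplest necessary conditions transparent; a minimal sketch using standard definitions, not the paper's derivation:

```latex
% For a random subset (simple point process) \eta of a finite set X:
\rho_j(x_1,\dots,x_j) \;=\; \Pr\bigl(\{x_1,\dots,x_j\} \subseteq \eta\bigr),
\qquad x_1,\dots,x_j \in X \text{ distinct.}

% Two immediate necessary conditions for realizability:
0 \;\le\; \rho_j \;\le\; 1,
\qquad
\rho_{j+1}(x_1,\dots,x_{j+1}) \;\le\; \rho_j(x_1,\dots,x_j),

% the second because the event \{x_1,\dots,x_{j+1}\} \subseteq \eta
% is contained in the event \{x_1,\dots,x_j\} \subseteq \eta.
```

Conditions of this elementary kind are necessary but far from sufficient, which is what makes the realizability question studied in the paper nontrivial.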
Resumo:
Associative memory networks such as radial basis function, neurofuzzy, and fuzzy logic networks used for modelling nonlinear processes suffer from the curse of dimensionality (COD), in that as the input dimension increases, the parameterization, computation cost, training data requirements, etc. increase exponentially. Here a new algorithm is introduced for the construction of Delaunay input-space-partitioned optimal piecewise locally linear models, to overcome the COD as well as to generate locally linear models directly amenable to linear control and estimation algorithms. The training of the model is configured as a new mixture-of-experts network with a new fast decision rule derived using convex set theory. A very fast simulated reannealing (VFSR) algorithm is utilized to search for a globally optimal solution of the Delaunay input space partition. A benchmark nonlinear time series is used to demonstrate the new approach.
Resumo:
A new autonomous ship collision-free (ASCF) trajectory navigation and control system has been introduced, with a new recursive navigation algorithm based on analytic geometry and convex set theory for collision-free ship guidance. The underlying assumption is that the geometric information about the ship's environment is available in the form of a polygon-shaped free space, which may easily be generated from a 2D image or from plots relating to physical hazards or other constraints such as collision avoidance regulations. The navigation command is given as a heading command sequence based on generating a waypoint which falls within a small neighborhood of the current position, and the sequence of waypoints along the trajectory is guaranteed to lie within a bounded obstacle-free region using convex set theory. A neurofuzzy network predictor, which in practice uses only observed input/output data generated by on-board sensors or external sensors (or a sensor fusion algorithm), based on using the rudder deflection angle to control the ship's heading angle, is utilised in the simulation of an ESSO 190000 dwt tanker model to demonstrate the effectiveness of the system.
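The convex-set membership guarantee above reduces, for a single convex free-space polygon, to a standard half-plane test: a candidate waypoint is accepted only if it lies on the inner side of every edge. A minimal sketch, with an illustrative polygon and waypoints that are not from the tanker study:

```python
def inside_convex(point, polygon):
    """True if point lies in a convex polygon given as CCW-ordered vertices."""
    px, py = point
    n = len(polygon)
    for i in range(n):
        ax, ay = polygon[i]
        bx, by = polygon[(i + 1) % n]
        # Cross product of the edge vector with the vertex-to-point vector;
        # a negative value means the point is right of the edge, i.e. outside.
        if (bx - ax) * (py - ay) - (by - ay) * (px - ax) < 0:
            return False
    return True

free_space = [(0, 0), (10, 0), (10, 6), (0, 6)]   # CCW rectangle, toy free space
print(inside_convex((4, 3), free_space))   # True: candidate waypoint accepted
print(inside_convex((12, 3), free_space))  # False: outside the safe region
```

Repeating this test for each generated waypoint is what keeps the whole heading-command sequence inside the bounded obstacle-free region.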
Resumo:
Familial idiopathic basal ganglia calcification, also known as "Fahr's disease" (FD), is a neuropsychiatric disorder with an autosomal dominant pattern of inheritance, characterized by symmetric calcifications of the basal ganglia and, occasionally, of other brain regions. Currently, there are three loci linked to this devastating disease. The first (IBGC1) is located at 14q11.2-21.3, and the other two have been identified at 2q37 (IBGC2) and 8p21.1-q11.13 (IBGC3). Further studies identified a heterozygous variation (rs36060072), consisting of a cytosine-to-guanine change in the MGEA6/CTAGE5 gene, present in all affected members of a large American family linked to IBGC1. This missense substitution, which changes a proline to alanine at position 521 (P521A) in a proline-rich and highly conserved protein domain, was considered a rare variation, with a minor allele frequency (MAF) of 0.0058 in the US population. Considering that the population frequency of a given variation is an indirect indication of potential pathogenicity, we screened 200 chromosomes in a random control set of Brazilian samples and in two nuclear families, comparing the results with our previous analysis of a US population. In addition, we carried out analyses with bioinformatics programs to predict the pathogenicity of this variation. Our genetic screen found no P521A carriers. Pooling these data with the previous study in the USA, we now have a MAF of 0.0036, showing that this mutation is very rare. On the other hand, the bioinformatics analysis provided conflicting findings. There are currently various candidate genes and loci that could be involved in the underlying molecular basis of FD etiology, and other groups have suggested a possible role played by genes at 2q37, related to calcium metabolism, and at chromosome 8 (NRG1 and SNTG1). Additional mutagenesis and in vivo studies are necessary to confirm the pathogenicity of the P521A variation in MGEA6.
Resumo:
This paper presents the formulation of a combinatorial optimization problem with the following characteristics: (i) the search space is the power set of a finite set structured as a Boolean lattice; (ii) the cost function forms a U-shaped curve when applied to any lattice chain. This formulation applies to feature selection in the context of pattern recognition. The known approaches to this problem are branch-and-bound algorithms and heuristics that explore the search space only partially. Branch-and-bound algorithms are equivalent to the full search, while heuristics are not. This paper presents a branch-and-bound algorithm that differs from the known ones by exploiting the lattice structure and the U-shaped chain curves of the search space. The main contribution of this paper is the architecture of this algorithm, which is based on the representation and exploration of the search space via new lattice properties proven here. Several experiments, with well-known public data, indicate the superiority of the proposed method over sequential floating forward selection (SFFS), a popular heuristic that gives good results in very short computational time. In all experiments, the proposed method obtained better or equal results in similar or even smaller computational time. (C) 2009 Elsevier Ltd. All rights reserved.
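The pruning power of the U-shaped chain property can be illustrated with a toy depth-first search over the Boolean lattice: once the cost rises along a chain of nested subsets, it can never fall again on that chain, so the branch is cut. This is only an illustration of the principle, not the paper's algorithm; `best_subset` and the toy cost are invented.

```python
def best_subset(cost, n):
    """Minimize cost over subsets of {0,...,n-1}, assuming cost is
    U-shaped (decreasing then increasing) along every lattice chain."""
    best = {"set": frozenset(), "cost": cost(frozenset())}

    def extend(subset, c_subset, start):
        for e in range(start, n):
            nxt = subset | {e}
            c = cost(nxt)
            if c < best["cost"]:
                best["set"], best["cost"] = nxt, c
            # On a U-shaped chain, once the cost rises it never falls again,
            # so only still-descending branches are worth extending.
            if c <= c_subset:
                extend(nxt, c, e + 1)

    extend(frozenset(), best["cost"], 0)
    return best["set"], best["cost"]

# Toy cost depending only on subset size, (|S| - 2)^2, which is U-shaped
# along every chain since each chain step grows the subset by one element.
cost = lambda s: (len(s) - 2) ** 2
print(best_subset(cost, 4))  # → (frozenset({0, 1}), 0)
```

Real feature-selection costs (e.g. classifier error estimates) play the role of the toy cost; the U-shape is what licenses cutting a branch without visiting its supersets.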