15 results for Algebraic Homogeneous Spaces
in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
Fuzzy subsets and fuzzy subgroups are basic concepts in fuzzy mathematics. We shall concentrate on fuzzy subgroups, dealing with some of their algebraic, topological and complex-analytical properties. The explorations are theoretical, belonging to pure mathematics. One of our ideas is to show how widely fuzzy subgroups can be used in mathematics, which brings out the wealth of this concept. In complex analysis we focus on Möbius transformations, combining them with fuzzy subgroups in the algebraic and topological sense. We also survey MV spaces with or without a link to fuzzy subgroups. The spectral space is a known concept in MV-algebra; we are interested in its topological properties in MV-semilinear space. Later on, we shall study MV-algebras in connection with Riemann surfaces. The Riemann surface as a concept belongs to complex analysis, while Möbius transformations form a part of the theory of Riemann surfaces. In general, this work gives a good understanding of how it is possible to fit together different fields of mathematics.
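Two notions central to this abstract can be stated compactly. The following is a minimal formal sketch using the standard textbook definitions (Rosenfeld's fuzzy subgroup and the Möbius transformation), not formulas quoted from the thesis itself:

```latex
% A fuzzy subset \mu : G \to [0,1] of a group G is a fuzzy subgroup
% (Rosenfeld) if, for all x, y in G:
\[
  \mu(xy) \;\ge\; \min\{\mu(x),\,\mu(y)\}, \qquad
  \mu(x^{-1}) \;\ge\; \mu(x).
\]
% A Mobius transformation of the extended complex plane:
\[
  f(z) \;=\; \frac{az+b}{cz+d}, \qquad a,b,c,d \in \mathbb{C},\; ad-bc \neq 0.
\]
% The set of all such transformations forms a group under composition,
% which is what allows them to be combined with fuzzy subgroups.
```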
Abstract:
Fuzzy set theory and fuzzy logic are studied from a mathematical point of view. The main goal is to investigate common mathematical structures in various fuzzy logical inference systems and to establish a general mathematical basis for fuzzy logic when considered as multi-valued logic. The study is composed of six distinct publications.

The first paper deals with Mattila's LPC+Ch Calculus. This fuzzy inference system is an attempt to introduce linguistic objects to mathematical logic without defining these objects mathematically. LPC+Ch Calculus is analyzed from an algebraic point of view, and it is demonstrated that a suitable factorization of the set of well-formed formulae (in fact, the Lindenbaum algebra) leads to a structure called ET-algebra, introduced at the beginning of the paper. On this basis, all the theorems presented by Mattila, and many others, can be proved in a simple way, as demonstrated in Lemmas 1 and 2 and Propositions 1-3. The conclusion critically discusses some other issues of LPC+Ch Calculus, especially that no formal semantics is given for it.

In the second paper, Sanchez's characterization of the solvability of the relational equation RoX=T, where R, X, T are fuzzy relations, X the unknown one, and o the minimum-induced composition, is extended to compositions induced by more general products in a general value lattice. Moreover, the procedure also applies to systems of equations.

In the third publication, common features in various fuzzy logical systems are investigated. It turns out that adjoint couples and residuated lattices are very often present, though not always explicitly expressed. Some minor new results are also proved.

The fourth study concerns Novak's paper, in which Novak introduced first-order fuzzy logic and proved, among other things, the semantico-syntactical completeness of this logic. He also demonstrated that the algebra of his logic is a generalized residuated lattice. It is proved that the examination of Novak's logic can be reduced to the examination of locally finite MV-algebras.

In the fifth paper, a multi-valued sentential logic with values of truth in an injective MV-algebra is introduced, and the axiomatizability of this logic is proved. The paper develops some ideas of Goguen and generalizes the results of Pavelka on the unit interval. Our proof of completeness is purely algebraic. A corollary of the Completeness Theorem is that fuzzy logic on the unit interval is semantically complete if, and only if, the algebra of the values of truth is a complete MV-algebra. The Compactness Theorem holds in our well-defined fuzzy sentential logic, while the Deduction Theorem and the Finiteness Theorem do not. Because of its generality and good behaviour, MV-valued logic can be regarded as a mathematical basis of fuzzy reasoning.

The last paper is a continuation of the fifth study. The semantics and syntax of fuzzy predicate logic with values of truth in an injective MV-algebra are introduced, and a list of universally valid sentences is established. The system is proved to be semantically complete. The proof is based on an idea utilizing some elementary properties of injective MV-algebras and MV-homomorphisms, and is purely algebraic.
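For reference, the completeness results mentioned above concern MV-algebras; the standard MV-algebra on the unit interval, with the Łukasiewicz operations, is given below as general background (a textbook definition, not a formula quoted from the thesis):

```latex
% The standard MV-algebra ([0,1], \oplus, \neg, 0) with Lukasiewicz operations:
\[
  x \oplus y = \min\{1,\; x+y\}, \qquad
  \neg x = 1 - x, \qquad
  x \odot y = \neg(\neg x \oplus \neg y) = \max\{0,\; x+y-1\}.
\]
% The corollary quoted above says that fuzzy logic on [0,1] is semantically
% complete exactly when the algebra of truth values is a complete MV-algebra,
% i.e. every subset has a supremum and an infimum in the underlying lattice.
```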
Abstract:
Current news
Abstract:
The thesis discusses games and the gaming experience. It is divided into two main sections: the first examines games in general, while the second concentrates exclusively on electronic games. The text approaches games from two distinct directions, looking at their spatiality and their narrativity at the same time. These two points of view are combined from the beginning of the text, as they are used in conceptualising the nature of the gaming experience. The purpose of the thesis is to investigate two closely related issues concerning both the field of game studies and the nature of games. In regard to studying games, the focus is placed on the juxtaposition of ludology and narratology, which acts as a framework for looking at gaming. In addition to asking whether it is possible to undermine this state of affairs through the spatiality of games, the text looks at the interrelationships of games and their spaces, as well as the role of narratives in those spaces. The thesis is characterised by its discussion of alternative points of view and by its hypothetical nature. Over the course of the text, it becomes apparent that the relationship between games and narratives is strongly twofold: on the one hand, the player continuously narrativizes the states the game is in while playing; on the other, the narratives residing within the game space form their own partially separate narrative spaces. These spaces affect the player's conception of the game states and of the events taking place in the game space itself.
Abstract:
This thesis studies the properties and usability of operators called t-norms, t-conorms, and uninorms, as well as many-valued implications and equivalences. Weights and a generalized mean are embedded into these operators for aggregation; because the resulting operators are used for comparison tasks, they are referred to as comparison measures. The thesis illustrates how these operators can be weighted with differential evolution and aggregated with a generalized mean, and what kinds of comparison measures can be achieved by this procedure. New operators suitable for comparison measures are suggested. These operators are combination measures based on the use of t-norms and t-conorms, the generalized 3Π-uninorm, and pseudo-equivalence measures based on S-type implications. The empirical part of this thesis demonstrates how these new comparison measures work in the field of classification, for example in the classification of medical data. The second application area is from the field of sports medicine: an expert system for defining an athlete's aerobic and anaerobic thresholds. The core of this thesis offers definitions for comparison measures and illustrates that there is no actual difference between the results achieved in comparison tasks by comparison measures based on distance and those based on many-valued logical structures. The approach in this thesis has been highly practical, and all usage of the measures has been validated mainly by practical testing. In general, many different types of operators suitable for comparison tasks have been presented in the fuzzy logic literature, but there has been little or no experimental work with these operators.
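As a minimal illustration of the kind of comparison measure described above, the sketch below computes a feature-wise Łukasiewicz equivalence and aggregates it with a weighted generalized mean. It is a sketch under assumed conventions; the function name, the choice of equivalence, and the example numbers are illustrative, not taken from the thesis:

```python
import numpy as np

def comparison_measure(x, y, weights, p=1.0):
    """Weighted generalized-mean aggregation of the feature-wise
    Lukasiewicz equivalence eq(a, b) = 1 - |a - b|.

    x, y    : feature vectors scaled to [0, 1]
    weights : non-negative weights summing to 1 (in the thesis these
              are tuned by differential evolution)
    p       : power of the generalized mean (p = 1 arithmetic mean,
              p = -1 harmonic mean; p -> 0 gives the geometric mean)
    """
    x, y, w = map(np.asarray, (x, y, weights))
    eq = 1.0 - np.abs(x - y)                 # many-valued equivalence per feature
    return float((w @ eq**p) ** (1.0 / p))   # weighted generalized mean

# Example: similarity of a sample to a class prototype.
sim = comparison_measure([0.2, 0.8, 0.5], [0.3, 0.7, 0.5],
                         weights=[0.5, 0.3, 0.2], p=2.0)
print(round(sim, 3))  # ~0.921
```

In a classification task, such a measure would be evaluated against one prototype vector per class, assigning the sample to the class with the highest similarity.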
Abstract:
The increasing incidence of type 1 diabetes has led researchers on a quest to find the reason behind this phenomenon. The rate of increase is too great to be caused simply by changes in the genetic component, and many environmental factors are under investigation for their possible contribution. These studies require, however, the participation of those individuals most likely to develop the disease, and the approach chosen by many is to screen vast populations to find persons with increased genetic risk factors. The participating individuals are then followed for signs of disease development, and their exposure to suspected environmental factors is studied. The main purpose of this study was to find a suitable tool for easy and inexpensive screening of certain genetic risk markers for type 1 diabetes. The method should be applicable to whole blood dried on sample collection cards as sample material, since shipping and storage of samples in this format is preferred. However, screening of vast sample libraries of extracted genomic DNA should also be possible if such a need arises, for example when studying the effect of newly discovered genetic risk markers. The method developed in this study is based on homogeneous assay chemistry and an asymmetrical polymerase chain reaction (PCR). The generated single-stranded PCR product is probed by lanthanide-labelled, LNA (locked nucleic acid)-spiked, short oligonucleotides with exactly complementary sequences. In the case of a perfect match, the probe hybridises to the product. However, if there is even a single-nucleotide difference, the probe binds not to the PCR product but to a complementary quencher-oligonucleotide labelled with a dabcyl moiety, causing the signal of the lanthanide label to be quenched. The method was applied to the screening of the well-known type 1 diabetes risk alleles of the HLA-DQB1 gene. The method was shown to be suitable as an initial screening step covering thousands of samples in the scheme used in the TEDDY (The Environmental Determinants of Diabetes in the Young) study to identify individuals at increased genetic risk. The method was further developed into dry-reagent form to allow an even simpler approach to screening. The reagents needed in the assay were provided in dry format in the reaction vessel, and performing the assay required only the addition of the sample and, if necessary, water to rehydrate the reagents. This allows the assay to be successfully executed even by a person with minimal laboratory experience.
Abstract:
Current news
Abstract:
The aim of the present study was to demonstrate the wide applicability of novel photoluminescent labels called upconverting phosphors (UCPs) in proximity-based bioanalytical assays. The exceptional features of the lanthanide-doped inorganic UCP compounds stem from their capability for photon upconversion, resulting in anti-Stokes photoluminescence at visible wavelengths under near-infrared (NIR) excitation. Major limitations of conventional photoluminescent labels are avoided, rendering the UCPs a competitive next-generation label technology. First, the background luminescence is minimized due to the total elimination of autofluorescence; consequently, improvements in detectability are expected. Second, at the long wavelengths (>600 nm) used for exciting and detecting the UCPs, the transmittance of sample matrices is significantly greater than at shorter wavelengths. Colored samples are no longer an obstacle to the luminescence measurement, and more flexibility is allowed even in homogeneous assay concepts, where the sample matrix remains present during the entire analysis procedure, including label detection. To transform a UCP particle into a biocompatible label suitable for bioanalytical assays, it must be colloidal in an aqueous environment and covered with biomolecules capable of recognizing the analyte molecule. At the beginning of this study, only UCP bulk material was available, and it was necessary to process the material to submicrometer-sized particles prior to use. Later, the ground UCPs, with their irregular shape, wide size distribution and heterogeneous luminescence properties, were substituted by a smaller-sized spherical UCP material. The surface functionalization of the UCPs was realized by producing a thin hydrophilic coating. Polymer adsorption on the UCP surface is a simple way to introduce functional groups for bioconjugation purposes, but possible stability issues encouraged us to optimize an optional silica-encapsulation method, which produces a coating that does not detach under storage or assay conditions. An extremely thin monolayer around the UCPs was pursued because of their intended use as short-distance energy donors, and much attention was paid to controlling the thickness of the coating. The performance of the UCP technology was evaluated in three different homogeneous resonance energy transfer-based bioanalytical assays: a competitive ligand-binding assay, a hybridization assay for nucleic acid detection, and an enzyme activity assay. To complete the list, a competitive immunoassay has been published previously. Our systematic investigation showed that a nonradiative energy transfer mechanism is indeed involved when a UCP and an acceptor fluorophore are brought into close proximity in aqueous suspension. This process is the basis for the above-mentioned homogeneous assays, in which the distance between the fluorescent species depends on a specific biomolecular binding event. According to the studies, the submicrometer-sized UCP labels allow versatile proximity-based bioanalysis with low detection limits (a low-nanomolar concentration for biotin, 0.01 U for benzonase enzyme, 0.35 nM for the target DNA sequence).
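The distance dependence that makes such proximity-based assays possible follows the standard Förster relation, reproduced here as general background rather than as a formula from the thesis:

```latex
% Efficiency of nonradiative resonance energy transfer between a donor
% (here the UCP) and an acceptor fluorophore at distance r, where R_0
% is the Forster radius, i.e. the distance of 50% transfer efficiency:
\[
  E \;=\; \frac{1}{1 + \left( r / R_0 \right)^{6}}
\]
% The steep r^{-6} dependence is why the signal reports on a specific
% biomolecular binding event bringing donor and acceptor into proximity.
```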
Abstract:
Technological developments in microprocessors and the ICT landscape have shifted us to a new era where computing power is embedded in numerous small distributed objects and devices in our everyday lives. These small computing devices are fine-tuned to perform a particular task and are increasingly reaching our society at every level. For example, home appliances such as programmable washing machines and microwave ovens employ several sensors to improve performance and convenience. Similarly, cars have on-board computers that use information from many different sensors to control things such as fuel injectors and spark plugs to perform their tasks efficiently. These individual devices make life easy by helping in decision-making and removing burdens from their users. All these objects and devices obtain some piece of information about the physical environment, yet each of them is an island, with no proper connectivity or information sharing between them. Sharing information between these heterogeneous devices could enable a whole new universe of innovative and intelligent applications, but it is a difficult task because of the heterogeneity and limited interoperability of the devices. The Smart Space vision is to overcome these issues of heterogeneity and interoperability so that devices can understand each other and utilize each other's services by sharing information. This enables innovative local mashup applications based on data shared between heterogeneous devices. Smart homes are one example of Smart Spaces: through the intelligent interconnection of resources and their collective behaviour, they make it possible to bring the health care system to the patient, as opposed to bringing the patient into the health system. In addition, the use of mobile handheld devices has risen at a tremendous rate during the last few years, and they have become an essential part of everyday life. Mobile phones offer a wide range of services to their users, including text and multimedia messages, Internet, audio, video and email applications, and most recently TV services. Interactive TV provides a variety of applications for viewers. The combination of interactive TV and Smart Spaces could yield innovative applications that are personalized, context-aware, ubiquitous and intelligent, by enabling heterogeneous systems to collaborate with each other by sharing information. There are many challenges in designing the frameworks and application development tools for rapid and easy development of such applications. The research work presented in this thesis addresses these issues. The original publications presented in the second part of this thesis propose architectures and methodologies for interactive and context-aware applications, and tools for the development of these applications. We demonstrate the suitability of our ontology-driven application development tools and rule-based approach for the development of dynamic, context-aware, ubiquitous iTV applications.
Abstract:
Information gained from the human genome project and improvements in compound synthesis have increased the number of both therapeutic targets and potential lead compounds. This has created a need for better screening techniques with the capacity to screen numerous compound libraries against an increasing number of targets. Radioactivity-based assays have traditionally been used in drug screening, but fluorescence-based assays have become more popular in high-throughput screening (HTS) as they avoid the safety and waste problems associated with radioactivity. In comparison to conventional fluorescence, more sensitive detection is obtained with time-resolved luminescence, which has increased the popularity of time-resolved fluorescence resonance energy transfer (TR-FRET) based assays. To simplify the current TR-FRET-based assay concept, a luminometric, homogeneous, single-label assay technique, Quenching Resonance Energy Transfer (QRET), was developed. The technique utilizes a soluble quencher to quench non-specifically the signal of the unbound fraction of the lanthanide-labeled ligand. A single labeling procedure and fewer manipulation steps in the assay concept save resources. The QRET technique is suitable for both biochemical and cell-based assays, as indicated in four studies: 1) a ligand screening study of the β2-adrenergic receptor (cell-based), 2) an activation study of Gs-/Gi-protein coupled receptors measuring the intracellular concentration of cyclic adenosine monophosphate (cell-based), 3) an activation study of G-protein coupled receptors observing the binding of guanosine-5’-triphosphate (cell membranes), and 4) an activation study of the small GTP-binding protein Ras (biochemical). Signal-to-background ratios were between 2.4 and 10, and coefficients of variation varied from 0.5 to 17%, indicating suitability for HTS use.
Abstract:
Book review
Abstract:
Human activity recognition in everyday environments is a critical but challenging task in Ambient Intelligence applications to achieve proper Ambient Assisted Living, and key challenges still remain to be dealt with to realize robust methods. One of the major limitations of Ambient Intelligence systems today is the lack of semantic models of the activities in the environment, so that the system can recognize the specific activity being performed by the user(s) and act accordingly. In this context, this thesis addresses the general problem of knowledge representation in Smart Spaces. The main objective is to develop knowledge-based models, equipped with semantics, to learn, infer and monitor human behaviours in Smart Spaces. Moreover, some aspects of this problem have a high degree of uncertainty, and therefore the developed models must be equipped with mechanisms to manage this type of information. A fuzzy ontology and a semantic hybrid system are presented to allow modelling and recognition of a set of complex real-life scenarios where vagueness and uncertainty are inherent to the human nature of the users who perform them. The handling of uncertain, incomplete and vague data (i.e., missing sensor readings and activity execution variations, since human behaviour is non-deterministic) is approached for the first time through a fuzzy ontology validated in real-time settings within a hybrid data-driven and knowledge-based architecture. The semantics of activities, sub-activities and real-time object interaction are taken into consideration. The proposed framework consists of two main modules: the low-level sub-activity recognizer and the high-level activity recognizer. The first module detects sub-activities (i.e., actions or basic activities), taking input data directly from a depth sensor (Kinect). The main contribution of this thesis tackles the second component of the hybrid system, which lies on top of the first at a higher level of abstraction: it acquires its input from the first module's output and executes ontological inference to provide users, activities and their influence on the environment with semantics. This component is thus knowledge-based, and a fuzzy ontology was designed to model the high-level activities. Since activity recognition requires context-awareness and the ability to discriminate among activities in different environments, the semantic framework allows common-sense knowledge to be modelled as a rule-based system that supports expressions close to natural language in the form of fuzzy linguistic labels. The advantages of the framework have been evaluated on a challenging new public dataset, CAD-120, achieving accuracies of 90.1% and 91.1% for low-level and high-level activities, respectively. This is an improvement over both entirely data-driven approaches and merely ontology-based approaches. As an added value, so that the system is sufficiently simple and flexible to be managed by non-expert users, and thus to facilitate the transfer of research to industry, a development framework was created, composed of a programming toolbox, a hybrid crisp and fuzzy architecture, and graphical models to represent and configure human behaviour in Smart Spaces, providing the framework with more usability in the final application.
As a result, human behaviour recognition can help assist people with special needs, for example in healthcare, independent elderly living, remote rehabilitation monitoring, industrial process guideline control, and many other cases. This thesis shows use cases in these areas.
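As a minimal sketch of the kind of fuzzy rule evaluation such a rule-based system performs (the linguistic labels, numeric parameters and the rule itself are hypothetical examples, not taken from the thesis ontology):

```python
def triangular(a, b, c):
    """Triangular membership function for a fuzzy linguistic label."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Hypothetical linguistic labels over sensed quantities.
duration_long  = triangular(5.0, 15.0, 30.0)   # minutes near the stove
distance_close = triangular(0.0, 0.5, 1.5)     # metres from the object

def cooking_degree(minutes, metres):
    """Rule: IF duration is long AND distance is close THEN 'cooking'.
    Firing strength is the minimum of the antecedent memberships."""
    return min(duration_long(minutes), distance_close(metres))

print(cooking_degree(12.0, 0.4))  # degree to which 'cooking' is recognized
```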
Abstract:
Posiva Oy’s final disposal facility’s encapsulation plant will start to operate in the 2020s. Once operation starts, the facility is designed to run for more than a hundred years. The encapsulation plant will be the first of its kind in the world, part of the solution to the global issue of the final disposal of nuclear waste. In the encapsulation plant’s fuel handling cell, the spent nuclear fuel will be processed to be deposited in the Finnish bedrock, in ONKALO. In the fuel handling cell the environment is highly radioactive, forming a permit-required confined space. Remote observation is needed in order to monitor the fuel handling process. The purpose of this thesis is to map (Part I) and compare (Part II) remote observation methods for observing the process in Posiva Oy’s fuel handling cell, and to provide a possible theoretical solution for this case. A secondary purpose of this thesis is to provide resources for other remote observation cases, as well as to report on possible future technology to enable readiness in the design of the encapsulation plant. The approach was to analyze the mapped remote observation methods theoretically. Firstly, the methods were filtered by three environmental challenges: the high levels of radiation, the permit-required confined space, and the hundred-year timespan. Secondly, the most promising methods were selected by the experts designing the facility. Thirdly, a customized feasibility analysis was created and performed on the selected methods to rank them with scores. The results are the mapped methods and the feasibility analysis scores. The three highest-scoring methods were a radiation-tolerant camera, a fiberscope, and an audio feed. A combination of these three methods was given as a possible theoretical solution for this case. As this case is the first in the world, remote observation methods for it had not been thoroughly researched. The findings in this thesis will act as initial data for the design of the fuel handling cell’s remote observation systems and can potentially affect the overall design of the facility by providing unique and case-specific information. In addition, this thesis could provide resources for other remote observation cases.
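The customized feasibility analysis described above amounts to weighted multi-criteria scoring. The sketch below shows such a ranking; the criteria, weights and scores are hypothetical placeholders, not the values used in the thesis:

```python
# Hypothetical criteria weights for ranking remote observation methods.
WEIGHTS = {"radiation_tolerance": 0.4, "lifetime": 0.3,
           "confined_space_fit": 0.2, "maintainability": 0.1}

# Hypothetical per-criterion scores (0-10) for the three methods
# that the thesis reports as the highest scoring.
METHODS = {
    "radiation tolerant camera": {"radiation_tolerance": 8, "lifetime": 6,
                                  "confined_space_fit": 7, "maintainability": 7},
    "fiberscope":                {"radiation_tolerance": 9, "lifetime": 7,
                                  "confined_space_fit": 6, "maintainability": 5},
    "audio feed":                {"radiation_tolerance": 7, "lifetime": 8,
                                  "confined_space_fit": 8, "maintainability": 8},
}

def feasibility_score(scores):
    """Weighted sum of per-criterion scores for one method."""
    return sum(WEIGHTS[c] * s for c, s in scores.items())

# Rank the candidate methods from highest to lowest feasibility score.
for name, scores in sorted(METHODS.items(),
                           key=lambda kv: feasibility_score(kv[1]),
                           reverse=True):
    print(f"{name}: {feasibility_score(scores):.1f}")
```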