940 results for Equivalence-preserving
Abstract:
This master's thesis introduces fuzzy tolerance/equivalence relations and their application in cluster analysis. The work presents the construction of fuzzy equivalence relations using increasing generators. We investigate the role of increasing generators in the creation of intersection, union, and complement operators. The objective is to develop different varieties of fuzzy tolerance/equivalence relations from different varieties of increasing generators. Finally, we perform a comparative study of these varieties of fuzzy tolerance/equivalence relations in their application to a clustering method.
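As an illustration of the kind of construction the abstract refers to, the sketch below builds fuzzy union, intersection, and complement operators from an increasing generator, in the style of the standard additive-generator characterization; the specific generator g(a) = a^w and all helper names are illustrative assumptions, not taken from the thesis.

    def make_operators(g, g_inv):
        """Build fuzzy operators from an increasing generator g on [0, 1] with g(0) = 0,
        using the pseudo-inverse that clips its argument to the range [0, g(1)]."""
        g1 = g(1.0)

        def pseudo_inv(x):
            return g_inv(min(max(x, 0.0), g1))

        def union(a, b):          # t-conorm: S(a, b) = g^(-1)(g(a) + g(b))
            return pseudo_inv(g(a) + g(b))

        def intersection(a, b):   # t-norm:   T(a, b) = g^(-1)(g(a) + g(b) - g(1))
            return pseudo_inv(g(a) + g(b) - g1)

        def complement(a):        # c(a) = g^(-1)(g(1) - g(a))
            return g_inv(g1 - g(a))

        return union, intersection, complement

    # Example with the Yager-style generator g(a) = a**w (w = 2 is an arbitrary choice).
    w = 2.0
    S, T, C = make_operators(lambda a: a**w, lambda x: x**(1.0 / w))
    print(S(0.6, 0.7), T(0.6, 0.7), C(0.6))   # roughly 0.92, 0.0, 0.8

Different choices of the generator g yield different operator families, which is the kind of variety the comparative clustering study examines.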
Abstract:
The objective of this study was to develop pitanga nectar formulations in which sucrose was replaced with different sweeteners. Consumer tests were conducted with 50 fruit juice consumers, and a just-about-right scale was used to determine the ideal pulp dilution and the ideal sweetness with sucrose. The concentrations of six sweeteners required to match the sweetness of sucrose were then determined with 19 selected assessors using the magnitude estimation method. The ideal dilution test resulted in 25% pulp, and the ideal sweetness test in 10% sucrose. The sweetener concentrations needed to replace sucrose were 0.0160%, 0.0541%, 0.1000%, 0.0999%, 0.0017%, and 0.0360% for sucralose, aspartame, stevia with 40% rebaudioside A, stevia with 95% rebaudioside A, neotame, and a 2:1 cyclamate/saccharin blend, respectively. These results can be used to prepare pitanga nectar with different sweeteners that matches the sweetness intensity of sucrose-sweetened nectar while being less caloric.
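As a rough derived illustration (computed here from the figures above, not stated in the abstract), the equisweet concentrations imply relative sweetness potencies with respect to the 10% sucrose reference:

    # Derived illustration: potency relative to the 10% sucrose reference.
    equisweet = {"sucralose": 0.0160, "aspartame": 0.0541,
                 "stevia 40% reb A": 0.1000, "stevia 95% reb A": 0.0999,
                 "neotame": 0.0017, "cyclamate/saccharin 2:1": 0.0360}
    potency = {name: 10.0 / conc for name, conc in equisweet.items()}
    # e.g. sucralose ~625x, aspartame ~185x, neotame ~5900x as sweet as sucrose.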
Abstract:
This thesis focuses on the private membership test (PMT) problem and presents three single-server protocols to solve it. In the presented solutions, a client can perform an inclusion test for some record x in a server's database without revealing the record. Moreover, after executing the protocols, the contents of the server's database remain secret. In each solution, a different cryptographic protocol is used to construct a privacy-preserving variant of a Bloom filter. The three solutions differ slightly from each other in terms of both privacy and complexity; their use cases are therefore different, and no single one is clearly the best of the three. We describe software implementations of the three protocols, illustrated with pseudocode. The performance of our implementation is measured in a real-world scenario. This thesis is a spin-off from the Academy of Finland research project "Cloud Security Services".
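For context, the sketch below is a plain, non-private Bloom filter of the kind such protocols build on; the privacy-preserving variants described in the abstract wrap a structure like this in cryptographic protocols that are not shown here, and the parameters and hashing scheme are illustrative assumptions.

    import hashlib

    class BloomFilter:
        """Minimal Bloom filter: k hash positions over m bits, no privacy protection."""
        def __init__(self, m=1024, k=4):
            self.m, self.k = m, k
            self.bits = bytearray(m)          # one byte per bit, for simplicity

        def _positions(self, item):
            for i in range(self.k):
                h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
                yield int(h, 16) % self.m

        def add(self, item):
            for p in self._positions(item):
                self.bits[p] = 1

        def might_contain(self, item):
            # False positives are possible; false negatives are not.
            return all(self.bits[p] for p in self._positions(item))

    bf = BloomFilter()
    bf.add("record-x")
    print(bf.might_contain("record-x"), bf.might_contain("record-y"))  # True, (almost surely) False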
Abstract:
The present study evaluated the use of stimulus equivalence in teaching monetary skills to school-aged children with autism. An AB within-subject design with periodic probes was used. At pretest, three participants demonstrated relation DA, an auditory-visual relation (matching dictated coin values to printed coin prices). Using a three-choice match-to-sample procedure with a multi-component intervention package, these participants were taught two relations, BA (matching coins to printed prices) and CA (matching coin combinations to printed prices). Two participants achieved positive tests of equivalence, and the third participant demonstrated emergent performances with a symmetric and a transitive relation. In addition, two participants were able to generalize the learned skills with a parent in a second, naturalistic setting. The present research replicates and extends the results of previous studies by demonstrating that stimulus equivalence can be used to teach an adaptive skill to children with autism.
Abstract:
Stimulus equivalence involves teaching two conditional discriminations that share one stimulus in common and testing all possible conditional discriminations not taught (Saunders & Green, 1999). Despite considerable research in the laboratory, applied studies of stimulus equivalence have been limited (Vause, Martin, Marion, & Sakko, 2005). This study investigated the field effectiveness of stimulus equivalence in teaching reading skills to children with autism. Participants were four children with autism receiving centre-based intensive behavioural intervention (IBI) treatment. Three of the participants, who already matched pictures to their dictated names, demonstrated six to eight additional emergent performances after being taught only to match written words to the same names. One participant struggled with the demands of the study, and his participation was discontinued. Results suggest that stimulus equivalence provided an effective and efficient teaching strategy for three of the four participants in this study.
Abstract:
We consider the problem of provision and cost-sharing of multiple public goods. The efficient equal factor equivalent allocation rule makes every agent indifferent between what he receives and the opportunity of choosing the bundle of public goods subject to the constraint of paying r times its cost, where r is set as low as possible.
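Read literally, the rule described above admits the following formalization; the symbols u_i, c, t_i, y are introduced here only for illustration and are not taken from the paper. Let y be the selected bundle of public goods, t_i agent i's cost share (with sum_i t_i = c(y)), and u_i(z, p) agent i's utility from bundle z at payment p. Then

    u_i(y, t_i) = max over bundles z of u_i(z, r * c(z))   for every agent i,

with the common factor r chosen as small as possible consistent with efficiency of the allocation.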
Abstract:
The use of formal methods is increasingly common in software development, and type systems are the most successful formal method. The advancement of formal methods presents new challenges as well as new opportunities. One challenge is to ensure that a compiler preserves the semantics of programs, so that the properties guaranteed about the source code also hold for the executable code. This thesis presents a compiler that translates a higher-order functional language with polymorphism into a typed assembly language, whose main property is that type preservation is verified automatically by means of type annotations on the compiler's code. Our compiler implements the code transformations essential for a higher-order functional language, namely CPS conversion, closure conversion, and code generation. We present the details of the strongly typed representations of the intermediate languages and the constraints they impose on the implementation of the code transformations. Our objective is to guarantee type preservation with a minimum of annotations, and without compromising the general modularity and readability of the compiler's code. This objective is largely achieved in the treatment of the core features of the language (the "simple types"), in contrast to the treatment of polymorphism, which still requires substantial work to satisfy type checking.
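To give a concrete picture of one of the transformations mentioned above, here is a minimal, untyped call-by-value CPS conversion for a toy lambda calculus; it only illustrates the shape of the transformation, whereas the thesis works with strongly typed intermediate representations whose type preservation is checked automatically. The AST and helper names below are assumptions introduced for illustration.

    from dataclasses import dataclass

    # Toy lambda-calculus AST (illustrative; not the thesis's typed representation).
    @dataclass
    class Var: name: str
    @dataclass
    class Lam: param: str; body: object
    @dataclass
    class App: fun: object; arg: object

    _counter = 0
    def fresh(prefix):
        global _counter
        _counter += 1
        return f"{prefix}{_counter}"

    def cps(e, k):
        """Call-by-value CPS transform; `k` is an AST term for the current continuation."""
        if isinstance(e, Var):
            return App(k, e)
        if isinstance(e, Lam):
            kv = fresh("k")    # the translated function takes an extra continuation argument
            return App(k, Lam(e.param, Lam(kv, cps(e.body, Var(kv)))))
        if isinstance(e, App):
            f, x = fresh("f"), fresh("x")
            return cps(e.fun, Lam(f, cps(e.arg, Lam(x, App(App(Var(f), Var(x)), k)))))
        raise TypeError(e)

    # Example: CPS-convert (\x. x) y with the identity continuation \a. a.
    print(cps(App(Lam("x", Var("x")), Var("y")), Lam("a", Var("a"))))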
Abstract:
Birkhoff's ergodic theorem tells us about the convergence of sequences of functions. We study the convergence, in mean and almost everywhere, of these sequences, but in the case where the sequence of times is a strictly increasing sequence of positive integers. We then define uniform sequences and study almost-everywhere convergence along such sequences. We also examine whether there exist sequences along which convergence fails. We present a result, due in part to Alexandra Bellow, showing that such sequences exist. Finally, we prove an equivalence between the notion of a strongly mixing transformation and the convergence of a certain sequence involving "weights" that satisfy certain properties.
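For reference, a sketch of the objects involved in standard notation (the notation is assumed here, not taken from the thesis): for a measure-preserving transformation T of a probability space (X, B, mu) and f in L^1(mu), Birkhoff's theorem concerns the ergodic averages

    (1/N) * sum_{n=0}^{N-1} f(T^n x)  ->  f*(x)    almost everywhere and in L^1,

while the questions studied here concern the analogous averages taken along a strictly increasing sequence (a_n) of positive integers,

    (1/N) * sum_{n=1}^{N} f(T^{a_n} x),

possibly with weights w_n multiplying the terms.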
Abstract:
This paper contributes to the study of Freely Rewriting Restarting Automata (FRR-automata) and Parallel Communicating Grammar Systems (PCGS), both of which are useful models in computational linguistics. For PCGS we study two complexity measures, called 'generation complexity' and 'distribution complexity', and we prove that a PCGS Pi for which both the generation complexity and the distribution complexity are bounded by constants can be transformed into a freely rewriting restarting automaton of a very restricted form. From this characterization it follows that the language L(Pi) generated by Pi is semi-linear, that its characteristic analysis is of polynomial size, and that this analysis can be computed in polynomial time.
Abstract:
In the first part of this paper we show a similarity between the principle of Structural Risk Minimization (SRM) (Vapnik, 1982) and the idea of Sparse Approximation, as defined in Chen, Donoho and Saunders (1995) and Olshausen and Field (1996). We then focus on two specific (approximate) implementations of SRM and Sparse Approximation that have been used to solve the problem of function approximation. For SRM we consider the Support Vector Machine technique proposed by V. Vapnik and his team at AT&T Bell Labs, and for Sparse Approximation we consider a modification of the Basis Pursuit De-Noising algorithm proposed by Chen, Donoho and Saunders (1995). We show that, under certain conditions, these two techniques are equivalent: they give the same solution and they require the solution of the same quadratic programming problem.
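As a rough reminder of the two objective functions being compared (standard textbook forms; the exact notation and conditions of the paper may differ), Support Vector regression minimizes a regularized epsilon-insensitive loss, while Basis Pursuit De-Noising minimizes an l1-penalized least-squares criterion:

    SVM regression:   min_f  (1/2) ||f||_K^2 + C * sum_i max(|y_i - f(x_i)| - epsilon, 0)
    BPDN:             min_c  (1/2) ||y - Phi c||_2^2 + lambda * ||c||_1

Under the conditions identified in the paper, both reduce to the same quadratic program.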
Abstract:
Abstract taken from the publication.
Abstract:
Abstract taken from the publication.
Abstract:
Abstract taken from the publication.