795 results for Fuzzy Set Theory
Abstract:
Advancing age is associated with several cognitive changes, including a decline in the ability to memorize and/or recall personally experienced events. In parallel, it brings an increase in false memories, i.e., the recall of events that did not actually take place. False memories can have significant repercussions in the daily lives of older adults, so it is important to better understand this phenomenon in normal aging. Studies have demonstrated the importance of medial temporal lobe function (MTL)/memory and frontal lobe function (FL)/executive functions in the false-memory effect. The first study of this thesis therefore aimed to validate a French adaptation of a method proposed by Glisky, Polster, & Routhieaux (1995) for measuring these cognitive functions (Chapter 2). The factor analysis in this study shows that the neuropsychological scores associated with memory group onto one factor, the MTL/memory factor, while those associated with executive functions group onto a second factor, the FL/executive-functions factor. Bootstrap analyses with 1,000 resamples demonstrate the stability of the results for the majority of the scores. The second study of the thesis aimed to clarify the cognitive (MTL/memory and FL/executive functions) and theoretical mechanisms underlying the increased false-memory effect in normal aging (Chapter 3). Fuzzy Trace Theory (FTT; Brainerd & Reyna, 1990) proposes explanations of the false-memory effect for which MTL/memory appears more important, whereas those proposed by Activation-Monitoring Theory (AMT; Roediger, Balota, & Watson, 2001) are more closely tied to FL/executive functions. The neuropsychological tests measuring MTL/memory and those measuring FL/executive functions were administered to 52 older participants (mean age 67.81 years). Based on the preceding validation study, a composite MTL/memory score and a composite FL/executive-functions score were computed for each participant. Participants were first split into two subgroups, one with a high MTL/memory score (n = 29, mean age 67.45 years) and one with a low MTL/memory score (n = 23, mean age 68.26 years), while statistically controlling for several variables, including the FL/executive-functions score. They were then split into two subgroups, one with a high FL/executive-functions score (n = 26, mean age 68.08 years) and one with a low FL/executive-functions score (n = 25, mean age 67.36 years), controlling for confounding variables, including the MTL/memory score. Proportions of true and false memory (targets and associative lures) were measured with a Deese-Roediger-McDermott paradigm (DRM; Deese, 1959; Roediger & McDermott, 1995), using recall and recognition combined with a "Remember/Know" procedure (Tulving, 1985), in the 52 older participants and in 22 young adults (mean age 24.59 years) matched for years of education. First, to test the FTT hypothesis (Brainerd & Reyna, 1990), these proportions were compared between the young adults and the two subgroups of older adults categorized by MTL/memory score.
Then, to test the AMT hypothesis (Roediger et al., 2001), these proportions were compared between the young adults and the two subgroups of older adults categorized by FL/executive-functions score. This is the first study to compare these hypotheses directly across numerous measures of true and false memory. The results show that only MTL/memory modulated the age effect on true memory and, somewhat indirectly, on false memory and on the relation between true and false recollection. The results further show that only FL/executive functions appear to play a role in the false recognition of associative lures. Moreover, age effects are present in false recall and false recollection of associative lures between young adults and older adults with high cognitive functioning, regardless of the cognitive function examined. These results suggest that factors other than MTL/memory and FL/executive functions must be identified to explain older adults' vulnerability to false memories. The results of this thesis are discussed in light of the theoretical and cognitive hypotheses about false memories (Chapter 4).
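To make the bootstrap step concrete, here is a minimal sketch of checking factor-loading stability over 1,000 resamples. The data are randomly generated placeholders, and the two-factor extraction is a plain SVD-based principal-component decomposition, not necessarily the exact procedure used in the thesis.

```python
# Minimal sketch of a bootstrap stability check for two-factor loadings.
# Hypothetical data: rows are participants, columns are neuropsychological scores.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.normal(size=(52, 8))        # placeholder for the real test scores

def two_factor_loadings(x):
    """Loadings of each score on the first two principal components."""
    z = (x - x.mean(0)) / x.std(0)
    _, s, vt = np.linalg.svd(z, full_matrices=False)
    load = vt[:2].T * s[:2] / np.sqrt(len(x))   # eigenvector * sqrt(eigenvalue)
    return load * np.sign(load.sum(0))          # fix the sign indeterminacy

n_boot = 1000
boot = np.empty((n_boot,) + two_factor_loadings(scores).shape)
for b in range(n_boot):
    sample = scores[rng.integers(0, len(scores), len(scores))]  # resample rows
    boot[b] = two_factor_loadings(sample)

# Stability: spread of each loading across the 1,000 resamples.
print(boot.std(axis=0))
```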
Abstract:
This paper highlights the prediction of learning disabilities (LD) in school-age children using rough set theory (RST), with an emphasis on the application of data mining. In rough sets, data analysis starts from a data table called an information system, which contains data about objects of interest characterized in terms of attributes. These attributes consist of the properties of learning disabilities. By finding the relationships between these attributes, the redundant attributes can be eliminated and the core attributes determined. Rule mining is also performed in rough sets using the LEM1 algorithm. The prediction of LD is carried out accurately using Rosetta, the rough set toolkit for analysis of data. The result obtained from this study is compared with the output of a similar study we conducted using a Support Vector Machine (SVM) with the Sequential Minimal Optimisation (SMO) algorithm. It is found that, using the concepts of reduct and global covering, we can easily predict learning disabilities in children.
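As an illustration of the reduct idea mentioned above, here is a minimal Python sketch on a toy decision table. The attributes and values are hypothetical stand-ins for the study's LD data, and the brute-force search below is not the LEM1 algorithm itself.

```python
# Minimal sketch of rough-set reduct finding on a toy decision table.
from itertools import combinations

rows = [                      # hypothetical (attributes..., decision) records
    {"reading": "low",  "memory": "low",  "attention": "low",  "LD": True},
    {"reading": "low",  "memory": "high", "attention": "low",  "LD": True},
    {"reading": "high", "memory": "high", "attention": "low",  "LD": False},
    {"reading": "high", "memory": "low",  "attention": "high", "LD": False},
]
conds = ["reading", "memory", "attention"]

def consistent(attrs):
    """True if rows agreeing on `attrs` never disagree on the decision."""
    seen = {}
    for r in rows:
        key = tuple(r[a] for a in attrs)
        if seen.setdefault(key, r["LD"]) != r["LD"]:
            return False
    return True

# A reduct is a minimal attribute subset that preserves consistency.
for k in range(1, len(conds) + 1):
    reducts = [c for c in combinations(conds, k) if consistent(c)]
    if reducts:
        print("reducts:", reducts)   # here: [('reading',)]
        break
```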
Abstract:
Ontic is an interactive system for developing and verifying mathematics. Ontic's verification mechanism is capable of automatically finding and applying information from a library containing hundreds of mathematical facts. Starting with only the axioms of Zermelo-Fraenkel set theory, the Ontic system has been used to build a database of definitions and lemmas leading to a proof of the Stone representation theorem for Boolean lattices. The Ontic system has also been used to explore issues in knowledge representation, automated deduction, and the automatic use of large databases.
Abstract:
The molecular basis of immune recognition and response lies in the presentation of antigenic peptides. Set theory and experimental data were used to produce a mathematical characterization of the peptide's central binding region by defining 8 rules associated with HLA class II binding. These rules were applied to 4 promiscuous peptides, 25 natural peptide sequences from the central region, of which 13 showed binding while the rest did not, and 19 synthetic peptides, with the aim of differentiating the peptides. With one exception, all binding and non-binding peptides were correctly characterized. This methodology may be useful for selecting key peptides in vaccine development.
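The article's eight binding rules are not reproduced here, but the screening logic can be sketched as set membership tests over positions of the peptide core. The two rules below are hypothetical stand-ins, not the published rules.

```python
# Minimal sketch of rule-based peptide screening via set membership.
HYDROPHOBIC = set("AVLIMFWY")        # amino acids treated as hydrophobic
AROMATIC = set("FWY")

def rule_anchor_p1(core):            # hypothetical rule: hydrophobic anchor at p1
    return core[0] in HYDROPHOBIC

def rule_no_aromatic_p6(core):       # hypothetical rule: no aromatic residue at p6
    return core[5] not in AROMATIC

RULES = [rule_anchor_p1, rule_no_aromatic_p6]

def binds(core):
    """A 9-mer core is predicted to bind if it satisfies every rule."""
    assert len(core) == 9
    return all(rule(core) for rule in RULES)

print(binds("LKAVDSGKT"))   # True: L at p1, S at p6
print(binds("GKAVDWGKT"))   # False: G at p1 fails the anchor rule
```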
Abstract:
This paper describes a human management model as conceived in organizations that carry out strategic staff management, based on a critical look at traditional management and some of its notions, such as the classical perspective of strategic direction and human resources management. The privileged theoretical framework is the epistemological ground of organizational theory and some of its sociological resources. In addition to the documentary review and the proposals of consulting experts, a set of graphics built on the basic logic of set theory, designed from the analysis of several Colombian organizations, is presented. The main finding is that despite the efforts of executives, consultants and scholars to build management models different from functionalist ones, the way these models have been reworked to make them more strategic has made them even more functionalist than the traditional approach. Strategic human management reproduces, with enormous power, the ideology of the macroeconomic model.
Abstract:
This article briefly presents the origin of the mathematization of economics and the field of mathematical economics. An initial historical approach divides the field into a first period called marginalist, a second in which set theory and linear models are used, and finally a period that integrates the two. It then analyzes the evolution of General Equilibrium Theory from Quesnay, through Walras and later developments, to its culmination in the work of Arrow, Debreu and their contemporaries. Finally, it describes the influence of mathematics, especially dynamic optimization, on macroeconomic theory and other areas of economics.
Abstract:
For the manager, decision making is one of the greatest challenges and responsibilities, since it requires choosing the soundest path among countless alternatives while taking into account the social, political and economic obstacles of the business environment. Reaching the right decision means keeping sight of the stated objectives and goals, and following a logical process: detecting, analyzing and justifying the reasons for the choice. Accordingly, the analysis proposed in this research contributes knowledge about the types of logic used in strategic decision making, helping managers meet the demands associated with marketing and thereby efficiently develop the competencies required for international insertion into an ever-growing labour market (Valero, 2011). The research develops a theoretical study to explain the relationship between logic and strategic marketing decisions and how these concepts combine to reach a final outcome. This is carried out through an analysis of marketing plans, starting from basic concepts such as marketing, logic, strategic decisions and marketing management, followed by the logical principles and the contradictions that may arise within the theoretical foundations.
Abstract:
The control and prediction of wastewater treatment plants poses an important goal: to avoid breaking the environmental balance by always keeping the system in stable operating conditions. It is known that qualitative information, coming from microscopic examinations and subjective remarks, has a deep influence on the activated sludge process, in particular on the total amount of effluent suspended solids, one of the measures of overall plant performance. The search for an input-output model of this variable and the prediction of sudden increases (bulking episodes) is thus a central concern in ensuring that current discharge limitations are met. Unfortunately, the strong interrelation between variables, their heterogeneity and the very high amount of missing information make the use of traditional techniques difficult, or even impossible. Through the combined use of several methods, mainly rough set theory and artificial neural networks, reasonable prediction models are found, which also serve to show the differing importance of the variables and provide insight into the process dynamics.
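A minimal sketch of the combined approach follows, on synthetic data: a crude relevance screen stands in for a real rough-set reduct computation, and a small hand-rolled neural network predicts a binary bulking indicator. Variable names, thresholds and data are all hypothetical.

```python
# Minimal sketch: rough-set-style attribute screening + a tiny neural net.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))                  # 6 candidate plant variables
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=200) > 0).astype(float)

# Crude relevance screen: keep variables whose discretized values still
# separate the outcome (a stand-in for a real reduct computation).
keep = [j for j in range(6)
        if abs(y[X[:, j] > 0].mean() - y[X[:, j] <= 0].mean()) > 0.1]
Xr = X[:, keep]

# One-hidden-layer network trained by plain gradient descent.
W1 = rng.normal(scale=0.5, size=(Xr.shape[1], 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8); b2 = 0.0
for _ in range(500):
    h = np.tanh(Xr @ W1 + b1)
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))       # bulking-episode probability
    g = (p - y) / len(y)                       # gradient of cross-entropy loss
    gh = np.outer(g, W2) * (1 - h ** 2)        # backprop through tanh
    W2 -= 0.5 * h.T @ g; b2 -= 0.5 * g.sum()
    W1 -= 0.5 * Xr.T @ gh; b1 -= 0.5 * gh.sum(0)

print("kept variables:", keep, "accuracy:", ((p > 0.5) == y).mean())
```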
Abstract:
Uncertainty plays a major part in the accuracy of a decision-making process, and its inconsistency is always difficult to resolve with existing decision-making tools. Entropy has proved useful for evaluating the inconsistency of uncertainty among different respondents. This study demonstrates an entropy-based financial decision support system called e-FDSS. The integrated system provides decision support for evaluating the attributes (funding options and multiple risks) present in projects. Fuzzy logic theory is included in the system to deal with the qualitative aspects of these options and risks, and an adaptive genetic algorithm (AGA) is employed to solve the decision algorithm in the system in order to assign optimal and consistent rates to these attributes. Seven simplified, parallel projects from a Hong Kong construction small and medium enterprise (SME) were assessed to evaluate the system. The results show that the system calculates risk-adjusted discount rates (RADR) of projects in an objective way, and these rates discount project cash flows impartially. Inconsistency of uncertainty is also successfully evaluated by means of the entropy method. Finally, the system identifies the favourable funding options managed by the SME Loan Guarantee Scheme (SGS). Based on these results, resource allocation can then be optimized and the best time to start a new project identified across the overall project life cycle.
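The two core ingredients can be sketched as follows. Shannon entropy is one standard way to score respondent inconsistency; the ratings, base rate and risk premium below are hypothetical, not taken from the surveyed SME projects.

```python
# Minimal sketch: entropy of respondent ratings + RADR-based discounting.
import math
from collections import Counter

def rating_entropy(ratings):
    """Shannon entropy of categorical ratings (0 bits = full agreement)."""
    n = len(ratings)
    return -sum(c / n * math.log2(c / n) for c in Counter(ratings).values())

risk_ratings = ["high", "high", "medium", "high", "medium"]  # 5 respondents
print("inconsistency:", rating_entropy(risk_ratings))        # ~0.97 bits

def npv(cash_flows, radr):
    """Net present value under a risk-adjusted discount rate."""
    return sum(cf / (1 + radr) ** t for t, cf in enumerate(cash_flows))

base_rate, risk_premium = 0.05, 0.04          # hypothetical RADR components
print("NPV:", round(npv([-1000, 400, 400, 400], base_rate + risk_premium), 2))
```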
Abstract:
Genetic algorithms (GAs) have been introduced into site layout planning in a number of reported studies. In those studies, the objective functions were defined so as to employ GAs in searching for the optimal site layout. However, few studies have investigated the actual closeness of the relationships between site facilities, even though it is these relationships that ultimately govern the site layout. This study determined that the underlying factors of site layout planning for medium-size projects include work flow, personnel flow, safety and environment, and personal preferences. By finding the weightings of these factors and the corresponding closeness indices between each pair of facilities, a closeness relationship was deduced. Two contemporary mathematical approaches, fuzzy logic theory and an entropy measure, were adopted in deriving these results in order to minimize the uncertainty and vagueness of the collected data and improve the quality of the information. GAs were then applied to search for the optimal site layout in a medium-size government project using the GeneHunter software, with an objective function that minimizes the total travel distance. An optimal layout was obtained within a short time, which shows that the application of GAs to site layout planning is highly promising and efficient.
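A minimal sketch of the GA step, assuming hypothetical closeness weights and a toy 3x2 grid of candidate slots; the study itself used the GeneHunter package rather than a hand-rolled loop like this one.

```python
# Minimal sketch of a GA assigning facilities to slots so that the
# closeness-weighted travel distance is minimized.
import random

random.seed(0)
N = 6                                   # facilities and candidate slots
slots = [(x, y) for x in range(3) for y in range(2)]
w = [[0] * N for _ in range(N)]         # closeness weights between facilities
for i in range(N):
    for j in range(i + 1, N):
        w[i][j] = random.randint(0, 5)  # hypothetical closeness indices

def cost(perm):                          # perm[i] = slot index of facility i
    return sum(w[i][j] * (abs(slots[perm[i]][0] - slots[perm[j]][0])
                          + abs(slots[perm[i]][1] - slots[perm[j]][1]))
               for i in range(N) for j in range(i + 1, N))

pop = [random.sample(range(N), N) for _ in range(40)]
for _ in range(200):
    pop.sort(key=cost)
    pop = pop[:20]                       # keep the fitter half
    while len(pop) < 40:
        child = pop[random.randrange(20)][:]
        a, b = random.sample(range(N), 2)
        child[a], child[b] = child[b], child[a]   # swap mutation
        pop.append(child)

best = min(pop, key=cost)
print("best layout:", best, "cost:", cost(best))
```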
Abstract:
A new autonomous ship collision-free (ASCF) trajectory navigation and control system is introduced, with a new recursive navigation algorithm based on analytic geometry and convex set theory for collision-free ship guidance. The underlying assumption is that geometric information about the ship's environment is available in the form of a polygon-shaped free space, which may be easily generated from a 2D image or from plots relating to physical hazards or other constraints such as collision avoidance regulations. The navigation command is given as a heading command sequence based on generating a waypoint that falls within a small neighborhood of the current position, and the sequence of waypoints along the trajectory is guaranteed, using convex set theory, to lie within a bounded obstacle-free region. A neurofuzzy network predictor, which in practice uses only observed input/output data generated by on-board sensors or external sensors (or a sensor fusion algorithm), based on using the rudder deflection angle to control the ship heading angle, is utilised in the simulation of an ESSO 190000 dwt tanker model to demonstrate the effectiveness of the system.
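The convex-set guarantee can be sketched as a point-in-convex-polygon test on candidate waypoints: a waypoint is accepted only if it lies inside the convex obstacle-free polygon. The polygon vertices and ship positions below are hypothetical.

```python
# Minimal sketch: accept a waypoint only if it stays in the convex free space.

def inside_convex(poly, p):
    """True if point p is inside the counter-clockwise convex polygon `poly`."""
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        # The cross product must be non-negative for every edge.
        if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) < 0:
            return False
    return True

free_space = [(0, 0), (10, 0), (12, 6), (5, 9), (0, 5)]   # CCW vertices

ship, goal = (2.0, 2.0), (9.0, 5.0)
step = 1.5                                # neighborhood radius for waypoints
d = ((goal[0] - ship[0]) ** 2 + (goal[1] - ship[1]) ** 2) ** 0.5
candidate = (ship[0] + step * (goal[0] - ship[0]) / d,
             ship[1] + step * (goal[1] - ship[1]) / d)
print(candidate, "accepted:", inside_convex(free_space, candidate))
```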
Abstract:
This article investigates the determinants of union inclusiveness towards agency workers in Western Europe, using an index which combines unionization rates with dimensions of collective agreements covering agency workers. Using fuzzy-set Qualitative Comparative Analysis, we identify two combinations of conditions leading to inclusiveness: the ‘Northern path’ includes high union density, high bargaining coverage and high union authority, and is consistent with the power resources approach. The ‘Southern path’ combines high union authority, high bargaining coverage, statutory regulations of agency work and working-class orientation, showing that ideology rather than institutional incentives shapes union strategies towards the marginal workforce.
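For readers unfamiliar with fuzzy-set QCA, its standard consistency and coverage measures can be sketched directly; the membership scores below are hypothetical cases, not the article's data.

```python
# Minimal sketch of fsQCA's consistency and coverage:
# consistency(X -> Y) = sum(min(x, y)) / sum(x), coverage = .. / sum(y).

# One membership score per country case: condition = high union density,
# outcome = inclusiveness towards agency workers.
density = [0.9, 0.8, 0.6, 0.2, 0.1]
inclusive = [1.0, 0.7, 0.6, 0.4, 0.3]

overlap = sum(min(x, y) for x, y in zip(density, inclusive))
consistency = overlap / sum(density)     # degree to which X is a subset of Y
coverage = overlap / sum(inclusive)      # how much of Y the condition covers
print(f"consistency={consistency:.2f}, coverage={coverage:.2f}")
```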
Abstract:
There is a family of well-known external clustering validity indexes for measuring the degree of compatibility or similarity between two hard partitions of a given data set, including partitions with different numbers of categories. A unified, fully equivalent set-theoretic formulation for an important class of such indexes was derived and extended to the fuzzy domain in a previous work by the author [Campello, R.J.G.B., 2007. A fuzzy extension of the Rand index and other related indexes for clustering and classification assessment. Pattern Recognition Lett., 28, 833-841]. However, that fuzzy set-theoretic formulation is not valid as a general approach for comparing two fuzzy partitions of data; rather, it is an approach for comparing a fuzzy partition against a hard referential partition of the data into mutually disjoint categories. In this paper, generalized external indexes for comparing two data partitions with overlapping categories are introduced. These indexes can be used as general measures for comparing two partitions of the same data set into overlapping categories. An important issue that is seldom touched on in the literature is also addressed, namely, how to compare two partitions of different subsamples of data. A number of pedagogical examples and three simulation experiments are presented and analyzed in detail, and a review of recent related work compiled from the literature is provided.
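A sketch in the spirit of such fuzzy set-theoretic indexes, not Campello's exact definition: pairwise co-membership degrees are computed with a max-min composition and compared across two fuzzy partitions. The membership matrices are hypothetical.

```python
# Minimal sketch of a fuzzy Rand-style agreement index between two fuzzy
# partitions, using min as intersection and max as union.
import numpy as np

def same_cluster_degree(U):
    """Degree to which objects i, j share a cluster: max_c min(u_ci, u_cj)."""
    return np.max(np.minimum(U[:, :, None], U[:, None, :]), axis=0)

def fuzzy_rand(U, V):
    sU, sV = same_cluster_degree(U), same_cluster_degree(V)
    iu = np.triu_indices(U.shape[1], k=1)        # distinct object pairs
    # Agreement: both partitions group the pair, or both separate it.
    agree = np.minimum(sU, sV) + np.minimum(1 - sU, 1 - sV)
    return agree[iu].mean()

# Hypothetical 2-cluster fuzzy memberships for 4 objects (columns sum to 1).
U = np.array([[0.9, 0.8, 0.2, 0.1],
              [0.1, 0.2, 0.8, 0.9]])
V = np.array([[0.7, 0.9, 0.3, 0.2],
              [0.3, 0.1, 0.7, 0.8]])
print(round(fuzzy_rand(U, V), 3))
```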
Abstract:
In this work, classifications were performed using bands 1 to 5 and 7 of the Landsat 5 TM (1987) and Landsat 7 ETM+ (2000) sensors. The spectral characterization of the materials was carried out in the laboratory using a spectroradiometer, and through bands 1 to 5 and 7 of the Landsat 5 TM (1987) and Landsat 7 ETM+ (2000) sensors. Transforming the multispectral data of remote sensing images is a way of reducing the data volume by identifying classes of interest in a digital image. In order to investigate ways of improving the classification of urban targets in digital images beyond established procedures such as Maximum Likelihood classification, a classifier based on fuzzy logic was chosen. The classifier used was the Fuzzy Set Membership classification (Fuzclass), one of a set of soft classifiers available in the Idrisi 32 software. Since information on the performance of this classifier in urban areas is scarce, tests were conducted comparing its results with the ground truth, represented by a high spatial resolution image from the QuickBird satellite. The test areas selected from this image meet the criterion of unchanged occupation conditions over the time interval considered. The comparison shows that the classifier has limitations in classifying urban areas because of the similar spectral behaviour of the materials making up this land cover. Using a single class to identify impervious areas was the solution adopted to overcome this obstacle. The use of test areas made it possible to fine-tune the choice of the degree of possibility of class presence in the pixel (PPCP). A comparison of the impervious-area classifications produced by the Maximum Likelihood and Fuzclass classifiers showed better performance by the fuzzy classifier, as a function of the PPCP level adjusted during the comparative Landsat-QuickBird analysis in the test areas. An alternative procedure for estimating impervious areas in urban catchments is presented at the end.
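A minimal sketch of fuzzy set membership classification with a PPCP-style acceptance threshold: each class gets a membership function around a reference spectral signature, and a pixel is labelled only if its best membership exceeds the threshold. The signatures, band values and threshold are hypothetical, and Fuzclass's actual membership function may differ.

```python
# Minimal sketch of per-pixel fuzzy membership classification.
import numpy as np

signatures = {                       # hypothetical mean reflectance per band
    "impervious": np.array([0.30, 0.28, 0.25]),
    "vegetation": np.array([0.05, 0.20, 0.40]),
}
width = 0.15                         # spread of the membership function

def membership(pixel, sig):
    """Membership in [0, 1], decaying with spectral distance to the signature."""
    d = np.linalg.norm(pixel - sig)
    return max(0.0, 1.0 - d / width)

pixel = np.array([0.28, 0.26, 0.27])
ppcp = 0.5                           # minimum possibility accepted
grades = {c: membership(pixel, s) for c, s in signatures.items()}
label = max(grades, key=grades.get)
print(grades, "->", label if grades[label] >= ppcp else "unclassified")
```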