940 results for rule-based


Relevância: 60.00%

Resumo:

In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting this data to transform it into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behavior from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.
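As a minimal illustration of the kind of rule-based fuzzy model the book builds on, here is a sketch of a zero-order Takagi-Sugeno system with triangular memberships; the rule base and all numbers are hypothetical, not the book's neurofuzzy algorithms:

```python
def tri(x, a, b, c):
    """Triangular membership function: peaks at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical rule base: "if x is LOW then y=0", "if x is MID then y=1", ...
rules = [
    ((-1.0, 0.0, 1.0), 0.0),   # LOW
    (( 0.0, 1.0, 2.0), 1.0),   # MID
    (( 1.0, 2.0, 3.0), 4.0),   # HIGH
]

def infer(x):
    """Output = firing-strength-weighted average of the rule consequents."""
    weights = [tri(x, *params) for params, _ in rules]
    total = sum(weights)
    if total == 0.0:
        return 0.0   # no rule fires
    return sum(w * out for w, (_, out) in zip(weights, rules)) / total
```

Data-based approaches fit the membership parameters and consequents to observed input-output pairs rather than fixing them by hand as done here.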

Relevância: 60.00%

Resumo:

Dual-system models suggest that English past tense morphology involves two processing routes: rule application for regular verbs and memory retrieval for irregular verbs (Pinker, 1999). In second language (L2) processing research, Ullman (2001a) suggested that both verb types are retrieved from memory, but more recently Clahsen and Felser (2006) and Ullman (2004) argued that past tense rule application can be automatised with experience by L2 learners. To address this controversy, we tested highly proficient Greek-English learners with naturalistic or classroom L2 exposure, compared to native English speakers, in a self-paced reading task involving past tense forms embedded in plausible sentences. Our results suggest that, irrespective of the type of exposure, proficient L2 learners with extended L2 exposure apply rule-based processing.

Relevância: 60.00%

Resumo:

This project is concerned with the way that illustrations, photographs, diagrams and graphs, and typographic elements interact to convey ideas on the book page. A framework for graphic description is proposed to elucidate this graphic language of ‘complex texts’. The model is built up from three main areas of study, with reference to a corpus of contemporary children’s science books. First, a historical survey puts the subjects for study in context. Then a multidisciplinary discussion of graphic communication provides a theoretical underpinning for the model; this leads to various proposals, such as the central importance of ratios and relationships among parts in creating meaning in graphic communication. Lastly, a series of trials in description contributes to the structure of the model itself. At the heart of the framework is an organising principle that integrates descriptive models from the fields of design, literary criticism, art history, and linguistics, among others, as well as novel categories designed specifically for book design. Broadly, design features are described in terms of elemental component parts (micro-level), larger groupings of these (macro-level), and finally in terms of overarching, ‘whole book’ qualities (meta-level). Various features of book design emerge at different levels; for instance, the presence of nested discursive structures, a form of graphic recursion in editorial design, is proposed at the macro-level. Across these three levels are the intersecting categories of ‘rule’ and ‘context’, offering different perspectives with which to describe graphic characteristics. Context-based features are contingent on social and cultural environment, the reader’s previous knowledge, and the actual conditions of reading; rule-based features relate to the systematic or codified aspects of graphic language.
The model aims to be a frame of reference for graphic description, of use in different forms of qualitative or quantitative research and as a heuristic tool in practice and teaching.

Relevância: 60.00%

Resumo:

Individual differences in cognitive style can be characterized along two dimensions: ‘systemizing’ (S, the drive to analyze or build ‘rule-based’ systems) and ‘empathizing’ (E, the drive to identify another's mental state and respond to this with an appropriate emotion). Discrepancies between these two dimensions in one direction (S > E) or the other (E > S) are associated with sex differences in cognition: on average more males show an S > E cognitive style, while on average more females show an E > S profile. The neurobiological basis of these different profiles remains unknown. Since individuals may be typical or atypical for their sex, it is important to move away from the study of sex differences and towards the study of differences in cognitive style. Using structural magnetic resonance imaging we examined how neuroanatomy varies as a function of the discrepancy between E and S in 88 adult males from the general population. Selecting just males allows us to study discrepant E-S profiles in a pure way, unconfounded by other factors related to sex and gender. An increasing S > E profile was associated with increased gray matter volume in cingulate and dorsal medial prefrontal areas which have been implicated in processes related to cognitive control, monitoring, error detection, and probabilistic inference. An increasing E > S profile was associated with larger hypothalamic and ventral basal ganglia regions which have been implicated in neuroendocrine control, motivation and reward. These results suggest an underlying neuroanatomical basis linked to the discrepancy between these two important dimensions of individual differences in cognitive style.

Relevância: 60.00%

Resumo:

Ensemble learning techniques generate multiple classifiers, so-called base classifiers, whose combined classification results are used in order to increase the overall classification accuracy. In most ensemble classifiers the base classifiers are based on the Top Down Induction of Decision Trees (TDIDT) approach. However, an alternative approach for the induction of rule-based classifiers is the Prism family of algorithms. Prism algorithms produce modular classification rules that do not necessarily fit into a decision tree structure. Prism classification rulesets achieve accuracy comparable with, and sometimes higher than, decision tree classifiers when the data is noisy and large. Yet Prism still suffers from overfitting on noisy and large datasets. In practice, ensemble techniques tend to reduce overfitting; however, no ensemble learner exists for modular classification rule inducers such as the Prism family of algorithms. This article describes the first development of an ensemble learner based on the Prism family of algorithms, aiming to enhance Prism’s classification accuracy by reducing overfitting.
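The combination of bootstrap resampling with a rule inducer can be sketched as follows; the one-threshold "rule learner" here is a deliberately toy stand-in for Prism, not the article's algorithm:

```python
import random

def learn_rule(data):
    """Toy single-threshold rule inducer: a simple stand-in for a
    Prism-style modular-rule learner (illustrative only)."""
    best_t, best_pol, best_acc = None, True, -1.0
    for t in sorted({x for x, _ in data}):
        for pol in (True, False):
            hits = sum((((x > t) if pol else (x <= t)) == bool(y)) for x, y in data)
            acc = hits / len(data)
            if acc > best_acc:
                best_t, best_pol, best_acc = t, pol, acc
    return best_t, best_pol

def predict(rule, x):
    t, pol = rule
    return int((x > t) if pol else (x <= t))

def bag_rules(data, n_rules=11, seed=0):
    """Bootstrap aggregating: train one rule on each bootstrap resample."""
    rng = random.Random(seed)
    return [learn_rule([rng.choice(data) for _ in data]) for _ in range(n_rules)]

def vote(rules, x):
    """Majority vote over the bagged base classifiers."""
    return int(sum(predict(r, x) for r in rules) * 2 > len(rules))
```

Because each base rule sees a different resample, individual overfitted rules are outvoted by the ensemble, which is the overfitting-reduction effect the article targets.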

Relevância: 60.00%

Resumo:

Anti-spoofing is attracting growing interest in biometrics, given the variety of fake materials and new means of attacking biometric recognition systems. New, unseen materials continuously challenge state-of-the-art spoofing detectors, calling for additional systematic approaches to anti-spoofing. By incorporating liveness scores into the biometric fusion process, recognition accuracy can be enhanced, but traditional sum-rule based fusion algorithms are known to be highly sensitive to single spoofed instances. This paper investigates 1-median filtering as a spoofing-resistant generalised alternative to the sum rule, targeting the problem of partial multibiometric spoofing where m out of n biometric sources to be combined are attacked. Augmenting previous work, this paper investigates the dynamic detection and rejection of liveness-recognition pair outliers for spoofed samples in a true multi-modal configuration, with its inherent challenge of normalisation. As a further contribution, a bootstrap aggregating (bagging) classifier for fingerprint spoof detection is presented. Experiments on the latest face video databases (Idiap Replay-Attack Database and CASIA Face Anti-Spoofing Database) and a fingerprint spoofing database (Fingerprint Liveness Detection Competition 2013) illustrate the efficiency of the proposed techniques.
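The robustness argument can be seen already with scalar scores, where the 1-median reduces to the ordinary median; the scores below are made up for illustration:

```python
def sum_rule(scores):
    """Classic sum-rule fusion: the mean of the matchers' scores."""
    return sum(scores) / len(scores)

def median_rule(scores):
    """Median fusion; for scalar scores the 1-median is the ordinary median."""
    s = sorted(scores)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

genuine = [0.20, 0.25, 0.22]   # three matchers, genuine attempt
spoofed = [0.20, 0.25, 0.99]   # same attempt, but one matcher is spoofed

# The single spoofed instance drags the sum rule from ~0.22 up to 0.48,
# while the median moves only from 0.22 to 0.25.
```

A single attacked source can shift the mean arbitrarily far, but cannot move the median past the remaining genuine scores, which is the intuition behind median-based fusion for partial spoofing.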

Relevância: 60.00%

Resumo:

Delineation of commuting regions has always been based on statistical units, often municipalities or wards. However, using these units has certain disadvantages, as their land areas differ considerably. Much information is lost in the larger spatial base units, and distortions occur in self-containment values, the main criterion in rule-based delineation procedures. Alternatively, one can start from relatively small standard-size units such as hexagons. In this way, much greater detail in spatial patterns is obtained. In this paper, regions are built by means of intrazonal maximization (Intramax) on the basis of hexagons. The use of geoprocessing tools, specifically developed for the processing of commuting data, speeds up processing time considerably. The results of the Intramax analysis are evaluated with travel-to-work area constraints, and comparisons are made with commuting fields, accessibility to employment, commuting flow density and network commuting flow size. From selected steps in the regionalization process, a hierarchy of nested commuting regions emerges, revealing the complexity of commuting patterns.
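A single step of Intramax-style merging can be sketched as follows, choosing the pair of zones whose mutual flows are largest relative to their totals; the objective here is a simplified stand-in for the published Intramax criterion, and the flows are invented:

```python
def intramax_step(flows):
    """flows: dict {(i, j): commuting flow from zone i to zone j}.
    Returns the pair of zones to merge next, scoring each pair by its
    observed flows relative to the product of the zones' totals
    (a simplified version of the Intramax objective)."""
    zones = sorted({z for pair in flows for z in pair})
    out_tot = {i: sum(flows.get((i, j), 0) for j in zones) for i in zones}
    in_tot = {j: sum(flows.get((i, j), 0) for i in zones) for j in zones}
    best, best_val = None, -1.0
    for i in zones:
        for j in zones:
            if i >= j:
                continue
            val = 0.0
            if out_tot[i] and in_tot[j]:
                val += flows.get((i, j), 0) / (out_tot[i] * in_tot[j])
            if out_tot[j] and in_tot[i]:
                val += flows.get((j, i), 0) / (out_tot[j] * in_tot[i])
            if val > best_val:
                best, best_val = (i, j), val
    return best
```

Repeating the step, each time replacing the merged pair by a single aggregated zone, yields the hierarchy of nested regions the paper describes; small uniform hexagons make the flow matrix fine-grained at the bottom of that hierarchy.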

Relevância: 60.00%

Resumo:

HydroShare is an online, collaborative system being developed for open sharing of hydrologic data and models. The goal of HydroShare is to enable scientists to easily discover and access hydrologic data and models, retrieve them to their desktop or perform analyses in a distributed computing environment that may include grid, cloud or high performance computing model instances as necessary. Scientists may also publish outcomes (data, results or models) into HydroShare, using the system as a collaboration platform for sharing data, models and analyses. HydroShare is expanding the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated, creating new capability to share models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. One of the fundamental concepts in HydroShare is that of a Resource. All content is represented using a Resource Data Model that separates system and science metadata and has elements common to all resources as well as elements specific to the types of resources HydroShare will support. These will include different data types used in the hydrology community and models and workflows that require metadata on execution functionality. The HydroShare web interface and social media functions are being developed using the Drupal content management system. A geospatial visualization and analysis component enables searching, visualizing, and analyzing geographic datasets. The integrated Rule-Oriented Data System (iRODS) is being used to manage federated data content and perform rule-based background actions on data and model resources, including parsing to generate metadata catalog information and the execution of models and workflows. 
This presentation will introduce the HydroShare functionality developed to date, describe key elements of the Resource Data Model and outline the roadmap for future development.
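The Resource concept, with system metadata common to all resources and science metadata specific to each resource type, might be sketched as follows; the field names are hypothetical, not HydroShare's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class SystemMetadata:
    """Elements common to every resource (identity, ownership, dates)."""
    resource_id: str
    owner: str
    created: str   # ISO 8601 timestamp

@dataclass
class Resource:
    """A content item: shared system metadata plus type-specific
    science metadata kept in its own namespace."""
    system: SystemMetadata
    resource_type: str
    science_metadata: dict = field(default_factory=dict)

r = Resource(
    system=SystemMetadata("abc123", "jdoe", "2014-05-01T12:00:00Z"),
    resource_type="TimeSeries",
    science_metadata={"variable": "streamflow", "units": "m3/s"},
)
```

Separating the two metadata layers lets the system index and manage all resources uniformly while each data type carries its own domain description.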

Relevância: 60.00%

Resumo:

Due to the increase in water demand and hydropower energy, it is becoming increasingly important to operate hydraulic structures efficiently while sustaining multiple demands. In particular, companies, governmental agencies and consultancies require effective, practical integrated tools and decision support frameworks to operate reservoirs, cascades of run-of-river plants and related elements such as canals, by merging hydrological and reservoir simulation/optimization models with various numerical weather predictions and radar and satellite data. Model performance depends strongly on the streamflow forecast, its associated uncertainty, and how that uncertainty is considered in decision making. While deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on an earlier developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, the main water supply of Kocaeli City in the northwestern part of Turkey. The reservoir is a typical example with its limited capacity, downstream channel restrictions and high snowmelt potential. Mesoscale Model 5 and Ensemble Prediction System data are used as the main inputs, and flow forecasts are produced for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is implemented with HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future. Thus, compared with the deterministic approach, it provides the operator with risk ranges in terms of spillway discharges and reservoir level.
The framework is fully data-driven, practical and useful to the profession, and the approach can be transferred to other similar reservoir systems.
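The way an ensemble of inflow scenarios turns into operator risk ranges can be sketched with a toy mass balance; the numbers and the model are purely illustrative, not the HEC-ResSim setup used in the study:

```python
def simulate(storage, inflows, demand, capacity):
    """Daily mass balance: add inflow, release demand, spill any excess."""
    spilled = 0.0
    for q in inflows:
        storage = max(storage + q - demand, 0.0)
        if storage > capacity:
            spilled += storage - capacity
            storage = capacity
    return storage, spilled

# Hypothetical 5-member ensemble of 3-day inflow scenarios (hm3/day)
ensemble = [
    [2.0, 2.5, 3.0],
    [2.0, 3.5, 5.0],
    [1.5, 1.5, 2.0],
    [3.0, 4.0, 6.0],
    [2.5, 2.5, 2.5],
]
results = [simulate(storage=48.0, inflows=member, demand=1.0, capacity=50.0)
           for member in ensemble]
levels = sorted(level for level, _ in results)
spills = sorted(spill for _, spill in results)
# The spread across members is the operator's risk range: here every
# member ends at full storage, while spill varies from 0.0 to 8.0 hm3.
```

A deterministic forecast would give a single (level, spill) trajectory; the ensemble exposes the range of outcomes the operator must hedge against.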

Relevância: 60.00%

Resumo:

In this article we estimate and simulate an open-economy rational-expectations macroeconomic model (Batini and Haldane [4]) for the Brazilian economy, in order to identify the characteristics of optimal monetary rules and the short-run dynamics they generate. We work with a forward-looking and a backward-looking version of the model to compare the performance of three parameterizations of monetary rules, which differ in the inflation variable used: the traditional Taylor rule, based on past inflation; a rule combining inflation and the real exchange rate (see Ball [5]); and a rule based on inflation forecasts (see Bank of England [3]). We solve the model numerically and construct efficient frontiers over the variances of output and inflation through stochastic simulations, for i.i.d. or correlated shocks. The sets of optimal rules for the two versions are qualitatively distinct. Given the uncertainty about the degree of forward-lookingness, we suggest choosing rules by the sum of the objective functions across the two versions. We conclude that the rules chosen by this criterion have moderate losses relative to the optimal rules, but prevent the larger losses that would result from choosing a rule based on the wrong version. Finally, we compute impulse response functions of the two models for selected rules, in order to evaluate how different monetary rules alter the short-run dynamics of the two models.
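The traditional Taylor rule, the first of the three parameterizations compared, can be sketched in its textbook form; the 0.5 coefficients below are the usual illustrative values, not the paper's estimates:

```python
def taylor_rate(inflation, target, neutral_real, output_gap,
                a_pi=0.5, a_y=0.5):
    """Textbook Taylor rule: i = r* + pi + a_pi*(pi - pi*) + a_y*gap,
    raising the nominal rate when inflation exceeds target or output
    is above potential."""
    return (neutral_real + inflation
            + a_pi * (inflation - target) + a_y * output_gap)

# 6% past inflation, 4% target, 3% neutral real rate, +1% output gap
rate = taylor_rate(0.06, 0.04, 0.03, 0.01)   # about 10.5% nominal
```

The other two rules in the comparison replace the past-inflation term with a combination of inflation and the real exchange rate, or with an inflation forecast.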

Relevância: 60.00%

Resumo:

Vague words and expressions are present throughout the standards that comprise the accounting and auditing professions. Vagueness is considered a significant source of inexactness in many accounting decision problems, and many authors have argued that neglecting this issue may make accounting information less useful. On the other hand, we can assume that the use of vague terms in accounting standards is inherent to principle-based standards (as opposed to rule-based standards) and that, to avoid vague terms, standard setters would have to incur excessive transaction costs. Auditors are required to exercise their own professional judgment throughout the audit process, and it has been argued that the inherent vagueness in accounting standards may influence their decision making processes. The main objective of this paper is to analyze the decision making process of auditors and to investigate whether vague accounting standards create a problem for auditors' decision making or lead to a better outcome. This paper argues that vague standards prompt the use of System 2 type processing by auditors, allowing more comprehensive analytical thinking and therefore reducing the biases associated with System 1 heuristic processing. If our argument is valid, the repercussions of vague accounting standards are not as negative as presented in previous literature; instead, they are positive.

Relevância: 60.00%

Resumo:

In this paper we construct sunspot equilibria that arise from chaotic deterministic dynamics. These equilibria are robust and therefore observable. We prove that they may be learned by a simple rule based on the histograms of past state variables. This work gives a theoretical justification for deterministic models that might compete with stochastic models in explaining real data.
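The object such a learning rule conditions on can be illustrated with a standard chaotic map (the logistic map, chosen here for illustration, not necessarily the paper's dynamics): the histogram of past states of the deterministic orbit approximates a stable invariant distribution, which for this map is the arcsine density, roughly [1/3, 1/6, 1/6, 1/3] over four equal bins:

```python
def logistic(x):
    """Chaotic deterministic dynamics on [0, 1]."""
    return 4.0 * x * (1.0 - x)

def histogram(series, bins=4):
    """Relative frequencies of the series over `bins` equal bins on [0, 1]."""
    counts = [0] * bins
    for x in series:
        counts[min(int(x * bins), bins - 1)] += 1
    return [c / len(series) for c in counts]

x, series = 0.123456789, []
for _ in range(100_000):
    series.append(x)
    x = logistic(x)

freqs = histogram(series)   # close to [1/3, 1/6, 1/6, 1/3]
```

Although every step is deterministic, an agent tracking the histogram of past states recovers a well-defined frequency distribution, which is what allows the data to look statistically indistinguishable from a stochastic process.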

Relevância: 60.00%

Resumo:

Of all the ICMS (state value-added tax) collected by the Brazilian states, 25% is distributed to municipalities. The states are responsible for defining the distribution rules for 25% of these 25% of the ICMS transferred to municipalities; the remaining 75% follow the Fiscal Value Added criterion. Some states have changed their laws so that distribution depends on the municipalities' performance in certain areas, in order to encourage them to improve in pursuit of a larger share of the ICMS. Following this logic is the state of Ceará, where 100% of the ICMS distributed under state rules is calculated from the municipalities' performance on outcome indicators in education (72%), health (20%) and the environment (8%). This study aims to estimate the effect that the change in Ceará's ICMS distribution law had on education outcome indicators: IDEB and Prova Brasil. To this end, the difference-in-differences method was used, through the construction of control and treatment groups. Thus, the evolution of performance before and after the change was compared between Ceará municipalities and similar municipalities in neighboring states not subject to the same ICMS distribution rule. As a complement, two further analyses were carried out, separating the municipalities of Ceará into winners and losers of ICMS resources under the new law, and into those with the best and worst GDP per capita. The results indicate positive impacts on the performance of Ceará municipalities in both IDEB and Prova Brasil. Even the municipalities that lost resources with the change in the distribution rules improved their performance in education. The poorest municipalities in the state, which perform worse than the richest ones, raised their performance, reducing the proficiency gap relative to the richer municipalities.
In this sense, there is evidence that the change in the ICMS law implemented by the state of Ceará had positive impacts on municipal performance in IDEB and Prova Brasil.
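The 25%-of-25% split and Ceará's indicator weights can be checked with simple arithmetic:

```python
total_icms = 100.0                  # arbitrary amount collected by the state
municipal_pool = 0.25 * total_icms  # 25.0 distributed to municipalities
vaf_share = 0.75 * municipal_pool   # 18.75 by the Fiscal Value Added criterion
rule_share = 0.25 * municipal_pool  # 6.25 by the state's own rules
                                    # (i.e. 6.25% of all ICMS collected)

# In Ceará the rule-based slice is split by indicator area:
education = 0.72 * rule_share       # 4.5
health = 0.20 * rule_share          # 1.25
environment = 0.08 * rule_share     # 0.5
```

So the performance incentive operates on 6.25% of total ICMS, with education carrying by far the largest weight.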

Relevância: 60.00%

Resumo:

Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the price of the portfolio's assets declines, using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical-control distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data from the main global equity market indexes of the United States, London, Germany, Spain and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as Historical Simulation, GARCH, EWMA and Extreme Value Theory, and state-of-the-art evolving approaches. The results show that ePFM is a potential candidate for VaR modeling, with better performance than the alternative approaches.
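Historical Simulation, the simplest of the VaR benchmarks mentioned, can be sketched as the empirical quantile of past losses; the return series below is invented:

```python
def hs_var(returns, alpha=0.95):
    """Historical Simulation VaR: the alpha-quantile of past losses
    (plain order-statistic version, no interpolation)."""
    losses = sorted(-r for r in returns)          # losses = negated returns
    k = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[k]

# Ten invented daily portfolio returns
returns = [0.01, -0.02, 0.005, -0.03, 0.015,
           -0.01, 0.02, -0.005, 0.0, -0.015]
var_95 = hs_var(returns)   # with 10 points, the 95% quantile is the
                           # worst historical loss: 0.03 (3%)
```

Model-based approaches such as the paper's ePFM instead fit a dynamic model to the return series, aiming to adapt the quantile estimate as volatility regimes change.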

Relevância: 60.00%

Resumo:

This work examines the social representation of television held by mothers/educators as TV viewers, in order to understand the meaning of this medium in their everyday lives and the relations that occur between teachers and students in the classroom. The study focuses on the educational role of television, approached from a social representations perspective. Through the discourses of five educators engaged in pedagogic activities in public elementary schools of the city of Natal, it seeks to reveal a significant experience in the field of media education. It is also a way of understanding how the representations educators hold about television can contribute to critical analysis of the meaning of the media in teacher education. Some questions underlying the investigation were: What is television for educators who are also TV viewers? How does it reach the classroom? Does their relation with the media interfere in their pedagogic practice? Assuming that verbal technique is one of the formal ways to access representations, the methodological strategy employed was the open interview, guided by a broad and flexible schedule that left the interviewees free to expose their ideas; this attitude was adopted to avoid imposing the interviewer's points of view, and it resulted in rich material. Through this strategy it was possible to confirm or reject assumptions raised at the beginning of the investigation and to modify some directions of the research plan. The study takes as its theoretical presupposition the contribution of the Mexican researcher Guillermo Orozco Gómez who, building on the ideas of Paulo Freire and Jesús Martín-Barbero, establishes a dialogue between popular education and communication theories, mainly television reception, by developing an integral view focused on the audience: the multiple mediations model. The school, like the family, is an important mediator of media information.
The relationship teachers establish with television, and their representations of it in their own lives, reflects directly on their professional practice and on the dialogue about media within the school; through the educators' mediation, it can contribute to the critical reflection students develop in relation to the media.