819 results for rule-based algorithms
Abstract:
A new approach is presented to identify the number of incoming signals in antenna array processing. The method exploits the inherent properties of the noise eigenvalues of the covariance matrix of the array output. A single threshold is established that incorporates information about signal and noise strength, data length, and array size. When subspace-based algorithms are adopted, the computational cost of the signal number detector is almost negligible. The performance of the threshold is robust against low SNR and short data length.
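To make the eigenvalue-based detection concrete, here is a minimal NumPy sketch, assuming a simple random-matrix-style threshold on the noise eigenvalues; the paper's actual threshold formula (which also uses signal and noise strength) is not reproduced here.

```python
# A minimal sketch of eigenvalue-based signal-number detection. The threshold
# below, margin * noise_var * (1 + sqrt(M/N))^2, is an illustrative assumption
# inspired by random matrix theory, not the paper's published formula.
import numpy as np

def count_signals(X, noise_var, margin=1.5):
    """Estimate the number of incoming signals from array snapshots X (M x N)."""
    M, N = X.shape
    R = X @ X.conj().T / N                 # sample covariance of the array output
    eigvals = np.linalg.eigvalsh(R)[::-1]  # real eigenvalues, descending
    # Noise eigenvalues cluster near noise_var; count those clearly above
    # the largest value a pure-noise eigenvalue would plausibly reach.
    threshold = margin * noise_var * (1 + np.sqrt(M / N)) ** 2
    return int(np.sum(eigvals > threshold))

# Toy check: M = 8 sensors, N = 200 snapshots, 2 unit-power plane waves.
rng = np.random.default_rng(0)
M, N = 8, 200
angles = np.deg2rad([10.0, 40.0])
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(angles)))   # steering matrix
S = (rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = A @ S + noise
print(count_signals(X, noise_var=0.01))   # expected output: 2
```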
Abstract:
Dual-system models suggest that English past tense morphology involves two processing routes: rule application for regular verbs and memory retrieval for irregular verbs (Pinker, 1999). In second language (L2) processing research, Ullman (2001a) suggested that both verb types are retrieved from memory, but more recently Clahsen and Felser (2006) and Ullman (2004) argued that past tense rule application can be automatised with experience by L2 learners. To address this controversy, we tested highly proficient Greek-English learners with naturalistic or classroom L2 exposure, compared to native English speakers, in a self-paced reading task involving past tense forms embedded in plausible sentences. Our results suggest that, irrespective of the type of exposure, proficient L2 learners with extended L2 exposure apply rule-based processing.
Abstract:
This project is concerned with the way that illustrations, photographs, diagrams and graphs, and typographic elements interact to convey ideas on the book page. A framework for graphic description is proposed to elucidate this graphic language of ‘complex texts’. The model is built up from three main areas of study, with reference to a corpus of contemporary children’s science books. First, a historical survey puts the subjects for study in context. Then a multidisciplinary discussion of graphic communication provides a theoretical underpinning for the model; this leads to various proposals, such as the central importance of ratios and relationships among parts in creating meaning in graphic communication. Lastly, a series of trials in description contributes to the structure of the model itself. At the heart of the framework is an organising principle that integrates descriptive models from fields of design, literary criticism, art history, and linguistics, among others, as well as novel categories designed specifically for book design. Broadly, design features are described in terms of elemental component parts (micro-level), larger groupings of these (macro-level), and finally in terms of overarching, ‘whole book’ qualities (meta-level). Various features of book design emerge at different levels; for instance, the presence of nested discursive structures, a form of graphic recursion in editorial design, is proposed at the macro-level. Across these three levels are the intersecting categories of ‘rule’ and ‘context’, offering different perspectives with which to describe graphic characteristics. Context-based features are contingent on social and cultural environment, the reader’s previous knowledge, and the actual conditions of reading; rule-based features relate to the systematic or codified aspects of graphic language. The model aims to be a frame of reference for graphic description, of use in different forms of qualitative or quantitative research and as a heuristic tool in practice and teaching.
Abstract:
Individual differences in cognitive style can be characterized along two dimensions: ‘systemizing’ (S, the drive to analyze or build ‘rule-based’ systems) and ‘empathizing’ (E, the drive to identify another's mental state and respond to this with an appropriate emotion). Discrepancies between these two dimensions in one direction (S > E) or the other (E > S) are associated with sex differences in cognition: on average more males show an S > E cognitive style, while on average more females show an E > S profile. The neurobiological basis of these different profiles remains unknown. Since individuals may be typical or atypical for their sex, it is important to move away from the study of sex differences and towards the study of differences in cognitive style. Using structural magnetic resonance imaging we examined how neuroanatomy varies as a function of the discrepancy between E and S in 88 adult males from the general population. Selecting just males allows us to study discrepant E-S profiles in a pure way, unconfounded by other factors related to sex and gender. An increasing S > E profile was associated with increased gray matter volume in cingulate and dorsal medial prefrontal areas which have been implicated in processes related to cognitive control, monitoring, error detection, and probabilistic inference. An increasing E > S profile was associated with larger hypothalamic and ventral basal ganglia regions which have been implicated in neuroendocrine control, motivation and reward. These results suggest an underlying neuroanatomical basis linked to the discrepancy between these two important dimensions of individual differences in cognitive style.
Abstract:
Tropical Applications of Meteorology Using Satellite and Ground-Based Observations (TAMSAT) rainfall estimates are used extensively across Africa for operational rainfall monitoring and food security applications; thus, regional evaluations of TAMSAT are essential to ensure its reliability. This study assesses the performance of TAMSAT rainfall estimates, along with the African Rainfall Climatology (ARC), version 2; the Tropical Rainfall Measuring Mission (TRMM) 3B42 product; and the Climate Prediction Center morphing technique (CMORPH), against a dense rain gauge network over a mountainous region of Ethiopia. Overall, TAMSAT exhibits good skill in detecting rainy events but underestimates rainfall amount, while ARC underestimates both rainfall amount and rainy event frequency. Meanwhile, TRMM consistently performs best in detecting rainy events and capturing the mean rainfall and seasonal variability, while CMORPH tends to overdetect rainy events. Moreover, the mean difference in daily rainfall between the products and rain gauges shows increasing underestimation with increasing elevation. However, the distribution in satellite–gauge differences demonstrates that although 75% of retrievals underestimate rainfall, up to 25% overestimate rainfall over all elevations. Case studies using high-resolution simulations suggest underestimation in the satellite algorithms is likely due to shallow convection with warm cloud-top temperatures in addition to beam-filling effects in microwave-based retrievals from localized convective cells. The overestimation by IR-based algorithms is attributed to nonraining cirrus with cold cloud-top temperatures. These results stress the importance of understanding regional precipitation systems causing uncertainties in satellite rainfall estimates with a view toward using this knowledge to improve rainfall algorithms.
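As an illustration of the kind of gauge-based evaluation described, here is a hedged sketch of standard contingency-table skill scores and mean bias; the 1 mm wet-day threshold and variable names are assumptions, not TAMSAT's published validation protocol.

```python
# Point-to-pixel validation sketch: probability of detection (POD), false
# alarm ratio (FAR), and mean bias of daily satellite rainfall vs. gauges.
import numpy as np

def validate(sat, gauge, wet_threshold=1.0):
    """Compare daily satellite estimates against co-located gauge totals (mm)."""
    sat, gauge = np.asarray(sat, float), np.asarray(gauge, float)
    sat_wet, gauge_wet = sat >= wet_threshold, gauge >= wet_threshold
    hits = np.sum(sat_wet & gauge_wet)
    misses = np.sum(~sat_wet & gauge_wet)
    false_alarms = np.sum(sat_wet & ~gauge_wet)
    return {
        "POD": hits / (hits + misses),                # detection skill
        "FAR": false_alarms / (hits + false_alarms),  # over-detection
        "mean_bias": float(np.mean(sat - gauge)),     # mm/day; negative = underestimate
    }

print(validate([0.0, 3.2, 1.5, 0.0, 8.0], [0.2, 4.0, 0.0, 2.5, 9.1]))
```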
Abstract:
BACKGROUND: Optical spectroscopy is a noninvasive technique with potential applications for diagnosis of oral dysplasia and early cancer. In this study, we evaluated the diagnostic performance of a depth-sensitive optical spectroscopy (DSOS) system for distinguishing dysplasia and carcinoma from non-neoplastic oral mucosa. METHODS: Patients with oral lesions and volunteers without any oral abnormalities were recruited to participate. Autofluorescence and diffuse reflectance spectra of selected oral sites were measured using the DSOS system. A total of 424 oral sites in 124 subjects were measured and analyzed, including 154 sites in 60 patients with oral lesions and 270 sites in 64 normal volunteers. Measured optical spectra were used to develop computer-based algorithms to identify the presence of dysplasia or cancer. Sensitivity and specificity were calculated using a gold standard of histopathology for patient sites and clinical impression for normal volunteer sites. RESULTS: Differences in oral spectra were observed in: (1) neoplastic versus non-neoplastic sites, (2) keratinized versus nonkeratinized tissue, and (3) shallow versus deep depths within oral tissue. Algorithms based on spectra from 310 nonkeratinized anatomic sites (buccal, tongue, floor of mouth, and lip) yielded an area under the receiver operating characteristic curve of 0.96 in the training set and 0.93 in the validation set. CONCLUSIONS: The ability to selectively target epithelial and shallow stromal depth regions appeared to be diagnostically useful. For nonkeratinized oral sites, the sensitivity and specificity of this objective diagnostic technique were comparable to those of clinical diagnosis by expert observers. Thus, DSOS has potential to augment oral cancer screening efforts in community settings. Cancer 2009;115:1669-79. (C) 2009 American Cancer Society.
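A hedged sketch of the evaluation step, with logistic regression standing in for the paper's unspecified classifier and synthetic spectra in place of the measured data:

```python
# Train/validation AUC computation for a spectra-based classifier. Everything
# here is a placeholder: the feature matrix is random, and logistic regression
# is an assumption, not the DSOS algorithm itself.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_sites, n_wavelengths = 310, 50
X = rng.standard_normal((n_sites, n_wavelengths))   # stand-in for measured spectra
y = (X[:, :5].sum(axis=1) + rng.standard_normal(n_sites) > 0).astype(int)  # 1 = neoplastic

X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("training AUC:  ", roc_auc_score(y_tr, clf.predict_proba(X_tr)[:, 1]))
print("validation AUC:", roc_auc_score(y_va, clf.predict_proba(X_va)[:, 1]))
```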
Abstract:
Delineation of commuting regions has always been based on statistical units, often municipalities or wards. However, using these units has certain disadvantages, as their land areas differ considerably. Much information is lost in the larger spatial base units, and distortions occur in self-containment values, the main criterion in rule-based delineation procedures. Alternatively, one can start from relatively small standard-size units such as hexagons. In this way, much greater detail in spatial patterns is obtained. In this paper, regions are built by means of intrazonal maximization (Intramax) on the basis of hexagons. The use of geoprocessing tools, specifically developed for the processing of commuting data, speeds up processing time considerably. The results of the Intramax analysis are evaluated with travel-to-work area constraints, and comparisons are made with commuting fields, accessibility to employment, commuting flow density, and network commuting flow size. From selected steps in the regionalization process, a hierarchy of nested commuting regions emerges, revealing the complexity of commuting patterns.
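For readers unfamiliar with Intramax, a pared-down sketch of the intrazonal-maximization step: repeatedly merge the pair of zones whose mutual flow is largest relative to what their row and column totals would predict (after Masser and Brown's objective). Hexagon geometry and contiguity constraints are omitted here for brevity.

```python
# Simplified Intramax-style hierarchical aggregation of a commuting flow matrix.
import numpy as np

def intramax(T, n_regions):
    """Aggregate the zones of flow matrix T until n_regions remain."""
    T = np.asarray(T, dtype=float).copy()
    labels = [[i] for i in range(T.shape[0])]
    while len(labels) > n_regions:
        O, D = T.sum(axis=1), T.sum(axis=0)   # outflow and inflow totals
        best, pair = -np.inf, None
        for i in range(len(labels)):
            for j in range(i + 1, len(labels)):
                if O[i] * D[j] == 0 or O[j] * D[i] == 0:
                    continue
                score = T[i, j] / (O[i] * D[j]) + T[j, i] / (O[j] * D[i])
                if score > best:
                    best, pair = score, (i, j)
        i, j = pair
        T[i, :] += T[j, :]                    # merge zone j into zone i:
        T[:, i] += T[:, j]                    # combined flows become intrazonal
        T = np.delete(np.delete(T, j, axis=0), j, axis=1)
        labels[i] += labels.pop(j)
    return labels

flows = np.array([[0, 50, 2, 1], [40, 0, 3, 2], [1, 2, 0, 30], [2, 1, 25, 0]])
print(intramax(flows, 2))   # -> [[0, 1], [2, 3]]
```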
Abstract:
HydroShare is an online, collaborative system being developed for open sharing of hydrologic data and models. The goal of HydroShare is to enable scientists to easily discover and access hydrologic data and models, retrieve them to their desktop or perform analyses in a distributed computing environment that may include grid, cloud or high performance computing model instances as necessary. Scientists may also publish outcomes (data, results or models) into HydroShare, using the system as a collaboration platform for sharing data, models and analyses. HydroShare is expanding the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated, creating new capability to share models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. One of the fundamental concepts in HydroShare is that of a Resource. All content is represented using a Resource Data Model that separates system and science metadata and has elements common to all resources as well as elements specific to the types of resources HydroShare will support. These will include different data types used in the hydrology community and models and workflows that require metadata on execution functionality. The HydroShare web interface and social media functions are being developed using the Drupal content management system. A geospatial visualization and analysis component enables searching, visualizing, and analyzing geographic datasets. The integrated Rule-Oriented Data System (iRODS) is being used to manage federated data content and perform rule-based background actions on data and model resources, including parsing to generate metadata catalog information and the execution of models and workflows. This presentation will introduce the HydroShare functionality developed to date, describe key elements of the Resource Data Model and outline the roadmap for future development.
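A minimal sketch of the Resource concept described above, separating system metadata (common to all resources) from science metadata (with type-specific elements); the field names are illustrative assumptions, not the actual HydroShare schema.

```python
# Hypothetical Resource Data Model: system metadata vs. science metadata.
from dataclasses import dataclass, field

@dataclass
class SystemMetadata:
    resource_id: str
    owner: str
    created: str            # ISO-8601 timestamp
    resource_type: str      # e.g. "timeseries", "model", "workflow"

@dataclass
class ScienceMetadata:
    title: str
    abstract: str
    keywords: list = field(default_factory=list)
    extras: dict = field(default_factory=dict)  # type-specific elements,
                                                # e.g. model execution metadata

@dataclass
class Resource:
    system: SystemMetadata
    science: ScienceMetadata
    files: list = field(default_factory=list)

r = Resource(
    SystemMetadata("abc123", "alice", "2014-06-01T00:00:00Z", "model"),
    ScienceMetadata("Rainfall-runoff model", "A toy example.", ["hydrology"],
                    {"executable": "run.sh"}),
)
print(r.system.resource_type, "-", r.science.title)
```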
Abstract:
Due to the increase in water demand and hydropower energy, it is becoming more important to operate hydraulic structures efficiently while sustaining multiple demands. In particular, companies, governmental agencies, and consulting offices require effective, practical integrated tools and decision support frameworks to operate reservoirs, cascades of run-of-river plants, and related elements such as canals, by merging hydrological and reservoir simulation/optimization models with various numerical weather predictions and radar and satellite data. Model performance is highly dependent on the streamflow forecast, its uncertainty, and how that uncertainty is considered in decision making. While deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on a previously developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, the main water supply of Kocaeli City in northwestern Turkey. The reservoir is a typical example given its limited capacity, downstream channel restrictions, and high snowmelt potential. Mesoscale Model 5 and Ensemble Prediction System data are used as the main inputs, and flow forecasts are produced for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is implemented with HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future and thus provides the operator with risk ranges for spillway discharges and reservoir level, compared with the deterministic approach. The framework is fully data driven, applicable, and useful to the profession, and the knowledge can be transferred to other similar reservoir systems.
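A toy sketch of how an ensemble of streamflow scenarios yields risk ranges for spillway discharge and reservoir level: a simple rule-based mass balance is run once per ensemble member. The release rule and all numbers are made up, not the Yuvacık operating policy.

```python
# Ensemble-driven reservoir simulation: one rule-based run per inflow scenario.
import numpy as np

def simulate(inflows, storage0=50.0, capacity=100.0, demand=5.0):
    """Daily mass balance: release the demand, spill anything above capacity."""
    storage, spill = storage0, 0.0
    for q in inflows:
        storage += q - demand
        if storage > capacity:          # rule: spill the excess
            spill += storage - capacity
            storage = capacity
        storage = max(storage, 0.0)
    return storage, spill

rng = np.random.default_rng(1)
ensemble = rng.gamma(shape=2.0, scale=4.0, size=(50, 10))  # 50 scenarios x 10 days
results = np.array([simulate(member) for member in ensemble])
levels, spills = results[:, 0], results[:, 1]
print(f"end storage, 5-95% range: {np.percentile(levels, [5, 95])}")
print(f"P(spill > 0) = {np.mean(spills > 0):.2f}")   # risk of spillway discharge
```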
Abstract:
In this paper we estimate and simulate an open-economy rational expectations macroeconomic model (Batini and Haldane [4]) for the Brazilian economy, with the objective of identifying the characteristics of optimal monetary rules and the short-run dynamics they generate. We work with a forward-looking version and a backward-looking version in order to compare the performance of three parameterizations of monetary rules, which differ in their inflation variable: the traditional Taylor rule, based on past inflation; a rule combining inflation and the real exchange rate (see Ball [5]); and a rule using inflation forecasts (see Bank of England [3]). We solve the model numerically and construct efficient frontiers with respect to the variances of output and inflation through stochastic simulations, for i.i.d. or correlated shocks. The sets of optimal rules for the two versions are qualitatively distinct. Given the uncertainty about the degree of forward-lookingness, we suggest choosing rules by the sum of the objective functions across the two versions. We conclude that the rules chosen under this criterion incur moderate losses relative to the optimal rules, but prevent the larger losses that would result from choosing a rule based on the wrong version. Finally, we compute impulse response functions of the two models for selected rules, in order to assess how different monetary rules alter the short-run dynamics of the two models.
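A hedged sketch of the three rule parameterizations being compared, with textbook coefficients standing in for the estimated optimal values:

```python
# Three monetary rule parameterizations differing in their inflation variable.
def taylor(pi_past, y_gap, r_neutral=4.0, pi_target=4.0, a=1.5, b=0.5):
    """Traditional Taylor rule: react to past inflation and the output gap."""
    return r_neutral + pi_past + a * (pi_past - pi_target) + b * y_gap

def ball(pi_past, y_gap, real_fx_gap, c=0.3, **kw):
    """Ball [5]-style rule: also respond to the real exchange rate gap."""
    return taylor(pi_past, y_gap, **kw) + c * real_fx_gap

def inflation_forecast(pi_forecast, r_neutral=4.0, pi_target=4.0, a=1.5):
    """Bank of England [3]-style rule: react to the inflation forecast."""
    return r_neutral + pi_forecast + a * (pi_forecast - pi_target)

print(taylor(pi_past=6.0, y_gap=1.0))                   # 13.5
print(ball(pi_past=6.0, y_gap=1.0, real_fx_gap=-2.0))   # 12.9
print(inflation_forecast(pi_forecast=5.0))              # 10.5
```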
Abstract:
Vague words and expressions are present throughout the standards that comprise the accounting and auditing professions. Vagueness is considered a significant source of inexactness in many accounting decision problems, and many authors have argued that neglecting this issue may make accounting information less useful. On the other hand, we can assume that the use of vague terms in accounting standards is inherent to principle-based standards (as opposed to rule-based standards) and that, to avoid vague terms, standard setters would have to incur excessive transaction costs. Auditors are required to exercise their own professional judgment throughout the audit process, and it has been argued that the inherent vagueness in accounting standards may influence their decision making processes. The main objective of this paper is to analyze the decision making process of auditors and to investigate whether vague accounting standards create a problem for that process or lead to a better outcome. This paper argues that vague standards prompt the use of System 2 type processing by auditors, allowing more comprehensive analytical thinking and therefore reducing the biases associated with System 1 heuristic processing. If our argument is valid, the repercussions of vague accounting standards are not as negative as presented in previous literature; instead, they are positive.
Abstract:
In this paper we construct sunspot equilibria that arise from chaotic deterministic dynamics. These equilibria are robust and therefore observable. We prove that they may be learned by a simple rule based on the histograms of past state variables. This work gives a theoretical justification for deterministic models that might compete with stochastic models to explain real data.
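A minimal sketch of learning from histograms of past state variables, with the chaotic tent map standing in for the deterministic dynamics; all specifics are illustrative rather than the paper's construction.

```python
# Learn beliefs about the state distribution from a histogram of past states.
import numpy as np

def tent(x, mu=1.99):
    """Chaotic deterministic dynamics: the tent map on [0, 1]."""
    return mu * min(x, 1 - x)

# Generate a trajectory, then estimate the invariant law from its histogram.
x, path = 0.2, []
for _ in range(20000):
    x = tent(x)
    path.append(x)

counts, edges = np.histogram(path, bins=20, range=(0.0, 1.0))
beliefs = counts / counts.sum()   # learned probability of each state bin
print(np.round(beliefs, 3))       # roughly uniform: the tent map's invariant law
```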
Abstract:
Of all the ICMS (state value-added tax) collected by Brazilian states, 25% is distributed to municipalities. The states are responsible for defining the distribution rules for 25% of this 25% of ICMS transferred to municipalities; the remaining 75% follows the Fiscal Value Added criterion. Some states have changed their laws so that distribution depends on municipal performance in certain areas, with the aim of encouraging municipalities to improve their performance in pursuit of a larger share of ICMS. The state of Ceará follows this logic: 100% of the ICMS distributed under state rules is calculated from municipal performance on outcome indicators in education (72%), health (20%), and the environment (8%). This study aims to estimate the effect that the change in Ceará's ICMS distribution law had on education outcome indicators: IDEB and Prova Brasil. To this end, the double-difference method was used, with treatment and control groups. The evolution of performance before and after the change was compared between Ceará municipalities and similar municipalities in neighboring states not subject to the same ICMS distribution rule. As a complement, two further analyses were carried out, separating Ceará municipalities into winners and losers of ICMS resources under the new law, and into those with the highest and lowest GDP per capita. The results indicate positive impacts on the performance of Ceará municipalities in both IDEB and Prova Brasil. Even municipalities that lost resources with the change in distribution rules improved their performance in education. The poorest municipalities in the state, which perform worse than the richest ones, raised their performance, narrowing the proficiency gap relative to the richest municipalities. In this sense, there is evidence that the change in the ICMS law implemented by the state of Ceará generated positive impacts on municipal performance in IDEB and Prova Brasil.
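A worked sketch of the double-difference (difference-in-differences) computation, on made-up IDEB scores, to make the identification strategy concrete:

```python
# DiD: (change in treated group) - (change in control group). All data invented.
import numpy as np

treated_before = np.array([3.8, 4.0, 3.5, 4.2])   # Ceará municipalities, pre-law
treated_after  = np.array([4.6, 4.9, 4.3, 5.0])   # same municipalities, post-law
control_before = np.array([3.9, 4.1, 3.6, 4.0])   # similar neighboring-state munis
control_after  = np.array([4.2, 4.4, 3.9, 4.3])

did = ((treated_after.mean() - treated_before.mean())
       - (control_after.mean() - control_before.mean()))
print(f"DiD estimate of the law's effect: {did:.2f} IDEB points")   # 0.53
```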
Abstract:
Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the price of the portfolio's assets declines, using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical control distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data from the main global equity market indexes of the United States, London, Germany, Spain, and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as Historical Simulation, GARCH, EWMA, and Extreme Value Theory, and state-of-the-art evolving approaches. The results show that ePFM is a potential candidate for VaR modeling, with better performance than alternative approaches.
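A minimal sketch of the Historical Simulation benchmark named above (ePFM itself, with its possibilistic clustering machinery, is not reproduced here); the returns are simulated, not the equity index data used in the paper.

```python
# Historical Simulation VaR: the 99% one-day VaR is the empirical 1% quantile
# of past daily returns, reported as a positive loss number.
import numpy as np

rng = np.random.default_rng(7)
returns = rng.standard_t(df=4, size=2500) * 0.01   # ~10 years of fat-tailed daily returns

def historical_var(returns, level=0.99):
    """One-day Value-at-Risk as a fraction of portfolio value."""
    return -np.quantile(returns, 1 - level)

print(f"99% 1-day VaR: {historical_var(returns):.4f}")
```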
Abstract:
This work examines the social representation of television held by mothers/educators as TV viewers, in order to understand the meaning of this medium in their everyday lives and the relations that arise between teachers and students in the classroom. The purpose of the study is the educational role of television, approached through social representations theory. Through the discourses of five educators engaged in pedagogical activities in the public elementary schools of the city of Natal, it seeks to reveal a significant experience in the progress of the media education field. It is also a way to understand how the representations educators hold about television can contribute to supporting critical reflection and analysis of the meaning of media in teacher education. Some questions underlay the investigation: What is television for educators who are also TV viewers? How does it reach the classroom? Does their relationship with the media interfere with their pedagogical practice? Assuming that verbal technique is one of the formal ways to access representations, the methodological strategy employed was the open interview, guided by a broad and flexible schedule, leaving the interviewees free to expose their ideas; this attitude, adopted to avoid imposing the interviewer's points of view, resulted in rich material. Through this strategy it was possible to confirm or reject presumptions raised at the beginning of the investigation and to modify some lines of the research plan. The study takes as its theoretical presupposition the contribution of the Mexican researcher Guillermo Orozco Gómez, who, building on the ideas of Paulo Freire and Jesús Martín-Barbero, establishes a dialogue between popular education and communication theories, mainly television reception, by developing an integral view focused on the audience, the multiple mediations model. The school, like the family, is an important mediator of media information. The relationship teachers establish with television, and their representations of it in their own lives, reflects directly on their professional practice and on the dialogue with media within the school, and can contribute to the critical reflection that students establish with the media through the educators' mediation.