5 results for context processing
in the Institutional Repository of UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
CONTEXT AND OBJECTIVE: Children and adolescents living in socially vulnerable conditions present a range of health problems. Nevertheless, claims about the existence of cognitive and/or sensory alterations remain controversial. The objective of this study was to investigate aspects of auditory processing by applying brainstem auditory evoked potential (BAEP) tests and behavioral auditory processing assessment in street children, compared with a control group. DESIGN AND SETTING: Cross-sectional study at the Auditory Processing Laboratory, Faculdade de Medicina da Universidade de São Paulo. METHODS: The auditory processing tests were applied to a group of 27 individuals, subdivided into groups of 11 children (7 to 10 years old) and 16 adolescents (11 to 16 years old) of both sexes living in socially vulnerable conditions, and compared with a control group of 21 individuals, subdivided into groups of 10 children and 11 adolescents, matched for age and without complaints. BAEP tests were also applied to investigate the integrity of the auditory pathway. RESULTS: For both age ranges, significant differences between the study and control groups were found for most of the tests applied; the study group performed statistically worse than the control group on all tests except the pediatric speech intelligibility test. Only one child presented an altered BAEP result. CONCLUSIONS: The results demonstrated worse performance by the study group (children and adolescents) on the behavioral auditory processing tests, even though they presented auditory pathway integrity at the brainstem level, as demonstrated by normal BAEP results.
Abstract:
A body of research has developed within the context of nonlinear signal and image processing that deals with the automatic, statistical design of digital window-based filters. Based on pairs of ideal and observed signals, a filter is designed in an effort to minimize the error between the ideal and filtered signals. The goodness of an optimal filter depends on the relation between the ideal and observed signals, but the goodness of a designed filter also depends on the amount of sample data from which it is designed. In order to lessen the design cost, a filter is often chosen from a given class of filters, thereby constraining the optimization and increasing the error of the optimal filter. To a great extent, the problem of filter design concerns striking the correct balance between the degree of constraint and the design cost. From a different perspective and in a different context, the problem of constraint versus sample size has been a major focus of study within the theory of pattern recognition. This paper discusses the design problem for nonlinear signal processing, shows how the issue naturally transitions into pattern recognition, and then provides a review of salient related pattern-recognition theory. In particular, it discusses classification rules, constrained classification, the Vapnik-Chervonenkis theory, and implications of that theory for morphological classifiers and neural networks. The paper closes by discussing some design approaches developed for nonlinear signal processing, and how their nature naturally leads to a decomposition of the error of a designed filter into a sum of the following components: the Bayes error of the unconstrained optimal filter, the cost of constraint, the cost of reducing complexity by compressing the original signal distribution, the design cost, and the contribution of prior knowledge to a decrease in the error.
The main purpose of the paper is to present fundamental principles of pattern recognition theory within the framework of active research in nonlinear signal processing.
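The design-from-samples idea this abstract describes can be illustrated with a short sketch. The following is a minimal illustration, not code from the paper: it estimates a binary window filter from pairs of ideal and observed signals by the plug-in rule, i.e., for each observed window pattern it outputs the ideal value most often seen with that pattern in the training data, which minimizes the empirical error. The function names and the majority-vote tie-breaking are assumptions made for the example.

```python
from collections import defaultdict

def design_window_filter(ideal, observed, width=3):
    """Estimate a binary window filter from ideal/observed signal pairs.

    For each observed window pattern, record how often the ideal value
    at the window center is 0 or 1, then output the majority value
    (the plug-in estimate minimizing empirical error)."""
    half = width // 2
    counts = defaultdict(lambda: [0, 0])  # pattern -> [ideal-0 count, ideal-1 count]
    for y, x in zip(ideal, observed):
        for i in range(half, len(x) - half):
            pattern = tuple(x[i - half:i + half + 1])
            counts[pattern][y[i]] += 1
    # Decision table: the more frequent ideal value per pattern (ties go to 1).
    return {p: (1 if c[1] >= c[0] else 0) for p, c in counts.items()}

def apply_filter(table, signal, width=3, default=0):
    """Slide the designed window over a signal; unseen patterns get `default`."""
    half = width // 2
    out = list(signal)
    for i in range(half, len(signal) - half):
        out[i] = table.get(tuple(signal[i - half:i + half + 1]), default)
    return out
```

Trained on a single pair where the observed signal has one corrupted sample, the designed filter restores the ideal signal; with richer training data the table approximates the optimal filter for the signal distribution, which is the setting the paper analyzes.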
Abstract:
The post-processing of association rules is a difficult task, since a huge number of the rules generated are of no interest to the user. Many approaches have been developed to overcome this problem, such as objective measures and clustering. However, objective measures neither reduce nor organize the collection of rules, making it difficult to understand the domain. On the other hand, clustering neither reduces the exploration space nor directs the user toward interesting knowledge, making the search for relevant knowledge harder. In this context, this paper presents the PAR-COM methodology, which, by combining clustering and objective measures, reduces the association rule exploration space and directs the user to what is potentially interesting. An experimental study demonstrates the potential of PAR-COM to minimize the user's effort during post-processing. © 2012 Springer-Verlag.
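PAR-COM itself is not reproduced here, but the objective-measure side of association-rule post-processing can be sketched. The snippet below computes three standard textbook measures (support, confidence, lift) for a rule over a set of transactions; a post-processing step could rank or prune rules by such values. The function name and the example data are assumptions for illustration.

```python
def rule_measures(transactions, antecedent, consequent):
    """Compute support, confidence and lift for the rule antecedent -> consequent.

    `transactions` is a list of item sets; `antecedent` and `consequent`
    are item sets. Subset tests count the transactions covering each side."""
    n = len(transactions)
    a = sum(1 for t in transactions if antecedent <= t)          # covers antecedent
    c = sum(1 for t in transactions if consequent <= t)          # covers consequent
    both = sum(1 for t in transactions if (antecedent | consequent) <= t)
    support = both / n
    confidence = both / a if a else 0.0
    lift = confidence / (c / n) if c else 0.0  # confidence relative to consequent frequency
    return support, confidence, lift
```

For example, over four transactions where bread and butter co-occur in two, the rule {bread} -> {butter} has support 0.5, confidence 2/3, and lift below 1, so an objective-measure filter would not flag it as interesting.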
Abstract:
Digital data sets constitute rich sources of information, which can be extracted and evaluated by applying computational tools, for example those for Information Visualization. Web-based applications, such as social network environments, forums and virtual environments for Distance Learning, are good examples of such sources. The large amount of data has a direct impact on processing and analysis tasks. This paper presents the computational tool Mapper, defined and implemented to use visual representations - maps, graphics and diagrams - to support the decision-making process by analyzing data stored in the Virtual Learning Environment TelEduc-Unesp. © 2012 IEEE.