54 results for "Inteligência escalável"


Relevance:

10.00%

Publisher:

Abstract:

Central nervous system tumors are the most common pediatric solid tumors, and 60% of them arise in the posterior fossa, mainly in the cerebellum. The first therapeutic approach is surgical resection; malignant tumors require additional strategies, namely chemotherapy and radiotherapy. Increasing survival rates make it evident that childhood brain tumors result in academic and social difficulties that compromise the patients' quality of life. This study investigated the intellectual functioning of children aged 7 to 15 years diagnosed with posterior fossa tumors and treated at CEHOPE - Recife / PE. Twenty-one children were eligible, including 13 children with pilocytic astrocytoma (G1), who underwent surgical resection only, and eight children with medulloblastoma (G2), who underwent surgical resection, chemotherapy, and craniospinal radiotherapy. Participants were evaluated with the Wechsler Intelligence Scale for Children (WISC-III). Children in G1 scored better than children in G2. Inferential tools (the Mann-Whitney U test) identified significant differences (p ≤ 0.05) in Performance IQ (PIQ) and the Processing Speed Index (PSI) as a function of treatment modality; in Full Scale IQ (FSIQ), PIQ, and PSI as a function of parental educational level; and in PIQ, FSIQ, PSI, and the Freedom from Distractibility Index (FDI) as a function of time between diagnosis and evaluation. These results reflect the late and progressive impact of radiotherapy on white matter and on information processing speed. Furthermore, children whose parents have a higher educational level showed better intellectual performance, indicating the influence of socio-cultural variables on cognitive development. The impact of cancer and its treatment on cognitive development and learning should not be underestimated. These results support the need for a better understanding of such effects in order to propose therapeutic strategies that ensure, in addition to the cure, the full development of children with this pathology.
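
As a minimal sketch of the Mann-Whitney U comparison described above, here is how such a test could be run in Python with SciPy; the score lists are hypothetical stand-ins for illustration, not the study's data.

    from scipy.stats import mannwhitneyu

    # Hypothetical Performance IQ scores, for illustration only:
    # G1 = surgery only (n = 13), G2 = surgery + chemotherapy + radiotherapy (n = 8)
    g1_piq = [102, 97, 110, 95, 104, 99, 108, 93, 101, 106, 98, 103, 96]
    g2_piq = [85, 90, 78, 88, 82, 91, 80, 86]

    u_stat, p_value = mannwhitneyu(g1_piq, g2_piq, alternative="two-sided")
    print(f"U = {u_stat}, p = {p_value:.4f}")  # difference is significant if p <= 0.05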

Relevance:

10.00%

Publisher:

Abstract:

Sleep is an active brain process that allows the efficient performance of daily tasks, and changes in sleep patterns may influence performance across different cognitive processes. Many recent studies show that cognitive performance can be improved through cognitive training with computer games; the question is whether such interventions are influenced by sleep quality. We therefore evaluated the effect of sleep quality on the efficacy of an intervention with computer games, based on working memory and attention, for the cognitive training of elementary school students. The sample consisted of 42 students (22 boys and 20 girls) with a mean age of 10.43 years (SD = 1.23). Sleep was evaluated with the parents using a sleep questionnaire, a sleep diary, and the Sleep Behavior Questionnaire. For the intervention, participants were distributed into an experimental and a control group, each with 21 participants. The experimental group received an intervention consisting of working memory and attention training with two cognitive tasks (Safari and Brain Workshop) for 30 minutes daily over a six-week period. Over an equal period, students in the control group reproduced artworks using drawing software. To evaluate cognitive performance, we applied the Wechsler Intelligence Scale for Children (WISC-III) before and after the intervention period. The results showed that, in both groups, performance in intelligence, working memory, attention, and visuospatial skills was below the mean. After the intervention, the experimental group performed significantly better on the Perceptual Reasoning Index (t = -6.24; p < 0.01), the Full Scale IQ (t = -5.09; p < 0.01), and the Performance IQ (t = -6.52; p < 0.01), suggesting an improvement in visuospatial skills, attention, working memory, and processing speed. The control group performed significantly better on the Coding subtest (t = -5.38; p < 0.01) and the Perceptual Reasoning Index (t = -3.66; p = 0.01), suggesting an improvement in visuospatial skills and attention. The mean score on the Sleep Behavior Questionnaire was 53.76 (SD = 14.96) for the experimental group and 61.19 (SD = 12.82) for the control group, indicating a tendency toward poor sleep quality in the latter. Both during the first days and in the last fifteen days of the intervention, we verified adequate bedtime, sleep duration, and regularity in the two groups, on weekdays and on weekends. We found no significant differences between the two groups in any of the sleep variables. We verified statistically significant improvement in the experimental group's performance in the two games over the course of the intervention. We found no significant correlations between the game performance indices and the sleep variables of the experimental group participants. We did find significant correlations between performance on Brain Workshop and the Block Design (Cubes) subtest, the Perceptual Reasoning Index, and the Performance IQ, suggesting that the significant improvement in visuospatial skills and attention was correlated with performance on Brain Workshop. Despite the absence of correlations with performance on Safari, that game may also have contributed to the improvement in cognitive performance.
The findings support the hypothesis that computer games can be a satisfactory tool for improving performance in visuospatial skills and attention, possibly as a result of the visuospatial stimulation built into the tasks, for example, child-oriented graphical elements that increase interest. The participants' below-average IQ may have contributed to the absence of improvement with the games in cognitive processes such as working memory. Moreover, no relation was verified between sleep quality and the efficacy of the intervention, which may have been influenced by the sample size. Future studies should focus on improving the effect of game-based interventions.
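
The negative t values reported above are consistent with paired pre/post comparisons. As a minimal sketch of that kind of analysis, assuming a paired t-test on hypothetical pre- and post-intervention index scores:

    from scipy.stats import ttest_rel

    # Hypothetical pre/post index scores, for illustration only
    pre  = [82, 90, 77, 85, 88, 79, 91, 84, 80, 86]
    post = [89, 95, 84, 90, 93, 85, 96, 90, 87, 92]

    t_stat, p_value = ttest_rel(pre, post)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # t is negative when post > pre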

Relevance:

10.00%

Publisher:

Abstract:

This study deals with cognitive competences and abilities relevant to selection and education in Information Technology (IT). These competences relate to problem solving, decision making, and practical intelligence involving the mobilization of school and extracurricular knowledge. The research aimed to contribute to the improvement of a selection instrument, consisting of five skill matrices (dealing with objectives and prospection), as well as to the development and understanding of the skills involved in IT education. This was done through an analysis of the selection instrument used in the first admission process of the Instituto Metrópole Digital, an institute of the Federal University of Rio Grande do Norte, Brazil, which offers IT education with basic training and emphases in Web programming and electronics. The methodology was quantitative, involving performance scores from the course. An analysis of variance (ANOVA) was carried out, along with descriptive analyses of socioeconomic data; no meaningful relation was observed between parental education and student performance in the course. These analyses pointed out the importance of, and need for, policies reserving places for public school students. A Spearman correlation analysis was carried out between performance on the selection instrument and performance in the training course; the instrument proved to be a significant, moderate predictor of performance in the course as a whole. Cluster and regression analyses were also performed: the cluster analysis identified performance groups ranging from average to below average, and the regression analysis pointed out associations between the criterion variables (average performance in the basic and advanced modules) and the explanatory variables (the five matrices), with matrices 1 and 3 standing out as the strongest predictors. In all the above analyses, the correlation between the instrument and the course was moderate. This can be related to aspects of the course itself, such as its emphasis on evaluation and on technical content and practical (educational) skills, versus the competences and skills assessed at selection. It is known that the mediation of technological artifacts in a cultural context can foster the development of skills and abilities relevant to IT training. This study provides input for reflection on the adoption of the selection instrument and on IT training at the Institute, and thus offers means for an interdisciplinary discussion enriching areas such as Psychology and Information Technology regarding the competences and skills relevant to IT training.
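
As an illustration of the Spearman analysis mentioned above, the sketch below correlates hypothetical selection-instrument scores with course grades; the data are invented for illustration.

    from scipy.stats import spearmanr

    # Hypothetical scores, for illustration only
    selection_score = [55, 72, 63, 80, 47, 91, 68, 75, 59, 84]
    course_grade    = [6.1, 7.5, 6.8, 8.2, 5.4, 9.0, 7.0, 7.9, 6.0, 8.6]

    rho, p_value = spearmanr(selection_score, course_grade)
    print(f"rho = {rho:.2f}, p = {p_value:.4f}")  # a moderate positive rho ~ predictive validity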

Relevance:

10.00%

Publisher:

Abstract:

This work presents JFloat, a software implementation of the IEEE-754 standard for binary floating-point arithmetic. JFloat was built to provide features not available in Java, specifically support for directed rounding. That feature is important for Java-XSC, a project developed in this department. Moreover, Java programs should be portable when using floating-point operations, mainly because IEEE-754 specifies that conforming programs should behave exactly the same on every configuration. However, it has been noted that programs using Java's native floating-point types may be machine- and operating-system-dependent; JFloat is a possible solution to that problem.
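
Neither Java's float/double operators nor standard Python expose IEEE-754 directed rounding for binary arithmetic, which is why a software layer such as JFloat is needed. The Python sketch below illustrates the idea for one operation, emulating round-toward-+infinity for addition by comparing the hardware round-to-nearest result against the exact rational sum; it illustrates the concept only and is not JFloat's actual implementation.

    import math
    from fractions import Fraction

    def add_round_up(a: float, b: float) -> float:
        """Add two finite floats with round-toward-+infinity (concept sketch)."""
        s = a + b                          # hardware result, round-to-nearest-even
        exact = Fraction(a) + Fraction(b)  # exact rational value of the sum
        if Fraction(s) < exact:            # nearest rounding went down: nudge one ulp up
            s = math.nextafter(s, math.inf)  # requires Python 3.9+
        return s

    # 2**-60 is far below half an ulp of 1.0, so round-to-nearest returns 1.0;
    # directed rounding toward +infinity must return the next float up instead.
    print(add_round_up(1.0, 2**-60))  # 1.0000000000000002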

Relevance:

10.00%

Publisher:

Abstract:

The use of clustering methods for the discovery of cancer subtypes has drawn a great deal of attention in the scientific community. While bioinformaticians have proposed new clustering methods that take advantage of characteristics of gene expression data, the medical community has a preference for classic clustering methods. There have been no studies thus far performing a large-scale evaluation of different clustering methods in this context. This work presents the first large-scale analysis of seven different clustering methods and four proximity measures for the analysis of 35 cancer gene expression data sets. Results reveal that the finite mixture of Gaussians, followed closely by k-means, exhibited the best performance in terms of recovering the true structure of the data sets. These methods also exhibited, on average, the smallest difference between the actual number of classes in the data sets and the best number of clusters as indicated by our validation criteria. Furthermore, hierarchical methods, which have been widely used by the medical community, exhibited poorer recovery performance than the other methods evaluated. Finally, as a stable basis for the assessment and comparison of different clustering methods for cancer gene expression data, this study provides a common group of data sets (benchmark data sets) to be shared among researchers and used for comparisons with new methods.
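
A minimal sketch of this kind of comparison, using scikit-learn's finite Gaussian mixture and k-means on synthetic data and scoring recovery of the known classes with the adjusted Rand index (the study itself benchmarked 35 real cancer gene expression data sets):

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import adjusted_rand_score
    from sklearn.mixture import GaussianMixture

    # Synthetic stand-in for an expression matrix with 4 known classes
    X, y_true = make_blobs(n_samples=300, n_features=50, centers=4, random_state=0)

    km_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
    gm_labels = GaussianMixture(n_components=4, random_state=0).fit_predict(X)

    print("k-means ARI:", adjusted_rand_score(y_true, km_labels))
    print("GMM ARI:    ", adjusted_rand_score(y_true, gm_labels))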

Relevance:

10.00%

Publisher:

Abstract:

Intending to understand how the human mind operates, philosophers and psychologists began to study rationality. Theories were built from those studies, and nowadays that interest has extended to many other areas, such as computer engineering and computer science, with a slightly different goal: to understand the mind's operational processes and apply that understanding to agent modelling, making it possible to implement software or hardware under the agent-oriented paradigm, in which agents are able to deliberate their own plans of action. In computer science, the sub-area of multiagent systems has progressed through work on artificial intelligence, computational logic, distributed systems, game theory, and even philosophy and psychology. The present work shows how to obtain an extension of the logical formalisation of a rational-agent architecture model called BDI (based on Bratman's philosophical theory), in which agents are capable of deliberating actions from their beliefs, desires and intentions. The formalisation of this model is called BDI logic; it is a modal logic (in general, a branching-time logic) with three accessibility relations: B, D and I. We present two possible extensions that transform BDI logic into a modal-fuzzy logic, in which both formulae and accessibility relations can be evaluated with values from the interval [0,1].
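
To make the fuzzy extension concrete: in a many-valued Kripke model, both the accessibility relation and the valuation return degrees in [0,1], and a belief modality can then be evaluated with a fuzzy implication. The sketch below uses the Kleene-Dienes implication, one common choice, and is not necessarily the exact semantics adopted in the work.

    # Worlds, fuzzy belief accessibility B: (w, w') -> degree,
    # and fuzzy valuation v: (atom, w) -> degree, all in [0, 1]
    worlds = ["w0", "w1", "w2"]
    B = {("w0", "w1"): 0.9, ("w0", "w2"): 0.4}
    v = {("raining", "w1"): 0.8, ("raining", "w2"): 0.2}

    def belief_degree(atom: str, w: str) -> float:
        # Fuzzy necessity via Kleene-Dienes implication:
        # v(B atom, w) = inf over w' of max(1 - B(w, w'), v(atom, w'))
        return min(
            max(1.0 - B.get((w, wp), 0.0), v.get((atom, wp), 0.0))
            for wp in worlds
        )

    print(belief_degree("raining", "w0"))  # 0.6, limited by the w2 alternative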

Relevance:

10.00%

Publisher:

Abstract:

The objective of research in artificial intelligence is to enable the computer to execute functions that humans perform using knowledge and reasoning. This work was developed in the area of machine learning, the branch of artificial intelligence concerned with the design and development of algorithms and techniques that enable computational learning. The objective of this work is to analyze a feature selection method for ensemble systems. The proposed method falls within the filter approach to feature selection: it uses variance and Spearman correlation to rank the features, and reward and punishment strategies to measure each feature's importance for the identification of the classes. For each ensemble, several different configurations were used, varying from non-hybrid (homogeneous) to hybrid (heterogeneous) ensemble structures. They were submitted to five combining methods (voting, sum, weighted sum, multilayer perceptron and naïve Bayes), applied to six distinct databases (real and artificial). The classifiers applied during the experiments were k-nearest neighbors, multilayer perceptron, naïve Bayes and decision tree. Finally, the performance of the ensembles was analyzed comparatively under three conditions: no feature selection, the original filter-approach feature selection method, and the proposed method. For this comparison, a statistical test was applied, which demonstrated a significant improvement in the precision of the ensembles.
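
A minimal sketch of a filter-style ranking in the spirit described above, scoring each feature by its variance and by the absolute Spearman correlation with the class label; the reward-and-punishment weighting of the actual method is not reproduced here.

    import numpy as np
    from scipy.stats import spearmanr

    def rank_features(X: np.ndarray, y: np.ndarray) -> np.ndarray:
        """Return feature indices ordered from most to least promising."""
        variance = X.var(axis=0)
        corr = np.array([abs(spearmanr(X[:, j], y)[0]) for j in range(X.shape[1])])
        # Combine the two criteria by averaging their rank positions
        var_rank = variance.argsort().argsort()   # higher variance -> higher rank
        corr_rank = corr.argsort().argsort()      # higher |rho| -> higher rank
        return np.argsort((var_rank + corr_rank) / 2.0)[::-1]

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 6))
    y = (X[:, 2] + 0.1 * rng.normal(size=100) > 0).astype(int)  # class driven by feature 2
    print(rank_features(X, y))  # feature 2 should appear near the front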

Relevance:

10.00%

Publisher:

Abstract:

In recent years, several middleware platforms for Wireless Sensor Networks (WSNs) have been proposed. Most of these platforms do not consider how to integrate components from generic middleware architectures. Many requirements must be considered in a middleware design for WSNs; in particular, it should be possible to modify the middleware source code without changing its external behavior. It is therefore desirable to have a generic middleware architecture able to offer an optimal configuration according to the requirements of the application. Adopting a component-based middleware is a promising approach, since it allows better abstraction, low coupling, modularization, and built-in management features. Another problem in current middleware is the treatment of interoperability between sensor networks and external networks such as the Web. Most current middleware lacks the functionality to access data provided by the WSN via the World Wide Web, so that these data can be treated as Web resources and accessed through protocols already adopted on the Web. Thus, this work presents Midgard, a component-based middleware specifically designed for WSNs that adopts the microkernel and REST architectural patterns. The microkernel pattern complements the component model: the microkernel encapsulates the core system and is responsible for initializing core services only when needed, as well as removing them when they are no longer needed. REST, in turn, defines a standardized way of communication between different applications based on standards adopted on the Web, enabling WSN data to be treated as Web resources accessible through protocols already adopted on the World Wide Web. The main goals of Midgard are: (i) to provide easy Web access to data generated by the WSN, exposing such data as Web resources following the principles of the Web of Things paradigm; and (ii) to provide WSN application developers with the ability to instantiate only the specific services required by the application, thus generating a customized middleware and saving node resources. Midgard allows the WSN to be used as Web resources while providing a cohesive and loosely coupled software architecture, addressing interoperability and customization. In addition, Midgard provides two services needed by most WSN applications: (i) configuration and (ii) inspection and adaptation. New services can be implemented and easily incorporated into the middleware thanks to its flexible and extensible architecture. According to our assessment, Midgard provides interoperability between the WSN and external networks, such as the Web, as well as between different applications within a single WSN. We also assessed Midgard's memory consumption, application image size, size of messages exchanged in the network, response time, overhead, and scalability. During the evaluation, Midgard proved to satisfy its goals and showed itself to be scalable without consuming resources prohibitively.
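
To illustrate the Web-of-Things idea of exposing sensor data as Web resources, here is a generic sketch using Python's standard library; the resource path and JSON payload are assumptions for illustration, not Midgard's actual API.

    import json
    import random
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class SensorResource(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/sensors/temperature":  # hypothetical resource URI
                reading = {"value": round(random.uniform(20.0, 30.0), 2), "unit": "C"}
                body = json.dumps(reading).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    # GET http://localhost:8080/sensors/temperature returns one JSON reading
    HTTPServer(("", 8080), SensorResource).serve_forever()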

Relevance:

10.00%

Publisher:

Abstract:

There is a need for multi-agent system designers to determine the quality of systems in the earliest phases of the development process. The architectures of the agents are part of the design of these systems and therefore also need to have their quality evaluated. Motivated by the important role that emotions play in our daily lives, embodied-agent researchers have aimed to create agents capable of affective, natural interaction with users that produces beneficial or desirable results. To this end, several studies proposing agent architectures with emotions have appeared, without accompanying methods suitable for assessing these architectures. The objective of this study is to propose a methodology for evaluating emotional agent architectures that assesses quality attributes of the architectural design and, through human-computer interaction evaluation, the effects on the subjective experience of users of applications that implement it. The methodology is based on a model of well-defined metrics. In assessing the quality of the architectural design, the attributes assessed are extensibility, modularity, and complexity. In assessing the effects on users' subjective experience, which involves implementing the architecture in an application (we suggest the domain of computer games), the metrics are: enjoyment, felt support, warmth, caring, trust, cooperation, intelligence, interestingness, naturalness of emotional reactions, believability, reduction of frustration, and likeability, along with average time and average attempts. We experimented with this approach by evaluating five emotional agent architectures: BDIE, DETT, Camurri-Coglio, EBDI, and Emotional-BDI. Two of the architectures, BDIE and EBDI, were implemented in a version of the game Minesweeper and evaluated for human-computer interaction. In the results, DETT stood out with the best architectural design. Users who played the version of the game with emotional agents performed better than those who played without agents. In the assessment of users' subjective experience, the differences between the architectures were insignificant.

Relevance:

10.00%

Publisher:

Abstract:

Smart sensors are devices that differ from common sensors by providing processing capability over the monitored data. They are typically composed of a power source, transducers (sensors and actuators), memory, a processor, and a transceiver. According to the IEEE 1451 standard, a smart sensor can be divided into TIM and NCAP modules, which must communicate through a standardized interface called TII. The NCAP module is the part of the smart sensor that hosts the processor; it is therefore responsible for giving the sensor its intelligence. Several approaches can be used to develop this module, notably those based on low-cost microcontrollers and/or FPGAs. This work addresses the development of a hardware/software architecture for an NCAP module according to the IEEE 1451.1 standard. The hardware infrastructure comprises an RS-232 interface driver, 512 kB of RAM, a TII interface, the NIOS II embedded processor, and a simulator of the TIM module. The SOPC Builder automatic integration tool is used to integrate the hardware components. The software infrastructure comprises the IEEE 1451.1 standard and the NCAP-specific application, which simulates the monitoring of pressure and temperature in oil wells in order to detect leaks. The proposed module is embedded in an FPGA and prototyped on the Altera DE2 board, which contains the Cyclone II EP2C35F672C6 FPGA. The NIOS II embedded processor supports the NCAP software infrastructure, which is developed in C and based on the IEEE 1451.1 standard. The behavior of the hardware infrastructure is described in VHDL.
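
The NCAP application itself runs in C on the NIOS II; as a language-neutral illustration of the leak-detection logic it simulates, the sketch below flags a suspected leak when pressure falls below a floor or drops too quickly. All thresholds are hypothetical.

    def leak_suspected(pressure_kpa: float, prev_pressure_kpa: float, dt_s: float,
                       p_min: float = 800.0, max_drop_rate: float = 5.0) -> bool:
        """Flag a suspected leak (sketch with hypothetical thresholds)."""
        drop_rate = (prev_pressure_kpa - pressure_kpa) / dt_s  # kPa per second
        return pressure_kpa < p_min or drop_rate > max_drop_rate

    # Example: pressure fell from 950 kPa to 900 kPa in 5 s -> 10 kPa/s drop
    print(leak_suspected(900.0, 950.0, 5.0))  # True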

Relevance:

10.00%

Publisher:

Abstract:

The northern portion of Rio Grande do Norte State is characterized by intense coastal dynamics affecting areas with ecosystems of moderate to high environmental sensitivity. The region hosts the main socioeconomic activities of the state: the salt industry, shrimp farming, fruit growing, and the oil industry. The oil industry suffers the effects of coastal dynamics, which cause problems such as erosion and the exposure of wells and pipelines along the shore. This motivated the study of such changes, seeking to understand the processes that cause environmental impacts in order to detect and assess the areas most vulnerable to these variations. Coastal areas under the influence of the oil industry are highly vulnerable and sensitive to accidents involving oil spills in their vicinity. Therefore, geoenvironmental monitoring of the region was established with the aim of evaluating the evolution of the entire coastal area and checking the sensitivity of each site to the presence of oil. The goal of this work was to implement a computer system that combines the insertion and visualization of thematic maps for the generation of Environmental Vulnerability maps, using Business Intelligence (BI) techniques over vector information previously stored in the database. The main design interest was to implement a scalable system that serves diverse fields of study and is suitable for generating vulnerability maps online, automating the methodology so as to facilitate data manipulation and deliver fast results for real-time operational decision-making. To develop the geographic database, a conceptual model of the selected data was produced, and the Web system was built using the PostgreSQL database system, its spatial extension PostGIS, the Glassfish web server, and GeoServer to display maps on the Web.
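
As a sketch of the kind of spatial query such a system issues, the snippet below asks PostGIS for the vulnerability zones intersecting a bounding box; the connection string, table, and column names are hypothetical.

    import psycopg2  # PostgreSQL driver

    conn = psycopg2.connect("dbname=geomonitor user=gis")  # hypothetical database
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT zone_name, vulnerability_index, ST_AsGeoJSON(geom)
            FROM vulnerability_zones  -- hypothetical table of mapped zones
            WHERE ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326))
            """,
            (-37.4, -5.2, -36.8, -4.8),  # lon/lat box on the northern RN coast
        )
        for name, index, geojson in cur.fetchall():
            print(name, index)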

Relevance:

10.00%

Publisher:

Abstract:

Computational intelligence methods have been expanding into industrial applications, motivated by their ability to solve engineering problems, and embedded systems follow the same idea by embedding computational intelligence tools in machines. There are several works in the areas of embedded systems and intelligent systems; however, few papers have joined the two areas. The aim of this study was to implement adaptive neuro-fuzzy hardware with online training embedded on a Field Programmable Gate Array (FPGA). The system can adapt during the execution of a given application, aiming at online performance improvement. The proposed system architecture is modular, allowing different configurations of neuro-fuzzy network topologies with online training. The proposed system was applied to mathematical function interpolation, pattern classification, and self-compensation of industrial sensors, achieving satisfactory performance in these tasks. The experimental results show the advantages and disadvantages of online training in hardware when performed in parallel and sequentially: sequential training saves FPGA area but increases the complexity of the architecture's control actions, while parallel training achieves high performance and reduced processing time, with the pipeline technique used to further increase the performance of the proposed architecture. The development of the study was based on available tools for FPGA circuits.
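
As a software analogue of the online training performed by the hardware, the sketch below implements a zero-order Takagi-Sugeno fuzzy network with fixed Gaussian antecedents and an LMS gradient update of the rule consequents after each sample; the topology and learning rate are assumptions, not the parameters of the proposed architecture.

    import numpy as np

    centers = np.array([0.0, 0.5, 1.0])  # fixed Gaussian membership centers
    sigma = 0.2
    w = np.zeros(3)                      # rule consequents, trained online

    def firing_strengths(x: float) -> np.ndarray:
        mu = np.exp(-0.5 * ((x - centers) / sigma) ** 2)
        return mu / mu.sum()             # normalized rule activations

    def online_step(x: float, target: float, lr: float = 0.2) -> float:
        global w
        mu = firing_strengths(x)
        y = mu @ w                       # network output
        w += lr * (target - y) * mu      # LMS gradient step on the squared error
        return y

    # Learn f(x) = x**2 from streaming samples
    for x in np.tile(np.linspace(0.0, 1.0, 21), 50):
        online_step(x, x**2)
    print(firing_strengths(0.7) @ w)     # close to the true value 0.49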

Relevance:

10.00%

Publisher:

Abstract:

Ubiquitous computing is a paradigm in which devices with processing and communication capabilities are embedded in the common elements of our lives (homes, cars, cameras, phones, schools, museums, etc.), providing services with a high degree of mobility and transparency. The development of ubiquitous systems is a complex task, since it involves several areas of computing, such as Software Engineering, Artificial Intelligence, and Distributed Systems. This task becomes even more complex due to the absence of a reference architecture to guide the development of such systems. Reference architectures have been used to provide a common basis and guidelines for building software architectures for different classes of systems. Architecture description languages (ADLs), in turn, provide a syntax for the structural representation of architectural elements, their constraints, and their interactions, allowing the architectural model of systems to be expressed. Currently there are no ADLs in the literature based on reference architectures for the ubiquitous computing domain. To enable the architectural modeling of ubiquitous applications, the main goal of this work is to specify UbiACME, an architecture description language for ubiquitous applications, and to provide the UbiACME Studio tool, which allows software architects to build models using UbiACME. To this end, we first performed a systematic review in order to investigate, in the literature related to ubiquitous systems, the elements common to these systems that should be considered in the design of UbiACME. In addition, based on the systematic review, we defined a reference architecture for ubiquitous systems, RA-Ubi, which is the basis for the definition of the elements required for architectural modeling and therefore provides input for the definition of the UbiACME elements. Finally, to validate the language and the tool, we present a controlled experiment in which architects model a ubiquitous application using UbiACME Studio and compare it with the modeling of the same application in SysML.

Relevance:

10.00%

Publisher:

Abstract:

Educational data mining is an application domain of artificial intelligence that has been extensively explored in recent years. Technological advances, and in particular the increasing use of virtual learning environments, have generated considerable amounts of data to be investigated. Among the activities in this context is the prediction of students' school performance, which can be accomplished through machine learning techniques that classify students into predefined labels. One strategy for applying these techniques is to combine them into multi-classifier systems, whose efficiency has been demonstrated by results achieved in studies conducted in several areas, such as medicine, commerce, and biometrics. The data used in the experiments were obtained from student interactions in Moodle, one of the most widely used virtual learning environments. In this context, this paper presents the results of several experiments using a specific kind of multi-classifier system, called an ensemble, aiming at better results in school performance prediction, that is, a higher accuracy percentage in student classification. The paper thus presents a significant exploration of educational data and analyzes relevant results from these experiments.
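
A minimal sketch of an ensemble of heterogeneous classifiers for performance prediction, in the spirit described above, using scikit-learn's majority-vote combiner on synthetic stand-in data (the actual experiments used Moodle interaction logs):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic stand-in for per-student interaction features and pass/fail labels
    X, y = make_classification(n_samples=400, n_features=10, random_state=0)

    ensemble = VotingClassifier(
        estimators=[
            ("knn", KNeighborsClassifier(n_neighbors=5)),
            ("tree", DecisionTreeClassifier(random_state=0)),
            ("nb", GaussianNB()),
        ],
        voting="hard",  # majority vote over the base classifiers
    )
    print("mean CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())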

Relevance:

10.00%

Publisher:

Abstract:

The main thesis demonstrated in this work is that cognitive enhancement through the use of drugs can be included as a primary good within Rawls' thinking. To develop this notion, the text is structured in two parts. The first part describes the theory of justice as fairness in those elements directly related to primary goods. The first point verified is the unity of the notion of primary goods across Rawls' work. Some elements are modified, for example the distinction between natural and social primary goods: natural primary goods are intelligence, health, imagination, vigor, and chance (luck), while social primary goods are rights and liberties, opportunity and power, income and wealth, and the social bases of self-respect. The perception of some talents, such as intelligence, has also undergone changes, shifting from "higher intelligence" to "educated intelligence". This fact highlights education as a primary good that permeates all of Rawls' work from different perspectives. Freedom and self-respect are social primary goods that are also examined in depth. The second part presents the definition of enhancement and shows that the distinction between enhancement and treatment is controversial. We then examine the problems related to the practice of enhancement, showing that Rawlsian primary goods such as freedom and self-respect are not in opposition to the practice of enhancement, particularly cognitive enhancement; on the contrary, a ban on cognitive enhancement could lead to the denial of these primary goods. But how could we consider cognitive enhancement a social primary good? What we have done in this thesis is to show how cognitive enhancement is important to ensure that primary goods are accessible to citizens, and we have reconstructed the process Rawls uses for choosing his primary goods to show that cognitive enhancement through drugs could perfectly well be introduced as such.