869 results for Peer-to-peer architecture (Computer networks)
Abstract:
Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes. Neural networks and wavelet networks (wavenets) have recently been seen as attractive tools for developing efficient solutions to many real-world function approximation problems. In this paper, it is shown how feedforward neural networks can be built using a different type of activation function, referred to as the PPS-wavelet. An algorithm is presented to generate a family of PPS-wavelets that can be used to efficiently construct feedforward networks for function approximation.
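The PPS-wavelet family is defined in the paper itself; as a minimal illustrative sketch of the general idea (a feedforward network whose hidden units use a wavelet activation), the snippet below substitutes the well-known Ricker ("Mexican hat") wavelet for the PPS-wavelet. All layer sizes and weights here are arbitrary placeholders, not the paper's construction.

```python
import numpy as np

def mexican_hat(x):
    # Ricker ("Mexican hat") wavelet, used here only as a stand-in
    # activation; the paper's PPS-wavelet family is defined differently.
    return (1.0 - x**2) * np.exp(-(x**2) / 2.0)

def forward(x, W1, b1, W2, b2):
    # One-hidden-layer feedforward network with a wavelet activation.
    h = mexican_hat(x @ W1 + b1)
    return h @ W2 + b2

rng = np.random.default_rng(0)
W1 = rng.normal(size=(1, 8)); b1 = rng.normal(size=8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

x = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
y = forward(x, W1, b1, W2, b2)
print(y.shape)  # (5, 1)
```

Training such a network (fitting W1, b1, W2, b2 to data samples) would proceed by any standard regression method, e.g. gradient descent on a squared-error loss.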
Abstract:
This paper describes an innovative approach to developing an understanding of the relevance of mathematics to computer science. The mathematical subjects are introduced through an application-to-model scheme that leads computer science students to a better understanding of why they have to learn math, and to learn it effectively. Our approach consists of a single one-semester course, taught in the first semester of the program, in which the students are initially exposed to some typical computer applications. Once they recognize the applications' complexity, the instructor presents the mathematical models supporting those applications, even before the models are formally introduced in a math course. We applied this approach at Unesp (Brazil), and the results include a large reduction in the student dropout rate and better-prepared students in the final years of our program.
Abstract:
This paper presents results from an efficient approach to the automatic detection and extraction of human faces from images with arbitrary background colors, textures, or objects, which consists in finding isosceles triangles formed by the eyes and the mouth.
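The geometric test at the heart of this approach can be sketched as follows. This is a hypothetical helper, not the paper's implementation: it merely checks whether three candidate points (two eyes and a mouth) form an approximately isosceles triangle, with the inter-eye segment as the base; the tolerance value is an assumption.

```python
import math

def is_near_isosceles(eye_l, eye_r, mouth, tol=0.15):
    # Face-candidate heuristic: the two eye-to-mouth distances should
    # be approximately equal, i.e. the triangle is near-isosceles.
    d_lm = math.dist(eye_l, mouth)
    d_rm = math.dist(eye_r, mouth)
    return abs(d_lm - d_rm) <= tol * max(d_lm, d_rm)

print(is_near_isosceles((40, 50), (80, 50), (60, 90)))   # True: symmetric layout
print(is_near_isosceles((40, 50), (80, 50), (110, 90)))  # False: mouth far off-axis
```

In a full detector, candidate eye/mouth points would first be extracted from the image (e.g. by color or feature analysis) and only triangles passing a test like this would be kept as face hypotheses.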
Abstract:
This paper presents the results obtained from applying artificial neural networks and statistical tools to the automatic identification and classification of faults in electric power distribution systems. The techniques developed to address the problem combine, in an integrated way, several approaches that contribute to a successful fault detection process, with the aim of carrying it out reliably and safely. The compiled results of practical experiments performed on a pilot radial distribution feeder demonstrate that the developed techniques provide accurate results, efficiently identifying and classifying the various fault occurrences observed in the feeder.
Abstract:
To simplify computer management, many system administrators are adopting advanced techniques to manage software configuration on grids, but the tight coupling between hardware and software makes every PC an individually managed entity, lowering scalability and increasing the cost of managing hundreds or thousands of PCs. This paper discusses the feasibility of a distributed virtual machine environment, named FlexLab: a new approach to computer management that combines virtualization and distributed system architectures as the basis of a management system. FlexLab is able to extend the coverage of a computer management solution beyond client operating system limitations and also offers a convenient hardware abstraction that decouples software from hardware, simplifying computer management. The results obtained in this work indicate that FlexLab is able to overcome the limitations imposed by the coupling between software and hardware, simplifying the management of homogeneous and heterogeneous grids. © 2009 IEEE.
Abstract:
This paper presents an NCAP embedded on a DE2 kit with a Nios II processor and uClinux for the development of a network gateway with two interfaces, wireless (ZigBee) and wired (RS232), based on IEEE 1451. Both communications, wireless and wired, were developed to be point-to-point and to work with the same protocols, based on IEEE 1451.0-2007. The tests were carried out using a microcomputer whose browser could access the web page stored on the DE2 kit and send control and monitoring commands to both TIMs (WTIM and STIM). The system demonstrates a different way of developing the NCAP node so that it can be applied in different environments, wired or wireless, from the same node. © 2011 IEEE.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Graduate Program in Information Science (Pós-graduação em Ciência da Informação) - FFC
Abstract:
Graduate Program in Computer Science (Pós-graduação em Ciência da Computação) - IBILCE
Abstract:
The processing capacity of research institutions has been growing significantly as increasingly powerful processors and workstations reach the market. Given the performance improvements in computer networks, and in order to meet the ever-growing demand for processing power, the idea emerged of using independent computers connected in a network as a platform for running parallel applications, giving rise to the field of grid computing. Within a single administrative domain, it is common to share resources such as disks, printers, and so on; when the network spans more than one administrative domain, however, such sharing becomes very limited. The purpose of computing grids is to enable resource sharing even when the resources are spread across several administrative domains. This dissertation proposes an architecture for the dynamic establishment of multi-domain connections that makes use of Optical Burst Switching (OBS) with a GMPLS (Generalized Multiprotocol Label Switching) control plane. The architecture is based on storing information about the grid resources of distinct Autonomous Systems (AS) in a component called the Root GOBS Server (Grid OBS), and on using explicit routing to reserve resources along a route that satisfies an application's performance constraints. The proposal is validated through simulations, which show that the architecture is able to guarantee differentiated performance levels according to the application class and provides better utilization of network and computing resources.
Abstract:
The proposed tool, known as WSPControl, enables remote monitoring of computers across the Internet using distributed applications. A Web Services architecture makes communication among these distributed applications possible across heterogeneous platforms, and also eliminates the need for additional network configuration, such as opening ports or setting up a proxy. The tool is divided into three modules:
• Client Interface: developed in C Sharp, it captures performance data on the monitored computer and connects to the Web Services to report these data.
• Web Services Interface: developed in PHP using the PHP SOAP library, it mediates the communication between the Internet and client applications.
• Internet Interface: developed in PHP, it reads and interprets the captured information and makes it available on the Internet.
Abstract:
Currently, mammalian cells are the most widely used hosts for biopharmaceutical production. The culture media for these cell lines commonly include a pH indicator in their composition. Spectroscopic techniques are used for biopharmaceutical process monitoring; among them, UV–Vis spectroscopy has found scarce application. This work aimed to define an artificial neural network architecture and fit its parameters to predict some nutrients and metabolites, as well as viable cell concentration, based on UV–Vis spectral data from a mammalian cell bioprocess using phenol red in the culture medium. The BHK-21 cell line was used as a mammalian cell model. Off-line spectra of supernatant samples, taken from batches performed at different dissolved oxygen concentrations in two bioreactor configurations and with two pH control strategies, were used to define two artificial neural networks. According to the absolute errors, glutamine (0.13 ± 0.14 mM), glutamate (0.02 ± 0.02 mM), glucose (1.11 ± 1.70 mM), lactate (0.84 ± 0.68 mM) and viable cell concentrations (1.89 × 10⁵ ± 1.90 × 10⁵ cells/mL) were suitably predicted. The average prediction errors for the monitored variables were lower than those previously reported using different spectroscopic techniques in combination with partial least squares or artificial neural networks. The present work allows for UV–Vis sensor development and decreases costs related to nutrient and metabolite quantification.
Abstract:
In this paper, a cross-layer solution for packet size optimization in wireless sensor networks (WSN) is introduced such that the effects of multi-hop routing, the broadcast nature of the physical wireless channel, and the effects of error control techniques are captured. A key result of this paper is that, contrary to conventional wireless networks, in wireless sensor networks longer packets reduce the collision probability. Consequently, an optimization solution is formalized using three different objective functions, i.e., packet throughput, energy consumption, and resource utilization. Furthermore, the effects of end-to-end latency and reliability constraints that may be required by a particular application are investigated. As a result, a generic, cross-layer optimization framework is developed to determine the optimal packet size in WSN. This framework is further extended to determine the optimal packet size in underwater and underground sensor networks. From this framework, the optimal packet sizes under various network parameters are determined.
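The flavor of such an optimization can be sketched with a deliberately simplified, textbook-style model (not the paper's cross-layer framework): a larger payload amortizes the fixed header overhead, but raises the chance that a bit error corrupts the whole frame. The header size and bit error rate below are arbitrary placeholder values.

```python
def throughput_efficiency(payload_bits, header_bits=48, ber=1e-4):
    # Fraction of useful bits delivered per transmitted bit, assuming
    # independent bit errors and no retransmission: the frame succeeds
    # only if every one of its (payload + header) bits is error-free.
    frame = payload_bits + header_bits
    p_success = (1.0 - ber) ** frame
    return (payload_bits / frame) * p_success

# Brute-force search for the payload size maximizing efficiency.
best = max(range(8, 4097, 8), key=throughput_efficiency)
print(best)
```

With these placeholder parameters the optimum lands at a few hundred bits; in the paper's richer framework the objective additionally accounts for collisions, multi-hop routing, error control, and energy, which shifts the optimal size.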
Abstract:
This paper discusses some aspects of Wireless Sensor Networks over the IEEE 802.15.4 standard and proposes, for the very first time, a mesh network topology with geographic routing integrated into the open Freescale protocol (SMAC - Simple Medium Access Control). To this end, a routing protocol for SMAC is proposed. Before this work, the SMAC protocol was suitable for one-hop communication only; with the developed mechanisms, multi-hop communication becomes possible. Performance results from the implemented protocol are presented and analyzed in order to assess important requirements for wireless sensor networks, such as robustness, the self-healing property, and low latency. (c) 2011 Elsevier Ltd. All rights reserved.
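A common mechanism behind multi-hop geographic routing of this kind is greedy forwarding: each node hands the packet to the neighbor geographically closest to the destination, provided that neighbor is closer than the node itself. The sketch below illustrates only this generic rule, with made-up node names and coordinates; it is not the SMAC extension described in the paper.

```python
import math

def greedy_next_hop(pos, current, neighbors, dest):
    # Greedy geographic forwarding: pick the neighbor nearest to the
    # destination; if even that neighbor is no closer than we are,
    # return None (a "local minimum", where recovery would be needed).
    best = min(neighbors, key=lambda n: math.dist(pos[n], pos[dest]))
    if math.dist(pos[best], pos[dest]) < math.dist(pos[current], pos[dest]):
        return best
    return None

pos = {"A": (0, 0), "B": (2, 1), "C": (1, 3), "D": (4, 4)}
print(greedy_next_hop(pos, "A", ["B", "C"], "D"))  # C (closest neighbor to D)
```

Repeating this hop-by-hop decision at every intermediate node yields a multi-hop route toward the destination without any node needing a global view of the topology.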
Abstract:
The use of statistical methods to analyze large databases of text has been useful in unveiling patterns of human behavior and establishing historical links between cultures and languages. In this study, we identified literary movements by treating books published from 1590 to 1922 as complex networks, whose metrics were analyzed with multivariate techniques to generate six clusters of books. The latter correspond to time periods coinciding with relevant literary movements over the last five centuries. The most important factor contributing to the distinctions between different literary styles was the average shortest path length, in particular the asymmetry of its distribution. Furthermore, over time there has emerged a trend toward larger average shortest path lengths, which is correlated with increased syntactic complexity, and a more uniform use of the words reflected in a smaller power-law coefficient for the distribution of word frequency. Changes in literary style were also found to be driven by opposition to earlier writing styles, as revealed by the analysis performed with geometrical concepts. The approaches adopted here are generic and may be extended to analyze a number of features of languages and cultures.
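The study's most discriminative metric, the average shortest path length of a book's network, can be computed with a plain breadth-first search. The toy graph below is a made-up four-node example, not data from the study, and the study's networks (and their edge construction from text) are far larger.

```python
from collections import deque

def avg_shortest_path(adj):
    # Mean BFS distance over all ordered pairs of connected nodes,
    # for an unweighted, undirected graph given as an adjacency dict.
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for dst, d in dist.items():
            if dst != src:
                total += d
                pairs += 1
    return total / pairs

# Toy word-adjacency graph: a 4-node path a-b-c-d.
path4 = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(avg_shortest_path(path4))  # 1.6666666666666667
```

The study then feeds metrics like this one, computed per book, into multivariate clustering, and also examines the asymmetry of the per-node distance distribution rather than just its mean.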