112 results for Internet (Computer networks) - Service providers


Relevance:

30.00%

Publisher:

Abstract:

The use of wireless sensor and actuator networks in industry has been increasing over the past few years, bringing multiple benefits compared to wired systems, such as network flexibility and manageability. Such networks consist of a possibly large number of small, autonomous sensor and actuator devices with wireless communication capabilities. The data collected by sensors are sent, directly or through intermediary nodes along the network, to a base station called the sink node. Data routing in this environment is an essential matter, since it is strictly bound to energy efficiency and thus to the network lifetime. This work investigates the application of a routing technique based on Reinforcement Learning's Q-Learning algorithm to a wireless sensor network, using an NS-2 simulated environment. Several metrics, such as energy consumption, data packet delivery rates and delays, are used to validate the proposal, comparing it with other solutions in the literature.
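The core of Q-learning-based routing is that each node keeps a cost estimate per neighbor and updates it from feedback. A minimal sketch of that update, assuming a cost signal combining factors such as energy and delay (class and parameter names are illustrative, not from the thesis):

```python
import random

# Hypothetical sketch: each node keeps a Q-value per neighbor estimating the
# cost of delivering a packet toward the sink through that neighbor.
class QRoutingNode:
    def __init__(self, neighbors, alpha=0.5, epsilon=0.1):
        self.q = {n: 0.0 for n in neighbors}  # estimated cost via each neighbor
        self.alpha = alpha      # learning rate
        self.epsilon = epsilon  # exploration probability

    def choose_next_hop(self):
        # epsilon-greedy: usually pick the cheapest neighbor, sometimes explore
        if random.random() < self.epsilon:
            return random.choice(list(self.q))
        return min(self.q, key=self.q.get)

    def update(self, neighbor, hop_cost, neighbor_best_estimate):
        # Standard Q-learning update using the neighbor's own best estimate
        target = hop_cost + neighbor_best_estimate
        self.q[neighbor] += self.alpha * (target - self.q[neighbor])

node = QRoutingNode(["B", "C"])
node.update("B", hop_cost=1.0, neighbor_best_estimate=2.0)
print(node.q["B"])  # 1.5: moved halfway toward the target 3.0 with alpha=0.5
```

Each delivered packet (or acknowledgment) provides the `hop_cost` feedback, so routes adapt as node energy drains.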

Relevance:

30.00%

Publisher:

Abstract:

Ethernet technology dominates the market of local computer networks. However, it has not been established as a technology for industrial automation, where the requirements demand determinism and real-time performance. Many solutions have been proposed to address the problem of non-determinism, based mainly on TDMA (Time Division Multiple Access), Token Passing and Master-Slave schemes. This research carries out performance measurements that allow comparing the behavior of Ethernet networks when submitted to data transmissions over the UDP and RAW Ethernet protocols, as well as over three different types of Ethernet technologies. The objective is to identify, among the analyzed protocols and Ethernet technologies, the alternative that offers the most satisfactory support to industrial automation networks and distributed real-time applications.
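TDMA makes shared-medium access deterministic by giving each station a fixed transmission slot, which bounds the worst-case access delay. A minimal sketch of the idea, with assumed slot duration and station names:

```python
# Hypothetical sketch of the TDMA principle used to make Ethernet deterministic:
# time is divided into fixed slots and each station transmits only in its own
# slot, so the worst-case medium-access delay is bounded by one cycle.
SLOT_MS = 2          # slot duration (assumed value)
STATIONS = ["A", "B", "C", "D"]

def owner_of_slot(t_ms):
    """Return which station owns the slot active at time t_ms."""
    slot_index = int(t_ms // SLOT_MS) % len(STATIONS)
    return STATIONS[slot_index]

def worst_case_wait_ms():
    # A station that just missed its slot waits one full cycle minus one slot
    return SLOT_MS * (len(STATIONS) - 1)

print(owner_of_slot(0))      # A
print(owner_of_slot(5))      # slot index 2 -> C
print(worst_case_wait_ms())  # 6 ms bound, independent of network load
```

The bounded wait is exactly what CSMA/CD-style Ethernet cannot guarantee under load, which motivates the comparison above.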

Relevance:

30.00%

Publisher:

Abstract:

A serious problem that affects an oil refinery's processing units is the deposition of solid particles, or fouling, on its equipment. These residues are naturally present in the oil or are by-products of chemical reactions during its transport. A fouled heat exchanger loses its capacity to adequately heat the oil and needs to be shut down periodically for cleaning. Knowing in advance the best period to shut down the exchanger may improve the energy and production efficiency of the plant. In this work we develop a system to predict the fouling of a heat exchanger from the Potiguar Clara Camarão Refinery, based on data collected in a partnership with Petrobras. Recurrent Neural Networks are used to predict the heat exchanger's flow rate at future times. This variable is the main indicator of fouling, because its value decreases gradually as the deposits on the tubes reduce their diameter. The prediction can be used to tell when the flow will have dropped below an acceptable value, indicating when the exchanger will need to be shut down for cleaning.
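Once the network has produced a predicted flow series, scheduling the shutdown reduces to finding the first prediction below the acceptable limit. A minimal sketch of that decision step (the RNN itself is omitted; the numbers are illustrative, not refinery data):

```python
# Hypothetical sketch: given a model's predicted flow series for the exchanger,
# find when fouling will push the flow below the acceptable limit, i.e. when a
# shutdown for cleaning should be scheduled.
def first_shutdown_time(predicted_flow, acceptable_min):
    """Return the index (time step) of the first prediction below the limit."""
    for t, flow in enumerate(predicted_flow):
        if flow < acceptable_min:
            return t
    return None  # no shutdown needed inside the prediction horizon

# Flow decreasing gradually as deposits reduce the tube diameter
predictions = [98.0, 95.5, 92.1, 88.4, 84.0, 79.2]
print(first_shutdown_time(predictions, acceptable_min=85.0))  # 4
```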

Relevance:

30.00%

Publisher:

Abstract:

Complex network analysis is a powerful tool for the study of complex systems such as brain networks. This work aims to describe the topological changes in the neural functional connectivity networks of the neocortex and hippocampus during slow-wave sleep (SWS) in animals submitted to exposure to a novel experience. Slow-wave sleep is an important sleep stage in which electrical activity patterns of wakefulness reverberate, playing a fundamental role in memory consolidation. Despite its importance, there is a lack of studies characterizing the topological dynamics of functional connectivity networks during this sleep stage, and no studies describe the topological modifications that novel exposure induces in these networks. We have observed that several topological properties are modified after novel exposure and that this modification remains for a long time. Most of these novelty-induced changes in topological properties are related to fault tolerance.
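A common way to quantify a network's fault tolerance is to remove nodes and measure how much of the graph remains connected. A minimal sketch of that measurement on an illustrative graph (plain adjacency lists; not the thesis's data):

```python
# Hypothetical sketch of a fault-tolerance measurement: remove a node from a
# connectivity graph and report the size of the largest connected component.
def largest_component(adj, removed=frozenset()):
    """Size of the largest connected component, ignoring 'removed' nodes."""
    seen, best = set(), 0
    for start in adj:
        if start in seen or start in removed:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            comp += 1
            for v in adj[u]:
                if v not in seen and v not in removed:
                    seen.add(v)
                    stack.append(v)
        best = max(best, comp)
    return best

# Star-like graph: hub 0 connects everything, so removing it fragments the net
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}
print(largest_component(adj))               # 4: fully connected
print(largest_component(adj, removed={0}))  # 2: only nodes 1-2 stay connected
```

Comparing this curve before and after an intervention is one way topological changes map onto fault tolerance.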

Relevance:

30.00%

Publisher:

Abstract:

Bayesian networks are powerful tools, as they represent probability distributions as graphs and work with the uncertainties of real systems. Over the last decade there has been special interest in learning network structures from data. However, learning the best network structure is an NP-hard problem, so many heuristic algorithms have been created to generate network structures from data. Many of these algorithms use score metrics to generate the network model. This thesis compares three of the most used score metrics. The K2 algorithm and two benchmark networks, ASIA and ALARM, were used to carry out the comparison. Results show that score metrics whose hyperparameters strengthen the tendency to select simpler network structures are better than score metrics with a weaker such tendency, for both metrics considered (Heckerman-Geiger and modified MDL). The Heckerman-Geiger Bayesian score metric works better than MDL with large datasets, and MDL works better than Heckerman-Geiger with small datasets. The modified MDL gives results similar to Heckerman-Geiger for large datasets and close to MDL for small datasets, with a stronger tendency to select simpler network structures.
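The trade-off these metrics encode is fit versus complexity: a score rewards data likelihood and penalizes the number of free parameters. A minimal MDL-style sketch showing how a stronger penalty favors the simpler structure (the weight and numbers are illustrative, not the thesis's exact formulas):

```python
import math

# Hypothetical sketch of an MDL-style score: data log-likelihood minus a
# penalty on the number of free parameters. A larger penalty_weight strengthens
# the tendency to select simpler network structures.
def mdl_score(log_likelihood, n_params, n_samples, penalty_weight=0.5):
    return log_likelihood - penalty_weight * n_params * math.log(n_samples)

# Same data: the denser structure fits slightly better but has more parameters
simple = mdl_score(log_likelihood=-1000.0, n_params=5, n_samples=1000)
dense = mdl_score(log_likelihood=-990.0, n_params=20, n_samples=1000)
print(simple > dense)  # True: the penalty outweighs the small fit gain
```

Note the penalty grows with `log(n_samples)`, which is why the relative behavior of the metrics shifts between small and large datasets.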

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this work is the application of Artificial Neural Networks (ANNs) to the resolution of problems involving RF/microwave devices, for example the prediction of the frequency response of some structures in a region of interest. Artificial neural networks are presently an alternative to the current methods of analysis of microwave structures, since they are capable of learning and, more importantly, of generalizing the acquired knowledge from any type of available data, keeping the precision of the original technique while adding the low computational cost of neural models. For this reason, artificial neural networks are increasingly used for modeling microwave devices. Multilayer Perceptron and Radial Basis Function models are used in this work; the advantages and disadvantages of these models and their respective training algorithms are described. Microwave planar devices, such as Frequency Selective Surfaces and microstrip antennas, are in evidence due to the increasing need for filtering and separation of electromagnetic waves and for the miniaturization of RF devices. It is therefore of fundamental importance to study the structural parameters of these devices in a fast and accurate way. The presented results show the capacity of the neural techniques for modeling both Frequency Selective Surfaces and antennas.
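A Radial Basis Function model approximates a response as a weighted sum of Gaussians centered at sampled input points, which is why it evaluates far faster than a full-wave analysis once trained. A minimal forward-pass sketch (centers, weights and frequencies are made-up illustrative values):

```python
import math

# Hypothetical sketch of an RBF model of the kind used to approximate a
# device's frequency response: a weighted sum of Gaussian basis functions.
def rbf_predict(freq, centers, weights, width=1.0):
    """Predict a response value at 'freq' from Gaussian basis functions."""
    return sum(w * math.exp(-((freq - c) / width) ** 2)
               for c, w in zip(centers, weights))

centers = [8.0, 10.0, 12.0]   # sampled frequency points in GHz (assumed)
weights = [0.2, 0.9, 0.3]     # trained output weights (assumed)
print(rbf_predict(10.0, centers, weights))  # dominated by the 10 GHz basis
```

Training amounts to choosing the centers, widths and output weights from simulated or measured samples; an MLP replaces the Gaussian layer with sigmoidal hidden units.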

Relevance:

30.00%

Publisher:

Abstract:

This study discusses the formation of social networks in the everyday practice of the Estratégia Saúde da Família (Family Health Strategy), drawing on sociological theory about networks, interactions, gift and recognition. Its general objective is to analyze local social networks in health based on the interaction between users and professionals of the Family Health Strategy at the Ligéia Health Unit, in Natal, RN. Its specific objectives are: to map the existing local social networks in health in the assigned territory; to identify the types of everyday interactions among the subjects; and to understand the subjects' perception of the process of social network formation arising from those interactions. It is an exploratory qualitative study whose subjects were professionals and users linked to that health unit. Data were collected through semi-structured individual interviews and focus group discussions, stimulated by the Everyday Network Analysis Methodology (MARES), which is suited to addressing the complexity of social relations and to mapping the different contents expressed and the forms of collective mobilization. Data analysis was carried out using the Thematic Content Analysis technique proposed by Minayo. The results were interpreted in the light of the theories of the Gift (Mauss) and of Recognition (Honneth). The subjects identified: Virtual Network (28.20%); Health Care Network (25.64%); User Networks (17.95%); Personal Network (10.26%); Community Council (10.26%); Schools (7.69%). Participants did not perceive family arrangements as social networks. The types of social interaction identified were: Confrontation/Negotiation (41.02%); Harmonious (25.70%); Correlative (17.90%); Defined by the Organization (15.38%).
The formation of social networks arises from everyday interactions between people, through the inseparable articulation of contents and forms, catalyzed by context, experience and cognition, valuing the freedom, expressiveness and diversity of the partners in meaning-making. Two categories emerged in the subjects' perception of the formation of everyday social networks: Dialogue and Encounter. We validate and recommend the use of the MARES methodology: in training, to awaken a more tolerant and humane view of oneself and of others; in the qualitative evaluation of services, as it facilitates reflection on practice and the (re)organization of the work process; and in the community, to stimulate existing or emerging social movements. Betting on the circuit of the gift and of mutual recognition, as people move through social networks in health, may be capable of weaving a transformative praxis through the pursuit and attainment of trust, respect and esteem in the spaces of encounter between users and professionals of the Family Health Strategy.

Relevance:

30.00%

Publisher:

Abstract:

One of the current challenges of Ubiquitous Computing is the development of complex applications, that is, applications that are more than simple alarms triggered by sensors or simple systems that configure the environment according to user preferences. Such applications are hard to develop, since they are composed of services provided by different middleware, and developers need to know the peculiarities of each of them, mainly their communication and context models. This thesis presents OpenCOPI, a platform that integrates various service providers, including context provision middleware. It provides a unified ontology-based context model, as well as an environment that enables the easy development of ubiquitous applications through the definition of semantic workflows containing an abstract description of the application. These semantic workflows are converted into concrete workflows, called execution plans. An execution plan is a workflow instance containing activities that are automated by a set of Web services. OpenCOPI supports automatic Web service selection and composition, enabling the use of services provided by distinct middleware in an independent and transparent way. Moreover, the platform also supports execution adaptation in case of service failures, user mobility and degradation of service quality. OpenCOPI is validated through the development of case studies, specifically applications from the oil industry. In addition, this work evaluates the overhead introduced by OpenCOPI, comparing it with the benefits provided, and the efficiency of OpenCOPI's selection and adaptation mechanisms.
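The conversion of a semantic workflow into an execution plan comes down to binding each abstract activity to a concrete service; adaptation then means rebinding when a service fails. A much-simplified sketch of that selection step, not OpenCOPI's actual API (activity names, service names and the single quality value are illustrative):

```python
# Hypothetical sketch of the selection step: each abstract activity of a
# workflow is bound to its best available candidate service, and on failure
# the next-best candidate is used (adaptation).
def build_execution_plan(workflow, candidates, failed=frozenset()):
    """Map each abstract activity to its best available service by quality."""
    plan = {}
    for activity in workflow:
        options = [s for s in candidates.get(activity, [])
                   if s["name"] not in failed]
        if not options:
            raise RuntimeError(f"no service available for activity {activity!r}")
        plan[activity] = max(options, key=lambda s: s["quality"])["name"]
    return plan

workflow = ["fetch-context", "notify"]
candidates = {
    "fetch-context": [{"name": "ctxA", "quality": 0.9},
                      {"name": "ctxB", "quality": 0.7}],
    "notify": [{"name": "mail", "quality": 0.8}],
}
print(build_execution_plan(workflow, candidates))            # ctxA chosen
print(build_execution_plan(workflow, candidates, {"ctxA"}))  # adapts to ctxB
```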

Relevance:

30.00%

Publisher:

Abstract:

The next generation of computers is expected to be built on architectures with multiple processors and/or multicore processors. In this context there are challenges related to interconnection features, operating frequency, on-chip area, power dissipation, performance and programmability. Networks-on-chip are considered the ideal interconnection and communication mechanism for this type of architecture, due to their scalability, reusability and intrinsic parallelism. Network-on-chip communication is accomplished by transmitting packets that carry data and instructions representing requests and responses between the processing elements interconnected by the network. Packets are transmitted as in a pipeline between the routers in the network, from the source to the destination of the communication, even allowing simultaneous communications between different source-destination pairs. Based on this fact, it is proposed to transform the entire communication infrastructure of the network-on-chip (its routing, arbitration and storage mechanisms) into a high-performance parallel processing system. In this proposal, the packets are formed by the instructions and data that represent the applications, which are executed by the routers as the packets are transmitted, exploiting the pipelined and parallel nature of the communications. Traditional processors are not used; there are only simple cores that control access to memory. An implementation of this idea is called IPNoSys (Integrated Processing NoC System), which has its own programming model and a routing algorithm that guarantees the execution of all instructions in the packets, preventing deadlock, livelock and starvation. The architecture provides mechanisms for input and output, interrupts and operating system support.
As a proof of concept, a programming environment and a SystemC simulator for this architecture were developed, allowing the configuration of various parameters and the collection of several results to evaluate it.
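The central idea of executing instructions "in transit" can be illustrated abstractly: a packet carries data plus an instruction list, and each router on the path consumes one instruction before forwarding. A toy sketch of that behavior, not the actual IPNoSys instruction set (opcodes and router names are illustrative):

```python
# Hypothetical sketch of the IPNoSys idea: a packet carries both data and
# instructions, and each router on the path executes the next instruction
# before forwarding, so computation happens during communication.
def execute_in_transit(packet, path):
    """Execute one instruction per router hop; return result and hop trace."""
    value = packet["data"]
    trace = []
    for router, (op, operand) in zip(path, packet["instructions"]):
        if op == "ADD":
            value += operand
        elif op == "MUL":
            value *= operand
        trace.append((router, op, value))
    return value, trace

packet = {"data": 2, "instructions": [("ADD", 3), ("MUL", 4)]}
result, trace = execute_in_transit(packet, path=["R0", "R1"])
print(result)  # (2 + 3) * 4 = 20, computed along the route
```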

Relevance:

30.00%

Publisher:

Abstract:

The increasing capacity to integrate transistors has made it possible to develop complete systems, with several components, on a single chip; these are called SoCs (Systems-on-Chip). However, the interconnection subsystem can limit the scalability of SoCs, as happens with buses, or can be an ad hoc solution, as with bus hierarchies. Thus, the ideal interconnection subsystem for SoCs is the Network-on-Chip (NoC). NoCs allow simultaneous point-to-point channels between components and can be reused in other projects. However, NoCs can also raise design complexity, on-chip area and power dissipation. It is therefore necessary either to modify the way they are used or to change the development paradigm. Accordingly, a NoC-based system is proposed in which applications are described as packages and executed in each router between source and destination, without traditional processors. To execute applications regardless of the number of instructions and of the NoC dimensions, the spiral complement algorithm was developed, which keeps finding a new destination until all instructions have been executed. The objective is thus to study the viability of developing this system, named the IPNoSys system. In this study, a cycle-accurate tool was developed in SystemC to simulate the system executing applications, which were implemented in a package description language also developed for this study. Through the simulation tool, several results were obtained to evaluate the system's performance. The methodology used to describe an application consists in transforming the high-level application into a data-flow graph that becomes one or more packages. This methodology was applied to three applications: a counter, a 2-D DCT and a floating-point addition. The counter was used to evaluate a deadlock solution and to execute a parallel application. The DCT was used for comparison with the STORM platform.
Finally, the floating-point addition aimed to evaluate the efficiency of the software routine that executes an instruction not implemented in hardware. The simulation results confirm the viability of developing the IPNoSys system. They show that it is possible to execute applications described as packages, sequentially or in parallel, without interruptions caused by deadlock, and also that the execution time of IPNoSys is more efficient than that of the STORM platform.

Relevance:

30.00%

Publisher:

Abstract:

Recently, the attention given to Web Services and Semantic Web technologies has driven several research projects addressing the Web service composition issue in different ways. Meanwhile, the challenge of creating an environment in which an abstract business process can be specified and then automatically and dynamically implemented by a composite service is still considered an open problem. WSDL and BPEL, provided by industry, support only manual service composition, because they lack the semantics needed for Web services to be discovered, selected and combined by software agents. The service ontologies provided by the Semantic Web enrich the syntactic descriptions of Web services to facilitate the automation of tasks such as discovery and composition. This work presents an environment for specifying and executing ad-hoc Web service-based business processes, named WebFlowAH. WebFlowAH employs a common domain ontology to describe both Web services and business processes. It allows processes to be specified in terms of user goals or desires expressed with the concepts of this common domain ontology. This approach allows processes to be specified in an abstract, high-level way, unburdening the user from the underlying details needed to effectively run the process workflow.
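Goal-based composition can be pictured as a covering problem: the user's goal is a set of domain-ontology concepts, and services whose advertised outputs supply those concepts are chained until the goal is covered. A greatly simplified greedy sketch, not WebFlowAH's actual matching algorithm (concept and service names are illustrative):

```python
# Hypothetical sketch of goal-based matching: pick services whose advertised
# ontology concepts cover the concepts the user's goal requires.
def match_services(goal_concepts, services):
    """Greedily pick services until every goal concept is covered."""
    covered, chosen = set(), []
    for svc in services:
        gain = (goal_concepts - covered) & svc["provides"]
        if gain:
            chosen.append(svc["name"])
            covered |= gain
        if covered >= goal_concepts:
            break
    return chosen if covered >= goal_concepts else None

services = [
    {"name": "GeoSvc", "provides": {"Location"}},
    {"name": "WeatherSvc", "provides": {"Forecast", "Temperature"}},
]
print(match_services({"Location", "Forecast"}, services))  # both services chosen
```

A real matcher would also reason over concept subsumption in the ontology rather than exact set membership.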

Relevance:

30.00%

Publisher:

Abstract:

The distribution of petroleum products through pipeline networks is an important problem that arises in the production planning of refineries. It consists in determining what will be done in each production stage, given a time horizon, concerning the distribution of products from source nodes to demand nodes, passing through intermediate nodes. Constraints concerning storage limits, delivery time, source availability and limits on sending or receiving, among others, have to be satisfied. The problem can be viewed as a biobjective problem that aims at minimizing both the time needed to transport the set of batches through the network and the fragmentation, that is, the successive transmission of different products in the same pipe. In this work, three algorithms are developed and applied to this problem: the first is a discrete algorithm based on Particle Swarm Optimization (PSO), with local search procedures and path-relinking proposed as velocity operators; the second and third are two versions based on the Non-dominated Sorting Genetic Algorithm II (NSGA-II). The proposed algorithms are compared with other approaches to the same problem in terms of solution quality and computational time, so that the efficiency of the developed methods can be evaluated.
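In a biobjective setting there is no single best solution; NSGA-II style methods rank solutions by Pareto dominance. A minimal sketch of the dominance test and the first non-dominated front, with illustrative (time, fragmentation) values:

```python
# Sketch of the biobjective comparison used by NSGA-II style methods: a
# solution dominates another if it is no worse in both objectives and
# strictly better in at least one (both objectives minimized here).
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated solutions (the first NSGA-II front)."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# (transport time, fragmentation) pairs, illustrative values only
sols = [(10, 5), (12, 3), (11, 6), (9, 7)]
print(pareto_front(sols))  # [(10, 5), (12, 3), (9, 7)]; (11, 6) is dominated
```

NSGA-II repeats this sorting over successive fronts and adds crowding distance to preserve diversity along the trade-off curve.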

Relevance:

30.00%

Publisher:

Abstract:

Research on Wireless Sensor Networks (WSNs) has evolved, with potential applications in several domains. However, the building of WSN applications is hampered by the need to program with the low-level abstractions provided by sensor operating systems and by the need for specific knowledge about each application domain and each sensor platform. We propose an MDA approach to develop WSN applications. This approach allows domain experts to contribute directly to the development of applications without needing low-level knowledge of WSN platforms and, at the same time, allows network experts to program WSN nodes to meet application requirements without specific knowledge of the application domain. Our approach also promotes the reuse of the developed software artifacts, allowing an application model to be reused across different sensor platforms and a platform model to be reused for different applications.
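The MDA separation can be pictured as combining a platform-independent application model with a platform-specific template to generate node behavior, so the same application model targets different platforms. A toy sketch of that transformation step (model fields, platform names and template strings are all illustrative, not the thesis's metamodels):

```python
# Hypothetical sketch of the MDA transformation: one platform-independent
# application model, two platform models, two generated artifacts.
app_model = {"sense": "temperature", "period_s": 60, "aggregate": "avg"}

platform_templates = {
    "TinyOS-like": "read {sense} every {period_s}s; send {aggregate} to sink",
    "Contiki-like": "timer({period_s}s) -> sample {sense} -> {aggregate} -> sink",
}

def generate(app, platform):
    """Transform the application model using the chosen platform template."""
    return platform_templates[platform].format(**app)

print(generate(app_model, "TinyOS-like"))
print(generate(app_model, "Contiki-like"))  # same model, different platform
```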

Relevance:

30.00%

Publisher:

Abstract:

The World Wide Web has been consolidated over the last years as a standard platform for providing software systems on the Internet. Nowadays, a great variety of user applications are available on the Web, from corporate applications to the banking domain, and from electronic commerce to the governmental domain. Given the amount of information available and the number of users dealing with these services, many Web systems have sought to present usage recommendations as part of their functionality, so that users can make better use of the available services based on their profile, navigation history and system usage. In this context, this dissertation proposes the development of an agent-based framework that offers recommendations to users of Web systems. It involves the conception, design and implementation of an object-oriented framework. The framework agents can be plugged into or unplugged from existing Web applications in a non-invasive way using aspect-oriented techniques. The framework is evaluated through its instantiation for three different Web systems.
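The non-invasive plugging idea can be illustrated with a decorator, which plays the role of an aspect in Python: it intercepts an existing page handler and attaches recommendations without editing the handler's body. This is only an analogy for the aspect-oriented mechanism, with illustrative names:

```python
# Hypothetical sketch: a decorator acts as an aspect, weaving recommendation
# behavior around an unmodified page handler.
def with_recommendations(recommender):
    def wrap(handler):
        def wrapped(user):
            page = handler(user)                  # original behavior untouched
            page["recommendations"] = recommender(user)
            return page
        return wrapped
    return wrap

def recommend_by_history(user):
    # Toy recommendation rule based on navigation history
    return ["item-42"] if "books" in user["history"] else []

@with_recommendations(recommend_by_history)
def product_page(user):
    return {"title": "Store", "user": user["name"]}

print(product_page({"name": "ana", "history": ["books"]}))
```

Removing the decorator "unplugs" the agent, which mirrors the pluggable/unpluggable property claimed above.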

Relevance:

30.00%

Publisher:

Abstract:

The increasing complexity of integrated circuits has boosted the development of communication architectures such as Networks-on-Chip (NoCs), an architectural alternative for the interconnection of Systems-on-Chip (SoCs). Networks-on-chip rely on component reuse, parallelism and scalability, enhancing reusability in projects of dedicated applications. In the literature, many proposals have been made suggesting different configurations for network-on-chip architectures. Among the networks-on-chip considered, the IPNoSys architecture is a non-conventional one, since it allows the execution of operations while the communication process is performed. This study aims to evaluate the execution of data-flow based applications on IPNoSys, focusing on their adaptation to the design constraints. Data-flow based applications are characterized by a continuous stream of data on which operations are executed. We expect these applications to perform well on IPNoSys, because their programming model is similar to the execution model of this network. By observing the behavior of these applications when running on IPNoSys, changes were made to the IPNoSys execution model, allowing the implementation of instruction-level parallelism. For this purpose, implementations of data-flow applications were analyzed and compared.
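The affinity between data-flow applications and IPNoSys rests on ordering: a data-flow program is a graph of operations, and any topological order of that graph yields an instruction sequence a packet can carry. A minimal sketch of that ordering step (the graph is an illustrative fragment, not a real DCT kernel):

```python
# Hypothetical sketch: a topological order of a data-flow graph gives the
# instruction sequence carried by an IPNoSys-style package.
def topological_order(graph):
    """Kahn's algorithm: order ops so every input is produced before use."""
    indeg = {n: 0 for n in graph}
    for n in graph:
        for m in graph[n]:
            indeg[m] += 1
    ready = [n for n, d in indeg.items() if d == 0]
    order = []
    while ready:
        n = ready.pop()
        order.append(n)
        for m in graph[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                ready.append(m)
    return order

# Two loads feed an add; the add feeds a multiply (a tiny data-flow fragment)
graph = {"load_a": ["add"], "load_b": ["add"], "add": ["mul"], "mul": []}
print(topological_order(graph))  # both loads before add, mul last
```

Independent nodes at the same depth (here the two loads) are exactly the operations a modified execution model can run in parallel.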