10 results for Computer Network Resources

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

80.00%

Publisher:

Abstract:

This dissertation presents a cooperative virtual multimedia environment for real-time use in the medical field over a TCP/IP computer network. The Virtual Diagnosis Room environment makes it possible to perform cooperative tasks using classical image processing, synchronous and asynchronous text conversation (chat), and content markup, in order to produce remote cooperative diagnoses. The dissertation also describes in detail the tool and the functions that enable interaction among users, along with implementation details, contributions and weaknesses of this work.
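
The abstract does not go into implementation detail. One of the cooperative features it names, synchronous text conversation over TCP/IP, can be sketched as a minimal relay server; the host, port and line-based framing below are illustrative assumptions, not details from the dissertation.

```python
# Minimal sketch of a synchronous chat relay over TCP/IP, one of the
# cooperative features described above. Host, port and line-based
# framing are illustrative assumptions, not dissertation details.
import socket
import threading

HOST, PORT = "0.0.0.0", 5000  # hypothetical relay address
clients = []                  # sockets of all connected participants
lock = threading.Lock()

def handle(conn):
    """Broadcast every line received from one client to all the others."""
    try:
        for line in conn.makefile("r", encoding="utf-8"):
            with lock:
                for other in clients:
                    if other is not conn:
                        other.sendall(line.encode("utf-8"))
    finally:
        with lock:
            clients.remove(conn)
        conn.close()

server = socket.create_server((HOST, PORT))
while True:
    conn, _addr = server.accept()
    with lock:
        clients.append(conn)
    threading.Thread(target=handle, args=(conn,), daemon=True).start()
```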

Relevance:

80.00%

Publisher:

Abstract:

In the last two decades of the past century, following the consolidation of the Internet as the world-wide computer network, applications generating more robust data flows started to appear. The increasing use of videoconferencing stimulated the creation of a new form of point-to-multipoint transmission called IP Multicast. All companies working on software and hardware development for network videoconferencing have adjusted their products and developed new solutions for the use of multicast. However, the configuration of such different solutions is not easy, especially when changes in the operating system are also required. Besides, the existing free tools have limited functions, and the current commercial solutions are heavily dependent on specific platforms. With the maturity of IP Multicast technology and its inclusion in all current operating systems, object-oriented programming languages developed classes able to handle multicast traffic. So, with the help of Java APIs for networking, databases and hypertext, it became possible to develop an integrated environment able to handle multicast traffic, which is the major objective of this work. This document describes the implementation of the above-mentioned environment, which provides many functions to use and manage multicast traffic, functions which previously existed only in a limited way and in few tools, normally commercial ones. This environment is useful to different kinds of users: common users who want to join multimedia Internet sessions, as well as more advanced users, such as engineers and network administrators, who may need to monitor and handle multicast traffic.
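
The environment itself is built on Java's networking APIs (e.g. java.net.MulticastSocket). Purely to illustrate the underlying IP Multicast mechanism, a sketch of joining a group and receiving datagrams follows; the group address and port are illustrative.

```python
# Sketch of the basic IP Multicast operation the environment builds on:
# joining a group and receiving datagrams. The thesis uses Java's
# networking classes; this Python equivalent only illustrates the
# mechanism. Group address and port are illustrative, not from the work.
import socket
import struct

GROUP, PORT = "239.1.2.3", 5004  # hypothetical session address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Tell the kernel (and, via IGMP, the routers) to join the group.
mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, sender = sock.recvfrom(65535)
    print(f"{len(data)} bytes from {sender[0]}")
```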

Relevance:

80.00%

Publisher:

Abstract:

A new method to perform TCP/IP fingerprinting is proposed. TCP/IP fingerprinting is the process of identifying a remote machine through a TCP/IP-based computer network. This method has many applications related to network security; both intrusion and defence procedures may use this process to achieve their objectives. There are many known methods that perform this process under favorable conditions, but nowadays there are many adversities that reduce the identification performance. This work aims at the creation of a new OS fingerprinting tool that bypasses these current problems. The proposed method is based on the use of attractor reconstruction and neural networks to characterize and classify pseudo-random number generators.
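
The attractor reconstruction the abstract mentions is typically a time-delay embedding of a scalar sequence. The sketch below assumes the observed sequence is a host's TCP initial sequence numbers, a plausible but hypothetical choice; the dissertation's actual input series and embedding parameters are not given here.

```python
# Sketch of the attractor-reconstruction step mentioned above: a
# time-delay embedding of a scalar sequence into d-dimensional points.
# Using TCP ISNs as the input series, and the tau/dim values, are
# illustrative assumptions, not parameters from the work.
import numpy as np

def delay_embed(series, dim=3, tau=1):
    """Map x[t] to vectors (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    x = np.asarray(series, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

# Hypothetical example: embed differences of observed ISNs. The geometry
# of the resulting point cloud depends on the generator that produced
# them; a neural network can then classify that geometry per OS.
isns = np.random.randint(0, 2**32, size=1000)  # stand-in for captured ISNs
points = delay_embed(np.diff(isns), dim=3, tau=1)
print(points.shape)  # (n_points, 3)
```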

Relevance:

80.00%

Publisher:

Abstract:

Even in the 21st century, there are still difficulties in accessing broadband Internet in several Brazilian cities, owing to the purchasing power of the population and the lack of government investment. Even so, the use of wireless technology based on the IEEE 802.11b protocol, also known as Wi-Fi (Wireless Fidelity), is encouraged, and it has a wide range of commercial applications in the national and international markets. In Brazil, this technology is in full operation in the major cities and has proved attractive for both point-to-multipoint and point-to-point access. This work presents a comparative analysis of field prediction using propagation-loss models. To validate the techniques employed, the Okumura-Hata, modified Okumura-Hata and Walfisch-Ikegami models were applied to a wireless computer network located in the Cajupiranga neighborhood, in the city of Melbourn, in Rio Grande do Norte. The models are applied to an 802.11b wireless network, with mobile radio equipment used to measure signal levels, in addition to recording the antenna heights and the distances from the transmitter. The measured signal-versus-distance data are plotted and compared with the results obtained from the propagation-model calculations.
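
For reference, the classical Okumura-Hata model named above gives the median path loss as a function of frequency, antenna heights and distance. A sketch of the urban, small/medium-city variant follows; note that the classical model is specified for 150-1500 MHz, below the 2.4 GHz band of 802.11b, which is one reason modified versions are applied in the work.

```python
# Sketch of the classical Okumura-Hata median path-loss model (urban,
# small/medium-city variant) referenced above. The classical model is
# specified for 150-1500 MHz, below 802.11b's 2.4 GHz band, hence the
# need for modified versions in the work itself.
import math

def okumura_hata_db(f_mhz, h_base_m, h_mobile_m, d_km):
    """Median path loss in dB for an urban small/medium city."""
    # Mobile-antenna correction factor a(h_m) for small/medium cities.
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

# Illustrative numbers only (not measurements from the Cajupiranga network):
print(round(okumura_hata_db(f_mhz=900, h_base_m=30, h_mobile_m=1.5, d_km=2), 1))
```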

Relevance:

80.00%

Publisher:

Abstract:

The traditional perimeter-based approach to computer network security (the castle-and-moat model) hinders the progress of enterprise systems and promotes, in both administrators and users, the delusion that systems are protected. To deal with the new range of threats, a new data-safety-oriented paradigm, called de-perimeterisation, began to be studied in the last decade. One of the requirements for the implementation of the de-perimeterised security model is the definition of a safe and effective mechanism for federated identity. This work seeks to fill this gap by presenting the specification, modelling and implementation of a mechanism for federated identity based on the combination of SAML and X.509 digital certificates stored in smart cards, following the A3 standard of ICP-Brasil (the Brazilian official certificate authority and PKI).
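
The abstract does not detail the implementation stack. As a minimal sketch of the X.509 side of such a mechanism, loading a certificate and reading the fields an identity provider would assert in a SAML statement could look like the following; the `cryptography` library and the file name are assumptions, and ICP-Brasil A3 keys actually live on a smart card accessed through PKCS#11.

```python
# Minimal sketch of the X.509 side of a federated-identity mechanism:
# load a certificate and read the fields an identity provider would
# assert in a SAML statement. The `cryptography` library and the file
# name are illustrative assumptions, not the stack used in the work
# (ICP-Brasil A3 keys live on a smart card, accessed via PKCS#11).
from cryptography import x509
from cryptography.x509.oid import NameOID

with open("user_cert.pem", "rb") as f:          # hypothetical certificate
    cert = x509.load_pem_x509_certificate(f.read())

subject_cn = cert.subject.get_attributes_for_oid(NameOID.COMMON_NAME)[0].value
print("subject:", subject_cn)
print("issuer:", cert.issuer.rfc4514_string())
print("valid until:", cert.not_valid_after)
```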

Relevance:

80.00%

Publisher:

Abstract:

Various physical systems have dynamics that can be modeled by percolation processes. Percolation is used to study issues ranging from fluid diffusion through disordered media to the fragmentation of a computer network caused by hacker attacks. A common feature of all these systems is the presence of two non-coexistent regimes associated with certain properties of the system; for example, a disordered medium may or may not allow the flow of the fluid, depending on its porosity. The change from one regime to the other characterizes the percolation phase transition. The standard way of analyzing this transition uses the order parameter, a variable related to some characteristic of the system that is zero in one of the regimes and nonzero in the other. The proposal introduced in this thesis is that this phase transition can be investigated without the explicit use of the order parameter, through the Shannon entropy instead. This entropy is a measure of the degree of uncertainty in the information content of a probability distribution. The proposal is evaluated in the context of cluster formation in random graphs, and the method is applied to both classical (Erdős-Rényi) and explosive percolation. It is based on the computation of the entropy contained in the cluster-size probability distribution, and the results show that the critical point of the transition is related to the derivatives of the entropy. Furthermore, the difference between the smooth and abrupt characters of the classical and explosive percolation transitions, respectively, is reinforced by the observation that the entropy reaches a maximum at the critical point of the classical transition, while no such correspondence occurs in explosive percolation.
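
A minimal sketch of the quantity described above: grow a random graph edge by edge and track the Shannon entropy H = -Σ p(s) log p(s) of the cluster-size distribution. Here p(s) is taken as the probability that a randomly chosen node belongs to a cluster of size s, and the union-find bookkeeping is illustrative; neither choice is claimed to match the thesis code.

```python
# Sketch of the quantity studied above: the Shannon entropy of the
# cluster-size distribution while an Erdos-Renyi graph grows edge by
# edge. The node-weighted distribution and the union-find bookkeeping
# are illustrative choices, not the thesis implementation.
import math
import random
from collections import Counter

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path halving
        i = parent[i]
    return i

def cluster_entropy(n_nodes, n_edges, seed=0):
    rng = random.Random(seed)
    parent = list(range(n_nodes))
    size = [1] * n_nodes
    entropies = []
    for _ in range(n_edges):
        a = find(parent, rng.randrange(n_nodes))
        b = find(parent, rng.randrange(n_nodes))
        if a != b:                       # merge the two clusters
            if size[a] < size[b]:
                a, b = b, a
            parent[b] = a
            size[a] += size[b]
        # p(s) = fraction of nodes lying in clusters of size s;
        # H = -sum_s p(s) log p(s).
        counts = Counter(size[find(parent, i)] for i in range(n_nodes))
        h = -sum((c / n_nodes) * math.log(c / n_nodes)
                 for c in counts.values())
        entropies.append(h)
    return entropies

h = cluster_entropy(n_nodes=2_000, n_edges=2_000)
# For classical ER percolation the peak sits near n/2 added edges:
print(max(range(len(h)), key=h.__getitem__))
```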

Relevance:

30.00%

Publisher:

Abstract:

The increasing number of attacks on computer networks has been addressed by increasing the resources applied directly to the active routing equipment of these networks. In this context, firewalls have been consolidated as essential elements in the process of controlling the packets entering and leaving a network. With the advent of intrusion detection systems (IDS), efforts have been made to incorporate signature-based packet filtering into traditional firewalls. This integration adds IDS functions (such as signature-based filtering, until then a passive element) to the functions already existing in the firewall. Despite the efficiency of this incorporation in blocking attacks with known signatures, application-level filtering introduces a natural delay in the analyzed packets and can reduce the machine's capacity to filter the remaining packets, because of the machine resources demanded by this level of filtering. This work presents models for treating this problem based on re-routing packets for analysis by a sub-network with specific filters. The suggested implementation of this model aims to reduce the performance problem and to open space for scenarios where other non-conventional filtering solutions (spam blocking, P2P traffic control/blocking, etc.) can be inserted into the filtering sub-network without overloading the main firewall of a corporate network.
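
The abstract only outlines the model. As a conceptual sketch (not the thesis implementation), the main firewall's role reduces to a cheap per-packet decision: forward ordinary traffic directly, and divert traffic that needs deep, signature-based analysis to the filtering sub-network. The port set and gateway address below are illustrative assumptions.

```python
# Conceptual sketch of the re-routing model described above (not the
# thesis implementation): the main firewall applies only cheap header
# checks and diverts traffic needing deep, signature-based analysis to
# a dedicated filtering sub-network. The port set and the sub-network
# gateway address are illustrative assumptions.
FILTER_SUBNET_GW = "10.0.99.1"           # hypothetical analysis gateway
DEEP_INSPECT_PORTS = {25, 80, 6881}      # e.g. mail (spam), HTTP, P2P

def route_packet(pkt):
    """Return the next hop: 'forward' or the analysis sub-network gateway."""
    if pkt["dst_port"] in DEEP_INSPECT_PORTS:
        return FILTER_SUBNET_GW          # costly filtering happens off-path
    return "forward"                     # main firewall stays lightweight

print(route_packet({"dst_port": 25}))    # -> 10.0.99.1
print(route_packet({"dst_port": 443}))   # -> forward
```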

Relevance:

30.00%

Publisher:

Abstract:

Artificial neural networks are usually applied to solve complex problems. In more complex problems, greater functional efficiency can be achieved by increasing the number of layers and neurons; nevertheless, this leads to greater computational effort. The response time is an important factor in the decision to use neural networks in some systems. Many argue that the computational cost is higher in the training period, but this phase is carried out only once; once the network is trained, it is necessary to use the existing computational resources efficiently. In the multicore era, the problem boils down to the efficient use of all available processing cores, while taking the overhead of parallel computing into account. In this sense, this work proposes a modular structure that proved to be more suitable for parallel implementations. It parallelizes the feedforward process of an MLP-type artificial neural network, implemented with OpenMP on a shared-memory computer architecture. The research consists of testing and analyzing execution times; speedup, efficiency and parallel scalability are analyzed. In the proposed approach, reducing the number of connections between remote neurons decreases the response time of the network and, consequently, the total execution time. The time required for communication and synchronization is directly linked to the number of remote neurons in the network, so it is necessary to investigate the best distribution of remote connections.
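
The work implements the parallel feedforward with OpenMP on shared memory. Purely to illustrate the decomposition, the sketch below splits one layer's neurons into per-worker blocks and joins the partial results, which is where the communication and synchronization cost discussed above appears; the shapes and the thread pool are illustrative assumptions.

```python
# Illustration of the partitioning idea above: each worker computes the
# activations of one block of a layer's neurons, and the blocks are
# joined afterwards. The thesis implements this with OpenMP on shared
# memory; this NumPy/thread-pool version only sketches the
# decomposition, and all shapes are illustrative assumptions.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(512)             # layer input
W = rng.standard_normal((256, 512))      # weights: 256 neurons in this layer
b = rng.standard_normal(256)

def neuron_block(rows):
    """Activations of one contiguous block of this layer's neurons."""
    lo, hi = rows
    return np.tanh(W[lo:hi] @ x + b[lo:hi])

n_workers = 4
bounds = np.linspace(0, W.shape[0], n_workers + 1, dtype=int)
with ThreadPoolExecutor(max_workers=n_workers) as pool:
    parts = pool.map(neuron_block, zip(bounds[:-1], bounds[1:]))
y = np.concatenate(list(parts))          # join: the synchronization point
print(y.shape)                           # (256,)
```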

Relevance:

30.00%

Publisher:

Abstract:

Urban stormwater can be considered both a potential water resource and a problem for the proper functioning of the manifold activities of the city, as a result of the inappropriate use and occupation of the soil, usually due to poor planning of the occupation of development areas, with little care for the environmental aspects of the drainage of surface runoff. As a basic premise, mechanisms should be sought to preserve the natural flow regime at all stages of the development of an urban area: preserving the soil's infiltration capacity at the scale of the urban area, understanding the mechanisms of natural drainage, and preserving the naturally dynamic areas of the water courses, in both the main channel and the secondary ones. These are challenges for sustainable urban development, in which modern development coexists harmoniously with economic, environmental and social quality. Integrated studies involving the quantity and quality of rainwater are absolutely necessary to achieve understanding and to obtain appropriate technologies, covering both the drainage problems and the use of the water when surface runoff is adequately managed, for example by accumulation in detention reservoirs with the possibility of use for other purposes. This study aims to develop a computational model, adjusted to the prevailing conditions of an experimental urban watershed, in order to enable the implementation of water-resource management practices: hydrological simulations of the quantity and, in a preliminary way, the quality of the stormwater that flows to a pond located at the downstream end of the basin. To this end, the distributed model SWMM was used together with data collected in the basin at the highest possible resolution, so as to allow the simulation of diffuse loads and of the heterogeneous characteristics of the basin, in terms of both the hydrological and hydraulic parameters and the use and occupation of the soil. This parallel work should improve the degree of understanding of the phenomena simulated in the basin, as well as the calibration of the models, supported by monitoring data acquired during the project MAPLU (Urban Stormwater Management), belonging to the PROSAB network (Research Program in Basic Sanitation), in the years 2006 to 2008.
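
The abstract does not say how the SWMM model is driven. As a minimal, hypothetical sketch, a SWMM input file can be run programmatically as below; the pyswmm wrapper, the input file name and the node identifier are assumptions for illustration (the study may well have run SWMM directly).

```python
# Minimal, hypothetical sketch of driving a SWMM simulation
# programmatically. The pyswmm wrapper, the input file name and the
# node ID are assumptions for illustration; the study may have run
# SWMM directly rather than through this wrapper.
from pyswmm import Simulation, Nodes

with Simulation("cajupiranga_basin.inp") as sim:   # hypothetical .inp file
    pond = Nodes(sim)["POND_OUTLET"]               # hypothetical node ID
    for _step in sim:
        # Inspect state while the model steps through the rain series.
        print(sim.current_time, pond.total_inflow)
```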