808 results for Internet (Computer network) -- Economic aspects
Abstract:
IP-based networks still do not offer the degree of reliability required by new multimedia services, and achieving such reliability will be crucial to the success or failure of the next Internet generation. Most existing schemes for QoS routing do not take into consideration parameters concerning the quality of protection, such as packet loss or restoration time. In this paper, we define a new paradigm for developing protection strategies for building reliable MPLS networks, based on what we have called the network protection degree (NPD). The NPD consists of an a priori evaluation, the failure sensibility degree (FSD), which provides the failure probability, and an a posteriori evaluation, the failure impact degree (FID), which determines the impact on the network in case of failure. Having mathematically formulated these components, we point out the most relevant ones. Experimental results demonstrate the benefits of the NPD when used to enhance current QoS routing algorithms to offer a certain degree of protection.
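The abstract does not give the paper's exact formulation; the following is a minimal sketch, assuming the NPD is a weighted combination of the FSD and FID scores, with illustrative weights and link values only.

```python
# Hypothetical sketch of the network protection degree (NPD) described above.
# NPD is assumed here to be a weighted combination of the failure sensibility
# degree (FSD, an a priori failure probability) and the failure impact degree
# (FID, an a posteriori measure of the damage a failure would cause).
# The weight alpha and the link values are illustrative, not from the paper.

def npd(fsd: float, fid: float, alpha: float = 0.5) -> float:
    """Combine failure probability (fsd) and failure impact (fid),
    both normalised to [0, 1], into a single protection-degree score."""
    assert 0.0 <= fsd <= 1.0 and 0.0 <= fid <= 1.0
    return alpha * fsd + (1.0 - alpha) * fid

# A QoS routing algorithm could then prefer the candidate path whose
# links minimise this score:
links = {"link_a": (0.02, 0.9), "link_b": (0.10, 0.3)}
best = min(links, key=lambda l: npd(*links[l]))
print(best)
```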
Study of the impact of the maximum Ethernet frame payload size on the IPv6 traffic profile on the Internet
Resumo:
A transição entre a versão 4 para a versão 6 do Internet Protocol (IP) vem ocorrendo na comunidade da Internet. No entanto, a estrutura interna dos protocolos IPv4 e IPv6, em detalhe no tamanho dos seus cabeçalhos, pode provocar alterações no perfil tráfego da rede. Este trabalho estuda as mudanças nas características de tráfego de rede, avaliando o que mudaria se o tráfego gerado fosse apenas IPv6 em vez de IPv4. Este artigo estende-se uma pesquisa anterior, abordando novas questões, mas usando os registos de dados reais disponíveis publicamente. É adotada uma metodologia de engenharia reversa nos pacotes IPv4 capturados, permitindo assim inferir qual a carga original no computador tributário e em seguida reencapsular essa carga em novos pacotes usando restrições de encapsulamento IPv6. Conclui-se que, na transição de IPv4 para IPv6, haverá um aumento no número de pacotes transmitidos na Internet.
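A back-of-the-envelope sketch (not the paper's code) of why re-encapsulating the same payload with IPv6 increases the packet count: the IPv6 base header is 40 bytes versus 20 bytes for an option-less IPv4 header, so fewer payload bytes fit in each Ethernet frame. The header sizes and 1500-byte MTU are standard values; the payload figure is invented.

```python
import math

ETH_MTU = 1500   # Ethernet payload limit (bytes)
IPV4_HDR = 20    # IPv4 header without options
IPV6_HDR = 40    # IPv6 base header

def packets_needed(app_payload: int, ip_header: int) -> int:
    """Number of IP packets required to carry app_payload bytes."""
    per_packet = ETH_MTU - ip_header
    return max(1, math.ceil(app_payload / per_packet))

payload = 100_000  # hypothetical bytes recovered by reverse engineering
print(packets_needed(payload, IPV4_HDR))  # 68 packets as IPv4
print(packets_needed(payload, IPV6_HDR))  # 69 packets as IPv6
```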
Abstract:
To simplify computer management, many system administrators are adopting advanced techniques to manage the software configuration of enterprise computer networks, but the tight coupling between hardware and software makes every PC an individually managed entity, lowering scalability and increasing the cost of managing hundreds or thousands of PCs. Virtualization is an established technology; however, its use has been focused on server consolidation and virtual desktop infrastructure rather than on managing distributed computers over a network. This paper discusses the feasibility of the Distributed Virtual Machine Environment, a new approach for enterprise computer management that combines virtualization and a distributed system architecture as the basis of the management architecture. © 2008 IEEE.
Abstract:
Computers and network services have become a guaranteed presence in many places. This has been accompanied by a growth in illicit events, and computer and network security has therefore become an essential point in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment within a time frame suitable for the application. © 2010 Springer-Verlag Berlin Heidelberg.
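The abstract does not name the statistical method used; as one plausible instance, the sketch below flags an interval whose NetFlow-style flow count deviates from a moving baseline by more than a z-score threshold. All numbers are synthetic.

```python
from collections import deque
from statistics import mean, stdev

WINDOW = 12       # number of past intervals kept as the baseline
THRESHOLD = 3.0   # z-score above which an interval is flagged

history: deque = deque(maxlen=WINDOW)

def check_interval(flow_count: int) -> bool:
    """Return True if this interval's flow count looks anomalous."""
    anomalous = False
    if len(history) == WINDOW:
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (flow_count - mu) / sigma > THRESHOLD:
            anomalous = True
    history.append(flow_count)
    return anomalous

# Feed per-interval counts exported by a NetFlow collector:
for count in [980, 1010, 995, 1020, 990, 1000, 1005, 985,
              1012, 998, 1003, 991, 5400]:  # last value simulates an attack
    if check_interval(count):
        print("anomaly:", count)
```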
Abstract:
Graduate Program in Social Sciences - FFC
Abstract:
The EU began railway reform in earnest around the turn of the century. Two 'railway packages', amounting to a series of directives, have meanwhile been adopted, and a third package has been proposed. A range of complementary initiatives has been undertaken or is under way. This BEEP Briefing inspects the main economic aspects of EU rail reform. After highlighting the dramatic loss of market share of rail since the 1960s, the case for reform is argued to rest on three points: the need for greater competitiveness of rail, promoting the (market-driven) diversion of road haulage to rail as a step towards sustainable mobility in Europe, and an end to the disproportionate claims on the public budgets of Member States. The core of the paper deals respectively with market failures in rail and in the internal market for rail services; the complex economic issues underlying vertical separation (unbundling) and pricing options; and the methods, potential and problems of introducing competition in rail freight and in passenger services. Market failures in the rail sector are several (natural monopoly, economies of density, safety and asymmetries of information), exacerbated by no fewer than seven technical and legal barriers precluding the practical operation of an internal rail market. The EU choice to opt for vertical unbundling (with benefits similar in nature to those in other network industries, e.g. preventing opaque cross-subsidisation and greater cost revelation) risks the emergence of considerable coordination costs. The adoption of marginal cost pricing is problematic on economic grounds (drawbacks include arbitrary cost allocation rules in the presence of large economies of scope and relatively large common costs; a non-optimal incentive system holding back the growth of freight services; and possibly anti-competitive effects of two-part tariffs). Without further detailed harmonisation, it may also lead to many different systems in Member States, causing even greater distortions. Insofar as freight could develop into a competitive market, a combination of Ramsey pricing (given the incentive for service providers to keep market share) and price ceilings based on stand-alone costs might be superior in terms of competition, market growth and regulatory oversight. The incipient cooperative approach to path coordination and allocation is welcome but likely to be seriously insufficient. The arguments for introducing competition, notably in freight, are many and valuable, e.g. optimal cross-border services, quality differentiation as well as general quality improvement, larger scale for cost recovery and a decrease in rent seeking. Nevertheless, it is not correct to argue for the introduction of competition in rail tout court: it depends on the size of the market and on removing a host of barriers; it requires careful PSO definition and costing; and coordination failures ought to be pre-empted. On the other hand, reform and competition cannot and should not be assessed from a static perspective. Conduct and cost structures will change with reform. Infrastructure and investment in technology are known to generate enormous potential for cost savings, especially when coupled with the EU interoperability programme. All this dynamism may well help to induce entry and further enlarge the (net) welfare gains from EU railway reform. The paper ends with a few pointers for the way forward in EU rail reform.
Abstract:
Vita.
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data in the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, often modified on the fly, containing conflicting information, and dealing with rapidly changing contexts, while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous, but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and is hence better suited to pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
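The dissertation's own scheme is not reproduced in the abstract; the sketch below illustrates the general idea of consensus belief fusion using Dempster's rule of combination, a standard operation from belief (Dempster-Shafer) theory, over the frame {event occurred, event did not occur}. The node reports are invented.

```python
# Each mass assignment is (m(occurred), m(not_occurred), m(unknown)),
# summing to 1. Dempster's rule fuses two assignments, discarding the
# conflicting mass and renormalising.

def dempster(m1: tuple, m2: tuple) -> tuple:
    o1, n1, u1 = m1
    o2, n2, u2 = m2
    conflict = o1 * n2 + n1 * o2
    k = 1.0 - conflict                      # normalisation constant
    occurred = (o1 * o2 + o1 * u2 + u1 * o2) / k
    not_occ = (n1 * n2 + n1 * u2 + u1 * n2) / k
    return occurred, not_occ, (u1 * u2) / k

# Three nodes report their (possibly conflicting) detections of one event:
reports = [(0.7, 0.1, 0.2), (0.6, 0.2, 0.2), (0.2, 0.5, 0.3)]
consensus = reports[0]
for r in reports[1:]:
    consensus = dempster(consensus, r)
print(consensus)  # fused belief that the composite event occurred
```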
Abstract:
This research seeks to understand how the problem of information security is treated in Brazil through public thematization, and how it can affect the political and economic aspects of both Brazilian companies and government, using a case study based on the leak of National Security Agency (NSA) documents by Snowden. To this end, a case study of the coverage by sites, blogs and news portals was carried out from the perspective of the evidential paradigm and of studies of the concepts of movement and event. We are interested in examining how the media handle the information security topic and its impact on national and international political relations. The subject matter was considered the largest data leak in the history of the NSA, which ranks as the world's largest intelligence agency. The leak caused great repercussions in Brazil, since it was revealed that the country was the most watched by the United States of America, behind only the USA itself. The consequences were considerable tension between Brazil and the US and a public discussion about privacy and freedom on the Internet. The research analyzed 256 publications released by Brazilian media outlets in digital media between June and July 2013.
Abstract:
Geographers now have at their disposal the worldwide INTERNET network. This network is much more than a giant repository of data and programs: it is an accumulation of human experience that includes text, articles, images, video, and discussion forums. It is a new way of processing and managing information in ways previously considered impossible. The professional who continues to search for and obtain information in the traditional manner will be left on the margins of the new knowledge available daily on the INTERNET. Today's professional is not limited to compiling information in a library or bookstore, but accesses search sites directly to quickly find the desired data. Meteorologists, for example, have in the INTERNET their best tool, since they can retrieve weather images almost immediately after they are stored from the satellite, which allows them to evaluate and discern the current state of the weather (Aberdeen University Computing Center, 1996); the images can be viewed and downloaded to an individual computer for personal use. Professors today can provide students with all their course material by storing it on the INTERNET, and the professor-student relationship is no longer the same: students are expected to find the information on their computers and assimilate it. The old notebook is becoming unnecessary, as lectures can be retrieved for study without the professor having to deliver them, as is done in many universities in the United States (Ohio State University, 1996). In general, this article aims to show professionals in the geographical sciences where to find the information they seek, and how to locate more than they imagine, with the INTERNET network.
Abstract:
Discrete event-driven simulations of digital communication networks have been widely used. However, it is difficult to use a network simulator to simulate a hybrid system in which some objects are not discrete event-driven but continuous time-driven. A networked control system (NCS) is such an application, in which the physical process dynamics are continuous by nature. We have designed and implemented a hybrid simulation environment which effectively integrates models of continuous-time plant processes and discrete-event communication networks by extending the open-source network simulator NS-2. To do this, a synchronisation mechanism was developed to connect a continuous plant simulation with a discrete network simulation. Furthermore, for evaluating co-design approaches in an NCS environment, a piggybacking method was adopted to allow the control period to be adjusted during simulations. The effectiveness of the technique is demonstrated through case studies which simulate a networked control scenario in which the communication and control system properties are defined explicitly.
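A generic illustration of the kind of synchronisation loop the abstract describes: a continuous-time plant is advanced by numerical integration between the timestamps of discrete network events, so both models agree on simulated time. This is not the authors' NS-2 extension; the plant model, step size and event list are all invented for the example.

```python
def plant_step(x: float, u: float, dt: float) -> float:
    """One Euler step of a first-order plant dx/dt = -x + u."""
    return x + dt * (-x + u)

def run(network_events: list, dt: float = 0.001) -> float:
    """network_events: time-ordered (arrival_time, control_value) pairs."""
    t, x, u = 0.0, 1.0, 0.0
    for event_time, new_u in network_events:
        # Integrate the continuous plant up to the next discrete event...
        while t + dt <= event_time:
            x = plant_step(x, u, dt)
            t += dt
        # ...then apply the control update delivered by the network.
        u = new_u
    return x

# Control packets arriving over the simulated network (time in s, actuation):
print(run([(0.05, 0.5), (0.10, 0.8), (0.15, 0.6)]))
```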
Abstract:
Generative music systems can be performed by manipulating the values of their algorithmic parameters, and their semi-autonomous nature provides an opportunity for coordinated interaction amongst a network of systems, a practice we call Network Jamming. This paper outlines the characteristics of this networked performance practice and discusses the types of mediated musical relationships and ensemble configurations that can arise. We have developed and tested the jam2jam network jamming software over recent years. We describe this system, draw from our experiences with it, and use it to illustrate some characteristics of Network Jamming.