935 results for Internet crime, Tor, Freenet, instant messenger, image hosting, Web 2.0, Internet Service Provider (ISP) Filtering, primary research, covert monitoring


Relevance:

100.00%

Publisher:

Abstract:

In November 2010, tension between Internet infrastructure companies boiled over in a dispute between content distribution network (CDN) Level 3 and Internet service provider (ISP) Comcast. Level 3, a distribution partner of Netflix, accused Comcast of violating the principles of net neutrality when the ISP increased distribution fees for carrying high bandwidth services. Comcast justified its actions by stating that the price increase was standard practice and argued Level 3 was trying to avoid paying its fair share. The dispute exemplifies the growing concern over the rising costs of streaming media services. The companies facing these inflated infrastructure costs are CDNs (Level 3, Equinix, Limelight, Akamai, and Voxel), companies that host streaming media content on server farms and distribute the content to a variety of carriers, and ISPs (Comcast, Time Warner, Cox, and AT&T), the cable and phone companies that provide “last mile” service to paying customers. Both CDNs and ISPs are lobbying government regulators to keep their costs at a minimum. The outcome of these disputes will influence the cost, quality, and legal status of streaming media.

Relevance:

100.00%

Publisher:

Abstract:

The makers of Dallas Buyers Club have been dealt a blow in their attempt to extract payment from people alleged to have downloaded illegal copies of the movie. Voltage Pictures, which owns Dallas Buyers Club, has been trying to identify over 4,700 iiNet subscribers who it alleges downloaded illicit copies of the movie. Earlier this year, the Federal Court agreed that iiNet should hand over subscriber details, but warned that any letter sent to account holders must first be approved by the court to protect consumers from abuse of the legal system. In a win for consumer protection, the Federal Court has now rejected Voltage’s draft letters, criticising Voltage’s attempts to avoid explaining what fee it would demand.

Relevance:

100.00%

Publisher:

Abstract:

In many cases, a mobile user has the option of connecting to one of several IEEE 802.11 access points (APs), each using an independent channel. User throughput at each AP is determined by the number of other users as well as the frame size and physical rate being used. We consider the scenario where users could multihome, i.e., split their traffic amongst all the available APs, based on the throughput they obtain and the price charged. Thus, they are involved in a non-cooperative game with each other. We convert the problem into a fluid model and show that under a pricing scheme, which we call the cost price mechanism, the total system throughput is maximized, i.e., the system suffers no loss of efficiency due to selfish dynamics. We also study the case where the Internet Service Provider (ISP) could charge prices greater than those of the cost price mechanism. We show that even in this case multihoming outperforms unihoming, both in terms of throughput and of profit to the ISP.
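
As a rough illustration of the multihoming idea (not the paper's fluid model or its cost price mechanism), the toy Python sketch below lets each user repeatedly nudge its traffic split toward the AP that currently offers the best net value per megabit; the capacities, prices, demands, and congestion model are made-up assumptions.

# Illustrative only: a toy best-response loop for users splitting ("multihoming")
# their traffic across several 802.11 access points. The congestion model,
# capacities, prices, and demands below are assumptions for the example, not
# the paper's fluid model.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_aps = 5, 3
cap = np.array([54.0, 11.0, 24.0])        # hypothetical AP capacities (Mbps)
price = np.array([0.5, 0.1, 0.3])         # hypothetical prices per delivered Mbps
demand = np.full(n_users, 20.0)           # each user offers 20 Mbps of traffic
split = rng.dirichlet(np.ones(n_aps), n_users)   # fraction of demand sent to each AP

def per_unit_value(split):
    """Net value of one extra Mbps offered to each AP (toy congestion model)."""
    load = (demand[:, None] * split).sum(axis=0)
    delivered = np.minimum(1.0, cap / np.maximum(load, 1e-9))
    return delivered * (1.0 - price)      # assumes a value of 1 per delivered Mbps

for _ in range(300):                      # users repeatedly nudge their split
    for u in range(n_users):              # toward the currently best AP
        step = np.zeros(n_aps)
        step[np.argmax(per_unit_value(split))] = 1.0
        split[u] = 0.95 * split[u] + 0.05 * step

print("final splits (rows = users, columns = APs):")
print(np.round(split, 2))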

Relevance:

100.00%

Publisher:

Abstract:

This paper studies the impact of exclusive contracts between a content provider (CP) and an Internet Service Provider (ISP) in a non-neutral network. We consider a simple linear demand function for the CPs. We study when an exclusive contract is beneficial to the colluding pair and evaluate its impact on the non-colluding players at equilibrium. For the case of two CPs and one ISP we show that collusion may not always be beneficial. We derive an explicit condition in terms of the advertisement revenues of the CPs that indicates when collusion is profitable to the colluding entities.
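
As a rough sketch of the kind of setup the abstract describes (the notation below is assumed for illustration and is not taken from the paper), a linear demand for each CP and the resulting revenues might be written as:

% Illustrative linear-demand setup; the notation is assumed, not the paper's.
\begin{align*}
  d_i &= d_0 - \alpha\, p_i + \beta\, p_j, && i \neq j,\; i, j \in \{1, 2\},\\
  \pi_{\mathrm{CP},i} &= (p_i + a_i - q_i)\, d_i, &&
  \pi_{\mathrm{ISP}} = \sum_{i} q_i\, d_i,
\end{align*}

where $d_i$ is the demand faced by CP $i$, $p_i$ its price to users, $a_i$ its advertisement revenue per unit of demand, and $q_i$ the per-unit fee paid to the ISP. Under a model of this shape, whether an exclusive CP-ISP pair gains from colluding reduces to a condition on $a_1$ and $a_2$, which is the form of result the abstract reports.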

Relevance:

100.00%

Publisher:

Abstract:

We propose Trade & Cap (T&C), an economics-inspired mechanism that incentivizes users to voluntarily coordinate their consumption of the bandwidth of a shared resource (e.g., a DSLAM link) so as to converge on what they perceive to be an equitable allocation, while ensuring efficient resource utilization. Under T&C, rather than acting as an arbiter, an Internet Service Provider (ISP) acts as an enforcer of what the community of rational users sharing the resource decides is a fair allocation of that resource. Our T&C mechanism proceeds in two phases. In the first, software agents acting on behalf of users engage in a strategic trading game in which each user agent selfishly chooses bandwidth slots to reserve in support of primary, interactive network usage activities. In the second phase, each user is allowed to acquire additional bandwidth slots in support of a presumed open-ended need for fluid bandwidth, catering to secondary applications. The acquisition of this fluid bandwidth is subject to the remaining "buying power" of each user and to prevailing "market prices" – both of which are determined by the results of the trading phase and a desirable aggregate cap on link utilization. We present analytical results that establish the underpinnings of our T&C mechanism, including game-theoretic results pertaining to the trading phase, and pricing of fluid bandwidth allocation pertaining to the capping phase. Using real network traces, we present extensive experimental results that demonstrate the benefits of our scheme, which we also show to be practical by highlighting the salient features of an efficient implementation architecture.
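
A minimal sketch of the two-phase flow described above, in Python, with made-up numbers and a greatly simplified "market"; the strategic trading game and the pricing analysis of the actual T&C mechanism are not reproduced here.

# Illustrative only: a toy two-phase allocation in the spirit of Trade & Cap,
# with invented numbers; not the mechanism's actual trading game or pricing.
link_capacity = 100.0          # shared resource (e.g. a DSLAM link), in Mbps
aggregate_cap = 0.8            # desired cap on overall link utilisation
budget = {"alice": 10.0, "bob": 10.0, "carol": 10.0}    # equal "buying power"

# Phase 1 (stand-in for the trading game): each user agent reserves slots for
# interactive traffic and pays a toy unit price out of its budget.
reservations = {"alice": 20.0, "bob": 5.0, "carol": 10.0}   # reserved Mbps
slot_price = 0.2
remaining = {u: budget[u] - slot_price * r for u, r in reservations.items()}

# Phase 2: the leftover "fluid" bandwidth under the cap is priced so that the
# users' remaining buying power exactly clears it, and each user buys its share.
fluid_pool = aggregate_cap * link_capacity - sum(reservations.values())
market_price = sum(remaining.values()) / fluid_pool
fluid = {u: remaining[u] / market_price for u in remaining}

for u in budget:
    print(f"{u}: reserved {reservations[u]:.1f} Mbps, fluid {fluid[u]:.1f} Mbps")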

Relevance:

100.00%

Publisher:

Abstract:

Data identification is a key task for any Internet Service Provider (ISP) or network administrator. As port fluctuation and encryption become more common in P2P traffic seeking to avoid identification, new strategies must be developed to detect and classify such flows. This paper introduces a new method of separating P2P and standard web traffic that can be applied as part of a data mining process, based on the activity of the hosts on the network. Unlike other research, our method is aimed at classifying individual flows rather than just identifying P2P hosts or ports. Heuristics are analysed and a classification system proposed. The accuracy of the system is then tested using real network traffic from a core Internet router, showing over 99% accuracy in some cases. We expand on this proposed strategy to investigate its application to real-time, early classification problems. New proposals are made and the results of real-time experiments are compared to those obtained in the data mining research. To the best of our knowledge this is the first research to use host-based flow identification to determine a flow's application within the early stages of the connection.
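
To make the host-activity idea concrete, here is a toy Python sketch of the general approach: per-host behaviour (how many distinct peers a host contacts, whether it uses the same port over both TCP and UDP) is used to label individual flows. The features, thresholds, and addresses are illustrative assumptions, not the heuristics proposed in the paper.

# Illustrative only: a toy host-activity heuristic for flagging flows as P2P-like.
# The features, thresholds, and addresses are assumptions for the example.
from collections import defaultdict

flows = [
    # (src_host, dst_host, dst_port, transport)
    ("10.0.0.1", "93.184.216.34", 80,    "tcp"),
    ("10.0.0.2", "198.51.100.7",  51413, "tcp"),
    ("10.0.0.2", "198.51.100.9",  51413, "udp"),
    ("10.0.0.2", "203.0.113.4",   51413, "tcp"),
]

peers = defaultdict(set)     # distinct remote hosts contacted by each source host
protos = defaultdict(set)    # transports seen per (source host, destination port)
for src, dst, dport, proto in flows:
    peers[src].add(dst)
    protos[(src, dport)].add(proto)

def looks_p2p(src, dport):
    """Label a flow P2P-like if its source talks to many peers, or uses the
    same port over both TCP and UDP (a behaviour typical of P2P clients)."""
    return len(peers[src]) >= 3 or len(protos[(src, dport)]) == 2

for src, dst, dport, proto in flows:
    label = "p2p?" if looks_p2p(src, dport) else "web"
    print(f"{src} -> {dst}:{dport}/{proto}  {label}")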

Relevance:

100.00%

Publisher:

Abstract:

The use of Web 2.0 tools in education, specifically in the university context, has grown widely, driven by benefits in the pedagogical, scientific and even university administration areas. These tools are characterised by being free to use and easy to handle, by their availability across a variety of devices and platforms, and by not requiring (most of them) high bandwidth, a decisive factor for audiences in developing countries such as Mozambique. The Universidade Eduardo Mondlane (UEM) is undergoing a process of widespread adoption of Information and Communication Technologies (ICT), among which Web 2.0 tools stand out. This document describes applied research comprising the development and implementation of strategies for the introduction and dissemination of these tools in support of the pedagogical, scientific and university management areas. The challenges and opportunities arising from the particular constraints of this kind of initiative at a higher education institution in a country like Mozambique, in terms of technological infrastructure and digital literacy, are identified. The results achieved point to a very positive path, with several initiatives using the implemented tools active in the field.

Relevance:

100.00%

Publisher:

Abstract:

Scenario analysis was used to examine empirically the relationships of guarantee type and service experience with consumer satisfaction, for the service of an Internet Service Provider (ISP). The scenarios involved hypothetical situations in which several factors were varied: the existence of a problem; the invocation of a guarantee; the identity of the invoker; and the manner of resolution of any problem. Alternative service guarantees were associated with each hypothetical experience: a specific guarantee, and an unconditional guarantee. Overall, consumer satisfaction related much more strongly to the nature of the service experience than to the difference in guarantee type.

Relevance:

100.00%

Publisher:

Abstract:

The consensus among researchers is that loyalty is a very complex construct (Javalgi & Moberg 1997). Various typologies have been developed to measure the loyalty construct (e.g., Curassi and Kennedy 2002; Hoare 2000; Knox 1998; Zeithaml, Parasuraman & Berry 1996). Zeithaml, Berry & Parasuraman (1996) developed a service loyalty framework comprising 13 items across five dimensions: "loyalty", "switch", "pay more", "external responses", and "internal responses". This framework was criticised by Bloemer, de Ruyter & Wetzels (1999) for having conceptual and empirical limitations. Upon re-examination of the same 13 items, they concluded that the loyalty construct comprised only four factors: "word-of-mouth", "purchase intentions", "price sensitivity", and "complaining behaviour". Questions remain as to the precise dimensionality of the service loyalty construct as proposed by Zeithaml, Parasuraman & Berry (1996), and as to its generic stability or robustness, i.e., the extent to which there is an invariant factor structure across the range of marketing contexts to which the battery may be applied. This paper reports on the testing of the goodness-of-fit of the five- and four-factor models to data collected in a study of consumer reaction to the service supplied by an Australian Internet Service Provider (ISP), through a series of hypothetical scenarios. In addition, comparisons were conducted with the results of exploratory factor analyses of the eight scenarios. The results suggested that factor structures are unstable across the data subsets, thereby limiting the generalisability and utility of the proposed models.

Relevance:

100.00%

Publisher:

Abstract:

In this document the reader will find a suitable network design and solution for the Ypres Rally Championship, meeting all the requirements set by the organisation of the rally. These requirements posed several problems with respect to networking standards, because the area where the boxes are located is quite large; nevertheless, the technologies used to solve those problems are detailed in the project. Different designs have been included in the project, each of them based on distinct criteria such as efficiency, performance and, most importantly, the budget, since the organisation of the rally is non-profit. Nevertheless, the use of long-lasting devices, such as Cisco equipment, was not dismissed despite their price. Furthermore, the configuration of the routing/switching devices is explained for those who will be tasked with implementing this solution. The solution is designed to supply Internet access as well as video streaming to all boxes, so that teams can follow the championship live. The maximum Internet Service Provider (ISP) connection is 160 Mbps, and this bandwidth has to be distributed dynamically among the boxes. Finally, to ensure that the network performs as intended it has to be monitored; this is achieved with network analysis tools, for which Wireshark was chosen in this project.
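
To illustrate what "distributing the 160 Mbps dynamically among the boxes" can mean in practice, here is a toy max-min fair-share calculation in Python; the box names and demands are invented, and a real deployment would enforce such shares in the router (e.g. per-queue shaping) rather than in application code.

# Illustrative only: a toy max-min ("progressive filling") split of the assumed
# 160 Mbps uplink among the team boxes.
UPLINK_MBPS = 160.0

def fair_share(demands, capacity=UPLINK_MBPS):
    """Boxes asking for less than the equal share keep their demand; the
    leftover capacity is re-split among the boxes that still want more."""
    share, remaining, cap = {}, dict(demands), capacity
    while remaining:
        equal = cap / len(remaining)
        satisfied = {b: d for b, d in remaining.items() if d <= equal}
        if not satisfied:                      # everyone left wants more than
            share.update({b: equal for b in remaining})   # the equal share
            return share
        for b, d in satisfied.items():
            share[b] = d
            cap -= d
            del remaining[b]
    return share

print(fair_share({"box1": 10, "box2": 80, "box3": 120, "box4": 5}))
# -> {'box1': 10, 'box4': 5, 'box2': 72.5, 'box3': 72.5}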

Relevance:

100.00%

Publisher:

Abstract:

The Internet has become a universal communication network tool. It has evolved from a platform that supports best-effort traffic to one that now carries different traffic types including those involving continuous media with quality of service (QoS) requirements. As more services are delivered over the Internet, we face increasing risk to their availability given that malicious attacks on those Internet services continue to increase. Several networks have witnessed denial of service (DoS) and distributed denial of service (DDoS) attacks over the past few years, which have disrupted the QoS of network services, thereby violating the Service Level Agreement (SLA) between the client and the Internet Service Provider (ISP). Hence, DoS and DDoS attacks are major threats to network QoS. In this paper we survey techniques and solutions that have been deployed to thwart DoS and DDoS attacks and we evaluate them in terms of their impact on network QoS for Internet services. We also present vulnerabilities in QoS protocols that, if exploited, can affect QoS. In addition, we highlight challenges that still need to be addressed to achieve end-to-end QoS with recently proposed DoS/DDoS solutions. © 2010 John Wiley & Sons, Ltd.

Relevance:

100.00%

Publisher:

Abstract:

Dissertation presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the degree of Master in Audiovisual and Multimedia.

Relevance:

100.00%

Publisher:

Abstract:

The article focuses on the current situation of Spanish case law on ISP liability. It starts by presenting the more salient peculiarities of the Spanish transposition of the safe harbours laid down in the E-Commerce Directive. These peculiarities relate to the knowledge requirement of the hosting safe harbour, and to the safe harbour for information location tools. The article then provides an overview of the cases decided so far with regard to each of the safe harbours. Very few cases have dealt with the mere conduit and the caching safe harbours, though the latter was discussed in an interesting case involving Google's cache. Most cases relate to the hosting and linking safe harbours. With regard to hosting, the article focuses particularly on the two judgments handed down by the Supreme Court that adopt an open interpretation of actual knowledge, an issue on which courts had until then been split. Cases involving the linking safe harbour have mainly dealt with websites offering P2P download links. Accordingly, the article explores the legal actions brought against these sites, which for the moment have been unsuccessful. The new legislative initiative to fight against digital piracy – the Sustainable Economy Bill – is also analysed. After the conclusion, the article provides an Annex listing the cases that have dealt with ISP liability in Spain since the safe harbours scheme was transposed into Spanish law.

Relevance:

100.00%

Publisher:

Abstract:

Web 1.0 referred to the early, read-only internet; Web 2.0 refers to the 'read-write web' in which users actively contribute to as well as consume online content; Web 3.0 is now being used to refer to the convergence of mobile and Web 2.0 technologies and applications. One of the most important developments in Web 3.0 is geography: with many mobile phones now equipped with GPS, mobiles promise to "bring the internet down to earth" through geographically aware, or locative, media. The internet was earlier heralded as "the death of geography", with predictions that, with anyone able to access information from anywhere, geography would no longer matter. But mobiles are disproving this. GPS allows the location of the user to be pinpointed, and the mobile internet allows the user to access locally relevant information, or to upload content which is geotagged to the specific location. It also allows locally specific content to be sent to the user when the user enters a specific space. Location-based services are one of the fastest-growing segments of the mobile internet market: the 2008 AIMIA report indicates that user access to local maps increased by 347% over the previous 12 months, and restaurant guides/reviews increased by 174%. The central tenet of cultural geography is that places are culturally constructed, comprising the physical space itself, culturally inflected perceptions of that space, and people's experiences of the space (LeFebvre 1991). This paper takes a cultural geographical approach to locative media, anatomising the various spaces which have emerged through locative media, or "the geoweb" (Lake 2004). The geoweb is such a new concept that, to date, critical discourse has treated it as a somewhat homogeneous spatial formation. In order to counter this, and in order to demonstrate the dynamic complexity of the emerging spaces of the geoweb, the paper provides a topography of different types of locative media space: including the personal/aesthetic, in which individual users geotag specific physical sites with their own content and meanings; the commercial, like the billboards which speak to individuals as they pass in Minority Report; and the social, in which one's location is defined by the proximity of friends rather than by geography.
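
As a concrete illustration of the locative-media mechanics the abstract describes (delivering geotagged content when a user enters a specific space), here is a minimal Python sketch of a proximity check; the coordinates, notes, and radii are invented for the example.

# Illustrative only: a minimal geofence check of the kind behind location-based
# services, i.e. returning geotagged content near the user's current position.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

geotagged = [
    {"note": "street-art photo", "lat": -27.4698, "lon": 153.0251, "radius_m": 200},
    {"note": "cafe review",      "lat": -27.4750, "lon": 153.0300, "radius_m": 100},
]

def nearby_content(user_lat, user_lon):
    """Return the notes whose geofence contains the user's current position."""
    return [g["note"] for g in geotagged
            if haversine_m(user_lat, user_lon, g["lat"], g["lon"]) <= g["radius_m"]]

print(nearby_content(-27.4701, 153.0255))   # -> ['street-art photo']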