792 results for Peer-to-Peer networks
Abstract:
In a peer-to-peer network, nodes interact with each other by sharing resources, services and information. Many applications have been developed on top of such networks, one class of which is peer-to-peer databases. Peer-to-peer database systems allow the sharing of unstructured data and can integrate data from several sources without large investments, because they reuse existing repositories. However, the high flexibility and dynamicity of the network, as well as the absence of centralized management of information, make it complex to locate information among the various participants in the network. In this context, this paper presents original contributions through a proposed architecture for a routing system that uses the Ant Colony algorithm to optimize the search for desired information, supported by ontologies that add semantics to shared data. This enables integration among heterogeneous databases while seeking to reduce message traffic on the network without losses in the number of responses, confirmed by an improvement of 22.5% in that number. © 2011 IEEE.
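The abstract gives no implementation details, but the routing idea can be pictured with a minimal ant-colony-style sketch: each peer keeps a pheromone score per (neighbor, topic) pair, forwards queries preferentially along strong trails, and reinforces trails that returned answers. All names and parameters below (Peer, evaporation rate, and so on) are hypothetical illustrations, not the paper's actual design.

```python
import random
from collections import defaultdict

class Peer:
    """Minimal ant-colony-style query router (illustrative sketch only)."""

    def __init__(self, name, neighbors, evaporation=0.1, reinforcement=1.0):
        self.name = name
        self.neighbors = neighbors          # list of neighbor Peer objects
        self.evaporation = evaporation      # pheromone decay per round
        self.reinforcement = reinforcement  # deposit for a successful path
        # pheromone[(neighbor_name, topic)] -> trail strength
        self.pheromone = defaultdict(lambda: 1.0)

    def choose_neighbor(self, topic):
        """Pick a neighbor with probability proportional to pheromone."""
        weights = [self.pheromone[(n.name, topic)] for n in self.neighbors]
        return random.choices(self.neighbors, weights=weights, k=1)[0]

    def evaporate(self):
        """Decay all trails so stale routes are gradually forgotten."""
        for key in self.pheromone:
            self.pheromone[key] *= (1.0 - self.evaporation)

    def reward(self, neighbor_name, topic):
        """Reinforce the trail that led to an answer."""
        self.pheromone[(neighbor_name, topic)] += self.reinforcement
```

A query would be forwarded hop by hop via choose_neighbor, with reward called along the reverse path when results come back, mirroring how ants reinforce successful trails.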
Abstract:
The development of new technologies that use peer-to-peer networks grows every day, aiming to meet the need to share information, resources and database services around the world. Among them are peer-to-peer databases, which take advantage of peer-to-peer networks to manage distributed knowledge bases, allowing the sharing of information that is semantically related but syntactically heterogeneous. However, given the structural characteristics of these networks, it is a challenge to ensure efficient search for information without compromising the autonomy of each node and the flexibility of the network. On the other hand, some studies propose using ontology semantics to assign a standardized categorization to information. The main original contribution of this work is to approach this problem with a proposal for query optimization supported by the Ant Colony algorithm and classification through ontologies. The results show that this strategy provides semantic support for searches in peer-to-peer databases, expanding the results without compromising network performance. © 2011 IEEE.
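Complementing the routing sketch above, the ontology side can be pictured as matching a query's concept against a concept hierarchy, so that peers advertising a subclass of the queried concept are also reached. The toy hierarchy and function names here are assumptions for illustration, not the paper's ontology.

```python
# Toy ontology as a parent -> children concept hierarchy (assumed example).
ONTOLOGY = {
    "Publication": ["Article", "Thesis"],
    "Article": ["JournalArticle", "ConferencePaper"],
}

def subsumed_concepts(concept, ontology=ONTOLOGY):
    """Return the concept plus all of its descendants in the hierarchy."""
    result = {concept}
    for child in ontology.get(concept, []):
        result |= subsumed_concepts(child, ontology)
    return result

def matching_peers(query_concept, peer_catalog):
    """Select peers whose advertised concepts fall under the queried one."""
    wanted = subsumed_concepts(query_concept)
    return [peer for peer, concepts in peer_catalog.items()
            if wanted & set(concepts)]

# A query for "Article" also reaches a peer that only advertises
# "ConferencePaper", since the ontology relates the two concepts.
catalog = {"peerA": ["ConferencePaper"], "peerB": ["Thesis"]}
print(matching_peers("Article", catalog))   # -> ['peerA']
```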
SW-V: a software streaming model based on virtualization techniques and peer-to-peer transport
Abstract:
Graduate program in Computer Science - IBILCE
Abstract:
Digital evidence requires the same precautions as any other scientific examination. An overview is given of the methodological and practical aspects of digital forensics in light of the recent ISO/IEC 27037:2012 standard on the handling of digital exhibits during the identification, collection, acquisition and preservation of digital data. These methodologies strictly comply with the integrity and authenticity requirements imposed by the rules on digital forensics, in particular Italian Law 48/2008 ratifying the Budapest Convention on Cybercrime. With regard to the offence of child pornography, a review of EU and national legislation is offered, with emphasis on the aspects relevant to forensic analysis. Since file sharing over peer-to-peer networks is the channel on which the exchange of illicit material is most concentrated, an overview of the most widespread protocols and systems is provided, with emphasis on the eDonkey network and the eMule software, which are widely used among Italian users. The problems encountered in investigating and repressing the phenomenon, which fall to the police forces, are touched upon, before turning to the main contribution on the forensic analysis of computer systems seized from persons under investigation (or indicted) for child pornography offences: the design and implementation of eMuleForensic, which makes it possible to analyse, with great precision and speed, the events that occur when the eMule file-sharing software is used; the software is available both online at http://www.emuleforensic.com and as a tool within the DEFT forensic distribution. Finally, an operational protocol is proposed for the forensic analysis of computer systems involved in child pornography investigations.
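Files on the eDonkey network mentioned above are identified by their ed2k hash: MD4 over 9,728,000-byte chunks, then MD4 over the concatenated chunk digests. The sketch below illustrates that scheme only; it is not drawn from eMuleForensic, and it assumes hashlib's "md4" is available (on OpenSSL 3.x it may require the legacy provider).

```python
import hashlib

CHUNK = 9_728_000  # ed2k chunk size in bytes

def ed2k_hash(path):
    """Compute an ed2k-style file hash (illustrative sketch).

    Note: hashlib's 'md4' depends on the local OpenSSL build and may be
    unavailable on OpenSSL 3.x without the legacy provider. Implementations
    also differ on files whose size is an exact multiple of CHUNK.
    """
    digests = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            digests.append(hashlib.new("md4", chunk).digest())
    if len(digests) == 1:
        return digests[0].hex()
    # Multi-chunk files: hash of the concatenated per-chunk digests.
    return hashlib.new("md4", b"".join(digests)).digest().hex()
```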
Abstract:
The eradication of BVD in the UK is technically possible but appears to be socially untenable. The following study explored farmer attitudes to BVD control schemes in relation to advice networks and information sharing, shared aims and goals, motivation and benefits of membership, notions of BVD as a priority disease and attitudes toward regulation. Two concepts from the organisational management literature framed the study: citizenship behaviour, where actions of individuals support the collective good (but are not explicitly recognised as such), and peer-to-peer monitoring (where individuals evaluate others' behaviour). Farmers from two BVD control schemes in the UK participated in the study: the Orkney Livestock Association BVD Eradication Scheme and the Norfolk and Suffolk Cattle Breeders Association BVD Eradication Scheme. In total 162 farmers participated in the research (109 in-scheme and 53 out of scheme). The findings revealed that group helping and information sharing among scheme members was low, with a positive BVD status subject to social censure. Peer monitoring in the form of gossip with regard to the animal health status of other farms was high. Interestingly, farmers across both schemes supported greater regulation with regard to animal health, largely due to mistrust of fellow farmers following voluntary disease control measures. While group cohesiveness varied across the two schemes, without continued financial inducements, longer-term sustainability is questionable.
Abstract:
1. Introduction

"The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public" (Finnish Copyright Act, section 49). These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases, the EDD, into Finnish copyright legislation in 1998. Now, in the year 2005, after more than half a decade of domestic implementation, the proper meaning and construction of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection remain uncertain, both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular, and currently peculiarly European, new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD and, second, to realise the potential and risks inherent in the new legislation in its economic, cultural and societal dimensions.

2. Subject-matter of the study: basic issues

The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches, either qualitatively or quantitatively, the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form, or even free where public lending libraries provide access to the information online. This also makes it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed.

The fear of illegal copying can lead to stark technical protection that in turn can dampen the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, thus weakening access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy.

3. Particular issues in the digital economy and information networks

All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the mobile Internet, peer-to-peer networks, and local and wide local area networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables, previously protected in part by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, and importantly, numerous databases are collected in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded in them constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not instantly obvious example of this is a database consisting of the physical coordinates of a selected group of customers, used for marketing through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: has the collection and securing of the validity of this information required an essential input? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only when the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public of digital materials seem to fit ill, or lead to interpretations that are at variance with the analogous domain as regards lawful and illegal uses of information. This may well interfere with, or rework, the way in which commercial and other operators establish themselves and function in the existing value networks of information products and services.

4. International sphere

After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically.

Consequently, guidelines on the correct interpretation of the Directive importing practical, business-oriented solutions may well have application at the European level. This underlines the exigency of a thorough analysis of the implications of the meaning and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union stance, directly having a negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not yet have a Sui Generis database regime or its kin, while both political and academic discourse on the matter abounds.

5. The objectives of the study

The background described above, with its several open issues, calls for a detailed study of the following questions:
- What is a database-at-law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation?
- How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context?
- What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications?
- The difficult question of the relation between database protection and the protection of factual information as such.

6. Disposition

The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political and rational background and the subsequent legislative evolution of European database protection, reflected against the international backdrop on the issue. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and Sui Generis right facets in detail, together with the emergent application of the machinery in real-life societal and, particularly, commercial contexts. Furthermore, a general outline of copyright, relevant in the context of copyright databases, is provided. For purposes of further comparison, a chapter on the precursor of the Sui Generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinise its implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue of the IPR protection of information per se, a new tenet in the domain of copyright and related rights.
Abstract:
Gossip protocols have been analyzed as a feasible solution for data dissemination on peer-to-peer networks. In this thesis, a new data dissemination protocol is proposed and compared with other known gossip mechanisms. Performance evaluation is based on simulation.
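The thesis's own protocol is not described here, but the push-gossip mechanism such protocols build on can be sketched briefly: in each round, every node that holds the update forwards it to a few neighbors chosen at random, so the update spreads epidemically. The fanout parameter and the toy network below are illustrative assumptions, not the thesis's design.

```python
import random

def push_gossip(adjacency, seed, fanout=2, rounds=10, rng=random.Random(42)):
    """Simulate rounds of push gossip; return the set of informed nodes.

    adjacency: dict mapping node -> list of neighbor nodes.
    Each informed node pushes the update to `fanout` random neighbors
    per round, the epidemic pattern gossip dissemination relies on.
    """
    informed = {seed}
    for _ in range(rounds):
        for node in list(informed):
            neighbors = adjacency[node]
            for peer in rng.sample(neighbors, min(fanout, len(neighbors))):
                informed.add(peer)
    return informed

# Tiny ring-with-chords example network (assumed for illustration).
net = {i: [(i - 1) % 8, (i + 1) % 8, (i + 3) % 8] for i in range(8)}
print(sorted(push_gossip(net, seed=0)))
```

Simulations of this kind are the usual basis for comparing gossip variants, since dissemination latency and message overhead emerge directly from the round-by-round spread.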
Abstract:
With the advent of peer-to-peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly in dynamically varying conditions. Yet, existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To simplify the complexity of the validation process, the developed solution maps the requirements of the application onto a geometrical space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system by ascertaining that segregating data adaptation and prediction processes will augment the data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast convergent adaptation process is deployed, data reduction rates are significantly improved. Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
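The general suppress-when-predictable idea behind prediction-based data reduction can be shown with a minimal dead-band sketch: the sink reconstructs unsent samples as the last received value, so the sensor transmits only when a reading deviates beyond a threshold. The threshold and data below are illustrative assumptions; the dissertation's actual adaptive predictors are more elaborate.

```python
def deadband_reduction(readings, threshold=0.5):
    """Send a sample only when it deviates from the last sent value.

    The sink reconstructs suppressed samples as the last received value,
    which is how prediction-based schemes cut radio traffic in
    wireless sensor networks.
    """
    sent = [(0, readings[0])]             # (index, value); first always sent
    last = readings[0]
    for i, x in enumerate(readings[1:], start=1):
        if abs(x - last) > threshold:     # prediction (hold last) failed
            sent.append((i, x))
            last = x
    return sent

samples = [20.0, 20.1, 20.3, 20.2, 25.0, 25.1, 25.2]
print(deadband_reduction(samples))  # -> [(0, 20.0), (4, 25.0)]
```

A faster-converging adaptive predictor would suppress even more samples under drifting conditions, which matches the dissertation's observation that fast-convergent adaptation improves reduction rates.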
Abstract:
This study examines the factors facilitating the transfer admission of students broadly classified as Black from a single community college into a selective engineering college. The work aims to further research on STEM preparation and performance for students of color, as well as scholarship on increasing access to four-year institutions from two-year schools. Factors illuminating Underrepresented Racial and Ethnic Minority (URM) student pathways through Science, Technology, Engineering, and Mathematics (STEM) degree programs have often been examined through large-scale quantitative studies. However, this qualitative study complements quantitative data through demographic questionnaires, as well as semi-structured individual and group interviews. The backgrounds and voices of diverse Black transfer students in four-year engineering degree programs were captured through these methods. Major findings from this research include evidence that community college faculty, peer networks, and family members facilitated transfer. Other results distinguish Black African from Black American transfers; included in these distinctions are depictions of different K-12 schooling experiences and differences in how participants self-identified. The findings that result from this research build upon the few studies that account for expanded dimensions of student diversity within the Black population. Among other demographic data, participants' countries of birth and years of migration to the U.S. (if applicable) are included. Interviews reveal participants' perceptions of factors impacting their educational trajectories in STEM and subsequent ability to transfer into a competitive undergraduate engineering program. This study is inclusive of, and reveals, an important shifting demographic within the United States of America: Black Africans, who represent one of the fastest-growing segments of the immigrant population.
Abstract:
Semistructured interviews were conducted with 40 adolescents who reported inhaling volatile solvents. Their average age was 14.2 years, and they used a range of substances. All were aware of the short-term health risks involved in volatile solvent use, and most reported experiencing ill effects, such as headaches and vomiting. Users were found to be organized into groups and peer networks, which often were involved in theft, prostitution, and other risk-taking behaviors. More chronic users had higher status within the group. Suggestions pertaining to intervention were obtained, and these are discussed in light of the findings.
Abstract:
This study brings out our research intention: to understand the life history of a subject with Klinefelter Syndrome, Ramon. Klinefelter Syndrome is very peculiar and has been gaining prominence in the medical field, not because of its prevalence in the population, but because of its complexity. In the field of education, scholarship on this syndrome is incipient; we found only one study, in Portugal. In our country, therefore, this research stands out as unprecedented. In this study we think, together with Lev Semionovich Vigotski and other authors whose texts carry socio-historical roots, of a Ramon beyond the biological, that is, beyond his organic limits: a subject rich in subjectivity, which was valued. We aim, therefore, to understand the subjectivity of this subject, who is singular within the collectivity; thus, to think of Ramon is to think of others, of his peers and of the intrinsic networks of dialogism woven into the immense fabric that is life. We also seek to observe how the inclusion of this subject took place in the school environment. In order to achieve these objectives, we adopt the historical-cultural perspective of human development, linking the assumptions of this theoretical option to the life history methodology. To understand the details, the clues, the minutiae, the traces of the subject's life history that seem insignificant but are indispensable for understanding some larger processes, we rely on Ginzburg's evidential paradigm. In this research, our meetings with this 22-year-old young man are described and analysed, taking subjective aspects into account. In these meetings, we heard several narratives to compose the life history of the subject under study: that of Ramon himself, of his mother Marlene, and of his teachers at APAE and in Youth and Adult Education; we therefore heard Ramon in different contexts: at home, at APAE, at the regular school (EJA) and at the home of the APAE teacher. To collect the data, we used semi-structured biographical interviews, adapted to the singularity of each subject heard. All interviews were recorded and transcribed in full, and the data obtained were analysed taking into account Ramon's historical and social context; thus, we observed the dialogical relations Ramon established with his peers and tried to understand them in order to better grasp the subjective construction of this subject who, beyond being biological, is social and cultural, who learns, apprehends and, beyond that... teaches us a great deal!
Abstract:
The Internet as we know it was designed on top of the TCP/IP protocol stack, developed in the 1960s and 1970s under a paradigm centred on the individual address of each machine (known as host-centric). This paradigm was extremely successful in interconnecting machines through forwarding based on the IP address. Recent studies show that a significant share of today's Internet traffic centres on the transfer of content, rather than on the traditional network applications for which it was originally conceived. New communication models have therefore emerged, among them network protocols in which every machine on the network can distribute content (so-called peer-to-peer networks), to improve the distribution and exchange of content on the Internet. Consequently, in recent years the host-centric paradigm has begun to be called into question and a new approach has appeared: Information-Centric Networking (ICN). Given that the Internet today is basically a network for transferring content and information, why not centre its evolution in that direction, instead of on host-to-host communication? The Content-Centric Networking (CCN) paradigm simplifies the solution of certain security problems related to the TCP/IP architecture and is one of the main proposals of the new information-centric networking approach. One of the main problems of the TCP/IP model is content protection. Today, to guarantee the authenticity and integrity of data shared on the network, it is necessary to secure the repository and the path the data must travel to its final destination. However, the continued ineffectiveness against denial-of-service attacks on the Internet suggests that the network infrastructure itself should provide mechanisms to mitigate them. One of the main pillars of the CCN communication paradigm is to focus on the content itself rather than on its physical location. Since its appearance in 2009, and as a consequence of evolution and adaptation, its designation has changed to Named Network Content (NNC). In this dissertation we present an overview of the CCN architecture, its main characteristics, the components that make it up, and how its mechanisms mitigate traditional communication and security problems. Experiments are carried out with CCNx, a prototype comprising a set of features and tools that make it possible to implement this paradigm. The goal is to critically analyse some of the existing proposals and to identify opportunities, challenges and perspectives for future research.
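The CCN forwarding model the abstract refers to rests on three standard tables: a Content Store (cache), a Pending Interest Table (PIT) that records where requests came from, and a Forwarding Information Base (FIB) matched by name prefix. The sketch below illustrates that flow under assumed names and faces; it is not code from the CCNx prototype.

```python
class CcnNode:
    """Toy CCN forwarding engine: Content Store, PIT and FIB.

    Names are '/'-separated strings; faces are plain identifiers.
    """

    def __init__(self, fib):
        self.content_store = {}   # name -> data (cache)
        self.pit = {}             # name -> set of requesting faces
        self.fib = fib            # name prefix -> next-hop face

    def on_interest(self, name, in_face):
        if name in self.content_store:           # cache hit: answer locally
            return ("data", in_face, self.content_store[name])
        if name in self.pit:                     # pending: aggregate request
            self.pit[name].add(in_face)
            return ("aggregated", None, None)
        self.pit[name] = {in_face}               # leave a breadcrumb
        prefix = max((p for p in self.fib if name.startswith(p)),
                     key=len, default=None)      # longest-prefix match
        return ("forward", self.fib.get(prefix), None)

    def on_data(self, name, data):
        faces = self.pit.pop(name, set())        # follow breadcrumbs back
        self.content_store[name] = data          # cache for future interests
        return faces

node = CcnNode(fib={"/unesp": "face-upstream"})
print(node.on_interest("/unesp/thesis/ccn", in_face="face-A"))
# -> ('forward', 'face-upstream', None); a PIT entry now points back to face-A
```

Because Data packets retrace PIT breadcrumbs and are cached at each hop, content is served by name from wherever a copy exists, which is the location-independence the abstract highlights.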
Abstract:
Owing to the increase in the number of consumers of geographic data, above all from mobile devices, geographic information systems face the challenge of reducing the bottleneck posed by the classic client-server architecture. One solution to this problem is moving part of the intelligence to the client node, together with the creation of decentralized (peer-to-peer) networks. The objective of this research is to demonstrate the viability of a mobile-client/server infrastructure in which the client is not merely another consumer but becomes an intelligent node. For this study, an application was developed for the Android operating system that consumes OpenStreetMap data. The application uses tiling, caching and background data download based on the user's position, to facilitate consumption and reduce the flow of data exchanged between client and server. An internal web service was also created on the mobile device for building peer-to-peer networks, which allow data exchange between mobile terminals. This work demonstrates the viability of the infrastructure through the impact of the above techniques on the client device. To measure the impact, CPU load (which affects battery consumption) and system response time were taken into account. The results of the tests indicate that these techniques drastically reduce the search time for points of interest, but that they also have a high impact on the CPU and on loading time, especially on less capable devices.
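The tiling and caching techniques mentioned above build on the standard OpenStreetMap slippy-map scheme, where a coordinate maps to integer tile indices per zoom level, so tiles can be cached and prefetched by key. The formula below is the standard Web Mercator tiling; the 3x3 prefetch policy is an assumed illustration, not necessarily the application's actual strategy.

```python
import math

def osm_tile(lat_deg, lon_deg, zoom):
    """Map a WGS84 coordinate to OpenStreetMap slippy-map tile indices.

    Cached tiles can be keyed on (zoom, x, y), so repeated views near the
    user's position are served without contacting the server.
    """
    lat = math.radians(lat_deg)
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return zoom, x, y

def prefetch_keys(lat_deg, lon_deg, zoom):
    """Tiles to download in the background around the user's position
    (a 3x3 neighbourhood at one zoom level, as an illustrative policy)."""
    z, x, y = osm_tile(lat_deg, lon_deg, zoom)
    return [(z, x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]

print(osm_tile(41.3851, 2.1734, 12))   # Barcelona -> (12, 2072, 1529)
```

Keying the cache this way also gives nearby peers a shared vocabulary: two devices can exchange tiles over the peer-to-peer web service simply by advertising the (zoom, x, y) keys they hold.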