20 resultados para Peer-to-peer databases

em Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevância:

100.00%

Publicador:

Resumo:

1. Introduction "The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49. These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases, the EDD, into Finnish copyright legislation in 1998. Now, in the year 2005, after more than half a decade of domestic implementation, the proper meaning and construction of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection remain uncertain both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular and currently peculiarly European new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD and, second, to realise the potential and risks inherent in the new legislation in the economic, cultural and societal dimensions. 2. 
Subject-matter of the study: basic issues The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches either qualitatively or quantitatively the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for a careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, lower than that of the traditional form, or even free where public lending libraries provide access to the information online. This also renders it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed. 
The fear of illegal copying can lead to stark technical protection that in turn can dampen the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, thus weakening access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy. 3. Particular issues in Digital Economy and Information Networks All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the Mobile Internet, peer-to-peer networks, and Local and Wide Area Networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables, previously protected in part by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, numerous databases are compiled in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded therein constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not immediately obvious example of this is a database consisting of the physical coordinates of a selected group of customers, used for marketing through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: Has the collection and securing of the validity of this information required an essential input? What qualifies as a quantitatively or qualitatively significant investment? 
According to the Directive, a database comprises works, information and other independent materials, which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only once the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public of digital materials fit ill or lead to interpretations that are at variance with the analogous domain as regards the lawful and illegal uses of information. This may well interfere with or rework the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services. 4. International sphere After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, the disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically. Consequently, guidelines on the correct interpretation of the Directive that import practical, business-oriented solutions may well find application at the European level. This underlines the exigency for a thorough analysis of the implications, meaning and potential scope of database protection in Finland and the European Union. 
This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union stance, with a direct negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not, at least yet, have a sui generis database regime or its kin, while both political and academic discourse on the matter abounds. 5. The objectives of the study The background outlined above, with its several open issues, calls for a detailed study of the following questions: -What is a database-at-law and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation? -How is a database protected and what is its relation to other intellectual property regimes, particularly in the digital context? -What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications? -The difficult question of the relation between database protection and the protection of factual information as such. 6. Disposition The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political and rational background and the subsequent legislative evolution of the European database protection, reflected against the international backdrop on the issue. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and sui generis right facets in detail, together with the emergent application of the machinery in real-life societal and particularly commercial contexts. 
Furthermore, a general outline of copyright, relevant in the context of copyright databases, is provided. For purposes of further comparison, a chapter on the precursor of the sui generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinize its implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue concerning the IPR protection of information per se, a new tenet in the domain of copyright and related rights.

Relevância:

80.00%

Publicador:

Resumo:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in searchers' results. Such search interfaces provide web users with online access to myriads of databases on the Web. In order to obtain some information from a web database of interest, a user issues his/her query by specifying query terms in a search form and receives the query results, a set of dynamic pages that embed the required information from a database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary and key object of study is a huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: Though the term deep Web was coined in 2000, which is sufficiently long ago for any web-related concept/technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web conducted so far are predominantly based on studies of deep web sites in English. One can then expect that findings from these surveys may be biased, especially owing to a steady increase in non-English web content. 
In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from the national segment of the Web. Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches assume that search interfaces to the web databases of interest have already been discovered and are known to query systems. However, such assumptions do not hold true, mostly because of the large scale of the deep Web: indeed, for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other existing approaches to the deep Web, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user. This is all the more so as the interfaces of conventional search engines are also web forms. 
At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. In this way, the automation of querying and retrieving data behind search interfaces is desirable and essential for such tasks as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. Besides, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
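As a toy illustration of one sub-task mentioned in the abstract (extracting fields and labels from HTML search forms), the sketch below uses Python's standard html.parser to collect the fillable text inputs of a form; this is a simplified assumption-laden example, not the I-Crawler itself, which must also handle selects, labels, and JavaScript-rich or non-HTML forms.

```python
from html.parser import HTMLParser

class FormFieldParser(HTMLParser):
    """Collect the names of fillable text inputs inside a search form."""
    def __init__(self):
        super().__init__()
        self.fields = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # An <input> without an explicit type defaults to a text field.
        if tag == "input" and a.get("type", "text") in ("text", "search"):
            self.fields.append(a.get("name"))

# Hypothetical search form, for illustration only.
html = ('<form action="/search"><input type="text" name="title">'
        '<input type="search" name="author"><input type="submit"></form>')
parser = FormFieldParser()
parser.feed(html)
print(parser.fields)  # → ['title', 'author']
```

An automated agent would then assign query terms to these field names and submit the form, which is exactly the step that general-purpose crawlers skip.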

Relevância:

80.00%

Publicador:

Resumo:

Communications play a key role in modern smart grids. The new functionalities that make the grids 'smart' require the communication network to function properly. Data transmission between the intelligent electronic devices (IEDs) in the rectifier and the customer-end inverters (CEIs) used for power conversion is also required in the smart grid concept of the low-voltage direct current (LVDC) distribution network. Smart grid applications, such as smart metering, demand side management (DSM), and grid protection applied with communications, are all installed in the LVDC system. Thus, besides a remote connection to the databases of the grid operators, a local communication network in the LVDC network is needed. One solution applied to implement the communication medium in power distribution grids is power line communication (PLC). There are power cables in the distribution grids, and hence, they may be applied as a communication channel for distribution-level data. This doctoral thesis proposes an IP-based high-frequency (HF) band PLC data transmission concept for the LVDC network. A general method to implement the Ethernet-based PLC concept between the public distribution rectifier and the customer-end inverters in the LVDC grid is introduced. Low-voltage cables are studied as the communication channel in the frequency band of 100 kHz-30 MHz. The communication channel characteristics and the noise in the channel are described. All individual components in the channel are presented in detail, and a channel model, comprising models for each channel component, is developed and verified by measurements. The channel noise is also studied by measurements. Theoretical signal-to-noise ratio (SNR) and channel capacity analyses and practical data transmission tests are carried out to evaluate the applicability of the PLC concept against the requirements set by the smart grid applications in the LVDC system. 
The main results concerning the applicability of the PLC concept and its limitations are presented, and suggestions for future research are proposed.
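The channel capacity analysis mentioned in the abstract rests on the Shannon-Hartley theorem, C = B log2(1 + SNR). The snippet below evaluates it for illustrative numbers only (a 1 MHz slice of the studied 100 kHz-30 MHz band at an assumed 20 dB SNR; these are not measurements from the thesis).

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + SNR), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_db = 20.0                         # assumed SNR, for illustration
snr_linear = 10 ** (snr_db / 10)      # 20 dB corresponds to a factor of 100
c = channel_capacity(1e6, snr_linear) # 1 MHz slice of the band
print(f"{c / 1e6:.2f} Mbit/s")        # → 6.66 Mbit/s
```

In practice the measured SNR varies strongly across the band, so a per-subcarrier sum of such terms, rather than a single flat-SNR figure, is what an applicability study compares against the application requirements.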

Relevância:

80.00%

Publicador:

Resumo:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevância:

60.00%

Publicador:

Resumo:

The fact that most new Personal Digital Assistant (PDA) devices and smartphones are able to communicate via different wireless technologies has made several new applications possible. While the traditional network model is based on the idea of static hosts, mobile devices can create decentralized, self-organizing ad-hoc networks and act as peers in the network. This kind of adaptive network is suitable for mobile devices, which can freely join and leave networks. Because several different wireless communication technologies are involved, flexible switching of the networking technology must be handled in order to enable seamless communication across these networks. This thesis presents a transparent network interface for the mobile Peer-to-Peer environment, named Virtual PeerHood. Different wireless technologies and the aspects of providing seamless connectivity between them are explored. The result is a middleware platform for the mobile Peer-to-Peer environment, capable of handling several networking technologies.
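The idea of a transparent network interface can be sketched as a plugin abstraction: each wireless technology is wrapped behind a common API so the middleware can switch technologies without the application noticing. The class names below (NetworkPlugin, LoopbackPlugin, Middleware) are illustrative assumptions, not the actual Virtual PeerHood API.

```python
from abc import ABC, abstractmethod

class NetworkPlugin(ABC):
    """Common interface every networking technology must implement."""
    @abstractmethod
    def discover_peers(self) -> list[str]: ...
    @abstractmethod
    def send(self, peer: str, data: bytes) -> None: ...

class LoopbackPlugin(NetworkPlugin):
    """Stand-in 'technology' used here so the example runs anywhere."""
    def __init__(self):
        self.outbox = []
    def discover_peers(self):
        return ["local-peer"]
    def send(self, peer, data):
        self.outbox.append((peer, data))

class Middleware:
    """Routes traffic through whichever plugin is currently active."""
    def __init__(self, plugin: NetworkPlugin):
        self.plugin = plugin
    def switch_technology(self, plugin: NetworkPlugin):
        self.plugin = plugin  # the seamless-handover point
    def send(self, peer, data):
        self.plugin.send(peer, data)

mw = Middleware(LoopbackPlugin())
print(mw.plugin.discover_peers())  # → ['local-peer']
```

A Bluetooth or WLAN plugin would implement the same two methods, so applications written against Middleware stay unchanged when the underlying technology changes.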

Relevância:

60.00%

Publicador:

Resumo:

During the last half decade the popularity of different peer-to-peer applications has grown tremendously. Traditionally, only desktop-class computers with fixed-line network connections have been powerful enough to utilize peer-to-peer. However, the situation is about to change. The rapid development of wireless terminals will soon enable peer-to-peer applications on these devices as well as on desktops. The possibilities are further enhanced by the upcoming high-bandwidth cellular networks. In this thesis the applicability and implementation alternatives of an existing peer-to-peer system are researched for two target platforms: a Linux-powered iPaq and a Symbian OS based smartphone. The result is a peer-to-peer middleware component suitable for mobile terminals. It works on both platforms and utilizes Bluetooth networking technology. The implemented software platforms are compatible with each other, and support for additional network technologies can be added with minimal effort.

Relevância:

60.00%

Publicador:

Resumo:

The increase of computational power and the emergence of new computer technologies have led to the popularity of local communications between personal trusted devices. In turn, this has led to the emergence of security problems related to the user data utilized in such communications. One of the main aspects of data security assurance is the security of the software operating on mobile devices. The aim of this work was to analyze security threats to PeerHood, software intended for personal communications between mobile devices regardless of the underlying network technologies. To reach this goal, risk-based software security testing was performed. The results of the testing showed that the project has several security vulnerabilities, so PeerHood cannot be considered secure software. The analysis made in this work is the first step towards the further implementation of PeerHood security mechanisms, as well as towards taking security into account in the development process of the project.

Relevância:

60.00%

Publicador:

Resumo:

The aim of this master's thesis was to specify a system requiring minimal configuration and providing maximal connectivity, in the vein of Skype but for device management purposes. As peer-to-peer applications are pervasive, and especially as Skype is known to provide this functionality, the research focused on these technologies. The resulting specification was a hybrid of a tiered hierarchical network structure and a Kademlia-based DHT. A prototype was produced as a proof of concept for the hierarchical topology, demonstrating that the specification was feasible.
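The Kademlia DHT mentioned above organizes routing around a single idea: node and key identifiers live in the same ID space, and "distance" between two IDs is their bitwise XOR, so a lookup repeatedly moves toward the XOR-closest known peers. A minimal illustration with toy 8-bit IDs (not the thesis implementation):

```python
def xor_distance(a: int, b: int) -> int:
    """Kademlia's distance metric: bitwise XOR of two IDs."""
    return a ^ b

def closest_nodes(target: int, nodes: list[int], k: int = 2) -> list[int]:
    """Return the k known nodes XOR-closest to the target ID."""
    return sorted(nodes, key=lambda n: xor_distance(n, target))[:k]

# Four known peers with toy 8-bit IDs.
nodes = [0b0001, 0b0111, 0b1000, 0b1110]
print(closest_nodes(0b0110, nodes))  # → [7, 1]
```

Because XOR is symmetric and satisfies the triangle inequality over bit strings, each routing step provably at least halves the remaining distance, which is what makes Kademlia lookups logarithmic in network size.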

Relevância:

60.00%

Publicador:

Resumo:

The purpose of this thesis is to find out whether all peer-to-peer borrowers are unworthy of credit, and whether there are single qualities or combinations of qualities that determine the probability of default of a person or group of people. Distinguishing qualities are searched for with self-organizing maps (SOM). The qualities and groups of people found by the self-organizing map are then compared to the average. The comparison is carried out by examining how large a proportion of the borrowers meeting the criteria are two months or more behind with their payments. The research data were collected by an Estonian peer-to-peer lending company during the years 2011-2014. The data consist of peer-to-peer borrowers and information gathered from them.
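A self-organizing map of the kind used here can be sketched in a few lines: borrower records, represented as numeric feature vectors, are mapped onto a small grid of prototype vectors, and after training each map unit gathers similar borrowers, whose default rate can then be compared to the overall average. The snippet below trains a toy one-dimensional map on made-up two-dimensional features (illustrative only, not the thesis data).

```python
import math
import random

def train_som(data, n_units=4, epochs=50, lr=0.5):
    """Toy 1-D SOM: returns n_units prototype vectors fitted to the data."""
    random.seed(0)
    units = [[random.random(), random.random()] for _ in range(n_units)]
    for epoch in range(epochs):
        # Neighborhood radius shrinks over time, as in standard SOM training.
        radius = max(1.0 * (1 - epoch / epochs), 0.01)
        for x in data:
            # Best-matching unit: prototype closest to the input vector.
            bmu = min(range(n_units),
                      key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
            # Pull the BMU and its grid neighbors toward the input.
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                units[i] = [u + lr * h * (v - u) for u, v in zip(units[i], x)]
    return units

# Made-up, pre-scaled borrower features (e.g. income, loan amount).
data = [[0.1, 0.2], [0.15, 0.25], [0.8, 0.9], [0.85, 0.8]]
units = train_som(data)
```

After training, one would assign each borrower to its best-matching unit and compute the share of payments two or more months overdue per unit, flagging units whose share deviates from the average.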

Relevância:

50.00%

Publicador:

Resumo:

Companies are increasingly under pressure to be more efficient both in terms of costs and overall performance, and thus they seek new ways to develop their products and innovate. In the pharmaceutical industry it can take several decades to launch a new drug to the market. Since the pharmaceutical industry is one of the most research-intensive industries, outsourcing is one way to enhance the R&D processes of such companies. It is said that outsourcing to offshore locations is vastly more challenging and complicated than any other exporting activity or inter-company relationship, which has evoked a lot of discussion. When outsourcing strategically, companies must also focus thoroughly on transaction costs and core competences. Today, suppliers are sought beyond national boundaries and, furthermore, the location of the outsourcing activity must also be thoroughly considered. Consequently, the purpose of this study is to analyze what is known of the strategic outsourcing of pharmaceutical R&D to India. In order to meet this purpose, the study addresses three sub-questions: first, what is strategic outsourcing; second, why do pharmaceutical companies utilize strategic outsourcing of R&D; and last, why do pharmaceutical companies select India as the location for outsourcing their R&D. The study is a qualitative study. Its purpose was approached through a literature review with systematic elements, and the sub-questions were analyzed through different relevant theories, such as the theory of transaction costs, core competences and location advantages. Applicable academic journal articles were comprehensively included in the study. The data were collected from electronic journal article databases using key words, and almost exclusively peer-reviewed, as recent as possible articles were included. Both the reference lists of the included articles and article recommendations from professionals generated further articles for inclusion. 
The data were analyzed through thematization, resulting in themes that illuminate the purpose of the study and its sub-questions. As an outcome of the analysis, each of the theory chapters in the study represents one sub-question. The literature used in this study revealed that strategic outsourcing of R&D is increasingly used in the pharmaceutical industry, and the major motives for practicing it have to do with lowering costs; accessing skilled labor, resources and knowledge and enhancing their quality; and speeding up the introduction of new drugs. Mainly for the above-mentioned motives, India is frequently chosen as the target location by pharma outsourcers. Still, the literature on this complex phenomenon is somewhat incomplete, and more research is needed.

Relevância:

40.00%

Publicador:

Resumo:

Abstract: Web-based reference databases and subject directories in the agricultural and food sciences - the perspective of a Finnish information seeker

Relevância:

40.00%

Publicador:

Resumo:

The main objective of this research paper was to synthesize, integrate and analyze the theoretical foundation of the resource-based view of the firm on sustainable competitive advantage. Accordingly, this research was a literature study employing the methodology of interpretative concept study and unobtrusive measures. The core and majority of the research data were gathered from the major online journal databases. Only peer-reviewed articles from highly esteemed journals on the subject of competitive advantage were used. The theoretical core of the paper centres on resources, capabilities, and the sustainability dilemma of competitive advantage. Furthermore, other strategic management concepts relating to the resource-based view of the firm were used with reference to the research objectives. The resource-based view of the firm continues to be a controversial but important area of strategic management research on sustainable competitive advantage. Consequently, the theoretical foundation and the empirical testing of the framework need further work. However, it is evident that internal organizational factors in the form of resources and capabilities are vital for the formation of sustainable competitive advantage. Resources and capabilities are not, however, valuable on their own - competitive advantage requires seamless interplay and complementarity between bundles of resources and capabilities.

Relevância:

40.00% 40.00%

Publicador:

Relevância:

40.00% 40.00%

Publicador: