56 results for Computer networks -- Security measures
Abstract:
Currently, many ontologies are available for addressing different domains. However, it is not always possible to deploy such ontologies to support collaborative working, so that their full potential can be exploited to implement intelligent cooperative applications capable of reasoning over a network of context-specific ontologies. The main problem arises from the fact that, at present, ontologies are created in an isolated way to address specific needs. However, we foresee the need for a network of ontologies which will support the next generation of intelligent applications/devices and the vision of Ambient Intelligence. The main objective of this paper is to motivate the design of a networked ontology (meta)model which formalises ways of connecting available ontologies so that they are easy to search, characterise and maintain. The aim is to make explicit the virtual and implicit network of ontologies serving the Semantic Web.
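As a hedged illustration of one way an implicit network between ontologies can be made visible, the sketch below follows owl:imports links with rdflib. The example URI and the choice of rdflib are assumptions for illustration, not the meta-model proposed in the paper.

```python
# Hedged sketch: treat each ontology as a node and its owl:imports declarations
# as edges, making part of the implicit network of ontologies explicit.
# The example URL and the use of rdflib are illustrative assumptions.
from rdflib import Graph
from rdflib.namespace import OWL

def imported_ontologies(ontology_url):
    """Parse an ontology and return the URIs it declares via owl:imports."""
    g = Graph()
    g.parse(ontology_url)   # rdflib infers the serialisation where possible
    return {str(target) for _, _, target in g.triples((None, OWL.imports, None))}

# Example (hypothetical URI):
# print(imported_ontologies("http://example.org/ontologies/building.owl"))
# Walking these links across a set of ontologies yields an explicit graph that
# can then be searched, characterised and maintained.
```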
Abstract:
Since the advent of the internet in everyday life in the 1990s, the barriers to producing, distributing and consuming multimedia data such as videos, music and e-books have steadily been lowered for most computer users, so that almost everyone with internet access can join the online communities that produce, consume and share media artefacts. Along with this trend, the violation of personal data privacy and copyright has increased, with illegal file sharing rampant across many online communities, particularly for certain music genres and amongst younger age groups. This has had a devastating effect on the traditional media distribution market, in most cases leaving the distribution companies and the content owners with huge financial losses. To prove that a copyright violation has occurred, one can deploy fingerprinting mechanisms to uniquely identify the property; however, current approaches are uni-modal only. In this paper we describe some of the design challenges and architectural approaches to multi-modal fingerprinting currently being examined for evaluation studies within a PhD research programme on the optimisation of multi-modal fingerprinting architectures. Accordingly, we outline the available modalities being integrated through this research programme, which aims to establish the optimal architecture for multi-modal media security protection over the internet as the online distribution environment for both legal and illegal distribution of media products.
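As a hedged illustration of the uni-modal baseline that the multi-modal work above builds on, the sketch below computes a coarse spectral fingerprint for an audio signal. The function name, frame sizes, band count and hashing scheme are illustrative assumptions, not the architectures examined in the paper.

```python
# Minimal sketch of a uni-modal (audio) fingerprint, for illustration only.
# Frame size, hop, band count and the hashing scheme are assumptions, not the
# architectures evaluated in the research programme described above.
import numpy as np

def audio_fingerprint(samples, frame_size=2048, hop=1024, n_bands=16):
    """Return a sequence of small integer hashes summarising spectral energy."""
    hashes = []
    for start in range(0, len(samples) - frame_size, hop):
        frame = samples[start:start + frame_size] * np.hanning(frame_size)
        spectrum = np.abs(np.fft.rfft(frame))
        # Split the spectrum into coarse bands and record each band's energy.
        bands = np.array_split(spectrum, n_bands)
        energies = np.array([band.sum() for band in bands])
        # Encode whether each band's energy rises relative to its neighbour.
        bits = (np.diff(energies) > 0).astype(int)
        hashes.append(int("".join(map(str, bits)), 2))
    return hashes

if __name__ == "__main__":
    t = np.linspace(0, 1, 44100, endpoint=False)
    tone = np.sin(2 * np.pi * 440 * t)          # a 440 Hz test tone
    print(audio_fingerprint(tone)[:5])
# Two recordings of the same item should yield largely matching hash sequences,
# which is the basis for detecting unauthorised copies.
```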
Abstract:
The existence of endgame databases challenges us to extract higher-grade information and knowledge from their basic data content. Chess players, for example, would like simple and usable endgame theories, if such a holy grail exists; endgame experts would like to provide such insights and be inspired by computers to do so. Here, we investigate the use of artificial neural networks (NNs) to mine these databases and report on a first use of NNs on the KPK (king and pawn versus king) endgame. The results encourage us to suggest further work on chess applications of neural networks and other data-mining techniques.
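A hedged sketch of the kind of database mining described above: a small feed-forward network trained to predict the theoretical value of KPK positions. The feature encoding, the use of scikit-learn's MLPClassifier and the randomly generated placeholder data are assumptions for illustration; the paper's own network design and the real tablebase labels are not reproduced here.

```python
# Hedged sketch: a small neural network over (placeholder) KPK positions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def encode(wk, wp, bk, white_to_move):
    """Encode a KPK position: (file, rank) of each piece scaled to [0, 1],
    plus a side-to-move flag. This encoding is an assumption for illustration."""
    coords = np.array([*wk, *wp, *bk], dtype=float) / 7.0
    return np.append(coords, float(white_to_move))

# In a real study, X and y would come from the KPK endgame database
# (positions and their win/draw values); random data keeps this sketch runnable.
rng = np.random.default_rng(0)
coords = rng.integers(0, 8, size=(5000, 6)) / 7.0
flags = rng.integers(0, 2, size=(5000, 1)).astype(float)
X = np.hstack([coords, flags])
y = rng.integers(0, 2, size=5000)           # 1 = win, 0 = draw (placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
net.fit(X_train, y_train)
print("held-out accuracy:", net.score(X_test, y_test))

# Query the trained net for a specific (hypothetical) position.
pos = encode(wk=(4, 5), wp=(4, 6), bk=(4, 7), white_to_move=True)
print("predicted label:", net.predict(pos.reshape(1, -1))[0])
```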
Abstract:
Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to the landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated to its topological locality. A popular dimensionality-reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, there is little comparison between the Hilbert curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. The Hilbert curve, Sammon's mapping and Principal Component Analysis have been used to generate a 1-D space with locality-preserving properties. This work provides empirical evidence to support the use of the Hilbert curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2-D network model and with a realistic network topology model exhibiting the typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement, and better techniques to preserve locality information are required.
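A hedged sketch of the comparison above for the two-dimensional case: landmark latency vectors are quantised onto a grid and mapped to a single scalar, once via a Hilbert curve index (the classic bit-manipulation formulation) and once via the first principal component. The grid resolution and the synthetic latencies are illustrative assumptions, not the paper's experimental setup.

```python
# Hedged sketch: reduce 2-D landmark latency vectors to one scalar identifier
# via a Hilbert curve index and via PCA; the data and grid size are assumptions.
import numpy as np
from sklearn.decomposition import PCA

def hilbert_index(n, x, y):
    """Distance along a 2-D Hilbert curve covering an n x n grid (n a power of
    two) of the cell with integer coordinates (x, y)."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/reflect the quadrant so the next level sees the standard orientation.
        if ry == 0:
            if rx == 1:
                x = n - 1 - x
                y = n - 1 - y
            x, y = y, x
        s //= 2
    return d

# Synthetic latencies (ms) to two landmarks for a handful of peers.
latencies = np.array([[12, 90], [15, 85], [80, 20], [78, 25], [45, 50]], dtype=float)

# Hilbert mapping: quantise latencies onto a 256 x 256 grid, then index.
n = 256
grid = np.floor(latencies / latencies.max() * (n - 1)).astype(int)
hilbert_ids = [hilbert_index(n, x, y) for x, y in grid]

# PCA mapping: project onto the first principal component.
pca_ids = PCA(n_components=1).fit_transform(latencies).ravel()

print("Hilbert identifiers:", hilbert_ids)
print("PCA identifiers:    ", np.round(pca_ids, 2))
# Under a locality-preserving mapping, peers with similar latency vectors
# should receive nearby scalar identifiers.
```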
Abstract:
At present, collective action regarding bio-security among UK cattle and sheep farmers is rare. Despite the occurrence within recent decades of catastrophic livestock diseases such as bovine spongiform encephalopathy (BSE) and foot-and-mouth disease (FMD), there are few national or local farmer-led animal health schemes. To explore the reasons for this apparent lack of interest, we utilised a socio-psychological approach to disaggregate the cognitive, emotive and contextual factors driving bio-security behaviour among cattle and sheep farmers in the United Kingdom (UK). In total, we interviewed 121 farmers in South-West England and Wales. The main analytical tools included content, cluster and logistic regression analyses. The results of the content analysis illustrated an apparent 'dissonance' between bio-security attitudes and behaviour. Despite the heavy toll animal disease has taken on the agricultural economy, most study participants were dismissive of the many measures associated with bio-security. Justification for this lack of interest was largely framed in relation to the collective attribution, or blame, for the disease threats themselves. Indeed, epidemic diseases were largely attributed to external actors and agents. Reasons given for outbreaks included inadequate border control, in tandem with ineffective policies and regulations. Conversely, endemic livestock disease was viewed as a problem for 'bad' farmers and not an issue for those individuals who managed their stock well. As such, there was little utility in forming groups to address what was largely perceived as an individual problem. Further, we found that attitudes toward bio-security did not appear to be influenced by any particular source of information per se. While strong negative attitudes were found toward specific sources of bio-security information, e.g. government leaflets, these appear simply to reflect widely held beliefs. In relation to actual bio-security behaviours, the logistic regression analysis revealed no significant difference between in-scheme and out-of-scheme farmers. We concluded that, in order to support collective action with regard to bio-security, messages need to be reframed and delivered from a neutral source. Efforts to support group formation must also recognise and address the issues relating to perceptions of social connectedness among the communities involved.
Abstract:
The identification of criminal networks is not a routine exploratory process within the current practice of the law enforcement authorities; rather, it is triggered by specific evidence of criminal activity being investigated. A network is identified when a criminal comes to notice; any associates who could also be implicated then need to be identified, if only to be eliminated from the enquiries as suspects or witnesses, as well as to prevent and/or detect crime. However, an identified network may not be the one causing most harm in a given area. This paper presents a methodology to identify all of the criminal networks present within a Law Enforcement Area and to prioritise those causing most harm to the community. Each crime is allocated a score based on its crime type and how recently it was committed; the network score, which can be used as decision support to help prioritise the network for law enforcement purposes, is the sum of the individual crime scores.
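The scoring rule stated in the abstract (each crime scored by type and recency, and a network scored as the sum of its crime scores) can be sketched as below. The type weights, the exponential recency decay, the reference date and the example networks are hypothetical illustrations; the paper's actual harm scores are not reproduced.

```python
# Hedged sketch of the harm-scoring idea: crime score from type and recency,
# network score as the sum of its crime scores. Weights, the decay schedule and
# the reference date are hypothetical.
from datetime import date

TYPE_WEIGHT = {"burglary": 5, "assault": 8, "fraud": 4, "robbery": 9}  # assumed

def crime_score(crime_type, committed_on, today=date(2024, 1, 1), half_life_days=365):
    """Type weight discounted by how long ago the crime was committed."""
    age_days = (today - committed_on).days
    recency = 0.5 ** (age_days / half_life_days)   # assumed exponential decay
    return TYPE_WEIGHT[crime_type] * recency

def network_score(crimes):
    """Network score = sum of the individual crime scores, as in the abstract."""
    return sum(crime_score(kind, when) for kind, when in crimes)

network_a = [("burglary", date(2023, 11, 3)), ("robbery", date(2023, 6, 15))]
network_b = [("fraud", date(2021, 2, 1)), ("assault", date(2022, 8, 20))]
ranked = sorted([("A", network_score(network_a)), ("B", network_score(network_b))],
                key=lambda item: item[1], reverse=True)
print(ranked)   # networks ordered by estimated harm, highest first
```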
Abstract:
Traditionally, applications and tools supporting collaborative computing have been designed only with personal computers in mind and support a limited range of computing and network platforms. These applications are therefore not well equipped to deal with network heterogeneity and, in particular, do not cope well with dynamic network topologies. Progress in this area must be made if we are to fulfil the needs of users and support the diversity, mobility and portability that are likely to characterise group work in the future. This paper describes a groupware platform called Coco that is designed to support collaboration in a heterogeneous network environment. The work demonstrates that progress in the development of generic supporting groupware is achievable, even in the context of heterogeneous and dynamic networks. It also demonstrates the progress made in the development of an underlying communications infrastructure, building on peer-to-peer concepts and topologies to improve scalability and robustness.
Abstract:
In this article, an overview of some of the latest developments in the field of cerebral cortex-to-computer interfacing (CCCI) is given. This is posed in the more general context of Brain-Computer Interfaces in order to assess advantages and disadvantages. The emphasis is clearly placed on practical studies that have been undertaken and reported on, as opposed to those speculated, simulated or proposed as future projects. Related areas are discussed briefly, only in the context of their contribution to the studies being undertaken. The area of focus is notably the use of invasive implant technology, where a connection is made directly with the cerebral cortex and/or nervous system. Tests and experimentation which do not involve human subjects are invariably carried out a priori to indicate the eventual possibilities before human subjects are themselves involved. Some of the more pertinent animal studies from this area are discussed. The paper goes on to describe human experimentation, in which neural implants have linked the human nervous system bidirectionally with technology and the internet. A view is taken of the future prospects for CCCI, in terms of its broad therapeutic role.
Abstract:
Different types of mental activity are utilised as an input in Brain-Computer Interface (BCI) systems. One such activity type is based on Event-Related Potentials (ERPs). The characteristics of ERPs are not visible in single trials, so averaging over a number of trials is necessary before the signals become usable. An improvement in ERP-based BCI operation and system usability could be obtained if the use of single-trial ERP data were possible. The method of Independent Component Analysis (ICA) can be utilised to separate single-trial recordings of ERP data into components that correspond to ERP characteristics, background electroencephalogram (EEG) activity and other components of non-cerebral origin. Choosing specific components and using them to reconstruct "denoised" single-trial data could improve the signal quality, thus allowing the successful use of single-trial data without the need for averaging. This paper assesses single-trial ERP signals reconstructed using a selection of estimated components from the application of ICA to the raw ERP data. Signal improvement is measured using contrast-to-noise measures. It was found that such analysis improves the signal quality in all single trials.
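A hedged sketch of the reconstruction procedure described above: decompose a multi-channel single trial with ICA, retain a subset of components, reconstruct, and compare a contrast-to-noise figure before and after. The component-selection rule, the CNR definition and the random placeholder data are illustrative assumptions, not the paper's procedure.

```python
# Hedged sketch: ICA decomposition of one multi-channel trial, reconstruction
# from selected components, and a simple contrast-to-noise comparison.
# The selection rule, CNR definition and placeholder data are assumptions.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_channels, n_samples = 8, 512
trial = rng.standard_normal((n_samples, n_channels))   # placeholder single trial

ica = FastICA(n_components=n_channels, random_state=1)
sources = ica.fit_transform(trial)                     # (samples, components)

# Keep only components assumed to carry ERP-related activity; the first three
# are used here purely as a placeholder for a real selection criterion.
keep = [0, 1, 2]
selected = np.zeros_like(sources)
selected[:, keep] = sources[:, keep]
denoised = ica.inverse_transform(selected)             # reconstructed trial

def contrast_to_noise(signal, window=slice(150, 250), baseline=slice(0, 100)):
    """One simple CNR variant: peak-to-peak amplitude in a post-stimulus window
    divided by the baseline standard deviation (definition assumed here)."""
    segment = signal[window]
    return (segment.max() - segment.min()) / signal[baseline].std()

print("CNR before:", contrast_to_noise(trial[:, 0]))
print("CNR after: ", contrast_to_noise(denoised[:, 0]))
```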
Abstract:
This paper discusses RFID implants for identification via a sensor network, brain-computer implants linked into a wireless network, and biometric identification via body sensors. The use of a network as a means for remote and distance monitoring of humans opens up a range of potential uses. Where implanted identification is concerned, this immediately offers high-security access to specific areas by means of an RFID device alone. If a neural implant is employed, then clearly the information exchanged with a network can take on a much richer form, allowing for identification and response to an individual's needs based on the signals apparent on their nervous system.
Abstract:
There are three key driving forces behind the development of Internet Content Management Systems (CMS): a desire to manage the explosion of content, a desire to provide structure and meaning to content in order to make it accessible, and a desire to work collaboratively to manipulate content in some meaningful way. Yet the traditional CMS has been unable to meet the last of these requirements, often failing to provide sufficient tools for collaboration in a distributed context. Peer-to-Peer (P2P) systems are networks in which every node is an equal participant (whether transmitting data, exchanging content or invoking services) and there is an absence of any centralised administrative or coordinating authority. P2P systems are inherently more scalable than equivalent client-server implementations, as they tend to use resources at the edge of the network much more effectively. This paper details the rationale and design of a P2P middleware for collaborative content management.
Abstract:
This paper describes a prototype grid infrastructure, called the eMinerals minigrid, for molecular simulation scientists, which is based on an integration of shared compute and data resources. We describe the key components, namely the use of Condor pools; Linux/Unix clusters with PBS and IBM's LoadLeveler job-handling tools; the use of Globus for security handling; Condor-G tools for wrapping Globus job-submit commands; Condor's DAGMan tool for handling workflow; the Storage Resource Broker for handling data; and the CCLRC dataportal and associated tools, both for archiving data with metadata and for making data available to other workers.
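As a hedged sketch of how a job might be wrapped in this kind of minigrid, the snippet below writes a Condor grid-universe submit description routed through Globus to a PBS-managed cluster and chains two jobs with a DAGMan file. The hostnames, file names and the exact grid_resource string are assumptions for illustration, not the eMinerals configuration.

```python
# Hedged sketch: a Condor-G style submit description plus a two-step DAGMan
# workflow. Hostnames, filenames and resource strings are assumptions only.
from pathlib import Path

submit = """\
universe      = grid
grid_resource = gt2 gatekeeper.example.ac.uk/jobmanager-pbs
executable    = run_simulation.sh
output        = sim.out
error         = sim.err
log           = sim.log
queue
"""
Path("simulate.sub").write_text(submit)

# Two-step workflow: run the simulation, then archive its results (for example,
# pushing them to a data store such as the Storage Resource Broker).
dag = """\
JOB SIMULATE simulate.sub
JOB ARCHIVE  archive.sub
PARENT SIMULATE CHILD ARCHIVE
"""
Path("workflow.dag").write_text(dag)
# The workflow would then be submitted with: condor_submit_dag workflow.dag
```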