852 results for information networks
Abstract:
The existence of endgame databases challenges us to extract higher-grade information and knowledge from their basic data content. Chess players, for example, would like simple and usable endgame theories if such a holy grail exists; endgame experts would like to provide such insights and to be inspired by computers in doing so. Here, we investigate the use of artificial neural networks (NNs) to mine these databases and report a first application of NNs to the KPK (king and pawn versus king) endgame. The results encourage us to suggest further work on chess applications of neural networks and other data-mining techniques.
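As a hedged illustration of what such mining could look like in practice (not the paper's actual method), the sketch below trains a small feed-forward network on a decoded KPK table; the `load_kpk_table` helper, the feature encoding, and the labels are hypothetical placeholders.

```python
# Minimal sketch: train a small feed-forward NN to predict KPK
# game-theoretic values from a decoded endgame table.
# `load_kpk_table` is a hypothetical placeholder: it should yield one
# row of board features per position (e.g. the three piece squares
# plus side to move) and a win/draw label.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def load_kpk_table():
    # Placeholder: replace with a real KPK tablebase decoder.
    # Random "positions" and labels are fabricated purely so the sketch runs.
    rng = np.random.default_rng(0)
    X = rng.integers(0, 8, size=(5000, 7))   # files/ranks of WK, WP, BK + side to move
    y = rng.integers(0, 2, size=5000)        # 1 = win, 0 = draw (dummy labels)
    return X, y

X, y = load_kpk_table()
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
net.fit(X_train, y_train)
print(f"held-out accuracy: {net.score(X_test, y_test):.3f}")
```

With real tablebase labels in place of the dummies, the held-out accuracy would indicate how much of the endgame's structure the network has captured.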
Abstract:
This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component represents knowledge of concepts in the domain so that explanations of those concepts can be adapted to user needs. The first part of the paper describes two studies of user requirements. The first is a questionnaire study examining respondents' familiarity with domain concepts. The second is an analysis of concept descriptions in textbooks and from expert epidemiologists, examining how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies were used to design the user modeling component of EPIAIM. The module works in two steps. In the first step, a few trigger questions activate a stereotype comprising a "body" and an "inference component". The body represents the knowledge that a class of users is expected to have, along with the probability that each item is known. In the inference component, the process of learning concepts is represented as a belief network. In the second step, this belief network refines the initial default information in the stereotype's body: the system asks a few questions about concepts whose familiarity to the user is uncertain and propagates the new evidence to revise the whole model. The system has been implemented on a workstation under UNIX. A worked example of the system in operation is presented, and advantages and limitations of the approach are discussed.
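As a hedged illustration of the two-step idea (not EPIAIM's actual network), consider a two-concept fragment: a stereotype supplies prior probabilities that each concept is known, a dependency links them, and one answer is propagated by Bayes' rule to revise the other. The concept names, priors, and conditional link below are invented for illustration.

```python
# Toy two-concept fragment of a stereotype refinement step.
# Priors and the conditional link are invented for illustration;
# EPIAIM's real belief network is larger and built from its user studies.

p_rate = 0.6              # stereotype prior: user knows "rate"
p_or_given_rate = 0.7     # P(knows "odds ratio" | knows "rate")
p_or_given_not_rate = 0.1 # P(knows "odds ratio" | does not know "rate")

# Prior for "odds ratio" implied by the stereotype body:
p_or = p_or_given_rate * p_rate + p_or_given_not_rate * (1 - p_rate)

# Step 2: the user answers that they DO know "odds ratio".
# Propagate this evidence back to "rate" with Bayes' rule.
p_rate_given_or = p_or_given_rate * p_rate / p_or

print(f"prior P(knows rate)                        = {p_rate:.2f}")
print(f"prior P(knows odds ratio)                  = {p_or:.2f}")
print(f"posterior P(knows rate | knows odds ratio) = {p_rate_given_or:.2f}")
```

Here the single positive answer raises the estimate that the user knows the related concept from 0.60 to about 0.91, which is exactly the kind of revision the second step performs across the whole network.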
Abstract:
The study of the morphology of tidal networks and their relation to salt marsh vegetation is currently an active area of research, and a number of theories have been developed that require validation against extensive observations. Conventional methods of measuring networks and associated vegetation can be cumbersome and subjective. Recent advances in remote sensing can often reduce measurement effort while increasing measurement scale. The status of remote sensing of tidal networks and their relation to vegetation is reviewed. Network planforms and their associated variables can be measured to sufficient resolution using digital aerial photography and airborne scanning laser altimetry (LiDAR), with LiDAR also able to measure channel depths. A multi-level knowledge-based technique is described for extracting networks from LiDAR in a semi-automated fashion. This allows objective and detailed geomorphological information on networks to be obtained over large areas of the inter-tidal zone. It is illustrated using LiDAR data of the River Ems, Germany, the Venice lagoon, and Carnforth Marsh, Morecambe Bay, UK. Examples of geomorphological variables of networks extracted from LiDAR data are given. Associated marsh vegetation can be classified into its component species using airborne hyperspectral and satellite multispectral data. Other potential applications of remote sensing for network studies include determining spatial relationships between networks and vegetation, measuring marsh platform vegetation roughness, in-channel velocities and sediment processes, studying salt pans, and supporting marsh restoration schemes.
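For concreteness, here is a minimal sketch of the lowest level of such an extraction pipeline, assuming the LiDAR digital elevation model is already gridded as a 2-D array; the single elevation threshold and the skeletonisation step stand in for the paper's multi-level knowledge-based rules.

```python
# Minimal sketch: extract a tidal-channel network skeleton from a
# gridded LiDAR DEM. The single elevation threshold stands in for the
# multi-level knowledge-based rules described in the paper.
import numpy as np
from skimage.morphology import skeletonize

def extract_channels(dem: np.ndarray, platform_height: float) -> np.ndarray:
    """Return a boolean centreline mask of candidate channels."""
    channel_mask = dem < platform_height   # cells incised below the marsh platform
    return skeletonize(channel_mask)       # thin channels to 1-pixel centrelines

# Synthetic DEM with one incised channel, purely so the sketch runs.
dem = np.full((50, 50), 2.0)
dem[20:23, :] = 0.5                        # a straight low-lying channel
centrelines = extract_channels(dem, platform_height=1.0)
print(f"channel centreline cells: {centrelines.sum()}")
```

Geomorphological variables such as channel length or drainage density can then be read off the centreline mask; the knowledge-based levels described in the paper would refine this crude mask before such measurements.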
Abstract:
Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measurements to the landmark nodes, which it uses as a multi-dimensional feature vector. Each peer node then maps the feature vector to a unique scalar index that is correlated with its topological locality. A popular dimensionality-reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties; however, little comparison exists between the Hilbert curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated: the Hilbert curve, Sammon's mapping, and Principal Component Analysis are each used to generate a one-dimensional space with locality-preserving properties. The work provides empirical evidence supporting the use of the Hilbert curve for locality preservation when generating peer identifiers from landmark vectors. A comparative analysis is carried out with an artificial 2-D network model and with a realistic network topology model exhibiting the typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results on the realistic network model show that there is scope for improvement, and better techniques for preserving locality information are required.
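A minimal sketch of the indexing step, assuming just two landmarks so the latency vector is a 2-D point; the quantisation range and the grid-based conversion follow the textbook Hilbert-curve construction rather than the paper's exact implementation.

```python
# Minimal sketch: map a 2-D landmark latency vector to a scalar
# Hilbert-curve index. The xy -> d conversion is the textbook
# construction for a 2**k x 2**k grid; two landmarks are assumed
# purely to keep the example two-dimensional.

def hilbert_index(n: int, x: int, y: int) -> int:
    """Distance of grid cell (x, y) along a Hilbert curve on an n x n grid."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate/flip the quadrant
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

def peer_id(latencies_ms, max_latency_ms=500.0, order=8):
    """Quantise a 2-landmark latency vector and return its Hilbert index."""
    n = 2 ** order
    x, y = [min(int(l / max_latency_ms * (n - 1)), n - 1) for l in latencies_ms]
    return hilbert_index(n, x, y)

# Nodes with similar latency vectors tend to receive nearby scalar IDs.
print(peer_id([40.0, 120.0]), peer_id([42.0, 118.0]), peer_id([300.0, 20.0]))
```

The first two nodes, whose latency vectors are close, end up with nearby identifiers, while the third lands far away; this is the locality-preserving behaviour the paper evaluates against Sammon's mapping and PCA.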
Abstract:
This conceptual paper aims to improve our understanding of how internationalised firms use outsourcing and offshoring strategies to manage knowledge and information throughout the life-cycle of integrated product-service solutions. More precisely, we identify an appropriate theoretical framework for this analysis and investigate, through in-depth case studies, how UK engineering firms organise, coordinate, and incentivise work that is executed in globally distributed teams. Our research focuses on the firms' UK and India offices to study the organisation and governance of distributed teams. The research addresses several theoretical dimensions (organisation, geography, time, and knowledge) as boundary challenges.
Abstract:
A large volume of visual content remains inaccessible until effective and efficient indexing and retrieval of such data is achieved. In this paper, we introduce the DREAM system, a knowledge-assisted, semantic-driven, context-aware visual information retrieval system applied in the film post-production domain. We focus mainly on the automatic labelling and topic-map-related aspects of the framework. The use of context-related collateral knowledge, represented by a novel probabilistic visual-keyword co-occurrence matrix, was proven effective in the experiments conducted during system evaluation. The automatically generated semantic labels are fed into the Topic Map Engine, which automatically constructs ontological networks using Topic Maps technology, dramatically enhancing the indexing and retrieval performance of the system at a higher semantic level.
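A hedged sketch of how a probabilistic visual-keyword co-occurrence matrix could be estimated from already-labelled clips; the keyword set and the conditional-probability normalisation are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch: estimate P(keyword_j | keyword_i) from keyword sets attached
# to labelled clips, as a stand-in for the paper's co-occurrence matrix.
from collections import Counter
from itertools import permutations

clips = [  # hypothetical per-clip keyword annotations
    {"explosion", "smoke", "fire"},
    {"smoke", "fog"},
    {"explosion", "fire"},
    {"fire", "smoke"},
]

pair_counts = Counter()
keyword_counts = Counter()
for keywords in clips:
    keyword_counts.update(keywords)
    pair_counts.update(permutations(keywords, 2))

def cooccurrence(kw_i: str, kw_j: str) -> float:
    """Conditional probability that kw_j appears given kw_i appears."""
    return pair_counts[(kw_i, kw_j)] / keyword_counts[kw_i]

print(f"P(fire | explosion) = {cooccurrence('explosion', 'fire'):.2f}")
print(f"P(fog  | explosion) = {cooccurrence('explosion', 'fog'):.2f}")
```

A matrix of such conditional probabilities lets an automatic labeller promote keywords that co-occur strongly with ones already detected, which is the role the collateral context knowledge plays in DREAM.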
Abstract:
In this paper we consider a cooperative communication system in which some a priori information about the wireless channels is available at the transmitter. Several opportunistic relaying strategies are developed to fully utilize the available channel information. An explicit expression for the outage probability is then derived for each proposed cooperative scheme, along with its diversity-multiplexing tradeoff, by using order statistics. Our analytical results show that the more channel information is available at the transmitter, the better the performance a cooperative system can achieve. When the exact values of the source-relay channels are available, the performance loss at low SNR can be effectively suppressed. When the source node has access to both the source-relay and relay-destination channels, full diversity can be achieved at the cost of only one extra channel use for the relaying transmission, and the optimal diversity-multiplexing tradeoff d(r) = (N + 1)(1 - 2r) can be achieved, where N is the number of available relay nodes.
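Writing the reported tradeoff in standard notation makes the boundary behaviour explicit; the remarks below are direct consequences of the formula together with the abstract's statement about the extra channel use.

```latex
\[
  d(r) = (N + 1)\,(1 - 2r), \qquad 0 \le r \le \tfrac{1}{2},
\]
% so that d(0) = N + 1: at zero multiplexing gain the scheme extracts
% the full diversity order, consistent with "full diversity" over the
% N relays plus the direct link. The factor (1 - 2r) reflects the
% single extra channel use spent on the relaying phase, which halves
% the achievable multiplexing gain (d(r) reaches 0 at r = 1/2).
```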
Abstract:
Automatic indexing and retrieval of digital data pose major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be able to interpret the information base in an intelligent manner. For a number of years, research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of the Dynamic REtrieval Analysis and semantic metadata Management (DREAM) system, designed to automatically and intelligently index huge repositories of special-effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase, supporting the storage, indexing, and retrieval of large data sets of special-effects video clips as an exemplar application domain. This paper presents its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes.
Abstract:
The aim of this paper is to study the impact of channel state information on the design of cooperative transmission protocols. This is motivated by the fact that the performance gain achieved by cooperative diversity comes at the price of extra bandwidth consumption. Several opportunistic relaying strategies are developed to fully utilize the different types of a priori channel information. The analytical and numerical results demonstrate that the use of such a priori information increases the spectral efficiency of cooperative diversity, especially at low signal-to-noise ratio.
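As a hedged illustration of why a priori channel information helps (not the paper's exact protocol), the following Monte Carlo sketch compares picking an arbitrary relay against opportunistically selecting the relay with the best instantaneous channel, under i.i.d. Rayleigh fading; the SNR, rate, and relay count are illustrative values only.

```python
# Monte Carlo sketch: outage probability of a relayed hop with random
# vs. opportunistic relay selection under i.i.d. Rayleigh fading.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_relays = 200_000, 4
snr = 10 ** (5 / 10)        # 5 dB average SNR
rate = 1.0                  # target rate in bits/s/Hz
threshold = 2 ** rate - 1   # outage if snr * |h|^2 < threshold

# Exponentially distributed channel gains |h|^2 for every relay link.
gains = rng.exponential(scale=1.0, size=(n_trials, n_relays))

random_pick = gains[:, 0]       # "no CSI": pick an arbitrary relay
best_pick = gains.max(axis=1)   # "full CSI": pick the strongest relay

print(f"outage, random relay:        {np.mean(snr * random_pick < threshold):.4f}")
print(f"outage, opportunistic relay: {np.mean(snr * best_pick < threshold):.4f}")
```

The opportunistic curve falls roughly as the fourth power of the single-relay outage here, illustrating how exploiting channel knowledge converts spare relays into diversity rather than extra bandwidth.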
Abstract:
We discuss the feasibility of wireless terahertz communication links deployed in a metropolitan area and model the large-scale fading of such channels. The model takes into account reception through the direct line of sight, ground and wall reflections, and diffraction around a corner. The movement of the receiver is modeled by an autonomous linear dynamic system in state space, whereas the geometric relations involved in the attenuation and multipath propagation of the electric field are described by a static nonlinear mapping. A subspace algorithm, in conjunction with polynomial regression, is used to identify a single-output Wiener model from time-domain measurements of the field intensity when the receiver motion is simulated using a constant angular speed and an exponentially decaying radius. The identification procedure is validated by using the model to perform q-step-ahead predictions. The sensitivity of the algorithm to small-scale fading, detector noise, and atmospheric changes is discussed. The performance of the algorithm is tested in the diffraction zone for a range of emitter frequencies (2, 38, 60, 100, 140, and 400 GHz). The simulation results are also extended to situations where a more complicated trajectory describes the motion of the receiver, providing information on the performance of the algorithm under a worst-case scenario. Finally, a sensitivity analysis with respect to the model parameters of the identified Wiener system is proposed.
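A minimal sketch of the Wiener structure itself (linear dynamics followed by a static polynomial nonlinearity) and a polynomial-regression fit of the static part, assuming the linear subsystem has already been identified by the subspace step; the system coefficients and signals are synthetic, not taken from the paper.

```python
# Sketch of a single-output Wiener model: an LTI filter followed by a
# static polynomial nonlinearity. The subspace identification of the
# linear part is assumed done; here we only fit the polynomial stage.
import numpy as np

rng = np.random.default_rng(2)

# Illustrative "true" system: first-order filter then a cubic nonlinearity.
u = rng.normal(size=2000)                     # input signal
x = np.zeros_like(u)
for t in range(1, len(u)):
    x[t] = 0.9 * x[t - 1] + 0.1 * u[t]        # linear (dynamic) block
y = 2.0 * x + 0.5 * x**3 + 0.01 * rng.normal(size=len(u))  # static block + noise

# Polynomial regression of the static nonlinearity y = f(x),
# using the intermediate signal the subspace step would provide.
coeffs = np.polyfit(x, y, deg=3)
print("fitted polynomial coefficients (x^3..x^0):", np.round(coeffs, 3))

# One-step-ahead prediction with the fitted Wiener model.
x_pred = 0.9 * x[:-1] + 0.1 * u[1:]
y_pred = np.polyval(coeffs, x_pred)
print(f"prediction RMSE: {np.sqrt(np.mean((y_pred - y[1:])**2)):.4f}")
```

Iterating the one-step prediction q times gives the q-step-ahead validation described in the abstract; the residual here is essentially the injected measurement noise.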
Abstract:
Driven by a range of modern applications including telecommunications, e-business, and on-line social interaction, recent ideas in complex networks can be extended to the case of time-varying connectivity. Here we propose a general framework for modelling and simulating such dynamic networks, and we explain how the long-time behaviour may reveal important information about the mechanisms underlying the evolution.
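A minimal sketch of such a dynamic network under one simple, purely illustrative mechanism: at each step, edges appear with a probability boosted by shared neighbours (triadic closure) and existing edges vanish at a constant rate, so the long-run edge density reflects the underlying mechanism. The mechanism and rates are assumptions of this sketch, not the paper's model.

```python
# Sketch: simulate a time-varying network in which edge birth is boosted
# by shared neighbours (triadic closure) and edges die at a fixed rate.
import numpy as np

rng = np.random.default_rng(3)
n, steps = 60, 400
base_birth, closure_boost, death = 0.002, 0.05, 0.05

A = np.zeros((n, n), dtype=bool)
iu = np.triu_indices(n, k=1)                       # work on the upper triangle

for _ in range(steps):
    common = (A.astype(int) @ A.astype(int))[iu]   # shared-neighbour counts
    p_birth = np.minimum(base_birth + closure_boost * common, 1.0)
    edges = A[iu]
    new_edges = np.where(edges,
                         rng.random(len(edges)) > death,    # survive deletion
                         rng.random(len(edges)) < p_birth)  # be born
    A[iu] = new_edges
    A.T[iu] = new_edges                            # keep the matrix symmetric

print(f"long-run edge density: {A[iu].mean():.3f}")
```

Tracking statistics such as this density, or the clustering coefficient, over long runs is how the evolution mechanism leaves its fingerprint, which is the kind of inference the proposed framework is intended to support.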