844 results for Heterogeneous information network


Relevance: 30.00%

Abstract:

Attractor properties of a popular discrete-time neural network model are illustrated through numerical simulations. The most complex dynamics is found to occur within particular ranges of parameters controlling the symmetry and magnitude of the weight matrix. A small network model is observed to produce fixed points, limit cycles, mode-locking, the Ruelle-Takens route to chaos, and the period-doubling route to chaos. Training algorithms for tuning this dynamical behaviour are discussed. Training can be an easy or difficult task, depending on whether the problem requires the use of temporal information distributed over long time intervals. Such problems require training algorithms which can handle hidden nodes. The most prominent of these algorithms, back propagation through time, solves the temporal credit assignment problem in a way which can work only if the relevant information is distributed locally in time. The Moving Targets algorithm works for the more general case, but is computationally intensive and prone to local minima.
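
A minimal numerical sketch of the kind of simulation described above; the network size, gain, and the symmetric/antisymmetric mixing used here are illustrative assumptions, not the paper's settings:

```python
# Toy simulation of a discrete-time recurrent network x_{t+1} = tanh(g * W x),
# where a symmetry parameter s mixes symmetric and antisymmetric weight matrices.
import numpy as np

rng = np.random.default_rng(0)
N = 8                                   # number of units (illustrative choice)
R = rng.standard_normal((N, N))
W_sym = (R + R.T) / 2                   # symmetric part
W_asym = (R - R.T) / 2                  # antisymmetric part

def simulate(s, g, steps=2000, transient=1000):
    """Iterate the map for symmetry mix s in [0, 1] and gain g."""
    W = s * W_sym + (1 - s) * W_asym
    x = 0.1 * rng.standard_normal(N)
    traj = []
    for t in range(steps):
        x = np.tanh(g * W @ x)
        if t >= transient:
            traj.append(x.copy())
    return np.array(traj)

for s, g in [(1.0, 1.5), (0.5, 1.5), (0.0, 3.0)]:
    traj = simulate(s, g)
    spread = traj.std(axis=0).max()     # ~0 for a fixed point, >0 for cycles/chaos
    print(f"s={s:.1f}, g={g:.1f}: max per-unit std over time = {spread:.4f}")
```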

Relevance: 30.00%

Abstract:

Neural networks can be regarded as statistical models, and can be analysed in a Bayesian framework. Generalisation is measured by the performance on independent test data drawn from the same distribution as the training data. Such performance can be quantified by the posterior average of the information divergence between the true and the model distributions. Averaging over the Bayesian posterior guarantees internal coherence; using information divergence guarantees invariance with respect to representation. The theory generalises the least mean squares theory for linear Gaussian models to general problems of statistical estimation. The main results are: (1) the ideal optimal estimate is always given by the average over the posterior; (2) the optimal estimate within a computational model is given by the projection of the ideal estimate onto the model. This incidentally shows that some currently popular methods dealing with hyperpriors are in general unnecessary and misleading. The extension of information divergence to positive normalisable measures reveals a remarkable relation between the δ dual affine geometry of statistical manifolds and the geometry of the dual pair of Banach spaces L_δ and L_δ*. It therefore offers conceptual simplification to information geometry. The general conclusion on the issue of evaluating neural network learning rules and other statistical inference methods is that such evaluations are only meaningful under three assumptions: the prior P(p), describing the environment of all the problems; the divergence D_δ, specifying the requirement of the task; and the model Q, specifying available computing resources.
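
Schematically, and in notation assumed for this sketch rather than taken from the paper, the posterior-averaged divergence, the ideal estimate, and its projection onto a computational model Q can be written as:

```latex
% Generalisation as the posterior average of a divergence D, the ideal estimate
% as the posterior mean of the model distribution, and the optimal estimate
% within a model class Q as the projection of the ideal estimate onto Q.
\[
  E(Q \mid \text{data}) = \int D\!\left(p_\theta \,\middle\|\, q\right) P(\theta \mid \text{data})\, \mathrm{d}\theta ,
  \qquad
  \hat{p}(x) = \int p_\theta(x)\, P(\theta \mid \text{data})\, \mathrm{d}\theta ,
  \qquad
  q^{*} = \arg\min_{q \in Q} D\!\left(\hat{p} \,\middle\|\, q\right).
\]
```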

Relevance: 30.00%

Abstract:

This paper surveys the context of feature extraction by neural network approaches, and compares and contrasts their behaviour as prospective data visualisation tools in a real-world problem. We also introduce and discuss a hybrid approach which allows us to control the degree of discriminatory and topographic information in the extracted feature space.
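
A rough sketch of what a hybrid objective of this kind could look like; the particular discriminative and topographic terms, and the mixing weight alpha, are assumptions for illustration, not the method introduced in the paper:

```python
# Sketch of a feature extractor judged by a weighted mix of a discriminative term
# (classification quality of the projected data) and a topographic term
# (preservation of pairwise distances), with alpha controlling the balance.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 10))              # toy data
y = (X[:, 0] + X[:, 1] > 0).astype(float)       # toy binary labels

def hybrid_loss(W, alpha=0.5):
    Z = X @ W                                   # project to a 2-D feature space
    # Discriminative term: logistic loss of a fixed linear readout on Z.
    logits = Z @ np.array([1.0, -1.0])
    p = 1.0 / (1.0 + np.exp(-logits))
    disc = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    # Topographic term: mismatch between pairwise distances in X and in Z.
    dX = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    dZ = np.linalg.norm(Z[:, None] - Z[None, :], axis=-1)
    topo = np.mean((dX - dZ) ** 2)
    return alpha * disc + (1 - alpha) * topo

W0 = 0.1 * rng.standard_normal((10, 2))
print("hybrid loss at a random projection:", hybrid_loss(W0))
```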

Relevance: 30.00%

Abstract:

Training Mixture Density Network (MDN) configurations within the NETLAB framework is time-consuming because of the way the error function and its gradient are computed. By optimising the computation of these functions, so that gradient information is computed in parameter space, training time is decreased by at least a factor of sixty for the example given. Decreased training time widens the range of problems to which MDNs can be practically applied, making the MDN framework an attractive method for the applied problem solver.
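
For illustration, a NumPy sketch of the MDN error function whose repeated evaluation (together with its gradient) dominates training time; this is not the NETLAB (MATLAB) code, and the shapes and variable names are assumptions:

```python
# Sketch of the Mixture Density Network error function (negative log-likelihood)
# for a network whose outputs parameterise a Gaussian mixture over a scalar target.
# Illustrative NumPy only -- not the NETLAB implementation. The gradient with
# respect to the mixture parameters (the 'parameter space' mentioned above) has a
# closed form via the posterior responsibilities, which is what allows a speed-up.
import numpy as np

def mdn_error(pi, mu, sigma, t):
    """pi, mu, sigma: (n_samples, n_components) mixture parameters emitted by the
    network; t: (n_samples,) targets. Returns the summed negative log-likelihood."""
    t = t[:, None]
    norm = np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
    likelihood = np.sum(pi * norm, axis=1)
    return -np.sum(np.log(likelihood + 1e-300))

rng = np.random.default_rng(0)
n, k = 5, 3                               # 5 samples, 3 mixture components
pi = rng.dirichlet(np.ones(k), size=n)    # mixing coefficients (rows sum to 1)
mu = rng.standard_normal((n, k))          # component means
sigma = np.full((n, k), 0.5)              # component widths
t = rng.standard_normal(n)
print("MDN negative log-likelihood:", mdn_error(pi, mu, sigma, t))
```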

Relevance: 30.00%

Abstract:

The ERS-1 satellite carries a scatterometer which measures the amount of radiation scattered back toward the satellite by the ocean's surface. These measurements can be used to infer wind vectors. The implementation of a neural network based forward model which maps wind vectors to radar backscatter is addressed. Input noise cannot be neglected. To account for this noise, a Bayesian framework is adopted. However, Markov Chain Monte Carlo sampling is too computationally expensive. Instead, gradient information is used with a non-linear optimisation algorithm to find the maximum a posteriori probability values of the unknown variables. The resulting models are shown to compare well with the current operational model when visualised in the target space.
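
A toy sketch of the retrieval step described above: the forward model, noise level, and prior width are invented for illustration, but the structure (minimise a negative log posterior, supplying its gradient to a non-linear optimiser rather than sampling with MCMC) follows the abstract:

```python
# Toy maximum a posteriori retrieval of a wind vector x from noisy backscatter
# measurements y, using an invented differentiable forward model f.
import numpy as np
from scipy.optimize import minimize

def f(x):
    """Toy forward model: wind vector (2,) -> backscatter (3,)."""
    u, v = x
    return np.array([u**2 + v, np.sin(u) + v**2, u * v])

def neg_log_posterior(x, y, sigma_y=0.1, sigma_x=2.0):
    resid = (f(x) - y) / sigma_y
    return 0.5 * np.sum(resid**2) + 0.5 * np.sum((x / sigma_x)**2)

def grad_neg_log_posterior(x, y, sigma_y=0.1, sigma_x=2.0):
    u, v = x
    J = np.array([[2*u, 1.0], [np.cos(u), 2*v], [v, u]])   # Jacobian of f
    return J.T @ ((f(x) - y) / sigma_y**2) + x / sigma_x**2

rng = np.random.default_rng(0)
x_true = np.array([1.2, -0.5])
y_obs = f(x_true) + 0.05 * rng.standard_normal(3)

result = minimize(neg_log_posterior, x0=np.array([0.5, 0.5]), args=(y_obs,),
                  jac=grad_neg_log_posterior, method="BFGS")
print("MAP estimate:", result.x, " true wind vector:", x_true)
```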

Relevance: 30.00%

Abstract:

We study the equilibrium states of energy functions involving a large set of real variables, defined on the links of sparsely connected networks, and interacting at the network nodes, using the cavity and replica methods. When applied to the representative problem of network resource allocation, an efficient distributed algorithm is devised, with simulations showing full agreement with theory. Scaling properties with the network connectivity and the resource availability are found. © 2006 The American Physical Society.
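
As a loose illustration of the resource allocation setting, the sketch below runs plain local gradient updates of link flows on a sparse random graph; the graph, capacities, and cost function are assumptions, and this is not the cavity/message-passing algorithm devised in the paper:

```python
# Toy distributed resource allocation on a sparse random graph: node i has
# capacity c[i] (negative values are demands) and link flows y are adjusted by
# local gradient steps trading transport cost 0.5*y**2 against a quadratic
# penalty on unmet demand. Each update uses only the two endpoint balances of a
# link, so the computation is local to the network structure.
import numpy as np

rng = np.random.default_rng(0)
n, degree = 20, 3
edges = []
for i in range(n):
    for j in rng.choice([k for k in range(n) if k != i], size=degree, replace=False):
        j = int(j)
        if (i, j) not in edges and (j, i) not in edges:
            edges.append((i, j))

c = rng.standard_normal(n)        # node capacities
y = np.zeros(len(edges))          # flow on each link, positive = i -> j

def node_balance(y):
    bal = c.copy()
    for e, (i, j) in enumerate(edges):
        bal[i] -= y[e]
        bal[j] += y[e]
    return bal

penalty, lr = 5.0, 0.01
for _ in range(3000):
    shortfall = np.minimum(node_balance(y), 0.0)   # unmet demand at each node
    grad = y + penalty * np.array([shortfall[j] - shortfall[i] for i, j in edges])
    y -= lr * grad

print("worst unmet demand:", -node_balance(y).min())
print("transport cost:", 0.5 * np.sum(y**2))
```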

Relevance: 30.00%

Abstract:

In many Environmental Information Systems the actual observations arise from a discrete monitoring network which might be rather heterogeneous in both location and types of measurements made. In this paper we describe the architecture and infrastructure for a system, developed as part of the EU FP6 funded INTAMAP project, to provide a service-oriented solution that allows the construction of an interoperable, automatic, interpolation system. This system will be based on the Open Geospatial Consortium’s Web Feature Service (WFS) standard. The essence of our approach is to extend the GML3.1 observation feature to include information about the sensor using SensorML, and to further extend this to incorporate observation error characteristics. Our extended WFS will accept observations, and will store them in a database. The observations will be passed to our R-based interpolation server, which will use a range of methods, including a novel sparse, sequential kriging method (only briefly described here), to produce an internal representation of the interpolated field resulting from the observations currently uploaded to the system. The extended WFS will then accept queries, such as ‘What is the probability distribution of the desired variable at a given point?’, ‘What is the mean value over a given region?’, or ‘What is the probability of exceeding a certain threshold at a given location?’. To support information-rich transfer of complex and uncertain predictions we are developing schemas to represent probabilistic results in a GML3.1 (object-property) style. The system will also offer more easily accessible Web Map Service and Web Coverage Service interfaces to allow users to access the system at the level of complexity they require for their specific application. Such a system will offer a very valuable contribution to the next generation of Environmental Information Systems in the context of real-time mapping for monitoring and security, particularly for systems that employ a service-oriented architecture.
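
The query types listed above reduce to simple operations on a predictive distribution at the query point; the sketch below assumes a Gaussian predictive distribution and hypothetical numbers, and does not reproduce the INTAMAP interfaces or interpolation methods:

```python
# Illustration of the query types named above, assuming the interpolation
# service returns a Gaussian predictive distribution (mean, standard deviation)
# at the query point. Conceptual sketch only.
from scipy.stats import norm

pred_mean, pred_std = 42.0, 5.0      # hypothetical interpolated value at a point
threshold = 50.0

# 'What is the probability distribution of the desired variable at a given point?'
dist = norm(loc=pred_mean, scale=pred_std)

# 'What is the probability of exceeding a certain threshold at a given location?'
p_exceed = dist.sf(threshold)        # survival function = 1 - CDF
print(f"P(value > {threshold}) = {p_exceed:.3f}")

# 'What is the mean value over a given region?' -- with per-point predictions
# over a grid covering the region, a simple plug-in answer is their average.
grid_means = [40.1, 41.7, 43.2, 44.0]
print("regional mean (plug-in):", sum(grid_means) / len(grid_means))
```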

Relevance: 30.00%

Abstract:

This paper investigates the simultaneous causal relationship between investments in information and communication technology (ICT) and flows of foreign direct investment (FDI), with reference to its implications for economic growth. For the empirical analysis we use data from 23 major countries with heterogeneous economic development for the period 1976-99. Our causality test results suggest that there is a causal relationship from ICT to FDI in developed countries, which means that a higher level of ICT investment leads to an increased inflow of FDI. ICT may contribute to economic growth indirectly by attracting more FDI. In contrast, we could not find significant causality from ICT to FDI in developing countries. Instead, we have partial evidence of the opposite causal relationship: the inflow of FDI causes further increases in ICT investment and production capacity. © United Nations University 2006.
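
The abstract names causality tests without specifying the estimator; a Granger-type test on two series is one common way to run such a check. The sketch below uses statsmodels on synthetic data and is not the authors' specification, data, or panel setup:

```python
# Minimal Granger-type causality check between two series (ICT investment and
# FDI inflow) on synthetic data in which ICT leads FDI by one period.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
T = 60
ict = np.cumsum(rng.standard_normal(T))                 # synthetic ICT series
fdi = np.zeros(T)
for t in range(1, T):
    # FDI responds to last period's ICT, so ICT should 'cause' FDI in the test.
    fdi[t] = 0.5 * fdi[t - 1] + 0.4 * ict[t - 1] + rng.standard_normal()

data = pd.DataFrame({"fdi": fdi, "ict": ict})
# Null hypothesis: the second column ('ict') does not Granger-cause the first ('fdi').
grangercausalitytests(data[["fdi", "ict"]], maxlag=2)
```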

Relevance: 30.00%

Abstract:

Business networks have been described as cooperative arrangements between independent business organisations that vary from contractual joint ventures to informal exchanges of information. This collaboration has become recognised as an innovative and efficient tool for organising interdependent activities, with benefits accruing to both firms and the local economy. For a number of years, resources have been devoted to supporting Irish networking policies. One recent example of such support is the Irish government's target of €20 million per annum for five years to support the creation of enterprise-led networks. It is imperative that a clear rationale for such interventions is established, as the opportunity cost of public funds is high. This article, therefore, develops an evaluation framework for such networking interventions. This framework will facilitate effective programme planning, implementation and evaluation. It will potentially show how a chain of cause-and-effect at both micro- and macro-levels for networking interventions can be established.

Relevance: 30.00%

Abstract:

Word of mouth (WOM) communication is a major part of online consumer interactions, particularly within the environment of online communities. Nevertheless, existing (offline) theory may be inappropriate to describe online WOM and its influence on evaluation and purchase. The authors report the results of a two-stage study aimed at investigating online WOM: a set of in-depth qualitative interviews followed by a social network analysis of a single online community. Combined, the results provide strong evidence that individuals behave as if Web sites themselves are primary "actors" in online social networks and that online communities can act as a social proxy for individual identification. The authors offer a conceptualization of online social networks which takes the Web site into account as an actor, an initial exploration of the concept of a consumer-Web site relationship, and a conceptual model of the online interaction and information evaluation process. © 2007 Wiley Periodicals, Inc. and Direct Marketing Educational Foundation, Inc.

Relevance: 30.00%

Abstract:

This paper will outline a research methodology informed by theorists who have contributed to actor network theory (ANT). Research informed from such a perspective recognizes the constitutive role of accounting systems in the achievement of broader social goals. Latour, Knorr Cetina and others argue that the bringing in of non-human actants, through the growth of technology and science, has added immeasurably to the complexity of modern society. The paper ‘sees’ accounting and accounting systems as being constituted by technological ‘black boxes’ and seeks to discuss two questions. One concerns the processes which surround the establishment of ‘facts’, i.e. how ‘black boxes’ are created or accepted (even if temporarily) within society. The second concerns the role of existing ‘black boxes’ within society and organizations. Accounting systems not only promote a particular view of the activities of an organization or a subunit, but in their very implementation and operation ‘mobilize’ other organizational members in a particular direction. The implications of such an interpretation are explored in this paper, firstly through a discussion of some of the theoretical constructs that have been proposed to frame ANT research, and secondly through an attempt to relate some of these ideas to aspects of the empirics in a qualitative case study. The case site is in the health sector and involves the implementation of a casemix accounting system. Evidence from the case research is used to exemplify aspects of the theoretical constructs.

Relevance: 30.00%

Abstract:

Automatically generating maps of a measured variable of interest can be problematic. In this work we focus on the monitoring network context, where observations are collected and reported by a network of sensors and are then transformed into interpolated maps for use in decision making. Using traditional geostatistical methods, estimating the covariance structure of data collected in an emergency situation can be difficult. Variogram determination, whether by method-of-moments estimators or by maximum likelihood, is very sensitive to extreme values. Even when a monitoring network is in a routine mode of operation, sensors can sporadically malfunction and report extreme values. If these extreme data destabilise the model, causing the covariance structure of the observed data to be incorrectly estimated, the generated maps will be of little value, and the uncertainty estimates in particular will be misleading. Marchant and Lark [2007] propose a REML estimator for the covariance, which is shown to work on small data sets with a manual selection of the damping parameter in the robust likelihood. We show how this can be extended to allow treatment of large data sets together with an automated approach to all parameter estimation. The projected process kriging framework of Ingram et al. [2007] is extended to allow the use of robust likelihood functions, including the two-component Gaussian and the Huber function. We show how our algorithm is further refined to reduce the computational complexity while at the same time minimising any loss of information. To show the benefits of this method, we use data collected from radiation monitoring networks across Europe. We compare our results to those obtained from traditional kriging methodologies and include comparisons with Box-Cox transformations of the data. We discuss the issue of whether to treat or ignore extreme values, making the distinction between the robust methods which ignore outliers and transformation methods which treat them as part of the (transformed) process. Using a case study based on an extreme radiological event over a large area, we show how radiation data collected from monitoring networks can be analysed automatically and then used to generate reliable maps to inform decision making. We show the limitations of the methods and discuss potential extensions to remedy these.
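
As a toy illustration of why robust loss functions matter here, the sketch below fits an exponential variogram to a hypothetical empirical semivariogram once with squared error and once with a Huber loss; it is not the REML estimator or the projected process kriging framework referred to above:

```python
# Toy robustified variogram fitting: an exponential variogram model is fitted to
# empirical semivariances with (a) squared error and (b) a Huber loss, so that an
# outlier-inflated bin influences the fit less. Data and threshold are invented.
import numpy as np
from scipy.optimize import minimize

def exp_variogram(h, nugget, sill, rng_):
    return nugget + sill * (1.0 - np.exp(-h / rng_))

# Hypothetical empirical semivariogram (lag, semivariance); the last bin is corrupted.
lags = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
gamma = np.array([0.3, 0.55, 0.72, 0.80, 0.85, 3.0])

def objective(params, robust):
    nugget, sill, rng_ = params
    r = gamma - exp_variogram(lags, nugget, sill, rng_)
    if not robust:
        return np.sum(r**2)
    k = 0.5                                   # Huber threshold (assumed)
    return np.sum(np.where(np.abs(r) <= k, 0.5 * r**2, k * (np.abs(r) - 0.5 * k)))

x0 = np.array([0.1, 1.0, 2.0])
for robust in (False, True):
    fit = minimize(objective, x0, args=(robust,), method="Nelder-Mead")
    label = "Huber" if robust else "least-squares"
    print(f"{label} fit (nugget, sill, range):", fit.x)
```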

Relevance: 30.00%

Abstract:

In Information Filtering (IF), a user may be interested in several topics in parallel. However, IF systems have been built on representational models derived from Information Retrieval and Text Categorization, which assume independence between terms. The linearity of these models results in user profiles that can only represent one topic of interest. We present a methodology that takes into account term dependencies to construct a single profile representation for multiple topics, in the form of a hierarchical term network. We also introduce a series of non-linear functions for evaluating documents against the profile. Initial experiments produced positive results.
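
A small sketch of how a hierarchical term network with non-linear evaluation differs from a linear sum of independent term weights; the profile contents and the noisy-OR combination rule are assumptions for illustration, not the functions introduced in the paper:

```python
# Scoring a document against a hierarchical term-network profile: each topic node
# aggregates its child terms non-linearly (a root term gates the children, which
# then combine by noisy-OR), rather than summing independent term weights.
profile = {
    "python (animal)":   {"root": "python", "children": {"snake": 0.9, "venom": 0.7}},
    "python (language)": {"root": "python", "children": {"code": 0.9, "library": 0.6}},
}

def topic_score(topic, doc_terms):
    """Non-linear activation: the root term must co-occur with the children
    (term dependency); matching children combine by noisy-OR."""
    if topic["root"] not in doc_terms:
        return 0.0
    miss = 1.0
    for term, weight in topic["children"].items():
        if term in doc_terms:
            miss *= (1.0 - weight)
    return 1.0 - miss

doc = {"python", "code", "library", "tutorial"}
for name, topic in profile.items():
    print(f"{name}: {topic_score(topic, doc):.2f}")
```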

Relevance: 30.00%

Abstract:

The chemical functionality within porous architectures dictates their performance as heterogeneous catalysts; however, synthetic routes to control the spatial distribution of individual functions within porous solids are limited. Here we report the fabrication of spatially orthogonal bifunctional porous catalysts, through the stepwise template removal and chemical functionalization of an interconnected silica framework. Selective removal of polystyrene nanosphere templates from a lyotropic liquid crystal-templated silica sol–gel matrix, followed by extraction of the liquid crystal template, affords a hierarchical macroporous–mesoporous architecture. Decoupling of the individual template extractions allows independent functionalization of macropore and mesopore networks on the basis of chemical and/or size specificity. Spatial compartmentalization of, and directed molecular transport between, chemical functionalities affords control over the reaction sequence in catalytic cascades; herein illustrated by the Pd/Pt-catalysed oxidation of cinnamyl alcohol to cinnamic acid. We anticipate that our methodology will prompt further design of multifunctional materials comprising spatially compartmentalized functions.