117 results for RANDOM REGULAR GRAPHS
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
In this paper we study the reconstruction of a network topology from the values of its betweenness centrality, a measure of the influence of each of its nodes in the dissemination of information over the network. We consider a simple metaheuristic, simulated annealing, as the combinatorial optimization method to generate the network from the values of the betweenness centrality. We compare the performance of this technique when reconstructing different categories of networks (random, regular, small-world, scale-free and clustered). We show that the method allows an exact reconstruction of small networks and leads to good topological approximations in the case of networks with larger orders. The method can be used to generate a quasi-optimal topology for a communication network from a list of the values of the maximum allowable traffic for each node.
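Since the abstract only outlines the approach, here is a minimal, hypothetical sketch of a simulated-annealing reconstruction of a topology from target betweenness values, written in Python with networkx. The cost function, rewiring move and cooling schedule below are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: simulated annealing searching for a graph whose betweenness
# centrality matches a target list (indexed by node), under assumed settings.
import math
import random
import networkx as nx

def cost(G, target):
    # Sum of squared deviations between current and target betweenness values.
    bc = nx.betweenness_centrality(G)
    return sum((bc[v] - target[v]) ** 2 for v in G.nodes())

def reconstruct(target, n_edges, steps=20000, t0=1.0, alpha=0.9995, seed=0):
    rng = random.Random(seed)
    n = len(target)
    G = nx.gnm_random_graph(n, n_edges, seed=seed)      # random initial topology
    cur_cost = cost(G, target)
    best, best_cost, t = G.copy(), cur_cost, t0
    for _ in range(steps):
        H = G.copy()
        # Neighbour move: rewire one edge (remove an edge, add a non-edge).
        u, v = rng.choice(list(H.edges()))
        H.remove_edge(u, v)
        a, b = rng.sample(range(n), 2)
        while H.has_edge(a, b):
            a, b = rng.sample(range(n), 2)
        H.add_edge(a, b)
        new_cost = cost(H, target)
        # Metropolis acceptance rule with geometric cooling.
        if new_cost < cur_cost or rng.random() < math.exp((cur_cost - new_cost) / t):
            G, cur_cost = H, new_cost
            if cur_cost < best_cost:
                best, best_cost = G.copy(), cur_cost
        t *= alpha
    return best, best_cost
```

The move set keeps the number of edges fixed, which fits the idea of generating a topology under a prescribed traffic budget; other neighbourhoods (edge addition/removal) could be used in the same search scheme.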
Abstract:
Counting labelled planar graphs, and typical properties of random labelled planar graphs, have received much attention recently. We start the process here of extending these investigations to graphs embeddable on any fixed surface S. In particular we show that the labelled graphs embeddable on S have the same growth constant as for planar graphs, and the same holds for unlabelled graphs. Also, if we pick a graph uniformly at random from the graphs embeddable on S which have vertex set {1, ..., n}, then with probability tending to 1 as n → ∞, this random graph either is connected or consists of one giant component together with a few nodes in small planar components.
Abstract:
We introduce and study a class of infinite-horizon nonzero-sum non-cooperative stochastic games with infinitely many interacting agents using ideas of statistical mechanics. First we show, in the general case of asymmetric interactions, the existence of a strategy that allows any player to eliminate losses after a finite random time. In the special case of symmetric interactions, we also prove that, as time goes to infinity, the game converges to a Nash equilibrium. Moreover, assuming that all agents adopt the same strategy, using arguments related to those leading to perfect simulation algorithms, spatial mixing and ergodicity are proved. In turn, ergodicity allows us to prove “fixation”, i.e. that players will adopt a constant strategy after a finite time. The resulting dynamics is related to zero-temperature Glauber dynamics on random graphs of possibly infinite volume.
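To make the last sentence concrete, here is a small illustrative sketch (Python/networkx) of zero-temperature Glauber dynamics on a finite random graph, where each vertex repeatedly aligns with the majority of its neighbours. It is meant only to visualize the "fixation" phenomenon, not to reproduce the game-theoretic dynamics studied in the paper.

```python
# Illustration only: zero-temperature Glauber dynamics of +/-1 "strategies"
# on a finite Erdos-Renyi random graph.  Each update sets a vertex to the
# majority of its neighbours (ties broken by a fair coin); the configuration
# typically freezes after finitely many sweeps.
import random
import networkx as nx

def glauber_zero_T(n=200, p=0.05, sweeps=100, seed=1):
    rng = random.Random(seed)
    G = nx.gnp_random_graph(n, p, seed=seed)
    spin = {v: rng.choice([-1, 1]) for v in G.nodes()}
    for _ in range(sweeps):
        changed = False
        for v in rng.sample(list(G.nodes()), n):        # random update order
            field = sum(spin[u] for u in G.neighbors(v))
            new = 1 if field > 0 else -1 if field < 0 else rng.choice([-1, 1])
            if new != spin[v]:
                spin[v], changed = new, True
        if not changed:                                  # configuration has frozen
            break
    return spin
```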
Abstract:
In a seminal paper [10], Weitz gave a deterministic fully polynomial approximation scheme for counting exponentially weighted independent sets (which is the same as approximating the partition function of the hard-core model from statistical physics) in graphs of degree at most d, up to the critical activity for the uniqueness of the Gibbs measure on the infinite d-regular tree. More recently Sly [8] (see also [1]) showed that this is optimal in the sense that if there is an FPRAS for the hard-core partition function on graphs of maximum degree d for activities larger than the critical activity on the infinite d-regular tree then NP = RP. In this paper we extend Weitz's approach to derive a deterministic fully polynomial approximation scheme for the partition function of general two-state anti-ferromagnetic spin systems on graphs of maximum degree d, up to the corresponding critical point on the d-regular tree. The main ingredient of our result is a proof that for two-state anti-ferromagnetic spin systems on the d-regular tree, weak spatial mixing implies strong spatial mixing. This in turn uses a message-decay argument which extends a similar approach proposed recently for the hard-core model by Restrepo et al. [7] to the case of general two-state anti-ferromagnetic spin systems.
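For reference, the critical activity on the infinite d-regular tree mentioned above is the classical uniqueness threshold for the hard-core model (a standard value, not specific to this paper):

\[ \lambda_c(d) \;=\; \frac{(d-1)^{d-1}}{(d-2)^{d}}, \]

which behaves like e/d for large d. Weitz's algorithm, and the extension described here, apply strictly below the corresponding tree threshold.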
Abstract:
A parts-based model is a parametrization of an object class using a collection of landmarks following the object structure. The matching of parts-based models is one of the problems where pairwise Conditional Random Fields have been successfully applied. The main reason for their effectiveness is tractable inference and learning due to the simplicity of the involved graphs, usually trees. However, these models do not consider possible patterns of statistics among sets of landmarks, and thus they suffer from using too myopic information. To overcome this limitation, we propose a novel structure based on hierarchical Conditional Random Fields, which we explain in the first part of this report. We build a hierarchy of combinations of landmarks, where matching is performed taking into account the whole hierarchy. To preserve tractable inference we effectively sample the label set. We test our method on facial feature selection and human pose estimation on two challenging datasets: Buffy and MultiPIE. In the second part of this report, we present a novel approach to multiple kernel combination that relies on stacked classification. This method can be used to evaluate the landmarks of the parts-based model approach. Our method is based on combining the responses of a set of independent classifiers for each individual kernel. Unlike earlier approaches that linearly combine kernel responses, our approach uses them as inputs to another set of classifiers. We show that we outperform state-of-the-art methods on most of the standard benchmark datasets.
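The stacked kernel combination described in the second part can be sketched very simply; the following Python/scikit-learn fragment is a hypothetical illustration (binary labels, precomputed Gram matrices, a half/half split), not the authors' actual pipeline.

```python
# Hypothetical sketch of stacked kernel combination: per-kernel SVMs are
# trained on one half of the data, and their decision values on the other
# half become features for a second-level (meta) classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

def stacked_kernel_combination(kernels, y, split):
    """kernels: list of precomputed n x n kernel (Gram) matrices (NumPy).
    y: NumPy array of binary labels.  split: boolean NumPy mask, True = half A."""
    a, b = np.where(split)[0], np.where(~split)[0]
    base_models, meta_features = [], []
    for K in kernels:
        clf = SVC(kernel="precomputed")
        clf.fit(K[np.ix_(a, a)], y[a])                   # train on half A
        meta_features.append(clf.decision_function(K[np.ix_(b, a)]))
        base_models.append(clf)
    meta_X = np.column_stack(meta_features)              # kernel responses as features
    meta = LogisticRegression().fit(meta_X, y[b])        # second-level classifier
    return base_models, meta
```

Using a held-out half for the first-level responses avoids feeding the meta-classifier overfitted decision values; a cross-validated variant is the usual refinement.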
Abstract:
Recently, several anonymization algorithms have appeared for privacy preservation on graphs. Some of them are based on randomization techniques and on k-anonymity concepts. We can use both of them to obtain an anonymized graph with a given k-anonymity value. In this paper we compare algorithms based on both techniques in order to obtain an anonymized graph with a desired k-anonymity value. We want to analyze the complexity of these methods to generate anonymized graphs and the quality of the resulting graphs.
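As a concrete reference point for the k-anonymity notion used above, the sketch below checks k-degree anonymity (every degree value shared by at least k vertices), one common instantiation of k-anonymity for graphs; it is an illustration only, not the specific algorithms compared in the paper.

```python
# Minimal sketch of one common k-anonymity notion for graphs (k-degree
# anonymity): every degree value must be shared by at least k vertices.
from collections import Counter
import networkx as nx

def k_degree_anonymity(G):
    """Return the largest k such that G is k-degree anonymous."""
    counts = Counter(d for _, d in G.degree())
    return min(counts.values())

# Example: a 6-cycle is 6-degree anonymous (all vertices have degree 2),
# while attaching a pendant vertex drops the value to 1.
G = nx.cycle_graph(6)
print(k_degree_anonymity(G))      # 6
G.add_edge(0, 6)
print(k_degree_anonymity(G))      # 1
```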
Abstract:
A pacemaker, regularly emitting chemical waves, is created out of noise when an excitable photosensitive Belousov-Zhabotinsky medium, strictly unable to autonomously initiate autowaves, is forced with a spatiotemporally patterned random illumination. These experimental observations are also reproduced numerically by using a set of reaction-diffusion equations for an activator-inhibitor model, and further analytically interpreted in terms of genuine coupling effects arising from parametric fluctuations. Within the same framework we also address situations of noise-sustained propagation in subexcitable media.
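The activator-inhibitor description invoked here is typically a two-variable Oregonator-type model for the photosensitive BZ reaction; a commonly used form (possibly differing in detail from the exact equations used in this work) is

\[ \frac{\partial u}{\partial t} = \frac{1}{\varepsilon}\left[u - u^{2} - \big(fv + \phi(\mathbf{r},t)\big)\,\frac{u-q}{u+q}\right] + D_u \nabla^{2} u, \qquad \frac{\partial v}{\partial t} = u - v, \]

where u and v are the activator and inhibitor variables and the light-driven excitability term \(\phi(\mathbf{r},t)\) carries the spatiotemporally patterned random forcing.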
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
Let T be the Cayley graph of a finitely generated free group F. Given two vertices in T consider all the walks of a given length between these vertices that at a certain time must follow a number of predetermined steps. We give formulas for the number of such walks by expressing the problem in terms of equations in F and solving the corresponding equations.
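For small lengths the count can be checked by brute force; the following Python sketch enumerates walks in the Cayley graph of the free group F_r by dynamic programming over reduced words (an illustration only; it does not implement the closed formulas obtained in the paper).

```python
# Illustrative sketch: count walks of a given length between two vertices of
# the Cayley graph of the free group F_r (generators 1..r, inverses -1..-r)
# by dynamic programming over reduced words.  Brute force, small lengths only.
from collections import defaultdict

def count_walks(target, length, rank):
    """target: tuple of nonzero ints (a reduced word), e.g. (1, -2) = a b^-1."""
    gens = [g for i in range(1, rank + 1) for g in (i, -i)]
    counts = defaultdict(int)
    counts[()] = 1                        # start at the identity
    for _ in range(length):
        nxt = defaultdict(int)
        for word, c in counts.items():
            for g in gens:
                if word and word[-1] == -g:
                    nxt[word[:-1]] += c   # free cancellation keeps words reduced
                else:
                    nxt[word + (g,)] += c
        counts = nxt
    return counts[target]

# Closed walks of length 4 at the identity in F_2 (the 4-regular tree):
print(count_walks((), 4, 2))              # -> 28
```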
Abstract:
We analyze a model where firms choose a production technology which, together with some random event, determines the final emission level. We consider the coexistence of two alternative technologies: a "clean" technology and a "dirty" technology. The environmental regulation is based on taxes on reported emissions and on penalties for unreported emissions. We show that, for several scenarios concerning the observability of the adoption of the clean technology and the cost of adopting it, the optimal inspection policy is a cut-off strategy. We also show that the optimal inspection policy induces the firm to adopt the clean technology if the adoption cost is not too high, but the cost levels for which the firm adopts it depend on the scenario.
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
Factorizations of the FFT (Fast Fourier Transform) that exhibit a regular interconnection pattern between factors or stages are known as parallel algorithms, or Pease algorithms, since they were originally proposed by Pease. In this contribution, new factorizations with blocks that exhibit the regular Pease interconnection pattern have been developed. It is shown how these blocks can be obtained at a previously selected scale. The new factorizations for both the FFT and the IFFT (Inverse FFT) have two classes of factors: a few factors of the Cooley-Tukey type and the new factors that provide the same Pease interconnection pattern within blocks. For a given factorization, the blocks share dimensions and the stage-to-stage interconnection pattern, and each of them can be computed independently of the others.
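Schematically, a parallel (Pease-type) factorization of the length-N DFT, for N a power of two, can be written as

\[ F_{N} \;=\; Q\,\prod_{k=1}^{\log_{2} N} \big(I_{N/2} \otimes F_{2}\big)\, D_{k}\, \Pi_{N}, \]

where F_2 is the 2-point butterfly, \(\Pi_N\) is the same stride (perfect-shuffle) permutation between every pair of stages (the constant-geometry property that characterizes Pease algorithms), the D_k are diagonal twiddle-factor matrices, and Q is a data-reordering permutation such as bit reversal. This is only a generic schematic of the Pease form (conventions for the placement of permutations and twiddles vary between presentations); it is not the new block factorization proposed in this work, which mixes a few Cooley-Tukey factors with factors exhibiting the Pease pattern inside blocks.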