963 results for scale-free topology
Abstract:
ICINCO 2010
Abstract:
There are currently two competing models of our universe: Big Bang cosmology with inflation, and the cyclic model with an ekpyrotic phase in each cycle. This paper is divided into two main parts according to these two models. In the first part, we quantify the potentially observable effects of a small violation of translational invariance during inflation, as characterized by the presence of a preferred point, line, or plane. We explore the imprint such a violation would leave on the cosmic microwave background anisotropy, and provide explicit formulas for the expected amplitudes $\langle a_{lm}a_{l'm'}^*\rangle$ of the spherical-harmonic coefficients. We then provide a model and study the two-point correlation of a massless scalar (the inflaton) when the stress tensor contains the energy density from an infinitely long straight cosmic string in addition to a cosmological constant. Finally, we discuss whether inflation can be reconciled with Liouville's theorem as far as the fine-tuning problem is concerned. In the second part, we identify several problems in the cyclic/ekpyrotic cosmology. First, the quantum-to-classical transition would not happen during an ekpyrotic phase even for superhorizon modes, so the fluctuations cannot be interpreted as classical. This implies that the prediction of a scale-free power spectrum in the ekpyrotic/cyclic universe model requires further scrutiny. Second, we find that the usual mechanism for solving fine-tuning problems is not compatible with an eternal universe containing infinitely many cycles in both directions of time. Therefore, all fine-tuning problems, including the flatness problem, still demand an explanation in any generic cyclic model.
Abstract:
During the last two decades, analysis of 1/f noise in cognitive science has led to considerable progress in the way we understand the organization of our mental life. However, there is still a lack of specific models explaining how 1/f noise is generated in coupled brain-body-environment systems, since existing models and experiments typically target either externally observable behaviour or isolated neuronal systems but do not address the interplay between neuronal mechanisms and sensorimotor dynamics. We present a conceptual model of a minimal neurorobotic agent solving a behavioural task that makes it possible to relate mechanistic (neurodynamic) and behavioural levels of description. The model consists of a simulated robot controlled by a network of Kuramoto oscillators with homeostatic plasticity and the ability to develop behavioural preferences mediated by sensorimotor patterns. With only three oscillators, this simple model displays self-organized criticality in the form of robust 1/f noise and a wide multifractal spectrum. We show that the emergence of self-organized criticality and 1/f noise in our model results from three simultaneous conditions: a) non-linear interaction dynamics capable of generating stable collective patterns, b) internal plastic mechanisms modulating the sensorimotor flows, and c) strong sensorimotor coupling with the environment that induces transient metastable neurodynamic regimes. We carry out a number of experiments to show that both synaptic plasticity and strong sensorimotor coupling play a necessary role, as constituents of self-organized criticality, in the generation of 1/f noise. The experiments also prove useful for testing the robustness of 1/f scaling by comparing the results of different techniques. We finally discuss the role of conceptual models as mediators between nomothetic and mechanistic models and how they can inform future experimental research in which self-organized criticality includes sensorimotor coupling among the essential interaction-dominant processes giving rise to 1/f noise.
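To make the neurodynamic level concrete, here is a minimal sketch of a three-oscillator Kuramoto network with a homeostatic plasticity rule. The set point, learning rate, and multiplicative update are illustrative assumptions, not the authors' implementation, which additionally couples the oscillators to a simulated robot's sensors and motors:

```python
import numpy as np

# Minimal Kuramoto network with an assumed homeostatic plasticity rule.
rng = np.random.default_rng(0)
N, dt, steps = 3, 0.01, 20000
omega = rng.normal(1.0, 0.1, N)          # natural frequencies
K = rng.uniform(0.5, 1.5, (N, N))        # coupling weights
np.fill_diagonal(K, 0.0)
theta = rng.uniform(0, 2 * np.pi, N)     # oscillator phases
rho_target, eta = 0.7, 0.01              # homeostatic set point, learning rate

order = []
for _ in range(steps):
    phase_diff = theta[None, :] - theta[:, None]          # theta_j - theta_i
    theta += dt * (omega + (K * np.sin(phase_diff)).sum(axis=1))
    rho = abs(np.exp(1j * theta).mean())                  # Kuramoto order parameter
    K += dt * eta * (rho_target - rho) * K                # homeostasis toward rho_target
    order.append(rho)

# 1/f scaling would be probed via the power spectrum of this synchrony time series.
print(np.mean(order[-1000:]))
```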
Abstract:
The dynamics of the survival of recruiting fish are analyzed as evolving random processes of aggregation and mortality. The analyses draw on recent advances in the physics of complex networks and, in particular, the scale-free degree distribution arising from growing random networks with preferential attachment of links to nodes. In this study, simulations were conducted in which recruiting fish 1) were subjected to mortality under alternative mortality-encounter models and 2) aggregated according to random encounters (two schools randomly encountering one another join into a single school) or preferential attachment (the probability of a successful aggregation of two schools is proportional to the school sizes). The simulations started from either a “disaggregated” (all schools comprised a single fish) or an aggregated initial condition. Results showed that the school-size distribution under preferential attachment evolved toward a scale-free distribution, whereas under random attachment it evolved toward an exponential distribution. Preferential attachment strategies outperformed random attachment strategies in terms of recruitment survival over time when mortality encounters were weighted toward schools rather than toward individual fish. Mathematical models were developed whose solutions (either analytic or numerical) mimicked the simulation results. The resulting models included both Beverton-Holt and Ricker-like recruitment, which predict recruitment as a function of initial mean school size as well as initial stock size. Results suggest that school-size distributions during recruitment may provide information on recruitment processes. The models also provide a template for expanding both theoretical and empirical recruitment research.
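The aggregation step of such a simulation can be sketched in a few lines; the following toy (mortality omitted, parameters arbitrary) contrasts preferential attachment with random encounters:

```python
import numpy as np

rng = np.random.default_rng(1)

def aggregate(n_fish=2000, n_steps=1500, preferential=True):
    """Merge schools pairwise from a disaggregated start; return final sizes."""
    schools = [1] * n_fish                    # every school starts as one fish
    for _ in range(n_steps):
        if len(schools) < 2:
            break
        sizes = np.array(schools, dtype=float)
        p = sizes / sizes.sum() if preferential else None   # size-weighted or uniform
        i, j = rng.choice(len(schools), size=2, replace=False, p=p)
        schools[i] += schools[j]              # the two schools join into one
        schools.pop(j)
    return schools

# Preferential attachment yields a heavier (roughly scale-free) size tail;
# random encounters tend toward an exponential size distribution.
print(max(aggregate(preferential=True)), max(aggregate(preferential=False)))
```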
Abstract:
In this paper, we study range-based attacks on links in geographically constrained scale-free networks and find that there is a continuous switching of roles between short- and long-range attacks on links as the geographical constraint strength is tuned. Our results demonstrate that geography has a significant impact on network efficiency and security; thus one can adjust the geographical structure to optimize the robustness and efficiency of the networks. We introduce a measure of the impact of links on network efficiency, and an effective attack strategy is suggested.
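A range-based link attack can be illustrated as follows; this sketch omits the paper's geographical embedding and uses the standard definition of a link's range (the shortest alternative path between its endpoints), with networkx's global efficiency as the efficiency measure:

```python
import networkx as nx

G = nx.barabasi_albert_graph(200, 2, seed=0)   # stand-in scale-free network

def link_range(G, u, v):
    """Shortest-path length between a link's endpoints once the link is removed."""
    H = G.copy()
    H.remove_edge(u, v)
    try:
        return nx.shortest_path_length(H, u, v)
    except nx.NetworkXNoPath:
        return float("inf")

print("before:", nx.global_efficiency(G))
# Long-range attack: remove the links with the largest range first.
for u, v in sorted(G.edges(), key=lambda e: link_range(G, *e), reverse=True)[:10]:
    G.remove_edge(u, v)
print("after:", nx.global_efficiency(G))
```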
Abstract:
We investigate the effect of clusters in complex networks on efficiency dynamics by studying a simple efficiency model in two coupled small-world networks. It is shown that the critical network randomness corresponding to the transition from a stagnant phase to a growing one decreases to zero as the connection strength of the clusters increases. It is also shown, for fixed randomness, that the state of the clusters transitions from a stagnant phase to a growing one as the connection strength increases. This work can be useful for understanding the critical transitions appearing in many dynamic processes on clustered networks.
Abstract:
In communication networks such as the Internet, the relationship between packet generation rate and time resembles a rectangle wave, owing to the rhythms of human activity. We therefore investigate traffic dynamics on a network with a rectangle-wave packet generation rate. It is found that the critical delivering capacity parameter $\beta_c$ (which separates the congested phase from the free phase) decreases significantly with the duty cycle r of the rectangle wave for packet generation. In the congested phase, more collective generation of packets (smaller r) helps to decrease the packet aggregation rate. Moreover, it is found that the congested phase can be divided into two regions, region 1 and region 2, where the distributions of queue lengths are nonlinear and linear, respectively. A linear expression for the distribution of queue lengths in region 2 is obtained analytically. Our work reveals a clear effect of the rectangle wave on the traffic dynamics and the queue-length distribution in the system, and may provide insights into the design of work-rest schedules and routing strategies.
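The model can be sketched as follows; the network, routing rule, and parameter values are assumptions for illustration, with packets generated at a rectangle-wave rate and each node forwarding at most C packets per step along shortest paths:

```python
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(100, 2, seed=0)
queues = {n: [] for n in G}                    # each queue holds packet destinations
R_high, r, period, C = 2, 0.3, 50, 5           # peak rate, duty cycle, period, capacity

def step(t):
    rate = R_high if (t % period) < r * period else 0   # rectangle-wave generation
    for _ in range(rate):
        s, d = random.sample(list(G), 2)
        queues[s].append(d)
    moves = []
    for n in G:                                # forward up to C packets per node
        for _ in range(min(C, len(queues[n]))):
            dest = queues[n].pop(0)
            nxt = nx.shortest_path(G, n, dest)[1]
            if nxt != dest:                    # packets arriving at dest are delivered
                moves.append((nxt, dest))
    for nxt, dest in moves:
        queues[nxt].append(dest)

for t in range(500):
    step(t)
print(sum(len(q) for q in queues.values()))    # total queue length: congestion proxy
```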
Abstract:
Estimation of the skeleton of a directed acyclic graph (DAG) is of great importance for understanding the underlying DAG, and causal effects can be assessed from the skeleton when the DAG is not identifiable. We propose a novel method named PenPC to estimate the skeleton of a high-dimensional DAG in two steps. We first estimate the nonzero entries of a concentration matrix using penalized regression, and then fix the difference between the concentration matrix and the skeleton by evaluating a set of conditional independence hypotheses. For high-dimensional problems where the number of vertices p is polynomial or exponential in the sample size n, we study the asymptotic properties of PenPC on two types of graphs: traditional random graphs, where all vertices have the same expected number of neighbors, and scale-free graphs, where a few vertices may have a large number of neighbors. As illustrated by extensive simulations and applications to gene expression data of cancer patients, PenPC has higher sensitivity and specificity than the state-of-the-art PC-stable algorithm.
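The two-step logic can be caricatured as below; this is not the PenPC implementation (which uses penalized regression per node and conditioning sets of increasing size), but it shows the shape of the procedure: an l1-penalized precision-matrix estimate proposes edges, and conditional-independence tests prune them:

```python
import numpy as np
from itertools import combinations
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.normal(size=(n, p))
X[:, 1] += X[:, 0]; X[:, 2] += X[:, 1]        # toy chain 0 -> 1 -> 2

# Step 1: support of a penalized concentration (precision) matrix.
Theta = GraphicalLassoCV().fit(X).precision_
edges = {(i, j) for i, j in combinations(range(p), 2) if abs(Theta[i, j]) > 1e-3}

corr = np.corrcoef(X, rowvar=False)
def partial_corr(i, j, k):
    """Partial correlation of i and j given one conditioning variable k."""
    num = corr[i, j] - corr[i, k] * corr[j, k]
    return num / np.sqrt((1 - corr[i, k] ** 2) * (1 - corr[j, k] ** 2))

# Step 2: drop an edge if some third variable renders it conditionally independent.
threshold = 2 / np.sqrt(n)                    # rough cutoff, an assumption
skeleton = {e for e in edges
            if all(abs(partial_corr(*e, k)) > threshold
                   for k in range(p) if k not in e)}
print(sorted(skeleton))
```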
Abstract:
Efficient searching is crucial for the timely location of food and other resources. Recent studies show that diverse living animals employ a theoretically optimal scale-free random search for sparse resources known as a Lévy walk, but little is known of the origins and evolution of foraging behaviour and the search strategies of extinct organisms. Here we show, using simulations of self-avoiding trace fossil trails, that randomly introduced strophotaxis (U-turns) – initiated by obstructions such as self-trail avoidance or innate cueing – leads to random looping patterns with clustering across increasing scales that is consistent with the presence of Lévy walks. This predicts that optimal Lévy searches can emerge from simple behaviours observed in fossil trails. We then analysed fossilized trails of benthic marine organisms using a novel path analysis technique and found the first evidence of Lévy-like search strategies in extinct animals. Our results show that simple search behaviours of extinct animals in heterogeneous environments give rise to hierarchically nested Brownian walk clusters that converge to optimal Lévy patterns. Primary productivity collapse and the large-scale food scarcity characterising mass extinctions evident in the fossil record may have triggered the adaptation of optimal Lévy-like searches. The findings suggest Lévy-like behaviour has been employed by foragers since at least the Eocene but may have a more ancient origin, which could explain recent widespread observations of such patterns among modern taxa.
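The core of such a trail simulation can be sketched as a lattice walker that avoids its own trail and makes occasional random U-turns; the grid, probabilities, and scales here are illustrative assumptions rather than the study's method:

```python
import numpy as np

rng = np.random.default_rng(2)

def rotate(v, quarter_turns):
    """Rotate an integer direction vector by multiples of 90 degrees."""
    c, s = [(1, 0), (0, 1), (-1, 0), (0, -1)][quarter_turns % 4]
    return np.array([c * v[0] - s * v[1], s * v[0] + c * v[1]])

pos, heading = np.array([0, 0]), np.array([1, 0])
visited, path = {tuple(pos)}, [tuple(pos)]
p_uturn = 0.02                                 # chance of random strophotaxis

for _ in range(5000):
    if rng.random() < p_uturn:
        heading = rotate(heading, 2)           # U-turn
    if tuple(pos + heading) in visited:        # obstruction: avoid own trail
        heading = rotate(heading, rng.choice([1, 3]))
        if tuple(pos + heading) in visited:
            continue                           # boxed in this step; try again
    pos = pos + heading
    visited.add(tuple(pos))
    path.append(tuple(pos))

# Lévy-like behaviour would appear as a heavy-tailed distribution of straight-run
# lengths along the recorded path.
print(len(path), float(np.hypot(*path[-1])))
```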
Abstract:
A central question in community ecology is how the number of trophic links relates to community species richness. For simple dynamical food-web models, link density (the ratio of links to species) is bounded from above as the number of species increases; but empirical data suggest that it increases without bound. We found a new empirical upper bound on link density in large marine communities, with emphasis on fish and squid, using novel methods that avoid known sources of bias in traditional approaches. Bounds are expressed in terms of the diet-partitioning function (DPF): the average number of resources contributing more than a fraction f to a consumer's diet, as a function of f. All observed DPFs follow a functional form closely related to a power law, with power-law exponents independent of species richness within measurement accuracy. The results imply universal upper bounds on link density across the oceans. However, the inherently scale-free nature of power-law diet partitioning suggests that the DPF itself is a better-defined characterization of network structure than link density.
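The DPF itself is straightforward to compute from a diet-composition matrix; in this sketch the diet fractions are fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# 50 consumers x 20 resources; each row is a diet composition summing to 1.
diets = rng.dirichlet(alpha=np.full(20, 0.3), size=50)

def dpf(diets, f):
    """Average number of resources contributing more than fraction f to a diet."""
    return (diets > f).sum(axis=1).mean()

for f in (0.01, 0.05, 0.1, 0.25, 0.5):
    print(f, dpf(diets, f))   # a power law shows as a straight line on log-log axes
```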
Abstract:
We study the dissipative dynamics of two independent arrays of many-body systems, locally driven by a common entangled field. We show that in the steady state the entanglement of the driving field is reproduced in an arbitrarily large series of inter-array entangled pairs over all distances. Local nonclassical driving thus realizes a scale-free entanglement replication and long-distance entanglement distribution mechanism that has immediate bearing on the implementation of quantum communication networks.
Abstract:
Background
Although the General Medical Council recommends that United Kingdom medical students are taught ‘whole person medicine’, spiritual care is variably recognised within the curriculum. Data on teaching delivery and attainment of learning outcomes are lacking. This study ascertained the views of Faculty and students about spiritual care and how to teach and assess competence in delivering such care.
Methods
A questionnaire comprising 28 questions exploring attitudes to whole person medicine, spirituality and illness, and training of healthcare staff in providing spiritual care was designed using a five-point Likert scale. Free-text comments were studied by thematic analysis. The questionnaire was distributed to 1300 students and 106 Faculty at Queen’s University Belfast Medical School.
Results
351 responses (54 staff, 287 students; 25 %) were obtained. >90 % agreed that whole person medicine included physical, psychological and social components; 60 % supported inclusion of a spiritual component within the definition. Most supported availability of spiritual interventions for patients, including access to chaplains (71 %), counsellors (62 %), or members of the patient’s faith community (59 %). 90 % felt that personal faith/spirituality was important to some patients and 60 % agreed that this influenced health. However, 80 % felt that doctors should never/rarely share their own spiritual beliefs with patients, and 67 % felt they should only do so when specifically invited. Most supported including training on provision of spiritual care within the curriculum; 40-50 % felt this should be optional and 40 % mandatory. Small group teaching was the favoured delivery method. 64 % felt that teaching should not be assessed, but among assessment methods, reflective portfolios were most favoured (30 %). Students tended to hold more polarised viewpoints but generally were more favourably disposed towards spiritual care than Faculty. Respecting patients’ values and beliefs and the need for guidance in provision of spiritual care were identified in the free-text comments.
Conclusions
Students and Faculty generally recognise a spiritual dimension to health and support provision of spiritual care to appropriate patients. There is a lack of consensus over whether this should be delivered by doctors or left to others. Spiritual issues impacting patient management should be included in the curriculum; agreement is lacking about how such teaching should be delivered and assessed.
Abstract:
This work consists of a theoretical part and an experimental one. The first part provides a simple treatment of the celebrated von Neumann minimax theorem as formulated by Nikaidô and Sion, and discusses its relationships with fundamental theorems of convex analysis. The second part concerns externality in sponsored search auctions. It shows that in these auctions, advertisers exert externality effects on each other that influence their bidding behavior. It presents Hal R. Varian's model and shows how adding externality to this model affects its properties. In order to better understand the interaction among advertisers in on-line auctions, it studies the structure of the Google advertisements network and shows that it is a small-world, scale-free network.
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, quite elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information, such as Bayesian updating or the combination of likelihood and robust M-estimation functions, are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
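The compositional-data side of this correspondence is easy to make concrete; a minimal sketch of the centered log-ratio (clr) transform and Aitchison distance, with Bayesian updating as perturbation (an addition in log coordinates), follows:

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of a composition."""
    logx = np.log(x)
    return logx - logx.mean()

def aitchison_distance(x, y):
    return float(np.linalg.norm(clr(x) - clr(y)))

prior = np.array([0.2, 0.3, 0.5])
likelihood = np.array([0.1, 0.4, 0.5])
# Bayes' rule as perturbation: componentwise product, renormalized (closed).
posterior = prior * likelihood / (prior * likelihood).sum()

print(aitchison_distance(prior, posterior))
# In clr coordinates the update is an addition (up to the closing constant),
# matching the "simple additions/perturbations in A2(Pprior)" of the abstract.
```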
Abstract:
In this class, we will discuss network theory fundamentals, including concepts such as diameter, distance, clustering coefficient, and others. We will also discuss different types of networks, such as scale-free networks, random networks, etc.
Readings: A. Broder, R. Kumar, F. Maghoul, P. Raghavan, S. Rajagopalan, R. Stata, A. Tomkins, and J. Wiener, "Graph structure in the Web," Computer Networks 33, 309–320 (2000) [Web link, Alternative Link]
Optional: M. E. J. Newman, "The Structure and Function of Complex Networks," SIAM Review 45, 167–256 (2003) [Web link]
Original course at: http://kmi.tugraz.at/staff/markus/courses/SS2008/707.000_web-science/
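As a small illustration of the listed concepts, the following sketch (library and parameters are arbitrary choices, not course material) compares diameter, average distance, and clustering coefficient for a scale-free and a random network:

```python
import networkx as nx

scale_free = nx.barabasi_albert_graph(500, 3, seed=0)   # preferential attachment
random_net = nx.gnm_random_graph(500, scale_free.number_of_edges(), seed=0)

for name, G in [("scale-free", scale_free), ("random", random_net)]:
    G = G.subgraph(max(nx.connected_components(G), key=len))  # largest component
    print(name,
          "diameter:", nx.diameter(G),
          "avg distance:", round(nx.average_shortest_path_length(G), 2),
          "clustering:", round(nx.average_clustering(G), 3))
```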