15 results for Vertex Coloring
in Aston University Research Archive
Abstract:
The problem of vertex coloring in random graphs is studied using methods of statistical physics and probability. Our analytical results are compared to those obtained by exact enumeration and Monte Carlo simulations. We critically discuss the merits and shortcomings of the various methods, and interpret the results obtained. We present an exact analytical expression for the two-coloring problem as well as general replica symmetric approximated solutions for the thermodynamics of the graph coloring problem with p colors and K-body edges. ©2002 The American Physical Society.
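For the two-colouring case, exactness has a simple combinatorial counterpart: a graph admits a proper two-colouring if and only if it is bipartite. A minimal sketch of that check (this illustrates only the combinatorial fact, not the statistical-physics analysis described in the abstract):

```python
from collections import deque

def two_colourable(n, edges):
    """Return (True, colouring) if the graph is bipartite, else (False, None)."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    colour = [None] * n
    for s in range(n):               # handle every connected component
        if colour[s] is not None:
            continue
        colour[s] = 0
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if colour[w] is None:
                    colour[w] = 1 - colour[u]   # alternate colours along edges
                    queue.append(w)
                elif colour[w] == colour[u]:
                    return False, None          # odd cycle: no proper 2-colouring
    return True, colour
```

Exact enumeration over all colourings, as used in the paper for comparison, would agree with this test for p = 2.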
Abstract:
We study a variation of the graph coloring problem on random graphs of finite average connectivity. Given the number of colors, we aim to maximize the number of different colors at neighboring vertices (i.e. at one edge distance) of any vertex. Two efficient algorithms, belief propagation and Walksat, are adapted to carry out this task. We present experimental results based on two types of random graphs for different system sizes and identify the critical value of the connectivity for the algorithms to find a perfect solution. The problem and the suggested algorithms have practical relevance since various applications, such as distributed storage, can be mapped onto this problem.
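The objective described above (each vertex should see as many distinct colours as possible among its neighbours) can be made concrete with a toy scorer and a naive hill-climbing recolourer. This is not the belief propagation or Walksat algorithm of the paper, only a minimal illustration of the objective being maximized:

```python
def palette_score(adj, colours):
    # total, over all vertices, of the number of distinct colours
    # appearing in that vertex's neighbourhood
    return sum(len({colours[u] for u in adj[v]}) for v in range(len(adj)))

def greedy_recolour(adj, colours, n_colours):
    # hill-climb: recolour one vertex at a time while the score improves
    best = palette_score(adj, colours)
    improved = True
    while improved:
        improved = False
        for v in range(len(adj)):
            for c in range(n_colours):
                if c == colours[v]:
                    continue
                old = colours[v]
                colours[v] = c
                s = palette_score(adj, colours)
                if s > best:
                    best, improved = s, True
                else:
                    colours[v] = old       # revert: no improvement
    return colours, best
```

On a 4-cycle with two colours, starting from a uniform colouring, this reaches the perfect score of 8 (every vertex sees both colours); on harder instances such local search gets trapped, which is why the paper turns to belief propagation and Walksat.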
Abstract:
Objective: Pharyngeal stimulation can induce remarkable increases in the excitability of swallowing motor cortex, which is associated with short-term improvements in swallowing behaviour in dysphagic stroke patients. However, the mechanism by which this input induces cortical change remains unclear. Our aims were to explore the stimulus-induced facilitation of the cortico-bulbar projections to swallowing musculature and examine how input from the pharynx interacts with swallowing motor cortex. Methods: In 8 healthy subjects, a transcranial magnetic stimulation (TMS) paired-pulse investigation was performed comprising a single conditioning electrical pharyngeal stimulus (pulse width 0.2 ms, 240 V) followed by cortical TMS at inter-stimulus intervals (ISI) of 10-100 ms. Pharyngeal sensory evoked potentials (PSEP) were also measured over the vertex. In 6 subjects whole-brain magnetoencephalography (MEG) was further acquired following pharyngeal stimulation. Results: TMS-evoked pharyngeal motor evoked potentials were facilitated by the pharyngeal stimulus at ISI between 50 and 80 ms (mean increase: 47±6%, P<0.05). This correlated with the peak latency of the P1 component of the PSEP (mean 79.6±8.5 ms). MEG confirmed that the equivalent P1 peak activities were localised to caudolateral sensory and motor cortices (BA 4, 1, 2). Conclusions: Facilitation of the cortico-bulbar pathway to pharyngeal stimulation relates to coincident afferent input to sensorimotor cortex. Significance: These findings have mechanistic importance for how pharyngeal stimulation may increase motor excitability and provide guidance on temporal windows for future manipulations of swallowing motor cortex. © 2004 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Abstract:
The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software using selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component based software that is flexible, extensible, and expressive. We introduce a data-driven, object-oriented programming methodology to component based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated, in order to develop a visualization application for geospatial data. In particular we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross platform applicability. © The Eurographics Association 2007.
Abstract:
In this paper we describe a novel, extensible visualization system currently under development at Aston University. We introduce modern programming methods, such as the use of data-driven programming, design patterns, and the careful definition of interfaces to allow easy extension using plug-ins, to 3D landscape visualization software. We combine this with modern developments in computer graphics, such as vertex and fragment shaders, to create an extremely flexible, extensible real-time near-photorealistic visualization system. In this paper we show the design of the system and the main sub-components. We stress the role of modern programming practices and illustrate the benefits these bring to 3D visualization. © 2006 Springer-Verlag Berlin Heidelberg.
Abstract:
We assess the accuracy of the Visante anterior segment optical coherence tomographer (AS-OCT) and present improved formulas for measurement of surface curvature and axial separation. Measurements are made in physical model eyes. Accuracy is compared for measurements of corneal thickness (d1) and anterior chamber depth (d2) using built-in AS-OCT software versus the improved scheme. The improved scheme enables measurements of lens thickness (d3) and surface curvature, in the form of conic sections specified by vertex radii and conic constants. These parameters are converted to surface coordinates for error analysis. The built-in AS-OCT software typically overestimates (mean ± standard deviation [SD]) d1 by +62±4 μm and d2 by +4±88 μm. The improved scheme reduces d1 (-0.4±4 μm) and d2 (0±49 μm) errors while also reducing d3 errors from +218±90 μm (uncorrected) to +14±123 μm (corrected). Surface x coordinate errors gradually increase toward the periphery. Considering the central 6-mm zone of each surface, the x coordinate errors for anterior and posterior corneal surfaces reached +3±10 and 0±23 μm, respectively, with the improved scheme. Those of the anterior and posterior lens surfaces reached +2±22 and +11±71 μm, respectively. Our improved scheme reduced AS-OCT errors and could, therefore, enhance pre- and postoperative assessments of keratorefractive or cataract surgery, including measurement of accommodating intraocular lenses. © 2007 Society of Photo-Optical Instrumentation Engineers.
Abstract:
The two electrophysiological tests currently favoured in the clinical measurement of hearing threshold are the brainstem auditory evoked potential (BAEP) and the slow vertex response (SVR). However, both tests possess disadvantages. The BAEP is the test of choice in younger patients as it is stable at all levels of arousal, but little information has been obtained to date at a range of frequencies. The SVR is frequency specific but is unreliable in certain adult subjects and is unstable during sleep or in young children. These deficiencies have prompted research into a third group of potentials, the middle latency response (MLR) and the 40 Hz response. This research has compared the SVR and 40 Hz response in waking adults and reports that the 40 Hz test can provide a viable alternative to the SVR provided that a high degree of subject relaxation is ensured. A second study examined the morphology of the MLR and 40 Hz response during sleep. This work suggested that these potentials are markedly different during sleep and that methodological factors have been responsible for masking these changes in previous studies. The clinical possibilities of tone-pip BAEPs were then examined, as these components proved to be the only stable responses present in sleep. It was found that threshold estimates to 500 Hz, 1000 Hz and 4000 Hz stimuli could be made to within 15 dB SL in most cases. A final study looked more closely at methods of obtaining frequency-specific information in sleeping subjects. Threshold estimates were made using established BAEP parameters, and this was compared to a 40 Hz procedure which recorded a series of BAEPs over a 100 ms time sweep. Results indicated that the 40 Hz procedure was superior to existing techniques in estimating threshold to low-frequency stimuli.
This research has confirmed a role for the MLR and 40 Hz response as alternative measures of hearing capability in waking subjects and proposes that the 40 Hz technique is useful in measuring frequency-specific thresholds, although the responses recorded derive primarily from the brainstem.
Abstract:
We consider a variation of the prototype combinatorial optimization problem known as graph colouring. Our optimization goal is to colour the vertices of a graph with a fixed number of colours, in a way to maximize the number of different colours present in the set of nearest neighbours of each given vertex. This problem, which we pictorially call palette-colouring, has been recently addressed as a basic example of a problem arising in the context of distributed data storage. Even though it has not been proved to be NP-complete, random search algorithms find the problem hard to solve. Heuristics based on a naive belief propagation algorithm are observed to work quite well in certain conditions. In this paper, we build upon the mentioned result, working out the correct belief propagation algorithm, which needs to take into account the many-body nature of the constraints present in this problem. This method improves the naive belief propagation approach at the cost of increased computational effort. We also investigate the emergence of a satisfiable-to-unsatisfiable 'phase transition' as a function of the vertex mean degree, for different ensembles of sparse random graphs in the large size ('thermodynamic') limit.
Abstract:
The molecular chaperone, Hsc70, together with its cofactor, auxilin, facilitates the ATP-dependent removal of clathrin during clathrin-mediated endocytosis in cells. We have used cryo-electron microscopy to determine the 3D structure of a complex of clathrin, auxilin 401-910 and Hsc70 at pH 6 in the presence of ATP, frozen within 20 seconds of adding Hsc70, in order to visualize events that follow the binding of Hsc70 to clathrin and auxilin before clathrin disassembly. In this map, we observe density beneath the vertex of the cage that we attribute to bound Hsc70. This density emerges asymmetrically from the clathrin vertex, suggesting preferential binding by Hsc70 for one of the three possible sites at the vertex. Statistical comparison with a map of whole auxilin and clathrin previously published by us reveals the location of statistically significant differences, which implicate involvement of clathrin light chains in structural rearrangements that occur after Hsc70 is recruited. Clathrin disassembly assays using light scattering suggest that loss of clathrin light chains reduces the efficiency with which auxilin facilitates this reaction. These data support a regulatory role for clathrin light chains in clathrin disassembly in addition to their established role in regulating clathrin assembly. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Abstract:
Noxious stimuli in the esophagus cause pain that is referred to the anterior chest wall because of convergence of visceral and somatic afferents within the spinal cord. We sought to characterize the neurophysiological responses of these convergent spinal pain pathways in humans by studying 12 healthy subjects over three visits (V1, V2, and V3). Esophageal pain thresholds (Eso-PT) were assessed by electrical stimulation and anterior chest wall pain thresholds (ACW-PT) by use of a contact heat thermode. Esophageal evoked potentials (EEP) were recorded from the vertex following 200 electrical stimuli, and anterior chest wall evoked potentials (ACWEP) were recorded following 40 heat pulses. The fear of pain questionnaire (FPQ) was administered on V1. Statistical data are shown as point estimates of difference ± 95% confidence interval. Pain thresholds increased between V1 and V3 [Eso-PT: V1-V3 = -17.9 mA (-27.9, -7.9) P < 0.001; ACW-PT: V1-V3 = -3.38°C (-5.33, -1.42) P = 0.001]. The morphology of cortical responses from both sites was consistent and equivalent [P1, N1, P2, N2 complex, where P1 and P2 are the first and second positive (downward) components of the CEP waveform, respectively, and N1 and N2 are the first and second negative (upward) components, respectively], indicating activation of similar cortical networks. For EEP, N1 and P2 latencies decreased between V1 and V3 [N1: V1-V3 = 13.7 (1.8, 25.4) P = 0.02; P2: V1-V3 = 32.5 (11.7, 53.2) P = 0.003], whereas amplitudes did not differ. For ACWEP, P2 latency increased between V1 and V3 [-35.9 (-60, -11.8) P = 0.005] and amplitudes decreased [P1-N1: V1-V3 = 5.4 (2.4, 8.4) P = 0.01; P2-N2: 6.8 (3.4, 10.3) P < 0.001]. The mean P1 latency of EEP over three visits was 126.6 ms and that of ACWEP was 101.6 ms, reflecting afferent transmission via Aδ fibers. There was a significant negative correlation between FPQ scores and Eso-PT on V1 (r = -0.57, P = 0.05).
These data provide the first neurophysiological evidence of convergent esophageal and somatic pain pathways in humans.
Abstract:
In this paper, we propose a new edge-based matching kernel for graphs by using discrete-time quantum walks. To this end, we commence by transforming a graph into a directed line graph. The reasons for using the line graph structure are twofold. First, for a graph, its directed line graph is a dual representation and each vertex of the line graph represents a corresponding edge in the original graph. Second, we show that the discrete-time quantum walk can be seen as a walk on the line graph and the state space of the walk is the vertex set of the line graph, i.e., the state space of the walk is the edges of the original graph. As a result, the directed line graph provides an elegant way of developing a new edge-based matching kernel based on discrete-time quantum walks. For a pair of graphs, we compute the h-layer depth-based representation for each vertex of their directed line graphs by computing entropic signatures (computed from discrete-time quantum walks on the line graphs) on the family of h-layer expansion subgraphs rooted at the vertex, i.e., we compute the depth-based representations for edges of the original graphs through their directed line graphs. Based on the new representations, we define an edge-based matching method for the pair of graphs by aligning the h-layer depth-based representations computed through the directed line graphs. The new edge-based matching kernel is thus computed by counting the number of matched vertices identified by the matching method on the directed line graphs. Experiments on standard graph datasets demonstrate the effectiveness of our new kernel.
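The directed line graph transformation described above can be sketched directly: each oriented edge (u, v) of the original graph becomes a vertex, with an arc to every oriented edge (v, w) that continues the walk without immediately backtracking (w != u). A minimal sketch of the construction only; the kernel computation itself is not reproduced:

```python
def directed_line_graph(edges):
    """Build the directed line graph of an undirected graph.

    Vertices ("darts") are the oriented edges (u, v); there is an arc
    (u, v) -> (v, w) whenever the second dart continues from v and
    does not immediately return to u.
    """
    darts = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]
    arcs = [((u, v), (v2, w)) for (u, v) in darts for (v2, w) in darts
            if v2 == v and w != u]
    return darts, arcs
```

On a triangle, each of the 6 darts has exactly one non-backtracking continuation, so the directed line graph has 6 vertices and 6 arcs.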
Abstract:
We propose a family of attributed graph kernels based on mutual information measures, i.e., the Jensen-Tsallis (JT) q-differences (for q ∈ [1,2]) between probability distributions over the graphs. To this end, we first assign a probability to each vertex of the graph through a continuous-time quantum walk (CTQW). We then adopt the tree-index approach [1] to strengthen the original vertex labels, and we show how the CTQW can induce a probability distribution over these strengthened labels. We show that our JT kernel (for q = 1) overcomes the shortcoming of discarding non-isomorphic substructures arising in the R-convolution kernels. Moreover, we prove that the proposed JT kernels generalize the Jensen-Shannon graph kernel [2] (for q = 1) and the classical subtree kernel [3] (for q = 2), respectively. Experimental evaluations demonstrate the effectiveness and efficiency of the JT kernels.
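The Jensen-Tsallis q-difference at the heart of this kernel family has a compact form: the Tsallis entropy of the mixture of two distributions minus the mean of their individual entropies. A minimal sketch for two discrete distributions (the CTQW-derived distributions and the tree-index label strengthening of the paper are not reproduced):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p); the q -> 1 limit is the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    if q == 1.0:
        nz = p[p > 0]
        return float(-(nz * np.log(nz)).sum())
    return float((1.0 - (p ** q).sum()) / (q - 1.0))

def jt_q_difference(p1, p2, q):
    # entropy of the equal-weight mixture minus the mean entropy
    m = (np.asarray(p1, dtype=float) + np.asarray(p2, dtype=float)) / 2.0
    return tsallis_entropy(m, q) - 0.5 * (tsallis_entropy(p1, q)
                                          + tsallis_entropy(p2, q))
```

At q = 1 this reduces to the Jensen-Shannon divergence, consistent with the claim that the JT kernels generalize the Jensen-Shannon graph kernel.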
Abstract:
The study of complex networks has recently attracted increasing interest because of the large variety of systems that can be modeled using graphs. A fundamental operation in the analysis of complex networks is that of measuring the centrality of a vertex. In this paper, we propose to measure vertex centrality using a continuous-time quantum walk. More specifically, we relate the importance of a vertex to the influence that its initial phase has on the interference patterns that emerge during the quantum walk evolution. To this end, we make use of the quantum Jensen-Shannon divergence between two suitably defined quantum states. We investigate how the importance varies as we change the initial state of the walk and the Hamiltonian of the system. We find that, for a suitable combination of the two, the importance of a vertex is almost linearly correlated with its degree. Finally, we evaluate the proposed measure on two commonly used networks. © 2014 Springer-Verlag Berlin Heidelberg.
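The walk underlying this centrality measure evolves as |psi(t)> = exp(-iHt)|psi(0)>. A minimal numpy sketch computing the visiting probabilities via the eigendecomposition of a symmetric Hamiltonian (here the adjacency matrix is used as H, one of several choices the abstract considers; the quantum Jensen-Shannon divergence step is not reproduced):

```python
import numpy as np

def ctqw_probabilities(H, psi0, t):
    """Visiting probabilities of a continuous-time quantum walk at time t."""
    w, V = np.linalg.eigh(H)   # H must be real symmetric (or Hermitian)
    # exp(-iHt) applied through the spectral decomposition of H
    psi = V @ (np.exp(-1j * w * t) * (V.conj().T @ psi0))
    return np.abs(psi) ** 2

# example: path graph on 3 vertices, walker started at an end vertex
H = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
psi0 = np.zeros(3)
psi0[0] = 1.0
p = ctqw_probabilities(H, psi0, 0.7)
```

Because the evolution is unitary, the probabilities sum to 1 at every time, unlike the stationary distribution of a classical random walk.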
Abstract:
The analysis of complex networks is usually based on key properties such as small-worldness and vertex degree distribution. The presence of symmetric motifs, on the other hand, has been related to redundancy and thus robustness of the networks. In this paper we propose a method for detecting approximate axial symmetries in networks. For each pair of nodes, we define a continuous-time quantum walk which is evolved through time. By measuring the probability that the quantum walker visits each node of the network in this time frame, we are able to determine whether the two vertices are symmetrical with respect to any axis of the graph. Moreover, we show that we are able to successfully detect approximate axial symmetries too. We show the efficacy of our approach by analysing both synthetic and real-world data. © 2012 Springer-Verlag Berlin Heidelberg.
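A crude version of the symmetry test can be sketched by comparing the multisets of visiting probabilities of walks started at the two candidate vertices: if the two profiles coincide (up to relabelling) at several times, the vertices are plausibly exchanged by some symmetry. This is a simplified stand-in for the pairwise walk construction of the paper, with the time samples chosen arbitrarily:

```python
import numpy as np

def walk_probabilities(A, start, t):
    # continuous-time quantum walk |psi(t)> = exp(-iAt)|start>
    w, V = np.linalg.eigh(A)
    psi0 = np.zeros(len(A))
    psi0[start] = 1.0
    psi = V @ (np.exp(-1j * w * t) * (V.conj().T @ psi0))
    return np.abs(psi) ** 2

def approximately_symmetric(A, u, v, times=(0.5, 1.0, 1.7)):
    # heuristic: u and v may be swapped by a symmetry if walks started
    # at u and at v yield the same multiset of probabilities at every t
    return all(np.allclose(np.sort(walk_probabilities(A, u, t)),
                           np.sort(walk_probabilities(A, v, t)), atol=1e-8)
               for t in times)
```

On a 3-vertex path the two end vertices are mirror images and pass the test, while an end vertex and the centre do not.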
Abstract:
In the study of complex networks, vertex centrality measures are used to identify the most important vertices within a graph. A related problem is that of measuring the centrality of an edge. In this paper, we propose a novel edge centrality index rooted in quantum information. More specifically, we measure the importance of an edge in terms of the contribution that it gives to the Von Neumann entropy of the graph. We show that this can be computed in terms of the Holevo quantity, a well-known quantum information theoretical measure. While computing the Von Neumann entropy and hence the Holevo quantity requires computing the spectrum of the graph Laplacian, we show how to obtain a simplified measure through a quadratic approximation of the Shannon entropy. This in turn shows that the proposed centrality measure is strongly correlated with the negative degree centrality on the line graph. We evaluate our centrality measure through an extensive set of experiments on real-world as well as synthetic networks, and we compare it against commonly used alternative measures.
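The Von Neumann entropy in question is that of the density matrix obtained by normalizing the graph Laplacian by its trace. A minimal sketch, using the simple entropy change on edge deletion as a stand-in for the paper's Holevo-quantity construction:

```python
import numpy as np

def laplacian(n, edges):
    # combinatorial Laplacian L = D - A of an undirected graph
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    return np.diag(A.sum(axis=1)) - A

def von_neumann_entropy(L):
    # density matrix rho = L / Tr(L); entropy S = -sum_i lam_i ln lam_i
    lam = np.linalg.eigvalsh(L / np.trace(L))
    lam = lam[lam > 1e-12]          # drop zero eigenvalues (0 ln 0 = 0)
    return float(-(lam * np.log(lam)).sum())

def edge_entropy_drop(n, edges, e):
    # crude edge importance: entropy change caused by deleting edge e
    rest = [f for f in edges if f != e]
    return (von_neumann_entropy(laplacian(n, edges))
            - von_neumann_entropy(laplacian(n, rest)))
```

On a 4-cycle all edges are equivalent by symmetry, so this proxy assigns them identical scores, as any sensible edge centrality should.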