907 results for random graphs
Abstract:
Root and root finding are concepts familiar to most branches of mathematics. In graph theory, H is a square root of G, and G is the square of H, if two vertices x, y are adjacent in G if and only if they are at distance at most two in H. The graph square is a basic operation, with a number of results about its properties in the literature. We study the characterization and recognition problems for graph powers. There are algorithmic and computational approaches to the decision problem of whether a given graph is a certain power of some graph. There are polynomial-time algorithms that solve this problem for squares of graphs with girth at least six, while NP-completeness has been proven for squares of graphs with girth at most four. The girth-parameterized root-finding problem has remained open for squares of graphs with girth five. We settle the conjecture that recognition of squares of graphs with girth five is NP-complete. This result provides a complete dichotomy theorem for the square root finding problem.
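To make the square operation concrete, here is a minimal Python sketch that computes the square of a graph given as an adjacency dictionary. The representation and the helper name are assumptions for illustration; it is not the thesis's recognition algorithm, which works in the harder, opposite direction of recovering a root H from G.

```python
from itertools import combinations

def graph_square(adj):
    """Compute the square G of a graph H given as {vertex: set of neighbours}.
    Two vertices are adjacent in G iff they are at distance at most two in H."""
    square = {v: set(nbrs) for v, nbrs in adj.items()}  # distance-1 pairs carry over
    for v, nbrs in adj.items():
        # any two neighbours of a common vertex v are at distance <= 2 in H
        for x, y in combinations(nbrs, 2):
            square[x].add(y)
            square[y].add(x)
    return square

# The path a-b-c squares to the triangle a-b-c.
print(graph_square({'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}))
```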
Abstract:
We associate some graphs to a ring R and investigate the interplay between the ring-theoretic properties of R and the graph-theoretic properties of the graphs associated to R. Let Z(R) be the set of zero-divisors of R. We define an undirected graph Γ(R) whose vertices are the nonzero zero-divisors of R, with distinct vertices x and y adjacent if and only if xy=0 or yx=0. We investigate the Isomorphism Problem for zero-divisor graphs of group rings RG. Let Sk denote the sphere with k handles, where k is a non-negative integer; that is, Sk is an oriented surface of genus k. The genus of a graph is the minimal integer n such that the graph can be embedded in Sn. The annihilating-ideal graph of R is defined as the graph AG(R) whose vertices are the ideals with nonzero annihilators, with two distinct vertices I and J adjacent if and only if IJ=(0). We characterize Artinian rings whose annihilating-ideal graphs have finite genus. Finally, we extend the definition of the annihilating-ideal graph to non-commutative rings.
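As a concrete instance of the definition above, the sketch below builds the zero-divisor graph of the ring Z_n of integers modulo n. The function is a hypothetical illustration only, not code from the thesis, which treats general rings and group rings.

```python
def zero_divisor_graph(n):
    """Return (vertices, edges) of the zero-divisor graph of Z_n:
    vertices are the nonzero zero-divisors, and distinct x, y are
    adjacent iff x*y = 0 (mod n)."""
    nonzero = range(1, n)
    vertices = [x for x in nonzero if any((x * y) % n == 0 for y in nonzero)]
    edges = [(x, y) for i, x in enumerate(vertices)
             for y in vertices[i + 1:] if (x * y) % n == 0]
    return vertices, edges

# For Z_12 the vertices are 2, 3, 4, 6, 8, 9, 10; e.g. 3 ~ 4 since 3*4 = 12 = 0 (mod 12).
print(zero_divisor_graph(12))
```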
Abstract:
The conjecture that every planar graph is acyclically 5-choosable [Borodin et al., 2002] has been verified for several restricted classes of planar graphs. Recently, O. V. Borodin and A. O. Ivanova [Journal of Graph Theory, 68(2), October 2011, 169-176] showed that a planar graph is acyclically 5-choosable if it does not contain an i-cycle adjacent to a j-cycle, where 3 ≤ j ≤ 5 if i = 3 and 4 ≤ j ≤ 6 if i = 4. We improve this result and prove that every planar graph without an i-cycle adjacent to a j-cycle, with 3 ≤ j ≤ 5 if i = 3 and 4 ≤ j ≤ 5 if i = 4, is acyclically 5-choosable.
Abstract:
According to the List Colouring Conjecture, if G is a multigraph then χ'(G) = χ'_l(G). In this thesis, we discuss a relaxed version of this conjecture: every simple graph G is edge-(∆ + 1)-choosable, since by Vizing's Theorem ∆(G) ≤ χ'(G) ≤ ∆(G) + 1. We prove that if G is a planar graph without 7-cycles with ∆(G) ≠ 5, 6, or without adjacent 4-cycles with ∆(G) ≠ 5, or with no 3-cycles adjacent to 5-cycles, then G is edge-(∆ + 1)-choosable.
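For illustration, the hypothetical checker below (not part of the thesis) verifies the property that edge-(∆ + 1)-choosability guarantees to exist whenever every list has size ∆(G) + 1: a proper edge colouring in which each edge's colour is drawn from its prescribed list.

```python
from itertools import combinations

def is_proper_list_edge_colouring(edges, lists, colouring):
    """edges: 2-tuples (u, v); lists: {edge: set of allowed colours};
    colouring: {edge: chosen colour}.  True iff every edge uses a colour from
    its own list and no two edges sharing an endpoint receive the same colour."""
    edges = list(edges)
    if any(colouring[e] not in lists[e] for e in edges):
        return False  # colour not taken from the edge's list
    return all(colouring[e] != colouring[f]
               for e, f in combinations(edges, 2) if set(e) & set(f))

# Path a-b-c with lists of size 2 on each edge.
E = [('a', 'b'), ('b', 'c')]
L = {('a', 'b'): {1, 2}, ('b', 'c'): {1, 3}}
print(is_proper_list_edge_colouring(E, L, {('a', 'b'): 1, ('b', 'c'): 3}))  # True
```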
Abstract:
The KCube interconnection topology was first introduced in 2010. The KCube graph is a compound graph of a Kautz digraph and hypercubes. Compared with the attractive Kautz digraph and the well-known hypercube graph, the KCube graph can accommodate as many nodes as possible for a given indegree (and outdegree) and diameter of the interconnection network. However, there are few algorithms designed for the KCube graph. In this thesis, we will concentrate on finding graph-theoretical properties of the KCube graph and designing parallel algorithms that run on this network. We will explore several topological properties, such as bipartiteness, Hamiltonicity, and symmetry. These properties of the KCube graph are very useful for developing efficient algorithms on this network. We will then study the KCube network from the algorithmic point of view and will give an improved routing algorithm. In addition, we will present two optimal broadcasting algorithms; they are fundamental algorithms for many applications. A literature review of state-of-the-art network designs in relation to the KCube network, as well as some open problems in this field, will also be given.
Abstract:
Consider an undirected graph G and a subgraph H of G. A q-backbone k-colouring of (G,H) is a mapping f: V(G) → {1, 2, ..., k} such that G is properly coloured and, for each edge of H, the colours of its endpoints differ by at least q. The minimum number k for which there is a q-backbone k-colouring of (G,H) is the backbone chromatic number, BBC_q(G,H). It has been proved that the backbone chromatic number of (G,T) is at most 4 if G is a connected C4-free planar graph, a non-bipartite C5-free planar graph, or a Cj-free planar graph, j ∈ {6,7,8}, without adjacent triangles. In this thesis we improve the results mentioned above and prove, using a discharging method, that the 2-backbone chromatic number of any connected planar graph without adjacent triangles is at most 4. In the second part of this thesis we further improve these results by proving that for any graph G with χ(G) ≥ 4, BBC(G,T) = χ(G). In fact, we prove the stronger result that a backbone tree T in G exists such that ∀ uv ∈ T, |f(u)-f(v)| = 2 or |f(u)-f(v)| ≥ k-2, where k = χ(G). For the case that G is a planar graph, by the Four Colour Theorem χ(G) = 4; so BBC(G,T) = 4.
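The definition above translates directly into a small checker; the sketch below is a hypothetical illustration (the thesis proves existence bounds by discharging, it does not compute colourings this way).

```python
def is_q_backbone_k_colouring(G_edges, H_edges, f, q, k):
    """f maps each vertex to a colour in {1, ..., k}.  Returns True iff f is a
    proper colouring of G and, for every backbone edge uv of H, the endpoint
    colours differ by at least q."""
    if any(not 1 <= c <= k for c in f.values()):
        return False
    if any(f[u] == f[v] for u, v in G_edges):
        return False  # not a proper colouring of G
    return all(abs(f[u] - f[v]) >= q for u, v in H_edges)

# Triangle G = a-b-c with spanning-tree backbone T = {ab, bc}, q = 2, k = 4.
G = [('a', 'b'), ('b', 'c'), ('a', 'c')]
T = [('a', 'b'), ('b', 'c')]
print(is_q_backbone_k_colouring(G, T, {'a': 1, 'b': 4, 'c': 2}, q=2, k=4))  # True
```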
Abstract:
Diagrams (charts and graphs) made into a booklet with a newspaper cover. This booklet contains cross sections of the back ditch on the south side of the Welland Canal feeder, west of the Marshville culverts (45 pages, hand drawn). This was created by Fred Holmes, Oct. 3, 1857.
Abstract:
Charts and graphs of cross sections from Brown’s ditch culvert to the main drain, cross sections from the feeder on the road allowance between lots 26 and 27 in the 5th concession of Humberstone, cross sections of the main drain from Lyons Creek culvert to the road allowance between lots 7 and 8 in Wainfleet, and cross sections of the old ditch on the west side of the road allowance between lots 17 and 18 in the 3rd concession in Wainfleet (8 pages, hand drawn), n.d.
Abstract:
This paper presents a new theory of random consumer demand. The primitive is a collection of probability distributions, rather than a binary preference. Various assumptions constrain these distributions, including analogues of common assumptions about preferences such as transitivity, monotonicity and convexity. Two results establish a complete representation of theoretically consistent random demand. The purpose of this theory of random consumer demand is application to empirical consumer demand problems. To this end, the theory has several desirable properties. It is intrinsically stochastic, so the econometrician can apply it directly without adding extrinsic randomness in the form of residuals. Random demand is parsimoniously represented by a single function on the consumption set. Finally, we have a practical method for statistical inference based on the theory, described in McCausland (2004), a companion paper.
Abstract:
McCausland (2004a) describes a new theory of random consumer demand. Theoretically consistent random demand can be represented by a "regular" "L-utility" function on the consumption set X. The present paper is about Bayesian inference for regular L-utility functions. We express prior and posterior uncertainty in terms of distributions over the infinite-dimensional parameter set of a flexible functional form. We propose a class of proper priors on the parameter set. The priors are flexible, in the sense that they put positive probability in the neighborhood of any L-utility function that is regular on a large subset bar(X) of X; and regular, in the sense that they assign zero probability to the set of L-utility functions that are irregular on bar(X). We propose methods of Bayesian inference for an environment with indivisible goods, leaving the more difficult case of infinitely divisible goods for another paper. We analyse individual choice data from a consumer experiment described in Harbaugh et al. (2001).
Abstract:
The main objective of the research presented here was the synthesis of random copolymers based on ethylene and acrylic acid (AA). To this end, the ester groups of a precursor random copolymer, poly(ethylene-co-(tert-butyl) acrylate), were deprotected by hydrolysis using trimethylsilyl iodide. This precursor is synthesized by catalytic polymerization in the presence of a palladium (Pd)-based system. The second objective was to study and characterize the synthesized polymers in the solid state and in colloidal suspension. Several precursor copolymers containing different molar percentages of tert-butyl acrylate (4 to 12 mol%) were successfully synthesized and then deprotected by hydrolysis to obtain poly(ethylene-co-acrylic acid) (pE-co-AA) copolymers of different compositions. Only the copolymers containing 10 mol% or more of AA are soluble in tetrahydrofuran (THF), and only in this solvent. Such solutions can be dialyzed against water, which leads to a slow exchange between the water and the THF, and the self-assembly of the copolymer in water can then be studied. In this way, nanoparticles that are stable over time and whose behaviour is sensitive to pH and temperature were observed. The synthesized polymers were characterized by Nuclear Magnetic Resonance (NMR) and Infrared (IR) spectroscopy, before and after deprotection. The molar percentages of AA were determined by combining the NMR results with conductometric titrations. In the solid state, the samples were analyzed by Differential Scanning Calorimetry (DSC) and X-ray diffraction. The colloidal solutions of the pE-co-AA polymers were characterized by dynamic light scattering and high-sensitivity DSC. Transmission electron microscopy (TEM) was used to visualize the shape and size of the nanoparticles.
Abstract:
Research report presented to the Faculté des arts et des sciences in fulfillment of the requirements for the degree of Maîtrise en sciences économiques (Master's in Economics).
Abstract:
We complete the development of a testing ground for axioms of discrete stochastic choice. Our contribution here is to develop new posterior simulation methods for Bayesian inference, suitable for a class of prior distributions introduced by McCausland and Marley (2013). These prior distributions are joint distributions over various choice distributions over choice sets of different sizes. Since choice distributions over different choice sets can be mutually dependent, previous methods relying on conjugate prior distributions do not apply. We demonstrate the methods by analyzing data from a previously reported experiment and report evidence for and against various axioms.