974 results for Convexity in Graphs


Relevance:

30.00%

Publisher:

Abstract:

This project is concerned with the way that illustrations, photographs, diagrams and graphs, and typographic elements interact to convey ideas on the book page. A framework for graphic description is proposed to elucidate this graphic language of ‘complex texts’. The model is built up from three main areas of study, with reference to a corpus of contemporary children’s science books. First, a historical survey puts the subjects for study in context. Then a multidisciplinary discussion of graphic communication provides a theoretical underpinning for the model; this leads to various proposals, such as the central importance of ratios and relationships among parts in creating meaning in graphic communication. Lastly, a series of trials in description contributes to the structure of the model itself. At the heart of the framework is an organising principle that integrates descriptive models from the fields of design, literary criticism, art history, and linguistics, among others, as well as novel categories designed specifically for book design. Broadly, design features are described in terms of elemental component parts (micro-level), larger groupings of these (macro-level), and finally in terms of overarching, ‘whole book’ qualities (meta-level). Various features of book design emerge at different levels; for instance, the presence of nested discursive structures, a form of graphic recursion in editorial design, is proposed at the macro-level. Across these three levels are the intersecting categories of ‘rule’ and ‘context’, offering different perspectives with which to describe graphic characteristics. Context-based features are contingent on the social and cultural environment, the reader’s previous knowledge, and the actual conditions of reading; rule-based features relate to the systematic or codified aspects of graphic language. The model aims to be a frame of reference for graphic description, of use in different forms of qualitative or quantitative research and as a heuristic tool in practice and teaching.

Relevance:

30.00%

Publisher:

Abstract:

This note reviews Ken Thompson's statistics on 6-man White wins with Black to move and explains the way in which the statistics have been graphed logarithmically.

Relevance:

30.00%

Publisher:

Abstract:

We investigate variants of the dominating set problem in social networks. While randomised algorithms for solving the minimum weighted dominating set problem and the minimum alpha and alpha-rate domination problems on simple graphs already exist in the literature, we propose here a randomised algorithm for the minimum weighted alpha-rate dominating set problem which is, to the best of our knowledge, the first such algorithm. A theoretical approximation bound based on a simple randomised rounding technique is given. The algorithm is implemented in Python and applied to a UK Twitter mentions network, using a measure of individuals’ influence (Klout) as weights. We argue that the weights of vertices can be interpreted as the costs of getting those individuals on board for a campaign or a behaviour change intervention. The minimum weighted alpha-rate dominating set problem can therefore be seen as finding a set that minimises the total cost while ensuring that each individual in the network has at least an alpha proportion of its neighbours in the chosen set. We also test our algorithm on generated graphs with several thousand vertices and edges. Our results on this real-life Twitter network and on the generated graphs show that the implementation is reasonably efficient and can thus be used in real-life applications such as creating social-network-based interventions, designing social media campaigns and potentially improving users’ social media experience.
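
The published algorithm and its approximation bound are not reproduced in the abstract; the sketch below only illustrates the flavour of a randomised-rounding approach to the weighted alpha-rate constraint: include each vertex independently with some probability, then greedily repair violated vertices using their cheapest missing neighbours. The adjacency-dict input, the inclusion probability p and the repair rule are assumptions, not the authors' method.

```python
import random

def weighted_alpha_rate_domination(adj, weight, alpha, p=0.5, seed=0):
    """Illustrative sketch (not the published algorithm): pick each vertex
    independently with probability p, then greedily repair every vertex whose
    alpha-rate constraint (>= alpha * deg(v) neighbours in the set) is violated,
    always adding the cheapest missing neighbour first."""
    rng = random.Random(seed)
    dominating = {v for v in adj if rng.random() < p}
    for v in adj:
        required = alpha * len(adj[v])                      # neighbours v must have in the set
        covered = sum(1 for u in adj[v] if u in dominating)
        for u in sorted((u for u in adj[v] if u not in dominating),
                        key=lambda u: weight[u]):           # cheapest repairs first
            if covered >= required:
                break
            dominating.add(u)
            covered += 1
    return dominating

# Toy example: a 4-cycle with unit weights and alpha = 0.5
adj = {1: [2, 4], 2: [1, 3], 3: [2, 4], 4: [1, 3]}
weight = {v: 1.0 for v in adj}
print(weighted_alpha_rate_domination(adj, weight, alpha=0.5))
```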

Relevance:

30.00%

Publisher:

Abstract:

In this paper, Bond Graphs are employed to develop a novel mathematical model of conventional switched-mode DC-DC converters that is valid for both continuous and discontinuous conduction modes (CCM and DCM). A unique-causality bond graph representation of the hybrid model is proposed, in which the switch is represented by a Modulated Transformer with a binary input and the diode by a resistor with fixed conductance causality. The operation of the diode is controlled by an if-then function within the model. The extracted hybrid model is applied to a Boost and a Buck converter whose operation changes from CCM to DCM and back to CCM. The vector fields of the models show validity over a wide operating region, and comparison with PSPICE simulations of the converters reveals the high accuracy of the proposed model, with the Normalised Root Mean Square Error and the Maximum Absolute Error remaining adequately low. The model is also tested experimentally on a Buck topology.
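
The bond graph model itself cannot be reconstructed from the abstract alone; as a minimal numerical sketch of the same idea, the code below simulates a Buck converter with the switch as a binary modulation signal and the diode handled by an if-then rule, so that one set of equations covers both CCM and DCM. The component values, load and Euler step size are arbitrary assumptions, and this is plain state-equation integration rather than a bond graph implementation.

```python
def simulate_buck(Vin=24.0, L=100e-6, C=220e-6, R=50.0,
                  fsw=20e3, duty=0.3, t_end=5e-3, dt=1e-7):
    """Minimal Euler-integration sketch of a Buck converter in which the diode
    is modelled by an if-then rule, so the same equations cover CCM and DCM."""
    iL, vC = 0.0, 0.0
    samples = []
    for k in range(int(t_end / dt)):
        t = k * dt
        s = 1.0 if (t * fsw) % 1.0 < duty else 0.0    # binary switch signal (MTF modulation)
        if s == 1.0:                                  # switch on: source feeds the inductor
            vL = Vin - vC
        elif iL > 0.0:                                # switch off, diode conducting (CCM)
            vL = -vC
        else:                                         # switch off, diode blocking (DCM)
            vL, iL = 0.0, 0.0
        iL = max(0.0, iL + dt * vL / L)               # inductor current cannot reverse (diode)
        vC += dt * (iL - vC / R) / C                  # capacitor / load node
        samples.append((t, iL, vC))
    return samples

out = simulate_buck()
print("final output voltage ~ %.2f V" % out[-1][2])
```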

Relevance:

30.00%

Publisher:

Abstract:

Iso-score curve graphs (iSCGs) and the mathematical relationships between Scoring Parameters (SPs) and Forecasting Parameters (FPs) can be used with the Economic Scoring Formulas (ESFs) applied in tendering to distribute the score among bidders in the economic part of a proposal. Each contracting authority must set an ESF when publishing the tender specifications, and the strategy of each bidder will differ depending on the ESF selected and on its weight in the overall proposal scoring. The various mathematical relationships and density distributions that describe the main SPs and FPs, together with the representation of tendering data by means of iSCGs, enable the generation of two new types of graphs that can be very useful for bidders who want to be more competitive: scoring probability graphs and position probability graphs.
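
The abstract does not spell out any particular ESF; purely for illustration, the snippet below evaluates one commonly used kind of proportional formula, in which each bidder's score scales with the ratio of the lowest bid received to its own bid. The formula, the bid figures and the 100-point scale are hypothetical and are not taken from the paper.

```python
def proportional_scores(bids, max_score=100.0):
    """Hypothetical example of an Economic Scoring Formula (ESF), not one taken
    from the paper: each bidder's score is proportional to the ratio between
    the lowest bid received and the bidder's own bid."""
    lowest = min(bids.values())
    return {bidder: max_score * lowest / bid for bidder, bid in bids.items()}

# Hypothetical bids in euros
print(proportional_scores({"A": 95_000, "B": 100_000, "C": 110_000}))
```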

Relevance:

30.00%

Publisher:

Abstract:

Recent developments in the area of Bid Tender Forecasting have enabled bidders to implement new types of easy-to-use tools for increasing their chances of winning contracts. Although these new tools (such as iso-Score Curve Graphs, Scoring Probability Graphs, and Position Probability Graphs) are designed for bidders in capped tendering (tenders with an upper price limit), some of their principles can also be applied by a Contracting Authority to detect which bidders do not follow a standard pattern, that is, whose bids are extremely high or low. Since a collusive bid generally needs to be sufficiently high or low to make an impact on the bid distribution, any person in charge of supervising capped tenders can be alerted to bidders that might be involved in a cartel by identifying the same abnormal behavior across a series of tenders through simple calculations and a new type of graph.

Relevance:

30.00%

Publisher:

Abstract:

Research into Bid Tender Forecasting Models (BTFM) has been in progress since the 1950s. None of the models developed has been an easy-to-use tool for bidding practitioners, because of the advanced mathematical apparatus and massive data inputs required. This scenario began to change in 2012 with the development of the Smartbid BTFM, a quite simple model that presents a series of graphs enabling any project manager to study competitors using a relatively short historical tender dataset. However, despite the advantages of this new model, it is still necessary to study all the auction participants as an indivisible group; that is, the original BTFM was not devised for analyzing the behavior of a single bidding competitor or a subgroup of them. The present paper addresses that limitation and presents a stand-alone methodology useful for estimating competitors’ future bidding behaviors separately.

Relevance:

30.00%

Publisher:

Abstract:

Texture is one of the most important visual attributes for image analysis and has been widely used in image analysis and pattern recognition. A partially self-avoiding deterministic walk has recently been proposed as an approach to texture analysis, with promising results. This approach uses walkers (called tourists) to exploit the gray-scale contexts of the image at several levels. Here, we present an approach to generating graphs from the trajectories produced by the tourist walks. The generated graphs embody important characteristics related to tourist transitivity in the image. Statistical position (degree mean) and dispersion (entropy of two vertices with the same degree) measures computed from these graphs are used as texture descriptors. A comparison with traditional texture analysis methods is performed to illustrate the high performance of this novel approach. (C) 2011 Elsevier Ltd. All rights reserved.
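
The exact walk and descriptor definitions are in the paper, not the abstract; the sketch below is a simplified illustration that runs a deterministic, partially self-avoiding walk from every pixel (moving to the most similar 4-neighbour not among the last mu positions), collects the transitions as directed edges, and summarises the resulting graph by its mean degree and the Shannon entropy of the degree distribution. The memory length, the walk-length cap and the neighbourhood are assumptions.

```python
from collections import defaultdict
import math

def tourist_graph_descriptors(image, mu=2):
    """Simplified sketch (published descriptors may differ): tourist-walk graph
    summarised by mean out-degree and degree-distribution entropy."""
    h, w = len(image), len(image[0])
    nodes = [(r, c) for r in range(h) for c in range(w)]
    edges = set()
    for start in nodes:
        pos, memory = start, [start]
        for _ in range(h * w):                       # cap the walk length for this sketch
            r, c = pos
            candidates = [(rr, cc)
                          for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                          if 0 <= rr < h and 0 <= cc < w and (rr, cc) not in memory[-mu:]]
            if not candidates:
                break
            nxt = min(candidates, key=lambda q: abs(image[q[0]][q[1]] - image[r][c]))
            edges.add((pos, nxt))                    # record the transition as a directed edge
            memory.append(nxt)
            pos = nxt
    degree = {v: 0 for v in nodes}
    for u, _ in edges:
        degree[u] += 1
    mean_degree = sum(degree.values()) / len(nodes)
    counts = defaultdict(int)
    for d in degree.values():
        counts[d] += 1
    probs = [c / len(nodes) for c in counts.values()]
    entropy = -sum(p * math.log(p) for p in probs)
    return mean_degree, entropy

print(tourist_graph_descriptors([[10, 12, 200], [11, 13, 201], [220, 230, 205]]))
```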

Relevance:

30.00%

Publisher:

Abstract:

Complex networks can be understood as graphs whose connectivity properties deviate from those of regular or near-regular graphs, which are understood as being "simple". While a great deal of the attention so far dedicated to complex networks has been duly driven by the "complex" nature of these structures, in this work we address the identification of their simplicity. The basic idea is to seek subgraphs whose nodes exhibit similar measurements. This approach paves the way for complementing the characterization of networks, including results suggesting that protein-protein interaction networks, and to a lesser extent also the Internet, may be getting simpler over time. Copyright (C) EPLA, 2009
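
The abstract only sketches the idea of seeking subgraphs whose nodes exhibit similar measurements; the snippet below is one plausible, simplified reading of that idea using networkx, binning nodes by a local measurement (degree) and extracting the connected induced subgraphs inside each bin. The choice of degree as the measurement and the binning scheme are assumptions, not the authors' method.

```python
import networkx as nx

def simple_subgraphs(G, n_bins=5):
    """Illustrative reading of the idea (not the authors' exact procedure):
    bin nodes by degree and return the connected induced subgraphs whose nodes
    all fall in the same bin, i.e. candidate 'simple' regions of the network."""
    degrees = dict(G.degree())
    lo, hi = min(degrees.values()), max(degrees.values())
    width = max(1, (hi - lo + 1) // n_bins)
    bins = {}
    for node, d in degrees.items():
        bins.setdefault((d - lo) // width, []).append(node)
    subgraphs = []
    for nodes in bins.values():
        H = G.subgraph(nodes)
        subgraphs.extend(H.subgraph(c).copy() for c in nx.connected_components(H))
    return subgraphs

G = nx.barabasi_albert_graph(200, 2, seed=1)
parts = simple_subgraphs(G)
print(max(len(H) for H in parts), "nodes in the largest homogeneous subgraph")
```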

Relevance:

30.00%

Publisher:

Abstract:

2D electrophoresis is a well-known method for protein separation which is extremely useful in the field of proteomics. Each spot in the image represents a protein accumulation, and the goal is to perform a differential analysis between pairs of images to study changes in protein content. It is thus necessary to register two images by finding spot correspondences. Although it may seem a simple task, the manual processing of this kind of image is generally very cumbersome, especially when strong variations between corresponding sets of spots are expected (e.g. strong non-linear deformations and outliers). In order to solve this problem, this paper proposes a new quadratic assignment formulation together with a correspondence estimation algorithm based on graph matching which takes into account the structural information among the detected spots. Each image is represented by a graph and the task is to find a maximum common subgraph. Successful experimental results using real data are presented, including an extensive comparative performance evaluation with ground-truth data. (C) 2010 Elsevier B.V. All rights reserved.
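
The paper's quadratic assignment formulation is not given in the abstract; the fragment below is only a crude greedy stand-in that scores candidate spot correspondences by a unary proximity term plus a pairwise term rewarding preserved inter-spot distances, which is the kind of structural information a graph-matching formulation exploits. The spot coordinates, the max_shift gate and the greedy selection are illustrative assumptions.

```python
import math
from itertools import product

def match_spots(spots_a, spots_b, max_shift=30.0):
    """Crude sketch of structure-aware correspondence estimation (the paper uses
    a quadratic assignment / maximum common subgraph formulation; this greedy
    scoring is only an illustration)."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    candidates = [(i, j) for i, j in product(range(len(spots_a)), range(len(spots_b)))
                  if dist(spots_a[i], spots_b[j]) <= max_shift]
    matches = {}
    # greedily fix the pair most structurally consistent with what is matched so far
    for _ in range(min(len(spots_a), len(spots_b))):
        best, best_score = None, -float("inf")
        for i, j in candidates:
            if i in matches or j in matches.values():
                continue
            unary = -dist(spots_a[i], spots_b[j])
            pairwise = -sum(abs(dist(spots_a[i], spots_a[p]) - dist(spots_b[j], spots_b[q]))
                            for p, q in matches.items())
            if unary + pairwise > best_score:
                best, best_score = (i, j), unary + pairwise
        if best is None:
            break
        matches[best[0]] = best[1]
    return matches

# Toy spot sets: image B is a slightly shifted copy of A plus one outlier spot
a = [(10, 10), (40, 12), (25, 60)]
b = [(12, 11), (43, 14), (27, 63), (90, 90)]
print(match_spots(a, b))
```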

Relevance:

30.00%

Publisher:

Abstract:

The assessment of routing protocols for mobile wireless networks is a difficult task because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay-tolerant networks (DTNs), have more predictable dynamics, as the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph-theoretic model, the evolving graph, was proposed to help capture the dynamic behavior of such networks, in view of the construction of least-cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, there has been no study of the use of such theoretical results in practical situations. The objective of our work is therefore to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator first to implement an evolving graph based routing protocol, and then to use it as a benchmark when comparing four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, at least those with predictable dynamics. In order to make this model widely applicable, however, some practical issues, such as adaptive algorithms, still have to be addressed and incorporated into the model. We also discuss such issues in this paper, as a result of our experience.
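
As a reminder of what the evolving graph model provides, the sketch below computes a foremost (earliest-arrival) journey on a toy evolving graph in which each directed edge carries the list of discrete time steps at which it exists. It illustrates the model only; it is not the NS2 routing protocol implemented in the paper, and the discrete-time schedule representation and one-step traversal delay are assumptions.

```python
import heapq

def foremost_journey(schedule, source, target, t_start=0):
    """Sketch of a foremost-journey computation on an evolving graph.
    schedule[(u, v)] is the sorted list of time steps at which edge u->v exists;
    traversing an edge at step t delivers at step t + 1."""
    arrival = {source: t_start}
    parent = {}
    heap = [(t_start, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == target:
            break
        if t > arrival.get(u, float("inf")):         # stale heap entry
            continue
        for (a, b), times in schedule.items():
            if a != u:
                continue
            # first availability of edge (u, b) at or after the current arrival time
            nxt = next((s for s in times if s >= t), None)
            if nxt is not None and nxt + 1 < arrival.get(b, float("inf")):
                arrival[b] = nxt + 1
                parent[b] = u
                heapq.heappush(heap, (nxt + 1, b))
    if target not in arrival:
        return None, None
    path, node = [target], target
    while node != source:
        node = parent[node]
        path.append(node)
    return arrival[target], path[::-1]

# Toy evolving graph: edge (A,B) exists at steps 0 and 5, (B,C) at 1, (A,C) at 7
schedule = {("A", "B"): [0, 5], ("B", "C"): [1], ("A", "C"): [7]}
print(foremost_journey(schedule, "A", "C"))
```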

Relevance:

30.00%

Publisher:

Abstract:

Let M = (V, E, A) be a mixed graph with vertex set V, edge set E and arc set A. A cycle cover of M is a family C = {C_1, ..., C_k} of cycles of M such that each edge/arc of M belongs to at least one cycle in C. The weight of C is |C_1| + ... + |C_k|. The minimum cycle cover problem is the following: given a strongly connected mixed graph M without bridges, find a cycle cover of M with weight as small as possible. The Chinese postman problem is: given a strongly connected mixed graph M, find a minimum length closed walk using all edges and arcs of M. These problems are NP-hard. We show that they can be solved in polynomial time if M has bounded tree-width. (C) 2008 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

In 1983, Chvátal, Trotter and the two senior authors proved that for any Δ there exists a constant B such that, for any n, any 2-colouring of the edges of the complete graph K_N with N >= Bn vertices yields a monochromatic copy of any graph H that has n vertices and maximum degree Δ. We prove that the complete graph may be replaced by a sparser graph G that has N vertices and O(N^{2-1/Δ} log^{1/Δ} N) edges, with N = [B'n] for some constant B' that depends only on Δ. Consequently, the so-called size-Ramsey number of any H with n vertices and maximum degree Δ is O(n^{2-1/Δ} log^{1/Δ} n). Our approach is based on random graphs; in fact, we show that the classical Erdős-Rényi random graph with the numerical parameters above satisfies a stronger partition property with high probability, namely, that any 2-colouring of its edges contains a monochromatic universal graph for the class of graphs on n vertices and maximum degree Δ. The main tool in our proof is the regularity method, adapted to a suitable sparse setting. The novel ingredient developed here is an embedding strategy that allows one to embed bounded-degree graphs of linear order in certain pseudorandom graphs. Crucial to our proof is the fact that regularity is typically inherited at a scale that is much finer than the scale at which it is assumed. (C) 2011 Elsevier Inc. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Consider the following problem: for given graphs G and F_1, ..., F_k, find a coloring of the edges of G with k colors such that G does not contain F_i in color i. Rödl and Ruciński studied this problem for the random graph G(n,p) in the symmetric case when k is fixed and F_1 = ... = F_k = F. They proved that such a coloring exists asymptotically almost surely (a.a.s.) provided that p <= bn^{-β} for some constants b = b(F, k) and β = β(F). This result is essentially best possible because for p >= Bn^{-β}, where B = B(F, k) is a large constant, such an edge-coloring does not exist. Kohayakawa and Kreuter conjectured a threshold function n^{-β(F_1, ..., F_k)} for arbitrary F_1, ..., F_k. In this article we address the case when F_1, ..., F_k are cliques of different sizes and propose an algorithm that a.a.s. finds a valid k-edge-coloring of G(n,p) with p <= bn^{-β} for some constant b = b(F_1, ..., F_k), where β = β(F_1, ..., F_k) is as conjectured. With a few exceptions, this algorithm also works in the general symmetric case. We also show that there exists a constant B = B(F_1, ..., F_k) such that for p >= Bn^{-β} the random graph G(n,p) a.a.s. does not have a valid k-edge-coloring, provided the so-called KLR conjecture holds. (C) 2008 Wiley Periodicals, Inc. Random Struct. Alg., 34, 419-453, 2009

Relevance:

30.00%

Publisher:

Abstract:

The performance of a carbon paste electrode (CPE) modified with SBA-15 nanostructured silica organofunctionalised with 2-benzothiazolethiol in the simultaneous determination of Pb(II), Cu(II) and Hg(II) ions in natural water and sugar cane spirit (cachaça) is described. Pb(II), Cu(II) and Hg(II) were pre-concentrated on the surface of the modified electrode by complexing with 2-benzothiazolethiol and reduced at a negative potential (-0.80 V). The reduced products were then oxidised by the DPASV procedure. The fact that three stripping peaks appeared on the voltammograms, at potentials of -0.48 V (Pb2+), -0.03 V (Cu2+) and +0.36 V (Hg2+) in relation to the SCE, demonstrates the possibility of simultaneous determination of Pb2+, Cu2+ and Hg2+. The best results were obtained under the following optimised conditions: 100 mV pulse amplitude, 3 min accumulation time and 25 mV s^-1 scan rate in phosphate solution at pH 3.0. Using these parameters, the calibration graphs were linear in the concentration ranges of 3.00-70.0 x 10^-7 mol L^-1 (Pb2+), 8.00-100.0 x 10^-7 mol L^-1 (Cu2+) and 2.00-10.0 x 10^-6 mol L^-1 (Hg2+). Detection limits of 4.0 x 10^-8 mol L^-1 (Pb2+), 2.0 x 10^-7 mol L^-1 (Cu2+) and 4.0 x 10^-7 mol L^-1 (Hg2+) were obtained at a signal-to-noise ratio (SNR) of 3. The results indicate that this electrode is sensitive and effective for the simultaneous determination of Pb2+, Cu2+ and Hg2+ in the analysed samples. (C) 2008 Published by Elsevier B.V.
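
The abstract reports linear calibration graphs and detection limits at a signal-to-noise ratio of 3; as a generic illustration of how such figures are commonly derived (not the authors' data or procedure), the sketch below fits a least-squares calibration line and applies the usual 3-sigma-over-slope convention for the limit of detection. All numbers are hypothetical, and the assumption that the reported SNR criterion corresponds to this convention is mine.

```python
def linear_calibration(conc, signal):
    """Generic least-squares calibration line: returns slope and intercept of
    signal = slope * conc + intercept (hypothetical data, for illustration only)."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, signal))
             / sum((x - mx) ** 2 for x in conc))
    return slope, my - slope * mx

def detection_limit(blank_sd, slope, k=3):
    """Common k-sigma convention for the limit of detection: k times the blank
    standard deviation divided by the calibration slope."""
    return k * blank_sd / slope

# Hypothetical calibration points (mol L^-1 vs. peak current in arbitrary units)
conc = [3e-7, 1e-6, 3e-6, 7e-6]
signal = [0.9, 3.1, 9.2, 21.5]
slope, intercept = linear_calibration(conc, signal)
print(slope, intercept, detection_limit(blank_sd=4e-2, slope=slope))
```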