824 results for trust graph


Relevance: 20.00%

Abstract:

We describe a heuristic method for drawing graphs which uses a multilevel technique combined with a force-directed placement algorithm. The multilevel process groups vertices to form clusters, uses the clusters to define a new graph, and is repeated until the graph size falls below some threshold. The coarsest graph is then given an initial layout, and the layout is successively refined on all the graphs, starting with the coarsest and ending with the original. In this way the multilevel algorithm both accelerates and gives a more global quality to the force-directed placement. The algorithm can compute both two- and three-dimensional layouts, and we demonstrate it on a number of examples ranging from 500 to 225,000 vertices. It is also very fast: it can compute a 2D layout of a sparse graph in around 30 seconds for a 10,000-vertex graph and around 10 minutes for the largest graph. This is an order of magnitude faster than recent implementations of force-directed placement algorithms.
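The multilevel scheme described in this abstract can be sketched in a few dozen lines. The following is a minimal, illustrative Python sketch, not the authors' implementation: adjacent vertices are randomly matched into clusters, the coarser graph is laid out recursively, and a simple spring-attraction/pairwise-repulsion refinement is run at each level. All parameter values (iteration counts, force constants, threshold) are arbitrary choices for illustration.

```python
import random

def coarsen(edges, n):
    """One coarsening pass: randomly match adjacent vertices, merge each
    matched pair into a single cluster-vertex, and rebuild the edge set."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    match = {}
    for v in range(n):
        if v in match:
            continue
        free = [u for u in adj[v] if u not in match and u != v]
        if free:
            u = random.choice(free)
            match[v], match[u] = u, v
    cluster, next_id = {}, 0
    for v in range(n):
        if v not in cluster:
            cluster[v] = next_id
            if v in match:
                cluster[match[v]] = next_id
            next_id += 1
    coarse = {(min(cluster[u], cluster[v]), max(cluster[u], cluster[v]))
              for u, v in edges if cluster[u] != cluster[v]}
    return sorted(coarse), next_id, cluster

def refine(pos, edges, iters=30, rep=0.005, att=0.05):
    """A few sweeps of simple force-directed placement (spring attraction
    along edges, pairwise repulsion) applied to an existing layout."""
    verts = list(pos)
    for _ in range(iters):
        disp = {v: [0.0, 0.0] for v in verts}
        for i, u in enumerate(verts):          # repulsion between all pairs
            for v in verts[i + 1:]:
                dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
                d2 = dx * dx + dy * dy + 1e-9
                f = rep / d2
                disp[u][0] += f * dx; disp[u][1] += f * dy
                disp[v][0] -= f * dx; disp[v][1] -= f * dy
        for u, v in edges:                     # attraction along edges
            dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
            disp[u][0] -= att * dx; disp[u][1] -= att * dy
            disp[v][0] += att * dx; disp[v][1] += att * dy
        for v in verts:
            pos[v] = (pos[v][0] + disp[v][0], pos[v][1] + disp[v][1])
    return pos

def multilevel_layout(edges, n, threshold=4):
    """Coarsen until the graph is small, lay out the coarsest graph,
    then prolong and refine back up through the levels."""
    if n <= threshold:
        pos = {v: (random.random(), random.random()) for v in range(n)}
        return refine(pos, edges)
    c_edges, c_n, cluster = coarsen(edges, n)
    if c_n == n:                 # no pair matched; stop coarsening here
        pos = {v: (random.random(), random.random()) for v in range(n)}
        return refine(pos, edges)
    c_pos = multilevel_layout(c_edges, c_n, threshold)
    pos = {v: (c_pos[cluster[v]][0] + random.uniform(-0.02, 0.02),
               c_pos[cluster[v]][1] + random.uniform(-0.02, 0.02))
           for v in range(n)}    # start each vertex near its cluster's position
    return refine(pos, edges)
```

The coarse layout gives each vertex a sensible starting position, which is what lets the force-directed refinement escape the poor local minima that a random initial layout tends to produce.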

Relevance: 20.00%

Abstract:

We describe a heuristic method for drawing graphs which uses a multilevel framework combined with a force-directed placement algorithm. The multilevel technique matches and coalesces pairs of adjacent vertices to define a new graph, and is repeated recursively to create a hierarchy of increasingly coarse graphs, G_0, G_1, …, G_L. The coarsest graph, G_L, is then given an initial layout, and the layout is refined and extended to all the graphs, starting with the coarsest and ending with the original. At each successive change of level, l, the initial layout for G_l is taken from its coarser and smaller child graph, G_{l+1}, and refined using force-directed placement. In this way the multilevel framework both accelerates and appears to give a more global quality to the drawing. The algorithm can compute both two- and three-dimensional layouts, and we demonstrate it on examples ranging in size from 10 to 225,000 vertices. It is also very fast: it can compute a 2D layout of a sparse graph in around 12 seconds for a 10,000-vertex graph and around 5-7 minutes for the largest graphs. This is an order of magnitude faster than recent implementations of force-directed placement algorithms.

Relevance: 20.00%

Abstract:

The graph-partitioning problem is to divide a graph into several pieces so that the number of vertices in each piece is the same within some defined tolerance and the number of cut edges is minimised. Important applications of the problem arise, for example, in parallel processing, where data sets need to be distributed across the memory of a parallel machine. Very effective heuristic algorithms have been developed for this problem which run in real time, but it is not known how good the resulting partitions are, since the problem is, in general, NP-complete. This paper reports an evolutionary search algorithm for finding benchmark partitions. A distinctive feature is the use of a multilevel heuristic algorithm to provide an effective crossover. The technique is tested on several example graphs, and it is demonstrated that our method can achieve extremely high-quality partitions, significantly better than those found by state-of-the-art graph-partitioning packages.
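The quantity being optimised is easy to state in code. The hypothetical sketch below (not the paper's evolutionary algorithm) evaluates the cut size of a 2-way partition and applies a simple Kernighan-Lin-flavoured local refinement of the kind partitioning heuristics use; the balance tolerance and pass count are illustrative.

```python
def cut_size(edges, part):
    """Number of edges whose endpoints lie in different parts."""
    return sum(1 for u, v in edges if part[u] != part[v])

def greedy_refine(edges, part, tol=1.5, passes=4):
    """Local refinement for a 2-way partition: move a vertex to the other
    side whenever that reduces the cut and keeps the receiving side within
    a balance tolerance (tol times the ideal part size)."""
    adj = {v: set() for v in part}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    n = len(part)
    for _ in range(passes):
        moved = False
        for v in sorted(part):
            # gain = cut edges removed minus cut edges created by moving v
            gain = sum(1 if part[u] != part[v] else -1 for u in adj[v])
            other = 1 - part[v]
            new_size = sum(1 for u in part if part[u] == other) + 1
            if gain > 0 and new_size <= tol * n / 2:
                part[v] = other
                moved = True
        if not moved:
            break
    return part
```

Such greedy passes only find local optima, which is exactly why the paper resorts to evolutionary search (with a multilevel crossover) to locate benchmark-quality partitions.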

Relevance: 20.00%

Abstract:

In this chapter we look at JOSTLE, the multilevel graph-partitioning software package, and highlight some of the key research issues that it addresses. We first outline the core algorithms and place them in the context of the multilevel refinement paradigm. We then look at issues relating to its use as a tool for parallel processing and, in particular, partitioning in parallel. Since its first release in 1995, JOSTLE has been used for many mesh-based parallel scientific computing applications, and so we also outline some enhancements, such as multiphase mesh-partitioning, heterogeneous mapping and partitioning to optimise subdomain shape.

Relevance: 20.00%

Abstract:

In this paper, we critically examine a special class of graph matching algorithms that follow the approach of node-similarity measurement. A high-level algorithmic framework, the node-similarity graph matching (NSGM) framework, is proposed, from which many existing graph matching algorithms can be subsumed, including the eigen-decomposition method of Umeyama, the polynomial-transformation method of Almohamad, the hubs-and-authorities method of Kleinberg, and the Kronecker-product successive-projection methods of Wyk, among others. In addition, improved algorithms can be developed from the NSGM framework with respect to the corresponding results in graph theory. Based on this analysis, it is pointed out that, in general, any algorithm which can be subsumed under the NSGM framework fails to work well for graphs with non-trivial automorphism structure.
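A representative member of the node-similarity family that such a framework subsumes is the coupled similarity iteration sketched below (illustrative code in the spirit of these methods, not taken from the paper). Each score S[i][j] is refreshed from the scores of the neighbours of i and j, then the matrix is normalised and the iteration repeats. Note that a fixed, odd iteration count is used here because the normalised iterates of such schemes can alternate between two states.

```python
def coupled_node_similarity(A, B, iters=21):
    """Iterative node-similarity scores between graphs G1 (adjacency
    matrix A, n x n) and G2 (adjacency matrix B, m x m). S[i][j] is
    refreshed from the similarities of the out- and in-neighbours of
    i and j, then the whole matrix is normalised by its largest entry."""
    n, m = len(A), len(B)
    S = [[1.0] * m for _ in range(n)]
    for _ in range(iters):
        T = [[0.0] * m for _ in range(n)]
        for i in range(n):
            for j in range(m):
                s = 0.0
                for p in range(n):
                    for q in range(m):
                        if A[i][p] and B[j][q]:   # out-neighbour contribution
                            s += S[p][q]
                        if A[p][i] and B[q][j]:   # in-neighbour contribution
                            s += S[p][q]
                T[i][j] = s
        norm = max(max(row) for row in T) or 1.0
        S = [[t / norm for t in row] for row in T]
    return S
```

Running this on a path graph matched against itself shows the failure mode the abstract highlights: the two symmetric end vertices receive identical scores, so no node-similarity measure of this kind can tell them apart when the graph has non-trivial automorphisms.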

Relevance: 20.00%

Abstract:

This paper examines different ways of measuring similarity between software design models for Case-Based Reasoning (CBR), to facilitate reuse of software design and code. The paper considers structural and behavioural aspects of similarity between software design models. Similarity metrics for comparing static class structures are defined and discussed. A graph representation of UML class diagrams and corresponding similarity measures for UML class diagrams are defined. A full-search graph matching algorithm for measuring the structural similarity of diagrams, based on the identification of the Maximum Common Sub-graph (MCS), is presented. Finally, a simple evaluation of the approach is presented and discussed.
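An MCS-based structural measure can be illustrated with a brute-force sketch (hypothetical code, not the paper's algorithm). Full search over vertex subsets and correspondences is exponential, which is only feasible for the small graphs that arise from class diagrams; the similarity is then the MCS size relative to the larger graph.

```python
from itertools import combinations, permutations

def mcs_size(V1, E1, V2, E2):
    """Size of the maximum common induced subgraph, found by full search:
    try every subset of V1 (largest first) against every injective
    correspondence into V2 and check that adjacency is preserved exactly."""
    e1 = {frozenset(e) for e in E1}
    e2 = {frozenset(e) for e in E2}
    for k in range(min(len(V1), len(V2)), 0, -1):
        for sub in combinations(V1, k):
            for image in permutations(V2, k):
                m = dict(zip(sub, image))
                if all((frozenset((m[a], m[b])) in e2) == (frozenset((a, b)) in e1)
                       for a, b in combinations(sub, 2)):
                    return k      # found a common induced subgraph of size k
    return 0

def structural_similarity(V1, E1, V2, E2):
    """Similarity in [0, 1]: MCS size relative to the larger graph."""
    if not V1 or not V2:
        return 0.0
    return mcs_size(V1, E1, V2, E2) / max(len(V1), len(V2))
```

In a CBR setting the vertices would be classes and the edges associations or inheritance links from the UML class diagram's graph representation.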

Relevance: 20.00%

Abstract:

This paper describes ways in which emergence engineering principles can be applied to the development of distributed applications. A distributed solution to the graph-colouring problem is used as a vehicle to illustrate some novel techniques. Each node acts autonomously to colour itself based only on its local view of its neighbourhood, following a simple set of carefully tuned rules. Randomness breaks symmetry and thus enhances stability. The algorithm has been developed to enable self-configuration in wireless sensor networks, and, to reflect real-world configurations, it operates with three-dimensional topologies (reflecting the propagation of radio waves and the placement of sensors in buildings, bridge structures, etc.). The algorithm's performance is evaluated and results are presented. It is shown to be simultaneously highly stable and scalable whilst achieving low convergence times. The use of eavesdropping gives rise to low interaction complexity and high efficiency in terms of communication overheads.
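The flavour of such a local rule might be sketched as follows; this is an illustrative reconstruction, not the paper's algorithm. Each node sees only its neighbours' colours (its local view), random initial colours break symmetry, and conflicted nodes re-choose a colour their neighbours do not use. Updates are serialised within a round here to approximate asynchronous, autonomous behaviour.

```python
import random

def self_colour(adj, num_colours=4, max_rounds=50, seed=7):
    """Emergent graph colouring: each node acts only on its local view.
    Nodes start with random colours; any node in conflict with a
    neighbour re-chooses a colour unused in its neighbourhood (or a
    random one if none is free). Returns (colouring, rounds_taken)."""
    rng = random.Random(seed)
    colour = {v: rng.randrange(num_colours) for v in adj}
    for rounds in range(1, max_rounds + 1):
        conflicted = [v for v in adj if any(colour[u] == colour[v] for u in adj[v])]
        if not conflicted:
            return colour, rounds - 1     # stable: no node wants to change
        for v in conflicted:
            used = {colour[u] for u in adj[v]}
            free = [c for c in range(num_colours) if c not in used]
            colour[v] = rng.choice(free) if free else rng.randrange(num_colours)
    return colour, max_rounds
```

On graphs whose maximum degree is below the number of colours, a free colour always exists, so the repair pass converges quickly; the 3-regular cube graph below stands in for a small three-dimensional sensor topology.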

Relevance: 20.00%

Abstract:

Trust is a complex concept that has increasingly been debated in academic research (Kramer and Tyler, 1996). Research on 'trust and leadership' (Caldwell and Hayes, 2007) has suggested, unsurprisingly, that leadership behaviours influence 'follower' perceptions of leaders' trustworthiness. The development of 'ethical stewardship' amongst leaders may foster high-trust situations (Caldwell, Hayes, Karri and Bernal, 2008), yet studies on the erosion of teacher professionalism in UK post-compulsory education have highlighted the distrust that arguably accompanies 'new managerialism', performativity and surveillance within a climate of economic rationalisation established by recent deterministic, skills-focused government agendas for education (Avis, 2003; Codd, 1999; Deem, 2004; DFES, 2006). Given the shift from community to commercialism identified by Collinson and Collinson (2005), in a global economic environment characterised by uncertainty and rapid change, trust is simultaneously increasingly important and progressively more fragile and limited in a post-compulsory education sector dominated by skills-based targets and inspection demands. Building on such prior studies, this conference paper reports on the analysis of findings from a 2007-8 funded research study on 'trust and leadership' carried out in post-compulsory education. The research project collected and analysed case study interview and survey data from the lifelong learning sector, including selected tertiary, further and higher education (FE and HE) institutions. We interviewed 18 UK respondents from HE and FE, including principals, middle managers, first-line managers, lecturers and researchers, supplementing and cross-checking this with a small number of survey responses (11) on 'trust and leadership' and a larger number (241) of survey responses on more generalised leadership issues in post-compulsory education.
A range of facilitators and enablers of trust and their relationship to leadership were identified and investigated. The research analysed the ways in which interviewees defined the concept of 'trust' and the extent to which they identified trust as a mediating factor affecting leadership and organisational performance. Prior literature indicates that trust involves a psychological state in which, despite dependency, risk and vulnerability, trustors have some degree of confident expectation that trustees will behave in benevolent rather than detrimental ways. The project confirmed the views of prior researchers (Mayer, Davis and Schoorman, 1995) that, since trust inevitably involves potential betrayal, estimations of leadership 'trustworthiness' are based on followers' cognitive and affective perceptions of the reliability, competence, benevolence and reputation of leaders. During the course of the interviews it also became clear that some interviewees were being managed in more or less transaction-focused, performative, audit-dominated cultures in which trust was not regarded as particularly important: while 'cautious trust' existed, collegiality flourished only marginally, in small teams. Economic necessity and survival were key factors influencing leadership and employee behaviours, while an increasing distance was reported between senior managers and their staff. The paper reflects on the nature of the public sector leadership and management environment in post-compulsory education reported by interviewees and survey respondents. Leadership behaviours to build trust are recommended, including effective communication, honesty, integrity, authenticity, reliability and openness. It was generally felt that building trust was difficult in an educational environment largely determined by economic necessity and performativity. Yet, despite this, the researchers identified a number of examples of high-trust leadership situations that are worthy of emulation.

Relevance: 20.00%

Abstract:

Academic partnerships bring knowledge and drive economic growth, but success depends on good communications that build trust, says Tim Gore.

Relevance: 20.00%

Abstract:

Dimethylsulphide (DMS) is a globally important aerosol precursor. In 1987, Charlson and others proposed that an increase in DMS production by certain phytoplankton species in response to a warming climate could stimulate increased aerosol formation, increasing the lower atmosphere's albedo and promoting cooling. Despite two decades of research, the global significance of this negative climate feedback remains contentious. It is therefore imperative that schemes are developed and tested which allow for the realistic incorporation of phytoplankton DMS production into Earth System models. Using these models we can investigate the DMS-climate feedback and reduce uncertainty surrounding projections of future climate. Here we examine two empirical DMS parameterisations within the context of an Earth System model and find them to perform marginally better than the standard DMS climatology at predicting observations from an independent global dataset. We then question whether parameterisations based on our present understanding of DMS production by phytoplankton, and simple enough to incorporate into global climate models, can be shown to enhance the future predictive capacity of those models. This is an important question to ask now, as results from increasingly complex Earth System models lead us into the fifth assessment of climate science by the Intergovernmental Panel on Climate Change. Comparing observed and predicted inter-annual variability, we suggest that future climate projections may underestimate the magnitude of surface ocean DMS change. Unfortunately, this conclusion relies on a relatively small dataset, in which observed inter-annual variability may be exaggerated by biases in sample collection. We therefore encourage the observational community to make repeat measurements of sea-surface DMS concentrations an important focus, and highlight areas of apparently high inter-annual variability where sampling might be carried out.
Finally, we assess future projections from two similarly valid empirical DMS schemes, and demonstrate contrasting results. We therefore conclude that the use of empirical DMS parameterisations within simulations of future climate should be undertaken only with careful appreciation of the caveats discussed.

Relevance: 20.00%

Abstract:

Two key players in the Arctic and subarctic marine ecosystem are the calanoid copepods Calanus finmarchicus and C. glacialis. Although morphologically very similar, these sibling species have different life cycles and roles in the Arctic pelagic marine ecosystem. Considering that the distribution of C. glacialis corresponds to Arctic water masses and that of C. finmarchicus to Atlantic water masses, the species are frequently used as climate indicators. Consequently, correct identification of the two species is essential if we want to understand climate-impacted changes in Calanus-dominated marine ecosystems such as the Arctic. Here, we present a novel morphological character (redness) to distinguish live females of C. glacialis and C. finmarchicus and compare it to morphological (prosome length) and genetic identification. The characters are tested on 300 live females of C. glacialis and C. finmarchicus from Disko Bay, western Greenland. Our analysis confirms that length cannot be used as a stand-alone criterion for separation. The results based on the new morphological character were verified genetically using a single mitochondrial marker (16S) and nuclear loci (six microsatellites and 12 InDels). The pigmentation criterion was also used on individuals (n = 89) from Young Sound fjord, northeast Greenland, to determine whether the technique was viable in different geographical locations. Genetic markers based on mitochondrial and nuclear loci were corroborative in their identification of individuals and revealed no hybrids. Molecular identification confirmed that live females of the two species from Greenlandic waters, both East and West, can easily be separated by the red pigmentation of the antenna and somites of C. glacialis, in contrast to the pale opaque antenna and somites of C. finmarchicus, confirming that the pigmentation criterion is valid for separation of the two species.

Relevance: 20.00%

Abstract:

We present a novel approach to goal recognition based on a two-stage paradigm of graph construction and analysis. First, a graph structure called a Goal Graph is constructed to represent the observed actions, the state of the world, and the achieved goals, as well as various connections between these nodes at consecutive time steps. Then, the Goal Graph is analysed at each time step to recognise those partially or fully achieved goals that are consistent with the actions observed so far. The Goal Graph analysis also reveals valid plans for the recognised goals or parts of these goals. Our approach to goal recognition does not need a plan library. It does not suffer from the problems in the acquisition and hand-coding of large plan libraries, nor does it have the problems of searching a plan space of exponential size. We describe two algorithms for Goal Graph construction and analysis in this paradigm. These algorithms are both provably sound, polynomial-time, and polynomial-space. The number of goals recognised by our algorithms is usually very small after a sequence of observed actions has been processed. Thus the sequence of observed actions is well explained by the recognised goals with little ambiguity. We have evaluated these algorithms in the UNIX domain, in which excellent performance has been achieved in terms of accuracy, efficiency, and scalability.
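The state-progression core underlying this kind of analysis can be sketched as below. This is a heavily simplified, hypothetical sketch: it omits the Goal Graph itself (the action, proposition and goal nodes and the connections recorded between consecutive time steps) and the extraction of valid plans, keeping only the recognition of goals consistent with the observed actions so far. The toy action and goal names are invented for illustration.

```python
def recognise_goals(initial, actions, observed, goals):
    """Progress the world state through the observed actions and report,
    at each time step, the set of goals fully achieved so far.

    actions:  name -> (preconditions, add effects, delete effects), as sets
    observed: sequence of observed action names
    goals:    goal name -> set of propositions that must hold
    """
    state = set(initial)
    recognised = []
    for name in observed:
        pre, adds, dels = actions[name]
        if not pre <= state:      # the Goal Graph only links applicable actions
            raise ValueError(f"observed action {name!r} is not applicable")
        state = (state - dels) | adds
        recognised.append({g for g, props in goals.items() if props <= state})
    return recognised
```

Because each step touches only the current state and the goal set, the whole pass is polynomial in the number of propositions, actions and goals, consistent with the polynomial-time, polynomial-space claim in the abstract.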