805 results for Distance-balanced graph
Abstract:
Distance learning, Canadian educational system, institution, medial learning, school system Saxony-Anhalt, system theory, qualitative research
Abstract:
The object of this project is to schedule a fictitious European basketball competition with many teams located long distances apart. The schedule must be fair, feasible, and economical, meaning that the total distance traveled by every team must be as small as possible. First, we define the relevant sport competition terminology and study different competition systems, focusing on the NBA and Euroleague systems. Then we define the concepts of graph theory and spherical distance that will be needed. Next we propose a competition system, explaining where the teams will be allocated and how the schedule will be built. We then describe the programs that have been implemented, and, finally, the complete schedule is displayed and some possible improvements are mentioned.
Abstract:
The D-eigenvalues of a graph G are the eigenvalues of its distance matrix D, and the D-energy E_D(G) is the sum of the absolute values of its D-eigenvalues. Two graphs are said to be D-equienergetic if they have the same D-energy. In this note we obtain bounds for the distance spectral radius and D-energy of graphs of diameter 2. Pairs of equiregular D-equienergetic graphs of diameter 2 on p = 3t + 1 vertices are also constructed.
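A minimal sketch of the definitions used in this abstract: the distance matrix D of a graph, its D-eigenvalues, and the D-energy E_D(G) as the sum of their absolute values. The example graph (a 5-cycle) is purely illustrative and is not taken from the paper.

```python
import networkx as nx
import numpy as np

G = nx.cycle_graph(5)

# Distance matrix: entry (i, j) is the shortest-path distance between vertices i and j.
n = G.number_of_nodes()
D = np.zeros((n, n))
for i, lengths in nx.all_pairs_shortest_path_length(G):
    for j, d in lengths.items():
        D[i, j] = d

# D-eigenvalues and D-energy.
eigenvalues = np.linalg.eigvalsh(D)      # D is symmetric, so eigvalsh applies
d_energy = np.abs(eigenvalues).sum()
print("D-eigenvalues:", np.round(eigenvalues, 4))
print("D-energy:", round(d_energy, 4))
```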
Abstract:
The traveling salesman problem looks simple but is an important combinatorial problem. In this thesis I try to find the shortest tour in which each city is visited exactly once before returning to the starting city, using a multilevel graph partitioning approach. Although the traveling salesman problem is NP-complete, and graph partitioning is itself NP-complete, the combination works well in practice. I use a k-means partitioning algorithm that divides the problem into multiple partitions, solve each partition separately, and use the partial solutions to improve the overall tour by applying the Lin-Kernighan algorithm to it. With this approach I obtained an optimal solution, which shows that solving the traveling salesman problem through a graph partitioning scheme works well for this NP-hard problem and allows this intractable problem to be solved within a few minutes. Keywords: Graph Partitioning Scheme, Traveling Salesman Problem.
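A rough sketch of the partition-then-improve idea described above: cluster the cities with k-means, build a cheap tour inside each cluster with nearest neighbour, stitch the partial tours together, and improve the full tour. A simple 2-opt pass is used here as a stand-in for the Lin-Kernighan step of the thesis, and the city coordinates are random illustrative data.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
cities = rng.random((200, 2))            # illustrative random instance
dist = cdist(cities, cities)

def nearest_neighbour_tour(indices):
    """Greedy tour over a subset of city indices."""
    unvisited = list(indices[1:])
    tour = [indices[0]]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist[last, j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# 1. Partition the cities (k-means plays the role of the graph-partitioning step).
k = 8
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(cities)

# 2. Solve each partition separately and concatenate the partial tours.
tour = []
for c in range(k):
    members = np.where(labels == c)[0]
    if len(members):
        tour.extend(nearest_neighbour_tour(list(members)))

def tour_length(t):
    return sum(dist[t[i], t[(i + 1) % len(t)]] for i in range(len(t)))

# 3. Improve the stitched tour with 2-opt moves (a simplified stand-in for Lin-Kernighan).
improved = True
while improved:
    improved = False
    for i in range(1, len(tour) - 2):
        for j in range(i + 1, len(tour) - 1):
            if dist[tour[i - 1], tour[j]] + dist[tour[i], tour[j + 1]] < \
               dist[tour[i - 1], tour[i]] + dist[tour[j], tour[j + 1]]:
                tour[i:j + 1] = reversed(tour[i:j + 1])
                improved = True

print("tour length after partition + 2-opt:", round(tour_length(tour), 3))
```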
Abstract:
Problems of finding the best facility locations require a complete and accurate road network, together with the corresponding population data, for a specific area. However, the data available in road network databases usually do not fit this usage directly. In this paper we propose a procedure for converting a road network database into a road graph that can be used in localization problems. The road network data come from the National Road Database in Sweden. The derived graph is cleaned and reduced to a level suitable for localization problems, and the population points are processed in order to match that graph. The reduction of the graph preserves most of the accuracy of distance measures in the network.
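The paper's exact cleaning procedure for the Swedish data is not reproduced here, but one standard, distance-preserving reduction of a road graph is to contract pass-through (degree-2) vertices into single weighted edges. The sketch below illustrates that idea on a hypothetical toy graph; the edge attribute name "length" is an assumption.

```python
import networkx as nx

def contract_degree_two_nodes(G):
    """Merge chains of degree-2 nodes into single edges whose 'length' is the summed chain length."""
    H = G.copy()
    changed = True
    while changed:
        changed = False
        for v in list(H.nodes):
            if H.degree(v) != 2:
                continue
            nbrs = list(H.neighbors(v))
            if len(nbrs) != 2:
                continue                      # skip self-loops
            u, w = nbrs
            if H.has_edge(u, w):
                continue                      # do not merge when it would collapse a parallel route
            length = H[u][v]["length"] + H[v][w]["length"]
            H.remove_node(v)
            H.add_edge(u, w, length=length)
            changed = True
    return H

# Illustrative toy road graph: a chain a-b-c-d plus a branch at c.
G = nx.Graph()
G.add_edge("a", "b", length=120.0)
G.add_edge("b", "c", length=80.0)
G.add_edge("c", "d", length=200.0)
G.add_edge("c", "e", length=50.0)

H = contract_degree_two_nodes(G)
print(H.edges(data=True))   # b has been contracted: edge a-c carries the combined 200.0 length
```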
Abstract:
Research on image processing has shown that combining segmentation methods can lead to a solid approach for extracting semantic information from different sorts of images. Within this context, the Normalized Cut (NCut) is usually used as the final partitioning tool for graphs built with some chosen modeling method. This work explores the Watershed Transform as a modeling tool, using different criteria of the hierarchical Watershed to convert an image into an adjacency graph. The Watershed is combined with an unsupervised distance learning step that redistributes the graph weights and redefines the similarity matrix before the final segmentation step using NCut. Using the Berkeley Segmentation Data Set and Benchmark, our goal is to compare the results obtained with this method against previous work to validate its performance.
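A minimal scikit-image sketch of the pipeline this work builds on: a watershed over-segmentation turned into a region-adjacency graph, then partitioned with the Normalized Cut. The unsupervised distance-learning step that re-weights the graph is omitted, and the sample image and marker count are illustrative choices. (In scikit-image versions before 0.20 the graph functions live in skimage.future.graph rather than skimage.graph.)

```python
from skimage import data, color, filters, segmentation, graph

img = data.coffee()

# Watershed over-segmentation of the gradient image gives the initial regions (graph nodes).
gradient = filters.sobel(color.rgb2gray(img))
labels = segmentation.watershed(gradient, markers=300, compactness=0.001)

# Region-adjacency graph weighted by colour similarity, then Normalized Cut.
rag = graph.rag_mean_color(img, labels, mode='similarity')
ncut_labels = graph.cut_normalized(labels, rag)

print("regions before NCut:", labels.max(), "after NCut:", len(set(ncut_labels.ravel())))
```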
Abstract:
This book serves as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents applications of graph theory to low-level processing of digital images, such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, and a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition, and a new graph distance measure for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method for visualizing time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
Abstract:
Java Enterprise Applications (JEAs) are complex systems built from various technologies that in turn rely on languages other than Java, such as XML or SQL. Given the complexity of these applications, the need to reverse engineer them in order to support further development becomes critical. In this paper we show how a system can be split into layers and how the distance between application elements can be interpreted to support the refactoring of JEAs. The purpose of this paper is to explore ways to suggest refactoring operations to perform on the code by evaluating the distance between layers and the elements belonging to those layers. We split JEAs into layers by considering the kinds and purposes of the elements composing the application, and we measure the distance between elements using the notion of the shortest path in a graph. We also show how to enrich the interpretation of the distance value with enterprise pattern detection in order to refine the suggested modifications to the code.
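A small sketch of the distance notion in this abstract: application elements are nodes of a dependency graph, each element is assigned to a layer, and the distance between elements (or between layers) is a shortest-path length in that graph. The class names, dependencies, and layer assignment below are hypothetical.

```python
import networkx as nx
from itertools import product

deps = nx.Graph()
deps.add_edges_from([
    ("OrderServlet", "OrderService"),
    ("OrderService", "OrderDAO"),
    ("OrderDAO", "OrderEntity"),
    ("CustomerServlet", "CustomerService"),
    ("CustomerService", "OrderService"),
])

layers = {
    "presentation": ["OrderServlet", "CustomerServlet"],
    "business":     ["OrderService", "CustomerService"],
    "data":         ["OrderDAO", "OrderEntity"],
}

def element_distance(a, b):
    return nx.shortest_path_length(deps, a, b)

def layer_distance(l1, l2):
    """Average shortest-path distance over all reachable element pairs of two layers."""
    pairs = [(a, b) for a, b in product(layers[l1], layers[l2]) if nx.has_path(deps, a, b)]
    return sum(element_distance(a, b) for a, b in pairs) / len(pairs)

print(element_distance("OrderServlet", "OrderEntity"))       # path length through the layers
print(round(layer_distance("presentation", "data"), 2))
```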
Abstract:
This paper addresses fully automatic segmentation of hip CT images with the goal of preserving the joint structure for clinical applications in hip disease diagnosis and treatment. For this purpose, we propose a Multi-Atlas Segmentation Constrained Graph (MASCG) method. The MASCG method uses multi-atlas based mesh fusion results to initialize a bone-sheetness based multi-label graph cut for accurate hip CT segmentation, which has the inherent advantage of automatically separating the pelvic region from the bilateral proximal femoral regions. We then introduce a graph cut constrained graph search algorithm to further improve the segmentation accuracy around the bilateral hip joint regions. Taking manual segmentation as the ground truth, we evaluated the present approach on 30 hip CT images (60 hips) with 15-fold cross validation. Compared to manual segmentation, the average surface distance error was 0.30 mm, 0.29 mm, and 0.30 mm for the pelvis, the left proximal femur, and the right proximal femur, respectively. A closer look at the bilateral hip joint regions showed an average surface distance error of 0.16 mm, 0.21 mm, and 0.20 mm for the acetabulum, the left femoral head, and the right femoral head, respectively.
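The evaluation metric quoted above, the average surface distance between an automatic and a manual segmentation, can be computed from distance transforms of the two binary masks. This is a generic sketch of that metric on toy data (two slightly shifted spheres), not the authors' evaluation code.

```python
import numpy as np
from scipy import ndimage

def surface_voxels(mask):
    """Boundary voxels of a binary mask: foreground voxels with a background neighbour."""
    eroded = ndimage.binary_erosion(mask)
    return mask & ~eroded

def average_surface_distance(seg, ref):
    seg_surf, ref_surf = surface_voxels(seg), surface_voxels(ref)
    # Distance from every voxel to the nearest surface voxel of the other mask.
    dist_to_ref = ndimage.distance_transform_edt(~ref_surf)
    dist_to_seg = ndimage.distance_transform_edt(~seg_surf)
    d1 = dist_to_ref[seg_surf]           # distances from seg surface to ref surface
    d2 = dist_to_seg[ref_surf]           # and the symmetric direction
    return (d1.sum() + d2.sum()) / (len(d1) + len(d2))

# Toy example: two spheres shifted by one voxel stand in for automatic vs. manual masks.
z, y, x = np.ogrid[:64, :64, :64]
seg = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 20 ** 2
ref = (z - 33) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 20 ** 2
print("average surface distance (voxels):", round(average_surface_distance(seg, ref), 3))
```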
Abstract:
Subgrid processes occur in various ecosystems and landscapes but, because of their small scale, they are not represented or poorly parameterized in climate models. These local heterogeneities are often important or even fundamental for energy and carbon balances. This is especially true for northern peatlands and in particular for the polygonal tundra, where methane emissions are strongly influenced by spatial soil heterogeneities. We present a stochastic model for the surface topography of polygonal tundra using Poisson-Voronoi diagrams and we compare the results with available recent field studies. We analyze seasonal dynamics of water table variations and the landscape response under different scenarios of precipitation income. We upscale methane fluxes by using a simple idealized model for methane emission. Hydraulic interconnectivities and large-scale drainage may also be investigated through percolation properties and thresholds in the Voronoi graph. The model captures the main statistical characteristics of the landscape topography, such as polygon area and surface properties as well as the water balance. This approach enables us to statistically relate large-scale properties of the system to the main small-scale processes within the single polygons.
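A minimal sketch of the geometric core of such a model: polygon centres drawn from a Poisson point process, a Voronoi tessellation over them, and the neighbourhood graph on which connectivity and percolation questions can be asked. The hydrology and methane modules of the actual model are not reproduced; the intensity and "wet" probability are illustrative assumptions.

```python
import numpy as np
import networkx as nx
from scipy.spatial import Voronoi

rng = np.random.default_rng(1)

# Poisson point process on the unit square: Poisson-distributed count, uniform positions.
intensity = 200.0
n_points = rng.poisson(intensity)
points = rng.random((n_points, 2))

vor = Voronoi(points)

# Neighbourhood graph: two polygons are adjacent when their Voronoi cells share a ridge.
G = nx.Graph()
G.add_nodes_from(range(n_points))
G.add_edges_from(vor.ridge_points)

# A crude percolation-style experiment: keep each polygon "wet" with probability p and
# inspect the largest connected wet cluster.
p = 0.55
wet = [v for v in G.nodes if rng.random() < p]
largest = max(nx.connected_components(G.subgraph(wet)), key=len, default=set())
print(f"{n_points} polygons, largest wet cluster: {len(largest)}")
```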
Abstract:
In this paper, we use the quantum Jensen-Shannon divergence as a means to establish the similarity between a pair of graphs and to develop a novel graph kernel. In quantum theory, the quantum Jensen-Shannon divergence is defined as a distance measure between quantum states. In order to compute the quantum Jensen-Shannon divergence between a pair of graphs, we first need to associate a density operator with each of them. Hence, we simulate the evolution of a continuous-time quantum walk on each graph and propose a way to associate a suitable quantum state with it. With the density operator of this quantum state to hand, the graph kernel is defined as a function of the quantum Jensen-Shannon divergence between the graph density operators. We evaluate the performance of our kernel on several standard graph datasets from bioinformatics. We use Principal Component Analysis (PCA) on the kernel matrix to embed the graphs into a feature space for classification. The experimental results demonstrate the effectiveness of the proposed approach. © 2013 Springer-Verlag.
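A compact sketch of the quantities involved: a continuous-time quantum walk exp(-iLt) applied to a uniform initial state, a density operator obtained by averaging the walk's pure states over a few time steps (one simple choice; the paper's exact construction may differ), and the quantum Jensen-Shannon divergence between two such operators. The example graphs are illustrative and must have the same number of vertices for the direct comparison below.

```python
import numpy as np
import networkx as nx
from scipy.linalg import expm

def walk_density_operator(G, times=np.linspace(0.1, 5.0, 50)):
    L = nx.laplacian_matrix(G).toarray().astype(complex)
    n = G.number_of_nodes()
    psi0 = np.ones(n, dtype=complex) / np.sqrt(n)           # uniform initial state
    rho = np.zeros((n, n), dtype=complex)
    for t in times:
        psi = expm(-1j * L * t) @ psi0                       # CTQW evolution at time t
        rho += np.outer(psi, psi.conj())                     # accumulate |psi><psi|
    return rho / len(times)

def von_neumann_entropy(rho):
    eigs = np.linalg.eigvalsh(rho).clip(min=0)
    eigs = eigs[eigs > 1e-12]
    return float(-(eigs * np.log2(eigs)).sum())

def qjsd(rho, sigma):
    """Quantum Jensen-Shannon divergence: S((rho+sigma)/2) - (S(rho)+S(sigma))/2."""
    return von_neumann_entropy((rho + sigma) / 2) - 0.5 * (
        von_neumann_entropy(rho) + von_neumann_entropy(sigma))

G1, G2 = nx.cycle_graph(6), nx.path_graph(6)
print("QJSD(C6, P6):", round(qjsd(walk_density_operator(G1), walk_density_operator(G2)), 4))
```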
Abstract:
One of the most fundamental problems that we face in the graph domain is that of establishing the similarity, or alternatively the distance, between graphs. In this paper, we address the problem of measuring the similarity between attributed graphs. In particular, we propose a novel way to measure the similarity through the evolution of a continuous-time quantum walk. Given a pair of graphs, we create a derived structure whose degree of symmetry is maximum when the original graphs are isomorphic, and where a subset of the edges is labeled with the similarity between the respective nodes. With this compositional structure to hand, we compute the density operators of the quantum systems representing the evolution of two suitably defined quantum walks. We define the similarity between the two original graphs as the quantum Jensen-Shannon divergence between these two density operators, and then show how to build a novel kernel on attributed graphs based on the proposed similarity measure. We perform an extensive experimental evaluation on both synthetic and real-world data, which shows the effectiveness of the proposed approach. © 2013 Springer-Verlag.
Abstract:
The study of the user scheduling problem in a Low Earth Orbit (LEO) Multi-User MIMO system is the objective of this thesis. With cutting-edge digital beamforming algorithms, a LEO satellite carrying an antenna array with a large number of antenna elements can serve many user terminals (UTs) in full frequency reuse (FFR) schemes. Since the UTs on the ground greatly outnumber the transmit antennas on the satellite, user scheduling is necessary. Scheduling can be accomplished by grouping users into different clusters: users within the same cluster are multiplexed and served together via Space Division Multiple Access (SDMA), i.e., digital beamforming or Multi-User MIMO techniques, while the different clusters of users are served in different time slots via Time Division Multiple Access (TDMA). Designing an optimal user grouping strategy is known to be an NP-complete problem that can be solved only through exhaustive search. In this thesis, we provide a graph-based user scheduling and feed space beamforming architecture for the downlink with the aim of reducing inter-beam interference between users. The main idea is to cluster users whose pairwise great-circle distance is as large as possible. First, we create a graph in which the users are the vertices and an edge between two users exists if their great-circle distance is above a certain threshold. Second, we develop a low-complexity greedy user clustering technique that iteratively searches for the maximum clique in the graph, i.e., the largest fully connected subgraph. Finally, using three power normalization techniques, a Minimum Mean Square Error (MMSE) beamforming matrix is applied on a per-cluster basis. The proposed scheduling system is compared with a position-based scheduler, which generates a beam lattice on the ground and randomly selects one user per beam to form a cluster.
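A sketch of the graph construction and greedy clustering step described above: user terminals become vertices, an edge joins two users whose great-circle distance exceeds a threshold, and clusters are formed by repeatedly extracting a greedily grown clique. The user positions, threshold, and Earth radius are illustrative assumptions, and the MMSE beamforming stage is not included.

```python
import numpy as np
import networkx as nx

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in kilometres."""
    phi1, phi2, dphi, dlmb = map(np.radians, (lat1, lat2, lat2 - lat1, lon2 - lon1))
    a = np.sin(dphi / 2) ** 2 + np.cos(phi1) * np.cos(phi2) * np.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

rng = np.random.default_rng(7)
users = list(zip(rng.uniform(35, 55, 60), rng.uniform(-10, 20, 60)))   # (lat, lon) pairs
threshold_km = 400.0

# Compatibility graph: an edge means the two users are far enough apart to share a cluster.
G = nx.Graph()
G.add_nodes_from(range(len(users)))
for i in range(len(users)):
    for j in range(i + 1, len(users)):
        if great_circle_km(*users[i], *users[j]) > threshold_km:
            G.add_edge(i, j)

# Greedy clustering: grow a clique, schedule it as one SDMA cluster, remove it, repeat (TDMA).
clusters, remaining = [], G.copy()
while remaining.number_of_nodes():
    seed = max(remaining.nodes, key=remaining.degree)
    clique = [seed]
    for v in sorted(remaining.nodes, key=remaining.degree, reverse=True):
        if v != seed and all(remaining.has_edge(v, u) for u in clique):
            clique.append(v)
    clusters.append(clique)
    remaining.remove_nodes_from(clique)

print(f"{len(users)} users grouped into {len(clusters)} clusters (TDMA slots)")
```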
Abstract:
The consumption of dietary supplements is highest among athletes, and it can represent a potential health risk for consumers. The aim of this study was to determine the prevalence of dietary supplement consumption among road runners. We interviewed 817 volunteers from four road races in the Brazilian running calendar. The sample consisted of 671 male and 146 female runners with a mean age of 37.9 ± 12.4 years. Of the sample, 28.33% reported having used some type of dietary supplement. The main motivations for this consumption were to increase stamina and improve performance. The probability of consuming dietary supplements increased 4.67 times when the runners were guided by coaches. The consumption of supplements was strongly correlated (r = 0.97) with weekly running distance, and also highly correlated (r = 0.86) with the number of years the sport had been practiced. The longer the runner had practiced the sport, the higher the training volume and the greater the intake of supplements. The five most frequently cited reasons for consumption were: energy enhancement (29.5%), performance improvement (17.1%), increased endurance (10.3%), nutrient replacement (11.1%), and avoidance of fatigue (10.3%). About 30% of the consumers declared more than one reason for taking dietary supplements. The most consumed supplements were carbohydrates (52.17%), vitamins (28.70%), and proteins (13.48%). Supplement consumption by road runners in Brazil appeared to be guided by the energy-boosting properties of the supplements, the influence of coaches, and the experience of the user. The amount of supplement intake seemed to be lower among road runners than among athletes of other sports. We recommend that coaches and nutritionists emphasise that a balanced diet can meet the needs of physically active people.