18 results for Distance hereditary graphs

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

30.00%

Abstract:

A geodesic in a graph G is a shortest path between two vertices of G. For a specific function e(n) of n, we define an almost geodesic cycle C in G to be a cycle in which for every two vertices u and v in C, the distance d_G(u, v) is at least d_C(u, v) - e(n). Let ω(n) be any function tending to infinity with n. We consider a random d-regular graph on n vertices. We show that almost all pairs of vertices belong to an almost geodesic cycle C with e(n) = log_{d-1} log_{d-1} n + ω(n) and |C| = 2 log_{d-1} n + O(ω(n)). Along the way, we obtain results on near-geodesic paths. We also give the limiting distribution of the number of geodesics between two random vertices in this random graph. (C) 2010 Wiley Periodicals, Inc. J Graph Theory 66: 115-136, 2011
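As a minimal illustration of the definitions above (not of the paper's probabilistic analysis), the sketch below computes d_G via breadth-first search and checks the almost-geodesic condition d_G(u, v) >= d_C(u, v) - e on a toy graph; the adjacency-list encoding and function names are my own.

```python
from collections import deque

def bfs_distances(adj, source):
    """Breadth-first search: returns d_G(source, v) for every reachable v."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

def is_almost_geodesic_cycle(adj, cycle, e):
    """Check the condition from the abstract: for all u, v on the cycle,
    d_G(u, v) >= d_C(u, v) - e, where d_C is distance along the cycle."""
    k = len(cycle)
    for i in range(k):
        dist = bfs_distances(adj, cycle[i])
        for j in range(k):
            d_c = min((j - i) % k, (i - j) % k)  # distance along the cycle
            if dist[cycle[j]] < d_c - e:
                return False
    return True

# The 4-cycle C4: every cycle distance equals the graph distance.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(is_almost_geodesic_cycle(adj, [0, 1, 2, 3], 0))  # True
```

With e = 0 the condition says the cycle is exactly geodesic, which holds for C4 itself.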

Relevance:

20.00%

Abstract:

The microstructure of the crestal alveolar bone is important for both the maintenance of osseointegration and the location of the gingival soft tissues. The aim of this study was to evaluate and compare the bone microstructure of the alveolar bone and of the interimplant bone in implants inserted at different interimplant distances. The mandibular bilateral premolars of six dogs were extracted, and after 12 weeks, each dog received eight implants, for a total of 48 implants. Two pairs of implants, one for each hemiarch, were separated by 2 mm (group 1) and by 3 mm (group 2). After 12 weeks, the implants received temporary acrylic prostheses. After four more weeks, metallic crowns substituted the temporary prostheses. After an additional 8 weeks the animals were sacrificed and the hemimandibles were removed, dissected, and processed. The longitudinal collagen fiber orientation was 43.2% for the alveolar bone; it was 30.3% for the 2-mm group and 43.9% for the 3-mm group. There was a statistically significant difference between the 2-mm and 3-mm groups (p < .05). The orientation of transverse collagen fibers was 47.8% for the alveolar bone; it was 37.3% for the 2-mm group and 56.3% for the 3-mm group. There was a statistically significant difference between the 2-mm and 3-mm groups (p < .05). The marrow spaces were 34.87% for the alveolar bone, 52.3% for the 2-mm group, and 59.9% for the 3-mm group. There was a statistically significant difference between the alveolar bone and the 3-mm group (p < .05). The low mineral density index was 36.29 for the alveolar bone, 46.76 for the 2-mm group, and 17.91 for the 3-mm group. There was a statistically significant difference between the 2-mm and 3-mm groups (p < .05). The high mineral density index was 87.57 for the alveolar bone, 72.58 for the 2-mm group, and 84.91 for the 3-mm group. There was a statistically significant difference between the alveolar bone and the 2-mm group (p < .05).
The collagen fiber orientation resulted in statistically significant differences in both the 2-mm and 3-mm groups compared with the alveolar bone. The marrow spaces appeared significantly increased in the 3-mm group compared with the alveolar bone. The low mineral density index was significantly higher in the 2-mm group, while the high mineral density index was significantly higher in the alveolar bone. In conclusion, the interimplant distance should not be less than 3 mm.

Relevance:

20.00%

Abstract:

Background: Depending on the distance of the laser tip to the dental surface, a specific morphological pattern should be expected. However, there have been limited reports that correlate the Er:YAG irradiation distance with dental morphology. Purpose: To assess the influence of Er:YAG laser irradiation distance on enamel morphology, by means of scanning electron microscopy (SEM). Methods: Sixty human third molars were employed to obtain discs (approximately 1 mm thick) that were randomly assigned to six groups (n = 10). Five groups received Er:YAG laser irradiation (80 mJ/2 Hz) for 20 s, according to the irradiation distance: 11, 12, 14, 16, or 17 mm; the control group was treated with 37% phosphoric acid for 15 s. The laser-irradiated discs were bisected. One hemi-disc was separated for superficial analysis without subsequent acid etching, and the other one received phosphoric acid for 15 s. Samples were prepared for SEM. Results: Laser irradiation at 11 and 12 mm provided an evident ablation of enamel, with evident fissures and some fused areas. At 14, 16 and 17 mm the superficial topography was flatter than at the other distances. Subsequent acid etching of the lased surface partially removed the disorganized tissue. Conclusions: The Er:YAG laser in defocused mode promoted slight morphological alterations and seems more suitable for enamel conditioning than focused irradiation. The application of phosphoric acid on the lased enamel surface, regardless of the irradiation distance, decreased the superficial irregularities.

Relevance:

20.00%

Abstract:

The aim of this study was to assess in vitro the influence of Er:YAG laser irradiation distance on the shear strength of the bond between an adhesive restorative system and primary dentin. A total of 60 crowns of primary molars were embedded in acrylic resin and mechanically ground to expose a flat dentin surface and were randomly assigned to six groups (n = 10). The control group was etched with 37% phosphoric acid. The remaining five groups were irradiated (80 mJ, 2 Hz) at different irradiation distances (11, 12, 16, 17 and 20 mm), followed by acid etching. An adhesive agent (Single Bond) was applied to the bonding sites, and resin cylinders (Filtek Z250) were prepared. The shear bond strength tests were performed in a universal testing machine (0.5 mm/min). Data were submitted to statistical analysis using one-way ANOVA and the Kruskal-Wallis test (p < 0.05). The mean shear bond strengths were: 7.32 +/- 3.83, 5.07 +/- 2.62, 6.49 +/- 1.64, 7.71 +/- 0.66, 7.33 +/- 0.02, and 9.65 +/- 2.41 MPa in the control group and the groups irradiated at 11, 12, 16, 17, and 20 mm, respectively. The differences between the bond strengths in groups II and IV and between the bond strengths in groups II and VI were statistically significant (p < 0.05). Increasing the laser irradiation distance resulted in increasing shear strength of the bond to primary dentin.
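The statistical comparison described above can be sketched in pure Python; the samples below are hypothetical stand-ins for three of the groups, not the study's raw measurements, and the F statistic is computed directly from its definition rather than with a statistics package.

```python
# Hypothetical shear-bond-strength samples (MPa); values are illustrative only.
groups = {
    "control": [7.1, 7.5, 7.3, 7.4, 7.3],
    "11 mm":   [5.0, 5.2, 4.9, 5.1, 5.1],
    "20 mm":   [9.6, 9.7, 9.6, 9.7, 9.65],
}

def one_way_anova_F(samples):
    """One-way ANOVA F statistic: between-group variance over within-group
    variance, each divided by its degrees of freedom."""
    all_vals = [x for g in samples for x in g]
    grand = sum(all_vals) / len(all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in samples)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in samples for x in g)
    df_between = len(samples) - 1
    df_within = len(all_vals) - len(samples)
    return (ss_between / df_between) / (ss_within / df_within)

F = one_way_anova_F(list(groups.values()))
print(round(F, 1))
```

A large F (compared against the F distribution with the stated degrees of freedom) corresponds to the p < 0.05 differences reported in the abstract.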

Relevance:

20.00%

Abstract:

Tick-borne encephalitis virus (TBEV) is the most important arboviral agent causing disease of the central nervous system in central Europe. In this study, 61 TBEV E gene sequences, derived from 48 isolates from the Czech Republic and from four isolates and nine TBEV strains detected in ticks from Germany, covering more than half a century from 1954 to 2009, were sequenced and subjected to phylogenetic and Bayesian phylodynamic analysis to determine the phylogeography of TBEV in central Europe. The general Eurasian continental east-to-west pattern of the spread of TBEV was confirmed at the regional level but is interlaced with spreading that arises because of local geography and anthropogenic influence. This spread is reflected by the disease pattern in the Czech Republic that has been observed since 1991. The overall evolutionary rate was estimated to be approximately 8 × 10^-4 substitutions per nucleotide per year. The analysis of the TBEV E genes of 11 strains isolated at one natural focus in Zďár Kaplice proved for the first time that TBEV is indeed subject to local evolution.

Relevance:

20.00%

Abstract:

Texture is one of the most important visual attributes for image analysis. It has been widely used in image analysis and pattern recognition. A partially self-avoiding deterministic walk has recently been proposed as an approach for texture analysis with promising results. This approach uses walkers (called tourists) to exploit the gray scale image contexts in several levels. Here, we present an approach to generate graphs out of the trajectories produced by the tourist walks. The generated graphs embody important characteristics related to tourist transitivity in the image. Computed from these graphs, the statistical position (degree mean) and dispersion (entropy of two vertices with the same degree) measures are used as texture descriptors. A comparison with traditional texture analysis methods is performed to illustrate the high performance of this novel approach. (C) 2011 Elsevier Ltd. All rights reserved.
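A minimal sketch of the tourist walk itself, under common assumptions (4-connected neighbourhood, memory of the last mu visited pixels, move to the neighbour with the closest gray value); the image and parameter values are illustrative only, and the construction of graphs and descriptors from the trajectories is omitted.

```python
def tourist_walk(image, start, mu, max_steps=100):
    """Deterministic partially self-avoiding walk: at each step, move to the
    neighbouring pixel with the closest gray value, avoiding pixels visited
    in the last `mu` steps (the walker's memory)."""
    rows, cols = len(image), len(image[0])
    path = [start]
    r, c = start
    for _ in range(max_steps):
        recent = set(path[-mu:]) if mu > 0 else set()
        best = None
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in recent:
                diff = abs(image[nr][nc] - image[r][c])
                if best is None or diff < best[0]:
                    best = (diff, (nr, nc))
        if best is None:
            break  # walker is trapped
        r, c = best[1]
        path.append((r, c))
    return path

# Toy 3x3 gray-scale image: a dark region (10-14) and a bright one (90-95).
image = [[10, 12, 90],
         [11, 95, 94],
         [13, 14, 92]]
print(tourist_walk(image, (0, 0), mu=2, max_steps=5))
```

The walker first follows the low-contrast dark region before crossing into the bright one, which is the context-sensitivity the descriptors exploit.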

Relevance:

20.00%

Abstract:

The shuttle radar topography mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north latitude and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (~30 m) for areas within the USA and at 3 arc-second (~90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will lead to a surface with coherence of angular properties (i.e. slope, aspect) between neighbouring pixels, which is an important characteristic when dealing with terrain analysis. This work intends to show how the proper adjustment of variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in interpolation, can be set to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in the western USA, including different adjustment schemes (changes in nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with the national elevation dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighted (IDW)).
The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, due to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram at short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid punctual artifacts generated by predicted points with the same location as original data points; and (iii) using a small nugget effect value, to avoid smoothing that can obliterate terrain features. Drainages derived from the surfaces interpolated by kriging and by splines show good agreement with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and splines are very similar, we consider the results produced by kriging superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
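Of the interpolators compared above, inverse distance weighting is the simplest to sketch; the version below also restricts the search to a small neighbourhood of the predicted point, echoing concept (i). Kriging itself would additionally require fitting a variogram, which is omitted here; sample coordinates and values are invented for the example.

```python
def idw(samples, x, y, power=2, radius=2.0):
    """Inverse distance weighted interpolation, restricted to data points
    within `radius` of the predicted location (x, y)."""
    num = den = 0.0
    for (sx, sy, z) in samples:
        d2 = (sx - x) ** 2 + (sy - y) ** 2
        if d2 == 0:
            return z                       # exact hit on a data point
        if d2 <= radius ** 2:
            w = 1.0 / d2 ** (power / 2)    # weight = 1 / distance**power
            num += w * z
            den += w
    return num / den if den else None      # None: no data in the neighbourhood

# Four elevation samples on a unit grid; predict the centre point.
pts = [(0, 0, 100.0), (1, 0, 110.0), (0, 1, 120.0), (1, 1, 130.0)]
print(idw(pts, 0.5, 0.5))  # equidistant samples -> plain average, 115.0
```

Because all four samples are equidistant from the centre, the weights are equal and IDW reduces to the arithmetic mean, a useful sanity check.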

Relevance:

20.00%

Abstract:

2D electrophoresis is a well-known method for protein separation which is extremely useful in the field of proteomics. Each spot in the image represents a protein accumulation and the goal is to perform a differential analysis between pairs of images to study changes in protein content. It is thus necessary to register two images by finding spot correspondences. Although it may seem a simple task, generally, the manual processing of this kind of images is very cumbersome, especially when strong variations between corresponding sets of spots are expected (e.g. strong non-linear deformations and outliers). In order to solve this problem, this paper proposes a new quadratic assignment formulation together with a correspondence estimation algorithm based on graph matching which takes into account the structural information between the detected spots. Each image is represented by a graph and the task is to find a maximum common subgraph. Successful experimental results using real data are presented, including an extensive comparative performance evaluation with ground-truth data. (C) 2010 Elsevier B.V. All rights reserved.
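For a handful of spots, the pairwise-distance-preservation idea behind the quadratic assignment formulation can be illustrated with brute force (the paper's graph-matching algorithm scales far beyond this); the point coordinates below are invented for the example.

```python
from itertools import permutations
from math import hypot

def match_spots(A, B):
    """Tiny brute-force analogue of quadratic assignment: choose the
    bijection A -> B that best preserves pairwise inter-spot distances.
    Feasible only for a few spots; real gel images need graph matching."""
    n = len(A)
    def d(p, q):
        return hypot(p[0] - q[0], p[1] - q[1])
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        # Structural cost: total distortion of pairwise distances.
        cost = sum(abs(d(A[i], A[j]) - d(B[perm[i]], B[perm[j]]))
                   for i in range(n) for j in range(i + 1, n))
        if cost < best_cost:
            best_perm, best_cost = perm, cost
    return best_perm

A = [(0, 0), (10, 0), (0, 5)]
B = [(2, 1), (12, 1), (2, 6)]   # A translated by (2, 1), same spot order
print(match_spots(A, B))  # (0, 1, 2): translation preserves all distances
```

Because the cost depends only on pairwise distances, the matching is invariant to the global translation between the two spot sets, which is exactly why structural information helps with strong deformations.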

Relevance:

20.00%

Abstract:

We present parallel algorithms on the BSP/CGM model, with p processors, to count and generate all the maximal cliques of a circle graph with n vertices and m edges. To count the number of all the maximal cliques, without actually generating them, our algorithm requires O(log p) communication rounds with O(nm/p) local computation time. We also present an algorithm to generate the first maximal clique in O(log p) communication rounds with O(nm/p) local computation, and to generate each one of the subsequent maximal cliques this algorithm requires O(log p) communication rounds with O(m/p) local computation. The maximal cliques generation algorithm is based on generating all maximal paths in a directed acyclic graph, and we present an algorithm for this problem that uses O(log p) communication rounds with O(m/p) local computation for each maximal path. We also show that the presented algorithms can be extended to the CREW PRAM model.
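For context, the classical sequential enumeration of all maximal cliques (Bron-Kerbosch) can be sketched as follows; the paper's contribution is the parallel BSP/CGM version for circle graphs, which this baseline does not attempt to reproduce.

```python
def maximal_cliques(adj):
    """Sequential Bron-Kerbosch enumeration of all maximal cliques.
    R: current clique, P: candidates, X: already-processed vertices."""
    cliques = []
    def bk(R, P, X):
        if not P and not X:
            cliques.append(sorted(R))   # R is maximal
            return
        for v in list(P):
            bk(R | {v}, P & adj[v], X & adj[v])
            P = P - {v}
            X = X | {v}
    bk(set(), set(adj), set())
    return sorted(cliques)

# A triangle 0-1-2 with a pendant edge 2-3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(maximal_cliques(adj))  # [[0, 1, 2], [2, 3]]
```

The recursion visits each maximal clique exactly once; the X set prevents re-reporting cliques that were already extended earlier in the search.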

Relevance:

20.00%

Abstract:

There has been great interest in deciding whether a combinatorial structure satisfies some property, or in estimating the value of some numerical function associated with this combinatorial structure, by considering only a randomly chosen substructure of sufficiently large, but constant size. These problems are called property testing and parameter testing, where a property or parameter is said to be testable if it can be estimated accurately in this way. The algorithmic appeal is evident, as, conditional on sampling, this leads to reliable constant-time randomized estimators. Our paper addresses property testing and parameter testing for permutations in a subpermutation perspective; more precisely, we investigate permutation properties and parameters that can be well approximated based on a randomly chosen subpermutation of much smaller size. In this context, we use a theory of convergence of permutation sequences developed by the present authors [C. Hoppen, Y. Kohayakawa, C.G. Moreira, R.M. Sampaio, Limits of permutation sequences through permutation regularity, Manuscript, 2010, 34pp.] to characterize testable permutation parameters along the lines of the work of Borgs et al. [C. Borgs, J. Chayes, L. Lovász, V.T. Sós, B. Szegedy, K. Vesztergombi, Graph limits and parameter testing, in: STOC'06: Proceedings of the 38th Annual ACM Symposium on Theory of Computing, ACM, New York, 2006, pp. 261-270.] in the case of graphs. Moreover, we obtain a permutation result in the direction of a famous result of Alon and Shapira [N. Alon, A. Shapira, A characterization of the (natural) graph properties testable with one-sided error, SIAM J. Comput. 37 (6) (2008) 1703-1727.] stating that every hereditary graph property is testable. (C) 2011 Elsevier B.V. All rights reserved.
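The sampling idea, estimating a permutation parameter from randomly chosen subpermutations, can be illustrated on a toy parameter: the density of the pattern 21 (inverted pairs). The estimator and test permutations below are illustrative, not the paper's construction.

```python
import random

def pattern_21_density(perm, samples=2000, rng=None):
    """Estimate the probability that a random pair of positions carries the
    pattern 21 (an inversion), by sampling size-2 subpermutations."""
    rng = rng or random.Random(0)   # fixed seed for reproducibility
    n = len(perm)
    hits = 0
    for _ in range(samples):
        i, j = sorted(rng.sample(range(n), 2))
        hits += perm[i] > perm[j]
    return hits / samples

identity = list(range(100))
reverse = identity[::-1]
print(pattern_21_density(identity))  # 0.0: the identity has no inversions
print(pattern_21_density(reverse))   # 1.0: every pair is inverted
```

The estimator runs in constant time per sample regardless of the permutation's length, which is exactly the appeal of testability described above.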

Relevance:

20.00%

Abstract:

The assessment of routing protocols for mobile wireless networks is a difficult task, because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay tolerant networks (DTNs), have more predictable dynamics, as the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph theoretic model, the evolving graph, was proposed to help capture the dynamic behavior of such networks, in view of the construction of least cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, the use of these theoretical results in practical situations has not been studied. Therefore, the objective of our work is to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator to first implement an evolving graph based routing protocol, and then to use it as a benchmark when comparing the four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, with predictable dynamics at least. In order to make this model widely applicable, however, some practical issues still have to be addressed and incorporated into the model, such as adaptive algorithms. We also discuss these issues in this paper, as a result of our experience.
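A minimal sketch of least-cost (earliest-arrival) routing on an evolving graph, assuming a contact-list encoding (u, v, t) meaning the edge u-v is available at time step t, and unit hop time; this illustrates the model only, not the NS2 protocol implementation used in the paper.

```python
def earliest_arrival(contacts, source, target, t0=0):
    """Earliest-arrival journey in an evolving graph. A journey may wait
    at a node for a future contact; each hop takes one time step.
    Processing contacts in time order makes a single pass sufficient."""
    arrival = {source: t0}
    for (u, v, t) in sorted(contacts, key=lambda c: c[2]):
        for a, b in ((u, v), (v, u)):   # contacts are bidirectional
            if a in arrival and arrival[a] <= t:
                if b not in arrival or t + 1 < arrival[b]:
                    arrival[b] = t + 1
    return arrival.get(target)          # None if unreachable in time

# Edge 0-1 exists at t=0, edge 1-2 only at t=5: the journey waits at node 1.
contacts = [(0, 1, 0), (1, 2, 5)]
print(earliest_arrival(contacts, 0, 2))  # 6
```

Note the asymmetry such dynamics introduce: from node 2 there is no journey back to node 0, because the 0-1 contact has already expired by the time node 1 is reached.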

Relevance:

20.00%

Abstract:

Let M = (V, E, A) be a mixed graph with vertex set V, edge set E and arc set A. A cycle cover of M is a family C = {C_1, ..., C_k} of cycles of M such that each edge/arc of M belongs to at least one cycle in C. The weight of C is Σ_{i=1}^{k} |C_i|. The minimum cycle cover problem is the following: given a strongly connected mixed graph M without bridges, find a cycle cover of M with weight as small as possible. The Chinese postman problem is: given a strongly connected mixed graph M, find a minimum length closed walk using all edges and arcs of M. These problems are NP-hard. We show that they can be solved in polynomial time if M has bounded tree-width. (C) 2008 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Abstract:

A bipartite graph G = (V, W, E) is convex if there exists an ordering of the vertices of W such that, for each v ∈ V, the neighbors of v are consecutive in W. We describe both a sequential and a BSP/CGM algorithm to find a maximum independent set in a convex bipartite graph. The sequential algorithm improves over the running time of the previously known algorithm and the BSP/CGM algorithm is a parallel version of the sequential one. The complexity of the algorithms does not depend on |W|.
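As a generic correctness baseline (not the paper's algorithm), the maximum independent set size in any bipartite graph can be computed via König's theorem, alpha = |V| + |W| - (maximum matching); the convex structure that makes the paper's algorithms faster, and independent of |W|, is not exploited here. Vertex names are invented for the example.

```python
def max_matching(adj, left):
    """Augmenting-path bipartite matching; adj maps each left vertex to
    its list of right neighbours."""
    match = {}                      # right vertex -> matched left vertex
    def augment(u, seen):
        for w in adj[u]:
            if w in seen:
                continue
            seen.add(w)
            # Take w if free, or re-route its current partner.
            if w not in match or augment(match[w], seen):
                match[w] = u
                return True
        return False
    return sum(augment(u, set()) for u in left)

def max_independent_set_size(adj, left, right):
    """Konig's theorem: in a bipartite graph, the maximum independent set
    has size |V| + |W| minus the size of a maximum matching."""
    return len(left) + len(right) - max_matching(adj, left)

# A convex bipartite graph: each v's neighbourhood is an interval of W.
adj = {"v0": ["w0", "w1"], "v1": ["w1", "w2"], "v2": ["w2"]}
print(max_independent_set_size(adj, ["v0", "v1", "v2"], ["w0", "w1", "w2"]))  # 3
```

Here a perfect matching of size 3 exists, so the maximum independent set has size 6 - 3 = 3 (for instance, all of W).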

Relevance:

20.00%

Abstract:

We consider the problems of finding the maximum number of vertex-disjoint triangles (VTP) and edge-disjoint triangles (ETP) in a simple graph. Both problems are NP-hard. The algorithm with the best approximation ratio known so far for these problems has ratio 3/2 + ε, a result that follows from a more general algorithm for set packing obtained by Hurkens and Schrijver [On the size of systems of sets every t of which have an SDR, with an application to the worst-case ratio of heuristics for packing problems, SIAM J. Discrete Math. 2(1) (1989) 68-72]. We present improvements on the approximation ratio for restricted cases of VTP and ETP that are known to be APX-hard: we give an approximation algorithm for VTP on graphs with maximum degree 4 with ratio slightly less than 1.2, and for ETP on graphs with maximum degree 5 with ratio 4/3. We also present an exact linear-time algorithm for VTP on the class of indifference graphs. (C) 2007 Elsevier B.V. All rights reserved.
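For contrast with the specialized ratios above, a plain greedy packing, which only guarantees a maximal solution (each chosen triangle can block at most three optimal ones, so the ratio is no better than 1/3 of optimum in general), can be sketched as follows; the graph and encoding are illustrative.

```python
from itertools import combinations

def greedy_vertex_disjoint_triangles(edges):
    """Greedy baseline for VTP: repeatedly take any triangle whose three
    vertices are still unused. Returns a maximal (not maximum) packing."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    used, triangles = set(), []
    for u, v, w in combinations(sorted(adj), 3):
        if {u, v, w} & used:
            continue                          # shares a vertex with a chosen triangle
        if v in adj[u] and w in adj[u] and w in adj[v]:
            triangles.append((u, v, w))
            used |= {u, v, w}
    return triangles

# Two triangles joined by one edge: greedy finds both.
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
print(greedy_vertex_disjoint_triangles(edges))  # [(0, 1, 2), (3, 4, 5)]
```

The approximation algorithms in the paper improve on this by local search and by exploiting the bounded-degree structure.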

Relevance:

20.00%

Abstract:

In 1983, Chvátal, Trotter and the two senior authors proved that for any Δ there exists a constant B such that, for any n, any 2-colouring of the edges of the complete graph K_N with N >= Bn vertices yields a monochromatic copy of any graph H that has n vertices and maximum degree Δ. We prove that the complete graph may be replaced by a sparser graph G that has N vertices and O(N^{2-1/Δ} log^{1/Δ} N) edges, with N = ⌈B'n⌉ for some constant B' that depends only on Δ. Consequently, the so-called size-Ramsey number of any H with n vertices and maximum degree Δ is O(n^{2-1/Δ} log^{1/Δ} n). Our approach is based on random graphs; in fact, we show that the classical Erdős-Rényi random graph with the numerical parameters above satisfies a stronger partition property with high probability, namely, that any 2-colouring of its edges contains a monochromatic universal graph for the class of graphs on n vertices and maximum degree Δ. The main tool in our proof is the regularity method, adapted to a suitable sparse setting. The novel ingredient developed here is an embedding strategy that allows one to embed bounded degree graphs of linear order in certain pseudorandom graphs. Crucial to our proof is the fact that regularity is typically inherited at a scale that is much finer than the scale at which it is assumed. (C) 2011 Elsevier Inc. All rights reserved.