156 results for Log conformance


Relevance: 10.00%

Abstract:

The boxicity of a graph G, denoted box(G), is the least integer d such that G is the intersection graph of a family of d-dimensional (axis-parallel) boxes. The cubicity, denoted cub(G), is the least d such that G is the intersection graph of a family of d-dimensional unit cubes. An independent set of three vertices is an asteroidal triple if any two are joined by a path avoiding the neighbourhood of the third. A graph is asteroidal triple free (AT-free) if it has no asteroidal triple. The claw number psi(G) is the number of edges in the largest star that is an induced subgraph of G. For an AT-free graph G with chromatic number chi(G) and claw number psi(G), we show that box(G) <= chi(G) and that this bound is sharp. We also show that cub(G) <= box(G)(⌈log2 psi(G)⌉ + 2) <= chi(G)(⌈log2 psi(G)⌉ + 2). If G is an AT-free graph having girth at least 5, then box(G) <= 2, and therefore cub(G) <= 2⌈log2 psi(G)⌉ + 4. (c) 2010 Elsevier B.V. All rights reserved.
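
For illustration, the claw number can be computed by brute force: an induced star K_{1,d} centred at a vertex v corresponds to an independent set of size d inside the neighbourhood of v. A sketch in Python (function and variable names are ours, not from the paper; feasible only for small graphs):

```python
from itertools import combinations

def claw_number(adj):
    """Claw number psi(G): edge count of the largest induced star K_{1,d}.
    adj maps each vertex to the set of its neighbours. Brute force."""
    best = 0
    for v, nbrs in adj.items():
        nbrs = sorted(nbrs)
        # the largest independent subset of N(v) gives the largest star at v
        for d in range(len(nbrs), best, -1):
            hit = False
            for subset in combinations(nbrs, d):
                if all(b not in adj[a] for a, b in combinations(subset, 2)):
                    best, hit = d, True
                    break
            if hit:
                break
    return best

# K_{1,3} (a claw): centre 0 joined to the pairwise non-adjacent 1, 2, 3
claw = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
print(claw_number(claw))  # -> 3
```

On a 4-cycle, each vertex sees two non-adjacent neighbours, so the same routine returns 2.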

Relevance: 10.00%

Abstract:

Search engine log files have been used to gather direct user feedback on the relevance of the documents presented in the results page. Typically, the relative position of the clicks gathered from the log files is used as a proxy for direct user feedback. In this paper we identify reasons why the relative position of clicks is an incomplete signal for deciphering user preferences. Hence, we propose using the time spent by the user in reading a document as an indicator of user preference for that document with respect to a query. We also identify the issues involved in using this time measure and propose means to address them.
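
A minimal sketch of the time-spent idea, assuming a hypothetical log format of (user, query, document, click time) rows ordered within a session; the reading time of a result is approximated by the gap to the user's next click on the same query:

```python
from collections import defaultdict

# Hypothetical click-log rows: (user, query, clicked_doc, click_time_seconds),
# ordered by time within each user session
log = [
    ("u1", "boxicity", "doc_a", 100.0),
    ("u1", "boxicity", "doc_b", 105.0),   # u1 read doc_a for ~5 s
    ("u1", "boxicity", "doc_c", 185.0),   # u1 read doc_b for ~80 s
]

def dwell_times(rows):
    """Approximate the time spent on each clicked result by the gap until the
    same user's next click on the same query (the last click yields no gap)."""
    out = defaultdict(list)
    for (u1, q1, d1, t1), (u2, q2, d2, t2) in zip(rows, rows[1:]):
        if u1 == u2 and q1 == q2:
            out[(q1, d1)].append(t2 - t1)
    return dict(out)

print(dwell_times(log))  # {('boxicity', 'doc_a'): [5.0], ('boxicity', 'doc_b'): [80.0]}
```

A short dwell time (doc_a) would then be read as weaker evidence of preference than a long one (doc_b), independently of click position.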

Relevance: 10.00%

Abstract:

The hydrolysis of the cupric ion has been studied at various ionic strengths (0.01, 0.05, 0.1 and 0.5 M). The results are analyzed employing the 'core + links' theory, a log-log plot, a normalization plot, and an extrapolation method for obtaining the pure mononuclear curve. The stability constants of Cu2(OH)2^2+, Cu3(OH)4^2+, Cu(OH)+ and Cu(OH)2 are reported.

Relevance: 10.00%

Abstract:

Different seismic hazard components pertaining to Bangalore city, namely soil overburden thickness, effective shear-wave velocity, factor of safety against liquefaction potential, peak ground acceleration at the seismic bedrock, site response in terms of amplification factor, and predominant frequency, have been individually evaluated. The overburden thickness distribution, predominantly in the range of 5-10 m in the city, has been estimated through a sub-surface model from geotechnical bore-log data. The effective shear-wave velocity distribution, established through Multi-channel Analysis of Surface Waves (MASW) surveys and subsequent data interpretation through dispersion analysis, exhibits site class D (180-360 m/s), site class C (360-760 m/s), and site class B (760-1500 m/s) in compliance with the National Earthquake Hazards Reduction Program (NEHRP) nomenclature. The peak ground acceleration has been estimated through a deterministic approach, based on a maximum credible earthquake of M_W = 5.1 assumed to nucleate from the closest active seismic source (the Mandya-Channapatna-Bangalore lineament). The 1-D site response factor, computed at each borehole through geotechnical analysis across the study region, ranges from an amplification of about one to as high as four. Correspondingly, the predominant frequency estimated from the Fourier spectrum is found to lie predominantly in the range of 3.5-5.0 Hz. The soil liquefaction hazard has been assessed in terms of the factor of safety against liquefaction potential using standard penetration test data and the underlying soil properties, which indicates that 90% of the study region is non-liquefiable. The spatial distributions of the different hazard entities are placed on a GIS platform and subsequently integrated through the analytic hierarchy process. The resulting deterministic hazard map shows high hazard coverage in the western areas.
The microzonation thus achieved is envisaged as a first-cut assessment of site-specific hazard, laying out a framework for higher-order seismic microzonation as well as a useful decision-support tool in overall land-use planning and hazard management. (C) 2010 Elsevier Ltd. All rights reserved.
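
The layer-integration step rests on the analytic hierarchy process. A minimal sketch of its eigenvector weighting, with a hypothetical pairwise comparison of three hazard layers (the layers chosen and the judgments entered here are illustrative, not the study's actual matrix):

```python
def ahp_weights(M, iters=100):
    """AHP layer weights: normalized principal eigenvector of the pairwise
    comparison matrix M, obtained by power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical comparison of three layers: PGA vs amplification vs liquefaction;
# M[i][j] states how much more important layer i is judged than layer j
M = [
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
]
w = ahp_weights(M)
print([round(x, 3) for x in w])  # -> [0.571, 0.286, 0.143]
```

Because this toy matrix is perfectly consistent, the weights come out exactly in the ratio 4:2:1; real judgment matrices are usually inconsistent, and a consistency-ratio check would precede their use.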

Relevance: 10.00%

Abstract:

An atmospheric radio noise burst represents the radiation received from one complete lightning flash at the frequency to which a receiver is tuned and within the receiver bandwidth. At tropical latitudes, the principal source of interference in the frequency range from 0.1 to 10 MHz is the burst form of atmospheric radio noise. The structure of a burst shows several approximately rectangular pulses of random amplitude, duration and frequency of recurrence. The influence of the noise on data communication can only be examined when the number of pulses of the noise burst crossing a certain amplitude threshold per unit time is known. A pulse rate counter designed for this purpose has been used at Bangalore (12°58′N, 77°35′E) to investigate the pulse characteristics of noise bursts at 3 MHz with a receiver bandwidth of 3.3 kHz (6 dB). The results show that the number of pulses lying in the amplitude range between the peak and quasi-peak values of the noise bursts, and the burst duration corresponding to these pulses, follow log-normal distributions. The pulse rates deduced therefrom show a certain correlation between the number of pulses and the duration of the noise burst. The results are discussed with a view to furnishing the necessary information for data communication.
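
The counter's core operation, counting pulses that cross an amplitude threshold, can be sketched on a sampled envelope (the toy samples below are illustrative; the actual instrument operates on the analog burst):

```python
def pulse_count(samples, threshold):
    """Count pulses in a sampled noise-burst envelope: one pulse is registered
    per upward crossing of the amplitude threshold."""
    count = 0
    above = False
    for x in samples:
        if x >= threshold and not above:
            count += 1          # envelope just rose through the threshold
        above = x >= threshold
    return count

# Toy envelope with three roughly rectangular pulses of differing amplitude
burst = [0, 5, 5, 0, 0, 8, 0, 3, 6, 6, 0]
print(pulse_count(burst, 4))  # -> 3 (the amplitude-3 pulse stays below threshold)
```

Dividing such counts by the burst duration gives the pulse rate whose distribution the paper studies.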

Relevance: 10.00%

Abstract:

We develop an alternate characterization of the statistical distribution of the inter-cell interference power observed in the uplink of CDMA systems. We show that the lognormal distribution better matches the cumulative distribution and complementary cumulative distribution functions of the uplink interference than the conventionally assumed Gaussian distribution and variants based on it. This is in spite of the fact that many users together contribute to uplink interference, with the number of users and their locations both being random. Our observations hold even in the presence of power control and cell selection, which have hitherto been used to justify the Gaussian distribution approximation. The parameters of the lognormal are obtained by matching moments, for which detailed analytical expressions that incorporate wireless propagation, cellular layout, power control, and cell selection parameters are developed. The moment-matched lognormal model, while not perfect, is an order of magnitude better in modeling the interference power distribution.
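
The moment-matching step can be sketched with the standard lognormal moment identities; the mean and variance below are illustrative numbers, not values from the paper's propagation model:

```python
import math

def lognormal_from_moments(mean, var):
    """Match a lognormal's (mu, sigma) to a target mean and variance of the
    interference power: sigma^2 = ln(1 + var/mean^2), mu = ln(mean) - sigma^2/2."""
    sigma2 = math.log(1.0 + var / mean**2)
    mu = math.log(mean) - sigma2 / 2.0
    return mu, math.sqrt(sigma2)

mu, sigma = lognormal_from_moments(mean=2.0, var=3.0)

# Round trip: the matched lognormal reproduces the target moments
m = math.exp(mu + sigma**2 / 2)
v = (math.exp(sigma**2) - 1) * math.exp(2 * mu + sigma**2)
print(round(m, 6), round(v, 6))  # -> 2.0 3.0
```

In the paper the target mean and variance are themselves derived analytically from propagation, layout, power control, and cell selection parameters; this sketch shows only the final matching step.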

Relevance: 10.00%

Abstract:

A linear-time approximate maximum-likelihood decoding algorithm on tail-biting trellises is presented that requires exactly two rounds on the trellis. It is an adaptation of an algorithm proposed earlier, with the advantage that it reduces the time complexity from O(m log m) to O(m), where m is the number of nodes in the tail-biting trellis. A necessary condition for the output of the algorithm to differ from that of the ideal ML decoder is deduced, and simulation results on an AWGN channel using tail-biting trellises for two rate-1/2 convolutional codes, with memory 4 and 6 respectively, are reported.

Relevance: 10.00%

Abstract:

We propose two texture-based approaches, one involving Gabor filters and the other employing log-polar wavelets, for separating text from non-text elements in a document image. Both of the proposed algorithms compute local energy at some information-rich points, which are marked by Harris' corner detector. The advantage of this approach is that the local energy is calculated only at selected points and not throughout the image, saving considerable computation time. The algorithms have been tested on a large set of scanned text pages, and the results are better than those from existing algorithms. Among the proposed schemes, the Gabor filter based scheme marginally outperforms the wavelet based scheme.

Relevance: 10.00%

Abstract:

Tanner graph representation of linear block codes is widely used by iterative decoding algorithms for recovering data transmitted across a noisy communication channel from errors and erasures introduced by the channel. The stopping distance of a Tanner graph T for a binary linear block code C determines the number of erasures correctable using iterative decoding on T when data is transmitted across a binary erasure channel using the code C. We show that the problem of finding the stopping distance of a Tanner graph is hard to approximate within any positive constant approximation ratio in polynomial time unless P = NP. It is also shown as a consequence that there can be no approximation algorithm for the problem achieving an approximation ratio of 2^((log n)^(1-epsilon)) for any epsilon > 0 unless NP ⊆ DTIME(n^(poly(log n))).
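
The quantity in question can be illustrated directly: a stopping set is a nonempty set of variable nodes such that no check node is connected to exactly one of them, and the stopping distance is the size of the smallest such set. A brute-force sketch (our own code, exponential in the block length, so feasible only for tiny codes, consistent with the hardness result):

```python
from itertools import combinations

def stopping_distance(H):
    """Smallest nonempty stopping set of the Tanner graph of parity-check
    matrix H (a list of 0/1 rows): no row of H may intersect the chosen
    set of variable-node columns in exactly one position."""
    n = len(H[0])
    for size in range(1, n + 1):
        for S in combinations(range(n), size):
            if all(sum(row[j] for j in S) != 1 for row in H):
                return size
    return None

# A parity-check matrix of the [7,4] Hamming code
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]
print(stopping_distance(H))  # -> 3 (matches the code's minimum distance here)
```

The support of any codeword is a stopping set, so the stopping distance never exceeds the minimum distance; for this Tanner graph of the Hamming code the two coincide.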

Relevance: 10.00%

Abstract:

Extraction of text areas from document images with complex content and layout is a challenging task. A few texture-based techniques have been proposed for the extraction of such text blocks. Most of these techniques are computationally demanding and hence far from being realizable in real time. In this work, we propose a modification to two of the existing texture-based techniques to reduce the computation, accomplished using the Harris corner detector. The efficiency of these two texture-based algorithms, one based on Gabor filters and the other on log-polar wavelet signatures, is compared. Gabor-feature-based texture classification performed on the smaller set of Harris corner points is observed to deliver both accuracy and efficiency.

Relevance: 10.00%

Abstract:

Conformance testing focuses on checking whether an implementation under test (IUT) behaves according to its specification. Typically, testers are interested in performing targeted tests that exercise certain features of the IUT. This intention is formalized as a test purpose. The tester needs a "strategy" to reach the goal specified by the test purpose. Also, for a particular test case, the strategy should tell the tester whether the IUT has passed, failed, or deviated from the test purpose. In [8], Jeron and Morel show how to compute, for a given finite state machine specification and a test purpose automaton, a complete test graph (CTG) which represents all test strategies. In this paper, we consider the case when the specification is a hierarchical state machine and show how to compute a hierarchical CTG which preserves the hierarchical structure of the specification. We also propose an algorithm for an online test oracle which avoids the space overhead associated with the CTG.

Relevance: 10.00%

Abstract:

Template matching is concerned with measuring the similarity between patterns of two objects. This paper proposes a memory-based reasoning approach for pattern recognition of binary images with a large template set. Memory-based reasoning intrinsically requires a large database, and some binary image recognition problems inherently need large template sets, such as the recognition of Chinese characters, which needs thousands of templates. The proposed algorithm is based on the Connection Machine, the most massively parallel machine to date, using a multiresolution method to search for the matching template. The approach uses the pyramid data structure for the multiresolution representation of the templates and the input image pattern. For a given binary image, it scans the template pyramid searching for the match. A binary image of N × N pixels can be matched in O(log N) time by our algorithm, independently of the number of templates. Implementation of the proposed scheme is described in detail.
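
A minimal sketch of the pyramid idea, assuming OR-downsampling of 2x2 blocks and exact matching level by level (the paper's Connection Machine implementation, similarity measure, and parallel scan are more elaborate; here the template pyramids would normally be precomputed):

```python
def downsample(img):
    """Halve a square binary image by OR-ing each 2x2 block (one pyramid step)."""
    n = len(img)
    return [[img[2*i][2*j] | img[2*i][2*j+1] | img[2*i+1][2*j] | img[2*i+1][2*j+1]
             for j in range(n // 2)] for i in range(n // 2)]

def pyramid(img):
    """Multiresolution pyramid, finest level first, down to 1x1."""
    levels = [img]
    while len(levels[-1]) > 1:
        levels.append(downsample(levels[-1]))
    return levels

def match(templates, img):
    """Coarse-to-fine search: only candidates surviving each coarser level are
    compared at the next finer one, pruning mismatches early. Assumes all
    templates have the same power-of-two size as img."""
    img_pyr = pyramid(img)
    candidates = list(range(len(templates)))
    for level in range(len(img_pyr) - 1, -1, -1):   # coarsest -> finest
        candidates = [t for t in candidates
                      if pyramid(templates[t])[level] == img_pyr[level]]
    return candidates

A = [[1, 0], [0, 1]]
B = [[1, 1], [0, 0]]
print(match([A, B], [[1, 0], [0, 1]]))  # -> [0]
```

With the template set spread one per processor, each level's comparison runs in parallel, which is how the O(log N) match time becomes independent of the number of templates.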

Relevance: 10.00%

Abstract:

The max-coloring problem is to compute a legal coloring of the vertices of a graph G = (V, E) with a non-negative weight function w on V such that Sigma_{i=1}^{k} max_{v in C_i} w(v) is minimized, where C_1, ..., C_k are the color classes. Max-coloring general graphs is as hard as the classical vertex coloring problem, a special case in which vertices have unit weight. In fact, in some cases it can even be harder: for example, no polynomial-time algorithm is known for max-coloring trees. In this paper we consider the problem of max-coloring paths and its generalization, max-coloring a broad class of trees, and show it can be solved in time O(|V| + time for sorting the vertex weights). When vertex weights belong to R, we show a matching lower bound of Omega(|V| log |V|) in the algebraic computation tree model.
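
The objective can be made concrete with a brute-force sketch for tiny paths (our own illustrative code, exponential in the path length; the paper's algorithm is far more efficient). It also shows why an optimal max-coloring may use more colors than a proper coloring needs:

```python
from itertools import product

def max_coloring_cost(weights, coloring):
    """Cost of a coloring: sum over color classes of the heaviest vertex."""
    heaviest = {}
    for v, c in enumerate(coloring):
        heaviest[c] = max(heaviest.get(c, 0), weights[v])
    return sum(heaviest.values())

def best_max_coloring(weights):
    """Brute-force optimal max-coloring of the path v0-v1-...-v(n-1):
    adjacent vertices must get different colors."""
    n = len(weights)
    best = None
    for coloring in product(range(n), repeat=n):
        if all(coloring[i] != coloring[i + 1] for i in range(n - 1)):
            c = max_coloring_cost(weights, coloring)
            best = c if best is None else min(best, c)
    return best

# Path 4-1-1-4: putting the two 4s in one class and each 1 alone costs
# 4+1+1 = 6, beating every proper 2-coloring (cost 8) despite using 3 colors
print(best_max_coloring([4, 1, 1, 4]))  # -> 6
```

So even though paths are 2-colorable, minimizing the max-coloring objective can require extra color classes, which is part of what makes the problem subtle on trees.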

Relevance: 10.00%

Abstract:

Let n points be placed independently in d-dimensional space according to the density f(x) = A_d e^{-lambda ||x||^alpha}, lambda, alpha > 0, x in R^d, d >= 2. Let d_n be the longest edge length of the nearest-neighbor graph on these points. We show that (lambda^{-1} log n)^{1-1/alpha} d_n - b_n converges weakly to the Gumbel distribution, where b_n ~ ((d - 1)/(lambda alpha)) log log n. We also prove the following strong law for the normalized nearest-neighbor distance d~_n = (lambda^{-1} log n)^{1-1/alpha} d_n / log log n: (d - 1)/(alpha lambda) <= liminf_{n->infinity} d~_n <= limsup_{n->infinity} d~_n <= d/(alpha lambda) almost surely. Thus, the exponential rate of decay alpha = 1 is critical, in the sense that, for alpha > 1, d_n -> 0, whereas, for alpha <= 1, d_n -> infinity almost surely as n -> infinity.

Relevance: 10.00%

Abstract:

We have investigated the influence of excess Fe on the electrical transport and magnetism of Fe_{1+y}Te_{0.5}Se_{0.5} (y = 0.04 and 0.09) single crystals. Both compositions exhibit resistively determined superconducting transitions (T_c) with an onset temperature of about 15 K. From the width of the superconducting transition and the magnitude of the lower critical field H_{c1}, it is inferred that excess Fe suppresses superconductivity. The linear and nonlinear responses of the ac susceptibility show that the superconducting state for these compositions is inhomogeneous. A possible origin of this phase separation is magnetic coupling between the excess Fe occupying interstitial sites in the chalcogen planes and the Fe in the Fe-square lattice. The temperature derivative of the resistivity, d(rho)/dT, in the range T_c < T < T_a, with T_a the temperature of a magnetic anomaly, changes from positive to negative with increasing Fe. A log(1/T) divergence of the resistivity above T_c in the sample with the higher amount of Fe suggests disorder-driven electronic localization.