260 results for Fast view-matching algorithm


Relevance:

100.00%

Publisher:

Abstract:

The application of energy minimisation methods for stereo matching has been demonstrated to produce high-quality disparity maps. However, the majority of these methods are computationally expensive, requiring minutes or even hours of computation. We propose a fast minimisation scheme that produces strongly competitive results at a fraction of that cost, requiring only a few seconds of computation. In this paper, we present our iterated dynamic programming algorithm along with a quadtree subregioning process for fast stereo matching.
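
As a rough illustration of the kind of energy minimisation involved, the sketch below runs a Viterbi-style dynamic programme over a single scanline, with an absolute-difference data term and a linear smoothness penalty. It is a generic 1D stereo DP, not the paper's iterated dynamic programming with quadtree subregioning; the cost terms and the max_disp and smooth parameters are illustrative choices.

```python
import numpy as np

def scanline_dp(left_row, right_row, max_disp=16, smooth=2.0):
    """Estimate per-pixel disparity along one scanline by minimising a 1D
    energy (data term + smoothness term) with Viterbi-style dynamic programming."""
    n, D = len(left_row), max_disp + 1
    data = np.full((n, D), 1e9)
    for x in range(n):
        for d in range(min(D, x + 1)):          # right pixel x-d must exist
            data[x, d] = abs(float(left_row[x]) - float(right_row[x - d]))
    acc = data.copy()
    back = np.zeros((n, D), dtype=int)
    for x in range(1, n):
        for d in range(D):
            trans = acc[x - 1] + smooth * np.abs(np.arange(D) - d)
            back[x, d] = int(np.argmin(trans))
            acc[x, d] += trans[back[x, d]]
    disp = np.zeros(n, dtype=int)               # backtrack the minimum-energy path
    disp[-1] = int(np.argmin(acc[-1]))
    for x in range(n - 1, 0, -1):
        disp[x - 1] = back[x, disp[x]]
    return disp

left = np.array([10, 10, 50, 60, 60, 10, 10, 10])
right = np.array([10, 50, 60, 60, 10, 10, 10, 10])   # same scene, shifted by one pixel
print(scanline_dp(left, right, max_disp=3))
```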

Relevance:

100.00%

Publisher:

Abstract:

For determining functional dependencies between two proteins, both represented as 3D structures, an essential condition is that they have one or more matching structural regions called patches. As protein 3D structures are large, complex and constantly evolving, it is computationally expensive and very time-consuming to identify the possible locations and sizes of patches for a given protein against a large protein database. In this paper, we present a vector space based representation for protein structures, where a patch is formed by the vectors within a region. Based on our previous work, a compact representation of the patch, named the patch signature, is applied here. A similarity measure for two patches is then derived from their signatures. To achieve fast patch matching in large protein databases, a match-and-expand strategy is proposed. Given a query patch, a set of small k-sized matching patches, called candidate patches, is generated in the match stage. The candidate patches are further filtered by enlarging k in the expand stage. Our extensive experimental results demonstrate encouraging performance on this biologically critical but previously computationally prohibitive problem.
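
A hedged sketch of the match-and-expand idea follows. The "signature" used here (sorted pairwise distances of a small point set) and the expansion step are illustrative stand-ins, not the paper's patch signature or similarity measure; they only show how small k-sized candidates can be generated cheaply and then re-checked at a larger size.

```python
import numpy as np
from itertools import combinations

def signature(points):
    """Order-independent summary of a small point set: sorted pairwise distances."""
    return np.sort([np.linalg.norm(p - q) for p, q in combinations(points, 2)])

def match_and_expand(query, db_patches, k=3, tol=0.2):
    """Match stage: find patches containing a k-sized subset whose signature is
    close to the query's k-sized signature. Expand stage: re-check survivors at
    the full query size (here, crudely, using their first len(query) points)."""
    q_small_sig = signature(query[:k])
    candidates = []
    for name, patch in db_patches.items():
        for idx in combinations(range(len(patch)), k):
            if np.max(np.abs(signature(patch[list(idx)]) - q_small_sig)) < tol:
                candidates.append(name)
                break
    q_sig = signature(query)
    return [n for n in candidates
            if len(db_patches[n]) >= len(query)
            and np.max(np.abs(signature(db_patches[n][:len(query)]) - q_sig)) < tol]

rng = np.random.default_rng(0)
query = rng.normal(size=(5, 3))
db = {"patchA": np.vstack([query + 0.01, rng.normal(size=(3, 3))]),
      "patchB": rng.normal(size=(8, 3))}
print(match_and_expand(query, db, k=3, tol=0.2))
```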

Relevance:

100.00%

Publisher:

Abstract:

A set of DCT domain properties for shifting and scaling by real amounts, and for taking linear operations such as differentiation, is described. The DCT coefficients of a sampled signal are subjected to a linear transform which returns the DCT coefficients of the shifted, scaled and/or differentiated signal. The properties are derived by considering the inverse discrete transform as a cosine series expansion of the original continuous signal, assuming sampling in accordance with the Nyquist criterion. This approach can be applied in the signal domain to give, for example, DCT-based interpolation or derivatives. The same approach can be taken in decoding from the DCT to give, for example, derivatives in the signal domain. The techniques may prove useful in compressed domain processing applications, and are interesting because they allow operations from the continuous domain, such as differentiation, to be implemented in the discrete domain. An image matching algorithm illustrates the use of the properties, with improvements in computation time and matching quality.
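
The sketch below illustrates the underlying idea for the shift property: the DCT-II coefficients are interpreted as a cosine series expansion of the continuous signal, which can then be evaluated at fractionally shifted sample positions. It uses SciPy's unnormalised DCT-II convention and is a generic illustration rather than the paper's closed-form linear transform.

```python
import numpy as np
from scipy.fft import dct

def dct_shift(x, shift):
    """Evaluate the DCT-II cosine-series expansion of x at fractionally
    shifted sample positions, i.e. a DCT-domain interpolation/shift."""
    N = len(x)
    X = dct(x, type=2)                       # unnormalised DCT-II coefficients
    n = np.arange(N) + shift                 # shifted sample positions
    k = np.arange(1, N)
    basis = np.cos(np.pi * np.outer(k, 2 * n + 1) / (2 * N))
    return (X[0] / 2 + X[1:] @ basis) / N

x = np.sin(2 * np.pi * np.arange(32) / 32)
print(np.allclose(dct_shift(x, 0.0), x))     # a shift of zero reproduces the samples
```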

Relevance:

40.00%

Publisher:

Abstract:

In this paper, we propose a fast adaptive importance sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First, we estimate the minimum cross-entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level. Finally, the tilting parameter just found is used to estimate the overflow probability of interest. We study various properties of the method in more detail for the M/M/1 queue and conjecture that similar properties also hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
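
A minimal sketch of the likelihood-ratio bookkeeping such a scheme relies on is given below for the M/M/1 case. Instead of the paper's adaptively estimated cross-entropy tilting, it uses the classical arrival/service rate swap as the change of measure, and compares the estimate with the gambler's-ruin benchmark for the probability that the queue reaches level L within a busy cycle.

```python
import random

def overflow_prob_is(lam, mu, L, n_cycles=100_000, seed=1):
    """Estimate P(queue level reaches L before emptying, starting from level 1)."""
    random.seed(seed)
    p, q = lam / (lam + mu), mu / (lam + mu)   # original up/down step probabilities
    p_t, q_t = q, p                            # tilted measure: swap arrival and service
    total = 0.0
    for _ in range(n_cycles):
        level, lr = 1, 1.0
        while 0 < level < L:
            if random.random() < p_t:          # arrival under the tilted measure
                level += 1
                lr *= p / p_t
            else:                              # departure under the tilted measure
                level -= 1
                lr *= q / q_t
        if level == L:
            total += lr                        # likelihood-ratio-weighted indicator
    return total / n_cycles

lam, mu, L = 0.3, 1.0, 20
rho = lam / mu
exact = (1 - 1 / rho) / (1 - (1 / rho) ** L)   # gambler's-ruin benchmark
print(overflow_prob_is(lam, mu, L), exact)
```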

Relevance:

40.00%

Publisher:

Abstract:

Recently, Adams and Bischof (1994) proposed a novel region growing algorithm for segmenting intensity images. The inputs to the algorithm are the intensity image and a set of seeds - individual points or connected components - that identify the individual regions to be segmented. The algorithm grows these seed regions until all of the image pixels have been assimilated. Unfortunately, the algorithm is inherently dependent on the order of pixel processing. This means, for example, that raster-order processing and anti-raster-order processing do not, in general, lead to the same tessellation. In this paper we propose an improved seeded region growing algorithm that retains the advantages of the Adams and Bischof algorithm - fast execution, robust segmentation, and no tuning parameters - but is pixel order independent. (C) 1997 Elsevier Science B.V.
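
For reference, the sketch below implements the classical seeded region growing scheme of Adams and Bischof with a priority queue keyed by the difference to the current region mean; it reproduces the order-dependent baseline rather than the paper's order-independent refinement, and the toy image and 4-connectivity are illustrative choices.

```python
import heapq
import numpy as np

def seeded_region_growing(image, seeds):
    """seeds: dict {label: [(row, col), ...]}. Returns an integer label map (0 = unlabelled)."""
    labels = np.zeros(image.shape, dtype=int)
    sums = {l: 0.0 for l in seeds}
    counts = {l: 0 for l in seeds}
    heap = []

    def push_neighbours(r, c, l):
        mean = sums[l] / counts[l]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < image.shape[0] and 0 <= cc < image.shape[1] and labels[rr, cc] == 0:
                heapq.heappush(heap, (abs(image[rr, cc] - mean), rr, cc, l))

    for l, pts in seeds.items():
        for r, c in pts:
            labels[r, c] = l
            sums[l] += image[r, c]
            counts[l] += 1
    for l, pts in seeds.items():
        for r, c in pts:
            push_neighbours(r, c, l)

    while heap:                                # grow until every pixel is assimilated
        _, r, c, l = heapq.heappop(heap)
        if labels[r, c]:
            continue                           # already claimed by another region
        labels[r, c] = l
        sums[l] += image[r, c]
        counts[l] += 1
        push_neighbours(r, c, l)
    return labels

img = np.array([[1, 1, 9], [1, 9, 9], [1, 9, 9]], dtype=float)
print(seeded_region_growing(img, {1: [(0, 0)], 2: [(2, 2)]}))
```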

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate a diagnostic algorithm for pulmonary tuberculosis based on smear microscopy and objective response to a trial of antibiotics. SETTING: Adult medical wards, Hlabisa Hospital, South Africa, 1996-1997. METHODS: Adults with chronic chest symptoms and an abnormal chest X-ray had sputum examined for Ziehl-Neelsen-stained acid-fast bacilli by light microscopy. Those with negative smears were treated with amoxycillin for 5 days and assessed. Those who had not improved were treated with erythromycin for 5 days and reassessed. Response was compared with mycobacterial culture. RESULTS: Of 280 suspects who completed the diagnostic pathway, 160 (57%) had a positive smear, 46 (17%) responded to amoxycillin, 34 (12%) responded to erythromycin and 40 (14%) were treated as smear-negative tuberculosis. The sensitivity (89%) and specificity (84%) of the full algorithm for culture-positive tuberculosis were high. However, 11 patients (positive predictive value [PPV] 95%) were incorrectly diagnosed with tuberculosis, and 24 cases of tuberculosis (negative predictive value [NPV] 70%) were not identified. NPV improved to 75% when anaemia was included as a predictor. Algorithm performance was independent of human immunodeficiency virus status. CONCLUSION: A sputum smear microscopy plus trial-of-antibiotics algorithm applied to a selected group of tuberculosis suspects may increase diagnostic accuracy in district hospitals in developing countries.
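
Purely to make the stepwise logic explicit, the sketch below encodes the diagnostic pathway described above as a simple decision function; the argument names and return strings are illustrative, not part of the study protocol.

```python
def classify_tb_suspect(smear_positive, improved_on_amoxycillin, improved_on_erythromycin):
    """Working diagnosis for a suspect with chronic chest symptoms and an abnormal
    chest X-ray, following the smear-then-trial-of-antibiotics pathway."""
    if smear_positive:
        return "smear-positive tuberculosis"
    if improved_on_amoxycillin:
        return "not tuberculosis (responded to amoxycillin)"
    if improved_on_erythromycin:
        return "not tuberculosis (responded to erythromycin)"
    return "treat as smear-negative tuberculosis"

print(classify_tb_suspect(False, False, True))
```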

Relevance:

40.00%

Publisher:

Abstract:

An equivalent unit cell waveguide approach (WGA) to designing a multilayer microstrip reflectarray of variable-size patches is presented. In this approach, normal incidence of a plane wave on an infinite periodic array of radiating elements is considered to obtain reflection coefficient phase curves for the reflectarray's elements. It is shown that this problem is equivalent to the problem of reflection of the dominant TEM mode in a waveguide with patches interleaved by layers of dielectric. This waveguide problem is solved using a field matching technique and a method of moments (MoM). Based on this solution, a fast computer algorithm is developed to generate reflection coefficient phase curves for a multilayer microstrip patch reflectarray. The validity of the developed algorithm is tested against alternative approaches and the Agilent High Frequency Structure Simulator (HFSS). Having confirmed the validity of the WGA approach, a small offset-feed two-layer microstrip patch array is designed and developed. This reflectarray is tested experimentally and shows good performance.
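
The sketch below illustrates, in a generic way, how reflection-phase curves such as those produced by the unit-cell analysis are typically used to size the patches of a reflectarray: the required compensation phase at each element is computed from its path-length difference to the feed and inverted through the phase curve. The S-shaped sample curve, frequency, and feed geometry are illustrative assumptions, not values from the paper.

```python
import numpy as np

c = 3e8
f = 10e9                                   # design frequency (illustrative)
k0 = 2 * np.pi * f / c

# Illustrative phase curve: patch size (mm) vs. reflection phase (deg), standing in
# for the curves generated by the unit-cell waveguide / MoM analysis.
sizes_mm = np.linspace(6.0, 12.0, 25)
phase_deg = 170.0 - 340.0 / (1.0 + np.exp(-(sizes_mm - 9.0) * 1.5))

def required_phase(element_xy, feed_xyz):
    """Compensation phase so all elements radiate in phase toward broadside."""
    d = np.linalg.norm(np.append(element_xy, 0.0) - np.asarray(feed_xyz))
    phi = np.degrees(k0 * d) % 360.0
    return phi - 360.0 if phi >= 180.0 else phi

def size_for_phase(phi_deg):
    """Invert the monotonic (decreasing) phase curve by interpolation."""
    return float(np.interp(phi_deg, phase_deg[::-1], sizes_mm[::-1]))

feed = (0.0, 0.0, 0.15)                    # offset feed position in metres (illustrative)
element = np.array([0.03, 0.01])
phi = required_phase(element, feed)
print(f"element needs {phi:.1f} deg -> patch size {size_for_phase(phi):.2f} mm")
```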

Relevance:

40.00%

Publisher:

Abstract:

We present the results of applying automated machine learning techniques to the problem of matching different object catalogues in astrophysics. In this study, we take two partially matched catalogues where one of the two catalogues has a large positional uncertainty. The two catalogues we used here were taken from the H I Parkes All Sky Survey (HIPASS) and SuperCOSMOS optical survey. Previous work had matched 44 per cent (1887 objects) of HIPASS to the SuperCOSMOS catalogue. A supervised learning algorithm was then applied to construct a model of the matched portion of our catalogue. Validation of the model shows that we achieved a good classification performance (99.12 per cent correct). Applying this model to the unmatched portion of the catalogue found 1209 new matches. This increases the catalogue size from 1887 matched objects to 3096. The combination of these procedures yields a catalogue that is 72 per cent matched.
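
The general workflow can be sketched as below: a classifier is trained on the already matched portion of the catalogue, validated, and then used to score candidate counterparts for the unmatched portion. The synthetic features, the random-forest choice, and the candidate-pair framing are illustrative assumptions; the study's actual features and learner come from the HIPASS/SuperCOSMOS data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder training set: one row per (source, candidate counterpart) pair.
# Columns: positional offset, brightness difference, local source density.
X = rng.normal(size=(2000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("validation accuracy:", cross_val_score(clf, X, y, cv=5).mean())

clf.fit(X, y)                                   # train on the matched portion
unmatched_candidates = rng.normal(size=(10, 3)) # score the unmatched portion
print("match probabilities:", clf.predict_proba(unmatched_candidates)[:, 1].round(2))
```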

Relevance:

40.00%

Publisher:

Abstract:

Beyond the inherent technical challenges, current research into the three dimensional surface correspondence problem is hampered by a lack of uniform terminology, an abundance of application specific algorithms, and the absence of a consistent model for comparing existing approaches and developing new ones. This paper addresses these challenges by presenting a framework for analysing, comparing, developing, and implementing surface correspondence algorithms. The framework uses five distinct stages to establish correspondence between surfaces. It is general, encompassing a wide variety of existing techniques, and flexible, facilitating the synthesis of new correspondence algorithms. This paper presents a review of existing surface correspondence algorithms, and shows how they fit into the correspondence framework. It also shows how the framework can be used to analyse and compare existing algorithms and develop new algorithms using the framework's modular structure. Six algorithms, four existing and two new, are implemented using the framework. Each implemented algorithm is used to match a number of surface pairs. Results demonstrate that the correspondence framework implementations are faithful implementations of existing algorithms, and that powerful new surface correspondence algorithms can be created. (C) 2004 Elsevier Inc. All rights reserved.
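
A hedged sketch of such a modular, stage-based pipeline is given below. The five stage names used here are placeholders rather than the paper's terminology; the point is only that each stage is a swappable callable, so new correspondence algorithms can be composed by mixing implementations.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class CorrespondencePipeline:
    preprocess: Callable[[Any], Any]
    extract_features: Callable[[Any], Any]
    match: Callable[[Any, Any], list]
    refine: Callable[[list, Any, Any], list]
    validate: Callable[[list], list]

    def run(self, surface_a, surface_b):
        a, b = self.preprocess(surface_a), self.preprocess(surface_b)
        fa, fb = self.extract_features(a), self.extract_features(b)
        matches = self.match(fa, fb)
        matches = self.refine(matches, a, b)
        return self.validate(matches)

# Toy instantiation: "surfaces" are lists of scalars, matched by nearest value.
pipeline = CorrespondencePipeline(
    preprocess=lambda s: s,
    extract_features=lambda s: [(i, p) for i, p in enumerate(s)],
    match=lambda fa, fb: [(i, min(fb, key=lambda f: abs(f[1] - p))[0]) for i, p in fa],
    refine=lambda m, a, b: m,
    validate=lambda m: m,
)
print(pipeline.run([0.0, 1.0, 2.5], [0.1, 2.4, 1.1]))
```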

Relevance:

40.00%

Publisher:

Abstract:

An approach and strategy for automatic detection of buildings from aerial images using combined image analysis and interpretation techniques is described in this paper. It is undertaken in several steps. A dense DSM is obtained by stereo image matching, and then the results of multi-band classification, the DSM, and the Normalized Difference Vegetation Index (NDVI) are used to reveal preliminary building interest areas. From these areas, a shape modeling algorithm is used to precisely delineate their boundaries. The Dempster-Shafer data fusion technique is then applied to detect buildings from the combination of the three data sources by a statistically based classification. A number of test areas, which include buildings of different sizes, shapes, and roof colors, have been investigated. The results are encouraging and demonstrate that all processes in this system are important for effective building detection.
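
Two of the ingredients named above can be sketched as follows: the NDVI computed from red and near-infrared bands, and Dempster's rule of combination for two simple mass functions over a {building, other} frame. The mass assignments themselves are illustrative, not the calibrated values used in the system.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + eps)

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions given as dicts keyed by
    frozenset focal elements; values of each dict sum to 1."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

B, O = frozenset({"building"}), frozenset({"other"})
theta = B | O
m_dsm = {B: 0.7, theta: 0.3}           # tall, non-terrain blob in the DSM
m_ndvi = {B: 0.5, O: 0.2, theta: 0.3}  # low NDVI, so unlikely to be vegetation
print(dempster_combine(m_dsm, m_ndvi))
print(ndvi(np.array([0.6, 0.2]), np.array([0.1, 0.3])))
```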

Relevance:

40.00%

Publisher:

Abstract:

Online multimedia data needs to be encrypted for access control. To be usable on mobile devices such as pocket PCs and mobile phones, lightweight video encryption algorithms are required. The two major problems with existing algorithms are that they are either not fast enough or unable to work on highly compressed data streams. In this paper, we propose a new lightweight encryption algorithm based on Huffman error diffusion. It is a selective algorithm working on compressed data. By carefully choosing the most significant parts (MSP), high performance is achieved with proper security. Experimental results show the algorithm to be fast, secure, and compression-compatible.
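
The selective-encryption idea can be sketched as below: only a designated "most significant" slice of each compressed block is enciphered and the remainder passes through unchanged. The MSP selection (the first bytes of each block) and the SHA-256 counter-mode keystream are toy stand-ins for illustration only; they are neither the paper's Huffman-error-diffusion scheme nor a vetted cipher.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. For illustration only."""
    out, counter = bytearray(), 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def selective_encrypt(blocks, key, msp_len=4):
    """XOR-encrypt only the first msp_len bytes of each compressed block."""
    protected = []
    for i, block in enumerate(blocks):
        msp, rest = block[:msp_len], block[msp_len:]
        ks = keystream(key, i.to_bytes(8, "big"), len(msp))
        protected.append(bytes(a ^ b for a, b in zip(msp, ks)) + rest)
    return protected

blocks = [b"\x12\x34\x56\x78payload-A", b"\x9a\xbc\xde\xf0payload-B"]
key = b"demo-key"
enc = selective_encrypt(blocks, key)
dec = selective_encrypt(enc, key)          # XOR keystream: the same call decrypts
print(dec == blocks)
```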

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new relative measure of signal complexity, referred to here as relative structural complexity, which is based on the matching pursuit (MP) decomposition. By relative, we refer to the fact that this new measure is highly dependent on the decomposition dictionary used by MP. The structural part of the definition points to the fact that this new measure is related to the structure, or composition, of the signal under analysis. After a formal definition, the proposed relative structural complexity measure is used in the analysis of newborn EEG. To do this, firstly, a time-frequency (TF) decomposition dictionary is specifically designed to compactly represent the newborn EEG seizure state using MP. We then show, through the analysis of synthetic and real newborn EEG data, that the relative structural complexity measure can indicate changes in EEG structure as it transitions between the two EEG states; namely seizure and background (non-seizure).
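
The sketch below shows the matching pursuit mechanics with a toy sinusoidal dictionary, using the number of atoms needed to capture a fixed fraction of the signal energy as an illustrative complexity proxy. The paper's relative structural complexity measure is defined with respect to its own EEG-specific time-frequency dictionary, so this is only meant to make the dependence on the dictionary concrete.

```python
import numpy as np

def matching_pursuit(signal, dictionary, energy_fraction=0.95, max_atoms=50):
    """Greedy MP over unit-norm dictionary columns; stop once the residual
    holds less than (1 - energy_fraction) of the original energy."""
    residual = signal.astype(float).copy()
    target = (1.0 - energy_fraction) * np.dot(signal, signal)
    atoms = []
    while len(atoms) < max_atoms and np.dot(residual, residual) > target:
        corr = dictionary.T @ residual
        k = int(np.argmax(np.abs(corr)))
        if abs(corr[k]) < 1e-12:
            break                              # nothing left that the dictionary can explain
        atoms.append((k, corr[k]))
        residual -= corr[k] * dictionary[:, k]
    return atoms, residual

n = 64
t = np.arange(n)
D = np.column_stack([np.sin(2 * np.pi * f * t / n) for f in range(1, 17)])
D /= np.linalg.norm(D, axis=0)                 # unit-norm sinusoidal atoms

structured = D[:, 2] * 3.0 + D[:, 7] * 1.5     # compactly represented by the dictionary
noisy = np.random.default_rng(0).normal(size=n)

for name, sig in [("structured", structured), ("noise", noisy)]:
    atoms, _ = matching_pursuit(sig, D)
    print(name, "atoms used:", len(atoms))
```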

Relevance:

30.00%

Publisher:

Abstract:

The generalized Gibbs sampler (GGS) is a recently developed Markov chain Monte Carlo (MCMC) technique that enables Gibbs-like sampling of state spaces that lack a convenient representation in terms of a fixed coordinate system. This paper describes a new sampler, called the tree sampler, which uses the GGS to sample from a state space consisting of phylogenetic trees. The tree sampler is useful for a wide range of phylogenetic applications, including Bayesian, maximum likelihood, and maximum parsimony methods. A fast new algorithm to search for a maximum parsimony phylogeny is presented, using the tree sampler in the context of simulated annealing. The mathematics underlying the algorithm is explained and its time complexity is analyzed. The method is tested on two large data sets consisting of 123 sequences and 500 sequences, respectively. The new algorithm is shown to compare very favorably in terms of speed and accuracy to the program DNAPARS from the PHYLIP package.
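
As an illustration of parsimony search by simulated annealing, the sketch below scores rooted binary trees (nested tuples of taxon names) with Fitch's algorithm and anneals over a naive leaf-swap proposal. The toy alignment, proposal move, and cooling schedule are illustrative; the paper's tree sampler is based on the generalized Gibbs sampler, which is not reproduced here.

```python
import math
import random

sequences = {                                  # toy alignment (illustrative)
    "A": "AATGC", "B": "AATGG", "C": "ACTGG", "D": "ACTTG", "E": "GCTTG",
}

def fitch_sets(tree, site):
    """Return (state set, substitution count) for one column under Fitch parsimony."""
    if isinstance(tree, str):
        return {sequences[tree][site]}, 0
    (l_set, l_cost), (r_set, r_cost) = fitch_sets(tree[0], site), fitch_sets(tree[1], site)
    inter = l_set & r_set
    if inter:
        return inter, l_cost + r_cost
    return l_set | r_set, l_cost + r_cost + 1

def parsimony(tree):
    return sum(fitch_sets(tree, s)[1] for s in range(len(next(iter(sequences.values())))))

def swap_two_leaves(tree):
    """Naive proposal: exchange the positions of two random taxa in the tree."""
    a, b = random.sample(list(sequences), 2)
    def sub(t):
        if isinstance(t, str):
            return b if t == a else a if t == b else t
        return (sub(t[0]), sub(t[1]))
    return sub(tree)

random.seed(0)
tree = ((("A", "C"), ("B", "E")), "D")         # arbitrary starting topology
score, temp = parsimony(tree), 2.0
for step in range(2000):
    cand = swap_two_leaves(tree)
    cand_score = parsimony(cand)
    if cand_score <= score or random.random() < math.exp((score - cand_score) / temp):
        tree, score = cand, cand_score
    temp *= 0.999                              # geometric cooling schedule
print(score, tree)
```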

Relevance:

30.00%

Publisher:

Abstract:

A novel strategy for fast NMR resonance assignment of N-15 HSQC spectra of proteins is presented. It requires the structure coordinates of the protein, a paramagnetic center, and one or more residue-selectively N-15-labeled samples. Comparison of sensitive undecoupled N-15 HSQC spectra recorded on paramagnetic and diamagnetic samples yields data for every cross-peak on pseudocontact shift, paramagnetic relaxation enhancement, cross-correlation between Curie-spin and dipole-dipole relaxation, and residual dipolar coupling. Comparison of these four different paramagnetic quantities with predictions from the three-dimensional structure simultaneously yields the resonance assignment and the anisotropy of the susceptibility tensor of the paramagnetic center. The method is demonstrated with the 30 kDa complex between the N-terminal domain of the epsilon subunit and the theta subunit of Escherichia coli DNA polymerase III. The program PLATYPUS was developed to perform the assignment, provide a measure of reliability of the assignment, and determine the susceptibility tensor anisotropy.
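
One ingredient of such a strategy can be sketched as an assignment problem: given pseudocontact shifts predicted for candidate residues from the structure and an assumed susceptibility tensor, and the shifts observed for unassigned cross-peaks, a minimum-discrepancy assignment can be found with the Hungarian algorithm. The numbers below are purely illustrative, and the paper additionally uses three further paramagnetic quantities and fits the tensor simultaneously.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

predicted = np.array([0.42, -0.10, 0.05, 0.88, -0.33])   # ppm, one per candidate residue
observed = np.array([0.07, 0.85, -0.31, 0.40, -0.12])    # ppm, one per unassigned cross-peak

# Cost matrix: discrepancy between each peak and each residue's predicted shift.
cost = np.abs(observed[:, None] - predicted[None, :])
peak_idx, residue_idx = linear_sum_assignment(cost)
for p, r in zip(peak_idx, residue_idx):
    print(f"peak {p} -> residue {r} (|error| = {cost[p, r]:.2f} ppm)")
```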