826 results for distance transforms


Relevance:

100.00%

Publisher:

Abstract:

This thesis studies gray-level distance transforms, particularly the Distance Transform on Curved Space (DTOCS). The transform is produced by calculating distances on a gray-level surface. The DTOCS is improved by defining more accurate local distances and by developing a faster transformation algorithm. The Optimal DTOCS enhances the locally Euclidean Weighted DTOCS (WDTOCS) with local distance coefficients, which minimize the maximum error from the Euclidean distance in the image plane and produce more accurate global distance values. Convergence properties of the traditional mask operation, or sequential local transformation, and of the ordered propagation approach are analyzed and compared to the new, efficient priority pixel queue algorithm. The Route DTOCS algorithm developed in this work can be used to find and visualize shortest routes between two points, or two point sets, along a varying-height surface. In a digital image, several paths can share the same minimal length, and the Route DTOCS visualizes them all. A single optimal path can be extracted from the route set using a simple backtracking algorithm. A new extension of the priority pixel queue algorithm produces the nearest neighbor transform, or Voronoi or Dirichlet tessellation, simultaneously with the distance map. The transformation divides the image into regions so that each pixel belongs to the region surrounding the reference point that is nearest according to the distance definition used. Applications and application ideas for the DTOCS and its extensions are presented, including obstacle avoidance, image compression and surface roughness evaluation.
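The priority pixel queue algorithm mentioned above is essentially Dijkstra-style propagation over the pixel grid. Below is a minimal sketch assuming the integer DTOCS local distance (gray-level difference plus one between 8-neighbors); the thesis's Optimal DTOCS uses tuned local coefficients instead, and the helper name is hypothetical:

```python
import heapq
import numpy as np

def dtocs_priority_queue(gray, seeds):
    """Gray-level distance map via a priority pixel queue (Dijkstra-style).

    Sketch only: the local step cost between 8-neighbors p and q is
    |gray[p] - gray[q]| + 1, so distances grow along the gray-level
    surface. `seeds` is an iterable of (row, col) reference points
    that get distance 0.
    """
    dist = np.full(gray.shape, np.inf)
    heap = []
    for s in seeds:
        dist[s] = 0.0
        heapq.heappush(heap, (0.0, s))
    neighbors = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                 if (dr, dc) != (0, 0)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist[r, c]:                # stale queue entry, skip
            continue
        for dr, dc in neighbors:
            rr, cc = r + dr, c + dc
            if 0 <= rr < gray.shape[0] and 0 <= cc < gray.shape[1]:
                nd = d + abs(float(gray[r, c]) - float(gray[rr, cc])) + 1.0
                if nd < dist[rr, cc]:
                    dist[rr, cc] = nd
                    heapq.heappush(heap, (nd, (rr, cc)))
    return dist
```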

Relevance:

100.00%

Publisher:

Abstract:

This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. Two new distance transforms for gray-level images are presented, and as a new application, they are applied to gray-level image compression. Both new distance transforms are extensions of the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-value distance transform (EDTOCS) on gray-level images. Both distance transforms, the DTOCS and the EDTOCS, require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to note that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance function algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map, whose weights are not constant but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray-level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally; it is shown to be independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented, and several decompressed images are shown. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
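As an illustration of the two-pass scheme described above, here is a minimal sketch of an integer DTOCS computed with forward and backward raster sweeps, repeated until the map converges. The |Δgray| + 1 local distance and the helper name are illustrative assumptions; the thesis's kernels and exact definitions may differ:

```python
import numpy as np

def dtocs_two_pass(gray, region, max_rounds=10):
    """Sketch of a two-pass DTOCS (integer local distance |dg| + 1).

    `region` is a boolean image: True where distances are computed,
    False marks the reference pixels (distance 0) -- mirroring the
    two-buffer scheme from the abstract. Forward/backward raster
    sweeps repeat until nothing changes; a few rounds usually suffice.
    """
    g = gray.astype(float)
    dist = np.where(region, np.inf, 0.0)
    rows, cols = g.shape
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]   # already-visited neighbors
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]       # mirrored for backward pass
    for _ in range(max_rounds):
        changed = False
        for nbrs, r_order, c_order in (
            (fwd, range(rows), range(cols)),
            (bwd, range(rows - 1, -1, -1), range(cols - 1, -1, -1)),
        ):
            for r in r_order:
                for c in c_order:
                    if not region[r, c]:
                        continue
                    for dr, dc in nbrs:
                        pr, pc = r + dr, c + dc
                        if 0 <= pr < rows and 0 <= pc < cols:
                            cand = dist[pr, pc] + abs(g[r, c] - g[pr, pc]) + 1.0
                            if cand < dist[r, c]:
                                dist[r, c] = cand
                                changed = True
        if not changed:
            break
    return dist
```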

Relevance:

100.00%

Publisher:

Abstract:

The identification, tracking, and statistical analysis of tropical convective complexes using satellite imagery is explored in the context of identifying feature points suitable for tracking. The feature points are determined from the shape of the complexes using the distance transform technique. This approach has been applied to the determination of feature points for tropical convective complexes identified in a time series of global cloud imagery. The feature points are used to track the complexes, and statistical diagnostic fields are computed from the tracks. This approach allows the nature and distribution of organized deep convection in the Tropics to be explored.
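A minimal sketch of this kind of shape-based feature point, assuming a complex has already been segmented into a binary mask (the segmentation step and the helper name are assumptions, not the paper's exact procedure): take the pixel deepest inside the complex, i.e. the maximum of the distance transform of its mask.

```python
import numpy as np
from scipy import ndimage

def complex_feature_point(cloud_mask):
    """Feature point of one convective complex: the pixel farthest
    from the complex boundary, found as the maximum of the Euclidean
    distance transform of the binary mask. Unlike a plain centroid,
    this point always lies inside the complex, even for crescent
    shapes."""
    dist = ndimage.distance_transform_edt(cloud_mask)
    return np.unravel_index(np.argmax(dist), dist.shape)
```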

Relevance:

100.00%

Publisher:

Abstract:

For decades, distance transforms have proven useful in many image processing applications, and more recently they have started to be used in computer graphics environments. The goal of this paper is to propose a new technique based on distance transforms for detecting mesh elements which are close to the objects' external contour (from a given point of view), and to use this information for weighting the approximation error that will be tolerated during the mesh simplification process. The obtained results are evaluated in two ways: visually, and using an objective metric that measures the geometric difference between two polygonal meshes.
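A hedged sketch of the idea (the abstract does not specify the paper's actual weighting scheme): render the object's silhouette from the chosen viewpoint, take the distance transform of its contour, and map each projected mesh vertex to a weight that decays with distance from the contour. The names, the exponential falloff, and the assumption that vertices are already projected to pixel coordinates are all illustrative choices.

```python
import numpy as np
from scipy import ndimage

def contour_weights(silhouette, proj_xy, sigma=8.0):
    """Per-vertex weights from screen-space distance to the contour.

    silhouette : boolean image of the object from the viewpoint.
    proj_xy    : (n, 2) integer pixel coords of projected vertices.
    Vertices on the external contour get weight ~1 (little
    simplification error tolerated there); interior ones decay to 0.
    """
    contour = silhouette ^ ndimage.binary_erosion(silhouette)
    dist_to_contour = ndimage.distance_transform_edt(~contour)
    d = dist_to_contour[proj_xy[:, 1], proj_xy[:, 0]]
    return np.exp(-d / sigma)
```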

Relevance:

60.00%

Publisher:

Abstract:

In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal product operational efficiency, and reduced life expectancy of the product. This thesis considers the reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact way to obtain surface topography is to apply photometric stereo to estimate surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension in estimating surface roughness from gray-scale images and topographies is evaluated, and the results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, the skewness and variance of the brightness distribution, fractal dimension, and distance transforms exhibited strong linear correlations with standard industry roughness measures. One of the key strengths of the photometric stereo method is the acquisition of higher-frequency variation of surfaces. This thesis studies the reconstruction of planar, high-frequency-varying surfaces in the presence of imaging noise and blur. Two Wiener-filter-based methods are proposed, one of which is optimal in the sense of surface power spectral density given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
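The distance-transform idea quoted above (distances along a highly varying surface are greater) admits a compact one-dimensional sketch: the locally Euclidean length of a gray-level profile, relative to its planar length, grows with roughness. The helper below is illustrative only; the thesis's measures operate on 2-D distance maps.

```python
import numpy as np

def roughness_proxy(profile):
    """Relative path length of a 1-D gray-level profile.

    Each unit step in the image plane with gray-level change dg
    contributes sqrt(1 + dg**2) of path length (a locally Euclidean
    step, in the spirit of the WDTOCS). A perfectly flat profile
    yields 1.0; rougher profiles yield larger values.
    """
    dg = np.abs(np.diff(profile.astype(float)))
    path_length = np.sum(np.sqrt(1.0 + dg ** 2))
    return path_length / (len(profile) - 1)
```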

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigates the potential use of zero-crossing information for speech sample estimation. It provides a new method to estimate speech samples using composite zero-crossings. A simple linear interpolation technique is developed for this purpose. By using this method, the A/D converter can be avoided in a speech coder. The newly proposed zero-crossing sampling theory is supported with results of computer simulations using real speech data. The thesis also presents two methods for voiced/unvoiced classification. One of these methods is based on a distance measure which is a function of the short-time zero-crossing rate and the short-time energy of the signal. The other is based on the attractor dimension and entropy of the signal. Of these two methods, the first is simple and requires only very few computations compared to the other. This method is used in a later chapter to design an enhanced Adaptive Transform Coder. The latter part of the thesis addresses a few problems in Adaptive Transform Coding and presents an improved ATC. The transform coefficient with maximum amplitude is treated as side information, which enables more accurate bit assignment and step-size computation. A new bit reassignment scheme is also introduced in this work. Finally, an ATC which switches between the Discrete Cosine Transform and the Discrete Walsh-Hadamard Transform for voiced and unvoiced speech segments, respectively, is presented. Simulation results are provided to show the improved performance of the coder.
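A minimal sketch of the ingredients named above, under stated assumptions (frame-based processing, sign-change definition of a zero-crossing; names are illustrative): zero-crossing instants estimated by linear interpolation between the two samples straddling each sign change, plus the short-time zero-crossing rate and energy used by the first voiced/unvoiced classifier.

```python
import numpy as np

def zero_crossing_times(frame, fs):
    """Zero-crossing instants (in seconds) by linear interpolation
    between the two samples straddling each sign change."""
    x = frame.astype(float)
    idx = np.where(np.signbit(x[:-1]) != np.signbit(x[1:]))[0]
    # Fraction of a sample period from x[i] to the interpolated zero.
    frac = x[idx] / (x[idx] - x[idx + 1])
    return (idx + frac) / fs

def zcr_and_energy(frame):
    """Short-time zero-crossing rate and energy of one frame; voiced
    speech tends toward low ZCR / high energy, unvoiced the opposite."""
    x = frame.astype(float)
    zcr = np.mean(np.signbit(x[:-1]) != np.signbit(x[1:]))
    energy = np.mean(x ** 2)
    return zcr, energy
```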

Relevance:

30.00%

Publisher:

Abstract:

We provide a unified framework for a range of linear transforms that can be used for the analysis of terahertz spectroscopic data, with particular emphasis on their application to the measurement of leaf water content. The use of linear transforms for filtering, regression, and classification is discussed. For illustration, a classification problem involving leaves at three stages of drought and a prediction problem involving simulated spectra are presented. Issues resulting from scaling the data set are discussed. Using Lagrange multipliers, we arrive at the transform that yields the maximum separation between the spectra and show that this optimal transform is equivalent to computing the Euclidean distance between the samples. The optimal linear transform is compared with the average for all the spectra as well as with the Karhunen–Loève transform to discriminate a wet leaf from a dry leaf. We show that taking several principal components into account is equivalent to defining new axes in which data are to be analyzed. The procedure shows that the coefficients of the Karhunen–Loève transform are well suited to the process of classification of spectra. This is in line with expectations, as these coefficients are built from the statistical properties of the data set analyzed.
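A sketch of the Karhunen–Loève coefficients used for classification above, under the usual PCA construction (names are assumed; the paper's scaling choices are its own): project mean-centered spectra onto the leading eigenvectors of the data covariance, here obtained via the SVD.

```python
import numpy as np

def kl_coefficients(spectra, k=3):
    """Karhunen-Loeve (PCA) coefficients of a spectra matrix.

    spectra : (n_samples, n_wavelengths) array.
    Returns the projections onto the k leading KL basis vectors,
    the basis itself, and the mean spectrum. Taking several
    components amounts to defining new axes in which the data are
    analyzed, as the abstract notes.
    """
    mean = spectra.mean(axis=0)
    centered = spectra - mean
    # SVD of the centered data yields the covariance eigenvectors
    # without forming the covariance matrix explicitly.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:k]
    return centered @ basis.T, basis, mean
```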

Relevance:

20.00%

Publisher:

Abstract:

Approximately 7.2% of the Atlantic rainforest remains in Brazil, with only 16% of this forest remaining in the State of Rio de Janeiro, all of it distributed in fragments. This forest fragmentation can produce biotic and abiotic differences between edges and the fragment interior. In this study, we compared the structure and richness of tree communities in three habitats - an anthropogenic edge (AE), a natural edge (NE) and the fragment interior (FI) - of a fragment of Atlantic forest in the State of Rio de Janeiro, Brazil (22°50'S and 42°28'W). One thousand and seventy-six trees with a diameter at breast height > 4.8 cm, belonging to 132 morphospecies and 39 families, were sampled in a total study area of 0.75 ha. NE had the greatest basal area, and the trees in this habitat had the greatest diameter:height allometric coefficient, whereas AE had a lower richness and greater variation in the height of the first tree branch. Tree density, diameter, height and the proportion of standing dead trees did not differ among the habitats. There was marked heterogeneity among replicates within each habitat. These results indicate that the forest interior and the fragment edges (natural or anthropogenic) do not differ markedly in the studied parameters. Other factors, such as the age of the edge, the type of matrix and the proximity of gaps, may play a more important role in plant community structure than proximity to edges.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this study was to evaluate children's respiratory patterns in the mixed dentition by means of acoustic rhinometry, and their relation to upper arch width development. Fifty patients were examined, 25 females and 25 males, with a mean age of eight years and seven months. All of them underwent acoustic rhinometry, and upper and lower arch impressions were taken to obtain plaster models. The upper arch analysis was accomplished by measuring the interdental transverse distances of the upper teeth: deciduous canines (measurement 1), deciduous first molars (measurement 2), deciduous second molars (measurement 3) and the first molars (measurement 4). The results showed that an increased left nasal cavity area was associated with an increased interdental distance of the deciduous first and second molars in females, and with an increased interdental distance of the deciduous canines and deciduous first and second molars in males. It was concluded that there is a correlation between the nasal cavity area and the upper arch transverse distance in the anterior and mid-maxillary regions for both genders.

Relevance:

20.00%

Publisher:

Abstract:

Rangel EM, Mendes IA, Carnio EC, Marchi Alves LM, Godoy S, Crispim JA. Development, implementation, and assessment of a distance module in endocrine physiology. Adv Physiol Educ 34: 70-74, 2010; doi: 10.1152/advan.00070.2009. This study aimed to develop, implement, and assess a distance module in endocrine physiology in TelEduc for undergraduate nursing students from a public university in Brazil, with a sample of 44 students. Stage 1 consisted of the development of the module, through the process of creating a distance course by means of the Web. Stage 2 was the planning of the module's practical functioning, and stage 3 was the planning of student evaluations. In the experts' assessment, the module complied with pedagogical and technical requirements most of the time. In the practical functioning stage, 10 h were dedicated to on-site activities and 10 h to distance activities. Most students (93.2%) were women between 19 and 23 yr of age (75%). The internet was the most used means of remaining updated for 23 students (59.0%), and 30 students (68.2%) accessed it from the teaching institution. A personal computer was used by 23 students (56.1%), and most of them (58.1%) learned to use it on their own. Access to the forum was more dispersed (variation coefficient: 86.80%) than access to the chat (variation coefficient: 65.14%). Average participation was 30 students in the forums and 22 students in the chat. Students' final grades in the module averaged 8.5 (SD: 1.2). TelEduc was shown to be efficient in supporting the teaching-learning process of endocrine physiology.

Relevance:

20.00%

Publisher:

Abstract:

Context. In April 2004, the first image was obtained of a planetary-mass companion (now known as 2M 1207 b) in orbit around a self-luminous object different from our own Sun (the young brown dwarf 2MASSW J1207334-393254, hereafter 2M 1207 A). That 2M 1207 b probably formed via fragmentation and gravitational collapse offered proof that such a mechanism can form bodies in the planetary mass regime. However, the predicted mass, luminosity, and radius of 2M 1207 b depend on its age, distance, and other observables, such as effective temperature. Aims. To refine our knowledge of the physical properties of 2M 1207 b and its nature, we accurately determined the distance to the 2M 1207 A and b system by measuring its trigonometric parallax at the milliarcsecond level. Methods. With the ESO NTT/SUSI2 telescope, we began a campaign of photometric and astrometric observations in 2006 to measure the trigonometric parallax of 2M 1207 A. Results. An accurate distance (52.4 +/- 1.1 pc) to 2M 1207 A was measured. From the distance and proper motions we derived spatial velocities that are fully compatible with TWA membership. Conclusions. With this new distance estimate, we discuss three scenarios regarding the nature of 2M 1207 b: (1) a cool (1150 +/- 150 K) companion of mass 4 +/- 1 M-Jup; (2) a warmer (1600 +/- 100 K) and heavier (8 +/- 2 M-Jup) companion occulted by an edge-on circumsecondary disk; or (3) a hot protoplanet collision afterglow.
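For orientation, the quoted distance maps to a parallax via the standard relation (a consistency check; the abstract itself reports only the distance):

```latex
d\,[\mathrm{pc}] = \frac{1}{\pi\,[\mathrm{arcsec}]}
\quad\Longrightarrow\quad
\pi \approx \frac{1}{52.4\ \mathrm{pc}} \approx 0.0191'' \approx 19.1\ \mathrm{mas},
```

which is indeed at the milliarcsecond level the campaign targets.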

Relevance:

20.00%

Publisher:

Abstract:

Context. Observations in the cosmological domain are heavily dependent on the validity of the cosmic distance-duality (DD) relation, η = D_L(z)(1+z)^(-2)/D_A(z) = 1, an exact result required by the Etherington reciprocity theorem, where D_L(z) and D_A(z) are, respectively, the luminosity and angular diameter distances. In the limit of very small redshifts, D_A(z) = D_L(z) and this ratio is trivially satisfied. Measurements of the Sunyaev-Zeldovich effect (SZE) and X-rays combined with the DD relation have been used to determine D_A(z) from galaxy clusters. This combination offers the possibility of testing the validity of the DD relation, as well as of determining which physical processes occur in galaxy clusters via their shapes. Aims. We use the WMAP (7-year) results, fixing the conventional ΛCDM model, to verify the consistency between the validity of the DD relation and different assumptions about galaxy cluster geometries usually adopted in the literature. Methods. We assume that η is a function of the redshift, parametrized by two different relations: η(z) = 1 + η_0 z and η(z) = 1 + η_0 z/(1+z), where η_0 is a constant parameter quantifying the possible departure from the strict validity of the DD relation. In order to determine the probability density function (PDF) of η_0, we consider the angular diameter distances from galaxy clusters recently studied by two different groups, assuming elliptical (isothermal) and spherical (non-isothermal) β models. The strict validity of the DD relation will occur only if the maximum of the η_0 PDF is centered on η_0 = 0. Results. It was found that the elliptical β model is in good agreement with the data, showing no violation of the DD relation (PDF peaked close to η_0 = 0 at 1σ), while the spherical (non-isothermal) one is only marginally compatible at 3σ. Conclusions. The present results, derived by combining the SZE and X-ray surface brightness data from galaxy clusters with the latest WMAP (7-year) results, favor the elliptical geometry for galaxy clusters. It is remarkable that a local property like the geometry of galaxy clusters might be constrained by a global argument provided by the cosmic DD relation.
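Restated as display math (a reconstruction of the garbled plain-text equations; η_0 = 0 recovers strict duality):

```latex
\eta(z) \equiv \frac{D_L(z)}{(1+z)^{2}\,D_A(z)} = 1,
\qquad
\eta(z) = 1 + \eta_0 z,
\qquad
\eta(z) = 1 + \eta_0\,\frac{z}{1+z}.
```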

Relevance:

20.00%

Publisher:

Abstract:

In this Letter, we propose a new and model-independent cosmological test for the distance-duality (DD) relation, η = D_L(z)(1+z)^(-2)/D_A(z) = 1, where D_L and D_A are, respectively, the luminosity and angular diameter distances. For D_L we consider two sub-samples of Type Ia supernovae (SNe Ia) taken from the Constitution data, whereas the D_A distances are provided by two samples of galaxy clusters compiled by De Filippis et al. and Bonamente et al. by combining Sunyaev-Zeldovich effect and X-ray surface brightness measurements. The SNe Ia redshifts of each sub-sample were carefully chosen to coincide with those of the associated galaxy cluster sample (Δz < 0.005), thereby allowing a direct test of the DD relation. Since for very low redshifts D_A(z) ≈ D_L(z), we have tested the DD relation by assuming that η is a function of the redshift, parameterized by two different expressions: η(z) = 1 + η_0 z and η(z) = 1 + η_0 z/(1+z), where η_0 is a constant parameter quantifying a possible departure from the strict validity of the reciprocity relation (η_0 = 0). In the best scenario (linear parameterization), we obtain η_0 = -0.28 (+0.44/-0.44) (2σ, statistical + systematic errors) for the De Filippis et al. sample (elliptical geometry), a result only marginally compatible with the DD relation. However, for the Bonamente et al. sample (spherical geometry) the constraint is η_0 = -0.42 (+0.34/-0.34) (3σ, statistical + systematic errors), which is clearly incompatible with the distance-duality relation.

Relevance:

20.00%

Publisher:

Abstract:

Background: Identifying local similarity between two or more sequences, or identifying repeats occurring at least twice in a sequence, is an essential part of the analysis of biological sequences and of their phylogenetic relationship. Finding such fragments while allowing for a certain number of insertions, deletions, and substitutions is, however, known to be a computationally expensive task, and consequently exact methods can usually not be applied in practice. Results: The filter TUIUIU that we introduce in this paper provides a possible solution to this problem. It can be used as a preprocessing step to any multiple alignment or repeat inference method, eliminating a possibly large fraction of the input that is guaranteed not to contain any approximate repeat. It consists of the verification of several strong necessary conditions that can be checked quickly. We implemented three versions of the filter. The first is simply a straightforward extension, to the case of multiple sequences, of conditions already existing in the literature. The second uses a stronger condition which, as our results show, enables noticeably stronger filtering with negligible (if any) additional time. The third version uses an additional condition and pushes the sensitivity of the filter even further, with non-negligible additional time in many circumstances; our experiments show that it is particularly useful with large error rates. The latter version was applied as a preprocessing step for a multiple alignment tool, obtaining an overall time (filter plus alignment) on average 63 and at best 530 times smaller than before (direct alignment), with, in most cases, a better-quality alignment. Conclusion: To the best of our knowledge, TUIUIU is the first filter designed for multiple repeats and for dealing with error rates greater than 10% of the repeat length.
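A sketch of the kind of fast necessary condition such filters verify, based on the classical q-gram lemma (an illustrative stand-in; TUIUIU's actual conditions, described in the paper, are stronger and handle multiple sequences): two strings of length m within edit distance k must share at least m + 1 - (k + 1)·q q-gram occurrences, so window pairs below that bound can be discarded before any expensive alignment.

```python
from collections import Counter

def qgram_filter(win1, win2, q=4, max_err=5):
    """Keep a window pair only if it passes the q-gram lemma bound.

    Counts shared q-gram occurrences (multiset intersection) and
    compares them against m + 1 - (max_err + 1) * q, the minimum
    number of q-grams two length-m strings within edit distance
    max_err must have in common. Returns True if the pair may still
    contain an approximate repeat and should go on to alignment.
    """
    def qgram_counts(s):
        return Counter(s[i:i + q] for i in range(len(s) - q + 1))
    shared = sum((qgram_counts(win1) & qgram_counts(win2)).values())
    threshold = min(len(win1), len(win2)) + 1 - (max_err + 1) * q
    return shared >= threshold
```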