172 results for Binary images
at Indian Institute of Science - Bangalore - India
Abstract:
Template matching is concerned with measuring the similarity between patterns of two objects. This paper proposes a memory-based reasoning approach for pattern recognition of binary images with a large template set. Memory-based reasoning intrinsically requires a large database, and some binary image recognition problems inherently need large template sets; the recognition of Chinese characters, for example, needs thousands of templates. The proposed algorithm is based on the Connection Machine, the most massively parallel machine to date, and uses a multiresolution method to search for the matching template. The approach uses the pyramid data structure for the multiresolution representation of templates and the input image pattern. For a given binary image, it scans the template pyramid in search of a match. A binary image of N × N pixels can be matched by our algorithm in O(log N) time, independent of the number of templates. Implementation of the proposed scheme is described in detail.
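To illustrate the coarse-to-fine idea behind pyramid-based template search, here is a minimal sequential Python sketch. It is not the Connection Machine implementation; the 2x2 OR-pooling rule and the keep-a-quarter pruning fraction are illustrative assumptions.

```python
import numpy as np

def build_pyramid(img, levels):
    """Multiresolution pyramid of a binary image by 2x2 block OR-pooling."""
    pyr = [img]
    for _ in range(levels - 1):
        h, w = pyr[-1].shape
        coarse = pyr[-1].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))
        pyr.append(coarse)
    return pyr  # pyr[0] is the finest level, pyr[-1] the coarsest

def match_template(image, templates, levels=3):
    """Coarse-to-fine search: prune candidate templates at coarse levels,
    then decide among the survivors at the finest level."""
    img_pyr = build_pyramid(image, levels)
    tmpl_pyrs = [build_pyramid(t, levels) for t in templates]
    candidates = list(range(len(templates)))
    for level in range(levels - 1, -1, -1):  # coarsest level first
        scored = [(int(np.sum(img_pyr[level] != tmpl_pyrs[i][level])), i)
                  for i in candidates]
        scored.sort()  # smallest Hamming distance first
        candidates = [i for _, i in scored[:max(1, len(scored) // 4)]]
    return candidates[0]
```

Because each pruning pass halves the image side, the number of levels (and hence passes) grows as O(log N) in the image size, matching the complexity claimed in the abstract; the pruning schedule itself is a hypothetical choice here.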
Abstract:
For active contour modeling (ACM), we propose a novel self-organizing map (SOM)-based approach, called the batch-SOM (BSOM), that attempts to integrate the advantages of SOM- and snake-based ACMs in order to extract the desired contours from images. We employ feature points, in the form of an edge map (as obtained from a standard edge-detection operation), to guide the contour (as in SOM-based ACMs), along with the gradient and intensity variations in a local region to ensure that the contour does not "leak" across the object boundary in the case of faulty feature points (weak or broken edges). In contrast with snake-based ACMs, however, we do not use an explicit energy functional (based on gradient or intensity) for controlling the contour movement. We extend the BSOM to handle extraction of contours of multiple objects by splitting a single contour into as many subcontours as there are objects in the image. The BSOM and its extended version are tested on synthetic binary and gray-level images with both single and multiple objects. We also demonstrate the efficacy of the BSOM on images of objects having both convex and nonconvex boundaries. The results demonstrate the superiority of the BSOM over other approaches. Finally, we analyze the limitations of the BSOM.
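A single batch-SOM contour update can be sketched in a few lines. This is a simplified stand-in for the BSOM, assuming edge points are already extracted and omitting the paper's gradient/intensity leakage guard; the Gaussian neighbourhood width is an illustrative parameter.

```python
import numpy as np

def bsom_step(contour, edge_points, sigma=0.5):
    """One batch update: each edge point is assigned to its best-matching
    contour node, and every node moves to the neighbourhood-weighted mean
    of the edge points (Gaussian neighbourhood along the closed contour)."""
    n = len(contour)
    idx = np.arange(n)
    num = np.zeros_like(contour, dtype=float)
    den = np.zeros(n)
    for p in edge_points:
        b = int(np.argmin(np.linalg.norm(contour - p, axis=1)))  # best-matching node
        d = np.minimum(np.abs(idx - b), n - np.abs(idx - b))     # circular index distance
        h = np.exp(-d**2 / (2 * sigma**2))                       # neighbourhood weights
        num += h[:, None] * p
        den += h
    return num / den[:, None]

# Toy demonstration: a radius-2 initial contour shrinking onto a unit-circle edge map.
angles = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
target = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # "edge map" points
contour = 2.0 * target.copy()                                # initial contour
for _ in range(5):
    contour = bsom_step(contour, target)
```

Unlike a snake, no explicit energy functional appears: the movement comes entirely from the SOM competitive/cooperative update, which is the contrast the abstract draws.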
Abstract:
In order to describe the atmospheric turbulence which limits the resolution of long-exposure images obtained using ground-based large telescopes, a simplified model of a speckle pattern is presented, reducing the complexity of calculating field correlations of very high order. Focal plane correlations are used instead of correlations in the spatial frequency domain. General triple correlations for a point source and for a binary star are calculated, and it is shown that they are not a strong function of the binary separation. For binary separations close to the diffraction limit of the telescope, the genuine triple correlation technique ensures a better SNR than the near-axis Knox-Thompson technique. The simplifications allow a complete analysis of the noise properties at all levels of light.
Abstract:
Passing a H-2-CH4 mixture over oxide spinels containing two transition elements as in Mg0.8MyMz'Al2O4 (M, M' = Fe, Co or Ni, y + z = 0.2) at 1070 degrees C produces small alloy nanoparticles which enable the formation of carbon nanotubes. Surface area measurements are found to be useful for assessing the yield and quality of the nanotubes. Good-quality single-walled nanotubes (SWNTs) have been obtained in high yields with the FeCo alloy nanoparticles, as evidenced by transmission electron microscope images and surface area measurements. The diameter of the SWNTs is in the 0.8-5 nm range, and the multiwalled nanotubes, found occasionally, possess very few graphite layers. (C) 1999 Elsevier Science B.V. All rights reserved.
Abstract:
There are essentially two different phenomenological models available to describe the interdiffusion process in binary systems in the solid state. The first of these, which is used more frequently, is based on the theory of flux partitioning. The second model, developed much more recently, uses the theory of dissociation and reaction. Although the theory of flux partitioning has been widely used, we found that this theory does not account for the mobility of both species and therefore is not suitable for use in most interdiffusion systems. We have first modified this theory to take into account the mobility of both species and then further extended it to develop relations for the integrated diffusion coefficient and the ratio of diffusivities of the species. The versatility of these two different models is examined in the Co-Si system with respect to different end-member compositions. From our analysis, we found that the applicability of the theory of flux partitioning is rather limited, but the theory of dissociation and reaction can be used in any binary system.
Abstract:
We propose a robust method for mosaicing of document images using features derived from connected components. Each connected component is described using the Angular Radial Transform (ART). To ensure geometric consistency during feature matching, the ART coefficients of a connected component are augmented with those of its two nearest neighbors. The proposed method addresses two critical issues often encountered in correspondence matching: (i) the stability of features, and (ii) robustness against false matches due to multiple instances of characters in a document image. The use of connected components guarantees stable localization across images. The augmented features ensure successful correspondence matching even in the presence of multiple similar regions within the page. We illustrate the effectiveness of the proposed method on camera-captured document images exhibiting large variations in viewpoint, illumination and scale.
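The neighbor-augmentation step can be sketched independently of the descriptor used. The code below is a minimal illustration that substitutes arbitrary per-component feature vectors for the ART coefficients (computing ART itself is out of scope here); centroid distances define the two nearest neighbours, as a stand-in assumption.

```python
import numpy as np

def augment_features(descriptors, centroids):
    """Concatenate each component's descriptor with those of its two
    spatially nearest neighbours, enforcing local geometric consistency."""
    n = len(descriptors)
    augmented = []
    for i in range(n):
        d = np.linalg.norm(centroids - centroids[i], axis=1)
        d[i] = np.inf                      # exclude the component itself
        nn = np.argsort(d)[:2]             # indices of the two nearest neighbours
        augmented.append(np.concatenate(
            [descriptors[i], descriptors[nn[0]], descriptors[nn[1]]]))
    return np.array(augmented)

def match_components(feats_a, feats_b):
    """Nearest-neighbour correspondence between two augmented feature sets."""
    return [(i, int(np.argmin(np.linalg.norm(feats_b - f, axis=1))))
            for i, f in enumerate(feats_a)]
```

The point of the augmentation is visible in the shapes: two components with identical individual descriptors (repeated characters) generally acquire different augmented vectors because their neighbourhoods differ.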
Abstract:
The LISA Parameter Estimation Taskforce was formed in September 2007 to provide the LISA Project with vetted codes, source distribution models and results related to parameter estimation. The Taskforce's goal is to be able to quickly calculate the impact of any mission design changes on LISA's science capabilities, based on reasonable estimates of the distribution of astrophysical sources in the universe. This paper describes our Taskforce's work on massive black-hole binaries (MBHBs). Given present uncertainties in the formation history of MBHBs, we adopt four different population models, based on (i) whether the initial black-hole seeds are small or large and (ii) whether accretion is efficient or inefficient at spinning up the holes. We compare four largely independent codes for calculating LISA's parameter-estimation capabilities. All codes are based on the Fisher-matrix approximation, but in the past they used somewhat different signal models, source parametrizations and noise curves. We show that once these differences are removed, the four codes give results in extremely close agreement with each other. Using a code that includes both spin precession and higher harmonics in the gravitational-wave signal, we carry out Monte Carlo simulations and determine the number of events that can be detected and accurately localized in our four population models.
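All four codes rest on the Fisher-matrix approximation; as a reminder of its mechanics, here is a minimal numerical sketch. The sinusoidal signal model is a hypothetical toy (not a gravitational waveform), and the noise level and parameter values are illustrative.

```python
import numpy as np

def fisher_matrix(model, theta, t, sigma):
    """Fisher matrix F_ij = sum_k (dh_k/dtheta_i)(dh_k/dtheta_j) / sigma^2
    for a signal h(t; theta) in white Gaussian noise of standard deviation
    sigma, with derivatives taken by central finite differences."""
    eps = 1e-6
    D = []
    for i in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        D.append((model(tp, t) - model(tm, t)) / (2 * eps))
    D = np.array(D)
    return D @ D.T / sigma**2

# Hypothetical toy signal: amplitude A and frequency f are the parameters.
def h(theta, t):
    A, f = theta
    return A * np.sin(2 * np.pi * f * t)

t = np.linspace(0.0, 1.0, 200)
F = fisher_matrix(h, np.array([1.0, 3.0]), t, sigma=0.1)
cov = np.linalg.inv(F)  # Cramer-Rao lower bound on the parameter covariance
```

The diagonal of `cov` bounds the variance of each parameter estimate, which is how such codes turn a waveform model plus noise curve into localization forecasts.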
Abstract:
Measurements of the ratio of diffusion coefficient to mobility (D/ mu ) of electrons in SF6-N2 and CCl2F2-N2 mixtures over the range 80
Abstract:
Experimental results are presented for the ionisation (α) and electron attachment (η) coefficients evaluated from the steady-state Townsend current growth curves for SF6-N2 and CCl2F2-N2 mixtures over the range 60 ≤ E/P ≤ 240 (where E is the electric field in V cm⁻¹ and P is the pressure in Torr reduced to 20 °C). In both mixtures the attachment coefficients (η_mix) evaluated were found to follow the relationship η_mix = η(1 − exp(−βF/(100 − F))), where η is the attachment coefficient of the pure electronegative gas, F is the percentage of the electronegative gas in the mixture and β is a constant. The ionisation coefficients (α_mix) generally obeyed a corresponding mixture relationship in terms of α_N2 and α_A, the ionisation coefficients of nitrogen and the attaching gas respectively. However, in the case of CCl2F2-N2 mixtures, there were maxima in the α_mix values for CCl2F2 concentrations between 10% and 30% at all values of E/P investigated. Effective ionisation coefficients (α − η)/P obtained in these binary mixtures show that the critical E/P (corresponding to (α − η)/P = 0) increases with increasing concentration of the electronegative gas up to 40%. Further increase in the electronegative gas content does not seem to alter the critical E/P.
Abstract:
A numerical study on columnar-to-equiaxed transition (CET) during directional solidification of binary alloys is presented using a macroscopic solidification model. The position of CET is predicted numerically using a critical cooling rate criterion reported in the literature. The macroscopic solidification model takes into account movement of the solid phase due to buoyancy, and the drag effect on the moving solid phase because of fluid motion. The model is applied to simulate the solidification process for binary alloys (Sn-Pb) and to estimate solidification parameters such as position of the liquidus, velocity of the liquidus isotherm, temperature gradient ahead of the liquidus, and cooling rate at the liquidus. Solidification phenomena under two cooling configurations are studied: one without melt convection and the other involving thermosolutal convection. The numerically predicted positions of CET compare well with those of experiments reported in the literature. Melt convection results in a higher cooling rate, higher liquidus isotherm velocities, and stimulation of the occurrence of CET in comparison to the nonconvecting case. The movement of the solid phase further aids the process of CET. With a fixed solid phase, the occurrence of CET based on the same critical cooling rate is delayed, and it occurs at a greater distance from the chill.
Abstract:
The ratios of the electron attachment coefficient η to the gas pressure p (reduced to 0 °C), evaluated from the Townsend current growth curves in binary mixtures of electronegative gases (SF6, CCl2F2, CO2) and buffer gases (N2, Ar, air), clearly indicate that η/p does not scale as the partial pressure of the electronegative gas in the mixture. Extensive calculations carried out using the experimentally obtained data have shown that the attachment coefficient of the mixture η_mix can be expressed as η_mix = η(1 − exp(−βF/(100 − F))), where η is the attachment coefficient of the 100% electronegative gas, F is the percentage of the electronegative gas in the mixture and β is a constant. The results of this analysis explain to a high degree of accuracy the data obtained in various mixtures and are in very good agreement with the data deduced by Itoh and co-workers (1980) using the Boltzmann equation method.
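The fitted mixture relationship is easy to encode and check at its limits. A minimal sketch, using the symbols of the abstract (the numerical values of η and β below are illustrative placeholders, not measured data):

```python
import math

def eta_mix(eta_pure, F, beta):
    """Attachment coefficient of a binary mixture:
        eta_mix = eta * (1 - exp(-beta * F / (100 - F)))
    eta_pure : attachment coefficient of the 100% electronegative gas
    F        : percentage of electronegative gas in the mixture, 0 <= F < 100
    beta     : fitted constant for the gas pair
    """
    return eta_pure * (1.0 - math.exp(-beta * F / (100.0 - F)))
```

The functional form makes the abstract's point directly: at F = 0 the attachment vanishes, as F approaches 100 it saturates at η, and in between it rises faster than the linear partial-pressure scaling would predict.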
Abstract:
The low-frequency (5–100 kHz) dielectric constant ε has been measured over the reduced-temperature range 7 × 10⁻⁵ < t < 8 × 10⁻², where t = (T − Tc)/Tc. Near Tc an exponent ≈0.11 characterizes the power-law behaviour of dε/dt, consistent with the theoretically predicted t^(−α) singularity. However, over the full range of t an exponent ≈0.35 is obtained.
Abstract:
It is well known that the notions of normal forms and acyclicity capture many practical desirable properties for database schemes. The basic schema design problem is to develop design methodologies that strive toward these ideals. The usual approach is to first normalize the database scheme as far as possible. If the resulting scheme is cyclic, then one tries to transform it into an acyclic scheme. In this paper, we argue in favor of carrying out these two phases of design concurrently. In order to do this efficiently, we need to be able to incrementally analyze the acyclicity status of a database scheme as it is being designed. To this end, we propose the formalism of "binary decompositions". Using this, we characterize design sequences that exactly generate θ-acyclic schemes, for θ = α, β. We then show how our results can be put to use in database design. Finally, we also show that our formalism above can be effectively used as a proof tool in dependency theory. We demonstrate its power by showing that it leads to a significant simplification of the proofs of some previous results connecting sets of multivalued dependencies and acyclic join dependencies.
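For readers who want the baseline notion of acyclicity being incrementally tracked here, the classical GYO reduction decides α-acyclicity of a database scheme. A minimal sketch (this is the standard algorithm, not the paper's binary-decomposition formalism):

```python
from collections import Counter

def is_alpha_acyclic(scheme):
    """GYO reduction: a database scheme (a list of relation schemes, each a
    set of attributes) is alpha-acyclic iff repeatedly applying two rules
    empties it:
      1. delete any attribute occurring in exactly one relation scheme;
      2. delete any relation scheme contained in another (or left empty)."""
    edges = [set(e) for e in scheme]
    changed = True
    while changed:
        changed = False
        counts = Counter(a for e in edges for a in e)
        for e in edges:                       # rule 1: strip lone attributes
            lone = {a for a in e if counts[a] == 1}
            if lone:
                e -= lone
                changed = True
        for i, e in enumerate(edges):         # rule 2: drop subsumed schemes
            if not e or any(i != j and e <= f for j, f in enumerate(edges)):
                edges.pop(i)
                changed = True
                break
    return not edges
```

For example, the chain scheme {AB, BC, CD} reduces to nothing and is α-acyclic, while the "triangle" {AB, BC, AC} is irreducible and hence cyclic, which is exactly the kind of status an incremental design procedure would want to maintain step by step.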