84 results for Feature extraction
Abstract:
Semi-rigid molecular tweezers 1, 3 and 4 bind picric acid with a more-than-tenfold increase in binding in tetrachloromethane as compared to chloroform.
Abstract:
With the availability of a huge amount of video data from various sources, efficient video retrieval tools are increasingly in demand. Video being multi-modal data, perceptions of "relevance" between the user-provided query video (in Query-By-Example video search) and the retrieved video clips are subjective in nature. We present an efficient video retrieval method that takes the user's feedback on the relevance of retrieved videos and iteratively reformulates the input query feature vectors (QFV) for improved video retrieval. The QFV reformulation is done by a simple but powerful feature-weight optimization method based on the Simultaneous Perturbation Stochastic Approximation (SPSA) technique. A video retrieval system with video indexing, searching and relevance feedback (RF) phases is built to demonstrate the performance of the proposed method. The query and database videos are indexed using conventional video features such as color and texture. However, we use comprehensive and novel feature representations, and a spatio-temporal distance measure, to retrieve the top M videos that are similar to the query. In the feedback phase, the user's relevance judgments on the previously retrieved videos are used iteratively to reformulate the QFV weights (a measure of importance) so that they automatically reflect the user's preference. We observe that a few iterations of such feedback are generally sufficient for retrieving the desired video clips. The novel application of SPSA-based RF for user-oriented feature-weight optimization distinguishes the proposed method from existing ones. The experimental results show that the proposed RF-based video retrieval exhibits good performance.
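The SPSA weight update at the heart of such a feedback loop can be sketched as follows; the toy retrieval loss, the gain sequences and the weight initialization below are illustrative assumptions, not the paper's exact settings:

```python
import numpy as np

def spsa_minimize(loss, w0, iters=300, a=0.1, c=0.1, seed=0):
    """Simultaneous Perturbation Stochastic Approximation.

    Each iteration estimates the full gradient of `loss` from only
    two loss evaluations, using one random simultaneous perturbation.
    """
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float).copy()
    for k in range(1, iters + 1):
        ak = a / k ** 0.602                # step-size gain sequence
        ck = c / k ** 0.101                # perturbation gain sequence
        delta = rng.choice([-1.0, 1.0], size=w.shape)  # Bernoulli +/-1
        # Since delta_i is +/-1, 1/delta_i == delta_i, so the
        # two-sided difference times delta is the gradient estimate.
        g_hat = (loss(w + ck * delta) - loss(w - ck * delta)) / (2 * ck) * delta
        w -= ak * g_hat                    # gradient-descent step
    return w

# Toy stand-in for the relevance-feedback loss: pull the feature
# weights toward a hypothetical user-preferred weighting.
target = np.array([0.7, 0.2, 0.1])
loss = lambda w: float(np.sum((w - target) ** 2))
w = spsa_minimize(loss, np.ones(3) / 3)
```

The two-evaluation gradient estimate is what makes SPSA attractive here: the retrieval loss can only be probed by running a query, so a method whose cost per iteration is independent of the number of feature weights is a natural fit.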
Abstract:
Dendrocalamus strictus and Bambusa arundinacea are monocarpic, gregariously flowering species of bamboo, common in the deciduous forests of the State of Karnataka in India. Their populations have declined significantly, especially since the last flowering. This decline parallels the increasing incidence of grazing, fire and extraction in recent decades. Results of an experiment in which the intensities of grazing and fire were varied indicate that while grazing significantly depresses the survival of seedlings and the recruitment of new culms in bamboo clumps, fire appeared to enhance seedling survival, presumably by reducing competition from less fire-resistant species. New shoots of bamboo are destroyed by insects and a variety of herbivorous mammals. In areas of intense herbivore pressure, a bamboo clump initiates the production of a much larger number of new culms, but ends up with many fewer and shorter intact culms. Extraction renders the new shoots more susceptible to herbivore pressure by removing the protective covering of branches at the base of a bamboo clump. Hence, regular and extensive extraction by the paper mills, in conjunction with intense grazing pressure, strongly depresses the addition of new culms to bamboo clumps. Regulation of grazing by domestic livestock in the forest, along with maintenance of the cover at the base of the clumps by extracting the culms at a higher level, should reduce the rate of decline of the bamboo stocks.
Abstract:
A method of ion extraction from plasmas is reported in which interference of field lines due to the extraction system in the plasma region is avoided by proper shaping of the extractor electrode; the design is supported by field plots.
Abstract:
The minimum-cost classifier, when general cost functions are associated with the tasks of feature measurement and classification, is formulated as a decision graph which does not reject class labels at intermediate stages. Noting its complexity, a heuristic procedure to simplify this scheme to a binary decision tree is presented. The optimization of the binary tree in this context is carried out using dynamic programming. This technique is applied to the voiced-unvoiced-silence classification in speech processing.
Abstract:
It is shown that lithium can be oxidatively extracted from Li2MoO3 at room temperature using Br2 in CHCl3. The delithiated oxides, Li2−xMoO3 (0 < x ≤ 1.5), retain the parent ordered rock-salt structure. Complete removal of lithium from Li2MoO3 using Br2 in CH3CN results in a poorly crystalline MoO3 that transforms to the stable structure at 280 °C. Li2MoO3 undergoes topotactic ion exchange in aqueous H2SO4 to yield a new protonated oxide, H2MoO3.
Abstract:
The kinetics of iron(III) extraction by bis(2-ethylhexyl) phosphate (HDEHP, HA) in kerosene from sulfuric acid solutions has been studied in a liquid-liquid laminar jet reactor. The contact time of the interface in this reacting device is of the same order of magnitude as the surface renewal time in dispersion mixing, and much less than that obtained in the relatively quiescent conditions of the Lewis cell. Yet the analysis of the data in this study suggested a rate-controlling step involving surface saturation, quite in conformity with that obtained in the Lewis cell and not with that in dispersion mixing as reported in the literature. Further, the mechanism suggested a weaker dependence of the rate on hydrogen ion concentration than that reported by other workers.
Abstract:
The addition of activated carbon particles (Darco-G, average size 4.3 μm) is shown to enhance the initial rate of extraction of copper in a Lewis cell by a mixture of α- and β-hydroxyoximes, when the rate of extraction is controlled by resistances in the organic phase. It is likely that the copper complex is adsorbed by carbon near the interface and partially released in the bulk. The enhancing effect of carbon vanishes when toluene is used as a diluent instead of heptane, presumably because toluene preferentially adsorbs on the carbon surface.
Abstract:
The concept of feature selection in a nonparametric unsupervised learning environment is practically undeveloped because no true measure of the effectiveness of a feature exists in such an environment. The lack of a feature selection phase preceding the clustering process seriously affects the reliability of such learning. New concepts such as significant features, level of significance of features, and immediate neighborhood are introduced which implicitly meet the need for feature selection in the context of clustering techniques.
Abstract:
In this paper, the pattern classification problem in tool wear monitoring is solved using nature-inspired techniques, namely Genetic Programming (GP) and the Ant-Miner (AM). The main advantage of GP and AM is their ability to learn the underlying data relationships and express them in the form of a mathematical equation or simple rules. The knowledge extracted from the training data set using GP and AM takes the form of a Genetic Programming Classifier Expression (GPCE) and rules, respectively. The GPCE and the AM-extracted rules are then applied to the data in the testing/validation set to obtain the classification accuracy. A major attraction of GP-evolved GPCEs and AM-based classification is the possibility of obtaining expert-system-like rules that can be applied directly by users in their own applications. The performance of data classification using GP and AM is as good as the classification accuracy obtained in the earlier study.
Abstract:
In this paper, we present a new feature-based approach for mosaicing of camera-captured document images. A novel block-based scheme is employed to ensure that corners can be reliably detected over a wide range of images. The 2-D discrete cosine transform is computed for image blocks defined around each of the detected corners, and a small subset of the coefficients is used as a feature vector. A two-pass feature matching is performed to establish point correspondences from which the homography relating the input images can be computed. The algorithm is tested on a number of complex document images casually taken with a hand-held camera, yielding convincing results.
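The DCT block feature described above can be sketched in numpy: a separable 2-D DCT of the patch around a detected corner, keeping only a small square of low-frequency coefficients. The block size, the number of kept coefficients and the unit-norm normalization are illustrative choices, not the paper's exact parameters:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    D = np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    D[0] *= np.sqrt(1.0 / n)   # DC row scaling
    D[1:] *= np.sqrt(2.0 / n)  # AC rows scaling
    return D

def corner_feature(img, corner, block=16, keep=6):
    """DCT feature vector for a block centred on a detected corner."""
    y, x = corner
    h = block // 2
    patch = img[y - h:y + h, x - h:x + h].astype(float)
    D = dct_matrix(block)
    coeffs = D @ patch @ D.T              # separable 2-D DCT
    feat = coeffs[:keep, :keep].ravel()   # low-frequency subset
    return feat / (np.linalg.norm(feat) + 1e-12)

rng = np.random.default_rng(0)
img = rng.random((64, 64))                # stand-in document image
f = corner_feature(img, (32, 32))
```

Matching such unit-norm vectors by dot product (cosine similarity) is then a natural way to establish the point correspondences from which a homography is estimated.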
Abstract:
We describe a novel method for human activity segmentation and interpretation in surveillance applications based on Gabor filter-bank features. A complex human activity is modeled as a sequence of elementary human actions such as walking, running, jogging, boxing and hand-waving. Since a human silhouette can be modeled by a set of rectangles, the elementary human actions can be modeled as a sequence of sets of rectangles with different orientations and scales. The activity segmentation is based on Gabor filter-bank features and normalized spectral clustering. The feature trajectories of an action category are learnt from training example videos using dynamic time warping. The combined segmentation and recognition processes are very efficient, as both algorithms share the same framework and the Gabor features computed for the former can be used for the latter. We have also proposed a simple shadow detection technique to extract good silhouettes, which is necessary for good accuracy of an action recognition technique.
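The dynamic-time-warping comparison of feature trajectories can be sketched with the standard DP recurrence; one-dimensional trajectories and an absolute-difference local cost are simplifying assumptions (the paper's trajectories are multidimensional Gabor features):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two feature trajectories."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])  # local frame-to-frame cost
            # best of insertion, deletion and match alignments
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

The elastic alignment is what makes DTW suitable here: two executions of the same action at different speeds still align with low cost.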
Abstract:
In this paper, an approach for automatic road extraction in an urban region using structural, spectral and geometric characteristics of roads is presented. Roads are extracted in two stages: pre-processing and road extraction. Initially, the image is pre-processed to improve robustness by reducing clutter (mostly buildings, parking lots, vegetated regions and other open spaces). The road segments are then extracted using Texture Progressive Analysis (TPA) and the Normalized-cut algorithm. The TPA technique uses binary segmentation based on three levels of texture statistical evaluation to extract road segments, whereas the Normalized-cut method for road extraction is a graph-based method that generates an optimal partition of road segments. The performance (quality measures) of road extraction using TPA and the Normalized-cut method is compared. The experimental results show that the Normalized-cut method is efficient in extracting road segments in an urban region from high-resolution satellite imagery.
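A two-way normalized cut can be sketched via the sign of the second-smallest eigenvector of the symmetric normalized Laplacian; the toy 4-node affinity matrix below is an illustration, not the paper's road-segment graph:

```python
import numpy as np

def ncut_bipartition(W):
    """Two-way partition of an affinity matrix via a normalized spectral cut."""
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    # symmetric normalized Laplacian: I - D^{-1/2} W D^{-1/2}
    L = np.eye(len(W)) - (d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :])
    _, vecs = np.linalg.eigh(L)     # eigenvalues in ascending order
    fiedler = vecs[:, 1]            # second-smallest eigenvector
    return fiedler >= 0             # sign gives the two groups

# Two tightly connected pairs of segments with weak cross-links.
W = np.array([[0.00, 1.00, 0.01, 0.01],
              [1.00, 0.00, 0.01, 0.01],
              [0.01, 0.01, 0.00, 1.00],
              [0.01, 0.01, 1.00, 0.00]])
labels = ncut_bipartition(W)
```

Recursing this bipartition on each side is the usual way such a spectral method produces a multi-way partition of the segments.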
Abstract:
Guo and Nixon proposed a feature selection method based on maximizing I(x; Y), the multidimensional mutual information between the feature vector x and the class variable Y. Because computing I(x; Y) can be difficult in practice, Guo and Nixon proposed an approximation of I(x; Y) as the criterion for feature selection. We show that Guo and Nixon's criterion originates from approximating the joint probability distributions in I(x; Y) by second-order product distributions. We remark on the limitations of the approximation and discuss computationally attractive alternatives to computing I(x; Y).
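As a reference point for the quantity being approximated, the mutual information between one discrete feature and the class variable can be estimated from empirical counts; this is a routine sketch, and Guo and Nixon's multidimensional criterion itself is not reproduced here:

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Empirical I(X; Y) in bits for two discrete variables."""
    n = len(x)
    pxy = Counter(zip(x, y))          # joint counts
    px, py = Counter(x), Counter(y)   # marginal counts
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        mi += p_ab * np.log2(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi
```

The practical difficulty the approximation addresses is visible here: estimating the joint distribution over a full feature vector x requires exponentially many cells, whereas low-order (e.g. pairwise) distributions can be estimated reliably from modest samples.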