918 results for Pattern recognition techniques
Abstract:
Nonogram is a logical puzzle whose associated decision problem is NP-complete. It has applications in pattern recognition problems and data compression, among others. The puzzle consists of determining an assignment of colors to pixels distributed in an N × M matrix that satisfies line and column constraints. A Nonogram is encoded by a vector whose elements specify the number of pixels in each row and column of a figure without specifying their coordinates. This work presents exact and heuristic approaches to solve Nonograms. Depth-first search was one of the chosen exact approaches because it is a typical example of a brute-force search algorithm that is easy to implement. Another exact approach was based on the Las Vegas algorithm, with which we investigate whether the randomness introduced by the Las Vegas-based algorithm is an advantage over depth-first search. The Nonogram is also transformed into a Constraint Satisfaction Problem. Three heuristic approaches are proposed: a Tabu Search and two memetic algorithms. A new way to calculate the objective function is also proposed. The approaches are applied to 234 instances, ranging in size from 5 × 5 to 100 × 100, including both logical and random Nonograms.
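To make the brute-force approach concrete, here is a minimal depth-first-search sketch for black-and-white Nonograms, written in Python for illustration. It is not the authors' implementation; the helper names (`row_candidates`, `feasible`) are our own. It enumerates candidate rows satisfying the row clues and backtracks whenever a column's runs already exceed its clues.

```python
# Minimal brute-force DFS for black-and-white Nonograms (illustrative sketch;
# real solvers add line solving and constraint propagation).

def runs(cells):
    """Run lengths of 1s in a 0/1 sequence."""
    out, run = [], 0
    for v in cells:
        if v:
            run += 1
        elif run:
            out.append(run)
            run = 0
    if run:
        out.append(run)
    return out

def row_candidates(clues, width):
    """All 0/1 rows of length `width` whose runs of 1s match `clues`."""
    if not clues:
        return [[0] * width]
    first, rest = clues[0], clues[1:]
    need = sum(rest) + len(rest)          # cells required after the first run
    out = []
    for start in range(width - need - first + 1):
        head = [0] * start + [1] * first + ([0] if rest else [])
        for tail in row_candidates(rest, width - len(head)):
            out.append(head + tail)
    return out

def feasible(grid, col_clues):
    """Prefix check: no column's runs may already exceed its clues."""
    for c, clue in enumerate(col_clues):
        done = runs([row[c] for row in grid])
        if len(done) > len(clue) or any(a > b for a, b in zip(done, clue)):
            return False
    return True

def solve(row_clues, col_clues, grid=None):
    grid = [] if grid is None else grid
    width = len(col_clues)
    if len(grid) == len(row_clues):       # complete: verify columns exactly
        ok = all(runs([r[c] for r in grid]) == list(clue)
                 for c, clue in enumerate(col_clues))
        return grid if ok else None
    for cand in row_candidates(row_clues[len(grid)], width):
        if feasible(grid + [cand], col_clues):
            sol = solve(row_clues, col_clues, grid + [cand])
            if sol is not None:
                return sol
    return None

# Example: a 3x3 plus sign.
print(solve([[1], [3], [1]], [[1], [3], [1]]))
```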
Abstract:
Data clustering is applied in various fields such as data mining, image processing and pattern recognition. Clustering algorithms split a data set into clusters such that elements within the same cluster have a high degree of similarity, while elements belonging to different clusters have a high degree of dissimilarity. The Fuzzy C-Means (FCM) algorithm is one of the most used and discussed fuzzy clustering algorithms in the literature. The performance of FCM is strongly affected by the selection of the initial cluster centers, so the choice of a good set of initial centers is very important for the performance of the algorithm. However, in FCM the initial centers are chosen randomly, making it difficult to find a good set. This paper proposes three new methods to obtain initial cluster centers deterministically for the FCM algorithm; they can also be used in variants of FCM. In this work these initialization methods were applied to the ckMeans variant. With the proposed methods, we intend to obtain a set of initial centers close to the real cluster centers. With these new initialization approaches we aim to reduce the number of iterations these algorithms need to converge and their processing time, without degrading cluster quality, and even improving it in some cases. Accordingly, cluster validation indices were used to measure the quality of the clusters obtained by the modified FCM and ckMeans algorithms with the proposed initialization methods when applied to various data sets.
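As a reference point for the clustering discussion, here is a minimal Python sketch of the standard FCM updates paired with one deterministic initializer. The farthest-first initializer is only a stand-in: the abstract does not specify the paper's three proposed methods.

```python
import numpy as np

def farthest_first_centers(X, k):
    """Illustrative deterministic initializer (farthest-first traversal).
    A stand-in for 'deterministic initialization', NOT one of the paper's
    three (unspecified) methods."""
    first = X[np.argmin(np.linalg.norm(X - X.mean(axis=0), axis=1))]
    centers = [first]                      # start at the point nearest the mean
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])    # farthest point from chosen centers
    return np.array(centers)

def fcm(X, k, m=2.0, tol=1e-5, max_iter=300):
    """Standard Fuzzy C-Means updates with deterministic initial centers."""
    C = farthest_first_centers(X, k)
    for _ in range(max_iter):
        d = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2) + 1e-12
        W = d ** (-2.0 / (m - 1.0))        # u_ij proportional to d_ij^(-2/(m-1))
        U = W / W.sum(axis=1, keepdims=True)
        Um = U ** m
        C_new = (Um.T @ X) / Um.sum(axis=0)[:, None]
        if np.linalg.norm(C_new - C) < tol:
            return C_new, U
        C = C_new
    return C, U
```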
Abstract:
Since 2005, geophysical surveys have been carried out in the Precambrian Borborema Province along two transects, each 800 km long. A pool of Brazilian public universities and institutions has acquired deep refraction seismic, gravity and magnetotelluric data, with the purpose of modeling the continental lithosphere of the region. This paper presents the gravity survey of the second transect, which crosses the Borborema Province from SW to NE, passing through the São Francisco Craton, the Transversal and Meridional zones, and the Rio Grande do Norte Domain in the Setentrional Zone. In this way, it cuts some important geologic structures, such as the limit between the São Francisco Craton and the Borborema Province, the Paleozoic and Mesozoic sedimentary basins of Tucano, Jatobá and Potiguar, and the extensive Pernambuco and Patos shear zones. Techniques for recognizing gravity sources in the subsurface, such as spectral analysis and Euler deconvolution, were applied to the Bouguer anomalies, as well as to their regional and residual components. These techniques provided information on possible anomalous bodies which, correlated with pre-existing geological and geophysical data, supported a 2.5D gravity modeling of the lithosphere beneath the Borborema Province and its southern limit with the São Francisco Craton.
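Euler deconvolution, one of the source-recognition techniques mentioned, reduces to a small least-squares problem per data window. A minimal Python sketch of the standard Euler equation follows; the function and argument names are our own, and in practice the window slides over the Bouguer grid, keeping only well-constrained solutions.

```python
import numpy as np

def euler_window(x, y, z, g, gx, gy, gz, N):
    """Least-squares Euler deconvolution over one data window: solves
    (x - x0)*gx + (y - y0)*gy + (z - z0)*gz = N*(B - g)
    for the source position (x0, y0, z0) and base level B, where N is the
    assumed structural index. Inputs are 1-D arrays over the window."""
    A = np.column_stack([gx, gy, gz, N * np.ones_like(g)])
    b = x * gx + y * gy + z * gz + N * g
    (x0, y0, z0, B), *_ = np.linalg.lstsq(A, b, rcond=None)
    return x0, y0, z0, B
```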
Abstract:
This paper reports the novel application of digital curvature as a feature for morphological characterization and classification of landmark shapes. By inheriting several unique features of the continuous curvature, the digital curvature provides invariance to translations, rotations, local shape deformations, and is easily made tolerant to scaling. In addition, the bending energy, a global shape feature, can be directly estimated from the curvature values. The application of these features to analyse patterns of cranial morphological geographic differentiation in the rodent species Thrichomys apereoides has led to encouraging results, indicating a close correspondence between the geographical and morphological distributions. (C) 2003 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
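The digital curvature and bending energy described here can be estimated directly from contour coordinates. A minimal Python sketch, using periodic finite differences as a simplification of the Fourier-based derivatives usually employed for digital curvature:

```python
import numpy as np

def digital_curvature(contour):
    """k(t) = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2) along a closed contour
    given as an (n, 2) array. Derivatives use periodic central differences,
    a simplification of Fourier-domain differentiation with smoothing."""
    x, y = contour[:, 0].astype(float), contour[:, 1].astype(float)
    dx = (np.roll(x, -1) - np.roll(x, 1)) / 2.0
    dy = (np.roll(y, -1) - np.roll(y, 1)) / 2.0
    ddx = (np.roll(dx, -1) - np.roll(dx, 1)) / 2.0
    ddy = (np.roll(dy, -1) - np.roll(dy, 1)) / 2.0
    return (dx * ddy - dy * ddx) / ((dx**2 + dy**2) ** 1.5 + 1e-12)

def bending_energy(contour):
    """Global shape feature: mean squared curvature over the contour."""
    return np.mean(digital_curvature(contour) ** 2)
```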
Abstract:
The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10³⁴ cm⁻²s⁻¹ (10²⁷ cm⁻²s⁻¹). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux-return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudorapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12500 t.
Abstract:
Epstein-Barr virus (EBV) has been associated with 10% of gastric carcinomas. The aim of this study was to determine the frequency of EBV in gastric carcinomas in Brazil as assessed by in situ hybridization (ISH) and PCR, which would contribute to the characterization of the clinical and pathological aspects of EBV-associated gastric carcinomas. One hundred and ninety-two gastric carcinoma cases were collected at hospitals in two Brazilian states. Seventy-three out of 151 cases were PCR(+), while 11/160 cases were ISH(+). Nine of the eleven ISH(+) cases displayed a diffuse staining pattern and two of them a focal pattern. Both techniques showed that the EBV(+) cases were characterized by their association with males, older patients, the lower gastric region, the intestinal type, advanced stage, and poorly to moderately differentiated tumors. The concordance between the two techniques was 55.8% (Cohen's kappa index = 0.034). Four cases were ISH(+)/PCR(-), while 49 cases were PCR(+)/ISH(-). Only two cases showed stained lymphocytes by ISH, and one of them was PCR(-). The observed discrepancy between the two techniques could not be explained solely by the elevated accuracy of PCR. ISH(+)/PCR(-) carcinomas may be encountered if EBV is not present in the whole tumor tissue or if there are polymorphisms in the amplified sequences of the viral genome. On the other hand, the high frequency of PCR(+) results associated with the absence of ISH staining in lymphocytes and/or tumor cells suggests that the virus may be present in tumor cells or other cell types without expressing EBER1, the target of the ISH technique.
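The reported concordance statistic is Cohen's kappa, which corrects raw agreement for chance. A minimal Python sketch, with hypothetical counts (not the study's actual table):

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table [[a, b], [c, d]], where
    a = both tests positive, b = PCR(+)/ISH(-), c = PCR(-)/ISH(+),
    d = both negative. kappa = (p_o - p_e) / (1 - p_e)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    p_o = (a + d) / n                                       # observed agreement
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts: ~53% raw agreement yet kappa ~ 0.09, illustrating how
# marginal imbalance between the two tests drives kappa toward zero.
print(cohens_kappa([[10, 49], [4, 50]]))
```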
Abstract:
Objective: Ocular conjunctivas of healthy dogs were studied by conjunctival impression cytology to evaluate the feasibility of the technique, standardize a protocol, and characterize the normal cytologic pattern. Animals studied: Twenty healthy, adult, cross-breed dogs. Procedures: Samples of the bulbar conjunctiva were collected after instillation of topical anesthetic drops at the ocular surface. Impression cytology was performed by applying asymmetric strips of Millipore filter on the superior temporal bulbar conjunctiva near the limbus. The filter strip was gently pressed against the conjunctiva for 5 s and removed with a peeling motion. Samples were immediately fixed in 95% ethyl alcohol, stained with periodic acid-Schiff and hematoxylin, and mounted on slides cover-slipped using synthetic resin. The slides were examined by light microscopy. Results: Microscopic examination of the impressions revealed superficial, intermediate and basal epithelial cells arranged in sheets. Keratinized epithelial cells, goblet cells and leukocytes, as well as cellular debris and mucus, were observed. Conclusions: The feasibility of impression cytology for sampling the bulbar conjunctiva of the dog and the standardization of the proposed protocol were demonstrated. The results allowed the recognition of the normal cytologic pattern of healthy conjunctivas in dogs.
Abstract:
A methodology for pipeline leakage detection using a combination of clustering and classification tools for fault detection is presented here. A fuzzy system is used to classify the running mode and identify operational and process transients. The relationship between these transients and the mass-balance deviation is discussed. This strategy allows better identification of leakage because the thresholds are adjusted by the fuzzy system as a function of the running mode and the classified transient level. The fuzzy system is initially trained off-line with a modified data set that includes simulated leakages. The methodology is applied to a small-scale LPG pipeline monitoring case where portability, robustness and reliability are amongst the most important criteria for the detection system. The results are very encouraging, with relatively low levels of false alarms and increased leakage detection at low computational cost. (c) 2005 Elsevier B.V. All rights reserved.
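The core idea, an alarm threshold on the mass-balance deviation that the fuzzy system widens during transients, can be sketched as follows. The membership shapes and scale factors below are illustrative guesses, not the paper's tuned system.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def leak_threshold(transient_level, base_threshold):
    """Scale the mass-balance alarm threshold with the fuzzy-classified
    transient level (0 = steady flow, 1 = strong transient): calm operation
    keeps a tight threshold, transients loosen it to avoid false alarms.
    Shapes and factors are illustrative, not the paper's values."""
    low = tri(transient_level, -0.1, 0.0, 0.5)
    med = tri(transient_level, 0.0, 0.5, 1.0)
    high = tri(transient_level, 0.5, 1.0, 1.1)
    scale = (low * 1.0 + med * 1.8 + high * 3.0) / (low + med + high + 1e-12)
    return base_threshold * scale

def leak_alarm(mass_in, mass_out, transient_level, base_threshold):
    """Flag a leak when the mass-balance deviation exceeds the adapted threshold."""
    return abs(mass_in - mass_out) > leak_threshold(transient_level, base_threshold)
```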
Abstract:
A set of 25 quinone compounds with anti-trypanocidal activity was studied using the density functional theory (DFT) method in order to calculate atomic and molecular properties to be correlated with the biological activity. The chemometric methods principal component analysis (PCA), hierarchical cluster analysis (HCA), stepwise discriminant analysis (SDA), Kth nearest neighbor (KNN) and soft independent modeling of class analogy (SIMCA) were used to obtain possible relationships between the calculated descriptors and the biological activity studied, and to predict the anti-trypanocidal activity of new quinone compounds from a prediction set. Four descriptors were responsible for the separation between the active and inactive compounds: T-5 (torsion angle), QTS1 (sum of absolute values of the atomic charges), VOLS2 (volume of the substituent at region B) and HOMO-1 (energy of the molecular orbital below the HOMO). These descriptors give information on the kind of interaction that occurs between the compounds and the biological receptor. The prediction study was done with a set of three new compounds using the PCA, HCA, SDA, KNN and SIMCA methods, and two of them were predicted as active against Trypanosoma cruzi. (c) 2005 Elsevier SAS. All rights reserved.
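The chemometric workflow (descriptor scaling, PCA, then a classifier such as KNN) can be outlined with scikit-learn. The data below are random placeholders standing in for the 25-compound descriptor matrix, not the study's values.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

# X: rows = compounds, columns = the four descriptors [T-5, QTS1, VOLS2, HOMO-1].
# y: 1 = active, 0 = inactive. Placeholder data, not the paper's training set.
rng = np.random.default_rng(0)
X = rng.random((25, 4))
y = rng.integers(0, 2, 25)

model = make_pipeline(StandardScaler(), PCA(n_components=2),
                      KNeighborsClassifier(n_neighbors=3))
model.fit(X, y)

X_new = rng.random((3, 4))   # stand-in for the three prediction-set compounds
print(model.predict(X_new))
```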
Abstract:
In this work, a perceptron neural-network technique is applied to estimate hourly values of the diffuse solar radiation at the surface in São Paulo City, Brazil, using as input the global solar radiation and other meteorological parameters measured from 1998 to 2001. The neural-network verification was performed using the hourly measurements of diffuse solar radiation obtained during the year 2002. The neural network was developed based on both feature determination and pattern selection techniques. It was found that the inclusion of the atmospheric long-wave radiation as input improves the neural-network performance. On the other hand, traditional meteorological parameters, like air temperature and atmospheric pressure, are not as important as long-wave radiation, which acts as a surrogate for cloud-cover information on the regional scale. An objective evaluation has shown that the diffuse solar radiation is better reproduced by neural-network synthetic series than by a correlation model. (C) 2004 Elsevier Ltd. All rights reserved.
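A minimal scikit-learn sketch of the approach: a multilayer-perceptron regressor mapping global solar radiation plus atmospheric long-wave radiation (and another input) to diffuse radiation. The data here are synthetic stand-ins, not the São Paulo measurements, and the network size is an assumption.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

# Illustrative inputs: global solar radiation, long-wave radiation, zenith angle.
# Synthetic target built from the inputs, only to make the example runnable.
rng = np.random.default_rng(0)
X = rng.random((1000, 3))
y = 0.4 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.05, 1000)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                                   random_state=0))
model.fit(X, y)
print(model.predict(X[:5]))   # estimated diffuse radiation for the first samples
```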
Abstract:
This paper addresses biometric identification using large databases, in particular iris databases. In such applications, it is critical to have a low response time while maintaining an acceptable recognition rate. Thus, the trade-off between speed and accuracy must be evaluated for the processing and recognition parts of an identification system. In this paper, a graph-based framework for pattern recognition, called Optimum-Path Forest (OPF), is utilized as a classifier in a pre-developed iris recognition system. The aim of this paper is to verify the effectiveness of OPF in the field of iris recognition, and its performance for iris databases of various scales. The existing Gauss-Laguerre wavelet-based coding scheme is used for iris encoding. The performance of the OPF and of two other classifiers, Hamming and Bayesian, is compared using small-, medium-, and large-scale databases. The comparison shows that the OPF has a faster response for large-scale databases, thus performing better than the more accurate, but slower, classifiers.
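For readers unfamiliar with OPF, here is a didactic Python reconstruction of its training and classification steps (f_max path cost, prototypes taken from class-boundary MST edges). It follows the published algorithm in outline only; it is not LibOPF or the authors' code, and it assumes at least two classes in the training set.

```python
import numpy as np

def opf_train(X, y):
    """Minimal Optimum-Path Forest training sketch. Prototypes are endpoints
    of minimum-spanning-tree edges joining different classes (Prim's
    algorithm); minimax path costs are then propagated Dijkstra-style."""
    X, y = np.asarray(X, float), np.asarray(y)
    n = len(X)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    in_tree = np.zeros(n, bool); in_tree[0] = True
    proto = np.zeros(n, bool)
    for _ in range(n - 1):                          # Prim's MST
        i, j = min(((a, b) for a in range(n) if in_tree[a]
                    for b in range(n) if not in_tree[b]), key=lambda e: D[e])
        in_tree[j] = True
        if y[i] != y[j]:                            # class boundary -> prototypes
            proto[i] = proto[j] = True
    cost = np.where(proto, 0.0, np.inf)             # prototypes start at cost 0
    label, done = y.copy(), np.zeros(n, bool)
    for _ in range(n):                              # propagate minimax costs
        s = int(np.argmin(np.where(done, np.inf, cost))); done[s] = True
        for t in range(n):
            c = max(cost[s], D[s, t])
            if c < cost[t]:
                cost[t], label[t] = c, label[s]
    return cost, label

def opf_classify(X_train, cost, label, x):
    """Assign x the label of the training sample minimizing max(cost, distance)."""
    d = np.linalg.norm(np.asarray(X_train, float) - np.asarray(x, float), axis=1)
    return label[int(np.argmin(np.maximum(cost, d)))]
```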
Abstract:
The majority of biometric researchers focus on the accuracy of matching using biometric databases, including iris databases, while scalability and speed issues have been neglected. In applications such as identification at airports and borders, it is critical for the identification system to have a low response time. In this paper, a graph-based framework for pattern recognition, called Optimum-Path Forest (OPF), is utilized as a classifier in a pre-developed iris recognition system. The aim of this paper is to verify the effectiveness of OPF in the field of iris recognition, and its performance for iris databases of various scales. The paper investigates several classifiers widely used in the iris recognition literature, evaluating response time along with accuracy. The existing Gauss-Laguerre wavelet-based iris coding scheme, which shows perfect discrimination with a rotary Hamming distance classifier, is used for iris coding. The performance of the classifiers is compared using small-, medium-, and large-scale databases. The comparison shows that OPF has a faster response for the large-scale database, thus performing better than the more accurate but slower Bayesian classifier.
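The rotary Hamming distance mentioned above compares binary iris codes under circular bit shifts, so that in-plane eye rotation is absorbed. A minimal Python sketch, with an assumed shift range and optional occlusion masks (conventions of our own, not the paper's exact parameters):

```python
import numpy as np

def rotary_hamming(code_a, code_b, mask_a=None, mask_b=None, max_shift=8):
    """Normalized Hamming distance between binary iris codes, minimized over
    circular shifts (the 'rotary' part). Masks mark usable, unoccluded bits."""
    code_a, code_b = np.asarray(code_a, bool), np.asarray(code_b, bool)
    n = code_a.size
    ma = np.ones(n, bool) if mask_a is None else np.asarray(mask_a, bool)
    mb = np.ones(n, bool) if mask_b is None else np.asarray(mask_b, bool)
    best = 1.0
    for s in range(-max_shift, max_shift + 1):
        b, m = np.roll(code_b, s), ma & np.roll(mb, s)
        if m.any():
            best = min(best, np.count_nonzero(code_a[m] != b[m]) / m.sum())
    return best   # small distance -> likely the same iris
```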
Abstract:
Anaerobic threshold (AT) is usually estimated as a change-point problem by visual analysis of the cardiorespiratory response to incremental dynamic exercise. In this study, two-phase linear (TPL) models of the linear-linear and linear-quadratic type were used for the estimation of AT. The correlation coefficient between the classical and statistical approaches was 0.88, and 0.89 after outlier exclusion. The TPL models provide a simple method for estimating AT that can be easily implemented on a digital computer for the automatic pattern recognition of AT.
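A linear-linear TPL fit reduces to a grid search over the breakpoint with two least-squares lines. A minimal Python sketch of the change-point idea (the linear-quadratic variant and any continuity constraint at the breakpoint are omitted):

```python
import numpy as np

def two_phase_linear(t, v, min_pts=5):
    """Grid-search fit of a linear-linear two-phase model: fit independent
    least-squares lines on each side of a candidate breakpoint and keep the
    split with the smallest total residual sum of squares."""
    t, v = np.asarray(t, float), np.asarray(v, float)
    best_rss, best_bp = np.inf, None
    for i in range(min_pts, len(t) - min_pts):
        rss = 0.0
        for ts, vs in ((t[:i], v[:i]), (t[i:], v[i:])):
            A = np.column_stack([ts, np.ones_like(ts)])
            coef, res, *_ = np.linalg.lstsq(A, vs, rcond=None)
            rss += res[0] if res.size else np.sum((vs - A @ coef) ** 2)
        if rss < best_rss:
            best_rss, best_bp = rss, t[i]
    return best_bp    # candidate anaerobic-threshold (breakpoint) location
```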