313 results for Supervised pattern recognition


Relevance:

80.00%

Publisher:

Abstract:

In order to comprehend user information needs at the concept level, this paper introduces a novel method for matching relevance features with ontological concepts. The method first discovers relevance features from a user's local instances. A concept matching approach is then developed for mapping these features to accurate concepts in a global knowledge base. This approach is significant for bridging informative descriptors and conceptual descriptors. The proposed method is thoroughly evaluated against three information gathering baseline models. The experimental results show that the matching approach is successful and achieves a series of remarkable improvements in search effectiveness.
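A minimal sketch of the general idea, with purely illustrative data structures and a simple overlap-based scoring rule that is not claimed to be the paper's algorithm:

```python
# Illustrative only: match weighted relevance features discovered from a user's
# local instances to concepts in a global knowledge base by scoring term overlap.
# All names and the scoring rule are assumptions, not the paper's method.

def match_features_to_concepts(features, concepts, top_k=3):
    """features: dict term -> relevance weight (from local instances).
    concepts: dict concept_label -> set of descriptor terms (global KB).
    Returns the top_k concepts ranked by summed weight of shared terms."""
    scores = {}
    for label, terms in concepts.items():
        overlap = set(features) & terms
        if overlap:
            scores[label] = sum(features[t] for t in overlap)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_k]

if __name__ == "__main__":
    features = {"neural": 0.9, "network": 0.8, "training": 0.4}
    concepts = {
        "Artificial neural network": {"neural", "network", "neuron"},
        "Rail network": {"rail", "network", "track"},
    }
    print(match_features_to_concepts(features, concepts))
```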

Relevance:

80.00%

Publisher:

Abstract:

News blog hot topics are important for information recommendation services and marketing. However, information overload and personalized management make information arrangement more difficult. Moreover, what influences the formation and development of blog hot topics has seldom received attention. In order to correctly detect news blog hot topics, this paper first analyzes the development of topics from a new perspective based on the W2T (Wisdom Web of Things) methodology: the characteristics of blog users, the context of topic propagation and information granularity are unified to analyze the related problems. Factors such as user behavior patterns, network opinion and opinion leaders are subsequently identified as important for the development of topics. A topic model based on the view of event reports is then constructed. Finally, hot topics are identified by their duration, topic novelty, degree of topic growth and degree of user attention. The experimental results show that the proposed method is feasible and effective.
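As a rough illustration of the final identification step, the sketch below ranks candidate topics by a weighted combination of the four indicators named in the abstract; the weights and the linear form are assumptions, not the paper's model:

```python
# Illustrative sketch: rank candidate blog topics by duration, novelty,
# growth and user attention. Weights and scaling are made up.

from dataclasses import dataclass

@dataclass
class Topic:
    name: str
    duration_days: float   # how long the topic has been discussed
    novelty: float         # 0..1, dissimilarity to earlier topics
    growth: float          # relative increase in posts per day
    attention: float       # normalised comment/view volume

def hotness(t: Topic, w=(0.2, 0.3, 0.3, 0.2)):
    """Simple weighted combination of the four indicators (assumed form)."""
    return (w[0] * t.duration_days / 30.0 + w[1] * t.novelty
            + w[2] * t.growth + w[3] * t.attention)

topics = [
    Topic("earthquake relief", 12, 0.8, 0.9, 0.7),
    Topic("celebrity gossip", 30, 0.2, 0.1, 0.5),
]
for t in sorted(topics, key=hotness, reverse=True):
    print(f"{t.name}: {hotness(t):.2f}")
```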

Relevance:

80.00%

Publisher:

Abstract:

Big data is big news in almost every sector, including crisis communication. However, not everyone has access to big data, and even those who do often lack the tools needed to analyze and cross-reference such large data sets. This paper therefore looks at patterns in small data sets that we are able to collect with our current tools, to understand whether actionable information can be found in what we already have. We analyzed 164,390 tweets collected during the 2011 earthquake to find out what types of location-specific information people mention in their tweets and when they do so. Based on our analysis, we find that even a small data set, with far less data than a big data set, can be useful for quickly identifying priority disaster-specific areas.
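A toy sketch of the kind of small-data analysis described, assuming a hand-made gazetteer of place names and made-up tweets; it simply counts location mentions per hour:

```python
# Illustrative only: count mentions of known place names in a modest set of
# timestamped tweets and bucket them by hour. Place names and tweets are invented.

import re
from collections import Counter, defaultdict
from datetime import datetime

PLACES = {"cbd", "lyttelton", "riccarton"}   # assumed gazetteer of local place names

def location_mentions(tweets):
    """tweets: iterable of (iso_timestamp, text). Returns {hour: Counter of places}."""
    by_hour = defaultdict(Counter)
    for ts, text in tweets:
        hour = datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00")
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        for place in PLACES & tokens:
            by_hour[hour][place] += 1
    return by_hour

sample = [
    ("2011-02-22T13:05:00", "Buildings down in the CBD, avoid the area"),
    ("2011-02-22T13:40:00", "Water needed in Lyttelton"),
]
for hour, counts in location_mentions(sample).items():
    print(hour, counts.most_common())
```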

Relevance:

80.00%

Publisher:

Abstract:

Trajectory basis Non-Rigid Structure From Motion (NRSFM) currently faces two problems: the limit of reconstructability and the need to tune the basis size for different sequences. This paper provides a novel theoretical bound on 3D reconstruction error, arguing that the existing definition of reconstructability is fundamentally flawed in that it fails to consider system condition. This insight motivates a novel strategy whereby the trajectory's response to a set of high-pass filters is minimised. The new approach eliminates the need to tune the basis size and is more efficient for long sequences. Additionally, the truncated DCT basis is shown to have a dual interpretation as a high-pass filter. The success of trajectory filter reconstruction is demonstrated quantitatively on synthetic projections of real motion capture sequences and qualitatively on real image sequences.
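The dual interpretation of the truncated DCT basis can be illustrated numerically: projecting a trajectory onto the first K DCT vectors keeps its low-frequency content, so the reconstruction residual behaves like the response of a high-pass filter. The sketch below, with an arbitrary basis size K and a synthetic 1-D trajectory, is illustrative only:

```python
# Numerical sketch of the truncated-DCT-as-high-pass-filter interpretation.
# The basis size K and the synthetic trajectory are arbitrary choices.

import numpy as np

def dct_basis(F, K):
    """First K columns of an orthonormal DCT-II basis for trajectories of length F."""
    n = np.arange(F)
    B = np.cos(np.pi * (n[:, None] + 0.5) * np.arange(K)[None, :] / F)
    B *= np.sqrt(2.0 / F)
    B[:, 0] /= np.sqrt(2.0)
    return B

F, K = 100, 10
t = np.linspace(0, 1, F)
smooth = np.sin(2 * np.pi * 2 * t)           # low-frequency motion
jitter = 0.2 * np.sin(2 * np.pi * 30 * t)    # high-frequency component
traj = smooth + jitter

B = dct_basis(F, K)
low_pass = B @ (B.T @ traj)    # projection onto the truncated basis
high_pass = traj - low_pass    # residual = high-pass filter response

print("energy kept   :", np.round(np.sum(low_pass**2), 3))
print("energy removed:", np.round(np.sum(high_pass**2), 3))
```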

Relevance:

80.00%

Publisher:

Abstract:

Highly sensitive infrared (IR) cameras provide high-resolution diagnostic images of the temperature and vascular changes of breasts. These images can be processed to emphasize hot spots that exhibit early and subtle changes owing to pathology. The resulting images show clusters that appear random in shape and spatial distribution but carry class-dependent information in shape and texture. Automated pattern recognition techniques are challenged by changes in the location, size and orientation of these clusters. Higher order spectral invariant features provide robustness to such transformations and are suited to texture- and shape-dependent information extraction from noisy images. In this work, the effectiveness of bispectral invariant features in the diagnostic classification of breast thermal images into malignant, benign and normal classes is evaluated, and a phase-only variant of these features is proposed. High-resolution IR images of breasts, captured with a measurement accuracy of ±0.4% (full scale) and a temperature resolution of 0.1 °C (black body), depicting malignant, benign and normal pathologies are used in this study. Breast images are registered using their lower boundaries, automatically extracted using landmark points whose locations are learned during training. Boundaries are extracted using Canny edge detection and elimination of inner edges. Breast images are then segmented using fuzzy c-means clustering and the hottest regions are selected for feature extraction. Bispectral invariant features are extracted from Radon projections of these images. An Adaboost classifier is used to select and fuse the best features during training and then classify unseen test images into malignant, benign and normal classes. A data set comprising 9 malignant, 12 benign and 11 normal cases is used for the evaluation of performance. Malignant cases are detected with 95% accuracy. A variant of the features using the normalized bispectrum, which discards all magnitude information, is shown to perform better for classification between benign and normal cases, with 83% accuracy compared to 66% for the original features.
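For readers unfamiliar with bispectral invariants, the sketch below computes phase-of-integrated-bispectrum features for a 1-D signal; in the paper each such signal would be a Radon projection of a segmented hot region, whereas here a synthetic signal stands in for it, and the exact feature definition used in the paper may differ:

```python
# Illustrative bispectral invariant features for a 1-D signal: the phase of the
# bispectrum B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2)) integrated along radial
# lines f2 = a * f1. The synthetic signal is a stand-in for a Radon projection.

import numpy as np

def bispectral_invariants(x, n_lines=8):
    """Phase of the bispectrum integrated along lines f2 = a*f1, 0 < a <= 1."""
    X = np.fft.fft(x)
    N = len(x)
    half = N // 2
    feats = []
    for a in np.linspace(1.0 / n_lines, 1.0, n_lines):
        acc = 0.0 + 0.0j
        for f1 in range(1, half):
            f2 = int(round(a * f1))
            if 0 < f2 <= f1 and f1 + f2 < N:
                acc += X[f1] * X[f2] * np.conj(X[f1 + f2])
        feats.append(np.angle(acc))   # phase only: invariant to shift and scaling
    return np.array(feats)

signal = np.convolve(np.random.default_rng(0).standard_normal(128),
                     [1.0, 0.5, 0.25], mode="same")
print(np.round(bispectral_invariants(signal), 3))
```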

Relevance:

80.00%

Publisher:

Abstract:

With the explosive growth of resources available through the Internet, information mismatching and overload have become a severe concern for users. Web users are commonly overwhelmed by the huge volume of information and are faced with the challenge of finding the most relevant and reliable information in a timely manner. Personalised information gathering and recommender systems represent state-of-the-art tools for the efficient selection of the most relevant and reliable information resources, and interest in such systems has increased dramatically over the last few years. However, web personalization has not yet been fully exploited, and difficulties remain in selecting resources through recommender systems from both a technological and a social perspective. Aiming to promote high-quality research to overcome these challenges, this paper provides a comprehensive survey of recent work and achievements in the areas of personalised web information gathering and recommender systems. The survey covers the concept-based techniques exploited in personalised information gathering and recommender systems.

Relevance:

80.00%

Publisher:

Abstract:

Iris-based identity verification is highly reliable, but it can also be subject to attacks. Pupil dilation or constriction stimulated by the application of drugs is an example of a sample presentation security attack that can lead to higher false rejection rates. Suspects on a watch list can potentially circumvent an iris-based system using such methods. This paper investigates a new approach using multiple parts of the iris (instances) and multiple iris samples in a sequential decision fusion framework that can yield robust performance. Results are presented and compared with the standard full-iris approach for a number of iris degradations. An advantage of the proposed fusion scheme is that the trade-off between detection errors can be controlled by setting parameters such as the number of instances and the number of samples used in the system. The system can then be operated to match security threat levels. It is shown that, for optimal values of these parameters, the fused system also has a lower total error rate.
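A schematic sketch of sequential decision fusion over iris instances and samples follows; the match threshold, the required numbers of instances and samples, and the early-accept rule are illustrative assumptions rather than the paper's exact scheme:

```python
# Illustrative sequential fusion over multiple iris parts (instances) and
# multiple samples. Thresholds and counts are made-up tunable parameters.

import random

def instance_match(score, threshold=0.35):
    """Accept a single iris-part comparison if its dissimilarity score is low."""
    return score < threshold

def sequential_fusion(sample_scores, required_instances=3, required_samples=2):
    """sample_scores: list of lists; one list of instance scores per sample.
    Accept once `required_samples` samples each have enough matching instances."""
    accepted_samples = 0
    for scores in sample_scores:          # samples are examined sequentially
        matches = sum(instance_match(s) for s in scores)
        if matches >= required_instances:
            accepted_samples += 1
            if accepted_samples >= required_samples:
                return True               # early accept: no further samples needed
    return False

random.seed(1)
genuine = [[random.uniform(0.10, 0.30) for _ in range(4)] for _ in range(3)]
impostor = [[random.uniform(0.45, 0.60) for _ in range(4)] for _ in range(3)]
print("genuine accepted: ", sequential_fusion(genuine))
print("impostor accepted:", sequential_fusion(impostor))
```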

Relevance:

80.00%

Publisher:

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing: current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques.

Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, which is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security.

This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While the existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images.

This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility, and it is designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded, and displays improved hashing performance compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
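As a minimal illustration of the quantization and encoding stage discussed above, the sketch below learns per-dimension thresholds (here simply medians) from training features and binarizes unseen feature vectors against them; median thresholding is one common choice and is not claimed to be the dissertation's scheme:

```python
# Illustrative quantizer training and binary encoding for robust hashing.
# Per-dimension median thresholds are an assumed, simple training choice.

import numpy as np

def learn_thresholds(train_features):
    """train_features: (n_samples, n_dims) real-valued robust features."""
    return np.median(train_features, axis=0)

def binarize(features, thresholds):
    """1-bit-per-dimension hash: bit i is 1 if feature i exceeds its threshold."""
    return (features > thresholds).astype(np.uint8)

rng = np.random.default_rng(0)
train = rng.normal(size=(500, 32))          # stand-in for extracted features
thresholds = learn_thresholds(train)

original = rng.normal(size=32)
perturbed = original + rng.normal(scale=0.05, size=32)   # minor content change
h1, h2 = binarize(original, thresholds), binarize(perturbed, thresholds)
print("Hamming distance between hashes:", int(np.sum(h1 != h2)))
```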

Relevance:

80.00%

Publisher:

Abstract:

A finely tuned innate immune response plays a pivotal role in protecting the host against bacterial invasion during periodontal disease progression. Hyperlipidemia has been suggested to exacerbate periodontal health conditions; however, the underlying mechanism has not been addressed. In the present study, we investigated the effect of hyperlipidemia on innate immune responses to infection by the periodontal pathogen Porphyromonas gingivalis. Apolipoprotein E-deficient (ApoE−/−) and wild-type mice at the age of 20 weeks were used for the study. Peritoneal macrophages were isolated and subsequently used for the study of viable P. gingivalis infection. ApoE−/− mice demonstrated inhibited iNOS production and impaired clearance of P. gingivalis in vitro and in vivo; furthermore, ApoE−/− mice displayed a disrupted cytokine production pattern in response to P. gingivalis, with decreased production of tumor necrosis factor-α (TNF-α), interleukin-6 (IL-6), IL-1β and monocyte chemotactic protein-1. Microarray data demonstrated that the Toll-like receptor (TLR) and NOD-like receptor (NLR) pathways were altered in ApoE−/− mouse macrophages; further analysis of pattern recognition receptors (PRRs) demonstrated that expression of triggering receptor expressed on myeloid cells-1 (TREM-1), an amplifier of the TLR and NLR pathways, was decreased in ApoE−/− mouse macrophages, leading to decreased recruitment of NF-κB onto the TNF-α and IL-6 promoters. Our data suggest that in ApoE−/− mice hyperlipidemia disrupts the expression of PRRs and cripples the host's capability to generate a sufficient innate immune response to P. gingivalis, which may facilitate immune evasion, subgingival colonization and establishment of P. gingivalis in the periodontal niche.

Relevance:

80.00%

Publisher:

Abstract:

The work described in this technical report is part of an ongoing project to build practical tools for the manipulation, analysis and visualisation of recordings of the natural environment. This report describes the methods we use to remove background noise from spectrograms. It updates techniques previously described in Towsey and Planitz (2011), Technical report: acoustic analysis of the natural environment, downloadable from http://eprints.qut.edu.au/41131/. It also describes noise removal from waveforms, a technique not covered in the 2011 technical report.
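A simplified sketch of per-frequency-bin background noise subtraction is shown below; the percentile-based noise estimate is a common approach and stands in for, rather than reproduces, the procedure described in the report:

```python
# Illustrative background noise removal from a spectrogram: estimate a noise
# level per frequency bin from the quieter frames (20th percentile here,
# an assumed choice), subtract it, and clip negative values.

import numpy as np

def remove_background_noise(spectrogram_db, percentile=20):
    """spectrogram_db: (n_freq_bins, n_frames) array in decibels."""
    noise_profile = np.percentile(spectrogram_db, percentile, axis=1, keepdims=True)
    cleaned = spectrogram_db - noise_profile
    return np.clip(cleaned, 0.0, None)

rng = np.random.default_rng(0)
spec = rng.normal(loc=-60, scale=3, size=(256, 1000))   # background in dB
spec[40:45, 300:320] += 30                               # a simulated call
print("max before:", spec.max().round(1),
      "max after:", remove_background_noise(spec).max().round(1))
```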

Relevance:

80.00%

Publisher:

Abstract:

In recent years face recognition systems have been applied in various useful applications, such as surveillance, access control, criminal investigations, law enforcement, and others. However, face biometric systems can be highly vulnerable to spoofing attacks, where an impostor tries to bypass the face recognition system using a photo or video sequence. In this paper a novel liveness detection method, based on the 3D structure of the face, is proposed. By processing the 3D curvature of the acquired data, the proposed approach allows a biometric system to distinguish a real face from a photo, increasing the overall performance of the system and reducing its vulnerability. In order to test the real capability of the methodology, a 3D face database has been collected in which spoofing attacks are simulated using photographs instead of real faces. The experimental results show the effectiveness of the proposed approach.
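A toy sketch of curvature-based liveness checking follows; the Laplacian-of-depth approximation and the threshold are illustrative assumptions, not the proposed method:

```python
# Illustrative liveness check: a flat photograph has near-zero surface
# curvature, while a real face is curved. Curvature is approximated by the
# discrete Laplacian of a depth map; the threshold is an arbitrary choice.

import numpy as np

def mean_abs_laplacian(depth, spacing):
    """Cheap curvature proxy: mean |discrete Laplacian| of a range image."""
    lap = (np.roll(depth, 1, 0) + np.roll(depth, -1, 0)
           + np.roll(depth, 1, 1) + np.roll(depth, -1, 1) - 4 * depth) / spacing**2
    return np.abs(lap[1:-1, 1:-1]).mean()   # drop wrap-around border

def is_live(depth, spacing, threshold=0.5):
    """Flag the capture as a real (curved) face if average curvature is high."""
    return mean_abs_laplacian(depth, spacing) > threshold

y, x = np.mgrid[-1:1:128j, -1:1:128j]
spacing = float(x[0, 1] - x[0, 0])
real_face = np.sqrt(np.clip(1.2 - x**2 - y**2, 0.0, None))  # curved surface
flat_photo = 0.01 * x                                        # planar surface
print("real face live? ", is_live(real_face, spacing))
print("flat photo live?", is_live(flat_photo, spacing))
```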

Relevance:

80.00%

Publisher:

Abstract:

In this study, a metabolomic approach integrated with Ingenuity Pathway Analysis (IPA) was applied to characterize the tissue-specific metabolic perturbations induced by indomethacin in rats. Selective pattern recognition analyses were applied to the global metabolic profiles of urine from rats treated with indomethacin at an acute reference dosage proven to induce tissue disorders, evaluated throughout the time course of −24 to 72 h. The results preliminarily revealed that modifications of amino acid metabolism, fatty acid metabolism and energy-associated metabolic pathways accounted for the metabolic perturbation induced by indomethacin. Furthermore, IPA was applied to analyze in depth the biomarkers and their relations to the metabolic perturbations evidenced by the pattern recognition analyses. The specific biochemical functions affected by indomethacin suggested an important correlation between its effects on kidney and liver metabolism, based on the determined metabolites and their pathway-based analysis. The IPA correlation of the three major biomarkers, identified as creatinine, prostaglandin E2 and guanosine, suggested that the administration of indomethacin induced certain levels of toxicity in the kidneys and liver. The changes in the levels of these biomarker metabolites allowed the phenotypic determination of the metabolic perturbations induced by indomethacin in a time-dependent manner.

Relevance:

80.00%

Publisher:

Abstract:

We introduce the use of Ingenuity Pathway Analysis to analyze global metabonomic data in order to characterize the phenotypic biochemical perturbations and the potential mechanisms of gentamicin-induced toxicity in multiple organs. A single dose of gentamicin was administered to Sprague Dawley rats (200 mg/kg, n = 6) and urine samples were collected at −24 to 0 h pre-dose and at 0-24, 24-48, 48-72 and 72-96 h post-dose. The urine metabonomics analysis was performed by UPLC/MS, and the mass spectral signals of the detected metabolites were systematically deconvoluted and analyzed by pattern recognition analyses (heatmap, PCA and PLS-DA), revealing the time-dependency of the biochemical perturbations induced by gentamicin toxicity. As a result, the holistic metabolome change induced by gentamicin toxicity in the animals was characterized. Several metabolites involved in amino acid metabolism were identified in urine, and it was confirmed that gentamicin's biochemical perturbations can be foreseen from these biomarkers. Notably, gentamicin was found to induce toxicity in multiple organ systems in the laboratory rats: the knowledge-based Ingenuity Pathway Analysis revealed gentamicin-induced liver and heart toxicity, along with the previously known toxicity in the kidney. The metabolites creatine, nicotinic acid, prostaglandin E2, and cholic acid were identified and validated as phenotypic biomarkers of gentamicin-induced toxicity. Altogether, the significance of metabonomics analyses in the assessment of drug toxicity is highlighted once more; furthermore, this work demonstrates the powerful predictive potential of Ingenuity Pathway Analysis for the study of drug toxicity and its valuable complementarity with metabonomics-based assessment of drug toxicity.
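To make the pattern recognition step concrete, the sketch below runs PCA (via SVD) on a simulated metabolite intensity matrix grouped by collection window; the data and the drift pattern are invented and stand in for UPLC/MS peak intensities:

```python
# Illustrative PCA of a metabolite intensity matrix, grouped by collection
# window, to visualise time-dependent separation. All numbers are simulated.

import numpy as np

rng = np.random.default_rng(0)
groups = ["pre-dose", "0-24 h", "24-48 h", "48-72 h", "72-96 h"]
n_per_group, n_metabolites = 6, 40

# Simulated intensities: each later window shifts a subset of metabolites.
X, labels = [], []
for g, name in enumerate(groups):
    shift = np.zeros(n_metabolites)
    shift[:10] = 0.8 * g                      # assumed toxicity-related drift
    X.append(rng.normal(size=(n_per_group, n_metabolites)) + shift)
    labels += [name] * n_per_group
X = np.vstack(X)

# PCA by SVD on the mean-centred matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :2] * S[:2]

for name in groups:
    idx = [i for i, l in enumerate(labels) if l == name]
    centre = scores[idx].mean(axis=0)
    print(f"{name:>8}: PC1={centre[0]:6.2f}  PC2={centre[1]:6.2f}")
```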

Relevance:

80.00%

Publisher:

Abstract:

Atherosclerotic cardiovascular disease remains the leading cause of morbidity and mortality in industrialized societies. The lack of metabolite biomarkers has so far impeded the clinical diagnosis of atherosclerosis. In this study, stable atherosclerosis patients (n=16) and age- and sex-matched non-atherosclerotic healthy subjects (n=28) were recruited from the local community (Harbin, P. R. China). Plasma was collected from each study subject and subjected to metabolomics analysis by GC/MS. Pattern recognition analyses (principal components analysis, orthogonal partial least-squares discriminant analysis, and hierarchical clustering analysis) consistently demonstrated that the plasma metabolome of atherosclerotic subjects was significantly different from that of non-atherosclerotic subjects. Atherosclerosis-induced metabolic perturbations of fatty acids, such as palmitate, stearate, and 1-monolinoleoylglycerol, were confirmed, consistent with a previous publication showing that palmitate significantly contributes to atherosclerosis development by targeting apoptosis and inflammation pathways. Altogether, this study demonstrated that the development of atherosclerosis directly perturbs fatty acid metabolism, especially that of palmitate, which was confirmed as a phenotypic biomarker for the clinical diagnosis of atherosclerosis.

Relevance:

80.00%

Publisher:

Abstract:

Tags, or personal metadata for annotating web resources, have been widely adopted in Web 2.0 sites. However, as tags are freely chosen by users, the vocabularies are diverse, ambiguous and sometimes only meaningful to individuals. Tag recommenders may assist users during the tagging process; their objective is to suggest relevant tags to use as well as to help consolidate the vocabulary in the system. In this paper we discuss our approach for providing personalized tag recommendation by making use of an existing domain ontology generated from folksonomy. Specifically, we evaluated the approach in a sparse situation. The evaluation shows that the proposed ontology-based method improves the accuracy of tag recommendation in this situation.
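A small sketch of ontology-assisted tag recommendation under sparsity is given below; the toy ontology, the scoring weights and the data structures are assumptions for illustration only:

```python
# Illustrative ontology-assisted tag recommendation: a user's few known tags
# are expanded with related concepts from a small domain ontology, and
# candidate tags are ranked by popularity plus personal relevance.

from collections import Counter

# Toy domain ontology derived from folksonomy: concept -> related concepts.
ONTOLOGY = {
    "python": {"programming", "scripting"},
    "programming": {"software", "python"},
    "recipe": {"cooking", "food"},
}

def recommend_tags(user_tags, resource_tags, top_k=3):
    """user_tags: tags the user has applied before (possibly very few).
    resource_tags: tags other users gave this resource. Returns ranked tags."""
    expanded = set(user_tags)
    for t in user_tags:
        expanded |= ONTOLOGY.get(t, set())   # ontology smooths tag sparsity
    scores = Counter()
    for tag in resource_tags:
        scores[tag] += 1                     # popularity on the resource
        if tag in expanded:
            scores[tag] += 2                 # boost personally relevant tags
    return [t for t, _ in scores.most_common(top_k)]

print(recommend_tags(user_tags={"python"},
                     resource_tags=["programming", "tutorial", "food"]))
```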