983 results for biomimetic pattern recognition


Relevance: 80.00%

Publisher:

Abstract:

One of the largest resources for biological sequence data is the collection of expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors, which must be taken into account when analyzing EST sequences computationally. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting such regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains the performance in detecting coding sequences.
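The decoding step such an HMM relies on can be sketched with a toy two-state model (coding vs. non-coding). All probabilities below are invented for illustration; the authors' actual model is far richer, with explicit start/stop-site and error-correction states.

```python
import math

# Toy two-state HMM. Observations are "codon classes": F = frequent in
# coding regions, R = rare in coding regions (a crude stand-in for
# codon usage bias).
states = ("coding", "noncoding")
start_p = {"coding": 0.5, "noncoding": 0.5}
trans_p = {
    "coding":    {"coding": 0.8, "noncoding": 0.2},
    "noncoding": {"coding": 0.2, "noncoding": 0.8},
}
emit_p = {
    "coding":    {"F": 0.9, "R": 0.1},
    "noncoding": {"F": 0.3, "R": 0.7},
}

def viterbi(observations):
    """Return the most likely state path (log-space Viterbi)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][observations[0]])
          for s in states}]
    back = []
    for obs in observations[1:]:
        col, ptr = {}, {}
        for s in states:
            prev, score = max(
                ((p, V[-1][p] + math.log(trans_p[p][s])) for p in states),
                key=lambda t: t[1])
            col[s] = score + math.log(emit_p[s][obs])
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))
```

With these numbers, a run of codons frequent in coding regions followed by a run of rare ones is decoded as a coding stretch followed by a non-coding one.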

Relevance: 80.00%

Abstract:

The vertebrate immune system is composed of the innate and the adaptive branches. Innate immune cells represent the first line of defense and detect pathogens through pattern recognition receptors (PRRs), which recognize evolutionarily conserved pathogen- and danger-associated molecular patterns. Engagement of these receptors initiates the inflammatory response, but also instructs antigen-specific adaptive immune cells. NOD-like receptors (NLRs) are an important group of PRRs, leading to the production of inflammatory mediators and favoring antigen presentation to T lymphocytes through the regulation of major histocompatibility complex (MHC) molecules.

In this work we focused our attention on selected NLRs and their role at the interface between innate and adaptive immunity. First, we describe a new regulatory mechanism controlling IL-1 production. Our results indicate that type I interferons (IFNs) block NLRP1 and NLRP3 inflammasome activity and interfere with LPS-driven pro-IL-1α and -β induction. As type I IFNs are produced upon viral infections, these anti-inflammatory effects could be relevant in the context of superinfections, but could also help explain the efficacy of IFN-β in multiple sclerosis treatment.

The second project addresses the role of a novel NLR family member, NLRC5. The function of this NLR is still a matter of debate, as it has been proposed as both an inhibitor and an activator of different inflammatory pathways. We found that the expression of this protein is restricted to immune cells and is positively regulated by IFNs. We generated Nlrc5-deficient mice and found that this NLR plays an essential role in T, NKT and NK lymphocytes, in which it drives the expression of MHC class I molecules. Accordingly, we could show that CD8+ T cell-mediated killing of target lymphocytes lacking NLRC5 is strongly impaired. Moreover, NLRC5 expression was found to be low in many lymphoid-derived tumor cell lines, a mechanism that could be exploited by tumors to escape immunosurveillance.

Finally, we found NLRC5 to be involved in the production of IL-10 by CD4+ T cells, as Nlrc5-deficient T lymphocytes produced less of this cytokine upon TCR triggering. In line with these observations, Nlrc5-deficient CD4+ T cells expanded more than control cells when transferred into lymphopenic hosts and led to a more rapid appearance of colitis symptoms. Therefore, our work provides novel insights into the function of NLRC5 obtained with knockout mice, and strongly supports the idea that NLRs direct not only innate, but also adaptive immune responses.

Relevance: 80.00%

Abstract:

Difficult tracheal intubation assessment is an important research topic in anesthesia, as failed intubations are an important cause of mortality in anesthetic practice. The modified Mallampati score is widely used, alone or in conjunction with other criteria, to predict the difficulty of intubation. This work presents an automatic method to assess the modified Mallampati score from an image of a patient with the mouth wide open. For this purpose we propose an active appearance models (AAM) based method and use linear support vector machines (SVM) to select a subset of relevant features obtained using the AAM. This feature selection step proves to be essential, as it drastically improves the performance of classification, which is obtained using an SVM with RBF kernel and majority voting. We test our method on images of 100 patients undergoing elective surgery, achieve 97.9% accuracy in a leave-one-out cross-validation test, and thereby provide a key element of an automatic difficult intubation assessment system.
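Two machine-learning steps named in the abstract can be sketched in isolation: keeping the AAM features with the largest linear-SVM weight magnitudes, and majority voting over several classifiers' outputs. The weights and votes below are invented; a real system would obtain the weights by fitting a linear SVM to the AAM features.

```python
from collections import Counter

def select_features(weights, k):
    """Indices of the k features with the largest |weight| (linear-SVM style)."""
    ranked = sorted(range(len(weights)), key=lambda i: abs(weights[i]),
                    reverse=True)
    return sorted(ranked[:k])

def majority_vote(predictions):
    """Final class = most common prediction across classifiers."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical linear-SVM weights over 6 AAM features:
w = [0.05, -1.2, 0.3, 0.9, -0.01, 0.4]
print(select_features(w, 3))           # -> [1, 3, 5]
print(majority_vote([2, 3, 3, 2, 3]))  # -> 3
```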

Relevance: 80.00%

Abstract:

Human electrophysiological studies support a model whereby sensitivity to so-called illusory contour stimuli is first seen within the lateral occipital complex. A challenge to this model posits that the lateral occipital complex is a general site for crude region-based segmentation, based on findings of equivalent hemodynamic activations in the lateral occipital complex to illusory contour and so-called salient region stimuli, a stimulus class that lacks the classic bounding contours of illusory contours. Using high-density electrical mapping of visual evoked potentials, we show that early lateral occipital cortex activity is substantially stronger to illusory contour than to salient region stimuli, whereas later lateral occipital complex activity is stronger to salient region than to illusory contour stimuli. Our results suggest that equivalent hemodynamic activity to illusory contour and salient region stimuli probably reflects temporally integrated responses, a result of the poor temporal resolution of hemodynamic imaging. The temporal precision of visual evoked potentials is critical for establishing viable models of completion processes and visual scene analysis. We propose that crude spatial segmentation analyses, which are insensitive to illusory contours, occur first within dorsal visual regions, not the lateral occipital complex, and that initial illusory contour sensitivity is a function of the lateral occipital complex.

Relevance: 80.00%

Abstract:

Chronic inhalation of grain dust is associated with asthma and chronic bronchitis in grain worker populations, and exposure to fungal particles has been postulated to be an important etiologic agent of these pathologies. Fusarium species frequently colonize grain and straw and produce a wide array of mycotoxins that impact human health, necessitating an evaluation of the risk of exposure by inhalation of Fusarium and its consequences on immune responses. Data showed that Fusarium culmorum is a frequent constituent of aerosols sampled during wheat harvesting in the Vaud region of Switzerland. The aim of this study was to examine cytokine/chemokine responses and innate immune sensing of F. culmorum in bone-marrow-derived dendritic cells and macrophages. Overall, dendritic cells and macrophages responded to F. culmorum spores, but not to its secreted components (i.e., mycotoxins), by releasing large amounts of macrophage inflammatory protein (MIP)-1α, MIP-1β, MIP-2, monocyte chemoattractant protein (MCP)-1, RANTES, and interleukin (IL)-12p40; intermediate amounts of tumor necrosis factor (TNF), IL-6, IL-12p70, IL-33, granulocyte colony-stimulating factor (G-CSF), and interferon gamma-induced protein (IP-10); but no detectable amounts of IL-4 and IL-10, a pattern of mediators compatible with the generation of Th1 or Th17 antifungal protective immune responses rather than with Th2-dependent allergic responses. The sensing of F. culmorum spores by dendritic cells required dectin-1, the main pattern recognition receptor involved in β-glucan detection, but likely not MyD88- and TRIF-dependent Toll-like receptors. Taken together, our results indicate that F. culmorum potently stimulates innate immune cells in a dectin-1-dependent manner, suggesting that inhalation of F. culmorum from grain dust may promote immune-related airway diseases in exposed worker populations.

Relevance: 80.00%

Abstract:

The present study examined individual differences in Absorption and fantasy, as well as in Achievement and achievement striving, as possible moderators of the perceptual closure effect found by Snodgrass and Feenan (1990). The study also examined whether different instructions (experiential versus instrumental) interact with the personality variables to moderate the relationship between priming and subsequent performance on a picture completion task. 128 participants completed two sessions, one to fill out the MPQ and NEO personality inventories and the other to complete the experimental task. The experimental task consisted of a priming phase and a test phase, with pictures presented on a computer screen for both phases. Participants were shown 30 pictures in the priming phase, and then shown the 30 primed pictures along with 30 new pictures for the test phase. Participants were randomly assigned to receive one of the two different instruction sets for the task. Two measures of performance were calculated: a most-fragmented measure and a threshold measure. Results of the present study confirm that a five-second exposure time is long enough to produce the perceptual closure effect. The analysis of the two-way interaction effects indicated a significant quadratic interaction of Absorption with priming level on threshold performance. The results were in the opposite direction of predictions. Possible explanations for the Absorption results include a lack of optimal conditions, a lack of intrinsic motivation, and measurement problems. Primary analyses also revealed two significant between-subject effects of fantasy and achievement striving on performance collapsed across priming levels. These results suggest that fantasy has a beneficial effect on performance at test for pictures primed at all levels, whereas achievement striving seems to have an adverse effect on performance at test for pictures primed at all levels.
Results of the secondary analyses with a revised threshold performance measure indicated a significant quadratic interaction of Absorption, condition and priming level. In the experiential condition, test performance, based on Absorption scores for pictures primed at level 4, showed a positive slope and performance for pictures primed at levels 1 and 7 based on Absorption showed a negative slope. The reverse effect was found in the instrumental condition. The results suggest that Absorption, in combination with experiential involvement, may affect implicit memory. A second significant result of the secondary analyses was a linear three-way interaction of Achievement, condition and priming level on performance. Results suggest that as Achievement scores increased, test performance improved for less fragmented primed pictures in the instrumental condition and test performance improved for more highly fragmented primes in the experiential condition. Results from the secondary analyses suggest that the revised threshold measure may be more sensitive to individual differences. Results of the exploratory analyses with Openness to Experience, Conscientiousness and agentic positive emotionality (PEM-A) measures indicated no significant effects of any of these personality variables. Results suggest that facets of the scales may be more useful with regard to perceptual research, and that future research should examine narrowly focused personality traits as opposed to broader constructs.

Relevance: 80.00%

Abstract:

The main focus of this thesis is to evaluate and compare the Hyperball learning algorithm (HBL) to other learning algorithms. In this work HBL is compared to feed-forward artificial neural networks using back-propagation learning, K-nearest neighbour, and ID3 algorithms. In order to evaluate the similarity of these algorithms, we carried out three experiments using nine benchmark data sets from the UCI machine learning repository. The first experiment compares HBL to the other algorithms as the sample size of the dataset changes. The second experiment compares HBL to the other algorithms as the dimensionality of the data changes. The last experiment compares HBL to the other algorithms according to the level of agreement with the data target values. Our observations in general showed that, taking classification accuracy as the measure, HBL performs as well as most ANN variants. Additionally, we also found that HBL's classification accuracy outperforms ID3's and K-nearest neighbour's for the selected data sets.
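As a minimal sketch of one of the baselines being compared, here is a k-nearest-neighbour classifier in pure Python; the two-feature training points and labels below are invented toy data, whereas the thesis of course uses the full UCI benchmark datasets.

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label); returns majority label
    among the k training points nearest to query (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((0.9, 1.0), "B"),
         ((1.0, 0.8), "B"), ((0.2, 0.1), "A")]
print(knn_predict(train, (0.95, 0.9)))  # -> B
```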

Relevance: 80.00%

Abstract:

Remote sensing techniques involving hyperspectral imagery have applications in a number of sciences that study aspects of the surface of the planet. The analysis of hyperspectral images is complex because of the large amount of information involved and the noise within that data. Analyzing images to identify minerals, rocks, vegetation and other materials is an application of hyperspectral remote sensing in the earth sciences. This thesis evaluates the performance of two classification and clustering techniques on hyperspectral images for mineral identification. Support Vector Machines (SVM) and Self-Organizing Maps (SOM) are applied as classification and clustering techniques, respectively. Principal Component Analysis (PCA) is used to prepare the data to be analyzed; its purpose is to reduce the amount of data that needs to be processed by identifying the most important components within the data. A well-studied dataset from Cuprite, Nevada, and a more complex dataset from Baffin Island were used to assess the performance of these techniques. The main goal of this research is to evaluate the advantage of training a classifier on a small amount of data compared to using an unsupervised method. Determining the effect of feature extraction on the accuracy of the clustering and classification methods is another goal. This thesis concludes that using PCA increases the learning accuracy, especially for classification: SVM classifies the Cuprite data with high precision, while the SOM challenges SVM on datasets with a high level of noise (like Baffin Island).
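The PCA step can be illustrated with power iteration, which recovers the leading principal component, i.e. the direction along which the pixel spectra vary most. This is a sketch of the idea, not the thesis' pipeline, and the two-band "spectra" below are invented.

```python
def first_principal_component(data, iters=200):
    """Leading principal component of row-vector data, via power iteration
    on the (implicit) covariance matrix."""
    dims = len(data[0])
    means = [sum(row[i] for row in data) / len(data) for i in range(dims)]
    centered = [[row[i] - means[i] for i in range(dims)] for row in data]
    v = [1.0] * dims
    for _ in range(iters):
        # Multiply v by the covariance matrix, implicitly via the data.
        w = [0.0] * dims
        for row in centered:
            proj = sum(row[i] * v[i] for i in range(dims))
            for i in range(dims):
                w[i] += proj * row[i]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy "spectra" varying mostly along the first band:
pc = first_principal_component([[2.0, 0.1], [4.0, 0.2], [6.0, 0.1], [8.0, 0.3]])
print(pc)  # dominated by the first band, close to [1.0, 0.025]
```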

Relevance: 80.00%

Abstract:

ATLAS-MPX detectors are Medipix2-USB detectors covered with lithium fluoride and polyethylene converters to increase the detection efficiency for slow and fast neutrons, respectively. A network of fifteen ATLAS-MPX detectors has been put into operation in the ATLAS detector at CERN's LHC. Two reference ATLAS-MPX detectors were exposed to 252Cf and 241AmBe fast-neutron sources, as well as to fast neutrons produced by the 7Li(p, xn) reaction, in order to study the detector's response to these neutrons. Fast neutrons are mainly detected through recoil protons from elastic collisions between neutrons and hydrogen in the polyethylene. Nuclear reactions between neutrons and silicon produce α-particles. The track-recognition efficiency for protons and α-particles in the Medipix2-USB detector was studied as a function of incident kinetic energy and angle of incidence. The fast-neutron detection efficiency was evaluated at two energy thresholds (8 keV and 230 keV) in the ATLAS-MPX detectors. The fast-neutron detection efficiency in the polyethylene-covered region of the detector increases with neutron energy: (0.0346 ± 0.0004)%, (0.0862 ± 0.0018)% and (0.1044 ± 0.0026)% for fast neutrons of 2.13 MeV, 4.08 MeV and 27 MeV, respectively. The neutron-energy study therefore makes it possible to estimate the neutron flux when an ATLAS-MPX detector is in an unknown radiation field, as is the case in the ATLAS detector at the LHC.
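A hypothetical back-of-the-envelope use of the measured efficiencies: given a count rate observed in the polyethylene-covered region, estimate the incident fast-neutron rate. Only the efficiency values come from the abstract; the count rate below is invented.

```python
# Measured fast-neutron detection efficiencies (as fractions) from the
# abstract, keyed by neutron energy.
efficiency = {"2.13 MeV": 0.000346, "4.08 MeV": 0.000862, "27 MeV": 0.001044}

def incident_rate(counts_per_s, energy):
    """Incident fast-neutron rate implied by a detected count rate,
    assuming monoenergetic neutrons of the given energy."""
    return counts_per_s / efficiency[energy]

print(round(incident_rate(0.5, "4.08 MeV")))  # -> 580
```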

Relevance: 80.00%

Abstract:

The main objective of this thesis was to quantify and compare the effort required to recognize speech in noise in young adults and older adults with normal hearing and normal visual acuity (with or without corrective lenses). Speech-perception effort refers to the attentional and cognitive resources required to understand speech. The first study (Experiment 1) assessed the effort associated with auditory speech recognition (hearing a talker), while the second study (Experiment 2) assessed the effort associated with audiovisual speech recognition (hearing and seeing the talker's face). Effort was measured in two ways. The first was a behavioral approach using a dual-task experimental paradigm, pairing a word-recognition task with a vibro-tactile pattern-recognition task. In addition, effort was quantified with a questionnaire asking participants to rate the effort associated with the behavioral tasks. Both effort measures were used under two experimental conditions: 1) equivalent level, where the noise level masking the speech was the same for all participants, and 2) equivalent performance, where the noise level was adjusted so that word-recognition performance was identical for the two participant groups. Performance levels on the vibro-tactile task revealed that older adults expend more effort than young adults under both experimental conditions, regardless of the perceptual modality in which the speech stimuli were presented (i.e., auditory-only or audiovisual).

Overall, the 'cost' to vibro-tactile task performance was highest for older adults when speech was presented audiovisually. While visual cues can improve audiovisual speech recognition, our results suggest that they can also place an additional load on the resources used to process information. This additional load impairs performance on the word-recognition and vibro-tactile pattern-recognition tasks when these are carried out under dual-task conditions. Consistent with earlier studies, correlation coefficients computed from the data of Experiments 1 and 2 support the notion that dual-task behavioral measures and questionnaire responses assess different dimensions of speech-recognition effort. Since speech-perception effort depends on both auditory and cognitive factors, a third study was completed to explore whether auditory working memory helps explain the variance in the speech-perception effort data, and to compare the response patterns obtained for these two factors between young adults and older adults. For young adults, the results of a sequential regression analysis showed that a measure of auditory capacity (span size) was related to effort, whereas a measure of auditory processing (alphabetical recall) was related to the accuracy with which words were recognized under dual-task conditions.

However, these relationships were present neither in the data from the older-adult group nor in the data obtained when the speech-recognition tasks were performed audiovisually. Further studies are needed to identify the cognitive factors underlying speech-perception effort, particularly in older adults.

Relevance: 80.00%

Abstract:

The proton-proton collisions produced by the LHC impose a harsh radiation environment on the ATLAS detector. To quantify the effects of this environment on detector performance and personnel safety, several Monte Carlo simulations have been performed. However, direct measurement is indispensable for monitoring radiation levels in ATLAS and for verifying the simulations' predictions. To this end, sixteen ATLAS-MPX detectors were installed at various locations in the ATLAS experimental and technical areas. They consist of a pixelated silicon detector called MPX whose active surface is partially covered with converters for thermal, slow and fast neutrons. The ATLAS-MPX detectors measure radiation fields in real time, recording the tracks of detected particles as matrix images. Analysis of the acquired images identifies the types of the detected particles from the shapes of their tracks. For this purpose, a pattern-recognition software package called MAFalda was developed. Since the tracks of heavily ionizing particles are affected by charge sharing between adjacent pixels, a semi-empirical model describing this effect was developed; with this model, the energy of heavily ionizing particles can be estimated from their track size. The neutron converters covering each ATLAS-MPX detector define six different regions, and the efficiency of each region for detecting thermal, slow and fast neutrons was determined through calibration measurements with known sources. A study of the ATLAS-MPX detectors' response to the radiation produced by head-on proton collisions at 7 TeV centre-of-mass energy showed that the number of recorded tracks is proportional to the LHC luminosity. This result allows the ATLAS-MPX detectors to be used as luminosity monitors.

The method proposed for measuring and calibrating the absolute luminosity with these detectors is the van der Meer method, which is based on the LHC beam parameters. Given the correlation between the ATLAS-MPX detectors' response and the luminosity, the measured radiation levels are expressed in terms of fluences of different particle types per unit of integrated luminosity. A significant discrepancy was found when comparing these fluences with those predicted by GCALOR, one of the Monte Carlo simulations of the ATLAS detector. Furthermore, measurements performed after the proton-proton collisions had stopped showed that the ATLAS-MPX detectors can observe the decay of radioactive isotopes generated during the collisions. The residual activation of ATLAS materials can thus be measured with these detectors using an ambient-dose-equivalent calibration.
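The luminosity-monitor idea reduces to a proportionality: one reference run with known luminosity fixes the constant that converts track counts into a luminosity estimate. All numbers below are invented for illustration; the actual absolute calibration in the thesis uses the van der Meer method.

```python
def calibrate(tracks_ref, lumi_ref):
    """Tracks recorded per unit luminosity, from a reference run."""
    return tracks_ref / lumi_ref

def luminosity(tracks, k):
    """Luminosity estimate, assuming track count is proportional to it."""
    return tracks / k

k = calibrate(12000, 3.0)   # hypothetical reference run
print(luminosity(8000, k))  # -> 2.0
```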

Relevance: 80.00%

Abstract:

During the 1990s the wavelet transform emerged as an important signal processing tool with potential applications in time-frequency analysis and non-stationary signal processing. Wavelets have gained popularity in a broad range of disciplines, such as signal/image compression, medical diagnostics, boundary value problems, geophysical signal processing, statistical signal processing, pattern recognition and underwater acoustics. In 1993, G. Evangelista introduced the Pitch-Synchronous Wavelet Transform, which is particularly suited for pseudo-periodic signal processing. The work presented in this thesis mainly concentrates on two interrelated topics in signal processing: wavelet-transform-based signal compression and the computation of the Discrete Wavelet Transform. A new compression scheme is described in which the Pitch-Synchronous Wavelet Transform technique is combined with the popular Linear Predictive Coding method for pseudo-periodic signal processing. Subsequently, a novel Parallel Multiple Subsequence structure is presented for the efficient computation of the Wavelet Transform. Case studies are also presented to highlight potential applications.
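The building block of any DWT computation scheme is the single-level transform. Here, as a sketch, is one level of the Haar DWT (the simplest wavelet); a parallel structure like the one in the thesis organizes many such filtering steps, and this toy code makes no claim about that structure itself.

```python
def haar_dwt(signal):
    """One level of the orthonormal Haar discrete wavelet transform.
    Returns (approximation, detail) coefficients; len(signal) must be even."""
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

a, d = haar_dwt([4.0, 4.0, 2.0, 0.0])
print(a)  # approximately [5.657, 1.414]
print(d)  # approximately [0.0, 1.414]
```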

Relevance: 80.00%

Abstract:

A new procedure for the classification of lower-case English language characters is presented in this work. The character image is binarised and the binary image is further divided into sixteen smaller areas, called cells. Each cell is assigned a name depending upon the contour present in the cell and the occupancy of the image contour in the cell. A data reduction procedure called filtering is adopted to eliminate undesirable redundant information, reducing complexity during further processing steps. The filtered data is fed into a primitive extractor where extraction of primitives is done. Syntactic methods are employed for the classification of the character. A decision tree is used for the interaction of the various components in the scheme, like the primitive extraction and the character recognition. A character is recognized by the primitive-by-primitive construction of its description. Open-ended inventories are used for including variants of the characters and also for adding new members to the general class. Computer implementation of the proposal is discussed at the end using handwritten character samples. Results are analyzed and suggestions for future studies are made. The advantages of the proposal are discussed in detail.
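The division into cells can be sketched as follows: a square binary character image is split into square cells and each cell is summarized by its occupancy (the fraction of on-pixels). The naming of cells by contour type is omitted, and the tiny image below is invented; an 8x8 image with 2x2 cells would give the sixteen cells described in the abstract.

```python
def cell_occupancy(img, cell=2):
    """Split an n x n binary image into cell x cell blocks (row-major order)
    and return each block's fraction of on-pixels."""
    n = len(img)
    occ = []
    for r in range(0, n, cell):
        for c in range(0, n, cell):
            on = sum(img[r + i][c + j] for i in range(cell) for j in range(cell))
            occ.append(on / (cell * cell))
    return occ

img = [[1, 1, 0, 0],
       [1, 1, 0, 0],
       [0, 0, 1, 0],
       [0, 0, 0, 0]]
print(cell_occupancy(img))  # -> [1.0, 0.0, 0.0, 0.25]
```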

Relevance: 80.00%

Abstract:

Handwriting is an acquired tool used for communicating one's observations or feelings. Factors that influence a person's handwriting depend not only on the individual's bio-mechanical constraints, the handwriting education received, the writing instrument, the type of paper and the background, but also on factors like stress, motivation and the purpose of the handwriting. Despite the high variation in a person's handwriting, recent results from different writer identification studies have shown that it possesses sufficient individual traits to be used as an identification method. Handwriting as a behavioral biometric has interested researchers for a long time, but it has recently been enjoying renewed interest due to an increased need and effort to deal with problems ranging from white-collar crime to terrorist threats. The identification of the writer based on a piece of handwriting is a challenging task for pattern recognition. The main objective of this thesis is to develop a text-independent writer identification system for Malayalam handwriting. The study also extends to developing a framework for online character recognition of Grantha script and Malayalam characters.

Relevance: 80.00%

Abstract:

Speech signals are one of the most important means of communication among human beings. In this paper, a comparative study of two feature extraction techniques is carried out for recognizing speaker-independent spoken isolated words. The first is a hybrid approach combining Linear Predictive Coding (LPC) and Artificial Neural Networks (ANN); the second uses a combination of Wavelet Packet Decomposition (WPD) and Artificial Neural Networks. Voice signals are sampled directly from the microphone and then processed using these two techniques to extract the features. Words from Malayalam, one of the four major Dravidian languages of southern India, are chosen for recognition. Training, testing and pattern recognition are performed using Artificial Neural Networks, trained with the back-propagation method. The proposed method is implemented for 50 speakers uttering 20 isolated words each. Both methods produce good recognition accuracy, but Wavelet Packet Decomposition is found to be more suitable for recognizing speech because of its multi-resolution characteristics and efficient time-frequency localization.
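As a sketch of the signal quantities LPC analysis is built on, here is the autocorrelation of a toy speech frame; a real LPC front end would feed these coefficients into the Levinson-Durbin recursion to obtain the predictor coefficients. The frame values are invented.

```python
def autocorr(frame, max_lag):
    """Autocorrelation r[0..max_lag] of a windowed speech frame,
    the inputs to LPC (Levinson-Durbin) analysis."""
    return [sum(frame[i] * frame[i + lag] for i in range(len(frame) - lag))
            for lag in range(max_lag + 1)]

frame = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]  # toy oscillating frame
print(autocorr(frame, 2))  # -> [4.0, 0.0, -3.0]
```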