980 results for kernel method
Abstract:
Recent advances in machine learning increasingly enable the automatic construction of computer-assisted tools that have been difficult or laborious for human experts to program. Such tools are needed in many areas, especially in bioinformatics and natural language processing. Machine learning methods may not perform satisfactorily unless they are appropriately tailored to the task in question, and their learning performance can often be improved by exploiting deeper insight into the application domain or the learning problem at hand. This thesis develops kernel-based learning algorithms that incorporate this kind of prior knowledge of the task in an advantageous way, together with computationally efficient algorithms for training the learning machines for specific tasks. In kernel-based learning, prior knowledge is often incorporated by designing appropriate kernel functions; another well-known way is to develop cost functions suited to the task under consideration. For disambiguation tasks in natural language, we develop kernel functions that take into account positional information and the mutual similarities of words, and show that using this information significantly improves the disambiguation performance of the learning machine. Further, we design a new cost function that is better suited to information retrieval, and to ranking problems more generally, than the cost functions designed for regression and classification. We also consider other applications of kernel-based learning algorithms, such as text categorization and pattern recognition in differential display. We develop computationally efficient algorithms for training the considered learning machines with the proposed kernel functions, design a fast cross-validation algorithm for regularized least-squares type learning algorithms, and propose an efficient version of the regularized least-squares algorithm that can be used together with the new cost function for preference learning and ranking tasks. In summary, we demonstrate that incorporating prior knowledge is both possible and beneficial, and that novel kernels and cost functions can be used efficiently in these algorithms.
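The thesis' fast cross-validation algorithm is not reproduced in the abstract; as a minimal sketch of the kind of shortcut such algorithms exploit, the well-known closed-form leave-one-out identity for kernel regularized least-squares is shown below. The RBF kernel, the toy data, and the regularization grid are illustrative assumptions, not details from the thesis.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def rls_loo_mse(K, y, lam):
    # Kernel RLS trains dual coefficients a = (K + lam*I)^{-1} y, so the
    # fitted values are yhat = H y with H = K (K + lam*I)^{-1}.  The
    # leave-one-out residual of sample i is then (y_i - yhat_i)/(1 - H_ii):
    # full cross-validation at the cost of a single matrix inversion.
    n = K.shape[0]
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    loo = (y - H @ y) / (1.0 - np.diag(H))
    return np.mean(loo ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
K = rbf_kernel(X, X)
for lam in (0.01, 0.1, 1.0):  # pick the regularization with lowest LOO error
    print(lam, rls_loo_mse(K, y, lam))
```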
Abstract:
UNLABELLED: In vivo transcriptional analyses of microbial pathogens are often hampered by low proportions of pathogen biomass in host organs, hindering coverage of the full pathogen transcriptome. We aimed to address the transcriptome profiles of Candida albicans, the most prevalent fungal pathogen in systemically infected immunocompromised patients, during systemic infection in different hosts. We developed a strategy for high-resolution quantitative analysis of the C. albicans transcriptome directly from early and late stages of systemic infection in two different host models, the mouse and the insect Galleria mellonella. Our results show that transcriptome sequencing (RNA-seq) libraries were enriched for fungal transcripts up to 1,600-fold using biotinylated bait probes to capture C. albicans sequences. This enrichment biased the read counts of only ~3% of the genes, which can be identified and removed based on a priori criteria. This allowed an unprecedented resolution of the C. albicans transcriptome in vivo, with detection of over 86% of its genes. The transcriptional response of the fungus was surprisingly similar during infection of the two hosts and at the two time points, although some host- and time point-specific genes could be identified. Genes that were highly induced during infection were involved, for instance, in stress response, adhesion, iron acquisition, and biofilm formation. Of the in vivo-regulated genes, 10% are still of unknown function, and their future study will be of great interest. The fungal RNA enrichment procedure used here will enable better characterization of the C. albicans response in infected hosts and may be applied to other microbial pathogens. IMPORTANCE: Understanding the mechanisms utilized by pathogens to infect and cause disease in their hosts is crucial for rational drug development. Transcriptomic studies may help investigations of these mechanisms by determining which genes are expressed specifically during infection. This task has been difficult so far, since the proportion of microbial biomass in infected tissues is often extremely low, thus limiting the depth of sequencing and comprehensive transcriptome analysis. Here, we adapted a technology to capture and enrich C. albicans RNA, which was then used for deep RNA sequencing directly from infected tissues from two different host organisms. The high-resolution transcriptome revealed a large number of genes that were previously unknown to participate in infection, which will likely constitute a focus of study in the future. More importantly, this method may be adapted to perform transcript profiling of any other microbes during host infection or colonization.
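As an illustration of how enrichment-biased genes might be flagged by an a priori criterion, here is a small sketch on invented read counts; the study's actual criterion, data, and thresholds are not given in the abstract.

```python
import numpy as np
import pandas as pd

# Hypothetical read-count table: rows are C. albicans genes, columns are a
# non-enriched and a probe-enriched library (names and values invented).
counts = pd.DataFrame(
    {"unenriched": [120, 35, 800, 4, 60],
     "enriched": [95000, 30000, 710000, 90, 48000]},
    index=[f"gene_{i}" for i in range(5)],
)

# Convert to relative abundance so the global ~1,600-fold enrichment cancels
# out; only genes whose *proportion* shifts are capture-biased.
rel = counts / counts.sum(axis=0)
log2_ratio = np.log2((rel["enriched"] + 1e-9) / (rel["unenriched"] + 1e-9))

# Flag genes whose proportion changes more than 2-fold (an illustrative
# a priori criterion, not the one used in the study).
biased = log2_ratio.abs() > 1.0
print(log2_ratio.round(2))
print("flagged as enrichment-biased:", list(counts.index[biased]))
```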
Abstract:
Building on the instrumental model of group conflict (IMGC), the present experiment investigates support for discriminatory and meritocratic selection methods at university in a sample of local and immigrant students. Results showed that local students were more supportive of a selection method that favours them over immigrants than of a method that selects the best applicants regardless of origin. Supporting the assumptions of the IMGC, this effect was stronger for locals who perceived immigrants as competing for resources. Immigrant students supported the meritocratic selection method more strongly than the one that discriminated against them. However, contrasting with the assumptions of the IMGC, this effect was present only in students who perceived immigrants as competing weakly for locals' resources. The results demonstrate that selection methods used at university can be perceived differently depending on students' origin, and suggest that the mechanisms underlying the perception of discriminatory and meritocratic selection methods differ between local and immigrant students. The present experiment thus makes a theoretical contribution to the IMGC by delimiting its assumptions to an ingroup facing a competitive situation with a relevant outgroup. Practical implications for university recruitment policies are discussed.
Abstract:
This paper reports the development of a method for the simultaneous determination of methylmercury (MeHg+) and inorganic mercury (iHg) species in seafood samples. The study focused on the extraction and quantification of MeHg+ (the most toxic species) by liquid chromatography coupled to on-line UV irradiation and cold vapour atomic fluorescence spectroscopy (LC-UV-CV-AFS), using 4 mol/L HCl as the extractant. The accuracy of the method was verified by analysing three certified reference materials (CRMs) and different spiked samples. The values found for total Hg and MeHg+ in the CRMs did not differ significantly from the certified values at a 95% confidence level, and spike recoveries between 85% and 97% were achieved for MeHg+. The detection limits (LODs) obtained were 0.001 mg Hg/kg for total mercury, 0.0003 mg Hg/kg for MeHg+ and 0.0004 mg Hg/kg for iHg. The quantification limits (LOQs) established were 0.003 mg Hg/kg for total mercury, 0.0010 mg Hg/kg for MeHg+ and 0.0012 mg Hg/kg for iHg. The precision for each mercury species, expressed as RSD, was 12% in all cases. Finally, the developed method was applied to 24 seafood samples of different origins and total mercury contents; the concentrations of total Hg, MeHg+ and iHg ranged from 0.07 to 2.33, 0.003 to 2.23 and 0.006 to 0.085 mg Hg/kg, respectively. The established analytical method yields mercury speciation results in less than one hour, including both the sample pretreatment and the measurement step.
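The abstract does not state how the LODs and LOQs were derived; a common calibration-based convention (LOD = 3.3·σ/S and LOQ = 10·σ/S, as in ICH guidelines) is sketched below on invented calibration data, and is consistent with the roughly 3× ratio between the reported LOQs and LODs.

```python
import numpy as np

# Hypothetical calibration data: concentration (mg Hg/kg) vs. AFS signal.
conc = np.array([0.0, 0.001, 0.002, 0.005, 0.010])
signal = np.array([0.8, 4.1, 7.2, 17.0, 33.9])

# Least-squares calibration line.
slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s_res = residuals.std(ddof=2)  # residual standard deviation (n - 2 dof)

# ICH-style estimates: LOD = 3.3*s/slope, LOQ = 10*s/slope.
lod = 3.3 * s_res / slope
loq = 10.0 * s_res / slope
print(f"slope={slope:.1f}, LOD={lod:.5f} mg Hg/kg, LOQ={loq:.5f} mg Hg/kg")
```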
Abstract:
This work studies the multi-label classification of turns in Simple English Wikipedia talk pages into dialogue acts. The dataset was created and multi-labelled by Ferschke et al. (2012). The first part analyses the dependencies between labels, in order to examine the coherence of the annotation and to determine a classification method. A multi-label classification is then computed, after transforming the problem into binary relevance. Regarding features, whereas Ferschke et al. (2012) used features such as uni-, bi- and trigrams, the time distance between turns, or the indentation level of the turn, other features are considered here: lemmas, part-of-speech tags and the meaning of verbs (according to WordNet). The dataset authors applied approaches such as Naive Bayes and Support Vector Machines. The present paper proposes, as an alternative, to use and extend linear discriminant analysis with Schoenberg transformations which, following the example of kernel methods, transform the original Euclidean distances into other Euclidean distances, in a space of high dimensionality.
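A minimal sketch of the two ingredients named here, binary relevance and a Schoenberg transformation of distances, is given below. The power transform and the k-nearest-neighbour stand-in classifier are illustrative assumptions of this sketch, not the discriminant-analysis extension the paper develops.

```python
import numpy as np
from sklearn.metrics import pairwise_distances
from sklearn.neighbors import KNeighborsClassifier

def schoenberg_power(D, a=0.5):
    # A simple member of the Schoenberg family: d -> d**a (0 < a <= 1) maps
    # squared Euclidean distances to squared Euclidean distances in some
    # high-dimensional embedding space.
    return D ** a

rng = np.random.default_rng(1)
X_train, X_test = rng.normal(size=(40, 6)), rng.normal(size=(10, 6))
Y_train = rng.integers(0, 2, size=(40, 3))  # toy multi-label targets

D_train = schoenberg_power(pairwise_distances(X_train) ** 2)
D_test = schoenberg_power(pairwise_distances(X_test, X_train) ** 2)

# Binary relevance: one independent binary classifier per label, here a
# distance-based k-NN working directly on the transformed distances.
predictions = []
for j in range(Y_train.shape[1]):
    clf = KNeighborsClassifier(n_neighbors=3, metric="precomputed")
    clf.fit(D_train, Y_train[:, j])
    predictions.append(clf.predict(D_test))
Y_pred = np.column_stack(predictions)
print(Y_pred)
```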
Abstract:
Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or the same source, and thus to support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by comparison of the profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that helps target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents, for instance). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
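As a rough illustration of the profile-and-compare idea, the sketch below extracts a hue histogram from a scanned image and compares two such profiles with the Canberra distance. The file names and bin count are assumptions of this sketch, and the Edge filter used by the prototype is omitted.

```python
import numpy as np
from PIL import Image
from scipy.spatial.distance import canberra

def hue_profile(path, bins=64):
    # Extract a normalized hue histogram ("profile") from a scanned region
    # of interest; file path and bin count are illustrative.
    hsv = np.asarray(Image.open(path).convert("HSV"))
    hue = hsv[..., 0].ravel()
    hist, _ = np.histogram(hue, bins=bins, range=(0, 255))
    return hist / hist.sum()

# Compare two document scans: a smaller Canberra distance means more similar
# profiles, suggesting a possible common source or modus operandi.
p1 = hue_profile("doc_scan_A.png")  # hypothetical file names
p2 = hue_profile("doc_scan_B.png")
print("Canberra distance:", canberra(p1, p2))
```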
Abstract:
This study shows the possibilities offered by modern ultra-high-performance supercritical fluid chromatography combined with tandem mass spectrometry in doping control analysis. A high-throughput screening method was developed for 100 substances belonging to the challenging classes of anabolic agents, hormones and metabolic modulators, synthetic cannabinoids and glucocorticoids, which must be detected at low concentrations in urine. To selectively extract these doping agents from urine, a supported liquid extraction procedure was implemented in a 48-well plate format. At the tested concentration levels, ranging from 0.5 to 5 ng/mL, the recoveries were better than 70% for 48-68% of the compounds and higher than 50% for 83-87% of the tested substances. Due to the numerous interferences related to isomers of steroids and to ions produced by the loss of water in the electrospray source, the choice of SFC separation conditions was very challenging; after careful optimization, a Diol stationary phase was employed. The total analysis time for the screening assay was only 8 min, and interferences as well as susceptibility to matrix effects (ME) were minimized. With the developed method, about 70% of the compounds had a relative ME within ±20% at concentrations of 1 and 5 ng/mL. Finally, the limits of detection achieved with the above-described strategy, including a 5-fold preconcentration, were below 0.1 ng/mL for the majority of the tested compounds. The LODs were therefore systematically better than the minimum required performance levels established by the World Anti-Doping Agency, except for a very few metabolites.
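Matrix effect and recovery figures such as those quoted are commonly derived from the standard pre-/post-extraction spiking scheme; a small sketch on invented peak areas is given below. The abstract does not give the paper's actual calculation details.

```python
import numpy as np

# Hypothetical peak areas for one compound at 1 ng/mL (triplicates):
neat = np.array([1050, 990, 1020])       # standard in neat solvent
post_spike = np.array([880, 910, 870])   # spiked into blank urine extract
pre_spike = np.array([640, 600, 655])    # spiked before extraction

# Matrix effect relative to the neat standard (0% = no effect):
me_pct = (post_spike.mean() / neat.mean() - 1) * 100
# Extraction recovery of the supported liquid extraction step:
recovery_pct = pre_spike.mean() / post_spike.mean() * 100
print(f"matrix effect = {me_pct:+.1f}%, recovery = {recovery_pct:.1f}%")
```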
Abstract:
Objective: To evaluate the utility of a new multimodal image-guided intervention technique for detecting epileptogenic areas with a gamma probe, as compared with intraoperative electrocorticography. Materials and Methods: Two symptomatic patients with refractory epilepsy underwent magnetic resonance imaging, video-electroencephalography, brain SPECT scanning and neuropsychological evaluation, and were submitted to gamma probe-assisted surgery. Results: In patient 1, the maximum radioactive count was initially observed on the temporal gyrus, about 3.5 cm posterior to the tip of the left temporal lobe. After corticotomy, the gamma probe indicated the maximum count at the head of the hippocampus, in agreement with the findings of intraoperative electrocorticography. In patient 2, the maximum count was observed in the occipital region at the transition between the temporal and parietal lobes (right hemisphere). During surgery, the area of epileptogenic activity mapped by electrocorticography was also delimited, demarcated, and compared with the gamma probe findings. After lesionectomy, new radioactive counts were performed both in the patients and on the surgical specimens (ex vivo). Conclusion: The comparison between intraoperative electrocorticography and gamma probe-assisted surgery showed that the two methods give similar results. The advantages of the gamma probe include noninvasiveness, low cost, and the capacity to demonstrate the decrease in radioactive activity at the excision site after lesionectomy.
Abstract:
Irritant dermatitis is described as a reversible, non-immunological reaction characterized by lesions of highly variable appearance, ranging from simple redness to blistering or even necrosis, accompanied by pruritus or a burning sensation following the application of a chemical substance. Since the 1940s, the traditional predictive test for skin irritation has been the Draize test, which consists of applying a chemical substance to the shaved skin of a rabbit for 4 h and checking at 24 h for clinical signs of irritation. Although questionable on both ethical and qualitative grounds, it currently remains the most widely used test. Since the early 2000s, new in vitro methods have been developed, such as the reconstructed human epidermis (RHE) model, a multilayer of well-differentiated keratinocytes obtained from a culture derived from oocyte donation. However, besides being very costly, this method achieves at best only 76% agreement with human in vivo tests. There is therefore a need to develop a new in vitro method that better reproduces the anatomical and physiological reality found in vivo. Our objective was to develop such a method. To this end, we worked with human skin taken directly after abdominoplasty. After preparation with a dermatome, a knife with an adjustable blade that cuts skin to the desired thickness, the skin is mounted in a diffusion cell system. The stratum corneum is then optimally exposed to 1 ml of the test chemical for 4 h. The skin sample is then fixed in formaldehyde for the preparation of standard haematoxylin and eosin slides. Irritation is assessed according to the histopathological criteria of spongiosis, necrosis and cellular vacuolization. The results of this first battery of tests are highly promising: compared with in vivo results, we obtained 100% concordance for the same 4 substances tested, irritant or non-irritant, which is superior to the reconstructed human epidermis model (76%). Moreover, the coefficient of variation between the 3 series was below 0.1, indicating good reproducibility within a single laboratory. In the future, this method will have to be tested with a larger number of chemical substances and its reproducibility evaluated in different laboratories, but this very encouraging first evaluation opens valuable avenues for the future of irritation testing.
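The two quantitative claims here, between-series variability and concordance with the in vivo reference, reduce to simple statistics; a small sketch on invented scores follows (scale, values and labels are assumptions of this sketch).

```python
import numpy as np

# Hypothetical irritation scores for one test substance across the three
# replicate series (scale and values invented for illustration).
series_scores = np.array([0.82, 0.78, 0.85])

# Coefficient of variation across series: std / mean.
cv = series_scores.std(ddof=1) / series_scores.mean()
print(f"between-series CV = {cv:.3f}")  # reproducible if < 0.1

# Concordance with the in vivo reference over the four substances tested:
in_vitro = ["irritant", "irritant", "non-irritant", "non-irritant"]
in_vivo = ["irritant", "irritant", "non-irritant", "non-irritant"]
concordance = np.mean([a == b for a, b in zip(in_vitro, in_vivo)]) * 100
print(f"concordance = {concordance:.0f}%")
```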
Abstract:
Body percussion using the BAPNE method is a means of cognitive stimulation with multiple applications. The aim of this research is to assess its full potential as a source of therapy. The methodology used is theoretical in nature and draws on a wide bibliography to find evidence of its therapeutic effect. In essence, body percussion can be seen to lead to improvements in three areas: the physical, as it stimulates awareness of the body, control of movement, muscular strength, coordination and balance; the mental, as it improves concentration, memory and perception; and the socio-affective, as it helps to build egalitarian relationships and leads to a decrease in anxiety in social interactions. This means of therapy has several different uses and is targeted at different groups. In the present investigation we categorise them into five main groups: individuals with neurodegenerative diseases such as Alzheimer's or Parkinson's disease; individuals with learning disorders such as dyslexia or ADHD; patients affected by diseases of the spinal cord, cranial neuropathies and trauma (neurorehabilitation); the treatment of addictive behaviour (addiction); and depressive or anxiety disorders. After thorough analysis, we have found scientific evidence that therapeutic body percussion using the BAPNE method improves the quality of life of patients and is an important factor in stabilizing the course of different diseases. In addition, evidence involving certain biological indicators (in control and experimental groups, through a pre-test and post-test) shows its effect on levels of stress and anxiety (reduction of cortisol), as well as improvement of social relations as a result of working as a group (increased levels of oxytocin), and improvements in self-esteem and in a variety of personal aspects measured with the Aspects of Identity questionnaire.
Abstract:
This study aimed to compare the efficiency of various sampling materials for the collection and subsequent analysis of organic gunshot residues (OGSR). To the best of our knowledge, this is the first time that sampling devices have been investigated in detail for the subsequent quantitation of OGSR by LC-MS. Seven sampling materials, namely two "swab"-type and five "stub"-type collection materials, were tested. The investigation started with the development of a simple and robust LC-MS method able to separate and quantify molecules typically found in gunpowders, such as diphenylamine or ethylcentralite. The evaluation of the sampling materials was then carried out systematically, by first analysing blank extracts of the materials to check for potential interferences and determine matrix effects. Based on these results, the best four materials, namely cotton buds, polyester swabs, a tape from 3M and PTFE, were compared in terms of collection efficiency in shooting experiments using a set of 9 mm Luger ammunition. It was found that the tape recovered the highest amounts of OGSR. As tape-lifting is the technique currently used routinely for inorganic GSR (IGSR), OGSR analysis might be implemented without modifying the IGSR sampling and analysis procedure.
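The collection-efficiency comparison amounts to ranking materials by the amount of OGSR recovered across replicate shootings; a minimal sketch on invented data follows (the material names come from the abstract, the amounts and the choice of diphenylamine as the reported analyte are assumptions).

```python
import pandas as pd

# Hypothetical OGSR amounts (ng of diphenylamine recovered per sampling)
# from replicate shootings; values are invented for illustration.
data = pd.DataFrame({
    "material": ["cotton bud", "polyester swab", "3M tape", "PTFE"] * 3,
    "ng_dpa": [4.1, 5.0, 9.8, 3.2, 3.7, 5.6, 11.2, 2.9, 4.5, 4.8, 10.1, 3.5],
})

# Rank materials by mean recovered amount across replicates.
summary = data.groupby("material")["ng_dpa"].agg(["mean", "std"])
print(summary.sort_values("mean", ascending=False))
```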
Abstract:
A BASIC computer program (REMOVAL) was developed to perform, in a VAX/VMS environment, all the calculations of the removal method for population size estimation (a catch-effort method for closed populations with constant sampling effort). The program follows the maximum likelihood methodology, checks the failure conditions, applies the appropriate formula, and displays the estimates of population size and catchability, with their standard deviations and coefficients of variation, together with two goodness-of-fit statistics and their significance levels. Data from removal experiments on the cyprinodontid fish Aphanius iberus in the Alt Empordà wetlands are used to exemplify the use of the program.
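The program itself is not listed here; as a minimal sketch of the underlying estimator, the well-known two-pass closed form (a special case of the maximum-likelihood removal method, with its failure condition) is shown below in Python rather than the original BASIC. The catch figures are invented.

```python
def removal_estimate(c1, c2):
    # Two-pass removal (Zippin/Seber) closed form with constant effort:
    #   N_hat = c1**2 / (c1 - c2),  p_hat = (c1 - c2) / c1
    # The failure condition is c2 >= c1: the catch must decline between
    # passes for the population size to be estimable.
    if c2 >= c1:
        raise ValueError("estimator fails: second catch not smaller")
    n_hat = c1 ** 2 / (c1 - c2)
    p_hat = (c1 - c2) / c1
    return n_hat, p_hat

# Illustrative catches from two successive passes with equal effort:
print(removal_estimate(120, 45))  # -> (192.0, 0.625)
```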