108 results for Computational Lexical Semantics


Relevance: 30.00%

Abstract:

We present the results of exploratory experiments using lexical valence extracted from brain activity recorded with electroencephalography (EEG) for sentiment analysis. We selected 78 English words (36 for training and 42 for testing), presented as stimuli to 3 native English speakers. EEG signals were recorded from the subjects while they performed a mental imaging task for each word stimulus. Wavelet decomposition was employed to extract EEG features in the time-frequency domain. The extracted features were used, after univariate ANOVA feature selection, as inputs to a sparse multinomial logistic regression (SMLR) classifier for valence classification. After mapping EEG signals to sentiment valences, we exploited the lexical polarity extracted from brain data to predict the valence of 12 sentences taken from the SemEval-2007 shared task, and compared it against existing lexical resources.
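As a rough illustration of this kind of pipeline, the sketch below extracts wavelet-band energy features from synthetic EEG epochs, applies univariate ANOVA feature selection, and trains an L1-penalised logistic regression as a stand-in for the SMLR classifier. All shapes, channel counts and hyper-parameters are assumptions for illustration, not the authors' settings.

```python
# Minimal sketch (not the authors' code): wavelet features from EEG epochs,
# ANOVA feature selection, and an L1-penalised logistic regression standing
# in for SMLR.  Channel counts, epoch lengths and k are assumptions.
import numpy as np
import pywt
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def wavelet_features(epoch, wavelet="db4", level=4):
    """Per-band energies of a single-channel EEG epoch."""
    coeffs = pywt.wavedec(epoch, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def epochs_to_features(epochs):
    """epochs: (n_trials, n_channels, n_samples) -> (n_trials, n_features)."""
    return np.array([
        np.concatenate([wavelet_features(ch) for ch in trial]) for trial in epochs
    ])

rng = np.random.default_rng(0)
X_train = epochs_to_features(rng.standard_normal((36, 14, 512)))  # 36 training words
X_test = epochs_to_features(rng.standard_normal((42, 14, 512)))   # 42 test words
y_train = rng.integers(0, 2, 36)                                  # 0 = negative, 1 = positive valence

clf = make_pipeline(
    SelectKBest(f_classif, k=20),                                  # univariate ANOVA selection
    LogisticRegression(penalty="l1", solver="saga", max_iter=5000),
)
clf.fit(X_train, y_train)
print(clf.predict(X_test)[:10])
```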

Relevance: 30.00%

Abstract:

Background: Oncology is a field that profits tremendously from the genomic data generated by high-throughput technologies, including next-generation sequencing. However, in order to exploit, integrate, visualize and interpret such high-dimensional data efficiently, non-trivial computational and statistical analysis methods are required, and these need to be developed in a problem-directed manner.

Discussion: For this reason, computational cancer biology aims to fill this gap. Unfortunately, computational cancer biology is not yet fully recognized as a coequal field in oncology, leading to a delay in its maturation and, as an immediate consequence, an under-exploration of high-throughput data for translational research.

Summary: Here we argue that this imbalance, favoring 'wet lab-based activities', will be naturally rectified over time if the next generation of scientists receives an academic education that provides a fair and competent introduction to computational biology and its manifold capabilities. Furthermore, we discuss a number of local educational provisions that can be implemented at university level to help facilitate this process of harmonization.

Relevance: 30.00%

Abstract:

Traditional internal combustion engine vehicles are a major contributor to global greenhouse gas emissions and other air pollutants, such as particulate matter and nitrogen oxides. If tailpipe point emissions could be managed centrally without reducing commercial and personal user functionality, then one of the most attractive solutions for achieving a significant reduction of emissions in the transport sector would be the mass deployment of electric vehicles. Although electric vehicle sales are still hindered by battery performance, cost and a few other technological bottlenecks, focused commercialisation and supportive government policies are encouraging large-scale electric vehicle adoption. The mass proliferation of plug-in electric vehicles is likely to bring a significant additional electric load onto the grid, creating a highly complex operational problem for power system operators. Electric vehicle batteries also have the ability to act as energy storage points on the distribution system. This dual charge-and-storage impact of many uncontrollable, small kW-scale loads (consumers will want maximum flexibility) on a distribution system that was not originally designed for such operation has the potential to be detrimental to grid balancing. Intelligent scheduling methods, if established correctly, could smoothly integrate electric vehicles onto the grid: they can help avoid cycling of large combustion plants and the use of expensive fossil-fuel peaking plant, match renewable generation to electric vehicle charging, and prevent overloading of the distribution system and the resulting reduction in power quality. In this paper, state-of-the-art scheduling methods for integrating plug-in electric vehicles are reviewed, examined and categorised based on their computational techniques. Thus, in addition to existing approaches covering analytical scheduling, conventional optimisation methods (e.g. linear and non-linear mixed integer programming and dynamic programming) and game theory, meta-heuristic algorithms including genetic algorithms and particle swarm optimisation are comprehensively surveyed, offering a systematic reference for grid scheduling that considers intelligent electric vehicle integration.
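As a minimal illustration of the conventional-optimisation family surveyed here, the sketch below formulates a toy electric vehicle charging schedule as a linear program. The prices, power limits and fleet data are invented for illustration and are not taken from any surveyed work.

```python
# Minimal sketch (illustrative only): schedule charging of a few plug-in EVs
# over discrete time slots as a linear program, minimising energy cost subject
# to per-vehicle energy needs, charger power limits and an aggregate feeder
# limit.  All numbers below are made-up assumptions.
import numpy as np
from scipy.optimize import linprog

T, V = 8, 3                                   # time slots, vehicles
price = np.array([0.30, 0.28, 0.15, 0.10, 0.10, 0.12, 0.25, 0.32])  # cost per kWh per slot
energy_need = np.array([10.0, 14.0, 6.0])     # kWh each vehicle must receive
p_max = 7.0                                   # kW per-charger limit
feeder_limit = 12.0                           # kW aggregate limit on the feeder
dt = 1.0                                      # slot length in hours

# Decision variables x[v, t] = charging power (kW), flattened to length V*T.
c = np.tile(price * dt, V)                    # objective: total charging cost

# Equality constraints: each vehicle receives exactly its required energy.
A_eq = np.zeros((V, V * T))
for v in range(V):
    A_eq[v, v * T:(v + 1) * T] = dt
b_eq = energy_need

# Inequality constraints: aggregate power per slot stays within the feeder limit.
A_ub = np.zeros((T, V * T))
for t in range(T):
    A_ub[t, t::T] = 1.0
b_ub = np.full(T, feeder_limit)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=(0.0, p_max), method="highs")
schedule = res.x.reshape(V, T)
print("total cost:", round(res.fun, 2))
print(schedule.round(1))
```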

Relevance: 30.00%

Abstract:

The energetics of the low-temperature adsorption and decomposition of nitrous oxide, N(2)O, on flat and stepped platinum surfaces were calculated using density-functional theory (DFT). The results show that the preferred adsorption site for N(2)O is an atop site, with the molecule bound upright via the terminal nitrogen. The molecule is only weakly chemisorbed to the platinum surface. The decomposition barriers on flat (111) surfaces and stepped (211) surfaces are similar. While the barrier for N(2)O dissociation is relatively small, the surface rapidly becomes poisoned by adsorbed oxygen. These findings are supported by experimental results of pulsed N(2)O decomposition over 5% Pt/SiO(2) and bismuth-modified Pt/C catalysts. At low temperature, decomposition occurs but self-poisoning by O(ads) prevents further decomposition. At higher temperatures some desorption of O(2) is observed, allowing continued catalytic activity. The study with bismuth-modified Pt/C catalysts showed that, although the activation barriers calculated for both terraces and steps were similar, the actual rate was different for the two surfaces. Steps were found experimentally to be more active than terraces, and this is attributed to differences in the pre-exponential term.
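For readers unfamiliar with this type of calculation, the sketch below shows the generic adsorption-energy workflow (slab, adsorbate, relaxation, energy difference) using ASE. The EMT potential is used purely as a cheap stand-in for the DFT calculator employed in the study, and the slab size, site and geometry settings are assumptions, not the paper's setup.

```python
# Minimal sketch of an adsorption-energy workflow (not the paper's settings):
# build a Pt(111) slab, place N2O at an atop site, relax, and take
# E_ads = E(slab+N2O) - E(slab) - E(N2O).  ASE's EMT potential is only a
# cheap stand-in here; the study used DFT.
from ase.build import fcc111, molecule, add_adsorbate
from ase.calculators.emt import EMT
from ase.optimize import BFGS

def relaxed_energy(atoms, fmax=0.05):
    """Relax the structure in place and return its potential energy."""
    atoms.calc = EMT()
    BFGS(atoms, logfile=None).run(fmax=fmax)
    return atoms.get_potential_energy()

slab = fcc111("Pt", size=(3, 3, 3), vacuum=10.0)
e_slab = relaxed_energy(slab)

n2o = molecule("N2O")
e_mol = relaxed_energy(n2o)

system = slab.copy()
# Atop site; the paper reports the molecule binding upright via the terminal N.
add_adsorbate(system, n2o, height=2.0, position="ontop")
e_sys = relaxed_energy(system)

print("E_ads (eV):", e_sys - e_slab - e_mol)
```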

Relevance: 30.00%

Abstract:

In this paper, we present a meeting report for the 2nd Summer School in Computational Biology organized by Queen's University Belfast. We describe the organization of the summer school, its underlying concept, and the student feedback we received after its completion.

Relevance: 30.00%

Abstract:

The piezoresistance effect is defined as the change in electrical resistance due to applied mechanical stress. Silicon exhibits a relatively large piezoresistance effect, which has been known since 1954. A four-point bending setup is proposed and designed to analyze the piezoresistance effect in p-type silicon; it is used to apply uniform, uniaxial stress along the <110> crystal direction. The main aim of this work is to investigate the piezoresistive characteristics of p-type resistors as a function of doping concentration using COMSOL Multiphysics. Simulation results are compared with experimental data.
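For orientation, the sketch below evaluates the standard first-order piezoresistance relation dR/R = pi_l * sigma_l + pi_t * sigma_t for a p-type resistor aligned with <110>, using Smith's (1954) room-temperature coefficients for lightly doped p-type silicon. The applied stress value is an arbitrary assumption, and this back-of-the-envelope estimate is not a substitute for the COMSOL model described above.

```python
# Minimal sketch: first-order piezoresistive response of a p-type silicon
# resistor aligned with <110> under uniaxial stress along <110>.
# Coefficients are Smith's (1954) room-temperature values for lightly doped
# p-Si; the stress level is an arbitrary example, not from the paper.
PI_11 = 6.6e-11    # 1/Pa
PI_12 = -1.1e-11   # 1/Pa
PI_44 = 138.1e-11  # 1/Pa

# Longitudinal and transverse coefficients for the <110> direction.
pi_l = 0.5 * (PI_11 + PI_12 + PI_44)
pi_t = 0.5 * (PI_11 + PI_12 - PI_44)

sigma_l = 100e6  # Pa, uniaxial stress applied along <110> by the bending rig
sigma_t = 0.0    # no transverse stress in this idealisation

dR_over_R = pi_l * sigma_l + pi_t * sigma_t
print(f"pi_l = {pi_l:.3e} /Pa, pi_t = {pi_t:.3e} /Pa")
print(f"dR/R at 100 MPa: {dR_over_R:.4f}")   # roughly a 7% resistance change
```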

Relevance: 30.00%

Abstract:

Answer Set Programming (ASP) is a popular framework for modelling combinatorial problems. However, ASP cannot easily be used for reasoning about uncertain information. Possibilistic ASP (PASP) is an extension of ASP that combines possibilistic logic and ASP. In PASP a weight is associated with each rule, where this weight is interpreted as the certainty with which the conclusion can be established when the body is known to hold. As such, it allows us to model and reason about uncertain information in an intuitive way. In this paper we present new semantics for PASP in which rules are interpreted as constraints on possibility distributions. Special models of these constraints are then identified as possibilistic answer sets. In addition, since ASP is a special case of PASP in which all the rules are entirely certain, we obtain a new characterization of ASP in terms of constraints on possibility distributions. This allows us to uncover a new form of disjunction, called weak disjunction, that has not previously been considered in the literature. In addition to introducing and motivating the semantics of weak disjunction, we also pinpoint its computational complexity. In particular, while the complexity of most reasoning tasks coincides with that of standard disjunctive ASP, we find that brave reasoning for programs with weak disjunctions is easier.
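To make the weight-as-certainty reading concrete, the sketch below computes the least fixpoint of a small possibilistic definite program, where a derived atom's certainty is the maximum over applicable rules of the minimum of the rule weight and its body atoms' certainties. This only illustrates the classical possibilistic reading of weighted rules; the constraint-based semantics and weak disjunction introduced in the paper are not reproduced, and the example program is invented.

```python
# Minimal sketch (invented example, not from the paper): certainty propagation
# in a possibilistic *definite* program.  A rule (head, body, weight) derives
# `head` with certainty min(weight, min certainty of body atoms); an atom's
# certainty is the max over all rules that derive it.
from typing import Dict, List, Tuple

Rule = Tuple[str, List[str], float]  # (head, body atoms, certainty weight)

def possibilistic_fixpoint(rules: List[Rule]) -> Dict[str, float]:
    val: Dict[str, float] = {}
    changed = True
    while changed:
        changed = False
        for head, body, weight in rules:
            body_cert = min((val.get(a, 0.0) for a in body), default=1.0)
            new = min(weight, body_cert)
            if new > val.get(head, 0.0):
                val[head] = new
                changed = True
    return val

program: List[Rule] = [
    ("alarm", [], 0.9),                    # fact: alarm, certainty 0.9
    ("smoke", ["alarm"], 0.6),             # smoke <- alarm, certainty 0.6
    ("evacuate", ["smoke", "alarm"], 0.8), # evacuate <- smoke, alarm, certainty 0.8
]
print(possibilistic_fixpoint(program))
# {'alarm': 0.9, 'smoke': 0.6, 'evacuate': 0.6}
```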

Relevance: 30.00%

Abstract:

Research in emotion analysis of text suggests that emotion lexicon based features are superior to corpus based n-gram features. However, the static nature of general purpose emotion lexicons makes them less suited to social media analysis, where the need to adapt to changes in vocabulary usage and context is crucial. In this paper we propose a set of methods to extract a word-emotion lexicon automatically from an emotion labelled corpus of tweets. Our results confirm that the features derived from these lexicons outperform the standard Bag-of-words features when applied to an emotion classification task. Furthermore, a comparative analysis with both manually crafted lexicons and a state-of-the-art lexicon generated using Point-Wise Mutual Information shows that the lexicons generated by the proposed methods lead to significantly better classification performance.
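As a point of reference for the PMI baseline mentioned above, the sketch below builds a simple word-emotion association lexicon from an emotion-labelled corpus via point-wise mutual information. The tiny corpus and the smoothing choice are illustrative assumptions, not the authors' proposed extraction methods.

```python
# Minimal sketch: build a word-emotion lexicon from an emotion-labelled corpus
# using point-wise mutual information, PMI(w, e) = log p(w, e) / (p(w) p(e)).
# The toy corpus and add-alpha smoothing are assumptions for illustration.
import math
from collections import Counter, defaultdict

corpus = [
    ("so happy about the sunshine today", "joy"),
    ("this traffic makes me furious", "anger"),
    ("terrified of the dark basement", "fear"),
    ("sunshine and friends bring joy", "joy"),
]

word_emotion, word_count, emotion_count, total = Counter(), Counter(), Counter(), 0
for text, emotion in corpus:
    for w in text.lower().split():
        word_emotion[(w, emotion)] += 1
        word_count[w] += 1
        emotion_count[emotion] += 1
        total += 1

def pmi(word: str, emotion: str, alpha: float = 1.0) -> float:
    """Add-alpha smoothed PMI between a word and an emotion label."""
    p_we = (word_emotion[(word, emotion)] + alpha) / (total + alpha)
    p_w = (word_count[word] + alpha) / (total + alpha)
    p_e = (emotion_count[emotion] + alpha) / (total + alpha)
    return math.log(p_we / (p_w * p_e))

lexicon = defaultdict(dict)
for w in word_count:
    for e in emotion_count:
        lexicon[w][e] = pmi(w, e)

print(sorted(lexicon["sunshine"].items(), key=lambda kv: -kv[1]))
```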

Relevance: 30.00%

Abstract:

Discussion forums have evolved into a dependable source of knowledge to solve common problems. However, only a minority of the posts in discussion forums are solution posts. Identifying solution posts from discussion forums, hence, is an important research problem. In this paper, we present a technique for unsupervised solution post identification leveraging a so far unexplored textual feature, that of lexical correlations between problems and solutions. We use translation models and language models to exploit lexical correlations and solution post character, respectively. Our technique is designed to not rely much on structural features such as post metadata, since such features are often not uniformly available across forums. Our clustering-based iterative solution identification approach based on the EM-formulation performs favorably in an empirical evaluation, beating the only unsupervised solution identification technique from the literature by a very large margin. We also show that our unsupervised technique is competitive against methods that require supervision, outperforming one such technique comfortably.
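To illustrate the EM-formulated, language-model part of such an approach, the sketch below fits a two-component mixture of unigram language models to unlabelled posts and reads the component posteriors as soft solution/non-solution assignments. The toy posts, random initialisation and add-one smoothing are assumptions; the translation-model component that exploits problem-solution lexical correlations is not reproduced here.

```python
# Minimal sketch: EM for a two-component mixture of unigram language models
# over forum posts, giving soft "solution" vs "non-solution" assignments.
# Toy data and smoothing are illustrative assumptions; the paper additionally
# uses translation models over problem-solution correlations (omitted here).
import numpy as np

posts = [
    "try reinstalling the driver and reboot the machine",
    "i have the same problem on my laptop",
    "setting the JAVA_HOME variable fixed it for me",
    "does anyone know why this keeps happening",
]
docs = [p.split() for p in posts]
vocab = sorted({w for d in docs for w in d})
V, K, N = len(vocab), 2, len(docs)
counts = np.zeros((N, V))
for i, d in enumerate(docs):
    for w in d:
        counts[i, vocab.index(w)] += 1

rng = np.random.default_rng(0)
resp = rng.dirichlet(np.ones(K), size=N)         # soft component assignments
for _ in range(20):
    # M-step: mixture weights and smoothed unigram LMs from weighted counts.
    pi = resp.sum(axis=0) / N
    word_probs = resp.T @ counts + 1.0            # add-one smoothing
    word_probs /= word_probs.sum(axis=1, keepdims=True)
    # E-step: posterior of each component for each post, in the log domain.
    log_lik = counts @ np.log(word_probs).T + np.log(pi)
    log_lik -= log_lik.max(axis=1, keepdims=True)
    resp = np.exp(log_lik)
    resp /= resp.sum(axis=1, keepdims=True)

for p, r in zip(posts, resp.round(2)):
    print(r, p)
```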

Relevance: 30.00%

Abstract:

Background: Large-scale biological jobs on high-performance computing systems require manual intervention if one or more of the computing cores on which they execute fail. This places not only a cost on the maintenance of the job, but also a cost on the time taken to reinstate the job, and carries the risk of losing data and execution accomplished by the job before it failed. Approaches which can proactively detect computing core failures and take action to relocate the computing core's job onto reliable cores can make a significant step towards automating fault tolerance. Method: This paper describes an experimental investigation into the use of multi-agent approaches for fault tolerance. Two approaches are studied, the first at the job level and the second at the core level. The approaches are investigated for single core failure scenarios that can occur in the execution of parallel reduction algorithms on computer clusters. A third approach is proposed that incorporates multi-agent technology at both the job and core level. Experiments are pursued in the context of genome searching, a popular computational biology application. Result: The key conclusion is that the approaches proposed are feasible for automating fault tolerance in high-performance computing systems with minimal human intervention. In a typical experiment in which fault tolerance is studied, centralised and decentralised checkpointing approaches on average add 90% to the actual time for executing the job. In the same experiment, the multi-agent approaches add only 10% to the overall execution time.
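Purely as an illustration of the core-level idea, and not the paper's experimental setup, the toy simulation below lets agents migrate a core's partial reduction result to a reliable core when its health score drops below a threshold, so the parallel reduction completes without restarting. All numbers and the health model are invented.

```python
# Toy simulation (invented numbers, not the paper's experiment): core-level
# agents monitor a health score and proactively migrate their partial
# reduction result to a reliable core before a predicted failure.
import random

random.seed(1)
N_CORES, FAIL_THRESHOLD = 8, 0.2

# Each core holds a data chunk, a partial sum and a health score in (0, 1].
data_chunks = [list(range(i * 100, (i + 1) * 100)) for i in range(N_CORES)]
health = [random.uniform(0.05, 1.0) for _ in range(N_CORES)]
partials = {i: sum(chunk) for i, chunk in enumerate(data_chunks)}

# If a core looks likely to fail, hand its partial result to the healthiest
# core instead of waiting for the failure and restarting the job.
reliable = max(range(N_CORES), key=lambda i: health[i])
for core in range(N_CORES):
    if health[core] < FAIL_THRESHOLD and core != reliable:
        partials[reliable] += partials.pop(core)
        print(f"agent migrated partial result from core {core} to core {reliable}")

print("reduction result:", sum(partials.values()))
assert sum(partials.values()) == sum(range(N_CORES * 100))
```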

Relevance: 30.00%

Abstract:

Symposium of papers on Computational Thinking