Modelling word meaning using efficient tensor representations


Author(s): Symonds, Michael; Bruza, Peter D.; Sitbon, Laurianne; Turner, Ian
Date(s)

12/10/2011

Abstract

Models of word meaning, built from a corpus of text, have demonstrated success in emulating human performance on a number of cognitive tasks. Many of these models use geometric representations of words to store semantic associations between words, but often do not capture word order information. This lack of structural information has been raised as a weakness when such models are applied to cognitive tasks. This paper presents an efficient tensor-based approach to modelling word meaning that builds on recent attempts to encode word order information, while providing flexible methods for extracting task-specific semantic information.
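
The abstract above refers to tensor representations that retain word order information. As a generic illustration only, and not the authors' model described in the paper, the following Python sketch shows one simple way a second-order tensor (a matrix per word) built from outer products of random index vectors can preserve the order of neighbouring words. The dimensionality, helper names, and toy corpus are assumptions introduced here for illustration.

import numpy as np

DIM = 64                       # dimensionality of the random index vectors (assumption)
rng = np.random.default_rng(0)
index_vectors = {}             # word -> fixed random index vector

def index_vector(word):
    # Each word gets a fixed random index vector, generated on first use.
    if word not in index_vectors:
        index_vectors[word] = rng.standard_normal(DIM) / np.sqrt(DIM)
    return index_vectors[word]

def build_memory(corpus_sentences):
    # For each target word, accumulate the outer product of its left and
    # right neighbours' index vectors; swapping the neighbours changes the
    # resulting matrix, so word order is preserved in the representation.
    memory = {}
    for sentence in corpus_sentences:
        for i, target in enumerate(sentence):
            left = index_vector(sentence[i - 1]) if i > 0 else np.zeros(DIM)
            right = index_vector(sentence[i + 1]) if i + 1 < len(sentence) else np.zeros(DIM)
            memory.setdefault(target, np.zeros((DIM, DIM)))
            memory[target] += np.outer(left, right)
    return memory

def similarity(memory, w1, w2):
    # Cosine similarity between two words' flattened tensor memories.
    a, b = memory[w1].ravel(), memory[w2].ravel()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

corpus = [["the", "dog", "chased", "the", "cat"],
          ["the", "cat", "chased", "the", "mouse"]]
mem = build_memory(corpus)
print(similarity(mem, "dog", "cat"))  # words seen in similar ordered contexts score higher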

Format

application/pdf

Identifier

http://eprints.qut.edu.au/46419/

Relation

http://eprints.qut.edu.au/46419/1/TEModel.final.pdf

http://portal.cohass.ntu.edu.sg/PACLIC25/importantdates.asp

Symonds, Michael, Bruza, Peter D., Sitbon, Laurianne, & Turner, Ian (2011) Modelling word meaning using efficient tensor representations. In Proceedings of the 25th Pacific Asia Conference on Language, Information and Computation, Nanyang Technological University, Singapore.

http://purl.org/au-research/grants/ARC/DP1094974

Rights

Copyright 2011 Mike Symonds, Peter Bruza, Laurianne Sitbon, and Ian Turner

Source

Computer Science; Faculty of Science and Technology; Information Systems; Mathematical Sciences

Keywords

#080200 COMPUTATION THEORY AND MATHEMATICS #170299 Cognitive Science not elsewhere classified #Semantic Space #Tensors #unsupervised learning #linguistics #tensor encoding

Type

Conference Paper