On learning context-free and context-sensitive languages


Author(s): Boden, Mikael; Wiles, Janet
Contributor(s)

J. Zurada

Date(s)

01/03/2002

Abstract

The long short-term memory (LSTM) network is not the only neural network that learns a context-sensitive language. Second-order sequential cascaded networks (SCNs) can induce, from a finite fragment of a context-sensitive language, a means of processing strings outside the training set. The dynamical behavior of the SCN is qualitatively distinct from that observed in LSTM networks. Differences in performance and dynamics are discussed.
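The generalization task described in the abstract can be illustrated with a small sketch. The canonical context-sensitive language in this line of work is a^n b^n c^n, which no context-free grammar can generate; the specific choice of language, the training range, and the helper names below are assumptions for illustration, not the authors' exact protocol.

```python
# Sketch (assumed setup): a network sees a finite fragment of the
# context-sensitive language a^n b^n c^n and must process strings
# outside that fragment, i.e. longer strings with the same structure.

def anbncn(n):
    """Return the string a^n b^n c^n."""
    return "a" * n + "b" * n + "c" * n

def in_language(s):
    """Membership test for a^n b^n c^n.

    Matching three counts simultaneously is what places the language
    beyond context-free power.
    """
    if len(s) % 3 != 0:
        return False
    return s == anbncn(len(s) // 3)

# A finite training fragment (here n = 1..10, an assumed range) ...
training_set = [anbncn(n) for n in range(1, 11)]
# ... and generalization strings outside it (n > 10).
test_set = [anbncn(n) for n in range(11, 21)]
```

A network that has merely memorized the fragment fails on the test strings; one that has induced the underlying counting structure succeeds, which is the sense of "processing strings outside the training set" in the abstract.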

Identifier

http://espace.library.uq.edu.au/view/UQ:63176

Language(s)

eng

Publisher

The Institute of Electrical and Electronics Engineers

Keywords #Computer Science, Artificial Intelligence #Computer Science, Hardware & Architecture #Computer Science, Theory & Methods #Engineering, Electrical & Electronic #language #prediction #recurrent neural network (RNN) #Recognizers #C1 #380305 Knowledge Representation and Machine Learning #780108 Behavioural and cognitive sciences
Type

Journal Article