Knowledge extraction from a mixed transfer function artificial neural network


Author(s): Khan, I.; Frayman, Yakov; Nahavandi, Saeid
Contributor(s)

Alo, Richard

Date(s)

01/01/2004

Abstract

One of the main problems with Artificial Neural Networks (ANNs) is that their results are not intuitively interpretable. For example, traditional neurons with a sigmoid activation function can approximate any function, including linear ones, but the coefficients (weights) of such an approximation are rather meaningless. To address this problem, this paper presents a novel kind of ANN in which different transfer functions are mixed together. The aims of such a network are (i) to obtain better generalization than current networks, (ii) to extract knowledge from the network without a sophisticated knowledge extraction algorithm, and (iii) to increase the understanding and acceptance of ANNs. A Transfer Complexity Ratio is defined to make sense of the weights associated with the network. The paper begins with a review of knowledge extraction from ANNs and then presents the Mixed Transfer Function Artificial Neural Network (MTFANN). An MTFANN contains a mix of different transfer functions rather than a single transfer function throughout. This mixing has helped to obtain high-level knowledge and generalization comparable to mono-transfer-function networks in a global optimization context.
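To illustrate the idea of mixing transfer functions within one network, the following is a minimal sketch in Python/NumPy of a hidden layer whose units may each use a different transfer function. The particular functions chosen (linear, sigmoid, sine), the class name `MixedTransferLayer`, and the layer layout are illustrative assumptions only, not taken from the paper; the paper's Transfer Complexity Ratio is not reproduced here.

```python
import numpy as np

# Illustrative sketch only: the choice of transfer functions and the layer
# structure are assumptions, not the architecture defined in the paper.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

TRANSFER_FUNCTIONS = {
    "linear": lambda x: x,   # weights of linear units read as ordinary coefficients
    "sigmoid": sigmoid,      # captures smooth nonlinear behaviour
    "sine": np.sin,          # captures periodic behaviour
}

class MixedTransferLayer:
    """A hidden layer whose units may each use a different transfer function."""

    def __init__(self, n_inputs, unit_transfers, rng=None):
        rng = rng or np.random.default_rng(0)
        self.transfers = unit_transfers                      # e.g. ["linear", "sigmoid", "sine"]
        self.weights = rng.normal(size=(len(unit_transfers), n_inputs))
        self.biases = np.zeros(len(unit_transfers))

    def forward(self, x):
        # Pre-activations are computed as usual; each unit then applies its own transfer function.
        pre = self.weights @ x + self.biases
        return np.array([TRANSFER_FUNCTIONS[t](p)
                         for t, p in zip(self.transfers, pre)])

# Usage: a three-unit layer mixing linear, sigmoid and sine units.
layer = MixedTransferLayer(n_inputs=2, unit_transfers=["linear", "sigmoid", "sine"])
print(layer.forward(np.array([0.5, -1.0])))
```

The intuition behind such a mix is that units with a linear transfer function expose their weights directly as interpretable coefficients, while the nonlinear units account for the remaining behaviour of the approximated function.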

Identifier

http://hdl.handle.net/10536/DRO/DU:30005552

Language(s)

eng

Publisher

University of Houston-Downtown

Relation

http://dro.deakin.edu.au/eserv/DU:30005552/frayman-knowledgeextraction-2004.pdf

http://www.intech.scitech.au.edu/register/Intech09/main.asp

Type

Conference Paper