Using Fourier convergence analysis for effective learning in max-min neural networks

Using Fourier convergence analysis for effective learning in max-min neural networks is a technical report written by Kia Fock Loe. It was released in 1996 with a total of 25 pages.
Author : Kia Fock Loe
Publisher :
Total Pages : 25
Release : 1996
ISBN-10 : OCLC:37266573
ISBN-13 :

Book Synopsis: Using Fourier convergence analysis for effective learning in max-min neural networks, by Kia Fock Loe

Book excerpt: Abstract: "Max and min operations have interesting properties that facilitate the exchange of information between the symbolic and real-valued domains. As such, neural networks that employ max-min activation functions have been a subject of interest in recent years. Since max-min functions are not strictly differentiable, many ad hoc learning methods for such max-min neural networks have been proposed in the literature. In this technical report, we propose a mathematically sound learning method based on using Fourier convergence analysis to derive a gradient descent technique for max-min error functions. This method is then applied to two models: a feedforward fuzzy-neural network and a recurrent max-min neural network. We show how a 'typical' fuzzy-neural network model employing max-min activation functions can be trained to perform function approximation; its performance was found to be better than that of a conventional feedforward neural network. We also propose a novel recurrent max-min neural network model which is trained to perform grammatical inference as an application example. Comparisons are made between this model and recurrent neural networks that use conventional sigmoidal activation functions; such recurrent sigmoidal networks are known to be difficult to train and generalize poorly on long strings. The comparisons show that our model not only performs better in terms of learning speed and generalization, its final weight configuration allows a DFA to be extracted in a straightforward manner. However, it has a potential drawback: the minimal network size required for successful convergence grows with increasing language depth and complexity. Nevertheless, we are able to demonstrate that our proposed gradient descent technique does allow max-min neural networks to learn effectively. Our learning method should be extensible to other neural networks that have non-differentiable activation functions."
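To make the core difficulty concrete: the abstract notes that max-min functions are not strictly differentiable, yet gradient descent can still be applied to them. The sketch below (an illustrative assumption, not the report's actual Fourier-based derivation) trains a single max-min neuron y = max_j min(w_j, x_j), the fuzzy composition these networks use. Because max and min are piecewise linear, away from ties dy/dw_j is 1 exactly when w_j is both selected by its min and wins the outer max, and 0 otherwise, which yields a usable subgradient.

```python
def forward(w, x):
    """Max-min neuron: y = max_j min(w_j, x_j)."""
    return max(min(wj, xj) for wj, xj in zip(w, x))

def grad(w, x):
    """Subgradient of y with respect to each w_j (ties broken arbitrarily)."""
    pairs = [min(wj, xj) for wj, xj in zip(w, x)]
    j = pairs.index(max(pairs))        # index selected by the outer max
    g = [0.0] * len(w)
    if w[j] <= x[j]:                   # inner min selected w_j, so dy/dw_j = 1
        g[j] = 1.0
    return g

def train(w, samples, lr=0.1, epochs=200):
    """Minimise the squared error (forward(w, x) - t)^2 by gradient descent."""
    for _ in range(epochs):
        for x, t in samples:
            err = forward(w, x) - t
            g = grad(w, x)
            w = [wj - lr * err * gj for wj, gj in zip(w, g)]
    return w
```

Only the single weight that actually determines the output receives an update on each step, which is why such networks can be trained despite the non-differentiability the abstract mentions; the report's Fourier convergence analysis is what justifies this kind of descent rigorously.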

