Please use this identifier to cite or link to this item: http://hdl.handle.net/2440/120139
Type: Conference paper
Title: Modular expansion of the hidden layer in Single Layer Feedforward Neural Networks
Author: Tissera, M.
McDonnell, M.
Citation: 2016 International Joint Conference on Neural Networks (IJCNN), 2016 / vol.2016-October, pp.2939-2945
Publisher: IEEE
Issue Date: 2016
Series/Report no.: IEEE International Joint Conference on Neural Networks (IJCNN)
ISBN: 9781509006199
ISSN: 2161-4393
2161-4407
Conference Name: International Joint Conference on Neural Networks (IJCNN) (24 Jul 2016 - 29 Jul 2016 : Vancouver, Canada)
Statement of Responsibility: Migel D. Tissera and Mark D. McDonnell
Abstract: We present a neural network architecture and a training algorithm designed to enable very rapid training, and that requires low computational processing power, memory and time. The algorithm is based on a modular architecture, which expands the output weights layer constructively, so that the final network can be visualised as a Single Layer Feedforward Network (SLFN) with a large hidden layer. The method does not use backpropagation, and consequently offers very fast training and very few trainable parameters in each module. It is therefore potentially a useful method for applications which require frequent retraining, or which rely on reduced hardware capability, such as mobile robots or the Internet of Things (IoT). We demonstrate the efficacy of the method on two benchmark image classification datasets, MNIST and CIFAR-10. The network produces very favourable results for a SLFN on these benchmarks, with an average of 99.07% correct classification rate on MNIST and nearly 82% on CIFAR-10 when applied to convolutional features. Code for the method has been made available online.
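The abstract describes a constructive, backpropagation-free expansion of an SLFN's hidden layer. The sketch below illustrates the general idea only, under assumptions not confirmed by the record: each hypothetical module uses fixed random hidden weights, its output weights are solved in closed form by least squares, and successive modules are fitted to the residual of the ensemble so far. The function names and the toy dataset are invented for illustration; this is not the authors' released code.

```python
import numpy as np

# Hedged sketch (not the authors' implementation): constructively growing the
# hidden layer of a Single Layer Feedforward Network (SLFN), module by module,
# with no backpropagation. Module structure and residual fitting are assumptions.

rng = np.random.default_rng(0)

def add_module(X, T, n_hidden):
    """One hypothetical module: fixed random hidden weights, plus output
    weights solved in closed form by least squares (no gradient descent)."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random, never trained
    H = np.tanh(X @ W)                               # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, T, rcond=None)     # closed-form output weights
    return W, beta

def predict(X, modules):
    """Sum the module outputs; the stacked modules are equivalent to one
    large-hidden-layer SLFN."""
    if not modules:
        return np.zeros((X.shape[0], 1))
    return sum(np.tanh(X @ W) @ beta for W, beta in modules)

# Toy 2-class problem: label is the sign of x0 + x1, targets in {-1, +1}.
X = rng.standard_normal((200, 2))
T = np.where(X.sum(axis=1) > 0, 1.0, -1.0).reshape(-1, 1)

modules = []
for _ in range(3):                       # expand the hidden layer in 3 modules
    residual = T - predict(X, modules)   # each new module fits what remains
    modules.append(add_module(X, residual, n_hidden=20))

acc = float(np.mean(np.sign(predict(X, modules)) == T))
```

Because each module's only trainable parameters are its output weights, training reduces to a sequence of small linear solves, which is consistent with the abstract's claims of fast training and few trainable parameters per module.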
Rights: © 2016 IEEE
RMID: 0030062518
DOI: 10.1109/IJCNN.2016.7727571
Grant ID: http://purl.org/au-research/grants/arc/DP1093425
Appears in Collections:Electrical and Electronic Engineering publications

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.