Show simple item record

dc.creator: Epitropakis, M. G.
dc.creator: Plagianakos, V. P.
dc.creator: Vrahatis, M. N.
dc.date.accessioned: 2015-11-23T10:26:18Z
dc.date.available: 2015-11-23T10:26:18Z
dc.date.issued: 2010
dc.identifier: 10.1016/j.asoc.2009.08.010
dc.identifier.issn: 1568-4946
dc.identifier.uri: http://hdl.handle.net/11615/27370
dc.description.abstract: In this paper, we study the class of Higher-Order Neural Networks, and especially the Pi-Sigma Networks. The performance of Pi-Sigma Networks is evaluated on several well-known neural network training benchmarks. In the experiments reported here, Distributed Evolutionary Algorithms are implemented for Pi-Sigma neural network training; more specifically, the distributed versions of the Differential Evolution and Particle Swarm Optimization algorithms are employed. To this end, each processor is assigned a subpopulation of potential solutions. The subpopulations evolve independently in parallel, and occasional migration allows cooperation between them. The proposed approach is applied to train Pi-Sigma Networks with threshold activation functions. Moreover, the weights and biases are confined to a narrow band of integers, constrained to the range [-32, 32], so the trained Pi-Sigma neural networks can be represented using 6 bits per weight. Such networks are better suited for hardware implementation than their real-weight counterparts, and are to some extent immune to low-amplitude noise that may contaminate the training data. Experimental results suggest that the proposed training process is fast, stable, and reliable, and that the distributed trained Pi-Sigma Networks exhibit good generalization capabilities. (C) 2009 Elsevier B.V. All rights reserved.
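The abstract describes two concrete ingredients: a Pi-Sigma unit (the product of several linear "sigma" sums passed through a hard threshold) and a distributed Differential Evolution scheme in which integer-constrained subpopulations evolve independently and occasionally exchange their best members. The following Python sketch illustrates both ideas under stated assumptions: a standard DE/rand/1/bin variant with rounding and clipping to keep weights in [-32, 32], a ring migration topology, and illustrative function names (pi_sigma_forward, distributed_de) that do not come from the paper; this is not the authors' implementation.

    import numpy as np

    # Pi-Sigma unit: the product of K linear "sigma" sums, passed through
    # a hard threshold, with all weights and biases held to integers.
    def pi_sigma_forward(weights, biases, X):
        # weights: (K, n_inputs) int array; biases: (K,) int array
        sums = X @ weights.T + biases        # (n_samples, K) sigma-unit sums
        product = np.prod(sums, axis=1)      # Pi unit: product of the sums
        return (product > 0).astype(int)     # threshold activation

    def error(vector, K, X, y):
        # Classification error of one flattened (weights | biases) individual.
        n = X.shape[1]
        w = vector[:K * n].reshape(K, n)
        b = vector[K * n:]
        return np.mean(pi_sigma_forward(w, b, X) != y)

    def distributed_de(X, y, K=2, n_subpops=4, subpop_size=20,
                       generations=200, F=0.5, CR=0.9, migrate_every=20,
                       seed=None):
        rng = np.random.default_rng(seed)
        dim = K * X.shape[1] + K
        # Integer individuals confined to [-32, 32], as in the abstract.
        pops = [rng.integers(-32, 33, size=(subpop_size, dim))
                for _ in range(n_subpops)]
        for gen in range(generations):
            for pop in pops:               # subpopulations evolve independently
                for i in range(subpop_size):
                    others = [j for j in range(subpop_size) if j != i]
                    r1, r2, r3 = rng.choice(others, size=3, replace=False)
                    # DE/rand/1 mutation, rounded and clipped to stay integer.
                    mutant = np.clip(
                        np.rint(pop[r1] + F * (pop[r2] - pop[r3])),
                        -32, 32).astype(int)
                    cross = rng.random(dim) < CR
                    cross[rng.integers(dim)] = True  # keep >= 1 mutant gene
                    trial = np.where(cross, mutant, pop[i])
                    if error(trial, K, X, y) <= error(pop[i], K, X, y):
                        pop[i] = trial     # greedy one-to-one selection
            # Occasional migration: each subpopulation's best replaces a
            # random member of the next one (ring topology).
            if (gen + 1) % migrate_every == 0:
                bests = [min(pop, key=lambda v: error(v, K, X, y)).copy()
                         for pop in pops]
                for p in range(n_subpops):
                    pops[(p + 1) % n_subpops][rng.integers(subpop_size)] = bests[p]
        return min(np.vstack(pops), key=lambda v: error(v, K, X, y))

For instance, calling distributed_de(X, y) on a small binary task such as XOR returns an integer weight vector whose quality can be checked with the error helper above; in the paper's setting the subpopulations would run on separate processors rather than in one loop.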
dc.source: Applied Soft Computing
dc.source.uri: <Go to ISI>://WOS:000272206500006
dc.subject: Pi-Sigma Networks
dc.subject: Distributed Differential Evolution
dc.subject: Distributed Particle Swarm Optimization
dc.subject: Back-propagation Neural Networks
dc.subject: Integer Weight Neural Networks
dc.subject: Threshold activation functions
dc.subject: "Hardware-Friendly" Implementations
dc.subject: "On-chip" training
dc.subject: Higher-Order Neural Networks
dc.subject: DIFFERENTIAL EVOLUTION
dc.subject: PARTICLE SWARM OPTIMIZATION
dc.subject: POLYNOMIALS
dc.subject: ADAPTATION
dc.subject: Computer Science, Artificial Intelligence
dc.subject: Computer Science, Interdisciplinary Applications
dc.title: Hardware-friendly Higher-Order Neural Network Training using Distributed Evolutionary Algorithms
dc.type: journalArticle


Files in this item


There are no files associated with this item.

This item appears in the following Collection(s)
