An evolutionary algorithm of feature accumulation for the construction of neural networks

Authors

  • V.V. Tiouterev

Keywords

neural networks
construction of neural networks
genetic algorithms
neuroinformatics

Abstract

A new algorithm for automatic neural network construction is presented; it combines the positive features of the evolutionary and constructive approaches. The main idea of the algorithm is the following: at each step, the network is grown by adding a block of new neurons that has the maximal influence on error reduction. The search for such blocks is carried out with genetic algorithms. The construction process terminates when the network's error reaches a prescribed value. The efficiency of the algorithm is demonstrated on benchmark problems.
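
The abstract describes the growth loop only at a high level, so a minimal sketch may help make it concrete. The Python code below is an illustrative approximation rather than the authors' implementation: it assumes tanh hidden units, evaluates each candidate block by refitting the output layer with linear least squares, and uses a deliberately simple genetic algorithm (truncation selection, uniform crossover, Gaussian mutation). All function names, parameter values, and the toy data set are hypothetical choices made only for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)


def hidden_outputs(X, W):
    """tanh activations of a block of hidden neurons; W has shape
    (n_inputs + 1, n_neurons), the last row being the bias weights."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.tanh(Xb @ W)


def network_error(H, y):
    """MSE after fitting the output layer to the hidden activations H by
    linear least squares (a simplifying assumption of this sketch)."""
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])
    beta, *_ = np.linalg.lstsq(Hb, y, rcond=None)
    return float(np.mean((Hb @ beta - y) ** 2))


def ga_search_block(X, y, H_old, block_size, pop=30, gens=40, sigma=0.5):
    """Genetic search for a block of new neurons that, appended to the already
    accumulated hidden layer H_old, gives the largest error reduction."""
    n_w = (X.shape[1] + 1) * block_size
    population = rng.normal(0.0, 1.0, size=(pop, n_w))

    def fitness(genome):
        W = genome.reshape(X.shape[1] + 1, block_size)
        H_new = hidden_outputs(X, W)
        H = H_new if H_old is None else np.hstack([H_old, H_new])
        return -network_error(H, y)                # higher fitness = lower error

    for _ in range(gens):
        scores = np.array([fitness(g) for g in population])
        parents = population[np.argsort(scores)[::-1][: pop // 2]]  # truncation selection
        children = []
        while len(children) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(n_w) < 0.5                            # uniform crossover
            children.append(np.where(mask, a, b) + rng.normal(0.0, sigma, n_w))  # mutation
        population = np.vstack([parents, np.asarray(children)])

    best = population[np.argmax([fitness(g) for g in population])]
    return best.reshape(X.shape[1] + 1, block_size)


def accumulate_features(X, y, target_mse=1e-3, block_size=2, max_blocks=20):
    """Grow the hidden layer block by block until the error reaches the target."""
    H = None
    for k in range(max_blocks):
        W_new = ga_search_block(X, y, H, block_size)
        H_new = hidden_outputs(X, W_new)
        H = H_new if H is None else np.hstack([H, H_new])
        mse = network_error(H, y)
        print(f"block {k + 1}: {H.shape[1]} hidden neurons, MSE = {mse:.5f}")
        if mse <= target_mse:                      # prescribed error reached
            break
    return H


if __name__ == "__main__":
    # Toy regression problem: approximate y = sin(3x) on [-1, 1].
    X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
    y = np.sin(3.0 * X)
    accumulate_features(X, y)
```

The stopping test on the mean squared error mirrors the prescribed-error criterion mentioned in the abstract; the block size, population size, and number of generations would have to be tuned for a real problem.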


Published

2001-10-21

Section

Section 2. Programming

Author Biography

V.V. Tiouterev


References

  1. Gorban A.N., Dunin-Barkovsky V.L., Kirdin A.N. Neuroinformatics. Novosibirsk: Nauka, 1998 (in Russian) (http://www.neuropower.de/rus/).
  2. Bishop C.M. Neural networks for pattern recognition. Oxford: Oxford University Press, 1997.
  3. Moody J.E., Utans J. Architecture selection strategies for neural networks: application to corporate bond rating prediction // Neural Networks in the Capital Markets. 277-300. Chichester: Wiley, 1995 (ftp://cse.ogi.edu/pub/tech-reports/1994/94-036.ps.gz).
  4. Ripley B.D. Pattern recognition and neural networks. Cambridge: Cambridge University Press, 1997.
  5. Hassibi B., Stork D.G. Second order derivatives for network pruning: Optimal Brain Surgeon // Neural Information Processing Systems. 1992 (ftp://archive.cis.ohio-state.edu/pub/neuroprose/stork.obs.ps.gz).
  6. Torres-Moreno J.M. Apprentissage et généralisation par des réseaux de neurones: étude de nouveaux algorithmes constructifs: PhD Thesis. Institut National Polytechnique de Grenoble. Grenoble, 1997 (ftp://archive.cis.ohio-state.edu/pub/neuroprose/torres.thesis.ps.Z).
  7. Fahlman S.E., Lebiere C. The cascade-correlation learning architecture // Advances in Neural Information Processing II. 1990. 524-532 (ftp://archive.cis.ohio-state.edu/pub/neuroprose/fahlman.cascor-tr.ps.gz).
  8. Gorban A.N. Training of Neural Networks. Moscow: ParaGraph, 1990 (in Russian).
  9. Murata N., Yoshizawa S., Amari S. Network information criterion - determining the number of hidden units for an artificial neural network model // IEEE Transactions on Neural Networks. 1994. 5, N 6. 865-872 (http://www.islab.brain.riken.go.jp/~mura/paper/mura94_nic.ps.gz).
  10. Spears W.M., De Jong K.A., Back T., Fogel D.B., de Garis H. An overview of evolutionary computation // Proceedings of the 1993 European Conference on Machine Learning. 1993 (http://www.aic.nrl.navy.mil/~spears/papers/ecml93.ps).
  11. Holland J.H. Adaptation in natural and artificial systems. Univ. of Michigan Press, 1992.
  12. Dmitrovich A.I. Intelligent Information Systems. Minsk: Tetrasistems, 1997 (in Russian).
  13. Branke J. Evolutionary algorithms for neural network design and training // 1st Nordic Workshop on Genetic Algorithms and its Applications. 1995 (ftp://ftp.aifb.uni-karlsruhe.de/pub/jbr/Vaasa.ps.gz).
  14. Koehn Ph. Combining genetic algorithms and neural networks: the encoding problem: PhD Thesis. The University of Tennessee. Knoxville, 1994 (ftp://archive.cis.ohio-state.edu/pub/neuroprose/koehn.encoding.ps.gz).
  15. Tiouterev V.V., Shevelev O.G. Using L-grammars to define a neural network of optimal topology by means of genetic algorithms // IV Interuniversity Conference of Students, Postgraduates, and Young Scientists "Science and Education". Tomsk: TSPU, 2000 (in Russian).
  16. Siddiqi A.A., Lucas S.M. A comparison of matrix rewriting versus direct encoding for evolving neural networks // Proceedings of Intern. Joint Conf. on Neural Networks’98. Anchorage, 1998.
  17. Gruau F., Whitley D., Pyeatt L. A comparison between cellular encoding and direct encoding for genetic neural networks // Proceedings of the First Genetic Programming Conference. 1996. 81-89.
  18. Jacob W., Rehder M. Evolution of neural net architectures by a hierarchical grammar-based genetic system // Proceedings of the International Joint Conference on Neural Networks and Genetic Algorithms. Innsbruck, 1993.
  19. Gruau F. Cellular encoding of genetic neural networks. Tech. Rep. 92-21, Laboratoire de l’Informatique du Parallelisme, Ecole Normale Superieure de Lyon. Lyon, 1992 (http://www.cwi.nl/~gruau/gruau/RR92-21.ps.Z).
  20. Gronroos M. Evolutionary design of neural networks: MSc Thesis. Department of Mathematical Sciences, University of Turku. Turku, 1998 (http://magi.yok.utu.fi/~magi/opinnot/gradu/mscthesis.ps.gz).
  21. Hussain T.S. Modularity within neural networks: PhD Thesis. Department of Computing and Information Sciences, 1995.
  22. Gruau F. Automatic definition of sub-neural networks. Tech. Rep. RR94-28. Laboratoire de l’Informatique du Parallelisme, Ecole Normale Superieure de Lyon. Lyon, 1994.
  23. Boers E., Kuiper H., Happel B., Sprinkhuizen-Kuyper I. Designing modular artificial neural networks. Tech. Rep. 93-24. Department of Computer Science, Leiden University. Leiden, 1993.
  24. Hinton G.E., van Camp D. Keeping neural networks simple by minimizing the description length of the weights // Proceedings of the Workshop on Computational Learning Theory. Toronto: Morgan Kaufmann Publishers, 1993.
  25. Tiouterev V.V., Novoseltsev V.B. The dynamic node creation method as a way of constructing neural networks of efficient size // Neurocomputers: Development, Application. Moscow: Radiotekhnika, 2001. N 2. 3-8 (in Russian).
  26. Tiouterev V.V., Novoseltsev V.B. A study of an algorithm for the automatic construction of neural networks // Studies in Analysis and Algebra. Tomsk: Tomsk State University, 2001. N 3. 269-281 (in Russian).
  27. Phatak D.S., Koren I. Connectivity and performance tradeoffs in the cascade correlation learning architecture. Tech. Rep. TR-92-CSE-27. ECE Dept., UMASS. Amherst, 1994 (ftp://archive.cis.ohio-state.edu/pub/neuroprose/phatak.layered-cascor.ps.gz).
  28. Treadgold N.K., Gedeon T.D. Exploring constructive cascade networks // IEEE Transactions on Neural Networks. 1999 (http://www.cse.unsw.edu.au/~nickt/doc/acasper.ps).
  29. Prechelt L. Proben1 - a set of neural network benchmark problems and benchmarking rules. Tech. Rep. 21/94. Fakultat fur Informatik, Universitat Karlsruhe. Karlsruhe, 1994 (ftp://ftp.ira.uka.de/pub/neuron/proben1.tar.gz).
  30. Prechelt L. Investigation of the CasCor family of learning algorithms // Neural Networks. 1997. 10, N 5. 885-896 (ftp://ftp.ira.uka.de/pub/neuron/neurnetw97.ps.gz).
  31. Treadgold N.K., Gedeon T.D. Exploring architecture variations in constructive cascade networks // Proceedings Int. Joint Conf. on Neural Networks. Anchorage, 1998. 343-348 (http://www.cse.unsw.edu.au/~nickt/doc/tower.ps).
  32. Treadgold N.K., Gedeon T.D. A Cascade network algorithm employing progressive RPROP // Int. Conf. on Artificial and Natural Neural Networks. Lanzarote, 1997. 733-742 (http://www.cse.unsw.edu.au/~nickt/doc/casper.ps).
  33. Treadgold N.K., Gedeon T.D. Extending and benchmarking the CasPer algorithm // Australian Conference on Artificial Intelligence. Perth, 1997. 398-406 (http://www.cse.unsw.edu.au/~nickt/doc/casperp_class.ps).
  34. Treadgold N.K., Gedeon T.D. Extending CasPer: a regression survey // Int. Conf. on Neural Information Processing. Dunedin, 1997. 310-313 (http://www.cse.unsw.edu.au/~nickt/doc/casperp_reg.ps).
  35. Schiffmann W., Joost M., Werner R. Application of genetic algorithms to the construction of topologies for multilayer perceptrons // Proceedings of Artificial Neural Networks and Genetic Algorithms. Innsbruck, 1993. 675-682.
  36. Salustowicz R. A genetic algorithm for the topological optimization of neural networks: PhD Thesis. Technische Universitat Berlin. Berlin, 1995 (ftp://archive.cis.ohio-state.edu/pub/neuroprose/salustowicz.evnn.ps.gz).
  37. Friedrich C.M., Moraga C. An evolutionary method to find good building-blocks for architectures of artificial neural networks // Proceedings of the Sixth International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU96). Granada, 1996. 951-956.
  38. Friedrich C.M., Moraga C. Using genetic engineering to find modular structures and activation functions for architectures of artificial neural networks // Computational Intelligence, Theory and Applications. 1997. 150-161.
  39. Fredriksson K. Genetic algorithms and generative encoding of neural networks for some benchmark classification problems // Proceedings of the Third Nordic Workshop on Genetic Algorithms and their Applications. Turku, 1997. 123-134.
  40. Ragg T., Gutjahr S., Hai Ming Sa. Automatic determination of optimal network topologies based on information theory and evolution // Euromicro '97, Track on Computational Intelligence. Springbank, 1997 (ftp://springbank.ira.uka.de/pub/neuro/ragg/euromicro97.ps.gz).
  41. Gruau F. Automatic definition of modular neural networks // Adaptive Behavior. 1995. N 3. 151-183.
  42. Finnoff W., Hergert F., Zimmermann H.G. Improving model selection by nonconvergent methods // Neural Networks. 1993. N 6. 771-783.