Multilayered neural architectures evolution for computing sequences of orthogonal polynomials

Publisher: Springer Journals
Copyright: Copyright © 2018 by Springer Nature Switzerland AG
Subject: Computer Science; Artificial Intelligence; Mathematics, general; Computer Science, general; Complex Systems
ISSN: 1012-2443
eISSN: 1573-7470
DOI: 10.1007/s10472-018-9601-2

Abstract

This article presents an evolutionary algorithm that autonomously constructs fully connected multilayered feedforward neural architectures. The algorithm employs grammar-guided genetic programming with a context-free grammar designed to satisfy three important restrictions. First, the sentences of the language generated by the grammar encode all valid neural architectures, and only valid ones. Second, fully connected feedforward neural architectures of any size can be generated. Third, smaller neural architectures are favored to avoid overfitting. The proposed evolutionary neural-architecture construction system is applied to compute the terms of the two sequences that define the three-term recurrence relation associated with a sequence of orthogonal polynomials. This application imposes an important constraint: training datasets are always very small. An adequately sized neural architecture must therefore be evolved to achieve satisfactory results, which are reported in terms of the accuracy and size of the evolved neural architectures and the convergence speed of the evolutionary process.
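For context, the three-term recurrence the abstract refers to is a standard property of orthogonal polynomials (this formulation is a general fact, not taken from the article's full text): a sequence of monic orthogonal polynomials p_n(x) is determined by two coefficient sequences, conventionally written (a_n) and (b_n):

```latex
% Three-term recurrence for monic orthogonal polynomials p_n(x),
% determined by the two coefficient sequences (a_n) and (b_n):
\begin{align*}
  p_{-1}(x) &= 0, \qquad p_0(x) = 1, \\
  p_{n+1}(x) &= (x - a_n)\, p_n(x) - b_n\, p_{n-1}(x), \qquad n \ge 0.
\end{align*}
```

The "two sequences" mentioned in the abstract are these recurrence coefficients (a_n) and (b_n); the evolved networks are trained to compute their terms.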
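To illustrate the kind of grammar the abstract describes, below is a minimal, hypothetical sketch, not the paper's actual grammar (MAX_NEURONS, P_GROW, and derive_net are illustrative names). Its sentences encode fully connected feedforward architectures as lists of hidden-layer sizes: every derivation is a valid architecture, any depth is reachable, and keeping the recursion probability below 0.5 biases derivations toward shorter sentences, i.e., smaller networks, mirroring the three restrictions stated in the abstract.

```python
import random

# Sketch of a context-free grammar over hidden-layer sizes:
#   <net>   ::= <layer> | <layer> <net>
#   <layer> ::= 1 | 2 | ... | MAX_NEURONS

MAX_NEURONS = 8   # assumed cap on neurons per hidden layer
P_GROW = 0.4      # probability of taking the recursive production;
                  # < 0.5 favors short sentences (small networks)

def derive_net(rng: random.Random) -> list[int]:
    """Derive one sentence: a list of hidden-layer sizes."""
    layers = [rng.randint(1, MAX_NEURONS)]   # <net> -> <layer> ...
    while rng.random() < P_GROW:             # ... <net> (recursive case)
        layers.append(rng.randint(1, MAX_NEURONS))
    return layers

if __name__ == "__main__":
    rng = random.Random(42)
    for _ in range(5):
        print(derive_net(rng))  # e.g. [3], [7, 2], [5, 5, 1], ...
```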

Journal: Annals of Mathematics and Artificial Intelligence (Springer Journals)

Published: Sep 18, 2018
