Automatic Generation of Neural Networks with Structured Grammatical Evolution

The effectiveness of Artificial Neural Networks (ANNs) depends on the non-trivial manual crafting of their topology and parameters. Typically, practitioners resort to a time-consuming trial-and-error methodology to find and/or adjust models for specific tasks. To minimise this burden, one can resort to algorithms for the automatic selection of the most appropriate properties of a given ANN. A remarkable example of such methodologies is Grammar-based Genetic Programming. This work analyses and compares the use of two grammar-based methods, Grammatical Evolution (GE) and Structured Grammatical Evolution (SGE), to automatically design and configure ANNs. The evolved networks are used to tackle several classification datasets. Experimental results show that SGE is able to automatically build better models than GE, models that are competitive with the state of the art and outperform hand-designed ANNs on all the used benchmarks.

 

Neural Networks Structured Grammatical Evolution (NN-SGE)

 

NN-SGE is our proposal for the evolution of ANNs using SGE. Given the higher locality and lower redundancy of SGE, we hypothesise that NN-SGE is capable of evolving not only the structure, i.e., the topology of ANNs, but also their real-valued parameters, such as weights and biases. In the following, we introduce the implementation details that differ from the standard SGE implementation.

 

Mutation

 

In the standard SGE implementation the probability of mutating each gene is the same, equal to 1/n, where n is the number of genes (which equals the number of non-terminal symbols). However, the number of integers in each gene varies. Thus, we propose a roulette-like approach for selecting the gene to be mutated, where the probability of choosing each gene is proportional to its number of integers.
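
As an illustration, a minimal sketch of this roulette-like selection follows, assuming a genotype represented as a list of genes, where each gene is the list of integers associated with one non-terminal; the function name is illustrative and not taken from the paper.

    import random

    def select_gene_to_mutate(genotype):
        # Each gene's weight is its number of integers, so longer genes
        # are proportionally more likely to be picked for mutation,
        # replacing the uniform 1/n probability of standard SGE.
        weights = [len(gene) for gene in genotype]
        return random.choices(range(len(genotype)), weights=weights, k=1)[0]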

 

Crossover

 

We rely on one-point crossover to combine two parents. Note that in SGE the genetic material that is swapped corresponds to genes and, as such, not directly to the integers, i.e., the genetic information that is swapped consists of lists of integers encoding the expansions of a given non-terminal symbol.
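
A minimal sketch of this gene-level one-point crossover, under the same genotype representation as above (a list of per-non-terminal integer lists); the function name is illustrative.

    import random

    def one_point_crossover(parent_a, parent_b):
        # SGE genotypes have a fixed number of genes (one per non-terminal),
        # so both parents have the same length. The cut point lies between
        # genes: whole lists of integers are swapped, never split.
        cut = random.randint(1, len(parent_a) - 1)
        offspring_a = parent_a[:cut] + parent_b[cut:]
        offspring_b = parent_b[:cut] + parent_a[cut:]
        return offspring_a, offspring_b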

 

Fitness Evaluation

 

The performance of each ANN is measured by the Root Mean Square Error (RMSE) obtained while solving a classification task. To avoid overfitting and to tackle unbalanced datasets we consider the RMSE per class, and the fitness function is the product of the exponentials of the per-class RMSEs. This way, higher errors are penalised more than lower ones.
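
In other words, for classes c = 1, ..., k the fitness to minimise is the product over c of exp(RMSE_c). A minimal sketch, assuming NumPy arrays of numeric targets and predictions (the paper's exact class encoding may differ):

    import numpy as np

    def fitness(y_true, y_pred):
        # Product of exp(RMSE) over the classes; since exp grows quickly,
        # a single badly classified class dominates the score.
        # Lower values are better (minimisation).
        score = 1.0
        for label in np.unique(y_true):
            mask = y_true == label
            rmse = np.sqrt(np.mean((y_true[mask] - y_pred[mask]) ** 2))
            score *= np.exp(rmse)
        return score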

 

Evolving Artificial Neural Networks

 

The topology and parameters of ANNs are evolved for the classification of four problems: Flame, Wisconsin Breast Cancer Detection (WDBC), Ionosphere, and Sonar. The grammar used is shown in Figure 1.

 

Figure 1: Grammar used for the conducted experiments. n represents the number of features of the problem.
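
The figure itself is not reproduced here. For readers without access to it, a small illustrative grammar in the usual style of grammar-based ANN evolution (not the paper's exact production rules) might look as follows, with one terminal per input feature:

    <start>       ::= <network>
    <network>     ::= <neuron> | <neuron> + <network>
    <neuron>      ::= <weight> * sig(<connections> + <weight>)
    <connections> ::= <weight> * <feature> | <weight> * <feature> + <connections>
    <feature>     ::= x1 | x2 | ... | xn   (one alternative per each of the n features)
    <weight>      ::= <sign><digit>.<digit><digit>
    <sign>        ::= + | -
    <digit>       ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9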

 

Results show that using NN-SGE it is possible to successfully evolve ANNs for the considered problems. NN-SGE consistently presents mean fitness values below those of GE (e.g., Figure 2), and the differences are statistically significant in the majority of the cases.

Concerning the comparison with state-of-the-art Grammar-based Genetic Programming (GGP) approaches for evolving ANNs, NN-SGE is able to evolve solutions that surpass those achieved by previous works. Tsoulos et al. report an accuracy of 0.9544 on the WDBC dataset and 0.9034 on Ionosphere. These results are incomplete, since only the results attained by the best networks are reported. NN-SGE mean accuracy values for WDBC and Ionosphere are 0.93 and 0.87, respectively; however, the best found networks have test accuracies of up to 0.97 and 0.96 for WDBC and Ionosphere, respectively.

Soltanian et al. report an average test accuracy of 0.899 on the Ionosphere benchmark. Although lower than NN-SGE's, it was obtained using backpropagation to fine-tune the weights of the evolved networks and longer experiments (in terms of needed evaluations). Later, we show that by fine-tuning the best evolved networks of each run similar results are achieved.

More recently, Ahmadizar et al. combined a GA with GE, obtaining an RMSE of 0.4398 and an accuracy of 0.7201 on the Sonar benchmark, and an accuracy of 0.8694 on the Ionosphere dataset. Their evolutionary approach takes the generalisation ability of the evolved networks into account by measuring the classification accuracy on the testing set, which is included in the fitness function. By doing so, the evolutionary engine is provided with the entire dataset, and no data is kept aside from the evolutionary process. Thus, these results should be compared with our training ones, which are superior.

 

Figure 2: Evolution of the fitness of the best individuals across 25000 evaluations for the Sonar dataset. Results are averages of 30 independent runs.

 

Analysing the Evolved Networks

 

To better understand the impact of evolving the weights and topology of an ANN we fine-tune the networks built by NN-SGE using backpropagation (BP) in two different scenarios: (i) considering the evolved topology and using the evolved weights as the initial weights for the BP algorithm; and (ii) considering the topology but not the evolved weights. In addition, we also compare the obtained networks with the results of training Fully Connected Neural Networks (FCNNs) with one hidden layer and a number of nodes equal to that of the best networks evolved using NN-SGE.
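
A minimal sketch of the two scenarios using a Keras-style API (an assumption; the paper does not state which framework was used), where topology is the list of evolved hidden-layer sizes and evolved_weights holds the evolved weight matrices in the layout expected by set_weights:

    import tensorflow as tf

    def build_network(topology, n_features):
        # Fully connected sigmoid network with the evolved hidden-layer
        # sizes and a single sigmoid output for binary classification.
        layers = [tf.keras.Input(shape=(n_features,))]
        layers += [tf.keras.layers.Dense(n, activation="sigmoid") for n in topology]
        layers += [tf.keras.layers.Dense(1, activation="sigmoid")]
        model = tf.keras.Sequential(layers)
        model.compile(optimizer="sgd", loss="mse")
        return model

    def fine_tune(topology, X_train, y_train, evolved_weights=None, epochs=100):
        # Scenario (i): pass the evolved weights to seed backpropagation.
        # Scenario (ii): pass evolved_weights=None to train from a random start.
        model = build_network(topology, X_train.shape[1])
        if evolved_weights is not None:
            model.set_weights(evolved_weights)
        model.fit(X_train, y_train, epochs=epochs, verbose=0)
        return model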

The experimental results demonstrate that networks evolved using NN-SGE attain better performances than hand-crafted ANNs. More interestingly, when using BP to fine-tune the networks found during evolution, we show that keeping only the topology yields worse-performing networks than keeping both the evolved topology and weights, i.e., seeding the BP algorithm with the evolved weights speeds up convergence, leading to faster and more efficient learning.

 

In Proceedings

  • F. Assunção, N. Lourenço, P. Machado, and B. Ribeiro, “Automatic Generation of Neural Networks with Structured Grammatical Evolution,” in 2017 IEEE Congress on Evolutionary Computation (CEC), 2017.

Authors

Filipe Assunção

Nuno Lourenço

Penousal Machado

Bernardete Ribeiro