Scalable and Highly Available Multi-Objective Neural Architecture Search in Bare Metal Kubernetes Cluster

The interest in deep neural networks for solving computer vision tasks has increased dramatically. Because a neural network's architecture heavily influences its predictive accuracy, neural architecture search has gained much attention in recent years. This research area typically implies a high computational burden and thus requires high scalability as well as high availability to avoid loss of data or waste of computational power. Moreover, the prevailing approach to application development has shifted from monolithic designs to microservices. Hence, we developed a highly scalable and available multi-objective neural architecture search and adapted it to this modern development paradigm by subdividing an existing, monolithic neural architecture search, based on a genetic algorithm, into microservices. Furthermore, we adapted the creation of the initial population to apply 1,000 mutations to each individual, extended the approach with inception layers, implemented it as an island model to facilitate scalability, and achieved test accuracies of 99.75%, 94.35%, and 89.90% on the MNIST, Fashion-MNIST, and CIFAR-10 datasets, respectively. In addition, our system is strongly focused on high availability, enabled by deployment in our bare-metal Kubernetes cluster. Our results show that the introduced multi-objective neural architecture search can handle even the loss of nodes and resume the algorithm within seconds on another node, without any loss of results or the need for human interaction.
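The two core ideas summarized above, an initial population built by repeatedly mutating a seed individual and an island model with periodic migration between subpopulations, can be sketched on a toy bit-string problem. This is a minimal illustration only: the fitness function, mutation rate, and all parameters below are stand-ins, not the paper's actual architecture encoding or its 1,000-mutation setting.

```python
import random

random.seed(0)  # deterministic for illustration

def fitness(genome):
    # Toy objective: count of 1-bits, standing in for the validation
    # accuracy of a candidate network architecture.
    return sum(genome)

def mutate(genome, rate=0.1):
    # Flip each gene independently with probability `rate`.
    return [1 - g if random.random() < rate else g for g in genome]

def make_initial_population(seed_genome, pop_size, n_mutations=10):
    # The paper builds the initial population by mutating each seed
    # individual many times (1,000 in the paper; fewer here).
    pop = []
    for _ in range(pop_size):
        g = seed_genome
        for _ in range(n_mutations):
            g = mutate(g)
        pop.append(g)
    return pop

def evolve_island(pop, generations=20):
    # Simple truncation selection: keep the better half, refill the
    # population with mutated copies of the survivors.
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: len(pop) // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return pop

def island_model(n_islands=4, pop_size=20, genome_len=32, epochs=5):
    seed = [0] * genome_len
    islands = [make_initial_population(seed, pop_size) for _ in range(n_islands)]
    for _ in range(epochs):
        # Each island evolves independently (in the paper, islands run
        # as separate microservices on the Kubernetes cluster).
        islands = [evolve_island(pop) for pop in islands]
        # Ring migration: the best individual of each island replaces
        # the worst individual of the next island.
        bests = [max(pop, key=fitness) for pop in islands]
        for i, pop in enumerate(islands):
            pop.sort(key=fitness)
            pop[0] = bests[(i - 1) % len(islands)]
    return max((max(pop, key=fitness) for pop in islands), key=fitness)

best = island_model()
```

The migration step is what distinguishes the island model from simply running independent searches: it occasionally shares good solutions between otherwise isolated subpopulations, which preserves diversity while still propagating strong candidates.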
