A survey on recent activation functions with emphasis on oscillating activation functions

The brain's mechanisms for learning and generalization have long intrigued scientists and researchers. With the aim of mimicking these physiological workings, the artificial neural network (ANN) was introduced: layers of artificial neurons (nodes) that learn from experience to make predictions and decisions [5]. This paper briefly discusses a few activation functions that have been used frequently over the years, and then reviews recent proposals for oscillating activation functions.
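For concreteness, below is a minimal NumPy sketch (illustrative only, not code from the survey) of a few of the activation functions treated in the works cited here, including the oscillatory Growing Cosine Unit proposed in [4]:

    import numpy as np

    def relu(z):
        # Rectified linear unit [12], [15]: zero for negative inputs, identity otherwise.
        return np.maximum(0.0, z)

    def softplus(z):
        # Softplus [11]: log(1 + e^z), a smooth approximation of ReLU.
        return np.log1p(np.exp(z))

    def mish(z):
        # Mish [7]: z * tanh(softplus(z)), a self-regularized non-monotonic function.
        return z * np.tanh(np.log1p(np.exp(z)))

    def gcu(z):
        # Growing Cosine Unit [4]: z * cos(z), an oscillatory activation function.
        return z * np.cos(z)

    z = np.linspace(-4.0, 4.0, 9)
    print(gcu(z))  # oscillates and changes sign, unlike the monotone ReLU or softplus

Unlike the monotone functions above, the GCU output crosses zero repeatedly as its input grows, which is the defining property of the oscillating activation functions surveyed in [3] and [4].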

[1] Shiho Kim et al., Single Neuron for Solving XOR like Nonlinear Problems, 2022, Computational Intelligence and Neuroscience.

[2] Murilo Gustineli, A survey on recently proposed activation functions for Deep Learning, 2022, arXiv.

[3] Geraldine Bessie Amali et al., Biologically Inspired Oscillating Activation Functions Can Bridge the Performance Gap between Biological and Artificial Neurons, 2021, arXiv.

[4] Mathew Mithra Noel et al., Growing Cosine Unit: A Novel Oscillatory Activation Function That Can Speedup Training and Reduce Parameters in Convolutional Neural Networks, 2021, arXiv.

[5] Roberto Prevete et al., A survey on modern trainable activation functions, 2020, Neural Networks.

[6] Tomasz Szandała et al., Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks, 2020, Bio-inspired Neurocomputing.

[7] Diganta Misra et al., Mish: A Self Regularized Non-Monotonic Neural Activation Function, 2019, arXiv.

[8] Stephen Marshall et al., Activation Functions: Comparison of Trends in Practice and Research for Deep Learning, 2018, arXiv.

[9] Quoc V. Le et al., Searching for Activation Functions, 2018, arXiv.

[10] Guigang Zhang et al., Deep Learning, 2016, Int. J. Semantic Comput.

[11] Yanpeng Li et al., Improving deep neural networks using softplus units, 2015, 2015 International Joint Conference on Neural Networks (IJCNN).

[12] Tianqi Chen et al., Empirical Evaluation of Rectified Activations in Convolutional Network, 2015, arXiv.

[13] Axel Hutt et al., Stimulus Statistics Shape Oscillations in Nonlinear Recurrent Neural Networks, 2015, The Journal of Neuroscience.

[14] Ehsan Lotfi et al., A Novel Single Neuron Perceptron with Universal Approximation and XOR Computation Properties, 2014, Comput. Intell. Neurosci.

[15] Andrew L. Maas, Rectifier Nonlinearities Improve Neural Network Acoustic Models, 2013.