Generalized Learning Vector Quantization for Classification in Randomized Neural Networks and Hyperdimensional Computing

Machine learning algorithms deployed on edge devices must meet strict resource constraints and efficiency requirements. Random Vector Functional Link (RVFL) networks are favored for such applications due to their simple design and training efficiency. We propose a modified RVFL network that avoids computationally expensive matrix operations during training, thus expanding the network's range of potential applications. Our modification replaces the least-squares classifier with the Generalized Learning Vector Quantization (GLVQ) classifier, which relies only on simple vector and distance computations. The GLVQ classifier can also be considered an improvement upon certain classification algorithms commonly used in Hyperdimensional Computing. The proposed approach achieved state-of-the-art accuracy on a collection of datasets from the UCI Machine Learning Repository, exceeding the accuracy of previously proposed RVFL networks. We further demonstrate that the approach maintains high accuracy even when the number of training iterations is severely limited, requiring on average only 21% of the computational cost of the least-squares classifier.
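To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of the two ingredients the abstract describes: an RVFL-style random hidden-layer expansion with a direct input link, followed by a standard GLVQ update rule (Sato and Yamada, 1995) that uses only vector and distance operations. All names, hyperparameters, and the one-prototype-per-class choice are illustrative assumptions.

import numpy as np


def rvfl_expand(X, W, b):
    """Expand inputs with a fixed random hidden layer and concatenate
    the original features (the RVFL direct link)."""
    H = np.tanh(X @ W + b)          # random nonlinear features
    return np.hstack([X, H])        # direct input-to-output link


def train_glvq(F, y, n_classes, lr=0.05, epochs=30, rng=None):
    """Train one prototype per class with the GLVQ rule: move the nearest
    correct prototype toward the sample and the nearest incorrect
    prototype away from it, weighted by the sigmoid-loss derivative."""
    rng = np.random.default_rng() if rng is None else rng
    # Initialize each prototype at its class mean.
    protos = np.stack([F[y == c].mean(axis=0) for c in range(n_classes)])
    labels = np.arange(n_classes)
    for _ in range(epochs):
        for i in rng.permutation(len(F)):
            x, c = F[i], y[i]
            d = ((protos - x) ** 2).sum(axis=1)        # squared distances
            j_pos = np.where(labels == c)[0][d[labels == c].argmin()]
            j_neg = np.where(labels != c)[0][d[labels != c].argmin()]
            d_pos, d_neg = d[j_pos], d[j_neg]
            mu = (d_pos - d_neg) / (d_pos + d_neg + 1e-12)
            # Derivative of the sigmoid loss f(mu) = 1 / (1 + exp(-mu)).
            f_prime = np.exp(-mu) / (1.0 + np.exp(-mu)) ** 2
            denom = (d_pos + d_neg + 1e-12) ** 2
            protos[j_pos] += lr * f_prime * (d_neg / denom) * (x - protos[j_pos])
            protos[j_neg] -= lr * f_prime * (d_pos / denom) * (x - protos[j_neg])
    return protos, labels


def predict(F, protos, labels):
    """Assign each sample the label of its nearest prototype."""
    d = ((F[:, None, :] - protos[None, :, :]) ** 2).sum(axis=2)
    return labels[d.argmin(axis=1)]


# Assumed usage on some dataset (X_train, y_train, X_test):
# rng = np.random.default_rng(0)
# W = rng.normal(size=(X_train.shape[1], 100))   # fixed random weights
# b = rng.normal(size=100)
# F_train = rvfl_expand(X_train, W, b)
# protos, labels = train_glvq(F_train, y_train, n_classes=len(np.unique(y_train)))
# preds = predict(rvfl_expand(X_test, W, b), protos, labels)

Note that training touches only individual feature vectors and prototypes; no matrix inversion or least-squares solve is needed, which is the property the abstract emphasizes for edge deployment.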
