Representation learning using deep random vector functional link networks for clustering