Statistical approach to the Jutten-Hérault algorithm
One year ago, the Jutten-Herault (JH) network was the only existing tool for recovering p stochastic "source" processes from an unknown mixture [1]. Guided by analogies with the neurosciences, the unsupervised-learning JH algorithm has been adjusted and implemented on an array of p fully interconnected linear neurons [1]. Because of its numerous applications, ranging from image processing to antenna array processing, the JH algorithm has received much attention in the last few years [2], but no rigorous derivation has been proposed to date. In this paper we analyze it from a statistical point of view. For instance, it can be shown that the updating term of the matrix of synaptic efficacies, δC, cannot be the gradient of a single C^2 functional, contrary to what is sometimes assumed. In fact, we show that the JH algorithm actually searches for common zeros of p functionals by a technique of Robbins-Monro type.
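The Robbins-Monro view described above can be sketched in code. The following is a minimal illustration, not the paper's derivation: the odd nonlinearities f(y) = y³ and g(y) = arctan(y), the mixing matrix, and the step-size schedule are all illustrative assumptions. The off-diagonal entries of C are driven by the stochastic term f(y_i)g(y_j), so the iteration seeks a common zero of the p(p-1) functionals E[f(y_i)g(y_j)], i ≠ j.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 2

# Hypothetical unknown mixing: x(t) = A s(t), with independent
# zero-mean sources s(t).  A is an illustrative choice.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])

C = np.zeros((p, p))   # synaptic efficacies; diagonal stays zero
mu0 = 0.01

for t in range(5000):
    s = rng.uniform(-1.0, 1.0, size=p)   # independent sources
    x = A @ s                            # observed mixture
    # Output of the recurrent linear network: y = (I + C)^{-1} x.
    y = np.linalg.solve(np.eye(p) + C, x)
    # Robbins-Monro-type step: decreasing gains, stochastic update
    # term f(y_i) g(y_j) whose expectation vanishes at separation.
    mu = mu0 / (1.0 + t / 1000.0)
    dC = mu * np.outer(y**3, np.arctan(y))
    np.fill_diagonal(dC, 0.0)            # only cross-weights adapt
    C += dC
```

The decreasing gains mu(t) play the role of the Robbins-Monro step sizes; note that dC is not the gradient of any single scalar functional, consistent with the observation above.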
[1] Lennart Ljung et al., Theory and Practice of Recursive Identification, 1983.
[2] Christian Jutten et al., Analog implementation of a permanent unsupervised learning algorithm, NATO Neurocomputing, 1989.