S-Orthogonal Projection Operators as Asymptotic Solutions of a Class of Matrix Differential Equations

A class of nonlinear autonomous matrix differential equations is considered. Two special cases of this class yield the differential equations of adaptive network models which were initially introduced in connection with idealized neuron networks. It is shown that these equations exhibit an asymptotic behavior that is intimately related to the matrix form of the Gram-Schmidt orthogonalization algorithm in a real vector space, with respect to an inner product with an arbitrary positive definite symmetric weight matrix S.

1. Introduction. Several applications of pattern recognition, associative memories, and the theory of learning systems are concerned with the problem of constructing projection operators on specified subspaces. Such operators frequently result as asymptotic states or goals of learning in certain adaptive processes (for a review, see [3]). Recently, Kohonen [3], [5] has suggested a dynamical network model whose overall transfer matrix has interesting projection properties. The transfer matrix is governed by the equation

    dΦ/dt = −α Φ a aᵀ Φ,   α ≥ 0,

where Φ is an n × n matrix function of t, a ∈ Rⁿ is a vector, and α > 0 is a scalar. In [5] it is shown for the corresponding difference equation that the solution, starting from a projection matrix, tends asymptotically to a projection matrix on a given subspace, when the input of the network, or the vector a, is suitably chosen.

In this paper, the above mathematical result is applied to matrix differential equations of a more general type, to appear as equations (12) and (25) in the following sections. No physical implementations are considered here; however, equations of this class may represent the effect of certain variations in the basic network model introduced in [3]. It will be shown that the relation between the initial and asymptotic solutions is intimately connected with a step in the matrix form of the Gram-Schmidt orthogonalization algorithm.
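The asymptotic behavior just described can be checked numerically. The following sketch (not from the paper; α, the vector a, the step size, and the iteration count are illustrative choices) integrates dΦ/dt = −αΦaaᵀΦ by forward Euler from Φ₀ = I, and compares the result with the closed-form limit Φ₀ − Φ₀aaᵀΦ₀/(aᵀΦ₀a), which for Φ₀ = I is the orthogonal projection on the subspace orthogonal to a:

```python
import numpy as np

# Forward-Euler integration of dPhi/dt = -alpha * Phi a a^T Phi.
# alpha, a, dt, and the number of steps are illustrative choices.
alpha = 1.0
a = np.array([1.0, 2.0, 2.0])
Phi = np.eye(3)                        # Phi0 = I, the projection on all of R^3

dt, steps = 0.05, 40000
for _ in range(steps):
    Phi = Phi - alpha * dt * np.outer(Phi @ a, Phi.T @ a)

# Closed-form asymptotic limit Phi0 - Phi0 a a^T Phi0 / (a^T Phi0 a);
# with Phi0 = I this is the orthogonal projection on the complement of a.
P_inf = np.eye(3) - np.outer(a, a) / (a @ a)

print(np.max(np.abs(Phi - P_inf)))     # small residual
print(np.max(np.abs(Phi @ a)))         # Phi asymptotically annihilates a
```

For a symmetric projection Φ₀, the limit is exactly one elimination step of Gram-Schmidt type: the direction Φ₀a is removed from the range of the projection, while all directions orthogonal to a are left untouched.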
Thus it will be seen that, in a sense, this class of equations is a continuous counterpart of the Gram-Schmidt algorithm in Rⁿ with an inner product having a weight matrix that is either the unit matrix or an arbitrary positive definite matrix.

2. Gram-Schmidt construction of an S-orthogonal basis by means of nonorthogonal projection matrices. This section is a brief presentation of well-known properties of Gram-Schmidt orthogonalization, put in a form that allows comparison with the asymptotic properties of the matrix differential equations under study. Let {a₁, a₂, ⋯} be an arbitrary sequence of vectors in Rⁿ. Consider the unnormalized Gram-Schmidt algorithm
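As a concrete anticipation of this construction, the following sketch (the vectors and the weight matrix S are illustrative choices, not taken from the paper) carries out unnormalized Gram-Schmidt orthogonalization with respect to the inner product ⟨x, y⟩_S = xᵀSy:

```python
import numpy as np

def s_gram_schmidt(vectors, S):
    """Unnormalized Gram-Schmidt with respect to <x, y>_S = x^T S y."""
    basis = []
    for a in vectors:
        b = a.astype(float)
        for u in basis:
            # Subtract the S-projection of a onto the direction u.
            b = b - (a @ S @ u) / (u @ S @ u) * u
        if np.linalg.norm(b) > 1e-12:   # keep only the independent part
            basis.append(b)
    return basis

# Illustrative positive definite symmetric weight matrix and input vectors.
S = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.0],
              [0.0, 0.0, 3.0]])
vecs = [np.array([1.0, 0.0, 1.0]),
        np.array([1.0, 1.0, 0.0]),
        np.array([0.0, 1.0, 1.0])]

basis = s_gram_schmidt(vecs, S)
for i in range(len(basis)):
    for j in range(i):
        print(basis[i] @ S @ basis[j])  # pairwise S-inner products vanish
```

Each resulting vector b_k is the residual of a_k after removing its S-projections onto the earlier residuals, so the b_k are mutually S-orthogonal; with S equal to the unit matrix this reduces to ordinary Gram-Schmidt.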