Information theory with applications

1. Erasures and additive noise. Let $X$ be a vector-valued random variable in $\mathbb{R}^k$ with mean zero and covariance matrix $C_X = I$. Assume that $X$ is encoded by an isometry $V : \mathbb{R}^k \to \mathbb{R}^n$, $(VX)_j = \langle X, f_j \rangle$, with an associated $(n, k)$-frame $\{f_j\}_{j=1}^n$. Now assume that zero-mean noise $N : \Omega \to \mathbb{R}^n$ with covariance $C_N = \sigma^2 I$, $\sigma > 0$, is added to the encoded vector, and that a one-erasure $E_l$, $l \in \{1, 2, \ldots, n\}$, occurs each time a vector is sent, with equal probability for each $l$. The input vector, the noise, and the erasures are independent of each other. Find an expression for the mean-square error.
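A minimal Monte Carlo sketch of this setup (not part of the original exercise) can be used to check a derived expression numerically. It assumes that the erasure $E_l$ zeroes the $l$-th coordinate, that reconstruction is done with the pseudo-inverse of $V$ (for a Parseval frame this equals $V^{\mathsf{T}}$), and that the specific values of $k$, $n$, $\sigma$ and the random Parseval frame are illustrative choices only.

import numpy as np

rng = np.random.default_rng(0)

k, n = 2, 4          # assumed illustrative dimensions
sigma = 0.1          # assumed illustrative noise level
num_trials = 100_000

# A Parseval (n, k)-frame: the columns of V are orthonormal in R^n, so V is an isometry.
V, _ = np.linalg.qr(rng.standard_normal((n, k)))   # V : R^k -> R^n with V^T V = I_k
V_dual = np.linalg.pinv(V)                         # assumed reconstruction operator (equals V^T here)

errors = np.empty(num_trials)
for t in range(num_trials):
    x = rng.standard_normal(k)                     # X with C_X = I
    noise = sigma * rng.standard_normal(n)         # N with C_N = sigma^2 I
    y = V @ x + noise                              # encoded vector plus additive noise

    l = rng.integers(n)                            # erased coordinate, uniform on {1, ..., n}
    y[l] = 0.0                                     # one-erasure E_l: zero out coordinate l

    x_hat = V_dual @ y                             # assumed reconstruction via pseudo-inverse
    errors[t] = np.sum((x - x_hat) ** 2)

print("estimated mean-square error:", errors.mean())

The printed estimate can then be compared against the closed-form expression asked for in the exercise; agreement to within Monte Carlo fluctuation is a useful sanity check on the derivation.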