Independent component analysis

probability machine-learning

Independent component analysis (ICA) is a form of linear dimension reduction. Its goal is to find a linear map from the observed features to new features that are statistically independent of one another.

Strictly, if the original features are $X_1, X_2, \ldots, X_n$ and we map them to $Y_1, Y_2, \ldots, Y_m$, then we want the following to hold in terms of mutual information (the definition is recalled after this list):

  • $I(Y_i, Y_j) = 0$ for all $i \neq j$, so the new features are independent of one another, and
  • $I(Y, X)$ is maximised, so the new features retain as much information about the original features as possible.
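
For reference, the standard definition of mutual information (written here for the discrete case) is

$$I(Y_i, Y_j) = \sum_{y_i, y_j} p(y_i, y_j) \log \frac{p(y_i, y_j)}{p(y_i)\, p(y_j)},$$

which is zero exactly when $Y_i$ and $Y_j$ are independent, i.e. when $p(y_i, y_j) = p(y_i)\, p(y_j)$ everywhere.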

This can be used to solve the Cocktail party problem: several microphones record linear mixtures of simultaneous speakers, and ICA recovers estimates of the individual source signals from the mixtures alone.
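
As a minimal sketch of the cocktail-party setting, assuming NumPy and scikit-learn are available, the toy example below mixes two synthetic source signals with a made-up mixing matrix and uses `sklearn.decomposition.FastICA` (one common ICA implementation) to recover estimates of the sources; the signals and mixing matrix are purely illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Two independent source signals (the "speakers"): a sine wave and a square wave.
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * t)                  # speaker 1
s2 = np.sign(np.sin(3 * np.pi * t))         # speaker 2
S = np.column_stack([s1, s2])
S += 0.05 * rng.standard_normal(S.shape)    # small amount of sensor noise

# Mix the sources linearly (the "microphones"): X = S @ A.T for a mixing matrix A.
A = np.array([[1.0, 0.5],
              [0.7, 1.2]])
X = S @ A.T                                  # observed mixtures, shape (n_samples, 2)

# Recover estimates of the independent sources from the mixtures alone.
ica = FastICA(n_components=2, random_state=0)
Y = ica.fit_transform(X)                     # estimated sources (up to order and scale)

# Independence check: the recovered components should be (nearly) uncorrelated.
print(np.corrcoef(Y.T).round(3))
```

Note that ICA can only recover the sources up to permutation, sign, and scale, and uncorrelatedness is a weaker check than full independence, but on this toy example the recovered components should match the original sine and square waves closely.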