On the Decomposition of Square Matrices

An extracurricular paper I wrote for an introductory Linear Algebra course. Matrix decompositions are invaluable tools for compression, which can have extraordinary consequences.

The Signal and the Noise

The idea that complex systems often contain dominant low-dimensional patterns is a powerful one, and extracting this underlying 'signal' from the noise is extremely valuable. Matrix decompositions provide methods to compress the information within tabular data, enabling efficient sensing, compact representations for modeling and control, and machine learning. Some, such as Marcus Hutter, believe that compression is intelligence.
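
To make the signal-and-noise picture concrete, here is a minimal sketch (not part of the paper itself) using NumPy: a matrix whose true structure is rank 2 is corrupted with noise, and a truncated singular value decomposition recovers a compact approximation. The matrix sizes, the noise level, and the rank k = 2 are illustrative assumptions.

```python
import numpy as np

# Build a noisy matrix whose "signal" is rank 2: two dominant patterns plus noise.
rng = np.random.default_rng(0)
signal = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 50))
noisy = signal + 0.05 * rng.standard_normal((100, 50))

# Full SVD, then keep only the top k singular triplets (truncation).
U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
k = 2
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The rank-2 factors store far fewer numbers than the original matrix,
# yet the reconstruction error stays small because the signal dominates.
print("storage ratio:", k * (100 + 50 + 1) / noisy.size)
print("relative error:", np.linalg.norm(noisy - approx) / np.linalg.norm(noisy))
```

Keeping only the dominant singular values is exactly the sense in which a decomposition separates a low-dimensional signal from noise.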

Abstract: This paper provides a brief overview of some of the requirements, computational procedures, benefits, and applications of the decomposition (i.e., factorization) of square matrices into products of several simpler matrices. The factorizations discussed include the LU decomposition, the symmetric eigenvalue decomposition, the Jordan decomposition, and the singular value decomposition. An important takeaway is that these decompositions share a common form, with progressively more demanding requirements and constructions.
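
As a rough illustration of the decompositions named above (a sketch using NumPy and SciPy, not a procedure from the paper), each factorization below is computed on a small matrix and checked by multiplying the factors back together; the Jordan decomposition is omitted because it is not numerically stable and has no standard NumPy/SciPy routine.

```python
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# LU decomposition with row pivoting: A = P L U.
P, L, U = lu(A)
assert np.allclose(P @ L @ U, A)

# Symmetric eigenvalue decomposition of S = A + A^T: S = Q diag(w) Q^T.
S = A + A.T
w, Q = np.linalg.eigh(S)
assert np.allclose(Q @ np.diag(w) @ Q.T, S)

# Singular value decomposition, which exists for any matrix: A = U_s diag(sigma) V^T.
U_s, sigma, Vt = np.linalg.svd(A)
assert np.allclose(U_s @ np.diag(sigma) @ Vt, A)
```

Each factorization expresses the same matrix as a product of structured pieces (triangular, orthogonal, or diagonal), which is the common form the paper traces through progressively stricter requirements.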

"In the language of Computer Science, the expression of [a matrix] A as a product amounts to a pre-processing of the data in A, organizing that data into two or more parts whose structures are more useful in some way, perhaps more accessible for computation" - David C. Lay