Is there a classic Matrix Algebra reference?

I’m looking for a classic matrix algebra reference, either introductory or advanced.

In particular, I’m looking for ways to factorize a matrix and for the implications such factorizations have for its determinant.
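To give an idea of the kind of computation I have in mind, here is a minimal sketch using NumPy/SciPy (the matrix is made up purely for illustration): an LU factorization, from which the determinant can be read off the factors.

```python
import numpy as np
from scipy.linalg import lu

# Example matrix, chosen only for illustration
A = np.array([[4.0, 3.0, 2.0],
              [6.0, 3.0, 1.0],
              [2.0, 1.0, 5.0]])

# Factorization A = P @ L @ U, with L unit lower triangular
P, L, U = lu(A)

# det(A) = det(P) * det(L) * det(U)
#        = (+/- 1) * 1 * product of U's diagonal
det_from_factors = np.linalg.det(P) * np.prod(np.diag(U))

print(det_from_factors)     # should agree with the direct computation
print(np.linalg.det(A))
```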

Your help is greatly appreciated.

Answers

F. R. Gantmacher’s The Theory of Matrices (2 volumes, 1959), AMS Chelsea Publishing (trans. K. A. Hirsch), is certainly a classic treatise. I find it useful on occasion for its discussion of Lyapunov stability and eigenvalue/root location via Routh-Hurwitz (vol. 2), but the basics are well covered in vol. 1.

It is a bit expensive to buy new, but it is worth keeping an eye out for used copies.

Golub and Van Loan’s Matrix Computations is something of a standard reference, but it is oriented more towards numerical linear algebra, with a strong emphasis on algorithmic questions, though not without extensive analysis of their theoretical foundations.

Horn and Johnson’s Matrix Analysis is also widely used as a reference. Recently I came across Harville’s Matrix Algebra From a Statistician’s Perspective, and it’s pretty useful too (especially for the vec operator and matrix derivatives).
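To give a flavour of that material, the standard identity vec(AXB) = (Bᵀ ⊗ A) vec(X), where vec stacks the columns of a matrix, is easy to check numerically. A small sketch with NumPy (the matrices are random and serve only as an illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
X = rng.standard_normal((4, 5))
B = rng.standard_normal((5, 2))

def vec(M):
    # column-stacking vec operator (Fortran / column-major order)
    return M.flatten(order="F")

lhs = vec(A @ X @ B)
rhs = np.kron(B.T, A) @ vec(X)

print(np.allclose(lhs, rhs))   # True: vec(AXB) = (B^T kron A) vec(X)
```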

I think Halmos’s Finite-Dimensional Vector Spaces is regarded as a classic by many, though it takes a more general, coordinate-free approach, i.e. what many people might call “linear algebra done right” or something along those lines.