I have matrices in my syllabus, but I don’t know where they are actually used. I even asked my teacher, but she had no answer either. Can anyone please tell me where they are used?
And could you also give me an example of how they are used?
I work in the field of applied math, so I will give you the point of view of an applied mathematician.
I do numerical PDEs. Basically, I take a differential equation (an equation whose solution is not a number but a function, and which involves the function and its derivatives) and, instead of finding an analytical solution, I try to find an approximation of the value of the solution at some points (think of a grid of points). It’s a bit deeper than that, but it’s not the point here. The point is that eventually I find myself having to solve a linear system of equations which is usually of huge size (on the order of millions of unknowns). That is a pretty huge number of equations to solve, I would say.
Where do matrices come into play? Well, as you know (or maybe not, I don’t know) a linear system can be seen in matrix-vector form as
$$\text{A}\underline{x}=\underline{b}$$
where $\underline{x}$ contains the unknowns, A contains the coefficients of the equations, and $\underline{b}$ contains the values of the right-hand sides of the equations.
For instance for the system
$$\begin{cases}2x_1+x_2=3\\4x_1-x_2=1\end{cases}$$
we have
$$\text{A}=\left[
\begin{array}{cc}
2 & 1\\
4 & -1
\end{array}
\right],\qquad \underline{x}=
\left[\begin{array}{c}
x_1\\
x_2
\end{array}
\right]\qquad \underline{b}=
\left[\begin{array}{c}
3\\
1
\end{array}
\right]$$
From what I’ve said so far, in this context matrices look like just a fancy, compact way to write down a system of equations: mere tables of numbers.
However, in order to solve this system fast, it is not enough to use a computer with a lot of RAM and/or a high CPU clock rate. Of course, the more powerful the computer is, the faster you will get the solution. But sometimes “faster” might still mean days (or more) if you tackle the problem in the wrong way, even if you are on a Blue Gene.
So, to reduce the computational cost, you have to come up with a good algorithm, a smart idea. But in order to do so, you need to exploit some property or some structure of your linear system. These properties are somehow encoded in the coefficients of the matrix A. Therefore, studying matrices and their properties is of crucial importance for improving the efficiency of linear solvers. Recognizing that the matrix enjoys a particular property might be crucial for developing a fast algorithm, or even for proving that a solution exists, or that the solution has some nice property.
For instance, consider the linear system
$$\left[\begin{array}{cccc}
2 & -1 & 0 & 0\\
-1 & 2 & -1 & 0\\
0 & -1 & 2 & -1\\
0 & 0 & -1 & 2
\end{array}
\right]
\left[
\begin{array}{c}
x_1\\
x_2\\
x_3\\
x_4
\end{array}
\right]=
\left[
\begin{array}{c}
1\\
1\\
1\\
1
\end{array}
\right]$$
which corresponds (in equation form) to
$$\begin{cases}
2x_1-x_2=1\\
-x_1+2x_2-x_3=1\\
-x_2+2x_3-x_4=1\\
-x_3+2x_4=1
\end{cases}$$
Just giving the matrix a quick look, I can claim that this system has a solution and, moreover, that the solution is non-negative (meaning that all the components of the solution are non-negative). I’m pretty sure you wouldn’t be able to draw this conclusion just by looking at the system without trying to solve it. I can also claim that solving this system takes only 25 operations (one operation being a single addition, subtraction, multiplication, or division). If you construct a larger system with the same pattern (2 on the main diagonal, -1 on the upper and lower diagonals) and put a right-hand side with only positive entries, I can still claim that the solution exists, that it is positive, and that the number of operations needed to solve it is only $8n-7$, where $n$ is the size of the system.
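The $8n-7$ operation count above is what you get from a specialized tridiagonal solver; the answer doesn’t name one, but the Thomas algorithm is a standard choice and matches that count. A minimal Python sketch, applied to the $4\times4$ system above:

```python
def thomas(sub, diag, sup, rhs):
    """Solve a tridiagonal system in O(n) (assumes n >= 2, no pivoting needed).

    sub:  sub-diagonal entries  (length n-1)
    diag: main-diagonal entries (length n)
    sup:  super-diagonal entries (length n-1)
    rhs:  right-hand side       (length n)
    """
    n = len(diag)
    cp = [0.0] * (n - 1)   # modified super-diagonal
    dp = [0.0] * n         # modified right-hand side
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    # Forward sweep: eliminate the sub-diagonal.
    for i in range(1, n):
        denom = diag[i] - sub[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = sup[i] / denom
        dp[i] = (rhs[i] - sub[i - 1] * dp[i - 1]) / denom
    # Back substitution.
    x = dp[:]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# The system from the answer: 2 on the diagonal, -1 above and below, rhs of ones.
x = thomas([-1.0] * 3, [2.0] * 4, [-1.0] * 3, [1.0] * 4)
# x is approximately [2, 3, 3, 2] -- all positive, as claimed.
```

Note how the solver never stores the zero entries at all: exploiting the structure is exactly what turns an $O(n^3)$ general solve into an $O(n)$ one.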
Moreover, other people have already pointed out fields where matrices are important building blocks and play an important role. I hope this thread gave you an idea of why it is worth studying matrices. =)
Matrices are a useful way to represent, manipulate and study linear maps between finite-dimensional vector spaces (once you have chosen bases).
Matrices can also represent quadratic forms (useful, for example, in analysis to study Hessian matrices, which help us study the behavior of critical points).
So, it’s a useful tool of linear algebra.
Moreover, linear algebra is a crucial tool in math.
To convince yourself, note that there are a lot of linear problems you can study with little background in math. For example: systems of linear equations, some error-correcting codes (linear codes), linear differential equations, linear recurrence sequences…
I also think that linear algebra is the natural framework for quantum mechanics.
Graph Theory (loosely, the study of connect-the-dots figures) uses matrices to encode adjacency and incidence structures. More than simple bookkeeping, however, the matrices have computational uses. From powers of the adjacency matrix, for a simple example, one can read off the number of walks between any two dots.
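A minimal sketch of that last point, using the 4-cycle (a square with vertices 0–1–2–3–0) as an assumed example graph: entry $(i,j)$ of $A^k$ counts the walks of length $k$ from vertex $i$ to vertex $j$.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Adjacency matrix of the 4-cycle 0-1-2-3-0.
A = [[0, 1, 0, 1],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [1, 0, 1, 0]]

A2 = matmul(A, A)
# There are two walks of length 2 from vertex 0 to vertex 2
# (via vertex 1 or via vertex 3):
print(A2[0][2])  # 2
```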
“Spectral” Graph Theory derives graph-theoretical information from matrix-theoretical results (specifically, “eigenvalues” and “eigenvectors” –by the way, the set of eigenvalues is the “spectrum” of a matrix, hence “spectral”– which come from the linear map interpretation of matrices). My own work generates coordinates for “symmetric” geometric realizations of graphs –think Platonic and Archimedean solids– from this kind of analysis of their adjacency matrices.
Matrices are a useful tool for studying finite groups. Every finite group has a representation as a set of invertible matrices; the study of such representations is called, well, Representation Theory.
One of the major theorems of all time in finite group theory is the classification of all finite simple groups. These are the building blocks of group theory, the group-theoretic version of prime numbers. The “proof” took scores of mathematicians many decades, and could not have been completed without viewing these groups as groups of matrices. One just has to open the ATLAS of Finite Groups, or wonder what a group of Lie type is, to get my point!
(Of course, linear algebra is exceptionally useful etc. etc. but that is a topic better covered by an engineer…)
Matrices are used very often in 3D geometry (e.g. computer graphics) and are very powerful. A single 4×4 matrix can represent many transformations at once (translation, rotation, scaling, perspective/orthogonal projection). You can then multiply a 3D position vector (x, y, z, 1) by this matrix to obtain a new position with all the transformations applied. Notice that this vector is also a 1×4 matrix (although the position is in 3D, the fourth component is added to make the multiplication possible and to allow for the projection transformation; if you want to know more, read about homogeneous coordinates). Similar ideas can be used in 2D or even in higher dimensions like 4D.
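A minimal sketch of the idea, combining a rotation and a translation in one 4×4 homogeneous matrix (the particular angle and offsets are just illustrative choices):

```python
import math

def matvec(M, v):
    """Multiply a 4x4 matrix by a 4-component column vector."""
    return [sum(M[i][j] * v[j] for j in range(4)) for i in range(4)]

theta = math.pi / 2  # rotate 90 degrees about the z-axis...
T = [[math.cos(theta), -math.sin(theta), 0, 2],   # ...and translate
     [math.sin(theta),  math.cos(theta), 0, 3],   # by (2, 3, 0),
     [0, 0, 1, 0],                                # all in one matrix.
     [0, 0, 0, 1]]

p = [1, 0, 0, 1]          # homogeneous coordinates of the point (1, 0, 0)
x, y, z, w = matvec(T, p)
# Rotation sends (1,0,0) to (0,1,0); the translation then gives (2, 4, 0).
```

Because both operations live in one matrix, a whole chain of transformations collapses into a single matrix-vector multiply per point, which is why GPUs are built around exactly this operation.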
I never fully got matrices until I left university. I am glad I understand now.
An example:
A good-quality camera will save the captured image uncorrected, along with a 3×3 colour-correction matrix. Your computer will multiply this by the colour-correction matrix of your display, and then multiply every pixel in the image by the result before putting it on your display. The computer will use a different matrix for the printer (as it is a different display device).
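A minimal sketch of one step of that pipeline; the matrix entries here are made-up illustrative numbers, not a real camera profile:

```python
def apply_matrix(M, rgb):
    """Multiply a 3x3 colour matrix by one RGB pixel (components in [0, 1])."""
    return [sum(M[i][j] * rgb[j] for j in range(3)) for i in range(3)]

# Hypothetical camera colour-correction matrix (illustrative values only).
camera_matrix = [[ 1.20, -0.10, -0.10],
                 [-0.05,  1.10, -0.05],
                 [ 0.00, -0.20,  1.20]]

raw_pixel = [0.5, 0.4, 0.3]                 # raw sensor RGB
corrected = apply_matrix(camera_matrix, raw_pixel)
# corrected is approximately [0.53, 0.40, 0.28]
```

In a real pipeline this multiply is repeated for every pixel, which is why it pays to pre-multiply the camera and display matrices into a single 3×3 first.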
Look at several real-world examples. Experiment with colour or 2D/3D transformations; they are fun and visual (if you are a visual person). 2D is the easiest and most visual.
Matrices can represent Markov chains. They also provide a means for the tabular representation of data. Their utility in mathematics and computing is huge.
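A minimal sketch of the Markov-chain point, using an assumed two-state weather model: the transition matrix is just a table of probabilities, and evolving the chain is matrix-vector multiplication.

```python
def step(dist, P):
    """One step of a Markov chain: multiply the distribution row vector by P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Row i holds the probabilities of moving from state i to each state.
P = [[0.9, 0.1],   # sunny -> sunny, sunny -> rainy
     [0.5, 0.5]]   # rainy -> sunny, rainy -> rainy

dist = [1.0, 0.0]  # start out sunny for certain
for _ in range(3):
    dist = step(dist, P)
# After three steps, dist is approximately [0.844, 0.156].
```

Iterating further drives `dist` toward the chain’s stationary distribution, which is an eigenvector of the transition matrix: another place where matrix theory pays off.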
In the most general sense, matrices (and a very important special case of matrices, vectors) provide a way to generalize from single variable equations to equations with arbitrarily many variables. Some of the rules change along the way, hence the importance of learning about matrices – more precisely, learning Linear Algebra, or the algebra of matrices.
I know and use matrices for two things: systems of equations and holding data in programming.
As @bartgol said, matrices in math are useful for solving systems of equations. You arrange all the equations in standard form and make a matrix of their coefficients, making sure to use 0s as placeholders (for instance, when an equation has no x term). Call this matrix A. Then make a second matrix of the constants and call it B. It will be one column wide and as long as the number of equations.
Plug these into your calculator and evaluate $A^{-1}B$. The resulting matrix, if a solution exists, gives the value of each variable: the first row is x, the second is y, and so on.
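The same $A^{-1}B$ recipe can be sketched in a few lines of Python, reusing the 2×2 system from earlier in the thread ($2x_1+x_2=3$, $4x_1-x_2=1$) as the example:

```python
A = [[2.0, 1.0],
     [4.0, -1.0]]
B = [3.0, 1.0]

# Explicit inverse of a 2x2 matrix (only valid when det is non-zero).
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
inv = [[ A[1][1] / det, -A[0][1] / det],
       [-A[1][0] / det,  A[0][0] / det]]

# x = A^-1 * B
x = [inv[0][0] * B[0] + inv[0][1] * B[1],
     inv[1][0] * B[0] + inv[1][1] * B[1]]
# x is approximately [2/3, 5/3]
```

For larger systems you would normally use Gaussian elimination (or a library solver) rather than forming the inverse explicitly, which is slower and less numerically stable.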
I’ve also used two-dimensional arrays in programming (C++, Java) to help store information that just makes sense to be in matrix-form. For example: one program generated magic squares which were stored in a matrix. Another used a 3-by-3 to keep track of spaces on a tic-tac-toe board.
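A minimal Python analogue of the magic-square idea (the answer’s programs were in C++/Java; this is just a sketch of the same storage pattern): the square sits in a 2D list, and the matrix layout makes the row/column/diagonal checks one-liners.

```python
# The classic 3x3 magic square, stored as a list of rows.
square = [[2, 7, 6],
          [9, 5, 1],
          [4, 3, 8]]

n = len(square)
target = sum(square[0])  # every line should sum to this (15 here)

rows_ok = all(sum(row) == target for row in square)
cols_ok = all(sum(square[i][j] for i in range(n)) == target for j in range(n))
diag_ok = (sum(square[i][i] for i in range(n)) == target and
           sum(square[i][n - 1 - i] for i in range(n)) == target)
print(rows_ok and cols_ok and diag_ok)  # True
```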
Matrices are used in engineering, physics, computer science, and other applications of mathematics.
We are forever indebted to the mathematicians that made matrix algebra possible…
Real-world applications of matrices make them extremely important, and include some of the following that I’ve had experience with (there are so many more!):
Robotics / Kinematics – matrices allow rotations and translations through planes to be calculated easily.
Betting – matrices allow complex dutching/betting combinations to be handled without separate formulae such as multiple complex simultaneous equations.
Data Mining – most data-mining software uses matrices to implement its algorithms, as matrix algebra is fundamental to this field, both in the theory and in the handling of data.
Graphics / Gaming – anything from particle collisions to ray tracing uses matrices.
All the posts I’ve read so far describe valid uses of matrices, and there are so many more that I couldn’t even comprehend them all…