
Any linear map between two finite-dimensional vector spaces can be represented as a matrix under the bases of the two spaces.

But if one or both of the vector spaces is infinite-dimensional, can the linear map still be represented as a matrix with respect to their bases?

If matrices of infinite dimension exist, what are they used for, if not as representations of linear maps between vector spaces?


Thanks and regards!


Sure, if $T:V\to W$ is a linear transformation between vector spaces $V$ and $W$ with bases $B$ and $C$, respectively, then $T$ can be described in terms of the coordinates with respect to these bases, thus yielding a “matrix”. How closely this relates to the usual notion of matrix depends on the nature of $B$ and $C$. In the usual notion, you take bases that are not only finite, but ordered, so that it makes sense to talk about the 1st row, etc., of the matrix; that is, you make all bases indexed by sets of the form $\{1,2,\ldots,n\}$. The closest to this in the infinite dimensional setting would be to have bases indexed by the positive integers.

More generally, suppose $B=\{v_j\}_{j\in J}$ and $C=\{w_i\}_{i\in I}$, where $I$ and $J$ are sets. Then the matrix of $T$ can be described as a function $M:I\times J\to F$, where $F$ is the base field, by taking $M(i,j)$ to be the coefficient of $w_i$ in the $C$-expansion of $Tv_j$. Such matrices are *column-finite*, in the sense that for each $j\in J$, the set of $i\in I$ such that $M(i,j)\neq0$ is finite. Conversely, each column-finite matrix, in this sense, corresponds uniquely to a linear transformation from $V$ to $W$. Coordinate-wise addition of such matrices corresponds to addition of the linear transformations.
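A concrete example of a column-finite matrix, not taken from the answer above but a sketch of my own: the derivative map $D$ on the polynomial space $F[x]$ with basis $B=\{1,x,x^2,\ldots\}$. Each column $j$ records the coefficients of $D(x^j)=jx^{j-1}$, so it has at most one nonzero entry.

```python
from fractions import Fraction

# Illustrative sketch: the derivative D : F[x] -> F[x] with respect to the
# basis {1, x, x^2, ...}, indexed here by nonnegative integers. The matrix
# M(i, j) is column-finite: column j holds the coefficients of D(x^j).

def matrix_entry(i, j):
    """M(i, j) = coefficient of x^i in D(x^j) = j * x^(j-1)."""
    return Fraction(j) if i == j - 1 else Fraction(0)

def apply_D(coeffs):
    """Apply D to a polynomial given sparsely as {j: coefficient of x^j}."""
    result = {}
    for j, c in coeffs.items():
        for i in range(j):  # only finitely many rows per column, so the sum is finite
            entry = matrix_entry(i, j)
            if entry:
                result[i] = result.get(i, Fraction(0)) + entry * c
    return result

# D(x^3 + 2x) = 3x^2 + 2
print(apply_D({3: Fraction(1), 1: Fraction(2)}))
```

Here the basis is countable, but the same sparse representation works for any index sets $I$ and $J$, since each column touches only finitely many rows.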

You can also extend multiplication. Suppose that $S:W\to X$ is a linear transformation and that $X$ has basis $D=\{x_k\}_{k\in K}$. Let $N:K\times I\to F$ denote the $C$-$D$ matrix of $S$. Then $ST:V\to X$ has $B$-$D$ matrix $NM:K\times J\to F$ defined by

$$(NM)(k,j)=\sum_{i\in I}N(k,i)M(i,j).$$ In particular, note that this sum is always finite because $M$ is column-finite.
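The multiplication formula can be sketched in code (the storage scheme and names are my own, not from the answer): store a column-finite matrix sparsely as a map from each column $j$ to its finitely many nonzero rows, so the sum $\sum_{i\in I}N(k,i)M(i,j)$ is automatically finite.

```python
# A column-finite matrix is stored as {j: {i: value}}: column j maps to its
# finitely many nonzero rows. The product (NM)(k, j) = sum_i N(k, i) M(i, j)
# runs over the finite support of column j of M, so it is always defined.

def multiply(N, M):
    """Return the column-finite product NM of column-finite matrices N and M."""
    product = {}
    for j, col_M in M.items():
        col_out = {}
        for i, m_ij in col_M.items():            # finitely many i per column j
            for k, n_ki in N.get(i, {}).items():
                col_out[k] = col_out.get(k, 0) + n_ki * m_ij
        # drop zero entries so columns stay finitely supported
        col_out = {k: v for k, v in col_out.items() if v != 0}
        if col_out:
            product[j] = col_out
    return product

# Shift-like example: M sends v_j to w_{j+1}, N sends w_i to x_{i-1};
# the composite NM acts as the identity on the v's.
M = {j: {j + 1: 1} for j in range(5)}
N = {i: {i - 1: 1} for i in range(1, 6)}
print(multiply(N, M))
```

Note that the product of two column-finite matrices is again column-finite, matching the fact that the composite $ST$ is again a linear transformation.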

Motivated by Calle’s answer, I decided to add a little on a different kind of matrix for continuous linear transformations on Banach spaces with Schauder bases.

If $X$ is an infinite dimensional separable Banach space, then a sequence $(e_n)_{n=1}^\infty$ in $X$ is called a Schauder basis for $X$ if every $x\in X$ has a unique representation $x=\sum_{n=1}^\infty a_ne_n$, the $a_n$ being scalar and the sum being norm convergent. If $X$ and $Y$ are Banach spaces with Schauder bases $(e_n)$ and $(f_n)$ respectively, and if $T:X\to Y$ is a bounded linear operator, then $T$ can be described by a matrix $(a_{ij})_{i,j=1}^\infty$, with $a_{ij}$ being the coefficient of $f_i$ in the $(f_n)$ expansion of $Te_j$. The map from bounded operators to matrices is one-to-one and preserves algebraic structure, but there is typically no nice description of which matrices correspond to bounded operators.

For example, in a separable Hilbert space any orthonormal basis is a Schauder basis. For maps between Hilbert spaces the coefficients are found as $a_{ij}=\langle Te_j,f_i\rangle$. In $c_0$, the space of sequences converging to $0$ with sup norm, and in $\ell^p$, the sequence space with norm $\|(x_n)_{n=1}^\infty\|_p=(\sum_{n=1}^\infty|x_n|^p)^{1/p}$, the sequence $(e_n)$ such that the $n^\text{th}$ component of $e_n$ is $1$ and all other components are $0$ forms a Schauder basis.
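To illustrate the formula $a_{ij}=\langle Te_j,f_i\rangle$, here is a small sketch of my own (the operator and helper names are assumptions, not from the answer): computing a finite truncation of the matrix of the left shift $S(x_1,x_2,x_3,\ldots)=(x_2,x_3,\ldots)$ on $\ell^2$ in the standard basis. The code uses 0-based indices where the text uses 1-based ones.

```python
# Truncated matrix of a bounded operator on l^2 in the standard basis:
# a_ij = <T e_j, e_i>, computed here for the left shift
# S(x_1, x_2, x_3, ...) = (x_2, x_3, ...). Indices below are 0-based.

def shift(x):
    """Left shift acting on a finitely supported sequence (a list)."""
    return x[1:]

def entry(T, i, j, n):
    """a_ij = <T e_j, e_i>, using length-n truncations of the basis vectors."""
    e_j = [0] * n
    e_j[j] = 1
    Te_j = T(e_j)
    return Te_j[i] if i < len(Te_j) else 0

n = 4
matrix = [[entry(shift, i, j, n) for j in range(n)] for i in range(n)]
for row in matrix:
    print(row)
# The 1's land on the superdiagonal, since <S e_j, e_i> = delta_{i, j-1}.
```

Since $\langle Se_j,e_i\rangle=\delta_{i,j-1}$, the resulting matrix has $1$'s on the superdiagonal and $0$ elsewhere.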

If $c$ is the space of convergent sequences with sup norm, then $(e_n)_{n=1}^\infty$ is no longer a Schauder basis; in particular, $\sum_{n=1}^\infty x_n e_n$ is not norm convergent unless $\lim_{n\to\infty}x_n=0$. A Schauder basis for $c$ can be obtained by adding $e_0=(1,1,1,\ldots)$. If $(x_n)\in c$ and $x=\lim_n x_n$, then $(x_n)=xe_0 +\sum_{n=1}^\infty(x_n-x)e_n$ is the basis representation. As in Calle’s answer, suppose that $T:c\to c$ is defined by $T(x_1,x_2,x_3,\ldots)=(x,0,0,\ldots)$. Then $T$ has a matrix representation with respect to $(e_0,e_1,\ldots)$ (but not with respect to $(e_1,e_2,\ldots)$), namely $a_{10}=1$ and all other components are $0$.
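The basis representation in $c$ can be computed directly; this is a sketch of my own (the modeling of a convergent sequence as a finite prefix plus its eventual limit is an assumption for illustration, not part of the answer).

```python
# Coefficients of (x_n) in the Schauder basis (e_0, e_1, e_2, ...) of c,
# where e_0 = (1, 1, 1, ...): the representation is
# (x_n) = x * e_0 + sum_n (x_n - x) * e_n  with  x = lim x_n.
# A convergent sequence is modeled as a finite prefix plus its limit value.

def c_coefficients(prefix, limit):
    """Return (coefficient of e_0, [coefficients of e_1, e_2, ...])."""
    return limit, [x_n - limit for x_n in prefix]

a0, tail = c_coefficients([3, 2.5, 2.25, 2.125], limit=2)
print(a0)    # coefficient of e_0 is the limit
print(tail)  # tail coefficients x_n - x tend to 0, so the sum converges in norm
```

The tail coefficients $x_n-x$ tend to $0$, which is exactly why the sum $\sum_n(x_n-x)e_n$ converges in norm while $\sum_n x_ne_n$ need not.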

Similar to Olod’s warning, such matrices typically play only a marginal role, even in cases where they are guaranteed to exist, like on Hilbert space. Not every separable Banach space has a Schauder basis: Enflo gave the first example of a separable Banach space without the approximation property, and since every space with a Schauder basis has the approximation property, his space has no Schauder basis.

What must be understood, however, is that matrices play a (very) marginal role when one works with linear operators on infinite-dimensional vector spaces. Familiar techniques such as the use of determinants, traces, etc. no longer work. For instance, any determinant-like function on $\mathrm{GL}(V)$, where $V$ is an infinite-dimensional vector space over a field, is necessarily the trivial one, since every element of $\mathrm{GL}(V)$ is a product of commutators (in other words, the group $\mathrm{GL}(V)$ is perfect; a 1958 result of Alex Rosenberg).
