**Motivation**

I know that in a **finite dimensional Euclidean space** $\Bbb{E}^n$, for every basis $G=\{g_1,g_2,\dots,g_n\} \subset \Bbb{E}^n$ we can define a **dual basis** $G'=\{g^1,g^2,\dots,g^n\} \subset \Bbb{E}^n$ such that

$$g_i \cdot g^j = \delta_{i}^{j} \tag{1}$$

Also, it can be proved that such a basis exists and is unique. The main advantage of dual bases is that when we write an arbitrary vector as a linear combination of the original basis $G$, we can obtain the coefficients of the combination simply by using the **orthogonality property** $(1)$ of $G$ and $G'$, as follows

$$\begin{align}

x &= \sum_{i=1}^{n}x^i g_i \\

x \cdot g^j &= \sum_{i=1}^{n} x^i g_i \cdot g^j = \sum_{i=1}^{n} x^i \delta_{i}^{j} = x^j \\

x &= \sum_{i=1}^{n}(x \cdot g^i) g_i

\end{align}

\tag{2}$$
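In coordinates, the dual basis can be computed from the inverse of the Gram matrix $M_{ij}=g_i\cdot g_j$, namely $g^j=\sum_k (M^{-1})_{jk}\,g_k$. A minimal numerical sketch of $(1)$ and $(2)$ (the basis vectors below are arbitrary illustrative choices):

```python
import numpy as np

# Columns of G are a non-orthogonal basis g_1, g_2, g_3 of R^3 -- arbitrary example.
G = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

M = G.T @ G                 # Gram matrix M_ij = g_i . g_j
D = G @ np.linalg.inv(M)    # columns of D are the dual basis g^1, g^2, g^3

print(G.T @ D)              # identity up to rounding: g_i . g^j = delta_i^j

x = np.array([2.0, -1.0, 3.0])
coeffs = D.T @ x            # x^i = x . g^i, as in (2)
print(G @ coeffs)           # reconstructs x
```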

Now, let us move to the **infinite dimensional space** of infinitely differentiable functions $f(x)$ on the interval $[-a,a]$. It is well known that the eigenfunctions of the **Sturm-Liouville** operator form an **orthonormal** basis for such functions, given proper boundary conditions at the end points. However, this is a particularly nice operator: it is **self-adjoint**, which is what produces orthogonal eigenfunctions. In some **Partial Differential Equations (PDEs)** we encounter **non-self-adjoint** operators, and we must expand our solution and boundary data in terms of their eigenfunctions, which unfortunately are **not** orthogonal anymore!
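For contrast, recall the two-line argument showing why self-adjointness forces orthogonality of eigenfunctions with distinct (real) eigenvalues $\lambda_i \neq \lambda_j$:

$$\lambda_i \langle u_i, u_j \rangle = \langle L u_i, u_j \rangle = \langle u_i, L u_j \rangle = \lambda_j \langle u_i, u_j \rangle \;\Longrightarrow\; (\lambda_i - \lambda_j)\langle u_i, u_j \rangle = 0 \;\Longrightarrow\; \langle u_i, u_j \rangle = 0.$$

The first step fails for a non-self-adjoint operator, and with it the orthogonality.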

Just to give an example, consider the following biharmonic boundary value problem (BVP)

$$\begin{array}{lll}

\Delta^2 \Phi=0 & -a \le x \le a & -b \le y \le b \\

\Phi(a,y)=0 & \Phi_x(a,y)=0 & \\

\Phi(-a,y)=0 & \Phi_x(-a,y)=0 & \\

\Phi(x,b)=f(x) & \Phi_y(x,b)=0 & \\

\Phi(x,-b)=f(x) & \Phi_y(x,-b)=0 & \\

\end{array} \tag{3}$$

where we have the symmetry $f(-x)=f(x)$. Also, for the sake of continuity of boundary conditions at the corners we require that

$$f(a)=f(-a)=f'(a)=f'(-a)=0 \tag{4}$$

Then solving this BVP leads to the following eigen-value problem

$$ \left( \frac{d^4}{dx^4}+2\omega^2\frac{d^2}{dx^2}+\omega^4 \right)X(x)=0 \\

X(a)=X(-a)=X'(a)=X'(-a)=0 \tag{5}$$

which has a **non-self-adjoint** operator. Its eigenfunctions are known as the **Papkovich-Fadle** eigenfunctions. They form a basis for the infinite dimensional space of infinitely differentiable functions $f(x)$ satisfying $(4)$ on the interval $[-a,a]$. As mentioned before, these eigenfunctions are not orthogonal, and this makes finding the coefficients $c_i$ of the expansion

$$f(x)= \sum_{i=1}^{\infty} c_i X(\omega_i;x) \tag{6}$$

really **difficult**, leading to an infinite system of linear algebraic equations for the unknown coefficients $c_i$!
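In practice the infinite system is truncated and solved numerically: with $n$ terms, the coefficients satisfy $\sum_i M_{ji} c_i = b_j$ where $M_{ji}=\int \phi_j\phi_i\,dx$ and $b_j=\int f\phi_j\,dx$. A sketch with a hypothetical non-orthogonal polynomial basis that, like the Papkovich-Fadle eigenfunctions, satisfies the clamped conditions $(4)$ (the basis and target below are illustrative assumptions, not the actual eigenfunctions):

```python
import numpy as np

# Hypothetical basis phi_i(x) = x^i (1 - x^2)^2 on [-1, 1]; each phi_i and
# its derivative vanish at x = +-1, mimicking the boundary conditions (4).
n = 6
xs = np.linspace(-1.0, 1.0, 2001)
dx = xs[1] - xs[0]
Phi = np.array([xs**i * (1 - xs**2)**2 for i in range(n)])

f = (1 - xs**2)**3          # a target f with f(+-1) = f'(+-1) = 0

M = (Phi @ Phi.T) * dx      # Gram matrix M_ji = integral of phi_j phi_i (Riemann sum)
b = (Phi @ f) * dx          # moments b_j = integral of f phi_j
c = np.linalg.solve(M, b)   # truncated expansion coefficients

residual = np.max(np.abs(c @ Phi - f))
print(residual)             # small: f lies in the span of the phi_i
```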

**Questions**

$1.$ Is there a **dual basis** analogue for the basis of $X(\omega_i;x)$'s that can make the computation of the $c_i$'s easier? To be more specific, is there some basis $Y(\omega_j;x)$ such that

$$\int_{-a}^{a} X(\omega_i;x) Y(\omega_j;x) dx =\delta_{ij} \tag{7}$$

which would play a role similar to that of $g^j$. If such a basis existed, then we could compute the $c_i$'s easily by using $(6)$ and the orthogonality in $(7)$

$$\int_{-a}^{a} f(x) Y(\omega_j;x) dx = \sum_{i=1}^{\infty} c_i \int_{-a}^{a} X(\omega_i;x) Y(\omega_j;x) dx = \sum_{i=1}^{\infty} c_i \delta_{ij} = c_j \tag{8}$$
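For any *finite* truncation, such a biorthogonal family always exists and is given explicitly by the inverse Gram matrix, $Y_j = \sum_k (M^{-1})_{jk}\,\phi_k$; whether this passes to the infinite-dimensional limit is the delicate part of the question. A numerical sketch of $(7)$ and $(8)$, again with a hypothetical polynomial stand-in for the eigenfunctions:

```python
import numpy as np

# Hypothetical non-orthogonal family phi_i(x) = x^i (1 - x^2)^2 on [-1, 1].
n = 5
xs = np.linspace(-1.0, 1.0, 2001)
dx = xs[1] - xs[0]
Phi = np.array([xs**i * (1 - xs**2)**2 for i in range(n)])

M = (Phi @ Phi.T) * dx           # Gram matrix: integral of phi_i phi_j
Y = np.linalg.inv(M) @ Phi       # dual family: row Y_j = sum_k (M^-1)_jk phi_k

biorth = (Phi @ Y.T) * dx        # integral of phi_i Y_j: identity, as in (7)
f = 2 * Phi[0] - 3 * Phi[3]      # f with known coefficients (2, 0, 0, -3, 0)
c = (Y @ f) * dx                 # c_j = integral of f Y_j, as in (8)
print(c)                         # recovers the coefficients
```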

$2.$ If the answer to question $1$ is **YES**, how can it be computed? If **NO**, why not?


Not a concrete answer, but some generic pointers to your problem.

**General Remarks on the dual space of $L^2$**

It seems reasonable to consider your operator as an operator in $\mathcal H= L^2([-a,a]\times[-b,b])$, the Hilbert space of all measurable functions $f$ with $$\langle f,f\rangle=\int_{[-a,a]\times [-b,b]}f(x,y)\overline{f(x,y)}\,dx\, dy < \infty.$$

This Hilbert space has a nice relation to its dual, namely you can identify it with itself:

For $f \in \mathcal H$, we can define an element of the dual via

$$f'(g):= \langle g,f \rangle =\int_{[-a,a]\times [-b,b]}g(x,y)\overline{f(x,y)}dx dy,$$

defined on all $g\in \mathcal H.$

Vice versa, for every $\varphi \in \mathcal H'$ there is an $f \in \mathcal H$ such that $\varphi(g)= \langle g,f \rangle$ for all $g \in \mathcal H$ (cf., e.g., Conway, *A Course in Functional Analysis*, 1990, Theorem 3.4, the Riesz Representation Theorem).

Note that neither $C^0$ nor the space of piecewise continuous functions equipped with this inner product is a Hilbert space. They are subsets of $L^2$, and thus their duals are even larger than the dual of $L^2$. (Here, I'm assuming a bounded domain $[-a,a]\times [-b,b]$ and only finitely many jump points for the piecewise functions.)

**Remarks on your operator**

Your differential operator (let's call it $A$) can be considered as an operator in $\mathcal H$. In this case, you need to specify the domain of $A$ (as a subset of $\mathcal H$) such that the functions $f$ are in $\mathcal H$ and $Af$ still lies in $\mathcal H$. The vector space of piecewise continuous functions is a subspace of $\mathcal H$, but for such functions the first derivative need not exist. (I'm no PDE expert, but I'd expect some subset of $C^1$ with additional constraints so that the second derivative also exists.)

In the case of Sturm-Liouville differential expressions (like $-(pf’)’+qf$), one usually takes functions $f$ which are absolutely continuous (i.e. the first derivative is locally summable), where $pf’$ is absolutely continuous and $-(pf’)’+qf$ lies in $L^2([a,b])$.

The boundary conditions you listed are an additional important restriction of your domain, but it’s important that you state which kind of solutions you’re allowing for your PDE.

**Disclaimer**

This just presents a general framework for treating differential operators in Hilbert spaces. I don't know whether this framework will help you solve your PDE problem.
