Let $f\colon [a,b] \to \mathbb R$ be of bounded variation. Must it be the case that $\left|\int_a^b f'(x)\,dx\right| \leq TV(f)$, where $TV(f)$ is the total variation of $f$ over $[a,b]$? If so, how can one prove this?
In the standard proof of the monotone differentiation theorem, it is shown that this holds for increasing functions: if $f$ is increasing, then $\int_a^b f'(x)\,dx \leq f(b) - f(a) = TV(f)$. I am trying to generalize this to functions of bounded variation.
The answer to the question posed here as to whether, for a function $f$ of bounded variation on an interval, $$\left|\int_a^b f'(x)\,dx\right| \leq \int_a^b |f'(x)|\,dx \leq V(f,[a,b])$$ is of course yes, and a proof can be found in numerous textbooks; I don't think I need to list them. Rather more interesting is the further generalization. If $f$ is not absolutely continuous, then the second inequality, the one involving the variation, is strict. But what explains the difference in the values?
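For completeness, here is one way the textbook proofs typically reduce the bounded-variation case to the monotone case, sketched via the Jordan decomposition (the notation $P$, $N$ for the positive and negative variation functions is mine):

```latex
% Jordan decomposition: for $x \in [a,b]$ let
%   P(x) = positive variation of f on [a,x],
%   N(x) = negative variation of f on [a,x],
% so that f(x) = f(a) + P(x) - N(x) and P(b) + N(b) = V(f,[a,b]).
% P and N are increasing, hence differentiable a.e., with
%   f'(x) = P'(x) - N'(x) and thus |f'(x)| <= P'(x) + N'(x) a.e.
% Applying the monotone estimate \int_a^b g' <= g(b) - g(a) to P and N
% (note P(a) = N(a) = 0):
\begin{align*}
\int_a^b |f'(x)|\,dx
  &\le \int_a^b P'(x)\,dx + \int_a^b N'(x)\,dx \\
  &\le P(b) + N(b) \\
  &= V(f,[a,b]).
\end{align*}
```

The first displayed inequality in the question then follows from $\left|\int_a^b f'\right| \le \int_a^b |f'|$.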
This was nicely done years ago by de la Vallée Poussin. His formula looks like this: $$V_f(E) = V_f(E_\infty) + \int_E |f'(x)|\,dx,$$ where $V_f$ is a measure describing the total variation of $f$ on a set $E$, and $E_\infty$ is the set of points in $E$ at which $f'(x)=\pm\infty$. See Saks, *Theory of the Integral*, Chapter 4, Section 9, for a classical write-up of these ideas.
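A standard example of where the gap lives (my illustration, not part of the classical write-up) is the Cantor function $c$ on $[0,1]$:

```latex
% The Cantor (devil's staircase) function c is continuous and increasing,
% so V(c,[0,1]) = c(1) - c(0) = 1, yet c'(x) = 0 for Lebesgue-a.e. x
% (namely off the Cantor set). Hence the inequality is strict:
%   \int_0^1 |c'(x)|\,dx = 0 < 1 = V(c,[0,1]).
% In de la Vallee Poussin's formula the entire variation is carried by
% E_\infty: at V_c-almost every point of the Cantor set the derivative
% of c is +\infty, so
\[
V_c([0,1]) \;=\; \underbrace{V_c(E_\infty)}_{=\,1}
           \;+\; \underbrace{\int_0^1 |c'(x)|\,dx}_{=\,0}.
\]
```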
Given the title of this question (Integral of the derivative of a function of bounded variation), I thought this was worth mentioning, since otherwise someone browsing topics at random would find little else of interest here.