The first two theorems of this section constitute the
basic “techniques of integration” taught in a calculus course. However, the careful formulations of these standard methods of evaluating integrals
have some subtle points, i.e., some hypotheses. Calculus students are rarely told about these details.
Integration by parts
Let
$f$ and
$g$ be integrable functions on
$[a,b],$ and as usual let
$F$ and
$G$ denote the functions defined by
$$F\left(x\right)={\int}_{a}^{x}f,\phantom{\rule{4.pt}{0ex}}\text{and}\phantom{\rule{4.pt}{0ex}}G\left(x\right)={\int}_{a}^{x}g.$$
Then
$${\int}_{a}^{b}fG=[F\left(b\right)G\left(b\right)-F\left(a\right)G\left(a\right)]-{\int}_{a}^{b}Fg.$$
Or, recalling that
$f={F}^{\text{'}}$ and
$g={G}^{\text{'}},$
$${\int}_{a}^{b}{F}^{\text{'}}G=[F\left(b\right)G\left(b\right)-F\left(a\right)G\left(a\right)]-{\int}_{a}^{b}F{G}^{\text{'}}.$$
- Prove the preceding theorem.
HINT: Replace the upper limit
$b$ by a variable
$x,$ and differentiate both sides.
By the way, how do we know that the functions
$Fg$ and
$fG$ are integrable?
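Though not part of the text's development, the integration-by-parts identity can be sanity-checked numerically for a concrete pair of functions. The `simpson` helper and the particular choice of $f$ and $g$ below are illustrative assumptions, a sketch rather than anything prescribed by the text.

```python
import math

def simpson(f, a, b, n=1000):
    # Composite Simpson's rule on [a, b]; n must be even.
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# Take f(x) = cos x and g(x) = 2x on [0, 1], so that
# F(x) = int_0^x f = sin x and G(x) = int_0^x g = x^2.
f = math.cos
g = lambda x: 2 * x
F = math.sin
G = lambda x: x * x
a, b = 0.0, 1.0

lhs = simpson(lambda x: f(x) * G(x), a, b)  # int_a^b fG
rhs = F(b) * G(b) - F(a) * G(a) - simpson(lambda x: F(x) * g(x), a, b)
# The two sides agree to within the quadrature error.
```

Both integrands here are continuous, so their integrability (the question raised above) is not in doubt for this example.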
- Suppose
$f$ and
$g$ are integrable functions on
$[a,b]$ and that
both
${f}^{\text{'}}$ and
${g}^{\text{'}}$ are continuous on
$(a,b)$ and integrable on
$[a,b].$ (Of course
${f}^{\text{'}}$ and
${g}^{\text{'}}$ are not even defined at the endpoints
$a$ and
$b,$ but they can still be integrable on
$[a,b].$ See
the remark following
[link] .)
Prove that
$${\int}_{a}^{b}f{g}^{\text{'}}=[f\left(b\right)g\left(b\right)-f\left(a\right)g\left(a\right)]-{\int}_{a}^{b}{f}^{\text{'}}g.$$
Integration by substitution
Let
$f$ be a continuous function on
$[a,b],$ and suppose
$g$ is a continuous, one-to-one function from
$[c,d]$ onto
$[a,b]$ such that
$g$ is continuously differentiable on
$(c,d),$ and such that
$a=g\left(c\right)$ and
$b=g\left(d\right).$ Assume finally that
${g}^{\text{'}}$ is integrable on
$[c,d].$ Then
$${\int}_{a}^{b}f\left(t\right)\phantom{\rule{0.166667em}{0ex}}dt={\int}_{c}^{d}f\left(g\left(s\right)\right){g}^{\text{'}}\left(s\right)\phantom{\rule{0.166667em}{0ex}}ds.$$
It follows from our assumptions that the function
$f\left(g\left(s\right)\right){g}^{\text{'}}\left(s\right)$ is continuous on
$(c,d)$ and integrable on
$[c,d].$ It also follows from our assumptions that
$g$ maps the open interval
$(c,d)$ onto the open interval
$(a,b).$ As usual, let
$F$ denote the function on
$[a,b]$ defined by
$F\left(x\right)={\int}_{a}^{x}f\left(t\right)\phantom{\rule{0.166667em}{0ex}}dt.$ Then, by part (2) of the Fundamental Theorem,
$F$ is differentiable on
$(a,b),$ and
${F}^{\text{'}}=f.$ Then, by the chain rule,
$F\circ g$ is continuous and differentiable on
$(c,d)$ and
$${(F\circ g)}^{\text{'}}\left(s\right)={F}^{\text{'}}\left(g\left(s\right)\right){g}^{\text{'}}\left(s\right)=f\left(g\left(s\right)\right){g}^{\text{'}}\left(s\right).$$
So, by part (3) of the Fundamental Theorem, we have that
$$\begin{array}{ccc}\hfill {\int}_{c}^{d}f\left(g\left(s\right)\right){g}^{\text{'}}\left(s\right)\phantom{\rule{0.166667em}{0ex}}ds& =& {\int}_{c}^{d}{(F\circ g)}^{\text{'}}\left(s\right)\phantom{\rule{0.166667em}{0ex}}ds\hfill \\ & =& (F\circ g)\left(d\right)-(F\circ g)\left(c\right)\hfill \\ & =& F\left(g\left(d\right)\right)-F\left(g\left(c\right)\right)\hfill \\ & =& F\left(b\right)-F\left(a\right)\hfill \\ & =& {\int}_{a}^{b}f\left(t\right)\phantom{\rule{0.166667em}{0ex}}dt,\hfill \end{array}$$
which finishes the proof.
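The substitution formula can likewise be illustrated numerically. The example below, an assumption of this sketch rather than one from the text, takes $f(t)=1/(1+t^2)$ on $[0,1]$ and $g(s)=\tan s$, which maps $[0,\pi/4]$ one-to-one onto $[0,1]$ with $g'(s)=\sec^2 s$; both sides should equal $\arctan 1=\pi/4$.

```python
import math

def simpson(f, a, b, n=1000):
    # Composite Simpson's rule on [a, b]; n must be even.
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

f = lambda t: 1.0 / (1.0 + t * t)        # continuous on [0, 1]
g = lambda s: math.tan(s)                # one-to-one from [0, pi/4] onto [0, 1]
gp = lambda s: 1.0 / math.cos(s) ** 2    # g' = sec^2 s, integrable on [0, pi/4]

left = simpson(f, 0.0, 1.0)                                    # int_0^1 f(t) dt
right = simpson(lambda s: f(g(s)) * gp(s), 0.0, math.pi / 4)   # int_c^d f(g(s)) g'(s) ds
# Note f(g(s)) g'(s) = cos^2(s) sec^2(s) = 1, so the right side is just pi/4.
```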
- Prove the “Mean Value Theorem” for integrals:
If
$f$ is continuous on
$[a,b],$ then
there exists a
$c\in (a,b)$ such that
$${\int}_{a}^{b}f\left(t\right)\phantom{\rule{0.166667em}{0ex}}dt=f\left(c\right)(b-a).$$
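As a numerical illustration of this Mean Value Theorem (a sketch under assumed choices, not part of the exercise), one can take $f(x)=x^2$ on $[0,1]$: the average value is $1/3$, attained at $c=1/\sqrt{3}\in(0,1)$, which bisection recovers since this $f$ is increasing.

```python
import math

def simpson(f, a, b, n=1000):
    # Composite Simpson's rule on [a, b]; n must be even.
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

f = lambda x: x * x
a, b = 0.0, 1.0
avg = simpson(f, a, b) / (b - a)   # average value of f, here 1/3

# Bisection for a point c in (a, b) with f(c) = avg; valid since f is increasing.
lo, hi = a, b
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) < avg:
        lo = mid
    else:
        hi = mid
c = (lo + hi) / 2                  # approximately 1/sqrt(3), strictly inside (0, 1)
```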
- (Uniform limits of differentiable functions. Compare with
[link] .)
Suppose
$\left\{{f}_{n}\right\}$ is a sequence of continuous functions on a closed interval
$[a,b]$ that converges pointwise to a function
$f.$ Suppose that each derivative
${f}_{n}^{\text{'}}$ is continuous on the open interval
$(a,b),$ is integrable on the closed interval
$[a,b],$ and that the sequence
$\left\{{f}_{n}^{\text{'}}\right\}$ converges uniformly to a function
$g$ on
$(a,b).$ Prove that
$f$ is differentiable on
$(a,b),$ and
${f}^{\text{'}}=g.$ HINT: Let
$x$ be in
$(a,b),$ and let
$c$ be in the interval
$(a,x).$ Justify the following equalities, and use them together with the Fundamental Theorem to make the proof.
$$f\left(x\right)-f\left(c\right)=\lim ({f}_{n}\left(x\right)-{f}_{n}\left(c\right))=\lim {\int}_{c}^{x}{f}_{n}^{\text{'}}={\int}_{c}^{x}g.$$
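The uniform-convergence hypothesis on $\left\{{f}_{n}^{\text{'}}\right\}$ cannot simply be dropped. A standard illustration (an aside, not from this text) is ${f}_{n}\left(x\right)=sin\left(nx\right)/\sqrt{n},$ which converges uniformly to 0 while the derivatives $\sqrt{n}cos\left(nx\right)$ blow up; the check below is a numerical sketch of that behavior on a grid.

```python
import math

# f_n(x) = sin(nx)/sqrt(n) converges uniformly to f = 0 on [0, 1],
# but f_n'(x) = sqrt(n) cos(nx) does not converge at all, so the
# uniform convergence of {f_n'} is a genuine hypothesis.
def fn(n, x):
    return math.sin(n * x) / math.sqrt(n)

def fn_prime(n, x):
    return math.sqrt(n) * math.cos(n * x)

n = 10_000
grid = [i / 1000 for i in range(1001)]
sup_fn = max(abs(fn(n, x)) for x in grid)        # at most 1/sqrt(n) = 0.01
sup_fp = max(abs(fn_prime(n, x)) for x in grid)  # as large as sqrt(n) = 100
```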
We revisit now the Remainder Theorem of Taylor,
which we first presented in
[link] .
The point is that there is another form of this theorem, the integral form, and this version is more powerful in some instances than the original one, e.g., in the general Binomial Theorem below.
Integral form of Taylor's remainder theorem
Let
$c$ be a real number, and let
$f$ have
$n+1$ derivatives on
$(c-r,c+r),$ and suppose that
${f}^{(n+1)}\in I\left([c-r,c+r]\right).$ Then
for each
$c<x<c+r,$
$$f\left(x\right)-{T}_{(f,c)}^{n}\left(x\right)={\int}_{c}^{x}{f}^{(n+1)}\left(t\right)\frac{{(x-t)}^{n}}{n!}\phantom{\rule{0.166667em}{0ex}}dt,$$
where
${T}_{(f,c)}^{n}$ denotes the
$n$ th Taylor polynomial for
$f.$
Similarly, for
$c-r<x<c,$
$$f\left(x\right)-{T}_{(f,c)}^{n}\left(x\right)=-{\int}_{x}^{c}{f}^{(n+1)}\left(t\right)\frac{{(x-t)}^{n}}{n!}\phantom{\rule{0.166667em}{0ex}}dt.$$
Prove the preceding theorem.
HINT: Argue by induction on
$n,$ and integrate by parts.
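Before the induction, it may help to see the identity numerically in one concrete case. The choices below ($f=exp,$ $c=0,$ $n=2,$ $x=0.5$) and the `simpson` helper are assumptions of this sketch, not part of the exercise.

```python
import math

def simpson(f, a, b, n=1000):
    # Composite Simpson's rule on [a, b]; n must be even.
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# f = exp, c = 0, n = 2: the Taylor polynomial is T^2(x) = 1 + x + x^2/2,
# and the remainder should equal int_0^x e^t (x - t)^2 / 2! dt for x > 0.
x = 0.5
taylor2 = 1 + x + x * x / 2
remainder = math.exp(x) - taylor2
integral = simpson(lambda t: math.exp(t) * (x - t) ** 2 / 2, 0.0, x)
```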
REMARK We return now to the general Binomial Theorem, first studied
in
[link] .
The proof given there used the derivative form of Taylor's Remainder Theorem, but we were only able to prove the Binomial Theorem for
$\left|t\right|<1/2.$ The theorem below uses the integral form of Taylor's Remainder Theorem in its proof,
and it gives the full binomial theorem, i.e., for all
$t$ for which
$\left|t\right|<1.$
General binomial theorem
Let
$\alpha =a+bi$ be a fixed complex number. Then
$${(1+t)}^{\alpha}=\sum _{k=0}^{\infty}\left(\genfrac{}{}{0pt}{}{\alpha}{k}\right){t}^{k}$$
for all
$t\in (-1,1).$
For clarity, we repeat some of the proof of
[link] .
Given a general
$\alpha =a+bi,$ consider the function
$g:(-1,1)\to C$ defined by
$g\left(t\right)={(1+t)}^{\alpha}.$ Observe that the
$n$ th derivative of
$g$ is given by
$${g}^{\left(n\right)}\left(t\right)=\frac{\alpha (\alpha -1)\cdots (\alpha -n+1)}{{(1+t)}^{n-\alpha}}.$$
Then
$g\in {C}^{\infty}\left((-1,1)\right).$
For each nonnegative integer
$k$ define
$${a}_{k}={g}^{\left(k\right)}\left(0\right)/k!=\frac{\alpha (\alpha -1)\cdots (\alpha -k+1)}{k!}=\left(\genfrac{}{}{0pt}{}{\alpha}{k}\right),$$
and set
$h\left(t\right)={\sum}_{k=0}^{\infty}{a}_{k}{t}^{k}.$ The radius of convergence for the power series function
$h$ is 1,
as was shown in
[link] .
We wish to show that
$g\left(t\right)=h\left(t\right)$ for all
$-1<t<1.$ That is, we wish to show that
$g$ is a Taylor series function around 0.
It will suffice to show that the sequence
$\left\{{S}_{n}\right\}$ of partial sums of the power series function
$h$ converges to the function
$g.$ We note also that the
$n$ th partial sum is just the
$n$ th Taylor polynomial
${T}_{g}^{n}$ for
$g.$
Now, fix a
$t$ strictly between 0 and
$1.$ The argument for
$t$ between
$-1$ and 0 is completely analogous.
Choose an
$\epsilon >0$ for which
$\beta =(1+\epsilon )t<1.$ We let
${C}_{\epsilon}$ be a number such that
$|\left(\genfrac{}{}{0pt}{}{\alpha}{n}\right)|\le {C}_{\epsilon}{(1+\epsilon )}^{n}$ for all nonnegative integers
$n.$ See
[link] .
We will also need the following estimate, which can beeasily deduced as a calculus exercise (See part (d) of
[link] .).
For all
$s$ between 0 and
$t,$ we have
$(t-s)/(1+s)\le t.$ Note also that, for any
$s\in (0,t),$ we have
$|{(1+s)}^{\alpha}|={(1+s)}^{a},$ and this is trapped between 1 and
${(1+t)}^{a}.$ Hence, there exists a number
${M}_{t}$ such that
$|{(1+s)}^{\alpha -1}|\le {M}_{t}$ for all
$s\in (0,t).$ We will need this estimate in the calculation that follows.
Then, by the integral form of Taylor's Remainder Theorem, we have:
$$\begin{array}{ccc}\hfill |g\left(t\right)-\sum _{k=0}^{n}{a}_{k}{t}^{k}|& =& |g\left(t\right)-{T}_{g}^{n}\left(t\right)|\hfill \\ & =& |{\int}_{0}^{t}{g}^{(n+1)}\left(s\right)\frac{{(t-s)}^{n}}{n!}\phantom{\rule{0.166667em}{0ex}}ds|\hfill \\ & =& |{\int}_{0}^{t}(n+1)\left(\genfrac{}{}{0pt}{}{\alpha}{n+1}\right){(1+s)}^{\alpha -n-1}{(t-s)}^{n}\phantom{\rule{0.166667em}{0ex}}ds|\hfill \\ & \le & {\int}_{0}^{t}(n+1)\left|\left(\genfrac{}{}{0pt}{}{\alpha}{n+1}\right)\right||{(1+s)}^{\alpha -1}|{\left(\frac{t-s}{1+s}\right)}^{n}\phantom{\rule{0.166667em}{0ex}}ds\hfill \\ & \le & {\int}_{0}^{t}(n+1)\left|\left(\genfrac{}{}{0pt}{}{\alpha}{n+1}\right)\right|{M}_{t}{t}^{n}\phantom{\rule{0.166667em}{0ex}}ds\hfill \\ & \le & {C}_{\epsilon}{M}_{t}(n+1){\int}_{0}^{t}{(1+\epsilon )}^{n+1}{t}^{n}\phantom{\rule{0.166667em}{0ex}}ds\hfill \\ & =& {C}_{\epsilon}{M}_{t}(n+1){(1+\epsilon )}^{n+1}{t}^{n+1}\hfill \\ & =& {C}_{\epsilon}{M}_{t}(n+1){\beta}^{n+1},\hfill \end{array}$$
which tends to 0 as
$n$ goes to
$\infty ,$ because
$\beta <1.$ This completes the proof for
$0<t<1.$
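The convergence just proved can be observed numerically. In the sketch below the exponent $\alpha =1+2i$ and the point $t=0.5$ are arbitrary illustrative choices; the partial sums of the binomial series are compared against the principal value of ${(1+t)}^{\alpha}.$

```python
import cmath

alpha = 1 + 2j          # an arbitrary complex exponent (illustrative choice)
t = 0.5                 # any t with |t| < 1

# Partial sums of sum_k binom(alpha, k) t^k, with the generalized binomial
# coefficient built up by the recursion binom(a, k+1) = binom(a, k)(a - k)/(k + 1).
total = 0 + 0j
coeff = 1 + 0j          # binom(alpha, 0)
for k in range(200):
    total += coeff * t ** k
    coeff *= (alpha - k) / (k + 1)

exact = cmath.exp(alpha * cmath.log(1 + t))   # principal value of (1 + t)^alpha
```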