
Section 7.8 Taylor Polynomials

Motivating Questions
  • What is a Taylor polynomial? For what purposes are Taylor polynomials used?

  • How do we determine the accuracy when we use a Taylor polynomial to approximate a function?

Polynomial functions are the simplest possible functions in mathematics, in part because they require only addition and multiplication to evaluate. Consequently, in practical applications, it is often useful to approximate complicated functions using polynomials. In this section we will learn how to obtain polynomial approximations of functions, and how to determine how good an approximation is.

As an example, consider the geometric series

\begin{equation} 1 + x + x^2 + \cdots + x^k + \cdots = \sum_{k=0}^{\infty} x^k\text{.}\label{E-geomx}\tag{7.21} \end{equation}

Here we see something very interesting: because a geometric series converges whenever its ratio \(r\) satisfies \(|r|\lt 1\text{,}\) and the sum of a convergent geometric series is \(\frac{a}{1-r}\text{,}\) we can say that for \(|x| \lt 1\text{,}\)

\begin{equation} 1 + x + x^2 + \cdots + x^k + \cdots = \frac{1}{1-x}\text{.}\label{E-geomxsummed}\tag{7.22} \end{equation}

Equation (7.22) states that the non-polynomial function \(\frac{1}{1-x}\) on the right is equal to the infinite polynomial expression on the left. Because the terms on the left get very small as \(k\) gets large, we can truncate the series and say, for example, that

\begin{equation*} 1 + x + x^2 + x^3 \approx \frac{1}{1-x} \end{equation*}

for small values of \(x\text{.}\) This shows one way that a polynomial function can be used to approximate a non-polynomial function; such approximations are one of the main themes in this section and the next.
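For instance, at \(x = 0.1\) the truncated sum gives \(1 + 0.1 + 0.01 + 0.001 = 1.111\text{,}\) while \(\frac{1}{1-0.1} = 1.1111\ldots\text{,}\) so this degree 3 polynomial already agrees with \(\frac{1}{1-x}\) to three decimal places.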

In Example 7.52, we begin our exploration of approximating functions with polynomials.

Example 7.52

Example 7.20 showed how we can approximate the number \(e\) using linear, quadratic, and other polynomial functions; we then used similar ideas in Example 7.35 to approximate \(\ln(2)\text{.}\) In this example, we review and extend the process to find the best quadratic approximation to the exponential function \(e^x\) around the origin. Let \(f(x) = e^x\) throughout this example.

  1. Find a formula for \(P_1(x)\text{,}\) the linearization of \(f(x)\) at \(x=0\text{.}\) (We label this linearization \(P_1\) because it is a first degree polynomial approximation.) Recall that \(P_1(x)\) is a good approximation to \(f(x)\) for values of \(x\) close to \(0\text{.}\) Plot \(f\) and \(P_1\) near \(x=0\) to illustrate this fact.

  2. Since \(f(x) = e^x\) is not linear, the linear approximation eventually is not a very good one. To obtain better approximations, we want to develop a different approximation that bends to make it more closely fit the graph of \(f\) near \(x=0\text{.}\) To do so, we add a quadratic term to \(P_1(x)\text{.}\) In other words, we let

    \begin{equation*} P_2(x) = P_1(x) + c_2x^2 \end{equation*}

    for some real number \(c_2\text{.}\) We need to determine the value of \(c_2\) that makes the graph of \(P_2(x)\) best fit the graph of \(f(x)\) near \(x=0\text{.}\)

    Remember that \(P_1(x)\) was a good linear approximation to \(f(x)\) near \(0\text{;}\) this is because \(P_1(0) = f(0)\) and \(P'_1(0) = f'(0)\text{.}\) It is therefore reasonable to seek a value of \(c_2\) so that

    \begin{align*} P_2(0) \amp = f(0)\text{,} \amp P'_2(0) \amp = f'(0)\text{,} \amp \text{and }P''_2(0) \amp = f''(0)\text{.} \end{align*}

    Remember, we are letting \(P_2(x) = P_1(x) + c_2x^2\text{.}\)

    1. Calculate \(P_2(0)\) to show that \(P_2(0) = f(0)\text{.}\)

    2. Calculate \(P'_2(0)\) to show that \(P'_2(0) = f'(0)\text{.}\)

    3. Calculate \(P''_2(x)\text{.}\) Then find a value for \(c_2\) so that \(P''_2(0) = f''(0)\text{.}\)

    4. Explain why the condition \(P''_2(0) = f''(0)\) will put an appropriate bend in the graph of \(P_2\) to make \(P_2\) fit the graph of \(f\) around \(x=0\text{.}\)

Solution
  1. We know that

    \begin{equation*} P_1(x) = f(0) + f'(0)x = 1+x\text{.} \end{equation*}

    Since \(P_1(0) = f(0) = 1\) and \(P'_1(0) = f'(0) = 1\text{,}\) the graphs of \(P_1\) and \(f\) agree at \(x=0\) and have the same slope at \(x=0\) (which means they go in the same direction at \(x=0\)). This is why \(P_1(x)\) is a good approximation to \(f(x)\) for values of \(x\) close to \(0\text{.}\)

  2. Now we let \(P_2(x) = P_1(x) + c_2x^2\text{.}\)

    1. Since

      \begin{equation*} P_2(x) = P_1(x) + c_2x^2 = f(0) + f'(0)x + c_2x^2 \end{equation*}

      we have that

      \begin{equation*} P_2(0) = 1 = f(0) \end{equation*}

      as desired.

    2. A simple calculation shows \(P'_2(x) = P'_1(x) + 2c_2x\text{.}\) So \(P'_2(0) = P'_1(0) = 1 = f'(0)\) as desired.

    3. A simple calculation shows \(P''_2(x) = 2c_2\text{.}\) So \(P''_2(0) = 2c_2\text{.}\) To have \(P''_2(0) = f''(0)\) we must have \(2c_2 = f''(0)\) or \(c_2 = \frac{f''(0)}{2} = \frac{1}{2}\text{.}\)

    4. The second derivative of a function tells us the concavity of the function. Concavity measures how the slopes of the tangent lines to the graph of the function are changing. This tells us how much bend there is in the graph. So if \(P''_2(0) = f''(0)\text{,}\) then \(P_2\) will have the same bend in it at \(x=0\) as \(f\) does. This will make the graph of \(P_2\) mold to the graph of \(f\) around \(x=0\text{.}\)
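In summary, since \(f''(0) = e^0 = 1\text{,}\) we need \(c_2 = \frac{1}{2}\text{,}\) and so the best quadratic approximation to \(e^x\) near \(x=0\) is

\begin{equation*} P_2(x) = 1 + x + \frac{x^2}{2}\text{.} \end{equation*}

For instance, \(P_2(0.1) = 1.105\text{,}\) while \(e^{0.1} \approx 1.10517\text{.}\)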

Subsection Taylor Polynomials

Example 7.52 illustrates the first steps in the process of approximating functions with polynomials. Using this process we can approximate trigonometric, exponential, logarithmic, and other nonpolynomial functions as closely as we like (for certain values of \(x\)) with polynomials. This is extraordinarily useful in that it allows us to calculate values of these functions to whatever precision we like using only the operations of addition, subtraction, multiplication, and division, which can be easily programmed in a computer.

We next extend the approach in Example 7.52 to arbitrary functions at arbitrary points. Let \(f\) be a function that has as many derivatives as we need at a point \(x=a\text{.}\) Recall that \(P_1(x)\) is the tangent line to \(f\) at \((a,f(a))\) and is given by the formula

\begin{equation*} P_1(x) = f(a) + f'(a)(x-a)\text{.} \end{equation*}

\(P_1(x)\) is the linear approximation to \(f\) near \(a\) that has the same slope and function value as \(f\) at the point \(x = a\text{.}\)

We next want to find a quadratic approximation

\begin{equation*} P_2(x) = P_1(x) + c_2(x-a)^2 \end{equation*}

so that \(P_2(x)\) more closely models \(f(x)\) near \(x=a\text{.}\) Consider the following calculations of the values and derivatives of \(P_2(x)\text{:}\)

\begin{align*} P_2(x) \amp = P_1(x) + c_2(x-a)^2 \amp P_2(a) \amp = P_1(a) = f(a)\\ P'_2(x) \amp = P'_1(x) + 2c_2(x-a) \amp P'_2(a) \amp = P'_1(a) = f'(a)\\ P''_2(x) \amp = 2c_2 \amp P''_2(a) \amp = 2c_2\text{.} \end{align*}

To make \(P_2(x)\) fit \(f(x)\) better than \(P_1(x)\text{,}\) we want \(P_2(x)\) and \(f(x)\) to have the same concavity at \(x=a\text{,}\) in addition to having the same slope and function value. That is, we want to have

\begin{equation*} P''_2(a) = f''(a)\text{.} \end{equation*}

This implies that

\begin{equation*} 2c_2 = f''(a) \end{equation*}

and thus

\begin{equation*} c_2 = \frac{f''(a)}{2}\text{.} \end{equation*}

Therefore, the quadratic approximation \(P_2(x)\) to \(f\) centered at \(x=a\) is

\begin{equation*} P_2(x) = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2\text{.} \end{equation*}
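For example, for \(f(x) = \ln(x)\) centered at \(a = 1\) we have \(f(1) = 0\text{,}\) \(f'(1) = 1\text{,}\) and \(f''(1) = -1\text{,}\) so

\begin{equation*} P_2(x) = (x-1) - \frac{(x-1)^2}{2}\text{,} \end{equation*}

which gives \(P_2(1.1) = 0.095\text{,}\) compared to \(\ln(1.1) \approx 0.0953\text{.}\)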

This approach extends naturally to polynomials of higher degree. We define polynomials

\begin{align*} P_3(x) \amp = P_2(x) + c_3(x-a)^3\text{,}\\ P_4(x) \amp = P_3(x) + c_4(x-a)^4\text{,}\\ P_5(x) \amp = P_4(x) + c_5(x-a)^5\text{,} \end{align*}

and in general

\begin{equation*} P_n(x) = P_{n-1}(x) + c_n(x-a)^n\text{.} \end{equation*}

The defining property of these polynomials is that for each \(n\text{,}\) \(P_n(x)\) and all its first \(n\) derivatives must agree with those of \(f\) at \(x = a\text{.}\) In other words we require that

\begin{equation*} P^{(k)}_n(a) = f^{(k)}(a) \end{equation*}

for all \(k\) from 0 to \(n\text{.}\)

To see the conditions under which this happens, suppose

\begin{equation*} P_n(x) = c_0 + c_1(x-a) + c_2(x-a)^2 + \cdots + c_n(x-a)^n\text{.} \end{equation*}

Then

\begin{align*} P^{(0)}_n(a) \amp = c_0\\ P^{(1)}_n(a) \amp = c_1\\ P^{(2)}_n(a) \amp = 2c_2\\ P^{(3)}_n(a) \amp = (2)(3)c_3\\ P^{(4)}_n(a) \amp = (2)(3)(4)c_4\\ P^{(5)}_n(a) \amp = (2)(3)(4)(5)c_5 \end{align*}

and, in general,

\begin{equation*} P^{(k)}_n(a) = (2)(3)(4) \cdots (k-1)(k)c_k = k!c_k\text{.} \end{equation*}

So having \(P^{(k)}_n(a) = f^{(k)}(a)\) means that \(k!c_k = f^{(k)}(a)\) and therefore

\begin{equation*} c_k = \frac{f^{(k)}(a)}{k!} \end{equation*}

for each value of \(k\text{.}\) Using this expression for \(c_k\text{,}\) we have found the formula for the polynomial approximation of \(f\) that we seek. Such a polynomial is called a Taylor polynomial.

Taylor Polynomials

The \(n\)th order Taylor polynomial of \(f\) centered at \(x = a\) is given by

\begin{align*} P_n(x) =\mathstrut \amp f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots + \frac{f^{(n)}(a)}{n!}(x-a)^n\\ =\mathstrut \amp \sum_{k=0}^n \frac{f^{(k)}(a)}{k!}(x-a)^k\text{.} \end{align*}

This degree \(n\) polynomial approximates \(f(x)\) near \(x=a\) and has the property that \(P_n^{(k)}(a) = f^{(k)}(a)\) for \(k = 0, 1, \ldots, n\text{.}\)

Example 7.53

Determine the third order Taylor polynomial for \(f(x) = e^x\text{,}\) as well as the general \(n\)th order Taylor polynomial for \(f\) centered at \(x=0\text{.}\)

Solution

We know that \(f'(x) = e^x\) and so \(f''(x) = e^x\) and \(f'''(x) = e^x\text{.}\) Thus,

\begin{equation*} f(0) = f'(0) = f''(0) = f'''(0) = 1\text{.} \end{equation*}

So the third order Taylor polynomial of \(f(x) = e^x\) centered at \(x=0\) is

\begin{align*} P_3(x) \amp = f(0) + f'(0)(x-0) + \frac{f''(0)}{2!}(x-0)^2 + \frac{f'''(0)}{3!}(x-0)^3\\ \amp = 1 + x + \frac{x^2}{2} + \frac{x^3}{6}\text{.} \end{align*}

In general, for the exponential function \(f\) we have \(f^{(k)}(x) = e^x\) for every positive integer \(k\text{.}\) Thus, the \(k\)th term in the \(n\)th order Taylor polynomial for \(f(x)\) centered at \(x=0\) is

\begin{equation*} \frac{f^{(k)}(0)}{k!}(x-0)^k = \frac{1}{k!}x^k\text{.} \end{equation*}

Therefore, the \(n\)th order Taylor polynomial for \(f(x) = e^x\) centered at \(x=0\) is

\begin{equation*} P_n(x) = 1+x+\frac{x^2}{2!} + \cdots + \frac{1}{n!}x^n = \sum_{k=0}^n \frac{x^k}{k!}\text{.} \end{equation*}
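This computation can also be carried out symbolically. The sketch below is an illustration, not part of the text: it assumes the SymPy library and defines a helper, taylor_polynomial, that builds \(P_n\) directly from the definition; applied to \(e^x\) with \(n=3\) it reproduces the polynomial above.

```python
# Build the nth order Taylor polynomial of f centered at x = a,
# using P_n(x) = sum_{k=0}^{n} f^(k)(a)/k! * (x - a)^k.
import sympy as sp

x = sp.symbols('x')

def taylor_polynomial(f, a, n):
    return sum(sp.diff(f, x, k).subs(x, a) / sp.factorial(k) * (x - a)**k
               for k in range(n + 1))

# Third order Taylor polynomial of e^x centered at 0:
print(sp.expand(taylor_polynomial(sp.exp(x), 0, 3)))
# expected output: x**3/6 + x**2/2 + x + 1
```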
Example 7.54

We have just seen that the \(n\)th order Taylor polynomial centered at \(a = 0\) for the exponential function \(e^x\) is

\begin{equation*} \sum_{k=0}^{n} \frac{x^k}{k!}\text{.} \end{equation*}

In this example, we determine small order Taylor polynomials for several other familiar functions, and look for general patterns.

  1. Let \(f(x) = \frac{1}{1-x}\text{.}\)

    1. Calculate the first four derivatives of \(f(x)\) at \(x=0\text{.}\) Then find the fourth order Taylor polynomial \(P_4(x)\) for \(\frac{1}{1-x}\) centered at \(0\text{.}\)

    2. Based on your results from part (i), determine a general formula for \(f^{(k)}(0)\text{.}\)

  2. Let \(f(x) = \cos(x)\text{.}\)

    1. Calculate the first four derivatives of \(f(x)\) at \(x=0\text{.}\) Then find the fourth order Taylor polynomial \(P_4(x)\) for \(\cos(x)\) centered at \(0\text{.}\)

    2. Based on your results from part (i), find a general formula for \(f^{(k)}(0)\text{.}\) (Think about how \(k\) being even or odd affects the value of the \(k\)th derivative.)

  3. Let \(f(x) = \sin(x)\text{.}\)

    1. Calculate the first four derivatives of \(f(x)\) at \(x=0\text{.}\) Then find the fourth order Taylor polynomial \(P_4(x)\) for \(\sin(x)\) centered at \(0\text{.}\)

    2. Based on your results from part (i), find a general formula for \(f^{(k)}(0)\text{.}\) (Think about how \(k\) being even or odd affects the value of the \(k\)th derivative.)

Answer
  1. For \(f(x) = \frac{1}{1-x}\text{:}\)

    1. \(f^{(k)}(0) = k!\text{.}\)

    2. \begin{equation*} P_n(x) = \sum_{k=0}^n x^k\text{.} \end{equation*}
  2. For \(f(x) = \cos(x)\text{:}\)

    1. \(f^{(k)}(0) = 0\) if \(k\) is odd, and \(f^{(2k)}(0) = (-1)^k\text{.}\)

    2. \(P_n(x) = 1 - \frac{x^2}{2} + \frac{x^4}{4!} - \frac{x^6}{6!} + \cdots + (-1)^{n/2}\frac{x^n}{n!}\) if \(n\) is even and \(P_n(x) = 1 - \frac{x^2}{2} + \frac{x^4}{4!} - \frac{x^6}{6!} + \cdots + (-1)^{(n-1)/2}\frac{x^{n-1}}{(n-1)!}\) if \(n\) is odd.

  3. For \(f(x) = \sin(x)\text{:}\)

    1. \(f^{(k)}(0) = 0\) if \(k\) is even, and \(f^{(2k+1)}(0) = (-1)^k\text{.}\)

    2. \(P_n(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots + (-1)^{(n-1)/2}\frac{x^n}{n!}\) if \(n\) is odd and \(P_n(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots + (-1)^{n/2+1}\frac{x^{n-1}}{(n-1)!}\) if \(n\) is even.

Solution
  1. Let \(f(x) = \frac{1}{1-x}\text{.}\)

    1. The first four derivatives of \(f(x)\) at \(x=0\) are

      \begin{align*} f(x) \amp = \frac{1}{1-x} \amp f(0) \amp = 1\\ f'(x) \amp = \frac{1}{(1-x)^2} \amp f'(0) \amp = 1\\ f''(x) \amp = \frac{2}{(1-x)^3} \amp f''(0) \amp = 2\\ f^{(3)}(x) \amp = \frac{3!}{(1-x)^4} \amp f^{(3)}(0) \amp = 3!\\ f^{(4)}(x) \amp = \frac{4!}{(1-x)^5} \amp f^{(4)}(0) \amp = 4!\text{.} \end{align*}

      It appears that the pattern is

      \begin{equation*} f^{(k)}(0) = k!\text{.} \end{equation*}
    2. The \(n\)th order Taylor polynomial for \(f\) at \(x=0\) is

      \begin{equation*} \sum_{k=0}^n \frac{f^{(k)}(0)}{k!} x^k = \sum_{k=0}^n \frac{k!}{k!} x^k = \sum_{k=0}^n x^k\text{.} \end{equation*}

      This makes sense since \(f(x)\) is the sum of the geometric series with ratio \(x\text{,}\) so the \(n\)th order Taylor polynomial should just be the \(n\)th partial sum of this geometric series.

  2. Let \(f(x) = \cos(x)\text{.}\)

    1. The first four derivatives of \(f(x)\) at \(x=0\) are

      \begin{align*} f(x) \amp = \cos(x) \amp f(0) \amp = 1\\ f'(x) \amp = -\sin(x) \amp f'(0) \amp = 0\\ f''(x) \amp = -\cos(x) \amp f''(0) \amp = -1\\ f^{(3)}(x) \amp = \sin(x) \amp f^{(3)}(0) \amp = 0\\ f^{(4)}(x) \amp = \cos(x) \amp f^{(4)}(0) \amp = 1\text{.} \end{align*}

      It appears that the odd derivatives of \(f(x)\) are all \(\pm\sin(x)\) and so have value 0 at \(x=0\text{,}\) while the even derivatives are \(\pm \cos(x)\) and take alternating values of \(1\) and \(-1\) at \(x=0\text{.}\) Since the even numbers can be written in the form \(2k\) where \(k\) is an integer, we have \(f^{(k)}(0) = 0\) if \(k\) is odd and \(f^{(2k)}(0) = (-1)^k\text{.}\)

    2. Based on the previous part of this problem the \(n\)th order Taylor polynomial for \(\cos(x)\) is

      \begin{equation*} 1 - \frac{x^2}{2} + \frac{x^4}{4!} - \frac{x^6}{6!} + \cdots + (-1)^{n/2}\frac{x^n}{n!} \end{equation*}

      if \(n\) is even and

      \begin{equation*} 1 - \frac{x^2}{2} + \frac{x^4}{4!} - \frac{x^6}{6!} + \cdots + (-1)^{(n-1)/2}\frac{x^{n-1}}{(n-1)!} \end{equation*}

      if \(n\) is odd.

  3. Let \(f(x) = \sin(x)\text{.}\)

    1. The first four derivatives of \(f(x)\) at \(x=0\) are

      \begin{align*} f(x) \amp = \sin(x) \amp f(0) \amp = 0\\ f'(x) \amp = \cos(x) \amp f'(0) \amp = 1\\ f''(x) \amp = -\sin(x) \amp f''(0) \amp = 0\\ f^{(3)}(x) \amp = -\cos(x) \amp f^{(3)}(0) \amp = -1\\ f^{(4)}(x) \amp = \sin(x) \amp f^{(4)}(0) \amp = 0\text{.} \end{align*}

      It appears that the even derivatives of \(f(x)\) are all \(\pm\sin(x)\) and so have value 0 at \(x=0\text{,}\) while the odd derivatives are \(\pm \cos(x)\) and take alternating values of \(1\) and \(-1\) at \(x=0\text{.}\) Since the odd numbers can be written in the form \(2k+1\) where \(k\) is an integer, we have \(f^{(k)}(0) = 0\) if \(k\) is even and \(f^{(2k+1)}(0) = (-1)^k\text{.}\)

    2. Based on the previous part of this problem the \(n\)th order Taylor polynomial for \(\sin(x)\) is

      \begin{equation*} x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots + (-1)^{(n-1)/2}\frac{x^n}{n!} \end{equation*}

      if \(n\) is odd and

      \begin{equation*} x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots + (-1)^{n/2+1}\frac{x^{n-1}}{(n-1)!} \end{equation*}

      if \(n\) is even.

It is possible that an \(n\)th order Taylor polynomial is not a polynomial of degree \(n\text{;}\) that is, the order of the approximation can be different from the degree of the polynomial. For example, in Example 7.54 we found that the second order Taylor polynomial \(P_2(x)\) centered at \(0\) for \(\sin(x)\) is \(P_2(x) = x\text{.}\) In this case, the second order Taylor polynomial is a degree 1 polynomial.
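As a quick numerical illustration (a Python sketch, not part of the text), evaluating the fourth order Taylor polynomials of \(\cos(x)\) and \(\sin(x)\) from Example 7.54 near \(0\) shows how closely they track the true values:

```python
# Compare the 4th order Taylor polynomials of cos and sin (centered at 0)
# with the true values at x = 0.5.
import math

def cos_P4(x):
    return 1 - x**2/2 + x**4/24      # 1 - x^2/2! + x^4/4!

def sin_P4(x):
    return x - x**3/6                # x - x^3/3! (the degree 4 term is 0)

x = 0.5
print(cos_P4(x), math.cos(x))        # 0.8776041... vs 0.8775825...
print(sin_P4(x), math.sin(x))        # 0.4791666... vs 0.4794255...
```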

Subsection Summary

  • We can use Taylor polynomials to approximate functions. This allows us to approximate values of functions using only addition, subtraction, multiplication, and division of real numbers. The \(n\)th order Taylor polynomial centered at \(x=a\) of a function \(f\) is

    \begin{align*} P_n(x) =\mathstrut \amp f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots + \frac{f^{(n)}(a)}{n!}(x-a)^n\\ =\mathstrut \amp \sum_{k=0}^n \frac{f^{(k)}(a)}{k!}(x-a)^k\text{.} \end{align*}

Subsection Exercises