By the end of this section, you should be able to give precise and thorough answers to the questions listed below. You may want to keep these questions in mind to focus your thoughts as you complete the section.
What is the Gram-Schmidt process and why is it useful?
Since integration of functions is difficult, approximation techniques for definite integrals are very important. In calculus we are introduced to various methods, e.g., the midpoint, trapezoid, and Simpson's rules, for approximating definite integrals. These methods divide the interval of integration into subintervals and then use values of the integrand at uniformly spaced points to approximate the integral. They are useful when approximating integrals from tabulated data, but there are better methods for other types of integrands. If we make judicious choices of the points at which we evaluate the integrand, we can obtain more accurate results with less work. One such method is Gaussian quadrature (which, for example, is widely used in radiation heat transfer problems to integrate the equation of radiative transfer directly over space), which we explore later in this section. This method uses the Gram-Schmidt process to produce orthogonal polynomials.
We have seen that orthogonal bases make computations very convenient, and the Gram-Schmidt process allowed us to create orthogonal bases in $\mathbb{R}^n$ using the dot product as the inner product. In this section we will see how the Gram-Schmidt process works in any inner product space.
Next we need to find a third polynomial that lies in the space and is orthogonal to both of the polynomials found so far. Let $W_2$ be the span of those two polynomials. Explain why subtracting from the third basis polynomial its projection onto $W_2$ produces a polynomial that is in the space and is orthogonal to both of the earlier polynomials. Then calculate this polynomial.
Preview Activity 36.1 shows the first steps of the Gram-Schmidt process to construct an orthogonal basis from any basis of an inner product space. To understand how the process works in general, let $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_m\}$ be a basis for a subspace $W$ of an inner product space $V$. Let $\mathbf{w}_1 = \mathbf{v}_1$ and let $W_1 = \operatorname{Span}\{\mathbf{w}_1\}$. Since $\mathbf{w}_1 = \mathbf{v}_1$, we have that $\operatorname{Span}\{\mathbf{w}_1\} = \operatorname{Span}\{\mathbf{v}_1\}$. Now consider the subspace $W_2 = \operatorname{Span}\{\mathbf{v}_1, \mathbf{v}_2\}$ and let
$$\mathbf{w}_2 = \mathbf{v}_2 - \operatorname{proj}_{W_1} \mathbf{v}_2 = \mathbf{v}_2 - \frac{\langle \mathbf{v}_2, \mathbf{w}_1 \rangle}{\langle \mathbf{w}_1, \mathbf{w}_1 \rangle} \mathbf{w}_1.$$
Then $\{\mathbf{w}_1, \mathbf{w}_2\}$ is an orthogonal set. Note that $\mathbf{w}_1 = \mathbf{v}_1 \neq \mathbf{0}$, and the fact that $\mathbf{v}_2 \notin W_1$ implies that $\mathbf{w}_2 \neq \mathbf{0}$. So the set $\{\mathbf{w}_1, \mathbf{w}_2\}$ is linearly independent, being a set of nonzero orthogonal vectors. Now the question is whether $\operatorname{Span}\{\mathbf{w}_1, \mathbf{w}_2\} = \operatorname{Span}\{\mathbf{v}_1, \mathbf{v}_2\}$. Note that $\mathbf{w}_2$ is a linear combination of $\mathbf{v}_1$ and $\mathbf{v}_2$, so $\mathbf{w}_2$ is in $\operatorname{Span}\{\mathbf{v}_1, \mathbf{v}_2\}$. Since $\operatorname{Span}\{\mathbf{w}_1, \mathbf{w}_2\}$ is a 2-dimensional subspace of the 2-dimensional space $W_2$, it must be true that $\operatorname{Span}\{\mathbf{w}_1, \mathbf{w}_2\} = \operatorname{Span}\{\mathbf{v}_1, \mathbf{v}_2\} = W_2$.
Next let $W_3 = \operatorname{Span}\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ and
$$\mathbf{w}_3 = \mathbf{v}_3 - \operatorname{proj}_{W_2} \mathbf{v}_3 = \mathbf{v}_3 - \frac{\langle \mathbf{v}_3, \mathbf{w}_1 \rangle}{\langle \mathbf{w}_1, \mathbf{w}_1 \rangle} \mathbf{w}_1 - \frac{\langle \mathbf{v}_3, \mathbf{w}_2 \rangle}{\langle \mathbf{w}_2, \mathbf{w}_2 \rangle} \mathbf{w}_2.$$
The vector $\mathbf{w}_3$ is orthogonal to both $\mathbf{w}_1$ and $\mathbf{w}_2$ and, by construction, is a linear combination of $\mathbf{v}_1$, $\mathbf{v}_2$, and $\mathbf{v}_3$. So $\mathbf{w}_3$ is in $W_3$. The fact that $\mathbf{v}_3 \notin W_2$ implies that $\mathbf{w}_3 \neq \mathbf{0}$ and that $\{\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3\}$ is a linearly independent set. Since $\operatorname{Span}\{\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3\}$ is a 3-dimensional subspace of the 3-dimensional space $W_3$, we conclude that $\operatorname{Span}\{\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3\}$ equals $\operatorname{Span}\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\} = W_3$.
We continue inductively in this same manner. At the $k$th step we let $W_k = \operatorname{Span}\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k\}$ and
$$\mathbf{w}_k = \mathbf{v}_k - \operatorname{proj}_{W_{k-1}} \mathbf{v}_k = \mathbf{v}_k - \frac{\langle \mathbf{v}_k, \mathbf{w}_1 \rangle}{\langle \mathbf{w}_1, \mathbf{w}_1 \rangle} \mathbf{w}_1 - \frac{\langle \mathbf{v}_k, \mathbf{w}_2 \rangle}{\langle \mathbf{w}_2, \mathbf{w}_2 \rangle} \mathbf{w}_2 - \cdots - \frac{\langle \mathbf{v}_k, \mathbf{w}_{k-1} \rangle}{\langle \mathbf{w}_{k-1}, \mathbf{w}_{k-1} \rangle} \mathbf{w}_{k-1}.$$
We know that $\mathbf{w}_k$ is orthogonal to $\mathbf{w}_1$, $\mathbf{w}_2$, $\ldots$, $\mathbf{w}_{k-1}$. Since $\mathbf{w}_1$, $\mathbf{w}_2$, $\ldots$, $\mathbf{w}_{k-1}$, and $\mathbf{v}_k$ are all in $W_k$, we see that $\mathbf{w}_k$ is also in $W_k$. The fact that $\mathbf{v}_k \notin W_{k-1}$ implies that $\mathbf{w}_k \neq \mathbf{0}$ and that $\{\mathbf{w}_1, \mathbf{w}_2, \ldots, \mathbf{w}_k\}$ is a linearly independent set. Then $\operatorname{Span}\{\mathbf{w}_1, \mathbf{w}_2, \ldots, \mathbf{w}_k\}$ is a $k$-dimensional subspace of the $k$-dimensional space $W_k$, so it follows that
$$\operatorname{Span}\{\mathbf{w}_1, \mathbf{w}_2, \ldots, \mathbf{w}_k\} = \operatorname{Span}\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k\} = W_k.$$
The Gram-Schmidt process builds an orthogonal basis $\{\mathbf{w}_1, \mathbf{w}_2, \ldots, \mathbf{w}_m\}$ for us from a given basis. To make an orthonormal basis $\{\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_m\}$, all we need do is normalize each basis vector: that is, for each $i$ we let
$$\mathbf{u}_i = \frac{\mathbf{w}_i}{\|\mathbf{w}_i\|}.$$
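The process above translates directly into a short computation. The following is a minimal sketch (not part of the text) of the Gram-Schmidt process for vectors stored as NumPy arrays, with the inner product passed in as a function; the function name gram_schmidt and the example vectors are our own illustrative choices.

```python
import numpy as np

def gram_schmidt(vectors, inner):
    """Apply the Gram-Schmidt process to a linearly independent list of
    vectors (NumPy arrays), using the supplied inner product function.
    Returns the orthogonal basis and the normalized (orthonormal) basis."""
    orthogonal = []
    for v in vectors:
        w = v.astype(float)
        for u in orthogonal:
            # subtract the projection of the current vector onto each earlier w
            w = w - (inner(w, u) / inner(u, u)) * u
        orthogonal.append(w)
    orthonormal = [w / np.sqrt(inner(w, w)) for w in orthogonal]
    return orthogonal, orthonormal

# Illustrative use with the dot product on R^3.
dot = lambda u, v: float(np.dot(u, v))
basis = [np.array([1, 1, 0]), np.array([1, 0, 1]), np.array([0, 1, 1])]
orth, orthonorm = gram_schmidt(basis, dot)
```

Because the inner product is a parameter, the same idea applies verbatim to matrices and polynomials, provided the objects support subtraction and scalar multiplication.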
Find an orthonormal basis for the span of the given matrices using the Frobenius inner product $\langle A, B \rangle = \operatorname{trace}\left(AB^{\mathsf{T}}\right)$.
Solution.
Recall that the Frobenius inner product is just like a dot product for matrices. First note that the four given matrices are linearly independent. We let the first matrix be the first element of our orthogonal basis, and the Gram-Schmidt process then produces the remaining three matrices in turn, each obtained by subtracting from the corresponding given matrix its projections onto the matrices already constructed. The resulting set is an orthogonal basis for the span of the given matrices. An orthonormal basis is found by dividing each matrix in this orthogonal basis by its magnitude (its Frobenius norm).
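As a concrete check, here is a sketch of the same kind of computation in NumPy. Since the matrices from the example are not reproduced above, the four $2\times 2$ matrices below are hypothetical stand-ins; only the Frobenius inner product $\operatorname{trace}\left(AB^{\mathsf{T}}\right)$ is taken from the text.

```python
import numpy as np

def frobenius(A, B):
    # Frobenius inner product <A, B> = trace(A B^T), the entrywise dot product of A and B
    return float(np.trace(A @ B.T))

# Hypothetical stand-in matrices (not the ones from the example), chosen to be linearly independent.
M = [np.array([[1., 0.], [1., 1.]]),
     np.array([[1., 1.], [0., 1.]]),
     np.array([[0., 1.], [1., 1.]]),
     np.array([[1., 1.], [1., 0.]])]

orthogonal = []
for A in M:
    W = A.copy()
    for U in orthogonal:
        # subtract the projection of the current matrix onto each earlier orthogonal matrix
        W = W - (frobenius(W, U) / frobenius(U, U)) * U
    orthogonal.append(W)

# divide each matrix by its Frobenius norm to obtain an orthonormal basis
orthonormal = [W / np.sqrt(frobenius(W, W)) for W in orthogonal]
```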
Find the polynomial in $\mathbb{P}_2$ (considered as a subspace of the inner product space of continuous functions on the given interval) that is closest to the given function $f$. Use technology to calculate any required integrals. Draw a graph of your approximation against the graph of $f$.
Solution.
Our job is to find $\operatorname{proj}_{\mathbb{P}_2} f$. To do this, we need an orthogonal basis of $\mathbb{P}_2$. We apply the Gram-Schmidt process to the standard basis $\{1, t, t^2\}$ of $\mathbb{P}_2$ to obtain an orthogonal basis $\{p_1, p_2, p_3\}$ of $\mathbb{P}_2$. We start with $p_1 = 1$, then obtain $p_2$ and $p_3$ by subtracting from $t$ and from $t^2$ their projections onto the previously constructed polynomials. Then
$$\operatorname{proj}_{\mathbb{P}_2} f = \frac{\langle f, p_1 \rangle}{\langle p_1, p_1 \rangle} p_1 + \frac{\langle f, p_2 \rangle}{\langle p_2, p_2 \rangle} p_2 + \frac{\langle f, p_3 \rangle}{\langle p_3, p_3 \rangle} p_3,$$
where each inner product is computed as an integral over the interval in question.
A graph of the approximation and of $f$ is shown in Figure 36.4.
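For readers who want to reproduce this kind of computation, here is a sketch using SymPy. The function $e^t$ and the interval $[0,1]$ below are stand-ins (the example's actual function and interval are not reproduced above); the structure of the computation is the point.

```python
import sympy as sp

t = sp.symbols('t')
a, b = 0, 1                  # stand-in interval; use the one from the example
f = sp.exp(t)                # stand-in function; use the one from the example

ip = lambda p, q: sp.integrate(p * q, (t, a, b))   # <p, q> = integral of p*q over [a, b]

# Gram-Schmidt on the standard basis {1, t, t^2} of P2
orth = []
for p in [sp.Integer(1), t, t**2]:
    w = p
    for u in orth:
        w = sp.expand(w - ip(p, u) / ip(u, u) * u)
    orth.append(w)

# orthogonal projection of f onto P2: sum of <f, p_i>/<p_i, p_i> * p_i
proj = sp.simplify(sum(ip(f, u) / ip(u, u) * u for u in orth))
print(proj)
```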
Each given set $S$ of vectors is linearly independent. Use the Gram-Schmidt process to create an orthogonal set of vectors with the same span as $S$. Then find an orthonormal basis for that span.
The Legendre polynomials form an orthonormal basis for the infinite dimensional inner product space $\mathbb{P}$ of all polynomials using the inner product
$$\langle p, q \rangle = \int_{-1}^{1} p(x) q(x) \, dx.$$
The Legendre polynomials have applications to differential equations, statistics, numerical analysis, and physics (e.g., they appear when solving the Schrödinger equation in three dimensions for a central force). The Legendre polynomials are found by using the Gram-Schmidt process to build an orthogonal basis from the standard basis $\{1, t, t^2, t^3, \ldots\}$ for $\mathbb{P}$. Find the first four Legendre polynomials by creating an orthonormal basis from the set $\{1, t, t^2, t^3\}$.
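A sketch of this computation in SymPy is below (our own illustration, not part of the exercise): Gram-Schmidt applied to $\{1, t, t^2, t^3\}$ with the inner product $\int_{-1}^{1} p(t)q(t)\,dt$, followed by normalization.

```python
import sympy as sp

t = sp.symbols('t')
ip = lambda p, q: sp.integrate(p * q, (t, -1, 1))   # <p, q> = integral of p*q over [-1, 1]

orth = []
for p in [sp.Integer(1), t, t**2, t**3]:
    w = p
    for u in orth:
        w = sp.expand(w - ip(p, u) / ip(u, u) * u)
    orth.append(w)                                  # gives 1, t, t^2 - 1/3, t^3 - 3t/5

# normalize each polynomial to obtain the orthonormal versions
orthonormal = [sp.simplify(w / sp.sqrt(ip(w, w))) for w in orth]
print(orthonormal)
```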
The sine integral function $\operatorname{Si}$ has applications in physics and engineering. We define $\operatorname{Si}$ as
$$\operatorname{Si}(x) = \int_{0}^{x} \frac{\sin(t)}{t} \, dt.$$
Since we cannot find an elementary formula for $\operatorname{Si}$, we use approximations. Find the best approximation to $\operatorname{Si}$ in the indicated polynomial space with the given integral inner product. Use appropriate technology for computations and round output to six places to the right of the decimal.
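One possible numerical approach is sketched below using SciPy's sine integral and numerical integration. The interval $[0,1]$ and the degree-3 polynomial space are stand-in choices, since the exercise's specific space and inner product are not reproduced above.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import sici

a, b, deg = 0.0, 1.0, 3                 # stand-in interval and degree; use the exercise's values
Si = lambda x: sici(x)[0]               # sici returns (Si(x), Ci(x))

def ip(f, g):
    # inner product <f, g> = integral of f*g over [a, b], computed numerically
    return quad(lambda t: f(t) * g(t), a, b)[0]

# Gram-Schmidt on the monomials 1, t, ..., t^deg, represented as polynomial objects
P = np.polynomial.Polynomial
orth = []
for k in range(deg + 1):
    w = P.basis(k)
    for u in orth:
        w = w - (ip(w, u) / ip(u, u)) * u
    orth.append(w)

# best approximation = orthogonal projection of Si onto the polynomial space
best = sum((ip(Si, u) / ip(u, u)) * u for u in orth)
print(np.round(best.coef, 6))           # coefficients, constant term first
```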
Recall from Exercise 18 in Section 35 that any finite dimensional vector space $V$ can be made into an inner product space by selecting a basis $\mathcal{B} = \{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n\}$ for $V$ and defining
$$\langle \mathbf{u}, \mathbf{w} \rangle = \sum_{i=1}^{n} a_i b_i \tag{36.1}$$
if $\mathbf{u} = a_1 \mathbf{v}_1 + a_2 \mathbf{v}_2 + \cdots + a_n \mathbf{v}_n$ and $\mathbf{w} = b_1 \mathbf{v}_1 + b_2 \mathbf{v}_2 + \cdots + b_n \mathbf{v}_n$ in $V$. Let $V$ be the given vector space and let $\mathcal{B}$ be the given set of vectors in $V$. (You may assume that $\mathcal{B}$ is a basis for $V$.)
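A short sketch of this construction (our own illustration): with respect to a basis $\mathcal{B}$, compute coordinate vectors and take their dot product, as in (36.1). The basis of $\mathbb{R}^3$ below is hypothetical, standing in for whatever space and basis the exercise specifies.

```python
import numpy as np

# Columns of B are the basis vectors of a hypothetical basis of R^3 (stand-ins for the exercise's data).
B = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 0.]]).T

def coord(v):
    # coordinate vector [v]_B, found by solving B c = v
    return np.linalg.solve(B, v)

def ip(u, w):
    # inner product from (36.1): dot product of the coordinate vectors
    return float(np.dot(coord(u), coord(w)))

print(ip(np.array([1., 2., 3.]), np.array([0., 1., 1.])))
```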
Simpson's rule is a reasonably accurate method for approximating definite integrals, since it models the integrand on subintervals with quadratics. For that reason, Simpson's rule gives exact values for integrals of all polynomials of degree less than or equal to 2 (and, thanks to a symmetry in the rule, even of degree 3). In Gaussian quadrature, we will use a family of polynomials to determine points at which to evaluate an integral of the form $\int_{-1}^{1} f(x) \, dx$. By allowing ourselves to select evaluation points that are not uniformly distributed across the interval of integration, we will be able to approximate our integrals much more efficiently. The method is constructed so as to obtain exact values for polynomial integrands of as large a degree as possible. As a result, if we can approximate our integrand well with polynomials, we can obtain very good approximations with Gaussian quadrature with minimal effort. The approximation takes the form
$$\int_{-1}^{1} f(x) \, dx \approx w_1 f(x_1) + w_2 f(x_2) + \cdots + w_n f(x_n), \tag{36.4}$$
where the $w_i$ (the weights) are constants and the $x_i$ (the nodes) are points in the interval $[-1,1]$. Gaussian quadrature describes how to find the weights $w_i$ and the nodes $x_i$ in (36.4) to obtain suitable approximations. We begin to explore Gaussian quadrature with the simplest cases.
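In code, the approximation (36.4) is just a weighted sum of function values. Here is a minimal sketch (our own, not part of the project), shown with the two-point rule whose nodes and weights are derived in the activities below.

```python
import math

def gauss_quad(f, nodes, weights):
    # approximate the integral of f over [-1, 1] by w_1 f(x_1) + ... + w_n f(x_n), as in (36.4)
    return sum(w * f(x) for x, w in zip(nodes, weights))

# Two-point rule: nodes +/- 1/sqrt(3), weights 1 (see Project Activity 36.4 below).
nodes, weights = [-1 / math.sqrt(3), 1 / math.sqrt(3)], [1.0, 1.0]
print(gauss_quad(lambda x: x**3 + x**2, nodes, weights))   # exact value is 2/3
```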
In this activity we find through direct calculation the node $x_1$ and weight $w_1$ with $n = 1$ so that
$$\int_{-1}^{1} f(x) \, dx \approx w_1 f(x_1). \tag{36.5}$$
There are two unknowns in this situation ($w_1$ and $x_1$), so we will need 2 equations to find these unknowns. Keep in mind that we want the approximation (36.5) to be exact for polynomials of as large a degree as possible.
In this problem we find through direct calculation the nodes $x_1$, $x_2$ and weights $w_1$, $w_2$ with $n = 2$ so that
$$\int_{-1}^{1} f(x) \, dx \approx w_1 f(x_1) + w_2 f(x_2). \tag{36.6}$$
There are four unknowns in this situation ($w_1$, $w_2$, $x_1$, and $x_2$), so we will need 4 equations to find these unknowns. Keep in mind that we want the approximation (36.6) to be exact for polynomials of as large a degree as possible. In this case we will use $f(x) = 1$, $f(x) = x$, $f(x) = x^2$, and $f(x) = x^3$.
Solve this system of 4 equations in 4 unknowns. You can do this by hand or with any other appropriate tool. Show that $x_1$ and $x_2$ are the roots of the polynomial $t^2 - \frac{1}{3}$.
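As a check on the algebra, the four equations (one for each of $f(x) = 1$, $x$, $x^2$, $x^3$) can be handed to a computer algebra system. The sketch below uses SymPy; it is one possible tool, not the required one.

```python
import sympy as sp

w1, w2, x1, x2 = sp.symbols('w1 w2 x1 x2', real=True)

eqs = [sp.Eq(w1 + w2, 2),                              # f(x) = 1:   integral over [-1, 1] is 2
       sp.Eq(w1*x1 + w2*x2, 0),                        # f(x) = x:   integral is 0
       sp.Eq(w1*x1**2 + w2*x2**2, sp.Rational(2, 3)),  # f(x) = x^2: integral is 2/3
       sp.Eq(w1*x1**3 + w2*x2**3, 0)]                  # f(x) = x^3: integral is 0

print(sp.solve(eqs, [w1, w2, x1, x2]))
# Both solutions have w1 = w2 = 1 and {x1, x2} = {-1/sqrt(3), 1/sqrt(3)},
# which are exactly the roots of t^2 - 1/3.
```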
Beyond solving a system of equations as in Project Activity 36.4, it is reasonable to ask what the connection is between Gaussian quadrature and linear algebra. We explore that connection now.
In the general case, we want to find the weights $w_i$ and nodes $x_i$ that make the approximation (36.4) exact for polynomials of as large a degree as possible. We have $2n$ unknowns $w_1$, $w_2$, $\ldots$, $w_n$ and $x_1$, $x_2$, $\ldots$, $x_n$, so we need to impose $2n$ conditions to determine the unknowns. We will require equality for the $2n$ functions $x^k$ for $k$ from 0 to $2n - 1$. This yields the equations
$$w_1 x_1^k + w_2 x_2^k + \cdots + w_n x_n^k = \int_{-1}^{1} x^k \, dx = \frac{1 + (-1)^k}{k + 1} \quad \text{for } k = 0, 1, \ldots, 2n - 1.$$
It is inefficient to always solve these systems of equations to find the nodes and weights, especially since there is a more elegant way to find the nodes.
Use appropriate technology to find the equations satisfied by the $w_i$ and $x_i$ for the indicated values of $n$.
Now we will see the more elegant way to find the nodes. As we will show in some cases, the nodes can be found as roots of a set of orthogonal polynomials in $\mathbb{P}_n$ with the inner product $\langle p, q \rangle = \int_{-1}^{1} p(t) q(t) \, dt$. Begin with the standard basis $\{1, t, t^2, \ldots, t^n\}$ of $\mathbb{P}_n$. Use appropriate technology to find an orthogonal basis for $\mathbb{P}_n$ obtained by applying the Gram-Schmidt process to this basis. The polynomials in this basis are called Legendre polynomials. Check that the nodes you found are roots of the Legendre polynomials by finding roots of these polynomials using any method. Explain why the nodes appear to be roots of the Legendre polynomials.
Although it would take us beyond the scope of this project to verify this fact, the nodes in the $n$th Gaussian quadrature approximation (36.4) are in fact the roots of the $n$th order Legendre polynomial. In other words, if $p_n(t)$ is the $n$th order Legendre polynomial, then $x_1$, $x_2$, $\ldots$, $x_n$ are the roots of $p_n(t)$ in $[-1,1]$. Gaussian quadrature as described in (36.4) using the polynomial $p_n(t)$ is exact if the integrand $f$ is a polynomial of degree less than $2n$.
Let us now see how good the integral estimates are with the Gaussian quadrature method using an example. Use Gaussian quadrature with the indicated value of $n$ to approximate the given integral. Be sure to explain how you found your nodes and weights (approximate the nodes and weights to 8 decimal places). Compare the approximations with the actual value of the integral. Use technology as appropriate to help with calculations.
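One convenient way to obtain the nodes and weights is NumPy's Gauss-Legendre routine, which computes the roots of the $n$th Legendre polynomial and the corresponding weights. The integrand below ($\cos x$) is only a stand-in, since the integral from the problem is not reproduced above.

```python
import numpy as np

f = lambda x: np.cos(x)           # stand-in integrand; replace with the one from the problem
exact = 2 * np.sin(1.0)           # integral of cos(x) over [-1, 1]

for n in (2, 3, 4, 5):
    nodes, weights = np.polynomial.legendre.leggauss(n)   # nodes are roots of the nth Legendre polynomial
    approx = np.sum(weights * f(nodes))
    print(n, np.round(nodes, 8), np.round(approx, 8), abs(approx - exact))
```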
Our Gaussian quadrature formula was derived for integrals on the interval $[-1,1]$. We conclude by seeing how a definite integral on an arbitrary interval $[a,b]$ can be converted to one on the interval $[-1,1]$.
Consider the problem of approximating an integral of the form $\int_a^b g(x) \, dx$. Show that the change of variables $x = \frac{b-a}{2} t + \frac{a+b}{2}$, $dx = \frac{b-a}{2} \, dt$ reduces the integral to the form
$$\frac{b-a}{2} \int_{-1}^{1} g\!\left(\frac{b-a}{2} t + \frac{a+b}{2}\right) dt.$$
(This change of variables can be derived by finding a linear function that maps the interval $[-1,1]$ to the interval $[a,b]$.)
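The change of variables translates into code as follows; this sketch (our own) wraps the substitution around the $[-1,1]$ rule so that integrals over any $[a,b]$ can be approximated.

```python
import numpy as np

def gauss_quad_ab(g, a, b, n):
    """Approximate the integral of g over [a, b] with the n-point Gauss rule,
    using the substitution x = (b - a)/2 * t + (a + b)/2."""
    t, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * t + 0.5 * (a + b)
    return 0.5 * (b - a) * np.sum(w * g(x))

# Example: the integral of x^3 over [0, 2] is 4; the 2-point rule recovers it (up to roundoff).
print(gauss_quad_ab(lambda x: x**3, 0.0, 2.0, 2))
```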
As we have seen, an integral over an interval $[a,b]$ can be converted to an integral over $[-1,1]$ with a change of variables.
Abramowitz, Milton; Stegun, Irene A., eds. (1972). "Section 25.4: Integration." Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables. Dover.