This lecture introduces orthogonal polynomials, focusing on their definition, properties, and the importance of weight functions and intervals in establishing orthogonality. It explains that a set of polynomials is orthogonal if their inner product, defined by an integral with respect to a weight function, equals zero for different indices. The lecture also discusses the linear independence of these polynomials and the construction of orthogonal bases for polynomial expansions.


Mathematical Methods in Physics 2

Prof. Auditya Sharma


Department of Physics
Indian Institute of Science Education and Research, Bhopal

Orthogonal polynomials
Lecture – 41
Introduce orthogonal polynomials

So, with this lecture we begin a new topic. We will discuss orthogonal polynomials starting from a general perspective, and then we will go on to some specific sets of polynomials and see how these general principles apply in those specific cases.

(Refer Slide Time: 00:43)

So, the idea is that we want to come up with a set of polynomials. Let us call them C_0(x), C_1(x), C_2(x), and so on; it is a countably infinite set of polynomials, where C_n(x) is a polynomial of degree n. So, for every non-negative integer n there is a corresponding polynomial of that degree.

And we want all these polynomials to have a nice relationship with each other, namely that they are orthogonal to each other with respect to a non-negative weight function w(x), which also needs to be specified, together with an interval, taken to run from a to b. All of these ingredients are essential to define the idea of orthogonality. Orthogonality simply means that the integral from a to b, over the interval on which we are considering these polynomials and with respect to this weight function, of C_n(x) times C_m(x) times w(x) dx is 0 whenever n is not equal to m. That is what is meant by the orthogonality of these polynomials.
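
Written as a displayed formula, the orthogonality condition just described is

$$\langle C_n, C_m \rangle \;=\; \int_a^b C_n(x)\, C_m(x)\, w(x)\, dx \;=\; 0 \qquad \text{for } n \neq m,$$

with the weight function w(x) ≥ 0 on the interval (a, b).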

And the reason why we need this weight function is that we need the notion of an inner product. Often you may have a scenario where a and b are minus infinity and plus infinity, and it is the weight function which ensures that these integrals converge, and in a nice way.

So, in order to ensure convergence of such integrals, and for the notion of an inner product to be available, we are going to start thinking of these polynomials as elements of a vector space. We have seen how useful it is to work with the notion of an inner product; these integrals can be thought of as inner products of vectors, and for them to be well defined such integrals have to converge. So w(x) plays the role of ensuring that these integrals converge. That is why the weight function is important.
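
For instance (a standard example, not worked out in this lecture), on the interval from minus infinity to plus infinity with the Gaussian weight w(x) = e^{-x^2}, every integral of the form

$$\int_{-\infty}^{\infty} x^{n}\, x^{m}\, e^{-x^2}\, dx$$

is finite, even though the polynomials themselves grow without bound at infinity; this is the weight function that leads to the Hermite polynomials.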

And, so, let us see: just based on some very general observations, several interesting properties can already be brought out. Let us suppose we consider just a finite number of elements of this set, the first N elements: C_0(x), C_1(x), C_2(x), all the way up to C_{N-1}(x). The orthogonality condition already implies many constraints on this set.

So, some interesting properties immediately follow. First of all, we can immediately say that this set of polynomials is linearly independent. The argument is a proof by contradiction.
(Refer Slide Time: 03:55)

Suppose the set is not linearly independent; that means it is linearly dependent, so we should be able to find coefficients a_r such that the sum over r going from 0 to N-1 of a_r C_r(x) is equal to 0.

These coefficients are non-trivial in the sense that not all of them are 0; if all of them were 0, that would not make the set linearly dependent. That is the definition of linear dependence: you can find a set of non-trivial coefficients a_r such that, when you multiply these coefficients with these polynomials and add them up, the result is 0.

So, now we can bring in the weight function and one of these polynomials, C_m(x), multiply from the left, and then integrate over the interval which is relevant to this set of polynomials. So the integral from a to b of dx C_m(x) w(x) times this sum is 0, because the sum itself is 0; and because it is a finite sum, you can interchange the summation and the integration.

So the summation comes to the left of the integral, and what this effectively means is that a_m times the integral from a to b of dx w(x) [C_m(x)]² is equal to 0, where we have exploited the orthogonality property.

It is only when r is equal to m that you get a potentially non-zero value; in fact, we will argue that it is a non-zero value. For every other term, when r is not equal to m, the orthogonality condition immediately sends that term to 0, and so there is only one term which survives.

And this m can of course take all the different values 0, 1, 2, all the way up to N minus 1. So, now, the key point: we have some coefficient times this integral equal to 0, but we will argue that the integral cannot be 0.

The reason is that this is what is called a normalization integral. The only way for it to be 0 would be if the weight function itself were identically 0, but that is a trivial scenario which we do not want to consider. The weight function is a non-negative object, and what multiplies it is a polynomial squared.

This squared polynomial is non-negative and can vanish at only finitely many points (at most m of them), so there is no way that this quantity can be 0. That means a_m is necessarily 0; the only way the equation can hold is if a_m is 0, which in turn contradicts our statement that we could find some non-trivial coefficients a_r making the sum equal to 0. That is not possible.

So, the only way a result of this type can hold is if all the a_r are 0, which is the statement that the set of polynomials is linearly independent. In fact, it forms an orthogonal basis for the expansion of any polynomial in x of degree less than or equal to N minus 1. So, if you take this set of polynomials so designed, C_0(x), C_1(x), and so on, all the way up to C_{N-1}(x), it will serve as an orthogonal basis for the expansion of any polynomial in x of degree less than or equal to N minus 1.

So, the idea is that you can think of an N-dimensional real linear vector space V_N of polynomials in x of degree N minus 1 or less, and the set C_0(x), C_1(x), C_2(x), all the way up to C_{N-1}(x) will serve as a basis for V_N. That is a consequence of the orthogonality of all these polynomials.
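
In symbols, the argument just given can be collected into one line: projecting the assumed linear dependence onto C_m gives

$$\sum_{r=0}^{N-1} a_r\, C_r(x) = 0 \;\Longrightarrow\; \int_a^b C_m(x)\, w(x) \Big[\sum_{r=0}^{N-1} a_r\, C_r(x)\Big] dx \;=\; a_m \int_a^b w(x)\, [C_m(x)]^2\, dx \;=\; 0,$$

and since the normalization integral is strictly positive, a_m = 0 for every m = 0, 1, ..., N-1.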

Now, the orthogonality of these polynomials implies that you can take any polynomial and expand it in terms of these polynomials; they are the basis vectors of your space.
(Refer Slide Time: 08:55)

Then, in fact, you can write down the following equivalent relation. Consider x to the power p, where p is an integer less than n. Then x^p is going to be orthogonal to C_n(x), because, after all, C_0, C_1, C_2, all the way up to C_{n-1}, between them contain terms of exactly this kind, x^p.

So, in fact, it is not just these lower-degree polynomials which are orthogonal to C_n(x); every x^p with p less than n is also orthogonal to C_n(x). The reason is that you can expand x^p in terms of these polynomials, which are all of lower order.

Because all the polynomials up to degree p form a basis, x^p is something which can be expanded in terms of them. Therefore, when you multiply by C_n(x), then by w(x), and take the integral from a to b, since x^p is expressed in terms of these lower-order polynomials, each term goes to 0. So indeed you can argue that x^p is orthogonal to C_n(x) whenever p is less than n.
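
Spelled out, this expansion argument reads

$$x^p = \sum_{r=0}^{p} b_r\, C_r(x) \;\Longrightarrow\; \int_a^b x^p\, C_n(x)\, w(x)\, dx = \sum_{r=0}^{p} b_r \int_a^b C_r(x)\, C_n(x)\, w(x)\, dx = 0 \qquad (p < n),$$

since every index r appearing in the sum satisfies r ≤ p < n.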

So, this gives us a way to actually generate these polynomials. What you have to do is take every C_n(x) to be orthogonal to every x^p with p less than n. So, you start with C_0, then you go to C_1 and make sure that C_1 is orthogonal to every x^p with p less than 1, which in this case is just p equal to 0. When you go to C_2(x), you must ensure that C_2(x) is orthogonal to x^0 and x^1, and so on.

If you can do this, then it is actually the same set that you are generating. You do not even have to work with C_n being directly orthogonal to all the C_p with p less than n.

You can simply work with the powers x^0, x^1, x^2, x^3, and so on, and then you are done. This is an immediate consequence of the vector space structure which comes about because of the orthogonality of these polynomials.
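
As a small illustration of this construction procedure (a sketch, not part of the lecture), the code below imposes the conditions integral from a to b of x^p C_n(x) w(x) dx = 0 for all p < n symbolically, fixing each C_n to be monic so that the overall scale is pinned down. The particular choice w(x) = 1 on [-1, 1] is an assumption made only for illustration; it produces polynomials proportional to the Legendre polynomials.

```python
import sympy as sp

x = sp.symbols('x')

def construct_orthogonal(N, w=1, a=-1, b=1):
    """Build C_0, ..., C_{N-1}, with C_n of degree n, by demanding
    integral_a^b x**p * C_n(x) * w dx = 0 for every p < n.
    Each C_n is taken monic (leading coefficient 1) to fix the scale."""
    polys = []
    for n in range(N):
        coeffs = sp.symbols(f'c0:{n}')                       # unknown lower-order coefficients
        Cn = x**n + sum(c * x**k for k, c in enumerate(coeffs))
        if n > 0:
            eqs = [sp.integrate(x**p * Cn * w, (x, a, b)) for p in range(n)]
            sol = sp.solve(eqs, list(coeffs), dict=True)[0]  # unique solution of the linear system
            Cn = Cn.subs(sol)
        polys.append(sp.expand(Cn))
    return polys

# With w = 1 on [-1, 1] this gives monic multiples of the Legendre polynomials:
print(construct_orthogonal(4))   # [1, x, x**2 - 1/3, x**3 - 3*x/5]
```

Running the same routine with a different weight and interval (for example w = sp.exp(-x**2) on (-sp.oo, sp.oo)) would instead generate multiples of the Hermite polynomials.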

Let us look at an example to illustrate what this means; it is a rather simple example, just to show that orthogonality depends on the weight function and on the interval being considered. Suppose you consider the two polynomials C_1(x) = 4x - 4 and C_2(x) = 16x² - 32x + 14, some quadratic function, and suppose we take the weight function w(x) = e^{-4(x-1)²}, with the interval running all the way from minus infinity to plus infinity.

So, if you want to check if these two functions are orthogonal to each other with respect to
this weight function in this interval, what you have to do is carry out this integral.

(Refer Slide Time: 12:21)

And you will find that, indeed, if you make the substitution x - 1 = y, then 4x - 4 becomes 4y and 16x² - 32x + 14 becomes 16y² - 2, so the integral becomes the integral of 8y (8y² - 1) e^{-4y²} dy over the whole real line. And we can immediately argue that this is 0, because the integrand is an odd function of y. There are other quadratic functions, too, which will give you this.

So, whenever you construct a quadratic function such that the overall integrand is an odd function, which will happen whenever a single factor of y multiplies only even powers of y, then indeed the integral is going to be 0. So, you will be able to find many quadratic functions which are orthogonal to this function with respect to this w on this interval.

But on the other hand, if you consider the same two functions with a different weight on a different interval, say w(x) = 1 on the interval from -1 to 1, then we would have to look at the same integral with these new limits. Now the way to carry out this integral is to actually work out the product: a quadratic term times a linear term gives a cubic.

Then we see that although the x³ and x terms are odd functions and integrate to zero, you still have the x² and constant terms sitting there, which means the result is not going to be 0. You can work out the answer, but the answer is not important. We are just trying to argue that the same two functions are orthogonal to each other with respect to a certain weight function on a certain interval, but they are not orthogonal with respect to another weight function on a different interval.

So, it is important to specify both, ok.
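
Both statements are easy to check symbolically. Here is a small verification sketch (not part of the lecture) using sympy:

```python
import sympy as sp

x = sp.symbols('x', real=True)
C1 = 4*x - 4
C2 = 16*x**2 - 32*x + 14

# Weight e^{-4(x-1)^2} on (-oo, oo): the integrand is odd about x = 1, so the integral is 0.
w = sp.exp(-4*(x - 1)**2)
print(sp.integrate(C1 * C2 * w, (x, -sp.oo, sp.oo)))   # 0

# Weight 1 on [-1, 1]: the even-power terms survive, so the result is nonzero.
print(sp.integrate(C1 * C2, (x, -1, 1)))               # -240
```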


(Refer Slide Time: 14:21)

So, we have started with a very general interval in mind, but in practice it is useful to convert the interval to one of three standard types, and it is always possible to do this. One: if a and b are finite, it is useful to simply map them to the interval from -1 to 1.

There is a linear transformation with which you can do this. If a is finite but b is plus infinity, then you just shift a to 0; it is convenient to work in this manner. And if a and b are minus infinity and plus infinity, that is fine; you leave it as it is.

The reason why this is possible is that you can make a change of variable: write x = cy + d, with c not equal to 0. Every polynomial in x then becomes a polynomial in y of the same degree; after all, it is just a linear transformation.

Now, the different polynomials in y are orthogonal to each other, but with a new weight function, which must also be adjusted accordingly; the idea behind doing this is to work with these new intervals. If x takes the value a, then y takes the value (a - d)/c, and when x takes the value b, y takes the value (b - d)/c.

And you can convince yourself that you can always arrange for (a - d)/c to be -1 and (b - d)/c to be +1, provided both a and b are finite; if one of them is infinite, then you can shift the other to 0. So, it is convenient to reduce the general a and b to these three specific kinds of interval. This is what we will do: build our structure of orthogonal polynomials using these three different types of intervals.
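
For the finite case, a standard choice of the linear map (not spelled out in the lecture) is

$$x \;=\; \frac{b-a}{2}\,y + \frac{a+b}{2},$$

which sends y = -1 to x = a and y = +1 to x = b; when a is finite and b is plus infinity, the shift x = y + a maps the interval from 0 to infinity onto the interval from a to infinity.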

Ok, that is our short introduction to how we are going to work, starting from very general principles: we will look at orthogonal polynomials, see how to construct them, look at some of their properties, and then study specific sets of orthogonal polynomials as we go along. That is all for this lecture.

Thank you.
