
Report on Root Finding Methods

Omar El Atyqy
November, 2021

Abstract
This report focuses on the three root finding methods seen in class: their operating principles, proposed Python implementations, and the results obtained. The report ends with a comparison of the execution times of all the algorithms seen up until now.

1 - Objective Function:
Every method will be benchmarked on the following function:

f(x) = x² + 78x + 65

Plotting f(x) gives the following graph, showing that it does indeed satisfy the unimodality condition:

An analytical search for the function’s minimum returns the following result:

min_x {x² + 78x + 65} = −1456, attained at x = −39

which will be used to test the precision of every method.
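For reference, this follows from setting the derivative to zero: f′(x) = 2x + 78 = 0 gives x = −39, and f(−39) = 1521 − 3042 + 65 = −1456.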

2 - Root Finding Methods:


2.1 Newton’s Method:
2.1.1 Concept:
Newton’s method consists of applying a quadratic approximation to the objective function through a Taylor expansion around the current point; minimizing that approximation amounts to taking the tangent line of f′ at that point and using its x-intercept as the new approximation of the minimizer. This process is iterated until an acceptable precision has been reached. The method requires knowing the first and second derivatives in order to compute the minimum.

2.1.2 Implementation:
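Below is a minimal Python sketch of this iteration; the function names, the starting point x0 = 0 and the default tolerance are illustrative assumptions, not the report's original code.

# Newton's method applied to f': iterate x_{k+1} = x_k - f'(x_k) / f''(x_k)
# until the step size falls below the precision epsilon.
def newton(f_prime, f_second, x0=0.0, epsilon=1e-6, max_iter=100):
    x = x0
    for _ in range(max_iter):
        step = f_prime(x) / f_second(x)
        x = x - step
        if abs(step) < epsilon:
            break
    return x

# Objective function of Section 1 and its exact derivatives.
f = lambda x: x**2 + 78*x + 65
x_star = newton(lambda x: 2*x + 78, lambda x: 2.0)
print(x_star, f(x_star))   # expected: -39.0 and -1456.0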

2.1.3 Results:

2.2 Quasi-Newton Method:


2.2.1 Concept:
The concept is the same as Newton’s method, except that it does not require the
knowledge of the first and second derivatives of the target function, as those get
approximated using the following finite difference formulae:

f′(x) ≈ [f(x + ∆x) − f(x − ∆x)] / (2∆x)

f′′(x) ≈ [f(x + ∆x) − 2f(x) + f(x − ∆x)] / (∆x)²

This process is iterated through until an acceptable value of precision has been
reached. This method requires the knowledge of the target function only.

2.2.2 Implementation:
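A possible Python translation of these formulae is sketched below; the default values of delta and epsilon are illustrative assumptions.

# Quasi-Newton: same update as Newton's method, but the derivatives are
# replaced by the central finite-difference approximations of Section 2.2.1.
def quasi_newton(f, x0=0.0, epsilon=1e-6, delta=1e-4, max_iter=100):
    x = x0
    for _ in range(max_iter):
        d1 = (f(x + delta) - f(x - delta)) / (2 * delta)        # approximates f'(x)
        d2 = (f(x + delta) - 2*f(x) + f(x - delta)) / delta**2  # approximates f''(x)
        step = d1 / d2
        x = x - step
        if abs(step) < epsilon:
            break
    return x

x_star = quasi_newton(lambda x: x**2 + 78*x + 65)
print(x_star)   # expected: close to -39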

2.2.3 Results:

2.3 Secant Method:


2.3.1 Concept:
This method consists of approximating f′ between two points A and B by the straight line segment (the secant) joining them, computing the x-intercept of that segment, and taking that value as the next approximation. This process is iterated until an acceptable precision has been reached. The method requires the knowledge of the first derivative only.

2.3.2 Implementation:
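A sketch of what this could look like in Python, using the interval bounds of Section 3.1 as the two starting points A and B (the bound values shown are assumptions):

# Secant method on f': the slope of the segment joining (a, f'(a)) and (b, f'(b))
# replaces f'' in Newton's update; the x-intercept of that segment is the next iterate.
def secant(f_prime, a, b, epsilon=1e-6, max_iter=100):
    for _ in range(max_iter):
        fa, fb = f_prime(a), f_prime(b)
        c = b - fb * (b - a) / (fb - fa)   # x-intercept of the secant line
        if abs(c - b) < epsilon:
            return c
        a, b = b, c
    return b

x_star = secant(lambda x: 2*x + 78, a=-100.0, b=100.0)
print(x_star)   # expected: -39.0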

2.3.3 Results:

3 - Execution Time:
3.1 Execution parameters:
The following parameters were adopted for every optimization method; a sketch of a timing harness built around them follows the list. The labeling is as follows:

• epsilon: precision

• delta: step used to approximate the derivatives

• low_bound: start of the initial interval

• high_bound: end of the initial interval

• rep: number of times to execute a method to get its average execution time
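A minimal sketch of such a timing harness, reusing the method sketches from the implementation sections above; the parameter values are assumptions, not the report's exact settings.

import time

# Run a method `rep` times and return its average execution time in seconds.
def average_time(method, *args, rep=10000, **kwargs):
    start = time.perf_counter()
    for _ in range(rep):
        method(*args, **kwargs)
    return (time.perf_counter() - start) / rep

# Illustrative parameter values.
epsilon, delta = 1e-6, 1e-4
low_bound, high_bound = -100.0, 100.0

t_newton = average_time(newton, lambda x: 2*x + 78, lambda x: 2.0, epsilon=epsilon)
t_quasi = average_time(quasi_newton, f, epsilon=epsilon, delta=delta)
t_secant = average_time(secant, lambda x: 2*x + 78, low_bound, high_bound, epsilon=epsilon)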

3.2 Results:
Here are the results returned by each method, gathered in one graph. As we can see, the returned minima are so close to each other that there is practically no observable difference in the graph:

As mentioned before, the execution time of each method corresponds to its average execution time over rep runs (in this case: rep = 10000):

And below is the time comparison between all the methods we have seen up until now:

4 - Conclusion:
Although the precision of every algorithm is practically the same, the plot above shows that the fastest of the 10 algorithms are the Secant and Newton methods, while the slowest is still the Exhaustive Search.
