Linear and Non-Linear Models-Lec4

The document discusses linear and nonlinear models in optimization, detailing their definitions, mathematical formulations, and examples in AI, such as linear regression and support vector machines. It explains techniques for solving these models, including the Simplex Method for linear models and Gradient Descent for nonlinear models. Additionally, it provides insights into the estimation techniques used in nonlinear regression and includes Python code examples for implementation.


Linear and Non-Linear Models
Dr. Radwa Ahmed Osman
Linear Models in Optimization

• Definition: Optimization problems where the objective function and constraints are linear.
• Mathematical Formulation:
• Minimize/Maximize f(x) = c₁x₁ + c₂x₂ + … + cₙxₙ
• Subject to constraints:
• a₁x₁ + a₂x₂ + … + aₙxₙ ≤ b
• Examples in AI:
• Linear Regression (minimizing error)
• Support Vector Machines (SVM) with linear kernel
• Linear Programming (LP) for resource allocation
• Techniques to Solve:
• Simplex Method
• Graphical Method
• Interior-Point Methods

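Linear Programming for resource allocation, one of the examples above, can be sketched with SciPy's `linprog`; the objective coefficients and constraints below are illustrative assumptions, not values from the lecture:

```python
from scipy.optimize import linprog

# Hypothetical resource-allocation LP (all numbers are illustrative):
# maximize 3*x1 + 5*x2  ->  linprog minimizes, so negate the objective
c = [-3, -5]
# Constraints, each written as a1*x1 + a2*x2 <= b:
#   x1 + 2*x2 <= 14,  -3*x1 + x2 <= 0,  x1 - x2 <= 2
A_ub = [[1, 2], [-3, 1], [1, -1]]
b_ub = [14, 0, 2]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print("optimal x:", res.x)
print("optimal objective:", -res.fun)
```

The `highs` method uses simplex/interior-point solvers internally, matching the solution techniques listed above.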
Linear Regression (Minimizing Error)
• Linear Regression is used to predict a continuous output by finding the best-fit line that minimizes the error (Mean Squared Error).
• The best-fit line in Linear Regression is found by minimizing the Mean Squared Error (MSE). This is done using Ordinary Least Squares (OLS), which calculates the optimal values of the slope (m) and intercept (b) in the equation:
• y = mx + b
• Steps to Find the Best-Fit Line (Minimizing MSE)
• Define the error function:
• The error is the difference between the actual values yᵢ and the predicted values ŷᵢ:
• Error = yᵢ − ŷᵢ
• Mean Squared Error (MSE) is calculated as: MSE = (1/n) Σ (yᵢ − ŷᵢ)²
• Compute the best parameters (m, b) using the OLS formulas:
• Slope (m): m = (n Σ xᵢyᵢ − Σ xᵢ Σ yᵢ) / (n Σ xᵢ² − (Σ xᵢ)²)
• Intercept (b): b = ȳ − m x̄ = (Σ yᵢ − m Σ xᵢ) / n
• Fit the line to the data by applying these computed values.

Python Code for Linear regression
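The OLS formulas above can be sketched directly in NumPy; the data values here are illustrative, not taken from the lecture:

```python
import numpy as np

# Illustrative data (not from the slides)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 6.0, 8.2, 9.9])

n = len(x)
# Closed-form OLS slope: m = (n*Σxy − Σx*Σy) / (n*Σx² − (Σx)²)
m = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
# Intercept: b = ȳ − m*x̄
b = np.mean(y) - m * np.mean(x)

y_pred = m * x + b
mse = np.mean((y - y_pred) ** 2)
print(f"slope={m:.3f}, intercept={b:.3f}, MSE={mse:.4f}")
```

In practice the same fit is obtained with `sklearn.linear_model.LinearRegression`, which applies least squares internally.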

Support Vector Machines (SVM)
• SVM aims to find the best possible linear decision boundary (hyperplane) that separates data points into different classes.
• Hyperplane Formation:
• SVM searches for the optimal hyperplane that maximizes the margin between the two classes.
• The margin is the distance between the hyperplane and the closest data points (called support vectors).
Support Vector Machines (SVM)
• Equation of the Hyperplane:
• In a 2D space, the decision boundary is a straight line given by:
• w · x + b = 0, where:
• w is the weight vector (determining the orientation of the hyperplane),
• x is the input feature vector,
• b is the bias term.
Python code for SVM
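A minimal sketch of a linear-kernel SVM with scikit-learn's `SVC`; the data points are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable classes (illustrative data, not from the slides)
X = np.array([[1, 2], [2, 3], [3, 3], [6, 5], [7, 8], [8, 8]])
y = np.array([0, 0, 0, 1, 1, 1])

# Linear kernel -> learns the hyperplane w·x + b = 0
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

print("w =", clf.coef_[0])        # weight vector (orientation of the hyperplane)
print("b =", clf.intercept_[0])   # bias term
print("support vectors:\n", clf.support_vectors_)
print("prediction for [5, 5]:", clf.predict([[5, 5]]))
```

`clf.coef_` and `clf.intercept_` give the w and b of the hyperplane equation from the previous slide; `clf.support_vectors_` are the margin-defining points.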

Nonlinear Models in Optimization

• Definition: Problems where at least one function (objective or constraints) is nonlinear.
• Mathematical Formulation:
• Minimize/Maximize f(x) = x² + eˣ + log(x)
• Examples in AI:
• Logistic Regression (the sigmoid function is nonlinear)
• Neural Networks (Deep Learning)
• Nonlinear Support Vector Machines (SVM)
• Solving Nonlinear Models:
• Gradient Descent
• Newton's Method
• Lagrange Multiplier Method
Nonlinear Regression
• Estimation Techniques
• Unlike linear regression, nonlinear models often require iterative methods such as:
• Gradient Descent
• Levenberg-Marquardt Algorithm
• Newton-Raphson Method
• These methods adjust the model parameters iteratively until the error (e.g., Mean Squared Error) is minimized.
Python code for Nonlinear Regression
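A minimal sketch of nonlinear regression with `scipy.optimize.curve_fit`, which by default uses a Levenberg-Marquardt-style least-squares routine (one of the iterative methods listed earlier); the exponential model and data are assumed examples:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical nonlinear model: y = a * exp(b * x)
def model(x, a, b):
    return a * np.exp(b * x)

# Synthetic noise-free data generated from a=2, b=0.5 (illustrative)
x = np.linspace(0, 2, 20)
y = 2.0 * np.exp(0.5 * x)

# curve_fit iteratively adjusts (a, b) to minimize the squared error
params, _ = curve_fit(model, x, y, p0=(1.0, 1.0))
a_hat, b_hat = params
print(f"a={a_hat:.3f}, b={b_hat:.3f}")
```

The starting guess `p0` matters for nonlinear fits: a poor initialization can leave the iterative solver stuck in a local minimum.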

Gradient Descent
• Imagine you're on top of a mountain and your goal is to reach the lowest point in the valley. But there's a problem:
• It's dark 🌙, so you can't see the entire valley.
• You can only feel the slope under your feet (the steepness of the hill).
• You take small steps in the direction of the downward slope.
• If you keep moving in the direction of steepest descent, you'll eventually reach the lowest point (the minimum).
Gradient Descent
• Gradient Descent is an algorithm used in machine learning to find the best model parameters (like the weights in regression).
• Instead of a mountain, think of a loss function that tells us how bad our model is.
• Our goal: Find the lowest value of this loss function (just like reaching the bottom of the valley).
• We take small steps: We update our model parameters little by little.
• The slope tells us the direction: The gradient (derivative) helps us move toward the minimum.
Python for Gradient Descent (Linear)
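Gradient descent for the linear model y = mx + b can be sketched as follows; the data, learning rate, and iteration count are illustrative choices:

```python
import numpy as np

# Illustrative data following y = 2x + 1 (not from the slides)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 5.0, 7.0, 9.0])

m, b = 0.0, 0.0   # initial parameters
lr = 0.02         # learning rate (step size)
n = len(x)

for _ in range(5000):
    y_pred = m * x + b
    # Partial derivatives of MSE with respect to m and b
    dm = (-2 / n) * np.sum(x * (y - y_pred))
    db = (-2 / n) * np.sum(y - y_pred)
    # Step against the gradient (downhill)
    m -= lr * dm
    b -= lr * db

print(f"m={m:.3f}, b={b:.3f}")  # should approach m=2, b=1
```

If the learning rate is too large the updates overshoot and diverge; too small and convergence is needlessly slow.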

Python for Gradient Descent (Non-Linear)
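For the nonlinear case, the same update rule applies to any differentiable function; the function f(x) = x⁴ − 3x³ + 2 below is an assumed example, not one from the lecture:

```python
# Gradient descent on an assumed nonlinear function:
# f(x) = x**4 - 3*x**3 + 2, with derivative f'(x) = 4*x**3 - 9*x**2
def f_prime(x):
    return 4 * x**3 - 9 * x**2

x = 4.0      # starting point
lr = 0.01    # learning rate

for _ in range(1000):
    x -= lr * f_prime(x)  # step opposite the slope

print(f"minimum near x={x:.3f}")  # analytic minimum at x = 9/4 = 2.25
```

Unlike the linear case, a nonlinear loss can have several local minima, so the result may depend on the starting point.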

Thanks
Any Questions?
