Linear and Non-Linear Models - Lec 4
Dr. Radwa Ahmed Osman
Linear Models in Optimization
•Definition: Optimization problems where the objective function and constraints are linear.
•Mathematical Formulation:
•Minimize/Maximize f(x) = c₁x₁ + c₂x₂ + … + cₙxₙ
•Subject to constraints:
• a₁x₁ + a₂x₂ + … + aₙxₙ ≤ b
•Examples in AI:
•Linear Regression (minimizing error)
•Support Vector Machines (SVM) with linear kernel
•Linear Programming (LP) for resource allocation
•Techniques to Solve:
•Simplex Method
•Graphical Method
•Interior-Point Methods
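As a concrete sketch of these techniques, SciPy's linprog can solve small LPs (its "highs" method covers simplex- and interior-point-style algorithms); the coefficients below are made-up illustration values, not from the slides:

# Resource-allocation sketch: maximize profit 2*x1 + 3*x2
# subject to x1 + 2*x2 <= 8 and x1, x2 >= 0.
# linprog minimizes, so the objective is negated.
from scipy.optimize import linprog

c = [-2, -3]                    # negated objective coefficients
A_ub = [[1, 2]]                 # constraint coefficients a1, a2
b_ub = [8]                      # right-hand side b

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print("optimal x:", res.x)      # -> [8. 0.]
print("max profit:", -res.fun)  # -> 16.0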
Linear Regression (Minimizing Error)
• Linear Regression is used to predict a continuous output by finding the best-fit line that minimizes the error (Mean Squared Error).
• The best-fit line in Linear Regression is found by minimizing the Mean Squared Error (MSE). This is done using Ordinary Least
Squares (OLS), which calculates the optimal values of the slope (𝑚) and intercept (𝑏) in the equation:
• 𝑦 = 𝑚𝑥 + 𝑏
• Steps to Find the Best-Fit Line (Minimizing MSE)
• Define the error function:
• The error is the difference between the actual values yᵢ and the predicted values ŷᵢ:
• Error = yᵢ − ŷᵢ
• Mean Squared Error (MSE) is calculated as: MSE = (1/n) Σᵢ (yᵢ − ŷᵢ)²
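Setting the derivatives of the MSE with respect to m and b to zero gives the standard OLS closed-form estimates:
• m = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / Σᵢ (xᵢ − x̄)²
• b = ȳ − m·x̄
• where x̄ and ȳ are the means of the x and y values.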
Python Code for Linear Regression
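A minimal linear-regression sketch with scikit-learn (not the original slide listing; the synthetic data and values here are assumptions):

# Fit y = m*x + b by OLS and report the learned parameters and MSE.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))               # one input feature
y = 3.0 * X.ravel() + 2.0 + rng.normal(0, 1, 50)   # true line y = 3x + 2, plus noise

model = LinearRegression().fit(X, y)               # OLS minimizes the MSE
y_pred = model.predict(X)

print("slope m:", model.coef_[0])        # should be close to 3
print("intercept b:", model.intercept_)  # should be close to 2
print("MSE:", mean_squared_error(y, y_pred))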
Support Vector Machines (SVM)
• SVM aims to find the best possible linear
decision boundary (hyperplane) that
separates data points into different classes.
• Hyperplane Formation:
• SVM searches for the optimal hyperplane that
maximizes the margin between two classes.
• The margin is the distance between the
hyperplane and the closest data points (called
support vectors).
Support Vector Machines (SVM)
• Equation of the Hyperplane:
• In a 2D space, the decision boundary is a straight
line given by:
• 𝑤 ⋅ 𝑥 + 𝑏 = 0 where:
• 𝑤 is the weight vector (determining the
orientation of the hyperplane),
• 𝑥 is the input feature vector,
• 𝑏 is the bias term.
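In the standard hard-margin formulation, the margin works out to 2/‖w‖, so maximizing the margin is equivalent to:
• minimize ½‖w‖² subject to yᵢ(w ⋅ xᵢ + b) ≥ 1 for every training point (xᵢ, yᵢ), with class labels yᵢ ∈ {−1, +1}.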
Python Code for SVM
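A minimal linear-SVM sketch with scikit-learn (not the original slide listing; the toy dataset is an assumption):

# Train a linear-kernel SVM on two separable 2D clusters and inspect
# the learned hyperplane w.x + b = 0 and its support vectors.
import numpy as np
from sklearn.svm import SVC

X = np.array([[1, 2], [2, 3], [2, 1], [6, 5], [7, 7], [8, 6]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1.0).fit(X, y)

print("w:", clf.coef_[0])          # orientation of the hyperplane
print("b:", clf.intercept_[0])     # bias term
print("support vectors:\n", clf.support_vectors_)
print("prediction for [3, 2]:", clf.predict([[3, 2]]))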
Nonlinear Models in Optimization
•Definition: Optimization problems where the objective function or at least one constraint is non-linear.
Non-Linear Regression
• Estimation Techniques
• Unlike linear regression, non-linear models often require iterative methods
like:
• Gradient Descent
• Levenberg-Marquardt Algorithm
• Newton-Raphson Method
• These methods adjust model parameters until they minimize the error (e.g.,
using Mean Squared Error).
Python Code for Non-Linear Regression
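A minimal non-linear-regression sketch with scipy.optimize.curve_fit, which for unconstrained problems defaults to the Levenberg-Marquardt algorithm listed above (the exponential model and data are assumptions, not the slide's example):

# Fit y = a * exp(b * x) to noisy synthetic data by iterative least squares.
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(b * x)

rng = np.random.default_rng(0)
x = np.linspace(0, 2, 40)
y = 2.5 * np.exp(1.3 * x) + rng.normal(0, 0.3, x.size)

params, _ = curve_fit(model, x, y, p0=[1.0, 1.0])  # iterate from an initial guess
a_hat, b_hat = params
print(f"a = {a_hat:.2f}, b = {b_hat:.2f}")  # should be near 2.5 and 1.3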
Gradient Descent
• Imagine you're on top of a mountain and your goal is to reach the lowest
point in the valley. But there's a problem:
• It's dark 🌙, so you can’t see the entire valley.
• You can only feel the slope under your feet (steepness of the hill).
• You take small steps in the direction of the downward slope.
• If you keep moving in the direction of the steepest descent, you'll eventually reach
the lowest point (the minimum).
Gradient Descent
• Gradient Descent is an algorithm used in machine
learning to find the best model parameters (like
weights in regression).
• Instead of a mountain, think of a loss function
that tells us how bad our model is.
• Our goal: Find the lowest value of this loss function
(just like reaching the bottom of the valley).
• We take small steps: We update our model
parameters little by little.
• The slope tells us the direction: The gradient
(derivative) helps us move toward the minimum.
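In symbols, each small step is the standard gradient-descent update, where α is the learning rate (step size) and L is the loss:
• θ ← θ − α ∇L(θ)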
Python Code for Gradient Descent (Linear)
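A gradient-descent sketch for the linear model y = m*x + b, minimizing the MSE (not the original slide listing; data and hyperparameters are assumptions):

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3.0 * x + 2.0 + rng.normal(0, 1, 100)   # true line: y = 3x + 2

m, b = 0.0, 0.0    # start from arbitrary parameters
alpha = 0.01       # learning rate (step size)
for _ in range(2000):
    error = (m * x + b) - y
    grad_m = 2 * np.mean(error * x)   # d(MSE)/dm
    grad_b = 2 * np.mean(error)       # d(MSE)/db
    m -= alpha * grad_m               # step against the gradient
    b -= alpha * grad_b

print(f"m = {m:.2f}, b = {b:.2f}")  # should approach 3 and 2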
Python Code for Gradient Descent (Non-Linear)
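The same loop applied to a non-linear model, y = a * exp(b * x); only the prediction and the chain-rule gradients change (again a sketch with assumed data and settings):

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.0 * np.exp(1.5 * x) + rng.normal(0, 0.05, 50)

a, b = 1.0, 1.0    # initial guess
alpha = 0.01       # learning rate
for _ in range(5000):
    y_pred = a * np.exp(b * x)
    error = y_pred - y
    grad_a = 2 * np.mean(error * np.exp(b * x))          # d(MSE)/da
    grad_b = 2 * np.mean(error * a * x * np.exp(b * x))  # d(MSE)/db
    a -= alpha * grad_a
    b -= alpha * grad_b

print(f"a = {a:.2f}, b = {b:.2f}")  # should approach 2 and 1.5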
•Thanks
•Any Questions?