This document discusses linear regression, a simple machine learning algorithm. It begins with an example of linear regression to predict an output variable from an input variable. Linear regression finds the best-fitting straight line through the data by minimizing the mean squared error between the actual outputs and the predicted outputs. It does this by calculating the weights that multiply each input feature through an equation known as the normal equations. The goal of the machine learning algorithm is to improve the weights to reduce error on test data by learning from observations in training data.

Uploaded by

joaolevi777
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
17 views13 pages

Linear-Regression 231212 072619

This document discusses linear regression, a simple machine learning algorithm. It begins with an example of linear regression to predict an output variable from an input variable. Linear regression finds the best-fitting straight line through the data by minimizing the mean squared error between the actual outputs and the predicted outputs. It does this by calculating the weights that multiply each input feature through an equation known as the normal equations. The goal of the machine learning algorithm is to improve the weights to reduce error on test data by learning from observations in training data.

Uploaded by

joaolevi777
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 13

Deep Learning

Linear Regression

Tiago Vieira

Institute of Computing
Universidade Federal de Alagoas

February 9, 2023
Summary

▶ Introduction
▶ Solution

Introduction

Let’s start with a very simple example: Linear Regression.


▶ ML algorithm: improves its Performance (P) at a given Task (T) through Experience (E).
Figure: Example of linear regression. The solid black line represents f(x), the true function generating the data. The blue dots are samples obtained after adding noise. The solid blue line represents the model obtained by linear regression.

Linear regression:
▶ Input:

  x ∈ Rⁿ. (1)

▶ Output is a linear function of the input:

  y ∈ R. (2)

▶ That is:

  (x₁, x₂, …, xₙ)ᵀ ∈ Rⁿ ↦ y ∈ R. (3)
▶ Parameters are values that control the behavior of the system.
▶ wᵢ is the coefficient that multiplies feature xᵢ before the contributions of all attributes are added together.
▶ [wᵢ] is the set of weights determining how much each attribute contributes to the prediction ŷ:
▶ wᵢ > 0: increasing xᵢ increases ŷ.
▶ wᵢ < 0: increasing xᵢ decreases ŷ.
▶ wᵢ = 0: xᵢ has no influence on ŷ.
▶ |wᵢ| ≫ 0: xᵢ has a large influence on ŷ.
Problem statement:
▶ Task T: Predict y from a given input x by computing ŷ = wᵀx.
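As an illustration of the task above, the prediction ŷ = wᵀx is just a dot product; the weight and feature values in the sketch below are hypothetical, chosen only to make the computation concrete.

```python
import numpy as np

# Hypothetical weights and features (n = 3); values are illustrative only.
w = np.array([2.0, -1.0, 0.5])   # one coefficient w_i per feature
x = np.array([1.0, 3.0, 4.0])    # one input example

# Prediction: y_hat = w^T x (dot product of weights and features)
y_hat = w @ x
print(y_hat)  # 2*1 + (-1)*3 + 0.5*4 = 1.0
```

Note how the signs of the weights behave as described above: the negative w₁ = −1 pulls ŷ down as x₁ grows, while the positive weights push it up.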
Suppose:
▶ Design matrix containing the m test examples, one example per row:

  X^(test) ∈ R^(m×n), with i-th row x^(i)ᵀ, i = 1, …, m. (4)

▶ Vector of regression targets:

  y^(test) ∈ Rᵐ. (5)
Performance (P) is given by the Mean Squared Error (MSE) computed on the model's predictions for the test examples:

  MSE_test = (1/m) Σᵢ (ŷᵢ^(test) − yᵢ^(test))²,  i = 1, …, m. (6)

▶ One can see that the error is zero when ŷ = y.


Also:

  MSE_test = (1/m) ||ŷ^(test) − y^(test)||₂², (7)

so the error increases whenever the Euclidean distance between the predictions and the ground-truth targets increases.
ML algorithm's goal:
▶ Improve the weights w in order to reduce the error MSE_test.
▶ But the algorithm can only "learn" (gain experience) through the observations in the training dataset {(x^(i), y^(i))}, i = 1, …, m.
How do we do that?
▶ By minimizing the mean squared error on the training set: take its gradient with respect to w and set it equal to zero:

  ∇_w MSE_train = 0 (8)

  (1/m) ∇_w ||ŷ^(train) − y^(train)||₂² = 0

  (1/m) ∇_w ||X^(train) w − y^(train)||₂² = 0,

where X^(train) ∈ R^(m×n), w ∈ Rⁿ, and y^(train) ∈ Rᵐ.
Solving for w gives:

  w = (XᵀX)⁻¹ Xᵀ y. (9)

▶ Known as the normal equations.


▶ Simple learning algorithm.
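A minimal sketch of the whole pipeline, under assumed synthetic data: generate noisy samples of a known line, build the design matrix, and solve the normal equations of Eq. (9). Two choices here go beyond the slides and are assumptions: a constant bias column is appended to model an intercept (the slides use the pure ŷ = wᵀx form), and `np.linalg.solve` is applied to XᵀX instead of computing an explicit matrix inverse, which is numerically preferable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: y = 3x + 1 plus Gaussian noise (illustrative).
m = 100
x = rng.uniform(-1.0, 1.0, size=m)
y = 3.0 * x + 1.0 + 0.1 * rng.normal(size=m)

# Design matrix: one row per example; a constant column models the intercept.
X = np.column_stack([x, np.ones(m)])

# Normal equations, Eq. (9): w = (X^T X)^{-1} X^T y,
# solved as a linear system rather than via an explicit inverse.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # approximately [3.0, 1.0]
```

With strongly correlated features, XᵀX becomes ill-conditioned; `np.linalg.lstsq(X, y)` solves the same least-squares problem more robustly in that case.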
Thank you!
tvieira@ic.ufal.br
