Linear Regression With Assumptions
y = B0 + B1*x
The linear equation assigns one scale factor to each input value
or column, called a coefficient and represented by the capital
Greek letter Beta (B).
In higher dimensions when we have more than one input (x), the
line is called a plane or a hyper-plane.
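The simple equation above can be sketched directly in code. This is a minimal illustration with made-up coefficient values (B0 and B1 are not fitted from any data here):

```python
# Hypothetical coefficients, chosen only for illustration.
B0 = 0.5   # intercept
B1 = 2.0   # scale factor (coefficient) for the single input x

def predict(x):
    # y = B0 + B1*x, the linear equation from the text
    return B0 + B1 * x

print(predict(3.0))  # 0.5 + 2.0*3.0 = 6.5
```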
Types
2. Ordinary Least Squares
When we have more than one input we can use Ordinary Least
Squares to estimate the values of the coefficients.
This approach treats the data as a matrix and uses linear algebra
operations to estimate the optimal values for the coefficients. It
means that all of the data must be available and you must have
enough memory to fit the data and perform matrix operations.
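A minimal sketch of this matrix-based approach, using NumPy on a tiny made-up dataset (the data and the true coefficients 1.0 and 2.0 are assumptions for illustration):

```python
import numpy as np

# Tiny illustrative dataset: one input column x and a target y.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])   # generated as y = 1 + 2*x

# Treat the data as a matrix: prepend a column of ones for the intercept B0.
A = np.hstack([np.ones((X.shape[0], 1)), X])

# Ordinary Least Squares via linear algebra: lstsq minimises ||A @ coeffs - y||^2.
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
B0, B1 = coeffs
print(B0, B1)  # close to 1.0 and 2.0
```

Note that the whole matrix A must be held in memory at once, which is the limitation mentioned above.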
3. Gradient Descent
When there are one or more inputs you can use a process of
optimizing the values of the coefficients by iteratively minimizing
the error of the model on your training data.
When using this method, you must select a learning rate (alpha)
parameter that determines the size of the improvement step to
take on each iteration of the procedure.
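The procedure can be sketched for the single-input case. The dataset, the learning rate of 0.05, and the iteration count are assumptions made for this example:

```python
# Gradient descent for y = B0 + B1*x on a tiny made-up dataset.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # generated as y = 1 + 2*x

B0, B1 = 0.0, 0.0
alpha = 0.05                 # learning rate: size of each improvement step
for _ in range(5000):        # iteratively minimise the error on the training data
    # Gradients of the mean squared error with respect to B0 and B1.
    g0 = sum((B0 + B1 * x - y) for x, y in zip(xs, ys)) * 2 / len(xs)
    g1 = sum((B0 + B1 * x - y) * x for x, y in zip(xs, ys)) * 2 / len(xs)
    B0 -= alpha * g0
    B1 -= alpha * g1

print(round(B0, 3), round(B1, 3))  # approaches 1.0 and 2.0
```

Too large an alpha makes the updates overshoot and diverge; too small an alpha makes convergence very slow, which is why selecting it carefully matters.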
4. Regularization
These methods seek both to minimize the sum of the squared error of
the model on the training data (as in ordinary least squares) and
to reduce the complexity of the model (such as the number of
coefficients, or the sum of their absolute values or squares).
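One common variant, ridge regression, adds an L2 penalty on coefficient size and still has a closed-form solution. A minimal sketch on the same kind of made-up data (the penalty strength lam is an assumed value, and for brevity the intercept is penalised here too, which real implementations usually avoid):

```python
import numpy as np

# Ridge regression: squared error plus an L2 penalty that shrinks coefficients.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])   # generated as y = 1 + 2*x
A = np.hstack([np.ones((len(X), 1)), X])
lam = 1.0                             # regularisation strength (assumed value)

# Closed form: coeffs = (A^T A + lam*I)^-1 A^T y.
I = np.eye(A.shape[1])
coeffs = np.linalg.solve(A.T @ A + lam * I, A.T @ y)
print(coeffs)  # coefficients are pulled toward zero relative to plain OLS
```

Using the sum of absolute values of the coefficients instead of their squares gives the lasso, which has no closed form and is typically fit iteratively.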