Ridge and Lasso

12/16/24, 9:33 PM SML-LAB4 - Colab

Ridge and LASSO Regression: Load the Housing Price dataset and fit a LASSO regression model to identify the optimal λ (the regularization strength, called alpha in scikit-learn; it is not a learning rate). Determine which variables are eliminated by LASSO. Also fit a Ridge regression model to the dataset and compare the accuracy of both models.

import pandas as pd
import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LassoCV, RidgeCV
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import StandardScaler

# Load the Housing Price dataset
df = pd.read_csv("Housing.csv")

# Separate features and target
X = df.drop("price", axis=1)
y = df["price"]

# One-hot encode categorical features, dropping the first level of each
X = pd.get_dummies(X, drop_first=True)

# Visualize pairwise feature correlations
plt.figure(figsize=(12, 10))
sns.heatmap(X.corr(), annot=True, cmap="coolwarm", fmt=".2f", linewidths=0.5)
plt.title("Feature Correlation Heatmap")
plt.show()

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Standardize features: fit the scaler on the training set only to avoid leakage
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
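Standardization matters here because the L1/L2 penalty treats all coefficients on the same footing, so features measured in large units would otherwise be penalized very differently. A minimal sketch of that effect on synthetic data (the data and variable names below are illustrative, not from Housing.csv):

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_raw = rng.normal(size=(200, 2))
# Two equally important features: y depends on both with coefficient 3
y_demo = 3 * X_raw[:, 0] + 3 * X_raw[:, 1] + rng.normal(scale=0.1, size=200)

# Blow up the units of the second feature only (e.g. metres -> millimetres)
X_bad = X_raw.copy()
X_bad[:, 1] *= 1000.0

# Same penalty, very different treatment of the two equally important features
coef_unscaled = Lasso(alpha=1.0).fit(X_bad, y_demo).coef_
coef_scaled = Lasso(alpha=1.0).fit(StandardScaler().fit_transform(X_bad), y_demo).coef_
print("Unscaled coefficients:", coef_unscaled)
print("Scaled coefficients:  ", coef_scaled)
```

Without scaling, the large-unit feature is barely shrunk while the other is penalized heavily; after standardization both are shrunk symmetrically, which is why the lab scales before fitting.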

# LASSO with 5-fold cross-validation to choose the regularization strength alpha
lasso = LassoCV(cv=5, random_state=42, max_iter=10000)
lasso.fit(X_train, y_train)

# Ridge with cross-validation over a log-spaced grid of alphas
ridge = RidgeCV(alphas=np.logspace(-6, 6, 13), cv=5)
ridge.fit(X_train, y_train)

lasso_pred = lasso.predict(X_test)
ridge_pred = ridge.predict(X_test)

lasso_mse = mean_squared_error(y_test, lasso_pred)
ridge_mse = mean_squared_error(y_test, ridge_pred)

print("Optimal lambda for LASSO:", lasso.alpha_)
print("Number of variables eliminated by LASSO:", np.sum(lasso.coef_ == 0))
print("LASSO MSE:", lasso_mse)

print("Optimal lambda for Ridge:", ridge.alpha_)
print("Ridge MSE:", ridge_mse)

if lasso_mse < ridge_mse:
    print("LASSO performs better with lower MSE.")
else:
    print("Ridge performs better with lower MSE.")
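The printout above reports only the count of zeroed coefficients. A minimal sketch of how the eliminated variables could be listed by name, shown on synthetic data since the Housing.csv columns are not reproduced here (all names below are illustrative):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the housing features: 10 columns, only 4 informative,
# so LASSO has irrelevant variables it can drive to exactly zero
X_demo, y_demo = make_regression(
    n_samples=200, n_features=10, n_informative=4, noise=10.0, random_state=42
)
feature_names = [f"feat_{i}" for i in range(X_demo.shape[1])]

X_demo = StandardScaler().fit_transform(X_demo)
lasso_demo = LassoCV(cv=5, random_state=42).fit(X_demo, y_demo)

# A coefficient equal to exactly zero means the variable was eliminated
eliminated = [name for name, c in zip(feature_names, lasso_demo.coef_) if c == 0]
print("Eliminated variables:", eliminated)
```

The same comprehension applied to `X.columns` and `lasso.coef_` from the lab code would name the eliminated housing features (here none, since the run above reports zero eliminations).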


Optimal lambda for LASSO: 937.3652950321955
Number of variables eliminated by LASSO: 0
LASSO MSE: 1755552824227.798
Optimal lambda for Ridge: 10.0
Ridge MSE: 1760077525667.3755
LASSO performs better with lower MSE.
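The MSE values above are on the squared scale of price, which makes them hard to read directly. A quick sketch converting the reported values to RMSE, which is in the same units as the house prices:

```python
import numpy as np

# MSE values as reported by the run above
lasso_mse = 1755552824227.798
ridge_mse = 1760077525667.3755

# RMSE is in the same units as the target (price), so it is easier to interpret
lasso_rmse = np.sqrt(lasso_mse)
ridge_rmse = np.sqrt(ridge_mse)
print(f"LASSO RMSE: {lasso_rmse:,.0f}")
print(f"Ridge RMSE: {ridge_rmse:,.0f}")
```

Both models land at a typical prediction error of roughly 1.3 million in price units, with LASSO only marginally ahead, so the two penalties perform nearly identically on this dataset.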
