
Support Vector Machine

411210002 郭玉皓

2024-12-14
set.seed(1)
library(e1071)

## Warning: package 'e1071' was built under R version 4.4.2

x=matrix(rnorm(200*2), ncol=2)
x[1:100, ]=x[1:100, ]+2
x[101:150, ]=x[101:150, ]-2
y=c(rep(1, 150), rep(2, 50))
dat=data.frame(x=x, y=as.factor(y))
plot(x, col=y)

train=sample(200, 100)
svmfit=svm(y~., data=dat[train, ], kernel="radial", gamma=1, cost=1)
plot(svmfit, dat[train, ])
summary(svmfit)

##
## Call:
## svm(formula = y ~ ., data = dat[train, ], kernel = "radial", gamma = 1,
##     cost = 1)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: radial
## cost: 1
##
## Number of Support Vectors: 31
##
## ( 16 15 )
##
##
## Number of Classes: 2
##
## Levels:
## 1 2
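As a quick follow-up sketch (not part of the original run), the fit can be checked on the rows it was trained on; this assumes the svmfit object from the chunk above.

# sketch: confusion matrix and error rate on the training rows (assumes svmfit from above)
train.pred=predict(svmfit, dat[train, ])
table(pred=train.pred, true=dat[train, "y"])
mean(train.pred != dat[train, "y"])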

svmfit=svm(y~., data=dat[train, ], kernel="radial", gamma=1, cost=1e5)
plot(svmfit, dat[train, ])
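With cost=1e5 the margin violations are barely penalized, so the boundary chases individual training points. A minimal sketch of that effect, assuming the refitted svmfit above: its training error is typically close to zero, which hints at overfitting rather than a better model.

# sketch: training error of the cost=1e5 fit (usually near zero, a sign of overfitting)
mean(predict(svmfit, dat[train, ]) != dat[train, "y"])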
set.seed(1)
tune.out=tune(svm, y~., data=dat[train, ], kernel="radial",
              ranges=list(cost=c(0.1, 1, 10, 100, 1000), gamma=c(0.5, 1, 2, 3, 4)))
summary(tune.out)

##
## Parameter tuning of 'svm':
##
## - sampling method: 10-fold cross validation
##
## - best parameters:
## cost gamma
## 1 0.5
##
## - best performance: 0.07
##
## - Detailed performance results:
## cost gamma error dispersion
## 1 1e-01 0.5 0.26 0.15776213
## 2 1e+00 0.5 0.07 0.08232726
## 3 1e+01 0.5 0.07 0.08232726
## 4 1e+02 0.5 0.14 0.15055453
## 5 1e+03 0.5 0.11 0.07378648
## 6 1e-01 1.0 0.22 0.16193277
## 7 1e+00 1.0 0.07 0.08232726
## 8 1e+01 1.0 0.09 0.07378648
## 9 1e+02 1.0 0.12 0.12292726
## 10 1e+03 1.0 0.11 0.11005049
## 11 1e-01 2.0 0.27 0.15670212
## 12 1e+00 2.0 0.07 0.08232726
## 13 1e+01 2.0 0.11 0.07378648
## 14 1e+02 2.0 0.12 0.13165612
## 15 1e+03 2.0 0.16 0.13498971
## 16 1e-01 3.0 0.27 0.15670212
## 17 1e+00 3.0 0.07 0.08232726
## 18 1e+01 3.0 0.08 0.07888106
## 19 1e+02 3.0 0.13 0.14181365
## 20 1e+03 3.0 0.15 0.13540064
## 21 1e-01 4.0 0.27 0.15670212
## 22 1e+00 4.0 0.07 0.08232726
## 23 1e+01 4.0 0.09 0.07378648
## 24 1e+02 4.0 0.13 0.14181365
## 25 1e+03 4.0 0.15 0.13540064
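A small sketch, assuming the tune.out object above: the cross-validation winner is refit on the full training set and stored in tune.out$best.model, so it can be inspected directly before it is used for prediction below.

# sketch: pull out and inspect the model refit on the best (cost, gamma) pair
bestmod=tune.out$best.model
summary(bestmod)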

# predict.svm takes new observations through its 'newdata' argument
table(true=dat[-train, "y"],
      pred=predict(tune.out$best.model, newdata=dat[-train, ]))

## pred
## true 1 2
## 1 54 23
## 2 17 6
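As a hedged follow-up (not printed in the original output), the overall misclassification rate on the held-out half can be computed in one line from the same predictions.

# sketch: test error rate of the tuned model on the 100 held-out observations
test.pred=predict(tune.out$best.model, newdata=dat[-train, ])
mean(test.pred != dat[-train, "y"])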

library(ROCR)

## Warning: package 'ROCR' was built under R version 4.4.2

rocplot=function(pred, truth, ...){
  # build an ROCR prediction object, compute TPR/FPR, and draw the ROC curve
  predob=prediction(pred, truth)
  perf=performance(predob, "tpr", "fpr")
  plot(perf, ...)
}
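A companion sketch built from the same ROCR calls (auc is a helper name introduced here, not part of the lab): it returns the area under the ROC curve as a single number, which is handy alongside the plots below.

# sketch: numeric AUC from ROCR, using the same prediction/performance objects as rocplot
auc=function(pred, truth){
  performance(prediction(pred, truth), "auc")@y.values[[1]]
}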

svmfit.opt=svm(y~., data=dat[train, ], kernel="radial", gamma=2,
               cost=1, decision.values=T)

fitted=attributes(predict(svmfit.opt, dat[train, ],
decision.values=TRUE))$decision.values

par(mfrow=c(1, 2))

rocplot(-fitted, dat[train, "y"], main="Training Data")

svmfit.flex=svm(y~., data=dat[train, ], kernel="radial", gamma=50,
                cost=1, decision.values=T)

fitted=attributes(predict(svmfit.flex, dat[train, ],
decision.values=TRUE))$decision.values
rocplot(-fitted, dat[train, "y"], add=TRUE, col="red")

fitted=attributes(predict(svmfit.opt, dat[-train, ],
decision.values=TRUE))$decision.values

rocplot(-fitted, dat[-train, "y"], main="Test Data")

fitted=attributes(predict(svmfit.flex, dat[-train, ],
                          decision.values=TRUE))$decision.values

rocplot(-fitted, dat[-train, "y"], add=TRUE, col="red")
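Using the hypothetical auc() helper sketched earlier, the two fits can also be compared numerically on the held-out rows; the sign convention (-fitted) follows the ROC plots above.

# sketch: test-set AUC for the moderate (gamma=2) and flexible (gamma=50) fits
fitted.opt=attributes(predict(svmfit.opt, dat[-train, ],
                              decision.values=TRUE))$decision.values
fitted.flex=attributes(predict(svmfit.flex, dat[-train, ],
                               decision.values=TRUE))$decision.values
auc(-fitted.opt, dat[-train, "y"])
auc(-fitted.flex, dat[-train, "y"])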
