Support Vector Classifier

411210002 郭玉皓

2024-12-14

This document works through a support vector classifier (SVC) in R using the e1071 library: generating a toy two-class data set, fitting linear-kernel SVMs at several cost values, tuning cost by ten-fold cross-validation, and evaluating predictions on held-out test data.

set.seed(1)
x = matrix(rnorm(20 * 2), ncol = 2)
y = c(rep(-1, 10), rep(1, 10))
x[y == 1, ] = x[y == 1, ] + 1   # shift one class's mean so the groups just overlap
plot(x, col = (3 - y))

dat = data.frame(x = x, y = as.factor(y))   # svm() needs a factor response for classification

library(e1071)

svmfit = svm(y ~ ., data = dat, kernel = "linear", cost = 10,
             scale = FALSE)
plot(svmfit, dat)
svmfit$index

## [1] 1 2 5 7 14 16 17

summary(svmfit)

##
## Call:
## svm(formula = y ~ ., data = dat, kernel = "linear", cost = 10,
scale = FALSE)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: linear
## cost: 10
##
## Number of Support Vectors: 7
##
## ( 4 3 )
##
##
## Number of Classes: 2
##
## Levels:
## -1 1
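
The support vectors and their coefficients are stored on the fitted object, so the separating hyperplane can be reconstructed by hand. A minimal sketch (an addition, not in the original; it works here because scale = FALSE keeps the coefficients on the raw data scale):

beta  = drop(t(svmfit$coefs) %*% x[svmfit$index, ])  # w = sum over SVs of alpha_i * y_i * x_i
beta0 = svmfit$rho                                   # decision rule: sign(x %*% beta - beta0)
plot(x, col = (3 - y), pch = 19)
abline(beta0 / beta[2], -beta[1] / beta[2])                  # decision boundary
abline((beta0 - 1) / beta[2], -beta[1] / beta[2], lty = 2)   # margin edges
abline((beta0 + 1) / beta[2], -beta[1] / beta[2], lty = 2)
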
svmfit = svm(y ~ ., data = dat, kernel = "linear", cost = 0.1,
             scale = FALSE)
plot(svmfit, dat)

svmfit$index

## [1] 1 2 3 4 5 7 9 10 12 13 14 15 16 17 18 20
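
Lowering cost to 0.1 widens the margin, so far more observations sit on or inside it: 16 support vectors now versus 7 at cost = 10. A quick sketch (not in the original) that counts support vectors across a grid of costs, using the tot.nSV field stored on each fit:

costs = c(0.001, 0.01, 0.1, 1, 5, 10, 100)
sapply(costs, function(C)
  svm(y ~ ., data = dat, kernel = "linear", cost = C, scale = FALSE)$tot.nSV)
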

set.seed(1)
tune.out = tune(svm, y ~ ., data = dat, kernel = "linear",
                ranges = list(cost = c(0.001, 0.01, 0.1, 1, 5, 10, 100)))
summary(tune.out)

##
## Parameter tuning of 'svm':
##
## - sampling method: 10-fold cross validation
##
## - best parameters:
## cost
## 0.1
##
## - best performance: 0.05
##
## - Detailed performance results:
## cost error dispersion
## 1 1e-03 0.55 0.4377975
## 2 1e-02 0.55 0.4377975
## 3 1e-01 0.05 0.1581139
## 4 1e+00 0.15 0.2415229
## 5 5e+00 0.15 0.2415229
## 6 1e+01 0.15 0.2415229
## 7 1e+02 0.15 0.2415229
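
Cross-validation picks cost = 0.1 with an estimated error of 0.05. The same results are available programmatically on the tune object (a small addition to the original):

tune.out$best.parameters   # cost = 0.1
tune.out$performances      # full cost / error / dispersion table
plot(tune.out)             # CV error as a function of cost
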

bestmod = tune.out$best.model
summary(bestmod)

##
## Call:
## best.tune(METHOD = svm, train.x = y ~ ., data = dat, ranges =
list(cost = c(0.001,
## 0.01, 0.1, 1, 5, 10, 100)), kernel = "linear")
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: linear
## cost: 0.1
##
## Number of Support Vectors: 16
##
## ( 8 8 )
##
##
## Number of Classes: 2
##
## Levels:
## -1 1

xtest = matrix(rnorm(20 * 2), ncol = 2)
ytest = sample(c(-1, 1), 20, rep = TRUE)
xtest[ytest == 1, ] = xtest[ytest == 1, ] + 1
testdat = data.frame(x = xtest, y = as.factor(ytest))

ypred = predict(bestmod, testdat)
table(predict = ypred, truth = testdat$y)

## truth
## predict -1 1
## -1 9 1
## 1 2 8
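
With the tuned model, 9 + 8 = 17 of the 20 test observations are classified correctly. A one-line follow-up (an addition) that turns the confusion matrix into an accuracy figure:

conf = table(predict = ypred, truth = testdat$y)
sum(diag(conf)) / sum(conf)   # 17/20 = 0.85
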

svmfit = svm(y ~ ., data = dat, kernel = "linear", cost = 0.1,
             scale = FALSE)
ypred = predict(svmfit, testdat)
table(predict = ypred, truth = testdat$y)

## truth
## predict -1 1
## -1 9 1
## 1 2 8

x[y == 1, ] = x[y == 1, ] + 0.5   # shift further: the classes are now barely separable
plot(x, col = (y + 5) / 2, pch = 19)

dat = data.frame(x = x, y = as.factor(y))


svmfit = svm(y ~ ., data = dat, kernel = "linear", cost = 1e+05)
summary(svmfit)

##
## Call:
## svm(formula = y ~ ., data = dat, kernel = "linear", cost =
1e+05)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: linear
## cost: 1e+05
##
## Number of Support Vectors: 3
##
## ( 1 2 )
##
##
## Number of Classes: 2
##
## Levels:
## -1 1

plot(svmfit, dat)
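
At cost = 1e+05 the optimizer tolerates almost no margin violations, so the fit separates the barely separable training data using only three support vectors. A quick training-set check (an addition, not in the original):

table(predict = predict(svmfit, dat), truth = dat$y)
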

svmfit = svm(y ~ ., data = dat, kernel = "linear", cost = 1)
summary(svmfit)

##
## Call:
## svm(formula = y ~ ., data = dat, kernel = "linear", cost = 1)
##
##
## Parameters:
## SVM-Type: C-classification
## SVM-Kernel: linear
## cost: 1
##
## Number of Support Vectors: 7
##
## ( 4 3 )
##
##
## Number of Classes: 2
##
## Levels:
## -1 1

plot(svmfit, dat)
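
At cost = 1 the margin is much wider and seven support vectors are used; the fit may misclassify a training point, but a wider-margin model like this usually predicts new data better than the narrow cost = 1e+05 solution. The same check, for comparison (again an addition):

table(predict = predict(svmfit, dat), truth = dat$y)
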
