This second part of **“Porting Support Vector Machine Models…”** explains the algorithm for predicting new data with a nonlinear support vector machine (SVM) and a Gaussian radial basis kernel. Instead of pseudo-code, the algorithm is implemented at a low level in R. This low-level implementation is easy to translate to C/C++ or any other language.

Recall the following steps, already listed in part 1:

**Step 1:** To port your SVM model, first extract all data and model parameters you need for prediction from the R environment and save it outside R.

**Step 2:** Implement the SVM prediction algorithm in a programming language of your choice and parameterise it with the data coming from R.

In part 1 you learned how to finish off **step 1**: which data you need for prediction in another environment and how to extract it from a kernlab SVM model.

In **part 2** of this article you will learn how to finish off **step 2**. The SVM prediction algorithm and an implementation example will be presented.

## Step 2

Formally, the prediction algorithm is given by the nonlinear support vector machine prediction formula

$$\hat{y}(x) = \sum_{\iota=1}^{\mu} a_\iota \, K(x_\iota, x) - b$$

and the kernel function (Gaussian radial basis function)

$$K(x_\iota, x) = \exp\left(-\sigma \, \lVert x_\iota - x \rVert^2\right)$$

where the $x_\iota$ are the $\mu$ support vectors, $a_\iota$ the corresponding coefficients, $b$ the bias and $\sigma$ the kernel parameter. Because the target $y$ was autoscaled during training, the raw prediction is rescaled with `yscale` and `ymean` at the end of the algorithm (see below).

### Import model parameters

First import all the model parameters you need (compare part 1). The list below shows whether each piece of model information is a single value or an array of one ([]) or two ([][]) dimensions; a loading sketch follows the list.

- sigma ($\sigma$), *single value*
- a ($a_\iota$), *array, dim usually: []*
- supportV ($x_\iota$), *array, dim usually: [][]*
- bias ($b$), *single value*
- ymean, *single value*
- yscale, *single value*
- xmean, *array, dim usually: []*
- xscale, *array, dim usually: []*
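How the parameters arrive in the new environment depends on how you exported them in part 1. A minimal sketch, assuming each parameter was saved as a plain text file (the file names are hypothetical):

```
# Hypothetical file names; adapt to however you exported the model in part 1
sigma    <- scan("sigma.txt")                     # single value
a        <- scan("a.txt")                         # coefficient vector []
supportV <- as.matrix(read.table("supportV.txt")) # support vectors [][]
bias     <- scan("bias.txt")                      # single value
ymean    <- scan("ymean.txt")                     # single value
yscale   <- scan("yscale.txt")                    # single value
xmean    <- scan("xmean.txt")                     # column means []
xscale   <- scan("xscale.txt")                    # column scales []
```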

### Import new data to predict

Below, the new predictors are named `newdata`. For example, use the predictors X from your SVM training (**svmdat**, part 1) to test your algorithm:

`newdata <- X # (from R in part 1: svmdat[,colnames(svmdat)!="y"])`

### Kernel Function: Gaussian Radial Basis Function

Define the kernel function, in our example the Gaussian radial basis function.

```
K <- function(xi, x, sigma) {
  if (length(xi) != length(x)) stop("vector dimensions not consistent")
  p <- length(xi)
  # Preallocation
  r <- 0
  # Calculate the squared Euclidean norm of the differences
  for (l in 1:p) r <- r + (xi[l] - x[l]) * (xi[l] - x[l])
  return(exp(-sigma * r))
}
```
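A quick sanity check with made-up values: the kernel of a vector with itself is always exp(0) = 1, and it falls towards 0 as the vectors move apart.

```
# Illustrative values only
K(c(1, 2, 3), c(1, 2, 3), sigma = 0.5)  # 1
K(c(1, 2, 3), c(1, 2, 4), sigma = 0.5)  # exp(-0.5) = 0.6065...
```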

### Autoscale New Data

```
n <- ncol(newdata) # j = 1...n (columns of newdata)
m <- nrow(newdata) # i = 1...m (rows of newdata)
```

Preallocation with zeros

`newdataAS <- matrix(data = numeric(m*n), nrow = m, ncol = n)`

Columnwise mean subtraction and scale division

```
for (j in 1:n) {
  for (i in 1:m) {
    newdataAS[i, j] <- (newdata[i, j] - xmean[j]) / xscale[j]
  }
}
```
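In pure R the result can be verified with the built-in `scale()` function; the explicit loops above are kept because they translate directly to C/C++:

```
# Verification only: scale() centers and scales columnwise
newdataAS_check <- scale(newdata, center = xmean, scale = xscale)
all.equal(as.numeric(newdataAS), as.numeric(newdataAS_check))  # TRUE
```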

### SVM Prediction Algorithm

Dimension (number of rows) of the support vector matrix

`mu <- nrow(supportV) # iota = 1...mu (rows of support vector matrix)`

Preallocation with zeros

`H <- matrix(numeric(mu * m), ncol = mu)`

Calculate the kernel matrix (projecting *newdataAS* and *supportV* to the **feature space H**)

```
for (i in 1:m) {
  for (iota in 1:mu) {
    H[i, iota] <- K(supportV[iota, ], newdataAS[i, ], sigma)
  }
}
```
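If kernlab is available in the target R session, the kernel matrix can be cross-checked against the package's own implementation (kernlab's `rbfdot` uses the same parameterisation $\exp(-\sigma \lVert x_\iota - x \rVert^2)$):

```
# Optional cross-check; requires the kernlab package
library(kernlab)
H_check <- kernelMatrix(rbfdot(sigma = sigma), newdataAS, supportV)
all.equal(as.numeric(H), as.numeric(H_check))  # TRUE
```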

Prediction of new values **yp**

```
yp <- numeric(m) # Preallocation with zeros
for (i in 1:m) {
  for (iota in 1:mu) {
    yp[i] <- yp[i] + (H[i, iota] * a[iota])
  }
  # Subtract the bias, then undo the target autoscaling from training
  yp[i] <- (yp[i] - bias) * yscale + ymean
}
```
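Finally, the result should agree with what kernlab itself predicts. A minimal sketch, assuming the trained model object from part 1 is still available in R (the name `model` is hypothetical):

```
# Compare against kernlab's own prediction (model object from part 1)
yp_kernlab <- predict(model, newdata)
all.equal(as.numeric(yp), as.numeric(yp_kernlab))  # TRUE
```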

That’s it!

Yours faithfully,

Dennis Vier