Support Vector Regression

Support vector regression (SVR) is a regression model inspired by support vector machines. The solution can be written as:

\[f({\bf x})=\sum_{i=1}^{N} \alpha_i k({\bf x}, {\bf x}_i)+b\]

where \({\bf x}\) is the new data point, \({\bf x}_i\) is the \(i\)-th training sample, \(N\) denotes the number of training samples, \(k\) is a kernel function, and the coefficients \(\alpha_i\) and the bias \(b\) are determined during training.

See [ScholkopfS02] for a more detailed introduction. LibSVR performs support vector regression using LibSVM [CL11].
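To make the formula concrete, here is a minimal NumPy sketch (independent of Shogun) that evaluates \(f({\bf x})\) for given coefficients, assuming a Gaussian kernel; the training points, coefficients, and bias are made up for illustration.

```python
import numpy as np

def gaussian_kernel(x, y, tau=1.0):
    # k(x, y) = exp(-||x - y||^2 / tau)
    return np.exp(-np.sum((x - y) ** 2) / tau)

def svr_predict(x, X_train, alphas, b, tau=1.0):
    # f(x) = sum_i alpha_i * k(x, x_i) + b
    return sum(a * gaussian_kernel(x, xi, tau)
               for a, xi in zip(alphas, X_train)) + b

# Toy example: two training points and hand-picked coefficients
X_train = np.array([[0.0, 0.0], [1.0, 1.0]])
alphas = np.array([0.5, -0.25])
b = 0.1
print(svr_predict(np.array([0.0, 0.0]), X_train, alphas, b))
```

In actual use, the \(\alpha_i\) and \(b\) come out of the training procedure rather than being chosen by hand.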

Example

Imagine we have files with training and test data. We create CDenseFeatures (here 64-bit floats, aka RealFeatures) and CRegressionLabels as:

Python:

features_train = RealFeatures(f_feats_train)
features_test = RealFeatures(f_feats_test)
labels_train = RegressionLabels(f_labels_train)
labels_test = RegressionLabels(f_labels_test)

Octave:

features_train = RealFeatures(f_feats_train);
features_test = RealFeatures(f_feats_test);
labels_train = RegressionLabels(f_labels_train);
labels_test = RegressionLabels(f_labels_test);

Java:

RealFeatures features_train = new RealFeatures(f_feats_train);
RealFeatures features_test = new RealFeatures(f_feats_test);
RegressionLabels labels_train = new RegressionLabels(f_labels_train);
RegressionLabels labels_test = new RegressionLabels(f_labels_test);

Ruby:

features_train = Modshogun::RealFeatures.new f_feats_train
features_test = Modshogun::RealFeatures.new f_feats_test
labels_train = Modshogun::RegressionLabels.new f_labels_train
labels_test = Modshogun::RegressionLabels.new f_labels_test

R:

features_train <- RealFeatures(f_feats_train)
features_test <- RealFeatures(f_feats_test)
labels_train <- RegressionLabels(f_labels_train)
labels_test <- RegressionLabels(f_labels_test)

Lua:

features_train = modshogun.RealFeatures(f_feats_train)
features_test = modshogun.RealFeatures(f_feats_test)
labels_train = modshogun.RegressionLabels(f_labels_train)
labels_test = modshogun.RegressionLabels(f_labels_test)

C#:

RealFeatures features_train = new RealFeatures(f_feats_train);
RealFeatures features_test = new RealFeatures(f_feats_test);
RegressionLabels labels_train = new RegressionLabels(f_labels_train);
RegressionLabels labels_test = new RegressionLabels(f_labels_test);

C++:

auto features_train = some<CDenseFeatures<float64_t>>(f_feats_train);
auto features_test = some<CDenseFeatures<float64_t>>(f_feats_test);
auto labels_train = some<CRegressionLabels>(f_labels_train);
auto labels_test = some<CRegressionLabels>(f_labels_test);

Choose an appropriate CKernel and instantiate it. Here we use a CGaussianKernel.

Python:

width = 1.0
kernel = GaussianKernel(width)

Octave:

width = 1.0;
kernel = GaussianKernel(width);

Java:

double width = 1.0;
GaussianKernel kernel = new GaussianKernel(width);

Ruby:

width = 1.0
kernel = Modshogun::GaussianKernel.new width

R:

width <- 1.0
kernel <- GaussianKernel(width)

Lua:

width = 1.0
kernel = modshogun.GaussianKernel(width)

C#:

double width = 1.0;
GaussianKernel kernel = new GaussianKernel(width);

C++:

auto width = 1.0;
auto kernel = some<CGaussianKernel>(width);
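As a sanity check on what the width parameter does, Shogun's Gaussian kernel is, to the best of our knowledge, parameterised as \(k({\bf x},{\bf x}')=\exp(-\|{\bf x}-{\bf x}'\|^2/\tau)\) with \(\tau\) the width. A plain NumPy sketch of that assumed parameterisation:

```python
import numpy as np

def gaussian_kernel(x, y, width=1.0):
    # Assumed Shogun-style parameterisation: k(x, y) = exp(-||x - y||^2 / width)
    return np.exp(-np.sum((np.asarray(x) - np.asarray(y)) ** 2) / width)

# Identical points always give 1; a larger width decays more slowly
print(gaussian_kernel([0.0], [0.0]))
print(gaussian_kernel([0.0], [1.0], width=1.0))
print(gaussian_kernel([0.0], [1.0], width=4.0))
```

Larger widths make distant training points contribute more to the prediction, giving a smoother regression function.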

We create an instance of the CLibSVR regression machine by passing it the regularization constant \(C\), the tube parameter (\(\epsilon\) for \(\epsilon\)-SVR, \(\nu\) for \(\nu\)-SVR), the kernel, the training labels, and the solver type. More solver types are available in CLibSVR. See [CL02] for more details.

Python:

svm_c = 1.0
svr_param = 0.1
svr = LibSVR(svm_c, svr_param, kernel, labels_train, LIBSVR_EPSILON_SVR)

Octave:

svm_c = 1.0;
svr_param = 0.1;
svr = LibSVR(svm_c, svr_param, kernel, labels_train, LIBSVR_EPSILON_SVR);

Java:

double svm_c = 1.0;
double svr_param = 0.1;
LibSVR svr = new LibSVR(svm_c, svr_param, kernel, labels_train, LIBSVR_SOLVER_TYPE.LIBSVR_EPSILON_SVR);

Ruby:

svm_c = 1.0
svr_param = 0.1
svr = Modshogun::LibSVR.new svm_c, svr_param, kernel, labels_train, Modshogun::LIBSVR_EPSILON_SVR

R:

svm_c <- 1.0
svr_param <- 0.1
svr <- LibSVR(svm_c, svr_param, kernel, labels_train, "LIBSVR_EPSILON_SVR")

Lua:

svm_c = 1.0
svr_param = 0.1
svr = modshogun.LibSVR(svm_c, svr_param, kernel, labels_train, modshogun.LIBSVR_EPSILON_SVR)

C#:

double svm_c = 1.0;
double svr_param = 0.1;
LibSVR svr = new LibSVR(svm_c, svr_param, kernel, labels_train, LIBSVR_SOLVER_TYPE.LIBSVR_EPSILON_SVR);

C++:

auto svm_c = 1.0;
auto svr_param = 0.1;
auto svr = some<CLibSVR>(svm_c, svr_param, kernel, labels_train, LIBSVR_SOLVER_TYPE::LIBSVR_EPSILON_SVR);
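For \(\epsilon\)-SVR, the tube parameter is the \(\epsilon\) of the \(\epsilon\)-insensitive loss: residuals smaller than \(\epsilon\) incur no penalty, and only points outside the tube become support vectors. A small illustrative sketch of that loss (not part of Shogun's API):

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    # max(0, |y - f(x)| - eps): errors inside the eps-tube are ignored
    return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

y_true = np.array([1.0, 1.0, 1.0])
y_pred = np.array([1.05, 1.10, 1.50])
print(eps_insensitive_loss(y_true, y_pred, eps=0.1))
```

The first two predictions lie inside the tube and contribute (essentially) zero loss; only the third is penalised, by the amount it exceeds the tube.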

Then we train the regression model and apply it to test data to get the predicted CRegressionLabels.

Python:

svr.train(features_train)
labels_predict = svr.apply_regression(features_test)

Octave:

svr.train(features_train);
labels_predict = svr.apply_regression(features_test);

Java:

svr.train(features_train);
RegressionLabels labels_predict = svr.apply_regression(features_test);

Ruby:

svr.train features_train
labels_predict = svr.apply_regression features_test

R:

svr$train(features_train)
labels_predict <- svr$apply_regression(features_test)

Lua:

svr:train(features_train)
labels_predict = svr:apply_regression(features_test)

C#:

svr.train(features_train);
RegressionLabels labels_predict = svr.apply_regression(features_test);

C++:

svr->train(features_train);
auto labels_predict = svr->apply_regression(features_test);

After training, we can extract the learned coefficients \(\alpha\).

Python:

alpha = svr.get_alphas()

Octave:

alpha = svr.get_alphas();

Java:

DoubleMatrix alpha = svr.get_alphas();

Ruby:

alpha = svr.get_alphas

R:

alpha <- svr$get_alphas()

Lua:

alpha = svr:get_alphas()

C#:

double[] alpha = svr.get_alphas();

C++:

auto alpha = svr->get_alphas();

Finally, we can evaluate the predictions via mean squared error using CMeanSquaredError.

Python:

eval = MeanSquaredError()
mse = eval.evaluate(labels_predict, labels_test)

Octave:

eval = MeanSquaredError();
mse = eval.evaluate(labels_predict, labels_test);

Java:

MeanSquaredError eval = new MeanSquaredError();
double mse = eval.evaluate(labels_predict, labels_test);

Ruby:

eval = Modshogun::MeanSquaredError.new
mse = eval.evaluate labels_predict, labels_test

R:

eval <- MeanSquaredError()
mse <- eval$evaluate(labels_predict, labels_test)

Lua:

eval = modshogun.MeanSquaredError()
mse = eval:evaluate(labels_predict, labels_test)

C#:

MeanSquaredError eval = new MeanSquaredError();
double mse = eval.evaluate(labels_predict, labels_test);

C++:

auto eval = some<CMeanSquaredError>();
auto mse = eval->evaluate(labels_predict, labels_test);
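For reference, mean squared error is simply the average of the squared residuals; a plain NumPy equivalent of this evaluation step (independent of Shogun, with made-up values):

```python
import numpy as np

def mean_squared_error(y_pred, y_true):
    # MSE = (1/N) * sum_i (y_pred_i - y_true_i)^2
    return np.mean((np.asarray(y_pred) - np.asarray(y_true)) ** 2)

# Residuals are 0.5, 0.0, and 1.0, so the MSE is (0.25 + 0 + 1) / 3
print(mean_squared_error([1.5, 2.0, 2.0], [1.0, 2.0, 3.0]))
```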

References

Wikipedia: Support_vector_machine

[CL02] C.C. Chang and C.J. Lin. Training ν-support vector regression: theory and algorithms. Neural Computation, 14(8):1959–1977, 2002.
[CL11] C.C. Chang and C.J. Lin. LIBSVM: a library for support vector machines. ACM Transactions on Intelligent Systems and Technology, 2(3):27, 2011.
[ScholkopfS02] B. Schölkopf and A.J. Smola. Learning with kernels: support vector machines, regularization, optimization, and beyond. MIT Press, 2002.