
Scala for Machine Learning, Second Edition

In the Binomial classification section of Chapter 9, Regression and Regularization, you learned the concept of hyperplanes that segregate observations into two classes. These hyperplanes are also known as linear decision boundaries. In the case of logistic regression, the dataset must be linearly separable. This constraint is a particular issue for problems with many nonlinearly dependent features (high-dimensional models).
Support vector machines (SVMs) overcome this limitation by estimating the optimal separating hyperplane using kernel functions.
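To make the idea of a kernel function concrete, here is a minimal Scala sketch (the trait and class names are hypothetical, not the book's actual API): a kernel k(x, y) computes an inner product in an implicit higher-dimensional feature space without ever materializing that space, which is what lets an SVM draw a nonlinear boundary in the original space.

```scala
// Hypothetical kernel abstraction for illustration only.
trait Kernel {
  def apply(x: Array[Double], y: Array[Double]): Double
}

// Radial basis function (Gaussian) kernel: k(x, y) = exp(-gamma * ||x - y||^2)
final class RbfKernel(gamma: Double) extends Kernel {
  def apply(x: Array[Double], y: Array[Double]): Double = {
    require(x.length == y.length, "Vectors must have the same dimension")
    // Squared Euclidean distance between the two observations
    val sqDist = x.zip(y).map { case (a, b) => (a - b) * (a - b) }.sum
    math.exp(-gamma * sqDist)
  }
}

object KernelDemo extends App {
  val k = new RbfKernel(gamma = 0.5)
  // Identical observations have maximum similarity (1.0);
  // similarity decays toward 0.0 as observations move apart.
  println(k(Array(1.0, 0.0), Array(1.0, 0.0)))
  println(k(Array(1.0, 0.0), Array(0.0, 1.0)))
}
```

The choice of kernel (and of a parameter such as gamma here) determines the shape of the decision boundary the SVM can learn.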
This chapter introduces kernel functions, binary support vector classifiers, one-class SVMs for anomaly detection, and support vector regression.
In this chapter, you will answer the following questions: