{"id":76,"date":"2022-07-14T01:51:33","date_gmt":"2022-07-14T01:51:33","guid":{"rendered":"https:\/\/www.svms.org\/?p=76"},"modified":"2022-07-14T01:51:33","modified_gmt":"2022-07-14T01:51:33","slug":"regression","status":"publish","type":"post","link":"https:\/\/www.svms.org\/regression\/","title":{"rendered":"Regression"},"content":{"rendered":"

<h1>Support Vector Machines for Regression<\/h1>\n


“The Support Vector method can also be applied to the case of regression, maintaining all the main features that characterise the maximal margin algorithm: a non-linear function is learned by a linear learning machine in a kernel-induced feature space while the capacity of the system is controlled by a parameter that does not depend on the dimensionality of the space.”
\nCristianini and Shawe-Taylor (2000)<\/p>\n

“In SVM the basic idea is to map the data x into a high-dimensional feature space F via a nonlinear mapping Φ, and to do linear regression in this space (cf. Boser et al. (1992); Vapnik (1995)).”<\/p>\n

Müller et al.<\/p>\n
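The quotes above describe the core idea: map the data into a feature space via a nonlinear map and fit a linear model there. A minimal sketch of just that feature-space step is below, using an explicit polynomial feature map and ordinary least squares in place of a full SVR (a real SVR would use the epsilon-insensitive loss and a kernel; the map `phi` and all names here are illustrative assumptions, not code from the cited papers).<\/p>\n

```python
import numpy as np

# Toy illustration of "nonlinear map, then linear regression in feature space".
# NOTE: this least-squares sketch omits the epsilon-insensitive loss and the
# kernel trick that define a true Support Vector Regression.

rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 50)
y = x**2 + 0.1 * rng.standard_normal(50)  # quadratic target with small noise

def phi(x):
    """Hypothetical feature map: phi(x) = (1, x, x^2)."""
    return np.column_stack([np.ones_like(x), x, x**2])

# Linear regression in the feature space F = phi(x):
# the fitted model is nonlinear in x but linear in the features.
w, *_ = np.linalg.lstsq(phi(x), y, rcond=None)
y_hat = phi(x) @ w
```

Because the target is quadratic, the weight on the x² feature should come out close to 1, showing that a linear fit in the mapped space captures a nonlinear relation in the input space.<\/p>\n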