Concept of Support Vector Regression (SVR) | SVM | Mathematics | Machine Learning




SVM stands for Support Vector Machine, a well-known classification and regression algorithm. Today we will focus on Support Vector Regression. So, let's begin...

Look at this graph...

Suppose you have to classify these elements; how would you do it?

You would do it something like this...


Now, let's suppose you have this graph...

Here, no simple straight line can separate the classes, so we add one more axis, i.e. a z-axis.



Viewed from above, it will look like this...


The task we performed in the previous graphs is exactly what SVM does. Support Vector Machine is a supervised machine learning algorithm used for classification and regression challenges.
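To see this in action, here is a minimal sketch (assuming scikit-learn is installed, and using its make_circles toy dataset purely for illustration); the RBF kernel plays the role of the extra z-axis we added above:

```python
# A minimal sketch (assuming scikit-learn is installed): data that is not
# separable by a straight line in 2-D becomes separable once a kernel
# implicitly lifts it into a higher-dimensional space.
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric rings -- no straight line in 2-D can separate them.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

# A linear SVM struggles, while an RBF-kernel SVM separates the rings easily.
linear_clf = SVC(kernel="linear").fit(X, y)
rbf_clf = SVC(kernel="rbf").fit(X, y)

print("Linear kernel accuracy:", linear_clf.score(X, y))
print("RBF kernel accuracy:   ", rbf_clf.score(X, y))
```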

Terminologies:


There are a few terms to learn before going further...

Hyperplane: This is the line (or plane, in higher dimensions) we draw to separate the data classes in SVM. In Support Vector Regression, this line is used to predict the continuous output.

Kernels: A kernel helps us find a hyperplane in a higher-dimensional space. In the SVM algorithm we work in an n-dimensional space, one dimension per feature. When we cannot find a separating hyperplane in the given dimension, we increase the dimension, and the kernel helps us do this.

Decision Boundary: These are the boundary lines on either side of the middle hyperplane; on one side lie only positive points and on the other side only negative points.

Support Vectors: Support vectors are the points that lie on the decision boundary. A quick way to inspect them is shown in the sketch below.
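As a small illustration of these terms (a minimal sketch, assuming scikit-learn and its make_blobs helper), a fitted SVM exposes exactly which training points ended up as support vectors:

```python
# A minimal sketch (assuming scikit-learn): after fitting an SVM, the points
# that define the decision boundary are exposed as support vectors.
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# A simple two-class toy dataset.
X, y = make_blobs(n_samples=100, centers=2, random_state=0)

clf = SVC(kernel="linear", C=1.0).fit(X, y)

# Only these points determine the position of the hyperplane.
print("Support vectors per class:", clf.n_support_)
print("Support vectors:\n", clf.support_vectors_)
```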

So let's move on to Support Vector Regression (SVR).

Support Vector Regression (SVR)

SVM is a machine learning algorithm that can solve classification, regression and clustering problems. The variant of SVM that solves the regression problem, i.e. predicting a continuous output, is called Support Vector Regression. An advantage of SVR is that it lets us define how much error is acceptable in our model; it will then find a hyperplane that fits our data within that tolerance.

In Support Vector Regression we minimize the coefficients while constraining the absolute error to be less than or equal to a specified margin, called the maximum error and denoted by epsilon (𝛜).
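For reference, the basic (hard-margin) form of this optimization is usually written as:

$$\min_{w,\,b}\ \tfrac{1}{2}\lVert w\rVert^{2} \quad \text{subject to} \quad \lvert\, y_i - (w \cdot x_i + b)\,\rvert \le \epsilon \ \ \text{for every training point } (x_i, y_i)$$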




So, according to the constraint, we only accept points that fall inside the boundary, i.e. −𝛜 ≤ y − (wx + b) ≤ +𝛜

This is the most basic explanation of the mathematics behind SVR. There are, of course, additional challenges with SVR, but for now the basics are enough; we will cover those cases as we go further through the tutorial.

Hence, SVR is a very powerful algorithm that allows us to choose an acceptable error margin (𝛜), which we can further tune to give the best results.
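As a small sketch of how this looks in practice (assuming scikit-learn's SVR class and a made-up noisy sine-wave dataset used only for illustration), the epsilon parameter below is exactly the acceptable error margin 𝛜 described above:

```python
# A minimal sketch (assuming scikit-learn and NumPy are installed): the
# `epsilon` argument of SVR is the acceptable error margin discussed above.
import numpy as np
from sklearn.svm import SVR

# Toy 1-D regression data: a noisy sine wave.
rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(80, 1), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.randn(80)

# A wider epsilon tolerates more error and typically needs fewer support vectors.
for eps in (0.01, 0.1, 0.5):
    model = SVR(kernel="rbf", C=1.0, epsilon=eps).fit(X, y)
    print(f"epsilon={eps}: {len(model.support_)} support vectors")
```

Notice how, as 𝛜 grows, more points fall inside the tolerated margin and the model relies on fewer support vectors; this is the tuning knob SVR gives us.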












