OpenCV Tutorials: Support Vector Machines for Non-Linearly Separable Data

Similar to the previous post on linearly separable data, except that here the training data is not fully separable: a fraction of the samples from both classes falls into an overlapping band in the middle of the feature space, so the SVM needs a soft margin (a small C) to tolerate misclassified points.

#include "stdafx.h"

#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/ml/ml.hpp>

#define NTRAINING_SAMPLES   100         // Number of training samples per class
#define FRAC_LINEAR_SEP     0.9f        // Fraction of samples that make up the linearly separable part

using namespace cv;
using namespace std;

int main()
{
	// Data for visual representation
	const int WIDTH = 512, HEIGHT = 512;
	Mat I = Mat::zeros(HEIGHT, WIDTH, CV_8UC3);

	//--------------------- 1. Set up training data randomly ---------------------------------------
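	// Each row of trainData is one 2-D sample (x, y); labels stores the
	// class (1 or 2) of the corresponding row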
	Mat trainData(2*NTRAINING_SAMPLES, 2, CV_32FC1);
	Mat labels   (2*NTRAINING_SAMPLES, 1, CV_32FC1);

	RNG rng(100); // Random value generation class

	// Set up the linearly separable part of the training data
	int nLinearSamples = (int) (FRAC_LINEAR_SEP * NTRAINING_SAMPLES);
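	// With FRAC_LINEAR_SEP = 0.9, 90 of the 100 samples per class are linearly
	// separable; the remaining 10 per class fall in the overlapping middle band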

	// Generate random points for class 1
	Mat trainClass = trainData.rowRange(0, nLinearSamples);
	// The x coordinate of the points is in [1, 0.4 * WIDTH)
	Mat c = trainClass.colRange(0, 1);
	rng.fill(c, RNG::UNIFORM, Scalar(1), Scalar(0.4 * WIDTH));
	// The y coordinate of the points is in [1, HEIGHT)
	c = trainClass.colRange(1, 2);
	rng.fill(c, RNG::UNIFORM, Scalar(1), Scalar(HEIGHT));

	// Generate random points for class 2
	trainClass = trainData.rowRange(2*NTRAINING_SAMPLES - nLinearSamples, 2*NTRAINING_SAMPLES);
	// The x coordinate of the points is in [0.6 * WIDTH, WIDTH)
	c = trainClass.colRange(0, 1);
	rng.fill(c, RNG::UNIFORM, Scalar(0.6 * WIDTH), Scalar(WIDTH));
	// The y coordinate of the points is in [1, HEIGHT)
	c = trainClass.colRange(1, 2);
	rng.fill(c, RNG::UNIFORM, Scalar(1), Scalar(HEIGHT));

	//------------------ Set up the non-linearly separable part of the training data ---------------

	// Generate random points for classes 1 and 2 that share the overlapping middle band
	trainClass = trainData.rowRange(nLinearSamples, 2*NTRAINING_SAMPLES - nLinearSamples);
	// The x coordinate of the points is in [0.4 * WIDTH, 0.6 * WIDTH)
	c = trainClass.colRange(0, 1);
	rng.fill(c, RNG::UNIFORM, Scalar(0.4 * WIDTH), Scalar(0.6 * WIDTH));
	// The y coordinate of the points is in [1, HEIGHT)
	c = trainClass.colRange(1, 2);
	rng.fill(c, RNG::UNIFORM, Scalar(1), Scalar(HEIGHT));

	//------------------------- Set up the labels for the classes ---------------------------------
	labels.rowRange(                0,   NTRAINING_SAMPLES).setTo(1);  // Class 1
	labels.rowRange(NTRAINING_SAMPLES, 2*NTRAINING_SAMPLES).setTo(2);  // Class 2

	//------------------------ 2. Set up the support vector machines parameters --------------------
	CvSVMParams params;
	params.svm_type    = SVM::C_SVC;
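	// A small C widens the margin but tolerates some misclassified training
	// points, which suits this data since the two classes overlap in the middle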
	params.C           = 0.1;
	params.kernel_type = SVM::LINEAR;
	params.term_crit   = TermCriteria(CV_TERMCRIT_ITER, (int)1e7, 1e-6);

	//------------------------ 3. Train the svm ----------------------------------------------------
	cout << "Starting training process" << endl;
	CvSVM svm;
	svm.train(trainData, labels, Mat(), Mat(), params);
	cout << "Finished training process" << endl;

	//------------------------ 4. Show the decision regions ----------------------------------------
	Vec3b green(0,100,0), blue (100,0,0);
	for (int i = 0; i < I.rows; ++i)
		for (int j = 0; j < I.cols; ++j)
		{
			// Classify the pixel at (row i, col j); the sample is (x, y) = (j, i)
			Mat sampleMat = (Mat_<float>(1,2) << j, i);
			float response = svm.predict(sampleMat);

			if      (response == 1)    I.at<Vec3b>(i, j) = green;
			else if (response == 2)    I.at<Vec3b>(i, j) = blue;
		}

	//----------------------- 5. Show the training data --------------------------------------------
	int thick = -1;
	int lineType = 8;
	float px, py;
	// Class 1 (green dots)
	for (int i = 0; i < NTRAINING_SAMPLES; ++i)
	{
		px = trainData.at<float>(i, 0);
		py = trainData.at<float>(i, 1);
		circle(I, Point((int) px, (int) py), 3, Scalar(0, 255, 0), thick, lineType);
	}
	// Class 2 (blue dots)
	for (int i = NTRAINING_SAMPLES; i < 2*NTRAINING_SAMPLES; ++i)
	{
		px = trainData.at<float>(i, 0);
		py = trainData.at<float>(i, 1);
		circle(I, Point((int) px, (int) py), 3, Scalar(255, 0, 0), thick, lineType);
	}

	//------------------------- 6. Show support vectors --------------------------------------------
	thick = 2;
	lineType = 8;
	int nSupportVectors = svm.get_support_vector_count();

	// Ring the support vectors in gray
	for (int i = 0; i < nSupportVectors; ++i)
	{
		const float* v = svm.get_support_vector(i);
		circle(I, Point((int) v[0], (int) v[1]), 6, Scalar(128, 128, 128), thick, lineType);
	}

	imwrite("result.png", I);                      // save the image
	imshow("SVM for Non-Linear Training Data", I); // show it to the user
	waitKey(0);
	return 0;
}
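
Note: the listing above uses the old OpenCV 2.x API (CvSVM / CvSVMParams), which was removed in OpenCV 3.0. As a rough sketch of the same training step under the newer cv::ml module (assuming OpenCV >= 3.0; the function name trainWithMlModule below is illustrative), keep in mind that cv::ml requires integer class labels (CV_32S) rather than float:

#include <opencv2/core.hpp>
#include <opencv2/ml.hpp>

using namespace cv;

// Sketch: train the same soft-margin linear SVM with the cv::ml API.
// trainData is the 2*NTRAINING_SAMPLES x 2 CV_32FC1 matrix built above;
// labelsInt holds the class of each row as CV_32SC1 (1 or 2).
void trainWithMlModule(const Mat& trainData, const Mat& labelsInt)
{
	Ptr<ml::SVM> svm = ml::SVM::create();
	svm->setType(ml::SVM::C_SVC);       // soft-margin classifier
	svm->setC(0.1);                     // same small C as in the listing
	svm->setKernel(ml::SVM::LINEAR);
	svm->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER, (int)1e7, 1e-6));

	svm->train(trainData, ml::ROW_SAMPLE, labelsInt);

	// Prediction takes the same 1x2 CV_32F sample as before:
	// float response = svm->predict(sampleMat);
}

In the newer API the support vectors come back as the rows of svm->getSupportVectors() instead of through get_support_vector_count() / get_support_vector().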