Machine Learning - XII. Support Vector Machines (Week 7)

http://blog.csdn.net/pipisorry/article/details/44522881

Study notes on Andrew Ng's Machine Learning course

Support Vector Machines

{SVMs sometimes give a cleaner and more powerful way of learning complex nonlinear functions}

Optimization Objective

Alternative view of logistic regression

Note:

1. Line colors: the blue curve is the logistic regression cost, and the purple/magenta curve is the SVM cost. For the SVM we replace the blue curve with the purple and magenta one. It does something pretty similar to logistic regression, but it turns out this gives the SVM a computational advantage: an easier optimization problem, which later on will be easier to solve. (The two per-example costs are sketched just below.)
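For reference, here are the per-example costs being compared, with z denoting theta-transpose-x. The lecture leaves the slope of the straight segments of the magenta curves unspecified; taking it to be 1 gives the familiar hinge form, so treat that constant as an assumption:

```latex
% z = \theta^T x.
% Logistic regression per-example costs (blue curves):
%   y = 1: \; -\log\frac{1}{1 + e^{-z}}
%   y = 0: \; -\log\Big(1 - \frac{1}{1 + e^{-z}}\Big)
% SVM replacements (magenta curves), flat-then-linear
% (slope of the linear piece assumed to be 1):
\mathrm{cost}_1(z) = \max(0,\, 1 - z) \quad\text{(zero for } z \ge 1\text{)}
\mathrm{cost}_0(z) = \max(0,\, 1 + z) \quad\text{(zero for } z \le -1\text{)}
```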

Cost function for SVM

Note:

1. The objective has two terms: the first, call it A, is the cost that comes from the training set; the second, call it B, is the regularization term (without lambda).

2. By setting different values for the regularization parameter lambda, we trade off how much we want to fit the training set well (minimizing A) against how much we care about keeping the parameter values small (minimizing B).

3. The SVM uses a different parameter C and instead minimizes C times A plus B, where C plays a role similar to 1/lambda.

So for logistic regression, using a very large value of lambda gives B a very high weight; correspondingly, setting C to a very small value in the SVM gives B a much larger weight than A. Both objectives are written out below.
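Reconstructed from the definitions of A, B, lambda, and C in the notes above, with cost_1 and cost_0 as sketched earlier:

```latex
% Regularized logistic regression (weights A with 1, B with \lambda):
\min_\theta\; \frac{1}{m}\sum_{i=1}^{m}\Big[-y^{(i)}\log h_\theta(x^{(i)})
  - (1-y^{(i)})\log\big(1 - h_\theta(x^{(i)})\big)\Big]
  \;+\; \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

% SVM (weights A with C, B with 1; C plays the role of 1/\lambda):
\min_\theta\; C\sum_{i=1}^{m}\Big[y^{(i)}\,\mathrm{cost}_1(\theta^T x^{(i)})
  + (1-y^{(i)})\,\mathrm{cost}_0(\theta^T x^{(i)})\Big]
  \;+\; \frac{1}{2}\sum_{j=1}^{n}\theta_j^2
```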

Hypothesis for SVM

Note:

1. The SVM does not output a probability. Having learned the parameters theta, it predicts 1 if theta-transpose-x is greater than or equal to 0 (and 0 otherwise); this is the form of the hypothesis for the support vector machine.

Intuition: the logistic regression hypothesis is a probability, so applying a log to it yields the blue cost curves; in the SVM the cost is the magenta curve instead, and no expression for a hypothesis H can be recovered from setting the magenta curve equal to log(H). A minimal code sketch of the decision rule follows.
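A Python illustration of this hypothesis (the function name and variables are mine, not from the course):

```python
import numpy as np

def predict(theta: np.ndarray, x: np.ndarray) -> int:
    """SVM hypothesis: predict 1 if theta^T x >= 0, else 0.

    Unlike logistic regression, no probability is produced;
    the output is the class label directly.
    """
    return 1 if theta @ x >= 0 else 0

# theta^T x = (-1)(1) + (2)(1) = 1 >= 0, so the prediction is 1
print(predict(np.array([-1.0, 2.0]), np.array([1.0, 1.0])))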

Large Margin Intuition

Mathematics Behind Large Margin Classification (Optional)

Kernels

Using An SVM
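The course advises using an off-the-shelf SVM package (e.g. liblinear or libsvm) rather than writing your own optimizer. As a hedged illustration only, here is a minimal sketch using scikit-learn's SVC with an RBF (Gaussian) kernel; the dataset and parameter values are my choices, not something the notes specify:

```python
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

# Toy nonlinear dataset; an RBF (Gaussian) kernel handles it well.
X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C plays the role of 1/lambda: larger C fits the training set harder.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```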

from:http://blog.csdn.net/pipisorry/article/details/44522881
