Machine Learning - II. Linear Regression with One Variable (Week 1)

http://blog.csdn.net/pipisorry/article/details/43115525

Machine Learning - study notes for Andrew NG's courses

Linear regression with one variable

Model representation

Example:

This is a regression problem (one kind of supervised learning), and specifically univariate linear regression (linear regression with one variable).

Notation (terminology):

m = Number of training examples

x’s = “input” variable  /  features

y’s = “output” variable  /  “target” variable

e.g. (x, y) denotes one training example, while (x^(i), y^(i)) denotes the i-th training example.
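
As a minimal Python sketch of this notation (the numbers are made up for illustration, not the lecture's actual dataset):

```python
# Toy training set: x = house size (sq ft), y = price (in $1000s).
x = [2104, 1416, 1534, 852]   # "input" variable / feature
y = [460, 232, 315, 178]      # "output" / "target" variable

m = len(x)                    # m = number of training examples -> 4

# (x, y) is one training example; (x^(i), y^(i)) is the i-th one.
# The course uses 1-based indexing, so the 2nd training example is:
i = 2
print((x[i - 1], y[i - 1]))   # -> (1416, 232)
```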

h stands for the hypothesis: h maps from x's to y's (that is, h is the function we solve for that predicts y from x).
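
As a sketch, the hypothesis is just a function of one variable; the parameter values below are arbitrary placeholders, not fitted values:

```python
def h(x, theta0, theta1):
    """Hypothesis for univariate linear regression: h_theta(x) = theta0 + theta1 * x."""
    return theta0 + theta1 * x

# E.g. predict the price of a 1000 sq-ft house with theta0 = 50, theta1 = 0.1:
print(h(1000, 50, 0.1))  # -> 150.0
```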

Cost function

In the previous example, h takes the linear form

h_θ(x) = θ_0 + θ_1 · x

and what we need to figure out is how to go about choosing the parameter values θ_0 and θ_1: we try to minimize the squared difference between the output of the hypothesis and the actual price of the house.

Define this function as (the J function is one kind of cost function):

J(θ_0, θ_1) = (1/(2m)) · Σ_{i=1}^{m} ( h_θ(x^(i)) − y^(i) )^2
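
A direct transcription of this formula into Python (the course itself uses Octave; this sketch just reuses the toy x, y lists from above):

```python
def cost(theta0, theta1, x, y):
    """Squared error cost: J(theta0, theta1) = (1/(2m)) * sum((h(x_i) - y_i)^2)."""
    m = len(x)
    squared_errors = sum((theta0 + theta1 * xi - yi) ** 2 for xi, yi in zip(x, y))
    return squared_errors / (2 * m)

print(cost(0, 0, x, y))    # J with theta0 = 0, theta1 = 0
print(cost(0, 0.2, x, y))  # a different choice of parameters gives a different J
```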

Why do we divide the sum by 2m?

We minimize (1/(2m)) times the sum of squared errors. Dividing by m averages the error over the training set, and putting the extra constant one-half in front just makes some of the math a little easier: the 2 cancels against the exponent when derivatives are taken later for gradient descent, without changing which values of θ_0 and θ_1 minimize J.

Why do we take the squares of the errors?

It turns out that the squared error cost function is a reasonable choice and works well for most regression problems. There are other cost functions that would work pretty well, but the squared error cost function is probably the most commonly used one for regression problems.
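
As a crude illustration of "choose θ_0, θ_1 to minimize J" (continuing with the cost, x, y defined above), a brute-force grid scan already finds a reasonable pair; real implementations use gradient descent or the normal equation instead:

```python
# Scan a small grid of candidate (theta0, theta1) pairs and keep the best one.
candidates = [(t0, t1 / 100.0)
              for t0 in range(0, 101, 10)
              for t1 in range(0, 31)]
best = min(candidates, key=lambda p: cost(p[0], p[1], x, y))
print(best, cost(best[0], best[1], x, y))  # parameters with the lowest J on the grid
```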

from:http://blog.csdn.net/pipisorry/article/details/43115525
