Regularized Linear Regression
Model
\[{h_\theta }\left( x \right) = {\theta _0} + {\theta _1}x + {\theta _2}{x^2} + {\theta _3}{x^3} + {\theta _4}{x^4}\]
\[J\left( \theta \right) = \frac{1}{{2m}}\left[ {\sum\limits_{i = 1}^m {{{\left( {{h_\theta }\left( {{x^{\left( i \right)}}} \right) - {y^{\left( i \right)}}} \right)}^2}} + \lambda \sum\limits_{j = 1}^n {\theta _j^2} } \right]\]
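A minimal NumPy sketch of this regularized cost (the function name `regularized_cost` and the assumption that `X` already contains the polynomial features [1, x, x², x³, x⁴] are mine, not from the original post):

```python
import numpy as np

def regularized_cost(theta, X, y, lam):
    """Regularized linear-regression cost J(theta).

    Assumes X already holds the polynomial features [1, x, x^2, x^3, x^4];
    theta[0] is the intercept and is not penalized, matching the sum
    over j = 1..n in the formula above.
    """
    m = len(y)
    residuals = X @ theta - y                 # h_theta(x^(i)) - y^(i)
    squared_error = residuals @ residuals     # sum of squared residuals
    penalty = lam * np.sum(theta[1:] ** 2)    # lambda * sum_{j>=1} theta_j^2
    return (squared_error + penalty) / (2 * m)
```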
When the regularization parameter λ is very large,
\[{h_\theta }\left( x \right) \approx {\theta _0}\]
and the model is in the "high bias" (underfitting) regime.
When the regularization parameter is very small (λ = 0),
\[{h_\theta }\left( x \right) = {\theta _0} + {\theta _1}x + {\theta _2}{x^2} + {\theta _3}{x^3} + {\theta _4}{x^4}\]
and the model is in the "high variance" (overfitting) regime.
When λ is chosen appropriately,
the model is "just right".
How do we choose the right λ?
In addition to the two formulas below,
\[{h_\theta }\left( x \right) = {\theta _0} + {\theta _1}x + {\theta _2}{x^2} + {\theta _3}{x^3} + {\theta _4}{x^4}\]
\[J\left( \theta \right) = \frac{1}{{2m}}\left[ {\sum\limits_{i = 1}^m {{{\left( {{h_\theta }\left( {{x^{\left( i \right)}}} \right) - {y^{\left( i \right)}}} \right)}^2}} + \lambda \sum\limits_{j = 1}^n {\theta _j^2} } \right]\]
we further define
\[\begin{array}{l}
{J_{train}}\left( \theta \right) = \frac{1}{{2{m_{train}}}}\sum\limits_{i = 1}^{{m_{train}}} {{{\left( {{h_\theta }\left( {x_{train}^{\left( i \right)}} \right) - y_{train}^{\left( i \right)}} \right)}^2}} \\
{J_{CV}}\left( \theta \right) = \frac{1}{{2{m_{CV}}}}\sum\limits_{i = 1}^{{m_{CV}}} {{{\left( {{h_\theta }\left( {x_{CV}^{\left( i \right)}} \right) - y_{CV}^{\left( i \right)}} \right)}^2}} \\
{J_{test}}\left( \theta \right) = \frac{1}{{2{m_{test}}}}\sum\limits_{i = 1}^{{m_{test}}} {{{\left( {{h_\theta }\left( {x_{test}^{\left( i \right)}} \right) - y_{test}^{\left( i \right)}} \right)}^2}}
\end{array}\]
which denote the "training error", "cross-validation error", and "test error" respectively.
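All three errors share the same form and contain no λ term. A small sketch (the split names `X_train`, `X_cv`, `X_test` are hypothetical):

```python
import numpy as np

def split_error(theta, X, y):
    """Unregularized squared error used for J_train, J_CV and J_test.

    Note there is no lambda penalty here: regularization belongs to the
    training objective only, not to the evaluation errors.
    """
    m = len(y)
    residuals = X @ theta - y
    return (residuals @ residuals) / (2 * m)

# Hypothetical usage on pre-split feature matrices and targets:
# J_train = split_error(theta, X_train, y_train)
# J_cv    = split_error(theta, X_cv,    y_cv)
# J_test  = split_error(theta, X_test,  y_test)
```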
Choosing λ
Try the following values of λ:
- λ=0----------->minJ(θ)----->Θ(1)------>JCV(Θ(1))
- λ=0.01------->minJ(θ)----->Θ(2)------>JCV(Θ(2))
- λ=0.02------->minJ(θ)----->Θ(3)------>JCV(Θ(3))
- λ=0.04------->minJ(θ)----->Θ(4)------>JCV(Θ(4))
- .
- .
- .
- λ=10--------->minJ(θ)----->Θ(12)------>JCV(Θ(12))
Minimize the cost function with each λ to obtain a different Θ;
plug each Θ into h(x) to get a different model, then evaluate it on the cross-validation set;
pick the model whose cross-validation error is smallest;
finally, apply the chosen model to the test set to measure its performance.
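A sketch of this selection loop, assuming a normal-equation solver for the "min J(θ)" step and a roughly doubling candidate list (the original post only names the steps, not the optimizer):

```python
import numpy as np

# Candidate values, roughly doubling from 0.01 up to 10 as in the list above.
lambdas = [0, 0.01, 0.02, 0.04, 0.08, 0.16, 0.32, 0.64, 1.28, 2.56, 5.12, 10]

def fit_theta(X, y, lam):
    """Minimize the regularized cost J(theta) via the normal equation."""
    n = X.shape[1]
    L = np.eye(n)
    L[0, 0] = 0                                   # theta_0 is not penalized
    return np.linalg.pinv(X.T @ X + lam * L) @ (X.T @ y)

def squared_error(theta, X, y):
    """Unregularized error, used here as J_CV and J_test."""
    r = X @ theta - y
    return (r @ r) / (2 * len(y))

def select_lambda(X_train, y_train, X_cv, y_cv, X_test, y_test):
    # Fit Theta^(k) for every candidate lambda, keep the one with lowest J_CV.
    fitted = [(lam, fit_theta(X_train, y_train, lam)) for lam in lambdas]
    best_lam, best_theta = min(fitted, key=lambda p: squared_error(p[1], X_cv, y_cv))
    j_test = squared_error(best_theta, X_test, y_test)   # final generalization check
    return best_lam, best_theta, j_test
```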
Original post: https://www.cnblogs.com/qkloveslife/p/9885500.html