Coursera - Machine Learning, Stanford: Week 5

Overview

  • Cost Function and Backpropagation

    • Cost Function
    • Backpropagation Algorithm
    • Backpropagation Intuition
  • Backpropagation in Practice
    • Implementation Note: Unrolling Parameters
    • Gradient Checking
    • Random Initialization
    • Putting It Together
  • Application of Neural Networks
    • Autonomous Driving
  • Review

Log

  • 2/10/2017: watched all the videos; puzzled about backpropagation
  • 2/11/2017: reviewed backpropagation

Reading

Note

  • Backpropagation Algorithm

    • "Backpropagation" is neural-network terminology for minimizing our cost function, just like what we were doing with gradient descent in logistic and linear regression.
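The idea above (backpropagation computes the gradients that gradient descent needs, and a numerical gradient check confirms they are right) can be sketched in NumPy. This is my own minimal sketch for a one-hidden-layer sigmoid network, not the course's Octave code; the function names, layer sizes, and the omission of regularization are all my assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_gradients(Theta1, Theta2, X, y):
    """Cross-entropy cost J and its gradients for a 1-hidden-layer
    network (no regularization). X is (m, n); y is (m, k) one-hot."""
    m = X.shape[0]
    # Forward propagation
    a1 = np.hstack([np.ones((m, 1)), X])           # add bias unit
    z2 = a1 @ Theta1.T
    a2 = np.hstack([np.ones((m, 1)), sigmoid(z2)])
    z3 = a2 @ Theta2.T
    a3 = sigmoid(z3)                               # hypothesis h(x)
    # Cost: J = -(1/m) * sum( y*log(h) + (1-y)*log(1-h) )
    J = -np.sum(y * np.log(a3) + (1 - y) * np.log(1 - a3)) / m
    # Backpropagation: push the output "error" backwards layer by layer
    d3 = a3 - y                                    # output-layer delta
    d2 = (d3 @ Theta2[:, 1:]) * sigmoid(z2) * (1 - sigmoid(z2))
    Grad1 = d2.T @ a1 / m                          # same shape as Theta1
    Grad2 = d3.T @ a2 / m                          # same shape as Theta2
    return J, Grad1, Grad2

def numerical_gradient(f, theta, eps=1e-4):
    """Gradient checking: two-sided finite difference of f()
    with respect to each entry of theta (modified in place)."""
    grad = np.zeros_like(theta)
    for i in np.ndindex(*theta.shape):
        orig = theta[i]
        theta[i] = orig + eps
        plus = f()
        theta[i] = orig - eps
        minus = f()
        theta[i] = orig
        grad[i] = (plus - minus) / (2 * eps)
    return grad

# Small random initialization (breaks symmetry), then verify backprop
rng = np.random.default_rng(0)
Theta1 = rng.normal(scale=0.12, size=(5, 4))   # 3 inputs (+bias) -> 5 hidden
Theta2 = rng.normal(scale=0.12, size=(3, 6))   # 5 hidden (+bias) -> 3 outputs
X = rng.normal(size=(10, 3))
y = np.eye(3)[rng.integers(0, 3, size=10)]

J, G1, G2 = cost_and_gradients(Theta1, Theta2, X, y)
N1 = numerical_gradient(lambda: cost_and_gradients(Theta1, Theta2, X, y)[0], Theta1)
assert np.max(np.abs(G1 - N1)) < 1e-6          # analytic and numeric agree
```

The final assertion is the week's "Gradient Checking" step: the slow finite-difference gradient should match the backpropagation gradient to several decimal places, after which the check is turned off and only backpropagation is used for training.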
Posted: 2024-08-11 23:52:03
