Training a Neural Network
- Randomly initialize the weights
- Implement forward propagation to get h_Θ(x^(i)) for any x^(i)
- Implement code to compute the cost function J(Θ)
- Implement backpropagation to compute partial derivatives
- Use gradient checking to confirm that your backpropagation is computing the derivatives correctly. Then disable gradient checking.
- Use gradient descent or a built-in optimization function to minimize J(Θ) as a function of the weights Θ.
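The six steps above can be sketched end-to-end for a one-hidden-layer network. This is a minimal illustration, not the original post's code: the layer sizes, learning rate, initialization range, and toy XOR dataset are all assumptions made here for the example.

```python
# Minimal sketch of the training steps for a one-hidden-layer sigmoid network.
# All hyperparameters (sizes, eps, learning rate) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_weights(n_in, n_hid, n_out, eps=0.12):
    # Step 1: random initialization breaks symmetry between hidden units.
    W1 = rng.uniform(-eps, eps, (n_hid, n_in + 1))   # +1 column for bias
    W2 = rng.uniform(-eps, eps, (n_out, n_hid + 1))
    return W1, W2

def forward(W1, W2, X):
    # Step 2: forward propagation computes h_Θ(x^(i)) for every example.
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])         # input layer + bias
    a2 = sigmoid(a1 @ W1.T)                      # hidden activations
    a2 = np.hstack([np.ones((m, 1)), a2])        # + bias
    h = sigmoid(a2 @ W2.T)                       # output h_Θ(x)
    return a1, a2, h

def cost(W1, W2, X, Y):
    # Step 3: cross-entropy cost J(Θ), no regularization in this sketch.
    _, _, h = forward(W1, W2, X)
    m = X.shape[0]
    return -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m

def backprop(W1, W2, X, Y):
    # Step 4: backpropagation yields the partial derivatives dJ/dW1, dJ/dW2.
    m = X.shape[0]
    a1, a2, h = forward(W1, W2, X)
    d3 = h - Y                                   # output-layer error
    d2 = (d3 @ W2)[:, 1:] * a2[:, 1:] * (1 - a2[:, 1:])  # hidden error, bias dropped
    return d2.T @ a1 / m, d3.T @ a2 / m

def gradient_check(W1, W2, X, Y, eps=1e-4):
    # Step 5: compare backprop against a two-sided numerical gradient.
    G1, _ = backprop(W1, W2, X, Y)
    num = np.zeros_like(W1)
    for idx in np.ndindex(W1.shape):
        Wp, Wm = W1.copy(), W1.copy()
        Wp[idx] += eps
        Wm[idx] -= eps
        num[idx] = (cost(Wp, W2, X, Y) - cost(Wm, W2, X, Y)) / (2 * eps)
    return np.max(np.abs(num - G1))

# Step 6: plain gradient descent on a toy XOR dataset (assumed here).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = init_weights(2, 4, 1)
assert gradient_check(W1, W2, X, Y) < 1e-6       # check once, then disable
c0 = cost(W1, W2, X, Y)
for _ in range(5000):
    G1, G2 = backprop(W1, W2, X, Y)
    W1 -= 1.0 * G1                               # learning rate 1.0 (assumed)
    W2 -= 1.0 * G2
```

Gradient checking is run once before training and then skipped, since the numerical gradient is far too slow to compute on every iteration; in practice a built-in optimizer (e.g. a conjugate-gradient or L-BFGS routine) would replace the hand-rolled descent loop.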
Original post: https://www.cnblogs.com/qkloveslife/p/9873681.html