Udacity Nanodegree Program: Deep Learning Foundation: New Syllabus

Program Structure

Every week, you can expect to see this content coming up:

  • Siraj’s introductory video & one-hour coding session
  • Additional lesson(s) from Mat & other Udacity experts

Then, approximately every four weeks you’ll get a project.

The first week’s content is a bit heavier than an average week’s, as we’re covering some introductory material and two topics from Siraj, so you can expect a slightly lighter load going forward. Keep in mind the program is for students of all backgrounds: some of the material might feel easy for more experienced students, but we’re covering a lot of different topics and there will be plenty of advanced material.

Weekly Syllabus

Here is the list of topics that will be taught throughout the program:

Week 1: Introduction to Deep Learning

We’ll start off with a simple introduction to linear regression and machine learning. This will give you the vocabulary you need to understand recent advancements, and make clear where deep learning fits into the broader picture of ML techniques.

Then, you’ll learn how to build a simple neural network from scratch using NumPy. We’ll cover the algorithms used to train networks, such as gradient descent and backpropagation.
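To make that concrete, here’s a minimal sketch of the kind of from-scratch network you’ll build: one hidden layer, trained with gradient descent and backpropagation. The data and layer sizes here are illustrative, not the project’s actual setup.

```python
import numpy as np

# Toy inputs (4 samples, 3 features) and binary targets -- illustrative only.
X = np.array([[0., 0., 1.], [0., 1., 1.], [1., 0., 1.], [1., 1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(42)
W1 = rng.normal(scale=0.5, size=(3, 4))  # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))  # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 1.0
for step in range(5000):
    # Forward pass.
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Backpropagation: push the output error back through each layer.
    output_delta = (output - y) * output * (1 - output)
    hidden_delta = (output_delta @ W2.T) * hidden * (1 - hidden)

    # Gradient descent: step each weight matrix against its gradient.
    W2 -= learning_rate * hidden.T @ output_delta
    W1 -= learning_rate * X.T @ hidden_delta

print(output.round(2))  # should approach [[0], [1], [1], [0]]
```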

The first project is also available this week. In this project, you’ll predict bike ridership using a simple neural network.

Week 2: Graph Computations

TensorFlow is the most popular framework for building deep learning networks. It is based on graph computation, an efficient method to represent and calculate the matrix operations involved in training networks. In this lesson, you’ll build your own small version of TensorFlow, called MiniFlow, to deepen your understanding of backpropagation and start your work with TensorFlow.
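As a rough preview of what graph computation looks like (with illustrative class names, not MiniFlow’s actual API), here’s a tiny graph that sorts its nodes into dependency order and runs a forward pass:

```python
class Node:
    """One node (operation or value) in a computation graph."""
    def __init__(self, inputs=()):
        self.inputs = list(inputs)
        self.value = None

class Input(Node):
    def forward(self):
        pass  # value is set from outside the graph

class Add(Node):
    def forward(self):
        self.value = sum(n.value for n in self.inputs)

def topological_order(output_node):
    """Visit dependencies first, so each node's inputs are ready."""
    order, seen = [], set()
    def visit(node):
        if node not in seen:
            seen.add(node)
            for n in node.inputs:
                visit(n)
            order.append(node)
    visit(output_node)
    return order

# Build the graph x + y, feed it values, and run a forward pass.
x, y = Input(), Input()
total = Add([x, y])
x.value, y.value = 2, 5
for node in topological_order(total):
    node.forward()
print(total.value)  # 7
```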

You’ll also learn how to evaluate machine learning models such as neural networks. We do this using validation: testing the model’s performance on a held-out portion of the data.
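Here’s one common way to carve out a validation set, sketched with made-up data; the sizes and split fraction are arbitrary:

```python
import numpy as np

# 100 illustrative samples with 10 features each.
features = np.random.rand(100, 10)
targets = np.random.rand(100)

# Shuffle, then hold out 20% that the model never trains on.
indices = np.random.permutation(len(features))
split = int(0.8 * len(features))
train_idx, val_idx = indices[:split], indices[split:]

train_features, train_targets = features[train_idx], targets[train_idx]
val_features, val_targets = features[val_idx], targets[val_idx]

# Train on the training set; report loss/accuracy on the validation set.
```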

Week 3: Sentiment Analysis

This week, you’ll learn about sentiment analysis from Siraj and our guest instructor, Andrew Trask. That is, you’ll use neural networks to predict if some text is positive or negative. Andrew will extend the network from project one and show you how to prepare your data and network to get much more efficient performance.
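To give a taste of the data preparation involved, here’s one simple way to turn raw text into fixed-length count vectors a network can consume. This is a hypothetical toy example, not Andrew’s actual pipeline:

```python
from collections import Counter

# Two toy reviews with labels: 1 = positive, 0 = negative.
reviews = ["this movie was great", "what a terrible boring film"]
labels = [1, 0]

# Build a vocabulary from every word seen in the training text.
vocab = sorted({word for review in reviews for word in review.split()})

def to_count_vector(text):
    """Turn raw text into a fixed-length vector of word counts."""
    counts = Counter(text.split())
    return [counts.get(word, 0) for word in vocab]

print(vocab)
print(to_count_vector("great great movie"))
```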

Week 4: Intro to TensorFlow

In this lesson, you’ll be learning about TensorFlow, a popular deep learning framework built by Google. You’ll use it to build a simple neural network.
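The lessons target the TensorFlow of the course’s era, so treat this as a sketch of the idea rather than the course code: a simple fully connected classifier in the modern tf.keras API, with arbitrary layer sizes.

```python
import tensorflow as tf

# A small fully connected classifier for flattened 28x28 images.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```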

You’ll also be introduced to using cloud computing services such as AWS and FloydHub to run your networks on GPUs.

Week 5: Deep Neural Networks

Deep neural networks have revolutionized multiple fields including computer vision, natural language processing, and artificial intelligence. In this lesson, you’ll learn about using TensorFlow to build deep networks for classifying handwritten digits. We’ll also cover common training improvements like dropout.
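For a flavor of what that looks like, here’s a hedged tf.keras sketch: a small deep network on the MNIST digits with a dropout layer to reduce overfitting. The hyperparameters are illustrative, not the lesson’s.

```python
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.5),  # randomly silence half the units in training
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```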

Week 6: Convolutional Networks

Convolutional networks have achieved state-of-the-art results in computer vision. These types of networks can detect and identify objects in images. You’ll learn how to build convolutional networks in TensorFlow.
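As a sketch (again in tf.keras, with illustrative sizes rather than the project’s actual architecture), a convolutional classifier stacks convolution and pooling layers before a dense head:

```python
import tensorflow as tf

# 32x32 RGB inputs, as in CIFAR-10-style image classification.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu",
                           input_shape=(32, 32, 3)),  # learn local filters
    tf.keras.layers.MaxPooling2D((2, 2)),             # downsample
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # 10 object classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```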

You’ll also get the second project, where you’ll build a convolutional network to classify images of frogs, planes, cars, and more.

Week 7: Recurrent Neural Networks

In this lesson, you’ll learn about Recurrent Neural Networks, a type of network architecture particularly well suited to data that forms sequences, like text, music, and time series. You’ll build a recurrent neural network that can generate new text character by character.
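A character-level model of this kind might be defined along these lines; this is a tf.keras sketch with made-up sizes, not the lesson’s exact network:

```python
import tensorflow as tf

vocab_size = 65  # number of distinct characters in the corpus (illustrative)

# Each character is embedded, run through an LSTM, and the network outputs
# a score for every possible next character at every position.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 128),
    tf.keras.layers.LSTM(256, return_sequences=True),
    tf.keras.layers.Dense(vocab_size),  # logits over the next character
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```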

Week 8: Word Embeddings

When dealing with natural language problems, you’ll end up working with huge vocabularies. This ends up being computationally inefficient, so instead we find smaller representations for all the words, called word embeddings. The words are represented by vectors that contain information about what the words actually mean semantically. To learn more about word embeddings, you’ll implement a model known as Word2vec.
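The heart of Word2vec’s skip-gram variant is generating (center, context) training pairs from running text. Here’s a minimal sketch of that step:

```python
def skip_gram_pairs(tokens, window=2):
    """Yield (center, context) pairs for Word2vec's skip-gram model."""
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                yield center, tokens[j]

sentence = "the quick brown fox jumps".split()
print(list(skip_gram_pairs(sentence)))
```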

Week 9: Using TensorBoard

TensorBoard is a visualization tool useful for inspecting your networks. We’ll show you how to use TensorBoard to visualize the graphs you build with TensorFlow, as well as find the best parameters for your models.
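Assuming the modern tf.keras API (the course itself uses the TensorFlow of its day), logging for TensorBoard can be as simple as attaching a callback and then running `tensorboard --logdir logs`:

```python
import numpy as np
import tensorflow as tf

# A tiny throwaway model and dataset, just to have something to visualize.
x = np.random.rand(256, 8).astype("float32")
y = (x.sum(axis=1) > 4).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Writes the graph, weight histograms, and per-epoch metrics under ./logs.
tensorboard = tf.keras.callbacks.TensorBoard(log_dir="logs", histogram_freq=1)
model.fit(x, y, epochs=5, callbacks=[tensorboard], verbose=0)
```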

Week 10: Text Generation

In this lesson, you’ll learn about using a recurrent neural network to predict sentiment from text. You’ll also start working on the third project, generating new TV scripts using a recurrent neural network.

Week 11: Sequence to Sequence

Neural networks have been a fundamental part of the recent advancements in machine translation. The latest production versions of Google Translate and Baidu Translate both use deep learning architectures to automatically translate text from one language to another. This is done using a process known as Sequence to Sequence Learning, which we will explore in this lesson.
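In outline, a sequence-to-sequence model pairs an encoder with a decoder that is seeded from the encoder’s final state. Here’s a hedged tf.keras sketch, with arbitrary vocabulary and layer sizes rather than anything from the lesson:

```python
import tensorflow as tf

vocab_size = 5000  # shared source/target vocabulary (illustrative)

# Encoder: read the source sequence and compress it into a state.
encoder_inputs = tf.keras.Input(shape=(None,))
enc_emb = tf.keras.layers.Embedding(vocab_size, 128)(encoder_inputs)
_, state_h, state_c = tf.keras.layers.LSTM(256, return_state=True)(enc_emb)

# Decoder: generate the target sequence, seeded with the encoder's state.
decoder_inputs = tf.keras.Input(shape=(None,))
dec_emb = tf.keras.layers.Embedding(vocab_size, 128)(decoder_inputs)
dec_out, _, _ = tf.keras.layers.LSTM(
    256, return_sequences=True, return_state=True
)(dec_emb, initial_state=[state_h, state_c])
outputs = tf.keras.layers.Dense(vocab_size, activation="softmax")(dec_out)

model = tf.keras.Model([encoder_inputs, decoder_inputs], outputs)
```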

Week 11 (continued): Chatbot QA System with Voice (Sequence to Sequence in More Depth)

We’ll further explore Sequence to Sequence learning by building our very own chatbot QA system that can answer unstructured queries from a user.

Week 12: Transfer Learning

A common technique in deep learning is using pre-trained networks on new problems. For example, you can use a convolutional network trained on a huge dataset to classify images in a much smaller dataset. This method is called transfer learning, and you’ll learn how to use it to classify images of flowers without training a whole network yourself.
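Here’s a hedged sketch of the idea in tf.keras; the pre-trained model and class count are illustrative, not necessarily what the lesson uses:

```python
import tensorflow as tf

# Reuse a network pre-trained on ImageNet as a frozen feature extractor.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False,
    weights="imagenet", pooling="avg",
)
base.trainable = False  # keep the pre-trained weights fixed

# Only this small head gets trained on the new, much smaller dataset.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(5, activation="softmax"),  # e.g. 5 flower classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```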

Week 13: Reinforcement Learning

Some of the most interesting advancements in deep learning have been in the field of Reinforcement Learning, where instead of training on a corpus of existing data, a network learns from live data it receives and adjusts accordingly. We’ll see how to apply Reinforcement Learning to build simple game-playing AIs that can win at a wide variety of Atari games.
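The core update that deep reinforcement learning scales up with a neural network is easiest to see in its tabular form. Here’s a toy Q-learning sketch on a five-state corridor, purely illustrative rather than anything from the lesson:

```python
import numpy as np

# Tabular Q-learning: 5 states in a row, goal is to reach state 4.
n_states, n_actions = 5, 2          # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.1
rng = np.random.default_rng(0)

for episode in range(500):
    state = 0
    while state != n_states - 1:
        # Epsilon-greedy: mostly exploit, occasionally explore.
        if rng.random() < epsilon:
            action = int(rng.integers(n_actions))
        else:
            action = int(Q[state].argmax())
        next_state = max(0, min(n_states - 1, state + (1 if action else -1)))
        reward = 1.0 if next_state == n_states - 1 else 0.0
        # Bellman update: move toward reward + discounted best future value.
        Q[state, action] += alpha * (
            reward + gamma * Q[next_state].max() - Q[state, action]
        )
        state = next_state

print(Q.round(2))  # "go right" should dominate in every state
```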

In the fourth project, you’ll build a network that can translate text.

Week 14: Autoencoders

As recently shown by Google, deep learning can also be used to dramatically improve compression techniques. In this lesson, we’ll explore using deep learning to build autoencoders that automatically find sparse representations of data.
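In its simplest form, an autoencoder is just an encoder that compresses the input and a decoder that reconstructs it, trained to reproduce its own input. A minimal tf.keras sketch, with arbitrary sizes:

```python
import tensorflow as tf

# Squeeze 784-dimensional inputs through a 32-dimensional bottleneck.
inputs = tf.keras.Input(shape=(784,))
encoded = tf.keras.layers.Dense(32, activation="relu")(inputs)       # compress
decoded = tf.keras.layers.Dense(784, activation="sigmoid")(encoded)  # rebuild

autoencoder = tf.keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
# Train it to reproduce its own input: autoencoder.fit(x, x, ...)
```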

Week 15: Generative Adversarial Networks

Generative Adversarial Networks (GANs) are a recent major advancement in deep learning methods, producing state-of-the-art results in image generation. The inventor of GANs, Ian Goodfellow, will teach you about building the networks yourself.
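Structurally, a GAN is two networks locked in a contest. Here’s a bare-bones sketch of the two halves (sizes illustrative, training loop omitted):

```python
import tensorflow as tf

# Generator: map 100-dimensional random noise to a fake 28x28 (flattened) image.
generator = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(100,)),
    tf.keras.layers.Dense(784, activation="tanh"),
])

# Discriminator: judge whether a (flattened) image is real or generated.
discriminator = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # probability "real"
])

# Training alternates: the discriminator learns to separate real samples
# from generated ones, while the generator learns to fool it.
```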

Week 16: Image Generation

As echoed by Yann LeCun, Generative Adversarial Networks are one of the most fundamental advancements in deep learning. You’ll explore this state-of-the-art concept to generate images that most humans wouldn’t believe were generated by a computer.

In the fifth project, you’ll use a GAN to generate new human faces.

Week 17: One-shot Learning (Probabilistic Programming)

Finally, we’ll look at one-shot learning, where a neural network is able to learn from just one (or a few) examples, as opposed to a large amount of data.

Through this curriculum, you’ll get an exciting introduction to some of the most compelling advancements in deep learning! We hope you join us on this journey, and we can’t wait to share more of these ideas with you.
