Building your Deep Neural Network: Step by Step
Welcome to your third programming exercise of the deep learning specialization. You will implement all the building blocks of a neural network and use these building blocks in the next assignment to build a neural network of any architecture you want. By completing this assignment you will:
- Develop an intuition of the overall structure of a neural network.
- Write functions (e.g., forward propagation, backward propagation, logistic loss) that will help you decompose your code and ease the process of building a neural network; a brief sketch follows this list.
- Initialize/update parameters according to your desired structure.
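To make the idea of decomposing the code into small functions concrete, here is a minimal numpy sketch of the kind of helpers involved. The names and signatures below are illustrative assumptions, not necessarily the exact graded functions in the notebook.

```python
import numpy as np

def sigmoid(Z):
    # Sigmoid activation; also return Z so the backward pass can reuse it
    A = 1 / (1 + np.exp(-Z))
    return A, Z

def relu(Z):
    # ReLU activation; also return Z so the backward pass can reuse it
    A = np.maximum(0, Z)
    return A, Z

def linear_forward(A_prev, W, b):
    # One linear step of forward propagation: Z = W . A_prev + b
    Z = np.dot(W, A_prev) + b
    cache = (A_prev, W, b)            # cached inputs for the backward pass
    return Z, cache

def compute_cost(AL, Y):
    # Cross-entropy (logistic) cost for binary labels Y and predictions AL
    m = Y.shape[1]
    cost = -np.sum(Y * np.log(AL) + (1 - Y) * np.log(1 - AL)) / m
    return float(np.squeeze(cost))
```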
This assignment prepares you well for the upcoming assignment. Take your time to complete it and make sure you get the expected outputs when working through the different exercises. In some code blocks, you will find a "# GRADED FUNCTION: functionName" comment. Please do not modify it. After you are done, submit your work and check your results. You need to score 70% to pass. Good luck :)!
Welcome to your week 4 assignment (part 1 of 2)! You have previously trained a 2-layer Neural Network (with a single hidden layer). This week, you will build a deep neural network, with as many layers as you want!
- In this notebook, you will implement all the functions required to build a deep neural network.
- In the next assignment, you will use these functions to build a deep neural network for image classification.
After this assignment you will be able to:
- Use non-linear units like ReLU to improve your model
- Build a deeper neural network (with more than 1 hidden layer)
- Implement an easy-to-use neural network class
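As a rough preview of what "more than 1 hidden layer" looks like in code, here is a hedged sketch of an L-layer forward pass that stacks a linear step and a non-linearity. It assumes helpers like the `linear_forward`, `relu`, and `sigmoid` sketched above; the graded functions in the notebook define their own exact interfaces.

```python
def linear_activation_forward(A_prev, W, b, activation):
    # LINEAR step followed by the chosen non-linearity ("relu" or "sigmoid")
    Z, linear_cache = linear_forward(A_prev, W, b)
    if activation == "relu":
        A, activation_cache = relu(Z)
    else:
        A, activation_cache = sigmoid(Z)
    return A, (linear_cache, activation_cache)

def L_model_forward(X, parameters):
    # [LINEAR -> RELU] x (L-1) -> LINEAR -> SIGMOID forward pass
    caches = []
    A = X
    L = len(parameters) // 2                      # each layer contributes a W and a b
    for l in range(1, L):
        A, cache = linear_activation_forward(
            A, parameters["W" + str(l)], parameters["b" + str(l)], "relu")
        caches.append(cache)
    AL, cache = linear_activation_forward(
        A, parameters["W" + str(L)], parameters["b" + str(L)], "sigmoid")
    caches.append(cache)
    return AL, caches                             # AL: final activations; caches feed backprop
```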
Notation:
- Superscript $[l]$ denotes a quantity associated with the $l^{th}$ layer.
  - Example: $a^{[L]}$ is the $L^{th}$ layer activation. $W^{[L]}$ and $b^{[L]}$ are the $L^{th}$ layer parameters.
- Superscript $(i)$ denotes a quantity associated with the $i^{th}$ example.
  - Example: $x^{(i)}$ is the $i^{th}$ training example.
- Subscript $i$ denotes the $i^{th}$ entry of a vector.
  - Example: $a^{[l]}_i$ denotes the $i^{th}$ entry of the $l^{th}$ layer's activations.
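To make the superscript $[l]$ notation concrete, here is a small illustrative sketch of how parameters for an L-layer network are typically stored and shaped: $W^{[l]}$ has shape $(n^{[l]}, n^{[l-1]})$ and $b^{[l]}$ has shape $(n^{[l]}, 1)$. The layer sizes and the 0.01 scaling below are assumptions for the example, not values prescribed by the assignment.

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    # layer_dims = [n_x, n_1, ..., n_L]; W[l]: (n_l, n_{l-1}), b[l]: (n_l, 1)
    parameters = {}
    L = len(layer_dims)                           # number of layers, counting the input layer
    for l in range(1, L):
        parameters["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

params = initialize_parameters_deep([5, 4, 3])    # 5 inputs, a hidden layer of 4 units, 3 outputs
print(params["W1"].shape)                         # (4, 5): rows = units in layer 1, cols = units in layer 0
print(params["b2"].shape)                         # (3, 1): one bias per unit in layer 2
```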
Let's get started!
Week 4: Deep Neural Networks - 2. Programming Assignments: Building your Deep Neural Network: Step by Step
Original article: http://www.cnblogs.com/hezhiyao/p/7891035.html