This week is about the mysterious topic of neural networks. The lectures this week only cover the basics.
What is a Neural Network
It’s a technique for learning from data inspired by how the human brain works. A simple neural network has:
- input layer
- hidden layer
- output layer
We use neural networks for both classification and regression.
We use the sigmoid function to map data from the input layer to the hidden layer and then to the output layer; this function is called the activation function.
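As a quick illustration (a Python sketch; the course itself uses Octave), the sigmoid function squashes any real input into the range (0, 1):

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: maps any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))    # 0.5: right on the decision boundary
print(sigmoid(10.0))   # close to 1
print(sigmoid(-10.0))  # close to 0
```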
In a neural network, we also add a bias unit to each layer (e.g. x0 in the input layer) for the calculation.
With neural networks, we can build more complex hypothesis functions.
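For example, a single sigmoid unit with hand-picked weights can already compute logical AND (a Python sketch; the bias/weight values -30, +20, +20 follow the AND example from the lectures):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One sigmoid unit computing logical AND; weights as in the lecture
# example: bias -30, and +20 for each input.
theta = np.array([-30.0, 20.0, 20.0])

for x1 in (0, 1):
    for x2 in (0, 1):
        a = np.array([1.0, x1, x2])   # prepend bias unit x0 = 1
        h = sigmoid(theta @ a)
        print(f"{x1} AND {x2} -> {round(h)}")
```

Stacking such units layer by layer is what lets a network represent more complex functions than a single logistic unit can.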
To play with neural networks, you can try Google’s open-source TensorFlow.
Handwritten Digit Classification
This week’s assignment is multi-class classification for handwritten digit recognition.
Multi-class classification with one-vs-all logistic regression
For handwritten digits with 10 labels (0–9), we run 10 logistic regressions to compute 10 groups of theta. We then make 10 predictions based on these 10 groups of theta and choose the label with the maximum hypothesis value (probability).
1. Cost function
function [J, grad] = lrCostFunction(theta, X, y, lambda)
2. Train 10 classifiers
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
3. Make predictions
function p = predictOneVsAll(all_theta, X)
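Putting the steps above together, here is a hedged NumPy sketch (the assignment itself is in Octave and uses the fmincg optimizer; plain gradient descent stands in for it here, and labels are 0-indexed rather than Octave’s 1-indexed):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic cost J and gradient; bias theta[0] is not regularized."""
    m = y.size
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m \
        + lam / (2 * m) * np.sum(theta[1:] ** 2)
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]
    return J, grad

def one_vs_all(X, y, num_labels, lam, alpha=1.0, iters=500):
    """Train one binary classifier per label; returns one row of theta per label."""
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])      # add bias column
    all_theta = np.zeros((num_labels, n + 1))
    for c in range(num_labels):
        yc = (y == c).astype(float)           # 1 for class c, 0 otherwise
        theta = np.zeros(n + 1)
        for _ in range(iters):                # plain gradient descent
            _, grad = lr_cost_function(theta, Xb, yc, lam)
            theta -= alpha * grad
        all_theta[c] = theta
    return all_theta

def predict_one_vs_all(all_theta, X):
    """Choose the label whose classifier outputs the highest probability."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return np.argmax(sigmoid(Xb @ all_theta.T), axis=1)
```

On a toy 1-D dataset this trains two classifiers and picks, per example, the label whose hypothesis value is largest, exactly as described above.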
Logistic regression has good accuracy, about 95% in this case, but the neural network has higher accuracy, about 97%.
Neural Network Forward Propagation Algorithm
In the assignment, the hypothesis function is built with a 3-layer neural network.
The prediction implementation:
function p = predict(Theta1, Theta2, X)
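A NumPy sketch of that forward pass (layer sizes are generic here; in the assignment Theta1 maps the input layer to the hidden layer and Theta2 maps the hidden layer to the output layer, and Octave’s labels are 1-indexed while this sketch is 0-indexed):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(Theta1, Theta2, X):
    """Forward-propagate each row of X through a 3-layer network."""
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])    # input layer, bias unit added
    a2 = sigmoid(a1 @ Theta1.T)             # hidden layer activations
    a2 = np.hstack([np.ones((m, 1)), a2])   # bias unit for the hidden layer
    a3 = sigmoid(a2 @ Theta2.T)             # output layer: one unit per label
    return np.argmax(a3, axis=1)            # most probable label per example
```

With small hand-built weight matrices you can check that each example is routed to the output unit with the highest activation.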
My questions are:
- How do we train Theta1 and Theta2?
- How do we decide the number of units in the hidden layer?
I think Ng will explain these in later lectures. Next week, I will learn the backpropagation algorithm.