## Machine Learning Neural Networks

- 2016-04-17
- Cyanny Liang

This week is about the mysterious neural network. This week's lectures explain only the basics of neural networks.

## What Is a Neural Network

It is a technique for learning from data, loosely inspired by how the human brain works. A simple neural network has:

- input layer
- hidden layer
- output layer

We use neural networks for classification and regression.

We use the sigmoid function to map data from the input layer to the hidden layer and then to the output layer; this function is called the activation function.

In a neural network, we also add bias units (x0 in the input layer, a0 in the hidden layer) to the calculation.

With a neural network, we can build more complex hypothesis functions.
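The layered mapping above can be sketched in a few lines. This is a minimal Python/NumPy illustration (the course itself uses Octave); the weight matrices here are hand-picked toy values that make the network compute XOR, not anything from the assignment:

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: maps any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, W2):
    """Forward propagation through one hidden layer.

    x  : input features (without bias)
    W1 : weights mapping input layer -> hidden layer
    W2 : weights mapping hidden layer -> output layer
    """
    a1 = np.concatenate(([1.0], x))      # add bias unit x0 = 1
    a2 = sigmoid(W1 @ a1)                # hidden layer activations
    a2 = np.concatenate(([1.0], a2))     # add bias unit a0 = 1
    return sigmoid(W2 @ a2)              # output layer = hypothesis

# Hand-picked weights so the network computes XOR:
W1 = np.array([[-10.0,  20.0,  20.0],   # OR-like hidden unit
               [ 30.0, -20.0, -20.0]])  # NAND-like hidden unit
W2 = np.array([[-30.0,  20.0,  20.0]])  # AND of the two units

print(forward(np.array([1.0, 0.0]), W1, W2))  # close to 1 (XOR is true)
```

This shows why a network with a hidden layer is a "more complex hypothesis": XOR is not linearly separable, so a single logistic unit cannot represent it, but composing two sigmoid units can.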

To play with neural networks, you can try Google's open-source TensorFlow.

## Handwritten Digit Classification

This week's assignment is multi-class classification for handwritten digit recognition.

**Multi-class classification with one-vs-all logistic regression**

For handwritten digits with 10 labels (0–9), we run 10 logistic regressions to compute 10 groups of theta. We then make 10 predictions based on these 10 groups of theta and choose the label with the maximum hypothesis value (probability).

**1. Cost function**

```matlab
function [J, grad] = lrCostFunction(theta, X, y, lambda)
```
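The assignment implements this in Octave; as a reference, here is a Python/NumPy sketch of what the regularized cost and gradient compute (the vectorized form is my own, not the assignment's code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic regression cost J and gradient.

    theta : (n+1,) parameters
    X     : (m, n+1) design matrix with a leading bias column of ones
    y     : (m,) labels in {0, 1}
    The bias parameter theta[0] is not regularized.
    """
    m = len(y)
    h = sigmoid(X @ theta)                              # hypothesis
    reg = (lam / (2.0 * m)) * np.sum(theta[1:] ** 2)    # regularization term
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m + reg
    grad = (X.T @ (h - y)) / m
    grad[1:] += (lam / m) * theta[1:]                   # skip bias term
    return J, grad
```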

**2. OneVsAll**

Train the 10 classifiers:

```matlab
function [all_theta] = oneVsAll(X, y, num_labels, lambda)
```
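A hedged Python/NumPy sketch of the idea: train one regularized logistic classifier per label and stack the resulting theta vectors. The assignment uses Octave's `fmincg` optimizer; plain gradient descent is substituted here to keep the example self-contained:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def one_vs_all(X, y, num_labels, lam, iters=1000, alpha=1.0):
    """Train one logistic classifier per label.

    Returns all_theta of shape (num_labels, n+1); row k separates
    label k from all other labels.
    """
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])        # add bias column
    all_theta = np.zeros((num_labels, n + 1))
    for k in range(num_labels):
        yk = (y == k).astype(float)             # 1 for label k, else 0
        theta = np.zeros(n + 1)
        for _ in range(iters):                  # gradient descent
            h = sigmoid(Xb @ theta)
            grad = (Xb.T @ (h - yk)) / m
            grad[1:] += (lam / m) * theta[1:]   # don't regularize bias
            theta -= alpha * grad
        all_theta[k] = theta
    return all_theta
```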

**3. PredictOneVsAll**

```matlab
function p = predictOneVsAll(all_theta, X)
```
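The prediction step is just "pick the label whose classifier scores highest". A Python/NumPy sketch (my own vectorization, not the assignment's Octave code); since the sigmoid is monotonic, taking the argmax of the raw scores gives the same label as taking it over the probabilities:

```python
import numpy as np

def predict_one_vs_all(all_theta, X):
    """For each example, pick the label whose classifier is most confident.

    all_theta : (num_labels, n+1), one theta vector per label
    X         : (m, n) examples without the bias column
    """
    m = X.shape[0]
    Xb = np.hstack([np.ones((m, 1)), X])   # add bias column
    scores = Xb @ all_theta.T              # (m, num_labels) raw scores
    return np.argmax(scores, axis=1)       # label with max hypothesis value
```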

Logistic regression has good accuracy, about 95% in this case, but the neural network has higher accuracy, about 97%.

## Neural Network Forward Propagation Algorithm

In the assignment, we build the hypothesis function with a 3-layer neural network.

The prediction implementation:

```matlab
function p = predict(Theta1, Theta2, X)
```
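For reference, a Python/NumPy sketch of 3-layer forward propagation, vectorized over all examples (the assignment implements this in Octave; the XOR test weights below are hand-picked toy values):

```python
import numpy as np

def predict(Theta1, Theta2, X):
    """Forward propagation through a 3-layer network.

    Theta1 : (hidden_units, n+1) weights, input -> hidden layer
    Theta2 : (num_labels, hidden_units+1) weights, hidden -> output layer
    X      : (m, n) examples without the bias column
    Returns the predicted label index for each row of X.
    """
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])    # input layer + bias unit
    a2 = sigmoid(a1 @ Theta1.T)             # hidden layer activations
    a2 = np.hstack([np.ones((m, 1)), a2])   # hidden layer + bias unit
    a3 = sigmoid(a2 @ Theta2.T)             # output layer = hypothesis
    return np.argmax(a3, axis=1)            # label with max probability
```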

My questions are:

- how to train Theta1 and Theta2
- how to decide how many units the hidden layer should have

I think Ng will explain these in later lectures. Next week, I will learn the backpropagation algorithm.