Deep learning attracts lots of attention
You have probably already seen many exciting results. There are more and more deep learning applications; for example, the use of deep learning inside Google has been growing rapidly.
The history of deep learning: ups and downs
1958: Perceptron (linear model). AI arrives in reality.
1969: The perceptron has limitations. Someone claimed a perceptron could tell trucks from tanks, but it later turned out the two sets of photos had been taken on rainy and sunny days respectively; the perceptron was only discriminating luminance.
1980s: Multi-layer perceptron. Not significantly different from the DNNs of today.
1986: Backpropagation. Usually more than 3 hidden layers did not help.
1989: One hidden layer is "good enough", so why go deep? (This argument undercut the case for deep networks, and neural networks acquired a bad reputation.)
One way to redeem the reputation of neural networks was to rename them, from "neural network" to "deep learning". (Changing the name has great power.)
2006: RBM initialization (breakthrough). Researchers thought it must work because it looked so powerful, but after much experimentation they found that RBM initialization is complicated and not actually useful. Still, it drew great attention back to deep learning, much like the stone in stone soup.
2009: GPUs greatly speed up training.
2011: Deep learning starts to become popular in speech recognition.
2012: Deep learning wins the ILSVRC image competition.
Function set: a neural network structure defines a set of functions.
You need to decide the network structure so that a good function is contained in your function set.
Q: How many layers? How many neurons in each layer? A: Experience, intuition, and trial and error (see the sketch below).
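A minimal sketch, assuming PyTorch is available, of how this structural choice pins down the function set: the list of hidden-layer widths is the hyperparameter you pick by experience and trial and error, and each choice yields a different family of functions to search during training. The helper name `build_mlp` and the sizes are illustrative, not from the original notes.

```python
import torch.nn as nn

def build_mlp(in_dim, hidden_sizes, out_dim):
    """Build a fully-connected network; hidden_sizes is the structural choice."""
    layers, prev = [], in_dim
    for h in hidden_sizes:
        layers += [nn.Linear(prev, h), nn.ReLU()]
        prev = h
    layers.append(nn.Linear(prev, out_dim))
    return nn.Sequential(*layers)

# Two different structural choices define two different function sets.
shallow = build_mlp(784, [1000], 10)           # 1 hidden layer, 1000 neurons
deep    = build_mlp(784, [128, 128, 128], 10)  # 3 hidden layers, 128 neurons each
```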
Q: Can the structure be automatically determined?
E.g. Evolutionary Artificial Neural Networks; a toy sketch of the idea follows.
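A toy sketch of the evolutionary idea: mutate candidate structures (lists of hidden-layer widths) and keep the fittest. The `fitness` function here is a hypothetical stand-in of my own; in a real evolutionary neural network search you would train each candidate briefly and score its validation accuracy.

```python
import random

def mutate(structure):
    """Randomly widen, narrow, add, or drop one hidden layer."""
    s = list(structure)
    op = random.choice(["widen", "narrow", "add", "drop"])
    i = random.randrange(len(s))
    if op == "widen":
        s[i] *= 2
    elif op == "narrow":
        s[i] = max(1, s[i] // 2)
    elif op == "add":
        s.insert(i, random.choice([32, 64, 128]))
    elif op == "drop" and len(s) > 1:
        s.pop(i)
    return s

def fitness(structure):
    # Hypothetical placeholder: in practice, train briefly and
    # return validation accuracy for this structure.
    return -abs(sum(structure) - 300)

population = [[64], [128, 64], [256]]
for generation in range(20):
    population += [mutate(random.choice(population)) for _ in range(6)]
    population = sorted(population, key=fitness, reverse=True)[:3]
print(population[0])  # best structure found by the toy search
```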
Q: Can we design the network structure?
We can; CNN is one example (a minimal sketch follows).
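A minimal sketch, again assuming PyTorch, of a hand-designed structure: a small CNN whose convolution and pooling layers encode the prior that nearby pixels matter and that patterns repeat across the image. The channel counts and the 28x28 input size (MNIST-like) are assumptions for illustration.

```python
import torch.nn as nn

cnn = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # detect local patterns
    nn.ReLU(),
    nn.MaxPool2d(2),                              # halve spatial size: 28 -> 14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 14 -> 7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                    # classifier over 10 classes
)
```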
Toolkit
Deeper is better?