2) Why is artificial intelligence so powerful? As The Lion King puts it: "The past can hurt. You can either run from it or learn from it." What is the difference between traditional programming and programming for AI? Before machine learning, deep learning, and reinforcement learning, programming did not care about the machine's experience: a program encoded your own experience as fixed code, its logic settled in advance; it could not change itself in response to new data or events, and it could not learn from existing data or from individual observations — just the first line of code, the second line, the third line, and so on. The core of AI, by contrast, is learning from every event that occurs, continuously improving and optimizing, and then using that ongoing optimization to cope with present and future change. The research community considers PyTorch far better than Google's TensorFlow because of PyTorch's core feature of dynamic computation graphs, which can be modified at any time. TensorFlow is the mainstream AI framework in industry, but its graphs are static: a static graph is defined once before execution and cannot be changed while the program runs, whereas PyTorch builds its graph on the fly and can change it from one run to the next. PyTorch is the most advanced AI framework in the industry.
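The difference is easy to see in code. Below is a minimal sketch of PyTorch's define-by-run behavior (assuming PyTorch is installed; the tensor size and loop counts are arbitrary, chosen only for illustration): the computation graph is rebuilt on every forward pass, so ordinary Python control flow can give each iteration a different graph.

import torch

x = torch.randn(3, requires_grad=True)
for step in range(2):
    # The depth of the computation varies per iteration -- something
    # a graph that is fixed before execution cannot express.
    y = x
    for _ in range(step + 1):
        y = y * 2
    loss = y.sum()
    loss.backward()      # differentiates through whatever graph this iteration built
    print(step, x.grad)  # gradients: 2.0 on the first pass, 4.0 on the second
    x.grad.zero_()

Each call to backward() here follows the graph that this particular iteration constructed, which is exactly what "dynamic graph" means.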
AI learns from each experience and uses it to improve its next action. Deep learning accomplishes this on the basis of massive data; reinforcement learning accomplishes it by interacting with the environment in real time. If a machine learning, deep learning, or reinforcement learning algorithm is not designed from this perspective, it must be wrong.
The output of Neuron_Network_Entry.py is as follows:
1. +1 V1 V2
2. Hidden layer creation: 1 N[1][1] N[1][2] N[1][3] N[1][4] Hidden layer creation: 2 N[2][1] N[2][2]
3. Output layer: Output
4. The weight from 1 to 4 : 0.13435033348808822
5. The weight from 1 to 5 : -0.8746277847113736
6. The weight from 1 to 6 : -0.8013636576296298
7. The weight from 1 to 7 : -0.20116942166494012
8. The weight from 2 to 4 : -0.19491319773895655
9. The weight from 2 to 5 : -0.9890612748469597
10. The weight from 2 to 6 : -0.9366153095486941
11. The weight from 2 to 7 : 0.19916644350391421
12. The weight from 4 to 9 : -0.3796380006519604
13. The weight from 4 to 10 : -0.604548306776146
14. The weight from 5 to 9 : 0.8551829387383982
15. The weight from 5 to 10 : 0.4617959177774711
16. The weight from 6 to 9 : -0.8052493675763375
17. The weight from 6 to 10 : 0.914614435777299
18. The weight from 7 to 9 : -0.5682949753457094
19. The weight from 7 to 10 : -0.8842628312130644
20. The weight from 9 to 11 : 0.20155741846708497
21. The weight from 10 to 11 : 0.0771101729511281
Running Neuron_Network_Entry.py again yields the following output:
1. +1 V1 V2
2. Hidden layer creation: 1 N[1][1] N[1][2] N[1][3] N[1][4] N[1][5] N[1][6] N[1][7] N[1][8]
3. Hidden layer creation: 2 N[2][1] N[2][2] N[2][3] N[2][4]
4. Hidden layer creation: 3 N[3][1] N[3][2]
5.
6. Output layer: Output
7. The weight from 1 at layers[0] to 4 at layers[1] : 0.682010054426426
8. The weight from 1 at layers[0] to 5 at layers[1] : 0.4462749257618013
9. The weight from 1 at layers[0] to 6 at layers[1] : 0.16160202385864175
10. The weight from 1 at layers[0] to 7 at layers[1] : -0.8293483877977663
11. The weight from 1 at layers[0] to 8 at layers[1] : 0.8425783956944617
12. The weight from 1 at layers[0] to 9 at layers[1] : -0.09707734276349977
13. The weight from 1 at layers[0] to 10 at layers[1] : 0.19246729112981065
14. The weight from 1 at layers[0] to 11 at layers[1] : -0.07144351444843577
15. The weight from 2 at layers[0] to 4 at layers[1] : 0.9655120959373438
16. The weight from 2 at layers[0] to 5 at layers[1] : 0.412297300185698
17. The weight from 2 at layers[0] to 6 at layers[1] : -0.47246534784877203
18. The weight from 2 at layers[0] to 7 at layers[1] : -0.12471427081208997
19. The weight from 2 at layers[0] to 8 at layers[1] : -0.6382557927597117
20. The weight from 2 at layers[0] to 9 at layers[1] : 0.10738693446830583
21. The weight from 2 at layers[0] to 10 at layers[1] : -0.025251306294619247
22. The weight from 2 at layers[0] to 11 at layers[1] : 0.08577583858433924
23. The weight from 4 at layers[1] to 13 at layers[2] : 0.4205437000804866
24. The weight from 4 at layers[1] to 14 at layers[2] : 0.6714193541057805
25. The weight from 4 at layers[1] to 15 at layers[2] : 0.978499088182812
26. The weight from 4 at layers[1] to 16 at layers[2] : 0.40008688442906704
27. The weight from 5 at layers[1] to 13 at layers[2] : 0.9060350693349206
28. The weight from 5 at layers[1] to 14 at layers[2] : -0.954780134506903
29. The weight from 5 at layers[1] to 15 at layers[2] : 0.2002488166002787
30. The weight from 5 at layers[1] to 16 at layers[2] : -0.6783558210922017
31. The weight from 6 at layers[1] to 13 at layers[2] : -0.5080667016535712
32. The weight from 6 at layers[1] to 14 at layers[2] : -0.5164272805019525
33. The weight from 6 at layers[1] to 15 at layers[2] : -0.15325930283914424
34. The weight from 6 at layers[1] to 16 at layers[2] : -0.3511553702447253
35. The weight from 7 at layers[1] to 13 at layers[2] : 0.8839918451858777
36. The weight from 7 at layers[1] to 14 at layers[2] : -0.6242276041022798
37. The weight from 7 at layers[1] to 15 at layers[2] : -0.8719713647888472
38. The weight from 7 at layers[1] to 16 at layers[2] : 0.5821146360823584
39. The weight from 8 at layers[1] to 13 at layers[2] : -0.14909760993973764
40. The weight from 8 at layers[1] to 14 at layers[2] : -0.3889436600158289
41. The weight from 8 at layers[1] to 15 at layers[2] : -0.9584174395773749
42. The weight from 8 at layers[1] to 16 at layers[2] : 0.5349114755539881
43. The weight from 9 at layers[1] to 13 at layers[2] : -0.8054789129243652
44. The weight from 9 at layers[1] to 14 at layers[2] : -0.1624957942951466
45. The weight from 9 at layers[1] to 15 at layers[2] : -0.3449863660814254
46. The weight from 9 at layers[1] to 16 at layers[2] : -0.040445297924280865
47. The weight from 10 at layers[1] to 13 at layers[2] : -0.5636343487353018
48. The weight from 10 at layers[1] to 14 at layers[2] : -0.6331043052233007
49. The weight from 10 at layers[1] to 15 at layers[2] : 0.9258612666017985
50. The weight from 10 at layers[1] to 16 at layers[2] : 0.07836601506397933
51. The weight from 11 at layers[1] to 13 at layers[2] : -0.9801755029220328
52. The weight from 11 at layers[1] to 14 at layers[2] : 0.8581871213959458
53. The weight from 11 at layers[1] to 15 at layers[2] : -0.03585909343817084
54. The weight from 11 at layers[1] to 16 at layers[2] : -0.8362853166389556
55. The weight from 13 at layers[2] to 18 at layers[3] : 0.9352117750168787
56. The weight from 13 at layers[2] to 19 at layers[3] : 0.3384623166221661
57. The weight from 14 at layers[2] to 18 at layers[3] : -0.1925801474485741
58. The weight from 14 at layers[2] to 19 at layers[3] : 0.6629615274975045
59. The weight from 15 at layers[2] to 18 at layers[3] : -0.5156626918145072
60. The weight from 15 at layers[2] to 19 at layers[3] : -0.7717488675712948
61. The weight from 16 at layers[2] to 18 at layers[3] : -0.04612918490370388
62. The weight from 16 at layers[2] to 19 at layers[3] : 0.9760033825108967
63. The weight from 18 at layers[3] to 20 at layers[4] : -0.64341229565003
64. The weight from 19 at layers[3] to 20 at layers[4] : 0.6741240645845519
65. Prediction: 0.5190028617090506
66. Prediction: 0.5177101237343301
67. Prediction: 0.5185081918463893
68. Prediction: 0.5172200931191119
Neuron_Network_Entry.py
……
epoch = 100000
learning_rate = 0.1

for i in range(epoch):
    nodes, weights = BackPropagation.applyBackPragation(instances, nodes, weights, learning_rate)
The output is as follows:
1. +1 V1 V2
2. Hidden layer creation: 1 N[1][1] N[1][2] N[1][3] N[1][4] N[1][5] N[1][6] N[1][7] N[1][8]
3. Hidden layer creation: 2 N[2][1] N[2][2] N[2][3] N[2][4]
4. Hidden layer creation: 3 N[3][1] N[3][2]
5.
6. Output layer: Output
7. The weight from 1 at layers[0] to 4 at layers[1] : 0.014754395461041403
8. The weight from 1 at layers[0] to 5 at layers[1] : 0.4852675953091179
9. The weight from 1 at layers[0] to 6 at layers[1] : -0.04730302732915692
10. The weight from 1 at layers[0] to 7 at layers[1] : 0.8328414197074481
11. The weight from 1 at layers[0] to 8 at layers[1] : -0.292038833157093
12. The weight from 1 at layers[0] to 9 at layers[1] : 0.36985194298359403
13. The weight from 1 at layers[0] to 10 at layers[1] : 0.5863269939947326
14. The weight from 1 at layers[0] to 11 at layers[1] : -0.058199463488485925
15. The weight from 2 at layers[0] to 4 at layers[1] : -0.06391839587513704
16. The weight from 2 at layers[0] to 5 at layers[1] : 0.7155263421232831
17. The weight from 2 at layers[0] to 6 at layers[1] : 0.7929838794184614
18. The weight from 2 at layers[0] to 7 at layers[1] : -0.7692820599673437
19. The weight from 2 at layers[0] to 8 at layers[1] : 0.6461702707507939
20. The weight from 2 at layers[0] to 9 at layers[1] : -0.7691783670361847
21. The weight from 2 at layers[0] to 10 at layers[1] : 0.1427196162091069
22. The weight from 2 at layers[0] to 11 at layers[1] : 0.23743848194973216
23. The weight from 4 at layers[1] to 13 at layers[2] : 0.6052229799647102
24. The weight from 4 at layers[1] to 14 at layers[2] : -0.176208412308542
25. The weight from 4 at layers[1] to 15 at layers[2] : -0.0971505490245631
26. The weight from 4 at layers[1] to 16 at layers[2] : -0.7851333794894211
27. The weight from 5 at layers[1] to 13 at layers[2] : 0.8340699603325874
28. The weight from 5 at layers[1] to 14 at layers[2] : -0.4953866016439531
29. The weight from 5 at layers[1] to 15 at layers[2] : 0.300134717197019
30. The weight from 5 at layers[1] to 16 at layers[2] : -0.3410805860986401
31. The weight from 6 at layers[1] to 13 at layers[2] : -0.05099358623099992
32. The weight from 6 at layers[1] to 14 at layers[2] : 0.6770340678151971
33. The weight from 6 at layers[1] to 15 at layers[2] : -0.3399588934067035
34. The weight from 6 at layers[1] to 16 at layers[2] : -0.36481726753031773
35. The weight from 7 at layers[1] to 13 at layers[2] : -0.3557995724109385
36. The weight from 7 at layers[1] to 14 at layers[2] : -0.21130922293946441
37. The weight from 7 at layers[1] to 15 at layers[2] : -0.3503562853274226
38. The weight from 7 at layers[1] to 16 at layers[2] : -0.58603113217164
39. The weight from 8 at layers[1] to 13 at layers[2] : 0.6620007049827901
40. The weight from 8 at layers[1] to 14 at layers[2] : 0.5428075846795659
41. The weight from 8 at layers[1] to 15 at layers[2] : -0.6968294407883738
42. The weight from 8 at layers[1] to 16 at layers[2] : -0.6477770513493042
43. The weight from 9 at layers[1] to 13 at layers[2] : -0.8814785316164266
44. The weight from 9 at layers[1] to 14 at layers[2] : -0.5402344713592881
45. The weight from 9 at layers[1] to 15 at layers[2] : -0.21791220078736018
46. The weight from 9 at layers[1] to 16 at layers[2] : -0.5927862722093897
47. The weight from 10 at layers[1] to 13 at layers[2] : -0.5857985964431403
48. The weight from 10 at layers[1] to 14 at layers[2] : 0.5621070507473107
49. The weight from 10 at layers[1] to 15 at layers[2] : -0.32785236322128597
50. The weight from 10 at layers[1] to 16 at layers[2] : 0.5168866366231784
51. The weight from 11 at layers[1] to 13 at layers[2] : 0.2653645801268325
52. The weight from 11 at layers[1] to 14 at layers[2] : -0.8283554568707727
53. The weight from 11 at layers[1] to 15 at layers[2] : -0.34734329416660203
54. The weight from 11 at layers[1] to 16 at layers[2] : 0.02855335929143199
55. The weight from 13 at layers[2] to 18 at layers[3] : 0.7571692716017584
56. The weight from 13 at layers[2] to 19 at layers[3] : 0.38199702838698024
57. The weight from 14 at layers[2] to 18 at layers[3] : 0.48899162049930633
58. The weight from 14 at layers[2] to 19 at layers[3] : 0.3846848263168843
59. The weight from 15 at layers[2] to 18 at layers[3] : 0.6777881462245301
60. The weight from 15 at layers[2] to 19 at layers[3] : 0.04792435974248255
61. The weight from 16 at layers[2] to 18 at layers[3] : -0.47975304566677224
62. The weight from 16 at layers[2] to 19 at layers[3] : -0.39110771186712434
63. The weight from 18 at layers[3] to 20 at layers[4] : -0.7800338665855491
64. The weight from 19 at layers[3] to 20 at layers[4] : -0.762021553391644
65. Prediction: 0.27563657064323516
66. Prediction: 0.2699850961962282
67. Prediction: 0.2780605556964589
68. Prediction: 0.27220177815008695
The Back Propagation workflow:
Thanks go to a netizen who provided diagrams of the backpropagation process for a three-layer neural network; they clearly lay out the steps of the Back Propagation algorithm:
Each neuron consists of two units. The first computes the weighted sum of the input signals; the second is a nonlinear unit known as the activation function. The signal e is the weighted-sum signal fed into the activation function, and y = f(e) is the output of the nonlinear unit, i.e., the output of the neuron.
To train the neural network we need training data. The training data consist of input signals (x1 and x2) and the desired output z. Training the network is an iterative process: in each iteration, the training data are used to update the neurons' weights. Each learning step starts from the input signals of the training set, from which the output of every layer can be computed. The figure below shows how signals travel through the network: w(xm)n is the weight of the connection between network input xm and neuron n in the input layer, and yn denotes the output of neuron n.
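To make this procedure concrete, here is a minimal runnable sketch (this is not the book's Neuron_Network_Entry.py implementation; the network shape, the single training pair of inputs x1, x2 with target z, and the learning rate are illustrative assumptions) of a training loop that computes each neuron's y = f(e) forward and then propagates the output error backward to update every weight:

import numpy as np

def f(e):                              # the nonlinear unit: y = f(e), here a sigmoid
    return 1.0 / (1.0 + np.exp(-e))

rng = np.random.default_rng(0)
W1 = rng.uniform(-1, 1, (2, 2))        # weights from the two inputs to two hidden neurons
W2 = rng.uniform(-1, 1, 2)             # weights from the hidden neurons to the output
x = np.array([1.0, 0.0])               # input signals x1, x2 (illustrative)
z = 1.0                                # desired output z (illustrative)
learning_rate = 0.1

for _ in range(1000):
    h = f(W1 @ x)                      # forward pass: hidden outputs y = f(e)
    y = f(W2 @ h)                      # forward pass: network output
    delta_out = (z - y) * y * (1 - y)          # output-layer error term
    delta_hid = delta_out * W2 * h * (1 - h)   # error propagated back to the hidden layer
    W2 += learning_rate * delta_out * h        # weight updates from the error terms
    W1 += learning_rate * np.outer(delta_hid, x)

print("Prediction:", f(W2 @ f(W1 @ x)))        # should move toward the target z

After enough iterations the prediction moves toward the target, mirroring how the predictions in the listing above shift once applyBackPragation has run for many epochs.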
# Ubuntu/Linux 64-bit, GPU enabled, Python 3.4. Requires CUDA toolkit 7.5 and CuDNN v4.
# For other versions, see "Install from sources" below.
(tensorflow)$ pip install --ignore-installed --upgrade https://storage.googleapis.com/tensorflow/linux/gpu/tensorflow-0.8.0rc0-cp34-cp34m-linux_x86_64.whl
# Mac OS X, CPU only:
(tensorflow)$ pip install --ignore-installed --upgrade https://storage.googleapis.com/tensorflow/mac/tensorflow-0.8.0rc0-py3-none-any.whl
Once the conda environment is activated, you can test it.
When you are done with TensorFlow, deactivate the environment:
(tensorflow)$ source deactivate
$ # Your prompt should change back
To use it again later, simply reactivate it :-)
$ source activate tensorflow
(tensorflow)$ # Your prompt should change.
# Run Python programs that use TensorFlow.
...
# When you are done using TensorFlow, deactivate the environment.
(tensorflow)$ source deactivate
Try your first TensorFlow program
(Optional) Enable GPU support
If you installed the GPU-enabled TensorFlow pip binary, you must make sure that the correct versions of the CUDA SDK and CUDNN are installed on your system. See the CUDA installation tutorial.
You also need to set the LD_LIBRARY_PATH and CUDA_HOME environment variables. Consider adding the commands below to your ~/.bash_profile so that they take effect automatically on every login. Note that the commands assume CUDA is installed in /usr/local/cuda:
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/cuda/lib64"
export CUDA_HOME=/usr/local/cuda
Run TensorFlow
Open a Python terminal:
$ python
>>> import tensorflow as tf
>>> hello = tf.constant('Hello, TensorFlow!')
>>> sess = tf.Session()
>>> print(sess.run(hello))
Hello, TensorFlow!
>>> a = tf.constant(10)
>>> b = tf.constant(32)
>>> print(sess.run(a+b))
42
>>>
Install from source
Clone the TensorFlow repository
$ git clone --recurse-submodules https://github.com/tensorflow/tensorflow
The --recurse-submodules flag is required; it fetches the protobuf library that TensorFlow depends on.
Linux installation
Install Bazel
First install Bazel's dependencies by following the tutorial, then download the latest stable release for your operating system from the link, and run the installer as follows:
$ chmod +x PATH_TO_INSTALL.SH
$ ./PATH_TO_INSTALL.SH --user
Replace PATH_TO_INSTALL.SH with the path of the installer you downloaded.
Add the executable path output/bazel to your $PATH environment variable.
Install other dependencies
# For Python 2.7:
$ sudo apt-get install python-numpy swig python-dev python-wheel
# For Python 3.x:
$ sudo apt-get install python3-numpy swig python3-dev python3-wheel
Optional: install CUDA (to enable GPU support on Linux)
To build and run TensorFlow with GPU support, you first need NVIDIA's Cuda Toolkit 7.0 and CUDNN 6.5 V2.
TensorFlow's GPU features only support graphics cards with NVidia Compute Capability >= 3.5. Supported cards include, but are not limited to:
• NVidia Titan
• NVidia Titan X
• NVidia K20
• NVidia K40
Download and install Cuda Toolkit 7.0
See the download page.
Install the toolkit into a path such as /usr/local/cuda.
Download and install CUDNN Toolkit 6.5
See the download page.
Unpack and copy the CUDNN files into the Cuda Toolkit 7.0 installation path. Assuming the Cuda Toolkit 7.0 is installed in /usr/local/cuda, run the following commands:
tar xvzf cudnn-6.5-linux-x64-v2.tgz
sudo cp cudnn-6.5-linux-x64-v2/cudnn.h /usr/local/cuda/include
sudo cp cudnn-6.5-linux-x64-v2/libcudnn* /usr/local/cuda/lib64
Configure TensorFlow's Cuda options
Run from the root of the source tree:
$ ./configure
Do you wish to build TensorFlow with GPU support? [y/n] y
GPU support will be enabled for TensorFlow
Please specify the location where CUDA 7.0 toolkit is installed. Refer to
README.md for more details. [default is: /usr/local/cuda]: /usr/local/cuda
Please specify the location where CUDNN 6.5 V2 library is installed. Refer to
README.md for more details. [default is: /usr/local/cuda]: /usr/local/cuda
Setting up Cuda include
Setting up Cuda lib64
Setting up Cuda bin
Setting up Cuda nvvm
Configuration finished
This configuration creates symbolic links to your system's Cuda libraries. Whenever the Cuda library paths change, you must rerun the steps above; otherwise bazel build commands will fail.
Build the target program with GPU support
Run from the root of the source tree:
$ bazel build -c opt --config=cuda //tensorflow/cc:tutorials_example_trainer
$ bazel-bin/tensorflow/cc/tutorials_example_trainer --use_gpu
# Lots of output. This example computes the major eigenvalue of a 2x2 matrix on the GPU.
# The last few lines of output look like the following.
000009/000005 lambda = 2.000000 x = [0.894427 -0.447214] y = [1.788854 -0.894427]
000006/000001 lambda = 2.000000 x = [0.894427 -0.447214] y = [1.788854 -0.894427]
000009/000009 lambda = 2.000000 x = [0.894427 -0.447214] y = [1.788854 -0.894427]
Note that GPU support must be enabled with the build option "--config=cuda".
Known issues
• Although Cuda-enabled and Cuda-disabled versions can be built in the same source tree, we recommend running "bazel clean" when switching between these two configurations.
• You must run configure before running a bazel build; otherwise the build fails with an error message. In the future we may fold the configure step into the build process to simplify the whole workflow, provided bazel gains features that support it.
Mac OS X installation
Mac and Linux share exactly the same software dependencies, but the installation process differs considerably. The following links will help you install those dependencies on Mac OS X:
Bazel
See the Mac OS X installation guide on that page.
SWIG
See the Mac OS X installation tutorial.
Note: you need to install PCRE, not PCRE2.
Numpy
See the installation tutorial.
Create the pip package and install it
$ bazel build -c opt //tensorflow/tools/pip_package:build_pip_package