
DL TCN case: Regression prediction of Shanghai's daily maximum temperature with a one-dimensional temporal convolutional network (Conv1D + Flatten + Dense) in Keras (converting a time-series dataset into a supervised-learning dataset)

Table of Contents

Case study: using a temporal convolutional network (TCN) for regression prediction of Shanghai's maximum temperature (converting the time series into a supervised-learning dataset)

# 1. Define the dataset

# 2. Data preprocessing

# 2.1 Select model features: define the time-series data

# 2.2 Split the dataset

# 2.3 Convert the time series into a supervised-learning dataset

# 2.4 Separate features and labels

# 3. Model training and inference

# 3.1 Further data processing

# Reshape the input into 3D format (samples, time steps, features)

# 3.2 Build the model: define and train the temporal convolutional network

# 3.3 Train the model

# Training visualization: plot training and validation loss

# 3.4 Model prediction and evaluation

# Plot the predictions


Related articles
Keras TCN: regression prediction of Shanghai's maximum temperature with a temporal convolutional network in Keras (converting the time series into a supervised-learning dataset)
Keras TCN: regression prediction of Shanghai's maximum temperature with a temporal convolutional network in Keras (converting the time series into a supervised-learning dataset) — implementation code

Case study: using a temporal convolutional network (TCN) for regression prediction of Shanghai's maximum temperature (converting the time series into a supervised-learning dataset)

# 1. Define the dataset

| date | week | max_temperature | min_temperature | weather | wind_direction | wind_level | air_quality_index | air_quality_level |
|---|---|---|---|---|---|---|---|---|
| 2021/1/1 | 周五 | 4 | -1 | 晴~多云 | 西北风 | 2级 | 52 | |
| 2021/1/2 | 周六 | 7 | 1 | 晴~多云 | 东北风 | 2级 | 69 | |
| 2021/1/3 | 周日 | 10 | 6 | | 东北风 | 2级 | 66 | |
| 2021/1/4 | 周一 | 13 | 7 | | 东风 | 2级 | 44 | |
| 2021/1/5 | 周二 | 8 | 2 | 阴~多云 | 东北风 | 3级 | 49 | |
| 2021/1/6 | 周三 | 5 | -4 | | 北风 | 3级 | 46 | |
| 2021/1/7 | 周四 | -3 | -6 | | 西北风 | 4级 | 67 | |
| 2021/1/8 | 周五 | -1 | -5 | 阴~晴 | 西北风 | 3级 | 50 | |
| 2021/1/9 | 周六 | 3 | -1 | 晴~多云 | 西北风 | 3级 | 57 | |
| 2021/1/10 | 周日 | 5 | -1 | 阴~多云 | 西北风 | 2级 | 73 | |
<class 'pandas.core.frame.DataFrame'>
DatetimeIndex: 805 entries, 2021-01-01 to 2023-03-16
Data columns (total 8 columns):
 #   Column             Non-Null Count  Dtype
---  ------             --------------  -----
 0   week               805 non-null    object
 1   max_temperature    805 non-null    int64
 2   min_temperature    805 non-null    int64
 3   weather            805 non-null    object
 4   wind_direction     805 non-null    object
 5   wind_level         805 non-null    object
 6   air_quality_index  667 non-null    float64
 7   air_quality_level  775 non-null    object
dtypes: float64(1), int64(2), object(5)
memory usage: 56.6+ KB
None
           week  max_temperature  ...  air_quality_index air_quality_level
date                              ...
2021-01-01   周五                4  ...               52.0
2021-01-02   周六                7  ...               69.0
2021-01-03   周日               10  ...               66.0
2021-01-04   周一               13  ...               44.0
2021-01-05   周二                8  ...               49.0
...         ...              ...  ...                ...               ...
2023-03-12   周日               12  ...               68.0
2023-03-13   周一               14  ...               52.0
2023-03-14   周二               20  ...               55.0
2023-03-15   周三               23  ...               52.0
2023-03-16   周四               15  ...               69.0
[805 rows x 8 columns]
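The schema printed above can be reproduced with a short pandas snippet. Since the source file is not shown in the article, a few rows from the table above are rebuilt inline as a stand-in to keep the sketch self-contained:

```python
import pandas as pd

# A minimal inline reconstruction of the first rows of the dataset shown
# above; in the article the full 805-row table is loaded from a file whose
# name is not given, so this stand-in keeps the snippet self-contained.
data = {
    "date": ["2021/1/1", "2021/1/2", "2021/1/3", "2021/1/4", "2021/1/5"],
    "week": ["周五", "周六", "周日", "周一", "周二"],
    "max_temperature": [4, 7, 10, 13, 8],
    "min_temperature": [-1, 1, 6, 7, 2],
    "air_quality_index": [52.0, 69.0, 66.0, 44.0, 49.0],
}
df = pd.DataFrame(data)
df["date"] = pd.to_datetime(df["date"])
df = df.set_index("date")  # DatetimeIndex, as in the printed schema
print(df.info())
```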

# 2. Data preprocessing

# 2.1 Select model features: define the time-series data
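Only the daily maximum temperature enters the model, as a univariate series. A minimal sketch (a tiny inline DataFrame stands in for the one loaded in step 1, since the loading code is not printed):

```python
import pandas as pd

# Stand-in for the DataFrame from step 1 (only two of its columns).
df = pd.DataFrame({"max_temperature": [4, 7, 10, 13, 8],
                   "min_temperature": [-1, 1, 6, 7, 2]})

# Keep only the target column as a univariate time series.
series = df["max_temperature"].astype("float32").values
print(series.shape)  # one value per day
```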

# 2.2 Split the dataset

df_train (725,)
df_test (80,)
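The shapes printed above, (725,) for training and (80,) for testing, correspond to holding out the final 80 days. A sketch, with a synthetic 805-value series standing in for the real one:

```python
import numpy as np

# Synthetic stand-in for the 805-day max-temperature series.
series = np.arange(805, dtype="float32")

# Hold out the final 80 days for testing, matching the shapes printed above.
n_test = 80
df_train, df_test = series[:-n_test], series[-n_test:]
print(df_train.shape, df_test.shape)  # (725,) (80,)
```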

# 2.3 Convert the time series into a supervised-learning dataset

train_supervised (684, 42)
test_supervised (39, 42)
train_supervised [[ 4.  7. 10. ... 11. 12. 10.]
 [ 7. 10. 13. ... 12. 10. 13.]
 [10. 13.  8. ... 10. 13. 17.]
 ...
 [15. 13. 16. ...  6.  4.  5.]
 [13. 16. 18. ...  4.  5.  6.]
 [16. 18. 17. ...  5.  6.  7.]]
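The printed shapes are consistent with a sliding window of 42 consecutive values (41 lag inputs plus 1 target): 725 - 42 + 1 = 684 training rows and 80 - 42 + 1 = 39 test rows. A sketch of such a conversion (the function name and window size are inferred from the shapes, as the article does not print the code):

```python
import numpy as np

def series_to_supervised(values, window=42):
    """Slide a fixed-length window over a 1-D series: each row holds
    `window` consecutive values (41 lag inputs followed by 1 target)."""
    rows = [values[i:i + window] for i in range(len(values) - window + 1)]
    return np.array(rows, dtype="float32")

train_supervised = series_to_supervised(np.arange(725, dtype="float32"))
test_supervised = series_to_supervised(np.arange(80, dtype="float32"))
print(train_supervised.shape, test_supervised.shape)  # (684, 42) (39, 42)
```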

# 2.4 Separate features and labels
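Separating features and labels then amounts to slicing off the last window column as the target. A sketch, with a random stand-in matrix of the article's supervised shape:

```python
import numpy as np

# Stand-in with the article's supervised shape: 41 lag columns + 1 label.
train_supervised = np.random.rand(684, 42).astype("float32")

# Every column but the last is an input lag; the last is the next-day value.
X_train, y_train = train_supervised[:, :-1], train_supervised[:, -1]
print(X_train.shape, y_train.shape)  # (684, 41) (684,)
```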

# 3. Model training and inference

# 3.1 Further data processing

# Reshape the input into 3D format (samples, time steps, features)
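A sketch of the 3D reshape, with a random stand-in for the training inputs:

```python
import numpy as np

# Stand-in for the 2-D training inputs (samples, lags).
X_train = np.random.rand(684, 41).astype("float32")

# Conv1D expects (samples, time steps, features): each of the 41 lags
# becomes one time step carrying a single feature.
X_train = X_train.reshape((X_train.shape[0], X_train.shape[1], 1))
print(X_train.shape)  # (684, 41, 1)
```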

# 3.2 Build the model: define and train the temporal convolutional network
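A minimal Conv1D + Flatten + Dense regressor matching the stack named in the title; the filter count, kernel size, and hidden width are illustrative assumptions, since the article does not print the architecture:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Conv1D, Flatten, Dense

# Illustrative Conv1D + Flatten + Dense stack; hyperparameters are assumed.
n_steps, n_features = 41, 1
model = Sequential([
    Input(shape=(n_steps, n_features)),
    Conv1D(filters=64, kernel_size=3, activation="relu"),
    Flatten(),
    Dense(50, activation="relu"),
    Dense(1),  # single regression output: next-day max temperature
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```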

# 3.3 Train the model
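The log below shows 1000 epochs with verbose=2 and a validation set. A runnable sketch of the fit call, with tiny random data standing in for the real inputs and the epoch count cut to 2 so it finishes quickly:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Conv1D, Flatten, Dense

# Tiny random stand-in data; in the article X has shape (684, 41, 1).
X = np.random.rand(64, 41, 1).astype("float32")
y = np.random.rand(64).astype("float32")

model = Sequential([Input(shape=(41, 1)),
                    Conv1D(32, 3, activation="relu"),
                    Flatten(), Dense(1)])
model.compile(optimizer="adam", loss="mse")

# The article trains for 1000 epochs; 2 epochs keep this sketch fast.
history = model.fit(X, y, epochs=2, batch_size=16,
                    validation_split=0.1, verbose=2)
```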

Epoch 1/1000
43/43 - 1s - loss: 124.9150 - val_loss: 28.0470 - 764ms/epoch - 18ms/step
Epoch 2/1000
43/43 - 0s - loss: 41.9717 - val_loss: 41.4632 - 100ms/epoch - 2ms/step
Epoch 3/1000
43/43 - 0s - loss: 55.0186 - val_loss: 36.8242 - 92ms/epoch - 2ms/step
Epoch 4/1000
43/43 - 0s - loss: 48.0289 - val_loss: 35.4341 - 105ms/epoch - 2ms/step
Epoch 5/1000
43/43 - 0s - loss: 46.5520 - val_loss: 34.2107 - 97ms/epoch - 2ms/step
Epoch 6/1000
43/43 - 0s - loss: 39.3062 - val_loss: 34.9127 - 98ms/epoch - 2ms/step
Epoch 7/1000
43/43 - 0s - loss: 41.5857 - val_loss: 33.3312 - 102ms/epoch - 2ms/step
Epoch 8/1000
43/43 - 0s - loss: 39.6624 - val_loss: 36.8456 - 89ms/epoch - 2ms/step
Epoch 9/1000
43/43 - 0s - loss: 37.2232 - val_loss: 32.2082 - 78ms/epoch - 2ms/step
Epoch 10/1000
43/43 - 0s - loss: 32.3461 - val_loss: 32.3342 - 81ms/epoch - 2ms/step
Epoch 11/1000
43/43 - 0s - loss: 32.4692 - val_loss: 31.6105 - 87ms/epoch - 2ms/step
Epoch 12/1000
43/43 - 0s - loss: 31.0049 - val_loss: 31.7516 - 85ms/epoch - 2ms/step
Epoch 13/1000
43/43 - 0s - loss: 31.4120 - val_loss: 32.4814 - 94ms/epoch - 2ms/step
Epoch 14/1000
43/43 - 0s - loss: 30.3727 - val_loss: 32.3661 - 172ms/epoch - 4ms/step
Epoch 15/1000
43/43 - 0s - loss: 28.5779 - val_loss: 33.0914 - 104ms/epoch - 2ms/step
Epoch 16/1000
43/43 - 0s - loss: 28.2869 - val_loss: 33.0207 - 96ms/epoch - 2ms/step
Epoch 17/1000
43/43 - 0s - loss: 27.9438 - val_loss: 33.5665 - 96ms/epoch - 2ms/step
Epoch 18/1000
43/43 - 0s - loss: 26.8220 - val_loss: 34.3124 - 108ms/epoch - 3ms/step
Epoch 19/1000
43/43 - 0s - loss: 26.4627 - val_loss: 34.0599 - 95ms/epoch - 2ms/step
Epoch 20/1000
43/43 - 0s - loss: 26.7424 - val_loss: 34.9252 - 91ms/epoch - 2ms/step
Epoch 21/1000
43/43 - 0s - loss: 25.3610 - val_loss: 35.1354 - 104ms/epoch - 2ms/step
Epoch 22/1000
43/43 - 0s - loss: 25.3079 - val_loss: 36.1249 - 97ms/epoch - 2ms/step
Epoch 23/1000
43/43 - 0s - loss: 24.5994 - val_loss: 36.0049 - 93ms/epoch - 2ms/step
Epoch 24/1000
43/43 - 0s - loss: 24.4569 - val_loss: 36.4239 - 105ms/epoch - 2ms/step
Epoch 25/1000
43/43 - 0s - loss: 23.9317 - val_loss: 35.9488 - 93ms/epoch - 2ms/step
Epoch 26/1000
43/43 - 0s - loss: 23.4671 - val_loss: 36.1248 - 87ms/epoch - 2ms/step
Epoch 27/1000
43/43 - 0s - loss: 23.2060 - val_loss: 36.7490 - 85ms/epoch - 2ms/step
Epoch 28/1000
43/43 - 0s - loss: 23.0000 - val_loss: 36.9432 - 74ms/epoch - 2ms/step
Epoch 29/1000
43/43 - 0s - loss: 22.9452 - val_loss: 37.4504 - 69ms/epoch - 2ms/step
Epoch 30/1000
43/43 - 0s - loss: 22.1462 - val_loss: 38.7905 - 70ms/epoch - 2ms/step
Epoch 31/1000
43/43 - 0s - loss: 22.0330 - val_loss: 38.7575 - 103ms/epoch - 2ms/step
Epoch 32/1000
43/43 - 0s - loss: 22.0065 - val_loss: 38.2103 - 84ms/epoch - 2ms/step
Epoch 33/1000
43/43 - 0s - loss: 21.5495 - val_loss: 39.7773 - 98ms/epoch - 2ms/step
Epoch 34/1000
43/43 - 0s - loss: 21.3134 - val_loss: 40.6234 - 106ms/epoch - 2ms/step
Epoch 35/1000
43/43 - 0s - loss: 20.8954 - val_loss: 41.8935 - 91ms/epoch - 2ms/step
Epoch 36/1000
43/43 - 0s - loss: 20.6854 - val_loss: 42.6756 - 98ms/epoch - 2ms/step
Epoch 37/1000
43/43 - 0s - loss: 20.6962 - val_loss: 44.5400 - 102ms/epoch - 2ms/step
Epoch 38/1000
43/43 - 0s - loss: 20.2958 - val_loss: 43.9005 - 83ms/epoch - 2ms/step
Epoch 39/1000
43/43 - 0s - loss: 20.0900 - val_loss: 46.3218 - 80ms/epoch - 2ms/step
Epoch 40/1000
43/43 - 0s - loss: 19.8518 - val_loss: 47.3231 - 104ms/epoch - 2ms/step
Epoch 41/1000
43/43 - 0s - loss: 19.7069 - val_loss: 48.0385 - 100ms/epoch - 2ms/step
Epoch 42/1000
43/43 - 0s - loss: 19.4992 - val_loss: 50.7614 - 97ms/epoch - 2ms/step
Epoch 43/1000
43/43 - 0s - loss: 19.1684 - val_loss: 49.1486 - 105ms/epoch - 2ms/step
Epoch 44/1000
43/43 - 0s - loss: 19.0760 - val_loss: 53.2720 - 99ms/epoch - 2ms/step
Epoch 45/1000
43/43 - 0s - loss: 18.9300 - val_loss: 50.1390 - 100ms/epoch - 2ms/step
Epoch 46/1000
43/43 - 0s - loss: 18.8669 - val_loss: 54.0359 - 106ms/epoch - 2ms/step
Epoch 47/1000
43/43 - 0s - loss: 18.6694 - val_loss: 52.4647 - 95ms/epoch - 2ms/step
Epoch 48/1000
43/43 - 0s - loss: 18.5207 - val_loss: 55.3733 - 97ms/epoch - 2ms/step
Epoch 49/1000
43/43 - 0s - loss: 18.2313 - val_loss: 54.8835 - 106ms/epoch - 2ms/step
Epoch 50/1000
43/43 - 0s - loss: 18.1582 - val_loss: 53.5451 - 100ms/epoch - 2ms/step
...
43/43 - 0s - loss: 1.2712 - val_loss: 151.7615 - 95ms/epoch - 2ms/step
Epoch 993/1000
43/43 - 0s - loss: 1.1109 - val_loss: 153.4365 - 90ms/epoch - 2ms/step
Epoch 994/1000
43/43 - 0s - loss: 1.2277 - val_loss: 154.8537 - 135ms/epoch - 3ms/step
Epoch 995/1000
43/43 - 0s - loss: 1.1820 - val_loss: 154.8023 - 202ms/epoch - 5ms/step
Epoch 996/1000
43/43 - 0s - loss: 1.5359 - val_loss: 153.2385 - 195ms/epoch - 5ms/step
Epoch 997/1000
43/43 - 0s - loss: 1.6835 - val_loss: 154.5790 - 212ms/epoch - 5ms/step
Epoch 998/1000
43/43 - 0s - loss: 2.7265 - val_loss: 149.3467 - 197ms/epoch - 5ms/step
Epoch 999/1000
43/43 - 0s - loss: 3.3956 - val_loss: 158.7523 - 179ms/epoch - 4ms/step
Epoch 1000/1000
43/43 - 0s - loss: 6.5273 - val_loss: 141.5004 - 180ms/epoch - 4ms/step

# Training visualization: plot training and validation loss
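Plotting the two curves from `history.history` gives the usual loss figure. A sketch, with the first few logged values above used as stand-in data:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Stand-in for history.history from model.fit (first four logged epochs).
history = {"loss": [124.9150, 41.9717, 55.0186, 48.0289],
           "val_loss": [28.0470, 41.4632, 36.8242, 35.4341]}

plt.plot(history["loss"], label="train loss")
plt.plot(history["val_loss"], label="val loss")
plt.xlabel("epoch")
plt.ylabel("MSE loss")
plt.legend()
plt.savefig("loss_curve.png")
```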

# 3.4 Model prediction and evaluation

    y_val  y_val_pred
0    18.0   14.574027
1    23.0    9.982560
2    23.0   11.338496
3    24.0   10.234162
4    25.0   16.762114
5    27.0    9.477368
6    22.0   -0.121278
7    12.0   19.867815
8    14.0   22.165188
9    20.0   22.839424
10   23.0   20.204948
11   15.0   16.035151
weather_shanghai_2000_val_MAE: 9.371264984210333
weather_shanghai_2000_val_MSE: 126.371276982466
weather_shanghai_2000_val_RMSE: 11.241497986588175
weather_shanghai_2000_val_R2: -5.1394952380145424
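The printed metrics can be reproduced from the actual/predicted pairs in the table above (within rounding of the printed predictions):

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Actual and predicted values copied from the comparison table above.
y_val = np.array([18, 23, 23, 24, 25, 27, 22, 12, 14, 20, 23, 15], dtype=float)
y_val_pred = np.array([14.574027, 9.982560, 11.338496, 10.234162, 16.762114,
                       9.477368, -0.121278, 19.867815, 22.165188, 22.839424,
                       20.204948, 16.035151])

mae = mean_absolute_error(y_val, y_val_pred)
mse = mean_squared_error(y_val, y_val_pred)
rmse = np.sqrt(mse)
r2 = r2_score(y_val, y_val_pred)
print(mae, mse, rmse, r2)  # matches the printed metrics up to rounding
```

The negative R² confirms what the loss curves already suggest: after 1000 epochs the model badly overfits, and predicting the validation mean would beat it.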

# Plot the predictions
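A sketch of the overlay plot, using the actual and predicted values from the table above (rounded):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt

# Actual vs. predicted held-out temperatures from the table above (rounded).
y_val = [18, 23, 23, 24, 25, 27, 22, 12, 14, 20, 23, 15]
y_val_pred = [14.57, 9.98, 11.34, 10.23, 16.76, 9.48,
              -0.12, 19.87, 22.17, 22.84, 20.20, 16.04]

plt.plot(y_val, marker="o", label="actual")
plt.plot(y_val_pred, marker="x", label="predicted")
plt.xlabel("test day")
plt.ylabel("max temperature (°C)")
plt.legend()
plt.savefig("prediction_vs_actual.png")
```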
