
NLP: Chinese Named Entity Recognition (NER) with kashgari and BERT

A BERT-based NER model built with kashgari

The script below loads the ChineseDailyNerCorpus dataset that ships with kashgari, wraps a pre-trained Chinese BERT checkpoint as the embedding layer, and trains a BiLSTM-CRF sequence-labeling head on top of it.

from kashgari.corpus import ChineseDailyNerCorpus
import kashgari
from kashgari.embeddings import BERTEmbedding
from kashgari.tasks.labeling import BiLSTM_CRF_Model

# Load the train / validation / test splits of the corpus bundled with kashgari.
# Each x is a list of characters; each y is the matching list of BIO entity tags.
train_x, train_y = ChineseDailyNerCorpus.load_data('train')
valid_x, valid_y = ChineseDailyNerCorpus.load_data('validate')
test_x, test_y = ChineseDailyNerCorpus.load_data('test')

print(f"train data count: {len(train_x)}")
print(f"validate data count: {len(valid_x)}")
print(f"test data count: {len(test_x)}")
# train data count: 20864
# validate data count: 2318
# test data count: 4636

# Wrap Google's pre-trained Chinese BERT-Base checkpoint (chinese_L-12_H-768_A-12)
# as a sequence-labeling embedding; inputs are padded / truncated to 100 tokens.
bert_embed = BERTEmbedding('chinese_L-12_H-768_A-12',
                           task=kashgari.LABELING,
                           sequence_length=100)

# Stack a BiLSTM-CRF labeling head on top of the BERT embedding and train it.
model = BiLSTM_CRF_Model(bert_embed)
model.fit(train_x,
          train_y,
          x_validate=valid_x,
          y_validate=valid_y,
          epochs=20,
          batch_size=512)
# Persist the trained model to disk.
model.save('ner.h5')

# Report entity-level precision / recall / F1 on the test split.
model.evaluate(test_x, test_y)
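
After training, the saved model can be loaded back for inference. A minimal sketch, assuming the kashgari 1.x API (kashgari.utils.load_model plus the labeling model's predict_entities); the sentence below is only an illustrative, made-up example:

import kashgari

# Reload the model saved above; the path must match the one passed to model.save().
loaded_model = kashgari.utils.load_model('ner.h5')

# kashgari expects tokenized input: a list of sentences, each given as a list of characters.
sentence = list('中国国家主席访问了北京大学')   # hypothetical example sentence
print(loaded_model.predict_entities([sentence]))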

Results:

train data count: 20864
validate data count: 2318
test data count: 4636
WARNING:root:seq_len: 100
Model: "model_4"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
Input-Token (InputLayer)        [(None, 100)]        0                                            
__________________________________________________________________________________________________
Input-Segment (InputLayer)      [(None, 100)]        0                                            
__________________________________________________________________________________________________
Embedding-Token (TokenEmbedding [(None, 100, 768), ( 16226304    Input-Token[0][0]                
__________________________________________________________________________________________________
Embedding-Segment (Embedding)   (None, 100, 768)     1536        Input-Segment[0][0]              
__________________________________________________________________________________________________
Embedding-Token-Segment (Add)   (None, 100, 768)     0           Embedding-Token[0][0]            
                                                                 Embedding-Segment[0][0]          
__________________________________________________________________________________________________
Embedding-Position (PositionEmb (None, 100, 768)     76800       Embedding-Token-Segment[0][0]    
__________________________________________________________________________________________________
Embedding-Dropout (Dropout)     (None, 100, 768)     0           Embedding-Position[0][0]         
__________________________________________________________________________________________________
Embedding-Norm (LayerNormalizat (None, 100, 768)     1536        Embedding-Dropout[0][0]          
__________________________________________________________________________________________________
Encoder-1-MultiHeadSelfAttentio (None, 100, 768)     2362368     Embedding-Norm[0][0]             
__________________________________________________________________________________________________
Encoder-1-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-1-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-1-MultiHeadSelfAttentio (None, 100, 768)     0           Embedding-Norm[0][0]             
                                                                 Encoder-1-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-1-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-1-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-1-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-1-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-1-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-1-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-1-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-1-MultiHeadSelfAttention-
                                                                 Encoder-1-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-1-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-1-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-2-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-1-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-2-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-2-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-2-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-1-FeedForward-Norm[0][0] 
                                                                 Encoder-2-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-2-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-2-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-2-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-2-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-2-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-2-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-2-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-2-MultiHeadSelfAttention-
                                                                 Encoder-2-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-2-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-2-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-3-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-2-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-3-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-3-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-3-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-2-FeedForward-Norm[0][0] 
                                                                 Encoder-3-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-3-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-3-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-3-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-3-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-3-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-3-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-3-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-3-MultiHeadSelfAttention-
                                                                 Encoder-3-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-3-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-3-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-4-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-3-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-4-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-4-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-4-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-3-FeedForward-Norm[0][0] 
                                                                 Encoder-4-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-4-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-4-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-4-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-4-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-4-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-4-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-4-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-4-MultiHeadSelfAttention-
                                                                 Encoder-4-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-4-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-4-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-5-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-4-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-5-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-5-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-5-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-4-FeedForward-Norm[0][0] 
                                                                 Encoder-5-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-5-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-5-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-5-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-5-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-5-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-5-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-5-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-5-MultiHeadSelfAttention-
                                                                 Encoder-5-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-5-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-5-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-6-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-5-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-6-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-6-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-6-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-5-FeedForward-Norm[0][0] 
                                                                 Encoder-6-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-6-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-6-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-6-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-6-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-6-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-6-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-6-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-6-MultiHeadSelfAttention-
                                                                 Encoder-6-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-6-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-6-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-7-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-6-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-7-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-7-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-7-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-6-FeedForward-Norm[0][0] 
                                                                 Encoder-7-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-7-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-7-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-7-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-7-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-7-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-7-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-7-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-7-MultiHeadSelfAttention-
                                                                 Encoder-7-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-7-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-7-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-8-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-7-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-8-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-8-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-8-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-7-FeedForward-Norm[0][0] 
                                                                 Encoder-8-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-8-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-8-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-8-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-8-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-8-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-8-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-8-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-8-MultiHeadSelfAttention-
                                                                 Encoder-8-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-8-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-8-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-9-MultiHeadSelfAttentio (None, 100, 768)     2362368     Encoder-8-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-9-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-9-MultiHeadSelfAttention[
__________________________________________________________________________________________________
Encoder-9-MultiHeadSelfAttentio (None, 100, 768)     0           Encoder-8-FeedForward-Norm[0][0] 
                                                                 Encoder-9-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-9-MultiHeadSelfAttentio (None, 100, 768)     1536        Encoder-9-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-9-FeedForward (FeedForw (None, 100, 768)     4722432     Encoder-9-MultiHeadSelfAttention-
__________________________________________________________________________________________________
Encoder-9-FeedForward-Dropout ( (None, 100, 768)     0           Encoder-9-FeedForward[0][0]      
__________________________________________________________________________________________________
Encoder-9-FeedForward-Add (Add) (None, 100, 768)     0           Encoder-9-MultiHeadSelfAttention-
                                                                 Encoder-9-FeedForward-Dropout[0][
__________________________________________________________________________________________________
Encoder-9-FeedForward-Norm (Lay (None, 100, 768)     1536        Encoder-9-FeedForward-Add[0][0]  
__________________________________________________________________________________________________
Encoder-10-MultiHeadSelfAttenti (None, 100, 768)     2362368     Encoder-9-FeedForward-Norm[0][0] 
__________________________________________________________________________________________________
Encoder-10-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-10-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-10-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-9-FeedForward-Norm[0][0] 
                                                                 Encoder-10-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-10-MultiHeadSelfAttenti (None, 100, 768)     1536        Encoder-10-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-10-FeedForward (FeedFor (None, 100, 768)     4722432     Encoder-10-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-10-FeedForward-Dropout  (None, 100, 768)     0           Encoder-10-FeedForward[0][0]     
__________________________________________________________________________________________________
Encoder-10-FeedForward-Add (Add (None, 100, 768)     0           Encoder-10-MultiHeadSelfAttention
                                                                 Encoder-10-FeedForward-Dropout[0]
__________________________________________________________________________________________________
Encoder-10-FeedForward-Norm (La (None, 100, 768)     1536        Encoder-10-FeedForward-Add[0][0] 
__________________________________________________________________________________________________
Encoder-11-MultiHeadSelfAttenti (None, 100, 768)     2362368     Encoder-10-FeedForward-Norm[0][0]
__________________________________________________________________________________________________
Encoder-11-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-11-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-11-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-10-FeedForward-Norm[0][0]
                                                                 Encoder-11-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-11-MultiHeadSelfAttenti (None, 100, 768)     1536        Encoder-11-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-11-FeedForward (FeedFor (None, 100, 768)     4722432     Encoder-11-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-11-FeedForward-Dropout  (None, 100, 768)     0           Encoder-11-FeedForward[0][0]     
__________________________________________________________________________________________________
Encoder-11-FeedForward-Add (Add (None, 100, 768)     0           Encoder-11-MultiHeadSelfAttention
                                                                 Encoder-11-FeedForward-Dropout[0]
__________________________________________________________________________________________________
Encoder-11-FeedForward-Norm (La (None, 100, 768)     1536        Encoder-11-FeedForward-Add[0][0] 
__________________________________________________________________________________________________
Encoder-12-MultiHeadSelfAttenti (None, 100, 768)     2362368     Encoder-11-FeedForward-Norm[0][0]
__________________________________________________________________________________________________
Encoder-12-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-12-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-12-MultiHeadSelfAttenti (None, 100, 768)     0           Encoder-11-FeedForward-Norm[0][0]
                                                                 Encoder-12-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-12-MultiHeadSelfAttenti (None, 100, 768)     1536        Encoder-12-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-12-FeedForward (FeedFor (None, 100, 768)     4722432     Encoder-12-MultiHeadSelfAttention
__________________________________________________________________________________________________
Encoder-12-FeedForward-Dropout  (None, 100, 768)     0           Encoder-12-FeedForward[0][0]     
__________________________________________________________________________________________________
Encoder-12-FeedForward-Add (Add (None, 100, 768)     0           Encoder-12-MultiHeadSelfAttention
                                                                 Encoder-12-FeedForward-Dropout[0]
__________________________________________________________________________________________________
Encoder-12-FeedForward-Norm (La (None, 100, 768)     1536        Encoder-12-FeedForward-Add[0][0] 
__________________________________________________________________________________________________
Encoder-Output (Concatenate)    (None, 100, 3072)    0           Encoder-9-FeedForward-Norm[0][0] 
                                                                 Encoder-10-FeedForward-Norm[0][0]
                                                                 Encoder-11-FeedForward-Norm[0][0]
                                                                 Encoder-12-FeedForward-Norm[0][0]
__________________________________________________________________________________________________
non_masking_layer (NonMaskingLa (None, 100, 3072)    0           Encoder-Output[0][0]             
__________________________________________________________________________________________________
layer_blstm (Bidirectional)     (None, 100, 256)     3277824     non_masking_layer[0][0]          
__________________________________________________________________________________________________
layer_dense (Dense)             (None, 100, 64)      16448       layer_blstm[0][0]                
__________________________________________________________________________________________________
layer_crf_dense (Dense)         (None, 100, 8)       520         layer_dense[0][0]                
__________________________________________________________________________________________________
layer_crf (CRF)                 (None, 100, 8)       64          layer_crf_dense[0][0]            
==================================================================================================
Total params: 104,655,496
Trainable params: 3,294,856
Non-trainable params: 101,360,640
__________________________________________________________________________________________________
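
As the summary shows, the BERT encoder is loaded but kept frozen (Non-trainable params: 101,360,640); only the head stacked on the concatenated outputs of the last four encoder blocks is trained: NonMasking -> BiLSTM -> Dense -> Dense -> CRF. A quick back-of-the-envelope check of the "Trainable params" figure, reconstructed from the layer shapes above (plain arithmetic, not part of the original script):

# Head input: the last 4 encoder outputs concatenated -> 4 * 768 = 3072 features per token.
input_dim = 3072
lstm_units = 128                                             # per direction (Bidirectional output is 256)
blstm = 2 * 4 * (input_dim + lstm_units + 1) * lstm_units    # 3,277,824
dense = 256 * 64 + 64                                        # 16,448
crf_dense = 64 * 8 + 8                                       # 520
crf_transitions = 8 * 8                                      # 64 (8 tags x 8 tags)
print(blstm + dense + crf_dense + crf_transitions)           # 3,294,856 == Trainable params
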
Epoch 1/20

 1/41 [..............................] - ETA: 2:23:31 - loss: 189.6792 - accuracy: 0.0289
 2/41 [>.............................] - ETA: 2:20:27 - loss: 111.6294 - accuracy: 0.4817
 3/41 [=>............................] - ETA: 2:14:09 - loss: 85.3781 - accuracy: 0.6322 
 4/41 [=>............................] - ETA: 2:07:16 - loss: 70.9284 - accuracy: 0.7092
 5/41 [==>...........................] - ETA: 1:53:34 - loss: 61.8583 - accuracy: 0.7563
 6/41 [===>..........................] - ETA: 1:43:57 - loss: 55.4911 - accuracy: 0.7879
 7/41 [====>.........................] - ETA: 1:36:36 - loss: 50.8054 - accuracy: 0.8099
 8/41 [====>.........................] - ETA: 1:30:29 - loss: 46.8067 - accuracy: 0.8272
 9/41 [=====>........................] - ETA: 1:25:18 - loss: 43.7727 - accuracy: 0.8397
10/41 [======>.......................] - ETA: 1:20:42 - loss: 41.0071 - accuracy: 0.8504
11/41 [=======>......................] - ETA: 1:16:35 - loss: 38.5938 - accuracy: 0.8594
12/41 [=======>......................] - ETA: 1:12:49 - loss: 36.5614 - accuracy: 0.8668
13/41 [========>.....................] - ETA: 1:09:18 - loss: 34.7422 - accuracy: 0.8736
14/41 [=========>....................] - ETA: 1:05:59 - loss: 33.1137 - accuracy: 0.8800
15/41 [=========>....................] - ETA: 1:02:51 - loss: 31.5732 - accuracy: 0.8861
16/41 [==========>...................] - ETA: 59:51 - loss: 30.2485 - accuracy: 0.8912  
17/41 [===========>..................] - ETA: 56:58 - loss: 29.0256 - accuracy: 0.8958
18/41 [============>.................] - ETA: 54:10 - loss: 27.9502 - accuracy: 0.8999
19/41 [============>.................] - ETA: 51:27 - loss: 26.9262 - accuracy: 0.9037
20/41 [=============>................] - ETA: 48:48 - loss: 26.0287 - accuracy: 0.9070
21/41 [==============>...............] - ETA: 46:13 - loss: 25.1085 - accuracy: 0.9104
22/41 [===============>..............] - ETA: 43:40 - loss: 24.3304 - accuracy: 0.9133
23/41 [===============>..............] - ETA: 41:10 - loss: 23.5406 - accuracy: 0.9162
24/41 [================>.............] - ETA: 38:43 - loss: 22.8484 - accuracy: 0.9188
25/41 [=================>............] - ETA: 36:18 - loss: 22.2058 - accuracy: 0.9212
26/41 [==================>...........] - ETA: 33:54 - loss: 21.6206 - accuracy: 0.9234
27/41 [==================>...........] - ETA: 31:32 - loss: 21.0356 - accuracy: 0.9256
28/41 [===================>..........] - ETA: 29:11 - loss: 20.5043 - accuracy: 0.9276
29/41 [====================>.........] - ETA: 26:51 - loss: 19.9868 - accuracy: 0.9295
30/41 [====================>.........] - ETA: 24:33 - loss: 19.5049 - accuracy: 0.9314
31/41 [=====================>........] - ETA: 22:15 - loss: 19.0341 - accuracy: 0.9332
32/41 [======================>.......] - ETA: 19:59 - loss: 18.5822 - accuracy: 0.9349
33/41 [=======================>......] - ETA: 17:43 - loss: 18.1677 - accuracy: 0.9364
34/41 [=======================>......] - ETA: 15:28 - loss: 17.7761 - accuracy: 0.9379
35/41 [========================>.....] - ETA: 13:15 - loss: 17.3926 - accuracy: 0.9394
36/41 [=========================>....] - ETA: 11:01 - loss: 17.0286 - accuracy: 0.9407
37/41 [==========================>...] - ETA: 8:48 - loss: 16.6884 - accuracy: 0.9420 
38/41 [==========================>...] - ETA: 6:35 - loss: 16.3570 - accuracy: 0.9432
39/41 [===========================>..] - ETA: 4:23 - loss: 16.0342 - accuracy: 0.9444
40/41 [============================>.] - ETA: 2:12 - loss: 15.7314 - accuracy: 0.9455
41/41 [==============================] - 10803s 263s/step - loss: 15.4359 - accuracy: 0.9463 - val_loss: 89.0157 - val_accuracy: 0.9726
Epoch 2/20

 1/41 [..............................] - ETA: 1:52:16 - loss: 3.8260 - accuracy: 0.9887
 2/41 [>.............................] - ETA: 2:02:00 - loss: 3.6733 - accuracy: 0.9896
 3/41 [=>............................] - ETA: 2:03:56 - loss: 3.8901 - accuracy: 0.9888
 4/41 [=>............................] - ETA: 2:02:28 - loss: 3.8505 - accuracy: 0.9888
 5/41 [==>...........................] - ETA: 2:00:36 - loss: 3.6963 - accuracy: 0.9894
 6/41 [===>..........................] - ETA: 1:58:02 - loss: 3.6410 - accuracy: 0.9895
 7/41 [====>.........................] - ETA: 1:50:32 - loss: 3.6530 - accuracy: 0.9894
 8/41 [====>.........................] - ETA: 1:43:52 - loss: 3.6497 - accuracy: 0.9894
 9/41 [=====>........................] - ETA: 1:38:02 - loss: 3.5950 - accuracy: 0.9896
10/41 [======>.......................] - ETA: 1:31:57 - loss: 3.5294 - accuracy: 0.9899
11/41 [=======>......................] - ETA: 1:26:34 - loss: 3.4709 - accuracy: 0.9900
12/41 [=======>......................] - ETA: 1:21:42 - loss: 3.4425 - accuracy: 0.9901
13/41 [========>.....................] - ETA: 1:17:19 - loss: 3.4268 - accuracy: 0.9901
14/41 [=========>....................] - ETA: 1:15:31 - loss: 3.4032 - accuracy: 0.9902
15/41 [=========>....................] - ETA: 1:11:30 - loss: 3.3936 - accuracy: 0.9902
16/41 [==========>...................] - ETA: 1:08:36 - loss: 3.3785 - accuracy: 0.9902
17/41 [===========>..................] - ETA: 1:04:56 - loss: 3.3780 - accuracy: 0.9902
18/41 [============>.................] - ETA: 1:01:25 - loss: 3.3540 - accuracy: 0.9903
19/41 [============>.................] - ETA: 58:02 - loss: 3.3473 - accuracy: 0.9903  
20/41 [=============>................] - ETA: 54:56 - loss: 3.3040 - accuracy: 0.9904
21/41 [==============>...............] - ETA: 51:55 - loss: 3.2783 - accuracy: 0.9904
22/41 [===============>..............] - ETA: 48:51 - loss: 3.2713 - accuracy: 0.9904
23/41 [===============>..............] - ETA: 45:54 - loss: 3.2449 - accuracy: 0.9905
24/41 [================>.............] - ETA: 42:59 - loss: 3.2201 - accuracy: 0.9906
25/41 [=================>............] - ETA: 40:10 - loss: 3.2031 - accuracy: 0.9906
26/41 [==================>...........] - ETA: 37:23 - loss: 3.1892 - accuracy: 0.9907
27/41 [==================>...........] - ETA: 34:40 - loss: 3.1780 - accuracy: 0.9907
28/41 [===================>..........] - ETA: 31:59 - loss: 3.1598 - accuracy: 0.9907
29/41 [====================>.........] - ETA: 29:21 - loss: 3.1312 - accuracy: 0.9908
30/41 [====================>.........] - ETA: 26:45 - loss: 3.1161 - accuracy: 0.9908
31/41 [=====================>........] - ETA: 24:13 - loss: 3.0992 - accuracy: 0.9909
32/41 [======================>.......] - ETA: 21:41 - loss: 3.0769 - accuracy: 0.9909
33/41 [=======================>......] - ETA: 19:11 - loss: 3.0552 - accuracy: 0.9910
34/41 [=======================>......] - ETA: 16:43 - loss: 3.0336 - accuracy: 0.9911
35/41 [========================>.....] - ETA: 14:16 - loss: 3.0293 - accuracy: 0.9911
36/41 [=========================>....] - ETA: 11:51 - loss: 3.0031 - accuracy: 0.9912
37/41 [==========================>...] - ETA: 9:27 - loss: 2.9999 - accuracy: 0.9911 
38/41 [==========================>...] - ETA: 7:04 - loss: 2.9762 - accuracy: 0.9912
39/41 [===========================>..] - ETA: 4:42 - loss: 2.9595 - accuracy: 0.9912
40/41 [============================>.] - ETA: 2:20 - loss: 2.9461 - accuracy: 0.9913
41/41 [==============================] - 6191s 151s/step - loss: 2.9283 - accuracy: 0.9913 - val_loss: 88.1398 - val_accuracy: 0.9763
Epoch 3/20

 1/41 [..............................] - ETA: 1:21:47 - loss: 2.3645 - accuracy: 0.9929
 2/41 [>.............................] - ETA: 1:19:44 - loss: 2.3826 - accuracy: 0.9926
 3/41 [=>............................] - ETA: 1:17:39 - loss: 2.2856 - accuracy: 0.9929
 4/41 [=>............................] - ETA: 1:15:51 - loss: 2.3566 - accuracy: 0.9927
 5/41 [==>...........................] - ETA: 1:13:48 - loss: 2.4427 - accuracy: 0.9923
 6/41 [===>..........................] - ETA: 1:11:44 - loss: 2.3604 - accuracy: 0.9926
 7/41 [====>.........................] - ETA: 1:09:43 - loss: 2.3260 - accuracy: 0.9927
 8/41 [====>.........................] - ETA: 1:07:43 - loss: 2.2732 - accuracy: 0.9929
 9/41 [=====>........................] - ETA: 1:05:42 - loss: 2.2614 - accuracy: 0.9930
10/41 [======>.......................] - ETA: 1:03:39 - loss: 2.2640 - accuracy: 0.9930
11/41 [=======>......................] - ETA: 1:01:35 - loss: 2.2310 - accuracy: 0.9931
12/41 [=======>......................] - ETA: 59:31 - loss: 2.2228 - accuracy: 0.9932  
13/41 [========>.....................] - ETA: 57:27 - loss: 2.2280 - accuracy: 0.9931
14/41 [=========>....................] - ETA: 55:26 - loss: 2.2386 - accuracy: 0.9931
15/41 [=========>....................] - ETA: 53:23 - loss: 2.2361 - accuracy: 0.9930
16/41 [==========>...................] - ETA: 51:22 - loss: 2.2266 - accuracy: 0.9931
17/41 [===========>..................] - ETA: 49:21 - loss: 2.2104 - accuracy: 0.9931
18/41 [============>.................] - ETA: 47:21 - loss: 2.2063 - accuracy: 0.9931
19/41 [============>.................] - ETA: 45:18 - loss: 2.1942 - accuracy: 0.9932
20/41 [=============>................] - ETA: 43:14 - loss: 2.1704 - accuracy: 0.9933
21/41 [==============>...............] - ETA: 41:11 - loss: 2.1761 - accuracy: 0.9932
22/41 [===============>..............] - ETA: 39:09 - loss: 2.1709 - accuracy: 0.9933
23/41 [===============>..............] - ETA: 37:05 - loss: 2.1618 - accuracy: 0.9933
24/41 [================>.............] - ETA: 35:02 - loss: 2.1493 - accuracy: 0.9933
(remaining training output truncated)
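
Two details worth noting in the log above: each epoch runs 41 steps because the 20,864 training sentences are consumed in batches of 512, and even with the BERT weights frozen training is slow on this run, since the frozen encoder still has to run a forward pass on every batch (the first epoch took 10,803 s). The step count is simple arithmetic:

import math
print(math.ceil(20864 / 512))   # 41 steps per epoch, matching the x/41 progress bars above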