
『NLP Classic Projects』09: Extracting Information from Express Waybills with Bi-GRU+CRF

How to extract key information from an express waybill
Note

Before running this project, make sure the project files below have been loaded.



This project demonstrates how to extract the name, phone number, province, city, district, and detailed address from a user-provided express waybill and turn them into structured information. This helps logistics workers pull out the useful fields and lowers the cost of filling in forms for customers.

Beyond that, the waybill extraction task serves as a vehicle for introducing sequence labeling models and how to use them in Paddle.

This project is adapted from the PaddleNLP NER example and is organized into four parts: Background, Code Practice, Advanced Usage, and Concept Explanation.

It mainly covers:

The PaddleNLP network layers BiGRU, CRF, and ViterbiDecoder.
Warm-start loading of pretrained Chinese word embeddings via paddlenlp.embeddings, to improve accuracy.
The evaluation metric paddlenlp.metrics.ChunkEvaluator.
Remember to give PaddleNLP a little Star⭐

Open source is not easy; your support is much appreciated~

GitHub repo: https://github.com/PaddlePaddle/PaddleNLP

AI Studio will eventually ship with PaddleNLP preinstalled; until then, it can be installed with the command below.

In [1]
!pip install --upgrade "paddlenlp>=2.0.0rc0" -i https://pypi.org/simple
Requirement already up-to-date: paddlenlp>=2.0.0rc0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (2.0.0rc14)
Requirement already satisfied, skipping upgrade: h5py in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlenlp>=2.0.0rc0) (2.9.0)
Requirement already satisfied, skipping upgrade: seqeval in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlenlp>=2.0.0rc0) (1.2.2)
Requirement already satisfied, skipping upgrade: visualdl in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlenlp>=2.0.0rc0) (2.1.1)
Requirement already satisfied, skipping upgrade: colorlog in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlenlp>=2.0.0rc0) (4.1.0)
Requirement already satisfied, skipping upgrade: colorama in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlenlp>=2.0.0rc0) (0.4.4)
Requirement already satisfied, skipping upgrade: jieba in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from paddlenlp>=2.0.0rc0) (0.42.1)
Requirement already satisfied, skipping upgrade: numpy>=1.7 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from h5py->paddlenlp>=2.0.0rc0) (1.16.4)
Requirement already satisfied, skipping upgrade: six in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from h5py->paddlenlp>=2.0.0rc0) (1.15.0)
Requirement already satisfied, skipping upgrade: scikit-learn>=0.21.3 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from seqeval->paddlenlp>=2.0.0rc0) (0.22.1)
Requirement already satisfied, skipping upgrade: Flask-Babel>=1.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (1.0.0)
Requirement already satisfied, skipping upgrade: flask>=1.1.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (1.1.1)
Requirement already satisfied, skipping upgrade: bce-python-sdk in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (0.8.53)
Requirement already satisfied, skipping upgrade: flake8>=3.7.9 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (3.8.2)
Requirement already satisfied, skipping upgrade: pre-commit in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (1.21.0)
Requirement already satisfied, skipping upgrade: protobuf>=3.11.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (3.14.0)
Requirement already satisfied, skipping upgrade: Pillow>=7.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (7.1.2)
Requirement already satisfied, skipping upgrade: requests in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (2.22.0)
Requirement already satisfied, skipping upgrade: shellcheck-py in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from visualdl->paddlenlp>=2.0.0rc0) (0.7.1.1)
Requirement already satisfied, skipping upgrade: scipy>=0.17.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-learn>=0.21.3->seqeval->paddlenlp>=2.0.0rc0) (1.3.0)
Requirement already satisfied, skipping upgrade: joblib>=0.11 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from scikit-learn>=0.21.3->seqeval->paddlenlp>=2.0.0rc0) (0.14.1)
Requirement already satisfied, skipping upgrade: pytz in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Flask-Babel>=1.0.0->visualdl->paddlenlp>=2.0.0rc0) (2019.3)
Requirement already satisfied, skipping upgrade: Jinja2>=2.5 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Flask-Babel>=1.0.0->visualdl->paddlenlp>=2.0.0rc0) (2.10.1)
Requirement already satisfied, skipping upgrade: Babel>=2.3 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Flask-Babel>=1.0.0->visualdl->paddlenlp>=2.0.0rc0) (2.8.0)
Requirement already satisfied, skipping upgrade: itsdangerous>=0.24 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->paddlenlp>=2.0.0rc0) (1.1.0)
Requirement already satisfied, skipping upgrade: click>=5.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->paddlenlp>=2.0.0rc0) (7.0)
Requirement already satisfied, skipping upgrade: Werkzeug>=0.15 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flask>=1.1.1->visualdl->paddlenlp>=2.0.0rc0) (0.16.0)
Requirement already satisfied, skipping upgrade: future>=0.6.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from bce-python-sdk->visualdl->paddlenlp>=2.0.0rc0) (0.18.0)
Requirement already satisfied, skipping upgrade: pycryptodome>=3.8.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from bce-python-sdk->visualdl->paddlenlp>=2.0.0rc0) (3.9.9)
Requirement already satisfied, skipping upgrade: pyflakes<2.3.0,>=2.2.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->paddlenlp>=2.0.0rc0) (2.2.0)
Requirement already satisfied, skipping upgrade: importlib-metadata; python_version < "3.8" in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->paddlenlp>=2.0.0rc0) (0.23)
Requirement already satisfied, skipping upgrade: pycodestyle<2.7.0,>=2.6.0a1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->paddlenlp>=2.0.0rc0) (2.6.0)
Requirement already satisfied, skipping upgrade: mccabe<0.7.0,>=0.6.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from flake8>=3.7.9->visualdl->paddlenlp>=2.0.0rc0) (0.6.1)
Requirement already satisfied, skipping upgrade: pyyaml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddlenlp>=2.0.0rc0) (5.1.2)
Requirement already satisfied, skipping upgrade: nodeenv>=0.11.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddlenlp>=2.0.0rc0) (1.3.4)
Requirement already satisfied, skipping upgrade: virtualenv>=15.2 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddlenlp>=2.0.0rc0) (16.7.9)
Requirement already satisfied, skipping upgrade: identify>=1.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddlenlp>=2.0.0rc0) (1.4.10)
Requirement already satisfied, skipping upgrade: aspy.yaml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddlenlp>=2.0.0rc0) (1.3.0)
Requirement already satisfied, skipping upgrade: toml in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddlenlp>=2.0.0rc0) (0.10.0)
Requirement already satisfied, skipping upgrade: cfgv>=2.0.0 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from pre-commit->visualdl->paddlenlp>=2.0.0rc0) (2.0.1)
Requirement already satisfied, skipping upgrade: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->visualdl->paddlenlp>=2.0.0rc0) (1.25.6)
Requirement already satisfied, skipping upgrade: idna<2.9,>=2.5 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->visualdl->paddlenlp>=2.0.0rc0) (2.8)
Requirement already satisfied, skipping upgrade: chardet<3.1.0,>=3.0.2 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->visualdl->paddlenlp>=2.0.0rc0) (3.0.4)
Requirement already satisfied, skipping upgrade: certifi>=2017.4.17 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from requests->visualdl->paddlenlp>=2.0.0rc0) (2019.9.11)
Requirement already satisfied, skipping upgrade: MarkupSafe>=0.23 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from Jinja2>=2.5->Flask-Babel>=1.0.0->visualdl->paddlenlp>=2.0.0rc0) (1.1.1)
Requirement already satisfied, skipping upgrade: zipp>=0.5 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from importlib-metadata; python_version < "3.8"->flake8>=3.7.9->visualdl->paddlenlp>=2.0.0rc0) (0.6.0)
Requirement already satisfied, skipping upgrade: more-itertools in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from zipp>=0.5->importlib-metadata; python_version < "3.8"->flake8>=3.7.9->visualdl->paddlenlp>=2.0.0rc0) (7.2.0)
PART A. Background
A.1 The Waybill Information Extraction Task
How do we extract the key information we want from logistics text? The first step is to define which fields we need to extract.

Suppose we receive a waybill such as "张三18625584663广东省深圳市南山区学府路东百度国际大厦" as the model input. The sequence labeling model's job is then to recognize "张三" as a person name (symbol P), "18625584663" as a phone number (symbol T), and "广东省", "深圳市", "南山区", "百度国际大厦" as the level 1-4 address components (symbols A1~A4, i.e. province, city, district, detailed address).

This is a typical named entity recognition (NER) scenario. The entity types and their symbols are shown in the table below:

Entity/Field	Symbol	Extracted Result
Name	P	张三
Phone	T	18625584663
Province	A1	广东省
City	A2	深圳市
District	A3	南山区
Detailed address	A4	百度国际大厦
A.2 Sequence Labeling Models
We can solve the waybill information extraction task with a sequence labeling model; let's look at these models in more detail.

In a sequence labeling task, we usually define a label set covering all possible prediction outcomes. In this case, for the entities to be extracted (name, phone, province, city, district, detailed address), the label set can be defined as:

label = {P-B, P-I, T-B, T-I, A1-B, A1-I, A2-B, A2-I, A3-B, A3-I, A4-B, A4-I, O}

Each label is defined as follows:

Label	Definition
P-B	Start of a name
P-I	Middle or end of a name
T-B	Start of a phone number
T-I	Middle or end of a phone number
A1-B	Start of a province
A1-I	Middle or end of a province
A2-B	Start of a city
A2-I	Middle or end of a city
A3-B	Start of a district/county
A3-I	Middle or end of a district/county
A4-B	Start of a detailed address
A4-I	Middle or end of a detailed address
O	Irrelevant character
Note that each label's positional part can only be B, I, or O. This convention is known as the BIO scheme; there is also the slightly more elaborate BIESO scheme, which we will not cover here. B marks the beginning of an entity of a given type, e.g. P-B is the first character of a name; correspondingly, I marks the continuation of an entity.

For the sentence "张三18625584663广东省深圳市南山区百度国际大厦", each character and its corresponding label are:

Figure 1: Example of dataset annotation

Note that "张" and "三" are labeled "P-B" and "P-I" here, and "P-B" plus the following "P-I" tags merge into the single label "P". Regrouping this way yields the extraction result below (a small decoding sketch follows the table):

张三	18625584663	广东省	深圳市	南山区	百度国际大厦
P	T	A1	A2	A3	A4
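To make the B/I merging concrete, here is a minimal sketch in plain Python (independent of PaddleNLP) that decodes a BIO-labeled character sequence into (entity_type, text) chunks; the helper name bio_decode is ours, not part of the project code.

def bio_decode(chars, labels):
    # Merge BIO labels into (entity_type, text) chunks.
    chunks, entity_type, buf = [], None, []
    for ch, label in zip(chars, labels):
        if label == 'O' or label.endswith('-B'):
            if buf:  # close the chunk collected so far
                chunks.append((entity_type, ''.join(buf)))
                entity_type, buf = None, []
            if label.endswith('-B'):  # open a new chunk
                entity_type, buf = label[:-2], [ch]
        else:  # an '-I' label continues the current chunk
            buf.append(ch)
    if buf:
        chunks.append((entity_type, ''.join(buf)))
    return chunks

chars = list('张三') + list('18625584663')
labels = ['P-B', 'P-I', 'T-B'] + ['T-I'] * 10
print(bio_decode(chars, labels))  # [('P', '张三'), ('T', '18625584663')]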
PART B. Code Practice
In [1]
import paddle
import paddle.nn as nn

import paddlenlp
from paddlenlp.datasets import MapDataset
from paddlenlp.data import Stack, Tuple, Pad
from paddlenlp.embeddings import TokenEmbedding  # used by BiGRUWithCRF when use_w2v_emb=True
from paddlenlp.layers import LinearChainCrf, ViterbiDecoder, LinearChainCrfLoss
from paddlenlp.metrics import ChunkEvaluator
B.1 Data Preparation
To train a sequence labeling model, we usually prepare three datasets: a training set train.txt, a dev set dev.txt, and a test set test.txt, all stored in the data directory.

Training set: used to fit the model parameters; the model adjusts itself directly on this data to improve its predictions.
Dev set: used during training to monitor the model's state and convergence; it is typically used for hyperparameter tuning, picking the configuration that performs best on it.
Test set: used to compute the final evaluation metrics and verify the model's generalization ability.
In addition, the sequence labeling model relies on the following dictionary files, stored in the conf directory.

The input-text vocabulary word.dic
The dictionary q2b.dic, used to normalize special characters in the input text
The label dictionary tag.dic
Here we provide a labeled waybill key-information dataset; you can also organize training data of your own. As for the format: the first line is the fixed header text_a\tlabel; every following line has two tab-separated columns, the first being UTF-8 encoded Chinese text with characters separated by \002, the second being the corresponding per-character labels, also separated by \002.

The directory structure of the dataset and dictionary files is as follows:
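data/
├── train.txt
├── dev.txt
└── test.txt
conf/
├── word.dic
├── q2b.dic
└── tag.dic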

In both training and prediction, the raw data needs to be preprocessed. Concretely, this involves:

Extracting sentences and labels from the raw data files to build the character sequence and the label sequence
Normalizing special characters in the character sequence (a small sketch of this step follows the list)
Looking up each character's id in the vocabulary
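As a rough illustration of the normalization step, assuming q2b.dic maps full-width characters to their half-width equivalents (the actual mappings ship with the project in conf/q2b.dic), a minimal sketch:

# Hypothetical miniature of q2b.dic: full-width -> half-width.
Q2B = {'１': '1', '２': '2', 'Ａ': 'A', '　': ' '}

def normalize(text, table=Q2B):
    # Replace each character with its normalized form when the table has one.
    return ''.join(table.get(ch, ch) for ch in text)

print(normalize('１２Ａ'))  # -> 12A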
Custom dataset
A look at the training set
As noted above, apart from the header line text_a\tlabel, each line of the training set has two tab-separated columns: UTF-8 encoded Chinese text with characters separated by \002, and the corresponding sequence labels, also separated by \002.

In [3]
# Download and extract the dataset
from paddle.utils.download import get_path_from_url
URL = "https://paddlenlp.bj.bcebos.com/paddlenlp/datasets/waybill.tar.gz"
get_path_from_url(URL, "./")

# Print the text and label columns of a few training examples
for i, line in enumerate(open('data/train.txt')):
    if 0 < i < 5:
        print('%d: ' % i, line.split()[0])
        print('   ', line.split()[1])
2021-04-06 19:34:34,890 - INFO - unique_endpoints {''}
2021-04-06 19:34:34,892 - INFO - Found ./waybill.tar.gz
2021-04-06 19:34:34,893 - INFO - Decompressing ./waybill.tar.gz...
1:  16620200077宣荣嗣甘肃省白银市会宁县河畔镇十字街金海超市西行50米
    T-BT-IT-IT-IT-IT-IT-IT-IT-IT-IT-IP-BP-IP-IA1-BA1-IA1-IA2-BA2-IA2-IA3-BA3-IA3-IA4-BA4-IA4-IA4-IA4-IA4-IA4-IA4-IA4-IA4-IA4-IA4-IA4-IA4-IA4-IA4-I
2:  13552664307姜骏炜云南省德宏傣族景颇族自治州盈江县平原镇蜜回路下段
    T-BT-IT-IT-IT-IT-IT-IT-IT-IT-IT-IP-BP-IP-IA1-BA1-IA1-IA2-BA2-IA2-IA2-IA2-IA2-IA2-IA2-IA2-IA2-IA3-BA3-IA3-IA4-BA4-IA4-IA4-IA4-IA4-IA4-IA4-I
3:  内蒙古自治区赤峰市阿鲁科尔沁旗汉林西街路南13701085390那峥
    A1-BA1-IA1-IA1-IA1-IA1-IA2-BA2-IA2-IA3-BA3-IA3-IA3-IA3-IA3-IA4-BA4-IA4-IA4-IA4-IA4-IT-BT-IT-IT-IT-IT-IT-IT-IT-IT-IT-IP-BP-I
4:  广东省梅州市大埔县茶阳镇胜利路13601328173张铱
    A1-BA1-IA1-IA2-BA2-IA2-IA3-BA3-IA3-IA4-BA4-IA4-IA4-IA4-IA4-IT-BT-IT-IT-IT-IT-IT-IT-IT-IT-IT-IP-BP-I

Build a custom dataset on top of paddle.io.Dataset (here via paddlenlp's MapDataset)

In [5]
def convert_tokens_to_ids(tokens, vocab, oov_token=None):
    # Map each token to its id; unknown tokens fall back to the OOV id.
    token_ids = []
    oov_id = vocab.get(oov_token) if oov_token else None
    for token in tokens:
        token_id = vocab.get(token, oov_id)
        token_ids.append(token_id)
    return token_ids


def load_dict(dict_path):
    # Build a {token: id} vocabulary, one entry per line of the dict file.
    vocab = {}
    i = 0
    for line in open(dict_path, 'r', encoding='utf-8'):
        key = line.strip('\n')
        vocab[key] = i
        i += 1
    return vocab


def load_dataset(datafiles):
    def read(data_path):
        with open(data_path, 'r', encoding='utf-8') as fp:
            next(fp)  # skip the "text_a\tlabel" header line
            for line in fp.readlines():
                words, labels = line.strip('\n').split('\t')
                words = words.split('\002')
                labels = labels.split('\002')
                yield words, labels

    if isinstance(datafiles, str):
        return MapDataset(list(read(datafiles)))
    elif isinstance(datafiles, list) or isinstance(datafiles, tuple):
        return [MapDataset(list(read(datafile))) for datafile in datafiles]

train_ds, dev_ds, test_ds = load_dataset(datafiles=('data/train.txt', 'data/dev.txt', 'data/test.txt'))

label_vocab = load_dict('./data/tag.dic')
word_vocab = load_dict('./data/word.dic')

def convert_example(example):
    tokens, labels = example
    token_ids = convert_tokens_to_ids(tokens, word_vocab, 'OOV')
    label_ids = convert_tokens_to_ids(labels, label_vocab, 'O')
    return token_ids, len(token_ids), label_ids

train_ds.map(convert_example)
dev_ds.map(convert_example)
test_ds.map(convert_example)
<paddlenlp.datasets.experimental.dataset.MapDataset at 0x7f5dcb963250>
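After map, every example is numericalized. A quick sanity check (the printed values depend on word.dic and tag.dic, so treat them as illustrative):

token_ids, seq_len, label_ids = train_ds[0]
print(seq_len)        # length of the character sequence
print(token_ids[:5])  # the first few character ids
print(label_ids[:5])  # the first few label ids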
Construct the DataLoader
In [6]
batchify_fn = lambda samples, fn=Tuple(
        Pad(axis=0, pad_val=word_vocab.get('OOV')),  # token_ids
        Stack(),  # seq_len
        Pad(axis=0, pad_val=label_vocab.get('O'))  # label_ids
    ): fn(samples)

train_loader = paddle.io.DataLoader(
        dataset=train_ds,
        batch_size=32,
        shuffle=True,
        drop_last=True,
        return_list=True,
        collate_fn=batchify_fn)

dev_loader = paddle.io.DataLoader(
        dataset=dev_ds,
        batch_size=32,
        drop_last=True,
        return_list=True,
        collate_fn=batchify_fn)

test_loader = paddle.io.DataLoader(
        dataset=test_ds,
        batch_size=32,
        drop_last=True,
        return_list=True,
        collate_fn=batchify_fn)
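To see what batchify_fn produces, here is a toy run on two hand-made samples (ids invented for illustration): each Pad pads its field to the batch's longest sequence, and Stack stacks the lengths into an array.

toy = [([5, 2, 9], 3, [1, 2, 0]),
       ([7, 4], 2, [3, 4])]
token_ids, seq_lens, label_ids = batchify_fn(toy)
print(token_ids)  # shorter rows padded with the 'OOV' id
print(seq_lens)   # [3 2]
print(label_ids)  # shorter rows padded with the 'O' id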
B.2 Network Construction
With the rise of deep learning, mainstream approaches to sequence labeling learn representations on top of word embeddings. The overall training pipeline of the model is shown below.



Figure 2: Training flow chart

A common architecture for sequence labeling is RNN+CRF, with GRU or LSTM as the usual RNN cell. Here we take a Bi-GRU+CRF model as the example and show how to define the network structure for a sequence labeling task in PaddlePaddle. As the figure below shows, the GRU's output serves as the CRF's input, and the CRF's output is the model's final prediction.



Figure 3: Bi-GRU+CRF

In [7]
class BiGRUWithCRF(nn.Layer):
    def __init__(self,
                 emb_size,
                 hidden_size,
                 word_num,
                 label_num,
                 use_w2v_emb=False):
        super(BiGRUWithCRF, self).__init__()
        if use_w2v_emb:
            self.word_emb = TokenEmbedding(
                extended_vocab_path='./conf/word.dic', unknown_token='OOV')
        else:
            self.word_emb = nn.Embedding(word_num, emb_size)
        self.gru = nn.GRU(emb_size,
                          hidden_size,
                          num_layers=2,
                          direction='bidirectional')
        self.fc = nn.Linear(hidden_size * 2, label_num + 2)  # +2 for the CRF's BOS/EOS tags
        self.crf = LinearChainCrf(label_num)
        self.decoder = ViterbiDecoder(self.crf.transitions)

    def forward(self, x, lens):
        embs = self.word_emb(x)
        output, _ = self.gru(embs)
        output = self.fc(output)
        _, pred = self.decoder(output, lens)
        return output, lens, pred

# Define the model network and its loss
network = BiGRUWithCRF(300, 300, len(word_vocab), len(label_vocab))
model = paddle.Model(network)
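Before configuring training, it can help to push one batch through the network as a shape check (our own sanity check, not part of the original pipeline):

token_ids, seq_lens, label_ids = next(iter(train_loader))
emissions, lens, preds = network(token_ids, seq_lens)
print(emissions.shape)  # [batch_size, max_seq_len, label_num + 2]
print(preds.shape)      # [batch_size, max_seq_len], decoded tag ids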
B.3 Network Configuration
After defining the network structure, we need to configure the optimizer, the loss function, and the evaluation metric.

Evaluation metrics
For each sequence's predictions, the sequence labeling task groups the predicted labels into chunks and evaluates at the chunk level. The usual metrics are Precision, Recall, and F1.

Precision: the number of chunks the model predicted correctly divided by the total number of chunks it predicted; it measures how accurate the predictions are.
Recall: the number of chunks the model predicted correctly divided by the number of gold chunks; it measures how much the model misses.
F1: the combined metric, F1 = (2 × Precision × Recall) / (Precision + Recall), which balances Precision against Recall.
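As a quick worked example (numbers invented for illustration): if the model predicts 80 chunks, 64 of which are correct, against 100 gold chunks, then Precision = 64/80 = 0.80, Recall = 64/100 = 0.64, and F1 = 2 × 0.80 × 0.64 / (0.80 + 0.64) ≈ 0.711.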
The ChunkEvaluator metric ships with paddlenlp.metrics, and more metrics are gradually being added.

In [9]
optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())
crf_loss = LinearChainCrfLoss(network.crf)
# suffix=True because the position marker is a suffix here (P-B) rather than a prefix (B-P)
chunk_evaluator = ChunkEvaluator(label_list=label_vocab.keys(), suffix=True)
model.prepare(optimizer, crf_loss, chunk_evaluator)
B.4 Model Training
In [10]
model.fit(train_data=train_loader,
          eval_data=dev_loader,
          epochs=10,
          save_dir='./results',
          log_freq=1)
The loss value printed in the log is the current step, and the metric is the average value of previous step.
Epoch 1/10
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/layers/utils.py:77: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
  return (isinstance(seq, collections.Sequence) and
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/dygraph/math_op_patch.py:238: UserWarning: The dtype of left and right variables are not the same, left dtype is VarType.INT32, but right dtype is VarType.INT64, the right dtype will convert to VarType.INT32
  format(lhs_dtype, rhs_dtype, lhs_dtype))
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/dygraph/math_op_patch.py:238: UserWarning: The dtype of left and right variables are not the same, left dtype is VarType.FP32, but right dtype is VarType.INT32, the right dtype will convert to VarType.FP32
  format(lhs_dtype, rhs_dtype, lhs_dtype))
step  1/50 - loss: 78.5088 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 1s/step
step  2/50 - loss: 82.8188 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 943ms/step
step  3/50 - loss: 74.2187 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 813ms/step
step  4/50 - loss: 80.7896 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 762ms/step
step  5/50 - loss: 74.7695 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 752ms/step
step  6/50 - loss: 69.3572 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 727ms/step
step  7/50 - loss: 65.6301 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 695ms/step
step  8/50 - loss: 75.9714 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 679ms/step
step  9/50 - loss: 87.0327 - precision: 0.0049 - recall: 0.0046 - f1: 0.0048 - 664ms/step
step 10/50 - loss: 60.2472 - precision: 0.0150 - recall: 0.0136 - f1: 0.0143 - 656ms/step
step 11/50 - loss: 50.8904 - precision: 0.0196 - recall: 0.0171 - f1: 0.0183 - 645ms/step
step 12/50 - loss: 52.0267 - precision: 0.0214 - recall: 0.0179 - f1: 0.0195 - 642ms/step
step 13/50 - loss: 63.9512 - precision: 0.0236 - recall: 0.0189 - f1: 0.0210 - 631ms/step
step 14/50 - loss: 49.6095 - precision: 0.0247 - recall: 0.0191 - f1: 0.0215 - 624ms/step
step 15/50 - loss: 49.1425 - precision: 0.0263 - recall: 0.0195 - f1: 0.0224 - 624ms/step
step 16/50 - loss: 42.3839 - precision: 0.0267 - recall: 0.0193 - f1: 0.0224 - 618ms/step
step 17/50 - loss: 41.6498 - precision: 0.0270 - recall: 0.0194 - f1: 0.0225 - 611ms/step
step 18/50 - loss: 36.9919 - precision: 0.0273 - recall: 0.0203 - f1: 0.0233 - 609ms/step
step 19/50 - loss: 34.7696 - precision: 0.0261 - recall: 0.0206 - f1: 0.0231 - 614ms/step
step 20/50 - loss: 35.8755 - precision: 0.0262 - recall: 0.0220 - f1: 0.0239 - 614ms/step
step 21/50 - loss: 42.0558 - precision: 0.0298 - recall: 0.0259 - f1: 0.0277 - 613ms/step
step 22/50 - loss: 34.1032 - precision: 0.0353 - recall: 0.0316 - f1: 0.0333 - 610ms/step
step 23/50 - loss: 47.4666 - precision: 0.0402 - recall: 0.0371 - f1: 0.0386 - 612ms/step
step 24/50 - loss: 40.1994 - precision: 0.0415 - recall: 0.0394 - f1: 0.0405 - 614ms/step
step 25/50 - loss: 27.4477 - precision: 0.0442 - recall: 0.0429 - f1: 0.0435 - 610ms/step
step 26/50 - loss: 26.3397 - precision: 0.0488 - recall: 0.0479 - f1: 0.0483 - 611ms/step
step 27/50 - loss: 39.2796 - precision: 0.0510 - recall: 0.0509 - f1: 0.0509 - 610ms/step
step 28/50 - loss: 22.1896 - precision: 0.0551 - recall: 0.0560 - f1: 0.0556 - 609ms/step
step 29/50 - loss: 21.3124 - precision: 0.0598 - recall: 0.0620 - f1: 0.0609 - 607ms/step
step 30/50 - loss: 23.5684 - precision: 0.0646 - recall: 0.0678 - f1: 0.0662 - 605ms/step
step 31/50 - loss: 20.9514 - precision: 0.0710 - recall: 0.0752 - f1: 0.0731 - 603ms/step
step 32/50 - loss: 21.5752 - precision: 0.0776 - recall: 0.0827 - f1: 0.0801 - 604ms/step
step 33/50 - loss: 17.9267 - precision: 0.0848 - recall: 0.0906 - f1: 0.0876 - 602ms/step
step 34/50 - loss: 16.6697 - precision: 0.0900 - recall: 0.0968 - f1: 0.0933 - 600ms/step
step 35/50 - loss: 12.8358 - precision: 0.0984 - recall: 0.1060 - f1: 0.1021 - 598ms/step
step 36/50 - loss: 26.8808 - precision: 0.1056 - recall: 0.1138 - f1: 0.1096 - 599ms/step
step 37/50 - loss: 18.7130 - precision: 0.1133 - recall: 0.1219 - f1: 0.1175 - 597ms/step
step 38/50 - loss: 12.6454 - precision: 0.1226 - recall: 0.1319 - f1: 0.1271 - 597ms/step
step 39/50 - loss: 19.2935 - precision: 0.1299 - recall: 0.1399 - f1: 0.1347 - 599ms/step
step 40/50 - loss: 9.0211 - precision: 0.1346 - recall: 0.1449 - f1: 0.1396 - 598ms/step
step 41/50 - loss: 10.9109 - precision: 0.1392 - recall: 0.1499 - f1: 0.1444 - 596ms/step
step 42/50 - loss: 18.2951 - precision: 0.1464 - recall: 0.1578 - f1: 0.1519 - 594ms/step
step 43/50 - loss: 59.3916 - precision: 0.1551 - recall: 0.1676 - f1: 0.1611 - 594ms/step
step 44/50 - loss: 15.7017 - precision: 0.1629 - recall: 0.1763 - f1: 0.1693 - 593ms/step
step 45/50 - loss: 16.9626 - precision: 0.1681 - recall: 0.1821 - f1: 0.1748 - 595ms/step
step 46/50 - loss: 13.9831 - precision: 0.1753 - recall: 0.1904 - f1: 0.1826 - 599ms/step
step 47/50 - loss: 7.6885 - precision: 0.1841 - recall: 0.2000 - f1: 0.1917 - 598ms/step
step 48/50 - loss: 13.1062 - precision: 0.1938 - recall: 0.2104 - f1: 0.2018 - 600ms/step
step 49/50 - loss: 3.9712 - precision: 0.2012 - recall: 0.2185 - f1: 0.2095 - 600ms/step
step 50/50 - loss: 5.7236 - precision: 0.2058 - recall: 0.2237 - f1: 0.2144 - 600ms/step
save checkpoint at /home/aistudio/results/0
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 11.2086 - precision: 0.4689 - recall: 0.5213 - f1: 0.4937 - 221ms/step
step 2/6 - loss: 7.8795 - precision: 0.5049 - recall: 0.5474 - f1: 0.5253 - 215ms/step
step 3/6 - loss: 6.4277 - precision: 0.5144 - recall: 0.5622 - f1: 0.5372 - 237ms/step
step 4/6 - loss: 4.6586 - precision: 0.5224 - recall: 0.5656 - f1: 0.5432 - 223ms/step
step 5/6 - loss: 19.6233 - precision: 0.5295 - recall: 0.5744 - f1: 0.5510 - 222ms/step
step 6/6 - loss: 37.3755 - precision: 0.5299 - recall: 0.5721 - f1: 0.5502 - 223ms/step
Eval samples: 192
Epoch 2/10
step  1/50 - loss: 8.1588 - precision: 0.4450 - recall: 0.5105 - f1: 0.4755 - 475ms/step
step  2/50 - loss: 12.5652 - precision: 0.4811 - recall: 0.5354 - f1: 0.5068 - 528ms/step
step  3/50 - loss: 4.2273 - precision: 0.5420 - recall: 0.5969 - f1: 0.5681 - 550ms/step
step  4/50 - loss: 2.5278 - precision: 0.5690 - recall: 0.6196 - f1: 0.5932 - 537ms/step
step  5/50 - loss: 2.3135 - precision: 0.5824 - recall: 0.6360 - f1: 0.6080 - 536ms/step
step  6/50 - loss: 12.4161 - precision: 0.5908 - recall: 0.6463 - f1: 0.6173 - 546ms/step
step  7/50 - loss: 14.2292 - precision: 0.5978 - recall: 0.6590 - f1: 0.6269 - 547ms/step
step  8/50 - loss: 6.5313 - precision: 0.6046 - recall: 0.6686 - f1: 0.6350 - 555ms/step
step  9/50 - loss: 4.7747 - precision: 0.6114 - recall: 0.6789 - f1: 0.6434 - 551ms/step
step 10/50 - loss: 3.7639 - precision: 0.6097 - recall: 0.6792 - f1: 0.6426 - 558ms/step
step 11/50 - loss: 11.0703 - precision: 0.6135 - recall: 0.6833 - f1: 0.6465 - 557ms/step
step 12/50 - loss: 3.9761 - precision: 0.6269 - recall: 0.6950 - f1: 0.6592 - 562ms/step
step 13/50 - loss: 3.9664 - precision: 0.6340 - recall: 0.7001 - f1: 0.6654 - 571ms/step
step 14/50 - loss: 1.3712 - precision: 0.6411 - recall: 0.7070 - f1: 0.6725 - 564ms/step
step 15/50 - loss: 9.3628 - precision: 0.6468 - recall: 0.7144 - f1: 0.6789 - 563ms/step
step 16/50 - loss: 3.4607 - precision: 0.6494 - recall: 0.7173 - f1: 0.6817 - 563ms/step
step 17/50 - loss: 9.8583 - precision: 0.6519 - recall: 0.7207 - f1: 0.6846 - 560ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.6611 - recall: 0.7296 - f1: 0.6936 - 559ms/step
step 19/50 - loss: 1.7324 - precision: 0.6707 - recall: 0.7384 - f1: 0.7029 - 560ms/step
step 20/50 - loss: 0.0000e+00 - precision: 0.6786 - recall: 0.7452 - f1: 0.7104 - 560ms/step
step 21/50 - loss: 4.3630 - precision: 0.6840 - recall: 0.7504 - f1: 0.7157 - 574ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.6903 - recall: 0.7566 - f1: 0.7219 - 573ms/step
step 23/50 - loss: 1.9147 - precision: 0.6951 - recall: 0.7615 - f1: 0.7268 - 575ms/step
step 24/50 - loss: 2.2605 - precision: 0.6952 - recall: 0.7632 - f1: 0.7276 - 575ms/step
step 25/50 - loss: 0.0000e+00 - precision: 0.7006 - recall: 0.7678 - f1: 0.7327 - 575ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.7056 - recall: 0.7722 - f1: 0.7374 - 577ms/step
step 27/50 - loss: 1.8594 - precision: 0.7103 - recall: 0.7766 - f1: 0.7420 - 577ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.7141 - recall: 0.7806 - f1: 0.7459 - 577ms/step
step 29/50 - loss: 0.1596 - precision: 0.7192 - recall: 0.7848 - f1: 0.7506 - 577ms/step
step 30/50 - loss: 0.2711 - precision: 0.7228 - recall: 0.7878 - f1: 0.7539 - 575ms/step
step 31/50 - loss: 3.2716 - precision: 0.7264 - recall: 0.7918 - f1: 0.7577 - 576ms/step
step 32/50 - loss: 6.8695 - precision: 0.7296 - recall: 0.7954 - f1: 0.7611 - 580ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.7342 - recall: 0.7995 - f1: 0.7655 - 581ms/step
step 34/50 - loss: 0.0000e+00 - precision: 0.7372 - recall: 0.8016 - f1: 0.7680 - 581ms/step
step 35/50 - loss: 0.0000e+00 - precision: 0.7416 - recall: 0.8056 - f1: 0.7723 - 580ms/step
step 36/50 - loss: 1.7162 - precision: 0.7433 - recall: 0.8081 - f1: 0.7743 - 584ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.7457 - recall: 0.8100 - f1: 0.7766 - 584ms/step
step 38/50 - loss: 0.0000e+00 - precision: 0.7495 - recall: 0.8128 - f1: 0.7798 - 583ms/step
step 39/50 - loss: 1.9127 - precision: 0.7523 - recall: 0.8145 - f1: 0.7822 - 582ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.7552 - recall: 0.8168 - f1: 0.7848 - 581ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.7572 - recall: 0.8182 - f1: 0.7865 - 580ms/step
step 42/50 - loss: 0.0000e+00 - precision: 0.7606 - recall: 0.8213 - f1: 0.7898 - 580ms/step
step 43/50 - loss: 0.5456 - precision: 0.7635 - recall: 0.8238 - f1: 0.7925 - 578ms/step
step 44/50 - loss: 7.6007 - precision: 0.7668 - recall: 0.8262 - f1: 0.7954 - 576ms/step
step 45/50 - loss: 0.0000e+00 - precision: 0.7687 - recall: 0.8283 - f1: 0.7974 - 576ms/step
step 46/50 - loss: 0.0000e+00 - precision: 0.7714 - recall: 0.8307 - f1: 0.7999 - 575ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.7729 - recall: 0.8324 - f1: 0.8016 - 574ms/step
step 48/50 - loss: 0.0000e+00 - precision: 0.7746 - recall: 0.8344 - f1: 0.8034 - 575ms/step
step 49/50 - loss: 0.0000e+00 - precision: 0.7767 - recall: 0.8362 - f1: 0.8054 - 576ms/step
step 50/50 - loss: 3.9779 - precision: 0.7790 - recall: 0.8375 - f1: 0.8072 - 574ms/step
save checkpoint at /home/aistudio/results/1
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.9167 - recall: 0.9362 - f1: 0.9263 - 241ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9243 - recall: 0.9316 - f1: 0.9279 - 252ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9171 - recall: 0.9299 - f1: 0.9235 - 263ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9171 - recall: 0.9291 - f1: 0.9231 - 251ms/step
step 5/6 - loss: 6.3467 - precision: 0.9124 - recall: 0.9277 - f1: 0.9200 - 252ms/step
step 6/6 - loss: 8.0251 - precision: 0.9024 - recall: 0.9205 - f1: 0.9114 - 254ms/step
Eval samples: 192
Epoch 3/10
step  1/50 - loss: 0.0000e+00 - precision: 0.9072 - recall: 0.9215 - f1: 0.9143 - 608ms/step
step  2/50 - loss: 0.0000e+00 - precision: 0.9028 - recall: 0.9217 - f1: 0.9121 - 607ms/step
step  3/50 - loss: 0.0000e+00 - precision: 0.9015 - recall: 0.9251 - f1: 0.9132 - 578ms/step
step  4/50 - loss: 0.0000e+00 - precision: 0.9006 - recall: 0.9290 - f1: 0.9146 - 603ms/step
step  5/50 - loss: 0.0000e+00 - precision: 0.9100 - recall: 0.9349 - f1: 0.9223 - 599ms/step
step  6/50 - loss: 0.0000e+00 - precision: 0.9089 - recall: 0.9327 - f1: 0.9206 - 594ms/step
step  7/50 - loss: 0.0000e+00 - precision: 0.9077 - recall: 0.9349 - f1: 0.9211 - 601ms/step
step  8/50 - loss: 0.0000e+00 - precision: 0.9098 - recall: 0.9378 - f1: 0.9236 - 587ms/step
step  9/50 - loss: 0.0000e+00 - precision: 0.9136 - recall: 0.9407 - f1: 0.9269 - 575ms/step
step 10/50 - loss: 0.1934 - precision: 0.9155 - recall: 0.9419 - f1: 0.9285 - 577ms/step
step 11/50 - loss: 0.0000e+00 - precision: 0.9186 - recall: 0.9453 - f1: 0.9318 - 575ms/step
step 12/50 - loss: 0.0000e+00 - precision: 0.9228 - recall: 0.9481 - f1: 0.9353 - 575ms/step
step 13/50 - loss: 0.0000e+00 - precision: 0.9157 - recall: 0.9441 - f1: 0.9297 - 576ms/step
step 14/50 - loss: 0.0000e+00 - precision: 0.9162 - recall: 0.9440 - f1: 0.9299 - 571ms/step
step 15/50 - loss: 0.0000e+00 - precision: 0.9171 - recall: 0.9453 - f1: 0.9310 - 576ms/step
step 16/50 - loss: 0.0000e+00 - precision: 0.9178 - recall: 0.9454 - f1: 0.9314 - 578ms/step
step 17/50 - loss: 0.0000e+00 - precision: 0.9192 - recall: 0.9446 - f1: 0.9318 - 579ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.9194 - recall: 0.9437 - f1: 0.9313 - 581ms/step
step 19/50 - loss: 0.0000e+00 - precision: 0.9209 - recall: 0.9452 - f1: 0.9329 - 582ms/step
step 20/50 - loss: 1.0538 - precision: 0.9198 - recall: 0.9446 - f1: 0.9320 - 584ms/step
step 21/50 - loss: 0.0000e+00 - precision: 0.9190 - recall: 0.9435 - f1: 0.9311 - 582ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.9179 - recall: 0.9427 - f1: 0.9301 - 585ms/step
step 23/50 - loss: 0.0000e+00 - precision: 0.9192 - recall: 0.9439 - f1: 0.9314 - 580ms/step
step 24/50 - loss: 0.0000e+00 - precision: 0.9204 - recall: 0.9442 - f1: 0.9322 - 577ms/step
step 25/50 - loss: 0.0121 - precision: 0.9223 - recall: 0.9456 - f1: 0.9338 - 575ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.9237 - recall: 0.9467 - f1: 0.9351 - 572ms/step
step 27/50 - loss: 0.0000e+00 - precision: 0.9249 - recall: 0.9473 - f1: 0.9360 - 572ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.9258 - recall: 0.9479 - f1: 0.9367 - 572ms/step
step 29/50 - loss: 0.0000e+00 - precision: 0.9268 - recall: 0.9488 - f1: 0.9377 - 572ms/step
step 30/50 - loss: 0.0000e+00 - precision: 0.9284 - recall: 0.9496 - f1: 0.9389 - 571ms/step
step 31/50 - loss: 0.0000e+00 - precision: 0.9289 - recall: 0.9502 - f1: 0.9394 - 572ms/step
step 32/50 - loss: 0.0000e+00 - precision: 0.9296 - recall: 0.9506 - f1: 0.9400 - 572ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.9301 - recall: 0.9510 - f1: 0.9404 - 571ms/step
step 34/50 - loss: 4.7646 - precision: 0.9310 - recall: 0.9514 - f1: 0.9411 - 570ms/step
step 35/50 - loss: 0.7620 - precision: 0.9294 - recall: 0.9505 - f1: 0.9399 - 568ms/step
step 36/50 - loss: 0.0000e+00 - precision: 0.9302 - recall: 0.9512 - f1: 0.9406 - 569ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.9310 - recall: 0.9519 - f1: 0.9414 - 568ms/step
step 38/50 - loss: 0.0000e+00 - precision: 0.9319 - recall: 0.9524 - f1: 0.9420 - 568ms/step
step 39/50 - loss: 0.0000e+00 - precision: 0.9318 - recall: 0.9528 - f1: 0.9422 - 569ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.9316 - recall: 0.9531 - f1: 0.9422 - 572ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.9313 - recall: 0.9531 - f1: 0.9421 - 571ms/step
step 42/50 - loss: 0.0000e+00 - precision: 0.9320 - recall: 0.9537 - f1: 0.9427 - 573ms/step
step 43/50 - loss: 0.0000e+00 - precision: 0.9314 - recall: 0.9537 - f1: 0.9424 - 575ms/step
step 44/50 - loss: 0.2008 - precision: 0.9315 - recall: 0.9536 - f1: 0.9424 - 575ms/step
step 45/50 - loss: 3.9984 - precision: 0.9320 - recall: 0.9540 - f1: 0.9428 - 575ms/step
step 46/50 - loss: 0.0000e+00 - precision: 0.9327 - recall: 0.9544 - f1: 0.9434 - 576ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.9333 - recall: 0.9549 - f1: 0.9440 - 579ms/step
step 48/50 - loss: 0.0000e+00 - precision: 0.9325 - recall: 0.9542 - f1: 0.9432 - 579ms/step
step 49/50 - loss: 2.5998 - precision: 0.9323 - recall: 0.9538 - f1: 0.9429 - 578ms/step
step 50/50 - loss: 0.0000e+00 - precision: 0.9315 - recall: 0.9530 - f1: 0.9421 - 578ms/step
save checkpoint at /home/aistudio/results/2
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.8990 - recall: 0.9468 - f1: 0.9223 - 223ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9278 - recall: 0.9474 - f1: 0.9375 - 217ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9329 - recall: 0.9492 - f1: 0.9410 - 235ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9290 - recall: 0.9449 - f1: 0.9369 - 221ms/step
step 5/6 - loss: 3.8817 - precision: 0.9340 - recall: 0.9486 - f1: 0.9412 - 218ms/step
step 6/6 - loss: 1.5098 - precision: 0.9247 - recall: 0.9432 - f1: 0.9339 - 222ms/step
Eval samples: 192
Epoch 4/10
step  1/50 - loss: 0.0000e+00 - precision: 0.9588 - recall: 0.9688 - f1: 0.9637 - 549ms/step
step  2/50 - loss: 0.0000e+00 - precision: 0.9508 - recall: 0.9557 - f1: 0.9532 - 562ms/step
step  3/50 - loss: 0.0000e+00 - precision: 0.9568 - recall: 0.9618 - f1: 0.9593 - 580ms/step
step  4/50 - loss: 0.0000e+00 - precision: 0.9597 - recall: 0.9635 - f1: 0.9616 - 583ms/step
step  5/50 - loss: 0.0000e+00 - precision: 0.9565 - recall: 0.9625 - f1: 0.9595 - 592ms/step
step  6/50 - loss: 0.0000e+00 - precision: 0.9527 - recall: 0.9635 - f1: 0.9581 - 616ms/step
step  7/50 - loss: 0.0343 - precision: 0.9522 - recall: 0.9650 - f1: 0.9586 - 613ms/step
step  8/50 - loss: 0.0000e+00 - precision: 0.9563 - recall: 0.9687 - f1: 0.9625 - 605ms/step
step  9/50 - loss: 0.0000e+00 - precision: 0.9555 - recall: 0.9699 - f1: 0.9626 - 602ms/step
step 10/50 - loss: 0.0000e+00 - precision: 0.9564 - recall: 0.9713 - f1: 0.9638 - 603ms/step
step 11/50 - loss: 0.0000e+00 - precision: 0.9557 - recall: 0.9711 - f1: 0.9633 - 606ms/step
step 12/50 - loss: 0.0000e+00 - precision: 0.9534 - recall: 0.9691 - f1: 0.9612 - 611ms/step
step 13/50 - loss: 0.0000e+00 - precision: 0.9538 - recall: 0.9703 - f1: 0.9620 - 603ms/step
step 14/50 - loss: 0.0000e+00 - precision: 0.9513 - recall: 0.9687 - f1: 0.9599 - 614ms/step
step 15/50 - loss: 0.2153 - precision: 0.9515 - recall: 0.9690 - f1: 0.9602 - 609ms/step
step 16/50 - loss: 0.0000e+00 - precision: 0.9532 - recall: 0.9700 - f1: 0.9615 - 605ms/step
step 17/50 - loss: 0.0000e+00 - precision: 0.9518 - recall: 0.9696 - f1: 0.9606 - 603ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.9510 - recall: 0.9687 - f1: 0.9598 - 612ms/step
step 19/50 - loss: 0.8114 - precision: 0.9525 - recall: 0.9695 - f1: 0.9609 - 610ms/step
step 20/50 - loss: 0.0000e+00 - precision: 0.9507 - recall: 0.9674 - f1: 0.9590 - 604ms/step
step 21/50 - loss: 0.0000e+00 - precision: 0.9497 - recall: 0.9669 - f1: 0.9582 - 601ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.9489 - recall: 0.9656 - f1: 0.9572 - 595ms/step
step 23/50 - loss: 0.0000e+00 - precision: 0.9485 - recall: 0.9653 - f1: 0.9568 - 593ms/step
step 24/50 - loss: 0.0000e+00 - precision: 0.9500 - recall: 0.9663 - f1: 0.9581 - 595ms/step
step 25/50 - loss: 0.0000e+00 - precision: 0.9501 - recall: 0.9666 - f1: 0.9583 - 597ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.9501 - recall: 0.9667 - f1: 0.9583 - 598ms/step
step 27/50 - loss: 0.0000e+00 - precision: 0.9500 - recall: 0.9671 - f1: 0.9585 - 599ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.9516 - recall: 0.9679 - f1: 0.9597 - 595ms/step
step 29/50 - loss: 0.0000e+00 - precision: 0.9532 - recall: 0.9690 - f1: 0.9611 - 592ms/step
step 30/50 - loss: 0.0000e+00 - precision: 0.9532 - recall: 0.9690 - f1: 0.9611 - 596ms/step
step 31/50 - loss: 0.0000e+00 - precision: 0.9527 - recall: 0.9688 - f1: 0.9607 - 596ms/step
step 32/50 - loss: 0.0000e+00 - precision: 0.9525 - recall: 0.9686 - f1: 0.9605 - 595ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.9522 - recall: 0.9683 - f1: 0.9602 - 595ms/step
step 34/50 - loss: 0.0000e+00 - precision: 0.9530 - recall: 0.9688 - f1: 0.9608 - 594ms/step
step 35/50 - loss: 0.0000e+00 - precision: 0.9525 - recall: 0.9685 - f1: 0.9604 - 593ms/step
step 36/50 - loss: 0.0000e+00 - precision: 0.9532 - recall: 0.9690 - f1: 0.9611 - 591ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.9533 - recall: 0.9693 - f1: 0.9612 - 589ms/step
step 38/50 - loss: 0.0000e+00 - precision: 0.9537 - recall: 0.9696 - f1: 0.9616 - 588ms/step
step 39/50 - loss: 0.0000e+00 - precision: 0.9542 - recall: 0.9699 - f1: 0.9620 - 586ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.9533 - recall: 0.9689 - f1: 0.9610 - 586ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.9528 - recall: 0.9686 - f1: 0.9606 - 587ms/step
step 42/50 - loss: 0.0000e+00 - precision: 0.9527 - recall: 0.9689 - f1: 0.9607 - 586ms/step
step 43/50 - loss: 0.0000e+00 - precision: 0.9530 - recall: 0.9691 - f1: 0.9610 - 585ms/step
step 44/50 - loss: 0.0000e+00 - precision: 0.9536 - recall: 0.9696 - f1: 0.9615 - 585ms/step
step 45/50 - loss: 0.0000e+00 - precision: 0.9536 - recall: 0.9695 - f1: 0.9615 - 585ms/step
step 46/50 - loss: 0.0000e+00 - precision: 0.9538 - recall: 0.9698 - f1: 0.9618 - 585ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.9540 - recall: 0.9697 - f1: 0.9618 - 585ms/step
step 48/50 - loss: 1.3438 - precision: 0.9538 - recall: 0.9696 - f1: 0.9616 - 584ms/step
step 49/50 - loss: 0.1308 - precision: 0.9540 - recall: 0.9695 - f1: 0.9617 - 583ms/step
step 50/50 - loss: 0.0815 - precision: 0.9543 - recall: 0.9697 - f1: 0.9619 - 583ms/step
save checkpoint at /home/aistudio/results/3
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.9231 - recall: 0.9574 - f1: 0.9399 - 234ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9352 - recall: 0.9500 - f1: 0.9426 - 230ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9361 - recall: 0.9492 - f1: 0.9426 - 248ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9289 - recall: 0.9436 - f1: 0.9362 - 234ms/step
step 5/6 - loss: 3.5602 - precision: 0.9360 - recall: 0.9497 - f1: 0.9428 - 231ms/step
step 6/6 - loss: 1.0404 - precision: 0.9222 - recall: 0.9415 - f1: 0.9317 - 233ms/step
Eval samples: 192
Epoch 5/10
step  1/50 - loss: 0.0000e+00 - precision: 0.9794 - recall: 0.9896 - f1: 0.9845 - 503ms/step
step  2/50 - loss: 0.0000e+00 - precision: 0.9667 - recall: 0.9818 - f1: 0.9742 - 514ms/step
step  3/50 - loss: 2.0967 - precision: 0.9742 - recall: 0.9844 - f1: 0.9793 - 524ms/step
step  4/50 - loss: 0.0000e+00 - precision: 0.9780 - recall: 0.9870 - f1: 0.9825 - 563ms/step
step  5/50 - loss: 0.0000e+00 - precision: 0.9803 - recall: 0.9885 - f1: 0.9844 - 566ms/step
step  6/50 - loss: 0.0000e+00 - precision: 0.9733 - recall: 0.9843 - f1: 0.9788 - 574ms/step
step  7/50 - loss: 0.4565 - precision: 0.9698 - recall: 0.9828 - f1: 0.9763 - 568ms/step
step  8/50 - loss: 0.0000e+00 - precision: 0.9710 - recall: 0.9830 - f1: 0.9770 - 580ms/step
step  9/50 - loss: 0.0000e+00 - precision: 0.9680 - recall: 0.9814 - f1: 0.9747 - 577ms/step
step 10/50 - loss: 0.0000e+00 - precision: 0.9681 - recall: 0.9807 - f1: 0.9743 - 576ms/step
step 11/50 - loss: 0.0000e+00 - precision: 0.9686 - recall: 0.9810 - f1: 0.9748 - 577ms/step
step 12/50 - loss: 0.0000e+00 - precision: 0.9678 - recall: 0.9813 - f1: 0.9745 - 576ms/step
step 13/50 - loss: 0.0000e+00 - precision: 0.9648 - recall: 0.9799 - f1: 0.9723 - 574ms/step
step 14/50 - loss: 0.0000e+00 - precision: 0.9618 - recall: 0.9787 - f1: 0.9702 - 576ms/step
step 15/50 - loss: 0.0000e+00 - precision: 0.9613 - recall: 0.9780 - f1: 0.9696 - 576ms/step
step 16/50 - loss: 0.0000e+00 - precision: 0.9637 - recall: 0.9794 - f1: 0.9715 - 577ms/step
step 17/50 - loss: 0.0000e+00 - precision: 0.9619 - recall: 0.9785 - f1: 0.9701 - 574ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.9600 - recall: 0.9765 - f1: 0.9682 - 574ms/step
step 19/50 - loss: 0.0000e+00 - precision: 0.9583 - recall: 0.9744 - f1: 0.9663 - 575ms/step
step 20/50 - loss: 0.0000e+00 - precision: 0.9583 - recall: 0.9741 - f1: 0.9661 - 571ms/step
step 21/50 - loss: 0.0000e+00 - precision: 0.9569 - recall: 0.9731 - f1: 0.9649 - 569ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.9563 - recall: 0.9727 - f1: 0.9644 - 569ms/step
step 23/50 - loss: 0.0000e+00 - precision: 0.9562 - recall: 0.9725 - f1: 0.9643 - 573ms/step
step 24/50 - loss: 0.0000e+00 - precision: 0.9567 - recall: 0.9728 - f1: 0.9647 - 571ms/step
step 25/50 - loss: 0.0000e+00 - precision: 0.9572 - recall: 0.9730 - f1: 0.9650 - 572ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.9555 - recall: 0.9716 - f1: 0.9635 - 573ms/step
step 27/50 - loss: 0.0000e+00 - precision: 0.9560 - recall: 0.9721 - f1: 0.9640 - 573ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.9552 - recall: 0.9718 - f1: 0.9634 - 573ms/step
step 29/50 - loss: 0.0000e+00 - precision: 0.9553 - recall: 0.9719 - f1: 0.9635 - 573ms/step
step 30/50 - loss: 0.6933 - precision: 0.9554 - recall: 0.9717 - f1: 0.9635 - 575ms/step
step 31/50 - loss: 0.0000e+00 - precision: 0.9560 - recall: 0.9723 - f1: 0.9641 - 574ms/step
step 32/50 - loss: 0.0000e+00 - precision: 0.9567 - recall: 0.9727 - f1: 0.9647 - 571ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.9580 - recall: 0.9735 - f1: 0.9657 - 571ms/step
step 34/50 - loss: 0.4958 - precision: 0.9590 - recall: 0.9742 - f1: 0.9665 - 576ms/step
step 35/50 - loss: 0.0000e+00 - precision: 0.9588 - recall: 0.9740 - f1: 0.9663 - 575ms/step
step 36/50 - loss: 0.0000e+00 - precision: 0.9587 - recall: 0.9740 - f1: 0.9663 - 577ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.9581 - recall: 0.9733 - f1: 0.9656 - 578ms/step
step 38/50 - loss: 5.5389 - precision: 0.9585 - recall: 0.9737 - f1: 0.9661 - 582ms/step
step 39/50 - loss: 0.0000e+00 - precision: 0.9591 - recall: 0.9740 - f1: 0.9665 - 580ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.9597 - recall: 0.9742 - f1: 0.9669 - 579ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.9601 - recall: 0.9746 - f1: 0.9673 - 579ms/step
step 42/50 - loss: 0.0000e+00 - precision: 0.9608 - recall: 0.9751 - f1: 0.9679 - 579ms/step
step 43/50 - loss: 0.6173 - precision: 0.9606 - recall: 0.9747 - f1: 0.9676 - 582ms/step
step 44/50 - loss: 0.0000e+00 - precision: 0.9604 - recall: 0.9743 - f1: 0.9673 - 581ms/step
step 45/50 - loss: 0.8225 - precision: 0.9610 - recall: 0.9748 - f1: 0.9679 - 580ms/step
step 46/50 - loss: 0.0000e+00 - precision: 0.9602 - recall: 0.9741 - f1: 0.9671 - 580ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.9599 - recall: 0.9741 - f1: 0.9670 - 581ms/step
step 48/50 - loss: 0.0000e+00 - precision: 0.9608 - recall: 0.9746 - f1: 0.9676 - 581ms/step
step 49/50 - loss: 0.0000e+00 - precision: 0.9602 - recall: 0.9743 - f1: 0.9672 - 580ms/step
step 50/50 - loss: 0.0000e+00 - precision: 0.9604 - recall: 0.9743 - f1: 0.9673 - 579ms/step
save checkpoint at /home/aistudio/results/4
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.9141 - recall: 0.9628 - f1: 0.9378 - 222ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9357 - recall: 0.9579 - f1: 0.9467 - 217ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9333 - recall: 0.9562 - f1: 0.9446 - 235ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9370 - recall: 0.9567 - f1: 0.9468 - 223ms/step
step 5/6 - loss: 2.2417 - precision: 0.9403 - recall: 0.9581 - f1: 0.9491 - 220ms/step
step 6/6 - loss: 1.1119 - precision: 0.9324 - recall: 0.9520 - f1: 0.9421 - 221ms/step
Eval samples: 192
Epoch 6/10
step  1/50 - loss: 0.0000e+00 - precision: 0.9896 - recall: 0.9948 - f1: 0.9922 - 600ms/step
step  2/50 - loss: 0.0000e+00 - precision: 0.9793 - recall: 0.9870 - f1: 0.9831 - 560ms/step
step  3/50 - loss: 0.0000e+00 - precision: 0.9862 - recall: 0.9913 - f1: 0.9887 - 559ms/step
step  4/50 - loss: 0.0000e+00 - precision: 0.9793 - recall: 0.9870 - f1: 0.9831 - 584ms/step
step  5/50 - loss: 0.0000e+00 - precision: 0.9732 - recall: 0.9823 - f1: 0.9777 - 571ms/step
step  6/50 - loss: 0.0000e+00 - precision: 0.9724 - recall: 0.9809 - f1: 0.9766 - 575ms/step
step  7/50 - loss: 0.0000e+00 - precision: 0.9654 - recall: 0.9769 - f1: 0.9711 - 565ms/step
step  8/50 - loss: 0.0000e+00 - precision: 0.9671 - recall: 0.9778 - f1: 0.9724 - 565ms/step
step  9/50 - loss: 0.1811 - precision: 0.9673 - recall: 0.9785 - f1: 0.9729 - 564ms/step
step 10/50 - loss: 0.0000e+00 - precision: 0.9675 - recall: 0.9791 - f1: 0.9733 - 567ms/step
step 11/50 - loss: 0.0000e+00 - precision: 0.9662 - recall: 0.9786 - f1: 0.9724 - 567ms/step
step 12/50 - loss: 0.0000e+00 - precision: 0.9661 - recall: 0.9787 - f1: 0.9723 - 567ms/step
step 13/50 - loss: 0.0000e+00 - precision: 0.9663 - recall: 0.9791 - f1: 0.9727 - 568ms/step
step 14/50 - loss: 0.0000e+00 - precision: 0.9658 - recall: 0.9784 - f1: 0.9720 - 580ms/step
step 15/50 - loss: 0.0000e+00 - precision: 0.9673 - recall: 0.9795 - f1: 0.9734 - 589ms/step
step 16/50 - loss: 0.0000e+00 - precision: 0.9671 - recall: 0.9798 - f1: 0.9734 - 585ms/step
step 17/50 - loss: 0.0000e+00 - precision: 0.9679 - recall: 0.9800 - f1: 0.9739 - 590ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.9685 - recall: 0.9800 - f1: 0.9742 - 594ms/step
step 19/50 - loss: 0.0000e+00 - precision: 0.9677 - recall: 0.9797 - f1: 0.9736 - 593ms/step
step 20/50 - loss: 0.0000e+00 - precision: 0.9667 - recall: 0.9791 - f1: 0.9729 - 593ms/step
step 21/50 - loss: 0.0000e+00 - precision: 0.9663 - recall: 0.9784 - f1: 0.9723 - 588ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.9669 - recall: 0.9789 - f1: 0.9729 - 584ms/step
step 23/50 - loss: 0.0000e+00 - precision: 0.9679 - recall: 0.9796 - f1: 0.9737 - 586ms/step
step 24/50 - loss: 0.0000e+00 - precision: 0.9675 - recall: 0.9793 - f1: 0.9734 - 586ms/step
step 25/50 - loss: 0.0000e+00 - precision: 0.9674 - recall: 0.9793 - f1: 0.9733 - 584ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.9671 - recall: 0.9791 - f1: 0.9730 - 582ms/step
step 27/50 - loss: 0.0000e+00 - precision: 0.9660 - recall: 0.9781 - f1: 0.9720 - 578ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.9664 - recall: 0.9782 - f1: 0.9723 - 579ms/step
step 29/50 - loss: 0.0000e+00 - precision: 0.9656 - recall: 0.9778 - f1: 0.9717 - 580ms/step
step 30/50 - loss: 0.3634 - precision: 0.9654 - recall: 0.9777 - f1: 0.9715 - 577ms/step
step 31/50 - loss: 0.0000e+00 - precision: 0.9639 - recall: 0.9772 - f1: 0.9705 - 579ms/step
step 32/50 - loss: 0.0000e+00 - precision: 0.9637 - recall: 0.9771 - f1: 0.9704 - 581ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.9636 - recall: 0.9769 - f1: 0.9702 - 582ms/step
step 34/50 - loss: 0.0000e+00 - precision: 0.9633 - recall: 0.9765 - f1: 0.9698 - 579ms/step
step 35/50 - loss: 0.0000e+00 - precision: 0.9639 - recall: 0.9767 - f1: 0.9703 - 577ms/step
step 36/50 - loss: 0.0000e+00 - precision: 0.9645 - recall: 0.9772 - f1: 0.9708 - 577ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.9646 - recall: 0.9774 - f1: 0.9710 - 576ms/step
step 38/50 - loss: 0.0000e+00 - precision: 0.9644 - recall: 0.9774 - f1: 0.9709 - 575ms/step
step 39/50 - loss: 0.0000e+00 - precision: 0.9651 - recall: 0.9778 - f1: 0.9714 - 576ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.9646 - recall: 0.9775 - f1: 0.9710 - 575ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.9638 - recall: 0.9772 - f1: 0.9704 - 578ms/step
step 42/50 - loss: 0.0000e+00 - precision: 0.9639 - recall: 0.9773 - f1: 0.9706 - 577ms/step
step 43/50 - loss: 0.0000e+00 - precision: 0.9643 - recall: 0.9775 - f1: 0.9708 - 576ms/step
step 44/50 - loss: 0.0000e+00 - precision: 0.9645 - recall: 0.9777 - f1: 0.9710 - 574ms/step
step 45/50 - loss: 0.0000e+00 - precision: 0.9645 - recall: 0.9778 - f1: 0.9711 - 574ms/step
step 46/50 - loss: 3.4914 - precision: 0.9644 - recall: 0.9776 - f1: 0.9710 - 572ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.9643 - recall: 0.9775 - f1: 0.9709 - 571ms/step
step 48/50 - loss: 1.3900 - precision: 0.9642 - recall: 0.9774 - f1: 0.9708 - 573ms/step
step 49/50 - loss: 0.0000e+00 - precision: 0.9642 - recall: 0.9771 - f1: 0.9706 - 573ms/step
step 50/50 - loss: 0.0000e+00 - precision: 0.9636 - recall: 0.9769 - f1: 0.9702 - 573ms/step
save checkpoint at /home/aistudio/results/5
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.8900 - recall: 0.9468 - f1: 0.9175 - 204ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9133 - recall: 0.9421 - f1: 0.9275 - 203ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9264 - recall: 0.9475 - f1: 0.9368 - 222ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9293 - recall: 0.9488 - f1: 0.9390 - 215ms/step
step 5/6 - loss: 2.4852 - precision: 0.9342 - recall: 0.9518 - f1: 0.9429 - 215ms/step
step 6/6 - loss: 2.5991 - precision: 0.9231 - recall: 0.9432 - f1: 0.9330 - 218ms/step
Eval samples: 192
Epoch 7/10
step  1/50 - loss: 0.0000e+00 - precision: 0.9691 - recall: 0.9843 - f1: 0.9766 - 518ms/step
step  2/50 - loss: 0.0000e+00 - precision: 0.9741 - recall: 0.9817 - f1: 0.9779 - 547ms/step
step  3/50 - loss: 0.0000e+00 - precision: 0.9758 - recall: 0.9843 - f1: 0.9801 - 593ms/step
step  4/50 - loss: 0.0000e+00 - precision: 0.9793 - recall: 0.9869 - f1: 0.9831 - 599ms/step
step  5/50 - loss: 0.0000e+00 - precision: 0.9731 - recall: 0.9812 - f1: 0.9771 - 597ms/step
step  6/50 - loss: 0.0000e+00 - precision: 0.9775 - recall: 0.9843 - f1: 0.9809 - 600ms/step
step  7/50 - loss: 0.0000e+00 - precision: 0.9712 - recall: 0.9806 - f1: 0.9759 - 586ms/step
step  8/50 - loss: 0.0000e+00 - precision: 0.9722 - recall: 0.9811 - f1: 0.9766 - 581ms/step
step  9/50 - loss: 0.0000e+00 - precision: 0.9718 - recall: 0.9808 - f1: 0.9763 - 581ms/step
step 10/50 - loss: 0.0000e+00 - precision: 0.9715 - recall: 0.9806 - f1: 0.9760 - 579ms/step
step 11/50 - loss: 0.0000e+00 - precision: 0.9708 - recall: 0.9810 - f1: 0.9758 - 577ms/step
step 12/50 - loss: 0.0000e+00 - precision: 0.9711 - recall: 0.9817 - f1: 0.9764 - 576ms/step
step 13/50 - loss: 0.0000e+00 - precision: 0.9717 - recall: 0.9819 - f1: 0.9768 - 580ms/step
step 14/50 - loss: 0.0000e+00 - precision: 0.9715 - recall: 0.9817 - f1: 0.9766 - 581ms/step
step 15/50 - loss: 0.0000e+00 - precision: 0.9706 - recall: 0.9811 - f1: 0.9759 - 585ms/step
step 16/50 - loss: 0.0000e+00 - precision: 0.9712 - recall: 0.9817 - f1: 0.9764 - 591ms/step
step 17/50 - loss: 0.0000e+00 - precision: 0.9708 - recall: 0.9818 - f1: 0.9763 - 588ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.9701 - recall: 0.9808 - f1: 0.9754 - 589ms/step
step 19/50 - loss: 0.0000e+00 - precision: 0.9695 - recall: 0.9801 - f1: 0.9748 - 582ms/step
step 20/50 - loss: 0.0000e+00 - precision: 0.9705 - recall: 0.9809 - f1: 0.9756 - 590ms/step
step 21/50 - loss: 0.0000e+00 - precision: 0.9709 - recall: 0.9810 - f1: 0.9759 - 591ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.9717 - recall: 0.9817 - f1: 0.9767 - 593ms/step
step 23/50 - loss: 0.0000e+00 - precision: 0.9707 - recall: 0.9806 - f1: 0.9756 - 589ms/step
step 24/50 - loss: 0.6602 - precision: 0.9687 - recall: 0.9790 - f1: 0.9738 - 585ms/step
step 25/50 - loss: 0.0000e+00 - precision: 0.9693 - recall: 0.9797 - f1: 0.9745 - 585ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.9695 - recall: 0.9801 - f1: 0.9747 - 582ms/step
step 27/50 - loss: 0.0000e+00 - precision: 0.9695 - recall: 0.9806 - f1: 0.9750 - 582ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.9704 - recall: 0.9809 - f1: 0.9756 - 579ms/step
step 29/50 - loss: 0.0000e+00 - precision: 0.9703 - recall: 0.9807 - f1: 0.9755 - 580ms/step
step 30/50 - loss: 0.0000e+00 - precision: 0.9699 - recall: 0.9803 - f1: 0.9751 - 579ms/step
step 31/50 - loss: 0.0000e+00 - precision: 0.9699 - recall: 0.9804 - f1: 0.9751 - 578ms/step
step 32/50 - loss: 0.0000e+00 - precision: 0.9697 - recall: 0.9805 - f1: 0.9751 - 579ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.9688 - recall: 0.9802 - f1: 0.9745 - 578ms/step
step 34/50 - loss: 0.0000e+00 - precision: 0.9670 - recall: 0.9791 - f1: 0.9730 - 580ms/step
step 35/50 - loss: 0.0000e+00 - precision: 0.9668 - recall: 0.9786 - f1: 0.9727 - 578ms/step
step 36/50 - loss: 0.0000e+00 - precision: 0.9654 - recall: 0.9776 - f1: 0.9715 - 580ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.9654 - recall: 0.9778 - f1: 0.9715 - 582ms/step
step 38/50 - loss: 0.0000e+00 - precision: 0.9660 - recall: 0.9781 - f1: 0.9720 - 581ms/step
step 39/50 - loss: 0.0000e+00 - precision: 0.9661 - recall: 0.9783 - f1: 0.9721 - 581ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.9659 - recall: 0.9783 - f1: 0.9721 - 580ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.9656 - recall: 0.9782 - f1: 0.9718 - 581ms/step
step 42/50 - loss: 25.6443 - precision: 0.9659 - recall: 0.9783 - f1: 0.9721 - 584ms/step
step 43/50 - loss: 0.0000e+00 - precision: 0.9661 - recall: 0.9786 - f1: 0.9723 - 583ms/step
step 44/50 - loss: 0.0000e+00 - precision: 0.9664 - recall: 0.9788 - f1: 0.9726 - 584ms/step
step 45/50 - loss: 0.0000e+00 - precision: 0.9669 - recall: 0.9792 - f1: 0.9730 - 584ms/step
step 46/50 - loss: 0.0000e+00 - precision: 0.9670 - recall: 0.9792 - f1: 0.9730 - 584ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.9670 - recall: 0.9794 - f1: 0.9732 - 581ms/step
step 48/50 - loss: 0.0000e+00 - precision: 0.9669 - recall: 0.9794 - f1: 0.9731 - 580ms/step
step 49/50 - loss: 0.0000e+00 - precision: 0.9671 - recall: 0.9795 - f1: 0.9733 - 580ms/step
step 50/50 - loss: 0.0000e+00 - precision: 0.9676 - recall: 0.9798 - f1: 0.9736 - 580ms/step
save checkpoint at /home/aistudio/results/6
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.9381 - recall: 0.9681 - f1: 0.9529 - 214ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9485 - recall: 0.9684 - f1: 0.9583 - 213ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9468 - recall: 0.9667 - f1: 0.9567 - 240ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9422 - recall: 0.9619 - f1: 0.9519 - 225ms/step
step 5/6 - loss: 0.2956 - precision: 0.9465 - recall: 0.9644 - f1: 0.9553 - 227ms/step
step 6/6 - loss: 0.0000e+00 - precision: 0.9408 - recall: 0.9581 - f1: 0.9494 - 230ms/step
Eval samples: 192
Epoch 8/10
step  1/50 - loss: 0.0000e+00 - precision: 0.9590 - recall: 0.9740 - f1: 0.9664 - 575ms/step
step  2/50 - loss: 0.0000e+00 - precision: 0.9742 - recall: 0.9843 - f1: 0.9792 - 562ms/step
step  3/50 - loss: 0.0000e+00 - precision: 0.9793 - recall: 0.9878 - f1: 0.9835 - 599ms/step
step  4/50 - loss: 0.0000e+00 - precision: 0.9755 - recall: 0.9869 - f1: 0.9812 - 577ms/step
step  5/50 - loss: 0.0000e+00 - precision: 0.9763 - recall: 0.9875 - f1: 0.9818 - 557ms/step
step  6/50 - loss: 0.0000e+00 - precision: 0.9751 - recall: 0.9861 - f1: 0.9805 - 557ms/step
step  7/50 - loss: 0.0000e+00 - precision: 0.9757 - recall: 0.9866 - f1: 0.9811 - 569ms/step
step  8/50 - loss: 0.0000e+00 - precision: 0.9774 - recall: 0.9876 - f1: 0.9825 - 574ms/step
step  9/50 - loss: 0.0000e+00 - precision: 0.9753 - recall: 0.9861 - f1: 0.9807 - 569ms/step
step 10/50 - loss: 0.0000e+00 - precision: 0.9747 - recall: 0.9859 - f1: 0.9803 - 583ms/step
step 11/50 - loss: 0.0000e+00 - precision: 0.9751 - recall: 0.9858 - f1: 0.9804 - 586ms/step
step 12/50 - loss: 0.0000e+00 - precision: 0.9750 - recall: 0.9861 - f1: 0.9805 - 582ms/step
step 13/50 - loss: 0.0000e+00 - precision: 0.9769 - recall: 0.9871 - f1: 0.9820 - 577ms/step
step 14/50 - loss: 0.0000e+00 - precision: 0.9760 - recall: 0.9866 - f1: 0.9813 - 576ms/step
step 15/50 - loss: 0.0000e+00 - precision: 0.9745 - recall: 0.9861 - f1: 0.9803 - 577ms/step
step 16/50 - loss: 0.0000e+00 - precision: 0.9713 - recall: 0.9837 - f1: 0.9774 - 580ms/step
step 17/50 - loss: 0.2116 - precision: 0.9717 - recall: 0.9837 - f1: 0.9777 - 580ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.9690 - recall: 0.9820 - f1: 0.9754 - 578ms/step
step 19/50 - loss: 0.0000e+00 - precision: 0.9690 - recall: 0.9813 - f1: 0.9751 - 571ms/step
step 20/50 - loss: 0.0000e+00 - precision: 0.9687 - recall: 0.9809 - f1: 0.9748 - 571ms/step
step 21/50 - loss: 0.0000e+00 - precision: 0.9695 - recall: 0.9813 - f1: 0.9754 - 579ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.9692 - recall: 0.9810 - f1: 0.9751 - 575ms/step
step 23/50 - loss: 0.0000e+00 - precision: 0.9694 - recall: 0.9813 - f1: 0.9754 - 574ms/step
step 24/50 - loss: 0.0000e+00 - precision: 0.9696 - recall: 0.9815 - f1: 0.9755 - 576ms/step
step 25/50 - loss: 0.0000e+00 - precision: 0.9694 - recall: 0.9814 - f1: 0.9754 - 580ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.9698 - recall: 0.9813 - f1: 0.9755 - 581ms/step
step 27/50 - loss: 0.0000e+00 - precision: 0.9692 - recall: 0.9814 - f1: 0.9753 - 581ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.9699 - recall: 0.9819 - f1: 0.9759 - 584ms/step
step 29/50 - loss: 0.0000e+00 - precision: 0.9695 - recall: 0.9816 - f1: 0.9755 - 581ms/step
step 30/50 - loss: 0.0000e+00 - precision: 0.9690 - recall: 0.9815 - f1: 0.9752 - 580ms/step
step 31/50 - loss: 0.0000e+00 - precision: 0.9693 - recall: 0.9816 - f1: 0.9754 - 577ms/step
step 32/50 - loss: 0.0000e+00 - precision: 0.9692 - recall: 0.9817 - f1: 0.9754 - 577ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.9695 - recall: 0.9818 - f1: 0.9756 - 578ms/step
step 34/50 - loss: 0.0000e+00 - precision: 0.9701 - recall: 0.9822 - f1: 0.9761 - 580ms/step
step 35/50 - loss: 0.0000e+00 - precision: 0.9703 - recall: 0.9822 - f1: 0.9762 - 579ms/step
step 36/50 - loss: 0.0000e+00 - precision: 0.9709 - recall: 0.9826 - f1: 0.9767 - 578ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.9711 - recall: 0.9826 - f1: 0.9768 - 578ms/step
step 38/50 - loss: 0.9814 - precision: 0.9713 - recall: 0.9827 - f1: 0.9769 - 578ms/step
step 39/50 - loss: 0.0000e+00 - precision: 0.9714 - recall: 0.9828 - f1: 0.9771 - 580ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.9710 - recall: 0.9826 - f1: 0.9768 - 581ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.9704 - recall: 0.9820 - f1: 0.9761 - 581ms/step
step 42/50 - loss: 0.0000e+00 - precision: 0.9701 - recall: 0.9818 - f1: 0.9759 - 580ms/step
step 43/50 - loss: 0.0000e+00 - precision: 0.9703 - recall: 0.9817 - f1: 0.9760 - 580ms/step
step 44/50 - loss: 0.0000e+00 - precision: 0.9698 - recall: 0.9814 - f1: 0.9756 - 579ms/step
step 45/50 - loss: 0.0000e+00 - precision: 0.9698 - recall: 0.9815 - f1: 0.9756 - 579ms/step
step 46/50 - loss: 0.0000e+00 - precision: 0.9702 - recall: 0.9818 - f1: 0.9760 - 581ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.9704 - recall: 0.9819 - f1: 0.9761 - 583ms/step
step 48/50 - loss: 0.0000e+00 - precision: 0.9700 - recall: 0.9816 - f1: 0.9758 - 582ms/step
step 49/50 - loss: 0.0000e+00 - precision: 0.9702 - recall: 0.9817 - f1: 0.9760 - 582ms/step
step 50/50 - loss: 0.0000e+00 - precision: 0.9708 - recall: 0.9821 - f1: 0.9764 - 580ms/step
save checkpoint at /home/aistudio/results/7
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.8856 - recall: 0.9468 - f1: 0.9152 - 233ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9286 - recall: 0.9579 - f1: 0.9430 - 251ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9250 - recall: 0.9510 - f1: 0.9378 - 279ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9273 - recall: 0.9541 - f1: 0.9405 - 260ms/step
step 5/6 - loss: 0.7449 - precision: 0.9345 - recall: 0.9570 - f1: 0.9456 - 253ms/step
step 6/6 - loss: 2.4329 - precision: 0.9284 - recall: 0.9520 - f1: 0.9401 - 251ms/step
Eval samples: 192
Epoch 9/10
step  1/50 - loss: 0.0000e+00 - precision: 0.9791 - recall: 0.9842 - f1: 0.9816 - 607ms/step
step  2/50 - loss: 0.0000e+00 - precision: 0.9714 - recall: 0.9816 - f1: 0.9764 - 654ms/step
step  3/50 - loss: 0.0000e+00 - precision: 0.9740 - recall: 0.9842 - f1: 0.9791 - 646ms/step
step  4/50 - loss: 0.0000e+00 - precision: 0.9715 - recall: 0.9830 - f1: 0.9772 - 626ms/step
step  5/50 - loss: 0.0000e+00 - precision: 0.9659 - recall: 0.9801 - f1: 0.9729 - 607ms/step
step  6/50 - loss: 0.0000e+00 - precision: 0.9682 - recall: 0.9817 - f1: 0.9749 - 607ms/step
step  7/50 - loss: 0.0000e+00 - precision: 0.9654 - recall: 0.9791 - f1: 0.9722 - 596ms/step
step  8/50 - loss: 0.0000e+00 - precision: 0.9671 - recall: 0.9804 - f1: 0.9737 - 603ms/step
step  9/50 - loss: 0.0000e+00 - precision: 0.9673 - recall: 0.9796 - f1: 0.9734 - 593ms/step
step 10/50 - loss: 0.0000e+00 - precision: 0.9695 - recall: 0.9806 - f1: 0.9750 - 596ms/step
step 11/50 - loss: 0.0000e+00 - precision: 0.9699 - recall: 0.9814 - f1: 0.9756 - 609ms/step
step 12/50 - loss: 0.0000e+00 - precision: 0.9702 - recall: 0.9817 - f1: 0.9759 - 604ms/step
step 13/50 - loss: 0.0000e+00 - precision: 0.9717 - recall: 0.9823 - f1: 0.9769 - 601ms/step
step 14/50 - loss: 0.0000e+00 - precision: 0.9730 - recall: 0.9832 - f1: 0.9780 - 609ms/step
step 15/50 - loss: 0.0000e+00 - precision: 0.9727 - recall: 0.9822 - f1: 0.9774 - 604ms/step
step 16/50 - loss: 0.0000e+00 - precision: 0.9731 - recall: 0.9823 - f1: 0.9777 - 600ms/step
step 17/50 - loss: 0.0000e+00 - precision: 0.9735 - recall: 0.9827 - f1: 0.9781 - 596ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.9732 - recall: 0.9820 - f1: 0.9776 - 596ms/step
step 19/50 - loss: 0.0000e+00 - precision: 0.9738 - recall: 0.9818 - f1: 0.9778 - 601ms/step
step 20/50 - loss: 0.0000e+00 - precision: 0.9746 - recall: 0.9825 - f1: 0.9785 - 601ms/step
step 21/50 - loss: 0.0000e+00 - precision: 0.9743 - recall: 0.9823 - f1: 0.9783 - 595ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.9736 - recall: 0.9822 - f1: 0.9779 - 598ms/step
step 23/50 - loss: 0.0000e+00 - precision: 0.9725 - recall: 0.9816 - f1: 0.9770 - 596ms/step
step 24/50 - loss: 0.0000e+00 - precision: 0.9721 - recall: 0.9817 - f1: 0.9769 - 598ms/step
step 25/50 - loss: 0.0000e+00 - precision: 0.9710 - recall: 0.9814 - f1: 0.9762 - 601ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.9701 - recall: 0.9801 - f1: 0.9751 - 597ms/step
step 27/50 - loss: 0.0000e+00 - precision: 0.9705 - recall: 0.9804 - f1: 0.9754 - 596ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.9704 - recall: 0.9804 - f1: 0.9754 - 595ms/step
step 29/50 - loss: 0.0000e+00 - precision: 0.9705 - recall: 0.9807 - f1: 0.9756 - 596ms/step
step 30/50 - loss: 0.0000e+00 - precision: 0.9706 - recall: 0.9806 - f1: 0.9756 - 596ms/step
step 31/50 - loss: 0.0370 - precision: 0.9704 - recall: 0.9806 - f1: 0.9755 - 595ms/step
step 32/50 - loss: 0.0000e+00 - precision: 0.9710 - recall: 0.9810 - f1: 0.9760 - 593ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.9719 - recall: 0.9816 - f1: 0.9767 - 592ms/step
step 34/50 - loss: 0.0000e+00 - precision: 0.9720 - recall: 0.9815 - f1: 0.9767 - 590ms/step
step 35/50 - loss: 0.0000e+00 - precision: 0.9723 - recall: 0.9819 - f1: 0.9771 - 590ms/step
step 36/50 - loss: 0.0000e+00 - precision: 0.9725 - recall: 0.9821 - f1: 0.9773 - 589ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.9730 - recall: 0.9823 - f1: 0.9776 - 588ms/step
step 38/50 - loss: 0.0000e+00 - precision: 0.9726 - recall: 0.9821 - f1: 0.9773 - 586ms/step
step 39/50 - loss: 0.0000e+00 - precision: 0.9730 - recall: 0.9824 - f1: 0.9777 - 586ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.9732 - recall: 0.9825 - f1: 0.9778 - 585ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.9738 - recall: 0.9829 - f1: 0.9784 - 584ms/step
step 42/50 - loss: 0.0000e+00 - precision: 0.9742 - recall: 0.9831 - f1: 0.9786 - 583ms/step
step 43/50 - loss: 0.0000e+00 - precision: 0.9743 - recall: 0.9832 - f1: 0.9788 - 581ms/step
step 44/50 - loss: 0.0000e+00 - precision: 0.9747 - recall: 0.9835 - f1: 0.9791 - 581ms/step
step 45/50 - loss: 0.0000e+00 - precision: 0.9743 - recall: 0.9834 - f1: 0.9788 - 581ms/step
step 46/50 - loss: 0.0000e+00 - precision: 0.9744 - recall: 0.9835 - f1: 0.9789 - 579ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.9741 - recall: 0.9832 - f1: 0.9786 - 579ms/step
step 48/50 - loss: 0.0000e+00 - precision: 0.9740 - recall: 0.9832 - f1: 0.9786 - 577ms/step
step 49/50 - loss: 0.0000e+00 - precision: 0.9743 - recall: 0.9834 - f1: 0.9788 - 577ms/step
step 50/50 - loss: 0.0000e+00 - precision: 0.9747 - recall: 0.9836 - f1: 0.9791 - 577ms/step
save checkpoint at /home/aistudio/results/8
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.9175 - recall: 0.9468 - f1: 0.9319 - 242ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9352 - recall: 0.9500 - f1: 0.9426 - 239ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9244 - recall: 0.9422 - f1: 0.9332 - 254ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9265 - recall: 0.9436 - f1: 0.9350 - 238ms/step
step 5/6 - loss: 2.9493 - precision: 0.9360 - recall: 0.9497 - f1: 0.9428 - 238ms/step
step 6/6 - loss: 1.0696 - precision: 0.9262 - recall: 0.9424 - f1: 0.9342 - 237ms/step
Eval samples: 192
Epoch 10/10
step  1/50 - loss: 0.0000e+00 - precision: 0.9585 - recall: 0.9737 - f1: 0.9661 - 543ms/step
step  2/50 - loss: 0.0000e+00 - precision: 0.9634 - recall: 0.9735 - f1: 0.9684 - 545ms/step
step  3/50 - loss: 0.0000e+00 - precision: 0.9583 - recall: 0.9701 - f1: 0.9641 - 554ms/step
step  4/50 - loss: 0.0000e+00 - precision: 0.9635 - recall: 0.9737 - f1: 0.9686 - 598ms/step
step  5/50 - loss: 0.0000e+00 - precision: 0.9667 - recall: 0.9758 - f1: 0.9712 - 583ms/step
step  6/50 - loss: 0.0000e+00 - precision: 0.9688 - recall: 0.9781 - f1: 0.9735 - 573ms/step
step  7/50 - loss: 0.0000e+00 - precision: 0.9703 - recall: 0.9798 - f1: 0.9750 - 582ms/step
step  8/50 - loss: 0.0000e+00 - precision: 0.9676 - recall: 0.9777 - f1: 0.9726 - 579ms/step
step  9/50 - loss: 0.0000e+00 - precision: 0.9689 - recall: 0.9785 - f1: 0.9736 - 572ms/step
step 10/50 - loss: 0.0000e+00 - precision: 0.9699 - recall: 0.9791 - f1: 0.9745 - 570ms/step
step 11/50 - loss: 0.0000e+00 - precision: 0.9717 - recall: 0.9800 - f1: 0.9758 - 565ms/step
step 12/50 - loss: 0.0000e+00 - precision: 0.9685 - recall: 0.9786 - f1: 0.9735 - 569ms/step
step 13/50 - loss: 0.0000e+00 - precision: 0.9693 - recall: 0.9795 - f1: 0.9744 - 563ms/step
step 14/50 - loss: 0.0000e+00 - precision: 0.9686 - recall: 0.9794 - f1: 0.9740 - 567ms/step
step 15/50 - loss: 0.0000e+00 - precision: 0.9672 - recall: 0.9790 - f1: 0.9731 - 568ms/step
step 16/50 - loss: 0.0000e+00 - precision: 0.9680 - recall: 0.9797 - f1: 0.9738 - 570ms/step
step 17/50 - loss: 0.0000e+00 - precision: 0.9692 - recall: 0.9806 - f1: 0.9749 - 570ms/step
step 18/50 - loss: 0.0000e+00 - precision: 0.9704 - recall: 0.9814 - f1: 0.9758 - 567ms/step
step 19/50 - loss: 0.0000e+00 - precision: 0.9714 - recall: 0.9821 - f1: 0.9767 - 571ms/step
step 20/50 - loss: 0.0364 - precision: 0.9723 - recall: 0.9827 - f1: 0.9775 - 570ms/step
step 21/50 - loss: 0.0000e+00 - precision: 0.9731 - recall: 0.9833 - f1: 0.9782 - 572ms/step
step 22/50 - loss: 0.0000e+00 - precision: 0.9739 - recall: 0.9836 - f1: 0.9787 - 570ms/step
step 23/50 - loss: 0.0000e+00 - precision: 0.9736 - recall: 0.9832 - f1: 0.9784 - 574ms/step
step 24/50 - loss: 0.0000e+00 - precision: 0.9743 - recall: 0.9837 - f1: 0.9790 - 578ms/step
step 25/50 - loss: 0.0000e+00 - precision: 0.9749 - recall: 0.9839 - f1: 0.9794 - 578ms/step
step 26/50 - loss: 0.0000e+00 - precision: 0.9753 - recall: 0.9837 - f1: 0.9795 - 579ms/step
step 27/50 - loss: 0.0000e+00 - precision: 0.9748 - recall: 0.9833 - f1: 0.9791 - 581ms/step
step 28/50 - loss: 0.0000e+00 - precision: 0.9754 - recall: 0.9836 - f1: 0.9794 - 583ms/step
step 29/50 - loss: 0.0000e+00 - precision: 0.9755 - recall: 0.9834 - f1: 0.9794 - 580ms/step
step 30/50 - loss: 0.0000e+00 - precision: 0.9756 - recall: 0.9833 - f1: 0.9794 - 579ms/step
step 31/50 - loss: 0.0000e+00 - precision: 0.9761 - recall: 0.9836 - f1: 0.9798 - 581ms/step
step 32/50 - loss: 0.0000e+00 - precision: 0.9749 - recall: 0.9828 - f1: 0.9788 - 584ms/step
step 33/50 - loss: 0.0000e+00 - precision: 0.9750 - recall: 0.9830 - f1: 0.9790 - 581ms/step
step 34/50 - loss: 0.0000e+00 - precision: 0.9748 - recall: 0.9831 - f1: 0.9789 - 584ms/step
step 35/50 - loss: 0.0000e+00 - precision: 0.9744 - recall: 0.9828 - f1: 0.9786 - 583ms/step
step 36/50 - loss: 0.0000e+00 - precision: 0.9748 - recall: 0.9832 - f1: 0.9790 - 581ms/step
step 37/50 - loss: 0.0000e+00 - precision: 0.9749 - recall: 0.9833 - f1: 0.9791 - 579ms/step
step 38/50 - loss: 0.0000e+00 - precision: 0.9744 - recall: 0.9832 - f1: 0.9788 - 577ms/step
step 39/50 - loss: 0.0000e+00 - precision: 0.9736 - recall: 0.9830 - f1: 0.9783 - 575ms/step
step 40/50 - loss: 0.0000e+00 - precision: 0.9730 - recall: 0.9826 - f1: 0.9778 - 574ms/step
step 41/50 - loss: 0.0000e+00 - precision: 0.9731 - recall: 0.9828 - f1: 0.9779 - 574ms/step
step 42/50 - loss: 0.0000e+00 - precision: 0.9723 - recall: 0.9826 - f1: 0.9774 - 575ms/step
step 43/50 - loss: 0.0000e+00 - precision: 0.9720 - recall: 0.9825 - f1: 0.9772 - 577ms/step
step 44/50 - loss: 0.0000e+00 - precision: 0.9724 - recall: 0.9828 - f1: 0.9775 - 577ms/step
step 45/50 - loss: 0.0000e+00 - precision: 0.9725 - recall: 0.9829 - f1: 0.9777 - 576ms/step
step 46/50 - loss: 0.0000e+00 - precision: 0.9722 - recall: 0.9828 - f1: 0.9775 - 574ms/step
step 47/50 - loss: 0.0000e+00 - precision: 0.9722 - recall: 0.9828 - f1: 0.9774 - 575ms/step
step 48/50 - loss: 0.0000e+00 - precision: 0.9720 - recall: 0.9827 - f1: 0.9773 - 576ms/step
step 49/50 - loss: 0.0000e+00 - precision: 0.9718 - recall: 0.9826 - f1: 0.9772 - 575ms/step
step 50/50 - loss: 0.0000e+00 - precision: 0.9713 - recall: 0.9821 - f1: 0.9767 - 575ms/step
save checkpoint at /home/aistudio/results/9
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 0.0000e+00 - precision: 0.9031 - recall: 0.9415 - f1: 0.9219 - 231ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9225 - recall: 0.9395 - f1: 0.9309 - 225ms/step
step 3/6 - loss: 0.0000e+00 - precision: 0.9275 - recall: 0.9405 - f1: 0.9339 - 241ms/step
step 4/6 - loss: 0.0000e+00 - precision: 0.9338 - recall: 0.9436 - f1: 0.9386 - 232ms/step
step 5/6 - loss: 6.0253 - precision: 0.9407 - recall: 0.9486 - f1: 0.9447 - 230ms/step
step 6/6 - loss: 3.5843 - precision: 0.9376 - recall: 0.9450 - f1: 0.9413 - 232ms/step
Eval samples: 192
save checkpoint at /home/aistudio/results/final
B.5 Model Evaluation
Call model.evaluate to check the sequence labeling model's metrics on the test set (test.txt).

In [11]
model.evaluate(eval_data=test_loader, log_freq=1)
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/6 - loss: 2.7858 - precision: 0.9634 - recall: 0.9634 - f1: 0.9634 - 261ms/step
step 2/6 - loss: 0.0000e+00 - precision: 0.9474 - recall: 0.9399 - f1: 0.9436 - 258ms/step
step 3/6 - loss: 19.7272 - precision: 0.9347 - recall: 0.9266 - f1: 0.9306 - 255ms/step
step 4/6 - loss: 0.5692 - precision: 0.9380 - recall: 0.9306 - f1: 0.9343 - 251ms/step
step 5/6 - loss: 0.0000e+00 - precision: 0.9337 - recall: 0.9278 - f1: 0.9307 - 251ms/step
step 6/6 - loss: 0.0000e+00 - precision: 0.9245 - recall: 0.9188 - f1: 0.9217 - 254ms/step
Eval samples: 192
{'loss': [0.0],
 'precision': 0.9244951712028094,
 'recall': 0.918848167539267,
 'f1': 0.9216630196936543}
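The precision/recall/f1 above come from paddlenlp.metrics.ChunkEvaluator, which scores whole entity chunks rather than single tokens: a chunk counts as correct only if both its boundary and its type match. Below is a minimal sketch of that bookkeeping on a hypothetical toy tag set (prefix-style B-/I- tags here; this project's own tags use the suffix style, hence suffix=True in the training code):

import paddle
from paddlenlp.metrics import ChunkEvaluator

# Toy label set (hypothetical; not this project's labels).
evaluator = ChunkEvaluator(label_list=['B-PER', 'I-PER', 'O'])

lengths = paddle.to_tensor([4], dtype='int64')             # one sequence of 4 tokens
preds = paddle.to_tensor([[0, 1, 2, 0]], dtype='int64')    # B-PER I-PER O B-PER -> 2 chunks
labels = paddle.to_tensor([[0, 1, 2, 2]], dtype='int64')   # B-PER I-PER O O     -> 1 chunk
num_infer, num_label, num_correct = evaluator.compute(lengths, preds, labels)
evaluator.update(num_infer, num_label, num_correct)
precision, recall, f1 = evaluator.accumulate()
print(precision, recall, f1)   # 0.5 1.0 0.667: 1 of 2 predicted chunks is correct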
B.6 Prediction
With the trained model, we can run prediction on data without labels (here we reuse the test set test.txt) to obtain the model's predictions along with the scores for each label.

In [12]
def parse_decodes(ds, decodes, lens, label_vocab):
    # Flatten the per-batch decoded tag ids and sequence lengths.
    decodes = [x for batch in decodes for x in batch]
    lens = [x for batch in lens for x in batch]
    # Invert the label vocab: id -> label name.
    id_label = dict(zip(label_vocab.values(), label_vocab.keys()))

    outputs = []
    for idx, end in enumerate(lens):
        # Strip padding: keep only the first `end` tokens and their tags.
        sent = ds.data[idx][0][:end]
        tags = [id_label[x] for x in decodes[idx][:end]]
        sent_out = []
        tags_out = []
        words = ""
        for s, t in zip(sent, tags):
            # A '-B' suffix or 'O' starts a new chunk; flush the previous one.
            if t.endswith('-B') or t == 'O':
                if len(words):
                    sent_out.append(words)
                tags_out.append(t.split('-')[0])
                words = s
            else:
                words += s
        # Flush the final chunk of the sentence.
        if len(sent_out) < len(tags_out):
            sent_out.append(words)
        outputs.append(''.join(
            [str((s, t)) for s, t in zip(sent_out, tags_out)]))
    return outputs
In [13]
outputs, lens, decodes = model.predict(test_data=test_loader)
preds = parse_decodes(test_ds, decodes, lens, label_vocab)

print('\n'.join(preds[:5]))
Predict begin...
step 6/6 [==============================] - 197ms/step
Predict samples: 192
('黑龙江省', 'A1')('双鸭山市', 'A2')('尖山区', 'A3')('八马路与东平行路交叉口北40米韦业涛', 'A4')('18600009172', 'T')
('广西壮族自治区', 'A1')('桂林市', 'A2')('雁山区', 'A3')('雁山镇西龙村老年活动中心', 'A4')('17610348888', 'T')('羊卓卫', 'P')
('15652864561', 'T')('河南省', 'A1')('开封市', 'A2')('顺河回族区', 'A3')('顺河区公园路32号赵本山', 'A4')
('河北省', 'A1')('唐山市', 'A2')('玉田县', 'A3')('无终大街159号', 'A4')('18614253058', 'T')('尚汉生', 'P')
('台湾', 'A1')('台中市', 'A2')('北区', 'A3')('北区锦新街18号', 'A4')('18511226708', 'T')('蓟丽', 'P')
PART C Going Further: Improving the Model with Pretrained Word Vectors
In the baseline version, we used paddle.nn.Embedding to obtain word vector representations; its weights are randomly initialized and must be learned entirely from our training data.
Here, we instead use TokenEmbedding, the representation built into paddlenlp.embeddings, which warm-starts from pretrained Chinese word vectors and thus brings in semantics learned from a much larger corpus.
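Before wiring TokenEmbedding into the network, it is worth a quick check that the pretrained vectors already carry semantics. A minimal sketch (the first call downloads the default w2v.baidu_encyclopedia vectors; the word pairs are just illustrative):

from paddlenlp.embeddings import TokenEmbedding

te = TokenEmbedding()  # defaults to w2v.baidu_encyclopedia.target.word-word.dim300
# Related words should score noticeably higher than unrelated ones.
print(te.cosine_sim('北京', '上海'))
print(te.cosine_sim('北京', '苹果'))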

In [14]
from paddlenlp.embeddings import TokenEmbedding  # pretrained token embeddings

# Release the baseline model and its outputs before building the new network.
del model
del preds
del network
In [15]
class BiGRUWithCRF2(nn.Layer):
    def __init__(self,
                 emb_size,
                 hidden_size,
                 word_num,
                 label_num,
                 use_w2v_emb=True):
        super(BiGRUWithCRF2, self).__init__()
        if use_w2v_emb:
            # Warm-start from pretrained word vectors, extending the pretrained
            # vocab with the tokens in our own word dictionary.
            self.word_emb = TokenEmbedding(
                extended_vocab_path='./data/word.dic', unknown_token='OOV')
        else:
            self.word_emb = nn.Embedding(word_num, emb_size)
        self.gru = nn.GRU(emb_size,
                          hidden_size,
                          num_layers=2,
                          direction='bidirectional')
        # +2 output scores for the CRF's implicit BOS/EOS tags.
        self.fc = nn.Linear(hidden_size * 2, label_num + 2)
        self.crf = LinearChainCrf(label_num)
        self.decoder = ViterbiDecoder(self.crf.transitions)

    def forward(self, x, lens):
        embs = self.word_emb(x)
        output, _ = self.gru(embs)
        output = self.fc(output)
        _, pred = self.decoder(output, lens)
        return output, lens, pred
In [16]
network = BiGRUWithCRF2(300, 300, len(word_vocab), len(label_vocab))
model = paddle.Model(network)
optimizer = paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters())
crf_loss = LinearChainCrfLoss(network.crf)
chunk_evaluator = ChunkEvaluator(label_list=label_vocab.keys(), suffix=True)
model.prepare(optimizer, crf_loss, chunk_evaluator)
2021-04-06 19:42:43,819 - INFO - unique_endpoints {''}
2021-04-06 19:42:43,821 - INFO - Downloading w2v.baidu_encyclopedia.target.word-word.dim300.tar.gz from https://paddlenlp.bj.bcebos.com/models/embeddings/w2v.baidu_encyclopedia.target.word-word.dim300.tar.gz
100%|██████████| 694483/694483 [00:13<00:00, 53352.21it/s]
2021-04-06 19:42:57,039 - INFO - Decompressing /home/aistudio/.paddlenlp/models/embeddings/w2v.baidu_encyclopedia.target.word-word.dim300.tar.gz...
[2021-04-06 19:43:23,932] [    INFO] - Loading token embedding...
[2021-04-06 19:43:25,926] [    INFO] - Start extending vocab.
[2021-04-06 19:43:35,139] [    INFO] - Finish extending vocab.
[2021-04-06 19:43:39,630] [    INFO] - Finish loading embedding vector.
[2021-04-06 19:43:39,635] [    INFO] - Token Embedding info:
Unknown index: 20939
Unknown token: OOV
Padding index: 643697
Padding token: [PAD]
Shape: [643698, 300]
In [30]
model.fit(train_data=train_loader,
          eval_data=dev_loader,
          epochs=10,
          save_dir='./results',
          log_freq=1)
The loss value printed in the log is the current step, and the metric is the average value of previous step.
Epoch 1/5
step  1/50 - loss: 88.5678 - precision: 0.0000e+00 - recall: 0.0000e+00 - f1: 0.0000e+00 - 3s/step
step  2/50 - loss: 62.3248 - precision: 0.0036 - recall: 0.0078 - f1: 0.0049 - 2s/step
step  3/50 - loss: 45.4475 - precision: 0.0088 - recall: 0.0139 - f1: 0.0108 - 2s/step
step  4/50 - loss: 48.9159 - precision: 0.0182 - recall: 0.0234 - f1: 0.0205 - 2s/step
step  5/50 - loss: 36.5155 - precision: 0.0236 - recall: 0.0281 - f1: 0.0257 - 2s/step
step  6/50 - loss: 48.2510 - precision: 0.0385 - recall: 0.0469 - f1: 0.0423 - 2s/step
step  7/50 - loss: 31.3799 - precision: 0.0443 - recall: 0.0543 - f1: 0.0488 - 2s/step
step  8/50 - loss: 32.9661 - precision: 0.0550 - recall: 0.0658 - f1: 0.0599 - 2s/step
step  9/50 - loss: 25.5001 - precision: 0.0614 - recall: 0.0713 - f1: 0.0660 - 2s/step
step 10/50 - loss: 26.5028 - precision: 0.0741 - recall: 0.0845 - f1: 0.0789 - 2s/step
step 11/50 - loss: 23.2316 - precision: 0.0865 - recall: 0.0987 - f1: 0.0922 - 2s/step
step 12/50 - loss: 40.8448 - precision: 0.1005 - recall: 0.1140 - f1: 0.1068 - 2s/step
step 13/50 - loss: 27.2463 - precision: 0.1174 - recall: 0.1325 - f1: 0.1245 - 2s/step
step 14/50 - loss: 14.7858 - precision: 0.1260 - recall: 0.1409 - f1: 0.1331 - 2s/step
step 15/50 - loss: 19.6964 - precision: 0.1344 - recall: 0.1494 - f1: 0.1415 - 2s/step
step 16/50 - loss: 17.7165 - precision: 0.1428 - recall: 0.1583 - f1: 0.1502 - 2s/step
step 17/50 - loss: 11.3291 - precision: 0.1604 - recall: 0.1780 - f1: 0.1687 - 2s/step
step 18/50 - loss: 19.3877 - precision: 0.1789 - recall: 0.1980 - f1: 0.1880 - 2s/step
step 19/50 - loss: 28.5863 - precision: 0.1980 - recall: 0.2204 - f1: 0.2086 - 2s/step
step 20/50 - loss: 10.0464 - precision: 0.2161 - recall: 0.2410 - f1: 0.2279 - 2s/step
step 21/50 - loss: 14.3640 - precision: 0.2348 - recall: 0.2620 - f1: 0.2476 - 2s/step
step 22/50 - loss: 10.9893 - precision: 0.2482 - recall: 0.2767 - f1: 0.2617 - 2s/step
step 23/50 - loss: 16.0034 - precision: 0.2623 - recall: 0.2924 - f1: 0.2765 - 2s/step
step 24/50 - loss: 13.8267 - precision: 0.2729 - recall: 0.3039 - f1: 0.2876 - 2s/step
step 25/50 - loss: 5.6605 - precision: 0.2841 - recall: 0.3164 - f1: 0.2994 - 2s/step
step 26/50 - loss: 5.3409 - precision: 0.3014 - recall: 0.3352 - f1: 0.3174 - 2s/step
step 27/50 - loss: 18.3858 - precision: 0.3149 - recall: 0.3499 - f1: 0.3315 - 2s/step
step 28/50 - loss: 5.0789 - precision: 0.3289 - recall: 0.3652 - f1: 0.3461 - 2s/step
step 29/50 - loss: 10.1857 - precision: 0.3400 - recall: 0.3783 - f1: 0.3581 - 2s/step
step 30/50 - loss: 22.1781 - precision: 0.3495 - recall: 0.3897 - f1: 0.3685 - 2s/step
step 31/50 - loss: 3.4667 - precision: 0.3596 - recall: 0.4010 - f1: 0.3791 - 2s/step
step 32/50 - loss: 5.7708 - precision: 0.3726 - recall: 0.4145 - f1: 0.3925 - 2s/step
step 33/50 - loss: 5.0261 - precision: 0.3846 - recall: 0.4274 - f1: 0.4049 - 2s/step
step 34/50 - loss: 3.8509 - precision: 0.3955 - recall: 0.4385 - f1: 0.4159 - 2s/step
step 35/50 - loss: 11.9566 - precision: 0.4059 - recall: 0.4493 - f1: 0.4265 - 2s/step
step 36/50 - loss: 1.4500 - precision: 0.4152 - recall: 0.4599 - f1: 0.4364 - 2s/step
step 37/50 - loss: 10.3430 - precision: 0.4222 - recall: 0.4680 - f1: 0.4439 - 2s/step
step 38/50 - loss: 7.4301 - precision: 0.4318 - recall: 0.4785 - f1: 0.4540 - 2s/step
step 39/50 - loss: 2.5816 - precision: 0.4417 - recall: 0.4885 - f1: 0.4640 - 2s/step
step 40/50 - loss: 4.5552 - precision: 0.4514 - recall: 0.4985 - f1: 0.4738 - 2s/step
step 41/50 - loss: 11.7065 - precision: 0.4605 - recall: 0.5082 - f1: 0.4832 - 2s/step
step 42/50 - loss: 8.7833 - precision: 0.4702 - recall: 0.5180 - f1: 0.4930 - 2s/step
step 43/50 - loss: 1.6122 - precision: 0.4788 - recall: 0.5268 - f1: 0.5016 - 2s/step
step 44/50 - loss: 2.8585 - precision: 0.4849 - recall: 0.5341 - f1: 0.5083 - 2s/step
step 45/50 - loss: 4.8276 - precision: 0.4923 - recall: 0.5421 - f1: 0.5160 - 2s/step
step 46/50 - loss: 2.8296 - precision: 0.4992 - recall: 0.5491 - f1: 0.5230 - 2s/step
step 47/50 - loss: 5.2778 - precision: 0.5062 - recall: 0.5562 - f1: 0.5300 - 2s/step
step 48/50 - loss: 7.7174 - precision: 0.5133 - recall: 0.5630 - f1: 0.5370 - 2s/step
In [31]
model.evaluate(eval_data=test_loader)
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 1/1 - loss: 2.6293 - precision: 0.9314 - recall: 0.9588 - f1: 0.9449 - 176ms/step
Eval samples: 200
{'loss': [2.6293335],
 'precision': 0.9314285714285714,
 'recall': 0.9588235294117647,
 'f1': 0.944927536231884}
The model's f1 score on the test set improves markedly over the baseline (0.9449 vs. 0.9217), and the chunk boundaries are also cleaner: compare the predictions below with the baseline's above, where person names such as 韦业涛 and 赵本山 were merged into the address chunk.

In [32]
outputs, lens, decodes = model.predict(test_data=test_loader)
preds = parse_decodes(test_ds, decodes, lens, label_vocab)

print('\n'.join(preds[:5]))
Predict begin...
step 1/1 [==============================] - 100ms/step
Predict samples: 200
('黑龙江省', 'A1')('双鸭山市', 'A2')('尖山区', 'A3')('八马路与东平行路交叉口北40米', 'A4')('韦业涛', 'P')('18600009172', 'T')
('广西壮族自治区', 'A1')('桂林市', 'A2')('雁山区', 'A3')('雁山镇西龙村老年活动中心', 'A4')('17610348888', 'T')('羊卓卫', 'P')
('15652864561', 'T')('河南省', 'A1')('开封市', 'A2')('顺河回族区', 'A3')('顺河区公园路32号', 'A4')('赵本山', 'P')
('河北省', 'A1')('唐山市', 'A2')('玉田县', 'A3')('无终大街159号', 'A4')('18614253058', 'T')('尚汉生', 'P')
('台湾', 'A1')('台中市', 'A2')('北区', 'A3')('北区', 'A4')('锦新街18号', 'A4')('18511226708', 'T')('蓟丽', 'P')
PART D. Concept Explanations
D.1 Gated Recurrent Unit (GRU)
BiGRU builds on the classic recurrent neural network (RNN, Recurrent Neural Network). In the steps above we essentially treated the model as a black box; here we explain the concept and the principles behind it. A schematic of an RNN is shown below.

Figure 4: RNN schematic

On the left is the original RNN: the green dot represents the input x, the red dot represents the output y, and the blue block in the middle is the RNN model itself. The orange arrow points from the model back to itself, indicating that the RNN's input includes its own output from the previous time step; this is where the word "recurrent" in the name comes from.

On the right, the same network is unrolled along the time axis. Note that the blue RNN module is one and the same module, merely reused at each time step. In this unrolled form, the inputs and outputs of a sequence labeling model can be read off clearly.

GRU was proposed to address long-term memory and the gradient problems that arise in backpropagation. Like LSTM, it models long sequences effectively, but it trains more efficiently. The "Bi" in BiGRU means two GRUs read the sequence in opposite directions, so each position sees both its left and right context.
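As a minimal sketch of the bidirectional part (sizes are illustrative, mirroring the emb_size/hidden_size used above), note how the bidirectional GRU doubles the feature dimension seen by the downstream linear layer:

import paddle
import paddle.nn as nn

# A 2-layer bidirectional GRU over a toy batch.
gru = nn.GRU(input_size=300, hidden_size=300,
             num_layers=2, direction='bidirectional')

x = paddle.randn([8, 20, 300])   # [batch, seq_len, emb_size]
output, _ = gru(x)
# Forward and backward hidden states are concatenated on the last axis.
print(output.shape)              # [8, 20, 600]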

D.2 Conditional Random Field (CRF)
With long sequences handled, another problem in sequence labeling still needs solving: the dependency between labels. For example, the predicted tags should never contain a sequence like P-B immediately followed by T-I; such a combination is invalid and cannot be parsed. Neither RNN nor LSTM can rule this out in principle; they can only make it unlikely. The Conditional Random Field (CRF) introduced below solves this problem well.

The CRF belongs to the undirected models within the family of probabilistic graphical models. We will not develop the theory here, only give an intuitive account of the idea behind the model. A classic linear-chain CRF is shown below.

Figure 5: CRF schematic

A CRF is essentially an undirected graph in which the green nodes represent inputs and the red nodes represent outputs. The edges fall into two categories: edges between an input x and an output y, expressing their correlation, and edges between the outputs y at adjacent time steps, expressing the dependency between neighboring labels. In other words, when predicting y at a given time step, the model also takes the adjacent labels into account. Once the CRF has converged, it will have learned, for example, that the probability of P-B and T-I appearing as adjacent labels is extremely low.

PaddleNLP provides the corresponding APIs: you can build a CRF layer and perform Viterbi decoding with it.
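A minimal sketch of those APIs, with random emission scores standing in for the BiGRU + Linear outputs (the label count and lengths are illustrative):

import paddle
from paddlenlp.layers import LinearChainCrf, ViterbiDecoder

num_labels = 13                           # illustrative label-set size
crf = LinearChainCrf(num_labels)          # adds implicit BOS/EOS tags internally
decoder = ViterbiDecoder(crf.transitions)

emissions = paddle.randn([2, 10, num_labels + 2])    # [batch, seq_len, num_labels + 2]
lengths = paddle.to_tensor([10, 7], dtype='int64')   # true length of each sequence
scores, paths = decoder(emissions, lengths)
print(paths.shape)                        # [2, 10]: best tag id at each position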

More PaddleNLP tutorials
Sentence-level sentiment classification with the seq2vec module
Improving sentiment analysis with the pretrained model ERNIE
Improving waybill information extraction with the pretrained model ERNIE
Completing couplets automatically with a Seq2Seq model
Writing poems with the pretrained model ERNIE-GEN
Forecasting COVID-19 case counts with a TCN
Reading comprehension with pretrained models
Multi-class text classification on a custom dataset