
Partial Work Showcase for Problem C of the 2023 MCM (Mathematical Contest in Modeling)

I. Processing the Attribute Data

Saving the word list to a file

import pandas as pd

# The solution words sit in the unnamed fourth column of the contest data file
wdl = pd.read_excel('/han/2023_MCM-ICM_Problems/puzzles/Problem_C_Data_Wordle.xlsx')
wrd_lst = list(wdl['Unnamed: 3'])
wrd_lst = wrd_lst[1:]  # drop the header row that pandas read as data
with open('/han/2023_MCM-ICM_Problems/work/wrd_lst.txt', 'w', encoding='utf-8') as f:
    for wrd in wrd_lst:
        f.write(wrd)
        f.write('\n')

The code above extracts the 357 solution words provided with the problem from Problem_C_Data_Wordle.xlsx and saves them to a txt file. A partial display of wrd_lst follows:

[Figure: partial display of wrd_lst]

Of course, words can also be imported from other sources and stored in a txt file in the same format.

Obtaining the first three attribute values for each word

import numpy as np

def get_three(wrd_lst):
    # Letter frequencies (row 0) and word-initial letter frequencies (row 1),
    # in alphabetical order; percentages from Wikipedia, converted to fractions
    ltr_freq = np.array([[8.167, 1.492, 2.782, 4.253, 12.702, 2.228, 2.015, 6.094, 6.966, 0.153, 0.772, 4.025, 2.406,
                          6.749, 7.507, 1.929, 0.095, 5.987, 6.327, 9.056, 2.758, 0.978, 2.360, 0.150, 1.974, 0.074],
                        [11.602, 4.702, 3.511, 2.670, 2.007, 3.779, 1.950, 7.232, 6.286, 0.597, 0.590, 2.705, 4.374,
                         2.365, 6.264, 2.545, 0.173, 1.653, 7.755, 16.671, 1.487, 0.649, 6.753, 0.037, 1.620, 0.034]]) / 100
    # Attributes 1 and 2: mean letter frequency and initial-letter frequency,
    # converted to information content
    freq_alpha = wrd2freq(wrd_lst)
    freq_alpha = freq2info(freq_alpha)
    # Attribute 3: whole-word frequency from word_count.csv (five-letter words only)
    wrd_count = {}
    with open('/han/2023_MCM-ICM_Problems/work/word_count.csv', 'r') as f:
        for lin in list(f.readlines())[1:]:
            lst = lin.split(',')
            lst[1] = float(lst[1])
            if len(lst[0]) == 5:
                wrd_count[lst[0]] = lst[1]
    count_s = 0.
    for k in wrd_count:
        count_s += wrd_count[k]
    for k in wrd_count:
        wrd_count[k] = wrd_count[k] / count_s
    freq_wrd = []
    for wrd in wrd_lst:
        if wrd in wrd_count:
            freq_wrd.append(wrd_count[wrd])
        else:
            freq_wrd.append(0)
    freq_wrd = np.array(freq_wrd).reshape(1, len(freq_wrd))
    freq_wrd = freq2info(freq_wrd)
    # Stack the three attributes, quantize into 10 levels, then one-hot encode
    three_attr = np.concatenate((freq_alpha, freq_wrd), axis=0)
    three_attr_quan = quantify(three_attr, quan_num=10)
    one_hot_three_attr = np.zeros((three_attr.shape[0], three_attr.shape[1], 10))
    for i in range(three_attr.shape[0]):
        one_hot_three_attr[i, :, :] = np.eye(10)[list(three_attr_quan[i, :])]
    return one_hot_three_attr

We first define ltr_freq, a nested list whose two rows are the 26 letter frequencies and the 26 word-initial letter frequencies taken from Wikipedia, arranged in alphabetical order:

ltr_freq = np.array([[8.167, 1.492, 2.782, 4.253, 12.702, 2.228, 2.015, 6.094, 6.966, 0.153, 0.772, 4.025, 2.406,
                          6.749, 7.507, 1.929, 0.095, 5.987, 6.327, 9.056, 2.758, 0.978, 2.360, 0.150, 1.974, 0.074],
                        [11.602, 4.702, 3.511, 2.670, 2.007, 3.779, 1.950, 7.232, 6.286, 0.597, 0.590, 2.705, 4.374,
                         2.365, 6.264, 2.545, 0.173, 1.653, 7.755, 16.671, 1.487, 0.649, 6.753, 0.037, 1.620, 0.034]]) / 100

Then the first two attribute values are computed for each word.

wrd2freq is a function that takes a word list and returns a 2×N array whose rows hold, for each word, its mean letter frequency and its initial-letter frequency:

def wrd2freq(wrd_lst):
    freq_lst = []
    freq_ini_lst = []
    for wrd in wrd_lst:
        # frequency of the word's first letter as an initial
        freq_ini_lst.append(alpha2freq(wrd[0], freq_type='initial'))
        # geometric mean of the letter frequencies over the whole word
        freq_multi = 1
        for alpha in wrd:
            freq_multi = freq_multi * alpha2freq(alpha)
        freq_lst.append(pow(freq_multi, 1/len(wrd)))
    return np.array([freq_lst, freq_ini_lst])
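The helper alpha2freq is called above but its definition is not shown in the post; a minimal sketch consistent with that usage, assuming ltr_freq is the module-level array defined earlier, might be:

def alpha2freq(alpha, freq_type='normal'):
    # Look up one letter's frequency in ltr_freq: row 0 holds the overall
    # letter frequencies, row 1 the word-initial letter frequencies
    row = 1 if freq_type == 'initial' else 0
    return ltr_freq[row, ord(alpha) - ord('a')]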

The frequencies are then converted into information content, which makes comparison and quantization more convenient.

freq2info is a function that takes an array of frequencies (of any shape) and returns the corresponding array of information content:

def freq2info(freq_arr, epsilon=1e-50):
    # Shannon information content in bits; epsilon guards against log2(0)
    info_arr = -np.log2(freq_arr + epsilon)
    return info_arr

This completes the first two attributes; next comes the third.

word_count.csv, downloaded from an Alibaba Cloud database, tallies the occurrence counts of 333,333 common words. We first extract the five-letter words, then normalize their counts to ease later processing:

wrd_count = {}
with open('/han/2023_MCM-ICM_Problems/work/word_count.csv', 'r') as f:
    for lin in list(f.readlines())[1:]:
        lst = lin.split(',')
        lst[1] = float(lst[1])
        if len(lst[0]) == 5:
            wrd_count[lst[0]] = lst[1]
count_s = 0.
for k in wrd_count:
    count_s += wrd_count[k]
for k in wrd_count:
    wrd_count[k] = wrd_count[k] / count_s

With the resulting word-frequency table wrd_count, the third attribute of every word in wrd_lst can be computed:

freq_wrd = []
for wrd in wrd_lst:
    if wrd in wrd_count:
        freq_wrd.append(wrd_count[wrd])
    else:
        freq_wrd.append(0)
freq_wrd = np.array(freq_wrd).reshape(1, len(freq_wrd))
freq_wrd = freq2info(freq_wrd)

Next, the three attributes are merged. To match the attribute data added later, and to simplify the rest of the work, the attribute values are quantized.

quantify is a function that quantizes continuous data into a configurable number of levels (default 10):

def quantify(arr, quan_num=10):
    # Quantize each row of arr into quan_num levels using its own percentiles
    quan_arr = np.zeros(arr.shape)
    quan_gap = int(np.ceil(100 / quan_num))
    q = range(quan_gap, 100, quan_gap)
    thresh = np.percentile(arr, q, axis=1).T  # per-row quantization thresholds
    for i in range(arr.shape[0]):
        quan_arr[i] = np.searchsorted(thresh[i], arr[i])
    return np.int32(quan_arr)

As the code shows, the quantization standard is the n-quantile: NumPy's built-in percentile determines the quantization thresholds, and searchsorted assigns each value its quantization level.
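As a quick illustration, a check of quantify on random data (the array shape mirrors the three-attribute layout above; the random values are only for demonstration):

import numpy as np

rng = np.random.default_rng(0)
demo = rng.random((3, 357))              # 3 attributes x 357 words
levels = quantify(demo, quan_num=10)
print(levels.shape, levels.min(), levels.max())  # expected: (3, 357) 0 9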

Finally, the three attributes are one-hot encoded, so that together they occupy 30 bits per word:

one_hot_three_attr = np.zeros((three_attr.shape[0], three_attr.shape[1], 10))
for i in range(three_attr.shape[0]):
    one_hot_three_attr[i, :, :] = np.eye(10)[list(three_attr_quan[i, :])]

Obtaining the first four attribute values for each word

def get_fourth(wrd_lst):
    # Attribute 4: letter-repetition type, encoded as one of six classes
    fourth_attr = []
    for wrd in wrd_lst:
        fourth_attr.append(repetition_type(wrd))
    one_hot_fourth_attr = np.eye(6)[fourth_attr]
    return one_hot_fourth_attr

The fourth attribute describes letter repetition within a word, originally in three cases: no repeats, one repeated pair, and multiple repeated pairs or a letter appearing more than twice. Since these cases occur with decreasing probability in that order, giving all three the same number of codes would be unreasonable. We therefore split the no-repeat case into three classes (the word contains one vowel, two vowels, or anything else) and the one-pair case into two (a repeated vowel vs. a repeated consonant), giving six classes in total. The code is as follows:

def repetition_type(wrd):
    # Class codes: 0 = no repeats, one vowel; 1 = no repeats, two vowels;
    # 2 = no repeats, other; 3 = one repeated vowel; 4 = one repeated consonant;
    # 5 = multiple repeated pairs or a letter appearing more than twice
    vowel = ['a', 'e', 'i', 'o', 'u']
    temp = []
    val = 0
    for alpha in wrd:
        if alpha not in temp:
            temp.append(alpha)
        else:
            if val == 3 or val == 4:
                val = 5
                break
            elif alpha in vowel:
                val = 3
            else:
                val = 4
    if val == 0:
        # No repeats: subdivide by vowel count
        count = 0
        for alpha in wrd:
            if alpha in vowel:
                count += 1
        if count == 2:
            val = 1
        elif count != 1:
            val = 2
    return val

The fourth attribute is then concatenated with the first three, so that every word in wrd_lst is paired with the 36 bits of its first four attributes and written to wrd_lst.txt, with the word and its bit string separated by ','. A sketch of this step is given below, followed by a partial display of the result.
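The serialization code itself is not shown in the post; a minimal sketch, assuming the one-hot arrays come from get_three and get_fourth above and that the bits are written as one unseparated digit string after the word, might be:

one_hot_three_attr = get_three(wrd_lst)    # shape (3, N, 10)
one_hot_fourth_attr = get_fourth(wrd_lst)  # shape (N, 6)

with open('/han/2023_MCM-ICM_Problems/work/wrd_lst.txt', 'w', encoding='utf-8') as f:
    for i, wrd in enumerate(wrd_lst):
        bits = list(np.int32(one_hot_three_attr[:, i, :]).flatten())  # 30 bits
        bits += list(np.int32(one_hot_fourth_attr[i]))                # + 6 bits
        f.write(wrd + ',' + ''.join(str(b) for b in bits) + '\n')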

[Figure: partial display of wrd_lst.txt with the 36-bit attribute strings]

Obtaining the first five attribute values for each word

The fifth attribute is stored in cixing.txt, which annotates the part of speech of the 357 words in 10 categories. We first load it into a variable and one-hot encode it:

with open('work/cixing.txt', 'r') as f:
    cixing = []
    for lin in f.readlines():
        wrd = lin.split(',')
        cixing.append(int(wrd[1]))  # part-of-speech class index, 0-9
one_hot_fifth_attr = np.int32(np.eye(10)[cixing])

The fifth attribute is then appended, after the first four, to wrd_lst.txt:

with open('work/wrd_lst.txt', 'w') as f:
    for i in range(len(wrd_lst)):
        # wrd_lst[i] already holds the word plus its first 36 attribute bits
        for j in range(one_hot_fifth_attr.shape[1]):
            wrd_lst[i] = wrd_lst[i] + str(one_hot_fifth_attr[i, j])
        f.write(wrd_lst[i])
        f.write('\n')

Obtaining all six attribute values for each word

The sixth attribute is the word's root-and-affix class, divided into 4 categories and stored in C_Data_Word_level.xlsx. Proceeding exactly as before (see the sketch below), wrd_lst.txt ends up holding all of the attribute data: each word carries 10+10+10+6+10+4 = 50 bits. A partial display of the result follows.
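The code for this step is not shown in the post; it mirrors the fifth-attribute step. A minimal sketch, in which the column name 'level' is an assumption (the actual layout of C_Data_Word_level.xlsx is not given):

# Hypothetical column 'level': one root/affix class index (0-3) per word
lvl = pd.read_excel('C_Data_Word_level.xlsx')
one_hot_sixth_attr = np.int32(np.eye(4)[list(lvl['level'])])

with open('work/wrd_lst.txt', 'w') as f:
    for i in range(len(wrd_lst)):
        # wrd_lst[i] already holds the word plus its first 46 attribute bits
        for j in range(one_hot_sixth_attr.shape[1]):
            wrd_lst[i] = wrd_lst[i] + str(one_hot_sixth_attr[i, j])
        f.write(wrd_lst[i] + '\n')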

[Figure: partial display of wrd_lst.txt with all 50 attribute bits]

II. Neural Networks

A dataset for testing the influence of a single attribute

To test how a single attribute affects word difficulty and the hard-mode percentage, we generated 50,000 attribute vectors: one attribute value is held fixed while the others are drawn at random, producing 1,000 vectors per fixed value; over all 50 attribute values this yields 50 × 1000 = 50,000 vectors, which are saved to generate_attribute.txt for later use.

The core code for generating this dataset is shown below. First, generating the fixed attribute values:

def generate_fix(feature_num, repete_times=1000):
    # Each of the feature_num one-hot values appears repete_times times in a row
    feature_num = int(feature_num)
    repete_times = int(repete_times)
    fixed_arr = np.zeros((feature_num * repete_times, feature_num))
    for i in range(feature_num):
        temp = np.eye(feature_num)[i]
        for j in range(repete_times):
            fixed_arr[i*repete_times+j, :] = temp
    return fixed_arr

And generating the random attribute values:

def generate_random(feature_num, repete_times):
    # repete_times rows, each a uniformly random one-hot vector of length feature_num
    return np.eye(feature_num)[list(np.int32(np.random.rand(repete_times)*feature_num))]
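The post does not show how the fixed and random pieces are assembled into full 50-bit rows; one plausible sketch, assuming the six attribute groups have widths 10, 10, 10, 6, 10 and 4, fixes one group with generate_fix and fills every other group with generate_random:

import numpy as np

widths = [10, 10, 10, 6, 10, 4]  # bit widths of the six attribute groups

def generate_for_group(g, repete_times=1000):
    # Rows in which group g takes each of its values in turn while every
    # other group is drawn at random; shape (widths[g]*repete_times, 50)
    n_rows = widths[g] * repete_times
    parts = []
    for k, w in enumerate(widths):
        if k == g:
            parts.append(generate_fix(w, repete_times))
        else:
            parts.append(generate_random(w, n_rows))
    return np.concatenate(parts, axis=1)

Stacking generate_for_group(g) over all six groups gives 50 × 1000 = 50,000 rows in total, matching the dataset size stated above.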

A model that predicts word difficulty from its attribute data

Word difficulty is divided into three levels and stored in difficulty.txt. The 357 attribute vectors serve as inputs and the 357 difficulty labels as outputs for training the model below, with a 257:50:50 split into training, validation, and test sets (a sketch of the split follows).
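The loading and splitting code is not shown; a minimal sketch, assuming difficulty.txt holds one class index (0-2) per line in the order of wrd_lst.txt, and that attr is the (357, 50) attribute matrix assembled in Part I:

with open('work/difficulty.txt', 'r') as f:
    labels = np.array([int(lin.strip()) for lin in f])
y = np.eye(3)[labels]                 # one-hot difficulty labels, shape (357, 3)

train_x, train_y = attr[:257], y[:257]
dev_x, dev_y = attr[257:307], y[257:307]
test_x, test_y = attr[307:], y[307:]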

print(train_x.shape, train_y.shape, dev_x.shape, dev_y.shape, test_x.shape, test_y.shape)
(257, 50) (257, 3) (50, 50) (50, 3) (50, 50) (50, 3)

We built a two-layer neural network.

The hidden layer has 20 nodes with ReLU activation. Beyond computational efficiency, we expect that some of the 50 attribute bits have no clear effect on difficulty and that their weighted contributions should be suppressed as much as possible; down-sampling to 20 nodes is also based on this consideration.

Because the dataset is small, the model overfits easily, so we added a Dropout layer with drop probability 0.2 to curb obvious overfitting and improve performance on the validation set.

Since this is a multi-class problem, the output layer uses Softmax as its activation.

import tensorflow as tf
from tensorflow.keras import layers as tfl

inputs = tf.keras.Input(shape=(50,), name="attr")
a = tfl.Dense(20, activation="relu", name="dense")(inputs)
a2 = tfl.Dropout(0.2)(a)
outputs = tfl.Dense(3, activation="softmax", name="classification")(a2)

model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.summary()
Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
attr (InputLayer)            [(None, 50)]              0         
_________________________________________________________________
dense (Dense)                (None, 20)                1020      
_________________________________________________________________
dropout (Dropout)            (None, 20)                0         
_________________________________________________________________
classification (Dense)       (None, 3)                 63        
=================================================================
Total params: 1,083
Trainable params: 1,083
Non-trainable params: 0
_________________________________________________________________

The model is then compiled: the optimizer is RMSProp with the default learning rate of 0.001, and the loss is CategoricalCrossentropy.

model.compile(
    optimizer=tf.keras.optimizers.RMSprop(),
    loss=tf.keras.losses.CategoricalCrossentropy(),
    metrics=[tf.keras.metrics.CategoricalAccuracy()]
)

Next the model is trained. batch_size is set to 100, since smaller values make the model overfit; epochs is set to 100, since larger values hurt performance on the validation set.

history = model.fit(
    train_x,
    train_y,
    batch_size=100,
    epochs=100,
    validation_data=(dev_x, dev_y)
)
Epoch 1/100
3/3 [==============================] - 2s 50ms/step - loss: 1.0592 - categorical_accuracy: 0.4669 - val_loss: 1.0435 - val_categorical_accuracy: 0.5400
Epoch 2/100
3/3 [==============================] - 0s 10ms/step - loss: 0.9874 - categorical_accuracy: 0.5331 - val_loss: 1.0210 - val_categorical_accuracy: 0.5400
Epoch 3/100
3/3 [==============================] - 0s 10ms/step - loss: 0.9948 - categorical_accuracy: 0.5409 - val_loss: 1.0043 - val_categorical_accuracy: 0.5200
Epoch 4/100
3/3 [==============================] - 0s 9ms/step - loss: 0.9504 - categorical_accuracy: 0.5759 - val_loss: 0.9923 - val_categorical_accuracy: 0.5400
Epoch 5/100
3/3 [==============================] - 0s 10ms/step - loss: 0.9636 - categorical_accuracy: 0.5642 - val_loss: 0.9823 - val_categorical_accuracy: 0.5400
Epoch 6/100
3/3 [==============================] - 0s 9ms/step - loss: 0.9489 - categorical_accuracy: 0.5798 - val_loss: 0.9737 - val_categorical_accuracy: 0.5600
Epoch 7/100
3/3 [==============================] - 0s 10ms/step - loss: 0.9374 - categorical_accuracy: 0.6109 - val_loss: 0.9651 - val_categorical_accuracy: 0.5600
Epoch 8/100
3/3 [==============================] - 0s 10ms/step - loss: 0.9192 - categorical_accuracy: 0.6381 - val_loss: 0.9582 - val_categorical_accuracy: 0.5800
Epoch 9/100
3/3 [==============================] - 0s 9ms/step - loss: 0.9144 - categorical_accuracy: 0.6381 - val_loss: 0.9517 - val_categorical_accuracy: 0.5800
Epoch 10/100
3/3 [==============================] - 0s 10ms/step - loss: 0.9131 - categorical_accuracy: 0.6265 - val_loss: 0.9474 - val_categorical_accuracy: 0.5800
Epoch 11/100
3/3 [==============================] - 0s 9ms/step - loss: 0.8943 - categorical_accuracy: 0.6109 - val_loss: 0.9429 - val_categorical_accuracy: 0.5800
Epoch 12/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8915 - categorical_accuracy: 0.6226 - val_loss: 0.9384 - val_categorical_accuracy: 0.5800
Epoch 13/100
3/3 [==============================] - 0s 10ms/step - loss: 0.9071 - categorical_accuracy: 0.6304 - val_loss: 0.9347 - val_categorical_accuracy: 0.5800
Epoch 14/100
3/3 [==============================] - 0s 9ms/step - loss: 0.8699 - categorical_accuracy: 0.6342 - val_loss: 0.9326 - val_categorical_accuracy: 0.5800
Epoch 15/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8868 - categorical_accuracy: 0.6342 - val_loss: 0.9305 - val_categorical_accuracy: 0.6200
Epoch 16/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8650 - categorical_accuracy: 0.6187 - val_loss: 0.9285 - val_categorical_accuracy: 0.6200
Epoch 17/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8873 - categorical_accuracy: 0.6576 - val_loss: 0.9258 - val_categorical_accuracy: 0.6200
Epoch 18/100
3/3 [==============================] - 0s 11ms/step - loss: 0.8771 - categorical_accuracy: 0.6537 - val_loss: 0.9234 - val_categorical_accuracy: 0.6200
Epoch 19/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8661 - categorical_accuracy: 0.6226 - val_loss: 0.9207 - val_categorical_accuracy: 0.6200
Epoch 20/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8542 - categorical_accuracy: 0.6342 - val_loss: 0.9198 - val_categorical_accuracy: 0.6200
Epoch 21/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8623 - categorical_accuracy: 0.6109 - val_loss: 0.9187 - val_categorical_accuracy: 0.6200
Epoch 22/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8529 - categorical_accuracy: 0.6265 - val_loss: 0.9179 - val_categorical_accuracy: 0.6200
Epoch 23/100
3/3 [==============================] - 0s 9ms/step - loss: 0.8367 - categorical_accuracy: 0.6342 - val_loss: 0.9163 - val_categorical_accuracy: 0.6200
Epoch 24/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8512 - categorical_accuracy: 0.6226 - val_loss: 0.9152 - val_categorical_accuracy: 0.6200
Epoch 25/100
3/3 [==============================] - 0s 9ms/step - loss: 0.8598 - categorical_accuracy: 0.6265 - val_loss: 0.9132 - val_categorical_accuracy: 0.6200
Epoch 26/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8384 - categorical_accuracy: 0.6537 - val_loss: 0.9141 - val_categorical_accuracy: 0.6200
Epoch 27/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8254 - categorical_accuracy: 0.6420 - val_loss: 0.9119 - val_categorical_accuracy: 0.6200
Epoch 28/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8202 - categorical_accuracy: 0.6265 - val_loss: 0.9114 - val_categorical_accuracy: 0.6200
Epoch 29/100
3/3 [==============================] - 0s 11ms/step - loss: 0.8306 - categorical_accuracy: 0.6381 - val_loss: 0.9087 - val_categorical_accuracy: 0.6400
Epoch 30/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8288 - categorical_accuracy: 0.6537 - val_loss: 0.9079 - val_categorical_accuracy: 0.6400
Epoch 31/100
3/3 [==============================] - 0s 9ms/step - loss: 0.7863 - categorical_accuracy: 0.6576 - val_loss: 0.9089 - val_categorical_accuracy: 0.6400
Epoch 32/100
3/3 [==============================] - 0s 10ms/step - loss: 0.8125 - categorical_accuracy: 0.6654 - val_loss: 0.9097 - val_categorical_accuracy: 0.6400
Epoch 33/100
3/3 [==============================] - 0s 11ms/step - loss: 0.8087 - categorical_accuracy: 0.6381 - val_loss: 0.9090 - val_categorical_accuracy: 0.6400
Epoch 34/100
3/3 [==============================] - 0s 12ms/step - loss: 0.7930 - categorical_accuracy: 0.6615 - val_loss: 0.9079 - val_categorical_accuracy: 0.6400
Epoch 35/100
3/3 [==============================] - 0s 11ms/step - loss: 0.8023 - categorical_accuracy: 0.6693 - val_loss: 0.9068 - val_categorical_accuracy: 0.6400
Epoch 36/100
3/3 [==============================] - 0s 11ms/step - loss: 0.7835 - categorical_accuracy: 0.6732 - val_loss: 0.9078 - val_categorical_accuracy: 0.6400
Epoch 37/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7824 - categorical_accuracy: 0.6615 - val_loss: 0.9081 - val_categorical_accuracy: 0.6400
Epoch 38/100
3/3 [==============================] - 0s 9ms/step - loss: 0.7853 - categorical_accuracy: 0.6576 - val_loss: 0.9075 - val_categorical_accuracy: 0.6400
Epoch 39/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7921 - categorical_accuracy: 0.6809 - val_loss: 0.9063 - val_categorical_accuracy: 0.6400
Epoch 40/100
3/3 [==============================] - 0s 9ms/step - loss: 0.7881 - categorical_accuracy: 0.6537 - val_loss: 0.9053 - val_categorical_accuracy: 0.6400
Epoch 41/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7703 - categorical_accuracy: 0.6381 - val_loss: 0.9052 - val_categorical_accuracy: 0.6400
Epoch 42/100
3/3 [==============================] - 0s 9ms/step - loss: 0.7927 - categorical_accuracy: 0.6576 - val_loss: 0.9048 - val_categorical_accuracy: 0.6400
Epoch 43/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7570 - categorical_accuracy: 0.6809 - val_loss: 0.9048 - val_categorical_accuracy: 0.6400
Epoch 44/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7774 - categorical_accuracy: 0.6770 - val_loss: 0.9032 - val_categorical_accuracy: 0.6400
Epoch 45/100
3/3 [==============================] - 0s 11ms/step - loss: 0.7590 - categorical_accuracy: 0.6732 - val_loss: 0.9041 - val_categorical_accuracy: 0.6400
Epoch 46/100
3/3 [==============================] - 0s 11ms/step - loss: 0.7656 - categorical_accuracy: 0.6732 - val_loss: 0.9027 - val_categorical_accuracy: 0.6400
Epoch 47/100
3/3 [==============================] - 0s 11ms/step - loss: 0.7748 - categorical_accuracy: 0.6537 - val_loss: 0.9046 - val_categorical_accuracy: 0.6400
Epoch 48/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7458 - categorical_accuracy: 0.6965 - val_loss: 0.9040 - val_categorical_accuracy: 0.6400
Epoch 49/100
3/3 [==============================] - 0s 11ms/step - loss: 0.7571 - categorical_accuracy: 0.6887 - val_loss: 0.9048 - val_categorical_accuracy: 0.6400
Epoch 50/100
3/3 [==============================] - 0s 7ms/step - loss: 0.7561 - categorical_accuracy: 0.6576 - val_loss: 0.9049 - val_categorical_accuracy: 0.6400
Epoch 51/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7383 - categorical_accuracy: 0.6732 - val_loss: 0.9044 - val_categorical_accuracy: 0.6400
Epoch 52/100
3/3 [==============================] - 0s 11ms/step - loss: 0.7562 - categorical_accuracy: 0.6576 - val_loss: 0.9040 - val_categorical_accuracy: 0.6400
Epoch 53/100
3/3 [==============================] - 0s 13ms/step - loss: 0.7501 - categorical_accuracy: 0.6732 - val_loss: 0.9041 - val_categorical_accuracy: 0.6400
Epoch 54/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7584 - categorical_accuracy: 0.6693 - val_loss: 0.9043 - val_categorical_accuracy: 0.6400
Epoch 55/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7550 - categorical_accuracy: 0.6693 - val_loss: 0.9032 - val_categorical_accuracy: 0.6600
Epoch 56/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7415 - categorical_accuracy: 0.6809 - val_loss: 0.9031 - val_categorical_accuracy: 0.6600
Epoch 57/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7240 - categorical_accuracy: 0.7004 - val_loss: 0.9038 - val_categorical_accuracy: 0.6600
Epoch 58/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7416 - categorical_accuracy: 0.6770 - val_loss: 0.9025 - val_categorical_accuracy: 0.6600
Epoch 59/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7200 - categorical_accuracy: 0.6887 - val_loss: 0.9030 - val_categorical_accuracy: 0.6600
Epoch 60/100
3/3 [==============================] - 0s 8ms/step - loss: 0.7331 - categorical_accuracy: 0.6615 - val_loss: 0.9029 - val_categorical_accuracy: 0.6600
Epoch 61/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7239 - categorical_accuracy: 0.6848 - val_loss: 0.9019 - val_categorical_accuracy: 0.6800
Epoch 62/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7344 - categorical_accuracy: 0.6848 - val_loss: 0.9025 - val_categorical_accuracy: 0.6800
Epoch 63/100
3/3 [==============================] - 0s 8ms/step - loss: 0.7230 - categorical_accuracy: 0.7082 - val_loss: 0.9011 - val_categorical_accuracy: 0.6600
Epoch 64/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7273 - categorical_accuracy: 0.7121 - val_loss: 0.9010 - val_categorical_accuracy: 0.6600
Epoch 65/100
3/3 [==============================] - 0s 7ms/step - loss: 0.7212 - categorical_accuracy: 0.7004 - val_loss: 0.9016 - val_categorical_accuracy: 0.6600
Epoch 66/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7336 - categorical_accuracy: 0.7082 - val_loss: 0.9025 - val_categorical_accuracy: 0.6600
Epoch 67/100
3/3 [==============================] - 0s 7ms/step - loss: 0.6868 - categorical_accuracy: 0.7160 - val_loss: 0.9012 - val_categorical_accuracy: 0.6600
Epoch 68/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7064 - categorical_accuracy: 0.7276 - val_loss: 0.9028 - val_categorical_accuracy: 0.6600
Epoch 69/100
3/3 [==============================] - 0s 8ms/step - loss: 0.7198 - categorical_accuracy: 0.6848 - val_loss: 0.9013 - val_categorical_accuracy: 0.6600
Epoch 70/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7017 - categorical_accuracy: 0.7121 - val_loss: 0.8987 - val_categorical_accuracy: 0.6600
Epoch 71/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7021 - categorical_accuracy: 0.6887 - val_loss: 0.9006 - val_categorical_accuracy: 0.6600
Epoch 72/100
3/3 [==============================] - 0s 11ms/step - loss: 0.7048 - categorical_accuracy: 0.6809 - val_loss: 0.8985 - val_categorical_accuracy: 0.6600
Epoch 73/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6886 - categorical_accuracy: 0.6848 - val_loss: 0.8995 - val_categorical_accuracy: 0.6600
Epoch 74/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6854 - categorical_accuracy: 0.7004 - val_loss: 0.8998 - val_categorical_accuracy: 0.6600
Epoch 75/100
3/3 [==============================] - 0s 11ms/step - loss: 0.7015 - categorical_accuracy: 0.7315 - val_loss: 0.8988 - val_categorical_accuracy: 0.6600
Epoch 76/100
3/3 [==============================] - 0s 7ms/step - loss: 0.6699 - categorical_accuracy: 0.7082 - val_loss: 0.8995 - val_categorical_accuracy: 0.6600
Epoch 77/100
3/3 [==============================] - 0s 11ms/step - loss: 0.6890 - categorical_accuracy: 0.6926 - val_loss: 0.9010 - val_categorical_accuracy: 0.6600
Epoch 78/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6754 - categorical_accuracy: 0.7121 - val_loss: 0.9021 - val_categorical_accuracy: 0.6600
Epoch 79/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6896 - categorical_accuracy: 0.6965 - val_loss: 0.9032 - val_categorical_accuracy: 0.6600
Epoch 80/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6657 - categorical_accuracy: 0.7237 - val_loss: 0.9022 - val_categorical_accuracy: 0.6600
Epoch 81/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6846 - categorical_accuracy: 0.7160 - val_loss: 0.8999 - val_categorical_accuracy: 0.6600
Epoch 82/100
3/3 [==============================] - 0s 10ms/step - loss: 0.7089 - categorical_accuracy: 0.6770 - val_loss: 0.9001 - val_categorical_accuracy: 0.6600
Epoch 83/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6736 - categorical_accuracy: 0.7082 - val_loss: 0.9011 - val_categorical_accuracy: 0.6600
Epoch 84/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6682 - categorical_accuracy: 0.7315 - val_loss: 0.9011 - val_categorical_accuracy: 0.6600
Epoch 85/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6568 - categorical_accuracy: 0.7043 - val_loss: 0.9006 - val_categorical_accuracy: 0.6600
Epoch 86/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6710 - categorical_accuracy: 0.7043 - val_loss: 0.9018 - val_categorical_accuracy: 0.6600
Epoch 87/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6596 - categorical_accuracy: 0.7082 - val_loss: 0.9018 - val_categorical_accuracy: 0.6600
Epoch 88/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6542 - categorical_accuracy: 0.7354 - val_loss: 0.9035 - val_categorical_accuracy: 0.6600
Epoch 89/100
3/3 [==============================] - 0s 11ms/step - loss: 0.6577 - categorical_accuracy: 0.7043 - val_loss: 0.9040 - val_categorical_accuracy: 0.6600
Epoch 90/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6222 - categorical_accuracy: 0.7549 - val_loss: 0.9035 - val_categorical_accuracy: 0.6600
Epoch 91/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6626 - categorical_accuracy: 0.7121 - val_loss: 0.9039 - val_categorical_accuracy: 0.6600
Epoch 92/100
3/3 [==============================] - 0s 11ms/step - loss: 0.6540 - categorical_accuracy: 0.7237 - val_loss: 0.9026 - val_categorical_accuracy: 0.6600
Epoch 93/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6456 - categorical_accuracy: 0.7043 - val_loss: 0.9026 - val_categorical_accuracy: 0.6400
Epoch 94/100
3/3 [==============================] - 0s 11ms/step - loss: 0.6379 - categorical_accuracy: 0.7121 - val_loss: 0.9054 - val_categorical_accuracy: 0.6400
Epoch 95/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6305 - categorical_accuracy: 0.7237 - val_loss: 0.9076 - val_categorical_accuracy: 0.6400
Epoch 96/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6414 - categorical_accuracy: 0.7198 - val_loss: 0.9082 - val_categorical_accuracy: 0.6400
Epoch 97/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6272 - categorical_accuracy: 0.7393 - val_loss: 0.9069 - val_categorical_accuracy: 0.6400
Epoch 98/100
3/3 [==============================] - 0s 10ms/step - loss: 0.6348 - categorical_accuracy: 0.7043 - val_loss: 0.9072 - val_categorical_accuracy: 0.6400
Epoch 99/100
3/3 [==============================] - 0s 11ms/step - loss: 0.6152 - categorical_accuracy: 0.7432 - val_loss: 0.9067 - val_categorical_accuracy: 0.6400
Epoch 100/100
3/3 [==============================] - 0s 8ms/step - loss: 0.6146 - categorical_accuracy: 0.7588 - val_loss: 0.9079 - val_categorical_accuracy: 0.6400

Plot the loss and accuracy curves against the number of epochs:

import matplotlib.pyplot as plt

loss = history.history['loss']
val_loss = history.history['val_loss']
acc = history.history['categorical_accuracy']
val_acc = history.history['val_categorical_accuracy']

plt.subplot(1, 2, 1)
plt.plot(loss, label='Training loss')
plt.plot(val_loss, label='Validation loss')
plt.title('Training and Validation loss')
plt.legend()

plt.subplot(1, 2, 2)
plt.plot(acc, label='Training acc')
plt.plot(val_acc, label='Validation acc')
plt.title('Training and Validation acc')
plt.legend()
plt.show()

[Figure: training/validation loss and accuracy curves]

There is still a fairly large gap between the validation loss and the training loss, which shows the model still overfits somewhat; the main cause is the small amount of data.

Then we evaluate the trained model on the test set:

results = model.evaluate(test_x, test_y)
2/2 [==============================] - 0s 2ms/step - loss: 0.6508 - categorical_accuracy: 0.7800

According to the test-set evaluation, the model predicts word difficulty with an accuracy of 0.78, which is already a decent result.

The remaining work is to feed the data in generate_attribute.txt into the model, average the predictions to obtain each single attribute's influence on word difficulty, and store the final results in one_attr_to_diffi.txt. The concrete implementation is not repeated here (a possible sketch follows); only the file's contents are shown.
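For completeness, a possible sketch of the omitted step, assuming generate_attribute.txt loads as a (50000, 50) array in which each consecutive block of 1,000 rows fixes one attribute value:

gen_attr = np.loadtxt('work/generate_attribute.txt')  # assumed path and format
preds = model.predict(gen_attr)                       # (50000, 3) class probabilities

# Average the 1,000 predictions made for each fixed attribute value
avg = preds.reshape(50, 1000, 3).mean(axis=1)         # (50, 3)
np.savetxt('work/one_attr_to_diffi.txt', avg)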

[Figure: contents of one_attr_to_diffi.txt]

A model that predicts the hard-mode percentage from word attributes

We observed that the hard-mode percentage is strongly affected by the date: especially in the early period, it rises steadily as the days advance. To remove the date effect as much as possible, only the last 103 data points are used for training. The hard-mode percentages are stored in percent_part.xlsx. The data are split into training, validation, and test sets at a ratio of 83:10:10.

print(train_x.shape, train_y.shape, dev_x.shape, dev_y.shape, test_x.shape, test_y.shape)
(83, 50) (83, 1) (10, 50) (10, 1) (10, 50) (10, 1)

We adopt a linear regression model; since the target y falls in the interval 0 to 1, sigmoid is a suitable activation for the output.

inputs = tf.keras.Input(shape=(50,), name="attr")
outputs = tfl.Dense(1, activation="sigmoid", name="regression")(inputs)

model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.summary()
Model: "model_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
attr (InputLayer)            [(None, 50)]              0         
_________________________________________________________________
regression (Dense)           (None, 1)                 51        
=================================================================
Total params: 51
Trainable params: 51
Non-trainable params: 0
_________________________________________________________________

The model is then compiled: the optimizer is RMSProp with the default learning rate of 0.001, the loss is mse (mean squared error), and both mae (mean absolute error) and mse are used as evaluation metrics.

model.compile(
    optimizer=tf.keras.optimizers.RMSprop(),
    loss='mse',
    metrics=['mae', 'mse']
)

Next the model is trained. batch_size is set to 20: too small a value makes the model overfit, while too large a value makes it underfit. epochs is set to 200: too few epochs would leave the model underfitted.

Note that, compared with the previous model, this one suppresses overfitting less aggressively, for two reasons:

1. This is a linear regression problem, which is relatively more prone to underfitting, so more epochs are needed for the model to fit the training data well;
2. This model has far fewer parameters than the previous one, so overfitting is unlikely.

history = model.fit(
    train_x,
    train_y,
    batch_size=20,
    epochs=200,
    validation_data=(dev_x, dev_y)
)
Epoch 1/200
5/5 [==============================] - 0s 22ms/step - loss: 0.1166 - mae: 0.3317 - mse: 0.1166 - val_loss: 0.0980 - val_mae: 0.3030 - val_mse: 0.0980
Epoch 2/200
5/5 [==============================] - 0s 7ms/step - loss: 0.1085 - mae: 0.3194 - mse: 0.1085 - val_loss: 0.0921 - val_mae: 0.2935 - val_mse: 0.0921
Epoch 3/200
5/5 [==============================] - 0s 6ms/step - loss: 0.1029 - mae: 0.3107 - mse: 0.1029 - val_loss: 0.0877 - val_mae: 0.2859 - val_mse: 0.0877
Epoch 4/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0982 - mae: 0.3031 - mse: 0.0982 - val_loss: 0.0833 - val_mae: 0.2785 - val_mse: 0.0833
Epoch 5/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0938 - mae: 0.2959 - mse: 0.0938 - val_loss: 0.0795 - val_mae: 0.2717 - val_mse: 0.0795
Epoch 6/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0896 - mae: 0.2890 - mse: 0.0896 - val_loss: 0.0759 - val_mae: 0.2652 - val_mse: 0.0759
Epoch 7/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0857 - mae: 0.2823 - mse: 0.0857 - val_loss: 0.0725 - val_mae: 0.2589 - val_mse: 0.0725
Epoch 8/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0820 - mae: 0.2758 - mse: 0.0820 - val_loss: 0.0691 - val_mae: 0.2526 - val_mse: 0.0691
Epoch 9/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0783 - mae: 0.2691 - mse: 0.0783 - val_loss: 0.0659 - val_mae: 0.2464 - val_mse: 0.0659
Epoch 10/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0749 - mae: 0.2629 - mse: 0.0749 - val_loss: 0.0626 - val_mae: 0.2399 - val_mse: 0.0626
Epoch 11/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0715 - mae: 0.2565 - mse: 0.0715 - val_loss: 0.0597 - val_mae: 0.2340 - val_mse: 0.0597
Epoch 12/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0684 - mae: 0.2505 - mse: 0.0684 - val_loss: 0.0569 - val_mae: 0.2280 - val_mse: 0.0569
Epoch 13/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0652 - mae: 0.2444 - mse: 0.0652 - val_loss: 0.0540 - val_mae: 0.2217 - val_mse: 0.0540
Epoch 14/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0622 - mae: 0.2383 - mse: 0.0622 - val_loss: 0.0514 - val_mae: 0.2160 - val_mse: 0.0514
Epoch 15/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0593 - mae: 0.2324 - mse: 0.0593 - val_loss: 0.0490 - val_mae: 0.2107 - val_mse: 0.0490
Epoch 16/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0567 - mae: 0.2269 - mse: 0.0567 - val_loss: 0.0464 - val_mae: 0.2046 - val_mse: 0.0464
Epoch 17/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0539 - mae: 0.2209 - mse: 0.0539 - val_loss: 0.0442 - val_mae: 0.1994 - val_mse: 0.0442
Epoch 18/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0514 - mae: 0.2152 - mse: 0.0514 - val_loss: 0.0419 - val_mae: 0.1938 - val_mse: 0.0419
Epoch 19/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0488 - mae: 0.2095 - mse: 0.0488 - val_loss: 0.0396 - val_mae: 0.1880 - val_mse: 0.0396
Epoch 20/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0464 - mae: 0.2037 - mse: 0.0464 - val_loss: 0.0377 - val_mae: 0.1832 - val_mse: 0.0377
Epoch 21/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0442 - mae: 0.1985 - mse: 0.0442 - val_loss: 0.0357 - val_mae: 0.1779 - val_mse: 0.0357
Epoch 22/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0419 - mae: 0.1932 - mse: 0.0419 - val_loss: 0.0337 - val_mae: 0.1727 - val_mse: 0.0337
Epoch 23/200
5/5 [==============================] - 0s 7ms/step - loss: 0.0398 - mae: 0.1876 - mse: 0.0398 - val_loss: 0.0319 - val_mae: 0.1676 - val_mse: 0.0319
Epoch 24/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0377 - mae: 0.1822 - mse: 0.0377 - val_loss: 0.0300 - val_mae: 0.1623 - val_mse: 0.0300
Epoch 25/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0356 - mae: 0.1768 - mse: 0.0356 - val_loss: 0.0284 - val_mae: 0.1574 - val_mse: 0.0284
Epoch 26/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0337 - mae: 0.1718 - mse: 0.0337 - val_loss: 0.0268 - val_mae: 0.1524 - val_mse: 0.0268
Epoch 27/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0319 - mae: 0.1665 - mse: 0.0319 - val_loss: 0.0252 - val_mae: 0.1477 - val_mse: 0.0252
Epoch 28/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0302 - mae: 0.1616 - mse: 0.0302 - val_loss: 0.0238 - val_mae: 0.1429 - val_mse: 0.0238
Epoch 29/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0285 - mae: 0.1567 - mse: 0.0285 - val_loss: 0.0223 - val_mae: 0.1380 - val_mse: 0.0223
Epoch 30/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0269 - mae: 0.1517 - mse: 0.0269 - val_loss: 0.0209 - val_mae: 0.1333 - val_mse: 0.0209
Epoch 31/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0253 - mae: 0.1469 - mse: 0.0253 - val_loss: 0.0196 - val_mae: 0.1286 - val_mse: 0.0196
Epoch 32/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0239 - mae: 0.1422 - mse: 0.0239 - val_loss: 0.0184 - val_mae: 0.1242 - val_mse: 0.0184
Epoch 33/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0225 - mae: 0.1376 - mse: 0.0225 - val_loss: 0.0172 - val_mae: 0.1195 - val_mse: 0.0172
Epoch 34/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0211 - mae: 0.1329 - mse: 0.0211 - val_loss: 0.0160 - val_mae: 0.1149 - val_mse: 0.0160
Epoch 35/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0198 - mae: 0.1282 - mse: 0.0198 - val_loss: 0.0149 - val_mae: 0.1103 - val_mse: 0.0149
Epoch 36/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0185 - mae: 0.1235 - mse: 0.0185 - val_loss: 0.0139 - val_mae: 0.1061 - val_mse: 0.0139
Epoch 37/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0174 - mae: 0.1193 - mse: 0.0174 - val_loss: 0.0129 - val_mae: 0.1018 - val_mse: 0.0129
Epoch 38/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0163 - mae: 0.1148 - mse: 0.0163 - val_loss: 0.0120 - val_mae: 0.0978 - val_mse: 0.0120
Epoch 39/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0153 - mae: 0.1106 - mse: 0.0153 - val_loss: 0.0112 - val_mae: 0.0940 - val_mse: 0.0112
Epoch 40/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0143 - mae: 0.1065 - mse: 0.0143 - val_loss: 0.0104 - val_mae: 0.0903 - val_mse: 0.0104
Epoch 41/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0134 - mae: 0.1028 - mse: 0.0134 - val_loss: 0.0097 - val_mae: 0.0867 - val_mse: 0.0097
Epoch 42/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0126 - mae: 0.0991 - mse: 0.0126 - val_loss: 0.0091 - val_mae: 0.0831 - val_mse: 0.0091
Epoch 43/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0118 - mae: 0.0954 - mse: 0.0118 - val_loss: 0.0084 - val_mae: 0.0791 - val_mse: 0.0084
Epoch 44/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0110 - mae: 0.0916 - mse: 0.0110 - val_loss: 0.0077 - val_mae: 0.0755 - val_mse: 0.0077
Epoch 45/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0102 - mae: 0.0879 - mse: 0.0102 - val_loss: 0.0072 - val_mae: 0.0720 - val_mse: 0.0072
Epoch 46/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0095 - mae: 0.0843 - mse: 0.0095 - val_loss: 0.0066 - val_mae: 0.0686 - val_mse: 0.0066
Epoch 47/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0089 - mae: 0.0807 - mse: 0.0089 - val_loss: 0.0061 - val_mae: 0.0650 - val_mse: 0.0061
Epoch 48/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0082 - mae: 0.0771 - mse: 0.0082 - val_loss: 0.0056 - val_mae: 0.0618 - val_mse: 0.0056
Epoch 49/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0076 - mae: 0.0736 - mse: 0.0076 - val_loss: 0.0051 - val_mae: 0.0591 - val_mse: 0.0051
Epoch 50/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0071 - mae: 0.0704 - mse: 0.0071 - val_loss: 0.0047 - val_mae: 0.0564 - val_mse: 0.0047
Epoch 51/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0065 - mae: 0.0673 - mse: 0.0065 - val_loss: 0.0043 - val_mae: 0.0536 - val_mse: 0.0043
Epoch 52/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0060 - mae: 0.0641 - mse: 0.0060 - val_loss: 0.0039 - val_mae: 0.0508 - val_mse: 0.0039
Epoch 53/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0056 - mae: 0.0612 - mse: 0.0056 - val_loss: 0.0036 - val_mae: 0.0481 - val_mse: 0.0036
Epoch 54/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0051 - mae: 0.0584 - mse: 0.0051 - val_loss: 0.0033 - val_mae: 0.0456 - val_mse: 0.0033
Epoch 55/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0048 - mae: 0.0558 - mse: 0.0048 - val_loss: 0.0030 - val_mae: 0.0430 - val_mse: 0.0030
Epoch 56/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0044 - mae: 0.0532 - mse: 0.0044 - val_loss: 0.0027 - val_mae: 0.0405 - val_mse: 0.0027
Epoch 57/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0040 - mae: 0.0506 - mse: 0.0040 - val_loss: 0.0025 - val_mae: 0.0380 - val_mse: 0.0025
Epoch 58/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0037 - mae: 0.0481 - mse: 0.0037 - val_loss: 0.0023 - val_mae: 0.0365 - val_mse: 0.0023
Epoch 59/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0035 - mae: 0.0462 - mse: 0.0035 - val_loss: 0.0021 - val_mae: 0.0344 - val_mse: 0.0021
Epoch 60/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0032 - mae: 0.0439 - mse: 0.0032 - val_loss: 0.0019 - val_mae: 0.0326 - val_mse: 0.0019
Epoch 61/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0029 - mae: 0.0418 - mse: 0.0029 - val_loss: 0.0017 - val_mae: 0.0307 - val_mse: 0.0017
Epoch 62/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0027 - mae: 0.0397 - mse: 0.0027 - val_loss: 0.0016 - val_mae: 0.0292 - val_mse: 0.0016
Epoch 63/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0025 - mae: 0.0380 - mse: 0.0025 - val_loss: 0.0014 - val_mae: 0.0279 - val_mse: 0.0014
Epoch 64/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0023 - mae: 0.0364 - mse: 0.0023 - val_loss: 0.0013 - val_mae: 0.0265 - val_mse: 0.0013
Epoch 65/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0021 - mae: 0.0345 - mse: 0.0021 - val_loss: 0.0012 - val_mae: 0.0255 - val_mse: 0.0012
Epoch 66/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0019 - mae: 0.0332 - mse: 0.0019 - val_loss: 0.0011 - val_mae: 0.0246 - val_mse: 0.0011
Epoch 67/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0018 - mae: 0.0318 - mse: 0.0018 - val_loss: 0.0010 - val_mae: 0.0239 - val_mse: 0.0010
Epoch 68/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0017 - mae: 0.0305 - mse: 0.0017 - val_loss: 9.2125e-04 - val_mae: 0.0229 - val_mse: 9.2125e-04
Epoch 69/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0015 - mae: 0.0292 - mse: 0.0015 - val_loss: 8.4467e-04 - val_mae: 0.0221 - val_mse: 8.4467e-04
Epoch 70/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0014 - mae: 0.0279 - mse: 0.0014 - val_loss: 7.8465e-04 - val_mae: 0.0215 - val_mse: 7.8465e-04
Epoch 71/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0013 - mae: 0.0268 - mse: 0.0013 - val_loss: 7.4165e-04 - val_mae: 0.0211 - val_mse: 7.4165e-04
Epoch 72/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0012 - mae: 0.0258 - mse: 0.0012 - val_loss: 7.0427e-04 - val_mae: 0.0210 - val_mse: 7.0427e-04
Epoch 73/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0011 - mae: 0.0249 - mse: 0.0011 - val_loss: 6.6638e-04 - val_mae: 0.0208 - val_mse: 6.6638e-04
Epoch 74/200
5/5 [==============================] - 0s 5ms/step - loss: 0.0011 - mae: 0.0240 - mse: 0.0011 - val_loss: 6.5045e-04 - val_mae: 0.0208 - val_mse: 6.5045e-04
Epoch 75/200
5/5 [==============================] - 0s 6ms/step - loss: 0.0010 - mae: 0.0235 - mse: 0.0010 - val_loss: 6.2684e-04 - val_mae: 0.0205 - val_mse: 6.2684e-04
Epoch 76/200
5/5 [==============================] - 0s 7ms/step - loss: 9.5684e-04 - mae: 0.0229 - mse: 9.5684e-04 - val_loss: 6.1455e-04 - val_mae: 0.0204 - val_mse: 6.1455e-04
Epoch 77/200
5/5 [==============================] - 0s 7ms/step - loss: 9.1450e-04 - mae: 0.0224 - mse: 9.1450e-04 - val_loss: 6.0009e-04 - val_mae: 0.0201 - val_mse: 6.0009e-04
Epoch 78/200
5/5 [==============================] - 0s 6ms/step - loss: 8.7952e-04 - mae: 0.0218 - mse: 8.7952e-04 - val_loss: 5.8703e-04 - val_mae: 0.0200 - val_mse: 5.8703e-04
Epoch 79/200
5/5 [==============================] - 0s 6ms/step - loss: 8.4151e-04 - mae: 0.0214 - mse: 8.4151e-04 - val_loss: 5.7280e-04 - val_mae: 0.0197 - val_mse: 5.7280e-04
Epoch 80/200
5/5 [==============================] - 0s 5ms/step - loss: 8.0125e-04 - mae: 0.0210 - mse: 8.0125e-04 - val_loss: 5.6014e-04 - val_mae: 0.0195 - val_mse: 5.6014e-04
Epoch 81/200
5/5 [==============================] - 0s 5ms/step - loss: 7.6662e-04 - mae: 0.0206 - mse: 7.6662e-04 - val_loss: 5.4919e-04 - val_mae: 0.0192 - val_mse: 5.4919e-04
Epoch 82/200
5/5 [==============================] - 0s 5ms/step - loss: 7.2985e-04 - mae: 0.0201 - mse: 7.2985e-04 - val_loss: 5.3562e-04 - val_mae: 0.0189 - val_mse: 5.3562e-04
Epoch 83/200
5/5 [==============================] - 0s 5ms/step - loss: 7.0597e-04 - mae: 0.0198 - mse: 7.0597e-04 - val_loss: 5.2402e-04 - val_mae: 0.0186 - val_mse: 5.2402e-04
Epoch 84/200
5/5 [==============================] - 0s 5ms/step - loss: 6.7537e-04 - mae: 0.0193 - mse: 6.7537e-04 - val_loss: 5.0828e-04 - val_mae: 0.0183 - val_mse: 5.0828e-04
Epoch 85/200
5/5 [==============================] - 0s 6ms/step - loss: 6.4867e-04 - mae: 0.0190 - mse: 6.4867e-04 - val_loss: 4.9662e-04 - val_mae: 0.0182 - val_mse: 4.9662e-04
Epoch 86/200
5/5 [==============================] - 0s 5ms/step - loss: 6.2172e-04 - mae: 0.0186 - mse: 6.2172e-04 - val_loss: 4.8805e-04 - val_mae: 0.0181 - val_mse: 4.8805e-04
Epoch 87/200
5/5 [==============================] - 0s 5ms/step - loss: 5.9329e-04 - mae: 0.0182 - mse: 5.9329e-04 - val_loss: 4.7475e-04 - val_mae: 0.0179 - val_mse: 4.7475e-04
Epoch 88/200
5/5 [==============================] - 0s 6ms/step - loss: 5.6897e-04 - mae: 0.0178 - mse: 5.6897e-04 - val_loss: 4.6604e-04 - val_mae: 0.0177 - val_mse: 4.6604e-04
Epoch 89/200
5/5 [==============================] - 0s 6ms/step - loss: 5.4544e-04 - mae: 0.0175 - mse: 5.4544e-04 - val_loss: 4.5456e-04 - val_mae: 0.0175 - val_mse: 4.5456e-04
Epoch 90/200
5/5 [==============================] - 0s 5ms/step - loss: 5.2534e-04 - mae: 0.0171 - mse: 5.2534e-04 - val_loss: 4.4568e-04 - val_mae: 0.0175 - val_mse: 4.4568e-04
Epoch 91/200
5/5 [==============================] - 0s 5ms/step - loss: 5.0567e-04 - mae: 0.0168 - mse: 5.0567e-04 - val_loss: 4.2352e-04 - val_mae: 0.0169 - val_mse: 4.2352e-04
Epoch 92/200
5/5 [==============================] - 0s 6ms/step - loss: 4.8714e-04 - mae: 0.0165 - mse: 4.8714e-04 - val_loss: 4.1368e-04 - val_mae: 0.0167 - val_mse: 4.1368e-04
Epoch 93/200
5/5 [==============================] - 0s 5ms/step - loss: 4.6966e-04 - mae: 0.0162 - mse: 4.6966e-04 - val_loss: 4.0546e-04 - val_mae: 0.0167 - val_mse: 4.0546e-04
Epoch 94/200
5/5 [==============================] - 0s 5ms/step - loss: 4.5172e-04 - mae: 0.0159 - mse: 4.5172e-04 - val_loss: 3.9557e-04 - val_mae: 0.0165 - val_mse: 3.9557e-04
Epoch 95/200
5/5 [==============================] - 0s 5ms/step - loss: 4.3264e-04 - mae: 0.0155 - mse: 4.3264e-04 - val_loss: 3.8257e-04 - val_mae: 0.0162 - val_mse: 3.8257e-04
Epoch 96/200
5/5 [==============================] - 0s 5ms/step - loss: 4.1932e-04 - mae: 0.0153 - mse: 4.1932e-04 - val_loss: 3.6441e-04 - val_mae: 0.0158 - val_mse: 3.6441e-04
Epoch 97/200
5/5 [==============================] - 0s 5ms/step - loss: 4.0574e-04 - mae: 0.0150 - mse: 4.0574e-04 - val_loss: 3.4961e-04 - val_mae: 0.0155 - val_mse: 3.4961e-04
Epoch 98/200
5/5 [==============================] - 0s 5ms/step - loss: 3.9222e-04 - mae: 0.0147 - mse: 3.9222e-04 - val_loss: 3.3129e-04 - val_mae: 0.0151 - val_mse: 3.3129e-04
Epoch 99/200
5/5 [==============================] - 0s 5ms/step - loss: 3.7340e-04 - mae: 0.0144 - mse: 3.7340e-04 - val_loss: 3.2128e-04 - val_mae: 0.0150 - val_mse: 3.2128e-04
Epoch 100/200
5/5 [==============================] - 0s 5ms/step - loss: 3.6066e-04 - mae: 0.0141 - mse: 3.6066e-04 - val_loss: 3.0840e-04 - val_mae: 0.0146 - val_mse: 3.0840e-04
Epoch 101/200
5/5 [==============================] - 0s 5ms/step - loss: 3.4882e-04 - mae: 0.0139 - mse: 3.4882e-04 - val_loss: 2.9857e-04 - val_mae: 0.0145 - val_mse: 2.9857e-04
Epoch 102/200
5/5 [==============================] - 0s 5ms/step - loss: 3.3478e-04 - mae: 0.0136 - mse: 3.3478e-04 - val_loss: 2.8388e-04 - val_mae: 0.0140 - val_mse: 2.8388e-04
Epoch 103/200
5/5 [==============================] - 0s 5ms/step - loss: 3.2453e-04 - mae: 0.0133 - mse: 3.2453e-04 - val_loss: 2.7081e-04 - val_mae: 0.0136 - val_mse: 2.7081e-04
Epoch 104/200
5/5 [==============================] - 0s 5ms/step - loss: 3.1230e-04 - mae: 0.0131 - mse: 3.1230e-04 - val_loss: 2.6021e-04 - val_mae: 0.0133 - val_mse: 2.6021e-04
Epoch 105/200
5/5 [==============================] - 0s 5ms/step - loss: 3.0097e-04 - mae: 0.0128 - mse: 3.0097e-04 - val_loss: 2.5038e-04 - val_mae: 0.0131 - val_mse: 2.5038e-04
Epoch 106/200
5/5 [==============================] - 0s 5ms/step - loss: 2.8736e-04 - mae: 0.0125 - mse: 2.8736e-04 - val_loss: 2.3501e-04 - val_mae: 0.0125 - val_mse: 2.3501e-04
Epoch 107/200
5/5 [==============================] - 0s 5ms/step - loss: 2.7770e-04 - mae: 0.0123 - mse: 2.7770e-04 - val_loss: 2.1888e-04 - val_mae: 0.0117 - val_mse: 2.1888e-04
Epoch 108/200
5/5 [==============================] - 0s 5ms/step - loss: 2.6642e-04 - mae: 0.0121 - mse: 2.6642e-04 - val_loss: 2.0831e-04 - val_mae: 0.0113 - val_mse: 2.0831e-04
Epoch 109/200
5/5 [==============================] - 0s 5ms/step - loss: 2.5429e-04 - mae: 0.0119 - mse: 2.5429e-04 - val_loss: 1.9728e-04 - val_mae: 0.0110 - val_mse: 1.9728e-04
Epoch 110/200
5/5 [==============================] - 0s 5ms/step - loss: 2.4586e-04 - mae: 0.0117 - mse: 2.4586e-04 - val_loss: 1.9008e-04 - val_mae: 0.0107 - val_mse: 1.9008e-04
Epoch 111/200
5/5 [==============================] - 0s 6ms/step - loss: 2.3885e-04 - mae: 0.0115 - mse: 2.3885e-04 - val_loss: 1.8169e-04 - val_mae: 0.0105 - val_mse: 1.8169e-04
Epoch 112/200
5/5 [==============================] - 0s 5ms/step - loss: 2.2930e-04 - mae: 0.0113 - mse: 2.2930e-04 - val_loss: 1.7345e-04 - val_mae: 0.0100 - val_mse: 1.7345e-04
Epoch 113/200
5/5 [==============================] - 0s 5ms/step - loss: 2.2023e-04 - mae: 0.0111 - mse: 2.2023e-04 - val_loss: 1.6399e-04 - val_mae: 0.0096 - val_mse: 1.6399e-04
Epoch 114/200
5/5 [==============================] - 0s 5ms/step - loss: 2.1183e-04 - mae: 0.0108 - mse: 2.1183e-04 - val_loss: 1.5597e-04 - val_mae: 0.0093 - val_mse: 1.5597e-04
Epoch 115/200
5/5 [==============================] - 0s 7ms/step - loss: 2.0368e-04 - mae: 0.0106 - mse: 2.0368e-04 - val_loss: 1.5246e-04 - val_mae: 0.0093 - val_mse: 1.5246e-04
Epoch 116/200
5/5 [==============================] - 0s 6ms/step - loss: 1.9422e-04 - mae: 0.0104 - mse: 1.9422e-04 - val_loss: 1.4319e-04 - val_mae: 0.0091 - val_mse: 1.4319e-04
Epoch 117/200
5/5 [==============================] - 0s 5ms/step - loss: 1.8737e-04 - mae: 0.0102 - mse: 1.8737e-04 - val_loss: 1.3436e-04 - val_mae: 0.0086 - val_mse: 1.3436e-04
Epoch 118/200
5/5 [==============================] - 0s 5ms/step - loss: 1.8010e-04 - mae: 0.0100 - mse: 1.8010e-04 - val_loss: 1.2781e-04 - val_mae: 0.0084 - val_mse: 1.2781e-04
Epoch 119/200
5/5 [==============================] - 0s 5ms/step - loss: 1.7449e-04 - mae: 0.0099 - mse: 1.7449e-04 - val_loss: 1.2006e-04 - val_mae: 0.0082 - val_mse: 1.2006e-04
Epoch 120/200
5/5 [==============================] - 0s 5ms/step - loss: 1.6709e-04 - mae: 0.0096 - mse: 1.6709e-04 - val_loss: 1.1468e-04 - val_mae: 0.0081 - val_mse: 1.1468e-04
Epoch 121/200
5/5 [==============================] - 0s 5ms/step - loss: 1.6264e-04 - mae: 0.0095 - mse: 1.6264e-04 - val_loss: 1.0849e-04 - val_mae: 0.0080 - val_mse: 1.0849e-04
Epoch 122/200
5/5 [==============================] - 0s 5ms/step - loss: 1.5665e-04 - mae: 0.0093 - mse: 1.5665e-04 - val_loss: 1.0229e-04 - val_mae: 0.0078 - val_mse: 1.0229e-04
Epoch 123/200
5/5 [==============================] - 0s 5ms/step - loss: 1.5241e-04 - mae: 0.0091 - mse: 1.5241e-04 - val_loss: 9.7552e-05 - val_mae: 0.0076 - val_mse: 9.7552e-05
Epoch 124/200
5/5 [==============================] - 0s 5ms/step - loss: 1.4680e-04 - mae: 0.0090 - mse: 1.4680e-04 - val_loss: 9.1493e-05 - val_mae: 0.0074 - val_mse: 9.1493e-05
Epoch 125/200
5/5 [==============================] - 0s 5ms/step - loss: 1.4152e-04 - mae: 0.0088 - mse: 1.4152e-04 - val_loss: 8.9633e-05 - val_mae: 0.0074 - val_mse: 8.9633e-05
Epoch 126/200
5/5 [==============================] - 0s 5ms/step - loss: 1.3517e-04 - mae: 0.0087 - mse: 1.3517e-04 - val_loss: 8.7126e-05 - val_mae: 0.0074 - val_mse: 8.7126e-05
Epoch 127/200
5/5 [==============================] - 0s 5ms/step - loss: 1.3025e-04 - mae: 0.0085 - mse: 1.3025e-04 - val_loss: 8.2249e-05 - val_mae: 0.0072 - val_mse: 8.2249e-05
Epoch 128/200
5/5 [==============================] - 0s 5ms/step - loss: 1.2659e-04 - mae: 0.0084 - mse: 1.2659e-04 - val_loss: 7.8166e-05 - val_mae: 0.0070 - val_mse: 7.8166e-05
Epoch 129/200
5/5 [==============================] - 0s 6ms/step - loss: 1.2147e-04 - mae: 0.0082 - mse: 1.2147e-04 - val_loss: 7.5727e-05 - val_mae: 0.0070 - val_mse: 7.5727e-05
Epoch 130/200
5/5 [==============================] - 0s 5ms/step - loss: 1.1669e-04 - mae: 0.0080 - mse: 1.1669e-04 - val_loss: 7.3111e-05 - val_mae: 0.0068 - val_mse: 7.3111e-05
Epoch 131/200
5/5 [==============================] - 0s 5ms/step - loss: 1.1198e-04 - mae: 0.0079 - mse: 1.1198e-04 - val_loss: 6.9313e-05 - val_mae: 0.0067 - val_mse: 6.9313e-05
Epoch 132/200
5/5 [==============================] - 0s 5ms/step - loss: 1.0780e-04 - mae: 0.0077 - mse: 1.0780e-04 - val_loss: 6.7073e-05 - val_mae: 0.0066 - val_mse: 6.7073e-05
Epoch 133/200
5/5 [==============================] - 0s 4ms/step - loss: 1.0371e-04 - mae: 0.0076 - mse: 1.0371e-04 - val_loss: 6.4523e-05 - val_mae: 0.0064 - val_mse: 6.4523e-05
Epoch 134/200
5/5 [==============================] - 0s 5ms/step - loss: 1.0062e-04 - mae: 0.0075 - mse: 1.0062e-04 - val_loss: 6.2896e-05 - val_mae: 0.0063 - val_mse: 6.2896e-05
Epoch 135/200
5/5 [==============================] - 0s 5ms/step - loss: 9.7328e-05 - mae: 0.0074 - mse: 9.7328e-05 - val_loss: 6.0353e-05 - val_mae: 0.0062 - val_mse: 6.0353e-05
Epoch 136/200
5/5 [==============================] - 0s 5ms/step - loss: 9.4192e-05 - mae: 0.0072 - mse: 9.4192e-05 - val_loss: 5.9177e-05 - val_mae: 0.0061 - val_mse: 5.9177e-05
Epoch 137/200
5/5 [==============================] - 0s 5ms/step - loss: 9.0852e-05 - mae: 0.0071 - mse: 9.0852e-05 - val_loss: 5.6397e-05 - val_mae: 0.0059 - val_mse: 5.6397e-05
Epoch 138/200
5/5 [==============================] - 0s 5ms/step - loss: 8.6898e-05 - mae: 0.0070 - mse: 8.6898e-05 - val_loss: 5.3733e-05 - val_mae: 0.0057 - val_mse: 5.3733e-05
Epoch 139/200
5/5 [==============================] - 0s 4ms/step - loss: 8.3460e-05 - mae: 0.0068 - mse: 8.3460e-05 - val_loss: 5.4444e-05 - val_mae: 0.0058 - val_mse: 5.4444e-05
Epoch 140/200
5/5 [==============================] - 0s 5ms/step - loss: 8.0311e-05 - mae: 0.0067 - mse: 8.0311e-05 - val_loss: 5.3752e-05 - val_mae: 0.0058 - val_mse: 5.3752e-05
Epoch 141/200
5/5 [==============================] - 0s 5ms/step - loss: 7.7334e-05 - mae: 0.0066 - mse: 7.7334e-05 - val_loss: 5.2290e-05 - val_mae: 0.0057 - val_mse: 5.2290e-05
Epoch 142/200
5/5 [==============================] - 0s 5ms/step - loss: 7.4732e-05 - mae: 0.0065 - mse: 7.4732e-05 - val_loss: 4.8645e-05 - val_mae: 0.0054 - val_mse: 4.8645e-05
Epoch 143/200
5/5 [==============================] - 0s 5ms/step - loss: 7.2079e-05 - mae: 0.0064 - mse: 7.2079e-05 - val_loss: 4.4800e-05 - val_mae: 0.0053 - val_mse: 4.4800e-05
Epoch 144/200
5/5 [==============================] - 0s 5ms/step - loss: 6.9418e-05 - mae: 0.0063 - mse: 6.9418e-05 - val_loss: 4.2787e-05 - val_mae: 0.0053 - val_mse: 4.2787e-05
Epoch 145/200
5/5 [==============================] - 0s 5ms/step - loss: 6.6518e-05 - mae: 0.0061 - mse: 6.6518e-05 - val_loss: 4.1261e-05 - val_mae: 0.0052 - val_mse: 4.1261e-05
Epoch 146/200
5/5 [==============================] - 0s 5ms/step - loss: 6.3542e-05 - mae: 0.0060 - mse: 6.3542e-05 - val_loss: 4.1370e-05 - val_mae: 0.0054 - val_mse: 4.1370e-05
Epoch 147/200
5/5 [==============================] - 0s 5ms/step - loss: 6.2388e-05 - mae: 0.0060 - mse: 6.2388e-05 - val_loss: 4.1042e-05 - val_mae: 0.0054 - val_mse: 4.1042e-05
Epoch 148/200
5/5 [==============================] - 0s 4ms/step - loss: 6.0757e-05 - mae: 0.0059 - mse: 6.0757e-05 - val_loss: 4.0565e-05 - val_mae: 0.0053 - val_mse: 4.0565e-05
Epoch 149/200
5/5 [==============================] - 0s 5ms/step - loss: 5.8709e-05 - mae: 0.0058 - mse: 5.8709e-05 - val_loss: 4.0501e-05 - val_mae: 0.0052 - val_mse: 4.0501e-05
Epoch 150/200
5/5 [==============================] - 0s 6ms/step - loss: 5.6752e-05 - mae: 0.0057 - mse: 5.6752e-05 - val_loss: 4.0064e-05 - val_mae: 0.0050 - val_mse: 4.0064e-05
Epoch 151/200
5/5 [==============================] - 0s 5ms/step - loss: 5.5073e-05 - mae: 0.0056 - mse: 5.5073e-05 - val_loss: 4.0341e-05 - val_mae: 0.0048 - val_mse: 4.0341e-05
Epoch 152/200
5/5 [==============================] - 0s 5ms/step - loss: 5.3786e-05 - mae: 0.0055 - mse: 5.3786e-05 - val_loss: 3.7023e-05 - val_mae: 0.0049 - val_mse: 3.7023e-05
Epoch 153/200
5/5 [==============================] - 0s 5ms/step - loss: 5.1794e-05 - mae: 0.0054 - mse: 5.1794e-05 - val_loss: 3.8473e-05 - val_mae: 0.0048 - val_mse: 3.8473e-05
Epoch 154/200
5/5 [==============================] - 0s 5ms/step - loss: 5.0055e-05 - mae: 0.0053 - mse: 5.0055e-05 - val_loss: 3.7344e-05 - val_mae: 0.0048 - val_mse: 3.7344e-05
Epoch 155/200
5/5 [==============================] - 0s 7ms/step - loss: 4.8111e-05 - mae: 0.0052 - mse: 4.8111e-05 - val_loss: 3.8048e-05 - val_mae: 0.0048 - val_mse: 3.8048e-05
Epoch 156/200
5/5 [==============================] - 0s 5ms/step - loss: 4.7111e-05 - mae: 0.0051 - mse: 4.7111e-05 - val_loss: 3.7151e-05 - val_mae: 0.0048 - val_mse: 3.7151e-05
Epoch 157/200
5/5 [==============================] - 0s 6ms/step - loss: 4.6104e-05 - mae: 0.0050 - mse: 4.6104e-05 - val_loss: 3.6057e-05 - val_mae: 0.0048 - val_mse: 3.6057e-05
Epoch 158/200
5/5 [==============================] - 0s 5ms/step - loss: 4.4664e-05 - mae: 0.0050 - mse: 4.4664e-05 - val_loss: 3.5037e-05 - val_mae: 0.0048 - val_mse: 3.5037e-05
Epoch 159/200
5/5 [==============================] - 0s 5ms/step - loss: 4.3519e-05 - mae: 0.0049 - mse: 4.3519e-05 - val_loss: 3.5037e-05 - val_mae: 0.0049 - val_mse: 3.5037e-05
Epoch 160/200
5/5 [==============================] - 0s 4ms/step - loss: 4.1987e-05 - mae: 0.0048 - mse: 4.1987e-05 - val_loss: 3.3912e-05 - val_mae: 0.0049 - val_mse: 3.3912e-05
Epoch 161/200
5/5 [==============================] - 0s 5ms/step - loss: 4.1302e-05 - mae: 0.0048 - mse: 4.1302e-05 - val_loss: 3.6570e-05 - val_mae: 0.0049 - val_mse: 3.6570e-05
Epoch 162/200
5/5 [==============================] - 0s 5ms/step - loss: 3.9836e-05 - mae: 0.0048 - mse: 3.9836e-05 - val_loss: 3.6350e-05 - val_mae: 0.0049 - val_mse: 3.6350e-05
Epoch 163/200
5/5 [==============================] - 0s 5ms/step - loss: 3.8745e-05 - mae: 0.0047 - mse: 3.8745e-05 - val_loss: 3.4917e-05 - val_mae: 0.0050 - val_mse: 3.4917e-05
Epoch 164/200
5/5 [==============================] - 0s 4ms/step - loss: 3.7472e-05 - mae: 0.0046 - mse: 3.7472e-05 - val_loss: 3.5589e-05 - val_mae: 0.0050 - val_mse: 3.5589e-05
Epoch 165/200
5/5 [==============================] - 0s 6ms/step - loss: 3.6769e-05 - mae: 0.0045 - mse: 3.6769e-05 - val_loss: 3.7407e-05 - val_mae: 0.0051 - val_mse: 3.7407e-05
Epoch 166/200
5/5 [==============================] - 0s 5ms/step - loss: 3.5808e-05 - mae: 0.0044 - mse: 3.5808e-05 - val_loss: 3.7502e-05 - val_mae: 0.0051 - val_mse: 3.7502e-05
Epoch 167/200
5/5 [==============================] - 0s 5ms/step - loss: 3.4881e-05 - mae: 0.0045 - mse: 3.4881e-05 - val_loss: 3.6125e-05 - val_mae: 0.0051 - val_mse: 3.6125e-05
Epoch 168/200
5/5 [==============================] - 0s 5ms/step - loss: 3.3861e-05 - mae: 0.0044 - mse: 3.3861e-05 - val_loss: 3.6312e-05 - val_mae: 0.0050 - val_mse: 3.6312e-05
Epoch 169/200
5/5 [==============================] - 0s 5ms/step - loss: 3.2866e-05 - mae: 0.0043 - mse: 3.2866e-05 - val_loss: 3.5793e-05 - val_mae: 0.0050 - val_mse: 3.5793e-05
Epoch 170/200
5/5 [==============================] - 0s 5ms/step - loss: 3.1945e-05 - mae: 0.0042 - mse: 3.1945e-05 - val_loss: 3.4558e-05 - val_mae: 0.0050 - val_mse: 3.4558e-05
Epoch 171/200
5/5 [==============================] - 0s 5ms/step - loss: 3.1145e-05 - mae: 0.0042 - mse: 3.1145e-05 - val_loss: 3.3329e-05 - val_mae: 0.0049 - val_mse: 3.3329e-05
Epoch 172/200
5/5 [==============================] - 0s 5ms/step - loss: 3.1241e-05 - mae: 0.0042 - mse: 3.1241e-05 - val_loss: 3.2999e-05 - val_mae: 0.0049 - val_mse: 3.2999e-05
Epoch 173/200
5/5 [==============================] - 0s 5ms/step - loss: 2.9997e-05 - mae: 0.0041 - mse: 2.9997e-05 - val_loss: 3.6368e-05 - val_mae: 0.0051 - val_mse: 3.6368e-05
Epoch 174/200
5/5 [==============================] - 0s 5ms/step - loss: 2.9700e-05 - mae: 0.0041 - mse: 2.9700e-05 - val_loss: 3.7688e-05 - val_mae: 0.0052 - val_mse: 3.7688e-05
Epoch 175/200
5/5 [==============================] - 0s 6ms/step - loss: 2.9225e-05 - mae: 0.0040 - mse: 2.9225e-05 - val_loss: 3.6629e-05 - val_mae: 0.0051 - val_mse: 3.6629e-05
Epoch 176/200
5/5 [==============================] - 0s 6ms/step - loss: 2.8136e-05 - mae: 0.0040 - mse: 2.8136e-05 - val_loss: 3.5800e-05 - val_mae: 0.0051 - val_mse: 3.5800e-05
Epoch 177/200
5/5 [==============================] - 0s 5ms/step - loss: 2.7569e-05 - mae: 0.0039 - mse: 2.7569e-05 - val_loss: 3.6251e-05 - val_mae: 0.0051 - val_mse: 3.6251e-05
Epoch 178/200
5/5 [==============================] - 0s 5ms/step - loss: 2.7227e-05 - mae: 0.0039 - mse: 2.7227e-05 - val_loss: 3.2368e-05 - val_mae: 0.0049 - val_mse: 3.2368e-05
Epoch 179/200
5/5 [==============================] - 0s 5ms/step - loss: 2.6570e-05 - mae: 0.0038 - mse: 2.6570e-05 - val_loss: 3.3853e-05 - val_mae: 0.0050 - val_mse: 3.3853e-05
Epoch 180/200
5/5 [==============================] - 0s 5ms/step - loss: 2.5977e-05 - mae: 0.0038 - mse: 2.5977e-05 - val_loss: 3.5709e-05 - val_mae: 0.0051 - val_mse: 3.5709e-05
Epoch 181/200
5/5 [==============================] - 0s 5ms/step - loss: 2.5352e-05 - mae: 0.0037 - mse: 2.5352e-05 - val_loss: 3.2158e-05 - val_mae: 0.0049 - val_mse: 3.2158e-05
Epoch 182/200
5/5 [==============================] - 0s 5ms/step - loss: 2.4784e-05 - mae: 0.0037 - mse: 2.4784e-05 - val_loss: 3.5370e-05 - val_mae: 0.0051 - val_mse: 3.5370e-05
Epoch 183/200
5/5 [==============================] - 0s 5ms/step - loss: 2.3947e-05 - mae: 0.0037 - mse: 2.3947e-05 - val_loss: 3.4509e-05 - val_mae: 0.0051 - val_mse: 3.4509e-05
Epoch 184/200
5/5 [==============================] - 0s 5ms/step - loss: 2.3620e-05 - mae: 0.0037 - mse: 2.3620e-05 - val_loss: 3.4436e-05 - val_mae: 0.0051 - val_mse: 3.4436e-05
Epoch 185/200
5/5 [==============================] - 0s 5ms/step - loss: 2.3185e-05 - mae: 0.0036 - mse: 2.3185e-05 - val_loss: 3.4599e-05 - val_mae: 0.0051 - val_mse: 3.4599e-05
Epoch 186/200
5/5 [==============================] - 0s 5ms/step - loss: 2.2825e-05 - mae: 0.0036 - mse: 2.2825e-05 - val_loss: 3.5868e-05 - val_mae: 0.0051 - val_mse: 3.5868e-05
Epoch 187/200
5/5 [==============================] - 0s 5ms/step - loss: 2.2606e-05 - mae: 0.0036 - mse: 2.2606e-05 - val_loss: 3.2989e-05 - val_mae: 0.0049 - val_mse: 3.2989e-05
Epoch 188/200
5/5 [==============================] - 0s 5ms/step - loss: 2.1838e-05 - mae: 0.0035 - mse: 2.1838e-05 - val_loss: 3.1992e-05 - val_mae: 0.0048 - val_mse: 3.1992e-05
Epoch 189/200
5/5 [==============================] - 0s 5ms/step - loss: 2.1705e-05 - mae: 0.0034 - mse: 2.1705e-05 - val_loss: 3.2501e-05 - val_mae: 0.0048 - val_mse: 3.2501e-05
Epoch 190/200
5/5 [==============================] - 0s 5ms/step - loss: 2.1492e-05 - mae: 0.0034 - mse: 2.1492e-05 - val_loss: 2.9445e-05 - val_mae: 0.0047 - val_mse: 2.9445e-05
Epoch 191/200
5/5 [==============================] - 0s 5ms/step - loss: 2.1094e-05 - mae: 0.0034 - mse: 2.1094e-05 - val_loss: 2.9606e-05 - val_mae: 0.0047 - val_mse: 2.9606e-05
Epoch 192/200
5/5 [==============================] - 0s 5ms/step - loss: 2.0588e-05 - mae: 0.0033 - mse: 2.0588e-05 - val_loss: 3.1082e-05 - val_mae: 0.0047 - val_mse: 3.1082e-05
Epoch 193/200
5/5 [==============================] - 0s 5ms/step - loss: 2.0219e-05 - mae: 0.0033 - mse: 2.0219e-05 - val_loss: 3.2002e-05 - val_mae: 0.0048 - val_mse: 3.2002e-05
Epoch 194/200
5/5 [==============================] - 0s 5ms/step - loss: 2.0084e-05 - mae: 0.0033 - mse: 2.0084e-05 - val_loss: 2.9917e-05 - val_mae: 0.0047 - val_mse: 2.9917e-05
Epoch 195/200
5/5 [==============================] - 0s 7ms/step - loss: 2.0301e-05 - mae: 0.0033 - mse: 2.0301e-05 - val_loss: 2.9593e-05 - val_mae: 0.0047 - val_mse: 2.9593e-05
Epoch 196/200
5/5 [==============================] - 0s 5ms/step - loss: 1.9506e-05 - mae: 0.0032 - mse: 1.9506e-05 - val_loss: 2.6969e-05 - val_mae: 0.0046 - val_mse: 2.6969e-05
Epoch 197/200
5/5 [==============================] - 0s 5ms/step - loss: 1.9716e-05 - mae: 0.0032 - mse: 1.9716e-05 - val_loss: 2.6561e-05 - val_mae: 0.0046 - val_mse: 2.6561e-05
Epoch 198/200
5/5 [==============================] - 0s 5ms/step - loss: 1.9180e-05 - mae: 0.0032 - mse: 1.9180e-05 - val_loss: 2.6371e-05 - val_mae: 0.0046 - val_mse: 2.6371e-05
Epoch 199/200
5/5 [==============================] - 0s 5ms/step - loss: 1.8571e-05 - mae: 0.0031 - mse: 1.8571e-05 - val_loss: 2.6779e-05 - val_mae: 0.0047 - val_mse: 2.6779e-05
Epoch 200/200
5/5 [==============================] - 0s 5ms/step - loss: 1.8355e-05 - mae: 0.0031 - mse: 1.8355e-05 - val_loss: 2.7453e-05 - val_mae: 0.0047 - val_mse: 2.7453e-05

Plot the MSE and MAE curves as a function of the training epoch:

import matplotlib.pyplot as plt  # assumed to be imported once at the top of the script

# per-epoch metric curves recorded by model.fit in the History object
mse = history.history['mse']
val_mse = history.history['val_mse']
mae = history.history['mae']
val_mae = history.history['val_mae']

plt.figure(figsize=(10, 4))

plt.subplot(1, 2, 1)
plt.plot(mse, label='Training MSE')
plt.plot(val_mse, label='Validation MSE')
plt.title('Training and Validation MSE')
plt.xlabel('Epoch')
plt.legend()

plt.subplot(1, 2, 2)
plt.plot(mae, label='Training MAE')
plt.plot(val_mae, label='Validation MAE')
plt.title('Training and Validation MAE')
plt.xlabel('Epoch')
plt.legend()

plt.show()

[Figure: training and validation MSE (left) and training and validation MAE (right) over 200 epochs]

Because both metrics differ only slightly between the training and validation sets, the model clearly does not overfit, so its generalization ability should be strong.

Next, obtain the trained model's evaluation results on the test set:

results = model.evaluate(test_x, test_y)
1/1 [==============================] - 0s 16ms/step - loss: 3.5383e-05 - mae: 0.0056 - mse: 3.5383e-05

As shown, the model also performs well on the test set: the MAE is only 0.0056 and the MSE only 3.5383e-05. Since the target values (the hard-mode proportions) are on the order of 0.1, this corresponds to a relative error of roughly 0.0056 / 0.1 × 100% ≈ 5.6%.
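That figure can be recomputed from the list returned by model.evaluate (a minimal sketch; it assumes the model was compiled with metrics=['mae', 'mse'], so results unpacks as [loss, mae, mse], which matches the order printed in the log above):

# results unpacks as [loss, mae, mse], matching the compiled metrics (assumed)
test_loss, test_mae, test_mse = results
typical_target = 0.1  # the hard-mode proportions in this data sit near 0.1
print(f'test MAE = {test_mae:.4f}, relative error ~ {test_mae / typical_target:.1%}')

To inspect individual predictions alongside the ground truth: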

predictions = model.predict(test_x)
print(predictions.T)  # predicted hard-mode proportions
print(test_y.T)       # ground-truth values
[[0.10459261 0.10197757 0.08644199 0.11035927 0.09347305 0.09360062
  0.09138721 0.10092913 0.10016835 0.08661837]]
[[0.09803579 0.09590747 0.09069302 0.10081094 0.10042433 0.09706393
  0.09558318 0.09379448 0.09834806 0.09232945]]

In the output above, the first array holds the predicted values and the second the ground-truth values; the predictions are reasonably accurate.
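To put a number on that agreement, the two arrays can be compared element-wise (a quick sketch using the predictions and test_y arrays printed above; numpy is assumed to be imported as np, as elsewhere in this document):

# per-sample relative error between prediction and ground truth
rel_err = np.abs(predictions - test_y) / np.abs(test_y)
print(f'mean relative error: {rel_err.mean():.1%}, worst case: {rel_err.max():.1%}')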

The remaining work is to feed the data in generate_attribute.txt into the model to obtain predictions, then average them to get each single attribute's influence on the proportion of hard-mode players, and store the final results in one_attr_to_perc_of_hrd.json. The full implementation is not repeated here (a sketch is given below); only the results stored in that file are shown.
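A minimal sketch of this step, under stated assumptions: the original does not show the layout of generate_attribute.txt, so the snippet hypothetically treats each line as an attribute label followed by one flattened, comma-separated input vector; model is the trained network from above.

import json
import numpy as np

# Hypothetical layout: each line of generate_attribute.txt reads
#   "<attribute_label>,<v1>,<v2>,...,<vD>"
# i.e. an attribute label followed by one flattened input vector.
groups = {}
with open('/han/2023_MCM-ICM_Problems/work/generate_attribute.txt', 'r') as f:
    for lin in f:
        parts = lin.strip().split(',')
        groups.setdefault(parts[0], []).append([float(v) for v in parts[1:]])

# Run each group through the trained model and average its predictions,
# giving that attribute's estimated effect on the hard-mode proportion.
one_attr_to_perc_of_hrd = {}
for attr, vecs in groups.items():
    preds = model.predict(np.array(vecs), verbose=0)
    one_attr_to_perc_of_hrd[attr] = float(preds.mean())

with open('/han/2023_MCM-ICM_Problems/work/one_attr_to_perc_of_hrd.json', 'w') as f:
    json.dump(one_attr_to_perc_of_hrd, f, indent=2)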

[Figure: contents of one_attr_to_perc_of_hrd.json]
