Adding an embedding layer to a model
nn.Embedding(vocab_size, embedding_dim)  # vocab_size: number of tokens in the vocabulary; embedding_dim: size of each token's embedding vector
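A minimal sketch of how an embedding layer sits at the front of a model. The module name TextClassifier and the sizes (vocab_size=10000, embedding_dim=128, num_classes=2) are illustrative assumptions, not values from the text above:

import torch
import torch.nn as nn

class TextClassifier(nn.Module):
    def __init__(self, vocab_size=10000, embedding_dim=128, num_classes=2):
        super().__init__()
        # Maps each token id in [0, vocab_size) to a dense embedding_dim-dimensional vector
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        self.fc = nn.Linear(embedding_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: LongTensor of shape (batch, seq_len)
        embedded = self.embedding(token_ids)   # (batch, seq_len, embedding_dim)
        pooled = embedded.mean(dim=1)          # average over the sequence
        return self.fc(pooled)                 # (batch, num_classes)

model = TextClassifier()
dummy_batch = torch.randint(0, 10000, (4, 12))   # 4 sequences of 12 token ids
print(model(dummy_batch).shape)                  # torch.Size([4, 2])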
Workflow: obtain the data, train the word vectors, set the model hyperparameters, check the model's quality, and save and load the model.

1. Obtaining the data (part of the English Wikipedia page data; the archive is roughly 300 MB)

[root@bhs fastte]# wget -c http://mattmahoney.net/dc/enwik9.zip -P data
[root@bhs data]# unzip enwik9.zip
[root@bhs data]# ll
total 1291608
-rw-r--r-- 1 root root 1000000000 Jun  1  2011 enwik9
-rw-r--r-- 1 root root  322592222 Sep  2  2011 enwik9.zip
-rw-r--r-- 1 root root       1995 May 24 00:17 wikifil.pl

View the downloaded data:

[root@bhs data]# more enwik9
<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.3/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mediawiki.org/xml/export-0.3/ http://www.mediawiki.org/xml/export-0.3.xsd" version="0.3" xml:lang="en">
  <siteinfo>
    <sitename>Wikipedia</sitename>
    <base>http://en.wikipedia.org/wiki/Main_Page</base>
    <generator>MediaWiki 1.6alpha</generator>
    <case>first-letter</case>
    <namespaces>
      <namespace key="-2">Media</namespace>
      <namespace key="-1">Special</namespace>
      <namespace key="0" />
      <namespace key="1">Talk</namespace>
      <namespace key="2">User</namespace>
      <namespace key="3">User talk</namespace>
      <namespace key="4">Wikipedia</namespace>
      <namespace key="5">Wikipedia talk</namespace>

Data preprocessing: use the wikifil.pl script to strip the XML/HTML markup

[root@bhs data]# perl wikifil.pl enwik9 > fil

The output consists of words separated by spaces:

[root@bhs data]# more fil
anarchism originated as a term of abuse first used against early working class radicals including the diggers of the english revolution and the sans culottes of the french revolution whilst the term is still used in a pejorative way to descr

2. Training the word vectors

import fasttext

# Several common hyperparameters can be tuned to adjust the quality of the trained vectors:
# unsupervised training mode: 'skipgram' or 'cbow', default 'skipgram'; in practice skipgram exploits subwords better than cbow.
# embedding dimension dim: default 100; as the corpus grows, the embedding dimension usually needs to grow as well.
# number of passes over the data epoch: default 5; with a large enough dataset, fewer passes may be sufficient.
# learning rate lr: default 0.05; empirically, values in the range [0.01, 1] are recommended.
# number of threads thread: default 12; generally set this to the number of CPU cores.
model = fasttext.train_unsupervised('./fil', 'cbow', dim=300, epoch=1, lr=0.1, thread=2)

Read 124M words
Number of words:  218316
Number of labels: 0
Progress: 100.0% words/sec/thread:   26829 lr:  0.000000 avg.loss:  1.507921 ETA:   0h 0m 0s

3. Checking the model's quality

A simple way to check the quality of the word vectors is to look at a word's nearest neighbors and judge subjectively whether they are related to the target word, which gives a rough sense of how good the model is. Looking up the nearest neighbors of "dog", we find dog-related words:

>>> model.get_nearest_neighbors('dog')
[(0.8416129350662231, 'catdog'), (0.7638700604438782, 'sleddog'), (0.7467442154884338, 'dogcow'), (0.7335803508758545, 'hotdog'), (0.7319445610046387, 'bodog'), (0.715396523475647, 'dogs'), (0.711815595626831, 'dogo'), (0.6796260476112366, 'maddog'), (0.6774388551712036, 'madog'), (0.6770561933517456, 'spurdog')]

4. Saving and loading the model

>>> model.save_model("file_save_model.bin")

Loading the model:

>>> model2 = fasttext.load_model("./file_save_model.bin")

The first attempt fails with an out-of-memory error:

Warning : `load_model` does not return WordVectorModel or SupervisedModel any more, but a `FastText` object which is very similar.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/root/anaconda3/lib/python3.6/site-packages/fasttext/FastText.py", line 441, in load_model
    return _FastText(model_path=path)
  File "/root/anaconda3/lib/python3.6/site-packages/fasttext/FastText.py", line 98, in __init__
    self.f.loadModel(model_path)
MemoryError: std::bad_alloc

Retrying the load succeeds:

>>> model22 = fasttext.load_model("file_save_model.bin")
Warning : `load_model` does not return WordVectorModel or SupervisedModel any more, but a `FastText` object which is very similar.
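Before querying the reloaded model it can help to confirm it came back intact. A minimal sketch, assuming the model22 object loaded above (the fastText Python bindings expose get_dimension() and get_words()):

# Sanity-check the reloaded model
print(model22.get_dimension())   # expected 300, the dim used at training time
words = model22.get_words()
print(len(words))                # vocabulary size, expected 218316
print(words[:10])                # a few vocabulary entries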
Using the model to get a word vector:

>>> model22.get_word_vector("baihaisheng")
array([-6.14714921e-01,  1.77509800e-01, -6.72733843e-01,  3.89141440e-01,
        2.49527186e-01,  2.41667092e-01, -4.54738081e-01, -3.32535744e-01,
        ...
        2.32165232e-01, -2.18837187e-01], dtype=float32)

(The full output is a 300-dimensional float32 vector; the remaining values are omitted here.)
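The vectors returned by get_word_vector can be compared directly. A minimal sketch, assuming the model22 object loaded above; the word pairs are arbitrary illustrations:

import numpy as np

def cosine_similarity(model, w1, w2):
    # Cosine similarity between the fastText vectors of two words.
    v1 = model.get_word_vector(w1)
    v2 = model.get_word_vector(w2)
    return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))

print(cosine_similarity(model22, 'dog', 'dogs'))     # should be high, cf. the neighbor list above
print(cosine_similarity(model22, 'dog', 'algebra'))  # expected to be much lower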
Encoding sentences with the pretrained Chinese BERT model
import torch
import torch.nn as nn

# Use torch.hub (PyTorch's tool focused on transfer learning) to obtain the pretrained bert-base-chinese model
model = torch.hub.load('huggingface/pytorch-transformers', 'model', 'bert-base-chinese')
# Obtain the matching tokenizer, which maps each Chinese character to an integer id
tokenizer = torch.hub.load('huggingface/pytorch-transformers', 'tokenizer', 'bert-base-chinese')

def get_bert_encode_for_single(text):
    """
    description: encode a Chinese text with bert-base-chinese
    :param text: the text to encode
    :return: the tensor representation of the text produced by BERT
    """
    # First map each character to its id with the tokenizer.
    # Note that the tokenizer adds the start and end markers 101 and 102 around the result.
    # That matters when encoding multiple text segments, but is not needed here,
    # so we slice them off with [1:-1].
    indexed_tokens = tokenizer.encode(text)[1:-1]
    # Convert the list into a tensor
    tokens_tensor = torch.tensor([indexed_tokens])
    print(tokens_tensor)
    # Disable gradient tracking
    with torch.no_grad():
        # Run the model to obtain the hidden-layer output
        encoded_layers, _ = model(tokens_tensor)
    # The hidden states form a 3-D tensor whose outermost dimension is the batch size 1;
    # index with [0] to drop it.
    print(encoded_layers.shape)
    encoded_layers = encoded_layers[0]
    return encoded_layers

if __name__ == '__main__':
    text = '鸟语花香,好美'
    encoded_layers = get_bert_encode_for_single(text)
    print(encoded_layers)

Output:

tensor([[7881, 6427, 5709, 7676, 8024, 1962, 5401]])
torch.Size([1, 7, 768])
tensor([[-0.0471, -0.1492, -0.8097,  ..., -0.3070,  0.2260,  0.4534],
        [ 0.8415, -0.2064,  0.1300,  ...,  0.9771,  0.8106,  0.4841],
        [ 0.9327, -0.2953,  0.0101,  ...,  0.4636,  0.5196,  0.0178],
        ...,
        [ 0.3189,  0.2888, -0.2720,  ...,  1.2119,  0.2128,  0.3653],
        [ 0.5846,  0.2318,  0.4427,  ...,  0.6561,  0.5574,  0.3100],
        [ 0.7143, -0.3749, -0.3590,  ...,  0.6304,  0.0795,  0.3684]])
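The torch.hub entry point above reflects an older pytorch-transformers release. A roughly equivalent sketch using the current transformers package (assuming it is installed; recent versions return the hidden states as outputs.last_hidden_state rather than a tuple):

import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('bert-base-chinese')
model = BertModel.from_pretrained('bert-base-chinese')

text = '鸟语花香,好美'
# add_special_tokens=False skips the [CLS]/[SEP] markers, matching the [1:-1] slice above
inputs = tokenizer(text, add_special_tokens=False, return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)
encoded = outputs.last_hidden_state[0]   # shape: (seq_len, 768)
print(encoded.shape)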
Evaluating the quality of word vectors
Semantic relatedness task
This task evaluates how well a word vector model captures the semantic relatedness between two words, e.g. 学生 (student) and 作业 (homework), or 中国 (China) and 北京 (Beijing).
Concretely this is done in a supervised fashion: we first need a labelled file like the one below, usually produced by human annotators:
学生 上课 0.78
教师 备课 0.8
...
The file above records the semantic relatedness between word pairs. We compare these annotated scores with the similarity of the trained word vectors (for example, the cosine distance between vectors); by fixing a suitable loss or correlation measure over the two, we obtain an evaluation metric (a sketch of such an evaluation follows below).
However, this approach requires manual annotation in the first place, and the accuracy of the annotations has a large impact on the resulting metric.
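A minimal sketch of this evaluation, assuming a whitespace-separated file in the format shown above (word1 word2 score) and the fastText model trained earlier; the file name pairs.txt and the choice of Spearman correlation as the metric are illustrative assumptions:

import numpy as np
from scipy.stats import spearmanr

def cosine(v1, v2):
    return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))

human_scores, model_scores = [], []
with open('pairs.txt', encoding='utf-8') as f:   # hypothetical annotation file: "word1 word2 score"
    for line in f:
        w1, w2, score = line.split()
        human_scores.append(float(score))
        model_scores.append(cosine(model.get_word_vector(w1), model.get_word_vector(w2)))

# Rank correlation between human judgements and model similarities;
# higher correlation indicates better word vectors.
print(spearmanr(human_scores, model_scores).correlation)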
Semantic analogy task
This task uses word vectors to test how well the model captures semantic relations between words. Typically, given three words a, b and c, the goal is to find the word d that best completes the analogy "a is to b as c is to d", i.e. the word whose vector is closest to vector(b) - vector(a) + vector(c); the classic example is king - man + woman ≈ queen.
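A minimal sketch of answering such an analogy query by vector arithmetic over any word-to-vector mapping; the tiny hand-made vectors are purely illustrative:

import numpy as np

def solve_analogy(vectors, a, b, c):
    """Return the word d whose vector is closest (by cosine) to b - a + c."""
    target = vectors[b] - vectors[a] + vectors[c]
    target = target / np.linalg.norm(target)
    best_word, best_sim = None, -1.0
    for word, vec in vectors.items():
        if word in (a, b, c):                 # exclude the query words themselves
            continue
        sim = float(np.dot(vec, target) / np.linalg.norm(vec))
        if sim > best_sim:
            best_word, best_sim = word, sim
    return best_word, best_sim

# Toy vectors chosen so that king - man + woman lands exactly on queen.
toy = {
    'man':   np.array([1.0, 0.0, 0.0]),
    'woman': np.array([0.0, 1.0, 0.0]),
    'king':  np.array([1.0, 0.0, 1.0]),
    'queen': np.array([0.0, 1.0, 1.0]),
}
print(solve_analogy(toy, 'man', 'king', 'woman'))   # expected: ('queen', 1.0)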
Text classification task
This task builds document vectors from word vectors, typically by summing or averaging them, and then uses those document vectors for text classification; the classification accuracy and similar metrics then serve as a measure of word vector quality.
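A minimal sketch of that pipeline, assuming the fastText model trained earlier; the toy texts, labels and the logistic-regression classifier are illustrative assumptions:

import numpy as np
from sklearn.linear_model import LogisticRegression

def document_vector(model, text):
    # Average the word vectors of the tokens to form one document vector.
    tokens = text.split()
    return np.mean([model.get_word_vector(t) for t in tokens], axis=0)

# Toy labelled corpus (purely illustrative).
texts  = ['the dog barks loudly', 'cats and dogs are pets',
          'stocks fell sharply today', 'the market closed lower']
labels = [0, 0, 1, 1]

X = np.stack([document_vector(model, t) for t in texts])
clf = LogisticRegression(max_iter=1000).fit(X, labels)

# Accuracy on the (tiny) training set; a real benchmark would use a held-out test set.
print(clf.score(X, labels))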