This post uses a text-classification task as the running example.
_, pooled = self.bert(context, token_type_ids=types,
                      attention_mask=mask,
                      output_all_encoded_layers=False)
This raises an error, because output_all_encoded_layers was an argument of the old pytorch-pretrained-bert API and no longer exists in forward() of the current Hugging Face transformers BertModel:
result = self.forward(*input, **kwargs)
TypeError: forward() got an unexpected keyword argument 'output_all_encoded_layers'
First attempt: simply drop the argument:
_, pooled = self.bert(context, token_type_ids=types,
                      attention_mask=mask)
This still errors, now inside the downstream linear layer:
if input.dim() == 2 and bias is not None:
AttributeError: 'str' object has no attribute 'dim'
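The cause: recent transformers versions return a ModelOutput (a dict-like object) by default, and tuple-unpacking a dict yields its keys, which are strings; the classifier head then receives the string 'pooler_output' instead of a tensor, hence the AttributeError. A minimal illustration of the unpacking behaviour (not the original code):

from collections import OrderedDict

# ModelOutput is dict-like; unpacking a dict yields its keys (strings).
fake_output = OrderedDict(last_hidden_state=None, pooler_output=None)
_, pooled = fake_output
print(pooled)  # 'pooler_output' -> later fails with "'str' object has no attribute 'dim'"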
The fix is to pass return_dict=False so the model returns a plain tuple again:
_, pooled = self.bert(context, token_type_ids=types,
                      attention_mask=mask, return_dict=False)
Problem solved.
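For context, here is a minimal self-contained sketch of the pattern above; the model name bert-base-chinese, the two-class head, and the sample sentence are placeholders for illustration, not part of the original code:

import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertClassifier(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        self.fc = nn.Linear(self.bert.config.hidden_size, num_classes)

    def forward(self, context, types, mask):
        # return_dict=False restores the tuple return value
        # (sequence_output, pooled_output), so two-element unpacking works.
        _, pooled = self.bert(context, token_type_ids=types,
                              attention_mask=mask, return_dict=False)
        return self.fc(pooled)

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
enc = tokenizer("This is a test sentence", return_tensors="pt")
model = BertClassifier()
logits = model(enc["input_ids"], enc["token_type_ids"], enc["attention_mask"])
print(logits.shape)  # torch.Size([1, 2])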
Note: if this still does not solve it, try the following three variants:
_, cls_hs = self.bert(sent_id, attention_mask=mask)
_, cls_hs = self.bert(sent_id, attention_mask=mask)[:2]
_, cls_hs = self.bert(sent_id, attention_mask=mask, return_dict=False)
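Alternatively, the returned fields can be read by name instead of positional unpacking; this works with the ModelOutput that recent transformers versions return by default. A hedged sketch (model name and input are placeholders):

from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
bert = BertModel.from_pretrained("bert-base-chinese")
enc = tokenizer("test", return_tensors="pt")

outputs = bert(enc["input_ids"], attention_mask=enc["attention_mask"])
cls_hs = outputs.pooler_output           # pooled [CLS] vector, shape [1, hidden_size]
last_hidden = outputs.last_hidden_state  # per-token states, shape [1, seq_len, hidden_size]
print(type(outputs).__name__, cls_hs.shape)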