# !pip install transformers
from transformers import AutoModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("./distilbert-base-uncased-finetuned-sst-2-english")
model = AutoModel.from_pretrained("./distilbert-base-uncased-finetuned-sst-2-english")
Some weights of the model checkpoint at ./distilbert-base-uncased-finetuned-sst-2-english were not used when initializing DistilBertModel: ['classifier.weight', 'pre_classifier.weight', 'pre_classifier.bias', 'classifier.bias']
- This IS expected if you are initializing DistilBertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing DistilBertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
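This warning is expected here: the checkpoint was fine-tuned for sequence classification (SST-2), but AutoModel builds only the bare DistilBERT encoder, so the classifier.* and pre_classifier.* weights are discarded. A minimal sketch, assuming you want to keep the classification head as well (clf_model is an illustrative name):
# Sketch: load the full classification model instead of the bare encoder,
# so the classifier/pre_classifier weights reported above are actually used.
from transformers import AutoModelForSequenceClassification
clf_model = AutoModelForSequenceClassification.from_pretrained(
    "./distilbert-base-uncased-finetuned-sst-2-english"
)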
en = tokenizer.encode("how are you and you")
en,type(en)
([101, 2129, 2024, 2017, 1998, 2017, 102], list)
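Note that encode() has added the tokenizer's special tokens: 101 is [CLS] and 102 is [SEP]. A quick round-trip check (the comment shows the expected result):
# Decode the ids to make the special tokens visible:
tokenizer.decode(en)
# '[CLS] how are you and you [SEP]'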
import torch
torch.tensor(en)
tensor([ 101, 2129, 2024, 2017, 1998, 2017, 102])
# Method 1: add a batch dimension with unsqueeze
# out = model(torch.tensor(en).unsqueeze(0))
# Method 2: wrap the id list in an outer list
out = model(torch.tensor([en]))
print(out)
BaseModelOutput(last_hidden_state=tensor([[[ 0.4692, 0.5402, 0.2137, ..., -0.1891, 1.0371, -0.8645],
[ 0.9280, 0.8054, -0.0353, ..., -0.0706, 1.0147, -0.9412],
[ 1.1769, 0.4334, -0.4291, ..., -0.3780, 0.6734, -0.5759],
...,
[ 1.0213, 0.6273, 0.5482, ..., -0.2374, 1.0714, -0.5215],
[ 0.4576, 0.2577, 0.3044, ..., -0.1127, 1.1128, -0.9350],
[ 1.2613, 0.2868, 0.2176, ..., 0.7057, 0.1919, -0.7504]]],
grad_fn=<NativeLayerNormBackward>), hidden_states=None, attentions=None)
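In practice the tokenizer can also build the batched tensors (input ids plus attention mask) in one call, instead of wrapping the id list by hand. A sketch of the same forward pass (inputs is an illustrative name):
# Method 3 (sketch): let the tokenizer produce PyTorch tensors directly.
inputs = tokenizer("how are you and you", return_tensors="pt")
out = model(**inputs)  # same result; the attention_mask is passed along too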
type(out)
transformers.modeling_outputs.BaseModelOutput
out[0]
tensor([[[ 0.4692, 0.5402, 0.2137, ..., -0.1891, 1.0371, -0.8645],
[ 0.9280, 0.8054, -0.0353, ..., -0.0706, 1.0147, -0.9412],
[ 1.1769, 0.4334, -0.4291, ..., -0.3780, 0.6734, -0.5759],
...,
[ 1.0213, 0.6273, 0.5482, ..., -0.2374, 1.0714, -0.5215],
[ 0.4576, 0.2577, 0.3044, ..., -0.1127, 1.1128, -0.9350],
[ 1.2613, 0.2868, 0.2176, ..., 0.7057, 0.1919, -0.7504]]],
grad_fn=<NativeLayerNormBackward>)
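BaseModelOutput supports attribute access as well as indexing, so out[0] and out.last_hidden_state are the same tensor. A sketch of pulling a sentence-level vector from it (hidden and cls_vec are illustrative names):
# The hidden states have shape [batch, tokens, hidden]: here [1, 7, 768].
hidden = out.last_hidden_state
cls_vec = hidden[:, 0]  # the [CLS] position, a common sentence embedding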