```python
from modelscope import AutoTokenizer, AutoModel, snapshot_download

# Download the ChatGLM3-6B weights from ModelScope
model_dir = snapshot_download("ZhipuAI/chatglm3-6b", revision="v1.0.0")
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
# Load in half precision (fp16) and move the model to the GPU
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
model = model.eval()

# First turn, starting from an empty history ("你好" = "Hello")
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
# Second turn: pass the accumulated history so the model keeps the
# conversation context ("Which parts of the Transformer does the
# ChatGLM3 architecture use?")
response, history = model.chat(
    tokenizer,
    "chatglm3模型的架构是使用了Transformer的哪些部分",
    history=history,
)
print(response)
```