
Running chatglm3-6b with ModelScope

I used ModelScope to run the chatglm3-6b model:
  from modelscope import AutoTokenizer, AutoModel, snapshot_download

  # Download the model weights from the ModelScope hub, pinned to revision v1.0.0
  model_dir = snapshot_download("ZhipuAI/chatglm3-6b", revision="v1.0.0")

  # trust_remote_code is required because ChatGLM3 ships its own model code
  tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
  # Load the weights in fp16 and move the model to the GPU
  model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
  model = model.eval()

  # First turn: start a fresh conversation with an empty history
  response, history = model.chat(tokenizer, "Hello", history=[])
  print(response)

  # Second turn: pass the returned history back in so the model keeps the context
  response, history = model.chat(tokenizer, "Which parts of the Transformer does the chatglm3 architecture use?", history=history)
  print(response)
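The `history` value returned by `chat` carries the running conversation, and passing it back into the next call is what makes the exchange multi-turn. A minimal sketch of that threading pattern, with no model involved: `fake_chat` below is a hypothetical stand-in for `model.chat`, and the role-dict shape of each turn is an assumption for illustration, not the library's guaranteed format.

```python
def fake_chat(query, history=None):
    # Stand-in for model.chat: records the user turn and a canned reply.
    history = list(history or [])
    history.append({"role": "user", "content": query})
    response = f"echo: {query}"
    history.append({"role": "assistant", "content": response})
    return response, history

# Same calling pattern as the real model: empty history on the first turn,
# then feed the returned history back in on every later turn.
response, history = fake_chat("hello", history=[])
response, history = fake_chat("second question", history=history)
print(len(history))  # 4: two user turns plus two assistant replies
```

If you drop the `history=history` argument on a later turn, each call starts a brand-new conversation and earlier context is lost.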
