
Building a Baichuan2-13B chat model demo on the ModelScope platform

1. Quick and dirty: straight to the code

Note: the author had no problems building on the environment shown in the figure below. You also need modelscope >= 1.9.1; for gradio, just install the latest version.
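Assuming a standard pip environment, the dependencies mentioned above can be installed like this (exact torch/gradio versions are not pinned in the post, so this is only a sketch):

```shell
# modelscope >= 1.9.1 as required by the demo; latest gradio; PyTorch for fp16 inference
pip install "modelscope>=1.9.1" gradio torch
```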

```python
import torch
import gradio as gr
from modelscope import snapshot_download, Model

# Download the model snapshot from ModelScope and load it in fp16
model_dir = snapshot_download(
    "baichuan-inc/Baichuan2-13B-Chat", revision='v1.0.1')
model = Model.from_pretrained(
    model_dir, device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)

def clear_session():
    return []

def predict(input, history):
    if history is None:
        history = []
    # Convert Gradio's (user, assistant) history tuples into the
    # role/content message list that the Baichuan2 model expects
    model_input = []
    for chat in history:
        model_input.append({"role": "user", "content": chat[0]})
        model_input.append({"role": "assistant", "content": chat[1]})
    model_input.append({"role": "user", "content": input})
    print(model_input)
    response = model(model_input)["response"]
    history.append((input, response))
    history = history[-20:]  # keep at most the last 20 turns
    return '', history

block = gr.Blocks()
with block as demo:
    gr.Markdown("""<h1><center>Baichuan2-13B-Chat</center></h1>
<center>Baichuan2-13B-Chat is the aligned version in the Baichuan2-13B series; the pretrained model is available as Baichuan2-13B-Base</center>
""")
    chatbot = gr.Chatbot(label='Baichuan2-13B-Chat')
    message = gr.Textbox()
    message.submit(predict,
                   inputs=[message, chatbot],
                   outputs=[message, chatbot])
    with gr.Row():
        # The original snippet is cut off here; the button label and the
        # wiring below are a reconstruction of the usual ending for this demo.
        clear_history = gr.Button("Clear history")
        clear_history.click(fn=clear_session, inputs=[], outputs=[chatbot])

demo.launch()
```
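To make the conversation format concrete, here is a small standalone sketch of the history-to-messages conversion that `predict` performs before calling the model. The helper name `build_model_input` is mine, not from the original code:

```python
def build_model_input(history, user_input):
    """Flatten Gradio (user, assistant) history tuples into the
    role/content message list that Baichuan2-style chat models expect."""
    model_input = []
    for user_msg, assistant_msg in history:
        model_input.append({"role": "user", "content": user_msg})
        model_input.append({"role": "assistant", "content": assistant_msg})
    # The new user turn always goes last
    model_input.append({"role": "user", "content": user_input})
    return model_input

msgs = build_model_input([("Hi", "Hello! How can I help?")], "Tell me a joke")
# msgs now holds 3 messages: user, assistant, user
```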