
Setting up Tongyi Qianwen-7B (Qwen-7B) on the ModelScope community

ModelScope community (魔搭社区)

 

Copy and run the following commands:

  python3 -m venv myvenv
  source myvenv/bin/activate
  pip install modelscope
  pip install transformers_stream_generator
  pip install transformers
  pip install tiktoken
  pip install accelerate
  pip install bitsandbytes
  touch run.py
  vi run.py
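
Optionally, before editing run.py (or in a separate shell), you can confirm that the environment is usable and pre-download the model weights so the first run does not stall on the download. A minimal sketch, assuming torch was pulled in as a dependency of accelerate/transformers (the file name check_env.py is just an example):

  # check_env.py - optional sanity check and weight pre-download
  import torch                      # installed as a dependency of accelerate/transformers
  from modelscope import snapshot_download

  print("CUDA available:", torch.cuda.is_available())

  # Fetching the large model weights ahead of time keeps run.py from blocking on the download
  model_dir = snapshot_download('qwen/Qwen-7B-Chat', revision='v1.0.1')
  print("Model cached at:", model_dir)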

Copy the code below and paste it into run.py:

  import os
  import platform
  from modelscope import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

  model_id = 'qwen/Qwen-7B-Chat'
  revision = 'v1.0.1'

  # Load the tokenizer and the model in fp16, spreading layers across available devices
  tokenizer = AutoTokenizer.from_pretrained(model_id, revision=revision, trust_remote_code=True)
  model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", revision=revision,
                                               trust_remote_code=True, fp16=True).eval()
  model.generation_config = GenerationConfig.from_pretrained(
      model_id, trust_remote_code=True)  # generation length, top_p and other hyperparameters can be set here

  stop_stream = False


  def clear_screen():
      if platform.system() == "Windows":
          os.system("cls")
      else:
          os.system("clear")


  def print_history(history):
      for pair in history:
          print(f"\nUser:{pair[0]}\nQwen-7B:{pair[1]}")


  def main():
      history, response = [], ''
      global stop_stream
      clear_screen()
      print("Welcome to Qwen-7B. Type a message to chat, 'clear' to reset the history, 'stop' to quit.")
      while True:
          query = input("\nUser:")
          if query.strip() == "stop":
              break
          if query.strip() == "clear":
              history = []
              clear_screen()
              print("Welcome to Qwen-7B. Type a message to chat, 'clear' to reset the history, 'stop' to quit.")
              continue
          # stream=True yields progressively longer partial responses; redraw the screen on each update
          for response in model.chat(tokenizer, query, history=history, stream=True):
              if stop_stream:
                  stop_stream = False
                  break
              else:
                  clear_screen()
                  print_history(history)
                  print(f"\nUser: {query}")
                  print("\nQwen-7B:", end="")
                  print(response)
          history.append((query, response))


  if __name__ == "__main__":
      main()

Press ESC, then type :wq and hit Enter.
The file is now saved.
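
For reference, run.py loads Qwen-7B-Chat from ModelScope in fp16 and runs a streaming chat loop in the terminal, redrawing the screen as partial responses arrive. The same chat() interface can also be called without streaming for a quick one-off test; a minimal sketch (the prompt text is only an illustration):

  # single_turn.py - minimal non-streaming use of the same chat interface
  from modelscope import AutoModelForCausalLM, AutoTokenizer

  model_id = 'qwen/Qwen-7B-Chat'
  tokenizer = AutoTokenizer.from_pretrained(model_id, revision='v1.0.1', trust_remote_code=True)
  model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", revision='v1.0.1',
                                               trust_remote_code=True, fp16=True).eval()

  # Without streaming, chat() returns the full reply plus the updated history
  response, history = model.chat(tokenizer, "Hello, please introduce yourself.", history=None)
  print(response)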

Then run:

python run.py
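
Note that the fp16 model needs a GPU with roughly 16 GB of memory or more. If that is not available, bitsandbytes (installed above) can be used to load the weights in 8-bit instead. A sketch, assuming your transformers and bitsandbytes versions support the load_in_8bit flag for this model:

  # run_int8.py - same chat idea, but loading the model in 8-bit to save GPU memory
  from modelscope import AutoModelForCausalLM, AutoTokenizer

  model_id = 'qwen/Qwen-7B-Chat'
  tokenizer = AutoTokenizer.from_pretrained(model_id, revision='v1.0.1', trust_remote_code=True)
  # load_in_8bit requires bitsandbytes; do not combine it with fp16=True
  model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", revision='v1.0.1',
                                               trust_remote_code=True, load_in_8bit=True).eval()

  response, _ = model.chat(tokenizer, "Hello", history=None)
  print(response)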

 
