The ChatGLM2-6B model
GitHub repository: https://github.com/THUDM/ChatGLM2-6B
Model weights on Baidu Netdisk: https://pan.baidu.com/s/1-LFcPB0H23RSpTKOECsjxw?pwd=5e3d (extraction code: 5e3d)
ChatGLM-6B is an open-source dialogue language model that supports both Chinese and English. It is based on the General Language Model (GLM) architecture and has 6.2 billion parameters.
ChatGLM requires a minimum of 6 GB of GPU memory.
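That 6 GB figure is tied to quantization: with roughly 6.2 billion parameters, the weight footprint scales with the bits stored per parameter. A rough back-of-the-envelope sketch (my own estimate, ignoring activations and the KV cache, which add further overhead):

```python
def weight_gb(n_params: float, bits: int) -> float:
    """Approximate model-weight memory in GB at a given precision."""
    return n_params * bits / 8 / 1e9  # bits -> bytes -> GB

N = 6.2e9  # ChatGLM2-6B parameter count
print(f"FP16: {weight_gb(N, 16):.1f} GB")  # ~12.4 GB
print(f"INT8: {weight_gb(N, 8):.1f} GB")   # ~6.2 GB
print(f"INT4: {weight_gb(N, 4):.1f} GB")   # ~3.1 GB
```

This is why the 8-bit quantized load below fits on a 6-8 GB card, while the INT4 checkpoint fits in even less.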
Method 1: call the model directly from Python

from transformers import AutoTokenizer, AutoModel

# model_path = r"E:\download\chatglm2-6b-int4"
model_path = r"E:\download\chatglm2-6b"
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
# model = AutoModel.from_pretrained(model_path, trust_remote_code=True, device='cuda')
# Adjust as needed; only 4/8-bit quantization is currently supported
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).quantize(8).cuda()
# model = AutoModel.from_pretrained(model_path, trust_remote_code=True).cuda()
model = model.eval()
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=history)
print(response)
Method 2: use web_demo.py
Simply run the script directly.
Method 3: use Streamlit
First locate the streamlit executable in your environment, then run:
D:\pythonapp\anacondas\envs\chatglm\Scripts\streamlit run web_demo2.py
Example prompt: the classic "chickens and rabbits in the same cage" problem
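The traditional instance of this puzzle is a handy sanity check for the model's reasoning: a cage holds chickens and rabbits with 35 heads and 94 legs in total. A small solver to verify whatever answer the model gives (this particular instance and the function name are illustrative, not from the original post):

```python
def solve(heads: int, legs: int):
    """Solve the chickens-and-rabbits puzzle:
         chickens + rabbits     = heads
       2*chickens + 4*rabbits   = legs
    """
    rabbits = (legs - 2 * heads) // 2   # each rabbit adds 2 legs over a chicken
    chickens = heads - rabbits
    return chickens, rabbits

print(solve(35, 94))  # (23, 12): 23 chickens, 12 rabbits
```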