This article describes how to create a virtual environment with the virtualenv library and then deploy and run ChatGLM3-6B, an open-source bilingual (Chinese-English) chat language model.
The code from this article has also been deployed on the Baidu PaddlePaddle AI Studio platform, so you can try it online.
Project link: ChatGLM3 online demo
The commands to set up the virtualenv environment are as follows:
```shell
git clone https://github.com/THUDM/ChatGLM3
cd ChatGLM3
pip install -U virtualenv
python -m virtualenv venv
source venv/bin/activate
pip install --upgrade pip
pip install protobuf transformers cpm_kernels torch gradio mdtex2html sentencepiece accelerate peft
```
In my testing, installing only the libraries above (protobuf, transformers, cpm_kernels, torch, gradio, mdtex2html, sentencepiece, accelerate, and peft) is enough to run the Gradio interactive chat interface.
If you have other needs, you can install additional dependencies based on the requirements.txt file.
Contents of requirements.txt:
```
# basic requirements

transformers==4.40.0
cpm_kernels>=1.0.11
torch>=2.3.0
vllm>=0.4.2
gradio>=4.26.0
sentencepiece>=0.2.0
sentence_transformers>=2.7.0
accelerate>=0.29.2
streamlit>=1.33.0
fastapi>=0.110.0
loguru~=0.7.2
mdtex2html>=1.3.0
latex2mathml>=3.77.0
jupyter_client>=8.6.1

# for openai demo
openai>=1.30.1
pydantic>=2.7.1
sse-starlette>=2.1.0
uvicorn>=0.29.0
timm>=0.9.16
tiktoken>=0.6.0

# for langchain demo
langchain>=0.2.1
langchain_community>=0.2.0
langchainhub>=0.1.15
arxiv>=2.1.0
```
Original link: https://huggingface.co/THUDM/chatglm3-6b
Mirror link: https://hf-mirror.com/THUDM/chatglm3-6b
Since the official team later released the model in .safetensors format, the model download page now contains many entries; here we use wget to download the files for the .safetensors version of the model (18 files, about 12 GB in total).
The project's demos default to the model path ChatGLM3/THUDM/chatglm3-6b; you can load the model from a different path by setting the MODEL_PATH environment variable.
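As a sketch of that behavior (the exact variable names inside the demo script may differ from this illustration), the path resolution amounts to:

```python
import os

# Illustrative sketch: use MODEL_PATH from the environment if set,
# otherwise fall back to the default relative path.
MODEL_PATH = os.environ.get("MODEL_PATH", "THUDM/chatglm3-6b")
# The tokenizer is normally loaded from the same directory unless
# a separate TOKENIZER_PATH is provided.
TOKENIZER_PATH = os.environ.get("TOKENIZER_PATH", MODEL_PATH)
```

So running, for example, `MODEL_PATH=/data/chatglm3-6b python basic_demo/web_demo_gradio.py` would load the weights from /data/chatglm3-6b instead of the default location.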
The download commands:
```shell
cd ~/ChatGLM3
mkdir -p THUDM/chatglm3-6b
cd THUDM/chatglm3-6b
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/MODEL_LICENSE
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/README.md
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/config.json
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/configuration_chatglm.py
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/model-00001-of-00007.safetensors
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/model-00002-of-00007.safetensors
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/model-00003-of-00007.safetensors
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/model-00004-of-00007.safetensors
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/model-00005-of-00007.safetensors
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/model-00006-of-00007.safetensors
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/model-00007-of-00007.safetensors
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/model.safetensors.index.json
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/modeling_chatglm.py
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/quantization.py
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/special_tokens_map.json
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/tokenization_chatglm.py
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/tokenizer.model
wget https://hf-mirror.com/THUDM/chatglm3-6b/resolve/main/tokenizer_config.json
```
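A 12 GB download can easily be interrupted partway through, so it is worth checking that everything landed. The following hypothetical helper (not part of the ChatGLM3 repo) lists which of the 18 expected files are missing from the model directory:

```python
from pathlib import Path

# The 11 static files plus the 7 safetensors shards fetched above.
STATIC_FILES = [
    "MODEL_LICENSE", "README.md", "config.json", "configuration_chatglm.py",
    "model.safetensors.index.json", "modeling_chatglm.py", "quantization.py",
    "special_tokens_map.json", "tokenization_chatglm.py",
    "tokenizer.model", "tokenizer_config.json",
]
SHARDS = [f"model-{i:05d}-of-00007.safetensors" for i in range(1, 8)]

def missing_files(model_dir: str) -> list[str]:
    """Return the expected files that are absent from model_dir."""
    d = Path(model_dir)
    return [name for name in STATIC_FILES + SHARDS if not (d / name).is_file()]
```

An empty list from `missing_files("THUDM/chatglm3-6b")` means all 18 files are in place (note this checks presence only, not that each shard downloaded completely).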
Running ChatGLM3 is straightforward: activate the virtual environment and run the basic_demo/web_demo_gradio.py script.
```shell
cd ChatGLM3
source venv/bin/activate
python basic_demo/web_demo_gradio.py
```
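Besides the web UI, the model can also be used programmatically through transformers. The helper below is a sketch following the usage pattern in the ChatGLM3 README; it assumes a CUDA GPU with enough memory and the model files downloaded above:

```python
def load_chatglm3(model_path: str = "THUDM/chatglm3-6b"):
    """Load the ChatGLM3-6B tokenizer and model (half precision, on GPU)."""
    # Imported lazily so the helper can be defined even before
    # transformers is installed in the environment.
    from transformers import AutoModel, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
    model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()
    return tokenizer, model.eval()

# One round of dialogue (requires the GPU and weights, so not run here):
#   tokenizer, model = load_chatglm3()
#   response, history = model.chat(tokenizer, "Hello, introduce yourself.", history=[])
#   print(response)
```

`trust_remote_code=True` is required because ChatGLM3 ships its own modeling code (modeling_chatglm.py, downloaded above) rather than using a model class built into transformers.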