$ conda create --prefix /data/envs/chatchat python=3.10
$ cd /data/envs/chatchat
$ conda activate /data/envs/chatchat
$ git clone https://github.com/chatchat-space/Langchain-Chatchat.git
$ cd Langchain-Chatchat
$ pip install -r requirements.txt
$ pip install -r requirements_api.txt
$ pip install -r requirements_webui.txt
$ pip install openai==0.28
The default dependencies cover the basic runtime environment (FAISS vector store). To use a vector store such as milvus or pg_vector, uncomment the corresponding dependencies in requirements.txt before installing; see the sketch below.
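If you enable one of the optional vector stores, its client library must be installed as well. A minimal sketch (the exact package names, e.g. pymilvus, are an assumption; check the commented lines in your requirements.txt release):
$ grep -n -i -E 'milvus|pgvector' requirements.txt   # locate the commented optional dependencies
$ pip install pymilvus                               # assumption: Milvus client
$ pip install pgvector psycopg2-binary               # assumption: needed for pg_vector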
$ yum install git-lfs
$ git lfs install
Git LFS initialized.
$ git clone https://huggingface.co/THUDM/chatglm3-6b
$ git clone https://huggingface.co/BAAI/bge-large-zh
# The two clones above are very slow from Hugging Face; cloning the same models from the ModelScope mirrors below works just as well
$ git clone https://www.modelscope.cn/ZhipuAI/chatglm3-6b.git
$ git clone https://www.modelscope.cn/AI-ModelScope/bge-large-zh.git
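To keep the local weights in one place, the two clones can be moved under the aimodel directory that MODEL_ROOT_PATH points to below (the path comes from this guide; that the loader resolves models by directory name under MODEL_ROOT_PATH is an assumption worth checking against configs/model_config.py):
$ mkdir -p /data/envs/chatchat/Langchain-Chatchat/aimodel
$ mv chatglm3-6b bge-large-zh /data/envs/chatchat/Langchain-Chatchat/aimodel/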
$ python copy_config_example.py
$ python init_database.py --recreate-vs
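A hedged note on --recreate-vs: it rebuilds the vector store from the documents already present in the default knowledge base (knowledge_base/samples/content is an assumption based on 0.2.x layouts; adjust to your checkout). To index your own documents before rebuilding:
$ cp /path/to/your/docs/*.md knowledge_base/samples/content/   # hypothetical source path
$ python init_database.py --recreate-vs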
MODEL_ROOT_PATH = "/data/envs/chatchat/Langchain-Chatchat/aimodel"
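For context, MODEL_ROOT_PATH is set in configs/model_config.py (generated by copy_config_example.py above). A minimal excerpt, assuming the default model names match the directories cloned earlier:
# configs/model_config.py (excerpt)
MODEL_ROOT_PATH = "/data/envs/chatchat/Langchain-Chatchat/aimodel"
EMBEDDING_MODEL = "bge-large-zh"   # assumption: matches the cloned embedding model directory
LLM_MODELS = ["chatglm3-6b"]       # assumption: matches the cloned LLM directory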
$ pip uninstall torch
$ pip uninstall torchvision
$ pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 torchaudio==0.11.0 --extra-index-url https://download.pytorch.org/whl/cu113
Official reference: https://pytorch.org/get-started/previous-versions/
$ python
>>> import torch
>>> print(torch.__version__)
1.11.0+cu113
>>> print(torch.cuda.is_available())
True
$ python startup.py -a
## Error: RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
## module 'torch' has no attribute 'fx'
## The cause is the transformers version: downgrade from 4.35.2 to 4.34.0
$ pip install transformers==4.34.0
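To confirm the downgrade took effect and the original import error is gone, a quick check (the expected output shown is an assumption based on the pinned version):
$ python -c "import transformers; print(transformers.__version__)"
4.34.0
$ python -c "from transformers.models.llama import modeling_llama; print('import ok')"
import ok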
$ python startup.py -a
Chatchat WEBUI Server: http://0.0.0.0:8501
$ /sbin/iptables -I INPUT -p tcp --dport 8501 -j ACCEPT
$ /sbin/iptables -I INPUT -p tcp --dport 7861 -j ACCEPT
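To verify the services are reachable on those ports, and optionally persist the rules (whether the iptables service is installed on this yum-based system is an assumption):
$ ss -tlnp | grep -E '8501|7861'   # the WebUI (8501) and API (7861) ports should both be listening
$ service iptables save            # assumption: iptables-services is installed; otherwise the rules are lost on reboot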
Reference article: 本地搭建chatgpt知识库 - 掘金 (juejin.cn)
Models and project: https://github.com/chatchat-space/Langchain-Chatchat?tab=readme-ov-file