Feel free to add me on WeChat ("ai_qingke113") to discuss.
This article was first published at: http://wangguo.site/Blog/2024/Q1/fastgpt-chatglm2-wechat/
I have written a local deployment tutorial before; follow it to get ChatGLM2 running: https://wangguo.site/posts/9d8c1768.html
That tutorial runs web_demo.py. This time, run openai_api.py instead to start the OpenAI-compatible API service:
python openai_api.py
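Once the service is listening (openai_api.py defaults to port 8000), you can sanity-check it with an OpenAI-style chat request. A minimal sketch using only the standard library; the helper names and the model string "chatglm2-6b" are my own choices, not fixed by the script:

```python
import json
import urllib.request


def build_chat_request(prompt: str, model: str = "chatglm2-6b") -> dict:
    """Build an OpenAI-style /v1/chat/completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def ask(prompt: str, base_url: str = "http://127.0.0.1:8000/v1") -> str:
    """POST the request to the local openai_api.py server and return the reply text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# ask("你好")  # requires the server to be running
```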
Reference: https://doc.fastgpt.in/docs/development/one-api/
docker run --name one-api -d --restart always -p 13000:3000 -e TZ=Asia/Shanghai -v /home/ubuntu/data/one-api:/data justsong/one-api
Once the container is up, open the One-API console at http://localhost:13000 (container port 3000 is mapped to host port 13000).
Reference: https://doc.fastgpt.in/docs/development/docker/
mkdir fastgpt
cd fastgpt
curl -O https://raw.githubusercontent.com/labring/FastGPT/main/files/deploy/fastgpt/docker-compose.yml
curl -O https://raw.githubusercontent.com/labring/FastGPT/main/projects/app/data/config.json
Edit docker-compose.yml and set:
- OPENAI_BASE_URL=http://192.168.1.14:8000/v1  # use your machine's local IPv4 address (check with ifconfig)
- CHAT_API_KEY=sk-  # fill in the token created in One-API
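The IPv4 address above must be one the Docker containers can reach, so 127.0.0.1 will not work. If you prefer not to read it off ifconfig by hand, a best-effort Python sketch (the UDP connect only consults the routing table; no packet is actually sent):

```python
import socket


def local_ipv4() -> str:
    """Return the machine's outbound IPv4 address, falling back to loopback."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # Connecting a UDP socket selects a source address without sending data.
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"
    finally:
        s.close()

# print(local_ipv4())  # e.g. 192.168.1.14
```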
"chatModels": [
{
"model": "chatglm2",
"name": "chatglm2",
"maxContext": 8000,
"maxResponse": 4000,
"quoteMaxToken": 2000,
"maxTemperature": 1,
"vision": false,
"defaultSystemChatPrompt": ""
},
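A stray comma or a misspelled key in config.json silently breaks FastGPT's model list, so it can be worth validating the file after editing. A minimal sketch; the required-key set below mirrors the fields used in this tutorial and is my assumption, not FastGPT's official schema:

```python
import json

# Keys every chatModels entry in this tutorial carries (assumed, not an official schema).
REQUIRED_KEYS = {"model", "name", "maxContext", "maxResponse",
                 "quoteMaxToken", "maxTemperature", "vision",
                 "defaultSystemChatPrompt"}


def check_chat_models(config_text: str) -> list:
    """Parse config.json text and return the model names of valid chatModels entries."""
    config = json.loads(config_text)
    names = []
    for entry in config.get("chatModels", []):
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"{entry.get('model', '?')}: missing keys {sorted(missing)}")
        names.append(entry["model"])
    return names

# check_chat_models(open("config.json", encoding="utf-8").read())
```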
docker-compose pull
docker-compose up -d
In the FastGPT console, go to Apps -> Create, and select ChatGLM2 as the AI model.
Reference: https://doc.fastgpt.in/docs/use-cases/wechat/
docker pull aibotk/wechat-assistant
docker run -d -e AIBOTK_KEY="your 微秘书 apiKey" -e AIBOTK_SECRET="your 微秘书 apiSecret" --name=wechatbot aibotk/wechat-assistant
While testing the streaming endpoint, openai_api.py crashed. Abridged traceback (the intermediate Starlette/FastAPI/anyio frames are omitted):

ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File ".../sse_starlette/sse.py", line 215, in listen_for_disconnect
    message = await receive()
asyncio.exceptions.CancelledError: Cancelled by cancel scope 7fbe7b1b8f50

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File ".../sse_starlette/sse.py", line 245, in stream_response
    async for data in self.body_iterator:
  File "/home/guo/Desktop/ChatGLM2-6B-main/openai_api.py", line 138, in predict
    yield "{}".format(chunk.model_dump_json(exclude_unset=True,exclude_none=True))
  File ".../pydantic/main.py", line 1011, in json
    raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.')
TypeError: `dumps_kwargs` keyword arguments are no longer supported.
Fix (see https://github.com/THUDM/ChatGLM2-6B/issues/483): in openai_api.py, replace all three occurrences of
chunk.json(exclude_unset=True, ensure_ascii=False)
with
chunk.model_dump_json(exclude_unset=True, exclude_none=True)
Pydantic v2 no longer accepts extra dumps_kwargs such as ensure_ascii in .json(); model_dump_json is its v2 replacement.
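The three replacements can also be applied mechanically. A sketch of a small patch script, assuming the calls in your copy of openai_api.py are formatted exactly as above (back up the file first):

```python
from pathlib import Path

# Pydantic v1-style call that crashes under pydantic v2, and its v2 replacement.
OLD = "chunk.json(exclude_unset=True, ensure_ascii=False)"
NEW = "chunk.model_dump_json(exclude_unset=True, exclude_none=True)"


def patch_source(text: str) -> tuple:
    """Replace every OLD call in the source text; return (patched_text, count)."""
    count = text.count(OLD)
    return text.replace(OLD, NEW), count


def patch_file(path: str = "openai_api.py") -> int:
    """Patch the file in place and return how many calls were rewritten."""
    p = Path(path)
    patched, count = patch_source(p.read_text(encoding="utf-8"))
    p.write_text(patched, encoding="utf-8")
    return count

# patch_file()  # run from the ChatGLM2-6B directory; should report 3
```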