Let's pull off a little stunt while sharing some tech~ As for what the stunt is, those who know, know. Boys, stay strong, self-respecting, and self-loving!!!
First, let's look at the finished result:
Here we'll use a domestic LLM: the large language model from Moonshot AI. Without further ado, let's get started.
Now let's get the Moonshot ("dark side of the moon") API key. Head over to the developer console: https://platform.moonshot.cn/console/info. You may be wondering why we chose this LLM. In practice, after trying it out, its Chinese output is noticeably better than GPT-3.5 and other models, and it can handle fairly complex tasks, while the price stays within an acceptable range. Of course, the relay service we use later also supports GPT-4 directly, but the cost goes up sharply!
Once you're in the console, just follow the prompts to create a key. One thing to note: free accounts get ¥15 worth of tokens but are subject to a concurrency limit, so it's worth enabling paid access to raise the concurrency quota.
First, let's write the prompt configuration. This part is very simple:
# initialize the chatbot configuration
api_key = "sk-FGivAMvTnxPSWlp7HrGfDD"
openai_api_base = "https://api.moonshot.cn/v1"
system_prompt = "你是跨国婚姻法律小助手,小汐,负责回答用户关于跨国婚姻的问题。你的回答要清晰明了,有逻辑性和条理性。请使用中文回答。"  # "You are Xiao Xi, a legal assistant for cross-border marriage questions; answer clearly and logically, in Chinese."
default_model = "moonshot-v1-8k"
temperature = 0.5
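Hardcoding the key works for a quick demo, but for anything you commit or share it is safer to read it from an environment variable. A minimal sketch (the variable name `MOONSHOT_API_KEY` is my own choice, not something the platform mandates):

```python
import os

# Read the Moonshot key from the environment instead of hardcoding it;
# fall back to a placeholder so the module still imports without one.
api_key = os.getenv("MOONSHOT_API_KEY", "sk-xxxx")
openai_api_base = "https://api.moonshot.cn/v1"
default_model = "moonshot-v1-8k"
temperature = 0.5

# A quick sanity check that we at least have something key-shaped
print(api_key.startswith("sk-"))  # → True
```

This keeps the rest of the code identical while letting each machine supply its own credentials.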
With the prompt written, we still aren't done: we need to wire up the model. Since Moonshot's API follows the OpenAI conventions, we can simply use the official openai library.
Now look at the code below:
import time

from openai import OpenAI

client = OpenAI(api_key=api_key, base_url=openai_api_base)


class ChatBotHandler(object):

    def __init__(self, bot_name="chat"):
        self.bot_name = bot_name
        self.current_message = None

    def user_stream(self, user_message, history):
        self.current_message = user_message
        return "", history + [[user_message, None]]

    def bot_stream(self, history):
        if len(history) == 0:
            history.append([self.current_message, None])
        bot_message = self.getResponse(history[-1][0], history)
        history[-1][1] = ""
        # Reveal the reply character by character for a typing effect
        for character in bot_message:
            history[-1][1] += character
            time.sleep(0.02)
            yield history

    def signChat(self, history):
        history_openai_format = []
        # Add the system message first
        history_openai_format.append({"role": "system", "content": system_prompt})
        # Then append the parsed conversation history
        history_openai_format.extend(history)
        completion = client.chat.completions.create(
            model=default_model,
            messages=history_openai_format,
            temperature=temperature,
        )
        return completion.choices[0].message.content

    def getResponse(self, message, history):
        # The system prompt only needs to appear once, at the start,
        # not once per turn as in the original version
        history_openai_format = [{"role": "system", "content": system_prompt}]
        for human, assistant in history:
            if human is not None:
                history_openai_format.append({"role": "user", "content": human})
            if assistant is not None:
                history_openai_format.append({"role": "assistant", "content": assistant})
        completion = client.chat.completions.create(
            model=default_model,
            messages=history_openai_format,
            temperature=temperature,
        )
        return completion.choices[0].message.content

    def chat(self, message, history):
        history_openai_format = []
        for human, assistant in history:
            history_openai_format.append({"role": "user", "content": human})
            # Previous bot replies belong to the "assistant" role,
            # not "system" as in the original version
            history_openai_format.append({"role": "assistant", "content": assistant})
        history_openai_format.append({"role": "user", "content": message})
        response = client.chat.completions.create(
            model=default_model,
            messages=history_openai_format,
            temperature=1.0,
            stream=True,
        )
        partial_message = ""
        for chunk in response:
            if chunk.choices[0].delta.content is not None:
                partial_message += chunk.choices[0].delta.content
                yield partial_message
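The message-assembly step is the easiest place to introduce a subtle bug (wrong roles, wrong ordering, forgetting to skip empty slots), so it is worth exercising on its own without hitting the API. A sketch, where `build_messages` is my own standalone helper mirroring the assembly logic above, not part of the original code:

```python
def build_messages(system_prompt, history):
    """Assemble an OpenAI-style message list: one system message,
    then alternating user/assistant turns, skipping empty slots."""
    messages = [{"role": "system", "content": system_prompt}]
    for human, assistant in history:
        if human is not None:
            messages.append({"role": "user", "content": human})
        if assistant is not None:
            messages.append({"role": "assistant", "content": assistant})
    return messages

history = [
    ["跨国婚姻需要哪些材料?", "一般需要双方的护照、出生证明等。"],
    ["在中国领证的流程是什么?", None],  # the latest turn has no reply yet
]
msgs = build_messages("你是跨国婚姻法律小助手", history)
print([m["role"] for m in msgs])  # → ['system', 'user', 'assistant', 'user']
```

Because the function is pure, you can unit-test it in milliseconds and trust that the only thing left to debug is the network call itself.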
After that, all that's left is the web UI; here we again go straight to streamlit:
class AssistantNovel(object):

    def __init__(self):
        self.chat = ChatBotHandler()

    def get_response(self, prompt, history):
        return self.chat.signChat(history)

    def clear_chat_history(self):
        st.session_state.messages = [{"role": "assistant", "content": "..."}]  # greeting text truncated in the source
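The token-by-token accumulation used in `chat` can also be exercised without calling the API by substituting a simulated chunk stream. A sketch, where `stream_reply` and `fake_chunks` are illustrative names of my own, not part of the Moonshot SDK:

```python
def stream_reply(chunks):
    """Accumulate streamed content the way chat() does:
    skip None deltas and yield the growing partial message."""
    partial_message = ""
    for content in chunks:
        if content is not None:
            partial_message += content
            yield partial_message

# Simulate the delta.content values a streaming response might carry,
# including the None deltas that real streams interleave
fake_chunks = ["你好", None, ",我是", "小汐"]
print(list(stream_reply(fake_chunks)))  # → ['你好', '你好,我是', '你好,我是小汐']
```

Yielding the cumulative string (rather than each fragment) is what lets the UI layer simply re-render the latest value on every step.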