LangChain series articles
Many things can go wrong in an LLM application: the LLM API may have problems, the model may produce poor output, another integration may break, and so on. Fallbacks let you handle and isolate these failures gracefully.

Crucially, fallbacks can be applied not only at the LLM level but at the level of an entire runnable.

Important: by default, many LLM wrappers catch errors and retry internally. When using fallbacks you will most likely want to turn that off; otherwise the first wrapper will keep retrying instead of failing over to the fallback.
Here we create a fallback from ChatOpenAI to the plain OpenAI completion model. Because OpenAI is not a chat model, you will probably want a different prompt for it.

```python
from dotenv import load_dotenv  # import the loader for .env environment variables

load_dotenv()  # actually load the environment variables

from langchain.globals import set_debug  # import the LangChain debug switch

set_debug(True)  # enable LangChain's debug mode

from langchain.prompts import ChatPromptTemplate, PromptTemplate
from langchain_community.chat_models import ChatOpenAI
from langchain_community.llms import OpenAI
# We add in a string output parser here so the outputs between the two are the same type
from langchain_core.output_parsers import StrOutputParser

# First let's create a chain with a ChatModel
chat_prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You're a nice assistant who always includes a compliment in your response",
        ),
        ("human", "Why did the {animal} cross the road"),
    ]
)
# Here we're going to use a bad model name to easily create a chain that will error
chat_model = ChatOpenAI(model_name="gpt-fake")
bad_chain = chat_prompt | chat_model | StrOutputParser()

# Now let's create a chain with the normal OpenAI model
prompt_template = """Instructions: You should always include a compliment in your response.
Question: Why did the {animal} cross the road?"""
prompt = PromptTemplate.from_template(prompt_template)
llm = OpenAI()
good_chain = prompt | llm

# We can now create a final chain which combines the two
chain = bad_chain.with_fallbacks([good_chain])
response = chain.invoke({"animal": "turtle"})
print('response >> ', response)
```
Run output
```
zgpeace@zgpeaces-MacBook-Pro ~/W/L/langchain-llm-app> python LCEL/fallbacks.py
[chain/start] [1:chain:RunnableWithFallbacks] Entering Chain run with input:
{
  "animal": "turtle"
}
[chain/start] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence] Entering Chain run with input:
{
  "animal": "turtle"
}
[chain/start] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence > 3:prompt:ChatPromptTemplate] Entering Prompt run with input:
{
  "animal": "turtle"
}
[chain/end] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence > 3:prompt:ChatPromptTemplate] [1ms] Exiting Prompt run with output:
{
  "lc": 1,
  "type": "constructor",
  "id": ["langchain", "prompts", "chat", "ChatPromptValue"],
  "kwargs": {
    "messages": [
      {
        "lc": 1,
        "type": "constructor",
        "id": ["langchain", "schema", "messages", "SystemMessage"],
        "kwargs": {
          "content": "You're a nice assistant who always includes a compliment in your response",
          "additional_kwargs": {}
        }
      },
      {
        "lc": 1,
        "type": "constructor",
        "id": ["langchain", "schema", "messages", "HumanMessage"],
        "kwargs": {
          "content": "Why did the turtle cross the road",
          "additional_kwargs": {}
        }
      }
    ]
  }
}
[llm/start] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence > 4:llm:ChatOpenAI] Entering LLM run with input:
{
  "prompts": [
    "System: You're a nice assistant who always includes a compliment in your response\nHuman: Why did the turtle cross the road"
  ]
}
[llm/error] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence > 4:llm:ChatOpenAI] [5.02s] LLM run errored with error: "APIConnectionError('Connection error.')"
  httpx.HTTPStatusError: Server error '503 Service Unavailable' for url 'https://apejhvxcd.cloud.sealos.io/v1/chat/completions'
  ...
  ssl.SSLEOFError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)
  ...
  httpcore.ConnectError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)
  ...
  httpx.ConnectError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)
  ...
  openai.APIConnectionError: Connection error.
[chain/error] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence] [5.03s] Chain run errored with error: "APIConnectionError('Connection error.')"
  ...
[chain/start] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence] Entering Chain run with input:
{
  "animal": "turtle"
}
[chain/start] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence > 6:prompt:PromptTemplate] Entering Prompt run with input:
{
  "animal": "turtle"
}
[chain/end] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence > 6:prompt:PromptTemplate] [3ms] Exiting Prompt run with output:
{
  "lc": 1,
  "type": "constructor",
  "id": ["langchain", "prompts", "base", "StringPromptValue"],
  "kwargs": {
    "text": "Instructions: You should always include a compliment in your response.\nQuestion: Why did the turtle cross the road?"
  }
}
[llm/start] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence > 7:llm:OpenAI] Entering LLM run with input:
{
  "prompts": [
    "Instructions: You should always include a compliment in your response.\nQuestion: Why did the turtle cross the road?"
  ]
}
[llm/end] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence > 7:llm:OpenAI] [3.95s] Exiting LLM run with output:
{
  "generations": [
    [
      {
        "text": "\n\nResponse: That's an interesting question! I'm not sure, but I do know that turtles are known for their determination and perseverance. It's incredible how they can navigate their way through different terrains. That turtle must have been on a special mission!",
        "generation_info": {
          "finish_reason": "stop",
          "logprobs": null
        },
        "type": "Generation"
      }
    ]
  ],
  "llm_output": {
    "token_usage": {
      "completion_tokens": 52,
      "total_tokens": 74,
      "prompt_tokens": 22
    },
    "model_name": "gpt-3.5-turbo-instruct"
  },
  "run": null
}
[chain/end] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence] [3.96s] Exiting Chain run with output:
{
  "output": "\n\nResponse: That's an interesting question! I'm not sure, but I do know that turtles are known for their determination and perseverance. It's incredible how they can navigate their way through different terrains. That turtle must have been on a special mission!"
}
[chain/end] [1:chain:RunnableWithFallbacks] [9.01s] Exiting Chain run with output:
{
  "output": "\n\nResponse: That's an interesting question! I'm not sure, but I do know that turtles are known for their determination and perseverance. It's incredible how they can navigate their way through different terrains. That turtle must have been on a special mission!"
}
response >>  Response: That's an interesting question! I'm not sure, but I do know that turtles are known for their determination and perseverance. It's incredible how they can navigate their way through different terrains. That turtle must have been on a special mission!
```
https://github.com/zgpeace/pets-name-langchain/tree/develop
https://python.langchain.com/docs/expression_language/how_to/fallbacks