
LangChain 57 Deep Dive into the LangChain Expression Language (LCEL), Part 20: LLM Fallbacks, Rate Limits, and Backup Models


LangChain Series Articles

  1. LangChain 36 Deep Dive into the LangChain Expression Language (LCEL), Part 1: The Advantages of LCEL
  2. LangChain 37 Deep Dive into LCEL, Part 2: Implementing prompt + model + output parser
  3. LangChain 38 Deep Dive into LCEL, Part 3: Implementing RAG (Retrieval-Augmented Generation)
  4. LangChain 39 Deep Dive into LCEL, Part 4: Why Use LCEL
  5. LangChain 40 Hands-on: An Alternative Way to Access the OpenAI ChatGPT API After "Account deactivated", via a Proxy API
  6. LangChain 41 Deep Dive into LCEL, Part 5: Why Use LCEL to Call LLMs
  7. LangChain 42 Deep Dive into LCEL, Part 6: Calling Different LLMs at Runtime
  8. LangChain 43 Deep Dive into LCEL, Part 7: Logging and Fallbacks
  9. LangChain 44 Deep Dive into LCEL, Part 8: Runnable Interface Input and Output Schemas
  10. LangChain 45 Deep Dive into LCEL, Part 9: Runnable Invoke, Streaming, Batch, and Async Calls
  11. LangChain 46 Deep Dive into LCEL, Part 10: Debug Logging of Intermediate Runnable States
  12. LangChain 47 Deep Dive into LCEL, Part 11: Parallel Processing with Runnables
  13. LangChain 48 The Ultimate Solution: An Alternative Way to Access the OpenAI ChatGPT API After "Account deactivated", via a Proxy API
  14. LangChain 49 Deep Dive into LCEL, Part 12: Passing Data Through a Runnable While Keeping the Input Unchanged
  15. LangChain 50 Deep Dive into LCEL, Part 13: Custom Pipeline Functions
  16. LangChain 51 Deep Dive into LCEL, Part 14: Auto-Fixing with RunnableConfig
  17. LangChain 52 Deep Dive into LCEL, Part 15: Binding Runtime Args
  18. LangChain 53 Deep Dive into LCEL, Part 16: Dynamic Routing
  19. LangChain 54 Deep Dive into LCEL, Part 17: Dynamic Routing with Chains
  20. LangChain 55 Deep Dive into LCEL, Part 18: Custom Dynamic Routing with Functions
  21. LangChain 56 Deep Dive into LCEL, Part 19: Selecting the LLM at Runtime via Config


1. Adding Fallbacks

There are many possible points of failure in an LLM application, whether the problem is the LLM API itself, poor model output, an issue with another integration, or something else. Fallbacks let you handle and isolate these failures gracefully.

Crucially, fallbacks can be applied not only at the level of a single LLM, but at the level of an entire runnable.

  1. Handling LLM API errors
    This is probably the most common use case for fallbacks. A request to an LLM API can fail for any number of reasons: the API may be down, you may have hit a rate limit, and so on. Fallbacks help guard against these kinds of problems.

Important: by default, many LLM wrappers catch errors and retry internally. When using fallbacks, you will most likely want to turn those retries off; otherwise the first wrapper keeps retrying instead of failing over to the backup.
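The failover behavior described above can be sketched in plain Python, independent of LangChain. The helper name `with_fallbacks` below deliberately mirrors LangChain's method name, but this is an illustrative reimplementation of the idea (try each option in order, no internal retries), not the library code:

```python
def with_fallbacks(runnables):
    """Try each callable in order; move on to the next only when the
    current one raises. No internal retries: a failure means immediate
    failover, which is why wrapper-level retries should be disabled."""
    def invoke(inp):
        last_err = None
        for run in runnables:
            try:
                return run(inp)
            except Exception as err:
                last_err = err  # remember the failure, try the next option
        raise last_err  # every option failed
    return invoke

def primary(question):
    # Stands in for a model whose API is down or rate limited.
    raise ConnectionError("LLM API unavailable")

def backup(question):
    # Stands in for a working backup model.
    return f"backup answer for {question!r}"

chain = with_fallbacks([primary, backup])
print(chain("turtle"))  # -> backup answer for 'turtle'
```

The sketch also makes the retry caveat concrete: if `primary` silently retried for a long time before raising, `backup` would never get a chance to run promptly.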

  2. Fallbacks for Sequences
    We can also create fallbacks for sequences, where the fallbacks are themselves sequences. Here we do this with two different models: ChatOpenAI, and then the plain OpenAI completion model (which is not a chat model). Because OpenAI is not a chat model, you will likely want a different prompt for it.
from dotenv import load_dotenv  # helper to load environment variables from a .env file
load_dotenv()  # actually load the environment variables

from langchain.globals import set_debug
set_debug(True)  # enable LangChain debug logging

from langchain.prompts import ChatPromptTemplate, PromptTemplate
from langchain_community.chat_models import ChatOpenAI
from langchain_community.llms import OpenAI  # the plain (non-chat) completion model
# A string output parser so the two chains produce the same output type
from langchain_core.output_parsers import StrOutputParser

# First let's create a chain with a ChatModel
chat_prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You're a nice assistant who always includes a compliment in your response",
        ),
        ("human", "Why did the {animal} cross the road"),
    ]
)
# Here we're going to use a bad model name to easily create a chain that will error
chat_model = ChatOpenAI(model_name="gpt-fake")
bad_chain = chat_prompt | chat_model | StrOutputParser()

prompt_template = """Instructions: You should always include a compliment in your response.
Question: Why did the {animal} cross the road?"""
prompt = PromptTemplate.from_template(prompt_template)
llm = OpenAI()
good_chain = prompt | llm

# We can now create a final chain which combines the two
chain = bad_chain.with_fallbacks([good_chain])
response = chain.invoke({"animal": "turtle"})
print('response >> ', response)
Output

zgpeace@zgpeaces-MacBook-Pro ~/W/L/langchain-llm-app> python LCEL/fallbacks.py                                          [18:59:12] develop?
[chain/start] [1:chain:RunnableWithFallbacks] Entering Chain run with input:
{
  "animal": "turtle"
}
[chain/start] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence] Entering Chain run with input:
{
  "animal": "turtle"
}
[chain/start] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence > 3:prompt:ChatPromptTemplate] Entering Prompt run with input:
{
  "animal": "turtle"
}
[chain/end] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence > 3:prompt:ChatPromptTemplate] [1ms] Exiting Prompt run with output:
{
  "lc": 1,
  "type": "constructor",
  "id": [
    "langchain",
    "prompts",
    "chat",
    "ChatPromptValue"
  ],
  "kwargs": {
    "messages": [
      {
        "lc": 1,
        "type": "constructor",
        "id": [
          "langchain",
          "schema",
          "messages",
          "SystemMessage"
        ],
        "kwargs": {
          "content": "You're a nice assistant who always includes a compliment in your response",
          "additional_kwargs": {}
        }
      },
      {
        "lc": 1,
        "type": "constructor",
        "id": [
          "langchain",
          "schema",
          "messages",
          "HumanMessage"
        ],
        "kwargs": {
          "content": "Why did the turtle cross the road",
          "additional_kwargs": {}
        }
      }
    ]
  }
}
[llm/start] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence > 4:llm:ChatOpenAI] Entering LLM run with input:
{
  "prompts": [
    "System: You're a nice assistant who always includes a compliment in your response\nHuman: Why did the turtle cross the road"
  ]
}
[llm/error] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence > 4:llm:ChatOpenAI] [5.02s] LLM run errored with error:
"APIConnectionError('Connection error.')Traceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 912, in _request\n    response.raise_for_status()\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_models.py\", line 759, in raise_for_status\n    raise HTTPStatusError(message, request=request, response=self)\n\n\nhttpx.HTTPStatusError: Server error '503 Service Unavailable' for url 'https://apejhvxcd.cloud.sealos.io/v1/chat/completions'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503\n\n\n\nDuring handling of the above exception, another exception occurred:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 912, in _request\n    response.raise_for_status()\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_models.py\", line 759, in raise_for_status\n    raise HTTPStatusError(message, request=request, response=self)\n\n\nhttpx.HTTPStatusError: Server error '503 Service Unavailable' for url 'https://apejhvxcd.cloud.sealos.io/v1/chat/completions'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503\n\n\n\nDuring handling of the above exception, another exception occurred:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_exceptions.py\", line 10, in map_exceptions\n    yield\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_backends/sync.py\", line 168, in start_tls\n    raise exc\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_backends/sync.py\", line 163, in start_tls\n    sock = ssl_context.wrap_socket(\n           ^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/ssl.py\", line 517, in wrap_socket\n    return self.sslsocket_class._create(\n           
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/ssl.py\", line 1108, in _create\n    self.do_handshake()\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/ssl.py\", line 1379, in do_handshake\n    self._sslobj.do_handshake()\n\n\nssl.SSLEOFError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)\n\n\n\nThe above exception was the direct cause of the following exception:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py\", line 67, in map_httpcore_exceptions\n    yield\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py\", line 231, in handle_request\n    resp = self._pool.handle_request(req)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py\", line 268, in handle_request\n    raise exc\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py\", line 251, in handle_request\n    response = connection.handle_request(request)\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_sync/http_proxy.py\", line 317, in handle_request\n    stream = stream.start_tls(**kwargs)\n             ^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_backends/sync.py\", line 152, in start_tls\n    with map_exceptions(exc_map):\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py\", line 155, in __exit__\n    self.gen.throw(typ, value, traceback)\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_exceptions.py\", line 14, in map_exceptions\n    raise to_exc(exc) from exc\n\n\nhttpcore.ConnectError: [SSL: 
UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)\n\n\n\nThe above exception was the direct cause of the following exception:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 877, in _request\n    response = self._client.send(\n               ^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_client.py\", line 915, in send\n    response = self._send_handling_auth(\n               ^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_client.py\", line 943, in _send_handling_auth\n    response = self._send_handling_redirects(\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_client.py\", line 980, in _send_handling_redirects\n    response = self._send_single_request(request)\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_client.py\", line 1016, in _send_single_request\n    response = transport.handle_request(request)\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py\", line 230, in handle_request\n    with map_httpcore_exceptions():\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py\", line 155, in __exit__\n    self.gen.throw(typ, value, traceback)\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py\", line 84, in map_httpcore_exceptions\n    raise mapped_exc(message) from exc\n\n\nhttpx.ConnectError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)\n\n\n\nThe above exception was the direct cause of the following exception:\n\n\n\nTraceback (most recent call last):\n\n\n  File 
\"/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py\", line 372, in generate\n    self._generate_with_cache(\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py\", line 528, in _generate_with_cache\n    return self._generate(\n           ^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_community/chat_models/openai.py\", line 435, in _generate\n    response = self.completion_with_retry(\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_community/chat_models/openai.py\", line 352, in completion_with_retry\n    return self.client.create(**kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py\", line 272, in wrapper\n    return func(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions.py\", line 645, in create\n    return self._post(\n           ^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 1088, in post\n    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))\n                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 853, in request\n    return self._request(\n           ^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 916, in _request\n    return self._retry_request(\n           ^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 958, in _retry_request\n    return self._request(\n           ^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 916, in _request\n    return 
self._retry_request(\n           ^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 958, in _retry_request\n    return self._request(\n           ^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 905, in _request\n    raise APIConnectionError(request=request) from err\n\n\nopenai.APIConnectionError: Connection error."
[chain/error] [1:chain:RunnableWithFallbacks > 2:chain:RunnableSequence] [5.03s] Chain run errored with error:
"APIConnectionError('Connection error.')Traceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 912, in _request\n    response.raise_for_status()\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_models.py\", line 759, in raise_for_status\n    raise HTTPStatusError(message, request=request, response=self)\n\n\nhttpx.HTTPStatusError: Server error '503 Service Unavailable' for url 'https://apejhvxcd.cloud.sealos.io/v1/chat/completions'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503\n\n\n\nDuring handling of the above exception, another exception occurred:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 912, in _request\n    response.raise_for_status()\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_models.py\", line 759, in raise_for_status\n    raise HTTPStatusError(message, request=request, response=self)\n\n\nhttpx.HTTPStatusError: Server error '503 Service Unavailable' for url 'https://apejhvxcd.cloud.sealos.io/v1/chat/completions'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503\n\n\n\nDuring handling of the above exception, another exception occurred:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_exceptions.py\", line 10, in map_exceptions\n    yield\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_backends/sync.py\", line 168, in start_tls\n    raise exc\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_backends/sync.py\", line 163, in start_tls\n    sock = ssl_context.wrap_socket(\n           ^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/ssl.py\", line 517, in wrap_socket\n    return self.sslsocket_class._create(\n           
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/ssl.py\", line 1108, in _create\n    self.do_handshake()\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/ssl.py\", line 1379, in do_handshake\n    self._sslobj.do_handshake()\n\n\nssl.SSLEOFError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)\n\n\n\nThe above exception was the direct cause of the following exception:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py\", line 67, in map_httpcore_exceptions\n    yield\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py\", line 231, in handle_request\n    resp = self._pool.handle_request(req)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py\", line 268, in handle_request\n    raise exc\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py\", line 251, in handle_request\n    response = connection.handle_request(request)\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_sync/http_proxy.py\", line 317, in handle_request\n    stream = stream.start_tls(**kwargs)\n             ^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_backends/sync.py\", line 152, in start_tls\n    with map_exceptions(exc_map):\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py\", line 155, in __exit__\n    self.gen.throw(typ, value, traceback)\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpcore/_exceptions.py\", line 14, in map_exceptions\n    raise to_exc(exc) from exc\n\n\nhttpcore.ConnectError: [SSL: 
UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)\n\n\n\nThe above exception was the direct cause of the following exception:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 877, in _request\n    response = self._client.send(\n               ^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_client.py\", line 915, in send\n    response = self._send_handling_auth(\n               ^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_client.py\", line 943, in _send_handling_auth\n    response = self._send_handling_redirects(\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_client.py\", line 980, in _send_handling_redirects\n    response = self._send_single_request(request)\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_client.py\", line 1016, in _send_single_request\n    response = transport.handle_request(request)\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py\", line 230, in handle_request\n    with map_httpcore_exceptions():\n\n\n  File \"/usr/local/Cellar/python@3.11/3.11.6_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py\", line 155, in __exit__\n    self.gen.throw(typ, value, traceback)\n\n\n  File \"/usr/local/lib/python3.11/site-packages/httpx/_transports/default.py\", line 84, in map_httpcore_exceptions\n    raise mapped_exc(message) from exc\n\n\nhttpx.ConnectError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1006)\n\n\n\nThe above exception was the direct cause of the following exception:\n\n\n\nTraceback (most recent call last):\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py\", 
line 1514, in invoke\n    input = step.invoke(\n            ^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py\", line 164, in invoke\n    self.generate_prompt(\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py\", line 495, in generate_prompt\n    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py\", line 382, in generate\n    raise e\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py\", line 372, in generate\n    self._generate_with_cache(\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py\", line 528, in _generate_with_cache\n    return self._generate(\n           ^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_community/chat_models/openai.py\", line 435, in _generate\n    response = self.completion_with_retry(\n               ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/langchain_community/chat_models/openai.py\", line 352, in completion_with_retry\n    return self.client.create(**kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_utils/_utils.py\", line 272, in wrapper\n    return func(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/resources/chat/completions.py\", line 645, in create\n    return self._post(\n           ^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 1088, in post\n    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))\n                           
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 853, in request\n    return self._request(\n           ^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 916, in _request\n    return self._retry_request(\n           ^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 958, in _retry_request\n    return self._request(\n           ^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 916, in _request\n    return self._retry_request(\n           ^^^^^^^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 958, in _retry_request\n    return self._request(\n           ^^^^^^^^^^^^^^\n\n\n  File \"/usr/local/lib/python3.11/site-packages/openai/_base_client.py\", line 905, in _request\n    raise APIConnectionError(request=request) from err\n\n\nopenai.APIConnectionError: Connection error."
[chain/start] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence] Entering Chain run with input:
{
  "animal": "turtle"
}
[chain/start] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence > 6:prompt:PromptTemplate] Entering Prompt run with input:
{
  "animal": "turtle"
}
[chain/end] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence > 6:prompt:PromptTemplate] [3ms] Exiting Prompt run with output:
{
  "lc": 1,
  "type": "constructor",
  "id": [
    "langchain",
    "prompts",
    "base",
    "StringPromptValue"
  ],
  "kwargs": {
    "text": "Instructions: You should always include a compliment in your response.\nQuestion: Why did the turtle cross the road?"
  }
}
[llm/start] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence > 7:llm:OpenAI] Entering LLM run with input:
{
  "prompts": [
    "Instructions: You should always include a compliment in your response.\nQuestion: Why did the turtle cross the road?"
  ]
}
[llm/end] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence > 7:llm:OpenAI] [3.95s] Exiting LLM run with output:
{
  "generations": [
    [
      {
        "text": "\n\nResponse: That's an interesting question! I'm not sure, but I do know that turtles are known for their determination and perseverance. It's incredible how they can navigate their way through different terrains. That turtle must have been on a special mission!",
        "generation_info": {
          "finish_reason": "stop",
          "logprobs": null
        },
        "type": "Generation"
      }
    ]
  ],
  "llm_output": {
    "token_usage": {
      "completion_tokens": 52,
      "total_tokens": 74,
      "prompt_tokens": 22
    },
    "model_name": "gpt-3.5-turbo-instruct"
  },
  "run": null
}
[chain/end] [1:chain:RunnableWithFallbacks > 5:chain:RunnableSequence] [3.96s] Exiting Chain run with output:
{
  "output": "\n\nResponse: That's an interesting question! I'm not sure, but I do know that turtles are known for their determination and perseverance. It's incredible how they can navigate their way through different terrains. That turtle must have been on a special mission!"
}
[chain/end] [1:chain:RunnableWithFallbacks] [9.01s] Exiting Chain run with output:
{
  "output": "\n\nResponse: That's an interesting question! I'm not sure, but I do know that turtles are known for their determination and perseverance. It's incredible how they can navigate their way through different terrains. That turtle must have been on a special mission!"
}
response >>  

Response: That's an interesting question! I'm not sure, but I do know that turtles are known for their determination and perseverance. It's incredible how they can navigate their way through different terrains. That turtle must have been on a special mission!

Code

https://github.com/zgpeace/pets-name-langchain/tree/develop

References

https://python.langchain.com/docs/expression_language/how_to/fallbacks
