An alternative approach when your OpenAI ChatGPT API account is deactivated: calling the API through a relay server (series)
The steps are as follows.
The author uses macOS; first, add the configuration to the .zshrc file:
export WANDOU_OPENAI_API_KEY='sk-xxx'
export WANDOU_BASE_URL="https://apejhvxcd.cloud.sealos.io/v1"
Reload the configuration:
source ~/.zshrc
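To confirm the exported variables are visible to Python, here is a quick check (assumption: it is run in a fresh terminal session so the updated .zshrc has taken effect):

import os

# Quick sanity check that the relay variables are exported.
print(os.getenv("WANDOU_OPENAI_API_KEY"))
print(os.getenv("WANDOU_BASE_URL"))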
An example of calling the API:
import os
from langchain.llms import OpenAI

API_SECRET_KEY = "your key on the relay"
BASE_URL = "https://apejhvxcd.cloud.sealos.io/v1"

os.environ["OPENAI_API_KEY"] = API_SECRET_KEY
os.environ["OPENAI_API_BASE"] = BASE_URL

def text():
    llm = OpenAI(temperature=0.9)
    text = "What would be a good company name for a company that makes colorful socks?"
    print(llm(text))

if __name__ == '__main__':
    text()
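To sanity-check the relay without LangChain, here is a minimal sketch using requests, assuming the relay mirrors the standard OpenAI chat-completions endpoint and that the environment variables above are already set:

import os
import requests

# Assumption: the relay exposes the standard OpenAI-style /chat/completions path.
resp = requests.post(
    f"{os.environ['OPENAI_API_BASE']}/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Say hello"}],
    },
    timeout=30,
)
print(resp.status_code, resp.json())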
A pitfall hit while running the code: executing
print(os.environ["OPENAI_API_KEY"])
showed the OpenAI token, not the relay's.
Fix: in the project configuration (where the .env file is loaded), read the relay variables explicitly:
OPENAI_API_KEY = os.getenv("WANDOU_OPENAI_API_KEY")
OPENAI_API_BASE = os.getenv("WANDOU_BASE_URL")
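A minimal sketch of what that mapping can look like at startup (assuming python-dotenv is installed; the WANDOU_* names come from the .zshrc step above):

import os
from dotenv import load_dotenv

load_dotenv()  # load a local .env file if one exists

# Map the relay variables onto the names the OpenAI/LangChain clients read.
os.environ["OPENAI_API_KEY"] = os.getenv("WANDOU_OPENAI_API_KEY", "")
os.environ["OPENAI_API_BASE"] = os.getenv("WANDOU_BASE_URL", "")

print(os.environ["OPENAI_API_KEY"])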
Printing the key again now shows the relay's token. The relay link the author uses also comes with free USD credits.
The timing helper, saved as calc_time.py (it is imported by the example below):

import time

def measure_execution_time(func, *args, **kwargs):
    """Measure and return the execution time of the given function.

    Args:
        func: the function to measure.
        *args: positional arguments passed to the function.
        **kwargs: keyword arguments passed to the function.

    Returns:
        tuple: (result of the call, elapsed time in seconds)
    """
    start_time = time.perf_counter()
    result = func(*args, **kwargs)
    end_time = time.perf_counter()
    elapsed_time = end_time - start_time
    return result, elapsed_time

# Example: assuming chain1 is already defined and has an invoke method
# chain1 = ...
# result, exec_time = measure_execution_time(chain1.invoke, {"topic": "bears"})
# print(f"Result: {result}")
# print(f"Elapsed: {exec_time} seconds")
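A quick sanity check of the helper with an ordinary function (slow_add is a hypothetical example, not part of the original project):

import time
from calc_time import measure_execution_time  # the helper above

def slow_add(a, b):
    """A deliberately slow function used only to exercise the timer."""
    time.sleep(0.5)
    return a + b

result, elapsed = measure_execution_time(slow_add, 2, 3)
print(result, elapsed)  # 5 and roughly 0.5 seconds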
The parallel-execution example (LCEL/parallelism.py):

import os
from dotenv import load_dotenv
from langchain_core.runnables import RunnableParallel
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI

from calc_time import measure_execution_time

load_dotenv()

# print(os.environ["OPENAI_API_KEY"])

# model = ChatOpenAI()
model = ChatOpenAI(model="gpt-3.5-turbo")

chain1 = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
chain2 = (
    ChatPromptTemplate.from_template("write a short (2 line) poem about {topic}")
    | model
)
combined = RunnableParallel(joke=chain1, poem=chain2)

# Uncomment to time the individual chains and combined chain for comparison:
# result, exec_time = measure_execution_time(chain1.invoke, {"topic": "bears"})
# result, exec_time = measure_execution_time(chain2.invoke, {"topic": "bears"})
# result, exec_time = measure_execution_time(combined.invoke, {"topic": "bears"})
# result, exec_time = measure_execution_time(chain1.batch, [{"topic": "bears"}, {"topic": "cats"}])
# result, exec_time = measure_execution_time(chain2.batch, [{"topic": "bears"}, {"topic": "cats"}])

result, exec_time = measure_execution_time(
    combined.batch, [{"topic": "bears"}, {"topic": "cats"}, {"topic": "dogs"}]
)
print(f"Result: {result}")
print(f"Elapsed: {exec_time} seconds")
Run output:
zgpeace at zgpeaces-MacBook-Pro in ~/Workspace/LLM/langchain-llm-app on develop!
(.venv) ± python LCEL/parallelism.py
Result: [{'joke': AIMessage(content="Why don't bears wear shoes?\n\nBecause they have bear feet!"), 'poem': AIMessage(content="Majestic bears roam,\nNature's guardians, fierce yet kind.")}, {'joke': AIMessage(content="Sure, here's a cat joke for you:\n\nWhy was the cat sitting on the computer?\n\nBecause it wanted to keep an eye on the mouse!"), 'poem': AIMessage(content="Whiskers glide, tails entwine,\nGraceful feline, nature's design.")}, {'joke': AIMessage(content="Sure, here's a dog-related joke for you:\n\nWhy don't dogs make good dancers?\n\nBecause they have two left feet!"), 'poem': AIMessage(content='Loyal companions, hearts pure and true,\nDogs bring joy, love, and laughter to you.')}]
Elapsed: 3.3068008899572305 seconds
https://docs.qq.com/doc/DRXJxV1Nad3hsb3ZL
https://github.com/zgpeace/pets-name-langchain/tree/develop