
Using the OpenAI API with a Local Ollama Model

I got this working through LiteLLM; a fuller tutorial will follow when I have time. The quick version:

1. Install the proxy: pip install 'litellm[proxy]'
2. Point it at a local Ollama model: litellm --model ollama/qwen:0.5b
3. The proxy now serves an OpenAI-compatible API at http://127.0.0.1:4000/ (see the sketch below).
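
A quick way to verify the proxy is up is to call it with the standard openai client. This is a minimal sketch, assuming the proxy from step 2 is running on its default port; LiteLLM accepts any api_key value unless a master key has been configured:

from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:4000",
    api_key="anything",  # not validated by a default LiteLLM proxy
)
resp = client.chat.completions.create(
    model="ollama/qwen:0.5b",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)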
With the OpenAI Python library (Ollama's own OpenAI-compatible endpoint also works, no proxy needed):

from openai import OpenAI

client = OpenAI(
    base_url='http://localhost:11434/v1',
    api_key='ollama',  # required, but unused
)

response = client.chat.completions.create(
    model="llama2",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Who won the world series in 2020?"},
        {"role": "assistant", "content": "The LA Dodgers won in 2020."},
        {"role": "user", "content": "Where was it played?"}
    ]
)
print(response.choices[0].message.content)
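
The same client can also stream tokens as they are generated. A minimal sketch reusing the client above (the prompt is just an example):

stream = client.chat.completions.create(
    model="llama2",
    messages=[{"role": "user", "content": "Write one sentence about the sea."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:  # the final chunk may carry no content
        print(delta, end="", flush=True)
print()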
The same endpoint also plugs into pandasai:

from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

ollama_llm = LocalLLM(api_base="http://localhost:11434/v1", model="codellama")
df = SmartDataframe("data.csv", config={"llm": ollama_llm})
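
Once configured, questions are answered by the local model. A hypothetical query (assumes data.csv exists and a pandasai version that exposes SmartDataframe.chat):

answer = df.chat("Which column has the highest average value?")
print(answer)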
Another variant of the OpenAI client against Ollama (note that the client class is imported from the openai package, not from ollama):

from openai import OpenAI

client = OpenAI(
    base_url='http://localhost:11434/v1/',
    api_key='ollama',  # api_key is required, but Ollama ignores its value
)

chat_completion = client.chat.completions.create(
    messages=[
        {
            'role': 'user',
            'content': 'Say this is a test',
        }
    ],
    model='llama2',
)
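
The reply comes back in the standard OpenAI response shape:

print(chat_completion.choices[0].message.content)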
Finally, the same request against Ollama's native REST API, without the OpenAI compatibility layer:

import requests

# URL of the native chat endpoint
url = 'http://localhost:11434/api/chat'

# request payload
data = {
    "model": "llama3:latest",
    "messages": [
        {
            "role": "user",
            "content": "Hello, how are you?"
        }
    ],
    "stream": False
}

# send the POST request
response = requests.post(url, json=data)

# print the raw response body
print(response.text)
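
Since "stream" is False, the body is a single JSON object; a minimal sketch of extracting just the assistant text, assuming the documented /api/chat response shape:

body = response.json()
print(body["message"]["content"])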
