
Ollama: A Guide to Running Large Models Locally


This article was written by a front-end engineer on 360's Qiwu Troupe (奇舞团).

Introduction to Ollama

Ollama is an open-source framework, written in Go, for running large language models locally.

Official site: https://ollama.com/

GitHub: https://github.com/ollama/ollama

Installing Ollama

Download and install Ollama

On the Ollama website, pick the installer for your operating system; here we download and install the macOS package. After installation, run ollama in a terminal to see the commands it supports:

```
Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
```

Check the Ollama version

```bash
ollama -v
ollama version is 0.1.31
```

List downloaded models

```bash
ollama list
NAME     ID           SIZE   MODIFIED
gemma:2b b50d6c999e59 1.7 GB 3 hours ago
```

There is already one model on my machine; next, let's see how to download one.

Downloading a model


After installation, Ollama suggests installing the llama2 model by default. Below are some of the models Ollama supports:

| Model | Parameters | Size | Download |
| --- | --- | --- | --- |
| Llama 3 | 8B | 4.7GB | ollama run llama3 |
| Llama 3 | 70B | 40GB | ollama run llama3:70b |
| Mistral | 7B | 4.1GB | ollama run mistral |
| Dolphin Phi | 2.7B | 1.6GB | ollama run dolphin-phi |
| Phi-2 | 2.7B | 1.7GB | ollama run phi |
| Neural Chat | 7B | 4.1GB | ollama run neural-chat |
| Starling | 7B | 4.1GB | ollama run starling-lm |
| Code Llama | 7B | 3.8GB | ollama run codellama |
| Llama 2 Uncensored | 7B | 3.8GB | ollama run llama2-uncensored |
| Llama 2 13B | 13B | 7.3GB | ollama run llama2:13b |
| Llama 2 70B | 70B | 39GB | ollama run llama2:70b |
| Orca Mini | 3B | 1.9GB | ollama run orca-mini |
| LLaVA | 7B | 4.5GB | ollama run llava |
| Gemma | 2B | 1.4GB | ollama run gemma:2b |
| Gemma | 7B | 4.8GB | ollama run gemma:7b |
| Solar | 10.7B | 6.1GB | ollama run solar |

Llama 3 is the large language model Meta open-sourced on April 19, 2024; it comes in 8B- and 70B-parameter versions, both of which Ollama already supports.

Here we install gemma:2b. Open a terminal and run:

```bash
ollama run gemma:2b
```

```
pulling manifest
pulling c1864a5eb193... 100% ▕██████████████████████████████████████████████████████████▏ 1.7 GB
pulling 097a36493f71... 100% ▕██████████████████████████████████████████████████████████▏ 8.4 KB
pulling 109037bec39c... 100% ▕██████████████████████████████████████████████████████████▏  136 B
pulling 22a838ceb7fb... 100% ▕██████████████████████████████████████████████████████████▏   84 B
pulling 887433b89a90... 100% ▕██████████████████████████████████████████████████████████▏  483 B
verifying sha256 digest
writing manifest
removing any unused layers
success
```

After a short wait, the model finishes downloading.

The table above lists only some of the models Ollama supports; more are available at https://ollama.com/library, including Chinese models such as Alibaba's Qwen (通义千问).

Chatting in the terminal

Once the download finishes, you can chat directly in the terminal, for example asking "介绍一下React" (introduce React):

>>> 介绍一下React

The answer is streamed back in the terminal.

Help command: /?

```
>>> /?
Available Commands:
  /set            Set session variables
  /show           Show model information
  /load <model>   Load a session or model
  /save <model>   Save your current session
  /bye            Exit
  /?, /help       Help for a command
  /? shortcuts    Help for keyboard shortcuts

Use """ to begin a multi-line message.
```

Model information command: /show

```
>>> /show
Available Commands:
  /show info         Show details for this model
  /show license      Show model license
  /show modelfile    Show Modelfile for this model
  /show parameters   Show parameters for this model
  /show system       Show system message
  /show template     Show prompt template
```

Model details command: /show info

```
>>> /show info
Model details:
Family              gemma
Parameter Size      3B
Quantization Level  Q4_0
```

Calling the API

Besides chatting directly in the terminal, Ollama also exposes an HTTP API. For example, running ollama show --help reveals the local server address: http://localhost:11434.

```bash
ollama show --help
```

```
Show information for a model

Usage:
  ollama show MODEL [flags]

Flags:
  -h, --help         help for show
      --license      Show license of a model
      --modelfile    Show Modelfile of a model
      --parameters   Show parameters of a model
      --system       Show system message of a model
      --template     Show template of a model

Environment Variables:
      OLLAMA_HOST        The host:port or base URL of the Ollama server (e.g. http://localhost:11434)
```
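As the help text notes, clients locate the server via the OLLAMA_HOST environment variable, falling back to the default port. A minimal Python sketch of that resolution logic (the helper name `ollama_base_url` is ours, not part of any Ollama SDK):

```python
import os

def ollama_base_url() -> str:
    """Resolve the Ollama server address: honor OLLAMA_HOST if set,
    otherwise fall back to the default local address."""
    host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
    # OLLAMA_HOST may be a bare host:port; prepend a scheme if missing.
    if not host.startswith(("http://", "https://")):
        host = "http://" + host
    return host

print(ollama_base_url())
```

With no environment variable set this prints http://localhost:11434, matching the address used in the curl examples below.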

Below we introduce the two main APIs: generate and chat.

generate

  • Streaming response

```bash
curl http://localhost:11434/api/generate -d '{
  "model": "gemma:2b",
  "prompt": "介绍一下React,20字以内"
}'
```
```
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.337192Z","response":"React","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.421481Z","response":" 是","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.503852Z","response":"一个","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.584813Z","response":"用于","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.672575Z","response":"构建","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.754663Z","response":"用户","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.837639Z","response":"界面","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.918767Z","response":"(","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:32.998863Z","response":"UI","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.080361Z","response":")","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.160418Z","response":"的","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.239247Z","response":" JavaScript","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.318396Z","response":" 库","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.484203Z","response":"。","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.671075Z","response":"它","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.751622Z","response":"允许","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.833298Z","response":"开发者","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:33.919385Z","response":"轻松","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.007706Z","response":"构建","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.09201Z","response":"可","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.174897Z","response":"重","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.414743Z","response":"用的","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.497013Z","response":" UI","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.584026Z","response":",","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.669825Z","response":"并","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.749524Z","response":"与","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.837544Z","response":"各种","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:34.927049Z","response":" JavaScript","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:35.008527Z","response":" ","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:35.088936Z","response":"框架","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:35.176094Z","response":"一起","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:35.255251Z","response":"使用","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:35.34085Z","response":"。","done":false}
{"model":"gemma:2b","created_at":"2024-04-19T10:12:35.428575Z","response":"","done":true,"context":[106,1645,108,25661,18071,22469,235365,235284,235276,235960,179621,107,108,106,2516,108,22469,23437,5121,40163,81964,16464,57881,235538,5639,235536,235370,22978,185852,235362,236380,64032,227725,64727,81964,235553,235846,37694,13566,235365,236203,235971,34384,22978,235248,90141,19600,7060,235362,107,108],"total_duration":3172809302,"load_duration":983863,"prompt_eval_duration":80181000,"eval_count":34,"eval_duration":3090973000}
```
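Each line above is a standalone JSON object whose `response` field holds one fragment of the answer; a client reassembles the full text by concatenating fragments until it sees `"done": true`. A minimal sketch of that assembly step, run here on hard-coded sample lines rather than a live connection (the function name `collect_stream` is ours):

```python
import json

def collect_stream(ndjson_lines):
    """Concatenate the `response` fragments of a streamed /api/generate
    reply into the full answer. The final chunk (done == true) carries
    timing stats instead of text, so we stop there."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Sample chunks in the same shape as the output above.
sample = [
    '{"model":"gemma:2b","response":"React","done":false}',
    '{"model":"gemma:2b","response":" 是","done":false}',
    '{"model":"gemma:2b","response":"一个","done":false}',
    '{"model":"gemma:2b","response":"","done":true}',
]
print(collect_stream(sample))  # React 是一个
```

In a real client you would feed this function the response body line by line as it arrives, which is what makes streaming useful for showing partial answers.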
  • Non-streaming response

Setting the "stream": false parameter returns the whole response in one piece.

```bash
curl http://localhost:11434/api/generate -d '{
  "model": "gemma:2b",
  "prompt": "介绍一下React,20字以内",
  "stream": false
}'
```

```json
{
  "model": "gemma:2b",
  "created_at": "2024-04-19T08:53:14.534085Z",
  "response": "React 是一个用于构建用户界面的大型 JavaScript 库,允许您轻松创建动态的网站和应用程序。",
  "done": true,
  "context": [106, 1645, 108, 25661, 18071, 22469, 235365, 235284, 235276, 235960, 179621, 107, 108, 106, 2516, 108, 22469, 23437, 5121, 40163, 81964, 16464, 236074, 26546, 66240, 22978, 185852, 235365, 64032, 236552, 64727, 22957, 80376, 235370, 37188, 235581, 79826, 235362, 107, 108],
  "total_duration": 1864443127,
  "load_duration": 2426249,
  "prompt_eval_duration": 101635000,
  "eval_count": 23,
  "eval_duration": 1757523000
}
```
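The trailing fields are useful for quick benchmarking: durations are reported in nanoseconds, so generation speed is `eval_count` divided by `eval_duration` converted to seconds. A small sketch using the stats from the reply above (the helper name is ours):

```python
def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Ollama reports durations in nanoseconds; speed is
    tokens generated / generation time in seconds."""
    return eval_count / (eval_duration_ns / 1e9)

# Stats from the non-streaming reply above: 23 tokens in ~1.76 s.
print(round(tokens_per_second(23, 1757523000), 1))  # 13.1
```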

chat

  • Streaming response

```bash
curl http://localhost:11434/api/chat -d '{
  "model": "gemma:2b",
  "messages": [
    { "role": "user", "content": "介绍一下React,20字以内" }
  ]
}'
```

The terminal shows:

```
{"model":"gemma:2b","created_at":"2024-04-19T08:45:54.86791Z","message":{"role":"assistant","content":"React"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:54.949168Z","message":{"role":"assistant","content":"是"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.034272Z","message":{"role":"assistant","content":"用于"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.119119Z","message":{"role":"assistant","content":"构建"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.201837Z","message":{"role":"assistant","content":"用户"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.286611Z","message":{"role":"assistant","content":"界面"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.37054Z","message":{"role":"assistant","content":" React"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.45099Z","message":{"role":"assistant","content":"."},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.534105Z","message":{"role":"assistant","content":"js"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.612744Z","message":{"role":"assistant","content":"框架"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.695129Z","message":{"role":"assistant","content":","},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.775357Z","message":{"role":"assistant","content":"允许"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.855803Z","message":{"role":"assistant","content":"开发者"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:55.936518Z","message":{"role":"assistant","content":"轻松"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:56.012203Z","message":{"role":"assistant","content":"地"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:56.098045Z","message":{"role":"assistant","content":"创建"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:56.178332Z","message":{"role":"assistant","content":"动态"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:56.255488Z","message":{"role":"assistant","content":"网页"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:56.336361Z","message":{"role":"assistant","content":"。"},"done":false}
{"model":"gemma:2b","created_at":"2024-04-19T08:45:56.415904Z","message":{"role":"assistant","content":""},"done":true,"total_duration":2057551864,"load_duration":568391,"prompt_eval_count":11,"prompt_eval_duration":506238000,"eval_count":20,"eval_duration":1547724000}
```

Streaming is the default; as with generate, setting "stream": false returns the response in one piece.

The difference between generate and chat: generate performs a single, stateless completion, while chat accepts a message history and therefore supports multi-turn conversations.
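Since the server keeps no conversation state, a multi-turn client must resend the full history on every /api/chat request. A minimal sketch of building such a payload (the function name `build_chat_payload` is ours; the request shape matches the curl example above):

```python
import json

def build_chat_payload(history, user_message, model="gemma:2b"):
    """Build a /api/chat request body: the accumulated history of
    user/assistant turns, plus the new user message appended at the end."""
    messages = history + [{"role": "user", "content": user_message}]
    return json.dumps({"model": model, "messages": messages})

# First turn and its answer, carried into the second request.
history = [
    {"role": "user", "content": "介绍一下React,20字以内"},
    {"role": "assistant", "content": "React 是用于构建用户界面的 JavaScript 库。"},
]
payload = build_chat_payload(history, "它和 Vue 有什么区别?")
print(payload)
```

After each reply, the client appends the assistant message to `history` before the next call; with generate there is no such list, only a single prompt string.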

Web UI

Besides the terminal and API approaches above, there are also many open-source Web UIs that let you set up a local, visual chat page on top of Ollama, for example:

  • open-webui

https://github.com/open-webui/open-webui

  • lollms-webui

https://github.com/ParisNeo/lollms-webui

Ollama has made the learning curve for running large models locally very low; if you're interested, try deploying one yourself.
