
The Road to AI Agent Development - Engineering (2): One-Click Deployment of the Dify Agent Development Platform


Blog series guide:

AI Engineering Series

The Road to AI Agent Development - Engineering (1): Boosting AI Agent Development Efficiency with Docker

The Road to AI Agent Development - Engineering (2): One-Click Deployment of the Dify Agent Development Platform

The Road to AI Agent Development - Engineering (3): One-Click Deployment of the Ollama LLM Inference Serving Framework

The Road to AI Agent Development - Engineering (4): One-Click Deployment of the Xinference LLM Inference Serving Framework

The Road to AI Agent Development - Engineering (5): One-Click Deployment of the LocalAI LLM Inference Serving Framework

AI Models Series

The Road to AI Agent Development - Models (1): Installing, Deploying, and Using the LLaMA-Factory LLM Training Framework on a Restricted Domestic Network

The Road to AI Agent Development - Models (2): Hands-On Training and Inference with DeepSeek-V2-Chat

Contents

1. Introduction

2. Deploying Dify with a Single docker compose Command

3. Key Features of Dify

3.1 Integration with Multiple LLMs

3.2 Rich Built-in Tools + Custom Tool Support

3.3 Workflows

3.4 Agent Orchestration

4. Summary


1. Introduction

I have only just started writing on CSDN. After work I was poking around to see how the platform works and noticed that the activity area had launched a topic on "The Future of Agent AI". Since most of my recent work happens to be concentrated in exactly this area, I am using the topic as an excuse to write down some of my own views.

To sum up my current job: it is AI agent development. Our team built up some groundwork on top of LangChain and stood up an internal AI agent development platform that supports knowledge bases, tool calling, and configuration of multiple LLMs. But once ByteDance's Coze and Baidu's Qianfan launched, it became obvious how fast agent development platforms are moving. Compared with Coze, with its rich plugin tools, stable models, and mature workflows, our team simply cannot keep up given its headcount: everyone has to deliver their OKRs and build their own AI agents, and only whatever energy is left over goes into improving the platform itself.

Just before the May Day holiday, my manager dropped an introduction into the group chat about a startup building AI applications on Dify. I looked into Dify with a learner's mindset and came away impressed: it can fairly be called an open-source Coze! It supports one-click integration of multiple LLMs and plugin tools, automatic chunking and indexing when building knowledge bases, and workflow development. These are exactly the features our internal platform still needed to add, so I immediately started deploying and experimenting with it.

First, the Dify project on GitHub: https://github.com/langgenius/dify

"The project was created by former members of Tencent Cloud's DevOps team. We found that building GPT applications directly on the OpenAI API was somewhat cumbersome. Drawing on our years of experience building developer-productivity tools, we want to enable more people to build interesting applications with natural language."

Dify is maintained by a full-time team of 10+ people and 100+ community contributors, and it iterates very quickly: it was at 0.6.3 when I downloaded it, it is already at 0.6.6, and a new version lands roughly every week. A word of caution here: for open-source projects that iterate this frequently, resist the urge to do secondary development on an internal fork. Otherwise, by the time you finish one feature, the community has shipped another release, the logic conflicts, and the branches can no longer be merged. Awkward. I once heard of a team that "built its own" deep-learning training framework on top of TensorFlow 1.x; years later the community had long since moved to 2.x, and that team was still scrambling to keep up on 1.x. Anyway, I digress.

Back to this article. A title like "The Future of Agent AI: An Introduction to Dify" would be somewhat of an exaggeration, but I do believe that understanding Dify, the upstream and downstream technologies and architecture it depends on, and using it to quickly stand up AI agent prototypes is genuinely helpful for advancing agent development. Today I will briefly cover the deployment process; later posts will go into agent development on Dify and the underlying technologies it relies on.

2. Deploying Dify with a Single docker compose Command

First, clone the Dify project onto your server:

git clone https://github.com/langgenius/dify.git
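Because Dify iterates so quickly, you may want the checked-out source to match the image versions pinned in the compose file. A minimal sketch, assuming the repository tags its releases by version number (whether a 0.6.6 tag exists is an assumption; 0.6.6 is simply the version current at the time of writing):

cd dify
# Pin the working tree to a release tag matching the images below (assumed tag name)
git checkout 0.6.6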

The project is split into two main parts, api (backend) and web (frontend); I will analyze the code itself in a later post. For now, enter the docker directory:

cd docker

The directory contains several files. docker-compose.yaml starts all services and their dependencies directly with docker compose, while docker-compose.middleware.yaml lets you bring up only the dependencies (the relational database, the vector database, and so on) first and then start the api and web services separately. The isolation here is done remarkably well; the two modes are sketched below.
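A minimal sketch of the two startup modes, assuming only the file names mentioned above and the standard -f flag of docker compose (the exact steps for running api and web from source in middleware mode are documented in the Dify repo):

# All-in-one: start every service and its dependencies together
docker compose up -d

# Middleware only: start just the databases and other dependencies,
# then run the api and web services separately (e.g., from source)
docker compose -f docker-compose.middleware.yaml up -d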

Let's first look at the contents of docker-compose.yaml:

version: '3'
services:
  # API service
  api:
    image: langgenius/dify-api:0.6.6
    restart: always
    environment:
      # Startup mode, 'api' starts the API server.
      MODE: api
      # The log level for the application. Supported values are `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`
      LOG_LEVEL: INFO
      # A secret key that is used for securely signing the session cookie and encrypting sensitive information on the database. You can generate a strong key using `openssl rand -base64 42`.
      SECRET_KEY: sk-9f73s3ljTXVcMT3Blb3ljTqtsKiGHXVcMT3BlbkFJLK7U
      # The base URL of console application web frontend, refers to the Console base URL of WEB service if console domain is
      # different from api or web app domain.
      # example: http://cloud.dify.ai
      CONSOLE_WEB_URL: ''
      # Password for admin user initialization.
      # If left unset, admin user will not be prompted for a password when creating the initial admin account.
      INIT_PASSWORD: ''
      # The base URL of console application api server, refers to the Console base URL of WEB service if console domain is
      # different from api or web app domain.
      # example: http://cloud.dify.ai
      CONSOLE_API_URL: ''
      # The URL prefix for Service API endpoints, refers to the base URL of the current API service if api domain is
      # different from console domain.
      # example: http://api.dify.ai
      SERVICE_API_URL: ''
      # The URL prefix for Web APP frontend, refers to the Web App base URL of WEB service if web app domain is different from
      # console or api domain.
      # example: http://udify.app
      APP_WEB_URL: ''
      # File preview or download Url prefix.
      # used to display File preview or download Url to the front-end or as Multi-model inputs;
      # Url is signed and has expiration time.
      FILES_URL: ''
      # When enabled, migrations will be executed prior to application startup and the application will start after the migrations have completed.
      MIGRATION_ENABLED: 'true'
      # The configurations of postgres database connection.
      # It is consistent with the configuration in the 'db' service below.
      DB_USERNAME: postgres
      DB_PASSWORD: difyai123456
      DB_HOST: db
      DB_PORT: 5432
      DB_DATABASE: dify
      # The configurations of redis connection.
      # It is consistent with the configuration in the 'redis' service below.
      REDIS_HOST: redis
      REDIS_PORT: 6379
      REDIS_USERNAME: ''
      REDIS_PASSWORD: difyai123456
      REDIS_USE_SSL: 'false'
      # use redis db 0 for redis cache
      REDIS_DB: 0
      # The configurations of celery broker.
      # Use redis as the broker, and redis db 1 for celery broker.
      CELERY_BROKER_URL: redis://:difyai123456@redis:6379/1
      # Specifies the allowed origins for cross-origin requests to the Web API, e.g. https://dify.app or * for all origins.
      WEB_API_CORS_ALLOW_ORIGINS: '*'
      # Specifies the allowed origins for cross-origin requests to the console API, e.g. https://cloud.dify.ai or * for all origins.
      CONSOLE_CORS_ALLOW_ORIGINS: '*'
      # CSRF Cookie settings
      # Controls whether a cookie is sent with cross-site requests,
      # providing some protection against cross-site request forgery attacks
      #
      # Default: `SameSite=Lax, Secure=false, HttpOnly=true`
      # This default configuration supports same-origin requests using either HTTP or HTTPS,
      # but does not support cross-origin requests. It is suitable for local debugging purposes.
      #
      # If you want to enable cross-origin support,
      # you must use the HTTPS protocol and set the configuration to `SameSite=None, Secure=true, HttpOnly=true`.
      #
      # The type of storage to use for storing user files. Supported values are `local` and `s3` and `azure-blob` and `google-storage`, Default: `local`
      STORAGE_TYPE: local
      # The path to the local storage directory, the directory relative the root path of API service codes or absolute path. Default: `storage` or `/home/john/storage`.
      # only available when STORAGE_TYPE is `local`.
      STORAGE_LOCAL_PATH: storage
      # The S3 storage configurations, only available when STORAGE_TYPE is `s3`.
      S3_ENDPOINT: 'https://xxx.r2.cloudflarestorage.com'
      S3_BUCKET_NAME: 'difyai'
      S3_ACCESS_KEY: 'ak-difyai'
      S3_SECRET_KEY: 'sk-difyai'
      S3_REGION: 'us-east-1'
      # The Azure Blob storage configurations, only available when STORAGE_TYPE is `azure-blob`.
      AZURE_BLOB_ACCOUNT_NAME: 'difyai'
      AZURE_BLOB_ACCOUNT_KEY: 'difyai'
      AZURE_BLOB_CONTAINER_NAME: 'difyai-container'
      AZURE_BLOB_ACCOUNT_URL: 'https://<your_account_name>.blob.core.windows.net'
      # The Google storage configurations, only available when STORAGE_TYPE is `google-storage`.
      GOOGLE_STORAGE_BUCKET_NAME: 'yout-bucket-name'
      GOOGLE_STORAGE_SERVICE_ACCOUNT_JSON_BASE64: 'your-google-service-account-json-base64-string'
      # The type of vector store to use. Supported values are `weaviate`, `qdrant`, `milvus`, `relyt`.
      VECTOR_STORE: weaviate
      # The Weaviate endpoint URL. Only available when VECTOR_STORE is `weaviate`.
      WEAVIATE_ENDPOINT: http://weaviate:8080
      # The Weaviate API key.
      WEAVIATE_API_KEY: WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih
      # The Qdrant endpoint URL. Only available when VECTOR_STORE is `qdrant`.
      QDRANT_URL: http://qdrant:6333
      # The Qdrant API key.
      QDRANT_API_KEY: difyai123456
      # The Qdrant client timeout setting.
      QDRANT_CLIENT_TIMEOUT: 20
      # The Qdrant client enable gRPC mode.
      QDRANT_GRPC_ENABLED: 'false'
      # The Qdrant server gRPC mode PORT.
      QDRANT_GRPC_PORT: 6334
      # Milvus configuration Only available when VECTOR_STORE is `milvus`.
      # The milvus host.
      MILVUS_HOST: 127.0.0.1
      # The milvus port.
      MILVUS_PORT: 19530
      # The milvus username.
      MILVUS_USER: root
      # The milvus password.
      MILVUS_PASSWORD: Milvus
      # The milvus tls switch.
      MILVUS_SECURE: 'false'
      # relyt configurations
      RELYT_HOST: db
      RELYT_PORT: 5432
      RELYT_USER: postgres
      RELYT_PASSWORD: difyai123456
      RELYT_DATABASE: postgres
      # Mail configuration, support: resend, smtp
      MAIL_TYPE: ''
      # default send from email address, if not specified
      MAIL_DEFAULT_SEND_FROM: 'YOUR EMAIL FROM (eg: no-reply <no-reply@dify.ai>)'
      SMTP_SERVER: ''
      SMTP_PORT: 587
      SMTP_USERNAME: ''
      SMTP_PASSWORD: ''
      SMTP_USE_TLS: 'true'
      # the api-key for resend (https://resend.com)
      RESEND_API_KEY: ''
      RESEND_API_URL: https://api.resend.com
      # The DSN for Sentry error reporting. If not set, Sentry error reporting will be disabled.
      SENTRY_DSN: ''
      # The sample rate for Sentry events. Default: `1.0`
      SENTRY_TRACES_SAMPLE_RATE: 1.0
      # The sample rate for Sentry profiles. Default: `1.0`
      SENTRY_PROFILES_SAMPLE_RATE: 1.0
      # Notion import configuration, support public and internal
      NOTION_INTEGRATION_TYPE: public
      NOTION_CLIENT_SECRET: you-client-secret
      NOTION_CLIENT_ID: you-client-id
      NOTION_INTERNAL_SECRET: you-internal-secret
      # The sandbox service endpoint.
      CODE_EXECUTION_ENDPOINT: "http://sandbox:8194"
      CODE_EXECUTION_API_KEY: dify-sandbox
      CODE_MAX_NUMBER: 9223372036854775807
      CODE_MIN_NUMBER: -9223372036854775808
      CODE_MAX_STRING_LENGTH: 80000
      TEMPLATE_TRANSFORM_MAX_LENGTH: 80000
      CODE_MAX_STRING_ARRAY_LENGTH: 30
      CODE_MAX_OBJECT_ARRAY_LENGTH: 30
      CODE_MAX_NUMBER_ARRAY_LENGTH: 1000
    depends_on:
      - db
      - redis
    volumes:
      # Mount the storage directory to the container, for storing user files.
      - ./volumes/app/storage:/app/api/storage
    # uncomment to expose dify-api port to host
    # ports:
    #   - "5001:5001"

  # worker service
  # The Celery worker for processing the queue.
  worker:
    image: langgenius/dify-api:0.6.6
    restart: always
    environment:
      # Startup mode, 'worker' starts the Celery worker for processing the queue.
      MODE: worker
      # --- All the configurations below are the same as those in the 'api' service. ---
      # The log level for the application. Supported values are `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`
      LOG_LEVEL: INFO
      # A secret key that is used for securely signing the session cookie and encrypting sensitive information on the database. You can generate a strong key using `openssl rand -base64 42`.
      # same as the API service
      SECRET_KEY: sk-9f73s3ljTXVcMT3Blb3ljTqtsKiGHXVcMT3BlbkFJLK7U
      # The configurations of postgres database connection.
      # It is consistent with the configuration in the 'db' service below.
      DB_USERNAME: postgres
      DB_PASSWORD: difyai123456
      DB_HOST: db
      DB_PORT: 5432
      DB_DATABASE: dify
      # The configurations of redis cache connection.
      REDIS_HOST: redis
      REDIS_PORT: 6379
      REDIS_USERNAME: ''
      REDIS_PASSWORD: difyai123456
      REDIS_DB: 0
      REDIS_USE_SSL: 'false'
      # The configurations of celery broker.
      CELERY_BROKER_URL: redis://:difyai123456@redis:6379/1
      # The type of storage to use for storing user files. Supported values are `local` and `s3` and `azure-blob`, Default: `local`
      STORAGE_TYPE: local
      STORAGE_LOCAL_PATH: storage
      # The S3 storage configurations, only available when STORAGE_TYPE is `s3`.
      S3_ENDPOINT: 'https://xxx.r2.cloudflarestorage.com'
      S3_BUCKET_NAME: 'difyai'
      S3_ACCESS_KEY: 'ak-difyai'
      S3_SECRET_KEY: 'sk-difyai'
      S3_REGION: 'us-east-1'
      # The Azure Blob storage configurations, only available when STORAGE_TYPE is `azure-blob`.
      AZURE_BLOB_ACCOUNT_NAME: 'difyai'
      AZURE_BLOB_ACCOUNT_KEY: 'difyai'
      AZURE_BLOB_CONTAINER_NAME: 'difyai-container'
      AZURE_BLOB_ACCOUNT_URL: 'https://<your_account_name>.blob.core.windows.net'
      # The type of vector store to use. Supported values are `weaviate`, `qdrant`, `milvus`, `relyt`.
      VECTOR_STORE: weaviate
      # The Weaviate endpoint URL. Only available when VECTOR_STORE is `weaviate`.
      WEAVIATE_ENDPOINT: http://weaviate:8080
      # The Weaviate API key.
      WEAVIATE_API_KEY: WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih
      # The Qdrant endpoint URL. Only available when VECTOR_STORE is `qdrant`.
      QDRANT_URL: http://qdrant:6333
      # The Qdrant API key.
      QDRANT_API_KEY: difyai123456
      # The Qdrant client timeout setting.
      QDRANT_CLIENT_TIMEOUT: 20
      # The Qdrant client enable gRPC mode.
      QDRANT_GRPC_ENABLED: 'false'
      # The Qdrant server gRPC mode PORT.
      QDRANT_GRPC_PORT: 6334
      # Milvus configuration Only available when VECTOR_STORE is `milvus`.
      # The milvus host.
      MILVUS_HOST: 127.0.0.1
      # The milvus port.
      MILVUS_PORT: 19530
      # The milvus username.
      MILVUS_USER: root
      # The milvus password.
      MILVUS_PASSWORD: Milvus
      # The milvus tls switch.
      MILVUS_SECURE: 'false'
      # Mail configuration, support: resend
      MAIL_TYPE: ''
      # default send from email address, if not specified
      MAIL_DEFAULT_SEND_FROM: 'YOUR EMAIL FROM (eg: no-reply <no-reply@dify.ai>)'
      # the api-key for resend (https://resend.com)
      RESEND_API_KEY: ''
      RESEND_API_URL: https://api.resend.com
      # relyt configurations
      RELYT_HOST: db
      RELYT_PORT: 5432
      RELYT_USER: postgres
      RELYT_PASSWORD: difyai123456
      RELYT_DATABASE: postgres
      # Notion import configuration, support public and internal
      NOTION_INTEGRATION_TYPE: public
      NOTION_CLIENT_SECRET: you-client-secret
      NOTION_CLIENT_ID: you-client-id
      NOTION_INTERNAL_SECRET: you-internal-secret
    depends_on:
      - db
      - redis
    volumes:
      # Mount the storage directory to the container, for storing user files.
      - ./volumes/app/storage:/app/api/storage

  # Frontend web application.
  web:
    image: langgenius/dify-web:0.6.6
    restart: always
    environment:
      # The base URL of console application api server, refers to the Console base URL of WEB service if console domain is
      # different from api or web app domain.
      # example: http://cloud.dify.ai
      CONSOLE_API_URL: ''
      # The URL for Web APP api server, refers to the Web App base URL of WEB service if web app domain is different from
      # console or api domain.
      # example: http://udify.app
      APP_API_URL: ''
      # The DSN for Sentry error reporting. If not set, Sentry error reporting will be disabled.
      SENTRY_DSN: ''
    # uncomment to expose dify-web port to host
    # ports:
    #   - "3000:3000"

  # The postgres database.
  db:
    image: postgres:15-alpine
    restart: always
    environment:
      PGUSER: postgres
      # The password for the default postgres user.
      POSTGRES_PASSWORD: difyai123456
      # The name of the default postgres database.
      POSTGRES_DB: dify
      # postgres data directory
      PGDATA: /var/lib/postgresql/data/pgdata
    volumes:
      - ./volumes/db/data:/var/lib/postgresql/data
    # uncomment to expose db(postgresql) port to host
    # ports:
    #   - "5432:5432"
    healthcheck:
      test: [ "CMD", "pg_isready" ]
      interval: 1s
      timeout: 3s
      retries: 30

  # The redis cache.
  redis:
    image: redis:6-alpine
    restart: always
    volumes:
      # Mount the redis data directory to the container.
      - ./volumes/redis/data:/data
    # Set the redis password when startup redis server.
    command: redis-server --requirepass difyai123456
    healthcheck:
      test: [ "CMD", "redis-cli", "ping" ]
    # uncomment to expose redis port to host
    # ports:
    #   - "6379:6379"

  # The Weaviate vector store.
  weaviate:
    image: semitechnologies/weaviate:1.19.0
    restart: always
    volumes:
      # Mount the Weaviate data directory to the container.
      - ./volumes/weaviate:/var/lib/weaviate
    environment:
      # The Weaviate configurations
      # You can refer to the [Weaviate](https://weaviate.io/developers/weaviate/config-refs/env-vars) documentation for more information.
      QUERY_DEFAULTS_LIMIT: 25
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'false'
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      DEFAULT_VECTORIZER_MODULE: 'none'
      CLUSTER_HOSTNAME: 'node1'
      AUTHENTICATION_APIKEY_ENABLED: 'true'
      AUTHENTICATION_APIKEY_ALLOWED_KEYS: 'WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih'
      AUTHENTICATION_APIKEY_USERS: 'hello@dify.ai'
      AUTHORIZATION_ADMINLIST_ENABLED: 'true'
      AUTHORIZATION_ADMINLIST_USERS: 'hello@dify.ai'
    # uncomment to expose weaviate port to host
    # ports:
    #   - "8080:8080"

  # The DifySandbox
  sandbox:
    image: langgenius/dify-sandbox:0.1.0
    restart: always
    cap_add:
      # Why is sys_admin permission needed?
      # https://docs.dify.ai/getting-started/install-self-hosted/install-faq#id-16.-why-is-sys_admin-permission-needed
      - SYS_ADMIN
    environment:
      # The DifySandbox configurations
      API_KEY: dify-sandbox
      GIN_MODE: release
      WORKER_TIMEOUT: 15

  # Qdrant vector store.
  # uncomment to use qdrant as vector store.
  # (if uncommented, you need to comment out the weaviate service above,
  # and set VECTOR_STORE to qdrant in the api & worker service.)
  # qdrant:
  #   image: langgenius/qdrant:v1.7.3
  #   restart: always
  #   volumes:
  #     - ./volumes/qdrant:/qdrant/storage
  #   environment:
  #     QDRANT_API_KEY: 'difyai123456'
  #   # uncomment to expose qdrant port to host
  #   # ports:
  #   #   - "6333:6333"
  #   #   - "6334:6334"

  # The nginx reverse proxy.
  # used for reverse proxying the API service and Web service.
  nginx:
    image: nginx:latest
    restart: always
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf
      - ./nginx/proxy.conf:/etc/nginx/proxy.conf
      - ./nginx/conf.d:/etc/nginx/conf.d
      #- ./nginx/ssl:/etc/ssl
    depends_on:
      - api
      - web
    ports:
      - "80:80"
      #- "443:443"

The stack consists of the following modules and Docker images:

  • api: langgenius/dify-api:0.6.6
  • worker: langgenius/dify-api:0.6.6
  • web: langgenius/dify-web:0.6.6
  • db: postgres:15-alpine
  • redis: redis:6-alpine
  • weaviate: semitechnologies/weaviate:1.19.0
  • sandbox: langgenius/dify-sandbox:0.1.0
  • nginx: nginx:latest

Deploy everything with one docker compose command:

docker compose up -d

Watching it pull the dependency images one by one from Docker Hub is quite satisfying.
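Once the pull finishes and the stack is up, a quick status check with standard docker compose subcommands (nothing Dify-specific here):

# List the containers and their current state
docker compose ps
# Tail the API service logs if anything looks off
docker compose logs -f api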

After the images have been downloaded and the containers are up, open the web UI at your server's address, e.g. http://123.123.123.123:80. Port 80 is used by default; you can change the nginx port mapping in the docker compose file.
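Before switching to the browser, a quick sanity check from the server itself, assuming the default 80:80 mapping of the nginx service:

# Expect an HTTP status line back from the nginx reverse proxy
curl -sI http://127.0.0.1:80/ | head -n 1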

After initializing the admin account and a few other setup steps, you are welcomed into the Dify workspace. It really is that smooth.

3. Key Features of Dify

3.1 Integration with Multiple LLMs

Compared with the FastGPT + One API combination, the level of integration is much higher:

  • Models from LLM vendors: just register an account with a vendor and apply for an API key, and you can quickly try out and compare each vendor's strengths and weaknesses. The project even thoughtfully includes registration links for every vendor, and most of them hand out a few million tokens for testing.
  • Locally deployed models: models served by inference frameworks such as Xinference, Ollama, OpenLLM, and LocalAI can be connected with one click.
  • Hugging Face open-source models: just configure the API key and model name to connect, although your server does need unrestricted internet access (i.e., a working proxy if you are in mainland China).
  • OpenAI-API-compatible: connect any API that follows the OpenAI specification. Many inference frameworks, such as Xinference and OpenLLM, already support the OpenAI API format directly, yet almost every commercial LLM vendor insists on designing its own protocol, perhaps to build an ecosystem, or just to be different. One more point: for small and mid-sized companies in China that do not build their own models, it is common to trial or purchase models from several vendors and share them across multiple business units, which requires an internal gateway dedicated to cost accounting. Ideally that gateway exposes an OpenAI-compatible API to the business side; if it does not, you will have to wrap it in an OpenAI-compatible layer before connecting it to Dify (see the sketch after this list).
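For illustration, here is a hedged sketch of what "OpenAI-compatible" means in practice: the gateway only needs to expose the standard /v1/chat/completions route so that Dify can call it exactly the way it calls OpenAI. The base URL, API key, and model name below are placeholders, not real endpoints:

curl http://your-llm-gateway:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
        "model": "your-model-name",
        "messages": [{"role": "user", "content": "Hello"}]
      }'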

3.2 Rich Built-in Tools + Custom Tool Support

Dify ships with built-in tools such as search engines, weather forecasts, Wikipedia, and Stable Diffusion, and custom tools can be added through configuration: one team member integrates a tool once and the whole team reuses it. Efficient!

3.3 Workflows

Just connect the nodes and you can finish building an AI agent within minutes, with very clear logic throughout.

3.4 Agent Orchestration

Write the prompts, import a knowledge base, add tools, pick a model, run a test, and publish it as an API: one-stop creation!

There are many more features that I will not go into here; the official comparison table covers the rest.

4. Summary

It is nearly the end of the workday. I originally just wanted to take part in a topic event, and here I am several thousand words later, probably with too much sentiment mixed in. If you find this interesting, please follow, like, bookmark, and comment; your encouragement is what keeps me writing.

This article first offered some views on Agent AI drawn from my own work, then walked through the quick deployment of Dify, and finally described its main features. Personally, I believe Dify's growth will make agent development more efficient and help more interesting and valuable AI applications emerge.

Finally, my take on where AI agents are heading, viewed through the lens of traffic and users. From 2000 to 2004, portal sites such as Sina, Sohu, and NetEase were the traffic entry point; from 2004 to 2014 it was search engines, led by Baidu; from 2014 to 2024 it has been the recommendation systems of the mobile internet, represented by Douyin, Kuaishou, Weibo, and Xiaohongshu. Whoever captures the traffic entry point captures the users, and whoever captures the users captures the monetization. From 2024 onward, there is a very real chance that an AI-based platform company will emerge that captures traffic through AI agents: when you ask where to travel, the agent quietly slips hotel and flight ads into your itinerary; when you ask what to eat, it slips in restaurant ads.

In AI, the game is far from decided; any one of us could be the dark horse.

If you are interested in AI, you may also like my other articles:

AI Engineering Series

The Road to AI Agent Development - Engineering (1): Boosting AI Agent Development Efficiency with Docker

The Road to AI Agent Development - Engineering (2): One-Click Deployment of the Dify Agent Development Platform

The Road to AI Agent Development - Engineering (3): One-Click Deployment of the Ollama LLM Inference Serving Framework

The Road to AI Agent Development - Engineering (4): One-Click Deployment of the Xinference LLM Inference Serving Framework

The Road to AI Agent Development - Engineering (5): One-Click Deployment of the LocalAI LLM Inference Serving Framework

AI Models Series

The Road to AI Agent Development - Models (1): Installing, Deploying, and Using the LLaMA-Factory LLM Training Framework on a Restricted Domestic Network

The Road to AI Agent Development - Models (2): Hands-On Training and Inference with DeepSeek-V2-Chat
