
Dify: Start with Local Source Code


Server Deployment

  • API Interface Service

  • Worker Asynchronous Queue Consumption Service

Installation of the basic environment:

Server startup requires Python 3.10.x. It is recommended to use pyenv for quick installation of the Python environment.

To install additional Python versions, use pyenv install.

pyenv install 3.10

To switch to the "3.10" Python environment, use the following command:

pyenv global 3.10
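To confirm that the switch took effect, you can check which interpreter is now active. This is only a quick sanity check, assuming pyenv's shims are on your PATH:

    pyenv versions        # the active version is marked with an asterisk
    python --version      # should print Python 3.10.x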

Follow these steps:

  1. Navigate to the "api" directory:

     cd api

  2. Copy the environment variable configuration file:

     cp .env.example .env

  3. Generate a random secret key and replace the value of SECRET_KEY in the .env file:

     SECRET_KEY=$(openssl rand -base64 42)
     ESCAPED_SECRET_KEY=$(printf '%s\n' "$SECRET_KEY" | sed 's/[\/&]/\\&/g')
     sed -i "s/SECRET_KEY=.*/SECRET_KEY=${ESCAPED_SECRET_KEY}/" .env

  4. Install the required dependencies.

     The Dify API service uses Poetry to manage dependencies. You can execute poetry shell to activate the environment.

     poetry env use 3.10
     poetry install

  5. Perform the database migration.

     Migrate the database to the latest version:

     poetry shell
     flask db upgrade

  6. Start the API server:

     flask run --host 0.0.0.0 --port=5001 --debug

Output:

     * Debug mode: on
    INFO:werkzeug:WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
     * Running on all addresses (0.0.0.0)
     * Running on http://127.0.0.1:5001
    INFO:werkzeug:Press CTRL+C to quit
    INFO:werkzeug: * Restarting with stat
    WARNING:werkzeug: * Debugger is active!
    INFO:werkzeug: * Debugger PIN: 695-801-919
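To confirm the API service is reachable, you can send a request to the port it listens on. This is only a quick sanity check; the console API prefix used below comes from the frontend configuration later in this guide, and any HTTP response (even a 404) means the server is up, while a connection error means it is not running:

    # Any HTTP response confirms the API server is listening on port 5001
    curl -i http://localhost:5001/console/api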
  7. Start the Worker service

    To consume asynchronous tasks from the queue, such as dataset file import and dataset document updates, follow these steps to start the Worker service on Linux or macOS:

celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace

If you are using a Windows system to start the Worker service, please use the following command instead:

celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail,ops_trace --loglevel INFO

Output:

     -------------- celery@TAKATOST.lan v5.2.7 (dawn-chorus)
    --- ***** -----
    -- ******* ---- macOS-10.16-x86_64-i386-64bit 2023-07-31 12:58:08
    - *** --- * ---
    - ** ---------- [config]
    - ** ---------- .> app:         app:0x7fb568572a10
    - ** ---------- .> transport:   redis://:**@localhost:6379/1
    - ** ---------- .> results:     postgresql://postgres:**@localhost:5432/dify
    - *** --- * --- .> concurrency: 1 (gevent)
    -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
    --- ***** -----
     -------------- [queues]
                    .> dataset          exchange=dataset(direct) key=dataset
                    .> generation       exchange=generation(direct) key=generation
                    .> mail             exchange=mail(direct) key=mail

    [tasks]
      . tasks.add_document_to_index_task.add_document_to_index_task
      . tasks.clean_dataset_task.clean_dataset_task
      . tasks.clean_document_task.clean_document_task
      . tasks.clean_notion_document_task.clean_notion_document_task
      . tasks.create_segment_to_index_task.create_segment_to_index_task
      . tasks.deal_dataset_vector_index_task.deal_dataset_vector_index_task
      . tasks.document_indexing_sync_task.document_indexing_sync_task
      . tasks.document_indexing_task.document_indexing_task
      . tasks.document_indexing_update_task.document_indexing_update_task
      . tasks.enable_segment_to_index_task.enable_segment_to_index_task
      . tasks.generate_conversation_summary_task.generate_conversation_summary_task
      . tasks.mail_invite_member_task.send_invite_member_mail_task
      . tasks.remove_document_from_index_task.remove_document_from_index_task
      . tasks.remove_segment_from_index_task.remove_segment_from_index_task
      . tasks.update_segment_index_task.update_segment_index_task
      . tasks.update_segment_keyword_index_task.update_segment_keyword_index_task

    [2023-07-31 12:58:08,831: INFO/MainProcess] Connected to redis://:**@localhost:6379/1
    [2023-07-31 12:58:08,840: INFO/MainProcess] mingle: searching for neighbors
    [2023-07-31 12:58:09,873: INFO/MainProcess] mingle: all alone
    [2023-07-31 12:58:09,886: INFO/MainProcess] pidbox: Connected to redis://:**@localhost:6379/1.
    [2023-07-31 12:58:09,890: INFO/MainProcess] celery@TAKATOST.lan ready.
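Before moving on to the frontend, note that during local development you usually want the API server and the Worker running side by side. The following is a minimal sketch of a convenience script (it is not part of the Dify repository) that starts both from the api directory using the same commands shown above and stops them together on Ctrl+C:

    #!/usr/bin/env bash
    # run-api-and-worker.sh — hypothetical helper script, run from the api directory
    set -e

    # Start the Flask API server in the background
    poetry run flask run --host 0.0.0.0 --port=5001 --debug &
    API_PID=$!

    # Start the Celery worker in the background (Linux/macOS variant)
    poetry run celery -A app.celery worker -P gevent -c 1 --loglevel INFO \
      -Q dataset,generation,mail,ops_trace &
    WORKER_PID=$!

    # Stop both processes when the script is interrupted
    trap 'kill $API_PID $WORKER_PID' INT TERM
    wait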

Deploy the frontend page

Start the web frontend client page service

Installation of the basic environment:

To start the web frontend service, you will need Node.js v18.x (LTS) and NPM version 8.x.x or Yarn.

  • Install NodeJS + NPM

Please visit https://nodejs.org/en/download and choose the installation package for your respective operating system that is v18.x or higher. It is recommended to download the stable version, which includes NPM by default.
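After installation, verify that the versions meet the requirements:

    node -v    # should print v18.x or higher
    npm -v     # should print 8.x.x or higher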

Follow these steps:

  1. Enter the web directory:

     cd web

  2. Install the dependencies:

     npm install

  3. Configure the environment variables. Create a file named .env.local in the current directory and copy the contents from .env.example. Modify the values of these environment variables according to your requirements:

     # For production release, change this to PRODUCTION
     NEXT_PUBLIC_DEPLOY_ENV=DEVELOPMENT
     # The deployment edition, SELF_HOSTED or CLOUD
     NEXT_PUBLIC_EDITION=SELF_HOSTED
     # The base URL of console application, refers to the Console base URL of WEB service if console domain is
     # different from api or web app domain.
     # example: http://cloud.dify.ai/console/api
     NEXT_PUBLIC_API_PREFIX=http://localhost:5001/console/api
     # The URL for Web APP, refers to the Web App base URL of WEB service if web app domain is different from
     # console or api domain.
     # example: http://udify.app/api
     NEXT_PUBLIC_PUBLIC_API_PREFIX=http://localhost:5001/api
     # SENTRY
     NEXT_PUBLIC_SENTRY_DSN=
     NEXT_PUBLIC_SENTRY_ORG=
     NEXT_PUBLIC_SENTRY_PROJECT=
  4. Build the code:

     npm run build

  5. Start the web service:

     npm run start
     # or
     yarn start
     # or
     pnpm start

After successful startup, the terminal will output the following information:

  ready - started server on 0.0.0.0:3000, url: http://localhost:3000
  warn  - You have enabled experimental feature (appDir) in next.config.js.
  warn  - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use at your own risk.
  info  - Thank you for testing `appDir` please leave your feedback at https://nextjs.link/app-feedback
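If you plan to modify the frontend code, you can also skip the build step and run the development server with hot reloading instead. This assumes the web project defines the standard Next.js dev script in its package.json:

    npm run dev
    # then open http://localhost:3000 as usual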

Access Dify

Finally, access http://127.0.0.1:3000 to use the locally deployed Dify.


Install Poetry

To install the latest version of Poetry, follow the steps below. Poetry provides a simple way to install and manage its own versions.

  1. Install or update Poetry

     You can install or update Poetry with the following command:

     curl -sSL https://install.python-poetry.org | python3 -

  2. Configure the environment variables

     After installation, Poetry's install location needs to be on your PATH. The install script usually does this automatically; if not, add it manually. Edit your ~/.bashrc (or ~/.zshrc if you use Zsh):

     nano ~/.bashrc

     Add the following line:

     export PATH="$HOME/.local/bin:$PATH"

     Save and exit the editor, then reload your shell configuration:

     source ~/.bashrc

  3. Verify the installation

     You can verify that Poetry is installed correctly with:

     poetry --version

     This should print Poetry's version number, for example:

     Poetry version 1.3.2

  4. Manage the environment with Poetry

     You can now use Poetry to manage your project environment. For example, create and use a Python 3.10 environment with:

     poetry env use 3.10
     poetry install

With the steps above you should be able to install and configure the latest version of Poetry.
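Optionally, you can configure Poetry to create its virtual environment inside the project directory, which makes the environment easy to find and remove. This is a standard Poetry setting rather than something Dify requires:

    poetry config virtualenvs.in-project true
    # subsequent "poetry install" runs will create ./.venv inside the api directory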

