
Loading a local Hugging Face embedding model in Node.js with @xenova/transformers

It took me two full days to climb out of this hole. The process is written up below; I hope it helps anyone who runs into the same problems.

First off, the Node.js side of the LLM ecosystem is full of pitfalls, especially since I am not very familiar with Node.js.

I did not create a project from scratch (for example with npm init); instead I started from an open-source project:

git clone https://github.com/langchain-ai/langchain-nextjs-template.git

With this project as-is, pnpm dev renders the page correctly. I then created a lib directory and did the TypeScript development there.

Prerequisites:

1. You need to know the Transformers.js project, the JavaScript port of Transformers. For a Node.js server written in TypeScript that should integrate seamlessly with LangChain, this is the version to use (a minimal sketch follows right after this list). I have also seen people embed Python in Node; if that route is available to you it is probably the better choice, since it makes working with LLMs much easier. I will look into it when I have time.

2. You need to know the LangChain.js project (it has a repository on GitHub), which wraps the hf_transformers integration from item 1. The main point is that item 1 has to work on its own first.

Install it up front: pnpm i @xenova/transformers

3. Transformers.js loads ONNX models, not PyTorch models, so that is one more mountain to climb. All told, the required tech stack is quite broad.
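
To make prerequisite 1 concrete, here is a minimal sketch of a bare Transformers.js feature-extraction call (the Xenova/all-MiniLM-L6-v2 model name is only an illustration, not the model used later in this article; note that in this form the model is downloaded from the Hub, which is exactly what did not work in my environment):

  import { pipeline } from '@xenova/transformers';

  // Build a feature-extraction pipeline; by default the model is fetched
  // from the Hugging Face Hub on first use (local loading is covered below).
  const extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');

  // Mean-pool and normalize to get one embedding vector for the sentence.
  const output = await extractor('This is a test sentence.', { pooling: 'mean', normalize: true });
  console.log(output.data.length); // embedding dimension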

Problem 1: CommonJS vs. ES module compatibility

@xenova/transformers is published with "type": "module"; if the host project is CommonJS, you run into incompatibility errors.
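
If the host project has to stay on CommonJS, the usual workaround (a sketch of the general pattern, not what I ended up doing here) is to load the ESM-only package with a dynamic import() instead of require():

  // In a CommonJS file, require('@xenova/transformers') throws ERR_REQUIRE_ESM;
  // a dynamic import() works because it is asynchronous and allowed in CJS.
  async function loadTransformers() {
    const { pipeline, env } = await import('@xenova/transformers');
    return { pipeline, env };
  }

  loadTransformers().then(({ env }) => {
    console.log(env.localModelPath);
  });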

Problem 2: while debugging the TypeScript code with ts-node, I hit several typical kinds of errors:

① import {} from a module file is not supported

  D:\code\langchain\test>ts-node llama_cpp_basic.ts
  C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\index.js:851
  return old(m, filename);
  ^
  Error [ERR_REQUIRE_ESM]: require() of ES Module D:\code\langchain\test\node_modules\.pnpm\node-llama-cpp@2.8.0\node_modules\node-llama-cpp\dist\index.js from D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200_node-llama-cpp@2.8.0\node_modules\langchain\dist\util\llama_cpp.cjs not supported.
  Instead change the require of index.js in D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200_node-llama-cpp@2.8.0\node_modules\langchain\dist\util\llama_cpp.cjs to a dynamic import() which is available in all CommonJS modules.
    at require.extensions.<computed> [as .js] (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\index.js:851:20)
    at Object.<anonymous> (D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200_node-llama-cpp@2.8.0\node_modules\langchain\dist\util\llama_cpp.cjs:4:26)
    at require.extensions.<computed> [as .js] (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\index.js:851:20)
    at Object.<anonymous> (D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200_node-llama-cpp@2.8.0\node_modules\langchain\dist\embeddings\llama_cpp.cjs:4:24)
    at require.extensions.<computed> [as .js] (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\index.js:851:20)
    at Object.<anonymous> (D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200_node-llama-cpp@2.8.0\node_modules\langchain\embeddings\llama_cpp.cjs:1:18)
    at require.extensions.<computed> [as .js] (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\index.js:851:20)
    at Object.<anonymous> (D:\code\langchain\test\llama_cpp_basic.ts:3:21)
    at m._compile (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\index.js:857:29)
    at require.extensions.<computed> [as .ts] (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\index.js:859:16)
    at phase4 (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\bin.js:466:20)
    at bootstrap (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\bin.js:54:12)
    at main (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\bin.js:33:12)
    at Object.<anonymous> (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\dist\bin.js:579:5) {
  code: 'ERR_REQUIRE_ESM'
  }

② MODULE_NOT_FOUND, raised while testing a LangChain.js example

  D:\code\langchain\test>ts-node llama_cpp_basic.ts
  Error: Cannot find module 'node-llama-cpp'
  Require stack:
  - D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200\node_modules\langchain\dist\util\llama_cpp.cjs
  - D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200\node_modules\langchain\dist\embeddings\llama_cpp.cjs
  - D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200\node_modules\langchain\embeddings\llama_cpp.cjs
  - D:\code\langchain\test\llama_cpp_basic.ts
    at Function.Module._resolveFilename (node:internal/modules/cjs/loader:1048:15)
    at Function.Module._resolveFilename.sharedData.moduleResolveFilenameHook.installedValue [as _resolveFilename] (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\node_modules\@cspotcode\source-map-support\source-map-support.js:811:30)
    at Function.Module._load (node:internal/modules/cjs/loader:901:27)
    at Module.require (node:internal/modules/cjs/loader:1115:19)
    at require (node:internal/modules/helpers:130:18)
    at Object.<anonymous> (D:\code\langchain\test\node_modules\.pnpm\langchain@0.0.200\node_modules\langchain\dist\util\llama_cpp.cjs:4:26)
    at Module._compile (node:internal/modules/cjs/loader:1241:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1295:10)
    at Object.require.extensions.<computed> [as .js] (C:\Users\baron\AppData\Roaming\npm\node_modules\ts-node\src\index.ts:1608:43)
    at Module.load (node:internal/modules/cjs/loader:1091:32) {
  code: 'MODULE_NOT_FOUND',
  requireStack: [
  'D:\\code\\langchain\\test\\node_modules\\.pnpm\\langchain@0.0.200\\node_modules\\langchain\\dist\\util\\llama_cpp.cjs',
  'D:\\code\\langchain\\test\\node_modules\\.pnpm\\langchain@0.0.200\\node_modules\\langchain\\dist\\embeddings\\llama_cpp.cjs',
  'D:\\code\\langchain\\test\\node_modules\\.pnpm\\langchain@0.0.200\\node_modules\\langchain\\embeddings\\llama_cpp.cjs',
  'D:\\code\\langchain\\test\\llama_cpp_basic.ts'
  ]
  }

I ran into this one repeatedly, and kept overlooking the first line of the message while hunting for answers in the wrong places. Installing the missing package fixes it:

  D:\code\langchain\test>pnpm i node-llama-cpp
  Packages: +151 -1
  Progress: resolved 219, reused 162, downloaded 57, added 151, done
  Downloading registry.npmmirror.com/node-llama-cpp/2.8.0: 8.13 MB/8.13 MB, done
  node_modules/.pnpm/node-llama-cpp@2.8.0/node_modules/node-llama-cpp: Running postinstall script, done in 345ms
  dependencies:
  + node-llama-cpp 2.8.0
  Done in 5.7s

Another error: the module file supposedly does not exist, even though the .ts file is right there.

③ The .ts file extension is not recognized

How I resolved it:

ts-node does not work on Node v20 and above (I only discovered this after upgrading from v18 to v21, then downgraded back to v18).

ts-node --esm had no effect.

The node --loader ts-node/esm approach had no effect either.

In the end I found a much better tool: tsx.

Install it with npm install tsx -g; a global install is what I recommend.

In the scripts section of package.json, add:

 "tsx": "tsx .\\lib\\embedding.ts",

With that, debugging the TypeScript files finally worked.

Problem 3: connection timeout errors. This one I know well: Hugging Face models are downloaded over the network, and the HF Hub is essentially unreachable from mainland China (a proxy did not help either), so a local model is required.

  import { HuggingFaceTransformersEmbeddings } from "langchain/embeddings/hf_transformers";
  const embs = new HuggingFaceTransformersEmbeddings({modelName:"bge-large-en"})
  const docs = "doc"
  console.log(embs.embedQuery(docs))
  console.log("tttt")

How do you point the code above at a local model and block remote downloads? Add the following at the top of the file:

  import { env } from '@xenova/transformers';
  // Specify a custom location for models (defaults to '/models/').
  env.localModelPath = 'D:/code/embedding/';
  // Disable the loading of remote models from the Hugging Face Hub:
  env.allowRemoteModels = false;
  // Set location of .wasm files. Defaults to use a CDN.
  // env.backends.onnx.wasm.wasmPaths = '/path/to/files/';

OK, loading a local model is now solved. Be aware that Transformers.js only supports a specific list of models (see scripts/supported_models.py in the project); many Chinese models, such as bge-large-zh, are not on the list, and whether they work after this kind of conversion still needs to be verified.

Problem 4: Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "D:/code/embedding/bge-large-en/onnx/model_quantized.onnx".

The PyTorch weights downloaded from the Hub cannot be loaded directly; you need ONNX, the unified cross-platform model format.

Pitfall 1: using convert.py from the scripts directory of the Transformers.js project

python -m convert --quantize --model_id bge-large-en

That is the method from the Transformers.js README, and it did not work for me. At first I suspected Python 3.11 was too new, so I recreated a 3.9 virtual environment, but the error was the same.

It fails with the following error:

  (embedding_3.9) D:\code\embedding>python -m convert --quantize --model_id bge-large-en
  Framework not specified. Using pt to export to ONNX.
  Traceback (most recent call last):
    File "C:\Users\baron\anaconda3\envs\embedding_3.9\lib\runpy.py", line 197, in _run_module_as_main
      return _run_code(code, main_globals, None,
    File "C:\Users\baron\anaconda3\envs\embedding_3.9\lib\runpy.py", line 87, in _run_code
      exec(code, run_globals)
    File "D:\code\embedding\convert.py", line 396, in <module>
      main()
    File "D:\code\embedding\convert.py", line 357, in main
      main_export(**export_kwargs)
    File "C:\Users\baron\anaconda3\envs\embedding_3.9\lib\site-packages\optimum\exporters\onnx\__main__.py", line 313, in main_export
      task = TasksManager.infer_task_from_model(model_name_or_path)
    File "C:\Users\baron\anaconda3\envs\embedding_3.9\lib\site-packages\optimum\exporters\tasks.py", line 1453, in infer_task_from_model
      task = cls._infer_task_from_model_name_or_path(model, subfolder=subfolder, revision=revision)
    File "C:\Users\baron\anaconda3\envs\embedding_3.9\lib\site-packages\optimum\exporters\tasks.py", line 1377, in _infer_task_from_model_name_or_path
      raise RuntimeError("Cannot infer the task from a local directory yet, please specify the task manually.")
  RuntimeError: Cannot infer the task from a local directory yet, please specify the task manually.

I dug through the source code for a while without cracking it (the error asks for the task to be specified manually), so I took a different route and looked for a generic PyTorch-to-ONNX export, which worked.

Reference: "Hugging Face: exporting transformers models to ONNX", Tencent Cloud Developer Community (tencent.com)

Because the model conversion is done in Python, you need a Python environment with the transformers ONNX export extras installed.

Note: the cmd window needs to be run as administrator, otherwise this step fails.

pip install transformers[onnx]

  (embedding_3.9) D:\code\embedding>python -m transformers.onnx --model=bge-large-en onnx/
  Framework not specified. Using pt to export to ONNX.
  Using the export variant default. Available variants are:
  - default: The default ONNX variant.
  Using framework PyTorch: 2.1.1+cpu
  Overriding 1 configuration item(s)
  - use_cache -> False
  Post-processing the exported models...
  Deduplicating shared (tied) weights...
  Validating ONNX model onnx/model.onnx...
  -[✓] ONNX model output names match reference model (last_hidden_state)
  - Validating ONNX Model output "last_hidden_state":
  -[✓] (2, 16, 1024) matches (2, 16, 1024)
  -[✓] all values close (atol: 0.0001)
  The ONNX export succeeded and the exported model was saved at: onnx
  The export was done by optimum.exporters.onnx. We recommend using to use this package directly in future, as transformers.onnx is deprecated, and will be removed in v5. You can find more information here: https://huggingface.co/docs/optimum/exporters/onnx/usage_guides/export_a_model.

onnx/ is the directory the model is exported to. Note: this directory has to be copied into the original model directory, because the earlier error shows the loader looking for an onnx/ subdirectory under the model directory. Also, the exported file model.onnx has to be renamed to model_quantized.onnx, otherwise you get the following error:

  Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "D:/code/embedding/bge-large-en/onnx/model_quantized.onnx".
    at getModelFile (d:\code\langchain\langchain-nextjs-template\node_modules\.pnpm\@xenova+transformers@2.9.0\node_modules\@xenova\transformers\src\utils\hub.js:461:27)
    at constructSession (d:\code\langchain\langchain-nextjs-template\node_modules\.pnpm\@xenova+transformers@2.9.0\node_modules\@xenova\transformers\src\models.js:126:18)
    at async Promise.all (index 1)
    at Function.from_pretrained (d:\code\langchain\langchain-nextjs-template\node_modules\.pnpm\@xenova+transformers@2.9.0\node_modules\@xenova\transformers\src\models.js:782:20)
    at Function.from_pretrained (d:\code\langchain\langchain-nextjs-template\node_modules\.pnpm\@xenova+transformers@2.9.0\node_modules\@xenova\transformers\src\models.js:4082:20)
    at async Promise.all (index 1)
    at loadItems (d:\code\langchain\langchain-nextjs-template\node_modules\.pnpm\@xenova+transformers@2.9.0\node_modules\@xenova\transformers\src\pipelines.js:2568:5)
    at pipeline (d:\code\langchain\langchain-nextjs-template\node_modules\.pnpm\@xenova+transformers@2.9.0\node_modules\@xenova\transformers\src\pipelines.js:2514:19)
    at async HuggingFaceTransformersEmbeddings.runEmbedding (D:\code\langchain\langchain-nextjs-template\node_modules\.pnpm\langchain@0.0.187_@supabase+supabase-js@2.39.0_@xenova+transformers@2.9.0\node_modules\langchain\dist\embeddings\hf_transformers.cjs:64:22)
    at async HuggingFaceTransformersEmbeddings.embedQuery (D:\code\langchain\langchain-nextjs-template\node_modules\.pnpm\langchain@0.0.187_@supabase+supabase-js@2.39.0_@xenova+transformers@2.9.0\node_modules\langchain\dist\embeddings\hf_transformers.cjs:58:22)
  Node.js v20.9.0
  ELIFECYCLE  Command failed with exit code 1.
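
For reference, after copying the onnx/ directory and renaming the file, my local model directory ends up looking roughly like this (the config/tokenizer file names follow the usual Hugging Face layout; check them against your own export):

  D:/code/embedding/
    bge-large-en/
      config.json
      tokenizer.json
      tokenizer_config.json
      onnx/
        model_quantized.onnx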

After that, run pnpm tsx again, and it succeeds!

  (base) PS D:\code\langchain\langchain-nextjs-template> pnpm tsx
  > langchain-nextjs-template@0.0.0 tsx D:\code\langchain\langchain-nextjs-template
  > tsx .\lib\embedding.ts
  Promise { <pending> }
  tttt

Problem 5: an await error in the real code

The output above only shows Promise { <pending> } because embedQuery returns a Promise, so the call needs await. After adding await to the code, the following error appears:

  [esbuild Error]: Top-level await is currently not supported with the "cjs" output format
  at D:\code\langchain\langchain-nextjs-template\lib\embedding.ts:19:12
  ELIFECYCLE  Command failed with exit code 1.
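
As an aside, a smaller workaround (a sketch I did not use here) is to avoid top-level await altogether by wrapping the calls in an async function; the configuration changes below are only needed if you want top-level await itself:

  import { HuggingFaceTransformersEmbeddings } from "langchain/embeddings/hf_transformers";

  // Wrapping the awaits in an async main() sidesteps the top-level-await
  // restriction of the "cjs" output format that esbuild complained about.
  async function main() {
    const embs = new HuggingFaceTransformersEmbeddings({ modelName: "bge-large-en" });
    const vector = await embs.embedQuery("doc");
    console.log(vector.length);
  }

  main().catch(console.error);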

Solution:

1. Edit tsconfig.json: change target and module

  {
    "compilerOptions": {
      "target": "esnext", // ① changed from es5 to esnext
      "lib": ["dom", "dom.iterable", "esnext"],
      "allowJs": true,
      "skipLibCheck": true,
      "strict": true,
      "forceConsistentCasingInFileNames": true,
      "noEmit": true,
      "esModuleInterop": true,
      "module": "esnext", // ② changed as well
      "moduleResolution": "bundler",
      "resolveJsonModule": true,
      "isolatedModules": true,
      "jsx": "preserve",
      "incremental": true,
      "plugins": [
        {
          "name": "next"
        }
      ],
      "paths": {
        "@/*": ["./*"]
      }
    },
    "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
    "exclude": ["node_modules"]
  }

2. Edit package.json: set "type": "module"

  {
    "name": "langchain-nextjs-template",
    "version": "0.0.0",
    "private": true,
    "type": "module",
    "scripts": {
      "dev": "next dev",
      "tsx": "tsx .\\lib\\embedding.ts",
      "build": "next build",
      "start": "next start",
      "lint": "next lint",
      "format": "prettier --write \"app\""
    },
    "engines": {
      "node": ">=18"
    },
    "dependencies": {
      "@next/bundle-analyzer": "^13.4.19",
      "@supabase/supabase-js": "^2.32.0",
      "@types/node": "20.4.5",
      "@types/react": "18.2.17",
      "@types/react-dom": "18.2.7",
      "@xenova/transformers": "^2.9.0",
      "ai": "^2.1.28",
      "autoprefixer": "10.4.14",
      "eslint": "8.46.0",
      "eslint-config-next": "13.4.12",
      "langchain": "^0.0.187",
      "next": "14.0.1",
      "postcss": "8.4.27",
      "react": "18.2.0",
      "react-dom": "18.2.0",
      "react-toastify": "^9.1.3",
      "tailwindcss": "3.3.3",
      "typescript": "5.1.6",
      "zod": "^3.22.3",
      "zod-to-json-schema": "^3.21.4"
    },
    "devDependencies": {
      "prettier": "3.0.0",
      "tsx": "^4.6.2"
    }
  }

The finished code, embedding.ts:

  import { env } from '@xenova/transformers';
  // Specify a custom location for models (defaults to '/models/').
  env.localModelPath = 'D:/code/embedding/';
  // Disable the loading of remote models from the Hugging Face Hub:
  env.allowRemoteModels = false;
  // Set location of .wasm files. Defaults to use a CDN.
  // env.backends.onnx.wasm.wasmPaths = '/path/to/files/';
  import { HuggingFaceTransformersEmbeddings } from "langchain/embeddings/hf_transformers";
  import { LlamaCppEmbeddings } from "langchain/embeddings/llama_cpp";
  const embs = new HuggingFaceTransformersEmbeddings({modelName:"bge-large-en"})
  const docs = "doc"
  console.log( await embs.embedQuery(docs))

Output:

  (base) PS D:\code\langchain\langchain-nextjs-template> pnpm tsx
  > langchain-nextjs-template@0.0.0 tsx D:\code\langchain\langchain-nextjs-template
  > tsx .\lib\embedding.ts
  [
    0.003330473555251956, -0.02205600030720234, 0.0157240591943264,
    0.019194256514310837, -0.04094541072845459, -0.04336366057395935,
    0.007914731279015541, 0.01676217094063759, -0.00026966544101014733,
    0.04427381977438927, 0.036663103848695755, 0.0008914328063838184,
    0.03049340285360813, -0.014682380482554436, -0.01731256954371929,
    0.021954838186502457, -0.0004227317695040256, -0.027069522067904472,
    -0.047488000243902206, 0.00778471864759922, -0.008551156148314476,
    0.002418451476842165, -0.054124195128679276, 0.002399906050413847,
    -0.01912163756787777, 0.029669228941202164, -0.013196144253015518,
    0.02986353449523449, 0.06598247587680817, 0.05015171319246292,
    -0.032113343477249146, -0.023680096492171288, 0.012189813889563084,
    -0.024729931727051735, -0.012590361759066582, -0.028836917132139206,
    0.03547510504722595, -0.03484269976615906, 0.01825197972357273,
    -0.037200022488832474, -0.00193393521476537, -0.02055542729794979,
    0.0485348254442215, -0.042570747435092926, -0.07042783498764038,
    -0.0019728969782590866, -0.029526256024837494, -0.04461062327027321,
    0.013710870407521725, -0.03797616809606552, 0.025837218388915062,
    -0.013484465889632702, 0.01923159696161747, -0.01916288211941719,
    0.023215124383568764, -0.009301901794970036, -0.006729026325047016,
    0.004620965104550123, -0.025768844410777092, 0.009178749285638332,
    0.004959911108016968, -0.01205375324934721, 0.012440511025488377,
    -0.03975430876016617, 0.012500863522291183, 0.011405334807932377,
    -0.00874609500169754, -0.015204871073365211, 0.02256081812083721,
    -0.007945860736072063, 0.009428582154214382, -0.003165576374158263,
    0.0007117743953131139, -0.008373155258595943, 0.006578522268682718,
    0.01326432079076767, 0.02995370887219906, -0.003609190694987774,
    -0.03194940462708473, 0.03550690785050392, 0.017655085772275925,
    0.03506942093372345, 0.024607902392745018, 0.046828530728816986,
    -0.03198009356856346, -0.03032330609858036, 0.018970536068081856,
    0.00927543081343174, 0.00751956831663847, 0.004081718157976866,
    0.02597544528543949, 0.04809199646115303, -0.0011864954140037298,
    -0.007367478217929602, 0.0500517338514328, 0.04424132779240608,
    -0.007328629028052092, 0.014350713230669498, 0.004544962663203478,
    -0.004281576257199049,
    ... 924 more items
  ]
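
As a follow-up, embedding several documents at once goes through embedDocuments, which returns one vector per input string. A short sketch using the same setup (not run as part of this write-up; bge-large-en vectors are 1024-dimensional, matching the (2, 16, 1024) shape seen during the export above):

  import { env } from '@xenova/transformers';
  import { HuggingFaceTransformersEmbeddings } from "langchain/embeddings/hf_transformers";

  env.localModelPath = 'D:/code/embedding/';
  env.allowRemoteModels = false;

  const embs = new HuggingFaceTransformersEmbeddings({ modelName: "bge-large-en" });

  // One embedding per input document.
  const vectors = await embs.embedDocuments(["first document", "second document"]);
  console.log(vectors.length, vectors[0].length); // 2 and 1024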
