
LangChain + llama.cpp + llama-cpp-python Engineering Study Notes (1)

LangChain is a framework for developing applications powered by language models. It has the following features:

  • Data-aware: connect a language model to other data sources
  • Agentic: allow a language model to interact with its environment

pip install langchain

Error:

ERROR: Could not find a version that satisfies the requirement langchain (from versions: none)
ERROR: No matching distribution found for langchain

Fix: install from a mirror index:

pip install langchain -i http://pypi.douban.com/simple/ --trusted-host pypi.douban.com

Integrating llama into LangChain:

pip install llama-cpp-python -i http://pypi.douban.com/simple/ --trusted-host pypi.douban.com

Error:

ERROR: Cannot unpack file C:\Users\96584\AppData\Local\Temp\pip-unpack-izfgtfwa\simple (downloaded from C:\Users\96584\AppData\Local\Temp\pip-req-build-1raavtqr, content-type: text/html; charset=utf-8); cannot detect archive format
ERROR: Cannot determine archive format of C:\Users\96584\AppData\Local\Temp\pip-req-build-1raavtqr

Fix: switch mirrors (the mirror returned an HTML page instead of a package archive):

pip install -i https://pypi.tuna.tsinghua.edu.cn/simple --trusted-host pypi.tuna.tsinghua.edu.cn  llama-cpp-python

Error: ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects (nmake is not recognized)

Fix: change the compiler in PowerShell (run powershell from cmd):

$env:CMAKE_GENERATOR = "MinGW Makefiles"
$env:CMAKE_ARGS = "-DLLAMA_OPENBLAS=on -DCMAKE_C_COMPILER=C:/w64devkit/bin/gcc.exe -DCMAKE_CXX_COMPILER=C:/w64devkit/bin/g++.exe"

pip install -i https://pypi.tuna.tsinghua.edu.cn/simple --trusted-host pypi.tuna.tsinghua.edu.cn  llama-cpp-python

Error:

thread.c:(.text+0x103f): multiple definition of `pthread_self'
../../libllama.dll.a(d000850.o):(.text+0x0): first defined here
collect2.exe: error: ld returned 1 exit status

Fix:

A llama-cpp-python GitHub issue suggests adding -DLLAVA_BUILD=OFF to CMAKE_ARGS to disable LLaVA support.

This is a bug in llama-cpp-python.

A quick test run:

import llama_cpp

model = llama_cpp.Llama(
    model_path="D:/researchPJ/llamacpp/llama.cpp/models/llama-2-7b/ggml-model-q4_0.gguf",
)
print(model("The quick brown fox jumps ", stop=["."])["choices"][0]["text"])

Error:

RuntimeError: Failed to load shared library 'D:\code\langchain-llama\.venv\lib\site-packages\llama_cpp\libllama.dll': [WinError 193] %1 is not a valid Win32 application.

Fix:

libllama.dll was built as 32-bit; it needs to be rebuilt as 64-bit to work on 64-bit Windows.

Add -DCMAKE_GENERATOR_PLATFORM=x64 to the CMake environment to generate a 64-bit DLL.

Error:

MinGW Makefiles does not support platform specification, but platform x64 was specified.

Fix:

Create a toolchain file, e.g. "mingw64.cmake", with the following contents:

set(CMAKE_GENERATOR_PLATFORM x64)

# Specify compilers
set(CMAKE_C_COMPILER C:/w64devkit/bin/gcc.exe)
set(CMAKE_CXX_COMPILER C:/w64devkit/bin/g++.exe)

Change the build environment: add -DCMAKE_TOOLCHAIN_FILE=path/to/mingw64.cmake to CMAKE_ARGS.

Rebuild:

pip install -i https://pypi.tuna.tsinghua.edu.cn/simple --trusted-host pypi.tuna.tsinghua.edu.cn llama-cpp-python --upgrade  --force --reinstall  --no-cache-dir
After this, it runs successfully.

For chat-style use, call create_chat_completion(); the reply text is at response['choices'][0]['message']['content'].
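The chat response nests the reply one level deeper than the plain completion above; a minimal sketch of pulling it out (the hard-coded response dict below is a trimmed stand-in for the result of a real model.create_chat_completion(messages=[...]) call):

```python
# Trimmed stand-in for the dict returned by create_chat_completion();
# a real call looks like:
#   response = model.create_chat_completion(
#       messages=[{"role": "user", "content": "Hello"}])
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ]
}

def extract_reply(response: dict) -> str:
    # The reply text sits under choices[0].message.content.
    return response["choices"][0]["message"]["content"]

print(extract_reply(response))
```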

Learning LangChain

from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="D:/researchPJ/llamacpp/llama.cpp/models/llama-2-7b/ggml-model-q4_0.gguf",
    temperature=0.75,
    n_gpu_layers=20,  # gpu accelerate
    n_threads=6,
)
prompt = """
what is langsmith?
"""
print(llm.invoke(prompt))

Feeding input through a template:

from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

template = """Question: {question}
Answer: Let's work this out in a step by step way to be sure we have the right answer."""
prompt = PromptTemplate.from_template(template)
llm_chain = LLMChain(prompt=prompt, llm=llm)
question = "What NFL team won the Super Bowl in the year Justin Bieber was born?"
print(llm_chain.run(question))
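Under the hood, filling the template is plain placeholder substitution; a minimal LangChain-free sketch of what PromptTemplate.from_template(...).format(...) produces for a simple f-string-style template, using str.format:

```python
template = """Question: {question}
Answer: Let's work this out in a step by step way to be sure we have the right answer."""

# str.format substitutes {question} the same way PromptTemplate does
# for simple f-string-style templates.
question = "What NFL team won the Super Bowl in the year Justin Bieber was born?"
prompt_text = template.format(question=question)
print(prompt_text.splitlines()[0])
```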

Prompt + model + output parser:

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain.chains import LLMChain

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are world class technical documentation writer."),
    ("user", "{input}")
])
output_parser = StrOutputParser()
llm_chain = LLMChain(prompt=prompt, llm=llm, output_parser=output_parser)
print(llm_chain.invoke({"input": "how can langsmith help with testing?"}))
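ChatPromptTemplate.from_messages keeps a list of (role, template) pairs and fills each one at invoke time; a toy LangChain-free sketch of that behavior (format_messages is a hypothetical helper for illustration, not a LangChain API):

```python
messages = [
    ("system", "You are world class technical documentation writer."),
    ("user", "{input}"),
]

def format_messages(messages, **variables):
    # Substitute template variables into every message body,
    # keeping the (role, content) structure.
    return [(role, content.format(**variables)) for role, content in messages]

formatted = format_messages(messages, input="how can langsmith help with testing?")
print(formatted)
```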

Embedding model + vector store + prompt + model + output parser:

pip install beautifulsoup4

pip install faiss-cpu

from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain_community.llms import LlamaCpp
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_community.document_loaders import WebBaseLoader
from langchain.embeddings import LlamaCppEmbeddings
from langchain_community.vectorstores import FAISS
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain.chains import create_retrieval_chain

model_path = "D:/researchPJ/llamacpp/llama.cpp/models/llama-2-7b/ggml-model-q4_0.gguf"
embeddings = LlamaCppEmbeddings(model_path=model_path)
llm = LlamaCpp(
    model_path=model_path,
    temperature=0.75,
    n_gpu_layers=20,  # gpu accelerate
    n_threads=6,
    n_ctx=2048,
)
loader = WebBaseLoader("https://docs.smith.langchain.com")
docs = loader.load()
text_splitter = RecursiveCharacterTextSplitter()
documents = text_splitter.split_documents(docs)
vector = FAISS.from_documents(documents, embeddings)
prompt = ChatPromptTemplate.from_template("""Answer the following question based only on the provided context:
<context>
{context}
</context>
Question: {input}""")
document_chain = create_stuff_documents_chain(llm, prompt)
retriever = vector.as_retriever()
retrieval_chain = create_retrieval_chain(retriever, document_chain)
response = retrieval_chain.invoke({"input": "how can langsmith help with testing?"})
print(response["answer"])
# LangSmith offers several features that can help with testing:...

Load the data to index, using WebBaseLoader.

Use the embedding model to ingest the documents into the vector store.
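Before ingestion the documents are chunked; the splitter's role can be approximated without LangChain. A toy sketch of fixed-size chunking (the real RecursiveCharacterTextSplitter additionally prefers breaking on separators like "\n\n" and supports chunk overlap):

```python
def split_text(text: str, chunk_size: int = 20) -> list[str]:
    # Naive fixed-size chunking; a stand-in for the real splitter.
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

chunks = split_text("a" * 45, chunk_size=20)
print([len(c) for c in chunks])  # chunk lengths: 20, 20, 5
```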

Finally, create a retrieval chain. The chain takes an incoming question, looks up relevant documents, then passes those documents together with the original question into the LLM and asks it to answer the original question.
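The retrieval chain thus boils down to three steps: retrieve documents, stuff them into the prompt's context slot, and ask the model. A minimal sketch with stand-in functions (fake_retrieve and fake_llm are hypothetical placeholders, not LangChain APIs):

```python
def fake_retrieve(question: str) -> list[str]:
    # Stand-in for vector.as_retriever(): return documents relevant to the question.
    return ["LangSmith lets you run datasets of test cases against a chain."]

def fake_llm(prompt: str) -> str:
    # Stand-in for the LlamaCpp model: return a canned answer.
    return "LangSmith helps with testing by running example datasets."

def answer(question: str) -> str:
    # 1) look up relevant documents, 2) stuff them into the context slot,
    # 3) ask the model to answer using only that context.
    context = "\n".join(fake_retrieve(question))
    prompt = (
        "Answer the following question based only on the provided context:\n"
        f"<context>\n{context}\n</context>\n"
        f"Question: {question}"
    )
    return fake_llm(prompt)

print(answer("how can langsmith help with testing?"))
```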
