In short, this problem is caused by a misconfigured Modelfile.
Below is a sample Modelfile configuration. The first line should be a FROM directive pointing to the model you want to use.
# sets the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1
# sets the context window size to 4096, this controls how many tokens the LLM can use as context to generate the next token
PARAMETER num_ctx 4096
# sets a custom system message to specify the behavior of the chat assistant
SYSTEM You are Mario from super mario bros, acting as an assistant.
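Putting it together, a complete minimal Modelfile places the FROM line before the parameters. A sketch, assuming a llama3 model has already been pulled locally (substitute your own model name or GGUF file path):

```
# base model to build on (assumption: "llama3" is available locally;
# a local file path such as ./model.gguf also works here)
FROM llama3
# sets the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1
# sets the context window size to 4096 tokens
PARAMETER num_ctx 4096
# sets a custom system message to specify the behavior of the chat assistant
SYSTEM You are Mario from super mario bros, acting as an assistant.
```

You can then register and run the custom model with `ollama create mario -f ./Modelfile` followed by `ollama run mario` (here "mario" is just an example model name).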
The official Modelfile documentation and examples are linked below:
ollama/docs/modelfile.md at main · ollama/ollama (github.com)