Ollama is a tool for running large language models locally. It can run Llama 3, Phi 3, Mistral, Gemma, and other models, and also lets you customize and create your own models.
Recommended machine specs: at least 16 GB of RAM and 512 GB of disk space.
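Assuming Ollama itself is already installed, a minimal sketch of pulling and sanity-checking the qwen2:0.5b model used throughout this post might look like this (the model tag and the default port 11434 match the configuration below):

# Pull the small Qwen2 model used in the examples below
ollama pull qwen2:0.5b

# Optional: quick interactive check from the terminal
ollama run qwen2:0.5b "hello"

# Verify the local REST API that Spring AI will call (Ollama listens on 11434 by default)
curl http://localhost:11434/api/tags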
spring:
  application:
    name: springOllama
  ai:
    ollama:
      base-url: http://localhost:11434
      chat:
        options:
          model: qwen2:0.5b
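These properties point Spring AI at the local Ollama server (11434 is Ollama's default port) and make qwen2:0.5b the default chat model, so the per-request options set in the controller below are just explicit overrides.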
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.3.0</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.kdx</groupId>
    <artifactId>springOllama</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>springAI</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <java.version>17</java.version>
        <spring-ai.version>1.0.0-M1</spring-ai.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
        </dependency>
    </dependencies>
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.ai</groupId>
                <artifactId>spring-ai-bom</artifactId>
                <version>${spring-ai.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.1</version>
                <configuration>
                    <compilerVersion>17</compilerVersion>
                    <source>17</source>
                    <target>17</target>
                    <encoding>UTF-8</encoding>
                    <!-- With Maven 3.6.2 and later, this compiler flag keeps method parameter names available at runtime. -->
                    <parameters>true</parameters>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <repositories>
        <repository>
            <id>spring-milestones</id>
            <name>Spring Milestones</name>
            <url>https://repo.spring.io/milestone</url>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
    </repositories>
</project>
Create a new Controller class:
package com.kdx.springai.controller;

import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaOptions;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/ollama")
public class HelloController {

    @Autowired(required = false)
    private OllamaChatModel ollamaChatModel;

    // Chat endpoint; the default prompt "讲个笑话" means "tell me a joke"
    @RequestMapping("/chat")
    public String chat(@RequestParam(value = "message", defaultValue = "讲个笑话") String message) {
        ChatResponse response = ollamaChatModel.call(
                new Prompt(
                        message,
                        OllamaOptions.create()
                                // Model name (overrides the default from application.yml)
                                .withModel("qwen2:0.5b")
                                .withTemperature(0.4f)
                ));
        return response.getResult().getOutput().getContent();
    }
}
Start the project and call the endpoint to test it.
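As a quick smoke test (assuming the application stays on Spring Boot's default port 8080), the endpoint can be called with curl:

# Uses the controller's default prompt
curl "http://localhost:8080/ollama/chat"

# Or pass a prompt explicitly
curl "http://localhost:8080/ollama/chat?message=hello"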
For installation details, refer to the installation post, then start Open WebUI with Docker:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v D:\docker\open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
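Here -p 3000:8080 publishes the web UI at http://localhost:3000, and --add-host=host.docker.internal:host-gateway lets the container reach the Ollama service running on the host.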
In the web UI, select the model downloaded with Ollama and start chatting.