Implementing AI in Java with the Spring AI Framework

Introduction to Spring AI

Spring AI is an application framework for AI engineering. It provides a friendly API and abstractions for developing AI applications, with the goal of simplifying the work of building them, for example a ChatGPT-based conversational application.

  • 项目地址:https://github.com/spring-projects-experimental/spring-ai
  • 文档地址:https://docs.spring.io/spring-ai/reference/

The project already integrates the OpenAI, Azure OpenAI, Hugging Face, and Ollama APIs, among others. And because it speaks the OpenAI API, pairing it with the One-API project lets you call most of today's mainstream large language models as well.

Usage

Before showing how to build a chat endpoint with Spring AI, let me first explain how a ChatGPT application actually works.

ChatGPT is a generative large language model released by OpenAI. To make it widely usable, OpenAI exposes ChatGPT through an HTTP API: with an API key issued by OpenAI, a developer sends requests to OpenAI's endpoints and gets ChatGPT's responses back. Building a "ChatGPT application" therefore does not mean training a model yourself; your application acts as a relay that forwards the user's input to the ChatGPT API and returns OpenAI's response.
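
To make that flow concrete, here is a minimal sketch of such a "relay" call using nothing but the JDK's HttpClient, before any framework gets involved. It only illustrates the principle; the model name and the OPENAI_API_KEY environment variable are placeholder choices, and Spring AI wraps all of this plumbing for you.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RawOpenAiCall {
    public static void main(String[] args) throws Exception {
        // Your own key; read it from the environment rather than hard-coding it.
        String apiKey = System.getenv("OPENAI_API_KEY");
        String body = """
                {"model": "gpt-3.5-turbo",
                 "messages": [{"role": "user", "content": "Hello, who are you?"}]}
                """;

        HttpRequest request = HttpRequest.newBuilder(URI.create("https://api.openai.com/v1/chat/completions"))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // The response is JSON; the answer text sits in choices[0].message.content.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}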

Approach 1: Using OpenAI

1. Chat (Q&A)

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.2.4</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.example</groupId>
    <artifactId>ai</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>ai</name>
    <description>ai</description>
    <properties>
        <java.version>17</java.version>
    </properties>
    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>org.springframework.ai</groupId>
                <artifactId>spring-ai-bom</artifactId>
                <version>0.8.1-SNAPSHOT</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-openai</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-openai-spring-boot-starter</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
        </dependency>
    </dependencies>
    <repositories>
        <repository>
            <id>spring-milestones</id>
            <name>Spring Milestones</name>
            <url>https://repo.spring.io/milestone</url>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </repository>
        <repository>
            <id>spring-snapshots</id>
            <name>Spring Snapshots</name>
            <url>https://repo.spring.io/snapshot</url>
            <releases>
                <enabled>false</enabled>
            </releases>
        </repository>
    </repositories>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

application.yml

spring:
  ai:
    openai:
      api-key: sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
      base-url: xxxxxxxxxxxxxxxxxxx
      chat:
        options:
          #model: gpt-3.5-turbo
          temperature: 0.3F

controller

package com.bjpowernode.controller;

import jakarta.annotation.Resource;
import org.springframework.ai.chat.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.openai.OpenAiChatClient;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class ChatController {

    /**
     * Auto-configured by spring-ai; can be injected and used directly.
     */
    @Resource
    private OpenAiChatClient openAiChatClient;

    /**
     * Call the OpenAI API.
     *
     * @param msg the question we ask
     * @return the model's answer
     */
    @RequestMapping(value = "/ai/chat")
    public String chat(@RequestParam(value = "msg") String msg) {
        return openAiChatClient.call(msg);
    }

    /**
     * Call the OpenAI API and return the content of the ChatResponse.
     *
     * @param msg the question we ask
     * @return the model's answer
     */
    @RequestMapping(value = "/ai/chat2")
    public Object chat2(@RequestParam(value = "msg") String msg) {
        ChatResponse chatResponse = openAiChatClient.call(new Prompt(msg));
        return chatResponse.getResult().getOutput().getContent();
    }

    /**
     * Call the OpenAI API with options set programmatically.
     *
     * @param msg the question we ask
     * @return the model's answer
     */
    @RequestMapping(value = "/ai/chat3")
    public Object chat3(@RequestParam(value = "msg") String msg) {
        // If an option is set both in application.yml and in code, the code wins:
        // programmatic options override the configuration file.
        ChatResponse chatResponse = openAiChatClient.call(new Prompt(msg, OpenAiChatOptions.builder()
                //.withModel("gpt-4-32k") // model variant; 32k refers to the context window size
                .withTemperature(0.4F) // higher temperature = more creative but less accurate; lower = more deterministic
                .build()));
        return chatResponse.getResult().getOutput().getContent();
    }

    /**
     * Call the OpenAI API in streaming mode.
     *
     * @param msg the question we ask
     * @return the streamed responses
     */
    @RequestMapping(value = "/ai/chat4")
    public Object chat4(@RequestParam(value = "msg") String msg) {
        // As above, programmatic options override the configuration file.
        Flux<ChatResponse> flux = openAiChatClient.stream(new Prompt(msg, OpenAiChatOptions.builder()
                //.withModel("gpt-4-32k") // model variant; 32k refers to the context window size
                .withTemperature(0.4F) // higher temperature = more creative but less accurate; lower = more deterministic
                .build()));

        // Print each chunk as it arrives; note this consumes one subscription,
        // and collectList() below subscribes again.
        flux.toStream().forEach(chatResponse -> {
            System.out.println(chatResponse.getResult().getOutput().getContent());
        });
        return flux.collectList(); // Mono wrapping the list of all streamed responses
    }
}
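
Note that chat4 collects the whole Flux before returning, which loses most of the benefit of streaming. As one possible improvement, the sketch below (assuming the same Spring AI 0.8.1 stream API used above; the /ai/chatStream path is just a placeholder) returns the Flux directly as server-sent events. Add it to the controller above, together with an import for org.springframework.http.MediaType.

    /**
     * Streams the answer chunk by chunk as server-sent events instead of
     * collecting the whole Flux first. Sketch only.
     */
    @RequestMapping(value = "/ai/chatStream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> chatStream(@RequestParam(value = "msg") String msg) {
        return openAiChatClient.stream(new Prompt(msg))
                .map(chatResponse -> {
                    // the final chunk of a stream may carry no content, so guard against null
                    String content = chatResponse.getResult().getOutput().getContent();
                    return content != null ? content : "";
                });
    }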

2. Image Generation

application.yml

spring:
  ai:
    openai:
      api-key: ${api-key}
      base-url: ${base-url}
      image:
        options:
          model: gpt-4-dalle
          quality: hd
          n: 1
          height: 1024
          width: 1024

controller

package com.bjpowernode.controller;

import jakarta.annotation.Resource;
import org.springframework.ai.image.ImagePrompt;
import org.springframework.ai.image.ImageResponse;
import org.springframework.ai.openai.OpenAiImageClient;
import org.springframework.ai.openai.OpenAiImageOptions;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ImageController {

    @Resource
    private OpenAiImageClient openAiImageClient;


    @RequestMapping("/ai/image")
    public Object image(@RequestParam(value = "msg") String msg) {
        ImageResponse imageResponse = openAiImageClient.call(new ImagePrompt(msg));
        System.out.println(imageResponse);

        String imageUrl = imageResponse.getResult().getOutput().getUrl();
        // process the generated image as needed

        return imageResponse.getResult().getOutput();
    }

    @RequestMapping("/ai/image2")
    public Object image2(@RequestParam(value = "msg") String msg) {
        ImageResponse imageResponse = openAiImageClient.call(new ImagePrompt(msg, OpenAiImageOptions.builder()
                .withQuality("hd") // high-definition image
                .withN(1)  // generate one image
                .withHeight(1024) // image height
                .withWidth(1024) // image width
                .build()));
        System.out.println(imageResponse);

        String imageUrl = imageResponse.getResult().getOutput().getUrl();
        // process the generated image as needed
        return imageResponse.getResult().getOutput();
    }
}
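
The "process the generated image as needed" step is left open in the original. One possible follow-up, sketched here with plain JDK classes (the helper class, its name, and the example path are illustrative, not part of the tutorial), is to download the returned URL to a local file:

package com.bjpowernode.util;

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;

/** Hypothetical helper, not part of the original tutorial. */
public class ImageDownloader {

    /** Fetches the generated image from the URL returned by OpenAI and stores it locally. */
    public static void download(String imageUrl, Path target) throws IOException, InterruptedException {
        HttpRequest request = HttpRequest.newBuilder(URI.create(imageUrl)).build();
        HttpResponse<byte[]> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofByteArray());
        Files.write(target, response.body()); // e.g. Path.of("D:/SpringAI/image.png")
    }
}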

3. Speech-to-Text (Audio Transcription)

application.yml

spring:
  ai:
    openai:
      api-key: ${api-key}
      base-url: ${base-url}

controller

package com.bjpowernode.controller;

import jakarta.annotation.Resource;
import org.springframework.ai.openai.OpenAiAudioTranscriptionClient;
import org.springframework.core.io.ClassPathResource;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class TranscriptionController {

    @Resource
    private OpenAiAudioTranscriptionClient openAiAudioTranscriptionClient;

    @RequestMapping(value = "/ai/transcription")
    public Object transcription() {
        //org.springframework.core.io.Resource audioFile = new ClassPathResource("jfk.flac");
        org.springframework.core.io.Resource audioFile = new ClassPathResource("cat.mp3");
        String called = openAiAudioTranscriptionClient.call(audioFile);
        System.out.println(called);

        return called;
    }
}
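
The example above transcribes cat.mp3 from the classpath. If callers should be able to upload their own audio instead, a sketch along the following lines could be added to the same controller (assuming the same call(Resource) API used above; it also needs imports for org.springframework.core.io.ByteArrayResource, org.springframework.web.multipart.MultipartFile and java.io.IOException). Preserving the original filename matters, since the filename is typically how the API recognizes the audio format.

    /**
     * Hypothetical variant: transcribe an uploaded file instead of a classpath resource.
     */
    @RequestMapping(value = "/ai/transcription/upload")
    public Object transcriptionUpload(@RequestParam("file") MultipartFile file) throws IOException {
        // Wrap the uploaded bytes in a Resource and preserve the original filename.
        org.springframework.core.io.Resource audio = new ByteArrayResource(file.getBytes()) {
            @Override
            public String getFilename() {
                return file.getOriginalFilename();
            }
        };
        return openAiAudioTranscriptionClient.call(audio);
    }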


4. Text-to-Speech

controller

package com.bjpowernode.controller;

import com.bjpowernode.util.FileUtils;
import jakarta.annotation.Resource;
import org.springframework.ai.openai.OpenAiAudioSpeechClient;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class TTSController {

    @Resource
    private OpenAiAudioSpeechClient openAiAudioSpeechClient;

    @RequestMapping(value = "/ai/tts")
    public Object tts() {
        String text = "2023年全球汽车销量重回9000万辆大关,同比2022年增长11%。分区域看,西欧(14%)、中国(12%)两大市场均实现两位数增长。面对这样亮眼的数据,全球汽车行业却都对2024年的市场前景表示悲观,宏观数据和企业体感之前的差异并非中国独有,在汽车市场中,这是共性问题。";
        byte[] bytes = openAiAudioSpeechClient.call(text);
        FileUtils.save2File("D:\\SpringAI\\test.mp3", bytes);
        return "OK";
    }

    @RequestMapping(value = "/ai/tts2")
    public Object tts2() {
        String text = "Spring AI is an application framework for AI engineering. Its goal is to apply to the AI domain Spring ecosystem design principles such as portability and modular design and promote using POJOs as the building blocks of an application to the AI domain.";
        byte[] bytes = openAiAudioSpeechClient.call(text);
        FileUtils.save2File("D:\\SpringAI\\test2.mp3", bytes);
        return "OK";
    }
}

FileUtils
package com.bjpowernode.util;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class FileUtils {

    /**
     * Writes the given bytes to the given file, creating missing parent directories if needed.
     *
     * @return true if the file was written successfully
     */
    public static boolean save2File(String fname, byte[] msg) {
        OutputStream fos = null;
        try {
            File file = new File(fname);
            File parent = file.getParentFile();
            // create missing parent directories; give up if that fails
            if (parent != null && !parent.exists() && !parent.mkdirs()) {
                return false;
            }
            fos = new FileOutputStream(file);
            fos.write(msg);
            fos.flush();
            return true;
        } catch (IOException e) {
            e.printStackTrace();
            return false;
        } finally {
            if (fos != null) {
                try {
                    fos.close();
                } catch (IOException e) {
                    // ignore
                }
            }
        }
    }
}
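
Saving the MP3 to a hard-coded D:\SpringAI path only works on the author's machine. As an alternative, here is a small sketch (same openAiAudioSpeechClient.call(text) API as above; the /ai/tts3 path is a placeholder, and it needs imports for org.springframework.http.ResponseEntity and org.springframework.http.MediaType) that returns the generated audio straight to the caller:

    /**
     * Hypothetical variant: return the generated audio directly instead of writing it to disk.
     */
    @RequestMapping(value = "/ai/tts3")
    public ResponseEntity<byte[]> tts3(@RequestParam(value = "msg") String msg) {
        byte[] audio = openAiAudioSpeechClient.call(msg);
        return ResponseEntity.ok()
                .contentType(MediaType.parseMediaType("audio/mpeg")) // OpenAI TTS returns MP3 by default
                .body(audio);
    }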


5. Multimodal Q&A

controller

package com.bjpowernode.controller;

import jakarta.annotation.Resource;
import org.springframework.ai.chat.ChatClient;
import org.springframework.ai.chat.ChatResponse;
import org.springframework.ai.chat.messages.Media;
import org.springframework.ai.chat.messages.UserMessage;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;
import org.springframework.util.MimeTypeUtils;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.List;

@RestController
public class MultiModelController {

    @Resource
    private ChatClient chatClient;

    @RequestMapping(value = "/ai/multi")
    public Object multi(String msg, String imageUrl) {

        UserMessage userMessage = new UserMessage(msg, List.of(new Media(MimeTypeUtils.IMAGE_PNG, imageUrl)));

        ChatResponse response = chatClient.call(new Prompt(userMessage, OpenAiChatOptions.builder()
                .withModel(OpenAiApi.ChatModel.GPT_4_VISION_PREVIEW.getValue())
                .build()));

        System.out.println(response.getResult().getOutput());
        return response.getResult().getOutput().getContent();
    }
}


Approach 2: Using Ollama

Spring AI provides API integration not only with OpenAI but also with Ollama. Ollama is a project published on GitHub that is designed for running, creating, and sharing large language models, and it makes it easy to start and run them locally.

Ollama can be downloaded for Windows (Linux works just as well; I use Windows here for convenience):

https://ollama.com/download

Once it is installed, open a terminal. You can pick any model you like from https://ollama.com/library and run it:

ollama run llama2-chinese

Then configure application.yml:

spring:
  ai:
    ollama:
      base-url: http://127.0.0.1:11434
      chat:
        # must exactly match the model name used in `ollama run llama2-chinese` above
        model: llama2-chinese

IndexServiceImpl

package com.example.service.impl;

import com.example.service.IndexService;
import org.springframework.ai.ollama.OllamaChatClient;
import org.springframework.ai.openai.OpenAiChatClient;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class IndexServiceImpl implements IndexService {
    @Autowired
    private OllamaChatClient ollamaChatClient;

    @Override
    public String send(String msg) {

     /*ChatResponse chatResponse = ollamaChatClient.call(new Prompt(msg, OllamaOptions.create()
                .withModel("qwen:0.5b-chat") // which model to use
                .withTemperature(0.4F))); // higher temperature = more creative but less accurate; lower = more accurate

        System.out.println(chatResponse.getResult().getOutput().getContent());
        return chatResponse.getResult().getOutput().getContent();*/



//        Prompt prompt = new Prompt(new UserMessage(msg));
//        return ollamaChatClient.stream(prompt);

        return ollamaChatClient.call(msg);

    }
}
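
The IndexService interface and the controller that exposes it are not shown in the original. Here is a minimal sketch of what they might look like (the controller class, its package, and the /ollama/chat path are placeholders; only the send(String) signature is taken from the implementation above):

package com.example.service;

public interface IndexService {
    String send(String msg);
}

package com.example.controller;

import com.example.service.IndexService;
import jakarta.annotation.Resource;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class IndexController {

    @Resource
    private IndexService indexService;

    @RequestMapping(value = "/ollama/chat")
    public String chat(@RequestParam(value = "msg") String msg) {
        return indexService.send(msg);
    }
}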


Deploying Ollama on Linux

docker run -d -v /root/ollama:/root/.ollama -e OLLAMA_ORIGINS='*' -p 11434:11434 --name ollama ollama/ollama

Open WebUI

https://github.com/open-webui/open-webui

lobe-chat

https://github.com/lobehub/lobe-chat

docker run -d -p 31526:3210   -e OPENAI_API_KEY=sk-xxxx   -e ACCESS_CODE=lobe66   --name lobe-chat  lobehub/lobe-chat