The AI Studio platform has recently launched a Gradio online deployment environment. Compared with the existing Streamlit online deployment, the learning curve is flatter and it is faster to get started; in my experience the deployment environment is also more stable.
Below, this article uses the project "熟肉"也能继续炒:一键完成视频字幕文件抽取、问答定位 (one-click extraction of video subtitle files plus Q&A-based positioning) as an example to show how to build the Gradio version of an online deployment application.
First, we prepare the environment for debugging the algorithm and make sure the subtitle-extraction models work correctly.
!pip install -U scikit-image
# Install PaddleSpeech and the other required packages with pip
!pip install paddleocr --user
!pip install paddlenlp --user
!pip install paddlespeech --user
!pip install moviepy --user
# Download the nltk data package and place it in a path where nltk can find it
# %cd /home/aistudio
# !wget -P data https://paddlespeech.bj.bcebos.com/Parakeet/tools/nltk_data.tar.gz
# !tar zxvf data/nltk_data.tar.gz
import os
import cv2
from PIL import Image
import numpy as np
from tqdm import tqdm
from paddleocr import PaddleOCR, draw_ocr
# The Gradio debug environment has no direct internet access, so download the pretrained models in the notebook first
ocr = PaddleOCR(use_angle_cls=False, lang="ch")
[2023/03/23 12:58:28] ppocr DEBUG: Namespace(...)  (PaddleOCR configuration dump; detection/recognition models loaded from /home/aistudio/.paddleocr/whl/)
After clicking the "应用gradio" (Apply Gradio) button shown in the image above, a file named untitled.gradio.py is automatically created in the project directory.
On AI Studio, every Gradio application must be named {filename}.gradio.py so that the platform can recognize it automatically.
Although Gradio's official documentation is concise and clear, the network environment is not always stable and many of the online demo apps are not a great experience. We can, however, copy the code over to the AI Studio platform, create a new Gradio application, and debug it ourselves.
The core of Gradio is its gr.Interface function, which builds the visual interface.
Finally, we publish the page with demo.launch(), and a local static interactive page is ready.
import gradio as gr
import cv2

def to_black(image):
    # Convert the input image to grayscale
    output = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return output

# Declare an image-in, image-out interface and launch it
demo = gr.Interface(fn=to_black, inputs="image", outputs="image")
demo.launch()
The effect of this code can be seen by running the to_black.gradio.py file.
When building an online deployment application for an algorithm or model, the inputs and outputs we need to handle fall into just a few categories: text, images, video, audio, and files. In Gradio, each of these inputs and outputs can be expressed with a single line of code.
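For instance, a minimal sketch like the following (the echo function is just an illustrative placeholder) shows how each of these types corresponds to one component declaration:

import gradio as gr

def echo(text, image, video, audio, file):
    # Illustrative placeholder: simply return the text input unchanged
    return text

# Each common input/output type is declared with a single component
demo = gr.Interface(
    fn=echo,
    inputs=[gr.Textbox(), gr.Image(), gr.Video(), gr.Audio(), gr.File()],
    outputs=gr.Textbox(),
)
demo.launch()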
# Download the pretrained model in the notebook
from paddlenlp import Taskflow
schema = ['时间', '选手', '赛事名称']
ie = Taskflow('information_extraction', schema=schema)
[2023-03-23 12:58:48,069] [ INFO] - We are using <class 'paddlenlp.transformers.ernie.tokenizer.ErnieTokenizer'> to load '/home/aistudio/.paddlenlp/taskflow/information_extraction/uie-base'.
Readers can try the application effect in the information_extraction.gradio.py file.
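As a rough idea of what such a file could look like (a minimal sketch, not the actual information_extraction.gradio.py), the Taskflow pipeline above can be wrapped in a text-in, text-out interface:

import gradio as gr
from paddlenlp import Taskflow

schema = ['时间', '选手', '赛事名称']
ie = Taskflow('information_extraction', schema=schema)

def extract(text):
    # Run UIE information extraction and return the raw result as a string
    return str(ie(text))

demo = gr.Interface(fn=extract, inputs=gr.Textbox(lines=3), outputs=gr.Textbox())
demo.launch()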
import os
import cv2
from paddleocr import PPStructure, save_structure_res
from paddleocr.ppstructure.recovery.recovery_to_doc import sorted_layout_boxes, convert_info_docx

# Layout analysis with recovery enabled, so the result can be exported to docx
table_engine = PPStructure(recovery=True)

img = cv2.imread('1.png')
result = table_engine(img)
output = save_structure_res(result, './', 'output')
for line in result:
    line.pop('img')
    print(line)
h, w, _ = img.shape
# Sort the detected layout boxes and convert the result into a docx file
res = sorted_layout_boxes(result, w)
convert_info_docx(img, res, './', 'output')
[2023/03/23 12:58:50] ppocr DEBUG: Namespace(...)  (PP-Structure configuration dump, detection/recognition timing logs, and the recognized layout regions (text, title, figure, figure_caption, table, table_caption) omitted)
[2023/03/23 12:59:04] ppocr INFO: docx save to ./output_ocr.docx
Readers can try the application effect in the ocr.gradio.py file.
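For reference, here is a minimal sketch (not the actual ocr.gradio.py) of how the layout-recovery code above could be wrapped into an image-in, file-out application; the returned path follows the "docx save to ./output_ocr.docx" log line above:

import cv2
import gradio as gr
from paddleocr import PPStructure, save_structure_res
from paddleocr.ppstructure.recovery.recovery_to_doc import sorted_layout_boxes, convert_info_docx

table_engine = PPStructure(recovery=True)

def recover_layout(image):
    # gr.Image passes an RGB numpy array; convert to BGR for PaddleOCR
    img = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
    result = table_engine(img)
    save_structure_res(result, './', 'output')
    h, w, _ = img.shape
    res = sorted_layout_boxes(result, w)
    convert_info_docx(img, res, './', 'output')
    # Path as reported in the log output above (assumption based on that log)
    return './output_ocr.docx'

demo = gr.Interface(fn=recover_layout, inputs=gr.Image(), outputs=gr.File())
demo.launch()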
For a deployment application offered to outside users, having no title and no usage instructions leaves people completely lost, which is a cardinal sin. You can add such text directly through the gr.Interface() function, like this:
demo = gr.Interface(fn=quickstart, inputs=[gr.Markdown("# This is a layout recovery application"), gr.Image()], outputs=gr.File())
But once there is a lot of text, this quickly becomes unwieldy and the layout gets ugly.
On this point, similarly to Streamlit, the solution Gradio provides is a with block, as shown below.
import gradio as gr

def welcome(name):
    return f"Welcome to Gradio, {name}!"

with gr.Blocks() as demo:
    # Free-form Markdown gives the app a title and usage instructions
    gr.Markdown(
        """
        # Hello World!
        Start typing below to see the output.
        """)
    inp = gr.Textbox(placeholder="What is your name?")
    out = gr.Textbox()
    # Update the output textbox whenever the input changes
    inp.change(welcome, inp, out)

demo.launch()
As a newly launched offering, there are naturally still areas that need continuous iteration, and the Gradio solution is no exception.
The details discovered so far mainly concern the {filename}.gradio.py file; both of the issues noted above cause the application's loading page to display abnormally, most likely because cache refreshing still needs optimization.
In addition, debugging a Gradio application can be quite difficult. The reason is that when an error occurs inside the callback function, the program does not crash outright; instead the loading time keeps growing, so the problem is not noticed in time.
So when using it yourself, if you run into a similar symptom, check your program carefully for errors.
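One workaround worth trying (a suggestion, not a platform feature) is to wrap the callback in a try/except so that the traceback is returned as the output instead of the page hanging in a loading state:

import traceback
import gradio as gr

def safe_fn(fn):
    # Return a wrapped callback that surfaces internal exceptions as text output
    def wrapper(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception:
            return traceback.format_exc()
    return wrapper

demo = gr.Interface(fn=safe_fn(lambda text: text.upper()), inputs="text", outputs="text")
demo.launch()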
The final project is a comprehensive application of all the features introduced above. Readers can run and try it in the srt.gradio.py file; once deployed, it is exactly what is shown on the project's home page. A minimal sketch of the kind of video-in, file-out interface it exposes follows.
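The sketch below only illustrates the shape of that interface; extract_srt is a hypothetical placeholder for the project's actual subtitle pipeline, which lives in srt.gradio.py:

import gradio as gr

def extract_srt(video_path):
    # Hypothetical placeholder: the real project runs subtitle extraction
    # (OCR and speech recognition) on video_path and writes a real SRT file
    srt_path = 'output.srt'
    with open(srt_path, 'w') as f:
        f.write('1\n00:00:00,000 --> 00:00:01,000\n(placeholder subtitle)\n')
    return srt_path

demo = gr.Interface(fn=extract_srt, inputs=gr.Video(), outputs=gr.File())
demo.launch()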
In this project, we introduced the basics of deploying with Gradio in the Application Center and, by migrating the project "熟肉"也能继续炒:一键完成视频字幕文件抽取、问答定位, successfully implemented one-click generation of SRT subtitle files.
I hope this helps readers get started with Gradio applications more quickly.