It's been a while since my last post; things have been busy. The company is shifting toward building products and needs audio/video editing features. I hadn't touched much of this area before, so development has been rough: the pitfalls of a custom camera, video recording and editing (mainly merging and trimming), audio recording and trimming, and basic image processing such as cropping, rotation, text overlays and watermarks. Honestly, it's a lot. To make things worse, the colleague who had been handling the video features couldn't take the pressure and quit, so the video work landed back on my plate.
Enough rambling; let's get to the point.
Video merging and trimming is usually done with FFmpeg. I haven't studied its C code, and to keep the schedule moving I wasn't about to compile FFmpeg myself, so I grabbed a prebuilt binary from the ever-helpful GitHub (yes, part of me is judging myself for it). Now let's get to the code!
1. Video capture
Video capture looks simple but is actually quite messy. I started from a custom camera; I didn't write it from scratch but built on my former colleague's code. After a few days of changes, all sorts of problems showed up: the playback orientation of captured video, resolution settings across different devices and cameras, the playback orientation after editing, plus timed capture, deleting the last segment, adding markers while shooting, and more. No single huge pitfall, just a steady stream of small ones. On top of that, the merge step was painfully slow; the product manager tried it once and rejected it. Awkward. A video feature that doesn't work is useless, but the better domestic video-editing SDKs are all paid. My boss found a cloud vendor on GitHub whose editing SDK is free (their cloud service is not), figured it was workable, and asked me to integrate it first. So I did. Then on Friday it turned out we must use their cloud, and everything else stays free. I could barely keep a straight face, having just finished integrating the capture part. To be fair, the capture SDK works well and costs nothing.
On to the code.
The layout file:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
android:id="@+id/rl_root"
>
<android.opengl.GLSurfaceView
android:id="@+id/camera_preview"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_alignParentBottom="true"
android:layout_alignParentTop="true"/>
<com.tian.videomergedemo.view.CameraHintView
android:id="@+id/camera_hint"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_alignParentBottom="true"
android:layout_alignParentTop="true" />
<RelativeLayout
android:id="@+id/bottom_mask"
android:layout_width="fill_parent"
android:layout_height="120.0dip"
android:layout_alignParentBottom="true"
android:layout_gravity="bottom" >
<ImageView
android:id="@+id/iv_record"
android:layout_width="75dp"
android:layout_height="75dp"
android:layout_centerHorizontal="true"
android:layout_marginTop="30dp"
android:src="@drawable/record_state" />
<ImageView
android:id="@+id/iv_point_maker1"
android:layout_width="40dp"
android:layout_height="40dp"
android:layout_centerVertical="true"
android:layout_marginRight="30dp"
android:layout_toLeftOf="@id/iv_record"
android:src="@drawable/record_maker_d"
/>
<ImageView
android:id="@+id/iv_stop"
android:layout_width="40dp"
android:layout_height="40dp"
android:layout_centerVertical="true"
android:layout_marginLeft="30dp"
android:layout_toRightOf="@id/iv_record"
android:src="@drawable/record_stop_ok" />
</RelativeLayout>
<RelativeLayout
android:id="@+id/rl_progress_bar"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentTop="true"
>
<com.tian.videomergedemo.view.RecordProgressView
android:id="@+id/record_progress"
android:layout_width="match_parent"
android:layout_height="13dp"
/>
</RelativeLayout>
<RelativeLayout
android:id="@+id/rl"
android:layout_width="match_parent"
android:layout_height="50dp"
android:layout_below="@id/rl_progress_bar" >
<Chronometer
android:id="@+id/tv_record_time"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:drawableLeft="@drawable/red_dots"
android:drawablePadding="5dp"
android:format="%s"
android:gravity="center"
android:textColor="@color/red"
android:textSize="19sp"/>
<TextView
android:id="@+id/tv_record_time1"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:drawableLeft="@drawable/red_dots"
android:drawablePadding="5dp"
android:text="00:00"
android:visibility="gone"
android:textColor="@android:color/white"
android:textSize="19sp" />
</RelativeLayout>
<LinearLayout
android:id="@+id/ll_top"
android:layout_width="match_parent"
android:layout_height="50dp"
android:background="#607394"
android:gravity="center_vertical"
android:orientation="horizontal" >
<ImageView
android:id="@+id/iv_back"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1"
android:padding="15dp"
android:src="@drawable/record_cha" />
<ImageView
android:id="@+id/iv_flash"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1"
android:padding="15dp"
android:src="@drawable/record_falsh_state" />
<ImageView
android:id="@+id/iv_camera_switch"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1"
android:padding="15dp"
android:src="@drawable/icn_change_view" />
<ImageView
android:id="@+id/iv_clock"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1"
android:padding="15dp"
android:src="@drawable/clock" />
<TextView
android:id="@+id/tv_resolution"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1"
android:gravity="center"
android:text="480P"
android:textColor="@android:color/white"
android:textSize="16sp" />
</LinearLayout>
<RelativeLayout
android:id="@+id/rl_progress"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:visibility="gone" >
<ProgressBar
android:id="@+id/progressBar1"
style="?android:attr/progressBarStyleLarge"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true" />
</RelativeLayout>
</RelativeLayout>
(Screenshot of the resulting recording screen omitted.)
Setting up the default recording parameters (other parameters can be configured too; check the API docs for anything else you need):
/**
* Initialize the default recording parameters
*/
private void initData() {
mShortVideoConfig = new ShortVideoConfig();
//frame rate
mShortVideoConfig.fps = 20;
//video bitrate (kbps)
mShortVideoConfig.videoBitrate = 1000;
//audio bitrate (kbps)
mShortVideoConfig.audioBitrate = 64;
//default capture resolution (480P)
mShortVideoConfig.resolution = StreamerConstants.VIDEO_RESOLUTION_480P;
//H.264 encoding
mShortVideoConfig.encodeType = AVConst.CODEC_ID_AVC;
//encoder profile (balanced mode by default)
mShortVideoConfig.encodeProfile = VideoEncodeFormat.ENCODE_PROFILE_BALANCE;
//software encoding by default
mShortVideoConfig.encodeMethod = StreamerConstants.ENCODE_METHOD_SOFTWARE;
}
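The tv_resolution label in the layout is meant to let the user switch the capture resolution. Here is a hedged sketch of how that could be wired up; the tvResolution variable and the StreamerConstants.VIDEO_RESOLUTION_720P constant are assumptions on my part, not code from the demo, so check the constants in your SDK version.
// Sketch: let the "480P" label toggle the configured resolution before the preview starts.
final TextView tvResolution = (TextView) findViewById(R.id.tv_resolution);
tvResolution.setText("480P");
tvResolution.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        if (mShortVideoConfig.resolution == StreamerConstants.VIDEO_RESOLUTION_480P) {
            mShortVideoConfig.resolution = StreamerConstants.VIDEO_RESOLUTION_720P; // assumed constant
            tvResolution.setText("720P");
        } else {
            mShortVideoConfig.resolution = StreamerConstants.VIDEO_RESOLUTION_480P;
            tvResolution.setText("480P");
        }
    }
});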
Initializing the camera:
/**
* Initialize the camera and start the preview
*/
private void initCameraData() {
mKSYRecordKit.setPreviewFps(mShortVideoConfig.fps);
mKSYRecordKit.setTargetFps(mShortVideoConfig.fps);
mKSYRecordKit.setVideoKBitrate(mShortVideoConfig.videoBitrate);
mKSYRecordKit.setAudioKBitrate(mShortVideoConfig.audioBitrate);
mKSYRecordKit.setPreviewResolution(mShortVideoConfig.resolution);
mKSYRecordKit.setTargetResolution(mShortVideoConfig.resolution);
mKSYRecordKit.setVideoCodecId(mShortVideoConfig.encodeType);
mKSYRecordKit.setEncodeMethod(mShortVideoConfig.encodeMethod);
mKSYRecordKit.setVideoEncodeProfile(mShortVideoConfig.encodeProfile);
mKSYRecordKit.setRotateDegrees(0);
mKSYRecordKit.setDisplayPreview(mCameraPreviewView);
mKSYRecordKit.setEnableRepeatLastFrame(false);
mKSYRecordKit.setCameraFacing(CameraCapture.FACING_FRONT);
mKSYRecordKit.setFrontCameraMirror(false);
mKSYRecordKit.setOnInfoListener(mOnInfoListener);
mKSYRecordKit.setOnErrorListener(mOnErrorListener);
mKSYRecordKit.setOnLogEventListener(mOnLogEventListener);
CameraTouchHelper cameraTouchHelper = new CameraTouchHelper();
cameraTouchHelper.setCameraCapture(mKSYRecordKit.getCameraCapture());
mCameraPreviewView.setOnTouchListener(cameraTouchHelper);
cameraTouchHelper.setCameraHintView(mCameraHintView);
mKSYRecordKit.startCameraPreview();
}
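One thing the snippet above assumes is that the camera, microphone and storage permissions have already been granted. On Android 6.0+ they have to be requested at runtime before startCameraPreview() and startRecord(); a minimal sketch (the helper names here are my own, not the demo's):
// Minimal runtime-permission check before opening the camera (sketch, not from the demo).
private static final int REQUEST_PERMISSIONS = 100;
private static final String[] REQUIRED_PERMISSIONS = {
        Manifest.permission.CAMERA,
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.WRITE_EXTERNAL_STORAGE
};

private void checkPermissionsThenPreview() {
    for (String permission : REQUIRED_PERMISSIONS) {
        if (ContextCompat.checkSelfPermission(this, permission)
                != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(this, REQUIRED_PERMISSIONS, REQUEST_PERMISSIONS);
            return;
        }
    }
    initCameraData(); // everything granted, safe to open the camera
}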
Starting a recording:
private void startRecord() {
mRecordUrl = getRecordFileFolder() + "/" + System.currentTimeMillis() + ".mp4";
videosToMerge.add(mRecordUrl);//remember the segment path every time recording starts
mKSYRecordKit.setVoiceVolume(50);
mKSYRecordKit.startRecord(mRecordUrl);
mIsFileRecording = true;
mRecordControler.getDrawable().setLevel(2);
}
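getRecordFileFolder() isn't shown in the post; all it needs to do is return a writable directory for the segments. A possible implementation (the folder name is my own choice):
// Sketch of getRecordFileFolder(): return (and create, if needed) the segment directory.
private String getRecordFileFolder() {
    File folder = new File(Environment.getExternalStorageDirectory(), "VideoMergeDemo");
    if (!folder.exists()) {
        folder.mkdirs();
    }
    return folder.getAbsolutePath();
}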
Pausing or stopping a recording (capture is segmented, so the finished flag marks whether this is the final stop):
/**
* Stop recording
* @param finished true if this is the final stop; the recorded segments are then merged
*/
private void stopRecord(boolean finished) {
//the final stop leads into editing
//if more than one segment was recorded, the files need to be merged first
if (finished) {
if (mKSYRecordKit.getRecordedFilesCount() > 1) {
String fileFolder = getRecordFileFolder();
//path of the merged output file
final String outFile = fileFolder + "/" + "merger_" + System.currentTimeMillis() + ".mp4";
mKSYRecordKit.stopRecord(outFile, new KSYRecordKit.MegerFilesFinishedListener() {
@Override
public void onFinished() {
runOnUiThread(new Runnable() {
@Override
public void run() {
//TODO
Toast.makeText(RecordActivity.this, "短视频录制结束!", Toast.LENGTH_SHORT).show();
}
});
}
});
} else {
mKSYRecordKit.stopRecord();
}
} else {
//ordinary (non-final) stop
mKSYRecordKit.stopRecord();
}
//update the progress display
mRecordProgressCtl.stopRecording();
mRecordControler.getDrawable().setLevel(1);
updateDeleteView();
mIsFileRecording = false;
stopChronometer();
}
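For context, this is roughly how the two buttons from the layout could drive startRecord() and stopRecord(); the wiring below is a sketch of my own, not the demo's exact code.
// Hypothetical button wiring: iv_record toggles segment recording, iv_stop ends the session.
mRecordControler = (ImageView) findViewById(R.id.iv_record);
mRecordControler.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        if (mIsFileRecording) {
            stopRecord(false);  // pause: close the current segment
        } else {
            startRecord();      // start a new segment
        }
    }
});
findViewById(R.id.iv_stop).setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        if (mIsFileRecording) {
            stopRecord(true);   // final stop: the recorded segments get merged
        }
    }
});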
The AsyncTask that performs the merge:
private class MergeVideos extends AsyncTask<String, Integer, String> {
//The working path where the video files are located
private String workingPath;
//The file names to merge
private ArrayList<String> videosToMerge;
//Dialog to show to the user
private ProgressDialog progressDialog;
private MergeVideos(String workingPath, ArrayList<String> videosToMerge) {
this.workingPath = workingPath;
this.videosToMerge = videosToMerge;
}
@Override
protected void onPreExecute() {
if(progressDialog==null){
progressDialog = ProgressDialog.show(RecordActivity.this,
"合并中...", "请稍等...", true);
}else{
progressDialog.show();
}
};
@Override
protected String doInBackground(String... params) {
File storagePath = new File(workingPath);
storagePath.mkdirs();
File myMovie = new File(storagePath, String.format("output-%s.mp4", newName));
finalPath=myMovie.getAbsolutePath();
VideoStitchingRequest videoStitchingRequest = new VideoStitchingRequest.Builder()
.inputVideoFilePath(videosToMerge)
.outputPath(finalPath).build();
FfmpegManager manager = FfmpegManager.getInstance();
manager.stitchVideos(RecordActivity.this, videoStitchingRequest,
new CompletionListener() {
@Override
public void onProcessCompleted(String message,List<String> merger) {
mMessage=message;
}
});
return mMessage;
}
@Override
protected void onPostExecute(String value) {
super.onPostExecute(value);
progressDialog.dismiss();
progressDialog.cancel();
progressDialog=null;
if(value!=null){
Toast.makeText(RecordActivity.this, "啊哦,录制失败了!请重新尝试...", Toast.LENGTH_SHORT).show();
}else{
saveFlagPointer(mRecordProgressCtl.getFlagPointers());
Intent intent = new Intent(RecordActivity.this,EditVedioActivity.class);
intent.putExtra("vedio_path",finalPath);//把最终的路径传过去
startActivity(intent);
finish();
}
}
}
After recording finishes we jump to the editing screen. The merge itself already runs on FfmpegManager's own thread pool, so the extra AsyncTask above is strictly redundant; the code will keep getting cleaned up, so go easy on me.
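For reference, a leaner version could skip the AsyncTask entirely and call the manager directly, hopping back to the UI thread in the callback. This is only a sketch built from the classes shown in this post:
// Sketch only: hand the work straight to FfmpegManager instead of wrapping it in an AsyncTask.
VideoStitchingRequest request = new VideoStitchingRequest.Builder()
        .inputVideoFilePath(videosToMerge)
        .outputPath(finalPath)
        .build();
FfmpegManager.getInstance().stitchVideos(RecordActivity.this, request,
        new CompletionListener() {
            @Override
            public void onProcessCompleted(String message, List<String> merged) {
                // called from the worker thread, so hop back to the UI thread
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        Intent intent = new Intent(RecordActivity.this, EditVedioActivity.class);
                        intent.putExtra("vedio_path", finalPath);
                        startActivity(intent);
                        finish();
                    }
                });
            }
        });
FfmpegManager itself, which owns the thread pool and launches the FFmpeg process, looks like this: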
package com.tian.videomergedemo.manager;
import java.io.File;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import android.annotation.SuppressLint;
import android.content.Context;
import com.tian.videomergedemo.R;
import com.tian.videomergedemo.inter.CompletionListener;
import com.tian.videomergedemo.task.StitchingTask;
import com.tian.videomergedemo.task.TrimTask;
import com.tian.videomergedemo.utils.Utils;
/**
* Created by TCX on 21/01/17.
*
* Handles merging and trimming (re-encoding can be slow, so the work is dispatched through a thread pool).
*/
public class FfmpegManager {
private static FfmpegManager manager;
private String mFfmpegInstallPath;
private static int NUMBER_OF_CORES =
Runtime.getRuntime().availableProcessors();
// work queue for the decode/merge tasks
private final BlockingQueue<Runnable> mDecodeWorkQueue = new LinkedBlockingQueue<Runnable>();
private static final int KEEP_ALIVE_TIME = 1;
// keep-alive time unit for idle threads
private static final TimeUnit KEEP_ALIVE_TIME_UNIT = TimeUnit.SECONDS;
// the thread pool that runs the FFmpeg tasks
ThreadPoolExecutor mDecodeThreadPool = new ThreadPoolExecutor(
NUMBER_OF_CORES, // core pool size
NUMBER_OF_CORES, // maximum pool size
KEEP_ALIVE_TIME,
KEEP_ALIVE_TIME_UNIT,
mDecodeWorkQueue);
private FfmpegManager() {
}
public synchronized static FfmpegManager getInstance() {
if (manager == null) {
manager = new FfmpegManager();
}
return manager;
}
/**
* Merge the given video files (submitted to the thread pool)
* @param context
* @param videoStitchingRequest
* @param completionListener
*/
public void stitchVideos(Context context, VideoStitchingRequest videoStitchingRequest, CompletionListener completionListener) {
installFfmpeg(context);
StitchingTask stitchingTask = new StitchingTask(context, mFfmpegInstallPath, videoStitchingRequest, completionListener);
mDecodeThreadPool.execute(stitchingTask);
}
//trim (cut) operation
public void trimVideo(Context context,File srcFile,File destFile,List<long[]> mNewSeeks,CompletionListener completionListener){
installFfmpeg(context);
TrimTask trimTask=new TrimTask(context, mFfmpegInstallPath, srcFile, destFile,mNewSeeks , completionListener);
mDecodeThreadPool.execute(trimTask);
}
/*
* Install the FFmpeg binary and record its path (the binary is stored under res/raw)
*/
@SuppressLint("NewApi") private void installFfmpeg(Context context) {
String arch = System.getProperty("os.arch");//get the CPU architecture
String arc = arch.substring(0, 3).toUpperCase();
String rarc = "";
int rawFileId;
if (arc.equals("ARM")) {//arm架构
rawFileId = R.raw.ffmpeg;
} else if (arc.equals("MIP")) {
rawFileId = R.raw.ffmpeg;
} else if (arc.equals("X86")) {//x86架构
rawFileId = R.raw.ffmpeg_x86;
} else {
rawFileId = R.raw.ffmpeg;
}
File ffmpegFile = new File(context.getCacheDir(), "ffmpeg");
mFfmpegInstallPath = ffmpegFile.toString();
Utils.installBinaryFromRaw(context, rawFileId, ffmpegFile);
ffmpegFile.setExecutable(true);//grant execute permission on the binary
}
}
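Utils.installBinaryFromRaw() isn't included in this post; all it has to do is copy the raw resource into the cache directory so the binary at mFfmpegInstallPath actually exists. A minimal sketch of such a helper (my own, the demo's version may differ), using plain java.io stream copying:
// Sketch of a raw-resource copy helper; the demo's Utils.installBinaryFromRaw may differ.
public static void installBinaryFromRaw(Context context, int rawResId, File destination) {
    if (destination.exists()) {
        return; // binary already copied
    }
    InputStream in = null;
    FileOutputStream out = null;
    try {
        in = context.getResources().openRawResource(rawResId);
        out = new FileOutputStream(destination);
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            if (in != null) in.close();
            if (out != null) out.close();
        } catch (IOException ignored) {
        }
    }
}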
The stitching task (executed on the thread pool):
package com.tian.videomergedemo.task;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import android.content.Context;
import android.os.Environment;
import android.util.Log;
import com.tian.videomergedemo.inter.CompletionListener;
import com.tian.videomergedemo.manager.VideoStitchingRequest;
/**
* Created by TCX on 22/01/16.
*/
public class StitchingTask implements Runnable {
private Context context;
private VideoStitchingRequest videoStitchingRequest;
private CompletionListener completionListener;
private String mFfmpegInstallPath;
public StitchingTask(Context context, String mFfmpegInstallPath, VideoStitchingRequest stitchingRequest, CompletionListener completionListener) {
this.context = context;
this.mFfmpegInstallPath = mFfmpegInstallPath;
this.videoStitchingRequest = stitchingRequest;
this.completionListener = completionListener;
}
@Override
public void run() {
stitchVideo(context, mFfmpegInstallPath, videoStitchingRequest, completionListener);
}
private void stitchVideo(Context context, String mFfmpegInstallPath, VideoStitchingRequest videoStitchingRequest, final CompletionListener completionListener) {
//working directory where the concat list file is written
String path = Environment.getExternalStorageDirectory().getAbsolutePath() + File.separator + "ffmpeg_videos";
File dir = new File(path);
if (!dir.exists()) {
dir.mkdirs();
}
File inputfile = new File(path, "input.txt");
try {
inputfile.createNewFile();
FileOutputStream out = new FileOutputStream(inputfile);
for (String string : videoStitchingRequest.getInputVideoFilePaths()) {
out.write(("file " + "'" + string + "'").getBytes());
out.write("\n".getBytes());
}
out.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
// execFFmpegBinary("-i " + src.getAbsolutePath() + " -ss "+ startMs/1000 + " -to " + endMs/1000 + " -strict -2 -async 1 "+ dest.getAbsolutePath());
//the FFmpeg concat command line
String[] sampleFFmpegcommand = {mFfmpegInstallPath, "-f", "concat", "-i", inputfile.getAbsolutePath(), "-c", "copy", videoStitchingRequest.getOutputPath()};
try {
Process ffmpegProcess = new ProcessBuilder(sampleFFmpegcommand)
.redirectErrorStream(true).start();
String line;
BufferedReader reader = new BufferedReader(
new InputStreamReader(ffmpegProcess.getInputStream()));
Log.d("***", "*******Starting FFMPEG");
while ((line = reader.readLine()) != null) {
Log.d("***", "***" + line + "***");
}
Log.d(null, "****ending FFMPEG****");
ffmpegProcess.waitFor();
} catch (Exception e) {
e.printStackTrace();
}
inputfile.delete();
//callback to report that stitching finished
completionListener.onProcessCompleted("Video Stitching Completed",null);
}
}
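Trimming is the topic of the next post, but the commented-out command in stitchVideo() above already hints at how TrimTask drives FFmpeg. Here is a rough sketch of that call, reusing the same ProcessBuilder pattern; the method name and parameters are my own, not the real TrimTask:
// Rough sketch of how TrimTask might build and run the cut command (the real TrimTask may differ).
private void trimSegment(String ffmpegPath, File src, File dest, long startMs, long endMs) {
    String[] trimCommand = {
            ffmpegPath,
            "-i", src.getAbsolutePath(),
            "-ss", String.valueOf(startMs / 1000),   // cut start, in seconds
            "-to", String.valueOf(endMs / 1000),     // cut end, in seconds
            "-strict", "-2",
            "-async", "1",
            dest.getAbsolutePath()
    };
    try {
        Process process = new ProcessBuilder(trimCommand).redirectErrorStream(true).start();
        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            Log.d("TrimTask", line);
        }
        process.waitFor();
    } catch (Exception e) {
        e.printStackTrace();
    }
}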
Everything is commented, so I won't belabor the details.
OK, that wraps up the Java layer for merging and trimming. This post is long enough already; the next one picks up from here (link to follow).
Keep at it!
GitHub repo (if you download it, a star is a nice way to acknowledge the work, thanks):
https://github.com/T-chuangxin/VideoMergeDemo