
Recording Video on Android: Using, Previewing, and Lightly Wrapping the Three Camera APIs


Using and Wrapping the Three Camera APIs

Preface

TL;DR: don't panic and don't run off. This article only covers how to use and wrap the Android Camera APIs; it does not get into the professional audio/video domain (PS: as if I knew that stuff anyway). If you only care about one particular Camera API, jump straight to that section. There is a lot of code, so reading the whole thing takes roughly 20 minutes.

The source code is linked at the end of the article.

Author: newki
Link: https://juejin.cn/post/7252597901762625596

Main text

Everyone is familiar with the camera. Many apps use the Camera hardware, usually to take photos or record video; both are very common features.

So how is it usually done? If there are no UI or business-logic constraints and you simply want to take a photo or record a video, you can launch the system capture Activity directly with an Intent. That is the easiest way, and this article would not even be needed.
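For completeness, here is a minimal sketch of that Intent-based approach (the helper class, method names and request codes are my own illustration, not from the original post):

import android.app.Activity;
import android.content.Intent;
import android.provider.MediaStore;

public class SystemCaptureHelper {
    public static final int REQUEST_VIDEO_CAPTURE = 1001; // arbitrary request code

    // Launch the system video-recorder Activity; the result Uri arrives in onActivityResult().
    public static void recordVideo(Activity activity) {
        Intent intent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
        if (intent.resolveActivity(activity.getPackageManager()) != null) {
            activity.startActivityForResult(intent, REQUEST_VIDEO_CAPTURE);
        }
    }

    // Same idea for a quick photo.
    public static void takePhoto(Activity activity) {
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (intent.resolveActivity(activity.getPackageManager()) != null) {
            activity.startActivityForResult(intent, 1002);
        }
    }
}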

But there are always apps, and requirements, where either a custom camera UI is needed or the business logic demands more. Then we have to implement the Camera logic ourselves: show a preview page, run face detection or image recognition on the callback frames, implement custom capture logic such as tap-to-shoot and long-press-to-record, or record video with effects and filters.

Since we cannot avoid the Camera APIs, this article looks at them from the application's point of view: how to use the Camera APIs to get the effect we want.

Which Camera should we use, then? You have probably heard that there are currently three APIs available: Camera, Camera2 and CameraX. Is Google out of its mind? Why pile more work on us developers and make it this complicated?

1. The Past and Present of the Camera APIs

It mostly comes down to compatibility and security. A quick summary:

The Camera API is the legacy android.hardware.Camera class, which lets an application interact with the camera hardware directly; under the hood it goes straight through the drivers to the camera.

It offers basic control and image capture, but it comes with real limitations and device-compatibility problems, so the Android team decided to introduce a brand-new camera API in Android 5.0: the Camera2 API.

Camera2 introduces a new architecture: the application talks to the system camera service through a CameraManager and controls the camera with objects such as CameraDevice and CameraCaptureSession. This gives finer-grained control and better performance, and removes several of the old API's limitations.

That is all good: direct hardware access becomes communication with a service, with finer control and better performance. So far so reasonable, so why did Google go on to release CameraX?

Although Camera2 brings many advantages, using it still means writing a lot of code to handle all the different cases and device quirks. To simplify camera development and improve cross-device compatibility, Google released the CameraX library.

CameraX is a wrapper and simplification built on top of the Camera2 API that exposes a more consistent, easier-to-use camera interface. It abstracts away the low-level camera logic and the device differences, so developers can write camera code in a uniform way without worrying about device-specific compatibility details; it offers a higher-level API while keeping cross-device compatibility.

So in practice, anything we can do with Camera2 we can do more simply and conveniently with CameraX, which is why I personally recommend CameraX.

Below, all three Camera APIs are covered: how to use each one and how to wrap it.

2. Using and Wrapping Camera1

Generally, when we use Camera1 we host the preview in a SurfaceView or a TextureView. The difference, for this scenario, is that a TextureView can be animated, scaled and cropped to fit the layout like any other View.
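As a small illustration of that difference (my own example, not part of the original post): a TextureView is an ordinary View backed by a SurfaceTexture, so you can apply a Matrix transform or property animations to it, which a SurfaceView's separate surface does not allow.

import android.graphics.Matrix;
import android.view.TextureView;

public class PreviewTransformDemo {
    // Mirror the preview horizontally and shrink it slightly; a SurfaceView cannot do this.
    public static void applyMirror(TextureView textureView) {
        Matrix matrix = new Matrix();
        matrix.setScale(-0.9f, 0.9f, textureView.getWidth() / 2f, textureView.getHeight() / 2f);
        textureView.setTransform(matrix);
        // Property animations also work, e.g. textureView.animate().rotation(90f).start();
    }
}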

Here we use a plain SurfaceView to demonstrate the basic usage.

First create the SurfaceView, register a callback on its holder, and add it to the container of our choice:

public View initCamera(Context context) {
    mSurfaceView = new SurfaceView(context);
    mContext = context;
    mSurfaceView.setLayoutParams(new ViewGroup.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT));
    mSurfaceHolder = mSurfaceView.getHolder();
    mSurfaceHolder.addCallback(new CustomCallBack());
    mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    return mSurfaceView;
}

In the SurfaceView callback we initialize the Camera:

private class CustomCallBack implements SurfaceHolder.Callback {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        initCamera();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        releaseAllCamera();
    }
}

private void initCamera() {
    if (mCamera != null) {
        releaseAllCamera();
    }
    // Open the camera
    try {
        mCamera = Camera.open();
    } catch (Exception e) {
        e.printStackTrace();
        releaseAllCamera();
    }
    if (mCamera == null)
        return;
    // Configure the camera parameters
    setCameraParams();
    try {
        mCamera.setDisplayOrientation(90); // rotate the preview 90 degrees (portrait)
        mCamera.setPreviewDisplay(mSurfaceHolder);
        mCamera.startPreview();
        mCamera.unlock();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (RuntimeException e) {
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

On top of this you still need to handle the preview orientation and the preview size; that is quite a bit of code, so it is included in the helper class below.
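For reference, the orientation calculation is essentially the one recommended in the android.hardware.Camera.setDisplayOrientation() documentation; the CameraHelper below implements the same idea in getCameraOri(). A standalone sketch (class and method names are mine):

import android.app.Activity;
import android.hardware.Camera;
import android.view.Surface;

public class CameraOrientationUtil {
    // Standard formula from the Camera.setDisplayOrientation() docs:
    // map the display rotation to degrees, then combine it with the sensor orientation.
    public static int getDisplayOrientation(Activity activity, int cameraId) {
        Camera.CameraInfo info = new Camera.CameraInfo();
        Camera.getCameraInfo(cameraId, info);

        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:   degrees = 0;   break;
            case Surface.ROTATION_90:  degrees = 90;  break;
            case Surface.ROTATION_180: degrees = 180; break;
            case Surface.ROTATION_270: degrees = 270; break;
        }

        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            int result = (info.orientation + degrees) % 360;
            return (360 - result) % 360;   // compensate for the front-camera mirror
        } else {
            return (info.orientation - degrees + 360) % 360;
        }
    }
}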

The code above is only the most basic usage. Switching between front and back lenses, mirroring, preview orientation, rotation, frame callbacks and so on are all quite tedious to handle, so here is a helper class I use that manages all of it in one place.

First define a listener interface for the callbacks:

public interface CameraListener {
    /**
     * Called when the camera has been opened.
     *
     * @param camera             the camera instance
     * @param cameraId           the camera id
     * @param displayOrientation the preview rotation in degrees
     * @param isMirror           whether the preview is mirrored
     */
    void onCameraOpened(Camera camera, int cameraId, int displayOrientation, boolean isMirror);

    /**
     * Preview frame callback.
     *
     * @param data   the preview frame data
     * @param camera the camera instance
     */
    void onPreview(byte[] data, Camera camera);

    /**
     * Called when the camera has been closed.
     */
    void onCameraClosed();

    /**
     * Called when an exception occurs.
     *
     * @param e the camera related exception
     */
    void onCameraError(Exception e);

    /**
     * Called when the configuration changes.
     *
     * @param cameraID           the camera id
     * @param displayOrientation the preview rotation in degrees
     */
    void onCameraConfigurationChanged(int cameraID, int displayOrientation);
}

Then the helper class itself, pasted here in full:

  1. public class CameraHelper implements Camera.PreviewCallback {
  2. private Camera mCamera;
  3. private int mCameraId;
  4. private Point previewViewSize;
  5. private View previewDisplayView;
  6. private Camera.Size previewSize;
  7. private Point specificPreviewSize;
  8. private int displayOrientation = 0;
  9. private int rotation;
  10. private int additionalRotation;
  11. private boolean isMirror = false;
  12. private Integer specificCameraId = null;
  13. private CameraListener cameraListener; //自定义监听回调
  14. private CameraHelper(Builder builder) {
  15. previewDisplayView = builder.previewDisplayView;
  16. specificCameraId = builder.specificCameraId;
  17. cameraListener = builder.cameraListener;
  18. rotation = builder.rotation;
  19. additionalRotation = builder.additionalRotation;
  20. previewViewSize = builder.previewViewSize;
  21. specificPreviewSize = builder.previewSize;
  22. if (builder.previewDisplayView instanceof TextureView) {
  23. isMirror = builder.isMirror;
  24. } else if (isMirror) {
  25. throw new RuntimeException("mirror is effective only when the preview is on a textureView");
  26. }
  27. }
  28. public void init() {
  29. if (previewDisplayView instanceof TextureView) {
  30. ((TextureView) this.previewDisplayView).setSurfaceTextureListener(textureListener);
  31. } else if (previewDisplayView instanceof SurfaceView) {
  32. ((SurfaceView) previewDisplayView).getHolder().addCallback(surfaceCallback);
  33. }
  34. if (isMirror) {
  35. previewDisplayView.setScaleX(-1);
  36. }
  37. }
  38. public String start() {
  39. String firstSize = null;
  40. String finalSize = null;
  41. synchronized (this) {
  42. if (mCamera != null) {
  43. return null;
  44. }
  45. //相机数量为2则打开1,1则打开0,相机ID 1为前置,0为后置
  46. mCameraId = Camera.getNumberOfCameras() - 1;
  47. //若指定了相机ID且该相机存在,则打开指定的相机
  48. if (specificCameraId != null && specificCameraId <= mCameraId) {
  49. mCameraId = specificCameraId;
  50. }
  51. //没有相机
  52. if (mCameraId == -1) {
  53. if (cameraListener != null) {
  54. cameraListener.onCameraError(new Exception("camera not found"));
  55. }
  56. return null;
  57. }
  58. //开启相机
  59. if (mCamera == null) {
  60. mCamera = Camera.open(mCameraId);
  61. }
  62. //获取预览方向
  63. displayOrientation = getCameraOri(rotation);
  64. mCamera.setDisplayOrientation(displayOrientation);
  65. try {
  66. Camera.Parameters parameters = mCamera.getParameters();
  67. parameters.setPreviewFormat(ImageFormat.NV21);
  68. //预览大小设置
  69. previewSize = parameters.getPreviewSize();
  70. firstSize = previewSize.width + " - " + previewSize.height;
  71. List<Camera.Size> supportedPreviewSizes = parameters.getSupportedPreviewSizes();
  72. if (supportedPreviewSizes != null && supportedPreviewSizes.size() > 0) {
  73. previewSize = getBestSupportedSize(supportedPreviewSizes, previewViewSize);
  74. finalSize = previewSize.width + " - " + previewSize.height;
  75. }
  76. YYLogUtils.w("Base Preview Size ,Width:" + previewSize.width + " height:" + previewSize.height);
  77. parameters.setPreviewSize(previewSize.width, previewSize.height);
  78. //对焦模式设置
  79. List<String> supportedFocusModes = parameters.getSupportedFocusModes();
  80. if (supportedFocusModes != null && supportedFocusModes.size() > 0) {
  81. if (supportedFocusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
  82. parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
  83. } else if (supportedFocusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
  84. parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
  85. } else if (supportedFocusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) {
  86. parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
  87. }
  88. }
  89. //Camera 配置完成,设置回去
  90. mCamera.setParameters(parameters);
  91. //绑定到 TextureView 或 SurfaceView
  92. if (previewDisplayView instanceof TextureView) {
  93. mCamera.setPreviewTexture(((TextureView) previewDisplayView).getSurfaceTexture());
  94. } else {
  95. mCamera.setPreviewDisplay(((SurfaceView) previewDisplayView).getHolder());
  96. }
  97. //启动预览并设置预览回调
  98. mCamera.setPreviewCallback(this);
  99. mCamera.startPreview();
  100. if (cameraListener != null) {
  101. cameraListener.onCameraOpened(mCamera, mCameraId, displayOrientation, isMirror);
  102. }
  103. } catch (Exception e) {
  104. if (cameraListener != null) {
  105. cameraListener.onCameraError(e);
  106. }
  107. }
  108. }
  109. return "firstSize :" + firstSize + " finalSize:" + finalSize;
  110. }
  111. private int getCameraOri(int rotation) {
  112. int degrees = rotation * 90;
  113. switch (rotation) {
  114. case Surface.ROTATION_0:
  115. degrees = 0;
  116. break;
  117. case Surface.ROTATION_90:
  118. degrees = 90;
  119. break;
  120. case Surface.ROTATION_180:
  121. degrees = 180;
  122. break;
  123. case Surface.ROTATION_270:
  124. degrees = 270;
  125. break;
  126. default:
  127. break;
  128. }
  129. additionalRotation /= 90;
  130. additionalRotation *= 90;
  131. degrees += additionalRotation;
  132. int result;
  133. Camera.CameraInfo info = new Camera.CameraInfo();
  134. Camera.getCameraInfo(mCameraId, info);
  135. if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
  136. result = (info.orientation + degrees) % 360;
  137. result = (360 - result) % 360;
  138. } else {
  139. result = (info.orientation - degrees + 360) % 360;
  140. }
  141. return result;
  142. }
  143. public void stop() {
  144. synchronized (this) {
  145. if (mCamera == null) {
  146. return;
  147. }
  148. mCamera.setPreviewCallback(null);
  149. mCamera.stopPreview();
  150. mCamera.release();
  151. mCamera = null;
  152. if (cameraListener != null) {
  153. cameraListener.onCameraClosed();
  154. }
  155. }
  156. }
  157. public boolean isStopped() {
  158. synchronized (this) {
  159. return mCamera == null;
  160. }
  161. }
  162. public void release() {
  163. synchronized (this) {
  164. stop();
  165. previewDisplayView = null;
  166. specificCameraId = null;
  167. cameraListener = null;
  168. previewViewSize = null;
  169. specificPreviewSize = null;
  170. previewSize = null;
  171. }
  172. }
  173. /**
  174. * 根据 Camera 获取支持的宽高,获取到最适合的预览宽高
  175. */
  176. private Camera.Size getBestSupportedSize(List<Camera.Size> sizes, Point previewViewSize) {
  177. if (sizes == null || sizes.size() == 0) {
  178. return mCamera.getParameters().getPreviewSize();
  179. }
  180. Camera.Size[] tempSizes = sizes.toArray(new Camera.Size[0]);
  181. Arrays.sort(tempSizes, new Comparator<Camera.Size>() {
  182. @Override
  183. public int compare(Camera.Size o1, Camera.Size o2) {
  184. if (o1.width > o2.width) {
  185. return -1;
  186. } else if (o1.width == o2.width) {
  187. return o1.height > o2.height ? -1 : 1;
  188. } else {
  189. return 1;
  190. }
  191. }
  192. });
  193. sizes = Arrays.asList(tempSizes);
  194. Camera.Size bestSize = sizes.get(0);
  195. float previewViewRatio;
  196. if (previewViewSize != null) {
  197. previewViewRatio = (float) previewViewSize.x / (float) previewViewSize.y;
  198. } else {
  199. previewViewRatio = (float) bestSize.width / (float) bestSize.height;
  200. }
  201. if (previewViewRatio > 1) {
  202. previewViewRatio = 1 / previewViewRatio;
  203. }
  204. boolean isNormalRotate = (additionalRotation % 180 == 0);
  205. for (Camera.Size s : sizes) {
  206. if (specificPreviewSize != null && specificPreviewSize.x == s.width && specificPreviewSize.y == s.height) {
  207. return s;
  208. }
  209. if (isNormalRotate) {
  210. if (Math.abs((s.height / (float) s.width) - previewViewRatio) < Math.abs(bestSize.height / (float) bestSize.width - previewViewRatio)) {
  211. bestSize = s;
  212. }
  213. } else {
  214. if (Math.abs((s.width / (float) s.height) - previewViewRatio) < Math.abs(bestSize.width / (float) bestSize.height - previewViewRatio)) {
  215. bestSize = s;
  216. }
  217. }
  218. }
  219. return bestSize;
  220. }
  221. public List<Camera.Size> getSupportedPreviewSizes() {
  222. if (mCamera == null) {
  223. return null;
  224. }
  225. return mCamera.getParameters().getSupportedPreviewSizes();
  226. }
  227. public List<Camera.Size> getSupportedPictureSizes() {
  228. if (mCamera == null) {
  229. return null;
  230. }
  231. return mCamera.getParameters().getSupportedPictureSizes();
  232. }
  233. @Override
  234. public void onPreviewFrame(byte[] nv21, Camera camera) {
  235. if (cameraListener != null && nv21 != null) {
  236. cameraListener.onPreview(nv21, camera);
  237. }
  238. }
  239. /**
  240. * TextureView 的监听回调
  241. */
  242. private TextureView.SurfaceTextureListener textureListener = new TextureView.SurfaceTextureListener() {
  243. @Override
  244. public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
  245. // start();
  246. if (mCamera != null) {
  247. try {
  248. mCamera.setPreviewTexture(surfaceTexture);
  249. } catch (IOException e) {
  250. e.printStackTrace();
  251. }
  252. }
  253. }
  254. @Override
  255. public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int width, int height) {
  256. }
  257. @Override
  258. public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
  259. stop();
  260. return false;
  261. }
  262. @Override
  263. public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
  264. }
  265. };
  266. /**
  267. * SurfaceView 的监听回调
  268. */
  269. private SurfaceHolder.Callback surfaceCallback = new SurfaceHolder.Callback() {
  270. @Override
  271. public void surfaceCreated(SurfaceHolder holder) {
  272. if (mCamera != null) {
  273. try {
  274. mCamera.setPreviewDisplay(holder);
  275. } catch (IOException e) {
  276. e.printStackTrace();
  277. }
  278. }
  279. }
  280. @Override
  281. public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
  282. }
  283. @Override
  284. public void surfaceDestroyed(SurfaceHolder holder) {
  285. stop();
  286. }
  287. };
  288. /**
  289. * 切换摄像头方向
  290. */
  291. public void changeDisplayOrientation(int rotation) {
  292. if (mCamera != null) {
  293. this.rotation = rotation;
  294. displayOrientation = getCameraOri(rotation);
  295. mCamera.setDisplayOrientation(displayOrientation);
  296. if (cameraListener != null) {
  297. cameraListener.onCameraConfigurationChanged(mCameraId, displayOrientation);
  298. }
  299. }
  300. }
  301. /**
  302. * 翻转前后摄像镜头
  303. */
  304. public boolean switchCamera() {
  305. if (Camera.getNumberOfCameras() < 2) {
  306. return false;
  307. }
  308. // cameraId ,0为后置,1为前置
  309. specificCameraId = 1 - mCameraId;
  310. stop();
  311. start();
  312. return true;
  313. }
  314. /**
  315. * 使用构建者模式创建,配置Camera
  316. */
  317. public static final class Builder {
  318. //预览显示的view,目前仅支持surfaceView和textureView
  319. private View previewDisplayView;
  320. //是否镜像显示,只支持textureView
  321. private boolean isMirror;
  322. //指定的相机ID
  323. private Integer specificCameraId;
  324. //事件回调
  325. private CameraListener cameraListener;
  326. //屏幕的长宽,在选择最佳相机比例时用到
  327. private Point previewViewSize;
  328. //屏幕的方向,一般传入getWindowManager().getDefaultDisplay().getRotation()的值即可
  329. private int rotation;
  330. //指定的预览宽高,若系统支持则会以这个预览宽高进行预览
  331. private Point previewSize;
  332. //额外的旋转角度(用于适配一些定制设备)
  333. private int additionalRotation;
  334. public Builder() {
  335. }
  336. //必须要绑定到SurfaceView或者TextureView上
  337. public Builder previewOn(View view) {
  338. if (view instanceof SurfaceView || view instanceof TextureView) {
  339. previewDisplayView = view;
  340. return this;
  341. } else {
  342. throw new RuntimeException("you must preview on a textureView or a surfaceView");
  343. }
  344. }
  345. public Builder isMirror(boolean mirror) {
  346. isMirror = mirror;
  347. return this;
  348. }
  349. public Builder previewSize(Point point) {
  350. previewSize = point;
  351. return this;
  352. }
  353. public Builder previewViewSize(Point point) {
  354. previewViewSize = point;
  355. return this;
  356. }
  357. public Builder rotation(int rotation) {
  358. this.rotation = rotation;
  359. return this;
  360. }
  361. public Builder additionalRotation(int rotation) {
  362. additionalRotation = rotation;
  363. return this;
  364. }
  365. public Builder specificCameraId(Integer id) {
  366. specificCameraId = id;
  367. return this;
  368. }
  369. public Builder cameraListener(CameraListener listener) {
  370. cameraListener = listener;
  371. return this;
  372. }
  373. public CameraHelper build() {
  374. if (previewViewSize == null) {
  375. throw new RuntimeException("previewViewSize is null, now use default previewSize");
  376. }
  377. if (cameraListener == null) {
  378. throw new RuntimeException("cameraListener is null, callback will not be called");
  379. }
  380. if (previewDisplayView == null) {
  381. throw new RuntimeException("you must preview on a textureView or a surfaceView");
  382. }
  383. //build的时候顺便执行初始化
  384. CameraHelper cameraHelper = new CameraHelper(this);
  385. cameraHelper.init();
  386. return cameraHelper;
  387. }
  388. }
  389. }

Using it is now very simple.

Again, create the SurfaceView and add it to the target layout container:

public View initCamera(Context context) {
    mSurfaceView = new SurfaceView(context);
    mContext = context;
    mSurfaceView.setLayoutParams(new ViewGroup.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT));
    mSurfaceView.getViewTreeObserver().addOnGlobalLayoutListener(new ViewTreeObserver.OnGlobalLayoutListener() {
        @Override
        public void onGlobalLayout() {
            mSurfaceView.post(() -> {
                setupCameraHelper();
            });
            mSurfaceView.getViewTreeObserver().removeOnGlobalLayoutListener(this);
        }
    });
    return mSurfaceView;
}

private void setupCameraHelper() {
    cameraHelper = new CameraHelper.Builder()
            .previewViewSize(new Point(mSurfaceView.getMeasuredWidth(), mSurfaceView.getMeasuredHeight()))
            .rotation(((Activity) mContext).getWindowManager().getDefaultDisplay().getRotation())
            .specificCameraId(Camera.CameraInfo.CAMERA_FACING_BACK)
            .isMirror(false)
            .previewOn(mSurfaceView)          // preview container; a TextureView is recommended
            .cameraListener(mCameraListener)  // set our custom listener
            .build();
    cameraHelper.start();
}

All of the logic is then handled in the callbacks:

// Custom listener
private CameraListener mCameraListener = new CameraListener() {
    @Override
    public void onCameraOpened(Camera camera, int cameraId, int displayOrientation, boolean isMirror) {
        YYLogUtils.w("CameraListener - onCameraOpened");
        // You could hand the camera to a MediaRecorder here to record video
        mMediaRecorder = new MediaRecorder();
        ...
    }

    @Override
    public void onPreview(byte[] data, Camera camera) {
        // NV21 frame data
        // You could also encode it yourself with MediaCodec to record video
    }

    @Override
    public void onCameraClosed() {
        YYLogUtils.w("CameraListener - onCameraClosed");
    }

    @Override
    public void onCameraError(Exception e) {
        YYLogUtils.w("CameraListener - onCameraError");
    }

    @Override
    public void onCameraConfigurationChanged(int cameraID, int displayOrientation) {
        YYLogUtils.w("CameraListener - onCameraConfigurationChanged");
    }
};

Stopping and releasing:

@Override
public void releaseAllCamera() {
    cameraHelper.stop();
    cameraHelper.release();
}

The result:

[screenshot: Camera1 preview running]

Detailed, right? Really detailed; maybe even too much code.

3. Using and Wrapping Camera2

Camera2 is obtained through a system service, so using it involves a few more steps.

Here we use a TextureView to host the preview:

public View initCamera(Context context) {
    mTextureView = new TextureView(context);
    mContext = context;
    mTextureView.setLayoutParams(new ViewGroup.LayoutParams(720, 1280));
    mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
    return mTextureView;
}

Same idea: we start Camera2 in the TextureView callback:

private TextureView.SurfaceTextureListener mSurfaceTextureListener = new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
        // Open the camera once the TextureView is available
        YYLogUtils.w("TextureView available, width:" + width + " height:" + height);
        openCamera(width, height);
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {
        // Update the preview transform when the TextureView size changes
        configureTransform(width, height);
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
        // Release resources when the TextureView is destroyed
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture texture) {
        // Called on every texture update
    }
};

private void openCamera(int width, int height) {
    // Get the camera manager
    mCameraManager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
    // Set up a dedicated handler thread
    HandlerThread handlerThread = new HandlerThread("Camera2Manager");
    handlerThread.start();
    mBgHandler = new Handler(handlerThread.getLooper());
    try {
        // Query the camera characteristics and pick the camera id
        getCameraListCameraCharacteristics(width, height);
        YYLogUtils.w("Opening camera id:" + mCurrentCameraId);
        // Open the camera
        mCameraManager.openCamera(mCurrentCameraId, new CameraDevice.StateCallback() {
            @Override
            public void onOpened(CameraDevice cameraDevice) {
                // Create the preview session once the camera is open
                mCameraDevice = cameraDevice;
                createCameraPreviewSession(mBgHandler, mTextureView.getWidth(), mTextureView.getHeight());
            }

            @Override
            public void onDisconnected(CameraDevice cameraDevice) {
                if (mCameraDevice != null) {
                    mCameraDevice.close();
                    mCameraDevice = null;
                }
            }

            @Override
            public void onError(CameraDevice cameraDevice, int error) {
                if (mCameraDevice != null) {
                    mCameraDevice.close();
                    mCameraDevice = null;
                }
            }
        }, mBgHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

A note on the handler: this is only for demonstration. In real code the Camera2 callbacks should run on a properly managed background thread rather than being wired up ad hoc inside openCamera (the helper classes further down do exactly that).
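A minimal sketch of that idea (class and method names are my own): keep one HandlerThread alive for the camera callbacks and quit it once the camera has been closed.

import android.os.Handler;
import android.os.HandlerThread;

public class CameraThread {
    private HandlerThread thread;
    private Handler handler;

    // Start the background thread before opening the camera.
    public Handler start() {
        thread = new HandlerThread("camera-background");
        thread.start();
        handler = new Handler(thread.getLooper());
        return handler;
    }

    // Quit the thread after the camera and the session have been closed.
    public void stop() {
        if (thread != null) {
            thread.quitSafely();
            thread = null;
            handler = null;
        }
    }
}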

Continuing on: once the camera is open we create the preview session and the preview Surface, and issue the preview request from the session callback:

/**
 * Create the camera preview session
 */
private void createCameraPreviewSession(Handler handler, int width, int height) {
    try {
        // Get the SurfaceTexture and set its default buffer size
        SurfaceTexture texture = mTextureView.getSurfaceTexture();
        texture.setDefaultBufferSize(mTextureView.getWidth(), mTextureView.getHeight());
        // Create the preview Surface
        Surface surface = new Surface(texture);
        // Create the CaptureRequest.Builder and add the preview Surface as a target
        mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        mPreviewRequestBuilder.addTarget(surface);
        // Create the ImageReader and register its callback
        YYLogUtils.w("Creating ImageReader, width:" + width + " height:" + height);
        // mImageReader = ImageReader.newInstance(mTextureView.getWidth(), mTextureView.getHeight(), ImageFormat.JPEG, 1);
        mImageReader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2);
        mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, handler);
        // Add the ImageReader's Surface to the CaptureRequest.Builder as well
        Surface readerSurface = mImageReader.getSurface();
        mPreviewRequestBuilder.addTarget(readerSurface);
        // Create the preview session
        mCameraDevice.createCaptureSession(Arrays.asList(surface, readerSurface), mSessionCallback, handler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

private CameraCaptureSession.StateCallback mSessionCallback = new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
        // The preview session is ready, start previewing
        mPreviewSession = cameraCaptureSession;
        updatePreview();
    }

    @Override
    public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
        ToastUtils.INSTANCE.makeText(mContext, "Failed to create camera preview session");
    }
};

private void updatePreview() {
    try {
        // Continuous auto focus
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
        // Build the preview request
        mPreviewRequest = mPreviewRequestBuilder.build();
        // Issue the repeating preview request
        mPreviewSession.setRepeatingRequest(mPreviewRequest, null, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

private ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        // Frames arrive here in YUV_420_888 format
        Image image = reader.acquireLatestImage();
        if (image == null) {
            return;
        }
        // Copy the plane data into a byte[] here (e.g. for saveImage()) while the Image is still open,
        // then always close the Image so the reader does not stall
        image.close();
    }
};

As you can see, basic Camera2 usage is quite a bit more involved than Camera1, and this still leaves out the code for rotation, preview orientation, best-size selection and so on (the helper classes below include it). No wonder Google built CameraX; apparently even they could not stand it.

Of course, once it is wrapped up it is just as easy to use. Here is my helper.

It is provided as several separate implementations: a basic preview-only one, an "AllSize" one with crop/fit display modes, and the full one that adds an ImageReader frame callback.

I really cannot paste all of it here, there would be too much code; see the source linked at the bottom of the article. Only the key code is shown.

Since many Camera2 operations are best done on a worker thread, we first define a background Looper and Handler wrapper:

public abstract class BaseMessageLoop {
    private volatile MsgHandlerThread mHandlerThread;
    private volatile Handler mHandler;
    private String mName;

    public BaseMessageLoop(Context context, String name) {
        mName = name;
    }

    public MsgHandlerThread getHandlerThread() {
        return mHandlerThread;
    }

    public Handler getHandler() {
        return mHandler;
    }

    public void Run() {
        Quit();
        //LogUtil.v(TAG, mName + " HandlerThread Run");
        synchronized (this) {
            mHandlerThread = new MsgHandlerThread(mName);
            mHandlerThread.start();
            mHandler = new Handler(mHandlerThread.getLooper(), mHandlerThread);
        }
    }

    public void Quit() {
        //LogUtil.v(TAG, mName + " HandlerThread Quit");
        synchronized (this) {
            if (mHandlerThread != null) {
                mHandlerThread.quit();
            }
            if (mHandler != null) {
                mHandler.removeCallbacks(mHandlerThread);
            }
            mHandlerThread = null;
            mHandler = null;
        }
    }
    ...
}

The basic preview provider:

  1. public class BaseCommonCameraProvider extends BaseCameraProvider {
  2. protected Activity mContext;
  3. protected String mCameraId;
  4. protected Handler mCameraHandler;
  5. private final BaseMessageLoop mThread;
  6. protected CameraDevice mCameraDevice;
  7. protected CameraCaptureSession session;
  8. protected AspectTextureView[] mTextureViews;
  9. protected CameraManager cameraManager;
  10. protected OnCameraInfoListener mCameraInfoListener;
  11. public interface OnCameraInfoListener {
  12. void getBestSize(Size outputSizes);
  13. void onFrameCannback(Image image);
  14. void initEncode();
  15. void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height);
  16. }
  17. public void setCameraInfoListener(OnCameraInfoListener cameraInfoListener) {
  18. this.mCameraInfoListener = cameraInfoListener;
  19. }
  20. protected BaseCommonCameraProvider(Activity mContext) {
  21. this.mContext = mContext;
  22. mThread = new BaseMessageLoop(mContext, "camera") {
  23. @Override
  24. protected boolean recvHandleMessage(Message msg) {
  25. return false;
  26. }
  27. };
  28. mThread.Run();
  29. mCameraHandler = mThread.getHandler();
  30. cameraManager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
  31. }
  32. protected String getCameraId(boolean useFront) {
  33. try {
  34. for (String cameraId : cameraManager.getCameraIdList()) {
  35. CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
  36. int cameraFacing = characteristics.get(CameraCharacteristics.LENS_FACING);
  37. if (useFront) {
  38. if (cameraFacing == CameraCharacteristics.LENS_FACING_FRONT) {
  39. return cameraId;
  40. }
  41. } else {
  42. if (cameraFacing == CameraCharacteristics.LENS_FACING_BACK) {
  43. return cameraId;
  44. }
  45. }
  46. }
  47. } catch (CameraAccessException e) {
  48. e.printStackTrace();
  49. }
  50. return null;
  51. }
  52. protected Size getCameraBestOutputSizes(String cameraId, Class clz) {
  53. try {
  54. //拿到支持的全部Size,并从大到小排序
  55. CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
  56. StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
  57. List<Size> sizes = Arrays.asList(configs.getOutputSizes(clz));
  58. Collections.sort(sizes, new Comparator<Size>() {
  59. @Override
  60. public int compare(Size o1, Size o2) {
  61. return o1.getWidth() * o1.getHeight() - o2.getWidth() * o2.getHeight();
  62. }
  63. });
  64. Collections.reverse(sizes);
  65. YYLogUtils.w("all_sizes:" + sizes);
  66. //去除一些不合适的预览尺寸
  67. List<Size> suitableSizes = new ArrayList();
  68. for (int i = 0; i < sizes.size(); i++) {
  69. Size option = sizes.get(i);
  70. if (textureViewSize.getWidth() > textureViewSize.getHeight()) {
  71. if (option.getWidth() >= textureViewSize.getWidth() && option.getHeight() >= textureViewSize.getHeight()) {
  72. suitableSizes.add(option);
  73. }
  74. } else {
  75. if (option.getWidth() >= textureViewSize.getHeight() && option.getHeight() >= textureViewSize.getWidth()) {
  76. suitableSizes.add(option);
  77. }
  78. }
  79. }
  80. YYLogUtils.w("suitableSizes:" + suitableSizes);
  81. //获取最小占用的Size
  82. if (!suitableSizes.isEmpty()) {
  83. return suitableSizes.get(suitableSizes.size() - 1);
  84. } else {
  85. //异常情况下只能找默认的了
  86. return sizes.get(0);
  87. }
  88. } catch (CameraAccessException e) {
  89. e.printStackTrace();
  90. }
  91. return null;
  92. }
  93. protected List<Size> getCameraAllSizes(String cameraId, int format) {
  94. try {
  95. CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
  96. StreamConfigurationMap configs = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
  97. return Arrays.asList(configs.getOutputSizes(format));
  98. } catch (CameraAccessException e) {
  99. e.printStackTrace();
  100. }
  101. return null;
  102. }
  103. protected void releaseCameraDevice(CameraDevice cameraDevice) {
  104. if (cameraDevice != null) {
  105. cameraDevice.close();
  106. cameraDevice = null;
  107. }
  108. }
  109. protected void releaseCameraSession(CameraCaptureSession session) {
  110. if (session != null) {
  111. session.close();
  112. session = null;
  113. }
  114. }
  115. protected void releaseCamera() {
  116. releaseCameraDevice(mCameraDevice);
  117. releaseCameraSession(session);
  118. }
  119. }

The base class extracts the shared listeners and callbacks and implements things like lens selection and supported-size selection internally, so a simple preview can be set up with very little code.

What we usually want, though, is a TextureView that can crop or stretch itself, so here is a dedicated preview provider for that:

  1. /**
  2. * 只用于预览,没有帧回调
  3. */
  4. public class Camera2AllSizeProvider extends BaseCommonCameraProvider {
  5. private CaptureRequest.Builder mPreviewBuilder;
  6. private Size outputSize;
  7. public Camera2AllSizeProvider(Activity mContext) {
  8. super(mContext);
  9. Point displaySize = new Point();
  10. mContext.getWindowManager().getDefaultDisplay().getSize(displaySize);
  11. screenSize = new Size(displaySize.x, displaySize.y);
  12. }
  13. private void initCamera() {
  14. mCameraId = getCameraId(false);//默认使用后置相机
  15. //获取指定相机的输出尺寸列表,降序排序
  16. outputSize = getCameraBestOutputSizes(mCameraId, SurfaceTexture.class);
  17. //初始化预览尺寸
  18. previewSize = outputSize;
  19. YYLogUtils.w("previewSize,width:" + previewSize.getWidth() + "height:" + previewSize.getHeight());
  20. if (mCameraInfoListener != null) {
  21. mCameraInfoListener.getBestSize(outputSize);
  22. }
  23. }
  24. int index = 0;
  25. /**
  26. * 关联并初始化TextTure
  27. */
  28. public void initTexture(AspectTextureView... textureViews) {
  29. mTextureViews = textureViews;
  30. int size = textureViews.length;
  31. for (AspectTextureView aspectTextureView : textureViews) {
  32. aspectTextureView.setSurfaceTextureListener(new Camera2SimpleInterface.SimpleSurfaceTextureListener() {
  33. @Override
  34. public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
  35. textureViewSize = new Size(width, height);
  36. YYLogUtils.w("textureViewSize,width:" + textureViewSize.getWidth() + "height:" + textureViewSize.getHeight());
  37. YYLogUtils.w("screenSize,width:" + screenSize.getWidth() + "height:" + screenSize.getHeight());
  38. if (++index == size) {
  39. initCamera();
  40. openCamera();
  41. }
  42. }
  43. });
  44. }
  45. }
  46. @SuppressLint("MissingPermission")
  47. private void openCamera() {
  48. try {
  49. cameraManager.openCamera(mCameraId, mStateCallback, mCameraHandler);
  50. } catch (CameraAccessException e) {
  51. e.printStackTrace();
  52. }
  53. }
  54. private final CameraDevice.StateCallback mStateCallback = new Camera2SimpleInterface.SimpleCameraDeviceStateCallback() {
  55. @Override
  56. public void onOpened(CameraDevice camera) {
  57. mCameraDevice = camera;
  58. MediaCodecList allMediaCodecLists = new MediaCodecList(-1);
  59. MediaCodecInfo avcCodecInfo = null;
  60. for (MediaCodecInfo mediaCodecInfo : allMediaCodecLists.getCodecInfos()) {
  61. if (mediaCodecInfo.isEncoder()) {
  62. String[] supportTypes = mediaCodecInfo.getSupportedTypes();
  63. for (String supportType : supportTypes) {
  64. if (supportType.equals(MediaFormat.MIMETYPE_VIDEO_AVC)) {
  65. avcCodecInfo = mediaCodecInfo;
  66. Log.d("TAG", "编码器名称:" + mediaCodecInfo.getName() + " " + supportType);
  67. MediaCodecInfo.CodecCapabilities codecCapabilities = avcCodecInfo.getCapabilitiesForType(MediaFormat.MIMETYPE_VIDEO_AVC);
  68. int[] colorFormats = codecCapabilities.colorFormats;
  69. for (int colorFormat : colorFormats) {
  70. switch (colorFormat) {
  71. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV411Planar:
  72. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV411PackedPlanar:
  73. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
  74. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
  75. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
  76. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
  77. Log.d("TAG", "支持的格式::" + colorFormat);
  78. break;
  79. }
  80. }
  81. }
  82. }
  83. }
  84. }
  85. //根据什么Size来展示PreView
  86. startPreviewSession(previewSize);
  87. }
  88. };
  89. public void startPreviewSession(Size size) {
  90. YYLogUtils.w("startPreviewSession 真正的Size,width:" + size.getWidth() + " height:" + size.getHeight());
  91. try {
  92. releaseCameraSession(session);
  93. mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
  94. List<Surface> outputs = new ArrayList<>();
  95. for (AspectTextureView aspectTextureView : mTextureViews) {
  96. //设置预览大小与展示的裁剪模式
  97. aspectTextureView.setScaleType(AspectInterface.ScaleType.FIT_CENTER);
  98. aspectTextureView.setSize(size.getHeight(), size.getWidth());
  99. SurfaceTexture surfaceTexture = aspectTextureView.getSurfaceTexture();
  100. surfaceTexture.setDefaultBufferSize(size.getWidth(), size.getHeight());
  101. Surface previewSurface = new Surface(surfaceTexture);
  102. mPreviewBuilder.addTarget(previewSurface);
  103. outputs.add(previewSurface);
  104. }
  105. mCameraDevice.createCaptureSession(outputs, mStateCallBack, mCameraHandler);
  106. } catch (CameraAccessException e) {
  107. e.printStackTrace();
  108. }
  109. }
  110. private final CameraCaptureSession.StateCallback mStateCallBack = new Camera2SimpleInterface.SimpleStateCallback() {
  111. @Override
  112. public void onConfigured(CameraCaptureSession session) {
  113. try {
  114. Camera2AllSizeProvider.this.session = session;
  115. //设置拍照前持续自动对焦
  116. mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
  117. CaptureRequest request = mPreviewBuilder.build();
  118. session.setRepeatingRequest(request, null, mCameraHandler);
  119. } catch (CameraAccessException e) {
  120. e.printStackTrace();
  121. }
  122. }
  123. };
  124. public void closeCamera() {
  125. releaseCamera();
  126. if (mCameraDevice != null) {
  127. mCameraDevice.close();
  128. }
  129. }
  130. }

For the preview frame callback, we provide yet another separate implementation:

  1. /**
  2. * 在预览的基础上加入ImageReader的帧回调,可以用于编码H264视频流等操作
  3. */
  4. public class Camera2ImageReaderProvider extends BaseCommonCameraProvider {
  5. private CaptureRequest.Builder mPreviewBuilder;
  6. protected ImageReader mImageReader;
  7. private Size outputSize;
  8. public Camera2ImageReaderProvider(Activity mContext) {
  9. super(mContext);
  10. Point displaySize = new Point();
  11. mContext.getWindowManager().getDefaultDisplay().getSize(displaySize);
  12. screenSize = new Size(displaySize.x, displaySize.y);
  13. YYLogUtils.w("screenSize,width:" + screenSize.getWidth() + "height:" + screenSize.getHeight(), "Camera2ImageReaderProvider");
  14. }
  15. private void initCamera() {
  16. mCameraId = getCameraId(false);//默认使用后置相机
  17. //获取指定相机的输出尺寸列表,降序排序
  18. outputSize = getCameraBestOutputSizes(mCameraId, SurfaceTexture.class);
  19. //初始化预览尺寸
  20. previewSize = outputSize;
  21. YYLogUtils.w("previewSize,width:" + previewSize.getWidth() + "height:" + previewSize.getHeight(), "Camera2ImageReaderProvider");
  22. if (mCameraInfoListener != null) {
  23. mCameraInfoListener.getBestSize(outputSize);
  24. }
  25. }
  26. int index = 0;
  27. /**
  28. * 关联并初始化TextTure
  29. */
  30. public void initTexture(AspectTextureView... textureViews) {
  31. mTextureViews = textureViews;
  32. int size = textureViews.length;
  33. for (AspectTextureView aspectTextureView : textureViews) {
  34. aspectTextureView.setSurfaceTextureListener(new Camera2SimpleInterface.SimpleSurfaceTextureListener() {
  35. @Override
  36. public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
  37. textureViewSize = new Size(width, height);
  38. YYLogUtils.w("textureViewSize,width:" + textureViewSize.getWidth() + "height:" + textureViewSize.getHeight(), "Camera2ImageReaderProvider");
  39. if (mCameraInfoListener != null) {
  40. mCameraInfoListener.onSurfaceTextureAvailable(surfaceTexture, width, height);
  41. }
  42. if (++index == size) {
  43. initCamera();
  44. openCamera();
  45. }
  46. }
  47. });
  48. }
  49. }
  50. //初始化编码格式
  51. public void initEncord() {
  52. if (mCameraInfoListener != null) {
  53. mCameraInfoListener.initEncode();
  54. }
  55. }
  56. @SuppressLint("MissingPermission")
  57. private void openCamera() {
  58. try {
  59. cameraManager.openCamera(mCameraId, mStateCallback, mCameraHandler);
  60. } catch (CameraAccessException e) {
  61. e.printStackTrace();
  62. }
  63. }
  64. private final CameraDevice.StateCallback mStateCallback = new Camera2SimpleInterface.SimpleCameraDeviceStateCallback() {
  65. @Override
  66. public void onOpened(CameraDevice camera) {
  67. mCameraDevice = camera;
  68. initEncord();
  69. MediaCodecList allMediaCodecLists = new MediaCodecList(-1);
  70. MediaCodecInfo avcCodecInfo = null;
  71. for (MediaCodecInfo mediaCodecInfo : allMediaCodecLists.getCodecInfos()) {
  72. if (mediaCodecInfo.isEncoder()) {
  73. String[] supportTypes = mediaCodecInfo.getSupportedTypes();
  74. for (String supportType : supportTypes) {
  75. if (supportType.equals(MediaFormat.MIMETYPE_VIDEO_AVC)) {
  76. avcCodecInfo = mediaCodecInfo;
  77. Log.d("TAG", "编码器名称:" + mediaCodecInfo.getName() + " " + supportType);
  78. MediaCodecInfo.CodecCapabilities codecCapabilities = avcCodecInfo.getCapabilitiesForType(MediaFormat.MIMETYPE_VIDEO_AVC);
  79. int[] colorFormats = codecCapabilities.colorFormats;
  80. for (int colorFormat : colorFormats) {
  81. switch (colorFormat) {
  82. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV411Planar:
  83. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV411PackedPlanar:
  84. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
  85. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
  86. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
  87. case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
  88. Log.d("TAG", "支持的格式::" + colorFormat);
  89. break;
  90. }
  91. }
  92. }
  93. }
  94. }
  95. }
  96. //根据什么Size来展示PreView
  97. startPreviewSession(previewSize);
  98. }
  99. };
  100. public void startPreviewSession(Size size) {
  101. YYLogUtils.w("真正的预览Size,width:" + size.getWidth() + " height:" + size.getHeight(), "Camera2ImageReaderProvider");
  102. try {
  103. releaseCameraSession(session);
  104. mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
  105. List<Surface> outputs = new ArrayList<>();
  106. for (AspectTextureView aspectTextureView : mTextureViews) {
  107. //设置预览大小与展示的裁剪模式
  108. aspectTextureView.setScaleType(AspectInterface.ScaleType.FIT_CENTER);
  109. aspectTextureView.setSize(size.getHeight(), size.getWidth());
  110. SurfaceTexture surfaceTexture = aspectTextureView.getSurfaceTexture();
  111. surfaceTexture.setDefaultBufferSize(size.getWidth(), size.getHeight());
  112. Surface previewSurface = new Surface(surfaceTexture);
  113. mPreviewBuilder.addTarget(previewSurface);
  114. outputs.add(previewSurface);
  115. }
  116. //这里的回调监听
  117. mImageReader = ImageReader.newInstance(size.getWidth(), size.getHeight(), ImageFormat.YUV_420_888, 10);
  118. mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mCameraHandler);
  119. Surface readerSurface = mImageReader.getSurface();
  120. mPreviewBuilder.addTarget(readerSurface);
  121. outputs.add(readerSurface);
  122. mCameraDevice.createCaptureSession(outputs, mStateCallBack, mCameraHandler);
  123. } catch (CameraAccessException e) {
  124. e.printStackTrace();
  125. }
  126. }
  127. private ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
  128. @Override
  129. public void onImageAvailable(ImageReader reader) {
  130. Image image = reader.acquireLatestImage();
  131. if (image == null) {
  132. return;
  133. }
  134. if (mCameraInfoListener != null) {
  135. mCameraInfoListener.onFrameCannback(image);
  136. }
  137. image.close();
  138. }
  139. };
  140. private final CameraCaptureSession.StateCallback mStateCallBack = new Camera2SimpleInterface.SimpleStateCallback() {
  141. @Override
  142. public void onConfigured(CameraCaptureSession session) {
  143. try {
  144. Camera2ImageReaderProvider.this.session = session;
  145. //设置拍照前持续自动对焦
  146. mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
  147. // mPreviewBuilder.set(CaptureRequest.JPEG_ORIENTATION, 90);
  148. CaptureRequest request = mPreviewBuilder.build();
  149. session.setRepeatingRequest(request, null, mCameraHandler);
  150. } catch (CameraAccessException e) {
  151. e.printStackTrace();
  152. }
  153. }
  154. };
  155. public void closeCamera() {
  156. releaseCamera();
  157. if (mImageReader != null) {
  158. mImageReader.close();
  159. }
  160. if (mCameraDevice != null) {
  161. mCameraDevice.close();
  162. }
  163. }
  164. }

About sizes in Camera2: there are really three sizes to keep apart: the current screen size, the current TextureView size, and the current preview size.

The two that are easiest to confuse are the TextureView size and the preview size. One is the size of the display view, the other is a preview size supported by the Camera. They may happen to be the same, but more often they differ, sometimes even in aspect ratio, which is why the preview has to be shown with center-crop or fit-center scaling. The definitions and uses of all three sizes are commented in detail in the code above.
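As a rough sketch of what fit-center versus center-crop means for those two sizes (my own illustration with assumed parameter names, not the AspectTextureView implementation from the source):

import android.graphics.Matrix;

public class PreviewScaleDemo {
    /**
     * Correction matrix for TextureView.setTransform(). A TextureView stretches the
     * camera buffer (bufferW x bufferH) to fill the view (viewW x viewH) by default;
     * this matrix rescales that stretched image so the frame keeps its aspect ratio,
     * either fully visible (fit-center) or filling the view and cropped (center-crop).
     */
    public static Matrix computeTransform(int viewW, int viewH, int bufferW, int bufferH, boolean fitCenter) {
        float fitX = (float) viewW / bufferW;
        float fitY = (float) viewH / bufferH;
        // fit-center uses the smaller scale (letterbox), center-crop the larger one (overflow).
        float scale = fitCenter ? Math.min(fitX, fitY) : Math.max(fitX, fitY);

        Matrix matrix = new Matrix();
        // Scale about the view centre so the frame stays centred either way.
        matrix.setScale(scale * bufferW / viewW, scale * bufferH / viewH, viewW / 2f, viewH / 2f);
        return matrix;
    }
}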

Some non-essential classes and callbacks are omitted from the code above; if you are interested, check the source (linked at the end of the article).

How do we use it? With the providers defined, here is a simple usage example.

This one encodes the frames to H264:

fun setupCamera(activity: Activity, container: ViewGroup) {
    file = File(CommUtils.getContext().externalCacheDir, "${System.currentTimeMillis()}-record.h264")
    if (!file.exists()) {
        file.createNewFile()
    }
    if (!file.isDirectory) {
        outputStream = FileOutputStream(file, true)
    }

    val textureView = AspectTextureView(activity)
    textureView.layoutParams = ViewGroup.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)

    mCamera2Provider = Camera2ImageReaderProvider(activity)
    mCamera2Provider?.initTexture(textureView)
    mCamera2Provider?.setCameraInfoListener(object :
        BaseCommonCameraProvider.OnCameraInfoListener {

        override fun getBestSize(outputSizes: Size?) {
            mPreviewSize = outputSizes
        }

        override fun onFrameCannback(image: Image) {
            if (isRecording) {
                // Use the C library to convert to I420, which matches COLOR_FormatYUV420Planar
                val yuvFrame = yuvUtils.convertToI420(image)
                // Rotate so the frame matches the width/height configured in the MediaFormat
                val yuvFrameRotate = yuvUtils.rotate(yuvFrame, 90)

                // Only for testing: convert to an RGB bitmap for an on-screen preview callback
                bitmap = Bitmap.createBitmap(yuvFrameRotate.width, yuvFrameRotate.height, Bitmap.Config.ARGB_8888)
                yuvUtils.yuv420ToArgb(yuvFrameRotate, bitmap!!)
                mBitmapCallback?.invoke(bitmap)

                // Queue the rotated I420 frame for the encoder
                val bytesFromImageAsType = yuvFrameRotate.asArray()

                // Alternatively, use the Java utility to convert the Image to NV21, which matches COLOR_FormatYUV420Flexible
                // val bytesFromImageAsType = getBytesFromImageAsType(image, Camera2ImageUtils.YUV420SP)
                originVideoDataList.offer(bytesFromImageAsType)
            }
        }

        override fun initEncode() {
            mediaCodecEncodeToH264()
        }

        override fun onSurfaceTextureAvailable(surfaceTexture: SurfaceTexture?, width: Int, height: Int) {
            this@VideoH264RecoderUtils.surfaceTexture = surfaceTexture
        }
    })

    container.addView(textureView)
}

Because the frames come back as YUV_420_888, we need to convert them to I420 or NV21 first. The example shows both paths: a C library converts to I420 (planar YUV420), and a Java utility converts to NV21 (YUV420SP).
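The Java utility itself is not shown in the article; a simplified sketch of such a YUV_420_888 to NV21 conversion might look like the following (this is an assumption of roughly what getBytesFromImageAsType does, not the author's actual implementation):

import android.media.Image;
import java.nio.ByteBuffer;

public class YuvConvertDemo {
    // Convert a YUV_420_888 Image into an NV21 byte array (full Y plane followed by interleaved VU).
    public static byte[] toNv21(Image image) {
        int width = image.getWidth();
        int height = image.getHeight();
        byte[] nv21 = new byte[width * height * 3 / 2];

        // Plane 0 is Y; the row stride may be larger than the width, so copy row by row.
        ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
        int yRowStride = image.getPlanes()[0].getRowStride();
        int pos = 0;
        for (int row = 0; row < height; row++) {
            yBuffer.position(row * yRowStride);
            yBuffer.get(nv21, pos, width);
            pos += width;
        }

        // Planes 1 and 2 are U and V; NV21 wants V first, then U, interleaved.
        ByteBuffer uBuffer = image.getPlanes()[1].getBuffer();
        ByteBuffer vBuffer = image.getPlanes()[2].getBuffer();
        int uvRowStride = image.getPlanes()[1].getRowStride();
        int uvPixelStride = image.getPlanes()[1].getPixelStride();
        for (int row = 0; row < height / 2; row++) {
            for (int col = 0; col < width / 2; col++) {
                int bufferIndex = row * uvRowStride + col * uvPixelStride;
                nv21[pos++] = vBuffer.get(bufferIndex);
                nv21[pos++] = uBuffer.get(bufferIndex);
            }
        }
        return nv21;
    }
}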

With I420 and NV21, the two formats we see most often, in hand, we can then encode them into an H264 file. How to encode is not today's topic, though; there are simply too many ways, so it is skipped here.

The single setupCamera method does all the wiring; in an Activity one line is enough to set it up:

class RecoderVideo1Activity : BaseActivity() {
    override fun init() {
        val flContainer = findViewById<FrameLayout>(R.id.fl_container)
        val videoRecodeUtils = VideoH264RecoderUtils()
        videoRecodeUtils.setupCamera(this, flContainer)
    }
}

The result:

[screenshot: preview effect]

4. Using and Wrapping CameraX

Compared with the previous two APIs, CameraX is much simpler to use, and there are plenty of CameraX tutorials online, so let's go through it quickly.

We create a PreviewView and add it to the target container:

public View initCamera(Context context) {
    mPreviewView = new PreviewView(context);
    mContext = context;
    mPreviewView.setScaleType(PreviewView.ScaleType.FIT_CENTER);
    mPreviewView.setLayoutParams(new ViewGroup.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT));
    mPreviewView.getViewTreeObserver().addOnGlobalLayoutListener(new ViewTreeObserver.OnGlobalLayoutListener() {
        @Override
        public void onGlobalLayout() {
            if (mPreviewView.isShown()) {
                startCamera();
            }
            mPreviewView.getViewTreeObserver().removeOnGlobalLayoutListener(this);
        }
    });
    return mPreviewView;
}

Next we start the camera and bind it to the lifecycle:

private void startCamera() {
    // Screen resolution
    DisplayMetrics displayMetrics = new DisplayMetrics();
    mPreviewView.getDisplay().getRealMetrics(displayMetrics);
    // Aspect ratio
    int screenAspectRatio = aspectRatio(displayMetrics.widthPixels, displayMetrics.heightPixels);
    int rotation = mPreviewView.getDisplay().getRotation();

    ListenableFuture<ProcessCameraProvider> cameraProviderFuture = ProcessCameraProvider.getInstance(mContext);
    cameraProviderFuture.addListener(() -> {
        try {
            // Get the camera provider
            mCameraProvider = cameraProviderFuture.get();
            // Lens selection
            mLensFacing = getLensFacing();
            mCameraSelector = new CameraSelector.Builder().requireLensFacing(mLensFacing).build();
            // Preview use case
            Preview.Builder previewBuilder = new Preview.Builder()
                    .setTargetAspectRatio(screenAspectRatio)
                    .setTargetRotation(rotation);
            Preview preview = previewBuilder.build();
            preview.setSurfaceProvider(mPreviewView.getSurfaceProvider());
            // Video capture use case
            mVideoCapture = new VideoCapture.Builder()
                    .setTargetAspectRatio(screenAspectRatio)              // aspect ratio
                    .setAudioRecordSource(MediaRecorder.AudioSource.MIC)  // audio source: microphone
                    .setTargetRotation(rotation)
                    .setVideoFrameRate(30)                                // frame rate
                    .setBitRate(3 * 1024 * 1024)                          // bit rate
                    .build();

            // ImageAnalysis imageAnalysis = new ImageAnalysis.Builder()
            //         .setTargetAspectRatio(screenAspectRatio)
            //         .setTargetRotation(rotation)
            //         .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
            //         .build();
            //
            // // Apply a color matrix to every frame
            // imageAnalysis.setAnalyzer(Executors.newSingleThreadExecutor(), new MyAnalyzer(mContext));

            // Start CameraX
            mCameraProvider.unbindAll();
            if (mContext instanceof FragmentActivity) {
                FragmentActivity fragmentActivity = (FragmentActivity) mContext;
                mCameraProvider.bindToLifecycle(fragmentActivity, mCameraSelector, preview, mVideoCapture/*, imageAnalysis*/);
            }
        } catch (ExecutionException | InterruptedException e) {
            e.printStackTrace();
        }
    }, ContextCompat.getMainExecutor(mContext));
}

CameraX offers preview, image-capture, video-capture and image-analysis use cases. Here we bind the preview and video-capture use cases, and the analysis use case is commented out because the video-capture and image-analysis use cases cannot be bound at the same time.

For recording video, we can then simply call the video-capture use case directly, which is the easiest way:

public void startCameraRecord() {
    if (mVideoCapture == null) return;
    VideoCapture.OutputFileOptions outputFileOptions = new VideoCapture.OutputFileOptions.Builder(getOutFile()).build();
    mVideoCapture.startRecording(outputFileOptions, mExecutorService, new VideoCapture.OnVideoSavedCallback() {
        @Override
        public void onVideoSaved(@NonNull VideoCapture.OutputFileResults outputFileResults) {
            YYLogUtils.w("Video saved, outputFileResults:" + outputFileResults.getSavedUri());
            if (mCameraCallback != null) mCameraCallback.takeSuccess();
        }

        @Override
        public void onError(int videoCaptureError, @NonNull String message, @Nullable Throwable cause) {
            YYLogUtils.e(message);
        }
    });
}

Some of the internal code, such as lens selection and aspect-ratio selection, is omitted here; it is included in the wrapper class below.

As you can see, even though CameraX is already simple to use, a lot of it is repetitive boilerplate, so it is still worth wrapping up. The code:

class CameraXController {
    private var mCameraProvider: ProcessCameraProvider? = null
    private var mLensFacing = 0
    private var mCameraSelector: CameraSelector? = null
    private var mVideoCapture: VideoCapture? = null
    private var mCameraCallback: ICameraCallback? = null
    private val mExecutorService = Executors.newSingleThreadExecutor()

    // Initialise the CameraX configuration
    fun setUpCamera(context: Context, surfaceProvider: Preview.SurfaceProvider) {
        // Screen resolution and aspect ratio
        val displayMetrics = context.resources.displayMetrics
        val screenAspectRatio = aspectRatio(displayMetrics.widthPixels, displayMetrics.heightPixels)
        val cameraProviderFuture = ProcessCameraProvider.getInstance(context)
        cameraProviderFuture.addListener({
            mCameraProvider = cameraProviderFuture.get()
            // Lens selection
            mLensFacing = lensFacing
            mCameraSelector = CameraSelector.Builder().requireLensFacing(mLensFacing).build()
            // Preview use case
            val preview: Preview = Preview.Builder()
                .setTargetAspectRatio(screenAspectRatio)
                .build()
            preview.setSurfaceProvider(surfaceProvider)
            // Video capture use case
            mVideoCapture = VideoCapture.Builder()
                .setTargetAspectRatio(screenAspectRatio)
                .setAudioRecordSource(MediaRecorder.AudioSource.MIC) // audio source: microphone
                .setVideoFrameRate(30)       // frame rate
                .setBitRate(3 * 1024 * 1024) // bit rate
                .build()
            // Bind to the lifecycle
            mCameraProvider?.unbindAll()
            val camera = mCameraProvider?.bindToLifecycle(
                context as LifecycleOwner,
                mCameraSelector!!,
                mVideoCapture,
                preview
            )
            val cameraInfo = camera?.cameraInfo
            val cameraControl = camera?.cameraControl
        }, ContextCompat.getMainExecutor(context))
    }

    // Choose a 4:3 or 16:9 preview ratio depending on the screen's aspect ratio
    private fun aspectRatio(widthPixels: Int, heightPixels: Int): Int {
        val previewRatio = Math.max(widthPixels, heightPixels).toDouble() / Math.min(widthPixels, heightPixels).toDouble()
        return if (Math.abs(previewRatio - 4.0 / 3.0) <= Math.abs(previewRatio - 16.0 / 9.0)) {
            AspectRatio.RATIO_4_3
        } else {
            AspectRatio.RATIO_16_9
        }
    }

    // Which lens to prefer
    private val lensFacing: Int
        get() {
            if (hasBackCamera()) {
                return CameraSelector.LENS_FACING_BACK
            }
            return if (hasFrontCamera()) {
                CameraSelector.LENS_FACING_FRONT
            } else -1
        }

    // Is there a back camera?
    private fun hasBackCamera(): Boolean {
        if (mCameraProvider == null) {
            return false
        }
        try {
            return mCameraProvider!!.hasCamera(CameraSelector.DEFAULT_BACK_CAMERA)
        } catch (e: CameraInfoUnavailableException) {
            e.printStackTrace()
        }
        return false
    }

    // Is there a front camera?
    private fun hasFrontCamera(): Boolean {
        if (mCameraProvider == null) {
            return false
        }
        try {
            return mCameraProvider!!.hasCamera(CameraSelector.DEFAULT_FRONT_CAMERA)
        } catch (e: CameraInfoUnavailableException) {
            e.printStackTrace()
        }
        return false
    }

    // Start recording
    fun startCameraRecord(outFile: File) {
        mVideoCapture ?: return
        val outputFileOptions: VideoCapture.OutputFileOptions = VideoCapture.OutputFileOptions.Builder(outFile).build()
        mVideoCapture!!.startRecording(outputFileOptions, mExecutorService, object : VideoCapture.OnVideoSavedCallback {
            override fun onVideoSaved(outputFileResults: VideoCapture.OutputFileResults) {
                YYLogUtils.w("Video saved, outputFileResults:" + outputFileResults.savedUri)
                mCameraCallback?.takeSuccess()
            }

            override fun onError(videoCaptureError: Int, message: String, cause: Throwable?) {
                YYLogUtils.e(message)
            }
        })
    }

    // Stop recording
    fun stopCameraRecord(cameraCallback: ICameraCallback?) {
        mCameraCallback = cameraCallback
        mVideoCapture?.stopRecording()
    }

    // Release everything
    fun releseAll() {
        mVideoCapture?.stopRecording()
        mExecutorService.shutdown()
        mCameraProvider?.unbindAll()
        mCameraProvider?.shutdown()
        mCameraProvider = null
    }
}

With the wrapper class, usage becomes even simpler:

override fun initCamera(context: Context): View {
    mPreviewView = PreviewView(context)
    mContext = context
    mPreviewView.layoutParams = ViewGroup.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)
    mPreviewView.viewTreeObserver.addOnGlobalLayoutListener(object : OnGlobalLayoutListener {
        override fun onGlobalLayout() {
            if (mPreviewView.isShown) {
                startCamera()
            }
            mPreviewView.viewTreeObserver.removeOnGlobalLayoutListener(this)
        }
    })
    return mPreviewView
}

private fun startCamera() {
    cameraXController.setUpCamera(mContext, mPreviewView.surfaceProvider)
}

If you want to start recording video:

override fun startCameraRecord() {
    cameraXController.startCameraRecord(outFile)
}

On top of that, CameraX has its own fit/crop handling built in, so we no longer need a custom TextureView for it.
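Those display modes come straight from PreviewView's built-in scale types; a tiny illustration (the helper class is my own):

import androidx.camera.view.PreviewView;

public class PreviewScaleTypeDemo {
    // PreviewView ships with fit/fill scale types, so no custom AspectTextureView is needed.
    public static void applyScaleType(PreviewView previewView, boolean crop) {
        previewView.setScaleType(crop
                ? PreviewView.ScaleType.FILL_CENTER   // center-crop: the frame fills the view
                : PreviewView.ScaleType.FIT_CENTER);  // letterbox: the whole frame stays visible
    }
}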

Fit-center and center-crop look like this:

[screenshots: fit-center and center-crop preview]

Excellent. It really is super simple to use and the compatibility is good too. Love it.

Summary

This article walked through sample code for the three common Camera APIs and wrapped each of them separately. Which one you use to build your feature is entirely up to you.

If I want NV21 data in the frame callback, I would actually pick Camera1, because that is what it returns natively, with no conversion needed. If I want I420, I would pick Camera2 or CameraX: a conversion is required either way and they are more convenient, especially combined with the libyuv library, which is more efficient.

What about recording video? For plain recording I would use CameraX's built-in video capture, which is the simplest and most convenient. For recording with effects, I would use Camera2 or CameraX together with a GLSurfaceView, because they perform better there.

Yes, cards on the table: I just like CameraX. Whatever the others can or cannot do, CameraX can do it, it is easier to use, and its compatibility is better. (Feel free to ignore my personal preferences; all three APIs work, pick whichever suits you.)

If any code pasted in this article is incomplete, open the project source and take a look (see the link near the top of the article). You can also follow my open-source project, which will keep being updated.

As for how to record video, which approaches exist and what their pros and cons are, a separate article will cover that later.

As usual, if I have explained anything poorly or got something wrong, please point it out. Questions are also welcome in the comments, although I may not know the answer either. :-)
