How Can an RTMP|RTSP Player on Android Call Back YUV or RGB Data?

Technology Options

RTMP|RTSP players on Android generally do not expose YUV or RGB data callbacks directly. If the playback side needs to run visual analysis or similar processing, the player must be able to call back YUV or RGB data. In general, the following approaches are available:

1. Use FFmpeg with JNI

FFmpeg is a powerful multimedia processing library that can decode video and extract frame data. Through JNI, the Java layer of an Android app can call into a C/C++ FFmpeg build to decode the RTSP stream and obtain YUV or RGB data.

Steps

  • Integrate the FFmpeg library into your Android project.
  • Use the FFmpeg API to set up a decoder for the RTSP stream.
  • Decode video frames and pass the YUV or RGB data from the decoder up to the Java layer.
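As a sketch of the last step, the I420 planes handed up from the native decoder can be converted to packed ARGB in plain Java using the standard BT.601 limited-range coefficients. This is only an illustration; the class and method names are made up, not part of FFmpeg:

```java
// Illustrative I420 -> packed ARGB conversion (BT.601 limited range),
// e.g. for frames handed up from a native FFmpeg decoder via JNI.
final class I420ToRgb {
    private static int clamp(int v) {
        return v < 0 ? 0 : (v > 255 ? 255 : v);
    }

    // y is width*height bytes; u and v are ((width+1)/2)*((height+1)/2) bytes each.
    static int[] convert(byte[] y, byte[] u, byte[] v, int width, int height) {
        int[] argb = new int[width * height];
        int chromaWidth = (width + 1) / 2;
        for (int row = 0; row < height; row++) {
            for (int col = 0; col < width; col++) {
                int yy = (y[row * width + col] & 0xFF) - 16;
                int ci = (row / 2) * chromaWidth + (col / 2); // 4:2:0 subsampling
                int d = (u[ci] & 0xFF) - 128;
                int e = (v[ci] & 0xFF) - 128;
                int r = clamp((298 * yy + 409 * e + 128) >> 8);
                int g = clamp((298 * yy - 100 * d - 208 * e + 128) >> 8);
                int b = clamp((298 * yy + 516 * d + 128) >> 8);
                argb[row * width + col] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }
}
```

In a real pipeline this conversion is usually done in native code (libswscale) or on the GPU for performance; the Java version just makes the plane layout and color math explicit.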

2. Use OpenGL ES

If you render video with OpenGL ES, you can process each frame's YUV data in a shader and convert it to RGB there if needed. Note, however, that this approach does not call YUV or RGB data back to the Java layer; it only lets you operate on the data at the GPU level.

3. Use MediaCodec with ImageReader

Starting with Android 5.0 (API level 21), MediaCodec can be used together with ImageReader to capture decoded video frames as Image objects. These Image objects give direct access to YUV or RGB data, depending on the configuration.

Steps

  • Configure MediaCodec to use ImageReader as its output.
  • Decode the RTSP stream and capture the decoded frames.
  • Read the YUV or RGB data from the Image objects delivered by ImageReader.
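One detail worth spelling out for the last step: the planes of an Image are usually not tightly packed, so reading them must honor the row stride and pixel stride reported by Image.Plane (getRowStride()/getPixelStride()). The plain-Java helper below is an illustrative sketch of that packing logic; the class name is made up:

```java
import java.nio.ByteBuffer;

// Copies one decoded-image plane into a tightly packed byte array.
// rowStride is the number of bytes per row (may include padding);
// pixelStride is the distance in bytes between consecutive samples
// (1 for the Y plane, often 2 for interleaved U/V planes).
final class PlanePacker {
    static byte[] pack(ByteBuffer plane, int width, int height,
                       int rowStride, int pixelStride) {
        byte[] out = new byte[width * height];
        for (int r = 0; r < height; r++) {
            for (int c = 0; c < width; c++) {
                out[r * width + c] = plane.get(r * rowStride + c * pixelStride);
            }
        }
        return out;
    }
}
```

Applied to each of the three planes of a YUV_420_888 Image, this yields contiguous Y, U, and V arrays ready for analysis or conversion.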

4. Use a third-party RTMP|RTSP player that calls back data directly

Taking the RTMP|RTSP playback module of the 大牛直播SDK as an example, a YUV or RGB data callback can be set directly. A sample invocation follows:

btnStartStopPlayback.setOnClickListener(new Button.OnClickListener() {

	@Override
	public void onClick(View v) {

		if (isPlaying) {
			Log.i(TAG, "Stop playback stream++");

			int iRet = libPlayer.SmartPlayerStopPlay(playerHandle);

			if (iRet != 0) {
				Log.e(TAG, "Call SmartPlayerStopPlay failed..");
				return;
			}

			btnHardwareDecoder.setEnabled(true);
			btnLowLatency.setEnabled(true);

			if (!isRecording) {
				btnPopInputUrl.setEnabled(true);
				btnPopInputKey.setEnabled(true);
				btnSetPlayBuffer.setEnabled(true);
				btnFastStartup.setEnabled(true);

				btnRecoderMgr.setEnabled(true);
				libPlayer.SmartPlayerClose(playerHandle);
				playerHandle = 0;
			}

			isPlaying = false;
			btnStartStopPlayback.setText("Start Playback");

			if (is_enable_hardware_render_mode && sSurfaceView != null) {
				sSurfaceView.setVisibility(View.GONE);
				sSurfaceView.setVisibility(View.VISIBLE);
			}

			Log.i(TAG, "Stop playback stream--");
		} else {
			Log.i(TAG, "Start playback stream++");

			if (!isRecording) {
				InitAndSetConfig();
			}

			// If the second parameter is null, only audio is played
			libPlayer.SmartPlayerSetSurface(playerHandle, sSurfaceView);

			libPlayer.SmartPlayerSetRenderScaleMode(playerHandle, 1);

			//int render_format = 1;
			//libPlayer.SmartPlayerSetSurfaceRenderFormat(playerHandle, render_format);

			//int is_enable_anti_alias = 1;
			//libPlayer.SmartPlayerSetSurfaceAntiAlias(playerHandle, is_enable_anti_alias);

			if (isHardwareDecoder && is_enable_hardware_render_mode) {
				libPlayer.SmartPlayerSetHWRenderMode(playerHandle, 1);
			}

			// External Render test
			//libPlayer.SmartPlayerSetExternalRender(playerHandle, new RGBAExternalRender(imageSavePath));
			libPlayer.SmartPlayerSetExternalRender(playerHandle, new I420ExternalRender(imageSavePath));

			libPlayer.SmartPlayerSetUserDataCallback(playerHandle, new UserDataCallback());
			//libPlayer.SmartPlayerSetSEIDataCallback(playerHandle, new SEIDataCallback());

			libPlayer.SmartPlayerSetAudioOutputType(playerHandle, 1);

			if (isMute) {
				libPlayer.SmartPlayerSetMute(playerHandle, 1);
			}

			if (isHardwareDecoder) {
				int isSupportHevcHwDecoder = libPlayer.SetSmartPlayerVideoHevcHWDecoder(playerHandle, 1);

				int isSupportH264HwDecoder = libPlayer.SetSmartPlayerVideoHWDecoder(playerHandle, 1);

				Log.i(TAG, "isSupportH264HwDecoder: " + isSupportH264HwDecoder + ", isSupportHevcHwDecoder: " + isSupportHevcHwDecoder);
			}

			libPlayer.SmartPlayerSetLowLatencyMode(playerHandle, isLowLatency ? 1 : 0);

			libPlayer.SmartPlayerSetFlipVertical(playerHandle, is_flip_vertical ? 1 : 0);

			libPlayer.SmartPlayerSetFlipHorizontal(playerHandle, is_flip_horizontal ? 1 : 0);

			libPlayer.SmartPlayerSetRotation(playerHandle, rotate_degrees);

			libPlayer.SmartPlayerSetAudioVolume(playerHandle, curAudioVolume);

			int iPlaybackRet = libPlayer.SmartPlayerStartPlay(playerHandle);

			if (iPlaybackRet != 0) {
				Log.e(TAG, "Call SmartPlayerStartPlay failed..");
				return;
			}

			btnStartStopPlayback.setText("Stop Playback");

			btnPopInputUrl.setEnabled(false);
			btnPopInputKey.setEnabled(false);
			btnHardwareDecoder.setEnabled(false);
			btnSetPlayBuffer.setEnabled(false);
			btnLowLatency.setEnabled(false);
			btnFastStartup.setEnabled(false);
			btnRecoderMgr.setEnabled(false);

			isPlaying = true;
			Log.i(TAG, "Start playback stream--");
		}
	}
});

The corresponding settings are as follows:

// External Render test: set only one of the two renderers (RGBA or I420)
libPlayer.SmartPlayerSetExternalRender(playerHandle, new RGBAExternalRender(imageSavePath));
libPlayer.SmartPlayerSetExternalRender(playerHandle, new I420ExternalRender(imageSavePath));

For RGBA data, the handling looks like this:

/*
 * RGBA data callback handling
 * Author: daniusdk.com
 * WeChat: xinsheng120
 */
private static class RGBAExternalRender implements NTExternalRender {
        // public static final int NT_FRAME_FORMAT_RGBA = 1;
        // public static final int NT_FRAME_FORMAT_ABGR = 2;
        // public static final int NT_FRAME_FORMAT_I420 = 3;

        private final String image_path_;
        private long last_save_image_time_ms_;

        private int width_;
        private int height_;
        private int row_bytes_;

        private ByteBuffer rgba_buffer_;

        public RGBAExternalRender(String image_path) {
            this.image_path_ = image_path;
        }

        @Override
        public int getNTFrameFormat() {
            Log.i(TAG, "RGBAExternalRender::getNTFrameFormat return " + NT_FRAME_FORMAT_RGBA);
            return NT_FRAME_FORMAT_RGBA;
        }

        @Override
        public void onNTFrameSizeChanged(int width, int height) {
            width_ = width;
            height_ = height;

            row_bytes_ = width_ * 4;
            rgba_buffer_ = ByteBuffer.allocateDirect(row_bytes_ * height_);

            Log.i(TAG, "RGBAExternalRender::onNTFrameSizeChanged width_:" + width_ + " height_:" + height_);
        }

        @Override
        public ByteBuffer getNTPlaneByteBuffer(int index) {
            if (index == 0)
                return rgba_buffer_;

            Log.e(TAG, "RGBAExternalRender::getNTPlaneByteBuffer index error:" + index);
            return null;
        }

        @Override
        public int getNTPlanePerRowBytes(int index) {
            if (index == 0)
                return row_bytes_;

            Log.e(TAG, "RGBAExternalRender::getNTPlanePerRowBytes index error:" + index);
            return 0;
        }

        @Override
        public void onNTRenderFrame(int width, int height, long timestamp) {
            if (rgba_buffer_ == null)
                return;

            rgba_buffer_.rewind();

            // copy buffer

            // test
            // byte[] test_buffer = new byte[16];
            // rgba_buffer_.get(test_buffer);

            Log.i(TAG, "RGBAExternalRender::onNTRenderFrame " + width + "*" + height + ", t:" + timestamp);

            // Log.i(TAG, "RGBAExternalRender:onNTRenderFrame rgba:" +
            // bytesToHexString(test_buffer));
        }
    }
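The last_save_image_time_ms_ field above hints at throttling how often a frame is snapshotted to image_path_. A minimal, deterministic way to express that rate limit is sketched below; the class name and interval are assumptions, and the current time is passed in rather than read from System.currentTimeMillis() so the logic stays testable:

```java
// Rate limiter for frame snapshots: shouldSave() returns true at most
// once per intervalMs, so a callback firing at 30 fps does not write
// an image file on every frame.
final class SnapshotThrottle {
    private final long intervalMs;
    private long lastMs; // time of the last accepted snapshot

    SnapshotThrottle(long intervalMs) {
        this.intervalMs = intervalMs;
    }

    boolean shouldSave(long nowMs) {
        if (nowMs - lastMs >= intervalMs) {
            lastMs = nowMs;
            return true;
        }
        return false;
    }
}
```

Inside onNTRenderFrame one would then only copy and persist the buffer when shouldSave(System.currentTimeMillis()) returns true.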

For I420 data:

/*
 * YUV data callback handling
 * Author: daniusdk.com
 * WeChat: xinsheng120
 */
private static class I420ExternalRender implements NTExternalRender {
        // public static final int NT_FRAME_FORMAT_RGBA = 1;
        // public static final int NT_FRAME_FORMAT_ABGR = 2;
        // public static final int NT_FRAME_FORMAT_I420 = 3;

        private final String image_path_;
        private long last_save_image_time_ms_;

        private int width_;
        private int height_;

        private int y_row_bytes_;
        private int u_row_bytes_;
        private int v_row_bytes_;

        private ByteBuffer y_buffer_;
        private ByteBuffer u_buffer_;
        private ByteBuffer v_buffer_;

        public I420ExternalRender(String image_path) {
            this.image_path_ = image_path;
        }

        @Override
        public int getNTFrameFormat() {
            Log.i(TAG, "I420ExternalRender::getNTFrameFormat return " + NT_FRAME_FORMAT_I420);
            return NT_FRAME_FORMAT_I420;
        }

        @Override
        public void onNTFrameSizeChanged(int width, int height) {
            width_ = width;
            height_ = height;

            y_row_bytes_ = width;
            u_row_bytes_ = (width+1)/2;
            v_row_bytes_ = (width+1)/2;

            y_buffer_ = ByteBuffer.allocateDirect(y_row_bytes_*height_);
            u_buffer_ = ByteBuffer.allocateDirect(u_row_bytes_*((height_ + 1) / 2));
            v_buffer_ = ByteBuffer.allocateDirect(v_row_bytes_*((height_ + 1) / 2));

            Log.i(TAG, "I420ExternalRender::onNTFrameSizeChanged width_="
                    + width_ + " height_=" + height_ + " y_row_bytes_="
                    + y_row_bytes_ + " u_row_bytes_=" + u_row_bytes_
                    + " v_row_bytes_=" + v_row_bytes_);
        }

        @Override
        public ByteBuffer getNTPlaneByteBuffer(int index) {
            switch (index) {
                case 0:
                    return y_buffer_;
                case 1:
                    return u_buffer_;
                case 2:
                    return v_buffer_;
                default:
                    Log.e(TAG, "I420ExternalRender::getNTPlaneByteBuffer index error:" + index);
                    return null;
            }
        }

        @Override
        public int getNTPlanePerRowBytes(int index) {
            switch (index) {
                case 0:
                    return y_row_bytes_;
                case 1:
                    return u_row_bytes_;
                case 2:
                    return v_row_bytes_;
                default:
                    Log.e(TAG, "I420ExternalRender::getNTPlanePerRowBytes index error:" + index);
                    return 0;
            }
        }

        @Override
        public void onNTRenderFrame(int width, int height, long timestamp) {
            if (null == y_buffer_ || null == u_buffer_ || null == v_buffer_)
                return;

            y_buffer_.rewind();
            u_buffer_.rewind();
            v_buffer_.rewind();

            Log.i(TAG, "I420ExternalRender::onNTRenderFrame " + width + "*" + height + ", t:" + timestamp);

        }
    }
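Downstream consumers often want the three planes filled in onNTRenderFrame delivered as one contiguous I420 frame (Y plane, then U, then V). A sketch of that copy, using the same (width + 1) / 2 chroma rounding as onNTFrameSizeChanged above (the class name is hypothetical):

```java
import java.nio.ByteBuffer;

// Concatenates tightly packed Y, U, V plane buffers into a single
// contiguous I420 frame: width*height luma bytes followed by two
// ((width+1)/2)*((height+1)/2) chroma planes.
final class I420Assembler {
    static byte[] assemble(ByteBuffer y, ByteBuffer u, ByteBuffer v,
                           int width, int height) {
        int cw = (width + 1) / 2, ch = (height + 1) / 2;
        byte[] frame = new byte[width * height + 2 * cw * ch];
        y.rewind();
        u.rewind();
        v.rewind();
        y.get(frame, 0, width * height);
        u.get(frame, width * height, cw * ch);
        v.get(frame, width * height + cw * ch, cw * ch);
        return frame;
    }
}
```

This assumes the plane buffers are tightly packed, as they are when allocated with the row-byte values computed in onNTFrameSizeChanged; padded rows would need a row-by-row copy instead.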

Summary

Whichever approach you choose, processing video frame data can be compute-intensive, especially for high-resolution or high-frame-rate video. Make sure your application can meet these performance demands, run decoding and data processing on a background thread, and keep the callback path as light on resources as possible. These notes are only a starting point; interested developers are welcome to contact me to discuss further.
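One way to keep the callback path lightweight is to hand frames to a worker thread through a small bounded queue and drop stale frames when the consumer falls behind. The sketch below shows one possible shape; the class name and the queue capacity of 2 are illustrative choices, not part of any SDK:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hands frames from the render callback to a worker thread through a
// small bounded queue; when the consumer falls behind, the oldest
// pending frame is dropped so the callback never blocks.
final class FrameDispatcher {
    private final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(2);

    // Called from the render callback; never blocks.
    void submit(byte[] frame) {
        while (!queue.offer(frame)) {
            queue.poll(); // drop the oldest frame to make room
        }
    }

    // Called by the worker thread; blocks until a frame is available.
    byte[] take() {
        try {
            return queue.take();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return null;
        }
    }

    int pending() {
        return queue.size();
    }
}
```

Dropping frames is usually acceptable for visual analysis, where operating on the most recent frame matters more than processing every frame.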
