
Customize how the video renders

Last updated: 2023-11-14 14:43

Introduction

When the ZEGOCLOUD SDK's built-in video rendering module cannot meet your app's requirements, the SDK allows you to customize the video rendering process. With custom video rendering enabled, the SDK sends out the video frame data captured locally or received from a remote stream so that you can perform the rendering yourself.

Listed below are some scenarios where enabling custom video rendering is recommended:

  • Your app uses a cross-platform framework that requires a complex UI hierarchy (e.g., Qt) or game engines like Unity, Unreal Engine, and Cocos.
  • Your app requires special processing of the video frame data for rendering.

Prerequisites

Before enabling this feature, please make sure:

  • The ZEGO Express SDK has been integrated into your project to implement basic real-time audio and video functions. For details, refer to Quick start.
  • A project has been created in the ZEGOCLOUD Console, and a valid AppID and AppSign have been obtained. For details, refer to Console - Project Information.

Implementation process

The process of custom video rendering is as follows:

  1. Create a "ZegoExpressEngine" instance.
  2. Enable the feature of custom video rendering. Set up the event handler for custom video rendering callbacks, and implement the callback handler methods according to your app's requirements.
  3. Log in to a room and start previewing the locally captured video or playing a remote stream. The SDK will then start sending out video frame data of the local or remote stream through the corresponding callback for custom video rendering.

Refer to the API call sequence diagram below to implement custom video rendering in your project:
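The three steps above can be sketched as follows. This is a minimal outline, not a complete implementation: the AppID/AppSign values, room ID, and user ID are placeholders you must replace with your own project values, and the render configuration is covered in detail in the sections below.

```objectivec
// 1. Create a ZegoExpressEngine instance (profile-based creation,
//    as described in the Quick start).
ZegoEngineProfile *profile = [[ZegoEngineProfile alloc] init];
profile.appID = 1234567890;          // replace with your AppID
profile.appSign = @"YOUR_APP_SIGN";  // replace with your AppSign
profile.scenario = ZegoScenarioDefault;
[ZegoExpressEngine createEngineWithProfile:profile eventHandler:self];

// 2. Enable custom video rendering and set up the callback handler
//    (see the detailed configuration below).
ZegoCustomVideoRenderConfig *renderConfig = [[ZegoCustomVideoRenderConfig alloc] init];
renderConfig.bufferType = ZegoVideoBufferTypeCVPixelBuffer;
[[ZegoExpressEngine sharedEngine] enableCustomVideoRender:YES config:renderConfig];
[[ZegoExpressEngine sharedEngine] setCustomVideoRenderHandler:self];

// 3. Log in to a room; starting a local preview or playing a remote
//    stream afterwards triggers the custom rendering callbacks.
ZegoUser *user = [ZegoUser userWithUserID:@"user1"];
[[ZegoExpressEngine sharedEngine] loginRoom:@"room1" user:user];
```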

Enable the Feature of Custom Video Rendering

First, create a ZegoCustomVideoRenderConfig object and configure its attributes according to your actual rendering requirements.

  • Set the "bufferType" (ZegoVideoBufferType) attribute to specify the video frame data type for custom rendering.

  • Set the "frameFormatSeries" (ZegoVideoFrameFormatSeries) attribute to specify the color space (RGB or YUV) of the video frame data for custom rendering.

  • Set the "enableEngineRender" attribute to determine whether the SDK's built-in renderer is also enabled. If it is set to "YES", the SDK's built-in renderer will render the stream to the View passed to startPreview or startPlayingStream. If it is set to "NO", the SDK's built-in renderer will not render the stream.

Then, call enableCustomVideoRender to enable custom video rendering.

ZegoCustomVideoRenderConfig *renderConfig = [[ZegoCustomVideoRenderConfig alloc] init];
// Send out video frame data in CVPixelBuffer format.
renderConfig.bufferType = ZegoVideoBufferTypeCVPixelBuffer;
// Specify the color space of video frame data as RGB.
renderConfig.frameFormatSeries = ZegoVideoFrameFormatSeriesRGB;
// Enable the SDK's built-in renderer.
renderConfig.enableEngineRender = YES;

[[ZegoExpressEngine sharedEngine] enableCustomVideoRender:YES config:renderConfig];

Set up the Custom Video Rendering Callback Handler

Call setCustomVideoRenderHandler to set up an event handler (conforming to the protocol ZegoCustomVideoRenderHandler) to listen for and handle the callbacks related to custom video rendering.

// Set ViewController as the custom video rendering callback handler object, which conforms to the ZegoCustomVideoRenderHandler protocol.
@interface ViewController () <ZegoEventHandler, ZegoCustomVideoRenderHandler>

......

@end
// Set self (the ViewController) as the custom video rendering callback handler object.
[[ZegoExpressEngine sharedEngine] setCustomVideoRenderHandler:self];

Implement the handler methods for the callbacks onCapturedVideoFrameRawData and onCapturedVideoFrameCVPixelBuffer. The SDK sends out the video frame data captured locally through either one of these callbacks for the custom rendering of the local preview.

/// If `ZegoCustomVideoRenderConfig.bufferType` is set to `ZegoVideoBufferTypeRawData`, the SDK will send out video frame data captured locally in raw data format through the callback onCapturedVideoFrameRawData. 
- (void)onCapturedVideoFrameRawData:(unsigned char * _Nonnull *)data dataLength:(unsigned int *)dataLength param:(ZegoVideoFrameParam *)param flipMode:(ZegoVideoFlipMode)flipMode {
    NSLog(@"raw data video frame callback. format:%d, width:%f, height:%f, isNeedFlip:%d", (int)param.format, param.size.width, param.size.height, (int)flipMode);
}

/// If `ZegoCustomVideoRenderConfig.bufferType` is set to `ZegoVideoBufferTypeCVPixelBuffer`, the SDK will send out video frame data captured locally in CVPixelBuffer format through the callback onCapturedVideoFrameCVPixelBuffer.
- (void)onCapturedVideoFrameCVPixelBuffer:(CVPixelBufferRef)buffer param:(ZegoVideoFrameParam *)param flipMode:(ZegoVideoFlipMode)flipMode {
    NSLog(@"pixel buffer video frame callback. format:%d, width:%f, height:%f, isNeedFlip:%d", (int)param.format, param.size.width, param.size.height, (int)flipMode);
}
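As an illustration of what "rendering by yourself" can look like, the sketch below converts the CVPixelBuffer delivered by onCapturedVideoFrameCVPixelBuffer into a UIImage and displays it in an image view. The `previewImageView` property and the use of Core Image are assumptions of this example, not part of the SDK; use whatever rendering pipeline (Metal, OpenGL ES, etc.) fits your app.

```objectivec
- (void)onCapturedVideoFrameCVPixelBuffer:(CVPixelBufferRef)buffer
                                    param:(ZegoVideoFrameParam *)param
                                 flipMode:(ZegoVideoFlipMode)flipMode {
    // Wrap the pixel buffer in a CIImage, mirroring it if the SDK asks us to.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    if (flipMode == ZegoVideoFlipModeX) {
        ciImage = [ciImage imageByApplyingTransform:CGAffineTransformMakeScale(-1, 1)];
    }
    UIImage *image = [UIImage imageWithCIImage:ciImage];
    // Update the UI on the main thread; the callback may arrive on an SDK thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewImageView.image = image;  // hypothetical UIImageView
    });
}
```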

Implement the handler methods for the callbacks onRemoteVideoFrameRawData and onRemoteVideoFrameCVPixelBuffer. The SDK sends out the video frame data received from a remote stream through either one of these callbacks for the custom rendering of the remote stream playback.

/// If `ZegoCustomVideoRenderConfig.bufferType` is set to `ZegoVideoBufferTypeRawData`, the SDK will send out video frame data in raw data format for the custom rendering of remote stream playback through this callback. Use the streamID parameter to identify the stream being processed.
- (void)onRemoteVideoFrameRawData:(unsigned char * _Nonnull * _Nonnull)data dataLength:(unsigned int *)dataLength param:(ZegoVideoFrameParam *)param streamID:(NSString *)streamID {
    NSLog(@"raw data video frame callback. format:%d, width:%f, height:%f", (int)param.format, param.size.width, param.size.height);
}

/// If `ZegoCustomVideoRenderConfig.bufferType` is set to `ZegoVideoBufferTypeCVPixelBuffer`, the SDK will send out video frame data in CVPixelBuffer format for the custom rendering of remote stream playback through this callback. Use the streamID parameter to identify the stream being processed.
- (void)onRemoteVideoFrameCVPixelBuffer:(CVPixelBufferRef)buffer param:(ZegoVideoFrameParam *)param streamID:(NSString *)streamID {
    NSLog(@"pixel buffer video frame callback. format:%d, width:%f, height:%f", (int)param.format, param.size.width, param.size.height);
}

The "flipMode" (ZegoVideoFlipMode) parameter of the local preview custom rendering callbacks indicates whether you need to flip the video so that the preview matches the mirroring effect configured via the "mirrorMode" (ZegoVideoMirrorMode) setting of setVideoMirrorMode.

The "param" (ZegoVideoFrameParam) parameter of the callbacks above describes the attributes of the video frames, which are defined as below:

/// Video Frame Parameters
///
@interface ZegoVideoFrameParam : NSObject

/// The pixel format (e.g., I420, NV12, NV21, RGBA32, etc.) 
@property (nonatomic, assign) ZegoVideoFrameFormat format;

/// The stride values of each color plane (e.g., for RGBA format, only the value of strides[0] needs to be used; while for I420, the values of strides[0,1,2] need to be used).
@property (nonatomic, assign) int *strides;

/// The image size
@property (nonatomic, assign) CGSize size;

@end

The stride of a plane is greater than or equal to its image width: each row of pixel data may be padded at the end for memory alignment, so always advance through the buffer row by row using the stride rather than the width.
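To make the stride handling concrete, here is a small, SDK-independent C helper (hypothetical name, not part of the ZEGO SDK) that copies one strided color plane into a tightly packed buffer. Something like this is typically needed before feeding raw-data frames to libraries that expect stride == width.

```c
#include <string.h>

// Copy a plane whose rows are `stride` bytes apart into a tightly packed
// destination where rows are exactly `width` bytes apart. `height` is the
// number of rows. (Hypothetical helper; not part of the ZEGO SDK.)
void copy_plane_packed(unsigned char *dst, const unsigned char *src,
                       int width, int height, int stride) {
    for (int row = 0; row < height; row++) {
        // Only the first `width` bytes of each row carry pixel data;
        // the remaining `stride - width` bytes are alignment padding.
        memcpy(dst + (size_t)row * width, src + (size_t)row * stride, (size_t)width);
    }
}
```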

Trigger the Custom Video Rendering Callback

After logging in to a room, you can start the preview of the local stream or the playback of a remote stream to trigger the custom video rendering callback.

  • Custom Video Rendering for Local Preview

    With custom video rendering enabled, calling startPreview will trigger the callback onCapturedVideoFrameRawData or onCapturedVideoFrameCVPixelBuffer (depending on the "bufferType" set when enabling custom video rendering), through which the SDK sends out the video frame data captured locally. You can then use the video data obtained from this callback to render the local preview yourself.

    If the SDK's built-in renderer is enabled, you need to pass in a View as the "canvas" parameter when calling startPreview; otherwise, set the "canvas" parameter to "nil".

    After the preview is started, you can then start publishing the stream.

    // If the SDK's built-in renderer is enabled, call [startPreview] with a View passed in as the `canvas` parameter.
    ZegoCanvas *previewCanvas = [ZegoCanvas canvasWithView:self.previewView];
    [[ZegoExpressEngine sharedEngine] startPreview:previewCanvas];
    
    // If the SDK's built-in renderer is NOT enabled, call [startPreview] with the `canvas` parameter set to nil.
    [[ZegoExpressEngine sharedEngine] startPreview:nil];
    
    // After the preview is started, the SDK will start sending out video frame data for custom rendering through the callback [onCapturedVideoFrameRawData] or [onCapturedVideoFrameCVPixelBuffer].
    
    // Start publishing the stream.
    [[ZegoExpressEngine sharedEngine] startPublishingStream:self.streamID];
  • Custom Video Rendering for Remote Stream Playback

    With custom video rendering enabled, calling startPlayingStream will trigger the callback onRemoteVideoFrameRawData or onRemoteVideoFrameCVPixelBuffer (depending on the "bufferType" set when enabling custom video rendering), through which the SDK sends out video frame data received from a remote stream. You can then use the video data obtained from this callback to render the remote stream yourself.

    If the SDK's built-in renderer is enabled, you need to pass in a View as the "canvas" parameter when calling startPlayingStream; otherwise, set the "canvas" parameter to "nil".

    // If the SDK's built-in renderer is enabled, call [startPlayingStream] with a View passed in as the `canvas` parameter.
    ZegoCanvas *playCanvas = [ZegoCanvas canvasWithView:self.playView];
    [[ZegoExpressEngine sharedEngine] startPlayingStream:self.streamID canvas:playCanvas];
    
    // If the SDK's built-in renderer is NOT enabled, call [startPlayingStream] with the `canvas` parameter set to nil.
    [[ZegoExpressEngine sharedEngine] startPlayingStream:self.streamID canvas:nil];
    
    // After the stream playback is started, the SDK will start sending out video frame data for custom rendering through the callback [onRemoteVideoFrameRawData] or [onRemoteVideoFrameCVPixelBuffer].

FAQ

  1. What's the difference between custom video rendering for local stream preview and custom video rendering for remote stream playback?

    Custom video rendering for local stream preview is performed on the stream publishing end to give the streamer a preview of the local stream with special rendering effects. Custom video rendering for remote stream playback is performed on the stream receiving end to provide the viewers with special rendering effects.

  2. If the "enableEngineRender" attribute of ZegoCustomVideoRenderConfig is set to "NO", what value should be passed to the "canvas" parameter when calling startPreview and startPlayingStream?

    If "enableEngineRender" is set to "NO", the engine's built-in renderer will not do any video rendering, so the "canvas" parameter can be set to "nil".

  3. When custom video rendering for local stream preview is enabled, will the processed video data for preview also get published out?

    No. The custom rendering process only affects the local preview and does not affect the video data that is streamed out.

  4. What is the width and height of the video frames for local preview custom rendering?

    The width and height of the video frames for local preview custom rendering can be obtained from the "param" (ZegoVideoFrameParam) parameter of the custom video rendering callback. They match the video capture resolution specified in the call to setVideoConfig, or the engine's default capture resolution if none was set.

  5. What is the pixel format of the video frame data for custom video rendering? Is the YUV format supported?

    Both YUV and RGB are supported. As mentioned in the Enable the Feature of Custom Video Rendering section above, you need to set the "frameFormatSeries" (ZegoVideoFrameFormatSeries) attribute to specify the color space (RGB or YUV) of the video frame data when calling enableCustomVideoRender. The specific pixel format of the video frame data can be obtained from the "format" (ZegoVideoFrameFormat) attribute of the "param" (ZegoVideoFrameParam) parameter of the custom video rendering callbacks.

    The supported pixel formats defined in ZegoVideoFrameFormat are listed below for your reference.

    • ZegoVideoFrameFormatI420: YUV420P; 12 bits per pixel. It has 3 planes: the luma plane Y first, then the U chroma plane, and last the V chroma plane. For a 2×2 square of pixels, there are 4 Y samples but only 1 U sample and 1 V sample.
    • ZegoVideoFrameFormatNV12: YUV420SP; 12 bits per pixel. It has 2 planes: one luma plane Y and one plane with U and V values interleaved. For a 2×2 square of pixels, there are 4 Y samples but only 1 U sample and 1 V sample.
    • ZegoVideoFrameFormatNV21: like NV12, but with the U and V order reversed: it starts with V.
    • ZegoVideoFrameFormatBGRA32: an sRGB format with 32 bits per pixel. Each channel (Blue, Green, Red, and Alpha) is allocated 8 bits.
    • ZegoVideoFrameFormatRGBA32: similar to BGRA32; the channels are in a different order (Red, Green, Blue, Alpha).
    • ZegoVideoFrameFormatARGB32: similar to BGRA32; the channels are in a different order (Alpha, Red, Green, Blue).
    • ZegoVideoFrameFormatABGR32: similar to BGRA32; the channels are in a different order (Alpha, Blue, Green, Red).
    • ZegoVideoFrameFormatI422: YUV422P; 16 bits per pixel. It has 3 planes: one luma plane Y and 2 chroma planes U and V. For a 2×2 group of pixels, there are 4 Y samples and 2 U and 2 V samples each.
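    The bits-per-pixel figures above translate directly into buffer sizes. The hypothetical C helper below computes the total byte size of one frame, assuming tightly packed planes (stride == width):

```c
#include <stddef.h>

// Total bytes of one tightly packed video frame (hypothetical helper).
// 4:2:0 formats (I420, NV12, NV21) use 12 bits/pixel -> w*h*3/2 bytes.
// 4:2:2 (I422) uses 16 bits/pixel -> w*h*2 bytes.
// 32-bit RGB formats (BGRA32, RGBA32, ...) use w*h*4 bytes.
size_t frame_size_bytes(int width, int height, int bits_per_pixel) {
    return ((size_t)width * (size_t)height * (size_t)bits_per_pixel) / 8;
}
```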
  6. What is the frequency of the callbacks for custom video rendering? Is there anything I should pay attention to?

    In general, the frequency of the callback for local preview custom rendering is the same as the frame rate set for stream publishing. But it also changes according to the frame rate set for traffic control (if such settings are enabled). The frequency of the callback for remote stream custom rendering changes according to the actual frame rate of the video data being received. For example, it may change due to any frame rate change on the stream publishing end for traffic control or any network stuttering on the stream receiving end.

  7. How to get the data of the first frame for custom video rendering?

    The data received from the first custom video rendering callback is the data for the first frame.
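    A simple way to act on the first frame is a one-shot flag, sketched below for the local capture case. The `firstFrameRendered` BOOL property is an assumption of this example, not part of the SDK:

```objectivec
- (void)onCapturedVideoFrameCVPixelBuffer:(CVPixelBufferRef)buffer
                                    param:(ZegoVideoFrameParam *)param
                                 flipMode:(ZegoVideoFlipMode)flipMode {
    if (!self.firstFrameRendered) {
        self.firstFrameRendered = YES;  // hypothetical BOOL property
        NSLog(@"First local video frame received: %dx%d",
              (int)param.size.width, (int)param.size.height);
    }
    // ... continue with the normal custom rendering of `buffer` ...
}
```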

  8. Is it possible to enable custom video rendering for local stream preview only and still let the ZEGO SDK handle the rendering for remote stream playback?

    Yes. If the "enableEngineRender" attribute of ZegoCustomVideoRenderConfig is set to "YES", the SDK's built-in renderer will also render the video data to the view specified in the "canvas" parameter of startPreview and startPlayingStream while sending out the video data for custom rendering.

  9. Custom video rendering is enabled, but no video data callback is received for the local preview rendering. How to solve the problem?

    Before you start publishing the stream, you need to call startPreview to trigger the custom rendering callback.

  10. For local preview custom rendering, why is horizontal mirroring not done by default for the video frame data captured from the front camera?

    For custom video rendering, you need to implement video mirroring by yourself. You can use the "flipMode" (ZegoVideoFlipMode) parameter of the custom rendering callback to determine whether mirroring is required.
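    For example, when rendering into a view, you can honor flipMode by applying a horizontal transform inside the local preview callback. This is a sketch; `previewView` is a hypothetical UIView used for your own rendering:

```objectivec
// Mirror the preview horizontally only when the SDK indicates a flip is needed.
CGAffineTransform t = (flipMode == ZegoVideoFlipModeX)
    ? CGAffineTransformMakeScale(-1.0, 1.0)
    : CGAffineTransformIdentity;
dispatch_async(dispatch_get_main_queue(), ^{
    self.previewView.transform = t;  // hypothetical UIView
});
```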
