
Use Local Broadcast


Feature Overview

As real-time interaction evolves, video interaction is no longer limited to a single camera feed. Screen-sharing views, multi-camera feeds, fun accessories, product information, and other elements are increasingly part of what live rooms and users need to display.

ZEGO's local broadcast plugin mixes video and audio locally, merging multiple audio/video streams or page elements into a single audio and video stream for publishing, helping developers build richer scenarios.

Effect Demonstration

Demo videos: Online Education/Meeting, E-commerce Live Streaming, Game Live Streaming


Applicable Scenarios

  • Online education
  • Online meetings
  • Live streaming sales
  • Game live streaming
  • Show live streaming

Compatibility Description

  • Desktop: Supports Chrome 94 or above, Edge 94 or above. For the best experience, we recommend using the latest version of Chrome or Edge browser.
  • The local broadcast plugin is not recommended for use in mobile browsers.

Prerequisites

Before implementing the local broadcast feature, ensure that you have integrated the ZEGO Express Web SDK and obtained your AppID and server URL from the ZEGOCLOUD Console.

Implementation Steps

1 Integrate Local Broadcast Plugin

Developers can integrate the local broadcast plugin through either of the following methods.

Method 1: Integrate using npm

Find the "index.js" file in your project directory and add the following code to import the local broadcast plugin package.

import { ZegoExpressEngine } from 'zego-express-engine-webrtc';
import { StreamCompositor } from 'zego-express-engine-webrtc/stream-compositor';

ZegoExpressEngine.use(StreamCompositor);

Method 2: Manual integration

Download the latest SDK package from the Download page; the unzipped package contains the local broadcast plugin. Then find the "index.html" file in your project directory and add the following code to integrate the plugin manually.

<script src="./stream-compositor-XXX.js"></script>
<script src="./ZegoExpressWebRTC-XXX.js"></script>
<script>
        ZegoExpressEngine.use(StreamCompositor);
        ...
</script>

2 Create Local Broadcast Plugin Processing Instance

Create and initialize a ZegoExpressEngine instance, then call the createStreamCompositor interface to create a local broadcast plugin processing instance.

// Initialize the instance; obtain appID and server from the ZEGOCLOUD Console
const zg = new ZegoExpressEngine(appID, server);
const compositor = zg.createStreamCompositor();

3 Create Local Data Stream

Call the createZegoStream interface with different parameters to create local data streams for camera and screen sharing respectively.

const cameraStream = await zg.createZegoStream();
const screenStream = await zg.createZegoStream({ screen: { video: true } });

4 (Optional) Set Camera Stream Background Transparent

To make the background around the portrait captured by the camera transparent (for portrait picture-in-picture, presenter mode, or immersive live-streaming sales), import the background processing module.

  1. Contact ZEGOCLOUD technical support to enable the relevant permissions.
  2. Find the "index.js" file in your project directory and add the following code to import the background processing module.

import { BackgroundProcess } from 'zego-express-engine-webrtc/background-process';
// Register the background processing module
ZegoExpressEngine.use(BackgroundProcess);

  3. Call the setTransparentBackgroundOptions interface to make the background of the portrait image captured by the camera stream transparent, then call the enableBackgroundProcess interface to enable background processing.

zg.setTransparentBackgroundOptions(cameraStream);
await zg.enableBackgroundProcess(cameraStream, true, 0);

5 Set Input Stream Layers

Call the setInputEndpoint interface to set the layout, rendering mode, layer order, and other properties of each input stream.

compositor.setInputEndpoint(screenStream, {
    layout: { x: 0, y: 0, width: 1280, height: 720, zOrder: 0 }
});

compositor.setInputEndpoint(cameraStream, {
    layout: { x: 0, y: 0, width: 320, height: 180, zOrder: 1 }
});
Note

Layers with larger zOrder values are rendered on top of layers with smaller values.
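The setInputEndpoint calls above position each layer with absolute pixel coordinates on the output canvas. As a sketch of how such coordinates can be derived, the following hypothetical helper (not part of the plugin) computes a picture-in-picture rect anchored to the bottom-right corner:

```javascript
// Hypothetical helper (not part of the plugin): compute a picture-in-picture
// layout rect anchored to the bottom-right corner of the output canvas.
function pipLayout(output, overlay, margin = 16) {
  return {
    x: output.width - overlay.width - margin,
    y: output.height - overlay.height - margin,
    width: overlay.width,
    height: overlay.height,
    zOrder: 1 // draw above the full-screen layer (zOrder 0)
  };
}

// For a 1280x720 output and a 320x180 camera overlay:
const cameraRect = pipLayout({ width: 1280, height: 720 }, { width: 320, height: 180 });
// cameraRect = { x: 944, y: 524, width: 320, height: 180, zOrder: 1 }
```

The resulting object can be passed as the layout field of setInputEndpoint in place of the hand-written coordinates above.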

6 Create Image Input Layer

Call the addImage interface to set the position of the image layer.

const img = document.getElementById("backImg");

compositor.addImage(img, {
    x: 0,
    y: 540,
    width: 320,
    height: 180,
    zOrder: 1
});
Notice
  • Due to browser security policies, if the image resource is served from a different origin, the server must allow cross-origin access, and you need to set the crossOrigin attribute of the HTMLImageElement to 'anonymous'.
  • Image loading takes time. Call the addImage interface only after the image has finished loading (in its onload event).
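addImage takes explicit pixel coordinates, so an image whose aspect ratio differs from the target slot would be stretched. A hypothetical helper (not part of the plugin) that fits an image into a target box while preserving its aspect ratio and centering the result:

```javascript
// Hypothetical helper (not part of the plugin): scale a source image of
// (srcW x srcH) to fit inside a target box, preserving aspect ratio and
// centering the scaled image within the box.
function fitRect(srcW, srcH, box) {
  const scale = Math.min(box.width / srcW, box.height / srcH);
  const width = Math.round(srcW * scale);
  const height = Math.round(srcH * scale);
  return {
    x: box.x + Math.round((box.width - width) / 2),
    y: box.y + Math.round((box.height - height) / 2),
    width,
    height,
    zOrder: box.zOrder
  };
}

// A 640x640 square logo fitted into the 320x180 slot from the example above:
const slot = { x: 0, y: 540, width: 320, height: 180, zOrder: 1 };
const imageRect = fitRect(640, 640, slot);
// imageRect = { x: 70, y: 540, width: 180, height: 180, zOrder: 1 }
```

The returned rect can then be passed as the second argument of addImage.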

7 Set Output Parameters and Layout

Call the setOutputConfig interface to set the width, height, frame rate, etc. of the output stream.

const width = 1280, height = 720;

compositor.setOutputConfig({
    width: width,
    height: height,
    framerate: 15
});
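Because input layouts are specified in output-canvas pixels, changing the output resolution means the input rects must be rescaled to match. A hypothetical helper (not part of the plugin) that rescales a layout rect between two output resolutions:

```javascript
// Hypothetical helper (not part of the plugin): rescale a layout rect when
// the output resolution changes, e.g. from 1280x720 to 1920x1080.
function scaleLayout(layout, from, to) {
  const sx = to.width / from.width;
  const sy = to.height / from.height;
  return {
    x: Math.round(layout.x * sx),
    y: Math.round(layout.y * sy),
    width: Math.round(layout.width * sx),
    height: Math.round(layout.height * sy),
    zOrder: layout.zOrder
  };
}

// Scale the 320x180 camera overlay up for a 1920x1080 output:
const scaled = scaleLayout(
  { x: 0, y: 0, width: 320, height: 180, zOrder: 1 },
  { width: 1280, height: 720 },
  { width: 1920, height: 1080 }
);
// scaled = { x: 0, y: 0, width: 480, height: 270, zOrder: 1 }
```

After rescaling, pass each adjusted rect back through setInputEndpoint so the composition matches the new output size.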

8 Start Composing Media Stream

Call the startComposingStream interface to start composing media streams.

// The composed output is a MediaStream
const outputStream = await compositor.startComposingStream();

9 Stop Composing Media Stream

Call the stopComposingStream interface to stop composing media streams, then call destroyStream to destroy the streams and release resources.

await compositor.stopComposingStream();
zg.destroyStream(cameraStream);
zg.destroyStream(screenStream);
