Screen sharing refers to sharing your screen content with other viewers as video during a video call or interactive live stream, enhancing the interactive experience and improving communication efficiency.
Screen sharing is widely used in the following scenarios:
Please refer to Download example source code to obtain the source code.
For the related source code, see the files under the "/ZegoExpressExample/Examples/Others/ScreenSharing" directory.
Before implementing the screen sharing function, please ensure:
The iOS platform uses Apple's ReplayKit framework for screen recording, which can share the entire system's screen content. However, the current app (the main app process) must provide an additional Extension component (an Extension process) to record the screen, which is then combined with the relevant APIs of the ZEGO Express SDK to implement the screen sharing function.
The main process of implementing screen sharing is as follows:
Switch the collection source to the screen sharing source
Start screen sharing
Sharing within the application
Cross-application sharing
Login to the room and push stream
Watch remote shared screen
Stop screen sharing
Setting the capture source to the screen sharing source requires configuring both the video source and the audio source.
The SDK's "video source" for streaming is set to the camera by default. To switch to the screen sharing source, use setVideoSource to make the change.
[ZegoExpressEngine.sharedEngine setVideoSource:ZegoVideoSourceScreenCapture channel:ZegoPublishChannelMain];
The SDK's "audio source" for streaming defaults to the microphone. To switch to the screen sharing source, use setAudioSource to change it.
When the main channel uses the screen sharing feature, the SDK starts internal audio capturing and keeps the app active in the background only if the main channel's audio source is set to the microphone. If any other audio source type is set, screen sharing stops once the app moves to the background; in that case, it is recommended to implement your own keep-alive logic.
[ZegoExpressEngine.sharedEngine setAudioSource:ZegoAudioSourceTypeScreenCapture channel:ZegoPublishChannelMain];
There are two types of screen sharing methods: "In-app Screen Sharing" and "Cross-app Screen Sharing".
If you only need to share the screen and sound within the app, call the startScreenCaptureInApp interface to start screen sharing. You can also listen for the broadcastFinished callback to be notified when screen sharing ends; if screen capture fails, the failure reason is delivered in that callback.
ZegoScreenCaptureConfig *config = [[ZegoScreenCaptureConfig alloc] init];
config.captureVideo = YES;
config.captureAudio = YES;
// Optional: set the capture area of the video; it must lie within the original video frame, in pixels (px)
config.cropRect = CGRectMake(x, y, width, height);
[ZegoExpressEngine.sharedEngine startScreenCaptureInApp:config];
Cross-app screen sharing is performed by the iOS system through an Extension that runs recording in a separate process, so you need to create an additional Extension process and start it. Refer to the following implementation steps:
The memory usage limit for the Broadcast Upload Extension is 50 MB. Do not perform additional memory allocations within the screen sharing Extension.
Open the project workspace file with Xcode, then click “File > New > Target..." in the menu bar.
In the pop-up window, select "Broadcast Upload Extension" on the iOS page, then click “Next”.
In the pop-up dialog box, enter the name of the "Broadcast Upload Extension", such as “ScreenShare”, in the “Product Name” field. After selecting the “Team”, “Language” and other information, click “Finish”.
Do not check “Include UI Extension”.
After creation, you will see the folder of this Extension in your project, with a structure similar to the following. This folder is used to store the implementation code for the screen sharing feature:
Ensure that in the "Info.plist" file of the Extension, “RPBroadcastProcessMode” is set to “RPBroadcastProcessModeSampleBuffer”.
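For reference, the relevant portion of the Extension's "Info.plist" typically looks like the following sketch. The principal class name ("SampleHandler" here) depends on your project; the extension point identifier is the standard one for a Broadcast Upload Extension:

```xml
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <!-- Required: deliver samples via processSampleBuffer -->
        <key>RPBroadcastProcessMode</key>
        <string>RPBroadcastProcessModeSampleBuffer</string>
    </dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.broadcast-services-upload</string>
    <key>NSExtensionPrincipalClass</key>
    <string>SampleHandler</string>
</dict>
```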
Import the ZEGOCLOUD Express SDK into the Extension; for more details, please refer to Quick Start - Integration.
If you need to share the entire system's screen and sound, call the startScreenCapture interface to start screen sharing.
ZegoScreenCaptureConfig *config = [[ZegoScreenCaptureConfig alloc] init];
config.captureVideo = YES;
config.captureAudio = YES;
// Optional: set the capture area of the video; it must lie within the original video frame, in pixels (px)
config.cropRect = CGRectMake(x, y, width, height);
[ZegoExpressEngine.sharedEngine startScreenCapture:config];
There are two ways to launch it; please choose to implement based on your needs.
You need to long press the screen recording button in the iOS system's Control Center, then select the corresponding Extension to start the recording.
Apple introduced RPSystemBroadcastPickerView in iOS 12.0, which can pop up a launcher from the App for users to confirm starting screen sharing.
RPSystemBroadcastPickerView *broadcastPickerView = [[RPSystemBroadcastPickerView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
NSString *bundlePath = [[NSBundle mainBundle] pathForResource:@"ZegoExpressExample-Broadcast" ofType:@"appex" inDirectory:@"PlugIns"];
if (bundlePath) {
    NSBundle *bundle = [NSBundle bundleWithPath:bundlePath];
    if (bundle) {
        // Show only this app's Extension in the picker
        broadcastPickerView.preferredExtension = bundle.bundleIdentifier;
        // Find the picker's internal button and trigger it programmatically
        for (UIView *subView in broadcastPickerView.subviews) {
            if ([subView isMemberOfClass:[UIButton class]]) {
                UIButton *button = (UIButton *)subView;
                [button sendActionsForControlEvents:UIControlEventAllEvents];
            }
        }
    }
}
Note that RPSystemBroadcastPickerView does not support customizing its interface, and there is no official way to trigger it programmatically. Apple does not recommend this approach, and it may stop working in future system updates. It is therefore only an optional solution, and you bear the risk yourself if you choose it.
The implementation of the following system callbacks can be viewed in the file “/ZegoExpressExample/Examples/Others/ScreenSharing/ZegoExpressExample-Broadcast/SampleHandler.m” in Sample codes:
The system notifies the Extension that screen recording has started via the broadcastStartedWithSetupInfo callback. Within this callback, call the setupWithDelegate interface of the ZegoReplayKitExt class to create the data transmission channel:
[ZegoReplayKitExt.sharedInstance setupWithDelegate:self];
In the processSampleBuffer system callback, send the sample buffer to the ZEGO Express SDK via the sendSampleBuffer interface of the ZegoReplayKitExt class.
[ZegoReplayKitExt.sharedInstance sendSampleBuffer:sampleBuffer withType:sampleBufferType];
The system notifies the Extension that screen recording has ended via the broadcastFinished callback; if recording failed, the failure reason is delivered. Within this callback, call the finished interface of the ZegoReplayKitExt class to stop screen capture and disconnect the data transmission channel:
[ZegoReplayKitExt.sharedInstance finished];
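Putting the three system callbacks together, a minimal SampleHandler implementation might look like the following sketch (the import path is an assumption; adjust it to match your SDK integration, and compare with the sample file mentioned above):

```objectivec
#import <ReplayKit/ReplayKit.h>
#import "ZegoReplayKitExt.h" // assumed header name for the ZegoReplayKitExt class

@interface SampleHandler : RPBroadcastSampleHandler <ZegoReplayKitExtHandler>
@end

@implementation SampleHandler

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *, NSObject *> *)setupInfo {
    // Create the data transmission channel to the host app
    [ZegoReplayKitExt.sharedInstance setupWithDelegate:self];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer
                   withType:(RPSampleBufferType)sampleBufferType {
    // Forward video and audio sample buffers to the ZEGO Express SDK
    [ZegoReplayKitExt.sharedInstance sendSampleBuffer:sampleBuffer withType:sampleBufferType];
}

- (void)broadcastFinished {
    // Stop screen capture and disconnect from the host app
    [ZegoReplayKitExt.sharedInstance finished];
}

@end
```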
After calling the SDK's setupWithDelegate method to initialize it and set the delegate, you can adopt the <ZegoReplayKitExtHandler> protocol in the current class and implement its callback to listen for the reason screen sharing ends or fails.
- (void)broadcastFinished:(ZegoReplayKitExt *)broadcast reason:(ZegoReplayKitExtReason)reason {
    switch (reason) {
        case ZegoReplayKitExtReasonHostStop: {
            NSDictionary *userInfo = @{NSLocalizedDescriptionKey : @"Host app stopped screen capture"};
            NSError *error = [NSError errorWithDomain:NSCocoaErrorDomain code:0 userInfo:userInfo];
            [self finishBroadcastWithError:error];
            break;
        }
        case ZegoReplayKitExtReasonConnectFail: {
            NSDictionary *userInfo = @{NSLocalizedDescriptionKey : @"Failed to connect to the host app; startScreenCapture must be called in the host app"};
            NSError *error = [NSError errorWithDomain:NSCocoaErrorDomain code:0 userInfo:userInfo];
            [self finishBroadcastWithError:error];
            break;
        }
        case ZegoReplayKitExtReasonDisconnect: {
            NSDictionary *userInfo = @{NSLocalizedDescriptionKey : @"Disconnected from the host app"};
            NSError *error = [NSError errorWithDomain:NSCocoaErrorDomain code:0 userInfo:userInfo];
            [self finishBroadcastWithError:error];
            break;
        }
    }
}
After setting up capture of the screen sharing source, call startPublishingStream to push the captured data to the cloud server. (The channel used for publishing must match the channel set for the capture source.)
[ZegoExpressEngine.sharedEngine startPublishingStream:streamID channel:ZegoPublishChannelMain];
After completing the above steps, other users can call the startPlayingStream interface to play the screen sharing stream.
// Play the stream; pass in the streamID that the screen sharing user used when publishing
[[ZegoExpressEngine sharedEngine] startPlayingStream:streamID canvas:[ZegoCanvas canvasWithView:self.playView]];
Users can call the stopScreenCapture interface to stop sharing.
[ZegoExpressEngine.sharedEngine stopScreenCapture];
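If sharing is finished entirely, you may also want to stop publishing and restore the original capture sources. A sketch assuming the main channel was used and following the enum naming of the setVideoSource/setAudioSource examples above (the camera/microphone enum values are assumptions by analogy):

```objectivec
// Stop publishing the screen sharing stream (assumes it was published on the main channel)
[ZegoExpressEngine.sharedEngine stopPublishingStream:ZegoPublishChannelMain];
// Optionally switch the video and audio sources back to camera and microphone
[ZegoExpressEngine.sharedEngine setVideoSource:ZegoVideoSourceCamera channel:ZegoPublishChannelMain];
[ZegoExpressEngine.sharedEngine setAudioSource:ZegoAudioSourceTypeMicrophone channel:ZegoPublishChannelMain];
```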
Does iOS support sharing a specific area?
The iOS system only supports sharing the entire screen, not a specific area.
Why does screen sharing stop when entering the background on iOS?
When the main channel uses the screen sharing feature, the SDK starts internal audio capturing and keeps the app active in the background only if the main channel's audio source is set to the microphone. If any other audio source type is set, screen sharing stops once the app goes to the background. It is recommended to add your own keep-alive (background retention) logic to the app.
How to handle abnormal audio playback when using screen sharing on iOS?
If you use the screen sharing function to capture and stream audio while also playing streams on the same device, the iOS system will capture the played stream audio a second time, causing abnormal audio playback. It is recommended to use muteAllPlayStreamAudio to stop pulling all audio streams.
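For example, the played audio can be muted for the duration of the sharing session and restored afterwards; a minimal sketch:

```objectivec
// Before starting screen capture with audio, mute all locally played stream audio
// so the system does not capture the playback audio a second time
[[ZegoExpressEngine sharedEngine] muteAllPlayStreamAudio:YES];

// ... start screen sharing and publishing ...

// After screen sharing stops, restore the played stream audio
[[ZegoExpressEngine sharedEngine] muteAllPlayStreamAudio:NO];
```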