How To Build Android And iOS Apps Using WebRTC With Flutter

WebRTC (Web Real-Time Communication) is a real-time communication technology that allows network applications or sites to establish peer-to-peer connections between browsers, without intermediaries, to stream video and audio or exchange other arbitrary data.

WebRTC is not just for building desktop apps; it can be used to create iOS and Android apps too. The main difference between WebRTC for desktop and mobile apps is that on mobile devices you need a third-party library to embed a browser (a WebView) that can access the camera and microphone.

This article will explain how to use WebRTC to build Android and iOS applications with Flutter.

Introduction to Video Conference Kit

This article uses ZEGOCLOUD's UIKits to show how to quickly build Android and iOS audio and video applications with WebRTC.

UIKits are prebuilt, feature-rich components that let you build video communication into your web and mobile apps in minutes.

Because the business logic is bundled with the UI, you can customize various video communication features simply by modifying parameters.

Building WebRTC Apps with Flutter

1. Create a project

First, you need to create a Flutter project with the command flutter create my_webview_webrtc (Flutter package names must be lowercase, using underscores).

flutter create my_webview_webrtc

2. Add dependencies

  • Because you will use a WebView to load the WebRTC program, add the WebView dependency by running the following command:
flutter pub add flutter_inappwebview
  • Because the app needs devices such as the camera and microphone, add the permission_handler dependency by running the following command (the resulting pubspec.yaml entries are shown after this list):
flutter pub add permission_handler
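
After both commands succeed, the dependencies section of pubspec.yaml should contain entries like the following (the version numbers are illustrative and will differ as new releases ship):

dependencies:
  flutter:
    sdk: flutter
  flutter_inappwebview: ^5.7.2
  permission_handler: ^10.2.0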

3. Android application configuration

3.1 Device permissions

You need to add the following code to the android/app/src/main/AndroidManifest.xml file to request the camera, microphone, and other device permissions.

 <uses-permission android:name="android.permission.INTERNET" />
 <uses-permission android:name="android.permission.CAMERA" />
 <uses-permission android:name="android.permission.RECORD_AUDIO" />
 <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
 <uses-permission android:name="android.permission.VIDEO_CAPTURE" />
 <uses-permission android:name="android.permission.AUDIO_CAPTURE" />
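
These <uses-permission> elements sit directly under the root <manifest> element, next to <application>. A minimal sketch of where they land (the package name is illustrative):

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.my_webview_webrtc">

    <uses-permission android:name="android.permission.INTERNET" />
    <!-- ...the remaining permissions listed above... -->

    <application ...>
        ...
    </application>
</manifest>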

3.2 SDK version setting

Next, you need to set the compileSdkVersion field in the android/app/build.gradle file to 33.

...
android {
    compileSdkVersion 33
    ...
}
...

3.3 Minimum supported SDK version setting

Then you need to set the minSdkVersion field to 19 in the android/app/build.gradle file.

android {
    defaultConfig {
        minSdkVersion 19
        ...
    }
    ...
}

3.4 Whiteboard support

If you need the whiteboard feature, you also need to add the following configuration to the android/app/src/main/AndroidManifest.xml file.

<application>
    ...
    <provider
        android:name="androidx.core.content.FileProvider"
        android:authorities="${applicationId}.flutter_inappwebview.fileprovider"
        android:exported="false"
        android:grantUriPermissions="true">
        <meta-data
            android:name="android.support.FILE_PROVIDER_PATHS"
            android:resource="@xml/provider_paths" />
    </provider>
</application>
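
Note that android:resource="@xml/provider_paths" points at a resource file that a fresh Flutter project does not contain. A typical android/app/src/main/res/xml/provider_paths.xml for this provider looks like the sketch below; adjust the paths to your own needs:

<?xml version="1.0" encoding="utf-8"?>
<paths xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- Expose external storage so the WebView's FileProvider can share files. -->
    <external-path name="external_files" path="." />
</paths>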

4. iOS app configuration

4.1 Device permissions

First, you need to add the following entries to ios/Runner/Info.plist to declare the camera and microphone usage descriptions.

    <key>NSCameraUsageDescription</key>
    <string>We require camera access to connect to a video conference</string>
    <key>NSMicrophoneUsageDescription</key>
    <string>We require microphone access to connect to a video conference</string>
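
Both keys belong inside the top-level <dict> of ios/Runner/Info.plist, alongside the entries Flutter generated, roughly like this:

<plist version="1.0">
<dict>
    ...
    <!-- the two usage-description keys shown above -->
</dict>
</plist>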

5. Implement audio and video functions

Finally, you only need to use the WebviewScreen widget below in main.dart to load the WebRTC application, and the mobile audio and video app is complete.

   import 'dart:io';

   import 'package:flutter/material.dart';
   import 'package:flutter_inappwebview/flutter_inappwebview.dart';
   import 'package:permission_handler/permission_handler.dart';

   Future<void> main() async {
     WidgetsFlutterBinding.ensureInitialized();

     if (Platform.isAndroid) {
       // Allow inspecting the WebView with Chrome's remote debugging tools.
       await AndroidInAppWebViewController.setWebContentsDebuggingEnabled(true);
     }

     // Request camera and microphone access before the WebView loads WebRTC.
     await Permission.camera.request();
     await Permission.microphone.request();

     runApp(const MaterialApp(home: WebviewScreen()));
   }

   class WebviewScreen extends StatefulWidget {
     const WebviewScreen({Key? key}) : super(key: key);

     @override
     State<WebviewScreen> createState() => _WebviewScreenState();
   }

   class _WebviewScreenState extends State<WebviewScreen> {
     final GlobalKey webViewKey = GlobalKey();

     InAppWebViewController? webViewController;

     // WebView options: let media autoplay without a user gesture and,
     // on iOS, play inline so the conference UI can render camera streams.
     InAppWebViewGroupOptions options = InAppWebViewGroupOptions(
         crossPlatform: InAppWebViewOptions(
           useShouldOverrideUrlLoading: true,
           mediaPlaybackRequiresUserGesture: false,
         ),
         android: AndroidInAppWebViewOptions(
           useHybridComposition: true,
         ),
         ios: IOSInAppWebViewOptions(
           allowsInlineMediaPlayback: true,
         ));

     @override
     Widget build(BuildContext context) {
       return WillPopScope(
         onWillPop: () async {
           // detect Android back button click
           final controller = webViewController;
           if (controller != null) {
             if (await controller.canGoBack()) {
               controller.goBack();
               return false;
             }
           }
           return true;
         },
         child: Scaffold(
             appBar: AppBar(
               title: const Text("InAppWebView test"),
             ),
             body: Column(children: <Widget>[
               Expanded(
                 child: InAppWebView(
                   key: webViewKey,
                   initialUrlRequest: URLRequest(
                       url: Uri.parse(
                           "https://zegocloud.github.io/zego_uikit_prebuilt_web/video_conference/index.html?roomID=zegocloud&role=Host")),
                   initialOptions: options,
                   onWebViewCreated: (controller) {
                     webViewController = controller;
                   },
                   androidOnPermissionRequest:
                       (controller, origin, resources) async {
                     // Automatically grant the page's WebRTC permission
                     // requests for the camera and microphone.
                     return PermissionRequestResponse(
                         resources: resources,
                         action: PermissionRequestResponseAction.GRANT);
                   },
                 ),
               ),
             ])),
       );
     }
   }
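
The roomID and role query parameters in that URL determine which room the user joins and whether they join as a host. If you would rather build the URL dynamically than hard-code it, a small helper like the hypothetical buildConferenceUrl below (not part of the UIKit) keeps the parameters in one place:

   /// Hypothetical helper (not part of the UIKit): assembles the prebuilt
   /// video-conference URL with custom query parameters.
   String buildConferenceUrl({required String roomID, String role = 'Host'}) {
     return Uri.https(
       'zegocloud.github.io',
       '/zego_uikit_prebuilt_web/video_conference/index.html',
       {'roomID': roomID, 'role': role},
     ).toString();
   }

You can then pass Uri.parse(buildConferenceUrl(roomID: 'zegocloud')) to initialUrlRequest instead of the hard-coded string.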

Run a demo

Next, run the project and try out the audio and video application you built with Flutter.
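
With a device connected or an emulator running, start the app from the project root:

flutter run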

You can also learn how to use WebRTC in Flutter by downloading the full sample code.

Conclusion

Over the past few years, WebRTC has seen widespread adoption across the tech industry. Companies such as Facebook, Amazon, and Google have implemented WebRTC to make their web applications faster, more reliable, and more secure.

WebRTC also lends itself to ready-made solutions that integrate easily with other software, and ZEGOCLOUD is one such SaaS provider. For more information, please continue to follow us.
