Adding video streaming functionality to a React Native application from scratch can be a daunting task. Maintaining low latency, load balancing, and managing user event states is tedious, and on top of that you have to maintain cross-platform compatibility.
Fortunately, there’s an easier way. In this article, I will guide you through building a React Native video calling app using Agora’s Video SDK. We’ll go over the structure, setup, and execution of the app before diving into how it works. You can get a cross-platform video call app going in a few simple steps, within a matter of minutes.
We’ll be using the Agora RTC SDK for React Native for the example below.
Creating an account with Agora
Structure of our example
```
.
├── android
├── components
│   ├── Home.js
│   ├── permission.js
│   ├── Router.js
│   └── Video.js
├── ios
└── index.js
```
Let’s run the app
- Make sure you’ve registered an Agora account, set up a project, and generated an App ID.
- Download and extract the zip file from the master branch.
- Run `npm install` (or use `yarn`) to install the app dependencies in the unzipped directory.
- Navigate to `./components/Home.js` and edit line 13 to enter the App ID you generated.
- Open a terminal and execute `react-native link react-native-agora` and `react-native link react-native-vector-icons`. This links the necessary files from your native modules.
- Connect your device and run `react-native run-ios` to start the app. Give it a few minutes to do its magic.
- Once you see the home screen on your mobile (or emulator), enter a channel name and hit submit on the device.
- Use the same channel name on a different device.
- That’s it. You should have a video call going between the two devices.
Getting to how it works
The Home component is the landing screen when the app launches.
It has two fields, for the App ID and the Channel Name, and a button to submit the data and start the call.
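Before starting the call, the submit handler should check that both fields are filled in. Here’s a minimal sketch of that check as a standalone helper (`canStartCall` is a hypothetical name, not part of the Agora SDK):

```javascript
// Hypothetical validation helper for the Home screen's submit button:
// only allow the call to start when both fields contain non-blank text.
function canStartCall(appId, channelName) {
  const hasAppId = typeof appId === 'string' && appId.trim().length > 0;
  const hasChannel = typeof channelName === 'string' && channelName.trim().length > 0;
  return hasAppId && hasChannel;
}
```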
We write the necessary import statements, define the Agora object as a native module, and set the defaults from it.
We define the class-based Video component. In its constructor, we set our state variables: peerIds (an array of connected peers’ uids), uid (the local user’s unique id), appid (the Agora App ID), channelName, vidMute (true to mute the local user’s video, false otherwise), and, similarly, audMute for audio.
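As a sketch, the initial state described above could look like this (the `makeInitialState` helper is hypothetical; passing a uid of 0 to let the SDK assign one at join time is an assumption about the version in use):

```javascript
// Sketch of the Video component's initial state, as described above.
function makeInitialState(appId, channelName) {
  return {
    peerIds: [],          // uids of connected remote peers (empty at start)
    uid: 0,               // assumption: 0 asks the SDK to assign an id on join
    appid: appId,         // Agora App ID passed in from the Home screen
    channelName: channelName,
    vidMute: false,       // local video starts unmuted
    audMute: false,       // local audio starts unmuted
  };
}
```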
We set up our video stream configuration in `const config` and initialize the RTC engine by calling `RtcEngine.init(config)`.
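The configuration object might look roughly like the following. The exact key names and constants vary across react-native-agora versions, so treat this as an illustrative sketch and check the docs for your installed version:

```javascript
// Sketch of the stream configuration passed to RtcEngine.init(config).
// Key names and constant values are assumptions -- verify against your
// installed react-native-agora version.
const config = {
  appid: 'YOUR_AGORA_APP_ID', // placeholder: the App ID from the Agora console
  channelProfile: 0,          // assumption: 0 = communication profile (calls)
  videoProfile: 40,           // assumption: an SDK video-profile constant
  clientRole: 1,              // assumption: 1 = broadcaster (sends audio/video)
};
```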
Before we bring the components together, we define functions to handle user events: when a new user joins the call, we add their uid to the array; when a user leaves the call, we remove their uid from the array; and when the local user successfully joins the call channel, we start the stream preview.
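The peer bookkeeping described above boils down to two small state updates. Written as pure helpers (hypothetical names; in the app the real event handlers would pass these results to `this.setState`):

```javascript
// Add a peer's uid when they join; ignore duplicates, since the SDK
// may re-emit a join event for a peer we already know about.
function addPeer(peerIds, uid) {
  return peerIds.includes(uid) ? peerIds : [...peerIds, uid];
}

// Remove a peer's uid when they leave the channel.
function removePeer(peerIds, uid) {
  return peerIds.filter((id) => id !== uid);
}
```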
We define functions to toggle audio and video feeds of the local user and to end the call by leaving the channel.
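The mute toggles are just boolean flips of the state flags. As a sketch (hypothetical helper names; in the app each toggle would also call the SDK’s `muteLocalVideoStream` / `muteLocalAudioStream` so the remote side actually stops receiving the feed):

```javascript
// Flip the local video mute flag without mutating the original state.
function toggleVideo(state) {
  return { ...state, vidMute: !state.vidMute };
}

// Flip the local audio mute flag without mutating the original state.
function toggleAudio(state) {
  return { ...state, audMute: !state.audMute };
}
```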
Next, we define the view for each possible number of users: we start with four external users on the channel and work down to no users using the conditional (ternary) operator. We call this function inside our render method. We define styles for our internal components and export our Video component for use with the router.
That’s it; that’s how the app works. You can use much the same approach to add multi-user video calling to your own React Native application using Agora’s RTC SDK.