
Create a Live Video App with Augmented Reality

By Jonathan Fotland

Over the last decade, live streaming has become one of the most popular forms of entertainment, and more and more people enjoy sharing their lives with the public through it.

Nowadays, there are plenty of streaming applications to choose from, such as Twitch, Facebook Live, and YouTube Live. These apps offer fun extra features such as face filters and voice changers. But have you ever thought about building a streaming application of your own? Are you hesitating because some of these features seem too fancy or complicated?

Luckily, the Agora Video SDK and the Banuba Face AR SDK make developing a streaming application with face filters quick and easy. Today, I’ll show you how to do this on iOS, step by step.

Prerequisites

  1. A basic understanding of Swift and the iOS SDK.
  2. An Agora developer account.
  3. Xcode and an iOS device.
  4. CocoaPods. If you don’t have CocoaPods installed already, you can find instructions here.
  5. The Banuba Face AR demo app.

Overview

This tutorial will go over the steps for building a live-streaming application on iOS using the Agora SDK and Banuba Face AR. The following is a list of the core features that will be included in our app:

  • Users (streamers and audience members) can create and log in to their accounts. User account information will be saved in the Google Firebase Realtime Database.
  • Users can set up virtual rooms to host live streams and become streamers.
  • Users can find all live streams and join a virtual room as audience members.
  • Streamers can use the face filter feature to stream with a virtual mask or animation.
  • Audience members in a virtual room can send text messages that can be seen by everyone in that room.
  • Users can search for other users by name and send private text messages to them.

You can find my demo app as a reference for this tutorial.

Downloading Banuba FaceAR Demo

The easiest way to build our app is to build it on top of the existing Banuba FaceAR demo app. The package contains a default Banuba demo app demonstrating the SDK. Once you’ve downloaded the app and unzipped it, follow Banuba’s online instructions to get it set up. Note the following about the demo app: You will need a Banuba client token to run the project. Contact info@banuba.com for more information.

Some easy-to-miss tips with the Banuba setup:

  • In step 6, make sure BanubaEffectPlayer.framework is marked Embed and Sign.
  • In step 8, make sure you select Create folder references when adding the effects folder.

After you’re done with the setup, it’s a good idea to compile the BanubaSdk framework target (make sure you set the device to Generic iOS Device) to create an up-to-date version of the Banuba SDK. You’ll then need to copy the newly built framework to the same place you put BanubaEffectPlayer.framework and add it as an embedded library in the same way.

General settings after setup

After you’ve recompiled the SDK, run the BanubaSdkApp target. You should see something like this:

Details for the UI of this screen can be found here

This will be our streamer view. However, we’re going to need to do some work to get these face filters streamed out to an audience. This is where the Agora Video SDK comes in handy.

Setting Up CocoaPods

  1. In Terminal, navigate to the directory containing the BanubaSdkApp.xcodeproj project and run pod init to initialize CocoaPods. This should be src/BanubaSdk/BanubaSdkApp.
  2. Open the Podfile that was created and add the pods for the Agora library, as well as the Firebase libraries we’ll use for user management.
  3. Run pod install in Terminal to install the libraries.
  4. Open BanubaSdkApp.xcworkspace.
  5. Open Build Settings and search for “Framework Search Paths”. Replace the first search path ("$(SRCROOT)/../../../../build") with $(inherited) to make sure Xcode can locate the CocoaPods frameworks.
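As a sketch, the Podfile from step 2 might look like the following. The pod names shown are the typical ones for these SDKs, but they change between releases, so check the current names on CocoaPods before using them:

```ruby
platform :ios, '12.0'

target 'BanubaSdkApp' do
  use_frameworks!

  # Agora video and real-time messaging SDKs
  pod 'AgoraRtcEngine_iOS'
  pod 'AgoraRtm_iOS'
  pod 'AgoraUIKit_iOS'      # used later for the audience view

  # Firebase for authentication, analytics, and user data
  pod 'Firebase/Analytics'
  pod 'Firebase/Auth'
  pod 'Firebase/Database'
  pod 'FirebaseUI/Auth'
  pod 'FirebaseUI/Email'
  pod 'FirebaseUI/Google'
end
```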
Framework Search Paths

Setting Up Firebase

Go to https://console.firebase.google.com and create a new Firebase project. Follow the instructions there to set up Firebase within your existing app. We’re going to be using Firebase for authentication, analytics, and user management.

Once you’ve finished going through Firebase’s setup, you should have completed the following steps:

  1. Registered your app’s Bundle ID with Firebase. As a reminder, you can find your Bundle ID in your project settings, under General.
  2. Downloaded the GoogleService-Info.plist file and added it to your app.
  3. Imported Firebase to your AppDelegate and called FirebaseApp.configure() in didFinishLaunchingWithOptions.
  4. Ran your application to have Firebase verify communication.

You will then be presented with the Firebase dashboard. Click Develop in the left navigation, and then click Authentication. Click Set up sign-in method and enable the Email/Password and Google sign-in options. Note that you’ll need to set your public-facing app name and support email to do so.

In Xcode, you’ll also need to set up a URL scheme to handle Google sign-in. Copy the REVERSED_CLIENT_ID field from your GoogleService-Info.plist, and open up the URL Types pane in the Info section of your project settings:

URL Schemes

Add a new URL type and paste the reversed client ID into the URL Schemes field. We’ll also need to write some code so our app knows how to handle that URL. We’ll be using Firebase UI, so for us it’s as simple as just telling Firebase to handle it. Add the following to your AppDelegate.swift:
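A minimal sketch of that handler, assuming FirebaseUI is installed as described above:

```swift
import FirebaseUI

// In AppDelegate: let FirebaseUI handle the Google sign-in redirect URL.
func application(_ app: UIApplication, open url: URL,
                 options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
    let sourceApplication = options[.sourceApplication] as? String
    if FUIAuth.defaultAuthUI()?.handleOpen(url, sourceApplication: sourceApplication) ?? false {
        return true
    }
    // The URL was not a FirebaseUI sign-in callback.
    return false
}
```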

Note: Firebase provides several other sign-in options that you may want to allow, but we won’t be covering them here.

Setting Up the Views

View Layout

Open up Main.storyboard. It already contains a view for the Banuba demo, which we’re going to leave alone. We’ll be adding some additional screens around it to encompass our new functionality. We need a Tab Bar Controller with three views attached to it: one for our list of live streams, one for chat, and one for settings and other miscellany (which we’re just using as a placeholder for the purposes of this demo). Set the Tab Bar Controller as the new Initial View Controller.

For the first tab, we need a Navigation Controller and a Table View Controller, with a button overlaid to go live with your personal stream. That button should show the existing Banuba screen with a segue. Make sure to give the segue an identifier; we’ll be using that soon.

Make a simple TableViewCell:
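A minimal cell subclass for the list of live streams might look like this (the class and outlet names here are my own; use whatever you connected in the storyboard):

```swift
import UIKit

// Displays the name of a user who is currently live.
class LiveUserTableViewCell: UITableViewCell {
    @IBOutlet weak var usernameLabel: UILabel!
}
```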

Finally, make sure you give the cell prototype a reuse identifier.

For now, leave the chat view controller blank; we’ll come back to it later.

Logging in with FirebaseUI

In this guide, we’ll be using Firebase’s built-in UI to handle sign-in for us. If you want to use your own login page, or simply want to be more flexible with your UI, you can find the docs for logging in programmatically with email and Google here and here, respectively.

We’re going to be using FirebaseUI to log in to our app. We’ll have our tab bar controller — I’ve named it MainScreenViewController — handle showing the default FUIAuth View Controller. All we need to do is tell it what providers we want to allow and who to notify when the user successfully logs in:
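A sketch of that setup, assuming the tab bar controller subclass is named MainScreenViewController as above (the provider classes vary slightly between FirebaseUI versions):

```swift
import FirebaseUI

extension MainScreenViewController: FUIAuthDelegate {
    // Presents FirebaseUI's prebuilt login screen with our chosen providers.
    func showLoginView() {
        guard let authUI = FUIAuth.defaultAuthUI() else { return }
        authUI.delegate = self
        authUI.providers = [
            FUIEmailAuth(),   // email/password sign-in
            FUIGoogleAuth()   // Google sign-in
        ]
        present(authUI.authViewController(), animated: true)
    }
}
```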

We could call this function on startup, but it would get pretty annoying to have to log in every time we open the app. To solve this, we can use something provided to us by FirebaseAuth — an AuthStateDidChangeListener. It will tell us whenever the user’s authentication state changes and allow us to show the login page if there’s no user already logged in. Adding one is pretty simple:
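A sketch of the listener, assuming a helper like the showLoginView function that presents the FUIAuth view controller:

```swift
import FirebaseAuth

// Call from viewDidLoad of MainScreenViewController.
func observeAuthState() {
    Auth.auth().addStateDidChangeListener { [weak self] _, user in
        if user == nil {
            // Nobody is signed in; present the FirebaseUI login flow.
            self?.showLoginView()
        }
    }
}
```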

We now have a functional login page that will appear if the current user is nil.

Creating a User Database

Firebase will track our users for us — you can see this for yourself on the Authentication tab of the Firebase dashboard once you sign in with a user. However, this list of users isn’t very useful to us. While we can get information from it about the currently logged-in user, it won’t allow us to get any info about other users. We’ll need our own database for that.

Go to the Database tab on the Firebase dashboard and create a new Realtime Database. Start it in test mode for now, so we can easily modify it without having to worry about security while we’re working on it. We could add data manually here, but it’ll be easier to do it automatically in code.

Adding Users on Login

Head back to our FUIAuthDelegate extension. We’re going to make use of that didSignInWith callback to add a user to our database whenever they log in:
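A sketch of that callback, matching the fields described below (uid as the key, plus email, display name, and a lowercased searchable name):

```swift
// Inside the FUIAuthDelegate extension:
func authUI(_ authUI: FUIAuth, didSignInWith authDataResult: AuthDataResult?, error: Error?) {
    guard let user = authDataResult?.user else { return }
    let ref = Database.database().reference()
    // Keyed by Firebase's unique UID; the lowercased name makes search easier.
    ref.child("users").child(user.uid).setValue([
        "email": user.email ?? "",
        "displayName": user.displayName ?? "",
        "username": (user.displayName ?? "").lowercased()
    ])
}
```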

This code gets a reference to our main database and adds an entry in a new user node. Each child of a node needs a unique key, so we use the unique UID Firebase gives us, and we store the user’s email, their display name, and a lowercased version of their display name that will make it easier to search for later.

Note: This code will overwrite our user node every time the user logs in. If you want to add additional fields to our user database, this code will need to be adjusted so it doesn’t delete things.

Going Live

In our GoLiveViewController, we first need to make sure we have a reference to our current user, so we know who’s streaming. We can use the same method we used to log in.

When we hit the Go Live button, we want to pass our username to the Banuba screen so that we can enter our own personal channel. We also need to make sure we save the fact that we’re live somewhere so other users can watch the stream. We’ll do that using another node in our Firebase database.

Note: We haven’t created a roomName variable in the Banuba View controller yet, but we’ll add that soon.

When the user hits the Go Live button, we’ll add their username to the list of users currently live. We’re then going to update our table view by reading from that list.
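A sketch of the button handler and segue, with hypothetical names (goLiveSegue, liveUsers, currentUsername) standing in for whatever you used in your storyboard and controller:

```swift
import FirebaseDatabase

// Called when the Go Live button is tapped.
@IBAction func goLiveTapped(_ sender: UIButton) {
    guard let username = currentUsername else { return }
    // Record that this user is live so others can find the stream.
    Database.database().reference()
        .child("liveUsers").child(username).setValue(true)
    performSegue(withIdentifier: "goLiveSegue", sender: self)
}

override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    if segue.identifier == "goLiveSegue",
       let banubaVC = segue.destination as? ViewController {
        // Stream into our own personal channel, named after us.
        banubaVC.roomName = currentUsername
    }
}
```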

We use a database observer to get notified when the list of live users changes and then update our table as needed. When we set up the observer, it will immediately give us a childAdded event for every user who is already live, so no additional cases are needed.
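The observers might be set up like this, assuming a liveUsers array backing the table view:

```swift
// Keep the table in sync with the list of live users.
func observeLiveUsers() {
    let liveRef = Database.database().reference().child("liveUsers")
    liveRef.observe(.childAdded) { [weak self] snapshot in
        self?.liveUsers.append(snapshot.key)
        self?.tableView.reloadData()
    }
    liveRef.observe(.childRemoved) { [weak self] snapshot in
        self?.liveUsers.removeAll { $0 == snapshot.key }
        self?.tableView.reloadData()
    }
}
```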

Connecting Banuba to Agora

This is where the real meat of the app is. We need the video frames Banuba is generating to be streamed out to our audience. To do that, we’re going to create a custom video source that’s responsible for taking the frames Banuba is generating and passing them to the Agora engine in a way it understands. Create a new CustomVideoSource class that conforms to the AgoraVideoSourceProtocol:
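A sketch based on the protocol as defined in the 3.x Agora SDK (the exact set of required methods varies slightly across SDK versions):

```swift
import AgoraRtcKit
import CoreMedia

// Pushes externally produced pixel buffers (Banuba's output) into Agora.
class CustomVideoSource: NSObject, AgoraVideoSourceProtocol {
    var consumer: AgoraVideoFrameConsumer?

    func shouldInitialize() -> Bool { return true }
    func shouldStart() {}
    func shouldStop() {}
    func shouldDispose() {}

    func bufferType() -> AgoraVideoBufferType {
        return .pixelBuffer
    }

    // Called by our code whenever Banuba produces a processed frame.
    func sendBuffer(_ buffer: CVPixelBuffer, timestamp: Double) {
        let time = CMTime(seconds: timestamp, preferredTimescale: 1000)
        consumer?.consumePixelBuffer(buffer, withTimestamp: time, rotation: .rotationNone)
    }
}
```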

Most of the functions are required stubs that don’t do much. The only real content is the sendBuffer function, which will take a CVPixelBuffer in, and pass it to Agora.

Now it’s time to start editing Banuba’s ViewController.swift file. We need to set up some variables:
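Something like the following properties will do; roomName is the variable mentioned below that GoLiveViewController sets before the segue:

```swift
import AgoraRtcKit

// Agora-related state for the Banuba ViewController.
var agoraKit: AgoraRtcEngineKit?
var customSource = CustomVideoSource()
var roomName: String?   // set by GoLiveViewController before the segue
```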

And then we add a function to initialize Agora:
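A sketch of that initialization, using the 3.x engine API ("YourAppID" is a placeholder for the App ID from your Agora console):

```swift
func setUpAgora() {
    let kit = AgoraRtcEngineKit.sharedEngine(withAppId: "YourAppID", delegate: self)
    kit.setChannelProfile(.liveBroadcasting)
    kit.setClientRole(.broadcaster)
    kit.enableVideo()
    // Feed Agora from our custom source instead of the raw camera.
    kit.setVideoSource(customSource)
    if let roomName = roomName {
        kit.joinChannel(byToken: nil, channelId: roomName, info: nil, uid: 0, joinSuccess: nil)
    }
    agoraKit = kit
}
```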

Make sure to add the AgoraRtcEngineDelegate protocol to the ViewController class. We don’t actually need it for this demo, but it could come in handy later if we want to expand the app’s functionality.

And then we tell Banuba where to send its frames in viewDidLoad:

Finally, make sure we clean up our stream when we leave the screen:
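The teardown can be as simple as:

```swift
override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Stop broadcasting and release the engine when leaving the screen.
    agoraKit?.leaveChannel(nil)
    AgoraRtcEngineKit.destroy()
}
```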

Joining the Audience

We’re now sending our fancy AR frames to the Agora channel, but we don’t have any way to view them. Fortunately, creating a simple audience view is trivial with AgoraUIKit. We don’t even need to lay out any views.

Back in our GoLiveViewController, add a handler for when a user taps a live user:
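A sketch of the handler, assuming a liveUsers array backing the table; the exact AgoraVideoViewController initializer depends on the AgoraUIKit release you installed, so treat this signature as an assumption:

```swift
func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
    let channelName = liveUsers[indexPath.row]
    // Initializer shape assumed -- check the AgoraUIKit version you installed.
    let audienceVC = AgoraVideoViewController(appID: "YourAppID", token: nil, channel: channelName)
    present(audienceVC, animated: true)
}
```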

That’s literally all there is to it. AgoraUIKit will do all the heavy lifting for us. If you try the app out now, you should be able to log in, start streaming with face filters, and see the stream on another device.

Adding Chat

We’re going to use Agora’s Real-Time Messaging (RTM) SDK to allow users to chat with each other while they’re in a video call. First, let’s set up some new views.

Chat Views

Replace the chat section of our app with three new views:

  • A Navigation Controller.
  • A view for searching for users to chat with. This needs a UISearchBar and a UITableView. Make sure you connect the search bar’s delegate and the tableview’s delegate and data source to the view controller. We also need a prototype cell with a single label in it. Make sure to give it a reuse identifier.
  • A view for chatting. This needs a UITableView and a UITextField. Remember to connect up their delegates and the table’s data source. Make another prototype cell, also with a single label.

Searching for Users

Connect up your first prototype cell to a simple UITableViewCell subclass:

We’re then going to make a new UserSearchViewController class. Make sure you set the custom class in your Main.storyboard when you do. We first do our standard setup to get a reference to our user, as well as a reference to our user database.

Then, when the user searches, we perform a database query to get a list of matching users:
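The query can be a standard Firebase range query over our lowercased username field (foundUsers is a hypothetical array backing the results table):

```swift
func searchBarSearchButtonClicked(_ searchBar: UISearchBar) {
    guard let text = searchBar.text?.lowercased(), !text.isEmpty else { return }
    foundUsers.removeAll()
    // Range query: all usernames starting with the search text.
    // "\u{f8ff}" is a very high code point, giving an inclusive prefix bound.
    Database.database().reference().child("users")
        .queryOrdered(byChild: "username")
        .queryStarting(atValue: text)
        .queryEnding(atValue: text + "\u{f8ff}")
        .observeSingleEvent(of: .value) { [weak self] snapshot in
            for case let child as DataSnapshot in snapshot.children {
                if let data = child.value as? [String: Any],
                   let name = data["displayName"] as? String {
                    self?.foundUsers.append(name)
                }
            }
            self?.tableView.reloadData()
        }
}
```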

Tip: If you don’t have multiple phones to test with, you can always add dummy users into your database directly in the Firebase console.

We then need to display the users we found:

If you run your app and search for another user, they will now appear in your search. However, you may also notice Firebase complaining at you in the console:

[Firebase/Database][I-RDB034028] Using an unspecified index. Your data will be downloaded and filtered on the client. Consider adding “.indexOn”: “username” at /users to your security rules for better performance.

This is Firebase telling us that it’s not indexing our users by our search fields, because we haven’t told it to. With as few users as we have now, it’s not a big deal. But if we want to release to a large user base, we should fix this.

Fortunately, adding the rule is easy. Head to the Database tab in your Firebase dashboard, and open up your Rules. Add the .indexOn field to your users database and hit Publish:
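With the test-mode defaults still in place, the updated rules would look roughly like this (remember that open read/write access is for development only):

```json
{
  "rules": {
    ".read": true,
    ".write": true,
    "users": {
      ".indexOn": "username"
    }
  }
}
```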

indexOn field

Finally, we need to actually show the chat view when we select a user. Create a manual segue to the chat view in Main.storyboard and give it a name. Then we can perform this segue and pass the name of the person we want to chat with to the next view.

Sending Messages

Our final screen is where our users can actually chat with each other. First, we need to initialize Agora’s RTM module and then log in:
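A sketch of that setup, logging in under our display name so peers can address us by it (myUsername is a hypothetical property holding it):

```swift
import AgoraRtmKit

var rtmKit: AgoraRtmKit?

func setUpRtm() {
    rtmKit = AgoraRtmKit(appId: "YourAppID", delegate: self)
    rtmKit?.login(byToken: nil, user: myUsername) { errorCode in
        if errorCode != .ok {
            print("RTM login failed: \(errorCode.rawValue)")
        }
    }
}
```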

When our user types a message, we send it using RTM and add it to our table view:
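Sending on return might look like this, with peerUsername and messages as hypothetical properties for the chat partner and the local message list:

```swift
func textFieldShouldReturn(_ textField: UITextField) -> Bool {
    guard let text = textField.text, !text.isEmpty else { return true }
    rtmKit?.send(AgoraRtmMessage(text: text),
                 toPeer: peerUsername,
                 sendMessageOptions: AgoraRtmSendMessageOptions()) { errorCode in
        if errorCode != .ok {
            print("Send failed: \(errorCode.rawValue)")
        }
    }
    // Show our own message immediately.
    messages.append((sender: myUsername, text: text))
    tableView.reloadData()
    textField.text = nil
    return true
}
```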

Then we implement the AgoraRtmDelegate to receive and display messages from the user we’re chatting with:
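A sketch of the delegate method (ChatViewController is my name for this screen):

```swift
extension ChatViewController: AgoraRtmDelegate {
    // Fires when a peer sends us a direct message.
    func rtmKit(_ kit: AgoraRtmKit, messageReceived message: AgoraRtmMessage,
                fromPeer peerId: String) {
        messages.append((sender: peerId, text: message.text))
        tableView.reloadData()
    }
}
```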

Handling the Keyboard

If you try to test the app now, you’ll notice an immediate problem: the text field is at the bottom of the screen, and the keyboard covers it up when you select it. Let’s fix that.

Here, we add a reference to the NSLayoutConstraint attaching the text field to the bottom of the screen. Using Notification Center, we can then find out when the keyboard is shown or hidden, and adjust how far from the bottom of the screen our text field is automatically.
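The adjustment described above can be sketched like this, assuming an @IBOutlet named textFieldBottomConstraint connected to the text field's bottom constraint:

```swift
func registerForKeyboardNotifications() {
    let center = NotificationCenter.default
    center.addObserver(forName: UIResponder.keyboardWillShowNotification,
                       object: nil, queue: .main) { [weak self] note in
        guard let frame = note.userInfo?[UIResponder.keyboardFrameEndUserInfoKey]
                as? NSValue else { return }
        // Lift the text field by the keyboard's height.
        self?.textFieldBottomConstraint.constant = frame.cgRectValue.height
        self?.view.layoutIfNeeded()
    }
    center.addObserver(forName: UIResponder.keyboardWillHideNotification,
                       object: nil, queue: .main) { [weak self] _ in
        self?.textFieldBottomConstraint.constant = 0
        self?.view.layoutIfNeeded()
    }
}
```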

Adding Channel Chat

In addition to direct messaging, it’d be nice if our audience members could chat with other people in the room while watching a stream. We’re currently using AgoraUIKit to handle the audience, which doesn’t have RTM built in. We can fix that, though, because we can add whatever functionality we want by subclassing AgoraVideoViewController.

Make a new file. We’re going to be doing a lot of the same things we just did for one-on-one chat.

Getting our user should be familiar by now. In addition, we’re setting up Agora RTM. Here, we’re calling createChannel with the name of the streamer, which will create a channel if it doesn’t exist, or give it to us if it does. We’re then joining that channel and will be sending it messages, instead of sending them to specific users.
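The channel setup described above can be sketched as follows; the completion error code is just logged here, since its Swift case names differ between RTM SDK versions:

```swift
var rtmChannel: AgoraRtmChannel?

func joinChatChannel(named streamerName: String) {
    // createChannel returns the existing channel, or creates it if needed.
    rtmChannel = rtmKit?.createChannel(withId: streamerName, delegate: self)
    rtmChannel?.join { errorCode in
        print("Join channel result: \(errorCode.rawValue)")  // 0 means success
    }
}
```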

In our viewDidLoad, we’re adding the table view and text field we need to handle chat, and setting up some programmatic constraints so it stays in the right place. We’re also doing the same Notification Center setup we did earlier so that we can properly handle the keyboard here.

Our table view and keyboard setup doesn’t change much:

And we handle messages much the same way, though the callback for receiving channel messages is slightly different:
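The channel delegate callback, as a sketch (the subclass name is my own):

```swift
extension LiveStreamViewController: AgoraRtmChannelDelegate {
    // Channel messages arrive here instead of the peer-to-peer callback.
    func channel(_ channel: AgoraRtmChannel, messageReceived message: AgoraRtmMessage,
                 from member: AgoraRtmMember) {
        messages.append((sender: member.userId, text: message.text))
        tableView.reloadData()
    }
}
```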

Finally, we need to slightly update the code that initializes our audience view in our GoLiveViewController:

Conclusion

With that, we’re done! We have a working streaming application, complete with AR face filters. If you’ve gotten this far, thanks for following along! If you’d like to see more features or have questions, feel free to leave a comment or email devrel@agora.io. Happy streaming!