Augmented Reality Streaming with Agora UIKit on iOS

By Max Cobb

With the Agora UIKit package, streaming a live augmented reality (AR) session has never been easier.

In this post, you’ll see how to stream an Augmented Reality scene to Agora with just a few basic steps.

Prerequisites

  • An Agora developer account (for the App ID used later in this post)
  • Xcode and a physical iOS device that supports ARKit (AR sessions don’t run in the simulator)
  • A basic understanding of iOS development with Swift

Setup

Let’s start with a new, single-view iOS project. Create the project in Xcode, and then add the Agora UIKit package.

Add the package by selecting File > Swift Packages > Add Package Dependency, and then pasting in the link to this Swift package:

https://github.com/AgoraIO-Community/iOS-UIKit.git

At the time of writing, the latest release is 4.0.0-preview.8. The 4.x preview version of the SDK is used in this post, but a working version of this example with the Agora video SDK 3.x is also included in the example repository.

We also want to add ARVideoKit, a Swift package that helps to capture audio and video from SceneKit views:

https://github.com/AFathi/ARVideoKit.git

If you want to jump ahead, the full example project is available in the example repository.

Once those packages are installed, the camera and microphone usage descriptions need to be added to the app’s Info.plist. To see how to do that, check out Apple’s documentation on requesting authorization for media capture.
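Concretely, the two Info.plist entries look like this (the description strings below are placeholders; use wording appropriate for your app):

<key>NSCameraUsageDescription</key>
<string>The camera is used to capture the AR scene.</string>
<key>NSMicrophoneUsageDescription</key>
<string>The microphone is used to stream your voice.</string>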

Create the UI

Only two views need to be added to our app:

  • Augmented Reality SceneKit view (ARSCNView)
  • Agora UIKit view, set to the .collection style

The SceneKit view won’t be anything special in this example. All we will do is create an ARSCNView that fills the screen and place a cube in front of the camera (at [0, 0, -3]).

SceneKit

The SceneKit view setup would look similar to this (a minimal sketch, assuming the view is kept in a sceneView property on the ViewController):
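
func setupARView() {
  // Create an ARSCNView that fills the screen
  // (assumes a `var sceneView: ARSCNView!` property on ViewController)
  self.sceneView = ARSCNView(frame: self.view.bounds)
  self.sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
  self.view.addSubview(self.sceneView)

  // Place a 1m cube 3m in front of the initial camera position
  let cubeNode = SCNNode(
    geometry: SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0)
  )
  cubeNode.position = SCNVector3(0, 0, -3)
  self.sceneView.scene.rootNode.addChildNode(cubeNode)
}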

All the snippets are from methods in the ViewController class.

After that, we need to create the AR recorder from ARVideoKit. This class wraps the ARSCNView so that it can grab the appropriate camera and SceneKit frames and stitch them together. A minimal sketch, assuming the recorder is kept in an arvkRenderer property:
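
// Assumes a property on ViewController: var arvkRenderer: RecordAR?
self.arvkRenderer = RecordAR(ARSceneKit: self.sceneView)
// Deliver each rendered frame to the RenderARDelegate method below
self.arvkRenderer?.renderAR = self
// Stick to portrait, due to known orientation issues (see Testing below)
self.arvkRenderer?.inputViewOrientations = [.portrait]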

In the above snippet, we set the renderAR delegate to self; self here is the ViewController instance.

We need to add the RenderARDelegate protocol to the ViewController, and add the delegate method that gives us the video frame:

extension ViewController: RenderARDelegate {
  // MARK: ARVideoKit Renderer
  func frame(
    didRender buffer: CVPixelBuffer, with time: CMTime,
    using rawBuffer: CVPixelBuffer
  ) {
    // Create AgoraVideoFrame, and push to Agora engine.
    // This part will be filled in later.
  }
}

The final ARKit step is to configure and run the AR session. I typically call this from the viewDidAppear method. We just want a basic AR session, and ARWorldTrackingConfiguration is a good base configuration to use:

func setARConfiguration() {
  let configuration = ARWorldTrackingConfiguration()
  // Run the configuration to start the ARSession
  self.arvkRenderer?.prepare(configuration)
}

Normally, we would call self.sceneView.session.run(configuration) to start the session. But because the view is wrapped in a RecordAR object, we call prepare(_:) on the arvkRenderer object instead.

Agora UIKit

Now we need to join the Agora video channel using Agora UIKit.

We need to tell the engine that we will be using an external camera. Otherwise, Agora UIKit will immediately go for the default built-in cameras.

AgoraSettings has a property for this called externalVideoSettings. It tells the engine that an external video source should be used, along with a few details about that source, including whether it is textured video data and whether the video source is encoded.

In our case, textured video data is used, and the source is not encoded. We don’t want to show the option to flip the camera, so the settings property gets created like this:

var agSettings = AgoraSettings()
agSettings.externalVideoSettings = AgoraSettings.ExternalVideoSettings(
  enabled: true, texture: true, encoded: false
)
agSettings.enabledButtons = [.cameraButton, .micButton]

Then we create an instance of AgoraVideoViewer, with the above settings, and the .collection style mentioned earlier.

let agoraView = AgoraVideoViewer(
  connectionData: AgoraConnectionData(
    appId: <#Agora App ID#>,
    appToken: <#Agora Token or nil#>
  ),
  style: .collection,
  agoraSettings: agSettings
)

Then fill the view with the AgoraVideoViewer, join the channel, and keep a reference to agoraView in the ViewController.

agoraView.fills(view: self.view)
agoraView.join(channel: "test", as: .broadcaster)
self.agoraView = agoraView

Push Frames to Agora

Now the AR scene is rendering correctly in the background. Anyone who joins the same channel with their camera will appear across the top of the view, but our device is not yet pushing any frames, so nothing from our AR scene reaches the remote users.

Returning to the frame delegate method from earlier, we need to create an AgoraVideoFrame object and assign the format, the pixel buffer, and a timestamp for the video frame. A minimal sketch, using Agora’s format value 12 for iOS texture data:
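
let videoFrame = AgoraVideoFrame()
// 12 means the frame is an iOS texture (CVPixelBuffer)
videoFrame.format = 12
videoFrame.textureBuf = buffer
videoFrame.time = time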

Then we grab the AgoraRtcEngineKit instance through the AgoraVideoViewer class, and push the video frame to it. In Agora UIKit the engine is exposed as the agkit property, so the push looks something like this:
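
// agoraView is the AgoraVideoViewer stored earlier;
// agkit is its underlying AgoraRtcEngineKit instance
self.agoraView?.agkit.pushExternalVideoFrame(videoFrame)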


Audio

If you want to stream audio from the AR session as well, a few more settings need to be made. As a rough sketch (assuming the 3.x SDK’s external audio source API; the 4.x preview may differ), the engine first needs to be told to expect audio from an external source:
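
// Assumption: 3.x SDK external audio API; the 4.x preview may differ.
// Stop the engine capturing the microphone itself, since
// ARVideoKit will supply the audio samples instead.
self.agoraView?.agkit.setExternalAudioSource(
  true, sampleRate: 44100, channels: 1
)

The rest of the audio wiring can be found in the example repository.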

Conclusion

Your new video streaming app can stream an AR scene using SceneKit on iOS, and you can view the scene on any other device that has a compatible Agora SDK.

Testing

Try out this example, using either the 3.x or 4.x SDK through Agora UIKit, by grabbing the example repository.

There are some known issues with ARVideoKit that have to do with device orientation, so for this specific example I recommend trying it only with the device upright (portrait).

Other Resources

For more information about building applications using Agora SDKs, take a look at the Agora Video Call Quickstart Guide and Agora API Reference.

I also invite you to join the Agora Developer Slack community.