Set up the Agora Engine with Magic Leap 2 to build real-time engagement experiences such as video chat and live video streaming.
Set up your development environment by following Magic Leap’s setup guides:
- Download ML Hub and install the ML SDKs for Unity and/or native via Package Manager.
- Via Unity Hub, download and install Unity Editor v2022.2.0b7 or higher with Android Build Support and its dependencies: Android SDK & NDK Tools and OpenJDK.
Set Up the Unity Project Environment
- Follow Magic Leap’s official Getting Started guide to set up a Unity project and configure the XR programming environment.
- The following options must be enabled for the Agora SDK to work:
i. Enable Magic Leap in XR Plug-in Management for the Android Platform.
ii. Make sure that Use ML Audio is enabled in XR Plug-in Management > Magic Leap Settings.
iii. Enable camera and audio recording permissions in MagicLeap > Manifest Settings.
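For reference, enabling those permissions in Manifest Settings adds the standard Android permission entries to the generated AndroidManifest.xml. A minimal excerpt (the permission names are standard Android; the exact placement in the generated manifest may vary):

```xml
<!-- Runtime permissions the Agora SDK needs for video capture and voice chat -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```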
Add Agora Engine
- Download the Agora Engine package for Unity from the Agora Extensions Release page.
- Double-click the downloaded package, or drag it into the Unity Project Assets window, and click Import. Note that the package includes supporting assets from the MagicLeap example; if you are testing with the existing MagicLeap Unity example project, you may exclude them to avoid overwriting those files.
Test The Demo
- Open the AgoraMLDemo scene from Agora_MagicLeap2_Plugin > AgoraEngine > ML2Support > Demo.
- Input your Agora project’s APP_ID and CHANNEL_NAME for the test. It is recommended to test the demo with a test-mode App ID first, then switch to a token-enabled App ID (see the README file for testing with tokens).
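Under the hood, the App ID and channel name feed into the Agora engine when the demo joins a channel. A minimal sketch of that flow, using the v3-style Agora Unity API (`agora_gaming_rtc` namespace); the demo scene’s actual scripts and field names may differ:

```csharp
using UnityEngine;
using agora_gaming_rtc; // Agora Unity SDK (v3-style namespace)

public class JoinChannelSketch : MonoBehaviour
{
    // Fill these in from your Agora Console project.
    [SerializeField] private string appID = "YOUR_APP_ID";
    [SerializeField] private string channelName = "YOUR_CHANNEL_NAME";

    void Start()
    {
        // Create the engine with a test-mode App ID (no token required).
        IRtcEngine engine = IRtcEngine.GetEngine(appID);
        engine.EnableVideo();
        // For a token-enabled project, pass the token string as the
        // first argument instead of null.
        engine.JoinChannelByKey(null, channelName, "", 0);
    }
}
```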
- Fill in the appropriate Android build info in the build settings.
- Connect the ML2 device and click Build and Run.
- Start an Agora RTC client for one or more remote users. See https://webdemo.agora.io/ for a list of test apps; the Basic Video Calling demo works for a quick test. Fill in the App ID and channel name, then join the channel.
- Press the Connect Camera button in the ML2 demo app.
The ML2 user should now see the camera stream from the Web user, and the Web user should see the ML2 camera stream. Both users can also start a voice conversation at any time.