
Using C++ to Implement a Video Chat Feature in Unreal Engine

By Joel Thomas

Hello, devs! Today I’m going to walk you through the steps needed to implement the Agora Real-Time-Engagement service into Unreal Engine using C++!

For this example, I’ll use Unreal Engine 4.25 and the current Agora SDK.

Getting Started

Create a Project

Now, let’s build a project from scratch!

  1. Open the Unreal Editor and click New project.
  2. On the New Project panel, choose C++ as the project type, enter the project name, choose the project location, and click Create Project.

Uncomment the PrivateDependencyModuleNames line in the [your_project]/Source/[project_name]/[project_name].Build.cs file. Unreal Engine comments this line out by default, and the project will fail to compile later because the UI code depends on the UMG and Slate modules.

// Uncomment if you are using Slate UI
PrivateDependencyModuleNames.AddRange(new string[] { "UMG", "Slate", "SlateCore" });

Installation

Follow these steps to integrate the Agora plugin into your project.

  1. Copy the plugin to [your_project]/Plugins.
  2. Add the plugin dependency into the [your_project]/Source/[project_name]/[project_name].Build.cs file, Private Dependencies section:
    PrivateDependencyModuleNames.AddRange(new string[] { "AgoraPlugin" });
  3. Restart Unreal Engine (if it is running).
  4. Go to Edit > Plugins. Find the Project > Other category and make sure that the Plugin is enabled.

Create a Level

Next, you create a level to build the game environment. There are several ways to create a level; here, you use the File menu, which lists the level-selection options.

In the Unreal Editor, select File > New Level:

The New Level dialog box opens:

Click Empty Level to select it, and save the level to a folder named Levels.

Create Core Classes

Now it’s time to create your first C++ classes, which will handle communication with the Agora SDK:

  • VideoFrameObserver
  • VideoCall

Create VideoFrameObserver

VideoFrameObserver implements the agora::media::IVideoFrameObserver interface. The methods in the VideoFrameObserver class manage video frame callbacks, and the observer must be registered with agora::media::IMediaEngine using the registerVideoFrameObserver function.

To create your VideoFrameObserver, you:

  1. Create the VideoFrameObserver class interface.
  2. Override the onCaptureVideoFrame and onRenderVideoFrame methods.
  3. Add the setOnCaptureVideoFrameCallback and setOnRenderVideoFrameCallback methods.

In the Unreal Editor, select File > Add New C++ Class.

Select None as a parent class and click Next:

Name the class VideoFrameObserver and click Create Class.

Create the VideoFrameObserver class interface.

Open the VideoFrameObserver.h file and add the interface:

//VideoFrameObserver.h
#pragma once
#include "CoreMinimal.h"
#include <functional>
#include "AgoraMediaEngine.h"
class AGORAVIDEOCALL_API VideoFrameObserver : public agora::media::IVideoFrameObserver
{
public:
	virtual ~VideoFrameObserver() = default;
public:
	bool onCaptureVideoFrame(VideoFrame& videoFrame) override;
	bool onRenderVideoFrame(unsigned int uid, VideoFrame& videoFrame) override;
	void setOnCaptureVideoFrameCallback(
		std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> callback);
	void setOnRenderVideoFrameCallback(
		std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> callback);
	virtual VIDEO_FRAME_TYPE getVideoFormatPreference() override { return FRAME_TYPE_RGBA; }
private:
	std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnCaptureVideoFrame;
	std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnRenderVideoFrame;
};

Note: AGORAVIDEOCALL_API is a project-specific define. Use the [PROJECTNAME]_API define that Unreal Engine generates for your own project instead.

Override the onCaptureVideoFrame and onRenderVideoFrame Methods

The onCaptureVideoFrame function receives the image captured by the local camera and triggers the OnCaptureVideoFrame callback; onRenderVideoFrame does the same for the image received from the specified remote user. Because getVideoFormatPreference returns FRAME_TYPE_RGBA, the SDK delivers both frames already converted to RGBA format.

//VideoFrameObserver.cpp
bool VideoFrameObserver::onCaptureVideoFrame(VideoFrame& Frame)
{
	const auto BufferSize = Frame.yStride * Frame.height;
	if (OnCaptureVideoFrame)
	{
		OnCaptureVideoFrame( static_cast< uint8_t* >( Frame.yBuffer ), Frame.width, Frame.height, BufferSize );
	}
	return true;
}
bool VideoFrameObserver::onRenderVideoFrame(unsigned int uid, VideoFrame& Frame)
{
	const auto BufferSize = Frame.yStride*Frame.height;
	if (OnRenderVideoFrame)
	{
		OnRenderVideoFrame( static_cast<uint8_t*>(Frame.yBuffer), Frame.width, Frame.height, BufferSize );
	}
	return true;
}
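One subtlety in these overrides: BufferSize is computed from yStride rather than width. For RGBA frames, yStride is the number of bytes between consecutive rows, which can exceed the packed row size when rows are padded. A minimal standalone sketch of copying such a padded frame into a tightly packed buffer (plain C++; `PackRgbaFrame` is a hypothetical helper, not part of the Agora API):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Copies a frame whose rows are 'stride' bytes apart into a tightly
// packed RGBA buffer of width * 4 bytes per row. When stride equals
// width * 4, the frame is already packed and one memcpy would suffice.
std::vector<std::uint8_t> PackRgbaFrame(const std::uint8_t* src,
                                        std::uint32_t width,
                                        std::uint32_t height,
                                        std::uint32_t stride)
{
    const std::uint32_t RowBytes = width * 4;
    std::vector<std::uint8_t> Packed(static_cast<std::size_t>(RowBytes) * height);
    for (std::uint32_t y = 0; y < height; ++y)
    {
        std::memcpy(Packed.data() + static_cast<std::size_t>(y) * RowBytes,
                    src + static_cast<std::size_t>(y) * stride,
                    RowBytes);
    }
    return Packed;
}
```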

Add the setOnCaptureVideoFrameCallback and setOnRenderVideoFrameCallback methods.

These setters register the callbacks that receive the locally captured camera image and the image received from the remote user.

//VideoFrameObserver.cpp
void VideoFrameObserver::setOnCaptureVideoFrameCallback(
	std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> Callback)
{
	OnCaptureVideoFrame = Callback;
}
void VideoFrameObserver::setOnRenderVideoFrameCallback(
	std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> Callback)
{
	OnRenderVideoFrame = Callback;
}
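The setter pattern above wires a std::function into the observer so that the SDK-facing class never needs to know about Unreal types. A standalone illustration of the same pattern (plain C++; `FrameSource` and `deliverFrame` are hypothetical stand-ins, not Agora API):

```cpp
#include <cstdint>
#include <functional>
#include <utility>
#include <vector>

// Minimal stand-in for the observer: stores a callback and forwards
// frame data to it, the same way setOnCaptureVideoFrameCallback and
// onCaptureVideoFrame cooperate above.
class FrameSource
{
public:
    using FrameCallback =
        std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)>;

    void setCallback(FrameCallback InCallback) { Callback = std::move(InCallback); }

    // Simulates the SDK delivering one frame; returns true like the overrides.
    bool deliverFrame(std::vector<std::uint8_t>& Frame, std::uint32_t W, std::uint32_t H)
    {
        if (Callback)
        {
            Callback(Frame.data(), W, H, static_cast<std::uint32_t>(Frame.size()));
        }
        return true;
    }

private:
    FrameCallback Callback;
};
```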

Create the VideoCall C++ Class

The VideoCall class manages communication with the Agora SDK.

Create the Class Interface

Return to the Unreal Editor, create a C++ class as you did in the previous step, and name it VideoCall.

Go to the VideoCall.h file and add:

//VideoCall.h
#pragma once
#include "CoreMinimal.h"
#include <functional>
#include <vector>
#include "AgoraRtcEngine.h"
#include "AgoraMediaEngine.h"
class VideoFrameObserver;
class AGORAVIDEOCALL_API VideoCall
{
public:
	VideoCall();
	~VideoCall();
	FString GetVersion() const;
	void RegisterOnLocalFrameCallback(
		std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnLocalFrameCallback);
	void RegisterOnRemoteFrameCallback(
		std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnRemoteFrameCallback);
	void StartCall(
		const FString& ChannelName,
		const FString& EncryptionKey,
		const FString& EncryptionType);
	void StopCall();
	bool MuteLocalAudio(bool bMuted = true);
	bool IsLocalAudioMuted();
	bool MuteLocalVideo(bool bMuted = true);
	bool IsLocalVideoMuted();
	bool EnableVideo(bool bEnable = true);
private:
	void InitAgora();
private:
	TSharedPtr<agora::rtc::ue4::AgoraRtcEngine> RtcEnginePtr;
	TSharedPtr<agora::media::ue4::AgoraMediaEngine> MediaEnginePtr;
	TUniquePtr<VideoFrameObserver> VideoFrameObserverPtr;
	//callback
	//data, w, h, size
	std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnLocalFrameCallback;
	std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnRemoteFrameCallback;
	bool bLocalAudioMuted = false;
	bool bLocalVideoMuted = false;
};

Create Initializing Methods

Go to the VideoCall.cpp file and add the required includes:

//VideoCall.cpp
#include "AgoraVideoDeviceManager.h"
#include "AgoraAudioDeviceManager.h"
#include "MediaShaders.h"
#include "VideoFrameObserver.h"

Next, you add the methods to VideoCall.cpp which will create and initialize the Agora engine:

//VideoCall.cpp
VideoCall::VideoCall()
{
	InitAgora();
}
VideoCall::~VideoCall()
{
	StopCall();
}
void VideoCall::InitAgora()
{
	RtcEnginePtr = TSharedPtr<agora::rtc::ue4::AgoraRtcEngine>(agora::rtc::ue4::AgoraRtcEngine::createAgoraRtcEngine());
	static agora::rtc::RtcEngineContext ctx;
	ctx.appId = "aab8b8f5a8cd4469a63042fcfafe7063"; // Replace with your own Agora App ID
	ctx.eventHandler = new agora::rtc::IRtcEngineEventHandler();
	int ret = RtcEnginePtr->initialize(ctx);
	if (ret < 0)
	{
		UE_LOG(LogTemp, Warning, TEXT("RtcEngine initialize ret: %d"), ret);
	}
	MediaEnginePtr = TSharedPtr<agora::media::ue4::AgoraMediaEngine>(agora::media::ue4::AgoraMediaEngine::Create(RtcEnginePtr.Get()));
}
FString VideoCall::GetVersion() const
{
	if (!RtcEnginePtr)
	{
		return "";
	}
	int build = 0;
	const char* version = RtcEnginePtr->getVersion(&build);
	return FString(ANSI_TO_TCHAR(version));
}

Create Callbacks

Set the callback function to return local and remote frames:

//VideoCall.cpp
void VideoCall::RegisterOnLocalFrameCallback(
	std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnFrameCallback)
{
	OnLocalFrameCallback = std::move(OnFrameCallback);
}
void VideoCall::RegisterOnRemoteFrameCallback(
	std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnFrameCallback)
{
	OnRemoteFrameCallback = std::move(OnFrameCallback);
}

Create Call Methods

The methods in this section manage joining or leaving a channel.

Add the StartCall Function

Create the VideoFrameObserver object and register the following callbacks according to your scenarios:

  • OnLocalFrameCallback: Occurs each time the SDK receives a video frame captured by the local camera.
  • OnRemoteFrameCallback: Occurs each time the SDK receives a video frame sent by the remote user.

Register the VideoFrameObserver object in the MediaEngine object with the registerVideoFrameObserver method. If EncryptionType and EncryptionKey are not empty, set the encryption mode and secret on RtcEngine. Then set the channel profile according to your needs and call joinChannel.
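The encryption branch in StartCall below simply maps the UI's type string to an SDK mode string. Factored out as a pure helper, it looks like this (a sketch; `SelectEncryptionMode` is a hypothetical name, not part of the Agora API):

```cpp
#include <string>

// Mirrors the branch in StartCall: "aes-256" selects aes-256-xts,
// anything else falls back to aes-128-xts.
std::string SelectEncryptionMode(const std::string& EncryptionType)
{
    return EncryptionType == "aes-256" ? "aes-256-xts" : "aes-128-xts";
}
```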

//VideoCall.cpp
void VideoCall::StartCall(
	const FString& ChannelName,
	const FString& EncryptionKey,
	const FString& EncryptionType)
{
	if (!RtcEnginePtr)
	{
		return;
	}
	if (MediaEnginePtr)
	{
		if (!VideoFrameObserverPtr)
		{
			VideoFrameObserverPtr = MakeUnique<VideoFrameObserver>();
			std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnCaptureVideoFrameCallback
				= [this](std::uint8_t* buffer, std::uint32_t width, std::uint32_t height, std::uint32_t size)
			{
				if (OnLocalFrameCallback)
				{
					OnLocalFrameCallback(buffer, width, height, size);
				}
				else { UE_LOG(LogTemp, Warning, TEXT("VideoCall OnLocalFrameCallback isn't set")); }
			};
			VideoFrameObserverPtr->setOnCaptureVideoFrameCallback(std::move(OnCaptureVideoFrameCallback));
			std::function<void(std::uint8_t*, std::uint32_t, std::uint32_t, std::uint32_t)> OnRenderVideoFrameCallback
				= [this](std::uint8_t* buffer, std::uint32_t width, std::uint32_t height, std::uint32_t size)
			{
				if (OnRemoteFrameCallback)
				{
					OnRemoteFrameCallback(buffer, width, height, size);
				}
				else { UE_LOG(LogTemp, Warning, TEXT("VideoCall OnRemoteFrameCallback isn't set")); }
			};
			VideoFrameObserverPtr->setOnRenderVideoFrameCallback(std::move(OnRenderVideoFrameCallback));
		}
		MediaEnginePtr->registerVideoFrameObserver(VideoFrameObserverPtr.Get());
	}
	int nRet = RtcEnginePtr->enableVideo();
	if (nRet < 0)
	{
		UE_LOG(LogTemp, Warning, TEXT("enableVideo : %d"), nRet);
	}
	if (!EncryptionType.IsEmpty() && !EncryptionKey.IsEmpty())
	{
		if (EncryptionType == "aes-256")
		{
			RtcEnginePtr->setEncryptionMode("aes-256-xts");
		}
		else
		{
			RtcEnginePtr->setEncryptionMode("aes-128-xts");
		}
		nRet = RtcEnginePtr->setEncryptionSecret(TCHAR_TO_ANSI(*EncryptionKey));
		if (nRet < 0)
		{
			UE_LOG(LogTemp, Warning, TEXT("setEncryptionSecret : %d"), nRet);
		}
	}
	nRet = RtcEnginePtr->setChannelProfile(agora::rtc::CHANNEL_PROFILE_COMMUNICATION);
	if (nRet < 0)
	{
		UE_LOG(LogTemp, Warning, TEXT("setChannelProfile : %d"), nRet);
	}
	std::uint32_t nUID = 0;
	nRet = RtcEnginePtr->joinChannel(NULL, TCHAR_TO_ANSI(*ChannelName), NULL, nUID);
	if (nRet < 0)
	{
		UE_LOG(LogTemp, Warning, TEXT("joinChannel ret: %d"), nRet);
	}
}

Add the StopCall Function

Call the leaveChannel method to leave the current call when your scenario requires it: for example, when the call ends, when you need to close the app, or when your app moves to the background. Call registerVideoFrameObserver with a nullptr argument to cancel the registration of the VideoFrameObserver.

//VideoCall.cpp
void VideoCall::StopCall()
{
	if (!RtcEnginePtr)
	{
		return;
	}
	auto ConnectionState = RtcEnginePtr->getConnectionState();
	if (agora::rtc::CONNECTION_STATE_DISCONNECTED != ConnectionState)
	{
		int nRet = RtcEnginePtr->leaveChannel();
		if (nRet < 0)
		{
			UE_LOG(LogTemp, Warning, TEXT("leaveChannel ret: %d"), nRet);
		}
		if (MediaEnginePtr)
		{
			MediaEnginePtr->registerVideoFrameObserver(nullptr);
		}
	}
}

Video Methods

Add the EnableVideo() Method

The EnableVideo() method enables the video for the sample application. Initialize nRet with a value of 0. If bEnable is true, enable the video using RtcEnginePtr->enableVideo(). Otherwise, disable the video using RtcEnginePtr->disableVideo().

//VideoCall.cpp
bool VideoCall::EnableVideo(bool bEnable)
{
	if (!RtcEnginePtr)
	{
		return false;
	}
	int nRet = 0;
	if (bEnable)
		nRet = RtcEnginePtr->enableVideo();
	else
		nRet = RtcEnginePtr->disableVideo();
	return nRet == 0;
}

Add the MuteLocalVideo() Method

The MuteLocalVideo() method turns the local video on or off. Ensure that RtcEnginePtr is not nullptr before performing the remaining method actions. If muting or unmuting the local video succeeds, set bLocalVideoMuted to bMuted.

//VideoCall.cpp
bool VideoCall::MuteLocalVideo(bool bMuted)
{
	if (!RtcEnginePtr)
	{
		return false;
	}
	int ret = RtcEnginePtr->muteLocalVideoStream(bMuted);
	if (ret == 0)
		bLocalVideoMuted = bMuted;
	return ret == 0;
}

Add the IsLocalVideoMuted() Method

The IsLocalVideoMuted() method indicates whether the local video is on or off for the sample application, returning bLocalVideoMuted.

//VideoCall.cpp
bool VideoCall::IsLocalVideoMuted()
{
	return bLocalVideoMuted;
}

Create Audio Methods

Add the MuteLocalAudio() Method

The MuteLocalAudio() method mutes or unmutes the local audio. Ensure that RtcEnginePtr is not nullptr before performing the remaining method actions. If muting or unmuting the local audio succeeds, set bLocalAudioMuted to bMuted.

//VideoCall.cpp
bool VideoCall::MuteLocalAudio(bool bMuted)
{
	if (!RtcEnginePtr)
	{
		return false;
	}
	int ret = RtcEnginePtr->muteLocalAudioStream(bMuted);
	if (ret == 0)
		bLocalAudioMuted = bMuted;
	return ret == 0;
}

Add the IsLocalAudioMuted() Method

The IsLocalAudioMuted() method indicates whether local audio is muted or unmuted for the sample application, returning bLocalAudioMuted.

//VideoCall.cpp
bool VideoCall::IsLocalAudioMuted()
{
	return bLocalAudioMuted;
}

Create the GUI

Now you create the graphical user interface (GUI) for the one-to-one call using the following classes and blueprint assets:

  • VideoCallPlayerController
  • EnterChannelWidget
  • VideoViewWidget
  • VideoCallViewWidget
  • VideoCallWidget
  • BP_EnterChannelWidget blueprint asset
  • BP_VideoViewWidget asset
  • BP_VideoCallViewWidget asset
  • BP_VideoCallWidget asset
  • BP_VideoCallPlayerController blueprint asset
  • BP_AgoraVideoCallGameModeBase asset

Create the VideoCallPlayerController

To add the widget blueprints to the viewport, you create a custom player controller class.

In the Content browser, click the Add New button and select New C++ Class. In the Add C++ Class window, click the Show All Classes button, and type "PlayerController". Click the Next button and name the class VideoCallPlayerController. Click the Create Class button.

//VideoCallPlayerController.h
#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "VideoCallPlayerController.generated.h"

UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
	GENERATED_BODY()
public:
};

This class is a base class for the BP_VideoCallPlayerController blueprint asset, which will be created at the end.

Add Required Includes

At the top of the VideoCallPlayerController.h file, include the required header files:

//VideoCallPlayerController.h
#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "Templates/UniquePtr.h"
#include "VideoCall.h"
#include "VideoCallPlayerController.generated.h"
...
//VideoCallPlayerController.cpp
#include "Blueprint/UserWidget.h"
#include "EnterChannelWidget.h"
#include "VideoCallWidget.h"

Class Declaration

Add forward declarations for the following classes:

//VideoCallPlayerController.h
class UEnterChannelWidget;
class UVideoCallWidget;

You will create these two classes, UEnterChannelWidget and UVideoCallWidget, in later steps.

Add Member Variables

Now add the member references to the UMG assets in the Unreal Editor:

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
	GENERATED_BODY()
public:
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Widgets")
		TSubclassOf<class UUserWidget> wEnterChannelWidget;
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Widgets")
		TSubclassOf<class UUserWidget> wVideoCallWidget;
...
};

Add variables to hold the created widgets, along with a pointer to the VideoCall instance:

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
	GENERATED_BODY()
public:
	...
	UEnterChannelWidget* EnterChannelWidget = nullptr;
	UVideoCallWidget* VideoCallWidget = nullptr;
	TUniquePtr<VideoCall> VideoCallPtr;
	...
};

Override BeginPlay and EndPlay

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
	GENERATED_BODY()
public:
	...
	void BeginPlay() override;
	void EndPlay(const EEndPlayReason::Type EndPlayReason) override;
	...
};
//VideoCallPlayerController.cpp
void AVideoCallPlayerController::BeginPlay()
{
	Super::BeginPlay();
	//initialize widgets
	if (wEnterChannelWidget) // Check if the Asset is assigned in the blueprint.
	{
		// Create the widget and store it.
		if (!EnterChannelWidget)
		{
			EnterChannelWidget = CreateWidget<UEnterChannelWidget>(this, wEnterChannelWidget);
			EnterChannelWidget->SetVideoCallPlayerController(this);
		}
		// Now you can use the widget directly, since you have a reference to it.
		// Extra check to make sure the pointer holds the widget.
		if (EnterChannelWidget)
		{
			// Add it to the viewport.
			EnterChannelWidget->AddToViewport();
		}
		// Show the cursor.
		bShowMouseCursor = true;
	}
	if (wVideoCallWidget)
	{
		if (!VideoCallWidget)
		{
			VideoCallWidget = CreateWidget<UVideoCallWidget>(this, wVideoCallWidget);
			VideoCallWidget->SetVideoCallPlayerController(this);
		}
		if (VideoCallWidget)
		{
			VideoCallWidget->AddToViewport();
		}
		VideoCallWidget->SetVisibility(ESlateVisibility::Collapsed);
	}
	//create video call and switch on the EnterChannelWidget
	VideoCallPtr = MakeUnique<VideoCall>();
	FString Version = VideoCallPtr->GetVersion();
	Version = "Agora version: " + Version;
	EnterChannelWidget->UpdateVersionText(Version);
	SwitchOnEnterChannelWidget(std::move(VideoCallPtr));
}
void AVideoCallPlayerController::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
	Super::EndPlay(EndPlayReason);
}

You may notice that the calls into EnterChannelWidget and VideoCallWidget are marked as errors. This is because those classes are not implemented yet; you implement them in the next steps.

Add StartCall and EndCall Methods

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
	GENERATED_BODY()
public:
	...
	void StartCall(
		TUniquePtr<VideoCall> PassedVideoCallPtr,
		const FString& ChannelName,
		const FString& EncryptionKey,
		const FString& EncryptionType
		);
	void EndCall(TUniquePtr<VideoCall> PassedVideoCallPtr);
	...
};
//VideoCallPlayerController.cpp
void AVideoCallPlayerController::StartCall(
	TUniquePtr<VideoCall> PassedVideoCallPtr,
	const FString& ChannelName,
	const FString& EncryptionKey,
	const FString& EncryptionType)
{
	SwitchOnVideoCallWidget(std::move(PassedVideoCallPtr));
	VideoCallWidget->OnStartCall(
		ChannelName,
		EncryptionKey,
		EncryptionType);
}
void AVideoCallPlayerController::EndCall(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
	SwitchOnEnterChannelWidget(std::move(PassedVideoCallPtr));
}

Add Switch On Another Widget Methods

These methods define the active widget by managing widget visibility and passing ownership of the VideoCall pointer.

//VideoCallPlayerController.h
...
UCLASS()
class AGORAVIDEOCALL_API AVideoCallPlayerController : public APlayerController
{
	GENERATED_BODY()
public:
	...
	void SwitchOnEnterChannelWidget(TUniquePtr<VideoCall> PassedVideoCallPtr);
	void SwitchOnVideoCallWidget(TUniquePtr<VideoCall> PassedVideoCallPtr);
	...
};
//VideoCallPlayerController.cpp
void AVideoCallPlayerController::SwitchOnEnterChannelWidget(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
	if (!EnterChannelWidget)
	{
		return;
	}
	EnterChannelWidget->SetVideoCall(std::move(PassedVideoCallPtr));
	EnterChannelWidget->SetVisibility(ESlateVisibility::Visible);
}
void AVideoCallPlayerController::SwitchOnVideoCallWidget(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
	if (!VideoCallWidget)
	{
		return;
	}
	VideoCallWidget->SetVideoCall(std::move(PassedVideoCallPtr));
	VideoCallWidget->SetVisibility(ESlateVisibility::Visible);
}

Create EnterChannelWidget C++ Class

The EnterChannelWidget class manages UI element interactions (from the corresponding blueprint asset) with the application.

Create a class of UserWidget type. In the Content browser, click the Add New button and select New C++ Class. Then click the Show All Classes button, and type "UserWidget". Click the Next button and name the class EnterChannelWidget.

When you create the channel widget, you get something like this:

//EnterChannelWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "EnterChannelWidget.generated.h"

UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
	GENERATED_BODY()
};

Add Required Includes

At the top of the EnterChannelWidget.h file, include the required header files and forward declarations:

//EnterChannelWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/TextBlock.h"
#include "Components/RichTextBlock.h"
#include "Components/EditableTextBox.h"
#include "Components/ComboBoxString.h"
#include "Components/Button.h"
#include "Components/Image.h"
#include "VideoCall.h"
#include "EnterChannelWidget.generated.h"
class AVideoCallPlayerController;
//EnterChannelWidget.cpp
#include "Blueprint/WidgetTree.h"
#include "VideoCallPlayerController.h"

Add Member Variables

Now add the next member variables:

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UTextBlock* HeaderTextBlock = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UTextBlock* DescriptionTextBlock = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UEditableTextBox* ChannelNameTextBox = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UEditableTextBox* EncriptionKeyTextBox = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UTextBlock* EncriptionTypeTextBlock = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UComboBoxString* EncriptionTypeComboBox = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UButton* JoinButton = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UButton* TestButton = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UButton* VideoSettingsButton = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UTextBlock* ContactsTextBlock = nullptr;
	UPROPERTY(VisibleAnywhere, BlueprintReadOnly, meta = (BindWidget))
		UTextBlock* BuildInfoTextBlock = nullptr;
	...
};

These variables are needed to control the corresponding UI elements in the blueprint asset. The most important detail here is the BindWidget meta property: by marking a pointer to a widget as BindWidget, you can create an identically named widget in a Blueprint subclass of your C++ class and access it from C++ at run time.

Add the Next Members

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
	GENERATED_BODY()
	...
public:
	AVideoCallPlayerController* PlayerController = nullptr;
	TUniquePtr<VideoCall> VideoCallPtr;
	...
};

Add Constructor and Construct/Destruct Methods

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	UEnterChannelWidget(const FObjectInitializer& objectInitializer);
	void NativeConstruct() override;
...
};
//EnterChannelWidget.cpp
UEnterChannelWidget::UEnterChannelWidget(const FObjectInitializer& objectInitializer)
	: Super(objectInitializer)
{
}
void UEnterChannelWidget::NativeConstruct()
{
	Super::NativeConstruct();
	if (HeaderTextBlock)
		HeaderTextBlock->SetText(FText::FromString("Enter a conference room name"));
	if (DescriptionTextBlock)
		DescriptionTextBlock->SetText(FText::FromString(
			"If you are the first person to specify this name, "
			"the room will be created and you will be placed in it. "
			"If it has already been created you will join the conference in progress"));
	if (ChannelNameTextBox)
		ChannelNameTextBox->SetHintText(FText::FromString("Channel Name"));
	if (EncriptionKeyTextBox)
		EncriptionKeyTextBox->SetHintText(FText::FromString("Encryption Key"));
	if (EncriptionTypeTextBlock)
		EncriptionTypeTextBlock->SetText(FText::FromString("Enc Type:"));
	if (EncriptionTypeComboBox)
	{
		EncriptionTypeComboBox->AddOption("aes-128");
		EncriptionTypeComboBox->AddOption("aes-256");
		EncriptionTypeComboBox->SetSelectedIndex(0);
	}
	if (JoinButton)
	{
		UTextBlock* JoinTextBlock = WidgetTree->ConstructWidget<UTextBlock>(UTextBlock::StaticClass());
		JoinTextBlock->SetText(FText::FromString("Join"));
		JoinButton->AddChild(JoinTextBlock);
		JoinButton->OnClicked.AddDynamic(this, &UEnterChannelWidget::OnJoin);
	}
	if (ContactsTextBlock)
		ContactsTextBlock->SetText(FText::FromString("agora.io Contact support: 400 632 6626"));
	if (BuildInfoTextBlock)
		BuildInfoTextBlock->SetText(FText::FromString(" "));
}

Add Setter Methods

Initialize the PlayerController and VideoCallPtr variables:

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	void SetVideoCallPlayerController(AVideoCallPlayerController* VideoCallPlayerController);
	void SetVideoCall(TUniquePtr<VideoCall> PassedVideoCallPtr);
...
};
//EnterChannelWidget.cpp
void UEnterChannelWidget::SetVideoCallPlayerController(AVideoCallPlayerController* VideoCallPlayerController)
{
	PlayerController = VideoCallPlayerController;
}
void UEnterChannelWidget::SetVideoCall(TUniquePtr<VideoCall> PassedVideoCallPtr)
{
	VideoCallPtr = std::move(PassedVideoCallPtr);
}

Add BlueprintCallable Methods

Add a method to react to the button's OnClicked event:

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	UFUNCTION(BlueprintCallable)
		void OnJoin();
	...
};
//EnterChannelWidget.cpp
void UEnterChannelWidget::OnJoin()
{
	if (!PlayerController || !VideoCallPtr)
	{
		return;
	}
	FString ChannelName = ChannelNameTextBox->GetText().ToString();
	FString EncryptionKey = EncriptionKeyTextBox->GetText().ToString();
	FString EncryptionType = EncriptionTypeComboBox->GetSelectedOption();
	SetVisibility(ESlateVisibility::Collapsed);
	PlayerController->StartCall(
		std::move(VideoCallPtr),
		ChannelName,
		EncryptionKey,
		EncryptionType);
}

Add Update Methods

//EnterChannelWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UEnterChannelWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	void UpdateVersionText(FString newValue);
	...
};
//EnterChannelWidget.cpp
void UEnterChannelWidget::UpdateVersionText(FString newValue)
{
	if (BuildInfoTextBlock)
		BuildInfoTextBlock->SetText(FText::FromString(newValue));
}

Create the VideoViewWidget C++ Class

VideoViewWidget stores a dynamic texture and updates it with the RGBA buffer received from the VideoCall OnLocalFrameCallback / OnRemoteFrameCallback functions.

Create the Class and Add Required Includes

Create the widget C++ class as you did before and add the required includes:

//VideoViewWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/Image.h"
#include "VideoViewWidget.generated.h"
//VideoViewWidget.cpp
#include "EngineUtils.h"
#include "Engine/Texture2D.h"
#include <algorithm>

Add Member Variables

  • Buffer: Stores the RGBA pixel buffer; Width, Height, and BufferSize describe the video frame.
  • RenderTargetImage: The image widget that displays a Slate brush, texture, or material in the UI.
  • RenderTargetTexture: The dynamic texture, which you update from the Buffer variable.
  • UpdateTextureRegion: An FUpdateTextureRegion2D that specifies the update region for the texture.
  • Brush: Contains information about how to draw a Slate element; you use it to draw RenderTargetTexture on RenderTargetImage.

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		UImage* RenderTargetImage = nullptr;
	UPROPERTY(EditDefaultsOnly)
		UTexture2D* RenderTargetTexture = nullptr;
	UTexture2D* CameraoffTexture = nullptr;
	uint8* Buffer = nullptr;
	uint32_t Width = 0;
	uint32_t Height = 0;
	uint32 BufferSize = 0;
	FUpdateTextureRegion2D* UpdateTextureRegion = nullptr;
	FSlateBrush Brush;
	FCriticalSection Mutex;
	...
};
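An FCriticalSection member (Mutex) appears here because frame callbacks arrive on an SDK thread while NativeTick reads Buffer on the game thread. The same guard pattern can be sketched with the standard library (std::mutex standing in for FCriticalSection; `SharedFrame` and its methods are hypothetical names):

```cpp
#include <cstddef>
#include <cstdint>
#include <mutex>
#include <vector>

// A shared frame buffer guarded the same way VideoViewWidget guards
// Buffer with FCriticalSection / FScopeLock.
class SharedFrame
{
public:
    // Called from the SDK/capture thread.
    void Write(const std::uint8_t* Data, std::size_t Size)
    {
        std::lock_guard<std::mutex> Lock(Mutex);
        Pixels.assign(Data, Data + Size);
    }

    // Called from the render/tick thread; returns a copy so the caller
    // can use the data without holding the lock.
    std::vector<std::uint8_t> Read() const
    {
        std::lock_guard<std::mutex> Lock(Mutex);
        return Pixels;
    }

private:
    mutable std::mutex Mutex;
    std::vector<std::uint8_t> Pixels;
};
```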

Override the NativeConstruct() Method

In NativeConstruct, you initialize the image with a default color. To initialize RenderTargetTexture, create the dynamic texture (UTexture2D) with a CreateTransient call, then allocate Buffer with BufferSize calculated as Width * Height * 4 (RGBA stores each pixel in 4 bytes).

To update the texture, call UpdateTextureRegions. One of its input parameters is the pixel data buffer; whenever you modify the buffer, you must call this function again to make the change visible in the texture.

Finally, initialize the Brush variable with RenderTargetTexture, and set the Brush on the RenderTargetImage widget.
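The sizing arithmetic can be checked in isolation. A plain C++ sketch of the BufferSize computation and the opaque-gray RGBA fill described above (`MakeGrayRgbaBuffer` is a hypothetical helper, not part of the widget):

```cpp
#include <cstdint>
#include <vector>

// RGBA: 4 bytes per pixel, so the tightly packed buffer is Width * Height * 4.
std::vector<std::uint8_t> MakeGrayRgbaBuffer(std::uint32_t Width, std::uint32_t Height)
{
    std::vector<std::uint8_t> Buffer(static_cast<std::size_t>(Width) * Height * 4);
    for (std::uint32_t i = 0; i < Width * Height; ++i)
    {
        Buffer[i * 4 + 0] = 0x32; // R
        Buffer[i * 4 + 1] = 0x32; // G
        Buffer[i * 4 + 2] = 0x32; // B
        Buffer[i * 4 + 3] = 0xFF; // A (fully opaque)
    }
    return Buffer;
}
```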

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
	GENERATED_BODY()
public:
...
	void NativeConstruct() override;
	...
};
//VideoViewWidget.cpp
void UVideoViewWidget::NativeConstruct()
{
	Super::NativeConstruct();
	Width = 640;
	Height = 360;
	RenderTargetTexture = UTexture2D::CreateTransient(Width, Height, PF_R8G8B8A8);
	RenderTargetTexture->UpdateResource();
	BufferSize = Width * Height * 4;
	Buffer = new uint8[BufferSize];
	for (uint32 i = 0; i < Width * Height; ++i)
	{
		Buffer[i * 4 + 0] = 0x32;
		Buffer[i * 4 + 1] = 0x32;
		Buffer[i * 4 + 2] = 0x32;
		Buffer[i * 4 + 3] = 0xFF;
	}
	UpdateTextureRegion = new FUpdateTextureRegion2D(0, 0, 0, 0, Width, Height);
	RenderTargetTexture->UpdateTextureRegions(0, 1, UpdateTextureRegion, Width * 4, (uint32)4, Buffer);
	Brush.SetResourceObject(RenderTargetTexture);
	RenderTargetImage->SetBrush(Brush);
}

Override the NativeDestruct() Method

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	void NativeDestruct() override;
	...
};
//VideoViewWidget.cpp
void UVideoViewWidget::NativeDestruct()
{
	Super::NativeDestruct();
	delete[] Buffer;
	delete UpdateTextureRegion;
}

Override the NativeTick() Method

If UpdateTextureRegion's Width or Height differs from the member Width and Height values, you need to re-create RenderTargetTexture with the updated dimensions and repeat the initialization done in NativeConstruct. Otherwise, just call UpdateTextureRegions with Buffer.

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	void NativeTick(const FGeometry& MyGeometry, float DeltaTime) override;
	...
};
//VideoViewWidget.cpp
void UVideoViewWidget::NativeTick(const FGeometry& MyGeometry, float DeltaTime)
{
	Super::NativeTick(MyGeometry, DeltaTime);
	FScopeLock lock(&Mutex);
	if (UpdateTextureRegion->Width != Width ||
		UpdateTextureRegion->Height != Height)
	{
		auto NewUpdateTextureRegion = new FUpdateTextureRegion2D(0, 0, 0, 0, Width, Height);
		auto NewRenderTargetTexture = UTexture2D::CreateTransient(Width, Height, PF_R8G8B8A8);
		NewRenderTargetTexture->UpdateResource();
		NewRenderTargetTexture->UpdateTextureRegions(0, 1, NewUpdateTextureRegion, Width * 4, (uint32)4, Buffer);
		Brush.SetResourceObject(NewRenderTargetTexture);
		RenderTargetImage->SetBrush(Brush);
		//UObjects such as UTexture2D are garbage collected automatically once no hard
		//references to them remain, so the old texture is destroyed on its own.
		FUpdateTextureRegion2D* TmpUpdateTextureRegion = UpdateTextureRegion;
		RenderTargetTexture = NewRenderTargetTexture;
		UpdateTextureRegion = NewUpdateTextureRegion;
		delete TmpUpdateTextureRegion;
		return;
	}
	RenderTargetTexture->UpdateTextureRegions(0, 1, UpdateTextureRegion, Width * 4, (uint32)4, Buffer);
}

Add the UpdateBuffer() Method

The new frame data arrives on the Agora SDK thread. Because UE4 requires texture updates to be issued from the game thread, you only copy the data into the Buffer variable here and let NativeTick update the texture; don't call UpdateTextureRegions from this method.

//VideoViewWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoViewWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	void UpdateBuffer( uint8* RGBBuffer, uint32_t Width, uint32_t Height, uint32_t Size );
	void ResetBuffer();
	...
};
//VideoViewWidget.cpp 
void UVideoViewWidget::UpdateBuffer(
	uint8* RGBBuffer,
	uint32_t NewWidth,
	uint32_t NewHeight,
	uint32_t NewSize)
{
	FScopeLock lock(&Mutex);
	if (!RGBBuffer)
	{
		return;
	}
	if (BufferSize == NewSize)
	{
		std::copy(RGBBuffer, RGBBuffer + NewSize, Buffer);
	}
	else
	{
		delete[] Buffer;
		BufferSize = NewSize;
		Width = NewWidth;
		Height = NewHeight;
		Buffer = new uint8[BufferSize];
		std::copy(RGBBuffer, RGBBuffer + NewSize, Buffer);
	}
}
void UVideoViewWidget::ResetBuffer()
{
	//Guard against a concurrent UpdateBuffer() call from the SDK thread,
	//which may reallocate Buffer while we write to it.
	FScopeLock lock(&Mutex);
	for (uint32 i = 0; i < Width * Height; ++i)
	{
		Buffer[i * 4 + 0] = 0x32;
		Buffer[i * 4 + 1] = 0x32;
		Buffer[i * 4 + 2] = 0x32;
		Buffer[i * 4 + 3] = 0xFF;
	}
}
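The UpdateBuffer()/NativeTick() pair is a classic producer/consumer handoff: the SDK thread writes the frame under a lock, and the game thread reads it under the same lock. Stripped of UE types, the pattern looks like this (a plain C++ sketch; FrameBuffer and its methods are illustrative, not the Agora API):

```cpp
#include <cstdint>
#include <mutex>
#include <vector>

// Minimal producer/consumer frame buffer, mirroring UpdateBuffer()/NativeTick().
class FrameBuffer {
public:
	// Called from the SDK (producer) thread: copy the frame under the lock.
	void Update(const uint8_t* Data, uint32_t W, uint32_t H) {
		std::lock_guard<std::mutex> Lock(Mutex);
		Width = W;
		Height = H;
		Pixels.assign(Data, Data + static_cast<size_t>(W) * H * 4);
	}

	// Called from the game (consumer) thread: read a consistent snapshot.
	std::vector<uint8_t> Snapshot(uint32_t& W, uint32_t& H) {
		std::lock_guard<std::mutex> Lock(Mutex);
		W = Width;
		H = Height;
		return Pixels;
	}

private:
	std::mutex Mutex;
	uint32_t Width = 0, Height = 0;
	std::vector<uint8_t> Pixels;
};
```

The lock is held only long enough to copy the bytes, so neither thread blocks the other for more than one memcpy-sized operation per frame.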

Create the VideoCallViewWidget C++ Class

The VideoCallViewWidget class serves to display the local and remote user video. You need two VideoViewWidget widgets: one to display video from the local camera and another to display video received from the remote user (assume you support only one remote user).

Create Class and Add Required Includes

//VideoCallViewWidget.h 
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/SizeBox.h"
#include "VideoViewWidget.h"
#include "VideoCallViewWidget.generated.h"
//VideoCallViewWidget.cpp
#include "Components/CanvasPanelSlot.h"

Add Member Variables

//VideoCallViewWidget.h 
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallViewWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		UVideoViewWidget* MainVideoViewWidget = nullptr;
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		USizeBox* MainVideoSizeBox = nullptr;
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		UVideoViewWidget* AdditionalVideoViewWidget = nullptr;
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		USizeBox* AdditionalVideoSizeBox = nullptr;
public:
	int32 MainVideoWidth = 0;
	int32 MainVideoHeight = 0;
	...
};

Override the NativeTick() Method

In NativeTick, you update the widget's geometry:

//VideoCallViewWidget.h 
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallViewWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	void NativeTick(const FGeometry& MyGeometry, float DeltaTime) override;
	...
};
//VideoCallViewWidget.cpp
void UVideoCallViewWidget::NativeTick(const FGeometry& MyGeometry, float DeltaTime)
{
	Super::NativeTick(MyGeometry, DeltaTime);
	auto ScreenSize = MyGeometry.GetLocalSize();
	if (MainVideoHeight != 0)
	{
		float AspectRatio = MainVideoWidth / (float)MainVideoHeight;
		auto MainVideoGeometry = MainVideoViewWidget->GetCachedGeometry();
		auto MainVideoScreenSize = MainVideoGeometry.GetLocalSize();
		if (MainVideoScreenSize.X == 0)
		{
			return;
		}
		auto NewMainVideoHeight = MainVideoScreenSize.Y;
		auto NewMainVideoWidth = AspectRatio * NewMainVideoHeight;
		MainVideoSizeBox->SetMinDesiredWidth(NewMainVideoWidth);
		MainVideoSizeBox->SetMinDesiredHeight(NewMainVideoHeight);
		UCanvasPanelSlot* CanvasSlot = Cast<UCanvasPanelSlot>(MainVideoSizeBox->Slot);
		if (CanvasSlot) //the cast fails if the SizeBox is not placed on a Canvas Panel
		{
			CanvasSlot->SetAutoSize(true);
			FVector2D NewPosition;
			NewPosition.X = -NewMainVideoWidth / 2;
			NewPosition.Y = -NewMainVideoHeight / 2;
			CanvasSlot->SetPosition(NewPosition);
		}
	}
}

Add the UpdateMainVideoBuffer and UpdateAdditionalVideoBuffer Methods

//VideoCallViewWidget.h 
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallViewWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	void UpdateMainVideoBuffer( uint8* RGBBuffer, uint32_t Width, uint32_t Height, uint32_t Size);
	void UpdateAdditionalVideoBuffer( uint8* RGBBuffer, uint32_t Width, uint32_t Height, uint32_t Size);
	void ResetBuffers();
	...
};
//VideoCallViewWidget.cpp
void UVideoCallViewWidget::UpdateMainVideoBuffer(
	uint8* RGBBuffer,
	uint32_t Width,
	uint32_t Height,
	uint32_t Size)
{
	if (!MainVideoViewWidget)
	{
		return;
	}
	MainVideoWidth = Width;
	MainVideoHeight = Height;
	MainVideoViewWidget->UpdateBuffer(RGBBuffer, Width, Height, Size);
}
void UVideoCallViewWidget::UpdateAdditionalVideoBuffer(
	uint8* RGBBuffer,
	uint32_t Width,
	uint32_t Height,
	uint32_t Size)
{
	if (!AdditionalVideoViewWidget)
	{
		return;
	}
	AdditionalVideoViewWidget->UpdateBuffer(RGBBuffer, Width, Height, Size);
}
void UVideoCallViewWidget::ResetBuffers()
{
	if (!MainVideoViewWidget || !AdditionalVideoViewWidget)
	{
		return;
	}
	MainVideoViewWidget->ResetBuffer();
	AdditionalVideoViewWidget->ResetBuffer();
}

Create the VideoCallWidget C++ Class

The VideoCallWidget class serves as the audio/video call widget for the sample application. It contains the following controls, bound with UI elements in the blueprint asset:

  • The local and remote video view (represented by VideoCallViewWidget)
  • The end-call button (EndCallButton variable)
  • The mute-local-audio button (MuteLocalAudioButton variable)
  • The video-mode button (VideoModeButton variable)

Create the Class and Required Includes

Create the widget C++ class as you did before, and add required includes and forward declarations:

//VideoCallWidget.h
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Templates/UniquePtr.h"
#include "Components/Image.h"
#include "Components/Button.h"
#include "Engine/Texture2D.h"
#include "VideoCall.h"
#include "VideoCallViewWidget.h"
#include "VideoCallWidget.generated.h"
class AVideoCallPlayerController;
class UVideoViewWidget;
//VideoCallWidget.cpp
#include "Kismet/GameplayStatics.h"
#include "UObject/ConstructorHelpers.h"
#include "Components/CanvasPanelSlot.h"
#include "VideoViewWidget.h"
#include "VideoCallPlayerController.h"

Add Member Variables

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	AVideoCallPlayerController* PlayerController = nullptr;
public:
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		UVideoCallViewWidget* VideoCallViewWidget = nullptr;
	//Buttons
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		UButton* EndCallButton = nullptr;
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		UButton* MuteLocalAudioButton = nullptr;
	UPROPERTY(BlueprintReadOnly, meta = (BindWidget))
		UButton* VideoModeButton = nullptr;
	//Button textures
	int32 ButtonSizeX = 96;
	int32 ButtonSizeY = 96;
	UTexture2D* EndCallButtonTexture = nullptr;
	UTexture2D* AudioButtonMuteTexture = nullptr;
	UTexture2D* AudioButtonUnmuteTexture = nullptr;
	UTexture2D* VideomodeButtonCameraoffTexture = nullptr;
	UTexture2D* VideomodeButtonCameraonTexture = nullptr;
	TUniquePtr<VideoCall> VideoCallPtr;
	...
};

Initialize VideoCallWidget

Find the asset image for each button and assign it to the corresponding texture. Then initialize each button with textures:

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	UVideoCallWidget(const FObjectInitializer& ObjectInitializer);
	void NativeConstruct() override;
	void NativeDestruct() override;
private:
	void InitButtons();
	...
};
//VideoCallWidget.cpp 
void UVideoCallWidget::NativeConstruct()
{
	Super::NativeConstruct();
	InitButtons();
}
void UVideoCallWidget::NativeDestruct()
{
	Super::NativeDestruct();
	if (VideoCallPtr)
	{
		VideoCallPtr->StopCall();
	}
}
UVideoCallWidget::UVideoCallWidget(const FObjectInitializer& ObjectInitializer)
	: Super(ObjectInitializer)
{
	static ConstructorHelpers::FObjectFinder<UTexture2D>
		EndCallButtonTextureFinder(TEXT("Texture'/Game/ButtonTextures/hangup.hangup'"));
	if (EndCallButtonTextureFinder.Succeeded())
	{
		EndCallButtonTexture = EndCallButtonTextureFinder.Object;
	}
	static ConstructorHelpers::FObjectFinder<UTexture2D>
		AudioButtonMuteTextureFinder(TEXT("Texture'/Game/ButtonTextures/mute.mute'"));
	if (AudioButtonMuteTextureFinder.Succeeded())
	{
		AudioButtonMuteTexture = AudioButtonMuteTextureFinder.Object;
	}
	static ConstructorHelpers::FObjectFinder<UTexture2D>
		AudioButtonUnmuteTextureFinder(TEXT("Texture'/Game/ButtonTextures/unmute.unmute'"));
	if (AudioButtonUnmuteTextureFinder.Succeeded())
	{
		AudioButtonUnmuteTexture = AudioButtonUnmuteTextureFinder.Object;
	}
	static ConstructorHelpers::FObjectFinder<UTexture2D>
		VideomodeButtonCameraonTextureFinder(TEXT("Texture'/Game/ButtonTextures/cameraon.cameraon'"));
	if (VideomodeButtonCameraonTextureFinder.Succeeded())
	{
		VideomodeButtonCameraonTexture = VideomodeButtonCameraonTextureFinder.Object;
	}
	static ConstructorHelpers::FObjectFinder<UTexture2D>
		VideomodeButtonCameraoffTextureFinder(TEXT("Texture'/Game/ButtonTextures/cameraoff.cameraoff'"));
	if (VideomodeButtonCameraoffTextureFinder.Succeeded())
	{
		VideomodeButtonCameraoffTexture = VideomodeButtonCameraoffTextureFinder.Object;
	}
}
void UVideoCallWidget::InitButtons()
{
	if (EndCallButtonTexture)
	{
		EndCallButton->WidgetStyle.Normal.SetResourceObject(EndCallButtonTexture);
		EndCallButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		EndCallButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
		EndCallButton->WidgetStyle.Hovered.SetResourceObject(EndCallButtonTexture);
		EndCallButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		EndCallButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
		EndCallButton->WidgetStyle.Pressed.SetResourceObject(EndCallButtonTexture);
		EndCallButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		EndCallButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
	}
	EndCallButton->OnClicked.AddDynamic(this, &UVideoCallWidget::OnEndCall);
	SetAudioButtonToMute();
	MuteLocalAudioButton->OnClicked.AddDynamic(this, &UVideoCallWidget::OnMuteLocalAudio);
	SetVideoModeButtonToCameraOff();
	VideoModeButton->OnClicked.AddDynamic(this, &UVideoCallWidget::OnChangeVideoMode);
}

Add Button Textures

Find the Content/ButtonTextures directory in the demo application. (You don't have to open the project. Simply find this folder in the file system.) All button textures are stored there. In your project's Content directory, create a folder called ButtonTextures, then drag and drop all the button images there to make them available in your project.

Add Setters

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
	GENERATED_BODY()
	...
public:
	void SetVideoCallPlayerController(AVideoCallPlayerController* VideoCallPlayerController);
	void SetVideoCall(TUniquePtr<VideoCall> PassedVideoCallPtr);
	...
};
//VideoCallWidget.cpp
void UVideoCallWidget::SetVideoCallPlayerController(AVideoCallPlayerController* VideoCallPlayerController)
{
	PlayerController = VideoCallPlayerController;
}
void UVideoCallWidget::SetVideoCall(TUniquePtr&lt;VideoCall&gt; PassedVideoCallPtr)
{
	VideoCallPtr = std::move(PassedVideoCallPtr);
}
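Note that SetVideoCall takes the TUniquePtr by value and moves from it, so ownership of the engine wrapper transfers into the widget (and OnEndCall later moves it back out to the player controller). The same move-only handoff, reduced to std::unique_ptr (Call and Widget here are stand-ins, not the Agora API):

```cpp
#include <memory>
#include <utility>

// Stand-in for the VideoCall engine wrapper.
struct Call {
	bool Active = true;
};

// Widget-side owner: takes the call by move, gives it back by move.
class Widget {
public:
	void SetCall(std::unique_ptr<Call> Passed) { CallPtr = std::move(Passed); }
	std::unique_ptr<Call> ReleaseCall() { return std::move(CallPtr); }
	bool HasCall() const { return CallPtr != nullptr; }

private:
	std::unique_ptr<Call> CallPtr;
};
```

At any moment exactly one object owns the call, so there is never a question of who is responsible for shutting down the engine.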

Add Methods to Update Buttons View

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
	GENERATED_BODY()
	...
private:
	void SetVideoModeButtonToCameraOff();
	void SetVideoModeButtonToCameraOn();
	void SetAudioButtonToMute();
	void SetAudioButtonToUnMute();
	...
};
//VideoCallWidget.cpp
void UVideoCallWidget::SetVideoModeButtonToCameraOff()
{
	if (VideomodeButtonCameraoffTexture)
	{
		VideoModeButton->WidgetStyle.Normal.SetResourceObject(VideomodeButtonCameraoffTexture);
		VideoModeButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		VideoModeButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
		VideoModeButton->WidgetStyle.Hovered.SetResourceObject(VideomodeButtonCameraoffTexture);
		VideoModeButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		VideoModeButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
		VideoModeButton->WidgetStyle.Pressed.SetResourceObject(VideomodeButtonCameraoffTexture);
		VideoModeButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		VideoModeButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
	}
}
void UVideoCallWidget::SetVideoModeButtonToCameraOn()
{
	if (VideomodeButtonCameraonTexture)
	{
		VideoModeButton->WidgetStyle.Normal.SetResourceObject(VideomodeButtonCameraonTexture);
		VideoModeButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		VideoModeButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
		VideoModeButton->WidgetStyle.Hovered.SetResourceObject(VideomodeButtonCameraonTexture);
		VideoModeButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		VideoModeButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
		VideoModeButton->WidgetStyle.Pressed.SetResourceObject(VideomodeButtonCameraonTexture);
		VideoModeButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		VideoModeButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
	}
}
void UVideoCallWidget::SetAudioButtonToMute()
{
	if (AudioButtonMuteTexture)
	{
		MuteLocalAudioButton->WidgetStyle.Normal.SetResourceObject(AudioButtonMuteTexture);
		MuteLocalAudioButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		MuteLocalAudioButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
		MuteLocalAudioButton->WidgetStyle.Hovered.SetResourceObject(AudioButtonMuteTexture);
		MuteLocalAudioButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		MuteLocalAudioButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
		MuteLocalAudioButton->WidgetStyle.Pressed.SetResourceObject(AudioButtonMuteTexture);
		MuteLocalAudioButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		MuteLocalAudioButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
	}
}
void UVideoCallWidget::SetAudioButtonToUnMute()
{
	if (AudioButtonUnmuteTexture)
	{
		MuteLocalAudioButton->WidgetStyle.Normal.SetResourceObject(AudioButtonUnmuteTexture);
		MuteLocalAudioButton->WidgetStyle.Normal.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		MuteLocalAudioButton->WidgetStyle.Normal.DrawAs = ESlateBrushDrawType::Type::Image;
		MuteLocalAudioButton->WidgetStyle.Hovered.SetResourceObject(AudioButtonUnmuteTexture);
		MuteLocalAudioButton->WidgetStyle.Hovered.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		MuteLocalAudioButton->WidgetStyle.Hovered.DrawAs = ESlateBrushDrawType::Type::Image;
		MuteLocalAudioButton->WidgetStyle.Pressed.SetResourceObject(AudioButtonUnmuteTexture);
		MuteLocalAudioButton->WidgetStyle.Pressed.ImageSize = FVector2D(ButtonSizeX, ButtonSizeY);
		MuteLocalAudioButton->WidgetStyle.Pressed.DrawAs = ESlateBrushDrawType::Type::Image;
	}
}

Add the OnEndCall Method

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	UFUNCTION(BlueprintCallable)
	void OnEndCall();
	...
};
//VideoCallWidget.cpp
void UVideoCallWidget::OnEndCall()
{
	if (VideoCallPtr)
	{
		VideoCallPtr->StopCall();
	}
	if (VideoCallViewWidget)
	{
		VideoCallViewWidget->ResetBuffers();
	}
	if (PlayerController)
	{
		SetVisibility(ESlateVisibility::Collapsed);
		PlayerController->EndCall(std::move(VideoCallPtr));
	}
}

Add the OnMuteLocalAudio Method

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	UFUNCTION(BlueprintCallable)
	void OnMuteLocalAudio();
	...
};
//VideoCallWidget.cpp
void UVideoCallWidget::OnMuteLocalAudio()
{
	if (!VideoCallPtr)
	{
		return;
	}
	if (VideoCallPtr->IsLocalAudioMuted())
	{
		VideoCallPtr->MuteLocalAudio(false);
		SetAudioButtonToMute();
	}
	else
	{
		VideoCallPtr->MuteLocalAudio(true);
		SetAudioButtonToUnMute();
	}
}

Add the OnChangeVideoMode Method

//VideoCallWidget.h
...
UCLASS()
class AGORAVIDEOCALL_API UVideoCallWidget : public UUserWidget
{
	GENERATED_BODY()
public:
	...
	UFUNCTION(BlueprintCallable)
	void OnChangeVideoMode();
	...
};
//VideoCallWidget.cpp
void UVideoCallWidget::OnChangeVideoMode()
{
	if (!VideoCallPtr)
	{
		return;
	}
	if (!VideoCallPtr->IsLocalVideoMuted())
	{
		VideoCallPtr->MuteLocalVideo(true);
		SetVideoModeButtonToCameraOn();
	}
	else
	{
		VideoCallPtr->EnableVideo(true);
		VideoCallPtr->MuteLocalVideo(false);
		SetVideoModeButtonToCameraOff();
	}
}

Create Blueprint Classes

Make sure the C++ code compiles properly. Without a successfully compiled project you cannot move on to the next steps. If you’ve compiled the C++ code successfully and still don’t see required classes in the Unreal Editor, reopen the project.

Create the BP_EnterChannelWidget Blueprint Asset

Create a Blueprint of UEnterChannelWidget. Right-click Content and select Widget Blueprint from the User Interface menu.

Change the parent class of this new User Widget.

When you open the blueprint, the UMG Editor Interface appears and by default opens to the Designer tab.

Click the Graph button (top-right corner button) and select Class Settings. On the Details panel, click the Parent Class drop-down list and select the C++ class previously created: UEnterChannelWidget.

Return to the Designer tab. The Palette window contains several different types of widgets that you can use to construct your UI elements. Find Text, Editable Text, Button and ComboBox (String) elements and drag them to the workspace as in the screenshot. Then go to the definition of UEnterChannelWidget in the EnterChannelWidget.h file to see the names of the member variables with the corresponding types (UTextBlock, UEditableTextBox, UButton, and UComboBoxString).

Return to the BP_EnterChannelWidget editor and give the UI elements you dragged into your widget names identical to those member variables. You can do this by clicking an element and changing its name in the Details panel. Try to compile the blueprint. You will see an error if you forgot to add something, or if there is a widget name/type mismatch inside your UserWidget class.

Save it to the preferred folder. For example: /Content/Widgets/BP_EnterChannelWidget.uasset

Create the BP_VideoViewWidget Asset

Create the BP_VideoViewWidget asset, set the parent class to UVideoViewWidget, and name the Image element RenderTargetImage.

It’s important to set image anchor here:

Create the BP_VideoCallViewWidget Asset

Create the BP_VideoCallViewWidget asset, set the parent class to UVideoCallViewWidget, and add UI elements MainVideoViewWidget and AdditionalVideoViewWidget of BP_VideoViewWidget type. Also add MainVideoSizeBox and AdditionalVideoSizeBox UI elements of SizeBox type.

Create the BP_VideoCallWidget Asset

Create the BP_VideoCallWidget asset, set the parent class to UVideoCallWidget, find in the Palette the BP_VideoCallViewWidget UI element and add it with the name VideoCallViewWidget, and add the EndCallButton, MuteLocalAudioButton, and VideoModeButton buttons.

Create the BP_VideoCallPlayerController Blueprint Asset

Now it’s time to create the BP_VideoCallPlayerController blueprint asset, based on the AVideoCallPlayerController class that was described earlier.

Create a Blueprint of AVideoCallPlayerController

Right-click Content, click the Add New button, and select the Blueprint Class. In the Pick parent class window, go to the All Classes section and find the VideoCallPlayerController class.

Now assign the previously created widgets to the PlayerController as shown:

Save it to the preferred folder (for example, /Content/Widgets/BP_VideoCallPlayerController.uasset).

Create the BP_AgoraVideoCallGameModeBase Asset

Next, you create a blueprint based on your project's GameModeBase class.

Click the Add New button, select Blueprint Class, and choose Game Mode Base Class.

Modify GameMode

Now you need to set your custom GameMode class and Player Controller. Go to the world settings and set the specified variables:

Specify Project Settings

Go to Edit > Project Settings and open the Maps & Modes tab. Specify the Default parameters:

Run the Game

Windows

Select File > Package Project > Windows > Windows(64-bit), select a folder where you want to place the package, and wait for the result.

Mac

Select File > Package Project > Mac and specify a Builds folder you want to build to. Before running the game, you have to add permissions.

Mac Build Setup

Add Permissions in the info.plist File for Device Access

Note: To access the .plist file, right-click the <YourBuildName>.app file and select Show Package Contents. The info.plist file is inside Contents.

Add these permissions to the file:

  • Privacy — Camera Usage Description
  • Privacy — Microphone Usage Description

Add the AgoraRtcKit.framework Folder to Your Build

  1. Go back to your project directory and open the Plugins folder.
  2. From Plugins/AgoraPlugin/Source/ThirdParty/Agora/Mac/Release, copy the AgoraRtcKit.framework folder.
  3. Paste AgoraRtcKit.framework into your newly built project folder: <Packaged_project_dir>/MacNoEditor/[Project_name]/Contents/MacOS/

iOS Packaging

To package the project for iOS, you need to generate a signing certificate and provisioning profile. Follow the instructions in the UE4 documentation: iOS Provisioning.

Tip: I recommend going to ProjectSettings > Build > and selecting the Automatic Signing checkbox.

Then you need to add the certificate and the provisioning profile to your project:

Select Edit > Project Settings > Platforms: iOS, and then select the certificate and the provisioning profile you created before.

If you don’t see one of them in the table, click Import Certificate or Import Provision, choose the correct file in the Finder, and click Open.

Then enter a Bundle Identifier: it must be the Bundle ID you used during certificate creation.

iOS Permissions

For testing in iOS, I recommend testing by clicking the Launch button in the top bar of the Unreal Editor, with your iOS device selected in the launch settings.

To add the permissions in the info.plist file, select Edit > Project Settings > Platforms: iOS and add the following lines to Additional Plist Data:

<key>NSCameraUsageDescription</key><string>AgoraVideoCall</string>

<key>NSMicrophoneUsageDescription</key><string>AgoraVideoCall</string>

Now you are ready to package your project for iOS or launch it on an iOS device.


Want to build Real-Time Engagement apps?

Get started with 10,000 free minutes today!

If you have questions, please call us at 408-879-5885. We’d be happy to help you add voice or video chat, streaming and messaging into your apps.


Stay inspired by accessing all RTE2020 session recordings. Gain access to innovative Real-Time-Engagement content and start innovating today.
