C# (Unity Engine)

Unity is a cross-platform game engine developed by Unity Technologies.

It is particularly popular for iOS and Android mobile game development and used for games such as Pokémon Go, Monument Valley, Call of Duty: Mobile, Beat Saber, and Cuphead.

Sariska Media provides powerful Unity APIs for developing real-time applications.

You can integrate audio/video calling, live streaming, cloud recording, transcriptions, language translation, and many other services on the fly.


Changes to Unity Project Settings

  • Switch the platform to Android, iOS, or WebGL.

  • Go to Player Settings and disable Multi-Threaded Rendering.

  • Add OpenGLES2 and OpenGLES3 to the list of Graphics APIs.

  • Set the Minimum API Level to Android 5.1 ‘Lollipop’ (API level 22).

Import Sariska’s implementation of the Unity Jar Resolver by Google

git clone https://github.com/SariskaIO/Sariska-Unity-Jar-Resolver
  1. In your Unity project, go to Assets -> Import Package -> Custom Package.

  2. Select "external-dependency-manager-1.2.169.unitypackage" from the cloned repo and import all.

Create XML file for importing Sariska Media Transport SDK

  1. Add the external-dependency-manager-*.unitypackage to your plugin project (assuming you are developing a plugin).

  2. Copy and rename the SampleDependencies.xml file into your plugin and add the dependencies your plugin requires.

Below is an example of the XML file used to import the Sariska Media Transport SDK for Android and iOS.

<?xml version="1.0" encoding="UTF-8" ?>
<dependencies>
    <androidPackages>
        <androidPackage spec="io.sariska:sariska-media-transport:5.2.1-exp"></androidPackage>
    </androidPackages>
    <iosPods>
        <iosPod name="sariska-media-transport" addToAllTargets="false"></iosPod>
    </iosPods>
</dependencies>

The latest stable version is “io.sariska:sariska-media-transport:5.2.1-exp”.

Create Canvas for Local and Remote streams in a scene

Create two canvases in the scene, one for the local client and one for the remote client.


Create a script to access Sariska Media Transport plugin functions.

Declaring Raw images

Declare the RawImage fields in which the video will be embedded:

using UnityEngine;
using UnityEngine.UI;

public class ExternalTexturePlugin : MonoBehaviour
{
    [SerializeField] private RawImage localImage;
    [SerializeField] private RawImage remoteImage;
}

Declaring Textures and Texture pointers

Declare Textures and Texture pointers for the two video streams:

using System;
using UnityEngine;
using UnityEngine.UI;

public class ExternalTextureSecond : MonoBehaviour
{
    [SerializeField] private RawImage localImage;
    [SerializeField] private RawImage remoteImage;

    private Texture2D localTexture2D;
    private Texture2D remoteTexture2D;

    private IntPtr _nativeTexturePointerLocal;
    private IntPtr _nativeTexturePointerRemote;
}

Obtaining Texture pointers

Get the texture pointers from the declared Textures, and attach the textures to the declared raw images:

public class ExternalTextureSecond : MonoBehaviour
{
    void Start()
    {
        // Create the textures first (180p here matches the resolution passed to SetupLocalStream)
        localTexture2D = new Texture2D(320, 180, TextureFormat.RGBA32, false);
        remoteTexture2D = new Texture2D(320, 180, TextureFormat.RGBA32, false);
        _nativeTexturePointerLocal = localTexture2D.GetNativeTexturePtr();
        _nativeTexturePointerRemote = remoteTexture2D.GetNativeTexturePtr();
        localImage.texture = localTexture2D;
        remoteImage.texture = remoteTexture2D;
    }
}

Implementation for Android and iOS

Importing SariskaMediaUnitySdk

using Plugins.SariskaMediaUnitySdk;

Initialize SDK

After the SDK is added to your project, initialize it before setting up the video call.
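As a minimal sketch of the initialization call — the method name used here is an assumption, so check the SDK reference for the exact entry point:

```csharp
// "InitSariskaMediaTransport" is an assumed name; verify it in the SDK reference.
SariskaMediaUnitySdk.InitSariskaMediaTransport();
```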


Setup Local Stream

The SetupLocalStream method in SariskaMediaUnitySdk sets up a local stream from a set of parameters that let you choose between an audio-only and a video call and, for a video call, the resolution of the video.

Additionally, the method takes the native texture pointers for the two textures declared earlier.

// void SetupLocalStream(bool audio, bool video, int resolution, IntPtr localTexturePointer, IntPtr remoteTexturePointer)

SariskaMediaUnitySdk.SetupLocalStream(true, true, 180, _nativeTexturePointerLocal, _nativeTexturePointerRemote);

Create Connection

In order to enter the room and start a call, the create connection method takes a room name and a user name as parameters. Once these parameters are sent to the SDK, it automatically creates a JWT token for the user and establishes a conference for people to join.

SariskaMediaUnitySdk.CreateConnection(roomName, userName);
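Putting the two calls together, a Start method might set up the stream and then join a room. The room and user names below are illustrative, and the texture pointers are the ones obtained earlier:

```csharp
void Start()
{
    // Texture pointers obtained earlier via GetNativeTexturePtr()
    SariskaMediaUnitySdk.SetupLocalStream(true, true, 180,
        _nativeTexturePointerLocal, _nativeTexturePointerRemote);

    // Join the room; the SDK generates the JWT token internally.
    SariskaMediaUnitySdk.CreateConnection("demo-room", "demo-user");
}
```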

Mute/Unmute Call

A local participant can mute or unmute their audio by using the following methods (the method names shown are assumptions following the SDK's naming pattern; check the SDK reference):

// To mute audio (assumed name)
SariskaMediaUnitySdk.MuteAudio();

// To unmute audio (assumed name)
SariskaMediaUnitySdk.UnMuteAudio();

Mute/Unmute Video

A local participant can mute or unmute their video by using the following methods (the method names shown are assumptions following the SDK's naming pattern; check the SDK reference):

// To mute video (assumed name)
SariskaMediaUnitySdk.MuteVideo();

// To unmute video (assumed name)
SariskaMediaUnitySdk.UnMuteVideo();

Switch Camera

A participant can switch between the front and back cameras by calling the switch camera method. By default, the video call starts with the front camera open.

// To switch between the front and back camera (assumed name; check the SDK reference)
SariskaMediaUnitySdk.SwitchCamera();

Lock/Unlock Room

A moderator can lock and unlock a room by calling the two methods below. When locking the room, the moderator has to provide a password as a string (the method names shown are assumptions; check the SDK reference).

// Lock a room with a password (assumed name)
SariskaMediaUnitySdk.LockRoom("password");

// Unlock a room (assumed name)
SariskaMediaUnitySdk.UnlockRoom();

Change Audio Output

The audio output can be switched to the speaker by calling the OnSpeaker method, and back to the default output by calling OffSpeaker.

// Speaker on
SariskaMediaUnitySdk.OnSpeaker();

// Speaker off
SariskaMediaUnitySdk.OffSpeaker();

Get Participant Count

The GetParticipantCount method returns the number of participants present in the meeting at the time the method is called.

// Get Participant count 
// hidden, if assigned true, counts hidden participants as well 

bool hidden = true;
int participantCount = SariskaMediaUnitySdk.GetParticipantCount(hidden);

Get Dominant Speaker

The GetDominantSpeaker method returns the id (as a string) of the dominant speaker in the meeting.

// Returns the id of the dominant speaker

string participantId = SariskaMediaUnitySdk.GetDominantSpeaker();

WebGL Implementation for Unity

A basic demo of drawing remote video tracks from sariska-media-transport on objects in Unity can be found at https://github.com/SariskaIO/sariska-media-unity-webgl.

JavaScript component

Initialize Sariska Media Transport
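The transport must be initialized once before creating a connection. A minimal sketch, assuming the Jitsi-style initialize and setLogLevel entry points:

```javascript
// Initialize the transport before creating any connection.
// The setLogLevel call and log level name follow the Jitsi-style API and are assumptions.
SariskaMediaTransport.initialize();
SariskaMediaTransport.setLogLevel(SariskaMediaTransport.logLevels.ERROR);
```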


Create Connection

connection = new SariskaMediaTransport.JitsiConnection(token, roomName);
connection.addEventListener(SariskaMediaTransport.events.connection.CONNECTION_ESTABLISHED, onConnectionSuccess);

Create Conference

room = connection.initJitsiConference();
room.on(SariskaMediaTransport.events.conference.CONFERENCE_JOINED, onConferenceJoined);
room.on(SariskaMediaTransport.events.conference.TRACK_ADDED, onRemoteTrack);
room.on(SariskaMediaTransport.events.conference.USER_JOINED, id => { remoteTracks[id] = []; });
room.on(SariskaMediaTransport.events.conference.USER_LEFT, onUserLeft);
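The USER_JOINED and USER_LEFT handlers above maintain a map from participant id to that participant's tracks. A standalone sketch of that bookkeeping (the handler names here are illustrative):

```javascript
// Map from participant id to the list of that participant's remote tracks.
const remoteTracks = {};

function onUserJoined(id) {
  // Prepare an empty track list for the new participant.
  remoteTracks[id] = [];
}

function onUserLeft(id) {
  // Drop the participant's stored tracks when they leave.
  delete remoteTracks[id];
}
```

Tracks received in onRemoteTrack are pushed onto `remoteTracks[participantId]`, so the map always reflects the current set of remote participants.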

Capture local streams

SariskaMediaTransport.createLocalTracks({ devices: ["audio", "video"] })
    .then(onLocalTracks);

Play Local stream

function onLocalTracks(tracks) {
  localTracks = tracks;
  for (let i = 0; i < localTracks.length; i++) {
    if (localTracks[i].getType() === 'video') {
      const key = "local";
      window.videoElements[key] = document.createElement('video');
      window.videoElements[key].autoplay = true;
      localTracks[i].attach(window.videoElements[key]);
    }
    if (isJoined) {
      room.addTrack(localTracks[i]);
    }
  }
}

User Joined

function onConferenceJoined() {
  isJoined = true;
  for (let i = 0; i < localTracks.length; i++) {
    room.addTrack(localTracks[i]);
  }
}

Playing remote peers streams

function onRemoteTrack(track) {
  if (track.isLocal()) {
    return;
  }
  const participantId = track.getParticipantId();
  if (!remoteTracks[participantId]) {
    remoteTracks[participantId] = [];
  }
  remoteTracks[participantId].push(track);
  if (track.getType() === 'video') {
    // Video elements just get stored; they are accessed from Unity.
    const key = "participant-" + participantId;
    window.videoElements[key] = document.createElement('video');
    window.videoElements[key].autoplay = true;
    track.attach(window.videoElements[key]);
  } else {
    // Audio elements get added to the DOM (they can be hidden with CSS) so that the audio plays back.
    const audioElement = document.createElement('audio');
    audioElement.autoplay = true;
    audioElement.id = "audio-" + participantId;
    track.attach(audioElement);
    document.body.appendChild(audioElement);
  }
}

C# Unity Script

  • Create a script to access Sariska Media Transport plugin functions.

  • Declare the external method “UpdateTextureFromVideoElement”, which takes the textureId generated by Unity as an argument.

[DllImport("__Internal")]
private static extern void UpdateTextureFromVideoElement(int textureId);
  • The script's Start function creates a texture and attaches it to a material on the object. The script's Update function calls a JS plugin method, passing a pointer to the texture.

void Update() {
    // Pass the texture's native id to the JS plugin every frame.
    int textureId = (int)texture.GetNativeTexturePtr();
    UpdateTextureFromVideoElement(textureId);
}
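A minimal sketch of the full script described above; the class name, texture size, and material usage are illustrative:

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class VideoTextureUpdater : MonoBehaviour
{
    [DllImport("__Internal")]
    private static extern void UpdateTextureFromVideoElement(int textureId);

    private Texture2D texture;

    void Start()
    {
        // Create a texture and attach it to the object's material (size is an example).
        texture = new Texture2D(320, 180, TextureFormat.RGBA32, false);
        GetComponent<Renderer>().material.mainTexture = texture;
    }

    void Update()
    {
        // Ask the .jslib plugin to copy the current video frame into the texture.
        UpdateTextureFromVideoElement((int)texture.GetNativeTexturePtr());
    }
}
```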

Plugin for WebGL:

  1. The .jslib plugin files act as a bridge between JavaScript and Unity C# scripts.

  2. The JS plugin method uses texSubImage2D to copy pixels from the video element of the track onto the texture. In a real application, you would identify the remote tracks (e.g. by participant ID) and paint each one on a separate object.

  3. The demo application showcases the usage of three participants.

  4. Example code of the plugin file.

mergeInto(LibraryManager.library, {
  UpdateTextureFromVideoElement: function (textureId) {
    const videoElement = Object.values(window.videoElements)[0];
    if (!videoElement) {
      console.log("no video element");
      return;
    }
    const texture = GL.textures[textureId];
    if (!texture) {
      console.log("no texture for id: " + textureId);
      return;
    }
    GLctx.bindTexture(GLctx.TEXTURE_2D, texture);
    GLctx.texSubImage2D(
      GLctx.TEXTURE_2D,
      0,  // level
      0,  // x offset
      0,  // y offset
      GLctx.RGBA,
      GLctx.UNSIGNED_BYTE,
      videoElement
    );
  }
});
