
C# (Unity Engine)

Last updated 2 years ago

Unity is a cross-platform game engine developed by Unity Technologies.

It is particularly popular for iOS and Android mobile game development and has been used for games such as Pokémon Go, Monument Valley, Call of Duty: Mobile, Beat Saber, and Cuphead.

Sariska Media provides powerful Unity APIs for developing real-time applications.

You can integrate audio/video, live streaming, cloud recording, transcriptions, language translation, and many other services on the fly.

Set-Up

Changes to Unity Preference Settings

  • Switch the platform to Android, iOS, or WebGL.

  • Go to Player Settings and disable Multi-Threaded Rendering.

  • Add OpenGLES2 and OpenGLES3 in Graphics APIs.

  • Set the Minimum API Level to Android 5.1 ‘Lollipop’ (API level 22).

Import Sariska's Implementation of Google's Unity Jar Resolver

Clone the Unity Jar Resolver:

git clone https://github.com/SariskaIO/Sariska-Unity-Jar-Resolver

  1. In your Unity project, go to Assets -> Import Package -> Custom Package.

  2. Select "external-dependency-manager-1.2.169.unitypackage" from the cloned repo and import all.

  3. Add the external-dependency-manager-*.unitypackage to your plugin project (if you are developing a plugin).

Create an XML File for Importing the Sariska Media Transport SDK

Copy SampleDependencies.xml into your plugin, rename it, and add the dependencies your plugin requires. The latest stable version is "io.sariska:sariska-media-transport:5.2.1-exp".

Below is an example of the XML file used to import the Sariska Media Transport SDK for Android and iOS.

<?xml version="1.0" encoding="UTF-8" ?>
<dependencies>
  <iosPods>
    <iosPod name="sariska-media-transport" addToAllTargets="false">
      <sources>
        <source>https://github.com/SariskaIO/sariska-ios-sdk-releases</source>
      </sources>
    </iosPod>
  </iosPods>
  <androidPackages>
    <repositories>
      <repository>https://maven.pkg.github.com/SariskaIO/maven-repository</repository>
    </repositories>
    <androidPackage spec="io.sariska:sariska-media-transport:5.2.1-exp"></androidPackage>
  </androidPackages>
</dependencies>

Create a Canvas for the Local and Remote Streams in a Scene

Create two canvases in a scene, one each for the local and the remote client.

Scripting

Create a script to access the Sariska Media Transport plugin functions.

Declaring Raw Images

Declare the Raw Images in which the video will be embedded:

public class ExternalTexturePlugin : MonoBehaviour
{
...
[SerializeField] private RawImage localImage;
[SerializeField] private RawImage remoteImage;
...
}

Declaring Textures and Texture Pointers

Declare the textures and texture pointers for the two video streams:

public class ExternalTextureSecond : MonoBehaviour
{
...
[SerializeField] private RawImage localImage;
[SerializeField] private RawImage remoteImage;

private Texture2D localTexture2D;
private Texture2D remoteTexture2D;

private IntPtr _nativeTexturePointerLocal;
private IntPtr _nativeTexturePointerRemote;
...
}

Obtaining Texture Pointers

Get the texture pointers from the declared textures, and attach the textures to the declared Raw Images:

public class ExternalTextureSecond : MonoBehaviour
{
    ...
    
    void Start()
    {
        _nativeTexturePointerLocal = localTexture2D.GetNativeTexturePtr();
        _nativeTexturePointerRemote = remoteTexture2D.GetNativeTexturePtr();

        localImage.texture = localTexture2D;
        remoteImage.texture = remoteTexture2D;
    }
  ...
}

Implementation for Android and iOS

Importing SariskaMediaUnitySdk

using Plugins.SariskaMediaUnitySdk;

Initialize SDK

After the SDK is added to your project, initialize it before setting up the video call.

SariskaMediaUnitySdk.InitSariskaMediaTransport();

Setup Local Stream

The SetupLocalStream method in SariskaMediaUnitySdk sets up a stream with a set of parameters that let you choose between an audio-only and a video call, and select the video resolution in the case of a video call.

Additionally, the method requires the texture pointers for the two textures declared earlier.

// void SetupLocalStream(audio, video, resolution, localTexturePointer, remoteTexturePointer)

SariskaMediaUnitySdk.SetupLocalStream(true, true, 180, localTexturePointer, remoteTexturePointer);

Create Connection

To enter the room and start a call, the CreateConnection method takes a room name and a user name as parameters. Once these are sent to the SDK, it automatically creates a JWT token for the user and establishes a conference that others can join.

SariskaMediaUnitySdk.CreateConnection(roomName, userName);

Mute/Unmute Call

A local participant can mute or unmute their audio by using the following methods.

// To mute audio
SariskaMediaUnitySdk.MuteAudio();

// To unmute audio
SariskaMediaUnitySdk.UnMuteAudio();

Mute/Unmute Video

A local participant can mute or unmute their video by using the following methods.

// To mute video
SariskaMediaUnitySdk.MuteVideo();

// To unmute video
SariskaMediaUnitySdk.UnMuteVideo();

Switch Camera

A participant can switch the camera between the front and back by calling the switch camera method. By default, the video call initializes with the front camera open.

// To switch between camera
SariskaMediaUnitySdk.SwitchCamera();

Lock/Unlock Room

A moderator can lock and unlock a room by calling the two methods below. When locking the room, the moderator must provide a password as a string.

// Lock a room with a password 
SariskaMediaUnitySdk.LockRoom(password);

// Unlock a room 
SariskaMediaUnitySdk.UnlockRoom();

Change Audio Output

Switch the audio output to the speaker with the OnSpeaker method, and back to the default output with OffSpeaker.

// Speaker on
SariskaMediaUnitySdk.OnSpeaker();

// Speaker off
SariskaMediaUnitySdk.OffSpeaker();

Get Participant Count

The GetParticipantCount method returns the number of participants present in the meeting at the time the method is called.

// Get Participant count 
// hidden, if assigned true, counts hidden participants as well 

bool hidden = true;
int participantCount = SariskaMediaUnitySdk.GetParticipantCount(hidden);

Get Dominant Speaker

The GetDominantSpeaker method returns the id of the meeting's dominant speaker as a string.

// Returns the id of the dominant speaker

string participantId = SariskaMediaUnitySdk.GetDominantSpeaker();
WebGL Implementation for Unity

A basic demo of drawing remote video tracks from sariska-media-transport on objects in Unity can be found at https://github.com/SariskaIO/sariska-media-unity-webgl.

Javascript component

Initialize Sariska Media Transport and Create a Connection

SariskaMediaTransport.init();
connection = new SariskaMediaTransport.JitsiConnection(token, roomName);
connection.addEventListener(SariskaMediaTransport.events.connection.CONNECTION_ESTABLISHED, onConnectionSuccess);
connection.connect();

Create Conference

room = connection.initJitsiConference();
room.on(SariskaMediaTransport.events.conference.CONFERENCE_JOINED, onConferenceJoined);
room.on(SariskaMediaTransport.events.conference.TRACK_ADDED, onRemoteTrack);
room.on(SariskaMediaTransport.events.conference.USER_JOINED, id => { remoteTracks[id] = []; });
room.on(SariskaMediaTransport.events.conference.USER_LEFT, onUserLeft);
room.join();
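The snippets here set a session up, but tearing one down is never shown. Below is a hedged sketch, assuming leave(), disconnect(), and dispose() behave as in lib-jitsi-meet (whose API sariska-media-transport follows); the function name is our own and the objects are injected as parameters for clarity.

```javascript
// Teardown sketch (assumed API): release local capture devices first,
// then leave the conference, then drop the signalling connection.
function teardown(room, connection, localTracks) {
  for (const track of localTracks) {
    track.dispose(); // stops the track and releases camera/mic
  }
  return room.leave().then(() => connection.disconnect());
}
```

Calling this from a page-unload handler keeps the camera light from staying on after the tab closes.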

Capture local streams

SariskaMediaTransport.createLocalTracks({devices: ["audio", "video"]})
    .then(onLocalTracks);   

Play Local stream

function onLocalTracks(tracks) {
  localTracks = tracks;
  if (isJoined) {
    for (let i = 0; i < localTracks.length; i++) {
      room.addTrack(localTracks[i]);
    }
  }

  for (let i = 0; i < tracks.length; i++) {
    if (tracks[i].getType() === 'video') {
      const key = "local";
      window.videoElements[key] = document.createElement('video');
      window.videoElements[key].autoplay = true;
      tracks[i].attach(window.videoElements[key]);
    }
  }
}
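The loop above filters the track list by type inline. A tiny helper keeps that check in one place; the name is our own, and it assumes tracks expose getType() as used throughout these snippets.

```javascript
// Helper sketch: select only the video tracks from a track list.
// Assumes each track exposes getType(), as in the snippets above.
function videoTracksOf(tracks) {
  return tracks.filter(track => track.getType() === "video");
}
```

The local-stream loop could then iterate over videoTracksOf(tracks) directly.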

User Joined

function onConferenceJoined() {
  isJoined = true;
  for (let i = 0; i < localTracks.length; i++) {
    room.addTrack(localTracks[i]);
  }
}

Playing remote peers' streams
function onRemoteTrack(track) {
  if (track.isLocal()) {
    return;
  }
  const participantId = track.getParticipantId();
  if (!remoteTracks[participantId]) {
    remoteTracks[participantId] = [];
  }
  remoteTracks[participantId].push(track);
  if (track.getType() == 'video') {
    // Video elements just get stored, they're accessed from Unity.
    
    const key = "participant-" + participantId;
    window.videoElements[key] = document.createElement('video');
    window.videoElements[key].autoplay = true;
    track.attach(window.videoElements[key]);
  }
  else {
    // Audio elements get added to the DOM (can be made invisible with CSS) so that the audio plays back.
    const audioElement = document.createElement('audio');
    audioElement.autoplay = true;
    audioElement.id = "audio-" + participantId;
    document.body.appendChild(audioElement);
    track.attach(audioElement);
  }
  
}
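The USER_LEFT handler (onUserLeft) registered earlier is not shown. Below is a minimal sketch of the cleanup it could perform; the function name and explicit state parameters are our own, and it assumes tracks expose detach() as in onRemoteTrack above.

```javascript
// Hypothetical cleanup for USER_LEFT. State is passed explicitly here;
// in the page code it would close over remoteTracks and window.videoElements.
function cleanupParticipant(remoteTracks, videoElements, participantId) {
  for (const track of remoteTracks[participantId] || []) {
    if (typeof track.detach === "function") {
      track.detach(); // stop rendering the track into its element
    }
  }
  delete remoteTracks[participantId];
  delete videoElements["participant-" + participantId];
}
```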
C# Unity Script

  • Create a script to access the Sariska Media Transport plugin functions.

  • Declare the external method “UpdateTextureFromVideoElement”, which takes the textureId generated by Unity as an argument.

[DllImport("__Internal")]
private static extern void UpdateTextureFromVideoElement(int textureId);
  • The script's Start function creates a texture and attaches it to a material on the object. The script's Update function calls a JS plugin method, passing a pointer to the texture.

void Update() {
    if (local == 0)
    {
        textureId = texture.GetNativeTextureID();
    }
    UpdateTextureFromVideoElement(textureId);
}
Plugin for WebGL

  1. The .jslib plugin files act as a bridge between Javascript and Unity C# scripts.

  2. The JS plugin method uses texSubImage2D to copy pixels from the video element of the track onto the texture. In a real application, you would identify the remote tracks (e.g. by participant ID) and paint each one on a separate object.

  3. The demo application showcases the usage of three participants.

  4. Example code of the plugin file.

mergeInto(LibraryManager.library, {
  UpdateTextureFromVideoElement: function (textureId) {
    const videoElement = Object.values(window.videoElements)[0];

    if (!videoElement) {
      console.log("no video element");
      return;
    }
 
    const texture = GL.textures[textureId];
    if (!texture) {
      console.log("no texture for id: " + textureId);
      return;
    }
    GLctx.bindTexture(GLctx.TEXTURE_2D, texture);
    
    GLctx.texSubImage2D(
      GLctx.TEXTURE_2D,
      0,  // level
      0,  // x offset
      0,  // y offset
      videoElement.videoWidth,
      videoElement.videoHeight,
      GLctx.RGB,
      GLctx.UNSIGNED_BYTE,
      videoElement
    );
  }
});
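The plugin above always paints the first stored video element. Point 2 suggests identifying remote tracks and painting each one on its own object; one way to do that (a sketch, where the helper name and slot convention are our own) is to give each Unity object a slot index and map slots onto the keyed video elements, "local" first.

```javascript
// Sketch: map a Unity object's slot index to a keyed video element.
// Slot 0 is the local stream; remaining slots are remote participants
// in stable (lexicographic) key order. Returns null for empty slots.
function elementForSlot(videoElements, slot) {
  const keys = Object.keys(videoElements)
    .sort((a, b) => (a === "local" ? -1 : b === "local" ? 1 : a < b ? -1 : 1));
  const key = keys[slot];
  return key === undefined ? null : videoElements[key];
}
```

UpdateTextureFromVideoElement could then accept both a textureId and a slot, and look up elementForSlot(window.videoElements, slot) instead of always taking the first element.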
