Unity is a cross-platform game engine developed by Unity Technologies.
It is particularly popular for iOS and Android mobile game development and used for games such as Pokémon Go, Monument Valley, Call of Duty: Mobile, Beat Saber, and Cuphead.
Sariska Media provides powerful Unity APIs for developing real-time applications.
You can integrate audio/video calling, live streaming, cloud recording, transcription, language translation, and many other services on the fly.
After the SDK is added to your project, initialize the SDK before setting up the video call.
SariskaMediaUnitySdk.InitSariskaMediaTransport();
Setup Local Stream
The SetupLocalStream method in SariskaMediaUnitySdk sets up a stream with a set of parameters that let you choose between an audio-only and a video call and, for a video call, the resolution of the video.
Additionally, the method requires you to pass texture pointers for both user-defined textures.
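A minimal sketch of the call is shown below. The parameter order, the boolean audio/video flags, and the resolution value are assumptions made for illustration, not the SDK's confirmed signature; consult the SDK reference for the exact parameters.

```csharp
// Hypothetical sketch: parameter order and names are assumptions, not the confirmed API.
// Create the two user-defined textures whose pointers the SDK needs.
Texture2D localTexture = new Texture2D(1280, 720, TextureFormat.RGB24, false);
Texture2D remoteTexture = new Texture2D(1280, 720, TextureFormat.RGB24, false);

// Set up a 720p video call, passing pointers to both textures.
SariskaMediaUnitySdk.SetupLocalStream(
    true,   // audio
    true,   // video
    720,    // video resolution
    localTexture.GetNativeTexturePtr(),
    remoteTexture.GetNativeTexturePtr());
```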
Create Connection
To enter the room and start a call, the create connection method lets you pass a room name and a user name as parameters. Once these parameters are sent to the SDK, it automatically creates a JWT token for the user and establishes a conference for participants to join.
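A sketch of this step, assuming the method is named CreateConnection and takes the room and user names as strings (a hypothetical signature; the room and user names are placeholders):

```csharp
// Hypothetical sketch: the exact method name and signature may differ.
string roomName = "demo-room";
string userName = "alice";

// The SDK generates a JWT token for this user and establishes the conference.
SariskaMediaUnitySdk.CreateConnection(roomName, userName);
```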
Mute/Unmute Audio
A local participant can mute or unmute their audio by using the following methods.
// To mute audio
SariskaMediaUnitySdk.MuteAudio();

// To unmute audio
SariskaMediaUnitySdk.UnMuteAudio();
Mute/Unmute Video
A local participant can mute or unmute their video by using the following methods.
// To mute video
SariskaMediaUnitySdk.MuteVideo();

// To unmute video
SariskaMediaUnitySdk.UnMuteVideo();
Switch Camera
A participant can switch between the front and back cameras by calling the switch camera method. By default, the video call starts with the front camera.
// To switch between cameras
SariskaMediaUnitySdk.SwitchCamera();
Lock/Unlock Room
A moderator can lock and unlock a room by calling the two methods below. When locking the room, the moderator has to provide a password as a string.
// Lock a room with a password
SariskaMediaUnitySdk.LockRoom(password);

// Unlock a room
SariskaMediaUnitySdk.UnlockRoom();
Change Audio Output
The audio output can be switched to the speaker by calling the OnSpeaker method, and back to the default output by calling OffSpeaker.
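The two calls look like this, using the method names stated above:

```csharp
// Route audio output to the speaker
SariskaMediaUnitySdk.OnSpeaker();

// Switch back to the default audio output
SariskaMediaUnitySdk.OffSpeaker();
```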
Get Participant Count
The GetParticipantCount method returns the number of participants present in the meeting at the time the method is called.
// Get participant count
// If hidden is true, hidden participants are counted as well
bool hidden = true;
int participantCount = SariskaMediaUnitySdk.GetParticipantCount(hidden);
Get Dominant Speaker
The GetDominantSpeaker method returns the id of the meeting's dominant speaker as a string.
// Returns the id of the dominant speaker
string participantId = SariskaMediaUnitySdk.GetDominantSpeaker();
function onRemoteTrack(track) {
  if (track.isLocal()) {
    return;
  }
  const participantId = track.getParticipantId();
  if (!remoteTracks[participantId]) {
    remoteTracks[participantId] = [];
  }
  remoteTracks[participantId].push(track);
  if (track.getType() == 'video') {
    // Video elements just get stored; they're accessed from Unity.
    const key = "participant-" + participantId;
    window.videoElements[key] = document.createElement('video');
    window.videoElements[key].autoplay = true;
    track.attach(window.videoElements[key]);
  } else {
    // Audio elements get added to the DOM (can be made invisible with CSS) so that the audio plays back.
    const audioElement = document.createElement('audio');
    audioElement.autoplay = true;
    audioElement.id = "audio-" + participantId;
    document.body.appendChild(audioElement);
    track.attach(audioElement);
  }
}
The script's Start function creates a texture and attaches it to a material on the object. The script's Update function calls a JS plugin method, passing a pointer to the texture.
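A minimal sketch of such a script, assuming the jslib function is named UpdateTextureFromVideoElement as in the plugin example; the class name, texture size, and format are illustrative choices, not part of the SDK:

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

public class VideoTextureUpdater : MonoBehaviour
{
    // Implemented in the .jslib plugin; only available in WebGL builds.
    [DllImport("__Internal")]
    private static extern void UpdateTextureFromVideoElement(IntPtr textureId);

    private Texture2D texture;

    void Start()
    {
        // Create a texture and attach it to the object's material.
        texture = new Texture2D(1280, 720, TextureFormat.RGB24, false);
        GetComponent<Renderer>().material.mainTexture = texture;
    }

    void Update()
    {
        // Pass the native texture pointer to the JS plugin each frame.
        UpdateTextureFromVideoElement(texture.GetNativeTexturePtr());
    }
}
```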
The .jslib plugin files act as a bridge between JavaScript and Unity C# scripts.
The JS plugin method uses texSubImage2D to copy pixels from the video element of the track onto the texture. In a real application, you would identify the remote tracks (e.g. by participant ID) and paint each one on a separate object.
The demo application showcases usage with three participants.
Example code for the plugin file:
mergeInto(LibraryManager.library, {
  UpdateTextureFromVideoElement: function (textureId) {
    const videoElement = Object.values(window.videoElements)[0];
    if (!videoElement) {
      console.log("no video element");
      return;
    }
    const texture = GL.textures[textureId];
    if (!texture) {
      console.log("no texture for id: " + textureId);
      return;
    }
    GLctx.bindTexture(GLctx.TEXTURE_2D, texture);
    GLctx.texSubImage2D(
      GLctx.TEXTURE_2D,
      0, // level
      0, // x offset
      0, // y offset
      videoElement.videoWidth,
      videoElement.videoHeight,
      GLctx.RGB,
      GLctx.UNSIGNED_BYTE,
      videoElement
    );
  }
});