Unreal Engine (UE) is a 3D computer graphics game engine developed by Epic Games, first showcased in the 1998 first-person shooter game Unreal. Written in C++, the Unreal Engine features a high degree of portability, supporting a wide range of desktop, mobile, console, and virtual reality platforms.
Sariska provides plugins to support media streaming for Unreal Gaming Engine. This documentation walks you through setting up the Sariska Unreal plugin for your Unreal project.
Developing an Android project in Unreal requires changes to your Unreal Editor preferences. Follow the steps below to quickly set up a project, or follow Unreal's official quick start guide.
Configure Android Platform
Click Edit > Project Settings to bring up the Project Settings window.
Inside the Project Settings window, navigate to Platforms > Android.
Launching on an Android Device
From the Launch menu, under the Devices section, select your Android device from the list by clicking on it.
While your level is being launched on your device, the progress is displayed in the bottom right-hand corner of the screen.
When the deployment has finished, the project should automatically start running on your Android device. If the project fails to automatically start, you can start it by finding the App on your device and tapping on it to launch it.
Sariska's Media Plugin for Unreal can be cloned from https://github.com/SariskaIO/Sariska-Media-Unreal-Plugin.
You can clone the repository into <Game>/Plugins/, or add it as a git submodule of your own repository. Alternatively, copy it into Engine/Plugins/ if you wish to make the plugin available to all of your projects.
Once the plugin is added to your project under the Plugins folder (create one if it doesn't exist already), you can regenerate project files and enable the plugin to use it.
If you are using UE version 4.26, you might need AndroidX support to run Sariska's media services. You can clone https://github.com/ufna/AndroidX-UE4 into <Game>/Plugins/ as well.
A demo app showing Sariska Media usage for Unreal can be found at https://github.com/SariskaIO/Sariska-Media-Unreal-Demo.
Fill in the Android Package Name with an appropriate company and project name.
If the Accept SDK License button is enabled, click it in order to accept Android's SDK license agreement. If you have previously accepted this agreement, then you will not need to complete this step.
Configuring Editor and PIE for Mobile Previews
In the Toolbar, click Settings > Preview Rendering Level, then select one of the available rendering levels for Android.
Click the dropdown next to the Play button in the Toolbar. Choose one of the available Mobile Preview modes corresponding to the rendering level you selected.
Configure Android SDK
You may sometimes need to set the location of the Android SDK and NDK manually.
Click Edit > Project Settings to bring up the Project Settings window.
Inside the Project Settings window, navigate to Platforms > Android-SDK.
Make changes to the SDK Config accordingly.

git clone https://github.com/SariskaIO/Sariska-Media-Unreal-Plugin
git clone https://github.com/ufna/AndroidX-UE4.git
git clone https://github.com/SariskaIO/Sariska-Media-Unreal-Demo.git

Empower your mobile applications with seamless real-time communication and media functionalities using the powerful Sariska Media Swift APIs. Integrate features like audio/video calling, live streaming, cloud recording, transcriptions, and language translation effortlessly into your iOS, macOS, watchOS, and tvOS apps.
With CocoaPods
Media Services utilizes WebSockets to establish a continuous, two-way communication channel between your application and the server. This persistent connection offers significant advantages over traditional HTTP requests, particularly in terms of responsiveness.
Create a Jitsi-powered conference for real-time audio and video.
Media Stream
A MediaStream is a collection of audio or video tracks, represented by MediaStreamTrack objects.
Each MediaStreamTrack can have multiple channels (e.g., left and right channels in a stereo audio track).
Capture Local Tracks
Define options:
Specify desired devices ("audio", "video", or "desktop").
Set preferred video resolution.
Optionally configure specific devices, frame rates, screen sharing options, and facing mode.
This event is triggered when a new user joins the conference. Moderators have exclusive control over meetings and can manage participants. To assign a moderator, set the moderator value to true when generating the token.
Make audio and video streams visible to others in the conference by publishing them using the following code:
Sariska-media-transport offers pre-configured events to help you track and analyze user interactions, media usage, and overall performance. This data can be used to enhance your product, improve user experience, and make informed decisions.
Available Events
Here are some of the key events you can track:
User Actions:
User joined
User left
Media Usage:
Add Event Listener to Track Events
Sariska offers powerful features to enhance your application's capabilities. Find your desired feature using the search bar or explore below!
Identify the main speaker: Easily detect the active or dominant speaker in a conference. Choose to stream only their video for improved resolution and reduced bandwidth usage. Ideal for one-way streaming scenarios like virtual concerts.
Dynamically showcase recent speakers: Focus on the active conversation by displaying video only for the last N participants who spoke. This automatically adjusts based on speech activity, offering a more efficient and relevant view.
Set local participant properties: Define additional information about participants beyond the default settings. This can include screen-sharing status, custom roles, or any other relevant attributes.
Get the total number of participants: Retrieve the complete participant count, including both visible and hidden members.
Access all participants: Obtain a list of all participants, including their IDs and detailed information.
Pin a single participant: Give a participant higher video quality (if simulcast is enabled).
Pin multiple participants: Give multiple participants higher video quality.
Retrieve information about the local user directly from the conference object.
Set the subject of the meeting.
Get all remote tracks: Retrieve a list of all remote tracks (audio and video) in the conference.
Get all local tracks: Retrieve a list of all local tracks (audio and video)
Listen for participant kick events
Kick a participant
Kick a moderator
The room creator has a moderator role, while other users have a participatory role.
Grant owner rights
Listen for role changes
Revoke owner rights
Setting a new display name
Listen for display name changes
Lock room: Moderators can restrict access to the room with a password.
Unlock room: Removes any existing password restriction.
Request subtitles: Enable subtitles for spoken content.
Request language translation: Translate subtitles into a specific language.
Receive subtitles: Listen for incoming subtitles.
Stop subtitles: Disable subtitles.
Each participant can contribute two types of data to a meeting: audio and video. Screen sharing counts as a video track. If you want to share your screen while also showing your own video (like when presenting), you need to enable "Presenter mode". This mode hides the gallery of other participants' videos and gives you more control over the meeting layout.
Start screen share: Share your screen with other participants.
Sariska offers robust messaging capabilities for both private and group communication scenarios.
Send a group message to all participants
Send a private message to a specific participant
Listen for incoming messages
Start Transcription: Initiate transcription for the ongoing conference.
Stop Transcription: Stop transcription and get a download link for the transcript.
Mute Remote Participant
Mute/Unmute Local Participant
Local Connection Statistics Received
Remote Connection Statistics Received
The SDK features intelligent auto-join/leave functionality based on internet connectivity status. This helps optimize network resources and improve user experience.
Designed for efficient communication between two participants.
Start Peer-to-Peer Mode
Sariska automatically activates Peer-to-Peer mode when your conference has exactly two participants. This mode bypasses the central server and directly connects participants, maximizing bandwidth efficiency and reducing latency. However, even with more than two participants, you can forcefully start Peer-to-Peer mode.
Stop Peer-to-Peer Mode
If you need to revert to server-mediated communication, you can easily stop Peer-to-Peer mode.
Monitor your WebRTC application performance using CallStats (or build your own). See the "RTC Stats" section for details.
Join conferences with audio and video already muted, or in a silent mode where no audio is transmitted or received. This ensures a seamless experience and respects participant preferences.
Join with Muted Audio and Video
Join in Silent Mode
Broadcast your conference to multiple platforms simultaneously. Embed live streams directly into your app or website using various formats.
Stream to YouTube
Stream to Facebook
Stream to Twitch
Stream to any RTMP Server
Listen for RECORDER_STATE_CHANGED event to track streaming status
Stop Live Stream
Store your recordings and transcriptions in various cloud storage services.
Dial-in (PSTN)
Dial-out (PSTN)
This allows synchronous phone calls, similar to WhatsApp, even if the receiver's app is closed or in the background.
Initiating Calls:
Make a call even if the callee's app is closed or in the background.
Play a busy tone if the callee is on another call or disconnects your call.
Play ringtone/ringback/DTMF tones.
Step 1 : Caller Initiates Call
HTTP Call to Sariska Server
{API Method}
Push Notification to callee using Firebase or APNS
This notifies the receiver even if their app is closed or in the background.
Step 2 : Callee Responds to Call
Reads Push Notification
Processes the notification even if the app is closed or in the background.
HTTP Call to Update Status
{API Method}
No need to join conference via SDK
Status update through the HTTP call suffices.
Step 3 : Caller Receives Response
Listens for USER_STATUS_CHANGED event
Step 4 : After Connection Established
The call proceeds like a normal conference call.
startVideoMuted: true
startSilent: true
rtcstatsServer: ""
callStatsID: ""
callStatsSecret: ""
channelLastN: 10
Purpose: Specifies which devices to request from the browser's GetUserMedia (GUM) API.
Default: If this property is not set, GUM will attempt to access all available devices.
resolution:
Type: String
Values: 180, 240, 360, vga, 480, qhd, 540, hd, 720, fullhd, 1080, 4k, 2160
Purpose: Sets the preferred resolution for the local video stream.
cameraDeviceId
Type: String
Purpose: Specifies the device ID of the camera to use.
micDeviceId
Type: String
Purpose: Specifies the device ID of the microphone to use.
minFps
Type: Integer
Purpose: Sets the minimum frame rate for the video stream.
maxFps
Type: Integer
Purpose: Sets the maximum frame rate for the video stream.
desktopSharingFrameRate
Type: Object
Properties:
min: Minimum frame rate for desktop sharing
max: Maximum frame rate for desktop sharing
desktopSharingSourceDevice
Type: String
Purpose: Specifies the device ID or label of the video input source to use for screen sharing.
facingMode
Type: String
Values: "user", "environment"
Purpose: Sets the camera's facing mode (front-facing or back-facing).
Participant Removal:
Ability to kick non-moderators or even other moderators from the meeting.
Audio Control:
Ability to mute individual participants or all participants at once.
Video Focus:
Ability to make everyone's video view follow the moderator's video.
Joining Settings: Ability to:
Set participants to join with audio muted by default.
Set participants to join with video disabled by default.
Lobby Management:
Ability to enable or disable the lobby room, requiring approval for participants to join.
Join Approval:
Ability to approve or deny join requests when the lobby is enabled.
Moderator Transfer:
If the current moderator leaves the meeting, a new moderator is automatically selected.
Type: String
Purpose: May be used for identification or communication purposes.
moderator:
Type: Boolean
Purpose: Used to control moderation-related features in the UI.
audioMuted:
Type: Boolean
Purpose: Used to display the audio muted state in the UI.
videoMuted:
Type: Boolean
Purpose: Used to display the video muted state in the UI.
displayName:
Type: String
Purpose: Used to identify them in the UI.
role:
Type: String
Purpose: Used to determine their permissions and UI features.
status:
Type: String
Purpose: Used to display their availability ("online", "offline", "away") in the UI.
hidden:
Type: Boolean
Purpose: Typically used for bots like transcribers or recorders.
botType:
Type: String
Purpose: Used to identify the bot's purpose and capabilities.
Conference duration
Camera duration
Audio track duration
Video track duration
Recording:
Recording started
Recording stopped
Local recording started
Local recording stopped
Transcription:
Transcription started
Transcription stopped
Performance:
Speaker stats
Connection stats
Performance stats
actionSubject: The subject of the action (string)
source: The source of the event (string)
attributes: Additional attributes of the event (JSON)
Supported translation languages (code: language):
af: Afrikaans
ar: Arabic
bg: Bulgarian
ca: Catalan
cs: Czech
da: Danish
de: German
el: Greek
en: English
enGB: English (United Kingdom)
eo: Esperanto
es: Spanish
esUS: Spanish (Latin America)
et: Estonian
eu: Basque
fi: Finnish
fr: French
frCA: French (Canadian)
he: Hebrew
hi: Hindi
hr: Croatian
hu: Hungarian
hy: Armenian
id: Indonesian
it: Italian
ja: Japanese
kab: Kabyle
ko: Korean
lt: Lithuanian
ml: Malayalam
lv: Latvian
nl: Dutch
oc: Occitan
fa: Persian
pl: Polish
pt: Portuguese
ptBR: Portuguese (Brazil)
ru: Russian
ro: Romanian
sc: Sardinian
sk: Slovak
sl: Slovenian
sr: Serbian
sq: Albanian
sv: Swedish
te: Telugu
th: Thai
tr: Turkish
uk: Ukrainian
vi: Vietnamese
zhCN: Chinese (China)
zhTW: Chinese (Taiwan)
mr: Marathi
Enter the required credentials for your chosen provider.
Dropbox
Obtain a Dropbox OAuth token.
connected: receiver accepted your call
expired: receiver didn't answer within 40 seconds
pod 'sariska-media-transport', :git => 'https://github.com/SariskaIO/sariska-ios-sdk-releases.git', tag:'1.1.5', :branch => 'master'

// Import the Sariska framework to access its classes
import sariska

// Perform initial setup tasks
SariskaMediaTransport.initializeSdk();

let token = {your-token} // Replace with your actual token
self.connection = SariskaMediaTransport.jitsiConnection(token, roomName: "roomname", isNightly: false)
// Add event listeners for connection events
self.connection?.addEventListener("CONNECTION_ESTABLISHED", callback: {
self.createConference() // Create conference when connection is established
})
// Add event listeners for connection events
self.connection?.addEventListener("CONNECTION_FAILED", callback: {
})
// Add event listeners for connection events
self.connection?.addEventListener("CONNECTION_DISCONNECTED", callback: {
})
// Add event listeners for connection events
self.connection?.addEventListener("PASSWORD_REQUIRED", callback: {
})
// Establish the connection
self.connection?.connect() {
}

conference = connection.initJitsiConference(options)
// Add event listeners for conference events
conference?.addEventListener("CONFERENCE_JOINED") {
// Add local tracks to the conference when joined
for track in self.localTracks {
self.conference?.addTrack(track: track)
}
}
// Add event listeners for conference events
conference?.addEventListener("TRACK_ADDED") { track in
let track = track as! JitsiRemoteTrack
DispatchQueue.main.async {
if (track.getType() == "video") {
let videoView = track.render()
self.attachVideo(videoView: videoView, trackId: track.getId())
}
}
}
// Add event listeners for conference events
conference?.addEventListener("TRACK_REMOVED") { track in
let track = track as! JitsiRemoteTrack
DispatchQueue.main.async {
self.removeVideo(trackId: track.getId())
}
}
// Add event listeners for conference events
conference?.addEventListener("CONFERENCE_LEFT") {
print("CONFERENCE_LEFT")
}
// Join the conference
conference.join()

// Define options for capturing local audio and video tracks
var options:[String: Any] = [:]
options["audio"] = true
options["video"] = true
options["resolution"] = true // Specify desired resolution
// ... (additional options for desktop sharing, facing mode, devices, etc.)
// Create local tracks with the specified options
SariskaMediaTransport.createLocalTracks(options) { tracks in
    // Handle local tracks once created
}

DispatchQueue.main.async {
// Iterate through local tracks
for track in localTracks {
// If the track is a video track
if (track.getType() == "video") {
// Render the track into a video view
let videoView = track.render()
// Configure view properties
// Set object fit to "cover" (fill the view, potentially cropping)
videoView.setObjectFit("cover")
// Set height and width constraints
videoView.heightAnchor.constraint(equalToConstant: 240).isActive = true
videoView.widthAnchor.constraint(equalToConstant: 360).isActive = true
// Add the video view as a subview
self.view.addSubview(videoView)
}
}
}conference.addEventListener(event: "USER_JOINED") { id, participant in
}for track in localTracks {
conference?.addTrack(track: track)
}conference?.addEventListener(event: "TRACK_ADDED") { track in
let track = track as! JitsiRemoteTrack
DispatchQueue.main.async {
// If the track is a video track
if (track.getType() == "video") {
// Render and display the remote video track
let videoView = track.render()
// Configure view properties
// Set object fit to "cover" (fill the view, potentially cropping)
videoView.setObjectFit("cover")
// Set height and width constraints
videoView.heightAnchor.constraint(equalToConstant: 240).isActive = true
videoView.widthAnchor.constraint(equalToConstant: 360).isActive = true
// Add the video view as a subview
self.view.addSubview(videoView)
}
}
}

conference?.addEventListener(event: "ANALYTICS_EVENT_RECEIVED") { payload in
}

conference?.addEventListener(event: "DOMINANT_SPEAKER_CHANGED") { id in
    // id is a string representing the ID of the dominant speaker
}

// Listen for last N speakers changed event
conference?.addEventListener(event: "LAST_N_ENDPOINTS_CHANGED") { leavingEndpointIds, enteringEndpointIds in
// leavingEndpointIds: Array of IDs of users leaving lastN
// enteringEndpointIds: Array of IDs of users entering lastN
};
// Set the number of last speakers to show
conference?.setLastN(10)

// Set a local participant property
conference?.setLocalParticipantProperty(key, value);
// Remove a local participant property
conference?.removeLocalParticipantProperty(key)
// Get the value of a local participant property
conference?.getLocalParticipantProperty(key)
// Listen for changes in participant properties
conference?.addEventListener(event: "PARTICIPANT_PROPERTY_CHANGED"){ participant, key, oldValue, newValue in
}conference?.getParticipantCount();
// Pass true to include hidden participants// Get all participants
conference?.getParticipants(); // List of all participants
// Get participants excluding hidden users
conference?.getParticipantsWithoutHidden(); // List of all participantsconference?.selectParticipant(participantId) // Select participant with IDconference?.selectParticipants(participantIds) // Select participants with IDs// Check if the local user is hidden
conference?.isHidden()
// Get local user details
conference?.getUserId()
conference?.getUserRole()
conference?.getUserEmail()
conference?.getUserAvatar()
conference?.getUserName()

conference?.setSubject(subject)

conference?.getRemoteTracks()

conference?.getLocalTracks()

conference?.addEventListener(event: "KICKED") { id in
// Handle participant kicked
};
conference?.addEventListener(event: "PARTICIPANT_KICKED"){ actorParticipant, kickedParticipant, reason in
}

conference?.kickParticipant(id)

conference?.kickParticipant(id, reason) // Kick a moderator, providing a reason for the kick

conference?.grantOwner(id) // Grant owner rights to a participant

conference?.addEventListener(event: "USER_ROLE_CHANGED") { id, role in
    if (conference.getUserId() == id) {
// My role changed, new role: role;
} else {
// Participant role changed: role;
}
};

conference?.revokeOwner(id) // Revoke owner rights from a participant

conference?.setDisplayName(name) // Change the local user's display name

conference?.addEventListener(event: "DISPLAY_NAME_CHANGED") { id, displayName in
// Access the participant ID
};

conference?.lock(password) // Lock the room with the specified password

conference?.unlock()

conference?.setLocalParticipantProperty("requestingTranscription", true)

conference?.setLocalParticipantProperty("translation_language", "hi") // Example for Hindi

conference?.addEventListener(event: "SUBTITLES_RECEIVED") { id, name, text in
// Handle received subtitle data (id, speaker name, text)
};

conference?.setLocalParticipantProperty("requestingTranscription", false)

var options: [String: Any] = [:]
options["desktop"] = true
let videoTrack = localTracks[1]
SariskaMediaTransport.createLocalTracks(options: options) { tracks in
conference.replaceTrack(videoTrack, tracks[0])
}conference.sendMessage("message");conference.sendMessage("message", participantId);// Add an event listener to handle incoming messages
conference.addEventListener("MESSAGE_RECEIVED" )){ message, senderId in
// Process the received message
});conference.startTranscriber();conference.stopTranscriber();track.muteParticipant(participantId, mediaType);
// participantId: ID of the participant to be muted
// mediaType: Type of media to mute ('audio' or 'video')

// Mute a local track (audio or video)
track.mute()
// Unmute a previously muted local track
track.unmute()

conference.addEventListener(event: "LOCAL_STATS_UPDATED") { stats in
    // Handle local connection statistics
}

conference.addEventListener(event: "REMOTE_STATS_UPDATED") { id, stats in
    // Handle remote connection statistics
}

conference.startP2PSession()

conference.stopP2PSession()

var options: [String: Any] = [:]
options["callStatsID"] = "callstats-id"
options["callStatsSecret"] = "callstats-secret"
let conference = connection.initJitsiConference(options)

var options: [String: Any] = [:]
options["startAudioMuted"] = true;
options["startVideoMuted"] = true;
let conference = connection.initJitsiConference(options)

var options: [String: Any] = [:]
options["startSilent"] = true
let conference = connection.initJitsiConference(options)

var options: [String: Any] = [:]
options["broadcastId"] = "youtubeBroadcastID" // Put any string; this will become part of your publish URL
options["mode"] = "stream"
options["streamId"] = "youtubeStreamKey"
// Start live stream
conference.startRecording(options)

var options: [String: Any] = [:]
options["mode"] = "stream"
options["streamId"] = "rtmps://live-api-s.facebook.com:443/rtmp/FB-4742724459088842-0-AbwKHwKiTd9lFMPy"
// Start live stream
conference.startRecording(options)

var options: [String: Any] = [:]
options["mode"] = "stream"
options["streamId"] = "rtmp://live.twitch.tv/app/STREAM_KEY"
// Start live stream
conference.startRecording(options)

var options: [String: Any] = [:]
options["mode"] = "stream"
options["streamId"] = "rtmps://rtmp-server/rtmp" // RTMP server URL
// Start live stream
conference.startRecording(options)

conference.addEventListener("RECORDER_STATE_CHANGED") { sessionId, mode, status in
    // Verify mode is "stream"
    // Get the live streaming session ID
    // Check the streaming status: on, off, or pending
}

conference.stopRecording(sessionId)

// Configure for Object-based storage
var options:[String: Any] = [:]
options["mode"] = "file";
options["serviceName"] = "s3";
// Configure for Dropbox
var options:[String: Any] = [:]
options["token"] = "dropbox_oauth_token";
options["mode"] = "file";
options["serviceName"] = "dropbox";
// Start recording
conference.startRecording(options);
// Monitor recording state
conference.addEventListener(event: "RECORDER_STATE_CHANGED") { sessionId, mode, status in
let sessionId = sessionId as! String; // Unique identifier for the cloud recording session
let mode = mode as! String; // Recording mode (e.g., "file")
let status = status as! String; // Current recording status ("on", "off", or "pending")
// Handle recording state changes based on mode, sessionId, and status
};
// Stop recording
conference.stopRecording(sessionId)

// Retrieve the phone pin and number for users to join via PSTN:
let phonePin = conference.getPhonePin() // Get the phone pin for PSTN access
let phoneNumber = conference.getPhoneNumber() // Get the phone number for PSTN access

// Dial a phone number to invite a participant to the conference
conference.dial(phoneNumber)

// Join the lobby
conference.joinLobby(displayName, email); // Request to join the conference lobby
// Event listeners for lobby-related actions:
conference.addEventListener(event: "LOBBY_USER_JOINED") { id, name in
// Handle events when a user joins the lobby
}
conference.addEventListener(event: "LOBBY_USER_UPDATED"), { id, participant in
// Handle events when a user's information in the lobby is updated
}
// Additional event listeners for lobby actions:
conference.addEventListener(event: "LOBBY_USER_LEFT") { id in
}
conference.addEventListener(event: "MEMBERS_ONLY_CHANGED"){ enabled in
}
// Moderator actions for lobby access:
conference.lobbyDenyAccess(participantId); // Deny access to a participant in the lobby
conference.lobbyApproveAccess(participantId); // Approve access to a participant in the lobby
// Lobby management methods:
conference.enableLobby() // Enable lobby mode for the conference (moderator only)
conference.disableLobby(); // Disable lobby mode for the conference (moderator only)
conference.isMembersOnly() // Check if the conference is in members-only mode (lobby enabled)

// Initiate a SIP video call
conference.startSIPVideoCall("[email protected]", "display name") // Start a SIP video call with the specified address and display name

// Terminate a SIP video call
conference.stopSIPVideoCall("[email protected]")
// Event listeners for SIP gateway state changes
conference.addEventListener(event: "VIDEO_SIP_GW_SESSION_STATE_CHANGED") { state in
// Handle events when the SIP gateway session state changes (on, off, pending, retrying, failed)
}
// Event listener for SIP gateway availability changes
conference.addEventListener(event: "VIDEO_SIP_GW_AVAILABILITY_CHANGED"){ status in
// Handle events when the SIP gateway availability changes (busy or available)
}conference.addEventListener(event: "USER_STATUS_CHANGED"){ id, status in
let id = id as! String;
let status = status as!
// - id: receiver's user id
// - status: "ringing", "busy", "rejected", "connected", "expired"
};Sariska Media provides a robust suite of Kotlin APIs designed to streamline the development of real-time android applications. With Sariska, you can seamlessly integrate a variety of features.
With Pre-built Artifacts
For older Android Studio versions:
In your project's root directory, locate the build.gradle file.
Add the following code block within the repositories section:
For newer Android Studio versions:
In your project's root directory, locate the settings.gradle file.
Add the following code block within the repositories section:
With Maven
Add the dependency:
After installing the Sariska Media Transport SDK, begin by initializing it.
Media Services utilizes WebSockets to establish a continuous, two-way communication channel between your application and the server. This persistent connection offers significant advantages over traditional HTTP requests, particularly in terms of responsiveness.
Create a Jitsi-powered conference for real-time audio and video.
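A minimal Kotlin sketch of this flow, condensed from the connection and conference snippets later on this page; token, roomName, isNightly, and options are values you supply:

import io.sariska.sdk.SariskaMediaTransport

// Initialize the SDK once, then open the WebSocket connection
SariskaMediaTransport.initializeSdk()
val connection = SariskaMediaTransport.JitsiConnection(token, roomName, isNightly)
connection.addEventListener("CONNECTION_ESTABLISHED") {
    // The connection is up; it is now safe to create and join the conference
    val conference = connection.initJitsiConference(options)
    conference.join()
}
connection.connect()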
A MediaStream consists of various audio or video tracks represented by MediaStreamTrack objects. Each track can have multiple channels, which are the smallest units of the stream (e.g., left and right audio channels in stereo).
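For example, capturing a local stream with one audio track and one video track looks like this in Kotlin (a sketch based on the capture snippet later on this page; the resolution value is illustrative):

import android.os.Bundle

val options = Bundle()
options.putBoolean("audio", true) // request an audio MediaStreamTrack
options.putBoolean("video", true) // request a video MediaStreamTrack
options.putInt("resolution", 240) // preferred video resolution

SariskaMediaTransport.createLocalTracks(options) { tracks ->
    localTracks = tracks // keep the JitsiLocalTrack objects for rendering and publishing
}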
This event is triggered when a new user joins the conference. Moderators have exclusive control over meetings and can manage participants. To assign a moderator, set the moderator value to true when generating the token.
Make audio and video streams visible to others in the conference by publishing them using the following code:
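A sketch of the publishing step described above, drawn from the Kotlin snippets later on this page: listen for USER_JOINED, then add each captured local track to the conference.

conference.addEventListener("USER_JOINED") { id, participant ->
    // A new participant joined; id and participant describe them
}

// Publish every captured local track so others can see and hear you
for (track in localTracks) {
    conference.addTrack(track)
}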
Sariska-media-transport offers pre-configured events to help you track and analyze user interactions, media usage, and overall performance. This data can be used to enhance your product, improve user experience, and make informed decisions.
Available Events
Here are some of the key events you can track:
User Actions:
User joined
User left
Media Usage:
Add Event Listener to Track Events
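A sketch of subscribing to these analytics events in Kotlin; the ANALYTICS_EVENT_RECEIVED event name is taken from the iOS snippet earlier in this documentation and is assumed to behave the same way here.

conference.addEventListener("ANALYTICS_EVENT_RECEIVED") { payload ->
    // payload carries the event's actionSubject, source, and attributes
}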
Sariska offers powerful features to enhance your application's capabilities. Find your desired feature using the search bar or explore below!
Identify the main speaker: Easily detect the active or dominant speaker in a conference. Choose to stream only their video for improved resolution and reduced bandwidth usage. Ideal for one-way streaming scenarios like virtual concerts.
Dynamically showcase recent speakers: Focus on the active conversation by displaying video only for the last N participants who spoke. This automatically adjusts based on speech activity, offering a more efficient and relevant view.
Set local participant properties: Define additional information about participants beyond the default settings. This can include screen-sharing status, custom roles, or any other relevant attributes.
Get the total number of participants: Retrieve the complete participant count, including both visible and hidden members.
Access all participants: Obtain a list of all participants, including their IDs and detailed information.
Pin a single participant: Give a participant higher video quality (if simulcast is enabled).
Pin multiple participants: Give multiple participants higher video quality.
Retrieve information about the local user directly from the conference object.
Set the subject of the meeting.
Get all remote tracks: Retrieve a list of all remote tracks (audio and video) in the conference.
Get all local tracks: Retrieve a list of all local tracks (audio and video)
Listen for participant kick events
Kick a participant
The room creator has a moderator role, while other users have a participatory role.
Grant owner rights
Listen for role changes
Revoke owner rights
Setting a new display name
Listen for display name changes
Lock room: Moderators can restrict access to the room with a password.
Unlock room: Removes any existing password restriction.
Request subtitles: Enable subtitles for spoken content.
Request language translation: Translate subtitles into a specific language.
Receive subtitles: Listen for incoming subtitles.
Stop subtitles: Disable subtitles.
Each participant can contribute two types of data to a meeting: audio and video. Screen sharing counts as a video track. If you want to share your screen while also showing your own video (like when presenting), you need to enable "Presenter mode". This mode hides the gallery of other participants' videos and gives you more control over the meeting layout.
Start screen share: Share your screen with other participants.
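A sketch of starting a screen share by capturing a desktop track and swapping it in for the current camera track (condensed from the Kotlin snippet later on this page; the index of the local video track is an assumption):

val options = Bundle()
options.putBoolean("desktop", true) // capture the screen instead of the camera

val cameraTrack = localTracks[1] // the participant's current local video track
SariskaMediaTransport.createLocalTracks(options) { tracks ->
    conference.replaceTrack(cameraTrack, tracks[0]) // publish the screen track in its place
}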
Sariska offers robust messaging capabilities for both private and group communication scenarios.
Send a group message to all participants
Send a private message to a specific participant
Listen for incoming messages
Start Transcription: Initiate transcription for the ongoing conference.
Stop Transcription: Stop transcription and get a download link for the transcript.
Mute Remote Participant
Mute/Unmute Local Participant
Local Connection Statistics Received
Remote Connection Statistics Received
The SDK features intelligent auto-join/leave functionality based on internet connectivity status. This helps optimize network resources and improve user experience.
Designed for efficient communication between two participants.
Start Peer-to-Peer Mode
Sariska automatically activates Peer-to-Peer mode when your conference has exactly two participants. This mode bypasses the central server and directly connects participants, maximizing bandwidth efficiency and reducing latency. However, even with more than two participants, you can forcefully start Peer-to-Peer mode.
Stop Peer-to-Peer Mode
If you need to revert to server-mediated communication, you can easily stop Peer-to-Peer mode.
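Both operations are single calls, as shown in the Kotlin snippets later on this page:

conference.startP2PSession() // force a direct peer-to-peer session
conference.stopP2PSession()  // fall back to server-mediated routing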
Monitor your WebRTC application performance using CallStats (or build your own). See the "RTC Stats" section for details.
Join conferences with audio and video already muted, or in a silent mode where no audio is transmitted or received. This ensures a seamless experience and respects participant preferences.
Join with Muted Audio and Video
Join in Silent Mode
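A sketch of both join modes using the configuration keys listed later on this page (startAudioMuted, startVideoMuted, startSilent):

// Join with audio and video muted
val options = Bundle()
options.putBoolean("startAudioMuted", true)
options.putBoolean("startVideoMuted", true)
val conference = connection.initJitsiConference(options)

// Join in silent mode: no audio is transmitted or received
val silentOptions = Bundle()
silentOptions.putBoolean("startSilent", true)
val silentConference = connection.initJitsiConference(silentOptions)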
Broadcast your conference to multiple platforms simultaneously. Embed live streams directly into your app or website using various formats.
Stream to YouTube
Stream to Facebook
Stream to Twitch
Stream to any RTMP Server
Listen for RECORDER_STATE_CHANGED event to track streaming status
Stop Live Stream
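A sketch that ties these steps together for a generic RTMP target; the stream URL is a placeholder, and platform-specific URLs and keys are shown in the snippets later on this page:

val options = Bundle()
options.putString("mode", "stream")
options.putString("streamId", "rtmps://your-rtmp-server/app/STREAM_KEY") // placeholder URL
conference.startRecording(options)

conference.addEventListener("RECORDER_STATE_CHANGED") { sessionId, mode, status ->
    // status is "on", "off", or "pending"; keep sessionId so the stream can be stopped
}

conference.stopRecording(sessionId)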
Store your recordings and transcriptions in various cloud storage services.
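A sketch of recording to cloud storage in file mode; the Dropbox OAuth token is a placeholder you obtain from your Dropbox app (see the credential notes later on this page):

val options = Bundle()
options.putString("mode", "file")
options.putString("serviceName", "dropbox")            // or "s3" for object-based storage
options.putString("token", "your-dropbox-oauth-token") // placeholder credential
conference.startRecording(options)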
Dial-in (PSTN)
Dial-out (PSTN)
This allows synchronous phone calls, similar to WhatsApp, even if the receiver's app is closed or in the background.
Initiating Calls:
Make a call even if the callee's app is closed or in the background.
Play a busy tone if the callee is on another call or disconnects your call.
Play ringtone/ringback/DTMF tones.
Step 1 : Caller Initiates Call
HTTP Call to Sariska Server
{API Method}
Push Notification to callee using Firebase or APNS
This notifies the receiver even if their app is closed or in the background.
Step 2 : Callee Responds to Call
Reads Push Notification
Processes the notification even if the app is closed or in the background.
HTTP Call to Update Status
{API Method}
No need to join conference via SDK
Status update through the HTTP call suffices.
Step 3 : Caller Receives Response
Listens for USER_STATUS_CHANGED event
Step 4 : After Connection Established
The call proceeds like a normal conference call.
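On the caller's side, step 3 amounts to listening for USER_STATUS_CHANGED (a sketch based on the Kotlin snippet later on this page):

conference.addEventListener("USER_STATUS_CHANGED") { id, status ->
    val calleeId = id as String       // receiver's user id
    val callStatus = status as String // "ringing", "busy", "rejected", "connected", or "expired"
}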
startVideoMuted: true
startSilent: true
rtcstatsServer: ""
callStatsID: ""
callStatsSecret: ""
channelLastN: 10
Purpose: Specifies which devices to request from the browser's GetUserMedia (GUM) API.
Default: If this property is not set, GUM will attempt to access all available devices.
resolution:
Type: String
Values: 180, 240, 360, vga, 480, qhd, 540, hd, 720, fullhd, 1080, 4k, 2160
Purpose: Sets the preferred resolution for the local video stream.
cameraDeviceId
Type: String
Purpose: Specifies the device ID of the camera to use.
micDeviceId
Type: String
Purpose: Specifies the device ID of the microphone to use.
minFps
Type: Integer
Purpose: Sets the minimum frame rate for the video stream.
maxFps
Type: Integer
Purpose: Sets the maximum frame rate for the video stream.
desktopSharingFrameRate
Type: Object
Properties:
min: Minimum frame rate for desktop sharing
max: Maximum frame rate for desktop sharing
desktopSharingSourceDevice
Type: String
Purpose: Specifies the device ID or label of the video input source to use for screen sharing.
facingMode
Type: String
Values: "user", "environment"
Purpose: Sets the camera's facing mode (front-facing or back-facing).
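Putting several of these properties together, a hypothetical capture configuration might look like the sketch below; every value is illustrative, and the device ID is a placeholder.

val options = Bundle()
options.putBoolean("audio", true)
options.putBoolean("video", true)
options.putInt("resolution", 720)                     // one of the values listed above
options.putString("facingMode", "user")               // front-facing camera
options.putString("cameraDeviceId", "your-camera-id") // placeholder device ID
options.putInt("minFps", 15)
options.putInt("maxFps", 30)
SariskaMediaTransport.createLocalTracks(options) { tracks ->
    localTracks = tracks
}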
view.setZOrderOnTop(0)

Participant Removal:
Ability to kick non-moderators or even other moderators from the meeting.
Audio Control:
Ability to mute individual participants or all participants at once.
Video Focus:
Ability to make everyone's video view follow the moderator's video.
Joining Settings: Ability to:
Set participants to join with audio muted by default.
Set participants to join with video disabled by default.
Lobby Management:
Ability to enable or disable the lobby room, requiring approval for participants to join.
Join Approval:
Ability to approve or deny join requests when the lobby is enabled.
Moderator Transfer:
If the current moderator leaves the meeting, a new moderator is automatically selected.
Type: String
Purpose: May be used for identification or communication purposes.
moderator:
Type: Boolean
Purpose: Used to control moderation-related features in the UI.
audioMuted:
Type: Boolean
Purpose: Used to display the audio muted state in the UI.
videoMuted:
Type: Boolean
Purpose: Used to display the video muted state in the UI.
displayName:
Type: String
Purpose: Used to identify them in the UI.
role:
Type: String
Purpose: Used to determine their permissions and UI features.
status:
Type: String
Purpose: Used to display their availability ("online", "offline", "away") in the UI.
hidden:
Type: Boolean
Purpose: Typically used for bots like transcribers or recorders.
botType:
Type: String
Purpose: Used to identify the bot's purpose and capabilities.
Conference duration
Camera duration
Audio track duration
Video track duration
Recording:
Recording started
Recording stopped
Local recording started
Local recording stopped
Transcription:
Transcription started
Transcription stopped
Performance:
Speaker stats
Connection stats
Performance stats
Supported translation languages (code: language):
af: Afrikaans
ar: Arabic
bg: Bulgarian
ca: Catalan
cs: Czech
da: Danish
de: German
el: Greek
en: English
enGB: English (United Kingdom)
eo: Esperanto
es: Spanish
esUS: Spanish (Latin America)
et: Estonian
eu: Basque
fi: Finnish
fr: French
frCA: French (Canadian)
he: Hebrew
hi: Hindi
hr: Croatian
hu: Hungarian
hy: Armenian
id: Indonesian
it: Italian
ja: Japanese
kab: Kabyle
ko: Korean
lt: Lithuanian
ml: Malayalam
lv: Latvian
nl: Dutch
oc: Occitan
fa: Persian
pl: Polish
pt: Portuguese
ptBR: Portuguese (Brazil)
ru: Russian
ro: Romanian
sc: Sardinian
sk: Slovak
sl: Slovenian
sr: Serbian
sq: Albanian
sv: Swedish
te: Telugu
th: Thai
tr: Turkish
uk: Ukrainian
vi: Vietnamese
zhCN: Chinese (China)
zhTW: Chinese (Taiwan)
mr: Marathi
Enter the required credentials for your chosen provider.
Dropbox
Obtain a Dropbox OAuth token.
connected: receiver accepted your call
expired: receiver didn't answer within 40 seconds
allprojects {
// Add the Sariska repository
repositories {
maven {
url "https://github.com/SariskaIO/sariska-maven-repository/raw/master/releases"
}
// Add other repositories
google()
mavenCentral()
maven { url 'https://www.jitpack.io' }
}
}

dependencyResolutionManagement {
repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
repositories {
google()
mavenCentral()
maven {
url "https://github.com/SariskaIO/sariska-maven-repository/raw/master/releases"
}
maven {
url "https://maven.google.com"
}
}
}

dependencies {
    implementation 'io.sariska:sariska-media-transport:5.4.9'
}

import io.sariska.sdk.SariskaMediaTransport;
import io.sariska.sdk.Connection;
import io.sariska.sdk.Conference;
import io.sariska.sdk.JitsiRemoteTrack;
import io.sariska.sdk.JitsiLocalTrack;
import io.sariska.sdk.Participant;
SariskaMediaTransport.initializeSdk();

// Replace {your-token} with your actual Sariska token
val token = {your-token}
// Create a connection object
val connection = SariskaMediaTransport.JitsiConnection(token, "roomName", isNightly)
// Add event listeners for connection status
connection.addEventListener("CONNECTION_ESTABLISHED"), {
// Handle successful connection establishment
}
connection.addEventListener("CONNECTION_FAILED"), { error->
if (error === PASSWORD_REQUIRED) {
// Token expired, update the connection with a new token
connection.setToken(token)
}
}
connection.addEventListener("CONNECTION_DISCONNECTED"), {
// Handle connection disconnection
}
// Connect to the Sariska server
connection.connect()

// Create a conference object using the connection and optional configuration
val conference = connection.initJitsiConference(options);
// Join the conference
conference.join();

// Define options for local stream capture
val options = Bundle()
// Enable audio and video tracks in the stream
options.putBoolean("audio", true)
options.putBoolean("video", true)
// Set desired video resolution
options.putInt("resolution", 240) // Specify desired resolution
// ... (additional options for desktop sharing, facing mode, devices, etc.)
// Create local audio and video tracks based on the options
SariskaMediaTransport.createLocalTracks(options) { tracks ->
localTracks = tracks;
}

runOnUiThread {
    // Loop through each local track
    for (track in localTracks) {
        if (track.getType().equals("video")) {
            // Get the view for rendering the video track
            val view = track.render();
            // Set the view to fill the container
            view.setObjectFit("cover");
            // Add the video view to the container layout
            mLocalContainer.addView(view);
        }
    }
}

conference.addEventListener("USER_JOINED") { id, participant ->
}

// Loop through all local tracks
for (track in localTracks) {
    // Add the track to the conference, allowing others to see/hear you
    conference.addTrack(track);
}

// Add an event listener to the conference for "TRACK_ADDED" events
conference.addEventListener("TRACK_ADDED") { p ->
// Cast the event object to a JitsiRemoteTrack instance
val track: JitsiRemoteTrack = p as JitsiRemoteTrack
// Run code on the UI thread
runOnUiThread {
// Check if the track type is video
if (track.getType().equals("video")) {
// Render the video track using Jitsi's WebRTCView component
val view: WebRTCView = track.render()
// Set the view to cover the entire container
view.setObjectFit("cover")
// Store the view for potential later reference
remoteView = view
// Add the view to the container layout, displaying the remote video
mRemoteContainer!!.addView(view)
}
}
}

conference.startTracking();

conference.addEventListener("DOMINANT_SPEAKER_CHANGED") { p ->
    // participantId is a string containing the ID of the dominant speaker
    val participantId = p as String
}

// Listen for last N speakers changed event
conference.addEventListener("LAST_N_ENDPOINTS_CHANGED") { leavingEndpointIds, enteringEndpointIds ->
// leavingEndpointIds: Array of IDs of users leaving lastN
// enteringEndpointIds: Array of IDs of users entering lastN
};
// Set the number of last speakers to show
conference.setLastN(10)

// Set local participant property
conference.setLocalParticipantProperty(key, value);
// Remove a local participant property
conference.removeLocalParticipantProperty(key)
// Get the value of a local participant property
conference.getLocalParticipantProperty(key)
// Listen for changes in participant properties
conference.addEventListener("PARTICIPANT_PROPERTY_CHANGED"){ participant, key, oldValue, newValue ->
}conference.getParticipantCount();
// Pass true to include hidden participants// Get all participants
conference.getParticipants(); // List of all participants
// Get participants excluding hidden users
conference.getParticipantsWithoutHidden(); // List of all participantsconference.selectParticipant(participantId) // Select participant with IDconference.selectParticipants(participantIds) // Select participant with IDs// Check if the local user is hidden
conference.isHidden()
// Get local user details
conference.getUserId()
conference.getUserRole()
conference.getUserEmail()
conference.getUserAvatar()
conference.getUserName()

conference.setSubject(subject)

conference.getRemoteTracks();

conference.getLocalTracks()

conference.addEventListener("KICKED") { id ->
// Handle participant kicked
};
conference.addEventListener("PARTICIPANT_KICKED") { actorParticipant, kickedParticipant, reason ->
}

conference.kickParticipant(id)

conference.grantOwner(id) // Grant owner rights to a participant

conference.addEventListener("USER_ROLE_CHANGED") { id, role ->
    if (conference.getUserId() == id) {
// My role changed, new role: role;
} else {
// Participant role changed: role;
}
};

conference.revokeOwner(id) // Revoke owner rights from a participant

conference.setDisplayName(name); // Change the local user's display name

conference.addEventListener("DISPLAY_NAME_CHANGED") { id, displayName ->
// Access the participant ID
};

conference.lock(password); // Lock the room with the specified password

conference.unlock();

conference.setLocalParticipantProperty("requestingTranscription", true);

conference.setLocalParticipantProperty("translation_language", "hi"); // Example for Hindi

conference.addEventListener("SUBTITLES_RECEIVED") { id, name, text ->
// Handle received subtitle data (id, speaker name, text)
};

conference.setLocalParticipantProperty("requestingTranscription", false);

// Create a desktop track
val options = Bundle();
options.putBoolean("desktop", true);
val videoTrack = localTracks[1]; // Participant's local video track
SariskaMediaTransport.createLocalTracks(options) { tracks ->
    conference.replaceTrack(videoTrack, tracks[0]);
}

conference.sendMessage("message");

conference.sendMessage("message", participantId);

// Add an event listener to handle incoming messages
conference.addEventListener("MESSAGE_RECEIVED") { message, senderId ->
    // Process the received message
}

conference.startTranscriber();

conference.stopTranscriber();

conference.muteParticipant(participantId, mediaType)
// participantId: ID of the participant to be muted
// mediaType: Type of media to mute ('audio' or 'video')

// Mute a local track (audio or video)
track.mute()
// Unmute a previously muted local track
track.unmute()

conference.addEventListener("LOCAL_STATS_UPDATED") { stats ->
    // Handle local connection statistics
}

conference.addEventListener("REMOTE_STATS_UPDATED") { id, stats ->
    // Handle remote connection statistics
}

conference.startP2PSession();

conference.stopP2PSession();

val options = Bundle();
options.putString("callStatsID", "callstats-id");
options.putString("callStatsSecret", "callstats-secret");
val conference = connection.initJitsiConference(options);

val options = Bundle();
options.putBoolean("startAudioMuted", true);
options.putBoolean("startVideoMuted", true);
val conference = connection.initJitsiConference(options);

val options = Bundle();
options.putBoolean("startSilent", true);
val conference = connection.initJitsiConference(options);

val options = Bundle();
options.putString("broadcastId", "youtubeBroadcastID"); // Put any string this will become part of your publish URL
options.putString("mode", "stream");
options.putString("streamId", "youtubeStreamKey");
// Start live stream
conference.startRecording(options);

val options = Bundle();
options.putString("mode", "stream");
options.putString("streamId", "rtmps://live-api-s.facebook.com:443/rtmp/FB-4742724459088842-0-AbwKHwKiTd9lFMPy");
// Start live stream
conference.startRecording(options);

val options = Bundle();
options.putString("mode", "stream");
options.putString("streamId", "rtmp://live.twitch.tv/app/STREAM_KEY");
// Start live stream
conference.startRecording(options);

val options = Bundle();
options.putString("mode", "stream");
options.putString("streamId", "rtmps://rtmp-server/rtmp"); // RTMP server URL
// Start live stream
conference.startRecording(options);

conference.addEventListener("RECORDER_STATE_CHANGED") { sessionId, mode, status ->
    // Verify mode is "stream"
    // Get the live streaming session ID
    // Check the streaming status: on, off, or pending
}

conference.stopRecording(sessionId);

// Configure for Object-based storage
val options = Bundle();
options.putString("mode", "file");
options.putString("serviceName", "s3");

// Configure for Dropbox
val options = Bundle();
options.putString("mode", "file");
options.putString("serviceName", "dropbox");
options.putString("token", "dropbox_oauth_token");
// Start recording
conference.startRecording(options);
// Monitor recording state
conference.addEventListener("RECORDER_STATE_CHANGED"){ sessionId, mode, status ->
val sessionId = sessionId as String; // Unique identifier for the cloud recording session
val mode = mode as String; // Recording mode (e.g., "file")
val status = status as String; // Current recording status ("on", "off", or "pending")
// Handle recording state changes based on mode, sessionId, and status
}
// Stop recording
conference.stopRecording(sessionId)

// Retrieve the phone pin and number for users to join via PSTN:
val phonePin = conference.getPhonePin() // Get the phone pin for PSTN access
val phoneNumber = conference.getPhoneNumber() // Get the phone number for PSTN access

// Dial a phone number to invite a participant to the conference
conference.dial(phoneNumber)

// Join the lobby
conference.joinLobby(displayName, email); // Request to join the conference lobby
// Event listeners for lobby-related actions:
conference.addEventListener("LOBBY_USER_JOINED"){ id, name ->
// Handle events when a user joins the lobby
val id = id as String;
val name = name as String
}

conference.addEventListener("LOBBY_USER_UPDATED") { id, participant ->
// Handle events when a user's information in the lobby is updated
val id = id as String;
val name = participant as Participant
}
// Additional event listeners for lobby actions:
conference.addEventListener(event: "LOBBY_USER_LEFT") { id ->
val id = id as String;
}
conference.addEventListener(event: "MEMBERS_ONLY_CHANGED") { enabled ->
val enabled = enabled as Boolean;
}
// Moderator actions for lobby access:
conference.lobbyDenyAccess(participantId); // Deny access to a participant in the lobby
conference.lobbyApproveAccess(participantId); // Approve access to a participant in the lobby
// Lobby management methods:
conference.enableLobby() // Enable lobby mode for the conference (moderator only)
conference.disableLobby(); // Disable lobby mode for the conference (moderator only)
conference.isMembersOnly(); // Check if the conference is in members-only mode (lobby enabled)

// Initiate a SIP video call
conference.startSIPVideoCall("[email protected]", "display name"); // Start a SIP video call with the specified address and display name
// Terminate a SIP video call
conference.stopSIPVideoCall("[email protected]");
// Event listeners for SIP gateway state changes
conference.addEventListener("VIDEO_SIP_GW_SESSION_STATE_CHANGED", (state)=>{
// Handle events when the SIP gateway session state changes (on, off, pending, retrying, failed)
}
// Event listener for SIP gateway availability changes
conference.addEventListener("VIDEO_SIP_GW_AVAILABILITY_CHANGED"){ status ->
// Handle events when the SIP gateway availability changes (busy or available)
}conference.addEventListener(event: "USER_STATUS_CHANGED"){ id, status ->
val id = id as! String;
val status = status as!
// - id: receiver's user id
// - status: "ringing", "busy", "rejected", "connected", "expired"
};Unity is a cross-platform game engine developed by Unity Technologies.
It is particularly popular for iOS and Android mobile game development and used for games such as Pokémon Go, Monument Valley, Call of Duty: Mobile, Beat Saber, and Cuphead.
Sariska Media provides powerful Unity APIs for developing real-time applications.
You can integrate audio/video, live streaming, cloud recording, transcriptions, language translation, and many other services on the fly.
Switch Platform to either Android, iOS, or WebGL
Go to Player Settings and disable Multi-Threaded Rendering
Add OpenGLES2 and OpenGLES3 in Graphics APIs
Set Minimum API Level to Android 5.1 ‘Lollipop’ (API level 22)
Clone Unity Jar Resolver from https://github.com/SariskaIO/Sariska-Unity-Jar-Resolver
In your Unity project, go to Assets -> Import Package -> Custom Package
Select "external-dependency-manager-1.2.169.unitypackage" from the cloned repo and import all.
Add the external-dependency-manager-*.unitypackage to your plugin project (assuming you are developing a plugin).
Copy and rename the file into your plugin and add the dependencies your plugin requires.
Below is an example of the XML file used to import the Sariska Media Transport SDK for Android and iOS.
The latest stable version is “io.sariska:sariska-media-transport:5.2.1-exp”
Create two canvases in a scene, one each for the local and remote clients
Create a script to access Sariska Media Transport plugin functions.
Declare Raw images wherein the video will be embedded
Declare Textures and Texture pointers for the two video streams:
Get the texture pointers from the declared Textures, and attach the textures to the declared raw images:
After the SDK is added to your project, first initialize the SDK before proceeding with the setting up for the video call.
The SetupLocalStream method in SariskaMediaUnitySdk gives you the option to set up a stream with a set of parameters, which lets you choose between an audio or a video call, and the resolution of the video in case of a video call.
Additionally, the method requires the user to send texture pointers for both textures defined by the user.
In order to enter the room and start a call, the create connection method lets you give a room name and user name as parameters. Once these parameters are sent to the SDK, it automatically creates a JWT token for the user and establishes a conference for people to join it.
A local participant can mute or unmute their audio by using the following methods.
A local participant can mute or unmute their video by using the following methods.
A participant can switch the camera between the front and back by calling the switch camera method. By default, the video call initializes with the front camera open.
A moderator can lock and unlock a room by calling the below two methods. While locking the room, the moderator has to provide a password in the form of a string.
The audio output can be changed to the speaker and turned off by calling the OnSpeaker method for switching to speaker and OffSpeaker for switching to the default output.
The GetParticipantCount method returns the number of participants present in the meeting at the time the method is called.
The GetDominantSpeaker method returns the ID of the meeting's dominant speaker as a string.
A basic demo of drawing remote video tracks from sariska-media-transport on objects in Unity can be found at .
Create a script to access Sariska Media Transport plugin functions.
Declare the external method “UpdateTextureFromVideoElement”, which takes a textureId generated by Unity as an argument.
The script's Start function creates a texture and attaches it to a material on the object. The script's Update function calls a JS plugin method, passing a pointer to the texture.
The .jslib plugin files are responsible for acting as a bridge between Javascript and Unity C# scripts.
The JS plugin method uses texSubImage2D to copy pixels from the video element of the track onto the texture. In a real application, you would identify the remote tracks (e.g. by participant ID) and paint each one on a separate object.
The demo application showcases the usage of three participants.
git clone https://github.com/SariskaIO/Sariska-Unity-Jar-Resolver

<?xml version="1.0" encoding="UTF-8" ?>
<dependencies>
<iosPods>
<iosPod name="sariska-media-transport" addToAllTargets="false">
<sources>
<source>https://github.com/SariskaIO/sariska-ios-sdk-releases</source>
</sources>
</iosPod>
</iosPods>
<androidPackages>
<repositories>
<repository>https://maven.pkg.github.com/SariskaIO/maven-repository</repository>
</repositories>
<androidPackage spec="io.sariska:sariska-media-transport:5.2.1-exp"></androidPackage>
</androidPackages>
</dependencies>

public class ExternalTexturePlugin : MonoBehaviour
{
...
[SerializeField] private RawImage localImage;
[SerializeField] private RawImage remoteImage;
...
}

public class ExternalTextureSecond : MonoBehaviour
{
...
[SerializeField] private RawImage localImage;
[SerializeField] private RawImage remoteImage;
private Texture2D localTexture2D;
private Texture2D remoteTexture2D;
private IntPtr _nativeTexturePointerLocal;
private IntPtr _nativeTexturePointerRemote;
...
}

public class ExternalTextureSecond : MonoBehaviour
{
...
void Start() {
_nativeTexturePointerLocal = localTexture2D.GetNativeTexturePtr();
_nativeTexturePointerRemote = remoteTexture2D.GetNativeTexturePtr();
localImage.texture = localTexture2D;
remoteImage.texture = remoteTexture2D;
}
...
}

using Plugins.SariskaMediaUnitySdk;

SariskaMediaUnitySdk.InitSariskaMediaTransport();

// void SetupLocalStream(audio, video, resolution, localTexturePointer, remoteTexturePointer)
SariskaMediaUnitySdk.SetupLocalStream(true, true, 180, localTexturePointer, remoteTexturePointer);

SariskaMediaUnitySdk.CreateConnection(roomName, userName);

// To mute audio
SariskaMediaUnitySdk.MuteAudio();
// To unmute audio
SariskaMediaUnitySdk.UnMuteAudio();

// To mute video
SariskaMediaUnitySdk.MuteVideo();
// To unmute video
SariskaMediaUnitySdk.UnMuteVideo();

// To switch between camera
SariskaMediaUnitySdk.SwitchCamera();

// Lock a room with a password
SariskaMediaUnitySdk.LockRoom(password);
// Unlock a room
SariskaMediaUnitySdk.UnlockRoom();

// Speaker on
SariskaMediaUnitySdk.OnSpeaker();
// Speaker off
SariskaMediaUnitySdk.OffSpeaker();

// Get Participant count
// hidden, if assigned true, counts hidden participants as well
bool hidden = true;
int participantCount = SariskaMediaUnitySdk.GetParticipantCount(hidden);

// Returns the id of the dominant speaker
string participantId = SariskaMediaUnitySdk.GetDominantSpeaker();

SariskaMediaTransport.init();

connection = new SariskaMediaTransport.JitsiConnection(token, {room-name});
connection.addEventListener(SariskaMediaTransport.events.connection.CONNECTION_ESTABLISHED, onConnectionSuccess);
connection.connect();

room = connection.initJitsiConference();
room.on(SariskaMediaTransport.events.conference.CONFERENCE_JOINED, onConferenceJoined);
room.on(SariskaMediaTransport.events.conference.TRACK_ADDED, onRemoteTrack);
room.on(SariskaMediaTransport.events.conference.USER_JOINED, id => { remoteTracks[id] = []; });
room.on(SariskaMediaTransport.events.conference.USER_LEFT, onUserLeft);
room.join();
SariskaMediaTransport.createLocalTracks({devices: ["audio", "video"]})
  .then(onLocalTracks);

function onLocalTracks(tracks) {
localTracks = tracks;
if (isJoined) {
for (let i = 0; i < localTracks.length; i++) {
room.addTrack(localTracks[i]);
}
}
for(var i=0;i<tracks.length;i++){
if (tracks[i].getType() == 'video') {
const key = "local";
window.videoElements[key] = document.createElement('video');
window.videoElements[key].autoplay = true;
tracks[i].attach(window.videoElements[key]);
}
}

function onConferenceJoined() {
isJoined = true;
for (let i = 0; i < localTracks.length; i++) {
room.addTrack(localTracks[i]);
}
}

function onRemoteTrack(track) {
if (track.isLocal()) {
return;
}
const participantId = track.getParticipantId();
if (!remoteTracks[participantId]) {
remoteTracks[participantId] = [];
}
remoteTracks[participantId].push(track);
if (track.getType() == 'video') {
// Video elements just get stored, they're accessed from Unity.
const key = "participant-" + participantId;
window.videoElements[key] = document.createElement('video');
window.videoElements[key].autoplay = true;
track.attach(window.videoElements[key]);
}
else {
// Audio elements get added to the DOM (can be made invisible with CSS) so that the audio plays back.
const audioElement = document.createElement('audio');
audioElement.autoplay = true;
audioElement.id = "audio-" + participantId;
document.body.appendChild(audioElement);
track.attach(audioElement);
}
}

[DllImport("__Internal")]
private static extern void UpdateTextureFromVideoElement(int textureId);

void Update() {
if (local == 0)
{
textureId = texture.GetNativeTextureID();
}
UpdateTextureFromVideoElement(textureId);
}

mergeInto(LibraryManager.library, {
UpdateTextureFromVideoElement: function (textureId) {
const videoElement = Object.values(window.videoElements)[0];
if (!videoElement) {
console.log("no video element");
return;
}
const texture = GL.textures[textureId];
if (!texture) {
console.log("no texture for id: " + textureId);
return;
}
GLctx.bindTexture(GLctx.TEXTURE_2D, texture);
GLctx.texSubImage2D(
GLctx.TEXTURE_2D,
0, // level
0, // x offset
0, // y offset
videoElement.videoWidth,
videoElement.videoHeight,
GLctx.RGB,
GLctx.UNSIGNED_BYTE,
videoElement
);
  }
});

Sariska Media provides powerful Java APIs for developing real-time applications.
You can integrate audio/video, live streaming, cloud recording, transcriptions, language translation, and many other services on the fly.
This API documentation describes all features supported by sariska-media-transport and should cover most use cases.
In your project, add the Maven repository https://github.com/SariskaIO/sariska-maven-repository/raw/master/releases and the dependency io.sariska:sariska-media-transport into your build.gradle files.
The repository typically goes into the build.gradle file in the root of your project:
In recent versions of Android Studio, allprojects{} might not be present in build.gradle. In that case, the repository goes into the settings.gradle file in the root of your project:
Dependency definitions belong in the individual module build.gradle files:
After you install the SDK, perform initial setup tasks by running initializeSdk().
WebSockets are ideal for keeping a single, persistent session. Unlike HTTPS polling, WebSocket messages are delivered almost immediately. To start using the media services, the first step is to create a Media WebSocket connection.
Once you have your connection established, the next step is to create a conference. Sariska is backed by the Jitsi architecture.
A MediaStream consists of zero or more MediaStreamTrack objects, representing various audio or video tracks.
Each MediaStreamTrack may have one or more channels. The channel represents the smallest unit of a media stream, such as an audio signal associated with a given speaker, like left or right in a stereo audio track. Here we mostly talk about track.
This will be your most basic conference call. However, we recommend following up with the two further steps to add customized features to enhance your experience.
Note: You don't need an audio element to play sound; audio plays in conjunction with the video stream.
The moderator of the meeting controls and gatekeeps the participants. The moderator has exclusive control of the meeting.
If you wish to have a moderator, pass the moderator value as true while generating your token. Moderator has the following permissions:
Ability to add a password to a room
Ability to grant the moderator role to non-moderators
Ability to kick non-moderators or other moderators
Ability to mute participants
Use the following code to publish your call.
That's it! You are done with a basic conference call. Follow the guide below for more features.
Sariska-media-transport comes with pre-configured events that help you improve your product and the overall consumer experience.
Few popular events:
User left
User joined
Conference duration
Camera duration
We will be updating the list of features soon.
You can easily detect the active or dominant speaker. You could choose to stream only their video, saving costs and giving better resolution to the others. This could be a use case for one-way streaming, such as virtual concerts.
The idea is to select a subset of N participants whose video is shown and stop the video from the others. We dynamically and automatically adjust the set of participants shown according to who speaks; effectively we only show video for the last N people to have spoken.
Set Local Participant Property
Get participant count
Note: Hidden participants are generally bots that join the conference along with the actual participants, for example the recorder, transcriber, or pricing agent.
Get all participants in conference
Get all participants in conference without hidden participants
Pin/Select participant
Select/Pin Multiple Participants
Access local user details directly from conference
Get all remote tracks
Get all local tracks
Grant Owner
Except for the room creator, the rest of the users have a participatory role. You can grant them owner rights with the following code.
Revoke Owner
To revoke owner rights from a participant, use the following code.
Lock room
A moderator can lock a room with a password. Use the code as follows.
Unlock room
Start Screen Sharing
A participant supports two tracks at a time: audio and video. Screen sharing (desktop) is also a type of video track. If you need screen sharing along with the speaker's video, you need to have Presenter mode enabled.
Start Transcription
Stop Transcription
Mute/Unmute Local participant
Mute Remote participant
The moderator can mute any remote participant.
The SDK is already configured to automatically rejoin and leave when the internet connection fluctuates.
Start peer-to-peer mode
Sariska automatically switches to peer-to-peer mode when the conference has exactly two participants. You can, however, still force a switch to peer-to-peer mode.
Note: Conferences running in peer-to-peer mode are not charged as long as the TURN server is not used.
Stop peer-to-peer mode
To monitor your WebRTC application, simply integrate the call stats or build your own by checking out the RTC Stats section.
Join Silent (no audio will be sent/received)
Join the conference in silent mode; no audio is sent or received.
Join Muted
To start a conference with already muted options.
Stream to YouTube
You can get the YouTube stream key manually by logging in to your YouTube account, or use the Google OAuth API.
Stream to Facebook
You can get the Facebook streamId manually by logging in to your Facebook account, or use the Facebook OAuth API.
Stream to Twitch
Stream to any RTMP server
Listen for RECORDER_STATE_CHANGED event to know live streaming status
Stop Live Streaming
Stop Cloud Recording
Dial-in(PSTN)
Dial-out(PSTN)
To enable the waiting room/lobby feature, check out the APIs below.
One-to-one calling is a more synchronous way of calling, where you deal with things like:
Calling someone even if their app is closed or in the background
Play a busy tone if a user is busy on another call or disconnected your call
Play ringtone/ringback/dtmftone
This is similar to how WhatsApp works.
Make an HTTP call to the Sariska server
Send push notifications to callee using your Firebase or APNS account
Callee now reads the push notification using ConnectionService or CallKit even if the app is closed or in the background
The callee can update their status back to the caller just by making an update HTTP call; there is no need to join the conference via the SDK
Since the caller has already joined the conference using the SDK, they can easily get the status just by listening to the USER_STATUS_CHANGED event
After the callee has joined the conference, the rest of the steps are the same as for a normal conference call
Calendar Sync
Now you can programmatically start scheduling a meeting with google/microsoft calendar.
This integration adds the /sariska slash command for your team so that you can start a video conference in your channel, making it easy for everyone to just jump on the call. The slash command, /sariska, will drop a conference link in the channel for all to join.
Mentioning one or more teammates, after /sariska, will send personalized invites to each user mentioned. Check out how it is integrated .
Low-level logging on peer connection API calls and periodic getStats calls for analytics/debugging purposes. Make sure you have passed RTCstats WebSocket URL while initializing the conference. Check out how to configure RTCStats WebSocket Server .
Ability to make everyone see the moderator video (Everyone follows me)
Ability to make participants join muted (Everyone starts muted)
Ability to make participants join without video (Everyone starts hidden)
Ability to enable/disable the lobby room
Ability to approve join/knocking requests (when the lobby is enabled)
When the moderator leaves, a new one is selected automatically
Audio track duration
Video track duration
Recording started
Recording stopped
Transcription started
Transcription stopped
Local Recording started
Local Recording stopped
Speaker Stats
allprojects {
repositories {
maven {
url "https://github.com/SariskaIO/sariska-maven-repository/raw/master/releases"
}
google()
mavenCentral()
maven { url 'https://www.jitpack.io' }
}
}

dependencyResolutionManagement {
repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
repositories {
google()
mavenCentral()
maven {
url "https://github.com/SariskaIO/sariska-maven-repository/raw/master/releases"
}
maven {
url "https://maven.google.com"
}
}
}

dependencies {
// (other dependencies)
implementation 'io.sariska:sariska-media-transport:5.4.9'
}

import io.sariska.sdk.SariskaMediaTransport;
import io.sariska.sdk.Connection;
import io.sariska.sdk.Conference;
import io.sariska.sdk.JitsiRemoteTrack;
import io.sariska.sdk.JitsiLocalTrack;
import io.sariska.sdk.Participant;
SariskaMediaTransport.initializeSdk();

String token = {your-token};
Connection connection = SariskaMediaTransport.JitsiConnection(token, "roomName", false);
// set isNightly true for latest updates on the features else build will point to stable version
connection.addEventListener("CONNECTION_ESTABLISHED" ()->{
});
connection.addEventListener("CONNECTION_FAILED", (error) -> {
if (error === "PASSWORD_REQUIRED") {
// token is expired
connection.setToken(token) // set a new token
}
});
connection.addEventListener("CONNECTION_DISCONNECTED", () -> {
});
connection.connect();

Conference conference = connection.initJitsiConference(options);
conference.join();
//Additional options initJitsiConference accepts to enable more feature
// startAudioMuted: true // to join audio muted
// startVideoMuted: true // to join video muted
// startSilent: true // to start conference in silent mode, no audio received/sent
// rtcstatsServer: "" // send data to rtcstats server
// callStatsID: "" // callStatsID to enable callstats
// callStatsSecret: "" // callstats secret
// channelLastN: 10

Bundle options = new Bundle();
options.putBoolean("audio", true);
options.putBoolean("video", true);
options.putInt("resolution", 240); // 180, 240, 360, vga, 480, qhd, 540, hd, 720, fullhd, 1080, 4k, 2160
// options.putBoolean("desktop", true); for screen sharing
// options.putString("facingMode", "user"); user or environment
// options.putString("micDeviceId", "mic_device_id");
// options.putString("cameraDeviceId", "camera_device_id");
// options.putString("minFps", 20);
// options.putString("maxFps", 24);
SariskaMediaTransport.createLocalTracks(options, tracks -> {
localTracks = tracks;
});

runOnUiThread(() -> {
for (JitsiLocalTrack track : localTracks) {
if (track.getType().equals("video")) {
WebRTCView view = track.render();
view.setObjectFit("cover");
mLocalContainer.addView(view);
}
}
})
// WebRTCView class provides additional methods to manage View styling
// view.setMirror( mirror)->true or false if you want to mirror your video to other participants
// view.setObjectFit("cover") -> can be "contain" or "cover"
// view.setZOrderMediaOverlay(0) -> can be 0 or 1
// view.setZOrderOnTop(0) -> 0, 1, 2

conference.addEventListener("USER_JOINED", (id, participant)=>{
// String id = (String) id
// Participant participant = (Participant) participant
// joined Participant class has the following popular properties which you can use to maintain UI states
// avatar
// email
// moderator
// audioMuted
// videoMuted
// displayName
// role
// status
// hidden
// botType
// Generally participants are bots like transcriber, recorder
});

for (JitsiLocalTrack track : localTracks) {
conference.addTrack(track);
}conference.addEventListener("TRACK_ADDED", (track) -> {
JitsiRemoteTrack track = (JitsiRemoteTrack) track;
runOnUiThread(() -> {
if (track.getType().equals("video")) {
WebRTCView view = track.render();
view.setObjectFit("cover");
mRemoteContainer.addView(view);
}
});
});

// you can start tracking events just by listening
conference.startTracking();

conference.addEventListener("DOMINANT_SPEAKER_CHANGED", (id)=> {
    //String id = (String) id; dominant speaker id
});

// to listen for last n speakers changed event
conference.addEventListener("LAST_N_ENDPOINTS_CHANGED", (leavingEndpointIds, enteringEndpointIds)=>{
// String[] leavingEndpointIds = ( String[] ) leavingEndpointIds; //Array of ID's of users leaving lastN
// String[] enteringEndpointIds = ( String[] ) enteringEndpointIds; //Array of ID's of users entering lastN
});
// to set last n speakers mid-call, or you can pass the option during conference initialization
conference.setLastN(10)

// to set local participant property
conference.setLocalParticipantProperty(key, value);
// name is a string
// value can be a string or an object
// to remove local participant property
conference.removeLocalParticipantProperty(key)
// to get local participant property
conference.getLocalParticipantProperty(key)
// this notifies everyone in the conference of the following PARTICIPANT_PROPERTY_CHANGED event
conference.addEventListener("PARTICIPANT_PROPERTY_CHANGED", (participant, key,oldValue, newValue) => {
});

conference.getParticipantCount();
// pass boolean true if you need participant count including hidden participants

conference.getParticipants(); // list of all participants

conference.getParticipantsWithoutHidden(); // list of all participants without hidden participants

conference.selectParticipant(participantId)
// Elects the participant with the given id to be the selected participant in order to receive higher video quality (if simulcast is enabled).

conference.selectParticipants(participantIds) // string array of participant Id's
// Elects the participants with the given ids to be the selected participants in order to receive higher video quality (if simulcast is enabled).

conference.isHidden()
conference.getUserId()
conference.getUserRole()
conference.getUserEmail()
conference.getUserAvatar()
conference.getUserName()

conference.setSubject(subject)

conference.getRemoteTracks(); // get all remote tracks

conference.getLocalTracks()

// notifies that a participant has been kicked from the conference by a moderator
conference.addEventListener("KICKED", (id)=> {
//String id = (String) id; id of kicked participant
});
// notifies that moderator has been kicked from the conference by another moderator
conference.addEventListener("PARTICIPANT_KICKED", (actorParticipant, kickedParticipant, reason)=>{
//Participant actorParticipant = (Participant) actorParticipant;
//Participant kickedParticipant = (Participant) kickedParticipant;
//String reason = (String) reason; moderator has to give reason for kick
})
// method to kick out a participant
conference.kickParticipant(id) // participant id

conference.grantOwner(id) // participant id
// listen for role-changed event
conference.addEventListener("USER_ROLE_CHANGED", (id, role) =>{
//String id = (String) id; id of participant
//String role = (String) role; new role of user
if (conference.getUserId().equals(id)) {
// My role changed, new role: role;
} else {
// Participant role changed: role;
}
});
conference.revokeOwner(id) // participant id

// to change your display name
conference.setDisplayName(name);
// Listens for change display name event if changed by anyone in the conference
conference.addEventListener("DISPLAY_NAME_CHANGED", (id, displayName)=>{
// String id = (String) id;
// String displayName = (String) displayName;
});

// lock your room with a password
conference.lock(password); // set password for the conference; returns Promise

conference.unlock();

// requesting subtitles
conference.setLocalParticipantProperty("requestingTranscription", true);
// if you want to request language translation also
conference.setLocalParticipantProperty("translation_language", 'hi'); // hi for hindi
// now listen for subtitles received event
conference.addEventListener("SUBTITLES_RECEIVED", (id, name, text)=> {
// String id = (String) id; id of transcript message
// String name = (String) name; name of speaking participant
// String text = (String) text; // spoken text
});
// stop requesting subtitles
conference.setLocalParticipantProperty("requestingTranscription", false);
// supported list of language codes
// "en": "English",
// "af": "Afrikaans",
// "ar": "Arabic",
// "bg": "Bulgarian",
// "ca": "Catalan",
// "cs": "Czech",
// "da": "Danish",
// "de": "German",
// "el": "Greek",
// "enGB": "English (United Kingdom)",
// "eo": "Esperanto",
// "es": "Spanish",
// "esUS": "Spanish (Latin America)",
// "et": "Estonian",
// "eu": "Basque",
// "fi": "Finnish",
// "fr": "French",
// "frCA": "French (Canadian)",
// "he": "Hebrew",
// "hi": "Hindi",
// "mr":"Marathi",
// "hr": "Croatian",
// "hu": "Hungarian",
// "hy": "Armenian",
// "id": "Indonesian",
// "it": "Italian",
// "ja": "Japanese",
// "kab": "Kabyle",
// "ko": "Korean",
// "lt": "Lithuanian",
// "ml": "Malayalam",
// "lv": "Latvian",
// "nl": "Dutch",
// "oc": "Occitan",
// "fa": "Persian",
// "pl": "Polish",
// "pt": "Portuguese",
// "ptBR": "Portuguese (Brazil)",
// "ru": "Russian",
// "ro": "Romanian",
// "sc": "Sardinian",
// "sk": "Slovak",
// "sl": "Slovenian",
// "sr": "Serbian",
// "sq": "Albanian",
// "sv": "Swedish",
// "te": "Telugu",
// "th": "Thai",
// "tr": "Turkish",
// "uk": "Ukrainian",
// "vi": "Vietnamese",
// "zhCN": "Chinese (China)",
// "zhTW": "Chinese (Taiwan)"Bundle options = new Bundle();
options.putBoolean("desktop", true);
JitsiLocalTrack videoTrack = localTracks[1];
SariskaMediaTransport.createLocalTracks(options, tracks -> {
conference.replaceTrack(videoTrack, tracks[0]);
});
conference.sendMessage("message"); // group
conference.sendMessage("message", participantId); // to send private message to a participant
// Now participants can listen to message received event
conference.addEventListener("MESSAGE_RECEIVED", (message, senderId)=>{
// message sent by sender
// senderId
});

conference.startTranscriber();

conference.stopTranscriber();
// at the end of the conference, transcriptions will be available to download

track.mute() // to mute track
track.unmute() // to unmute track
track.isMuted() // to check if track is already muted

conference.muteParticipant(participantId, mediaType)
// mediaType can be audio or video

// New local connection statistics are received.
conference.addEventListener("LOCAL_STATS_UPDATED", (stats)=>{
});
// New remote connection statistics are received.
conference.addEventListener("REMOTE_STATS_UPDATED", (id, stats)=>{
});
conference.startP2PSession();

conference.stopP2PSession();

Bundle options = new Bundle();
options.putString("callStatsID", "callstats-id");
options.putString("callStatsSecret", "callstats-secret");
Conference conference = connection.initJitsiConference(options);

Bundle options = new Bundle();
options.putBoolean("startSilent", true);
Conference conference = connection.initJitsiConference(options);

Bundle options = new Bundle();
options.putBoolean("startAudioMuted", true);
options.putBoolean("startVideoMuted", true);
Conference conference = connection.initJitsiConference(options);

Bundle options = new Bundle();
options.putString("broadcastId", "youtubeBroadcastID"); // put any string this will become part of your publish URL
options.putString("mode", "stream"); // here mode will be stream
options.putString("streamId", "youtubeStreamKey");
// to start live stream
conference.startRecording(options);

Bundle options = new Bundle();
options.putString("mode", "stream"); // here mode will be stream
options.putString("streamId", "rtmps://live-api-s.facebook.com:443/rtmp/FB-4742724459088842-0-AbwKHwKiTd9lFMPy"); // facebook stream URL
// to start live stream
conference.startRecording(options);

Bundle options = new Bundle();
options.putString("mode", "stream"); // here mode will be stream
options.putString("streamId", "rtmp://live.twitch.tv/app/STREAM_KEY"); // twitch stream URL
// to start live stream
conference.startRecording(options);

Bundle options = new Bundle();
options.putString("mode", "stream"); // here mode will be stream
options.putString("streamId", "rtmps://rtmp-server/rtmp"); // RTMP server URL
// to start live stream
conference.startRecording(options);

conference.addEventListener("RECORDER_STATE_CHANGED", (sessionId, mode, status)=>{
String sessionId = (String) sessionId; // sessionId of live streaming session
String mode = (String) mode; // mode will be stream
String status = (String) status; // status of live streaming session it can be on, off or pending
});

conference.stopRecording(sessionId);

// Config for object-based storage. AWS S3, Google Cloud Storage, Azure Blob Storage, and other S3-compatible cloud providers are supported.
// Log in to your Sariska dashboard to set your credentials; we will upload all your recordings and transcriptions.
Bundle options = new Bundle();
options.putString("mode", "file");
options.putString("serviceName", "s3");
// config options for dropbox
Bundle options = new Bundle();
options.putString("mode", "file");
options.putString("serviceName", "dropbox");
options.putString("token", "dropbox_oauth_token");
// to start cloud recording
conference.startRecording(options);
//listen for RECORDER_STATE_CHANGED event to know what is happening
conference.addEventListener("RECORDER_STATE_CHANGED", (sessionId, mode, status)=>{
String sessionId = (String) sessionId; // sessionId of cloud recording session
String mode = (String) mode; // here mode will be file
String status = (String) status; // status of cloud recording session it can be on, off or pending
});

conference.stopRecording(sessionId)
String phonePin = conference.getPhonePin();
String phoneNumber = conference.getPhoneNumber();
// Share this Phone Number and Phone Pin with anyone so they can join the conference call without internet.

conference.dial(phoneNumber)
// dialing someone to join the conference using their phone number

// to join a lobby
conference.joinLobby(displayName, email);
// This notifies everyone at the conference of the following events
conference.addEventListener("LOBBY_USER_JOINED", (id, name) => {
});
conference.addEventListener("LOBBY_USER_UPDATED", (id, participant)=> {
});
conference.addEventListener("LOBBY_USER_LEFT", id=> {
});
conference.addEventListener("MEMBERS_ONLY_CHANGED", enabled=> {
});
// now a conference moderator can allow/deny
conference.lobbyDenyAccess(participantId); //to deny lobby access
conference.lobbyApproveAccess(participantId); // to approve lobby mode
// other methods
conference.enableLobby(); //to enable lobby mode in the conference call moderator only
conference.disableLobby(); //to disable lobby mode in the conference call moderator only
conference.isMembersOnly(); // whether the conference room is members-only, i.e. lobby mode is enabled

// start sip gateway session
conference.startSIPVideoCall("[email protected]", "display name"); // your sip address and display name
// stop sip call
conference.stopSIPVideoCall('[email protected]');
// after you create your session now you can track the state of your session
conference.addEventListener("VIDEO_SIP_GW_SESSION_STATE_CHANGED", (state)=>{
// String state = (String) state;
// state can be on, off, pending, retrying, failed
});
// check if the gateway is busy
conference.addEventListener("VIDEO_SIP_GW_AVAILABILITY_CHANGED", (status)=>{
// String status = (String) status;
// status can be busy or available
});

URL: https://api.sariska.io/api/v1/media/poltergeist/create?room=sessionId&token=your-token&status=calling&user=12234&name=Kavi
Method: GET
where parameters are
* room: current session sessionId of the room you joined to invite the callee
* token: your jwt token
* status: calling
* user: callee user id
* name: callee user name
* domain: 'sariska.io'

URL: https://api.sariska.io/api/v1/media/poltergeist/update?room=sessionId&token=your-token&status=accepted&user=12234&name=Kavi
Method: GET
where parameters are
* room: current session sessionId of the room you joined to invite the callee
* token: callee jwt token
* status: accepted or rejected
* user: callee user-id
* name: callee user name
* domain: 'sariska.io'

conference.addEventListener("USER_STATUS_CHANGED", (id, status) => {
String id = (String) id; // id of callee
String status = (String) status; // status can be ringing, busy, rejected, connected, expired
// ringing if callee changed status to ringing
// busy if callee is busy on another call
// rejected if callee has rejected your call
// connected if callee has accepted your call
// expired if the callee is not able to answer within 40 seconds; an expired status will be triggered by sariska
});
TRACE
DEBUG
INFO
LOG
WARN
ERROR

Sariska Media provides powerful Dart APIs for developing real-time applications.
You can integrate audio/video, live streaming, cloud recording, transcriptions, language translation, virtual backgrounds, and many other services on the fly.
This API documentation describes all features supported by sariska-media-transport and should cover most use cases.
The command above will add this to the pubspec.yaml file in your project (you can do this manually):
After you install the SDK, perform initial setup tasks by running initializeSdk().
WebSockets are ideal for keeping a single, persistent session. Unlike HTTPS polling, WebSocket messages are delivered almost immediately. To start using the media services, the first step is to create a Media WebSocket connection.
Once you have your connection established, the next step is to create a conference. Sariska is backed by the Jitsi architecture.
A MediaStream consists of zero or more MediaStreamTrack objects, representing various audio or video tracks.
Each MediaStreamTrack may have one or more channels. The channel represents the smallest unit of a media stream, such as an audio signal associated with a given speaker, like left or right in a stereo audio track. Here we mostly talk about track.
This will be your most basic conference call. However, we recommend following up with the two further steps to add customized features to enhance your experience.
Note: You don't need an audio element to play sound; audio plays in conjunction with the video stream.
The moderator of the meeting controls and gatekeeps the participants. The moderator has exclusive control of the meeting.
If you wish to have a moderator, pass the moderator value as true while generating your token. Moderator has the following permissions:
Ability to add a password to a room
Ability to grant the moderator role to non-moderators
Ability to kick non-moderators or other moderators
Ability to mute participants
Use the following code to publish your call.
That's it! You are done with a basic conference call. Follow the guide below for more features.
Sariska-media-transport comes with pre-configured events that help you improve your product and the overall consumer experience.
Few popular events:
User left
User joined
Conference duration
Camera duration
We will be updating the list of features soon.
You can easily detect the active or dominant speaker. You could choose to stream only their video, saving costs and giving better resolution to the others. This could be a use case for one-way streaming, such as virtual concerts.
The idea is to select a subset of N participants whose video is shown and stop the video from the others. We dynamically and automatically adjust the set of participants shown according to who speaks; effectively we only show video for the last N people to have spoken.
Set Local Participant Property
Get participant count
Note: Hidden participants are generally bots that join the conference along with the actual participants, for example the recorder, transcriber, or pricing agent.
Get all participants in conference
Get all participants in conference without hidden participants
Pin/Select participant
Select/Pin Multiple Participants
Access local user details directly from conference
Get all remote tracks
Get all local tracks
Grant Owner
Except for the room creator, the rest of the users have a participatory role. You can grant them owner rights with the following code.
Revoke Owner
To revoke owner rights from a participant, use the following code.
Lock room
A moderator can lock a room with a password. Use the code as follows.
Unlock room
Start Screen Sharing
A participant supports two tracks at a time: audio and video. Screen sharing (desktop) is also a type of video track. If you need screen sharing along with the speaker's video, you need to have Presenter mode enabled.
Start Transcription
Stop Transcription
Mute/Unmute Local participant
Mute Remote participant
The moderator can mute any remote participant.
The SDK is already configured to automatically rejoin and leave when the internet connection fluctuates.
Start peer-to-peer mode
Sariska automatically switches to peer-to-peer mode when the conference has exactly two participants. You can, however, still force a switch to peer-to-peer mode.
Note: Conferences running in peer-to-peer mode are not charged as long as the TURN server is not used.
Stop peer-to-peer mode
To monitor your WebRTC application, simply integrate the callstats or build your own by checking out the RTC Stats section.
Join Silent (no audio will be sent/received)
Join the conference in silent mode; no audio is sent or received.
Join Muted
To start a conference with already muted options.
Stream to YouTube
You can get the YouTube stream key manually by logging in to your YouTube account, or use the Google OAuth API.
Stream to Facebook
You can get the Facebook streamId manually by logging in to your Facebook account, or use the Facebook OAuth API.
Stream to Twitch
Stream to any RTMP server
Listen for RECORDER_STATE_CHANGED event to know live streaming status
Stop Live Streaming
Stop Cloud Recording
Dial-in(PSTN)
Dial-out(PSTN)
To enable the waiting room/lobby feature, check out the APIs below.
One-to-one calling is a more synchronous way of calling, where you deal with things like:
Calling someone even if their app is closed or in the background
Play a busy tone if a user is busy on another call or disconnected your call
Play ringtone/ringback/dtmftone
This is similar to how WhatsApp works.
Make an HTTP call to the Sariska server
Send push notifications to callee using your Firebase or APNS account
Callee now reads the push notification using ConnectionService or CallKit even if the app is closed or in the background
The callee can update their status back to the caller just by making an update HTTP call; there is no need to join the conference via the SDK
Since the caller has already joined the conference using the SDK, they can easily get the status just by listening to the USER_STATUS_CHANGED event
After the callee has joined the conference, the rest of the steps are the same as for a normal conference call
Calendar Sync
Now you can programmatically start scheduling a meeting with google/microsoft calendar.
This integration adds the /sariska slash command for your team so that you can start a video conference in your channel, making it easy for everyone to just jump on the call. The slash command, /sariska, will drop a conference link in the channel for all to join.
Mentioning one or more teammates, after /sariska, will send personalized invites to each user mentioned. Check out how it is integrated .
Low-level logging on peer connection API calls and periodic getStats calls for analytics/debugging purposes. Make sure you have passed RTCstats WebSocket URL while initializing the conference. Check out how to configure RTCStats WebSocket Server .
Ability to make everyone see the moderator video (Everyone follows me)
Ability to make participants join muted (Everyone starts muted)
Ability to make participants join without video (Everyone starts hidden)
Ability to enable/disable the lobby room
Ability to approve join/knocking requests (when the lobby is enabled)
When the moderator leaves, a new one is selected automatically.
Audio track duration
Video track duration
Recording started
Recording stopped
Transcription started
Transcription stopped
Local Recording started
Local Recording stopped
Speaker Stats
dart pub add sariska_media_flutter_sdk

flutter pub add sariska_media_flutter_sdk

dependencies:
  sariska_media_flutter_sdk: ^0.0.9

import 'package:sariska_media_flutter_sdk/Conference.dart';
import 'package:sariska_media_flutter_sdk/Connection.dart';
import 'package:sariska_media_flutter_sdk/JitsiLocalTrack.dart';
import 'package:sariska_media_flutter_sdk/JitsiRemoteTrack.dart';
import 'package:sariska_media_flutter_sdk/Participant.dart';
import 'package:sariska_media_flutter_sdk/SariskaMediaTransport.dart';
import 'package:sariska_media_flutter_sdk/WebRTCView.dart';
SariskaMediaTransport.initializeSdk();

var token = {your-token};
var connection = new SariskaMediaTransport.JitsiConnection(token, "roomName", isNightly);
// set isNightly true for latest updates on the features else build will point to stable version
connection.addEventListener("CONNECTION_ESTABLISHED", () {
});
connection.addEventListener("CONNECTION_FAILED", (error){
if (error == "PASSWORD_REQUIRED") {
// token is expired
connection.setToken(token) // set a new token
}
});
connection.addEventListener("CONNECTION_DISCONNECTED", () {
});
connection.connect();

var conference = connection.initJitsiConference();
conference.addEventListener("CONFERENCE_JOINED", () {
for (JitsiLocalTrack track in localtracks) {
conference.addTrack(track);
}
});
conference.addEventListener("TRACK_ADDED", (track) {
JitsiRemoteTrack remoteTrack = track;
for (JitsiLocalTrack track in localtracks){
if (track.getStreamURL() == remoteTrack.getStreamURL()){
return;
}
}
if (remoteTrack.getType() == "audio") {
return;
}
streamURL = remoteTrack.getStreamURL();
replaceChild(remoteTrack);
});
conference.addEventListener("TRACK_REMOVED", (track){
// Do cater to track removal
});
conference.join();

Map<String, dynamic> options = {};
options["audio"] = true;
options["video"] = true;
options["resolution"] = 240; // 180, 240, 360, vga, 480, qhd, 540, hd, 720, fullhd, 1080, 4k, 2160
SariskaMediaTransport.createLocalTracks(options,(tracks) {
localTracks = tracks;
});
// default values for frame rates are already configured; you don't need to set all options

var videoTrack = localTracks[1];
@override
Widget build(BuildContext context) {
return Container(
child: Stack(
children: [
Align(
alignment: Alignment.topLeft,
child: WebRTCView(videoTrack.getStreamURL())
)
]
)
);
}
//mirror = true or false,
//zOrder = 0, 1, 2
//objectFit = "cover" or "contain"for (JitsiLocalTrack track in localTracks) {
conference.addTrack(track);
}

@override
Widget build(BuildContext context) {
return Container(
child: Stack(
children: [
Align(
alignment: Alignment.topLeft,
child: SingleChildScrollView(
scrollDirection: Axis.horizontal,
child: Row(
children: remoteTracks.map((track) {
  if (track.getType() == "video") {
    return Container(
      width: 240,
      height: 360,
      child: WebRTCView(
        streamURL: track.getStreamURL(),
      ),
    );
  }
  return Container();
}).toList())
)
)
]
)
);
}
// you can start tracking events just by listening
conference.addEventListener("ANALYTICS_EVENT_RECEIVED", (payload){
// payload will have
// payload["name"] of event is string
// payload["action"] is string
// payload["actionSubject"] is string
// payload["source"] is string
// payload["attributes"] , JSON of all extra attributed of the event
})conference.addEventListener("DOMINANT_SPEAKER_CHANGED", (id){
// let id = as String; dominant speaker id
});

// to listen for last n speakers changed event
conference.addEventListener("LAST_N_ENDPOINTS_CHANGED", (leavingEndpointIds, enteringEndpointIds){
// let leavingEndpointIds = leavingEndpointIds as List<String>; //Array of ID's of users leaving lastN
// let enteringEndpointIds = as enteringEndpointIds as List<String>; //Array of ID's of users entering lastN
});
// to set last n speakers mid-call, or you can pass the option during conference initialization
conference.setLastN(10)

// to set local participant property
conference.setLocalParticipantProperty(key, value);
// name is a string
// value can be a string or an object
// to remove local participant property
conference.removeLocalParticipantProperty(key)
// to get local participant property
conference.getLocalParticipantProperty(key)
// this notifies everyone in the conference with the following PARTICIPANT_PROPERTY_CHANGED event
conference.addEventListener("PARTICIPANT_PROPERTY_CHANGED", (participant, key,oldValue, newValue){
});

conference.getParticipantCount();
// pass boolean true if you need participant count including hidden participants

conference.getParticipants(); // list of all participants

conference.getParticipantsWithoutHidden(); // list of all participants without hidden participants

conference.selectParticipant(participantId)
// Elects the participant with the given id to be the selected participant in order to receive higher video quality (if simulcast is enabled).

conference.selectParticipants(participantIds) // string array of participant Id's
// Elects the participants with the given ids to be the selected participants in order to receive higher video quality (if simulcast is enabled).

conference.isHidden()
conference.getUserId()
conference.getUserRole()
conference.getUserEmail()
conference.getUserAvatar()
conference.getUserName()

conference.setSubject(subject)

conference.getRemoteTracks(); // get all remote tracks

conference.getLocalTracks()

// notifies that a participant has been kicked from the conference by a moderator
conference.addEventListener("KICKED", (id){
//let id = id as String; id of kicked participant
});
// notifies that moderator has been kicked by other moderator
conference.addEventListener("PARTICIPANT_KICKED", (actorParticipant, kickedParticipant, reason){
//let actorParticipant = actorParticipant as Participant;
//let kickedParticipant = kickedParticipant as Participant;
//let reason = reason as String; reason for kick
})
// to kick participant
conference.kickParticipant(id); // participant id
// to kick moderator
conference.kickParticipant(id, reason); // participant id, reason for kick

conference.grantOwner(id) // participant id
// listen for role changed event
conference.addEventListener("USER_ROLE_CHANGED",( id, role){
// let id = id as String; id of participant
//let role = role as String; new role of user
if (conference.getUserId() == id) {
// My role changed, new role: role;
} else {
// Participant role changed: role;
}
});

conference.revokeOwner(id) // participant id

// to change your display name
conference.setDisplayName(name);
// Listens for change display name event if changed by anyone in the conference
conference.addEventListener("DISPLAY_NAME_CHANGED", (id, displayName){
// let id = id as String;
// let displayName = displayName as String;
});

// lock your room with a password
conference.lock(password); // set password for the conference; returns Promise

conference.unlock();

// requesting subtitles
conference.setLocalParticipantProperty("requestingTranscription", true);
// if you want to request language translation also
conference.setLocalParticipantProperty("translation_language", 'hi'); // hi for hindi
// now listen for subtitles received event
conference.addEventListener("SUBTITLES_RECEIVED" , (id, name, text){
// let id = id as String; id of transcript message
// let name = name as String; name of speaking participant
// let text = text as String; // spoken text
});
// stop requesting subtitles
conference.setLocalParticipantProperty("requestingTranscription", false);
// supported list of language codes
// "en": "English",
// "af": "Afrikaans",
// "ar": "Arabic",
// "bg": "Bulgarian",
// "ca": "Catalan",
// "cs": "Czech",
// "da": "Danish",
// "de": "German",
// "el": "Greek",
// "enGB": "English (United Kingdom)",
// "eo": "Esperanto",
// "es": "Spanish",
// "esUS": "Spanish (Latin America)",
// "et": "Estonian",
// "eu": "Basque",
// "fi": "Finnish",
// "fr": "French",
// "frCA": "French (Canadian)",
// "he": "Hebrew",
// "hi": "Hindi",
// "mr":"Marathi",
// "hr": "Croatian",
// "hu": "Hungarian",
// "hy": "Armenian",
// "id": "Indonesian",
// "it": "Italian",
// "ja": "Japanese",
// "kab": "Kabyle",
// "ko": "Korean",
// "lt": "Lithuanian",
// "ml": "Malayalam",
// "lv": "Latvian",
// "nl": "Dutch",
// "oc": "Occitan",
// "fa": "Persian",
// "pl": "Polish",
// "pt": "Portuguese",
// "ptBR": "Portuguese (Brazil)",
// "ru": "Russian",
// "ro": "Romanian",
// "sc": "Sardinian",
// "sk": "Slovak",
// "sl": "Slovenian",
// "sr": "Serbian",
// "sq": "Albanian",
// "sv": "Swedish",
// "te": "Telugu",
// "th": "Thai",
// "tr": "Turkish",
// "uk": "Ukrainian",
// "vi": "Vietnamese",
// "zhCN": "Chinese (China)",
// "zhTW": "Chinese (Taiwan)"Map<String, dynamic> options = {};
options["desktop"] = true;
let videoTrack = localTracks[1];
SariskaMediaTransport.createLocalTracks(options, (tracks){
conference.replaceTrack(videoTrack, tracks[0]);
});
conference.sendMessage("message"); // group
conference.sendMessage("message", participantId); // to send private message to a participant
// Now participants can listen to the message received event
conference.addEventListener("MESSAGE_RECEIVED", (message, senderId){
    // let message = message as String; message text
    // let senderId = senderId as String; senderId
});

conference.startTranscriber();

conference.stopTranscriber();
// at the end of the conference, transcriptions will be available to download

track.mute() // to mute track
track.unmute() // to unmute track
track.isMuted() // to check if track is already muted

conference.muteParticipant(participantId, mediaType)
// mediaType can be audio or video

// New local connection statistics are received.
conference.addEventListener("LOCAL_STATS_UPDATED", (stats){
});
// New remote connection statistics are received.
conference.addEventListener("REMOTE_STATS_UPDATED", (id, stats){
});
conference.startP2PSession();

conference.stopP2PSession();

Map<String, dynamic> options = {};
options["callStatsID"] = 'callstats-id';
options["callStatsSecret"] = 'callstats-secret';
var conference = connection.initJitsiConference(options);

Map<String, dynamic> options = {};
options["startSilent"] = true;
var conference = connection.initJitsiConference(options);

// to join a conference with already muted audio and video
Map<String, dynamic> options = {};
options["startAudioMuted"] = true;
options["startVideoMuted"] = true;
var conference = connection.initJitsiConference(options);

Map<String, dynamic> options = {};
options["broadcastId"] = "youtubeBroadcastID"; // put any string this will become part of your publish URL
options["mode"] = "stream"; // here mode will be stream
options["streamId"] = "youtubeStreamKey";
// to start the live stream
conference.startRecording(options);

Map<String, dynamic> options = {};
options["mode"] = "stream"; // here mode will be stream
options["streamId"] = "rtmps://live-api-s.facebook.com:443/rtmp/FB-4742724459088842-0-AbwKHwKiTd9lFMPy"; // facebook stream URL
// to start live-stream
conference.startRecording(options);

Map<String, dynamic> options = {};
options["mode"] = "stream"; // here mode will be stream
options["streamId"] = "rtmp://live.twitch.tv/app/STREAM_KEY";
// to start live stream
conference.startRecording(options);

Map<String, dynamic> options = {};
options["mode"] = "stream"; // here mode will be stream
options["streamId"] = "rtmps://rtmp-server/rtmp"; // RTMP server URL
// to start live stream
conference.startRecording(options);

conference.addEventListener("RECORDER_STATE_CHANGED", (sessionId, mode, status){
String sessionId = sessionId as String; // sessionId of live streaming session
String mode = mode as String; // mode will be stream
String status = status as String; // status of live streaming session it can be on, off or pending
});

conference.stopRecording(sessionId);

// Config for object-based storage. AWS S3, Google Cloud Storage, Azure Blob Storage, and other S3-compatible cloud providers are supported.
// Log in to your Sariska dashboard to set your credentials; we will upload all your recordings and transcriptions.
Map<String, dynamic> options = {};
options["mode"] = "file";
options["serviceName"] = "s3";
// config options for dropbox
Map<String, dynamic> options = {};
options["mode"] = "file";
options["serviceName"] = "dropbox";
options["token"] = "dropbox_oauth_token";
// to start cloud recording
conference.startRecording(options);
//listen for RECORDER_STATE_CHANGED event to know what is happening
conference.addEventListener("RECORDER_STATE_CHANGED", (sessionId, mode, status){
String sessionId = sessionId as String; // sessionId of cloud recording session
String mode = mode as String; // here mode will be file
String status = status as String; // status of cloud recording session it can be on, off or pending
});

conference.stopRecording(sessionId)
String phonePin = conference.getPhonePin();
String phoneNumber = conference.getPhoneNumber();
// Share this Phone Number and Phone Pin with anyone so they can join the conference call without internet.

conference.dial(phoneNumber)
// dialing someone to join the conference using their phone number

// to join a lobby
conference.joinLobby(displayName, email);
// This notifies everyone at the conference with the following events
conference.addEventListener("LOBBY_USER_JOINED", (id, name){
// let id = id as String;
// let name = name as String;
})
conference.addEventListener("LOBBY_USER_UPDATED", (id, participant){
// let id = id as String;
// let participant = participant as Participant;
})
conference.addEventListener("LOBBY_USER_LEFT", (id){
// let id = id as String;
})
conference.addEventListener("MEMBERS_ONLY_CHANGED", (enabled){
// let id = id as bool;
})
// now a conference moderator can allow/deny
conference.lobbyDenyAccess(participantId); //to deny lobby access
conference.lobbyApproveAccess(participantId); // to approve lobby mode
// other methods
conference.enableLobby() //to enable lobby mode in the conference call moderator only
conference.disableLobby(); //to disable lobby mode in the conference call moderator only
conference.isMembersOnly(); // whether the conference room is members-only, i.e. lobby mode is enabled

// start sip gateway session
conference.startSIPVideoCall("[email protected]", "display name"); // your sip address and display name
// stop sip call
conference.stopSIPVideoCall('[email protected]');
// after you create your session now you can track the state of your session
conference.addEventListener("VIDEO_SIP_GW_SESSION_STATE_CHANGED", (state){
// let state = state as String;
// state can be on, off, pending, retrying, failed
})
// check if gateway is busy
conference.addEventListener("VIDEO_SIP_GW_AVAILABILITY_CHANGED", (status){
// let status = status as String;
// status can be busy or available
})

URL: https://api.sariska.io/api/v1/media/poltergeist/create?room=sessionId&token=your-token&status=calling&user=12234&name=Kavi
Method: GET
where parameters are
* room: current session sessionId of the room you joined to invite the callee
* token: your jwt token
* status: calling
* user: callee user id
* name: callee user name
* domain: 'sariska.io'

URL: https://api.sariska.io/api/v1/media/poltergeist/update?room=sessionId&token=your-token&status=accepted&user=12234&name=Kavi
Method: GET
where parameters are
* room: current session sessionId of the room you joined to invite the callee
* token: callee jwt token
* status: accepted or rejected
* user: callee user id
* name: callee user name
* domain: 'sariska.io'

conference.addEventListener("USER_STATUS_CHANGED", (id, status){
let id = id as String; // id of callee
let status = status as String; // status can be ringing, busy, rejected, connected, expired
// ringing if callee changed status to ringing
// busy if callee is busy on another call
// rejected if callee has rejected your call
// connected if callee has accepted your call
// expired if the callee is not able to answer within 40 seconds; an expired status will be triggered by sariska
});
TRACE
DEBUG
INFO
LOG
WARN
ERROR

Unleash real-time audio/video, live streaming, cloud recording, transcriptions, language translation, and more in your web and mobile apps with the versatile Sariska Media JavaScript APIs.
Seamlessly integrate with various JavaScript frameworks (Vanilla, React, Angular, Vue, Electron, NW, React Native, and more).
Access a rich set of features for audio/video conferencing, live streaming, cloud recording, transcriptions, language translation, virtual backgrounds, and more.
Maintain persistent, low-latency connections for real-time data exchange.
With NPM
With CDN
Add a script tag to your HTML head
At the very beginning of your index.js file, insert the following import statement:
Kickstart the SDK with this simple command:
Create a persistent connection for real-time communication.
Create a Jitsi-powered conference for real-time audio and video
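A minimal sketch of these two steps, mirroring the JavaScript connection and conference code used in the Unity/WebGL example earlier in this documentation; token and roomName are placeholders you supply, and onConferenceJoined is your own handler.

SariskaMediaTransport.init();

const connection = new SariskaMediaTransport.JitsiConnection(token, roomName);
connection.addEventListener(SariskaMediaTransport.events.connection.CONNECTION_ESTABLISHED, onConnectionSuccess);
connection.connect();

function onConnectionSuccess() {
    const room = connection.initJitsiConference();
    room.on(SariskaMediaTransport.events.conference.CONFERENCE_JOINED, onConferenceJoined);
    room.join();
}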
Media Stream
A MediaStream is a collection of audio or video tracks, represented by MediaStreamTrack objects.
Each MediaStreamTrack can have multiple channels (e.g., left and right channels in a stereo audio track).
Capture Local Tracks
Define options:
Specify desired devices ("audio", "video", or "desktop").
Set preferred video resolution.
Optionally configure specific devices, frame rates, screen sharing options, and facing mode.
Create Local Tracks
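A sketch based on the createLocalTracks call shown in the Unity/WebGL example above; the option values here are illustrative and localTracks is assumed to be a variable you keep around.

SariskaMediaTransport.createLocalTracks({
    devices: ["audio", "video"], // "desktop" can be requested for screen sharing
    resolution: 720              // preferred resolution for the local video stream
}).then((tracks) => {
    localTracks = tracks;
});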
This event is triggered when a new user joins the conference. Moderators have exclusive control over meetings and can manage participants. To assign a moderator, set the moderator value to true when generating the token.
Make audio and video streams visible to others in the conference by publishing them using the following code:
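A sketch of the publish loop, mirroring the Unity/WebGL example above; room is the conference created earlier and localTracks holds the captured tracks.

for (let i = 0; i < localTracks.length; i++) {
    room.addTrack(localTracks[i]);
}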
Additional methods for remote tracks:
getType(): Returns the track type (audio, video, or desktop)
stream.toURL(): Returns the stream URL
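For example, remote tracks are typically handled in a TRACK_ADDED listener, as in the Unity/WebGL example earlier; attaching the element to document.body is just a placeholder for your own layout.

room.on(SariskaMediaTransport.events.conference.TRACK_ADDED, (track) => {
    if (track.isLocal()) {
        return; // only handle remote tracks here
    }
    if (track.getType() === "video") {
        const videoElement = document.createElement("video");
        videoElement.autoplay = true;
        track.attach(videoElement);
        document.body.appendChild(videoElement);
    }
});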
Sariska-media-transport offers pre-configured events to help you track and analyze user interactions, media usage, and overall performance. This data can be used to enhance your product, improve user experience, and make informed decisions.
Available Events
Here are some of the key events you can track:
User Actions:
User joined
User left
Media Usage:
Add Event Listener to Track Events
Sariska offers powerful features to enhance your application's capabilities. Find your desired feature using the search bar or explore below!
Identify the main speaker: Easily detect the active or dominant speaker in a conference. Choose to stream only their video for improved resolution and reduced bandwidth usage. Ideal for one-way streaming scenarios like virtual concerts.
Dynamically showcase recent speakers: Focus on the active conversation by displaying video only for the last N participants who spoke. This automatically adjusts based on speech activity, offering a more efficient and relevant view.
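A sketch of both ideas, assuming the JavaScript conference exposes the same DOMINANT_SPEAKER_CHANGED event and setLastN method documented in the Android and Flutter sections above; room is the conference object.

room.on(SariskaMediaTransport.events.conference.DOMINANT_SPEAKER_CHANGED, (id) => {
    // id of the participant who is currently the dominant speaker
});

// only receive video for the last 5 participants who have spoken
room.setLastN(5);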
Set local participant properties: Define additional information about participants beyond the default settings. This can include screen-sharing status, custom roles, or any other relevant attributes.
Get the total number of participants: Retrieve the complete participant count, including both visible and hidden members.
Access all participants: Obtain a list of all participants, including their IDs and detailed information.
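A sketch, assuming the JavaScript conference mirrors the participant methods shown in the Android and Flutter sections; the property key and value are arbitrary examples.

room.setLocalParticipantProperty("screenSharing", true); // any custom key/value

const total = room.getParticipantCount(true); // true also counts hidden participants (bots)
const participants = room.getParticipants();  // list of participant objects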
Advanced manipulation: You can directly access the conference object for more granular control over conference behavior.
Pin a single participant: Pin a specific participant to always receive their video, even when "last n" is enabled.
Pin multiple participants: Pin an array of participants to always receive their videos.
Retrieve the local user's ID: Get the ID of the local user.
Retrieve the local user's information: Get comprehensive details about the local user, including name, email, ID, and avatar.
Retrieve the local user's role: Get the role of the local user (For example, participant, moderator, owner).
Get all remote tracks: Retrieve a list of all remote tracks (audio and video) in the conference.
Get all local tracks: Retrieve a list of all local tracks (audio and video)
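A sketch, assuming the JavaScript conference object mirrors the method names used in the Android and Flutter sections above; the actual JavaScript names may differ slightly.

const userId = room.getUserId();
const userName = room.getUserName();
const userRole = room.getUserRole(); // e.g. participant, moderator, owner

const remoteTracks = room.getRemoteTracks(); // all remote audio/video tracks
const localTracks = room.getLocalTracks();   // all local audio/video tracks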
Split your conference meeting into smaller, focused groups with unique audio and video. Moderators can create rooms, assign participants, and reunite everyone seamlessly.
Access breakout rooms: Get an instance to manage breakout rooms.
Create a breakout room: Create a new breakout room with the specified subject.
Remove a breakout room: Remove the current breakout room (if applicable).
Check for breakout room status: Determine if the current room is a breakout room.
Send a participant to a breakout room: Move a participant to a specific breakout room.
Listen for participant kick events
Kick a participant
Kick a moderator
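A sketch of the kick-related calls above, assuming the same event and method names as the Android and Flutter sections; participantId and reason are your own values.

room.on(SariskaMediaTransport.events.conference.KICKED, (id) => {
    // the local user was kicked from the conference
});
room.on(SariskaMediaTransport.events.conference.PARTICIPANT_KICKED, (actor, kicked, reason) => {
    // another participant was kicked
});

room.kickParticipant(participantId);         // kick a participant
room.kickParticipant(participantId, reason); // kick a moderator, with a reason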
The room creator has a moderator role, while other users have a participatory role.
Grant owner rights
Listen for role changes
Revoke owner rights:
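A sketch of granting and revoking owner rights, assuming the same method and event names as the Android and Flutter sections; participantId is the target participant.

room.grantOwner(participantId); // promote a participant

room.on(SariskaMediaTransport.events.conference.USER_ROLE_CHANGED, (id, role) => {
    // role is the participant's new role
});

room.revokeOwner(participantId); // demote a participant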
Setting a new display name
Listen for display name changes
Lock room: Moderators can restrict access to the room with a password.
Unlock room: Removes any existing password restriction.
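A sketch, mirroring the lock/unlock calls from the Android and Flutter sections; password is any string you choose.

room.lock(password).then(() => {
    // room is now password protected
});

room.unlock(); // remove the password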
Request subtitles: Enable subtitles for spoken content.
Request language translation: Translate subtitles into a specific language.
Receive subtitles: Listen for incoming subtitles.
Stop subtitles: Disable subtitles.
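A sketch of the subtitle flow, mirroring the requestingTranscription and translation_language properties used in the Android and Flutter sections; the SUBTITLES_RECEIVED event name is assumed to be the same.

// request subtitles (and, optionally, translation) for the local participant
room.setLocalParticipantProperty("requestingTranscription", true);
room.setLocalParticipantProperty("translation_language", "hi"); // e.g. "hi" for Hindi

room.on(SariskaMediaTransport.events.conference.SUBTITLES_RECEIVED, (id, name, text) => {
    // text is the spoken text of the participant called name
});

// stop requesting subtitles
room.setLocalParticipantProperty("requestingTranscription", false);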
Start screen share: Share your screen with other participants.
Stop screen share: Stop sharing your screen.
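A sketch of screen sharing, mirroring the desktop-track replacement used in the Android and Flutter sections; cameraTrack is assumed to be your existing local video track.

SariskaMediaTransport.createLocalTracks({ devices: ["desktop"] }).then((tracks) => {
    room.replaceTrack(cameraTrack, tracks[0]); // swap the camera track for the screen track
});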
Sariska offers robust messaging capabilities for both private and group communication scenarios.
Send and Receive Private Text Messages
Send and Receive Private Payload
Send and Receive Group Text Messages
Send and Receive Group Payload
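A sketch of group and private messaging, mirroring the sendMessage and MESSAGE_RECEIVED usage in the Android and Flutter sections; the callback argument order follows those sections and may differ in the JavaScript SDK.

room.sendMessage("hello everyone");                 // group message
room.sendMessage("hello, just you", participantId); // private message

room.on(SariskaMediaTransport.events.conference.MESSAGE_RECEIVED, (message, senderId) => {
    // handle an incoming message
});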
Start Transcription: Initiate transcription for the ongoing conference.
Stop Transcription: Stop transcription and get a download link for the transcript.
Mute Remote Participant
Mute/Unmute Local Participant
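A sketch, mirroring the mute calls from the Android and Flutter sections; localAudioTrack and participantId are assumptions standing in for your own track and target participant.

localAudioTrack.mute();    // mute a local track
localAudioTrack.unmute();  // unmute it again
localAudioTrack.isMuted(); // check its current state

room.muteParticipant(participantId, "audio"); // moderators can mute a remote participant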
Local Connection Statistics Received
Remote Connection Statistics Received
No Audio Signal
Audio Input State Changed
Audio Level Indicator
Noise Detection
Detect excessive noise from the microphone used in the conference.
Talk While Muted Detection
Noise Suppression/Cancellation
Reduces background noise from audio signals using a recurrent neural network (RNN).
Change the background behind you in video calls with various options:
Image: Define a static image as the background
Blur: Blur the background for a subtle effect
Screen Share: Show your computer screen as the background
Start: Apply the effect to your local video track
Stop: Remove the effect from your video track
Periodically capture screenshots of your screen share (e.g., every 30 seconds) and upload them to your server for analysis.
Start capturing
Stop capturing
Monitor your WebRTC application performance using CallStats (or build your own). See the "RTC Stats" section for details.
This ensures seamless connectivity even in the face of fluctuating internet connections. It automatically manages connections and disconnections as needed.
Get valuable insights into the current conversation, specifically focusing on speaker dominance. It analyzes the interaction and estimates how long each participant has held the dominant speaker role.
Gain insights into the connection quality of conference participants.
Combine multiple audio tracks into a single, unified audio stream.
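A condensed sketch of the audio mixer, assuming two audio tracks are already available:
const audioMixer = SariskaMediaTransport.createAudioMixer(); // Create a mixer instance
audioMixer.addMediaStream(audioTrack1.getOriginalStream()); // Add the first stream
audioMixer.addMediaStream(audioTrack2.getOriginalStream()); // Add the second stream
const mixedMediaStream = audioMixer.start(); // Start mixing
const mixedMediaTrack = mixedMediaStream.getTracks()[0]; // The unified audio track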
Empower your application with robust end-to-end encryption, ensuring secure communication for your users.
Enable End-to-End Encryption
Disable End-to-End Encryption
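A minimal sketch, assuming Olm has already been initialized as shown in the encryption snippet later in this guide:
if (conference.isE2EESupported()) {
  conference.toggleE2EE(true); // Enable end-to-end encryption
}
conference.toggleE2EE(false); // Disable end-to-end encryption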
Join conferences with audio and video already muted, or in a silent mode where no audio is transmitted or received. This ensures a seamless experience and respects participant preferences.
Join with Muted Audio and Video
Join in Silent Mode
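A minimal sketch of both join modes (shown here as two separate conferences purely for illustration):
// Join with audio and video muted
const mutedConference = connection.initJitsiConference({ startAudioMuted: true, startVideoMuted: true });
// Join in silent mode (no audio transmitted or received)
const silentConference = connection.initJitsiConference({ startSilent: true });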
Broadcast your conference to multiple platforms simultaneously. Embed live streams directly into your app or website using various formats.
Store your recordings and transcriptions in various cloud storage services.
Dial-in(PSTN)
Dial-out(PSTN)
Designed for efficient communication between two participants.
Sariska automatically activates Peer-to-Peer mode when your conference has exactly two participants. This mode bypasses the central server and directly connects participants, maximizing bandwidth efficiency and reducing latency. However, even with more than two participants, you can forcefully start Peer-to-Peer mode.
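A minimal sketch of forcing Peer-to-Peer mode on and off:
conference.startP2PSession(); // Force Peer-to-Peer mode
conference.stopP2PSession(); // Return to the bridge-based session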
This allows synchronous phone calls, similar to WhatsApp, even if the receiver's app is closed or in the background.
Initiating Calls:
Make a call even if the callee's app is closed or in the background.
Play a busy tone if the callee is on another call or disconnects your call.
Play ringtone/ringback/DTMF tones.
Step 1 : Caller Initiates Call
HTTP Call to Sariska Server
{API Method}
Push Notification to callee using Firebase or APNS
This notifies the receiver even if their app is closed or in the background.
Step 2 : Callee Responds to Call
Reads Push Notification (using react-native-callkeep)
Processes the notification even if the app is closed or in the background.
HTTP Call to Update Status
{API Method}
No need to join conference via SDK
Status update through the HTTP call suffices.
Step 3 : Caller Receives Response
Listens for USER_STATUS_CHANGED event
Step 4 : After Connection Established
The call proceeds like a normal conference call.
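A minimal sketch of the caller-side listener from Step 3, using the USER_STATUS_CHANGED event shown at the end of this guide:
conference.addEventListener(SariskaMediaTransport.events.conference.USER_STATUS_CHANGED, (id, status) => {
  // status: "ringing", "busy", "rejected", "connected", "expired"
  if (status === "connected") {
    // Proceed like a normal conference call
  }
});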
React Native Libraries:
react-native-callkeep: Handles notifications and call events even when the app is closed or in the background.
react-native-incall-manager: Manages device events like headset state, proximity sensors, and audio routing.
startVideoMuted: true
startSilent: true
rtcstatsServer: ""
callStatsID: ""
callStatsSecret: ""
channelLastN: 10
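A hedged sketch showing these keys supplied when the conference is created, in the same way startAudioMuted and the CallStats keys are passed to initJitsiConference later in this guide; whether rtcstatsServer and channelLastN belong at exactly this level is an assumption, and the values are illustrative:
const conference = connection.initJitsiConference({
  startVideoMuted: true,
  startSilent: false,
  rtcstatsServer: "", // Optional RTC stats endpoint
  callStatsID: "",
  callStatsSecret: "",
  channelLastN: 10 // Only receive video from the last N active speakers
});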
devices
Type: Array of strings
Purpose: Specifies which devices to request from the browser's GetUserMedia (GUM) API.
Default: If this property is not set, GUM will attempt to access all available devices.
resolution:
Type: String
Values: 180, 240, 360, vga, 480, qhd, 540, hd, 720, fullhd, 1080, 4k, 2160
Purpose: Sets the preferred resolution for the local video stream.
cameraDeviceId
Type: String
Purpose: Specifies the device ID of the camera to use.
micDeviceId
Type: String
Purpose: Specifies the device ID of the microphone to use.
minFps
Type: Integer
Purpose: Sets the minimum frame rate for the video stream.
maxFps
Type: Integer
Purpose: Sets the maximum frame rate for the video stream.
desktopSharingFrameRate
Type: Object
Properties:
min: Minimum frame rate for desktop sharing
max: Maximum frame rate for desktop sharing
desktopSharingSourceDevice
Type: String
Purpose: Specifies the device ID or label of the video input source to use for screen sharing.
facingMode
Type: String
Values: "user", "environment"
Purpose: Sets the camera's facing mode (front-facing or back-facing).
firePermissionPromptIsShownEvent
Type: Boolean
Purpose: If set to true, fires a JitsiMediaDevicesEvents.PERMISSION_PROMPT_IS_SHOWN event when the browser displays the GUM permission prompt.
fireSlowPromiseEvent
Type: Boolean
Purpose: If set to true, fires a JitsiMediaDevicesEvents.USER_MEDIA_SLOW_PROMISE_TIMEOUT event if the browser takes too long to resolve the GUM promise. Cannot be used with firePermissionPromptIsShownEvent
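A hedged sketch combining several of these properties in a createLocalTracks call; the device IDs are placeholders and the values are illustrative:
const localTracks = await SariskaMediaTransport.createLocalTracks({
  devices: ["audio", "video"],
  resolution: 720,
  facingMode: "user", // Front-facing camera
  cameraDeviceId: "your-camera-device-id", // Placeholder
  micDeviceId: "your-mic-device-id", // Placeholder
  minFps: 15,
  maxFps: 30
});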
track.isMuted()
Checks if the track is currently muted. Returns true if muted, false if not.
track.stream.toURL()
Retrieves the URL of the stream, allowing access to the media content.
Retrieve Track Information:
track.getType()
Identifies the track type, which can be "audio", "video", or "desktop".
track.getId()
Obtains the unique identifier assigned to the track.
track.getDeviceId()
Determines the device ID associated with the track, providing information about the physical source of the media.
track.getParticipantId()
Returns the ID of the participant to whom the track belongs.
Manage Track State:
track.switchCamera()
Switches the camera source between the front and back cameras (for video tracks).
track.mute()
Mutes the track, preventing its audio or video from being transmitted.
track.unmute()
Unmutes a previously muted track, resuming transmission.
Attach & Detach Tracks:
track.attach()
Pairs the track with an HTML audio or video element, enabling playback within a web page.
track.detach()
Disconnects the track from its associated audio or video element.
Track Disposal:
track.dispose()
Releases the track's resources, effectively ending its use and freeing up memory.
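A short sketch of a track's lifecycle using the methods above; "videoElement" is a placeholder DOM element:
const track = localTracks.find(t => t.getType() === "video");
console.log(track.getId(), track.getDeviceId(), track.isMuted()); // Inspect the track
track.attach(document.getElementById("videoElement")); // Attach for playback
await track.mute(); // Stop transmitting
await track.unmute(); // Resume transmitting
track.detach(document.getElementById("videoElement")); // Detach from the element
track.dispose(); // Release resources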
Ability to grant moderator privileges to other participants.
Participant Removal:
Ability to kick non-moderators or even other moderators from the meeting.
Audio Control:
Ability to mute individual participants or all participants at once.
Video Focus:
Ability to make everyone's video view follow the moderator's video.
Joining Settings: Ability to:
Set participants to join with audio muted by default.
Set participants to join with video disabled by default.
Lobby Management:
Ability to enable or disable the lobby room, requiring approval for participants to join.
Join Approval:
Ability to approve or deny join requests when the lobby is enabled.
Encryption (Beta):
Ability to enable end-to-end encryption (where available).
Moderator Transfer:
If the current moderator leaves the meeting, a new moderator is automatically selected.
email:
Type: String
Purpose: May be used for identification or communication purposes.
moderator:
Type: Boolean
Purpose: Used to control moderation-related features in the UI.
audioMuted:
Type: Boolean
Purpose: Used to display the audio muted state in the UI.
videoMuted:
Type: Boolean
Purpose: Used to display the video muted state in the UI.
displayName:
Type: String
Purpose: Used to identify them in the UI.
role:
Type: String
Purpose: Used to determine their permissions and UI features.
status:
Type: String
Purpose: Used to display their availability ("online", "offline", "away") in the UI.
hidden:
Type: Boolean
Purpose: Typically used for bots like transcribers or recorders.
botType:
Type: String
Purpose: Used to identify the bot's purpose and capabilities.
getId(): Returns the track ID
isMuted(): Checks if the track is muted
getParticipantId(): Returns the participant ID associated with the track
isLocal(): Checks if the track is local
attach(): Attaches the track to an audio or video element
detach(): Detaches the track from an audio or video element
Additional methods for remote tracks:
getType(): Returns the track type (audio, video, or desktop)
getStreamURL(): Returns the URL of the stream
getId(): Returns the ID of the track
isMuted(): Checks if the track is muted
getParticipantId(): Returns the ID of the participant who owns the track
isLocal(): Checks if the track is local
Additional RTCView properties for styling:
mirror: Mirrors the video horizontally (true or false)
objectFit: Controls how the video fills the view ("contain" or "cover")
zOrder: Sets the stacking order of the view (0 or 1)
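A minimal React Native sketch using these RTCView properties, assuming a videoTrack from createLocalTracks (react-native-webrtc exposes the fill behaviour through the objectFit prop):
import {RTCView} from 'react-native-webrtc';

<RTCView
  streamURL={videoTrack.stream.toURL()}
  mirror={true} // Mirror the video horizontally
  objectFit={"cover"} // "contain" or "cover"
  zOrder={1} // Stacking order of the view
/>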
Conference duration
Camera duration
Audio track duration
Video track duration
Recording:
Recording started
Recording stopped
Local recording started
Local recording stopped
Transcription:
Transcription started
Transcription stopped
Performance:
Speaker stats
Connection stats
Browser performance stats
actionSubject: The subject of the action (string)
source: The source of the event (string)
attributes: Additional attributes of the event (JSON)
da: Danish
de: German
el: Greek
en: English
enGB: English (United Kingdom)
eo: Esperanto
es: Spanish
esUS: Spanish (Latin America)
et: Estonian
eu: Basque
fi: Finnish
fr: French
frCA: French (Canadian)
he: Hebrew
hi: Hindi
hr: Croatian
hu: Hungarian
hy: Armenian
id: Indonesian
it: Italian
ja: Japanese
kab: Kabyle
ko: Korean
lt: Lithuanian
ml: Malayalam
lv: Latvian
nl: Dutch
oc: Occitan
fa: Persian
pl: Polish
pt: Portuguese
ptBR: Portuguese (Brazil)
ru: Russian
ro: Romanian
sc: Sardinian
sk: Slovak
sl: Slovenian
sr: Serbian
sq: Albanian
sv: Swedish
te: Telugu
th: Thai
tr: Turkish
uk: Ukrainian
vi: Vietnamese
zhCN: Chinese (China)
zhTW: Chinese (Taiwan)
mr: Marathi
connectionSummary
A brief summary of the connection quality (e.g., "Good", "Fair", "Poor")
e2eRtt
The estimated end-to-end round-trip time in milliseconds
participantId
The ID of the participant
framerate
The current media framerate in frames per second
isLocalVideo
Indicates whether the stats are for the local participant's video
maxEnabledResolution
The maximum resolution enabled for the participant
packetLoss
The percentage of packet loss
region
The region where the participant is located
resolution
The current resolution of the media stream
serverRegion
The region of the server handling the connection
shouldShowMore
Indicates whether more detailed connection stats are available
videoSsrc
The video SSRC
transport
The transport protocol being used (e.g., "UDP", "TCP")
Locate the storage credentials section
Enter the required credentials for your chosen provider
Dropbox
Obtain a Dropbox OAuth token
connected: receiver accepted your call
expired: receiver didn't answer within 40 seconds
Toggle speaker/microphone and flashlight.
Play ringtones/ringbacks/DTMF tones.
af: Afrikaans
ar: Arabic
bg: Bulgarian
ca: Catalan
cs: Czech
audioSsrc
The audio SSRC (Synchronization Source identifier)
bandwidth
The estimated available bandwidth in bits per second
bitrate
The current media bitrate in bits per second
bridgeCount
The number of bridges in use
codec
The codec being used for media transmission
npm i react-native-webrtc
import {RTCView} from 'react-native-webrtc';
const remoteTracks = [];
conference.addEventListener(SariskaMediaTransport.events.conference.TRACK_ADDED, function(track) {
  remoteTracks.push(track);
});
// Render the remote video tracks (inside a React component)
return (
  <>
    {remoteTracks.map(track => {
      if (track.getType() === "video") {
        return <RTCView key={track.getId()} streamURL={track.stream.toURL()}/>;
      }
    })}
  </>
);
npm i sariska-media-transport
<script src="https://sdk.sariska.io/umd/sariska-media-transport.min.js"></script>
import 'sariska-media-transport/dist/esm/modules/mobile/polyfills';
import SariskaMediaTransport from "sariska-media-transport";
SariskaMediaTransport.initialize();
const token = {your-token};
const connection = new SariskaMediaTransport.JitsiConnection(token, "roomName", isNightly);
connection.addEventListener(SariskaMediaTransport.events.connection.CONNECTION_ESTABLISHED, () => {
console.log('connection successful!!!');
});
// Handle connection events
connection.addEventListener(SariskaMediaTransport.events.connection.CONNECTION_FAILED, (error) => {
// Token expired, set again
if (error === SariskaMediaTransport.events.connection.PASSWORD_REQUIRED) {
// Set a new token
connection.setToken(token)
console.log('connection disconnect!!!', error);
}
});
connection.addEventListener(SariskaMediaTransport.events.connection.CONNECTION_DISCONNECTED, (error) => {
console.log('connection disconnect!!!', error);
});
connection.connect();
Use isNightly: true to access the latest features, or omit it for the stable version.
const conference = connection.initJitsiConference(options);
conference.join();
const options = {
devices: ["audio", "video"],
resolution: 240,
}
const localTracks = await SariskaMediaTransport.createLocalTracks(options);
// Access local media tracks
const audioTrack = localTracks.find(track=>track.getType()==="audio");
const videoTrack = localTracks.find(track=>track.getType()==="video");
// Play video
videoTrack.attach(document.getElementById("videoElement"))
// Play audio
audioTrack.attach(document.getElementById("audioElement"))// Install webRTC library
npm i react-native-webrtc
// Import RTCView from react-native-webrtc to render the live video stream
import {RTCView} from 'react-native-webrtc';
// Access local video track
const videoTrack = localTracks.find(track=>track.getType()==="video");;
// Render video
return videoTrack.getType() === "video" && <RTCView streamURL={videoTrack.stream?.toURL()}/>;
conference.addEventListener(SariskaMediaTransport.events.conference.USER_JOINED, function(id, participant) {
console.log("user joined!!!", id, participant);
});
localTracks.forEach(track => conference.addTrack(track));
const remoteTracks = [];
conference.addEventListener(SariskaMediaTransport.events.conference.TRACK_ADDED, function(track) {
remoteTracks.push(track);
});
remoteTracks.forEach(track => {
if (track.getType() === "audio") {
RemoteContainer.append(track.attach(document.getElementById("remoteAudioElemId")));
}
if (track.getType() === "video") {
RemoteContainer.append(track.attach(document.getElementById("remoteVideoElemId")));
}
});
conference.addEventListener(SariskaMediaTransport.events.conference.ANALYTICS_EVENT_RECEIVED, (payload)=> {
// Destructure the payload
const { name, action, actionSubject, source, attributes } = payload;
})
// Listen for changes in the dominant speaker
conference.addEventListener(SariskaMediaTransport.events.conference.DOMINANT_SPEAKER_CHANGED, id=> {
console.log(id) // Dominant speaker ID
});
// Track changes in the "last N" speakers
conference.addEventListener(SariskaMediaTransport.events.conference.LAST_N_ENDPOINTS_CHANGED, (leavingEndpointIds, enteringEndpointIds)=> {
console.log(leavingEndpointIds) //Array of ID's of users leaving lastN
console.log(enteringEndpointIds) //Array of ID's of users entering lastN
});
// Set a local participant property
conference.setLocalParticipantProperty(key, value);
// Remove a local participant property
conference.removeLocalParticipantProperty(key)
// Get the value of a local participant property
conference.getLocalParticipantProperty(key)
// Listen for changes in participant properties
conference.addEventListener(SariskaMediaTransport.events.conference.PARTICIPANT_PROPERTY_CHANGED, (participant, key,oldValue, newValue) => {
});
conference.getParticipantCount(); // Pass true to include hidden participants
// Get all participants
conference.getParticipants(); // Array of {participantId: details} objects
// Get participants without hidden users
conference.getParticipantsWithoutHidden(); // Array of {participantId: details} objects
conference.participants; // {participantId: details}
conference.pinParticipant(participantId)
conference.pinParticipant(participantIds)
conference.myUserId();
conference.getLocalUser();
conference.getRole();
conference.getRemoteTracks();
conference.getLocalTracks();
const breakoutRooms = conference.getBreakoutRooms();
breakoutRooms.createBreakoutRoom("room subject");
breakoutRooms.removeBreakoutRoom();
breakoutRooms.isBreakoutRoom();
breakoutRooms.sendParticipantToRoom(participantJid, roomJid)
conference.addEventListener(SariskaMediaTransport.events.conference.KICKED, (id)=> { // Handle a participant being kicked by a moderator
// The kicked participant's ID is available in the `id` variable
});
conference.addEventListener(SariskaMediaTransport.events.conference.PARTICIPANT_KICKED, (actorParticipant, kickedParticipant, reason) => { // Handle a moderator being kicked by another moderator
// Information about the actor, kicked participant, and reason is available in the event arguments
});
conference.kickParticipant(id)
conference.kickParticipant(id, reason) // Kick a moderator, providing a reason for the kick
conference.grantOwner(id) // Grant owner rights to a participant
conference.addEventListener(SariskaMediaTransport.events.conference.USER_ROLE_CHANGED, (id, role) => {
if (conference.myUserId() === id) {
console.log(`My role changed, new role: ${role}`);
} else {
console.log(`Participant role changed: ${role}`);
}
});
conference.revokeOwner(id) // Revoke owner rights from a participant
conference.setDisplayName(name); // Change the local user's display name
conference.addEventListener(SariskaMediaTransport.events.conference.DISPLAY_NAME_CHANGED, (id, displayName)=> { // Handle display name changes for other participants
// Access the participant ID
});
conference.lock(password); // Lock the room with the specified password
conference.unlock();
conference.setLocalParticipantProperty("requestingTranscription", true);
conference.setLocalParticipantProperty("translation_language", 'hi'); // Example for Hindi
conference.addEventListener(SariskaMediaTransport.events.conference.SUBTITLES_RECEIVED, (id, name, text)=> {
// Handle received subtitle data (id, speaker name, text)
});
conference.setLocalParticipantProperty("requestingTranscription", false);
const desktopTrack = await SariskaMediaTransport.createLocalTracks({devices: ["desktop"]});
conference.addTrack(desktopTrack[0]);
await conference.removeTrack(desktopTrack[0]);
// Send a private text message to a specific participant
conference.sendMessage("message", participantId);
// Listen for incoming private text messages
conference.addEventListener(SariskaMediaTransport.events.conference.PRIVATE_MESSAGE_RECEIVED, (participantId, message)=>{
});
// Send a private payload to a specific participant
conference.sendEndpointMessage(to, payload);
// Listen for incoming private payloads
conference.addEventListener(SariskaMediaTransport.events.conference.ENDPOINT_MESSAGE_RECEIVED, (participant, payload)=>{
});
// Send a group text message to all participants
conference.sendMessage("message"); // Omit the recipient ID to broadcast to everyone
// Listen for incoming group text messages
conference.addEventListener(SariskaMediaTransport.events.conference.MESSAGE_RECEIVED, (participantId, message)=>{
});
// Send a group payload to all participants
conference.sendEndpointMessage('', payload);
// Listen for incoming group payloads
conference.addEventListener(SariskaMediaTransport.events.conference.ENDPOINT_MESSAGE_RECEIVED, (participant, payload)=>{
});
conference.startTranscriber();
conference.stopTranscriber();
conference.muteParticipant(participantId, mediaType)
// participantId: ID of the participant to be muted
// mediaType: Type of media to mute ('audio' or 'video')
// Mute a local track (audio or video)
track.mute()
// Unmute a previously muted local track
track.unmute()
// Check if a local track is currently muted
track.isMuted()
conference.addEventListener(SariskaMediaTransport.events.conference.LOCAL_STATS_UPDATED, (stats)=>{
// Handle local connection statistics
});
conference.addEventListener(SariskaMediaTransport.events.conference.REMOTE_STATS_UPDATED, (id, stats)=>{
// Handle remote connection statistics
});
// Triggered when the conference audio input loses signal
conference.addEventListener(SariskaMediaTransport.events.conference.NO_AUDIO_INPUT, () => {
// Handle the absence of audio input
});
// Triggered when the audio input state switches between having or not having audio input
conference.addEventListener(SariskaMediaTransport.events.conference.AUDIO_INPUT_STATE_CHANGE, hasAudioInput => {
// Handle changes in audio input state
});
conference.addEventListener(SariskaMediaTransport.events.conference.TRACK_AUDIO_LEVEL_CHANGED, function() {
// Handle audio level change events
});
conference.addEventListener(SariskaMediaTransport.events.conference.NOISY_MIC, function () {
// Handle noisy mic events, such as notifying the user or adjusting settings
});
conference.addEventListener(SariskaMediaTransport.events.conference.TALK_WHILE_MUTED, function () {
// Handle talk while muted events, such as providing a visual indicator
});
await SariskaMediaTransport.effects.createRnnoiseProcessor();
const options = {
// Enable virtual background
backgroundEffectEnabled: true,
// Choose image background
backgroundType: "image",
// URL of the image
virtualSource: "https://image.shutterstock.com/z/stock-photo-jasna-lake-with-beautiful-reflections-of-the-mountains-triglav-national-park-slovenia-1707906793.jpg"
};
const effect = await SariskaMediaTransport.effects.createVirtualBackgroundEffect(options);
const options = {
// Enable virtual background
backgroundEffectEnabled: true,
// Choose blur background
backgroundType: "blur",
// Adjust blur intensity (0-100)
blurValue: 25
}
const effect = await SariskaMediaTransport.effects.createVirtualBackgroundEffect(options);
const [ desktopTrack ] = await SariskaMediaTransport.createLocalTracks({devices: ["desktop"]});
const options = {
// Enable virtual background
backgroundEffectEnabled: true,
// Choose screen share background
backgroundType: "desktop-share",
virtualSource: desktopTrack
}
const effect = await SariskaMediaTransport.effects.createVirtualBackgroundEffect(options);
const videoTrack = localTracks.find(track=>track.getType()==="video"); // Get your video track
await videoTrack.setEffect(effect); // Start: apply the effect
await videoTrack.setEffect(undefined); // Stop: remove the effect
const [ desktopTrack ] = await SariskaMediaTransport.createLocalTracks({devices: ["desktop"]});
// Process the captured screenshot
const processScreenShot = (canvas) => {
  var dataURL = canvas.toDataURL();
  console.log("data", dataURL);
  // Upload dataURL to your server
}
const effect = await SariskaMediaTransport.effects.createScreenshotCaptureEffect(processScreenShot);
await effect.startEffect(
  desktopTrack.getOriginalStream(),
  desktopTrack.videoType
);
effect.stopEffect()
const options = {callStatsID: 'callstats-id', callStatsSecret: 'callstats-secret'}
const conference = connection.initJitsiConference(options);
// Update network status and notify Sariska Media Transport
function updateNetwork() {
// Communicate the current online status to SariskaMediaTransport
SariskaMediaTransport.setNetworkInfo({isOnline: window.navigator.onLine});
}
// When the browser goes offline, updateNetwork() is called to inform SariskaMediaTransport
window.addEventListener("offline", updateNetwork);
// When the browser comes back online, updateNetwork() is called again to update the status
window.addEventListener("online", updateNetwork);npm i @react-native-community/netinfo
import NetInfo from "@react-native-community/netinfo";
// Add an event listener to detect connectivity changes
const unsubscribe = NetInfo.addEventListener(state => {
// Log the current connection status
console.log("Is connected?", state.isConnected);
// Update SariskaMediaTransport with the latest connectivity information
SariskaMediaTransport.setNetworkInfo({isOnline: state.isConnected});
});
conference.getSpeakerStats();
const connectionStats = conference.getConnectionState();
// Obtain the audio tracks to be mixed
const audioTrack1 = getTracks()[0];
const audioTrack2 = getTracks()[1];
// Create an audio mixer instance
const audioMixer = SariskaMediaTransport.createAudioMixer();
// Add individual audio streams to the mixer
audioMixer.addMediaStream(audioTrack1.getOriginalStream());
audioMixer.addMediaStream(audioTrack2.getOriginalStream());
// Initiate the mixing process and retrieve the resulting mixed stream
const mixedMediaStream = audioMixer.start();
// Extract the mixed audio track from the mixed stream
const mixedMediaTrack = mixedMediaStream.getTracks()[0];
// Maintain synchronization between the mixed track's enabled state and the track using the effect
// Initialize Olm early for E2EE readiness
window.Olm.init().catch(e => {
console.error('Failed to initialize Olm, E2EE will be disabled', e);
delete window.Olm; // Remove Olm if initialization fails
});
// Activate E2EE:
conference.toggleE2EE(true); // Enable end-to-end encryption
// Verify E2EE support
conference.isE2EESupported() // Check if E2EE is available
conference.toggleE2EE(false); // Disable end-to-end encryption
// Join the conference with both audio and video muted initially
const conference = connection.initJitsiConference({
startAudioMuted: true, // Mute audio upon joining
startVideoMuted: true // Mute video upon joining
});
// Join the conference in silent mode, disabling both audio input and output
const conference = connection.initJitsiConference({
startSilent: true // Enter the conference in silent mode
});
// Define streaming destinations and settings
const appData = {
// Keys for platforms to stream
streamKeys: [
{streamKey: "youtube", streamValue: "youtube-stream-key"},
{streamKey: "facebook", streamValue: "facebook-stream-key"},
{streamKey: "twitch", streamValue: "twitch-stream-key"},
{streamKey: "vimeo", streamValue: "vimeo-stream-key"},
{streamKey: "periscope", streamValue: "periscope-stream-key"},
{streamKey: "instagram",streamValue: "instagram-stream-key"}
// Add keys for other platforms as needed
],
// Optional list of additional RTMP URLs for streaming
streamUrls: ["rtmp://test-rtmp-url-1", "rtmp://test-rtmp-url-2", "rtmp://test-rtmp-url-n"],
isRecording: false,
// Specify "live" for embedding the stream
app: "live",
stream: "livestream"
}
// app and stream: pass these to embed live streaming in your own app or website; the stream can then be played over HTTP-FLV, HLS, DASH, HDS, MP3, or AAC
* Play HTTP-FLV: https://edge.sariska.io/http-flv/live/livestream.flv
* Play HLS: https://edge.sariska.io/hls/live/livestream.m3u8
* Play DASH: https://edge.sariska.io/dash/live/livestream.mpd
* Play MP3: https://edge.sariska.io/mp3/live/livestream.mp3
* Play AAC: https://edge.sariska.io/acc/live/livestream.aac
// Start live recording with configuration data
conference.startRecording({
mode: SariskaMediaTransport.constants.recording.mode.STREAM, // Set mode to "stream"
appData: JSON.stringify(appData) // Pass app data as JSON string
});
// Listen for RECORDER_STATE_CHANGED event to track streaming status
conference.addEventListener("RECORDER_STATE_CHANGED", (payload)=>{
// Verify mode is "stream" again
const mode = payload._mode;
// Get the live streaming session ID
const sessionId = payload._sessionID;
// Check the streaming status: on, off, or pending
const status = payload._status;
});
// Stop live streaming using the session ID
conference.stopRecording(sessionId);
// Configure for Object-based storage
const appData = {
file_recording_metadata : {
'share': true // Enable sharing
}
}
// Configure for Dropbox
const appData = {
file_recording_metadata: {
upload_credentials: {
service_name: "dropbox",
token: "your_dropbox_oauth_token"
}
}
}
// Start recording
conference.startRecording({
mode: SariskaMediaTransport.constants.recording.mode.FILE,
appData: JSON.stringify(appData)
});
// Monitor recording state
conference.addEventListener("RECORDER_STATE_CHANGED", (payload)=>{
const mode = payload._mode; // Recording mode (e.g., "file")
const sessionId = payload._sessionID; // Unique identifier for the cloud recording session
const status = payload._status; // Current recording status ("on", "off", or "pending")
// Handle recording state changes based on mode, sessionId, and status
});
// Stop recording
conference.stopRecording(sessionId); // Provide the session ID
// Retrieve the phone pin and number for users to join via PSTN:
const phonePin = conference.getPhonePin(); // Get the phone pin for PSTN access
const phoneNumber = conference.getPhoneNumber() // Get the phone number for PSTN access
// Dial a phone number to invite a participant to the conference
conference.dial(phoneNumber)
// Join the lobby
conference.joinLobby(displayName, email); // Request to join the conference lobby
// Event listeners for lobby-related actions:
conference.addEventListener(SariskaMediaTransport.events.conference.LOBBY_USER_JOINED, (id, name) => {
// Handle events when a user joins the lobby
})
conference.addEventListener(SariskaMediaTransport.events.conference.LOBBY_USER_UPDATED, (id, participant)=> {
// Handle events when a user's information in the lobby is updated
})
// Additional event listeners for lobby actions:
conference.addEventListener(SariskaMediaTransport.events.conference.LOBBY_USER_LEFT, id=> {
})
conference.addEventListener(SariskaMediaTransport.events.conference.MEMBERS_ONLY_CHANGED, enabled=> {
})
// Moderator actions for lobby access:
conference.lobbyDenyAccess(participantId); // Deny access to a participant in the lobby
conference.lobbyApproveAccess(participantId); // Approve access to a participant in the lobby
// Lobby management methods:
conference.enableLobby() // Enable lobby mode for the conference (moderator only)
conference.disableLobby(); // Disable lobby mode for the conference (moderator only)
conference.isMembersOnly(); // Check if the conference is in members-only mode (lobby enabled)
// Initiate a SIP video call
conference.startSIPVideoCall("[email protected]", "display name"); // Start a SIP video call with the specified address and display name
// Terminate a SIP video call
conference.stopSIPVideoCall('[email protected]'); // End the SIP video call with the specified address
// Event listeners for SIP gateway state changes
conference.addEventListener("VIDEO_SIP_GW_SESSION_STATE_CHANGED", (state)=>{
// Handle events when the SIP gateway session state changes (on, off, pending, retrying, failed)
console.log("state", state);
})
// Event listener for SIP gateway availability changes
conference.addEventListener("VIDEO_SIP_GW_AVAILABILITY_CHANGED", (status)=>{
// Handle events when the SIP gateway availability changes (busy or available)
console.log("status", status);
})
conference.startP2PSession();
conference.stopP2PSession();
conference.addEventListener(SariskaMediaTransport.events.conference.USER_STATUS_CHANGED, (id, status) => {
// - id: receiver's user id
// - status: "ringing", "busy", "rejected", "connected", "expired"
});