constructor

Creates a new RealtimeClient for interacting with realtime transcription. If you are using the SDK client, you can obtain this instance automatically via MeetingBot.getRealtimeClient().
To use this on the front-end, fetch the websocket URL on the back-end, forward it to the front-end, and use this class there directly; a sketch follows the example below.

Parameter | Default | Description
url | required | URL to the websocket server for transcription events
audio_url | optional | URL to the audio websocket server for receiving realtime audio data. Only available when the bot was created with realtime_audio: true.
import { RealtimeClient } from "@skribby/sdk";

// Without audio streaming
const realtimeClient = new RealtimeClient("wss://example.org/123456");

// With audio streaming
const realtimeClientWithAudio = new RealtimeClient(
  "wss://example.org/123456",
  "wss://example.org/123456/audio"
);
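
For the front-end pattern described above, here is a minimal sketch; the /api/realtime-url endpoint is a hypothetical route on your own back-end that returns the websocket URL as JSON:

// Front-end: fetch the websocket URL from your own back-end, then connect.
// "/api/realtime-url" is a hypothetical endpoint you would expose yourself.
const response = await fetch("/api/realtime-url");
const { url } = (await response.json()) as { url: string };

const client = new RealtimeClient(url);
await client.connect();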

Properties

connected

Property to check whether the client is connected or not.
By default it's false until the connect method is called and the connection is established.

Returns

boolean

Example

console.log(realtimeClient.connected);
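
As a usage sketch, you could also guard outgoing actions on this flag (the guard is illustrative, not required by the SDK):

// Only dispatch actions while the websocket is connected (illustrative guard).
if (realtimeClient.connected) {
  realtimeClient.send("chat-message", { content: "Hello!" });
}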

audioConnected

Property to check whether the audio WebSocket is connected.
Only relevant when the client was created with an audio_url parameter. By default it's false until the connect method is called and the audio connection is established.

Returns

boolean

Example

console.log(realtimeClient.audioConnected);

transcript

Read-only array of transcript segments received so far (same shape as "ts" payloads).
This lets you access the full transcription at any time without maintaining your own array in userland.

Behavior

  • On connect(): the SDK resets the internal transcript buffer.
  • On "connected" (initial snapshot): the SDK seeds the buffer from data.transcripts.
  • On each "ts" message: the SDK appends the new segment.

Returns

ReadonlyArray<RealtimeTranscriptSegment>

Example

const realtimeClient = bot.getRealtimeClient();

realtimeClient.on("ts", (segment) => {
  console.log(segment.speaker_name, segment.transcript);

  // Full transcript so far (array of segments)
  console.log("Transcript so far:", realtimeClient.transcript);
});

await realtimeClient.connect();
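
Because each entry has the same shape as a "ts" payload, the buffer can be flattened into plain text at any point, for example:

// Build one readable string from every segment received so far.
const fullText = realtimeClient.transcript
  .map((segment) => `${segment.speaker_name}: ${segment.transcript}`)
  .join("\n");

console.log(fullText);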

Methods

on

Create a new event listener with a callback to handle events. This method is fully typed with RealtimeEventMap for type-safe event handling.

TypeScript typings notes

  • The initial transcript snapshot is the "connected" event (data.transcripts), and it is included in RealtimeEventMap so realtimeClient.on("connected", ...) is fully typed.
  • A shared RealtimeTranscriptSegment type is used by both the "connected" snapshot payload (transcripts) and the "ts" event payload (single segment).

Parameters

Parameter | Default | Description
eventName | required (string) | Event name to listen to. Refer to the Realtime Core Concepts for all available events. You can also use raw to receive all raw messages.
callback | required (function with 1 parameter) | Callback invoked when the event is triggered. Refer to the Realtime Core Concepts for the data returned per event. The callback receives the event's data; if the event carries no data, the parameter will be undefined.

Returns

void

Example

realtimeClient.on("start", () => {
  console.log("Bot has started!");
});

realtimeClient.on("ts", (segment) => {
  console.log(`${segment.speaker_name} said: ${segment.transcript}`);

  // Full transcript so far (array of segments)
  console.log("Transcript so far:", realtimeClient.transcript);
});

// Listen to all raw messages
realtimeClient.on("raw", (data) => {
  console.log("Raw message:", data);
});

// Listen to audio data (requires audio_url to be provided)
realtimeClient.on("audio", (buffer: Buffer) => {
  // buffer is 16-bit PCM audio at 16kHz sample rate
  console.log("Received audio chunk:", buffer.length, "bytes");
});
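
The initial snapshot described in the typings notes can be handled the same way; a small sketch, assuming you only need the number of segments already transcribed when the "connected" event arrives:

// Seed your own state from the initial snapshot delivered on "connected".
realtimeClient.on("connected", (data) => {
  console.log("Segments in initial snapshot:", data.transcripts.length);
});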

send

Send an action to the bot. This method is fully typed with RealtimeActionMap for type-safe action dispatching.

Parameters

Parameter | Default | Description
action | required (string) | Action name. Refer to the Realtime Core Concepts for all available actions.
data | optional (depending on the action) | Data to send alongside the action. Refer to the Realtime Core Concepts for the required data per action; the payload corresponds to the data portion of each action in that document.

Returns

void

Example

realtimeClient.send("chat-message", {
  content: "Hello from my script!",
});

realtimeClient.send("stop");

changeAvatar

Convenience method to change the bot's avatar during a realtime session. This is a shorthand for send('change-avatar', { avatar_url }).

Parameters

Parameter | Default | Description
avatarUrl | required (string) | URL of the new avatar image to display

Returns

void

Example

// Change the bot's avatar mid-meeting
realtimeClient.changeAvatar("https://example.com/new-avatar.png");
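
Since this is only a shorthand, the same action can also be dispatched through send directly:

realtimeClient.send("change-avatar", {
  avatar_url: "https://example.com/new-avatar.png",
});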

connect

Start the connection to the websocket server(s).
If an audio_url was provided to the constructor, this method establishes two parallel WebSocket connections:

  • Main connection for transcription/control events
  • Audio connection for receiving raw audio data

Both connections are awaited in parallel.

Throws

  • Error - If the websocket connection fails

Returns

Promise<void>

Example

await realtimeClient.connect();
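
Because connect() throws when the websocket connection fails, you may want to wrap it; a minimal sketch:

try {
  await realtimeClient.connect();
} catch (error) {
  // The connection could not be established, e.g. an invalid or expired URL.
  console.error("Failed to connect:", error);
}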

disconnect

Disconnect from the websocket server(s).
If an audio connection was established, both the main WebSocket and the audio WebSocket will be closed.

Returns

Promise<void>

Example

await realtimeClient.disconnect();