Creates a new RealtimeClient to interact with realtime transcription models. If you use the SDK client, this instance can be created automatically via MeetingBot.getRealtimeClient().
To use it on the front-end, fetch the websocket URL on the back-end, forward it to the front-end, and construct this class there directly.
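The back-end/front-end split above can be sketched as follows. The BotResponse shape, the toClientConfig helper, and the /api/realtime-url endpoint are illustrative assumptions for this sketch, not part of the SDK:

```typescript
// Assumed shape of a bot-creation response exposing the websocket URLs
// (hypothetical; check your actual API response).
interface BotResponse {
  url: string;        // transcription websocket URL
  audio_url?: string; // audio websocket URL, present when realtime_audio: true
}

// Back-end: forward only the websocket URLs, so nothing sensitive from the
// bot response reaches the browser.
function toClientConfig(bot: BotResponse): { url: string; audio_url?: string } {
  return { url: bot.url, audio_url: bot.audio_url };
}

// Front-end (sketch): fetch the config from your back-end endpoint, then
// construct the client directly.
//   const { url, audio_url } = await (await fetch("/api/realtime-url")).json();
//   const client = new RealtimeClient(url, audio_url);
```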
| Parameter | Default | Description |
|---|---|---|
| url | required | URL to the websocket server for transcription events |
| audio_url | optional | URL to the audio websocket server for receiving realtime audio data. Only available when the bot was created with realtime_audio: true. |
import { RealtimeClient } from "@skribby/sdk";
// Without audio streaming
const realtimeClient = new RealtimeClient("wss://example.org/123456");
// With audio streaming
const realtimeClientWithAudio = new RealtimeClient(
"wss://example.org/123456",
"wss://example.org/123456/audio"
);
Simple property to check whether the client is connected.
It is false until the connect method has been called and the connection is established.
boolean
console.log(realtimeClient.connected);
Property to check whether the audio WebSocket is connected.
Only relevant when the client was created with an audio_url parameter. It is false until the connect method has been called and the audio connection is established.
boolean
console.log(realtimeClient.audioConnected);
Create a new event listener with a callback to handle events. This method is fully typed with RealtimeEventMap for type-safe event handling.
| Parameter | Default | Description |
|---|---|---|
| eventName | required (string) | Event name to listen to. Refer to the Realtime Core Concepts for all available events. You can also use raw to receive all raw messages. |
| callback | required (function with 1 parameter) | Callback invoked when the event is triggered. Refer to the Realtime Core Concepts for the data returned per event. The callback receives the event's data segment; if the event carries no data, the parameter will be undefined. |
void
realtimeClient.on("start", () => {
console.log("Bot has started!");
});
realtimeClient.on("ts", (data) => {
console.log(`${data.speaker_name} said: ${data.transcript}`);
});
// Listen to all raw messages
realtimeClient.on("raw", (data) => {
console.log("Raw message:", data);
});
// Listen to audio data (requires audio_url to be provided)
realtimeClient.on("audio", (buffer: Buffer) => {
// buffer is 16-bit PCM audio at 16kHz sample rate
console.log("Received audio chunk:", buffer.length, "bytes");
});
Send an action to the bot. This method is fully typed with RealtimeActionMap for type-safe action dispatching.
| Parameter | Default | Description |
|---|---|---|
| action | required (string) | Action name. Refer to the Realtime Core Concepts for all available actions. |
| data | optional (depends on the action) | Data to send alongside the action. Refer to the Realtime Core Concepts for the required data per action; it corresponds to the data portion of each action in that document. |
void
realtimeClient.send("chat-message", {
content: "Hello from my script!",
});
realtimeClient.send("stop");
Start the connection to the websocket server(s).
If an audio_url was provided to the constructor, this method establishes two parallel WebSocket connections:
- Main connection for transcription/control events
- Audio connection for receiving raw audio data
Both connections are awaited in parallel.
Error - If the websocket connection fails
Promise<void>
await realtimeClient.connect();
Disconnect from the websocket server(s).
If an audio connection was established, both the main WebSocket and the audio WebSocket will be closed.
Promise<void>
await realtimeClient.disconnect();
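As a closing sketch, the pieces above can be combined into one lifecycle. The pcmChunkSeconds helper is an illustrative addition (16-bit PCM means 2 bytes per sample, at the 16 kHz rate documented for the audio event); the client usage is shown as comments since it needs a live websocket URL:

```typescript
// Illustrative helper (not part of the SDK): duration in seconds of one audio
// chunk, given 16-bit PCM (2 bytes per sample) at a 16 kHz sample rate.
function pcmChunkSeconds(byteLength: number): number {
  return byteLength / 2 / 16_000;
}

// Typical lifecycle with a live bot (URLs are placeholders):
//
//   const client = new RealtimeClient(
//     "wss://example.org/123456",
//     "wss://example.org/123456/audio"
//   );
//   client.on("audio", (buffer) => {
//     console.log(`Got ${pcmChunkSeconds(buffer.length).toFixed(3)}s of audio`);
//   });
//   try {
//     await client.connect();   // rejects with an Error if either websocket fails
//   } catch (err) {
//     console.error("Connection failed:", err);
//   }
//   // ...later, close both sockets:
//   await client.disconnect();
```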