The WebSocket endpoint enables real-time, streaming communication with a Puppeteer agent. It supports sending messages and receiving streamed AI responses token by token.
Connection
wss://api.puppeteerai.com/ws/{thread_id}
Query parameters
| Parameter | Required | Description |
|---|---|---|
| token | Yes | JWT token associated with the thread. |
| project | No | Project name. If provided, the agent is initialized on connection. |
| version | No | Config version ID. Use to pin a specific config version. |
Example
wss://api.puppeteerai.com/ws/abc123?token=eyJhbGci...&project=my-project
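A minimal connection sketch in TypeScript, using the standard browser WebSocket API (the thread ID, token, and project name below are placeholders):

```ts
// Placeholder credentials; obtain the thread ID and JWT from your own auth flow.
const threadId = "abc123";
const jwtToken = "eyJhbGci...";

// token is required; project (and version) are optional query parameters.
const params = new URLSearchParams({
  token: jwtToken,
  project: "my-project",
});

const ws = new WebSocket(`wss://api.puppeteerai.com/ws/${threadId}?${params.toString()}`);

ws.onopen = () => console.log("connected");
ws.onclose = (event) => console.log("closed:", event.code);
```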
Sending messages
Send a JSON message to the WebSocket to interact with the agent:
```json
{
  "type": "message",
  "data": {
    "message": "Hello!",
    "start_new_conversation": false,
    "puppeteer_config_name": "my-project"
  }
}
```
| Field | Required | Description |
|---|---|---|
| type | Yes | Must be "message". |
| data.message | Yes | The user’s message. |
| data.start_new_conversation | Yes | If true, ends the current conversation and starts a new one. |
| data.puppeteer_config_name | Yes | The project name. |
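With the socket from the connection sketch open, sending a message is a single JSON-serialized frame; for example:

```ts
// Send a user message once the connection is established (reuses `ws` from above).
ws.addEventListener("open", () => {
  ws.send(
    JSON.stringify({
      type: "message",
      data: {
        message: "Hello!",
        start_new_conversation: false,
        puppeteer_config_name: "my-project",
      },
    })
  );
});
```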
Receiving messages
The server sends JSON messages over the WebSocket. Each message carries a top-level type field that identifies the kind of event.
Stream start
Signals the beginning of a new AI message.
```json
{
  "type": "stream",
  "data": {
    "type": "start"
  }
}
```
Stream chunk
A token (word or partial word) of the AI’s response, delivered as it’s generated.
```json
{
  "type": "stream",
  "data": {
    "type": "chunk",
    "token": "Hello",
    "index": 0
  }
}
```
| Field | Description |
|---|---|
| data.token | A piece of the generated response. |
| data.index | The position of this token in the stream (zero-indexed). |
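For incremental display, chunk tokens can be concatenated as they arrive. A small sketch (assuming chunks for a message are delivered in index order, which the zero-based index suggests but this section does not state explicitly):

```ts
let buffer = "";

// Call this for each {"type": "chunk"} event of the current message.
function onChunk(data: { token: string; index: number }) {
  if (data.index === 0) buffer = ""; // first token of a new message
  buffer += data.token;              // tokens carry their own spacing
}
```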
Stream end
Signals the end of an AI message and includes the complete response.
```json
{
  "type": "stream",
  "data": {
    "type": "end",
    "message": {
      "message": "Hello! How can I help you today?",
      "extras": {
        "conversation_id": "1893",
        "user_message_id": "132546d5-57fa-4f7e-a324-1d698081d2de",
        "reply_message_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890"
      }
    }
  }
}
```
| Field | Description |
|---|---|
| data.message.message | The complete AI response. |
| data.message.extras.conversation_id | ID of the conversation. |
| data.message.extras.user_message_id | ID of the user message that triggered this response. |
| data.message.extras.reply_message_id | ID of the AI reply message. |
An agent may send multiple messages in a single response. Each message follows the full start → chunks → end sequence. An end_response event (see below) signals that all messages for the response are complete.
End response
Sent after all messages for a response are complete. Includes response-level metadata.
```json
{
  "type": "stream",
  "data": {
    "type": "end_response",
    "extras": {}
  }
}
```
Error
Sent if an error occurs during processing.
```json
{
  "type": "error",
  "data": {
    "message": "Error description",
    "code": "WEBSOCKET_ERROR"
  }
}
```
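Putting the event types together, a client typically dispatches on the outer type and the inner data.type, accumulates chunk tokens, collects each completed message on end, and treats end_response as the end of the agent's turn. A sketch that reuses the `ws` handle from the connection example (not an official client; the shapes mirror the payloads documented above):

```ts
let currentText = "";                   // message currently being streamed
const completedMessages: string[] = []; // messages finished within this response

ws.addEventListener("message", (event) => {
  const msg = JSON.parse(event.data as string);

  if (msg.type === "error") {
    // Surface server-side failures; msg.data.code is e.g. "WEBSOCKET_ERROR".
    console.error(`agent error [${msg.data.code}]: ${msg.data.message}`);
    return;
  }

  if (msg.type !== "stream") return;

  switch (msg.data.type) {
    case "start":        // a new AI message begins
      currentText = "";
      break;
    case "chunk":        // append streamed tokens for live display
      currentText += msg.data.token;
      break;
    case "end":          // prefer the complete text delivered with "end"
      completedMessages.push(msg.data.message.message);
      break;
    case "end_response": // the agent has finished every message for this turn
      console.log("full response:", completedMessages.join("\n"));
      completedMessages.length = 0;
      break;
  }
});
```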
Ping / Pong
Send a ping to check whether the connection is still alive; the server responds with a pong.
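The exact ping/pong payloads are not shown in this section; assuming they use the same type envelope as the other messages, the exchange would look like:
Client sends:
```json
{"type": "ping"}
```
Server responds:
```json
{"type": "pong"}
```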
Example interaction
Client sends:
```json
{
  "type": "message",
  "data": {
    "message": "Hello there!",
    "start_new_conversation": true,
    "puppeteer_config_name": "my-project"
  }
}
```
Server responds with a streamed reply:
{"type": "stream", "data": {"type": "start"}}
{"type": "stream", "data": {"type": "chunk", "token": "Hi", "index": 0}}
{"type": "stream", "data": {"type": "chunk", "token": "!", "index": 1}}
{"type": "stream", "data": {"type": "chunk", "token": " How", "index": 2}}
{"type": "stream", "data": {"type": "chunk", "token": " can", "index": 3}}
{"type": "stream", "data": {"type": "chunk", "token": " I", "index": 4}}
{"type": "stream", "data": {"type": "chunk", "token": " help", "index": 5}}
{"type": "stream", "data": {"type": "chunk", "token": "?", "index": 6}}
{
"type": "stream",
"data": {
"type": "end",
"message": {
"message": "Hi! How can I help?",
"extras": {
"conversation_id": "1893",
"user_message_id": "132546d5-57fa-4f7e-a324-1d698081d2de",
"reply_message_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890"
}
}
}
}
{"type": "stream", "data": {"type": "end_response", "extras": {}}}