
Message Types

The API sends different message types during streaming to provide rich context.

AI Content

Standard text response from the AI:

```json
{
  "choices": [{
    "delta": {
      "content": "text chunk"
    }
  }]
}
```

```python
for chunk in stream:
    # Custom-message chunks may arrive with an empty choices list, so guard the index.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```

Step Messages

Shows what the AI is doing (e.g., "Analyzing query", "Searching documents"):

```json
{
  "type": "step_message",
  "title": "Analyzing query",
  "status": "in_progress"
}
```

```python
if hasattr(chunk, 'custom_data'):
    if chunk.custom_data.get('type') == 'step_message':
        print(f"\nšŸ”„ {chunk.custom_data.get('title')}")
```

Status values:

  • in_progress - Currently running
  • completed - Finished successfully
  • failed - Encountered an error
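Putting the status values to use, a minimal sketch of a display helper (the `format_step` name and icon mapping are illustrative, not part of the API) might look like:

```python
# Hypothetical helper: map step_message status values to display icons.
STATUS_ICONS = {
    "in_progress": "šŸ”„",
    "completed": "āœ…",
    "failed": "āŒ",
}

def format_step(step: dict) -> str:
    """Render a step_message payload as a one-line progress indicator."""
    icon = STATUS_ICONS.get(step.get("status"), "ā“")
    return f"{icon} {step.get('title', '')}"
```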

Source Messages

Shows which documents were used to generate the answer:

```json
{
  "type": "source_message",
  "content": [
    {
      "node_id": "source_123",
      "text": "Relevant excerpt...",
      "metadata": {
        "file_name": "document.pdf",
        "page": 5
      },
      "score": 0.95
    }
  ]
}
```

```python
if chunk.custom_data.get('type') == 'source_message':
    sources = chunk.custom_data.get('content', [])
    for source in sources:
        print(f"šŸ“„ {source['metadata']['file_name']} (page {source['metadata']['page']})")
```

Suggestions

Follow-up questions the user might want to ask:

```json
{
  "type": "suggestions",
  "suggestions": [
    "What are the requirements for section 169?",
    "How does section 169 relate to section 121?"
  ]
}
```

```python
if chunk.custom_data.get('type') == 'suggestions':
    suggestions = chunk.custom_data.get('suggestions', [])
    for i, suggestion in enumerate(suggestions, 1):
        print(f"šŸ’” {i}. {suggestion}")
```

Human Message

Echo of the user's question:

```json
{
  "type": "human_message",
  "content": "What is section 169?"
}
```
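This type can be handled the same way as the other custom messages; a minimal sketch (the `format_human_message` helper is illustrative, not part of the API) that formats the echo for a chat transcript:

```python
def format_human_message(data: dict) -> str:
    """Render the echoed user question, e.g. for display in a transcript."""
    return f"šŸ‘¤ {data.get('content', '')}"
```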

AI Metadata

Additional information about the response:

```json
{
  "type": "ai_metadata",
  "metadata": {
    "model": "bizora",
    "tokens_used": 150,
    "processing_time_ms": 1250
  }
}
```
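A small sketch for summarizing this payload, useful for logging usage and latency (the `format_ai_metadata` helper is illustrative, assuming the metadata fields shown above):

```python
def format_ai_metadata(data: dict) -> str:
    """Summarize an ai_metadata payload: model, token usage, and latency."""
    meta = data.get("metadata", {})
    return (
        f"model={meta.get('model')} "
        f"tokens={meta.get('tokens_used')} "
        f"latency={meta.get('processing_time_ms')}ms"
    )
```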

Complete Example

```python
from openai import OpenAI

# Assumes an OpenAI-compatible client; base_url and api_key are placeholders.
client = OpenAI(base_url="...", api_key="...")

stream = client.chat.completions.create(
    model="bizora",
    messages=[{"role": "human", "content": "What is section 169?"}],
    stream=True
)

for chunk in stream:
    # AI content (guard the index: custom chunks may carry no choices)
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)

    # Custom messages
    elif hasattr(chunk, 'custom_data'):
        msg_type = chunk.custom_data.get('type')

        if msg_type == 'step_message':
            print(f"\nšŸ”„ {chunk.custom_data.get('title')}")

        elif msg_type == 'source_message':
            sources = chunk.custom_data.get('content', [])
            print(f"\nšŸ“š {len(sources)} sources found")

        elif msg_type == 'suggestions':
            suggestions = chunk.custom_data.get('suggestions', [])
            print(f"\nšŸ’” {len(suggestions)} suggested questions")
```