Stream LLM Response

Send a message to the LLM and receive a streaming response.

Request Parameters

Parameter       Required  Description
message         Yes       The user's message to send to the LLM
clear_history   No        Set to true to clear conversation history before this message

Conversation History

The session automatically maintains conversation history. Use clear_history: true to start a fresh conversation.
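As a sketch, the request body can be assembled as follows (the helper function and its defaults are illustrative, not part of the API):

```python
import json

def build_payload(message, clear_history=False):
    """Build the JSON request body for one chat message.

    Passing clear_history=True wipes the session's conversation
    history before this message is processed.
    """
    payload = {"message": message}
    if clear_history:
        payload["clear_history"] = True  # omit the key for the default behavior
    return json.dumps(payload)
```

For example, `build_payload("Hello!", clear_history=True)` produces `{"message": "Hello!", "clear_history": true}`.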

Response Format

Returns a Server-Sent Events (SSE) stream with JSON objects containing:

  • status: "success" or "fail"
  • sentence: The LLM's response text (streamed in chunks)
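Because each SSE event carries one JSON object, a client can reassemble the full reply by concatenating the sentence fields as they arrive. A minimal parser sketch (the `data:` framing follows the SSE convention; the error handling is illustrative):

```python
import json

def iter_events(lines):
    """Yield the JSON payload of each `data:` line in an SSE stream."""
    for raw in lines:
        line = raw.strip()
        if line.startswith("data:"):
            yield json.loads(line[len("data:"):])

def collect_reply(lines):
    """Concatenate streamed sentence chunks into the complete reply."""
    chunks = []
    for event in iter_events(lines):
        if event.get("status") == "fail":
            raise RuntimeError("LLM request failed mid-stream")
        chunks.append(event.get("sentence", ""))
    return "".join(chunks)
```

In practice, `lines` would come straight from the HTTP response, e.g. `requests`' `iter_lines()` or `urllib`'s file-like response object.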

Example

The request body is a JSON object containing the parameters listed above.
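A hypothetical exchange might look like the following (the endpoint path and sentence chunks are illustrative, not taken from the API):

```
POST /chat HTTP/1.1
Content-Type: application/json

{"message": "Tell me a joke", "clear_history": false}
```

The server then streams SSE events until the reply is complete:

```
data: {"status": "success", "sentence": "Why did the"}

data: {"status": "success", "sentence": " chicken cross the road?"}
```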
