POST https://api.agent700.ai/api/chat
Processes a chat request to the LLM using the caller's active agent
configuration. Returns a non-streaming JSON response by default, or an SSE stream
if streamingEnabled is set to true in the request body.
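For illustration, a minimal non-streaming request might look like the sketch below. Only the endpoint URL and the streamingEnabled flag come from this reference; the messages field name and bearer-token authentication are assumptions.

```typescript
// Minimal sketch of a non-streaming chat request (Node 18+ / browser fetch).
// The `messages` field and bearer-token auth are assumed, not confirmed.
async function chat(): Promise<void> {
  const response = await fetch("https://api.agent700.ai/api/chat", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer YOUR_API_KEY", // assumed auth scheme
    },
    body: JSON.stringify({
      messages: [{ role: "user", content: "Hello!" }], // assumed field name
      streamingEnabled: false, // false (or omitted) returns a single JSON response
    }),
  });

  const data = await response.json();
  console.log(data);
}

chat();
```

Setting streamingEnabled to true in the same body switches the response to an SSE stream instead of a single JSON payload.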