# Infer

Performs an inference task on the provided data.

## Endpoint

## Request Body
| Field | Type | Description |
|---|---|---|
| `infer_type` | string | The type of inference to perform. |
| `infer_params` | object | Additional parameters for the inference. |
| `stream_type` | string | Controls how the response is streamed. Defaults to `"enabled"`. |
### Possible `infer_type` values

- `chat`
- `summarize`
- `smart_chips`
### Possible `stream_type` values

- `disabled`: Returns the entire response at once.
- `enabled`: Streams the response in chunks.
- `per_value`: Streams each value separately.
### Example Request Body

```json
{
  "infer_type": "chat",
  "infer_params": {
    "chat_session_id": "session_123",
    "user_message": "What is artificial intelligence?"
  },
  "stream_type": "enabled"
}
```
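As a rough sketch, a client might validate and assemble this body before sending it. The helper name and the client-side validation below are illustrative, not part of the API:

```python
import json

def build_infer_request(infer_type, infer_params, stream_type="enabled"):
    """Assemble a request body for the Infer endpoint.

    stream_type defaults to "enabled", matching the API's documented default.
    """
    supported = {"chat", "summarize", "smart_chips"}
    if infer_type not in supported:
        # The server would answer 400 Bad Request for an unsupported infer_type;
        # failing early client-side is just a convenience.
        raise ValueError(f"unsupported infer_type: {infer_type}")
    return {
        "infer_type": infer_type,
        "infer_params": infer_params,
        "stream_type": stream_type,
    }

body = build_infer_request(
    "chat",
    {"chat_session_id": "session_123",
     "user_message": "What is artificial intelligence?"},
)
payload = json.dumps(body)  # ready to POST to the endpoint
```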
## Response

The response format depends on `stream_type`:

### For `stream_type: "disabled"`

A single JSON object containing all the data and a `stream_status` of `"stream_over"`.
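For illustration, a disabled-stream response might look like the following (the `result` key is a hypothetical placeholder; the actual data keys depend on the inference type):

```json
{
  "result": "…all of the inference output at once…",
  "stream_status": "stream_over"
}
```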
### For `stream_type: "enabled"`

A stream of JSON objects, each containing:

- `key`: The key for the current chunk of data
- `value`: The current chunk of data
- `stream_status`: `"stream_running"` or `"stream_over"`
### For `stream_type: "per_value"`

A stream of JSON objects, each containing:

- `key`: The key for the current value
- `value`: The current value
- `stream_status`: `"stream_running"` or `"stream_over"`
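A minimal client-side sketch of consuming either streaming mode, assuming each stream element arrives as a JSON object serialized on its own line and that values are strings (the wire framing is an assumption; the API above does not specify it):

```python
import json

def assemble_stream(chunks):
    """Accumulate streamed Infer chunks into one dict, keyed by `key`.

    Handles both modes: "enabled" (partial chunks concatenated per key)
    and "per_value" (each value arrives whole).
    """
    result = {}
    for raw in chunks:
        obj = json.loads(raw)
        # Concatenate chunks that share a key; a per_value stream simply
        # delivers one complete value per key.
        result[obj["key"]] = result.get(obj["key"], "") + obj["value"]
        if obj["stream_status"] == "stream_over":
            break
    return result

# Simulated "enabled" stream: the answer arrives in two chunks.
stream = [
    '{"key": "answer", "value": "Artificial intelligence is ", "stream_status": "stream_running"}',
    '{"key": "answer", "value": "the simulation of human intelligence.", "stream_status": "stream_over"}',
]
assembled = assemble_stream(stream)
```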
## Error Responses

- `400 Bad Request`: Invalid input data or unsupported `infer_type`
This endpoint allows you to perform various inference tasks on your data. The specific behavior and required parameters depend on the `infer_type` chosen. Refer to the documentation for each inference type for more details.