# Smart Chips Inference

Returns small chips of information extracted from various contexts. A chip can be a FAQ pair, a personalised suggested-query question, or a suggested search query. This inference type is useful for surfacing quick, relevant information to users.
## Endpoint
## Request Body

| Field | Type | Description |
|---|---|---|
| infer_type | string | Must be set to "smart_chips" for smart chips inference. |
| infer_params | object | Parameters for the smart chips inference. |
| stream_type | string | Specifies the streaming behavior. Options: "disabled", "enabled", "per_value". Default is "per_value". |
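The top-level request body above can be assembled programmatically. A minimal sketch, assuming the field names in the table; the endpoint URL and transport are deployment-specific and not shown here:

```python
import json

def build_smart_chips_request(chip_type, chip_params, stream_type="per_value"):
    """Assemble the top-level request body for a smart chips inference."""
    return {
        "infer_type": "smart_chips",   # required, fixed value
        "infer_params": {
            "chip_type": chip_type,    # "faq_pair" or "suggested_query"
            "chip_params": chip_params,
        },
        "stream_type": stream_type,    # "disabled", "enabled", or "per_value"
    }

body = build_smart_chips_request("faq_pair", {"count": 3})
print(json.dumps(body, indent=2))
```

The optional infer_params fields (surf_id, search, chat_session_id, and so on) described below would be merged into the "infer_params" object before sending.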
### infer_params (object)

| Field | Type | Description |
|---|---|---|
| chip_type | string | The type of chip to generate (faq_pair, suggested_query). |
| chip_params | object | Parameters for the chip type. |
| surf_id | string | (Optional) Unique identifier for the user. Used to provide additional context from user history across the site. Fetched from the CDP. |
| search | object | (Optional) Search parameters to provide context. Refer to the Search Documents API for more details on the search object structure. |
| chat_session_id | string | (Optional) ID of a previous chat session whose history is used as context. |
| external_context | object | (Optional) Any external context to be considered in the chat. |
| specialised_model | string | (Optional) The name of a specialised model to use for inference. |
### chip_type values (Required)

| Value | Description |
|---|---|
| faq_pair | Generates a FAQ pair chip. |
| suggested_query | Generates a suggested-query question chip. |
### chip_type: faq_pair (FAQ Pair Chip)

Provides a question-and-answer pair extracted from a document or knowledge base.

Request Body Object (infer_params):

```json
{
  "chip_type": "faq_pair",
  "chip_params": {
    "count": 3 // Number of FAQ pairs to generate
  },
  "search": {
    "search_params": {
      "doc_id": "doc_12345"
    }
  }
}
```
Generates three FAQ pairs from a specific document.
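A helper for building this infer_params object might look like the following sketch. The field names come from the tables above; the search object shape follows the Search Documents API as referenced there:

```python
def faq_pair_params(doc_id, count=3, alternate_system_prompt=None):
    """Build infer_params for a faq_pair chip scoped to one document."""
    params = {
        "chip_type": "faq_pair",
        "chip_params": {"count": count},
        # Search context, per the Search Documents API structure.
        "search": {"search_params": {"doc_id": doc_id}},
    }
    if alternate_system_prompt:
        # Optional override of the default system prompt.
        params["chip_params"]["alternate_system_prompt"] = alternate_system_prompt
    return params
```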
Available Parameters:

| Field | Type | Description |
|---|---|---|
| count | integer | Number of FAQ pairs to generate. |
| alternate_system_prompt | string | (Optional) Alternate system prompt to use for generating the FAQ pairs. Lets you modify the default system prompt for special cases. |
Sample Return Value:

```json
{
  "smart_chip_type": "faq",
  "content": [
    {
      "question": "What is artificial intelligence?",
      "answer": "Artificial intelligence refers to the simulation of human intelligence in machines.",
      "source": "doc_12345",
      "chunk_text": "Artificial intelligence refers to the simulation of human intelligence in machines.",
      "relevance_score": 0.95
    },
    {
      "question": "What are the types of artificial intelligence?",
      "answer": "There are two types of artificial intelligence: narrow AI and general AI.",
      "source": "doc_12345",
      "chunk_text": "There are two types of artificial intelligence: narrow AI and general AI.",
      "relevance_score": 0.85
    },
    {
      "question": "What are the applications of artificial intelligence?",
      "answer": "Artificial intelligence is used in various applications such as healthcare, finance, and transportation.",
      "source": "doc_12345",
      "chunk_text": "Artificial intelligence is used in various applications such as healthcare, finance, and transportation.",
      "relevance_score": 0.3
    }
  ]
}
```
Return value content table:

| Field | Type | Description |
|---|---|---|
| question | string | The question extracted from the document. |
| answer | string | The answer extracted from the document. |
| source | string | The document ID from which the FAQ pair was extracted. |
| chunk_text | string | The text chunk containing or relevant to the FAQ pair. |
| relevance_score | number | A score indicating the relevance of the FAQ pair to the context (0.0 to 1.0). |
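Since relevance_score ranges from 0.0 to 1.0, a client can use it to drop weak chips before display. A small sketch, using an abbreviated version of the sample return value above (the 0.5 threshold is an arbitrary illustration, not an API default):

```python
def top_faqs(response, min_score=0.5):
    """Return FAQ chips whose relevance_score clears the threshold,
    highest-scoring first. `response` is the smart chip return value."""
    chips = [c for c in response["content"] if c["relevance_score"] >= min_score]
    return sorted(chips, key=lambda c: c["relevance_score"], reverse=True)

# Abbreviated form of the sample return value above.
sample = {
    "smart_chip_type": "faq",
    "content": [
        {"question": "What is artificial intelligence?", "relevance_score": 0.95},
        {"question": "What are the types of artificial intelligence?", "relevance_score": 0.85},
        {"question": "What are the applications of artificial intelligence?", "relevance_score": 0.3},
    ],
}
```

With the sample data, the 0.3-scored chip is filtered out and the remaining two are returned in descending score order.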
### chip_type: suggested_query (Suggested-Query Question Chip)

Generates a suggested query based on the context of the conversation. This chip is useful for proactively engaging users and collecting feedback or additional information based on the context and custom system prompts.

Request Body Object (infer_params):

```json
{
  "chip_type": "suggested_query",
  "chip_params": {
    "count": 2 // Number of suggested-query questions to generate
  },
  "chat_session_id": "session_12345",
  "surf_id": "user_123"
}
```
Generates two suggested queries based on the conversation context & user history to provide relevant suggestions.
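As with faq_pair, this infer_params object can be built with a small helper; chat_session_id and surf_id are both optional context sources, so the sketch below only includes them when supplied:

```python
def suggested_query_params(count=2, chat_session_id=None, surf_id=None):
    """Build infer_params for suggested_query chips."""
    params = {
        "chip_type": "suggested_query",
        "chip_params": {"count": count},
    }
    if chat_session_id:
        # Reuse a previous chat session's history as context.
        params["chat_session_id"] = chat_session_id
    if surf_id:
        # User identifier (fetched from the CDP) for personalisation.
        params["surf_id"] = surf_id
    return params
```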
Available Parameters:

| Field | Type | Description |
|---|---|---|
| count | integer | Number of suggested queries to generate. |
| alternate_system_prompt | string | (Optional) Alternate system prompt to use for generating the suggested queries. Lets you modify the default system prompt for special cases. |
Sample Return Value:

```json
{
  "smart_chip_type": "suggested_query",
  "content": [
    {
      "question": "Would you like to learn more about artificial intelligence applications?",
      "chunk_text": "Learn more about artificial intelligence applications.",
      "relevance_score": 0.95
    },
    {
      "question": "Can I help you with anything else?",
      "chunk_text": "Learn more about artificial intelligence applications.",
      "relevance_score": 0.85
    }
  ]
}
```
Return value content table:

| Field | Type | Description |
|---|---|---|
| question | string | The suggested query generated. |
| chunk_text | string | The text chunk containing or relevant to the suggested query. |
| relevance_score | number | A score indicating the relevance of the suggested query to the context (0.0 to 1.0). |
## Response

The response is streamed back to the client based on the stream_type specified in the request. By default, stream_type is set to per_value.
### Stream Types

| Stream Type | Description | Use Case |
|---|---|---|
| disabled | Returns all chips at once | When the client can process all chips in one go. |
| enabled | Streams the chips in chunks | When there are many chips to be processed. |
| per_value | Streams each chip separately | When the client needs to process each chip individually. |
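A per_value consumer can be sketched in terms of the streamed messages described in the Response Structure below. The wire format (SSE, chunked JSON lines, etc.) is deployment-specific, so this example operates on already-decoded message dicts:

```python
def collect_chips(messages):
    """Accumulate chips from a per_value stream. Each message is a dict
    with "key", "value", and "stream_status" fields; streaming stops
    once stream_status reports "stream_over"."""
    chips = {}
    for msg in messages:
        if msg.get("value") is not None:
            chips[msg["key"]] = msg["value"]
        if msg.get("stream_status") == "stream_over":
            break
    return chips
```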
### Response Structure

| Field | Type | Description |
|---|---|---|
| key | string | The key for the streamed data. |
| value | object | The chip object containing the extracted information. |
| stream_status | string | Indicates the status of the stream. Can be "stream_running" or "stream_over". |
A key:value pair is streamed for each chip. The values correspond to the objects inside the content array, since those are the actual chips (or pairs of chips).
## Error Responses

400 Bad Request: Invalid input data. Examples:

- Missing required parameters
- Invalid infer_type
- Invalid stream_type
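The common 400 causes above can be caught client-side before sending. A hedged sketch checking only the constraints stated in this document (the valid-value sets come from the tables above):

```python
VALID_STREAM_TYPES = {"disabled", "enabled", "per_value"}
VALID_CHIP_TYPES = {"faq_pair", "suggested_query"}

def validate_request(body):
    """Return a list of problems matching the documented 400 causes:
    wrong infer_type, invalid stream_type, missing required parameters."""
    errors = []
    if body.get("infer_type") != "smart_chips":
        errors.append("infer_type must be 'smart_chips'")
    if body.get("stream_type", "per_value") not in VALID_STREAM_TYPES:
        errors.append("invalid stream_type")
    params = body.get("infer_params") or {}
    if params.get("chip_type") not in VALID_CHIP_TYPES:
        errors.append("missing or invalid chip_type")
    return errors
```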
This inference type is valuable for delivering quick, contextually relevant information to users. It can be used to enhance user engagement, provide personalized suggestions, and improve the overall user experience by offering bite-sized, relevant content.
Note: The smart chips inference API simulates the generation of contextual information chips. In a production environment, it would integrate with sophisticated machine learning models, user behavior analysis, and content recommendation systems to provide highly relevant and personalized information chips.