One-shot summarization uses the same /chats API as multi-turn chat, but with a different prompt strategy: instead of building a conversation, you send a single question and receive a summarized answer based on your indexed documents. This is useful for displaying AI-generated answers alongside traditional search results. Make sure you have completed the setup guide before continuing.

Configure your workspace prompt for summarization

The key difference from a chat interface is the system prompt. For summarization, instruct the model to produce concise, self-contained answers and avoid follow-up questions:
curl \
  -X PATCH 'MEILISEARCH_URL/chats/WORKSPACE_NAME/settings' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "prompts": {
      "system": "You are a search assistant. When the user asks a question, provide a single concise answer based only on the search results. Keep your response to 2-3 sentences maximum. Do not ask follow-up questions. Do not use your general knowledge. If the search results do not contain enough information, say so briefly."
    }
  }'
Key differences from a multi-turn chat prompt:
  • The system prompt explicitly asks for short, self-contained answers
  • The model is told not to ask follow-up questions
  • Responses are limited to a few sentences
You can use the same workspace you already created in the setup guide, or create a dedicated one for this use case. The rest of the workspace configuration (LLM provider, API key, etc.) stays the same.
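If you manage workspace settings from application code rather than curl, the same PATCH request can be sent with Python's standard library. This is a minimal sketch, not an official client: the `summarization_settings` and `patch_settings` helpers are hypothetical names, and `base_url`, `workspace`, and `api_key` stand in for your own `MEILISEARCH_URL`, `WORKSPACE_NAME`, and `MEILISEARCH_KEY` values. The payload shape matches the curl example above.

```python
import json
import urllib.request

# System prompt copied from the curl example above.
SUMMARY_SYSTEM_PROMPT = (
    "You are a search assistant. When the user asks a question, provide a "
    "single concise answer based only on the search results. Keep your "
    "response to 2-3 sentences maximum. Do not ask follow-up questions. "
    "Do not use your general knowledge. If the search results do not "
    "contain enough information, say so briefly."
)

def summarization_settings() -> str:
    """Build the JSON body for PATCH /chats/<workspace>/settings."""
    return json.dumps({"prompts": {"system": SUMMARY_SYSTEM_PROMPT}})

def patch_settings(base_url: str, workspace: str, api_key: str) -> None:
    """Send the settings update (hypothetical helper, stdlib HTTP only)."""
    req = urllib.request.Request(
        f"{base_url}/chats/{workspace}/settings",
        data=summarization_settings().encode("utf-8"),
        method="PATCH",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)
```

Only the system prompt changes here; every other workspace setting is left untouched because PATCH merges the partial payload into the existing configuration.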

Send a single question

Send a request to the chat completions endpoint. The difference from multi-turn chat is that you only send one message and do not maintain conversation history:
curl \
  -N -X POST 'MEILISEARCH_URL/chats/WORKSPACE_NAME/chat/completions' \
  -H 'Authorization: Bearer MEILISEARCH_KEY' \
  -H 'Content-Type: application/json' \
  --data-binary '{
    "model": "PROVIDER_MODEL_UID",
    "messages": [
      {
        "role": "user",
        "content": "What is the return policy for electronics?"
      }
    ],
    "tools": [
      {
        "type": "function",
        "function": {
          "name": "_meiliSearchSources",
          "description": "Provides sources of the search",
          "parameters": {
            "type": "object",
            "properties": {
              "call_id": { "type": "string", "description": "The call ID to track the original search" },
              "documents": { "type": "object", "description": "The documents associated with the search" }
            },
            "required": ["call_id", "documents"],
            "additionalProperties": false
          },
          "strict": true
        }
      }
    ]
  }'
Including the _meiliSearchSources tool lets you display the source documents alongside the summarized answer, so users can verify the information. In a real application, you would run this in parallel with a standard Meilisearch search request and display both results together.
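On the client side, the response arrives as a stream of OpenAI-style chat-completion chunks, with the answer text in `delta.content` and the `_meiliSearchSources` arguments delivered incrementally in `delta.tool_calls[*].function.arguments`. The sketch below (the `split_answer_and_sources` function is a hypothetical helper, and it assumes that standard chunk format) separates the two so you can render the summary and its source documents side by side:

```python
import json

def split_answer_and_sources(sse_lines):
    """Accumulate answer text and _meiliSearchSources arguments from an
    SSE stream of OpenAI-style chat-completion chunks (one 'data: ...'
    line per chunk).
    """
    answer_parts = []
    source_fragments = []  # incremental JSON fragments of the tool arguments
    for line in sse_lines:
        if not line.startswith("data: ") or line == "data: [DONE]":
            continue
        delta = json.loads(line[len("data: "):])["choices"][0]["delta"]
        if delta.get("content"):
            answer_parts.append(delta["content"])
        for call in delta.get("tool_calls") or []:
            name = call.get("function", {}).get("name")
            args = call.get("function", {}).get("arguments")
            # Later chunks may omit the name while continuing the arguments.
            if args and name in ("_meiliSearchSources", None):
                source_fragments.append(args)
    sources = json.loads("".join(source_fragments)) if source_fragments else None
    return "".join(answer_parts), sources
```

In practice you would feed this the lines of the streamed HTTP response body, and fire a standard search request (POST to the index's search endpoint) in parallel so both results render together.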

Next steps

Build a chat interface

Create a multi-turn conversational interface with follow-up questions.

Display source documents

Show users which documents were used to generate the summary.

Configure guardrails

Restrict AI responses to topics covered by your data.

Reduce hallucination

Learn techniques to improve accuracy of AI-generated answers.