Use cases
Conversational search supports three main use cases, all powered by the same /chats API route:
Multi-turn chat
Build a full conversational interface where users ask follow-up questions and the agent maintains context across the conversation. This is ideal for knowledge bases, customer support, and documentation search. Example: A user asks “What models do you support?”, then follows up with “Which one is the fastest?” without restating the context.
One-shot answer summarization
Generate a single, concise answer to a user’s question without maintaining conversation history. This is useful when you want to display a summarized answer alongside traditional search results. Example: A user searches “How do I reset my password?” and gets a direct answer synthesized from your help articles, displayed above the regular search results.
RAG pipelines
Integrate Meilisearch as the retrieval layer in a broader RAG architecture. Meilisearch handles query understanding and hybrid retrieval, while your application controls the generation step. Example: A product recommendation engine that retrieves matching products via Meilisearch, then uses a custom prompt to generate personalized suggestions.
How it works
- Query understanding: Meilisearch automatically transforms the user’s natural language question into optimized search parameters
- Hybrid retrieval: combines keyword and semantic search for better relevancy
- Answer generation: your chosen LLM generates a response using only the retrieved documents as context
- Source attribution: every response can include references to the source documents used to generate the answer
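The four stages above can be sketched conceptually in a few lines of Python. This is an illustration only: the function names are hypothetical, and the keyword filter merely stands in for Meilisearch's actual query understanding and hybrid retrieval, which happen server-side.

```python
# Conceptual sketch of the retrieval-augmented answering flow described
# above. Nothing here is Meilisearch's API; names are illustrative.
def answer(question, documents, generate):
    # Stages 1-2 (query understanding + hybrid retrieval) happen inside
    # Meilisearch; a naive keyword match stands in for them here.
    words = question.lower().split()
    retrieved = [d for d in documents if any(w in d["text"].lower().split() for w in words)]
    # Stage 3 (answer generation): the LLM sees only retrieved documents.
    context = "\n".join(d["text"] for d in retrieved)
    reply = generate(question, context)
    # Stage 4 (source attribution): return the documents used as sources.
    return {"answer": reply, "sources": [d["id"] for d in retrieved]}

docs = [
    {"id": "1", "text": "Reset your password from the account settings page."},
    {"id": "2", "text": "Billing happens monthly."},
]
result = answer("How do I reset my password?", docs, lambda q, c: f"Based on: {c}")
print(result["sources"])  # only the matching document is attributed
```

The key property this illustrates is that generation is grounded: the `generate` step receives only the retrieved context, and the sources returned are exactly the documents that context came from.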
Implementation strategies
Chat completions API (recommended)
In the majority of cases, you should use the /chats route to build conversational search. This API consolidates the entire RAG pipeline into a single endpoint, handling retrieval, context management, and generation.
Follow the getting started guide to set up conversational search, then build a chat interface or generate summarized answers.
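As a rough sketch of what a request to the /chats route looks like, the snippet below prepares (but does not send) an HTTP request. The host, workspace name, model, and API key are all placeholders, not values from this document; consult the getting started guide for the actual setup.

```python
import json
import urllib.request

MEILI_URL = "http://localhost:7700"   # assumed local instance
WORKSPACE = "docs-chat"               # hypothetical workspace name

def prepare_chat_request(question, api_key="MEILI_API_KEY"):
    """Build a POST request to a chat completions endpoint under /chats.

    The endpoint path and payload shape here are assumptions for
    illustration; verify them against the official guide.
    """
    body = json.dumps({
        "model": "gpt-4o-mini",  # assumed model; configured per workspace
        "messages": [{"role": "user", "content": question}],
        "stream": False,
    }).encode()
    return urllib.request.Request(
        f"{MEILI_URL}/chats/{WORKSPACE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = prepare_chat_request("How do I reset my password?")
print(req.full_url)
```

Because the endpoint consolidates retrieval and generation, the client sends only the conversation messages; there is no separate search call to make before generating the answer.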