What you’ll build
A FastAPI endpoint that takes a user message, forwards it to Memic’s chat endpoint, and returns a grounded answer plus citations, in under 40 lines of code.

Prerequisites
- Python 3.10+
- A Memic environment with some documents already uploaded
- An API key for that environment
```bash
pip install memic fastapi uvicorn
```
The code
Adding conversation history
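Because every request is independent, the pattern is: append the user turn, call chat with the whole history, then append the assistant reply so the next turn has context. A sketch under the same assumptions as above (a local stub stands in for the real client, and the `chat(messages=...)` signature is assumed):

```python
# Stand-in for the real Memic client; chat(messages=...) is an assumed API.
class MemicStub:
    def chat(self, messages):
        return {"answer": f"echo of turn {len(messages)}", "citations": []}

memic = MemicStub()

def converse(history, user_message):
    """Append the new user turn, call Memic with the full history,
    then record the assistant reply so the next turn has context."""
    history.append({"role": "user", "content": user_message})
    result = memic.chat(messages=history)
    history.append({"role": "assistant", "content": result["answer"]})
    return result

history = []  # kept per session by your frontend or session store
converse(history, "What is our refund policy?")
converse(history, "Does that apply to annual plans?")
# history now holds four messages: two user turns and two assistant replies
```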
Memic is stateless, so to hold a real conversation your frontend or session store must keep the message array and pass it on every turn.

Rendering citations in a UI
Each citation points back to a specific passage in a specific file. A reasonable UI pattern: show the answer text, then list each citation below it as an expandable snippet labeled with its source filename.

Going further
- Multi-tenant: swap the API key per customer — see Per-customer isolation
- Custom system prompts: use managed prompts to maintain your assistant’s persona in the dashboard without redeploys
- Retrieval-only: if you want full control over the LLM call, use `memic.search(...)` instead of `memic.chat(...)` and plug the results into your own OpenAI/Anthropic call
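The retrieval-only path from the last bullet can be sketched as follows. Both pieces are stand-ins: the `search(...)` result shape is an assumption about the Memic SDK, and `call_llm` is a placeholder for your own OpenAI/Anthropic request (you would send `prompt` as a chat message there).

```python
# Stand-in for the real Memic client; the search(...) signature and
# the passage shape (file + text) are assumptions about the SDK.
class MemicStub:
    def search(self, query, top_k=5):
        return [{"file": "refunds.md", "text": "Refunds are issued within 30 days."}]

def call_llm(prompt):
    # Placeholder for your own OpenAI/Anthropic completion call.
    return "stubbed completion"

memic = MemicStub()

def build_prompt(question, passages):
    # Context first (tagged with filenames so the model can cite), then the question.
    context = "\n\n".join(f"[{p['file']}]\n{p['text']}" for p in passages)
    return (
        "Answer using only the context below; cite filenames.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

def answer(question):
    passages = memic.search(question, top_k=5)  # retrieval via Memic
    return call_llm(build_prompt(question, passages))  # generation under your control
```

The trade-off versus `memic.chat(...)`: you pick the model, prompt, and parameters yourself, but you also own prompt construction and citation handling.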