This recipe walks through building a customer-facing chatbot that answers questions from your own indexed documents, with source citations. Full working code in Python.

What you’ll build

A FastAPI endpoint that takes a user message, forwards it to Memic’s chat endpoint, and returns a grounded answer plus citations. Under 40 lines of code.

Prerequisites

  • Python 3.10+
  • A Memic environment with some documents already uploaded
  • An API key for that environment
  • pip install memic fastapi uvicorn

The code

import os
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from memic import Memic

app = FastAPI()
memic = Memic(api_key=os.environ["MEMIC_API_KEY"])


class ChatRequest(BaseModel):
    messages: list[dict]  # [{"role": "user", "content": "..."}]


class Citation(BaseModel):
    file_name: str
    page: int | None
    passage: str


class ChatResponse(BaseModel):
    answer: str
    citations: list[Citation]


@app.post("/chat", response_model=ChatResponse)
def chat(req: ChatRequest):
    try:
        result = memic.chat(messages=req.messages)
    except Exception as e:
        raise HTTPException(500, f"Memic call failed: {e}")

    return ChatResponse(
        answer=result.answer,
        citations=[
            Citation(
                file_name=c.file_name,
                page=c.page,
                passage=c.passage,
            )
            for c in result.citations
        ],
    )
Run it:
MEMIC_API_KEY=mk_live_... uvicorn main:app --reload
Call it:
curl http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "user", "content": "What is our refund policy?"}
    ]
  }'

Adding conversation history

Memic's chat endpoint is stateless. To hold a real conversation, your frontend or session store has to keep the full message array and send it on every turn:
# In your session:
session.messages.append({"role": "user", "content": user_input})
result = memic.chat(messages=session.messages)
session.messages.append({"role": "assistant", "content": result.answer})

Rendering citations in a UI

Each citation points back to a specific passage in a specific file. A reasonable UI pattern:
<div>
  <p>{answer}</p>
  <div className="sources">
    {citations.map(c => (
      <a key={c.file_id} href={`/files/${c.file_id}#page=${c.page}`}>
        📄 {c.file_name} (page {c.page})
      </a>
    ))}
  </div>
</div>

Going further

  • Multi-tenant: swap the API key per customer — see Per-customer isolation
  • Custom system prompts: use managed prompts to maintain your assistant’s persona in the dashboard without redeploys
  • Retrieval-only: if you want full control over the LLM call, use memic.search(...) instead of memic.chat(...) and plug the results into your own OpenAI/Anthropic call
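For the retrieval-only route, one common pattern is to format the search hits into a grounded prompt and pass that to your own LLM call. A sketch under stated assumptions: the result shape (`file_name`, `page`, `passage`, mirroring the Citation model above) is an assumption about what `memic.search(...)` returns, and `build_grounded_prompt` is a hypothetical helper, not part of the SDK.

```python
def build_grounded_prompt(question: str, results: list[dict]) -> str:
    """Format retrieval hits into a prompt for your own OpenAI/Anthropic call.

    Assumes each result has `file_name`, `page`, and `passage` keys
    (mirroring the Citation model); adjust to the actual shape
    returned by memic.search(...).
    """
    sources = "\n\n".join(
        f"[{i}] {r['file_name']} (page {r['page']}):\n{r['passage']}"
        for i, r in enumerate(results, start=1)
    )
    return (
        "Answer the question using only the sources below. "
        "Cite sources by their [number].\n\n"
        f"Sources:\n{sources}\n\n"
        f"Question: {question}"
    )


# results = memic.search(query="refund policy")  # then, with stubbed hits:
prompt = build_grounded_prompt(
    "What is our refund policy?",
    [{"file_name": "policy.pdf", "page": 3, "passage": "Refunds within 30 days."}],
)
```

Numbering the sources lets the model cite `[1]`, `[2]`, etc., which you can map back to files and pages when rendering the answer.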