# Lumen Engineering Docs
Lumen is a self-hosted AI knowledge base platform. You upload documents to projects, and an AI assistant answers questions strictly from those documents with citations. Zero third-party data dependency — the only outbound call is the LLM inference API.
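The answer-from-documents behavior boils down to nearest-neighbor search over chunk embeddings. A minimal TypeScript sketch of that ranking step, with all names and data hypothetical (in production the search runs inside PostgreSQL via pgvector, followed by the BGE reranker):

```typescript
// Hypothetical sketch of the retrieval step: rank stored chunk
// embeddings against a query embedding by cosine similarity.
// Production runs this search in pgvector, not in application code.

type Chunk = { id: string; text: string; embedding: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k chunks most similar to the query embedding; these
// become the only context the LLM is allowed to answer from.
function topKChunks(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) =>
      cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, k);
}
```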
## What lives where
| Path | Purpose |
|---|---|
| apps/web/ | Next.js 15 frontend (App Router, Tailwind 4) |
| apps/api/ | Hono on Bun — REST API, auth, chat orchestration |
| apps/embedder/ | Python FastAPI — multilingual-e5-small + BGE-reranker |
| apps/worker/ | Python — BullMQ consumer: parse → chunk → embed |
| apps/docs/ | This site (Next 15 + MDX) |
| packages/shared/ | Shared TypeScript types |
| scripts/ | Utility scripts |
| docker-compose.yml | 6-service prod stack |
| AGENTS.md | Project rules for AI agents working on the repo |
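The worker's parse → chunk → embed pipeline splits parsed text into overlapping windows before sending them to the embedder. A sketch of the chunking step; the window size and overlap here are illustrative, not the worker's actual parameters:

```typescript
// Illustrative chunker: split text into fixed-size windows with
// overlap, so content spanning a boundary appears in both chunks.
// Sizes are examples only; the real worker may differ.

function chunkText(text: string, size = 500, overlap = 100): string[] {
  if (size <= overlap) throw new Error("size must exceed overlap");
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last window reached the end
  }
  return chunks;
}
```

Each chunk is then embedded (384-dim vectors from multilingual-e5-small) and written to pgvector.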
## Entry points
- Start here: Getting started
- Understand the code: Repo layout → Architecture
- Build a feature: API reference → Access control
- Ship it: Dokploy deploy → Runbooks
## Stack summary
| Layer | Tech |
|---|---|
| Frontend | Next.js 15 App Router · Tailwind 4 · SWR · TypeScript |
| Backend | Hono · Bun runtime · Prisma · Zod |
| Database | PostgreSQL 16 + pgvector |
| Queue | BullMQ + Redis |
| Embedder | multilingual-e5-small (384-dim, 100+ langs) |
| Reranker | BGE-reranker-v2-m3 |
| LLM | Configurable via /engineer/providers (OpenAI-compatible) |
| Auth | Custom JWT (15m access + 7d refresh) |
| Deployment | Docker Compose on Dokploy, Traefik routing |
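The auth lifetimes in the table (15-minute access token, 7-day refresh token) can be sketched as an HMAC-signed token pair. This is an illustrative sketch, not the actual signing code in apps/api:

```typescript
import { createHmac } from "node:crypto";

// Illustrative JWT-style token pair using the lifetimes from the
// stack summary: 15m access, 7d refresh. Hypothetical sketch only.

const ACCESS_TTL_S = 15 * 60;           // 15 minutes
const REFRESH_TTL_S = 7 * 24 * 60 * 60; // 7 days

// Encode header, payload, and an HS256 signature as base64url segments.
function sign(payload: object, secret: string): string {
  const header = Buffer.from(JSON.stringify({ alg: "HS256", typ: "JWT" }))
    .toString("base64url");
  const body = Buffer.from(JSON.stringify(payload)).toString("base64url");
  const sig = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest("base64url");
  return `${header}.${body}.${sig}`;
}

function issueTokenPair(
  userId: string,
  secret: string,
  now = Math.floor(Date.now() / 1000),
) {
  return {
    access: sign({ sub: userId, exp: now + ACCESS_TTL_S }, secret),
    refresh: sign({ sub: userId, exp: now + REFRESH_TTL_S }, secret),
  };
}
```

When the access token expires, the client exchanges the refresh token for a new pair rather than re-authenticating.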
## Prod infra (single tenant — dev phase)
- Web: ai-kb.zenmail.my.id
- API: lumen-api.zenmail.my.id
- Docs: you're here (lumen-docs.zenmail.my.id)
- Host: `jaeger` — Dokploy v0.29.2, deploy via GitHub webhook on `master`