## How to read this document
- Dependencies list task IDs that must be complete before this task starts
- Parallel group identifies tasks that can run simultaneously within a phase
- Target identifies which repo and branch the work goes into
- Tasks are numbered P{phase}-{sequence} (e.g., P0-3)
- Acceptance criteria are binary — pass or fail, no judgment calls
## Phase 1: Single-User Serverless Wiki
Goal: Port the existing Otterwiki + API + semantic search + MCP stack to run on Lambda + EFS. Single user, single wiki, no multi-tenancy.
### P1-1: Mangum Adapter for Otterwiki
Parallel group: Can start during Phase 0 (upstream work)
Dependencies: None (upstream contribution, independent of wikibot-io infra)
Target: otterwiki fork, PR to main
Description: Add Lambda compatibility to Otterwiki via the Mangum adapter. Mangum is an ASGI adapter, so the Flask (WSGI) app needs an ASGI bridge (e.g., asgiref's WsgiToAsgi) before Mangum can invoke it on Lambda. The adapter should be opt-in — Otterwiki continues to work as a normal Flask app when not on Lambda.
Investigate Otterwiki's filesystem assumptions beyond the git repo: config files, static assets, session storage, temporary files. Document what needs to live on EFS vs. what can be ephemeral.
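A conditionally loaded handler might look like this sketch (the module path `otterwiki.server` and the export name `handler` are assumptions about the Otterwiki codebase):

```python
# Sketch of otterwiki/lambda_handler.py — opt-in: if the optional Lambda
# dependencies are missing, the module still imports cleanly and exposes
# handler = None, leaving normal Flask operation untouched.
try:
    from asgiref.wsgi import WsgiToAsgi  # bridges WSGI -> ASGI
    from mangum import Mangum            # bridges ASGI -> Lambda events
    from otterwiki.server import app     # the existing Flask (WSGI) app
except ImportError:
    handler = None
else:
    # lifespan="off": a wrapped WSGI app has no ASGI lifespan events
    handler = Mangum(WsgiToAsgi(app), lifespan="off")
```

Keeping the Mangum import inside the guard is what satisfies the "no import errors when Mangum is not installed" criterion below.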
Deliverables:
- `otterwiki/lambda_handler.py` — Mangum wrapper, conditionally loaded
- Documentation of filesystem assumptions and EFS requirements
- Tests verifying Otterwiki initializes correctly under Mangum
- No changes to existing Otterwiki behavior when Mangum is not used
Acceptance criteria:
- Otterwiki can be invoked via Mangum handler
- Existing test suite still passes
- Static assets served correctly
- Config file loading works from EFS path
- Git repo operations work with EFS-mounted repo path
- No import errors when Mangum is not installed (optional dependency)
### P1-2: FAISS Backend for Semantic Search
Parallel group: Can start during Phase 0 (upstream work)
Dependencies: None (upstream contribution, independent of wikibot-io infra)
Target: otterwiki-semantic-search repo, PR to main
Description: Add FAISS as an alternative vector store backend alongside the existing ChromaDB backend. FAISS stores indexes as files on disk (suitable for EFS). The backend is selected via configuration.
The embedding function must be pluggable: the existing all-MiniLM-L6-v2 sentence-transformer for local/ChromaDB use, and a Bedrock titan-embed-text-v2 adapter for Lambda use. The Bedrock adapter is a new addition but should be a clean interface implementation, not Lambda-specific.
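A minimal sketch of the Bedrock adapter, with the `bedrock-runtime` client injected so unit tests can substitute a stub (the class name and constructor parameters are illustrative, not the plugin's actual interface; the model ID and request/response fields follow the Titan v2 API):

```python
import json


class TitanEmbedder:
    """Hypothetical Bedrock embedding adapter for titan-embed-text-v2."""

    MODEL_ID = "amazon.titan-embed-text-v2:0"

    def __init__(self, client, dimensions=1024):
        # client is a boto3 bedrock-runtime client in production and a
        # stub in unit tests, keeping the adapter Lambda-agnostic
        self.client = client
        self.dimensions = dimensions

    def embed(self, texts):
        """Return one embedding vector per input text."""
        vectors = []
        for text in texts:
            body = json.dumps({"inputText": text, "dimensions": self.dimensions})
            response = self.client.invoke_model(modelId=self.MODEL_ID, body=body)
            payload = json.loads(response["body"].read())
            vectors.append(payload["embedding"])
        return vectors
```

Injecting the client is one way to keep the adapter "a clean interface implementation, not Lambda-specific", as required above.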
Reference the PRD's FAISS details: `IndexFlatIP`, sidecar metadata in `embeddings.json`, search deduplication by page.
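The dedup-by-page rule could be sketched as follows (the function name and the `(page, score, chunk)` tuple shape are assumptions): chunk-level hits are collapsed to the best-scoring hit per page before returning the top N.

```python
def dedupe_by_page(hits, top_n):
    """Collapse chunk-level hits [(page_path, score, chunk_text), ...]
    to the best-scoring hit per page, then return the top_n pages."""
    best = {}
    for page, score, chunk in hits:
        # Keep only the highest-scoring chunk seen for each page
        if page not in best or score > best[page][0]:
            best[page] = (score, chunk)
    ranked = sorted(best.items(), key=lambda item: item[1][0], reverse=True)
    return [(page, score, chunk) for page, (score, chunk) in ranked[:top_n]]
```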
Deliverables:
- `otterwiki_semantic_search/backends/faiss_backend.py` — FAISS index management (create, upsert, delete, search)
- `otterwiki_semantic_search/backends/chroma_backend.py` — extracted from existing code
- `otterwiki_semantic_search/backends/base.py` — abstract backend interface
- `otterwiki_semantic_search/embeddings/bedrock.py` — Bedrock titan-embed-text-v2 adapter
- `otterwiki_semantic_search/embeddings/base.py` — abstract embedding interface
- Configuration switch: `VECTOR_BACKEND=faiss|chroma`, `EMBEDDING_MODEL=local|bedrock`
- Unit tests for FAISS backend (using `/tmp` for index files)
- Unit tests for Bedrock embedding adapter (mocked)
- Existing ChromaDB tests still pass
Acceptance criteria:
- FAISS backend stores/retrieves/deletes vectors correctly
- Sidecar metadata (`embeddings.json`) maps index positions to page paths and chunk text
- Search deduplicates by page, returns top N unique pages
- Bedrock adapter produces vectors of correct dimensionality
- Backend is selected by config, defaults to ChromaDB for backward compatibility
- Existing tests pass without modification
- New tests cover FAISS CRUD and search operations
### P1-3: Otterwiki on Lambda
Parallel group: Phase 1 core
Dependencies: P0-2 (EFS + Lambda infra), P1-1 (Mangum adapter)
Target: wikibot-io repo, feat/P1-3-otterwiki-lambda
Description: Deploy Otterwiki to Lambda using the Mangum adapter from P1-1. Configure it to use a git repo on EFS. Verify the web UI works end-to-end: browse pages, create pages, edit pages, view history.
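The "Otterwiki configuration for Lambda" deliverable might reduce to a small settings file on EFS plus one environment variable pointing at it; this is a config-fragment sketch in which the `/mnt/wiki` mount path and the `SITE_NAME` value are assumptions:

```python
# Assumed settings file at /mnt/wiki/settings.cfg, referenced from the
# Lambda environment via OTTERWIKI_SETTINGS=/mnt/wiki/settings.cfg
REPOSITORY = "/mnt/wiki/repository"  # git repo on the EFS mount
SITE_NAME = "dev wiki"
```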
Deliverables:
- Pulumi Lambda function for Otterwiki with EFS mount
- Otterwiki configuration for Lambda environment (EFS paths, settings)
- API Gateway route for web UI (`/{path+}`)
- Integration test: create page via web UI API, read it back
Acceptance criteria:
- Otterwiki web UI loads in a browser
- Can create, edit, and delete pages
- Page history works
- Static assets (CSS, JS) load correctly
- Git repo on EFS persists across invocations
### P1-4: REST API on Lambda
Parallel group: Phase 1 (parallel with P1-5, P1-6 once P1-3 lands)
Dependencies: P1-3
Target: wikibot-io repo, feat/P1-4-api-lambda
Description: Deploy the existing otterwiki-api plugin on Lambda alongside Otterwiki and verify all API endpoints work. The API plugin is already a Flask blueprint, so it should load unchanged in the Lambda environment.
Deliverables:
- API plugin installed and loaded in Lambda Otterwiki instance
- API Gateway routes for `/api/v1/*`
- Integration tests: CRUD pages via API, search, link graph, changelog
Acceptance criteria:
- All existing API endpoints respond correctly
- API key auth works via env var
- WikiLink index builds on startup
- Full-text search returns results
- Page create/read/update/delete cycle works
- Link graph queries return correct data
### P1-5: MCP Server on Lambda
Parallel group: Phase 1 (parallel with P1-4, P1-6)
Dependencies: P1-3, P0-7 (MCP + WorkOS on Lambda)
Target: wikibot-io repo, feat/P1-5-mcp-lambda
Description: Deploy the existing otterwiki-mcp server on Lambda, adapting it from SSE to the Streamable HTTP transport. The MCP server calls the REST API (P1-4) internally. Auth is via WorkOS OAuth (from P0-7).
Check whether the MCP server and Otterwiki can run in the same Lambda function (sharing the Flask process) or need separate Lambdas. Same-Lambda is simpler and avoids internal HTTP; separate Lambdas provide isolation. Document the decision.
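If the same-Lambda option wins, one shape for it — a sketch assuming the MCP server uses the official Python SDK's FastMCP, which this plan does not confirm — is to expose the MCP server as an ASGI app mountable under `/mcp` next to the Flask app, so tool calls never loop back through API Gateway:

```python
# Guarded import so the sketch degrades gracefully when the MCP SDK
# is not installed in the current environment.
try:
    from mcp.server.fastmcp import FastMCP
except ImportError:
    asgi_app = None
else:
    mcp = FastMCP("otterwiki-mcp")

    @mcp.tool()
    def read_note(path: str) -> str:
        """Placeholder tool; the real server would proxy the REST API."""
        return f"contents of {path}"

    # Streamable HTTP transport exposed as an ASGI app
    asgi_app = mcp.streamable_http_app()
```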
Deliverables:
- MCP server deployed on Lambda (same or separate from Otterwiki)
- Streamable HTTP transport configured
- All 12 MCP tools functional
- Integration test: read_note, write_note, search_notes via MCP protocol
Acceptance criteria:
- All MCP tools return correct results
- OAuth authentication works
- Bearer token authentication works
- Streamable HTTP transport functions correctly
- Architecture decision (same vs. separate Lambda) documented with rationale
### P1-6: Semantic Search on Lambda
Parallel group: Phase 1 (parallel with P1-4, P1-5)
Dependencies: P1-3, P1-2 (FAISS backend)
Target: wikibot-io repo, feat/P1-6-semantic-search-lambda
Description: Deploy the semantic search plugin on Lambda using the FAISS backend from P1-2, with the FAISS index stored on EFS alongside the git repo. Embedding is via Bedrock titan-embed-text-v2, which requires a VPC interface endpoint for Bedrock — add it to Pulumi if not already present.
Note: In the wikibot.io context, semantic search is a premium feature (Phase 5+). But we validate it works on Lambda now to de-risk. For Phase 1, it runs but is not gated.
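For reference, the `embeddings.json` sidecar persisted next to the FAISS index might take a shape like this (the PRD only specifies that index positions map to page paths and chunk text; every other field here is an assumption):

```json
{
  "model": "amazon.titan-embed-text-v2:0",
  "dimension": 1024,
  "entries": [
    {"position": 0, "page": "Home", "chunk": "Welcome to the wiki ..."},
    {"position": 1, "page": "Home", "chunk": "Getting started ..."}
  ]
}
```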
Deliverables:
- Semantic search plugin deployed with FAISS backend
- Bedrock VPC endpoint added to Pulumi (if needed for this phase)
- FAISS index persists on EFS
- Integration test: index a page, semantic search returns it
Acceptance criteria:
- Semantic search endpoint returns relevant results
- FAISS index persists across Lambda invocations
- Reindex endpoint works (full rebuild from repo)
- Bedrock embedding calls succeed from VPC Lambda
### P1-7: Routing and TLS
Parallel group: Phase 1
Dependencies: P1-3
Target: wikibot-io repo, feat/P1-7-routing-tls
Description: Configure API Gateway routing for all endpoints: web UI, REST API, MCP. Set up a custom domain with TLS via ACM. Route 53 DNS record pointing to API Gateway.
For Phase 1, this is a single domain (e.g., dev.wikibot.io) with path-based routing. Multi-tenant subdomain routing comes in Phase 2.
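Under that single-domain scheme, the route table might look like the fragment below (route keys are assumptions; API Gateway HTTP APIs match the most specific route first, so the catch-all safely backstops the web UI):

```
ANY /api/v1/{proxy+}  ->  REST API Lambda
ANY /mcp              ->  MCP server Lambda
ANY /{proxy+}         ->  Otterwiki web UI Lambda (catch-all)
```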
Deliverables:
- `infra/components/api_gateway.py` — routes for web UI, API, MCP
- `infra/components/dns.py` — Route 53 record + ACM certificate
- Pulumi outputs for endpoint URLs
Acceptance criteria:
- `https://dev.wikibot.io/` serves Otterwiki web UI
- `https://dev.wikibot.io/api/v1/health` returns 200
- `https://dev.wikibot.io/mcp` accepts MCP connections
- TLS certificate valid and auto-renewing (ACM)
- All routes pass through to correct Lambda handlers
### P1-8: Phase 1 E2E Test
Parallel group: Phase 1 (final — all other P1 tasks must complete)
Dependencies: P1-4, P1-5, P1-6, P1-7
Target: wikibot-io repo, feat/P1-8-e2e
Description: End-to-end test covering the full single-user workflow: browser creates a page, API reads it, MCP reads it, semantic search finds it.
Deliverables:
- `tests/e2e/test_phase1.py` — end-to-end test script
- Results written to Dev/Phase 1 Summary per the Agent Conventions documentation loop
Acceptance criteria:
- Create page via Otterwiki web UI → readable via API and MCP
- Create page via API → visible in web UI and searchable
- Write via MCP → visible in web UI and API
- Semantic search finds pages by meaning, not just keywords
- Git history shows correct authorship for all write paths
### P1-9: Self Hosting
Dependencies: P1-8
Target: wikibot-io repo (config), dev wiki repo (content)
Description: Migrate the local development wiki from docker-compose to dev.wikibot.io. Use Otterwiki's built-in Git HTTP server (`GIT_WEB_SERVER=True`) to `git push` the dev wiki repo directly, preserving full git history.
Platform config already set: `RETAIN_PAGE_NAME_CASE=True`, `TREAT_UNDERSCORE_AS_SPACE_FOR_TITLES=True`, `GIT_WEB_SERVER=True`.
Steps:
- `pulumi up` to deploy latest config (`GIT_WEB_SERVER`, page name settings)
- Purge test content from dev.wikibot.io EFS repo (reset to empty git repo)
- Add dev.wikibot.io as a git remote on the local dev wiki repo
- `git push` dev wiki content to `https://dev.wikibot.io/.git`
- Trigger `/api/v1/reindex` to rebuild FAISS semantic search index
- Update Claude Code MCP config to point at `https://dev.wikibot.io/mcp`
- Spot check — human via browser, agent via remote MCP
- Shut down local docker-compose dev wiki
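The Claude Code MCP config update could be a `.mcp.json` entry along these lines (the server name `wikibot-dev` is an assumption; the URL comes from this plan):

```json
{
  "mcpServers": {
    "wikibot-dev": {
      "type": "http",
      "url": "https://dev.wikibot.io/mcp"
    }
  }
}
```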
Deliverables:
- dev.wikibot.io hosts the complete dev wiki with full git history
- Claude Code MCP config updated to use remote server
- Local docker-compose dev wiki shut down
Acceptance criteria:
- `git push` succeeds with full history preserved
- User spot checks `https://dev.wikibot.io/` to confirm content looks correct
- Agent spot checks via remote MCP to confirm content matches local
- Semantic search returns results on remote instance
- Agent can read/write wiki pages via remote MCP to continue Phase 2 work