We believe in building in public. This is our first monthly update, covering what we shipped in January 2026.
What We Shipped
Ottly Chat (v1.0)
Our AI assistant went live with full support for:
- Web search with real-time results and citations
- Multi-model support (GPT-4o, Gemini 2.5 Flash, Claude Sonnet)
- Code generation with syntax highlighting
- Document and image analysis
- Conversation history with search
Infrastructure
- Deployed on AWS ECS with automated rollback on failed health checks
- Set up a CI/CD pipeline that runs integration tests before every production deploy
- Configured Cloudflare DNS with automatic SSL for all subdomains
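The automated rollback mentioned above maps naturally onto ECS's deployment circuit breaker. A minimal sketch of the relevant service deployment configuration (values illustrative, not our exact settings):

```json
{
  "deploymentConfiguration": {
    "deploymentCircuitBreaker": { "enable": true, "rollback": true },
    "maximumPercent": 200,
    "minimumHealthyPercent": 100
  }
}
```

With the circuit breaker enabled, ECS marks a deployment as failed when new tasks repeatedly fail to reach a steady state (including failed health checks), and with `rollback` set it automatically reverts to the last known-good deployment.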
Developer Experience
- Established our repository structure, with an independent git repository per service
- Set up pre-push hooks that require a passing build before code reaches the remote
- Created comprehensive CLAUDE.md documentation for AI-assisted development
What We Learned
Keep the architecture simple. We started with a microservices-heavy design and quickly simplified to three core services: API server (Go), frontend (Next.js), and worker (Python). Complexity should be earned, not assumed.
Multi-model support matters from day one. Adding provider abstraction after the fact is painful. We are glad we did it early.
What's Coming in February
- Ottly Desk (AI journalism workspace) launch
- Workflow automation MVP (later became Ottly Automate)
- Improved streaming performance for long AI responses
Thanks for following along. See you next month.