What I use
Last updated:
The current stack I reach for when building AI-native automation, agentic systems, MCP servers, and high-performance web applications. This page is rebuilt on every deploy, so it stays in sync with what I'm actually shipping — not an aspirational list.
AI & LLM Tooling
- Claude (Anthropic) — daily driver for complex reasoning, code review, and long-context tasks. Claude Opus 4.7 with its 1M-token context window for whole-repo refactors.
- Claude Code — CLI agent for actual codebase work. Where most of my shipped automation gets written.
- Model Context Protocol (MCP) — the standard I bet on. I've shipped three open-source MCP servers (Ploomes, WhatsApp, ManyMCPs) and rely on community MCPs daily (Supabase, Figma, Gmail, Calendar, Playwright).
- OpenAI API — for GPT-4.x models when a specific task benefits from OpenAI's tool-use patterns.
- Cursor / Zed — backup editors when I need a different keyboard flow.
Languages & Runtimes
- TypeScript — default language for every shipping project. Strict mode, zod for runtime validation.
- Node.js (v20 LTS) — MCP servers, CLI tools, serverless functions.
- Python — data pipelines, ML experiments, scripting.
- Bash / Zsh — daily automation glue.
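The strict-mode + zod pairing exists because compile-time types vanish at runtime: data arriving over the wire still has to be checked. Here is a dependency-free sketch of that boundary-validation pattern (the `Contact` shape is a made-up example; actual projects use zod's `parse()` rather than hand-rolled guards):

```typescript
// Minimal runtime validator in the spirit of zod's parse():
// narrow `unknown` input into a typed value, or throw.
type Contact = { name: string; phone: string };

function parseContact(input: unknown): Contact {
  if (typeof input !== "object" || input === null) {
    throw new Error("expected an object");
  }
  const record = input as Record<string, unknown>;
  if (typeof record.name !== "string" || typeof record.phone !== "string") {
    throw new Error("name and phone must be strings");
  }
  // From here on, the value is safely typed as Contact.
  return { name: record.name, phone: record.phone };
}

const ok = parseContact({ name: "Ana", phone: "+55 11 99999-0000" });
console.log(ok.name); // "Ana"
```

With zod the same check is one schema declaration, and the static type is inferred from it instead of written twice.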
Frontend
- Astro.js v5+ — static-first with islands. Default choice for corporate sites, portfolios, and content-heavy pages. This site itself runs on Astro 6.
- Next.js — when an app needs server actions, ISR, or edge-side rendering beyond static. Frontend shell for Supabase-backed dashboards.
- Tailwind CSS v4 — utility-first styling, no framework overhead.
- shadcn/ui — default component library on Next.js projects; individual components ported into Astro when needed.
- Lucide Icons — consistent icon system.
Backend & Data
- Supabase — PostgreSQL, auth, storage, realtime, edge functions. My default backend for anything that needs a database.
- PostgreSQL — raw when I need performance tuning, pgvector for RAG.
- Vercel — hosting, edge functions, Speed Insights, Analytics, Cron. Also where this site lives.
- Trigger.dev v3 — code-first background jobs and long-running tasks; replaces n8n for production workflows.
AI Infrastructure
- Retrieval-Augmented Generation (RAG) — pgvector + Supabase for most cases; Pinecone when scale justifies it.
- Embeddings — OpenAI text-embedding-3-large for quality, voyage-3 for cost-sensitive use cases.
- Agent frameworks — Claude Agent SDK, MCP servers composed into larger pipelines, Vercel AI SDK for chat UIs.
- Evaluation — lightweight eval harnesses in TypeScript; I avoid heavy frameworks until the problem justifies them.
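For illustration, a "lightweight eval harness" in this spirit can fit in a few lines. The subject function and cases below are hypothetical stand-ins, not from any shipped project; the point is the shape: cases in, scored outputs, pass rate out.

```typescript
// A tiny eval harness: run cases through a system-under-test,
// score each output with a predicate, and report a pass rate.
type EvalCase = { input: string; expect: (output: string) => boolean };

function runEvals(
  subject: (input: string) => string,
  cases: EvalCase[],
): { passed: number; total: number } {
  let passed = 0;
  for (const c of cases) {
    if (c.expect(subject(c.input))) passed++;
  }
  return { passed, total: cases.length };
}

// Example: evaluate a toy normalizer against two cases.
const normalize = (s: string) => s.trim().toLowerCase();
const result = runEvals(normalize, [
  { input: "  Ploomes ", expect: (o) => o === "ploomes" },
  { input: "MCP", expect: (o) => o === "mcp" },
]);
console.log(`${result.passed}/${result.total} passed`); // 2/2 passed
```

Swap the toy `subject` for an LLM call and the predicates for real graders, and this is most of what a heavier eval framework gives you until you need parallelism or reporting.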
CRM & Business Operations
- Ploomes — the CRM I automate against most often (hence the unofficial MCP server).
- n8n — still used for quick prototypes; workflows get migrated to code-first once they reach production.
- WhatsApp Business API — frequent integration target for LATAM clients.
Editor & Terminal
- VS Code — primary editor with Claude Code, GitLens, ESLint, Prettier, Tailwind IntelliSense.
- Ghostty — fast, modern terminal.
- Zsh + Starship prompt — minimal, informative.
- 1Password CLI — secrets injection into dev environments without storing in .env files.
Design & UI
- Figma — design source of truth for all client work.
- Figma MCP — reads designs directly from Figma into Claude Code for implementation.
Dev Workflow
- Git + GitHub — obvious.
- pnpm — primary package manager.
- Bun — for scripts that need speed.
- GitHub Actions — CI; lightweight pipelines without over-engineering.
- Playwright — end-to-end tests when the cost-benefit clears.
Hardware
- MacBook Pro — primary development machine.
- External 4K display — for long coding sessions.
- Mechanical keyboard — tactile switches; I type a lot.
Open Source I Maintain
- Ploomes MCP Server — 56 tools connecting Claude to the Ploomes CRM.
- WhatsApp MCP Server — send WhatsApp messages from AI agents.
- ManyMCPs — MCP profile manager for Claude Code.