Building the next generation of AI-integrated web applications.
I’m a fullstack developer focused on turning LLM capabilities into usable products, not demos. My work sits at the intersection of modern frontend engineering, type-safe backend architecture, and practical AI workflows such as structured outputs, streaming responses, and prompt-driven product features.
| Project | Focus | One-line pitch |
|---|---|---|
| cangudev | AI product workflow | An AI-powered technical English coach that transforms developer text into structured rewrites, grammar feedback, idioms, and interview-style challenges. |
| youcanchef | AI + UX | A Gemini-powered web app that evaluates ingredient availability by city to help digital nomads adapt recipes with local-market intelligence. |
| cangu-finance | Fullstack architecture | A production-oriented TypeScript monorepo for mobile and web apps with typed APIs, auth, database workflows, and AI-ready backend services. |
| skillforge | AI-ready platform engineering | A full-stack TypeScript platform combining mobile, web, and streamed LLM interactions through a type-safe API architecture. |

On the AI side:

- Prompt engineering for structured outputs
- Gemini and OpenAI-style API integrations
- Streaming LLM responses
- JSON-first AI workflows
- AI-assisted product features
- Response parsing, validation, and error handling
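
The parsing and validation point deserves a concrete shape. Below is a minimal sketch of a JSON-first workflow: the model is prompted to return a fixed JSON structure, and the raw reply is stripped of markdown fences, parsed, and shape-checked before anything downstream touches it. `CoachFeedback` and `parseCoachFeedback` are illustrative names, not an actual cangudev API.

```typescript
// Illustrative response shape for a technical-English coach reply.
interface CoachFeedback {
  rewrite: string;
  grammarNotes: string[];
}

class LLMResponseError extends Error {}

// Matches a leading or trailing markdown code fence (three backticks).
const FENCE_RE = /^`{3}(?:json)?\s*|\s*`{3}$/g;

function parseCoachFeedback(raw: string): CoachFeedback {
  // Models often wrap JSON in a markdown fence; strip it before parsing.
  const cleaned = raw.replace(FENCE_RE, "").trim();

  let data: unknown;
  try {
    data = JSON.parse(cleaned);
  } catch {
    throw new LLMResponseError("Model reply was not valid JSON");
  }

  // Validate the shape explicitly instead of trusting the model.
  const d = data as Partial<CoachFeedback>;
  if (
    typeof d !== "object" || d === null ||
    typeof d.rewrite !== "string" ||
    !Array.isArray(d.grammarNotes) ||
    !d.grammarNotes.every((n) => typeof n === "string")
  ) {
    throw new LLMResponseError("Model reply did not match the expected shape");
  }
  return d as CoachFeedback;
}
```

The point of the explicit check is that a bad reply fails loudly at the boundary as an `LLMResponseError` you can retry on, rather than surfacing as a confusing UI bug later.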

On the frontend:

- React
- Next.js
- TypeScript
- Tailwind CSS
- Responsive product interfaces
- UX for AI-powered applications
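
Much of the UX work for AI apps reduces to handling partial output well. Below is a minimal, framework-agnostic sketch of the state behind a streaming response view: a pure reducer that a React component could drive with `useReducer` as tokens arrive. The action names are illustrative, not from any specific library.

```typescript
// UI state for one in-flight model response.
type StreamState = {
  status: "idle" | "streaming" | "done" | "error";
  text: string;
};

type StreamAction =
  | { type: "start" }
  | { type: "delta"; chunk: string }
  | { type: "finish" }
  | { type: "fail" };

function streamReducer(state: StreamState, action: StreamAction): StreamState {
  switch (action.type) {
    case "start":
      // Reset the transcript so a new request starts from a clean slate.
      return { status: "streaming", text: "" };
    case "delta":
      // Append each token as it arrives so the UI renders partial output.
      return { ...state, text: state.text + action.chunk };
    case "finish":
      return { ...state, status: "done" };
    case "fail":
      return { ...state, status: "error" };
  }
}
```

In a component this would look like `const [state, dispatch] = useReducer(streamReducer, { status: "idle", text: "" })`, with the stream reader dispatching a `delta` per chunk; keeping the reducer pure makes the streaming behavior unit-testable without rendering anything.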

On the backend:

- Node.js
- Monorepos
- Type-safe APIs
- Hono and modern API backends
- Auth and database integration
- Scalable fullstack project structure
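
The type-safe API idea can be sketched without any framework: each route declares its input and output types once, and the handler is checked against them at compile time, so it only ever sees validated input. Hono's validator middleware gives a similar guarantee in practice; `defineRoute` and `createUser` here are hypothetical names for illustration.

```typescript
// A route couples runtime validation to compile-time types.
type Route<I, O> = {
  validate: (input: unknown) => I; // throws on malformed input
  handle: (input: I) => O;
};

// Identity helper that pins a route's input/output types in one place.
function defineRoute<I, O>(route: Route<I, O>): Route<I, O> {
  return route;
}

const createUser = defineRoute({
  validate(input: unknown): { email: string } {
    if (
      typeof input !== "object" ||
      input === null ||
      typeof (input as { email?: unknown }).email !== "string"
    ) {
      throw new Error("email is required and must be a string");
    }
    return { email: (input as { email: string }).email };
  },
  handle(input: { email: string }) {
    // The handler only ever sees validated, typed input.
    return { id: 1, email: input.email };
  },
});
```

The design choice is that the unsafe boundary (parsing `unknown` request bodies) lives in exactly one function per route, and everything behind it is ordinary typed code.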
I’m currently focused on building AI-native product experiences with:
- Streaming LLM responses for faster, more natural interfaces
- Structured AI data pipelines that return reliable JSON outputs
- Type-safe fullstack systems where AI features integrate cleanly into production code
- Practical developer tools and web apps that make LLM workflows useful in real products
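
The streaming point above can be sketched with the Web Streams API, which is what `fetch()` exposes in both browsers and Node 18+. The stream here is fed locally for the sake of a self-contained example; in a real app the bytes would come from a model provider's HTTP response body.

```typescript
// Read a byte stream incrementally, surfacing each decoded chunk as it
// arrives (e.g. to append tokens to the UI) and returning the full text.
async function readStream(
  stream: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void,
): Promise<string> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let full = "";
  for (;;) {
    const result = await reader.read();
    if (result.done) break;
    // stream: true keeps multi-byte characters intact across chunk
    // boundaries instead of garbling them.
    const text = decoder.decode(result.value, { stream: true });
    full += text;
    onChunk(text);
  }
  return full;
}
```

This is the piece that makes streaming feel fast: the UI can show the first tokens while the model is still generating, instead of waiting for the complete response.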

How I approach the work:

- Shipping AI features that solve real product problems
- Keeping frontend experiences fast, clear, and production-ready
- Designing backend systems that stay maintainable as complexity grows
- Using LLMs as part of strong software architecture, not as a shortcut around it



