Krux

Google's Stitch Now Designs Multi-Screen Apps From Voice
Published: March 29, 2026 at 2:51 AM
Updated: March 29, 2026 at 2:51 AM
What happened
Google just turned Stitch into an infinite design canvas where you talk to create entire app flows. The experimental tool now generates high-fidelity UI from natural language, then lets you speak critiques and watch it instantly rework layouts or color schemes. The killer feature: it auto-generates logical next screens as you test flows, so you're prototyping a checkout sequence while it's already mocking up the confirmation page. A design agent tracks every fork in the road, letting teams explore wildly different directions in parallel without losing context. It can also scrape a design system from any URL and export it as DESIGN.md, a markdown format that carries your rules into other coding tools.
Why it matters
Figma meets improv comedy, basically.