Google Labs · Announced March 18, 2026 · Powered by Gemini 2.5 Pro · stitch.withgoogle.com
Google's most capable model drives every generation — high-fidelity UI from text, sketches, or wireframes.
Built from the ground up as an AI tool — not a traditional design app with AI bolted on.
For professional designers exploring dozens of variations — and for founders building their first app interface with zero design background.
Connects directly to developer tools via MCP server, SDK, and DESIGN.md exports — no more design/dev handoff friction.
Josh Woodward, VP Google Labs: "AI works as a creativity multiplier, helping people explore many ideas quickly."
Generate dozens of layout variations side by side on the same canvas. Compare, iterate, and evolve — no switching between files.
From rough concept to high-fidelity prototype without leaving the canvas. Early sketches live next to polished screens.
The canvas is designed for team collaboration — multiple people can work, review, and iterate together in real time.
Type what you want — "a dashboard for a crypto portfolio tracker" — and Stitch generates it directly on the canvas, ready to iterate.
Draw a rough sketch on paper, upload it — Stitch converts it to a polished high-fidelity design automatically.
Drop in a screenshot of any existing UI and ask Stitch to redesign, restyle, or improve it with natural language instructions.
Start with a basic wireframe — Stitch fills in the visual design, components, and interactions automatically.
Describe your UI in plain English. "A dark mode SaaS dashboard with a left sidebar, stats grid, and data table" — done in seconds.
Click any element on a generated screen — Stitch automatically creates the linked detail page or flow state.
Generates complete flows: logged-in/out states, empty states, error states — with shareable links and mobile QR codes instantly.
"Give me three different menu options" — Stitch generates all three live as you speak. No clicking required.
"Show me this screen in different color palettes" — Stitch cycles through variations in real time while you watch.
Stitch acts as a sounding board — giving real-time design feedback, suggesting improvements, and explaining tradeoffs verbally.
"Design me a landing page" — Stitch interviews you with questions to understand your vision before generating anything.
Voice keeps you in your creative flow — no switching between keyboard, mouse, and prompt box while designing.
Transforms AI from a tool into a collaborator — dynamic dialogue surfaces your best ideas through real-time critique.
Click any element — a button, product image, menu item — and Stitch auto-generates the linked screen or detail page.
Full flows generated automatically — logged-in/out views, empty states, error messages. No manual linking required.
Get a mobile QR code immediately — share your prototype with anyone for review on their actual phone.
One-click shareable preview links — pitch to investors or share with clients in seconds, no account required to view.
Transition and interaction design features are launching soon — full micro-animation and motion design in the pipeline.
Founders pitching investors or designers showing clients can preview working app flows in seconds instead of hours.
Every new Stitch project starts with an automatic design system — fonts, colors, spacing, components all defined from the start.
Change the design system once — every connected screen updates automatically. No hunting through individual files.
Point Stitch at any website URL — it extracts the design system (colors, typography, spacing) and applies it to your project.
Export DESIGN.md to AI Studio or Antigravity — your design rules travel with the handoff, keeping design and code in sync.
Pull design rules from existing codebases into a new Stitch project — perfect for redesigns without starting from scratch.
Fonts, spacing, colors, and component styles move with DESIGN.md across projects and team handoffs. No more inconsistency.
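Google has not published the exact DESIGN.md schema, but since it is described as a Markdown file carrying fonts, colors, spacing, and component rules, a file along these lines is a plausible sketch (all token names and values below are illustrative assumptions, not the official format):

```markdown
# Design System

## Colors
- primary: #1A73E8
- surface: #121212
- on-surface: #E8EAED

## Typography
- font-family: Inter
- heading: 600 / 28px
- body: 400 / 16px

## Spacing
- base unit: 8px
- card padding: 24px

## Components
- Button: primary fill, rounded corners, 40px height
```

Because the rules live in plain Markdown, an AI coding tool on the receiving end (AI Studio, Antigravity) can read them as context and generate code that matches the design.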
Stitch's Model Context Protocol server lets external tools call Stitch's design capabilities programmatically — integrate it into any workflow.
Open-source SDK on GitHub — build custom integrations, automate design workflows, and extend Stitch with your own tooling.
Community-built skills via the Stitch Skills repo — pre-built capabilities you can plug into your Stitch workflow instantly.
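To make the MCP integration concrete: MCP clients talk to servers over JSON-RPC 2.0, invoking tools via the standard `tools/call` method. The sketch below builds such a request in Python; the tool name `generate_screen` and its arguments are assumptions for illustration — the real tool names come from the server's `tools/list` response and the setup docs linked below.

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request — the message an MCP
    client sends to invoke a tool on an MCP server such as Stitch's."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name and arguments; a real client would first call
# `tools/list` to discover what the Stitch server actually exposes.
msg = mcp_tool_call(1, "generate_screen", {
    "prompt": "a dark mode SaaS dashboard with a left sidebar",
})
print(msg)
```

Any MCP-capable client (an IDE agent, a CI script, a custom tool built on the open-source SDK) can send requests of this shape, which is what lets Stitch's design capabilities plug into an existing workflow programmatically.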
Export designs directly to Google AI Studio — seamless handoff from design to AI-assisted development.
Export to Antigravity for development — design rules travel with the export, keeping design and implementation in sync.
Stitch connects the full chain: ideation → design → prototype → developer handoff → code — without leaving the ecosystem.
Build your first app interface without any design skills. Describe it, iterate with voice, share with investors — all in minutes.
Explore 10x more variations in the same time. Use Stitch for rapid ideation and iteration before finalizing in your design tool of choice.
Mockup ideas quickly for stakeholder alignment — no Figma skills needed. Turn user stories into clickable prototypes in minutes.
Use DESIGN.md exports to get design-consistent code. Bridge the gap between design and implementation with AI Studio and Antigravity.
Design thumbnails, landing pages, and app mockups for videos and presentations — fast, professional, zero design overhead.
Stitch outputs design, not deployable code. For full-stack generation, pair with v0, Lovable, or export via Antigravity.
stitch.withgoogle.com
Try it free — Google Labs product, no waitlist
Google Blog — "Introducing Vibe Design"
March 18, 2026 · Rustin Banks, Product Manager
stitch.withgoogle.com/docs/mcp/setup
Set up MCP integration for your workflow
github.com/google-labs-code/stitch-skills
Community-built skills and extensions
github.com/google-labs-code/stitch-sdk
Build custom integrations and automations