
Design Sprint Day Two Recap Pt 2 – Sketch, Share, and Stress-Test

What We Did Today

[This is an ongoing blog series, start your journey here.] In the second half of the day, the team shared and reviewed early concept sketches for the Farmstand app. Each participant used different AI-assisted tools to visualize both the shopper and farmer experiences. Tools like Miro AI, Claude, Figma Make, and Stitch were tested to rapidly generate interface mockups and explore user flows.

This session was not just a show-and-tell—it was a stress test of AI-assisted ideation. We saw firsthand how non-designers could generate usable UI, and how AI tools are starting to challenge traditional sprint structures by delivering volume and fidelity in record time.

Today, we focused on:

  • Reviewing concept sketches and AI-generated prototypes for both farmer and shopper flows
  • Evaluating tool capabilities across different roles and creative prompts
  • Beginning to identify standout UI patterns and priority features for MVP consideration

Roles, Tools & Working Styles

Developers

Toolset: Claude, Firebase Studio, Stitch

Approach: Focused on using AI to generate app logic, structure flows, and outline future dev requirements. Emphasis on maintainability, performance, and future extensibility.

Designers

Toolset: Miro AI, Figma Make, Stitch

Approach: Rapid prototyping with AI-enhanced tools to visually communicate experience flows. Focused on designing both the shopper journey (location > discovery > reviews > inventory) and the farmer dashboard (inventory management, availability toggles, basic analytics).
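The shopper journey above (location > discovery > reviews > inventory) maps onto a small data model that the developers could carry into the build phase. A minimal sketch in Python, with all names hypothetical rather than taken from the sprint artifacts:

```python
from dataclasses import dataclass, field
from math import radians, sin, cos, asin, sqrt

@dataclass
class FarmStand:
    name: str
    lat: float
    lon: float
    reviews: list = field(default_factory=list)    # shopper reviews
    inventory: dict = field(default_factory=dict)  # item -> rough stock level

def distance_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def discover(stands, lat, lon, radius_km=10):
    """Location -> discovery: stands within the radius, nearest first."""
    nearby = [s for s in stands if distance_km(lat, lon, s.lat, s.lon) <= radius_km]
    return sorted(nearby, key=lambda s: distance_km(lat, lon, s.lat, s.lon))
```

Even a toy model like this makes the flow concrete enough to argue about: discovery is just a distance filter, and reviews and inventory hang off each stand.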

Explorers / Generalists

Toolset: Claude, ChatGPT, Uizard

Approach: Used prompt-driven AI workflows to brainstorm concepts and explore alternative flows. Emphasis on translating raw user journeys into visual artifacts quickly to provoke discussion and reflection.

Key Insights & “Aha” Moments

  • AI is enabling fast fidelity—especially for non-designers. Several team members expressed surprise at how quickly their concepts became visually compelling, even without traditional design tools or skills.
  • “I have some / I have a lot” inventory toggles are more intuitive than precise counts. This farmer-friendly shorthand emerged organically in multiple sketches.
  • Cross-platform patterns are starting to converge. Tools produced similar UX flows, suggesting LLMs are optimizing for median solutions—not innovation.
  • Out-of-stock alerts and real-time updates matter. The idea of “push to notify farmer” when an item is gone on-site resonated as a simple but powerful feature.
  • Volume can overwhelm clarity. With so many screens and directions generated, narrowing to a shared MVP focus became an immediate priority.
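Two of the insights above, the "I have some / I have a lot" toggle and the "push to notify farmer" alert, boil down to a tiny state machine. A hedged sketch of how that might look (the enum and function names are ours, not the team's):

```python
from enum import Enum

class Stock(Enum):
    """Farmer-friendly shorthand instead of precise counts."""
    NONE = 0
    SOME = 1
    A_LOT = 2

def shopper_label(level):
    """What the shopper sees for each stock level."""
    return {
        Stock.NONE: "Out of stock",
        Stock.SOME: "I have some",
        Stock.A_LOT: "I have a lot",
    }[level]

def mark_gone(inventory, item, notify):
    """'Push to notify farmer' when a shopper reports an item gone on-site."""
    inventory[item] = Stock.NONE
    notify(f"{item} reported out of stock")
```

The point of the sketch is how little state the feature actually needs: three levels and one notification hook cover both insights.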

Selected AI Tools and Approaches to Explore

  • Miro AI for fast visual generation of shopper and farmer flows
  • Claude for screen-by-screen HTML layout ideas, onboarding flows, and feature breakdowns
  • Stitch for map-driven flows and visual consistency with design references like Airbnb
  • Figma Make for responsive prototypes and collaborative editing
  • Uizard, explored but ultimately dropped due to paywall limitations

Goal: Identify the fastest and most flexible tools for producing an MVP-level prototype, and determine which ones offer sufficient control, collaboration, and export options to move into development.

Quotes from the Team

“I spent one hour and it generated what used to take weeks.” — Ted

Context: Reflecting on how quickly he was able to produce a complete UI flow using AI tools like Stitch and Claude, highlighting the productivity gain for non-designers in a design sprint setting.

“Claude was my first time and it actually looked pretty good—clickable, too.” — Priscilla

Context: Sharing her surprise during the sketch review that Claude, despite being her backup after losing access to Bolt, produced usable output that could be explored interactively—proving its value in early ideation.

“We’re all getting the same outputs. It’s powerful, but also maybe limiting.” — Priscilla

Context: Observing recurring patterns across different tools and team members’ outputs, raising the concern that AI might be reinforcing generic solutions rather than promoting unique or innovative UX thinking.

“I’m not a designer, and I still ended up with something usable.” — Ted

Context: Emphasizing how AI tools helped bridge his skill gap, allowing him to express a complete product idea visually—something that traditionally would have required assistance from a trained designer.

“I don’t want to be handed code from a bot that can’t be audited.” — Cary

Context: Voicing concerns about production-readiness and long-term sustainability of AI-generated code, and asserting the need for developer oversight and escape hatches during the build phase.

Hypotheses We’re Testing Today

Can AI-generated sketches from non-designers actually move us closer to a real MVP?

  • Non-designers can use AI tools to prototype usable flows
  • MVPs generated from AI tools may not be production-ready—but they can accelerate learning
  • AI-generated UI converges toward functional sameness, limiting novel UX patterns
  • Collaborative divergence (everyone using different tools) speeds up ideation, but slows down convergence
  • Prototypes don’t need to be perfect—they need to provoke real user conversations

Risks & Unknowns

Will tool fragmentation, feature sprawl, or false confidence stall the next phase?

  • AI tools often assume full e-commerce workflows—overkill for farm stands
  • Lack of live preview/export in some tools hinders iteration
  • Risk of “over-designing” before agreeing on MVP scope
  • Concerns about AI-generated code security, maintainability, and handoff readiness
  • Feature fragmentation: different tools created overlapping but inconsistent feature sets

Key Decisions Made

Are we converging on a shared MVP fast enough to build something real?

  • MVP will not include payment features—focus is discovery, location, and inventory
  • Firebase Studio will be explored as the base coding environment
  • Each participant will tag and annotate key features they believe should carry forward
  • Group feature prioritization will happen Monday using the must/should/could/won't (MoSCoW) framework
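Monday's MoSCoW pass can be as simple as bucketing tagged features. A sketch under assumed tag names (the example votes are illustrative, not actual sprint data):

```python
from collections import defaultdict

def moscow(tagged_features):
    """Group (feature, tag) pairs into must/should/could/won't buckets."""
    buckets = defaultdict(list)
    for feature, tag in tagged_features:
        buckets[tag].append(feature)
    return buckets

# Hypothetical tags from participants' annotations
votes = [
    ("map discovery", "must"),
    ("inventory toggles", "must"),
    ("reviews", "should"),
    ("payments", "wont"),  # explicitly out of MVP scope
]
buckets = moscow(votes)
```

Keeping the tagging this flat makes it easy to dump straight from sticky notes or emoji overlays into a prioritized list.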

Takeaways for Builders

What today’s messy mix of tools, roles, and flows revealed about designing with AI

  • The prototyping layer is no longer a bottleneck—alignment is
  • AI tools make design sprints accessible to non-designers without sacrificing fidelity
  • Feature prioritization must be fast and sharp to avoid bloated MVPs
  • Standard design sprint structures may need adjusting when tools generate too much too fast
  • Stickers, emojis, and annotations work—keep them simple and centralized for feature tracking

What’s Next

Coming up:

  • Feature voting + prioritization to finalize the MVP scope
  • Begin Figma Make and Firebase Studio flows based on group consensus
  • Use sticky note + emoji overlays to finalize which features move forward
  • Review Claude-based flows as part of developer planning for structure and handoff

Final reminder: This isn’t about perfection—it’s about proof. We’re building fast, learning publicly, and asking: What can AI actually deliver when it’s part of the team—not just a tool?
