How We Built Our MVP in 30 Days

Lessons, tools, and real talk from a startup founder

By Rex, Founder at Pinegrass
Published: June 6, 2025 – Behind the Scenes Series #1

🚀 “Let’s ship in 30 days… or shut up about it.”

That was the line I scribbled on a sticky note on Day 0. Not because I’m a productivity junkie — but because for 6 months, the idea had been sitting idle in our Notion. And Notion isn’t where startups are born — execution is.

This is a raw look at how we built and shipped our MVP in exactly 30 days — no excuses, no fluff, no overplanning.

🧭 Why We Set a 30-Day Deadline (And Meant It)

Six months of the idea sitting untouched in Notion taught us that an open-ended timeline kills momentum. Thirty days was short enough to force ruthless scoping, and long enough to ship something real.

Week 1: The Ruthless Cut

We started by saying no — to features, personas, and overthinking. From 24 wishlist features, we cut 18. From 3 user personas, we chose 1. Our only goal was to solve one job well, which meant the entire product came down to one loop:

  • Signup → Get value
  • Log back in
  • Share it with someone else

No dashboards. No settings. No custom flows. Just the job.

Week 2: Setup & Stack

We ditched the ideal tech stack in favor of a lean, fast one:

  • Frontend: React Native (Expo)
  • Backend & Auth: Supabase
  • Hosting: Vercel
  • Analytics: PostHog

By Day 14, we had login/signup and data writing working — barely, but it was real.
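For the curious, the auth plumbing really was that small. Here's a minimal sketch of what a Day-14 signup-and-write flow can look like with this stack; the `notes` table, its columns, and the env var names are illustrative, not our real schema:

```typescript
import { createClient } from '@supabase/supabase-js';

// Keys come from Expo's EXPO_PUBLIC_ env vars; the names are placeholders.
const supabase = createClient(
  process.env.EXPO_PUBLIC_SUPABASE_URL!,
  process.env.EXPO_PUBLIC_SUPABASE_ANON_KEY!
);

// Sign a user up with email + password. Supabase handles sessions for us.
export async function signUp(email: string, password: string) {
  const { data, error } = await supabase.auth.signUp({ email, password });
  if (error) throw error;
  return data.user;
}

// Write the user's first record — the "data writing" half of Day 14.
// The `notes` table and its columns are hypothetical.
export async function saveNote(userId: string, body: string) {
  const { error } = await supabase
    .from('notes')
    .insert({ user_id: userId, body });
  if (error) throw error;
}
```

Two small functions and no backend to deploy, which is exactly why this stack made the cut.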

Week 3: Fixing UX Fast

Early testers were confused. The UI felt clunky. So we stripped down copy, made buttons clearer, and deleted 2 screens entirely.

“I can’t tell what this app does.” — a tester

Harsh, but golden feedback. It changed everything.

Week 4: Testing with Real Users

We invited 8 users. Some got stuck. Some flagged bugs. Some ignored our “hero feature.” That’s when it clicked — users don’t care what you built, only what solves their pain.
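How did we know the hero feature was being ignored? Instrumentation. Here's a rough sketch of the kind of event capture we leaned on, assuming the posthog-react-native client from our stack (event and property names here are made up for illustration):

```typescript
import PostHog from 'posthog-react-native';

// Project API key and host are placeholders.
const posthog = new PostHog('phc_YOUR_PROJECT_KEY', {
  host: 'https://us.i.posthog.com',
});

// Fire one event per meaningful action; funnels do the rest.
export function trackHeroFeatureOpened(source: string) {
  posthog.capture('hero_feature_opened', { source });
}

export function trackShare(method: 'link' | 'invite') {
  posthog.capture('app_shared', { method });
}
```

A simple funnel over events like these is what told us which screens users actually touched, and which ones they skipped.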

✅ What Went Right

  • We shipped.
  • We got real feedback.
  • We had a v1 we could iterate on.

⚠️ What Went Wrong

  • Underestimated dev time
  • Ignored onboarding UX initially
  • Nearly added features last-minute

🧠 Final Thoughts

An MVP isn’t about minimum features — it’s about maximum learning with minimum build.

The biggest win? We weren’t guessing anymore. We were building with clarity.

🔜 Up Next

Next Friday: The Tech Stack We Chose (And Why We Switched Midway)
