Highlight Blog

How to Drive Customer-Led Growth With Product Validation

Written by Vicky Frissen | 10/6/25 2:18 PM

"Customers don't know what they want" is probably the most weaponized quote in product development. It's incomplete and often used in different contexts that it was originally intended. Just because Steve Jobs said it and Apple is what it is today, doesn't mean that ignoring customer input is the road to success. On the contrary. 

Steve meant that Apple is *obsessed* with understanding their customers' problems and pain points. They just didn't ask them all to become product designers.

Customers are great at identifying what they do not like, what is wrong, and what could be better. But they are terrible at formulating the exact solution they need. That is your job. 

That principle is the basis for customer-led growth. It doesn't mean building whatever customers request. It means understanding their problems deeply enough to build better solutions than they could articulate themselves. Here's how, and when, to do exactly that.

 

Learning how to listen, and when to act

We're not going to tell you that "being customer-centric" is key. Everyone knows that. What we do want to cover is how to be customer-centric in an authentic and actionable way:

  • When to validate during development: the prototype stage, pre-production, and the final formula each tell you different things
  • Which feedback methods actually predict purchase behavior vs which just sound cool in reports
  • How to build validation into timelines without adding months to your development cycle
  • The ROI math that justifies systematic testing to executives who see it as just another research budget line

If you're struggling with products that test well but fail at launch, or don't know what to do with the feedback that arrives too late to change manufacturing decisions, here's our two cents on what we've learned from thousands of CPG product development cycles.

 

Validation is all about timing

A lot of advice on customer-led growth and product development comes from, and is aimed at, SaaS. That doesn't always translate to physical products. Learning after launch can teach valuable lessons, but the losses incurred along the way are expensive. Once you've committed to manufacturing, that's it. So validating before that point is key, and you need to be truly confident in the data you're basing every decision on–and get the timing right.

Your game plan will depend on the development stages you'll go through for your specific product. Here's a rough overview.

Prototype stage: You'll be testing whether the core concept solves real problems. Small samples (20-30 people) using rough prototypes tell you if you're solving something customers actually care about. This is where in-depth interviews and customer co-creation reveal problems you didn't know existed.

Pre-production stage: Validate formulation, usability, and competitive positioning. You need behavioral data here–how people actually use it, not what they say they'd do. This is where in-home testing reveals friction points that surveys miss entirely.

Final formula stage: Confirm market readiness and identify potential launch issues. This is your last chance to catch problems before you're committed to production runs and retail agreements.

Customer-led growth means involving customers at the right time, and giving them the tools that allow them to truly collaborate with you–not just confirm what you hope is true.

 

Feedback that predicts success vs feedback that sounds good

Customer feedback is always useful, but it doesn't always tell you the same thing, nor is it always meant to inspire immediate action. Sometimes it's a prompt to explore more. Other times, it's a next step you can take. Here's what different methods actually predict about market success:

Surveys
  • What it actually tells you: Stated preferences, demographic patterns, awareness levels
  • What it doesn't tell you: Actual behavior, usage patterns, real purchase decisions
  • When to use it: Early concept screening, broad market sizing
  • What it predicts about market success: Low - people say they'd buy things they never actually purchase

Focus Groups
  • What it actually tells you: Language customers use, problems they articulate, group dynamics
  • What it doesn't tell you: Individual behavior, authentic usage, how products perform at home
  • When to use it: Understanding problem spaces, testing messaging
  • What it predicts about market success: Low - artificial environment, social dynamics influence responses

In-Home Product Testing
  • What it actually tells you: Real usage patterns, behavioral data, friction points in natural contexts
  • What it doesn't tell you: Why people feel certain ways (needs follow-up), scale validation
  • When to use it: Pre-launch validation, competitive testing, usability
  • What it predicts about market success: High - actual behavior in authentic contexts predicts purchase patterns

Customer Co-Creation
  • What it actually tells you: Unmet needs, problem depth, solution directions
  • What it doesn't tell you: Finished product specs (customers design solutions they wouldn't buy)
  • When to use it: Early innovation, problem discovery
  • What it predicts about market success: Medium - reveals opportunities but needs translation to viable products

A/B Testing (Digital)
  • What it actually tells you: Preference between specific options, conversion optimization
  • What it doesn't tell you: Why preferences exist, broader usage patterns
  • When to use it: Optimizing existing products, messaging
  • What it predicts about market success: Medium for digital, not applicable for physical product formulation

Customer Reviews
  • What it actually tells you: Post-purchase satisfaction, deal-breakers, unexpected use cases
  • What it doesn't tell you: Pre-launch validation (too late), silent majority views
  • When to use it: Understanding existing product performance
  • What it predicts about market success: Medium - shows what happened, not what will happen with changes

Small authentic samples can reveal more meaningful action points than large preference studies. Thirty people using your product in their actual homes for two weeks can tell you more about market performance than three hundred people absentmindedly rating concepts in a survey. You'll want to measure behavior, not gather opinions.

This is where customer insights analysis separates teams that launch successful products from teams that launch products that tested well.

 

Building validation systems that fit development cycles

When you need feedback that informs decisions instead of confirming them, here's how we'd approach it.

1. Get behavioral data early enough to act on it. 

In-home testing during pre-production reveals friction points while you can still adjust formulation or packaging. Waiting until you've committed to manufacturing means your validation just becomes expensive confirmation that you're already locked into something that might not work.

2. Use the right method at the right stage. 

Early prototype? Customer co-creation and in-depth interviews reveal problems you didn't know existed. Pre-production? In-home behavioral testing shows you friction points. Final formula? Competitive validation confirms market readiness. Market research questions all have a time and place.

3. Integration without slowdown. 

The (very valid) fear is that systematic validation adds months to development. But with Highlight, you can build quick feedback loops that let you iterate multiple times before launch. Yes: you *can* get authentic usage data in weeks instead of months, which gives you plenty of time to fix problems instead of just launching with your fingers crossed.

 

Doing the math: Do customers actually add to your ROI?

Now, does it actually make sense from an ROI perspective to include customers in the development process, or does it just sound romantic?

It all depends on your approach. Systematic pre-launch validation sounds like an expensive research budget. Frame it as risk mitigation, and the math (and your mindset) changes completely.

One failed CPG product launch costs more than most companies can handle once you factor in development costs, tooling, initial production runs, marketing spend, and retail relationships damaged by underperforming products.

Pre-launch validation testing is always cheaper than that, and because it's not a one-size-fits-all package with Highlight, it works for businesses of all shapes and sizes.

If systematic validation prevents even one launch failure every few years, it pays for itself many times over. And that's before you factor in the positive outcomes–launches that perform better because you caught and fixed problems early.
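
To make that math concrete, here's a minimal back-of-the-envelope sketch in Python. Every figure in it is a hypothetical placeholder (launch costs, failure rates, and validation costs vary widely by category and company), so treat it as a template for plugging in your own numbers rather than a benchmark:

```python
# Back-of-the-envelope risk math for pre-launch validation.
# All figures are hypothetical placeholders -- replace them with your own.

failed_launch_cost = 500_000        # development, tooling, first production run, marketing, retail penalties
launches_per_year = 4               # products shipped in a typical year
baseline_failure_rate = 0.30        # assumed share of launches that miss without systematic validation
validated_failure_rate = 0.15       # assumed failure rate with behavioral pre-launch testing
validation_cost_per_launch = 25_000 # assumed cost of systematic validation per launch

expected_loss_without = launches_per_year * baseline_failure_rate * failed_launch_cost
expected_loss_with = launches_per_year * validated_failure_rate * failed_launch_cost
validation_spend = launches_per_year * validation_cost_per_launch

net_benefit_per_year = expected_loss_without - expected_loss_with - validation_spend
print(f"Expected annual savings from validation: ${net_benefit_per_year:,.0f}")
# With these placeholder numbers: 600,000 - 300,000 - 100,000 = $200,000/year
```

Run it with your own figures; under these placeholder assumptions, the expected savings comfortably exceed the validation spend.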

Specific outcomes thanks to customer-led growth:

  • Lower failure rates: Products that make it through behavioral validation in authentic contexts perform better at launch because you've already identified and fixed friction points.
  • Faster launches: Counterintuitive but true. Quick feedback loops mean less guesswork and fewer "wait and see if this works" delays.
  • Margin protection: Preventing reformulation after launch or managing returns from products that don't work as intended protects your margins in ways that show up clearly on P&Ls.

The metrics that matter: repeat purchase rates, customer lifetime value, Net Promoter Score–but only if you're tracking these against control groups. The real validation is comparing products developed with systematic customer feedback against products developed without it.
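
If you want to make that comparison concrete, here's a rough sketch (cohort numbers entirely invented for illustration) of computing the same metrics for products developed with systematic validation and for a control group developed without it:

```python
# Minimal sketch: compare repeat purchase rate and NPS between a cohort of
# products developed with systematic validation and a control cohort without it.
# All numbers below are invented for illustration only.

validated = {"repeat_purchases": 420, "buyers": 1000, "promoters": 430, "detractors": 180, "responses": 1000}
control = {"repeat_purchases": 310, "buyers": 1000, "promoters": 350, "detractors": 260, "responses": 1000}

def repeat_rate(cohort):
    # Share of buyers who purchased again
    return cohort["repeat_purchases"] / cohort["buyers"]

def nps(cohort):
    # Net Promoter Score: % promoters minus % detractors
    return 100 * (cohort["promoters"] - cohort["detractors"]) / cohort["responses"]

print(f"Repeat purchase rate: {repeat_rate(validated):.0%} (validated) vs {repeat_rate(control):.0%} (control)")
print(f"NPS: {nps(validated):+.0f} (validated) vs {nps(control):+.0f} (control)")
```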

If you need to demonstrate value to leadership, show them the cost of your last product failure versus the cost of validation that would have caught it. Then show them faster time-to-market when you're not guessing about whether problems exist. Let the numbers do the talking.

 

Act on real consumer behavior

Every brand is already spending on customer research. For most CPG brands, that investment goes to focus groups, surveys, and market studies. So the problem isn't budget or an unwillingness to do research; it's that you're getting artificial feedback too late in development to actually change manufacturing decisions.

Customer-led growth for physical products means redirecting that investment to methods that predict purchase behavior instead of methods that produce convincing presentations. It means watching what customers do instead of asking what they say they'd do.

And for those who are worried about it becoming too chaotic: customer-led doesn't mean customer-controlled. You're not building whatever customers request. You're understanding their problems deeply enough to build solutions they couldn't have articulated themselves but immediately recognize as better than what they're currently using. Try Highlight and see for yourself.