Reference Deep-Dive · 7 min read

Pre-Flight Ad Creative Testing: Test Before You Spend

By Kevin, Founder & CEO

The Economics of Testing Before You Spend


A 30-second television spot costs $50K-$500K+ to produce. A national digital video campaign runs $10K-$100K in production. Social media content creation, while cheaper per asset, adds up to significant spend across a content calendar.

Media spend amplifies the cost. A campaign with $200K in media behind a creative that does not resonate is $200K wasted — plus the opportunity cost of running something better.

A pre-flight creative test costs $200-$300 for 10-15 AI-moderated depth interviews. That is 0.1-0.5% of a typical production budget. The question is not whether you can afford to test. It is why you would skip it.

The math works at every budget level:

| Creative Type | Typical Production Cost | Test Cost (10-15 interviews) | Test as % of Production |
|---|---|---|---|
| TV spot | $50K - $500K | $200 - $300 | 0.04% - 0.6% |
| Digital video | $10K - $100K | $200 - $300 | 0.2% - 3% |
| Print / OOH | $5K - $50K | $200 - $300 | 0.4% - 6% |
| Social media content | $1K - $10K per asset | $120 - $160 (6-8 interviews) | 1.2% - 16% |
| Radio / podcast | $5K - $30K | $200 - $300 | 0.7% - 6% |

Even for social media content at the lowest production budgets, testing a batch of concepts before production costs less than producing the weakest concept in the batch.

What to Test


Not all creative elements benefit equally from pre-flight testing. Focus testing on the elements that are most expensive to change and most impactful on performance.

Video Scripts and Storyboards

Test the narrative arc, the core message, and the emotional tone before committing to production. A script test surfaces whether the story lands, whether the key message is received, and whether the call to action motivates.

Print and Out-of-Home

Test the visual-headline combination. In print, you have seconds to communicate. Pre-flight testing reveals whether the concept communicates the intended message at a glance or requires explanation.

Social Media Creative

Test batches of concepts to identify which ideas to produce. Social creative is high-volume and high-velocity, making it ideal for rapid screening. Test 4-6 concepts, produce the top 2-3, and skip the rest.

Taglines and Headlines

Test messaging in isolation when you are evaluating positioning options. Taglines carry outsized weight in brand perception and are worth testing independently of the creative execution they will appear in.

Audio Creative

Radio spots, podcast ads, and audio branding can be tested as scripts or rough recordings. The narrative and message structure matter more than production polish at this stage.

The RCA Framework


RCA stands for Resonance, Comprehension, and Action. It provides a structured evaluation framework for any creative concept at any stage.

Resonance

Does the creative connect emotionally? Does it feel relevant to the audience? Resonance is the “do I care?” test.

What to probe:

  • Initial emotional reaction (what did you feel?)
  • Personal relevance (is this about someone like you?)
  • Attention (would this stop your scroll / make you look up?)
  • Memorability (what sticks with you after seeing this?)

Red flags: Indifference, confusion about who the ad is for, “it’s fine but nothing special” reactions across multiple respondents.

Green flags: Unprompted personal connections (“this is exactly what happens to me”), emotional language in responses, respondents who retell the concept in their own words with enthusiasm.

Comprehension

Does the audience understand the intended message? Comprehension failures are the most common creative problem and the easiest to fix in pre-flight testing.

What to probe:

  • Message playback (in your own words, what is this ad saying?)
  • Brand attribution (who is this ad for?)
  • Key benefit recall (what is the product/service promising?)
  • Confusion points (anything unclear or that you had to re-read?)

Red flags: Respondents play back a different message than intended, cannot identify the brand, or focus on a secondary element instead of the main message.

Green flags: Accurate, unprompted message playback. Respondents articulate the benefit in their own words without prompting.

Action

Does the creative motivate the desired behavior? This could be purchase consideration, website visit, sign-up, or simply brand favorability shift.

What to probe:

  • Behavioral intent (what, if anything, would you do after seeing this?)
  • Barriers (what would stop you from acting?)
  • Urgency (does this feel time-sensitive or is it easy to ignore?)
  • Comparison to current behavior (how does this compare to what you do/use now?)

Red flags: “I’d probably forget about it,” no clear next step articulated, respondents who like the ad but cannot connect it to action.

Green flags: Specific intended actions (“I’d look this up”), questions about availability or pricing (indicates real interest), comparison to current dissatisfaction.
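One way to keep RCA evaluation consistent across interviews is a simple scorecard that tallies flag colors per dimension. A minimal sketch, assuming a three-value flag vocabulary; the class and field names are illustrative, not part of any product:

```python
from collections import Counter
from dataclasses import dataclass

# Sketch: tally red/green flags per RCA dimension across interviews.
# Dimension names come from the framework; the flag values are illustrative.

@dataclass
class InterviewScore:
    resonance: str      # "green", "red", or "neutral"
    comprehension: str
    action: str

def rca_summary(scores):
    """Count flag colors for each RCA dimension across all interviews."""
    summary = {}
    for dim in ("resonance", "comprehension", "action"):
        summary[dim] = Counter(getattr(s, dim) for s in scores)
    return summary

interviews = [
    InterviewScore("green", "green", "neutral"),
    InterviewScore("green", "red", "red"),
    InterviewScore("neutral", "green", "green"),
]
print(rca_summary(interviews)["comprehension"]["green"])  # -> 2
```

A concept that piles up reds on any one dimension fails the gate, regardless of how the other two score.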

Stage-Gated Creative Testing


Different development stages require different testing approaches. Here is a practical stage-gate framework for creative testing.

Stage 1: Concept Screening

When: Before any production work begins. You have multiple creative territories or strategic directions.

Stimulus: Written concepts, mood boards, or rough visual references. Keep fidelity low and consistent across options.

Goal: Identify which 1-2 creative territories to develop further. Kill weak directions before investing in execution.

Sample: 10-12 interviews per concept. If testing 4 territories, that is 40-48 interviews total. At $20 per interview, $800-$960.

Key questions: Which territory resonates most? Which message lands? Which emotional register fits the brand and audience?

Stage 2: Script / Storyboard Testing

When: You have selected a creative direction and developed it into a script, storyboard, or detailed concept.

Stimulus: Full scripts for video/audio, annotated storyboards, or detailed concept boards for print/social.

Goal: Validate the narrative, message, and emotional arc. Identify specific moments that work and moments that lose the audience.

Sample: 10-15 interviews. Enough to reach thematic saturation on what works and what does not.

Key questions: Does the story hold attention throughout? Is the key message received at the intended moment? Does the ending motivate action? Are there confusing or off-putting moments?

Stage 3: Rough Cut / Animatic Testing

When: You have produced a rough version of the final creative. For video, this might be an animatic or rough cut. For print, a near-final layout.

Stimulus: The rough production asset. Brief respondents that this is not final so they evaluate the content, not the polish.

Goal: Fine-tune before final production. Identify pacing issues, visual elements that distract from the message, and any last-stage comprehension problems.

Sample: 10-12 interviews. At this stage, you are looking for specific, actionable feedback rather than strategic direction.

Key questions: Does the pacing feel right? Is there a moment where attention drops? Does the visual support or compete with the message? Does this feel like something you would watch/read to completion?

Stage 4: Final Validation (Optional)

When: Final creative is complete and you want confidence before committing media spend.

Stimulus: Final production asset in a realistic viewing context.

Goal: Confirm that the finished product delivers on what tested well in earlier stages and that production choices did not introduce new problems.

Sample: 8-10 interviews. This is a confidence check, not a strategic evaluation.

This four-stage approach sounds expensive until you do the math. All four stages combined cost approximately $2,000-$3,000 in research. That is 1-2% of a modest production budget and a fraction of a percent of combined production plus media spend.
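The math above can be sketched as a small cost calculator, assuming $20 per interview and the per-stage sample ranges listed in each stage. Note that the single-concept path through stages 2-4 comes in below the quoted $2,000-$3,000; totals rise toward that range when two or more finalists are carried through the later gates:

```python
# Sketch: per-stage research cost at $20 per AI-moderated interview.
# Interview counts mirror the ranges above; totals vary with how many
# concepts you carry through each gate.

PRICE_PER_INTERVIEW = 20

stages = {
    "concept_screening": (40, 48),   # 4 territories x 10-12 interviews
    "script_testing":    (10, 15),
    "rough_cut":         (10, 12),
    "final_validation":  (8, 10),
}

def stage_costs(stages, price):
    """Map each stage to its (low, high) dollar cost."""
    return {name: (lo * price, hi * price) for name, (lo, hi) in stages.items()}

costs = stage_costs(stages, PRICE_PER_INTERVIEW)
print(costs["concept_screening"])  # -> (800, 960)

total_low = sum(lo for lo, _ in costs.values())
total_high = sum(hi for _, hi in costs.values())
print(total_low, total_high)       # -> 1360 1700
```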

Sample Design for Creative Testing


Audience Matching

Test with people who match your media target. This sounds obvious but is frequently violated. Testing a Gen Z social campaign with a general population panel produces misleading results.

Define your test sample using the same targeting parameters you will use for media buying: demographics, psychographics, category behavior, and media consumption.

Monadic vs. Sequential Design

Monadic testing (each respondent sees only one concept) produces the cleanest data. There are no order effects or contrast effects. The respondent evaluates the concept on its own merits.

Sequential testing (each respondent sees multiple concepts) is more efficient but introduces comparison effects. The second concept is always evaluated relative to the first.

For depth interviews, monadic design is preferred. The interview goes deeper on one concept rather than shallower on multiple. With AI-moderated interviews at $20 each, the cost difference between monadic and sequential designs is modest enough that clean data should win.
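Operationally, monadic design amounts to rotating respondents across concept cells so each concept gets an equal, randomized share. A minimal sketch; the function and cell names are illustrative:

```python
import random
from collections import Counter

# Sketch: monadic assignment -- each respondent sees exactly one concept,
# with cells balanced across concepts.

def assign_monadic(respondents, concepts, seed=0):
    """Shuffle respondents, then rotate them across concepts for equal cells."""
    rng = random.Random(seed)
    shuffled = respondents[:]
    rng.shuffle(shuffled)
    return {r: concepts[i % len(concepts)] for i, r in enumerate(shuffled)}

cells = assign_monadic([f"r{i}" for i in range(12)], ["A", "B", "C"])
print(Counter(cells.values()))  # each of A, B, C gets 4 respondents
```

Because each respondent lands in exactly one cell, no answer is colored by a concept they saw earlier.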

Avoiding Common Sample Mistakes

  • Do not recruit “creative evaluators.” People who frequently participate in ad testing become professional respondents who give polished but unreliable feedback.
  • Screen for category relevance, not ad recall. You want people who buy in the category, not people who remember ads.
  • Include light and heavy category users. Heavy users evaluate creative differently than light users, and your campaign likely targets both.
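The screening rules above translate directly into filter logic. A minimal sketch; the profile fields and thresholds are hypothetical stand-ins for whatever your panel provider actually exposes:

```python
# Sketch: screener logic for the sample rules above. Field names and
# cutoffs are hypothetical, not a real panel's schema.

def passes_screener(profile):
    """Admit category buyers; exclude professional ad-test respondents."""
    if profile.get("ad_tests_last_year", 0) > 3:    # professional respondent
        return False
    if not profile.get("buys_in_category", False):  # category relevance
        return False
    return True

def balance_usage(profiles):
    """Split qualified respondents into light vs heavy category users."""
    qualified = [p for p in profiles if passes_screener(p)]
    heavy = [p for p in qualified if p.get("purchases_per_month", 0) >= 4]
    light = [p for p in qualified if p.get("purchases_per_month", 0) < 4]
    return light, heavy

light, heavy = balance_usage([
    {"buys_in_category": True, "purchases_per_month": 1, "ad_tests_last_year": 0},
    {"buys_in_category": True, "purchases_per_month": 6, "ad_tests_last_year": 1},
    {"buys_in_category": True, "purchases_per_month": 2, "ad_tests_last_year": 9},
])
print(len(light), len(heavy))  # -> 1 1
```

Note the screener filters on behavior (category purchases), not on ad recall, and caps prior ad-test participation rather than seeking it out.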

How Agencies Use Pre-Flight Testing


Agencies that build pre-flight testing into their creative process use it as both a quality tool and a client service.

As a quality tool: Testing at the concept stage reduces the number of rounds of client revision. When the creative brief is validated with target audience data, the resulting work is more likely to be approved without extensive rework.

As a client service: Presenting creative concepts with audience reaction data elevates the conversation from subjective opinion (“I don’t like the blue”) to evidence-based evaluation (“the audience responded strongly to the emotional narrative but missed the brand reveal”). This protects good creative from being killed by executive preference and gives weak creative a fair hearing against the data.

As a revenue stream: Some agencies offer pre-flight testing as a paid service, building research into the scope of creative engagements. At User Intuition’s price point, the margin on this service is significant even at modest markups.

Getting Started


Pick one campaign currently in development. Test the creative concept before it goes to production. Use the RCA framework to structure the evaluation. Compare the depth of insight you get from 10-15 depth interviews against your usual process of internal review and client gut feel.

The first test pays for itself by either validating that you are on the right track (proceed with confidence) or surfacing a problem that would have been far more expensive to discover after production.

For detailed methodology on concept testing, see the complete guide. For guidance on testing social content specifically, see testing social media content with your target audience. For concept statement formatting, see the concept testing template.

Frequently Asked Questions

Why test creative before production rather than after?

Testing at the concept stage costs a fraction of post-production research - typically 10-50x less - because weak concepts get eliminated before production budgets are committed. A study that costs a few thousand dollars can prevent a six-figure production investment in creative that research would have flagged as ineffective.

What does the RCA framework measure?

The RCA framework assesses Resonance (does the creative connect emotionally with the target audience), Comprehension (does the audience understand the message as intended), and Action (does the creative motivate the desired behavior). Testing all three dimensions at each stage gates investment in creative that fails on any axis.

How does stage-gated creative testing work?

Stage-gated testing applies lightweight evaluation at each production milestone - script, rough cut, final cut - eliminating underperforming concepts early and concentrating production investment on creative validated to resonate. Each gate uses appropriate methods for the asset maturity, from concept boards through finished spots.

How does User Intuition support pre-flight creative testing?

User Intuition's AI-moderated interviews deliver RCA evaluation at each creative stage in 48-72 hours, at $20 per interview. Agencies and brand teams can test concept scripts before production begins, rough cuts before finishing investment, and final creative before media buy - with enough participants to identify patterns across the target audience.

What sample does effective creative testing require?

Effective creative testing requires precise audience targeting - recruiting participants who match the demographic and psychographic profile of the intended audience, not convenient samples. The sample should be large enough to identify divergent reactions across audience segments, since creative that works for one segment sometimes actively alienates another.