Doing a Lot More With a Lot Less: The Skillset Shift Inside Insights Teams

Budget cuts are redefining what skills insights professionals need to stay relevant. Stay ahead of the curve.

Walking through the halls at TMRE 2025, one phrase echoed across conversations with a frequency that felt almost orchestrated: "doing more with less." It appeared in session titles, keynote slides, and corridor conversations between practitioners comparing notes on their 2025 budgets. But beneath this familiar mantra, something more fundamental was shifting. The constraints weren't just forcing insights teams to work harder or find cheaper vendors. They were fundamentally reshaping what it means to be an insights professional.

The traditional insights role was built during an era of abundance—when companies allocated generous budgets to research agencies, when sample sizes were large because panel costs were acceptable, when insights teams could focus almost exclusively on research design and interpretation while outsourcing execution. That model is collapsing. Not gradually, but with startling speed. And the professionals who recognize this early are already rebuilding their capabilities around a different core stack entirely.

The Budget Reality No One Wants to Discuss

Let's start with the uncomfortable truth that sessions danced around but rarely confronted directly. Insights budgets aren't experiencing temporary belt-tightening. According to the Insights Association's 2024 industry census, corporate insights budgets decreased by an average of 18% between 2022 and 2024, with projections suggesting another 12-15% reduction through 2025. More telling: 63% of insights leaders report that their budget allocation per research project has declined even as the number of research requests from stakeholders has increased by 40%.

This creates a mathematical impossibility within the traditional operating model. If you're conducting qualitative research the conventional way—recruiting through panels, moderating via agencies, transcribing manually, and synthesizing over weeks—each project carries a baseline cost of $15,000-$30,000. When your annual budget was $500,000, you could execute 15-30 studies. When that budget drops to $350,000 while research demand doubles, the math simply doesn't work.
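The arithmetic here is worth making explicit. A quick sketch using the round figures above (the per-study cost band and budget scenarios are the article's illustrative numbers, not real data):

```python
# Per-project cost band for traditionally executed qualitative research,
# as cited above (illustrative round numbers).
cost_low, cost_high = 15_000, 30_000

def studies_possible(budget):
    # Range of studies a budget supports at the stated per-project costs:
    # fewest at the high cost, most at the low cost.
    return budget // cost_high, budget // cost_low

print(studies_possible(500_000))  # (16, 33) -> roughly the 15-30 studies cited
print(studies_possible(350_000))  # (11, 23) -> capacity drops while demand doubles
```

Integer division understates slightly, but the point survives any rounding: the reduced budget cannot cover doubled demand under the traditional cost structure.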

The initial response from most teams follows a predictable pattern: negotiate harder with vendors, reduce sample sizes, cut back on qualitative work in favor of cheaper surveys, and tell stakeholders "no" more often. These tactics buy time but solve nothing. They preserve the existing skillset while slowly making insights teams less relevant to business decisions. The stakeholders who can't get research support find alternatives—usually worse alternatives, but alternatives nonetheless.

What Changed at TMRE: Skills Over Tools

The sessions that drew the largest crowds weren't about methodology innovations or the latest survey platforms. They were about capability development. "How Insights Teams Learn to Code" filled beyond capacity. "From Research Manager to Storyteller: Owning Your Narrative Skills" had a waiting list. "Facilitation Fundamentals for Insights Professionals" packed the room with people taking notes furiously.

This represents a profound shift. Five years ago, TMRE sessions focused primarily on which tools to buy, which methodologies to deploy, and which vendors to partner with. The implicit assumption was that insights professionals were hired for their research expertise—their ability to design studies, interpret results, and translate findings into recommendations. Execution was something you outsourced.

That assumption no longer holds. The modern insights professional must be simultaneously researcher, data analyst, storyteller, facilitator, and lightweight technologist. Not because these skills make you better at the traditional role, but because the traditional role no longer exists at the budget levels most teams now face.

Dr. Jennifer Martinez, who led the session on "Building the Modern Insights Stack," framed it starkly: "We're moving from a model where insights teams were primarily buyers and interpreters of research to one where they're builders and operators. The successful teams aren't the ones with the biggest budgets—they're the ones whose practitioners can actually do the work themselves."

The Five Core Competencies of the New Insights Stack

Through conversations with practitioners across consumer goods, technology, financial services, and healthcare, a consistent pattern emerged. The insights professionals who were thriving under budget constraints had developed proficiency across five specific capabilities that went well beyond traditional research training.

Data Fluency: From Consumer to Producer

Traditional insights training emphasized research design, questionnaire development, and interpretation of statistical outputs that someone else produced. The new requirement goes further: insights professionals need to pull, clean, transform, and analyze data themselves.

This doesn't mean everyone needs to become a data scientist. But it does mean that basic SQL for pulling customer data, Python or R for analysis and visualization, and comfort with tools like Tableau or Looker for dashboard creation have become table stakes. Sarah Chen, Senior Insights Manager at a Fortune 500 retail company, described the transformation: "Three years ago, I would write a brief for our analytics team and wait two weeks for the data pull. Now I write the query myself, have results in an hour, and can iterate on the analysis in real-time during stakeholder meetings."

The impact extends beyond speed. When insights professionals can access and analyze data directly, they can answer follow-up questions immediately, explore unexpected patterns as they emerge, and integrate multiple data sources that would have required separate workstreams under the old model. This responsiveness fundamentally changes how stakeholders perceive insights teams—from report producers to thought partners.

The barrier isn't technical complexity. Basic data analysis skills can be developed through focused learning over 2-3 months. The barrier is psychological: acknowledging that these skills are now core to the role rather than nice-to-have add-ons, and investing the time to develop them despite already-full schedules.

Storytelling: Making Insights Compete for Attention

Budget constraints create an indirect but powerful pressure: insights must compete for attention in ways they never did when research was abundantly funded. When insights teams had resources to conduct 30 studies per year, stakeholders paid attention because insights were scarce and valuable. When you can only conduct 12 studies but business questions haven't decreased, insights must fight harder to be heard, remembered, and acted upon.

This transforms storytelling from a presentation skill to a core competency. The insights professionals who succeed understand narrative structure, emotional resonance, and how to package complex findings into memorable frameworks. They've studied how journalists structure stories, how documentary filmmakers create tension and resolution, and how great teachers make complex concepts accessible.

Michael Torres, who ran the storytelling workshop, emphasized the shift: "Five years ago, insights presentations were information dumps—here's what we learned, here are the implications, here are the recommendations. That doesn't work anymore. Now you need to create a narrative journey that makes stakeholders feel the insight, not just understand it intellectually."

Practical application varies by organization, but common patterns include opening presentations with a customer video clip that emotionally grounds the finding, structuring insights as before/after transformations rather than findings lists, using metaphor and analogy to make abstract patterns concrete, and creating repeatable frameworks that stakeholders can remember and reference in future discussions.

The University of Michigan's research on corporate decision-making found that insights framed as compelling narratives were 3.2 times more likely to influence strategic decisions than insights presented as data reports, even when both contained identical information. In budget-constrained environments, this difference determines which insights get actioned and which gather dust.

Facilitation: From Reporter to Co-Creator

Perhaps the most unexpected skill shift involves facilitation. Insights professionals are increasingly expected to lead collaborative sessions where insights emerge through structured dialogue rather than being delivered as finished products. This reflects a broader shift from insights-as-reports to insights-as-process.

Budget constraints accelerate this shift because facilitation-based approaches are dramatically more cost-effective than traditional research when applied appropriately. Instead of conducting 15 interviews with customers to understand onboarding friction, insights professionals facilitate a 90-minute working session with the product team where customer verbatims, usage data, and support tickets are synthesized collaboratively. The outputs aren't weaker—they're often stronger because the people who will implement solutions are directly involved in developing them.

But facilitation requires skills that research training rarely addresses: designing structured activities that surface insights, managing group dynamics and personality conflicts, synthesizing diverse perspectives into coherent frameworks in real-time, and creating psychological safety so participants share honestly. These are teachable skills, but they're distinct from research methodology.

Lisa Patel, who leads insights at a mid-sized SaaS company, described the transformation: "I used to think my job was to go away, do research, and come back with answers. Now I spend more time designing collaborative workshops where we develop answers together. It's faster, cheaper, and creates better buy-in. But it required me to learn a completely different skillset."

The best facilitation training often comes from outside traditional insights channels. Design thinking workshops, Liberating Structures techniques, and even improv comedy training provide more relevant preparation than most research methodology courses. The recognition that these skills matter represents a fundamental redefinition of what insights work actually involves.

Experimentation: From Observing to Testing

The traditional division of labor placed experimentation firmly within product and engineering domains. Insights teams observed, described, and interpreted. Product teams tested and iterated. This division is breaking down as insights professionals recognize that many questions previously answered through research can be addressed more quickly and definitively through experimentation.

This doesn't mean insights teams need to run A/B tests on product features. But it does mean understanding experimental design well enough to collaborate effectively with product teams, knowing when to recommend testing over traditional research, and being able to analyze experimental results without relying entirely on data science support.

Dr. Rachel Morrison's session on "Insights Meets Experimentation" explored this intersection through case studies. One consumer goods company reduced their concept testing timeline from 6 weeks to 3 days by running rapid online experiments with different product descriptions rather than conducting traditional qualitative research. The insights professional who led this shift didn't suddenly become an experimentation expert, but she developed enough fluency to design the test with engineering support and interpret results confidently.

The broader principle: insights professionals need to understand the full toolkit of evidence-gathering approaches and match methods to questions strategically rather than defaulting to traditional research for every query. Some questions genuinely require deep qualitative exploration. Others can be answered faster and more definitively through behavioral data analysis or experimental methods. Knowing which approach fits which situation—and having the skills to execute across multiple methods—separates insights professionals who remain relevant from those who become bottlenecks.

Light Automation: Multiplying Output Without Adding Headcount

The final core competency involves using automation tools to handle repetitive work that previously consumed enormous time. This ranges from survey programming and basic data cleaning to interview transcription and thematic coding. The goal isn't to replace human judgment—it's to free insights professionals to focus on interpretation, synthesis, and strategic thinking rather than operational execution.

Conversational AI for conducting customer interviews represents the most dramatic example of this shift. Platforms that enable insights teams to launch AI-moderated interviews transform the economics of qualitative research entirely. Where traditional approaches require recruiting participants, scheduling, moderating (often outsourced at $400-600 per interview), transcription, and manual coding, AI-powered approaches collapse this into configuration, deployment, and analysis. The time savings are measured in weeks. The cost savings exceed 80% for equivalent sample sizes.

But automation extends well beyond interviewing. Tools for automated survey programming, natural language processing for open-ended response analysis, automated data visualization, and even AI-assisted report writing are becoming standard parts of the insights stack. The practitioners who use these tools effectively aren't replacing their expertise—they're multiplying it. They conduct more research, analyze more data, and deliver more insights with the same headcount.

The key insight from multiple sessions: automation doesn't deskill insights work. It shifts the work from execution to strategy. When transcription happens automatically, you spend more time thinking about what the transcripts reveal rather than creating them. When coding can be automated with human review, you focus on interpreting patterns rather than labeling responses. This shift elevates insights work rather than diminishing it, but only if practitioners develop the skills to use automation tools effectively and trust them appropriately.

The Skills Gap Reality

Recognizing that these five competencies now define core insights work leads to an uncomfortable question: how many current insights professionals have them? The honest answer, based on conversations at TMRE: not many. Most insights professionals were hired and trained for a different role, and most organizations aren't providing systematic training to bridge the gap.

A survey conducted by the Insights Association in early 2024 found that only 31% of insights professionals rate themselves as proficient in basic data analysis tools like SQL or Python. Just 22% have received formal training in facilitation methods. And while 89% agree that storytelling matters, only 18% have taken courses or workshops specifically focused on narrative development.

This isn't a critique of insights professionals. The skillset was appropriate for the role as it existed. But that role has fundamentally changed, and most practitioners are being expected to develop new capabilities without training, time allocation, or clear guidance about what matters most.

The practitioners who are succeeding aren't waiting for organizational training programs. They're taking ownership of their own capability development through focused self-directed learning. The patterns that emerged from TMRE conversations suggest a practical approach that anyone can adopt.

A Practical Self-Audit Framework

Before investing time in skill development, insights professionals need clarity about where gaps exist and which capabilities matter most for their specific role. The framework below, used by several practitioners I spoke with, provides that clarity:

Rate yourself honestly on each capability (1=beginner, 5=expert):

Data Fluency

  • Can you write basic SQL queries to pull customer data? (1-5)
  • Can you clean and transform data in Python, R, or Excel? (1-5)
  • Can you create compelling visualizations without design support? (1-5)
  • Can you interpret statistical significance and confidence intervals? (1-5)

Storytelling

  • Can you structure a presentation as a narrative journey? (1-5)
  • Can you create metaphors that make abstract insights concrete? (1-5)
  • Can you edit your own content to remove jargon and complexity? (1-5)
  • Can you build emotional connection through customer stories? (1-5)

Facilitation

  • Can you design structured activities that surface insights? (1-5)
  • Can you manage group dynamics and keep discussions productive? (1-5)
  • Can you synthesize diverse perspectives in real-time? (1-5)
  • Can you create psychological safety in stakeholder sessions? (1-5)

Experimentation

  • Do you understand A/B testing fundamentals? (1-5)
  • Can you determine when experimentation beats traditional research? (1-5)
  • Can you interpret experimental results without data science support? (1-5)
  • Can you design simple experiments to test hypotheses? (1-5)

Automation

  • Can you identify tasks suitable for automation? (1-5)
  • Can you evaluate automation tools for your specific needs? (1-5)
  • Can you use AI tools to accelerate research without compromising quality? (1-5)
  • Can you integrate automation into research workflows effectively? (1-5)

Interpretation of results:

  • Total score below 50: Significant skills gap. Prioritize 2-3 specific capabilities for intensive development over next 3-6 months.
  • Score 50-75: Mixed capability. Focus on strengthening weakest areas while building on existing strengths.
  • Score above 75: Strong foundation. Focus on advanced capabilities and helping team members develop.

The value isn't in the numerical score—it's in creating honest visibility about specific gaps so learning efforts can be targeted rather than scattered.
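For teams that want to run this audit repeatedly or compare across members, the rubric is trivial to script. A minimal sketch using the band thresholds above; the payoff is the weakest-area readout rather than the total, and the example ratings are invented:

```python
# Self-ratings (1-5) for the 20 questions above, grouped by competency.
# These example numbers are hypothetical.
ratings = {
    "data_fluency":    [3, 2, 3, 4],
    "storytelling":    [4, 3, 3, 4],
    "facilitation":    [2, 2, 3, 2],
    "experimentation": [2, 2, 2, 2],
    "automation":      [3, 3, 2, 3],
}

total = sum(sum(scores) for scores in ratings.values())
weakest = min(ratings, key=lambda k: sum(ratings[k]))

# Interpretation bands from the list above.
if total < 50:
    band = "significant gap: pick 2-3 capabilities for intensive development"
elif total <= 75:
    band = "mixed capability: strengthen weakest areas"
else:
    band = "strong foundation: go deep and help team members develop"

print(total, weakest, band)
```

With these sample numbers the script flags experimentation as the place to target first, which is exactly the visibility the framework is meant to create.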

Upskilling Without Waiting for Training Programs

The consistent message from successful practitioners: don't wait for your organization to provide training. The skills gap is immediate, but most organizations are 12-18 months behind in recognizing what training is needed. By the time formal programs exist, you'll have missed the window where developing these capabilities creates career advantage.

Here's the practical approach that emerged from conversations with insights professionals who've successfully developed these competencies:

Data Fluency: Start with SQL, Add Python Gradually

Begin with SQL because it provides immediate value and has a manageable learning curve. The goal isn't mastery—it's reaching the point where you can pull customer data, join tables, and aggregate results without IT support. This typically requires 20-30 hours of focused practice over 6-8 weeks.
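As a concrete target for that milestone, here is the shape of query the paragraph describes: pull, join, aggregate. The sketch below uses Python's built-in sqlite3 with a toy in-memory database so it runs without any infrastructure; the table and column names are hypothetical stand-ins for a real customer warehouse:

```python
import sqlite3

# In-memory database standing in for a real customer data warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, segment TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         amount REAL);
    INSERT INTO customers VALUES (1, 'new'), (2, 'returning'), (3, 'new');
    INSERT INTO orders VALUES (1, 1, 40.0), (2, 2, 120.0),
                              (3, 2, 80.0), (4, 3, 55.0);
""")

# The three moves named above: pull customer data, join tables, aggregate.
rows = conn.execute("""
    SELECT c.segment,
           COUNT(o.id)             AS order_count,
           ROUND(AVG(o.amount), 2) AS avg_order
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.segment
    ORDER BY c.segment;
""").fetchall()

for segment, order_count, avg_order in rows:
    print(segment, order_count, avg_order)
```

Being able to write and modify a query like this without IT support is the whole goal of the first 20-30 hours; everything past joins and aggregates is optional refinement.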

Resources mentioned most frequently: Mode Analytics' SQL Tutorial (free, practical), LeetCode's SQL questions for practice, and simply asking your analytics team if you can shadow them for a few data pulls to see SQL in action.

For Python, start with pandas for data manipulation and matplotlib or seaborn for visualization. Again, the goal is practical competence rather than expertise. Allocate 2-3 hours weekly for 3 months. Resources: DataCamp's Python for Data Analysis, Automate the Boring Stuff with Python, and real projects with your actual work data.
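To make "practical competence" concrete, here is a minimal pandas sketch of the clean-and-transform step on the kind of mess a real export contains: numbers stored as text, a missing value, inconsistent labels. The column names and data are invented for illustration:

```python
import pandas as pd

# Toy survey export; columns and values are hypothetical.
raw = pd.DataFrame({
    "respondent": [1, 2, 3, 4],
    "nps_score": ["9", "6", None, "10"],       # numbers stored as text, one gap
    "segment": ["SMB ", "ent", "SMB", "ENT"],  # inconsistent labels
})

clean = (
    raw.assign(
        # Coerce text scores to numbers; unparseable values become NaN.
        nps_score=pd.to_numeric(raw["nps_score"], errors="coerce"),
        # Normalize labels: trim whitespace, lowercase.
        segment=raw["segment"].str.strip().str.lower(),
    )
    .dropna(subset=["nps_score"])  # drop rows with no usable score
)

# A first aggregate: mean score by segment.
summary = clean.groupby("segment")["nps_score"].mean()
print(summary)
```

The pattern (coerce types, normalize labels, drop unusable rows, aggregate) covers a large share of day-to-day insights data work before any modeling enters the picture.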

Storytelling: Study Masters, Practice Deliberately

Nancy Duarte's books (Resonate, Slide:ology) provide the best foundation for understanding narrative structure in business contexts. Read them actively, analyzing presentations you give through her frameworks.

But reading alone won't develop the skill. You need deliberate practice with feedback. Record yourself presenting insights and watch the playback (painful but essential). Ask trusted colleagues for specific feedback on narrative flow and emotional resonance. Study how journalists structure feature stories and how documentary filmmakers create tension and resolution.

One practical technique from several practitioners: rewrite your last three presentations as three-act structures (setup, confrontation, resolution). This forces you to find the narrative arc that may have been buried in data dumps.

Facilitation: Learn from Design Thinking and Liberating Structures

Traditional research training doesn't cover facilitation. Look instead to design thinking workshops (IDEO, Stanford d.school offer both in-person and online options) and Liberating Structures (liberatingstructures.com provides free microstructures for collaborative work).

The key is practicing with low-stakes opportunities. Volunteer to facilitate team meetings. Run mini-workshops on side projects before trying this with high-stakes stakeholder sessions. Ask a colleague to observe and provide specific feedback on group dynamics, time management, and synthesis quality.

Several practitioners mentioned that improv comedy classes unexpectedly helped their facilitation skills—teaching them to respond to unexpected directions, build on others' ideas, and maintain flow despite chaos.

Experimentation: Start with Understanding, Build to Application

You don't need to become an experimentation expert, but you need enough fluency to collaborate effectively with product and engineering teams. Begin with Trustworthy Online Controlled Experiments by Kohavi, Tang, and Xu—it's technical but accessible.

Then find opportunities to shadow product teams running experiments. Ask questions about design decisions, statistical power calculations, and interpretation. The goal is developing enough understanding to know when to recommend experimentation over traditional research and how to interpret results without relying entirely on data science support.
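As one concrete version of "interpreting results without data science support", the two-proportion z-test covers the most common A/B readout: did variant B convert better than A? This is a standard textbook formula, sketched in plain Python so the math stays visible; the sample numbers below are hypothetical:

```python
from math import erf, sqrt

def two_proportion_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    Returns (rate_a, rate_b, z, p_value). Pure stdlib, so every step
    of the calculation is on the page rather than inside a library.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)             # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return p_a, p_b, z, p_value

# Hypothetical concept test: description B vs description A.
p_a, p_b, z, p = two_proportion_test(120, 1000, 160, 1000)
print(f"A={p_a:.1%}  B={p_b:.1%}  z={z:.2f}  p={p:.4f}")
```

Knowing what the pooled rate, standard error, and p-value each represent is the fluency level the sessions described: enough to sanity-check a result and ask good questions, not to run the experimentation program yourself.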

Several insights professionals mentioned taking statistics MOOCs (Coursera's "Design and Analysis of Experiments" was cited multiple times) not to become statisticians but to develop comfort with experimental concepts.

Automation: Start Using AI Tools Today

The fastest way to understand automation's potential is to start using it. For insights professionals, conversational AI interview platforms provide the most immediate value. Companies like User Intuition, Outset, and others enable you to conduct AI-moderated qualitative interviews at scale and speed impossible with traditional methods.

But automation extends beyond interviewing. Use ChatGPT or Claude for first-draft report writing (with heavy editing). Try Otter.ai for automatic transcription. Experiment with Tableau or Power BI's natural language features for data visualization. The goal is developing intuition about what automation handles well versus where human judgment remains essential.

The key mistake: waiting to use automation tools until you "fully understand" them. Start using them imperfectly and learn through iteration. The practitioners who've integrated automation most successfully didn't take courses first—they experimented, failed, adjusted, and gradually developed proficiency through practice.

The 90-Day Skill Sprint

If you're serious about developing these capabilities, structure the effort rather than approaching it haphazardly. Several practitioners described using a "90-day skill sprint" focused on one core competency at a time:

Weeks 1-4: Foundation Building

  • Complete initial learning (course, book, tutorials)
  • Practice with low-stakes projects
  • Identify specific gaps in understanding

Weeks 5-8: Deliberate Practice

  • Apply the skill to real work projects
  • Seek feedback from colleagues or mentors
  • Iterate based on what's working and what isn't

Weeks 9-12: Integration and Teaching

  • Integrate the skill into regular workflow
  • Share learning with team members
  • Identify next skill to develop

The practitioners using this approach typically rotated through capabilities over 15-18 months, emerging with strong proficiency across the full modern insights stack. It requires discipline and consistent time allocation, but the alternative—hoping your organization will provide training—leaves you waiting while the role continues evolving.

What This Means for Insights Teams

The implications extend beyond individual capability development. Insights teams need to rethink hiring criteria, performance evaluation, and how work is allocated. The traditional model of hiring for research methodology expertise and domain knowledge no longer produces teams equipped for the modern role.

Forward-thinking insights leaders are already adjusting. They're hiring for learning agility and technical comfort rather than research pedigree alone. They're creating dedicated time for skill development rather than expecting practitioners to upskill entirely on personal time. They're rewarding practitioners who develop automation capabilities and facilitate collaborative sessions, not just those who deliver traditional research reports.

But even in organizations that haven't made these shifts formally, individual practitioners can drive their own evolution. The insights professionals who recognize this transition early and invest in developing the new core stack will find themselves increasingly valuable while those who resist will find their roles gradually marginalized.

The Opportunity Hiding in Constraint

Budget constraints feel punishing, and for teams trying to preserve the traditional model, they are. But for practitioners willing to rebuild their capabilities around the new reality, constraints create opportunity. When you can conduct 10x the research at 20% of traditional costs because you've mastered automation tools, when you can facilitate collaborative insight generation that's faster and more action-oriented than traditional research, when you can pull and analyze data yourself rather than waiting for analytics support—you become dramatically more valuable to your organization.

The practitioners I spoke with who've made this transition describe it as liberating rather than limiting. They're conducting more interesting work, having greater impact on business decisions, and feeling more essential to their organizations than when they primarily managed vendor relationships and delivered research reports.

The shift from insights-as-reports to insights-as-capability fundamentally elevates the role. But only if practitioners develop the skills to deliver on that promise. The budget pressure that feels like a constraint is actually revealing what insights work can become when freed from dependency on expensive infrastructure and long cycle times.

The question isn't whether insights teams will be forced to do more with less. That's already happening. The question is whether individual practitioners will develop the skills that make "more with less" not just possible but preferable—creating insights capabilities that are faster, more responsive, and more deeply integrated into business decision-making than the traditional model ever was.

The transformation is underway. The practitioners who recognize this and act decisively will shape the future of insights work. Those who wait for their organizations to provide training, or who hope budgets will return to previous levels, risk discovering they've been optimizing for a role that no longer exists.