Reflecting on IA’s Ignite Data Quality Conference: Why We Need a Human-First Approach in a Technology-Focused Industry

June 3, 2025

Market Research Industry’s Commitment to Data Quality

One theme throughout the conference was the continued commitment to quality in the market research industry. It’s why we were all there, right? And let’s be honest, it’s been “the year of data quality” for about four years now. In particular, we wholeheartedly agreed with the Insights Association that:

  • Quality is a shared responsibility throughout the research process, from sample to survey to respondent experience.
  • Source transparency (where participants originate and how they engage) is critical for reliability and quality.
  • Data quality is different from data fraud. It’s a fine distinction, but an important one.

The Future is Confidence Over Quality

At the same time, to us, the big picture going forward is more about confidence, with quality as a part of that. Clients need confidence in the data they’re using to make decisions, and quality helps to ensure that.

A human-first approach helps us achieve both. This approach prioritizes human insight over purely technological solutions in protecting data quality and generating more meaningful insights. We see human-first as a multi-tiered process of confidence layers, so to speak, that applies human expertise to:

  • Participant integrity – Upfront checks, participant curation, and verification of respondent authenticity
  • Execution quality – Understanding fieldwork nuances, providing client-specific guidance, and maintaining rigorous data review processes
  • Human-first recommendations – Guidance on improving screening methods, overall research design, and strategic insights
  • Data confidence – Comprehensive data validation, enabling confident decision making

People First…Then Technology

Like the rest of the industry, we are always trying to find new, innovative ways to be more efficient while also promoting better quality. We did see some truly interesting ideas at the conference. Will we jump into any of the new products right away? Most likely, no.

Maybe it’s the researcher in us, or maybe it’s the collective agreement among our team that we should never take things at face value. While we’re always exploring new technology to enhance our work, we haven’t found any tools that can fully replace the human eye.

Some solutions can flag metadata anomalies or hidden signals we might miss otherwise, but they’re just one part of the equation. For tasks like reviewing open-ends, QAing links, and hand-coding responses, human judgment is still essential. We take the time to research, test, and compare emerging tools because we believe staying ahead of evolving technologies is key to delivering real value for our clients.
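To make that concrete, here’s a minimal sketch of the kind of metadata pre-screen such tools perform. This isn’t any particular vendor’s product; the field names and the speed threshold are our own illustrative assumptions, and the point is that it flags records worth a human look rather than rendering a verdict.

```python
# A toy metadata pre-screen. Field names ("seconds_to_complete", "ip",
# "device_hash") and the speed threshold are illustrative assumptions only.
MIN_PLAUSIBLE_SECONDS = 120  # tune to the survey's actual length

def flag_metadata_anomalies(records):
    """Return {respondent_id: [reasons]} for records worth a human look."""
    flags = {}
    seen_devices = {}
    for r in records:
        reasons = []
        if r["seconds_to_complete"] < MIN_PLAUSIBLE_SECONDS:
            reasons.append("completed suspiciously fast")
        key = (r["ip"], r["device_hash"])
        if key in seen_devices:
            reasons.append(f"shares IP/device with respondent {seen_devices[key]}")
        else:
            seen_devices[key] = r["id"]
        if reasons:
            flags[r["id"]] = reasons
    return flags

respondents = [  # hypothetical records
    {"id": "r1", "seconds_to_complete": 95,  "ip": "203.0.113.7",  "device_hash": "a1"},
    {"id": "r2", "seconds_to_complete": 610, "ip": "198.51.100.4", "device_hash": "b2"},
    {"id": "r3", "seconds_to_complete": 88,  "ip": "203.0.113.7",  "device_hash": "a1"},
]
for rid, reasons in flag_metadata_anomalies(respondents).items():
    print(rid, "->", "; ".join(reasons))
```

Flags like these open a conversation; a human still makes the call.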

We also want to better understand both what works from a technology standpoint and what works from a behavioral standpoint, so we can deliver the best possible outcomes for our clients. Yes, everybody’s talking about automation and tools, but very little of it will remove human insight from quality assurance entirely. Ultimately, we add value by layering technology and human experience on quality, giving our clients greater confidence in their decision-making.

The Importance of Context

One session at the conference focused on a human-first approach in qualitative research, and we found a number of parallels to our approach in quantitative research. The speaker reminded us that the average American reads at an eighth-grade level, so it’s important to write to that level when you can, in concise, non-technical language and in the appropriate context. That often requires a human touch.
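If you want to sanity-check question wording against that eighth-grade benchmark, a readability score is one quick gauge. Here’s a minimal sketch using the published Flesch-Kincaid grade formula; the syllable counter is deliberately naive and the sample question is our own hypothetical, so treat the output as a rough signal, not a verdict.

```python
import re

def count_syllables(word: str) -> int:
    """Very rough syllable estimate: count vowel groups, trim a silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

# Hypothetical screener question; aim for roughly grade 8 or below.
question = ("Thinking about the beverages you purchased in the past month, "
            "which attributes most influenced your selection?")
print(f"Approximate reading grade level: {flesch_kincaid_grade(question):.1f}")
```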

In fact, we often find ourselves discovering things mid-project and course correcting. For example, an open-end review might uncover poor-quality responses; when we dig in, we discover the problem is how a question is worded, so it needs to be tweaked.

Part of that is a quality assurance checklist: Does the context of the response match the question? Are the responses following a pattern that seems suspicious? Do the responses make sense for the audience being asked? These things are crucial but hard for automated tools to spot.
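That said, a simple pre-screen can still queue suspicious open-ends for a person to review. Here’s a sketch, with hypothetical data, that flags very short or verbatim-duplicated answers; everything it surfaces still goes to a human for the context checks above.

```python
from collections import Counter, defaultdict

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so near-identical copies match."""
    return " ".join(text.lower().split())

def flag_open_ends(responses, min_words=3):
    """Surface open-ends worth a human look: too short, or repeated verbatim
    across respondents. This is a pre-screen, not a verdict."""
    normalized = {rid: normalize(txt) for rid, txt in responses.items()}
    counts = Counter(normalized.values())
    flagged = defaultdict(list)
    for rid, txt in normalized.items():
        if len(txt.split()) < min_words:
            flagged[rid].append("very short answer")
        if counts[txt] > 1:
            flagged[rid].append("identical to another respondent's answer")
    return dict(flagged)

responses = {  # hypothetical open-ends
    "r1": "Good",
    "r2": "I like the taste and the price is fair.",
    "r3": "I like the taste and the price is fair.",
}
for rid, reasons in flag_open_ends(responses).items():
    print(rid, "->", "; ".join(reasons))
```

Whether a duplicate is fraud, a bot, or just a disengaged respondent is exactly the judgment call the next section is about.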

Is it Fraud or Lack of Engagement?

That’s why we believe quality control should go beyond verification and uniqueness; it should also ensure that respondents are engaged and understood. We’re actively exploring ways to support this, including positive reinforcement techniques and thoughtful design choices. Because at the end of the day, data quality isn’t just about detection; it’s about connection.

The Takeaway on Data Quality at Research Results

For us, a human-first approach is a no-brainer because it’s how we’ve always been at Research Results. It’s how we have the relationships we have. It’s how we have the clients we have. Keeping that element at the forefront is essential to having quality and confidence in the data and results. Technology will keep evolving, and yes, it’s helpful as a tool. But it’s people who know how to apply it, who know what to look for when the data isn’t as expected, and who have the expertise and guidance to ensure you have the quality insights necessary to make confident decisions.

Look for more on our human-first approach and confidence-over-quality process in future blogs.

For more information on how we can help with your specific market research needs, contact Ellen Pieper, Chief Client Officer, Ellen_Pieper@researchresults.com, or 919-368-5819 today.