Data quality isn't a filter. It's a discipline.

Posted May 01, 2026

Meet Jonathan Goodbread, aytm's new Head of Data Quality Strategy.

Jonathan Goodbread has a reputation in market research for saying the thing everyone is thinking but no one wants to say out loud: The industry's approach to data quality isn't working.

Not because the tools are bad. The fraud detection is sophisticated. The AI-powered screening is real. Post-collection cleanup engines are more powerful than they've ever been. But 40% or more of survey responses are still routinely flagged and removed before analysis begins—and that number keeps climbing.

Goodbread's argument is that the industry has been optimizing the wrong end of the process. "The question isn't how to catch more bad responses," he says. "It's how to build the conditions where good responses are the norm."

That distinction—between filtering and discipline—is the reason he just joined aytm as Head of Data Quality Strategy.

Good data starts before collection begins

Most data quality infrastructure starts from a single assumption: Bad data will get in, and your job is to get it out. That assumption shapes everything—where investment goes, what gets measured, what "quality" actually means in practice.

Goodbread sees a different model. One that treats quality as a discipline running through every layer of research—how you build respondent relationships, how you design your instruments, how you screen before collection begins, and how you analyze after it ends.

A filter-first approach accepts contamination and optimizes for removal. A discipline-first approach reduces contamination at the source and validates at every stage. Both care about quality. They produce very different research.

The infrastructure we’re building on

"What drew me to aytm is that they're not treating data quality as a box to be checked," says Goodbread. "They've built systems that take it seriously at every stage. My job is to make it even more rigorous, more transparent, and more meaningful to the clients who depend on it."

That philosophy shows up in how the platform is built. It starts with PaidViewpoint, aytm's proprietary respondent panel—built on respect, fair value exchange, and genuine long-term engagement. When people feel valued, they provide more thoughtful data. That's the first layer of quality, built before a single survey question is asked.

From there, a deduplication system screens respondents before a survey begins. Validated in-survey quality checks monitor engagement during collection. And Data Centrifuge, aytm's AI-powered engine, runs post-collection analysis that surfaces patterns human review alone would miss.

No single layer carries the full weight. Each one reinforces the others.

"We think about data quality not as a toggle but as a living process we build alongside our clients," says Lev Mazin, CEO and co-founder of aytm. "Jonathan is exactly the right person to help us push that standard further."

Less time cleaning data, more time using it

The organizations building quality into every layer now—rather than relying on post-collection cleanup alone—are the ones delivering research their stakeholders trust without qualification. They're also spending less time re-running studies and more time acting on what they find.

That's the work Goodbread came to do. And it's already underway.
