Description
Ernest Baskin has built his career across academia and industry, teaching market research while applying it in real business settings. In this episode of The Curiosity Current, he explains why market research continues to fail organizations, even when they believe they are surrounded by data. The conversation opens with the New Coke example. Ernest reframes the failure as a breakdown in how research was used, not whether it existed. Qualitative insights that signaled emotional backlash were sidelined in favor of cleaner quantitative summaries. The decision appeared sound on paper and failed in the real world. Research fails when inconvenient insight is ignored or when the process is shortened. Stephanie and Molly explore a common executive mindset: companies already have more data than they can handle. Ernest explains why this logic falls short. Most internal data looks backward. It shows what happened, not why it happened. Without direct customer conversation, teams miss motivations, barriers, and tradeoffs that shape real behavior. The episode also challenges the belief that primary research must be slow or expensive. Ernest explains how outsourcing expertise has normalized inflated costs, while modern tools allow teams to design and run credible research quickly when expertise stays in-house. The conversation closes with behavioral nudge examples showing how small environmental changes can quietly but meaningfully shift behavior, reinforcing why well-designed research still matters.
Episode Resources
- Ernest Baskin on LinkedIn
- Saint Joseph's University Website
- Stephanie Vance on LinkedIn
- Molly Strawn-Carreño on LinkedIn
- The Curiosity Current: A Market Research Podcast on Apple Podcasts
- The Curiosity Current: A Market Research Podcast on Spotify
- The Curiosity Current: A Market Research Podcast on YouTube
Transcript
Ernest - 00:00:01:
What I would like companies to start doing is to start writing their hypotheses and their methods of analysis ahead of time so they can be more sure of what they're finding going forward. And similarly, even on the secondary data side, if they're working with large datasets, what I would recommend is, similarly, do a preregistration and even separate the dataset in half, do as much analysis and data mining on one half of the dataset. And on the other half of the data set, see if your conclusion is still valid, right? Preregistration is super, super important.
Molly - 00:00:38:
Hello, fellow insight seekers. I'm your host, Molly, and welcome to The Curiosity Current. We're so glad to have you here.
Stephanie - 00:00:46:
And I'm your host, Stephanie. We're here to dive into the fast-moving waters of market research, where curiosity isn't just encouraged, it's essential.
Molly - 00:00:55:
Each episode, we'll explore what's shaping the world of consumer behavior from fresh trends and new tech to the stories behind the data.
Stephanie - 00:01:03:
From bold innovations to the human quirks that move markets, we'll explore how curiosity fuels smarter research and sharper insights.
Molly - 00:01:12:
So, whether you're deep into the data or just here for the fun of discovery, grab your life vest and join us as we ride the curiosity current.
Stephanie - 00:01:23:
Today on The Curiosity Current, we are joined by Ernest Baskin, Department Chair and Associate Professor of Food Marketing at Saint Joseph's University.
Molly - 00:01:32:
Ernest is a consumer behavior researcher, consultant, and speaker who helps companies connect more meaningfully with consumers through smarter, more human-centric research. His work spans behavioral nudges, survey design, environmental effects, and consumer psychology, with a deep expertise in the food industry and hands-on experience working with Fortune 500 brands.
Stephanie - 00:01:54:
Ernest has also been recognized as a top 40 under 40 best MBA professor by Poets and Quants and has published research in leading journals exploring how subtle cues shape perception, value, and choice.
Molly - 00:02:06:
Today, we're digging into a question that many companies have wrestled with, especially smaller or resource-constrained ones. Why does market research matter so much, and why does it not have to cost an arm and a leg to do it well?
Stephanie - 00:02:18:
Love it. Ernest, welcome to the show. We're so happy to have you.
Ernest - 00:02:21:
Thank you.
Stephanie - 00:02:22:
Well, I wanted to get us started with a little bit of your background. So, you've lived in both worlds: academia, industry. And before we start to dive into, as Molly hinted at, you know, budgets and methods and things like that, I was curious if you could talk a little bit about when did you personally realize the power of consumer research to shape real business outcomes, not just academic insight or foundational understanding, which is, of course, important, but it often feels very upstream from something tangible like a business outcome. Did you have a moment or a project that really kind of cemented that for you?
Ernest - 00:03:00:
Yeah. So, my career has been both in academia and teaching market research and in the corporate world, doing market research. And I think that in both, I've seen so many examples of how market research has the power to shape discovery and strategy. But, also, I've seen and heard many stories about how when you have a bad market research project, and you don't do it well, or you don't do it at all, product launches and things like that can often fail, and you cause a lot of monetary damage to your brand. And my favorite example of this, and I teach this in my classes quite often, is New Coke. We all heard the traditional market research story of how New Coke failed. And one of the key findings in that story is they did qual, they did quant. The qual showed a little bit about how there was going to be some backlash, and the qual was sidelined. And I think that was when I first learned that market research can be pretty powerful, but it's not just doing market research; it's actually paying attention to the entire market research process and integrating all of the insights that you find in it in search of the answer to your final question.
Stephanie - 00:04:26:
For sure. That is an iconic example. And I, you know, know about New Coke and was around when it happened. I remember being like, ew, but I did not know what research had and had not been done. That's fascinating.
Ernest - 00:04:42:
Yeah. And so just to go back to that for a second, New Coke, you don't hear that. You don't hear the market research story. Most of what you hear is, hey, this is a product. We launched it, and it failed. But this was an era in our history where data was paramount, quantitative data was paramount, and it feels to some extent like we might be swinging back to that in the market research realm a little bit. And so this was qualitative research data that they did, and they had a couple of people in some of their focus groups essentially voice the fact that they would never drink something like this coming from the Coke brand. And because those people were essentially outliers but vocal outliers, when you do a summary in the quantitative research, they basically came out as noise.
Stephanie - 00:05:31:
Wash out. Yeah.
Ernest - 00:05:32:
And so they never actually made it to the final decision makers and weren't really considered in the final decisions.
Stephanie - 00:05:40:
Super interesting. Now, just real quick, is there a similar story about Crystal Pepsi or something?
Ernest - 00:05:45:
Not that I'm aware of.
Molly - 00:05:47:
That is such a poignant story about the use of data, but I feel like so many companies still will say, we have the data, we already have all the information that we could possibly want, we're drowning in dashboards, I don't need more clutter in my head. So, from your perspective, why is intentional market research still essential to do? And how do you explain to a CEO or another stakeholder of these companies what the difference is between a raw spreadsheet with thousands of lines of data versus an actual piece of actionable insight?
Ernest - 00:06:20:
Are you sure you don't work for the food industry? Because they say that all the time. So, I work a lot with the food industry, and they have so much data. Every time you're at a grocery store, everything you buy gets scanned, and it's associated with a loyalty card, so you can be tracked every which way. So, they are often drowning in data, right? They know a lot about you. And so the way I like to think about it is it's not so much about the data, it's about how you look at it, and so much of the data that we have is backwards-looking, essentially. So, it can give you ideas about what will happen going forward, but past results are not necessarily gonna be correlated with future results. And when you're thinking about what promotions to run, how to design your packaging, how to do a product launch, secondary data is helpful, right? No doubt about it. But it needs to be coupled with something that's intentional. You need to be actually testing the thing that you're going to do. You need to be talking to your customers. The other thing that I find in doing intentional market research is that you can get the why behind the data. So, we have all the scanner data, but a lot of times, it doesn't tell us the why. You really only get the why when you talk to the actual customer, and that's super important.
Molly - 00:07:50:
And the why being: why did you make this product choice instead of that choice, instead of just having a line that says this customer made this choice? Do with it as you will.
Ernest - 00:08:00:
Exactly. And you can't really fix a problem if you don't know that it exists. So, until you talk to the customer, you don't know what their specific barrier was to purchasing your product.
Molly - 00:08:12:
Right. I'm just curious too: you have all that data, and, yes, it can't look forward, but is there also an avenue to look at trends by a particular customer? So, if they were previously buying something at a big scale and then suddenly slowed down, is there a way to isolate that and ask questions? Can it inform questions that inform future strategy?
Ernest - 00:08:37:
Oh, absolutely. So, in the typical market research process, the way that I talk about it to my classes is you have to start with all that data first, right? Because the data tells you, it helps you form a hypothesis, it tells you what is likely to happen, but then you need to test it because you might be wrong. Or you might have found a pattern in the data that might not be true in reality because it just happened coincidentally. It was just noise. We've all seen something in the data that, upon further digging, turns out to not replicate in a different scenario, turns out to not replicate in a different store or whatever. You need to test all these things, but I'm not advocating for not looking at the data. Don't get me wrong. You have to look at the data because that's how you get your hypothesis. That's how you build your business sense. The data helps inform the types of questions that you're going to ask in the qualitative and quantitative, and how you're going to build that market research process going forward.
Molly - 00:09:38:
Right. And it's bridging the gap that this spreadsheet is not your answer. There's a whole process involved, and that needs to be communicated.
Ernest - 00:09:48:
Exactly.
Stephanie - 00:09:49:
So, there are definitely still perceptions among some business leaders that primary research, which is kind of what we're talking about right now, is slow. It's expensive. It's overly academic. Basically, it doesn't move at the speed of business. I have a hypothesis about this, and I'm very curious to get your take. First of all, if you think that's a fair perception today, given just, you know, the dramatic kinds of technological advances we've had. But then also, what I'm really curious about is when that criticism is levied, what is usually going on? Like, are there typically bigger systemic issues in an organization that prevent research from moving faster?
Ernest - 00:10:30:
So, first of all, I think this is an unfair criticism, and it was an unfair criticism ten years ago, even, which I think would surprise some people. And this is coming from two main issues. One issue is that people don't wanna justify doing market research. They just have a business sense that something is true and just wanna go with the flow. That's one idea. The other idea is companies have basically externalized their market research to a degree that a bunch of consulting companies, experts, etc., come in and they quote a qualitative study for 20 grand or a quantitative study for 30 grand. And companies have essentially normalized that as this is okay. We can do this. They'll come back. They'll give us a PowerPoint presentation, and then we will get an answer. And once you've normalized 20,000 or 30,000, that's what you spend, you don't even question whether or not there's another way, right? And so if you know anything about academia, you will know that we have very little money. So, when I wanna do research, I cannot be spending $20,000 or $30,000, right? I can probably spend $400. That's probably it on a study. So, what do I do? I still do market research. And this really surprises every single person in all my classes and all the companies that I talk to. What you need to do is you have the expertise, right? As long as you're willing to not offload the expertise, you can DIY the whole thing. So, there are platforms where you can build surveys. There are platforms where you can field your surveys to a panel, right? And they are very cheap, right? And so as long as you don't offload the expertise, market research can be really cheap. And it is, I think, eye-opening to a lot of people that this exists. Right? That you can do research very quickly, your data can come back really quickly, and you can make decisions really quickly. In fact, you can do it in a day, and I've done it in a day.
Stephanie - 00:12:47:
Yeah. No. We have too. I was gonna say, Molly and I, the company that we work at, aytm, is a research technology platform, essentially, right, at its heart. So, we definitely live in that model of empowering our customers and the brands that work with us to do their own research. We're here to help them as much or as little as they need, but the whole idea is that empowerment to move fast and to move more cheaply than you could otherwise. I'm curious about something else that I think goes on in organizations, and I would love to get your take on it. Over my time in supplier-side research, I've really noticed that the companies that proactively build market research into their innovation process, whether it's ad innovation, product innovation, or food innovation, versus being reactive and not doing research unless something comes up, are able to move so much more quickly. They end up taking a very templatized, programmatic approach to the research that makes it just lightning fast; you get on the flywheel of doing that research every time you hit that stage-gate part of the process. And it's the companies that are more reactive, which I get happens, the smaller you are, the more you tend to be like that, and there are lots of reasons that you might be in a more reactive mode, but it's a lot harder to move quickly when you are only reacting instead of proactively engaging insights.
Ernest - 00:14:15:
I completely agree. And giving that template to this is how we do our processes, really tells your employees, hey, you can't shortcut this. You need to do the market research because it's always very tempting to just go really, really quickly, especially when you have a lot of demands on your time. And even if the market research can be really done quickly, it still takes time. It's not zero, right?
Stephanie - 00:14:42:
It’s not zero.
Ernest - 00:14:43:
Could be a few hours, but it's not zero. So, very tempting to shortcut, and this forces people not to shortcut that process.
Molly - 00:14:52:
How do you bridge that gap of ensuring that that expertise still sits within an organization? So, if you have an organization that's trying to be very scrappy, has a very limited budget, or is even just trying to build research into their normal business processes, and nobody in the organization knows research, is there an avenue or another technology that these organizations can use? I mean, my answer would probably be the obvious one. I don't know if we had this on our agenda to talk about today, but AI is empowering people who don't understand research. So long as the tool is built by someone who does, they can say simply what they're looking to do, and it can build out that questionnaire for them pretty quickly.
Ernest - 00:15:34:
My thought is a little bit different. I think that you can maybe start with a questionnaire on AI, but AI is gonna give you something very, very generic unless you do some very specific prompting. And from what I've seen in my own personal experimentation, it hasn't gotten to the point where it can detect bias in the survey questions that it's trying to generate. Or we know from research that the order of questions matters. So, for example, if I ask, “How healthy are these French fries?” And then I ask, “How much do you like these French fries?” Well, guess what? The healthiness attribute of the French fries becomes way higher in the way you evaluate how much you like them, versus if you switch the order of the questions. So, that matters. The AI isn't quite capturing that, and I think that you really need to get some training. I think it's important. That's the way you don't offload expertise. You get some training in the basics of survey design: what are biases, how do I write good questions that participants understand, and that will get you there. What I will say is the survey platforms have done a really, really nice job putting a lot of free educational resources out.
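The question-order effect Ernest describes can be checked with a simple split-ballot design: randomly assign respondents to one of the two question orders and compare the liking scores. Here is a minimal sketch in Python using simulated responses; the 0.8-point order penalty, the 1-to-7 scale, and the sample size are illustrative assumptions, not real survey data.

```python
import random
import statistics

# Hypothetical split-ballot check for a question-order effect.
# Half the respondents answer "How healthy are these fries?" before
# "How much do you like these fries?"; the other half see the reverse
# order. All scores below are simulated, not real survey results.

random.seed(42)

def simulate_liking(health_first: bool) -> float:
    """Simulated 1-7 liking score; asking about health first is assumed
    to lower liking of the (unhealthy) fries by 0.8 points."""
    score = random.gauss(5.0, 1.0)   # baseline liking
    if health_first:
        score -= 0.8                 # assumed order-effect penalty
    return min(7.0, max(1.0, score))  # clamp to the 1-7 scale

health_first = [simulate_liking(True) for _ in range(500)]
liking_first = [simulate_liking(False) for _ in range(500)]

print(f"health question first: mean liking = {statistics.mean(health_first):.2f}")
print(f"liking question first: mean liking = {statistics.mean(liking_first):.2f}")
```

With real data you would replace the simulation with the two fielded cells and run a significance test on the difference; the point of the sketch is only that the order manipulation itself is the experiment.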
Stephanie - 00:17:00:
Absolutely.
Molly - 00:17:01:
Oh, absolutely. Yep.
Ernest - 00:17:02:
I think now more than ever, if you wanna educate yourself on how do I make a quality survey, the resources for doing that are there. It's just a matter of carving out some time. And I promise you the investment into that is worth it.
Stephanie - 00:17:19:
That's such a good point, and I haven't heard anyone talk about that in a while. Just a personal anecdote about that: I moved from academic research to market research in 2008. So, it's been a while now, but at the time, you have the fundamental basics, right? You know about good experimental design; those are the things that carry over. But there were methodologies like conjoint that I had the statistical background to understand, but they weren't used a lot in psychology, for instance. I learned everything that I knew, and ultimately became somebody very well-versed in advanced analytics, simply by looking at websites, Sawtooth, right, so many great resources out there by experts that you have easy access to. So, I couldn't agree more with that.
Ernest - 00:18:10:
Yeah. And also, if you reach out to academics, most academics are really friendly and always happy to help, to be quite honest.
Stephanie - 00:18:16:
So, Ernest is saying give him an outreach. He's there to answer your questions. I love it.
Ernest - 00:18:22:
Friend me on LinkedIn.
Molly - 00:18:24:
I love a shameless plug. I’m just kidding.
Stephanie - 00:18:26:
Okay. So, we've started to hit on this a little bit, but I have to say, like, as a consultant and a professor, just the sheer number of studies you've seen evaluated and conducted is pretty large. What are some of the most common mistakes that you see organizations make in how they approach research? Are there any, you know, common mistakes that we can sort of call out here for people to learn from?
Ernest - 00:18:51:
So, I have a lot of pet peeves with regard to how corporations conduct market research. And one of the exercises I typically do in my classes is I have them go to one of these market research panels and just take some market research studies, and we come back and say, is this a good question? Is this not a good question? Why or why not? So, biases are very, very common. Another thing that's very, very common is the use of jargon. So, in the food industry, one example of this is something called own brand or private label. Sometimes, participants do not understand what that means. And in market research, if a participant doesn't know what the word in your question means, then they're gonna answer with a lot of noise, which means that the result you get is not gonna be good. The other common practice that I see in market research is that companies often like to concentrate on their own products, right? And that makes sense. You have a product concept. You wanna test it. You don't wanna expend any effort on anything else. It makes total and complete sense. The problem is, if you go into the academic research, there is a deep, deep literature on the fact that people make different decisions when they're looking at things in joint evaluation, meaning multiple products side by side, versus separate evaluation, meaning one by itself. And think about retail grocery. What is the evaluation mode in retail grocery? It's joint. It's side by side. What is the typical evaluation mode in a product concept survey? It's separate. And so the problem is you're making a decision based on what people are telling you in separate evaluation mode, and then in the store, they're using joint. And so your product could do well in one and really, really poorly in the other. That's really, really problematic.
Stephanie - 00:20:55:
It totally is. And it's so funny that you bring that up, because that is something we think about all the time. I hate to keep bringing it back to us, but we have a virtual shelf test and an e-comm test that we specifically advise people use for late-stage concept work, because you're absolutely right. The in-context evaluation is totally different. Your attention is being pulled, and you've got so many different things going on at that shelf, conscious and then unconscious, that contribute to the decision.
Ernest - 00:21:27:
Exactly. 100%. You have to do that. There's just no ifs, ands, or buts because, otherwise, and you have to know what your end goal is, right? Your end goal is: how is my product gonna perform on the shelf against competitors? So, you need to have competitors there to perform against. End of story.
Stephanie - 00:21:45:
Absolutely. Well, I want us to be able to touch on some of your academic work because we haven't done that yet. I'm really fascinated by your work in behavioral nudges. And I wondered if you could talk a little bit about how small details and actions can dramatically shift perceptions, and maybe give us a real-world example where a subtle change in wording, environment, or framing significantly impacted behavior?
Ernest - 00:22:09:
Yeah. So, I have a couple of really cool studies that I've gotten to work on in my career, and then maybe I'll just talk about two of them. One of them was early in my career. I worked with the Yale Health System, and so their goal was to try and figure out how to get people to go and vaccinate against the flu. And so you would think that that's difficult, right? You would think that there are multiple different ways, but the easiest way it turned out was that you just add a map of the location in the email that you sent out, showing where to get vaccinated. And the funny part is this email gets sent to Yale students and faculty. They all know exactly where the health system is. It's right on campus. But just that small, little nudge of showing them a map is enough to raise participation by a lot of basis points, which is really cool.
Stephanie - 00:23:09:
That's so cool.
Ernest - 00:23:10:
And then the other really cool study I worked on was a study for the Google food team. So, they were interested in taking a look at what nudging does to people's eating habits. And so at Google, they have these microkitchens, and all of the food is free in those microkitchens. And one of the key reasons that people go to these microkitchens is that they're going for a drink. So, they go for a soda, they go for a coffee, etcetera. On the other side of the soda and coffee are the snacks. And so one of the things that we looked at was how distance away from the soda and the coffee, etcetera, affects people's desire to go get snacks. And it was really interesting. We found out that, because there are two entrances to this thing, on one side, the snacks were closer to the coffee and the soda; on the other side, they were further away. It's not a big difference. It's a difference of basically six feet, so not huge. But there was still a significant difference: in the further-away section of this microkitchen, people were less likely to go after the snacks when they were going there to get a soda or coffee, which is pretty cool and just shows you that really minor things in the structure of the environment can change what you are picking and can change your whole eating behavior.
Stephanie - 00:24:46:
I think something that I really like about you using those examples too is that those are not self-reportable things. You can't ask somebody. Like, would six feet closer make it more likely that you would get a snack? Because they have no idea that that's happening.
Molly - 00:25:00:
Yeah.
Ernest - 00:25:01:
You could, but they'd say, of course not. Never.
Stephanie - 00:25:04:
Super cool.
Molly - 00:25:05:
Humans are so weird. The things that subconsciously influence people's decisions are constantly mind-boggling. I had not heard of that precise example, or the extent to which it can make an impact, before.
Ernest - 00:25:23:
Yeah. Another really cool example is that if you have a product mix on your shelf, or if you have a product mix, for example, in a consulting proposal for those of you in that business, right, and you give people an option that they are unlikely to pick, one that's worse in every possible way but close to another option in attribute space, people are more likely to pick that nearby option. They're thinking about it in terms of, hey, now I have a good reason to pick it: it's much better than the other option near it. Even though before, they could have been indifferent between the other two options that were in the set.
Stephanie - 00:25:58:
I may have used that one a few times.
Ernest - 00:26:01:
It's called the attraction effect for those who wanna look it up.
Molly - 00:26:04:
So easily swayed. I wanna switch gears a little bit, but still talking about your career in academia. You teach future marketers and market researchers, and you work with them to make sure that they feel comfortable asking better questions and using research confidently. We've had guests on the pod a handful of times who have really talked about being the voice of the customer, and how the data and the insights that are uncovered are not necessarily what the C-suite or the stakeholders want to hear. Or the New Coke example: that's not what they wanted to hear. That's not the direction. It goes against investment. It goes against work. It goes against everything that they put into, you know, perhaps this new product. So, how do you instill in these upcoming professionals that they have to ask hard questions and use research confidently to advocate for consumers and potentially save the business from big, costly mistakes?
Ernest - 00:27:08:
So one of the things that I tell my students is you really need to go in prepared when you have these conversations. You need to know, off the top of your head, the data behind the claims you're making. And one of the ways that you can be confident is to make sure that you've pressure tested the way in which you've talked to your consumers: the way in which you've done your qual research, the exact survey you've used. Because if you don't, you're not completely confident that you have data that is without bias, and then there can be pushback. So, I've also taught people on the other side of the table, right, the managers that are evaluating that research. And for them, my advice is, well, if you wanna pressure test this, ask for the data. Or if you don't wanna ask for the data, just ask for the questions of the survey and make your own judgment as to whether or not there was bias and in which direction the bias might have occurred. So, one exercise I often do in class, just to show students how meaningful these changes are, is I don't tell the students that I do this, but I divide them up essentially into an experiment: half of them get one version of the survey, half of them get another version of the survey, and they fill it out. And then we see how different their answers are, basically showing that, hey, look, this stuff is really important. You have to know it either from the decision maker's perspective or the market researcher's perspective.
Stephanie - 00:28:43:
Yeah. That's compelling.
Molly - 00:28:44:
I kind of wanna sign up for your class.
Ernest - 00:28:46:
I’ll put you up.
Molly - 00:28:48:
Alright, Ernest. Well, thank you so much for all of these incredible thoughts from two totally different lenses. I feel like I should be taking more notes on the takeaways you've given for all aspects and all stakeholders of the research world. So, thank you so much. I wanna pivot to a quick round of what we call on the show here, Current 101, where we ask all of our guests the same question. What is something that the market research industry should stop doing immediately, and what is something that you think is beneficial that we should be doing more of?
Ernest - 00:29:22:
Okay. Great. So, it's a good question. So my pet peeve, I think, is jargon. I think that needs to be excised, and I think that companies really need to remove it from their surveys in whatever manner it's there, so that consumers actually know what they are being asked, because there's no faster way to turn off a participant from your survey than if they don't understand what they're being asked. And then on the opposite side, a simple, high-impact habit that I would like folks to adopt is to start preregistering their hypotheses ahead of time. So, this is coming from me as an academic. There's a big movement in academia right now to ensure replicability in the search for truth in academic research. And one of the ways to do that is to hamstring yourself a little, right? One concrete example of this that I can give is that when you're doing a survey, there are always gonna be outliers in some way, and you're always gonna be able to justify removing any kind of person as an outlier, right, 2.5 standard deviations above the mean, 3 standard deviations above the mean. All of these are perfectly valid. None is better than the other, but you could inadvertently remove whichever version best fits your hypothesis, so that your hypothesis then becomes true. And so what I would like companies to start doing is to start writing their hypotheses and their methods of analysis ahead of time so they can be more sure of what they're finding going forward. And similarly, even on the secondary data side, if they're working with large datasets, what I would recommend is, similarly, do a preregistration and even separate the dataset in half. Do as much analysis and data mining on one half of the dataset. And on the other half of the dataset, see if your conclusion is still valid, right? Preregistration is super, super important, and I don't know that that movement has made its way to the corporate world as much as it has overtaken academia.
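The split-half approach Ernest describes can be sketched in a few lines: write down your hypothesis and your outlier rule up front, explore one random half of the data, then check the same preregistered rule on the held-out half. Everything below is simulated and illustrative; the 2.5-SD cutoff and the "average spend exceeds $95" hypothesis stand in for whatever a team would actually preregister.

```python
import random
import statistics

# Simulated purchase amounts standing in for a real secondary dataset.
random.seed(7)
purchases = [random.gauss(100, 20) for _ in range(1000)]

# Randomly split the dataset in half: one half for exploration and
# data mining, one held out for confirmation.
random.shuffle(purchases)
explore, confirm = purchases[:500], purchases[500:]

def trimmed_mean(values, sd_cutoff=2.5):
    """Mean after removing outliers beyond the preregistered SD cutoff.
    The cutoff is fixed in advance so it can't be tuned to the result."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    kept = [v for v in values if abs(v - mu) <= sd_cutoff * sd]
    return statistics.mean(kept)

# Exploratory half: form the hypothesis (here, "average spend exceeds $95").
hypothesis_holds = trimmed_mean(explore) > 95

# Confirmatory half: apply the exact same preregistered rule to untouched data.
confirmed = trimmed_mean(confirm) > 95

print(f"exploratory half supports hypothesis: {hypothesis_holds}")
print(f"held-out half confirms it: {confirmed}")
```

If the confirmatory half disagrees with the exploratory half, the pattern found during mining was likely noise, which is exactly the signal this procedure is designed to surface.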
Stephanie - 00:31:38:
Yeah. I don't think that it has. I don't see it very often at all, and I'm familiar with that too, you know, from my background. But it's interesting because I think that in corporate America, with this kind of market research, you're often protecting yourself not against your own biases, but against the biases of your organization or the product owner who just feels like this product is a winner. That almost makes it even more salient and more important to do in this kind of environment, and I don't think people do it very often.
Ernest - 00:32:13:
And one of the things that I will say is this is a way to get product owner buy-in. So, you can go through with the product owner exactly what you're gonna do and just preregister it. So, if you then come back and say, well, it looks like we found the opposite, you can also say, and you agreed that this was the correct way of looking at things.
Stephanie - 00:32:36:
For sure. I wanna just really quickly, too, because it's on my mind, talk about, you know, you were talking about removals, you know, like, when you're cleaning your data, right, and getting your dataset ready for analysis, and kind of looking at outliers, right? I feel like data quality is a huge issue in the industry right now. Everybody's very concerned with it, which I get. But I am having this conversation on at least a weekly basis right now with customers, about, you know, like, I mean, certainly, we want clean data. We don't want bots in our data. We don't want inattentive survey takers or people who qualified incorrectly into our surveys. But there is a danger in over-cleaning, and it's exactly what you said. I'm like, do we really wanna make a sanitized version of this universe that is only full of the most conscientious people who shop for this? Because I don't think that's what you want. Like, the real world is messier than we would like it to be, but we have to acknowledge and live with some of that messiness. Otherwise, we're removing some of the reality from the dataset, you know?
Ernest - 00:33:37:
Yeah. I mean, there's basic checks. I agree with you. And in some platforms, not yours, obviously, but in some platforms, you even have people that are VPNing in from other countries.
Stephanie - 00:33:48:
Oh, sure! Yeah.
Ernest - 00:33:50:
And so all of that is just table stakes. But at the end of the day, there's degrees of freedom. And I think the degrees of freedom is what you really wanna concentrate on. Sometimes you wanna leave those in, sometimes you don't. And I think the danger comes in when you make that decision post hoc based on whether it gets you the results you want or not.
Stephanie - 00:34:11:
Fair enough.
Molly - 00:34:12:
Well, Ernest, it's been an absolute pleasure to talk to you today. To kind of close this out, I'm curious if you could, you know, for the founder, the marketer, the product leader who maybe is listening and still feels that research is more of a nice-to-have, but not essential. What is one thing that you might want someone like that to understand about the real cost of not doing research?
Ernest - 00:34:37:
Yeah. I think that not doing research means you're not listening to a customer at the end of the day, and your product is going to a customer. And at the end of the day, even if you're just making a dollars and cents comparison, the cost of product launch failure is always multiple times higher than the cost of market research. So, market research is money well spent in my opinion.
Stephanie - 00:35:03:
That's it. Yeah.
Molly - 00:35:05:
Okay. I actually, I know we're kind of going backwards, but when you mentioned wanting to stop jargon in surveys, I had, like, a follow-up to that, because I was just really curious. Say you're in a very niche company, or you have very specific things that you wanna ask, and you wanna make sure it's an apples-to-apples comparison to the way that you think about things internally. What kind of audit or test would you suggest for a person who's like, I think about this in this way 24/7, and I have no idea how to detach my brain from thinking about it like an expert so I can think about it like a layperson? Is there a process you recommend for making sure that your surveys don't inadvertently have jargon that the average person won't understand?
Ernest - 00:35:56:
I think that asking someone in your company is a very, very bad idea. That's how all the jargon makes its way in there. What I recommend is, at a start, you ask your parent or your grandma or somebody, or just ask a regular customer. And so if I were to abstract from that answer, it's: do a pretest outside of your company. Just have people take the survey, and at the same time as they're taking it, have them report what they think certain words mean, report whether or not they're confused about the way that you're writing the questions, and report on how they're interpreting the questions. I often advocate doing a qualitative session with someone taking the survey while you watch them as they go. You can learn a lot about your survey design, and it will give you a much better survey. That's what I do in my classes. Students create a survey, then they get into groups and walk each other through their surveys. And it gets some nice results usually.
Stephanie - 00:37:04:
We had a client.
Molly - 00:37:05:
A little research on research. I love it.
Stephanie - 00:37:18:
Yeah. We had a client for a long time who would always have her teen son take the surveys. And it was great because she would just be like, he said this, we gotta change this, Stephanie. And I would be like, we're literally getting all our feedback from a teenage boy. But, like, it was better than the three of us who were experts and way too close to this particular, you know, topic. So, it was great. I loved it.
Ernest - 00:37:34:
I love it.
Molly - 00:37:35:
Thank you so much, Ernest, for joining us again today. I feel like we keep saying this, but this has been a super interesting and engaging episode. You have so many perspectives and a wealth of information that you've been able to share with our audience. So, this has been an excellent episode, and thank you so much again.
Ernest - 00:37:49:
Thank you both. Really great being on.
Molly - 00:37:51:
Awesome. And I am gonna hit you up. I'm gonna take your class.
Ernest - 00:37:54:
Love it. I'm excited to get you there.
Molly - 00:37:56:
Take care.
Outro - 00:38:58:
The Curiosity Current is brought to you by aytm. To find out how aytm helps brands connect with consumers and bring insights to life, visit aytm.com. And to make sure you never miss an episode, subscribe to The Curiosity Current on Apple, Spotify, YouTube, or wherever you get your podcasts. Thanks for joining us, and we'll see you next time.