The cost of silence in research with Amberly Miller

Description

In this episode of The Curiosity Current, Stephanie and Molly talk with Amberly Miller, Director of UX Research at Prudential Financial, about the internal battles that shape whether research drives action or dies quietly in a repository. Amberly’s path from operations to UX research was built in environments where every decision carried real financial consequence. That pressure became the backbone of her philosophy: “measure twice before you cut.” She explains how she dismantled a research culture built on bottlenecks and fear of doing it wrong, and replaced it with a democratized system that invites more voices in, supported with guardrails, automation, and education that maintain rigor. Amberly also speaks about moments when silence in the room signaled politics, risk, reputational fear, or the emotional cost of killing work someone has fought for. She discusses preparing for C-suite conversations by learning the business first, tailoring findings to real KPIs, and speaking with a level of clarity that demands attention. They also confront the limits of AI as a research partner and reinforce why human judgment, tribal knowledge, and empathy still carry the work across the finish line. For any researcher questioning their confidence, Amberly reframes the role entirely: “you are not fighting for your own voice, you are fighting for the customer’s.”

Transcription

Amberly - 00:00:01:  

Your job as a researcher isn't to just collect data, and you're not there to challenge authority with something that you've uncovered. Remember that you are there, and the company has made space for you to be there, to understand and represent the voice of the customer or of your user. So when you're sitting on a finding that you know matters, always remember: if you can't find the confidence for yourself, find the confidence to advocate for your user, because that's what, essentially, at the end of the day, everybody that you work with is kind of relying on you to do.

Stephanie - 00:00:41:  

Welcome to The Curiosity Current, the podcast where we dive deep into what's shaping today's trends and tomorrow's consumers. I'm your host, Stephanie, and I'm so glad you're joining me. Each episode, we tap into the minds of researchers, innovators, and insights professionals to explore how curiosity drives discovery and how discovery drives better decisions in an ever-changing market landscape. Whether you're a data enthusiast, a strategy pro, or like me, just endlessly fascinated by human behavior, this is the place for you. So get ready to challenge your assumptions, spark some fresh thinking, and have some fun along the way. Let's see where curiosity takes us next with this brand new episode. 

Stephanie - 00:01:26:  

Today on The Curiosity Current, we're joined by Amberly Miller, Director of UX Research at Prudential Financial. 

Molly - 00:01:34: 

Amberly leads research strategy across Prudential's global retirement and finance advice businesses, guiding over 100 UX designers and product teams through democratized research practices. With more than fifteen years of experience in marketing, UX, and research, she's built a career on making data accessible, actionable, and human.

Stephanie - 00:01:55:  

In this episode, we will explore how user experience research drives business impact, what it takes to create influence with data, and how research democratization can unlock organizational growth. Amberly, welcome to the show.

Amberly - 00:02:09:  

Thank you so much for having me. I'm so excited to get started and chat with you guys today.

Stephanie - 00:02:13:  

Yeah. Us too. Well, to get us started, so Amberly, you've led UX research and design at some of the largest organizations in insurance and financial services, from American Family Insurance to Centene and now Prudential. Before we really dive in, can you help us kinda level set and talk a little bit about the value of UX research more generally: how it can be leveraged to reduce friction and drive user experience, and maybe most importantly, how you got into this field?

Amberly - 00:02:44:  

I actually found my way into UX research by happy accident. You know, I started out as an admin and then kind of moved into operations. And, you know, that was something that was very much in my family. My dad was an operations manager in the manufacturing industry. So, you know, I thought I was just really good at figuring out how to do things. I lived in San Francisco at the time, so, you know, naturally being there during the big software-as-a-service VC boom, I found myself working with startups and, you know, founders that were kind of chasing incubator money and figuring out their product market fit. And, you know, in that world, you only have so much time and so much money. You know, they refer to it as your runway. And in that environment, every decision counts. And I think that that's where being on the other side of the table and seeing the value of UX research really hit home for me. You know, I love cliches, and one of the ones that I was raised with in my house is that you always measure twice before you cut. And I think that research lets you do just that, you know, before you cut and really spend a magnitude of hard-earned resources, you can spend some of that money on research. And, you know, a common misconception is, it's not just about collecting opinions; research really allows us to make sure every step of what we're doing in business is bringing us closer to what users actually want and what our customers actually want. So, you know, after a year working, you know, as an operational consultant on a project with a really brilliant UX team, I was completely sold. I couldn't believe more companies weren't employing this. It just, kind of, intrinsically seemed like the way that everything should be done. And that's kind of how I found my passion, and I'm so lucky I love doing what I do.
And, you know, that first project experience and also, you know, working with startups really has kind of shaped how I approach research at every scale, whether it's working, you know, with those scrappy startups or with those incubator-based founders or, you know, bringing that to larger organizations like Centene and Prudential. Any dollar that a company can invest in research, no matter what size, you know, can save you costly, frustrating pivots later. And I know if you've been doing this for any amount of time, you've kind of seen that play out. So I love the title of your podcast because to me, UX research really is about turning curiosity into confidence. You know, that confidence for your product teams, for the sales team to go out and sell it, and most importantly, for our customers to know that this is the solution that they're truly gonna benefit from.

Molly - 00:05:28:  

Yeah. And you said - it was funny - I was listening, and you were talking about startups, and why don't people do this more often? And it almost reminds me a little bit, and Stephanie could talk to this too, but kinda how aytm got started is our founder was actually trying to start another business, and he wanted to do some market research testing into his new business, but he was frustrated to find that there wasn't anything good enough for what he was trying to do. And so he founded aytm instead, and here we are, you know, years and years later. So, you know, you talk about turning these insights into influence and how crucially important that is, but I feel like there are also a lot of people who, you know, may not see the true value right away. Like, you've presented a really, really great insight that you feel is super critical and important, and you've said it to the boardroom, and it's just crickets. It's silent. Was there a time that you found that there was a finding that was super essential, but maybe the greater team that you were working with didn't see it? And if so, what was the barrier, and what was the ‘so what’ moment that broke through it to get that finding eventually on the road map?

Amberly - 00:06:37:  

That's a great question because I think every researcher has faced that moment, right, where you think you're bringing something either so obvious or game-changing. Right? And, you know, your room doesn't react. My experience is interesting because I kind of understand that, sometimes, having been on the other side of that table, those insights are dropping like a bomb, but not how you might realize. So what I kind of try to center on when I'm maybe not getting the reaction that I want is remembering that, like, empathy doesn't just apply to our customers and our users when we're testing. I kind of have to be empathetic and understand that, like, sometimes I'm giving an insight that will make something better for our customers, and it's a huge win. Sometimes I might be delivering an insight that means that we are not going to be working on something anymore, or somebody's gotta make a hard decision on what we include in a road map and what we don't. I mean, especially in this economy, there are some times where I've delivered research that I know has meant that things and teams are not continuing, and that's difficult. So I think, like, the first thing I think about when, you know, an insight isn't, like, received the way that I expected it to be is, like, do I actually have all of the context here? And, of course, I haven't been in all of those teams' meetings. You know, I also try to remember that, like, those stakeholders might need to take things back into their individual teams before they can react. That is something that I have definitely learned in this large enterprise environment that I've kind of spent the last bit of my career in. I might be getting very measured reactions from people on a call, but they might be typing in their Teams chats and really going off in the background.

Amberly - 00:08:30:  

So it might just not happen immediately. But, you know, one project that was really memorable for me, and it wasn't that I didn't get a reaction. It was that I felt like I was repeating the same research, delivering the same message, repeating myself over and over again. And kind of after three or four rounds of research where my team had prioritized essentially trying to go back and rework our questions and ask something over to get another answer, we figured that, like, we needed to change our approach with that team because it wasn't working. Especially when we learned, you know, that that team had gone, after we had done this work, to an outside vendor to try to confirm whether or not we heard what we heard.

Molly - 00:09:16: 

Ouch.

Amberly - 00:09:17:  

And there were also some other teams.

Amberly - 00:09:19: 

And, you know, I try not to have an ego about it because I do feel like I am that data-centric researcher, right? So, like, I love the truth even if I wasn't the one who uncovered it. But at this point, we were all coming up with the same answers; you know, I did get their readouts, and the other teams'. So, you know, all of that to say, we had to pivot our approach. And the way that we did it was I found a video where a user, you know, started with a very long pregnant pause and then went into pretty good detail about how he had spent an hour with this tool, going through every feature of it. And he still was completely lost on how it would help him, what the main selling point was, you know, like all of these things that they were putting a ton of time and effort behind, and just kind of, you know, getting it there a little bit, but missing that mark that brings something from, okay, this would work, to an NPS score of 9, I'm telling everybody I know about it. So, you know, after showing that video in a meeting where everybody was kind of there, it was very difficult for them to misunderstand, you know, what was happening at the user level. And then we kind of matched that with similar verbatims from all of the research. We pulled it out of the other teams' research.

Amberly - 00:10:40: 

And so, you know, I felt like we kind of married qual with quant in a way that just couldn't be argued with. But at the end of the day, you know, sometimes you can deliver an insight, and that can go unnoticed, but sometimes you can just keep delivering the same insight. So I think that it's important to understand that, you know, a lot can happen in those meetings, but you could be delivering really bad news. So just kind of keep that in mind and be empathetic.

Stephanie - 00:11:08:  

That is so interesting. And I love just even the very basic note of, like, empathy is not just for your customers. It's for your stakeholders too. And you're right, because that can sometimes be, for some people, an existential threat, right, for their role at the company. Kind of pulling on that thread a little bit more: insights, communication, storytelling, all that stuff. I think about different audiences, and I think about, like, product teams, where probably that existential threat lives, but I also have found in my own work that product teams tend to be pretty receptive because a lot of times they're hungry for information that's gonna be able to drive the product forward. But then I contrast that with, like, a C-suite executive being just a really different kind of audience. And I wondered, like, how do you think about when you have, like, five minutes to get an executive to care about a nuanced user problem? What is the first thing you do? Do you lead with a statistic? Do you use the user quote like you just mentioned? Is it about financial impacts? Like, what is the hook that tends to work with the C-suite?

Amberly - 00:12:17:  

Honestly, this might not be everyone's style, but I truly believe in being bold, you know. If you've got time with a senior leader and you've got their attention, I truly believe in going for broke. But that being said, that only works if you're kind of prepared. You know, one thing, if you work with me, you'll hear this a lot: I always remind people, let's do UX on UX, or let's do UXR on UXR. So I might only have five minutes with my stakeholder or my C-suite executive, but I can spend a lot more time preparing for that conversation, right? So those five minutes are extremely impactful. So, you know, I think that it's a twofold approach. I prepare, and I'm very intentional, but I'm also kind of bold. So, you know, what can I learn about that executive? What drives them? The way that I would make a case to my CFO about investing in, you know, improving a life insurance customer application process would be totally different from how I would approach the CMO with that same information. Our company makes it really easy for us, on our intranet, to go in and understand, I mean, I cover all of our global retirement insurance verticals, and there are six of them. But I can go in, and I can understand the business objectives and KPIs of each one. So, understanding, like, how does that person measure success? What does that person need to report to their boss? Is that the language that I can speak through here? I think that tailoring your message is what matters. I don't think that there's kind of a one-size-fits-all. It's really just about making that moment count and preparing yourself. And I think that as a researcher, naturally, my mind goes to, what can I learn before I step into that conversation, because that's what makes me feel comfortable and confident, right, is knowing as much as I can. And, you know, earlier this year, I was kind of in that exact situation.
I was able to meet with our head of UX design, and when we sat down, he asked me what I wanted to discuss. I immediately led with: I've been here for three years, and here are three things that I am absolutely certain that we need to change for our customers to get the world-class experience that we promised them. And, you know, that kind of directness really gets somebody's attention. And I got longer than I had scheduled because he really wanted to understand, ooh, what is broken? Like, you seem pretty sure that you know what you're talking about. So, yeah, I was prepared and direct. But, again, that might not be your style. Tailor your approach to what feels natural for you, and really just kind of go into it intentionally and do your research.

Molly - 00:15:07:  

And what's crazy about that is you have this immense, clearly very well thought-out approach to all of these different things. But it's not just you in that room, right? It's your larger team. So you've built this huge framework that empowers more than 100 practitioners to conduct research all across Prudential, which is, of course, a massive undertaking. I'm still trying to wrap my mind around it. Take us through some of those biggest challenges that you had. I'm sure there's an arms-length list of challenges, but what were some of the biggest ones that you encountered with scaling research access? And, not only that, but how do you ensure, in every place that you go into, quality and consistency?

Amberly - 00:15:53:  

That's a really great question because that's one of the first things that I worked on at Prudential. You know, when I was hired, I asked my boss what I could accomplish in the first 30, 60, 90 days here, and, you know, research democratization and making sure that research was not only accessible, but was something that people really understood when to leverage and exactly what it could get them, you know, was our main goal. So scaling research at Prudential, though, was as much about mindset as it was about the mechanics and really operationalizing it. You know, the biggest challenge was making sure that everybody had the right foundation and understood that it was something that people were allowed to do, because the biggest hurdle I faced was that my predecessor ran things very differently. Everything was done in a silo; one person approved every project, one person did all of the analysis, and it -

Molly - 00:16:51: 

Oh my God!

Amberly - 00:17:03: 

Yeah. And this was something, you know, if you think about all of the digital products for everything that Prudential does, life insurance, all of the tools that our advisors use, you know, all of that. It created a situation where, like, people were only researching a few times a year, they were really picking their moment, and it had kind of gotten to a point where it didn't seem like it was something that could be done that often. And that's really what I wanted to blow up. So, you know, shifting to this democratized model, a lot of it was really relationship building, going into these product teams and saying, “Hey, I don't know if you know this. I'm new. We're doing things totally differently. I have this platform where we're not really locked down in the frequency or the questions we're asking. Get in there. Play around. There's no such thing as a stupid question as long as you know you're doing it within these guardrails.” And kind of, you know, peeling the curtain back, if you will, not to make a Wicked reference, but peeling the curtain back on how research can be done by anybody as long as you kind of understand these things. Because I know everything about research, but I don't have all of the contacts that those people on the design teams have. With some of the other groups that we enabled, we have a journey management team at Prudential. They use our platform. We have, like, a lot of different groups that are now using this platform to, like, ask questions. And I love it, but that was really the biggest thing: creating that culture shift of breaking down the silo and building the trust, not just in, you know, the fact that they could do it themselves, but, like, having them learn by doing and then see the value of it themselves. Because I think at this point in my career, I have spent a lot of time trying to explain to people what research can do.
But I've seen that when you get people in there doing it themselves, it just catches on so much more quickly. So, you know, after that culture shift was created, we obviously had to back it up with some clear guardrails, so that we were creating data that we could have confidence in, no matter who was performing the work. 

Amberly - 00:19:15: 

So we have playbooks, you know, templates. We had office hours where people could come in and bring their work or learn about things. The last thing that we worked on, and I'm really proud of this, is that we have, like, a very automated intake process set up through all of the Microsoft automation tools that takes them all the way from, you know, planning a project through, you know, creating your transcript. It emails you things as you reach each step in the process. And then at the end, it kind of reminds you to go put your analysis readout into the repository. So, you know, what I learned through this whole thing is, like, scaling research really isn't just about creating these tight SOPs, rolling out a system, mandating the way that everybody is going to do research in their little corner. You know, it's really about building relationships and fostering a culture where people aren't afraid to ask questions for themselves, kinda feel empowered rather than governed, and, you know, fostering curiosity, which is kind of what it's all about, right?

Molly - 00:20:20: 

For sure.

Stephanie - 00:20:21:  

Absolutely. This kinda tees us up for a very executional question I have, but I think this is a good context for it, because you're talking about bringing non-researchers into these research roles and the value of it, which I totally agree with. But it also puts responsibility on you to sort of coach up folks on interpretation and analysis, I think, in particular. And I'm curious, with your teams doing research, how do you coach on differentiating signal versus noise? And I think I'm asking this really in the UX research context because it can be tricky working with small numbers of people. I know Jakob Nielsen famously said it only takes five people to get to 85% of the insights. But I wonder, does that kinda thinking still ring true today when product teams are more focused than ever on creating these personalized experiences? So what is the process that you use for cultivating a meaningful insight, and what does that even mean to you, I guess?

Amberly - 00:21:25:  

Yeah. This is a really important question because this kind of comes up a lot at a company the size of ours, where we have access to an immense amount of data. So sometimes that will come up: how is this statistically relevant? Or, you know, why should I listen to this one user perspective, and how do we know it's representative of a larger group? How do I get that across? And I think that, you know, it's really easy for some of these product teams, especially when they kind of ride along on research or, like, recently, we've had a few teams, you know, really interested in going into Medallia and reading some of the verbatims on our VOCs. And it's easy to mistake a loud anecdote for a real insight, right? So I think for me, I've always been taught that, you know, you have to balance your quantitative data with your qualitative responses. So if you find something in your quantitative metrics, you use qualitative data to tell you what's happening, but that works the opposite way too, right? If you hear something loud in a qualitative interview, you have to kind of go and make sure that's not, like, a very singular lens that people are looking through. So, you know, when I'm coaching my team, a meaningful insight is something that's, like, repeating, and it's actionable. So first, we have to pressure test our assumptions. You know, is this a one-off? Where can we go into other sources to understand this, like analytics, support tickets? Have we done more research where we might have more information on this? There is another team that works at Prudential that does, like, larger-scale market research work. And while our work doesn't always overlap, that sometimes is a place that I can go to kind of confirm: what is the actual frequency of what we are thinking about responding to? You know, is this just a very loud outlier? And it comes into our work a lot because some of our sample sizes have to be small. 
Some of the group insurance work that I do is with a very specific role: somebody in HR at a company where they're responsible for buying the life insurance benefits for hundreds of thousands of people. There aren't too many people like that out there, and how many of them have signed up for user tests? So we really do have to rely on those kinds of smaller sample sizes, but what we try to do is kind of marry that with, you know, pressure testing. And then the second thing: is this actually something that we can action on?

Stephanie - 00:24:09:  

So important.

Amberly - 00:24:10:  

Yeah. Sometimes we hear things, and it's great, but it's not technically feasible for us to change it. Or, you know, I've always worked in extremely regulated industries.

Amberly - 00:24:23:  

There are some things that we just cannot do because the SEC or FINRA, you know, it's gotta be very specific. So -

Molly - 00:24:30: 

Very constrained. Yeah. 

Amberly - 00:24:31: 

Yeah. You know, I think that that's kind of how I try to make sure that we're focusing our time, especially when we're working on so many things. It's how I coach people, but also how I make sure that my team is responding to the things that truly matter, that we're solving real, salient problems for our users and not just reacting to noise.

Molly - 00:24:52: 

Yeah. Yeah. And through that lens, you've covered a lot. You've talked about coaching - you're a very process-driven researcher - you talked really about empathy, how essential that is for making sure those human impacts get to another human. And then also, you just mentioned regulation, financial regulation, and so I want all of that context as I ask you about the elephant that is in every single room nowadays for tech conversations: AI. So let's talk about it. It's a little bit of a two-parter, given what you talked about earlier. Practically, what have you been able to use, and what has it actually taken off your plate? Is it just summarizing transcripts, or is it doing something genuinely new that you're utilizing in your day-to-day? So that's the first part. And the second part, given that, you know, you talked about how empathy was a really big driving factor for you: when does AI produce an insight that feels sort of soulless, just kinda one-dimensional? And how do you know when that happens, when, like, a human being actually needs to step in and take a little bit more of the reins on that one?

Amberly - 00:25:58:  

So, you know, honestly, AI is something that Prudential has been very excited to get into, but I would not say that it's something that I have used as long as I think some of my colleagues at other companies have. I don't wanna say that we've been slow to adopt it, but we've been very intentional and thoughtful in, like, the tools that we've chosen and the ways that it's being employed. Now, in our current phase of kind of piloting things, it's getting really exciting because we are kind of full steam ahead using things. But, you know, I think that originally, over the past year and a half, the way that I've seen AI take a lot off of my plate is in the planning and kind of the project management side of research. You know, it's fantastic for eliminating some of the human error that might come in, you know, keeping up with scheduling things, making sure that people are automatically getting reminded of appointments and things like that. The other thing that I've really loved using it for is, you know, for a survey or something like this, and I recently just led a workshop at Prudential where we kind of went man versus machine. But, you know, trying to figure out, like, if I have a survey, these are the questions that I think should be in it. Asking AI just kind of, like, what it thinks and then comparing the two answers has been very interesting because sometimes, you know, you'll think that something is so remedial, but it really is important to ask, and that's what AI is gonna catch for you. You don't see the forest for the trees, and it does kind of help you zoom out a little bit. But I really also like using it so that now, I don't have to think about exactly perfect multiple choice answers.

Stephanie - 00:27:46: 

For sure. 

Amberly - 00:27:47: 

For each of my surveys. Or, you know, some of the things, going back to that democratization piece: you know, I love the researchers that I work with and all of my designers that, you know, help us research and do their own projects. But some of the things that are the easiest to mess up are those, like, Likert scales, where, you know, the scale's not exactly polar, or you don't get the full spectrum, where somebody can't actually tell you that they didn't like something. No? So I love using AI to make sure that, you know, if I'm doing something like a survey, I'm not biased. It can really help you make sure that all of your answers are giving the correct spectrum. You know, if I need to create something that's like, tell me what kind of financial products you sell. Obviously, we have templates of these, but I don't have to think through a list of financial products anymore. 

Stephanie - 00:28:38: 

It's great, yeah, for that kind of stuff.

Amberly - 00:28:40:  

With the analysis, what I really like is, you know, now it can take a large dataset, and at Prudential, we have those. And it can help me understand what's happening at a large scale. So, you know, I have used AI on projects that have been verified by a human, put those human-made readouts in, and said, “What did we learn this year from all of our customers?” And being able to aggregate data, and -

Stephanie - 00:29:07:  

Oh, I love that.

Amberly - 00:29:08:  

To improve a conversation with what we've learned, that's been really meaningful. But, you know, when it comes down to, like, where the human still has to be in the room, I think that, unfortunately, with all of the power that AI has, it just doesn't have the context, right? The nuance, what I know about the meetings that I've been in, you know, what I know about the other projects that I've worked on at Prudential - call it tribal knowledge. It can give you kind of what's going on at a high level, but I think that we are still required, and this will probably be what we step into in the future: less of, you know, figuring out what happened in these places, in these studies, and more of digging into the why it matters and the now what needs to happen. You know, in my work, it's an accelerator for kind of operational pieces, but from an analytical perspective, and especially reporting, we still have a lot of humans working on that part.

Stephanie - 00:30:12:  

This really tees me up, I think, Amberly, for this question here. So, and you've started to answer it a bit, but as UX research becomes increasingly embedded in strategy, right, in these product teams, and AI makes aspects of execution, as you said, more streamlined, what does the next evolution of UX research look like for the researcher?

Amberly - 00:30:35:  

You know, I think it's probably going to a place where we're less about, you know, doing the work and more about strategic partnership. AI is going to help us execute these projects. We've all learned about, you know, synthetic personas and all of that fun stuff, and it's wild. And when it gets dialed in, it's gonna be extremely powerful. You know, as AI kind of takes over that heavy lifting and also makes it so much more accessible for anybody to jump on and ask questions, I think that the true future of our work as researchers is really closing the loop on why it matters and making it actionable, right? You know, I don't think that synthesizing things in a way that's going to be meaningful is going to be able to be done by anybody other than a human in my career lifetime. You know, I think that you can feed as much data as you want into a system, but at the end of the day, you're just gonna kind of skim the surface unless you have that real human touch and human context. So I think that while technology will kind of speed up the way that we can do things and kind of change what we spend our time on, you know, I think that the next era will always be human because that's, at the end of the day, what we do, right? We are collecting these stories. We're helping our business partners and our stakeholders and our leaders understand why they are meaningful. And that's something that I don't think AI is going to be able to do at the level of a really, really strong researcher. 

Amberly - 00:32:20: 

I think, you know, we won't just be facilitators, but, like, true catalysts for change.

Stephanie - 00:32:25:  

Yes. Influencers, catalysts. I love that. I really do. 

Molly - 00:32:29:  

It's really interesting that you mention how humans are always gonna be the driving force because I feel like, especially in a very tech-first industry, an industry that's required to predict trends and to see what consumers are gonna want in the future, it can be very easy to fall into this. Well, I'm just gonna have AI do everything for me. And I think that loses that human touch that is the core of the business. How am I supposed to use exclusively technology to predict something that a human is going to want? I just don't think, you're right, that's never gonna be possible.

Amberly - 00:33:02:  

Yeah. And, you know, I saw a really funny meme lately, and it was, like, the president of a company, and it was like, “What do we want?” AI. “What do we want it for?” We don't know.

Molly - 00:33:14: 

We don't know. Totally. 

Amberly - 00:33:16:  

I think that, like, we hear we need to use AI, we need to use AI, we need to use AI. But unless you've actually used it for some of these tasks that people think can be replaced -

Molly - 00:33:26: 

Yeah. 

Amberly - 00:33:27:  

I think the only people who think that humans can be replaced by AI are people who haven't tried it out on some of these things yet. Because it's wildly wrong sometimes. You know, you really have to finesse the answers, and it will always require a driver, as far as I'm concerned. And I think that that's what we're stepping into, you know, sitting on top of these tools and not being replaced by them.

Molly - 00:33:52:  

Yeah. Yeah. And we at aytm have the exact same mindset as that; our suite of AI tools is called Skipper for that exact reason. It's not -

Stephanie - 00:34:00:  

It's not. Right.

Molly - 00:34:02:  

The captain of your ship. It's your skipper, it's your assistant.

Amberly - 00:34:05:  

I love that.

Molly - 00:34:06:  

Awesome. Well, this has been such an insightful and incredible conversation. I wanna switch gears a little bit and do a quick round of our recurring segment, Current 101. So we ask all of our guests the same question and hear their responses. So the question to you, Amberly, in your experience, what is one trend or practice in market research that you would like to see stop, and what is one thing that you would like to see more of? 

Stephanie - 00:34:33:  

And you can do UX research since that's your area.

Amberly - 00:34:37:  

Yeah. Yep. So I'm thinking, you know, there are a few things that I'd like to see stop, but I think, you know, for me -

Molly - 00:34:44: 

That sounds loaded. 

Amberly - 00:34:45:  

Well, you know, because I think that, like, as a researcher, we notice a lot of things, right? You know, I think treating research like it's a one-off checkbox instead of a continuous feedback loop, right? 

Molly - 00:34:59: 

Yes. 

Amberly - 00:35:00: 

It's not a one-and-done. The future of research really lies in continuously building on what we're learning.

Stephanie - 00:35:08:  

It does.

Amberly - 00:35:09:  

If each project is like an ad hoc kind of one-off, goes onto the dusty repository shelf, and, like, never sees the light of day, how was it truly useful? My team laughs every time I say this, but I use this a lot: to me, research is the tree that fell in the forest. People have to hear it for it to have happened. If nobody is using my work, I have a hard time, and maybe this is my startup mind, but why am I here? You know, my ROI fully relies on the message being used. So, you know, I think that using research and kind of inserting it into more places is something that really is meaningful to me. So stop treating it like it's just something to be checked off. And then one thing that I'd like to see more of is, you know, I'd like to see more embedded researchers directly in product and design teams.

Stephanie - 00:36:08:  

Love that.

Amberly - 00:36:09:  

I've been fortunate enough in my career to work in that Atlassian Triad model, where you have product, design, and research and development kind of all in the same room. In that environment, I learned so much about, you know, what it takes for everyone in that room to have kind of competing priorities and still figure them out on a roadmap, but also, you know, so many of the intricacies of product handoff, or even, like, the slight development issues that come around with, you know, design to coding and things like that. You know, I think that when everybody has a similar level of context at every stage, it just creates a super-efficient team. And, you know, I think that something like that, combined with the technology that we have available today, would be really powerful. So, eliminate the silos, make the table bigger as far as I'm concerned.

Stephanie - 00:37:07:  

Co-sign. Hard co-sign. I love that. Well, to close this out, Amberly, you've spent your career, as we've noted, empowering teams to use data with confidence. But I wanna ask you about your confidence. For the researcher listening, who's maybe a bit more junior, who's sitting on a finding they know or they feel is important, but they're maybe afraid to challenge the status quo or a senior leader, what is your best piece of advice for sort of finding that voice and speaking that truth?

Amberly - 00:37:40:  

You know, I think that my best piece of advice is to remember that your job as a researcher isn't to just collect data, and you're not there to challenge authority with something that you've uncovered. Remember that you are there, and the company has made space for you to be there, to understand and represent the voice of the customer or of your user. So when you're sitting on a finding that you know matters, always remember: if you can't find the confidence within yourself, find the confidence to advocate for your user, because that's what, essentially, at the end of the day, everybody that you work with is kind of relying on you to do, right? The closer that we can get to making the perfect thing for our customer, the more everybody at our company will thrive, right? So, really understand, you know, who you are fighting for. It's not yourself. That, to me, always empowers me. But, also, you know, kind of just remembering when imposter syndrome sets in and you're like, “Why does this matter?” Almost use that energy. Like, I grew up in a family where we would have those dining room table debates. Sometimes, questions that you ask yourself can help you strengthen your argument, right? So use that self-doubt to say, actually, can I be confident in what I'm saying? You know? And make sure that there are no holes in your argument. After going through that kind of analysis - and I think I can, you know, assume that we all kind of have the same way of doing things in our brains as researchers on this podcast - if I analyze my thoughts and I can be pretty confident in them, and I know why I'm doing this work and who I'm doing it for, it's really a lot easier to kind of feel fearless and really remember who you're doing this work for and why, you know?

Stephanie - 00:39:38:  

Yeah. That is such great advice. And I don't think we talk enough about what a powerful advocate a researcher ultimately is for a customer or a user, because we're so used to having to be in that space of, “I'm not biased. I'm presenting unbiased findings.” But you are an advocate for that voice. You don't have bias around what they say, but it is your job as their advocate to represent their evaluation and their beliefs and such.

Amberly - 00:40:07:  

Very powerful. I love that. Absolutely.

Molly - 00:40:10:  

Yeah. That's gonna be something that sticks with me, Amberly: framing research as a service to your customer, as a service to whatever your business is, you know, the customer that you're trying to serve, that you're trying to empower, whose life you're trying to make easier or better in some small or deeply meaningful way. So that's gonna stick with me. I love that for you.

Amberly - 00:40:31: 

Yeah. You know, Prudential's been around for a hundred and fifty years. We started off as, like, a fund for widows and orphans, you know, so our company message has always been about helping people. It's been really fun to participate in it in this way, and I feel like it's really easy here to do that. But yeah, you know, at the end of the day, that's who we're advocating for. So it's not really about us.

Molly - 00:41:00: 

Yeah. Yeah. I love that. What a wonderful way to close this off.

Stephanie - 00:41:04:  

Yeah. Thanks so much, Amberly. This has been really insightful, and we appreciate it so much.

Amberly - 00:41:09:  

Thank you for having me. I've really enjoyed it today.

Stephanie - 00:41:13:  

The Curiosity Current is brought to you by aytm. To find out how aytm helps brands connect with consumers and bring insights to life, visit aytm.com. And to make sure you never miss an episode, subscribe to The Curiosity Current on Apple, Spotify, YouTube, or wherever you get your podcasts. Thanks for joining us, and we'll see you next time.

Episode Resources

  • The Curiosity Current: A Market Research Podcast on Apple Podcasts
  • The Curiosity Current: A Market Research Podcast on Spotify
  • The Curiosity Current: A Market Research Podcast on YouTube