Episode 2: What’s Real Anymore? AI & Sample Integrity with Rick Kelly

 

In an era of rising automation and declining confidence in sample quality, how do we stay human-centered in research?

Cynthia Harris sits down with Rick Kelly, Chief Strategy Officer at Fuel Cycle, for a timely conversation on fraud, authenticity, and the evolving role of AI in the insights industry. Together, they unpack the persistent trust issues plaguing sampling, challenge some of the biggest misconceptions about the industry, and explore why AI won't replace researchers, but will transform how we work.

Rick also shares his personal journey into market research, why he’s energized by lifelong learning, and how he thinks about the future of “good data.” The episode wraps with a rapid-fire round that reveals a few unexpected truths about Rick and where the industry may be headed. If you’ve ever found yourself asking, “What’s real anymore?” this one’s for you.

Your peek behind the curtain of the insights industry starts here. Enjoy this episode of Research Revealed!

  • Rick is the Chief Strategy Officer at Fuel Cycle. With a background in political science and an MBA from UNC-Chapel Hill, he blends strategic insight with business acumen to drive innovation. Rick’s passion lies in simplifying complex systems and fostering clarity in decision-making. Outside of business, he enjoys cooking, DIY home projects, and spending quality time with his kids.

  • Hey, everyone. We're back with another exciting episode, this time with Rick Kelly, chief strategy officer at Fuel Cycle. On this episode, we discussed the issues of fraud within the insights industry, the importance of confidence in sample quality, and, of course, we had a chat about AI in the market research industry as well. Listen in. Alright.

    Here we are, Rick. I'm so excited to chat with you today. So thank you for making the time. After meeting you at SampleCon this year, I was just really intrigued by how you emceed that panel with two people that I really respect in our industry, and then also just made the time to get to know me a little bit more. So I'm excited to get to know you more on this episode of Research Revealed.

    Yeah. It's an absolute pleasure from my end. I enjoyed the time we got to spend together in person and great to, you know, connect here too. So appreciate you having me on. Awesome.

    And on a Friday, so I'm very excited about that. It's a great way to end the week. Indeed. You know, Rick, I am just intrigued not just by you as a person, as a professional, but also you made a post recently that was just very candid about some of the fraud that's going on in the industry. And fraud has been such a hot topic in the insights industry, as you know, for quite some time.

    This is not a new thing. It's not a new problem. But when you think about the landscape, Rick, how would you have described fraud historically versus how you think about fraud today in our industry? It's a great question. I think that fraud has been a long-standing issue.

    You know, since the rise of online panels, you know, over twenty five years ago, there's always been a consistent issue. And, you know, it's been very persistent, and, you know, it's an arms race, and everybody says it's an arms race. And when I look at the state of, you know, kind of general population panel sample today, I think we have a significant problem. It's a real challenge. You know, for example, when we run, like, two factor authentication on mobile devices, we find about a 27% failure rate.

    You know, so there's real significant issues. It's a real challenge to deal with. I think as an industry, we probably know how to solve it. The commercial component behind that is not easy to manage, but I think we know how to solve it. It's just a matter of will and thinking about how we can actually, you know, get in and actually manage quality problems.

    Okay. I'm really holding on to the fact that you said it's a matter of will. When you say, Rick, it's a matter of will, what exactly do you mean by that? I have an idea of what I think you mean, but I wanna hear you talk a little bit more about, like, the will. What guts are gonna be required to kind of solve this thing?

    Yeah. You know, it's not a cheap or easy problem to solve. But, ultimately, if people are willing to do third party data matching, where you can take, you know, respondent identity and validate it across, like, different sets of third party data sources to ensure that, you know, people are who they say they are, they live where they say they live, and those types of things, then you can have pretty high confidence in the quality of the respondent that you're getting back. Obviously, there may be other issues, but identity can be verified in a number of ways. And, you know, the trade off is that it's not always as easy to get sample.

    You know, we may pay a little bit more. But, ultimately, I think, commercially, you know, there's something viable there where, you know, our customers, the industry, would be willing to pay for confidence, essentially, and confidence that, you know, sample provided to them is gonna be accurate and valid. Mhmm. That word confidence really resonates with me too, Rick, because I think it was just this morning I saw a post on LinkedIn. Somebody was discussing how we've been in this land of do everything faster but cheaper and better.

    And, you know, there's always a trade off when you think about that trifecta of things. It's hard to do all of those things at the same time. And I'm curious, when you think about the trade offs associated with quality sample, how should we be thinking about this as an industry? Do you think about it in absolutes, in terms of, like, you know, low cost may mean low quality. High cost must mean that it's, you know, highly efficacious.

    How should we be thinking about the trade offs with sample choices? Yeah. It's a really good question. I think priority number one should ultimately be confidence. And everything else that falls behind that is, you know, downstream of having confidence in our data.

    The reason, you know, the reason why that's the case is because when we're offering up insights to a brand, you know, to an enterprise, who go and make decisions, the question is, why are they doing research in the first place? Ultimately, they're trying to make more confident decisions, and that's why they're seeking insight. Otherwise, you could just use executive intuition, domain expertise, kind of overall awareness to go ahead and make a decision independent of the research that we're doing. But, ultimately, we wanna make confident decisions, and we want to have confidence that the data that we're providing to a brand is correct and valid. And so I think we have to set the bar at confidence that our sample is good, confidence that our methods are good, and then let the commercials fall out behind that.

    And, you know, I don't necessarily think that good quality sample has to be incredibly expensive. Doing some of these identity validation techniques, they're not particularly expensive. Some of them can be, especially for, like, B2B type things. But, for the most part, it's a problem, I think, as an industry we could probably be solving better. Yeah.

    Always room. I think that's what attracts so many of us to this industry is there's always room to make something better. And I do think that we're at a critical juncture, especially as it relates to sample, and it's gonna take all of us holding hands and figuring out what the next solution is. One thing we think about at 8:28 a lot, Rick, is how clients are gonna use the data that we collect for them to make a decision. Sometimes it looks like we need to completely refresh our brand, and, therefore, we might need to pay a little bit more to get, like, really high quality sample.

    There are other times when they're looking to just validate an idea. Right? And so I'm wondering, do you think about efficacy of sample differently based on what a client might be using the information for? And how might you encourage people to have those conversations in terms of deciding what level of efficacy they might need in a sample? Really, like, everything we do should be fit for purpose.

    Right? And so there have to be a lot of things that are fit for purpose, whether you're trying to use, you know, a synthetic audience to screen ideas, you know, ahead of actually doing human testing or you're trying to, you know, make a big business decision. I think there's a lot of benefit to doing things really quickly, you know, really fast, and doing essentially more testing. But, you know, ultimately, it comes down to the trade offs and your exposure to risk from a decision making perspective. And so if you have a lot of downstream risk or the risk is going to be too high, then you need to be really ironclad on your methodology, your approach, you know, your sample.

    But in many cases, if you're looking to try and, you know, make really quick kinda minute decisions that are tactical in nature, you can have a little more risk tolerance. I I think that we do need to have good quality data in our datasets for every decision we make, whether it's tactical or high level and strategic. But in many cases, you know, the approach in general is gonna be very fit for purpose, and that purpose is really about the level of risk and exposure that a brand has downstream. Makes a lot of sense. What about AI?

    Let's be honest. Every conversation these days, at some point, has AI as a component of the conversation. At least I feel that way. You know, I was at a conference recently, and probably 50%, if not more, of the presentations were talking about AI and the implications of it for our industry. When you think about AI and all the possibilities that come from a large language model or even generative AI, how do you think about that relative to sample innovation and the future of sample?

    And I'm not just thinking of generative AI, Rick. I'm also thinking about things like synthetic users, for example. How do you think about sample relative to some of these other options that are coming online here? How do we kind of think about those altogether? It's a great question.

    So you should know that my bias is that I am an AI maximalist. I am absolutely a believer that AI will transform our industry. You know, I think it's catching up, and we're gonna see a wave of adoption over the next couple years like we've never seen. I think when it comes to AI, whether it's traditional machine learning coupled with, like, generative AI, there's a huge range of options.

    Ultimately, what it would allow us to do is more research, to make more, you know, kinda make more confident decisions, because we can use existing datasets to make predictions around concept breakthroughs or the breakthrough of an ad or something like that. You know, when we couple that with the way that AI will flatten a lot of, like, the research processes that are done today, I think there's huge transformative potential. When it comes to, you know, when it comes to synthetic sample and things like that, obviously, it's a category that I'm very interested in. Ultimately, I think that, you know, the closer we understand a job to be done, whether it's looking for a, you know, concept breakthrough or trying to predict, you know, the quality of an ad or the resonance it's going to have, we're gonna be able to test more at the top of the funnel and then move things down to human testing.

    You know, someday in the future, AI models will be engaging in transactions with each other, and that's gonna change, like, the way that we do research. But as long as humans are making decisions and humans control, like, a lot of the purchasing power, then I think it's really important that we always validate with human subjects. And if that ever changes, we can change our model. But synthetic samples can be really good for top of funnel and idea generation. What, Rick, if anything, will we miss in a world where all of our bots talk to one another?

    That sounds somewhat silly, but not that far away if we're being honest. Yeah. You know, I have somebody in my life who has been posting things on LinkedIn, and they used to never post things, Rick. And I'm like, you're definitely using ChatGPT. I can tell that you're using ChatGPT because I obviously look at this stuff all the time.

    And, just a couple of days ago, they texted me and said, oh, now I can tell when people are using generative AI on their posts and things like that. So it was an interesting thing to see his reaction go from denial that his posts are generated by an LLM, which, again, I know this person well, and I know he didn't write those articles. But it's interesting that, just within a couple of weeks, his whole attitude shifted to, almost like, I don't know that he would describe it this way, but he almost feels like he's mastered this thing right in a couple of weeks. I'm just curious, as a posture, as professionals, how do you expect that we will need to shift given, you know, the introduction of all of these tools that we didn't have when I started twenty years ago? How do we need to show up differently?

    What should our attitudes be as an industry as we embrace this new technology? Yeah. So there's a lot of noise, and especially in content generation, there's so much more noise as a result of AI. Right? Because if you wanna generate a blog post, you can generate a blog post.

    Authenticity becomes, like, a key driver of acceptance and belief, and when we can tell something is, you know, AI generated, the authenticity and, like, the kind of emotion you have around something goes away. The interesting thing is, I think I brought this example up before, actually, at SampleCon. There was a paper published in Nature, which is a great journal. I think it was November of twenty twenty four.

    And they evaluated poetry that was written in the style of authors by AI versus actual poetry written by some famous poets. And they asked people to evaluate the quality of the poetry. These were, like, English majors, like, domain experts, people who actually understand poetry. What was interesting is that they rated the quality of the AI generated poetry as higher than that of human authored poetry. But when they knew or were told that AI had written the poetry, they rated it as lower quality.

    And so it's not that AI is, you know, necessarily going to produce worse results. In many cases, it'll be better. But in many cases, I think authenticity for a professional, and just kinda being who you are and having, you know, kind of a persona and knowing what you're about, is going to be a key factor to cut through noise in general. Mhmm. Makes a lot of sense to me, Rick.

    I was just telling a client recently, actually, to envision essentially their favorite musical artist and think about reading their lyrics on a page, right, which might move you potentially if the lyrics are associated with something you're doing in your life. Right? But then think about listening to that same artist on the radio or on your streaming account. Right? So you might get a different emotion because you're listening to the music and you can hear the composition and the emotion.

    Right? But that doesn't compare to being front row at the concert. Right? Like, you get a totally different visceral experience when you're there, and Yep. That's how we at least think of the continuum of options that we can deliver for clients.

    Do you think about these things in a continuum as well? And if so, how so? Yeah. I mean, I think so. Like, I think AI generated music is already a pretty significant source of streaming on platforms like Spotify.

    It is a big thing. Value doesn't necessarily accrue to, like, streaming. You can generate some value, but it's the authenticity of an artist that essentially allows you to go out and tour, to have people show up to, you know, one of your concerts and to enjoy something. That's where the monetary value ends up going. And so there's definitely a spectrum. And, again, it kinda goes back to that fit for purpose concept.

    You know, if I'm just looking to be entertained or to listen to something while I'm driving down the road, you know, maybe AI generated music is gonna be just fine. But same thing, it doesn't compare to standing and dancing with 10,000 other people at your favorite artist's show either. And I think it's the same thing with insights too. People, I think, love to talk about whether AI will replace researchers. As of today, there's no line of sight to, you know, replacing in person qualitative research.

    Like, I can't see that going away. Will we end up doing a lot more qualitative research with AI? Absolutely. But that doesn't mean that, you know, the things that are in person and very hands on are gonna go away anytime soon. I sure hope not because that's core to our business.

    You know? And frankly, it is a big bet for me and my team. You know? What we're really good at is taking people into the visceral experience. We spend a lot of time thinking about how do we get people as close as possible to the front row seat at a Kendrick Lamar concert, which is top of mind because I was Yeah.

    Just at the Kendrick concert a couple of weeks ago. And Oh, nice. It was incredible. Highly recommend if you get a chance. It's just a really cool show.

    But what I walked away with inspired by beyond Kendrick was something shocking to me. It was looking at the crowd around me, which looked different than I expected it to look, and I would not have known the diversity of the people that love this guy, you know, if I was not there in that room at that moment. And I think about research a lot that way, especially qualitative. You know, if you are not in the room at that moment, you might miss a whole lot of context about a movement, about a thing. And so I do think it's such an interesting time for us to figure out the balance between, you know, technology and all the good that it can do for us because we certainly believe technology does a lot of good, but not losing that humanity, not losing what actually can help us kind of remember the data points that we're generating.

    So, super exciting. Definitely go to Kendrick. Absolutely. Mhmm. Absolutely.

    Well, Not Like Us is one of my favorite pump-up tracks. So I would say I listen to it at least once a day. Yeah. It was yeah. When I tell you Mercedes-Benz in Atlanta, Georgia that day, I thought the roof was gonna blow off.

    It was incredible. So if he's close to you I think he's ending in LA, but we'll talk about that after we're done. So, Rick, you know, again, what I was super attracted to a couple of weeks ago, you posted. I know you guys got caught up in what we all have read about at this point. There was a really large fraud case that kind of made the airwaves a couple of weeks back, and there are so many people in the industry that were impacted by this unfortunate reality of a firm that was producing fraudulent sample.

    Yep. What you did, I thought, was really commendable, Rick, and you took accountability and reached out to your customers, which I'm certain other people did, but you talked about it very publicly on LinkedIn. And I would love to learn a little bit more about your decision to do that. And, you know, why was that important for you, to say something out loud on a social platform like LinkedIn? I don't know that I have, like, a good train of thought or decision making around it, but I was pretty frustrated and pretty annoyed by the whole thing. Obviously, it's a huge issue, and it's impacting the livelihoods and confidence of so many more people, you know, people that were employed by those firms, that had confidence in what they were selling, you know, like, those types of things.

    I feel devastated for the people that were, you know, really, really impacted more than I was. Ultimately, we did a favor for a customer of ours who really asked that we, you know, spend there; they essentially selected the vendor, and then we had some exposure where we had some funds allocated to us. And we spent a total of $4,000 on sample from Slice MR. We identified, you know, tons of issues there, and we were really surprised when our name showed up on a victim list from the, you know, US attorney. We are petitioning to have our name removed from the victim list just because we feel like it's important; for us it was such a small thing, and the business exposure has probably cost us a lot more than $4,000.

    But I think, you know, in general, it goes back to the authenticity comment, where there's lots of ways we could try and message it and share it, but I'm a big believer that, you know, kinda direct to your audience and being very open when something bad happens is going to engender trust. And so after, you know, three days of kinda answering customer questions day in and day out, I was like, you know what? I'm just gonna put this out there and just kinda let people know where we stand. Yeah. I certainly have a lot of respect for you for doing that.

    I know I'm not alone because there's a lot of engagement on that post specifically, and I'm certain, and I'm hopeful, that your clients even saw that as a positive thing too. I think authenticity has always been important in insights, but I agree with you bringing this up time and time again because I think it's only gonna be more important in the future, because I'm certain that won't be the last time No. Over the next, at least, our working time. You know, I'd love to retire next year, but that's probably not gonna happen. So it's probably not gonna be the last time we see something like that, and I just have a lot of respect for how you handled that.

    Well, I appreciate it. I appreciate it. I am who I am and wanted to share publicly, and so I can't say there's a big strategic thought behind it other than when something happens to you in public, I wanted to respond in public. Let's talk about your personal journey, Rick. How did you even get into this world?

    What was your start, and what keeps you passionate about this space? I don't have, like, a linear journey. I've worked in market research professionally for, you know, like, fifteen years or so, which is exciting. And I love our industry. I think it's a ton of fun.

    The truth is that I was coming out of grad school during the recession, the, you know, the Great Recession. And, ultimately, what had happened is I had a job lined up that fell through. And so a couple weeks before graduation, one of my professors introduced me to somebody who ran a sample company. And so within a couple weeks, I was starting a job in market research. I had no idea that I would love it so much.

    But I think for me personally, you know, it ends up being something where there's a lot of complexity, there's a lot of, like, new things to learn. And if you want to be an expert in qualitative research, you can. And there's, like, a whole career that can be built around that. If you wanna be a discrete choice expert, you can go learn something. And so for somebody who's naturally curious, I think it works really well.

    I've been at Fuel Cycle now for about eleven years, and it's been a great kinda journey for me where I just am able to constantly learn. Every three months or so, I look back, and I'm like, wow. I actually can't believe how much I've learned in the past three months. And so it just really scratches an itch for my brain, and I can't imagine working in some place that didn't do that. Very cool.

    I just was in Whole Foods earlier this morning, and I picked up a magazine about whiskey because it's relating to something that we're working on. And I'm like, this is why I love what we get to do, because there's always something else for us to learn about. And, you know, it sounds like you have that same curiosity gene and excitement to learn things about the world around you. Yeah. What makes people tick?

    It's an endless curiosity. Love it. That's amazing. When you think about the future of our industry, you know, I would love to know what keeps you excited about still being a part of the space, especially given, you know, yes, there's a lot of exciting things happening, but let's just call a spade a spade.

    There's also a lot of pressure happening in the insights industry. What's gonna keep you motivated to stay in this space for however long you plan to continue to work, Rick? Very deep question. Let me see if I can tackle this. I think that from a professional perspective, I love working with data.

    I love finding meaning in things. And so, you know, that really drives me and motivates me, and it scratches that curiosity itch. I think we're gonna go through a huge inflection point over the next, you know, call it five to ten years. And being part of that sounds really exciting to me. You know, I want to be part of that.

    The other thing I would say is you alluded to people that I was on a panel with at SampleCon recently. I just feel like our industry has some of the greatest people. Some of my closest friends in general are people that I've met in the industry. And so the ability to work with people I like, to know people well, you know, to be able to build something, is really fun. And so it's a combination: I like the general genuineness of the people that we work with, and it also scratches a professional itch, especially as we go through a big transition over the next few years.

    Mhmm. Makes a lot of sense. Is there anything that you wish people knew about our industry that you think is a current misconception, Rick? Oh, that's a really good question. I know.

    I'm coming with fire on a Friday. I don't know what's gotten into me. Yeah. Yeah. I would say that, you know, people often associate market research with focus groups and surveys, but there's so much complexity to it.

    And the ability to accurately, you know, predict outcomes and to provide really meaningful data is much more than, like, a general, like, CX survey that somebody gets. And so there's been huge impacts, you know, that market research has had on businesses everywhere. And that comes with, like, a lot of complexity and know-how and understanding that's beyond just kinda simply filling out a form. And so there's endless things to learn. And, you know, probably from the outside, you don't necessarily see all the complex decision making that goes into doing good research.

    So it's true. Like, picking up magazines in the middle of a Whole Foods. You know? Exactly. It's not a part of our research design, but I really think all the time, Rick, about how data's all around us.

    And sometimes on walks, we're picking up data, and we can use that in our work, which I think is a privilege of doing this work. Alright. So I have one more question, and that is, when you think about what clients don't ask you or what colleagues in industry do not ask you, what do you wish we talked more about as an industry? What do you feel like is missing in the conversation? I think there's some foundational things that we're missing as an industry.

    Probably, in many cases, you know, you have people who come out of college, they come out of school, they come into the industry, and they have a kind of basic foundational understanding of statistics and everything else. And many things are learned kind of by tribal knowledge. And so having a deep understanding of why we use p values and why you use a t test rather than, like, a z test and those types of things, I think, is really foundational. You know, outside most commercial market research, there's wide ranging debates around, like, the validity of frequentist interpretation of data versus, like, Bayesian interpretation. And I think that those, you know, those things are really important, where, in my opinion, we should probably be thinking more like Bayesians rather than frequentists, but those types of things don't get brought up all the time.

    And so it goes against the grain. It's a bit of a challenge because there's a lot of, like, almost philosophical challenges in the way that we actually use and interpret data. But, ultimately, you know, I think that those are the types of conversations we should be having as an industry rather than just about sample quality. Hopefully, we can solve sample quality things, and then you kinda move to even more foundational and kind of philosophical underpinnings of what our industry is built on. I agree with you.

    I think, you know, we've been having the sample quality conversation for quite some time, and I don't think we're ever gonna not have that conversation. But it does feel like it's time to start introducing some new topics. Thank you for that. So we end every episode, Rick, with a couple of rapid fire questions. There's no right or wrong answers.

    We just wanna know what comes top of mind for you. So the first question is, what's the first word that comes to mind when you hear market research? Data. Data. What's one buzzword in research that you wish we could retire?

    Synthetic. Name a brand or a company or a movement that just gets connection right. Oh, this is a good one. I'm not a gamer, but I feel like the gaming industry really has figured out, like, how to connect with their audiences in a really authentic way. Now it doesn't always feel authentic, but observing, you know, gaming companies and working with some of them, you see the level of authenticity that they have to bring to, you know, player communities and things like that.

    I think that's an important thing. And if I can follow on with a second answer, I would say Taylor Swift. I'm a big Swiftie myself, and, like, the ability to scale connection is pretty impressive. Agreed on both fronts. And you're reminding me I have to buy my nephew some Roblox.

    Or they call it, I think it's Robux, for Roblox. Robux. I am a 12 year old, so I know. I love it. Alright.

    So would you rather have more time or more budget on a project? Oh, time. Always. And this is a fill in the blank question. The future of market research is AI.

    Name one book, podcast, or article that everybody in our industry should read. My favorite podcast is done by an economist named Tyler Cowen. He writes at Marginal Revolution, and I would recommend Tyler Cowen's podcast. Okay. I'll have to check that out.

    And then what was your dream job as a kid, Rick? I wanted to be an inventor. Like, I enjoyed engineering and dreaming up ideas, and I'm fortunate to be in a position where I get to implement cool stuff all the time. Awesome. And finally, do you drink coffee or tea?

    I mostly drink coffee. I drink a lot of coffee. That's awesome. It's so necessary in the insights world. Coffee and wine, I might add.

    I agree with you there. Well, Rick, thank you so much for taking the time. I am excited that we're now connected and, hopefully, that we'll have more conversations outside of this, but I hope you had a little bit of fun too. Yeah. Thank you so much, Cynthia.

    And for people who are coming to the podcast for the first time, I would endorse going back and listening to Cynthia's other episodes because she's done a great job. Well, thank you. That's so kind of you. Well, let's get to the wine, Rick. I'll talk to you soon.

    Take care.

 

Episode created and produced by Cynthia Harris and Emily Byrski of 8:28 Insights.

Score provided by Swoope and Natalie Lauren Sims, friends of the 8:28 Insights Collective.

 