Chances are you know customer research is a helpful tool to grow your business, but when it comes to actually doing effective research, it can feel intimidating and easy to deprioritize. Today I’m joined by the former head of research at Copyhackers, Hannah Shamji, to chat about the research playbooks DTC brands can use to go beyond the numbers and start capturing the voice of their customers.
Listen to the episode above, or check it out in your favorite podcast app.
Quantitative data is tidy and attractive, but messier qualitative data can tell you how to move the needle and change the current picture.
“I think quant is very tangible, right? It's so easy, it's very neat and tidy and you can see a number and react to a number on a scale. It's just very accessible. And I think that makes it feel more robust. Like, "Oh, I've actually distilled all of these findings down to something." But once you have that something, it's kind of flat. Like how do you move it? How do you, like, why is it that number and not a bigger number? … you do have to kind of peel back that curtain and go behind the number layer and look into what I'm calling, I call it "the messy data," right? The stuff that might not always fit into a nice little bundle, but will actually help you change the quantitative, change the numbers.”
The key lies in distilling qualitative data into digestible chunks, so that quant and qual eventually become two halves of the same whole.
“there is tremendous value in being able to distill qualitative data into something that closely resembles quantitative. Like you can, to be able to shrink it down into something bite size. So it is meaningful. So it might be able to, not necessarily fit on a dashboard, but be easy to reference, digestible, useful, actionable. That does require you to take the messy and make it not messy...I always try to broach it in two. Like, ‘okay, well quant is half.’ So how do we get the other half?”
Brands just starting on the qualitative journey should put simple surveys in place, with the final question asking if they are willing to be interviewed.
“I typically like to start with surveys just because it's an easier - I think it's more comfortable if a brand is not as familiar, or used to, or hasn't done interviews. And then, at the last question of the survey, tee it up to have folks join and do an interview…you can have follow up questions in a survey, but in an interview you can explore some of your hypotheses without biasing the interview in a way that you just get a lot richer insight.”
By collecting a baseline understanding of your customers, you can ensure that your messaging, copy, and positioning are on track.
“why are folks buying, right? That's going to be your biggest bang for your buck. Get you some messaging and some clarity on like, is your value prop synching up with people? Should you optimize your copy in any particular way?... So when someone just signs up to your list, you can ask them ‘What made you sign up to the list?’ Of course, a little more nuanced than that. Like, ‘What made you sign up for, what made you look for X or X is perhaps the product or the service?’ And what that'll do is help you get a sense of who are these folks on the list. What are they looking for? What do they want? And you can use that to really start to match your messaging, understand who your reader base is.”
Catch customers while they are fresh off of a critical action, such as signing up, making a purchase, or cancelling.
“a cancellation survey is really, really powerful. You're looking there at what made you decide against this, right? Then if something that you can fix, if it's something that's like, ‘Oh, well, why don't we just make sure we include this information up front?’ and we can eliminate or minimize a cancellation or churn.”
“response rate can be really high if you put these surveys on the back of an action that they're already taking. So like, they just signed up and as you're thanking them, you ask them, ‘Hey, by the way,’ it minimizes the need to recall. You're not making the survey an entirely separate event, which can feel burdensome, like work...and you get much more accurate insight cause they're not thinking about what they're doing. They're right in the action in that particular moment.”
Keep questions short & simple, bold/capitalize important information, and write like you talk.
“Do not mix two ideas in one question. Even if it is technically a compound question, so they do tie together. You want to keep it really, really simple. And if that means two questions, go for two. So one actual question that you're asking per question, really simple language, like exactly what you're saying conversationally. If you wouldn't say it out loud, maybe don't say it in the survey. You don't want people stumbling over the words. And I think sometimes, I'll be really careful of bolding or capitalizing a word that is pretty critical to the sentence.”
Initial questions should prompt the reader for simple facts, while later questions should tease out deeper insights.
“if there's a decent chunk of time between the event and then you asking about the event, the survey needs to sort of act a bit like a prompt. So you start, you want to always start with like really easy questions, right? Like kind of factual, reference point. You're not getting them to think. And then over the course of the survey, they might start to remember things that you're not necessarily asking about, but that would be helpful to know, or they want to share, and that question can really be them like - it just becomes this bank of insights that you kind of just got as a bonus...people will really, really share if you give them the mic in that last question."
Set up two surveys, an initial survey and a cancellation survey, and keep them static. Then dip your toes into doing interviews using that pool of survey takers.
“I would have them start with those two surveys. So it could be an opt in, so either on the thank you page, or as you mentioned, the welcome email, and pop into the cancellation survey. On the end of both of those I would ask for folks to sign up if they're interested in talking more. And then I would leave them like static. I think those are really, really good placeholder surveys that you would just want to have, and be looking at the data pretty regularly. And once you get those interviews coming in and interview requests, I would start booking right away.
"What are those conversion points that are lower than you like, or that you really want to emphasize or focus on? And I would speak to those in the interviews. That's going to be a really good place for you to, rather than kind of getting lost in which survey at first. Just have those static pieces or static surveys, and then use them to set up these conversations.”
While an interview may start out slowly and with short answers, keep pressing. Both you and the interviewee may end up surprised by the deeper insights that emerge.
“the surprising factor has been, for me, and maybe continues to be, is that if you just sort of stay with people in an interview in particular, they share a lot more than they even think they were going to share. And sometimes even I think that they were going to share in the beginning, often to the point that they will have this moment of like, ‘Oh, I didn't really think about it like that.’ or ‘I didn't realize I do that, but yeah, that's kind of what's happening.’ But it does require you to really dig and ask the follow ups and kind of chase your curiosity there.”
Speaker 0 (0s): Quant is very tangible, right? It's so easy. It's very, like, neat and tidy, and you can see a number and react to a number on a scale. Like, it's just very accessible, and I think that makes it feel more robust. Like, "Oh, I've actually distilled all of these findings down to something." But once you have that something, it is kind of flat. Like, how do you move it? How do you, like, why is it that number and not a bigger number?
Speaker 1 (30s): Hey, I'm Stuart Balcombe, and this is the DTC Voice of the Customer podcast. Join me as I go behind the scenes with top DTC eCommerce operators to understand how they leverage the voice of their customers to drive sustainable growth. We'll be chatting about how they capture thoughts and feedback from customers, who owns the customer journey, what tools they're using, and how they translate what they know about customers into product and marketing that resonates and drives revenue. Whether you want actual tactics for capturing the voice of your own customers, or just want to see how other top brands are doing things,
this is the podcast for you. Chances are you know customer research is a helpful tool to help grow your business. But when it comes to actually doing it effectively, it can feel intimidating and easy to deprioritize. Today I'm joined by former head of research at Copyhackers, Hannah Shamji, to chat about the research playbooks DTC brands can use to go beyond the numbers and start capturing the voice of their customers. To really get in the deep end, I'm going to kind of start with a question that I know you have a lot of experience with.
Well, so I know from the work that you do, you're primarily focused on surfacing qualitative customer insights for a DTC, or really any business. They may be much more familiar with quantitative data. So why is qualitative data so important when most teams are focused on quant?
Speaker 0 (1m 56s): I love this question. It makes me, like, shake my fists internally. I think quant is very tangible, right? It's so easy. It's very, like, neat and tidy, and you can see a number and react to a number on a scale. Like, it's just very accessible, and I think that makes it feel more robust. Like, "Oh, I've actually distilled all of these findings down to something." But once you have that something, it is kind of flat. Like, how do you move it? How do you, like, why is it that number and not a bigger number?
Or why is it so big, and we didn't do anything different? So to know how to move the needle on rates, on percents, on conversions, on sales, you do have to kind of peel back that curtain and go behind the number layer and look into what I'm calling, I call it like the messy data, right? The stuff that might not always fit into a nice little bundle, but will actually help you change the quantitative, change the numbers. So without it, if you don't have the qualitative, you know, you're kind of stuck and shooting in the dark to figure out, like, well, how do I change the quant?
You do need both. But yeah, I do think that, like, the sex appeal is so much bigger with quant, for sure.
Speaker 2 (3m 12s): Yeah. It's so interesting that you call it the messy data, such a good name for it. I've definitely had people ask me, or, you know, want to sort of try to find a way to put qualitative data on a dashboard, or like, how do we report on this thing that is not, like you said, defined in the same way that a single number might be. So for somebody who's in that mindset of, like, "Oh, well, it won't fit on the dashboard," or, you know, "Is it going to be enough data? Can we actually trust the data if we don't have the same number of data points that we have with quant data?"
How do you handle that objection? Sort of, how do you talk about positioning the value of qualitative data?
Speaker 0 (3m 53s): Yeah, I think that's a really important question. I think it sort of made me, I'm going to go on a tiny little tangent.
Speaker 2 (3m 59s): We can go on a tangent. I love it.
Speaker 0 (4m 4s): I think there is tremendous value in being able to distill qualitative data into something that closely resembles quantitative. Like you can, to be able to sort of shrink it down into something bite-sized so it is meaningful. So it might be able to not necessarily fit on a dashboard, but be easy to reference, digestible, useful, actionable. That does require you to take the messy and make it not messy, and more closely resemble something that you can pair with the numbers and the rates and metrics, KPIs.
But yeah, I guess I always try to broach it in two. Like, okay, well, quant is half. So how do we get the other half? How do we understand what levers to push and pull to change sales, to understand sales? And I think the way I find success is, I'll just start asking questions with a client. So they might show some data and show some insight, and then I start to ask a bit more about the why, and like, well, what did you do to get that?
And what's different between, you know, month to month? It sort of surfaces some of those gaps on its own without necessarily having to point at them, which I think is probably the easiest and best way that they can kind of see, like, "Oh, okay, actually, we're all sort of unsure about this. We're trying that." And then it becomes, okay, well, here's an actual tried and tested approach to fill in those gaps, you know, whatever that strategy is, that qualitative research strategy.
And I'm curious to hear your thoughts on this too. Cause I feel like it ends up being much more of a conversation that you just need to have, and kind of help them see the value of both, and not be, you know, more closely tied to one. You know what I mean?
Speaker 2 (5m 58s): Right, I totally agree with that. It's not one or the other; they're both sort of equally important pieces to ultimately helping a company improve its metrics, right? Like, nobody's advocating for qualitative research for the purpose of just doing research, right? The point is to move the needle on some metrics. So yeah, I totally agree with you that that's the starting point. It's always, you know, "Why is that the case? Do you know why?" Right? And often people will have some kind of hypothesis or some sort of inkling that, you know, this might be a direction, or at least they'll sort of know where the gaps are.
Right. So it sort of gives you a starting point for a deeper dive.
Speaker 0 (6m 43s): And I think it's also so important to realize that just because the qualitative data gets, like, a messy rap doesn't mean that it can stand alone. Alone, it's also not as helpful. Like, does it actually work? Does it actually move the needle? Is it changing the quantitative data? Without that, it's exactly the same case; it's just that qualitative data is like the younger brother that gets hidden behind, you know?
Speaker 2 (7m 8s): Right. Yeah, totally. So let's, I guess, to make this a little more concrete: what does good qualitative research look like for a company? And I know there are a lot of misconceptions or, you know, uncertainty about how to actually get started with qualitative research, right? Most business owners are probably pretty used to quantitative data, whether it's in Google Analytics or just their revenue numbers and sales and costs, which are sort of very hard numbers. But what does good qualitative research look like, and what does it not look like?
Speaker 0 (7m 46s): I think, so, I always start with: the methods have to come out of a plan, out of a strategy, specifically research questions. So at Copyhackers we had done some work with a client where they wanted to understand why their numbers were changing so rapidly. In this particular case, sales were going up, which was great, and they wanted to be able to hold on to that and replicate it. And so that introduced a specific research question: what is moving the needle? Is it a new customer base being attracted?
Is it some aspect of the offer? Is there particular messaging that they are responding to? Is it the package the product comes in? And what that did is help distill what are the areas that we need to isolate and inspect. Like, what could those potential triggers be? And once you identify that, then you can start to dig into, well, okay, how do we evaluate that? Is it a survey, or is it about looking at perhaps some quantitative data? Is it maybe interviews with particular folks who signed up to a list but didn't convert, versus those who did become customers? That will really help you kind of drill down so that the data that you're collecting can funnel back to answering those questions.
Right? Because without that, I've always found that if you don't have clear expectations up front, it's very easy to sort of miss the target when you do collect the data, and have that disappointment of, like, "Oh, this didn't really answer what we wanted to know." So to avoid that, you want to absolutely have that strategy. You have that, then your research questions, and then your methods. But there are definitely the go-tos, right: in-depth interviews, which might be online, or it could be field interviews.
I don't know how likely that is with all the things Corona. Then surveys, customer surveys. I typically like to start with surveys just because it's easier. I think it's more comfortable if a brand is not as familiar, or used to, or hasn't done interviews. And then, at the last question of the survey, tee it up to have folks join and do an interview. But those are really the two prongs that I, I mean, there are other kinds of ethnographic methods.
You could also mine, like, discovery calls or sales calls, and analyze those for some messaging, for language, for objections as well. But typically you can get a whole host of that insight from interviews and really fine-tuned surveys that help people sort of self-select from those. So, I mean, those are the two that I really, really lean on. And if I get the chance and a client's kind of open, I'll jump through the door with those, which I think, if you're open to it, is a really, really powerful way to hit a lot of areas, a lot of birds, and get really deep in a way that a survey might not, or won't, get you, right?
Like, you can have follow-up questions in a survey, but in an interview you can kind of explore some of your hypotheses without biasing the interview, in a way that you just get a lot richer insight. Is that helpful? I kind of went on a lot.
Speaker 2 (10m 58s): No, that's great. I want to call out two things that you mentioned. One that I absolutely love, that I would suggest everybody go do right away, is including that last question in your surveys to get people to opt into interviews. Right. And I think this is something that maybe we'll talk about separately as well, but having a method for continuous recruitment is such a powerful tool, because it means whenever you do have questions, whenever something does pop up that you want to be able to ask your customers about, you already have that scheduling mechanism in place.
So that's a really great one. And the other thing that you brought up there is that time to first insight is pretty important, especially when you're working with new clients, but even if you're just doing qualitative research for the first time yourself, right. Not trying to make too big of an investment and waiting weeks or months or whatever to get some data, so that you can start to see, right, like just in that first interview. I mean, you can do it; we're doing it all the time.
It's very rare, if ever, that I come away from an interview like, "Oh, I didn't learn something." Right. So just doing the first one is such a big step; the quicker that you can get on the phone with somebody, the better. One thing that you mentioned is surveys and interviews being the go-to methods. I do want to dive a little bit deeper on surveys. So you said you typically start with those first, but I know you have a project that you're working on to do some survey tear-downs. Talk about, you know, what makes up a good survey, and what are some things to avoid?
So when it comes to surveys, do you have a particular, I'm sure this all goes back to the research questions that you mentioned you start with anyway, but do you have sort of a typical starting-point survey? Like, if somebody is running nothing today, what's typically the first survey that you would run?
Speaker 0 (12m 55s): Good question. So if someone is running absolutely nothing, I would do one of two. So the goal here is understanding why. Why are folks buying, right? That's going to be your biggest bang for your buck, get you some messaging and some clarity on, like, is your value prop syncing up with people? Should you optimize your copy in any particular way? So you could either pop a survey in front of the folks who just signed up to your list. I think I just did a tear-down on this. So when someone just signs up to your list, you can ask them, "What made you sign up to the list?"
Of course, a little more nuanced than that. Like, "What made you sign up for, what made you look for X?" where X is perhaps the product or the service. What that'll do is help you get a sense of who these folks on the list are. What are they looking for? What do they want? And you can use that to really start to match your messaging, understand who your reader base is. Once you have a lot of, or a decent amount of, open-ended or free-text responses, then you can start to ask something that might be more multiple choice or single choice.
But that's a really, really good one to start with if you want to get a sense of who is on your list. Another one that maybe fewer people think of, or think of as an afterthought: I do think a cancellation survey is really, really powerful. You're looking there at, "What made you decide against this?" Right? And is it something that you can fix, or is it something like, "Oh, well, why don't we just make sure we include this information up front?" and we can eliminate or minimize cancellation or churn.
Those are two really powerful ones. And the cancellation one and this initial signup one are both pretty short, right? For a cancellation, maybe one to three questions. And for the opt-in one, one question, maybe two, which will get you some pretty good data to start acting on. Like you said, that kind of time to insight: you can really get something where you go, "Oh, let me just feed this back earlier into the funnel," and you're good to go. So those are the two that I would recommend starting with and seeing what you get.
Speaker 2 (15m 10s): Yeah, absolutely. I think that's a great point, that they don't have to be long surveys, and you could always add that extra question, you know, like, "Would you be willing to hop on the phone and talk more about this?" on the back end of those. They're also points where people are switching, right? So people are either switching to you, or switching from your product or service, and they've just hit some point, right? There's some amount of activation energy that's made them make that decision in either direction, and you're really tapping into that.
I really like how you said it sort of helps you match messaging, right? It's not necessarily about writing, you know, quote unquote better copy. It's about writing copy or positioning or packaging or whatever it is you're going to change that aligns with what your buyer is thinking and feeling in that moment.
Speaker 0 (15m 59s): Yeah, really, really increasing that relevance. I love that you kind of encapsulated that in switching, that kind of jobs-to-be-done idea: you're switching behaviors, switching products, and those are really charged moments. So to capture that, especially right in the midst, I think response rate can be really high if you put these surveys on the back of an action that they're already taking. So, like, they just signed up, and as you're thanking them, you ask them, "Hey, by the way." It minimizes the need to recall.
You're not making the survey an entirely separate event, which can feel burdensome, or like work. And you just kind of capture, like, I love these "by the way" surveys, you know? Like, "Oh, since you're here, real quick, can I ask you one thing?" And you get much more accurate insight, 'cause they're not thinking about what they're doing. They're right in the action in that particular moment.
Speaker 2 (16m 52s): Yeah. People are pretty bad, and we'll get to this in a second, I'm sure: people are pretty bad at predicting the future, and pretty bad at recalling the past. So yeah, I totally agree that any time you can meet them in that moment is great. But there are probably people out there listening to this who are running an e-com store, or, you know, something where there's a critical next step which is not a survey, right? Like, there might be an upsell or cross-sell, or an opt-in to something else on that thank-you page. So one sort of caveat, or like a tip that I would mention for folks, is if you're sending a welcome email, especially if you're sending a welcome email from the CEO, often the founder, or somebody high up with some sort of name recognition, just ask the question in that email. Right. I always say, always try to put it on the thank-you page first, or the confirmation page first. But if you can't, what's the next touchpoint that you have, and how can you either just ask for a reply to the email, or include the survey there?
Speaker 0 (17m 51s): Totally. It's just, they're also really warm there, right? They're on the list, they're looking for that email, and you can really kickstart a conversation there. So yeah, that's an awesome idea.
Speaker 2 (18m 3s): Yeah. That's this sort of idea of conversational research, like, you know, research in the conversation, which is really interesting to me. And I think it's something that we're starting to see more of, you know, as people want to be treated like people, and not numbers in a funnel, but as going through a real process. So let's talk a little bit, I guess, about the things to avoid in surveys. And they don't have to be in these two surveys that we just talked about; just generally, if you're thinking, you've gotten to this point of, "Okay, surveys are the thing that I should be doing,"
what are some things that you should not be doing when you're designing your surveys?
Speaker 0 (18m 42s): So, a couple that are kind of big no-nos in my head: do not mix two ideas in one question. Even if it is technically a compound question, so they do tie together, you want to keep it really, really simple. And if that means two questions, go for two. So, one actual question that you're asking per question, really simple language, exactly what you'd say conversationally. If you wouldn't say it out loud, maybe don't say it in the survey. You don't want people stumbling over the words.
And I think sometimes, I'll be really careful, too, of bolding or capitalizing a word that is pretty critical to the sentence. So, for example, a word like "not": if they skip and don't read the word "not," it's going to maybe change the way they answer the question. So really use your formatting there, just to make it extremely obvious, so that they can skim, they can afford to skim. And I think that, yeah, that simplicity is key. But also, actually, someone on Twitter just commented.
I have a poll on there right now that's like, what's your take on surveys? Are they fun or annoying? And, I mean, it's kind of facetious, and also kind of, if you had to pick one, which would you opt for? And someone on there said that they like surveys, but they just kind of get lost. And I think that when surveys start to get high in number of questions, we're looking at over five, or even at five, or if you have a question that starts to compound, or they have to reference a previous response and remember it in order to answer the next question, you need to really make it easy for them.
You know, if they have to struggle to remember, like, "What was the thing that you just asked me about?", your data is not going to be as strong, and neither are their responses. You've just introduced some friction there. So really, really keep it simple. Keep it conversational, sometimes to the point of, like, you might not actually write it that way normally, but you would say it that way. I would err on the side of writing like you talk in those surveys, and really leaning on formatting. I'm curious, do you have any tips?
Speaker 2 (20m 58s): Yeah. A couple of the big ones for me typically go to recall. I definitely agree about using formatting wherever possible, and, you know, writing like you talk. I think that's just generally a good thing to do. Being really careful about asking people to predict the future is a really big one for me. I feel like people do that all the time, and people just aren't good at it. The problem is, and no knock on your customer, like, nobody is good at predicting the future.
You can just think about the number of times that you, I'm quoting somebody else here, but, you know, New Year's resolutions, right? Everybody sets them, or a lot of people set them. How many are actually carried out in the new year? I think this is a flaw in a lot of survey data.
Speaker 0 (21m 43s): Who did you steal it from? From someone else? I'm trying to think... I think I stole it from Rob Fitz.
Speaker 2 (21m 49s): Author. Great book. Yeah.
Speaker 0 (21m 51s): But it's so true. The recall, I mean, the level of inaccuracy is just huge. And I think that when you ask a future-focused question, it's so tempting, right? Like, it can feel really good, 'cause there is some sort of promise that maybe they say something really positive and that thing will come true. So it's very validating, but it's just really, like, wobbly stilts in terms of data. So absolutely, hypothetical questions: just steer clear.
Speaker 2 (22m 23s): Yeah. And I think this sort of ties into another broad category of bias, right? If you're asking future-focused questions, and you then follow it up with a question that is, you know, the thing that you are hoping happens, right? Like, if we have this idea for a product and we say, "How likely are you to use this thing in the future? By the way, if we launch this product, would you pay us for it?", you sort of compound the problem. And especially if you're going to use the survey data, which ultimately is the goal, right?
You're going to use the survey data to inform a strategy, or some, you know, campaign, experiment, or whatever it might be. You can really get yourself into some places where you have the data to support it, but it just wasn't data that was actually valid, based on the assumptions that people made.
Speaker 0 (23m 12s): Yeah. And I think it's so easy to forget how sensitive we are as humans. You can read a text and interpret it differently — my husband interprets an ellipsis, the three dots, as loaded with passive aggression, when I don't mean it in a negative way at all. It's the same in person. I think we're really, really sensitive, and we read in little bites all the time — texting, Twitter posts.
So being really careful about how you ask questions — the phrasing, and the ordering — is really, really important to make sure that you're actually getting accurate insight and not just that feel-good hit for the data.
Speaker 2 (23m 57s): Totally. Yeah. And I think the other big risk — which you actually mentioned earlier, especially when you're just getting started — is, if you're going to ask multiple choice questions, to only include the responses that you want as answers, and not include an "other" option. Especially if it's a required question, right? You'll get people who think, "It's not really any of these, but I'll just pick one," because they want to get to the next step and be done with the survey. So lean on open response until you have that subset or selection group, or at the very least always include the "other" option.
And the interesting thing is that you'll often get things that are pretty unexpected in the "other" responses. It's the same with — even if you have a survey that's full of multiple choice questions, I would definitely recommend adding one at the end: "Is there anything else you'd like to tell us?"
Speaker 0 (24m 53s): Because,
Speaker 2 (24m 54s): And I mean, people always say, "Oh, well, people would just leave a one-word answer," but I've seen it time and time again: people will write essays in a text field. That's obviously not everyone, and you've got to take it with a grain of salt — it may not be significant across a larger population — but there's often pretty rich insight in there.
Speaker 0 (25m 17s): And it's a great catch-all. Sometimes the survey — especially if there's a decent chunk of time between the event and you asking about the event — needs to act a bit like a prompt. So you always want to start with really easy questions, right? Kind of a factual reference point; you're not making them think. Then over the course of the survey they might start to remember things you're not necessarily asking about, but that would be helpful to know, or that they want to share.
And that last question really becomes this bank of insights that you kind of just got as a bonus. I've absolutely gotten some really long answers. I mean, I feel bad not completing a survey that I get, just because I really appreciate it when people complete surveys that I send out. But yeah, people will really, really share if you kind of give them the mic in that last question.
Speaker 2 (26m 12s): Right. Yeah. And I think that's the general positioning of a survey: if you position it as "we're just trying to collect data from you," people are much more closed off with how they answer. But if it's "this is actually to your benefit — ultimately our goal is to listen to you," and as long as that actually is your goal, which hopefully it is if you're sending surveys, people do open up a fair amount. So we've talked a good amount about surveys here. I do want to ask you one very specific question, which is maybe a little contentious.
I'm going to ask it anyway, because I know you started off your surveys project with this survey, which is NPS — it's talked about a lot and touted as the thing you have to be doing, the thing you have to be improving and measuring your performance by when it comes to experience and customer satisfaction. What's your take on NPS? I guess it's a two-part question, which is a bad way to ask a survey question: your take on NPS as a metric, as a thing to measure a business by,
but also on the question itself?
Speaker 0 (27m 28s): I like this question. I think NPS is good because it's NPS, you know? It's a familiar metric. There are industry standards you can measure your NPS against — against a competitor, against your own past. Which is good, because I think the question is biased, but if everyone's biased, it somehow cancels out, maybe. And the familiarity factor isn't small, right? It's a huge point of reference: if you know about NPS, it means something to you, so it does have that recognition and credibility by way of it.
I don't think the way the score is calculated — the actual tally, the out-of-a-hundred NPS score — is fully accurate, and there are some embedded errors with it. I don't remember the specifics off the top of my head, but there are a few articles you can Google and read about it.
Speaker 2 (28m 28s): Right.
Speaker 0 (28m 29s): Yeah. And then with the question itself — I just feel like it's this double whammy of future-focused plus a purely quantitative rating scale. You have to answer on a scale of zero to ten, and there are no markers there. At least with something like a Likert scale — a five-point, strongly disagree all the way to strongly agree — there's some qualitative anchor. We know what "strongly disagree" means, but here, a ten for you might be an eight for me.
Maybe I'm just not someone who would give a ten out of ten — I'm always like, ah, there's always room for improvement. And I think that while it does try to include a buffer — I know they categorize the detractors as whoever answers zero to six, then seven to eight is grouped in one category, and nine to ten in another — so that does help compensate and buffer for the subjective response. But when you ask people to speculate about the future, and then on top of that it's zero to ten with no real qualitative anchors, I just feel like it's not a useful data point beyond it being an NPS
that folks recognize, if that makes sense.
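The bucketing Hannah describes — detractors answering 0–6, passives 7–8, promoters 9–10 — conventionally reduces to the percentage of promoters minus the percentage of detractors, giving a score between -100 and 100. A minimal sketch in Python (the function name and sample responses are ours, purely for illustration):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("need at least one response")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 10 responses: 4 promoters, 3 passives, 3 detractors -> 40% - 30% = 10
print(nps([10, 9, 9, 10, 8, 7, 8, 5, 6, 3]))  # -> 10
```

Note that passives drop out of the tally entirely — one of the quirks of the scoring that makes the single number hard to interpret on its own.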
Speaker 2 (29m 57s): Yeah, absolutely. That's the way I think about it as well. I definitely agree that it's highly recognizable — there's a lot of value there. And if executives, or whoever wants to run surveys, recognize NPS as the metric that's broadly accepted, then that's certainly a big point in its favor. But I do think the vast majority of its value is in being able to compare your NPS against others' NPS — or, and I've seen companies do this well, measuring NPS across the buying journey, or across the customer journey, and then comparing your own score at each point.
Right? Because then you can say, "Oh, well, it's 70 everywhere except this one spot, which is a negative 30." We probably have a problem or a bottleneck at the spot where it's negative 30. That's certainly valuable to know. But again, the score by itself — beyond being able to identify drop-offs and peaks in the trend — you need to know why, right? Like we talked about — to bring it all the way back to the beginning — it's ultimately a quantitative measure.
So knowing why there's a drop-off at that point is really the name of the game if you want to move the ball forward.
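Stuart's journey-stage comparison can be sketched the same way: compute a score per touchpoint, then flag the outliers. Everything here — the stage names, scores, and the 50-point gap threshold — is invented for illustration:

```python
from statistics import median

# Hypothetical per-stage NPS scores across a buying journey
# (stage names and numbers are made up for this sketch).
stage_scores = {
    "discovery": 70,
    "checkout": 70,
    "delivery": -30,
    "support": 70,
}

# Flag any stage sitting far below the journey-wide median --
# the "negative 30 in a sea of 70s" Stuart describes. The
# 50-point gap is an arbitrary threshold for illustration.
m = median(stage_scores.values())
bottlenecks = [stage for stage, score in stage_scores.items() if score < m - 50]
print(bottlenecks)  # -> ['delivery']
```

The flagged stage is where the qualitative follow-up — the "why" — should be aimed.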
Speaker 0 (31m 20s): I think you mentioned this when we chatted a couple of weeks back, and I completely agree: it's a great warm-up question, right? It's easy, it's not a lot of brain power, and it's great survey work to then dive into the why and the more nuance. But yeah, I totally agree that it's great for comparison, as a reference point — but as a standalone data point, it's definitely not my go-to.
Speaker 2 (31m 50s): All right. So to go in the other direction, and to start wrapping things up here: we talked a little bit about what to do first if you're not currently running any surveys. But if you have no qualitative measures or research methods in place at all — and let's keep it scoped to a direct-to-consumer e-commerce brand — what would be the first steps? If you had to lay out a three-step or five-step plan that someone could go implement this week to get some qualitative data, where would you start?
Speaker 0 (32m 25s): This is for a brand that doesn't have anything in place?
Speaker 2 (32m 29s): Yeah. Let's say they just have their own quantitative tracking — pageviews, conversions, Google Analytics type of data.
Speaker 0 (32m 40s): I would have them start with those two surveys. One could be an opt-in — either on the thank-you page or, as you mentioned, in the welcome email — plus the cancellation survey. In both of those, I would ask folks to sign up if they're interested in talking more, and then I would leave the surveys static. I think those are really, really good placeholder surveys that you just want to have, while looking at the data pretty regularly.
And once those interview requests start coming in, I would start booking right away. Then look at your quantitative data and get a sense, through the journey or through user flows, of whether there are specific points you want to optimize. Is it the add-to-cart conversion? Is it on-site to sign-up? What are the conversion points that are lower than you'd like, or that you really want to emphasize or focus on?
I would speak to those in the interviews. Rather than getting lost in "which survey first," just have those static surveys in place and use them to set up these conversations — like you mentioned, so you can have a continuous stream. What you ask about in the interviews last month might be different from what you focus on this coming month, but you still have really, really strong inputs to the data you need.
So that's where I would start: those two surveys — opt-in and cancellation — checking your current data to see where you want to optimize, and then making that the focus of your interviews. If you're getting a good influx of customers coming through, it doesn't take too long, I find, to get the first interviews on the calendar. And once you do one or two, you'll get some good momentum, you'll feel things out, and it will always give a lot more insight than you were banking on.
So yeah, that's where I would start. Is there anything you would add, Stuart?
Speaker 2 (34m 53s): No, I definitely agree that having that baseline in place — the surveys that you can run all the time, continuously scheduling interviews — is really the first step. From there, yes, you can definitely run more nuanced surveys, more targeted surveys in specific places. But you can never go wrong with understanding why your customers buy and why they cancel. If you do nothing else and you do those two things well, you're definitely going to see some growth and some improvement.
Speaker 0 (35m 23s): I think those are such strong transition points, too, that insights from them will absolutely impact the middle part of that buyer journey as well. So yeah, totally agree.
Speaker 2 (35m 34s): Cool. So we really haven't gotten into interviews at all, but we'll have to save that for another time — that's definitely a whole other topic by itself. So to wrap up with one final question, which is a little bit out of left field: we've talked a lot about getting more than you bargained for when it comes to learnings and insights from qualitative research. What's been the most surprising thing you've found in your research — whatever you can share from your client work?
Speaker 0 (36m 5s): So are you asking the full gamut — surprising in terms of collecting and the nature of the research, or surprising in terms of the full circle of how the interpretation and analysis of the research feeds into a whole?
Speaker 2 (36m 19s): I think the thing that you maybe went in thinking might be one thing, and then it actually turned out to be something completely different.
Speaker 0 (36m 28s): Okay, let's see. I mean, there have been quite a few. I think if you keep an open, curious mind, each of the interviews tends to give an extra layer. So I guess the surprising factor for me — and maybe it continues to be — is that if you just stay with people, in an interview in particular, they share a lot more than they even think they're going to share, often to the point that they'll have this moment of, "Oh, I didn't really think about it like that,"
or, "I didn't realize I do that — but yeah, that's kind of what's happening." But it does require you to really dig, ask the follow-ups, and chase your curiosity, which at the beginning can feel like you're not really getting anywhere — maybe they give a shorter answer, and if you don't press, you'll just be left with that. So I think the surprising thing is that people really are receptive and willing to go with you if you keep digging.
And that's really where the rich insight comes: that moment of revelation from both people. You're like, "Oh, that's so interesting," and they're like, "I know — I didn't even realize that." So yeah, I think that would be the most surprising thing.
Speaker 2 (37m 52s): Awesome. Well, this has been really fun. We're definitely going to have to have you come back to do a part two on interviews. But for those who want to find out more and follow your surveys project, where should they go? And of course, we'll link everything up in the show notes.
Speaker 0 (38m 6s): Well, thanks for having me — this was super, super fun. You can find everything that I'm up to, and insights from the surveys project as well, all on Twitter. It's just Hannah underscore Shamji — and you're going to link to it, I know you just said. But yeah, everything is on Twitter: any posts I put up, any updates on a survey, any random musings, or "hey, this thing just happened in a customer interview" — all of that will be right there. So come say hi.
Speaker 2 (38m 31s): Awesome — an easy way to follow along. Thanks so much for doing this.
Speaker 0 (38m 36s): Thank you, Stuart.
Speaker 1 (38m 36s): Hey, thanks for listening to this episode of the DTC Voice of the Customer podcast. If you liked this conversation, I'd love for you to leave a five-star rating and maybe even a review wherever you get your podcasts — or reach out to Hannah on Twitter and tell her how much you liked this interview. If you have any questions about the research playbooks we talked about, you can always reach out to me on Twitter at Stuart Balcombe, or send me an email: Stuart at discoverysprints.com. Next up, we have the next installment in the Way I Bought This series, this time featuring Away travel.
In this series, I interview the customers of top DTC brands to uncover their buying journey and extract actionable ideas for growth. As always, you can find the complete show notes and more episodes at email@example.com slash pod.
Get new interviews with DTC operators and their customers in your inbox every Thursday.