00;00;00;00 - 00;00;04;28 Speaker 1 All right, buckle up, everyone, because today we're doing that deep dive into the user interviews you asked for.
00;00;04;29 - 00;00;07;02 Speaker 2 Yeah, it's a really crucial topic.
00;00;07;02 - 00;00;14;25 Speaker 1 Definitely. We've gathered some great articles, some real-world examples, to really unpack why these interviews matter so much in product design, and how...
00;00;14;25 - 00;00;16;22 Speaker 2 ...to actually do them well, which is key.
00;00;16;24 - 00;00;27;03 Speaker 1 Exactly. So consider this your shortcut to understanding the power behind user interviews, why they lead to better products, and, you know, the practical steps.
00;00;27;03 - 00;00;28;27 Speaker 2 Sounds good. Where should we start?
00;00;28;29 - 00;00;37;18 Speaker 1 Let's kick off with the why. What makes these conversations so valuable? The sources really emphasized going beyond just the numbers, didn't they?
00;00;37;19 - 00;00;54;04 Speaker 2 They absolutely did. It's all about the qualitative side of things. Analytics tell you what users are doing, like clicking a button or dropping off a page. Right? But interviews, they uncover the why behind that behavior: the stories, the feelings, the context. Stuff data alone just can't capture.
00;00;54;04 - 00;00;56;16 Speaker 1 So it fills in the gaps in our understanding.
00;00;56;16 - 00;01;03;04 Speaker 2 Precisely. You might discover users have these clever workarounds for things your product doesn't support.
00;01;03;07 - 00;01;07;10 Speaker 1 Like that example from one source: exporting data to Excel.
00;01;07;10 - 00;01;18;18 Speaker 2 Exactly. Someone mentioned users exporting data because the app couldn't do a specific analysis they needed. Boom. That's not just a pain point. It's a potential feature idea staring you in the face.
00;01;18;21 - 00;01;22;25 Speaker 1 An idea you'd probably miss if you just looked at usage stats, for sure.
00;01;22;28 - 00;01;32;08 Speaker 2 Or you uncover frustrations the team never even considered, or maybe aspirations that spark entirely new directions. It's about understanding the human behind the clicks.
00;01;32;08 - 00;01;40;13 Speaker 1 Okay, so it gives us richer insight. But how does that translate into, you know, actual business value? Does talking to users actually help the bottom line?
00;01;40;13 - 00;01;45;10 Speaker 2 Oh, absolutely. It's not just a nice-to-have. Think about it as de-risking development.
00;01;45;11 - 00;01;46;26 Speaker 1 De-risking? How so?
00;01;46;26 - 00;01;53;21 Speaker 2 Well, by talking to users early in the process, you validate your ideas before you invest tons of time and money building something.
00;01;53;22 - 00;01;55;13 Speaker 1 So you avoid building the wrong thing.
00;01;55;13 - 00;02;03;11 Speaker 2 Exactly. One source used a great analogy: interviews are like a flashlight in a dark room. You spot obstacles before you crash into them.
00;02;03;11 - 00;02;05;06 Speaker 1 I like that. Saves a lot of wasted effort, I imagine.
00;02;05;06 - 00;02;14;11 Speaker 2 A huge amount. And it doesn't stop there. Understanding user needs deeply means you build products that are genuinely useful, more engaging.
00;02;14;17 - 00;02;17;20 Speaker 1 Which should lead to better adoption, people sticking around longer.
00;02;17;21 - 00;02;25;08 Speaker 2 That's the idea. If the product solves a real problem in a way that makes sense to the user, they're far more likely to adopt it and keep using it. Retention goes up.
00;02;25;11 - 00;02;32;21 Speaker 1 Makes sense. Like analyzing why people cancel. That's often interview territory too, right? To find out what needs fixing.
00;02;32;24 - 00;02;41;20 Speaker 2 Definitely. Exit interviews or surveys can give clues, but a real conversation often reveals the deeper issues that quantitative data might mask.
00;02;41;21 - 00;02;51;19 Speaker 1 Okay, so the value is clear.
Better insights, less risk, better adoption. Let's talk practicalities. How do we actually do these interviews effectively? Sounds a bit daunting.
00;02;51;24 - 00;02;56;14 Speaker 2 It really doesn't have to be. It's a learnable skill, honestly. The key is knowing when to do them, and how.
00;02;56;14 - 00;02;57;24 Speaker 1 Is that just at the beginning?
00;02;57;24 - 00;03;17;03 Speaker 2 Nope, ideally throughout the whole product lifecycle. Early on, yes, to define the problem space and understand needs.
Speaker 1 Okay.
Speaker 2 But also during the design phase, to test concepts and prototypes, get feedback before you build. And even after launch, to understand how people are really using the product, identify areas for improvement, and iterate.
00;03;17;03 - 00;03;18;28 Speaker 1 So it's an ongoing conversation.
00;03;18;28 - 00;03;24;16 Speaker 2 It should be. Now, about how many people: there's that common myth, isn't there?
00;03;24;18 - 00;03;27;06 Speaker 1 The "you only need five users" thing? Yeah.
00;03;27;07 - 00;03;34;29 Speaker 2 That one. It's a bit misleading. Five can be okay for usability testing: specific tasks, finding the most glaring issues.
00;03;34;29 - 00;03;37;13 Speaker 1 But not for understanding broader needs.
00;03;37;16 - 00;03;41;29 Speaker 2 Probably not for deep exploration. The goal should be saturation.
00;03;41;29 - 00;03;45;19 Speaker 1 Saturation, meaning what exactly?
00;03;45;20 - 00;03;52;23 Speaker 2 It's the point where you stop hearing significantly new insights or themes. You start hearing the same stories, the same pain points, repeated.
00;03;52;25 - 00;03;55;06 Speaker 1 Okay. So you feel like you've kind of mapped the territory.
00;03;55;07 - 00;04;08;09 Speaker 2 Pretty much. The number varies depending on the complexity and the scope, but aiming for maybe a dozen interviews is often a good starting point for exploring a problem space. It's more about the quality of the insights than hitting a magic number, right?
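[Editor's note: the saturation idea described here, stopping when new interviews stop surfacing new themes, can be sketched in a few lines of code. This is a minimal illustration, not something from the sources; the session data and theme tags are hypothetical.]

```python
from typing import List

def new_themes_per_interview(interviews: List[set]) -> List[int]:
    """For each successive interview, count themes not heard in any earlier one."""
    seen = set()
    new_counts = []
    for themes in interviews:
        new_counts.append(len(themes - seen))  # themes first heard in this session
        seen |= themes
    return new_counts

# Hypothetical coded sessions: each set holds the themes tagged in one interview.
sessions = [
    {"export-workaround", "confusing-nav"},
    {"confusing-nav", "slow-reports"},
    {"slow-reports", "export-workaround"},
    {"confusing-nav"},
]
print(new_themes_per_interview(sessions))  # [2, 1, 0, 0]
```

When the counts flatten to zero for a few sessions in a row, you have likely reached saturation, matching the "same stories, same pain points repeated" signal mentioned above.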
00;04;08;09 - 00;04;15;24 Speaker 1 Depth over just quantity. So let's say we've got our interview schedule. How do we ask the right questions to get those rich stories?
00;04;15;26 - 00;04;20;15 Speaker 2 Good question. The absolute key is open-ended questions.
00;04;20;15 - 00;04;21;22 Speaker 1 Not yes/no questions.
00;04;21;22 - 00;04;34;13 Speaker 2 Definitely not. You want to encourage storytelling. Instead of "do you like this feature?", ask "can you tell me about the last time you used this feature?" or "walk me through how you accomplish X."
00;04;34;13 - 00;04;37;13 Speaker 1 Get them talking about their actual experience.
00;04;37;13 - 00;04;45;06 Speaker 2 Exactly. And crucially, avoid leading questions. Don't put words in their mouth or signal the answer you want to hear. Let their experience come through.
00;04;45;10 - 00;04;51;02 Speaker 1 So less "wouldn't it be great if..." and more "what challenges do you face when..."
00;04;51;02 - 00;04;59;19 Speaker 2 Perfect. And focus on past behavior, not future hypotheticals. What people say they might do is often very different from what they actually do.
00;04;59;23 - 00;05;04;09 Speaker 1 "Tell me about a time you did X" is more reliable than "would you use feature Y?"
00;05;04;12 - 00;05;08;13 Speaker 2 Much more reliable. People are not great at predicting their own future behavior.
00;05;08;13 - 00;05;16;18 Speaker 1 Okay, makes sense. So we've done the interviews. We've got all this rich qualitative data, pages of notes, maybe recordings. What happens next? I can't just hand that over.
00;05;16;18 - 00;05;23;04 Speaker 2 No, definitely not. The analysis is where the real magic happens: turning those raw observations into actionable insights.
00;05;23;06 - 00;05;24;20 Speaker 1 How do we do that systematically?
00;05;24;22 - 00;05;30;07 Speaker 2 Thematic analysis is a common approach. It sounds fancy, but it's really about finding patterns.
00;05;30;12 - 00;05;32;24 Speaker 1 Looking for recurring ideas or problems.
00;05;32;24 - 00;05;46;13 Speaker 2 Exactly. You might read through transcripts, tag quotes or observations, and then start grouping similar points together: "difficulty finding X," "needs feature Y," "confusing workflow Z."
00;05;46;14 - 00;05;49;03 Speaker 1 And you start seeing themes emerge from the data.
00;05;49;04 - 00;05;54;02 Speaker 2 Right. One technique mentioned in the sources is affinity mapping, which is great for teams.
00;05;54;02 - 00;05;55;12 Speaker 1 Oh yeah, the sticky notes thing.
00;05;55;12 - 00;06;07;12 Speaker 2 That's the one. Everyone writes down individual insights or quotes on sticky notes, then you collaboratively group them on a wall or whiteboard. It's a very visual way to see the patterns and build shared understanding.
00;06;07;12 - 00;06;09;04 Speaker 1 I could see how that would get the whole team involved.
00;06;09;05 - 00;06;12;23 Speaker 2 It really does. And then, of course, you need to communicate these findings effectively.
00;06;12;25 - 00;06;14;19 Speaker 1 Because insights are useless if they stay hidden.
00;06;14;19 - 00;06;24;12 Speaker 2 Totally. You need to share them with the wider team, stakeholders, whoever needs to know. This could be through reports, presentations, highlight reels.
00;06;24;12 - 00;06;26;21 Speaker 1 Even short video clips of users speaking.
00;06;26;28 - 00;06;33;24 Speaker 2 Yeah, those can be incredibly powerful for building empathy. But the absolutely crucial part is making it actionable.
00;06;33;24 - 00;06;34;22 Speaker 1 Connecting the dots.
00;06;34;22 - 00;06;49;27 Speaker 2 Yes. Don't just present findings; suggest what they mean for the product. "Because users consistently struggled with X, we recommend prioritizing improvements to feature Y, or redesigning workflow Z." Tie it back to decisions.
00;06;49;27 - 00;06;59;02 Speaker 1 That actionability is key. And the sources had some brilliant real-world examples of this in practice. The Airbnb story was quite something, wasn't it?
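[Editor's note: the tag-and-group workflow described here, essentially a digital version of affinity mapping, can be sketched briefly. This is a minimal illustration, not from the sources; the participants, tags, and quotes are all hypothetical.]

```python
from collections import Counter

# Hypothetical tagged observations from interview notes: (participant, tag, quote).
observations = [
    ("P1", "export-workaround", "I just dump it all to Excel to run my analysis."),
    ("P2", "confusing-nav", "I never remember where the report settings live."),
    ("P3", "export-workaround", "Excel is the only way I can slice it how I need."),
    ("P4", "confusing-nav", "Took me ten minutes to find that screen."),
    ("P5", "export-workaround", "We keep a shared spreadsheet instead."),
]

# Group quotes under each tag (the digital version of clustering sticky notes).
themes = {}
for participant, tag, quote in observations:
    themes.setdefault(tag, []).append((participant, quote))

# Rank themes by how often they came up, to surface the strongest patterns.
counts = Counter(tag for _, tag, _ in observations)
for tag, n in counts.most_common():
    print(f"{tag}: {n} mentions")
```

The ranked counts point at candidate priorities, and the grouped quotes under each tag give the evidence to cite when presenting the findings.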
00;06;59;07 - 00;07;02;25 Speaker 2 Early days, bookings were slow. They didn't just guess why. They...
00;07;02;25 - 00;07;03;22 Speaker 1 ...talked to their hosts.
00;07;03;22 - 00;07;11;22 Speaker 2 They did, and discovered that the photos people were using for their listings were, well, often pretty bad. Poorly lit, didn't show the space well.
00;07;11;22 - 00;07;13;06 Speaker 1 It seems obvious in hindsight, maybe.
00;07;13;07 - 00;07;22;09 Speaker 2 Perhaps. But they didn't know until they talked to people, and likely saw the listings themselves through that lens. That insight led them to invest in better photography for hosts.
00;07;22;09 - 00;07;24;08 Speaker 1 And that was a huge turning point for them, wasn't it?
00;07;24;08 - 00;07;33;01 Speaker 2 A massive one. It dramatically improved the appeal of listings and helped build trust, sparking significant growth. All from listening to users about photos.
00;07;33;07 - 00;07;36;04 Speaker 1 Incredible. Yeah. Any other examples that stood out?
00;07;36;06 - 00;07;44;05 Speaker 2 Well, Google Meet during the pandemic was interesting. They had this sudden, massive influx of new users: teachers and students.
00;07;44;05 - 00;07;46;15 Speaker 1 Right, for virtual classrooms. A whole new context.
00;07;46;15 - 00;07;56;13 Speaker 2 Exactly. So they rapidly interviewed educators to understand their specific needs: things like managing a class online, breakout rooms, integration with education tools.
00;07;56;13 - 00;07;59;00 Speaker 1 And they adapted the product based on that feedback.
00;07;59;01 - 00;08;10;21 Speaker 2 They did, very quickly. Adding features specifically for education helped them become much more relevant and useful in that space, driving adoption. It showed agility fueled by user feedback.
00;08;10;23 - 00;08;15;18 Speaker 1 It really highlights that even huge, data-rich companies need that qualitative input, for sure.
00;08;15;18 - 00;08;18;26 Speaker 2 Look at Spotify too.
Data and A/B testing are huge for them.
00;08;18;26 - 00;08;19;18 Speaker 1 Of course.
00;08;19;25 - 00;08;26;27 Speaker 2 But one source mentioned an instance where they were testing names for a new feature. A/B testing could show which name got more clicks, maybe...
00;08;27;02 - 00;08;27;29 Speaker 1 But not the feeling.
00;08;28;03 - 00;08;39;02 Speaker 2 Exactly. User interviews revealed a stronger emotional connection to one of the names, a sense of delight that the data couldn't quite capture. They chose that name based on the qualitative insight.
00;08;39;09 - 00;08;42;03 Speaker 1 It adds that human layer to the decision-making.
00;08;42;03 - 00;08;51;22 Speaker 2 It really does. Which brings us to another, sometimes overlooked, benefit.
Speaker 1 Oh, what's that?
Speaker 2 How user interviews can actually bring teams together.
00;08;51;24 - 00;08;54;00 Speaker 1 How so? By giving them a common focus?
00;08;54;00 - 00;09;00;27 Speaker 2 Precisely. When you have direct evidence from users, their struggles, their needs, their words, it shifts the conversation.
00;09;01;03 - 00;09;02;20 Speaker 1 Away from just opinions.
00;09;02;20 - 00;09;12;25 Speaker 2 Debates become less about "I think we should do this" and more about "what does the user need?" or "how can we best solve this user's problem?" It creates common ground.
00;09;12;25 - 00;09;14;09 Speaker 1 Based on empathy for the user.
00;09;14;09 - 00;09;22;18 Speaker 2 Right. And involving different roles in the process is powerful: having engineers, designers, product managers, even support staff observe interviews.
00;09;22;19 - 00;09;23;23 Speaker 1 They hear it firsthand.
00;09;23;23 - 00;09;33;18 Speaker 2 Yes. An engineer might hear a user describe a technical frustration and immediately get an idea for a fix. A support person might hear the root cause of recurring issues they deal with.
00;09;33;19 - 00;09;36;03 Speaker 1 It connects everyone directly to the user experience.
00;09;36;03 - 00;09;45;09 Speaker 2 And that fosters a truly user-centric culture. When everyone in the organization has that empathy, it starts to permeate every decision, not just product features.
00;09;45;16 - 00;09;48;24 Speaker 1 That's a really powerful point. It's not just about the product team in isolation.
00;09;48;24 - 00;09;53;16 Speaker 2 Not at all. It's about aligning the whole organization around solving real user problems.
00;09;53;17 - 00;10;00;13 Speaker 1 So, wrapping this up, it seems crystal clear that user interviews aren't just an optional extra. They're fundamental.
00;10;00;17 - 00;10;08;25 Speaker 2 Absolutely fundamental, for building products that truly resonate, solve problems, and ultimately succeed.
00;10;08;26 - 00;10;14;18 Speaker 1 They give you the why, they de-risk development, guide design, and even align your teams.
00;10;14;20 - 00;10;16;02 Speaker 2 Couldn't have said it better myself.
00;10;16;02 - 00;10;29;00 Speaker 1 So maybe the final thought to leave our listeners with is this: what hidden opportunities, what crucial insights, what potential breakthroughs might your organization uncover if you just started talking to your users more?
00;10;29;03 - 00;10;30;10 Speaker 2 Good question to ponder.
00;10;30;10 - 00;10;37;22 Speaker 1 Definitely something to think about. And of course, the source materials we looked at have even more detail and examples, so dive in there if you want to explore further.