
Ep. 113: Design Thinking: The Secret Weapon for Business Success

SUMMARY

Learn how to create a culture that values quality and embraces design thinking. Key takeaways include making design thinking a continuous practice, fostering transparency and collaboration for innovation, aligning ideas with strategic objectives, and managing conflicts while respecting different ideas.


Description

Are you ready to build a culture that values quality and embraces design thinking? We'll show you how to foster collaboration, encourage idea-sharing, and establish feedback loops for data within your organization. Learn how to establish consistency, manage system risks, and make informed decisions for future scalability. Rest assured, creating a culture of design thinking goes beyond simply following a process - it's about making quality a core value of your organization.

This week's takeaways:

  • Design Thinking as a Habit: Make it a continual practice, not a one-time exercise.
  • Transparency and Collaboration: Encourage everyone to challenge and share ideas, fostering a culture of innovation and problem-solving.
  • Aligning Ideas with Objectives: Evaluate how feasible ideas contribute to strategic goals and adapt as necessary.
  • Managing Conflicts and Respecting Ideas: Foster a comfortable environment for disagreement when important decisions need to be made.

If you're interested in editing or providing feedback for their upcoming ebook, please contact them at feedback@definitelymaybeagile.com. And don't forget to subscribe to the podcast for more insightful episodes.

Transcript

Peter: 0:05

Welcome to Definitely Maybe Agile, the podcast where Peter Maddison and David Sharrock discuss the complexities of adopting new ways of working at scale. Hello, Dave, how are you today?

Dave: 0:15

Excellent, excellent. I'm even in the same time zone as you, so that means I have no excuse for saying I'm a few hours behind you when we have a conversation.

Peter: 0:23

Of course, I don't have to be quite so far ahead. I'll keep it in the right zone. So what are we talking about today?

Dave: 0:33

We touched on this in the preamble: design thinking. I know we've talked about design thinking before, just in terms of what it is or the principles for how to apply it, say, within an agile delivery process. But there's a lot of really great stuff that comes out of design thinking that gets diluted as it gets built up and actually delivered to customers, and I guess it's just trying to figure out how to be really impactful with the pre-work that gets done.

Peter: 1:03

Yeah, how do you actually be effective with this as a method? For the listener, here's my two-minute version of design thinking: do a really good job of defining the problem, brainstorm a whole ton of possible solutions, prioritize which ones you want to invest in, work within a constraint to test the feasibility of those ideas, and then bless the ones you want to move forward. That's design thinking in a nutshell, and it's a part of service design, which has many hundreds of practices. There are lots of different variants on this and different ways of going about it, but I always think of it as: how do you come up with feasible ideas? How do you come up with ideas that you can test as feasible and then move into the rest of your process?

Dave: 1:50

And this happens before a lot of the earlier agile design pieces, typically before you start to go and build MVPs, even before thinking about what we're going to do. From the way you're describing that, Peter, it starts with understanding the customer, the end user or prospective end user, and the problems they're experiencing in doing whatever it is we're offering them a product or service to do. As soon as you start with that and then move into the solution design piece, I think one of the big headwinds you hit almost immediately is that, by definition, so many of the people involved in product delivery bring their own perspective, with their own cognitive biases, which we've talked about before, to the table. So the findings that come out of quite an analytical approach, that we need to do A, B and C, are often reinterpreted or beaten down by opinions and experience and so on from a different perspective. That's the headwind, almost, that those changes run into.

Peter: 2:55

Yeah, and this, I think, is where we start to struggle, because we've got a great idea, a really innovative way we could go about solving this problem, and I've done some initial testing that shows it's feasible, so why don't we go and try it? And then you hit those headwinds: well, no, we've got this entire platform over here that does that; no, we don't need these pieces; no, why would we do that? And so the work that goes into it isn't necessarily going to get consumed and built into the organization, and we've seen this at scale in organizations. But I think there's also a smaller-scale use of design thinking within teams or areas, where it also struggles to get out to the whole organization. And I remember now where we've talked about this before: when we talked about innovation and innovation departments.

Dave: 3:54

And so I think there's a huge piece, which is that we need to get into our heads that we need to continually be questioning what we're building and whether or not it's solving a real problem. And if it does solve a real problem, is it solving it sufficiently well that people are drawn to it? This is where that whole concept of product-market fit comes in: whether or not what your product delivers fits a real customer need or a customer pain point. That whole process of continuously questioning things is what design thinking, both the original design thinking concept and what's been repackaged in a number of different ways, such as Google's innovation sprint format and things like this, is about. There are a lot of different ways of doing the same thing, which is bringing that conversation around the customer, and really understanding the customer, into the process. And I think one of the key takeaways is that we should be doing that whole process as a habit within the teams, continuously. How do we build that in as a muscle that we use all the time?

Peter: 4:59

Yeah, and if you don't, then the problem is that if it happens elsewhere, taking those ideas and actually testing them in the market becomes very difficult, because you can't do the next piece, and it becomes almost wasted effort if it just happens in isolation. So make it a habit, something we always do: we're always looking at the different ways we could approach this. Hey, we've got a problem here that we want to solve; let's have a process we go through to come up with ideas for how we might solve it, and also to test which of those ideas might actually be feasible for us to try.

Dave: 5:39

Well, as you're describing that, I'm thinking that, first of all, for brand new products in brand new fields, take the time to go through that whole process; there's definitely a lot of value in that side of things. And I think what we're talking about in creating that habitual step, or the mental muscle that we want to continuously use, is always challenging how we make those changes, and what that brings to mind is transparency. We need to be able to measure it, and it's a risk problem, because I'm thinking of user experience teams. We're certainly seeing this more, where they're either disconnected from the delivery teams or they're slowing down the work because they want to go and do user experience on the whole process. And I think there's a risk management conversation to be had: how do we make those changes more quickly by identifying which elements are most likely to benefit from a deeper dive in terms of user experience, so we balance things correctly?

Peter: 6:41

Yeah, and I agree, I think this is a critical piece, especially if you can get this into that UX space. I was talking about this recently: there was a conversation around consistency, saying the closer to the user you are, the more consistent the experience needs to be. As a user, you want a consistent experience. There are definitely products I interact with in the marketplace where I find myself switching between different parts of the product and getting a totally different experience, and it makes navigating much harder. You can definitely feel it when a product has been built by different areas and then glued together afterwards, and there isn't that consistent user experience as you move through the product. For sure.

Dave: 7:22

I mean, this is one of those things that we kind of circle around quite a bit, and maybe we're trying to break things down into two or three areas. One of them is the brand new product, the brand new space being investigated. I really like that: formulate it, invest the time, pull people together. It's costly, and that really is what we're talking about; it's something with a high cost to actually pull together, but a huge value at the other end of the spectrum if it works and you come up with something that is really unique and solves a problem in a different way. But then there's the habitual piece: how do you get all of those involved in delivering a product to continuously challenge themselves? It's a little bit like what we've previously talked about around quality. How do you get everybody to be thinking about quality? Quality is not the responsibility of one person with a particular role or skill; quality is everybody's responsibility. And in many ways this whole idea of design and usability is the same: it's everybody's responsibility, not one role or skill set.

Peter: 8:30

Yeah, it reminds me of a CTO I used to work for, who came into the organization with exactly that message, that quality is everybody's problem, but he didn't actually come in with any real solutions for it. I think we did end up with some design thinking, though I'm not sure the two things were connected, and that, I think, was part of the problem: it was a one-off session on design thinking, rather than asking, okay, if we want quality to be a part of what we do, and we want to think about things and constantly challenge how we're approaching problems, what are the mechanisms we're putting into the organization to encourage that to happen? It's one thing to stand at the front on a podium and utter the words; it's another thing to actually think about the processes and practices that need to be in place.

Dave: 9:23

Well, I think, as with nearly every conversation we have, it's ultimately a cultural change with some sort of mindset shift. As you were describing that experience with the incoming executive, two things came to mind. One is data and information. Design, and also quality in the same kind of space, fundamentally requires a really short feedback loop. The quicker I find out that there's a defect affecting the customer's experience, whether it's a quality problem or maybe some interruption in the user experience, the quicker we learn from it and can build on it. But that information needs to be shared with everybody, so that everyone can see it and understand its impact.

Peter: 10:09

Yeah, definitely, and I think that's another key element of it as well. One of the nice things about those processes is that they give you the opportunity to bring everybody together. There's a similarity to some of the exercises we do from a mapping perspective, like value stream mapping: bringing people together and getting them to talk about the entire problem set in the system, just in a different domain and with a few less steps and elements involved. But it's essentially a similar type of concept: getting us to come together, share ideas, collaborate on what the solutions might be, and then that in turn can feed into other aspects of the system.

Dave: 10:52

Well, you raise an interesting point. If you step into that mindset, then, and I've certainly seen this, I'm sure you've seen it as well, everything becomes open to interpretation and open to change. And yet the point you just made around consistency means that one of the rules that has to come in, or something we have to recognize, is that at some point this thing, whatever it is, isn't going to change. We're going to hold it firm for whatever it is, six months, a year, three months, whatever, but we're not going to keep iterating on every element just because somebody has a feeling that that's what they should do. It's a little bit like the quality perspective again; I think that analogy is working quite well. From a quality perspective, there are some elements of your systems whose quality you're just going to leave alone, because it's rare that there's a problem there. From a risk management side, we can't make everything 100% perfect; we have to understand where to spend the time. And one of the things I find interesting is how little conversation there is about the impact of poor design and usability.

Peter: 11:56

Yes, I would agree. A related conversation I was having earlier today was around scaling in a startup, where they were asking, how do you know when to scale the system? And I said, well, you need to be able to look ahead and work out what it's going to take to scale it, and then I need this amount of time to work out what I'm going to need to do. That starts to build up to a question: at what point do I have to make a decision? And now I've got to make sure everybody's on the same page, so that we have those points where we know, okay, we're onboarding a new customer that's going to bring this much more demand into my system. What am I going to need to do? When do I need to know, so that I can start to make those decisions about how I'm going to scale? And then thinking, too, that at some point, if you're lucky, and this is a good problem to have, the current architecture or design you have is not going to support the ongoing growth of the system, and you'll have to rethink it. I think this ties into that piece around people not spending the time to think through what those points might be. When am I going to need to know this? How do I put those points in? How do I start to look at what I'm doing, whether from a UX, scaling, or quality perspective? How do I know that things aren't going in the right direction, and that I'm at a point where I can make the right decisions?

Dave: 13:24

And there's some really interesting stuff around leadership on this one, because part of it is strategy: looking ahead and making decisions with poor information, low levels of information, but still making some sort of call on what's going to be the next area of improvement, or whatever it might be. But there's another element of leadership, which is helping people recognize that we want them to challenge things, we want them to interpret the data and make suggestions for changes over here, but we don't want them to spend time and effort and money over there. I remember one organization that was delivering a social media platform, one of the many networking apps out there, and there was a real push to try to go into China, because obviously China has 1.x billion people, so in theory any sort of networking thing is going to explode if you put it in front of 1.x billion people, but with no understanding of the cost and the strategic implications and all the rest of it that comes with that. So that's an example of where we don't want to change that element. What we want to change is usability things, how we send messages back and forth, or whatever else may be involved in that. So how do we keep people focused on the right areas, where they can use that innovation muscle?

Peter: 14:46

Yeah, and I think this brings up an interesting piece as we start to take the feasible ideas that come out of design thinking into implementation: do the things we have align with the objectives we've set, are we going in the right direction, and how will those ideas impact them? If we have defined a set of objectives, and we've got an idea of how we're going to measure success in those areas, how are the ideas we're coming up with going to tie into helping us achieve those objectives?

Dave: 15:22

Yeah, you know, as you're describing that, one of the things that strikes me is that what we're really talking about is a deeply collaborative team with high levels of communication, wherever that team might be, where everybody has a different perspective but can really handle the conversation that goes on. Because if I imagine some of the scenarios we've talked about, somebody somewhere is going to say, Dave, that's an idiotic idea. They may not put it across like that; you can think of it as, that's not where we're going. How do you have those conversations? Respectfully, of course, but a lot of it is that you practice. You have these conversations over and over again. But individually, you've also got to take note that sometimes my idea is the wrong one; it's not the one we're going to go with. I'm going to take Peter's idea and, whatever I may think of it, the consensus is that's where we have to go, and I now have to get fully, 100%, behind that direction.

Peter: 16:21

This very topic came up earlier today, too, and two of the other pieces I added to that were around how we reach agreement up front. As you say, you practice it: how are we going to deal with conflict as a team when we interact? What are the protocols we have in place if I totally disagree with what you're saying?

Dave: 16:42

Again, again.

Peter: 16:45

And how we're going to handle that is a key piece. Another one I threw out there is: don't just have conversations about the work. Make sure you're having other conversations too, socially or outside of it, so that there's an opportunity to disagree in a safer space, so that when it's absolutely critical that you do disagree, you feel comfortable doing so.

Dave: 17:10

Well, we've had quite a meandering conversation here. We have. I feel like somehow we slowly brought it back to the point we started with, which was that dilution effect: coming up with something through a design thinking process, but then seeing that idea dilute as it gets executed. How are we going to wrap this one up? Do you want to give it a go?

Peter: 17:31

It's a very good question, because I agree we covered a lot of ground here today. Starting with the original piece we were talking about, effective design thinking, and the fact that it can happen at different levels: I liked what you were talking about around design thinking on the big, hairy, scary problem we want to solve, like how government delivers a service to a particular constituent, those big problems we're looking to solve, versus, hey, within the team I want to rethink how we're approaching this, and what does that look like? And then using this as a process, making it a habit to challenge ideas and to think about how we might do this differently, so that we're not just looking at things once and then executing blindly, without going back and thinking, okay, how could we solve this in different ways? Is this the right way to go about it? That's what I liked. So what would you add?

Dave: 18:35

There's a lot that we bounced around on, but I think one of the really nice things that we ended up closing with, and it's really to do with shortening everything, is that we have to learn to work with one another, and we therefore have to practice that conversation of being innovative and bringing things to the table, informed by data, informed by an appreciation and understanding of what's been said, but also being comfortable not pursuing that idea and actually accepting other ideas. And, like you said, it's that social fabric in there as well, because you practice. The best teams we've worked on, and I'm sure we've both had this experience, are the ones where you really have a deep connection over time: lots of conversations, lots of respectful but, no doubt, passionate conversations around topics.

Peter: 19:32

Yes, indeed. So with that, thank you as always, Dave, great conversation. If anybody wants to reach out, they can at feedback@definitelymaybeagile.com, and don't forget to hit subscribe. Until next time.

Dave: 19:46

Thanks again.

Peter: 19:47

Till next time. You've been listening to Definitely Maybe Agile, the podcast where your hosts, Peter Maddison and David Sharrock, focus on the art and science of digital, agile, and DevOps at scale.