
Ep. 11: Quality and testing

SUMMARY

Hosts Dave and Peter are joined by special guest Peter Budden to explore the concept of quality in agile and DevOps teams, emphasizing the importance of meaningful conversations, skill improvement, and cultivating a testing mindset for effective outcomes.


Description

In this episode, Dave and Peter have a special guest, Peter Budden, who has over 15 years of experience working in QA and testing. We discuss what quality means in the context of agile and DevOps teams.

This week's takeaways:

·         Make sure that you're having conversations about what quality means for you and for your team.

·         Figure out ways to help your teams improve their testing skills.

·         Testing is a mindset: ask good questions and give people the space to be inquisitive.

We'd love to hear your feedback! If you have questions, would like to propose a topic, or want to join us for a conversation, contact us here: feedback@definitelymaybeagile.com

Transcript

[00:00:00] Peter Maddison: Welcome to Definitely, Maybe Agile, the podcast where Peter Maddison and David Sharrock discuss the complexities of adopting new ways of working at scale.

[00:00:12] Hello and welcome. Dave and I are here again, and we're joined by a special guest, Peter Budden, who's going to give us some wonderful information all about testing and lots of different aspects of that. Would you like to go ahead and introduce yourself, Peter?

[00:00:24] Peter Budden: Yeah, sure. I've been working in QA and testing for the last 15 years, on projects large and small, at companies from bare-bones startups all the way to large enterprises. A lot of that time has been spent in agile and DevOps, really trying to problem-solve projects that have gone wrong and thinking, from a QA mindset, about how we can improve the way that QA is working and the way that testing's happening, and how to then bring that into an agile process. We've seen a huge change in testing over the last 15 years. Things have massively changed. On the podcast you've been talking about cross-functional teams and all these things, and I think using a testing lens to look at that, and thinking about how we deal with the problem of quality, is a really useful perspective for a lot of agile teams who are having trouble in this space.

[00:01:13] Dave Sharrock: Well, welcome to our conversation, Peter. I'm looking forward to this discussion, just because I'll be able to throw a question out, say "Peter", and leave the two of you to figure out who's going to answer first. A really interesting point is how quality has changed. The topic of testing and quality is one of those things that's treated almost as an add-on. It's at the core of agile teams and it's at the core of DevOps, but it's very rarely given its day in the sun so we can really understand what it means to be at the heart of agile teams, in terms of building quality in, in terms of focusing as a team on quality. And if you look at DevOps, with things like automation, it's right in the middle of that. Peter, you mentioned that you've worked a lot with agile teams, but also a lot with teams that aren't agile. What's the difference?

[00:02:05] Peter Budden: I think we can look at differences, and we should, but let's also look at some things that are the same, right? Quality is essentially all about the consumer's perspective. The difference is in how we approach quality; the consumer just sees quality in terms of the product they receive. And of course, agile is about how we get to that quality in the end product. When we talk about agile teams versus more traditional waterfall teams, the difference is that instead of having one person who takes accountability for testing, we all need to be part of and present in that process. More than in any other methodology, there's an emphasis on everyone owning a part of that problem. It's not enough to just do testing. We also need to fill in the gaps: the knowledge and the skillset that testers in a traditional team would've built up. Everyone needs to understand some of that. If you look at really high-performing agile teams, I think what distinguishes them from a lot of waterfall teams is that everyone on the team has a tremendous amount of discipline. When I talk to developers on those teams, they're already at a relatively high level of understanding of all the things that testers intrinsically understand. Whether it comes from formal or informal training, they're doing good testing.

[00:03:27] So when we look at agile teams, you may have someone with a background as a tester on the team, and they may be involved, but they're not doing all the testing. The reason they can get away from doing all the testing is that everyone else on the team already knows a lot about testing. When we think about how we create a high-performing team with a cross-functional skill set, what distinguishes those who are successful from those who aren't is that everyone has that discipline. It's also about wanting to get involved in testing and understanding the value that it has.

[00:03:56] When I look at it purely from a QA or testing lens, I come to the agile world focused on that thread of quality that runs through it all. And when I look at successful teams, I see that everyone has taken a little bit of that work away from the tester and brought it into their own world, so the tester can focus on other things. Testing in an agile world is a process of coaching and mentorship. So you may have someone on the team who's a tester. One of the questions I often ask Scrum Masters is, "Hey, is the tester actually spending time with other people on the team?". Do they have the bandwidth to coach and to mentor? Are we seeing that improvement in testing over time in everyone's workspace? That's one question I find really helpful to ask.

[00:04:38] Dave Sharrock: One of the patterns I often see is this whole push of, "Oh, we've got automated testing. We've got some sort of a DevOps team, and they're over here". I have an agile team doing their agile thing, and they're flinging work over to this other testing team. I had one conversation with an organization this week where their test automation team is three or four sprints behind their agile delivery team. For me, that's an anti-pattern. It's going to trip them up, so let's stop, look at that, and bring those teams together a little bit. If there's a really good understanding, that gap between the test automation and agile delivery teams isn't going to happen. The agile delivery team gets really upset and uncomfortable, because they know that gap means there's a whole bunch of undiscovered stuff which is going to come back and kick them in the shins as they go forward. So there's an understanding and appreciation there.

[00:05:32] Peter, how do we address test automation? How do we bring these skills and technologies up? What do you see as patterns that work, and patterns that don't?

[00:05:43] Peter Maddison: I think you touched on it there a little. It's this move away from the idea that you can actually assure quality. Which you can't, really. Quality is in the eyes of the beholder. We can't assure it. We can't say, "Hey, you've got 94.3% quality now, you've succeeded. You've got it. You're done. And you're awesome". But you can start to move to a world where you engineer quality into your processes and your practice. You can start to build out the platform, the engineering, the capability to make it easier to test. Build out the test harnesses to simplify your ability to engage. I completely agree with you that the engineering piece can't be behind. It needs to be ahead of where the teams are. It's got to be engineering capabilities that enable those teams to go faster. Basically, offload some of the complexity of understanding how to glue these different pieces together, so that the automation is actually helping them. You're moving some of the complexity out into the system, into the platform, so that you can focus on the value add on top of that.

[00:06:47] Peter Budden: I think the situation you've described is really typical, and there are a couple of things underneath it that we should explore. So if an automation team is four weeks behind, what do we mean by four weeks behind? Four weeks behind what? What we hear is that the automation team is four weeks behind because we've got a target of achieving X percent coverage, using end-to-end test automation, of the new features, requirements, or stories that we are building. One of the important things to look at is that automation doesn't need to be at the end-to-end level. It doesn't need to be at the functional level. Unit testing and integration testing, and test automation at those levels, are also an important part of the test coverage picture. When people look at automation teams sitting off in their silo, part of the issue may be that they're not taking into account all of the testing that's actually going on. We know that as test automation gets more complex and covers more business features, it becomes way more brittle. So when we talk about quality and testing, one of the things I really hope people take away and think about is: am I looking at all of testing, or am I just looking at what the team that's most obviously doing testing is doing? In this case, it's the test automation team. So sure, they're four weeks behind based on a target of X coverage. What does that mean? Is it really taking into account everything that's going on?
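The point about coverage living at more than one level can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the function name `calculate_discount` and its logic are invented for the example): a unit-level test covers the same business rule that a brittle end-to-end script might drive through a browser, but it runs in milliseconds and doesn't break when the UI changes.

```python
# Illustrative only: a hypothetical pricing rule and unit-level tests for it.
# Tests at this level contribute to the overall coverage picture without
# the brittleness of end-to-end automation covering the same logic.

def calculate_discount(subtotal: float, is_member: bool) -> float:
    """Apply a 10% member discount; reject negative subtotals."""
    if subtotal < 0:
        raise ValueError("subtotal must be non-negative")
    return subtotal * 0.9 if is_member else subtotal

def test_member_gets_ten_percent_off():
    assert calculate_discount(100.0, is_member=True) == 90.0

def test_non_member_pays_full_price():
    assert calculate_discount(100.0, is_member=False) == 100.0

if __name__ == "__main__":
    test_member_gets_ten_percent_off()
    test_non_member_pays_full_price()
    print("unit tests passed")
```

Counting tests like these alongside end-to-end scripts gives a more honest picture of how far "behind" an automation effort really is.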

[00:08:16] Dave Sharrock: So many stakeholders in the development of features view testing as something linear: "I add a feature, I add a test", and therefore that feature is now tested. It's a linear challenge to them. And yet testing is so much more of a mindset about the experience that customers have. Anybody working in IT has had that conversation of "I've tested it. It works. Here, I can prove it's working", and yet the experience from a customer's perspective is "it might be working there, but it's not working where I am". I distinctly remember working on a data center migration where we couldn't get past the firewall. All the testing going on inside the data center was fantastic. It looked perfect. But everybody outside the data center had a very different experience. Understanding that testing is not a linear approach to covering all the different integrations, functionality, and features, but is actually that whole exploratory piece, right? The automation is there to get the boring stuff out of the way so that you can do the testing you really need to do, which is the exploratory, customer-experience testing. That's one way of looking at it.

[00:09:22] Peter Budden: Yeah. I really like that idea that automation is great for verifying that things that did work still work. But it's not the same as sitting down at the application and really thinking, "Hey, what might we have broken? What new tests should we be adding? How do I investigate? How do I explore? How do I think more deeply and be inquisitive about this system?" And that's the mindset of a real tester. Of course a developer can develop automation scripts. Of course an automation team can develop automation scripts, but that's not the same as testing. So when do you actually do testing? Forget the verification stuff. What time is there in the week when you sit down with nothing else on your plate, not delivering new code, not looking at features, not verifying that existing things continue to work the way they did, or fixing tests that just broke? When are you actually testing and investigating? That perspective is a mindset. And like you said, Dave, we need to do this because customers' expectations have changed over the last 10 years. What they expect from systems has just changed. Amazon and Microsoft and all these amazing enterprises who have upped the game in IT have meant that customers' expectations are relative to what else is out there. It's subjective, right? You can't just say, "Hey, I have 80% coverage of my code". That's not good enough. What matters is what the Product Owner says is good enough. It's possible to spend an endless amount of time in testing. You can never cover all the combinations, so that's not the objective. What I would like to see in all Scrum teams is regular conversations about what good enough looks like, instead of saying, "Hey, we're four weeks behind on this coverage percentage that we need to get to". The question is more: are we achieving the level of quality, and does everyone on the team understand what that quality level really means? Should we be doing more quality-related activities, more testing, or can we get away with less? And how are we spending time in that testing mindset?

[00:11:21] Peter Maddison: When we push stuff out, think of the amount of complexity. One of the things I've noticed that's caused this is that we have many, many more platforms we're testing across. Many, many more places we need to go, and so many tests we've automated across them. You can end up with a massive heat map of all the different areas: every platform I'm testing across, every browser combination, every device combination, and then all the different tests. At that point, I cannot possibly guarantee that every single little piece of that is going to work. At some point, I have to be able to make a call: "Yes, this is good enough. I'm okay living with the fact that some parts of this, I know, aren't necessarily going to be perfect, but the quality's good enough".

[00:12:00] Peter Budden: Yeah, I think what you're describing is essentially combinatorial explosion. I think everyone's come up against this at some point. It's when the number of potential combinations of things you could test is just so huge that you can never test it all. I guess that's the essence of testing: it's about risk. It's one of the things we should really learn from the traditional style of testing. We should think back to the techniques that worked then to manage this problem of risk, and make sure they're well understood. There are testing skills that are still relevant in an agile world, skills that developers should understand, and we should always make sure in agile organizations that there's someone to learn from. So if you're a Scrum Master or a Product Owner, and you're listening to this and saying, "How can I build these skillsets?", here's one way that's really easy: if you were on your team, who would you go to for advice on how to test better and more efficiently? Is there someone in the organization who can be an oracle of good testing habits? Google has been talking about this for years; they've got their "Testing on the Toilet", which has been going on for a long time. Testing skills don't happen by magic. It's a skill, and you can get better at it over time. So we should try to find ways to coach and to draw that through the organization.
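The combinatorial explosion just described is easy to see concretely. Below is a small, self-contained sketch; the dimensions and values are invented for illustration. Exhaustively testing every browser, OS, locale, and screen-size combination quickly runs into the hundreds, while a risk-based subset that exercises every individual value at least once stays tiny (this falls well short of formal pairwise coverage, but it shows the trade-off).

```python
# Illustrative only: how a test matrix explodes combinatorially, and how a
# risk-based selection can shrink it. All dimension names and values are
# made up for the example.
from itertools import product

dimensions = {
    "browser": ["chrome", "firefox", "safari", "edge"],
    "os":      ["windows", "macos", "linux", "android", "ios"],
    "locale":  ["en", "fr", "de", "ja"],
    "screen":  ["phone", "tablet", "desktop"],
}

# Exhaustive: the full Cartesian product of all dimension values.
all_combos = list(product(*dimensions.values()))
print(f"exhaustive testing: {len(all_combos)} configurations")  # 4*5*4*3 = 240

# Risk-based selection: cycle through each dimension's values so that every
# individual value appears in at least one chosen configuration.
largest = max(len(values) for values in dimensions.values())
covering_subset = [
    tuple(values[i % len(values)] for values in dimensions.values())
    for i in range(largest)
]
print(f"every value covered at least once in {len(covering_subset)} configurations")
```

Running this shows 240 exhaustive configurations versus 5 in the covering subset, which is the kind of call Peter describes: you accept that most combinations go untested, and you manage that as a risk rather than pretend the matrix can be fully covered.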

[00:13:09] Dave Sharrock: Peter, I've just gotta follow up. Testing on the toilet at Google. You're gonna have to say more about that.

[00:13:15] Peter Budden: Honestly, it's hilarious. I think it's a very Google thing, and I wouldn't recommend it be taken verbatim. I think they instituted it in 2007, when they started posting different testing techniques in the toilets on their campuses, because they recognized that testing was a major part of making Scrum teams productive. A major part of making developers productive is giving them feedback as quickly as possible. So they said, "Hey, let's create a team that can explain, coach, and advocate for good quality testing". They developed some rules by which they assessed different teams: "Hey, are you following good practices in your test automation? Do you have good practices in testing?". Then they posted things for people to read while they were using the loo, as a way to raise awareness, right?

[00:14:01] Dave Sharrock: Testing is not this unique skill that one individual has; it's something the whole team has to understand, and it comes into so many aspects of how features are developed, designed, coded, and so on. So it really becomes a topic of conversation across the team. I would say the same thing: in the best agile teams I've worked with, all of the skills are distributed across the team. There's an appreciation of how to do the different skills on the team. It's not necessarily that everyone could go build a career in each of them, but there is an appreciation of the handoffs and what's required, rather than tossing something over a fence and not worrying too much about what gets picked up on the other end.

[00:14:47] A follow-on question around that, and I know we've only got a minute or so. If you were asking our listeners to walk away with something, two or three things that they can have front of mind when they go back and talk to the agile teams they're working with, what would they be?

[00:15:03] Peter Budden: Let me start with the idea that quality isn't a given. There's no number that we put on it for all software. Make sure that you're having conversations about what quality means for you and for the product you're delivering to market. Have regular conversations about it. Don't be the team that spends too much time on testing. Don't be the team that breaks production. Talk about quality and what it means to your customer in terms that the customer would understand.

[00:15:27] Another thing to take away is that you can see skills atrophy in testing, and it's important to figure out how people on your team can get better at testing over time. Try testing yourself if you're a Product Owner or a Scrum Master. Get in there, figure out what the team is doing, and ask questions about what others on the team are doing, to get a full picture of their work and their skills. It's important, I think, in every organization to think about how testing skills are kept up and advanced.

[00:15:58] And the third thing that I think everyone should understand is that testing is a mindset. It's about asking awkward questions. Giving people the space to explore and be inquisitive is hugely important to finding the things that you didn't think of when you were designing.

[00:16:14] Dave Sharrock: Brilliant. Peter, did you want to wrap things up?

[00:16:17] Peter Maddison: I was about to do that. I thought that would be a wonderful thing to do. I think we've hit the top of our 20 minutes. So thank you very much, Peter. And thank you very much, Dave. It's been an interesting conversation as always. And I look forward to hearing some feedback from our listeners.

[00:16:31] You've been listening to Definitely, Maybe Agile, the podcast where your hosts, Peter Maddison and David Sharrock, focus on the art and science of digital, agile, and DevOps at scale.