[ MUSIC ]
HOLITZA: Good to see you both again. How are you guys doing?
BOOCH: Good to be here.
AMBLER: Doing great.
HOLITZA: I'm really enjoying these chats, these are great. And you know, I'm more of a testing guy, that's more of my background, so I'm really curious to understand how testers fit into the Agile paradigm. I've heard a lot of different things, but I want to get the straight scoop from both of you. So you talked a little bit in the last episode, the last session, about test driven development. What is test driven development?
AMBLER: Okay. So test driven development is a technique where you write a single test before you write a line of code. So I write one test and then just enough production code to fulfill that test. I can do that at both the requirements level with customer or customer acceptance tests, whatever you want to call them, and I can also do it at the design level with developer tests.
And the reason I talk about the requirements level and the design level is that because you're writing the test before you're writing the code, you're not only doing validation, you're also doing specification. So with test driven development your tests are actually doing double duty, serving both as specifications and as tests.
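The test-first rhythm Scott describes can be sketched with Python's built-in unittest module. The `Account` class and its `deposit` method here are hypothetical, invented purely for illustration: the point is the order of the steps, not the domain.

```python
import unittest

# Step 1: write a single test FIRST. At this moment no production
# code exists, so this test also acts as a specification of the
# behavior we want -- the "double duty" described above.
class TestAccount(unittest.TestCase):
    def test_deposit_increases_balance(self):
        account = Account()      # would fail before Account is written
        account.deposit(50)
        self.assertEqual(account.balance, 50)

# Step 2: write JUST enough production code to make that one test
# pass -- no extra features, no speculative generality.
class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

if __name__ == "__main__":
    unittest.main()
```

The cycle then repeats: the next test is written before the next piece of production code, so the suite and the specification grow together.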
HOLITZA: Oh, wow. That's very interesting. So I've heard that, you know, testers are passe in the Agile world. Is this true?
AMBLER: Passe is probably the wrong term, but the role definitely changes. I like to look at it as, there are two different aspects here. The first one is test driven development as a form of confirmatory testing. What we're seeing when organizations adopt Agile is that as they change their cultures, they train people and mentor them, they find that some of their traditional testers actually become developers.
And they work closely with developers because the testers will have very good testing skills, but their development skills might not be there. And similarly, the developers will probably have very good development skills but, you know, might not have the testing skills, so they need to pair more often and learn from each other.
But what we're also seeing is independent investigative testing, where some of your existing testers actually start to specialize a little bit more on testing, and what they do is they support the team as it develops.
So with continuous integration, what Agile teams do, they will constantly have working builds. And what they'll do is on a regular basis, perhaps every weekend or at the end of each iteration, they'll push their working build to the testers who will then do system integration testing, security testing, usability testing perhaps.
They'll be looking for more of the non-functional requirements, as well as for problems that just slipped through the cracks, the ones that the testers and developers on the development team didn't find.
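One way to picture the continuous-integration gate Scott mentions, where a working build is only handed off to the investigative testers when the automated suite is green, is a small driver script. This is a minimal sketch under assumed conventions: the `tests` directory name and the promotion step are made up for illustration.

```python
import subprocess
import sys

def build_is_green(test_dir="tests"):
    """Run the automated regression suite; True only if every test passed."""
    result = subprocess.run(
        [sys.executable, "-m", "unittest", "discover", "-s", test_dir],
        capture_output=True,
    )
    # unittest exits with a non-zero code when any test fails or errors,
    # so a zero return code is the signal that the build may be promoted.
    return result.returncode == 0

# A CI server would call build_is_green() after each commit, and only
# push the working build on to the independent testers when it is True;
# a red build stays with the development team until it is fixed.
```

The design choice here is that promotion is automatic and mechanical: the testers never receive a build that the developers' own confirmatory tests have already rejected.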
BOOCH: Okay, note what Scott's talking about here, that in effect this process leads you to testing early and testing often at not just the unit level, but also the integration level. Your customer who's been wrapped up in waterfall approaches probably, if they're following traditional historical approaches, probably didn't do any integration testing until much later in the development process.
And that's far too late to make any difference. As Scott was describing here, you're actually accelerating the time to do integration testing. You can't integrate everything because you don't have everything, but you begin to test what you can integrate.
AMBLER: Exactly. And this reduces your overall risk but it does require a little bit more discipline on the point of view of both the development team and the testers. And in effect you're pushing what used to be, you know, the end of lifecycle type testing is now being pushed to the front of the lifecycle where it's less expensive to actually address the defects that you find. So you reduce your risk, you reduce your overall costs, but you increase your discipline.
HOLITZA: Right, yes. One customer tended to do their integration and regression testing in the last two to three weeks. And what happened is they did their performance testing there also, so if they had any performance issues, that caused great strain on the entire team, and you know, those risks were exposed very late in the process.
And so often they'd have to either, you know, accept the risks and put it in production and fix it later, or they'd have to stop the whole thing and push the release out several weeks or a month.
AMBLER: Yes. So I guess the good news is at least they're exposing those risks, but they should've been doing it a lot earlier.
HOLITZA: Right. That's what it sounds like to me. And so what I was also wondering is, so in an Agile world, is test automation more important than it was before?
AMBLER: Yes it is. Automation is absolutely critical both from a continuous integration point of view as well as from the automated testing itself. And particularly with the independent investigative testing, we end up seeing, you know, greater levels of sophistication.
So for example, I was talking earlier about security testing. You'll use some products like AppScan, for example, to do that, because security is an absolutely critical aspect of most systems that are being developed these days.
And yet, the developers really don't have the skills to address security in an intelligent manner. So we desperately want to automate that and you know, find your security bugs as soon as you possibly can.
BOOCH: Things like security and performance, for example, are systemic issues, and whereas you can begin to test some of the local things, it really requires an integration test to truly exercise those kinds of things, which is why accelerating those kinds of larger tests is so important to attack the risks that are there.
HOLITZA: Right, right. So what kind of adjustments do organizations need to make to move into, around testing specifically, when moving to more of an Agile paradigm?
AMBLER: Well, I think there's a...like I was saying earlier, there is this movement where some of your existing testers will actually pick up more development skills and be working much closer with developers so that way they can share their testing skills with developers and vice versa.
But then there's also this move for some of your testers to become even more specialized in testing, which means they need to pick up a wider range of testing skills, be working perhaps with a slightly different set of tools, and be a little bit more flexible.
One of the things we're also seeing in the Agile world is that because we're not pushing testing to the very end of the lifecycle, there is no detailed spec, you know, that's completely accurate.
So the testers need to be more flexible and be willing to test what they would perceive as just a portion of the system, or as a partially built system, to find the bugs that are there right now so that way they can be addressed earlier.
So the end result is this greater level of flexibility ends up giving you better quality in the long run, but it does require greater flexibility and greater discipline. So this can be frustrating for some people.
BOOCH: It might also be challenging for your customer in that, in traditional approaches, testers are a group out there and the developers throw things over to the testers. So another cultural change is that the testers and the developers end up being peers, and there have to be opportunities created for them to communicate with one another as opposed to just throwing things over the wall.
HOLITZA: I was going to say that. It sounds like the testers and the developers are going to work closer together, which I think is a great thing to happen, and you know, I couldn't think of a better way to do software development...
BOOCH: Software development is a team sport, and insofar as you marginalize groups along the way, you've cut away some of the power of that group as a whole.
HOLITZA: Exactly. Well, this is great stuff, guys. I'm a...now we've talked a little bit about architecture, a little bit about testing, now let's...you know, next we get together, I'd like to talk more about how we do scaling of Agile and how it works in a large organization or distributed organizations. Sound good, guys?
AMBLER: Works for me.
[ MUSIC ] [END OF SEGMENT]