Do We Still Need the Phrase “Agile Testing”?

March 04, 08:45 AM

Panelists

Matthew Heusser
Michael Larsen
Janet Gregory
Jenny Bramble

Agile as a development practice (and by extension Agile Testing) has been around now for two decades. In that time, many changes and adaptations have been made, which raises a simple question: is the term “Agile Testing” all that relevant any longer? To this end, Matthew Heusser and Michael Larsen have a chat with Janet Gregory and Jenny Bramble about Agile Testing as a name and as a practice and dare to consider… is an Agile Testing practice by any other name just as effective?


Transcript

Michael Larsen (INTRO):

Hello and welcome to The Testing Show:

Episode 110:

Do We Still Need the Phrase “Agile Testing”?

This show was recorded Wednesday, January 26, 2022.

In this episode, Matthew Heusser and Michael Larsen welcome Dragonfire’s Janet Gregory and Papa’s Jenny Bramble to talk about Agile Testing as a name, as a brand, where it has been, where it is going and dare to consider… is an Agile Testing practice by any other name just as effective?

And with that, on with the show.

Matthew Heusser (00:00):
Well, thanks, Michael. Welcome again, everyone, to The Testing Show. We’ve got a real treat for you this week. We’ve got Janet Gregory, who is a winner of the Most Influential Agile Testing Professional Person award, a frequent presence at conferences, and a consultant who has a broad perspective on what’s going on in the industry, both from conferences and from working directly with a lot of clients. Kind of one of the holders of the keys of Agile Testing, you might argue. So welcome back again to the show. Janet, always good to have you here.

Janet Gregory (00:34):
Thank you.

Matthew Heusser (00:35):
And we’ve also got Jenny Bramble, who is director of QA at Papa. Jenny’s been around… newer to the scene, but a practitioner, directly involved, contributing with a single company at a time while being involved in the community. You’ve come up pretty fast from practitioner to manager to director. So you’ve got a more vertical perspective, I think, which we’d love to hear. Did I miss anything?

Jenny Bramble (01:02):
Sounds good to me.

Matthew Heusser (01:03):
Thanks. And of course, as always, we’ve got Michael Larsen here, the show producer, who’s been at a company that was once Socialtext for… 15 years now?

Michael Larsen (01:14):
Not quite that long, but this will be my 10th year. Socialtext was a very small company, acquired by Peoplefluent, a fairly large company. And then Peoplefluent was acquired by Learning Technologies Group in the UK. While I don’t technically work for Socialtext any longer, I work for an integrations group within LTG and Peoplefluent. I still consider it the same company because I work with a lot of the same people, but 10 years.

Matthew Heusser (01:41):
So you’ve got that longevity as the world was throwing out all of these new, cool, hot ideas. You should try this thing. You should try that thing. You got to see them ride the wave of hype and come back down in some cases.

Michael Larsen (01:53):
And in a couple of companies before that, too. So yes.

Matthew Heusser (01:57):
Oh yeah. You’ve been around forever: 10 years at one company, but 15 to 20 years of testing before that.

Michael Larsen (02:04):
30, if we want to be precise. I started my career in 1991.

Matthew Heusser (02:09):
A tiny little company called Cisco, right?

Michael Larsen (02:11):
At a tiny little company called Cisco, which was a tiny little company when I got there in 1991.

Matthew Heusser (02:16):
That’s what I’m saying. And today we wanna talk about Agile Testing. When we started talking about Agile, gosh, I don’t know, 20 years ago, we said we want to deliver software every couple of weeks. And that was ridiculous and crazy. And we want to do it where the customers don’t have to find all the bugs. So we have to radically re-envision what we mean by testing. Agile Testing supported Agile Software Development. Now it’s not everyone, but when a client typically calls me, they’re at least doing some sort of weak version of Scrum. They do stories. They do sprints. They do standups. They might not be very good. But generally speaking, you could argue the techniques that we used to call Agile Testing, today we pretty much just call “testing”. Is there value in this differentiation or should we be looking at something new?

Janet Gregory (03:12):
Well, when we first started… a little bit of background: my first foray into Agile was in about 2000. And I think actually, Michael, I started at my first company in 1991 as well. But 2000 is when I first started in Agile. And what was happening was that teams and people had no idea how to integrate testing into their Agile projects. There was customer testing and there was developer testing. Those were back in the days of Extreme Programming, and people didn’t know what to do. All these testers were out there and they were lost. That’s when we started talking about what it actually means to be a tester on an Agile team. Hence the book “Agile Testing” was born, to give somebody grounding, to give them a place, to feel like they belonged. And then we wrote “More Agile Testing”, and it was just… the world needed some place to land. I think it was a good place, but that’s how it started. That’s how we started talking about it. Whether it’s still needed today or not, I’m not sure. I’ve been thinking about that a lot.

Michael Larsen (04:24):
So this is a little parallel and I want to kind of roll with this. And I realize that this doesn’t quite match Agile Testing, but I think it has similarities, so indulge me for a second here. Back in 1990, a lot of larger ski resorts started to allow snowboarding at their establishments. Before then, it was just little tiny hills and backcountry enthusiasts. And that started what seemed like a war at the time, and that war depended on what you had on your feet. It defined your friend groups. It defined where you went. Some places adamantly refused to allow “us” on the hill (I’m a snowboarder, I’m just gonna state that; I started in the early nineties). Over time, though, those long holdouts became fewer and fewer, to where once upon a time there were a lot of places that wouldn’t allow it, and now you’re down to just a handful of places and lots of choice to go wherever you want to. My kids and their friends, some ski, some snowboard, they don’t care. It doesn’t matter to them. As long as we’re going up to the snow for a day and having a good time, that’s all they care about. They don’t have conversations about what goes on your feet other than, “Oh, do you like to ski? Oh, do you like to snowboard? Cool.” All of that previous history, all of that contention, all of that fighting is long gone. I think, in a manner of speaking, a lot of the clashing about ideologies and where Agile fits or whether Agile is the way to go or whether Waterfall is or some other combination… I think we’ve had enough time now to where we’ve experienced them and we’ve seen the benefits and the disadvantages. And now, much like we just go up on the hill and we don’t care what’s on your feet as long as we’re having a good time, I kind of think we don’t necessarily care what we call it, as long as we’re getting stuff out the door in a timely manner. Make sense?

Janet Gregory (06:26):
It does. Except I’m gonna just kinda add, ’cause I was one of those people back then that was a skier. And part of the problem with the snowboarders coming on was they didn’t know the etiquette, and they would be cutting across us who were trying to ski. And it was almost a real anger for some of those trying to figure out how to work together. Now, when you go on the ski hills, everybody knows how to work with each other. And I think that’s part of being able to let it go. I still have little moments of panic when I see a snowboarder come at me, but then I am able to relax, because I realize I actually know the etiquette of the ski hill. I kind of like that analogy because I think that’s what I see more and more in software development: people are understanding how they fit in and what their roles are. And so in that sense, it makes a good analogy.

Matthew Heusser (07:21):
So a couple funny things. Around 2003, 4, 5, 6, 7, I remember conference brochures with taglines like, “Whether you’ve got plan-driven methods or Agile, we’ve got you covered!”, in this kind of rhetoric of “you have to choose”. And I don’t think anyone’s talking that way anymore. The few organizations that still write a spec for the entire project and deliver it all at once are building medical software and are just sort of disconnected from most commercial software development. Everybody knows what everybody’s doing and we’ve got our stuff figured out and you can do your thing and I can do my thing, and sort of the methodology wars are over. So what’s next?

Jenny Bramble (08:08):
I’d like to go back to something you said about it being medical software and that sort of thing. I think there are other bits of non-Agile practice that are still out there. I work for a client services company, and it is hard to build something if the client doesn’t tell you what they want in full up front. We can do Agile within that, but there’s still like this waterfall-esque-ness to it. I came into software decently later than y’all, so I didn’t really get that waterfall-to-Agile transition. I did also start at Cisco, so I saw some of that. But as I was moving through my career, the language was all Agile. It was always assumed that it was gonna be Agile. And as I’ve gotten deeper into software, I wonder if that’s the right thing. A lot of times I feel like we’re going towards Agile because Agile is what the cool kids are doing. It’s what everyone is saying is how we should be building software. But I think there are still moments where we should step back and say, you know, waterfall works here, or something else works here, SAFe works here. As long as we don’t forget the etiquette, like Janet was saying, as long as we see the etiquette of working together as people and not just throwing things over, I think the future is a hybrid. I think the future is a toolbox where we can pull the tools that we need for this particular project out of it and say, you know, maybe this is a waterfall project.

Matthew Heusser (09:35):
We’ve done it a whole bunch of times. It’s very similar to other things we’ve done. It’s gonna take us about a month. Maybe we could just build it. You make a great point, I think. There were these sort of extremists who would say things like, “I can’t promise you anything. I don’t know anything. We’re gonna have something for you in two weeks.” And the client would say, “Well, we’ve done a similar project and we know this is at least $10 million. It’s gonna be at least four teams. It’s gonna be at least a year,” or whatever their math was. “We kind of wanna know what we’re gonna get. At least the roadmap, give us something.” So I think the expectation to have it all figured out on day one, people have moved away from that because it’s not realistic. And the expectation that “I’m not gonna tell you anything, I’ll just show you working software every two weeks, leave me alone and let me write code”… you can’t see my hands, but they’re moving toward each other. We’re moving away from both sides and trying to figure out the middle today, which I think explains some of SAFe’s popularity. That’s really what SAFe promises: you can have your cake and eat it too. I think that’s probably a different podcast, but I think that’s where they’re going. Is it fair to say that there are just a lot of testing techniques, and we can just pull the ones out that we need to satisfy the system of forces, so we can lose the labels and just talk about what we need for the project today?

Janet Gregory (10:58):
I was gonna say, one of the things is that Lisa and I have been moving away from calling it Agile Testing, for many of the same reasons we’ve just been talking about, and looking at it from a holistic perspective. So that’s the word I’m using these days: holistic testing. What do you need? Looking at your tools, looking at where you are in your life cycle: what tools are we trying to use here? What testing activities do we need at this point in time? But looking at it from a holistic perspective. I think it’s time to drop some of that because it really doesn’t matter anymore whether you’re doing Agile. If you’re waterfall and doing it well, go for it. If you’re not, there are lots of Agile practices, I guess, that can help no matter where you are. But do we need to use that term anymore? I’m not sure.

Jenny Bramble (11:50):
Janet, I love that: holistic testing. That just feels really good to say it, to have that in my brain. Thank you for that phrase. Love that.

Janet Gregory (12:01):
You are so welcome.

Matthew Heusser (12:02):
Janet did a great talk, which we can probably link to; I hope it’s on YouTube. It was in Salt Lake City, where she introduced this concept, and it kind of ties DevOps and usability and users, all of it, into reducing risk throughout the development cycle.

Jenny Bramble (12:18):
I’d like to say, I mentioned I work for client services. I used to; I don’t anymore. Now I work in healthcare, which is more Agile than client services, which is funny. But one thing that we would do is we would find that a lot of our clients were very waterfall internally. And when we’d come up and we’d be like, “Hey, we’re gonna do Agile stuff,” our waterfall clients were like, “Wow, we don’t feel good about that. That feels bad to us.” There was also an element of what Agile practices can we move into this waterfall area? And testing actually was a big part of that, because we could bring testers into requirements meetings. We could bring testers into other places and say testing isn’t just making the tippity taps on the keyboard. It’s actually, to borrow a phrase, a holistic practice.

Matthew Heusser (13:06):
Yeah. We had one large client I was working with a few years ago where there were 14 different teams on four different continents. And we said: in one full day, maybe not a business day, but a day, by the beginning of the next morning in whatever time zone you want (we picked central US), at the end of 24 hours we will give you an evaluation of product status. Call that whatever you want. Call it Agile. It’s not continuous delivery. Call it DevOps, whatever you want… Wouldn’t you want to have that? And at the time they were doing something like SAFe; they had product increments. I think they were like 18 weeks from start to deploy. The hardening sprint was two weeks. We said, how would you like it to be one day? I don’t care what you call it. Wouldn’t a waterfall shop want that too?

Michael Larsen (13:59):
I think one of the things that we can also look at here is, regardless of what we call it or how it is done, even if you have all of the rituals in place, teams ultimately start to gravitate towards what works effectively for them. I’ve had a chance to feel this because we had the more classic Scrum and Kanban environment when I was working at Socialtext, a fairly solid group. And the interesting thing about Socialtext was that it had a fairly good ebb and flow of the number of people that came through it and worked on projects and moved on to other projects. So you always had a fresh stream of people, which is a good thing. I’m contrasting this with the group that I work with right now, and this is not to disparage the group. Not at all. I think they’re very effective. What’s interesting about that group: I’ve been working with them going on my third year, and I’m the new guy. Everybody else on this team has been actively working these projects for more than 10 years. Even now, there are still so many things that I am just scratching the surface on. And even when you have an environment where Agile Testing or whatever else is there, you still need to have a corpus to communicate to people. You need to be able to say, if I wanna understand how something’s working here, where do I get that information? If you have a continuous stream of people coming in and out, that can live in living memory, if you will. But if you’re working on projects for a decade plus with a steady team, there’s a whole lot of implicit knowledge there. As the new person, I have to constantly ask, “Hey, do we have someplace where this is documented? Do we have someplace where I can go back and understand this? Or do I have to reinvent the wheel or figure something out?” And to their credit, for most of the things: “Oh yeah, no, we’ve got this over here. I mean, yeah, it was written a long time ago, but it’s still pretty good. It’s still got the basics.” For our current projects, we have documents, we have diagrams and flow charts and stuff that we can always reference and say, remember, this is what we’re working towards. What you decide to call it, whether that’s Agile, whether that’s some kind of a Scrummerfall approach, whether it’s, again, a more holistic approach to how you do things, is going to be dependent upon the group that you are working with. If you have long-lived projects and long-lived teams, there’s going to be a tremendous pooling of implicit knowledge. Anybody new that comes into that environment is gonna have an uphill battle.

Matthew Heusser (16:48):
There’s also a huge difference between doing something (stories, sprints, and standups) and doing them well. One of my frustrations in software is that we popularized the idea of exploratory testing, and there was an inflection point where all of a sudden, instead of getting a full room of people to learn about it, everybody thought they knew about it and they didn’t need to go to training on it. They had it all figured out. But when you tested with them, they just missed a whole lot of obvious bugs. They had a level of depth of knowledge that you could learn in one hour watching a YouTube video, and there was so much more to learn. And I guess that’s my concern with testing: this kind of oversimplified understanding that in order to do Agile Testing, we’re going to figure out all the things you normally test manually and automate them. I still run into that. It’s a difficult conversation because it’s a faith-based conversation, because obviously they have never really seen that work, because that’s not how it really works, or their understanding is just too simplistic.

Janet Gregory (17:58):
Yeah. But I think that happens, and has always happened, no matter where you go. It’s: what is the level of understanding? And I think it starts, first of all, with the people and how they articulate what it means. Teams say, “We don’t need testers anymore,” and fire all their testers, and then two months later realize they don’t have the testing capabilities, because they just didn’t understand what it was, and then hire a bunch back. Every person who’s from a testing background, or understands testing, is trying to articulate what that really means. And so one of the things that I try to do is to show how many testing activities there are, because I think we underestimate some of them. Exploratory testing is one piece of it. How do we do that well? One of my favorite ways is just pairing with people, sitting down with people and showing them what that might mean. Questioning at the very beginning, when you first get a feature: start talking about what are the quality attributes we need to worry about? What’s it gonna look like? How are we going to deal with those quality attributes? Start the team thinking about all of those pieces of the puzzle. And it’s only when you see all of that together that you realize how big that is, and that one person can’t do it. The whole team really needs to be part of that. But part of the conversation is articulating what testing activities we need. If we think one person by themselves can do it, that’s where a lot of times people think it must be easy, and only get that kind of testing you were talking about. Thoughts?

Michael Larsen (19:46):
I do think that a lot of times we tend to get very simplistic with this. Or, like you said before, people think that testing is easy, or that it’s something that anybody can do. To be fair, I do frankly think that. I think it is possible that everybody can do quality testing. I’m not one of those people who says, well, if you’re a developer, you don’t know how to test because it’s a different skill set. I think that’s bunk. I know some very good testers who develop and who have those very good testing skills, but they worked at it. Does an organization need dedicated testers? I would argue that maybe you don’t, but what you absolutely need is somebody who understands testing and somebody who understands quality and can help everybody else get to that level so that they can test effectively.

Matthew Heusser (20:45):
I will say that to every organization that has told me “we don’t need no testers”, I have said, bring me in for one day and let me see if I can find bugs that make you go, “We need to think about our test strategy.” Maybe you still don’t need testers, but you need to think about your test strategy.

Michael Larsen (21:00):
Correct. Yes.

Matthew Heusser (21:01):
I have made that offer before and I’ve never lost that bet.

Jenny Bramble (21:06):
I feel like we put a lot of emphasis on “do we need testers?” and the real question is, do we need test activities? There are a lot of activities that testers do that we think of as the realm of the tester. And it turns out, yeah, we’re always going to need test activities. We’re always gonna need someone to check and see if there are bugs. Maybe that’s automation, maybe that’s Bob down in finance. We’re always gonna need those testing activities. And I also feel like we put way too much of an emphasis on the activities themselves, like building automation, as opposed to test methodology and how we test and how we think about testing. That’s a soapbox there!

Matthew Heusser (21:47):
It is. Test design. It’s: given this set of inputs and transformations, this logical spec, what tests would you run? Junior people will have like one test: click the button and see that you get the error message. Mid-level people will fill the sheet up with test ideas, and it’s not practical for you to run them all. And I find the more senior people can actually balance the risk and the time, understand the time constraints that we’re under, pick the most powerful test ideas, run them, and then, given a very small sliver of coverage, make conclusions about the quality that stand up to some reasonable standard of scrutiny. I think that’s a skill set. I think it applies to all brands of testing, not just Agile Testing. And I don’t think we see enough of that. It’s too easy to drop to labels, where I go to meetings and people just say, “Oh yeah, we’re using Celery with Cucumber to build Kite, the Python stuff.” There are other people that are interested in that. I’ll leave it to them.

Michael Larsen (22:57):
I wanna introduce one more thing. As many people know, I’m also a musician outside of being a software tester. And the band that I’m currently in has gone through a little bit of turmoil, not just because of the pandemic, but because our original guitar player basically decided to step back from what we were doing. So we made the decision that we had to hire somebody new to play guitar with us. Interestingly enough, we realized through the process of doing that (remember I mentioned implicit knowledge and such), when that guitar player left, that it wasn’t just the guitar player leaving. A lot of our recording infrastructure was leaving, and a lot of our recording knowledge and skill. I mean, I have some of it and all of us have some of it, but his share seemed to be a little bit outsized, which kind of makes sense: he’s the one that did a whole lot of the pre-production work that we did. Well, that now suddenly fell on our shoulders, and so we had to learn how to do a lot of this stuff. In a sense, my drummer and I picked up the slack, and we did a whole lot of learning and experimenting and putting things in place and practicing various techniques. Later on, we actually ended up hiring two guitar players, interestingly enough, just because we felt it would thicken up the sound. There was somebody who had become available who was a full-time studio-style musician that we could work with. And what was great about working with this new player was that we came in all excited about the things that we had learned and all the stuff that we had worked on and focused on, all this new knowledge that we had. And he just looked at what we were doing and said, “You do realize you’ve got this wired backwards. Don’t get me wrong, what you’re doing works, but may I make a suggestion?” And he went in and reallocated everything and said, “Now this is gonna work a lot faster here. Let me show you something,” and he just, boom, knocked out this template that we could use. He said, “Also, hey, let me make a couple of other suggestions. I know, yeah, it’s really nice to have your cabinets all done here and this, but let me show you this technique.” And he took a couple of direct boxes and wired up these raw signals that we were recording. Oftentimes we’d go back and listen to our tracks, and all you would hear was this plink of a very clean guitar and pickups, and I thought, is this gonna be usable for something? He said, “Oh yeah, let me demonstrate why you would do this. This is your safety net. Now, if you decide that you don’t like this sound here, I’m gonna take this and let me patch this in,” and it plays through the amps just like a guitar player would. He says, “Now, if you want to sample the sound, you can do so without having to make the guitar player play it over and over again.” And I just sat there and thought, wow, okay. That’s the difference between the two of us, who found ourselves in a pinch and went and learned as much as we could, versus this veteran who has been a studio musician for 20-plus years and who makes music for corporate stuff and for soundtracks for a living. And he just said, “Yeah, you guys are working way too hard. Here, let me show you some things that you can use.” So to bring it all full circle, I think in a lot of ways we get enamored by the things that are, hey, new, shiny, cool. We can do this. We can make this work. But it doesn’t take the place of people who have genuine expertise with this (in some cases, decades’ worth) who can say, “I know it sounds fun, I know it’s shiny, but come on, you are running way too hard for what you’re getting.”

Janet Gregory (26:32):
I think that’s a really good analogy for thinking about when you have a tester on a team that is a really good tester, a good all-around tester with a lot of experience; they can help that team in so many ways. Granted, programmers are perfectly capable of testing when they choose to learn, as mentioned before, but often when you have a really good tester, their experience enables them to tweak those little things: by thinking about it early, by asking the questions when it matters, or by doing great exploratory testing and really seeing those unknowns that you just didn’t think about. To me, that works really well as an analogy. Thank you, Michael.

Jenny Bramble (27:18):
I like the idea of getting enamored with shiny things. I wonder if that’s partially where this conversation about “do we need the phrase Agile Testing?” is coming from. We’ve been enamored with Agile for the past couple of decades now, and maybe this is the moment where we step back and say, is this fulfilling all of our needs? Or are we just doing it because it’s what the kids are doing? What does it mean to get into more of a situation where we’re not talking about Agile testing, we’re not talking about waterfall-style testing, we’re just talking about testing activities that people can do and can find value from? And of course discussing what value we’re looking for from those things.

Matthew Heusser (28:05):
So Jenny, it’s about time to say goodbye. Is that your last word? And do you have anything you wanna talk about? Where you’re gonna be next, where people can get in touch with you, what you’re working on? Are you hiring?

Jenny Bramble (28:16):
Yeah. Do you mind if I pump up my conference a little bit?

Matthew Heusser (28:20):
Sure. Yeah.

Jenny Bramble (28:21):
Cool! Man, this was a really great conversation. Thank you so much for having me on. Where you can find me: I’m gonna be at Papa for hopefully a while. I am hiring, looking for a senior tester. But the next place you can find me that’s real big and real close to my heart is TSQA. Our conference is gonna be March 9th and 10th. It’s all virtual. I’ve heard the speakers are amazing, right, Janet? (laughter) We’re gonna have a whole bunch of different people and it’s gonna be really great. Tickets are on sale now. You can find them at TSQA.org, or find me on Twitter and I’ll let you know. Thanks, y’all!

Janet Gregory (28:58):
Yes. Speaking of TSQA, yes, I will be there. If I remember correctly, I’m gonna be talking about holistic testing out there as well. I’ll wrap up my thoughts here. I think that Agile has given us vocabulary to use, which has helped us: things like collaboration being key to that, transparency, letting people know what’s going on, removing the idea of handoffs. Let’s work together to make it happen. I think those are important key concepts, and that’s what we need to keep going forward. But again, one size does not fit all. Each one of us has different constraints. Every company, every team I work with has different constraints. They have different contexts. A lot of times I hear people talking about being a software tester, and sometimes that’s not enough, because we’re not only working with software. We have a full product. We have to integrate it with a chunk of hardware. And that happens more and more as we go forward. When we start talking about the methodologies, how do we get those things: fast feedback, collaboration, not waiting forever to deliver our product to our customers? That’s what we need to know. Does it need to have a name? Can we just call it good development? I’m just gonna stop there. You can find me at TSQA, but I’ve been writing blog posts on holistic testing as well. I’ve given a few talks; you can find them around. Janetgregory.ca (CA for Canada, that helps people find it) or AgileTestingFellow.com. Those are both places: on one I write blog posts on my own, and on the other Lisa and I write blog posts together. I should say Lisa Crispin; not everybody knows who she is. And with that, I’ll say thank you for having me on.

Michael Larsen (30:55):
So my last word on this: what we call it and what the words are, are secondary. The point is, we want to be able to be collaborative, be effective, and speed is also important. But I think sometimes we get bogged down with the whole idea that what’s fast, what’s new, what’s shiny is what’s best. Frankly, I don’t think that’s true. I think we need to replace some of those with what is effective. What’s prudent, even. That’s what I feel, honestly, we need to focus on. And I also agree with Janet that calling us testers isn’t enough. We are expected to, and we are needed to, do more, and we should be ready, willing, and able to do more. As far as where I’m going to be appearing next, I have been asked, and I have accepted, a spot to speak at InflectraCon 2022. I believe the conference is happening May 5th and May 6th. It’s scheduled to take place in Washington, DC, and I believe it’s going to be done as a hybrid conference. I’m hoping by then I’ll be able to appear in person and communicate with some folks in person on the east coast, but only time and conditions will tell. I am also actively working with the Pacific Northwest Software Quality Conference board to help put on the 2022 show. So expect more of me talking about that in the coming months.

Matthew Heusser (32:30):
All right.

Janet Gregory (32:30):
Thank you. Thank you so much for working with our crazy calendars. I appreciate it.

Jenny Bramble (32:34):
Thank you.

Janet Gregory (32:35):
I actually like this fast, short kind of timeline. It works well!

Michael Larsen (32:42):
Good to know.

Janet Gregory (32:45):
All right. Talk to you later.

Michael Larsen (32:46):
Thanks for joining us.

Michael Larsen (OUTRO) (32:47):
That concludes this episode of The Testing Show. We also want to encourage you, our listeners, to give us a rating and a review on Apple Podcasts or Google Podcasts, and we are also available on Spotify. Those ratings and reviews, as well as word of mouth and sharing, help raise the visibility of the show and let more people find us. Also, we want to invite you to come join us on The Testing Show Slack channel as a way to communicate about the show, talk to us about what you like and what you’d like to hear, and help us shape future shows. Please email us at thetestingshow (at) qualitestgroup (dot) com and we will send you an invite to join the group. The Testing Show is produced and edited by Michael Larsen, moderated by Matt Heusser, with frequent contributions from our many featured guests who bring the topics and expertise to make the show happen. Additionally, if you have questions you’d like to see addressed on The Testing Show, or if you would like to be a guest on the podcast, please email us at thetestingshow (at) qualitestgroup (dot) com.
