
The Testing Show: Shifting Security Left with Jeffrey Wheat

February 23, 2018

In this episode, we are joined by Jeffrey Wheat, Director of Cyber Operations for QualiTest. Jeffrey spoke with Perze Alaba, Jessica Ingrassellino, Matthew Heusser and Michael Larsen about the changing nature of security, security vulnerabilities and ways that the discussion around Security topics can be handled more directly and earlier in the software development process.

Also in our news segment, what happens when a security issue affects nearly every current version of a CPU, even when the machines are not turned on?

MICHAEL LARSEN: Hello everybody, and welcome to the show.  I am Michael Larsen, and I’d like to welcome Jessica Ingrassellino.



PERZE ABABA:   Hi, everyone.


MATTHEW HEUSSER: Good time zone.

MICHAEL LARSEN: And we’d like to welcome our guest to the show, Jeff Wheat.

JEFF WHEAT: Hi, everyone.

MICHAEL LARSEN: Matt, let’s get this show on the road.

MATTHEW HEUSSER: Great.  Well most of the audience you know (all the regulars), but Jeff might be new to you.  Jeff is a security consultant with QualiTest, and he’s a little bit more high level.  He’s not just a penetration tester, and he’s not just a hardener.  He is an individual contributor, but he also does real consulting.  So Jeff, why don’t you tell us a little bit more about what you’ve done?

JEFF WHEAT: Well, basically, I’ve got a lot of background with a lot of high-volatility networks, both from a design aspect as well as from running them from a Security Operations Center standpoint, along with a lot of policy, best practices, kind of CISO-type level work.  I’ve written a book on Wireless and Wireless Security.  Various things.  I always kind of seem to be on the new technology.  At the time, it was Wireless.  Then, it was Convergence with ATM.  More recently, it’s been Big Data and Machine Learning as they apply to Cyber, and getting to a more proactive way of dealing with things instead of always reacting to the attack.  That’s a brief summary of my background, at least for this conversation.  Also, from a QualiTest standpoint, I’m the Director of Cyber Operations globally.

MICHAEL LARSEN: Well, fantastic.  Welcome to the show, Jeff.  First and foremost:  new year, new interesting issues, and new challenges.  We’re going to get right into the News for Today.  This isn’t exactly brand new news, but it’s a good bet that a lot of people aren’t quite aware of it.  Intel’s latest Core processors might have some serious security flaws.  The particular report I’m talking about appeared on Engadget in late November of 2017.  So, if you happen to have a PC or a machine that has the sixth, seventh, or eighth generation Core chips (that’s Skylake, Kaby Lake, and Kaby Lake-R, along with Pentium, Celeron, Atom, or Xeon chips), you may be exposed to the potential for hackers to “load and execute arbitrary code outside the visibility of the user and operating system.”  In fact, you could even have this vulnerability while the machine is not even turned on.  Yay!

JEFF WHEAT: Yeah.  You know, this is going to be an industry thing.  There are two large stressors in the security world right now.  One is the Cloud (because everyone is going there, and it’s a paradigm shift), and the other is the Internet of Things (the reason being, with the Internet of Things, it’s just going to be mass volume).  When you look at the business model that goes with the Internet of Things, the downward pressure on the businesses is price point, feature functionality, and form factor.  Anytime you start pushing that, it’s kind of like that old tradeoff between security and convenience; they counterbalance each other.  The more secure it is (an air gap, say), the less convenient it is.  That’s part of what drives a lot of this stuff.  There’s such a push to get to market fast that security sometimes becomes the last point.  Now, it’s not as bad as it used to be.  This kind of scenario, with the processors having embedded issues, is kind of near and dear to anyone who’s done SCADA (Supervisory Control and Data Acquisition) systems.  You know, a lot of the remote sensory-type stuff in the past is in that realm; they were always very simple devices that just had small bandwidth and could be out in the middle of nowhere.  But being able to put encryption and the normal kind of security that you’re used to on a laptop onto those devices is hard.  I mean, your phone is more complex now than a laptop was 10 years ago.  That whole business pressure is down there to get it out fast, get it small, and get it cheap.  Security usually isn’t that cheap, so that’s usually the last thing they do.  Sometimes they just wait until someone finds it to react to it.  This is one of those deals where we’re getting better and better about finding it before the consumer has to react to it, and this is an example of that.  Again, it’s going to come down to patches.  It’s just like the recent Wi-Fi cracking issue that involved the actual Wi-Fi radio in iPhones and Androids.  It didn’t matter; it was Broadcom.  You’re going to constantly be dealing with this:  as things get faster, they’ve got to move them down to the hardware level, which is also moving a lot of the security issues down in there.  So testing, patching, all that’s important.

MICHAEL LARSEN: I should also note, “Intel has published a detection tool for Linux and Windows to help administrators and users detect if their systems are vulnerable.”  If you have a PC and you are running with a Core or a Pentium Chip, you’re affected.  Also, if you’ve got a PC that is 2015 or newer, pretty good chance you are affected.  Mac people, you’re not necessarily off the hook here either.

MATTHEW HEUSSER: That’s what confused me, Michael.  Macs use the Core Chip.


MATTHEW HEUSSER: Why are we not talking about Macs?  If it’s a hardware-level problem, “My operating system is different,” isn’t going to really protect you.

MICHAEL LARSEN: Well, a lot of it also has to do with if you’re using the Intel AMT (Active Management Technology) Firmware.  So, if you are using it, you probably have something to worry about.  If you’re not using it, you might not have to worry about it.  We should also point out there’s an article on ZDNet that’s describing it saying that, “Intel’s Management Engine Feature can be switched off,” and that’s probably a bigger topic than we can cover in just a short news segment.

MATTHEW HEUSSER: So basically, “If you’ve got Windows, go get the detection tool, see if you’ve got the problem, and if you do, we’ll tell you what to do about it.”  Is there a patch out for this thing yet?

JEFF WHEAT: I haven’t seen a patch yet, but what I’ve heard is more the category of workarounds, like you said, where you turn the management system off.

MATTHEW HEUSSER: Okay.  There will be the detection tool in the Show Notes.

MICHAEL LARSEN: By the time this show goes live—hey—there may already be a fix out for it (by the time this happens).  If that is the case, I will put a link in the Show Notes for exactly that fix.

MATTHEW HEUSSER: But I think the greater question here is:  It says that this is a weekly thing that, “Oh, no.  A large swath of computers are being compromised and problems could happen.”  My first question with Jeff is, “Do we just need to have a really good firewall and then not worry about it?  What’s going on?”

JEFF WHEAT: It’s defense in layers.  The old days of, “I’m going to build a big wall and then I’m fine,” are kind of over.  There are just way too many ways to get past a single layer.  Good security hygiene, obviously.  You know, keep the patches up to date.  The majority of issues you see out in the industry, like Equifax or whatever, are about late patches.  People are reacting much quicker with the patches.  There are a lot more bounty programs out there that are finding the zero-days much quicker.  It’s like any kind of security situation, physical or whatever:  just don’t be the soft target.  In most cases, if you’re in a situation where you’ve got some really good crown jewels that a lot of people want, then you’ve got more to deal with.  It’s the actual threat.  It’s, “How valuable are you?”  Don’t be stupid.  The scary thing about these things that are baked down into the system is that, in the security world, that’s the definition of “owned.”  If someone has gotten down into the BIOS on your laptop, it’s hard for it to be anything but a doorstop.  [LAUGHTER].  I wouldn’t use it to do my online banking.  I’ll put it that way.  You know, at that point, you’re just buying a new laptop.

MATTHEW HEUSSER: This reminds me a little bit, if I understand you correctly, of when the Anna Kournikova Virus came out (What was it?  2001?  2002?), and it was just an e-mail that said, “Hey, look.  There’s a picture of Anna Kournikova,” who was at the time, and still is, a very attractive tennis player.  It was, “Anna Kournikova.jpg,” but underneath that it was also .VB or something.  VBScript.  JavaScript.  One of those two.  You double-clicked on it, and it ran some VBScript in your browser that took over your computer.  I was working for a Fortune 100 at the time.  They had flown in the security officer, and she said that they’d fixed it; we “couldn’t possibly get hit with the Anna Kournikova Virus.”  They’d hardened the system.  Like, a week later, there was another one.  I forget what it was called.  I want to say it was heart bleep, but it was a different file type, VBScript instead of JavaScript, [LAUGHTER], and it was the same basic principle, and all of our systems were vulnerable to it.  That just amused me.  They didn’t generalize to try to solve the problem.  They blocked one particular kind of e-mail in the rules.

JEFF WHEAT: That’s the lesson here.  The most recent one was the Ransomware stuff.  It came out, and I happened to be working with some large healthcare groups, and the executive levels reacted to it:  “How do we stop this?”  I said, “Well, you stop this one by getting people to stop clicking e-mails.  However, that’s not your real problem.”  It’s just like you just described.  How they infected you with it is going to change.  It’s what it does to you.  So, in the concept of Shift Left that we’re talking about here, that gets into threat intelligence:  “Let’s look at what it did once it got to us.  Let’s look at other vectors through which the same thing could be introduced.”  It wasn’t a week later that SMB was exploited.  Then, it became a much more serious problem, because now it wasn’t just training your people not to click e-mails.  Now, it was coming in through real vulnerabilities in Microsoft.

MATTHEW HEUSSER: I’m sorry.  What’s SMB?

JEFF WHEAT: Server Message Block is what I think it breaks out to.  It’s a command protocol within Microsoft.  They found a flaw in that, and they used it to introduce the same attack.  The point being that they’re always going to find new ways to do that.  You’ve just got to start, you know, you’ve got to think like the bad guys.  You’ve got a “White Hat” and what we call “Red Hat.”  We start thinking, “Okay.  How else can we be exposed?”  That’s the whole concept of Shift Left.

MATTHEW HEUSSER: So, let me ask you a couple of questions about Shift Left then.


MATTHEW HEUSSER: Would you agree?  We’ve been talking behind the scenes about what Shift Left means, to try to nail down a definition before this meeting, and we didn’t have a ton of agreement.  Let’s use Penetration Testing as an example.  This show is mostly about software development, and sometimes we talk about the greater sort of systems delivery which sits on top of that, which can include things like customer support and how customers get their value from the software.  We used to build the software and then Penetration Test it and then go, “Uh, man.  I can’t believe there’s this capability we didn’t think of where you combine this feature with that feature and escalate privilege.  We’ve got to go lock it down.  Okay.  Now, retest it.”  We used to do it that way.  Even worse, it would be like, “Oh, yeah.  Gee, this data is not encrypted.  This data is not running over SSL.  This data is just straight HTTP.  These passwords aren’t salted.”  Whatever it is.  Shift Left includes moving security requirements into the requirements, so that we can try to build it secure in the first place and then test that those requirements are met.  Instead of, “I’m just going to find me some security bugs at the end.”  What is missing from that definition?
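To make the “passwords aren’t salted” requirement Matthew mentions concrete, here is a minimal Python sketch of salted password hashing using only the standard library; the function names are illustrative, not anything discussed in the episode:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a password hash with a random per-user salt (PBKDF2-HMAC-SHA256)."""
    salt = os.urandom(16)  # a unique random salt defeats precomputed rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, expected)
```

Stating this as a requirement up front is exactly the shift-left point: it is far cheaper to build in than to retrofit hashing onto an existing plaintext password table after a Pen Test flags it.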

JEFF WHEAT: What’s missing is a little bit of a history lesson.  If you’re in software development, you know what I’m talking about.  Waterfall Development used to be the longer process.  It took, possibly, years, depending on the complexity of the system, to actually crank out.  We moved to Agile Development.  So, you had quick, incremental Agile change.  All good.  A lot of unintended consequences, though.  In those Waterfall Development Cycles, there was always a strong systems engineering element.  There was always someone gathering the functional requirements from the business elements and translating those into technical, detailed design requirements down to the development teams.  In the process of doing that, there was usually a strong interface control document practice.  In security, especially in intersystem security, a lot of the weakness is at the seams.  The different modules, different vendors, B2B, B2C, whatever.  Where they communicate is where the weakness is, because that’s where the least amount of guidance and control is, and that’s always been the case.  It’s just become worse with Agile Development with 2-week sprints.  You’re always trying to catch back up.  Once you find the security flaw, you’re trying to go back and get it back into that sprint flow.  One, it’s not efficient to do that from a cost standpoint.  When you do that, you’re really kind of stopping your whole creative process in the Agile Development to start over and redo the security, and some of those security changes may drastically affect the end result of what you were producing.  It could be anything from performance to line speed to games or whatever.  It matters.  The customer experience matters.  With mobile apps, customer experience matters.  You know, people won’t sit around and wait.  There are click times, and you’re all part of that.  So, having the security bolted on later tends to be inefficient, like it is in any kind of system.  Building it in up front, and dealing with any sort of performance issues or anything else that the security might’ve created, is much more economical.  You’re better off staying well than getting sick and trying to get well.  It falls into that kind of scenario.

MICHAEL LARSEN: So, it seems to me that, again, we’re having this conversation of, “Security’s shifting left,” and this is a term that I know a number of people struggle with or aren’t very big fans of.  Be that as it may, it’s having security be part of the total software development conversation, “How do we get to have a say in that conversation?”

PERZE ABABA: Can I also add an extra question in that one?

MICHAEL LARSEN: Oh, go for it.

PERZE ABABA: It’s not just fostering the conversation but, “How do you expect accountability from some sort of collective responsibility now, instead of just having this one expert that does it in his or her own time across the development team?”

JEFF WHEAT: The key in what I was saying before was the systems engineering kind of role.  The role you’re talking about is exactly that role.  It’s kind of adding a new team member to the development.  When I talk about systems engineering, it’s not about just playing [unintelligible].  It’s more about keeping everyone moving in the right direction.  In this case, it would be in security.  When I say “security,” I mean an application person with security understanding.  You know, security runs the gamut of the OSI model.  I’m talking about the application layer here.  There are all the things down in your protocols or whatever.  That’s not really part of it.  It’s more about hardening up the variables that are within your code so that they can’t be used for scripting and to do multiple things.  So that systems engineering role is to be a mentor to the application team, so that when they find things that are a security risk, not only do they fix the risk, but they teach the team what it was that created the risk.  Unfortunately, Agile Development has kind of gotten rid of that systems engineering approach for speed.  It needs to be put back in.  That team member needs to go back in as a member of the team, always evaluating the security profile, but at the same time being a mentor to the other developers, so that it’s not only a “GOTCHA” or a “Check.”  It becomes, “Found this flaw.  This is the nature of the flaw.  This is why it’s a flaw.  Going forward in your code, you need to strap down the variables better so that there’s no ambiguity.”  When you’re error testing, “How clean is your code?  Is it cleaning up after itself?”  Are there things like that that can be done?  Think of it as a best practice, as we do in security.  There are best practices that you can apply to the application side, and that becomes not just doing the quick and dirty.
You know, as a coder, it’s real easy to write a simple variable with a simple way that it accepts data.  It’s harder to write it in such a way that it hammers down the format of that data when it’s possible, and that’s where you get things like cross-site scripting in applications.  It’s because you’re actually putting SQL injection commands inside what a variable is taking, and the code executes itself.  It just kind of becomes, if you’re a development team, that you have a member of that team whose focus is security but who understands coding.  It’s much more economical to do that up front than to try and re-sync a security change back into a two-week Agile Development Cycle.  For one, the individual that wrote the original code may be off on another project, and you’re asking him to reboot from a brain standpoint, [LAUGHTER], back to what he was thinking when he wrote that code.  It’s better to get with those individuals up front and have that sense of access.  In security, a lot of the Pen Testing you hear about, where we’re running the scans and we’re looking at protocols and weaknesses in protocols, is really coming from the network side up.  Whenever I build my teams out, in terms of security testing, I always look at where I’m coming at it from.  “Do you want a Pen Test?”  For a Pen Test, I’m going to bring my network types in.  They understand the network protocols.  They understand why the protocols are there and what they’re used for, and then I have people who can think outside the box in terms of, “Okay.  If that protocol did this, this might be a good way to do a Denial-of-Service Attack.  I can overload this.  Or, this might be a way to slide in on the coattails of a Protocol Command.”  So, that’s kind of the network person’s world.  But, when you get into the flaws in the applications, that’s really the computer science realm.  You know, always try to be the harder target.  You know, make them go after the softer targets.  That’s the main game plan with any kind of thing like that.
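As a sketch of the “hammer down the format of the data” point above, here is the classic contrast between splicing user input into SQL text and binding it as a parameter, using Python’s built-in sqlite3 module; the table, data, and function names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_role_unsafe(name: str) -> list:
    # Vulnerable: attacker-controlled text becomes part of the SQL statement itself
    return conn.execute(
        f"SELECT role FROM users WHERE name = '{name}'"
    ).fetchall()

def find_role_safe(name: str) -> list:
    # Parameterized: the driver treats the input strictly as data, never as SQL
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_role_unsafe(payload))  # every row leaks: [('admin',)]
print(find_role_safe(payload))    # no match: []
```

The unsafe version executes `WHERE name = '' OR '1'='1'` and returns every row; the parameterized version matches nothing, which is the kind of variable hardening a security-minded team member would catch in review.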

PERZE ABABA: In relation to that response, I really like the suggestion of bringing someone in to act as a sort of security coach, to mentor the team on, “What can be risky?  What can’t be risky?”  My question on that:  It seems like you’re suggesting the person we should be bringing in has to have balanced scales.  Not just some blue-team type of person, but someone with blue-team/red-team type skills within that team?

JEFF WHEAT: You’ve hit it on the head.  What you’re describing is basically the systems engineer.  Security in the past was always the systems engineer’s job, because they always controlled the interface control document.  They always controlled the bigger‑picture thinking, how the system itself works.  In security, the weaknesses are at the seams, where those communications occur.  That’s where the bad guy is attacking.  The man-in-the-middle attack.  All those happen at the seams.  So, you want someone who kind of understands the system, or understands how systems and communications work, and who is able to apply that at a code level.  So, yeah.  You hit it right on the head.  You’re looking for a multi‑talented individual.

JESSICA INGRASSELLINO: Jeff, this is really fascinating to me.  One of the things that you just said really kind of strikes a chord with some research that I have been doing lately.  You said something about “security thinking” or the “security mindset,” and so far we’ve been talking about tester skills or developer skills and we’ve been talking about it sitting at the computer level or analyzing the systems level, systems thinking, and working with the systems or with the networks or with the code.  Do you think that there is value in giving the whole organization a security mindset or trying to encourage an organization to think that way as they’re developing products and as they’re doing kind of early development activities, pre-code writing?  If so, how have you seen that succeed?

JEFF WHEAT: To answer your first question, absolutely.  I’ve seen it succeed better in industries where the application or the code is tied to a technology company, because they kind of get it.  Where it’s harder is where you’ve got applications for companies where the business model has nothing much to do with technology.  The key going forward (and this is going to go on a little tangent) is things like GDPR out of the EU, where they’re starting to fine people for data privacy violations.  The days of just paying a fine or ignoring it are gone, because they’re talking about real fines, 4 percent of global annual turnover, [LAUGHTER], for the company.  What I’m seeing in that space, some are making the mistake of thinking, “Oh, this is just the IT Group’s problem.”  The IT Group may be the one that exposed the data privacy issue, but it’s the data privacy issue that they’re going to get fined for.  A lot of that is targeted at U.S. multinationals that are doing business in the EU.  I think you’re going to see that paradigm shift, because, just like with Equifax, the CEO doesn’t necessarily move on and become a CEO somewhere else if he’s had a catastrophic event like that.  So, you’ve got a better listening group from the corporate level down, driving in this policy, but you’ve got to get to the Boardroom, to the C Level, and get their understanding and get their buy-in, or at least explain to them why they should care, to get that driven down for a holistic, corporate-wide approach.  Getting everyone to think “security” is hard.  Getting people to not click on links or not surf online at work is tied to all of it, because the weakest link is always the human.  In the security world, if you can find a way to get to the human, you can get in.  Where I’ve seen it work the best is when the corporate level has taken it on as a part of their ethos:  “We are going to protect our customers’ data.  We care, and we’re going to do the right thing.  We’re going to put in the right practices.”  It doesn’t mean you’re not going to get hacked, but at least at that point you’re going to pass the Washington Post Test, which is where we ask, “How does this look if it shows up in the Washington Post?”  If you’re Equifax and your CISO has a music degree (not that there’s anything wrong with music degrees), that fact didn’t pass the test, because she failed to do what she needed to do to protect 147 million records, [LAUGHTER].  That’s kind of the way you’ve got to work at the higher end, at the Board level.  Get them to push it down, because the IT Group can’t get everyone to do it.  The Application Group can’t get everyone to do it.  You have to have a congruent message throughout the organization.

MICHAEL LARSEN: So if I can jump in here real quick, as software testers primarily, what would we want to do to bone up on this conversation to be able to participate if this isn’t our primary wheelhouse?

JEFF WHEAT: The thing about security is that it’s always changing.  You’re going to have to have someone that is passionate about it, who can drive it to the team.  If you can find the attitude and aptitude in the people that are passionate about it, mentor them and bring them along, you know, and just share that knowledge.  It’s all the old-school stuff:  “Let’s get together and talk about security.  What do you have that’s new?”  It could be a blog.  A knowledge repository.  You know, at QualiTest we have a center-of-practice kind of deal.  It’s a lot of the old‑school stuff, but you still need the incentive to do that, because it is a lot of work to stay up to speed on.  So, when someone takes that on and starts taking on that role, reward them, nurture them.  Those are all the things you have to do with anything that is hard to learn.  Take care of the people that show the attitude and the aptitude.  Reach back, help the new ones.  Get them thinking.  It becomes second nature, like brushing your teeth.  It becomes what you do.

JESSICA INGRASSELLINO: A big area that I’m thinking about is, “How do we have these conversations with people, and how do we empower people to create these environments even if (say) you can’t go to the C Level?”  I mean, top-down is obviously best, because then you have complete buy-in and you’re doing that data protection at the highest level for the company, but being able to have the conversation about security mindset, and what that looks like and what it means, even on a smaller level, can still have some benefits.  So, you definitely helped me to get a little more understanding about that.

JEFF WHEAT: It certainly can, and you’re right.  There are a lot of cases where the CEO has bigger fish to fry.  He’s not going to deal with it.  But, if you can go to the leadership in the development groups, if you can get to them and get their buy-in and get them to set up technical leads, and one of those technical leads is a security lead, it’s the same sort of thing.  You have lead programmers, and you have lead testers.  It’s the same thing.  Just create the same sort of mentality of reward, as well as getting someone to step up and be that.  Find the person that wants to be that.  You can’t force them to be that.  That’s the problem:  “Whoever drew the short straw gets the security work.”  That’s not going to work.  In the security space, you can find people that will lean forward on that, and it’s really good for their careers.  It’s like any kind of multi-discipline skill, if you’re going to be able to respond to the market.  So, you can pitch to someone that it’s good for their career to take those leads, that there’s plenty of growth in this company to do that, or there’s plenty of opportunity, [LAUGHTER], out there if there isn’t.

MATTHEW HEUSSER: Well, I wanted to ask.  I think you said something really powerful earlier on, Jeff, that I wanted to follow up on, and that was, “Security as a tradeoff.”  Usually we frame security and power for the users as sort of enemies.  Security and convenience or utility are kind of enemies, and the security guys come in and say, “You have to do these things,” and the programmers go, “Eh.  Yeah.  Yeah.  Yeah.  Okay.  That’s going to make the user experience worse, but we’ll do it to make you happy.”  There’s not really a business decision, and there’s no way to quantify these tradeoffs.  It is simply the loudest voice wins, or the person that can say, “You have to do this or they’re going to take our PCI away.”  That person always wins.  In Scrum and XP, we have this idea of Spikes.  If you can conduct a Spike and say, “Yes, we could add this security measure, and it will cost you this much and it will impact performance this much,” early, instead of trying to shoehorn it in at the end and then getting yelled at because performance is slow, that could be a real game changer for a lot of teams.  I think that’s one example of what you mean by Shift Left, and to do that you need a security person who’s multi-skilled on the team.  Are you tracking so far?

JEFF WHEAT: Oh, yeah.  Definitely.  If I were to identify the skillsets within a development team that lent itself to that sort of thing, it’s the performance types.


JEFF WHEAT: Because they’re kind of getting down in the abyss.  Those are the people that are not just writing code; they’re writing code in test and writing tests.  You know, they’re tuning.  Security, in a lot of ways, is tuning.  Shutting vulnerabilities down is a form of tuning.


JEFF WHEAT: So, the performance mindset thinks the same way.  It’s like, “How can I get an extra 10 milliseconds of performance out of this?”  It’s never good enough.  For a performance type, there is no endpoint.  It can always be improved.  What I’m saying is, if you’re looking for the individual who’s probably the most mentally tuned toward the same type of environment as security, the performance types are usually the ones that fit well, because performance has a lot of the same sort of requirements that security has, in terms of always being improved.  You’re getting down into the minutiae to make it better.  It’s people who think that way.  Their gold star is getting an extra 2 milliseconds out of it.  That kind of mentality of incremental improvement is the kind of thing that works very well with security, because security is one of those things that takes diligence.  You know, it’s that old deal:  “We have to be right 100 percent of the time.  They only have to be right 1 percent of the time.”  It applies in cyber security as well.  So, your performance types within the application space usually have that same sort of mentality.  You’re worried about performance.  It’s an environment where you’re always trying to squeak out that last bit.  So, that works well from a security standpoint too.

MICHAEL LARSEN: All right.  Awesome.  I think that this has been a great discussion, and, Jeff, we want to respect your time and everybody else’s.  So, I guess this would be a good point for us to get into the Closing Thoughts and Summaries.  Of course, we want to encourage everybody to participate in our game of shameless self-promotion.  You are our guest, so let’s have you go first, Jeff.

JEFF WHEAT: Yeah.  Well, thanks for having me on.  I’ve enjoyed it.  Hopefully, you can hear me okay.  I’m fairly new to QualiTest, and we’re here to maximize the cyber offerings.  I think QualiTest is in an awesome position to do that, with our application knowledge coming from the application layers down, and we’ve had a lot of good success.  I’m at a customer site in Bangalore right now.  So, I apologize for any weirdness with the connectivity or whatever, but our opportunities are around the world and we’re excited to take them on.  I use LinkedIn as a professional link with colleagues and peers and whatnot.  It’s Jeff Wheat at QualiTest.  We’re always responding to things there.  I’m on the FBI InfraGard board and things like that, so there’s always some knowledge we’re passing out there.  The best way to find me is on LinkedIn, and I’ll answer on that as I can.

MICHAEL LARSEN: Excellent.  Fantastic.  Again, thank you so much for being with us.  Jessica, what are you up to?  It’s a new year.  Are you doing some cool things in the next few months?

JESSICA INGRASSELLINO: I’m trying to.  I have a couple of book projects that I’ve been working on developing.  So one is a Python book, and that’s contracted and ready to go.  So hopefully, by June, we’ll have a new Python book geared toward adults who are starting the language, and then I have another book that’s mid-research level.  So, that probably won’t see publication until December, but yeah.  I’m working on a couple of projects.  Of course, the PyCon Education Summit is coming up in May.  So, we are going through the process of putting that together, and it’s always really a fun time.

MICHAEL LARSEN: Awesome.  Perze, what are you up to?

PERZE ABABA: So, it looks like, as of today, I will be in San Francisco by the beginning of February for a Consumer Advisory Board Meeting.  Maybe there’s going to be a Bay Area Software Testers MeetUp by then.  I don’t know, but I look forward to meeting up with folks during my downtime.

MICHAEL LARSEN: Matt, what are you up to, man?

MATTHEW HEUSSER: So, I’m trying to be at Agile and Beyond in May, Agile Testing Days USA in June, and CAST in August.  Other than that, I’m trying to maybe do fewer conferences this year and more actual consulting.  Looks like I’m going to be in Columbus, Ohio quite a bit this year doing consulting.  Trying to travel a little bit less.  Tons of writing coming out, and keeping Excelon going.

MICHAEL LARSEN: Fantastic.  I’m deliberately keeping to a particular theme with my personal appearances, or places where I’m going to speak.  I’m making an effort to speak on Accessibility and Inclusive Design, and to that end, I will be speaking in the Toronto, Kitchener‑Waterloo area at the end of February (February 28th), and I am also giving a presentation and a workshop specific to Accessibility and Inclusive Design at STPCon, which is going to be held in Newport Beach in April.  Matt, you’ve got one more thing, so go ahead.

MATTHEW HEUSSER: Yeah.  Just that we did a lot of partnering with QualiTest last year.  It’s looking like we’re going to be able to do more of that this year.  I’m really excited about the two groups working together, and QualiTest just has such a broad reach in terms of their capabilities to deploy people, even onsite the day after tomorrow kind of thing.  It’s going to be fun.  One of the testers that we actually placed with QualiTest, Gerry Owen, is going to be on an upcoming episode of this show, and I’m looking forward to hearing what she’s up to.

MICHAEL LARSEN: Well, okay, gang, I think that’s a wrap.  Thanks so much for being with us, and we will talk to you soon.  Take care.


MATTHEW HEUSSER: Thanks, Michael.

PERZE ABABA: Thank you.