What’s new in SAP testing?

May 14, 17:37

Panelists

Matthew Heusser
Michael Larsen
Bill Hayden
Daniel do Valle
Kenneth Small
Transcript

Michael Larsen (INTRO):

Hello, and welcome to The Testing Show.

Episode 146.

What’s New in SAP Testing.

This show was recorded on Tuesday, April 16, 2024.

SAP testing is a big part of the work at larger companies out there. Still, many testers know little about it. To that end, Matthew Heusser and Michael Larsen welcome Daniel do Valle, Bill Hayden, and Kenneth Small to discuss the challenges of testing SAP software, the potential impact of generative AI on the testing process, and the upcoming SAP Sapphire conference, where they will highlight the shift from quality assurance to quality engineering in the testing space.

And with that, on with the show.

Matthew Heusser (00:00):
Welcome back to The Testing Show. If you were listening last time, we were talking about generative AI, specifically trying to apply it to create test cases and tie that into tooling, to some extent automatically. One of the most interesting use cases we came up with that was consistent and could work is SAP, which is enterprise resource planning software for large organizations. The screens are so standardized and the data model maps so well to it that predictive AI has a much better chance of success. And we wanted to talk about that more. So we got our experts, we've got three today. We've got Bill Hayden, Danny do Valle, and Kenneth Small, who I'm told prefers the name "Kenny," and we want to particularly talk about Qualitest's experience, Tricentis' experience, and what's happening in the world of SAP testing. It might be interesting to you, and maybe it might impact you. Let's start with Kenny. Welcome to the show.

Kenny Small (01:04):
Thank you. Happy to be here.

Matthew Heusser (01:06):
Pleased to have you here. We've got a pretty international show. You came from the United Kingdom, rose through the ranks on the supply chain side, and then joined Qualitest, and you are head of the US SAP and enterprise practice for Qualitest. Did I get that right?

Kenny Small (01:23):
Yes, correct. So originally from London. I got asked to come over and help out SAP with the first ever S/4 global finance implementation in New York, and then I never left. So yeah, I'm now coming up on eight years in the United States and taking care of the enterprise practice, looking at all enterprise technologies, because SAP is our core and our focus and one of our main partners. But as we know, it's all about how it integrates into the wider landscape, and we look at providing that business assurance across the board. Upstream, downstream, all the messages coming in and out of SAP. And given the scale of it and the scale of the organizations we work with, it's quite a complex task. So that's what myself and my team do, in collaboration with a lot of our partners like Bill and Danny.

Matthew Heusser (02:07):
It's kind of its own little island. In the mainstream, when you hear about software development, you hear about Google or Microsoft, all of those things. We don't hear so much about this little empire that is the enterprise resource planning world. I think we should. Maybe we can get some cross-pollination today. To help us do that, we've got Bill Hayden, an Architect for Alliances at Tricentis. Bill is an actual rocket scientist. He's got two aerospace engineering degrees from Georgia Tech, but he's been in software pretty much his whole career and was a Qualitest person. Now he's a Tricentis person working on partnerships and alliances, and they've actually got some tools specifically that help with SAP. Welcome to the show, Bill.

Bill Hayden (02:56):
Thank you for that great introduction and for pointing out the complete irrelevance of my educational background. But like we used to say at one of my former companies, "Testing is not rocket science, but just in case it is, we've got you covered." As you pointed out, I've been in software, and testing in particular, for almost 25 years now. So the gray hair you will see on my avatar is actually very well-earned. I've worked in several organizations across the entire ecosystem, both on the sales and delivery side, and had the great honor to work directly with Kenny for a year and a half at Qualitest when I ran the Southeastern business unit there. I started my automation and performance engineering career in the late nineties in the build-up to Y2K, and really started delving into SAP testing in particular around 2004. When word of the S/4 migration first started to break, they told us more money would be spent on that, testing wise, than on Y2K, and of course for every testing organization on the planet that was music to our ears, hence we dove in headfirst and began a whole new division

(04:02):
at the organization I was at at the time, specifically around SAP testing. It's been very interesting to me, over the decades that have ensued, how our industry has grown and changed, particularly the speed at which change is occurring. I like to say that today is as slow as it's ever going to be again. It's only going to get faster and faster. It's our job to allow organizations to undergo that change and completely de-risk it. So when people ask me what Tricentis does in a nutshell, I say quite simply, we de-risk change, through our founder's founding principles and the efficiencies he's injected into the process.

Matthew Heusser (04:43):
Well thanks for that Bill. Glad to have you here. And finally, we have Danny do Valle. Did I get that right?

Daniel do Valle (04:49):
That’s pretty good.

Matthew Heusser (04:50):
Okay, great. Welcome to the show, Danny. So Danny is a solution architect for Tricentis. He's in Texas now, originally from Brazil. He has two degrees, including an MBA and a degree in computer engineering, from schools in Brazil, and he speaks Portuguese and Spanish. I don't know if we're going to get into the international flavor of some of these projects, but he also spent some time at Nokia. How do I introduce you, Dan? You've done so many things.

Daniel do Valle (05:20):
Yeah, it seems like I have. It's been 30 years in IT, and the majority of this time has been in quality, in testing. I started as a tester. I call myself a tester today. I see things being built and I'm the one who needs to break them. My wife absolutely loves that. Really, my entire career has been looking into software and how we can make it better, how we can improve it. I have had the pleasure of working with Kenny and Bill over the last many years. Throughout my career I've seen a lot of different types of software. I started in a very unstructured type of work in the financial markets, testing for the New York Stock Exchange and many of the exchanges around the world that build very in-house software, then moving into more structured software when SAP started to show up. Just to give your listeners some perspective, nine out of 10 large companies (Forbes 100, Forbes 1000) use SAP today. It's embedded in our world and it's going to increase even more. Just this last week I was at two events in Canada with SAP, with many clients, discussing this new thing, AI, that everybody wants to talk about: what are the expectations, what are the actual realities, what are the actual products that we have available, and how can we help? Bill, Kenny, and I have been doing this for quite some time and hopefully can provide some insights into this world and help people understand it a little better.

Matthew Heusser (06:47):
Welcome, Danny. Before we dive into the challenge that people have with SAP, we should spend a minute on what SAP is. I would say any large organization has things they want to count and things they want to track: things in the warehouse, planning, schedules, timelines for operations, HR components. There are payroll components. Most large companies have some significant subset of them, and they want to customize it to add whatever things they want to track. The broad name of the software designed to get all that data in one place is Enterprise Resource Planning, or ERP. Is that a decent introduction to SAP if someone's unfamiliar but has listened so far?

Daniel do Valle (07:26):
I think so. Now mind you, again, when we talk about large implementations of SAP, you could consider the core software to be about 80% of the complexity. But make no mistake, in every SAP implementation you have, on average, 15 integrations with other tools. So whether you're talking about one ERP or another, you're actually talking about one specific problem that's embedded in a plethora of many other complexities and silos that all need to be taken care of and accounted for. And this helps us to introduce the tools and processes and experiences to help clients overcome those challenges.

Matthew Heusser (08:11):
I think that leads right into the first question I had, which was what are some challenges people have testing SAP software? Why is it hard? You’ve got this software everybody has, everybody uses it, it’s just a database with a front end. What’s so hard?

Bill Hayden (08:24):
Well, I'll go back to that pace of change that we've been talking about. You think about a few years ago, there might've been one or two SAP upgrades in an organization each year. It was really easy to borrow the accounting department for a weekend, throw pizza at 'em, and say, "Tell me if we got this right." Well, once you transition to S/4HANA, and I can take one example of our friends at Bose, they went through 57 changes in a single quarter. I can't borrow the accounting team for that much time, let alone run a full regression like we used to. So I want to go back to one of our founder's guiding principles for efficiencies. What if we could do less testing but more intelligently? We need a way to really optimize the testing effort up front just to keep up with that pace of change.

(09:11):
And if we’re going to automate, we have to do so in the most maintainable way possible. We can’t spend more time fixing our automation than running it, which was always the bane of any script or code-driven approach. And if we’re going to performance test, why do we have to reinvent the wheel? Why can’t we leverage what we’ve had in the functional arena for performance as well? Those touch on a few of the challenges all stemming back to that pace of change. And oh, by the way, if SAP doesn’t work, as Daniel was just saying, your organization does not fundamentally function.

Michael Larsen (09:43):
So that's a good place for me to jump in here. So I want to ask this: again, we're assuming everybody who's listening to this podcast knows what SAP is, and as we were joking before the show started, you know somebody doesn't know what S-A-P is because they call it "sap," correct? So with that, what is… and again, I'm approaching this from the perspective of: I'm a tester like many others out there, and currently we're looking at this world and this market where a lot of people are looking for work and asking, "Where are ways that I can apply the skills that I have?" And SAP shows up in a lot of job listings. So if I don't understand or know what SAP is or how to interact with it, why would I want to know what it is, for one? And two, if I'm not somebody intimately familiar with the SAP universe, how much of a disadvantage am I at if I'm hoping to break into this?

Kenny Small (10:46):
I actually get asked this by my friends all the time: "What do you actually do?" I try and explain it as best I can, but I think what we don't realize is that SAP touches us. Danny touched on the organizations that use it, which are all of the large ones. It touches us, or we touch it, in our everyday interactions. So if you go to the website of a large fashion retailer and you buy some sneakers, SAP is the brain behind getting that to your door. When you go on a website and place that order, it creates what we call a transaction log. That transaction log gets sent to SAP with all your details, your address, what you've ordered, the quantity, and it makes sure that that product gets to your door. And we call that order to cash. That's the process that we start with.

(11:26):
And within that, there are subprocesses, but it is the brain of an organization. Sticking to the retail example, it's the brain that actually enables you to take that website product that you love, get that order, and get it to your front door on time. And now, what about change? For me, that business change is about responsiveness. Business needs to be more responsive to their customers. So it's not just, "Hey, I want this order at my door in a week," it's probably got to be the next day now. And I also want updates every five minutes, if that's the case, on where that order is and what time it's coming; I want the ability to go and change that date if I'm not home. So SAP is the brain behind all of that, enabling it to get to your door. So I think you asked the question about what experience you need to have or how you can get into it. Once you understand that it's everywhere, and it's all of the processes that help you purchase goods, whether they're physical goods or even a subscription, you use it every day. And once you apply that thought process, I think everyone can get into the industry and leverage that business knowledge. One of our taglines actually is that it's all about business processes, not just technology. SAP's a technology, but it's really there to drive that business process. And sometimes we just don't realize how much we use those business processes in our day-to-day life.
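
To make the "transaction log" idea concrete, here is a minimal sketch of the kind of order payload a storefront might hand off to the ERP to kick off order-to-cash. The field names and structure are invented for illustration; this is not an actual SAP document format or API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OrderLine:
    sku: str          # product identifier, e.g. the sneakers from the example
    quantity: int
    unit_price: float

@dataclass
class SalesOrder:
    order_id: str
    customer_name: str
    shipping_address: str
    lines: List[OrderLine] = field(default_factory=list)

    def total(self) -> float:
        return sum(line.quantity * line.unit_price for line in self.lines)

# A storefront would serialize something like this and send it downstream
# to the ERP, which then drives picking, shipping, and invoicing (order to cash).
order = SalesOrder(
    order_id="SO-1001",
    customer_name="Jane Doe",
    shipping_address="123 Main St, Anytown",
    lines=[OrderLine(sku="SNEAKER-42", quantity=1, unit_price=89.99)],
)
print(order.total())
```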

Bill Hayden (12:35):
And as Danny pointed out, those business processes don't just live in SAP. He mentioned an average of 15 integration points. So another challenge, going back to the previous point, is how do we do that full end-to-end testing across all those integrations? We need a platform that's capable of handling testing against any interface at the UI, API, or data level. Great example: there's a restaurant company in central Florida. They own a chain of Italian restaurants. Just the process of going online and ordering spaghetti touches 27 different systems, everything from front of house to back of house to accounting to the loyalty program, all the way down to curbside pickup. So that's what we mean by true end-to-end testing. And those integration points, from a testing perspective, are critical.

Matthew Heusser (13:23):
So then, you mentioned the subject matter experts are too busy to define the tests. They don't even want to spend a lot of time reviewing them. They're certainly not going to do testing. There's probably a limited budget. It's one of those situations where you're like, "No problem. Let's hire a bunch of contractors." That might not work. No, no, we want an innovation, not to spend more money. So what can we do? You've got 27 integration points. What can we do to make sure we're ready for go-live?

Daniel do Valle (13:50):
So I guess the best way, really, is: how can you take the knowledge that is in people's heads and distribute it using technology and using innovation? And that's what tools are for. When we say automation, that's what we mean. We mean making things faster, making things more productive for people, so that the accounting team, as Bill was saying, is focusing on their accounting things. Those very complex processes are no longer only in people's heads but are documented in a place that is reusable and scalable, so that when changes happen, they can be reproduced. That's basically the whole concept. It's acquiring this knowledge, capturing it, and making it reproducible. Just to give you context: because of all those changes that are happening, and whether people acknowledge the challenges in the SAP space or not, still 80% of the testing of applications, particularly in the SAP space, is done manually, which means people are going to spend a lot of time, they're going to spend a lot of money, and this is going to impact their business.

(14:57):
And today it is all about the bottom line. It's about speed. It's about efficiencies. And when you look at that, what you want to do is to utilize something that can increase those efficiencies and increase your productivity. This is what our tools help with, but they need someone who knows how to use them, how to implement them, and who knows the business. And this is where Kenny and his team come full circle into our solution, because it doesn't really matter if you just know the actual tool; you actually need to know the business processes. SAP is extremely complex. You were asking, "Why should I care about SAP?" Maybe you don't. But if you are in a large company and you want to learn how those companies work, you're going to understand how those very complex processes work and you're going to become a very unique specialist in that field. You won't be able to be a finance guy and then also work on materials, on inventory, and things like that. Everyone will have their own specialty. Now, how can you document all of this in each of these worlds? That's where we come in. So it's the junction of technology, product, and software with business knowledge, in order to turn those challenges into business outcomes.

Bill Hayden (16:11):
It's so challenging to get that business knowledge out of the heads of the experts and into the hands of the people creating the tests. One of the places I traditionally struggled with other approaches was having a team that understood SAP very well and understood their automation tool very well, but they may not know how it was configured for, say, a food services company. So we always ran into that issue where we would ask the experts, "Can you document your steps?" And they are already working 10 to 12 hours a day. They don't have time for that. But now let me paint a scenario where we're testing a change coming into an SAP system. First of all, with a combination of Kenny's team's expertise using our platform, we're actually able to do an immediate analysis on what changed. Therefore we're limiting the scope of the testing right out of the gate.

(16:59):
We can then take it a step further and do what we call a depth analysis to really minimize the number of levels we have to go into SAP to touch every single object that did change, further minimizing the testing scope. Now, we have the ability to look at our test repository and say, “What tests have we already written that are germane? Oh, good news! 70% of what we need to run is already written. I just ran it for you. Here are the results. But that other 30%, that’s where we need to get the expertise from.” We can’t do that necessarily with tools that will rely on coding because if I put that in front of a business user, they’re not going to understand a thing that they’re looking at. But with platforms like Tosca that are very visual and model-based, I can put a Tosca test in front of a business user who’s never seen Tosca before and they’ll understand exactly what they’re looking at and say, “Oh no, you should have searched for the vendor code in this way and you’re using the wrong data at this place,” really getting that knowledge.
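
As a rough illustration of the "what changed, what is already covered" bookkeeping Bill describes, here is a small sketch. The object and test names are made up, and this is not how any Tricentis product works internally; it just shows the core idea of splitting changed objects into tests you can run now versus gaps that still need an expert.

```python
from typing import Dict, Set, Tuple

def plan_regression(changed_objects: Set[str],
                    test_coverage: Dict[str, Set[str]]) -> Tuple[Set[str], Set[str]]:
    """Split the work into tests we can run immediately and objects
    that still need a test written for them.

    changed_objects: SAP objects touched by the incoming change.
    test_coverage:   existing test name -> objects that test exercises.
    """
    runnable = {name for name, objs in test_coverage.items()
                if objs & changed_objects}
    covered = set().union(*(test_coverage[name] for name in runnable)) if runnable else set()
    gaps = changed_objects - covered
    return runnable, gaps

# Hypothetical example: three objects changed, two already have tests.
changed = {"VA01_pricing", "MIGO_goods_receipt", "Z_custom_invoice_exit"}
repo = {
    "order_to_cash_happy_path": {"VA01_pricing", "VF01_billing"},
    "goods_receipt_smoke": {"MIGO_goods_receipt"},
}
run_now, needs_authoring = plan_regression(changed, repo)
print(run_now)           # tests to execute right away
print(needs_authoring)   # where we still need SME input
```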

(17:53):
But for those really esoteric places that Danny was talking about, what if we have no idea even where to start? Well, we also have the capability to look at production through our platform and say, "Oh, well, I see Joe is the top user for this particular transaction. Let me go schedule time with him to sit down." And even if he doesn't have that time, I can use a capability like our automated recording assistant to capture his activities and create Tosca tests in that manner. And… just a little teaser, maybe in the very near future, we're going to have generative AI help out with that challenge as well. But being able to have the test in such a place where we can get the business input, where they can look at it and really… we call it democratized automation… in some cases maybe even participate in putting it together, as well as intelligently limiting our scope of testing. That's how you tackle that problem.

Michael Larsen (18:46):
Excellent. Now there are a couple of things that we can throw in here, and I'm probably bunching a bunch of this stuff together, so forgive me if we need to split this out. You just hit on the fact that AI is going to have some effect on the testing for this. So maybe we can touch on that a bit. And I also understand that there is a conference coming up in June. You can correct me if I'm wrong on that, and I believe the conference is called Sapphire, and you can correct me if I'm wrong on that too. But the point is, maybe there's a little bit about what's going to be talked about at Sapphire, and I'm assuming AI is probably going to be a strong component of that conference. What's some interesting stuff going on in the SAP world? And I'm guessing that Sapphire is part of that.

Kenny Small (19:36):
Yes, so Sapphire is the event where SAP brings together all of their partners and their customers, and it's really about sharing ideas, seeing what's new, and generating excitement and collaboration. We've been speaking to partners, we are going to be there, I know Tricentis are going to be there as well, and we've been engaged with our partners. No surprise, the key topic that always comes up is AI. What can we learn about AI? What's new happening with AI? We're even speaking to SAP about this themselves. So I'm personally excited to see what all of the partners are doing and investing in, particularly Tricentis and some of the other guys we work with. What I think is going to be exciting and new, from a quality perspective, is shifting the whole fundamental strategic value of quality from quality assurance, where we assure the product and try and break it, to quality engineering, where we start to really integrate technology, the technology that Bill and Danny have spoken about, into that quality life cycle. Engineer it from the beginning.

(20:35):
So we're not waiting to find the answer to, "Does this work?" We are now predicting, so we're going to be more about predicting and preventing issues and defects than detecting them. That is a total mindset shift for me in the quality space: we are now going to become a strategic arm and partner. Testing's always been the child in the room; when time gets cut in a project, it's the testing time. But now we can allow for that because of the advancements in AI and the integrations, and change it to more of an engineering approach. It's about leveraging the platforms that Bill and Danny have spoken about. Traditionally, we would have to do all of these activities ourselves. Bill touched on it when he said, "What changed?" Looking at what to test, that's my key focus. It's not just about, "Can we automate and accelerate?" Yes, of course, we do that, but the first step we should be looking at is, "What do we actually test?" And then we hear the same old feedback and statements of, "I don't have my SMEs, they don't have time, they don't have the bandwidth." So how can we integrate technology and engineering to answer that question upfront and start baking it in and being predictive about where we focus using AI? Those are some really cool things I've seen out there in the market.

Daniel do Valle (21:43):
I think really the excitement that we're expecting this year is basically around AI. If you read the latest Gartner report on AI, one of their predictions is that 70% of companies will be investing in or have initiatives on AI by next year. We're not talking about three or five years, we're talking about 12 months. And what we are expecting, and what we're seeing on a day-to-day basis, is this very high demand on the subject. What Bill does very eloquently is talk about this subject: how can you bridge the gap between this expectation and the reality? And what I hope people do is go past this curve of high expectations, the hype, and come down to the ground and understand what is really available. This is what Tricentis tries to do, to make those tools available right away and help our customers.

(22:39):
We've had AI in our products for many, many years. Our products include those AI tools, machine learning concepts, and smart algorithms, but what is new is the new form of AI, what we call generative AI or large language models. And there we have several offerings. Bill, we were just discussing this copilot concept, which is an assistant. In other words, you just translate something that would be technical into something that's more natural language. In your example, you have a restaurant with this highly complex flow for how to sell one particular dish. You can just ask the AI how many of those we sold last year, or last quarter. The AI will give you the response without you having to enter a technical key or transaction to produce that report. So I think it will allow people to use the tools that they currently have in a smarter and more proficient way.
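
The copilot idea Danny describes boils down to translating a plain-English question into whatever technical query the system actually needs. Here is a trivial, rules-based stand-in for that translation step; a production copilot would use a large language model and the system's real schema, and the table and column names below are invented.

```python
import re
from typing import Optional

def question_to_query(question: str) -> Optional[str]:
    """Turn 'How many <item> did we sell last quarter?' into a SQL-like query.
    A real copilot would hand this to an LLM along with the system's schema."""
    m = re.search(r"how many (.+?) did we sell last (quarter|year)", question.lower())
    if not m:
        return None
    item, period = m.groups()
    return (f"SELECT SUM(quantity) FROM sales_orders "
            f"WHERE material_desc = '{item}' AND period = 'last_{period}'")

print(question_to_query("How many spaghetti dinners did we sell last quarter?"))
```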

Bill Hayden (23:41):
I really think that the fundamental job of the quality engineer, to borrow Kenny's theme, is going to become more exciting now that we've entered this phase around generative AI. In fact, if you look at how the market responded to generative AI compared to how the market responded to Netscape in the nineties, it almost exactly mirrors itself. And yet I think we might be on an even more accelerated curve, but we do have to be aware of the hype. These tools are not perfect, nor are they meant to replace us. They're meant to augment and add productivity, and that's where our challenge is going to come in. We're also going to have to adopt a bit of a skillset around quality control for what the generative AI produces. Just because it created a manual or automated test doesn't necessarily mean that manual or automated test has provided any value. We're absolutely going to need people in the loop to do that.

(24:37):
In fact, I would argue that the role of a quality engineer has become even more important. Think about how much generative AI is being used on the development side. Because of that, our workload's about to quintuple. We're going to need efficiencies in order to tackle it. We're going to need our own generative AI capabilities, but I see us entering a new golden age for the quality engineer. Ten years ago we said everyone needed to know how to code. Now it's going to be that everyone needs to learn how to prompt engineer. And I think you'll see some offerings discussed at Sapphire that we'll be unveiling to the public, perhaps even for the first time, around these copilots. Everything from "How can I create better tests?" to "What does this existing suite do?", all the way to the generation of tests, which will require us to be in the loop to say, "What value did that truly add?"

Matthew Heusser (25:27):
So that kind of brings me to the tool space, and I'm sure you know far more about this than I do. SAP has a methodology, which I don't know much about, called Activate, which is the way of thinking about and doing software development on SAP, and there are also at least three tools in the mix now: LiveCompare, Tosca, and the generative AI copilots. Your point, Bill, that even if testing doesn't want to change, it's going to have to, because development's going to get supercharged by the copilots, I think is really interesting. Can you help me out with some of these terms and ideas?

Bill Hayden (26:06):
Oh, certainly. So first of all, let's go back and talk about some of the platforms you just mentioned. You mentioned LiveCompare, which I believe is the starting point and building block, from a platform perspective, of anyone's S/4 transformation journey. It is marketed primarily as intelligent change impact analysis, but it's really a Swiss Army knife with about 15 different capabilities that everyone needs to help efficiently manage their SAP testing process, everything from ABAP unit code testing to data audits to test impact analysis to security and operational impact analysis. But at its heart, it's change impact analysis. It's Wolfgang's first idea: "What if we could do less testing more successfully?" And it goes back to that theoretical scenario I laid out around that food service company, where we talked about a change coming into the environment, optimizing the amount of testing, and looking into the test repository to figure out what needs to be run and what needs to be created.

(27:06):
So that's our LiveCompare product. Tosca is our enterprise flagship test automation platform that works on a completely different principle than any code-based approach. It is a visual, model-based approach to test automation. Now a model, quite simply in this case, is a screen in my SAP system, very visually laid out for me with all the objects that I could either drive or validate against. If I'm doing API-level testing, a model would be a request/response pair. Down at the data level, a model is a SQL statement and the returning dataset. But it's represented in the tool almost like a Lego block, because we can simply drag and drop them in the right order and build our test flows thusly. We're not sacrificing any flexibility in terms of logic flow. We still support branching of logic, looping, et cetera, but all in a very visual format.
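
A bare-bones way to picture "models as Lego blocks": each model names a screen and the fields it exposes, and a test is just an ordered list of modules plus the data to feed them. This is purely illustrative Python, not Tosca's actual object model, and the screen and field names are invented.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ScreenModel:
    name: str
    fields: List[str]        # objects on the screen we can drive or verify

@dataclass
class TestStep:
    model: ScreenModel
    inputs: Dict[str, str]   # field -> value to enter (or to expect)

# Models are scanned once and reused everywhere; tests just compose them.
create_order = ScreenModel("VA01 Create Sales Order", ["customer", "material", "quantity"])
order_confirm = ScreenModel("Order Confirmation", ["order_number"])

order_to_cash_test: List[TestStep] = [
    TestStep(create_order, {"customer": "1000", "material": "SNEAKER-42", "quantity": "1"}),
    TestStep(order_confirm, {"order_number": "*"}),  # verify that some order number appears
]

for step in order_to_cash_test:
    print(f"Drive screen '{step.model.name}' with {step.inputs}")
```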

(27:54):
Going back to that point of getting the business to look at it and say, "Yes, that's right," or, "I need your input to help me make it better." That is our Tosca platform. And, oh, by the way, it works the exact same way on over 170 different interfaces. We like to say from mobile to mainframe. So you don't need multiple tools and multiple teams. You can accomplish that 27-system end-to-end test with one platform. Where is AI going to supercharge this? Danny already mentioned a couple of places where it's been in the tool for years. Our Vision AI driver is a revolutionary new type of UI test automation that works almost like facial recognition for UI test automation. Basically, it doesn't matter what an object's written in; all that matters is how it appears on the screen. In fact, it's so powerful, you can give me a wireframe or a PDF sketch and I can build the automation from it, and it'll work on the live system once it goes live. We can automate through Citrix interfaces,

(28:49):
RDP hops; we even did it through a Zoom call. Another area where machine learning plays a huge part is the self-healing that's built into Tosca. We talk a lot about that maintenance problem. The visual, model-based approach already helps us get ahead of that maintenance problem and sustain really high levels of automation coverage and keep up with that pace of change, even at Bose, where they had 57 changes in one quarter. But self-healing takes it to the next level in that the tests can literally update themselves. If a "Continue" button changes to an "Enter" button in SAP, it can catch that, handle it, and update itself. Now, large language models, where are those going to play? Initially, you're going to see some offerings around our Tosca platform that center on three areas: test optimization, quality insights, and expertise and guidance. That expertise and guidance is what you think about when you think about any copilot: being able to ask a situationally aware question and get the information you need, and being able to optimize a large suite of execution results to quickly get to the root cause, which dovetails as well with what Kenny was talking about in terms of being predictive about root causes, and being able to optimize the testing you're doing up front. And then, as our… call it phase 1.5… what we do have on the board is the automatic generation of Fiori tests based on the information available to the system.
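
The "Continue button becomes an Enter button" case can be pictured as a fallback match: if the recorded control no longer resolves, look for the candidate on the screen that shares the most other properties and adopt it. This is a toy sketch of the idea, not Tricentis's self-healing algorithm, and all identifiers below are invented.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Control:
    control_id: str   # technical id the test originally recorded
    label: str        # visible text, e.g. "Continue" or "Enter"
    ctype: str        # "button", "input", ...
    position: int     # rough order on the screen

def heal(recorded: Control, on_screen: List[Control]) -> Optional[Control]:
    """If the recorded control vanished, pick the candidate sharing the most
    secondary properties (type, position) and adopt it going forward."""
    for c in on_screen:
        if c.control_id == recorded.control_id:
            return c                                   # nothing to heal
    def score(c: Control) -> int:
        return (c.ctype == recorded.ctype) + (c.position == recorded.position)
    best = max(on_screen, key=score, default=None)
    return best if best and score(best) > 0 else None  # None -> flag for human review

# The recorded "Continue" button now renders as "Enter" in the same spot.
recorded = Control("btn_continue", "Continue", "button", position=3)
screen = [Control("btn_enter", "Enter", "button", position=3),
          Control("btn_cancel", "Cancel", "button", position=4)]
healed = heal(recorded, screen)
print(healed.label if healed else "needs review")   # -> Enter
```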

(30:13):
Now we're not stopping there. I do want to preview the concept that our CTO calls autonomous testing, just a little glimpse into the future. Being able to generate a test from requirements, epics, user stories, is one thing, but what if I had a mobile test and I simply told the system, "Turn on the power saver on this mobile device." That's all I did. The system can take a look at the screen, understand what navigation needs to occur, auto-build the Tosca artifact on the fly to go to the next-level screen, analyze that screen, understand what needs to occur, and auto-build the Tosca automation to go one level further down, until it gets to that switch to turn on the power saver. All I did was tell it, "Turn on the power saver."

(31:00):
Now imagine how that’s going to revolutionize automation once that becomes reliable. We’re defining tests at a higher level. It didn’t matter if it was iOS or Android, it doesn’t matter if today it’s five screens and tomorrow it’s 20 screens. It figures it out. That’s the future and promise to look forward to and also the hype to guard against because we are not there yet. But I just wanted to give everyone a preview of where things might be headed.
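
In spirit, the "turn on the power saver" scenario is a goal-driven observe-decide-act loop. The sketch below fakes both the screens and the decision step with plain Python so the control flow is visible; a real system would put a vision model and generated test artifacts behind those two pieces, and the screen map here is entirely made up.

```python
from typing import Dict, List, Optional

# Hypothetical screen graph: each screen lists the tappable items it shows
# and where each one leads. A real agent would discover this by looking
# at the live UI, not from a hard-coded map.
SCREENS: Dict[str, Dict[str, str]] = {
    "home":     {"Settings": "settings", "Camera": "camera"},
    "settings": {"Battery": "battery", "Display": "display"},
    "battery":  {"Power saver": "power_saver_toggle"},
}

def decide_next_tap(goal: str, visible_items: List[str]) -> Optional[str]:
    """Stand-in for the model that reads the screen and picks the next action."""
    keywords = {"power saver": ["Power saver", "Battery", "Settings"]}
    for item in keywords.get(goal, []):
        if item in visible_items:
            return item
    return None

def run(goal: str, screen: str = "home", max_steps: int = 10) -> List[str]:
    steps: List[str] = []
    for _ in range(max_steps):
        if screen == "power_saver_toggle":
            steps.append("Toggle power saver ON")
            return steps
        tap = decide_next_tap(goal, list(SCREENS.get(screen, {})))
        if tap is None:
            steps.append("Stuck: escalate to a human")
            return steps
        steps.append(f"Tap '{tap}'")
        screen = SCREENS[screen][tap]
    return steps

print(run("power saver"))
# -> ["Tap 'Settings'", "Tap 'Battery'", "Tap 'Power saver'", 'Toggle power saver ON']
```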

Michael Larsen (31:24):
That sounds pretty amazing and kind of mind-blowing, so I thank you for that breakdown. We've been talking for a while here and, of course, we tend to keep these shows to a set length of time. So I want to say, first off, thanks to everybody for joining us today. And, key and important to this: where can people learn more, and more to the point, where can people learn more about Sapphire, the conference that you've all been talking about?

Kenny Small (31:50):
There's going to be a lot of information going out from our marketing teams on LinkedIn. This podcast is going to be going out, and people can hopefully learn a few things. And really, just come up and see us if you're in and around the event. Come and talk to us and not just learn about the event, but learn more about what we're doing, and we can take you through some real-world applications of this, because a lot of it is in the future. But to tie together two things that are real with what Bill just said about the visual and predictive AI: just a few days ago I was having a conversation with someone, a customer, and they said, "Hey, we get invoices from various different vendors, always in a different format." So we then started speaking about the visual AI tools and technologies that we have access to alongside Tricentis and saying, "Hey, well, it doesn't matter what format it is, just send it to us and the visual AI can pick it up." But not only that, it would pick up that the last time this vendor sent this invoice, they had a particular element on the screen and now it's missing.

(32:44):
So now we can predict, "Well, we know this is going to fail, so let's not even send it through." So there are live examples. We can take you through real demos. Yeah, look out on LinkedIn, look out for these podcasts and the various other materials. But also, for more of those stories and real-world applications of how you might be going through something similar, just come up to us at Sapphire if you're at the event, or reach out to us in response to those LinkedIn posts. We'll be in contact and take you through a bit more detail.
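
The invoice example reduces to a simple comparison: check the fields the visual extraction found on this document against the fields the same vendor sent last time, and flag anything missing before it is ever posted. A minimal sketch of that check, with invented vendor and field names, assuming the extraction step already happened:

```python
from typing import Dict, Set

def predict_invoice_issues(extracted_fields: Set[str],
                           last_known_layout: Dict[str, Set[str]],
                           vendor: str) -> Set[str]:
    """Return the fields this vendor sent last time that are missing now,
    so the document can be held for review instead of failing downstream."""
    expected = last_known_layout.get(vendor, set())
    return expected - extracted_fields

history = {"ACME Corp": {"invoice_number", "tax_id", "line_items", "total"}}
found_this_time = {"invoice_number", "line_items", "total"}
print(predict_invoice_issues(found_this_time, history, "ACME Corp"))
# -> {'tax_id'}: predict the posting will fail, so hold it for review
```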

Michael Larsen (33:07):
Alright, fantastic. We want to give everybody a chance to… we often say, “What is the last word?” or, “You have 30 seconds to sum up everything that you’ve gotten the chance to talk about today. Here’s your chance to do that.” If you want to give a 30-second sum-up of everything that’s going on and that’s hip about SAP, here’s your chance.

Bill Hayden (33:28):
I just want to say, please come visit us when we are at Sapphire. I know all three of us will be there. Danny and I will be busy in the Tricentis booth on the floor showing our latest and greatest software. And hopefully we'll have a chance to talk to you about the challenges that you are running into, because as much as it's important to get this message out, it's important to hear from you, the people listening to this podcast, about what you need, what capabilities you're looking for, and what challenges you need to overcome, and then we can work together, through the expertise of Kenny's organization and our platform, to solve those. So I'm looking forward to seeing everyone in Orlando.

Daniel do Valle (34:06):
I would like to second Bill's suggestion to visit us at SAP Sapphire in Orlando on June 3, 4, and 5. Meet us on the show floor at Booth 500, where you can stop and talk to our international team of SAP test experts. We can also meet at our hospitality lounge, where we can have some drinks and refreshments and do networking. Attend one of our sessions with our clients, and also take part in our party shuttles; we can take you to the dinners, and you can hop between multiple dinners with a cold drink and have a lot of fun, and ultimately understand how to go from zero to ready with Tricentis, because speed matters and quality can't wait.

Michael Larsen (34:45):
Alright, well if that’s the last word for everybody, I’m going to say thank you so much for joining us today and thank you to all of you out there who are listening to The Testing Show. We hope you enjoyed today’s episode and we look forward to coming back to you in a couple more weeks with new and fresh content, all about the world of software testing and other fun things that we can dig up. Thanks for joining us and take care, everybody.

Bill Hayden (35:10):
Thanks Michael, Thanks Matt.

Daniel do Valle (35:12):
Thank you everybody.

Kenny Small (35:12):
See you guys.

Michael Larsen (OUTRO):
That concludes this episode of The Testing Show. We also want to encourage you, our listeners, to give us a rating and a review on Apple Podcasts. Those ratings and reviews help raise the visibility of the show and let more people find us. Also, we want to invite you to come join us on The Testing Show Slack channel as a way to communicate about the show, talk to us about what you like and what you'd like to hear, and help us shape future shows. Please email us at [email protected] and we will send you an invite to join the group. The Testing Show is produced and edited by Michael Larsen and moderated by Matt Heusser, with frequent contributions from our many featured guests who bring the topics and expertise to make the show happen. Additionally, if you have questions you'd like to see addressed on The Testing Show, or if you would like to be a guest on the podcast, please email us at [email protected].
