Quality engineering (QE) is a new practice, and as such there's a lack of clear rules and guidelines for adoption. Some teams start by focusing on the customer experience (CX), others start by extending quality beyond the R&D teams, and still others start by aligning quality metrics to business outcomes. Join this panel to hear about the challenges real QA leaders faced in elevating quality, the strategies that made them successful, and the methods you can try with your team.

Transcript

 

Katie Staveley

Hello, everyone. Thank you and welcome to our next session. This is our panel session on leading connected quality teams. My name is Katie Staveley, and I'm your host for today's session. Just a few reminders as we get started. This is a panel session, so we'll have a lot of questions for our panelists. However, if you have any questions during today's session, feel free to submit those through the Q&A tab on the platform to your right. We also welcome any chat or discussion; feel free to put those in the chat section as well. We are recording today's session, so if you have to step out or want to take a look at it again later, we'll have it posted to the platform within the next 24 hours or so, and then we'll make it available to the broader world a few days after that. With that, I'll hand off the session to our moderator, Jeff Zupka. Thank you again for joining us. Over to you, Jeff.

 

Jeff Zupka

Great, thanks Katie. And welcome, everyone. Thank you for joining us today. I'm Jeff Zupka, an Engineering Manager here at mabl, and I'm really excited to be moderating this panel, as it's a topic that really resonates with me in my role. Quality engineering is a new practice that's evolving very quickly to keep up with the accelerating pace of the DevOps transformation many teams are now going through. Unsurprisingly, then, there's a real lack of clear and consistent guidelines for adoption. Some teams start by focusing on the customer experience, others start by extending quality beyond their research and development teams, and still others start by aligning quality metrics to business outcomes. We're going to explore these topics and more on today's panel. We'll be talking to three great QA leaders to learn about the challenges they faced in elevating quality within their teams and organizations, the strategies that have made them successful, and the tactics that you can try with your own teams. But first, let's meet our panelists. Why don't we start with Madhura? If you could give a quick intro, that'd be great.

 

Madhura Kamat

Thank you, Jeff. Hey, everyone, I'm Madhura Kamat. I'm a Senior Principal Test Engineering Architect at iCIMS, where I've been for about eight years. A little bit about myself: I grew up in India, did my undergrad in computer science, and came to the US for my master's in computer science. I joined iCIMS right out of school as an SDET, and my career has evolved from there; currently, I'm a Test Architect. My role consists of multiple things: setting up the test strategy; coaching and mentoring people as they implement different tools, technologies, processes, and best practices for quality across iCIMS; and finally going through metrics and taking care of the overall quality health of iCIMS as an org. So yeah, that's pretty much it about me. Should I hand it over to Stacey, maybe? Sure. Great. Thanks.

 

Stacey Kirk

Hello, everyone. Glad to be here. My name is Stacey Kirk, and I am the CEO of QualityWorks Consulting Group. We are a process and engineering consulting firm that works with organizations from startups to the Fortune 500 to mature their organizations so that they can have better testing and release efficiency. A little more about me: I studied computer science and have had roles from SDET all the way up to senior executive director in the areas of testing and release engineering. Twelve years ago, I started my company, which is in Los Angeles and also in Kingston, Jamaica. We now have over 60 consultants throughout the world working with our clients.

 

Jeff Zupka

Great, thank you Stacey. And welcome. Janet?

 

Janet Bracewell

Hi, everyone, Janet Bracewell. I'm the Senior Engineering Manager for QA at SmugMug. I have been with the company since I started as a tester nine and a half years ago. I have a team of eight; we have a lot of fun, and we work hard. Being their manager means a lot of herding cats, but I have a good time doing it. And I'm glad to be here.

 

Jeff Zupka

Great. Thanks, Janet, and welcome to all of you. So let's get started. What’s the current perception of QA and how has it evolved for the three of you? I'm curious - how are you defining quality today? And has that changed over the last few years? Madhura, maybe we could start with you again? And go from there.

 

Madhura Kamat

Sure. Of course, the overall perception of quality, and the role and definition of quality, have evolved a lot over the years. A lot of it is a function of DevOps culture, the growing CI/CD focus, and the overall architectural evolution toward microservices and micro frontends. Any of us who have been a part of testing for a few years have seen those long regression cycles that come after long months of development become very rare.


Today, we are releasing way more frequently because of CI/CD. We have teams that release at the end of every sprint, and now that we can separate releases from deployments, we are actually deploying multiple times a day. With that, of course, the way we define quality has to evolve. A lot of focus has gone to automation, and in general to CI-based tests. We all used to have these inverted test pyramids, where the focus was on the topmost layer, the end-to-end test layer, and a lot of it was manual at that point. But we had the time to do manual testing back then; today we don't.

 

So we have to shift towards more robust, leaner, and less brittle unit testing, more integration testing, build more mock testing, so that you can isolate and test your components since standing up a whole infrastructure to test a full end-to-end flow is really expensive. So you know, you have to think about isolation. So things have evolved. 
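To make the isolation idea concrete, here is a minimal sketch in Python. The `convert_price` function and its `rate_service` dependency are hypothetical examples, not anything from the panelists' codebases; the point is that a component which normally calls a live service is tested with a canned stand-in instead of standing up full infrastructure.

```python
from unittest.mock import Mock

def convert_price(amount_usd, rate_service):
    """Convert a USD price using an injected exchange-rate service."""
    rate = rate_service.get_rate("USD", "EUR")  # normally a network call
    return round(amount_usd * rate, 2)

def test_conversion_without_real_service():
    # Stand in for the live service with a canned response, so the
    # component is exercised in isolation with no infrastructure.
    fake_service = Mock()
    fake_service.get_rate.return_value = 0.9
    assert convert_price(10.0, fake_service) == 9.0
    fake_service.get_rate.assert_called_once_with("USD", "EUR")

test_conversion_without_real_service()
```

The same pattern scales up: at each layer of the test strategy, only the dependencies outside the boundary under test get mocked.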

 

But also, when you talk about quality today, you have to think about more than functional test automation, right? It's about performance, security, localization, accessibility, all those things combined. I think all of those should be built into a good test strategy. And you have to think about the monitoring and alerts that you set up post-deployment. So a good mix of continuous integration and continuous deployment test classes makes a good test strategy. We have evolved on those fronts.

 

Jeff Zupka

Yeah, the increasing speed of releases definitely creates a lot of interesting and difficult challenges around quality. I know, we've experienced that at mabl in our own development. Janet, is that something you've experienced? And how have you dealt with that?

 

Janet Bracewell

Definitely. When I started, it was entirely Waterfall. We were at the tail end of whatever the very long code-building process was, but that has changed. We are now releasing code pretty much anytime, all the time. It makes no difference: if the piece of code is tested and approved, it's shipped. Deploying on that kind of anytime schedule means that the thinking about the testing has to be done very, very early. That has been the single biggest change for us, and it sounds like for others as well.

 

Jeff Zupka

Yeah. Stacey, how have you experienced some of these changes over the last couple years - curious to hear that as well?

 

Stacey Kirk

Yeah, absolutely. Well, I definitely have lots of battle stories. You know, there was a time when we had these large defect counts that were over 1,000, and every week people were like, "yeah, we're getting it to 900, 800, 700." Quality really was tied to the number of defects you had. But I have seen products and apps that have thousands of bugs, and if you ask the customers whether they love them, they'll say, I love it. I won't put any names out there; we all know which apps I'm talking about.

 

At the same time, I've seen apps where there were literally 10 or fewer bugs, and the end user, the business user, said, this is crap, I can't use it. And it really came down to some of the things the other ladies have shared, like performance. I'll cite my teenage kid and his friends: anything that takes over five seconds is garbage in their minds. So we've gone from an age, and I've been in that age, where we had these huge manuals that came with applications. We don't do that anymore; you have to have a product that people can understand immediately. So the way we look at software quality assurance really comes down to whether people love it, and whether they can access it and use it quickly. We really have made a shift over the last years in terms of what makes a great product.

 

Jeff Zupka

Yeah, that's really great, and I'm sure it resonates with a lot of people on the call today. Sticking with you, Stacey: what about the perception of software quality assurance within your organization? Have you noticed shifts there? And if not, how are you addressing it to make quality feel like something everyone is accountable for?

 

Stacey Kirk

The perception of QA varies depending on the size and the maturity of the company. A lot of it is top down, from the executives who give the word on how important quality is to the company. I've found that many times the leaders in tech have a development background, so when you talk about the more emotional side of quality, it doesn't really resonate as well. If you talk about automating everything, and how if we get all this stuff automated you'll need fewer people, that does resonate, especially at the CTO and C-suite level.

 

So it really comes down to the company and where they are on their maturity pathway, and how they really feel about quality. I do think, though, the great thing about test automation is that it's a language our QA testers and our developers can bond over. That's one of the exciting areas that I think everyone gets: test automation saves time.

 

Jeff Zupka

Yeah. Janet, have you noticed similar things within your teams and your organization around how QA and automation are perceived?

 

Janet Bracewell

Definitely. We have come from a fully manual background. There are large swaths of our platform that cannot be automated, for whatever reason: the complexity behind admin access, that sort of thing. And that's not going to change. So you look at what to automate, and we have dubbed them happy paths, the most common user flows. We are also building tests and getting them written or staged during the actual code development period, so that afterward it's fill-in-the-blank, finalize whatever is necessary. And then that becomes part of not just the post-deployment validation, but also a piece of our regression testing. We have regularly scheduled runs, plus the post-deployment runs I mentioned.

 

The other thing that, value-wise, is fairly new for us is that the mabl test runs will occasionally catch something where QA says, well, this is kind of off. You know, that intermittent bug where you need 1,000 test runs to see if you can find a pattern. We had one of those, opened an incident, and discovered that it was HAProxy misbehaving. But the signal was a server error that you'd hit occasionally. So mabl has also been used by us as a canary, in terms of seeing signals for things that are not quite right.
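A scheduled-run history can surface that kind of canary signal mechanically. As a hedged sketch (illustrative only, not mabl's implementation), the tests worth investigating are the ones that sometimes fail but not always:

```python
def intermittent_tests(run_history):
    """Return names of tests whose results flip between pass and fail.

    run_history maps a test name to a list of booleans, one per
    scheduled run (True = pass). A test that fails every run is a
    plain bug; one that fails only sometimes is the 'canary' signal
    worth opening an incident for, like an occasional server error.
    """
    return sorted(
        name
        for name, results in run_history.items()
        if any(results) and not all(results)
    )

history = {
    "login": [True, False, True, True],     # intermittent: investigate
    "checkout": [True, True, True, True],   # stable pass
    "export": [False, False, False, False]  # consistent failure
}
print(intermittent_tests(history))  # ['login']
```

In practice the run history would come from a test platform's results API rather than a hard-coded dict.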

 

Jeff Zupka

Madhura, how about you? Has it been difficult to get that buy-in within the organization as you try to adopt some of these quality engineering best practices and scale quality while the team gets bigger and moves faster?

 

Madhura Kamat

So, a couple of things. On the buy-in side, I agree with what Stacey said: it has to come top down. And luckily for me, at iCIMS it comes right from our CTO, who is heavily invested in quality. They believe that software quality has to be built in from design to deployment; it cannot just come as an afterthought. Even as an R&D org we have evolved: we don't have QA as a role anymore. Every team is cross-functional, and everybody is a software engineer, in essence.

 

So with that org structure in mind, of course, we've made a bunch of changes in the way we scale quality. It's harder now: we are about 1,000 employees all over the globe, and it's easy to set standards, but it's hard to implement and govern those standards.

 

I'll break it up into four main parts. The first part is defining a good test automation strategy, like I was talking about, and that has to fit your architecture. In my case, moving away from the monolith application, our new modern architecture is an SPA, a React front end; a BFF (back end for front end), which is basically a Node.js service that just routes between the front end and the back end; and then the back end, which is typically your set of microservices. Our test strategy defines different test classes for each of these artifacts. And then we've built test examples and automation as a starter kit, so when someone creates a new service, they have an example test already built out. That kind of solves the defining part.

 

The next part is really the training. Whenever someone joins iCIMS, we take them through something called immersion, which is a 10-week training course, a hybrid mix of training and practical examples. We divide people into agile teams and have them build a small application that is not domain specific, basically a magic eight ball, but they build it with our tech stack. They build the SPA, the BFF, and a microservice; they write tests for each of these artifacts; and they make sure the code coverage is above the quality gate we set. Things like this help us set the tone right from the beginning, so when they actually join their teams, they know the bare minimum expected.

 

And the third part is governance, which is even harder in a role like mine: how do you make sure, with 1,000 employees all over the globe, that they're really following these things? We have defined a definition of ready and a definition of done; the definition of done is more on the quality side and integrates with our JIRA. When someone creates a user story or a bug ticket, it gets auto-populated with sub-tasks: unit testing, sanity testing, performance testing, security testing, localization testing, documentation, and monitoring and alerting. So each smallest unit of work you deliver needs to meet certain requirements within those parameters. That ensures people are at least thinking about these things and not skipping them; even if they want to say, I don't think this applies to the story, there is at least a process for them to mark it as something they're not going to do and explain why.
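As an illustration of how that auto-population might be scripted (a hedged sketch: the payload shape loosely follows common issue-tracker REST formats, and this is not iCIMS's actual automation), a webhook or automation rule could build one sub-task per definition-of-done item:

```python
# Definition-of-done checklist items named in the discussion.
DOD_ITEMS = [
    "Unit testing", "Sanity testing", "Performance testing",
    "Security testing", "Localization testing", "Documentation",
    "Monitoring and alerting",
]

def dod_subtask_payloads(project_key, parent_key):
    """Build one sub-task payload per DoD item for a new story or bug.

    In a real setup these dicts would be POSTed to the issue tracker's
    creation endpoint; here we only build them, so the quality
    checklist travels with every unit of work.
    """
    return [
        {
            "project": project_key,
            "parent": parent_key,
            "issuetype": "Sub-task",
            "summary": f"{item}: {parent_key}",
        }
        for item in DOD_ITEMS
    ]

payloads = dod_subtask_payloads("QE", "QE-101")
print(len(payloads))  # 7 sub-tasks, one per quality dimension
```

Marking a sub-task as not applicable, with a reason, then maps naturally onto closing it with a resolution comment rather than deleting it.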

 

That has really helped us mature our overall quality process. The part where I struggle most is acquisitions. In the past five years, we've acquired around five companies, and with that, things change, right? Every acquisition comes in with a different level of maturity. It's not like hiring a new employee; they have their own ways of working. So we go in and assess their maturity levels, and we try to give them a timeframe for reaching the iCIMS level of maturity. But that's an area we're still trying to focus more on.

 

Jeff Zupka

Great. I really appreciate that detail. I'd like to stay on testing processes for a moment. As I've experienced over and over in my career, getting buy-in on strategy is important, and it's step one, but when you're actually implementing that strategy as process, changing behavior is very hard, and it takes time. So I'm curious what adjustments the three of you have made in your processes to help elevate the importance of quality in the team. Maybe we could start with you, Stacey, on that one?

 

Stacey Kirk

Yeah, well, I've simplified how I explain quality to senior leaders, to people outside of the quality assurance organization. I've found that what people don't understand, they ignore, and the more people can talk the talk, the more they feel bought in to quality. So we have simplified. And what's so great about mabl is there are so many quality metrics you can view; you'll find which ones resonate with people, and using that metric as the first measure of quality within the organization helps at every level.

 

So for example, we had a startup that wanted releases every day, and some of those releases were not going to the full App Store; they were just going to some investors. So we decided to give each build a grade: gold, silver, bronze, black. That made sense to the founder and to the investors. You can say, this is a bronze build, and we all know it's probably going to have a few bugs, but it does allow you to look at the features; black means you never know what you're going to get. I think it allowed the entire organization to understand software quality at a very simple level. And so they were bought in: you know what, we can't keep having these bronze builds, it's time for a gold one. And we got their investment, their feedback, and their time into it.

 

Jeff Zupka

Janet, what about you? Any particular process adjustments or quality metrics that you use with your team that you've seen work really well?

 

Janet Bracewell

In terms of quality metrics, one of the things we track is, when code is shipped, the level or number of bugs that come back from production. We are very tightly linked to our support team, so we hear the noise very quickly. We measure the number of defects as well; that's less a measure of QA in terms of defects found and more a measure of the quality of the code, but we keep track of it.

 

Underneath everything one of our top company values is to thrill our customers. And so everything that we do as a team, a QA team, a development team, a support team, really doesn't matter. We are building a platform to meet our customers' needs, but we're also building it to provide business impact in positive ways. The very bottom of that is the idea that everyone is empowered in their sphere to thrill our customers. And so, as we look at quality, it's not just a QA team thing. We don't own it. Everybody owns it.

 

Jeff Zupka

Yeah. So obviously, adopting quality engineering doesn't happen all at once. It's a journey, and in fact, the very next presentation after this one is on the journey to quality engineering adoption. I'm sure many on the call today are just starting that journey or are about to start it. So I'm curious: what advice would you give to quality leaders and teams that are just starting out? Maybe, Madhura, you want to kick that one off?

 

Madhura Kamat

Sure. So I work at iCIMS; it's a platform, an ATS that helps you hire people. So my first point is really hiring the right talent, because you must have heard the term "software is people." I did not understand that very well at the beginning of my career, but now it resonates so much: your product is only as good as the people who are building it. So make sure you invest a lot of time and energy in hiring and in putting a good hiring process in place.

The second would be to treat quality as a first-class citizen. We've all been through that wall between developers and QA: just throw it over the wall and let them take care of it. Try not to encourage that culture. If you're a quality leader in your company, make sure you're giving importance to the people who are doing the QA work. And also have shared ownership, similar to what Janet and Stacey were touching on. Don't put the onus on one person on the team; spread it throughout the team. Shared ownership is always better, because people will feel invested in the product and invested in the quality. So make sure you're encouraging that culture of quality. And then lastly, if you're starting new, I would say start small. Set your minimum quality gates, your lowest bar that you will not let anything pass below, and have a zero-tolerance policy on that. For example, if you want to achieve code coverage of 85% or more and you have a big monolithic code base with many lines of code, maybe start by applying it only to new lines of code. It's not going to happen overnight, but at least follow the strategy of: if you touch it, you fix it. That way, over time, you build that quality. I'm a big advocate of that.
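Applying a coverage gate only to new or changed lines is often called diff coverage. As a hedged sketch (illustrative only; real pipelines typically lean on tools such as diff-cover rather than hand-rolled logic), the gate compares test-covered line numbers against the lines a change touched:

```python
def diff_coverage(changed_lines, covered_lines):
    """Fraction of changed executable lines that tests covered."""
    changed = set(changed_lines)
    if not changed:
        return 1.0  # nothing touched, nothing to gate
    return len(changed & set(covered_lines)) / len(changed)

def passes_gate(changed_lines, covered_lines, threshold=0.85):
    """Gate a build on coverage of new/changed lines only, so a
    legacy monolith's untouched lines don't drag the number down."""
    return diff_coverage(changed_lines, covered_lines) >= threshold

# A change touching 4 lines, with tests covering 3 of them: 75%.
print(diff_coverage([10, 11, 12, 13], [10, 11, 12]))        # 0.75
print(passes_gate([10, 11, 12, 13], [10, 11, 12]))          # False
print(passes_gate([10, 11, 12, 13], [10, 11, 12, 13, 99]))  # True
```

The changed-line set would come from the commit diff and the covered-line set from the coverage report, per file.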

 

We have reached the maturity level we wanted by applying that strategy of starting small, one thing at a time. No one likes change; when you go back to the teams and tell them, oh, now we have one more gate, and you have to have all major accessibility bugs fixed before you release, no one is going to welcome that change. But if you start small and give them enough time and training, people will be more willing to accept the process. And hopefully everybody gets to the level they want to be at, one step at a time.

 

Jeff Zupka

Great Janet, Stacey, any advice that you would give, based on your experiences going on this journey?

 

Janet Bracewell

Yes, it's going to be bumpy. It just will be, and if you think it isn't going to be, you will be hugely distressed or disappointed, or both. Like Madhura said, start small: pick one thing, a section of code, an area of your platform, and build your testing and quality strategy around that. Let the developers see the success of that; they will come to you and say, I need this, can you build this for us? And if you gain some of that buy-in, then you take them by the hand and you say, I can build it for you, but it's better if I show you how it works, so you understand if something goes wrong, or whatever your particular spin on that is. But take your folks along with you as much as you can.

 

Stacey Kirk

Yeah, and I'll agree with what both Madhura and Janet said as well. You have to understand your limitations. Speaking from my own mistakes, I've seen teams where there's one junior tester with no development experience, and they've been charged with fully automating all functionality, integrated into CI/CD, with quality gates that trigger and send reports. That's a huge jump from having, sometimes, no test cases at all, even manual ones. So I do agree, you've got to start simple.

 

I also encourage you to measure. One of the things we get excited about is just getting started and having something, and I've done that without ever asking, where did we start from? There really is a lot of value in showing progress on your journey, and a lot of people forget to baseline where they started. You can get a lot more buy-in and excitement when you can say, hey, we started with 1% automated and now we're at 4%. It may not sound like a lot, but it shows that the investment is paying off, and you'll find that more people will be open to giving resources, time, and attention to see that progress continue over time.

 

Jeff Zupka

Great. Madhura, you mentioned "software is people," which I love. So let's talk a little bit about teams. We're in this new world where remote or hybrid work is very common now, and I'm curious: are there skills you look for in QA people that maybe were overlooked two or three years ago, before this big shift to more distributed work? Maybe, Janet, we could start with you on this one?

 

Janet Bracewell

Sure. I have not done any hiring over the past few years; we have a very long-tenured, very stable team. Tenure ranges from 17 years down to our newbie, whom we still call the baby, and he's been with us about three and a half years. Our team has always been fully remote, spread across the four time zones of the continental US, which on occasion produces some interesting hiccups. But that's time zones.

 

I think the longevity of the team provides a really close-knit group. We have a weekly team meeting. Everything else is canceled around it, we don't miss that one. So it's part stand up, we have a standing agenda that people contribute to week by week. The agenda actually is really open, there's room for any sort of conversation, topics about how things are going, what kinds of problems there are within the product teams that they're assigned to, things we want to learn, things we want to experiment with. Because of the relationships that have been built over the years, there's also a fair amount of humor and sarcasm, and what have you that goes on. I have a sneaking suspicion that there are things in the chat that are going to be more heckling than helpful from my team, which is usual.

 

Like I said, I haven't done any hiring over the past several years, but one of the things that is super valuable on the team is an inherent curiosity. There's a good level of creativity, and then a little bit of paranoia in there, because ultimately, testing is risk management. We want to make sure that whatever goes out to the public, customer-facing, even our infrastructure, provides the most stability and the best customer experience we can provide.

 

Jeff Zupka

What about you, Stacey? Any particular skills or attributes that you look for when you're hiring?

 

Stacey Kirk

Yeah, well, I've done a lot of hiring, probably hundreds over my career. And I'd say that I choose EQ over IQ. I've had the opportunity to hire PhDs, even people in Mensa. But I really feel, especially as we're moving into this realm of experience over bugs, that there's a lot of value in people who can emotionally connect and have a level of empathy, I call it empathy testing, with the end customer. There was a time when I would only hire people with computer science degrees, since we did so much automation. I've backed off from that and really see the value in all of these different backgrounds and how they play into the experience.

 

In terms of other qualities, I agree with the ones Janet mentioned; those are definitely very important. I'd also add something called entrepreneurial spirit. As Janet says, it's going to be bumpy, and you're going to turn back if you don't have that spirit of entrepreneurship to keep moving forward. And definitely critical thinking skills, because chances are there's more than one way to do it, and sometimes you're going to have to get to probably wave five to make it work.

 

Jeff Zupka

Yeah. Madhura, going back to you for a minute: you were talking earlier about some of the organizational challenges you've faced around going through acquisitions and expanding the team, making a more distributed team overall. I'm curious, what are you doing to keep all those people on the same page, especially as you're incorporating new people, to make sure everyone is focused on application quality?

 

Madhura Kamat

Yeah, so Jeff, some of the things I already mentioned help: the DoD, the software test strategy, those things. But beyond that, touching on what Stacey was saying, you want people to care about quality at the end of the day, and it's even harder when you don't have quality assurance roles, because everybody's a software engineer. People are focused on multiple things in their sprint, so it's really hard to get focus. You might have people who come from a quality background and whose area of interest within the team is QA, but everybody is essentially capable of doing all tasks.

 

One of the things we do is something called a test council, which I co-chair with one of my fellow test architects. Basically, we have a community of quality leaders within our organization, and we get together every few weeks. That creates a platform for people to come in and present new tools and processes they think would help, or any changes, or areas where things are not going our way or there are gaps. We'll discuss improvement areas in tools like mabl; I recently did a POC for mabl. So we do proofs of concept, and look at areas where we can accelerate some of that maturity.

 

The council, again, is invite-only for the leaders. But for everyone, we have something called CoPs, communities of practice: anybody can open one up, just get three or more people who are interested in a topic, and they can start meeting monthly. We have a performance testing CoP, an accessibility testing CoP, a continuous integration testing CoP. People who are interested in learning a certain tool or technology, or just passionate about a topic, get together to talk about it.

 

When they feel they have enough data to prove that something they're doing in their team is going to help across the board, we give them a platform to present to the whole organization, and the council then helps them implement it across different orgs. I think those things have brought us a little closer in the globalization bubble we're in. It's hard enough scheduling a simple meeting across all time zones, let alone spreading a standard. So yeah, some of those things really help us keep everyone on the same page.

 

Jeff Zupka

Yeah, sure. Janet, what about you? Have you had similar experiences with trying to get buy-in outside the quality team, with leadership and with other parts of the organization? What's worked for you?

 

Janet Bracewell

We make a lot of noise, honestly. We have our QA team, but individuals are embedded on product teams, so they are like little evangelists for testing practice, whether that's building test plans or building automation for new features or for making up coverage. What is consistent is that each QA goes to those teams with the same testing practice and process. And so the product teams, and the developers in particular, can't escape us.

 

Jeff Zupka

Great, I love that comment about making noise. An important part of this is certainly getting that visibility and that buy-in. Stacey, anything you would add from your experiences?

 

Stacey Kirk

Yeah, I mean, I think, again, it's a lot of simplicity, that simplicity in how people measure quality. We do a lot of the same things that Madhura mentioned in terms of having sessions where people can come and share their expertise. We do a virtual lunch and learn, where we have consultants who've had a certain success share what they did. We also have a weekly meeting that ends with something called 'pop your collar', where anyone can highlight how another person has been valuable. And I think over time, subconsciously, when you see, oh, this person helped this other person fix this great bug, it inspires you to want to pop your collar yourself, to appreciate others and be appreciated for opening your doors and helping others as they progress on their automation journey. I think it just inspires people to want to be a part of it even more.

 

Jeff Zupka

Yeah, I love that, we'll have to start wearing collars in the office, so we have something to pop. So talking about customer experience and user experience for a minute: something that's been really revolutionary for me, especially here at mabl and using mabl, is thinking about quality from the perspective of the customer experience.

 

You know, we're not a bunch of individual pieces and teams building separate parts of the application, but rather building a cohesive user experience that's not only functional but also performant and accessible. So I'm curious how the three of you think about that. Janet, maybe we could start again with you.

 

Janet Bracewell

On the idea of extending it outside of the team: as I mentioned a little bit ago, we have QA embedded on the product teams. But there are a number of other things we have in place. We have office hours, where anyone who wants to work on mabl test coverage is invited to join our test engineers. And there are opportunities for communicating and collaborating with the developers because the QA are embedded on the team.

 

The expectation is that as code is being worked on, or even before that, folks are collaborating on what the testing needs to look like and what needs to be tested to validate that the feature is working and that we didn't break anything else. All of those things are incorporated very early.

 

I really liked what Madhura said a little bit ago about cross-team collaboration. And that is also one of those fits-and-starts things. There are times when test collaboration is really, really good across product, design, engineering, and QA. And then there are times when, either due to speed or, I hate to say it, necessity, it falls over. Then you have all kinds of other hiccups that occur. But there is such a need for close communication between individuals, either on the product team collectively or cross-team.

 

Jeff Zupka

And Janet, are there any particular metrics, customer experience metrics, that you're using so that your team and the rest of the organization are aligned on what quality really means?

 

Janet Bracewell

I mentioned earlier that we do track some of the signals for code quality in terms of defects found during testing and defects found in production. We do try to set what those levels and measurements should be and then build towards that as code is under development. I'm not sure that entirely answers your question.

 

Jeff Zupka

Okay, that makes sense. Stacey or Madhura, any particular metrics that either of you are using to tie quality to customer experience?

 

Madhura Kamat

So we have a very deep dashboard program, where we have dashboards at every team level that roll up all the way to the CTO level. And we track a bunch of quality metrics. Like Janet was saying, we track support metrics coming from the customers, and the internal defects opened by the team. But more than that, I personally like to slice and dice that data a little more; I like to dive deeper into it. So for example, with support metrics, we try to analyze: how many of them were functional? How many of them were fixed by a code change? How many of them really needed just a config change? Our platform is extremely configurable, so we deal with a lot of configuration issues. Can we fix those with just better documentation, or better training for implementation teams?

 

So we try to do that deeper dive at the higher level, but at the team level, we look at the depth of each team's backlog: how many support cases do you have open? If you were given the next two sprints to just close them out, could you? We try to keep them in single digits, at least for the newer products, so that a team can close them out within a sprint if there's a need to address customer concerns. We also look at the age of the backlog: how long have these tickets been open? Are they being closed recently? And we look at the ratio of created versus resolved: is the team in a healthy state, resolving more than they're creating?

 

If it's not, then you probably need to invest more in that team and get it more help. Things like that. But overall, for customer experience, we get together with all the directors from all the orgs every week, look at the data points I mentioned, and make informed decisions about where we need help. mabl was one of those decisions: we had one product with a lot of customer escalations, and we were looking for low-code test automation tools to accelerate test automation and catch defects early, because that team lacked automation completely. So mabl really helped us accelerate some of our plans there. We are big believers that you need to use data to make decisions, so having good metrics is key.

 

Jeff Zupka

Great. So, wrapping up. One thing that I'm definitely hearing here is that the work is never-ending; the journey never ends. So do the three of you have any final advice you'd like to share with everyone on the call about maintaining a really high level of focus on quality over the long term? Why don't we start with you, Stacey?

 

Stacey Kirk

Yeah, absolutely right. Quality is never-ending. It's often thankless. So whenever there's an opportunity, incentivize quality advocates, cheer on their efforts, and really highlight user feedback. It allows people who are normally heads-down, working to get to the next level, to see the results and the joy their work brings to the end users. And in doing that, wherever possible, tie quality into the company goals.

 

Jeff Zupka

Great. Janet, any final advice that you'd like to share?

 

Janet Bracewell

A plus one to Stacey, or plus 100, as the case may be. We actually have a program using Bonusly, a platform that gives bonus points that can be cashed in for dollars: it's "plus five because you helped me with this." And it's not reserved for "oh, you just finished this wonderful thing," or "you built a whole suite of tests to cover fill-in-the-blank." It's every day recognizing the work that's done and saying thank you. But it's a tangible thing for the teams, and it's across the company. So it might be somebody who changed some documentation or edited it, or wrote code, or fixed a bug that was driving everyone nuts, because we are internal customers as well.

 

Jeff Zupka

Great. And Madhura, I'll give you the final word. Any advice you'd like to share?

 

Madhura Kamat

I'm a big plus one on what you both said. But one thing I would add is that technology evolves, so keep evolving. Look at the problem at hand; each product's solution is different. Try to analyze what you're solving for, and then come up with a solution. Don't just bring in what you know from the past. So keep learning, keep evolving, and put your heart into it, and things will turn out well.

 

Jeff Zupka

Great. Well, I really want to thank Stacey, Madhura, and Janet for joining us today and for the really great conversation. A reminder that this session was recorded, and it'll be shared out later on. And next up, as I mentioned during the presentation, Darrel Farris is going to be presenting on the journey to Quality Engineering and adoption. I'm sure you'll hear a lot more tactics and strategies around this journey and how you can implement it yourselves. So thank you again, and enjoy the rest of Experience. Thank you.