The mabl blog: Testing in DevOps

Learning From Production Use

Written by Lisa Crispin | Jan 22, 2020 1:00:00 PM

The 2018 State of DevOps report concluded that among the key technical practices that drive team performance, the most important are monitoring, observability, and continuous testing. Much of this focus is on application health, but these practices also improve our understanding of system behavior and of how customers use product capabilities.

Today’s technology has enabled teams to “shift right” in their testing, getting in-depth information on how customers are using new and updated features as they are released. Instrumenting code to log events in a structured way allows quick analysis of production use without burdensome overhead and expense. There’s a wealth of information to drive new feature ideas, as well as to better focus testing as new changes are developed, driving the endless DevOps loop.
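As a minimal sketch of what "instrumenting code to log events in a structured way" can look like: emitting one JSON record per event means a log pipeline can query fields directly instead of parsing free-form text. The event name, fields, and `log_event` helper below are illustrative assumptions, not mabl's actual schema.

```python
import json
import sys
from datetime import datetime, timezone

def log_event(event_name, **fields):
    """Emit one structured JSON event per line so a log pipeline
    can filter and aggregate fields without regexes.
    Event names and fields here are illustrative only."""
    record = {
        "event": event_name,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        **fields,
    }
    sys.stdout.write(json.dumps(record) + "\n")

# e.g. record that a customer ran a test containing a drag-and-drop step
log_event("test_run_started", user_id="u-123", feature="drag_and_drop")
```

Because every record is self-describing, the same stream can later answer questions the team didn't anticipate when the instrumentation was written.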

It’s been exciting for me to see how our mabl team guides development with what we learn from production use. I’d like to share a few examples of how teams can use these new capabilities to quickly improve and grow their products.


Measuring success

On the left side of the DevOps loop, modern development teams take a lean approach by slicing a new feature idea down into small increments, starting with a “learning release” or “minimum viable product” (MVP), in order to get the fastest feedback from real customers on the right side of the loop. How will you know whether to continue with a new feature idea as planned, or make some adjustments, unless you measure whether it’s something customers will value? 

At mabl, our feature teams often measure adoption of a new feature by determining how many customer tests use it. For example, when mabl drag and drop steps were introduced, adoption by customers was measured to see if the functionality should be expanded to other use cases, such as click and drag. 
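The adoption metric described above reduces to a simple calculation over test records: what fraction of customer tests use the feature? The record shape below (a dict with a `features` list) is an assumption for illustration, not mabl's real data model.

```python
def feature_adoption(test_records, feature):
    """Return the fraction of customer tests that use `feature`.
    The record shape is an illustrative assumption."""
    total = used = 0
    for rec in test_records:
        total += 1
        if feature in rec.get("features", []):
            used += 1
    return used / total if total else 0.0

tests = [
    {"id": 1, "features": ["drag_and_drop", "assertions"]},
    {"id": 2, "features": ["assertions"]},
    {"id": 3, "features": ["drag_and_drop"]},
]
print(feature_adoption(tests, "drag_and_drop"))  # 2 of 3 tests use it
```

Tracking this fraction over successive releases shows whether adoption is growing, which is the signal a team needs before expanding the feature to new use cases.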

Another type of metric is whether a feature reduces the time a user needs to complete a task, or reduces the average run time of a test. These measurements are combined with usability studies to see whether a change was successful for customers. I’ve enjoyed watching how our product and user experience experts analyze user activities in production and redesign the user interface to make new capabilities easier to use.


Taking quick action

I’m a fan of small experiments and creating hypotheses to gauge the success of each experiment. Even if the hypothesis is wrong, valuable learning can emerge. If you hypothesize that Feature X is the most valuable to potential clients, but it turns out that they value Feature Y more, there is no failure, only learning! Now you can focus everyone’s efforts on the more valuable feature: expanding its capabilities, perhaps investing more in testing it to stay confident about future changes, and making sure new potential customers discover it.

The ability to learn how many people use different pages and features in the application drives design changes. For example, our mabl engineers were able to optimize filtering capabilities for test results by determining the most popular filters being used, and listing those first in the dropdown. Production usage data can be surprising. When teams work in small increments and quick iterations, they can respond quickly as they learn important information. 
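The filter-ordering example above can be sketched in a few lines: count how often each filter is applied in production, then sort the dropdown by that count. The filter names and event list below are hypothetical, not mabl's actual filters.

```python
from collections import Counter

def order_filters_by_usage(filter_events, all_filters):
    """Order dropdown filters by how often each was applied in
    production. Ties (and unused filters) keep their original
    order because sorted() is stable. Names are illustrative."""
    counts = Counter(filter_events)
    return sorted(all_filters, key=lambda f: -counts[f])

# Hypothetical stream of "filter applied" events from production logs
events = ["status", "status", "branch", "status", "environment"]
filters = ["environment", "branch", "status", "plan"]
print(order_filters_by_usage(events, filters))
# → ['status', 'environment', 'branch', 'plan']
```

The same counting approach works whether the events come from a log table in BigQuery or an analytics tool; only the data source changes.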


Tools

There are plenty of tools to help with understanding which features customers (and potential customers) are using and to identify trends. Our mabl teams use BigQuery and Data Studio to store and analyze huge amounts of log data, but there are dozens of other similar tools. Analytics tools such as Mixpanel help trace user journeys through the application as well as feature usage.

Learning directly from customers, face to face, is also key. At mabl, our UX experts rely on live usability testing sessions with customers. Customer support requests are another essential source of information. What parts of the app cause users to struggle or encounter issues? Our customer support team produces health reports which are used to guide future changes. A simple widget to get direct feedback from customers as they use the app helps identify challenges and areas for improvement. 


A whole-team effort

Setting up the infrastructure to provide all the production data needed for analysis is an investment. Once the infrastructure is in place, team members may be able to use pre-set reports or write simple SQL queries to get the information they need. Still, the tools often require specialized skills, so it becomes a whole-team effort to make sure everyone gets help as needed.

At mabl, engineers may help create dashboards or complicated queries. They instrument code so that all the analysis tools have the right data available. Customer support team members reach out to designers as they learn about UX or UI issues. Everyone, including the engineering team, sits in frequently on calls with customers to learn first-hand what they are experiencing.

Teams that embrace DevOps work to build relationships between teams and between people with different specialized skills so that this type of collaboration becomes second nature. Everyone on the delivery team feels responsible for the changes they release to production, keeps a close watch on production usage, and responds quickly as needed with fixes and new changes.


It’s still important to shift left too!

I’ve been talking about activities that fall more on the right-hand side of the DevOps loop, but we need to remember that shifting our testing and learning both right and left is important. As I talked to my colleagues at mabl, someone noted that it’s still important to get feedback on mockups for new designs and new features before development even starts. We can and should test feature ideas. It’s much better to hear “Oh, we didn’t think of that!” while you’re planning a new feature than after it’s in production and customers are feeling the pain.

Continuous testing means learning throughout the cycle from new concepts to customers’ outcomes. I agree with the fifth of the “Modern Testing Principles”, which says that the customer is the only one capable of judging and evaluating the quality of our product. We need to know exactly what customers are doing and experiencing, and get their feedback as quickly as possible. Then we need to apply that information to guide our next round of small changes and deliver them quickly. By solving customer problems, our teams and companies become successful.