The mabl blog: Testing in DevOps

7 steps to making mabl more accessible

Written by Juliette MacPhail | Aug 4, 2021 12:00:00 PM

One of the things that we’re most excited about at mabl is helping software teams transition from quality assurance to quality engineering. By dramatically reducing the effort required to ensure functional correctness, we’re enabling teams to expand their focus to higher-order aspects of quality, including UX, performance, security, and accessibility. We’re just as excited to increase our own focus on accessibility at mabl.  

Accessibility is a critical dimension of the web experience for people with disabilities, enabling everyone to have equal access to the web. With nearly 15% of the global population living with some form of disability, accessibility isn’t just a business decision, it’s an ethical and moral obligation. Removing barriers that make it harder for people with disabilities to navigate and interact with the web is imperative for ensuring social inclusion and an equitable user experience.

Like all aspects of software quality, accessibility is never “done.” It cannot simply be checked off a list and forgotten; accessibility must be continuously monitored and improved. To that end, mabl recently developed a working group to address key accessibility concerns with our website, documentation, and app. And while our accessibility journey is far from complete, we thought we would share our experience in the hope that it may help or inspire other teams to improve the accessibility of their applications. 

Set the Stage: Goals, Roles, and Plans

We began with research. What does it mean to be accessible? What is the distinction between ADA and 508 compliance? What guidelines should we follow, and what short-term actions can we take first? Along the way, we discovered a wealth of resources from the World Wide Web Consortium’s Web Accessibility Initiative (WAI).

Based on that research, we identified actions that we could take in the near term, established a timeline, and distributed tasks among members of the working group.

Overall, our planning and early implementation followed these steps: 

  1. Identify the goals of the work, the role of each member in the team, and the plan for achieving those goals
  2. Establish a baseline for accessibility to identify gaps and areas for improvement
  3. Design the changes needed to address these gaps
  4. Implement the changes
  5. Test that these changes work as expected and do not impact existing functionality 
  6. Evaluate the results
  7. Plan for next steps and future enhancements 

Establish a Baseline

We leveraged Lighthouse, an open-source tool from Google, to establish an initial baseline and identify key accessibility gaps across our website, documentation, and web application. We primarily used Lighthouse within Chrome Developer Tools to scan individual pages for issues, but you can also trigger Lighthouse scans from a command line or using the Lighthouse Chrome Extension.

Web application baseline results

Help guides baseline results

In the initial scans, Lighthouse helped us identify cases where UI elements were missing labels and had insufficient color contrast ratios for background and font colors. 
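Findings like these are also scriptable: a Lighthouse CLI run such as `lighthouse <url> --only-categories=accessibility --output=json` emits a JSON report that can be filtered for failing audits. A minimal TypeScript sketch, assuming the report’s documented `categories`/`audits` shape; the sample data below is illustrative, not our actual scan results:

```typescript
// Pull failing accessibility audits out of a Lighthouse JSON report.
interface LighthouseReport {
  categories: { accessibility: { score: number } }; // 0–1; multiply by 100 for the displayed score
  audits: Record<string, { score: number | null; title: string }>;
}

function failingAudits(report: LighthouseReport): string[] {
  return Object.entries(report.audits)
    .filter(([, audit]) => audit.score === 0) // null = not applicable, 1 = pass
    .map(([id, audit]) => `${id}: ${audit.title}`);
}

// Illustrative report fragment (made-up numbers)
const report: LighthouseReport = {
  categories: { accessibility: { score: 0.5 } },
  audits: {
    'color-contrast': { score: 0, title: 'Background and foreground colors have a sufficient contrast ratio' },
    'button-name': { score: 0, title: 'Buttons have an accessible name' },
    'html-has-lang': { score: 1, title: '<html> element has a [lang] attribute' },
  },
};

console.log(`Accessibility score: ${report.categories.accessibility.score * 100}`);
failingAudits(report).forEach((line) => console.log(line));
```

Tracking the failing audit IDs over time is a simple way to watch a baseline improve scan after scan.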

Contrast checks in Chrome DevTools

We also found the built-in contrast checks in Chrome DevTools to be an easy way to isolate issues and experiment with new color combinations. 

Design the Changes

Once we established the baseline, we determined what changes were needed to address our findings. To work efficiently, we discussed the UI architecture patterns that would set us up for success moving forward. Our team determined that a centralized library of UX controls would let us manage components and changes in one place, rather than wrestling with complex CSS and styling applied to individual pages, forms, and controls. This increased the scope of our work considerably in the short term, but it will pay dividends in engineering velocity and overall quality moving forward.
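The core of the centralized-library idea can be sketched in a few lines: one control definition with a small set of named variants, so pages never hand-roll their own styles. Everything here is hypothetical, the variant and class names are ours for illustration, not mabl’s actual component library:

```typescript
// Hypothetical centralized Button control: one definition, a few named variants.
type ButtonVariant = 'primary' | 'secondary' | 'danger';

interface ButtonProps {
  variant: ButtonVariant;
  label: string; // doubles as the accessible name
  disabled?: boolean;
}

// Each variant maps to one shared class; no per-page CSS overrides.
const VARIANT_CLASS: Record<ButtonVariant, string> = {
  primary: 'btn btn--primary',
  secondary: 'btn btn--secondary',
  danger: 'btn btn--danger',
};

// Render to an HTML string for illustration; a real app would use its UI framework.
function renderButton({ variant, label, disabled = false }: ButtonProps): string {
  const attrs = [`class="${VARIANT_CLASS[variant]}"`, `aria-label="${label}"`];
  if (disabled) attrs.push('disabled');
  return `<button ${attrs.join(' ')}>${label}</button>`;
}

console.log(renderButton({ variant: 'primary', label: 'Run tests' }));
// <button class="btn btn--primary" aria-label="Run tests">Run tests</button>
```

With this shape, a contrast or labeling fix in `VARIANT_CLASS` or `renderButton` propagates to every button at once.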

The initial audit also revealed that many of our brand colors used throughout the web app did not have sufficient contrast ratios. Becoming more accessible meant fundamentally changing the color scheme used in the mabl app and required us to think differently about brand colors and system colors. Rather than overhaul our brand colors, we established complementary system colors that provide appropriate contrast ratios. We found Adobe’s Color Contrast Analyzer very helpful in evaluating different color combinations to identify candidates that meet accessibility standards and worked with our existing branding. 
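The “sufficient contrast ratio” these tools check is defined by WCAG 2.1: compare the relative luminance of the two colors, and require at least 4.5:1 for normal-size text at level AA. A small TypeScript sketch of that math, with illustrative colors:

```typescript
// WCAG 2.1 relative luminance for one sRGB channel (0–255).
function channelLuminance(c8: number): number {
  const c = c8 / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function relativeLuminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * channelLuminance(r) + 0.7152 * channelLuminance(g) + 0.0722 * channelLuminance(b);
}

// Ratio is (lighter + 0.05) / (darker + 0.05), ranging from 1:1 to 21:1.
function contrastRatio(a: [number, number, number], b: [number, number, number]): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG level AA requires 4.5:1 for normal-size text.
const meetsAA = (ratio: number) => ratio >= 4.5;

const white: [number, number, number] = [255, 255, 255];
const black: [number, number, number] = [0, 0, 0];
console.log(contrastRatio(white, black).toFixed(1)); // "21.0", the maximum possible
```

This is the same formula that tools like Lighthouse and Adobe’s analyzer apply, which makes it easy to screen candidate system colors in bulk before a designer reviews them.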

Make the Changes

With the new designs in place, we started tackling our to-do list. Some of these changes were simple, such as modifying the contrast for the links in our test automation help guides. We also collaborated with our Marketing team to make key changes to labels, ARIA attributes, and the contrast ratios on our main site. 

Creating a centralized library for UI components and more extensive labeling has been a long-term effort at mabl. Getting these changes into a shippable state required repeated rounds of testing, validation, code changes, and design updates. We replaced a lot of CSS with a small set of variants and swapped most hard-coded hex color values for a uniform set of CSS variables so that we can more easily tweak designs in the future.
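The CSS-variable idea can be sketched as semantic color tokens defined once and emitted as custom properties; components then reference `var(--color-accent)` instead of a hex value, so a contrast fix lands in one place. The token names and hex values below are made up for illustration:

```typescript
// Hypothetical semantic color tokens (names and values are illustrative).
const colorTokens: Record<string, string> = {
  'color-text': '#1a1a2e',
  'color-text-muted': '#4a4a68',
  'color-surface': '#ffffff',
  'color-accent': '#5a2ca0',
};

// Emit a :root block declaring each token as a CSS custom property.
function toRootBlock(tokens: Record<string, string>): string {
  const lines = Object.entries(tokens).map(([name, value]) => `  --${name}: ${value};`);
  return `:root {\n${lines.join('\n')}\n}`;
}

console.log(toRootBlock(colorTokens));
```

Whether the tokens live in a generated stylesheet like this or a hand-written one, the payoff is the same: changing one value re-themes every component that references it.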

We also learned valuable lessons along the way. Large, long-lived branches can quickly become outdated in an environment like mabl, where changes are merged to our code base continuously. Shipping these enhancements incrementally is a more sustainable approach that carries less risk and requires less maintenance for the mabl test automation app overall. The challenges we encountered reinforced the importance of designing with accessibility in mind from the ground up, encompassing all aspects of the user experience.

Test, Test, Test

We firmly believe in the value of shifting testing left for early feedback in the development life cycle. While each of the individual changes we were making was relatively low risk, the collective impact of many changes was significant, so we aimed to identify any discrepancies early on. Testing early and often in a culture of quality helped mitigate the risk that defects would make it through to production.

Unit testing is, of course, a core part of this strategy. We already have solid unit test coverage for our UI, and initial unit test runs did catch several issues with our changes. Unfortunately, changes to our button controls caused many of the unit tests that relied on existing selectors to fail, so it took quite a bit of time to identify the defects and update all of the broken tests. Thankfully, the addition of ARIA labels gave us a way to make our unit tests more resilient moving forward, as these labels are less likely to change than other attributes.
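The selector-resilience point can be shown with a toy example: a check keyed to a styling class breaks the moment styles are centralized, while one keyed to the `aria-label` survives. The markup and matcher functions here are illustrative string checks, not mabl’s actual test code:

```typescript
// The same button before and after centralizing styles (illustrative markup).
const before = '<button class="save-btn blue large" aria-label="Save test">Save</button>';
const after = '<button class="btn btn--primary" aria-label="Save test">Save</button>';

// Brittle: couples the test to presentation details.
const byClass = (html: string) => html.includes('class="save-btn');

// Resilient: couples the test to the accessible name, which rarely changes.
const byAriaLabel = (html: string, label: string) => html.includes(`aria-label="${label}"`);

console.log(byClass(before), byClass(after)); // true false
console.log(byAriaLabel(before, 'Save test'), byAriaLabel(after, 'Save test')); // true true
```

The same principle applies with real DOM queries: selecting elements by accessible name or role keeps tests stable across styling refactors.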

We also used our local build to update our own mabl tests. Using the CLI, we created a separate branch, ran key tests against our local builds, and updated the tests as needed. Given that the branch with the streamlined components had lived for so long, having a separate branch with fixed automated tests before deployment was critical, as this allowed us to get the tests ready without affecting tests in our shared staging and production environments. We were happy to discover that most of our mabl tests automatically adapted to these changes with minimal rework!

Once the core changes had been made to the web app, we also performed manual and exploratory testing, starting with our local builds and continuing all the way through the pipeline. We created and validated a set of manual test cases based on the Page Coverage data available in mabl, supplemented with exploratory testing of additional scenarios, focusing on high-risk paths. We tracked the visual, functional, and accessibility issues that we discovered and addressed them before merging into our main branch.

Results!

Once we completed the testing process, we took a look at our results. Thanks to Lighthouse feedback that was incorporated into recent website updates, mabl.com now receives an accessibility score of 100 (on a scale of 0 to 100) on nearly all pages. This included changes to the contrast between background and foreground colors to improve the legibility of our content, as well as new names and labels for controls to enhance the experience for users of assistive technology.

mabl.com Lighthouse results

We have also addressed nearly all of the core labeling and contrast issues with our documentation and web application, and our UI will be much easier to maintain moving forward, thanks to the centralized styling and shared components.

Button components prior to accessibility changes

Button components after accessibility changes

Continuous Improvement and Looking Ahead

We’re committed to expanding our centralized UI library, moving from the current work to other form elements, tables, and charting components. We intend to integrate core accessibility checks into our build process to ensure that we maintain accessibility standards with every new code change. We’re also planning to conduct a more comprehensive accessibility audit with an expert in the field to augment our team as we move past scanner tools to manual verification.
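A build-time accessibility check can be as simple as a gate that fails the pipeline when a page’s score drops below a threshold. A hedged sketch, with made-up URLs, scores, and threshold; in a real pipeline the scores would come from fresh Lighthouse reports:

```typescript
// Fail the build when any page's accessibility score falls below a minimum.
interface PageScore {
  url: string;
  accessibilityScore: number; // 0–100, as reported by Lighthouse
}

function gate(pages: PageScore[], minimum: number): { passed: boolean; failures: string[] } {
  const failures = pages
    .filter((p) => p.accessibilityScore < minimum)
    .map((p) => `${p.url}: ${p.accessibilityScore} < ${minimum}`);
  return { passed: failures.length === 0, failures };
}

// Illustrative scores for two pages against a 95 threshold.
const result = gate(
  [
    { url: '/login', accessibilityScore: 100 },
    { url: '/results', accessibilityScore: 87 },
  ],
  95,
);

console.log(result.passed ? 'accessibility gate passed' : result.failures.join('\n'));
// prints "/results: 87 < 95"
// In a real pipeline: if (!result.passed) process.exit(1);
```

Gates like this catch regressions the moment they are introduced, rather than at the next manual audit.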

Making mabl the easiest, most powerful low-code test automation solution is our driving mission, and ensuring that mabl is accessible for everyone is an essential part of delivering on that mission. These early steps by the mabl team are just the beginning of a consistent and continuous process to make mabl not just a simple and powerful test automation platform, but also an accessible one.