Want to Automate Testing the Right Way? Follow These Best Practices.

A sizable human touch is required to ensure that automated testing actually saves a team time.

Written by Michael Hines
Published on Aug. 13, 2021

“Automation” is a deceptive term, especially when it comes to automated testing. Instead of completely removing humans from the process, automated testing merely shifts the bulk of the work from running tests and logging errors to designing tests and selecting the best tools for the job. Like all things automation, a sizable human touch is required to ensure that automated testing actually saves a team time as opposed to creating new problems.

For Vlad Plasman, RedShelf’s manager of engineering for data and analytics, that means creating a process that incorporates automation from the beginning. 

“Testing is baked into our engineers’ mindset from day one with unit test coverage of over 90 percent,” Plasman said.

With that in mind, we reached out to three Chicago tech leaders to learn more about what their approaches entail, including templating, mocking and isolation.

 

Matt Wood
Manager, Test Automation • Vail Systems, Inc.

One of the best ways to ensure people pick up a practice is to make it easier to accomplish. That’s the idea behind test automation at Vail Systems. Manager of Test Automation Matt Wood said Vail’s test execution and reporting systems are purposefully designed to be both dev-friendly and efficient.

 

Briefly describe your top three test automation best practices.

Whenever possible, we create automated test solutions and, when necessary, follow up with exploratory manual testing. We have three main test classifications: end-to-end feature testing, load and longevity testing, and high-availability testing. Our goal is to build these test classifications into our CI/CD pipeline. Each classification may involve varying degrees of automation and use different tools to perform the testing. These tests include the automation and verification of phone calls, text messages, API requests, API responses and web UI cross-browser operations as required by different Vail products.

We focus on creating developer-friendly test execution and reporting systems.


What kind of tests does your team automate, and why?

We strive for 100 percent coverage with end-to-end testing. Our aim is to achieve the highest level of automation across all functionality of Vail’s products under test. We focus on creating developer-friendly test execution and reporting systems that allow developers to execute and verify test results on their own private branches when needed.

In our process, a developer submits new code changes. The CI/CD pipeline begins by building the submitted code, then running static code analysis. If the build and static analysis both pass, the code is deployed to one of our testing locations and a full functional test run is executed. Results are available as soon as the automated tests are completed.
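The gated flow described above can be sketched as a short Python loop. This is a hypothetical illustration, not Vail’s actual pipeline code; the stage names and functions are stand-ins for the real build, analysis, deploy and test jobs.

```python
# Hypothetical sketch of a gated CI/CD sequence: each stage must pass
# before the next runs, and the first failure stops the pipeline early.
def run_pipeline(stages):
    """Run named (stage, func) pairs in order; stop at the first failure."""
    completed = []
    for name, step in stages:
        if not step():
            return completed, f"failed at {name}"
        completed.append(name)
    return completed, "passed"

# Example: static analysis fails, so deploy and functional tests never run.
stages = [
    ("build", lambda: True),
    ("static-analysis", lambda: False),
    ("deploy", lambda: True),
    ("functional-tests", lambda: True),
]
done, status = run_pipeline(stages)
```

In a real Jenkinsfile the same idea is expressed as sequential stages, where any failing stage aborts the run before deployment.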

 

What are your team’s favorite test automation tools, and how do they match up with your existing tech stack?

We utilize a Git code repository, and our CI/CD process is driven by CloudBees using Jenkinsfile Groovy scripts and shell scripts with static analysis tools such as Coverity and cpplint. We are also utilizing Kubernetes clusters for deployment and managing platform resources and have many in-house test suites for feature and load test execution.

For our end-to-end browser and cross-browser testing, we also utilize the following tools: Selenium with Selenium Grid, Cypress, JMeter, and Readyui. Many of our test solutions are built in a way that allows for tests to run in parallel. This greatly speeds up test execution time and allows resources to be reallocated dynamically for other test execution.
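Running independent tests in parallel, as described above, can be sketched with Python’s standard thread pool. This is a minimal illustration of the pattern, not Vail’s in-house tooling; the test callables are hypothetical placeholders.

```python
from concurrent.futures import ThreadPoolExecutor

def run_parallel(tests, workers=4):
    """Run independent test callables concurrently; results keep input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda t: t(), tests))

# Three stand-in tests; in practice each would drive a browser or an API.
tests = [lambda: "pass", lambda: "pass", lambda: "fail"]
results = run_parallel(tests)
```

The speedup only holds when tests are truly independent, which is why isolation between tests matters so much in parallel suites.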

 

Austin Kelsch
Senior Software Engineer • CardX

Despite their name, best practices can sometimes be hard for people to adopt, which is why CardX uses templates. Austin Kelsch, senior software engineer, said these repository templates enable developers to seamlessly integrate best practices into their tests.

 

Briefly describe your top three test automation best practices.

Integrating automated tests into our CI process is key to producing stable and streamlined deployments. Staging and production builds get rejected if the automated tests fail. Templating is a simple but important best practice. By building and maintaining repository templates, team members have our automated test best practices in place out of the box. Rather than “enforce” best practices, templates help remind and guide developers to build out their tests and CI deployment pipeline the right way.

Preventing flaky tests with mocking is a fine line to walk, but a critical one. For automated tests, our team makes sure some tests use real API responses and others use mocked responses. This allows us to fully test all edge cases without having to configure real API responses against a potentially varying database to match each case.
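The real-versus-mocked split can be sketched with Python’s standard `unittest.mock`. The client and endpoint here are hypothetical stand-ins, not CardX’s actual API; the point is that an edge case is forced with a mock instead of staging matching database state.

```python
from unittest.mock import patch

# Hypothetical client; in a real suite this would make a live HTTP call.
class ApiClient:
    def get(self, path):
        # Stand-in for a real call returning a parsed JSON body.
        return {"errors": []}

def service_status(client):
    resp = client.get("/status")
    return "degraded" if resp["errors"] else "healthy"

# Real-response path (here stubbed by ApiClient itself):
assert service_status(ApiClient()) == "healthy"

# Edge case forced with a mocked response -- no database fixture needed:
with patch.object(ApiClient, "get", lambda self, path: {"errors": ["timeout"]}):
    assert service_status(ApiClient()) == "degraded"
```

Keeping some tests on real responses guards against the mock drifting out of sync with the actual API.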

By building and maintaining repository templates, team members have our automated test best practices in place out of the box.


What kind of tests does your team automate, and why?

Browser compatibility tests are a superb time-saver. Our most in-depth testing suites are very thorough but do not cover browser compatibility, and manual regression testing on different browsers is resource-intensive. Our team has created browser test suites that cover the main regression pathways. Browsers like Internet Explorer that are no longer maintained but still in use are especially difficult. If modern JS syntax or a non-transpiled npm package makes its way into a build, the browser may hit a parsing exception and not render the application at all. Automating tests on these browsers quickly alerts our team to any issues well before those changes can make it to production.

End-to-end tests that cover the full cycle of critical user experiences are extremely powerful. UI rendering, API requests and responses, and UI interactions can all be judged in a single test. These tests quickly reveal if any changes along the chain between frontend and backend caused issues.

 

What are your team’s favorite test automation tools, and how do they match up with your existing tech stack?

CardX uses Bitbucket Pipelines to handle our CI. Bitbucket already stores our source code, and their Pipelines tool allows us to quickly build triggers off of Git commands. Before a deployment can occur, we can slot in automations of the relevant test suites. Our front-end repositories kick off both Cypress and Jest tests. Jest integrates with our front-end framework, Vue.js, and allows us to create unit tests at the component level. Cypress is a robust end-to-end testing tool that actually renders the application with a browser and can cover full user paths and edge cases.

Additionally, our team uses Postman to automate tests against our various APIs. Postman tests serve as an additional layer beyond back-end unit testing to ensure the stability of our API dependencies.

 

Vlad Plasman
Manager of Engineering, Data & Analytics • RedShelf

The only way to consistently release error-free code is to never release anything. Of course, it is possible to get close to hitting that mark, and that’s Vlad Plasman’s goal. Plasman, manager of engineering, data and analytics at RedShelf, gave us a brief overview of his comprehensive automated testing strategy.

 

Briefly describe your top three test automation best practices.

Test often, test early: Our goal is to catch issues as early as possible in the software development life cycle and to prevent what can be prevented. Testing is baked into our engineers’ mindset from day one with unit test coverage of over 90 percent. Our regressions are triggered automatically upon build and no feature is completed without proper unit, functional and end-to-end testing.

Use the right tools: Every environment requires a unique, tailored approach. We have combined open-source test frameworks with test management software to address manual testing scenarios so we can see a “full picture” across the entire enterprise.

Isolation: To prevent a “state” from spilling between a variety of users sharing the same environment, we use modern orchestration tools to ensure the effectiveness and stability of our tests.
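The isolation principle can be sketched in miniature with the Python standard library: give every test a fresh scratch environment and destroy it on teardown. This is an illustration of the idea, not RedShelf’s orchestration setup, which applies the same pattern at container scale.

```python
import os
import shutil
import tempfile

# Minimal sketch of per-test isolation: each test gets its own fresh
# scratch directory, so state written by one test can never leak into
# another, even when tests run concurrently for different users.
class IsolatedEnv:
    def __enter__(self):
        self.root = tempfile.mkdtemp(prefix="test-env-")
        return self.root

    def __exit__(self, *exc):
        shutil.rmtree(self.root)  # tear down all state with the environment

with IsolatedEnv() as env_a:
    with open(os.path.join(env_a, "state.txt"), "w") as f:
        f.write("user A's data")

with IsolatedEnv() as env_b:
    # The second environment starts clean; nothing from env_a is visible.
    leaked = os.path.exists(os.path.join(env_b, "state.txt"))
```

With containers, the scratch directory becomes an ephemeral pod or namespace, but the contract is the same: no shared mutable state between tests.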

Our regressions are triggered automatically upon build and no feature is completed without proper unit, functional and end-to-end testing.


What kind of tests does your team automate, and why?

We automatically run unit tests on check-in to make sure that the business logic is correct, and integration tests to make sure that component interfaces are not broken. RedShelf also runs the full automated regression suite on builds to make sure that a code change does not break existing functionality.
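The unit-versus-integration distinction above can be sketched with Python’s standard `unittest`. The discount logic and cart component here are hypothetical examples, not RedShelf code.

```python
import unittest

# Hypothetical business logic and component, for illustration only.
def apply_discount(price, percent):
    """Return the price after a percentage discount, rounded to cents."""
    return round(price * (1 - percent / 100), 2)

class Cart:
    def __init__(self, items):
        self.items = items  # list of (price, discount_percent) pairs

    def total(self):
        return sum(apply_discount(p, d) for p, d in self.items)

class CheckInTests(unittest.TestCase):
    # Unit test: the business logic is correct in isolation.
    def test_discount(self):
        self.assertEqual(apply_discount(100.0, 20), 80.0)

    # Integration test: the component interface composes correctly.
    def test_cart_total(self):
        self.assertEqual(Cart([(100.0, 20), (50.0, 0)]).total(), 130.0)
```

On check-in, a CI job would discover and run such a suite automatically, rejecting the build on any failure.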

Functional and end-to-end testing are also employed. Fully automated UI and app testing, as well as back-end API testing and data testing, ensure that every facet of an application works properly, both in isolation and in an integrated state. We also conduct load testing, which ensures our website and its components stay performant to facilitate an optimal user experience.

 

What are your team’s favorite test automation tools, and how do they match up with your existing tech stack?

XRay for test management, which provides a line of sight into what we test, how we test and when we test. TestNG, along with Selenium and Appium, helps us stay flexible and makes it easy to test across multiple platforms. Behave — for behavior-driven development — is the core of our end-to-end and regression testing, which fits perfectly with our existing Python infrastructure.
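The behavior-driven pattern that Behave implements — registering step functions against Gherkin-style phrases — can be illustrated with a tiny pure-Python sketch. This is deliberately not the real Behave API (which uses feature files and `from behave import given, when, then`); the cart scenario is a hypothetical example.

```python
# Tiny pure-Python sketch of the BDD pattern Behave implements:
# step functions are registered against Gherkin-style phrases, then a
# scenario runs its phrases in order against a shared context.
steps = {}

def step(phrase):
    def register(func):
        steps[phrase] = func
        return func
    return register

@step("the user has an empty cart")
def given_empty_cart(ctx):
    ctx["cart"] = []

@step("the user adds an item")
def when_add_item(ctx):
    ctx["cart"].append("book")

@step("the cart contains one item")
def then_cart_has_one(ctx):
    assert len(ctx["cart"]) == 1

def run_scenario(phrases):
    ctx = {}
    for phrase in phrases:
        steps[phrase](ctx)
    return ctx

run_scenario([
    "the user has an empty cart",
    "the user adds an item",
    "the cart contains one item",
])
```

In real Behave the phrases live in plain-text `.feature` files, which is what makes scenarios readable to non-developers while the step implementations stay in Python.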

All responses have been edited for length and clarity.