The QA Best Practices These Teams Swear By

November 10, 2020

Had it not been for the company’s QA team, customers of the trading platform TopstepTrader would have received an alert about a new feature that wasn’t actually available yet.

When staging TopstepTrader’s new “free account reset,” a feature for customers who renew their monthly subscription to its Trading Combine program, QA engineers caught that the feature flag was not controlling the alert for the new feature. 

“If we hadn’t caught the bug during testing, our customers would have been alerted about a ‘free reset’ before the feature went live, causing a fair amount of confusion,” said Aaron Toran, a QA engineer.

Fortunately, Toran, his colleagues and their customers were spared that kerfuffle thanks to the QA team’s product release criteria, which include staging dark releases wrapped in feature flags when an epic is too big for a single release. 

“Although this process increases the amount of work required for testing, it also dramatically improves the overall release confidence because we’re now testing both new and old flows,” Toran said. 

Toran and leaders from five other Chicago QA teams shared the other best practices they follow to ensure solid workflows for their teams and quality products for their users.

 

Parina Madaan
Quality Assurance Manager

 

Continuously groom the test suite and make sure test cases add value

Project requirements can shift quickly, which can lead to confusion across teams. While Morningstar QA Manager Parina Madaan admits that this remains a challenge for her team, one strategy she’s implemented to help is making sure that both the QA and development teams are operating on the same tech stack. 

 

What’s the most important best practice your QA team follows, and why?

It is important to continuously groom the test suite and make sure test cases add value. This creates a focus on quality over quantity. We run smoke tests with every commit and deploy so we can quickly find high-risk issues. This reduces costs and maintains customer satisfaction.
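The commit-triggered smoke suite Madaan describes can be sketched in a few lines. This is a minimal, hypothetical illustration: the routes, status codes and function names are invented stand-ins for whatever high-risk checks a real team would run against staging on every commit.

```python
# Hypothetical stand-in for an HTTP client hitting a staging environment.
ROUTES = {"/": 200, "/login": 200, "/portfolio": 200}

def fetch_status(path: str) -> int:
    """Pretend to request a path and return its status code."""
    return ROUTES.get(path, 404)

# Only the highest-risk flows belong in the smoke set; the full
# regression suite runs on a slower schedule (e.g., nightly).
SMOKE_CHECKS = ["/", "/login", "/portfolio"]

def run_smoke() -> list[str]:
    """Return the routes that failed; an empty list means the commit passes."""
    return [p for p in SMOKE_CHECKS if fetch_status(p) != 200]
```

The design choice here mirrors the "quality over quantity" point: a deliberately small smoke set keeps the per-commit signal fast, while the groomed regression suite carries the breadth.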

 

How do you determine your release criteria? Give us an example of your typical process.

It starts with considering quality from a team perspective. Unit, integration and end-to-end tests are run before releases. When both product and engineering meet for a go/no-go, we review all test results and confirm that the scope of the sprint is complete. A final regression is run on the most stable environment with the customer in mind, and we run a final smoke test in production after the deployment to confirm the release.

 

How do you ensure the QA team stays up to date on shifting requirements, and how do you plan ahead to ensure changes are handled smoothly?

That’s a big challenge, but a few things help. It starts with making sure the QA and development teams are using the same tech stack for coding and automation. This allows for more interaction and blurs the line between the disciplines. Product spends a lot of their time focusing on prioritization and communicates this daily during team standups. Technology is allowed to question product. This creates a steady balance where the entire team is focused on the business value for the customer.

 

Mike
Data Center Engineering Lead

 

Hold peer reviews often

At the trading firm DRW, the QA team’s most effective tool is its peer review. Mike, who oversees DRW’s data center engineering team, explained why.

 

What’s the most important best practice your QA team follows, and why?

Peer review is one of the most effective QA tools we have in our arsenal. We’ve built this practice into the project cycle and have our teams constantly rotate through various deployments to ensure iterative reviews throughout the deployment lifecycle. This approach allows for ongoing validation that all changes are accounted for and the install met our high standards. It also benefits our team by consistently exposing each of us to new projects and groups, which provides ongoing opportunities for learning and development. We believe the more deployments we handle, the better we become at anticipating users’ needs and improving our approach on subsequent projects.

Since we are immersed in layer 1, the physical layer, peer review doesn't involve checking code or running code through a simulation. Instead, we follow these simple steps: We utilize our deployment data tools to encapsulate the scope of work, collaborate with other teams to make sure all changes are accounted for and, lastly, we document our deployments in a repository so that we can all see what practices work best for what deployments.

 

How do you determine your release criteria? 

Our process starts and ends with the deployment data tools we have developed with DRW’s software engineers, serving as the communication backbone between our team and the rest of DRW. These tools allow for a constant flow of communication, which enables us to efficiently adapt to changes in internal requirements or evolving external factors in the markets in which we operate.

 

How do you ensure the QA team stays up to date on shifting requirements, and how do you plan ahead to ensure changes are handled smoothly?

As our team goes through deployment processes, we collaborate to make sure all changes are accounted for and installations have met the standards we have laid out as a team. Evaluation, collaboration, communication and a shared desire to continuously push the boundaries of what we do allow us to effectively deploy equipment at a rapid pace. This brings a tremendous benefit to our firm as we seek to find a new edge in markets around the world.

 

Erika Hayden
Director of Quality Assurance

 

Understand customer use cases and conduct automated testing and performance testing

CCC’s Director of QA Erika Hayden said no single QA practice outweighs the others. Working closely with product development, testing quickly and identifying performance issues early are a few strategies that Hayden said have contributed to her team’s success. 

 

What’s the most important best practice your QA team follows, and why?

I will share a few that I believe are the sources of our success. First, we work closely with the product development teams to understand our customers and how they use our applications. Second, with such a high demand for the features that we design and develop, the QA team needs to have the ability to test quickly while maintaining the highest level of quality. This is where automation comes into play to assist with the regression testing. We develop automated tests as soon as the code is stable, which allows us to execute thousands of critical customer workflows in days versus weeks. It is crucial to execute regression tests for a release, as successful tests give us the confidence to move to production.

Lastly, it is critical to identify performance issues early in the development lifecycle. We have a dedicated performance testing team that works closely with the product development, architecture and infrastructure teams with the collective goal to identify performance issues as soon as possible by discussing the changes in upcoming releases.  

By building out the performance test harness early in the development lifecycle, we can identify any performance issues early and allow the teams more time to address them. Performance issues may take more time to address, so, starting early is key. It is also important to have a dedicated environment in which you can simulate production use cases and volume and match your production configurations as much as possible.
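The performance test harness Hayden describes can be illustrated with a toy sketch: sample latencies from a production-like environment, compute a tail percentile, and compare it against a budget. Everything here (the simulated request, the 200-sample size, the 0.5-second budget) is an invented assumption, not CCC's actual setup.

```python
import random

def simulated_request() -> float:
    """Placeholder for timing a real call against a production-like environment."""
    return random.uniform(0.05, 0.30)  # latency in seconds

def p95_latency(samples: int = 200) -> float:
    """Sort sampled latencies and take the 95th-percentile value."""
    timings = sorted(simulated_request() for _ in range(samples))
    return timings[int(samples * 0.95) - 1]

# Illustrative service-level budget; a real harness would read this
# from the team's agreed performance targets.
LATENCY_BUDGET_S = 0.5

def harness_passes() -> bool:
    """Fail the build early if tail latency exceeds the budget."""
    return p95_latency() <= LATENCY_BUDGET_S
```

Running a check like this from the start of the development lifecycle is what buys the teams extra time: a budget breach surfaces as a failing gate months before release, not during final regression.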

 

How do you determine your release criteria? 

Working on an enterprise QA team, we receive release candidates from several teams that will ultimately be deployed to production together. Our goal is to have a release at least once a month to meet the needs of our customers. Proper planning is essential for an efficient, high-quality release. With the help of our release management team, we start by clearly identifying the release candidates that teams want to go into a release. The release candidates are prioritized with the customer focus in mind.

The QA team then meets with the leads of the product development teams to understand the release candidates in detail and estimate the time to test all features. If there are features that cannot be tested in the release schedule timeframe, those items are planned for the next release. These release candidates are reviewed with the appropriate market and product managers to confirm alignment with current customer needs. If there is an issue, we work collectively with the release management team to identify options to swap priorities. 

 

How do you ensure the QA team stays up to date on shifting requirements, and how do you plan ahead to ensure changes are handled smoothly?

Communication and collaboration are the keys to our success. This has become more important with the entire team working remotely. Prior to COVID-19, we had releases planned out and were well on our way to executing against those plans. After COVID-19, we quickly reprioritized the development of features that we knew would be most important to our customers.  

We include all impacted parties as soon as we can in the discussions when we are shifting the priorities, and share critical information via collaborative meetings and not just in email. This lets us provide the organization with a clear picture of when the new requirements can be satisfied, and we are confident in our ability to deliver against the new plan.  

 

Aaron Toran
Quality Assurance Engineer

 

Integrate agile processes early, prioritize bug tickets and hold exploratory testing

TopstepTrader’s QA team participates in all of the company’s agile ceremonies, which QA Engineer Aaron Toran said helps his team know ahead of time when product changes will occur and how they could impact other stories.

 

What’s the most important best practice your QA team follows, and why?

Having a deep understanding of our business goals and users’ behavior is key to ensuring that we provide the best possible experience for our customers. We continually put ourselves in our customers’ shoes by testing our products’ quality and finding new ways to improve. Getting involved in the agile process in the early stages of a project helps us identify missing criteria and point out edge cases that may not have been considered before. 

A few more best practices that we exercise daily include keeping up to date with bug tickets and requests, advocating for quality improvements, understanding priority versus severity when it comes to fixing bugs, and keeping time for exploratory testing, which is especially important when there is an extensive integration happening and we want to prevent the individual components from breaking down when combined.

 

How do you determine your release criteria? 

It begins in the planning phase, where we plan out the entire roadmap of a project, including development work, quality assurance, integration and staging testing, release and product testing. The goal is to have each story ready for release when development and QA work is completed. We discuss the acceptance criteria during the grooming process and find edge cases and potentially blocking stories. There’s also the possibility of more stories being needed for the expected functionality, so additional time is set aside for supplemental planning as required.

It’s not uncommon for an epic to be too big for a single release. When this happens, we stage a series of dark releases wrapped in feature flags. Although this process increases the amount of work required for testing, it also improves the overall release confidence because we are now testing both new and old flows.
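The dark-release pattern Toran describes, and the bug his team caught, can be sketched in a few lines. The flag name, flow names and account shape below are illustrative, not TopstepTrader's actual code: both code paths ship, a single flag decides which one runs, and anything user-facing (including the alert) must be gated by the same flag.

```python
# Illustrative feature-flag store; off in production until the epic lands.
FLAGS = {"free_account_reset": False}

def old_reset_flow(account: dict) -> str:
    return f"paid reset for {account['id']}"

def new_reset_flow(account: dict) -> str:
    return f"free reset for {account['id']}"

def reset_flow(account: dict) -> str:
    """Both flows ship dark; the flag picks which one runs, so QA can test both."""
    if FLAGS["free_account_reset"]:
        return new_reset_flow(account)
    return old_reset_flow(account)

def should_alert(account: dict) -> bool:
    # The bug QA caught: the customer alert must be gated by the *same*
    # flag, or users hear about a feature that isn't live yet.
    return FLAGS["free_account_reset"]
```

Testing with the flag in both positions is exactly why the process doubles the work but raises release confidence: the old flow is re-verified alongside the new one.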

 

How do you ensure the QA team stays up to date on shifting requirements, and how do you plan ahead to ensure changes are handled smoothly?

Communication is critical, and we are continually collaborating with multiple departments company-wide when new changes are being made. Because we participate in all of the agile ceremonies, we know ahead of time when transitions will occur and how they could impact other stories. This process also gives us the ability to update test plans and formulate questions regarding the end state of the feature and how it works as a whole. In addition to our daily department meetings, we also have weekly QA sync meetings to go over the stories we have planned for and groomed. The weekly meetings ensure that each team member is up to date on project priorities and avoids any single points of failure if someone falls a little behind. We also use this time to go over automation testing and address any process pain points.

 

Bedford West
Quality Assurance Manager

 

Assign a QA coordinator to foster communication across departments

At BenchPrep, the learning management system’s QA team created a QA coordinator role that gathers feedback from each member of the QA team during a given sprint cycle and relays information between QA, engineering, product, design and support teams. QA Manager Bedford West said this role has reinforced the importance of communication on his team. 

 

What’s the most important best practice your QA team follows, and why?

We see frequent and transparent communication as our most important best practice. We do our best work when every member of our team is fully informed and feels safe to share their ideas or to question the status quo. This practice of communication isn’t simply a guiding principle; it’s concretely baked into our process and execution. Our QA coordinator role, which rotates every sprint, supports our emphasis on communication. We also conduct a QA retrospective at the end of every sprint in addition to our scrum team retrospectives so we can align on best practices across our discipline.

 

How do you determine your release criteria? Give us an example of your typical process.

Releasing a product is an all-team decision. We don’t view QA as the gatekeepers of our release criteria. Reinforcing our communication best practice above, we strive to openly communicate the current customer and business risks of releasing our code, data and configuration to production. We do this on a story-by-story basis by creating an organic test plan via testing notes for each JIRA ticket. We treat these testing notes as a lightweight, malleable checklist by which we can communicate remaining risk in a sprint. Our QA coordinator takes point on communicating this out to the broader technology and product teams to ensure everyone is on the same page prior to hitting “go.”


How do you ensure the QA team stays up to date on shifting requirements, and how do you plan ahead to ensure changes are handled smoothly?

The only way to ensure QA is looped into changing requirements is to make them part of the conversation early and often. As equal members of our scrum teams, QA engineers participate in ideation, refinement and planning. To ensure changes are handled smoothly, each team has QA run a test planning meeting at least once a sprint for upcoming work. In this way, everyone can brainstorm and align on how changing requirements can be adequately addressed through testing and development. When requirements change mid-sprint, we try to similarly discuss these items in our daily standups and Slack channels.

 

Priya Narayan Rao
Manager of Quality Assurance Testing

 

Get involved with product development ASAP

Testing from the beginning of the development process helps the QA team at CSG detect bugs early and address the defects, said Priya Narayan Rao, a manager of QA testing. Rao shared how else this key practice impacts the customer engagement platform’s products and features.

 

What’s the most important best practice your QA team follows, and why?

Our most important best practice is to have our QA team involved with product development as early as possible. We ensure this happens by following this process: the project starts with requirement gathering; then grooming; estimation; planning; development alongside test-scenario preparation; testing; user automation tests; release; and retrospective and feedback. By testing early, we’re catching bugs early and we’re improving the quality of the software, which reduces the cost of quality maintenance. 


How do you determine your release criteria? 

CSG’s software is ready to be released when business requirements and acceptance criteria are met, unit testing is completed — which involves creating tests in isolation specific to independent units — and integration testing is performed. Then we prepare release notes and have the entire QA team sign off, confirming the software is ready to be released into production. A formal document is then attached to the release change request (CRQ).
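The sign-off gate Rao describes can be sketched as a simple check: ship only when every criterion holds and every QA team member has signed off. The criterion names, team names and function below are illustrative assumptions, not CSG's actual tooling.

```python
# Illustrative release criteria mirroring the ones listed above.
CRITERIA = {
    "business_requirements_met": True,
    "acceptance_criteria_met": True,
    "unit_tests_passed": True,
    "integration_tests_passed": True,
}

def ready_for_release(criteria: dict, signoffs: set, qa_team: set) -> bool:
    """Release only if all criteria hold and the entire QA team has signed off."""
    # qa_team <= signoffs: every team member appears in the sign-off set.
    return all(criteria.values()) and qa_team <= signoffs
```

Modeling sign-off as a set-containment check makes the "entire QA team" requirement explicit: one missing name, like one failing criterion, blocks the release.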

 

How do you ensure the QA team stays up to date on shifting requirements, and how do you plan ahead to ensure changes are handled smoothly?

We test our products continuously. Continuous testing is the only way to ensure that progress is being made on the product. Also, we collect and provide ongoing feedback to guarantee that the product meets the needs of the business. To ensure changes are being handled smoothly, we use less documentation and have a reusable checklist in addition to automated testing. This allows our team to focus on testing as opposed to incidental details. Finally, we have flexibility designed into test scenarios and focus less on detailed test plans. The team works on initial automated testing on application aspects that are most likely to remain unchanged. They then take time to analyze the requirement and have a backup plan to work on changes if needed.

 
