Leverage The Power Of Testers

If you’re a developer, you probably don’t appreciate the power of testers. In fact, you probably think negatively about some aspect of your testing team. They think of bugs that you haven’t considered. They try doing things you never designed your product to handle. And, they file bugs when they encounter unexpected behavior. What a pain!

If you’re a tester, you may not get respect from your development teammates, but you don’t quite know why. After all, you’re good at your job. You put the product through its paces and do your best to expose issues in the product’s behavior. Your job, in fact, is to do this before someone outside the company tries the same thing.

At its core, the difference between developers and testers comes down to two factors: starting point and mindset.

Developer Power: Developers Make

A good developer thinks of all the ways a customer can use a product. In some cases, developers work to the product spec created by a product manager or product architect. In other cases, developers consider behaviors that a user might try and determine how to handle those behaviors.

Because developers venture into the unknown, they look to product experts to guide their development. Product managers often must specify the behavior of the product explicitly so developers can code effectively. And when developers ask "What if…" questions of the product experts (e.g. "What if the user does…"), they delegate decisions about product behavior to the people who know what the product is intended to do.

Tester Power: Testers Break

Do you know those commercials where Mercedes demonstrates the safety of its cars by crashing them in collision tests? Testers love those commercials.

Good testers assume that users might be bad actors and attempt to do something that the developer might not expect. Much of the time, the product team had not considered that behavior, so they wrote no specification for it.

As a recovering product manager, I tried to draw a distinction between defining the behavior and requirements for the customer's benefit and defining the product design. I wasn't the product designer, I thought. However, while I worked with a bunch of competent developers, none of them were designers, either. Alas, I still got the questions about usability, security, and other non-functional requirements. In that situation, I did the only useful thing I could think of: rely on the power of the testers on the QA team. And, fortunately, they had prior test experience, so they could identify potential failure modes and help describe trade-offs in design.

School Of Hard Knocks

In one of my favorite movies from the 1980s, Body Heat (Rated "R" for a reason), the protagonist, a public defender, gets asked by his girlfriend to help kill her husband. The public defender had previously defended an arsonist, who is now out of jail. The attorney figures a fire would help cover up the murder, so he meets with the former arsonist to ask what it would take to start one. The arsonist replies:

“Are you thinking about committing a crime, counselor? If so, ‘you have to realize there are probably 50 ways you could [make a mistake and get caught], and you’re a genius if you can think of 25. And you ain’t no [bleeping] genius.’

“Do you know who told me that?” the arsonist continues. “You did, counselor.”

It’s the same point for product designers. You often must experience design failures before you can spot potential future design failures. Often, however, the failure experience accumulates with test engineers.

If you want to hear two engineers discussing the power of testers in some detail, listen to Angie Jones of Applitools talk with Dr. Nicole Forsgren of Google in their webinar: Test Automation as a Key Enabler for High Performing Teams. Both of them explain how testers bring unique perspectives to their teams.

The Heartbleed OpenSSL Example

Do you remember the "Heartbleed" bug in OpenSSL? One of my favorite xkcd cartoons gives a great overview. To check that a connection was still alive, the initiator sent a heartbeat request containing a payload of arbitrary content, plus a number that was supposed to state the payload's length. The receiver would echo back that many bytes. To exploit the bug, an attacker sent a number that greatly exceeded the actual payload length, and the receiver obligingly sent back adjacent memory from its own process, potentially exposing private keys and other secrets to the exploiter.
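The core logic is easy to sketch. This is not real OpenSSL code, just a minimal, hypothetical model of the missing bounds check:

```python
# Simplified sketch of the Heartbleed logic (hypothetical, not OpenSSL code).
# The buggy handler trusts the claimed length instead of the actual payload
# length, so it can echo back adjacent memory.

def buggy_heartbeat(memory: bytes, payload_start: int, claimed_len: int) -> bytes:
    # BUG: no check that claimed_len matches the actual payload length
    return memory[payload_start:payload_start + claimed_len]

# Process memory: a 4-byte payload followed by a secret that should never leak
memory = b"PING" + b"SECRET_PRIVATE_KEY"

# Honest request: claimed length matches the payload, so only "PING" comes back
print(buggy_heartbeat(memory, 0, 4))   # b'PING'

# Exploit: claim far more bytes than were sent, and the secret leaks
print(buggy_heartbeat(memory, 0, 22))  # b'PING' plus the adjacent secret
```

The fix, of course, was a single bounds check: refuse to echo more bytes than the payload actually contained.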

How would you know to look for that kind of bug? It existed in the wild for years before the exploit finally got exposed.

The power of testers comes from imagining scenarios like this in a design and considering the risks to future designs.

Failure Modes

Why experience? Because tools alone cannot save you. For example, you can measure execution coverage in your code. But coverage tools won't always tell you when you have potential bugs. You can execute 100% of your code and exercise all the expected input cases, yet an unexpected input can still cause an unanticipated result.
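Here is a minimal, hypothetical illustration of that gap: a function whose every line is exercised by its tests, yet which still fails on an input nobody thought to try:

```python
# Hypothetical example: every line executes under the "expected" tests,
# yet an unexpected input still crashes the function.

def average(values):
    return sum(values) / len(values)   # fully covered by the test below

# Expected-input test: executes 100% of the lines, and passes
assert average([2, 4, 6]) == 4

# Unexpected input the tests never tried: same fully-covered code, new failure
try:
    average([])            # ZeroDivisionError: len(values) == 0
except ZeroDivisionError:
    print("100% coverage, but the empty list still breaks it")
```

A coverage report for the passing test would show nothing left to test. Only someone asking "what if the list is empty?" finds the bug.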

You can also run functional tests that become out of date but still pass. A number of Applitools customers have experienced what happens when they don't validate the visual behavior of their application after an app upgrade. New code ships that changes the on-screen rendering, but the underlying HTML and element identifiers don't change, so the functional tests still pass while the screen fills with visual errors. And woe to you if you release the app with those errors into the wild.

So, as failure modes continue to evolve, you need the combination of experience and tools to help you stay ahead.

Web Security Exploits

Security offers a classic catalog of hard-won failure modes: the OWASP Top 10 (2017) list of the most critical web application security risks:

  • A1:2017-Injection
  • A2:2017-Broken Authentication
  • A3:2017-Sensitive Data Exposure
  • A4:2017-XML External Entities (XXE)
  • A5:2017-Broken Access Control
  • A6:2017-Security Misconfiguration
  • A7:2017-Cross-Site Scripting (XSS)
  • A8:2017-Insecure Deserialization
  • A9:2017-Using Components with Known Vulnerabilities
  • A10:2017-Insufficient Logging & Monitoring
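The first item on that list, injection, is easy to sketch. Here is a minimal, hypothetical example using Python's built-in sqlite3 module (the table and input strings are invented for illustration):

```python
# Illustrative sketch of A1:2017-Injection; the table and inputs are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "nobody' OR '1'='1"

# Vulnerable: string concatenation lets the input rewrite the query,
# so the injected OR clause matches every row
vulnerable_rows = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()
print(len(vulnerable_rows))  # 1 -- every row matched, despite the bogus name

# Safe: a parameterized query treats the input as data, not SQL
safe_rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print(len(safe_rows))  # 0 -- no user is literally named "nobody' OR '1'='1"
```

A developer writing the first query sees it work for every name they expected. A tester who has seen injection before knows to try a quote character.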

OWASP even offers a testing guide for exploits:

https://www.owasp.org/index.php/OWASP_Testing_Project

Again, these are the kinds of issues one learns from experience.

Continuous Deployment Teams

Elisabeth Hocke, an agile testing guru and principal agile tester at Flixbus, teaches a great course about The Whole Team Approach to Continuous Testing on Test Automation University. In this course, she explains how continuous delivery demands a team mindset for testability. Take the course, or read my summary of it.

Priyanka Halder, head of quality at GoodRx, gives a great webinar about High-Performance Testing — Acing Automation in Hyper-Growth Environments. She talks about how her team embeds into the development team as a practical application of continuous delivery. You can watch her webinar and read my blog about it.

This may not be your organization. Just know that there are organizations where developers value the contributions of the test team, and quality engineers help coach the developers to build testable code.

Seek Wisdom In Experience

Realistically, quality engineers face a myriad of challenges that developers cannot always help solve. The OpenSSL Heartbleed exploit existed for years before someone understood the problem. Because developers can blind themselves to failure modes, quality engineers have to spend time imagining a horrible future.

New approaches help bridge the gap between development and test. New frameworks help developers build testability hooks into each of their applications. Standardized approaches help ensure that all changes can be anticipated.

At Applitools, we address a specific blindness. Web apps include built-in complexity that you might sometimes ignore because of standards. HTML, CSS, and JavaScript have standards, so developers expect that they only need to code once for all platforms. Realistically, though, today's testers know that rendering engines behave differently across platforms, viewport sizes, browsers, and operating systems on both mobile devices and computers. In the end, pixel and DOM comparisons lead to so much extra work that testers limit their automation to validating expected behaviors — blinding themselves to unexpected behaviors.

Conclusion

But, as the continuous delivery examples show, successful teams can emerge when these opposite views collaborate. Wherever you find yourself on this continuum, know that collaboration is likely in your future.


Originally published at https://applitools.com.
