How To Ace High-Performance Testing for CI/CD

About Priyanka

Start With The Right Foundation

Define High-Performance Testing

  • Chaotic — you need to be comfortable with change
  • Less time — all hands on deck all the time for all the issues
  • Fewer resources — you have to build a team where veterans are mentors and not enemies
  • Market pressure — teams need to understand and assess risk
  • Reward — do it right and get some clear benefits and perks

Why High-Performance Testing?

  • Scope — instead of running as a dedicated app on a single browser, today’s apps run on multiple platforms (web and mobile)
  • Frequency — we release apps on demand (not annually, quarterly, monthly or daily)
  • Process — we have gone from waterfall to continuous delivery
  • Framework — we used to use single-stack, on-premise software; today we use open-source, best-of-breed, cloud-based solutions for developing and delivering.

How To Achieve High-Performance Testing

  • Does the team appreciate that failures can happen?
  • Does the team have inconsistencies? Do they have unclear requirements? Set impossible deadlines? Use waterfall while claiming to be agile? Note those down.

Building a Team

Test Automation

  • Assertions on Action
  • Initialization and Cleanup
  • Data Modeling/Mocking
  • Configuration
  • Safe Modeling Abstractions
  • Wrappers and Helpers
  • API Usage
  • Future-ready Features
  • Local and Cloud Setups
  • Speed
  • Debugging Features
  • Cross Browser
  • Simulators/Emulators/Real Devices
  • Built-in reporting or easy to plug in
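Two items from this list — assertions on action, and initialization/cleanup — can be sketched together in a small wrapper. The `FakeDriver` below stands in for a real Selenium WebDriver so the sketch is self-contained; all class and method names are illustrative, not part of any framework named in the talk.

```python
class FakeDriver:
    """Stand-in for selenium.webdriver so the sketch runs anywhere."""
    def __init__(self):
        self.url = None
        self.open = True

    def get(self, url):
        self.url = url

    def quit(self):
        self.open = False


class BrowserSession:
    """Wrapper: every action is immediately followed by an assertion,
    and cleanup is guaranteed by the context manager."""
    def __init__(self, driver):
        self.driver = driver

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.driver.quit()  # cleanup always runs, even on failure

    def visit(self, url):
        self.driver.get(url)  # action...
        assert self.driver.url == url, f"navigation to {url} failed"  # ...assertion


with BrowserSession(FakeDriver()) as session:
    session.visit("https://example.com")
```

In a real suite the context manager would open and quit an actual WebDriver, and each wrapped action would carry its own postcondition check.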

Industry Standards

  • Quality — at least 90% of all tests run pass
  • Run Time — tests average two minutes or less
  • Platform Coverage — tests cover five critical platforms on average
  • Concurrency — at peak usage, tests utilize at least 75% of available capacity

How companies measure up against these benchmarks:

  • Quality — 18% of companies achieved the 90% pass rate
  • Run Time — 36% achieved the two-minute-or-less average
  • Platform Coverage — 63% reached the five-platform coverage
  • Concurrency — 71% achieved the 75% utilization mark
  • However, only 6.2% of companies achieved the mark on all four.
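The four benchmarks above can be expressed as a simple checklist. The function below is a hypothetical helper (the thresholds come from the text; the name and input fields are illustrative):

```python
def meets_benchmarks(pass_rate, avg_minutes, platforms, utilization):
    """Score one test run against the four industry benchmarks."""
    return {
        "quality": pass_rate >= 0.90,         # >= 90% of tests pass
        "run_time": avg_minutes <= 2.0,       # <= 2 minute average
        "platform_coverage": platforms >= 5,  # 5 critical platforms
        "concurrency": utilization >= 0.75,   # >= 75% of capacity at peak
    }


results = meets_benchmarks(pass_rate=0.93, avg_minutes=1.5,
                           platforms=5, utilization=0.80)
print(all(results.values()))  # a run meeting all four marks prints True
```

Per the survey numbers above, most teams clear one or two of these checks; clearing all four at once is the rare case.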

Investigating Benchmarks

GoodRx Strategies

Test In Production

  • Feature Flag — Test in production. Ship fast, test with real data.
  • Traffic Allocation — gradually introduce new features and empower targeted users with data. Hugely important for finding corner cases without impacting the entire customer base.
  • Dogfooding — use a CDN like Fastly to route internal users to new features.
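A minimal sketch of the traffic-allocation idea, assuming a percentage-based rollout: hash the user id to a stable bucket in [0, 100) and enable the feature only for users below the rollout percentage. Because the bucket is deterministic, a user keeps the same experience as the percentage grows. All names here are illustrative, not a specific flag service's API.

```python
import hashlib


def in_rollout(user_id: str, feature: str, percent: int) -> bool:
    """Deterministically decide whether user_id sees `feature`
    at the given rollout percentage (0-100)."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # stable bucket 0-99 per user+feature
    return bucket < percent


# At 0% nobody sees the feature; at 100% everyone does.
print(in_rollout("user-42", "new-checkout", 0))    # False
print(in_rollout("user-42", "new-checkout", 100))  # True
```

Raising `percent` gradually widens exposure to real traffic, which is how corner cases surface without impacting the entire customer base.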

AI/ML

ReportPortal.io

High-Performance Testing — The Burnout

Warning signs:

  • Being cynical or critical at work
  • Dragging yourself to work and having trouble getting started
  • Irritability or impatience, lack of energy, trouble concentrating, headaches
  • Lack of satisfaction from achievement
  • Using food, drugs or alcohol to feel better — or simply not to feel

Ways to cope:

  • Avoid unachievable deadlines — don’t take on too much work; estimate, add buffer, add resources
  • Do what gives you energy — avoid what drains you
  • Manage digital distraction — the grass will always be greener on the other side
  • Do something outside your work — engage in activities that bring you joy
  • Say no to too many projects — gauge your bandwidth and communicate
  • Make self-care a priority — meditation, yoga, massage
  • Have a strong support system — talk to your family and friends; seek help
  • Unplugging for short periods helps immensely

GoodRx Case Study

Set Goals

  • Distributed QA team providing 24/7 QA support
  • Dedicated SDET team that specializes in test
  • A robust framework that makes any POC super simple (plug and play)
  • Test stabilization pipeline using Travis
  • 100% automation support to reduce regression time by 90%

Build a Team

Build a Tech Stack

  • Python and Selenium WebDriver
  • Behave for BDD
  • BrowserStack as a cloud runner
  • Applitools for visual regression
  • Jenkins/Travis and Google Drone for CI
  • Jira and TestRail for documentation

This stack delivered:

  • Speed and parallelization
  • BDD for easy debugging and readability
  • Cross-browser, cross-device coverage in CI/CD
  • Visual validation
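The "speed and parallelization" point can be sketched with a thread pool that fans independent tests out across workers; in a real run each worker would drive its own WebDriver session against the cloud runner. The test names and the placeholder `run_test` are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor


def run_test(name):
    # Placeholder for driving a browser through one scenario;
    # a real version would create a WebDriver session here.
    return (name, "passed")


tests = [f"test_{i}" for i in range(8)]

# Four workers chew through eight independent tests concurrently.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(run_test, tests))

print(sum(1 for v in results.values() if v == "passed"))  # 8
```

The win is that wall-clock time approaches (total tests / workers) × average test time, which is what makes the two-minute average benchmark reachable for larger suites.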

Set QA expectations for CI/CD testing

  • Hourly: QA runs 73 tests against the latest build to sanity-check the site.
  • On build: every new build runs 6 cross-browser tests to make sure all critical business paths are covered.
  • Nightly: a 300-test regression suite runs on top of the other tests.
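The cadence above amounts to a trigger-to-suite mapping. The counts come from the text; the mapping and function are an illustrative sketch, not GoodRx's actual configuration.

```python
# Which suite runs for each CI trigger (counts from the text above).
SUITES = {
    "hourly":  {"name": "sanity", "tests": 73},
    "build":   {"name": "cross-browser critical paths", "tests": 6},
    "nightly": {"name": "regression", "tests": 300},
}


def suite_for(trigger: str) -> dict:
    """Pick the test suite to run for a given CI trigger."""
    return SUITES[trigger]


print(suite_for("nightly")["tests"])  # 300
```

Keeping the mapping in one place makes it easy to see, at a glance, how much testing each kind of event buys.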

Priyanka’s GoodRx Quality Timeline

  • In her first quarter, she added a QA Manager, QA Analyst, and a Senior SDET. They added offshore resources to support releases.
  • By October 2018 they had fully automated P0/P1 tests. Her team had added Spinnaker pipeline integration. They were running cross-browser testing with real mobile device tests.
  • By December 2018 she added two more QA Analysts and 1 more SDET. Her team’s tests fully covered regression and edge cases.
  • And, she pressed on. In early 2019, they had built automation-driven releases. They had added Auth0 support — her team was hyper-productive.
  • Then, she discovered her team had started to burn out. Two of her engineers quit. This was an eye-opening time for Priyanka; her lessons about burnout came from this period, and she learned how to manage her team through it.

GoodRx Framework for High-Performance Testing

Conclusions

For More Information
