I’ve been a software developer for many years. I’ve seen two major transformations: one in where code runs — frontend versus backend — and the other in release cycles and speed.
During the mainframe era, most code resided on the server. It moved to the client when personal computers came into play, then shifted back to the server with the web.
For backend testing, most teams found API testing sufficient, as it exercises most of the server code with high fidelity. Unit tests weren’t popular until recent years.
As for frontend testing, little code resided on the client side, and API (backend) testing covered most of the app’s code. Many companies historically relied on manual testing. Automation was expensive for five main reasons:
- Highly trained people – test automation engineers are developers in every sense: they write complicated code, spin VMs and browsers up and down, and deal with large scale when it comes to performance testing.
- Finding great developers is hard. Finding great developers who want to write test automation for a living is about as likely as Shaquille O’Neal sinking ten consecutive free throws. 40% of a test automation engineer’s time is spent on authoring tests (1).
- High maintenance – in the agile world, UI changes are frequent, and changing the UI requires changing the UI tests. 30% of a test automation engineer’s time is spent on maintaining tests (1).
- Long ramp-up – it usually takes weeks or months until you have a dozen tests.
- Few environments – operating systems and browsers were not as fragmented as they are today. Windows ruled the desktop world, and IE had 95% market share.
In the last five years, code has been shifting back to the client side as Single Page Applications (SPAs) have become more popular (I assume most of you have already heard of AngularJS, ReactJS, Ember.js, or Vue.js, and the list grows every day), along with mobile.
Nowadays, a huge part of any project resides in the frontend. Not only do past challenges remain, the plot thickens: we have more environments (e.g. mobile and tablet), more operating systems (Windows and Mac are both popular), and more browsers to support (IE11, Chrome, Safari, Firefox, Opera, Edge — not to mention old versions of IE… may it rest in peace already). Also, the maintenance burden of frontend tests has only improved marginally in recent years and still consumes as much as 30% of an engineer’s time.
To mimic a user interaction, you must first locate the element you wish to act upon. There are two major approaches to locate an element:
- Visual based – look at the pixels.
Frameworks such as Sikuli use this technique, but it’s extremely fragile: rendering on different OS/browser/display-adapter combinations produces different pixels, and some display adapters are non-deterministic in nature. To have stable tests you must be a computer vision guru.
- Object based – look at the object hierarchy (DOM).
Frameworks such as Selenium spread like wildfire. As opposed to twenty years ago, better software design patterns emerged (such as the Page Object), which helped separate the tests’ business logic from the implementation and encouraged reuse.
Nonetheless, both approaches have their own complexities.
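To make the object-based approach concrete, here is a minimal sketch of the Page Object pattern. The `LoginPage` class and its locators are hypothetical, and a stub driver stands in for a real Selenium WebDriver so the sketch is self-contained; with the real thing, you would pass in `selenium.webdriver.Chrome()` instead.

```python
class StubElement:
    """Stands in for a Selenium WebElement; records actions instead of driving a browser."""
    def __init__(self, driver, locator):
        self.driver, self.locator = driver, locator

    def send_keys(self, text):
        self.driver.actions.append(("type", self.locator, text))

    def click(self):
        self.driver.actions.append(("click", self.locator))


class StubDriver:
    """Stands in for a Selenium WebDriver for illustration purposes."""
    def __init__(self):
        self.actions = []

    def find_element(self, by, value):
        return StubElement(self, (by, value))


class LoginPage:
    """Page Object: locators live in one place; tests call business-level methods.
    When the UI changes, only these locators need updating — not every test."""
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("css selector", "button[type=submit]")

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()


driver = StubDriver()
LoginPage(driver).log_in("alice", "s3cret")
print(len(driver.actions))  # three recorded interactions: type, type, click
```

The point of the pattern is the separation: the test reads as business logic (`log_in`), while the fragile DOM details are confined to the page class, which is what keeps maintenance cost from scaling with every UI change.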
In recent years we have seen a transition from waterfall development to agile. Waterfall cycles were long (e.g. Microsoft released a new version of Windows roughly every two years), and QA teams would spend months manually testing the software. The short cycles in agile drive the need for automation and Test Driven Development (TDD). The road to automation is still bumpy, so some organizations only test the backend via API, while most heavily rely on manual testing. Few claim to have sufficient coverage (both client and server) to release a version knowing it’s completely safe (AKA Continuous Deployment). This lack of proper coverage is risky given the renaissance of code shifting to the frontend and the focus on user experience.
Putting It All Together
This gap is the outcome of legacy functional-testing tools that haven’t taken advantage of technological evolution: improved computing power at a lower cost, cloud infrastructure and software as a service, real-time big data processing in microseconds, deep user analytics, and behavioral flows. Combining those technologies and thinking about quality in a different way can help us leapfrog to a future where tests are automatically created with every new feature. Reducing investment in testing while improving quality and user experience is not a myth. If driving can be autonomous — so can tests. Over the next couple of blog posts I’ll describe the evolution of software development and quality assurance and why it leads to autonomous testing.