Software Quality Challenges in the Race to Release

By Testim

Time is money. How many times have you heard that?

We are truly a “Startup nation,” constantly racing against the clock to deliver multiple features and execute sprint content to meet our customers’ demands. This intense pace is an everyday reality. As a QA engineer who has worked with and in different organizations, I have experienced it up close and personal.

On one side are the owners and investors: they want to see growth. On the other side are the customers: they want features and capabilities that work. And then there is us, the testers: we want to deliver quality.

Where does software quality lie in the release process?

Let’s start by defining software quality and how you would measure it. In the context of software engineering, software quality refers to how well the software meets its functional and non-functional requirements. Measuring it is more complex, since there are many metrics associated with it. It is like asking, “How would you define a high-quality watch, car, or item of clothing?”

Can you feel, just from using the product, that its maker used good materials (even if that means a higher price)? After long use, does it still hold up against standard wear and tear? Is it designed to be comfortable? Fun to use? Does it break or overheat when you accelerate to high speeds or drive long distances?

With that said, a Mercedes costs 10 times more than a Honda. Does that mean that a Honda is not a good quality car?

All of these examples teach us that quality is a perceived notion of value relative to price.

  • Does the product serve its purpose well?
  • Can it stand up to our “use demands” in a reliable, long-lasting way?

Price can be interpreted in different ways as well, for example as implementation and maintenance time. Don’t be fooled: in the eyes of our users, breaking this perception is a lot easier than building it. I can think of more than one car brand that has managed to break our perception of it over the last decade or so.

The farther you run, the more expensive it is to go back.

What I’m about to write will not shock anyone, and still, you would be surprised by the number of organizations I see that just don’t seem to internalize this idea: the earlier you incorporate quality practices into the product’s life cycle, the less money you will spend on fixing it in the long run. I have seen a far-reaching feature being conceived, designed, and developed, only for the team to realize that it strays from the original intent or does not serve its purpose in the best way.

What can possibly go wrong? Nothing much, just:

  • Features developed contrary to the customer’s tastes/needs
  • Modules that do not carry out the action for which they were designed
  • Time-consuming debates over whether something is a bug or whether the specification was simply not clear and unequivocal enough
  • Going back and forth, working on and fixing the same module several times
    • As a result: not enough time to deliver all of the planned sprint content = fewer features, less progress
  • Features that aren’t clear enough for the user and result in complaints or support issues
  • Unsatisfied customers
  • Bad user experience
  • Bugs in production
  • Wasting time = money

Make a decision, start the change!

Start by understanding the importance of having a wide-ranging testing infrastructure in your organization. Sure, it will not happen overnight. Still, it is an investment that produces huge value.

Suit up, sneakers on, and start the process:

  • Start designing a work procedure that includes clear, written requirements. People never completely understand one another, and without written documentation it becomes hard to know what to develop and what to test. I am realistic: there is neither the place nor the time for detailed requirements in a startup. However, we can design a format that is short enough to be written mid-race and clear enough for the developer to understand what to build. (A minimal sketch of what such a short, testable requirement can look like follows this list.)
  • Adopt a knowledge base and management tools that suit your needs. Let’s define “knowledge base”: as far as I’m concerned, it can be a Google Drive, but you still need one place I can go to when I need to know how a feature works. Management tools are a topic for a whole other article; what you need to understand is that there are free requirement-management, bug-management, and process-management tools that can help you organize your everyday tasks.
  • Start building a work process that suits your company’s needs and corporate culture. Just as there are rules to running a marathon, there have to be rules to running a startup. When do you start the run? When does the tester join? What criteria indicate that the lap is over? What can and can’t be done in the middle of a sprint? Every game needs its set of rules.
  • Establish productive communication between product and QA. How? For one thing, lead by personal example: hold productive, positive discussions. Each individual needs a different approach so that they don’t lock up in a defensive position instead of actually hearing your words. Take some time to learn your environment and how to approach each team member.
  • Assign a quality lead who will provide the added value of a productive process.
  • Incorporate quality practices as early as possible into concept, design, and development. You will be surprised how effective it is to review requirements (even before a single line of code has been written) and how much time it can save you in the long run.
  • Make an organization-level decision to treat quality as a high-priority issue, and let QA do their job with the appropriate amount of authority.
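
To make the idea of short, testable written requirements a little more concrete, here is a minimal, hypothetical sketch in Python with pytest. The requirement ID, the `register_user` function, and the in-memory store are invented for illustration; they stand in for whatever your product and toolchain actually look like. The point is that the requirement is agreed on, and expressible as an automated check, before the feature is built.

```python
# Hypothetical sketch: a one-line written requirement captured as an acceptance test.
# All names (SU-12, register_user, DuplicateEmailError) are illustrative only.
import pytest


class DuplicateEmailError(Exception):
    """Raised when an email address is already registered."""


_users: dict[str, str] = {}  # toy in-memory stand-in for the real user store


def register_user(email: str, password: str) -> None:
    # Requirement SU-12 (agreed on before development starts):
    # "Registering with an email that already exists must be rejected with a
    #  clear error and must not create or overwrite an account."
    if email in _users:
        raise DuplicateEmailError(f"{email} is already registered")
    _users[email] = password


def test_duplicate_email_is_rejected():
    register_user("dana@example.com", "s3cret!")

    # The second attempt with the same email must fail explicitly...
    with pytest.raises(DuplicateEmailError):
        register_user("dana@example.com", "another-pass")

    # ...and must not have touched the existing account.
    assert len(_users) == 1
    assert _users["dana@example.com"] == "s3cret!"
```

A test like this, written and reviewed while the requirement is still on paper, is exactly the kind of cheap, early feedback the list above argues for: if the requirement is ambiguous, it usually shows up the moment you try to turn it into a check.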

Two points I would like to add to all that has been written:

  1. It is a process and a mindset shift that take time, but the investment pays for itself.
  2. It’s not magic; there will always be bugs! The question is how many, and of what severity.

Remember the question we asked at the beginning of the article (where does software quality fit into this never-ending race)?

I would like to wrap up this article with a quote from the well-known tester Michael Bolton:

“I wish more testers understood that testing is not about “building confidence in the product”. That’s not our job at all. If you want confidence, go ask a developer or a marketer. It’s our job to find out what the product is and what it does, warts and all. To investigate. It’s our job to find out where confidence isn’t warranted, where there are problems in the product that threaten its value.”