web.onassar.com Archive

I can be reached at onassar@gmail.com.

For my open source work, check out github.com/onassar

QA Teams: Why I don't like working with them


I understand the purpose of QA teams. They ensure that the software (or what-have-you) you're putting out is of "quality". As I do web development, every 10+ person company I've worked at or with has employed at least one or two QA engineers.

But I truly think that having a QA team to fall back on results in poorer code, more edge-case bugs, and, in the long term, bad programming practices.

Again, a good QA team can mean the difference between your entire environment going down, becoming insecure, or suffering data faults. But on the flip side, why should they be the ones to catch this?

The best code I've ever written is the code I had to vet myself: the code I conceptualized, architected, executed, and deployed. I had no choice but to think about every edge case when writing a class or function. I would go in throwing errors left, right, and center.
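To illustrate what I mean by throwing errors left, right, and center, here's a minimal sketch (the function and its checks are hypothetical, not from any real project): every assumption about the inputs is enforced up front, so bad data blows up at the boundary instead of surfacing as an edge-case bug later.

```python
def apply_discount(price, percent):
    """Apply a percentage discount, validating every input up front."""
    # Throw early: reject anything outside the expected domain rather
    # than letting bad data flow downstream for QA to catch.
    if not isinstance(price, (int, float)):
        raise TypeError("price must be numeric")
    if price < 0:
        raise ValueError("price cannot be negative")
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)
```

The point isn't the discount math; it's that writing the guards forces you to enumerate the edge cases yourself, at the moment you can still do something about them.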

So what happens when I have a QA team to fall back on? I write so-so code. I know they'll catch the edge cases (or at least I assume they will). I write code that might not scale, or might be insecure. I just care less. I don't think that's indicative of my work ethic, either; I have an exceptional work ethic when it comes to development.

The issue seems to be that a QA team exists to catch bugs, and as such, an extra 30% of development time is allocated to the QA cycle. If I were to write proper, scalable, secure code up front, I'd need an extra 20-25% (these are example figures), which would leave 5% for the QA cycle. And why isn't that acceptable? Because of overhead.

There is significant overhead with QA teams: management, deployments, bug logging, reproduction, resolution, regression testing, and so on. It's not as simple as fixing a bug and moving on.

I've thought a bit about a better QA cycle, and ironically I think it involves more time investment before development starts, rather than after.

Proper UX/Data Definitions

Most bugs I encounter are around edge cases, assumptions about the data being collected, or design inconsistencies.

I think spending more time with the designers, project managers, and product managers before development begins makes the most sense. Drill down into the entire programmatic process (not the nuts and bolts of the code, but the flow of development). Ask about every piece of data being collected, and try to catalog its limitations.
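Cataloging those limitations could be as simple as a shared table that both the product side and the developers sign off on before any code is written. A minimal sketch, with hypothetical field names and limits:

```python
# Hypothetical catalog of collected fields and their agreed limitations,
# settled with designers and product managers before development starts.
FIELD_LIMITS = {
    "username": {"type": str, "max_length": 32},
    "age":      {"type": int, "min": 13, "max": 120},
}

def validate(field, value):
    """Check a value against the cataloged limitations for its field."""
    limits = FIELD_LIMITS[field]
    if not isinstance(value, limits["type"]):
        return False
    if "max_length" in limits and len(value) > limits["max_length"]:
        return False
    if "min" in limits and value < limits["min"]:
        return False
    if "max" in limits and value > limits["max"]:
        return False
    return True
```

Once the limits live in one agreed-upon place, "what happens if age is 10?" is answered during planning rather than discovered during the QA cycle.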

I think we'll always need QA teams for software development. But their role right now is far too grand. They ought to be ensuring the quality of the code being pushed, not catching bugs every which way. Spending more time up front with UX/data architects and the people defining the scope of the project (if that's not you), being as specific as possible about limitations, could save you copious amounts of time during the QA cycle.

And as an added benefit, it'll turn you into a more conscientious developer who produces far fewer bugs.