It never ceases to amaze me the number of different
titles that show up on the business cards and in the email signatures
of people who test software for a living. As if the sheer number of
titles weren't bad enough, many of them are just plain erroneous. It's
so bad, in fact, that I put the following instructions on the "Who are
you?" slide when teaching classes about software testing:
Please introduce yourself. Be sure to include:
- Your name
- A little about your current project
- Your business card title
- What you actually do
These instructions invariably draw a few chuckles, but it takes only
a few people to introduce themselves before the class begins to
appreciate the number of titles software testers hold, the variance in
what software testers do across the industry, and the fact that there
doesn't seem to be much correlation between one's business card title
and one's job description.
The most common titles I hear, in rough order of frequency, include the following:
- Software Quality Engineer
- Test Automation Engineer
- Quality Assurance Engineer
- Software Development Engineer in Test (SDET)
- Test Engineer
- Quality Control Specialist
- Quality Analyst
- Systems Analyst
- Software Test Manager
- Business Analyst
- I don't have a title
"Specialist," "Engineer," "Architect," and
"Manager" are usually indicators of grade or seniority, though I can't
figure out what makes an "Engineer" subordinate to an "Architect" or a
"Manager" superior to either. That is certainly not the hierarchy I
learned while earning a B.S. in Civil Engineering. It also seems to be
common to more or less randomly insert "Software" just about anywhere
in the titles above.
The most common ways people describe what they do, also in rough order of frequency, include the following:
- I test software
- I automate tests
- I find bugs
- I validate requirements
- I break things
- I manage the test team
- I get ignored then blamed
- I'm not sure; I was hoping you could tell me
The only titles and descriptions that consistently map to one
another are ones that include "Automation" or some form of "Manage."
The rest of the titles and descriptions seem to match up with one
another almost haphazardly. By the time everyone has introduced
themselves to the class, there is often a prevailing sense that the
software testing industry is in the middle of an identity crisis. At
this point in the class, I could cite the fact that very few people
get to choose their business card titles to dismiss the divergence of
titles as irrelevant. I could even use it as an excuse to quote William
Shakespeare:
"What's in a name? That which we call a rose by any other name would smell as sweet."
In actuality I do neither because I believe this phenomenon is
relevant, confusing, and at least potentially detrimental to them, to
the software testing industry, and to the software that we test.
Besides that, I don't think either the fact that we don't choose our
own titles or the Shakespeare quote accurately characterize the
situation.
The Shakespeare quote doesn't apply because, while
it is reasonable to assume that most people are familiar with roses and
what they smell like, it is unreasonable to assume that even our
non-tester teammates are familiar with what we do. Without a clear
understanding of what we do, it's only natural that people try to
interpret what we do from our title. With that in mind, it's not hard
to imagine how this can lead to our teammates expecting things from us
that are inaccurate, unreasonable, or even impossible.
Consider the title "Software Quality Assurance Engineer" as an
example. If I weren't already aware that this title is generally a
pseudonym for "Software Tester," I'd assume it was a title describing a
degreed (maybe even licensed) engineer whose role is to assure that
software achieves some governmentally enforced standard of quality.
Further, I'd assume someone holding that title could be held
individually liable, by law, if it were determined that the software he
assured did not meet the standard of quality. The image that pops into
my head when I think of a Software Quality Assurance Engineer is of
someone wearing expensive business attire and a hardhat, holding a
clipboard, and sitting at a computer watching code scroll by while
marking off list items on a sheet of paper labeled "Code Quality Audit."
This, of course, is ridiculous. In reality, very few of us are
degreed (let alone licensed) engineers, and I'm not aware of a single
instance of a tester being held legally liable for poor quality
software. Additionally, with rare exceptions, we don't assure much of
anything. (How could we when we neither control the software nor make
the release decisions?) Finally, I think it is pretty clear, based on
the quality of some of the software that you and I pay for, that there
are no governmentally enforced standards of quality for most software.
Somehow I don't think weather forecasters have "Weather Quality
Assurance Engineer" printed on their business cards. Clearly, weather
forecasters aren't expected to either assure or engineer the quality of
the weather, yet I regularly encounter people who think that software
testers can and should assure or engineer the quality of
software. I don't think many people who test software, independent of
their title, believe they can predict the quality of the software that
will ultimately make it into production any more accurately than a
weather forecaster can predict the quality of next Wednesday's weather.
Since there seems to be a prevalent desire for software testers to
have fancy-sounding titles, maybe we should consider "Software Quality
Forecaster" instead. At least that would help our teammates better
understand what we really do.
----------------------------------------
About the author: Scott Barber is the chief technologist of PerfTestPlus, vice president of operations and executive director of the Association for Software Testing, and co-founder of the Workshop on Performance and Reliability.