Sunday 24 May 2009

Competency and the Art of War

Working recently in Environment, Health, and Safety has brought to the forefront something that has always confounded me: competency.

What is this, exactly? A dictionary definition might be that a person is qualified to do a certain job: that person either has the knowledge, or can prove practical skill against some standard through a supervisor's assessment. In IT, there isn't a lot of competency. I know that will probably irk the vast majority of people in the industry, but it's true. Look at the history. People from all fields have flocked to IT, or have been seconded and promoted into IT positions. Think dot-com. When you can't find workers, you tend to overlook the lack of education or certification just to get a warm body into a position and show some sign of progress. But would you let a plumber perform heart surgery?

Sure, you can argue that the heart is a pump, with valves and pipes, but what about all that medical mumbo-jumbo?

There has been a consistent pattern in IT: technology rushes forward without skilled labor to apply it. Historically, there weren't enough computer science (a science discipline), computer engineering (an engineering discipline), or management information systems (a management discipline) graduates available, and computer work was shiny and new. So yes, you did see a lot of domain types jump the fence from accounting and the like into programming. That might have been fine thirty years ago, so long as the employee stayed in the same domain as their expertise.

So with the labor shortage, a lot of colleges and non-accredited educational companies started offering six-month courses for a technology diploma. Your average university degree is four years. I remember hearing the argument in the early '90s: "We won't hire a university grad because the tech schools push out learning on the bleeding edge, and that is what we need." Strangely, I heard that in a software company staffed predominantly by university engineering graduates. But the point is, technology moved fast enough back then that people questioned the credentials of anyone willing to dedicate a few years to training.

And it got worse. In the late '90s, all sorts of IT specializations appeared. In some sense this was business's quest for competency. All sorts of organizations started to appear (or get revamped) covering enterprise systems architecture, business analysis, project management, information systems auditing, and many, many more. (This was shortly after technology certificates appeared for short courses on the tech-du-jour. I always found Microsoft accreditation funny: having been a user from day one, I giggled at Microsoft-certified people who had learned how to power on a server, start and stop services, and other things that seemed trivial after reading a manual or a few intuitive mouse clicks. But I digress...)

A good software developer wore many hats. Many of the people gathering requirements were also involved in translating them into technical specifications. Software processes evolved and were refined, and it was fine to be involved at various stages of software development. But then something changed: people needed to fit specific roles, be certified for those roles, and not cross the line. Driven by failure, the software industry and Information Technology compounded their problems by disconnecting people and communication between business and developer. Worse: people who were competent, as measured by their success, started to have their credentials questioned by human resource departments expecting the certification-du-jour.

HR: What are you?
CP: A senior systems analyst; I've been doing it for 20 years.
HR: We don't recognize that job anymore. Do you manage people?
CP: Yes, I manage five guys.
HR: So you're a project manager. Do you have your PMP from PMI?
CP: Huh?
HR: We don't think you're qualified.