Software programmers are not blue-collar workers

Like many skilled workers, software developers are smart, creative people. Leading or managing such people requires finding a way to guide, support and empower them, while letting them do their job. This sounds easy, but I have seen many organisations and individuals fail at it, leading to the very issues they were trying to prevent. The recurring mistake people make is treating skilled workers in the same manner as they would blue-collar workers.

I have built and managed technical teams throughout my career, and I have always been aware of the need to balance communication with smart and skilled folks like software developers. I continuously practise software development myself. By staying hands-on on both sides of the equation, I've kept track of the challenges involved. It's one thing to be programming, but leading and managing developers is an entirely different game.

Instructions can be crippling

Some believe in providing extremely detailed instructions to developers as a way to manage work. That is misguided for a number of reasons. Every software programming endeavour is unique. Micro decisions need to be made at every step of the way, and the conditions that prevailed in a prior project will have been largely superseded by new ones. However detailed, instructions will leave gaps that developers will run into. As such speed bumps are encountered, people want them removed swiftly, and they feel better if they can do that themselves. If developers are not properly empowered, such gaps will lead to friction, cause tension and result in defensive behaviours. While well intended, detailed instructions are not a very effective instrument for managing software development work.

If anything goes, nothing gets done

The other extreme from too much instruction is having none. Without any guidance, anything goes, and such software development will lead nowhere. For software development to succeed, all those involved should be able to describe, in a consistent and coherent manner, what the aim is and what the final product should be. Software developers don't work in isolation from other units in an organisation; they need to know what is intended and what is expected of them. An optimal set of guidance accepted across teams is a necessity, not a luxury. Working out exactly what such guidance is, and how much of it is needed, is part of the job that leadership must facilitate.

Bean counting may be useless

Some painstakingly track developer hours, lines of code, and other misleading metrics, in the firm belief that this is the way to manage development resources. For work that doesn't require thinking or coming up with solutions, typically repetitive tasks, that may yield some results. In software development projects it is a fruitless endeavour: developer productivity does not lie in any of those things. Bean counting is only useful if used anonymously as a learning tool, and perhaps as supporting documentation in the event of a dispute over resource usage. That last case is typically what software consulting bureaus worry about; they then unfortunately put so much emphasis on it that it becomes a sword of Damocles over people's heads. Finding the metrics that matter is a skill.

Perfectly codified work is demotivating

If software development work could be codified to such an extent that it could be flawlessly executed without any thinking, exploring or personal problem solving, then nobody would ever want to do it. That would be the most boring job ever for anyone. It also makes people feel insecure, a feeling that could be summed up in an imaginary thought quote:

If I am not bringing anything special to this work, I can be easily replaced by someone else, or a robot! I don’t like that. I want to be perceived as being special to this project, I reject any shoddy instruction that would make my job look plain and simple.

This is also something that team managers and leaders should be careful about. When providing guidance turns into hand-holding, people stop thinking for themselves and the group may walk blindly straight into trouble. And when trouble strikes, finger pointing is often what follows. Besides, people learn when they can get themselves stuck and work their way through hurdles. When the conditions are right, the outcome will almost always exceed expectations.

Architecture: just enough guidance

Writing software is inherently a process in which trade-offs and compromises are made continually and iteratively till the end. It is therefore essential that all parties stay as close as possible together so that they can weigh decisions when it matters and actively take part in the game. In my experience, what works well is to provide just enough guidance to developers so as to establish appropriate boundaries of play. When that is well understood, leadership must then work to establish metrics that matter, and closely manage such metrics for the benefit of all those involved.

Treating software developers as blue-collar workers to be told “here, just go and write software that does this and come back when it’s done” is fatally flawed. People who typically make such mistakes are:

  • project managers, who make assumptions that only they espouse and neglect the basics of the creative work process
  • architects, who believe that developers are less capable workers who just need to do as they are told and follow the blueprints
  • technical managers, who have the impression that managing people is about shouting orders and being perceived as having authority
  • people enamoured with processes, tools and frameworks, oblivious to the purely human and social aspects of software development

In my experience, the root cause of such mistakes is the relics of obsolete management practices that favoured hierarchy over the social network. Note, my use of the term social network has absolutely nothing to do with the multitude of web sites that hijacked this vocabulary. By social network, I am referring to situations in which a group of humans is engaged in making something happen together: it is social because of the human condition, and it is a network because of the interdependencies.

This subject is vast and deserves a much longer treatment than I am giving it here. In this post I've only touched upon a few of the most visible and recurring aspects; I've stopped short of discussing issues such as applying learning, methodologies or tools.

If you recognise some of the shortcomings listed above in your own context, and if you are in a position to influence change, I suggest that you at least initiate a dialogue with your fellow workers around the way of working.

Cyber security white elephant. You are doing it wrong.

The top cause of information security weakness is terrible user experience design. The second cause of information security vulnerability is the fallacy of information security. Everything else flows from there.

It happened again: a high-profile IT system was hacked and the wrong information was made available to the wrong people in the wrong places at the wrong time. This is the opposite of what information security infrastructure is meant to achieve. Just search the Internet for “Sony hack”; one of the top hits is an article from 2011 titled “Sony Hacked Again; 25 Million Entertainment Users’ Info at Risk”.

This type of headline always begs some questions:

Don’t we know that any information system can eventually be breached given the right amount of resources (where resources include any amount and combination of motivation, hardware, people, skills, money, time, etc.)?

Is it really so that deep-pocketed companies cannot afford to protect their valuable resources a little harder than they appear to be doing?

Do we really believe that hackers wait to be told about potential vulnerabilities before they would attempt to break in? Can we possibly be that naive?

This has happened many times before to many large companies. Realistically, it will continue to happen. A system designed by people will eventually be defeated by another system designed by people. Whether the same would hold true for machine-designed systems versus people-designed systems is an open question. Once machines can design themselves totally independently of any human action, perhaps then information security will take on a new shape and life. But that might just be what Professor Stephen Hawking was on about recently?

I suggest that we stop pretending that we can make totally secure systems. We have proven that we can't. If, however, we accept the fallibility of our designs and take proactive action to prepare, practise drills and fine-tune failure remediation strategies, then we can start getting places. This is not news either; people have been doing that for millennia in various capacities and organisations, war games and military drills being obvious examples. We couldn't be thinking that IT would be exempt from such proven practices, could we?

We may too often be busy keeping up appearances rather than doing truly useful things. There is way too much distraction around, too many buzzwords and trends to catch up with, too much money being thrown at infrastructure security gear and too little time spent getting such gear to actually fit the people and contexts it is supposed to protect. Every freshman (freshwoman, if that term exists) knows about the weakest link in information security. If we know it, and we've known it for long enough, then, by design, it should no longer be an issue.

It’s very easy to sit in a comfortable chair and criticise one’s peers or colleagues. That’s not my remit here; I have profound respect for anyone who’s doing something. It’s equally vain to just shrug it off and pretend that it can only happen to others. One day, you will eventually be the “other”. You feel for all those impacted by such a breach, even though there are evidently far more important and painful issues in the world to be worrying about at the moment than this particular breach of confidentiality.

This short essay is just a reminder, perhaps to myself as well, that if certain undesired technology issues don't appear to be going away, we may not be approaching them the right way. Granted, the news reporting isn't always up to scratch, but we do regularly learn that some very simple practices could have prevented the issues that get reported.