Friday, January 20, 2023

Let's Talk About "Time Theft"

So, this morning, I woke up to find the lurid headline "What is Time Theft, and Why Are Some Employers So Worked Up About It?" on CBC. After reading it, I'm still spitting nails over the article and its suppositions.

First, let's start with the general idea of "time theft". This notion has been around ever since the early days of the Industrial Revolution. Whether it's called "slacking off", "goldbricking", or now "time theft", the underlying idea is the same: if you aren't going full out 100% of the time at work, you are somehow "stealing time" from your employer.

The roots of the idea lie in a very old, feudal concept: that you somehow "owe" your employer for the privilege of having a job, combined with the nobility's deep suspicion that the peasants would do everything they could to short-change their "lord". In the feudal era, that was a very direct, tangible exchange - the "lord" was entitled to a percentage of your crops, or of whatever else you produced.

That model works well when things are tangible and concrete, but work hasn't been that concrete since the end of the 19th century. The emergence of administrative workers, supervisory jobs, and knowledge workers quickly began to erode that direct, tangible exchange. In today's world, where an increasing number of people do knowledge work, the idea of "time theft" is not only archaic, but downright wrong.

Let me explain. Consider the software developer for a moment. It's a relatively new profession - only really in existence since the 1950s - and the struggles around measuring developer "productivity" are better documented there than in most other domains I can think of. At first, management mostly saw programming as some kind of voodoo, and programmers pretty much got to do as they pleased without oversight, because nobody understood what exactly they were doing. Then various attempts at "managing" things crept in: setting deadlines around projects, trying to "estimate" how long particular tasks might take, and so on.

None of this worked particularly well, for a variety of reasons, but perhaps the most notorious attempt at measuring productivity came in the form of "Lines of Code". Basically, the idea was that a programmer would be measured on how many lines of code they wrote in a day. That turns out to be a terrible metric, because a lousy programmer can churn out hundreds of lines of absolutely useless code, while a talented programmer might turn out only a handful of lines that are robust and reliable.
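
To make that concrete, here's a contrived Python sketch of my own (the article contains no code, so this is purely illustrative): both functions compute the same sum, but one of them "produces" far more lines.

```python
# Verbose version: many lines, no extra value, and more places for bugs to hide.
def total_verbose(values):
    total = 0
    index = 0
    while index < len(values):
        total = total + values[index]
        index = index + 1
    return total

# Concise version: a single line that is easier to read and harder to get wrong.
def total_concise(values):
    return sum(values)

print(total_verbose([1, 2, 3]))  # 6
print(total_concise([1, 2, 3]))  # 6
```

By a lines-of-code count, the first programmer looks several times as "productive" - which is exactly backwards.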

The observation that comes out of this is that a lot of "work" happens in people's heads, and time away from the keyboard doesn't necessarily mean "stolen time". I've had many occasions where I spent hours thinking a problem through, only to find that the solution was a handful of lines of code - or even a few characters in one line. While I was thinking about the problem, I might get up and grab a coffee, go downstairs for a snack, vacuum the rugs if I was working at home, and so on. None of that is "stolen time" per se, even if a boss watching monitoring software thinks I wasn't "working".
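
To illustrate what I mean by "a few characters", here's another contrived sketch (a hypothetical off-by-one bug, not one from a real project of mine): hours of analysis can end in a four-character deletion.

```python
items = ["a", "b", "c"]

# Buggy version: the "+ 1" walks one position past the end of the list
# and raises an IndexError on the final iteration.
#
#   for i in range(len(items) + 1):
#       print(items[i])

# Fixed version: after all that thinking, the entire visible change
# is deleting four characters.
for i in range(len(items)):
    print(items[i])
```

Any metric that counts keystrokes or lines of output would score those hours of thinking at essentially zero.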

The argument being made is essentially that "doing anything that benefits you" while "at work" is somehow "theft", because you're being paid for that time. In the context of working from home, that means doing things like starting a load of laundry or running a vacuum over the floors is somehow "stealing" from your employer. The reality is that those "thefts" of time have always happened, whether one is working at home or in a formal office. What do they think those conversations at the water cooler, or in a colleague's office, amount to? It doesn't mean that productive work isn't happening; it means that the person is taking a break in order to be able to finish thinking something through. (And yes, thinking something through often means giving the brain a rest.)

This is basic human factors stuff. Whether you are working at physical tasks or doing something more abstract, it's necessary to take breaks and do other things for periods of time in order to maintain productivity. You could argue "well, do something else, just as long as it's work related". Except that's not how people function.

If the tasks of the job aren't getting accomplished, an employer might have reason to take up with the employee whether or not they are doing their job. However, especially in knowledge work and other kinds of labour that are now possible to do remotely, we have to recognize that current approaches to tracking productivity are mostly garbage. Intrusive technologies like activity logging are arguably abusive, and they make the same error as the old "lines of code" metric; "a bum in a chair" is a similarly useless way to track someone's working habits.

We are also a long way from the days when it was practical to directly oversee the work going on - far too much of the work in today's world is abstracted by technology, and more and more of the physical work is being automated to the point that those jobs are gradually disappearing. (Yes, that's a problem, but it's another issue entirely.)

Employers are essentially demanding absolute "loyalty" from workers, but offering little in return except a paycheque at the end of the day (and if the employer goes bankrupt, workers are at the end of the line of creditors - another part of our system that is broken). Arguably, employers who refuse to pay their workers a living wage, or who engage in active wage suppression games, are themselves engaging in a form of theft from their workers.

It is high time for a new compact between employers and workers, and this one has to be legitimately reciprocal. It has to be built with a less cynical view on both sides. 
