Tech Industry Workers Have Become Cogs In a Broken System

What’s wrong with the tech industry? That’s the question that has kept me up at night, wished me good morning, and haunted my dreams over the last few years. It’s the one I get asked again and again, usually early in a conversation, when I’m still expecting to be asked about my kids and stuff.


The tech industry drama has been escalating, especially over the last year, during which I’ve been uncovering the industry’s big problems: lousy products nobody wants, over-reliance on the misguided promises of AI, a hiring process stuck in a quagmire, scared executive leadership, and a rapid decline in productivity.


Take Optifye, a startup selling AI-powered, camera-based productivity monitoring for factory lines, whose promotional video recently drew a wave of backlash. In the video, one of Optifye’s co-founders plays a garment factory worker and another plays the factory line manager, with the latter berating the former for not hitting his quotas and pointing to Optifye’s camera-assisted dashboard in real time to make his case.


Let’s take Optifye out of the equation, because its product is indeed aimed at blue-collar line workers. You can be mad about that, and that’s fair, but the theory I’m about to lay out isn’t about the tech; it’s about the money going into the tech.


My friend has a point. Maybe. To me, new money going into AI-and-computer-vision monitoring software to measure productivity seems like new money going into the time-card clocks Fred Flintstone punched out on at the end of his workday.


Here’s the reckless speculation. If you want to measure the productivity of white-collar workers—someone sitting behind a desk, either onsite or remotely, and doing things like “thinking, talking, and typing” instead of “assembling, checking, and approving”—and you want to be able to call them out like you’d call out line worker #17, well, you’d need much better monitoring software, with an AI kick, to be able to do that.


The use of monitoring software in the white-collar environment is not a new concept. It’s not even a pandemic-era concept. But like most workplace technology since 2020, it has had to adapt to decentralization, trying to rein in productivity that got scattered across the globe.


You don’t have to look any further than return-to-office mandates to see that arguments about productivity are being wielded to preserve an idea of measurability, consistency, predictability, and profitability.


And yeah, it’s hard to argue that you can trust that someone who is paid mostly to “think, talk, and type” is actually using those skills for the benefit of the company and not for, say, their pickleball tournament seeding. 


The most obvious and ugly side effect AI has wrought on the knowledge worker is the notion that white-collar productivity can now be accurately measured. Fine. However, those metrics only mean something when the goal is to “assemble, check, and approve” as much output as possible.


So let me relate this to my other tech industry beef: Scrum, Jira, and Agile hardening from a methodology into an organized religion. If the goal is simply X number of features every quarter, regardless of the value of those features to the customer, then a good way to measure progress and productivity toward that goal is to make sure we’re all “thinking, talking, and typing” the right way.


I’m a capitalist at heart, probably dyed in the wool, but this is not my first time pushing back against the quantification of the output of knowledge workers. And every time, the first argument I get is that I’m just some sort of hippie-dippy, work-should-be-fam kind of leader.


Who can do that? The problem-solvers, the fixers, the innovators, the outside-the-boxers. And we can’t measure their output with charts and graphs, let alone monitor their progress with AI and cameras or keystroke-monitoring software.