Vivek Haldar

The perils of being a knowledge worker

When I was a graduate student, it struck me as particularly strange that we were represented by a union, the United Auto Workers (UAW). But we were graduate students! We thought for a living! We were more than just labor, weren’t we? We didn’t need collective bargaining! Turns out, in at least one sense, graduate students were just labor.

That was when I began to sour on the term “knowledge worker.”

There is an implicit connotation that knowledge work is higher-skill and higher-status. Matt Crawford stuck a nail into that notion in his book “Shop Class as Soulcraft”, in which he delved into both why “knowledge work” can be dull and soul-crushing, and why working with your hands on physical objects can be extremely cognitively challenging.

As a programmer, most of what I do is challenging and interesting, but once in a while I get the creepy feeling that all I’m doing at my computer is pulling levers. Think about how much code you can churn out in a modern IDE by just auto-completing, then think about the frightfully small distance to go before even that can be automated. Dan Benjamin often mentions on his podcast how easily replaceable Java developers are. “When you’re gone, the floor will part and another Java dev will appear, shrink-wrapped with his chair. They’ll wheel him to your cubicle, take off the shrink-wrap, and business will continue as normal.”

The culture of Silicon Valley can be traced back to Robert Noyce, and he was dead-set against unions:

As the work force grew at Intel, and the profits soared, labor unions …made several attempts to organize Intel. Noyce made it known, albeit quietly, that he regarded unionization as a death threat to Intel, and to the semiconductor industry generally. Labor-management battles were part of the ancient terrain of the East. If Intel were divided into workers and bosses, with the implication that each side had to squeeze its money out of the hides of the other, the enterprise would be finished. Motivation would no longer be internal; it would be objectified in the deadly form of work rules and grievance procedures. The one time it came down to a vote, the union lost out by the considerable margin of four to one. Intel’s employees agreed with Noyce. Unions were part of the dead hand of the past.

This ethic carries on today. The idea of unionizing tech workers is still considered oxymoronic. And that is a good sign because the hankering for unionization is a leading indicator that your field is going to get eaten by automation. It means that the only bargaining chip you have left is control over the aggregate supply of labor, not skills or creativity.

For another example, take law. What was once thought of as purely human and beyond the touch of machines is crumbling to automation. Robot lawyers are creeping into the bottom end (routine contract generation, search and discovery) and gradually climbing up the value chain.

Claims made about the impact of automation on the law are not entirely speculative, nor are they new. E-discovery, which applies modern search technologies to help manage the massive amounts of data in litigation, has already seen significant coverage both within and beyond the legal industry in the past decade… However, what may be more novel is that automation is moving increasingly beyond incremental improvements to tools used by lawyers in the “back office.” Automation more and more touches the actual work product received by clients, as well as “front office” interfaces that the public uses to access legal services and the legal system at large.

This isn’t manufacturing and factory workers the author is talking about.

Clearly, the moat of security around being a “knowledge worker” is drying up fast.

At this point, if you are a knowledge worker, you should be wondering whether your job can be saved, and how you’ll know.

Venkat Rao has a deep meditation on this very topic. Go read the whole thing. The tl;dr is that while we worry about saving the sexy, a.k.a. creative, work from automation, the real opportunity for gainful human employment lies in the dull, unsexy schlep work around the fringes of what machines do: tasks that are not worth automating. In other words, exploiting the arbitrage when your brain is cheaper than CPU time. The best example of this is humans doing image processing on Mechanical Turk. The essay is an excellent deconstruction of the platitude that “computers can’t take over creative work.” If you dig deep down, a lot of creative work isn’t creative at all.

A lot of what people think of as creativity is variation within a theme. Computers can actually be pretty good at it. Consider the following three computer-generated pieces of art:

All that intricacy and variation was generated by the following code and a sprinkling of randomness:

startshape S
background { b -.5 sat 1 }

// A square with a half-size square nested inside it
rule MYSQUARE {
    SQUARE { b -1 }
    SQUARE { s 0.5 }
}

// A circle with a half-size circle nested inside it
rule MYCIRCLE {
    CIRCLE { b -1 }
    CIRCLE { s 0.5 }
}

// Recursive step: place four children in a 2x2 grid
rule S {
    X1 { x -.5 y -.5 }
    X1 { x -.5 y .5  }
    X1 { x .5  y -.5 }
    X1 { x .5  y .5  }
}

// Each child is randomly a circle (weight .13), a square (.13),
// or a smaller, brighter copy of the whole grid (.74)
rule X1 .13 { MYCIRCLE { b 1 } }
rule X1 .13 { MYSQUARE { b 1 } }
rule X1 .74 { S { s .5 b .3 } }

A good fraction of human creativity follows the same pattern of generative expansion from a small kernel. Is it that hard to imagine a program that generates pleasing web designs and puts web designers out of business? You would need a generative component, one that produces a large number of designs given a small spec, and a filtering component that whittles them down to a small number of top designs that a human could then select from based on whimsy. The first component is easy to write today. The second component becomes possible if one feeds a large number of “good” web designs to a machine-learning program.
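To make the two-component idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the design attributes, the palettes, and especially the `score` function, which here is a hand-written heuristic standing in for the machine-learned filter trained on “good” designs that the paragraph above imagines.

```python
import random

# Hypothetical design space: a tiny "spec" of attributes to vary.
PALETTES = [("#1a1a2e", "#e94560"), ("#f5f5f5", "#0f3460"), ("#fffbe6", "#2d6a4f")]
FONTS = ["serif", "sans-serif", "monospace"]
LAYOUTS = ["single-column", "two-column", "grid"]

def generate_designs(n):
    """Generative component: random variation within a theme."""
    return [
        {
            "palette": random.choice(PALETTES),
            "font": random.choice(FONTS),
            "layout": random.choice(LAYOUTS),
            "base_size_px": random.choice([14, 16, 18]),
        }
        for _ in range(n)
    ]

def score(design):
    """Filtering component: a toy heuristic standing in for a
    model trained on a corpus of good designs."""
    s = 0
    if design["font"] == "sans-serif":
        s += 1
    if design["base_size_px"] >= 16:
        s += 1
    if design["layout"] != "grid":
        s += 1
    return s

def top_designs(n_candidates=1000, keep=5):
    """Generate many candidates, keep the few a human picks from."""
    candidates = generate_designs(n_candidates)
    return sorted(candidates, key=score, reverse=True)[:keep]

for design in top_designs():
    print(design)
```

The point of the sketch is the shape of the pipeline, not the scoring logic: swap the heuristic for a learned model and the human’s role shrinks to picking a favorite from the shortlist.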

So creativity isn’t going to save knowledge workers. That moat is looking thinner and drier.

A more useful direction to look is ambiguity. Dealing with ambiguity is the step before creativity. The creative process comes after beating down the ambiguity of what to create and the constraints within which it must exist. In a sense, it is meta-creativity. I’m still stumbling through this idea, but my gut tells me that chasing ambiguity is likely to be an effective hedge against machines eating your job. Stay tuned.