
This is in reply to this article from the BBC: “AI to hit 40% of jobs and worsen inequality, IMF says” (Annabelle Liang, BBC, January 15, 2024).

Some quotes and my responses:

More generally, higher-income and younger workers may see a disproportionate increase in their wages after adopting AI.

This isn’t obvious to me. I guess the “may” hedges here, but it’s basically an empty statement. Some higher-income workers (legal clerks?) may find their jobs vanishing entirely, while some older workers who can figure out prompt tuning may find their value shoots up dramatically because they can handle their workload more easily.

Lower-income and older workers could fall behind, the IMF believes.

Don’t they always? Seriously, this has been happening for decades and will continue until we take a broader view of the meaning of work itself. The bigger question we need to answer as a society, and one I don’t expect the IMF, the WEF, or the World Bank to answer, is: are we “workers” or are we “citizens”? Are we perhaps even “people”?

There’s more to life than work, or at least there could be. Is it possible that the “AI Transition” is a transition to a world where fewer people need to work at all, yet don’t need to suffer total destruction to do so?

“It is crucial for countries to establish comprehensive social safety nets and offer retraining programmes for vulnerable workers,” Ms Georgieva said. “In doing so, we can make the AI transition more inclusive, protecting livelihoods and curbing inequality.”

This is a good first step, but still demonstrates limited foresight.

Even talking about an “AI Transition” assumes we know where we are transitioning to! Or at the very least, it assumes there will be a transition from one stable equilibrium of society, or the economy, to another. For example, after the widespread scaling of manufacturing in the late 1800s and early 1900s (think the Ford Model T), there were huge societal convulsions around the idea of labour rights, what it meant to be an employee, an employer, and a caring society. The labour movement, the Depression, and the New Deal social welfare state all arose during and out of that transition. War and climate issues were important parts of that, but one very important part was a sudden technological shift that changed the nature of work and livelihood for a huge portion of the people in Western societies.

So a transition like this isn’t simple or guaranteed to happen within a few years, and it isn’t guaranteed to happen without widespread societal disruption. If Generative AI really is going to lead to a transition as big as these previous ones, then we don’t know what the new equilibrium will look like at all, how long it will take to get there, or how painful or smooth it will be.

An Opinion on the Future

If you’ll notice, I’m even hedging on whether it really will lead to such a transition. That’s because the current wave of AI technologies that everyone is excited about has limitations and risks no one has internalized yet, even as applications of it continue to develop at breakneck speed. The risks aren’t just about what the technology can do and what people will use it for; there are also what I suppose you could call investment-style risks. As Gary Marcus and only a few others are adamantly pointing out, the current wave of tools may be built on a house of cards that will eventually be deemed a huge illegal theft of intellectual property, on a scale we haven’t seen since the MP3 Wars at the turn of the century.

Despite all the users and money involved, that battle ended up being “won” by the IP holders, who sued Napster and others out of business and implemented their own protocols for online purchasing, copy protection, and so on. The fact that it all seems long ago and irrelevant now, in our era of music and video streaming, is beside the point. ChatGPT and most of the others are trained on vast amounts of data that was collected under false pretences, ostensibly to demonstrate research ideas, and then used to build pay-for-use trained models that don’t cite their sources or compensate creators. We don’t know how those lawsuits will play out, but the particular players and tools we are using now might be entirely different in five years depending on how it all shakes out.