
(I seriously never imagined I’d have to write that title.)

Today I came across an odd analysis of several legitimate problems by Barry Devlin (via Marshall Kirkpatrick’s Google+ feed).

I’m sure the analysis is well intentioned, and perhaps I have misread some of his claims, but he appears to be blaming three major societal problems on the one thing they all seem to have in common … the use of algorithms.

The three problems he lists are:

  • Insurance companies overanalysing patient data to deny patients coverage
  • Automated stock-trading software exploiting time delays to beat poor human traders
  • Movie studios analysing data to determine which movies people like, to get the biggest bang for their buck

These are legitimate and worrying problems, but placing blame on overuse of algorithms per se is kinda strange.

Algorithms are simply systems which solve problems. They can be as simple as a recipe for baking a cake or as subtle and complex as Google’s search algorithm. Trying to encourage people to use fewer algorithms in the modern world is like encouraging people “to use fewer hammers” and beat the nails in with their hands. “We’re just throwing up buildings at an unnatural rate because of all these fancy hammers.” The problems he points out stem from other choices that are implicitly being made; the intensive use of algorithms does not cause them.
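To make that concrete, here’s a toy sketch of the “recipe as algorithm” idea. The steps, quantities, and function name are all invented for illustration; the point is only that a recipe is already an algorithm, a fixed sequence of steps turning inputs into an output.

```python
# Toy illustration: a recipe is just an algorithm -- an ordered sequence
# of steps that turns inputs (ingredients) into an output (a cake).
# The steps and quantities here are invented for the example.

def bake_cake(flour_g, sugar_g, eggs):
    """Run the 'recipe algorithm' and describe the result."""
    batter = f"batter({flour_g}g flour, {sugar_g}g sugar, {eggs} eggs)"  # step 1: mix
    cake = f"baked {batter} at 180C for 40 minutes"                      # step 2: bake
    return cake                                                          # step 3: serve

print(bake_cake(300, 150, 2))
```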

The problem of insurance companies overanalysing their clients and denying coverage is inevitable when you have an unregulated, profit-driven insurance industry. Regulate the industry so that insurers can’t use certain information, or so they cannot deny coverage in certain circumstances. Or, if it is the US you are worried about, switch to a single-payer health care system and take most of that power away from insurance companies altogether. Obviously they are going to do everything they can to make money. Either take away their incentive or restrict what they can do; complaining that they should not try so hard with whatever data they are allowed to use doesn’t make sense.

High-speed, automated trading is a very important issue which needs to be addressed. But again, this isn’t about not allowing people to do as much analysis as they want; it is about levelling the playing field. Why should large trading companies get an advantage because they can afford larger servers or can rent the rooms beside the NY Stock Exchange computers to reduce their lag time? Implement a regulation saying that there must be a fixed minimum delay between all trades. Or alter the trading software in the markets to only accept trades every x microseconds. Again, saying algorithms are the problem is saying that traders are playing the game too well when you are only giving them incentives to play that game.
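To show what “only accept trades every x microseconds” could look like, here’s a minimal sketch of a batched matching rule. The batch length, order fields, and matching rule are all assumptions made up for illustration, not a description of how any real exchange works: orders are grouped into fixed time windows and matched by price, so shaving a few microseconds off your lag within a window buys you nothing.

```python
# Toy sketch: collect orders into fixed-length batches and match each batch
# together, so arrival order inside a batch gives no speed advantage.
# The batch length and order format are invented for illustration.

from collections import namedtuple

Order = namedtuple("Order", "trader side price arrival_us")

BATCH_US = 1000  # assumed: process the book once every 1000 microseconds

def batch_orders(orders):
    """Group orders by the batch window their arrival time falls into."""
    batches = {}
    for order in orders:
        window = order.arrival_us // BATCH_US
        batches.setdefault(window, []).append(order)
    return batches

def match_batch(batch):
    """Within a batch, match by price only -- arrival time is ignored."""
    buys = sorted((o for o in batch if o.side == "buy"), key=lambda o: -o.price)
    sells = sorted((o for o in batch if o.side == "sell"), key=lambda o: o.price)
    trades = []
    for buy, sell in zip(buys, sells):
        if buy.price >= sell.price:
            trades.append((buy.trader, sell.trader, (buy.price + sell.price) / 2))
    return trades

orders = [
    Order("fast_fund", "buy", 101.0, arrival_us=5),     # arrives almost instantly
    Order("slow_human", "buy", 102.0, arrival_us=900),  # much slower, same window
    Order("seller", "sell", 100.0, arrival_us=400),
]
for window, batch in sorted(batch_orders(orders).items()):
    print(window, match_batch(batch))
```

In this toy run the slower trader wins the match because they offered the better price, which is the levelling effect the regulation would be after.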

As for movies, and how collaborative filtering will help studios understand exactly what kinds of movies people are willing to pay the most for, he answers the question himself. Any studio that only tries to make movies like last year’s hits is going to lose out to a more creative studio that actually makes popular movies no one was expecting. That’s not the algorithm’s fault; that’s just bad marketing strategy.
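For anyone unfamiliar with the term, here is a minimal sketch of the sort of collaborative filtering being described: predict how much a viewer would like a movie from the ratings of viewers with similar tastes. The viewers, titles, and ratings are entirely made up, and real recommender systems are far more elaborate.

```python
# Minimal collaborative-filtering sketch: score an unseen movie for a viewer
# as a similarity-weighted average of other viewers' ratings.
# Viewers, titles, and ratings are invented for illustration.

import math

ratings = {
    "alice": {"Heist Movie": 5, "Space Sequel": 4},
    "bob":   {"Heist Movie": 4, "Space Sequel": 5, "Indie Drama": 5},
    "carol": {"Heist Movie": 1, "Space Sequel": 5, "Indie Drama": 2},
}

def cosine(u, v):
    """Cosine similarity over the movies two viewers have both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[m] * v[m] for m in common)
    return dot / (math.sqrt(sum(u[m] ** 2 for m in common)) *
                  math.sqrt(sum(v[m] ** 2 for m in common)))

def predict(viewer, movie):
    """Weighted average of other viewers' ratings for an unseen movie."""
    num = den = 0.0
    for other, their in ratings.items():
        if other != viewer and movie in their:
            w = cosine(ratings[viewer], their)
            num += w * their[movie]
            den += w
    return num / den if den else None

print(predict("alice", "Indie Drama"))  # alice's predicted rating
```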

So I just don’t see where he’s coming from. Algorithms completely permeate our lives; they always have. Computers just make it more obvious.
