David Stairs

“I think that it’s fairly likely that it will not take too long of a time for the entire surface of the Earth to become covered with data centers and power stations. Once you have one data center, which runs lots of AIs on it, which are much smarter than humans, it’s a very useful object. It can generate a lot of value. The first thing you ask it is: Can you please go and build another one?” — Ilya Sutskever, as quoted by Cade Metz in Genius Makers (Dutton, 2021), pp. 299-300.

This sounds like a description of Coruscant, the Star Wars planet whose entire surface is built over.

“[F]or several decades the computing power found in advanced Artificial Intelligence and Robotics systems has been stuck at insect brain power of 1 MIPS. While computer power per dollar fell [should be: rose] rapidly during this period, the money available fell just as fast. The earliest days of AI, in the mid 1960s, were fuelled by lavish post-Sputnik defence funding, which gave access to $10,000,000 supercomputers of the time. In the post Vietnam war days of the 1970s, funding declined and only $1,000,000 machines were available. By the early 1980s, AI research had to settle for $100,000 minicomputers. In the late 1980s, the available machines were $10,000 workstations. By the 1990s, much work was done on personal computers costing only a few thousand dollars. Since then AI and robot brain power has risen with improvements in computer efficiency. By 1993 personal computers provided 10 MIPS, by 1995 it was 30 MIPS, and in 1997 it is over 100 MIPS. Suddenly machines are reading text, recognizing speech, and robots are driving themselves cross country.”

Hans Moravec wrote this twenty-five years ago. For decades AI proponents have assumed that Moore’s Law, describing the steadily growing number and speed of transistors on computer chips, would keep delivering such large reductions in the cost of computation. As for human/machine comparisons: the Watson supercomputer defeated the world’s top Jeopardy champions, and DeepMind’s AlphaGo defeated world Go champion Lee Sedol.

Data centers are estimated to use about 1% of the world’s electricity at 2020 levels of consumption. For comparison, a typical wind turbine produces about 1.5 megawatts of power in a 12 mph wind.
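To get a feel for the scale those two figures imply, here is a rough back-of-the-envelope sketch. The world-consumption figure (roughly 23,000 TWh in 2020) and the generous assumption that every turbine runs around the clock at its full 1.5 MW rating are mine; only the 1% share and the turbine rating come from the paragraph above.

```python
# Rough scale check (a sketch; the world-consumption figure and the
# assumption that each turbine runs flat out at its rating are mine).

WORLD_ELECTRICITY_TWH_2020 = 23_000   # assumed world electricity consumption, TWh/year
DATA_CENTER_SHARE = 0.01              # ~1% of world electricity, per the estimate above
TURBINE_RATING_MW = 1.5               # rating of the turbine mentioned in the text
HOURS_PER_YEAR = 8_760

data_center_twh = WORLD_ELECTRICITY_TWH_2020 * DATA_CENTER_SHARE      # ~230 TWh/year
avg_power_mw = data_center_twh * 1_000_000 / HOURS_PER_YEAR           # TWh -> MWh, spread over the year
turbines_needed = avg_power_mw / TURBINE_RATING_MW

print(f"Data center demand: ~{data_center_twh:.0f} TWh/year (~{avg_power_mw:,.0f} MW average)")
print(f"Turbines at full rating, around the clock: ~{turbines_needed:,.0f}")
```

Even under those generous assumptions the answer is on the order of 17,000 turbines, and with realistic capacity factors the number would be several times higher.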

A recent CNET piece on cyber-currency generation asked: “How much energy does mining take?”

The Digiconomist’s Bitcoin Energy Consumption Index estimated that one Bitcoin transaction takes 1,544 kWh to complete, or the equivalent of approximately 53 days of power for the average US household.

To put that into monetary terms, the average cost per kWh in the US is 13 cents. That means a Bitcoin transaction would generate more than $200 in energy bills.
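Those two numbers check out arithmetically. Here is a minimal sketch, assuming an average US household uses roughly 29 kWh per day (about 10,700 kWh a year, an assumption of mine); the per-transaction energy and price figures are the ones quoted above.

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumption: an average US household uses roughly 29 kWh per day;
# the other numbers come from the text.

KWH_PER_TRANSACTION = 1_544        # Digiconomist estimate, kWh per Bitcoin transaction
AVG_US_RATE_USD_PER_KWH = 0.13     # average US electricity price, per the text
HOUSEHOLD_KWH_PER_DAY = 29         # assumed average US household daily consumption

cost = KWH_PER_TRANSACTION * AVG_US_RATE_USD_PER_KWH
days_of_household_power = KWH_PER_TRANSACTION / HOUSEHOLD_KWH_PER_DAY

print(f"Energy cost per transaction: ${cost:.2f}")                     # ~ $200.72
print(f"Days of household power:     {days_of_household_power:.0f}")   # ~ 53 days
```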

Bitcoin mining requires enormous amounts of electricity to run the necessary server farms. It is interesting that tech investors, supposedly some of the same people pursuing machine intelligence, could be so blind. (Although avarice does tend toward long-term myopia.)

Bitcoin mining used more energy than Argentina, according to an analysis from Cambridge University in February 2022. At 121.36 terawatt-hours, crypto mining would rank in the top 30 countries by energy consumption.

So running server farms, whether to mine crypto or to service AI and superintelligence, is not so much dumb as just plain inefficient. And inefficiency had better be prolific if it hopes to survive, prolific in the way a maple tree goes to seed. Prolific like goldenrod pollen in September. Prolific like the Covid-19 Omicron variant.

Nature, which is prolific by default, wastes with a purpose. And, while you might argue that the purpose of Bitcoin mining or AI computing is the accumulation of wealth and the generation of convenience, unlike survival these ends are neither sustainable nor justifiable on a planet of rapidly diminishing resources.

The current commercial application of AI in voice-recognition personal assistants, or in the security-oriented facial recognition algorithms beloved of police departments, creates more problems than it solves once privacy, data security, and civil liberties are taken into account.

I am still looking for an answer to my titular question. Superintelligence, AI, powerful parallel computing, and the Singularity are all concepts supposedly nearing realization, our salvation writ large in the cipher of machine code. But for me the vision is obstructed and I don’t see it. Maybe, locked as I am in my meat prison, I am just too slow, too dumb, and too blind to appreciate the approaching Tech Rapture.

Then again, maybe not. There is wisdom in meat, as well as protein.

David Stairs is the founding editor of the Design-Altruism-Project.