Warning: Memetic Hazard!

I habitually prowl odd corners of the Internet, but plenary coverage takes time. There are still, I am reminded almost daily, large and undiscovered countries.

I charted a new one today (although I had glimpsed its coastline from afar on several occasions). It began with a Tweet from the account calling itself “Outsideness”:

https://twitter.com/Outsideness/status/412591534668656640

This led to the “transhumanist” website LessWrong.com, and a discussion thread containing the full version of the quote excerpted just above:

Are there any mechanisms on this site for dealing with mental health issues triggered by posts/topics (specifically, the forbidden Roko post)? I would really appreciate any interested posters getting in touch by PM for a talk. I don’t really know who to turn to.

Sorry if this is an inappropriate place to post this, I’m not sure where else to air these concerns.

The “forbidden Roko post”? How ominous. I was reminded at once of a short story that I linked to some years back: The Riddle of the Universe and its Solution, by Christopher Cherniak.

Well, it turns out that the forbidden post is known as “Roko’s Basilisk”. It’s gone from LessWrong, but it’s copied here, and you can read about it here. (The idea is described in brief, and with obvious relish, in this YouTube clip.)

Basically, the notion — if I have it right after reading up on it for an hour or so — is this:

Thanks to all our tinkering with computers, sooner or later a Godlike, superintelligent machine will bootstrap itself into existence, will consider itself the teleological endpoint of human evolution, will build an ethical framework based on that axiom, and will decide that anyone who had ever considered the possibility of such an entity’s existence, but who had not devoted all their resources to bringing it into existence, was guilty of an unforgivable sin. This Entity will create a perfect simulation of the world’s history, including you and me, and will, inside that simulation, consign all such sinners to excruciating torment.

Now the kicker: how do we know we are not already living inside the simulation? (Cue creepy music.)

Apparently this idea has had such an upsetting effect on more than a few people that all discussion of it has been banned from LessWrong’s chat-boards. Some have been disturbed to the point that they have actually begun emptying their wallets (with LessWrong.com itself as the beneficiary), or at least losing sleep.

I’ll confess that I find all of this terribly boring. For one thing, the idea that we’re living in a computer simulation is nothing new, and the whole idea rests, as I pointed out five years ago, here and here, on the assumption that computer simulations can be conscious. (I couldn’t care less if some simulation of me is being roasted in Hell inside some CPU somewhere, as long as that simulation isn’t self-aware, conscious me.) Given that we have no idea whatsoever how consciousness arises, and that the only things we know to be conscious are our own biological brains, I can see no reason to bump “freak out about being tortured, as software, by software” up to the top of my to-do list.

Maybe I’m just too dull-witted. There are certainly plenty of smart people over at LessWrong, and they seem to be taking all of this very seriously. Perhaps if I thought about it really hard, hard enough to get around my objections, I’d suddenly understand it well enough to be compelled to take it seriously, which of course would qualify me for eternal torment by our malevolent AI overlord — whereas if I persist in blithe ignorance, I’m off the hook. A “Pascal’s Wager” in reverse, if you like.

Of course, perhaps knowing what I already know puts me on the hook for not making such an effort.
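Purely for illustration, here is that reverse wager cashed out as a toy expected-utility calculation, sketched in Python. Every probability and payoff below is a placeholder of my own invention, not anything taken from LessWrong; only the structure of the argument matters.

    # A toy expected-utility comparison for the "reverse Pascal's Wager".
    # All probabilities and payoffs are illustrative placeholders.

    P_BASILISK = 1e-6   # hypothetical chance the Basilisk scenario is real
    TORMENT = -1e9      # payoff if it is real, you understood, and didn't help
    WORRY_COST = -10    # cost of taking the idea seriously (lost sleep, donations)

    # Option A: think hard enough to understand the argument, then decline to
    # devote your resources to the Entity -- on this story, you are "on the hook".
    eu_understand = P_BASILISK * TORMENT + WORRY_COST

    # Option B: remain blithely ignorant -- on this story, ignorance exempts you.
    eu_ignore = 0.0

    print(f"Understand the argument: {eu_understand:,.2f}")
    print(f"Stay ignorant:           {eu_ignore:,.2f}")

In Pascal’s original wager it is belief that dominates; here, for any nonzero probability you assign the Entity, ignorance comes out ahead.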

Time will tell, I guess. I have plenty of other things to worry about, so I’ll take my chances. See you in Hell.

5 Comments

  1. This comment has nothing to do with this post, Malcolm, but I wanted to make sure you didn’t miss this latest tasteless display by our fearless leader: http://www.americanthinker.com/blog/2013/12/creepiest_obama_story_yet_just_got_creepier.html, http://www.americanthinker.com/blog/2013/12/creepiest_obama_story_yet.html and http://www.nationalreview.com/corner/366499/government-mere-mortals-mark-steyn. He’s beginning to rival the tinpot dictators.

    Posted December 17, 2013 at 8:01 am
  2. Kevin says

    Malcolm,

    I think you and I are largely in agreement that the simulation hypothesis isn’t really worth considering. See here and here. Our main difference is that I casually grant the possibility that a simulation could be/become conscious: I’m 80-90% a Kurzweilian functionalist.

    Posted December 17, 2013 at 9:57 am
  3. Malcolm says

    Kevin, being a functionalist means you don’t have the benefit of my principal reason for skepticism about all this.

    If you accept the possibility of conscious simulations, then isn’t it possible that the Basilisk just might git ya?

    Posted December 17, 2013 at 9:22 pm
  4. Malcolm says

    To Eliezer Yudkowsky or anyone else who may ever wander over this way from LW: I’m sure I’m just coming across as a big dope here, who simply isn’t bright enough to see the problem.

    In this case, that might be advantageous.

    Posted December 17, 2013 at 9:26 pm
  5. Bill says

    My impression of the artificial intelligence crowd is that they don’t understand intelligence. Not that anyone else does either. I am always left with the idea that they think that if they just have a bigger set of computational units or a fancier bag of algorithmic tricks, they will succeed. First they need to let go of algorithmic thinking, but in so doing they will also have to let go of the “artificial”, since every computer has algorithms embedded somewhere.

    Having spent most of my adult life working with computers, I find them nothing more than fancy abacuses. That is a far cry from the way even the simplest neural networks operate. Those are modes of operation I have yet to see exactly simulated in electronics.

    Posted December 20, 2013 at 6:59 pm
