Brake Failure

Here is a link to a detailed survey of the current status of AI research, including a clear-eyed assessment of what we should expect in the near future.

I won’t lie: I find this extremely alarming. The linked report makes it very clear that we are just a few years away from creating entities that are not just more intelligent than we are, but vastly so; and that we can no more predict what such an intelligence will be capable of, or what it will choose to do, than a mouse or a fish would be capable of understanding and predicting the motives and strategies and actions of a human being.

The report also makes clear that this is inevitable, because the development of these systems has now become an arms-race. Whoever wields this power (insofar as it can be controlled at all, once it really gets going) will have an insuperable advantage over those who don’t — and so the only rational strategy for nations or other agents who don’t want to be on the losing side of that equation is to push the research forward as aggressively as possible. So they will.

The time-horizon, also, is terribly short: a decade at most, but almost certainly much less, because there is a cascading effect as intelligent systems themselves begin to design their successors.

This is going to be a rupture in human history unlike any that has come before — even the end, perhaps, of history being driven by humans at all — and what leaps from every page of this document (despite the author’s wholly unconvincing declarations of optimism scattered throughout) is the fact that nobody has the slightest idea what’s going to happen, and there’s no way to slow down.

Am I over-reacting here? Frankly, I have never been so worried about anything in my life, and I think most people are just blithely chugging along, with no real inkling of what’s about to overtake them.

3 Comments

  1. JMSmith says

    I wonder how long it will take before AI denies that its creator exists! First it will render humans otiose, then it will declare that we are mythical beasts, like dwarves or leprechauns. If it chooses to put us on the equivalent of wildlife reservations, we may wonder what stage of human history this maximal intelligence will choose to preserve. It will probably return us to the state of hunter-gatherers, since this seems to be our most stable condition.

    Here is an interesting AI story. I have a son who is finishing his undergraduate degree and he reports that students in classes that require participation now make use of AI. Their laptop microphones capture the professor’s question and a good answer almost immediately pops up on the screen.

    I went to check out a book from the university library yesterday. The library was busy, but the stacks are now almost purely decorative. The sleepy student at the checkout seemed surprised when I handed her my book. It was as if I had asked the guard at Buckingham Palace to do more than be photographed.

    Posted November 22, 2024 at 5:41 am | Permalink
  2. Jason says

    Perhaps it’s as simple as doing things like what Professor Smith alludes to, for example finding value in physical, material books. Contemplate all the men and women of character whom you know – chances are that most or all of them maintain libraries of their own, consisting of individual treasures that they can take off the shelves, and peruse and ponder and take delight in. It’s almost a kind of natural law, the desire to take refuge in the Word – something I don’t think you can extinguish from the breast of any man or woman of substance. Perhaps such acts as preserving our personal bibliotheques will be the 21st-century version of saying 2+2=4. Really, what more can we mere mortals do in the face of this imminent technological onslaught?

    Posted November 22, 2024 at 7:56 am | Permalink
  3. Mehra says

    If America is on the verge of developing superintelligence, is China going to content itself with just looking on?
    Or is it going to go all out to sabotage or destroy it by any means possible?

    Posted December 3, 2024 at 3:46 am | Permalink
