Accelerating Change 2005 Blogging

09/17/05 00:00:00    

By Michael Mealling

I'm at Accelerating Change 2005 and will be live blogging as much of it as I can. Much of the conference is made up of ad hoc BOF groups, so I'm not sure if I can capture much beyond what I'm directly involved in. The gist of the conference is how the rate of technological change is accelerating, and how that rate itself will accelerate due to specific force multipliers such as AI, nanotech, and life extension. More to come when the sessions start up.

John Smart, Founder and President of the Acceleration Studies Foundation, is giving the introduction to the conference and a status report on where the Foundation is and where it's going.

(grrr… my camera's flash has stopped working so I may not have pictures)

First up is Vernor Vinge:

“Exponential growth” is one of the simplest and most pervasive patterns in nature. Some variations on the pattern are “exponential growth with catastrophic collapse” and “exponential growth with saturation”. Particular technologies follow “exponential growth with saturation”, but when considered in aggregate the base technology curve stays exponential.
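
A minimal sketch of that last point (my illustration, not from the talk): model each technology generation as a logistic S-curve that saturates at some ceiling, with each new generation arriving later but saturating higher. The envelope across generations keeps growing exponentially even though every individual curve flattens out. The arrival spacing and 10x ceiling steps are hypothetical numbers chosen only to make the pattern visible.

```python
import math

def logistic(t, t0, ceiling):
    """Capability of one technology generation: an S-curve saturating at `ceiling`."""
    return ceiling / (1.0 + math.exp(-(t - t0)))

def aggregate_capability(t, generations):
    """Best available capability across all generations at time t (the envelope)."""
    return max(logistic(t, t0, ceiling) for t0, ceiling in generations)

# Hypothetical schedule: a new generation every 5 time units, each with a 10x higher ceiling.
generations = [(5 * i, 10.0 ** i) for i in range(1, 6)]

for t in range(0, 31, 5):
    print(f"t={t:2d}  capability={aggregate_capability(t, generations):12.1f}")
```

Each printed step is roughly 10x the last: individually saturating curves, exponential in aggregate.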

If this continues for a few more decades we get to the “killer app” of exponential improvement in computational ability: The development of creativity and intellect that surpass present-day humans.

The question of the future isn't whether we can re-create human intellect; we can already do that in the form of new humans. The real question is when we create something that goes beyond it. That is a good point to declare a Singularity.

Vinge defines a singularity as “a place where some regularity property is lost”, analogous to the physics term's connotation of “a place where the rules profoundly change”. A working definition: you could bring Mark Twain to the present day and pretty much explain the state of the world to him in an afternoon, but you could not do the same thing with a goldfish. It's also been described as technical progress that is incomprehensibly complex and rapid.

What would it be like if the Singularity didn't happen?

Maybe Murphy's Law trumps Moore's Law: “The maximum possible effectiveness of a software system increases in direct proportion to the log of the effectiveness (i.e. speed, bandwidth, memory capacity) of the underlying hardware.”
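
A quick back-of-the-envelope sketch of why that rule is so pessimistic (my illustration, not Vinge's): if software effectiveness goes as the log of hardware capability, then Moore's-Law hardware doubling every 2 years delivers only *linear* software gains. The doubling period and the arbitrary "units" are assumptions for the sake of the example.

```python
import math

def hardware(years, doubling_period=2.0):
    """Hardware capability relative to year 0, doubling every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

def software_effectiveness(years):
    """Effectiveness proportional to log2 of the underlying hardware (the quoted rule)."""
    return math.log2(hardware(years))  # == years / doubling_period: linear in time

for years in (0, 10, 20, 40):
    hw = hardware(years)
    print(f"year {years:2d}: hardware = {hw:>12,.0f}x, "
          f"software effectiveness = {software_effectiveness(years):4.0f} units")
```

Forty years of exponential hardware (a million-fold) buys only 20 "units" of effectiveness under this rule, which is why it would block a Singularity.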

Or maybe catastrophe intervenes. The more we learn about the cosmos, the more we learn how dangerous it is.

Vernor's conclusion: while the Technological Singularity is not at all a sure thing, it is the most likely non-catastrophic scenario on the horizon. Of course, the Singularity itself could be catastrophic. What can we do to make the bad versions less likely?

What if: AI succeeds?

What if: The internet itself attains unquestioned life and intelligence? [ed: IMHO, not likely]

What if: Fine grained distributed systems are aggressively successful?

(or what happens when all of the embedded systems network and 'wake up')

What if: Intelligence Amplification occurs, either as the radical endpoint of the human/computer interface or as the outcome of bioscience research?

Soft takeoffs vs. hard takeoffs:

Soft takeoff: the transition takes years, perhaps even with the exact beginning and end states a matter of debate.

Hard takeoff: the transition takes place in a very short period of time, perhaps less than 100 hours, and without obvious precursors. A hard takeoff is generally considered a bad thing because of its catastrophic nature. Vinge thinks the best way to plan for a hard takeoff is to use Intelligence Amplification, ensuring that humans can adapt through the phase shift instead of being subject to it. This raises an interesting point: if you are amplifying your own intelligence at the same rate as the rest of the technological change, you never actually see a Singularity happen. You ride it.

Question: with our limited intelligence compared to things post-Singularity, can we even tell when things start being much smarter than we are? Things such as corporations made up of AIs, etc.

Question: how can I make money on the takeoff? From the back of the room: “I'll sell you hard takeoff insurance.” Vinge: if it's a soft takeoff, then everyone will get rich off of it. You may be able to get rich by simply sitting on your couch.

Question: will biology end up trumping silicon as the substrate that post-Singularity superintelligence is based on?
