[personal profile] cheshirenoir
Warren Ellis has posted a nice little rant about post-Singularity nutters.

(Stop looking at me that way!)

Yes I find the concepts of Singularity events fascinating.
Yes I know too much of the finer details. (Mind you, anyone with an interest in nanotech who has read Kurzweil probably knows some of the finer details.)
Yes I suspect an "Information Event Horizon", past which we can make no predictions, is possible.
Yes I did make career decisions based on the possibility.
No I don't think it has to happen.
No I don't think a post-singularity world will necessarily be a nice place for us meatheads.

Date: 2008-06-04 03:51 am (UTC)
From: [identity profile] krjalk.livejournal.com
I have more time for the Singularity as a concept than I do for Singularitarians as a group. Particularly Extropians, who I found to largely be a group of arrogant wankers.

Date: 2008-06-04 04:53 am (UTC)
From: [identity profile] evilpaul.livejournal.com
Well, I've read books like The Spike, Nano!, and The Age of Spiritual Machines, so I have more than a passing interest in the subject.

Let's say you take a relatively conservative viewpoint and argue for a singularity that simply involves a future point of non-predictability. That concept, all by itself, points to "the end of the world as we know it" because it is, by definition, unknowable.

Will we be happy in this unknowable future? Will we even be human? We can't tell from here.

Obviously this kind of philosophical nonsense pisses a lot of scientifically minded people off. (Don't even mention quantum physics.)

The Singularity doesn't mean the End Of The World; it just means that we are rapidly approaching a point where people have no fucking idea what is going to happen next.

The meme of referring to the singularity as the Geek Rapture is successful for good reason, and I'm certainly not blind to the parallels with various apocalyptic religions. You won't find me hanging out with Extropians trying to "buy themselves some scrubland outside Bastrop in Texas for a compound."

Yet.

Actually, I prefer the Terence McKenna/Grant Morrison version where all you need to do is take a massive dose of powerful psychoactives in 2012 and surf the novelty wave, baby.

Date: 2008-06-04 11:04 am (UTC)
From: [identity profile] http://users.livejournal.com/_fustian/
Ellis (and other Unbelievers ;) fails to allow for the Star Trek/Neuromancer phenomenon: the "if we can imagine it, we can build it" idea, which has worked for many, many apparently impossible technologies. It's not quite the wikialistic equivalent of "clap if you believe in fairies", but in some ways it's close. Basically, the more engineers who have been exposed to the idea of a technology, the more likely it is that they'll eventually work out a way to build it. Since there's nothing inherently impossible about general AI (even if you're religiously opposed to the idea of machine consciousness), its commercial value alone suggests that it's in fact inevitable (as long as someone else's eschatology doesn't intervene). From there to the Singularity is, well, not exactly a leap of faith.