First off, I am in no way speaking on behalf of either position. I have no connection with either Mr. Horgan or the Singularity Institute. I am simply an observer of the Singularity and its current events.
The following is an interview with the Singularity Institute’s director of media, Mr. Michael Anissimov. The interview was conducted by Mr. John Horgan in response to an earlier discussion between the two regarding Mr. Anissimov’s blog and some statements made there.
I had a few questions regarding this particular interview, mainly concerning Mr. Anissimov’s responses. I asked him if he would mind speaking to me about the interview. He declined, stating that he was too busy. I suppose that is understandable, but then again, of the director of an entity as potentially significant as S.I., one has to ask, “You don’t have time to address questions from those interested in your cause?”
I have also asked to speak with Mr. Horgan and I am awaiting his response.
My concerns about the interview were as follows.
1. Is there common ground to be found regarding the Singularity in general? …“(and not because of the Singularity, that doesn’t count).” Does Mr. Horgan believe in the Singularity concept itself, or does he only take issue with S.I. and its interpretations of it?
2. Although I have found disagreement with some of the predictions relating to the Singularity, I have not really found any outright claims that dispute the event’s validity itself. …“those of you who are getting tired of defending the Singularity” Would Mr. Horgan like to respond with some specific examples?
3. Is the comment regarding S.I. being a cult a fair interpretation of S.I. in general, or merely the result of some specific series of statements or actions by individuals within the movement?
4. Does S.I. really even meet the definition of a cult? If not, then Mr. Horgan’s statement …“is a cult. A harmful one,” would in fact be unfair and potentially inflammatory. In fact, I believe that Mr. Horgan actually answers this in a later sentence: …“You may not BE cultists,”
5. For Mr. Horgan: does he believe that …“and others who talk about the end of everything as we know it real soon,” describes a doomsday scenario, or could this have a different meaning? If not, could he please give a specific example?
6. Is either party aware of the religious connotations of the definition of “cult”? If so, does Mr. Horgan choose to stand by the statement? Or is this simply a matter of semantics?
7. And my most concerning question: why did Mr. Anissimov seem so ill-prepared to answer this series of questions from Mr. Horgan?
… “I’ve been promoting Singularity-related ideas for a decade. If there were a cult, I would be in its leadership.” (http://singinst.org/aboutus/team)
…”If anything, our “cult” is more analogous”
And where did this come from? …“I hear Ray’s The Singularity is Near movie is pretty silly,” How does this defend his or S.I.’s position?
I’m still not quite sure what to make of this posted series of dialogue, other than to conclude that two experienced writers, both in positions to give honest journalistic information, have forgotten the field’s golden rules of truthfulness, accuracy, objectivity, impartiality, and fairness. Which makes it clear why both of these individuals have fallen into the realm of “inconsequential bloggers!”
…“Personally, I think this level of peer acceptance of an idea so preposterous (the Singularity) is one hell of an indicator that Mr. Kurzweil is correct. When a society is able to grasp the concepts of something it doesn’t even truly understand, I believe it becomes clear that the society is aware of these things as well, even if only in its subconscious.
“I think it shows a level of concern, or a worry that stems from this knowledge, that drives the society to clamor for answers to the question of ‘What do we do?’ I believe that a better, more concise definition of the phenomenon itself is required so that we may be better suited in our attempts to identify the centers of the human psyche involved.
“Once these areas are better understood, maybe then can they be exploited in a way that gets society into the ‘works’ stage of enacting change, instead of the ‘pacifist’ mentality of ‘Yes, I agree, someone should look into that,’ or ‘Yes, but what are you going to do about it?’
“We as a species need to bring the awareness level of our potential future closer to the forefront of discussion, and do so in a way that jump-starts action, but not paranoia-based panic. Only then will we be able to address areas such as religion, where dramatic changes in thinking are going to have to be made, with awareness, care, forethought and complete honesty.” Richard Brown, “SHAPING: How to Address Change,” 2011
Mon 25 May 2009
Last week I critiqued the Singularity and its chief guru Ray Kurzweil in Newsweek. The piece ended this way:
“Part of me—the part that thrilled at prospects for artificial intelligence almost 30 years ago—finds Kurzweil’s prophesies highly entertaining. He raises lots of provocative questions: What would it be like to be immortal? To have an IQ of 1,000? To exist not as a doomed, flesh-and-blood creature but as a piece of software that can keep redesigning itself and merging with other programs? But another part of me—the grown-up, responsible part—worries that so many people, smart people, are taking Kurzweil’s sci-fi fantasies seriously. The last thing humanity needs right now is an apocalyptic cult masquerading as science.”
Michael Anissimov, one of the more thoughtful Singularitarians I’ve encountered, critiqued my critique on his blog Accelerating Future. He wrote, in part: “It’s quite rude to call groups of obviously intelligent and freethinking people a cult, when we have 1) a commitment to rationalism and changing our minds, 2) a naturalistic worldview, 3) substantial uncertainty about many of the nodes in our probabilistic model, 4) are willing to put forth the work to actually develop AGI, no matter how long it takes, 5) operate on the expectation of human-derived causes rather than “the great thrust of history”-type nebulous causes, 6) see the positive/negative outcome as contingent on human action, 7) have no expectation of or desire for in-group perks if we do successfully build superintelligence, 8) absence of religious trappings, 9) no revenge fantasies, a la true Rapturists, and 10) have no overblown anthropomorphism for the technological agents we are creating.”
I posted this response: “Michael and you other Singularitarians, thanks for noticing my little piece in Newsweek, which I didn’t think anyone read anymore. You may not BE cultists, but everyone–not just me–PERCEIVES you as cultists, which is all that matters. But hey, we all need a crazy faith. I have one too, that war will end (and not because of the Singularity, that doesn’t count). In fact, I urge those of you who are getting tired of defending the Singularity to join me in my end-of-war cult. There’s lots of room, and even a few reasons to believe: http://discovermagazine.com/2008/apr/13-science-says-war-is-over-now.”
Michael and I then had the following exchange by email:
MICHAEL TO JOHN: I agree with your end of war ideas… but I think it’s dishonest to pump up the “cult” meme as a rhetorical device. You could be putting us in personal danger down the road. Some deep greens already want to kill us:
If our ideas are so wrong, you should be able to criticize them without the “cult” label… what about real cults like Scientology, Raelians, etc? If anything, our “cult” is more analogous to the enthusiasm that caused the dot com bubble — futurist enthusiasm about technology.
JOHN TO MICHAEL: Michael, it’s not a rhetorical device. I do indeed think the Singularity movement, as represented by Kurzweil and Vinge and others who talk about the end of everything as we know it real soon, is a cult. A harmful one, as I’ve said, in an age when science’s reputation is under attack. Cults often coalesce around these sorts of apocalyptic fantasies, and true believers display us-vs-them insularity, hostility and arrogance toward non-believers etc. You seem quite reasonable yourself. You seem to think merely that AI is gonna happen some day, but then I’d say you’re not a Singularitarian. And that’s not the sort of prediction that sells books, generates cover stories, gets the attention of nasty a-holes like me, etc. John
MICHAEL TO JOHN: A “cult” is a centrally organized, physically in-contact group that does things like alienate you from your family. A “movement” is a general ideological thrust. It’s absurd to call a loosely connected, online movement a “cult” when its members disagree so thoroughly on the details and aren’t even attempting anything dangerous.
Superintelligence is not a fantasy, it’s a real possibility — the question is when. It makes sense to worry that superintelligence could wipe us out — did we not wipe out the Neanderthals? Where is this us-vs-them insularity, hostility, and arrogance you speak of? Where did you see it? Even if it did exist (I can’t find it), would it be any different than liberals vs. conservatives, or materialists vs. dualists, etc?
I do think we could be facing the end of the world as we know it, due to the threat we face from recursively self-improving AI. If AI is very difficult and takes centuries — great — that gives us more time to prepare to ensure that it’s programmed to be human-friendly.
The extreme fuzziness within the “Singularity movement” belies your claim that it is a cult. I am right in the thick of it — I’ve been promoting Singularity-related ideas for a decade. If there were a cult, I would be in its leadership. But there is no such thing.
In any case, I hear Ray’s The Singularity is Near movie is pretty silly, so that will make it all the easier to make fun of Singularitarian ideas.
JOHN TO MICHAEL: Michael, you protest too much. My guess is that you’re going thru a crisis of faith, as well you should. You’re obviously a very smart, knowledgeable guy. Is this really what you want to spend your life on? I asked Eliezer the same thing on Bloggingheads, and I meant it, it’s not a rhetorical ploy. And I mean it when I say that the Singularity is giving science a bad name, because it’s not based on a rational appraisal of current science. If it’s any consolation, I also consider Christianity, Buddhism, Islam, psychoanalysis and lots of other belief systems to be irrational and hence cultish. They just have more adherents than you do. Go ahead and keep the faith, but don’t blame others who find your faith absurd and wasteful. John
MICHAEL TO JOHN: Yes, among other things, this is what I want to spend my life on — ensuring that when we get the technology to genuinely enhance human intelligence or to create artificial intelligence, it’s done in a way that is safe for the rest of humanity. Of course I protest, this is one of the big causes I’m into. Other causes I’m into include promoting decentralized customized manufacturing and understanding the potential dangers of synthetic biology.
The Singularity doesn’t have to be a niche quirky interest like psychoanalysis. It doesn’t have to be based on an irrational appraisal of current science. My concern is based on four simple points:
1. Superintelligence seems plausible. Superintelligence means intelligence qualitatively smarter than a human, like we’re qualitatively smarter than a chimp.
2. Superintelligence could just be a matter of finding a certain way to introduce new brain tissue into the human brain, or connecting together a large network of specialized agents and intelligent humans. We don’t know. It could be a lot harder.
3. When superintelligence does come about, it could be a big deal, as in having the potential to threaten all humanity.
4. We ought to care about technologies that might give rise to superintelligence, and, if possible, imbue a concern for safety into the researchers developing them and the visionaries evangelizing them.
Where is the irrational appraisal of current science? Note that even if you believe SI isn’t possible for 500 years, I still think it’s worth worrying about, as the impact could be so huge.
I’m not sure where the irrational appraisal of science is. All the possible objections that come immediately to mind aren’t really convincing to me:
1) Superintelligence isn’t possible at all.
2) Superintelligence is so far off it’s pointless to do anything about it now.
3) The points all make sense, but the whole thing gives me a queasy feeling that makes it emotionally necessary to reject outright.
Discard Kurzweil’s package about immortality, trend projections, nanobots, etc., and you still have a very real concern — that humans aren’t set up to be the most intelligent and powerful species forever, technology could be used to create greater intelligence, and we’re going to eventually have to deal with it.
In your article you offer some actual arguments, i.e., “we aren’t close to curing cancer therefore indefinite life extension is unlikely soon”. But a lot of your arguments seem to be social ones — “this sounds quirky and weird, therefore it’s bad”. I’m very open to the former but very uninterested in the latter.
JOHN TO MICHAEL: Come on Michael, listen to yourself! We’ve got problems that threaten us right now! Fundamentalist suicidal religious cults, collapsing states, proliferating nukes and other deadly weapons in unstable regions, surging populations in some of those same regions, global warming and other more tangible forms of pollution. And you’re fretting about supersmart cyborgs or bots or cyberentities or whatever, stuff that MAY–and may not–happen within 500 years? Why not waste your life agonizing over the dangers of time travel or evil aliens?
Also it pisses me off when you and your ilk–including Kurzweil–accuse me of “fearing” the Singularity or of merely dismissing it as “weird.” That’s bullshit. Sure, I make fun of you guys, because I’m trying to entertain people. But in my Spectrum article and even that crappy little Newsweek piece I also present specific counterarguments to the wild extrapolation upon which the Singularity is based. My first two books also have a detailed critique of the fields you think will produce the Singularity, including AI, neuroscience, genetics and so on. You Singularitarians, for all your vaunted cleverness, display an extraordinary and I can only assume willful ignorance of the complexities of biology, including how the genetic code produces bodies and how the neural code produces minds. When someone draws your attention to these issues, you respond with what you accuse critics of, ad hominem attacks. There’s the cult-like insularity and arrogance I talked about before. And that’s why you don’t deserve to be taken seriously.
MICHAEL TO JOHN: I’m very interested in all the issues you’re talking about, but from a utilitarian perspective, I consider superintelligence to be #1. We are increasingly being taken seriously by the mainstream: http://www.nytimes.com/2009/05/24/weekinreview/24markoff.html?_r=1?_r=1
…and for damn good reasons. The “Singularity”, in my eyes, is not a belief system, it’s a technological milestone, like economically-efficient solar cells or nuclear fusion. It’s just an important tech milestone.
Us smart people (and you’re one of them) have the capability to engage in abstract thinking, which means that “tangibility” might be important, and it might not, depending on the precise calculations and probability estimates. Yes, nukes matter, yes, other WMDs matter a lot (and I’ve fundraised for research into emerging WMDs). Thankfully, hundreds of millions of dollars and tens of thousands of smart people are focusing on those, but practically none are focusing on the entirely mundane (and quite serious) challenge of superintelligence, which you off-handedly dismiss without an apparent grain of seriousness.
Time travel and evil aliens are bullshit, the possibility of superintelligence is not, and I know hundreds of highly intelligent people that justifiably agree with me. I’m not a fanatic in the least, and you seem to think I’m reasonable, so we must have some deep disagreement. I ordered your End of Science book, hopefully that will give me more of a view into your worldview. I’m a tech-sci-oriented liberal as well, maybe that gives us some commonality but maybe it doesn’t.
You don’t consider the Singularity as weird or fear it — I understand that. You think it’s total bullshit. I understand a lot about how the genetic code produces bodies and the neural code produces minds — what makes you think that we’d need to copy all that to create a higher form of intelligence? The Wright Brothers didn’t need to copy a pigeon to create flight. Intelligence is an abstract information process with underlying principles — just like everything else, once we unlock those principles, we’ll be able to exploit them to our own ends — just like we exploited physics for the industrial revolution, biology for the agricultural revolution, electronics for the electronic revolution, etc. We’ll exploit cognitive science for the intelligence explosion. I consider it quite plausible that it will be possible to understand and exploit the underlying principles of intelligence (to create automated intelligence) within the next 50 years, which is well within my projected lifetime. If not, we’ll at least contribute a lot to that technological goal.
Collapsing states, weapons of mass destruction, and nukes threaten millions of lives. Human-indifferent superintelligence threatens billions, and it could create a near-paradise if utilized properly. If you want to ignore it, be my guest, but I’m a very skeptical human being, whose “merely” human life is very exciting and happy as it is, and I have no need of utopian fantasies. I am operating on a purely pragmatic basis, even if you dismiss it as idiocy.