Thinking about how the arsing fuck to explain the rationalists to normal people - especially as they are now a loud public problem along multiple dimensions.
The problem is that it’s all deep in the weeds. Every part of it is “it can’t be that stupid, you must be explaining it wrong.”
With bitcoin, I have, over the years, simplified it to being a story of crooks and con men. The correct answer to “what is a blockchain and how does it work” is “it’s a way to move money around out of the sight of regulators” and maybe “so it’s for crooks and con men, and a small number of sincere libertarians” and don’t even talk about cryptography or technology.
I dunno what the one sentence explanation is of this shit.
“The purpose of LessWrong rationality is for Yudkowsky to live forever as an emulation running on the mind of the AI God” is completely true, is the purpose of the whole thing, and is also WTF.
Maybe that and “so he started what turned into a cult and a series of cults”? At this point I’m piling up the absurdities again.
The Behind The Bastards approach to all these guys has been “wow these guys are all so wacky haha and also they’re evil.”
How would you first approach explaining this shit past “it can’t be that stupid, you must be explaining it wrong”?
[also posted in sneer classic]
I just describe it as “computer scientology, nowhere near as successful as the original”.
The other thing is that he’s a Thiel project, different from but not any more sane than Curtis Yarvin aka Moldbug. So if they’ve heard of Moldbug’s political theories (which increasingly many people have, because of, well, them being enacted), it’s easy to give a general picture of total fucking insanity funded by Thiel money. It doesn’t really matter what the particular insanity is, and it matters even less now that the AGI shit hit the mainstream entirely bypassing anything Yudkowsky had to say on the subject.
How would you first approach explaining this shit past “it can’t be that stupid, you must be explaining it wrong”?
This is the question of the moment, isn’t it?
I have no answers, but i can say thanks for being a light in the dumbness.
@dgerard TESCREAL is structurally an evangelical a-theist religion descended from the 19th century Russian Orthodox christianity of Nikolai Fyodorovitch Fyodorov and in my new book I will—
Nah, forget the book, I’ve got the attention of the POTUS’s ketamine-addled fractional-trillionaire shitposter, get in the car, losers! (Throws brake pedal at puzzled pedestrian)
The car’s a Tesla, isn’t it?
@JohnBierce CyberTruck, ofc.
Of course!
TESCREAL is structurally an evangelical a-theist religion descended from the 19th century Russian Orthodox christianity of Nikolai Fyodorovitch Fyodorov and in my new book I will—
Hannu Rajaniemi beat us all to the punch all the way back in 2010, didn’t he? The Sobornost are exactly who the Rationalists want to be when they grow up.
@o7___o7 @sneerclub Yep, but hopefully my next space opera will take it to the next level (once I finish it—I’ve been working on it since 2015) …
It’s eugenics but as a religious cult for reactionaries
Yud is that creepy nerd from your middle school who wrote disturbing fan fiction, but it wasn’t just a phase and now he has the aforementioned cult
I think starting with Sam Bankman-Fried is a solid idea. Relatively informed members of the general public a) know who that guy is, and b) know that he made some really poor decisions. He does not have the Silicon Valley mystique that attaches itself to some other adherents, so I think fewer people will think “well, that guy is really smart, why would he be in a cult”. Then you can go back and explain EA and LessWrong and Yudkowsky’s role in all of this.
I’ve been contemplating this, and I agree with most everyone else about leaning heavily into the cult angle and explaining it as a mutant hybrid between Scientology-style UFO religions and Christian dispensationalist Book of Revelation eschatology. The latter may be especially useful in explaining it to USians. My mom (who works in an SV-adjacent job) sent me this Vanity Fair article the other day about Garry Tan grifting his way into non-denominational prosperity gospel Christianity: https://www.vanityfair.com/news/story/christianity-was-borderline-illegal-in-silicon-valley-now-its-the-new-religion She was wondering if it was “just another fad for these people,” and I had to explain that no, not really: their AI bullshit is so outlandish that some of them feel the need to pivot back toward something more mainstream to keep growing their following.
I also prefer to highlight Kurzweil’s obsession with perpetual exponential growth curves as a central point. That’s often what I start with when I’m explaining it all to somebody. It provides the foundation for the bullshit towers that Yudkowsky and friends have erected. And I also think that long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than Yudkowsky, because Kurzweil is better-organized and professionally published. He’ll most likely be the main source in the lower-division undergraduate/AP high school history texts that highlight this stuff as a background trend in the 2010s/2020s. Right now, we live in the peak days of the LessWrong bullshit volcano plume, but ultimately, it will probably be interpreted by the specialized upper-division texts that grow out of people’s PhD theses.
And I also think that long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than Yudkowsky, because Kurzweil is better-organized and professionally published.
That is interesting to think about. (Something feels almost defiant about imagining a future that has history books and PhD theses.) My own feeling is that Yudkowsky brought something much more overtly and directly culty. Kurzweil’s vibe in The Age of Spiritual Machines and such was, as I recall, “This is what the scientists say, and this is why that implies the Singularity.” By contrast, Yudkowsky was saying, “The scientists are insufficiently Rational to accept the truth, so listen to me instead. Academia bad, blog posts good.” He brought a more toxic variation, something that emotionally resonated with burnout-trending Gifted Kids in a way that Kurzweil’s silly little graphs did not. There was no Rationality as self-help angle in Kurzweil, no mass of text whose sheer bulk helped to establish an elect group of the saved.
Yes, Kurzweil desperately trying to create some kind of a scientific argument, as well as people with university affiliations like Singer and MacAskill pushing EA, are what give this stuff institutional strength. Yudkowsky and LW are by no means less influential, but they’re at best a student club that only aspires to be a proper curriculum. It’s surely no coincidence that they’re anchored in Berkeley, adjacent to the university’s famous student-led DeCal program.
FWIW, my capsule summary of TPOT/“post-rationalists” is that they’re people who thought that advanced degrees and/or adjacency to VC money would yield more remuneration and influence than they actually did. Equally burned out, just further along the same path.
There’s something about how these weird ideologies grow and how people get trapped in them that reminds me of oncogenesis.
When a cancer forms, cell lines experience a typical progression where a lineage accumulates mutations which aren’t that bad individually, but combine in the aggregate to cause havoc. The cells close themselves off from outside communication and lose all capacity for repairing genomic damage or heeding signals to slow out-of-control growth. Environmental factors like pollution and ionizing radiation can predictably push otherwise benign cells in that direction; the same cancer can develop in thousands or millions of people independently via convergent shit-evolution.
It rhymes, doesn’t it?
So… on strategies for explaining to normies, a personal story often grabs people more than dry facts, so you could focus on the narrative of Eliezer trying a big idea, failing or giving up, and moving on to a bigger idea before repeating (stock bot to seed AI to AI programming language to AI safety to shut down all AI)? You’ll need the Wayback Machine, but it is a simple narrative with a clear pattern?
Or you could focus on the narrative arc of someone that previously bought into less wrong? I don’t volunteer, but maybe someone else would be willing to take that kind of attention?
I took a stab at both approaches here: https://awful.systems/comment/6885617
I usually say the following. I’m paraphrasing a spiel I have delivered in person several times and which seems to get things across.
‘there’s a kind of decentralized cult called rationalism. they worship rational thinking, have lots of little rituals that are supposed to invoke more rational thinking, and spend a lot of time discussing their versions of angels and demons, which they conceive of as all-powerful ai beings.
rationalists aren’t really interested in experiments or evidence, because they want to figure everything out with pure reasoning. they consider themselves experts on anything they’ve thought really hard about. they come up with a lot of apocalypse predictions and theories about race mingling.
silicon valley is saturated with rationalists. most of the people with a lot of money are not rationalists. but VCs and such find rationalists very useful, because they’re malleable and will claim with sincerity to be experts on any topic. for example, when AI companies claim to be inventing really intelligent beings, the people they put forward as supporting these claims are rationalists.’
I think for the Yud variety specifically a good summary is: “Taking all the wrong lessons from science fiction, building a cult around its villains, and celebrating “Rational” villains + a belief in the inevitability of world-changing technological progress”
they’re racists
@dgerard If “the purpose of a system is what it does” then the main purpose of “LessWrong rationality” seems to be “getting Yudkowsky laid”
@dgerard (it’s a cult)
Here’s my first shot at it:
“Imagine if the stereotypical high-school nerd became a supervillain.”
Imagine insecure smart people yes-anding each other into believing siskind and yud are profound thinkers.
The latest in a chain of cults, after Mormonism, the Victorian-era spiritualist fad, Scientology and new-age “quantum” woo, each using the trappings of the exciting scientific/technological ideas of their time to sell the usual proposition (a totalising belief system that answers* all questions).
Honestly, a couple hours over some beers might be the only way.
Didn’t come up with that simile, but it might fit:
It’s like a fleshed-out version of a 12-year-old thinking “everything would be great if I was in charge, because I’m smart and people are dumb”
Something about people who are too impressed with their own smarts and swap pet theories that make them feel smart.