a place for it
Almost nostalgic to see a TREACLES sect still deferring to Eliezer’s Testament. For the past couple of years the Ratheology of old, with the XK-class end-of-the-world events and alignof AI, has been sadly[1] sidelined by the even worse phrenology and nrx crap. If not for the murder cults and sex crimes, I’d prefer the nerds reinventing Pascal’s Wager over the JAQoff lanyard nazis[2].

1: And it being sad is in and of itself sad.
2: A subspecies of the tie nazi, adapted to the environmental niche of technology industry work
A subspecies of the tie nazi
OBJECTION! Lanyard nazis include many a shove-in-a-locker nazi
Counter-objection: so do all species of the nazi genus.
FWIW here’s LW with discussion
Thanks for the link.
I forgot how frustrating these people are. I’d love to read these comments but they’re filled with sentences like:
I take seriously radical animal-suffering-is-bad-ism[1], but we would only save a small portion of animals by trading ourselves off 1-for-1 against animal eaters, and just convincing one of them to go vegan would prevent at least as many torturous animal lives in expectation, while being legal.
Just say “I think persuading people to become vegan is better than killing them”?
Why do you need to put a little footnote[1] to some literal fiction someone wrote about human suffering to make a point?
Screw it, here’s that footnote:
For a valid analogy between how bad this is in my morality and something that would be equally bad in a human-focused morality, you can imagine being born into a world with widespread human factory farms. Or the slaughter and slavery of human-like orcs, in case of this EY fiction [link omitted].
Exqueeze me? You have to resort to some shit somebody made up to talk about human exploitation?
Lots of discussion on the orange site post about this today.
(I mentioned this in the other sneerclub thread on the topic but reposted it here since this seems to be the more active discussion zone for the topic.)
came here to post this!
I loved this comment:
=====
[Former member of that world, roommates with one of Ziz’s friends for a while, so I feel reasonably qualified to speak on this.]
The problem with rationalists/EA as a group has never been the rationality, but the people practicing it and the cultural norms they endorse as a community.
As relevant here:
- While following logical threads to their conclusions is a useful exercise, each logical step often involves some degree of rounding or unknown-unknowns. A -> B and B -> C means A -> C in a formal sense, but A -almostcertainly-> B and B -almostcertainly-> C does not mean A -almostcertainly-> C. Rationalists, by tending to overly formalist approaches, tend to lose the thread of the messiness of the real world and follow these lossy implications as though they are lossless. That leads to…
- Precision errors in utility calculations that are numerically-unstable. Any small chance of harm times infinity equals infinity. This framing shows up a lot in the context of AI risk, but it works in other settings too: infinity times a speck of dust in your eye >>> 1 times murder, so murder is “justified” to prevent a speck of dust in the eye of eternity. When the thing you’re trying to create is infinitely good or the thing you’re trying to prevent is infinitely bad, anything is justified to bring it about/prevent it respectively.
- Its leadership - or some of it, anyway - is extremely egotistical and borderline cult-like to begin with. I think even people who like e.g. Eliezer would agree that he is not a humble man by any stretch of the imagination (the guy makes Neil deGrasse Tyson look like a monk). They have, in the past, responded to criticism with statements to the effect of “anyone who would criticize us for any reason is a bad person who is lying to cause us harm”. That kind of framing can’t help but get culty.
- The nature of being a “freethinker” is that you’re at the mercy of your own neural circuitry. If there is a feedback loop in your brain, you’ll get stuck in it, because there’s no external “drag” or forcing functions to pull you back to reality. That can lead you to be a genius who sees what others cannot. It can also lead you into schizophrenia really easily. So you’ve got a culty environment that is particularly susceptible to internally-consistent madness, and finally:
- It’s a bunch of very weird people who have nowhere else they feel at home. I totally get this. I’d never felt like I was in a room with people so like me, and ripping myself away from that world was not easy. (There’s some folks down the thread wondering why trans people are overrepresented in this particular group: well, take your standard weird nerd, and then make two-thirds of the world hate your guts more than anything else, you might be pretty vulnerable to whoever will give you the time of day, too.)
TLDR: isolation, very strong in-group defenses, logical “doctrine” that is formally valid and leaks in hard-to-notice ways, apocalyptic utility-scale, and being a very appealing environment for the kind of person who goes super nuts -> pretty much perfect conditions for a cult. Or multiple cults, really. Ziz’s group is only one of several.
-
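To make the quoted comment’s first bullet concrete (an illustrative sketch with invented numbers, not something from the comment itself): read each “almost certainly” as a conditional probability of at least 1 − ε, holding even given everything established before it, which is the generous reading. The chain rule then only guarantees

\[
P(A_1 \cap \cdots \cap A_n \mid A_0) \;=\; \prod_{i=0}^{n-1} P\!\left(A_{i+1} \mid A_0 \cap \cdots \cap A_i\right) \;\ge\; (1-\varepsilon)^n ,
\]

so with ε = 0.05 a twenty-step chain is only guaranteed to hold with probability about 0.36, and the bound keeps shrinking with every extra step.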
Dozens of debunkings! We don’t need citations when we have Bayes’ theorem!!
To be fair, I also don’t believe any stories about a blackmail pedo ring run by Yudkowsky. And with all the bad stuff going round, why make up a blackmail ring, as if he’s the Epstein of Rationality? (Note this is a different thing from: people committed statutory rape and it was covered up, which, considering the Dill thing, is a bit more believable.)
Yes, I agree.
Stop stop I can only update my priors so fast!
Dozens as you can see.
As for privacy well, Brent Dill has been running around UFO discords telling people he’s in hiding because the rats want to kill him. Make of that what you will.
(assuming it’s true)
it was how I found his twitter actually!
Ziz was originally radicalised by the sex abuse and abusers around CFAR
deleted comments from the thread https://www.lesswrong.com/posts/96N8BT9tJvybLbn5z/we-run-the-center-for-applied-rationality-ama
College Hill is very good and on top of this shit
At least one fellow on Twitter painting us as Zizians.
Adding zizians to the list of things that sneerclub gets blamed for. After covid.
Guess they have not noticed we also have sneered at ziz, and that a lot of us are not living in america, let alone on its east coast.
lol. link?
I should also mention that the Fnord fellow who has a thing with the sub is Brent Dill.
Perhaps for dull disclosure (typo but left it in), we should explain who Brent Dill is? (Wasn’t he the guy who was no longer welcome at events due to abuse? Sorry, correction: one of the guys we know about; we don’t know about the secret list of people who are no longer welcome that Scott once talked about.)
I really should start remembering names.
yes that’s him i think
Here’s one that an xcancel search turned up (warning: is in Andy Ngo’s replies)
that’s the one! doesn’t look like a rat either.
Given the … calibre of thinking on display in his other tweets (xharts?), I wouldn’t be surprised if he just typed “Zizians” into Reddit’s search box; /r/sneerclub currently comes up a few times near the top of results there.
xharts
xcretions
second warning: an xcancel search for “zizians” turns up so much transphobia that you’ll feel like it’s a late night on the Nebuchadnezzar and Tank accidentally loaded your brain with JK Rowling instead of kung fu.
cw: body horror
e: and now my brain is writing the Matrix horror fan fiction where JK Rowling gets Agent Smithed into someone’s brain and overwrites everything
Demonstrating once again that Twitter is the damp locker-room floor of ideas.
I think it’s deliberate badjacketing.
my take too - someone hoping to make proxy use of the bigot wave that Ngo is currently attempting to whip up with his fake panic
that seems about right, and “this anti-cult information source isn’t actually anti-cult, it’s a competing cult you should avoid” is a pretty common form it can take. it’s very convenient for the cultists, because it defuses criticism without engaging with it, by keeping all thought within the framework of the cult. the idea that we all organically stumbled upon Rationalist ideas (or were exposed to them through our friends or industry) and wholesale rejected them, must be eliminated as a possibility. we must have an ulterior motive that can’t be summed up as “hahaha holy shit look at these assholes” — or else the cult has to accept that a lot of people legitimately hold the idea that Rationalists and Zizians and all of Yud’s other ideological children are in fact fucking assholes.
It’s true: The sneerclub mod interface is a replica of the mummification machine from Young Sherlock Holmes, and when we ban people, we actually drown them in wax.
It only works with the chanting, of course.
this is a nice writeup of the story of Ziz
I’m so used to bad rat science being expressed in obscurantist math and quantum physics jargon that the kindergarten neuro woo like “each half-a-brain has a 1 in 20 chance of being ontologically Good” and “nonbinary people have one half of their brain be transgender” throws me off.
Where are the Planck units and the h-bars, category theory, maybe something about Turing machines or Gödel? Can’t you at least throw in a square root or something? Is this all it takes to stroke a modern STEM dweeb’s ego? I guess all the talk about “debugging” and “jailbreaking” compensates for the infantile aesthetics of the crankery.
@dgerard I can’t help but imagine what a Zizian would do if one endarkened them with the knowledge that lettuce in salad was not allowed to grow to maturity (flowering) before being harvested and consumed
extremely normal thing and definitely not cult shit (ziz):
Something I don’t think I ended up writing, was a random conversation on what it meant to be good and a Sith. In which I said, something like, well, I’m doing whatever I want, no matter what, which in my case is good things.
also seems weird that vassar went from a failed let-me-google-scholar-that-for-you fake medical startup to being a niche rat cult leader. can’t any single one of these people get a normal job?
How are you going to get them back to ~~the farm~~ a retail job once they’ve ~~seen Paris~~ tasted cult power?

pol pot was in paris and tasted cult power, and all it took to get him back to the farm and out of power was the Vietnamese invasion. personally i think that a Vietnamese invasion of all ea compounds would solve a lot of problems and disband a couple of startups
personally i think that a Vietnamese invasion of all ea compounds would solve a lot of problems and disband a couple of startups
But aren’t they used to dealing with VC?
imagine marc andreessen on a beat-up bike with an equally beat-up akm, strolling through californian forests (currently on fire), trying to deliver rice and beans to starving venture-capitalist guerillas
Footnote 1: As a consequence of this approach to thinking and life, rationalists are, as a rule, unbelievably prolix, wall-eyed, and tedious writers, and also polyamorous.
“Wall-eyed” hit me as odd, so I go to look it up: https://www.wordnik.com/words/walleyed
The first entry:
adjective Often Offensive Affected with exotropia.
Well-played, Max!
I’m glad the non sanitized details are making it out.
fwiw, this /r/slatestarcodex poster says that Ziz of the Zizians did indeed take her name from Worm the web serial
“indirect personal communication”
Worm finally enters the mainstream. Yaaaaaaaaay.
You know I was wondering about where the name came from and it’s sufficiently plausible that I believe it. Notably, in the story her threat - the reason just being around her is so dangerous - is that she has some kind of perfect predictive ability on top of all the giant psychic kaiju nonsense. So she attacks a city and finds the one woman who needs to die in order for her super-thinker husband to go mad and build an army of evil robots or whatever.
It very much rhymes with the Rationalist image of a malevolent superintelligence and I can definitely understand it being popular in those circles, especially the “I’m too edgy to recognize that Taylor is wrong, actually” parts of the readership.
and imagine the ego to name yourself after the Simurgh from Worm
Explanatory spoiler for those who don’t want to read Worm, a million words of deconstruction/reconstruction of superhero comics:
spoiler
The source of superpowers is a pair of gigantic multidimensional alien creatures. Their eternal mission involves finding a planet with intelligent life, infesting its inhabitants with powers via brain tumor neural links, and encouraging them to fight so as to generate new ideas – the aliens are profoundly uncreative, even stupid by human standards, but immensely powerful – and when the planet is no longer producing, they reincorporate all the changed and combined powers, then eat the local sun to power their next FTL hop.
When their experimental subjects aren’t moving fast enough, they open up their archive and resurrect the Endbringers – superpowered kaiju which provide a stimulus for cooperation by destroying cities on a schedule. Each Endbringer is unique. The Simurgh, or Ziz, takes the form of a giant statue of a woman covered in angel wings, and wings-on-wings, and so forth. Ziz has primarily psychic powers: telekinesis; a ‘scream’ or ‘song’ that drives people to violent insanity over the course of an hour or so; and mind-control that seems limited to slowly changing supers into creative serial killers and mad scientists and so forth. Ziz floats in orbit, preventing space travel, and periodically descends to the surface to terrorize a city.
Everything escalates. Every trope is explained.