It's time for a bunch of self-proclaimed "rationalists" — a group of mostly technologists in their 20s — to start acting like it and stop paying the Berkeley-based Center For Applied Rationality. I say this because CFAR's $3,900 4-day seminars and their attendees are the subject of a New York Times magazine article this week that will leave you slapping your forehead in front of an imagined group of conference-goers.
Lest you think this is just a kooky one-off Berkeley spiritual center, allow me to note that these charlatan-sounding types have been hired by Facebook and the Thiel Fellowship, and that $3,900 is not cheap, especially given the living conditions on offer.
[The] workshops... are run like a college-dorm cram session. Participants stay on-site for the entire time (typically four days and nights), often in bargain-basement conditions. In San Leandro, the organizers packed 48 people (36 participants, plus six staff members and six volunteers) into a single house, using twin mattresses scattered on the floor as extra beds. In the kitchen, I asked Matt O’Brien, a 30-year-old product manager who develops brain-training software for Lumosity, whether he minded the close quarters. He looked briefly puzzled, then explained that he already lives with 20 housemates in a shared house in San Francisco. Looking around the chaotic kitchen, he shrugged and said, ‘‘It’s not really all that different.’’
Yes, in a total coincidence, that would be Lumosity, the games app that must now fork over $2 million to the FTC for "unfounded" claims of cognitive health benefits. Others at the seminar included Asher, a self-described "singing, freestyle rapping, former international Quidditch All-American turned software engineer.’’ A third was a gentleman who ended conversations with a bit of charm, saying, ‘‘I will allow you to disengage.’’
The fun starts with a CoZE, or comfort-zone expansion, exercise. An organizer and CFAR founder, who says ‘‘We’re trying to invent parkour for the mind’’ (as if to shout "THIS IS A FAD"), first encourages attendees to, indeed, step outside their comfort zones. Naturally, one puts his hand in a pan of curry and another takes off his shirt and affixes a sign to himself that reads "touch me." We're off!
‘‘A lot of people think that rationality means acting like Spock and ignoring things like intuition and emotion,’’ [co-founder Julia Galef] said. ‘‘But we’ve found that that approach doesn’t actually work.’’ Instead, she said, the aim was to bring the emotional, instinctive parts of the brain (dubbed ‘‘System One’’ by Kahneman) into harmony with the more intellectual, goal-setting parts of the brain (‘‘System Two’’).
An elaboration on that:
‘‘The prefrontal cortex is like a monkey riding an elephant,’’ she told the group. ‘‘System One is the elephant. And you’re not going to steer an elephant by telling it where it should go... When you realize that people are complex systems — that we operate in complicated ways, but also sort of follow rules — you start to think about how you might tweak some of those variables.’’
CFAR appears to be a mashup of bits of cognitive science and contemporary self-help with a dash of religion, all served up like a heavily opiated cocktail to anyone who can afford it, considers themselves too smart for self-help, and maybe spent too many of their college days writing code.
The program seems to be based on several typically religious assumptions as old as time itself: that humans are bad and need to be fixed, that we'll soon be destroyed en masse (wait until you hear how!), and that immortality is attainable (also weird!).
How to live forever and save the planet in the process? That's not clear, but it starts with CFAR. Says one attendee, as if quoting an episode of Silicon Valley, "Self-help is just the gateway. The real goal is: Save the world."
The Master in all of this is artificial-intelligence researcher Eliezer Yudkowsky, who founded the Machine Intelligence Research Institute (MIRI), which provided the original funding for CFAR. The two groups share a Berkeley office.
"Yudkowsky is a controversial figure," the magazine puts it gently, "Mostly self-taught — he left school after eighth grade — he has written openly about polyamory and blogged at length about the threat of a civilization-ending A.I." And even if you think poly is legit, Yudkowsky also subscribes fully to cryonics, i.e. preserving bodies so as to resurrect them at a later time.
Oh, and ‘‘I wouldn’t be surprised if tomorrow was the Final Dawn, the last sunrise before the earth and sun are reshaped into computing elements’’ is an actual quote. So is "I think my efforts could spell the difference between life and death for most of humanity."
I say that while it's nice these "rationalists" are trying to improve themselves, there are far more rational ways to do it, and this all sounds about as science-driven as Scientology. Go to group therapy. Go to church, even. The story here isn't people being maybe scammed. It's that a group of otherwise very smart, sort of sad people who say they're thinking for themselves are being maybe scammed. But, again: a tale as old as time.
So save your money for the Esalen Institute in Big Sur — at least that place is cool.
Related:
All About The Real Big Sur Retreat From The Mad Men Finale
Brain Drain: Game App Lumosity Will Pay $2 Million For 'Unfounded' Cognitive Benefit Claims