

What with all the rapid innovation surrounding small modular reactors, I still firmly believe that I will be able to bolt a nuclear reactor to a DeLorean in my garage
Much the same mistake I (and many others) made trying to get into the series! Although I have to say that I’m still one of the philistines that gerikson brings up who’s read Phlebas and Player of Games and not much else.
Nitpicking, but at what point do we start calling it race pseudoscience? Letting the creeps have even a tiny bit of legitimacy is too much, especially as mainstream outfits are working overtime to legitimize them.
Looking forward to stumbling across this one in a used bookstore 20 years from now, comically misfiled next to a copy of John Dies at the End
The essay proclaiming broad stagnation is now well over a decade old, and Thiel stands by that thesis, but hey, Thiel himself definitely isn’t part of the problem! Invest in blockchain-powered AI gene editing today!
I keep telling people that Thiel isn’t some kind of boogeyman end-boss hiding behind Musk, because he’s clearly just as loaded and incompetent as Musk; he only takes more care to keep it out of the public eye. But every time he pops his head up for some garbage like this, I am forced to reconsider that latter conclusion.
“Look how AI abuse by overconfident fools wrecked the government” should easily be a golden campaign platform, but given how credulous influential Democrats are being about cryptocurrency at this late date, I dunno
My perspective is that EA and the upper-class philanthropy it inherits from are consumerist, a system that rests on top of colonialism. It’s basically selling spiritual consumer goods, much like the medieval Catholic Church selling indulgences (and look what that provoked!). Once we get beyond the public health interventions, into longtermist EA’s “trillions of simulated minds in our future lightcone” bullshit, it’s clearly selling an unhealthily narcissistic spirituality, though its adherents would never call it that. The product, in this case, is the warm fuzzy self-aggrandizing feeling that one can extend one’s (over)privileged position in our relatively fragile 21st century society into influence over sci-fi-scale expanses of time and space.
For Yarvin, it always is and always will be someone else’s fault
ChatGPT’s got what intelligence craves… it’s got neurons
I actually think it’s part-and-parcel of Yarvin’s personality. As much as he rails against “the Cathedral,” PMCs, whatever, he himself is a perfect example of a pathological middle manager. Somebody who wants power without having to shoulder ultimate responsibility. He craves the childishly simplified social environment of a medieval-fantasy king’s court, but he doesn’t want to be the king himself. He wants to be (and has been, up until now) the scheming vizier who can run his manipulation games in the background, deciding who gets in front of the king but not having to take the heat if the king makes a bad decision. (And the “kings” he works for have made plenty of bad decisions, but consequences have only just begun to catch up.)
I suspect this newfound mainstream attention is far more uncomfortable than it is validating for him. Perhaps the NYT profile was a burst of exhilaration, but the shine has worn off quickly. This correlates with the story last year about him coming back to Urbit as a “wartime CEO.” If Urbit is so damn important for building his ridiculous vision, why wasn’t he running it the whole time? He doesn’t actually want to be CEO of anything. Power without responsibility.
He will never stop to reflect that his “philosophy,” such as it is, is explicitly tailored for avaricious power-hungry narcissists, soooooo
Obvious joke is obvious, but
The essay brims with false dichotomies, logical inconsistencies, half-baked metaphors, and allusions to genocide. It careens from Romanian tractor factories to Harvard being turned “into dust. Into quarks” with the coherence of a meth-addled squirrel.
Harvard isn’t already full of Quarks?
For my money, 2015/16 Adams trying to sell Trump as a “master persuader” while also desperately pretending not to be an explicit Trump supporter himself was probably the most entertaining he’s ever been. Once he switched from skimmable text blogging to livestreaming, though, he wanted to waste too much of my time to be interesting anymore.
Yes, Kurzweil desperately trying to create some kind of a scientific argument, as well as people with university affiliations like Singer and MacAskill pushing EA, are what give this stuff institutional strength. Yudkowsky and LW are by no means less influential, but they’re at best a student club that only aspires to be a proper curriculum. It’s surely no coincidence that they’re anchored in Berkeley, adjacent to the university’s famous student-led DeCal program.
FWIW, my capsule summary of TPOT/“post-rationalists” is that they’re people who thought that advanced degrees and/or adjacency to VC money would yield more remuneration and influence than they actually did. Equally burned out, just further along the same path.
I’ve been contemplating this, and I agree with most everyone else about leaning heavily into the cult angle and explaining it as a mutant hybrid between Scientology-style UFO religions and Christian dispensationalist Book of Revelation eschatology. The latter may be especially useful in explaining it to USians. My mom (who works in an SV-adjacent job) sent me this Vanity Fair article the other day about Garry Tan grifting his way into non-denominational prosperity gospel Christianity: https://www.vanityfair.com/news/story/christianity-was-borderline-illegal-in-silicon-valley-now-its-the-new-religion She was wondering if it was “just another fad for these people,” and I had to explain that no, not really: their AI bullshit is so outlandish that some of them feel the need to pivot back towards something more mainstream to keep growing their following.
I also prefer to highlight Kurzweil’s obsession with perpetual exponential growth curves as a central point. That’s often what I start with when I’m explaining it all to somebody. It provides the foundation for the bullshit towers that Yudkowsky and friends have erected. And I also think that long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than Yudkowsky, because Kurzweil is better-organized and professionally published. It’ll most likely be the main source in the lower-division undergraduate/AP high school history texts that highlight this stuff as a background trend in the 2010s/2020s. Right now, we live in the peak days of the LessWrong bullshit volcano plume, but ultimately, it will probably be interpreted by the specialized upper-division texts that grow out of peoples’ PhD theses.
Huh, 2 paradigm shifts is about what it takes to get my old Beetle up to freeway speed, maybe big Yud is onto something
It is what happened to look good in the valley between the Adderall comedown and yesterday evening’s edible really starting to hit
And the photos from a previous event are an ocean of whiteness. Hard to argue that they’re not, uh, cultivating a certain demographic…
I propose we pool funds, buy an old motel on the other side of the city limits in Oakland, and rename it Farthaven
Over the last few years, I have fully gotten on board with the idea that the haunting vestige of the idea of people as property is one of the core weaknesses of American society, and of the “western civilization” enthusiasts who promote its supremacy.
Of course, there are a lot of other people who have been on board with that point of view for centuries.