

It’s also one of the authors of “Attention Is All You Need”, one of the founding texts of the AI ideology.
he/they




I’m gonna say it: The entire “artificial intelligence”/“machine learning” research field is corrupt. They have institutionally accepted the bullshit fountain as a tool. It doesn’t matter if they’re only using chatbots as a “pilot program”; they’ve bought into the ideology. They’ve granted fashtech a seat at the bar and forced all the other customers to shake its hand.
Abso-fucking-lutely. Oxford’s latest “research paper” isn’t marketing - it’s propaganda. Propaganda for bullshit fountains, and for the ideology which endorses them.


Another deep-dive into DHH’s decline has popped up online: DHH and Omarchy: Midlife crisis:


What’s a government backstop, and does it happen often? It sounds like they’re asking for a preemptive bail-out.
Zitron’s stated multiple times that a bailout isn’t coming, but I’m not ruling it out myself - AI has proven highly useful as a propaganda tool and an accountability sink, and the oligarchs in office have good reason to keep it alive.


Considering we’ve already got a burgeoning Luddite movement that’s been kicked into high gear by the AI bubble, I’d personally like to see an outgrowth of that movement be what ultimately kicks the Jihad off.
There were already some signs of this back in August, when anti-AI protesters vandalised cars and left “Butlerian Jihad” leaflets outside a pro-AI business meetup in Portland.
Alternatively, I can see the Jihad kicking off as part of an environmentalist movement - to directly quote Baldur Bjarnason:
[AI has] turned the tech industry from a potential political ally to environmentalism to an outright adversary. Water consumption of individual queries is irrelevant because now companies like Google and Microsoft are explicitly lined up against the fight against climate disaster. For that alone the tech should be burned to the ground.
I wouldn’t rule out an artist-led movement being how the Jihad starts, either - between the AI industry “directly promising to destroy their industry, their work, and their communities” (to quote Baldur again), and the open and unrelenting contempt AI boosters have shown for art and artists, artists in general have plenty of reason to see AI as an existential threat to their craft and/or a show of hatred for who they are.


Part of me wants to see Google actually try this and get publicly humiliated by their nonexistent understanding of physics; part of me dreads that it’ll dump even more fucking junk into space.


Found a high-quality sneer of OpenAI from the Los Angeles Review of Books: Literature Is Not a Vibe: On ChatGPT and the Humanities


Plus, the authors currently suing OpenAI have gotten their hands on emails and internal Slack messages discussing OpenAI’s deletion of the LibGen dataset - a development which opens the company up to much higher damages, and to sanctions from the court for destroying evidence.


That’s quite a remarkable claim. Especially when the actual number of attacks by AI-generated ransomware is zero. [Socket]
If even a single case pops up, I’d be surprised - AFAIK, cybercriminals are exclusively using AI as a social engineering tool (e.g. voice-cloning scams, AI-extruded phishing emails, etc.). Humans are the weakest part of any cybersec system, after all.
The paper finishes by recommending “embracing AI in cyber risk management”.
Given AI’s track record on security, that sounds like an easy way to become an enticing target.


The question of how to cool shit in space is something that BioWare asked themselves when writing the Mass Effect series, and they came up with some pretty detailed answers that they put in the game’s Codex (“Starships: Heat Management” in the Secondary section, if you’re looking for it).
That was for a series of sci-fi RPGs which hasn’t had a new installment since 2017, and yet nobody’s even bothering to ask these questions when discussing technological proposals which could very well cost billions of dollars.


Performing the SPARTAN Program’s original aim, sir.
Reading this, I also learned that Bitwarden has bought into AI. They don’t appear to have let vulnerability extruders ruin their code as of this writing, but any willingness to entertain the fascism machines is enough for me to consider jumping ship.