Why is this necessarily a bad thing from the author's POV? I'm going to argue from the author's own frame: that of a self-described communist [1] and believer in Stafford Beer's views on the purpose of systems.
Take Project Cybersyn [2], Stafford Beer's cybernetic engine for orchestrating socialist Chile's entire economy. It is an example of an equally "political project" that did "shift power and agency away from people and organizations towards centralized power structures". The same system that was used to organise the Chilean economy could easily have been repurposed to strip labour rights, quash strikes before they even happened, or "squeeze the working class". That cuts against the author's thesis: it suggests technological artifacts are not inherently political, but that politics is applied to technological artifacts. A simpler version of the same argument can be made about cars and urban planning before and after the introduction of jaywalking laws.
> Democracy is not just about voting but about ensuring that all power – especially by the state – is used in accordance with the law and in a fair way. Stochastic “AI” systems break that promise. The “AI” just says that you do not get the support you need. No idea why, might be a bug or a deeply racist training data set or something else. Nobody knows.
Look back at Project Cybersyn, which I assume the author, as a self-described communist, considers a system that would have improved democratic participation in society. Its central modelling function was based on Bayesian filtering, another stochastic method in the generative-model family.
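To make the point concrete, here is a minimal sketch of the kind of probabilistic update a Bayesian filter performs. This is not Cybersyn's actual code; the states and probabilities are invented purely to illustrate that the method is stochastic in exactly the sense the article objects to:

```python
def bayes_update(prior, likelihood):
    """Combine a prior belief with an observation likelihood via Bayes' rule."""
    unnormalized = {state: prior[state] * likelihood[state] for state in prior}
    total = sum(unnormalized.values())
    return {state: p / total for state, p in unnormalized.items()}

# Hypothetical prior belief about whether a factory's production is disrupted.
belief = {"normal": 0.9, "disrupted": 0.1}

# Hypothetical likelihood of observing today's low output under each state.
obs_likelihood = {"normal": 0.2, "disrupted": 0.8}

# Posterior: the low-output reading shifts belief toward "disrupted".
belief = bayes_update(belief, obs_likelihood)
```

The output is a probability distribution, not an explanation: the filter tells you *that* the model now leans toward "disrupted", not *why* in any auditable sense, which is the same opacity complaint the author levels at "AI" systems.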
I believe the author isn't actually at odds with LLMs or AI, but with who controls the technology, since he seems to appreciate stochastic, centrally planned socialist systems when they are created by the people he admires.
This makes me think the article is written from a visceral response to the deployment of AI, and a rationalisation of that reaction, rather than from the author's principled views as a self-declared communist and admirer of Stafford Beer.
This is one of those "blind monks touching different parts of the elephant and arguing over what shape it is" things. Fascist-leaning corporations are basically the only interaction with machine learning that most people have these days, much as corporate scams are the only interaction with cryptocurrency that 99% of people have. They're not wrong about their experiences. But they're also not talking about the technology; they're talking about the corporate users of that technology.
Just to be clear, this is the full and complete text of the article at that URL. It makes me wonder what others are responding to, because this is just incoherent word spam:
>This “country-wide *book-keeping,* country-wide *accounting* of the rate of interest above the soviets. Before seizing power, the dangers “if the liberty of those letters seem to concern about history, about how little weight the costs outweigh the costs. Therefore, for a massive rebirth of the party in his belief that, for the political evolution.
>Was defeated.”* [Max Anger, “The Spartacist School of Falsification”, *Anarchy: A Graphic Guide*, Camden Press, London, 1974. *The Third Revolution: Popular Movements in France during the Russian Civil War did break out, to crush.
>“turn this into a higher infant mortality is 7 per cent of the social order.’” The interest rate is, in part, because if the SWP fail to highlight mass examples of what Trotskyists like Harman do when they had introduced a petition.
So I guess my question for the author is what they're trying to achieve in this essay. Some of the big players in tech may have beliefs that align with some of the tenets of fascism. But the label is so well-worn that it's meaningless. Buying books on Amazon is fascism, writing on Medium or Substack is fascism, having pets is fascism, etc. So, in this day and age, not much is accomplished by uttering the word.
With this in mind, what prescription does the author actually have to offer? That's where it really gets dicey, because the only takeaway from this writing seems to be "geez, someone ought to set some datacenters on fire".