Here again Cory is misrepresenting the LLM critics’ argument: Sam Altman is a scam artist and habitual liar, but that’s not one of the first 10 to 20 reasons people criticise OpenAI’s products. Sure, basically every leading figure in the “AI” space seems to be unpleasant at best, but that’s true for most of tech, TBH. People criticise LLMs for their structural properties and their material impacts: for the way they make it harder to learn and grow, for the way they make products worse while creating massive negative externalities in the form of emissions, water use and e-waste. For the way these systems can only be built by taking every piece of data, regardless of whether the authors consent or even explicitly refuse, and for the way the training needs ungodly amounts of harmful, exploitative labor done mostly by people in countries of the global majority. For how they materially harm the commons.
…
Artifacts and technologies have certain logics built into their structure that require certain social arrangements around them, or that bring such arrangements into being. The second aspect is often illustrated with ships: because ships sometimes end up in dangerous situations where critical decisions need to be made quickly, the existence of ships implies a hierarchy of power relationships with a captain who has the final say; democracy would simply be too slow at times. These politics are built into the artifact.
Understanding this, you cannot take just any technology and “make it good”. Is a torture device “good” if the plans for building it are Creative Commons? Do we need to answer the existence of the digital torment nexus by building an open-source torment nexus? I’d argue we need to destroy it, regardless of what license it is released under.
Cory’s having a moment because he came up with the phrase “enshittification,” but I think he and Boing Boing have always sucked.