Does asking for consent and attribution in the training of LLMs impede access to knowledge? This question came up in a recent debate, where I was told that my criticism of large language models (LLMs) being trained on copyrighted content without consent or attribution implied that I was “against learning” – or “against the free exchange of knowledge.”
This line of criticism misunderstands what learning actually is. And it is particularly mistaken when directed at someone who has spent her life engaging in knowledge exchange, often without compensation. But more generally:
Access to knowledge is not achieved by feeding books into machines. It’s achieved through public libraries, access to education, and dialogue; and yes, through fair mechanisms of intellectual contribution and recognition.
Saying that the “binge-eating” of human-made content by LLMs is equivalent to human learning, and that restricting the former equals opposing the latter, is reductionism at its finest.
It’s the idea that if we just feed a machine enough tokens, it somehow “knows.” But compressing isn’t knowing. Nor is prediction understanding. And token probability isn’t dialogue.
And that’s exactly the point: Saying no to unfettered machine ingestion has nothing to do with restricting access to human knowledge. It has to do with upholding responsibility.
What’s more disturbing: I’m not even sure whether those who defend this alleged equivalence – i.e. that human and machine learning are the same – truly believe it, or whether they maintain it only for strategic reasons, because without it they couldn’t promote a business model that thrives on appropriation.
As Viktor Frankl warned decades ago: when we reduce the human being to a machine, we don’t just simplify; we risk creating an existential vacuum. And when we reduce human learning to pure “ingestion”, we lose the very qualities that make knowledge valuable: context, connection, and meaning. All three are absent in LLMs.
Originally shared on LinkedIn on April 2, 2025.
Header image by Alexis Brown on Unsplash.