According to Kant’s epistemology, knowledge is a ‘construction’ by the subject’s mind, not a passive ‘reception’.[7]
Kant argued that knowledge is not passively imprinted on the mind by the external world but is actively ‘synthesized’ by the subject, who uses a priori categories (such as causality and substance) to organize the sensory manifold (the raw data of the senses). A book or a Transformer’s output merely provides such a ‘sensory manifold’: a combination of symbols. These symbols are, in themselves, meaningless and silent. A Transformer generates complex arrangements of symbols through its massive parameters and training data, but these arrangements are still just ‘material’; it lacks the subject’s capacity to ‘synthesize’ them into coherent knowledge. A Transformer’s output is therefore like a book that has not been read: raw ‘informational material’ awaiting processing, not a knowledge structure formed by the mind’s synthesis.
Gadamer’s hermeneutics further clarifies that meaning and knowledge emerge in the ‘understanding event’ rather than pre-existing in texts.[8]
Gadamer holds that texts (including books and a Transformer’s outputs) do not ‘contain’ fixed meanings or knowledge. They are merely ‘triggers’: genuine meaning is generated dynamically in the reader’s ‘understanding event,’ in which the reader’s ‘pre-understanding’ merges with the text in a ‘fusion of horizons.’ A Transformer’s generation process, driven by statistical regularities, is mere concatenation of symbols. It has no ‘pre-understanding’ and cannot engage in a genuine ‘fusion of horizons.’ Its fluent text simulates the expression of human knowledge but carries no actual understanding. Therefore, whether a Transformer’s output contains ‘knowledge’ depends entirely on whether a human reader brings about a genuine ‘understanding event.’
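The mechanical character of this generation can be made concrete with a minimal, hypothetical sketch: a toy bigram sampler, far simpler than an actual Transformer, trained on an invented miniature corpus. Each symbol is drawn from conditional frequencies over the preceding symbol, and at no point does a ‘pre-understanding’ or interpretive horizon enter the loop.

```python
import random
from collections import defaultdict, Counter

# Toy stand-in for a language model: a bigram table estimated from a tiny,
# invented corpus. An actual Transformer is vastly more complex, but the
# generation loop has the same character: the next symbol is drawn from a
# distribution conditioned on previous symbols; no interpretation occurs.
corpus = "the earth is round . the earth orbits the sun . the moon is round .".split()

bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def generate(start: str, length: int = 6) -> list[str]:
    """Concatenate symbols by repeatedly sampling from conditional frequencies."""
    tokens = [start]
    for _ in range(length):
        followers = bigram_counts[tokens[-1]]
        if not followers:
            break
        words, counts = zip(*followers.items())
        tokens.append(random.choices(words, weights=counts)[0])
    return tokens

print(" ".join(generate("the")))  # fluent-looking concatenation, nothing more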
Finally, externalist semantics (e.g., Putnam, Burge) argues that the justification of knowledge depends on the external world and the linguistic community, not on the internal symbol system alone.[9]
The classic definition of knowledge as ‘justified true belief’ requires justification, i.e., a reason for holding a proposition to be true. Externalist semantics emphasizes that the meaning and truth conditions of symbols (e.g., ‘water’) depend on the external world (reference to real H₂O) and on the linguistic community (shared rules of use). A Transformer, however, is confined to the textual world constructed by its training data. Its symbolic associations derive entirely from statistical co-occurrences in that data and cannot be anchored to the external world. It outputs ‘The Earth is round’ not because it has confirmed this belief through observation or logical reasoning, but because ‘Earth’ and ‘round’ co-occur frequently in the training data; it cannot supply genuine, world-based ‘reasons’ for what it outputs. Thus, a Transformer’s output is, at most, a knowledge claim or an expression of information, whose truth and validity must ultimately be judged by human cognitive agents embedded in the world and engaged in social practices.
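The point about justification can likewise be illustrated with a hypothetical toy sketch (the corpora and the function name are invented for illustration): the only ‘reason’ such a statistical model has for a completion is a frequency count over its corpus, and the very same procedure ‘justifies’ a falsehood just as readily once the corpus changes, because nothing in the computation touches the world.

```python
from collections import Counter

# The model's only "reason" for completing "the earth is ___" is a count of
# what followed that context in its corpus. Swap the corpus and the
# "justification" changes with it; the symbols never reach beyond the text.
def completion_counts(corpus: str, context: tuple[str, ...]) -> Counter:
    tokens = corpus.split()
    n = len(context)
    return Counter(
        tokens[i + n]
        for i in range(len(tokens) - n)
        if tuple(tokens[i:i + n]) == context
    )

corpus_a = "the earth is round . astronomers agree the earth is round ."
corpus_b = "the earth is flat . the pamphlet insists the earth is flat ."

print(completion_counts(corpus_a, ("earth", "is")))  # Counter({'round': 2})
print(completion_counts(corpus_b, ("earth", "is")))  # Counter({'flat': 2})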