Full bibliography
A Disproof of Large Language Model Consciousness: The Necessity of Continual Learning for Consciousness
Resource type
Preprint
Author/contributor
- Hoel, Erik (Author)
Title
A Disproof of Large Language Model Consciousness: The Necessity of Continual Learning for Consciousness
Abstract
Scientific theories of consciousness should be falsifiable and non-trivial. Recent research has given us formal tools to analyze these requirements of falsifiability and non-triviality for theories of consciousness. Surprisingly, many contemporary theories of consciousness fail to pass this bar, including theories based on causal structure but also (as I demonstrate) theories based on function. Herein, I show these requirements of falsifiability and non-triviality especially constrain the potential consciousness of contemporary Large Language Models (LLMs) because of their proximity to systems that are equivalent to LLMs in terms of input/output function; yet, for these functionally equivalent systems, there cannot be any falsifiable and non-trivial theory of consciousness that judges them conscious. This forms the basis of a disproof of contemporary LLM consciousness. I then show a positive result, which is that theories of consciousness based on (or requiring) continual learning do satisfy the stringent formal constraints for a theory of consciousness in humans. Intriguingly, this work supports a hypothesis: If continual learning is linked to consciousness in humans, the current limitations of LLMs (which do not continually learn) are intimately tied to their lack of consciousness.
Repository
arXiv
Archive ID
arXiv:2512.12802
Date
2026-01-19
Accessed
2026-01-26, 9:37 AM
Short Title
A Disproof of Large Language Model Consciousness
Extra
arXiv:2512.12802 [q-bio]
Notes
Comment: 31 pages, 3 figures. V3: Added new section (4.1), restructured section 5.1, and further expanded citations
Citation
Hoel, E. (2026). A Disproof of Large Language Model Consciousness: The Necessity of Continual Learning for Consciousness (arXiv:2512.12802). arXiv. https://doi.org/10.48550/arXiv.2512.12802