The Abstraction Fallacy: Why AI Can Simulate but Not Instantiate Consciousness
- #computational functionalism
- #AI consciousness
- #ontology of computation
- The Abstraction Fallacy challenges computational functionalism in AI consciousness debates, arguing it mischaracterizes the relationship between physics and information.
- Symbolic computation is not an intrinsic physical process but a mapmaker-dependent description requiring an experiencing agent to discretize continuous physics into meaningful states.
- To assess AI sentience, we need a rigorous ontology of computation rather than a complete theory of consciousness; without one, we risk deepening the AI welfare trap.
- The framework separates simulation (behavioral mimicry via vehicle causality) from instantiation (intrinsic constitution via content causality), showing why algorithmic symbol manipulation cannot instantiate experience.
- AI consciousness would depend on specific physical constitution, not syntactic architecture, offering a physically grounded refutation of computational functionalism.