LLM Daydreaming
- #AI Innovation
- #Machine Learning
- #Cognitive Science
- LLMs lack fundamental aspects of human thought, such as continual learning and a 'default mode' for background processing.
- Proposal of a 'day-dreaming loop' (DDL) for LLMs to simulate spontaneous human insight by continuously sampling and linking concepts.
- The DDL involves a generator model exploring links between concepts and a critic model filtering for valuable ideas, creating a feedback loop.
- The 'daydreaming tax' refers to the substantial compute cost of running the DDL continuously, a price that may nonetheless be necessary for genuine innovation.
- Strategic implication: expensive daydreaming AIs could generate proprietary training data for training cheaper, more efficient models, bypassing the 'data wall'.
- Human researchers benefit from continual thinking and background processing; the human default mode network is associated with spontaneous thought and creativity, and LLMs currently have no analogue of either.
- The DDL is inspired by wake-sleep algorithms and the default mode network, proposing a method for LLMs to achieve similar creativity.
- Potential obstacles include the high cost of the DDL and the challenge of optimizing the process for useful insights.
- Implications suggest that only power-users or researchers may be willing to pay the 'daydreaming tax' for novel insights.
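The generator–critic loop described above can be sketched in a few lines. This is a minimal toy illustration, not the essay's implementation: `generate_link` and `critic_score` are hypothetical stand-ins for LLM calls (here a template string and a random score), and the concept list is invented for the example.

```python
import itertools
import random

# Hypothetical concept pool; in a real DDL these would be drawn from the
# model's knowledge or retrieved documents.
CONCEPTS = ["transformers", "sleep cycles", "caching", "evolution", "markets"]

def generate_link(a: str, b: str) -> str:
    # Stand-in for the generator model: propose a connection between
    # two randomly sampled concepts.
    return f"Idea: what if {a} worked like {b}?"

def critic_score(idea: str) -> float:
    # Stand-in for the critic model: rate the idea's novelty and value.
    # A real critic would be another LLM call; here it is a toy heuristic.
    return random.random()

def daydream(n_samples: int = 10, threshold: float = 0.8, seed: int = 0) -> list[str]:
    """Sample concept pairs, generate candidate links, keep high-scoring ones."""
    random.seed(seed)
    pairs = list(itertools.combinations(CONCEPTS, 2))
    kept = []
    for a, b in random.sample(pairs, min(n_samples, len(pairs))):
        idea = generate_link(a, b)
        if critic_score(idea) >= threshold:
            # In the full loop, kept ideas would feed back as training data.
            kept.append(idea)
    return kept
```

The 'daydreaming tax' is visible even in the toy version: most sampled pairs are discarded by the critic, so the cost per kept insight is many times the cost per generated candidate.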