Million-token contexts enable 'days of human learning'
  • Current context lengths (~1M tokens) are equivalent to days or weeks of human reading
  • This enables substantial in-context learning
  • Longer contexts (10M+) could enable months of learning
  • The remaining challenge is engineering (training at longer contexts and serving them at inference), not a fundamental limit
Dario Amodei · Dwarkesh Patel · 00:41:42

Supporting quotes

"A million tokens is a lot. That can be days of human learning. If you think about the model reading a million words, how long would it take me to read a million? Days or weeks at least." (Dario Amodei)
"There's nothing preventing longer contexts from working. You just have to train at longer contexts and then learn to serve them at inference." (Dario Amodei)
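As a rough sanity check on the "days of reading" framing, the sketch below converts a context length into human reading time. The words-per-token ratio (~0.75) and reading speed (~250 words per minute) are outside assumptions, not figures from the episode.

```python
# Back-of-envelope: how long would a human take to read a 1M-token context?
# Assumptions (not from the episode): ~0.75 English words per token,
# ~250 words per minute reading speed, ~8 hours of focused reading per day.

WORDS_PER_TOKEN = 0.75       # rough average for English text
WORDS_PER_MINUTE = 250       # typical adult reading speed
READING_HOURS_PER_DAY = 8

def reading_time_days(context_tokens: int) -> float:
    """Estimated days of focused reading to get through `context_tokens`."""
    words = context_tokens * WORDS_PER_TOKEN
    minutes = words / WORDS_PER_MINUTE
    return minutes / 60 / READING_HOURS_PER_DAY

for tokens in (1_000_000, 10_000_000):
    print(f"{tokens:>10,} tokens ≈ {reading_time_days(tokens):.1f} days of reading")
# 1M tokens works out to roughly a week of reading; 10M to a couple of months,
# consistent with the "days or weeks" and "months of learning" framing above.
```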

From this concept

Continual Learning Debate

Discussion of whether AI systems need human-like continual learning to be economically transformative, or whether scaling current approaches will suffice.
