MemCast
Continual learning may not be necessary for transformative AI
  • Current approaches (pre-training + RL) may suffice
  • Parallels historical ML barriers that dissolved with scaling
  • Coding shows end-to-end capability emerging without explicit continual learning
  • Long context windows can substitute for some learning
Dario Amodei · Dwarkesh Patel · 00:20:33

Supporting quotes

"I think continual learning, as I've said before, might not be a barrier at all. I think we may just get there by pre-training generalization and RL generalization." (Dario Amodei)
"People talked about, 'How do your models keep track of nouns and verbs?' 'They can understand syntactically, but they can't understand semantically? It's only statistical correlations.' But then suddenly it turns out you can do code and math very well." (Dario Amodei)

From this concept

Continual Learning Debate

Discussion of whether AI systems need human-like continual learning to become economically transformative, or whether scaling current approaches will suffice.

