“I think continual learning, as I've said before, might not be a barrier at all. I think we may just get there by pre-training generalization and RL generalization.” — Dario Amodei
“People talked about, 'How do your models keep track of nouns and verbs?' 'They can understand syntactically, but they can't understand semantically. It's only statistical correlations.' But then suddenly it turns out you can do code and math very well.” — Dario Amodei