“One is how much raw compute you have. The second is the quantity of data. The third is the quality and distribution of data. It needs to be a broad distribution. The fourth is how long you train for.” — Dario Amodei
“Then the sixth and seventh were things around normalization or conditioning, just getting the numerical stability so that the big blob of compute flows in this laminar way instead of running into problems.” — Dario Amodei
Amodei explains his "Big Blob of Compute" theory: that AI progress depends primarily on seven scalable factors rather than on algorithmic breakthroughs. He discusses how this view applies to RL and to generalization.