MemCast
AI progress continues roughly as predicted in 2017
  • The 'Big Blob of Compute Hypothesis' from 2017 still holds true
  • Only a handful of factors determine progress: compute, data quantity and quality, training duration, a scalable objective function, and numerical stability
  • RL scaling now shows log-linear improvements similar to those seen earlier in pre-training (see the illustrative formula below)
Dario Amodei · Dwarkesh Patel · 00:01:59
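
For context, "log-linear" means performance improves by a roughly constant amount each time training compute is multiplied by a fixed factor. A minimal sketch of that relationship in LaTeX, where score, the intercept a, the slope b, and the compute variable C are illustrative placeholders rather than figures from the episode:

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% "Log-linear" scaling: the score rises by a roughly constant amount
% each time training compute C is multiplied by a fixed factor.
% a, b, and C are illustrative placeholders, not values from the episode.
\[
  \mathrm{score}(C) \approx a + b \log_{10} C
\]
\end{document}

Under this form, each doubling of C adds the same increment, b log10(2), to the score, which is why scaling curves look like straight lines when compute is plotted on a log axis.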

Supporting quotes

"I actually have the same hypothesis I had even all the way back in 2017. I think I talked about it last time, but I wrote a doc called 'The Big Blob of Compute Hypothesis'." (Dario Amodei)
"What it says is that all the cleverness, all the techniques, all the 'we need a new method to do something', that doesn't matter very much. There are only a few things that matter." (Dario Amodei)

From this concept

The End of the Exponential

Amodei argues we are nearing the end of the exponential phase of AI progress, with capabilities reaching human level across many domains much sooner than most people expect. He discusses why public recognition lags behind the technical reality.
