Scaling data, compute, and model size alone will hit diminishing returns without new algorithmic ideas
  • Fei‑Fei Li acknowledges that larger models and more data have produced impressive gains, yet she warns that simply adding GPUs will not solve core limitations.
  • She cites tasks that current models cannot perform, such as counting chairs in a 3‑D scene or deriving Newtonian physics from raw observations.
  • She argues that breakthroughs in architectures and training paradigms, comparable in impact to the transformer, are still needed to achieve higher-level reasoning.
  • This perspective aligns with research communities calling for “efficiency‑first” and “reasoning‑first” approaches.
  • The insight guides investors and labs to fund exploratory work beyond brute‑force scaling.
Fei‑Fei Li · Lenny's Podcast · 00:23:55

Supporting quotes

"I definitely think we need more innovations. Scaling more data, more GPUs and bigger current model architecture is still a lot to be done, but we absolutely need to innovate more." -- Fei‑Fei Li (Limits of scaling)

"We still can't count the number of chairs in a video of a room, something a toddler can do." -- Fei‑Fei Li (Capability gap)

From this concept

Beyond Scaling -- The Need for New Innovations in AI

While data, compute, and model size have driven recent advances, Fei-Fei Li stresses that continued breakthroughs require fresh ideas -- especially in world modeling, embodied intelligence, and multimodal reasoning.
