Dwarkesh Patel & Jeffrey Dean, Dwarkesh Podcast

Jeff Dean & Noam Shazeer – 25 years at Google: from PageRank to AGI

12 Feb 2025 • 134 min • EN

This week I welcome on the show two of the most important technologists ever, in any field. Jeff Dean is Google's Chief Scientist, and through 25 years at the company has worked on many of the most transformative systems in modern computing: from MapReduce, BigTable, TensorFlow, and AlphaChip to Gemini. Noam Shazeer invented or co-invented many of the main architectures and techniques used for modern LLMs: from the Transformer itself, to Mixture of Experts, to Mesh TensorFlow, to Gemini and many other things.

We talk about their 25 years at Google, going from PageRank to MapReduce to the Transformer to MoEs to AlphaChip – and maybe soon to ASI. My favorite part was Jeff's vision for Pathways, Google's grand plan for a mutually-reinforcing loop of hardware and algorithmic design and for going past autoregression. That culminates in us imagining *all* of Google-the-company going through one huge MoE model. And Noam just bites every bullet: 100x world GDP soon; let's get a million automated researchers running in the Google datacenter; living to see the year 3000.

Watch on YouTube; listen on Apple Podcasts or Spotify.

Sponsors

Scale partners with major AI labs like Meta, Google DeepMind, and OpenAI. Through Scale's Data Foundry, labs get access to high-quality data to fuel post-training, including advanced reasoning capabilities. If you're an AI researcher or engineer, learn about how Scale's Data Foundry and research lab, SEAL, can help you go beyond the current frontier at scale.com/dwarkesh

Curious how Jane Street teaches their new traders? They use Figgie, a rapid-fire card game that simulates the most exciting parts of markets and trading. It's become so popular that Jane Street hosts an inter-office Figgie championship every year. Download from the app store or play on your desktop at figgie.com

Meter wants to radically improve the digital world we take for granted. They're developing a foundation model that automates network management end-to-end. To do this, they just announced a long-term partnership with Microsoft for tens of thousands of GPUs, and they're recruiting a world-class AI research team. To learn more, go to meter.com/dwarkesh

To sponsor a future episode, visit dwarkeshpatel.com/p/advertise

Timestamps

00:00:00 - Intro
00:02:44 - Joining Google in 1999
00:05:36 - Future of Moore's Law
00:10:21 - Future TPUs
00:13:13 - Jeff's undergrad thesis: parallel backprop
00:15:10 - LLMs in 2007
00:23:07 - "Holy s**t" moments
00:29:46 - AI fulfills Google's original mission
00:34:19 - Doing Search in-context
00:38:32 - The internal coding model
00:39:49 - What will 2027 models do?
00:46:00 - A new architecture every day?
00:49:21 - Automated chip design and intelligence explosion
00:57:31 - Future of inference scaling
01:03:56 - Already doing multi-datacenter runs
01:22:33 - Debugging at scale
01:26:05 - Fast takeoff and superalignment
01:34:40 - A million evil Jeff Deans
01:38:16 - Fun times at Google
01:41:50 - World compute demand in 2030
01:48:21 - Getting back to modularity
01:59:13 - Keeping a giga-MoE in-memory
02:04:09 - All of Google in one model
02:12:43 - What's missing from distillation
02:18:03 - Open research, pros and cons
02:24:54 - Going the distance

Get full access to Dwarkesh Podcast at www.dwarkeshpatel.com/subscribe

From "Dwarkesh Podcast"

Listen on your iPhone

Download our iOS app and listen to interviews anywhere. Enjoy all of the listener functions in one slick package. Why not give it a try?

App Store Logo
application screenshot

Popular categories