In a recent interview with Dwarkesh Patel, Mark Zuckerberg addressed the mother of all bottlenecks when asked how AI could sustain its explosive market progression: energy.

"Energy, not compute, will be the #1 bottleneck to AI progress."

Here's the transcript of the video, rewritten just a tad to make it more legible (all italics and bolds are mine, emphasizing some points, and links refer to issues and ideas we've already written about):
Over the last few years, I think there has been this issue of GPU production. Even companies that had the money to pay for the GPUs couldn't necessarily get as many as they wanted because of all these supply constraints. Now, I think that's sort of getting less so. Now, I think you're seeing a bunch of companies think about, "Wow, we should just really invest a lot of money in building out these things." And I think that will go on for some period of time.
There is a capital question of, "Okay, at what point does it stop being worth it to put the capital in?" But I actually think before we hit that, you're going to run into energy constraints. Because I don't think anyone's built a gigawatt single training cluster yet. To put this in perspective, I think a gigawatt is around the size of a meaningful nuclear power plant, only going towards training a model.
Then you run into these things that end up being slower in the world: getting energy permitted is a very heavily regulated government function. And if you're talking about building large new power plants or large buildouts and then building transmission lines that cross other private or public land, that is just [another] heavily regulated thing. You're talking about many years of lead time. If we wanted to stand up just some massive facility to power that, that's a very long-term project.
We would probably build out bigger clusters than we currently can if we could get the energy to do it. That's fundamentally money bottlenecked [in the order of a trillion dollars]. But it [also] depends on how far the exponential curves go.
A number of companies are working on data centers on the order of 50 megawatts or 100 megawatts; or a big one might be 150 megawatts, okay? So you take a whole data center, and you fill it up with just all the stuff that you need to do for training, and you build the biggest cluster you can. (...) But then when you start getting into building a data center that's like 300 megawatts or 500 megawatts or a gigawatt, I mean, just no one has built a single gigawatt data center yet. It will happen, this is only a matter of time, but it's not going to be next year.
One of the trickiest things in the world to plan around is an exponential curve: how long does it keep going for? I think it's likely enough that it will keep going that it is worth investing the $100 billion-plus in building the infrastructure, assuming that if it kind of keeps going, you're going to get some really amazing things. But I don't think anyone in the industry can really tell you that it will continue scaling at that rate for sure. In general, in history, you hit bottlenecks at certain points, and now there's so much energy on this that maybe those bottlenecks get knocked over pretty quickly. But I don't think that this is something that can be quite as magical as just like, okay, you get a level of AI, and you get a bunch of capital, and you put it in, and then all of a sudden, the models are just going to kind of like [Mark stutters, seemingly envisioning a breakthrough revolution]... it's just like... I think you do hit different bottlenecks along the way.
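To get an intuition for the scale jump Zuckerberg describes, here's a rough back-of-envelope sketch (my numbers, not his): assuming roughly 700 W per H100-class accelerator and a facility overhead factor (PUE) of about 1.3 for cooling and networking, the step from today's 150 MW data centers to a single gigawatt cluster is the step from hundreds of thousands to over a million GPUs.

```python
# Back-of-envelope: how many accelerators fit in a given power budget?
# Assumptions (mine, not from the interview):
#   - ~700 W per H100-class GPU
#   - PUE of ~1.3 (power usage effectiveness: cooling, networking, overhead)

GPU_WATTS = 700   # per-accelerator draw, H100-class (assumption)
PUE = 1.3         # facility overhead factor (assumption)

def gpus_for(megawatts: float) -> int:
    """Accelerators a facility of this power budget could run, roughly."""
    return int(megawatts * 1_000_000 / (GPU_WATTS * PUE))

# The facility sizes Zuckerberg mentions: a big current data center vs. a gigawatt
for mw in (150, 300, 500, 1000):
    print(f"{mw:>5} MW ~ {gpus_for(mw):>9,} GPUs")
```

Under these assumptions, 150 MW powers on the order of 165,000 GPUs, while a gigawatt powers roughly 1.1 million, which is why the gigawatt cluster is less a bigger data center than a different category of infrastructure.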

Why do I write about this?

Obviously, understanding the key bottlenecks that will constrain the speed and direction of a major innovation trend is a subject I often address, if only because most key executives are oblivious to it. And here we have a crystal-clear discussion of this core principle of technological innovation in the context of AI, the most hyped (and deservedly so) technology on the block in 2024.

The consequence of this discussion?

If you want to know who will push as hard as possible to sustain the exponential progression of AI, monitor who invests massively not just in dedicated high-end chip manufacturing but also, quite rapidly, in energy production (the slowest-moving part of this puzzle).

  • And as a side prediction: whoever tries to invest billions (a trillion?) in this endeavor will be mauled by the stock market, as investors are massively in it for the short term and don't want to participate in a five- to ten-year plan of tech supremacy.
  • Also: "The advanced nuclear fission startup, Oklo, announced on Tuesday it will go public via merger with AltC Acquisition Corp., a special purpose acquisition company that was co-founded by OpenAI CEO Sam Altman."