OpenAI's Next Model: Has AI Hit a Plateau?

A bombshell report from The Information suggests OpenAI might be hitting an unexpected wall with its next-generation AI model, code-named "Orion."

The report claims that while Orion shows improvements over previous models, the gains are notably smaller than the dramatic leap we saw between GPT-3 and GPT-4. 

Even more concerning? The model isn't consistently better at certain tasks, including coding—despite higher operational costs.

But OpenAI insiders are pushing back hard against suggestions of an AI slowdown. And the reality might be more complex than it first appears.

Marketing AI Institute founder and CEO Paul Roetzer broke it all down for me on Episode 123 of The Artificial Intelligence Show.

The Training Data Challenge

At the heart of the issue is a fundamental AI challenge: high-quality training data is becoming scarce. 

According to The Information, OpenAI has largely exhausted publicly available text sources and has begun experimenting with AI-generated training data—a move that comes with its own complications.

The publication wrote:

“One reason for the GPT slowdown is a dwindling supply of high-quality text and other data that LLMs can process during pretraining to make sense of the world and the relationships between different concepts so they can solve problems such as drafting blog posts or solving coding bugs, OpenAI employees and researchers said.

In the past few years, LLMs used publicly available text and other data from websites, books and other sources for the pretraining process, but developers of the models have largely squeezed as much out of that type of data as they can, these people said.”

Inside Perspectives Paint a Different Picture

“Everyone was reacting to this,” says Roetzer. 

While some saw proof in The Information’s report that scaling laws are slowing down, others fundamentally dispute that idea, including some people who work at OpenAI.

CEO Sam Altman, in a recent interview with Y Combinator's Garry Tan, expressed confidence: "These things are going to compound...The models are going to get so much better, so quickly." When asked what he's excited about in 2025, Altman's response was simple: "AGI."

Dan Shipper, co-founder and CEO of Every and a vocal AI industry watcher, also disagreed, pointing out that The Information's report is starkly at odds with what many AI researchers working on the technology appear to believe.

Despite the debate, there are major unanswered questions about whether anyone can create another breakthrough model that stays ahead of the competition for years, like GPT-4 did, says Roetzer.

"When they introduced GPT-4 in March of '23, everybody chased that model since then, and it seems like everybody just sort of caught up," says Roetzer. "No one is clearly ahead of a GPT-4o model, but they're all sort of comparable."

He says it seems up in the air at the moment whether we’ve hit a real plateau—or if we’re about to get the next big thing in frontier models.

Rather than a slowdown, we might be at a crossroads where companies need to bet on different paths forward beyond simply giving models more data and compute.

Roetzer identifies four potential breakthrough areas:

  1. Reasoning, which is already being pioneered by OpenAI's o1 model.
  2. Multimodal training, or training models simultaneously on text, video, audio, and images.
  3. Symphony of models, where a frontier model acts as a conductor for specialized smaller models.
  4. Self-play and recursive self-improvement, areas where Google has significant advantages thanks to its work on game-playing AI like AlphaGo.

But even this speculation misses the bigger picture for business leaders, says Roetzer.

What This Means for Businesses

While the debate over AI's next breakthrough is fascinating, Roetzer emphasizes that it shouldn't distract from the massive untapped potential in the AI we already have today.

The reality for business leaders, says Roetzer, is:

"It's irrelevant to you if they make a leap forward next year. The absorption of the current capabilities is so low that the value you can create in your company using today's models is so significant and so untapped."

It’s fun to speculate when or if we’ll get GPT-5 or Gemini 2 or Claude 4, he says. But your real focus should be on using what we have today to succeed next year by building a more efficient team, driving productivity, and increasing creativity and innovation.

"I think diffusion of the current capabilities would be enough disruption to last us five years," says Roetzer. "And they're going to get smarter, they're going to get more generally capable, they're going to be able to take actions—things like that may not create value for your company for another year or two, but what we have today can transform your company right now."
