OpenAI tempers expectations with a less spectacular, GPT-5-free DevDay this autumn.

August 6, 2024
Brian

Last year, OpenAI staged a splashy press event in San Francisco, announcing a slew of new products and capabilities, including the ill-fated, App Store-style GPT Store.

This year, however, will be more low-key. On Monday, OpenAI announced that its DevDay conference will transition from a tentpole event to a series of on-the-road developer engagement events. The company also stated that it will not debut its next major flagship model during DevDay, instead focusing on upgrades to its APIs and developer tools.

"We're not planning to announce our next model at DevDay," an OpenAI spokesperson told TechCrunch. "We'll be focused more on educating developers about what's available and showcasing dev community stories."

OpenAI’s DevDay events this year will take place in San Francisco on October 1, London on October 30, and Singapore on November 21. All will feature workshops, breakout sessions, demos with OpenAI’s product and engineering staff, and developer spotlights. Registration costs $450 (or $0 through scholarships for eligible attendees), and applications close on August 15.

OpenAI appears to have lost its technical lead in the generative AI race in recent months, opting to hone and fine-tune its tools as it trains the successor to its current leading models, GPT-4o and GPT-4o mini. The company has instead refined its approaches to improving the overall performance of its models and to preventing those models from going off the rails as often as they once did.

OpenAI's models, like most generative AI models, are trained on massive collections of web data, which a growing number of creators are choosing to withhold for fear of being plagiarized or of not receiving credit or payment. Originality.AI data shows that more than 35% of the world's top 1,000 websites now block OpenAI's web crawler. And around 25% of data from “high-quality” sources has been restricted from the major datasets used to train AI models, a study by MIT’s Data Provenance Initiative found.
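In practice, sites opt out by disallowing OpenAI's GPTBot user agent in their robots.txt file. The short Python sketch below, which uses a placeholder domain rather than any site mentioned here, checks whether a given site does so using only the standard library.

```python
# Minimal sketch: check whether a site's robots.txt blocks OpenAI's GPTBot crawler.
# "example.com" is a placeholder domain used purely for illustration.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://example.com/robots.txt")
robots.read()  # fetch and parse the site's robots.txt

# Sites that opt out typically include the directives:
#   User-agent: GPTBot
#   Disallow: /
if robots.can_fetch("GPTBot", "https://example.com/"):
    print("GPTBot is allowed to crawl this site")
else:
    print("GPTBot is blocked by robots.txt")
```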

Should the current access-blocking trend continue, the research group Epoch AI predicts that developers will run out of data to train generative AI models between 2026 and 2032. 

OpenAI is said to have developed a reasoning technique that could improve its models’ responses on certain questions, particularly math questions, and the company’s CTO, Mira Murati, has promised a future model with “Ph.D.-level” intelligence. (OpenAI revealed in a blog post in May that it had begun training its next “frontier” model.) That’s a big pledge, and there’s high pressure to deliver: OpenAI is reportedly hemorrhaging billions of dollars training its models and hiring top-paid research staff.