ANALYSIS: AI insiders predict a shift away from massive, expensive models like ChatGPT toward smaller, specialized AI agents that handle specific tasks. These compact models cost less to develop and can run on laptops instead of data centers. Companies are investing in focused AI tools that excel at particular jobs rather than general-purpose platforms, making AI more accessible and efficient.
HSBC’s recent analysis of the financial challenge facing OpenAI shows the massive scale of the company’s ambitions. The company already claims revenues of $20 billion. It has committed $1.4 trillion to build out the new data centers that will feed its ChatGPT interface. And even if it can generate $200 billion–plus in revenues by 2030, it will still need a further $207 billion in funding to survive.
Those are massive sums.
But a dozen or so AI insiders who spoke to Fortune recently at Web Summit in Lisbon described a different future for AI. That future, they say, is characterized by much smaller AI operations, often revolving around AI “agents” that perform specialized, niche tasks, and thus do not need the gargantuan large language models that underpin OpenAI’s ChatGPT, Google’s Gemini, or Anthropic’s Claude.