
The AI race heats up: Which model will dominate?

Advances in AI arrive faster than users can track them.

The release of Google’s Gemini 3, the billion-dollar partnerships around Anthropic’s Claude, and fresh updates from OpenAI have condensed years of progress into a single season.

Every major tech company is betting on larger models, new chips, and aggressive cloud expansion.

But the real issue is whether better models are enough to capture billions of users, rewire entire industries, and support the economic expectations placed on the AI boom.

A market expanding faster than expectations

AI is expanding beyond a specialist technology into healthcare, retail, finance, advertising, and enterprise operations.

Bloomberg Intelligence projects that generative AI alone could produce around $1.8 trillion in annual revenue by 2032, equal to as much as 16% of global technology spending.

This is accompanied by a shift in infrastructure.

AI workloads already represent more than 20% of global server revenue, with forecasts pointing toward 40% in the coming years.

Source: Bloomberg Intelligence

The sector is also reshaping hardware demand. Training large models created the first spike in spending, but inference, the everyday use of AI in real applications, is becoming the more persistent driver.

This change matters because inference workloads do not arrive in waves.

They run continuously in customer service systems, productivity tools, advertising engines, and coding assistants.

Businesses see this as the start of a long infrastructure cycle rather than a brief surge tied to a handful of research labs.

Can Gemini catch ChatGPT?

Google just announced the launch of Gemini 3, and it shows how intense the competition has become.

Gemini 3 posted record scores on major reasoning benchmarks and introduced new coding and agentic capabilities.

Source: Google

The model is integrated directly into Search, the Gemini app, Workspace, and Google’s AI Mode.

Google reported 650 million monthly active users for Gemini and over 2 billion for AI Overviews inside Search.

On paper, this gives the company a distribution footprint that matches or even exceeds ChatGPT's.

However, improving a model is not the same as changing user behaviour.

ChatGPT recently reached 800 million weekly users and remains the most familiar AI interface for many people.

The product’s strength is not simply the model behind it but the ecosystem around it.

Millions of people use ChatGPT inside GitHub Copilot, Windows, Microsoft 365, and dozens of third-party tools built specifically around OpenAI models.

Changing habits is hard, even when a competitor is technically better in some areas.

This creates a subtle dynamic. Google can match or surpass ChatGPT at the model level, but that alone does not guarantee a migration of users.

Model improvements matter, but they do not work in isolation.

Switching depends on deeper economic and workflow factors such as lower cost for long-context tasks, smoother enterprise integration, or unique features that reduce a company’s operational burden.

People will move if the gain is clearly visible in their daily work, not because a benchmark score has increased.

Consolidation, capital and the shadow of a bubble

The financial scale around AI has grown large enough to attract questions from investors. Some deals look circular. A model company raises capital from a cloud provider, then spends that capital on the cloud provider’s compute services.

This is one of the reasons analysts warn that parts of the sector may be inflating faster than enterprise budgets can absorb.

At the same time, several companies are still posting strong adoption numbers.

Anthropic told investors it has more than 300,000 business and enterprise customers and projects that revenue will double or even triple to around $26 billion next year.

Google counts 13 million developers using Gemini as part of their workflow.

Although these figures show genuine traction, they also highlight how expensive it is to expand at this level. This tension between real usage and oversized expectations creates an environment that looks similar to the early internet cycle.

Capex is heavy, valuations are high, and some companies will overextend.

The underlying trend, however, is unlikely to reverse.

AI is already becoming part of everyday software and infrastructure, regardless of which companies dominate the final lineup.

Is NVIDIA vulnerable in a world of custom silicon?

NVIDIA’s position at the center of training workloads has sparked concerns about long-term risk as Google, Amazon, and others promote their own chips.

Google’s TPU line and Amazon’s Trainium and Inferentia chips are signs of a bigger trend toward custom silicon designed for specific workloads.

The recent Anthropic partnership complicates the picture further.

Despite Google’s heavy investment in its own chips, Anthropic committed up to one gigawatt of compute capacity using NVIDIA’s Grace Blackwell and Vera Rubin systems, supported by up to $10 billion of investment from NVIDIA and $5 billion from Microsoft.

In the near term, NVIDIA is not losing ground.

The demand for compute is expanding so quickly that both GPUs and custom accelerators are needed at once.

The risk for NVIDIA appears further out.

If inference becomes the dominant workload and hyperscalers shift most of that traffic to their own internal chips, NVIDIA could face pressure on margins rather than volumes.

Training might remain profitable, but the mix of workloads would move away from NVIDIA’s most lucrative segment.

This does not mean NVIDIA is in trouble today. The next three to five years look stable because the appetite for training runs and general compute remains enormous.

The competitive threat is more about long-term pricing power.

Hyperscalers want to avoid dependence on a single supplier.

They are investing in their own silicon, not to eliminate NVIDIA but to negotiate from a stronger position.

Source: Bloomberg

What will decide the next winners

The AI race is entering a phase where leadership depends on more than model breakthroughs.

Distribution is extremely important, especially the ability to deliver AI features to hundreds of millions of users through platforms people already use every day.

Integration matters as well, because companies want tools that slip into their existing workflows without disruption.

There is no single path to dominance.

Google can leverage Search and Android. Microsoft has Windows, Office, and GitHub.

Anthropic is carving a position among enterprises that want reliable and transparent models.

OpenAI continues to drive consumer mindshare through ChatGPT.

The field is shaped by overlapping alliances and rivalries where partners are often competitors as well.

The next phase of the AI economy will not reward companies for the size of their models alone.

It will reward those who can connect intelligence to real tasks, real decisions, and real economic value.

Model power is becoming abundant. What remains scarce is the ability to turn that power into durable behaviour, sticky workflows, and long-term demand.

The post The AI race heats up: Which model will dominate? appeared first on Invezz
