Google Cloud, AWS Lead Cloud Providers’ Gen AI Arms Race

Research firm Omdia says Google Cloud and AWS stand at the front of the pack.

Kelly Teal, Contributing Editor

May 9, 2024


The gen AI arms race has turned into a no-holds-barred battle for technological and customer supremacy among the hyperscalers.

New research from Omdia, part of Informa Tech, Channel Futures' parent company, shows that Google Cloud leads in technology, while Amazon Web Services excels in price. Microsoft Azure, meanwhile, seems to be focused elsewhere, analysts said this week. 

“GCP benefits from Google’s status as a powerhouse of fundamental AI research, while Amazon Web Services benefits both from the enormous scale of its existing business and its excellence in day-to-day operations,” analysts wrote on May 8, pointing to newly published insights.

“Customers looking to adopt the latest technology will be best served by GCP, while those focused on price will be best served by AWS. However, Microsoft Azure seems to be concentrating on satisfying OpenAI’s appetite for capacity," they said.

At first, in 2023, Microsoft appeared to hold the lead: the company effectively kicked off the gen AI arms race through its longtime investment in OpenAI, maker of ChatGPT. Google Cloud followed with Bard, and AWS then entered the fray with the introduction of Bedrock.

But the gen AI arms race really ramped up in December when Google Cloud debuted Gemini, which replaced Bard. Since then, Google Cloud has showcased even more advancements around Gemini — and seen its revenue rise 28% (the most of all three hyperscalers) in its latest earnings round as a result. Now, according to some reports this week, the division of Alphabet is hiring hundreds of salespeople and engineers all dedicated to powering AI adoption.


Google Cloud isn’t the only hyperscaler to benefit from its gen AI rollouts, of course. AWS revenue hit $25 billion in the most recent quarter; that marked a 17% jump, though that put the world’s largest public cloud computing provider behind its peers in terms of January-March sales. While Microsoft doesn't disclose specific numbers around Azure or ChatGPT, its “Microsoft Cloud” group reported a 23% year-over-year revenue increase for the first quarter, reaching $35.1 billion.

Inference: A Key Part of the Gen AI Arms Race

There’s more to the gen AI arms race, though, than just numbers, as Omdia analysts pointed out. The strategy behind each hyperscaler’s success largely comes down to how it serves inference, per Omdia. That’s the process of generating content or answers from an AI model once its training is complete.

By definition, analysts explained, inference is required when an AI application goes into production, with demand driven by end users. Therefore, inference represents the intersection of AI projects and practical reality, as Omdia put it. Analysts then made the following conclusion: As more AI applications hit the production stage, inference will account for a growing share of overall AI computing demand.
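The training-versus-inference split Omdia describes can be made concrete with a deliberately tiny sketch. The model, data and function names below are hypothetical, and real gen AI workloads serve large neural networks rather than a one-parameter line, but the division of labor is the same: training fits parameters once, while inference applies them to every end-user request in production.

```python
def train(examples):
    """Fit a one-parameter model y = w * x by least squares (the training phase)."""
    num = sum(x * y for x, y in examples)
    den = sum(x * x for x, _ in examples)
    return num / den  # the learned weight w


def infer(w, x):
    """Apply the already-trained weight to a new input (the inference phase)."""
    return w * x


# Training happens once, offline.
w = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])

# Inference runs for every user request once the application is in production,
# which is why it grows to dominate AI computing demand over time.
print(infer(w, 10.0))  # prints 20.0
```

This is why inference demand scales with end users rather than with the number of AI projects: the one-time training cost is fixed, but each additional request triggers another `infer` call.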


“The competition in this sector is intense,” said Alexander Harrowell, principal analyst for advanced computing at Omdia. “Google has an edge related to its strength in fundamental AI research, while AWS excels in operational efficiency, but both players have impressive custom silicon.”


Microsoft, on the other hand, “took a very different path,” Harrowell said.

At first, the company focused on field-programmable gate arrays (FPGAs), chips that can be reconfigured after manufacturing to perform specific functions. Now, however, Microsoft is “pivoting urgently to custom chips,” Harrowell said.

Even so, Microsoft, along with Google Cloud, remains “considerably behind” AWS when it comes to CPU inference, Harrowell added.

“AWS’ Graviton 3 and 4 chips were clearly designed to offer a strong AI inference option, and staying on a CPU-focused approach is advantageous for simplifying projects,” he explained.

Without a doubt, the gen AI arms race will continue to heat up throughout 2024, and not just among the big three cloud computing providers. Smaller companies throughout the channel are pursuing gen AI aggressively, too.

About the Author

Kelly Teal

Contributing Editor, Channel Futures

Kelly Teal has more than 20 years’ experience as a journalist, editor and analyst, with longtime expertise in the indirect channel. She worked on the Channel Partners magazine staff for 11 years. Kelly now is principal of Kreativ Energy LLC.
