AI workloads require new network build-outs

According to the new AI Networks for AI Workloads report by Dell’Oro Group, spending on switches deployed in AI back-end networks is forecast to expand the Data Center Switch Market by 50 percent.

  • Tuesday, 16th January 2024 | By Phil Alsop

Current data center switch market spending goes mostly to front-end networks used primarily to connect general-purpose servers; AI workloads will require a new back-end infrastructure build-out. Competition between InfiniBand and Ethernet is intensifying as manufacturers vie for dominance in AI back-end networks. While InfiniBand is expected to maintain its lead, Ethernet is forecast to make substantial gains, capturing 20 revenue-share points by 2027.

“Generative AI applications usher in a new era in the age of AI, standing out for the sheer number of parameters that they have to deal with,” said Sameh Boujelbene, Vice President at Dell’Oro Group. “Several large AI applications currently handle trillions of parameters, with this count increasing tenfold annually. This rapid growth necessitates the deployment of thousands or even hundreds of thousands of accelerated nodes. Connecting these accelerated nodes in large clusters requires a data center-scale fabric, known as the AI back-end network, which differs from the traditional front-end network used mostly to connect general-purpose servers.
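To see why trillion-parameter models force cluster sizes into the thousands of accelerated nodes, a minimal back-of-envelope sketch helps; the figures below (16 bytes per parameter for weights plus optimizer state, 80 GB of memory per accelerator) are illustrative assumptions and are not taken from the Dell'Oro report.

```python
# Illustrative sketch only: rough lower bound on accelerator count for a
# large model, from memory capacity alone. All constants are assumptions.

def accelerators_needed(params: float,
                        bytes_per_param: float = 16,    # assumed: weights + grads + optimizer state
                        mem_per_accel_gb: float = 80) -> int:
    """Minimum accelerators needed just to hold the training state."""
    total_bytes = params * bytes_per_param
    per_accel_bytes = mem_per_accel_gb * 1e9
    return int(-(-total_bytes // per_accel_bytes))      # ceiling division

if __name__ == "__main__":
    for p in (1e12, 1e13):  # 1 trillion and 10 trillion parameters
        print(f"{p:.0e} params -> at least {accelerators_needed(p):,} accelerators")
```

Even this memory-only bound lands in the hundreds to thousands of accelerators before accounting for the extra nodes needed to reach acceptable training times, which is what drives the need for a data center-scale back-end fabric.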

“This predicament poses the pivotal question: what is the most suitable fabric that can scale to hundreds of thousands and potentially millions of accelerated nodes while ensuring the lowest Job Completion Time (JCT)? One could argue that Ethernet is one speed generation ahead of InfiniBand. Network speed, however, is not the only factor. Congestion control and adaptive routing mechanisms are also important. We analyzed AI back-end network build-outs by the major Cloud Service Providers (such as Google, Amazon, Microsoft, Meta, Alibaba, Tencent, ByteDance, Baidu, and others) as well as various considerations driving their choices of the back-end fabric to develop our forecast,” continued Boujelbene.
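Boujelbene's point that raw link speed is not the only factor can be illustrated with a toy Job Completion Time (JCT) model: per-step time is compute plus communication, with congestion inflating the communication term. The payload size, step compute time, and overhead percentages below are assumptions chosen for illustration, not figures from the report.

```python
# Illustrative toy model only: how fabric speed and congestion overhead shift
# per-step time (a proxy for JCT) in a communication-heavy training job.

def step_time_s(compute_s: float,
                bytes_exchanged: float,
                link_gbps: float,
                congestion_overhead: float) -> float:
    """Step time = compute + gradient exchange, inflated by congestion losses."""
    comm_s = (bytes_exchanged * 8) / (link_gbps * 1e9)
    return compute_s + comm_s * (1 + congestion_overhead)

if __name__ == "__main__":
    payload = 200e9  # assumed 200 GB exchanged per step
    for gbps, overhead in [(400, 0.30), (800, 0.30), (800, 0.05)]:
        t = step_time_s(compute_s=1.0, bytes_exchanged=payload,
                        link_gbps=gbps, congestion_overhead=overhead)
        print(f"{gbps} Gbps, {overhead:.0%} congestion overhead -> {t:.2f} s/step")
```

In this sketch, doubling link speed from 400 to 800 Gbps helps, but cutting congestion overhead at the same speed helps further, which is why congestion control and adaptive routing weigh on the InfiniBand-versus-Ethernet choice alongside port speed.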

Additional highlights from the AI Networks for AI Workloads Report:

AI networks will accelerate the transition to higher speeds. For example, 800 Gbps is expected to comprise the majority of the ports in AI back-end networks by 2025, within just two years of the latest 800 Gbps product introduction.

While most of the market demand will come from Tier 1 Cloud Service Providers, demand from Tier 2/3 providers and large enterprises is forecast to be significant, approaching $10 billion over the next five years. The latter group will favor Ethernet.
