Gathering speed: Chip consolidation drives AI data growth

Alan Weckel

The data center market is about to go through a significant technology upgrade cycle, with implications for how data center buildout designs evolve. Several new technologies will hit the market at the same time. The industry is moving to 7nm process technology across most of the semiconductor space. At the same time, chiplets are moving front and center as an important building block, and 112Gbit/s SerDes, custom-built artificial intelligence (AI) chips, and PCIe Gen4 are all about to be adopted.

All these technologies are a catalyst for more traffic in the data center and higher-speed networks both within the data center and across regions. We’ve seen robust merger activity in the semiconductor space as vendors position themselves for the new market opportunities these technologies enable. NVIDIA is purchasing Mellanox to gain access to a portfolio of high-speed networking and server accelerators to push its AI strategy forward. Xilinx is purchasing Solarflare to accelerate smart and programmable NICs. Intel just announced its intent to purchase Barefoot, increasing its presence in networking silicon.

Booming ecosystem

AI will continue to see massive investment over the next several years, moving from the largely consumer technology it is today toward mainstream enterprise use. Vendors are preparing for this shift by making significant investments internally or through acquisitions, as enterprises, both directly and through cloud providers, look to make better use of the data they collect. Most AI training is likely to occur at cloud providers, but many customers are also expected to experiment with edge-based AI for both training and inference. Colocation providers will be a prime location for this type of edge computing, since it allows developers to build on a cloud provider and then push workloads toward the edge. Having colocation providers involved in the edge is another tool and opportunity as customers and providers define and evolve their edge strategies.

During the next 12 to 18 months, cloud providers will begin deploying these next-generation technologies in significant volumes, which will require networks to grow substantially to support the increased traffic that AI generates. I’d expect to see a significant increase in the traffic each server can generate and, as 400Gbit/s becomes more widely available, an increase in the amount of equipment deployed for data center interconnect (DCI). DCI adoption will increase even more as 112Gbit/s SerDes enters the market, allowing the industry to begin adopting 800Gbit/s.
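The jump from 400Gbit/s to 800Gbit/s follows directly from SerDes lane math: an Ethernet port speed is roughly the usable per-lane rate times the lane count. The sketch below illustrates this; the mapping of raw SerDes rates to usable payload rates (56G → ~50G, 112G → ~100G, reflecting line-coding overhead) and the 8-lane assumption are illustrative figures I am supplying, not from the article.

```python
# Illustrative sketch (assumptions, not from the article): why 112G SerDes
# enables 800G ports. Raw SerDes rates include signaling overhead; the usable
# payload is roughly 50G per 56G lane and 100G per 112G lane.

SERDES_USABLE_GBPS = {56: 50, 112: 100}  # raw SerDes rate -> approx usable Gb/s

def port_speed_gbps(serdes_raw_gbps: int, lanes: int = 8) -> int:
    """Approximate Ethernet port speed from SerDes rate and lane count."""
    return SERDES_USABLE_GBPS[serdes_raw_gbps] * lanes

print(port_speed_gbps(56))   # 400 -> today's 400GbE (8 x ~50G lanes)
print(port_speed_gbps(112))  # 800 -> 800GbE once 112G SerDes arrives
```

The same eight-lane electrical interface doubles in port speed when the per-lane SerDes rate doubles, which is why 112Gbit/s SerDes is the gating technology for 800Gbit/s adoption.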
