Nokia AI Networking Innovation Lab
Networking Field Day 40
The rise of AI has driven the emergence of multiple new network domains, each with distinct roles, architectures, and performance requirements. Patrick McCabe, representing Nokia, builds on the premise that AI is a permanent fixture in the technological landscape, one that requires a non-linear evolution of network architecture. He identifies two primary functions, training and inferencing, as the drivers of this change. Training involves massive GPU clusters that scale geometrically and are highly sensitive to packet loss, while inferencing, often pushed to the network edge, prioritizes low latency to serve end users effectively. Together, these functions demand a move away from traditional statistical averages toward a more deterministic approach to network performance.
Architecturally, the shift from north-south to massive east-west traffic patterns within GPU clusters has rendered traditional leaf-spine designs inadequate for AI data movement. McCabe details the emergence of specialized backend networks categorized as scale-up, scale-out, and scale-across. Scale-up handles communication within a single system or server, while scale-out facilitates high-speed interaction between different systems within a data center, a primary focus for the Ultra Ethernet Consortium (UEC). Scale-across is a particularly challenging new frontier, necessitated by the fragmentation of AI clusters across different physical locations, often due to power constraints. It requires advanced routing and data center interconnects to maintain the illusion of a single compute entity over distances of 10 kilometers or more.
The presentation emphasizes that the center of this new universe is the GPU, supported by essential storage networks that feed vast amounts of data to processing units. While the back end deals with the rigors of scale and reliability, the front end remains more traditional, connecting these specialized environments to the outside world and end users. McCabe concludes with an analogy comparing AI to the printing press, suggesting that while AI lowers the cost and scarcity of production, it does not replace the human creator. Instead, it shifts the premium value toward innovation, ideas, and judgment, allowing for a radical expansion of who can create within this high-performance infrastructure.
Presented by Patrick McCabe, Solutions Marketing, Data Center Networks. Recorded live at Networking Field Day 40 in San Jose on April 8, 2026. Watch the entire presentation at https://techfieldday.com/appearance/nokia-presents-at-networking-field-day-40/ or visit https://TechFieldDay.com/event/nfd40 or https://Nokia.com/ for more information.
Up Next in Networking Field Day 40
-
AI Data Center Nokia Validated Design...
Get an inside look at how Nokia Validated Designs (NVDs) streamline AI-ready data center and networking deployments through proven architectures, rigorous validation, and real-world performance insights. We'll highlight several of our latest AI-focused NVDs, show how partners are extending them, ...
-
Ethernet and its Evolution to Support...
Ethernet continues to evolve to meet the performance and scaling demands of modern AI networking architectures, progressing from RoCEv2 toward innovations driven by the Ultra Ethernet Consortium (UEC). This presentation discusses these requirements and introduces UEC Specification 1.0, with a foc...
-
Nokia Management for AI Data Center N...
Explore the essential management considerations for building and operating multi-tenant AI data center networks. Attendees will learn why abstraction is critical to achieving the scale, speed, and consistency required for AI infrastructure. The presentation will demonstrate how event-driven autom...