South Minneapolis News


Aria Networks raises $125M and debuts its approach for AI-optimized networks

Apr 19, 2026  Twila Rosenbaum

Aria Networks has raised $125 million in funding and debuted its Deep Networking platform for AI-optimized networks. The company was founded in January 2025 by Mansour Karam, who previously founded Apstra, an intent-based networking vendor acquired by Juniper Networks in 2020.

The Deep Networking platform departs from traditional switch-centric networking models, instead adopting a path-centric approach built around microsecond telemetry. This allows the platform to treat the network as an active participant in improving AI cluster performance. It combines purpose-built switching hardware, a hardened version of SONiC, fine-grained telemetry sourced from switches, transceivers, and host NICs, and intelligent agents operating at each layer of the networking stack.

“For AI to be truly effective, it must be tailored to this specific domain, necessitating a ground-up architecture optimized for AI,” Karam explained.

Functionality of Deep Networking

The Deep Networking platform's core function is to actively engage the network in improving AI performance through real-time telemetry collected at the ASIC level. Unlike traditional network monitoring tools such as NetFlow, which gather data after the fact and at coarse resolution, Aria's system captures telemetry in real time with microsecond granularity.

“We have embedded code directly within the ASIC, on the ARM processors, to extract telemetry,” Karam noted, highlighting the platform's unique capabilities.

This embedded telemetry enables adaptive tuning of Dynamic Load Balancing (DLB) parameters, Data Center Quantized Congestion Notification (DCQCN), and failover logic, all without waiting for a threshold breach or manual intervention. The platform's architecture is layered: agents react to link-level events in microseconds, while higher layers make strategic decisions about traffic flow. At the cloud layer, a large language model-based agent gives operators insights in natural language, letting them ask about specific jobs or alert conditions and receive context-aware responses.
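To make the adaptive-tuning idea concrete, here is a minimal sketch of a telemetry-driven control loop that adjusts a DCQCN-style ECN marking threshold from queue-depth samples. All function names, thresholds, and constants are hypothetical illustrations, not Aria's actual implementation:

```python
# Illustrative sketch only: a simplified control loop that nudges a DCQCN-style
# ECN marking threshold based on queue-depth telemetry. Names and thresholds
# are hypothetical, not Aria's implementation.

def adjust_ecn_threshold(current_kmin: int, queue_depth_samples: list[int],
                         target_depth: int = 100) -> int:
    """Lower the ECN marking threshold when queues run hot, raise it when idle."""
    avg_depth = sum(queue_depth_samples) / len(queue_depth_samples)
    if avg_depth > target_depth * 1.2:       # sustained congestion: mark earlier
        return max(10, int(current_kmin * 0.9))
    if avg_depth < target_depth * 0.5:       # link underused: mark later
        return int(current_kmin * 1.1)
    return current_kmin                      # within band: no change

# A hot queue drives the threshold down before any alarm fires:
print(adjust_ecn_threshold(100, [150, 160, 140]))  # 90
```

In a real system this decision would run continuously against microsecond-granularity samples rather than a batch of three, but the shape of the loop is the same: observe, compare against a target band, adjust without waiting for a threshold breach.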

Karam emphasized that simply integrating an LLM into existing architectures does not yield optimal results. “If you ask it to do anything without context, it could hallucinate and disrupt network operations,” he cautioned.

Additionally, Aria offers an MCP server, enabling external systems, such as job schedulers and LLM routers, to directly query the network state and incorporate it into their decision-making processes.

New Metrics for Networking

In contrast to traditional networking metrics that focus on bandwidth and latency, Aria's platform emphasizes Model FLOPS Utilization (MFU) and token efficiency. MFU is defined as the ratio of achieved FLOPS per accelerator to the theoretical peak. Karam pointed out that MFU for training workloads typically falls between 33% and 45%, with inference often below 30%.
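The MFU definition above reduces to a simple ratio. The figures below are illustrative, not from the article:

```python
# MFU as defined in the article: achieved FLOPS per accelerator divided by
# the theoretical peak. The 400/1000 TFLOPS figures are illustrative.

def model_flops_utilization(achieved_tflops: float, peak_tflops: float) -> float:
    return achieved_tflops / peak_tflops

# An accelerator sustaining 400 TFLOPS against a 1,000 TFLOPS peak:
print(f"{model_flops_utilization(400, 1000):.0%}")  # 40%
```

A 40% result would sit inside the 33-45% range Karam cites for training workloads.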

“The network significantly impacts MFU and token efficiency, as it interacts with every component in the cluster,” Karam stated, establishing the critical connection between networking performance and overall operational efficiency.

Token efficiency is quantified through metrics such as tokens consumed per dollar or tokens produced per unit time, and Aria posits that both are heavily influenced by network performance. For instance, a malfunctioning NIC in a large cluster can drag down MFU for the duration of a job, while poorly tuned congestion settings can cause sustained underperformance.

Aria's modeling indicates that a mere 3% improvement in MFU across a 10,000-XPU cluster could equate to approximately $49.8 million in additional annual revenue, or a 7.9% revenue increase at current token pricing.
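The two figures in Aria's claim can be cross-checked against each other. The uplift and percentage come from the article; the baseline revenue is derived from them, not stated by Aria:

```python
# Back-of-envelope check of the modeling claim: if a $49.8M annual uplift
# equals a 7.9% revenue increase, the implied baseline token revenue for the
# 10,000-XPU cluster is roughly $630M. The uplift and percentage are from
# the article; the baseline is derived, not stated.

uplift_usd = 49.8e6
uplift_pct = 0.079

baseline_usd = uplift_usd / uplift_pct
print(f"implied baseline revenue: ${baseline_usd / 1e6:.0f}M per year")
```

That works out to about $63,000 per XPU per year at the stated cluster size, which gives a sense of the scale of revenue the model assumes.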

Switch Portfolio Overview

Aria's hardware offerings include a range of switches built on Broadcom ASICs, running a standards-based, hardened SONiC implementation. The product lineup features:

  • Aria Switch 800G: Utilizing the 51.2T Broadcom Tomahawk 5 ASIC, it includes 64 x 800G OSFP ports with support for DSP, LRO, and LPO optics.
  • Aria Switch 1.6T High Radix: A 4RU air-cooled model based on the 102.4T Broadcom Tomahawk 6 (TH6) ASIC, featuring 128 x 800G OSFP ports.
  • Aria Switch 1.6T: A 2RU unit available in EIA 19 and ORV3 form factors, supporting both air and full liquid cooling, with 64 x 1.6T OSFP ports.

Future Directions and Customer Engagement

Aria Networks is also embedding forward deployed engineers (FDEs) with customers from the initial deployment phase. Karam explained that this model is distinct from traditional professional services.

“Everything the forward deployed engineers do ultimately gets engineered back into the products,” he stated, emphasizing the alignment of FDEs with product development. This approach feeds data from real customer environments back into the platform, improving agent capabilities and accelerating the software release cadence: Aria targets weekly updates, in contrast to the semi-annual or annual cycles of traditional vendors.

Karam concluded, “By integrating this intelligence, we aim to enhance the breadth and capabilities of our solution while ensuring it remains secure and reliable. Our primary goal is to keep networks operational at all times.”


Source: Network World News

