HPE Digs Into AIOps with Aruba Central LLMs

In addition, Verizon Business will offer Aruba Central as a managed service.

James Anderson, Senior News Editor

March 26, 2024

3 Min Read

HPE is adding large language models (LLMs) to enhance the search features of its Aruba Central network management platform and to improve its AIOps capabilities.

The vendor on Tuesday announced a generative AI-based search engine designed to help network engineers and channel partners more efficiently monitor and troubleshoot problems on their networks. The "LLM-based" search engine will go live in April with different licensing tiers. Customers can also buy it as a separate SaaS product.

HPE executives say the generative AI capabilities join existing machine learning that already was helping Aruba users leverage network insights.


"Modern networking customers demand security-first, AI-powered insights into their critical infrastructure, and that's what we're delivering," said David Hughes, chief product officer at HPE Aruba Networking. "HPE continues its strong history of AI innovation with this bold move and HPE Aruba Networking Central's new approach for deploying multiple LLM models to embrace the capabilities of GenAI."

Developing the Large Language Models

Executives also boast that the large language models were trained on an extensive data lake in the HPE GreenLake cloud platform, as well as on 3 million phrases and questions related to Aruba network jargon.

"The generative AI is specifically built around multiple LLMs that we have purposefully decided to train and tune and host on our own data lake," said Alan Ni, Aruba's senior director of edge marketing.


HPE said its training data sets outnumber those of rival cloud platforms by up to tenfold.

However, leveraging data from a customer-facing platform such as GreenLake came with a caveat. HPE Aruba says the "sandboxed" LLMs scrub out personal and customer identifiable information (PII/CII).

"This is not based off an external API – a ChatGPT implementation. We're hosting this internally on models that we've trained," Ni said. "With this sort of architecture, no PII or CII is shared beyond the specific customer tenants on the Central platform."

HPE claims the LLMs increase document classification accuracy by 55%.

HPE's AI Play(s)

Networking providers like HPE have been identifying the swim lanes where they will seek to capitalize on the multitrillion-dollar opportunity around generative AI.

Many of these vendors are expanding their computing portfolios to add the processing power generative AI requires. HPE, for example, paired its ProLiant Compute DL380a server with Nvidia GPUs and DPUs.

Ni said adding the LLMs to Aruba's popular Central platform helps HPE play "both sides of the coin" on generative AI.

"There's this case of AI networking; how are we applying AIOps and AI technology to run critical infrastructure more efficiently," Ni told Channel Futures. "And there's the broader HPE portfolio – which is networking, compute, storage – to run AI and business workloads. We're committed to serve both."

Juniper Networks, which HPE is set to acquire, has also bolstered its AIOps features.

Verizon-HPE Partnership

Meantime, Verizon Business and HPE Aruba expanded their collaboration. Now Verizon customers can buy HPE Aruba Networking Central as a managed service.

HPE Aruba and Verizon already were partnering around software-defined wide area networking, going back to the Silver Peak business that HPE bought.

About the Author(s)

James Anderson

Senior News Editor, Channel Futures

James Anderson is a news editor for Channel Futures. He interned with Informa while working toward his degree in journalism from Arizona State University, then joined the company after graduating. He writes about SD-WAN, telecom and cablecos, technology services distributors and carriers. He has served as a moderator for multiple panels at Channel Partners events.
