Lenovo, AMD broaden AI options for customers | TechTarget

Lenovo is expanding its AI infrastructure options with a line of servers and an HCI stack that push beyond support for Nvidia and Intel AI accelerators to include AMD GPUs.

The Lenovo ThinkSystem SR685a V3 server uses the latest generation of AMD EPYC CPUs and is the first to use AMD Instinct MI300X GPUs for compute-intensive workloads, including large language model (LLM) training. The SR685a is aimed at both private data center and public cloud deployments.

For the edge, Lenovo introduced the ThinkAgile MX455 V3 Edge Premier Solution, an HCI device built to be used with Azure Stack HCI for AI compute and storage on premises and in the cloud. The vendor also introduced its compact, half-width edge server, the ThinkSystem SD535 V3, which combines AMD and Intel architecture in a single chassis for the first time.

While Lenovo is looking to compete with Dell and HPE on hybrid AI offerings, it might have an advantage, according to Steven Dickens, an analyst at Futurum Group.

“From an AI perspective, Lenovo has a robust portfolio,” he said. “They have one of — if not the — most comprehensive end-to-end offerings, from ruggedized far-end devices right to the core.”

AI for all, with AMD

Lenovo is releasing multiple products to meet different requirements, according to Kamran Amini, vice president and general manager of server, storage and software-defined infrastructure at Lenovo.

“AI is not one thing — it’s not a one-size-fits-all T-shirt,” he said. “Many types of activities drive AI.”

Part of the vendor’s “AI for all” strategy, the ThinkSystem SR685a V3 is aimed at AI use cases and supports the latest AMD EPYC CPUs and up to eight AMD Instinct MI300X GPUs, with drop-in support for future AMD EPYC CPUs. It can also support Nvidia GPUs, including the H100, H200 and B100. Aside from AI, Lenovo is also targeting financial services use cases such as fraud detection and prevention and algorithmic trading strategies.

While cloud providers have massive compute capabilities, processing AI workloads in the cloud can come at a potentially higher cost than on premises, according to Sid Nag, an analyst at Gartner. Cost is one factor; another is the rise of smaller language models, which have significantly fewer parameters than LLMs and can be run on premises.

“There is an opportunity for vendors like Lenovo and AMD to collaborate and build systems to run models for specific industry niches,” he said.

Lenovo is looking to compete in the AI market against players such as Dell as it races to expand its partnerships with chip makers and grow its AI infrastructure portfolio. Last month, Lenovo worked with Nvidia to add the latter’s high-end GPUs to its Lenovo ThinkSystem SR680a V3 server. Now, it’s turning to AMD for its newest GPUs. Doing so expands the range of GPUs Lenovo can offer and gives customers more choice in matching a GPU to a given use case, Nag said.

AI for the edge

Lenovo is bringing a new HCI device and a smaller server to the edge for AI inferencing and real-time analytics. The Lenovo ThinkAgile MX455 V3 Edge Premier Solution operates on premises and in an Azure Stack HCI cluster running on ThinkAgile, Lenovo’s HCI platform line. Lenovo and Microsoft offer end-to-end support and lifecycle management, as well as opex and capex purchasing options, to simplify deployment for customers. The MX455 V3 can house up to six GPUs.

The industry focuses heavily on model training, which usually demands substantial compute and many GPUs, Nag said. At the edge, however, training is not the top use case; instead, it’s AI inferencing: a trained model performing a function, making a prediction or generating a conclusion.

“Lenovo is focused on inferencing that can be done at the edge with Azure Stack HCI,” he said.

Lenovo is also releasing its ThinkSystem SD535 V3 edge server, the first 2U four-node server to allow both Intel and AMD architectures within the same chassis. This design might help customers move away from legacy blade or modular servers to a more standard rack design and optimize applications built on specific CPU technologies, Amini said.

By having both Intel and AMD in the same chassis, Lenovo is combining technologies to offer more benefits, Dickens said.

“This seems to be the best of blade technology and the best of rack servers put together,” he said.

Lenovo is looking past GPUs at other forms of compute as well, Dickens said.

AI services and lots of options

The vendor also launched Lenovo AI Advisory and Professional Services. The services include goal development through Lenovo’s AI Innovators Program, a software partner ecosystem for AI that launched in 2022. As part of its advisory services, Lenovo can help customers evaluate technology options, recommend hardware and provide deployment tools.

Giving customers options isn’t a bad thing, Dickens said, but more options mean more complexity. For comparison, Dell offers a single server, the XE9680, that supports Nvidia and AMD GPUs as well as the new Intel Gaudi 3 AI accelerator.

“Is it too much choice? Probably not,” he said.

But from a go-to-market perspective, there should be a way to distill everything down for customers, Dickens said.

Adam Armstrong is a TechTarget Editorial news writer covering file and block storage hardware, and private clouds.