AI server 4 GPU


Performance-optimised single-socket servers based on 4th Generation AMD EPYC processors, with up to four double-slot NVIDIA cards and a wide range of configuration options in a space-saving 2U form factor.

Description

// solution overview

AI Server 4 GPU

Our AI Servers are the perfect solution for any of your HPC or AI applications. Based on the ASUS ESC and RS series, we can provide you with a solution built and tuned exactly for your needs, so you won’t have to worry about underpowering your project or paying for excessive hardware.

With the latest AMD CPUs and the world’s most powerful GPU accelerators from NVIDIA, you can start your HPC & AI journey right away.

performance
configurability
TCO
premium support
// Latest Technology

Powered by the latest
AMD EPYC™ Generation CPUs

AMD EPYC
On AMD EPYC Bergamo CPUs
Up to 128 cores
// Enterprise Design

Professional Solution
For AI Workloads

GPU-accelerated

GPU-accelerated AI Servers are suitable for a wide range of computing tasks due to their parallel capabilities and high performance.

QA & Testing

Turn to our experts to perform comprehensive, multi-stage testing and auditing of your software.

Reliable Hardware

We use only professional enterprise server components from renowned global manufacturers that are designed for 24/7 operation.

Best Performance

With the latest AMD processors and the world's most powerful graphics accelerators from NVIDIA, AI Servers deliver the highest performance.

AI Software Stack

Our software portfolio covers dozens of AI use cases – just download an application container to your AI Server and start developing your own AI application.

Premium Support

All our AI servers come with a 3-year warranty and 9x5 remote support. DGX systems offer premium support on both the HW and SW stack.

// GPU selector

Quick guide

Not sure how to choose the right GPU card? Try our Quick guide or full GPU Selector.

// Extreme Performance

Computing on the most
powerful NVIDIA GPUs

NVIDIA L40
GPU memory per card
Up to 48 GB
Workloads covered: AI Training, AI Inference, HPC, Rendering, Virtual workstation, Virtual desktop (VDI)

Compute GPUs: H100, A100, A30
Compute / Graphics GPUs: L40S, L40, A40, A10, A16
Compute / Graphics GPUs (SFF*): L4, A2

*SFF = Small Form Factor
AI server SW stack
// AI software

We Provide Full
AI software stack

01.
GPU server

You can configure your GPU-based server in the AI server configurator according to your needs, or we can advise you on the best configuration upon request.

02.
Operating system

The AI server comes with a pre-installed Ubuntu Linux operating system and Docker engine for running application containers on top of your GPU server.
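As a rough illustration, a GPU container can be launched programmatically with the Docker SDK for Python. This is only a minimal sketch: it assumes the docker Python package and the NVIDIA Container Toolkit are present on the host, and the CUDA image tag is an example rather than part of the delivered stack.

import docker

client = docker.from_env()

# Run nvidia-smi inside a CUDA base image, exposing all host GPUs
# to the container through a GPU device request.
output = client.containers.run(
    "nvidia/cuda:12.2.0-base-ubuntu22.04",  # example image tag (assumption)
    command="nvidia-smi",
    device_requests=[
        docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])
    ],
    remove=True,
)
print(output.decode())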

03.
NVIDIA GPU Operator

In the AI software stack layer we provide several components: the NVIDIA driver, the NVIDIA container runtime and the NVIDIA GPU device plug-in for Kubernetes.
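To illustrate how these components can be checked once the GPU Operator is deployed, here is a minimal sketch using the official Kubernetes Python client. The namespace name "gpu-operator" is an assumption; your installation may use a different one.

from kubernetes import client, config

config.load_kube_config()          # reads your local kubeconfig
v1 = client.CoreV1Api()

# List the operator pods (driver, container toolkit, device plug-in, ...)
# and their current phase; the namespace name is an assumption.
for pod in v1.list_namespaced_pod(namespace="gpu-operator").items:
    print(f"{pod.metadata.name:60s} {pod.status.phase}")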

04.
Kubernetes

Kubernetes is an open-source platform for orchestrating application containers. You can easily deploy, scale and manage your entire AI infrastructure.
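As a rough sketch of what scheduling a GPU workload looks like, the following Python snippet creates a pod that requests one nvidia.com/gpu resource. The pod name, image tag and namespace are illustrative assumptions; it presumes the NVIDIA device plug-in already advertises GPUs on the nodes.

from kubernetes import client, config

config.load_kube_config()

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="gpu-smoke-test"),      # example name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="cuda",
                image="nvidia/cuda:12.2.0-base-ubuntu22.04",  # example tag
                command=["nvidia-smi"],
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"}             # request one GPU
                ),
            )
        ],
    ),
)
client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)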

05.
NGC application containers

NVIDIA NGC is a public catalog of the most widely used AI frameworks, GPU-accelerated applications, AI models and workflows. You can start running your AI workloads right after delivery of your AI server. The NGC catalog is regularly updated, so you always get the latest compatible software components for the highest performance of your AI applications.
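A typical first check after pulling an NGC framework container (for example a PyTorch image from nvcr.io; the exact tag depends on the catalog version you choose) is to confirm that the GPUs are visible. The snippet below is a minimal sketch intended to be run inside such a container.

import torch

# Report whether CUDA is usable and list each visible GPU with its memory.
print("CUDA available:", torch.cuda.is_available())
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.0f} GiB")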

// Drop us a line! We are here to answer your questions 24/7

NEED A CONSULTATION?

Specification

Chassis

ASUS ESC4000A-E12

CPU family

4th Generation AMD EPYC 9004 Series Processors

CPU

AMD EPYC 9124 16C 200W 3.0GHz, AMD EPYC 9354 32C 280W 3.25GHz, AMD EPYC 9454 48C 290W 2.75GHz, AMD EPYC 9654 96C 360W 2.4GHz

RAM configuration

24x DIMM slots, DDR5 4800/4400/4000/3600 RDIMM/ 3DS RDIMM, Maximum 6144GB

RAM

384GB DDR5 4800MHz (12x 32GB), 768GB DDR5 4800MHz (12x 64GB)

Drive bays

2 x 2.5" & 4 x 3.5" Hot-swap Storage Bays (NVMe / SAS / SATA)

SSD for boot

2x 960GB SSD

NVMe capacity

3.84 TB NVMe SSD, 7.68 TB NVMe SSD, 15.36 TB NVMe SSD

Expansion slots

Rear slots: 4 x PCIe x16 slots for dual-slot cards or 8 x PCIe x16 slots for single-slot cards; Front slots: 1 x PCIe x8 slot replacing a drive bay

Max. number of GPUs

4

GPU

2x NVIDIA L40S 48GB, 2x NVIDIA H100 80GB, 4x NVIDIA L40S 48GB, 4x NVIDIA H100 80GB, 8x NVIDIA L4 24GB

Networking

2 x GbE LAN ports (Intel I350 Controller), 1 x Management Port

Power supply

1+1 Redundant 2600W 80 PLUS Titanium Power Supply

Form factor

Rack, 2U

AI software stack

Operating system – Ubuntu Linux, NVIDIA GPU drivers, Docker engine, connection to NVIDIA NGC Catalog
