Fill out your specs and we’ll match you with the best H200 GPU servers available worldwide.
Choose from Multiple Architectures:

| Model Type | Configuration | Ideal For | Availability |
|---|---|---|---|
| H200 SXM8 HGX | 8× H200 GPUs, NVLink | Large-scale AI training | In Stock |
| H200 SXM5 HGX | 4× H200 GPUs, NVLink | AI model fine-tuning, inference | In Stock |
| H200 PCIe | 1–4× GPUs | Scalable deployment & edge AI | Configurable |
| H200 NVLink / NVSwitch | Custom interconnect | Multi-node AI clusters | On Demand |
H200 HGX Benchmark Results (Sample):

| Test | H200 SXM8 (relative) | H100 SXM8 (relative) | Performance Gain |
|---|---|---|---|
| GPT-3 Training | 1.0× | 0.54× | +85% |
| BERT Large Training | 1.0× | 0.57× | +75% |
| HPC FP64 Compute | 1.0× | 0.61× | +64% |
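The gain column follows directly from the normalized throughput: if the H100 delivers 0.54× the H200's throughput, the H200 is 1/0.54 ≈ 1.85× faster, i.e. a +85% gain. A quick arithmetic check of the table values:

```python
# Relative throughput of the H100 SXM8, with the H200 SXM8 normalized
# to 1.0, taken from the benchmark table above.
h100_relative = {
    "GPT-3 Training": 0.54,
    "BERT Large Training": 0.57,
    "HPC FP64 Compute": 0.61,
}

for test, rel in h100_relative.items():
    # Speedup of H200 over H100, expressed as a percentage gain.
    gain_pct = (1.0 / rel - 1.0) * 100
    print(f"{test}: +{gain_pct:.0f}%")
```

Running this reproduces the +85%, +75%, and +64% figures shown above.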




Technical Specifications (Summary):

| Model | Interface | GPU Memory | Memory Type | Bandwidth | FP8 TFLOPS | NVLink | Power |
|---|---|---|---|---|---|---|---|
| H200 SXM8 | SXM5 | 141 GB | HBM3e | 4.8 TB/s | 1,979 | Yes | 700 W |
| H200 SXM5 | SXM5 | 141 GB | HBM3e | 4.8 TB/s | 1,979 | Yes | 700 W |
| H200 PCIe | PCIe 5.0 | 141 GB | HBM3e | 4.8 TB/s | 1,979 | Yes | 600 W |
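For cluster sizing, the per-GPU figures above scale linearly across a node. A rough back-of-envelope sketch, assuming an 8-GPU H200 SXM8 HGX node and counting GPU board power only (host CPUs, networking, and cooling are excluded):

```python
# Per-GPU figures from the specification table above.
GPU_MEMORY_GB = 141    # HBM3e capacity per H200
BANDWIDTH_TBS = 4.8    # memory bandwidth per GPU, TB/s
POWER_W = 700          # SXM board power per GPU
GPUS_PER_NODE = 8      # H200 SXM8 HGX configuration

print(f"Node HBM3e capacity: {GPU_MEMORY_GB * GPUS_PER_NODE} GB")
print(f"Aggregate memory bandwidth: {BANDWIDTH_TBS * GPUS_PER_NODE} TB/s")
print(f"GPU power budget (GPUs only): {POWER_W * GPUS_PER_NODE / 1000} kW")
```

That works out to 1,128 GB of HBM3e, 38.4 TB/s of aggregate bandwidth, and a 5.6 kW GPU power budget per node.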
© 2026 Blockware. All rights reserved.
Subscribe to get the latest HPC and AI market insights and deals.