Chutes

Node Growth Analytics

GPU node growth patterns over the last 30 days

6.75K GPU nodes (↗ +2.5% vs. yesterday)
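
The day-over-day figure is a simple percent change between the two most recent daily node counts. A minimal sketch of that calculation (the daily values below are made up for illustration and are not actual Chutes data):

```python
# Day-over-day growth, as shown in the analytics card above.
# daily_counts is illustrative sample data, newest value last.
daily_counts = [6580, 6610, 6590, 6700, 6585, 6750]

today, yesterday = daily_counts[-1], daily_counts[-2]
pct_change = (today - yesterday) / yesterday * 100

print(f"{today / 1000:.2f}K nodes, {pct_change:+.1f}% vs yesterday")
# -> 6.75K nodes, +2.5% vs yesterday
```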

GPU Node Counts

Below are the current counts of GPU nodes on the Chutes platform, along with key specifications for each GPU model and a summary of the total compute currently available.

GPU Node Table

| GPU Name | Provisioned | Memory | CUDA Cores | Memory Bandwidth | FP32 Performance | Max Power |
|---|---|---|---|---|---|---|
| NVIDIA H200 141GB SXM | 4,048 | 141 GB HBM3e | 16,896 | 4.8 TB/s | 67 TFLOPS | 700 W |
| NVIDIA RTX A6000 | 806 | 48 GB GDDR6 (ECC) | 10,752 | 768 GB/s | 38.7 TFLOPS | 300 W |
| NVIDIA L40 | 687 | 48 GB GDDR6 (ECC) | 18,176 | 864 GB/s | 90.5 TFLOPS | 300 W |
| NVIDIA A100 40GB PCIe | 256 | 40 GB HBM2 | 6,912 | 1.6 TB/s | 19.5 TFLOPS | 250 W |
| NVIDIA H100 80GB SXM | 234 | 80 GB HBM3 | 16,896 | 3.35 TB/s | 67 TFLOPS | 700 W |
| NVIDIA GeForce RTX 3090 | 230 | 24 GB GDDR6X | 10,496 | 936 GB/s | ~35.7 TFLOPS | 350 W |
| NVIDIA L40S | 212 | 48 GB GDDR6 (ECC) | 18,176 | 864 GB/s | 91.6 TFLOPS | 350 W |
| NVIDIA GeForce RTX 4090 | 204 | 24 GB GDDR6X | 16,384 | 1,008 GB/s | ~83 TFLOPS | 450 W |
| NVIDIA A100 80GB SXM | 170 | 80 GB HBM2e | 6,912 | 2.039 TB/s | 19.5 TFLOPS | 400 W |
| NVIDIA B200 Blackwell | 112 | 192 GB HBM3e | 18,000 | 8 TB/s | 160 TFLOPS | 700 W |
| NVIDIA L4 Tensor Core GPU | 33 | 24 GB GDDR6 | 7,680 | 300 GB/s | 30.3 TFLOPS | 72 W |
| NVIDIA RTX 6000 Ada Generation | 24 | 48 GB GDDR6 (ECC) | 18,176 | 960 GB/s | 91.1 TFLOPS | 300 W |
| NVIDIA RTX A4000 | 14 | 16 GB GDDR6 (ECC) | 6,144 | 448 GB/s | 19.2 TFLOPS | 140 W |
| NVIDIA A100 80GB PCIe | 8 | 80 GB HBM2e | 6,912 | 1.935 TB/s | 19.5 TFLOPS | 300 W |
| Unknown | 7 | N/A | N/A | N/A | N/A | N/A |
| NVIDIA RTX 4000 Ada Generation | 1 | 20 GB GDDR6 (ECC) | 6,144 | 360 GB/s | 26.7 TFLOPS | 130 W |
| NVIDIA RTX A5000 | 1 | 24 GB GDDR6 (ECC) | 8,192 | 768 GB/s | 27.8 TFLOPS | 230 W |
| NVIDIA A10 Tensor Core GPU | 0 | 24 GB GDDR6 | 9,216 | 600 GB/s | 31.2 TFLOPS | 150 W |
| NVIDIA A100 40GB SXM | 0 | 40 GB HBM2 | 6,912 | 1.6 TB/s | 19.5 TFLOPS | 400 W |
| NVIDIA A40 | 0 | 48 GB GDDR6 (ECC) | 10,752 | 696 GB/s | 37.4 TFLOPS | 300 W |
| NVIDIA H100 80GB PCIe | 0 | 80 GB HBM2e | 14,592 | 2.0 TB/s | 51 TFLOPS | 350 W |
| Total | 7,047 | - | 108,917,248 | - | - | 3,916,496 W |
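
The Total row sums the Provisioned column directly, while the CUDA Cores and Max Power totals are per-GPU values multiplied by the provisioned count and summed over all models. A minimal sketch of that aggregation over the rows above (the list-of-tuples layout is purely illustrative, not a Chutes API; rows with zero provisioned nodes are omitted since they contribute nothing):

```python
# Aggregate the GPU node table: (name, provisioned, cuda_cores, max_power_w).
# Values copied from the table above; None marks the unspecified "Unknown" row.
ROWS = [
    ("NVIDIA H200 141GB SXM",          4048, 16896, 700),
    ("NVIDIA RTX A6000",                806, 10752, 300),
    ("NVIDIA L40",                      687, 18176, 300),
    ("NVIDIA A100 40GB PCIe",           256,  6912, 250),
    ("NVIDIA H100 80GB SXM",            234, 16896, 700),
    ("NVIDIA GeForce RTX 3090",         230, 10496, 350),
    ("NVIDIA L40S",                     212, 18176, 350),
    ("NVIDIA GeForce RTX 4090",         204, 16384, 450),
    ("NVIDIA A100 80GB SXM",            170,  6912, 400),
    ("NVIDIA B200 Blackwell",           112, 18000, 700),
    ("NVIDIA L4 Tensor Core GPU",        33,  7680,  72),
    ("NVIDIA RTX 6000 Ada Generation",   24, 18176, 300),
    ("NVIDIA RTX A4000",                 14,  6144, 140),
    ("NVIDIA A100 80GB PCIe",             8,  6912, 300),
    ("Unknown",                           7,  None, None),
    ("NVIDIA RTX 4000 Ada Generation",    1,  6144, 130),
    ("NVIDIA RTX A5000",                  1,  8192, 230),
]

total_nodes = sum(count for _, count, _, _ in ROWS)
total_cores = sum(count * cores for _, count, cores, _ in ROWS if cores is not None)
total_power = sum(count * watts for _, count, _, watts in ROWS if watts is not None)

print(f"Nodes: {total_nodes:,}")        # Nodes: 7,047
print(f"CUDA cores: {total_cores:,}")   # CUDA cores: 108,917,248
print(f"Max power: {total_power:,} W")  # Max power: 3,916,496 W
```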