Chutes

Node Growth Analytics

GPU node growth patterns over the last 30 days: 3.34K nodes, down 19.4% vs. yesterday.
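The day-over-day figure above is a simple relative change between two consecutive daily node counts. A minimal sketch of that calculation, using hypothetical inputs (the dashboard does not show yesterday's count; 4,144 is merely a value consistent with the displayed 3.34K and -19.4%):

```python
def day_over_day_change(today: float, yesterday: float) -> float:
    """Relative change between two daily node counts, as a percentage."""
    return (today - yesterday) / yesterday * 100

# Hypothetical inputs: 3,340 nodes today (the 3.34K shown above) and an
# assumed 4,144 yesterday; prints roughly -19.4%.
print(f"{day_over_day_change(3340, 4144):+.1f}%")
```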

GPU Node Counts

Below are the current counts of GPU nodes on the Chutes platform, along with key specifications for each GPU model and a summary of the total compute currently available.

GPU Node Table


| GPU Name | Provisioned | Memory | CUDA Cores | Memory Bandwidth | FP32 Performance | Max Power |
| --- | --- | --- | --- | --- | --- | --- |
| NVIDIA H200 141GB SXM | 2046 | 141 GB HBM3e | 16,896 | 4.8 TB/s | 67 TFLOPS | 700 W |
| NVIDIA L40 | 478 | 48 GB GDDR6 (ECC) | 18,176 | 864 GB/s | 90.5 TFLOPS | 300 W |
| NVIDIA A100 40GB PCIe | 322 | 40 GB HBM2 | 6,912 | 1.6 TB/s | 19.5 TFLOPS | 250 W |
| NVIDIA L40S | 265 | 48 GB GDDR6 (ECC) | 18,176 | 864 GB/s | 91.6 TFLOPS | 350 W |
| NVIDIA GeForce RTX 3090 | 245 | 24 GB GDDR6X | 10,496 | 936 GB/s | ~35.7 TFLOPS | 350 W |
| NVIDIA RTX A6000 | 56 | 48 GB GDDR6 (ECC) | 10,752 | 768 GB/s | 38.7 TFLOPS | 300 W |
| NVIDIA H100 80GB SXM | 56 | 80 GB HBM3 | 16,896 | 3.35 TB/s | 67 TFLOPS | 700 W |
| NVIDIA GeForce RTX 4090 | 32 | 24 GB GDDR6X | 16,384 | 1008 GB/s | ~83 TFLOPS | 450 W |
| - | 32 | N/A | N/A | N/A | N/A | N/A |
| NVIDIA B200 Blackwell | 24 | 192 GB HBM3e | 18,000 | 8 TB/s | 160 TFLOPS | 700 W |
| NVIDIA A40 | 18 | 48 GB GDDR6 (ECC) | 10,752 | 696 GB/s | 37.4 TFLOPS | 300 W |
| NVIDIA A100 80GB SXM | 16 | 80 GB HBM2e | 6,912 | 2.039 TB/s | 19.5 TFLOPS | 400 W |
| NVIDIA A100 80GB PCIe | 8 | 80 GB HBM2e | 6,912 | 1.935 TB/s | 19.5 TFLOPS | 300 W |
| NVIDIA H100 80GB PCIe | 8 | 80 GB HBM2e | 16,896 | 2.0 TB/s | 51 TFLOPS | 350 W |
| NVIDIA RTX 6000 Ada Generation | 4 | 48 GB GDDR6 (ECC) | 18,176 | 960 GB/s | 91.1 TFLOPS | 300 W |
| NVIDIA RTX A4000 | 1 | 16 GB GDDR6 (ECC) | 6,144 | 448 GB/s | 19.2 TFLOPS | 140 W |
| - | 0 | N/A | N/A | N/A | N/A | N/A |
| Total | 3611 | - | 55,949,184 | - | - | 1,940,140 W |
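The totals row is a count-weighted aggregate of the rows above: each model's CUDA core count and per-GPU max power are multiplied by its provisioned node count and summed (the total of 3611 nodes also includes the 32 unspecified nodes, which publish no per-GPU specs). A minimal sketch of that aggregation, with the per-model values copied from the table:

```python
# (provisioned nodes, CUDA cores per GPU, max power per GPU in watts),
# copied from the table above. The unspecified rows are omitted because
# they contribute no published core or power figures.
NODES = {
    "NVIDIA H200 141GB SXM": (2046, 16896, 700),
    "NVIDIA L40": (478, 18176, 300),
    "NVIDIA A100 40GB PCIe": (322, 6912, 250),
    "NVIDIA L40S": (265, 18176, 350),
    "NVIDIA GeForce RTX 3090": (245, 10496, 350),
    "NVIDIA RTX A6000": (56, 10752, 300),
    "NVIDIA H100 80GB SXM": (56, 16896, 700),
    "NVIDIA GeForce RTX 4090": (32, 16384, 450),
    "NVIDIA B200 Blackwell": (24, 18000, 700),
    "NVIDIA A40": (18, 10752, 300),
    "NVIDIA A100 80GB SXM": (16, 6912, 400),
    "NVIDIA A100 80GB PCIe": (8, 6912, 300),
    "NVIDIA H100 80GB PCIe": (8, 16896, 350),
    "NVIDIA RTX 6000 Ada Generation": (4, 18176, 300),
    "NVIDIA RTX A4000": (1, 6144, 140),
}

total_cores = sum(count * cores for count, cores, _ in NODES.values())
total_power = sum(count * watts for count, _, watts in NODES.values())

print(f"Total CUDA cores: {total_cores:,}")    # 55,949,184
print(f"Total max power:  {total_power:,} W")  # 1,940,140 W
```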