FitMyLLM

GPU memory calculator · by Karthic

Find out how much GPU memory your LLM (Large Language Model) needs before you buy hardware or rent cloud GPUs.

LLM Model Parameters

1. Choose a model family
2. Choose a size
3. Choose a precision
4. Choose a training mode (Inference: running the model only, no learning)
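The precision choice sets the bytes stored per parameter, which dominates the weights figure. A minimal sketch using standard per-precision sizes (which precisions the calculator actually offers is an assumption):

```python
# Standard bytes-per-parameter values by precision; which of these the
# calculator actually exposes is an assumption.
BYTES_PER_PARAM = {"FP32": 4.0, "FP16": 2.0, "BF16": 2.0, "INT8": 1.0, "INT4": 0.5}

def weights_gb(params_billion: float, precision: str) -> float:
    """Memory for the model weights alone, in GB (1e9 params x bytes / 1e9)."""
    return params_billion * BYTES_PER_PARAM[precision]

print(weights_gb(7, "FP16"))  # a 7B model needs 14.0 GB of weights in FP16
print(weights_gb(7, "INT4"))  # quantized to INT4, the same model needs 3.5 GB
```

Halving the precision halves the weights footprint, which is why quantization is the quickest way to fit a larger model on the same card.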

Memory Estimate

23.3 GB (inference)
✅ Fits on NVIDIA RTX 3090 (24 GB) with 0.7 GB spare

Weights        18.4 GB   77%
KV cache        2.5 GB   10%
Activations     1.2 GB    5%
Overhead        1.1 GB    5%
Spare           0.7 GB    3%
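The estimate is a sum of the components above. The component values here are taken from the breakdown; the KV-cache line can be sketched with the standard formula (2 tensors, K and V, per layer per token), using a hypothetical Llama-7B-like shape for illustration:

```python
# KV-cache sketch: the cache stores one key and one value vector per layer
# per token. The shape below (32 layers, 32 KV heads, head_dim 128, 4k
# context) is a hypothetical Llama-7B-like config, not something the page
# specifies.
def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                seq_len: int, batch: int = 1, bytes_per_elem: int = 2) -> float:
    elems = 2 * n_layers * n_kv_heads * head_dim * seq_len * batch  # 2 = K and V
    return elems * bytes_per_elem / 1e9

def total_gb(weights: float, kv: float, activations: float, overhead: float) -> float:
    return weights + kv + activations + overhead

print(round(kv_cache_gb(32, 32, 128, 4096), 2))  # ~2.15 GB at 4k context, FP16
print(round(total_gb(18.4, 2.5, 1.2, 1.1), 1))   # ~23.2 GB (page rounds to 23.3)
```

Note the KV cache grows linearly with context length and batch size, so the same model can need far more memory at long context.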

Hardware Recommendations

GPU                  Cards × VRAM    Utilization
NVIDIA A100 40 GB    1 × 40.0 GB     58%
NVIDIA A6000 48 GB   1 × 48.0 GB     49%
NVIDIA L40S 48 GB    1 × 48.0 GB     49%
NVIDIA A100 80 GB    1 × 80.0 GB     29%
NVIDIA H100 80 GB    1 × 80.0 GB     29%
NVIDIA H200 141 GB   1 × 141.0 GB    17%
NVIDIA B200 192 GB   1 × 192.0 GB    12%
NVIDIA B300 288 GB   1 × 288.0 GB     8%