FitMyLLM
GPU memory calculator · by Karthic
Find out how much GPU memory your LLM (Large Language Model) needs before you buy hardware or rent cloud GPUs.
LLM Model Parameters
1. Choose a model family
2. Choose a size
3. Choose a precision
4. Choose a training mode: Inference (running the model only, no learning).
Memory Estimate
23.3 GB (inference)
✅ Fits on NVIDIA RTX 3090 24 GB (0.7 GB spare)
Breakdown:
- Weights: 18.4 GB (77%)
- KV cache: 2.5 GB (10%)
- Activations: 1.2 GB (5%)
- Overhead: 1.1 GB (5%)
- Spare: 0.7 GB (3%)
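The two largest components of the breakdown above can be approximated with standard back-of-the-envelope formulas. The sketch below is an illustration, not the calculator's published method: the model shape (9.2B parameters at fp16, i.e. 2 bytes per parameter) is an assumption chosen to reproduce the 18.4 GB weights figure, and the KV-cache arguments are likewise example values.

```python
def weights_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Model weights: parameter count x bytes per parameter (fp16 = 2 bytes)."""
    return n_params * bytes_per_param / 1e9

def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                seq_len: int, batch: int = 1, bytes_per_val: int = 2) -> float:
    """KV cache: 2 tensors (K and V) x layers x KV heads x head dim x tokens."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * batch * bytes_per_val / 1e9

# A hypothetical 9.2B-parameter model at fp16 reproduces the weights line above.
print(weights_gb(9.2e9))                 # 18.4 GB
# Example KV-cache footprint for an assumed 40-layer model with grouped-query
# attention (8 KV heads, head dim 128) at an 8192-token context.
print(kv_cache_gb(40, 8, 128, 8192))
```

Activations and framework overhead are harder to model precisely; calculators typically estimate them as a small fixed fraction of the total.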
Hardware Recommendations
- 1x 40.0 GB (58% utilized)
- 1x 48.0 GB (49% utilized)
- 1x 80.0 GB (29% utilized)
- 1x 141.0 GB (17% utilized)
- 1x 192.0 GB (12% utilized)
- 1x 288.0 GB (8% utilized)
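The percentage next to each card appears to be the estimated footprint divided by the card's capacity. A minimal sketch, assuming simple rounding of that ratio (the 23.3 GB figure is the inference estimate above; the function name is illustrative):

```python
def utilization_pct(required_gb: float, capacity_gb: float) -> int:
    """Share of a card's memory the model would occupy, rounded to whole percent."""
    return round(100 * required_gb / capacity_gb)

# Reproduces the recommendation list for the 23.3 GB estimate.
for cap in (40.0, 48.0, 80.0, 141.0, 192.0, 288.0):
    print(f"1x {cap:.1f} GB -> {utilization_pct(23.3, cap)}%")
```

Lower utilization leaves more headroom for longer contexts or larger batches, so the list trades cost against spare capacity.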