Will It Run AI

All estimates are approximations based on mathematical models and public specifications. Actual performance may vary. Do not make purchasing decisions based solely on these estimates.

Data sourced from Hugging Face, Ollama, and official model documentation. Model names and logos are trademarks of their respective owners.

© 2026 Will It Run AI — Fase Consulting Ibiza, S.L. (NIF: B57969656)


Can RTX 3090 Ti 24GB run CogVLM2 19B?

Grade: B (Good)

Runs well

Using Q4_K_M in llama.cpp


Fit status: Runs well
Decode: 61.7 tok/s
TTFT: 3135 ms
Safe context: 8K
Memory: 17.9 GB / 24.0 GB

Memory breakdown

Weights: 11.6 GB
KV Cache: 3.0 GB
Runtime: 0.9 GB
Headroom: 2.4 GB
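The components above sum to the 17.9 GB figure reported under Memory. A minimal sketch of the arithmetic, assuming (based on the numbers shown, not stated by the calculator) that the 2.4 GB headroom is a reserved safety margin counted inside the total:

```python
# Reconstruct the reported memory total from its components (GB).
# Assumption: the 2.4 GB "headroom" is a reserved safety margin that
# the calculator includes in the 17.9 GB figure.
weights = 11.6    # Q4_K_M model weights
kv_cache = 3.0    # KV cache at the 8K "safe context"
runtime = 0.9     # runtime overhead (CUDA context, buffers, etc.)
headroom = 2.4    # reserved safety margin

total = weights + kv_cache + runtime + headroom
print(f"{total:.1f} GB of 24.0 GB used")  # → 17.9 GB of 24.0 GB used

# "Runs well" because the total stays under the card's 24 GB of VRAM.
assert total <= 24.0
```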

Performance by workload

Workload       | Grade | Fit       | Decode     | TTFT    | Context
Agentic Coding | C     | Tight fit | 61.7 tok/s | 4560 ms | 8K
Chat           | B     | Runs well | 61.7 tok/s | 1710 ms | 8K
Coding         | B     | Runs well | 61.7 tok/s | 3135 ms | 8K
RAG            | C     | Tight fit | 61.7 tok/s | 5700 ms | 8K
Reasoning      | B     | Runs well | 61.7 tok/s | 3705 ms | 8K
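Note that the decode rate is identical across workloads; only TTFT (time to first token) varies. TTFT is dominated by prefill, which scales with prompt length, so the ratios between rows suggest relative prompt sizes per workload. A short sketch of that inference (the ratios are derived from the table, not reported by the calculator):

```python
# TTFT per workload from the table above (milliseconds).
ttft_ms = {
    "Agentic Coding": 4560,
    "Chat": 1710,
    "Coding": 3135,
    "RAG": 5700,
    "Reasoning": 3705,
}

# Since decode speed is constant, TTFT differences come from prefill,
# which is roughly proportional to prompt length. Normalize to Chat.
baseline = ttft_ms["Chat"]
for workload, ms in ttft_ms.items():
    print(f"{workload}: ~{ms / baseline:.2f}x the Chat prompt length")
```

RAG comes out at roughly 3.3x the Chat prompt, which matches intuition: retrieved documents stuffed into the context make prefill, and therefore TTFT, the bottleneck.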

Quantization options

How CogVLM2 19B (19B params) fits at each quantization level on RTX 3090 Ti 24GB (24.0 GB usable).

Quant                     | Bits | VRAM    | Quality   | Fit
Q2_K                      | 2    | 7.4 GB  | Low       | D (36)
Q3_K_S                    | 3    | 9.3 GB  | Low       | D (37)
NVFP4                     | 4    | 10.6 GB | Medium    | D (39)
Q4_K_M                    | 4    | 11.6 GB | Medium    | D (39)
Q5_K_M                    | 5    | 13.7 GB | High      | C (41)
Q6_K (best for your GPU)  | 6    | 15.6 GB | High      | C (43)
Q8_0                      | 8    | 20.3 GB | Very High | C (44)
F16                       | 16   | 38.9 GB | Maximum   | F (0)
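As a rough rule of thumb, weights VRAM ≈ params × effective bits per weight / 8, where k-quants use slightly more than their nominal bit width because each block carries scale metadata. A hedged sketch deriving effective bits per weight from the table's VRAM figures for this 19B model (the estimates also fold in any tensors left at higher precision, so they exceed the nominal widths):

```python
# Estimate effective bits/weight implied by each VRAM figure above,
# using weights_bytes ≈ params * effective_bits / 8.
PARAMS = 19e9  # CogVLM2 19B

vram_gb = {"Q2_K": 7.4, "Q4_K_M": 11.6, "Q8_0": 20.3, "F16": 38.9}
for quant, gb in vram_gb.items():
    bpw = gb * 1e9 * 8 / PARAMS
    print(f"{quant}: ~{bpw:.1f} effective bits/weight")
```

Q4_K_M comes out near 4.9 effective bits per weight rather than a flat 4.0, which is why its 11.6 GB figure is larger than a naive 19B × 4 / 8 = 9.5 GB estimate.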

Get started

Hugging Face:
huggingface-cli download cogvlm2-19b
See all results for RTX 3090 Ti 24GB
See all hardware for CogVLM2 19B