Will It Run AI

All estimates are approximations based on mathematical models and public specifications. Actual performance may vary. Do not make purchasing decisions based solely on these estimates.

Data sourced from Hugging Face, Ollama, and official model documentation. Model names and logos are trademarks of their respective owners.

© 2026 Will It Run AI — Fase Consulting Ibiza, S.L. (NIF: B57969656)


Can NVIDIA H100 80GB run Kimi K2.5?

Grade F: Won't run (too heavy)

Using Q4_K_M in vLLM

Capabilities:
  • Fit status: Too heavy
  • Decode: 14.2 tok/s
  • TTFT: 13665 ms
  • Safe context: 4K
  • Memory: 625.4 GB needed / 80.0 GB available

Memory breakdown
  • Weights: 610.0 GB
  • KV Cache: 5.0 GB
  • Runtime: 2.4 GB
  • Headroom: 8.0 GB
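The verdict above is simple arithmetic: the sum of the memory components must fit within usable VRAM. A minimal sketch of that check, using the figures from the breakdown above (the function name is hypothetical, not part of the site's code):

```python
def fits_in_vram(weights_gb: float, kv_cache_gb: float,
                 runtime_gb: float, headroom_gb: float,
                 vram_gb: float):
    """Sum the memory components and compare against usable VRAM."""
    total = weights_gb + kv_cache_gb + runtime_gb + headroom_gb
    return total, total <= vram_gb

# Figures from the breakdown above (Kimi K2.5 @ Q4_K_M on an H100 80GB):
total, fits = fits_in_vram(610.0, 5.0, 2.4, 8.0, 80.0)
print(f"{total:.1f} GB needed / 80.0 GB available -> "
      f"{'fits' if fits else 'too heavy'}")
# -> 625.4 GB needed / 80.0 GB available -> too heavy
```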

Performance by workload

Workload       | Grade | Fit       | Decode     | TTFT     | Context
Agentic Coding | F     | Too heavy | 14.2 tok/s | 19876 ms | 4K
Chat           | F     | Too heavy | 14.2 tok/s | 7453 ms  | 4K
Coding         | F     | Too heavy | 14.2 tok/s | 13665 ms | 4K
RAG            | F     | Too heavy | 14.2 tok/s | 24845 ms | 4K
Reasoning      | F     | Too heavy | 14.2 tok/s | 16149 ms | 4K
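Decode speed and context limit are the same for every workload; only TTFT differs, which is consistent with each workload assuming a different prompt length at a fixed prefill throughput. A sketch of that relationship, with hypothetical numbers (the prompt size and prefill rate below are illustrative, not the site's actual parameters):

```python
def ttft_ms(prompt_tokens: int, prefill_tps: float) -> float:
    """Time-to-first-token if prefill is throughput-bound.

    Ignores scheduling and sampling overhead: TTFT is just the time
    to process the whole prompt at the prefill rate.
    """
    return prompt_tokens / prefill_tps * 1000.0

# Hypothetical figures: a 3000-token chat prompt at 400 tok/s prefill.
print(f"{ttft_ms(3000, 400):.0f} ms")  # -> 7500 ms
```

Longer-prompt workloads like RAG push TTFT up proportionally, which is why RAG shows the worst TTFT in the table above.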

Quantization options

How Kimi K2.5 (1000B params) fits at each quantization level on NVIDIA H100 80GB (80.0 GB usable).

Quant   | Bits | VRAM      | Quality   | Fit
Q2_K    | 2    | 390.0 GB  | Low       | F
Q3_K_S  | 3    | 490.0 GB  | Low       | F
NVFP4   | 4    | 560.0 GB  | Medium    | F
Q4_K_M  | 4    | 610.0 GB  | Medium    | F
Q5_K_M  | 5    | 720.0 GB  | High      | F
Q6_K    | 6    | 820.0 GB  | High      | F
Q8_0    | 8    | 1070.0 GB | Very High | F
F16     | 16   | 2050.0 GB | Maximum   | F
  • See all results for NVIDIA H100 80GB
  • See all hardware for Kimi K2.5