Will It Run AI

All estimates are approximations based on mathematical models and public specifications. Actual performance may vary. Do not make purchasing decisions based solely on these estimates.
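The file sizes in the listing below track parameter count closely (7B ≈ 3.9 GB, 13B ≈ 7.3 GB), consistent with roughly 4.5 bits per weight, as is typical for 4-bit quantized files. A minimal sketch of such an estimate — the bits-per-weight figure is an assumption for illustration, not the site's actual formula:

```python
def estimated_size_gb(params_billions: float, bits_per_weight: float = 4.5) -> float:
    """Rough quantized-model file-size estimate: parameters x bits per weight.

    bits_per_weight ~= 4.5 is an assumed average for 4-bit quantization
    (quantized weights plus scale/metadata overhead); actual files vary.
    """
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

# e.g. estimated_size_gb(7) ~= 3.9, matching the 7B entries in this listing
```

Actual VRAM needed at runtime is higher, since the KV cache and activations add to the file size.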

Data sourced from Hugging Face, Ollama, and official model documentation. Model names and logos are trademarks of their respective owners.

© 2026 Will It Run AI — Fase Consulting Ibiza, S.L. (NIF: B57969656)

Browse AI Models

283 models available

Mradermacher · Baichuan M3 235B i1
235B · 0K ctx · 131.6 GB
dense · Legacy

Mradermacher · logos16v2 stablelm2 1.6b i1
1.6B · 0K ctx · 0.9 GB
dense · Legacy

RichardErkhov · stabilityai japanese stablelm instruct beta 70b
70B · 0K ctx · 39.2 GB
dense · Legacy

Mradermacher · SOLAR 10.7B v1.0
10.7B · 0K ctx · 6 GB
dense · Legacy

Srs6901 · GGUF SOLARized GraniStral 14B 1902 YeAM HCT
14B · 0K ctx · 7.8 GB
dense · Legacy

Bartowski · baichuan inc Baichuan M2 32B
32B · 0K ctx · 17.9 GB
dense · Legacy

Bartowski · DiscoPOP zephyr 7b gemma
7B · 0K ctx · 3.9 GB
dense · Legacy

Bartowski · HelpingAI2 9B
9B · 0K ctx · 5 GB
dense · Legacy

Bartowski · ai21labs AI21 Jamba2 3B
3B · 0K ctx · 1.7 GB
dense · Legacy

Baichuan · Baichuan 13B
13B · 8K ctx · 7.3 GB · legacy
dense · Legacy

Baichuan-13B-Chat is the aligned version in the Baichuan-13B model series; the pretrained model is available as Baichuan-13B-Base.

InternLM · InternLM Chat 7B
7B · 8K ctx · 3.9 GB · legacy
dense · Legacy

InternLM has open-sourced a 7-billion-parameter base model and a chat model tailored for practical scenarios. The model has the following characteristics:
  • It leverages trillions of high-quality tokens for training to establish a powerful knowledge base.
  • It supports an 8K context window, enabling longer input sequences and stronger reasoning capabilities.
  • It provides a versatile toolset for users to flexibly build their own workflows.

Mistral · Mistral 7B Instruct v0.3
7B · 8K ctx · 3.9 GB · legacy
dense · Legacy

The Mistral-7B-Instruct-v0.3 Large Language Model (LLM) is an instruct fine-tuned version of the Mistral-7B-v0.3.

Intel · Neural Chat 7B
7B · 8K ctx · 3.9 GB · legacy
dense · Legacy

Context length for this model: 8192 tokens (same as https://huggingface.co/mistralai/Mistral-7B-v0.1)

Nous Research · Nous Hermes 1.0
9B · 16K ctx · 5 GB · legacy
dense · Legacy

Nous Hermes is a fine-tuned model optimized for instruction following and helpful dialogue. Trained on curated datasets emphasizing quality responses, reasoning, and user alignment.

AllenAI · OLMo 2 13B
13B · 33K ctx · 7.3 GB · current
dense · Legacy

OLMo 2 13B is AI2's fully open research model with transparent training data and methodology. Designed for reproducible research with competitive performance on reasoning and general knowledge tasks.

HuggingFace · SmolLM3 3B
3B · 128K ctx · 1.7 GB · active
dense · Legacy

SmolLM3 is a fully open 3B-parameter language model with dual-mode reasoning, 128K context via YARN extrapolation, and native support for 6 languages. Pretrained on 11.2T tokens with a staged curriculum of web, code, math, and reasoning data. Post-trained with 140B reasoning tokens and Anchored Preference Optimization.

01.AI · Yi Coder 9B
9B · 131K ctx · 5 GB · current
dense · Legacy


Intervitens-archive · internlm2 limarp chat 20b
20B · 0K ctx · 11.2 GB
dense · Legacy

Duyntnet · TinyLlama 1.1B Chat v1.0 imatrix
1.1B · 0K ctx · 0.6 GB
dense · Legacy

QuantFactory · starcoder2 7b
7B · 0K ctx · 3.9 GB
dense · Legacy

MaziyarPanahi · zephyr 7b beta Mistral 7B Instruct v0.2
7B · 0K ctx · 3.9 GB
dense · Legacy

Mradermacher · OpenChat 3.5 7B Qwen v2.0 i1
7B · 0K ctx · 3.9 GB
dense · Legacy

Mradermacher · OpenChat 3.5 7B Starling v2.0 i1
7B · 0K ctx · 3.9 GB
dense · Legacy

HelpingAI · HelpingAI2 6B
6B · 0K ctx · 3.4 GB
dense · Legacy

Page 9 of 12