Will It Run AI
Product
  • Calculator
  • Compare
  • Tier List
Browse
  • Models
  • Hardware
  • Docs
About
  • Why It Works
  • What's New
  • Legal Notice
  • Privacy Policy

All estimates are approximations based on mathematical models and public specifications. Actual performance may vary. Do not make purchasing decisions based solely on these estimates.

Data sourced from Hugging Face, Ollama, and official model documentation. Model names and logos are trademarks of their respective owners.

© 2026 Will It Run AI — Fase Consulting Ibiza, S.L. (NIF: B57969656)

Browse AI Models

50 models available
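The download sizes in the cards below track parameter count almost linearly, which is consistent with roughly 4-bit quantized weights. A minimal sketch, assuming about 0.56 bytes per parameter (an assumption on our part; the listing does not state which quantization its sizes reflect):

```python
def estimated_size_gb(params_billion: float, bytes_per_param: float = 0.56) -> float:
    """Rough download size for a quantized model.

    bytes_per_param = 0.56 approximates a 4-bit quantization with
    per-block scaling overhead (an assumption; real formats vary).
    """
    return round(params_billion * 1e9 * bytes_per_param / 1e9, 1)

print(estimated_size_gb(14))  # 7.8 — matches the listed Qwen 2.5 Coder 14B size
print(estimated_size_gb(7))   # 3.9 — matches the listed 7B coder models
```

The same factor reproduces every size on this page (22B → 12.3 GB, 15B → 8.4 GB, 1.5B → 0.8 GB), but treat it as a heuristic, not a statement about any specific file format.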

Alibaba · Qwen 2.5 Coder 14B
14B · 131K ctx · 7.8 GB · current
dense · Legacy

Qwen2.5-Coder is the latest series of code-specific Qwen large language models (formerly known as CodeQwen). It currently covers six mainstream model sizes (0.5, 1.5, 3, 7, 14, and 32 billion parameters) to meet the needs of different developers, and brings a number of improvements over CodeQwen1.5.

Uukuguy · speechless zephyr code functionary 7b
7B · 0K ctx · 3.9 GB
dense · Legacy

Second-state · StarCoder2 3B
3B · 0K ctx · 1.7 GB
dense · Legacy

Second-state · StarCoder2 7B
7B · 0K ctx · 3.9 GB
dense · Legacy

Lmstudio-community · Yi Coder 1.5B
1.5B · 0K ctx · 0.8 GB
dense · Legacy

SanctumAI · Codestral 22B v0.1
22B · 0K ctx · 12.3 GB
dense · Legacy

BigCode · StarCoder2 15B
15B · 16K ctx · 8.4 GB · current
dense · Legacy

  • Project Website: bigcode-project.org
  • Paper: Link
  • Point of Contact: contact@bigcode-project.org
  • Languages: 600+ programming languages

Google · Gemma 3 12B
12B · 131K ctx · 6.7 GB · current
dense · Legacy

Gemma 3 12B is Google's mid-range Gemma 3 model with vision capabilities. It offers strong reasoning, code generation, and image understanding, balanced with practical resource requirements.

Bartowski · starcoder2 15b instruct v0.1
15B · 0K ctx · 8.4 GB
dense · Legacy

Mradermacher · starcoder2 15b i1
15B · 0K ctx · 8.4 GB
dense · Legacy

Mradermacher · Yi 9B Coder i1
9B · 0K ctx · 5 GB
dense · Legacy

Gabriellarson · Mamba Codestral 7B v0.1
7B · 0K ctx · 3.9 GB
dense · Legacy

Lmstudio-community · starcoder2 15b instruct v0.1
15B · 0K ctx · 8.4 GB
dense · Legacy

Bartowski · internlm JanusCoder 14B
14B · 0K ctx · 7.8 GB
dense · Legacy

Legraphista · Codestral 22B v0.1 IMat
22B · 0K ctx · 12.3 GB
dense · Legacy

Meta · CodeLlama 13B Instruct
13B · 16K ctx · 7.3 GB · legacy
dense · Legacy

Code Llama is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 34 billion parameters. This is the 13B instruct-tuned version in the Hugging Face Transformers format, designed for general code synthesis and understanding.

QuantFactory · starcoder2 7b
7B · 0K ctx · 3.9 GB
dense · Legacy

Mradermacher · CodeNinja 1.0 OpenChat 7B i1
7B · 0K ctx · 3.9 GB
dense · Legacy

Mradermacher · Codestral 21B Pruned i1
21B · 0K ctx · 11.8 GB
dense · Legacy

DevStral AI · DevStral 7B
7B · 8K ctx · 3.9 GB · legacy
dense · Legacy

Devstral 7B is Mistral AI's specialized coding model, optimized for software development tasks. It features strong code generation, completion, and understanding across multiple programming languages.

Alibaba · Qwen 2.5 Coder 7B
7B · 131K ctx · 3.9 GB · current
dense · Legacy

Qwen2.5-Coder is the latest series of code-specific Qwen large language models (formerly known as CodeQwen). It currently covers six mainstream model sizes (0.5, 1.5, 3, 7, 14, and 32 billion parameters) to meet the needs of different developers, and brings a number of improvements over CodeQwen1.5.

Mradermacher · Codestral 22B v0.1 i1
22B · 0K ctx · 12.3 GB
dense · Legacy

Mradermacher · Codestral RAG 19B Pruned i1
19B · 0K ctx · 10.6 GB
dense · Legacy

BigCode · StarCoder2 7B
7B · 16K ctx · 3.9 GB · current
dense · Legacy

  • Project Website: bigcode-project.org
  • Paper: Link
  • Point of Contact: contact@bigcode-project.org
  • Languages: 17 programming languages

Page 2 of 3