Will It Run AI
What should I run on this machine?
A runtime-aware local AI planner for GPUs, Macs, and current open-weight model coverage. Compare fit, speed, and tradeoffs before you waste time testing random quants.
Workload-aware
Recommendations start from coding, chat, RAG, or reasoning instead of generic "can it run?" output.

Artifact-aware
Runtime and quant choices are resolved from supported artifacts, not guessed from a single VRAM number.

Current coverage
The catalog surfaces featured local model families, GPU and Mac hardware, and compare pages, all of them indexable.

Latest local model coverage
Current model coverage includes the latest local and frontier-leaning open-weight families for coding, reasoning, RAG, and chat. Start from a real hardware target, not a benchmark chart.
Run mode
Start from your hardware and workload, then rank realistic model, quant, and runtime combinations instead of raw model names.
Try the calculator

Compare mode
Compare GPUs and Macs for local AI in the language buyers actually care about: coding, reasoning, chat, and long-context work.
Open compare

Method
The app separates catalog data, artifact support, fit estimation, and recommendation scoring so assumptions stay visible.
Read the method
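As a rough illustration of the kind of fit estimate such a planner performs, here is a minimal sketch. The function name, the 1.5 GB overhead allowance, and all numbers are assumptions for illustration only, not the app's actual scoring: a real planner also models KV cache growth with context length, runtime buffers, and per-artifact quant formats.

```python
# Illustrative memory-fit estimate for a dense, quantized local model.
# Everything here is an assumption for illustration, not the app's method.

def estimate_fit(params_b: float, quant_bits: int, vram_gb: float,
                 overhead_gb: float = 1.5) -> tuple[float, bool]:
    """Return (estimated GB needed, fits?) for a model's weights.

    params_b: parameter count in billions; quant_bits: bits per weight.
    overhead_gb is a crude stand-in for KV cache and runtime buffers.
    """
    weights_gb = params_b * quant_bits / 8  # e.g. 8B @ 4-bit -> 4.0 GB
    needed = weights_gb + overhead_gb
    return needed, needed <= vram_gb

# Under these assumptions, an 8B model at 4-bit fits on an 8 GB GPU,
# while the same model at 8-bit does not.
print(estimate_fit(8, 4, vram_gb=8))  # (5.5, True)
print(estimate_fit(8, 8, vram_gb=8))  # (9.5, False)
```

Even this toy version shows why a single VRAM number is not enough: the same hardware answer flips with the quant, which is why the planner ranks model, quant, and runtime combinations together.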