moonshotai/Kimi-Linear-48B-A3B-Instruct · Hugging Face
The LLM Model VRAM Calculator estimates the VRAM required to run a large language model from user-supplied inputs: model size, context length, and GPU details. Because parameter count, quantization, and context length each significantly affect memory use, the tool combines them and instantly reports the VRAM needed to load, and potentially run, the selected model configuration.
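The calculation such a tool performs can be sketched roughly as: weight memory (parameter count times bytes per parameter) plus KV-cache memory (which grows linearly with context length) plus a fixed overhead. The function below is a minimal illustrative sketch, not the calculator's actual formula; all parameter names and the example architecture numbers are assumptions for illustration.

```python
def estimate_vram_gb(params_b, bits_per_param, n_layers, n_kv_heads,
                     head_dim, context_len, kv_bits=16, overhead_gb=1.0):
    """Rough VRAM estimate in GiB for loading a model plus its KV cache.

    params_b       -- parameter count in billions
    bits_per_param -- weight precision (16 for fp16/bf16, 4 for 4-bit quant)
    kv_bits        -- precision of cached keys/values
    overhead_gb    -- fixed allowance for activations, CUDA buffers, etc.
    (All names are illustrative, not the calculator's actual inputs.)
    """
    # Weights: one value per parameter at the chosen precision.
    weight_gb = params_b * 1e9 * bits_per_param / 8 / 1024**3
    # KV cache: two tensors (K and V) per layer, per token in the context.
    kv_gb = (2 * n_layers * context_len * n_kv_heads * head_dim
             * kv_bits / 8) / 1024**3
    return weight_gb + kv_gb + overhead_gb

# Hypothetical 7B dense model in fp16 with a 4096-token context:
# ~13 GiB of weights, ~0.5 GiB of KV cache, plus 1 GiB overhead.
print(round(estimate_vram_gb(7, 16, 32, 8, 128, 4096), 1))
```

Note that for a mixture-of-experts model the full parameter count still determines weight memory, even though only a fraction of parameters is active per token.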