Getting Started with SmolLM3‑3B‑GGUF for Long‑Context Multilingual Reasoning
SmolLM3 is a compact 3-billion-parameter transformer that delivers state-of-the-art performance at the 3B–4B scale, with instruction tuning across six major languages and extended contexts of up to 128,000 tokens.
Despite its small footprint, the model is competitive with 4B-class models, making it a practical choice for edge devices. Its long-context reasoning handles inputs of up to 128,000 tokens, such as documents, transcripts, or logs, and its instruction tuning covers English, French, Spanish, German, Italian, and Portuguese, which suits it to multilingual applications.
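Since the weights are distributed in GGUF format, one straightforward way to try the model locally is through llama-cpp-python. The sketch below is a minimal, assumption-laden starting point: the repo id (HuggingFaceTB/SmolLM3-3B-GGUF), the Q4_K_M quantization filename, and the context size are placeholders to check against the actual model card, not confirmed values.

```python
# Minimal sketch for a first chat request against the GGUF weights.
# Repo id, filename pattern, and n_ctx are assumptions; adjust to match
# the files actually published on the model card.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="HuggingFaceTB/SmolLM3-3B-GGUF",  # assumed Hugging Face repo id
    filename="*Q4_K_M.gguf",                  # assumed quantization file pattern
    n_ctx=32768,                              # context window; raise as memory allows
    verbose=False,
)

# A short multilingual prompt to exercise the instruction tuning.
messages = [
    {"role": "system", "content": "You are a concise multilingual assistant."},
    {"role": "user", "content": "Explique en une phrase ce qu'est une fenêtre de contexte de 128k tokens."},
]

out = llm.create_chat_completion(messages=messages, max_tokens=256)
print(out["choices"][0]["message"]["content"])
```

Pushing n_ctx toward the full 128,000-token window raises memory use considerably, so it is usually easier to start with a smaller window and scale up once the model is confirmed to run on your hardware.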