The newest release, OpenVINO 2026.1, is here.
OpenVINO™ Toolkit 2026.1 is available now

Download latest release

Our records indicate you downloaded OpenVINO™ AI inferencing software from Intel in the past. We wanted to make you aware that a new release of the OpenVINO™ toolkit is now available for you to upgrade to. This release expands model coverage while providing performance optimizations and reducing memory consumption.

Key Highlights:

More Gen AI coverage and framework integrations to minimize code changes:
- New models supported: Qwen3 VL (on CPUs and GPUs); GPT-OSS 120B (on CPUs).
- Preview: Introducing the OpenVINO backend for llama.cpp, which enables optimized inference on Intel CPUs, GPUs, and NPUs. Validated on GGUF models such as Llama-3.2-1B-Instruct-GGUF, Phi-3-mini-4k-instruct-gguf, Qwen2.5-1.5B-Instruct-GGUF, and Mistral-7B-Instruct-v0.3.
- New notebook: Unified VLM chatbot with vide...