Ollama Rolls Out Experimental Vulkan Support for AMD and Intel
18 hours ago
- #GPU
- #Ollama
- #Vulkan
- Ollama 0.12.6-rc0 introduces experimental Vulkan API support.
- Vulkan support expands GPU coverage to AMD and Intel hardware where ROCm or SYCL/OpenCL support isn't available.
- Vulkan support is currently available only in source builds; binary releases are planned.
- Ollama is popular for running LLMs like GPT-OSS, DeepSeek-R1, Gemma 3, and Llama 3/4.
- Vulkan support has been in development for roughly a year and a half.
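Since Vulkan currently requires building from source, here is a hedged sketch of the generic, documented Ollama source-build steps. The specific flag or preset that enables the experimental Vulkan backend is not stated in this item and may differ per release; check the 0.12.6-rc0 release notes before relying on this.

```shell
# Generic Ollama source build (requires Go, CMake, and a C/C++ toolchain).
# NOTE: the Vulkan-enabling option is an unknown here; consult the release
# notes for 0.12.6-rc0 for the exact configure flag or environment variable.
git clone https://github.com/ollama/ollama.git
cd ollama
cmake -B build            # configure the native GPU runner libraries
cmake --build build       # compile them
go build .                # build the ollama binary itself
./ollama serve            # start the server with the locally built binary
```

A Vulkan-capable driver and the Vulkan loader must be installed on the system for the backend to be usable at runtime.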