Releases: foldl/chatllm.cpp
v0.16
- As always, more models are supported, notably Janus-Pro.
- Windows: prebuilt binary with Vulkan (1.4.321.1). Use -ngl all to run the whole model on the default GPU (see the example below).
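For reference, a minimal command-line sketch of combining -ngl all with the other options; the model file name and the -m / -i flags are assumptions for illustration, not taken from these notes:

    rem load the whole model onto the default GPU; model.bin is a placeholder path
    main.exe -m model.bin -i -ngl all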
v0.15
- As always, more models are supported.
- Windows: prebuilt binary with Vulkan (1.4.321.1). Use -ngl all to run the whole model on the default GPU.
v0.14
- Fix main_nim.exe: it could not download models larger than 2 GB.
v0.13
- As always, more models are supported.
- Windows: prebuilt binary with Vulkan (1.4.321.1). Use -ngl all to run the whole model on the default GPU.
Update 2025-08-16: chatllm_win_x64.7z has been updated because it contained an outdated main.exe.
v0.12
- As always, more models are supported.
- Multimodal: vision & TTS.
- Windows: prebuilt binary with Vulkan (1.4.304.1). Use -ngl all to run the whole model on the default GPU.
v0.11
- As always, more models are supported.
- Windows: prebuilt binary with Vulkan (1.4.304.1). Use -ngl all to run the whole model on the default GPU.
v0.10
- As always, more models are supported.
- Windows: prebuilt binary with Vulkan (1.4.304.1). Use -ngl all to run the whole model on the default GPU.
v0.9
- As always, more models are supported.
- Windows: prebuilt binary with Vulkan. Use -ngl all to run the whole model on the default GPU.
Note: the Vulkan Runtime (included in SDK 1.4.304.1) is required. Download it from here: https://vulkan.lunarg.com/sdk/home#windows
v0.8
- As always, more models are supported.
- Windows: prebuilt binary with Vulkan. Use -ngl all to run the whole model on the default GPU.
Note: the Vulkan Runtime (included in SDK 1.4.304.1) is required. Download it from here: https://vulkan.lunarg.com/sdk/home#windows
v0.7
- As always, more models are supported.
- Fix some issues (Llama 3.2, etc.).