yzma lets you use Go for local inference and embedding with Vision Language Models (VLMs) and Large Language Models (LLMs) via llama.cpp, without cgo.