sci-misc/llama-cpp (FireBurn)

Package Information

Homepage: https://github.com/ggml-org/llama.cpp
License: MIT

Versions

Version EAPI Keywords Slot
9999 8 ~amd64 0
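
Only the live 9999 ebuild is available, keyworded ~amd64, so the testing keyword must be accepted before Portage will install it. A minimal package.accept_keywords entry might look like the following (the file name under /etc/portage/package.accept_keywords/ is illustrative):

```
# /etc/portage/package.accept_keywords/llama-cpp
sci-misc/llama-cpp ~amd64
```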

Metadata

Raw Metadata XML
<pkgmetadata>
	<maintainer type="person">
		<email>zl29ah@gmail.com</email>
		<name>Sergey Alirzaev</name>
	</maintainer>
	<use>
		<flag name="blis">Build a BLIS backend</flag>
		<flag name="hip">Build a HIP (ROCm) backend</flag>
		<flag name="openblas">Build an OpenBLAS backend</flag>
		<flag name="opencl">Build an OpenCL backend, so far only works on Adreno and Intel GPUs</flag>
	</use>
	<upstream>
		<remote-id type="github">ggml-org/llama.cpp</remote-id>
	</upstream>
</pkgmetadata>

USE Flags

Flag Description 9999
blis Build a BLIS backend
cuda Enable NVIDIA CUDA support ⚠️
curl Add support for client-side URL transfer library ⚠️
hip Build a HIP (ROCm) backend
openblas Build an OpenBLAS backend
opencl Build an OpenCL backend, so far only works on Adreno and Intel GPUs
openmp Build support for OpenMP (parallel computing) ⚠️
vulkan Enable Vulkan graphics support ⚠️
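
The backend flags above are independent and can be enabled per-package. As a sketch, a package.use entry enabling the Vulkan backend might look like this (the file name under /etc/portage/package.use/ is illustrative):

```
# /etc/portage/package.use/llama-cpp
sci-misc/llama-cpp vulkan
```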
