sci-misc/llama-cpp (FireBurn)

Install

Install this package:

emerge -a sci-misc/llama-cpp

If the package is masked (the only version is the live 9999 ebuild, keyworded ~amd64), you can unmask it with the autounmask tool or with standard emerge options:

autounmask sci-misc/llama-cpp

Alternatively:

emerge --autounmask-write -a sci-misc/llama-cpp
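You can also accept the live ebuild's keywords by hand. A minimal sketch of the corresponding Portage config entry (the file name under package.accept_keywords is arbitrary):

```
# /etc/portage/package.accept_keywords/llama-cpp
# Accept the live (9999) ebuild for this package
sci-misc/llama-cpp **
```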

Package Information

Description:
Port of Facebook's LLaMA model in C/C++
Homepage:
https://github.com/ggml-org/llama.cpp
License:
MIT

Versions

Version EAPI Keywords Slot
9999 8 ~amd64 0

Metadata

Raw Metadata XML
<pkgmetadata>
	<maintainer type="person">
		<email>zl29ah@gmail.com</email>
		<name>Sergey Alirzaev</name>
	</maintainer>
	<use>
		<flag name="blis">Build a BLIS backend</flag>
		<flag name="hip">Build a HIP (ROCm) backend</flag>
		<flag name="openblas">Build an OpenBLAS backend</flag>
		<flag name="opencl">Build an OpenCL backend, so far only works on Adreno and Intel GPUs</flag>
	</use>
	<upstream>
		<remote-id type="github">ggml-org/llama.cpp</remote-id>
	</upstream>
</pkgmetadata>

USE Flags

Manage flags for this package with gentoolkit's euse:

euse -i <flag> -p sci-misc/llama-cpp   (show flag info)
euse -E <flag> -p sci-misc/llama-cpp   (enable flag)
euse -D <flag> -p sci-misc/llama-cpp   (disable flag)

Flag Description 9999
blis Build a BLIS backend
cuda Enable NVIDIA CUDA support ⚠️
curl ⚠️
hip Build a HIP (ROCm) backend
openblas Build an OpenBLAS backend
opencl Build an OpenCL backend, so far only works on Adreno and Intel GPUs
openmp ⚠️
vulkan Enable Vulkan graphics support ⚠️

Flags marked ⚠️ have no local description in the package's metadata.xml; where a description is shown for them, it comes from the global USE flag list.
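To pin USE flags for this package, a typical /etc/portage/package.use entry looks like the following sketch (the particular flag choice here is illustrative, not a recommendation):

```
# /etc/portage/package.use/llama-cpp
# Example only: enable the OpenBLAS and Vulkan backends, disable CUDA
sci-misc/llama-cpp openblas vulkan -cuda
```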
