sci-ml/ollama (tamiko)


Package Information

Description:
Get up and running with Llama 3, Mistral, Gemma, and other language models.
Homepage:
https://ollama.com

Versions

Version  EAPI  Keywords  Slot
9999     8     ~amd64    0
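Only the live (9999) ebuild is available, keyworded ~amd64, so on a stable system it has to be unmasked before it can be installed. A minimal sketch, assuming a standard Portage layout (the file name under /etc/portage/package.accept_keywords/ is arbitrary):

```
# /etc/portage/package.accept_keywords/ollama  (hypothetical file name)
# "**" accepts any keyword, including the unkeyworded live ebuild
sci-ml/ollama **
```

Note that live ebuilds rebuild from the upstream git HEAD on each emerge, so results are not reproducible across installs.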

Metadata

Upstream

GitHub: ollama/ollama
Raw Metadata XML
<pkgmetadata>
	<use>
		<flag name="cuda">Enable NVIDIA CUDA support</flag>
		<flag name="mkl">Use <pkg>sci-libs/mkl</pkg> for blas, lapack and sparse blas routines</flag>
		<flag name="rocm">Enable ROCm gpu computing support</flag>
	</use>
	<upstream>
		<remote-id type="github">ollama/ollama</remote-id>
	</upstream>
</pkgmetadata>

Lint Warnings

USE Flags

Flag             Description                                                     9999
${CPU_FLAGS[*]}  No description in metadata.xml (the flag name appears to be an unexpanded shell variable leaked into IUSE)  ⚠️
blas             No description in metadata.xml                                  ⚠️
cuda             Enable NVIDIA CUDA support
mkl              Use sci-libs/mkl for BLAS, LAPACK and sparse BLAS routines
rocm             Enable ROCm GPU computing support
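The GPU backends described above are selected per-package through USE flags. A minimal sketch of enabling the CUDA backend, assuming standard Portage config paths (the file name under /etc/portage/package.use/ is arbitrary):

```
# /etc/portage/package.use/ollama  (hypothetical file name)
# Enable the NVIDIA CUDA backend; for AMD GPUs use "rocm -cuda" instead
sci-ml/ollama cuda
```

After editing the file, re-emerge the package so the new flag set takes effect.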

Files