sci-ml/ollama (inode64-overlay)

Install

Install this package:

emerge -a sci-ml/ollama

If the package is masked, you can unmask it using the autounmask tool or standard emerge options:

autounmask sci-ml/ollama

Alternatively:

emerge --autounmask-write -a sci-ml/ollama
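Since both versions listed below are keyworded ~amd64, the mask can also be lifted manually with a package.accept_keywords entry instead of autounmask. A minimal sketch (the file name under the directory is arbitrary):

```shell
# /etc/portage/package.accept_keywords/ollama
# Accept the testing (~amd64) keyword for this package only.
sci-ml/ollama ~amd64

# To additionally allow the live 9999 ebuild, use ** instead:
# sci-ml/ollama **
```

After writing the entry, re-run `emerge -a sci-ml/ollama`.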

Package Information

Description:
Get up and running with Llama 3, Mistral, Gemma, and other language models.
Homepage:
https://ollama.com
License:
MIT

Versions

Version EAPI Keywords Slot
9999 8 ~amd64 0
0.23.0 8 ~amd64 0

Metadata

Raw Metadata XML
<pkgmetadata>
	<maintainer type="person">
		<email>negril.nx+gentoo@gmail.com</email>
		<name>Paul Zander</name>
	</maintainer>
	<use>
		<flag name="blis">Use <pkg>sci-libs/blis</pkg> as blas provider</flag>
		<flag name="cuda">Enable NVIDIA CUDA support</flag>
		<flag name="flexiblas">Use <pkg>sci-libs/flexiblas</pkg> as blas provider</flag>
		<flag name="mkl">Use <pkg>sci-libs/mkl</pkg> for blas, lapack and sparse blas routines</flag>
		<flag name="openblas">Use <pkg>sci-libs/openblas</pkg> as blas provider</flag>
		<flag name="rocm">Enable ROCm gpu computing support</flag>
	</use>
	<upstream>
		<remote-id type="github">ollama/ollama</remote-id>
	</upstream>
</pkgmetadata>

USE Flags

Manage flags for this package with euse (from app-portage/gentoolkit):

euse -i <flag> -p sci-ml/ollama   (show information about the flag)
euse -E <flag> -p sci-ml/ollama   (enable the flag)
euse -D <flag> -p sci-ml/ollama   (disable the flag)

Flag Description 9999 0.23.0
blas ⚠️ (no description in metadata.xml)
blis Use sci-libs/blis as BLAS provider
cuda Enable NVIDIA CUDA support
flexiblas Use sci-libs/flexiblas as BLAS provider
mkl Use sci-libs/mkl for BLAS, LAPACK and sparse BLAS routines
openblas Use sci-libs/openblas as BLAS provider
rocm Enable ROCm GPU computing support
vulkan ⚠️ (no description in metadata.xml)
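USE flags can also be set persistently in package.use rather than toggled with euse. A sketch for a machine with an NVIDIA GPU (the flag combination is illustrative; pick rocm instead of cuda on AMD hardware):

```shell
# /etc/portage/package.use/ollama
# Enable CUDA acceleration and select OpenBLAS as the BLAS provider.
sci-ml/ollama cuda openblas
```

Rebuild with `emerge -a sci-ml/ollama` for the new flags to take effect.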

Files

Manifest

Type File Size Versions
DIST ollama-0.23.0-deps.tar.xz 84650388 bytes 0.23.0
DIST ollama-0.23.0.gh.tar.gz 28625720 bytes 0.23.0