<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Embeddable on Arran's github.io</title><link>https://arran4.github.io/star-tags/embeddable/</link><description>Recent content in Embeddable on Arran's github.io</description><generator>Hugo -- gohugo.io</generator><language>en</language><lastBuildDate>Thu, 16 Apr 2026 22:15:00 +0000</lastBuildDate><atom:link href="https://arran4.github.io/star-tags/embeddable/index.xml" rel="self" type="application/rss+xml"/><item><title>quantumaikr/quant.cpp</title><link>https://arran4.github.io/stars/r_kgdorzfyew/</link><pubDate>Thu, 16 Apr 2026 22:15:00 +0000</pubDate><guid>https://arran4.github.io/stars/r_kgdorzfyew/</guid><description>&lt;p>LLM inference with 7x longer context. Pure C, zero dependencies. Lossless KV cache compression + single-header library.&lt;/p></description></item></channel></rss>