08:16 CET · Wednesday · May 13, 2026

shipfeed

on the wire
§ tools · cluster

llama.cpp b9127

today · primary fetch · 1 source · cluster b16c107e · updated today

opencl: add opt-in Adreno xmem F16xF32 GEMM for prefill (#22755)

- ggml-opencl: add Adreno xmem F16xF32 GEMM for prefill
- ggml-opencl: address Adreno xmem review comments
- ggml-opencl: align xmem gemm kernel naming

Co-authored-by: Your Name

Build artifacts:
- macOS/iOS: macOS Apple Silicon (arm64) · macOS Apple Silicon (arm64, KleidiAI enabled) · macOS Intel (x64) · iOS XCFramework
- Linux: Ubuntu x64 (CPU) · Ubuntu arm64 (CPU) · Ubuntu s390x (CPU) · Ubuntu x64 (Vulkan) · Ubuntu arm64 (Vulkan) · Ubuntu x64 (ROCm 7.2) · Ubuntu x64 (OpenVINO) · Ubuntu x64 (SYCL FP32) · Ubuntu x64 (SYCL FP16)
- Android: Android arm64 (CPU)
- Windows: Windows x64 (CPU) · Windows arm64 (CPU) · Windows x64 (CUDA 12) - CUDA 12.4 DLLs · Windows x64 (CUDA 13) - CUDA 13.1 DLLs · Windows x64 (Vulkan) · Windows x64 (SYCL) · Windows x64 (HIP)
- openEuler: openEuler x86 (310p) · openEuler x86 (910b, ACL Graph) · openEuler aarch64 (310p) · openEuler aarch64 (910b, ACL Graph)
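The headline change is an opt-in GEMM path for Adreno GPUs in the OpenCL backend. As a minimal sketch of how such a build might be configured: llama.cpp's standard CMake options `GGML_OPENCL` and `GGML_OPENCL_USE_ADRENO_KERNELS` are real existing flags, but the release note does not name the xmem-specific opt-in switch, so only the general backend setup is shown here.

```shell
# Sketch: configure llama.cpp with the OpenCL backend and
# Adreno-tuned kernels. GGML_OPENCL enables the backend;
# GGML_OPENCL_USE_ADRENO_KERNELS selects the Adreno kernel set.
# The new xmem F16xF32 prefill GEMM is opt-in per the release note,
# but its specific toggle is not named there, so it is not shown.
cmake -B build -DGGML_OPENCL=ON -DGGML_OPENCL_USE_ADRENO_KERNELS=ON
cmake --build build --config Release
```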

read full article on github.com
§ sources · 3 publications · timeline below
  1. github.com · llama.cpp b9127 · primary
  2. github.com · llama.cpp b9128
  3. github.com · llama.cpp b9123

§ how this story moved

  1. primary · llama.cpp — Releases publishes the launch post.
  2. llama.cpp — Releases picks up coverage.
  3. llama.cpp — Releases picks up coverage.