CVE-2026-34159

Publication date 1 April 2026

Last updated 14 May 2026


Ubuntu priority

CVSS v3 severity score

9.8 · Critical


Description

llama.cpp is a C/C++ inference engine for several LLM models. Prior to version b8492, the RPC backend's deserialize_tensor() skips all bounds validation when a tensor's buffer field is 0. An unauthenticated attacker can read and write arbitrary process memory via crafted GRAPH_COMPUTE messages. Combined with pointer leaks from ALLOC_BUFFER/BUFFER_GET_BASE, this gives a full ASLR bypass and remote code execution. No authentication is required; the attacker only needs TCP access to the RPC server port. This issue has been patched in version b8492.

Status

Package     Ubuntu release        Status
llama.cpp   26.04 LTS resolute    Not affected
            25.10 questing        Needs evaluation
            24.04 LTS noble       Not in release
            22.04 LTS jammy       Not in release

Severity score breakdown

Parameter Value
Base score 9.8 · Critical
Attack vector Network
Attack complexity Low
Privileges required None
User interaction None
Scope Unchanged
Confidentiality impact High
Integrity impact High
Availability impact High
Vector CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
