arXiv Open Access 2026

LLM-42: Enabling Determinism in LLM Inference with Verified Speculation

Raja Gond, Aditya K Kamath, Ramachandran Ramjee, Ashish Panwar

Abstract

In LLM inference, the same prompt may yield different outputs across different runs. At the system level, this non-determinism arises from floating-point non-associativity combined with dynamic batching and GPU kernels whose reduction orders vary with batch size. A straightforward way to eliminate non-determinism is to disable dynamic batching during inference, but doing so severely degrades throughput. Another approach is to make kernels batch-invariant; however, this tightly couples determinism to kernel design, requiring new implementations. This coupling also imposes fixed runtime overheads, regardless of how much of the workload actually requires determinism. Inspired by ideas from speculative decoding, we present LLM-42, a scheduling-based approach to enable determinism in LLM inference. Our key observation is that if a sequence is in a consistent state, the next emitted token is likely to be consistent even with dynamic batching. Moreover, most GPU kernels use shape-consistent reductions. Leveraging these insights, LLM-42 decodes tokens using a non-deterministic fast path and enforces determinism via a lightweight verify-rollback loop. The verifier replays candidate tokens under a fixed-shape reduction schedule, commits those that are guaranteed to be consistent across runs, and rolls back those violating determinism. LLM-42 mostly re-uses existing kernels unchanged and incurs overhead only in proportion to the traffic that requires determinism.
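To make the mechanism in the abstract concrete, below is a minimal, self-contained sketch of the verify-rollback idea on a toy "model" (all names such as `fast_path`, `verify_and_commit`, and the token mapping are illustrative assumptions, not the authors' implementation). The first two lines also demonstrate the root cause the abstract cites: floating-point addition is not associative, so a reduction whose order varies with batch size can change the emitted token. The fast path here shuffles the reduction order to simulate dynamic batching; the verifier replays each candidate under a single fixed (canonical) reduction schedule, commits matches, and rolls back from the first mismatch.

```python
import random
import struct

# Floating-point non-associativity: the same three addends, two answers.
a = (0.1 + 0.2) + 0.3   # 0.6000000000000001
b = 0.1 + (0.2 + 0.3)   # 0.6
assert a != b

CONTRIBS = [0.1, 0.2, 0.3, 0.4]            # toy per-position contributions
CANONICAL = list(range(len(CONTRIBS)))     # fixed-shape reduction schedule

def decode_token(position, order):
    # Toy "kernel": reduces contributions in `order`. Non-associativity
    # makes the bit pattern of the sum, and hence the token, order-dependent.
    s = float(position)
    for i in order:
        s += CONTRIBS[i]
    bits = struct.unpack("<Q", struct.pack("<d", s))[0]
    return bits % 1000

def fast_path(prefix, n, rng):
    # Non-deterministic fast path: the reduction order varies per step,
    # standing in for batch-size-dependent kernel behavior.
    toks = []
    for _ in range(n):
        order = CANONICAL[:]
        rng.shuffle(order)
        toks.append(decode_token(len(prefix) + len(toks), order))
    return toks

def verify_and_commit(prefix, candidates):
    # Verifier: replay each candidate under the canonical schedule;
    # commit the longest matching prefix, roll back the rest.
    committed = list(prefix)
    for tok in candidates:
        if tok != decode_token(len(committed), CANONICAL):
            break                        # rollback from first mismatch
        committed.append(tok)
    return committed

def deterministic_generate(n, seed):
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        cand = fast_path(out, min(4, n - len(out)), rng)
        new = verify_and_commit(out, cand)
        if len(new) == len(out):
            # Everything rolled back: emit one token under the
            # canonical schedule so progress is always made.
            out.append(decode_token(len(out), CANONICAL))
        else:
            out = new
    return out[:n]
```

In this sketch, two runs with different fast-path behavior (different seeds) commit identical token sequences, because only tokens that match the fixed-schedule replay are ever committed; the cost of verification scales with the candidate traffic, mirroring the paper's claim that overhead is proportional to the workload that needs determinism.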

Topics & Keywords

Authors (4)

Raja Gond

Aditya K Kamath

Ramachandran Ramjee

Ashish Panwar

Citation Format

Gond, R., Kamath, A. K., Ramjee, R., & Panwar, A. (2026). LLM-42: Enabling Determinism in LLM Inference with Verified Speculation. arXiv. https://arxiv.org/abs/2601.17768

Journal Information
Year Published
2026
Language
en
Source Database
arXiv
Access
Open Access ✓