arXiv Open Access 2025

Fast LLM Post-training via Decoupled and Fastest-of-N Speculation

Rongxin Cheng, Kai Zhou, Xingda Wei, Siyuan Liu, Mingcong Han, and 6 others

Abstract

Rollout dominates the training time in large language model (LLM) post-training, where the trained model is used to generate tokens given a batch of prompts. This work, SpecActor, achieves fast rollout with speculative decoding, which deploys a fast draft path to accelerate the otherwise unparallelizable generation while guaranteeing correctness through fast parallel verification of the outputs with the original model. SpecActor addresses two foundational challenges that hinder speculation efficiency: (1) a Decoupled speculation method that overcomes the compute inefficiency of running speculative decoding with a relatively large per-worker batch size -- a common configuration in training that is unfriendly to speculation, and (2) a Fastest-of-N speculation method that selects and combines different draft methods according to rollout progress, approximating the optimal draft method even when the best one is unknown a priori. Extensive evaluations on production traces show that SpecActor accelerates mean rollout speed by 2.0--2.4x, with up to 2.7x speedup, over common post-training baselines. The results are consistent across both dense and MoE models and across different RL algorithms. Notably, SpecActor is 1.1--2.6x faster than vanilla speculative rollout on different traces. The accelerated rollout yields 1.4--2.3x faster end-to-end training time.
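The abstract describes the Fastest-of-N idea only at a high level: several candidate draft methods are available, the best one is unknown a priori, and the system keeps switching toward whichever is currently fastest as the rollout progresses. The sketch below is a minimal, hypothetical illustration of such a selection loop, assuming the scheduler tracks each draft method's recent accepted-token throughput and picks the current leader; none of the names (DraftMethod, FastestOfNSelector, simulate_rollout) correspond to SpecActor's actual implementation.

```python
# Hypothetical sketch of a Fastest-of-N draft-method selector.
# Each candidate draft path records its recent accepted-token throughput,
# and the scheduler repeatedly picks whichever method is currently fastest.
import random
from dataclasses import dataclass, field


@dataclass
class DraftMethod:
    """A candidate draft path with a running estimate of its throughput."""
    name: str
    tokens_per_step: list[float] = field(default_factory=list)

    def record(self, accepted_tokens: int, latency_s: float) -> None:
        # Throughput of one draft + verification step, in accepted tokens/s.
        self.tokens_per_step.append(accepted_tokens / max(latency_s, 1e-6))

    def recent_speed(self, window: int = 8) -> float:
        if not self.tokens_per_step:
            return float("inf")  # untried methods get explored first
        recent = self.tokens_per_step[-window:]
        return sum(recent) / len(recent)


class FastestOfNSelector:
    """Pick the draft method with the best recent accepted-token throughput."""

    def __init__(self, methods: list[DraftMethod]):
        self.methods = methods

    def pick(self) -> DraftMethod:
        return max(self.methods, key=lambda m: m.recent_speed())


def simulate_rollout(steps: int = 50) -> None:
    # Stand-in draft methods; in a real system these would be, e.g., an
    # n-gram drafter, a small draft model, or self-speculation.
    methods = [DraftMethod("ngram"), DraftMethod("small_model"), DraftMethod("self_spec")]
    selector = FastestOfNSelector(methods)
    rng = random.Random(0)

    for _ in range(steps):
        method = selector.pick()
        # Placeholder for a real draft + parallel verification step:
        # sample how many drafted tokens the target model accepted.
        accepted = rng.randint(1, 8)
        latency = 0.01 + 0.005 * rng.random()
        method.record(accepted, latency)

    for m in methods:
        print(f"{m.name}: used {len(m.tokens_per_step)} times, "
              f"recent speed ~{m.recent_speed():.1f} tok/s")


if __name__ == "__main__":
    simulate_rollout()
```

The key design point the abstract hints at is that the ranking of draft methods can change over the rollout, so the selection must be revisited continuously rather than fixed once up front; the sliding-window speed estimate above is one simple, assumed way to capture that.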

Topics & Keywords

Authors (11)

Rongxin Cheng
Kai Zhou
Xingda Wei
Siyuan Liu
Mingcong Han
Mingjing Ai
Yeju Zhou
Baoquan Zhong
Wencong Xiao
Rong Chen
Haibo Chen

Citation Format

Cheng, R., Zhou, K., Wei, X., Liu, S., Han, M., Ai, M., et al. (2025). Fast LLM Post-training via Decoupled and Fastest-of-N Speculation. arXiv:2511.16193. https://arxiv.org/abs/2511.16193

Journal Information
Publication Year: 2025
Language: en
Source Database: arXiv
Access: Open Access ✓