arXiv Open Access 2025

FlowSpec: Continuous Pipelined Speculative Decoding for Efficient Distributed LLM Inference

Xing Liu Lizhuo Luo Ming Tang Chao Huang Xu Chen

Abstract

Distributed inference is a promising approach to enabling large language model (LLM) inference at the network edge: it partitions the inference process across multiple devices so that the LLM fits into device memory. Recent pipeline-based approaches can parallelize communication and computation, which helps reduce inference latency. However, this benefit diminishes when inference requests at the network edge are sparse, leaving the pipeline at low utilization. To enable efficient distributed LLM inference at the edge, we propose \textbf{FlowSpec}, a pipeline-parallel tree-based speculative decoding framework. FlowSpec incorporates three key mechanisms to improve decoding efficiency: 1) score-based step-wise verification, which prioritizes more important draft tokens so that tokens are accepted earlier; 2) efficient draft management, which prunes invalid tokens while preserving correct causal relationships during verification; and 3) dynamic draft expansion strategies, which supply high-quality speculative inputs. These techniques work in concert to improve both pipeline utilization and speculative efficiency. We evaluate FlowSpec on a real-world testbed against several baselines. Experimental results demonstrate that our framework significantly improves inference speed across diverse models and configurations, achieving speedups of 1.37$\times$-1.73$\times$ over the baselines. Our code is publicly available at \href{https://github.com/Leosang-lx/FlowSpec#}{https://github.com/Leosang-lx/FlowSpec\#}.
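The abstract's first mechanism, score-based step-wise verification, can be illustrated with a toy sketch: draft tokens are ordered by their confidence scores and verified in small steps, so high-scoring candidates are checked (and thus accepted) earlier. The function name, the `step_size` parameter, and the stand-in membership check for the target model are all illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of score-based step-wise verification (not the
# paper's implementation). Draft tokens are verified in descending-score
# order, a few per step, so likely tokens are accepted earlier.

def step_wise_verify(draft_tokens, scores, target_accepts, step_size=2):
    """Verify draft tokens in descending-score order, in steps.

    draft_tokens:   candidate tokens proposed by the draft model
    scores:         confidence score for each draft token
    target_accepts: tokens the target model would accept (toy stand-in
                    for a real target-model forward pass)
    Returns the accepted tokens in the order they were verified.
    """
    # Order candidates so high-score tokens enter verification first.
    order = sorted(range(len(draft_tokens)), key=lambda i: -scores[i])
    accepted = []
    # Process the ordered candidates a step (chunk) at a time.
    for start in range(0, len(order), step_size):
        for i in order[start:start + step_size]:
            if draft_tokens[i] in target_accepts:
                accepted.append(draft_tokens[i])
    return accepted
```

For example, with tokens `["a", "b", "c", "d"]`, scores `[0.9, 0.1, 0.8, 0.5]`, and a target model that accepts `{"a", "c"}`, the two accepted tokens surface in the very first verification step rather than after all four candidates are checked.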


Authors (5)

Xing Liu
Lizhuo Luo
Ming Tang
Chao Huang
Xu Chen

Citation Format

Liu, X., Luo, L., Tang, M., Huang, C., & Chen, X. (2025). FlowSpec: Continuous Pipelined Speculative Decoding for Efficient Distributed LLM Inference. https://arxiv.org/abs/2507.02620

Journal Information

Publication Year: 2025
Language: en
Source Database: arXiv
Access: Open Access ✓