arXiv Open Access 2026

Simulation Distillation: Pretraining World Models in Simulation for Rapid Real-World Adaptation

Jacob Levy, Tyler Westenbroek, Kevin Huang, Fernando Palafox, Patrick Yin, +4 others

Abstract

Simulation-to-real transfer remains a central challenge in robotics, as mismatches between simulated and real-world dynamics often lead to failures. While reinforcement learning offers a principled mechanism for adaptation, existing sim-to-real finetuning methods struggle with exploration and long-horizon credit assignment in the low-data regimes typical of real-world robotics. We introduce Simulation Distillation (SimDist), a sim-to-real framework that distills structural priors from a simulator into a latent world model and enables rapid real-world adaptation via online planning and supervised dynamics finetuning. By transferring reward and value models directly from simulation, SimDist provides dense planning signals from raw perception without requiring value learning during deployment. As a result, real-world adaptation reduces to short-horizon system identification, avoiding long-horizon credit assignment and enabling fast, stable improvement. Across precise manipulation and quadruped locomotion tasks, SimDist substantially outperforms prior methods in data efficiency, stability, and final performance. Project website and code: https://sim-dist.github.io/
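The adaptation loop described in the abstract — plan online against reward and value heads transferred frozen from simulation, and finetune only the latent dynamics with a supervised loss on real transitions — can be sketched minimally. This is an illustrative stand-in, not the authors' implementation: the linear latent model, the random-shooting planner, and all dimensions here are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim, action_dim, horizon = 8, 2, 5

# Stand-ins for components pretrained in simulation:
W_dyn = rng.normal(scale=0.1, size=(latent_dim, latent_dim + action_dim))  # finetuned on real data
w_rew = rng.normal(size=latent_dim)  # reward head, frozen after sim pretraining
w_val = rng.normal(size=latent_dim)  # value head, frozen after sim pretraining

def step(z, a):
    """One latent dynamics step (linear model for illustration)."""
    return W_dyn @ np.concatenate([z, a])

def plan(z, num_samples=64):
    """Random-shooting planner: sample action sequences, roll them out in
    latent space, and score with the frozen reward head plus a terminal
    value bootstrap. Returns the first action of the best sequence."""
    best_score, best_a0 = -np.inf, None
    for _ in range(num_samples):
        actions = rng.normal(size=(horizon, action_dim))
        z_t, score = z, 0.0
        for a in actions:
            z_t = step(z_t, a)
            score += w_rew @ z_t
        score += w_val @ z_t  # terminal value closes the horizon
        if score > best_score:
            best_score, best_a0 = score, actions[0]
    return best_a0

def finetune_step(z, a, z_next, lr=1e-2):
    """Real-world adaptation as short-horizon system identification:
    a supervised one-step regression on the dynamics only — no value
    learning and no long-horizon credit assignment."""
    global W_dyn
    x = np.concatenate([z, a])
    err = W_dyn @ x - z_next
    W_dyn -= lr * np.outer(err, x)  # gradient step on 0.5 * ||err||^2
    return float(err @ err)
```

Because only `finetune_step` touches learned parameters at deployment time, each real-world transition yields a dense, stable supervised signal, which is the mechanism the abstract credits for fast adaptation.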


Authors (9)

Jacob Levy
Tyler Westenbroek
Kevin Huang
Fernando Palafox
Patrick Yin
Shayegan Omidshafiei
Dong-Ki Kim
Abhishek Gupta
David Fridovich-Keil

Citation Format

Levy, J., Westenbroek, T., Huang, K., Palafox, F., Yin, P., Omidshafiei, S. et al. (2026). Simulation Distillation: Pretraining World Models in Simulation for Rapid Real-World Adaptation. https://arxiv.org/abs/2603.15759

Journal Information
Year Published: 2026
Language: en
Source Database: arXiv
Access: Open Access ✓