
No Other Representation Component Is Needed: Diffusion Transformers Can Provide Representation Guidance by Themselves


Abstract

Recent studies have demonstrated that learning a meaningful internal representation can accelerate generative training. However, existing approaches either introduce an off-the-shelf external representation task or rely on a large-scale, pre-trained external representation encoder to provide representation guidance during training. In this study, we posit that the discriminative process inherent to diffusion transformers enables them to offer such guidance without external representation components. We propose Self-Representation Alignment (SRA), a simple yet effective method that obtains representation guidance from the internal representations of the diffusion transformer itself. SRA aligns the latent representation of an earlier layer, conditioned on higher noise, to that of a later layer, conditioned on lower noise, progressively enhancing overall representation learning; this alignment is applied only during training. Experimental results indicate that applying SRA to DiTs and SiTs yields consistent performance improvements and largely outperforms approaches that rely on an auxiliary representation task. Our approach achieves performance comparable to methods that depend on an external pre-trained representation encoder, demonstrating that representation alignment within diffusion transformers themselves is a feasible way to accelerate training.
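The abstract describes SRA only at a high level. As a rough illustration of the idea, the PyTorch-style sketch below shows one way such a self-alignment term could look: an earlier layer's hidden states under higher noise are pulled toward a later layer's hidden states under lower noise, with gradients blocked on the target branch. Everything specific here is a hedged assumption rather than the authors' implementation: the `forward_features` hook, the `add_noise` helper, the layer indices, the projection head, and the loss weight are all hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def add_noise(x0: torch.Tensor, noise: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
    """Hypothetical noising step: linear interpolation between data and noise.

    t is a per-sample scalar in [0, 1]; larger t means more noise.
    The paper's actual noise schedule may differ.
    """
    t = t.view(-1, 1, 1)  # broadcast over (tokens, channels)
    return (1.0 - t) * x0 + t * noise


class SRAAlignment(nn.Module):
    """Sketch of an SRA-style self-alignment loss (assumed details)."""

    def __init__(self, hidden_dim: int, early_layer: int = 4,
                 late_layer: int = 8, weight: float = 0.5):
        super().__init__()
        self.early_layer = early_layer  # student features: earlier block, higher noise
        self.late_layer = late_layer    # target features: later block, lower noise
        self.weight = weight
        # Small projection head mapping student features into the target space.
        self.proj = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, model, target_model, x0, noise, t_high, t_low):
        # Noisy latents at two noise levels (t_high > t_low elementwise).
        x_high = add_noise(x0, noise, t_high)
        x_low = add_noise(x0, noise, t_low)

        # Student branch: hidden states of an earlier layer under higher noise.
        # `forward_features` is a hypothetical hook returning per-layer states.
        h_student = model.forward_features(x_high, t_high)[self.early_layer]

        # Target branch: hidden states of a later layer under lower noise,
        # taken from a frozen copy (e.g. EMA weights) so no gradient flows back.
        with torch.no_grad():
            h_target = target_model.forward_features(x_low, t_low)[self.late_layer]

        # Negative cosine similarity between projected student and target tokens.
        z = F.normalize(self.proj(h_student), dim=-1)
        y = F.normalize(h_target, dim=-1)
        return -self.weight * (z * y).sum(dim=-1).mean()
```

In a training loop, a term like this would simply be added to the ordinary denoising objective; nothing changes at sampling time, consistent with the abstract's statement that the guidance is used only during training.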


Authors (9)

Dengyang Jiang
Mengmeng Wang
Liuzhuozheng Li
Lei Zhang
Haoyu Wang
Wei Wei
Guang Dai
Yanning Zhang
Jingdong Wang

Citation Format

Jiang, D., Wang, M., Li, L., Zhang, L., Wang, H., Wei, W., et al. (2025). No Other Representation Component Is Needed: Diffusion Transformers Can Provide Representation Guidance by Themselves. arXiv. https://arxiv.org/abs/2505.02831

Journal Information
Publication Year
2025
Language
en
Source Database
arXiv
Access
Open Access ✓