"Directed Acyclic Transformer Pre-training for High-quality Non-autoregressive Text Generation"

Cat: cs.CL

Link: arxiv.org/pdf/2304.11791v1 (https://arxiv.org/pdf/2304.11791v1)

https://nitter.moomoo.me/ArXivGPT/status/1650981642138447872#m

