Investigating Pretrained Language Models for Graph-to-Text Generation
We investigate two recent pretrained language models (PLMs) and analyze the impact of different task-adaptive pretraining strategies on their performance in graph-to-text generation.
EMNLP | NLP4ConvAI | WS