1. J. Zhang, Y. Zhao, M. Saleh, and P. J. Liu, "PEGASUS: Pre-training with extracted gap-sentences for abstractive summarization," in Proc. 37th Int. Conf. Mach. Learn., 2020.
2. C. Elkan, "Using the triangle inequality to accelerate k-means," in Proc. 20th Int. Conf. Mach. Learn., 2003.
3. M. Lewis et al., "BART: Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension," in Proc. 58th Annu. Meeting Assoc. Comput. Linguistics, 2020.
4. K. M. Hermann et al., "Teaching machines to read and comprehend," in Proc. Adv. Neural Inf. Process. Syst., 2015.
5. R. Zhang, P. Isola, and A. A. Efros, "Colorful image colorization," in Proc. Eur. Conf. Comput. Vis., 2016.