1. Maric et al. Whole-brain tissue mapping toolkit using large-scale highly multiplexed immunofluorescence imaging and deep neural networks. Nat Commun, 2021.
2. Radford et al. Improving language understanding by generative pre-training. OpenAI technical report, 2018.
3. Radford et al. Language models are unsupervised multitask learners. OpenAI Blog, 2019.
4. Devlin et al. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv:1810.04805, 2018.
5. van den Oord et al. Representation learning with contrastive predictive coding. arXiv:1807.03748, 2018.