1. Hussam Alkaissi and Samy I. McFarlane. 2023. Artificial Hallucinations in ChatGPT: Implications in Scientific Writing. Cureus 15, 2 (2023), e35179.
2. Kenn Amdahl. 1991. There Are No Electrons: Electronics for Earthlings. Clearwater Publishing Company, Incorporated.
3. Iz Beltagy, Kyle Lo, and Arman Cohan. 2019. SciBERT: A Pretrained Language Model for Scientific Text. arXiv:1903.10676 [cs]. http://arxiv.org/abs/1903.10676
4. Emily M. Bender, Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell. 2021. On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (FAccT '21). Association for Computing Machinery, New York, NY, USA, 610–623. https://doi.org/10.1145/3442188.3445922
5. Tom Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared D Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, Sandhini Agarwal, Ariel Herbert-Voss, Gretchen Krueger, Tom Henighan, Rewon Child, Aditya Ramesh, Daniel Ziegler, Jeffrey Wu, Clemens Winter, Chris Hesse, Mark Chen, Eric Sigler, Mateusz Litwin, Scott Gray, Benjamin Chess, Jack Clark, Christopher Berner, Sam McCandlish, Alec Radford, Ilya Sutskever, and Dario Amodei. 2020. Language Models are Few-Shot Learners. In Advances in Neural Information Processing Systems, H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H. Lin (Eds.). Vol. 33. Curran Associates, Inc., 1877–1901. https://proceedings.neurips.cc/paper_files/paper/2020/file/1457c0d6bfcb4967418bfb8ac142f64a-Paper.pdf