1. Great power, great responsibility: Recommendations for reducing energy for training language models;McDonald,2022
2. Google cloud doubles down on nvidia gpus for inference;Freund,2019
3. Ai and Efficiency;Hernandez,2020
4. Amazon ec2 update-infl instances with aws inferentia chips for high performance cost-effective inferencing;Barr,2019
5. Aws to offer nvidia’s t4 gpus for ai inferencing;Leopold,2019