Authors: Zhou Xingyu, Canady Robert, Li Yi, Koutsoukos Xenofon, Gokhale Aniruddha
Publisher: Springer International Publishing
References (13 articles)
1. GridLAB-D taxonomy feeders. https://github.com/gridlab-d/Taxonomy_Feeders (2015). Accessed October 2019
2. Athalye, A., Carlini, N., Wagner, D.: Obfuscated gradients give a false sense of security: circumventing defenses to adversarial examples. arXiv preprint arXiv:1802.00420 (2018)
3. Blasch, E., Bernstein, D., Rangaswamy, M.: Introduction to dynamic data driven applications systems. In: Blasch, E., Ravela, S., Aved, A. (eds.) Handbook of Dynamic Data Driven Applications Systems, pp. 1–25. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-95504-9_1
4. Broll, B., Whitaker, J.: DeepForge: an open source, collaborative environment for reproducible deep learning (2017)
5. Carlini, N., Wagner, D.: Adversarial examples are not easily detected: bypassing ten detection methods. In: Proceedings of the 10th ACM Workshop on Artificial Intelligence and Security, pp. 3–14. ACM (2017)
Cited by: 4 articles