1. Courbariaux, M., Bengio, Y., David, J.P.: BinaryConnect: training deep neural networks with binary weights during propagations. In: Advances in Neural Information Processing Systems, vol. 28. Curran Associates, Inc. (2015). https://dl.acm.org/doi/10.5555/2969442.2969588
2. Cramer, B., et al.: Training spiking multi-layer networks with surrogate gradients on an analog neuromorphic substrate (2020). https://arxiv.org/abs/2006.07239
3. Feinberg, B., Wang, S., Ipek, E.: Making memristive neural network accelerators reliable. In: 2018 IEEE International Symposium on High Performance Computer Architecture (HPCA), pp. 52–65 (2018). https://doi.org/10.1109/HPCA.2018.00015
4. Jain, S., Sengupta, A., Roy, K., Raghunathan, A.: Rx-Caffe: framework for evaluating and training deep neural networks on resistive crossbars (2018). http://arxiv.org/abs/1809.00072
5. Joshi, V., et al.: Accurate deep neural network inference using computational phase-change memory. Nat. Commun. 11(1), 2473 (2020). https://doi.org/10.1038/s41467-020-16108-9