Affiliation:
1. Faculty of Information Technology and Bionics, Peter Pazmany Catholic University, Budapest, Hungary
2. Department of Electronics and Telecommunications, Politecnico di Torino, Turin, Italy
3. Institute of Circuits and Systems, TUD Dresden University of Technology, Dresden, Germany
Abstract
This study introduces a simple memristor cellular neural network structure, a minimalist configuration with only two cells, designed to address two logic problems concurrently. The distinctive attribute of this system is its adaptability: the logic gate it implements, be it AND, OR, or XOR, is determined exclusively by the initial states of the memristors. Because a memristor's state can be altered by the current flowing through it, the initial conditions can be set dynamically, and with them the circuit's functionality. The parameters of this dynamic system are optimized with contemporary machine learning techniques, specifically gradient descent. Through a case study, the potential of leveraging intricate circuit dynamics is exemplified to expand the spectrum of problems solvable with a fixed number of neurons. This work not only underscores the significance of adaptability in logical circuits but also demonstrates the efficacy of memristive elements in enhancing problem‐solving capabilities.
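To make the idea concrete, the following is a minimal, hypothetical Python sketch, not the authors' circuit model or training code: a toy two-cell recurrent system whose shared parameters are fitted by plain gradient descent so that fixed, gate-specific initial states select AND, OR, or XOR behaviour. The state equations, the choice of initial-state vectors, and the finite-difference optimizer are all illustrative assumptions.

```python
# Illustrative sketch: one shared two-cell dynamical system, trained so that
# its "memristor-like" initial states alone decide which logic gate it computes.
import numpy as np

rng = np.random.default_rng(0)

INPUTS = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
TARGETS = {                       # gate name -> truth-table outputs
    "AND": np.array([0., 0., 0., 1.]),
    "OR":  np.array([0., 1., 1., 1.]),
    "XOR": np.array([0., 1., 1., 0.]),
}
# Hypothetical gate-selecting initial states (fixed, not learned).
INIT_STATE = {"AND": np.array([-1., -1.]),
              "OR":  np.array([ 1.,  1.]),
              "XOR": np.array([ 1., -1.])}

def simulate(params, u, x0, steps=20, dt=0.1):
    """Euler-integrate a toy two-cell network and read out the first cell."""
    A = params[:4].reshape(2, 2)    # cell-to-cell coupling
    B = params[4:8].reshape(2, 2)   # input coupling
    z = params[8:10]                # bias
    x = x0.copy()
    for _ in range(steps):
        x = x + dt * (-x + A @ np.tanh(x) + B @ u + z)
    return 1.0 / (1.0 + np.exp(-4.0 * x[0]))   # squash to [0, 1]

def loss(params):
    """Squared error summed over all three gates and all four input pairs."""
    err = 0.0
    for gate, y in TARGETS.items():
        for u, t in zip(INPUTS, y):
            err += (simulate(params, u, INIT_STATE[gate]) - t) ** 2
    return err

# Plain finite-difference gradient descent (a stand-in for the paper's optimizer).
params = rng.normal(scale=0.5, size=10)
lr, eps = 0.05, 1e-4
for _ in range(2000):
    grad = np.zeros_like(params)
    for i in range(len(params)):
        d = np.zeros_like(params); d[i] = eps
        grad[i] = (loss(params + d) - loss(params - d)) / (2 * eps)
    params -= lr * grad

# Inspect the learned behaviour: same parameters, different initial states.
for gate in TARGETS:
    outs = [round(simulate(params, u, INIT_STATE[gate])) for u in INPUTS]
    print(gate, outs)
```

The point of the sketch is the shared-parameter, state-selected behaviour: the weights are common to all three tasks, and only the initial condition handed to `simulate` changes which truth table the trained system reproduces.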
Publisher
Institution of Engineering and Technology (IET)