Compact Modeling of Nanocluster Functionality as a Higher-Order Neuron

Celestine P. Lawrence*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

Disordered nanoclusters with multielectrode input–output functionality have recently been experimentally realized with energy-efficient and emergent computational capacity, and an interconnected network of several such nanoclusters has therefore been proposed as a route to artificial neural networks. To aid that end, here we show that nanocluster functionality can be fit to the simplest dendritic neuron model (DNM), in which the only form of nonlinearity is due to multiplicative interactions. This work brings into the spotlight higher-order neural networks (known for their efficient encoding of geometric invariances) as an explainable baseline model of nano-networks against which experimentalists can compare more sophisticated models (deep neural networks, or physics-based models such as the lin-min network introduced here). It also provides ground for designing novel approximate hardware and for a statistical-mechanics analysis of the learning performance of interconnected nanoclusters versus perceptrons (where neurons output a nonlinear function of the weighted sum of their inputs). A network with just ten higher-order neurons is shown to achieve a classification accuracy of more than 96% on the MNIST benchmark for handwritten digit recognition, a task that required 100 times more neurons in three-layer perceptrons.
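The abstract's key modeling idea is a neuron whose only nonlinearity comes from multiplicative interactions between inputs. A minimal sketch of such a second-order unit is below; the function name, weight shapes, and the restriction to pairwise products are illustrative assumptions, not the paper's exact DNM or fitting procedure.

```python
import numpy as np

def higher_order_neuron(x, w1, w2, b=0.0):
    # Second-order neuron: bias + linear terms + pairwise
    # multiplicative interactions sum_{ij} w2[i,j] * x[i] * x[j].
    # The product terms are the only source of nonlinearity.
    return b + w1 @ x + x @ w2 @ x

rng = np.random.default_rng(0)
x = rng.normal(size=4)        # inputs (e.g. electrode voltages)
w1 = rng.normal(size=4)       # first-order weights
w2 = rng.normal(size=(4, 4))  # second-order (interaction) weights
y = higher_order_neuron(x, w1, w2)
```

With `w2` set to zero the unit reduces to an ordinary linear neuron, which is one way to see that the quadratic interaction terms carry all of the model's expressive power beyond a perceptron's weighted sum.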
Original language: English
Pages (from-to): 5373-5376
Number of pages: 4
Journal: IEEE Transactions on Electron Devices
Volume: 69
Issue number: 9
Early online date: 3-Aug-2022
DOIs
Publication status: Published - Sept-2022
