Publication
ES Week 2023
Workshop

A Neuro-Vector-Symbolic Architecture for Data- and Compute-Efficient Continual Learning, Abstract Reasoning, and Combinatorial Inference

Abstract

Neuro-symbolic AI approaches display both perception and reasoning capabilities, but inherit the limitations of their individual deep learning and symbolic AI components. By combining neural networks with vector-symbolic architecture (VSA) machinery, we propose the concept of a neuro-vector-symbolic architecture (NVSA). NVSA solves few-shot continual learning, visual abstract reasoning, and computationally hard problems such as factorization faster and more accurately than other state-of-the-art methods. We also show how an efficient realization of NVSA can be informed by, and benefit from, the physical properties of in-memory computing hardware, e.g., O(1) matrix-vector multiplication (MVM), in-situ progressive crystallization, and the intrinsic stochasticity of phase-change memory devices.

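The abstract refers to vector-symbolic architecture machinery and to factorization as a computationally hard problem. As a rough, hypothetical illustration only (not the method presented in the paper, which targets efficient factorization on in-memory computing hardware rather than exhaustive search), the Python sketch below binds two random bipolar hypervectors by element-wise multiplication and then recovers the factors by brute-force similarity search over the codebooks; the combinatorial growth of this search space is what makes such factorization hard. All names, codebooks, and dimensions here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # hypervector dimensionality (illustrative choice)

# Random bipolar codebooks for two attributes (hypothetical example).
shapes = {name: rng.choice([-1, 1], size=D) for name in ["circle", "square", "triangle"]}
colors = {name: rng.choice([-1, 1], size=D) for name in ["red", "green", "blue"]}

# Binding via element-wise multiplication composes one product hypervector.
composite = shapes["square"] * colors["red"]

# Factorizing the product back into its constituents by brute force:
# score every (shape, color) pair by cosine-like similarity to the composite.
best = max(
    ((s, c) for s in shapes for c in colors),
    key=lambda sc: np.dot(composite, shapes[sc[0]] * colors[sc[1]]) / D,
)
print(best)  # ('square', 'red')
```

The search cost grows with the product of the codebook sizes, which is why avoiding exhaustive enumeration matters for problems of this kind.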