Tommaso Stecconi, Roberto Guido, et al.
Advanced Electronic Materials
Transformer-based Large Language Models (LLMs) demand large weight capacity, efficient computation, and high-throughput access to large amounts of dynamic memory. These challenges present significant opportunities for algorithmic and hardware innovation, including Analog AI accelerators. In this paper, we describe recent progress on Phase Change Memory-based hardware and architectural designs that address the challenges of LLM inference.
Max Bloomfield, Amogh Wasti, et al.
ITherm 2025
Victor Chan, A. Gasasira, et al.
IEEE Trans Semicond Manuf
Xiaofan Zhang, Haoming Lu, et al.
MLSys 2020