Irem Boybat-Kara
IEDM 2023
Transformer-based Large Language Models (LLMs) demand large weight capacity, efficient computing, and high-throughput access to large amounts of dynamic memory. These challenges present significant opportunities for algorithmic and hardware innovations, including Analog AI accelerators. In this paper, we describe recent progress on Phase Change Memory-based hardware and architectural designs to address these challenges for LLM inference.
Laura Bégon-Lours, Mattia Halter, et al.
MRS Spring Meeting 2023
Ying Zhou, Gi-Joon Nam, et al.
DAC 2023
Corey Liam Lammie, Julian Büchel, et al.
ISCAS 2025