Papernews

Pruning random resistive memory for optimizing analog AI

Jan 10, 2026 by Yi Li, Song-jian Wang, Yaping Zhao, Shaocong Wang, Bo Wang, Woyu Zhang, Yangu He, Ning Lin, Binbin Cui, Xi Chen, Shiming Zhang, Hao Jiang, Peng Lin, Xumeng Zhang, Feng Zhang, Xiaojuan Qi, Zhongrui Wang, Xiaoxin Xu, Dashan Shang, Qi Liu, Han Wang, Kwang-Ting Cheng, Ming Liu (Nature Communications)

DOI 10.1038/s41467-025-67960-6



We trained randomly weighted resistive-memory networks by pruning edges to uncover high-performing subnetworks that require no precise analog weight tuning, and we leveraged the devices' intrinsic stochasticity as a cheap source of large-scale random weights. The result: substantial accuracy gains and large energy savings on a 40 nm RRAM test chip, with scalability demonstrated up to ResNet-50 on ImageNet-100. This hardware-software co-design turns analog device quirks from a liability into a feature, making in-memory AI far more practical and power-efficient.
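To make the pruning idea concrete, here is a minimal PyTorch sketch of training-by-pruning over fixed random weights, in the spirit of edge-popup / strong-lottery-ticket methods: each edge keeps a frozen random weight (standing in for an as-fabricated RRAM conductance) plus a learnable importance score, and only the top-scoring edges are kept in the forward pass. All names, the sparsity level, and the layer sizes are illustrative assumptions, not the paper's exact setup.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMask(torch.autograd.Function):
    """Binary mask keeping the top-k scores; straight-through gradient."""

    @staticmethod
    def forward(ctx, scores, sparsity):
        k = int((1.0 - sparsity) * scores.numel())  # number of edges to keep
        mask = torch.zeros_like(scores)
        _, idx = scores.flatten().topk(k)
        mask.view(-1)[idx] = 1.0
        return mask

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: pass gradients to the scores unchanged.
        return grad_output, None

class PrunedRandomLinear(nn.Module):
    """Linear layer with frozen random weights; only the prune scores train."""

    def __init__(self, in_features, out_features, sparsity=0.5):
        super().__init__()
        self.sparsity = sparsity
        # Fixed random weights: a software stand-in for random RRAM conductances.
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features) * 0.1, requires_grad=False)
        # Learnable importance scores decide which edges survive pruning.
        self.scores = nn.Parameter(
            torch.randn(out_features, in_features) * 0.01)

    def forward(self, x):
        mask = TopKMask.apply(self.scores.abs(), self.sparsity)
        return F.linear(x, self.weight * mask)

# Usage: only the scores receive gradients; the random weights never change,
# so no precise analog programming of the devices would be needed.
net = nn.Sequential(PrunedRandomLinear(784, 256), nn.ReLU(),
                    PrunedRandomLinear(256, 10))
opt = torch.optim.Adam(
    [p for p in net.parameters() if p.requires_grad], lr=1e-3)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = F.cross_entropy(net(x), y)
opt.zero_grad()
loss.backward()
opt.step()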

source S2, crossref



dgfl, 2026