Sponsor: National Science Foundation (NSF)
The broader scope of this research includes: a) energy-efficient architecture and algorithm co-design for DNN training to yield compressed models, b) efficient model compression that retains the model's robustness, and c) model compression of brain-inspired deep spiking neural networks (SNNs). A minimal pruning sketch illustrating the baseline compression idea appears after the related-work list below.

Related work:
- Souvik Kundu, Mahdi Nazemi, Massoud Pedram, Keith M. Chugg, and Peter A. Beerel, “Pre-defined Sparsity for Low-Complexity Convolutional Neural Networks,” in IEEE Transactions on Computers, 2020.
- Souvik Kundu, Mahdi Nazemi, Peter A. Beerel, and Massoud Pedram, “DNR: A Tunable Robust Pruning Framework Through Dynamic Network Rewiring of DNNs,” in Proc. of ASP-DAC, 2021.
- Souvik Kundu, Gourav Datta, Massoud Pedram, and Peter A. Beerel, “Spike-Thrift: Towards Energy-Efficient Deep Spiking Neural Networks by Limiting Spiking Activity via Attention-Guided Compression,” in Proc. of WACV, 2021.
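As context for the compression theme above, the following is a minimal, illustrative PyTorch sketch of global unstructured magnitude pruning, the simplest baseline form of model compression. It is not the specific algorithm of any paper listed here (it implements neither pre-defined sparsity, nor dynamic network rewiring, nor attention-guided SNN compression); the toy model, the 90% sparsity level, and the helper name `magnitude_prune` are assumptions for illustration only.

```python
# Minimal sketch: global unstructured magnitude pruning (illustration only;
# not the algorithm of the papers cited above).
import torch
import torch.nn as nn


def magnitude_prune(model: nn.Module, sparsity: float = 0.9) -> None:
    """Zero the smallest-magnitude weights so a `sparsity` fraction is removed."""
    # Collect all multi-dimensional (conv/linear) weights; biases stay dense.
    all_weights = torch.cat(
        [p.detach().abs().flatten() for p in model.parameters() if p.dim() > 1]
    )
    # One global threshold shared across layers (a common baseline choice).
    threshold = torch.quantile(all_weights, sparsity)
    with torch.no_grad():
        for p in model.parameters():
            if p.dim() > 1:
                p.mul_(p.abs() > threshold)  # zero out sub-threshold weights


# Toy usage on a small MLP (hypothetical model, for illustration).
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
magnitude_prune(model, sparsity=0.9)
nonzero = sum((p != 0).sum().item() for p in model.parameters() if p.dim() > 1)
total = sum(p.numel() for p in model.parameters() if p.dim() > 1)
print(f"kept {nonzero}/{total} weights ({100.0 * nonzero / total:.1f}%)")
```

A single global threshold, rather than per-layer thresholds, lets layers with more redundancy absorb more of the sparsity budget; the research above goes beyond this baseline by additionally targeting adversarial robustness and energy efficiency.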