Everything Leaves Footprints: Hardware Accelerated Intermittent Deep Inference

Chih-Kai Kang, Hashan Roshantha Mendis, Chun-Han Lin, Ming-Syan Chen, Pi-Cheng Hsiu*

*Corresponding author of this work

Research output: Contribution to journal › Journal article › peer-review

19 Citations (Scopus)

Abstract

Current peripheral execution approaches for intermittently powered systems require full access to the internal hardware state for checkpointing, or rely on application-level energy estimation for task partitioning, to make correct forward progress. Both requirements present significant practical challenges for energy-harvesting, intelligent edge Internet-of-Things devices, which perform hardware-accelerated deep neural network (DNN) inference. Sophisticated compute peripherals may have an inaccessible internal state, and the complexity of DNN models makes it difficult for programmers to partition the application into suitably sized tasks that fit within an estimated energy budget. This article presents the concept of inference footprinting for intermittent DNN inference, where accelerator progress is accumulatively preserved across power cycles. Our middleware stack, HAWAII, tracks and restores inference footprints efficiently and transparently to make inference forward progress, without requiring access to the accelerator's internal state or application-level energy estimation. Evaluations were carried out on a Texas Instruments device under varied energy budgets and network workloads. Compared to a variety of task-based intermittent approaches, HAWAII improves inference throughput by 5.7%-95.7%, achieving particularly high performance on heavily accelerated DNNs.
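The core idea described above can be illustrated with a minimal simulation: progress through a DNN's layers is accumulatively recorded in nonvolatile memory, so after a power failure, inference resumes from the last preserved footprint rather than restarting. This is a hypothetical sketch of the concept only; the names (`nvm`, `run_inference`, `PowerFailure`) are illustrative and not HAWAII's actual API, which operates transparently at the middleware level.

```python
# Illustrative sketch of "inference footprinting" across power cycles.
# Nonvolatile memory is simulated by a dict that survives the simulated
# power failure; the footprint counter is the only state preserved.

LAYERS = ["conv1", "conv2", "fc1", "fc2"]

nvm = {"footprint": 0}  # simulated nonvolatile memory

class PowerFailure(Exception):
    """Simulated loss of harvested energy mid-inference."""

def run_inference(fail_after=None):
    """Execute layers starting from the preserved footprint.

    If fail_after is given, raise PowerFailure after that many layers
    complete in this power cycle, mimicking an energy outage.
    """
    executed = []
    for i in range(nvm["footprint"], len(LAYERS)):
        if fail_after is not None and len(executed) == fail_after:
            raise PowerFailure
        executed.append(LAYERS[i])
        nvm["footprint"] = i + 1  # footprint written as each layer completes
    nvm["footprint"] = 0          # whole inference done; reset for next input
    return executed

# First power cycle: energy runs out after two layers.
try:
    run_inference(fail_after=2)
except PowerFailure:
    pass

# Next power cycle: resume from the footprint; only the rest executes.
resumed = run_inference()
print(resumed)  # → ['fc1', 'fc2']
```

In the real system, the footprint updates would be committed to FRAM by the middleware as the accelerator makes progress, which is what removes the need for both internal-state checkpointing and programmer-estimated energy budgets.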

Original language: English
Article number: 9211553
Pages (from-to): 3479-3491
Number of pages: 13
Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Volume: 39
Issue number: 11
DOIs
Publication status: Published - Nov 2020

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
  • Electrical and Electronic Engineering

