More Is Less: Model Augmentation for Intermittent Deep Inference

Chih Kai Kang, Hashan Roshantha Mendis, Chun Han Lin, Ming Syan Chen, Pi Cheng Hsiu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)


Energy harvesting creates an emerging intermittent computing paradigm but poses new challenges for sophisticated applications such as intermittent deep neural network (DNN) inference. Although model compression has adapted DNNs to resource-constrained devices, under intermittent power, compressed models will still experience multiple power failures during a single inference. Footprint-based approaches enable hardware-accelerated intermittent DNN inference by tracking footprints, independent of model computations, to indicate accelerator progress across power cycles. However, we observe that the extra overhead required to preserve progress indicators can severely offset the computation progress accumulated by intermittent DNN inference. This work proposes the concept of model augmentation to adapt DNNs to intermittent devices. Our middleware stack, JAPARI, appends extra neural network components into a given DNN, to enable the accelerator to intrinsically integrate progress indicators into the inference process, without affecting model accuracy. Their specific positions allow progress indicator preservation to be piggybacked onto output feature preservation to amortize the extra overhead, and their assigned values ensure uniquely distinguishable progress indicators for correct inference recovery upon power resumption. Evaluations on a Texas Instruments device under various DNN models, capacitor sizes, and progress preservation granularities show that JAPARI can speed up intermittent DNN inference by 3× over the state of the art, for common convolutional neural architectures that require heavy acceleration.
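The core idea described above can be illustrated with a minimal sketch (not the JAPARI implementation): a progress indicator with a uniquely distinguishable value is appended to each layer's output, so a single preservation of the augmented output to non-volatile memory (NVM) also records progress, and recovery after a power failure resumes from the indicated layer. All function names and the indicator encoding below are hypothetical.

```python
# Hedged sketch of progress-indicator piggybacking, assuming a simple
# layer-by-layer preservation granularity and a dict standing in for NVM.

def augment_with_indicator(features, layer_idx):
    # Hypothetical encoding: the indicator equals the index of the next
    # layer to run, making preserved outputs uniquely distinguishable.
    return features + [layer_idx + 1]

def strip_indicator(augmented):
    # Split preserved data back into features and the resume index.
    return augmented[:-1], augmented[-1]

def intermittent_inference(layers, x, nvm, fail_after=None):
    """Run `layers` on `x`, preserving each augmented output to `nvm`.
    `fail_after` simulates a power failure after that many layers."""
    start = 0
    if "out" in nvm:
        # Power resumption: recover features and progress in one read.
        x, start = strip_indicator(nvm["out"])
    for i in range(start, len(layers)):
        if fail_after is not None and i >= fail_after:
            return None  # power failure; progress is already in NVM
        x = [layers[i](v) for v in x]
        # One write preserves both the output features and the indicator,
        # amortizing the progress-preservation overhead.
        nvm["out"] = augment_with_indicator(x, i)
    return x

layers = [lambda v: v * 2, lambda v: v + 1, lambda v: v * 3]
nvm = {}
# First power cycle fails after the second layer completes.
intermittent_inference(layers, [1.0], nvm, fail_after=2)
# On resumption, inference continues from the preserved progress.
result = intermittent_inference(layers, None, nvm)
print(result)  # [9.0]
```

In the real system the indicator is realized as extra neural network components computed by the accelerator itself, so no separate bookkeeping write is needed; the dict here merely stands in for that mechanism.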

Original language: English
Article number: 3506732
Journal: ACM Transactions on Embedded Computing Systems
Issue number: 5
Publication status: Published - 2022 Oct 8


Keywords

  • Deep neural networks
  • edge computing
  • energy harvesting
  • intermittent systems
  • model adaptation

ASJC Scopus subject areas

  • Software
  • Hardware and Architecture


