Intermittent Deep Neural Network Inference
SysML 2018, February 15-16, 2018, Stanford, CA.
Graham Gobieski, Nathan Beckmann, Brandon Lucia
Carnegie Mellon University
The maturation of energy-harvesting technology has enabled new classes of sophisticated, batteryless systems that will drive the next wave of Internet of Things (IoT) applications. These applications require intelligence at the edge and even in the sensor node, e.g., allowing systems to immediately interpret sensed data and make judicious use of scarce bandwidth. However, inference on energy-harvesting devices presents largely unexplored challenges: such devices are severely resource-constrained and operate intermittently, only when energy is available. Typical systems run at a few MHz, have a few hundred KB of memory, and consume under 1 mW when active [8, 22]. In comparison, even the most energy-efficient DNN inference accelerators consume hundreds of mW.
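The power gap the abstract describes can be made concrete with a back-of-envelope calculation. This sketch uses the abstract's figures (under 1 mW active power for an energy-harvesting device, "hundreds of mW" for an efficient accelerator); the exact 300 mW accelerator figure is an assumed illustrative value, not from the paper.

```python
# Back-of-envelope comparison of power budgets.
# mcu_power_mw comes from the abstract ("under 1 mW when active");
# accelerator_power_mw is an assumed value within the abstract's
# "hundreds of mW" range.
mcu_power_mw = 1.0
accelerator_power_mw = 300.0

gap = accelerator_power_mw / mcu_power_mw
print(f"An efficient DNN accelerator draws ~{gap:.0f}x the "
      f"entire active power budget of an energy-harvesting device")
```

Even under this optimistic comparison, a DNN accelerator consumes hundreds of times the whole power budget of the target class of devices, which is why conventional inference hardware cannot simply be attached to an energy-harvesting sensor node.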