Week 7 Meeting 20-03-05

MicroBotNet

  • Can we do classification under 1 µJ?
  • Goal: a neural network that classifies CIFAR-10 (10 image classes) in under 1 million MACs (see the MAC-counting sketch below)
  • MicroBotNet achieves 79.35% accuracy with 930,000 MACs
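
A minimal sketch of how the MAC budget can be checked, assuming PyTorch and 32×32×3 CIFAR-10 inputs; `count_macs` and its hook structure are illustrative, not the project's actual profiler:

```python
# Count multiply-accumulates (MACs) for Conv2d and Linear layers via forward hooks.
import torch
import torch.nn as nn

def count_macs(model, input_size=(1, 3, 32, 32)):
    macs = 0
    hooks = []

    def conv_hook(module, inputs, output):
        nonlocal macs
        # MACs per output element = (in_channels / groups) * kernel_h * kernel_w
        kernel_ops = (module.in_channels // module.groups) \
            * module.kernel_size[0] * module.kernel_size[1]
        macs += output.numel() * kernel_ops

    def linear_hook(module, inputs, output):
        nonlocal macs
        macs += output.shape[0] * module.in_features * module.out_features

    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            hooks.append(m.register_forward_hook(conv_hook))
        elif isinstance(m, nn.Linear):
            hooks.append(m.register_forward_hook(linear_hook))

    with torch.no_grad():
        model(torch.zeros(input_size))
    for h in hooks:
        h.remove()
    return macs
```

Run against a MicroBotNet definition, this should land near the 930,000-MAC figure quoted above.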


Trained Ternary Quantization

  • Weights trained/quantized to three values per convolution layer (see the sketch below)
  • Lower weight precision means significantly less power-hungry memory access!
  • My testing on SqueezeNet shows accuracy dropping from 91% to 88%
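
A minimal sketch of the ternarization step in Trained Ternary Quantization (Zhu et al., 2017), for a single PyTorch weight tensor; the threshold factor `t` and the per-layer scales `w_p`/`w_n` are illustrative assumptions:

```python
# Map a full-precision weight tensor onto three values {-w_n, 0, +w_p}.
import torch

def ternarize(weight, w_p, w_n, t=0.05):
    delta = t * weight.abs().max()      # threshold proportional to the largest weight
    quantized = torch.zeros_like(weight)
    quantized[weight > delta] = w_p     # large positive weights -> learned positive scale
    quantized[weight < -delta] = -w_n   # large negative weights -> learned negative scale
    return quantized                    # weights near zero stay zero
```

During training, the full-precision weights are kept as latent variables: the ternary copy is used in the forward pass, while gradients update both the latent weights and the two learned scales per layer.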


Trained Ternary Quantization on MicroBotNet

  • We quantize any convolution or fully connected layer (see the layer-selection sketch below) that:
    • Has over 1,000 parameters
    • Is not the first convolution layer or one of the final fully connected layers
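
A minimal sketch of that selection rule for a PyTorch model; the helper name is made up, and excluding only the very last fully connected layer is an assumption (the notes say "final fully connected layers", which may cover more than one):

```python
# Pick the convolution / fully connected layers that meet the quantization rule above.
import torch.nn as nn

def layers_to_quantize(model):
    eligible = [m for m in model.modules() if isinstance(m, (nn.Conv2d, nn.Linear))]
    selected = []
    for i, layer in enumerate(eligible):
        is_first_conv = (i == 0 and isinstance(layer, nn.Conv2d))
        is_final_fc = (i == len(eligible) - 1 and isinstance(layer, nn.Linear))
        big_enough = sum(p.numel() for p in layer.parameters()) > 1000
        if big_enough and not is_first_conv and not is_final_fc:
            selected.append(layer)
    return selected
```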

Trained Ternary Quantization

  • 31 layers quantized. 86 layers not quantized

  • With 65% of the network's weights quantized to one positive and one negative learned value per layer, accuracy is 72.8%.

One Negative One Quantization

  • With 65% of the network's weights quantized to zero, one, or negative one, accuracy is 72.3% (see the sketch below).
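
A minimal sketch of this variant, reusing the same thresholding as the TTQ sketch above but fixing the scales at +1 and -1; the threshold factor `t` is again an assumption:

```python
# Quantize a weight tensor to {-1, 0, +1}: like TTQ, but without learned scales.
import torch

def quantize_zero_one_neg_one(weight, t=0.05):
    delta = t * weight.abs().max()
    ones = torch.ones_like(weight)
    return torch.where(weight > delta, ones,
           torch.where(weight < -delta, -ones, torch.zeros_like(weight)))
```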

Results

  • 153,792 parameters quantized
    • 12,222 quantized to one.
    • 134,124 quantized to zero.
    • 7,446 quantized to negative one.
  • 82,866 parameters not quantized (see the arithmetic check below)
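
A quick arithmetic check on the counts above (nothing assumed beyond the numbers already listed):

```python
quantized = 12_222 + 134_124 + 7_446   # 153,792, matches the quantized total above
total = quantized + 82_866             # 236,658 parameters overall
print(quantized / total)               # ~0.65 -> the "65% quantized" figure
print(134_124 / quantized)             # ~0.87 -> share of quantized weights set to zero
```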