We are pleased to announce that the Deep Learning Inference Engine backend in the OpenCV DNN module can now run inference on ARM CPUs.
Previously, Inference Engine supported only Intel hardware: CPUs, iGPUs, FPGAs and VPUs. Recently, a new ARM CPU plugin was published on GitHub. This plugin allows Inference Engine to run deep learning networks on ARM CPUs, using the ARM Compute Library as a backend. The plugin is part of the open-source version of OpenVINO and is not included in the Intel Distribution of OpenVINO toolkit.
Details on how to use the plugin via the OpenCV DNN module can be found in the OpenCV wiki.
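Once OpenCV is built with Inference Engine support and the ARM CPU plugin is installed, selecting the new backend from the DNN module works the same way as on Intel hardware. Here is a minimal sketch; the model and image file names are placeholders, and the exact build and setup steps are the ones described in the wiki:

```python
import cv2 as cv

# Load a network in OpenVINO IR format (file names are placeholders).
net = cv.dnn.readNet("model.xml", "model.bin")

# Route inference through the Inference Engine backend; with the ARM CPU
# plugin installed, DNN_TARGET_CPU maps to the ARM CPU device.
net.setPreferableBackend(cv.dnn.DNN_BACKEND_INFERENCE_ENGINE)
net.setPreferableTarget(cv.dnn.DNN_TARGET_CPU)

# Run inference on a single image (input size depends on the model).
img = cv.imread("input.jpg")
blob = cv.dnn.blobFromImage(img, size=(256, 256))
net.setInput(blob)
out = net.forward()
```

No ARM-specific code is required: the same script runs unchanged on Intel and ARM machines, with the plugin handling the device-specific execution.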
Below you can see a short preview of the Object Detection Demo from the Open Model Zoo running with the ARM CPU plugin on a Raspberry Pi: