Comment 0 for bug 1958866

Revision history for this message
apoorv (sangal) wrote :

We propose to cover as many models as possible in a reasonable time by downloading models via Open Model Zoo (OMZ) and then running the benchmarking application against them. As OMZ showcases models for different use cases (images, language processing, etc.), we can select a representative set of models, starting from e.g. resnet and bert, which are currently widely used in different HW programs. Later the model set can be adjusted to fit our needs and the time we have.

Steps to run:
• install OpenVINO from pip; you can use wheels from the latest build at http://nncv-nas-01.ccr.corp.intel.com/ovino-pkg/packages/nightly/2022WW04.2/master/519/wheels/linux/
• download and convert model
• run benchmarking application
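The three steps above can be sketched as a loop over a representative model set. This is a dry-run sketch only: it prints the command lines instead of executing them, so it runs without OpenVINO installed. The model names are examples (not a final list), and the ./public/ path assumes models from the public OMZ scope.

```python
# Dry-run sketch: build the download/convert/benchmark command lines for a
# representative model set. Model names are illustrative; adjust as needed.
MODELS = ["resnet-50-tf", "bert-base-ner"]

def commands_for(model):
    """Return the shell commands to download, convert and benchmark one model."""
    return [
        "omz_downloader --name {}".format(model),
        "omz_converter --name {} --precision FP32".format(model),
        "benchmark_app -m ./public/{0}/FP32/{0}.xml".format(model),
    ]

if __name__ == "__main__":
    for model in MODELS:
        for cmd in commands_for(model):
            print(cmd)
```

To actually execute the commands, each printed line can be passed to subprocess.run and checked for a zero exit code, which matches the success criteria below.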

Running the Python benchmarking application looks like:
cd /tmp
wget http://nncv-nas-01.ccr.corp.intel.com/ovino-pkg/packages/nightly/2022WW04.2/master/519/wheels/linux/openvino-2022.1.0.dev20220118-6177-cp36-cp36m-manylinux_2_27_x86_64.whl
wget http://nncv-nas-01.ccr.corp.intel.com/ovino-pkg/packages/nightly/2022WW04.2/master/519/wheels/linux/openvino_dev-2022.1.0.dev20220118-6177-py3-none-any.whl
pip3 install openvino --find-links=/tmp
pip3 install openvino-dev[caffe,kaldi,mxnet,onnx,pytorch,tensorflow2] --find-links=/tmp
omz_downloader --name alexnet # or resnet-50-tf or bert-base-ner or whatever from https://github.com/openvinotoolkit/open_model_zoo/tree/master/models
omz_converter --name alexnet --precision FP32 # models from the intel scope do not require conversion
benchmark_app -m ./public/alexnet/FP32/alexnet.xml

The C++ benchmarking application can be installed from the offline installer http://nncv-nas-01.ccr.corp.intel.com/ovino-pkg/packages/nightly/2022WW04.2/master/519/irc/linux/l_openvino_toolkit_p_2022.1.0.519_offline.sh or via apt as described at https://docs.openvino.ai/latest/openvino_docs_install_guides_installing_openvino_apt.html. Then still do the pip installation from above to download and convert the models:
cd /opt/intel/openvino_<VERSION>/samples/cpp # this is for 2022.1, use openvino_<VERSION>/inference_engine/samples/cpp for 2021.4
./build_samples.sh
cd ~/inference_engine_samples_build
./benchmark_app -m /path/to/converted/model # should be /tmp/public/alexnet/FP32/alexnet.xml here

Success Criteria: successful execution of benchmark_app, nothing more (rely on a zero exit code, or run with the -report_folder parameter, which generates a statistics file in CSV format, and check that the “throughput” line contains some value).
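The “throughput line contains some value” check can be sketched as below. The sample report string is a hand-written stand-in, and the semicolon delimiter is an assumption; both should be verified against the file that -report_folder actually produces before wiring this into automation.

```python
# Sketch of the success check: scan a benchmark_app CSV report and confirm
# a "throughput" row exists and carries a numeric value.
# Assumption: rows are delimiter-separated as "name;value" pairs.
def throughput_ok(report_text, delimiter=";"):
    """Return True if a 'throughput' row is present and its value parses as a number."""
    for line in report_text.splitlines():
        fields = [f.strip() for f in line.split(delimiter)]
        if fields and fields[0].lower().startswith("throughput"):
            try:
                float(fields[1])
                return True
            except (IndexError, ValueError):
                return False
    return False

# Hand-written sample standing in for a real report file:
sample = "Command line parameters;\nthroughput;123.45\n"
print(throughput_ok(sample))  # prints True for this sample
```

Combined with the zero-exit-code check on benchmark_app itself, this gives a simple pass/fail signal per model.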