

Inferencer2 is a module for running offline model inference, preprocessing, and postprocessing based on cnis (Cambricon Neuware Infer Server, hereinafter referred to as Infer Server). Infer Server provides offline model loading and management, inference task scheduling, and related functions. It greatly simplifies the development and deployment of deep learning applications on the MLU (Machine Learning Unit) platform and is highly efficient, so we recommend using Inferencer2, which is built on Infer Server. Users can also create customized modules based on Infer Server; Inferencer2 serves as a sample.
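
To give a sense of what a customized module built on Infer Server looks like, below is a minimal C++ sketch of a single synchronous inference request. The header paths, class names (`infer_server::InferServer`, `SessionDesc`, `Package`), and signatures are assumptions based on typical cnis usage rather than a verbatim copy of the API; consult the headers under include/ and the Inferencer2 sources under src/ for the authoritative interfaces.

```cpp
#include <string>

#include "cnis/infer_server.h"
#include "cnis/processor.h"

// Sketch: run one synchronous inference request through Infer Server,
// the way a customized module might. Names and signatures are assumptions.
void RunOfflineInference(int device_id, const std::string& model_path) {
  // One InferServer instance per MLU device; it owns model management
  // and inference task scheduling.
  infer_server::InferServer server(device_id);

  // Load the offline model (managed and reusable across sessions).
  infer_server::ModelPtr model = server.LoadModel(model_path);

  // Describe the session: which model to run and how to batch requests.
  infer_server::SessionDesc desc;
  desc.name = "demo_session";  // hypothetical session name
  desc.model = model;
  desc.strategy = infer_server::BatchStrategy::DYNAMIC;

  // A synchronous session keeps the example short; Inferencer2 would
  // normally issue asynchronous requests so the pipeline is not blocked.
  infer_server::Session_t session = server.CreateSyncSession(desc);

  // Pack the input data, request inference, and read back the result.
  infer_server::PackagePtr input = infer_server::Package::Create(1);
  infer_server::PackagePtr output = infer_server::Package::Create(1);
  infer_server::Status status;
  server.RequestSync(session, input, &status, output);

  server.DestroySession(session);
}
```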