Inferencer2 is a module for running offline model inference, preprocessing, and postprocessing based on CNIS (Cambricon Neuware Infer Server, hereinafter referred to as Infer Server). Infer Server provides offline model loading and management, inference task scheduling, and related functions. It greatly simplifies the development and deployment of deep learning applications on the MLU (Machine Learning Unit) platform and is highly efficient, so we recommend using Inferencer2, which is built on Infer Server. Users can also create their own customized modules based on Infer Server; Inferencer2 serves as a sample of how to do so.
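
For illustration only, a module such as Inferencer2 is typically enabled through an entry in the pipeline's JSON configuration file. The sketch below is a minimal assumption of what such an entry might look like; every key and value (the module class name, the offline model path, the preprocessing/postprocessing handler names, the device id, and so on) is illustrative and should be checked against the parameters actually parsed by the code under include/ and src/.

```json
{
  "infer" : {
    "class_name" : "cnstream::Inferencer2",
    "parallelism" : 1,
    "max_input_queue_size" : 20,
    "custom_params" : {
      "model_path" : "path/to/offline_model.cambricon",
      "func_name" : "subnet0",
      "preproc_name" : "VideoPreprocCpu",
      "postproc_name" : "VideoPostprocClassification",
      "device_id" : 0
    }
  }
}
```

In this hypothetical layout, "custom_params" would carry the Inferencer2-specific options (offline model, preprocessing and postprocessing handlers, MLU device id), while the outer keys would be generic module scheduling settings.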