Can the exported models, e.g. the .onnx file, be used for inference outside of MMDeploy, without using the MMDeploy SDK? I wish to use the .onnx file in my own inference code.
Replies: 1 comment
If you mean you want to export a model we support and use it in your own project: yes, of course. Just use it with the APIs provided by the backend engine and there will be no difference.
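For example, an exported ONNX model can be run with plain ONNX Runtime, no MMDeploy dependency needed. A minimal sketch, assuming the exported file is named `end2end.onnx` (MMDeploy's usual output name) and the model takes a single NCHW float32 input; adjust the path and shape for your model:

```python
import numpy as np
import onnxruntime as ort

# Load the exported model directly; MMDeploy is not required.
session = ort.InferenceSession('end2end.onnx',
                               providers=['CPUExecutionProvider'])

# Inspect the input the graph expects.
input_meta = session.get_inputs()[0]
print(input_meta.name, input_meta.shape)

# Dummy NCHW tensor; replace with your own preprocessed image data.
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_meta.name: dummy})
```

Keep in mind that the preprocessing (resize, normalization) and postprocessing the MMDeploy SDK would normally handle must be reimplemented in your own code.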
If you want to export your custom model (one that does not come from OpenMMLab), errr, I am not sure. You might have to make some modifications to the code so it can be traced. We provide a tool called the function rewriter to ease the process; there should be a doc about how to write a rewrite function.
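A minimal sketch of registering a rewrite, based on the MMDeploy 0.x rewriter API (the decorator signature differs in later versions, and `my_project.MyModel.forward` here is a hypothetical target, not a real function):

```python
from mmdeploy.core import FUNCTION_REWRITER

# Replace a trace-unfriendly forward with an ONNX-exportable version.
# func_name below is hypothetical; point it at your own method.
@FUNCTION_REWRITER.register_rewriter(
    func_name='my_project.MyModel.forward')
def my_model__forward(ctx, self, x):
    # ctx.origin_func is the original implementation; swap any dynamic
    # control flow or unsupported ops for exportable equivalents here.
    return ctx.origin_func(self, x)
```

During export, MMDeploy patches the target function with this rewrite, so the original training code stays untouched.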