Deep Learning API and Server in C++14 with support for PyTorch, TensorRT, Dlib, NCNN, TensorFlow, XGBoost and TSNE
InsightFace REST API for easy deployment of face recognition services with TensorRT in Docker.
An automated toolkit for analyzing and modifying the structure of PyTorch models, including a model-compression algorithm library with automatic model-structure analysis.
Based on TensorRT v8.0+, deploy YOLO11 detection, pose, segmentation, and tracking with C++ and Python APIs.
An open-source end-to-end perception deployment solution based on the vision sparse transformer paradigm.
Based on TensorRT v8.0+, deploy YOLOv8 detection, pose, segmentation, and tracking with C++ and Python APIs.
YOLOv5 TensorRT implementations.
Using TensorRT for Inference Model Deployment.
A TensorRT implementation of U-Net, inspired by tensorrtx.
ViTPose without MMCV dependencies.
Advanced inference pipeline using NVIDIA Triton Inference Server for CRAFT text detection (PyTorch); includes a converter from PyTorch -> ONNX -> TensorRT and inference pipelines (TensorRT, Triton server with multiple formats). Supported model formats for Triton inference: TensorRT engine, TorchScript, ONNX.
Based on TensorRT 8.2.4, compare inference speed across different TensorRT APIs.
The real-time instance segmentation algorithm SparseInst running on TensorRT and ONNX.
Convert YOLO models to ONNX and TensorRT, with batched NMS added.
Based on TensorRT v8.2, a hand-built network definition for YOLOv5 v5.0 to speed up its inference.
Advanced inference performance using TensorRT for CRAFT text detection. Implements modules to convert PyTorch -> ONNX -> TensorRT, with dynamic-shape (multi-size input) inference.
tensorrt-toy: toy example code for TensorRT.
A simple tool for PyTorch -> ONNX -> TensorRT conversion (see the conversion sketch at the end of this list).
Dockerized TensorRT inference engine with an ONNX model conversion tool and C++ pre/post-processing implementations for ResNet50 and Ultraface.
Export a TensorRT engine from ONNX and run inference with Python (see the inference sketch at the end of this list).
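
Several of the projects above follow the same PyTorch -> ONNX -> TensorRT flow. Below is a minimal sketch of that flow with a dynamic-shape input, using the TensorRT Python API; the model, file names, opset, and shape ranges are illustrative assumptions, not code from any listed repository.

```python
import torch
import tensorrt as trt

# 1. Export a small PyTorch model to ONNX with dynamic batch/height/width.
model = torch.nn.Conv2d(3, 16, kernel_size=3, padding=1).eval()
dummy = torch.randn(1, 3, 256, 256)
torch.onnx.export(
    model, dummy, "model.onnx", opset_version=13,
    input_names=["input"], output_names=["output"],
    dynamic_axes={"input": {0: "batch", 2: "height", 3: "width"}},
)

# 2. Parse the ONNX file and build a serialized TensorRT engine.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

# An optimization profile gives TensorRT the min/opt/max input shapes it
# must support at runtime; this is what enables multi-size inference.
config = builder.create_builder_config()
profile = builder.create_optimization_profile()
profile.set_shape("input",
                  (1, 3, 128, 128), (1, 3, 256, 256), (4, 3, 512, 512))
config.add_optimization_profile(profile)

serialized = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:
    f.write(serialized)
```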
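
And a minimal sketch of loading a serialized engine and running inference from Python, assuming pycuda for device memory; the engine file, binding order, and tensor shapes are illustrative assumptions and must match the engine actually built.

```python
import numpy as np
import pycuda.autoinit  # noqa: F401  (creates a CUDA context)
import pycuda.driver as cuda
import tensorrt as trt

# Deserialize the engine built above and create an execution context.
logger = trt.Logger(trt.Logger.WARNING)
with open("model.engine", "rb") as f:
    engine = trt.Runtime(logger).deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

# With a dynamic-shape engine the input shape must be set before execution.
context.set_binding_shape(0, (1, 3, 256, 256))

# Host and device buffers; the output shape is taken from the context.
h_input = np.random.rand(1, 3, 256, 256).astype(np.float32)
h_output = np.empty(tuple(context.get_binding_shape(1)), dtype=np.float32)
d_input = cuda.mem_alloc(h_input.nbytes)
d_output = cuda.mem_alloc(h_output.nbytes)

# Copy input to the GPU, run the engine, and copy the result back.
cuda.memcpy_htod(d_input, h_input)
context.execute_v2([int(d_input), int(d_output)])
cuda.memcpy_dtoh(h_output, d_output)
print(h_output.shape)
```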