- Version 20.12
- Sold by NVIDIA
Triton Inference Server is open source inference serving software that lets teams deploy trained AI models from any framework on GPU or CPU infrastructure. It is designed to simplify and scale inference serving. Triton Inference Server supports all major frameworks, including TensorFlow, TensorRT,...
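
Once the 20.12 server container is running with a model repository, clients can send inference requests over HTTP/REST or gRPC. The minimal sketch below uses the `tritonclient` Python package against the HTTP endpoint on the default port 8000; the model name `my_model` and its input/output tensor names, shape, and datatype are placeholders for whatever model is actually loaded, not details taken from this listing.

```python
# Minimal Triton HTTP client sketch (assumes `pip install tritonclient[http]`
# and a server reachable at localhost:8000 with a model named "my_model").
import numpy as np
import tritonclient.http as httpclient

# Connect to the Triton HTTP endpoint and verify the server is up.
client = httpclient.InferenceServerClient(url="localhost:8000")
assert client.is_server_live()

# Hypothetical model signature: one FP32 input "INPUT0" of shape [1, 16]
# and one output "OUTPUT0". Adjust to match the deployed model's config.
input_tensor = httpclient.InferInput("INPUT0", [1, 16], "FP32")
input_tensor.set_data_from_numpy(np.random.rand(1, 16).astype(np.float32))
requested_output = httpclient.InferRequestedOutput("OUTPUT0")

# Run inference and read the result back as a NumPy array.
response = client.infer(
    model_name="my_model",
    inputs=[input_tensor],
    outputs=[requested_output],
)
print(response.as_numpy("OUTPUT0"))
```

The same request can be issued over gRPC (port 8001) via `tritonclient.grpc` with an equivalent API, which is often preferred for lower-latency, high-throughput clients.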