AI Accelerator Interface (AI²)
With the AI Accelerator Interface, MVTec provides a generic interface that allows customers to use supported AI accelerator hardware for the inference part of their deep learning applications – quickly and conveniently.
Such devices are widely used especially in embedded applications, but are increasingly found in the PC environment as well. The AI Accelerator Interface is particularly future-proof: it not only abstracts the deep learning models from the specific hardware, but also allows users to optimize them for their hardware.
In addition to the plug-ins provided by MVTec, customer-specific AI accelerator hardware can also be integrated. Moreover, AI² accelerates not only typical deep learning applications: "classic" machine vision methods with integrated deep learning functionality, such as HALCON's Deep OCR, benefit from it as well.
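In HALCON, deploying a model on an AI² device typically takes only a few operator calls. The following HDevelop sketch illustrates the idea; the operator names follow the HALCON reference, while the model file name, the chosen plug-in ('tensorrt'), and the precision are placeholder assumptions and may differ for your installation:

```
* Read a pretrained deep learning model (file name is a placeholder).
read_dl_model ('my_model.hdl', DLModelHandle)
* Query inference devices offered by an AI² plug-in, here the TensorRT plug-in.
query_available_dl_devices (['ai_accelerator_interface'], ['tensorrt'], DLDeviceHandles)
* Optionally convert the model for the selected device, e.g. to 'float16'
* precision (no calibration samples passed in this sketch).
optimize_dl_model_for_inference (DLModelHandle, DLDeviceHandles[0], 'float16', [], [], DLModelOptimized)
* Deploy the optimized model on the device and run inference on a
* preprocessed sample batch (DLSampleBatch prepared beforehand).
set_dl_model_param (DLModelOptimized, 'device', DLDeviceHandles[0])
apply_dl_model (DLModelOptimized, DLSampleBatch, [], DLResultBatch)
```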
Available plug-ins
| AI² Plug-In | Supported Architectures | Supported Products | Available via |
|---|---|---|---|
| NVIDIA® TensorRT™ SDK | | | MVTec Software Manager |
| Intel® Distribution of OpenVINO™ toolkit | | | MVTec Software Manager |
| HAILO | | | HAILO developer zone |
1 starting with DLT 23.04
2 only inference
3 starting with DLT 24.05
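Which of these plug-ins is actually usable on a given system can be checked at runtime. The sketch below lists all deep learning devices found through the installed AI² plug-ins; the queried parameter names are taken from the HALCON reference, and passing empty filter tuples is assumed to return every available device:

```
* Query all available deep learning devices (CPU, GPU, and AI² devices).
query_available_dl_devices ([], [], DLDeviceHandles)
* Inspect each device, e.g. its type and the AI² plug-in it belongs to.
for Index := 0 to |DLDeviceHandles| - 1 by 1
    get_dl_device_param (DLDeviceHandles[Index], 'type', DeviceType)
    get_dl_device_param (DLDeviceHandles[Index], 'ai_accelerator_interface', AIInterface)
    * ... select the device that matches the desired plug-in ...
endfor
```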
For more technical information, please refer to the HALCON Reference Manual, section "AI Acceleration Interfaces (AI²)", or the MERLIC Tool Reference, section "AI² Interfaces for Tools with Deep Learning".