Process This: Demystifying Embedded Deep Learning Deployment
00:51:54 | 16 JUL 2021
In this webinar, learn how to automatically increase Edge AI inference performance using hardware accelerators for deep learning without having to program the cores.
Webinar agenda:
- How deep learning models operate on an embedded processor
- How to optimize models for maximum performance
- How to compile and deploy a model and accelerate inference using industry-standard APIs such as TensorFlow Lite, ONNX Runtime, and TVM
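As a rough illustration of the deployment flow the agenda describes, the sketch below runs a quantized model through the TensorFlow Lite interpreter in Python. The model path, input shape, and the delegate library name are assumptions for illustration only; on a TDA4x board the accelerator would be reached through a vendor-supplied delegate rather than the commented-out placeholder shown here.

```python
# Hedged sketch: inference with the TensorFlow Lite Python interpreter,
# one of the industry-standard runtimes named in the agenda.
import numpy as np


def quantize_input(x, scale, zero_point):
    """Map float data to int8 using TFLite's affine quantization:
    q = round(x / scale) + zero_point, clipped to the int8 range."""
    return np.clip(np.round(x / scale) + zero_point, -128, 127).astype(np.int8)


def run_inference(model_path, image):
    # tflite_runtime is the lightweight interpreter package for embedded targets.
    from tflite_runtime.interpreter import Interpreter

    # A hardware accelerator is typically attached via a delegate; the library
    # name below is a hypothetical placeholder, not an official TI path.
    interpreter = Interpreter(
        model_path=model_path,
        # experimental_delegates=[load_delegate("libtidl_delegate.so")],
    )
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    scale, zero_point = inp["quantization"]
    interpreter.set_tensor(inp["index"],
                           quantize_input(image, scale, zero_point)[None, ...])
    interpreter.invoke()

    out = interpreter.get_output_details()[0]
    return interpreter.get_tensor(out["index"])
```

The same pattern (load model, allocate tensors, set input, invoke, read output) carries over to ONNX Runtime and TVM with their respective session APIs.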
About TI Edge AI Cloud
TI Edge AI Cloud is a free online service that lets you evaluate accelerated deep learning inference on TDA4x processors without purchasing an evaluation board. The service is Python-based, and it takes only a few minutes to log in, deploy a model, and collect a variety of performance benchmarks.
Resources
This video is part of a series:
- Process this: Edge AI technology topics (video playlist, 10 videos)