Process This: Edge AI technology topics


Process This: Demystifying Embedded Deep Learning Deployment


July 16, 2021

In this webinar, learn how to accelerate Edge AI inference using the deep learning hardware accelerators on embedded processors, without programming the accelerator cores directly.

Webinar agenda:

  • How deep learning models operate on an embedded processor
  • How to optimize models for maximum performance
  • How to compile and deploy a model and accelerate inference using industry-standard APIs such as TensorFlow Lite, ONNX Runtime, and TVM
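The deploy-and-accelerate step in the agenda typically follows a common pattern in the TensorFlow Lite Python API: load the model into an interpreter, optionally attach a hardware delegate, and run inference through the standard tensor interface. Below is a minimal sketch of that pattern; the `delegate_path` argument is processor-specific (TI's documentation lists the exact delegate library for each device), and this is an illustration, not TI's exact deployment code.

```python
def run_inference(interpreter, input_data):
    """Run one inference on a TFLite-style interpreter and return the output.

    Works with any object exposing the standard TFLite interpreter methods,
    whether or not a hardware delegate is attached.
    """
    interpreter.allocate_tensors()
    in_idx = interpreter.get_input_details()[0]["index"]
    out_idx = interpreter.get_output_details()[0]["index"]
    interpreter.set_tensor(in_idx, input_data)
    interpreter.invoke()  # delegated ops run on the accelerator, the rest on CPU
    return interpreter.get_tensor(out_idx)


def make_interpreter(model_path, delegate_path=None):
    """Build a TFLite interpreter, optionally with a hardware delegate.

    delegate_path is a placeholder: substitute the accelerator delegate
    shared library documented for your processor.
    """
    # Imported inside the function so the sketch loads even where
    # tflite_runtime is not installed.
    from tflite_runtime.interpreter import Interpreter, load_delegate

    delegates = [load_delegate(delegate_path)] if delegate_path else None
    return Interpreter(model_path=model_path, experimental_delegates=delegates)
```

Because the interpreter interface is the same with or without a delegate, the application code does not change when inference moves from CPU to the accelerator; only the interpreter construction does.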

About TI Edge AI Cloud

TI Edge AI Cloud is a free online service that lets you evaluate accelerated deep learning inference on TDA4x processors without purchasing an evaluation board. The service is Python-based, and it takes only a few minutes to log in, deploy a model, and collect a variety of performance benchmarks.
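The kind of latency benchmark the service reports can be reproduced for any model callable with a few lines of standard Python. This is a generic sketch, not TI's benchmarking code: it warms up the runtime first (the first inferences often pay one-time initialization costs) and then reports average and best wall-clock latency in milliseconds.

```python
import time

def benchmark(infer_fn, warmup=3, runs=20):
    """Time a zero-argument inference callable.

    Returns (average_ms, best_ms) over `runs` timed calls, after
    `warmup` untimed calls to absorb one-time setup costs.
    """
    for _ in range(warmup):
        infer_fn()
    latencies_ms = []
    for _ in range(runs):
        start = time.perf_counter()
        infer_fn()
        latencies_ms.append((time.perf_counter() - start) * 1e3)
    return sum(latencies_ms) / len(latencies_ms), min(latencies_ms)
```

Usage is simply `avg_ms, best_ms = benchmark(lambda: run_my_model(sample_input))`; reporting the best run alongside the average helps separate steady-state accelerator performance from scheduling noise.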


This course is also part of the Process This: Edge AI technology topics series

Date: June 4, 2021