Hello, everyone, and welcome to the training on the mmWave SDK. This training is intended for software, system, and test engineers who are working with TI single-chip mmWave radar devices. The goal of this training is to help you understand the software offerings for these mmWave sensors, such as the AWR1443, AWR1642, IWR1443, and IWR1642. The common software package for all these devices is called the mmWave SDK.
This training provides an overview of the mmWave SDK building blocks and the core components. It explains the interaction between the different SDK components and also describes the typical control and data flow in these mmWave devices.
Before going into the details of the SDK, let's first understand the different platforms for these mmWave sensors. Both the xWR14xx and the xWR16xx are platforms for the single-chip autonomous mmWave sensors. The first platform, xWR14xx, which includes AWR1443 and IWR1443, has four receive channels and three transmit channels and comes with a user-programmable R4F microcontroller and a hardware accelerator for the signal processing. The second platform, which includes AWR1642 and IWR1642, has four receive channels and two transmit channels and comes with the same microcontroller, but has a C674x DSP instead of the hardware accelerator. The data from these devices can be sent to an external processor or controller using the various external interfaces available.
Please note that both these devices need an external serial flash where the application code for the microcontroller and the DSP can be stored. So in order to enable application and software development on these devices, TI provides platform software, which includes the TI real-time operating system for the R4F and the DSP; device drivers for the different peripherals; the APIs to program the mmWave device and the libraries for the signal processing; different demo applications to demonstrate the final use case; and certain tools.
The platform software for these devices is called the mmWave software development kit, or mmWave SDK. This is the high-level block diagram of the TI mmWave SDK. In the color coding, the blue blocks indicate the production-quality components provided by TI. Components in the yellow blocks are provided as a reference or example, and users are expected to develop them according to their system use cases.
Now let's look at the different components in the TI mmWave SDK. At the bottom of this block diagram is the bootloader, which resides in the Master SS of the R4F MCU and is responsible for the booting of the entire device. It also loads the application code from the serial flash into the different cores and executes them.
The Radar SS Firmware or the mmWave Front End controls the RF and the analog hardware blocks and is responsible for the entire mmWave Radar operations. The mmWaveLink is the driver for the Radar SS Firmware and provides low level APIs to control each of the hardware blocks in the front end.
The mmWave API is an abstraction over mmWaveLink and provides simple APIs for the application to configure the front end. The mmWave API also handles the synchronization and the interprocess communication between the R4F MCU and the DSP subsystem. The SDK also includes the TI real-time operating system, TI-RTOS, and the RTOS-based drivers for both the master subsystem and the DSP subsystem. All the drivers come with an OSAL, which means that the drivers can be ported to a different RTOS as well.
The mmWave library contains standard routines and algorithms for signal processing, such as the FFT and the CFAR algorithm. There are some simple applications packaged in the SDK which demonstrate the usage of these components to create a simple proximity [INAUDIBLE] application. The mmWave Demo Visualizer is a GUI which runs on a PC and plots the object data, such as the range, velocity, and relative position of the objects.
This is the directory structure for the TI mmWave SDK. Once you install the SDK, this is how you will see the directory structure on your PC. At the root, there are four folders: packages, docs, firmware, and tools. The packages folder contains the script files for building the mmWave SDK. The TI folder contains all the source files for the components that were discussed in the last slide. So it includes the source files for the demos, which are the out-of-box demo and the capture demo. The drivers folder contains the source files for the drivers, including the associated PinMux and the other peripheral drivers. Each driver comes with an OSAL so it can be ported to a different RTOS.
All the drivers come with the corresponding doxygen documents, which explain how to use the driver at the unit level, and also with unit tests for the driver APIs. The control folder contains the APIs to control the mmWave Front End, so it has mmWaveLink, which provides the low-level APIs, and then the mmWave API, which provides the high-level APIs.
The algorithm folder contains mmWaveLib, which is the signal processing library. The platform folder contains the platform-specific files, such as the linker command files for these platforms. The utils folder contains all the utility applications, such as the ccsdebug application, the command line interface to send data over the UART interface, the system profiler, which is the cycleprofiler, and the testlogger.
Besides the source files, the mmWave SDK contains the docs folder, which contains the release notes and the user guides. These are the starting point for any new user. The release notes contain the information about the release: what the different components are and what the dependencies on the different tools are. Along with that, they also capture the new changes in this particular release.
The user guide contains the information on how to run the mmWave SDK demos, so the out-of-box and the capture demo, how to set up the mmWave SDK, and how to build the mmWave SDK. And finally, there is a firmware folder, which contains the firmware for the Radar subsystem, and a tools folder. For more information on the SDK contents, you can refer to the release notes and the user guides.
Now let's have a brief look into each of these building blocks. The mmWave SDK drivers offer portable and feature-rich access to the peripherals through easy-to-use APIs. Here is an example of one of these drivers, and all the drivers follow the same structure and conventions. For example, the device and CPU configuration allows each driver to be portable across different CPUs. In this case, the drivers are common across the MSS and the DSP subsystem.
Each driver uses the OSAL to provide thread-safe and controlled access to the hardware IP. The TI-RTOS provides all the OS-related functionality to these drivers. All the drivers in the SDK come with their doxygen documents and a test application, which demonstrates the usage of these APIs.
Now let's look into the mmWave component. mmWaveLink is the driver for the Radar subsystem, or the mmWave Front End. As you know, the Radar subsystem controls the RF and the analog hardware blocks and is responsible for the entire mmWave radar operation. Any of its internal blocks can be controlled using messages sent over the mailbox. The mmWaveLink framework provides the infrastructure which generates these messages and also handles the communication over the mailbox.
mmWaveLink provides the low-level APIs to control the Radar subsystem and the FMCW chirp configuration. Here is a snapshot of some of the APIs. The Device Manager APIs can be used to initialize the driver and do the handshake with the mmWave Front End. The Sensor Control APIs allow you to configure the different blocks in the mmWave Front End; for example, you can configure how many receive and transmit channels need to be enabled. Similarly, you can configure the ADC format and the chirp configuration, and you can use the APIs to start and stop the transmission of the frames.
The mmWave API component provides high-level APIs, which abstract the low-level mmWaveLink APIs and provide a simple interface to the application. The module runs on both the MSS and the DSP and allows the flexibility to configure the Radar SS Firmware, or the mmWave Front End, from either of these cores. It also handles the communication between the MSS and the DSP to provide the synchronization.
In this example, the configuration of the front end is done from the MSS application, but the mmWave API passes the configuration to the DSP so that the DSS can configure the EDMA and the signal processing chain accordingly.
mmWaveLib, which runs on the DSP subsystem, provides key routines for the FMCW signal processing. These include the various FFT routines performed on the input data; basic detection algorithms, such as CFAR-CA; the different angle estimation FFTs; and other helper routines, such as scaling, shifting, and accumulation. All these routines are optimized for the C674x architecture to give better system performance.
This is a typical flow in the mmWave SDK. As the bootloader brings the MSS and the DSS out of reset, the application on the master subsystem uses the mmWave API and the mmWaveLink driver to communicate with the mmWave Front End. The application does all the configuration of the mmWave Front End and triggers the frame. Once the frame is triggered, the ADC data from the front end is moved to the ADC Buffer.
The ADC Buffer is a ping-pong buffer, which means that while the data is getting moved from the mmWave Front End to the Pong Buffer, the data from the Ping Buffer is moved to the DSP memory. The DSP uses the mmWave library to do the Range FFT on the Ping Buffer. Once the Range FFT is complete, it uses the EDMA driver to move the data to the L3 Memory. The sequence is repeated for all the chirps in a frame, and the Range FFT is calculated for all the chirps.
During inter-frame time, DSP moves the data from the L3 Memory back to the DSP memory and does the Doppler FFT. The result of the Doppler FFT is, again, moved to the L3 Memory. After Doppler FFT, DSP runs the basic CFAR algorithm for the object detection and uses the angle estimation routine to find the relative position of the objects.
The data is then sent back to the master subsystem. The MSS can send this information to an external controller using the different peripheral drivers, or it can pass this data to an external processor to do the object tracking and classification.
Now, let's look into the detailed control flow. The MSS application uses SOC_init to initialize the device and power up the mmWave Front End. It then uses the MMWave_init API to do the basic initialization of the mailbox and the other key drivers. After that, it waits for the front end bootup to complete.
The DSP application, on the other hand, uses the MMWave_execute API, which sets up the IPC to receive the data from the MSS. After this, both the MSS and the DSP synchronize to check each other's health. The DSS also initializes the EDMA and the ADC Buffer for the data processing.
Once the initialization is complete and the MSS and the DSS are synchronized, the MSS application uses the MMWave_config API to pass the configuration from the application to the mmWave Front End. The mmWave API uses the mmWaveLink API, which constructs the mailbox message and sends it to the mmWave Front End. The mmWave Front End, once it receives a message, checks the integrity of the message and sends an acknowledgement back to mmWaveLink. In this way, all the messages are sent to the front end.
When the mmWave API has sent all the configuration to the front end, it also passes the configuration to the DSP. The DSP, once it receives this configuration, sets up the EDMA and the ADC Buffer accordingly. Once all the configuration is done, the MSS application uses the MMWave_start API, which sends a message to the DSP. Once the DSP receives the start message, it enables the data processing and waits for the ADC data.
After that, the mmWave API uses the mmWaveLink API to send a start message to the mmWave Front End. Once the mmWave Front End receives this start message, it starts the transmission of frames.
Now let's look into the data flow. Once a frame is triggered, the mmWave Front End moves the ADC samples into the Ping-Pong ADC buffer. Once one of these buffers is full, it generates an interrupt to the DSP, and the EDMA moves the data from one of these buffers into the DSP memory. DSP then uses the mmWaveLib to do the Range FFT on this buffer and moves the result to the L3 Memory. The sequence is then repeated for all the chirps in a frame, and the Range FFT is stored into the L3 Memory.
During the inter-frame time, the DSP configures the EDMA to move the data from the L3 Memory back to the DSP Memory and does the Doppler FFT. When the Doppler FFT is done, DSP, again, configures the EDMA to move the data back to the L3 Memory.
After that, the DSP does the object detection and the angle estimation and moves the object list to one of the object buffers in a ping-pong manner. The MSS then picks up this object list from one of these buffers and sends it to the external controller.
With this, we have come to the end of this training session. For more information, do refer to the mmWave SDK user guide and the demo applications. I hope that the training was helpful, and I sincerely thank you for your valuable time. Thank you very much.