Getting Started with TinyML – Exploring Evaluation Kits

By Mark Patrick, Mouser Electronics


While there are many potential applications for low-power microcontrollers running TinyML, they are particularly well suited to edge-based industrial internet of things (IIoT) devices. A wireless battery-powered motor condition monitor or a long-range crop supervision system are good examples of where they can be deployed.

There are many TinyML development kits available that can help you to get your first microcontroller-based machine learning application up and running.

In this article we explore some of these development kits and evaluation boards, all of which are well supported by popular TinyML libraries and workflow resources such as Google's TensorFlow Lite for Microcontrollers and Edge Impulse.

TinyML Platform Showcase

Google AIY Vision Kit

The Google AIY Vision Kit machine learning platform is highly integrated and easy to get started with – see Figure 1.

Figure 1 – The Google AIY vision kit (source Google)

Based around the popular Raspberry Pi Zero and housed in a robust die-cut cardboard case, the kit contains all the hardware, cables and firmware required to create a neural-network-based device that can identify faces and emotions and recognise a set of 1,000 common objects. It also includes a Raspberry Pi V2 camera and the Google Vision Bonnet (HAT). The Vision Bonnet uses a low-power Intel Movidius Myriad 2 MA2450 vision processing unit (VPU) to perform image inferencing tasks, and it provides visual and audible outputs using a multicolour LED and a piezo buzzer, with no requirement for an internet connection. A step-by-step tutorial on how to set up the kit is available on a companion website, which also provides examples showing how to implement the Google TensorFlow Lite neural network models.

A joy detector model is the first demo in the tutorial. An LED on the kit lights up when the camera detects a person's face. The LED changes to yellow if it detects a smile and to blue if it detects a frown. If it detects a large smile, the piezo buzzer plays a sound. If several faces are detected, the LED and piezo outputs reflect all the facial expressions combined.
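The behaviour described above can be sketched as a simple decision function. This is a hypothetical illustration only: the function name, score representation, and thresholds below are all invented, and the real kit derives a joy score from the Vision Bonnet's face model rather than the values shown here.

```python
def led_colour(faces):
    """Map detected faces to an LED colour and a buzzer decision.

    `faces` is a list of joy scores in the range [0, 1], one per
    detected face (0 = frown, 1 = big smile). Returns (colour, buzz).
    Thresholds are illustrative, not taken from the real demo.
    """
    if not faces:
        return ("off", False)
    # With several faces, combine the expressions by averaging scores.
    joy = sum(faces) / len(faces)
    if joy > 0.85:              # large smile: also play a sound
        return ("yellow", True)
    if joy > 0.5:               # smile
        return ("yellow", False)
    return ("blue", False)      # frown

print(led_colour([0.9]))   # ('yellow', True)
print(led_colour([0.2]))   # ('blue', False)
```

The averaging step is one plausible way to realise the "all facial expressions combined" behaviour; the actual firmware may combine faces differently.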

Connection to a remote processor (a Raspberry Pi with a screen attached, or another computer via SSH) is required for the object image recognition demo to receive output messages.

Figure 2 illustrates the terminal output from the image classification algorithm, showing the probabilities for each label.

Figure 2 – Image classification terminal outputs (source Google)

SparkFun 15170 Development Board

The SparkFun 15170 Development Board is a noteworthy evaluation platform because it has been designed to use considerably less power than the Google AIY kit – see Figure 3.

Figure 3 – SparkFun 15170 Development Board (source SparkFun)

The kit features the Ambiq Apollo3 Blue ultra-low-power 32-bit Arm Cortex-M4F microcontroller, clocked at 48 MHz (up to 96 MHz in burst mode), which consumes just 6 µA/MHz. It also includes a Bluetooth® Low Energy (BLE 5) wireless transceiver, an STMicroelectronics LIS2DH 3-axis accelerometer and two Knowles MEMS microphones. The board can run off a single coin cell for ten days (assuming 1.6 mA current consumption at 3 V).
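The ten-day figure is a straightforward capacity-over-current calculation. The sketch below reproduces it using the 1.6 mA average draw quoted above; the roughly 400 mAh cell capacity is an assumption introduced here to make the arithmetic come out to about ten days, not a figure from the source.

```python
def runtime_days(capacity_mah, avg_current_ma):
    """Estimated battery life in days: capacity divided by average
    current draw, converted from hours to days."""
    return capacity_mah / avg_current_ma / 24.0

# Assumed ~400 mAh coin cell; the 1.6 mA average draw is from the text.
print(round(runtime_days(400, 1.6), 1))  # ≈ 10.4 days
```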

An example word-trigger application is available in Google's CodeLab repository. It uses TensorFlow Lite for Microcontrollers and a convolutional neural network to recognise the spoken words 'yes' and 'no', flashing a different LED for each result. Other example applications are available in the TinyML book by Pete Warden and Daniel Situnayake, and further resources and lectures can be found on the TinyML Foundation website.
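Word-trigger demos of this kind typically do not act on a single inference: scores are averaged over a short window of recent results and a word is reported only when its smoothed score clears a threshold. The sketch below shows that smoothing idea in simplified form; the class name, window size, and threshold are illustrative choices, not the parameters of Google's actual example.

```python
from collections import deque

class CommandRecogniser:
    """Average recent per-word scores and report a word only when its
    smoothed score crosses a threshold. Simplified illustration of the
    post-processing used by keyword-spotting demos; all parameters here
    are invented for the example."""

    def __init__(self, labels, window=3, threshold=0.8):
        self.labels = labels
        self.history = deque(maxlen=window)   # recent score frames
        self.threshold = threshold

    def process(self, scores):
        """Feed one frame of per-label scores; return a label or None."""
        self.history.append(scores)
        n = len(self.history)
        avg = [sum(frame[i] for frame in self.history) / n
               for i in range(len(self.labels))]
        best = max(range(len(avg)), key=lambda i: avg[i])
        if avg[best] >= self.threshold:
            return self.labels[best]   # e.g. light the matching LED
        return None

rec = CommandRecogniser(["silence", "unknown", "yes", "no"])
word = None
for frame in ([0.1, 0.1, 0.7, 0.1],
              [0.0, 0.1, 0.8, 0.1],
              [0.0, 0.0, 0.95, 0.05]):
    word = rec.process(frame)
print(word)  # 'yes' once the averaged score clears the threshold
```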

STMicroelectronics 32F746G Discovery Kit

The STMicroelectronics 32F746G Discovery Kit is also highlighted in the TinyML book – see Figure 4. It has more resources than a typical edge sensor device might require, making it an excellent TinyML prototyping platform. The board features the STM32F746NGH6 microcontroller (which has many useful low-power features and sleep modes), making it well suited for deployment in low-power edge devices. Other features of the kit include a 4.3-inch 480 x 272 colour LCD-TFT capacitive touch screen, two ST MEMS microphones, 128 Mbits of Quad-SPI Flash and 64 Mbits of accessible SDRAM, an onboard ST-LINK debugger/programming interface, and stereo speaker outputs. Peripheral interfaces and connectivity include USB, a virtual COM port, SPDIF, Arduino Uno V3 headers, and an Ethernet socket. The board supports ST's STM32CubeIDE and the full set of STM32CubeMX libraries. Mouser's YouTube channel has an introductory video featuring this kit.

Figure 4 – The STMicroelectronics 32F746G Discovery Kit (source STMicroelectronics)

Silicon Labs Thunderboard Sense 2 IoT Development Starter Kit

The Thunderboard Sense 2 from Silicon Labs – see Figure 5 – features the Mighty Gecko wireless system-on-chip (SoC) and a 2.4 GHz radio which supports multiple protocols including BLE, Thread, and Zigbee. EFR32 microcontrollers have very low power consumption, making them ideal for use in IIoT/IoT applications. The board also includes multiple sensors covering relative humidity and temperature, air pressure, indoor air quality and gas, and ambient light and UV, plus digital microphones, a Hall-effect sensor, and a 6-axis MEMS combined gyroscope and accelerometer. Other features include an integrated Segger J-Link debugger, a USB virtual COM port, and high-brightness LEDs.

Figure 5 – The Silicon Labs Thunderboard Sense 2 IoT Development Starter Kit (source Silicon Labs)

The Thunderboard Sense 2 is also supported by Edge Impulse, an integrated workflow platform designed to assist with the training, testing, and deployment of embedded microcontroller machine learning applications. Edge Impulse includes many example models, such as continuous motion recognition. Figure 6 illustrates the development process. To set up the target board for Edge Impulse, several tools must be hosted on the development computer to perform data acquisition from a variety of sources – see Figure 7.
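Samples collected from the board are uploaded to Edge Impulse as JSON documents. The sketch below builds one such document following the general shape of Edge Impulse's published data acquisition format; treat it as an approximation rather than a reference, and note that the device name and type, sensor list, and the unsigned-upload fields are placeholder assumptions introduced here.

```python
import json

def make_sample(values, interval_ms=16):
    """Build a data-acquisition style JSON document (approximating the
    shape Edge Impulse's ingestion service accepts). All device details
    and sensor names below are placeholders for illustration."""
    return {
        "protected": {"ver": "v1", "alg": "none"},  # unsigned upload (assumption)
        "signature": "0" * 64,                       # placeholder signature
        "payload": {
            "device_name": "thunderboard-demo",      # placeholder
            "device_type": "SILABS_TB_SENSE_2",      # placeholder
            "interval_ms": interval_ms,              # time between samples
            "sensors": [
                {"name": "accX", "units": "m/s2"},
                {"name": "accY", "units": "m/s2"},
                {"name": "accZ", "units": "m/s2"},
            ],
            "values": values,                        # one list per sample
        },
    }

# Two accelerometer samples, 16 ms apart.
doc = make_sample([[0.1, 9.8, 0.0], [0.2, 9.7, 0.1]])
print(json.dumps(doc)[:60], "...")
```

In practice the Edge Impulse CLI tools mentioned above handle this packaging and upload automatically; the sketch is only meant to make the data flow concrete.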

Figure 7 – The Edge Impulse data acquisition options (source Edge Impulse)

TinyML – Are You Ready To Get Started?

The boards featured in this article can be used for most development tasks. Mouser's machine learning site includes project examples, like a label position checking application for use in product packaging, and many other great ideas to get you started.