ML Plugin User's Guide

 Objective

This guide will teach you how to use the Machine Learning Plugin to capture and transmit your data to our ML partner platforms where you can develop machine learning solutions that can be deployed in your embedded application. The ML Plugin works within MPLAB® Data Visualizer so that you can capture a live data stream from your target device, select the desired data region, and then upload it to the platform of your choice or log the data to a file for later use.

mplab-dv-ml.png

Collecting and curating data is one of the most critical steps in developing an ML solution. It is generally the most time-consuming activity in the design cycle, and the performance of the resultant model is highly dependent on the quality of the data. This is why it is crucial to have effective tools and processes for gathering data. The goal of the ML Plugin is to simplify the data collection process and to enable the rapid development of embedded ML solutions.

 Microchip's ML Design Partners

Microchip has partnered with experts in embedded ML to empower developers with the latest tools and technologies in the field. We work with multiple partners because each has its own unique strengths and areas of expertise.

ml-partner.png

With Edge Impulse and SensiML, developers can create solutions for a wide array of use-cases utilizing various sensor types. With Motion Gestures, developers can easily create new recognition models for custom, user-defined gestures. Our partners provide the necessary tools within their development platform to validate and optimize model performance before deployment.


ei-logo-rgb.png

Upload your data to the Edge Impulse Studio and start creating the next generation of intelligent devices with embedded machine learning. Edge Impulse is an open-source TensorFlow Lite-based framework for classification, regression, and anomaly detection. To learn more, visit Edge Impulse's website.


Motion-Gestures-logo.png
  • Solution Type: Complex 2D Gesture Recognition
  • Dev Platform: Motion Gestures SDK
  • Output Format: Gesture Recognition Engine (Static C Library)

Define new touch gestures on your target hardware and upload them to the Motion Gestures SDK to train a gesture recognition model that can be deployed back to your embedded application. Motion Gestures offers an advanced, out-of-the-box solution for complex gesture recognition that enables rapid development of high-accuracy gesture recognition systems. To learn more, visit Motion Gestures' website.


sensiml-logo.png

Log your data to a file for import into SensiML's Analytics Toolkit to get started developing classification and anomaly detection solutions that can be deployed in your embedded application. The Analytics Toolkit is great for beginners and experts alike, as it provides AutoML tools as well as fully customizable ML pipelines. To learn more, visit SensiML's website.


Partner solutions are suitable for deployment on Microchip Arm® Cortex®-based 32-bit microcontrollers and microprocessors.

 Materials

Hardware Tools

mlkits.png

This guide uses the SAMD21 ML Kit for data collection as an example; however, any time-series data can be used for building ML solutions with Edge Impulse and SensiML.

Software Tools

The ML Plugin and MPLAB® Data Visualizer can be installed as plugins to MPLAB® X via the Plugins Manager. Alternatively, the ML Plugin can be installed as a plugin to the standalone MPLAB® Data Visualizer.

Example Firmware

For Motion Gestures, a separate example firmware project (samc21-qt8-data-logger.X) is provided for logging 2D touch position data. Refer to "Using the ML Plugin with Motion Gestures" for details on uploading new gestures to the Motion Gestures SDK.

 Procedure

To use the ML Plugin, MPLAB® Data Visualizer must first be configured to receive data from the desired target device. This involves configuring the serial connection as well as the variable streamer, which parses variables from the serial stream. Once data sources are plotted in the Time pane, click the Mark button to tag the visible data for use in the ML Plugin. Mark places the cursors (A & B) at the bounds of the visible graph so that the viewable data can be used within the ML Plugin.

3dof-plot.png
Figure 1: MPLAB® Data Visualizer with 3-axis IMU data plotted in the Time Window

To select a new region of data, scroll and zoom as needed in the Time pane, then click Mark when the desired segment is precisely within the viewable window. Alternatively, the bounds of the time window can be set manually in the Time Axis menu on the right before pressing the Mark button.

Capturing sensor data with MPLAB® Data Visualizer

1

Program the Kit with Data Logger Firmware

Use MPLAB® X IDE to program the SAMD21 ML Kit with the provided example project. Be sure to select the correct project configuration in MPLAB® X before programming the device. There are two configurations to support both versions of the SAMD21 ML Kit.

  • IMU2 (Bosch): SAMD21_IOT_WG_BMI160
  • IMU14 (TDK): SAMD21_IOT_WG_ICM42688
project-config.png
Figure 2: MPLAB® X project configuration

If you are not familiar with MPLAB® X IDE, please visit the "MPLAB® X IDE" Developer Help page.

The general application settings for the SAMD21 Data Logger, such as the sensor sampling rate and the data logging format, can be found in app_config.h. This is also where the individual axes of the IMU can be enabled or disabled. This guide will use only the three axes from the accelerometer, however, the logged axes can be reconfigured as needed based on the application.

app-config.png
Figure 3: Data logger application settings in app_config.h

Once the kit is programmed with the desired configuration, you are ready to move on to collecting the serial data stream with MPLAB® Data Visualizer.

If you are not familiar with MPLAB® Data Visualizer, please see the "Data Visualizer User's Guide".

2

Configure MPLAB® Data Visualizer

Leave the board connected to the computer and open MPLAB® Data Visualizer. Load the Data Visualizer workspace file 3dof-imu-acc.dvws found in the example firmware repository. This workspace already contains the variable streamer required to parse the IMU data, and it will plot each variable once the serial port is configured.

load-ws.png
Figure 4: Load Data Visualizer workspace

After loading the Data Visualizer workspace file, select the Serial/CDC Connection that corresponds to the SAMD21 ML Kit. Adjust the baud rate to 115200 and click Apply. The DGI connection can also be disabled since we will not use any debug data.

serial-port-config.png
Figure 5: Serial port configuration

Use the play button on the Serial/CDC Connection to start data collection from the kit. Once data is streaming, it is available for use with the variable streamer.

play-pause.png
Figure 6: Start streaming selected serial connection

Now select the same Serial/CDC Connection as the input data source for the IMU variable streamer, so that the data axes can be parsed from the stream.

data-source.png
Figure 7: Variable streamer data source selection
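The variable streamer's job is to split each incoming serial frame into the individual IMU variables. As an illustration only, the sketch below parses one sample assuming a simple comma-separated ASCII format; the actual wire format depends on the data logging format configured in app_config.h, so treat the format string here as an assumption.

```python
# Hypothetical example: parsing one 'ax,ay,az' line of a
# comma-separated ASCII sample stream into floats. The actual
# format is set by the data logging format in app_config.h.

def parse_imu_line(line: str) -> tuple:
    """Split a single 'ax,ay,az' line into a tuple of floats."""
    ax, ay, az = (float(v) for v in line.strip().split(","))
    return (ax, ay, az)

if __name__ == "__main__":
    print(parse_imu_line("0.012,-0.034,9.810"))
```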

The IMU data should now be available in the time plot. Double-click anywhere within the time plot to start or stop scrolling of the time axis.

time-plot.png
Figure 8: Data Visualizer Time Plot

3

Select Data Region and Mark Time Window

Once the desired data sources are plotted in the graph, select a region of interest in the data by focusing the Time Plot on that region. You can drag the plots in the time window to the desired region of data while scrolling to zoom in or out as needed. When you are satisfied with the data viewable in the Time Plot, click the Mark button to tag this region of data for use in the ML plugin.

mark-time-window.png
Figure 9: Mark the data visible in the time window for use in the ML Plugin

Pressing Mark will place the cursors at the bounds of the visible window. To select a new region of data, reposition the desired data within the Time Plot and then press Mark again.

marked-data.png
Figure 10: Time Plot with marked data ready for use in the ML Plugin

After marking the Time pane, the data is ready to be used within the ML Plugin. This general process of configuring, plotting, and marking the serial data stream can be followed for any type of time-series data that is available in Data Visualizer.

Data can also be logged directly to a file by clicking Snapshot in the Time Axis menu of Data Visualizer. This will automatically mark the visible data and allow for saving in .csv or .json format.
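A snapshot saved in .csv format can be processed later with ordinary tooling. As a minimal sketch, the function below loads an exported CSV into a list of row dictionaries; the column names shown in the test data ("accX" and so on) are assumptions for illustration, so check the header row of your own exported file.

```python
# Hypothetical example: reading a Data Visualizer snapshot exported
# as CSV. Column names are assumptions -- check the header row of
# your own exported file.
import csv

def load_snapshot(path):
    """Return the snapshot as a list of {column: value} dicts."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))
```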

Using the ML Plugin

Edge Impulse

When using the ML Plugin with Edge Impulse, you can upload new data to the Data Acquisition tab in the Edge Impulse Studio. This is used to collect the training and testing data that will be used to develop the model. Once you have trained a model within the Edge Impulse Studio, the ML Plugin can be used to test model performance by uploading new data segments to the testing endpoint and then viewing the classification results within the ML Plugin.
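For orientation, uploads to Edge Impulse follow its JSON data acquisition format: a payload describing the sensors, the sampling interval, and the raw values. The sketch below builds such a payload; the device type, sensor names, and signature handling are illustrative assumptions, and the Edge Impulse documentation is the authoritative reference for the schema.

```python
# Sketch of a sample in Edge Impulse's JSON data acquisition format.
# Device type and sensor names are made-up placeholders; consult the
# Edge Impulse documentation for the authoritative schema.

def make_ei_sample(values, interval_ms=10.0):
    """Wrap raw accelerometer rows in a data-acquisition payload."""
    return {
        "protected": {"ver": "v1", "alg": "none"},  # unsigned upload
        "signature": "0" * 64,                      # placeholder
        "payload": {
            "device_type": "SAMD21-ML-KIT",         # assumed name
            "interval_ms": interval_ms,
            "sensors": [
                {"name": "accX", "units": "m/s2"},
                {"name": "accY", "units": "m/s2"},
                {"name": "accZ", "units": "m/s2"},
            ],
            "values": values,  # one row per sample: [ax, ay, az]
        },
    }
```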

For a detailed guide on the Edge Impulse functionality, see "Using the ML Plugin with Edge Impulse".


Motion Gestures

When using the ML Plugin with Motion Gestures, you can upload new user-defined, 2D gestures to the Motion Gestures SDK. This is used to add new gestures to the Gesture Library within your account and also to upload new gestures for testing model performance.

For a detailed guide on the Motion Gestures functionality, see "Using the ML Plugin with Motion Gestures".


SensiML

When using the ML Plugin with SensiML, you can save data to a CSV file that is formatted for import into SensiML's Data Capture Lab. The ML plugin also provides an option to generate a .dcli file (a JSON-based SensiML metadata file) that contains additional metadata about your sample.
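To give a feel for what such a metadata file contains, the sketch below writes a minimal JSON document that pairs a CSV sample with a label. The field names are assumptions for illustration only; the ML Plugin generates the actual .dcli file, so treat its output as authoritative.

```python
# Illustrative sketch of a JSON metadata file describing a CSV sample.
# Field names are assumptions; the ML Plugin's generated .dcli file
# is the authoritative format for import into SensiML's tools.
import json

def write_metadata(path, csv_name, label):
    """Write a minimal metadata document referencing one CSV file."""
    doc = {
        "files": [
            {
                "file_name": csv_name,
                "metadata": [{"name": "Label", "value": label}],
            }
        ]
    }
    with open(path, "w") as f:
        json.dump(doc, f, indent=2)
```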

For a detailed guide on the SensiML functionality, see "Using the ML Plugin with SensiML".

 Results

You should now understand the general process of collecting live sensor data from your target device with MPLAB® Data Visualizer and then sending it to one of our ML partner platforms with the Machine Learning Plugin. From here, you can get to work developing edge-optimized ML solutions that can be deployed in your embedded application.

© 2021 Microchip Technology, Inc.
Notice: Arm and Cortex are registered trademarks of Arm Limited in the EU and other countries.
Information contained on this site regarding device applications and the like is provided only for your convenience and may be superseded by updates. It is your responsibility to ensure that your application meets with your specifications. MICROCHIP MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND WHETHER EXPRESS OR IMPLIED, WRITTEN OR ORAL, STATUTORY OR OTHERWISE, RELATED TO THE INFORMATION, INCLUDING BUT NOT LIMITED TO ITS CONDITION, QUALITY, PERFORMANCE, MERCHANTABILITY OR FITNESS FOR PURPOSE. Microchip disclaims all liability arising from this information and its use. Use of Microchip devices in life support and/or safety applications is entirely at the buyer's risk, and the buyer agrees to defend, indemnify and hold harmless Microchip from any and all damages, claims, suits, or expenses resulting from such use. No licenses are conveyed, implicitly or otherwise, under any Microchip intellectual property rights.