Using the ML Plugin with Motion Gestures


This guide covers usage of the ML Plugin with our design partner, Motion Gestures. From the ML Plugin, it is possible to upload new gestures to the Motion Gestures SDK, where they can be used to train and test new gesture recognition models that can be deployed back to the embedded application.



Hardware Tools

Figure 1: SAMC21 Xplained Pro with the QT8 Xplained Pro connected to EXT1

Touch gesture data can be gathered from any platform that supports 2D touch sensing. One example is the Integrated Graphics and Touch (IGAT) Curiosity Development Kit, which also demonstrates the Motion Gestures solution in the Legato Showcase Demo Firmware.

Software Tools

The ML Plugin and MPLAB® Data Visualizer can both be installed as plugins to MPLAB® X via the Plugins Manager; alternatively, the ML Plugin can be installed as a plugin to the standalone MPLAB® Data Visualizer.

Exercise Files


This section contains instructions for using the ML Plugin with Motion Gestures. The first part of this guide covers capturing 2D touch position data streamed over a serial connection from the SAMC21 Xplained Pro and viewing it within the XY Plot of Data Visualizer. The guide then covers using the ML Plugin to upload captured gestures to the Motion Gestures SDK for training new models and testing their performance. To learn more about Motion Gestures' touch gesture recognition solution, check out the "Motion Gestures Touch Demo".

Capturing sensor data with MPLAB® Data Visualizer


Program the Kit with Data Logger Firmware

Use MPLAB® X IDE to program the SAMC21 Xplained Pro with the provided example project. This firmware uses the on-chip Peripheral Touch Controller (PTC) to detect and track touch contacts on the QT8 Xplained Pro Touch Surface. The resulting touch coordinates, calculated by the QTouch Modular Library (QTML), are streamed over a serial connection to MPLAB® Data Visualizer.

If you are not familiar with MPLAB® X IDE, please visit the "MPLAB® X IDE" Developer Help page.

The touch configuration parameters, such as the sensor sampling rate, can be found in touch.h; these settings can also be viewed or reconfigured within MPLAB® Harmony v3. Take note of the sampling rate, as it is needed later in the project configuration menu of the Motion Gestures SDK. The example firmware uses a sampling rate of 200 Hz, with the library configured for self-capacitance sensing so that the QT8 Xplained Pro can be used as the touch sensor.

Figure 2: Touch acquisition settings in touch.h
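
As a quick sanity check when filling in the Motion Gestures project configuration, the sampling rate configured in touch.h can be converted to a measurement period. The helper below is purely illustrative and not part of any Microchip tool:

```python
def measurement_period_ms(sampling_rate_hz: float) -> float:
    """Measurement period in milliseconds for a given touch sampling rate."""
    return 1000.0 / sampling_rate_hz

# The example firmware samples at 200 Hz, i.e., one touch measurement every 5 ms.
print(measurement_period_ms(200.0))  # -> 5.0
```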

Once the kit is programmed with the desired configuration, you are ready to move on to collecting the serial data stream with MPLAB® Data Visualizer.

If you are not familiar with MPLAB® Data Visualizer, please see the "Data Visualizer User's Guide".


Configure MPLAB® Data Visualizer

Leave the board connected to the computer and open MPLAB® Data Visualizer. Load the Data Visualizer workspace file 2d-touch-position.dvws found in the example firmware repository. This workspace already contains the variable streamer required to parse the X/Y position data, and it will plot each variable once the serial port is configured.

Figure 3: Load Data Visualizer workspace

After loading the Data Visualizer workspace file, select the Serial/CDC Connection that corresponds to the SAMC21 Kit. Adjust the baud rate to 115200 and click Apply. The DGI connection can also be disabled since we will not use any debug data.

Figure 4: Serial port configuration

Use the Play button on the Serial/CDC Connection to start data collection from the kit. Once data is streaming, it is available for use with the variable streamer.

Figure 5: Start streaming selected serial connection

Now select the same Serial/CDC Connection as the input data source for the Touch Position variable streamer, so that the data axes can be parsed from the stream.

Figure 6: Variable streamer data source selection
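
The variable streamer recovers the X/Y variables by decoding framed binary data from the serial stream. The actual frame layout is defined in the 2d-touch-position.dvws workspace; the sketch below only illustrates the idea, using made-up start/end delimiter bytes and little-endian 16-bit coordinates:

```python
import struct

START = 0x03  # hypothetical frame delimiters; the real values come from
END = 0xFC    # the variable streamer configuration in the .dvws workspace

def decode_frames(buf: bytes):
    """Yield (x, y) touch coordinates from a framed byte stream."""
    i = 0
    while i + 6 <= len(buf):
        if buf[i] == START and buf[i + 5] == END:
            x, y = struct.unpack_from("<hh", buf, i + 1)
            yield x, y
            i += 6
        else:
            i += 1  # resynchronize on the next candidate start byte

# Two synthetic frames: (100, 200) and (101, 205)
stream = (bytes([START]) + struct.pack("<hh", 100, 200) + bytes([END])
          + bytes([START]) + struct.pack("<hh", 101, 205) + bytes([END]))
print(list(decode_frames(stream)))  # -> [(100, 200), (101, 205)]
```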

The X/Y position data is now available in the Time Plot, and the XY Plot visualizes the same data in two dimensions. Double-click anywhere within the Time Plot to start or stop scrolling of the time axis, and zoom in or out to decrease or increase the length of the plotting window.

Figure 7: Data Visualizer Time Plot


Select Data Region and Mark Time Window

Once the touch position data is visualized in the XY Plot, gesture data can be marked for use in the ML Plugin. To capture a new gesture, simply draw it on the QT8 Touch Surface and then focus the time window on the specific time region where the gesture was performed. Confirm that the gesture is properly segmented by checking the XY Plot. When ready, click the Mark button in the Time Axis menu.

Figure 8: Mark the data visible in the time window for use in the ML Plugin

Pressing Mark will place the cursors at the bounds of the visible window. To select a new region of data, first reposition the desired data within the Time Plot, then press Mark again.

Figure 9: Time Plot with marked data ready for use in the ML Plugin

After marking the time window, the gesture data displayed in the XY Plot is ready to be used within the ML Plugin. Repeat this process as needed to capture new gestures for use with Motion Gestures.
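
Conceptually, marking the time window selects the slice of samples whose timestamps fall between the two cursors. Assuming a fixed-rate stream where timestamps follow from sample indices, that selection could be sketched as follows (a hypothetical helper, not part of the plugin):

```python
def samples_in_window(samples, t_start_ms, t_end_ms, rate_hz=200):
    """Return the (x, y) samples whose timestamps lie within [t_start_ms, t_end_ms].

    `samples` is a list of (x, y) pairs captured at a fixed `rate_hz`;
    timestamps are derived from the sample index, as in a fixed-rate stream.
    """
    period_ms = 1000.0 / rate_hz
    return [xy for i, xy in enumerate(samples)
            if t_start_ms <= i * period_ms <= t_end_ms]

# 1 s of idle data followed by a 0.5 s gesture, sampled at 200 Hz
trace = [(0, 0)] * 200 + [(i, 2 * i) for i in range(100)]
gesture = samples_in_window(trace, 1000.0, 1499.0)
print(len(gesture))  # -> 100
```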

Uploading data to the Motion Gestures SDK


Log in With Your Motion Gestures Credentials

After selecting Motion Gestures within the ML Plugin, you will be prompted to log in. This enables the plugin to connect to your account on the Motion Gestures SDK for uploading the captured gesture data. If you do not yet have an account on the Motion Gestures SDK, you can create one for free on the registration page.

Figure 10: Log in to Motion Gestures from the ML Plugin


Upload to Gesture Library

To upload the new gesture to your Gesture Library, enter a name for the gesture and click Upload. Once the gesture is available in the Motion Gestures SDK library, you can add it to one of your projects and then train a model that can recognize it. To learn more about using the Motion Gestures SDK, see the "Motion Gestures User's Guide".

Figure 11: Upload a new gesture to the gesture library


Upload to Project for Model Testing

Once you have trained a model in the Motion Gestures SDK, you can test its performance by uploading new gestures from the ML Plugin; the recognition results are displayed in the plugin. This allows you to validate model performance and, if tuning is needed, use the test results to iterate on and improve the project configuration.

Figure 12: Upload a new gesture for testing

To find your project's API key within the Motion Gestures SDK, open the project settings menu and select Details. Then copy the API key and paste it into the ML Plugin to upload new gestures for classification.

Figure 13: Motion Gestures SDK - Project Details

After uploading a gesture for testing, the response will be displayed in the ML Plugin along with the confidence rating associated with the recognition. This way the solution performance can be assessed with data directly from the target hardware.
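
When running several test uploads, the recognition results can be tallied to estimate how well the trained model generalizes. The sketch below assumes each test is recorded as an (expected, recognized, confidence) triple; this structure and the 0.8 confidence threshold are illustrative, not the plugin's actual output format:

```python
def summarize_tests(results, min_confidence=0.8):
    """Fraction of test gestures recognized correctly with sufficient confidence."""
    correct = sum(1 for expected, recognized, conf in results
                  if recognized == expected and conf >= min_confidence)
    return correct / len(results)

tests = [
    ("circle", "circle", 0.97),
    ("circle", "swipe",  0.55),  # misrecognition: candidate for retraining
    ("swipe",  "swipe",  0.91),
    ("swipe",  "swipe",  0.62),  # correct label, but below the threshold
]
print(summarize_tests(tests))  # -> 0.5
```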


You should now understand how to collect live touch gesture data from your target device and send it to the Motion Gestures SDK for developing gesture recognition solutions. To deploy your own model as a static C library on a Microchip Arm® Cortex®-based 32-bit device, please contact Motion Gestures. If you would like to try a demo library running in an embedded application, check out the "Motion Gestures Touch Demo".

© 2021 Microchip Technology, Inc.
Notice: Arm and Cortex are registered trademarks of Arm Limited in the EU and other countries.
Information contained on this site regarding device applications and the like is provided only for your convenience and may be superseded by updates. It is your responsibility to ensure that your application meets with your specifications. MICROCHIP MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND WHETHER EXPRESS OR IMPLIED, WRITTEN OR ORAL, STATUTORY OR OTHERWISE, RELATED TO THE INFORMATION, INCLUDING BUT NOT LIMITED TO ITS CONDITION, QUALITY, PERFORMANCE, MERCHANTABILITY OR FITNESS FOR PURPOSE. Microchip disclaims all liability arising from this information and its use. Use of Microchip devices in life support and/or safety applications is entirely at the buyer's risk, and the buyer agrees to defend, indemnify and hold harmless Microchip from any and all damages, claims, suits, or expenses resulting from such use. No licenses are conveyed, implicitly or otherwise, under any Microchip intellectual property rights.