Motion Gestures Touch Demo
igat-demo.gif
Figure 1: Integrated Graphics and Touch (IGaT) Curiosity Development Kit with the Motion Gestures Touch Demo

This demo showcases Motion Gestures' embedded machine learning solution for touch gesture recognition. With Motion Gestures, users can quickly develop new ML models to recognize custom, user-defined gestures. The recognition models can then be deployed back to the embedded application and integrated with Microchip's 2D Touch Surface Library. The Motion Gestures solution is suitable for deployment on Microchip Arm® Cortex®-based 32-bit microcontrollers and microprocessors.

How it Works

The Motion Gestures recognition engine is deployed as a static C library within the embedded application. It interfaces with Microchip's 2D Touch Surface Library to detect gestures drawn by the user on the 2D touch surface.

New gesture recognition models can be developed rapidly because only a single example of each desired gesture is required to train the model. The Motion Gestures Software Development Kit (SDK) generates the training dataset from that single provided example, so developers can skip the lengthy data collection process typically required for ML solutions.

shortcut.jpg
Figure 2: Motion Gestures Development Flow

To get started developing new gesture recognition models, please contact Motion Gestures.

Gesture Definitions

The eight distinct gestures that the demo library is trained to recognize are shown below. The start of each gesture is marked by a dot and the end of the gesture is marked with an arrow. Recognition is path-dependent. Refer to the gesture definitions to see how each gesture should be drawn. Keep in mind that the finger should not be lifted until the gesture is complete.

8-gestures.png
Figure 3: Gestures recognized by the Motion Gestures Touch Demo library

The star gesture must begin at the top point, trace down to the left first, and then trace around until reaching the top again. This is the only variation of the five-pointed star that is recognized by this demo.

These eight distinct gestures map to six labels: M, Check Mark, S, 2, Alpha, and Star. The M and 2 gestures each have two variations. In the demo, both variations map to the same predicted label, so a capital M and a lowercase m both report the M gesture label.

Hardware Platforms

The Motion Gestures demo is available on two different hardware platforms. On both platforms, the demo is integrated within Microchip's 2D Touch Surface Library, so all of the standard 2D Touch and Gestures functionality remains available. The Motion Gestures Demo Library detects the complex demo gestures, while Microchip's 2D Touch Surface Library detects fundamental single-finger gestures such as taps, swipes, and wheels, as well as dual-finger gestures such as pinch, zoom, and dual-swipe.

1. SAME51 Integrated Graphics and Touch (IGaT) Curiosity Development Board

IGaT-showcase.png
Figure 4: IGaT Curiosity Development Kit Running the Legato Showcase Demo

The IGaT Curiosity comes preprogrammed with the initial version of the Legato Showcase Demo. This demo application is a part of the MPLAB® Harmony 3 Graphics application examples for SAM D5x/E5x Family. The latest firmware can be found on GitHub or it can be installed from the Harmony Content Manager within MPLAB X IDE. The Motion Gestures Touch Demo is integrated within the AI/ML portion of the Legato Showcase Demo.

2. SAMC21 Xplained Pro with QT8 Xplained Pro Touch Surface

SAMC21+QT8.png
Figure 5: SAMC21 Xplained Pro with the QT8 Xplained Pro Connected to EXT1

To use the Motion Gestures Touch Demo on the SAMC21, download SAMC21_MG_Touch_Demo.zip, program the board, and then use Microchip's 2D Touch Surface Utility on a Windows® machine to view the gesture recognition results.

If you are not familiar with Microchip's 2D Touch Surface Utility, please visit the "Guide to Connect to Touch Surface Utility" Developer Help page.

The 2D Touch Surface Utility can be used to simultaneously display gesture recognition results from the Motion Gestures Demo Library and Microchip's 2D Touch Surface Library. The Current Gesture window will display the most recent recognized gesture. Results from the Microchip Touch Library will be displayed in blue and results from the Motion Gestures library will be displayed in green.

2D_SU_demo.png
Figure 6: Microchip's 2D Touch Surface Utility Showing That the M Gesture Has Been Recognized

Summary

With the Motion Gestures solution, complex touch gesture recognition models can be developed in a matter of minutes thanks to the power of machine learning. With just one example of each desired gesture, the Motion Gestures SDK can generate highly-accurate recognition models. To learn more, visit Motion Gestures' website.

© 2024 Microchip Technology, Inc.
Notice: Arm and Cortex are registered trademarks of Arm Limited in the EU and other countries.
Information contained on this site regarding device applications and the like is provided only for your convenience and may be superseded by updates. It is your responsibility to ensure that your application meets with your specifications. MICROCHIP MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND WHETHER EXPRESS OR IMPLIED, WRITTEN OR ORAL, STATUTORY OR OTHERWISE, RELATED TO THE INFORMATION, INCLUDING BUT NOT LIMITED TO ITS CONDITION, QUALITY, PERFORMANCE, MERCHANTABILITY OR FITNESS FOR PURPOSE. Microchip disclaims all liability arising from this information and its use. Use of Microchip devices in life support and/or safety applications is entirely at the buyer's risk, and the buyer agrees to defend, indemnify and hold harmless Microchip from any and all damages, claims, suits, or expenses resulting from such use. No licenses are conveyed, implicitly or otherwise, under any Microchip intellectual property rights.