Inspired by Google's Soli sensor, I tried to imagine possible ways to work with gesture recognition. I ran two experiments: in the first, I used machine learning to help the webcam recognize some hand gestures, then used them to play a simple game. In the second, I wanted to build something physical, so I used a thermal sensor array to detect the position of my hand and visualized it with Processing.
For this project I used the ml5.js library's "regression with feature extraction" model and trained it to recognize a few gestures of my right hand; then I wrote a simple game using p5.js. If you want to play it, here's the link to my GitHub page, but since the training of the model was neither very accurate nor extensive, the game probably won't work well for you (I will try to fix it).
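The core of the game logic is turning the regressor's continuous output into a discrete action. A minimal sketch of that mapping, assuming the regressor outputs a value in [0, 1] (in the real sketch this value comes from ml5's feature extractor; the labels, thresholds, and function name here are illustrative, not the exact code from my project):

```javascript
// Map the continuous output of an ml5.js regressor to a discrete
// game action. In the real p5.js sketch the value comes from the
// feature extractor's prediction; here it is just a number in [0, 1].
// Labels and thresholds are illustrative.
function gestureFromValue(value) {
  if (value < 0.33) return "left";   // hand tilted left
  if (value < 0.66) return "idle";   // neutral hand pose
  return "right";                    // hand tilted right
}

console.log(gestureFromValue(0.1));  // "left"
console.log(gestureFromValue(0.5));  // "idle"
console.log(gestureFromValue(0.9));  // "right"
```

Bucketing the output like this makes the game tolerant of a noisy model: small fluctuations in the regression value don't change the action unless they cross a threshold.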
In this experiment I used a TPA81 thermal sensor array: a 1×8 matrix of sensors that read infrared radiation in the 2 µm–22 µm range, the wavelength band of radiant heat. With the help of an Arduino Uno I mapped the readings to detect skin temperature (28–29 °C), then visualized the data with Processing. With this system it's possible to write different algorithms and use them to control simple devices (check this example).
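One simple way to turn the eight temperature readings into a hand position is a weighted centroid of the pixels that fall inside the skin-temperature window. A sketch of that idea, written here in JavaScript for clarity (the thresholds, weighting, and function name are my assumptions, not the exact code from my Arduino/Processing sketch):

```javascript
// The TPA81 returns 8 temperatures (one per pixel of its 1x8 array).
// Estimate the hand position as the weighted centroid of the pixels
// in the skin-temperature window (28-29 °C here; adjust to your
// environment). Returns a position in [0, 7], or null if no pixel
// qualifies. Thresholds and weighting are illustrative.
function handPosition(temps, min = 28, max = 29) {
  let weightSum = 0;
  let posSum = 0;
  temps.forEach((t, i) => {
    if (t >= min && t <= max) {
      const w = t - min + 0.1;  // warmer pixels pull the centroid more
      weightSum += w;
      posSum += w * i;
    }
  });
  return weightSum > 0 ? posSum / weightSum : null;
}

// A hand centered over pixels 3 and 4:
const frame = [24, 24, 25, 28.5, 28.5, 25, 24, 24];
console.log(handPosition(frame));  // 3.5
```

The centroid gives a sub-pixel position even with only eight sensors, which makes the Processing visualization move smoothly instead of jumping pixel to pixel.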