This speed project from Design I/O combines Google’s Project Soli Alpha Dev Kit, the machine learning tool Wekinator, and openFrameworks to detect the small hand movements of someone playing a tiny air violin and map them to the volume and playback of a violin solo.
source/image: Design I/O
Design I/O combined a few technologies to accomplish this. The first is Google’s Project Soli, a tiny radar on a chip. Project Soli’s goal is to do away with physical controls by using a miniature radar to enable touchless gesture interaction.
Soli is a miniature radar that understands human motions at various scales: from the tap of your finger to the movements of your body. Soli aims to understand the nuances of human movements so that we can use our natural body language and gestures as a form of input.
Sliding your thumb across the side of your outstretched index finger, for example, can be interpreted as moving a slider to change the numerical value of something, perhaps turning up the air conditioner in your car. We created an interaction framework that groups human movements according to levels of proximity and engagement between the user and Soli: aware, engaged and active. The framework is based on regular human nonverbal communication models.
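The glue in a pipeline like this is OSC (Open Sound Control): sensor features are streamed to Wekinator as a vector of floats, and by default Wekinator listens on port 6448 for messages addressed `/wek/inputs`. As a rough illustration of what travels over the wire, here is a minimal sketch that hand-encodes such an OSC message using only the Python standard library (the project itself uses openFrameworks/C++; the host, port, and feature values below are illustrative assumptions, not taken from Design I/O’s code):

```python
import socket
import struct

def osc_message(address, floats):
    """Encode a minimal OSC message: padded address string,
    float-only type-tag string, then big-endian 32-bit floats."""
    def pad(b):
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for value in floats:
        msg += struct.pack(">f", value)
    return msg

def send_features(features, host="127.0.0.1", port=6448):
    """Send one feature vector to Wekinator's default /wek/inputs listener."""
    packet = osc_message("/wek/inputs", features)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (host, port))
    sock.close()
```

Wekinator then maps each incoming feature vector to trained outputs (here, something like bow speed driving volume and playback position) and sends them back out over OSC as `/wek/outputs`, which an openFrameworks app can listen for and use to control the audio.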