At Arduino Day, I talked about a project my collaborators and I have been working on to bring machine learning to the maker community. Machine learning is a technique for teaching software to recognize patterns using data, e.g. for recognizing spam emails or recommending related products. Our ESP (Example-based Sensor Predictions) software recognizes patterns in real-time sensor data, like gestures made with an accelerometer or sounds recorded by a microphone. The machine learning algorithms that power this pattern recognition are specified in Arduino-like code, while the recording and tuning of example sensor data is done in an interactive graphical interface. We're working on building up a library of code examples for different applications so that Arduino users can easily apply machine learning to a broad range of problems. The project is part of my research at the University of California, Berkeley, and is being done in collaboration with Ben Zhang, Audrey Leung, and my advisor Björn Hartmann. We're building on the Gesture Recognition Toolkit (GRT) and openFrameworks. The software is still rough (and Mac-only for now), but we'd welcome your feedback. Installation instructions are on our GitHub project page.
Our project is part of a broader wave of projects aimed at helping electronics hobbyists make more sophisticated use of sensors in their interactive projects. Another project in a similar vein is the Wekinator, which is featured in a free online course on machine learning for musicians and artists. Rebecca Fiebrink, the creator of Wekinator, recently participated in a panel on machine learning in the arts and taught a workshop (with Phoenix Perry) at Resonate ’16. Also building on the GRT is ml-lib, a machine learning toolkit for Max and Pure Data.