Dr. Rebecca Fiebrink is a Senior Lecturer in Computing at Goldsmiths, University of London, where she works with the Embodied AudioVisual Interaction group, developing new technologies to enable new forms of human expression, creativity, and embodied interaction.

Her current research combines techniques from human-computer interaction, machine learning, and signal processing to allow people to apply machine learning more effectively to new problems, such as the design of new digital musical instruments and gestural interfaces for gaming and health. She’s also involved in projects developing rich interactive technologies for digital humanities scholarship, and using digital music creation to engage youth in learning computer programming and computational thinking.
Workshop: Building Creative Interactions with Machine Learning

MONDAY, JUNE 4th • 9:00AM • WALKER ART CENTER - SKYLINE ROOM

Are you interested in creating real-time interactions with sensors, cameras, depth sensors, gaming controllers, or microphones? Machine learning can be a great tool for giving such inputs control over animation, sounds, robots, game engines, or other systems you’ve built. Machine learning makes it possible to build complex interactions that are difficult or impossible to create using only programming; machine learning also makes it possible for non-programmers to build and customize systems, and for programmers to build things more quickly.
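To make that concrete, here is what "building an interaction by example" can look like. The Processing sketch below is a minimal, hypothetical illustration (our sketch, not part of the workshop materials) of one standard technique of the kind we'll cover: nearest-neighbor classification, a classifier Wekinator itself offers. You record a few labeled examples of mouse positions, and the sketch classifies wherever the mouse is now, with no hand-coded mapping rules.

// Hypothetical 1-nearest-neighbor demo: press '1' or '2' to record the
// current mouse position as a training example of that class; the
// background color shows the predicted class for the current position.
ArrayList<float[]> examples = new ArrayList<float[]>();  // each entry: {x, y, label}

void setup() {
  size(400, 400);
}

void draw() {
  // Color the whole canvas according to the predicted class under the mouse
  background(predict(mouseX, mouseY) == 1 ? color(200, 60, 60) : color(60, 60, 200));
  // Draw the recorded training examples
  for (float[] e : examples) {
    fill(e[2] == 1 ? color(255, 150, 150) : color(150, 150, 255));
    ellipse(e[0], e[1], 10, 10);
  }
}

void keyPressed() {
  if (key == '1' || key == '2') {
    // Record the current mouse position as an example of class 1 or 2
    examples.add(new float[] { mouseX, mouseY, key - '0' });
  }
}

int predict(float x, float y) {
  // Return the label of the closest recorded example (default to 1 if none)
  if (examples.isEmpty()) return 1;
  float best = Float.MAX_VALUE;
  int label = 1;
  for (float[] e : examples) {
    float d = dist(x, y, e[0], e[1]);
    if (d < best) { best = d; label = (int) e[2]; }
  }
  return label;
}

This is the heart of the example-driven workflow: the behavior is defined by the examples you record, so changing the interaction means recording different examples rather than rewriting code.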

In this workshop, you’ll get a hands-on introduction to using machine learning for designing new interactive art, music, games, and other real-time systems. We’ll teach you the basics of a few standard machine learning techniques and help you get started hacking with machine learning on your own projects.

For students who want to prototype things quickly without code, we’ll be using Wekinator, a free, cross-platform software tool that connects to a wide variety of existing hardware and software (e.g., Arduino, Unity 3D, Max/MSP, PD, Ableton, openFrameworks, Processing, Kinect, Bitalino, …). We’ll also show how the same techniques can be used within code (including openFrameworks/C++ and JavaScript) using free libraries such as the RAPID-MIX API.
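If you do want to write code against Wekinator, the connection is just OSC messages. The sketch below is a minimal illustration in Processing using the oscP5 library, assuming Wekinator's documented defaults (it receives input features on port 6448 at /wek/inputs and sends model outputs to port 12000 at /wek/outputs); your own project's ports and input/output counts may differ.

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress wekinator;
float wekOutput = 0;  // most recent model output

void setup() {
  size(400, 400);
  // Listen on port 12000, where Wekinator sends outputs by default
  osc = new OscP5(this, 12000);
  // Wekinator listens for input features on port 6448 by default
  wekinator = new NetAddress("127.0.0.1", 6448);
}

void draw() {
  background(0);
  // Send two input features: the normalized mouse position
  OscMessage msg = new OscMessage("/wek/inputs");
  msg.add(mouseX / (float) width);
  msg.add(mouseY / (float) height);
  osc.send(msg, wekinator);
  // Let the model's first output drive something visible
  fill(255);
  ellipse(width / 2, height / 2, wekOutput * 200, wekOutput * 200);
}

// oscP5 calls this whenever an OSC message arrives
void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/wek/outputs")) {
    wekOutput = m.get(0).floatValue();
  }
}

With this sketch running, create a Wekinator project with 2 inputs and 1 continuous output, record some examples, train, and hit Run; the circle should respond to the model's output in real time.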

We’ll talk about how to use machine learning to work more effectively with sensors, audio, and video data, and to build expressive & embodied interactions. You don’t need any prior machine learning knowledge (though you’ll still learn a lot even if you’ve previously studied machine learning in a more conventional context!). We’ll combine lectures and discussion with plenty of hands-on hacking. We’ll be using free and open source software to hook up game controllers, sensors, webcams, and microphones to interact with sound, animation, game engines, actuators, and other creative gear.

SKILL LEVEL: Intro / Intermediate / Advanced
The workshop will be most useful for people who can do a bit of coding in some environment (e.g., Processing, openFrameworks, JavaScript). But people who don’t do any programming will still be able to fully participate, as we have plenty of off-the-shelf examples which can be run without coding.

HARDWARE TO BRING:
• All attendees should bring a laptop (any operating system).
• Optionally, attendees can also bring input devices such as those listed at Wekinator.org/examples (e.g., Leap Motion, Arduino + sensors, joysticks, mobile phone with touchOSC, ...).
• Attendees may also want to bring software/hardware they might want to control with machine learning (e.g., Arduino with motors; Max/MSP, Unity, Processing, openFrameworks, ...).

SOFTWARE TO BRING:
• Install Wekinator from wekinator.org/downloads
• Make sure it runs! If not, install the most recent version of Java for your operating system.
• If you're a Processing programmer, install the Processing code "Quick Start Pack" from Wekinator.org/examples/#Quick_Start_Pack. Follow the instructions in the "How to run Wekinator examples in Processing" video on YouTube to install the Processing libraries for OSC and video if you don't already have them (see the short sanity-check sketch after this list).
• Or if you're not a Processing programmer, install the "Quick Start Pack" for your operating system at Wekinator.org/examples/#Quick_Start_Pack. Run the executable in Inputs/Simple_Mouse_DraggedObject_2Inputs/ and make sure you see a green box on a black screen. If you don't, please download the "last resort" examples from Wekinator.org/examples/#Quick_Start_Pack.
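If you'd like to double-check your Processing setup before arriving, a throwaway sketch along these lines (our suggestion, not part of the official Quick Start Pack) should run without errors once the OSC and video libraries are installed:

import oscP5.*;
import processing.video.*;

void setup() {
  // If this line compiles and runs, the oscP5 library is installed
  // (port 9999 is arbitrary; we only want to confirm the library loads)
  OscP5 osc = new OscP5(this, 9999);
  // Capture.list() enumerates attached cameras via the video library
  printArray(Capture.list());
}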