Neural Networks as Music

Tuesday, Mar 15, 2016

This Grammy-winning pianist's latest project? Teaching a computer to compose music of its own.

Aaron Arntz [ITP 2015] has played with Grizzly Bear, Beirut, Edward Sharpe and the Magnetic Zeros, and most recently with a recurrent neural network. As the first-ever researcher-in-residence at DBRS's Innovation Lab in New York [directed by ITP alum Amelia Winger-Bearskin], Arntz partnered with machine learning experts Jen Rubinovitz and Jamis Johnson to teach a neural network to compose piano music of its own.

Neural networks are algorithms modeled after the synaptic firing of the human brain, and are at the vanguard of research in AI and machine learning. You may have heard a lot about them recently; they are currently used in everything from the algorithms that handle search queries, to image and speech recognition, to producing mind-boggling art in the style of Google's "DeepDream" project.

For this project, Arntz, Rubinovitz, and Johnson trained a neural network on a large corpus of sheet music for piano, so that it could recognize the patterns in that music and produce sheet music of its own.
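The post doesn't include the team's actual code, but the core idea can be sketched as a character-level recurrent network: encode the sheet music as a text sequence, feed it in one symbol at a time, and have the network predict what comes next. The snippet below is a minimal, hypothetical illustration in NumPy (the toy "corpus", sizes, and untrained random weights are all assumptions for demonstration; real training via backpropagation through time is omitted).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "corpus": a text encoding of piano music (think ABC notation).
# The real project used a large corpus of piano sheet music.
corpus = "CDEFGABc " * 20
chars = sorted(set(corpus))
char_to_ix = {c: i for i, c in enumerate(chars)}
vocab_size = len(chars)
hidden_size = 16

# Randomly initialized weights; a trained model would learn these.
Wxh = rng.normal(0, 0.01, (hidden_size, vocab_size))  # input -> hidden
Whh = rng.normal(0, 0.01, (hidden_size, hidden_size))  # hidden -> hidden (recurrence)
Why = rng.normal(0, 0.01, (vocab_size, hidden_size))  # hidden -> output

def step(h, ix):
    """One recurrent step: consume character index ix, update the hidden
    state h, and return a probability distribution over the next character."""
    x = np.zeros(vocab_size)
    x[ix] = 1.0
    h = np.tanh(Wxh @ x + Whh @ h)
    y = Why @ h
    p = np.exp(y - y.max())  # softmax over the vocabulary
    return h, p / p.sum()

# Sample a short sequence from the (untrained) model, starting from "C".
h = np.zeros(hidden_size)
ix = char_to_ix["C"]
out = []
for _ in range(10):
    h, p = step(h, ix)
    ix = rng.choice(vocab_size, p=p)
    out.append(chars[ix])
print("".join(out))
```

The recurrence is the key point: the hidden state `h` carries information forward across time steps, which is what lets the network pick up on longer-range musical structure rather than treating each note independently.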

The project ended up unearthing some pretty interesting challenges for the team, such as: how do you communicate to a neural network the difference between the left hand and the right hand, or explain that the two are supposed to play together?

Music is a great test case for exploring the "polyphony" of large datasets, because analyzing it requires algorithms that detect not only patterns but also relationships between patterns. The team hopes to apply some of the insights from this project to time series financial data. Aaron Arntz will also be playing Le Poisson Rouge as Recurrent Neural Network on March 30th.

Read about the team's explorations and trials in full detail. This post was written by Eamon O'Connor [ITP '15], with graphics and photos by John Farrell [ITP '15] and Fletcher Bach [ITP '15].