Rhythm Generation

Project: Learning Machine, ITP
Role: Developer
Instructor: Patrick Hebron
Tools: Magenta, Python
Date: 10/2016 - 12/2016


Composing Together with Machines

Machine learning, along with AR/VR, has been one of the hottest topics of the past several years, which sparked my personal interest in digging into it and making something with the technology.

Among current applications of machine learning, creative work remains a weaker area compared with relatively mature fields like natural language processing and object recognition. Yet this is exactly the area that designers like me wonder about most: in what ways should machines work together with human beings?


Technical Approach

My interest in music generation started when the WaveNet paper on deep learning models for raw audio was published. As it turns out, generating 16,000 samples per second of raw audio requires high-performance hardware, which was beyond what I could reach as a student.

MIDI Dataset

After some consideration, I decided to work with MIDI files for now. I started by looking for the right dataset for training. I tried several datasets, most of which were typical piano pieces in MIDI format. Then, to my surprise, I found a dataset covering many genres of MIDI music, uploaded by a MIDI enthusiast.

I experimented with several types of MIDI files and decided to use all of the metal and rock MIDI files, which covered most of the popular rock bands.
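Magenta's pipeline starts by converting a directory of MIDI files into NoteSequence protocol buffers stored in a TFRecord file. A sketch of that step with Magenta's `convert_dir_to_note_sequences` script (the directory paths are placeholders, not my actual ones):

```shell
# Convert a directory of MIDI files into NoteSequence protos (TFRecord).
# Paths are placeholders for wherever the metal/rock MIDI files live.
convert_dir_to_note_sequences \
  --input_dir=./midi/metal_rock \
  --output_file=./data/notesequences.tfrecord \
  --recursive
```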

Magenta, Attention_RNN

Thanks to the Magenta group from the Google Brain team, who opened up discussions and GitHub tutorials on using TensorFlow for creative work, I used their attention RNN model, which looks back over previous steps, for generation.
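From the NoteSequences, Magenta's melody RNN tooling builds the actual training examples. A sketch of that step using the `attention_rnn` configuration (paths and the eval split are placeholder assumptions):

```shell
# Extract melodies and build training/eval examples for the attention RNN.
melody_rnn_create_dataset \
  --config=attention_rnn \
  --input=./data/notesequences.tfrecord \
  --output_dir=./data/melody_rnn \
  --eval_ratio=0.10
```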

Training on AWS

To train the model, I used Amazon AWS and trained for about 16 hours over 1,200 steps. Here are some screenshots of the training rates and accuracies.
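The training run itself is a single Magenta command. A sketch, assuming the dataset was built as above (the run directory and sequence-example filename follow Magenta's defaults, but are assumptions here):

```shell
# Train the attention RNN; ~1200 steps took roughly 16 hours on my instance.
melody_rnn_train \
  --config=attention_rnn \
  --run_dir=./logs/attention_rnn \
  --sequence_example_file=./data/melody_rnn/training_melodies.tfrecord \
  --num_training_steps=1200
```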


Failures and Surprises

After feeding in some primer notes, adjusting various combinations, and going through many rounds of failure, most results were a real mess, but a small number were quite surprising.
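Generation is driven from the trained checkpoint and seeded with a primer melody. A sketch of the command (the primer pitches and output counts are illustrative, not the ones I actually used):

```shell
# Generate melodies from the trained checkpoint, seeded with primer notes.
# primer_melody is a list of MIDI pitches (-2 = no event, -1 = note-off).
melody_rnn_generate \
  --config=attention_rnn \
  --run_dir=./logs/attention_rnn \
  --output_dir=./generated \
  --num_outputs=10 \
  --num_steps=128 \
  --primer_melody="[60, -2, 60, -2, 67, -2, 67, -2]"
```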

Merging MIDI Results

Excited by these results, I still wondered how these MIDI rhythms could actually be used. The simplest thing I did was randomly pick three MIDI tracks from the output above, drag them into GarageBand, and assign each one an instrument to hear the result. That produced the sample at the beginning of this project.
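The "randomly pick three tracks" step can be sketched in plain Python; the directory name and helper are my own illustration, not part of the Magenta tooling:

```python
import os
import random

def pick_tracks(midi_dir, n=3, seed=None):
    """Randomly choose n generated MIDI files to layer in GarageBand."""
    rng = random.Random(seed)
    # Sort first so the choice is reproducible for a given seed.
    tracks = sorted(f for f in os.listdir(midi_dir) if f.endswith(".mid"))
    if len(tracks) < n:
        raise ValueError("not enough MIDI files to choose from")
    return rng.sample(tracks, n)
```

Each picked file is then dragged into GarageBand and given its own instrument.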

Merged Result


This project is not a very advanced attempt; however, it opened the gate of machine learning for me, especially in the area of creative work. I believe machines can help humans in certain ways, and the most valuable one is trying every possible solution and offering inspiration for creative work.

This project is still in progress; I am currently developing the rhythm-generation idea further.

All Rights Reserved