Automatic music composition experiment (2016)

  • Scale-based melody and chord generation using a char-RNN
  • Combined with additional rule-based instrument arrangements
  • Audio output rendered using Kontakt samples
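As a rough illustration of the data preparation behind the char-RNN step above, the sketch below tokenizes chord symbols and slides a window over a progression to produce next-chord prediction pairs. The chord names, window size, and helper functions are hypothetical examples for illustration, not the project's actual pipeline.

```python
# Sketch: preparing a chord progression corpus for char-RNN-style
# next-token prediction. Chord symbols here are hypothetical examples.
def build_vocab(progressions):
    """Map each distinct chord symbol to an integer id."""
    symbols = sorted({chord for prog in progressions for chord in prog})
    return {chord: i for i, chord in enumerate(symbols)}

def make_training_pairs(progression, vocab, window=4):
    """Slide a fixed-size window over the progression; the model
    learns to predict the chord that follows each window."""
    ids = [vocab[c] for c in progression]
    return [(ids[i:i + window], ids[i + window])
            for i in range(len(ids) - window)]

progressions = [
    ["Dm7", "G7", "Cmaj7", "Am7", "Dm7", "G7", "Cmaj7", "Cmaj7"],  # ii-V-I
]
vocab = build_vocab(progressions)
pairs = make_training_pairs(progressions[0], vocab, window=4)
```

In a Keras setup, each window would be fed to an RNN (e.g. an LSTM) trained with a softmax over the chord vocabulary.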

Used Keras (TensorFlow) for the neural networks, Ruby on Rails for the web interface, and C++ for audio rendering.

Demo Video

Implementation details

  • Chord generation module
  • Melodic curve generation module
  • Variational autoencoder for sampling the seed melody and seed chord sequence
  • Audio rendering using virtual instruments loaded in the Kontakt plugin
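The VAE seed-sampling step above can be sketched as follows. This is a minimal illustration of the reparameterization trick used to draw a seed from a trained latent space; the latent dimension, placeholder encoder outputs, and scale-quantization decoder stand-in are assumptions, not the project's actual model.

```python
import numpy as np

# Sketch: drawing a seed melody from a VAE latent space via the
# reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, 1).
# mu/log_var would normally come from the trained encoder; here they
# are fixed placeholders so the sketch runs standalone.
rng = np.random.default_rng(seed=0)
latent_dim = 8

mu = np.zeros(latent_dim)          # encoder mean (placeholder)
log_var = np.zeros(latent_dim)     # encoder log-variance (placeholder)

def sample_latent(mu, log_var, rng):
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

z = sample_latent(mu, log_var, rng)

# A real decoder network would map z to a note sequence; as a stand-in,
# quantize z onto a C-major scale to show the shape of the output.
scale = [60, 62, 64, 65, 67, 69, 71]  # MIDI note numbers, C major
seed_melody = [scale[int(abs(v) * 10) % len(scale)] for v in z]
```

Sampling different `z` vectors yields different seed melodies, which the generation modules can then extend.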

Visualization: gaining intuition on how the RNN learned musical structure

<Screenshot: Neurons activated on certain chord progression patterns (chord-sequence RNN)>

<Screenshot: Neurons activated only at the beginning of a song (chord-sequence RNN)>

<Screenshot: Neurons activated on dominant-chord resolutions (dominant or substitute-dominant chord to tonic chord)>
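The activation maps above can be produced by recording the hidden state at every timestep and looking for units that fire on particular progressions. The sketch below does this with a toy hand-rolled RNN; the random weights, token ids, and activation threshold are placeholders (in the experiment the states came from the trained chord-sequence RNN).

```python
import numpy as np

# Sketch: collecting per-step hidden activations from a toy RNN to see
# which units respond to particular chord patterns. Weights are random
# placeholders standing in for a trained model's parameters.
rng = np.random.default_rng(seed=1)
vocab_size, hidden_size = 4, 16
W_xh = rng.standard_normal((hidden_size, vocab_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1

def run_rnn(token_ids):
    """Return an array of shape (timesteps, hidden_size) of tanh activations."""
    h = np.zeros(hidden_size)
    states = []
    for t in token_ids:
        x = np.eye(vocab_size)[t]          # one-hot chord token
        h = np.tanh(W_xh @ x + W_hh @ h)
        states.append(h.copy())
    return np.array(states)

# Hypothetical ii-V-I progression encoded as token ids.
activations = run_rnn([2, 3, 1, 0, 2, 3, 1, 1])

# Heat-mapping this matrix (e.g. with matplotlib's imshow) reveals units
# that activate on specific progressions, such as dominant resolutions.
strongly_active = np.abs(activations) > 0.5
```

Plotting `activations` as a timesteps-by-units heat map is what produces images like the screenshots above.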
