Improvising with an AI musician


by Lucy Smith
28 October 2020



For those interested in music and AI, a session on “Human collaboration with an AI musician” at the AI for Good Global Summit proved to be a real treat. The session featured a performance by two musicians on opposite sides of the globe, who improvised alongside the third member of the group: an AI “musician”.

Researchers and musicians have been experimenting with machine learning algorithms for a number of years now. Much work has been devoted to recreating specific styles of music, from Bach to The Beatles and from Mozart to Mogwai. This year even saw AI-assisted songwriting collide with the world of Eurovision in the AI Song Contest.

As any musician will tell you, group improvisation is perhaps the most difficult skill to master. It requires a very high level of creativity, skill, empathy and intuition. Researchers at Monash University and Goldsmiths, University of London entered this complex world and went one step further, introducing an AI improviser into the mix. In the video you can watch Mark d’Inverno on piano and Alon Ilsar on AirSticks (an electronic percussive instrument developed at Monash), with the AI improviser taking its lead from Mark and its music being transmitted across the globe to Alon.

You can watch the session in full here.

As part of this session, attendees also got to see four different versions of an algorithmic improviser in action as Mark played alongside each iteration of the AI “musician”. Researcher Matthew Yee-King was on hand to explain the basics of each version. The algorithms are based on a variable-order Markov model, which conditions the choice of the next note on a variable-length window of the most recently heard notes.
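To give a flavour of how such a model can generate notes, here is a minimal sketch of a variable-order Markov model in Python. It is purely illustrative: the class name, the use of MIDI note numbers and the maximum order of three are assumptions made for the example, not details of the system demonstrated in the session.

import random
from collections import defaultdict

class VariableOrderMarkov:
    def __init__(self, max_order=3):
        self.max_order = max_order
        # context (tuple of preceding notes) -> counts of the note that followed
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, notes):
        # Record next-note counts for every context length up to max_order.
        for i in range(1, len(notes)):
            for order in range(1, self.max_order + 1):
                if i - order < 0:
                    break
                context = tuple(notes[i - order:i])
                self.counts[context][notes[i]] += 1

    def predict(self, history):
        # Back off from the longest matching context to shorter ones.
        for order in range(min(self.max_order, len(history)), 0, -1):
            context = tuple(history[-order:])
            if context in self.counts:
                followers = self.counts[context]
                notes, weights = zip(*followers.items())
                return random.choices(notes, weights=weights)[0]
        # No context matched: fall back to any note seen in training.
        seen = [n for nexts in self.counts.values() for n in nexts]
        return random.choice(seen)

# Learn from a short phrase (MIDI note numbers), then improvise a continuation.
model = VariableOrderMarkov(max_order=3)
model.train([60, 62, 64, 62, 60, 62, 64, 65, 64, 62, 60])
phrase = [60, 62]
for _ in range(8):
    phrase.append(model.predict(phrase))
print(phrase)

Training on a player’s recent phrases and sampling from the longest matching context is one simple way an algorithmic improviser can echo, and vary, what it has just heard.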

Following the demonstrations and explanations, there was a discussion about the role of AI in music. Crucially, all involved in this project were keen to stress that they are not looking to better or replace human musicians; rather, they want to support and expand human creativity. The hope is that playing alongside algorithmically generated music will stimulate musicians to explore new ways of improvising. From watching the demonstrations it was clear that this is certainly possible. It will be very interesting to see how this field of creative research develops.




Lucy Smith, Managing Editor for AIhub.





