Researchers develop real-time lyric generation technology to inspire song writing


06 September 2021




Music artists can find inspiration and new creative directions for their song writing with technology developed by Waterloo researchers.

LyricJam, a real-time system that uses artificial intelligence (AI) to generate lyric lines for live instrumental music, was created by members of the University’s Natural Language Processing Lab.

The lab, led by Olga Vechtomova, a Waterloo Engineering professor cross-appointed in Computer Science, has been researching creative applications of AI for several years.

The lab’s initial work led to the creation of a system that learns musical expressions of artists and generates lyrics in their style.

Recently, Vechtomova, along with Waterloo graduate students Gaurav Sahu and Dhruv Kumar, developed technology that relies on various aspects of music such as chord progressions, tempo and instrumentation to synthesize lyrics reflecting the mood and emotions expressed by live music.

As a musician or a band plays instrumental music, the system continuously receives the raw audio clips, which the neural network processes to generate new lyric lines. The artists can then use the lines to compose their own song lyrics.
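To make the clip-in, lyric-line-out flow concrete, here is a minimal illustrative sketch of such a loop. It is not the authors' implementation: `generate_line` is a hypothetical stand-in for the trained neural network, and the features shown (tempo, chroma) are plausible examples of musical descriptors rather than the system's exact inputs.

```python
# Illustrative sketch only: process short raw-audio clips one at a time and
# emit a lyric line per clip, in the spirit of the description above.
import numpy as np
import librosa


def extract_features(clip: np.ndarray, sr: int) -> dict:
    """Summarise a short raw-audio clip with a few musical descriptors."""
    tempo = float(librosa.beat.tempo(y=clip, sr=sr)[0])          # estimated beats per minute
    chroma = librosa.feature.chroma_stft(y=clip, sr=sr).mean(1)  # rough harmonic profile
    return {"tempo": tempo, "chroma": chroma}


def generate_line(features: dict) -> str:
    """Hypothetical placeholder for the lyric-generating neural network."""
    mood = "calm" if features["tempo"] < 90 else "driving"
    return f"(a {mood} lyric line would be generated here)"


def lyric_stream(clips, sr=22050):
    """Yield one generated lyric line for each incoming audio clip."""
    for clip in clips:
        yield generate_line(extract_features(clip, sr))


if __name__ == "__main__":
    # Stand-in "live" input: three two-second clips of low-level noise.
    demo_clips = [0.1 * np.random.randn(2 * 22050).astype(np.float32) for _ in range(3)]
    for line in lyric_stream(demo_clips):
        print(line)
```

In the real system the clips would arrive continuously from a live microphone or instrument feed rather than from pre-recorded buffers, with the model generating lines as the music plays.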

“The purpose of the system is not to write a song for the artist,” Vechtomova explains. “Instead, we want to help artists realize their own creativity. The system generates poetic lines with new metaphors and expressions, potentially leading the artists in creative directions that they haven’t explored before.”

The neural network designed by the researchers learns what lyrical themes, words and stylistic devices are associated with different aspects of music captured in each audio clip.

For example, the researchers observed that lyrics generated for ambient music are very different from those generated for upbeat music.

The research team conducted a user study, inviting musicians to play live instruments while using the system.

“One unexpected finding was that participants felt encouraged by the generated lines to improvise,” Vechtomova said. “For example, the lines inspired artists to structure chords a bit differently and take their improvisation in a new direction than originally intended. Some musicians also used the lines to check if their improvisation had the desired emotional effect.”

Another finding from the study highlighted the co-creative aspect of the experience. Participants commented that they viewed the system as an uncritical jamming partner and felt encouraged to play their musical instruments even if they were not actively trying to write lyrics.

Since LyricJam went live in June this year, over 1,500 users worldwide have tried it out.

The team’s research, to be presented at the International Conference on Computational Creativity this September, has been pre-published on arXiv. Musicians interested in trying out LyricJam can access it here.




University of Waterloo



