AIhub.org
Researchers develop real-time lyric generation technology to inspire songwriting


06 September 2021




Music artists can find inspiration and new creative directions for their songwriting with technology developed by Waterloo researchers.

LyricJam, a real-time system that uses artificial intelligence (AI) to generate lyric lines for live instrumental music, was created by members of the University’s Natural Language Processing Lab.

The lab, led by Olga Vechtomova, a Waterloo Engineering professor cross-appointed in Computer Science, has been researching creative applications of AI for several years.

The lab’s initial work led to the creation of a system that learns musical expressions of artists and generates lyrics in their style.

Recently, Vechtomova, along with Waterloo graduate students Gaurav Sahu and Dhruv Kumar, developed technology that relies on various aspects of music such as chord progressions, tempo and instrumentation to synthesize lyrics reflecting the mood and emotions expressed by live music.

As a musician or a band plays instrumental music, the system continuously receives the raw audio clips, which the neural network processes to generate new lyric lines. The artists can then use the lines to compose their own song lyrics.
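The loop described above — receive an audio clip, extract features, generate a matching lyric line — can be sketched in miniature. Everything below is hypothetical and drastically simplified: the real LyricJam feeds raw audio through a neural network, whereas this toy stand-in maps a single RMS-energy feature to a mood label and returns a canned line for that mood. All function names and thresholds are invented for illustration.

```python
import math

# Toy lines keyed by mood (a real system would generate these).
MOOD_LINES = {
    "calm":   "a slow light settles on the water",
    "upbeat": "we run until the streetlights blur",
}

def rms_energy(samples):
    """Root-mean-square energy of one audio clip (a list of floats)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify_mood(samples, threshold=0.3):
    """Toy stand-in for the neural network: energy -> mood label."""
    return "upbeat" if rms_energy(samples) > threshold else "calm"

def generate_line(samples):
    """One pass of the clip -> features -> lyric-line pipeline."""
    return MOOD_LINES[classify_mood(samples)]

# Simulated "real-time" stream: a quiet clip, then a loud one.
quiet = [0.05 * math.sin(i / 10) for i in range(1000)]
loud = [0.9 * math.sin(i / 3) for i in range(1000)]
for clip in (quiet, loud):
    print(generate_line(clip))
```

The point of the sketch is the shape of the loop, not the model: clips arrive continuously, each one is reduced to features, and each set of features conditions the next generated line.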

“The purpose of the system is not to write a song for the artist,” Vechtomova explains. “Instead, we want to help artists realize their own creativity. The system generates poetic lines with new metaphors and expressions, potentially leading the artists in creative directions that they haven’t explored before.”

The neural network designed by the researchers learns what lyrical themes, words and stylistic devices are associated with different aspects of music captured in each audio clip.

For example, the researchers observed that lyrics generated for ambient music are very different from those generated for upbeat music.

The research team conducted a user study, inviting musicians to play live instruments while using the system.

“One unexpected finding was that participants felt encouraged by the generated lines to improvise,” Vechtomova said. “For example, the lines inspired artists to structure chords a bit differently and take their improvisation in a different direction than originally intended. Some musicians also used the lines to check whether their improvisation had the desired emotional effect.”

Another finding from the study highlighted the co-creative aspect of the experience. Participants commented that they viewed the system as an uncritical jamming partner and felt encouraged to play their musical instruments even if they were not actively trying to write lyrics.

Since LyricJam went live in June this year, over 1,500 users worldwide have tried it out.

The team’s research, to be presented at the International Conference on Computational Creativity this September, has been pre-published on arXiv. Musicians interested in trying out LyricJam can access it online.




University of Waterloo






 
