New system composes lyrics for instrumental music

Researchers at the University of Waterloo have developed LyricJam, an advanced computing system that generates lyrics for live instrumental music.

The new LyricJam system is intended to help artists compose lyrics that fit the music they are playing.

The goal of our research is to develop a system that can generate lyrics reflecting the moods and emotions expressed through various aspects of the music, such as chords, instruments, and tempo. We aimed to create a tool that musicians could use to draw inspiration for their own songs.

Olga Vechtomova, study author

Essentially, LyricJam is an artificial intelligence system. From the user's point of view, the application is very simple: the artist plays live music, and the system displays lines of text that it generates in real time in response to the music it hears. The generated lines are saved.

The system works by converting raw audio into spectrograms, which deep learning models then process in real time to generate lyrics matching the music. The architecture consists of two variational autoencoders: one learns representations of the musical audio, and the other learns representations of the lyrics.
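A minimal sketch of this kind of pipeline is shown below. It is not the authors' code: the library choices (librosa for spectrograms, PyTorch for the models), layer sizes, and the use of fixed-size lyric-line embeddings are all assumptions made here for illustration only.

```python
import librosa
import numpy as np
import torch
import torch.nn as nn

def audio_to_spectrogram(path, sr=22050, n_mels=128):
    """Load raw audio and convert it to a log-scaled mel spectrogram."""
    y, sr = librosa.load(path, sr=sr)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    return librosa.power_to_db(mel, ref=np.max)

class VAE(nn.Module):
    """Generic variational autoencoder: encode the input to a latent mean and
    log-variance, sample with the reparameterization trick, then decode."""
    def __init__(self, input_dim, latent_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, latent_dim)
        self.to_logvar = nn.Linear(256, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, input_dim)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # sample latent
        return self.decoder(z), mu, logvar

# One VAE learns representations of spectrogram frames, the other of lyric
# lines (represented here as fixed-size embedding vectors for simplicity).
spec_vae = VAE(input_dim=128)    # spectrogram frames: 128 mel bins (assumed)
lyric_vae = VAE(input_dim=300)   # lyric-line embeddings: 300-dim (assumed)
```

In such a setup, the audio VAE would encode incoming spectrogram frames into latent vectors that condition the lyric model, so that the generated lines track the mood of the music being played.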


Author: John Kessler
Graduated from the Massachusetts Institute of Technology. Previously worked for various lesser-known media outlets. Currently an expert, editor, and developer at Free News.
Function: Director
