IBM’s Watson is an AI platform that is having an increasingly significant impact in many different areas, including banking, medicine, and sports. Now Watson has begun making a mark in the music world as well. In fact, as a result of a collaboration with Grammy Award-winning producer Alex da Kid, Watson can lay legitimate claim to rock star status. The song they composed together, “Not Easy”, debuted at #6 on Billboard’s Rock Digital Song Sales chart and at #12 on the Hot Rock Songs list.
Alex da Kid has produced hits for major artists including Rihanna, Nicki Minaj, Dr. Dre, and Eminem, but had never had a top hit of his own until he connected with Watson.
How the Watson/Alex da Kid Collaboration Worked
Alex da Kid wanted to write a song that connected with people on a deep emotional level. Having chosen the theme of “heartbreak”, he began working with Watson to understand the interaction between music and emotion, and to identify the musical characteristics that make a song a hit.
As a composition assistant, Watson consists of a suite of open APIs (Application Programming Interfaces) that can be employed to analyze the musical structure of popular songs as well as human emotional and social tendencies, plus the interactions between those factors.
In the collaboration with Alex, the Watson Alchemy Language module (a set of text analysis APIs that has since been replaced by the Watson Natural Language Understanding service) analyzed five years of data, including front pages of the New York Times, movie summaries, internet searches, Wikipedia articles, and even Supreme Court rulings to identify significant cultural trends. Then Watson Tone Analyzer searched news articles, blogs, and social media posts to understand the human emotions those trends evoked.
Watson Tone Analyzer also examined the lyrics of 26,000 Billboard Hot 100 songs, while Watson BEAT analyzed the interaction between musical characteristics such as pitch, rhythm, chord progressions, and instrumentation. The idea, according to IBM, was to determine the “emotional fingerprints” that make a song popular.
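As a rough illustration of the “emotional fingerprint” idea, tone scores for individual lyric lines can be averaged into a single emotional profile for a song. The function, field names, and tone categories below are simplified assumptions for illustration, not the actual output format of Watson Tone Analyzer:

```python
from collections import defaultdict

def emotional_fingerprint(tone_scores):
    """Average per-line tone scores into one profile for a song.

    `tone_scores` is a list of dicts mapping a tone name (e.g. "sadness",
    "joy") to a 0..1 score -- a simplified stand-in for what a
    tone-analysis service might return for each lyric line.
    """
    totals = defaultdict(float)
    for line_scores in tone_scores:
        for tone, score in line_scores.items():
            totals[tone] += score
    n = len(tone_scores)
    return {tone: total / n for tone, total in totals.items()}

# Hypothetical per-line tone scores for a short "heartbreak" lyric
lyrics_tones = [
    {"sadness": 0.8, "joy": 0.1},
    {"sadness": 0.6, "joy": 0.2},
]
print(emotional_fingerprint(lyrics_tones))
```

Comparing such profiles across the 26,000 analyzed hits is, conceptually, how recurring emotional patterns in popular songs could be surfaced.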
All this resulted in Watson presenting Alex with compositional ideas that fit the mood he wanted to evoke. The process, according to IBM researcher Dr. Janani Mukundan, involved Alex entering a short section of music (as little as 10 seconds) into the system. Then, said Dr. Mukundan, “Watson will listen to this and scan it. He can also tell Watson, ‘Give me something that sounds romantic, or give me something that sounds like something I want to dance to.’ And since Watson understands these emotional ranges and can understand music as well, he will then use his original piece as an inspiration and also add on top of it the layer of emotion that he wants.”
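The interaction Dr. Mukundan describes can be sketched as a simple loop: the composer supplies a short musical seed and an emotional target, and the system returns new material conditioned on that mood. Everything below (the function names, the mood labels, the toy generator) is an illustrative assumption, not the actual Watson BEAT API:

```python
# Assumed mood labels, echoing the examples in the quote above
SUPPORTED_MOODS = {"romantic", "danceable"}

def request_inspiration(seed_notes, mood, generate):
    """Ask a generator for a mood-conditioned variation on a short seed.

    `seed_notes` stands in for the ~10 seconds of music the composer
    enters; `generate` is any callable turning (seed, mood) into
    new material.
    """
    if mood not in SUPPORTED_MOODS:
        raise ValueError(f"unsupported mood: {mood!r}")
    return generate(seed_notes, mood)

# Trivial stand-in generator: tag each seed note with the requested mood.
def toy_generate(seed, mood):
    return [(note, mood) for note in seed]

print(request_inspiration(["C4", "E4", "G4"], "romantic", toy_generate))
```

A real system would of course return audio or symbolic music rather than labeled notes; the sketch only shows the shape of the exchange: seed in, emotional target in, mood-layered material out.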
Watson Is Not the Composer, but a Muse
Watson is not intended to compose fully fleshed-out songs on its own. Rather, it provides musical samples that fit a genre and mood. The human composer then uses those samples as inspiration on which to build a finished piece that is the result of his or her own artistic vision.
Still, the day may come when Watson, via its Natural Language Understanding module, hears its own name called out at the Grammy Awards.