The Artificial Intelligence in Music Debate
Music has come a long way from the early days when the only instrument was the human voice. The use of electronic technology in music took off in the 60s, when the iconic Moog synthesizers started taking over the British music scene and paved the way for the irreplaceable art form that is progressive rock. If it were not for the Minimoog, Keith Emerson would have just stuck to literally rocking the Hammond L100, riding it like a bucking bronco, brutalizing it with knives and, occasionally, when he got the time, even playing music on it. Yes indeed, the 60s and 70s were crazy times. But the introduction of integrated circuits and eventually processors paved the way for a lot of innovation in music and the emergence of many new genres.
However, technology in music didn’t stop with synthesizers. Over the past few decades, many technologies have been incorporated into music, including the controversial Auto-Tune. In recent times, though, the most notable technology to make its way into the musician’s arsenal is Artificial Intelligence. While the heavy use of technology in music is hotly debated and often shunned by the musicians of yore, it also has plenty of support. So, let us have a look at how AI is being used in the music industry today.
Algorithmic Composition
Surprisingly, the roots of Artificial Intelligence in music date back as early as 1965, when synth pioneer and inventor Ray Kurzweil showcased a piano piece created by a computer. Over the years, the use of computer programs in the composition, production, performance, and distribution of music has allowed researchers and musicians to experiment and come up with newer technologies to suit their needs.
Algorithmic composition is one of the more complex applications of AI in music. Kurzweil’s early efforts at producing music with computer algorithms relied on recognizing patterns and recreating them in different combinations to create a new piece. The cognitive approaches to music creation used today have grown from this idea. The algorithms involved are built by analyzing the human parameters that influence how music is perceived. The first step is for the cognitive system to understand the music as a listener would and then draw vital information from it as a musician would. To achieve this, the AI system must first be able to compute the notational data in relation to its audio output. Aspects such as pitch, tone, intervals, rhythm, and timing are all taken into consideration here.
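To make that first step a little more concrete, here is a minimal sketch, assuming a melody represented as simple (pitch, duration) pairs, of the kind of notational features such a system might compute. The data and function names are purely illustrative and not taken from any particular system.

```python
# Hypothetical sketch: extract basic notational features (intervals, rhythm
# ratios) from a melody given as (MIDI pitch, duration-in-beats) pairs.

def extract_features(melody):
    """melody: list of (midi_pitch, duration_in_beats) tuples."""
    pitches = [p for p, _ in melody]
    durations = [d for _, d in melody]

    # Melodic intervals in semitones between consecutive notes
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]

    # Rhythmic ratios between consecutive note durations
    rhythm_ratios = [b / a for a, b in zip(durations, durations[1:])]

    return {"intervals": intervals, "rhythm_ratios": rhythm_ratios}

# Example: the opening phrase of "Twinkle, Twinkle" as (pitch, duration) pairs
melody = [(60, 1), (60, 1), (67, 1), (67, 1), (69, 1), (69, 1), (67, 2)]
print(extract_features(melody))
```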
Based on the approach and the intended result, there are several computational models for algorithmic composition, such as mathematical models, knowledge-based systems, and evolutionary methods. Each model has its own way of taking a piece of music into the system, processing it, and deciding what information to derive from it and how. Composition is not always the purpose of an algorithmic system; musicians can also use one for comprehension or simply to draw inspiration. A toy example of one such model follows below.
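As a rough illustration of the simplest kind of mathematical model, the sketch below trains a first-order Markov chain on the pitch transitions of an input melody and generates a new sequence from it. This is a hypothetical toy under those assumptions, not a description of how any real composition product works.

```python
# Toy mathematical model: learn pitch-to-pitch transitions from a melody,
# then generate a new pitch sequence by walking the transition table.

import random

def train_markov(pitches):
    """Build a transition table: pitch -> list of pitches that followed it."""
    table = {}
    for a, b in zip(pitches, pitches[1:]):
        table.setdefault(a, []).append(b)
    return table

def generate(table, start, length, seed=42):
    """Generate `length` pitches starting from `start`, using the table."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = table.get(out[-1])
        if not choices:            # dead end: fall back to the opening pitch
            choices = [start]
        out.append(rng.choice(choices))
    return out

# Train on a simple C-major phrase (MIDI pitches) and generate a variation
source = [60, 62, 64, 65, 67, 65, 64, 62, 60, 64, 67, 72, 67, 64, 60]
table = train_markov(source)
print(generate(table, start=60, length=12))
```

Knowledge-based and evolutionary systems work differently (encoding music-theory rules, or breeding and scoring candidate phrases), but the input-analyze-generate loop above is the common skeleton.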
Non-Compositional Applications
While AI-based music composition is a work in progress that has been going on for decades and could take a few more decades to perfect, AI already serves the music industry in many ways outside the studio.
Engagement Tracking
Today, almost every form of art and entertainment profits most from the digital medium. This is also the case with music. Artists create music for a target audience spread across platforms like YouTube and Spotify, so the engagement data gathered from these platforms has a huge impact on the songs that are created as well as on how they are promoted. Artificial Intelligence plays a central role in this process. AI systems often act as the buffer between distributors, advertisers, marketers, and the audience for processes such as monetization and the transactions involved.
Analytics
Analytics data is an essential aspect of almost every online venture today, and music is no exception. Music creators, independent and signed alike, rely on analytics data to manage their online presence. AI-based analytics tools bring speed and accuracy to these processes, helping musicians keep up with the fast-paced competition on the internet.
It’s not Rock and Roll, Man
Music, being an art form that demands a great deal of creativity, also demands a creative rather than a purely analytical human mind behind it. This is the debate that has surrounded the AI-music alliance for a long time. Technology, in general, has had the music world divided for eons. In a decade when bands like Pink Floyd thrived on ‘space age’ technology like the EMS VCS3 and a whole rack of other gear, acts like Rush, Motorhead and Aerosmith roughed it out with just the rudimentary instruments. Although most musicians are open to new technologies, or eventually warm up to them, that has not been the case with AI. While the prospect of putting in far less creative effort to compose may entice some (especially given the volume of music being put out every day), it also opens up the debate of industry giants such as Sony, Universal, Warner and Tips misusing or overusing it, which could lead to an eventual vacuum in creativity.
I’m Perfect! Are You?
AI has always been a source of contention in every community, so it is no surprise that the music community is both supportive of it and skeptical about it. So far, efforts towards harnessing Artificial Intelligence-based composition have been modest, so there is not much out there for us to judge and take sides on. Ventures like Artificial Intelligence Virtual Artist (AIVA) are working exclusively towards bringing out the full potential of AI as a means of completely automating music production. While such ideas are a grey area today, only time will tell how they will come to influence music as we know it.