To the beat of technology
Technology has enabled an American music composer to create far more than she otherwise could, and her AI-composed album, the world’s first, proves it. She tells Su Aziz more in the third issue of In Focus.
Ignoring the heat and bustle of America’s City of Angels and bent over a desk holding multiple computer screens, facial muscles rigid in concentration, is 34-year-old Taryn Southern. At the time of writing, she’s working with four different AI music composition tools – AIVA, Amper Music, Google’s Magenta and IBM’s Watson Beat – to compose her latest album, I Am AI.
‘I’ve always considered myself a musician hobbyist. It’s my side hustle, I do it because I love it!’ Taryn confesses. ‘I was an actress and TV host for five to six years before I started a YouTube channel and a digital production company. So, music is something I always incorporate into my work. Now I’m directing my first feature, a documentary on the future of man and machine, while working on the I Am AI album on nights and weekends.’
Taryn’s album consists of eight tracks, although, in total, she has composed 14 songs with AI. The first single from the album is Break Free, released in 2017. Funnily enough, hardly anyone has been able to discern the difference between AI-composed music and human-composed music. ‘A lot of people comment they can’t tell the difference. Although, I think a lot of people feel threatened by the idea of AI composing music. Creativity has always been believed to be sacred to us but the reality is that creativity is quite simply something experienced in the eye of the beholder.’
Taryn points out that, while one may argue that monkeys and elephants aren’t exercising creativity when they create paintings, ‘That doesn’t mean the person buying the art doesn’t experience something in that painting and imbue it with his or her own meaning.’
‘But aside from all of that, what a lot of people don’t yet see, is how AI will actually empower solo artists, like myself, to be able to create things using a different language than they have in the past,’ she adds. ‘It still takes hard work and effort. But it’s very possible that this new language for music helps level the playing field and create opportunities for those who don’t have a traditional background in music.’
Tech, beat by beat
In general, Taryn discloses, most publications have focused on her work using Amper, ‘It’s the easiest for anyone to use. But very few have focused on Watson Beat or AIVA. AIVA’s interesting specifically because it is trained on classical music. So, I like that I’m able to participate in a different type of collaboration than what I’d have access to in the “pop producer” world. Watson Beat requires some coding knowledge but a lot more creative input from the artist to change or edit a song.’
In her experience, there’s a learning curve for Watson Beat and Magenta’s tools. ‘The Watson team actually got on multiple Skype sessions with me, to help me through it. But once you get the hang of it, it just becomes a matter of being familiar with the “language” of each system, so you know how to direct it to get a sound or style you like,’ Taryn explains.
‘I don’t know if technology saves me any time. There’s a fair argument that it actually took me more time than the traditional way of making music. What it does, is give me a set of unique, forced constraints and then I have to come up with creative solutions to make that work,’ she says. ‘I find that sort of challenge immensely delicious and an amazing way to kickstart creativity. The other huge benefit is that I can make this album all on my own. Prior to working with AI, the only way I could produce tracks was to work with a music producer.’
Taryn’s non-traditional music background, and the fact that she was never trained in music production or theory, would have made the learning curve incredibly steep. However, she explains, ‘Working with AI, I could actually construct the songs by taking the output stems and arranging them in Logic or GarageBand. No experience with music composition required, just basic editing skills and a good ear. It’s empowering to know that I can produce the music all on my own.’
It’s really not just about pushing a button and having a song pop out. Interestingly, in working with AI, Taryn takes on a dual role as both composer and editor of the tracks rather than composer alone – something that has been predicted to happen to us all with the advent of technology.
First, Taryn tells the AI tool what sort of song she wants, along with such details as the musical key and whether the beat is fast or slow. It then churns out music that, according to Taryn, ‘is usually pretty bad’. She takes that music and tweaks it for hours before feeding the tweaked version back to the AI tool, and the process repeats until the music is what she wants. What the AI suggests can involve a lot of hit and miss, but persevering pays off in the end.
This tells us two things: working with AI doesn’t necessarily save time, and the creative process seems to mirror a human’s.
Tuning the details
Taryn shares the profits from the album with the AI that helped compose the music. But how does that work, exactly? ‘It depends on the AI company, but keep in mind it’s not going to a piece of software. There are engineers or musicians in each case who built the system or code that creates the music and they have to keep the lights on,’ she answers. ‘I’ve never made much money on my music – that’s the sad reality for most musicians. The cost of making and marketing songs is high. You do it because you love it and you have something to say. If I end up making money, I’m happy to share the pie with all of those who helped make this album possible.’
When it comes to regulations in the music industry on the rights and ethics of AI machines, Taryn has this to say, ‘No one’s going to be fighting for the rights of machines in the music industry, yet. The argument right now centres on legal and due process – at what stage AI is truly generating new forms versus derivative works, and how that affects backend compensation rights. All of these things are being discussed and by no means do I think I should be the expert or consulted on these things, but it’s a discussion that needs to happen.’
For Taryn, the future of music is now. ‘Just as hip hop and electronic music birthed new tools and forms of storytelling in music, collaborations with AI will do the same. We’ve seen a trend toward democratisation and globalisation across the industry,’ she says. ‘And I do believe that’ll continue to happen, further enabled by AI.’
Her two main pieces of advice for musicians considering the same technological path as hers are, ‘Have fun experimenting with musical styles outside of your preferred style – that’ll inevitably lead to exciting new collaborations – and play with all the tools to see which best fits your needs.’