The L.A. synth-pop group’s new album “Chain Tripping” pushes musical boundaries with AI.
You could call “Chain Tripping” YACHT’s artificial intelligence album, but that would be an oversimplification. For the L.A. synth-pop outfit’s seventh full-length record, YACHT — an acronym for Young Americans Challenging High Technology — embarked on an experiment that began more than three years ago, bringing together multiple machine-learning processes and a global network of collaborators. Their efforts brought forth computer-generated music guided by human creativity and a lesson, not just for the machines, but for the band members themselves.
“We wanted to understand it,” says band member Claire L. Evans of the album’s concept. “We knew that the best way to do that is to make something.”
A few days have passed since the release of “Chain Tripping,” and Evans, along with bandmates Jona Bechtolt and Rob Kieswetter, is sitting inside a Chinatown restaurant breaking down the heady tech behind the new album.
“AI is like any technology,” Evans says. “It’s what we do with it that’s good or bad.”
Maybe AI will be the thing that kills all our jobs. Maybe it will increase the power of surveillance in an age when it already feels like everyone’s watching. Maybe it’s another step deep into dystopia. Or, maybe it’s a catch-all phrase for a collection of tools whose future is dependent on how humans implement them.
“Every technology has a ‘Reefer Madness’ moment,” Bechtolt says, citing attempts by the U.K.’s Musicians’ Union to ban synthesizers in the early ’80s.
Kieswetter adds that, when it comes to emerging tech, it’s “important for artists to get involved in these early stages to shape the narrative, so it’s not just coming from these monolithic corporate entities.”
The trio quickly learned that using AI to make an album is a much more complicated and time-consuming process than they had anticipated. You can’t command a computer to write a song and expect to instantly hear something like their hit “I Thought the Future Would Be Cooler,” even if the 2015 single is part of the material that’s training the program. That’s not how the technology works.
“There’s no such thing as an AI. It’s not a guy or a gal,” Evans says. “It’s a set of interrelated, complicated, discrete mathematical processes that can do one thing really well, one specific thing really well … They can’t do larger, structural things, like write songs that have meanings and words and lyrics and instrumentation and production and structure. Those are all different problems that maybe individual systems can approach, but there’s no one that does it all.”
YACHT combed through the output, piecing together bits of melody and beats into music they could play and transforming words and phrases into lyrics.
YACHT isn’t only a tech-forward band; they’re tech aficionados. Evans is a science journalist as well as a musician, and she authored the 2018 book “Broad Band: The Untold Story of the Women Who Made the Internet.” Evans and Bechtolt were also instrumental in reinvigorating downtown L.A.’s Triforium, a 1970s tech art piece that was fixed up for a series of events late last year.
So it’s no surprise their approach to “Chain Tripping” is steeped in history. With AI, YACHT could dive deep into their musical influences and past albums to better understand themselves. “It’s not about moving forward,” Evans says. “It’s about using these tools to analyze the past in a more reflective way.”
Plus, there are historical frameworks for the processes employed in making the album.
Think of cut-up poetry or sample-based music as two of the predecessors for this experiment. “It was sort of like sampling an alternate dimension version of ourselves,” Bechtolt says. “It’s going through and pulling out guitar parts that we never wrote, but could have — sampling latent space.”
The group employed several such processes to make “Chain Tripping.” For the music, they used a latent space interpolation model, which essentially extracted hidden melodies from within existing YACHT songs. What’s tricky about this method, though, is that the music it generated wasn’t necessarily music the band could actually perform.
“We grew up on punk rock,” Bechtolt says, “none of us are trained musicians. I’m sure you could give a bunch of these melodies to seasoned players and they would be like, ‘yes, this is super simple.’”
As musicians, it prompted them to think about what they play and why. “For us, it made us realize that everything we play is just regurgitating something that we learned in the past that feels good to play,” Bechtolt says. “Whether it’s a guitar riff or a drum pattern, it’s something that I had heard before or had already played, a twist on that.”
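The band hasn’t published the specifics of their melody pipeline, but the core move Bechtolt is describing, interpolating between points in a model’s latent space, can be sketched in a few lines of Python. In the toy sketch below, the “encoder” and “decoder” are random stand-ins for a trained model and the melodies are placeholder scales; only the interpolation loop reflects the technique itself, so read it as an illustration rather than a recreation of YACHT’s setup.

```python
# Toy sketch of latent-space interpolation between two melodies.
# The "encoder" and "decoder" here are random projections standing in for a
# trained model; names, shapes and melodies are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
SEQ_LEN, LATENT_DIM = 8, 4
ENC_W = rng.standard_normal((SEQ_LEN, LATENT_DIM))   # stand-in "encoder" weights
DEC_W = rng.standard_normal((LATENT_DIM, SEQ_LEN))   # stand-in "decoder" weights

def encode(melody):
    """Project a melody (MIDI pitches) into a small latent vector."""
    return melody @ ENC_W

def decode(z):
    """Project a latent vector back to pitch space, snapped to the octave above middle C."""
    raw = z @ DEC_W
    lo, hi = raw.min(), raw.max()
    return (60 + np.round(12 * (raw - lo) / (hi - lo + 1e-9))).astype(int)

melody_a = np.array([60, 62, 64, 65, 67, 69, 71, 72], dtype=float)  # ascending scale
melody_b = np.array([72, 71, 69, 67, 65, 64, 62, 60], dtype=float)  # descending scale

z_a, z_b = encode(melody_a), encode(melody_b)

# Walk the straight line between the two latent points; each step decodes to a
# melody neither source contained but that sits "between" them.
for t in np.linspace(0.0, 1.0, 5):
    print(f"t={t:.2f}:", decode((1 - t) * z_a + t * z_b))
```

A trained model would be doing the real work inside encode and decode; the point of the loop is that every value of t between 0 and 1 decodes to a melody that neither source song contains, which is what “sampling latent space” amounts to.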
Then, there are the lyrics. The band turned to Ross Goodwin, the L.A.-based technologist and writer whose work includes the AI-penned short film “Sunspring” and the experimental AI-crafted novel “1 the Road,” to develop the neural network used for the lyrical portion of the project.
“I think what I really liked about Claire and Jona, the whole band, is that their focus is very similar to mine in that they’re all about instrumentalizing technology,” Goodwin says. “To me, a personal computer is like a saxophone or a musical instrument. You can be very expressive.”
Trained on 2.2 million words, including lyrics from previous YACHT songs, the machine churned out voluminous possibilities; in the studio, the band worked through a 3-inch-thick dot-matrix printout. Much of the wordplay on the album turns on unusual phrases like “palm of your eye” in the song “Scatterhead.” Evans likens it to hearing her mom, who is from France, translate French phrases into English.
“I could see that’s a phrase, but it’s not a phrase that we use,” she says. “It sort of makes sense, but it sort of doesn’t either.”
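Goodwin’s exact setup isn’t detailed here beyond the size of the corpus, but the basic mechanism behind this kind of generator, sampling text one character at a time from a neural network, is compact enough to sketch. The hypothetical Python below uses a tiny, untrained character-level LSTM; every name in it is invented for illustration, and without training on that 2.2-million-word corpus it emits noise rather than near-English phrases like “palm of your eye.”

```python
# Hypothetical sketch of character-level lyric generation: sample one
# character at a time from a recurrent network. The model is untrained and
# the architecture is an assumption, not Goodwin's actual system.
import torch
import torch.nn as nn

VOCAB = "abcdefghijklmnopqrstuvwxyz '\n"
stoi = {c: i for i, c in enumerate(VOCAB)}
itos = {i: c for c, i in stoi.items()}

class CharLM(nn.Module):
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, idx, state=None):
        out, state = self.lstm(self.embed(idx), state)
        return self.head(out), state

@torch.no_grad()
def sample(model, seed="the ", length=80, temperature=0.9):
    """Feed the seed through the model, then repeatedly draw the next
    character from the softmax distribution, scaled by temperature."""
    idx = torch.tensor([[stoi[c] for c in seed]])
    logits, state = model(idx)
    out = seed
    for _ in range(length):
        probs = torch.softmax(logits[:, -1, :] / temperature, dim=-1)
        nxt = torch.multinomial(probs, num_samples=1)
        out += itos[nxt.item()]
        logits, state = model(nxt, state)
    return out

# Untrained, so this prints noise; training on a lyric corpus is what would
# make the output read like off-kilter near-English.
model = CharLM(len(VOCAB))
print(sample(model))
```

The temperature value is the main dial in a sampler like this: low settings stick close to the most probable next character, while higher settings produce stranger, less probable strings.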
And that presents an interesting opportunity, putting Evans in a position similar to people hearing YACHT’s music. “As a consumer of music, you try to hear things and guess what the artist meant when they wrote that thing and do a lot of projecting,” Evans says. “In the process of making this, I found myself doing that while writing the songs.”
That has the potential to alter, maybe even strengthen, the connection between the artist and the fan. “We’re all in it together, trying to understand what the song means and we’re all projecting our own meaning onto it,” Evans says. “There’s something kind of beautiful about that.”
Evans’ singing style changed as well. “Trying to sing these words on top of the melodies meant sometimes having to chop the words up in weird ways and pronounce them differently than I normally would, just to make them fit on the generated melodies,” she says. “I found it was helpful in the process of doing that to stop thinking of the words as meaning and just think of them as sound, which is something that I’ve never done before as a songwriter.”
In the end, they learned more than just how to use machine-learning to make music. “You’re able to see your own patterns because you’re being challenged by something that’s outside of the realm of what you’re used to,” Evans says. “It forces you to break outside of some embodied habit that you didn’t even know was there — and I think it ostensibly does make you better from doing it.”