Are A.I. Music Tools Taking Over? In today’s episode, we dig into exactly why big socials like TikTok and Meta are getting into A.I. Music Tools & AI Music Generation. With TikTok’s newest tool “Ripple”, we’re seeing more tech companies getting their share of the AI Music world.
We also dive into the world of other popular AI Music Tools like Boomy and Moises in order to understand what’s possible now and where we predict the world of music AI is going.
Will A.I. Replace Human Musicians? What Two Industry Insiders Are Saying
Ryan Withrow’s hooded sweatshirt and baseball cap make a stark contrast to his usually slick style. But his laidback look matches the informal vibe as Withrow and co-host Jonathan Boyd dive into the hot topic on every musician’s mind lately – is A.I. set to take over music creation?
You may have caught wind of new social media apps debuting advanced A.I. music generation tools. TikTok’s parent company ByteDance playfully unveiled Ripple, letting users hum melodies that get auto-composed into full tracks. And Meta continues pushing its eerily lifelike VR metaverse performers.
So are robots destined to replace human artistry? The hosts of Future of Music bring a nuanced musician’s perspective. Ryan himself produces electronic dance tracks when not podcasting. Jonathan gigs locally as a guitar player. Both remain open-minded about A.I. innovation while questioning the ultimate impact on their industry.
Breaking Down The Proliferation of New A.I. Music Tools
What stunned the hosts is the sheer pace at which new platforms materialize daily. As Ryan notes: “It’s a new one, a new one, a new one…you can throw a dart at the screen if you need to try one!” Jonathan astutely asks whether commodity dynamics apply – is each music app differentiated enough to stand out?
They highlight apps like Amper Music (frequently ranked among the top ten A.I. music tools), which rapidly generates full songs in any genre at the touch of a button. AIVA builds custom scores using a trained A.I. composer. Ecrett Music and Soundraw promise similar auto-composition features. Even Boomy, the simple practice tool Jonathan favors, strums back chord progressions.
One offering stood out for its unique utility among musicians – the oddly dubbed Moises app. It analyzes imported tracks to extract or isolate any instrumentation. Jonathan suggests clever applications like practicing guitar solos over just a song’s vocal line. This demonstrates AI crossing over into augmentation rather than pure replacement of human artistry.
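Moises’s actual separation relies on trained neural source-separation models, but the oldest trick in the stem-splitting book is simple mid/side arithmetic: anything mixed identically into both stereo channels (often the lead vocal) cancels when you subtract the channels. A minimal NumPy sketch of that idea – purely an illustration of what “isolating instrumentation” means, not how Moises works:

```python
import numpy as np

def split_mid_side(stereo):
    """Decompose a stereo signal (N x 2 array) into mid/side components.

    mid  = (L + R) / 2  -> center-panned content (often the lead vocal)
    side = (L - R) / 2  -> hard-panned content; center-panned audio cancels
    """
    left, right = stereo[:, 0], stereo[:, 1]
    return (left + right) / 2.0, (left - right) / 2.0

# Synthetic demo: a "vocal" mixed equally into both channels,
# and a "guitar" panned hard left.
t = np.linspace(0.0, 1.0, 8000, endpoint=False)
vocal = np.sin(2 * np.pi * 220.0 * t)
guitar = 0.5 * np.sin(2 * np.pi * 440.0 * t)

stereo = np.stack([vocal + guitar, vocal], axis=1)
mid, side = split_mid_side(stereo)
# side now contains only the guitar (at half amplitude); the vocal cancels.
```

Real stem separators like Moises go far beyond this, using deep networks trained on multitrack data – which is why they can split fully center-mixed recordings that this old trick cannot touch.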
Why Would Social Platforms Rush Towards A.I. Music Generation?
Riffing on the business motivations, Withrow highlights how media platforms currently pay royalties for trending sounds. Human creators earn from samples in viral TikTok or Instagram videos. But A.I.-generated tracks allow ownership by the platform itself, saving huge royalty expenses.
They also underscore how interest continues swelling around music composition. Ryan notes: “The massive majority of people out there all want to create music…they just don’t want to learn an instrument.” A.I. songwriting apps promise to unlock that latent creativity without skill development.
Jonathan suggests another driver – major platforms simply want ownership over this emerging space. So they ship whatever products they can around music creation, regardless of strategic fit. The payoff remains unclear, so they race for pole position in any way imaginable.
Possibilities (and Perils) When Humming Becomes Songwriting
The hosts good-naturedly stumble over pronouncing Moises as they explore the implications of its pitch correction capability. A human can vocalize a melody, and the A.I. supplies harmonizing instruments mapped to the humming. Jonathan and Ryan geek out over visions of this democratizing music-making worldwide. No longer limited by instrumental fluency, any amateur could hum song ideas into existence!
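Under the hood, any hum-to-track feature has to start by tracking the pitch of the user’s voice. None of these apps publish their algorithms, but the textbook starting point is autocorrelation: a periodic signal correlates strongly with a copy of itself shifted by one period. A toy sketch (illustrative only – `estimate_pitch` and its parameters are assumptions, not anyone’s actual implementation):

```python
import numpy as np

def estimate_pitch(signal, sample_rate, fmin=80.0, fmax=1000.0):
    """Estimate the fundamental frequency (Hz) of a monophonic signal
    by naive autocorrelation: find the lag, within the singable range
    fmin..fmax, where the signal best matches a shifted copy of itself."""
    sig = signal - signal.mean()
    corr = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    min_lag = int(sample_rate / fmax)  # highest pitch -> shortest period
    max_lag = int(sample_rate / fmin)  # lowest pitch -> longest period
    best_lag = min_lag + np.argmax(corr[min_lag:max_lag])
    return sample_rate / best_lag

# Demo: a pure 220 Hz tone standing in for a hummed A3.
sr = 8000
t = np.linspace(0.0, 1.0, sr, endpoint=False)
hum = np.sin(2 * np.pi * 220.0 * t)
pitch = estimate_pitch(hum, sr)  # close to 220 Hz
```

A production system would track pitch frame-by-frame, smooth the resulting contour, and quantize it to notes before handing anything to a harmonizer – and that gap is exactly where Ryan’s caution below comes in.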
But Ryan interjects notes of caution around such tech-mediated creativity. When humming a sparse melody, much ambiguity remains around intended chord progressions and instrumentation. So AI may manifest musical ideas diverging far from what users hear internally. Withrow poses the question: “Is it really what you were having in your head though?”
Jonathan similarly analogizes to improvisational chemistry between adept human bands. Individual members interpret and enhance each other’s riffs, with results organically exceeding the imagination of any single contributor. Perhaps AI collaboration produces similar surprises, for better or worse.
They concur that vocal pitch correction could enable those lacking singing proficiency to realize their musical ideas through assistive A.I. – again revealing creative potential while substituting for technical practice.
Takeaways – Coexistence Rather Than Extinction
Rather than prophesying the imminent extinction of working musicians, Ryan and Jonathan emphasize the likely integration of emerging technologies into existing creative processes. Ryan distinguishes musicians passionately pursuing the craft from casual hobbyists satisfied with creating a single A.I.-assisted track for fun.
Jonathan notes how the explosion of machine-learning music apps focuses mainstream attention squarely on the industry. He observes “a shift towards sounds and music and what’s possible” as a sign that musical artistry still holds firm ground in the cultural zeitgeist.
So should musical artists welcome their purported successors with open arms or rage against the machines? Tune into Future of Music’s full discussion to hear these practitioners weigh benefits against disruptions and barriers against possibilities in A.I.’s booming new musical landscape!
A.I. Offers a Hand, Not a Replacement
If anxieties over keyboard-toting cyborgs line dancing at your next gig plague your nightmares, rest assured: Jonathan and Ryan examine realistic boundaries on A.I.’s encroachment into music’s hallowed halls.
Complex nuances of culture, emotion, and shared experience all echo thunderously through music’s soul. The hosts believe machines can hardly replicate these innately human inputs. But technology promises new portals for musical expression by novices and experts alike.
So rather than some unilateral changing of the guard, look towards symbiosis. A.I. delivers new tools that place apprentice composers of all backgrounds on more equal footing, yet the masters wield these technologies as enhancements too. Jonathan says it well – “this gets one step closer, actually a leap closer to getting the music that’s in your head out into the world.”
It’s a conversation for curious ears wanting exposure to music’s technological cutting edge without the clickbait hype.
Ryan and Jonathan wholeheartedly agree on one thing – whatever form the future of music takes, humans themselves will direct where it leads through the tools we choose to create with.
Don’t forget to like, subscribe, and follow the Future of Music Podcast to stay updated on the latest episodes and discussions. Join the growing community of tech enthusiasts, musicians, and curious minds who are shaping the future of music in the digital age. The journey is just beginning, and you won’t want to miss a moment of it.