
How Generative AI Is Driving Market Demand for Creator-based Tools for Music and Video

How is generative AI impacting market demand in the music and video tech space? Find out.


Music technologies have been gradually evolving for years, but new innovations are now speeding up the tempo. Artists say they want technologies that will enable real creative expression and authenticity, and the developers who listen to them will be the ones who succeed. Gregg Stein, founding partner and CEO of Triple G Ventures, explores why that is the case and how generative AI is impacting the market trend around creator-based tools.


It’s an exhilarating time for the music industry, but it’s not without its growing pains. Thanks to the latest boom in generative AI and associated tools – such as Google’s new music platform MusicLM, AI-based voice cloning for sampling, and NFT technologies for licensing and monetization, as in Rihanna’s announcement shortly before her Super Bowl performance – music of all kinds is reaching more people and being monetized in more ways than ever before.


Opportunities abound for the creators of tools that help musicians make and share their content. But to get traction, many developers will need to change their mindset.


For years, tech companies have seen musicians ignore or ridicule their creations. A common refrain has been, “This thing was designed by engineers who obviously never tried to make music in their lives!”


This needs to change because artists now need fewer enablers and middlemen to share their work than ever before. The industry must learn to take its cues from musicians – and tune out everyone else.


Tips for Music Developers to Navigate the Future with AI

The year ahead is a great time for this mindset change, as there are new opportunities on the horizon for musicians and for tech developers. Here are several tips to help music tech developers navigate the future:


1. Build solutions that boost creativity


There will always be a market for USB microphones and other gadgets, but in such a crowded market, the real area for growth is in new ways to make content creation easier.


For example, today there is a thriving market for podcasting interfaces: combination devices with microphone preamps, mixers, effects and profanity masking, digital recording, and a streaming interface to a computer – all in one affordable box that sits on a desktop. Two years ago, this type of product was unheard of.


That’s just one of many ways the music industry is changing. Anything designed to stimulate the creative process can open up new markets, so think about how your solution enables and empowers creative people to better express themselves.


2. Perfect the expressive interface between creators and their tools


Electronic instruments are trending heavily in the direction of feeling and sounding more like actual musical instruments, and musicians love it.


Keyboard players, for example, used to bemoan the fact that synthesizers and workstations were not genuinely expressive. But polyphonic aftertouch keyboards, new expressive controls, and new standards from MIDI Polyphonic Expression (MPE) to the nascent MIDI 2.0 standard have changed their tune.
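The core idea behind MPE is simple: instead of one pitch bend or pressure value shared by every held note, each note is routed to its own MIDI channel so expressive gestures apply per note. A minimal sketch of that channel-rotation scheme, encoding raw MIDI 1.0 bytes directly (this is illustrative code, not a real MPE library):

```python
# Sketch of MPE channel allocation: each sounding note gets its own MIDI
# channel, so per-note pitch bend and pressure messages do not collide.
# Channel numbers here are 0-indexed (wire channel = index + 1).

class MPENoteAllocator:
    """Assigns each active note a free member channel (channels 2-16 in a
    typical lower-zone MPE setup; channel 1 is the zone master)."""

    def __init__(self, member_channels=range(1, 16)):
        self.free = list(member_channels)   # 0-indexed channels 1..15
        self.active = {}                    # note number -> channel

    def note_on(self, note, velocity):
        channel = self.free.pop(0)
        self.active[note] = channel
        return bytes([0x90 | channel, note, velocity])

    def pitch_bend(self, note, bend):
        """Per-note bend: 14-bit value 0-16383, where 8192 means no bend.
        Encoded as status 0xE0 | channel, then LSB (7 bits), then MSB."""
        channel = self.active[note]
        return bytes([0xE0 | channel, bend & 0x7F, (bend >> 7) & 0x7F])

    def note_off(self, note):
        channel = self.active.pop(note)
        self.free.append(channel)           # channel becomes reusable
        return bytes([0x80 | channel, note, 0])
```

Because each note owns a channel while it sounds, bending one note of a chord leaves the others untouched – exactly the expressiveness keyboard players were missing.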


For drummers, modeled drum kits with improved sound creation capabilities and playing surfaces – drum heads, cymbals, hi-hats – that respond naturally to the nuances of how drummers play are gaining in popularity. Does it really matter that a hi-hat cymbal swings a bit in the rebound from a stick hit? You bet it does if you’re a drummer.


So if you are building electronic instruments, start by imagining how the instrument will feel for a musician to play. Strive for authenticity.


3. Prepare for musicians jamming with generative AI


AI has been with us for years, but the explosion of creative content generated by machines under human direction is a recent phenomenon. Album artwork is AI-generated from user prompts, and chatbots are now writing essays and lines of computer code. AI will find its way into more music creation apps, starting with the wave of tools following Google’s MusicLM, and developers should get ready.


Some AI-enabled music programs can follow the music you create or like, and adjust their parameters to suit your tastes. Music creation apps can listen to your mixes and suggest combinations of multiple signal-processing tasks to accomplish results that might take hours otherwise. While initial tests have artists chuckling skeptically, AI programs like Google’s are now capable of generating entire songs.
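At its simplest, that kind of assistant analyzes a signal and maps what it measures to a suggested processing move. A toy sketch of the idea, using only level analysis to suggest a gain change (the target level and function names are illustrative, not from any real product):

```python
# Toy "mix assistant": measure the RMS level of a buffer of float
# samples (-1.0 .. 1.0) and suggest a gain change toward a target level.
# Real tools chain many such analyses (EQ, compression, stereo width).

import math

def rms_dbfs(samples):
    """RMS level of the buffer in dBFS (0 dBFS = full scale)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

def suggest_gain(samples, target_dbfs=-14.0):
    """Gain change in dB that would bring the buffer to the target level."""
    return target_dbfs - rms_dbfs(samples)
```

For example, a full-scale sine wave sits at about -3 dBFS RMS, so against a -14 dBFS target the assistant would suggest roughly -11 dB of gain. The same measure-then-recommend loop generalizes to the multi-step signal-processing suggestions described above.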


In an interview with Howard Stern, Jay-Z famously explained that he doesn’t compose songs by writing music. He hums what he wants to a producer, who in turn creates the music. With tools like MusicLM, anyone could feasibly create music the same way, without the lengthy process of producing it themselves or the expense of hiring others to do it for them.


It’s only a matter of time before these products begin to create music indistinguishable from that made by human musicians. When this happens, musicians will want to compose music and jam with AI programs as a backup band. Developers should look ahead to this and think of ways to make this fun and easy.


4. Don’t reinvent the wheel; upgrade the gear


Some things never go out of style. Acoustic and electric guitars, basses, drum kits and other instruments are still strong sellers. There’s low risk associated with the idea that people wanted guitars and drums last year, and they’re going to want them next year. Many developers are looking for ways to tie these “old reliables” into new tech that improves aspects of them without sacrificing musicality.


Some possibilities include improved physical modeling, such as allowing a guitar to sound like a piano and remain playable with real nuance. Today’s modeling devices have evolved from unwieldy gadgets to simple stompboxes that can be plopped onto a pedalboard next to the fuzzbox and echo machine – and they sound and play just as well, or better.


There has also been a strong trend toward the use of in-ear monitors (IEMs) rather than stage wedges for reasons ranging from better feedback control and less cabling to protecting musicians’ hearing. This is only one part of a profusion of affordable and pervasive wireless systems where multiple mics and monitors can be controlled from a central location.




In Tune with Change


All of this points to a continuing revolution in bringing traditional music performance and engineering methods into the 21st century.


The year ahead can be a great one for music tech developers, but only if they create products that empower creators to do their best work. Musicians are striving for real creative expression and authenticity. Developers should be listening to them and aiming to do the same.


Do you think how we perceive creativity will change with the use of generative AI? Share with us on Facebook, Twitter, and LinkedIn.




