The era of bad robot music has arrived. Adobe is working on a new AI tool that will let anyone become a music producer, no instrument knowledge or assembly required.
The company revealed “Project Music GenAI Control” (a very long name) last week. The tool lets users create and edit music simply by typing text prompts into a generative AI model. These can include descriptions such as “powerful rock,” “happy dance,” or “sad jazz,” Adobe explained.
Project Music GenAI Control then generates an initial melody based on the user’s prompt, and that melody can be edited with further text. Adobe says users can, for example, adjust the intensity of the generated music, extend the length of a clip, or create a repeating loop.
The target audience for the new tool is podcasters, broadcasters, and “anyone who needs audio with the right mood, tone and length,” said Nicholas Bryan, a senior researcher at Adobe and one of the creators of the technology.
“One of the exciting things about these new tools is that they don’t just generate audio – they take it to the level of Photoshop, giving creators the same kind of deep control to shape, tweak, and edit their audio,” Bryan said in an Adobe blog post. “It’s a kind of pixel-level control for music.”
Adobe uploaded a video showing how Project Music GenAI Control works, and it’s a little terrifying how easily, and how quickly, the tool can create music. While the music it produces won’t win any Grammys, I can certainly imagine hearing it in the background of YouTube videos, TikToks, or Twitch streams.
Project Music GenAI Control | Adobe Research
This isn’t entirely a good thing. Artificial intelligence has already sent shockwaves through professions like writing and acting, forcing workers to take a stand to keep their livelihoods from being taken from them. Comments on the company’s YouTube video echoed these concerns and sharply criticized the company for making “music written by robots for robots” and for “good corporate grief.”
“Thank you to Adobe for trying to find even more ways for corporations to put creators out of work. Also, which artists did you steal material from to train your AI?” wrote one user.
When asked for comment by Gizmodo, Adobe did not disclose details about the music used to train the AI model behind Project Music GenAI Control. However, the company emphasized that for Firefly, its family of AI image generators, it trained the models only on openly licensed content and public-domain content whose copyright has expired.
“Project Music GenAI Control is a very early look at the technology developed by Adobe Research, and while we are not revealing model details yet, we can share the following information: Adobe has always taken a proactive approach, ensuring we innovate responsibly,” Adobe spokesperson Anais Gragueb told Gizmodo in an email.
Music is an art form and is inherently human. As such, we should approach tools like Adobe’s with caution; otherwise, we risk a future where music sounds as hollow as the machines that generate it.