We often hear that machine learning and artificial intelligence excel at rote tasks, like pattern recognition.
But, increasingly, developers are looking to apply AI to more creative problems.
"We as developers can impart the creative knowledge that we've built up over the years," said Drew Silverstein, co-founder and CEO of AI-driven music service Amper Music Inc., in an interview at the O'Reilly AI Conference in New York.
Amper Music, based in New York, automatically generates musical scores for commercial purposes like marketing videos, video games or podcast intros and outros. It lets users define the length, style and mood of the piece. They can also control how long certain aspects last, such as verse and chorus. Then, the creative AI algorithm puts together a piece of music to those specifications.
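To make the workflow concrete, a user request of the kind described above could be modeled as a small specification object. This is an illustrative sketch only; the `CompositionSpec` and `Section` classes and their field names are assumptions, not Amper's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Section:
    name: str   # e.g. "verse" or "chorus"
    bars: int   # how long this section lasts, in bars

@dataclass
class CompositionSpec:
    """Hypothetical request a user might submit to a generative-music service."""
    length_seconds: int
    style: str                        # e.g. "cinematic", "folk"
    mood: str                         # e.g. "uplifting", "tense"
    sections: list = field(default_factory=list)

# A 30-second uplifting cinematic cue with a verse and a chorus.
spec = CompositionSpec(
    length_seconds=30,
    style="cinematic",
    mood="uplifting",
    sections=[Section("verse", 8), Section("chorus", 8)],
)
print(spec.style, spec.mood, len(spec.sections))
```

The generation engine would then take a structure like this and assemble music to match it.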
The team at Amper has essentially broken its musical knowledge down into component parts and codified them; the AI algorithm then reconstitutes these components into new pieces.
"These are things that are just inherently music theory," said Cole Ingraham, lead developer at Amper. "We can replicate that structure in our models ourselves."
How to replicate creativity
Amper uses a proprietary mix of methods to model the fundamentals of music and replicate them. The process can involve neural networks, the type of deep learning model fueling much of today's AI renaissance. But, just as often, the team relies on other methods, which it has not publicly detailed.
Ingraham said neural networks are good at divining some tasks in the creative process, but not others. This is because they often obscure larger context in favor of specific probabilities.
"One reason I shy away from the neural-network approach is that it learns a global probability of things," he said. "What it's shown is what it knows."
Not everyone feels this is a problem. Google is also teaching an AI model to play music, and it is relying on a neural-network-heavy approach. This may be more feasible for Google, which has access to some of the largest data stores in the world. At a certain point, context becomes apparent when you work with enough data.
Google's project is called Magenta. Doug Eck, a senior staff research scientist at Google, explained the research at the O'Reilly conference. The Google team trained the AI model on 1,400 human performances of classical piano compositions, captured as MIDI, a digital format that encodes musical notes and performance data. The neural network analyzed this data to learn things like scales, note timing and musical form, and from there learned to compose its own pieces with expressive timing and volume.
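To give a sense of what that training data looks like, here is a minimal sketch of MIDI encoding at the byte level. The note and velocity values are arbitrary examples; real MIDI files also carry timing information between events, which is omitted here.

```python
# MIDI channel voice messages are short byte sequences: a status byte
# (message type + channel) followed by data bytes.
NOTE_ON, NOTE_OFF = 0x90, 0x80  # status bytes for note-on / note-off

def note_on(note, velocity, channel=0):
    """Encode a MIDI note-on event as three bytes."""
    return bytes([NOTE_ON | channel, note, velocity])

def note_off(note, channel=0):
    """Encode a MIDI note-off event (release), velocity 0 by convention here."""
    return bytes([NOTE_OFF | channel, note, 0])

# Middle C (note number 60) struck at moderate velocity, then released.
events = note_on(60, 64) + note_off(60)
print(events.hex())  # prints "903c40803c00"
```

Sequences of events like these, recorded from human pianists, are what the model learned scales, timing and form from.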
The team is working with the assumption that no matter how imaginative a piece of music is, it still has a structure and pattern that a neural network can identify. The researchers have made clear progress, but admit much more work needs to be done.
"We haven't heard a machine-generated music that sounds that good," Eck said. "With a good composer, there's so much nested structure that we haven't captured."
Businesses should care about creative AI
Eck said the project he's working on, teaching AI to play music, is not just for fun. Ultimately, he hopes that identifying and replicating the subtle ways human minds and hands work together to create something as complex as music will lead to a more intuitive and humanlike -- and, therefore, useful -- type of AI.
"If we can build models that capture what we do in the everyday world, we get much further in extending that behavior into new worlds," he said. "Capturing that data and what we're doing in our everyday lives is crucial."
More immediately, businesses are trying to simulate smaller aspects of the creative process to create business value. For example, several news organizations are using natural language bots to write short briefs on sports events and financial reports.
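Those automated news briefs typically rest on template-based natural language generation: structured data from a game or earnings report is slotted into prewritten sentence patterns. The function and wording below are a toy example of that technique (it assumes no ties, for brevity), not any news organization's actual system.

```python
def sports_brief(home, away, home_score, away_score):
    """Fill a sentence template from structured game data (assumes no tie)."""
    if home_score > away_score:
        winner, loser = home, away
    else:
        winner, loser = away, home
    hi, lo = max(home_score, away_score), min(home_score, away_score)
    return f"{winner} defeated {loser} {hi} to {lo}."

print(sports_brief("Lakers", "Celtics", 102, 98))
# prints "Lakers defeated Celtics 102 to 98."
```

Production systems layer many such templates and add variation, but the core idea is the same: deterministic text assembly from structured inputs.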
At the conference, Adam Marcus, co-founder and CTO at B12, described how the New York company's AI platform helps in the web design process. Unlike the music-generating examples, his company's AI algorithms work collaboratively with human designers. The models handle the engineering work, generating the code for a basic layout of a business's website. Human designers then go in and customize the page to the individual business owner's preferences.
Marcus said this is probably going to be the easiest way to integrate creative AI into the business, at least for the time being, because it takes advantage of both machines' and humans' strengths.
While bots are good at repetitive, detail-oriented work like building out a website framework, humans excel at understanding client requests and implementing subtle changes. "It's not until you combine the best properties of these two different forces that you can unlock the full potential of AI," Marcus said.