Have you ever had a song stuck in your head without knowing where you heard the melody? Or maybe you don't know the lyrics because they aren't in your language? Well, a new artificial intelligence model, developed in a study by Osaka University and Google, might be able to help you in the future.
As reported by MusicRadar, researchers from Google and Osaka University have made an intriguing advance: creating music from brain activity alone.
In a study described in "Brain2Music: Reconstructing Music from Human Brain Activity," participants listened to music samples from ten different genres, including rock, classical, metal, hip-hop, pop, and jazz, while researchers used functional MRI (fMRI) scans to measure their brain activity. An fMRI measures small changes in blood flow that occur over time with brain activity, in contrast to standard MRI readings, which only show anatomical structure. Using these readings, a deep neural network was trained to discover patterns of brain activity linked to musical elements such as mood, genre, and instrumentation.
In addition, the researchers incorporated MusicLM, another Google AI system, into their investigation. MusicLM generates music in any genre from text descriptions, taking into account elements such as instrumentation, rhythm, and emotion.
The AI model combined the fMRI data with the MusicLM database to reconstruct the music the test subjects had been listening to, based on their brain activity. Examples provided by the team show how Brain2Music translated the participants' brainwaves into musical fragments that sound remarkably similar to the originals.
Although AI isn't quite ready to record meticulously orchestrated tunes, that time may not be too far off. The potential uses for Brain2Music are numerous and fascinating, though according to the researchers, "the decoding technology described in the study is unlikely to become practical in the near future."
As AI develops, so does its potential to change how music is created. In time, composers may be able to connect to an AI model, think of one tune or several, and have that model, or an even more sophisticated successor, automatically generate the sheet music.