Mind Your Music

Musicians and computer engineering seniors Rachel Goldstein and Andy Vainauskas have designed a device that lets people held back by a lack of training or by disability make music using their own brain activity. Pairing EEG technology with music improvisation software, they have created Mind Music, which interprets biofeedback and translates it into music.

Computer engineering seniors Rachel Goldstein and Andy Vainauskas are musical: she plays cello, he plays violin. In addition to their rigorous engineering coursework, both have been members of SCU’s orchestra, and Andy has also sung with the chamber singers. Over the years, their ability to make music helped them manage the stress of university life and gave them an outlet for their creativity. So, for their senior capstone project, the pair is bringing that capability to non-musicians through Mind Music, a tool that uses EEG technology and a musical improviser system to let users create music with their own brain activity.

Advised by computer engineering Assistant Professor Maya Ackerman, a leading researcher in artificial intelligence and machine learning and an opera singer, the pair is also working with Robert Keller, professor of computer science at Harvey Mudd College and an accomplished jazz musician. “Dr. Keller created software that allows users to play the piano and improvise with the computer program. Instead of a piano, our system uses a Muse headband, a device currently on the market that provides EEG biofeedback to enhance meditation,” said Andy. “A big milestone in our project was connecting the Muse as hardware to Dr. Keller’s software,” Rachel explained. “It’s been a huge undertaking to understand how everything works together. First, we had to get the Muse talking to the software; then figure out how the biofeedback fits into the actual code, which module is listening and which is interpreting the data; and finally read those values and generate music.”
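
The hand-off Rachel describes, one module listening for the biofeedback, another interpreting it, and a third generating music, can be pictured in a short Python sketch. This is a minimal illustration, not the project’s code: it assumes the headband streams band-power values over OSC, as the original Muse SDK did, and the OSC address and port shown are placeholders.

    # Sketch of the listen -> interpret -> generate chain (all names illustrative).
    # Assumes a Muse-style device broadcasting band power over OSC (python-osc).
    from pythonosc import dispatcher, osc_server

    def generate_note(alpha):
        # Stand-in for the improviser software; it would pick notes from a grammar.
        print(f"alpha={alpha:.2f} -> choose a note")

    def on_alpha(address, *values):
        # The "listening" module: average the per-sensor readings...
        alpha = sum(values) / len(values)
        # ...then hand the interpreted value to the music generator.
        generate_note(alpha)

    disp = dispatcher.Dispatcher()
    disp.map("/muse/elements/alpha_relative", on_alpha)  # address is an assumption
    osc_server.BlockingOSCUDPServer(("127.0.0.1", 5000), disp).serve_forever()

In the students’ real system, the final call would feed Dr. Keller’s improviser rather than a print statement.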

Not surprisingly, brain data is difficult to interpret. To determine whether their system was working, Rachel and Andy first ran tests on something more familiar. Using an accelerometer to monitor head movement, they tested the software by assigning quarter notes and eighth notes to left and right head tilts. Once satisfied with those results, they began extracting actual brainwave data from the Muse outputs and worked out how to sort the differing brainwave signal values into categories. Those categories could then be used to build a grammar, a set of rules telling the software what to do in different situations. “We had to decide how we want music generated based on the specific type of brain activity, assigning shorter, faster notes to certain values, and longer, slower notes to others,” Rachel said.
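
That categorization step amounts to binning a continuous brainwave value into rhythmic classes the grammar can act on. Here is a minimal Python sketch of the idea; the thresholds, category names, and beat values are illustrative assumptions, not the team’s actual rules.

    # Sketch: bin a normalized band-power reading (0..1) into a rhythmic category.
    DURATION_BEATS = {"eighth": 0.5, "quarter": 1.0, "half": 2.0}

    def categorize(band_power):
        if band_power > 0.66:
            return "eighth"   # high activity -> shorter, faster notes
        elif band_power > 0.33:
            return "quarter"
        return "half"         # low activity -> longer, slower notes

    for reading in (0.9, 0.5, 0.1):  # simulated Muse outputs
        category = categorize(reading)
        print(f"{reading} -> {category} note ({DURATION_BEATS[category]} beats)")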

And that’s where the fun comes in. “Now, we’re taking the brain data and being more creative,” said Andy. “We have lots of choices in how we interpret brain activity. We’re looking at fun ways to interpret the biofeedback and translate it into music.” Catching his excitement, Rachel added, “There are so many different qualities in music you can change: dynamics, volume, tempo, pitch, high or low register. But ideally, when the user puts on the Muse headband, there should be a connection between their perceived mental state and the music that is being produced. If they try to calm down, they should hear that reflected.”
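
One way to keep that connection honest is to derive several musical qualities from a single estimate of the user’s state. The Python sketch below guesses at such a mapping; the “calmness” measure and the parameter ranges are invented for illustration, not the team’s design.

    # Sketch: map a calmness estimate (0 = agitated, 1 = calm) to musical settings.
    def musical_params(calmness):
        tempo = int(140 - calmness * 80)         # calmer -> slower (60-140 bpm)
        velocity = int(100 - calmness * 60)      # calmer -> quieter dynamics
        register = 48 if calmness > 0.5 else 72  # calmer -> lower MIDI register
        return {"tempo_bpm": tempo, "velocity": velocity, "base_pitch": register}

    print(musical_params(0.8))  # relaxed user: slow, soft, low register
    print(musical_params(0.2))  # agitated user: fast, loud, high register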

Rachel and Andy imagine their system being a boon for people currently restricted from producing music due to disability or lack of training. “I just envision someone coming home and creating a piece that reflects their day, celebrating good news or helping them cope with stress or anger or sadness. It’s what you do with an instrument,” said Andy.

 
