Features | by Jack Needham | 20.08.17

The Man-Machine: How bio-hacking can change the future of music

Could the future of music be body modifications that change the way we interact with instruments and technology, or even a new realm of sound opened up with specialist hearing implants? Jack Needham investigates how bio-hacking is not only changing the way we create music, but the way we perceive it.

Transhumanism is the belief that humans can reconfigure their genetic boundaries through science and technology. It’s a term that encompasses body-hacking, body modification, prosthetics and wearable tech, a way to determine your own evolutionary path or advance your own human abilities. The means of doing so are becoming more accessible by the day: cyborgs are no longer visions of our future, they exist in our present.

After losing his right eye, filmmaker Rob Spence replaced it with a camera and became the ‘Eyeborg’. For under £200 you can magnetize your fingertips in an afternoon. But it’s within the realms of music and art that some of the greatest strides towards a bionic future are being realized.

“Taking ownership of your own body and making it do what you want is inherent in the human species,” says Trevor Goodman, the founder of body-hacking convention BDYHAX. “Cultures across the globe have been tattooing and piercing themselves for longer than we have recorded history. We use glasses to bring our vision to ‘normal’ levels and we drink caffeine to make us feel alert, so technology is in and of itself an expression of our humanity.”

Neil Harbisson was born colorblind, but now he experiences color in ways the rest of us cannot – through an antenna surgically attached to the back of his skull that dangles over his forehead. “I always used the piano to express my feelings about color, so I was interested to see if technology could also allow me to feel color and express my feelings about color through music,” says the cyborg musician. “I looked at nature to see what body parts I could add to my body and I decided that an antenna would be the best as it allows me to sense in 360 degrees. At first, the aim was to create a third eye in the middle of my forehead, but this way I can look at you and sense the colors behind me.”

A classical pianist from a young age, Harbisson was initially drawn to music over bionics, but through studying music composition at the Dartington College of Arts in the early 2000s, the two became intertwined. In 2004, the process of building his antenna took over his attention, and Harbisson failed his degree after submitting his final project in the form of a coloring book. “Each color I drew on the paper was a musical chord,” says Harbisson. “That, for me, was music.”


“The antenna is an ear that allows me to sense the vibrations of color through my skull” Neil Harbisson

Today, Harbisson can ‘listen’ to priceless artworks, and he chooses his outfit each day by how it sounds. Turquoise, purple and orange, for example, comprise Harbisson’s chosen funeral attire, which plays in C major, the same key as Led Zeppelin’s ‘Stairway To Heaven’, Blink 182’s ‘All The Small Things’ and Carly Simon’s ‘You’re So Vain’. Full English breakfasts are an F or G sharp, and to Harbisson, Nicole Kidman’s facial frequencies are similar to those of Prince Charles. “The antenna is an ear that allows me to sense the vibrations of color through bone conduction in my skull,” he explains. “There’s a chip inside the bone that vibrates depending on the light frequencies. These vibrations become sounds to my inner ear and allow me to hear colors across the color spectrum, from infrared to ultraviolet.”
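Harbisson’s sonochromatic scale is his own, but the underlying idea of turning light frequencies into audible ones can be sketched in a few lines. The octave-halving rule below is an illustrative assumption, not his actual mapping:

```python
AUDIBLE_MIN, AUDIBLE_MAX = 20.0, 20_000.0  # human hearing range in Hz

def light_to_sound(light_hz: float) -> float:
    """Transpose a light frequency down by whole octaves (repeated
    halving) until it lands inside the audible band."""
    f = light_hz
    while f > AUDIBLE_MAX:
        f /= 2.0  # dropping an octave changes the register, not the 'note'
    return f

# Red light sits around 430 THz; transposed down roughly 35 octaves
# it becomes a high but audible tone.
red_tone = light_to_sound(430e12)
```

Because octave relationships are preserved, two colors whose light frequencies are in a 2:1 ratio would map to the same pitch class, which is one plausible way a color could read as “a musical chord”.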

Harbisson no longer makes music in the traditional sense, however – now he’s just the vessel through which it travels. “It’s not that I create music; my reality has become my music,” he explains. “The art came when I created the organ, in a sense. It transformed my reality into music, so I no longer need to compose music in the traditional sense, I can compose it by looking at things.”

Harbisson is not the first to take his post-art aural experiments to the public. ‘Music for Solo Performer’ was sound artist Alvin Lucier’s 1965 collaboration with scientist Edmond Dewan. Wearing an electrode headset, Lucier transmitted alpha waves through a brainwave amplifier, split into channels and routed through a loudspeaker which sent vibrations to a series of percussive instruments. Over the past five years, however, similar wearable, or insertable, tech has become increasingly popular. In 2011, performance artist Dani Ploeger premiered the ‘ELECTRODE’, an anally-inserted electrode that connects to a sensor and transforms the movements of his sphincter muscles into audible frequencies.

If interpreting music through a suppository is not to your liking, San Francisco-based company Doppler Labs offers a more practical solution. The Here One in-ear headphones allow the user to filter, add reverb to, or alter the volume of the outside world in real-time, all controlled through a free phone app. Sat next to a crying baby on a long haul flight? Pinpoint the frequency of the child’s wailing and use the built-in EQ to block it out. For music audiences, this could give them the opportunity to shape the sound of a live performance to enhance their enjoyment.
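Doppler Labs’ signal processing is proprietary, but the “pinpoint a frequency and block it out” idea corresponds to a standard notch filter. A minimal sketch using textbook biquad coefficients (the 3kHz centre frequency, 48kHz sample rate and Q value here are arbitrary choices for illustration):

```python
import math
import cmath

def notch_coefficients(center_hz: float, sample_rate: float, q: float = 30.0):
    """Biquad notch filter (RBJ audio-EQ-cookbook form): near-unity gain
    everywhere except a narrow band around center_hz, which it cancels."""
    w0 = 2 * math.pi * center_hz / sample_rate
    alpha = math.sin(w0) / (2 * q)
    a0 = 1 + alpha
    b = [1 / a0, -2 * math.cos(w0) / a0, 1 / a0]          # feedforward
    a = [1.0, -2 * math.cos(w0) / a0, (1 - alpha) / a0]   # feedback
    return b, a

def magnitude_at(b, a, freq_hz: float, sample_rate: float) -> float:
    """Evaluate the filter's gain |H(e^{jw})| at a given frequency."""
    z = cmath.exp(-1j * 2 * math.pi * freq_hz / sample_rate)
    num = b[0] + b[1] * z + b[2] * z * z
    den = a[0] + a[1] * z + a[2] * z * z
    return abs(num / den)

# Notch out a hypothetical crying-baby fundamental around 3 kHz.
b, a = notch_coefficients(3000.0, 48_000.0)
```

The gain is essentially zero at the notch frequency and close to one elsewhere, which is what lets a narrow annoyance be removed without muffling the rest of the cabin.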


“Here One is not for every environment, but this is just one more altering tool that could make it better for you” Noah Kraft, Doppler Labs

“We set out to make sure that the art is maintained and people embrace what we’re doing, asking, ‘is what we’re doing actually benefitting the music, or hurting it?’” says Doppler Labs co-founder Noah Kraft. “Every single live experience is going to be different, but nobody likes a show where it’s so loud you’re not enjoying it, so giving personalized control to the people makes them a part of that musical experience.”

The Here One is not perfect for every performance or every artist. Take Mogwai, whose eardrum-shattering live sets keep otologists awake at night. Would a way to lower their volume by a few dozen decibels be a creative hijack? “There are scenarios where the Here One isn’t appropriate, but I don’t think what you just described is one of them,” says Kraft when I ask him the same question. Kraft speaks in analogies too, which is pretty handy when trying to explain both the future of live music and human biology in the same sentence. “A chef puts a plate on the table, but do we put salt on every person’s meal? No, because we all taste differently and we all hear the world differently. We’re the first to say the Here One is not for every environment, but this is just one more altering tool that could make it better for you.”

While the Here One can flourish in a live environment – Coachella is one of Doppler Labs’ longstanding partners – Harbisson’s performances haven’t been met with the same reaction. Through the antenna’s built-in wi-fi link, Harbisson can connect to NASA’s live video feed from the International Space Station, becoming the conductor of an ultraviolet orchestra. Three shows in Birmingham, Barcelona and Toronto respectively have been performed so far, with varying degrees of success. “In Birmingham, the entire audience left and I began to feel dizzy too, so I stopped. In Barcelona, a guy over-stimulated his brain and had a seizure, and in Toronto someone simply shouted from the audience, ‘You guys suck!’ I’m just the medium, however. I’m transmitting what I’m sensing to an audience, so it doesn’t really affect me when people criticize. They’re criticizing the frequencies of space, not me,” he adds.

“Technology that physically extends us is at its most interesting when in the creative realm,” says product designer Dani Clode. For her 2017 Masters degree project, Clode created the Third Thumb, a 3D printed extra thumb controlled by pressure sensors in your shoes that connect to the thumb via Bluetooth. The thumb can contract, extend, grab and be used as a pint-holding device or an extra pinky for playing guitar. Right now the thumb only exists as a prototype, yet Clode’s goal is for it to “shift the focus from ‘fixing’ disability, to extending ability”. “Within art and music, when it comes to physical interaction, products are so open for interpretation and interaction. There is no set way to play a guitar, the only limit is the number of fingers or toes we have, so surely it only gets more interesting if that’s to change?”
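Clode hasn’t published the prototype’s control code, but the pressure-sensor-to-thumb link she describes can be sketched as a simple mapping. The function name, dead zones and thresholds below are hypothetical:

```python
def pressure_to_flex(pressure: float, p_min: float = 0.05, p_max: float = 0.95) -> float:
    """Map a normalised toe-pressure reading (0..1) to a thumb flex
    amount (0 = fully extended, 1 = fully contracted).

    Readings below p_min are treated as no input and readings above
    p_max as a full grip, so sensor noise at the extremes is ignored."""
    if pressure <= p_min:
        return 0.0
    if pressure >= p_max:
        return 1.0
    return (pressure - p_min) / (p_max - p_min)
```

In a real device this value would be streamed over Bluetooth and fed to the servo that drives the thumb; the dead zones are the kind of detail that makes a wearable feel controllable rather than twitchy.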

One place where Clode’s concept of ‘interpretation and interaction’ is being investigated is with the mi.mu glove. Devised by musician Imogen Heap as a way to form a “better relationship with the music software and hardware that forms her musical toolbox,” the mi.mu gloves map the flexes and gestures of your hand to wirelessly control the effects within your DAW. A fist could be a drum hit, an extended index finger a synth note, or the flick of a wrist a filter. “We believe gloves make electronic music performance more – rather than less – human,” says mi.mu researcher Adam Stark. “Compare pressing a button or moving a slider to the elegance of a hand gesture or the directness of making a ‘fist’ – we are capturing human body movement and harnessing its expressive power. In a more practical sense, the gloves have open fingers and palms, allowing musicians to continue to play piano, guitar or clap their hands.”
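The gesture vocabulary of the mi.mu gloves is configurable per performer rather than fixed, so the following is only a toy sketch of the mapping idea Stark describes, with made-up thresholds, gesture names and note numbers:

```python
FIST_THRESHOLD = 0.8  # all fingers flexed beyond this reads as a fist

def classify(finger_flex):
    """Classify five flex-sensor readings (0..1, thumb first) as a gesture."""
    if all(f > FIST_THRESHOLD for f in finger_flex):
        return "fist"
    # Index finger extended while the remaining fingers are curled.
    if finger_flex[1] < 0.2 and all(f > FIST_THRESHOLD for f in finger_flex[2:]):
        return "point"
    return "open"

# Each gesture drives a different MIDI-style message into the DAW.
GESTURE_TO_MESSAGE = {
    "fist": ("note_on", 36),   # e.g. trigger a drum hit
    "point": ("note_on", 60),  # e.g. play a synth note
    "open": ("cc", 74),        # e.g. sweep a filter cutoff
}

def to_message(finger_flex):
    return GESTURE_TO_MESSAGE[classify(finger_flex)]
```

The interesting design work sits in the `classify` step: thresholds loose enough to survive stage sweat and fatigue, but tight enough that an expressive wave doesn’t misfire as a drum hit.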

“Using the Third Thumb has definitely altered the way I view my hand” Dani Clode

“My background is as a performer so I have my biases about instruments, but everyone makes a pair of gloves and thinks it’s the most amazing thing, and usually it is not,” says Joseph Malloch, an instrument designer and researcher with the Graphics and Experiential Media laboratory at Dalhousie University. His works include a digital instrument called the T-Stick and a prosthetic spine that twists, turns and bends to create sound. “In my opinion, the idea of having sensored gloves is better than actually having sensored gloves, because we interact with objects in the world and we’re really good at it. If you have something you can bend or squeeze then you can do much more with it.”

Kraft’s work with the Here One began as a means to control music, but the greatest and most important use of his technology lies in combatting hearing loss. “I definitely came at this from a music perspective, but that’s evolved with the notion that if we can get the technology right then there’s a lot more we can do with this platform,” explains Kraft. The Here One has been available to the public for six months, in that time taking on a life of its own. “The users are the people who are most important. We now have this feedback and we can make real time changes.”

That has led to a “democratizing and de-stigmatizing of in-ear tech,” in Kraft’s words. “The average hearing aid is $3000 (£2300), which is absurd, and it’s one of the most stigmatized products on the planet. We’re all wearing headphones far too much and we’re all going to be deaf when we’re older, so you should not be ashamed of the fact that you’re hard of hearing. Nobody should be afraid to hear, so we think of our work as providing abilities, not focusing on disabilities.”

Clode’s Third Thumb goes further in democratizing prosthetics and wearable tech. After research and testing costs, the final Third Thumb prototype was completed for under £250. “Using the Third Thumb has definitely altered the way I view my hand,” she says. Similar to how some recent amputees experience the ‘phantom limb’, Clode’s thumb has begun to work in reverse. “My body has started to associate it as a limb the more I wear it. I miss it when I take it off now too, so I think there is a certain element of body integration happening.”

“If creativity is based on our life experiences, then the more we can sense and feel the more we can experience,” adds Harbisson. It stands to reason that the next big breakthrough could explore a completely new realm of sound, an area that currently can’t be heard by 99.9% of the population. “In music, we hear a very limited amount of sound because the human audio spectrum only ranges from 20 to 20,000Hz, but sound goes beyond our senses. If we had the ability to hear infrasound then we could create new genres that would only be perceptible to those who had that extra sense. We’ll have more options for expression, and that opens up a whole new range of possibilities.”

Jack Needham is on Twitter



