Features | 01.12.13

The human touch: five futuristic controllers reconnecting musicians with machines


Our recent live music roundtable underlined the importance of ‘humanity’ in electronic music performances.

Despite appearances, it’s not quite ‘rise of the machines’ out there just yet. While plenty of people are content with live shows that are by and large pre-programmed and leave little room for error, making them more akin to live dubbing or DJ sets than live music per se, our conversation with Scanner, debruit, Comfort Fit and Archie Pelago underlined the importance of human error and control in modern music performance. And what better way to bring this to audiences – both sonically and visually – than with new types of controllers and instruments?

There are plenty of controllers out there that allow musicians to control laptops and hardware in the studio and on stage, though some of the most interesting progress in their evolution is coming from the fringes. A perfect example is the monome, a custom-built controller that has grown over the past decade to encompass a worldwide community of users and creators; perhaps its most famous exponent is Daedelus, who uses it in his live shows.

During our roundtable chat, debruit gave us the example of a video game guitar that he and his band had customised to act as a MIDI device, taking the music he’d written in the studio to the stage in a way that was easy to visualise and understand while remaining different to a standard MIDI guitar. It’s not so much that instruments and controllers should be ‘new’ as that they should be fun and intuitive to use, and engaging to see and hear in action.

Over the following pages we take a look at five of these new instruments, developed over the past few years to let musicians take more control of their music on stage. Many of them also make the most of the mobile and touchscreen technologies that have become increasingly ubiquitous in our lives, making them accessible to more musicians than ever. So if you’re looking for inspiration to unlock your creativity in the studio and get your music heard and seen in new ways, this should be a good place to start.


Beat Surfing


Beat Surfing is described by its creators as an organic MIDI controller builder. It’s available as an iPad app that lets you draw three-dimensional controllers which you can then use via the iPad’s touchscreen interface to control any MIDI-enabled device, be it software, hardware or even certain other apps.

While this isn’t necessarily a new idea, Beat Surfing improves on it by making movement the key component of the controller. Where most touchscreen controllers rely on tapping, Beat Surfing works best when you slide across the iPad’s surface, hence the name. The interactions this creates between instrument and controller are like nothing previously possible with physical hardware.
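For the more technically minded, here’s a rough sketch of the sliding idea in code: a few ‘drawn’ shapes laid out across a touch surface, each triggering a MIDI note as a sliding finger passes over it, rather than waiting to be tapped. This is purely illustrative and assumes the mido Python library; it isn’t Beat Surfing’s actual code, and the shape layout and notes are made up.

```python
# Illustrative sketch only (not Beat Surfing's actual code): sliding a finger
# across drawn shapes triggers MIDI notes, instead of tapping discrete pads.
import mido

# Each hypothetical "drawn" shape is a horizontal span plus a MIDI note.
SHAPES = [
    {"x_range": (0.0, 0.25), "note": 60},   # C4
    {"x_range": (0.25, 0.5), "note": 62},   # D4
    {"x_range": (0.5, 0.75), "note": 64},   # E4
    {"x_range": (0.75, 1.0), "note": 67},   # G4
]

def notes_for_slide(x_positions):
    """Emit a note_on each time the sliding finger enters a new shape."""
    messages, current = [], None
    for x in x_positions:
        for shape in SHAPES:
            lo, hi = shape["x_range"]
            if lo <= x < hi and shape is not current:
                messages.append(mido.Message('note_on', note=shape["note"], velocity=100))
                current = shape
    return messages

# A single left-to-right slide crosses all four shapes and plays C, D, E, G.
for msg in notes_for_slide([0.1, 0.3, 0.6, 0.9]):
    print(msg)
```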

The app is one of the most interesting of its kind in recent years, not least because of how it’s used by its creators, Belgian duo Herrmutt Lobby, who built the app with Yaniv De Ridder and have in the past worked with Cupp Cave, among others. Herrmutt Lobby work fully outside of the grid (like Japanese producer Jealousguy, who we profiled this summer), meaning their music – both recorded and live – isn’t locked to any sort of clock or counter. They designed the app originally as an extension of various controllers they’d been developing for years to allow them to fulfil their own musical ambitions on stage. For those seeking a way to get a real human feel into their music, Beat Surfing is a great place to start.

Herrmutt plays a Cupp Cave Beat Surfing ‘scene’:

Recent live show including Beat Surfing and other controllers:

 


Beat Jazz


Onyx Ashanti is an American musician who has travelled the world, notably spending time in London in the early 2000s, where he played alongside Basement Jaxx and Soul II Soul. He later began developing a custom-made instrument and controller called Beat Jazz, which ended up landing him a TED talk. The project has since been renamed ‘exo-voice’ and made open source.

Onyx explains the latest evolution of his invention as “an integrated sonic exploration construct – fully custom software and 3D-printed hardware – whose internal architecture is based on fractal logic.” In essence, anyone can now take Onyx’s open-source software and hardware designs and build their own version using 3D printing technology.

While Beat Jazz/exo-voice is arguably a lot more complex than other new instruments, the project’s evolution and its creator’s desire to make it so open are interesting examples of alternative ways to evolve instruments and live performances. As the video below shows, it certainly beats watching someone twist knobs on an MPD.

2011 Beat Jazz presentation:

 


AUUG Motion Synth


AUUG Motion Synth is a brand new project that aims to make iPhone and iPod touch music apps more interesting to use in a live or studio setting. Started by Joshua Young, the AUUG Motion Synth app allows you to control other audio apps and MIDI-enabled hardware remotely while leaving you free to move, instead of being stuck with both hands on your phone or iPod.

It comes with a grip that’s easily attached and has button windows to trigger keys on the screen. The app uses the phone’s motion sensors to convert touch and movement into control signals, which it sends to other iOS music apps or external MIDI-enabled devices. It also allows for control of background visuals, a nice touch for live performances. As you devise ways of controlling sound through motion, you can save these mappings as presets.
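To make the motion-to-MIDI idea concrete, here’s a minimal sketch, again assuming the mido Python library, of how a tilt reading from a phone’s motion sensors might be turned into a MIDI control-change message. The mapping, the tilt value and the choice of CC number are assumptions for illustration, not AUUG’s actual implementation.

```python
# Illustrative sketch (not AUUG's actual code): mapping a hypothetical tilt
# reading from a phone's motion sensors to a MIDI control-change message.
import mido

def tilt_to_cc(tilt_degrees, cc_number=1, channel=0):
    """Map a tilt angle in [-90, 90] degrees to a MIDI CC value in [0, 127]."""
    clamped = max(-90.0, min(90.0, tilt_degrees))
    value = int(round((clamped + 90.0) / 180.0 * 127))
    return mido.Message('control_change', control=cc_number,
                        value=value, channel=channel)

# Example: a phone tilted 30 degrees forward becomes CC1 (mod wheel) = 85.
print(tilt_to_cc(30.0))
```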

The AUUG Motion Synth is still in its early stages and currently seeking funding via Kickstarter, but it already has an online community where people can share presets, ideas and tips. As far as portable app controllers go, this is one of the more interesting and easy-to-use ideas out there, and in a way not dissimilar to what Onyx Ashanti has been going for with his own device. Considering how many producers have iPhones, this could easily become a normal addition to a portable studio and live setup.

Watch the controller in action:

 


Polyplayground


Polyplayground is an iPad app designed by producer and instrument builder Mike Gao. Based in L.A., Mike has been a part of the local beat-focused scene for years, going back to its turntablist roots. He made an early foray into app-based instruments back in 2010 with a beatbox-to-MIDI converter called Vocal Beater, an app that lets you beatbox a drum pattern into your phone and email the corresponding MIDI information to yourself for use with a DAW or hardware.
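As a rough illustration of that beatbox-to-MIDI idea, the sketch below turns a hypothetical list of detected beatbox onsets into a General MIDI drum pattern saved as a .mid file, again using the mido Python library. The onset times, drum classifications and file name are invented for the example; this is not Vocal Beater’s actual code, which would also need to do the hard part of detecting and classifying the hits in the first place.

```python
# Illustrative sketch (not Vocal Beater's actual code): writing a list of
# already-detected beatbox onsets out as a General MIDI drum pattern.
import mido

onsets = [(0.0, 'kick'), (0.5, 'snare'), (1.0, 'kick'), (1.5, 'snare')]
GM_DRUMS = {'kick': 36, 'snare': 38}     # General MIDI drum note numbers
TEMPO = mido.bpm2tempo(120)              # microseconds per beat at 120 BPM

mid = mido.MidiFile(ticks_per_beat=480)
track = mido.MidiTrack()
mid.tracks.append(track)

last_tick = 0
for seconds, hit in onsets:
    tick = int(mido.second2tick(seconds, mid.ticks_per_beat, TEMPO))
    note = GM_DRUMS[hit]
    # Delta time from the previous event; channel 9 is the GM drum channel.
    track.append(mido.Message('note_on', note=note, velocity=100,
                              time=tick - last_tick, channel=9))
    track.append(mido.Message('note_off', note=note, velocity=0,
                              time=60, channel=9))
    last_tick = tick + 60

mid.save('beatbox_pattern.mid')
```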

With the arrival of the iPad, Mike designed Polyplayground to take full advantage of bigger control screens. Much like Herrmutt Lobby with Beat Surfing, Mike created Polyplayground first and foremost as something he could use himself (it has appeared on every one of his recent releases). The app allows you to write and improvise chords and melodies in a simpler, more intuitive way. It has an onboard synth and can also be synced to your favourite synth via MIDI. Scales are represented by colour regions, making them easier to remember, while progressions can be memorised via a system based on Tetris-like shapes. Harmonic relationships can be mastered through colours and shapes, and you can record chords and play them back to yourself while playing over the top, or even use the app to see what others are playing, allowing you to improvise without getting lost. The iPad’s tilt sensor is also used to control parameters, like a mod wheel on a synth.

Basically, if you’ve ever been half decent at Tetris, Polyplayground allows you to take that ‘skill’ and apply it to playing chords and melodies. And considering the convergence of electronic music and video games over the past couple of decades, Mike’s approach feels not just logical but obvious. His use of the app in his live shows hints at its great potential for visual display and interaction with the crowd, just like any traditional instrument.

Polyplayground demonstration:

 


SIGMA


The last project is perhaps the most ambitious. Created by a team of Japanese musicians and interactive designers, with design by Funktronic Labs and beatboxer Ryo Fujimoto, aka Humanelectro, SIGMA is very much a futuristic ideal of what instruments could become.

The device comes in the shape of a pair of gloves fitted with sensors that track heart rate, muscle movement and finger positioning. Movement and information from the sensors is turned into data, which is then converted into audio and visual output.

The core technology behind SIGMA at the moment is the Leap Motion controller, a new type of user interface that enables very precise tracking of human motion. This technology is at the heart of various new projects opening up possibilities for controlling applications via touch and gestures.

As the video below shows, SIGMA isn’t yet aimed at being a consumer device. Certainly the way its creators talk about it implies that they’re most interested in seeing how far they can go with it in a live setting that’s halfway between a music show and an art installation. There’s little doubt, though, that their progress, and that of others using the technology, will at some point filter down into simpler apps that everyone can use.

SIGMA creators introduce their idea:

 
