Sole to soul – foot tapping to a new beat

Sole-mounted sensors, linked to a central interface and classification software, could help walkers create flexible music as they stroll.

OSLO, NORWAY (RECENT) (REUTERS) – Norwegian researchers have developed a prototype wearable music sole whose algorithms let listeners change the speed of the music they are hearing simply by walking.

Kristian Nymoen, Associate Professor in Music Technology at the University of Oslo (UiO), spent two years developing devices that create interactive, flexible music controllable by the listener.

The sole contains embedded force-sensing resistors which control the music through a general-purpose sensor interface. A gesture recognition algorithm enables and disables various control possibilities and also selects which part of the song to play. A central interface box is strapped to the user’s leg.

“Here there is a sole with three or four sensors and a central interface that sends the information from these sensors to the computer,” explained Nymoen. “The type of gait that I have, [whether] I’m walking or tapping the front or the back of my foot, is classified with a machine learning algorithm, and this classification is [used] to determine which part of the music is currently playing.”
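In outline, that classification step could look something like the sketch below: windows of force-sensing-resistor readings are reduced to a few summary features and matched against per-gesture templates. This is a minimal illustration assuming three sensor channels (heel, ball, toe) and a nearest-centroid classifier; Nymoen’s actual features and learning algorithm are not described in detail.

    import numpy as np

    # Gesture labels are illustrative; the real system's classes are not public.
    LABELS = ["walking", "front_tap", "heel_tap"]

    def features(window):
        """Reduce a window of FSR samples, shape (n_samples, 3) with
        columns heel/ball/toe pressure, to per-sensor mean pressure
        plus an overall activity measure (std of total pressure)."""
        total = window.sum(axis=1)
        return np.concatenate([window.mean(axis=0), [total.std()]])

    class NearestCentroid:
        """Tiny stand-in for the machine-learning step: store the mean
        feature vector of each gesture class, predict the closest one."""
        def fit(self, X, y):
            self.classes_ = sorted(set(y))
            self.centroids_ = np.array(
                [X[np.array(y) == c].mean(axis=0) for c in self.classes_])
            return self

        def predict(self, x):
            dists = np.linalg.norm(self.centroids_ - x, axis=1)
            return self.classes_[int(np.argmin(dists))]

    # Each predicted class would then select a different part of the song,
    # e.g. {"walking": "verse", "front_tap": "chorus", "heel_tap": "bridge"}.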

He explained: “In the different parts of the music I influence different aspects of the music itself, so if I’m walking I can let the walking tempo detect (determine) the speed of the music, and if I’m tapping the front of my foot it will change to another pattern, and the activity level of say one of the instruments is changing, and the different filters [are] influenced by another pattern.”
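One plausible way to realise that tempo coupling is to time successive foot strikes, convert them into a cadence, and steer the player’s speed toward it. The sketch below is an assumption about how such a mapping could work; the 120 BPM reference tempo and the smoothing factor are invented for illustration.

    def step_bpm(strike_times):
        """Estimate walking cadence in steps per minute from a list of
        foot-strike timestamps (seconds)."""
        if len(strike_times) < 2:
            return None
        intervals = [b - a for a, b in zip(strike_times, strike_times[1:])]
        return 60.0 / (sum(intervals) / len(intervals))

    def playback_rate(cadence_bpm, prev_rate=1.0, song_bpm=120.0, smooth=0.2):
        """Nudge the playback-speed multiplier toward cadence/song tempo,
        smoothed so the music does not jump on every irregular step."""
        target = cadence_bpm / song_bpm
        return prev_rate + smooth * (target - prev_rate)

    # Strikes every 0.45 s -> ~133 steps/min -> rate drifts toward ~1.11x.
    print(playback_rate(step_bpm([0.0, 0.45, 0.90, 1.35])))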

Nymoen says that along the axis between active performers and passive listeners, there is room for listeners to participate in how music is made and heard. One option is for the listener to input low-level control actions directly to the system. Another is indirect control, through higher-level features extracted from sensor data that gauge a user’s mood or energy level.
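As an illustration of that indirect route, a stream of motion samples could be reduced to a single ‘energy’ score that drives a musical parameter, rather than mapping individual steps directly. The accelerometer feature and cut-off values below are invented for the example, not taken from Nymoen’s system.

    import math

    def energy_level(accel_samples):
        """Crude activity estimate: RMS deviation of acceleration
        magnitude from gravity (9.81 m/s^2) over (x, y, z) samples."""
        mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples]
        return math.sqrt(sum((m - 9.81) ** 2 for m in mags) / len(mags))

    def intensity_bucket(energy):
        """Map the continuous score to coarse musical intensity levels;
        the thresholds are illustrative, not calibrated."""
        if energy < 1.0:
            return "calm"
        if energy < 4.0:
            return "moderate"
        return "energetic"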

“It could be used for exercise, say if you’re out running and you’re sort of slowing down maybe you want a sensor system to detect that and to push you to run a little bit faster,” said Nymoen. “Just for the fun of it while dancing to music maybe you want to influence the music that you listen to, so if you dance more energetically then maybe the music will reflect that somehow.”

Working in collaboration with Professor Jim Torresen, of the UiO Department of Informatics, Nymoen believes his EU-funded work could open up a whole new market in which audio producers make flexible music rather than pre-recorded songs. Torresen agrees.

Although current technology would probably introduce network lag between users, Nymoen says that before long it could be possible for two users on opposite sides of the world to create interactive music together.

“I see multiple approaches,” said Nymoen. “One is to actually create a sole with these sensors implanted that would connect to your smartphone and maybe a custom-made app where you would have flexible music composed that you would influence. Or it could be all the sensors embedded into your phone because of all the sensors that are there already.”

Nymoen and Torresen, together with researcher Hakon Knutzen, have also released an iOS app called PheroMusic, which lets users listen to music and control it by moving their smartphone as they choose between five different musical scenarios.

Demonstrating the concept to Reuters, Torresen said the algorithm used in PheroMusic was inspired by ants. “By tilting it (the phone) you will make a change in this kind of song you are hearing, similar to our sandal example. The kind of interesting thing here is that you can yourself be tapping what you would like to listen to but then by an algorithm inspired by ants and how ants are interacting, we are collecting knowledge about your preferences.”
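Torresen does not spell out the algorithm, but ant-colony methods typically keep a ‘pheromone’ weight on each choice, reinforcing the transitions a user actually makes and letting unused ones evaporate. The sketch below is a speculative reading of how PheroMusic’s five musical scenarios might be weighted that way; the scenario names and parameter values are guesses.

    import random

    SCENARIOS = ["scene_1", "scene_2", "scene_3", "scene_4", "scene_5"]

    # One pheromone trail per scenario-to-scenario transition, all equal at first.
    pheromone = {(a, b): 1.0 for a in SCENARIOS for b in SCENARIOS if a != b}

    def reinforce(current, chosen, deposit=0.5):
        """The user moved to `chosen`: strengthen that trail."""
        pheromone[(current, chosen)] += deposit

    def evaporate(rate=0.05):
        """Let all trails decay so stale preferences fade over time."""
        for key in pheromone:
            pheromone[key] *= (1.0 - rate)

    def suggest_next(current):
        """Pick the next scenario with probability proportional to trail
        strength, the way an ant follows stronger pheromone."""
        options = [b for b in SCENARIOS if b != current]
        weights = [pheromone[(current, b)] for b in options]
        return random.choices(options, weights=weights, k=1)[0]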

The two projects are part of the EU-funded, transnational, multidisciplinary research project EPiCS (Engineering Proprioception in Computing Systems).