Virtual simulation of tactile sound

[image missing]

Submitted link to my project on Igor: download

 

Project description:

The initial idea of the project was to create a compiled simulation that engages three senses simultaneously: hearing, sight, and touch. The simulation would present the user with an audio sample, while the graphics represented the FFT analysis used to create different stimuli as the sample progressed. As the user watches these graphics at play, they would place their fingers inside the holes of a machine. These holes contain sensors that vibrate with the FFT analysis, and each finger experiences a different vibration, as each finger of the hand represents a different frequency.
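The mapping described above can be sketched in a few lines. This is an illustrative Python version of the idea (the project itself was written in Processing); the bin count and the even split into ten bands are assumptions for demonstration, not values taken from the project:

```python
# Map an FFT amplitude spectrum onto ten "fingers" by splitting the
# bins into equal bands and averaging each band's amplitude.

def finger_levels(spectrum, fingers=10):
    """Split a list of FFT bin amplitudes into equal bands,
    returning one average level per finger."""
    band = len(spectrum) // fingers
    return [sum(spectrum[i * band:(i + 1) * band]) / band
            for i in range(fingers)]

# Example: 100 bins of fake amplitudes, ten fingers.
levels = finger_levels([float(i % 10) for i in range(100)])
```

Each value in `levels` would then drive the vibration strength of one finger's sensor.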

 

Audience: The intended audience for this project is music enthusiasts interested in visual art that represents sound with tactile elements.

 

Ideal scenario & storyboard:

[image missing]

The storyboard shows a user interacting with the machine. The user places their hands inside the holes where the solenoids are placed. The user then experiences, on a projection board, a visual simulation of what the machine is doing. Finally, as the sample plays, the solenoids vibrate, causing movement on the fingertips.

Design Process:

The design of the project came from a conceptual art project called “amazing tangible media” by MIT. To achieve the effects used by MIT, I needed to understand the basics of FFT in order to get graphical variables to move based on a frequency spectrum. For this I used Processing, as its references and code examples were comprehensible. My first target was to research how to find quantised frequency amplitudes from a sound signal, and how to split the spectrum into quantised frequency bands using FFT.

This is the first sketch version of my project demonstrating my understanding of FFT:

 

[image missing]

Link to my first sketch: download

 

This version also used a library called ControlP5, which I used to experiment with manipulating variables in real time, such as the speed of the bars, their height, and sensitivity. At the early stages, the design was not a concern, as I was more focused on the functionality of the concept.

After the first sketch, I began creating hand-drawn designs of what I imagined the machine would physically look like.

First design:

[image missing]

Second design:

[image missing]

These designs helped me plan a better simulation with the features intended for tactile sound, as well as what the simulation should emulate. As I was designing these, I felt it would be too typical if the solenoids were represented by bands/bars in an FFT spectrum. So for the simulation, I came up with the idea of having the solenoids represented by small clip-art speakers called “finger units”. This gives better control over how each frequency is manipulated and analysed, for example with beat detection.

Sketch of final simulation:

[image missing]

 

Problems that occurred:

I was unable to find a solution to get the MoogFilter class to operate as intended on a sample. When I tried to implement the class, it would not apply a low-pass filter to the sample. No matter how many code examples I analysed, I could not make it work with my code structure. Due to this, the idea of using filters had to be scrapped.

There was also a problem aligning the finger units. I could not find a way to iterate over the finger units to create a diamond alignment. As a last resort, some of the finger units were hard-coded (outside the for-loop) at specific coordinates on screen to get the shape I wanted them aligned in.
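For reference, one loop-based approach that could have replaced the hard-coded coordinates is to walk the four edges of a rotated square. This is a hypothetical Python sketch, not the code from the project; the centre, radius, and points-per-edge values are assumptions:

```python
# Generate positions along the four edges of a diamond (a square
# rotated 45 degrees), instead of hard-coding each coordinate.

def diamond_points(cx, cy, r, per_edge):
    """Return (x, y) positions around a diamond centred on (cx, cy),
    with per_edge points on each of its four edges."""
    corners = [(cx, cy - r), (cx + r, cy), (cx, cy + r), (cx - r, cy)]
    pts = []
    for i in range(4):
        x0, y0 = corners[i]
        x1, y1 = corners[(i + 1) % 4]
        for k in range(per_edge):           # each corner is emitted once
            t = k / per_edge
            pts.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    return pts

pts = diamond_points(200, 200, 100, 3)      # 12 positions around the diamond
```

Every generated point sits on the diamond's outline, so the finger units could all be placed from a single loop.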

Another issue I had with the finger units was making them respond to the mouse hovering over them by turning red. The idea of the finger units was for them to represent different frequencies in the sample, with movement occurring at the user's fingertips. However, I wanted more control over the finger units, such as stopping one individually in real time. Doing so results in more combinations for the user to experience if a particular solenoid is deactivated and reactivated during the sample, as well as more conditionals for the graphics to change on. This was difficult to implement because the if-conditionals for the mouse hovering over them would not respond, and the state of a finger unit would not change to a highlighted image; either all units activated on mouse click, or nothing functioned at all. The solution was to manually find the radius of the finger units and create two boolean checks: one for when the mouse hovers over a unit (highlighted) and one for when it is clicked in that area (deactivated).
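The hover/click fix boils down to a distance test against each unit's radius. Below is a minimal Python sketch of that logic; the class name, field names, and the 25-pixel radius are assumptions for illustration, not taken from the project:

```python
# Hit-testing a circular finger unit: one boolean for hovering
# (highlight) and one for the click that toggles the unit.

def over_unit(mx, my, ux, uy, radius=25.0):
    """True when the mouse (mx, my) lies within the unit's circle."""
    return (mx - ux) ** 2 + (my - uy) ** 2 <= radius ** 2

class FingerUnit:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.highlighted = False    # mouse is hovering over the unit
        self.active = True          # solenoid is running

    def update(self, mx, my, clicked):
        self.highlighted = over_unit(mx, my, self.x, self.y)
        if self.highlighted and clicked:
            self.active = not self.active   # toggle only this unit

unit = FingerUnit(100, 100)
unit.update(105, 95, clicked=True)   # inside the radius: highlight and toggle off
```

Because each unit runs its own distance check, a click only toggles the unit under the cursor rather than all of them at once.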

 

Production notes from my computer notepad:

MAIN GOAL:
A tactile object that responds to and represents frequencies/sound, and is interactive.
How to do this?
        Answer: Make an audio visualiser in Processing for the solenoids to mimic, create a simulation of the user experience, and get a single line of solenoids to act as tactile audio visualisers.
Design ideas:
  • LED lights.
  • a human ear.
  • One single solenoid moving up and down.
  • a sphere (maybe mine bomb).
  • orbiting spheres.
Main target:
 
research finding quantised frequency amplitude from a sound/decibel
how to split quantised frequencies.
 
two paths: design and functionality.
 
design: use filters, scripted variables, and frame count to amplify different frequencies or emphasise graphics.
  • draw the full spectrum along with the linear averages
  • since linear averages group equal numbers of adjacent frequency bands, we can simply precalculate how many pixels wide each average's rectangle should be.
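The precalculation in that last note is just a division. A tiny Python sketch of it, where the display width and number of averages are assumed example values (in the Processing sketch the count would come from the FFT's average size):

```python
# With linear averaging every average spans the same number of FFT
# bands, so each bar's rectangle is the display width divided by the
# number of averages.

width = 512          # display width in pixels (assumed)
num_averages = 32    # number of linear averages (assumed)

bar_width = width / num_averages            # pixels per rectangle
bar_lefts = [i * bar_width for i in range(num_averages)]
```

Drawing then becomes a single loop over `bar_lefts`, one rectangle per average.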

 functionality: solenoids stop at mouse click.

solenoids beat detect different frequencies.
push button to make p5 
 
How to achieve this functionality?
      Answer: FFT analysis
How to represent a frequency range, and how do we split it by ten for ten solenoids?
       Answer: use a gaussian distribution, perhaps randomGaussian() in Processing.
       alternative: multiply by a certain ratio of the frequency spectrum
Representation of sound and its relation to the amplitude spectrum in a graph:
[image missing]
X-axis = frequency
Y-axis = amplitude
50 Hz – 44,000 Hz
To do list for later stages of production:
Practicality and platform: final software interface for stereo using two FFTs to represent two human hands: two sets of 5 circles, one on each side of the screen, that do two different things.
include some p5.
put some nice visualisation in it, maybe at the center of the screen.
make finger unit switch bands to different frequencies.

Evaluation:

I think the overall concept of the work was proficient in combining the audio aspects of the project with visual stimuli that incorporated the conceptual functions of touch. I feel, though, that the practical work could have gone further in taking advantage of manipulating graphics through changes in variable values. Some of the targets in my to-do list were not completed, such as making the finger units switch to different frequencies.

References:

MIT amazing technology- tangible media: https://www.youtube.com/watch?v=lvtfD_rJ2hE

Inspiration for unusual designs of the machine: http://www.demilked.com/unusual-cool-speakers/
