submitted link to my project on Igor: download
The initial idea of the project was to create a compiled simulation that engages three senses simultaneously: hearing, sight, and touch. The simulation would play the user a sample, while the graphics visualised the FFT analysis used to generate the different stimuli as the audio progressed. As the user watches these graphics at play, they would place their fingers inside the holes of a machine. These contain sensors that vibrate with the FFT analysis, and each finger experiences a different vibration, as each finger of the hand represents a different frequency.
Audience: The intended audience for this project is music enthusiasts interested in visual art that represents sound through tactile elements.
Ideal scenario & story board:
The storyboard shows a user interacting with the machine. The user places their hands inside the holes where the solenoids are placed. The user then experiences a visual simulation, on a projection board, of what the machine is doing. Finally, as the sample plays, the solenoids vibrate, causing movement at the fingertips.
The design of the project came from a conceptual art project called “amazing tangible media” by MIT. To achieve the effects used by MIT, I needed to understand the basics of FFT in Processing so that graphical variables could move based on a frequency spectrum. I used Processing for this because its references and code examples were comprehensible. My first target was to research how to find quantised frequency amplitudes from a sound, and how to split a sound into quantised frequencies using FFT.
This is the first sketch version of my project demonstrating my understanding of FFT:
Link to my first sketch: download
This version also used a library called ControlP5, which I used to experiment with manipulating variables in real time, such as the speed of the bars, their height, and the sensitivity. At this early stage, the design was not a concern, as I was more focused on the functionality of the concept.
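The core of that first sketch can be reduced to one idea: each FFT band drives the height of a bar, and the bar eases toward its target rather than jumping. Below is a minimal sketch of that logic in plain Java, assuming a Minim-style FFT where each band has an amplitude; the fixed array stands in for the live spectrum, and `sensitivity` and `speed` are the kinds of variables I exposed through ControlP5. The names and numbers are illustrative, not the exact ones from my sketch.

```java
// Minimal sketch of the FFT-bar logic: each band's amplitude, scaled by
// a sensitivity value, is the target height; 'speed' controls how fast
// the bar eases toward it (like Processing's lerp()).
public class FftBars {
    static float lerp(float a, float b, float t) { return a + (b - a) * t; }

    // Move each bar part-way toward its target height and return the array.
    static float[] updateBars(float[] bars, float[] spectrum,
                              float sensitivity, float speed) {
        for (int i = 0; i < bars.length; i++) {
            float target = spectrum[i] * sensitivity;   // scaled amplitude
            bars[i] = lerp(bars[i], target, speed);     // eased motion
        }
        return bars;
    }

    public static void main(String[] args) {
        float[] spectrum = {0.2f, 0.8f, 0.5f, 0.1f};    // stand-in FFT bands
        float[] bars = new float[4];                    // all start at zero
        updateBars(bars, spectrum, 100f, 0.5f);
        for (float b : bars) System.out.println(b);
    }
}
```

In the real sketch the spectrum array would be refreshed every frame from the FFT, and each bar would be drawn as a rectangle of the computed height.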
After the first sketch, I began creating hand-drawn designs of what I imagined the machine would physically look like.
These designs helped me plan a better simulation with the features intended for tactile sound, as well as what the simulation should emulate. As I was designing these, I thought it would be too typical if the solenoids were represented by bands/bars in an FFT spectrum. So for the simulation, I came up with the idea of having the solenoids represented by small clip-art speakers called “finger units”. This gives better control over what is done with each frequency in terms of analysis, such as beat detection.
sketch of final simulation:
Problems that occurred:
I was unable to find a solution to get the moogFilter class to operate as intended on a sample. When I tried to implement the class, it would not apply a low-pass filter to the sample. No matter how many code examples I analysed, I could not get it to work with my code structure. Due to this, the idea of using filters had to be scrapped.
There was also a problem aligning the finger units. I could not find a way to iterate over the finger units to create a diamond alignment. So, as a last resort, some of the finger units were hard-coded (outside the for loop) at specific coordinates on screen to get the shape I wanted them aligned in.
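In hindsight, the diamond could have been generated in a single nested loop. Below is a sketch of one way to do it, assuming a diamond built from rows of 1, 2, 3, 2, 1 units centred on a point; the centre and spacing values are illustrative, not taken from my sketch.

```java
// Loop-based diamond layout instead of hard-coded coordinates.
// Rows of 1, 2, 3, 2, 1 units form a diamond around (cx, cy);
// 'spacing' is the pixel gap between neighbouring units.
public class DiamondLayout {
    static float[][] diamondPositions(float cx, float cy, float spacing) {
        int[] rowCounts = {1, 2, 3, 2, 1};
        float[][] pos = new float[9][2];
        int k = 0;
        for (int row = 0; row < rowCounts.length; row++) {
            int count = rowCounts[row];
            float y = cy + (row - 2) * spacing;               // rows centred on cy
            for (int col = 0; col < count; col++) {
                // centre each row horizontally around cx
                float x = cx + (col - (count - 1) / 2.0f) * spacing;
                pos[k][0] = x;
                pos[k][1] = y;
                k++;
            }
        }
        return pos;
    }

    public static void main(String[] args) {
        for (float[] xy : diamondPositions(200, 200, 40)) {
            System.out.println(xy[0] + ", " + xy[1]);
        }
    }
}
```

Each finger unit would then be drawn at one of the returned coordinates inside a single loop, so no unit needs to sit outside it.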
Another issue I had with the finger units was making them respond to the mouse hovering over them by turning red. The idea of the finger units was for them to represent different frequencies in the sample, with the movement occurring at the user's fingertips. However, I wanted more control over the finger units, such as stopping one individually in real time. Doing so could produce more cognitive combinations for the user to experience if a particular solenoid is deactivated and reactivated during the sample, as well as more conditionals for the graphics to change on. This was difficult to implement because the if-conditionals for the mouse hovering over them would not respond, and the finger unit's state would not change to a highlighted image; they were either all activated on mouse click or it simply didn't function at all. The solution was to manually find the radius of the finger units and create two boolean checks: one for when the mouse hovers over a unit (highlighted) and another for when it is then clicked in that area (deactivated).
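The two boolean checks described above boil down to a distance test against each unit's radius. Here is a minimal sketch of that fix in isolation, assuming each finger unit stores its centre and radius; the class and field names are illustrative rather than the exact ones from my sketch.

```java
// Hover and click detection for one finger unit: hover is a
// distance-from-centre check, and a click inside the radius toggles
// only that unit's active state (so other units are unaffected).
public class FingerUnit {
    float x, y, radius;
    boolean active = true;       // solenoid running
    boolean highlighted = false; // mouse currently over the unit

    FingerUnit(float x, float y, float radius) {
        this.x = x; this.y = y; this.radius = radius;
    }

    // true when (mx, my) lies within the unit's radius
    boolean isOver(float mx, float my) {
        float dx = mx - x, dy = my - y;
        return dx * dx + dy * dy <= radius * radius;
    }

    // called every frame to refresh the highlight (red) state
    void update(float mx, float my) {
        highlighted = isOver(mx, my);
    }

    // called on mouse click: only the unit under the mouse toggles
    void mousePressed(float mx, float my) {
        if (isOver(mx, my)) active = !active;
    }

    public static void main(String[] args) {
        FingerUnit u = new FingerUnit(100, 100, 20);
        u.update(105, 105);
        System.out.println(u.highlighted);  // true: inside the radius
        u.mousePressed(105, 105);
        System.out.println(u.active);       // false: toggled off
        u.mousePressed(300, 300);
        System.out.println(u.active);       // false: outside click ignored
    }
}
```

With each unit owning its own checks, deactivating one solenoid no longer touches the others, which is what made the individual on/off combinations possible.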
Production notes from my computer notepad:
- LED lights.
- a human ear.
- One single solenoid moving up and down.
- a sphere (maybe mine bomb).
- orbiting spheres.
- draw the full spectrum along with the linear averages
- since linear averages group equal numbers of adjacent frequency bands, we can simply precalculate how many pixels wide each average's rectangle should be.
functionality: solenoids stop at mouse click.
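The note on linear averages can be worked through in a few lines. The sketch below assumes a Minim-style FFT grouped into equal averages (as with `linAverages()`): because every average covers the same number of adjacent bands, every rectangle has the same pixel width, which can be computed once up front. The display width and average count here are illustrative values, not the ones from my sketch.

```java
// Precalculating the rectangle width for linear FFT averages:
// equal groups of adjacent bands map to equal pixel widths.
public class AverageWidth {
    static float rectWidth(int displayWidth, int avgCount) {
        return (float) displayWidth / avgCount;
    }

    public static void main(String[] args) {
        int displayWidth = 512;  // assumed sketch width in pixels
        int avgCount = 32;       // e.g. the value passed to linAverages()
        float w = rectWidth(displayWidth, avgCount);
        // the i-th average's rectangle then starts at x = i * w
        System.out.println(w);
    }
}
```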
I think the overall concept of the work was proficient in combining cognitive dissonance between the audio aspects of the project and the visual stimuli that incorporated the conceptual functions of touch. I feel, though, that the practical work could have gone further in taking advantage of manipulating graphics through changes in variable values. Some of the targets on my to-do list were not completed, such as making the finger units respond to different frequencies.
MIT amazing technology- tangible media: https://www.youtube.com/watch?v=lvtfD_rJ2hE
Inspiration for unusual designs of the machine: http://www.demilked.com/unusual-cool-speakers/