UN-REACTABLE

spacetable

By George Sullivan and Leon Fedden

Abstract

Our project is an interactive installation that uses gesture and expression to explore soundscapes. We wanted to design an immersive environment where sound is manipulated through the user's presence and the kinetic movement of ‘nodes’. We have created a soundscape for the user to adapt and explore in intuitive ways which, we hope, sounds and looks pleasing.

By using AR computer vision techniques we have built a system that gives us a positional data stream for each node present on our table. Using this data we have created different ways of interacting with Reason [1] to manipulate the sounds and textures present in our sonic landscape.

We built our own box to house the equipment and covered it with a transparent lid. Placing ‘nodes’ on top with AR code graphics facing down into the table, we are able to track the X and Y position of each node, as well as its angle relative to the camera, giving us a constant stream of positional data to manipulate. In our code we have used multiple libraries to make this possible, and we rely on several different input streams being passed through our code and into Reason to create and shape the sounds heard around the room.
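To make that data stream concrete, the sketch below shows one way a tracked node's position and rotation could be turned into controller-change messages for Reason using ofxMidi. It is a minimal illustration rather than our actual MarkerTracking code: the Node struct, the CC numbers and the assumption that MIDI output port 0 is routed to Reason (for example via the IAC driver) are all placeholders.

```cpp
#include "ofMain.h"
#include "ofxMidi.h"

// Illustrative per-node data, not the exact class used in our repository.
struct Node {
    int   markerId;   // id decoded from the AR marker
    float x, y;       // normalised position on the table, 0..1
    float rotation;   // angle relative to the camera, in degrees 0..360
    bool  visible;    // whether the camera can currently see the marker
};

class NodeToMidi {
public:
    void setup() {
        // Assumes output port 0 is routed to Reason (e.g. via the IAC driver).
        midiOut.openPort(0);
    }

    void send(const Node& node) {
        if (!node.visible) return;
        const int channel = 1;
        // Scale each stream onto a 0..127 controller value (CC numbers are arbitrary).
        midiOut.sendControlChange(channel, 20, ofMap(node.x,        0, 1,   0, 127, true));
        midiOut.sendControlChange(channel, 21, ofMap(node.y,        0, 1,   0, 127, true));
        midiOut.sendControlChange(channel, 22, ofMap(node.rotation, 0, 360, 0, 127, true));
    }

private:
    ofxMidiOut midiOut;
};
```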


Target Audience & Intended Outcomes

Our original aim was to create an installation that offered someone of any musical background a platform to experiment with sound design. To make this interesting we needed our system to be responsive to the user's gestures and thus inspire them to interact intuitively with the space. This offers an experience that is not exclusive, but instead opens sound design to those with or without any previous knowledge. The control of each sound must be obvious when heard but implemented in creative ways, so we created different methods of interaction for different nodes and sounds, making an interesting piece which encourages experimentation. Although initially we aimed to pair synthesis techniques with coherent actions, after research and experimentation with our system we decided that, in order to create an interesting piece which sounded rewarding, we would have to give the user control over higher-level audio processes rather than the gritty DSP side.

We spent a lot of time considering the control methods used in our project. After researching embodiment we were inspired by the idea of creating an interface which provided a seamless connection between the computer system, the user, and the environment surrounding them. Consequently, we wanted our project to conceal all technology within the box we built, and to allow our users to interact with physical objects in the space around it. However, to succeed in creating this connection we needed not only a tangible interface, but also thoughtful and natural output from our program.

We researched other projects to see what methods were successful in similar pieces. Most notable was Studio PSK's Polyphonic Playground [2], which George visited at the start of term. After talking about it, we decided that although it was an impressive exhibit, the output sound from the system was not coherent with the interaction from the audience, especially when so many people were using it. The way the sounds worked together was where we thought it fell short, and so in order for our project to be sonically impressive we decided that we would need to put a lot of thought into the sound materials used. Another problem we saw was that the user's interaction was essentially limited to multiple switches. Consequently we decided that our nodes should exhibit differences in interaction to remain interesting.

It is, of course, worth noting Reactable [3], a recent project that is not too dissimilar to our idea. Believe it or not, we were unaware of it when we initially started thinking about our design, but since discovering its parallels with our ideas it has absolutely shaped some of our design decisions. A recurring criticism of Reactable is that it has a feature set which can be difficult to learn, especially without a background in music. Understandably Reactable has a different audience, budget and time constraints, but it highlighted a few important points for our build.

Leon went to a guest lecture in the Whitehead building by Dr. Nicholas Ward [4]. The lecture was about how movement should be considered in the design of musical instruments, and it partially shaped how we mapped our nodes to our sound output. After researching further into some of his work we felt it really reinforced what George took from the Polyphonic Playground installation. For example, in a paper written for a NIME (New Interfaces for Musical Expression) conference, Ward explains the design process for his own ‘musical interface’ (the Twister) and discusses the importance of gesture design and sensible mapping, which was becoming a recurring theme in our ideas: “The number of useful gestures discovered represents the starting point for the subsequent development of a musical gesture vocabulary” [5].


Design Process and Build Commentary

We had our users, or target audience, integrated into the plan from around the time of the project proposal; it was then that we elected to build a system high level enough for anyone – interested in sound design or not – to be able to make interesting sounds. After some conversations with one another we settled on a physical and digital form for the project. With a shared vision, we set out to draw the components, how we envisaged users using them, and how the system would be manipulated to achieve our desired output.

Having the ideas on paper really helped to ensure that there were no discrepancies in our expectations of the project. The next task was to reductively identify the key components of the project. Once we had a list of components, we allocated time to build a mini-project for each item on the list.

This was the fun bit; we were still green to openFrameworks and C++ (and still are in many ways), and the mini-projects introduced us to the rather large ecosystem of addons and libraries that we might want to use. Aside from building the components as Lego bricks that plug together into a bigger cohesive model, the mini-projects served a second function: they were an objective method of rating each library's effectiveness in providing the horsepower for some of the more complex computations. We also spent some time making classes or simple models which verified the sanity of our plans, checking and predicting approximate system dynamics, for example by making a ‘table-top’ sketch controlled via mouse and keyboard that output MIDI much like the final model would.

Next is a roughly chronologically ordered list of mini-projects and what they achieved. The names reflect the titles of the projects submitted in GitLab:

  • AR_SIZE_DOESN'T_MATTER_IT'S_HOW_YOU_USE_IT (ofxArucoExample): This was one of the first things we did, as the whole project hinged on being able to track objects of some description in the real, physical world. Initially there was a fruitless foray into ofxARToolkit: the library the addon wrapped at the time was most unfortunately not maintained and compilation was futile. However, the omnipresent Arturo was soon on deck to help with a little library he wrote called ofxAruco. This project was (or is) his example, which we used to ensure that marker detection worked to a decent standard and that it was worth proceeding. It is worth mentioning that, as of very recently, ARToolkit is back in development with a new API, and a new addon GitHub page [6] has subsequently been made – something worth keeping an eye on perhaps.
  • MarkerTracking: This was the class that took the example and wrapped it up into a “deceptively simple” (Theodoros Papatheodorou) class that kept track of the AR markers: their position, rotation and whether they could be seen or not. Weeks of struggling with inaccurate rotation readings, quaternions and matrices led to us waving the white flag and bringing out the gaffer tape with a cheap, hacky solution – sometimes it is easy to get caught up in the finer details and forget the overarching picture, so we pressed onwards.
  • NodeExample1:
    A small project to model the tabletop and markers – essentially circles that could be dragged around using the mouse. This was then turned into a class to interface with the forthcoming MIDI library explorations.
  • MidiAttempt1:
    Here we worked through an example to explore ofxMidi [7] and get familiar with the API. The documentation was straightforward and the examples provided gave us enough to implement our own basic MIDI messages.
  • MidiToReason:
    We then took that knowledge of ofxMidi and spent time working out how to route MIDI signals on a Mac from our program to Reason. As basic note-on and note-off messages had been covered, we now wanted to create our own software interface to allow control over parameters inside of Reason. This was where the documentation fell short, and we spent a long time figuring out how to send these controller-change messages (and verifying it was possible at all). After researching how MIDI itself works, and digging deeper into ofxMidi, we were able to create our own MIDI channels and send controller data from our software. A minimal sketch of this routing is given just after this list.
  • About ten assorted ofxMaxim projects: A lot of these projects stemmed from, or were part of, Leon's tutorials for his technical research. They all fall under the bracket of fairly low-level sound design, so they have been grouped together here. The real takeaway from these was that Maximilian was too low level for our users and our means. The projects cover: a saw wave with pitch control, amplitude and ring modulation, stereo output, frequency modulation, and finally a combination of these processes.
  • PS3EyeGrabber: This project took the openFrameworks add-on's [8] functionality and wrapped it in a class that allowed for straightforward interfacing with the MarkerTracking class API. Essentially it mirrors the API of the openFrameworks video grabber base class, so the same functions can be called in the same places.
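As promised in the MidiToReason item above, here is a minimal sketch of the routing side, assuming ofxMidi's virtual output port is used on macOS (an IAC bus works equally well). The port name, channel and message choices are illustrative rather than the exact code in our repository.

```cpp
#include "ofMain.h"
#include "ofxMidi.h"

class MidiBridge {
public:
    void setup() {
        // On macOS this makes a "spacetable" port appear in Reason's MIDI preferences.
        midiOut.openVirtualPort("spacetable");
    }
    // Ordinary note messages, as covered in MidiAttempt1.
    void noteOn(int pitch, int velocity) { midiOut.sendNoteOn(1, pitch, velocity); }
    void noteOff(int pitch)              { midiOut.sendNoteOff(1, pitch, 0); }
    // Controller-change messages, used to drive parameters inside Reason.
    void control(int cc, int value)      { midiOut.sendControlChange(1, cc, value); }

private:
    ofxMidiOut midiOut;
};
```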

These mini-projects were key to refining our ideas. Often the usage of an add-on unearthed new possibilities and pathways for the project's journey; one of the major realisations during this project was that Dr Mick Grierson's Maximilian library [9] was too low level for what we had in mind sonically, and by extension for our target audience. It was here we decided to take the ofxMidi route into a Digital Audio Workstation (DAW). We went with Reason because Leon had the most experience with it.

After we felt we had reasonable examples of projects that covered most components necessary for the final build, we began to piece them together. From here on we iteratively refined the final code base to improve performance, remove bugs and add new features, such as mode selection and function pointer arrays to simplify modes (each mode could then have its own function in an efficient manner), among other changes.
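As a rough illustration of that function-pointer-array idea, here is a minimal sketch; the mode names are placeholders rather than the modes in our repository.

```cpp
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    void update() override {
        // Call whichever mode is currently selected.
        (this->*modes[currentMode])();
    }
    void keyPressed(int key) override {
        if (key == 'm') currentMode = (currentMode + 1) % numModes;  // cycle through modes
    }

private:
    typedef void (ofApp::*ModeFn)();

    // One handler per mode; bodies here are placeholders.
    void modeDrone()   { /* e.g. map node distances to filter cutoff */ }
    void modeRhythm()  { /* e.g. map node rotation to sequencer rate */ }
    void modeAmbient() { /* e.g. map node count to reverb send */ }

    static const int numModes = 3;
    ModeFn modes[numModes] = { &ofApp::modeDrone, &ofApp::modeRhythm, &ofApp::modeAmbient };
    int currentMode = 0;
};
```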

The second stage of our design process was the physical build of the box and nodes. For this we worked alongside Henry Clements, a second-year Design student at Goldsmiths. We brainstormed initial design ideas, focussing on building something that allowed us to optimise space for interaction and did not limit the processes we wished to get out of our system. We also wanted to create something that was easily transportable and could be set up or taken apart easily, as well as allowing space for the webcam, computer/Raspberry Pi, and any other wires or parts required in the future. Together we built a prototype design and created some initial blueprints.

[Images: initial box design sketch and prototype]

Once we settled on a final design, we experimented with the range of the webcam to figure out how large our build would need to be. It was important to provide enough space for the user and, furthermore, a large range of pixel data. We also took into account the size of the nodes and the number we would be using. In the end we decided to build the box at 90 x 90 x 90 cm, giving us enough depth to get a large surface to play with and space inside for the webcam (and anything else that needed to be hidden), while retaining a design that was still portable.

[Images: panel blueprints; base and top blueprints]

You can see in the images above: blueprint sketches for each piece of our box (center and left), and a screen grab from the CAD software we used to get the wood cut with a laser cutter (right). We booked into a woodwork studio to have it cut, before measuring up all the dimensions for the lid.

To finish off, we added a stylish gold tint to our box and were ready to go. After spending a little time calibrating the camera and AR codes, we pieced all our code together and began designing sounds and interaction methods.


An Evaluation

Regarding problems, and equally our solutions, we were lucky not to have any major show-stopping issues over the duration of this project. It would not be true to say it went without a hitch, however; we have compiled a list of reflections that we have learnt from. This list is largely for the reader who is not our senior; we are sure they are beyond such mistakes. Everyone else, have a good laugh at our errors if you don't fancy learning from them!

  1. Sometimes the code in a library just doesn't work. Or, more frustratingly, it does work but it doesn't give the expected values. Even more frustratingly, it might feel like you are one maths class away from actually being able to re-write the damn thing. The issue in question was obtaining the z-axis rotational value of any given marker: the code as standard returned approximately twenty to three hundred and forty degrees over a full rotation. We tried to get help on the forums and by talking to anyone who'd listen at university – start there – but in the end we had to settle for a (relatively) ugly hack, mapping the returned range to the desired range (sketched just after this list). In our opinion it is better to keep moving rather than get caught up in the minor details.
  2. MIDI was also an issue for us – we had a lot of trouble routing MIDI through Mac OS X to our desired DAW. Here we can only recommend ‘Google-fu’ and, if you're working in C++, reading the header files! That eventually got us through.
  3. Sometimes libraries don’t compile, and are broken or outdated beyond repair or changing a few pre-processing directives. Don’t be afraid to throw in the towel and find a new alternative or better yet, depending on the scale, writing the solution yourselves.
  4. If working with hardware – paint or glue specifically – then ensure you test a corner before committing to the whole sheet. We eagerly tinted our acrylic sheet gold before realising the tint was too dark for effective AR tracking. The fix was spending hours removing glue residue, and the sheet has never got back to one hundred percent of the transparency it used to have.
  5. A major, really stupid mistake I – Leon – made borders on arrogance. Do yourself a favour and never make a major change of operating system mid-way through an important computing project; having to change development ecosystems can really stifle progress while you wrap your head around new ways of doing things. I spent time talking to different tutors and friends to weigh up whether I wanted a new Mac or a Linux system. Whilst I don't ultimately regret the change – I have enjoyed the rather arcane art of some of the more masochistic Linux operating systems – I regret the timing and wish I had waited until the end of term.
  6. A more abstract error on our part was perhaps too much ambition. A personal view of ours is that it is always worth being ambitious, because the more optimistically you begin a project the larger the results usually are; but perhaps we expected to deliver a completely professional project on a very small budget and within the time we set for ourselves. Time management of course played a part, and there is always room for improvement on that end, but I never thought we were being lazy. The solution is not to stifle ambition, but perhaps to be realistic about the expected final project and remember you are only human!
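For completeness, the ugly hack from point 1 amounted to something like the sketch below (illustrative, not the exact code in our repository): remap the observed range back onto the full circle and clamp.

```cpp
#include "ofMain.h"

// The library reported roughly 20 to 340 degrees over one full physical turn,
// so we simply remapped that observed range onto 0-360 and clamped it rather
// than deriving the true angle from the quaternion.
float fixMarkerRotation(float rawDegrees) {
    const float observedMin = 20.0f;    // reading we actually saw near "zero"
    const float observedMax = 340.0f;   // reading we actually saw near a full turn
    return ofMap(rawDegrees, observedMin, observedMax, 0.0f, 360.0f, true);
}
```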

[Image: “Some things aren't meant to be”]

Conclusion & Future Proposals

We are pleased with what we have achieved within this project. As we are using Reason we can change and alter the source sounds appropriately. We managed to implement data mapping not just from each node's X and Y position but also from its speed and rotation, the distance between nodes, and how many nodes are within a short distance of one another. It is possible to add new AR codes to our system provided you can identify them (use the drawData function in markerDetection.cpp). We believe that if this were an exhibition piece it would be important to be able to tailor the output to a particular occasion, which is why we designed our software to be flexible. The video above is a short performance demonstrating what is possible with our system. On a final note, we would like to add that it is difficult to get a feel for what our installation is capable of without spending some time experimenting with it.
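For anyone extending the system, the derived streams mentioned above only take a few lines each. The sketch below is illustrative; the Node struct stands in for the real marker-tracking class in the repository.

```cpp
#include "ofMain.h"
#include <vector>

// Illustrative per-node data: current and previous positions on the table.
struct Node {
    float x, y;          // position this frame
    float prevX, prevY;  // position on the previous frame
};

// Distance between two nodes.
float distanceBetween(const Node& a, const Node& b) {
    return ofDist(a.x, a.y, b.x, b.y);
}

// Speed of a node, given the time elapsed since the previous frame (seconds).
float speedOf(const Node& n, float dt) {
    return ofDist(n.x, n.y, n.prevX, n.prevY) / dt;
}

// How many other nodes sit within a given radius of this one.
int neighboursWithin(const Node& n, const std::vector<Node>& all, float radius) {
    int count = 0;
    for (const Node& other : all) {
        if (&other != &n && distanceBetween(n, other) < radius) ++count;
    }
    return count;
}
```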

GitLab repository: http://gitlab.doc.gold.ac.uk/lfedd001/CP2_Final_Project_Interactive_Synth.git

Speaking here as myself, Leon, I can talk about what I took from the project and, more pertinently, what it has inspired me to go on to do. After being introduced to the world of AR, I have taken an interest in the mechanics behind it and its applications.

Largely I would like to partake in more adventures in the blend of the digital and real world – mixed reality. HoloLens [10] from Microsoft is a great real-world example. Unfortunately, there are two issues here: firstly it's Microsoft and I have little interest in working in that ecosystem, and secondly the level of polish of their project is likely beyond my scope of possible achievement. However, I like the idea of three-dimensional data representation and visualisation in the real world and would be interested in exploring avenues along those lines.

References

  1. Propellerheads.se. (2016). Create more music, record and produce with Reason | Propellerhead. [online] Available at: https://www.propellerheads.se/reason [Accessed 21 Apr. 2016].
  2. Studiopsk.com. (2016). Polyphonic Playground. [online] Available at: http://www.studiopsk.com/polyphonicplayground.html [Accessed 21 Apr. 2016].
  3. Reactable.com. (2016). Reactable – Music Knowledge Technology. [online] Available at: http://reactable.com/ [Accessed 21 Apr. 2016].
  4. DMARC | Digital Media and Arts Research Centre. (2016). Dr. Nicholas Ward. [online] Available at: http://www.dmarc.ie/people/academic-staff/nicholas-ward/ [Accessed 21 Apr. 2016].
  5. Ward, N. and Torre, G. (2014). Constraining Movement as a Basis for DMI Design and Performance. [online] NIME. Available at: http://www.nime.org/proceedings/2014/nime2014_404.pdf [Accessed 18 Apr. 2016].
  6. GitHub. (2016). naus3a/ofxArtool5. [online] Available at: https://github.com/naus3a/ofxArtool5 [Accessed 21 Apr. 2016].
  7. GitHub. (2016). danomatika/ofxMidi. [online] Available at: https://github.com/danomatika/ofxMidi [Accessed 21 Apr. 2016].
  8. GitHub. (2016). bakercp/ofxPS3EyeGrabber. [online] Available at: https://github.com/bakercp/ofxPS3EyeGrabber [Accessed 21 Apr. 2016].
  9. GitHub. (2016). micknoise/Maximilian. [online] Available at: https://github.com/micknoise/Maximilian [Accessed 21 Apr. 2016].
  10. Microsoft HoloLens. (2016). Microsoft HoloLens. [online] Available at: https://www.microsoft.com/microsoft-hololens/en-us [Accessed 21 Apr. 2016].
