The Space that Brought Us Here

‘The Space that Brought Us Here’ is a piece that challenges an audience’s perception of their surroundings in a more intimate way, through physical engagement with the piece itself. Screens are used to show sections of the same space. The viewer can then reorient the screens, through movement, creating new-found compositions.

The idea behind ‘The Space that Brought Us Here’ was to challenge a viewer’s understanding of the space around them. As someone walks around a space they form an understanding of how they are operating within their surroundings and how other people are also interacting with the space. Individuals form their own perspectives and compositions of the space. The piece had six tablets suspended on steel wire within a wooden frame. A viewer was able to move the tablets up and down the wire and rotate them on it. The tablets took video from their rear cameras and displayed it on their screens. However, as all the tablets were connected to the same local area network, they could share the video with one another. A viewer could tap on the screen of a tablet to change the video shown to that of one of the other five tablets. This resulted in mismatched video, creating confusion but also intrigue about how that perspective had come about.

Intended outcomes & background research

The piece was based on a lot of background research, particularly into the work of Olafur Eliasson. I was intrigued by how he addresses public space with ‘New York City Waterfalls’, where the waterfalls acted as an intervention in public space, allowing people to evaluate their surroundings. Eliasson’s glass façade for the Harpa building demonstrates this too, with its “three-dimensional quasi-brick structure” that creates “fivefold symmetry” [2]. The sections shift in appearance and colour according to the people in the building and the environment, and looking through the façade distorts the view, creating new-found compositions. Jeppe Hein’s ‘Please Touch The Art’ was also an influence: a mirror labyrinth made from a spiral of mirrored stainless steel planks set against views of lower Manhattan. The posts are arranged in various arcs that distort the surrounding park and city. I found it intriguing how the mirrors create distortions that add extra perceptions of the space.

I wanted to take this further and explore how individuals form different understandings of a space, in terms of perspective, and how other viewers have an impact on the space being viewed. Just prior to this I had worked with perspective and perception in Focal Grid, which split the same view into multiple different focus points. Interaction, from a tablet with virtual buttons, controlled which focus points were shown. From the exhibition of that piece it was clear that the interaction was not always obvious and the viewer didn’t feel connected with the scene that was shown.

I wanted to make the viewer more a part of the piece through physical engagement: reorienting the tablets and being able to appear in the video shown. This removes conventional boundaries around how the piece can be interacted with. There is a sense of playful fun in moving different objects independently of one another, like a jigsaw, allowing a viewer to see different possibilities for how the objects can be positioned.

I wanted the video shown on the tablets to be familiar to the viewer from the outset, in order to augment and intervene in a space they knew and so challenge their understanding of it. I achieved this by using live streamed video of the space from the tablets themselves, creating engagement with the viewer’s immediate surroundings.

Software Process

(For a full account of the build process see the documentation log)

(All notebook work can be found here)

I decided on the Amazon Kindle Fire for my project due to its relatively low cost and because it runs Android, which openFrameworks can compile for. I started with initial tests, trying to set up the openFrameworks Android examples on a Kindle Fire. This took a lot longer than expected; I found that Android Studio was a better way of compiling onto the tablets.

I worked on some rotation tests with the accelerometer in the tablet to help with the rotation of the video being displayed. I had issues getting the values required to track continuous rotation of the tablet because of the lack of a gyroscope. However, I could still adjust the rotation of the displayed image based on whether the tablet had been tilted left or right.

Following on from this, I was not able to get the openFrameworks Android camera examples compiling onto an Android device: the program crashed very quickly after being deployed. I tried this with two devices running newer and older versions of Android. The solution was to use an IP camera app, IP Webcam, running in the background on each tablet, which streamed the video to my openFrameworks app. To get the video into my oF app I used the addon ofxIpVideoGrabber, which takes a list of IP cameras via an XML file and displays them on screen.

This solution also allowed me to share the video between all of the tablets, which worked very well, and I got some good feedback from people I showed it to.

Hardware process

(For a full account of the build process see the documentation log)

(All notebook work can be found here)

I needed to build a frame that would allow the tablets to be suspended within it, move up and down, and rotate on the wire, all while keeping the tablets fixed within the frame. It became clear after talking with Nicky Donald, one of the technical support team, that the frame and the tablet attachments would be easiest custom made, as we were not able to find anything commercially available for this. As a result, the frame would be made out of wood, and steel wire would be used to attach the tablets to it.

3D Printing

The attachments would need to be 3D printed to allow them to grip on to the wire and rotate the tablet at that position. I set about looking for 3D designs on thingiverse.com to find the basis of the parts. From this I started modifying the designs, going through tens of iterations of design and print tests.

I was unable to find a suitable case for the Kindle Fires I was using, so I concluded it would be best to print cases for the tablets. This had the advantage that the attachment plate could be printed onto the case, avoiding glue and the messy look it would create. However, there were issues with printing the cases, and it was very difficult to perfect. Each case took around 7 hours to print, which meant it took a while to make modifications to the design and get them all printed at a high enough quality. I also had issues with the rafts and supports the 3D printer uses to hold up the print as it is being built. The supports were not consistent due to the complexity of the design, which resulted in weaker prints. I started using a program called MeshMixer, which let me adjust the cases in real-world measurements to fine-tune them and optimise them for 3D printing. However, this still did not solve the issues with the supports, and the 3D printers I had access to didn’t always print correctly.

I had issues with the main rotation part snapping when the cases were assembled. This damaged some of the cases, meaning modifications had to be made and some were no longer able to rotate on the steel wire.

Wooden frame

I had initially been considering a square wooden frame for my piece. However, thinking about how it would sit in the gallery space, I opted for a portrait frame instead. This was for several reasons, including creating the look of a window or portal that would frame the tablets and allow for this change of perspective, and matching the aspect ratio of the tablets within the frame.

The first frame that I built wouldn’t have been strong enough to support the tension from the steel wire. Nicky Donald advised me to use structural building timber to support the tension from the wire. As a result, I purchased more wood and added metal braces as well as wooden mitred braces at each corner. This resulted in a very strong frame that was capable of supporting the tensioned wire.

For the steel wire, I used wire grippers and Gripples, which grip steel wire while allowing it to be adjusted for tensioning. The wire grippers were placed at the top of the frame and the Gripples at the bottom, and pliers were used to pull the wire through each Gripple to tension it.

Opening night

I was really pleased to see people’s intrigue on the opening night. This intrigue initially came from the video very often being mismatched, which drew viewers in to interact further with the piece. Mostly viewers rotated the tablets and moved them up and down the wire, though usually quite tentatively, as most seemed to think the tablets looked a bit precarious. One interaction I had not been expecting was people twisting the tablets left or right on the wire to see around them. This surprised some people, as it became very clear that the tablet they were moving didn’t always respond to their movements: the feed shown on it came from another tablet within the frame. This discovery led a lot of viewers to interact with the exhibition further. For some, however, the mismatched video wasn’t enough to prompt physical engagement with the piece.

A couple of the tablets were a bit precarious on the frame, and two of them fell off. These problems with the 3D printed attachments discouraged some viewers from interacting with the piece.

Evaluation

‘The Space that Brought Us Here’ was a successful piece. I felt it lived up to the original concept of challenging a viewer’s perception of their surroundings. I had ideas of using the tablet’s rotation to control the rotation of the image, and of using an image buffer to create a delay. I did not explore these fully, as it became clear they would add too much complexity to the piece and confusion to the physical interactivity. My solution of using an IP camera to capture the video, instead of using the cameras locally, allowed me to share the video between all of the tablets. This created a very subtle shift in perspective, yet it was obvious enough from just looking that the perspective had been altered. I feel this was better than adding a time dimension or unexpected rotation of the video, as I wanted to make the intentions of the piece clear. However, due to the latency of the video stream, there was sometimes a small but noticeable delay when someone walked in front of the tablets. This was useful for illustrating how the tablets were creating a different perception of the surroundings, and allowed a viewer to think about the various operations and alternative compositions occurring within the space.

The issues with 3D printing did hamper viewers’ ability to interact with the piece, owing to my difficulties with the complexity of the printing. It also resulted in some of the attachments looking quite rough. If I were to do that component again I would be hesitant to 3D print the parts, given the complexity and the breakages that are hard to prevent. Using proper tablet cases, with metal or moulded plastic for the rotation and vertical movement, would be better. That said, considering I haven’t seen anyone else suspend tablets with the ability to move them vertically and rotate them on steel wire, I think my solution was good. More time should have been put towards the mechanics of the structure for this to work better.

Overall I am happy with my piece. I negotiated a lot of challenges with the 3D printing, as well as issues compiling software for Android with openFrameworks. I feel that my piece was striking and allowed viewers to explore the space differently through the intervention of the frame.

Software/Libraries used

Android Studio
Eclipse
MeshMixer
Blender
IP WebCam
ofxIpVideoGrabber

References and bibliography

ofxIpVideoGrabber

Thingiverse attachment base

Thingiverse Kindle case

IP Web Cam

For a full account of the build process see the documentation log

All notebook work can be found here
