Mixed Reality (5)

I had gotten myself really stuck into this project when I fell ill. Again. This time around I was particularly frustrated by it, because I was pretty much halfway to where I wanted to be, with a plan in mind and the end in sight. My app had the potential to be really cool, or at least to be part of a learning process. But I just wasn’t able to finish it for the deadline. You can’t argue with a vomiting bug; you just have to take bed rest and wait it out, which is what I did. I’m still sad that my project work has been affected for the second time this year.

Nonetheless, only a bad artist will allow such things to get in their way. This was a barrier to finishing the project, but I can still describe where I hoped for it to go. Through the week I had been busily scribbling down ideas in my sketchbook. I drew out a rough user-journey-style storyboard, imagining my target audience as a person browsing the supermarket with a smartphone in hand.


I realised that this situation was a little fanciful; when someone is browsing a shop, they’re pretty unlikely to get their phone out and start opening apps, especially unprompted. They probably just want to walk in, buy what they want, and walk out again as quickly as possible. I still wanted to keep the part where the dead bird was augmented at the user’s feet, but otherwise, I had a few changes. I drew a new storyboard, this time trying to make the situation a bit more likely.


This time, the idea is that a user might download this app because it is associated with Coca Cola, a brand they enjoy. They will install the app, and launch it to see what it’s about. The app will prompt them to point their phone at a bottle; upon detecting the logo, they will then be prompted to look down (if necessary). At their feet, a dead albatross will be shown. This should be at life-sized scale – the bird would have a wingspan just a little longer than a human is tall. The user would be able to walk all around the space to view the entire length of the bird on the ground. After the bird has been displayed, a message would appear, explaining to the user what they are seeing and why it is significant.

I think the second version seems more realistic, although it is still not perfect. Really, this is the sort of thing you have to test out on a real person, to observe how they would react. I had also prepared a handful of questions, and had planned to seek out some Coca Cola consumers to answer these. There were just a few questions, mainly to gauge their reaction to the concept for my app, and to decide whether or not I needed to make some changes. Coca Cola customers would be my target audience with this app; the idea is to raise awareness around the impact Coca Cola has upon the oceans, and to make sure their customer base is aware of this damage.

Thus far, I had gotten to the stage of putting my app onto an Android device. When pointed at the Coca Cola logo, a 3D model of a bird appears. This is primitive, but communicates my concept to a degree. The next stage I was planning to work on was to make the bird appear on the ground, in a fixed position. I had watched a couple of YouTube tutorials and was about to start experimenting with the new Ground Plane features in Vuforia before I fell ill. If I were to progress towards my finished project, the next steps would be to get the ground plane fully operational, to add messages onto the user’s screen at the correct moments, and to create the 3D model of the albatross.

I had thought about making a scaled-down version of an albatross in clay, in physical form, so that I could sculpt something very raw and free-flowing – something artistic and passionate rather than perfectly representative. I just wanted the idea of a dead bird to be communicated; otherwise it could come across as somewhat abstract. Once I had sculpted this model, I would use photogrammetry software called PhotoScan to capture photographs of the clay and convert these into a digital 3D model. I would then scale the model up so that it was life-sized. Although a bit of a convoluted process, this would give me the opportunity to learn PhotoScan, a piece of software that is new to me, but more importantly, it would give my 3D model a far more organic feel than if I had modelled it using CAD software. Realistically, I doubt that I would have been able to achieve all this in the timescale given; however, with unlimited time this is definitely where I would have taken the project.

Although I didn’t manage to get this app completed, I did thoroughly enjoy the work that went into it, and would absolutely love to use Unity (and Vuforia) more for future projects. I think that Unity is surprisingly easy to use, once you get used to the interface, and it makes a lot of things possible even with a limited skill set. Definitely a useful tool.

I have shared my progress on the app below. This is very much a primitive, behind the scenes version, but it shows the vague direction in which I was headed.



Living Walls (6)

As the year comes to a close, I have been tidying up and finishing off various projects. Having been sick during the initial deadline for this project, I found that it took me months to find a quiet spell in my schedule in order to finish off the work. However, I finally succeeded, after a few disheartening VPT issues, and was able to film and document my projected “installation”.

I had previously scouted a location, and created the raw code, so all I had to do was load the sketches into VPT, and projection map them onto the inside of the window. I went down to the area one night with hopes to film the whole thing, however I hit several issues with VPT. Everything seemed okay until I tried to enter fullscreen mode, and then my entire laptop would freeze up and crash, and I would have to start over. I spent the evening trying to troubleshoot, but didn’t get any further forward, and had to go home with nothing to show.

The next week I found Gillian in the studio, and since she has good knowledge of VPT, I sat with her to see if we could figure out the problem together. Unfortunately, the universe had other plans for me, and VPT found a way to stop working which was even worse than before. I could no longer load in my Processing sketches, or even play video files. VPT just suddenly stopped working. We had a good sift through various settings, but nothing seemed to make much difference. I was frustrated, but knew I couldn’t give up at the last hurdle. I decided I would just borrow a laptop from a classmate, one who had VPT running smoothly so that it would be guaranteed to work. Thankfully Ankita fit this description, and since we live close to the studio I was able to borrow both her laptop and her general assistance for an evening.

Ankita has a Mac, whilst I have a PC, and so I had to slightly adapt my Processing sketches so that they would run with Syphon instead of Spout. This was fairly simple to do, just a few lines of code to replace. With this done, I was able to load them into VPT on her laptop, and connect up happily to the projector.

As a refresher on my original idea/concept: I was looking into the history of the building as a medical space. I was playing with ideas of ghosts as a positive idea rather than a negative or scary one, and looking at the way that children had been helped by the staff there. I wanted to convey that there was a type of medical history in the building, but to portray this in a positive way. I was inspired by cells, X-ray scans and bodily tissue in terms of form and shape. My sketches react to sounds in the room. The idea is that when they are projected, the current inhabitants of the building are able to interact with them by passing by, speaking or making noises, as though connecting somehow with this reflection of the past. I wanted to create the sense that the past has some kind of presence, or that the building is watching (but not in a malicious way).


Mixed Reality (4)

With a fairly solid concept in place, it was time to start experimenting in Unity with Vuforia and to see what sort of thing would be possible. First of all, I decided to get the image target working. This is where an image is shown to the camera, and when the image is recognised, an action is performed, such as an object being displayed. I experimented a little by using the Coca Cola logo as the image target. I like the idea of using a plastic Coca Cola bottle as the image trigger in my final version, after the prototypes are complete, but for now this gets the point across. I had a couple of issues getting objects to display at first, so I put in a button as well, since I had managed to make at least that display. Eventually, after a little help from my classmates, I figured out how to make a 3D object appear and “follow” the target.


In the next iteration, I decided to show an image of a dead bird when the Coca Cola logo is shown. This was pretty simple: I just created a flat plane and added the image to this. It captured the essence of my concept nicely, but wasn’t quite where I wanted it to be.

The concept I had in my head was to create a 3D model of a dead bird, to scale, and have this positioned on the floor at the feet of the user, so that when they look down through the screen they will be confronted with this image. Whilst researching, I noted that one of the birds apparently most impacted by sea plastic is the albatross, due to the way it sometimes skims the sea surface to feed – thus ingesting the plastic that floats there. I think that a dead albatross would be a really striking image. The largest of the species can have wingspans of up to 8ft – longer than a tall human. This would be impossible to miss if it was augmented into a user’s screen environment. Also, seeing that a quick, throwaway item like a Coke bottle, used once and then forgotten about, is actually more than enough to kill a massive seabird has a really strong impact. It shows that this action isn’t inconsequential, that it’s not just one little bottle – it’s part of a wider problem, with effects that kill majestic creatures such as the albatross.

Since finding a free 3D model of a dead albatross in the asset store was not an option, I decided to create a tiny prototype version first. I downloaded a free low-poly pack of birds, and took one of these models into Paint 3D, a Microsoft program which allows for easy colouring and editing of 3D models. Basing my “painting” upon reference images of a dead seabird (I didn’t pick an albatross for this first attempt), I coloured in the 3D model and imported it back into Unity.


Having figured out this much, I then repeated the same process as before with the image target, so that the bird model would hover over the Coca Cola logo when displayed. Finally, I flipped the bird upside down, making it look as “dead” as I could given the tools available to me. This was my second iteration, and it worked just fine. I tested it through the webcam on my laptop, to make sure it would respond to the Coca Cola logo. Success!


As a next step, I wanted to be sure that I could fully export the app to my mobile phone. I felt that it was important to test out the app as it was intended to be consumed, on the correct device. I also just wanted to practice the whole procedure, so that I would be more comfortable with the various convoluted steps when it came to exporting my final project to mobile.

On the first attempt, I managed to get the app onto my Android phone; however, it had lost its AR functionality. I checked my laptop to discover that somewhere along the way I had somehow ruined my project, so that the image target was no longer working. I sat down and carefully re-did every step, taking care to make sure SDKs and JDKs were correctly installed and so forth. This time – success! The app installed to my Android phone, and the AR component worked. I cut out a paper version of the Coca Cola logo to test with, and this worked beautifully. Fantastic! Now just to tidy things up and create a more “polished” version.



Mixed Reality (3)

A lot of my research around plastic waste has focussed upon its impact on the oceans. I had been thinking about how to communicate the impact that plastic has on the ocean and ocean life. So many of the statistics had really shocked me, but after talking through my ideas, I realised that whatever I produce has to be less about numbers and more about their visual impact. A statistic can be googled and read by any member of the public. What makes it unique to AR? If I used AR just to convey a statistic, it would appear quite gimmicky. Therefore, I needed some kind of visual which was more abstract, which would cause an emotional impact or some kind of response in the user.


One thing I actually noticed whilst researching is that a surprising number of seabirds are harmed by ocean plastic. I had presumed that it would mostly be aquatic life: fish, dolphins, turtles and so on. I hadn’t stopped to think about the birds, since I don’t immediately think “bird” when I hear “sea life”. It turns out that around 1 million seabirds die due to plastic every year. That’s a lot of birds. Most importantly, I think that a dead bird is just a really powerful image. There’s something vaguely poetic about a creature so free, able to fly and dart around, suddenly becoming lifeless and inert. It’s very contrasting and very shocking. People react far more strongly to seeing a dead bird than they might to a dead fish. A lot of religions and cultures tend to hold some awe around birds, treating them as a kind of spirit. To see one dead is very tragic.

(Image: gannets and guillemots, April 2013 – photo by Ian McCarthy)

My belief that seeing a dead bird could cause a strong response was soon confirmed. As I was reading up about seabird deaths, and browsing through images in articles, my classmates began to vocalise their horror and disgust. Several of my peers crowded round my desk to find out what I was researching and have me explain my ideas to them. They seemed genuinely upset by the dead bird imagery. It perhaps sounds a little sadistic, but their reactions were perfect: I now had confirmation that I was on the right track. Seabird deaths are directly linked to ocean plastic. Coca Cola produce 100 billion plastic bottles each year, adding to all of the existing plastic which our planet cannot break down. These bottles are making their way into the ocean. The plastic that Coca Cola generates is part of the problem that is killing sea life. Therefore, Coca Cola helps to kill seabirds. If I can create an app where the user is struck by an image of a dead seabird, and link this to the Coca Cola brand, I will have succeeded in my mission: to raise awareness and create a provocative statement about the impact Coca Cola has on the planet.


Mixed Reality (2)

Whilst gathering more research material around my theme of plastic bottle waste, I found several more alarming statistics. One issue which really stuck out to me was to do with the drinks company Coca-Cola. Greenpeace had conducted some research to discover how many plastic bottles the top six drinks companies produce. However, Coca-Cola refused to disclose this information to the public. I found this worrying – surely it implies they have something to hide? I continued reading the article, which explained how Greenpeace had formed their own estimate for Coca-Cola based upon sales figures for the number of units sold. Greenpeace claims that Coca-Cola produce over 100 billion plastic bottles a year. This translates into around 3,400 bottles per second – just from one drinks company. This plastic takes up to 2,000 years to biodegrade, and they are generating thousands more bottles every second. I found these statistics quite sickening.
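Just to sanity-check those numbers for myself (this is my own back-of-the-envelope arithmetic, not something taken from the Greenpeace article):

```python
# Rough check of "over 100 billion bottles a year" against "3,400 per second"
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000 seconds

bottles_per_year = 100e9  # the "over 100 billion" lower bound
print(round(bottles_per_year / SECONDS_PER_YEAR))  # 3171 bottles per second

# Working backwards from the quoted 3,400-per-second figure:
implied_total = 3400 * SECONDS_PER_YEAR
print(round(implied_total / 1e9))  # 107 (billion bottles a year)
```

So at exactly 100 billion bottles the rate comes out just under 3,200 per second, and the quoted 3,400 per second implies a total of roughly 107 billion – which is consistent with “over 100 billion”.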


With these numbers in mind, it seemed like the perfect opportunity to refine my focus even more.  Instead of focussing widely on the problem of plastic bottle waste, I would create an app which targets Coca-Cola specifically, exposing their plastic problem to their customer base and the wider public.


Mixed Reality

Mixed Reality is a project in which we will explore AR and VR through the creation of a mobile application. For this project we are being taught by Leon, an outside lecturer who is travelling up from London to guide us through the project. This is exciting in a sense, because we are lucky enough to get his specialist knowledge, but also a little daunting, because he can’t be up in Glasgow teaching us every day of the week. This means a lot of our project will need to be self-guided, with occasional tech support from him to keep us on the right track.

With all this in mind, I feel like the pressure is really on for this last project of the year. Hyper-conscious of timescales, I decided I needed to come up with an idea fast. My concept is based around the issue of plastic waste, something I have been very aware of recently after doing a clear-out in my flat. I did a little bit of research on the topic, and found a few statistics which shocked and saddened me. Since I had a fairly strong reaction to the topic, and since I couldn’t think of much else, I decided to go with this theme.

Next, I realised I needed to narrow things down a bit more. I decided that I would look at plastic bottle waste in particular, as this is a problem that is largely preventable, since most plastic bottles are made with PET, which is very easily recyclable. I would like my mobile app to make use of Augmented Reality, overlaying objects into the real environment to give the user a more personalised experience.

I created a quick one pager to outline my concept for this brief:


I don’t yet have a clear vision for the specifics of my app, but I’m hoping that by experimenting in Unity using the Vuforia toolkit, I will get more of a feel for what is possible, and develop a more focussed idea from there.

Typography (6)

Between juggling two other unfinished projects (Control and Living Walls), working on my current coursework (Mixed Reality) and writing an essay for our Design Histories and Theories module, I have somehow miraculously found the time to email back and forth about this very specific print job.

When I first thought out this concept, I hadn’t even begun to consider how difficult what I wanted to achieve might be. Yes, I understood that printers can’t print right to the edge of the paper, but I figured that, if I added a little margin in and then had the pages guillotined to size, they would be correctly aligned and everything would work nicely. Oh how naive! I guess if nothing else, this project has opened my eyes to the possibilities and limitations of print.

With a flat out “no” from Ryman, my next port of call was Hobbs Repro. This decision was based on two factors: their office is situated about 10 minutes away from my home, and their print room is pretty big and caters to lots of varied clients. I figured that they seemed like more specialist printers, and that they might therefore be able to realise my idea. I had a chat with the receptionist, who was lovely and helpful, but I’m not entirely sure she understood my concept. I was given a business card with an email address, and away I went.

After back-and-forth emailing with one of the print guys, over the course of two weeks and maybe five different iterations, I finally got somewhere. Each time I would send over a file asking “is this okay?”, and I’d need to add some crop marks, or extend the image into the bleed, or change the page order, or resize the pages themselves. Little things, but combined, rather time-consuming. Eventually, he told me that we had something he thought he could print, and would I like a sample? Hurrah! I hastily agreed, hoping that the quick printout he created would come right up to the edges as I’d envisioned.

I went to the print shop to pick up the sample . . . and alas, I do not think this dream was meant to be. The printing had come out fine, and the pages were aligned correctly etc. The inside margins were fantastic – the text went right across the inner spine/fold just as I had intended. However, the outside margins were way off. An entire letter was missing from the text. Despite supplying the document with a 5mm bleed on the outside edges, the text had still been sliced off. It looked as though they had been cut down by 10mm, double the margin we had agreed upon. I took the booklet home to have a good think.



The pages that didn’t rely on the outer edges were fantastic, but the ones that did, not so much. In the image above, I was supposed to have an entire “the” showing on that edge, but it came out as just an “e”. This was a real issue because it meant that the text didn’t read properly.

Thankfully, as this was just a “sample” booklet, I received it free of charge. Kudos to Hobbs – although they couldn’t deliver what I wanted, they did try their best to help me. Maybe I could have added a 10mm bleed on the outside edge and a 5mm bleed on the inside, but then I would have had to change the ratio of the images and resize each page. The whole thing already seemed like a bit of a palaver. I decided not to bother paying £10 for a booklet that, again, might not be what I wanted.

Sometimes our mistakes are as valuable as our successes. I learned from this project, much as it frustrated me. I had satisfied myself that I had tried. I really did try to make this project go the way I wanted it to, and it didn’t end up quite right. Not everything in life goes our way. I have accepted defeat, and decided to label this project as done.


As another small side project, I recently created a couple of very simple timeline infographics. This wasn’t exactly a creative endeavour, but rather a commercial one. My partner is currently writing the annual report for a small Edinburgh-based charity, and I volunteered to help create a couple of the pages. In previous years, this charity had just created a written list to detail past projects. This year, they wanted something slightly more visually engaging. My brief was to create a timeline to show the charity’s progress over the past few years. I was pretty much given freedom to create whatever I wanted; I was simply supplied the information I had to include.

The charity branding is about being fun, bright and new, so I decided to use simple shapes and bright, bold colours to tie in with this. There was a large amount of information supplied, so I split it into two separate timelines: one for the previous few years, and one for this financial year.


I like this timeline, as it is very simple and linear. Things are clearly displayed in a chronological order. I used Adobe Illustrator to create these graphics. I’d used it once before to create animation sequences, and once again to set up a file for laser cutting. However, Illustrator is relatively unknown to me, so this little project was as much a learning process as it was anything else.



The graphic above was the timeline I created for the entire year. I feel like this is slightly more convoluted and difficult to understand than the previous graphic, but at the same time it is still understandable. I found it slightly trickier to lay this out as there were several projects that had a duration of several months, and some months with no projects at all. Fitting everything in was a bit of an issue but I managed in the end with some trial and error. I think this infographic has a much more fun look than the previous one. This was partly unintentional – the circular shapes were more of a practical choice so that the odd spacing would look more intentional.

All in all this was a fun learning experience. I think it’s nice to do a little “commercial” style work, as it gives me a taste of what it might be like in an industry. Building up knowledge like this is useful to me if I ever want to work on freelance stuff. I think it’s also useful to my studio practice to know a little bit of each Adobe program.

Screen Language (6)

Having shown my video for our final presentations, I had decided I wanted to improve the audio quality before posting the finished piece to this site. With a little time to work on finishing off projects over our Easter break, I decided I would have a crack at fixing up the audio.

I took the existing audio track into Adobe Audition, and began listening to it to try and find the worst instances of background noise. I had already applied a background noise reduction using some Audition presets, which definitely improved it from the original version. However, there were still a lot of instances of children screaming during the dialogue. I planned to manually fix each individual noise in order to reduce the effect.

Unfortunately, Audition is still a bit of a mystery software to me. It has been so many months since I last used it for a project that I’d pretty much forgotten how to work it. I’m sure there might have been a more effective method, but in the end I just reduced the volume on the parts with bad background noise.


Listening to this as I edited the audio, I felt that it was working. However, when I listened over to the whole thing and paired it with the visuals, I started realising that the quiet sections were slightly too obvious. It didn’t flow well – instead there were just awkward and abrupt sections of silence. I decided that the previous version, even with all the background noise, was better than what I had managed to create.

Therefore, I believe that this video has reached a point where it is as good as it is going to get. Yes, I could spend endless hours trying to further improve it, but I think that the underlying issue came from my original video footage. What I would really need to do is reshoot the entire thing. At least this has been a good learning curve – I now know how important it is to scout quiet locations. I also think bringing a separate mic would have helped a lot – if I had placed a mic really close to my speaker, her voice would have been a lot louder than the background noise. Instead, my DSLR mic picked up everything at a similar volume.

In any case, I’m not too dissatisfied with where I managed to get with this piece. It was a relatively short amount of time to storyboard, location scout, travel to two different cities, direct, film, and then edit it all together. I think that the visuals are pretty good. Honestly, the only real issue is the sound quality, and now I know how to avoid this in future.

So, after a longish wait, here is the video itself:


Typography (5)

Having decided to go ahead and print something, I ran the sketch a few times in order to create a PDF that at least had some interesting pages, rather than an eternity of empty ones. Finally the sketch produced a PDF that I felt would do, and so I saved it to another folder. I decided that even though this version of the booklet was pretty interesting, it still had a few too many blank pages. In order to add just a little bit of interest, I decided to invert these empty pages so that they were coloured entirely black. I also deleted a few pages entirely, so that the number of pages was even and divisible by four (from my experience of making things like zines, this is good practice when it comes to book binding). Because I had spent so long waiting to see if I could correct my code, I hadn’t given myself enough days to get things professionally printed, so I had no choice but to do it myself in order to meet the deadline. This was a mistake . . .
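As an aside, the reason the page count needs to be divisible by four is that each folded sheet carries four booklet pages, two per side. A little sketch of the saddle-stitch printing order (my own illustration of the binding logic – nothing to do with the Processing sketch itself):

```python
def booklet_order(n):
    """Return the (left, right) page pairs to print, side by side,
    for a saddle-stitched booklet of n pages (n must be a multiple of 4)."""
    if n % 4 != 0:
        raise ValueError("page count must be divisible by 4")
    sides = []
    for i in range(n // 4):
        sides.append((n - 2 * i, 1 + 2 * i))      # front of sheet i
        sides.append((2 + 2 * i, n - 1 - 2 * i))  # back of sheet i
    return sides

print(booklet_order(8))  # [(8, 1), (2, 7), (6, 3), (4, 5)]
```

So for an 8-page zine, the outermost sheet has pages 8 and 1 on the front (back and front covers) and 2 and 7 on the back, which is exactly why an odd leftover page has nowhere to go.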

I went to the GSA library equipped with an InDesign file for the book. I had added bleed margins to the pages, so that the ink would go right up to the edges when I trimmed the pages to size. I checked over the settings, sent the file to the printer, and . . . nope. It didn’t work. I’d forgotten to select double-sided printing. Sigh. I tried again, fixing my mistake this time. Nope. Despite selecting double-sided, the booklet printed single-sided. I recruited the librarian at this point to assist me in my struggles. We managed to get it to print on both sides, but it wasn’t aligned properly on each side. We tried changing a few more settings, and this time I had the foresight to print just a two-page sample, since I’d been ploughing through paper and printing costs up until now. Yes! Finally it looked like it had worked. I thanked the librarian for his help, and printed out the remaining pages. Nope. Because I’d printed a couple of pages separately from the rest, the pages weren’t collated in the correct order when I folded them together as a booklet. No worries, I guessed. I printed off the entire thing again using the same settings, gave it a quick check, and – YES! – finally it appeared to have worked.

I rushed home clutching my precious cargo of correctly aligned pages, and set about cutting off the printing margins and folding the papers in half along their “spines”. Oh no. After trimming down two or three pages, it became evident that the pages I had thought had printed successfully were in fact also completely misaligned. At this point, there was little I could do. The deadline was the same day. The librarian had warned me that what I wanted to print might not be possible, and that I should go to a copy shop instead. It seemed he had been right. With only an hour or so to spare, I opened my PDF and found a way to view it in presentation mode. I felt disappointed that there would be no physical book to pass around, but at least I could display something.

In the end my book got a pretty good reception. I explained my coding struggles with trying to make the text stay within some vertical constraints, and talked about how the version I had wasn’t exactly what I wanted. However, the feedback made me feel a lot better. I will go ahead and get the booklet printed, seeking out a copy shop and talking them through my vision. Then, in my own time, I can try to fix the code and create the book that I had originally envisioned. Whether I will be able to get any further forward with the code remains to be seen, but for now I think I need to leave it be and come back to things in a few weeks with a fresh head.