Remember Marvin, the depressed robot from the legendary Hitchhiker's Guide to the Galaxy? Today I had a chance to high-five with him. Well, kind of… not really.
The point is that it was a robotic arm I fiddled with, and surely, if I were a robot with my arm taken away from me, I'd be pretty damn depressed. Luckily that's not the case, as I was rather cheerful on my way to the Open University campus in Milton Keynes to meet Lisa Bowers, a part-time PhD student working with touch and the arts.
We had scheduled a study in which she compared the performance of sighted and partially sighted people, or I should say participants, on a basic product design task, using a conventional low-level prototyping method and a kinaesthetic haptic device.
I really took away two major lessons, though the entire session was incredibly insightful. But before the conclusions, a bit more about the actual task and Lisa's motivation. As we warmed our minds up with a conversation, Lisa told me she was really inspired by a group called Artisans. "The Artisan team is made up of 25 highly skilled, passionate artists with a wide range of abilities and are experienced in working with a variety of exciting media. They organise sessions for all abilities, a wide range of ages and include specific sessions for people with mental health problems, learning disability, Asperger's, Stroke victims, physical disability, ME, isolated and vulnerable groups" – as you can read on their website.
Coming from a background of designing for people in special circumstances – not necessarily with special needs, but in circumstances like those of pregnant mothers, prisoners, people working night shifts and so on – the researcher wanted to make the creation of art as physically convenient as possible, to let the mind and creativity stretch their legs freely.
A typical task in design, for instance if you want to showcase a new chair design at a meeting, is low-level prototyping, i.e. steal a few sponge blocks from your kid and stack them on top of each other in a particular order to make a chair shape out of them. And that's exactly what I was meant to do, while being timed. Hey, that's pressure on me now. In any case, I examined the chair I was meant to reproduce, took a deep breath, and built it myself in an astonishingly slow 35 seconds. In my defence, I was asked to "think aloud" – social science research oriented people, that's your scientific terminology for the day – and this clearly hindered my physical assembly of the furniture. People who know me will confirm that the only thing I do more than talk or write is THINK! So thinking aloud – well, good luck stopping me at that.
Once we had played enough, and recorded my time for low-level prototyping with my hands, we sat at the computer desk, where the fun part started. I was terribly curious to finally see a robotic arm used for kinaesthetic haptic feedback, as so far I had only worked with mid-air and 3D-printed haptic or tactile experiences and their SDKs. There it was, the subject of the research papers I had been reading for a number of years. The task was simple: build the same chair, with four virtual blocks resembling the previously explored ones. Using the Geomagic device proved to be incredibly insightful. I have to admit I was sceptical about kinaesthetic haptic feedback, but I can tell you, if you confront a virtual wall, it feels like a rather stiff physical barrier. I experienced this as I had to make my construction inside a virtual box with well-defined borders. It was so realistic that when I instinctively felt for the wall with my spare hand, and of course found nothing but the empty desk surface, my neural circuits had a rather frustrating time. Working out what was actually happening in the real world wasn't so straightforward while my neurones were receiving two conflicting perceptions.
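For the curious, that stiff virtual wall is usually produced by so-called penalty-based haptic rendering: while the stylus tip is outside the wall, the motors apply no force, and the moment it penetrates the surface, a spring-like restoring force pushes it back out. A minimal sketch of the idea, with an illustrative stiffness value and a made-up function name (not the Geomagic SDK's actual API):

```python
def wall_force(penetration_m: float, stiffness_n_per_m: float = 2000.0) -> float:
    """Opposing force (in newtons) for a given penetration into a virtual wall.

    Outside the wall (penetration <= 0) there is no force; inside, a
    Hooke's-law spring force F = -k * x pushes the stylus back toward
    the surface, which is what makes the wall feel stiff.
    """
    if penetration_m <= 0.0:
        return 0.0  # stylus is in free space, motors stay idle
    return -stiffness_n_per_m * penetration_m  # restoring force toward surface

# Example: pressing 2 mm into a wall with k = 2000 N/m
print(wall_force(0.002))   # -4.0 (4 N pushing back out)
print(wall_force(-0.010))  # 0.0 (10 mm away from the wall, no force)
```

Real devices run this sort of loop at around a kilohertz so the wall feels solid rather than spongy, but the core idea really is this simple.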
Identifying the shapes of the blocks was a tad more difficult, but with only a bit of practice I'm sure it can be mastered. Overall, I found this class of haptic devices very convincing, though obviously bearing in mind the limitations on the fields of application of such solutions.
Just to conclude, on a note to myself if nothing else: make sure you have a secondary method at hand if you expect to analyse data based on what your participant told you during the "think aloud" session. In this particular case, I honestly tried, but during my 1:50 minutes of building the desired product I couldn't get any further than "well, now… now I am, so I and then… this block. I'm done I think." Whoops, I really tried. I hope, Lisa, that the fact that using the haptic device took up more cognitive power than I anticipated is also an informative finding when discussing the application of poor Marvin's arm for art purposes.