This project challenges the way a person interacts with objects by digitally altering their perception of what they embody. If the body is seen as a system capable of unfamiliar interactive movement, how can such interactions be achieved? By augmenting what a person sees of their own limbs, I explored how they might try to interact with other objects and how their intentions might change.
The implementation substitutes a person’s arm with a simple object for them to move. The user places their arm beneath a screen, which gives them the illusion that their arm has been replaced with a digital object, similar to the famous rubber hand illusion. As the user moves their hand and arm underneath the monitor, the object displayed on screen follows in kind. The replacement can be a physical object scanned with a Kinect, or a 3D model created digitally. The object’s movement is driven by a Leap Motion controller, which tracks the position of the user’s arm so the object can be rendered at the corresponding position on screen. Interactions performed with these hand-replacing objects included moving a virtual ball around and drawing a line.
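The core of this setup is a mapping from the tracked arm position to the on-screen object. A minimal sketch of that mapping is below; it does not use the actual Leap Motion SDK, and the tracking ranges, screen size, and function names are illustrative assumptions, not the project's real code.

```python
# Illustrative sketch: map a tracked palm position (x, y, z in mm,
# with the sensor at the origin, as a Leap-style device reports it)
# to a 2D pixel position for the on-screen replacement object.
# The tracking ranges and screen resolution below are assumptions.

def palm_to_screen(palm_xyz,
                   x_range=(-120.0, 120.0),   # mm, left-right over the sensor
                   z_range=(-120.0, 120.0),   # mm, toward/away from the user
                   screen=(1920, 1080)):
    """Linearly map the palm's horizontal plane (x, z) to screen pixels,
    clamping so the object never leaves the display."""
    x, _, z = palm_xyz
    w, h = screen

    def norm(v, lo, hi):
        t = (v - lo) / (hi - lo)
        return min(max(t, 0.0), 1.0)   # clamp to [0, 1]

    px = int(norm(x, *x_range) * (w - 1))
    py = int(norm(z, *z_range) * (h - 1))
    return px, py
```

Each tracking frame, the renderer would draw the scanned or modelled object at the returned pixel position, so the object appears to shadow the hidden arm.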
Once the user sees the object on screen moving with their body, it can provoke a placebo-like effect: they adopt behaviours or perceived functions of the acquired object. I found it interesting how much impact a digital object can have on a person even when that object is not bound by the laws of the physical world.
In one test of the implementation, users tried to grab a virtual ball using a digital paw. They would shape their hand as best they could to mimic a cat’s paw, abandoning the opposable thumb in favour of a swiping action. Likewise, when the hand was replaced with a butterfly instead of the paw, users felt inclined to move delicately and interact slowly with the virtual ball.
Another test involved drawing, and revealed different flows of action with different hand replacements. For example, if the object was a pen, the user would attempt to draw as if holding a pen; an airbrush gun instead prompted a direct pointing action, with the nozzle acting as the trigger for drawing.
This project was presented as a poster for IE2014 in Newcastle, Australia.