interactivearchitecture.org/
Using a combination of industrial robotics and high-power magnets, a seemingly inconspicuous frame on a wall magically comes to life. Through a series of experimental films, photographs and physical prototypes, the primitive effects of eye (and eye-like) stimuli have been investigated. The Eye Catcher project concludes with a novel expressive interface in which emotion-recognition algorithms read audience faces and in turn trigger the animation of a face formed of ferrofluid.
As people walk by, unaware of the interactive installation, they catch an unexpected movement out of the corner of their eye. Turning their head, they find an empty frame on the wall appearing to move towards them, and as they stop in disbelief, it positions itself to look straight at them. Suddenly, from the murky black liquid sitting in the bottom of the frame, two primordial pupils rise up and seem to stare back at the viewer. A hidden pinhole camera in the frame captures the facial expressions of the onlooker and responds with a range of emotions crafted out of the subtle manipulation of motion cues. An uncanny and playful interaction is formed as expressions are exchanged.
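The interaction loop described above (hidden camera, emotion recognition, expressive motion response) might be sketched as below. This is a minimal illustration only: the emotion labels, cue parameters and function names are assumptions for the sake of the sketch, not the project's actual implementation, which is not published here.

```python
# Illustrative sketch: map a recognised emotion label to motion cues for
# the two ferrofluid "pupils". All names and values are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class MotionCue:
    rise: float     # 0..1, how far the pupils rise out of the fluid
    spacing: float  # 0..1, distance between the two pupils
    tempo: float    # animation speed multiplier

# Assumed mapping from a recognised emotion to an expressive response.
EMOTION_CUES = {
    "neutral":  MotionCue(rise=0.5, spacing=0.5, tempo=1.0),
    "happy":    MotionCue(rise=0.8, spacing=0.6, tempo=1.4),
    "surprise": MotionCue(rise=1.0, spacing=0.7, tempo=1.8),
    "sad":      MotionCue(rise=0.3, spacing=0.4, tempo=0.6),
}


def respond_to(emotion: str) -> MotionCue:
    """Return the motion cue for a recognised emotion, falling back to neutral."""
    return EMOTION_CUES.get(emotion, EMOTION_CUES["neutral"])


if __name__ == "__main__":
    # e.g. a surprised onlooker triggers a fast, wide-eyed response
    print(respond_to("surprise"))
```

In the installation itself, each cue would then drive the robotics and magnets behind the frame; here it is just a lookup, to show the shape of the camera-to-motion pipeline.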
Principal Researchers: Lin Zhang, Ran Xie
Supervisors: Ruairi Glynn and Dr Christopher Leung with William Bondin
The Interactive Architecture Lab is a multi-disciplinary research group and Masters Programme at the Bartlett School of Architecture, University College London. Interested in the Behaviour and Interaction of Things, Environments and their Inhabitants, the Lab focuses on Kinetic Design and Robotics, Multi-Sensory Interfaces, the Internet of Things and the design of public interactive installations.