SPROUT
Air Giants in collaboration with Oya Celiktutan, Jeffrey Chong, Theodore Lamarche & Bowen Liu
How can we teach robots to respond to human behaviour?
Sprout, a soft, huggable robot made almost entirely of fabric and inspired by nature, is resting within this walled enclosure. Approach it one at a time, and the robot will respond to your movements and closeness, changing colour and moving in its own way.
Through its interactions with you and other visitors, Sprout is learning how we humans use our personal space. The data it collects will help inform researchers about how humans use space as a form of non-verbal communication, facilitating more effective communication between humans and robots.
As an example of ‘soft robotics’, Sprout defies our preconceptions about robots, which are typically made from rigid, hard materials. It invites us to rethink what robots can be for and to imagine new possibilities for human–robot interaction.
CREDITS
Commissioned by Science Gallery London
Supported by King’s Culture and the Faculty of Natural, Mathematical and Engineering Sciences, King’s College London
For Air Giants: Richard Sewell
For King’s Department of Engineering: Oya Celiktutan, Jeffrey Chong, Theodore Lamarche, Bowen Liu
AIR GIANTS are a creative robotics studio building wonderful inflatable, pneumatically controlled creatures. They are the first in the world to make interactive soft robotics at this scale, and the only people making them for display in public spaces. Their work gives people remarkable experiences with (quite literally) monumental inflatables. These are artworks that evoke instinctive and emotive responses from audiences at arts festivals, in shopping centres and on high streets. The work is enjoyed by young and old, by those who get up close and personal with the robots as well as those who prefer to admire the spectacle. Website
DR OYA CELIKTUTAN is a roboticist and Senior Lecturer at the Department of Engineering, King’s College London. Her research aims to address two broad questions: How can we model human behaviour from multimodal data? How can we transfer these models to robots for learning, action, and interaction? Her research group, the Social AI & Robotics Laboratory, explores these questions, specifically the intersection of machine learning and human–robot interaction, from an interdisciplinary perspective. Website / Twitter / LinkedIn