Shape Your Body

Shape Your Body is a multi-platform digital puppetry application that lets users manipulate silhouette puppets with their bodies. It makes use of low-cost motion capture technology (the Microsoft Kinect) to provide an interactive environment for performance animation. It challenges users to explore their bodies as marionette controllers and, in doing so, to get to know their own bodies a little better.

This project brings the art of shadow puppetry into digital performance animation.

Shadow theatre is an ancient art form that brings life to inanimate figures, and shadow puppetry is a great environment for storytelling. Our objective was therefore to build a digital puppetry tool that non-expert artists can use to create expressive virtual shadow plays using body motion. A framework based on the Microsoft Kinect was built with OpenNI and Unity to animate a silhouette in real time. We challenge the participants to play and to search for the most adequate body movement for each character.

Requirements to run this application
– PC / Mac computer
– OpenNI drivers version 1.5 (the Zigfu drivers are available in the Files section below)
– Microsoft Kinect sensor


This project was published at ACM CHI 2012. The article is available for download.

Poster for CHI

 

Video
https://vimeo.com/37128614

 

BodyPuppetry

With BodyPuppetry, the user's body is the puppet controller, producing performance animation with body motion. We challenge the participants to explore their bodies to give life to puppets whose morphology differs from the human body, as if they were using hand shadows to create imaginative figures.

BePuppit: BodyPuppetry is an interactive application that explores the potential of body motion to drive virtual puppets, in particular silhouettes.
Users are challenged to deconstruct their bodies and use them as marionette controllers to drive silhouettes. Because these puppets are two-dimensional, the player needs to find the body poses that work best with each puppet. I call this search for the best pose the distance of manipulation: the more direct the manipulation (when the manipulated subject mirrors the puppeteer), the more acting skills are needed; puppetry skills are needed as this distance grows.
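To make the two-dimensional constraint concrete, the sketch below shows one way tracked 3D joints could be flattened onto the silhouette plane and turned into a limb angle in Unity. It is only an illustrative sketch: the IJointSource interface and the joint names are assumptions, not the project's actual code or the OpenNI wrapper's API.

using UnityEngine;

// Hypothetical joint source; in the project the joints come from the
// OpenNI wrapper for Unity, whose actual API may differ.
public interface IJointSource
{
    Vector3 GetJointPosition(string jointName);
}

// Flattens two tracked 3D joints onto the silhouette plane (x/y, depth
// discarded) and rotates a 2D limb around its pivot to follow the arm.
public class SilhouetteLimb : MonoBehaviour
{
    public string fromJoint = "RightShoulder";  // assumed joint names
    public string toJoint = "RightHand";
    public IJointSource joints;                 // assigned at runtime

    void Update()
    {
        Vector3 a = joints.GetJointPosition(fromJoint);
        Vector3 b = joints.GetJointPosition(toJoint);

        // Project onto the 2D puppet plane by discarding depth (z).
        Vector2 direction = new Vector2(b.x - a.x, b.y - a.y);

        // Rotate the silhouette limb to follow the projected arm direction.
        float angle = Mathf.Atan2(direction.y, direction.x) * Mathf.Rad2Deg;
        transform.localRotation = Quaternion.Euler(0f, 0f, angle);
    }
}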

BodyPuppetry is part of a digital puppetry research project called Virtual Marionette.

BodyPuppetry challenges participants to use their bodies as puppetry controllers, giving life to a virtual silhouette through acting.

Inspired by traditional marionette methods, such as shadow puppetry, the goal of the project is to study novel interfaces as digital puppetry controllers for performance animation. In this particular case the challenge was to deconstruct the body as if it were a marionette controller in order to give life to non-human silhouette figures. A human-like 3D model is also included for comparison purposes. This project was used in an experimental study aimed at understanding how non-expert artists would behave with their bodies when challenged to control silhouette figures.

Implementation
This application was developed for the Microsoft Kinect device using the OpenNI wrapper for Unity. NITE gestures drive a virtual cursor, allowing the interface to be controlled with the hands. (In this release, only the Windows version works with NITE.)
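As a rough illustration of that hand-driven cursor, the following Unity sketch moves an on-screen cursor to follow a tracked hand position. It is an assumption-laden sketch, not the actual NITE integration: the IHandSource interface and its normalized hand position stand in for whatever the OpenNI/NITE wrapper actually provides.

using UnityEngine;

// Hypothetical hand-position source; in the project this role is played
// by NITE hand tracking through the OpenNI wrapper (Windows only here).
public interface IHandSource
{
    // Normalized hand position in the range [0,1] x [0,1].
    Vector2 NormalizedHandPosition { get; }
}

// Moves a UI cursor (a RectTransform) so that it follows the tracked hand.
public class HandCursor : MonoBehaviour
{
    public RectTransform cursor;   // the on-screen cursor element
    public IHandSource hands;      // assumed to be assigned at runtime

    void Update()
    {
        Vector2 p = hands.NormalizedHandPosition;

        // Convert the normalized position to screen pixels and move the cursor.
        cursor.position = new Vector3(p.x * Screen.width, p.y * Screen.height, 0f);
    }
}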

Requirements
– PC/Mac with OpenNI drivers (version 1.5)
– Microsoft Kinect sensor

If you need the drivers, you can download the Zigfu installer for Mac or Windows, available in the Files section.

Files



Acknowledgements
Thanks to Luís Silva (the author of the Hercules figure), Luís Felix (the author of the Punch character), Marcelo Lafontana for all the puppetry support and knowledge, and to Sónia Barbosa, Vasco Barbosa, and Marta Barbosa for being my inspiration.

This project was developed and released in 2012 by Luís Leite (aka GRIFU) for the Digital Media PhD. For more information, please visit WWW.VIRTUALMARIONETTE.GRIFU.COM.

BePuppit

BePuppit is a set of tools, applications, and experiments based on interaction methods and techniques to drive and manipulate digital performing objects with our bodies.

Mapping human body limbs to digital subjects is not trivial, in particular when mapping our bodies to digital models with non-human morphology.
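One way to make that mapping explicit is a simple retargeting table from tracked human joints to puppet parts, re-interpreting the motion along the way (for example, letting the height of a hand drive the flap of a wing). The sketch below is purely illustrative; the joint names, puppet parts, and ranges are assumptions rather than the project's actual rig.

using System.Collections.Generic;
using UnityEngine;

// Illustrative retargeting table: which tracked human joint drives which
// part of a non-human puppet, and how the motion is re-interpreted.
public static class PuppetRetargeting
{
    // Hypothetical mapping for a bird silhouette.
    public static readonly Dictionary<string, string> BirdMapping =
        new Dictionary<string, string>
        {
            { "RightHand", "RightWingTip" },   // hand height -> wing flap
            { "LeftHand",  "LeftWingTip"  },
            { "Head",      "BeakPivot"    },   // head tilt -> beak opening
            { "Torso",     "Body"         },   // torso sway -> body rock
        };

    // Re-interpret a vertical hand offset (in metres) as a wing angle.
    public static float HandHeightToWingAngle(float handY, float shoulderY)
    {
        float offset = handY - shoulderY;               // above/below the shoulder
        return Mathf.Clamp(offset * 90f, -45f, 75f);    // degrees of flap
    }
}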

PUPPIT – Play with Virtual Puppets
Inspired by traditional puppetry, we challenge participants to explore their bodies as marionette controllers by playing with puppets. This performance-driven animation project explores different digital puppetry methods, presenting new ways to interact with virtual puppets.

PUPPIT consists of several prototypes that form the work-in-progress of a PhD research project called “Virtual Marionette – Interaction Model for Digital Puppetry”. The goal of the prototypes is to explore and evaluate different rigging, mapping, and interaction methods, and to develop tools for artists and non-expert artists that can be used in a collaborative environment for storytelling, making live narratives.

For any of the projects based on the Kinect you should download and install OpenNI 1.5.

Here are two simple installation packages developed by ZigFu that provide the OpenNI drivers in a very straightforward way: just download and install, and you can start using the Microsoft Kinect.