Tagtool is a creative toolkit for iOS.

An incredible tool for artists who want to draw and animate in very few steps.

You can even produce performance animation…

 

Exploring the Leap Motion interface, using the hand and fingers to control a puppet's head.

A very simple example of digital puppetry built in Unity.
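One way to drive the head is to map the palm position reported by the Leap Motion onto rotation angles. The sketch below shows only that mapping step (the millimetre ranges and the 45° limit are hypothetical tuning values; the actual Leap Motion API calls and the Unity scene are not shown):

```python
def hand_to_head_rotation(palm_x, palm_y,
                          x_range=(-100.0, 100.0),
                          y_range=(50.0, 250.0),
                          max_angle=45.0):
    """Map a palm position (mm) to head yaw/pitch in degrees, clamped."""
    def to_angle(value, lo, hi):
        t = (min(max(value, lo), hi) - lo) / (hi - lo)  # normalize to 0..1
        return (t * 2.0 - 1.0) * max_angle              # center and scale
    yaw = to_angle(palm_x, *x_range)
    pitch = to_angle(palm_y, *y_range)
    return yaw, pitch
```

Clamping keeps the puppet's head from snapping when the hand leaves the tracked volume.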


The Virtual Marionette installation “BePuppit” in a puppetry exhibition at Teatro Municipal de Vila do Conde.
We invite the audience to participate in this play, which recreates a traditional puppet booth.

“PUPPIT” is an interactive experience that explores different ways to control and animate virtual puppets in live environments using digital puppetry methods as a storytelling tool.
With “Body-PUPPIT” the user's body is the puppet controller, producing performance animation with body motion. We challenge the participants to explore their bodies to give life to puppets whose morphology differs from the human one, as if they were using hand shadows to create imaginative figures.
By playing with puppets in a dramatic way the user can create the illusion of life and build a narrative.

PUPPIT consists of several prototypes that are the work-in-progress of a PhD research project called “Virtual Marionette – Interaction Model for Digital Puppetry”. The goal of the prototypes is to explore and evaluate different rigging, mapping and interaction methods, and to develop tools for artists and non-experts that can be used in a collaborative environment for storytelling – making live narratives.

The exhibition is open every day from the 14th to the 22nd of December, except Mondays.

www.virtualmarionette.grifu.com
December 2013

Performance Animation LAB. at Cinanima 2013 (Espinho/Portugal) – Animation Film Festival

The main objective of this Lab is to disseminate Digital Puppetry as a methodology to create performance animation.
We present our methodology based on simple steps: Create->Rig->Map->Animate
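The four steps can be expressed schematically as a chain of functions (a toy sketch only; the asset name, bone names and bindings below are hypothetical placeholders, not the lab's actual tooling):

```python
def create():
    # Start from a drawing or photo (hypothetical asset name).
    return {"asset": "puppet.png"}

def rig(puppet):
    # Add a skeleton to the artwork.
    return {**puppet, "bones": ["head", "jaw"]}

def map_inputs(puppet):
    # Bind live input channels to bones (hypothetical binding).
    return {**puppet, "bindings": {"sensor_0": "jaw"}}

def animate(puppet, sensor_values):
    # Drive each bound bone from its current input value.
    return {bone: sensor_values.get(sensor, 0.0)
            for sensor, bone in puppet["bindings"].items()}

pose = animate(map_inputs(rig(create())), {"sensor_0": 0.7})
```

Each stage only adds information to the puppet, so the pipeline can be re-run from any step when the rig or the mapping changes.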

November 2013

Reel from the CINANIMA 2013 labs.


Result animation from the performance animation Lab at CINANIMA 2013 (Animation Film Festival in Espinho/Portugal).
Real-time storytelling and live directing with virtual marionettes using digital interfaces to create performance animation.
This play was performed with three digital puppeteers / actors and one director.

Framework:
Hardware used: 2 Wiimotes and 2 Nunchuks; 1 Microsoft Kinect; 1 iPad; 1 MacBook Dual-Core (2005); video projector.
Software used: Animata, Pure Data, OSCeleton, OSCulator, TouchOSC.
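These tools talk to each other over OSC (Open Sound Control) messages sent via UDP. As a minimal sketch of what travels on the wire, here is an OSC message encoder using only the standard library (the `/puppet/head` address in the test is a hypothetical example, not an address used by these tools):

```python
import struct

def osc_string(s):
    """OSC strings are null-terminated, then padded to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address, *args):
    """Build an OSC message whose arguments are big-endian float32 values."""
    type_tags = "," + "f" * len(args)
    payload = b"".join(struct.pack(">f", a) for a in args)
    return osc_string(address) + osc_string(type_tags) + payload
```

The resulting bytes could then be sent with `socket.sendto` to, for example, Pure Data listening on a UDP port.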

November 2013
www.virtualmarionette.grifu.com

Multitouch surface testing (FTIR)
The purpose of these tests is to build a puppetBox: a multitouch surface for manipulating puppets with the fingers.
I made the silicone layers with synthetic and cellular diluent.
Avoid dissolving the silicone with cellular diluent, because it takes far too long to dry; use synthetic diluent instead.
Of course, you can use more expensive materials to avoid these issues.

 

Material used:
– Optoma PK320 projector, 80 ANSI lumens
– EyeCam default lens with 850 nm filter (I switched back to the default lens because I used a very small surface)
– Infrared LED strip, 850 nm
– Baking paper with 2 layers of silicone + synthetic diluent
– CCV 1.2 on Mac and CCV 1.3 on Windows
Conclusions:
Systems:
– EyeCam gives better performance on Windows (60 fps) than on Mac OS X (30 fps)
– Flash CCV demos run twice as fast on Windows compared to Mac OS X
Screens:
– The Rosco screen gives a high-resolution picture and a great touch feel
– Baking paper gives a brighter picture with lower resolution, but it is very low-cost
– The Rosco screen needs more layers of silicone to track blobs

Experiment made in September 2013

Sensor  Joint             Sensor  Joint
0       Head              12      Right Elbow
1       Neck              13      Right Wrist
2       Torso             14      Right Hand
3       Waist             15      Right Fingertip
4       Left Collar       16      Left Hip
5       Left Shoulder     17      Left Knee
6       Left Elbow        18      Left Ankle
7       Left Wrist        19      Left Foot
8       Left Hand         20      Right Hip
9       Left Fingertip    21      Right Knee
10      Right Collar      22      Right Ankle
11      Right Shoulder    23      Right Foot
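The table above can be turned into a lookup for routing incoming skeleton data (a minimal sketch; the numbering follows the table, and `joint_name` is an illustrative helper, not part of any SDK):

```python
# Sensor index -> joint name, as listed in the table above.
SKELETON_JOINTS = {
    0: "Head",          1: "Neck",            2: "Torso",          3: "Waist",
    4: "Left Collar",   5: "Left Shoulder",   6: "Left Elbow",     7: "Left Wrist",
    8: "Left Hand",     9: "Left Fingertip", 10: "Right Collar",  11: "Right Shoulder",
    12: "Right Elbow", 13: "Right Wrist",    14: "Right Hand",    15: "Right Fingertip",
    16: "Left Hip",    17: "Left Knee",      18: "Left Ankle",    19: "Left Foot",
    20: "Right Hip",   21: "Right Knee",     22: "Right Ankle",   23: "Right Foot",
}

def joint_name(index):
    """Return the joint name for a sensor index, or 'Unknown'."""
    return SKELETON_JOINTS.get(index, "Unknown")
```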


A simple example of how to use bend sensors to control the mouth of a puppet, as in glove puppetry.

This project was created in 2009, using three bend sensors connected to an Arduino that sends messages to Processing.
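The conversion from a bend reading to a mouth opening can be sketched as follows (assumptions: a 10-bit Arduino ADC reading in 0–1023, and the calibration range and maximum jaw angle are hypothetical values you would tune per sensor):

```python
def bend_to_mouth_angle(raw, raw_min=200, raw_max=800, max_angle=60.0):
    """Map a raw bend-sensor ADC reading to a mouth-opening angle in degrees.

    Readings are clamped to [raw_min, raw_max] and scaled linearly,
    so a relaxed sensor closes the mouth and a fully bent one opens it.
    """
    clamped = min(max(raw, raw_min), raw_max)
    t = (clamped - raw_min) / (raw_max - raw_min)
    return t * max_angle
```

In the original setup the readings arrive in Processing over the serial port; here the conversion is isolated so it can be tested on its own.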