This week I attended EICS 2017 in Lisbon, where I met many wonderful people, such as Paul Dourish, who has published a new book called “The Stuff of Bits”, and Gilbert Cockton, the Editor-in-Chief of ACM Interactions.

Demo and poster of Mani-Pull-Action at EICS

The 9th ACM SIGCHI Symposium on Engineering Interactive Computing Systems was full of interesting interactive surprises, and a great place to learn about new trends and innovative approaches to interactive systems.

I had the privilege of presenting a full paper entitled “Mani-Pull-Action: Hand-based Digital Puppetry” and of providing a live demonstration to all the participants. A great opportunity to get feedback about this work.

The new version of Stringless for Unity supports Zeroconf networking (through Bonjour) for auto-discovery, as well as RemoteUI.


Just press play in Unity, and all the parameters you have chosen for remote control will automatically appear in RemoteUI on another computer or on an iOS or Android device.
You no longer need to set up OSC addresses manually: just drag a Receiver onto a GameObject and define the value range, and RemoteUI will automatically recognize it and show a slider for you to manipulate. These sliders can also be mapped to MIDI devices or other HID controllers, much like a MIDI-learn feature: press the name of the parameter in RemoteUI until it starts to blink, then move a slider on your MIDI controller or a joystick on a gamepad. The mapping will be saved for future sessions. You can also define distinct devices for the same mapping, allowing device interchange.
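
Conceptually, the learn-and-save step is just a small binding table that persists between sessions. A minimal sketch in Python of how such a table might work (the function and device names are illustrative assumptions, not Stringless’s actual API):

```python
import json

def learn_mapping(mappings, parameter, device, control):
    """Bind a remote parameter to a physical control, MIDI-learn style.

    Several devices may be bound to the same parameter, which is what
    allows device interchange between sessions.
    """
    bindings = mappings.setdefault(parameter, [])
    binding = {"device": device, "control": control}
    if binding not in bindings:  # ignore repeated learns of the same control
        bindings.append(binding)
    return mappings

def save_mappings(mappings, path):
    """Persist the bindings so they survive future sessions."""
    with open(path, "w") as f:
        json.dump(mappings, f, indent=2)
```

For example, learning `"mouth"` first from a MIDI slider and then from a gamepad stick leaves both bindings in place, so either device can drive the parameter.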


Stringless support for RemoteUI is just a prototype and still in a development phase.
It will be available for download soon.


A demo and poster of Mani-pull-action was shown at the Austin|Portugal Annual Conference at UNL Lisboa on May 23rd and 24th, 2016.

Manipullaction at Austin Conference 1

For this exhibition, Mani-pull-action was shown inside a paper theatre, or Kamishibai, as a physical connection to, and metaphor for, the art of puppetry. The participants were invited to manipulate different puppets with their hands and experiment with digital puppetry.

Manipullaction at Austin Conference

Each puppet required a different level of dexterity, simulating direct and indirect manipulation with physics and inverse-kinematics approaches. The participants could also play with virtual cameras and lights using a touch device with a set of controls.


This was another contribution to disseminating digital puppetry among the community and attracting more enthusiasts to this art form.



LEAP IT – STRING is a simple application that lets you share the hand-motion data captured by the Leap Motion over the network via the OSC protocol, with the following features:

– Record the performance
– Replay a recorded performance
– Select joints for broadcast
– Convert a recorded binary performance into a text file
– Hide/Show the GUI
– Convert the scale of the hand motion data
– Switch palm orientation order between pitch, yaw, roll and yaw, pitch, roll
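
The last two features amount to simple transforms on the streamed palm data. A minimal sketch of what such a conversion might look like (the function name, units, and defaults are my own assumptions, not the application’s API):

```python
def convert_palm(position_mm, orientation_pyr, scale=0.001, order="PYR"):
    """Scale palm position (e.g. Leap Motion millimetres to metres) and
    reorder the orientation triple between pitch-yaw-roll ("PYR") and
    yaw-pitch-roll ("YPR")."""
    pos = tuple(v * scale for v in position_mm)
    pitch, yaw, roll = orientation_pyr
    rot = (pitch, yaw, roll) if order == "PYR" else (yaw, pitch, roll)
    return pos, rot
```

With `order="YPR"`, a palm reading of `(10.0, 20.0, 30.0)` degrees is re-emitted as `(20.0, 10.0, 30.0)`, and positions in millimetres come out in metres.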

This application is a “string” from the Virtual Marionette interaction model developed by Luís Grifu for research on digital puppetry. It is free to use.

LeapIT String is now available for Mac and Windows

## Digital Hand Puppetry

ManiPullAction is a digital hand-puppetry prototype built for a research project called Virtual Marionette. It was developed in March 2015 to evaluate hand dexterity with the Leap Motion device for manipulating digital puppets in real time. One year after its development, the maniPULLaction prototype is now released for Windows and OS X.
This prototype proposes an ergonomic mapping model that takes advantage of the hand’s full dexterity for expressive performance animation.

Try and evaluate

You can try the model and contribute with your evaluation by recording the 12 tests that are available: 8 tests with the puppet Luduvica and 4 tests with the Cardboard Boy puppet. If you agree to take the tests, just write a name in the label and start by pressing the “1” key or the “>” button. After the first test, you can switch to the next by pressing the “>” button or the numeric keys “2”, “3”, and so on up to “8”. Then you can jump to the next puppet by pressing the “2” button or the “Shift-F2” key. There are 4 more tests with this puppet, accessed with the “1”, “2”, “3”, “4” keys or the “>” button. After finishing all the tests, you can send the saved files to “”. You can find or play back the files by pressing “browse”. The files are text files and motion-capture binary files.

How to use

Just use your hand to drive the puppets. This prototype presents 4 distinct puppets with different interaction approaches, which you can experiment with by pressing the 1, 2, 3, and 4 keys.

  • The palm of the hand drives the puppet’s position and orientation
  • The little finger drives the eye direction (pupils) with 2 DOF (left and right)
  • The ring and middle fingers are mapped to the left and right eyebrows with 1 DOF each (up and down)
  • The index finger drives the eyelids with 1 DOF (open and close)
  • The thumb is mapped to the mouth with 1 DOF (open and close)
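
Each mapping above boils down to clamping a raw tracking value and remapping it linearly onto the puppet’s DOF range. A minimal sketch of that idea (the table and ranges are illustrative, not the prototype’s actual calibration):

```python
# Hypothetical table mirroring the hand-to-puppet mapping listed above.
FINGER_TO_DOF = {
    "palm":   "position and orientation",
    "pinky":  "eye direction (2 DOF)",
    "ring":   "left eyebrow (1 DOF)",
    "middle": "right eyebrow (1 DOF)",
    "index":  "eyelids (1 DOF)",
    "thumb":  "mouth (1 DOF)",
}

def map_dof(value, in_min, in_max, out_min=0.0, out_max=1.0):
    """Clamp a raw tracking value and remap it linearly onto a DOF range."""
    value = max(in_min, min(in_max, value))
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)
```

For instance, a thumb flexion of 45 degrees on an assumed 0–90 degree range would drive the mouth DOF to 0.5 (half open), while out-of-range readings stay pinned at fully closed or fully open.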

There are other interaction methods with each puppet:

  1. Luduvica – a digital glove puppet. This yellow bird offers simple manipulation by driving just the puppet’s head. If you use a second hand, you can have two puppets for a dramatic play. The performance test (“7” key) allows you to drive the puppet in the Z-depth.
  2. Cardboard Boy (paperboy) – a digital wireless marionette (or rod puppet). This physics-based puppet extends the interaction of Luduvica by introducing physics and a second hand for direct manipulation. As in traditional puppetry, you can play with the puppet with an external hand.
  3. Mr. Brown – a digital marionette/rod puppet. This physics-based puppet takes a similar approach to the Cardboard Boy but with more expressions. A remote console interface allows driving a full set of facial expressions, and the second hand allows the digital puppeteer to drive the puppet’s hands (although not available in this version).
  4. Minster Monster – a digital muppet with arm and hand. The face and hands are the most expressive parts of our bodies, and of monster characters as well. While the head of the puppet is driven by the palm of the hand, the mouth is controlled by a pinch gesture, much like with traditional muppets. A second hand drives the arm and hand of the monster for expressive animation.

Live cameras and lighting

To simulate a live show you can switch among cameras, change the lighting setup or even focus and defocus the camera.

It also offers live camera switching and lighting control through remote OSC on port 7000, using applications such as TouchOSC (try the “simple” preset). Switch between the 4 cameras with the messages “/2/push1” to “/2/push4”; to manipulate the lights, use “/1/fader1” to “/1/fader4”; to focus and defocus camera 3 in the Cardboard and Mr. Brown scenes, use “/1/fader5”. You can also navigate through scenes with the addresses “/2/push13” to “/2/push16”.
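
The remote console speaks plain OSC over UDP, so any OSC client can stand in for TouchOSC. As a sketch, here is a minimal message encoder and sender in pure Python (my own illustration of the OSC 1.0 wire format, not part of the prototype):

```python
import socket
import struct

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC message with float32 arguments.

    OSC strings are NUL-terminated and padded to a 4-byte boundary;
    float arguments are 32-bit big-endian, declared by an "f" type tag.
    """
    def pad(raw: bytes) -> bytes:
        # Pad to the next 4-byte boundary, always adding at least one NUL.
        return raw + b"\x00" * (4 - len(raw) % 4)

    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(args)).encode("ascii"))
    for value in args:
        msg += struct.pack(">f", value)
    return msg

def send_osc(address: str, value: float,
             host: str = "127.0.0.1", port: int = 7000) -> None:
    """Fire one OSC message at the prototype's default port over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message(address, value), (host, port))
```

Assuming the application is listening locally on port 7000, `send_osc("/2/push2", 1.0)` should switch to camera 2 and `send_osc("/1/fader1", 0.8)` should raise the first light.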


maniPULLaction for OSX

maniPULLaction for Windows


Developed by Grifu in 2015 for the Digital Media PhD at FEUP (Austin | Portugal), supported by FCT. It requires the Leap Motion device.

Demo video


mani-PULL-action is a digital hand-puppetry interactive environment that provides a rich and expressive way to create performance animation.

It is based on the maxim that our hands are a powerful tool for manipulation and that we are used to manipulating all kinds of things on an everyday basis.

digital hand puppetry

ManiPULLaction makes use of hand-based interface devices such as the Leap Motion to offer a high degree of manipulation freedom, providing an expressive means of creating real-time animation that can enhance the storytelling experience.

You can find some previous video experiments such as the DigiMario


or the Luduvica digital bird puppet


Or Mr. Gonzaga.