Virtual Marionette puppet tools were used to develop several interactive installations for the animation exhibition Animar 12 at the cinematic gallery Solar in Vila do Conde. The exhibition opened on the 18th of February 2017.


Animar 12 - Interactive Installations

Faz bem Falar de Amor

An interactive installation that challenges participants to re-enact, with virtual characters, scenes from the animated music video that Jorge Ribeiro created for the song “Faz bem falar de amor” by Quinta do Bill.

The Puppit tool was adapted to drive two cartoon characters (a very strong lady and a skinny young man) using the body motion of visitors captured with a single Microsoft Kinect. The virtual characters' skeletons differ from human body proportions, so each puppet behaves in ways that do not exactly mirror the participant's movement. Although our intention was to challenge participants to adapt their bodies to the target puppet, we assisted them a little. To bridge this discrepancy, I used two skeletons: a human-like skeleton mapped directly to the performer's body, and the virtual character's skeleton mapped to this human clone skeleton through an offset-and-scale function. This made it possible to scale the movement of specific joints of the clone skeleton up or down, making the virtual character behave in a more natural, cartoonish way. This application was developed using Unity, OpenNI and Blender.
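The offset-and-scale idea can be sketched as a small per-joint remapping table. This is a minimal illustration, not the project's actual Unity code; the joint names and tuning values are hypothetical.

```python
# Sketch of the two-skeleton retargeting described above (hypothetical
# names and values; the installation itself ran in Unity with OpenNI).
# Each clone-skeleton joint angle is remapped onto the cartoon skeleton
# with a per-joint offset and scale before being applied to the puppet.

# Per-joint (offset_degrees, scale) tuning table -- illustrative values.
RETARGET = {
    "shoulder_l": (0.0, 1.4),   # exaggerate arm swings
    "shoulder_r": (0.0, 1.4),
    "spine":      (5.0, 0.6),   # damp torso motion for the heavy character
    "head":       (0.0, 1.0),
}

def retarget_joint(name, clone_angle):
    """Map a clone-skeleton joint angle to the cartoon skeleton."""
    offset, scale = RETARGET.get(name, (0.0, 1.0))
    return offset + scale * clone_angle

def retarget_pose(clone_pose):
    """Remap every joint of a captured pose dict."""
    return {name: retarget_joint(name, a) for name, a in clone_pose.items()}
```

Joints missing from the table pass through unchanged, so only the joints that need cartoonish exaggeration have to be tuned.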

É Preciso que eu Diminua

This animated music video was created by Pedro Serrazina for Samuel Úria. It tells the story of a character who suffers from a scale problem, towering over the surrounding buildings. The character tries to shrink by pulling his arms and legs close to his body; at the same time, he feels the need to break free of those strict boundaries and push everything around him outward. To convey this feeling of bodily expansion, the visitor drives a shadow-like silhouette with a contour line around it that grows when the participant expands his body and shrinks when he contracts it. The silhouette, captured with a Microsoft Kinect, is projected onto a set of cubes that deform the body shape. This application was developed with openFrameworks using ofxKinect and ofxOpenCV.
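The grow/shrink behaviour can be sketched by estimating how spread out the silhouette is and scaling the contour accordingly. This is a hypothetical illustration; the installation computed its contours with ofxOpenCV.

```python
# Sketch of the contour-growth logic described above (hypothetical names
# and tuning constants; the real version used openFrameworks + ofxOpenCV).
# The silhouette's spread is estimated from its bounding box, and the
# contour offset grows with it.

def bounding_box(points):
    """Axis-aligned bounding box of the silhouette points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)

def contour_offset(points, base=10.0, gain=0.05):
    """Contour line offset grows with the silhouette's spread."""
    x0, y0, x1, y1 = bounding_box(points)
    area = (x1 - x0) * (y1 - y0)
    return base + gain * area ** 0.5
```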


For this short animated film produced by Jorge Miguel Ribeiro, which addresses the Portuguese colonial war, the intention was to trigger segments of the film whenever a visitor places his body over a mine. The two segments show two distinct perspectives on the war: that of a father who experienced it, and that of his child, who came to understand it through his father's indirect account. A webcam captures the position of the visitor's body, and whenever it enters the mine area an OSC message is sent as a trigger to a video player application. Both the video trigger and the video player applications were developed with openFrameworks.
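The trigger side of this setup amounts to a point-in-region test followed by an OSC message. The sketch below is illustrative (the installation used openFrameworks, and the address and coordinates here are hypothetical), encoding a minimal OSC packet by hand with the standard library.

```python
# Sketch of the mine-trigger idea described above (hypothetical address
# and region; the real applications were built with openFrameworks).
import struct

def osc_pad(b):
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, value):
    """Encode an OSC message carrying a single int32 argument."""
    return (osc_pad(address.encode()) +
            osc_pad(b",i") +
            struct.pack(">i", value))

MINE = (0.4, 0.4, 0.6, 0.6)  # normalized x0, y0, x1, y1 of the mine area

def check_trigger(x, y):
    """Return an OSC trigger packet when the body enters the mine area."""
    x0, y0, x1, y1 = MINE
    if x0 <= x <= x1 and y0 <= y <= y1:
        return osc_message("/mine/trigger", 1)
    return None
```

The returned packet would then be sent over UDP to the video player application listening on its OSC port.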




Untitled Anarchive is an experimental multimodal multimedia live performance of the POEX archive. The Digital Archive of Portuguese Experimental Literature was performed live at Salão Brazil on the 4th of February by members of Retroescavadora (Rui Torres, Luís Aly, and Luís Grifu) and their invited guests (Bruno Ministro and Sandra Guerreiro).

Frames from Untitled Anarchive at Salão Brazil, Coimbra

This collaborative intervention explores new approaches to experimental literature, mixing media and manipulation techniques in an interactive environment that offers a common space for a dialogue between the inter-actors.

A demo and poster of Mani-pull-action was shown at the Austin|Portugal Annual Conference at UNL Lisboa on May 23rd and 24th, 2016.

Manipullaction at Austin Conference 1

For this exhibition Mani-pull-action was shown inside a paper theatre, or kamishibai, as a physical connection to, or metaphor for, the art of puppetry. The participants were invited to manipulate different puppets with their hands and experiment with digital puppetry.

Manipullaction at Austin Conference

Each puppet required a different level of dexterity simulating direct and indirect manipulation with physics and inverse kinematics approaches. The participants could also play with virtual cameras and lights using a touch device with a set of controls.
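The inverse kinematics mentioned above can be illustrated with the classic analytic two-bone solver (law of cosines), which is a common approach for driving a puppet limb toward a hand-tracked target. This is a generic sketch, not the prototype's actual code.

```python
# Generic analytic two-bone IK sketch (illustrative; the prototype's own
# solver is not shown in the text). Given a 2D target and two bone
# lengths, return the shoulder and elbow angles that reach the target.
import math

def two_bone_ik(target_x, target_y, l1, l2):
    """Return (shoulder, elbow) angles in radians for a 2-bone chain."""
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2)  # clamp unreachable targets to full extension
    # elbow interior angle from the law of cosines
    cos_elbow = (l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # shoulder angle = direction to target minus the inner triangle angle
    cos_inner = (l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d)
    inner = math.acos(max(-1.0, min(1.0, cos_inner)))
    shoulder = math.atan2(target_y, target_x) - inner
    return shoulder, elbow
```

With the palm or fingertip position as the target, this gives the limb pose directly, which can then be blended with the physics simulation for indirect manipulation.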


This was another contribution to disseminating digital puppetry in the community and attracting more enthusiasts to this art form.


## Digital Hand Puppetry

ManiPullAction is a digital hand puppetry prototype built for a research project called Virtual Marionette. It was developed in March 2015 to evaluate hand dexterity with the Leap Motion device for manipulating digital puppets in real-time. One year after its development, the maniPULLaction prototype is now released for Windows and OSX.
This prototype proposes an ergonomic mapping model that takes advantage of the hand's full dexterity for expressive performance animation.

Try and evaluate

You can try the model and contribute with your evaluation by recording the 12 tests that are available: 8 tests with the puppet Luduvica and 4 with the Cardboard Boy puppet. If you agree to take the test, just write a name in the label field and start by pressing the “1” key or the “>” button. After the first test you can switch to the next one by pressing the “>” button or the numeric keys “2”, “3”, and so on up to “8”. You can then jump to the next puppet by pressing the “2” button or the “Shift-F2” key. There are 4 more tests with this puppet, which can be accessed with the “1”, “2”, “3”, “4” keys or the “>” button. After finishing all the tests, you can send the saved files to “”. You can find or reproduce the files by pressing “browse”. The files are text files and motion-capture binary files.

How to use

Just use your hand to drive the puppets. This prototype presents 4 distinct puppets with different interaction approaches, which you can try by pressing the 1, 2, 3, 4 keys.

  • The palm of the hand drives the puppet's position and orientation
  • The little finger drives the eye direction (pupils) with 2 DOF (left and right)
  • The ring finger and middle finger are mapped to the left and right eyebrows with 1 DOF (up and down)
  • The index finger drives the eyelids with 1 DOF (open and close)
  • The thumb is mapped to the mouth with 1 DOF (open and close)
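The mapping above can be sketched as a single function from hand state to puppet controls. The names and the normalization convention are hypothetical; the prototype itself reads these values from the Leap Motion SDK.

```python
# Sketch of the finger-to-puppet mapping listed above (hypothetical names;
# finger extensions are assumed normalized: 0.0 = curled, 1.0 = extended).

def map_hand_to_puppet(palm_pos, palm_rot, fingers):
    """fingers: dict of normalized extensions per finger."""
    return {
        "position":    palm_pos,                       # palm -> body position
        "orientation": palm_rot,                       # palm -> body rotation
        "eye_dir":     fingers["little"] * 2.0 - 1.0,  # pinky -> pupils, [-1, 1]
        "brow_left":   fingers["ring"],                # ring -> left eyebrow
        "brow_right":  fingers["middle"],              # middle -> right eyebrow
        "eyelids":     fingers["index"],               # index -> eyelids
        "mouth":       fingers["thumb"],               # thumb -> mouth
    }
```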

There are other interaction methods with each puppet:

  1. Luduvica – a digital glove puppet. This yellow bird offers a simple manipulation scheme, driving just the puppet's head. If you use a second hand, you can have two puppets for a dramatic play. The performance test (“7” key) allows you to drive the puppet in Z-depth.
  2. Cardboard Boy (paperboy) – a digital wireless marionette (or rod puppet). This physics-based puppet extends the interaction of Luduvica by introducing physics and a second hand for direct manipulation. As in traditional puppetry, you can play with the puppet with an external hand.
  3. Mr. Brown – a digital marionette/rod puppet. This physics-based puppet takes a similar approach to the paperboy but with more expressions. A remote console interface allows you to drive a full set of facial expressions, and the second hand allows the digital puppeteer to drive the puppet's hands (although this is not available in this version).
  4. Minster Monster – a digital muppet with an arm and hand. The face and hands are the most expressive parts of our bodies, and the same holds for monster characters. While the head of the puppet is driven by the palm of the hand, the mouth is controlled by a pinch gesture, in a similar way to traditional muppets. A second hand drives the arm and hand of the monster for expressive animation.

Live cameras and lighting

To simulate a live show you can switch among cameras, change the lighting setup or even focus and defocus the camera.

It also supports live camera switching and lighting control through remote OSC on port 7000, using applications such as TouchOSC (try the “simple” preset).
Switch between the 4 cameras with the messages “/2/push1” to “/2/push4”; to manipulate the lights use “/1/fader1” to “/1/fader4”; to focus and defocus camera 3 in the Cardboard and Mr. Brown scenes use “/1/fader5”. You can also navigate through the scenes with the addresses “/2/push13” to “/2/push16”.
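The same controls could be driven from a script instead of TouchOSC. The sketch below hand-encodes a float OSC message with the standard library and fires it at port 7000; the addresses come from the text, while everything else (host, helper names) is illustrative.

```python
# Sketch of driving the camera/light controls listed above over OSC/UDP
# (hypothetical helper names; only the addresses come from the text).
import socket
import struct

def osc_pad(b):
    """OSC strings are null-terminated and padded to 4-byte multiples."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_float_message(address, value):
    """Encode an OSC message carrying a single float32 argument."""
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

def send(address, value, host="127.0.0.1", port=7000):
    """Fire one OSC message at maniPULLaction's control port."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_float_message(address, value), (host, port))
    sock.close()

# e.g. switch to camera 2, then bring light 1 to half intensity:
# send("/2/push2", 1.0)
# send("/1/fader1", 0.5)
```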


maniPULLaction for OSX

maniPULLaction for Windows


Developed by Grifu in 2015 for the Digital Media PhD at FEUP (Austin | Portugal), supported by FCT. It requires the Leap Motion device.

Demo video





Marionette Programming Language

Pull The Strings is a digital puppetry visual node-based control environment for performance animation.
It acts as a middleware interface between puppeteers and puppets, between device drivers and driven objects, between input interfaces and applications.

It is a marionette programming engine that works as a digital mediator between devices and applications, providing the building blocks for the digital puppeteer to establish the manipulation semantics. It is a visual programming language inspired by the strings of marionettes and by the patch cords that connect modules in old analog video synthesizers. In this environment, signals are processed and sent to the network. Finally, the data arrives at Remote Control for Unity, a plugin that facilitates the mapping of OSC messages.

It was designed with a minimalistic yet intuitive interface and was developed in C++ with openFrameworks, making use of multiple add-ons to provide its main features.
A multidisciplinary environment (MMMM: Multi-platform, Multi-modal, Multi-mediator, Multi-user).
An environment for artists with a non-programming background, designed for digital puppeteers.

Pull the Strings is middleware: an interface between applications, devices, and performing objects. It can be considered a visual programming environment for remote control, made for artists and designers.
Sometimes you find it hard to use one environment or application that offers all the nice features you are looking for, so why not use multiple applications in real-time?
The goal of Pull the Strings is to facilitate the use and control of all the resources found on your computer and on the network.
It is an abstraction from technology, making use of generic signals and communication protocols.
It connects and transforms signals from input and virtual devices into performing-object controls, mapping and orchestrating multimedia data using OSC, DMX, MIDI and VRPN.
It is a remote control application, an interface that connects inputs to outputs and provides a series of functionalities that help to control the animation. It is an OSC-based environment and all nodes work with the OSC format.

Version Alpha 0.050316 (first release 6 March 2016)
Pull The Strings will become open source soon. It is completely free to use in your art, research, or professional projects.
Built with openFrameworks.
It is developed by Luís Leite (GRIFU) for the Digital Media PhD with the support of FCT – Fundação para a Ciência e a Tecnologia.

## Download

Download Pull The Strings:

[Mac OS X 10.9 (32-bit) only]

## Demo video

## License

Copyright 2012-2016 Luís Leite

Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc. <>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.


The GNU General Public License is a free, copyleft license for
software and other kinds of works.

## Pull The Strings Interface

Pull The Strings allows you to create, save and load your projects.
You should always start by saving your project first, which will create a folder to gather all the needed files.
To navigate through the canvas use the key; to open the menu use the and then start typing the operator name, or use the mouse to navigate.
To delete an operator just use the . To access an operator's options, click the <*> on the operator.
Connect nodes from the outputs to the inputs. Add new input nodes by clicking the <+ O> button.
You can create multiple graph windows.

### OSC Communication
Pull The Strings is based on the OSC protocol. It receives and sends OSC messages, and all of its interface components communicate internally via OSC.
You can send several messages from one output port.
It supports Bonjour, so you can easily find Pull The Strings with Bonjour-compatible applications such as TouchOSC.
It makes use of RemoteUI to remotely control the operators from other applications or devices.

### Pull The Strings operators

input.number (creates a Float or Int inside the interface or from remote applications)
input.string (creates strings)
math.scale (a scaling function with a learning input-range option)
osc.combine (combines several OSC messages into one; allows message syncing)
osc.expand (expands an OSC message into separate output nodes; allows inspecting the data types and choosing the address) (opens a port for incoming OSC data)
osc.port.out (opens a port for sending)
print (for debugging; prints the incoming messages in the interface)
show.plot (plots the values)
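The behaviour of a scale operator with a learned input range, as described for math.scale, might look like the following. This is a hypothetical illustration of the concept, not the operator's actual C++ implementation.

```python
# Sketch of a "math.scale"-style operator with a learned input range
# (hypothetical; the real operator lives inside Pull The Strings).
# While learning, it tracks the min/max of incoming values, then
# rescales new values into a target output range.

class ScaleOperator:
    def __init__(self, out_min=0.0, out_max=1.0):
        self.in_min = None
        self.in_max = None
        self.out_min = out_min
        self.out_max = out_max

    def learn(self, value):
        """Extend the learned input range with an observed value."""
        self.in_min = value if self.in_min is None else min(self.in_min, value)
        self.in_max = value if self.in_max is None else max(self.in_max, value)

    def process(self, value):
        """Rescale value from the learned input range to the output range."""
        if self.in_min is None or self.in_max == self.in_min:
            return self.out_min
        t = (value - self.in_min) / (self.in_max - self.in_min)
        return self.out_min + t * (self.out_max - self.out_min)
```

Learning the range from live input is convenient for puppetry devices, since the usable range of a sensor often depends on the performer.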

## Version History

– Alpha 003 (March 5th 2016)

## Support


## Other features
Pull The Strings has other operators that are not yet available, such as:
LeapMotion support
Wiimote support
Math operations
Recording OSC performance

## Developed with
Pull The Strings is based on ofxDuct and makes use of many other ofxAddons such as:

Remote Control for Unity (RCu) is an open-source generic mapping interface for controlling and exposing properties and methods of objects in Unity using remote applications and devices.

Remote Control Framework Scheme

Remote Control for Unity was published in December 2015 and is available at GitHub

This video shows the framework of this digital puppetry environment.

Remote Control is open source and provides a GUI that facilitates the mapping procedure, ideal for non-expert programmers: an interface that offers an easy way to control Unity parameters in real-time from devices such as a smartphone.

RCu makes use of Jorge Garcia's UnityOSC to access the Open Sound Control (OSC) functionalities, and provides connectivity with all devices and applications that support this protocol.
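The core mapping idea, binding an OSC address to an exposed property setter and dispatching incoming messages through the table, can be sketched language-agnostically. RCu itself is a C# Unity plugin; the class and method names below are hypothetical.

```python
# Language-agnostic sketch of the address-to-property mapping behind a
# remote-control layer like RCu (hypothetical names; RCu is C#/Unity).

class RemoteControl:
    def __init__(self):
        self.bindings = {}

    def expose(self, address, setter):
        """Bind an OSC address to a property setter."""
        self.bindings[address] = setter

    def dispatch(self, address, *args):
        """Route an incoming OSC message to its bound setter."""
        setter = self.bindings.get(address)
        if setter is None:
            return False
        setter(*args)
        return True
```

A GUI then only has to populate this table, which is what makes the approach friendly to non-programmers.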

Pre-Release Video

This video shows the pre-released version from June 2015.

Tutorial 1 [Simple]: Controlling an object's position

In this tutorial you will learn how to connect, map, and control an object's position remotely.

Tutorial 2 [Intermediate]: Controlling Blend-Shapes

In this tutorial you will learn how to control Blend-shapes of your character with a flexible interface.

Tutorial 3 [Complex]: Control Lights, Cameras and Activator

In this tutorial you will learn how to change light parameters, how to move cameras and how to activate them remotely to create an interactive performative environment.

Published in Dec.2015

Resurrection was performed at Casa da Música ( on the 7th of December 2015.
A major production by ESMAE ( for the 30th anniversary of IPP (, embracing all the school's departments (Theatre, Music, Audiovisual, Multimedia).

Resurrection at Casa da Música - 7.December.2015

Inspired by the 2nd Symphony of Gustav Mahler, Lee Beagley wrote and directed “Resurrection – Life Lines”, based on 65 interviews he conducted over one year in Portugal with people who needed a second chance in their lives.

This video shows interventions that were performed in several places of Casa da Música and were adapted from the theatrical project presented at Teatro Helena Sá e Costa in November 2015.

Because my intervention was mainly in the multimedia field, here is a brief insight into the project setup.

– Lobby and main Foyer (two 10k Lumen video projectors for the walls)
– Sala Renascença (one 7k Lumen projecting onto a translucent screen on a window to Sala Suggia)
– Exterior Wall (one video projector for the south wall)
– Two Macintoshes
– Two Microsoft Kinects

– Qlab (Video mapping, video routing, video effects)
– OpenFrameworks (Computer Vision + Image Processing)
– Syphon (Video routing)
– Open Sound Control (for remote control)

A project with the students of ESMAE.

Text and Direction: Lee Beagley
Translation: José Topa

Assistant Direction and Production: Xana Miranda
Movement Assistant: Vítor Gomes
Voice Assistants: Bernardo Soares, Gabriela Amaro

Stage Management and Production: Gonçalo Gregório, Mariana Silva
Scenography: Inês Mota, Luís Mesquita, Miguel Costa, Vera Matias
Costumes: Manuel de Faria, Mónica Melo, Samanta Duarte
Light and Sound: Alexandre Candeias, Sérgio Vilela

Cast: Aldair Pereira, Carlos Alves, Cláudia Gomes, Gabriela Brás, Gabriela Costa, Guilherme de Sousa, Hugo Olim, Inês de Oliveira, João Lourenço, Mafalda Canhola, Maria Inês Peixoto, Mariana Coelho, Mariana Santos Silva, Marta Dias, Marta Rosas, Nuno Granja, Raquel Cunha, Rita Fernandes, Sara Xavier, Teresa Zabolitzki, Xavier Miguel, Miguel Marinho (special participation)

Musicians: Diego Alonso, Gorka Oya Diez, Bernardo Soares, Ricardo Casaleiro, Gabriela Amaro
Photography: Paula da Fonte, Joana Machado, Rui Sá
Multimedia: Grifu, João Gigante, Ricardo Couto
DAI Lecturers: Luís Leite (Grifu), Marco Conceição

/// This project hopefully opens new dialogues and breaks down barriers ///

Session #04 | 4 April 2015 @ 17h00 > Text-Space [Installation, Performance, Spatial poetry].

Presentation by Rui Torres. Interventions by Nuno M Cardoso with Luís Grifu and Luís Aly.

Live Archive is Anarchive! Presentation and remix of the Digital Archive of Portuguese Experimental Literature ( by Retroescavadora – Artistic Intervention Collective: Ana Carvalho, Filipe Valpereiro, Luís Aly, Luís Grifu, Nuno Ferreira, Nuno M Cardoso, Rui Torres.

At Gato Vadio bookstore
Arquivo vivo é anarquivo (a living archive is an anarchive)

On the 2nd of March we will give a lecture on the puppet play “Prometeu” at the Clara de Resende school in Porto.


Picture by Pedro Martins


Almost all the production team will be present at the lecture:

Puppeteer: Marcelo Lafontana

Dramaturgy: José Coutinhas

Photography: Pedro Martins

Multimedia: Luís Grifu

Scenography: Sílvia Fagundes



On the 24th of July 2014 I will present the paper “Prometeu – Ilusão em tempo real” at the Avanca Festival in Portugal.

Leite, L., Redondo, M. (2014) Prometeu, Ilusão em tempo-real. Conferência Internacional de Cinema, Arte e Tecnologia. Avanca.

Prometeu is a multimedia theater play inspired by Indonesian shadow puppetry, Wayang Kulit. This multidisciplinary project narrows the boundaries between theater and film, past and present, reality and illusion. Merging different art forms brings new experiences to both the artists and the public; in particular, the amalgamation of puppetry with film and other disciplines contributes to a new concept, which we can call performance cinema or “Live Cinema”.

Using multimedia technologies and cinematic language this project motivates a different paradigm from the traditional reading of the narrative sequence. It promotes the discussion about the boundaries of theater and film, and the role of actors, puppeteers, animators and directors in this particular art form.

Preserving the heritage of traditional shadow theater, Prometeu overcomes the typical physical constraints of traditional puppet theater through film and multimedia techniques. Real-time post-production, performance animation and interaction techniques provide a new means of creation for artists. By using sensors, artists can expand the interaction; by using live cameras, the performer becomes the director, creating the sequence in real-time. Techniques such as luma-key, procedural animation and gesture tracking enhance the live experience, turning the performer into a sort of magician.

In Prometeu, a single puppeteer performs with virtual and real objects and controls the whole show remotely.

In this paper we present the methodology and implementation of our technology, as well as the aesthetic and conceptual aspects.