Peregrinação

Peregrinação – A live cinema approach for paper puppetry

Digital media as an interface between puppetry and film to enhance the traditional paper theater experience.

Traditional paper puppetry shows are usually presented in small venues to small audiences because of their scale. We present a miniaturized multi-camera studio set as a novel way to bring the puppetry experience to a larger audience.
The puppeteer tells the story by performing with tangible puppets on a physical set, and the audience can compare this performance with the final composed image on a screen, as if they were standing behind the cameras.
We have developed a multimedia solution based on virtual “strings” connecting computer applications, devices, and real-time effects, giving full control to a single puppeteer: a one-man show.
We overcome the limitations of traditional puppetry by exploring digital media technology, and by mixing puppetry with cinematic techniques we present the experience of live cinema.
A coproduction of Lafontana – Animated Forms / Teatro Nacional São João (TNSJ), based on the original text by Fernão Mendes Pinto, performed by Marcelo Lafontana.

Inspired by the adventures of Fernão Mendes Pinto, a Portuguese explorer, as recounted in the book “Peregrinação”, published in 1614, Marcelo Lafontana takes a journey through strange stories presented in a miniature world.

Through the intersection of puppetry, in particular the expressiveness of paper theater, with cinematic language, this small world becomes a large space of illusion in which the narrative arises.

The stage is transformed into a film studio, where scenery and characters drawn and cut from cardboard are handled in front of video cameras. The images are captured by a multimedia architecture providing image processing, image mixing, sound effects, chroma key, and performance animation, all in real time. The final result is projected onto a screen, like the sail of a boat that opens to the charms of a journey through the imagination.


The Blackmagic Pocket Camera (BMPC) is small and powerful, even when operated remotely.

For the project “Peregrinação” we looked for small cameras with interchangeable lenses to frame our particularly small scenes. Remote control of zoom, focus, and iris were the features we were after. In our first experiment we used a GoPro because of its size, but the image quality and the fixed lens were a problem: that camera provides no zoom, focus, or iris control. We also experimented with a Sony NEX-7, and we liked the way we could control it remotely over WiFi using the Sony API, but the image quality and many other small things, such as the fact that the camera runs only on battery power, made us look for other solutions. The Blackmagic Pocket Camera, on the other hand, offers great image quality with interchangeable lenses and a Super 16-sized sensor. We could also control the focus and iris remotely through its LANC input.


LANCuage – the syntax to control.
LANC – Logic Application Control Bus System or Control-L

LANC is a trademarked protocol created by Sony: a bidirectional serial bus over which two devices can communicate with each other. For more information about LANC, see the following link: http://www.boehmel.de/lanc.htm

Other video camera brands, like Canon, also implement this protocol to take advantage of the many remote control devices on the market, but they call it the “Control” port. LANC is a de facto standard remote protocol for video cameras. Blackmagic also implements the LANC protocol in its cameras, although only a few functions are supported.
Available LANC commands on the BMPC: REC, manual focus near/far, manual iris open/close, auto-focus, and auto-iris. In our play we use all of these functions intensively, except REC.
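As a sketch of what these commands look like on the wire: a LANC command occupies the first two bytes of the camera's 8-byte frame. In the constants below, only the REC code (18 33) is the widely documented one; treat the focus and iris codes as placeholders of mine and verify them against the boehmel.de tables before use.

// Hedged sketch: two-byte LANC command codes (byte 0, byte 1).
// Only LANC_REC is the widely documented code; the others are
// placeholders to be verified at http://www.boehmel.de/lanc.htm.
const uint8_t LANC_REC[2]        = {0x18, 0x33}; // start/stop recording
const uint8_t LANC_FOCUS_FAR[2]  = {0x28, 0x45}; // placeholder, verify
const uint8_t LANC_FOCUS_NEAR[2] = {0x28, 0x47}; // placeholder, verify
const uint8_t LANC_AUTO_FOCUS[2] = {0x28, 0x41}; // placeholder, verify
const uint8_t LANC_IRIS_OPEN[2]  = {0x28, 0x53}; // placeholder, verify
const uint8_t LANC_IRIS_CLOSE[2] = {0x28, 0x55}; // placeholder, verify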


A framework to control the iris and focus on the BMPC. We use computer software to control the LANC bus via MIDI, but you can use any MIDI device if you implement a MIDI interface on the Arduino.

QLab is a show control application for the Mac which we use in our project: a show control environment that orchestrates all the media and data flow. Sam Kusnetz provides a very interesting write-up on how to prepare a machine for live performance using QLab: http://figure53.com/notes/2013-10-29-prepare-execute-troubleshoot/

 

The workflow: QLab sends MIDI messages to the Hairless MIDI<->Serial bridge, which converts standard MIDI into serial MIDI. The messages are then interpreted by the Arduino, which sends LANC commands (pulses) to the Blackmagic cameras.
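To make the chain concrete, here is a minimal Arduino sketch of the bridge, under my own assumptions rather than the original files: Hairless feeds raw MIDI bytes over serial at 115200 baud (no running status), MIDI CC 20/21 trigger focus near/far, the LANC wiring uses a read pin plus a transistor-driven command pin, and the command codes are the placeholder values sketched above. The timing is simplified; verify everything against the boehmel.de documentation.

const int LANC_READ_PIN = 11;    // reads the LANC bus level
const int LANC_CMD_PIN  = 12;    // drives a transistor that pulls the bus low
const unsigned int BIT_US = 104; // LANC bit time, roughly 9600 baud

void setup() {
  pinMode(LANC_READ_PIN, INPUT);
  pinMode(LANC_CMD_PIN, OUTPUT);
  digitalWrite(LANC_CMD_PIN, LOW); // released: bus idles high
  Serial.begin(115200);            // Hairless MIDI<->Serial default baud
}

// Send one byte, LSB first. LANC logic is inverted, so a data "1"
// pulls the bus low (transistor on).
void sendLancByte(uint8_t b) {
  for (int i = 0; i < 8; i++) {
    digitalWrite(LANC_CMD_PIN, ((b >> i) & 1) ? HIGH : LOW);
    delayMicroseconds(BIT_US);
  }
  digitalWrite(LANC_CMD_PIN, LOW); // release the bus again
}

// A command sits in the first two bytes of the 8-byte frame and must
// be repeated over several consecutive frames to be accepted.
void sendLancCommand(uint8_t b0, uint8_t b1) {
  for (int repeat = 0; repeat < 5; repeat++) {
    // Wait for the long (>5 ms) high gap that precedes a frame.
    while (pulseIn(LANC_READ_PIN, HIGH) < 5000) {}
    // The camera opens each byte with a start bit (bus pulled low).
    while (digitalRead(LANC_READ_PIN) == HIGH) {}
    delayMicroseconds(BIT_US); // skip past byte 0's start bit
    sendLancByte(b0);
    while (digitalRead(LANC_READ_PIN) == HIGH) {}
    delayMicroseconds(BIT_US); // skip past byte 1's start bit
    sendLancByte(b1);
  }
}

void loop() {
  // Minimal 3-byte MIDI parse: status, controller, value.
  if (Serial.available() >= 3) {
    uint8_t status = Serial.read();
    uint8_t cc     = Serial.read();
    uint8_t value  = Serial.read();
    if ((status & 0xF0) == 0xB0) { // control change
      if (cc == 20 && value > 0) sendLancCommand(0x28, 0x47); // focus near (placeholder code)
      if (cc == 21 && value > 0) sendLancCommand(0x28, 0x45); // focus far (placeholder code)
    }
  }
}

In this sketch, a QLab MIDI cue sending CC 20 or CC 21 nudges the focus one step per cue, which fits the cue-by-cue way QLab drives a show.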

[vimeo 97412299 w=580&h=326]

You can download all the files above.



Framework for real-time chroma key in QLab.

The main goal was to provide live chroma key for camera cues inside QLab using still images or video files. My work adapts the solution provided by George Toledo, implementing it in QLab.

—————————————————————————————————————
[Q-Chromakey-still.qtz]

Q-Chromakey-still provides an easy way to generate a real-time chroma key from inside QLab using a still image as the background.


You just need to drag Q-Chromakey-still.qtz into the Video Effects tab of a camera/video cue, define the color to key, adjust the threshold and smoothing values, and finally define the image location (path and filename).
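For intuition about what the threshold and smoothing values do, here is a minimal sketch of the per-pixel math a simple chroma keyer performs. It is an illustration of the idea, not the code inside Q-Chromakey-still.qtz; all names are mine.

#include <algorithm>
#include <cmath>

struct RGB { float r, g, b; };

// Pixels close to the key color become transparent; the threshold sets
// where the cutoff sits and the smoothing value softens its edge.
float keyAlpha(RGB px, RGB key, float threshold, float smoothing) {
  float dr = px.r - key.r, dg = px.g - key.g, db = px.b - key.b;
  float dist = std::sqrt(dr * dr + dg * dg + db * db);
  // Smoothstep between threshold and threshold + smoothing.
  float t = std::clamp((dist - threshold) / std::max(smoothing, 1e-6f), 0.0f, 1.0f);
  return t * t * (3.0f - 2.0f * t); // 0 = keyed out, 1 = fully opaque
}

// Composite the keyed camera image over the still background.
RGB composite(RGB fg, RGB bg, RGB key, float threshold, float smoothing) {
  float a = keyAlpha(fg, key, threshold, smoothing);
  return { fg.r * a + bg.r * (1 - a),
           fg.g * a + bg.g * (1 - a),
           fg.b * a + bg.b * (1 - a) };
}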

You can download all the files above.


A framework solution to load and play movies or other live material inside QLab using Syphon.


QLab provides both a Syphon server and a Syphon client. You can easily bring Syphon footage in with a camera cue. But if you want to mix a camera or video cue from QLab with your Syphon source, you are in trouble. You could try to send your video cue to a Syphon output, mix or blend it with the other source, and return it through a Syphon target (camera cue), but this is not a great solution and you will end up with strange effects.

Basically, this framework sends the camera/video cue to a custom Quartz composition through the Video Effects tab, mixes it there with the Syphon source, and returns the image through the same channel.

The main goal of this framework is to be able to mix live images with other live or recorded sources. You can mix the image from a camera with a video using luma key or chroma key.
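As an illustration of the mixing idea (not the actual contents of the composition), a luma key blend between a camera image and a Syphon source boils down to per-pixel math like this; the names and the Rec. 709 luma weights are my choices:

#include <algorithm>

struct RGB { float r, g, b; };

// Luma key: dark foreground pixels let the Syphon source show through,
// with a soft edge around the threshold.
RGB lumaKeyMix(RGB fg, RGB syphon, float threshold, float smoothing) {
  float luma = 0.2126f * fg.r + 0.7152f * fg.g + 0.0722f * fg.b; // Rec. 709 weights
  float t = std::clamp((luma - threshold) / std::max(smoothing, 1e-6f), 0.0f, 1.0f);
  float a = t * t * (3.0f - 2.0f * t); // smoothstep edge
  return { fg.r * a + syphon.r * (1 - a),
           fg.g * a + syphon.g * (1 - a),
           fg.b * a + syphon.b * (1 - a) };
}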

QLab 3 uses Quartz Composer as an image processor rather than a video processor, disabling the rendering of sprites, billboards, movie importers, and many other objects. This framework provides a workaround for these limitations. Although I provide a solution to import a video, you can also send graphic shapes from Quartz Composer via Syphon, or 3D graphics from Unity, to be blended with your cameras inside QLab.

You can download all the files above.


A framework solution to play a sequence of images inside QLab using a custom Quartz composition.


In the “Prometeu” project framework we used Modul8 as the visual effects (animation/video) engine. Since QLab now supports video output with multiple live camera feeds, custom surface mapping, and Quartz Composer compositions for visual effects, we chose QLab as the main visual engine.

But QLab version 3 uses Quartz Composer as an image processor only and does not allow object rendering such as sprites, billboards, cubes, spheres, or meshes. Other patches cannot be used in Quartz Composer via a QLab custom composition either, such as Patch Time, Movie Importer, Stop Watch, and Interpolation. Anything related to time or image buffers raises serious issues and won't work as expected.

Because we use a lot of chroma key in the project “Peregrinação”, motion was a requirement inside Quartz Composer.
Although I knew some of the Quartz limitations under QLab, I started searching for a solution to play videos using Quartz. My first experiment was to use Syphon, which I explain in another post. Then I found a different approach: loading and playing a series of images.
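The core of the image-sequence approach is simply mapping elapsed time to a frame index. A minimal sketch, where the file naming pattern and frame rate are assumptions of mine rather than anything in the composition:

#include <algorithm>
#include <cstdio>
#include <string>

// Pick the right still from a numbered folder based on elapsed time,
// instead of decoding a movie file.
std::string frameForTime(double seconds, double fps, int frameCount, bool loop) {
  int index = static_cast<int>(seconds * fps);
  if (loop) index %= frameCount;                // wrap around for loops
  else index = std::min(index, frameCount - 1); // hold the last frame
  char name[64];
  std::snprintf(name, sizeof(name), "frame_%05d.png", index);
  return std::string(name);                     // e.g. "frame_00042.png"
}

Driving the index from an external input (such as a published time value) sidesteps the time-related patches that QLab disables.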

You can download all the files above.


Peregrinação

From the 10th to the 23rd of May at the Mosteiro São Bento da Vitória, Porto.

based on
Fernão Mendes Pinto

staging and interpretation
Marcelo Lafontana

dramaturgy
José Coutinhas

scenography
Sílvia Fagundes

design of the characters and scenery
Luís Félix, Rebeca das Neves

photography
JPedro Martins

music
Eduardo Patriarca

multimedia (architecture and contents)
Luís Grifu

lighting design
Pedro Cardoso

staging assistance
Rita Nova

I have implemented some of the digital puppetry methods in this project. The methods and tools are described in other posts on this blog.

 
