The installation Ditador was part of the exhibition “Abílio-José Santos. Revelação: Concretos e Visuais”, open to the public from 4 July to 8 September 2019 at Fórum Maia.



This installation was developed by Grifu, Luís Aly, and Marco Jerónimo using eMotion, QLab, and Kinect.

Ditador

Solitária

Poster for the Solitária performance

Solitária is a performance-play that explores human loneliness in solitary confinement, imagining the behavior of a prisoner through his mind and body. Our body is our only connection to the physical world, the only sensor capable of recognizing physical space. Yet in confinement our sensory capabilities begin to misbehave, and soon we become disoriented.

Premiere on 9 November 2017 at 21h30 in the Blackbox at O Espaço do Tempo.
Further performances on 10 and 11 November at 21h30, same venue.


Solitária follows the performance-play approach that has characterized Alma d’Arame’s work from the beginning. Each work starts inside a delimited space, inside its own confinement. On the one hand, the space of narrative, theater, puppet, being, and object; on the other, the space of programming, kinetics, and multimedia. Starting from each person’s solitary, creative space, we saw a common space of creation born. We all have, and need, that time with ourselves. It is in this time that we find the space that is ours alone, where we can relive memories, hide, think, feel, record. Here we reach our own states. That is what this performative act is about.
It is in this laboratory space that this solitary confrontation between man and machine, between real and virtual, unfolds, and it is this confrontation that will lead us to experimentation and the search for new narratives.
The kinetic movement of the body, and the way it occupies the empty space, builds this visual and sound narrative.

Art team
Artistic direction | Amândio Anastácio
Performance | Susana Nunes
Multimedia | Luís Grifu
Music | João Bastos
Puppet | Raul Constante Pereira
Lighting and set design | Amândio Anastácio
Lighting design and rigging | António Costa
Production direction | Isabel Pinto Coelho
Production assistant | Alexandra Anastácio
Photography | Inês Samina
Video | Pedro Grenha

Production | Alma d’Arame

Support | Câmara Municipal de Montemor-o-Novo
Partnership | O Espaço do Tempo
Structure funded by | DGArtes and the Government of Portugal

Virtual Marionette puppet tools were used to develop several interactive installations for the animation exhibition Animar 12 at the cinematic gallery Solar in Vila do Conde. The exhibition opened on 18 February 2017.


Animar 12 - Interactive Installations

Faz bem Falar de Amor

An interactive installation that challenges participants to act out, with virtual characters, scenes from the animated music video created by Jorge Ribeiro for the song “Faz bem falar de amor” by Quinta do Bill.

The Puppit tool was adapted to drive two cartoon characters (a very strong lady and a skinny young man) using the body motion of visitors, captured with one Microsoft Kinect. The virtual characters’ skeletons differ from human body proportions, so each puppet behaves in ways that do not exactly mirror the participant’s movement. Although our intention was to challenge participants to adapt their bodies to the target puppet, we helped them a little. To resolve this discrepancy, I used two skeletons: a human-like skeleton mapped directly to the performer’s body, and the virtual character’s skeleton mapped to that human clone skeleton through an offset-and-scale function. This made it possible to scale the movement of specific joints of the clone skeleton up or down, so the virtual character behaves in a more natural, cartoonish way. This application was developed using Unity, OpenNI, and Blender.
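The two-skeleton idea can be sketched in a few lines. This is a minimal illustration, not the original Unity code: the joint names and the offset/scale values below are invented for the example.

```python
# Sketch of the two-skeleton retargeting: a "clone" skeleton mirrors the
# performer 1:1, and each character joint is derived from the matching
# clone joint through a per-joint offset and scale.

# Per-joint values (e.g. angles in degrees) from the tracked clone skeleton.
clone_pose = {
    "left_elbow": 40.0,
    "right_elbow": 35.0,
    "spine": 10.0,
}

# Illustrative per-joint (offset, scale) pairs for the cartoon character:
# scale > 1 exaggerates the motion, scale < 1 damps it.
retarget_map = {
    "left_elbow": (0.0, 1.5),   # exaggerate arm bend
    "right_elbow": (0.0, 1.5),
    "spine": (5.0, 0.5),        # stiffer, slightly offset torso
}

def retarget(clone_pose, retarget_map):
    """Map clone-skeleton joint values onto the character skeleton."""
    character_pose = {}
    for joint, value in clone_pose.items():
        offset, scale = retarget_map.get(joint, (0.0, 1.0))
        character_pose[joint] = offset + value * scale
    return character_pose

character = retarget(clone_pose, retarget_map)
```

Joints missing from the map pass through unchanged, which is how the clone skeleton itself behaves.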

É Preciso que eu Diminua

This animated music video was made by Pedro Serrazina for Samuel Úria. It tells the story of a character who suffers from a scale problem, towering over the buildings. The character tries to shrink by pulling his arms and legs close to his body; at the same time, he feels the need to break free from those strict boundaries and push everything around him away. To convey this feeling of bodily expansion, the visitor drives a shadow-like silhouette with a contour line around it that grows when the participant expands his body and shrinks when he contracts it. The silhouette, captured with a Microsoft Kinect, is projected onto a set of cubes that deform the body shape. This application was developed with openFrameworks, using ofxKinect and ofxOpenCV.

Estilhaços

This short animated film, produced by Jorge Miguel Ribeiro, addresses the Portuguese colonial war; the intention was to trigger segments of the film whenever a visitor steps onto a mine. Two segments show two distinct perspectives on the war: that of a father who experienced it, and that of his child, who understood it through the father’s indirect account. A webcam captures the position of the visitor’s body, and whenever it enters the mine area the application sends an OSC message to trigger a video player. Both the trigger and the video player applications were developed with openFrameworks.
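The trigger logic boils down to a hit test plus one OSC message. The sketch below is illustrative Python rather than the original openFrameworks code; the address pattern "/trigger", the region coordinates, and the port are assumptions.

```python
import struct

# Sketch of the mine trigger: test whether the tracked body is inside a
# screen-space region, and encode an OSC message to send to the player.

def osc_pad(data: bytes) -> bytes:
    """Null-pad to a multiple of 4 bytes, as the OSC spec requires."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: int) -> bytes:
    """Encode an OSC message carrying a single int32 argument."""
    return osc_pad(address.encode()) + osc_pad(b",i") + struct.pack(">i", value)

def inside(px, py, rect):
    """Axis-aligned hit test: is the tracked body inside the mine region?"""
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h

mine = (100, 100, 50, 50)          # illustrative region in camera pixels
packet = osc_message("/trigger", 1)

# In the installation the packet would go out over UDP, e.g.:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 9000))
```

The video player side only has to listen on that port and start the matching segment when the message arrives.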


This is a small application developed by Eight (Vladimir Gusev) to use the Kinect as a motion capture system for RAM Dance. It requires OpenNI.

It supports more than one user and accepts external “.oni” motion files (OpenNI motion capture recordings).

It replaces the need for OSCeleton and my own Max/MSP patch (although my patch is aimed at broader use).

It captures the motion and sends it directly to RAM Dance over the OSC protocol, without any intermediate application. You can configure the network address to send the data to a different computer.

Download Mac Binary

This file is the Mac OS X binary; please visit GitHub to download the source: https://github.com/eighteight/CocoKinect


The Virtual Marionette installation “BePuppit” at a puppetry exhibition at Teatro Municipal de Vila do Conde.
We invite the audience to take part in this play, which recreates a traditional puppet booth.

“PUPPIT” is an interactive experience that explores different ways to control and animate virtual puppets in live environments using digital puppetry methods as a storytelling tool.
With “Body-PUPPIT”, the user’s body is the puppet controller, producing performance animation from body motion. We challenge participants to explore their bodies to give life to puppets whose morphology differs from the human body, as if they were using hand shadows to create imaginative figures.
By playing with puppets in a dramatic way the user can create the illusion of life and build a narrative.

PUPPIT consists of several prototypes that are the work in progress of a PhD research project called “Virtual Marionette – Interaction Model for Digital Puppetry”. The goal of the prototypes is to explore and evaluate different rigging, mapping, and interaction methods, and to develop tools that artists and non-experts can use in a collaborative environment for storytelling – making live narratives.

The exhibition is open daily from 14 to 22 December, except Mondays.

www.virtualmarionette.grifu.com
December 2013

Performance Animation Lab at CINANIMA 2013 (Espinho, Portugal) – Animation Film Festival

The main objective of this Lab is to disseminate Digital Puppetry as a methodology to create performance animation.
We present our methodology based on simple steps: Create->Rig->Map->Animate
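The four steps can be sketched as a minimal pipeline. This is an illustrative Python sketch, not the Animata/Pure Data patches used in the lab; the part names, sensor channels, and routing table are invented for the example.

```python
# Illustrative sketch of the Create -> Rig -> Map -> Animate methodology.

def create():
    # Create: the drawn or modelled figure, reduced here to named parts.
    return {"head": None, "left_arm": None, "right_arm": None}

def rig(puppet):
    # Rig: attach a controllable joint value (an angle) to each part.
    return {part: 0.0 for part in puppet}

def map_input(rig_state, sensor):
    # Map: route sensor channels onto rig joints; this mapping step is
    # the core of the digital-puppetry workflow.
    routing = {"wiimote_pitch": "head", "kinect_left_elbow": "left_arm"}
    for channel, value in sensor.items():
        joint = routing.get(channel)
        if joint is not None:
            rig_state[joint] = value
    return rig_state

def animate(rig_state):
    # Animate: produce one frame from the current joint values.
    return {joint: round(angle, 1) for joint, angle in rig_state.items()}

frame = animate(map_input(rig(create()), {"wiimote_pitch": 12.34}))
```

In a live session the map/animate pair runs every frame, with each puppeteer's device feeding its own channels.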

November 2013

Reel with CINANIMA 2013 labs


Resulting animation from the performance animation Lab at CINANIMA 2013 (Animation Film Festival in Espinho, Portugal).
Real-time storytelling and live directing with virtual marionettes using digital interfaces to create performance animation.
This play was performed with three digital puppeteers / actors and one director.

Framework:
Hardware used: 2 Wiimotes and 2 Nunchucks; 1 Microsoft Kinect; 1 iPad; 1 MacBook DualCore 2005; video projector
Software used: Animata, Pure Data, OSCeleton, OSCulator, TouchOSC.

November 2013
www.virtualmarionette.grifu.com

Index  Joint              Index  Joint
0      Head               12     Right Elbow
1      Neck               13     Right Wrist
2      Torso              14     Right Hand
3      Waist              15     Right Fingertip
4      Left Collar        16     Left Hip
5      Left Shoulder      17     Left Knee
6      Left Elbow         18     Left Ankle
7      Left Wrist         19     Left Foot
8      Left Hand          20     Right Hip
9      Left Fingertip     21     Right Knee
10     Right Collar       22     Right Ankle
11     Right Shoulder     23     Right Foot
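The table above can be used directly as a lookup when decoding skeleton data. A small Python sketch (the "Unknown" fallback is an assumption for out-of-range indices):

```python
# OpenNI joint indices from the table above, as a lookup table.
OPENNI_JOINTS = {
    0: "Head", 1: "Neck", 2: "Torso", 3: "Waist",
    4: "Left Collar", 5: "Left Shoulder", 6: "Left Elbow",
    7: "Left Wrist", 8: "Left Hand", 9: "Left Fingertip",
    10: "Right Collar", 11: "Right Shoulder", 12: "Right Elbow",
    13: "Right Wrist", 14: "Right Hand", 15: "Right Fingertip",
    16: "Left Hip", 17: "Left Knee", 18: "Left Ankle", 19: "Left Foot",
    20: "Right Hip", 21: "Right Knee", 22: "Right Ankle", 23: "Right Foot",
}

def joint_name(index: int) -> str:
    """Resolve a sensor joint index to its name."""
    return OPENNI_JOINTS.get(index, "Unknown")
```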


SoftKinetic

SoftKinetic presented at CES 2012 a virtual puppet show to demonstrate its finger-tracking feature, a gesture-control system touting “near mode”. The firmware for its DepthSense 311 should detect finger movement from as close as 15 cm (vs. the Kinect’s 50 cm) and as far away as about three feet.

A “puppet show” app lets you control two cartoon puppets with ragdoll arms.

The puppets could nod their heads, twist around, and open their mouths when you unclenched a fist. The twisting mechanism didn’t work perfectly, and touching your hands together resulted in rather odd behavior.

Link to a video HERE

other video