The new version of Stringless for Unity supports Zeroconf networking through Bonjour for auto-discovery, as well as RemoteUI.


Just press play in Unity, and all the parameters you have chosen for remote control will automatically appear in RemoteUI on another computer, or on an iOS or Android device.
You no longer need to set up OSC addresses manually: just drag a Receiver onto a GameObject and define the value range, and RemoteUI will automatically recognize it and show you a slider to manipulate. These sliders can also be mapped to MIDI devices or other HID controllers, in a manner similar to a MIDI-learn feature. Just press the name of the parameter in RemoteUI until it starts to blink, then move a slider on your MIDI controller or a joystick on a gamepad. The mapping will be saved for future sessions. You can also assign distinct devices to the same mapping, allowing devices to be interchanged.
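The learn-and-bind flow described above can be sketched as a small mapping table. This is only an illustrative sketch, not RemoteUI's actual implementation; the class, device, and parameter names are hypothetical.

```python
import json

class MidiLearnMapper:
    """Illustrative sketch of a MIDI-learn style mapping table.
    Class and parameter names are hypothetical, not RemoteUI's actual code."""

    def __init__(self):
        self.bindings = {}    # (device, control) -> parameter name
        self.learning = None  # parameter currently blinking, waiting for input

    def start_learn(self, parameter):
        # Pressing a parameter name in the UI puts it into learn mode.
        self.learning = parameter

    def on_control_event(self, device, control, value):
        # While learning, the first incoming event creates the binding;
        # afterwards, events route to whatever parameter is bound.
        if self.learning is not None:
            self.bindings[(device, control)] = self.learning
            self.learning = None
        return self.bindings.get((device, control)), value

    def save(self):
        # Persist bindings for future sessions (keys flattened for JSON).
        return json.dumps({f"{d}:{c}": p for (d, c), p in self.bindings.items()})

mapper = MidiLearnMapper()
mapper.start_learn("Head/RotationY")                         # name blinks
target, _ = mapper.on_control_event("nanoKONTROL", 14, 0.5)  # first move binds
```

Because the binding key is the (device, control) pair, the same saved mapping can later be pointed at a different device, which is what allows device interchange.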

 

Stringless support for RemoteUI is still a prototype in an early development phase.
It will be available for download soon.

Music: bensound.com
May.2017

Virtual Marionette puppet tools were used to develop several interactive installations for the animation exhibition Animar 12 at the Solar cinematic gallery in Vila do Conde. The exhibition opened on 18 February 2017.

 

Animar 12 - Interactive Installations

Faz bem Falar de Amor

An interactive installation that challenges participants to reenact, with virtual characters, some of the scenes from the animated music video developed by Jorge Ribeiro for the song “Faz bem falar de amor” by Quinta do Bill.

The Puppit tool was adapted to drive two cartoon characters (a very strong lady and a skinny young man) using the body motion of visitors captured with one Microsoft Kinect. The virtual characters' skeletons differ from human body proportions, so each puppet has distinct behaviours that do not exactly mirror the participant's movement. Although our intention was to challenge participants to adapt their bodies to the target puppet, we helped them a little. To solve this discrepancy, I used two skeletons. A human-like skeleton is mapped directly to the performer's body; the virtual character's skeleton is then mapped to this human clone skeleton through an offset-and-scale function. This made it possible to scale the movement of specific joints of the clone skeleton up or down, and to make the virtual character behave in a more natural, cartoonish way. This application was developed using Unity, OpenNI, and Blender.
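The offset-and-scale retargeting described above can be sketched as a per-joint lookup. This is an illustrative sketch, not the installation's actual code; the joint names and tuning values are hypothetical.

```python
# Hypothetical per-joint tuning table: (offset in degrees, scale factor).
# Neither the joint names nor the values come from the installation's code.
JOINT_TUNING = {
    "elbow_l": (0.0, 1.6),   # exaggerate arm swing for a cartoonish look
    "hip":     (0.0, 0.7),   # damp hip motion on the shorter character
    "neck":    (10.0, 1.0),  # constant offset keeps the head upright
}

def retarget(joint, clone_angle):
    """Map a joint angle from the human-clone skeleton onto the
    cartoon character's skeleton with an offset-and-scale function."""
    offset, scale = JOINT_TUNING.get(joint, (0.0, 1.0))
    return offset + clone_angle * scale
```

Joints absent from the table pass through unchanged, so only the joints whose proportions differ from the human body need tuning.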

É Preciso que eu Diminua

This animated music video was created by Pedro Serrazina for Samuel Úria. It tells the story of a character who suffers from a scale problem, towering over the buildings. The character tries to shrink by holding his arms and legs close to his body; at the same time, he needs to break free from those strict boundaries and push everything around him away. To convey this feeling of bodily expansion, the visitor drives a shadow-like silhouette with a contour line around it that grows when the participant expands his body and shrinks when he contracts it. The silhouette, captured with a Microsoft Kinect, is projected onto a set of cubes that deform the body shape. This application was developed with openFrameworks, using the ofxKinect and ofxOpenCv addons.
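The grow-and-shrink behaviour of the contour can be sketched as a simple mapping from the silhouette's spatial extent to a line scale. This is an illustrative sketch under assumed units; the function and parameter names are not from the installation's code.

```python
def contour_scale(points, min_scale=0.8, max_scale=2.0, max_extent=2.2):
    """Map the silhouette's widest spread (assumed to be in metres) onto the
    scale of the contour line drawn around it. Function and parameter names
    are hypothetical; the installation's actual code is not published."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    extent = max(max(xs) - min(xs), max(ys) - min(ys))  # widest body spread
    t = min(extent / max_extent, 1.0)                   # normalise to 0..1
    return min_scale + t * (max_scale - min_scale)

compact = contour_scale([(0.0, 0.0), (0.3, 1.6)])   # arms held close
spread = contour_scale([(-1.1, 0.0), (1.1, 1.8)])   # arms wide open
```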

Estilhaços

For this short animated film produced by Jorge Miguel Ribeiro, which addresses the Portuguese colonial war, the intention was to trigger segments of the film when a visitor places his body over a mine. Two segments of the film show two distinct perspectives on the war: one from a father who experienced it, and the other from his child, who understood the war only through his father's indirect reports. A webcam captures the position of the visitor's body, and whenever it enters the mine area an OSC trigger message is sent to a video-player application. Both the video-trigger and video-player applications were developed with openFrameworks.
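The trigger message itself can be sketched using the standard OSC 1.0 byte layout (null-padded address and type-tag strings, followed by big-endian arguments). The address `/mine/trigger` below is hypothetical; the installation's actual address is not documented.

```python
import struct

def osc_message(address, *ints):
    """Encode a minimal OSC 1.0 message with int32 arguments: a null-padded
    address string, a null-padded type-tag string, then big-endian values."""
    def pad(b):
        # Null-terminate and pad the string to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode()) + pad(("," + "i" * len(ints)).encode())
    for v in ints:
        msg += struct.pack(">i", v)  # big-endian 32-bit integer
    return msg

# Hypothetical trigger packet; it would be sent over UDP to the video player.
packet = osc_message("/mine/trigger", 1)
```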

 

 

 

Untitled Anarchive is an experimental multimodal multimedia live performance of the POEX archive. The Digital Archive of Portuguese Experimental Literature was performed live at Salão Brazil on the 4th of February by members of Retroescavadora (Rui Torres, Luís Aly, and Luís Grifu) and their invited guests (Bruno Ministro and Sandra Guerreiro).

frames from Untitled Anarchive at Salão Brazil, Coimbra

This collaborative intervention explores new approaches to experimental literature, mixing media and manipulation techniques in an interactive environment that offers a common space for dialogue between the inter-actors.

Results of exploratory work carried out during an artistic residency at Espaço-Tempo in December 2015.

Solitária is an ongoing project by Alma d'Arame that explores human loneliness in solitary confinement.

We explored the human mind and body in the dark.
How do we move, see, listen, and think in a place where there is no light?
The absence of light removes all spatial and temporal references and draws us into an unbalanced world.

Solitaria Work-in-Progress

Solitaria Work-in-Progress 15th December 2015

We are trapped in a square space, and our body is the only connection to the physical world, the only sensor capable of recognizing the physical space.

Our focus was on the relations between the human gestures and the space.

Graphically, we attempted a minimal approach, focusing on emptiness and silence. The graphic elements are generated mainly from lines that evoke prison bars. Letters also appear, characterizing memories, along with silhouettes as a representation of the real. But this reality is gradually linearized (forgotten), starting with the silhouettes and evolving to their synthesized state: lines that gain expression. In terms of interaction, we work with the body and objects in space and time. We begin with the body and objects within the space of the solitary cell, then explore the space of the body itself, probing the spatial relation between the hands and the body's centre of gravity; finally we return to the space of the cell, relating the body to the physical space. A depth sensor and video cameras are used.

Performer Amândio Anastácio
Multimedia Grifu

Technical framework
[HARDWARE]
– Macintosh Laptop
– Microsoft Kinect
– Video Camera
– iPad

[SOFTWARE]
– Qlab
– OpenFrameworks
– eMotion
– QuartzComposer
– PureData
– MaxMsp
– PullTheStrings
– TouchOSC
– Syphon

Dec. 2015

 

Remote Control for Unity (RCu) is an open-source generic mapping interface for controlling and exposing properties and methods of objects in Unity from remote applications and devices.

Remote Control Framework Scheme

Remote Control for Unity was published in December 2015 and is available on GitHub.

This video shows the framework of this digital puppetry environment.

Remote Control is open source and provides a GUI that simplifies the mapping procedure for non-expert programmers: an interface that offers an easy way to control Unity parameters in real time from devices such as a smartphone.

RCu uses Jorge Garcia's UnityOSC to access Open Sound Control (OSC) functionality, providing connectivity with all devices and applications that support this protocol.
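A core piece of such a mapping interface is remapping an incoming control value onto a target parameter's range. A minimal generic sketch, not RCu's actual code:

```python
def remap(value, in_min, in_max, out_min, out_max, clamp=True):
    """Linearly remap an incoming control value (for example a 0-1 OSC
    slider) onto a target parameter's range. A generic sketch of the
    technique, not the code inside RCu."""
    t = (value - in_min) / (in_max - in_min)
    if clamp:
        t = max(0.0, min(1.0, t))  # keep out-of-range input from overshooting
    return out_min + t * (out_max - out_min)

intensity = remap(0.25, 0.0, 1.0, 0.0, 8.0)  # slider a quarter of the way up
pan = remap(0.5, 0.0, 1.0, -90.0, 90.0)      # centred slider -> 0 degrees
```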

Videos:
Pre-Release Video


This video shows the pre-release version from June 2015.

Tutorial 1 [Simple]: Controlling an object's position


In this tutorial you will learn how to connect, map, and control an object's position remotely.

Tutorial 2 [Intermediate]: Controlling Blend-Shapes


In this tutorial you will learn how to control your character's blend shapes with a flexible interface.

Tutorial 3 [Complex]: Controlling Lights, Cameras, and Activators

In this tutorial you will learn how to change light parameters, move cameras, and activate them remotely to create an interactive performative environment.

Published in Dec.2015
GitHub: https://github.com/grifu/RemoteControlUnity

holographic puppetry

First experiments with DIY holographic projection with digital puppetry.

As early as 2001, Ravin Balakrishnan was exploring physical mockups for interacting with 3D volumetric displays (User Interfaces for Volumetric Displays –
http://www.autodeskresearch.com/publications/volumetricui). Naiqi Weng made some interesting experiments with volumetric displays and digital puppetry in 2005, and Andrew Jones extended this approach in 2007 (Rendering for an Interactive 360º Light Field Display – http://gl.ict.usc.edu/Research/3DDisplay/).

 

But this is a completely different approach: a low-cost solution that can easily be built with cheap materials and a standard computer display.

 

It is a four-sided hologram made of transparent plastic that sits on top of a computer screen. The screen is divided into four views, which the plastic pyramid reflects, showing the performance animation manipulated by the digital puppeteer with his hands.
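The four-view split can be sketched as a generic Pepper's-ghost layout: one viewport per pyramid face, each rotated so its reflection reads upright. This is a sketch of the general technique, not the code used in this experiment.

```python
def pepper_views(width, height):
    """Return (x, y, w, h, rotation_deg) viewports for a four-sided
    reflection pyramid centred on a screen of the given pixel size.
    A generic Pepper's-ghost layout sketch, not this project's code."""
    w, h = width // 3, height // 3  # each face gets a third of the screen
    cx, cy = (width - w) // 2, (height - h) // 2
    return [
        (cx, 0, w, h, 0),             # top face
        (cx, height - h, w, h, 180),  # bottom face, flipped
        (0, cy, w, h, 90),            # left face
        (width - w, cy, w, h, 270),   # right face
    ]

views = pepper_views(1200, 900)  # four 400x300 views around the centre
```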

This experiment was made in PIC – Porto Interactive Center and the holographic structure was developed by Marcelo Gomes.

 

To learn more about four-sided pyramid holograms jump to: http://rimstar.org/science_electronics_projects/hologram_pyramid_diy_homemade.htm

Holographic Toy Paper Puppet


A multimodal digital ecosystem developed for live performances.

A solution for real-time virtual object manipulation, show control, and video routing.

For more information about the performance and the POEX project: http://po-ex.net/taxonomia/materialidades/performativas/retroescavadora-sem-titulo

When you develop content for live performances relying on just one application, you may run into difficulties creating specific effects that are easy to achieve with other tools.
You might find yourself asking: why can't I use multiple applications in a live performance?
In general, the problem lies in how to handle the video outputs of the different applications without having to switch outputs during the performance, and how to control them all in a seamless way.

This framework integrates multiple applications sharing one output, providing a simple solution for interacting with and handling data, video, and sound. It is a flexible remote control system that offers scalability and gives creators a path beyond the limits of a single application. It can even survive application crashes, because the output is independent of the applications.

Ecosystem features
– Shared output
– Shared control
– Flexible remote control
– Scalable data-flow to connect video, audio, and data

This ecosystem was developed for the “Untitled” POEX live performance.
For this performance we chose to work only with free applications:
Pure Data (puredata.info), a visual programming environment, running chdh's Egrégore (chdh.net/egregore_source_manual.htm), an audiovisual instrument environment developed for the egregore performance. Its series of patches gave us a particle system for creating an audiovisual organic life form representing the placenta.
Unity (unity3d.com) gave us the right environment for simulating a physical space where the performer's hands could interact with virtual objects. But Unity does not handle typography animation very well.
eMotion (adrienm.net/emotion/), a real-time animation environment that offers many interesting features for text animation.

Running on a MacBook Retina (2.4 GHz i5).
NOTE: there are some performance issues in the video because, on top of the whole ecosystem, I also had to run a camera and capture the screen. The sound in this video was not recorded during the simulation.

Digimario - Digital Rod Puppet Style

Digi Mario – a digital rod-puppet style with physics: a puppet manipulated by hand with a Leap Motion controller. One hand controls the puppet while the other simulates the puppeteer's virtual hand for physical interaction.

This video shows how to animate a puppet with just one hand using physics to recreate the marionette aesthetics.

Marionette animation is fascinating but requires a lot of skill. This virtual marionette is much simpler, yet the digital puppeteer can control many aspects of the puppet; with little training the puppeteer can bring it to life in a traditional style.
All fingers are mapped to the face controls and the hand is mapped to the head. When you move the head, the body follows along as if a rod were connected to the head.
An interesting effect occurs when you turn your hand pinky-finger first: the eyes look toward the target and the head then follows the eyes.

Framework: Leap Motion v2 + Unity

Demonstration of full hand control for expressive digital puppetry

This video shows how to animate a digital puppet in real-time with just one hand in an expressive manner.
It is part of PhD research in the field of digital puppetry.

Stringless hand controller (a metaphor for the marionette controller)
The hand and five fingers control different aspects of the puppet:

Hand: position and orientation of the puppet
Pinky finger: Eye Pupils rotation in all directions
Index finger: Eyelashes rotation in the +Y and -Y axis (open and close)
Middle finger: Right Eyebrow blend shape deformation for character expressions
Ring finger: Left Eyebrow blend shape deformation for character expressions
Thumb: Mouth (open and close) blend shape deformation

Each finger maps a certain number of degrees of freedom (DOF) to a specific puppet control.
The hand has 6 DOF (position and rotation); you can move your hand freely around the tracking area, but must be careful with the occlusion problem.
The pinky finger has 3 DOF for rotating the eye pupils; although controlling the pinky finger independently has some constraints, it is more than adequate for this kind of small motion.
The middle and ring fingers have more constraints and are very hard to control independently, so I mapped just 1 DOF (up and down) of each finger to the eyebrows.
The index finger has more potential because you can control more degrees of freedom. I tried using it to control the eye pupils, but the results were not great; instead, the index finger is mapped to the eyelash rotation with just 1 DOF (up and down).
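The per-finger DOF assignments above can be sketched as a mapping table that zeroes out the constrained axes. The control names are illustrative, not the prototype's actual identifiers.

```python
# Hypothetical mapping table: finger -> (puppet control, usable DOF).
FINGER_MAP = {
    "pinky":  ("eye_pupils", 3),  # full 3-DOF rotation for small eye motions
    "index":  ("eyelashes",  1),  # open/close only
    "middle": ("eyebrow_r",  1),
    "ring":   ("eyebrow_l",  1),
    "thumb":  ("mouth",      1),
}

def apply_finger(finger, rotation_xyz):
    """Keep only the rotation axes the mapping allows, zeroing the
    constrained ones before driving the puppet control."""
    control, dof = FINGER_MAP[finger]
    used = list(rotation_xyz[:dof]) + [0.0] * (3 - dof)
    return control, used
```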

This is a good direction for digital puppeteers, giving them full control of the character's expressivity. It requires some training to act as a character, but that's the magic of puppeteering.

This prototype can animate two different, fully controlled puppets for interaction: a powerful model for performance animation using digital puppeteering techniques.

Hardware: Leap Motion device for hand tracking and a MacBook
Software: Unity Engine

 

LeapMuppet is a digital hand puppet based on the Leap Motion interface.
It was developed with the new skeletal tracking in Unity.

It was inspired by the traditional glove puppet, in particular the Muppets of the great Jim Henson.
This is a digital puppetry proof of concept. The idea is to bring the art of traditional puppet manipulation to performance animation.
Not realistic animation, but believable animation…
Do not believe that the puppet is real; just believe that the puppet is alive.

A simple character made in Maya is animated in real time by the motion of the hand.
I use a skeleton to drive the nose and head of the puppet (position and orientation), and blend shapes to control the mouth and eyebrows.
The palm of the hand drives the head (free direction); the index finger drives the nose (free direction); the thumb controls the mouth (up and down); the ring finger controls the left eyebrow (shrink/expand); the pinky finger controls the right eyebrow.

With just one hand you can give the puppet great expression. Imagine what you can do with two hands?

Let's have some fun with virtual puppets.