[Links for connecting devices to Game Engines]

PS Move for Unreal 4 (Mac/Win) – https://github.com/cboulay/psmove-ue4

Leap Motion for Unreal 4 (Mac / Win)

Microsoft Kinect 1 for Unreal

Microsoft Kinect 2 for Unreal

 

PS Move for Unity (Mac/Win) – https://github.com/CopenhagenGameCollective/UniMove

UniWii: Nintendo WIImote for Unity (Mac/Win) – http://wiki.unity3d.com/index.php?title=UniWii

WiiYourSelf: WIImote for Unity (Windows) – https://github.com/lonewolfwilliams/WiiYourselfUnity

Microsoft Kinect (OpenNI)

 

inversus presentation

Inversus is a digital interactive installation that explores the relationship between user actions and common objects: lamps, speakers, and fans. It is an interactive machine that shifts the conceptual understanding a user traditionally has of a specific object, making people wonder: why should a lamp be used only to illuminate?

[vimeo 126317214 w=580&h=326]

Well, sometimes you feel that you should share your presentations with the whole community, not just with the people who attended the conference. The problem is that my presentations are a combination of images and oral explanation, and by sharing a PDF with just the images you cannot understand what they mean. So, I have decided to create an animated presentation with a synthesized voice to help.

This video was made from the paper presentation at ACHI (February 2015).
It explains the concept, the process, and the framework.

commonspaces-web

A multimodal digital ecosystem developed for live performances.

A solution for real-time virtual object manipulation, show control, and video routing.

For more information about the performance and the POEX project: http://po-ex.net/taxonomia/materialidades/performativas/retroescavadora-sem-titulo

[vimeo 126071525 w=580&h=326]

When you develop content for live performances relying on just one application, you might come across difficulties in creating specific effects which are easy to create with other tools.
You might find yourself asking: why can't I use multiple applications in a live performance?
In general, the problem lies in how to handle the video outputs from the different applications without having to switch outputs during the performance, and how to control them all in a seamless way.

This framework integrates multiple applications sharing one output, providing a simple way to handle data, video, and sound. Its flexible remote control system scales well and gives creators a path beyond the limits of any single application. It can even handle application crashes, because the output is independent of the applications.
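The framework itself is not published here, so as a rough illustration of that last point, here is a minimal Python sketch of an output stage that stays alive when one of its sources crashes. The source names and the "black" fallback frame are purely hypothetical, not the framework's actual behavior:

```python
# Hypothetical sketch: an output stage independent of its applications.
# Each "application" is modelled as a callable returning its latest frame;
# a crashed application raises an exception instead.

def shared_output(sources, fallback="black"):
    """Compose one output from whichever sources are still alive."""
    composed = []
    for name, get_frame in sources.items():
        try:
            composed.append((name, get_frame()))
        except Exception:
            # A crashed application yields a placeholder; the show goes on.
            composed.append((name, fallback))
    return composed
```

Because the output loop owns the final frame and merely *asks* each application for content, one application dying never takes the projection down with it.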

Ecosystem features
– Shared output
– Shared control
– Flexible remote control
– Scalable data-flow to connect video, audio, and data

This ecosystem was developed for the “Untitled” POEX live performance.
For this performance we chose to work only with free applications.
Pure Data (puredata.info), a visual programming environment, together with chdh's Egrégore (chdh.net/egregore_source_manual.htm), an audiovisual instrument environment developed for the egregore performance. This series of patches provided us with a particle system for creating an audiovisual organic life form that would represent the placenta.
Unity (unity3d.com) gave us the right environment for simulating a physical space where the performer's hands could interact with virtual objects, but Unity does not handle typography animation very well.
Emotion (adrienm.net/emotion/), a real-time animation environment that offers many interesting features for text animation.

Working on a MacBook Retina 2.4 GHz i5.
NOTE: there are some performance issues during the video capture because, on top of running the whole ecosystem, I had to open a camera and capture the screen. The sound in this video was not recorded during the simulation.