First open call
presented by

VRHAM!:
INTERACTIVE
ARTS


Are you a creative or artistic team developing interactive arts projects?

Are you willing to experiment and prototype collective and immersive experiences?

Do you want to open new paths for innovative and participative approaches with audiences in the creative industries?

Collective and immersive digital experiences for innovation in the creative industries

Discover the first open call, focusing on INTERACTIVE ARTS

We are now looking for creators and artists working in the field of Interactive Arts who want to develop (or further develop) an existing artistic project or work in progress by exploring the possibilities opened up by innovations in 3D-scanning technology. We want to encourage groundbreaking, forward-thinking implementations of this technology and of participative interaction in your artistic work. We want to support visionary approaches to digital storytelling and see your project evolve by adding a layer of participation and collective accessibility.

The aim is to showcase the results of the residency within the framework of a conference for relevant experts and industry stakeholders during the VRHAM! festival in Hamburg (4–12 June 2021), to present the final state of the project to the public, and to engage in a fruitful exchange with experts in the field.

Timeline

Call opening
1 February 2021

Call closing
15 March 2021

Publishing of the results
End of March 2021

Start of the residency
End of April / beginning of May 2021

Showcase
Two days during the VRHAM! Festival
4–12 June 2021 in Hamburg, Germany

Real-In
Technology

The technology provided by Dark Euphoria (Marseille, France) enables a new kind of narrative project through a real-time volumetric capture device: a multi-camera Kinect system that combines 3D scanning and mapping for fluid human/machine interaction, rendered live, with no technical devices worn by the public. The audience can be completely immersed in the experience without having to handle joysticks or carry heavy bags, glasses or other equipment. The technology functions as a «state machine» that automatically validates (or rejects) each action of the participants according to the current phase of the experience, for example by detecting jumps or the range of their arm movements.
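
To make the «state machine» idea more concrete, here is a minimal sketch in Python. It is purely illustrative: the phase names, action types and thresholds are hypothetical assumptions, not part of the Real-In system, which is not publicly documented. It only shows the general pattern of validating participant actions per phase and advancing the experience when an action is accepted.

from dataclasses import dataclass

@dataclass
class Action:
    kind: str          # e.g. "jump" or "arm_raise", as detected by the volumetric capture
    magnitude: float   # e.g. jump height or arm range, normalised to 0..1

# Hypothetical rules: for each phase, the expected action, the minimum
# magnitude needed to validate it, and the phase that follows.
PHASES = {
    "intro":       {"expects": "arm_raise", "threshold": 0.5, "next": "exploration"},
    "exploration": {"expects": "jump",      "threshold": 0.3, "next": "finale"},
    "finale":      None,  # terminal phase, no further actions expected
}

class ExperienceStateMachine:
    def __init__(self) -> None:
        self.phase = "intro"

    def handle(self, action: Action) -> bool:
        """Validate an incoming action against the current phase.
        Returns True and advances the phase if the action is accepted."""
        rule = PHASES[self.phase]
        if rule is None:
            return False  # experience finished, ignore further actions
        if action.kind == rule["expects"] and action.magnitude >= rule["threshold"]:
            self.phase = rule["next"]
            return True
        return False

# Example run: a participant raises their arms, then jumps.
machine = ExperienceStateMachine()
print(machine.handle(Action("arm_raise", 0.8)))  # True  -> phase becomes "exploration"
print(machine.handle(Action("jump", 0.4)))       # True  -> phase becomes "finale"
print(machine.handle(Action("jump", 0.4)))       # False -> experience already finished

In the actual installation, the actions would of course come from the live 3D capture rather than from hand-written values; the sketch only illustrates how each phase can accept or reject participant behaviour without any worn device.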

Discover
the technology