After the Skopéin exhibition at the Stadtkirche Karlsruhe was enthusiastically received by visitors, Michael and I are currently experimenting with reconstructing this experience in virtual reality (VR). In doing so, we are pursuing two main objectives:

  • We aim to recreate the church as realistically as possible1 and to transfer Skopéin into the virtual space on a 1:1 scale. This approach is intended to provide insights into the differences between a “physical” and a “virtual” media art installation.
  • Additionally, we seek to significantly expand Skopéin – specifically the projection itself – by liberating it from its physical constraints.

To achieve this, we remodeled the church’s interior architecture in Blender. Since the model must function in VR – in our case, on a Meta Quest 3 – we must constantly ensure that we do not overburden the VR hardware with excessive computational load. Thus, the modeling process was conceived as a continuous compromise between visual opulence and minimalism.

Wireframe view of the interior (screenshot from Blender). In order to stay as close as possible to the original reference, I decided to model the church at its original scale (i.e. 1 physical meter equals 1 virtual meter) in Blender. During this process, it proved very valuable that we had taken real measurement data on location in Karlsruhe with a laser rangefinder (STABILA LD 250 BT) back in 2022.
Raytrace rendering of the interior (Blender Cycles) – here it becomes apparent how much the lighting inside the church influences the overall look and feel.
Raytrace rendering of the interior (Blender Cycles) – since the coloring of the windows was a huge inspiration for Skopéin, it was very important to us to convey the influence of the lower windows on the church’s atmosphere.

Connecting Meta Quest 3 to Blender

During texturing and lighting, we experimented with Blender’s VR plugin (VR Scene Inspection), which allows the scene to be displayed in real time on the Meta Quest. The setup process is a bit of a hassle, since it requires a suitable cable2, the Meta Quest Link app, and the Meta Quest Developer Hub on the PC. Both apps need to run simultaneously and require a stable cable connection to the headset.

Screenshot of the Meta Quest Developer Hub – in order to enable VR Scene Inspection in Blender, the Meta Quest Link (bottom right corner) needs to be enabled and set to ‘Cable’.

To avoid the error message in Blender’s VR Scene Inspection (“Failed to get device information. Is a device plugged in?”), set the link switch in the Meta Quest Developer Hub to ‘ON’ and the link mode to ‘Cable’. After that, you should be able to trigger ‘Start VR Session’ in Blender without any issues.
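Once the cable link is up, the same steps can also be scripted from Blender’s Python console. This is a minimal sketch using Blender’s own `bpy` operators (the VR Scene Inspection add-on’s module name is `viewport_vr_preview`); it assumes an active OpenXR runtime via Meta Quest Link and must be run inside Blender:

```python
# Sketch: enable Blender's VR Scene Inspection add-on and start a VR session
# from the Python console. Run inside Blender with Meta Quest Link active.
import bpy

# Enable the bundled add-on that provides the 'VR' sidebar panel
bpy.ops.preferences.addon_enable(module="viewport_vr_preview")

# Equivalent to clicking 'Start VR Session' in the 3D viewport sidebar
bpy.ops.wm.xr_session_toggle()
```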

Although I initially had relatively low expectations, it became evident that working in Blender in VR was significantly more intuitive and ‘fun’ than anticipated. The ability to assume perspectives that closely resemble those of the end user was especially convenient. (It should be noted, however, that the real-time VR view only works with EEVEE, not with Cycles.)

Experimenting with the Meta Quest 3 and Blender (snapshot of Michael Johansson at the lab for media aesthetics at Hochschule Bonn-Rhein-Sieg) – quite a boost in productivity, since we could prototype different lighting and/or texturing scenarios on the fly without needing to export to Unity.

Exporting a comprehensive FBX from Blender for Unity

As beneficial as the VR link between Blender and the Quest was, our ultimate goal is to transfer the model into a robust environment – i.e. a professional game engine (in our case: Unity) – in order to conduct user studies and to exhibit the installation. Consequently, after modeling and texturing, an export to a file format supported by Unity was necessary. For compatibility reasons, we opted for FBX.
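The export itself can be scripted, which makes it easy to re-run with consistent settings after each cleanup pass. A sketch using Blender’s FBX export operator (`path_mode` and `embed_textures` are actual parameters of `bpy.ops.export_scene.fbx`; the filename and sample count of a scripted export like this are our own choices, so treat the values as a starting point):

```python
# Sketch: scripted FBX export from Blender for a Unity import.
# Run inside Blender; '//' makes the path relative to the .blend file.
import bpy

bpy.ops.export_scene.fbx(
    filepath="//church.fbx",
    path_mode='COPY',       # copy texture files alongside the export ...
    embed_textures=True,    # ... and pack them into a single binary FBX
)
```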

Screenshot of the church in Unity – correct textures and UV-maps, but the lighting is way off. Here it becomes apparent that we will have to do a lot of texture baking with Blender’s Cycles.
Screenshot of the church in Unity – the game engine is quite good at estimating the normal direction of each face, which becomes apparent when backface culling is turned on.

The FBX export, however, is not straightforward when it comes to textures. Experimenting in Blender with VR led us to a workflow that was less focused on technical particularities and more driven by aesthetic considerations. By ‘moving fast and breaking things’ we relied on quite a few tricks and hacks to achieve a certain visual look and feel in Blender that cannot be represented in the FBX file format. Since the final application will be built in Unity and therefore depends on FBX, it became necessary to ‘clean up’ the Blender file after the experimental phase. To achieve a seamless export from Blender to Unity, the following aspects had to be taken into account:

Learnings:

Materials:

  • No nodes should sit between the Image Texture node and the Principled BSDF shader node. For example, while experimenting we used a Contrast node, which resulted in the material not being carried over into the FBX. Any adjustments (such as contrast, brightness, etc.) must therefore be baked into the image texture itself earlier in the pipeline.
  • The FBX export only supports a single shader type: Principled BSDF. All other shader nodes are simply ignored (e.g., emissions must be handled through the emission input within the Principled BSDF, rather than through a separate Emission BSDF node).
  • The Mapping node, typically used between the Texture Coordinate and Image Texture nodes, must be set to a ‘neutral’ position, i.e., scale = 1,1,1; rotation = 0,0,0; and location = 0,0,0. Any other settings will be ignored during the export. (The FBX exporter “doesn’t understand any nodes, only the normal map can have a normal map node”.) If scaling or rotation of the texture is still required, it must be done ‘manually’ in the UV editor by selecting all polygons associated with the texture and transforming their UVs there.
  • In the FBX export settings, it is advisable to set the path mode to ‘Copy’. If a binary FBX is desired (i.e., if all materials/textures and mesh geometry are to be packaged into a single file), the ‘Embed Textures’ option (the small, inconspicuous button next to the dropdown) must be activated. Otherwise, Blender creates a folder at the export location where all the textures are saved as image files.
  • Procedural textures generated within Blender (e.g., via Noise node) are not automatically exported, so they must be baked beforehand.
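The Mapping-node workaround above amounts to applying the node’s transform to the UV coordinates themselves, so the node can stay neutral. The following pure-Python sketch illustrates the math (the function name is ours, and whether you need this transform or its inverse depends on how your Mapping node was set up, so verify the result visually in the UV editor):

```python
# Illustrative sketch: bake a Mapping-node-style transform
# (scale -> rotate -> translate) directly into UV coordinates.
import math

def bake_mapping_into_uvs(uvs, scale=(1.0, 1.0), rotation=0.0, location=(0.0, 0.0)):
    """Return new (u, v) pairs with the transform applied, so the
    Mapping node itself can be reset to neutral before the FBX export."""
    cos_r, sin_r = math.cos(rotation), math.sin(rotation)
    out = []
    for u, v in uvs:
        su, sv = u * scale[0], v * scale[1]               # scale
        ru = su * cos_r - sv * sin_r                      # rotate around the UV origin
        rv = su * sin_r + sv * cos_r
        out.append((ru + location[0], rv + location[1]))  # translate
    return out

# A quad's UVs scaled 2x and shifted by 0.5 on U:
quad = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(bake_mapping_into_uvs(quad, scale=(2.0, 2.0), location=(0.5, 0.0)))
# → [(0.5, 0.0), (2.5, 0.0), (2.5, 2.0), (0.5, 2.0)]
```

In Blender itself the equivalent is selecting the affected faces in the UV editor and using its scale/rotate/move tools; this snippet only makes explicit what that manual step does.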

Normals:

  • All objects need to have ‘correct’ normals, i.e. before exporting it makes sense to recalculate all normals in Blender and do a visual check with backface culling turned on (and/or with a view mode that displays the normal direction).
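The recalculation can be done in one pass over the whole scene instead of object by object. A sketch using standard `bpy` operators (`normals_make_consistent` is Blender’s ‘Recalculate Outside’), to be run inside Blender:

```python
# Sketch: recalculate outward-facing normals on every mesh object
# before export. Run inside Blender's script editor or Python console.
import bpy

for obj in bpy.data.objects:
    if obj.type != 'MESH':
        continue
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.mesh.normals_make_consistent(inside=False)  # 'Recalculate Outside'
    bpy.ops.object.mode_set(mode='OBJECT')
```

A visual check with backface culling afterwards is still worthwhile, since the operator can only guess the intended orientation of non-manifold geometry.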

Lights:

  • Only point and directional lights are exported. The area lights we used as fill lights are ignored during export. Correction: Unity does import the area lights, but only as baked lights. Since this does not improve our workflow, we will do a complete light bake in Blender with Cycles first and import the baked textures into Unity.
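For that light bake, the core Cycles setup can also be scripted. A minimal sketch (the sample count is our own choice; it assumes the active object already has UVs and an image texture node selected as the bake target, and must be run inside Blender):

```python
# Sketch: bake combined lighting into a texture with Cycles, so the
# area-light contribution survives the FBX round-trip into Unity.
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.samples = 256            # bake quality vs. time trade-off

# Writes into the image of the selected (active) Image Texture node
bpy.ops.object.bake(type='COMBINED')
```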


Footnotes

  1. By “realistic,” we do not mean striving for actual photorealism; rather, our goal is to transfer the atmosphere that was palpable in the church at that time as effectively as possible into VR. In this sense, the concept of verisimilitude would be more appropriate than realism. This means focusing less on visual accuracy and more on capturing and conveying the essence and feeling of the space, allowing the virtual experience to evoke the same sense of presence and ambiance as the original installation in the church. ↩︎
  2. I experimented with a lot of different cables, but the only one that worked reliably was the quite expensive original cable from Meta’s store. Also make sure that your motherboard has a native USB-C connector, since adapters from USB-A to USB-C also cause issues like lag or disconnects. ↩︎