VR Transformation

Investigating the differences between a physical and a virtual media art installation – how to bring Skopéin into the Meta Quest 3

Following the enthusiastic reception of the Skopéin exhibition at the Stadtkirche Karlsruhe, Michael and I are currently experimenting with reconstructing this experience in virtual reality (VR). In doing so, we are pursuing two main objectives:

  • We aim to recreate the church as realistically as possible¹ and to transfer Skopéin into the virtual space on a 1:1 scale. This approach is intended to provide insights into the differences between a “physical” and a “virtual” media art installation.
  • Additionally, we seek to significantly expand Skopéin – specifically the projection itself – by liberating it from its physical constraints.

To achieve this, we remodeled the church’s interior architecture in Blender. Since the model must function in VR – in our case, on a Meta Quest 3 – we must constantly ensure that we do not overburden the VR hardware with excessive computational load. Thus, the modeling process was conceived as a continuous compromise between visual opulence and minimalism.
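To keep that compromise measurable, a small helper script can print triangle counts straight from Blender’s Python console – a minimal sketch (the per-object breakdown is our own convention, not a hard budget):

```python
import bpy

# Minimal sketch: print evaluated triangle counts per mesh, so we can keep
# an eye on the polygon budget for the Quest 3 while modeling.
depsgraph = bpy.context.evaluated_depsgraph_get()
total = 0
for obj in bpy.context.scene.objects:
    if obj.type != 'MESH':
        continue
    eval_obj = obj.evaluated_get(depsgraph)
    mesh = eval_obj.to_mesh()
    tris = sum(len(poly.vertices) - 2 for poly in mesh.polygons)  # fan triangulation
    total += tris
    print(f"{obj.name}: {tris} tris")
    eval_obj.to_mesh_clear()
print(f"Scene total: {total} tris")
```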

Wireframe view of the interior (screenshot from Blender). In order to stick as closely as possible to the original reference, I decided to model the church at its original scale (i.e. 1 physical meter equals 1 virtual meter) in Blender. During this process it proved very valuable that we had taken real measurements on location in Karlsruhe with a laser rangefinder (STABILA LD 250 BT) back in 2022.
Raytrace rendering of the interior (Blender Cycles) – here it becomes apparent how much the lighting inside the church influences the overall look and feel.
Raytrace rendering of the interior (Blender Cycles) – since the coloring of the windows was a huge inspiration for Skopéin, it was very important to us to convey the influence of the lower windows to the church’s atmosphere.

Connecting Meta Quest 3 to Blender

During texturing and lighting, we experimented with Blender’s VR plugin (VR Scene Inspection), which allows the scene to be displayed in real time on the Meta Quest. The setup process is a bit of a hassle, since it requires a suitable cable², the Meta Quest Link app and the Meta Quest Developer Hub on the PC. Both apps need to run simultaneously and require a stable cable connection to the headset.

Screenshot of the Meta Quest Developer Hub – in order to enable VR Scene Inspection in Blender, the Meta Quest Link (bottom right corner) needs to be enabled and set to ‘Cable’.

To avoid the error message in Blender’s VR Scene Inspection (“Failed to get device information. Is a device plugged in?”), set the link switch in the Meta Quest Developer Hub to ‘ON’ and the link to ‘Cable’. After that you should be able to trigger ‘Start VR Session’ in Blender without any issues.
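For what it’s worth, the session can also be started from Blender’s Python console instead of clicking the button – a minimal sketch, assuming the bundled ‘VR Scene Inspection’ add-on (module name viewport_vr_preview):

```python
import bpy

# Enable the bundled "VR Scene Inspection" add-on (ships with Blender).
bpy.ops.preferences.addon_enable(module="viewport_vr_preview")

# Toggle the OpenXR session -- equivalent to pressing 'Start VR Session'.
# This fails with the same "Failed to get device information" error if
# Meta Quest Link is not set to 'ON' and 'Cable' beforehand.
bpy.ops.wm.xr_session_toggle()
```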

Although I had relatively low expectations initially, it became evident that working in Blender in VR was significantly more intuitive and ‘fun’ than expected. The ability to assume perspectives that closely resemble those of the end user was especially convenient. (It should be noted, however, that the real-time VR view only works with EEVEE, not with Cycles.)

Experimenting with the Meta Quest 3 and Blender (snapshot of Michael Johansson at the lab for media aesthetics at Hochschule Bonn-Rhein-Sieg) – quite a boost in productivity, since we could prototype different lighting and/or texturing scenarios on the fly without needing to export to Unity.

Exporting a comprehensive FBX from Blender for Unity

As beneficial as the VR link between Blender and the Quest was, our ultimate goal is to transfer the model into a robust environment – i.e. a professional game engine (in our case: Unity) in order to conduct user studies and to exhibit the installation. Consequently, after modeling and texturing, an export to a file format supported by Unity was necessary. For compatibility reasons, we opted for FBX.
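For reference, this export can be reduced to a single call in Blender’s Python API – a minimal sketch (the file path is a placeholder), anticipating the export settings discussed under ‘Learnings’ below:

```python
import bpy

# Minimal sketch of the FBX export for Unity.
# path_mode='COPY' together with embed_textures=True produces a single
# binary FBX with all image textures packed in (see 'Materials' below).
bpy.ops.export_scene.fbx(
    filepath="//skopein_church.fbx",      # placeholder path, relative to the .blend
    path_mode='COPY',
    embed_textures=True,
    apply_scale_options='FBX_SCALE_ALL',  # keep the 1:1 metric scale intact
)
```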

Screenshot of the church in Unity – correct textures and UV-maps, but the lighting is way off. Here it becomes apparent that we will have to do a lot of texture baking with Blender’s Cycles.
Screenshot of the church in Unity – the game engine is quite good at estimating the normal direction of each face, which becomes apparent when backface culling is turned on.

The FBX export, however, is not straightforward when it comes to textures. Experimenting in Blender with VR led us to a workflow that was less focused on technical particularities and more driven by aesthetic considerations. By ‘moving fast and breaking things’ we employed quite a few tricks and hacks during our creative process to achieve a certain visual look and feel in Blender – many of which cannot be represented in the FBX file format. Since the final application will be built in Unity and therefore has to rely on FBX, it became necessary to ‘clean up’ the Blender file after the experimental phase. To achieve a seamless export from Blender to Unity, the following aspects had to be taken into account:

Learnings:

Materials:

  • No nodes should exist between the Image Texture node and the Principled BSDF shader node. For example, while experimenting we used a Contrast node, which resulted in the material not being carried over into the FBX. Any adjustments (such as contrast, brightness, etc.) must therefore be applied directly to the image texture earlier in the pipeline (see the audit sketch after this list).
  • The FBX export only supports a single shader type: Principled BSDF. All other shader nodes are simply ignored (e.g., emissions must be handled through the emission input within the Principled BSDF, rather than through a separate Emission BSDF node).
  • The Mapping node, typically used between the Texture Coordinate and Image Texture nodes, must be set to a ‘neutral’ position, i.e., scale = 1,1,1; rotation = 0,0,0; and location = 0,0,0. Any other settings will be ignored during the export. (The FBX exporter “doesn’t understand any nodes, only the normal map can have a normal map node”.) If scaling or rotation of the texture is still required, it must be done ‘manually’ by selecting all polygons associated with the texture and transforming their UVs in the UV editor.
  • In the FBX export settings, it is advisable to set the path mode to ‘Copy’. If a binary FBX is desired (i.e., if all materials/textures and mesh geometry are to be packaged into a single file), the ‘Embed Textures’ option (the small, inconspicuous button next to the dropdown) must be activated. Otherwise, Blender creates a folder at the export location where all the textures are saved as image files.
  • Procedural textures generated within Blender (e.g., via a Noise Texture node) are not automatically exported, so they must be baked beforehand (the baking sketch under ‘Lights’ below applies here as well).
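To catch these material pitfalls before each export, a small audit script can walk every material and flag setups the exporter will drop – a minimal sketch (the node-type names are Blender’s internal identifiers; the checks mirror the points above):

```python
import bpy
from mathutils import Vector

# Minimal audit sketch: flag material setups the FBX exporter will drop.
for mat in bpy.data.materials:
    if not mat.use_nodes:
        continue
    for node in mat.node_tree.nodes:
        # 1) Image Texture nodes must feed the Principled BSDF directly
        #    (a Normal Map node in between is the only exception).
        if node.type == 'TEX_IMAGE':
            for link in node.outputs['Color'].links:
                if link.to_node.type not in {'BSDF_PRINCIPLED', 'NORMAL_MAP'}:
                    print(f"{mat.name}: '{node.name}' feeds a {link.to_node.type} node -> lost in FBX")
        # 2) Mapping nodes must be 'neutral' (location 0, rotation 0, scale 1).
        elif node.type == 'MAPPING':
            loc = Vector(node.inputs['Location'].default_value)
            rot = Vector(node.inputs['Rotation'].default_value)
            scale = tuple(node.inputs['Scale'].default_value)
            if loc.length > 0 or rot.length > 0 or scale != (1.0, 1.0, 1.0):
                print(f"{mat.name}: non-neutral Mapping node -> bake the transform into the UVs")
        # 3) Shader nodes other than the Principled BSDF are ignored.
        elif node.type == 'EMISSION':
            print(f"{mat.name}: separate Emission node -> use the Principled BSDF emission input")
```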

Normals:

  • All objects need to have ‘correct’ normals, i.e. before exporting it makes sense to recalculate all normals in Blender and to do a visual check with backface culling turned on (and/or with a view mode that displays the normal direction).
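This, too, can be scripted – a minimal sketch that recalculates outside-facing normals for every mesh in the scene (assuming we start in Object Mode):

```python
import bpy

# Minimal sketch: recalculate all normals to point outside, per mesh object.
for obj in bpy.context.scene.objects:
    if obj.type != 'MESH':
        continue
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.mesh.normals_make_consistent(inside=False)  # 'Recalculate Outside'
    bpy.ops.object.mode_set(mode='OBJECT')
```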

Lights:

  • Only point and directional lights are exported; the area lights we used as fill lights are ignored. Correction: Unity will import the area lights, but they are only available as baked lights. Since this does not improve our workflow, we will do a complete light bake in Blender with Cycles first and import the baked textures into Unity.
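A minimal sketch of how such a Cycles bake could be scripted (image name, resolution, sample count and output path are placeholders; it assumes the object to bake is selected and active):

```python
import bpy

# Minimal baking sketch, assuming the object to bake is selected and active.
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.samples = 256  # placeholder sample count

obj = bpy.context.active_object
img = bpy.data.images.new("skopein_bake", 2048, 2048)  # placeholder name/size

# Every material needs an active Image Texture node as the bake target.
for slot in obj.material_slots:
    nodes = slot.material.node_tree.nodes
    tex = nodes.new('ShaderNodeTexImage')
    tex.image = img
    nodes.active = tex

# 'COMBINED' bakes the full lighting (incl. our area fill lights) into the
# image; 'DIFFUSE' etc. would work the same way for procedural textures.
bpy.ops.object.bake(type='COMBINED')
img.filepath_raw = "//skopein_bake.png"  # placeholder output path
img.file_format = 'PNG'
img.save()
```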

Detour (?): Experiments with real-time lighting in Unreal Engine

Since light plays such an important role in this VR experience, I ran some experiments with the real-time lighting system in Unreal Engine 5.5 to determine whether the Meta Quest 3 is capable of this kind of GPU-intensive calculation. Going down this path, it could become possible to change the whole lighting inside the church on the fly (e.g. depending on the VR user’s preferences, the time of day, etc.). The first experiments look promising, although Unreal seems to be a little pickier about mesh quality. (There were some n-gons in the outer wall structure that needed to be fixed in Blender before exporting an FBX; a small helper sketch for finding them follows.)
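Hunting down those n-gons by eye is tedious – a minimal sketch that selects every face with more than four sides on the active mesh:

```python
import bpy

# Minimal sketch: select all n-gons (faces with more than 4 sides) on the
# active mesh so they can be triangulated or remodeled before the export.
bpy.ops.object.mode_set(mode='EDIT')
bpy.context.tool_settings.mesh_select_mode = (False, False, True)  # face mode
bpy.ops.mesh.select_all(action='DESELECT')
bpy.ops.mesh.select_face_by_sides(number=4, type='GREATER')
```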

Minor corrections were necessary in Unreal’s shader graph: occasionally the UV rotation of a texture is flipped by 90°. This can be corrected with a custom rotator node, setting the rotation angle to 0.25 (the range goes from 0 to 1, therefore 0.25 = 90°).

Unreal also seems to sometimes confuse the UV rotation of certain (not all!) textures when using an FBX with packed assets that was exported from Blender.

Screenshot directly out of Unreal’s viewport: Real time lightmap of the model without any baked shadows.

I do like Unreal’s workflow quite a bit. Especially the integration of the Skopéin projection into the scene is quite convenient: a single decal actor with a material configured with a diffuse and an alpha map (material domain set to ‘Deferred Decal’, blend mode set to ‘Translucent’).

Screenshot directly out of Unreal’s viewport. I am still impressed that this kind of real time lighting quality is possible on a consumer device such as the Meta Quest 3.

Props & scale

Both experiments – in Unity and in Unreal – revealed some potential for improvement in the church’s 3D model. Firstly, some textures need to be replaced and/or fine-tuned to avoid ugly repetition artifacts, and some UV maps need to be optimized.

The most important observation was that the VR user loses their sense of scale rather quickly when confronted with a mostly empty room or environment. We therefore decided to equip the space with props and furniture whose real-world dimensions everybody understands at least roughly. By placing these objects in the vicinity of the user’s spawn point, we hope they will articulate the scale and dimensions of the architecture far better than an empty environment would.

Work in progress: furniture for the church (mostly untextured, vertex colors only). In order to stick to a coherent art style and to retain full control over the poly count, we refrain from using any assets beyond our own.

After tweaking the textures and equipping the church with the furniture/props, we will experiment to determine which direction to pursue: baking all textures and lighting with a ‘real’ raytracer in Blender and continuing development in Unity? Making use of the possibilities of real-time lighting in Unreal? Developing a hybrid between real-time and baked lights? Skopéin is to be continued.

Quick raytrace rendering in Blender Cycles of the church with all props and other furniture (e.g. the main organ at the top end, etc.). Please excuse the noise level; I had to make do with my old GTX 980.

Footnotes

  1. By “realistic,” we do not mean striving for actual photorealism; rather, our goal is to transfer the atmosphere that was palpable in the church at that time as effectively as possible into VR. In this sense, the concept of verisimilitude would be more appropriate than realism. This means focusing less on visual accuracy and more on capturing and conveying the essence and feeling of the space, allowing the virtual experience to evoke the same sense of presence and ambiance as the original installation in the church. ↩︎
  2. I experimented with a lot of different cables, but the only one that worked reliably was the quite expensive original cable from Meta’s store. Also make sure that your motherboard can output via a native USB-C connector, since adapters from USB-A to USB-C cause issues like lag or disconnects. ↩︎