Following the enthusiastic reception of the Skopéin exhibition at the Stadtkirche Karlsruhe, Michael and I are currently experimenting with reconstructing this experience in virtual reality (VR). In doing so, we are pursuing two main objectives:
- We aim to recreate the church as realistically as possible¹ and to transfer Skopéin into the virtual space on a 1:1 scale. This approach is intended to provide insights into the differences between a “physical” and a “virtual” media art installation.
- Additionally, we seek to significantly expand Skopéin – specifically the projection itself – by liberating it from its physical constraints.
To achieve this, we remodeled the church’s interior architecture in Blender. Since the model must function in VR – in our case, on a Meta Quest 3 – we must constantly ensure that we do not overburden the VR hardware with excessive computational load. Thus, the modeling process was conceived as a continuous compromise between visual opulence and minimalism.
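To keep an eye on that compromise, a simple triangle budget helps. The sketch below is a minimal, engine-agnostic example: the budget figure and the per-object triangle counts are illustrative assumptions, not official Meta Quest 3 limits (in Blender, the real counts would come from the statistics overlay or from `bpy`).

```python
# Rough triangle-budget check for a standalone VR scene. The budget and
# the example counts are made-up assumptions for illustration only.
TRI_BUDGET = 500_000

def triangle_report(tri_counts, budget=TRI_BUDGET):
    """Return (total, within_budget, objects sorted heaviest-first) so it
    is obvious where to simplify when the scene is too heavy."""
    total = sum(tri_counts.values())
    heaviest = sorted(tri_counts.items(), key=lambda kv: kv[1], reverse=True)
    return total, total <= budget, heaviest

# Hypothetical counts for a few church meshes:
total, ok, heaviest = triangle_report(
    {"nave_walls": 180_000, "pews": 240_000, "organ": 150_000}
)
# total = 570_000, ok = False, heaviest[0] = ("pews", 240_000)
```

Running such a report after every modeling session makes the “opulence vs. minimalism” trade-off measurable instead of a gut feeling.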



Connecting Meta Quest 3 to Blender
During texturing and lighting, we experimented with Blender’s VR plugin (VR Scene Inspection), which allows the scene to be displayed in real time on the Meta Quest. The setup process is a bit of a hassle, since it requires a suitable cable², the Meta Quest Link app, and the Meta Quest Developer Hub on the PC. Both apps need to run simultaneously and maintain a stable cable connection to the headset.

To avoid the error message in Blender’s VR Scene Inspection (“Failed to get device information. Is a device plugged in?”), set the link switch in the Meta Quest Developer Hub to ‘ON’ and the link type to ‘Cable’. After that, you should be able to trigger ‘Start VR Session’ in Blender without any issues.
Although I had relatively low expectations initially, it became evident that working in Blender in VR was significantly more intuitive and ‘fun’. The ability to assume perspectives that closely resemble those of the end user was especially convenient. (It should be noted, however, that the real-time VR view only functions with EEVEE and not with Cycles.)

Exporting a comprehensive FBX from Blender for Unity
As beneficial as the VR link between Blender and the Quest was, our ultimate goal is to transfer the model into a robust environment – i.e. a professional game engine (in our case: Unity) in order to conduct user studies and to exhibit the installation. Consequently, after modeling and texturing, an export to a file format supported by Unity was necessary. For compatibility reasons, we opted for FBX.


The FBX export, however, is not straightforward when it comes to textures. Experimenting in Blender with VR led us to a workflow that was less focused on technical particularities and more driven by aesthetic considerations. By ‘moving fast and breaking things’ we employed quite a few tricks and hacks during our creative process to achieve a certain visual look and feel in Blender – approaches that cannot be directly represented in the FBX file format. Since the final application will be built in Unity and must therefore rely on FBX, it became necessary to ‘clean up’ the Blender file after the experimental phase. To achieve a seamless export from Blender to Unity, the following aspects had to be taken into account:
Learnings:
Materials:
- No nodes should exist between the `Image Texture` node and the `Principled BSDF` shader node. For example, while experimenting we used a `Contrast` node, which resulted in the material not being carried over into the FBX. Any adjustments (such as contrast, brightness, etc.) must therefore be applied directly to the image texture earlier in the pipeline.
- The FBX export only supports a single shader type: `Principled BSDF`. All other shader nodes are simply ignored (e.g., emissions must be handled through the emission input within the `Principled BSDF`, rather than through a separate `Emission` node).
- The `Mapping` node, typically used between the `Texture Coordinate` and `Image Texture` nodes, must be set to a ‘neutral’ position, i.e., scale = (1, 1, 1), rotation = (0, 0, 0), and location = (0, 0, 0). Any other settings will be ignored during the export. (The FBX exporter “doesn’t understand any nodes, only the normal map can have a normal map node“.) If scaling or rotation of the texture is still required, it must be done ‘manually’ in the UV editor by selecting all polygons associated with the texture and adjusting them there.
- In the FBX export settings, it is advisable to set the path mode to ‘Copy’. If a binary FBX is desired (i.e., if all materials/textures and mesh geometry are to be packaged into a single file), the ‘Embed Textures’ option (the small, inconspicuous button next to the dropdown) must be activated. Otherwise, Blender creates a folder at the export location where all the textures are saved as image files.
- Procedural textures generated within Blender (e.g., via a `Noise Texture` node) are not automatically exported, so they must be baked beforehand.
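Rules like these can be checked mechanically before each export. The sketch below uses a deliberately simplified, hypothetical model of a material node graph (plain dicts and tuples instead of `bpy` types) just to make two of the pitfalls explicit; in Blender one would walk `material.node_tree.nodes` and `links` instead.

```python
def fbx_material_issues(nodes, links):
    """nodes: {name: node_type}; links: [(from_node, to_node)].
    Flags two FBX pitfalls described above: shader nodes other than the
    Principled BSDF, and any node sitting between an image texture and
    the shader (e.g. a Contrast node), which the exporter silently drops."""
    issues = []
    for name, ntype in nodes.items():
        # only the Principled BSDF survives the export
        if (ntype.startswith("BSDF_") or ntype == "EMISSION") and ntype != "BSDF_PRINCIPLED":
            issues.append(f"shader '{name}' ({ntype}) will be ignored by the FBX exporter")
    for src, dst in links:
        # an image texture must feed the Principled BSDF directly
        if nodes[src] == "TEX_IMAGE" and nodes[dst] != "BSDF_PRINCIPLED":
            issues.append(f"node '{dst}' between texture '{src}' and the shader will be dropped")
    return issues

# The Contrast-node mistake from our experiments, in miniature:
nodes = {"tex": "TEX_IMAGE", "contrast": "BRIGHTCONTRAST", "shader": "BSDF_PRINCIPLED"}
links = [("tex", "contrast"), ("contrast", "shader")]
# fbx_material_issues(nodes, links) flags the 'contrast' node
```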
Normals:
- All objects need to feature ‘correct’ normals, i.e. before exporting it makes sense to recalculate all normals in Blender and do a visual check with backface culling turned on (and/or with a viewport overlay that displays the normal direction).
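What ‘correct’ means here: with counter-clockwise winding (Blender’s convention), a face normal should point away from the mesh interior. A minimal sketch of that check for a single triangle, in pure vector math with no `bpy`, assuming the interior point (e.g. the mesh centroid) is known:

```python
def cross(a, b):
    # right-hand-rule cross product of two 3D vectors
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def faces_outward(v0, v1, v2, interior=(0.0, 0.0, 0.0)):
    """True if the triangle's normal (counter-clockwise winding) points
    away from the given interior point -- i.e. the face would survive
    backface culling when viewed from outside the mesh."""
    normal = cross(sub(v1, v0), sub(v2, v0))
    centroid = tuple((a + b + c) / 3 for a, b, c in zip(v0, v1, v2))
    return sum(n * d for n, d in zip(normal, sub(centroid, interior))) > 0

# A triangle in the plane z = 1, wound CCW as seen from above:
# faces_outward((0, 0, 1), (1, 0, 1), (0, 1, 1))  -> True
# Swapping two vertices flips the winding, and the check fails.
```

Blender’s ‘Recalculate Outside’ does exactly this kind of reasoning for every face at once; the sketch just shows why flipped winding order is what backface culling reveals.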
Lights:
- Only point and directional lights are exported; the area lights we used as fill lights are ignored during export. Correction: Unity does import the area lights, but only as baked lights. Since this does not improve our workflow, we will do a complete light baking in Blender with Cycles first and import the baked textures into Unity.
Detour (?): Experiments with real time lighting in Unreal Engine
Since light plays such an important role in this VR experience, I ran some experiments with the real-time lighting system in Unreal Engine 5.5 to determine whether the Meta Quest 3 is capable of this kind of GPU-intensive calculation. If this route proves viable, it could become possible to change the whole lighting inside the church on the fly (e.g., depending on the VR user’s preferences, the time of day, etc.). The first experiments look promising, although Unreal seems to be a little more picky in terms of mesh quality. (There were some n-gons in the outer wall structure that needed to be fixed in Blender before exporting an FBX.)

Also, Unreal sometimes seems to confuse the UV rotation of certain (not all!) textures when using an FBX with packed assets that was exported from Blender.
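When that happens, the fix is a plain rotation of the UV coordinates, the same operation one would do by hand in Blender’s UV editor. A minimal sketch of a 90° counter-clockwise rotation around the center of UV space:

```python
def rotate_uv_90_ccw(uvs):
    """Rotate UV coordinates 90 degrees counter-clockwise around (0.5, 0.5),
    the center of UV space: (u, v) -> (-v, u) relative to the pivot."""
    return [(0.5 - (v - 0.5), 0.5 + (u - 0.5)) for u, v in uvs]

# Four applications bring every coordinate back to where it started:
corners = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
once = rotate_uv_90_ccw(corners)  # [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.0, 0.0)]
```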

I do like Unreal’s workflow quite a bit. Especially the integration of the Skopéin projection into the scene is quite convenient, using a single decal actor and configuring the material with a diffuse and an alpha map. (Set the material domain to `Deferred Decal` and the blend mode to `Translucent`.)

Props & scale
Both the Unity and the Unreal experiments showed some potential for improvement with regard to the church’s 3D model. First, some textures need to be replaced and/or fine-tuned in order to avoid ugly repetition artifacts. Some UV maps also had to be optimized.
The most important observation was that the VR user loses a sense of scale rather quickly when confronted with a mostly empty room/environment. Therefore, we decided to equip the church with props and furniture whose real dimensions everybody has at least a crude understanding of. By placing these objects in the vicinity of the user’s spawn point, we hope they will articulate the scale/dimensions of the architecture much better than an empty environment would.

After tweaking the textures and equipping the church with the furniture/props, we will decide which direction to continue with: baking all textures and lighting with a ‘real’ raytracer in Blender and continuing the development in Unity? Making use of the possibilities of real-time lighting in Unreal? Developing a hybrid between real-time and baked lights? Skopéin is to be continued.

Footnotes
1. By “realistic,” we do not mean striving for actual photorealism; rather, our goal is to transfer the atmosphere that was palpable in the church at the time as effectively as possible into VR. In this sense, the concept of verisimilitude would be more appropriate than realism: less focus on visual accuracy, more on capturing and conveying the essence and feeling of the space, so that the virtual experience evokes the same sense of presence and ambiance as the original installation in the church. ↩︎
2. I experimented with a lot of different cables, but the only one that worked reliably was the quite expensive original cable from Meta’s store. Also make sure that your motherboard can output via a USB-C connector, since adapters from USB-A to USB-C cause issues like lag or disconnects. ↩︎