VR’s reality replacement

Created using: 
Blender, Scaniverse & Unreal Engine
Time to create: 
2 weeks
Type: 
Individual work

We are living in a world where VR is becoming more and more normalized. But at some point we will need to decide for ourselves where VR fits into our own lives: will we spend days in there, only taking the goggles off to go to the bathroom? And if so, what happens to our planet, and to all the problems we face here, when we try to escape this reality?
These were questions I tried to answer for myself while building this experience.

(Video: English subtitles available)

Virtual reality gives us the opportunity to explore new worlds and experience stories in ways never possible before. This is the time when we start to define what VR can mean for the public, with cheaper headsets and easier development tools than ever before! The 3D models are interactive: the player physically walks around in them, moving through a world which only exists in the virtual realm. But that is also where its danger lies…

With projects like the Metaverse going on, and big companies like Facebook trying to create worlds so attractive that people don’t want to step out, we face a serious change in the landscape of VR. Some designers seem to want to replace our universe, our own world, with digital ones. I created this project to discover where I stand in this conflict and where VR fits into my life. I came to realize that, for me, it is an amazing tool to extend our own reality and to develop 3D models through a more natural interface: it simply makes sense to create 3D models in a 3D environment.

I believe people are starting to escape into the VR world to get away from problems in our physical world that feel unsolvable, like the climate heating up and the plastic drifting in our oceans. This is escapism: the urge to get away into another world so we don’t have to face the consequences in our own. That is exactly why I grounded the experience in a physical location: a 3D scan of the room where the assessment of this project would take place. I tried to build the real world into a VR one to connect the two.

This was the first time I used something called 3D scanning, which uses depth sensors to capture a physical object in three dimensions. I borrowed an iPhone 13 Pro Max with a LiDAR sensor from my school: brand new, out of the box and ready to experiment with. After trying multiple apps I found Scaniverse to work best, and to be the most reliable free option. The app requires some trial and error to get good results though, as the first experiment wasn’t too successful:

A shot of the first 3d scan I did of a wall at my school
The very first scan wasn’t as successful as I hoped it would be, with many holes and stretched textures

 

A 3D scan of multiple floors

The first tests were not very promising: LiDAR can only capture up to about 5 m in front of the sensor, which made it a challenge to scan a room as big as I initially planned. The first idea was to build the entire garage we use as a class location in VR and have the walls blow away. This would reveal the destruction of the world behind it, while in the meantime making the garage prettier than it actually was.

The test with multiple floors turned out okay-ish, but proved how challenging large rooms are for the software: multiple floors were stitched together, and glass windows became a white void, as the LiDAR could not make sense of an object being there with nothing to see.

However, I slowly learned how to use the app, and the scans got better and better in quality, capturing small details as well as larger rooms. It turns out the app is quite capable of capturing very fine detail: if I kept the phone close to the object for long enough, I was able to capture some pretty awesome details in my scans! The rubbish bin you can see here, for example, has all the little see-through dots captured in the scan. And even though it’s far from perfect, with quite some gaps and holes in the backside, it still proved this technology could be very useful in my quest to recreate the physical world in VR. A scan is never perfect, as there always seem to be parts missing, textures overlapping, or scans drifting and melting together in larger areas, but overall I was really impressed with how good a scan I could get from ten minutes of walking around the object.

Modelling this by hand would have produced a perfect object, but would probably have taken at least two hours to get the materials and rendering just right. More importantly, scanning let me capture the entire area and simply relight it, and all the materials already looked like they fit in the same world, because they all came from the same physical place!

A 3D scan of a really big room

The attempt to scan the entire room wasn’t a huge success. After my experiments, which had started to look quite nice, I decided to go for the big challenge. I had one week left, and on a quiet day without classes I sneaked into the huge room which would be my playing area. There were no other students to move around and confuse the algorithm, and after 30 minutes of walking around the entire room, I ended up with the model shown above. There were quite a few holes, and some walls were duplicated, as the room was so big the app could not tell it was looking at the same wall as 15 minutes earlier.

I also did a couple of separate scans over the following days, slowly collecting more and more models. I did one high-resolution scan focused on one specific corner, and decided to limit the player’s movement to only this part of the world. After collecting the most important locations and objects in high resolution, I simply placed them on top of each other, letting the highest-resolution version overlap and hide the rubbish geometry (the basic points which connect to one another in 3D space). Some geometry still poked through; this was removed using Unreal’s basic modelling plugin by slicing parts of the model away.

The scans did contain too many holes to function as a normal room, so I decided to cover the entire level in dark and moody lighting, hiding the problems in the models while keeping them recognizable. Players who have been to the physical location will see the wall painting and the green flooring and instantly understand where they are in the digital space.

The tree used to make the scene a little more magical, lit in Unreal Engine

With the physical space now recognizable, I shifted my focus to making the VR world prettier than the physical one. I imported the tree model from my floating island project, as it already had some nice-looking leaves and hand-painted textures. I wanted to make the space feel alive and organic. It had to change the world from a big old garage into a pretty environment you would feel comfortable retreating into.

This was the moment I tried to visualize the escapism people apply, hiding from the real world in a prettier and safer ‘replacement’: a place where they can enjoy the peace and quiet, and maybe play with other people. I made the tree extra dreamy and visible in the dark environment by placing a spotlight on top of it, bringing out the layers in the leaves and the shadows they create. I was still a bit disappointed by how poorly visible the trunk was in the entirely dark room, as the leaves blocked all the light from reaching it. That is why I added another point light halfway down the trunk, making sure the trunk and roots were fully visible, even from a distance.

I did need to draw the viewer’s attention to the trees popping into existence, as in a VR experience the player could be anywhere in the world, looking at anything they want. In the past, I learned that taking control of the camera away from the player causes extreme nausea in virtual reality: the body does not understand why its actions suddenly have no effect on its vision. So instead, I let a few birds fly in from the doorway that would become important later in the experience, making the player aware there even is a door. The birds have a flapping-wings sound effect, so they draw attention no matter where the player is looking.

 

 


The bird was sculpted in Adobe Medium, slowly working it into a recognizable shape. It needed to look big and friendly: something you would want to follow, like the white rabbit from Alice in Wonderland. After getting some feedback on the bird, I got the tip to paint it white, which looked way better than the colour scheme I originally came up with. I used vertex paint to draw extra feathers onto the wings, as I didn’t want the geometry to become too complex to run in real time or animate.

It turned out to be quite challenging to import the model into both Blender and Unreal Engine. The colours kept disappearing or getting incredibly messy, as the texture map simply did not want to load correctly with the UV map (which tells the software which part of the texture goes where on the model). This was fixed by exporting the models using vertex colours. This adds colour data to every single vertex (the individual points which, when connected together, create the mesh) and keeps the data inside the FBX source file. Unreal did need a material adjusted to make sure it would look for this data instead of using the default, blank material.
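As an aside, the idea behind vertex colours can be shown in a few lines. This is my own plain-Python sketch of how a renderer shades a triangle whose colours live on its corners (it is not the FBX format or Unreal’s material system): every point inside the triangle gets a blend of the three corner colours, weighted by its barycentric coordinates.

```python
def interpolate_vertex_colour(corner_colours, weights):
    """Blend a triangle's three per-vertex colours by barycentric weights.

    Instead of looking a colour up in a texture via the UV map, the
    renderer blends the colours stored on the triangle's corners for
    every point on its surface.
    """
    return tuple(
        sum(colour[i] * w for colour, w in zip(corner_colours, weights))
        for i in range(3)
    )

# Hypothetical example: two white corners and one dark corner,
# like the bird's white body fading into a darker patch.
corners = [(255.0, 255.0, 255.0), (255.0, 255.0, 255.0), (40.0, 40.0, 40.0)]
centre = interpolate_vertex_colour(corners, (1 / 3, 1 / 3, 1 / 3))
```

A point sitting exactly on a corner gets that corner’s colour unchanged; the centre of the triangle gets the average of all three.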

The texture to represent dirt, hand drawn in Affinity Photo

The bird got a simple wing-flapping animation in Blender using a very rough armature. Then it was time to move on to the next section: adding the river. This would create a natural border at the edge of the playing field, and allow bottles to float around in it later in the experience. Those would represent the plastic waste problem we are currently facing in our environment.

I modelled the river in Blender using some quick sculpting of basic geometry, and it actually had surprisingly few faces (3 or more vertices connected to one another) for the size of the object, keeping it nice and light to run. Once done, I drew a dirt texture in Affinity Photo using some splatter brushes, adding the shadows by hand for a cartoon-shader look. This image was then passed through Gimp to make it tileable, so the texture could be seamlessly repeated on larger surfaces. The exported image came out corrupted a few times, but after resetting Gimp’s export settings the export succeeded, ready for the next step.
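The classic way to make a texture tileable (which is what Gimp’s offset tool is used for) is to shift the image by half its size with wrap-around, so the original edges and their seams land in the middle where they can be painted out. A texture tiles seamlessly exactly when its opposite edges match. A minimal sketch of both ideas, using plain Python lists as a stand-in for pixel data:

```python
def offset_half(texture):
    """Shift a 2D grid of pixel values by half its size in both axes,
    wrapping around. The original edges (where seams live) end up
    crossing the centre, where they can be painted out."""
    h, w = len(texture), len(texture[0])
    return [[texture[(y + h // 2) % h][(x + w // 2) % w] for x in range(w)]
            for y in range(h)]

def seam_error(texture):
    """Mean disagreement between opposite edges; 0 means it tiles seamlessly."""
    h, w = len(texture), len(texture[0])
    lr = sum(abs(row[0] - row[-1]) for row in texture) / h
    tb = sum(abs(texture[0][x] - texture[-1][x]) for x in range(w)) / w
    return lr + tb

gradient = [[x * 32 for x in range(8)] for _ in range(8)]  # strong left/right seam
flat = [[128] * 8 for _ in range(8)]                       # already seamless
```

The gradient image would show an obvious vertical seam when repeated; the flat one would not.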

After pulling the texture through Materialize, it got a lot more depth

The texture still looked quite flat: the hand-drawn image had no bumps and no variation in height or smoothness, so the entire surface interacted with the lighting in exactly the same way. This makes the whole thing look like one big shiny piece of plastic with an image slapped on top of it. To give the ground more apparent depth without costing much in computing resources, I opened Materialize, a free tool which generates the different maps that assign roughness and height to an image, without my needing to model any of it myself. Materialize is quite easy to use, and with a normal, height and smoothness map, together with the colour texture I drew, an awesome-looking dirt material could be used in both Blender and Unreal.
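The core of what such tools do when deriving a normal map from a height map is simple calculus: estimate the slope at each pixel and tilt the surface normal against it. This is my own illustrative sketch (not Materialize’s actual algorithm), using central differences with wrap-around so the result stays tileable:

```python
import math

def normal_from_height(height, strength=1.0):
    """Derive per-pixel surface normals from a height map.

    Uses central differences to estimate the slope; edges wrap so a
    tileable height map yields a tileable normal map. Normals are
    unit vectors with z pointing "up" out of the texture.
    """
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            dx = (height[y][(x + 1) % w] - height[y][(x - 1) % w]) * strength
            dy = (height[(y + 1) % h][x] - height[(y - 1) % h][x]) * strength
            length = math.sqrt(dx * dx + dy * dy + 1.0)
            row.append((-dx / length, -dy / length, 1.0 / length))
        normals.append(row)
    return normals
```

A perfectly flat height map gives straight-up normals everywhere; any slope tilts them, which is what makes the lighting pick out the bumps.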

 

The model for the river made in Blender with all of its faces shown

I wanted plastic bottles floating around in the river, so I quickly threw something together in Blender. As the object would be spawned quite far from the player and be partially hidden by the river, it didn’t make sense to spend more than ten minutes on the model. I added a simple material which vaguely represented plastic, but the transparency made the bottles hard to spot. That’s why the bottles ended up in the experience with a cloudy, white-blue colour.

The water was a bit boring, so I took the custom water material I made for the floating island, adjusted some parameters and let it run freely. The direction of the water flow is controlled by the UV map of the river, and as the water was streaming sideways I needed to rotate this map 90 degrees in Blender and export again. This made it flow in the right direction.
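The rotation itself is just coordinate maths. As a sketch (my own illustration, not Blender’s internals): rotating the UV island 90 degrees about the centre of the UV square swaps which axis the material scrolls along, which is why the flow turned from sideways to forwards.

```python
def rotate_uv_90(u, v):
    """Rotate a UV coordinate 90 degrees about the centre of the UV square.

    This is one of the two possible 90-degree directions; the other
    is (1 - v, u). Either way, the axes swap, so a material panning
    along u now visually flows along what used to be v.
    """
    return (v, 1.0 - u)

# Four quarter-turns bring a coordinate back to where it started.
uv = (0.25, 0.75)
for _ in range(4):
    uv = rotate_uv_90(*uv)
```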

With the playing area now mostly finished, I made a quick little kitbash set for building a few factories. Using some simple shapes I created the silhouettes of chimneys, buildings and railings, and by placing them against an emissive material in Unreal, they became clearly visible to the player. Even though these are simple shapes, they trigger the player’s imagination to fill in the details. These buildings do not match the style of the rest of the world at all, making them feel like they don’t belong. This is exactly why they look the way they do: weird, not supposed to be there, making a clean white void dirty with smoke.

Speaking of smoke: this was created by drawing some noise in Gimp and turning it black and white. I boosted the contrast a little in Affinity and threw it into Unreal, where I turned it into a scrolling texture. This could be used as a material inside a particle system to create a cheap and simple smoke effect, as seen rising from the chimneys in the experience. I wanted the factories to be a big reveal, suddenly confronting the player with the idea that VR might not always be as nice and sweet as we make it out to be.
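In Unreal a scrolling texture is typically built with a Panner node in the material graph; the underlying maths is nothing more than a time-scaled UV offset with a fractional wrap. A plain-Python sketch of that idea:

```python
def scroll_uv(u, v, time, speed_u=0.1, speed_v=0.0):
    """Offset a UV coordinate over time and wrap it back into [0, 1).

    Sampling the same noise image at a moving, wrapping offset makes
    the noise appear to drift, which reads as rising smoke when the
    material is applied to a particle.
    """
    return ((u + time * speed_u) % 1.0, (v + time * speed_v) % 1.0)
```

At time 0 the texture is sampled where it started; after the offset exceeds 1.0 it wraps, so the scroll loops forever without the texture ever running out.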

I found a sound effect online of big electrical breakers turning on, and animated the emissive material using a level sequencer to switch it on and off. Then I synchronized the animation to these sounds, conveying the idea that some big electrical room had just powered up somewhere in the world. The sounds and flashing lights proved noticeable enough to draw the player’s attention.

With the horror visuals now in place, I decided to pretty up the trees a little, as the ‘pretty’ aspect was not fully there yet: the magical VR world needed to be made just a little more magical. After some brainstorming, I decided to add fireflies. I created a material with a radial gradient and a bright emissive colour in the middle. I then added it to a particle system where the particles fade into existence at the start of their lifetime, then fade to a darker orange and turn transparent near their death. They spawn at random positions inside a sphere and have different, randomized sizes. Each is given a random speed in a random direction, making the flies look like they are alive and simply enjoying the trees.

I made multiple adjustments until, by trial and error, a particle system came together which looked fun and playful: easily visible from a distance, but not so big that the fireflies looked like random floating orbs. Thanks to a noise multiplier, the flies also flicker and brighten at random, adding just that extra bit of realism to an otherwise cartoon visualization.
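The two pieces of that system are spawning inside a sphere and an opacity curve over the particle’s lifetime. Here is a minimal Python sketch of the logic described above (it is not Unreal’s particle API, and the radii, sizes and fade fractions are made-up numbers):

```python
import math
import random

def spawn_firefly(rng, radius=1.0):
    """Spawn a particle at a uniform random point inside a sphere,
    with a randomized size and a slow drift velocity."""
    while True:  # rejection sampling: pick points in the cube, keep those in the sphere
        p = [rng.uniform(-radius, radius) for _ in range(3)]
        if math.dist(p, (0.0, 0.0, 0.0)) <= radius:
            break
    return {
        "pos": p,
        "size": rng.uniform(0.5, 1.5),
        "vel": [rng.uniform(-0.2, 0.2) for _ in range(3)],
    }

def alpha_over_lifetime(t):
    """Opacity for normalized lifetime t in [0, 1]: fade in over the
    first 20% of life, hold at full brightness, fade out over the
    last 30% towards death."""
    if t < 0.2:
        return t / 0.2
    if t > 0.7:
        return (1.0 - t) / 0.3
    return 1.0
```

A flicker multiplier, like the noise mentioned above, would simply scale the result of `alpha_over_lifetime` per frame.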


Last but not least, I needed to loop back to how users can choose to hide in the VR world. I found a really cool 3D model on Sketchfab, made by Blackcube. I try to create all my assets myself, but I was running out of time and could not make anything as nice and recognizable as this model. The only problem was that, even though the model looked quite similar to my Quest 2 with Elite Strap, the colour was wrong: the Quest 1 is black, while the 2 is supposed to be white. So after quickly throwing it into Blender and adjusting the colour, it was ready to be imported into Unreal Engine, where I placed it on top of a pillar I had also made for the floating island project.

The VR headset, proudly floating above its pillar
The correct hierarchy to make an object a pickup in the VR template of Unreal

The player is able to grab the headset, and as soon as it touches the player’s head, the level restarts, symbolizing how the loop of staying in VR can go on over and over and over again. The pickup system did not work on the first try, but I had a good example of how to fix it: I simply recycled a few components from the VR example map, and the movement system could be imported from there. As it turns out, the grab component needs to be a child directly below the mesh. I assume this is because the component simply looks at its parent, takes its collision mesh, and lets the player move it around while the grip button is pressed.

However, there is still a bug in the object: the player can pick it up, but not let go of it. This wasn’t really a problem, though, as the player would simply grab and reset. There was no need to put the headset down, and being able to let go would also have created the possibility of accidentally dropping it and having no way out of the experience. Furthermore, the GrabComponent also needed ‘Auto enable’ checked in the inspector to work.

With the entire experience built, all that was left to do was connect the different puzzle pieces: every animation sequence needed to be triggered at the right moment, after a specific sentence in my voice-over. I split the audio file into sections and played them one after the other using a blueprint at the player’s location. If an animation needed to be triggered, I could simply insert it between the two wires in the blueprint. This kept the system quite structured and organized, and easy to adjust if I wanted a dramatic silence for a few seconds. The whole loop starts as soon as the player presses one of the face buttons on their controller, giving them time to get used to the headset and its weight first.

Part of the level blueprint, triggering the voice-over and animations at appropriate times
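That “push it in between the two wires” pattern is essentially a linear event timeline. A plain-Python sketch of the same structure (the clip and animation names are hypothetical placeholders, not the actual assets):

```python
def run_sequence(steps):
    """Walk a voice-over timeline in order. Each step is an audio clip,
    optionally preceded by a dramatic pause and/or an animation trigger,
    mirroring how events were inserted between the blueprint's wires.
    Returns the ordered event log instead of actually playing audio."""
    log = []
    for step in steps:
        if "pause" in step:
            log.append(("wait", step["pause"]))
        if "animation" in step:
            log.append(("trigger", step["animation"]))
        log.append(("play", step["clip"]))
    return log

# Hypothetical timeline: intro, then the birds fly in, then the
# factory reveal after a two-second silence.
timeline = [
    {"clip": "intro.wav"},
    {"clip": "birds.wav", "animation": "birds_fly_in"},
    {"clip": "factories.wav", "animation": "factory_lights", "pause": 2.0},
]
```

Because each step is self-contained, adding a pause or an extra trigger means editing one dictionary rather than rewiring the whole chain, which is the same reason the blueprint stayed easy to adjust.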

In theory, the project was finished! But the environment still didn’t quite feel right: it was a bit boring, and I was afraid it might not be recognizable enough. So I made a 3D scan of the incredibly iconic blue rubbish bins we have, sparking inspiration and joy! (Just kidding, but this school location is the only one using them.)

These bins are recognizable and placed close to the player, giving them a better sense of where they are. They make the playable area feel more ‘real’, not just one big square, and bigger than it actually is. To add some ambient sound, I quickly modelled a radio to put on top of the rubbish bins and let it play some nice, calming music with spatial audio effects. The sound source actually makes sense in the 3D world as the player moves around, and as the player you can point out exactly where it is around you.

And with that, the project was done! A quick export later, I was ready to hand it in. The project runs on SteamVR, Oculus VR and even OpenXR, which means pretty much any headset is compatible, as long as it uses sensible control bindings. This project gave me some interesting learning experiences, and capturing the world using an iPhone with LiDAR is definitely a technique I will use again for quick and easy models!
