Introduction

https://denisegerrits.nl/

I am an engineer specialised in Unity and Unreal Engine and in creating tooling for both engines. My internship and minor involved a lot of VR, and I think it is an interesting technology. I am looking to further improve my Unreal, VR and tooling skills, as most of my experience came from projects with no artists and few designers.

Learning goals:

Learning outcome 1: As a game engineer, I want to learn how to create NPC behaviors in Unreal Engine. These behaviors will take the shape of a behavior tree filled with movement actions and interactions (with other NPCs or the player). I will know I have achieved this when I have made at least 2 different ground and flying NPC behavior trees, and the NPC behavior is recognizable to a user as its real-life counterpart. I feel this is useful and realistic to achieve during this half-year project, and useful for me personally: I have no prior experience with creating NPCs in Unreal Engine yet, and most games require them.

Learning outcome 2: As a game engineer, I want to learn how to make VR interactions that feel intuitive or ‘correct’ to end users, as this is harder to do than in non-VR. I want to achieve this by creating interactions that require the user to make gestures, aim, shake and rotate the controllers. I will know these interactions provide the user with correct feedback (and thus that I have achieved my goal) by conducting user tests: whether the user knows what to do, ‘gets’ what they are doing, and has fun in the game. This will make me more experienced with VR technologies, which is valuable for the companies I am looking to work for in the future.

Sprint 1

Sprint 1 was mainly spent on setting up the studio and assembling the teams. Individually, we were tasked with thinking of and writing down our learning goals. My learning goals are focused on leading, tooling and VR technology, which is why I am a good fit for the micro biodiversity project: it shows potential for developing VR mechanics. I am still a bit worried about the tooling learning goal. The project is still very early in development, so it is not yet clear what tooling is needed, if any. I also took on the role of studio lead engineer. This is a new experience for me, and I expect it will be useful for me as a future professional. I still have to find a concrete way to make this learning goal measurable.

Sprint 2

For sprint 2, we went to the Natura Docet museum and developed initial concepts and corresponding prototypes.

After the client approved our concept, I went on to mainly work on Unreal prototypes. The concept proposed a magnifying glass or zoom-in camera that would give the player a magnified perspective on the creatures in the level. I found it very important to do technical feasibility research on this mechanic, as I did not want to promise this feature without knowing whether it was possible to develop.

I was happy to find out this feature (displaying a real-time magnified image) worked very well, and it was included in the prototype we showed the client.

A new problem arose: would we go for a true zoom (the magnifying glass displays a zoomed-in image of the scene in front of it) or a shifted perspective (the magnifying glass shows a different scene)? I built both options and the team decided on the true zoom for now. This is what was used in the final sprint 2 prototype, which also included placing insect hotels and mowing grass.

Lastly, I set up the version control system. I went for a self-hosted solution, since this does not impose any file size limitations on the project. I am running a GitLab Community Edition server on the computer in the XR-Lab. The only downside is that the connection to the repository is slow when working from home, since you can only connect through eduVPN.

Image 1 - True zoom

Image 2 - Shifted perspective

Image 3 - Lawnmower with progress circle in sprint 2 prototype

Sprint 3

After sprint 2, it was important to refine our concept further. While the client was not unhappy with what we had shown so far, we were getting into a tricky situation when it comes to narrative, game mechanics and reward systems. The client wants to communicate to the player that the best behaviour for a human dealing with nature is to remain passive. This goes against the principles of gamification, which encourages the player to take action and rewards them according to their performance. This is a problem we have to solve.

Furthermore, complexity and sequentiality also need to be taken into account. Given that the client wants a 2-5 minute experience, it might be too much to make the interactions non-linear, since the player might have too little time or interest to get the message the game needs to communicate.

The plan moving forward is to develop the mechanics first and figure out complexity, sequentiality and rewards later.

We decided that the mechanic of planting flower seeds will definitely be in the game, so I started working on it. I wanted to make use of the VR controller because of my learning goal. I thought shaking an object would be fun for the player, since the position of the controller corresponds with the position of the visuals in the game. I found a tutorial (https://youtu.be/nwcGMdRke4o) that provided me with an actor component that detects VR shakes and triggers an event. This was perfect for the seed spreader.

Image 4 - Seed spreader gives off particles and plants small flowers when shaken

At this point, further interactions are still very much up in the air, and I don't know which interaction I should develop next. So for now, I moved on to something entirely different: the animal behaviors.

I had no prior experience developing NPCs in Unreal Engine. I learned that Unreal has a robust system for creating dynamic NPC behaviors. To start learning this topic, I followed a basic tutorial about birds (https://youtu.be/suCgHsZ8r6c).

Image 5 - Unreal behavior tree; this is used for ground animals.